Earth

NASA's infrared analysis of Tropical Storm Sebastien sees wind shear

image: On Nov. 22, 2019 at 1:15 a.m. EST (0515 UTC) the MODIS instrument that flies aboard NASA's Aqua satellite showed an area in Sebastien's southwestern corner where cloud top temperatures (in yellow) were as cold as or colder than minus 80 degrees Fahrenheit (minus 62.2 degrees Celsius).

Image: 
NASA/NRL

Tropical Storm Sebastien continued to move in a northeasterly direction through the North Atlantic Ocean as NASA's Aqua satellite passed overhead. Infrared imagery from an instrument aboard Aqua revealed very high, powerful storms with very cold cloud top temperatures in the southwestern quadrant of the storm. It also revealed that the storm was being sheared apart by outside winds.

Tropical cyclones are made up of hundreds of thunderstorms, and infrared data can show where the strongest storms are located. It can do that because infrared data provides temperature information, and the strongest thunderstorms that reach highest into the atmosphere have the coldest cloud top temperatures.

On Nov. 22 at 1:15 a.m. EST (0515 UTC), the Moderate Resolution Imaging Spectroradiometer or MODIS instrument that flies aboard NASA's Aqua satellite used infrared light to analyze the strength of storms within the tropical cyclone. MODIS found those strongest storms only in the southwestern side of the storm where cloud top temperatures were as cold as or colder than minus 80 degrees Fahrenheit (minus 62.2 Celsius). NASA research has found that cloud top temperatures that cold indicate strong storms with the potential to generate heavy rainfall.

The reason the strongest storms were happening only in that quadrant is that outside winds from the southwest were pushing the bulk of clouds and precipitation to the northeast. In general, wind shear is a measure of how the speed and direction of winds change with altitude. Tropical cyclones are like rotating cylinders of winds. Each level needs to be stacked vertically on top of the others in order for the storm to maintain strength or intensify. Wind shear occurs when winds at different levels of the atmosphere push against the rotating cylinder of winds, weakening the rotation by pushing it apart at different levels.
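As a rough illustration, the bulk wind shear forecasters quote is just the magnitude of the vector difference between the winds at two altitudes. A minimal sketch, using made-up wind components rather than actual Sebastien data:

```python
import math

def wind_shear(u_low, v_low, u_high, v_high):
    """Bulk wind shear: magnitude of the vector difference between
    the winds at two altitudes (all components in m/s)."""
    du = u_high - u_low
    dv = v_high - v_low
    return math.hypot(du, dv)

# Hypothetical southwesterly upper-level flow over a weaker low-level wind:
shear = wind_shear(5.0, 5.0, 15.0, 15.0)
print(round(shear, 1))  # 14.1 m/s -- strong shear for a tropical cyclone
```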

At 11 a.m. EST (1500 UTC), the center of Tropical Storm Sebastien was located near latitude 25.2 degrees north and longitude 55.3 degrees west about 695 miles (1,120 km) northeast of the Northern Leeward Islands.

Sebastien is moving toward the east-northeast near 15 mph (24 kph). An east-northeastward or northeastward motion at a similar forward speed is expected through the weekend of Nov. 23 and 24. Maximum sustained winds have decreased to near 50 mph (85 kph) with higher gusts.  The estimated minimum central pressure is 1,000 millibars.

The National Hurricane Center said that strong wind shear is expected to prevent Sebastien from getting better organized, so gradual weakening is anticipated. Sebastien is forecast to dissipate by early next week.

Credit: 
NASA/Goddard Space Flight Center

Lack of sleep may explain why poor people get more heart disease

Sophia Antipolis, 22 November 2019: Insufficient sleep is one reason why disadvantaged groups have more heart disease. That's the finding of a study published today in Cardiovascular Research, a journal of the European Society of Cardiology (ESC).1

People with lower socioeconomic status sleep less for a variety of reasons: they may do several jobs, work in shifts, live in noisy environments, and have greater levels of emotional and financial stress.

This was the first large population-based study to examine whether lack of sleep could partly explain why poor people have more heart disease. It found that short sleep explained 13.4% of the link between occupation and coronary heart disease in men.

Study author Dusan Petrovic, of the University Centre of General Medicine and Public Health (unisanté), Lausanne, Switzerland, said: "The absence of mediation by short sleep in women could be due to the weaker relationship between occupation and sleep duration compared to men."

"Women with low socioeconomic status often combine the physical and psychosocial strain of manual, poorly paid jobs with household responsibilities and stress, which negatively affects sleep and its health-restoring effects compared to men," he said.

He said: "Structural reforms are needed at every level of society to enable people to get more sleep. For example, attempting to reduce noise, which is an important source of sleep disturbances, with double glazed windows, limiting traffic, and not building houses next to airports or highways."

The study was part of the Lifepath project, and pooled data from eight cohorts totalling 111,205 participants from four European countries. Socioeconomic status was classified as low, middle, or high according to father's occupation and personal occupation. History of coronary heart disease and stroke was obtained from clinical assessment, medical records, and self-report. Average sleep duration was self-reported and categorised as recommended or normal sleep (6 to 8.5 hours per night), short sleep (less than 6 hours), or long sleep (more than 8.5 hours).
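The sleep bands used in the study map onto a simple categorisation rule. A sketch in Python (the cut-offs follow the study's categories; the function itself is ours):

```python
def sleep_category(hours):
    """Band self-reported nightly sleep duration the way the study does:
    short (< 6 h), recommended/normal (6 to 8.5 h), long (> 8.5 h)."""
    if hours < 6:
        return "short"
    if hours <= 8.5:
        return "recommended"
    return "long"

for h in (5.5, 7.0, 9.0):
    print(h, sleep_category(h))
```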

The contribution of insufficient sleep was investigated using a statistical approach called mediation analysis. It estimates the contribution of an intermediate factor (sleep) to an association between the main exposure (socioeconomic status) and the main outcome (coronary heart disease or stroke).
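For intuition, the "difference method" flavour of mediation analysis compares the total effect of the exposure with the direct effect that remains after adjusting for the mediator. A toy sketch on simulated data (the numbers are synthetic and the model is plain least squares; the study's actual analyses of binary outcomes are more involved):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Simulated data: low socioeconomic status (ses) shortens sleep, and both
# ses and short sleep raise a continuous heart-disease risk score.
ses = rng.binomial(1, 0.5, n)                                # exposure
short_sleep = 0.3 * ses + rng.normal(0, 1, n)                # mediator
risk = 0.5 * ses + 0.4 * short_sleep + rng.normal(0, 1, n)   # outcome

def slope(y, X):
    """OLS coefficient on the first column of X (intercept added)."""
    X = np.column_stack([X, np.ones(len(y))])
    return np.linalg.lstsq(X, y, rcond=None)[0][0]

total = slope(risk, ses)                                     # c: total effect
direct = slope(risk, np.column_stack([ses, short_sleep]))    # c': adjusted for mediator
print(f"proportion mediated ~ {(total - direct) / total:.0%}")
```

By construction the mediated share here is about (0.4 x 0.3) / 0.62, i.e. roughly a fifth, which is the same kind of quantity as the 13.4% the study reports for men.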

Credit: 
European Society of Cardiology

Omega-3 fatty acids' health benefit linked to stem cell control, researchers find

For years, researchers have known that defects in an ancient cellular antenna called the primary cilium are linked with obesity and insulin resistance. Now, researchers at the Stanford University School of Medicine have discovered that the strange little cellular appendage is sensing omega-3 fatty acids in the diet, and that this signal is directly affecting how stem cells in fat tissue divide and turn into fat cells.

The finding represents a missing link between two worlds -- that of dietary science, and that of molecular and cellular biology. Dietary studies have long found that the consumption of omega-3 fatty acids, essential fatty acids common in fish and nuts, is associated with lower risk of heart disease, stroke, arthritis and even depression.

A paper describing the research will be published online Nov. 21 in Cell. The senior author is Peter Jackson, PhD, professor of microbiology and immunology and of pathology. The lead author is postdoctoral scholar Keren Hilgendorf, PhD.

Looking for signaling molecule

Researchers in Jackson's laboratory weren't looking for omega-3s when they started their research. They were only looking for the signaling molecule that fat stem cells were sensing. The molecule could have been anything: signaling pathways in cellular biology often involve esoteric molecules few people have heard of. They only knew that in rare diseases involving a defect in the primary cilium, people are always hungry, cannot stop eating and become obese and insulin resistant. So they were surprised when the signal turned out to be omega-3 fatty acids.

"When we saw that the cell was responding to omega-3 fatty acids, we realized that this had changed from just a molecular biology story to a story showing the molecular biology of how diet controls stem cells," Jackson said.

The cells sense the presence of omega-3 fatty acids through a tiny, hair-like appendage called the primary cilium, an ancient structure derived from the many flagella that algae cells first used almost 1 billion years ago to move through the oceans and sense their surroundings. Over time, as single-celled organisms evolved into multicellular creatures that first swam the oceans and then crawled onto land, cells ditched most of their flagella. But most cells kept a single flagellum, the primary cilium, to use as a highly sensitive antenna; it can pick up extremely subtle signals about the world outside the cell, helping to regulate the cell's function and fate.

Jackson and his colleagues found that when omega-3 fatty acids bind to a receptor called FFAR4 on the cilia of fat stem cells, it prompts the fat stem cells to divide, leading to the creation of more fat cells. This provides the body with more fat cells with which to store energy, something that is healthier than storing too much fat in existing fat cells. "What you want is more, small fat cells rather than fewer, large fat cells," Jackson said. "A large fat cell is not a healthy fat cell. The center is farther away from an oxygen supply, it sends out bad signals and it can burst and release toxic contents." Large fat cells are associated with insulin resistance, diabetes and inflammation, he added.

Furthermore, the researchers found that the presence of saturated fats, or the blockage of ciliary signaling through the FFAR4 receptor, does not lead to an increase in the creation of new fat cells from stem cells, but rather to the addition of fat to existing cells. "Rather than looking at how diet correlates with health, we have gone from molecule to receptor to cell to document why 'healthy fats' are beneficial and 'unhealthy fats' contribute to disease," Hilgendorf said. "We have provided a mechanism explaining why omega-3 fatty acids are critical for maintaining healthy fat balance and why saturated fats should be limited."

The research also may change scientific understanding of how the body manages fat storage in a healthy person. "Researchers often talk about the movement of fat in and out of cells, but what we are showing is the importance of stem cell activity in creating new fat cells as being critical for the body's energy management," said Carl Johnson, a graduate student in the stem cell biology and regenerative medicine program and a co-author of the paper.

Credit: 
Stanford Medicine

Life under extreme conditions at hot springs in the ocean

image: Aerial view of the acidic hot springs in the shallow water of the Taiwanese Kueishantao volcanic island, visible through the whitish discoloration of the sea water by sulphur.

Image: 
Mario Lebrato, Uni Kiel

The volcanic island of Kueishantao in northeastern Taiwan is an extreme habitat for marine organisms. With an active volcano, the coastal area has a unique hydrothermal field with a multitude of hot springs and volcanic gases. The acidity of the study area is among the highest in the world. The easily accessible shallow water around the volcanic island therefore represents an ideal research environment for investigating the adaptability of highly specialised marine organisms, such as crabs, to highly acidified and toxic seawater.

For about ten years, marine researchers from the Institute of Geosciences at Kiel University (CAU), together with their Chinese and Taiwanese partners from Zhejiang University in Hangzhou and the National Taiwan Ocean University in Keelung, had been regularly collecting data on geological, chemical and biological processes when two events disrupted the time series in 2016. First, the island was shaken by an earthquake; only a few weeks later, it was hit by the severe tropical typhoon Nepartak. On the basis of data collected over many years, the researchers from Kiel, China and Taiwan were able to demonstrate for the first time that biogeochemical processes had changed as a consequence of the enormous earthquake and typhoon, and how different organisms were able to adapt to the changed seawater biogeochemistry in the course of only one year. The first results of the interdisciplinary study, based on extensive data dating back to the 1960s, were recently published in the international journal Scientific Reports.

"Our study clearly shows how closely atmospheric, geological, biological and chemical processes interact and how an ecosystem with extreme living conditions such as volcanic sources on the ocean floor reacts to disturbances caused by natural events," says Dr. Mario Lebrato of the Institute of Geosciences at Kiel University. For years, scientists led by Dr. Dieter Garbe-Schönberg and Dr. Mario Lebrato from the Institute of Geosciences at the CAU have been researching the shallow hydrothermal system "Kueishantao". The selected site has a large number of carbon dioxide emissions in the shallow water. In addition, the sources release toxic metals. Sulphur discolours the water over large areas. The volcanic gases - with a high sulphur compounds - lead to a strong acidification of the sea water. Through methods of airborne drone surveying, modelling, regular sampling and laboratory experiments research into the hydrothermal field therefore makes an important contribution to the effects of ocean acidification on marine communities. Only a few specialized animal species such as crabs, snails and bacteria live in the immediate vicinity of the sources. A few metres away, on the other hand, is the diverse life of a tropical ocean.

"Due to the high acidity, the high content of toxic substances and elevated temperatures of the water, the living conditions prevailing there can serve as a natural laboratory for the investigation of significant environmental pollution by humans. The sources at Kueishantao are therefore ideal for investigating future scenarios," says co-author Dr. Yiming Wang, who recently moved from Kiel University to the Max Planck Institute for the Science of Human History in Jena.

After the severe events in 2016, the study area changed completely. The seabed was buried under a layer of sediment and rubble. In addition, the acidic warm-water sources dried up, and the composition of the sea water changed significantly and continuously over a long period of time. Aerial photos taken with drones, samples taken by research divers from Kiel and Taiwan, as well as biogeochemical investigations, clearly showed the spatial and chemical extent of the disturbances. These were recorded by the biologist and research diver Mario Lebrato and his Taiwanese colleague Li Chun Tseng and compared with the results of earlier samplings. "What initially looked like a catastrophe for our time series study turned out to be a stroke of luck. It gave us the rare opportunity to observe how organisms adapt to severe disturbances, and we were able to draw on a comprehensive database to do so," explains project manager Dr. Dieter Garbe-Schönberg from the Institute of Geosciences at Kiel University.

Credit: 
Kiel University

Breaking (and restoring) graphene's symmetry in a twistable electronics device

image: Illustration of the controlled rotation of boron nitride (BN) layers above and below a graphene layer, which introduces coexisting moiré superlattices that change size, symmetry, and complexity as a function of angle. In this system the Columbia researchers achieve unprecedented control over monolayer graphene's band structure within a single device, by mechanically rotating boron nitride atop graphene aligned to a bottom BN slab.

Image: 
Nathan Finney and Sanghoon Chae/Columbia Engineering

New York, NY--November 21, 2019--A recent study from the labs of James Hone (mechanical engineering) and Cory Dean (physics) demonstrates a new way to tune the properties of two-dimensional (2D) materials simply by adjusting the twist angle between them. The researchers built devices consisting of monolayer graphene encapsulated between two crystals of boron nitride and, by adjusting the relative twist angle between the layers, they were able to create multiple moiré patterns.

Moiré patterns are of high interest to condensed matter physicists and materials scientists who use them to change or generate new electronic material properties. These patterns can be formed by aligning boron nitride (BN, an insulator) and graphene (a semimetal) crystals. When these honeycomb lattices of atoms are close to alignment, they create a moiré superlattice, a nanoscale interference pattern that also looks like a honeycomb. This moiré superlattice alters the quantum mechanical environment of the conducting electrons in the graphene, and therefore can be used to program significant changes in the observed electronic properties of the graphene.
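The period of such a moiré superlattice follows from the lattice mismatch and the twist angle. A sketch using the standard formula for two near-aligned hexagonal lattices (the graphene lattice constant and the roughly 1.8% graphene/BN mismatch are textbook values, not numbers from this paper):

```python
import math

def moire_wavelength(theta_deg, a=0.246, delta=0.018):
    """Moiré superlattice period (nm) for two hexagonal lattices with
    fractional lattice mismatch delta twisted by theta degrees:
    lambda = (1+delta)*a / sqrt(2(1+delta)(1-cos t) + delta^2).
    a = graphene lattice constant (nm); delta ~ graphene/BN mismatch."""
    t = math.radians(theta_deg)
    return (1 + delta) * a / math.sqrt(
        2 * (1 + delta) * (1 - math.cos(t)) + delta**2)

# The period is largest (~14 nm) at perfect alignment and shrinks with twist:
for angle in (0.0, 0.5, 1.0, 2.0):
    print(f"{angle:4.1f} deg -> {moire_wavelength(angle):5.1f} nm")
```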

To date, most studies on the effects of moiré superlattices in graphene-BN systems have looked at a single interface (with either the top or bottom surface of the graphene considered, but not both). However, a study published by Hone and Dean last year demonstrated that total rotational control over one of the two interfaces was possible within a single device.

By designing a device that has persistent alignment at one interface, and tunable alignment at the other, the Columbia team has now been able to study the effects of multiple moiré superlattice potentials on a layer of graphene.

"We decided to look at both the top and bottom surfaces of the graphene in a single nanomechanical device," said Nathan Finney, a PhD student in Hone's lab and co-lead-author of the paper, published online September 30 by Nature Nanotechnology and now the cover story of the November print edition. "We had a hunch that by doing so, we would be able to potentially double the strength of the moiré superlattice using the coexisting moiré superlattices from the top and bottom interfaces."

The team discovered that twisting the angle of the layers enabled them to control both the strength of the moiré superlattice and its overall symmetry, as inferred from the significant changes observed in the graphene's electronic properties.

At angles close to alignment, a highly altered graphene band structure emerged, observable in the formation of coexisting non-overlapping long-wavelength moiré patterns. At perfect alignment, the graphene's electronic gaps were either strongly enhanced or suppressed, depending on whether the top rotatable BN was twisted by 0 or 60 degrees. These changes in the electronic gaps corresponded to the expected changes in symmetry for the two alignment configurations--inversion symmetry broken at 0 degrees, and inversion symmetry restored at 60 degrees.

"This is the first time anyone has seen the full rotational dependence of coexisting moiré superlattices in one device," Finney notes. "This degree of control over the symmetry and the strength of moiré superlattices can be universally applied to the full inventory of 2D materials we have available. This technology enables the development of nanoelectromechanical sensors with applications in astronomy, medicine, search and rescue, and more."

The researchers are now refining the ability to twist monolayers of a wide range of 2D materials to study such exotic effects as superconductivity, topologically induced ferromagnetism, and non-linear optical response in systems that lack inversion symmetry.

Credit: 
Columbia University School of Engineering and Applied Science

Grid reliability under climate change may require more power generation capacity

video: A new university/national laboratory study reveals the importance of climate-water impacts on U.S. electric grid planning

Image: 
Erica Klein-Meisenhelter-The Graduate Center, CUNY

Researchers created a new modeling approach that accounts for climate and water impacts on electricity infrastructure development.

The new analysis compares results with traditional modeling approaches that may or may not consider climate impacts, revealing that the U.S. power grid may need more capacity than previously thought to adapt to future climate-water conditions.

Adaptations include additional renewable and natural gas construction and, along with regional electricity generation trade-offs, lead to lower water use and carbon emissions, potentially helping mitigate climate change.

NEW YORK, November 21, 2019 -- A new analysis from university-based and national laboratory researchers applied a new modeling approach for long-term electricity generation infrastructure planning that considers future climate and water resource conditions. Compared to traditional projections, which do not consider climate-water impacts on electricity generation, results of this new approach show the national power grid may need an additional 5.3% to 12% of power-generating capacity to meet demand and reliability requirements. The changes would lower water use and carbon emissions, potentially helping mitigate future climate changes.

The new study, which will be featured as the cover article in the December 3 online and December 13 print issue of Environmental Science & Technology, is available online today.

"This is the first time anyone has modeled future electricity infrastructure under climate change using a method that includes feasibility checks to ensure results meet reliable power supply thresholds under climate and water resource constraints," said the study's lead author Ariel Miara, a senior research associate with the Advanced Science Research Center at The Graduate Center, CUNY (CUNY ASRC) and an energy, water, and environment researcher with the U.S. Department of Energy's National Renewable Energy Laboratory (NREL). "We combined high-resolution hydrological, thermal-power plant, and capacity-expansion models to improve confidence in long-term electricity infrastructure planning under future climate-water impacts. Typically, this isn't done for electricity infrastructure planning, or it's done but without feasibility checks on results. Our approach allowed us to assess region-specific climate-water impacts on power supply reliability and identify potential adaptation steps to enhance reliability."

The current U.S. grid relies heavily on thermal power plants that use coal, nuclear, and natural gas fuels; these are affected by warm ambient temperatures and need large amounts of water for cooling. Renewable energy sources such as solar PV and wind need minimal water to operate because they do not require cooling, but these technologies play a much smaller role in generating energy across today's power grid. Regional differences in power grid configuration and development to year 2050, together with changes in climate and water availability, suggest that some regions may face power-reliability challenges.

With their study, researchers posed four questions:

How will future climate and water resource conditions impact four electricity infrastructure scenarios?

How will their new method of modeling climate-water impacts on electricity-generation compare to previous efforts?

What types of technology choices would be needed to adapt to future climate-water conditions and meet reliable electricity-generation levels?

What are the resulting economic and environmental implications?

To answer these questions, the research team first simulated capacity-expansion scenarios for four electricity mixes favoring different technology types (coal, nuclear, solar, and business as usual) without considering climate-water impacts. These projections to year 2050 provided a baseline understanding of results using current capacity-expansion approaches. For their next step, researchers factored in climate-water impacts on each electricity mix. This approach allowed them to assess the effect on different types of systems and elucidate the potential adaptation steps required for each to meet energy demands. Their analysis found that capacity reserve margins dropped below required reliability levels when capacity projections did not account for climate-water impacts, or when they did so but without feasibility checks on the results.
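The kind of feasibility check the authors describe can be caricatured as a planning reserve-margin test: after climate-water derating, does the remaining capacity still clear a reliability target? A toy sketch with illustrative numbers (the 15% target and 8% derate are hypothetical, not values from the study):

```python
def reserve_margin(capacity_mw, peak_demand_mw):
    """Planning reserve margin: spare capacity as a fraction of peak demand."""
    return (capacity_mw - peak_demand_mw) / peak_demand_mw

def meets_reliability(capacity_mw, peak_demand_mw, derate=0.0, target=0.15):
    """Feasibility check: does derated capacity (e.g. thermal plants losing
    output to warm, scarce cooling water) still clear the target margin?"""
    return reserve_margin(capacity_mw * (1 - derate), peak_demand_mw) >= target

# 120 GW of nameplate capacity against a 100 GW peak:
print(meets_reliability(120_000, 100_000))               # True: 20% margin with no derating
print(meets_reliability(120_000, 100_000, derate=0.08))  # False: margin falls to ~10%
```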

"We showed that power systems may face reliability challenges without climate-water adaptation," said Miara. "Viable solutions included tradeoffs in regional technology choice and typically more renewable-based versus thermal power generation. This results in lower overall water use and emissions for our electricity generation needs."

"This analysis is the capstone to a long series of studies that this research team has produced to analyze unique climate-water-environment interactions in the context of long-term energy planning under climate change," said Charles Vörösmarty, a co-author of the study and director of the CUNY ASRC Environmental Sciences Initiative. "The study provides valuable information and an initial roadmap for scientists, infrastructure planners, power companies, and communities to have informed conversations about how we plan future systems."

Credit: 
Advanced Science Research Center, GC/CUNY

Bone breakthrough may lead to more durable airplane wings

ITHACA, N.Y. - Cornell researchers have made a new discovery about how seemingly minor aspects of the internal structure of bone can be strengthened to withstand repeated wear and tear, a finding that could help treat patients suffering from osteoporosis. It could also lead to the creation of more durable, lightweight materials for the aerospace industry.

The team's paper, "Bone-Inspired Microarchitectures Achieve Enhanced Fatigue Life," was published Nov. 18 in the Proceedings of the National Academy of Sciences. Co-authors include Cornell doctoral students Cameron Aubin and Marysol Luna; postdoctoral researcher Floor Lambers; Pablo Zavattieri and Adwait Trikanad at Purdue University; and Clare Rimnac at Case Western Reserve University.

For decades, scientists studying osteoporosis have used X-ray imaging to analyze the structure of bones and pinpoint strong and weak spots. Density is the main factor that is usually linked to bone strength, and in assessing that strength, most researchers look at how much load a bone can handle all at once.

But a team led by senior author Christopher J. Hernandez, associate professor in the Sibley School of Mechanical and Aerospace Engineering and in the Meinig School of Biomedical Engineering, is interested in long-term fatigue life, or how many cycles of loading a bone can bear before it breaks.

"The best way to understand the fatigue properties of material is to think about a part in your car that breaks every so often, so you have to take it to the shop. Well, why did it break? It was clearly strong enough, because it worked for months, years, just fine. But after cycling and cycling and cycling, tens of millions of cycles, it breaks," Hernandez said. "We've known about this property of materials for 150 years, and it's embedded in the design of everything we do. But not too many people had done this kind of study of the bone."

The internal architecture of bone consists of vertical plate-like struts that determine its strength when overloaded. The bone also has horizontal rod-like struts, which have little influence on strength and are essentially "window dressing." Hernandez and his team suspected that other aspects of architecture were important. Using new computer software, lead author Ashley Torres, M.A. '15, Ph.D. '18, MBA '19, was able to perform a deeper analysis of a bone sample and found that, when it comes to withstanding long-term wear and tear, the horizontal rod-like struts are critical for extending the bone's fatigue life.

"If you load the bone just once, it's all about how dense it is, and density is mostly determined by the plate-like struts," said Hernandez, who is also an adjunct scientist at the Hospital for Special Surgery, an affiliate of Weill Cornell Medicine. "But if you think about how many cycles of low-magnitude load something can take, these little sideways twiggy struts are what really matter. When people age, they lose these horizontal struts first, increasing the likelihood that the bone will break from multiple cyclic loads."

The team used a 3D printer to manufacture bone-inspired material made from a urethane methacrylate polymer. The researchers varied the thickness of the rods and were able to increase the material's fatigue life by up to 100 times.

Hernandez anticipates the reinforced microstructure lattices his team developed could be incorporated into just about any device, and would be particularly beneficial to the aerospace industry, where ultra-lightweight materials need to withstand tremendous and repeated strain.

"Every wind gust that an airplane hits causes a cycle of loading on it, so an airplane wing gets loaded thousands of times during every flight," Hernandez said. "If you want to make a durable device or a vehicle that is lightweight and will last a long time, then it really matters how many cycles of loading the part can take before it breaks. And the mathematical relationship we've derived in this study lets somebody who's designing one of these lattice structures balance the needs for stiffness and strength under a single load with the needs for tolerating many, many lower-level load cycles."

Credit: 
Cornell University

Scientists identify underlying molecular mechanisms of Alexander disease

image: Immunofluorescence staining of Alexander Disease iPSC-astrocytes showing cell nuclei (white), cytoplasmic GFAP filaments (magenta), and perinuclear GFAP aggregates (green; marked by yellow arrowheads).

Image: 
Lab of Natasha Snider, PhD, UNC School of Medicine

CHAPEL HILL, NC - November 21, 2019 - Scientists have known that genetic mutations leading to the production of a defective protein called GFAP cause Alexander disease (AxD), a debilitating neurodegenerative condition that can present during infancy, adolescence, or adulthood. Many people with the rare condition die within the first few years, but some survive for several decades. Now, UNC School of Medicine researchers are learning about the differences in the underlying biology of patients with severe and milder forms of AxD. Led by Natasha Snider, PhD, assistant professor of cell biology, an international group of scientists has discovered that the mutant form of GFAP undergoes different chemical modifications, depending on time of onset of symptoms.

Published in the online journal eLife, this research marks the first time scientists have been able to model very specific chemical changes to GFAP that occur inside the AxD brain using an in vitro system derived from AxD patient cells. This is allowing Snider and colleagues to probe the details of how GFAP misfolding and accumulation alters cellular mechanics to lead to disease progression and death.

"We are now further investigating the enzymes responsible for the key reactions inside brain cells that lead to AxD," Snider said. "We believe our research findings may open the door to new drug development opportunities for researchers and ultimately new kinds of therapies for people with this terrible disease."

AxD is a leukodystrophy, a rare group of disorders of the nervous system that involve the destruction of myelin, the fatty sheath that insulates long connective nerve cells and promotes the necessary communication of electrical impulses throughout the nervous system. As myelin deteriorates in people with AxD or other types of leukodystrophy, the activities of the nervous system deteriorate as well.

Most cases of Alexander disease occur during infancy and involve myelin destruction. Babies with AxD have enlarged brains, and they experience seizures, stiffness in the arms and legs, and developmental delay. Sometimes, though, symptoms do not occur until later in childhood or even adulthood, and in the absence of leukodystrophy; in these cases, symptoms include speech abnormalities, swallowing difficulties, seizures, and poor coordination. Over time, abnormal protein deposits containing GFAP, known as Rosenthal fibers, accumulate in specialized cells called astrocytes, which support and nourish other cells in the brain and spinal cord.

Since 2011, Snider has been studying the mechanisms of GFAP accumulation with the hope of finding an existing drug or compound to help AxD patients and developing insights needed to create a new kind of therapy. GFAP forms intermediate filaments - structures that shape the 'skeleton' of astrocytes. Toxic accumulation of GFAP that is incapable of forming a proper structure leads to astrocyte dysfunction, which harms surrounding neuronal and non-neuronal cells in AxD patients. Problems of GFAP accumulation in astrocytes have also been found in other diseases, such as giant axonal neuropathy and astrocytoma tumors.

For the eLife study, Snider and colleagues combined mass spectrometry-based proteomic analysis of AxD and non-AxD human brain tissue with induced pluripotent stem cells and CRISPR/Cas9 gene editing technology to connect the relevant disease phenotypes to the underlying cell biology. This work illuminated key mechanisms involved in GFAP misfolding and revealed new markers of disease severity. For the first time, they made a clear molecular distinction between AxD children who die young and people who live for several decades.

Using the cell line model created by Rachel Battaglia, a graduate student in the Snider lab, in collaboration with Adriana Beltran, PhD, assistant professor of pharmacology at UNC, the team observed specific types of GFAP aggregates sequestered outside the misshapen membranes of cell nuclei. "This phenomenon had been observed previously in astrocytes of AxD patients," Snider said. "But ours is the first demonstration of this phenomenon in a model cell line in the lab to help us probe how exactly GFAP accumulation affects other cellular organelles to cause disease." These findings also relate to published literature on other debilitating and fatal human diseases associated with defects in intermediate filament proteins that have similar functions to GFAP.

The next step is to use this new knowledge to determine whether these molecular markers of GFAP aggregation can be leveraged to create new treatments for people with Alexander disease.

Credit: 
University of North Carolina Health Care

Study: Wildfires in Oregon's Blue Mountains to become more frequent, severe due to climate change

image: A mosaic of burn patches of different fire severities in a primarily ponderosa pine forest is shown six months after the Canyon Creek Complex wildfire.

Image: 
Brooke Cassell | Portland State University

Under a warming climate, wildfires in Oregon's southern Blue Mountains will become more frequent, more extensive and more severe, according to a new Portland State University-led study.

Researchers from PSU, North Carolina State University, University of New Mexico and the U.S. Forest Service looked at how climate-driven changes in forest dynamics and wildfire activity will affect the landscape through the year 2100. They used a forest landscape model, LANDIS-II, to simulate forest and fire dynamics under current management practices and two projected climate scenarios.

Among the study's findings:

Even if the climate stopped warming now, high-elevation species such as whitebark pine, Engelmann spruce and subalpine fir would be largely replaced by more climate- and fire-resilient species like ponderosa pine and Douglas fir by the end of the century.

Shade-tolerant grand fir, which has been expanding in the forest understory, was also projected to increase, even under hotter and drier future climate conditions, providing fuels that help spread wildfires and make them even more severe.

Brooke Cassell, the study's lead author and a recent Ph.D. graduate from PSU's Earth, Environment and Society program, said that if these forests become increasingly dominated by only a few conifer species, the landscape may become less resilient to disturbances, such as wildfire, insects and diseases, and would provide less variety of habitat for plants and animals.

Cassell said that the team's findings suggest that forest managers should consider projected climate changes and increasing wildfire size, frequency and severity on future forest composition when planning long-term forest management strategies.

The team also suggests that in light of the projected expansion of grand fir, managers should continue to reduce fuel continuity through accelerated rates of thinning and prescribed burning to help reduce the extent and severity of future fires.

Credit: 
Portland State University

How to fight illegal cocoa farms in Ivory Coast

image: Study co-author Sery Gonedelé Bi, on left, holds a cocoa plant found at an illegal farm in the Dassioko Forest Reserve in Ivory Coast.

Image: 
Photo by W. Scott McGraw, Courtesy of Ohio State University

COLUMBUS, Ohio - The world's love for chocolate has helped decimate protected forests in western Africa as some residents have turned protected areas into illegal cocoa farms and hunting grounds.

But an international group of researchers found that simply patrolling the grounds of two forest reserves in Ivory Coast helped reduce illegal activity by well over half between 2012 and 2016.

The researchers themselves were among those who conducted the foot patrols, said W. Scott McGraw, professor of anthropology at The Ohio State University and co-author of a recent paper in Tropical Conservation Science documenting their success.

McGraw said that on patrols he participated in, farmers tending illegal cocoa crops were often surprised that anyone approached them and told them to stop their activities.

"They just weren't used to encountering anyone who put up any kind of resistance. We told them they weren't allowed to be growing cocoa there and they would respond that 'no one has ever told us we couldn't,'" he said.

"It was just so blatant."

McGraw conducted the study with Sery Gonedelé Bi, Eloi Anderson Bitty and Alphonse Yao of the University Félix Houphouët Boigny in Ivory Coast.

Forest loss is a major problem in Ivory Coast. Between 2000 and 2015, the country lost about 17 percent of its forest cover, driven by an annual deforestation rate - 2.69 percent - that was among the highest in the world, according to Global Forest Watch.

Much of that forest was cleared for cocoa farms to meet the growing demand for chocolate worldwide. The destruction of the forests - as well as the illegal poaching found in this study - threaten several critically endangered primates that live in these reserves, McGraw said. A previous study by McGraw and his colleagues documented the threat to primates.

The researchers conducted this study in the Dassioko Sud and Port Gauthier forest reserves along the Atlantic coast.

Regular patrols were carried out three to four times a month in both reserves between July 2012 and June 2016. Each team usually consisted of six to eight people, including the researchers, law enforcement officials (occasionally) and community members recruited from neighboring towns and paid with funds from several conservation organizations.

The patrol teams would normally go out for seven hours at a time, taking random routes so as not to be predictable. They noted the time and direction of all gunshots, collected discarded gun cartridges, and counted and dismantled all snares used to capture monkeys and other game.

They recorded the size and location of all logging and farming operations and destroyed all plantation crops, most of which were new cocoa seedlings.

When the patrols included law enforcement personnel, they would arrest poachers, as well as loggers and illegal farmers.

McGraw said he was never on one of the rare patrols that encountered a poacher with a gun, although his co-authors were.

"I was told those were the most tense situations. That is not a standoff that I would want to be a part of," he said.

Altogether, during the four years of the study, the patrol teams apprehended six poachers, heard 302 gunshots, deactivated 1,048 snares and destroyed 515 hectares of cocoa farms.

But the good news is that the number of these illegal activities dropped dramatically over the years they patrolled. For example, the researchers documented about 140 signs of illegal activity (such as shotgun shells, snares or cocoa plantings) at Port Gauthier Forest Reserve in August 2012. In June 2016, there were fewer than 20. Similar reductions in illegal activities were recorded in Dassioko Sud Forest Reserve.

The patrol activities weren't responsible for all the decline, McGraw said. For example, a portion of the poaching decline could have been related to the 2014 Ebola outbreak, which reduced demand for bushmeat.

But much of the decline in illegal activities in the reserves occurred before 2014, he noted.

Regular patrols in the two reserves have not continued since 2016, McGraw said. Surveys since then suggest that the gains made by the patrols are holding and illegal activities have not returned to their previous highs.

"That said, I think it is important that there is a renewed effort to patrol these reserves. We need something sustained," he said.

"We need Ivorian officials to take a stronger stand against illegal activities in the country's protected areas."

Credit: 
Ohio State University

Climate change reassessment prompts call for a 'more sober' discourse

An international research team has called for a more sober discourse around climate change prospects, following an extensive reassessment of climate change's progress and its mitigation.

They argue that climate change models have understated potential warming's speed and runaway potential, while the models that relate climate science to consequences, choices and policies have understated the scope for practical mitigation against it. Policymakers are becoming aware of the former bias but seldom perceive the latter.

Their study will be published Thursday, August 28th in the IOP Publishing journal Environmental Research Letters.

Lead and corresponding author Dr Amory Lovins, from Rocky Mountain Institute, Colorado, USA, said: "The IPCC's 2018 Special Report is a stark and bracing reminder of climate threats. We know focussed and urgent action to combat climate change is still essential. But our findings show that both despair and complacency are equally unwarranted.

"We found that, while climate change models have understated potential warming, the models used to guide policy makers have understated the scope for practical, let alone profitable, mitigation against it.

"Indeed, since 2010, and despite the past three years' disappointing slowdown in energy savings, global decarbonisation has accelerated and is trending on course (averaged over the past three years) to achieve the Paris 2°C target. Large gains from energy efficiency have been underemphasized and modern renewable heat--decarbonising about as much as solar power plus wind power--has generally been overlooked altogether."

Co-author Professor Daniel Kammen, from the University of California, Berkeley, USA, said: "We find that the actual rate of decarbonisation in the global economy is significantly higher than is used in many baseline assessments of technological change. No single climate action can be sufficient to meet national climate goals, but rapid gains in energy efficiency uniquely enable economy-wide transitions to a low-carbon system that make achieving the Paris Climate Goals possible, if we take aggressive actions across all sectors of the economy."

The researchers found that recent developments in energy markets and analyses could open new prospects for the achievability, social/economic acceptability, and economic attractiveness of the climate targets in the Paris Agreement, including its aspirational 1.5°C target.

Professor Kammen said: "These developments include the recent dramatic cost reductions and scale-up of deployment of solar and wind energy, which we are now critically also starting to see for energy storage, too."

"What we need now is a renewed and coordinated effort to represent these developments in influential global climate and energy systems models. Doing so is critical to saving trillions of dollars, while achieving stringent climate mitigation outcomes."

The study notes that recent progress in and future potential for advanced end-use energy efficiency has also been overlooked.

Professor Diana Ürge-Vorsatz, from Central European University, Hungary, is Vice Chair of IPCC Working Group III and a co-author of the study. She said: "These two classes of resources have already shrunk the gap between pre-2010 implementation rates and those needed to achieve targets indicated by the climate modeling literature. Many models, using 'historic' trends, consider 1.5-2.0 per cent per year drops in primary energy intensity to be ambitious. However, the 2010-18 rate averaged 2.03 per cent per year, even reaching 2.7 per cent per year in 2015, and could rise further.

"Reduced primary energy intensity, plus an increased share of decarbonized final supply, have lately matched the sustained 3.4 per cent per year that IPCC AR5 found necessary for a 2°C trajectory.

"Together they are only half of, but trending toward, the sustained 6.7 per cent per year needed for 1.5°C."
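The rates quoted above are compound annual declines, so small-looking percentage differences diverge widely over decades. A minimal sketch of that arithmetic (illustrative only; the 30-year horizon to roughly 2050 is an assumption, and the rates are the figures quoted in the text):

```python
def remaining_after(rate_pct_per_year, years):
    """Fraction of today's carbon intensity left after a sustained
    compound annual decline of rate_pct_per_year."""
    return (1 - rate_pct_per_year / 100) ** years

# Rates quoted in the text, compounded over an assumed 30-year horizon:
for rate in (2.03, 3.4, 6.7):
    print(f"{rate}%/yr leaves {remaining_after(rate, 30):.0%} of today's intensity")
```

The sketch shows why the gap between the observed 2.03 per cent and the 6.7 per cent needed for 1.5°C matters: the former leaves roughly half of today's intensity after 30 years, the latter about an eighth.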

The study recommends some new approaches for future modelling. It argues the need to reconsider reliance on pre-2011 energy data, and to better understand and apply modern energy efficiency options from advanced practitioners and their engineering-based literature.

It notes there is also an opportunity to acknowledge, study, test, and if warranted apply high-quality work from other disciplines.

Lead author Dr Lovins underlined: "Cross-fertilization with different perspectives and schools of thought beyond technocracy can often provide step-changes in enriching analytical insights. Models confirm the scope for ambitious mitigation pathways, and provide an important way to inform emitting industries, policymakers, and the public about rapidly exploiting both modern energy efficiency and the short atmospheric lifetimes of CH4 and other super-emitters.

"Enhanced, more complementary ways of reducing these concentrated emissions and exploiting nonlinear benefits can capture new business and socio-political opportunities by applying basic first-aid principles to our planet's ailing climate."

Professor Kammen added: "When the mainstream climate models integrate these methodological advances and new evidence, they are likely to recalibrate the prospects for achieving ambitious climate targets, including 1.5°C.

"Furthermore, the rich menu of climate-change mitigations--whether driven by business, public policy, or civil society and individual choice--need not wait for these modelling improvements, but all would benefit from them."

Co-author Professor Luis Mundaca, from Lund University, Sweden, concluded: "The evidence is now clear that climate mitigations, particularly on the demand side, well in excess of those traditionally modelled will make sense, make money, and create large co-benefits, chiefly for development, equity, health, and security. Refined modelling therefore need not precede but should evolve in parallel with ambitious policy interventions and aggressive adoption."

Credit: 
IOP Publishing

Study: Young children can learn math skills from intelligent virtual characters

U.S. children lag behind their international peers in science, technology, engineering, and math (STEM) skills, which has led to calls for an integrated math curriculum for 3- to 6-year-olds. A new study examined whether young children's verbal engagement with an onscreen interactive media character could boost their math skills. The study concluded that children's parasocial (that is, one-sided) emotional relationships with the intelligent character and their parasocial interactions (in this case, talking about math with the character) led to quicker, more accurate math responses during virtual game play.

The findings are from research conducted at Georgetown University. They appear in Child Development, a journal of the Society for Research in Child Development.

"Our study suggests that children's relationships and interactions with intelligent characters can provide new pathways for 21st century education, with popular media characters bridging traditional boundaries between home and school settings," says Sandra L. Calvert, professor of psychology and director of the Children's Digital Media Center at Georgetown University, who led the study.

Researchers studied 217 children ages 3 to 6 years, most of whom were European American and from college-educated families. They examined the children's math learning from a game featuring a prototype of an intelligent character based on the media character Dora from the animated series, Dora the Explorer, who responded to children with spoken language. In three studies, each of which took place over about a year, researchers initially asked if children could learn from the intelligent character. Then they examined the role of children's parasocial relationships by including or not including a character in the virtual game. And then they examined the role of social contingency, with some children's talk about math receiving corrective feedback from the character and other children's talk not receiving the feedback.

Children were taught the add-1 rule--that adding 1 to a number increases the total sum by a single unit--which is one of the most basic and earliest math concepts children learn. Researchers examined whether the children could learn this rule from an intelligent character in a virtual game, and how that learning was influenced by the children's feelings for the character and their talk with the character. They also examined whether the children's learning in a screen-based context would transfer to learning with physical objects, such as crayons.

Children who had stronger emotional feelings for the character and who talked more to the character about math had quicker, more accurate math responses during their virtual game play, the study found. Children also transferred what they had learned from the virtual game to physical objects more successfully when the game included an embodied virtual character (as opposed to a noncharacter female voiceover of what was said) and when the character used socially contingent replies to children's talk about math. The findings suggest that children's emotionally tinged parasocial relationships and parasocial talk about math with virtual characters increased their mastery of early math skills.

"Our work sheds light on how children's connection to a character and interactions with them through math talk can improve learning of basic early math skills, a lesson that may be extended to other academic and social areas," explains Evan Barba, associate professor of communication, culture, and technology at Georgetown University, who coauthored the study.

"The implication of our findings is that media characters that are children's friends and playmates can also be children's trusted peers and teachers in math and other subjects," concludes Calvert.

Credit: 
Society for Research in Child Development

Researchers develop new database of druggable fusion targets

When sections from two separate genes merge due to various factors, such as translocation or splicing, the hybrid that is formed is called a gene fusion. In recent years, it has been discovered that these fusion events play a vital role in the development of cancers and other complex diseases. However, there are very few resources which collate all this information and make it available in one place. By analysing over a million nucleic acid sequences from publicly available data, a team led by Dr. Milana Frenkel-Morgenstern, of the Azrieli Faculty of Medicine at Bar-Ilan University, has identified 111,582 fusions in eight species (human, mouse, rat, fruit-fly, wild boar, zebrafish, yeast and cattle).

The latest and most up-to-date version of this database, known as ChiTaRS, has just been published in the scientific journal Nucleic Acids Research. The database is currently maintained by the Laboratory of Cancer Genomics and Biocomputing of Complex Diseases at the Azrieli Faculty of Medicine in Safed and will be extremely useful to clinicians specializing in complex diseases, particularly cancers, Alzheimer's disease, schizophrenia and many others.

This edition of ChiTaRS collects cases of druggable fusion targets. In recent years, many of these fusion genes have served as specific targets, particularly for chemotherapy drugs. Some commonly known examples include the BCR-ABL1 fusion in chronic myeloid leukaemia (CML) and the EML4-ALK chimera in non-small cell lung cancer (NSCLC). ChiTaRS 5.0 provides a list of more than 800 druggable fusions, used as targets by close to 120 drugs or drug combinations, that are useful for personalised therapy in complex diseases.

This resource will be a boon to researchers working on identifying the functional role of chimeras/fusions in carcinogenesis. It will also be advantageous in the fields of 3D chromatin maps and evolutionary biology, among others. This is a new updated resource for Prof. Frenkel-Morgenstern's research group, which already maintains online resources like a server for text-mining of fusions (ProtFus) and protein-protein interactions of fusions (ChiPPI).

Credit: 
Bar-Ilan University

Black carbon found in the Amazon River reveals recent forest burnings

image: International study quantified and characterized charcoal and soot produced by incomplete burning of trees and transported by river to the Atlantic

Image: 
Léo Ramos Chaves - Revista Pesquisa FAPESP

Besides swathes of destroyed vegetation, forest fires in Amazonia leave their imprint on the Amazon River and its tributaries. Incomplete burning of trees results in the production of black carbon, solid particles that enter the waters of the Amazon in the form of charcoal and soot and are transported to the Atlantic Ocean as dissolved organic carbon.

For the first time, an international group of researchers has quantified and characterized the black carbon flowing in the Amazon River. Their findings, published in Nature Communications, show that most of the black carbon transported to the ocean is "young" and probably results from recent forest fires.

"By radiometric dating [a method that quantifies the amount of carbon-14 or other naturally occurring radioactive isotopes present in a material and, based on their known rate of decay, determines the material's age up to about 60,000 years] and molecular composition analysis, we concluded that most of the black carbon we found in the Amazon River was produced in recent years by the burning of trees," said Jeffrey E. Richey, a professor at the University of Washington in the United States and a coauthor of the study.
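The bracketed description boils down to the standard exponential-decay relation N/N₀ = (1/2)^(t/t½). A minimal sketch of that calculation (illustrative arithmetic only, not the study's actual analysis; 5,730 years is the standard carbon-14 half-life):

```python
import math

C14_HALF_LIFE_YEARS = 5730  # standard half-life of carbon-14

def radiocarbon_age(fraction_remaining):
    """Age implied by the fraction of original carbon-14 still present,
    from N/N0 = (1/2) ** (t / half_life)."""
    if not 0 < fraction_remaining <= 1:
        raise ValueError("fraction_remaining must be in (0, 1]")
    return -C14_HALF_LIFE_YEARS * math.log2(fraction_remaining)

# A sample retaining half its original carbon-14 is one half-life old:
print(round(radiocarbon_age(0.5)))    # 5730
# With ~0.1% remaining, the age approaches the method's ~60,000-year limit:
print(round(radiocarbon_age(0.001)))
```

Because so little carbon-14 survives after ten half-lives, ages much beyond 60,000 years cannot be resolved, which is the practical limit mentioned in the quote.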

For the past five years, Richey has been a visiting researcher at the University of São Paulo's Center for Nuclear Energy in Agriculture (CENA-USP), conducting a project supported by the São Paulo Research Foundation (FAPESP) under the auspices of its São Paulo Excellence Chair (SPEC) program, with the aim of elucidating the role of the Amazon River basin in the global carbon cycle.

In November 2015, during one of the driest seasons in the region, the researchers who worked on the project collected samples of black carbon dissolved in the main channel of the Amazon and in four tributaries - the Negro, Madeira, Trombetas and Tapajós.

This period was chosen for execution of the study because water levels were low and connectivity between the Amazon and its floodplain was limited. "As a result, we were able to obtain samples only of permanent water and more accurately identify the sources of black carbon in the river basin," Richey said.

Molecular markers

Carbon-14 levels and black carbon content in the samples were measured using molecular markers, such as the polycarboxylic acids released by oxidation of the polycyclic aromatic hydrocarbons in black carbon.

Quantitative measurement of the markers was combined with molecular characterization of the samples using ultra-high-resolution mass spectrometry.

The results of the analyses showed that the black carbon dissolved in the Amazon and its tributaries is generally "young" but ages as it proceeds toward the ocean.

Samples collected in localities relatively distant from the Atlantic, such as Óbidos in Pará State, were younger, while those collected farther downstream were older.

"This suggests the black carbon may age as it moves from dry land to the river and then flows on to the sea. Also, more reactive components may be removed during the transportation of this material," Richey said.

"The more recent material may be submitted to a process of mineralization in the river as it flows to the sea. This could cause a change in its molecular profile so that it emits an 'older' signal. There are still various aspects of the storage and transportation of this material from dry land to rivers and then the ocean that we need to understand better."

In a new project, also supported by FAPESP, the researchers plan to perform a larger number of measurements for comparison with the data for 2015 in an effort to find out whether the production of "young" black carbon and hence the frequency of forest fires have increased in recent years.

"Concern about the recent burnings in Amazonia is particularly acute with regard to the carbon generated. Part of it goes into the atmosphere in the form of carbon dioxide, but a large proportion is retained in the soil or water in the form of black carbon," Richey said.

Largest source of organic matter

According to the researchers, the Amazon River accounts for one-fifth of global freshwater discharge to the oceans and is the largest single source of seaborne terrestrial organic matter, "exporting" between 22 and 27 million metric tons of dissolved organic carbon per year on average. For this reason, it is a crucial system through which to understand the cycling and transportation of black carbon, the most stable carbon compound in nature.

A large and refractory component of the global carbon cycle, black carbon in particulate form acts as a biospheric carbon sink by removing carbon from faster atmosphere-biosphere processes and storing it in sedimentary reservoirs. Knowledge of the origin, dynamics and fate of this material is essential to the development of models for predicting how the global carbon cycle may interact with climate change, Richey stressed.

"Our understanding of the role of black carbon at the regional and global scales is inadequate, owing largely to limited knowledge of the processing, quality and fate of dissolved black carbon during its exportation by rivers to the ocean," he said.

"For example, we need to know how long the black carbon produced by recent forest fires takes to reach the Amazon River."

Credit: 
Fundação de Amparo à Pesquisa do Estado de São Paulo

Living in ethnic enclaves may improve pregnancy outcomes for Asian/Pacific Islanders

Among Asian/Pacific Islander women living in the United States, those who reside in ethnic enclaves--areas with a high concentration of residents of a similar ancestry--are less likely to have pregnancy or birth complications than those living in other areas, suggests a study by researchers at the National Institutes of Health and other institutions. The findings appear in the Journal of Racial and Ethnic Health Disparities.

Women in enclaves were less likely to have gestational diabetes, to deliver preterm, or to have an infant who was small for gestational age (a possible indicator of failure to grow adequately in the uterus). The researchers theorize that living in ethnic enclaves may improve health by offering easier access to health professionals of similar ancestry, access to traditional diets that are healthier than typical U.S. diets, and less incentive to engage in unhealthy habits like smoking and alcohol abuse.

"Our findings suggest that providing Asian/Pacific Islanders with culturally appropriate health care resources may be a key factor in overcoming disparities," said the study's senior author, Pauline Mendola, Ph.D., of the Epidemiology Branch at NIH's Eunice Kennedy Shriver National Institute of Child Health and Human Development (NICHD).

The U.S. Census Bureau defines "Asian" as a person having origins in the original peoples of the Far East, Southeast Asia or the Indian subcontinent. Pacific Islanders have origins among the original peoples of Hawaii, Guam, Samoa or other Pacific Islands.

To conduct the study, researchers analyzed data from more than 8,400 women of Asian/Pacific Islander heritage who took part in a study on labor and delivery at 19 hospitals throughout the United States. They estimated the ethnic makeup of the women's communities from Census data and the American Community Survey from the National Historical Geographic Information System, which is supported in part by NICHD.

Compared to Asian/Pacific Islander women who lived in other areas, those who lived in ethnic enclaves were 39% less likely to develop gestational diabetes, 26% less likely to deliver preterm, and 32% less likely to have an infant small for gestational age.

The researchers noted that residents of ethnic enclaves often have stronger social networks than ethnic minorities who live elsewhere. They theorized that these social ties may ease the stress of discrimination and reduce the likelihood of resorting to unhealthy coping mechanisms, such as smoking and alcohol use. Moreover, residents of ethnic enclaves may have more access to health-relevant goods and services. For example, access to ethnic grocery stores makes it possible to maintain traditional diets, which are healthier than a typical U.S. diet. Similarly, residents of ethnic enclaves may have access to health care providers of similar ancestry, who can provide culturally relevant health care information in a native language.

The authors concluded that their results suggest that improving access to culturally appropriate resources among Asian/Pacific Islander communities may improve health promotion efforts in these populations. They noted that the records they analyzed did not include information on the women's ancestry or immigration history. For this reason, they could not detect differences in pregnancy outcomes between Asian or Pacific Islander ancestry groups.

Credit: 
NIH/Eunice Kennedy Shriver National Institute of Child Health and Human Development