Earth

Plant diversity in European forests is declining

image: While some small-ranged species have disappeared, widespread, nitrogen-loving, and occasionally exotic species are on the rise.

Image: 
Martin Adámek / Czech Academy of Sciences

In Europe's temperate forests, less common plant species are being replaced by more widespread species. An international team of researchers led by the German Centre for Integrative Biodiversity Research (iDiv) and the Martin Luther University Halle-Wittenberg (MLU) has found that this development could be related to increased nitrogen deposition. Their results have been published in the journal Nature Ecology & Evolution.

The number of animal and plant species is declining globally. By contrast, there are occasionally opposing trends in individual local ecosystems, where there may even be evidence of an increase in species richness (number of species). How can this apparent contradiction be explained, and what are the reasons for it?

It was precisely these questions that an international team of scientists wanted to explore. Using data from a total of 68 different locations in temperate forests in Europe - including forest sites in Thuringia, Saxony-Anhalt and Bavaria - they investigated how the diversity of herb-layer plant species has changed over recent decades. For this, the researchers assessed the populations of 1,162 different plant species. The dataset was compiled by a network of forest ecologists called forestREplot. "This network has the advantage that the experts for the actual locations can be consulted if something is unclear, and, in this way, it differs from many other large databases," said lead author Ingmar Staude, a doctoral student at iDiv and MLU.

The analysis of these data was made possible by the sDiv synthesis centre of iDiv. The scientists found that plant species with a small geographical range, which can often be found in only a few forests, tend to have an increased risk of extinction within the respective forests. "This is not so much due to the smaller population size of such plants, but rather to their ecological niche," explains Ingmar Staude. Small-ranged species are often those adapted to relatively few nutrients in the soil.

The scientists were able to show that chronic and excessive nitrogen deposition in many parts of Europe is related to the increased risk of extinction of such species. In contrast, plant species that prefer nutrient-rich soils, such as nettle and blackberry, benefit. These plants grow faster under a higher nutrient supply and now enjoy a sudden competitive advantage.

While small-ranged species have disappeared, widespread, nitrogen-loving, and occasionally exotic species are on the rise. The average biodiversity of individual forests has therefore not actually decreased. However, the biodiversity of the biome as a whole has decreased, because it was predominantly the small-ranged species that were lost. Based on their data, the researchers estimate a decrease of about 4% over recent decades. However, they point out that many of the sites investigated are in protected areas; if areas used for forestry were examined, the decline could be even greater.

"We now have to find out whether the processes we observe in forests are similar in other biomes," said Ingmar Staude. With the help of the sDiv synthesis centre of iDiv, data are to be evaluated now for a number of biomes; for example, European grasslands and Alpine ecosystems.

The loss of less common species has an impact on ecosystems. If individual plant species disappear, some insect species and soil organisms also disappear along with them. And the further regional floras homogenise, the less effectively these ecosystems can react to changing environmental conditions. The scientists of this study argue that nitrogen deposition needs to be reduced to decrease the extinctions of small-ranged species. These species play an important role when it comes to the capacity of our forest ecosystems to adapt to changing environmental conditions.

Credit: 
German Centre for Integrative Biodiversity Research (iDiv) Halle-Jena-Leipzig

Cell membrane proteins imaged in 3D

image: Ultrabright x-rays revealed the concentration of erbium (yellow) and zinc (red) in a single E. coli cell expressing a lanthanide-binding tag and incubated with erbium.

Image: 
Brookhaven National Laboratory

UPTON, NY--A team of scientists including researchers at the National Synchrotron Light Source II (NSLS-II)--a U.S. Department of Energy (DOE) Office of Science User Facility at DOE's Brookhaven National Laboratory--has demonstrated a new technique for imaging proteins in 3-D with nanoscale resolution. Their work, published in the Journal of the American Chemical Society, enables researchers to identify the precise location of proteins within individual cells, reaching the resolution of the cell membrane and the smallest subcellular organelles.

"In the structural biology world, scientists use techniques like x-ray crystallography and cryo-electron microscopy to learn about the precise structure of proteins and infer their functions, but we don't learn where they function in a cell," said corresponding author and NSLS-II scientist Lisa Miller. "If you're studying a particular disease, you need to know if a protein is functioning in the wrong place or not at all."

The new technique developed by Miller and her colleagues is similar in style to traditional methods of fluorescence microscopy in biology, in which a molecule called green fluorescent protein (GFP) can be attached to other proteins to reveal their location. When GFP is exposed to UV or visible light, it fluoresces a bright green color, illuminating an otherwise "invisible" protein in the cell.

"Using GFP, we can see if a protein is in subcellular structures that are hundreds of nanometers in size, like the nucleus or the cytoplasm," Miller said, "but structures like a cell membrane, which is only seven to 10 nanometers in size, are difficult to see with visible light tags like GFP. To see structures the size of 10 nanometers in a cell, you benefit greatly from the use of x-rays."

To overcome this challenge, researchers at NSLS-II teamed up with scientists at the Massachusetts Institute of Technology (MIT) and Boston University (BU) who developed an x-ray-sensitive tag called a lanthanide-binding tag (LBT). LBTs are very small proteins that can bind tightly to elements in the lanthanide series, such as erbium and europium.

"Unlike GFP, which fluoresces when exposed to UV or visible light, lanthanides fluoresce in the presence of x-rays," said lead author Tiffany Victor, a research associate at NSLS-II. "And since lanthanides do not occur naturally in the cell, when we see them with the x-ray microscope, we know the location of our protein of interest."

The researchers at NSLS-II, MIT, and BU worked together to combine LBT technology with x-ray fluorescence.

"Although LBTs have been used extensively over the last decade, they've never been used for x-ray fluorescence studies," Miller said.

Beyond obtaining higher resolution images, x-ray fluorescence simultaneously provides chemical images of all trace elements in a cell, such as calcium, potassium, iron, copper, and zinc. In other studies, Miller's team is researching how trace elements like copper are linked to neuron death in diseases like Alzheimer's. Visualizing the location of these elements in relation to specific proteins will be key to new findings.

In addition to their compatibility with x-rays, LBTs are also beneficial for their relatively small size, compared to visible light tags.

"Imagine you had a tail attached to you that was the size of your whole body, or bigger," Miller said. "There would be a lot of normal activities that you'd no longer be able to do. But if you only had to walk around with a tiny pig's tail, you could still run, jump, and fit through doorways. GFP is like the big tail--it can be a real impediment to the function of a many proteins. But these little lanthanide-binding tags are almost invisible."

To demonstrate the use of LBTs for imaging proteins in 3-D with nanoscale resolution, the researchers at MIT and BU tagged two proteins in a bacterial cell--one cytoplasmic protein and one membrane protein. Then, Miller's team studied the sample at the Hard X-ray Nanoprobe (HXN) beamline at NSLS-II and the Bionanoprobe beamline at the Advanced Photon Source (APS)--a DOE Office of Science User Facility at DOE's Argonne National Laboratory.

"HXN offers the world-leading x-ray focus size, which goes down to about 12 nanometers. This was critical for imaging the bacterial cell in 3-D with nanoscale resolution," said Yong Chu, lead beamline scientist at HXN. "We also developed a new way of mounting the cells on a specialized sample holder in order to optimize the efficiency of the measurements."

By coupling the unparalleled resolution of HXN with the capabilities of LBTs, the team was able to image both of the tagged proteins. Visualizing the cell membrane protein proved LBTs can be seen at a high resolution, while imaging the cytoplasmic protein showed LBTs could also be visualized within the cell.

"At high concentrations, lanthanides are toxic to cells," Victor said, "so it was important for us to show that we could treat cells with a very low lanthanide concentration that was nontoxic and substantial enough to make it past the cell membrane and image the proteins we wanted to see."

Now, with this new technique demonstrated successfully, scientists hope to be able to use LBTs to image other proteins within the cell at a resolution of 10 nanometers.

Credit: 
DOE/Brookhaven National Laboratory

Australian study: Many home blood pressure monitors not validated for accuracy

DALLAS, April 13, 2020 -- Most blood pressure devices sold for home monitoring in Australia - and possibly worldwide - may not have been validated for accuracy and could lead to misdiagnoses and inappropriate treatment, according to new research published today in Hypertension, an American Heart Association journal.

In this study, the researchers examined the online blood pressure device marketplace in Australia, including large, multi-national e-commerce businesses such as Amazon and eBay, which together were the source of over 90% of the devices examined. Validating a blood pressure device means it has gone through rigorous testing to make sure it produces accurate blood pressure readings. The researchers found:

Only 6% of the 972 models of blood pressure monitoring devices available for purchase had been validated;

More than half of the blood pressure monitors on the market were wristband models, and none had been validated;

Slightly over 18% of the upper-arm cuff blood pressure devices had been validated; and

Non-validated devices were cheaper than those that had been tested for accuracy.

"People around the world monitor their blood pressure using home devices to help to effectively manage hypertension and to help determine their risk for heart attacks or strokes," said James E. Sharman, Ph.D., lead researcher for the study and deputy director at the Menzies Institute for Medical Research at the University of Tasmania in Australia. "If the devices haven't been properly validated for accuracy, treatment decisions could be based on incorrect information. We found non-validated devices dominate the Australian marketplace, which is a major barrier to accurate blood pressure monitoring and cardiovascular risk management.

"Inaccurate blood pressure measuring devices could have a major implication for public health. If blood pressure is incorrectly overestimated it could lead to unnecessary prescriptions or higher doses than needed of blood pressure lowering medications, which are usually prescribed for life. Medications are costly, have potential side effects and patients incorrectly labeled with high blood pressure could suffer unnecessarily. When blood pressure is incorrectly underestimated, people might remain at increased risk for a heart attack or stroke that could otherwise be avoided with the appropriate medication and dose, and/or lifestyle changes," Sharman said.

"International policies need to be strengthened to ensure that home use medical devices are rigorously tested for accuracy before being cleared for sale by regulatory authorities. Currently, manufacturers of blood pressure devices conduct their own accuracy testing, an honor system with potential real-life consequences for patients," said Sharman.

Home blood pressure monitoring is recommended by the American Heart Association for people with high blood pressure because it provides many more readings than the occasional measurement in a doctor's office or health care clinic. The Association recommends using a blood pressure monitor with an upper-arm cuff that has been independently validated. In addition, the Association suggests that users take the device to their healthcare provider to have its accuracy double-checked.

The study may be limited because it only reviewed devices purchased via Australia's online marketplace. Although these suppliers also sell products in the U.S. and worldwide, the focus of this study was the Australian market, so it is not known whether these same blood pressure devices are sold in the U.S. The FDA requires manufacturers to submit documentation that devices are tested for accuracy before they can be sold in the U.S. However, the devices are not independently evaluated; rather, the manufacturers conduct their own accuracy tests.

Credit: 
American Heart Association

Tech not hurting social skills of 'kids these days'

video: Despite the time spent with smartphones and social media, young people today are just as socially skilled as those from the previous generation, a new study suggests.

Image: 
Ohio State University

COLUMBUS, Ohio - Despite the time spent with smartphones and social media, young people today are just as socially skilled as those from the previous generation, a new study suggests.

Researchers compared teacher and parent evaluations of children who started kindergarten in 1998 - six years before Facebook launched - with those who began school in 2010, when the first iPad debuted.

Results showed both groups of kids were rated similarly on interpersonal skills such as the ability to form and maintain friendships and get along with people who are different. They were also rated similarly on self-control, such as the ability to regulate their temper.

In other words, the kids are still all right, said Douglas Downey, lead author of the study and professor of sociology at The Ohio State University.

"In virtually every comparison we made, either social skills stayed the same or actually went up modestly for the children born later," Downey said.

"There's very little evidence that screen exposure was problematic for the growth of social skills."

Downey conducted the study with Benjamin Gibbs, associate professor of sociology at Brigham Young University. The study was just published online in the American Journal of Sociology.

The idea for the study came several years ago when Downey had an argument at a pizza restaurant with his son, Nick, about whether social skills had declined among the new generation of youth.

"I started explaining to him how terrible his generation was in terms of their social skills, probably because of how much time they spent looking at screens," Downey said.

"Nick asked me how I knew that. And when I checked there really wasn't any solid evidence."

So Downey, with his colleague, decided to investigate. For their study, they used data from The Early Childhood Longitudinal Study, which is run by the National Center for Education Statistics.

The ECLS follows children from kindergarten to fifth grade. The researchers compared data on the ECLS-K cohort that included children who began kindergarten in 1998 (19,150 students) with the cohort that began kindergarten in 2010 (13,400 students).

Children were assessed by teachers six times between the start of kindergarten and the end of fifth grade. They were assessed by parents at the beginning and end of kindergarten and the end of first grade.

Downey and Gibbs focused mostly on the teacher evaluations, because they followed children all the way to fifth grade, although the results from parents were comparable.

Results showed that from the teachers' perspective, children's social skills did not decline between the 1998 and 2010 groups. And similar patterns persisted as the children progressed to fifth grade.

In fact, teachers' evaluations of children's interpersonal skills and self-control tended to be slightly higher for those in the 2010 cohort than those in the 1998 group, Downey said.

Even children within the two groups who had the heaviest exposure to screens showed similar development in social skills compared to those with little screen exposure, results showed.

There was one exception: Social skills were slightly lower for children who accessed online gaming and social networking sites many times a day.

"But even that was a pretty small effect," Downey said.

"Overall, we found very little evidence that the time spent on screens was hurting social skills for most children."

Downey said while he was initially surprised to see that time spent on screens didn't affect social skills, he really shouldn't have been.

"There is a tendency for every generation at my age to start to have concerns about the younger generation. It is an old story," he said.

These worries often involve "moral panic" over new technology, Downey explained. Adults are concerned when technological change starts to undermine traditional relationships, particularly the parent-child relationship.

"The introduction of telephones, automobiles, radio all led to moral panic among adults of the time because the technology allowed children to enjoy more autonomy," he said.

"Fears over screen-based technology likely represent the most recent panic in response to technological change."

If anything, new generations are learning that having good social relationships means being able to communicate successfully both face-to-face and online, Downey said.

"You have to know how to communicate by email, on Facebook and Twitter, as well as face-to-face. We just looked at face-to-face social skills in this study, but future studies should look at digital social skills as well."

Credit: 
Ohio State University

Livestock and poultry farming should be the future focus of agricultural ammonia emissions control

image: Livestock and poultry farming is the largest contributor to agricultural ammonia emissions

Image: 
Jiayu Huang

As the only alkaline gas in the atmosphere, ammonia can react with sulfur dioxide and nitrogen oxides to form secondary fine particles, accelerating the formation of atmospheric haze. Ammonia is therefore considered a catalyst and accelerator of the widespread haze pollution problem in China. In the past decade, China has made great progress in the prevention and control of atmospheric sulfur dioxide and nitrogen oxide pollution, but little has been done in terms of ammonia emissions control.

Compiling an emission inventory is an important basis for ammonia control. Although some global and national emission inventories exist, they are of limited use for regional-scale ammonia emission reduction policies because they lack specific regional information. Professor Weishou Shen's research group at Nanjing University of Information Science and Technology (NUIST) investigated ammonia emissions from agriculture in Jiangsu Province from 2000 to 2017 based on the emission-factor method, and the findings have been published in Atmospheric and Oceanic Science Letters.

"Jiangsu is where our research team is based, and it happens that this province ranks first in ammonia emissions from agriculture in key areas of national air pollution control in China," explains Shen. "We therefore feel it's our duty to investigate the characteristics and trend of change of ammonia emissions from agriculture here."

The team selected five regionally representative nitrogen fertilizers and four typical livestock types and calculated their corresponding emission factors to investigate the characteristics and trends of agricultural ammonia emissions in Jiangsu from 2000 to 2017. They found that agricultural ammonia emissions came mainly from livestock and poultry farming (78.08%) and nitrogen fertilizer application (21.92%), and fluctuated from year to year over the study period.
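
The emission-factor method the team used can be illustrated with a minimal sketch: total emissions are the sum, over each source, of an activity level multiplied by a source-specific emission factor. The activity numbers and factors below are invented placeholders, not the region-specific values derived in the study.

```python
# Minimal sketch of the emission-factor method (illustrative numbers only;
# the study derived region-specific factors for 5 fertilizers and 4 livestock types).

# Hypothetical activity data: animal headcounts and fertilizer applied (tonnes N)
activity = {
    "pigs":        5_000_000,   # head
    "cattle":        300_000,   # head
    "poultry":   150_000_000,   # head
    "urea":          400_000,   # tonnes N applied
}

# Hypothetical emission factors: kg NH3 per head (or per tonne N) per year
emission_factor = {
    "pigs":      6.4,
    "cattle":   24.0,
    "poultry":   0.16,
    "urea":    155.0,
}

# Total emissions per source, in kilotonnes NH3 per year
emissions_kt = {src: activity[src] * emission_factor[src] / 1e6
                for src in activity}
total = sum(emissions_kt.values())

for src, e in sorted(emissions_kt.items(), key=lambda kv: -kv[1]):
    print(f"{src:>8}: {e:8.1f} kt NH3/yr ({100 * e / total:.1f}%)")
print(f"{'total':>8}: {total:8.1f} kt NH3/yr")
```

Summing such per-source terms year by year is what yields the interannual trend and the livestock-versus-fertilizer split reported above.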

"We suggest that ammonia emissions control from livestock and poultry farming should be a future focus of agricultural ammonia control," concludes Shen.

Credit: 
Institute of Atmospheric Physics, Chinese Academy of Sciences

Reducing sulfur dioxide emissions alone cannot substantially decrease air pollution

image: Chemical cycle of SO2, NOx, and NH3 in the atmosphere

Image: 
Qianqian Zhang

High loadings of fine particulate matter (PM2.5) during haze are mostly produced by chemical reactions of the reactive gas precursors, including sulfur dioxide (SO2), nitrogen oxides (NOx), ammonia (NH3), and volatile organic compounds. In an ideal world, air pollution could be cured by wiping out any one of these four PM2.5 precursors. In the real world, however, emission control strategies have to proceed step by step, weighing technological conditions and economic costs. Moreover, these gases are subject to a thermodynamic equilibrium in the atmosphere. Theoretically, NH3 preferentially combines with SO2 (via sulfuric acid) to form ammonium sulfate, which is stable in the atmosphere. Excess NH3 then reacts with nitrogen dioxide (via nitric acid) to form ammonium nitrate, which is unstable and whose formation is influenced by the relative abundance of NH3 and nitrogen dioxide. Consequently, a decrease in SO2 emissions leaves more NH3 available to form ammonium nitrate, and it may also perturb the balance between NH3 and nitrogen dioxide.

Thanks to the implementation of the Air Pollution Control Action Plan, SO2 emissions have declined dramatically since 2013. This also offers an opportunity to examine whether a reduction in SO2 perturbs the balance between NH3 and nitrogen dioxide in forming ammonium nitrate, and to inform future emission control strategies.

Professor Xingying ZHANG from the National Satellite Meteorological Center and his coauthors have addressed this issue. They evaluated and compared the behavior of PM2.5 with respect to NOx and NH3 emission changes in high (2013) and low (2018) SO2 emission cases.

Prof. Zhang's group found that, from 2013 to 2018, due to the changes in precursor emissions, the simulated annual mean PM2.5 concentration decreased by nearly 20%, more than half of which was driven by reduced SO2 emissions. "To evaluate the influence of a reduction in SO2 emissions on the sensitivity of PM2.5 to NOx and NH3 emissions, we conducted model sensitivity studies by separately perturbing NOx and NH3 emissions by ±25%. Then, we calculated the relative reduction of the PM2.5 concentration caused by a 1% decrease in NOx and NH3 emissions," explains Professor Zhang.
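
The sensitivity metric described in the quote can be expressed in a few lines: perturb one precursor's emissions by 25%, rerun the model, and normalize the PM2.5 response to a 1% emission change. The sketch below substitutes hypothetical model outputs for the actual chemical-transport simulations.

```python
# Sketch of the sensitivity calculation (hypothetical PM2.5 values; the real
# numbers come from chemical-transport model runs for 2013 and 2018).

def sensitivity(pm_base, pm_minus25):
    """Relative PM2.5 reduction (%) per 1% decrease in a precursor's emissions.

    pm_base:     simulated annual-mean PM2.5 with baseline emissions
    pm_minus25:  simulated PM2.5 with the precursor's emissions cut by 25%
    """
    relative_reduction = 100.0 * (pm_base - pm_minus25) / pm_base  # % change in PM2.5
    return relative_reduction / 25.0                               # per 1% emission cut

# Hypothetical outputs (ug/m3) for a low-SO2 (2018-like) case
pm_base = 50.0
print("NOx sensitivity:", sensitivity(pm_base, pm_minus25=47.0))  # 0.24 % per %
print("NH3 sensitivity:", sensitivity(pm_base, pm_minus25=48.5))  # 0.12 % per %
```

Comparing such sensitivities between the high-SO2 (2013) and low-SO2 (2018) cases is what reveals how the SO2 decline shifted the leverage of NOx versus NH3 controls.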

Based on these results, Prof. Zhang concludes that, given the reduced SO2 emissions and the persistently high level of NH3 emissions in China, controlling nitrogen dioxide emissions is now more effective in reducing the surface PM2.5 concentration in China. The paper has been published in Atmospheric and Oceanic Science Letters.

Credit: 
Institute of Atmospheric Physics, Chinese Academy of Sciences

Single-electron pumping in a ZnO single-nanobelt transistor

image: Conductance oscillations in the ZnO NB SET at 4.2 K: (a) with uniform Coulomb gap; (b) with non-uniform Coulomb gap. Contour plots of differential trans-conductance at 4.2 K: (c) for single-tunneling quantum dots; (d) for multi-tunneling junctions.

Image: 
©Science China Press

Single-electron pumping devices with high efficiency and controllability at room temperature play an essential role in implementing spin-based quantum computing and quantum information processing. In a recent study published in SCIENCE CHINA Physics, Mechanics & Astronomy, single-electron transistors (SETs) based on single indium-doped ZnO nanobelts (NBs) were built by Xiulai Xu and colleagues, scientists at the Institute of Physics, Chinese Academy of Sciences. Clear Coulomb oscillations in the SETs were observed at 4.2 K. Single- and double-electron pumping was also achieved by using a back-gated AC signal at different pumping voltages.

In diluted magnetic semiconductors (DMSs), which possess both magnetic and semiconductor properties, it was found that magnetism could be introduced into a host material by adding small portions of magnetic elements without degrading its optoelectronic transport properties. The researchers stated: "Many DMSs, such as indium-doped Mn5Ge3 and (Ga, Mn)As, have been used for precise spin injection and detection purposes. However, the low Curie temperatures, caused by narrow bandgaps, limit their application at room temperature. Therefore, wide bandgap semiconductor materials such as ZnO doped with transition metals are highly desirable."

The doping of transition metals in ZnO can improve its ferromagnetic properties, which are advantageous for future applications in spintronics. Among these ZnO materials, they pointed out, "ZnO nanobelts (NBs) are attractive candidates for optoelectronic and nanoscale electronic applications because of the direct wide bandgap (3.37 eV), high exciton binding energy at room temperature (60 meV), and large surface-to-volume ratio."

Being highly charge sensitive, single-electron transistors (SETs) are ideal for studying quantum effects such as Coulomb blockade, tunneling, and single-electron pumping, and have found broad application in charge detection, thermometry, single-spin detection, single-photon detection, and so on. According to the researchers, single-electron tunneling-based devices such as SETs and single-electron pumps have been investigated using metals, semiconducting materials, and DMSs [(Ga, Mn)As-based SETs] for spin storage and single-electron charging. However, only a few studies on ZnO NB SETs have been conducted, and there has been no research into the advantages of single-electron pumping in ZnO quantum dots for single-electron spin control.

In the study, SETs based on single indium-doped ZnO NB were built. Strong Coulomb oscillations were observed at 4.2 K. "Periodic and non-periodic Coulomb diamonds observed were attributed to the presence of single uniform quantum dots and multi-quantum dots, respectively. The charging energy values were 4 and 5 meV in the case of the single and multi-dots systems, respectively, and the corresponding diameters of the quantum dots were approximately 86 and 70 nm," they explained.
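
As a rough cross-check of how a Coulomb charging energy translates into a dot size, one can use a simple self-capacitance disk model, C = 4·ε0·εr·d. This model and the effective permittivity are assumptions on our part; the authors' extraction may use a different capacitance model, so the sketch below is only order-of-magnitude consistent with the reported diameters.

```python
# Order-of-magnitude estimate of quantum-dot size from the Coulomb charging
# energy, using a self-capacitance disk model C = 4*eps0*epsr*d (an assumption;
# the paper's extraction may use a different model/effective permittivity).
from scipy.constants import e, epsilon_0

def dot_diameter_nm(charging_energy_meV, eps_r=8.5):
    E_c = charging_energy_meV * 1e-3 * e       # charging energy in joules
    C = e**2 / E_c                             # from E_c = e^2 / C
    return C / (4 * epsilon_0 * eps_r) * 1e9   # d = C / (4*eps0*epsr), in nm

for E_c in (4, 5):  # meV, the two values reported for the single- and multi-dot cases
    print(f"E_c = {E_c} meV  ->  d ~ {dot_diameter_nm(E_c):.0f} nm")
```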

Single- and double-electron pumping was also achieved by using a back-gated AC signal for different pumping voltages. The realization of controlled single- and double-electron pumping in ZnO quantum dots with the simplest configuration was a significant step toward understanding the coherent properties of electron spin in quantum dots for future applications.

The results indicate that ZnO NBs are promising candidates for single-electron spin detection, which is useful for quantum computing and quantum information. Furthermore, the simple configuration of the device used in the study will make it more compatible with standard Si technology in the future.

Credit: 
Science China Press

Computer model predicts how drugs affect heart rhythm

image: Colleen E. Clancy with Pei-Chi Yang and Kevin DeMarco of her research team (from left to right).

Image: 
Copyright UC Regents

UC Davis Health researchers have developed a computer model to screen drugs for unintended cardiac side effects, especially arrhythmia risk.

Published in Circulation Research, the study was led by Colleen E. Clancy, professor of physiology and membrane biology, and Igor Vorobyov, assistant professor of physiology and membrane biology.

Clancy is a recognized leader in using high-performance computing to understand electrical changes in the heart.

"One main reason for a drug being removed from the market is potentially life-threatening arrhythmias," Clancy said. "Even drugs developed to treat arrhythmia have ended up actually causing them."

The problem, according to Clancy, is that there is no easy way to preview how a drug interacts with hERG-encoded potassium channels essential to normal heart rhythm.

"So far there has been no surefire way to determine which drugs will be therapeutic and which will harmful," Clancy said. "What we have shown is that we can now make this determination starting from the chemical structure of a drug and then predicting its impact on the heart rhythm."

Using a drug's chemical formula, the computer model reveals how that drug specifically interacts with hERG channels as well as cardiac cells and tissue. The outcomes can then be validated against clinical data from electrocardiogram (ECG) recordings of patients. For the study, the researchers validated the model with ECGs of patients taking two drugs known to interact with hERG channels -- one with a strong safety profile and another known to increase arrhythmias. The results demonstrated the accuracy of the model.
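
For context, the textbook baseline for representing a drug's effect on the hERG current (IKr) is a simple concentration-dependent conductance block. The study's model is far more detailed, resolving state-dependent drug-channel interactions derived from chemical structure; the sketch below shows only the simple baseline, with invented drug names and IC50 values.

```python
# Textbook-style conductance block of the hERG/IKr current (a deliberately
# simple baseline; the study's model resolves state-dependent drug binding
# derived from chemical structure, which this sketch does not attempt).

def ikr_scale(drug_conc_uM, ic50_uM, hill=1.0):
    """Fraction of IKr conductance remaining under simple pore block."""
    return 1.0 / (1.0 + (drug_conc_uM / ic50_uM) ** hill)

# Invented example drugs with hypothetical potencies against hERG
for name, conc, ic50 in [("drug A (safe)", 1.0, 30.0),
                         ("drug B (risky)", 1.0, 0.5)]:
    remaining = ikr_scale(conc, ic50)
    print(f"{name}: {100 * remaining:.0f}% of IKr remains "
          f"-> {'minimal' if remaining > 0.8 else 'substantial'} block")
```

Because blocking IKr prolongs cardiac repolarization, a strongly blocked current in such a calculation is the classic warning sign of arrhythmia risk that the full model then evaluates at the cell and tissue level.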

Clancy envisions the model will offer an essential pre-market test of cardiac drug safety. That test could ultimately be used for other organ systems such as the liver and brain.

"Every new drug needs to go through a screening for cardiac toxicity, and this could be an important first step to suggesting harm or safety before moving on to more expensive and extensive testing," Clancy said.

Credit: 
University of California - Davis Health

Long-living tropical trees play outsized role in carbon storage

image: Since 1982, more than 200,000 trees are measured every five years on Barro Colorado Island in Panama.

Image: 
Christian Ziegler

A group of trees that grow fast, live long lives and reproduce slowly account for the bulk of the biomass--and carbon storage--in some tropical rainforests, a team of scientists says in a paper published this week in the journal Science. The finding that these trees, called long-lived pioneers, play a much larger role in carbon storage than previously thought may have implications in efforts to preserve forests as a strategy to fight climate change.

"People have been arguing about whether these long-lived pioneers contribute much to carbon storage over the long term," said Caroline Farrior, an assistant professor of integrative biology at The University of Texas at Austin and a primary investigator on the study. "We were surprised to find that they do."

It is unclear to what extent tropical rainforests can help soak up the excess carbon dioxide in the atmosphere produced by burning fossil fuels. Nonetheless, the new study provides insights about the role of different species of trees in carbon storage.

Using more than 30 years' worth of data collected from a tropical rainforest in Panama, the team has uncovered some key traits of trees that, when integrated into computer models related to climate change, will improve the models' accuracy. With the team's improved model, the scientists plan to begin answering questions about what drives forest composition over time and what factors affect carbon storage.

Most existing Earth system models used to forecast global climate decades from now, including those used by the Intergovernmental Panel on Climate Change, represent the trees in a forest as all basically the same.

"This analysis shows that that is not good enough for tropical forests and provides a way forward," Farrior said. "We show that the variation in tropical forest species's growth, survival and reproduction is important for predicting forest carbon storage."

The project was led by Nadja Rüger, research fellow at the German Centre for Integrative Biodiversity Research (iDiv), Halle-Jena-Leipzig.

In addition to the finding about long-lived pioneers, the team found the composition of a tropical forest over time depends on how each tree species balances two different sets of trade-offs: growth versus survival (for example, one type of tree might grow fast but die young) and stature versus reproduction (another might grow tall but reproduce leisurely). Plotting every species as a point on a graph based on where they fall along these two different axes allowed the scientists to have a more sophisticated and accurate model than prior ones, which usually focused exclusively on the first of these two trade-offs or parametrized the groups by different means.

"To really appreciate that there is this second trade-off between stature and reproduction, and that it's important in old-growth forests, is a big deal biologically," Farrior said.

The team also discovered that the nearly 300 unique tree species that live on Barro Colorado Island, which sits in the middle of the Panama Canal, can be represented in their computer model by just five functional groups and still produce accurate forecasts of tree composition and forest biomass over time.
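
The reduction from hundreds of species to a handful of functional groups can be sketched as clustering in the two trade-off dimensions described above. The coordinates below are randomly generated placeholders, not the Barro Colorado demographic data, so this only illustrates the idea of grouping species by their positions on the two axes.

```python
# Illustrative sketch: collapse many species into a few functional groups by
# clustering their positions along the two demographic trade-off axes.
# Random placeholder data; the study used demographic rates of ~300 BCI species.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
n_species = 282
traits = np.column_stack([
    rng.normal(size=n_species),  # axis 1: growth vs. survival ("fast-slow")
    rng.normal(size=n_species),  # axis 2: stature vs. reproduction
])

groups = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(traits)
for g in range(5):
    members = traits[groups == g]
    print(f"group {g}: {len(members):3d} species, "
          f"centroid = ({members[:, 0].mean():+.2f}, {members[:, 1].mean():+.2f})")
```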

It's not possible to directly verify the forecasts of a forest model in future decades. So the researchers did the next best thing: They seeded their model with forest composition data collected at their site in Panama during the 1980s and then ran the model forward to show that it accurately represents the changes that occurred from then until now. This is called "hindcasting."

Next, they plan to explore how a warming world might benefit trees with certain traits over others, shifting forest composition and the potential of forests to store carbon.

"One of the biggest unknowns in climate forecasting is: What are trees going to do?" said Farrior. "We really need to get a handle on that if we're going to accurately predict how climate will change and manage forests. Right now, they're absorbing some of the excess carbon we're producing and delaying climate change, but will they keep doing it?"

Credit: 
University of Texas at Austin

A glimpse into the future of tropical forests

image: Around 300 tree species grow in 50 hectares of old-growth forest at Barro Colorado Island, Panama.

Image: 
Christian Ziegler

Leipzig / Panama City. Tropical forests are a hotspot of biodiversity. Against the backdrop of climate change, their protection plays a special role and it is important to predict how such diverse forests may change over decades and even centuries. This is exactly what researchers at the German Centre for Integrative Biodiversity Research (iDiv), the University of Leipzig (UL) and other international research institutions have achieved. Their results have been published in the scientific journal Science.

Nowhere in the world is the loss of so-called primary forests advancing faster than in the tropics. Natural primary forests are compelled to give way to agriculture and livestock farming and, as a result of clearing, important habitats are lost. In addition, the carbon stored in the trees is released as CO2. When the cleared areas are no longer used, new 'secondary' forests grow on them, and these then recapture part of the previously released CO2. Promoting such natural forest areas can therefore offer an inexpensive way of removing climate-damaging CO2 from the atmosphere and, at the same time, promoting biodiversity.

However, not all forests develop in the same way. In order to manage the recovery and renaturation of tropical forests, it is necessary to be able to predict how the forests will develop. To achieve this, certain parameters must be known: how quickly do the trees grow, and how quickly do they die? How many offspring do they produce, which then in turn ensure the continued existence of the species? This is precisely the data that has been recorded for 282 tree species over the past 40 years in Panama, in one of the most researched tropical rainforests in the world.

Using this data, the researchers were able to show that trees pursue different strategies during their development. On the one hand, they differ in terms of their pace of life: while 'fast' species both grow and die quickly, 'slow' species grow slowly and reach an old age. On the other hand, trees can differ in stature, irrespective of pace of life. The 'infertile giants', also known as long-lived pioneers, grow relatively quickly and reach a great stature, but produce only a few offspring per year. They contrast with the 'fertile dwarfs': small shrubs and treelets which grow slowly and do not live long, but produce a large number of offspring.

But how many, and which, factors of this demographic diversity have to be taken into account for us to be able to predict the development of a diverse forest? An international research team used a digital experiment to answer this question. In a computer model, they simulated how trees grow, die, produce offspring and compete for light, as in a real forest. They allowed different configurations of the model to compete against each other: these contained either all 282 species from Panama or only a few selected 'strategy types'. The species differed in only one or two respects: their pace of life and their stature. The respective model predictions were then compared with the observed development of real, regrowing secondary forests.
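
The grow/die/reproduce loop at the heart of such a simulation can be sketched in miniature. The toy below tracks two invented strategy types with made-up demographic rates and replaces explicit light competition with a crude density cap, so it illustrates the structure of the experiment rather than the actual model.

```python
# Toy demographic loop in the spirit of the simulation described above
# (each year: grow, die, reproduce). Parameters are invented for illustration;
# the actual model also resolves light competition among individuals.
import numpy as np

rng = np.random.default_rng(1)

# Two contrasting strategy types:
#            growth   annual     recruits per
#            (cm/yr)  mortality  adult per yr
strategies = {
    "fast":    (1.5,   0.08,      0.50),
    "pioneer": (1.0,   0.02,      0.05),   # long-lived pioneer: slow to die, few offspring
}

# state: one (strategy, diameter_cm) tuple per tree
trees = [("fast", 1.0)] * 200 + [("pioneer", 1.0)] * 200

for year in range(100):
    next_trees = []
    for strat, dbh in trees:
        growth, mort, fecundity = strategies[strat]
        if rng.random() > mort:                      # survives this year
            next_trees.append((strat, dbh + growth))
        if dbh > 10 and rng.random() < fecundity:    # adults recruit offspring
            next_trees.append((strat, 1.0))
    if len(next_trees) > 2000:                       # crude stand-in for competition
        keep = rng.choice(len(next_trees), size=2000, replace=False)
        next_trees = [next_trees[i] for i in keep]
    trees = next_trees

# Biomass proxy: diameter^2 summed per strategy
for strat in strategies:
    n = sum(1 for s, _ in trees if s == strat)
    biomass = sum(d * d for s, d in trees if s == strat)
    print(f"{strat:>8}: {n:4d} trees, biomass index {biomass:,.0f}")
```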

The researchers found that their model worked reliably with only five strategy types, but that both strategy dimensions must be taken into account. "In particular, the long-lived pioneers are important because they account for the bulk of biomass - and carbon - in this forest type at almost all ages, and not only in middle-aged forests as assumed so far," said first author Dr Nadja Rüger, junior research group leader at iDiv and UL.

Following years of research, Rüger and her colleagues have now been able to establish a completely data-driven modelling approach which can be used to predict the development of species-rich forests, without the usual, tedious adjustment and calibration of unknown model parameters, thus saving both time and resources. "Basically, we were able to reduce the forest to its essence, and that was only possible because we know so much about the tree species in the forest in Panama," said Rüger.

While forests are being impacted by climate change, they are also significantly slowing its pace: estimates suggest that the earth's vegetation soaks up approximately 34% of the carbon we emit annually. However, scientists are not sure whether we will be able to count on this significant ecosystem service in the future. "By advancing our ability to predict forest carbon storage and represent the rich biodiversity within tropical forests, we are now on a path to much more accurately capture important ecological processes in the global models that are used by policy makers to predict the pace of climate change," said co-author Caroline Farrior, an assistant professor at the University of Texas at Austin.

Credit: 
German Centre for Integrative Biodiversity Research (iDiv) Halle-Jena-Leipzig

Researchers develop one-way street for electrons

image: In an ultra-small geometry under the right conditions, electrons can be treated like particles bouncing off a wall. If the electrons are contained in a wire and symmetry is broken, the electrons can be preferentially funneled in one direction and blocked in the other, creating an electrical diode.

Image: 
J. Custer

Researchers at the University of North Carolina at Chapel Hill made a one-way street for electrons that may unlock the ability for devices to process ultra-high-speed wireless data and simultaneously harvest energy for power. The researchers did this by shaping silicon on a microscopic scale to create a funnel, or "ratchet," for electrons.

This method overcomes the speed limitations of prior technologies by removing interfaces that tend to slow down devices.

"This work is exciting because it could enable a future where things like low-power smartwatches are wirelessly charged from the data they already receive, without ever needing to leave a person's wrist," said James Custer Jr., a doctoral student in UNC-Chapel Hill's College of Arts & Sciences.

The findings were published April 10 in the journal Science. Custer is lead author. He worked with collaborators at Duke and Vanderbilt universities.

Electrons carry electrical current, and they typically don't care about the shape of the wire in which the current flows. Yet, when things get very small, shape begins to matter. The funnels here are ultra-small, more than a million times smaller than a typical electrical wire. As a result, the electrons inside behave like billiard balls -- bouncing freely off of surfaces. The asymmetric funnel shape then causes the electrons to bounce preferentially in one direction. In effect, the electrons are forced to follow a one-way street.

Under a direct current (DC) voltage, the funnel makes it easier for current to flow in a forward direction than reverse direction, creating an electrical diode. When alternating current (AC) is applied, the structure still only allows current to flow in one direction, behaving as a ratchet and causing electrons to build up on one side. This process is like a socket wrench, which ratchets force to produce physical motion in only one direction.
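
The ratcheting described above can be illustrated numerically: give a device an asymmetric current-voltage curve (current flows more easily forward than backward) and drive it with a zero-mean AC voltage; the time-averaged current comes out nonzero, meaning charge is pumped in one direction. The piecewise-linear I-V curve below is an invented placeholder, not measured nanowire data.

```python
# Illustration of ratcheting: an asymmetric (diode-like) I-V curve turns a
# zero-mean AC drive into a nonzero DC current. The conductances are invented
# placeholders, not parameters of the actual nanowire devices.
import numpy as np

def current(v, g_forward=1.0, g_reverse=0.1):
    """Asymmetric I-V: current flows more easily in the forward direction."""
    return np.where(v >= 0, g_forward * v, g_reverse * v)

t = np.linspace(0.0, 1.0, 10_000)
v_ac = np.sin(2 * np.pi * 5 * t)   # zero-mean AC drive, 5 full cycles

i = current(v_ac)
print(f"mean drive voltage: {v_ac.mean():+.4f}  (zero-mean AC)")
print(f"mean current:       {i.mean():+.4f}  (net DC flow: charge builds up on one side)")
```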

The work has shown that these electron ratchets create "geometric diodes" that operate at room temperature and may unlock unprecedented abilities in the elusive terahertz regime.

"Electrical diodes are a basic component of electronics, and our results suggest there could be a completely different paradigm for the design of diodes that operate at very high frequencies," said James Cahoon, an associate professor of chemistry. Cahoon is corresponding author and led the study's research group. "The results are possible because we grow the structures from the bottom up, using a synthetic process that yields geometrically precise, single-crystalline materials."

The electron ratchets are created by a process previously developed in the Cahoon group called ENGRAVE, which stands for "Encoded Nanowire Growth and Appearance through VLS and Etching." ENGRAVE uses a vapor-liquid-solid process to chemically grow single-crystal cylinders of silicon, called nanowires, with precisely defined geometry.

"A lot of the work in this field has previously been done with expensive materials at cryogenic temperatures, but our work highlights that geometric diodes made with relatively cheap silicon can function at room temperature, which even surprised us at first," Custer said. "We hope our results spark a surge of interest in geometric diodes."

Diodes are a backbone of modern technology; they allow computers to process data by encoding signals as 1s and 0s. Traditionally, diodes require interfaces between materials, such as between n-type and p-type semiconductors or between semiconductors and metals. By contrast, geometric diodes are made of a single material and simply use shape to direct charges preferentially in one direction.

With continued development, nanowire electron ratchets promise to pave a high-speed, one-way road to new technologies.

Credit: 
University of North Carolina at Chapel Hill

Streaming services flouting India's regulations banning tobacco imagery in all media

Streaming services that are popular with teens and young people in India are flouting the nation's regulations on exposure to tobacco imagery in any media platform, reveals an analysis of 10 on-demand streaming series, published online in the journal Tobacco Control.

The rules, which are designed to protect young people, should be more rigorously enforced, and the guidelines for the implementation of Article 13 of the WHO Framework Convention on Tobacco Control should be updated to include streaming services and other new media, conclude the researchers.

Almost 266 million people aged 15 and older use tobacco in India, and the resulting health problems are "substantial," say the researchers.

In response, India has strengthened its tobacco control efforts, particularly in relation to teens and young people who are highly susceptible to the effects of tobacco imagery, by banning the advertising and promotion of all tobacco products in every media platform.

And since 2012, any film or TV programme containing tobacco imagery must include prominent audio-visual anti-tobacco health warnings for specified periods of time, irrespective of whether it's produced in India or elsewhere.

On-demand streaming services, such as Netflix, YouTube, Hotstar and Amazon Prime Video, have become increasingly popular among young people in India. The researchers therefore wanted to find out how much tobacco imagery is present in streamed content and how well streaming services comply with Indian tobacco control regulations.

They held focus group discussions with school and college students, aged 15 to 24, in New Delhi, to find out which streaming services they used the most and what they watched.

Based on these discussions, the researchers came up with the 10 most popular series, comprising 188 episodes. All but two of the series were streamed on Netflix; the rest were streamed on Amazon Prime Video. Only two of the series were Indian productions.

The 10 series were: The Marvellous Mrs Maisel (rated 16+); Stranger Things (16+); Bodyguard (16+); Riverdale (13+); Narcos (16+); Sacred Games (18+); Mirzapur (18+); Chilling Adventures of Sabrina (16+); 13 Reasons Why (16+); and The Crown (16+).

The researchers used a validated method (Breathe California) to count the number of tobacco incidents in each series. Incidents were defined as the actual or implied use of a tobacco product by an actor.

The analysis showed that 70% of the series depicted tobacco incidents, with totals ranging from zero to 1,652 (in The Marvellous Mrs Maisel). More than half of all episodes (57.4%) contained at least one such incident.

Narcos contained 833 incidents; The Crown 599; Stranger Things 233; Chilling Adventures of Sabrina 171; Mirzapur 78; and Sacred Games 67.

The Marvellous Mrs Maisel (18 episodes over two seasons) had the highest average number of tobacco incidents (87.5) per episode for the entire series, followed by The Crown (20 episodes over two seasons) with 29, and Narcos (30 episodes over three seasons) with 26.5.

Indian productions contained fewer tobacco incidents per episode and per hour than those produced elsewhere.

Four out of the 10 series depicted tobacco brands, including Mayburn, Camel, Marlboro, Salem and Newport. All these series were foreign productions.

But none of the series that included tobacco incidents complied with the tobacco-free film and TV rules in India.

The extent of tobacco imagery and brand placement in on-demand streaming content in India is high, while compliance with the rules is low, say the researchers.

"There is no reason to expect that the effects of exposure to tobacco imagery in streaming shows should be any different than the effects of tobacco imagery in films," they write.

"On-demand streaming content providers and governments should heed the lessons learnt from the film industry and apply the same rules to include tobacco imagery in the content available through on-demand streaming platforms," they add.

And it's clear that the legislation "is blatantly being violated in this new media, indicating the need for better enforcement of existing rules in India and updating the guidelines for implementation of Article 13 of the WHO Framework Convention on Tobacco Control," they conclude.

Credit: 
BMJ Group

Aha! + Aaaah: Creative insight triggers a neural reward signal

image: Animation showing insight-related brain activity followed by neural reward-related activity. The horizontal axis shows time, in milliseconds, counting down to the button press that test subjects made (at 0 milliseconds) immediately upon solving an anagram puzzle. Maps of high-frequency "gamma" EEG activity are projected onto two head models. The map on the right shows activity associated with insight solutions minus activity associated with noninsight solutions. The appearance of the lightbulb marks the onset time of the insight-related activity recorded over the left prefrontal cortex. The map on the left shows the reward-related activity recorded over the right prefrontal cortex in individuals who are high in reward sensitivity. This reward signal is marked by the appearance of a lightbulb with a smiley face at about 460 milliseconds before the button press.

Image: 
Drexel University

Creativity is one of humanity's most distinctive abilities and enduring mysteries. Innovative ideas and solutions have enabled our species to survive existential threats and thrive. Yet, creativity cannot be necessary for survival because many species that do not possess it have managed to flourish far longer than humans. So what drove the evolutionary development of creativity? A new neuroimaging study led by Yongtaek Oh, a Drexel University doctoral candidate, and John Kounios, PhD, a professor in Drexel's College of Arts and Sciences and director of its Creativity Research Lab, points to an answer.

The study, recently published in NeuroImage, discovered that, in some people, creative insights, colloquially known as "aha moments," trigger a burst of activity in the brain's reward system -- the same system which responds to delicious foods, addictive substances, orgasms and other basic pleasures.

Because reward-system activity motivates the behaviors that produce it, individuals who experience insight-related neural rewards are likely to engage in further creativity-related activities, potentially to the exclusion of other activities -- a notion that many puzzle aficionados, mystery-novel devotees, starving artists and underpaid researchers may find familiar, according to Kounios.

"The fact that evolution has linked the generation of new ideas and perspectives to the human brain's reward system may explain the proliferation of creativity and the advancement of science and culture," said Kounios.

The study focused on the phenomenon of aha moments, or insights, as prototypical instances of creativity. Insights are sudden experiences of nonobvious perspectives, ideas or solutions that can lead to inventions and other breakthroughs. Many people report that insights are accompanied by a mind-expanding rush of pleasure.

The team recorded people's high-density electroencephalograms (EEGs) while they solved anagram puzzles, which required them to unscramble a set of letters to find a hidden word. Such puzzles serve as small-scale models of more complex forms of problem-solving and idea generation. They noted which solutions were achieved as insights that suddenly popped into awareness, in contrast to solutions that were generated by methodically rearranging the letters to look for the right order.

Importantly, the test subjects also filled out a questionnaire that measured their "reward sensitivity," a basic personality trait that reflects the degree to which an individual is generally motivated to gain rewards rather than avoid losing them.

The test subjects showed a burst of high-frequency "gamma-band" brainwaves associated with aha-moment solutions. However, only highly reward-sensitive people showed an additional burst of high-frequency gamma waves about a tenth of a second later. This second burst originated in the orbitofrontal cortex, a part of the brain's reward system.
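
Gamma-band analyses like this typically follow a standard recipe: band-pass the EEG around the gamma range, take the analytic amplitude, and compare insight with noninsight trials time-locked to the response. The sketch below applies that recipe to synthetic data; it is an illustration of the general method, not the authors' pipeline.

```python
# Standard gamma-band power analysis on synthetic data (illustrative recipe;
# the study used high-density EEG with its own preprocessing and statistics).
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 500                                   # sampling rate, Hz
t = np.arange(-1.0, 0.5, 1 / fs)           # time relative to button press, s

rng = np.random.default_rng(0)

def trial(has_burst):
    x = rng.normal(scale=1.0, size=t.size)             # background EEG noise
    if has_burst:                                      # add a ~45 Hz gamma burst
        burst = np.exp(-((t + 0.3) / 0.05) ** 2) * np.sin(2 * np.pi * 45 * t)
        x += 2.0 * burst                               # centered ~300 ms pre-press
    return x

insight    = np.array([trial(True)  for _ in range(60)])
noninsight = np.array([trial(False) for _ in range(60)])

b, a = butter(4, [30, 80], btype="bandpass", fs=fs)    # gamma band, 30-80 Hz

def gamma_power(trials):
    filtered = filtfilt(b, a, trials, axis=1)
    return np.abs(hilbert(filtered, axis=1)).mean(axis=0)   # mean envelope

diff = gamma_power(insight) - gamma_power(noninsight)   # insight minus noninsight
peak = t[np.argmax(diff)]
print(f"gamma-power difference peaks at {1000 * peak:.0f} ms relative to button press")
```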

The study shows that some people experience creative insights as intrinsically rewarding. Because this reward-related burst of neural activity occurred so quickly after the initial insight, only a tenth of a second, it did not result from a conscious appraisal of the solution. Rather, this fast reward response was triggered by, or integrated with, the insight itself.

Low-reward-sensitivity test subjects did experience nearly as many insights as the high-reward-sensitivity ones, but their insights did not trigger a significant neural reward response. Thus, neural reward is not a necessary accompaniment to insight, though it occurs in many people.

This study suggests that measurements of general reward sensitivity may help to predict who will practice, develop and expand their creative abilities over time.

Credit: 
Drexel University

Canada lynx disappearing from Washington state

PULLMAN, Wash. - Canada lynx are losing ground in Washington state, even as federal officials are taking steps to remove the species' threatened status under the Endangered Species Act.

A massive monitoring study led by Washington State University researchers has found lynx on only about 20% of its potential habitat in the state. The study, published recently in the Journal of Wildlife Management, covered more than 4,300 square miles (about 11,100 square kilometers) in northeastern Washington with camera traps but detected lynx in only 29 out of 175 monitored areas.

The results paint an alarming picture not only for the persistence of lynx but many other cold-adapted species, said Dan Thornton, an assistant professor in WSU's School of the Environment.

"Lynx are good sentinel species for climate change," said Thornton, the corresponding author on the study. "They are specialized, have larger ranges and need really cold, snowy environments. So, as they go, they are like an early warning system for what's going to happen to other climate sensitive species."

Wildfire, rising temperatures and decreasing snowpack have all hurt the lynx's ability to survive in Washington, the researchers found.

In the last 24 years, large wildfires have ripped through northeastern parts of the state, destroying habitat for lynx and their favorite food: snowshoe hare. It can take as long as 20 to 40 years for that landscape to recover.

The lack of snow and cold is also a problem, as lynx, with their bigger paws, are specially adapted to hunt on snow and for the prey that live there. As temperatures rise, warmer-adapted species like bobcat and cougar could also bring competition into lynx territory.

"We learned that lynx are responding strongly to many of these factors - snow conditions, temperature and fire - that are likely to change even more as the climate warms," said Thornton.

The connection to Canadian populations is also key to the lynx's survival in Washington, and that connection is complicated by differing conservation status. In Washington state, lynx are protected at the state and federal levels as a threatened species. In Canada, they are harvested for their pelts.

The lynx's protected status in the U.S. may also change. Lynx are currently found in Maine, Minnesota, Montana, Colorado, Idaho and Washington, but a 2016 federal draft assessment found the species would disappear from its northern range without protection by 2100. However, a new assessment in 2018 concluded that the lynx could be removed from threatened status under the federal Endangered Species Act.

Living in high, remote areas, lynx are challenging to study, and estimates of actual numbers are difficult to make. However, according to an analysis by the Washington State Department of Fish and Wildlife based on data collected in the 1980s, the state used to have about 7,800 square miles of habitat capable of supporting 238 animals. In 2017, that estimate was revised down to about 2,300 square miles capable of supporting 38 to 61 lynx. This latest study adds strong evidence that their territory in Washington is contracting further.

To document the elusive animals, WSU graduate student Travis King, the lead author on the study, covered thousands of kilometers and spent two summers in the field. He also relied upon many partners and volunteers, ranging from government natural resource agency employees and conservation groups to hikers and citizen scientists. The researchers and volunteers deployed and collected 650 camera traps, which generated more than 2 million images that were, in turn, sorted with the help of dozens of WSU student volunteers.

This is the first time such a comprehensive method using camera traps to track lynx has been employed. Thornton and his colleagues are now working to use the method to estimate the lynx range in Glacier National Park in Montana.

Credit: 
Washington State University

Electronic cigarette use among young adult cancer survivors

What The Study Did: This study used national survey data from young adults ages 18 to 39 to compare e-cigarette use among cancer survivors with their peers without cancer.

Authors: Helen M. Parsons, Ph.D., M.P.H., of the University of Minnesota in Minneapolis, is the corresponding author.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

(doi:10.1001/jamaoncol.2020.0384)

Editor's Note: The article includes conflict of interest and funding/support disclosures. Please see the articles for additional information, including other authors, author contributions and affiliations, conflicts of interest and financial disclosures, and funding and support.

Credit: 
JAMA Network