Tech

The ocean's 'biological pump' captures more carbon than expected

video: Scientists have long known that the ocean plays an essential role in capturing carbon from the atmosphere, but a new study shows that the efficiency of the ocean's "biological carbon pump" has been drastically underestimated.

Image: 
Video by Elise Hugus, UnderCurrent Productions, © Woods Hole Oceanographic Institution

Every spring in the Northern Hemisphere, the ocean surface erupts in a massive bloom of phytoplankton. Like plants, these single-celled floating organisms use photosynthesis to turn light into energy, consuming carbon dioxide and releasing oxygen in the process. When phytoplankton die or are eaten by zooplankton, the carbon-rich fragments sink deeper into the ocean, where they are, in turn, eaten by other creatures or buried in sediments. This process is key to the "biological carbon pump," an important part of the global carbon cycle.

Scientists have long known that the ocean plays an essential role in capturing carbon from the atmosphere, but a new study from Woods Hole Oceanographic Institution (WHOI) shows that the efficiency of the ocean's "biological carbon pump" has been drastically underestimated, with implications for future climate assessments.

In a paper published April 6 in Proceedings of the National Academy of Sciences, WHOI geochemist Ken Buesseler and colleagues demonstrated that the depth of the sunlit area where photosynthesis occurs varies significantly throughout the ocean. This matters because the phytoplankton's ability to take up carbon depends on the amount of sunlight that's able to penetrate the ocean's upper layer. By taking into account the depth of the euphotic, or sunlit, zone, the authors found that about twice as much carbon sinks into the ocean per year as previously estimated.
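
To illustrate why the choice of reference depth matters, consider the classic Martin power-law profile of sinking particle flux, widely used in ocean biogeochemistry. The short Python sketch below is purely illustrative - the flux value and the 50 m euphotic depth are assumed numbers, not results from the WHOI study - but it shows how referencing export to a shallow euphotic-zone boundary rather than a fixed 100 m horizon can roughly double the estimate:

# Illustrative sketch using the Martin et al. (1987) power-law flux profile:
# F(z) = F(z_ref) * (z / z_ref) ** -b, with b ~ 0.86.
# The flux value and the 50 m euphotic depth below are assumed for
# illustration; they are not values from the Buesseler et al. study.

def martin_flux(flux_ref, z_ref, z, b=0.86):
    """Sinking carbon flux at depth z, scaled from a reference depth z_ref."""
    return flux_ref * (z / z_ref) ** -b

flux_at_100m = 10.0    # assumed flux at a fixed 100 m horizon (mmol C per m2 per day)
euphotic_depth = 50.0  # assumed depth of the sunlit zone at this site (m)

flux_at_euphotic_base = martin_flux(flux_at_100m, z_ref=100.0, z=euphotic_depth)
print(f"Flux at fixed 100 m horizon:  {flux_at_100m:.1f}")
print(f"Flux at euphotic base (50 m): {flux_at_euphotic_base:.1f}")
# Where the sunlit layer is shallow, measuring export at its true base rather
# than at a fixed 100 m yields roughly twice the flux in this example.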

The paper relies on previous studies of the carbon pump, including the authors' own. "If you look at the same data in a new way, you get a very different view of the ocean's role in processing carbon, hence its role in regulating climate," says Buesseler.

"Using the new metrics, we will be able to refine the models to not just tell us how the ocean looks today, but how it will look in the future," he adds. "Is the amount of carbon sinking in the ocean going up or down? That number affects the climate of the world we live in."

In the paper, Buesseler and his coauthors call on their fellow oceanographers to consider their data in the context of the actual boundary of the euphotic zone.

"If we're going to call something a euphotic zone, we need to define that," he says. "So we're insisting on a more formal definition so that we can compare sites."

Rather than taking measurements at fixed depths, the authors used chlorophyll sensors - which indicate the presence of phytoplankton - to rapidly assess the depth of the sunlit region. They also suggest using the signature of a naturally occurring thorium isotope to estimate the rate at which carbon particles are sinking.

Credit: 
Woods Hole Oceanographic Institution

Pollen-based 'paper' holds promise for new generation of natural components

video: NTU Singapore scientists have created a paper-like material derived from pollen that bends and curls in response to changing levels of environmental humidity. The ability of this paper made from pollen to alter its mechanical characteristics in response to external stimuli may make it useful in a wide range of applications, including soft robots, sensors, artificial muscles, and electric generators.

Combined with digital printing, pollen paper may hold promise for the fabrication of a new generation of programmable natural actuators - components in a machine that are responsible for moving and controlling a mechanism.

Image: 
NTU Singapore

Scientists at Singapore's Nanyang Technological University (NTU Singapore) have created a paper-like material derived from pollen that bends and curls in response to changing levels of environmental humidity.

The ability of this paper made from pollen to alter its mechanical characteristics in response to external stimuli may make it useful in a wide range of applications, including soft robots, sensors, artificial muscles, and electric generators.

Combined with digital printing, pollen paper may hold promise for the fabrication of a new generation of programmable natural actuators - components in a machine that are responsible for moving and controlling a mechanism.

The findings, published in the Proceedings of the National Academy of Sciences of the United States of America this week, show how the NTU Singapore team formulated the paper using softened pollen grains.

They demonstrated the pollen-based paper's properties by folding it into a flower that 'blooms' in the presence of water vapour. They also showed that the pollen material's physical properties can be adjusted, with a strip of pollen-based paper that is able to 'walk'.

The corresponding authors of this paper are Assistant Professor Song Juha of the School of Chemical and Biomedical Engineering, and Professor Cho Nam-Joon and Professor Subra Suresh of the School of Materials Science and Engineering at NTU.

NTU Distinguished University Professor Subra Suresh, who is also the NTU President, said: "Much progress has been made in developing bioinspired sensors and actuators based on engineered synthetic materials, but these materials come with limitations such as issues with environmental sustainability and relatively high cost. There remains a critical need to incorporate cost-effective and eco-friendly materials. Just as pine cones open and close their scales depending on the amount of moisture in the air, our NTU research team has shown that pollen paper created from naturally abundant pollen grains responds as an actuator to changes in environmental humidity."

NTU Professor Cho Nam-Joon, who holds the Materials Research Society of Singapore Chair in Materials Science and Engineering, said: "These findings build on the recent work by our NTU team, in which we showed how hard pollen grains can be converted into soft microgel particles that alter their properties in response to external stimuli. This process also renders pollen, and the products we create from it, non-allergenic."

Pollen paper that bends, flips and moves

To form the paper, the NTU team first transformed the ultra-tough pollen grains from sunflowers into a pliable, gel-like material through a process similar to conventional soap-making. This process includes removing the sticky oil-based pollen cement that coats the grain's surface, before incubating the pollen in alkaline conditions for hours.

The resulting gel-like material is then cast into a mould and left to dry, forming a paper-like material. Using scanning electron microscopy, the scientists observed that the pollen-based paper comprises alternating layers of pollen particles, with the top layer significantly rougher than the bottom layer.

The top surface of the pollen paper, which appears frosted to the naked eye, showed remnants of the sunflower pollen grains' distinct spikes, contributing to its roughness. The bottom surface, which takes on a mirror-like surface finish, was relatively smoother.

This structural difference in pollen particle layers means that in the presence of water vapour the paper starts to bend, and under dry conditions, it unbends. Repeated cycles of humid and dry conditions cause the paper to perform a flipping motion over time.

NTU Assistant Professor Song Juha explained: "During water or water vapour absorption, the pollen particles in the paper swell and expand. Due to the structural difference in the pollen particle layers, the paper swells differently at different parts. This induces internal stresses through the thickness of the paper, which forces it to bend."

To show that it is possible to customise the water-vapour responsiveness of pollen paper, the team adjusted processing parameters, chiefly the alkaline incubation time of the pollen grains. They joined two pollen paper samples, each prepared under different incubation times (3 hours and 12 hours), to form a strip of bi-material pollen paper with a visible boundary.

When the bi-material pollen paper was exposed to a humid-dry cycle, the two pollen paper samples' different reactions to humidity caused the paper to 'walk' like a caterpillar that moves by alternatingly expanding and contracting its soft body.

The scientists also demonstrated the potential application of pollen paper as a self-actuating soft robot through a flower made from pollen paper that 'blooms' through gradual absorption of water vapour.

Prof Suresh said: "The pollen paper we've developed shows strong mechanical actuation as the humidity changes. This naturally occurring material shows potential for developing a wide spectrum of actuation systems with customised properties for different functional needs."

Credit: 
Nanyang Technological University

Scientists' warning to humanity on insect extinctions

image: As the human race continues to battle the COVID-19 pandemic, scientists have found that the planet's insects are also facing a crisis after accelerating rates of extinction have led to a worldwide fall in insect numbers

Image: 
Dr Matt Hill, University of Huddersfield

SOME of the tiniest creatures on the planet are vital for the environment. But insect numbers are falling worldwide as the rate of extinction accelerates.

Now, a global group of 30 scientists - including University of Huddersfield lecturer Dr Matt Hill - has highlighted the issue and suggests practical steps that everyone can take to help halt the decline. These include mowing lawns less often, avoiding pesticides and leaving old trees, stumps and dead leaves alone.

A specialist in aquatic environments, Dr Hill teaches on the University's recently-established geography degree course and supervises students as they take part in conservation projects.

He is a co-author of two articles - available online to all - in the journal Biological Conservation entitled 'Scientists' warning to humanity on insect extinctions' - https://www.sciencedirect.com/science/article/pii/S0006320719317823 - and 'Solutions for humanity on how to conserve insects' - https://www.sciencedirect.com/science/article/pii/S0006320719317793.

Long-term decline

Pollution and human impact on habitats mean that insects such as beetles, dragonflies and mayflies plus other macroinvertebrates such as snails are in long-term decline across the world, in the UK as much as anywhere, said Dr Hill. Yet they make a vital contribution to the environment.

"They provide food for other animals and they can also have a significant role in the functioning of freshwater ecosystems, forming a critical component in the diversity of life," he added.

Dr Hill was sought out to work with scientists in countries including Germany, the UK, Colombia, Finland and South Africa. They pooled their research into insect decline and collaborated on the two new articles.

These tell how factors such as climate change, the loss of habitats and pollution - including harmful agricultural practices - have all contributed to declines in insect population and to species extinctions.

Insects have many functions in the ecosystem that cannot be replicated by technology or any other innovation. Many crops depend on insects for pollination, for example, and as decomposers insects contribute to nutrient cycling.

We can all do our bit to help

The team behind the research and the new articles have drawn up a nine-point plan that enables individuals to contribute to insect survival:

1. Avoid mowing your lawn frequently; let nature grow and feed insects

2. Plant native plants; many insects need only these to survive

3. Avoid pesticides; go organic, at least for your own backyard

4. Leave old trees, stumps and dead leaves alone; they are home to countless species

5. Build an insect hotel with small horizontal holes that can become their nests

6. Reduce your carbon footprint; this affects insects as much as other organisms

7. Support and volunteer in conservation organisations

8. Do not import or release living animals or plants into the wild that could harm native species

9. Be more aware of tiny creatures; always look on the small side of life.

Dr Hill specialises in aquatic habitats and his areas of research include pondscapes - including garden ponds - and freshwater in urban areas. He teaches modules that include ecological theory and practical conservation. Students take part in field trips that include sampling of macroinvertebrates in rivers and in their second year they conduct an in-depth study of the impact of urbanisation.

"The students are very responsive to the issues and very interested in the conservation of insects and animals in general," he said.

Credit: 
University of Huddersfield

NASA finds heavy rainfall in powerful tropical cyclone Harold

image: On April 6, 2020, the GPM satellite provided an estimate of rainfall rates in powerful Tropical Cyclone Harold over Vanuatu in the Southern Pacific Ocean. The highest rates were in the rain band to the southeast of the eye, at 48 mm (1.8 inches) per hour. Near the eye, rates in some areas also exceeded 40 mm (1.6 inches) per hr.

Image: 
JAXA/Jason West, NASA EOSDIS

One of NASA's satellites that can measure the rate at which rainfall is occurring in storms passed over powerful Tropical Cyclone Harold just after it made landfall in Vanuatu in the Southern Pacific Ocean.

Tropical Cyclone Harold developed from a low-pressure system that was observed to the east of Papua New Guinea last week, and has tracked to the southeast, where it has already caused flooding and loss of life in the Solomon Islands.

Now a Category 4 cyclone, the most powerful yet of 2020, Harold made landfall on the South Pacific nation of Vanuatu on Monday, April 6, not long before the Global Precipitation Measurement mission or GPM passed overhead. GPM's Dual-frequency Precipitation Radar and GPM Microwave Imager provided data on rainfall rates. "The highest rates were in the rain band to the southeast of the eye, at 48 mm (1.8 inches) per hour," said B. Jason West, Science Data Analyst at NASA's Goddard Space Flight Center in Greenbelt, Md. "Near the eye, rates in some areas also exceeded 40 mm (1.6 inches) per hour."

Early reports from Vanuatu indicate heavy flooding and property damage.

The Vanuatu Meteorological Service (VMS) posted Tropical Cyclone Warning Number 27 for the Sanma, Penama, Malampa and Shefa Provinces. At 8 a.m. EDT (11:00 p.m. Vanuatu local time), VMS noted that "Severe Tropical Cyclone Harold was located at latitude 16.0 degrees south and longitude 168.8 degrees east, about 80 kilometers (50 miles) east northeast of Ambrym and 105 km (65 miles) northeast of Epi. Severe Tropical Cyclone Harold has been moving in an east-southeasterly direction at 19 kph (10 knots/12 mph) in the past 3 hours. Maximum sustained winds close to the center are estimated at 230 kph (125 knots/143 mph)."

The VMS warning noted, "Damaging gale force winds, destructive storm force winds and hurricane force winds with heavy rainfalls and flash flooding over low lying areas and areas close to river banks including coastal flooding is expected over Sanma, Penama, Malampa and Shefa Provinces including Torba province tonight. Very rough to phenomenal seas with heavy to phenomenal swells are expected over northern and central open and coastal waters tonight as the system continues to move over the Central Islands of Vanuatu. High Seas wind warning and a Marine strong wind warning are current for all coastal and open waters of Vanuatu. People, including sea-going vessels are strongly advised not to go out to sea within affected area until the system has moved out of the area."

The Vanuatu National Disaster Management Office (NDMO) advises that a Red Alert is in effect for Sanma, Penama, Malampa and Shefa Provinces, while a Yellow Alert is in effect for Torba Province.

Credit: 
NASA/Goddard Space Flight Center

More pavement, more problems

image: Urbanization can lead to more intense flooding, on average, researchers say.

Image: 
David Goff

Think your daily coffee, boutique gym membership and airport lounge access cost a lot? There may be an additional, hidden cost to those luxuries of urban living, says a new Johns Hopkins University study: more flooding.

For every percentage point increase in roads, parking lots and other impervious surfaces that prevent water from flowing into the ground, annual floods increase on average by 3.3%, the researchers found.

The study was published today in Geophysical Research Letters.

"With recent major floods in heavily urbanized cities like Houston and Ellicott City, we wanted to better understand how much urbanization is increasing flood flows," said Annalise Blum, a former postdoctoral fellow in Johns Hopkins University's Earth and Planetary Sciences Department and the paper's first author. Blum is now an American Association for the Advancement of Science (AAAS) Science & Technology Policy Fellow.

While previous studies have tried to estimate how much impervious surfaces affect flooding, those studies used smaller datasets--looking at only one stream or a small set of streams--that weren't generalizable across the country. These studies also couldn't isolate the cause-and-effect relationship between impervious surfaces and flood magnitude, says Blum, because they couldn't effectively control for other factors such as climate, dams and land use. These other factors make it difficult to say how more impervious cover impacts flood magnitude.

Working with Paul Ferraro, a Bloomberg Distinguished Professor with joint appointments in the Carey Business School and the Department of Environmental Health and Engineering at The Johns Hopkins University, the research team employed mathematical models not often used in the study of water or floods.

"Inferring cause and effect in the environment around us is difficult. However, in the last few decades, fields like economics and biostatistics have made great advances in methods that can isolate cause and effect. By bringing these methods to hydrology, we hope that we can spur advances in hydrological science, as well as in the urban policies and programs that depend on that science," says Ferraro.

Prior studies examined single streams over time or multiple streams at a single point in time. Neither approach on its own, however, can determine whether differences in floods are due to differences in impervious surfaces or to changes in other factors.

Blum and colleagues created a data set that allowed them to leverage differences across both time and space to isolate the effect of impervious surfaces on floods. The research team analyzed 39 years of data (1974-2012) from more than 2,000 U.S. Geological Survey streamgages, which measure the amount of water flowing through a stream. The team then merged the stream data with data on the growth of impervious surfaces in the basins upstream of the gages.

The authors estimate that annual flood magnitude (defined as annual maximum streamflow) increases by 3.3%, on average, for every percentage point increase in patios, garages, pavement or other impervious surfaces.
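
As a rough illustration of how this average effect might translate into the kind of ballpark projection the authors describe, here is a minimal Python sketch. The basin figures are hypothetical, and treating the effect as simply additive per percentage point is a simplification:

# Minimal sketch of a ballpark projection using the study's reported average
# effect of 3.3% per percentage point of added impervious cover. The basin
# values are hypothetical, and the additive treatment is a simplification.

EFFECT_PER_POINT = 0.033  # average increase in annual flood magnitude per point of impervious cover

def projected_flood_increase(impervious_now_pct, impervious_future_pct):
    """Approximate fractional change in annual maximum streamflow."""
    return (impervious_future_pct - impervious_now_pct) * EFFECT_PER_POINT

# Hypothetical basin whose impervious cover grows from 10% to 15% of its area:
print(f"Expected increase in annual flood magnitude: {projected_flood_increase(10, 15):.1%}")
# ~16.5% - a planning-level ballpark, not a site-specific prediction.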

"Due to the large variability in annual flooding, it is difficult to isolate the effect of urbanization. Combining these large datasets with both time and space dimensions allowed us to tease out and calculate the magnitude of the effect," says Blum.

Blum hopes that researchers will apply the methods to other environmental challenges and use the results to prepare for unintended consequences of urbanization.

"If you're looking at a basin that you expect will urbanize in the next five years, these findings will give you a ballpark estimate of additional flooding to expect due to that urbanization," she says.

Credit: 
Johns Hopkins University

Follow your gut

image: Newly identified digestive-brain axis controls food choice

Image: 
Gil Costa

Food has something of a "magic hold" on us, as certain flavors and textures can pretty much dictate what we do. Just think about the spicy dish that keeps bringing you back to that remote Chinese restaurant, or the irresistibly creamy but expensive ice-cream at the Italian place on the corner.

But is it only your palate that controls your food choices? It may feel like it, but the answer is no. In fact, much of what is going on happens beyond the walls of your mouth, through interactions between your digestive system and your brain.

What are your digestive and nervous systems talking about, and how does that conversation influence your behavior? The group of Albino Oliveira-Maia, head of the Neuropsychiatry Unit at the Champalimaud Centre for the Unknown in Lisbon, Portugal, has been working on finding answers to these questions. The team's most recent set of results - describing a novel digestive-brain axis identified in mice - was published today (April 6, 2020) in the scientific journal Neuron.

Follow your gut

"The mouth is the first checkpoint - deciding whether the food should be accepted or rejected", explains Oliveira-Maia. "Once inside, the food is broken up into nutrients, and the post-ingestive phase begins. In this phase, it's the digestive system's turn to 'taste' the food and talk to the brain about your meal choice".

According to Oliveira-Maia, post-ingestive processes can be divided into two types. The first deals with the present - how nutritious the food is and how much of it should be consumed. The second is a learning process that determines how the organism should respond to the same food in the future.

An example of this "post-ingestive learning" is when the body's appraisal of the food's nutritional value leads an individual to develop a preference towards it. How would that work? Imagine two foods that taste exactly the same but have different nutritional values: one is high, and the other is low. According to decades of studies, post-ingestive learning leads both animals and humans to develop a preference for the more nutritious food. It actually makes sense because it is in the organism's best interest to identify which food is more nutritious and opt for that food when possible.

Oliveira-Maia and his team wondered whether these same post-ingestive signals might be involved in other types of learning. More precisely, they asked whether they could lead animals to actively seek out certain foods.

To study this question, the team developed a task in which animals would press levers to receive a direct injection of food into their stomach. "It was important to do it this way to eliminate the palatable aspects of the food and focus purely on its post-ingestive effects", explains Ana Fernandes, the first author of the study. "In one experiment, we made two levers available to the mice: one that triggered the injection of high-calorie food and another that triggered the injection of low-calorie food. Then, we allowed them to choose freely between the two levers and observed their response."

The results of the experiment were clear: even though the mice were not able to taste the food, they ended up spending their efforts on the "high-calorie" lever.

With this novel experimental paradigm, the team established a new form of post-ingestive learning. Next, they proceeded to pin down the physiological mechanism involved.

Gut feeling

To study the learning mechanism, the team first asked how information about the nutritional value of the food reaches the brain. "To answer this question, we focused on the Vagus Nerve. This is a long nerve that forms bi-directional connections between the brain and multiple internal organs", says Oliveira-Maia.

According to Oliveira-Maia, most research on the relation of the Vagus Nerve and feeding behavior has concentrated on the nerve's connections with the gut. But he and his team decided to take a different approach. "Results from my previous work indicated that a specific branch of the Vagus Nerve might be involved - the one that carries information that comes from the liver."

Why would the liver be particularly important for this learning process, instead of the gut? "Different parts of the gut may have incomplete information about the nutritional value of the food that is being eaten. The liver, on the other hand, metabolically filters the blood arriving from most of the gut. This means that it's well-positioned to function as an overall metabolic sensor", Fernandes explains.

And indeed, when the team tested their hypothesis by lesioning the hepatic branch of the Vagus Nerve, the mice were unable to acquire this new type of learning. This supported their assumption that this specific branch is the one sensing and relaying post-ingestive signals to the brain during the learning process.

From the gut to the brain and back again

This exciting finding raised a pressing question: where in the brain were the post-ingestive signals being sent to?

The team began with the first immediate suspect - dopamine - a molecule involved in various cognitive processes. Multiple studies have illustrated an association between feeding and dopamine neurons in the brain. However, direct links between post-ingestive signals and the activity of these neurons had not been demonstrated.

The team implemented several experimental approaches to study whether dopamine neurons were involved in the post-ingestive learning they had discovered. Their conclusions provided concrete proof that they are.

"An important component of the study was the discovery that dopamine neurons were involved in this new learning process", says Rui Costa, also a senior author of the study.

The team was able to demonstrate not only that these neurons were sensitive to post-ingestive signals, but also that their activity was necessary for the learning to occur. And to complete the picture, the final experiment tested whether the neurons were influenced by the hepatic branch of the vagus nerve. Again, the answer was yes: when the branch was cut, the neurons' response to post-ingestive signals was significantly reduced.

Why would dopamine neurons, in particular, be part of this learning process? "Dopamine neurons have been shown to respond to reward, for example when a sweet treat reaches our tongue", explains Costa. "This study shows that these neurons are also activated when foods reach the stomach and intestine. Furthermore, we showed that the activation of dopamine neurons when nutrients reach the gut is critical for driving food-seeking behaviors."

The bigger picture

Together, the results of the study reveal a novel learning process - orchestrated between the digestive system and the brain - that compels animals to seek out food that they never actually tasted. This testifies to the potency of the subconscious processes that control behavior.

Oliveira-Maia believes this work provides fundamental insight into how unique patterns of feeding behavior emerge. And while it might not have immediate clinical applications, he believes that this work is ultimately relevant for understanding and treating eating-related disorders such as obesity. "It's very early to say what will come from this work down the line. Still, the inspiration regarding the relationship between dopamine receptors and obesity is a big part of why we did this work in the first place", he concludes.

Credit: 
Champalimaud Centre for the Unknown

Examining association between childhood video game use, adolescent body weight

What The Study Did: Researchers looked at whether there was a long-term association between using video games at an early age and later weight as a teenager, as well as what role behaviors such as physical activity, the regularity of bedtimes and consuming sugar-sweetened beverages might play. The study was a secondary analysis of data from a study that included 16,000 children born in the United Kingdom.

Authors: Rebecca J. Beeken, Ph.D., of the University of Leeds in Leeds, England, is the corresponding author.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

(doi:10.1001/jamapediatrics.2020.0202)

Editor's Note: The article includes conflict of interest and funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, conflicts of interest and financial disclosures, and funding and support.

Credit: 
JAMA Network

Hereditary mutation that drives aggressive head and neck, and lung cancers in Asians

Researchers from the Cancer Science Institute of Singapore (CSI Singapore) at the National University of Singapore (NUS) have uncovered a genetic variant in a gene called MET that is responsible for more aggressive growth of head and neck, and lung cancers. A further probe into the finding revealed therapeutic strategies that could potentially target this genetic alteration, thereby paving the way for clinicians to develop better and more effective treatments for cancer patients of such profile.

The study, published in prestigious scientific journal Nature Communications on 25 March 2020, was conducted in close collaboration with clinicians from the National University Cancer Institute, as well as researchers from the National Cancer Centre Singapore and the Bioinformatics Institute at the Agency for Science, Technology and Research, Singapore.

The MET gene encodes for a cancer promoting protein that relays growth, survival and transmission of signals in cancer cells. In the study led by Professor Goh Boon Cher and Dr Kong Li Ren from CSI Singapore, the team of researchers identified a form of MET protein, which showed ethnic preference with higher incidence among Asians, and is associated with poorer prognosis in patients diagnosed with head and neck squamous cell carcinoma or lung squamous cell carcinoma. Even though the MET variant does not seem to predispose an individual to cancer, it leads to more aggressive growth of cancers that have already developed.

Unlike other MET mutants, this genetic variant also does not appear to be inhibited by existing MET-blocking drugs that have been developed and approved in the clinical setting, prompting the researchers to conduct further investigation on the mechanism behind the genetic alteration.

Leveraging the team's multi-disciplinary expertise and state-of-the-art molecular modelling, the researchers found that the single amino-acid change in the MET receptor arising from the genetic alteration leads to preferential strong binding to another cancer promoting protein, HER2. Both proteins then work cooperatively to drive cancer aggression and enable cancer cells to survive therapies involving MET-blocking drugs.

"The mechanism of this MET variant is novel and unreported. This finding contributes to the growing evidence of the role of genetic variants in affecting clinical outcome, and underscores the importance of diving deep into our genetic inheritance in cancer research," said Dr Kong, Research Fellow at CSI Singapore who initiated the study.

Knowledge of this unique mechanism also enabled the team to identify, using laboratory models, several HER2 inhibitors capable of blocking cancer progression caused by this genetic alteration.

Prof Goh, Deputy Director and Senior Principal Investigator at CSI Singapore, said, "Our study represents a conceptual advancement in cancer research, as we have shown that it is possible to block the activity of a cancer-driving gene by administering a targeted therapy directed not against the mutant protein in question, but rather against a corresponding protein to which it binds. The remarkable anti-tumour responses observed in our experimental models, coupled with the availability of FDA-approved HER2 inhibitors, also present a huge opportunity for clinicians to improve the disease outcome of this genetic alteration via precision medicine."

The research team is now translating the findings into a clinical trial in which patients who test positive for this MET variant are treated with suitable medications that have shown effectiveness in the laboratory.

Credit: 
National University of Singapore

How old are whale sharks? Nuclear bomb legacy reveals their age

image: Whale sharks can exceed 40 feet and weigh up to 40 tons, according to some estimates.

Image: 
NOAA

Nuclear bomb tests during the Cold War in the 1950s and 1960s have helped scientists accurately estimate the age of whale sharks, the biggest fish in the seas, according to a Rutgers-led study.

It's the first time the age of this majestic species has been verified. One whale shark was an estimated 50 years old when it died, making it the oldest known of its kind. Another shark was an estimated 35 years old.

The study in the journal Frontiers in Marine Science used a measure of lingering radioactivity (radiocarbon dating) from nuclear explosions to estimate shark ages, and it could help ensure the survival of this endangered species.

"Accurate estimates of longevity, growth and mortality will better inform management and conservation efforts for whale sharks," said lead author Joyce Ong, a postdoctoral associate in the Department of Marine and Coastal Sciences in the School of Environmental and Biological Sciences at Rutgers University-New Brunswick. "The extended longevity, slow growth rates, late maturity and global connectivity of this species indicate high susceptibility to death caused by human impacts, such as ship strikes. Hence, this knowledge can help conservation managers adjust their strategies to be more effective."

Whale sharks (Rhincodon typus) can weigh up to 40 tons. The largest one reported was 65.6 feet long, but they rarely exceed about 39 feet, according to the National Oceanic and Atmospheric Administration. Whale sharks have broad, flat heads with short snouts, and their backs feature a white, yellow and grey checkerboard pattern.

It's been difficult to estimate whale sharks' ages at death since they, like all sharks and rays, lack otoliths (bony structures) that scientists use to assess the ages of other fish. Resembling tree rings, whale shark vertebrae have distinct bands that increase as the animal ages. Some studies suggest that a new ring forms annually; others concluded that one forms every six months.

Led by Ong, a research team from several institutions examined the radioactive legacy of the Cold War's nuclear arms race.

In the 1950s and 1960s, the United States, Soviet Union, Great Britain, France and China tested nuclear weapons in the open air, including several kilometers above Earth (1 kilometer is 0.62 miles). That temporarily doubled the level of carbon-14, a naturally occurring radioactive form of carbon, in the atmosphere. Archaeologists and historians frequently use carbon-14 to date ancient bones and artifacts. Nuclear explosions release carbon-14, and Cold War fallout saturated the air and oceans. The carbon-14 gradually moved through food webs, resulting in an elevated carbon-14 level that lingers.

The scientists tested the carbon-14 levels in the growth rings of two whale sharks - one stored in Taiwan and one in Pakistan - accurately dating them.
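
The matching step can be pictured with a small sketch: compare the radiocarbon value measured in a vertebral band against a reference time series of surface-ocean carbon-14 and take the closest year. The reference values and the measurement below are rough, assumed placeholders - not data from this study - and real analyses model the curve far more carefully:

# Illustrative sketch of bomb-pulse matching: assign a formation year to a
# vertebral growth band by finding the closest match between its measured
# carbon-14 value and a reference curve of surface-ocean carbon-14.
# All numbers below are assumed placeholders, not data from this study.

# Coarse, assumed reference curve: year -> Delta-14C (per mil) in surface seawater
REFERENCE = {1955: -55, 1960: -40, 1965: 20, 1970: 90, 1975: 120,
             1980: 110, 1985: 95, 1990: 80, 1995: 65, 2000: 55}

def band_formation_year(measured_d14c, rising_limb_only=True):
    """Reference year whose Delta-14C best matches the measurement.
    Restricting the search to the rising limb (here, up to 1975) avoids the
    ambiguity that one value can match both a pre- and a post-peak year."""
    candidates = [y for y in REFERENCE if y <= 1975 or not rising_limb_only]
    return min(candidates, key=lambda y: abs(REFERENCE[y] - measured_d14c))

innermost_band_d14c = 25.0  # hypothetical measurement from the oldest (innermost) band
year_of_death = 2012        # hypothetical year the shark died
birth_year = band_formation_year(innermost_band_d14c)
print(f"Innermost band formed around {birth_year}; age at death about {year_of_death - birth_year} years")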

Next steps include seeking the vertebrae of stranded sharks and studying more large, old whale sharks, Ong said. That will allow scientists to refine growth models, resulting in more accurate estimates of growth and natural mortality that will lead to more effective conservation strategies.

Credit: 
Rutgers University

Insect wings hold antimicrobial clues for improved medical implants

image: This is an E. coli bacterium lying on a bed of nano-nails.

Image: 
Professor Bo Su, University of Bristol

The wings of some insects, such as cicadas and dragonflies, possess nanopillar structures that kill bacteria upon contact. However, to date, the precise mechanisms that cause bacterial death have been unknown.

Using a range of advanced imaging tools, functional assays and proteomic analyses, a study by the University of Bristol has identified new ways in which nanopillars can damage bacteria.

These important findings, published in Nature Communications, will aid the design of better antimicrobial surfaces for potential biomedical applications such as medical implants and devices that are not reliant on antibiotics.

Bo Su, Professor of Biomedical Materials at the University of Bristol's Dental School, who authored the research said:

"In this work, we sought to better understand nanopillar-mediated bactericidal mechanisms. The current dogma is that nanopillars kill bacteria by puncturing bacterial cells, resulting in lysis. However, our study shows that the antibacterial effects of nanopillars are actually multifactorial, nanotopography- and species-dependent.

"Alongside deformation and subsequent penetration of the bacterial cell envelope by nanopillars, particularly for Gram-negative bacteria, we found the key to the antibacterial properties of these nanopillars might also be the cumulative effects of physical impedance and induction of oxidative stress.

"We can now hopefully translate this expanded understanding of nanopillar-bacteria interactions into the design of improved biomaterials for use in real world applications."

The research, funded by the Medical Research Council, has far-reaching implications. Prof. Su explains:

"Now we understand the mechanisms by which nanopillars damage bacteria, the next step is to apply this knowledge to the rational design and fabrication of nanopatterned surfaces with enhanced antimicrobial properties.

"Additionally, we will investigate the human stem cell response to these nanopillars, so as to develop truly cell-instructive implants that not only prevent bacterial infection but also facilitate tissue integration."

Credit: 
University of Bristol

What is the Asian hornet invasion going to cost Europe?

image: Consensus climate suitability of the yellow-legged hornet predicted from species distribution modelling.

Image: 
Prof. Franck Courchamp

Since its accidental introduction in France in 2004, the yellow-legged Asian hornet (Vespa velutina nigrithorax) has been spreading rapidly through Europe. Both experts and citizen scientists keep identifying the new invader as it spreads across the Old Continent.

In a recent study, French scientists led by Prof. Franck Courchamp at the Université Paris-Saclay and the CNRS provide the first estimates of the control costs for this invasion. Supported by the INVACOST project, their findings are published in the open-access journal NeoBiota.

Since it was accidentally introduced to France from China in 2004, the Asian hornet has been spreading rapidly, colonising most of France at an approximate rate of 60-80 km per year and invading other European countries: Spain in 2010, Portugal and Belgium in 2011, Italy in 2012, Germany in 2014 and the UK in 2016. In a recent paper published in the open-access journal Evolutionary Systematics, Dr. Martin Hussemann from CeNaK, University of Hamburg, recorded the northernmost capture of the Asian hornet to date, in Hamburg in September 2019.

These data show that the Asian hornet is spreading across Europe faster every year, even into climatically less favourable regions. The rapid spread of the species is not necessarily caused by human-mediated dispersal - the species can spread rapidly on its own - but human-assisted dispersal is nevertheless not uncommon.

Within its native and invasive range, V. velutina nigrithorax actively preys on honeybees, thus causing harm to apiculture. Due to its active preying on wild insects, the Asian hornet also has a negative impact on ecosystems in general and contributes to the global decline of pollination services and honey production. Furthermore, by nesting in urban areas, the Asian hornet, which is well known for its aggressive behaviour, is a potential threat to human activities.

Currently, the control of the invasion is mainly undertaken through nest destruction and bait trapping, but neither method is sufficient to achieve complete eradication.

To proceed with further control of the invasion, there is a need to evaluate the economic costs. Those costs fall into three main categories: (1) prevention of the invasion, (2) fighting the invasion and (3) damage caused by the invasion.

The cost of fighting the invasion of the Asian hornet is the cost of nest destruction. To identify those costs, the research team gathered information from companies providing nest-destruction services, extrapolated the cost of nest destruction spatially, and modelled the potential distribution of the invader.

As the calculations show, at the moment, the estimated yearly costs for eradication would be €11.9M for France, €9.0M for Italy and €8.6M for the United Kingdom.

"In 2006, only two years after the hornet was first observed in France, three departments were already invaded and the cost of nest destruction was estimated at €408k. Since then, the estimated yearly costs have been increasing by ~€450k each year, as the hornet keeps spreading and invades new departments. Overall, we estimated €23M as the cost of nest destruction between 2006 and 2015. If this temporal trend can be extrapolated for the next few years (i.e. if the hornet keeps spreading at a similar rate), we expect the yearly cost of nest destruction to reach an estimated value of €11.9M (given all suitable areas are invaded) in just 12 years," shares Prof. Franck Courchamp.

In Japan and South Korea, where the species has already been observed, the total yearly cost of nest destruction is estimated at €19.5M and €11.9M respectively.
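
A crude reconstruction of the French extrapolation quoted above - taking the stated figures at face value and assuming simple linear growth in yearly costs - lands close to the roughly €23M the authors report for 2006-2015:

# Crude sketch of the extrapolation quoted above: yearly nest-destruction
# costs in France starting at ~EUR 0.408M in 2006, growing by ~EUR 0.45M per
# year, and capped at ~EUR 11.9M once all suitable areas are invaded.
# Linear growth is the simplification stated in the quote; figures in million EUR.

START_YEAR, START_COST = 2006, 0.408
ANNUAL_INCREASE = 0.45
CEILING = 11.9

def yearly_cost(year):
    """Estimated nest-destruction cost in a given year (million EUR)."""
    return min(START_COST + ANNUAL_INCREASE * (year - START_YEAR), CEILING)

cumulative = sum(yearly_cost(y) for y in range(2006, 2016))
print(f"Cumulative 2006-2015 cost: about EUR {cumulative:.0f}M")  # close to the ~EUR 23M reported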

So far, nest destruction is the most effective way to fight the invasion, though it is not sufficient on its own. At present, only 30-40% of detected nests are destroyed each year in France. Moreover, rather than being part of a controlled strategy, the destroyed nests are only those judged to pose particular harm to human or beekeeping activities. The researchers point out that this is not enough.

In conclusion, the scientists call for more active measures and research related to the invasion of V. velutina nigrithorax. Given that other countries, including the USA, Australia, Turkey and Argentina, appear to be climatically suitable for the species, they are also at risk (with an estimated yearly nest-destruction cost of €26.9M for the USA, for example).

The current study presents only the first estimates of the economic costs resulting from the Asian hornet, but more action clearly needs to be taken to handle harmful invasive species - one of the greatest threats to biodiversity and ecosystem functioning.

Credit: 
Pensoft Publishers

How the chemical industry can meet the climate goals

Switzerland's Federal Council has decided that the country should become carbon-neutral by 2050. The governments of many other countries are pursuing similar goals. This may be challenging as far as car traffic and the entire power sector are concerned, but not impossible - with systematic electrification and the exclusive use of carbon-neutral energy sources, for example.

A switch of this kind will be more difficult for the chemical industry. While for many other industrial sectors the primary concern is energy efficiency, the chemical industry must also address the question of raw materials. "Polymers, plastics, synthetic textile fibres and medicines all contain carbon. It has to come from somewhere," explains Marco Mazzotti, Professor of Process Engineering at ETH Zurich. As things stand, the vast majority of this carbon comes from oil and natural gas. During production, and when the chemical products are burned or decompose at the end of their life, they release CO2.

Using concrete figures and methanol production as a case study, Mazzotti and co-workers from ETH Zurich and Utrecht University have now systematically compared various approaches that aim at reducing net CO2 emissions from the chemical industry to zero. The main conclusion from the new study is that the goal of achieving net-zero CO2 emissions in the chemical industry is in fact attainable. However, all the approaches the study examined for achieving this target have both advantages and disadvantages, which manifest themselves differently in different regions of the world. In addition, all three concepts require more energy (in the form of electricity) than current production methods.

Capture CO2 or use biomass

One approach involves continuing to use fossil resources as raw materials, but systematically capturing CO2 emissions and sequestering them underground using a process known as carbon capture and storage (CCS). The big advantage here is that today's industrial production processes would not need to be changed. However, the storage sites must be suitable in terms of their geology, offering for example deep sedimentary layers that contain salt water. Such sites are not found all over the world.

Another approach would see industry using carbon from CO2 captured in advance from air or from industrial waste gases. This process is called carbon capture and utilisation (CCU). Hydrogen required for chemical products would be obtained from water using electricity. The approach would involve a major overhaul of chemical production processes and rebuilding large parts of the industrial infrastructure. In addition, it requires an extremely large amount of electricity - six to ten times more than CCS. "This method can be recommended only in countries with a carbon-neutral electricity mix," Mazzotti explains, continuing: "We clearly demonstrate that using large quantities of electricity from coal or gas-fired power stations would, in fact, be much worse for the climate than the current production method based on fossil fuels."

A final option would be to use biomass (wood, sugar plants, oil plants) as raw material for the chemical industry. Although this method requires less electricity than the others, it involves very intensive land use to grow the crops - requiring 40-240 times more land than the other approaches.

The future of flying

Mazzotti and his co-authors based their study on the production of methanol, which is similar to the process used for producing fuels. Their work therefore also informs the discussion about future aircraft fuels, as Mazzotti points out: "We hear it time and again, even from experts, that the only way aviation can become carbon-neutral is through the use of synthetic fuels," he says. "But that's not true." Producing synthetic fuels is an extremely energy-intensive process. If electricity from coal or gas-fired power stations were to be used for this purpose, synthetic fuels would have an even larger carbon footprint than fossil fuels. The study shows that there are at least two viable alternatives to synthetic fuels: aviation could continue to use fossil fuels if the CO2 emitted by aircraft were captured and sequestered elsewhere, or the fuels could be obtained from biomass.

Credit: 
ETH Zurich

Researchers hope to improve future epidemic predictions

RESEARCH TRIANGLE PARK, N.C. -- As the world grapples with the COVID-19 pandemic, a new mathematical model could offer insights on how to improve future epidemic predictions based on how information mutates as it is transmitted from person to person and group to group.

The U.S. Army funded this model, developed by researchers at Carnegie Mellon University and Princeton University, through the Army Research Laboratory's Army Research Office, both elements of the Combat Capabilities Development Command.

The model suggests that ideas and information spread and evolve between individuals with patterns similar to genes in that they self-replicate, mutate and respond to selective pressure as they interact with their host.

"These evolutionary changes have a huge impact," said CyLab faculty member Osman Yagan, an associate research professor in Electrical and Computer Engineering at Carnegie Mellon University and corresponding author of the study. "If you don't consider the potential changes over time, you will be wrong in predicting the number of people that will get sick or the number of people who are exposed to a piece of information."

In their study, published March 17 in the Proceedings of the National Academy of Sciences, the researchers developed a mathematical model that takes the evolutionary changes of both disease and information into consideration. The research tested the model against thousands of computer-simulated epidemics using data from two real-world networks: a contact network among students, teachers, and staff at a U.S. high school, and a contact network among staff and patients in a hospital in Lyon, France.

"We showed that our theory works over real-world networks," said the study's first author, Rashad Eletreby, who was a Carnegie Mellon doctoral candidate when he wrote the paper. "Traditional models that don't consider evolutionary adaptations fail at predicting the probability of the emergence of an epidemic."

The researchers said the epidemic model most widely used today is not designed to account for changes in the disease being tracked. This inability to account for changes in the disease can make it more difficult for leaders to counter a disease's spread or make effective public health decisions such as when to institute stay at home orders or dispatch additional resources to an area.
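
The intuition can be conveyed with a toy simulation - not the authors' actual model, which is analytical - in which a pathogen spreading over a contact network can mutate into a more transmissible strain as it is passed along. All parameters below are assumed purely for illustration:

# Toy two-strain spreading process on a random contact network (illustrative
# only, not the authors' model). Strain 0 may mutate into a more transmissible
# strain 1 at the moment of transmission; we compare final outbreak sizes with
# and without that possibility. All parameters are assumed.
import random

def random_network(n, avg_degree):
    """Undirected Erdos-Renyi-style contact network, as an adjacency list."""
    p = avg_degree / (n - 1)
    adj = [[] for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            if random.random() < p:
                adj[i].append(j)
                adj[j].append(i)
    return adj

def outbreak_size(adj, p_transmit, mutation_prob=0.0):
    """Fraction of the network ever infected, starting from one random case."""
    n = len(adj)
    state = [None] * n              # None = never infected, else strain index
    seed = random.randrange(n)
    state[seed] = 0
    frontier = [seed]
    while frontier:                 # each case is infectious for one generation
        nxt = []
        for u in frontier:
            for v in adj[u]:
                if state[v] is None and random.random() < p_transmit[state[u]]:
                    strain = state[u]
                    if strain == 0 and random.random() < mutation_prob:
                        strain = 1  # mutates into the more transmissible strain
                    state[v] = strain
                    nxt.append(v)
        frontier = nxt
    return sum(s is not None for s in state) / n

random.seed(1)
net = random_network(2000, avg_degree=4)
runs = 50
no_evolution = sum(outbreak_size(net, [0.2, 0.2]) for _ in range(runs)) / runs
with_evolution = sum(outbreak_size(net, [0.2, 0.5], mutation_prob=0.2)
                     for _ in range(runs)) / runs
print(f"Mean final outbreak size without mutation: {no_evolution:.2f}")
print(f"Mean final outbreak size with mutation:    {with_evolution:.2f}")
# Ignoring the possibility of mutation badly underestimates how large the
# outbreak can become once a fitter variant emerges.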

"The spread of a rumor or of information through a network is very similar to the spread of a virus through a population," said Dr. H. Vincent Poor, one of the researchers on this study and Princeton's interim dean of engineering. "Different pieces of information have different transmission rates. Our model allows us to consider changes to information as it spreads through the network and how those changes affect the spread."

While the study is not a silver bullet for predicting the spread of today's coronavirus or the spread of misinformation, the authors say it is a big step.

In the future, the team hopes that their research can be used to improve the tracking of epidemics and pandemics by accounting for mutations in diseases, and ultimately to consider interventions like quarantines and predict how those interventions would affect an epidemic's spread while the pathogen is mutating.

"This work demonstrates the importance of basic research and the ability of scientists in various disciplines to inform each other's work," said Dr. Edward Palazzolo, program manager for the Social and Cognitive Networks Program at the Army Research Office. "Although in its early stages, these models show promise for understanding network diffusion in light of mutations."

In addition to the Army, the National Science Foundation and the Office of Naval Research also supported this research. Other researchers who co-authored the paper include Yong Zhuang and Kathleen Carley from Carnegie Mellon University.

Credit: 
U.S. Army Research Laboratory

Coffee grounds show promise as wood substitute in producing cellulose nanofibers

image: Cellulose nanofibers up to 25 nm wide were produced from spent coffee grounds by TEMPO-mediated oxidation.

Image: 
Yokohama National University

Researchers at Yokohama National University (YNU) meticulously examined cellulose nanofibers extracted from spent coffee grounds, identifying them as a viable new raw source.

The world generates over six million tons of spent coffee grounds each year, according to the International Coffee Organization. The Journal of Agricultural and Food Chemistry reported in 2012 that over half of spent coffee grounds end up in landfills. Cellulose nanofibers are the building blocks for plastic resins that can be made into biodegradable plastic products.

The YNU team, led by Izuru Kawamura, an associate professor at the Graduate School of Engineering Science, set out to build upon previous research into extracting cellulose nanofibers from coffee grounds. They published their findings on April 1 in the journal Cellulose.

"Our ultimate goal is to establish a sustainable recycling system with our cellulose nanofibers in the coffee industry," Kawamura said. "Now, more and more restaurants and cafés have been banned from using single-use straws. Following that movement, we aim to make a transparent disposable coffee cup and straw with an additive comprising cellulose nanofibers from spent coffee grounds."

Demand for cellulose nanofibers is increasing worldwide, as industries realize their potential as a more environmentally sound and sustainable way to produce plastics.

"Cellulose nanofibers have mainly been produced from wood-based materials such as pulp so far," Kawamura said. "Cellulose nanofibers can be potentially supplied from all plants from nature. We would like to emphasize spent coffee grounds are a promising raw material."

The key to extracting cellulose nanofibers from spent coffee grounds lies in cellulose, the material that comprises the beans' cell walls and accounts for about half the weight and volume of the grounds.

The YNU team isolated cellulose nanofibers from the beans' cell walls by catalytic oxidation, a process that oxidizes the cell walls using a catalyst - a method previously reported by Akira Isogai at the University of Tokyo.

The team examined the resulting cellulose nanofibers with techniques including X-ray diffraction, electron microscopy and thermogravimetric analysis, observing the fundamental structural features of the nanofibers and comparing them with those derived from wood. The coffee ground-based cellulose nanofibers displayed uniformity and integrated well into polyvinyl alcohol, the building block for a variety of industrial and consumer products. The team found that their average diameter was 25 nanometers. For reference, a human hair measures about 90,000 nanometers in diameter.

Such consistency and integration with polymer resin are milestones that demonstrate the potential for coffee ground-based cellulose nanofibers as a wood substitute, Kawamura said, but further research is needed to develop a commercially viable process.

Kawamura believes cellulose nanofibers may soon play a major role in the automobile industry, offering a lightweight alternative to steel and plastic for auto bodies. As emission standards continue to tighten, the market for lighter cars will grow, making cellulose nanofibers an increasingly valuable commodity.

"The total weight of plastic resin made by cellulose nanofibers is very light compared to steel," Kawamura said. "It will bring an efficient reduction of CO2 emission." Resins built on nanofibers also work well in 3D printing, making them an environmentally friendly alternative to petroleum-based plastics for a host of potential products.

This new process may be a boon for the coffee industry, which has limited options for monetizing spent coffee grounds. Some cities have recycling programs, where spent coffee grounds are reused as nutrient-rich compost for greenhouses and mushroom farms. Other programs send spent coffee grounds to facilities that produce biogas. But overall, most coffee grounds still end up in landfills.

Credit: 
Yokohama National University

NASA finds Tropical Storm Irondro's heavy rainfall displaced

image: The GPM core satellite passed over Tropical Storm Irondro on Apr. 6 at 2:26 a.m. EDT (0626 UTC). The heaviest rainfall (orange), located southeast of the center, was falling at a rate of 1 inch (25 mm) per hour. Light rain (blue) was found throughout the rest of the storm.

Image: 
NASA/JAXA/NRL

NASA analyzed Tropical Storm Irondro's rainfall and found heaviest rainfall was being pushed far southeast of the center because of strong wind shear.

NASA has the unique capability of peering under the clouds in storms and measuring the rate at which rain is falling. The Global Precipitation Measurement mission's GPM core satellite passed over Irondro from its orbit in space and measured rainfall rates throughout the storm on Apr. 6 at 2:26 a.m. EDT (0626 UTC). The heaviest rainfall was being pushed southeast of the center, where it was falling at a rate of 1 inch (25 mm) per hour. Light rain was found throughout the rest of the storm.

In general, wind shear is a measure of how the speed and direction of winds change with altitude. Tropical cyclones are like rotating cylinders of winds. Each level needs to be stacked on top of the other vertically in order for the storm to maintain strength or intensify. Wind shear occurs when winds at different levels of the atmosphere push against the rotating cylinder of winds, weakening the rotation by pushing it apart at different levels. Winds from the northwest were pushing against the storm and displacing the heaviest rainfall southeast of the center.

On Apr. 6 at 4 a.m. EDT (0900 UTC), the Joint Typhoon Warning Center issued their final warning on Irondro. Despite the wind shear, Tropical Storm Irondro had maximum sustained winds near 40 knots (46 mph/74 kph). Irondro was located near latitude 26.7 degrees south and longitude 89.5 degrees east. Irondro is becoming extra-tropical and is expected to become a cold core low pressure area later in the day.

Typhoons/hurricanes are the most powerful weather events on Earth. NASA's expertise in space and scientific exploration contributes to essential services provided to the American people by other federal agencies, such as hurricane weather forecasting.

Both the Japan Aerospace Exploration Agency (JAXA) and NASA manage GPM.

Credit: 
NASA/Goddard Space Flight Center