Tech

Putting a future avocado apocalypse on ice

video: University of Queensland (UQ) avocado PhD researcher Chris O'Brien, supervised by UQ's Professor Neena Mitter and Dr Raquel Folgado from The Huntington Botanical Gardens in California, has developed the first critical steps towards a cryopreservation protocol for avocado, something that had never been achieved despite more than 40 years of research.

Image: 
(c) University of Queensland

The supply of smashed 'avo' is secure for generations after world-first research cryopreserved the tips of avocado shoots and then revived them to create healthy plants.

University of Queensland PhD student Chris O'Brien has developed the first critical steps to create a cryopreservation protocol for avocado, which had never been achieved until now despite more than 40 years of research.

"The aim is to preserve important avocado cultivars and key genetic traits from possible destruction by threats like bushfires, pests and disease such as laurel wilt - a fungus which has the capacity to wipe out all the avocado germplasm in Florida," Mr O'Brien said.

"Liquid nitrogen does not require any electricity to maintain its temperature, so successfully freezing avocado germplasm is an effective way of preserving clonal plant material for an indefinite period."

Cryopreservation is the technology used to freeze human biological material such as sperm and eggs at minus 196 degrees Celsius, and has also been used to freeze plants such as bananas, grape vines and apples.

Mr O'Brien has been working with UQ Centre for Horticultural Science's Professor Neena Mitter and Dr Raquel Folgado from The Huntington Library, Art Museum, and Botanical Gardens in California to perfect the technique.

He used clonal shoot-tip material developed through tissue culture propagation technology, which enables up to 500 true-to-type plants to grow from a single shoot tip.

"At first I was just recovering brown mush after freezing the avocado tips," Mr O'Brien said.

"There was no protocol so I experimented with priming the tips with Vitamin C, and used other pre-treatments like sucrose and cold temperature to prepare the cells - it was a question of trial and error to get the optimal mixture and correct time points."

The avocado shoot tips are placed on an aluminium foil strip, which allows for ultra-fast cooling and rewarming, then placed into a 'cryotube' before being stored in liquid nitrogen.

The frozen shoot tips can be revived in a petri dish containing a sucrose mixture, which rehydrates them.

"It takes about 20 minutes to recover them," Mr O'Brien said.

"In about two months they have new leaves and are ready for rooting before beginning a life in the orchard."

He has achieved 80 per cent success in regrowing frozen Reed avocado plants and 60 per cent with the Velvick cultivar.

Eighty revived avocado plants are now growing in a UQ glasshouse.

The recovered trees will be monitored for flowering times and fruit quality, with field trials planned with collaborators at Anderson Horticulture.

Professor Mitter said this was the first time the plants had experienced life outside the laboratory.

"I suppose you could say they are space-age avocados - ready to be cryo-frozen and shipped to Mars when human flight becomes possible," she said.

"But it is really about protecting the world's avocado supplies here on earth and ensuring we meet the demand of current and future generations for their smashed 'avo' on toast."

Credit: 
University of Queensland

For an effective COVID vaccine, look beyond antibodies to T-cells

More than 100 companies have rushed into vaccine development against COVID-19 as the U.S. government pushes for a vaccine rollout at "warp speed" -- possibly by the end of the year -- but the bar set for an effective, long-lasting vaccine is far too low and may prove dangerous, according to Marc Hellerstein of the University of California, Berkeley.

Most vaccine developers are shooting for a robust antibody response to neutralize the virus and are focusing on a single protein, called the spike protein, as the immunizing antigen. Yet, compelling evidence shows that both of these approaches are problematic, said Hellerstein, a UC Berkeley professor of nutritional sciences and toxicology.

A better strategy is to take a lesson from one of the world's best vaccines, the 82-year-old yellow fever vaccine, which stimulates a long-lasting, protective T-cell response. T-cells are immune cells that surveil the body continuously for decades, ready to react quickly if the yellow fever virus is detected again.

"We know what really good vaccines look like for viral infections," Hellerstein said. "While we are doing phase 2 trials, we need to look at the detailed response of T-cells, not just antibodies, and correlate these responses with who does well or not over the next several months. Then, I think, we will have a good sense of the laboratory features of vaccines that work. If we do that, we should be able to pick good ones."

Using a technique Hellerstein's laboratory developed and perfected over the past 20 years that assesses the lifespan of T-cells, it is now possible to tell within three or four months whether a specific vaccine will provide long-lasting cells and durable T-cell-mediated protection.

Hellerstein laid out his arguments in a review article published today in the journal Vaccine.

"There isn't a lot of room for major error here," Hellerstein said. "We can't just go headfirst down a less than optimal or even dangerous avenue. The last thing we want is for immunized people to get sick in a few months or a year, or get sicker than they would have. Whoever is paying for or approving the vaccine trials has the obligation to make sure that we look at the quality and durability of the T-cell response. And this would not delay the licensing process."

Misplaced focus on antibodies

Hellerstein points out that antibodies are not the primary protective response to infection by coronaviruses, the family of viruses that includes SARS-CoV-2. Indeed, high antibody levels to these viruses are associated with worse disease symptoms, and antibodies to coronaviruses, including SARS-CoV-2, don't appear to last very long.

This was noted in people infected by the first SARS virus, SARS-CoV-1, in 2003. SARS patients who subsequently died had higher antibody levels during acute infection and worse clinical lung injury compared to SARS patients who went on to recover. In MERS, which is also a coronavirus infection, survivors with higher antibody levels experienced longer intensive care unit stays and required more ventilator support, compared to subjects with no detectable antibodies.

In contrast, strong T-cell levels in SARS and MERS patients correlated with better outcomes. The same has also played out, so far, in COVID-19 patients.

"A strong antibody response correlates with more severe clinical disease in COVID-19, while a strong T-cell response is correlated with less severe disease. And antibodies have been short-lived, compared to virus-reactive T-cells in recovered SARS patients," Hellerstein said.

The most worrisome part, he said, is that antibodies also can make subsequent infections worse, creating so-called antibody-dependent enhancement. Two vaccines -- one against a coronavirus in cats and another against dengue, a flavivirus that affects humans -- had to be withdrawn because the antibodies they induced caused potentially fatal reactions. If an antibody binds weakly to these viruses or falls to low levels, it can fail to "neutralize" the virus and instead help it get into cells.

Antibody-dependent enhancement is well known in diseases such as dengue and Zika. A recent UC Berkeley study in Nicaragua showed that antibodies produced after infection with Zika can cause severe disease, including deadly hemorrhagic fever, in those later infected by dengue, a related viral disease. This dangerous cross-reaction may also occur with antibodies produced by a vaccine. Hellerstein noted that a robust T-cell response is key to maintaining high levels of antibodies and may prevent or counteract antibody-dependent enhancement.

T-cells are a long-lasting defense

Hellerstein primarily studies the dynamics of metabolic systems, tagging the body's proteins and cells with deuterium, a non-radioactive isotope of hydrogen, and tracking them through the living body. He began to study the birth and death rates of T-cells in HIV/AIDS patients over 20 years ago, using sophisticated mass spectrometric techniques designed by his laboratory.

Then, three years ago, he teamed up with immunologist Rafi Ahmed and his colleagues at Emory University to determine how long T-cells induced by the yellow fever vaccine stick around in the blood. Surprisingly, he said, the same T-cells that were created to attack the yellow fever virus during the first few weeks after a live virus vaccination were still in the blood and reactive to the virus years later, revealing a remarkably long lifespan. He and the team estimated that the anti-yellow fever T-cells lasted at least 10 years and probably much longer, providing lasting protection from just one shot. Their long lifespan allows these cells to develop into a unique type of protective immune cell.

"They (the T-cells) are a kind of adult stem cell, sitting silently in very small numbers for years or decades, but when they see viral antigen they go wild -- divide like crazy, put out cytokines and do other things that help to neutralize the virus," he said. "They are like seasoned old soldiers resting quietly in the field, ready to explode into action at the first sign of trouble."

The same deuterium-labeling technique could be employed to measure the durability of a COVID-19 vaccine's T-cell response, helping to pinpoint the best vaccine candidates while trials are ongoing, he said.

"We can, in my view, tell you the quality and durability or longevity of your T-cell response within a few months," he said. "These tests can be used to judge vaccines: Is a candidate vaccine reproducing the benchmarks that we see in highly effective vaccines, like the ones against smallpox and yellow fever?"

Hellerstein said that he was motivated to write a review on the role of antibodies versus T-cells in protective immunity against SARS-CoV-2 when he heard from experts in vaccine development that companies would likely not be interested in testing anything beyond the antibody response. The reason given was that it would slow down the approval process or could even turn up problems with a vaccine.

"That is why I wrote this review, honestly, because I was so upset by this response," he said. "At this moment in history, how can we not want to know anything that might help us? We need to get beyond the narrow focus on antibodies and look at the breadth and durability of T-cells."

Worrisome focus on spike protein

Hellerstein was also alarmed that most vaccines under development are focusing exclusively on inducing an antibody response against only one protein, or antigen, in the COVID-19 virus: the spike protein, which sits on the surface of the virus and unlocks the door into cells. But important new studies have shown that natural infection by SARS-CoV-2 stimulates a broad T-cell response against several viral proteins, not just against the spike protein.

T-cells produced after natural infection in SARS patients are also very long-lived, he said. A recent study showed that patients who recovered from SARS-CoV-1 infection in 2003 produced CD4 and CD8 T-cells that are still present 17 years later. These T-cells also react to proteins in today's SARS-CoV-2, which the patients were never exposed to, indicating that T-cells are cross-reactive against different coronaviruses -- including coronaviruses that cause common colds.

These findings all call into question whether limiting a vaccine to one protein, rather than the complement of viral proteins that the body is exposed to in natural infection, will induce the same broad and long-lasting T-cell protection that is seen after natural infection.

In contrast, vaccines like the yellow fever vaccine that employ attenuated viruses -- viruses that divide, but are crippled and can't cause damage to the body -- tend to generate a robust, long-lasting and broad immune response.

"If you are going to approve a vaccine based on a laboratory marker, the key issue is, 'What is its relationship to protective immunity?' My view is that T-cells have correlated much better than antibodies with protective immunity against coronaviruses, including this coronavirus. And T-cells haven't shown a parallel in COVID-19 to antibody-dependent enhancement that could make things worse, not better," he said.

The effectiveness and durability of the first COVID-19 vaccines could impact, for years, the public's already questioning attitude toward vaccines, he warned.

"It would be a public health and 'trust-in-medicine' nightmare, with potential repercussions for years -- including a boost to anti-vaccine forces -- if immune protection wears off or antibody-dependent enhancement develops and we face recurrent threats from COVID-19 among the immunized," he wrote in his review article.

Credit: 
University of California - Berkeley

Studying short-term cloud feedback to understand climate change in East Asia

image: Clouds play an important role in regulating Earth's radiation budget.

Image: 
Fei Wang

It is very important to understand how clouds have responded to past climate change, as this can help reduce errors when predicting future global warming using climate models. Estimates of short-term cloud feedback based on observations offer a way to assess and improve the abilities of climate models in simulating cloud feedback, according to Hua Zhang, a scientist and professor at the Chinese Academy of Meteorological Sciences, and one of the authors of a recently published study in Advances in Atmospheric Sciences.

"Cloud radiative forcing in the East Asian monsoon region has unique characteristics, and the current deviation and uncertainty in simulating radiation budgets in East Asia are all related to its feedback, which greatly constrains our understanding of climate change in the region using climate models," explains Zhang. "Using observational data to study short-term cloud feedback in East Asia can provide an observational constraint on long-term feedback there, and thus help us to assess the contribution of cloud feedback to climate sensitivity."

First, Prof. Zhang and her team built a new set of cloud radiative kernels based on the BCC_RAD radiation model, and then estimated the cloud feedback in response to interannual climate variability in East Asia by combining these kernels with cloud measurements from MODIS onboard NASA's Aqua satellite from July 2002 to June 2018.

In order to reveal the spatial distribution and seasonal variation of feedbacks due to different cloud types, the team adopted the cloud classification method provided by the ISCCP (International Satellite Cloud Climatology Project) dataset and selected four subregions according to the surface and monsoon types. The strongest cloud feedback was located in the subtropical monsoon region, mainly due to the contributions of nimbostratus and stratus.

They found that short-term cloud feedback in East Asia is mainly driven by decreases in mid- and low-cloud fraction, resulting from changes in the thermodynamic structure of the atmosphere, and by a decrease in low-cloud optical thickness, related to changes in cloud water content.
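The kernel calculation underlying these estimates can be illustrated with a toy example. In the sketch below, the kernel values, bin layout, and temperature anomaly are all invented for illustration; they are not the study's actual BCC_RAD kernels or MODIS data. The feedback is the kernel-weighted sum of cloud-fraction changes across the cloud histogram bins, normalized by the temperature anomaly.

```python
import numpy as np

# Toy cloud radiative kernel method (illustrative values only).
# Kernel K[i, j]: TOA radiative sensitivity (W m^-2 per % cloud fraction)
# for histogram bins of cloud-top pressure (rows) and optical thickness
# (columns).
K = np.array([[0.10, -0.05, -0.20],
              [0.05, -0.10, -0.30]])

# Change in cloud fraction (%) in each bin between two climate states.
dC = np.array([[-0.5, -0.2, 0.1],
               [-0.8, -0.3, 0.2]])

dT = 0.6  # interannual surface temperature anomaly (K)

# Cloud feedback: kernel-weighted cloud-fraction change per unit warming.
feedback = (K * dC).sum() / dT  # W m^-2 K^-1
print(round(feedback, 3))  # -0.217
```

Real kernels typically span many more bins (the ISCCP histogram uses a 7-by-7 grid of cloud-top pressure and optical thickness) and are resolved by latitude, longitude, and month before being summed this way.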

"Short-term cloud feedback is a useful variable for estimating the uncertainties relating to clouds, and it can provide a reference for the study of long-term cloud feedback and narrowing the inter-model uncertainties in long-term cloud feedbacks through the relationship between long- and short-term cloud feedbacks in East Asia," concludes Prof. Zhang.

Credit: 
Institute of Atmospheric Physics, Chinese Academy of Sciences

Research brief: New insight on the impacts of Earth's biosphere on air quality

A new study led by a team of University of Minnesota researchers provides the first global satellite measurements of one of the most important chemicals affecting Earth's atmosphere.

Isoprene is a natural hydrocarbon emitted to the atmosphere in vast quantities -- approximately 500 billion kg per year -- by plants and trees. Isoprene is chemically reactive, and once in the atmosphere it combines with human-caused pollutants to adversely affect air quality. Isoprene also reacts with the main atmospheric oxidizing agent -- called OH radicals -- and therefore reduces the capacity of the atmosphere to scrub itself of pollutants and greenhouse gases.

Scientists look to atmospheric models to predict current and future atmospheric composition and air quality, as well as to diagnose the atmosphere's ability to remove greenhouse gases and air pollutants. But isoprene emission rates are highly uncertain due to sparse ground-based measurements, and scientists are also unsure of the extent to which isoprene acts to suppress or sustain the abundance of OH radicals in the atmosphere.

Now, researchers have developed the first-ever global measurements of isoprene from space. Using observations from the Cross-track Infrared Sounder (CrIS) satellite sensor, researchers developed a retrieval method that uses machine learning to determine the atmospheric concentration of isoprene over different parts of the world. They combined these measurements with atmospheric modeling to test current scientific understanding of global isoprene emissions and how isoprene affects atmospheric oxidation. The research will be published on Wednesday, September 9 in the journal Nature.
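As a simplified sketch of what a statistical retrieval involves (the data, model, and noise levels below are entirely synthetic; the actual CrIS algorithm uses a more sophisticated machine-learning model than plain least squares): a model is trained to map observed spectral channels to isoprene abundance on scenes where the answer is known, then applied to unseen scenes.

```python
import numpy as np

# Toy retrieval: learn a mapping from satellite brightness-temperature
# channels to isoprene column abundance. All data here are synthetic.
rng = np.random.default_rng(0)

n_scenes, n_channels = 500, 8
true_weights = rng.normal(size=n_channels)

X = rng.normal(280, 5, size=(n_scenes, n_channels))   # brightness temps (K)
y = X @ true_weights + rng.normal(0, 0.1, n_scenes)   # isoprene columns (a.u.)

# Fit a linear retrieval by least squares on "training" scenes...
w, *_ = np.linalg.lstsq(X[:400], y[:400], rcond=None)

# ...then apply it to held-out scenes and check the retrieval error.
pred = X[400:] @ w
rmse = np.sqrt(np.mean((pred - y[400:]) ** 2))
print(round(rmse, 3))  # held-out error, comparable to the 0.1 noise level
```

The real problem is harder because isoprene's spectral signature overlaps with water vapor and other absorbers, which is why a nonlinear machine-learning model is used in practice.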

"Isoprene is one of the most important drivers of global atmospheric chemistry," said Dylan Millet, a professor in the U of M's Department of Soil, Water, and Climate. "These satellite measurements provide new understanding of how Earth's biosphere and atmosphere interact."

By combining the CrIS isoprene measurements with other satellite data, for the first time researchers were able to estimate the abundance of OH from space over isoprene source regions. These observations support recent laboratory and theory-based findings: isoprene emissions do lower atmospheric OH, but not nearly as strongly as was originally believed. As a result, the atmosphere maintains a significant ability to scrub itself of pollution even in the presence of natural isoprene emissions. Combining these measurements with other space-based data will open new doors to investigate changes in OH over time.

This research lays a foundation for multi-year studies examining seasonal-to-interannual isoprene changes and their impacts on the global atmosphere. Information from these new satellite measurements can also be used to improve current atmospheric models, with the goal of more accurately predicting air quality in a changing climate.

Researchers revealed that:

- The satellite measurements of isoprene show dramatic model overestimates over Amazonia. These disparities indicate a strong need for better understanding of tropical emissions of isoprene and other reactive chemicals.

- Over southern Africa, the CrIS measurements reveal a major isoprene hotspot that is missing from bottom-up predictions. This points to a need for further investigation of isoprene sources in this understudied region.

"These new satellite measurements reveal that, while our understanding of isoprene chemistry is getting pretty good, we still have a lot to learn about how isoprene emissions vary across Earth's different ecosystems," said Kelley Wells, a researcher in the Department of Soil, Water, and Climate in the U of M's College of Food, Agricultural and Natural Resource Sciences.

Credit: 
University of Minnesota

Oxford University researchers discover 'genetic vulnerability' in breast cancer cells

The study, published in the scientific journal Nature, has uncovered a genetic vulnerability present in nearly 10% of all breast cancer tumours, and found a way to target this vulnerability and selectively kill cancer cells. Each year, over five thousand newly diagnosed cases of breast cancer in the UK alone will carry this particular genetic fault, a proportion roughly double that driven by hereditary mutations such as those in the well-known BRCA genes.

A University of Oxford team of scientists led by Professor Ross Chapman, working together with researchers at the Johns Hopkins University School of Medicine in Baltimore, USA, discovered that cells originating from a specific subset of human breast cancer tumours could be killed with a chemical that inhibits PLK4, an enzyme important for a specialized part of the cell called the centrosome. A cell's centrosomes perform important functions during cell division, regulating the process by which copies of each chromosome are accurately segregated between two identical daughter cells. Normally, cells have safety mechanisms that protect them from losing their centrosomes. But the researchers discovered that these breast cancer cells could not survive without centrosomes.

The Oxford team wondered if the cancer cells they were studying had a genetic change that made them especially reliant on their centrosomes, and turned their attention to one feature of these cancer cells, an abnormal repeated stretch of a particular segment on chromosome 17. This genetic abnormality, known as 17q23 amplification, is already familiar to cancer researchers given its very high incidence in breast cancer.

Peter Yeow, a graduate student in Professor Chapman's laboratory, performed experiments that revealed that a gene known as TRIM37 was much more active in cells with 17q23 amplification. They then went on to show that overactive TRIM37 resulted in faulty centrosomes, which in turn led to mistakes during cell division. They speculate that the 'daughter' cells born of these abnormal divisions are much more likely to acquire new genetic mutations.

'We think that what is happening is that if cells acquire too many copies of TRIM37, the normally very carefully orchestrated process of cell division goes haywire, which in turn leads to our genomes becoming unstable,' says Professor Chapman, from the MRC Weatherall Institute of Molecular Medicine at Oxford University. 'This kind of genomic instability, where cells acquire all sorts of alterations to their genomes as they divide, is one of the hallmarks of cancer development.'

This means that cells with the 17q23 amplification are more likely to become cancerous. However, the researchers revealed that this characteristic comes at a cost: the very same defect leaves the cells entirely reliant on their centrosomes for cell division, a process central to tumour development. The researchers then demonstrated this weakness could be exploited using a drug that targets PLK4 and causes cells to lose their centrosomes, and that this treatment killed cancer cells with 17q23 amplification.

'It is slightly ironic that the same thing that makes the cells more likely to become cancerous also makes them uniquely vulnerable to losing their centrosomes, but is useful to us as scientists, because it means that we may be able to selectively target this kind of cancer cell in patients without affecting their healthy cells,' says Professor Chapman.

Unfortunately, the chemical PLK4 inhibitor that the researchers used to deplete centrosomes in cancer cells is not suitable for use in patients. However, they hope this information can be used to search for new PLK4-targeting drugs that have the same effect.

'We've found a previously unknown genetic vulnerability in breast cancer, and discovered a means to exploit this vulnerability and selectively kill cancer cells,' says Professor Chapman. 'We now hope that other researchers and pharmaceutical companies can generate new drugs that can target this process, to produce more effective and safer cancer treatments.'

What's also promising is that this genetic fault has also been detected in other cancer types, apart from breast cancer. 'Virtually any tumour, irrespective of origin, could be targeted if it harbours the 17q23 amplification. This greatly expands the number of patients that stand to benefit from therapies that may emerge from our study,' says Yeow.

Credit: 
University of Oxford

Advanced NVMe controller technology for next generation memory devices

image: Prototype board and floorplan of OpenExpress

Image: 
Professor Myoungsoo Jung, KAIST

KAIST researchers advanced non-volatile memory express (NVMe) controller technology for next generation information storage devices, and made this new technology named 'OpenExpress' freely available to all universities and research institutes around the world to help reduce the research cost in related fields.

NVMe is a communication protocol made for high-performance storage devices based on a peripheral component interconnect-express (PCI-E) interface. NVMe has been developed to take the place of the Serial AT Attachment (SATA) protocol, which was developed to process data on hard disk drives (HDDs) and did not perform well in solid state drives (SSDs).

Unlike HDDs that use magnetic spinning disks, SSDs use semiconductor memory, allowing the rapid reading and writing of data. SSDs also generate less heat and noise, and are much more compact and lightweight.

Since data processing in SSDs using NVMe is up to six times faster than when SATA is used, NVMe has become the standard protocol for ultra-high speed and volume data processing, and is currently used in many flash-based information storage devices.
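Part of what enables that speed is NVMe's fixed command format and deep, parallel command queues. As a sketch based on the public NVMe base specification (this is illustrative and not taken from OpenExpress's code), every command the host submits is a 64-byte submission-queue entry; the snippet below packs a read command:

```python
import struct

# 64-byte NVMe submission-queue entry (SQE), per the NVMe base
# specification's common command format. Little-endian, no padding:
SQE_FORMAT = "<BBHIQQQQ6I"
# opcode(1) flags(1) command-id(2) namespace-id(4) reserved(8)
# metadata-ptr(8) PRP entry 1(8) PRP entry 2(8) command dwords 10-15 (6 x 4)

NVME_CMD_READ = 0x02  # NVM command set "read" opcode

def build_read_sqe(cid, nsid, prp1, start_lba, num_blocks):
    """Pack a read command: CDW10/11 hold the LBA, CDW12 holds blocks - 1."""
    return struct.pack(
        SQE_FORMAT,
        NVME_CMD_READ, 0, cid, nsid,
        0,              # reserved
        0,              # no metadata pointer
        prp1, 0,        # data buffer physical address (PRP1), no PRP2
        start_lba & 0xFFFFFFFF, start_lba >> 32,  # CDW10/11: starting LBA
        num_blocks - 1, 0, 0, 0,                  # CDW12-15
    )

sqe = build_read_sqe(cid=1, nsid=1, prp1=0x1000, start_lba=0, num_blocks=8)
print(len(sqe))  # 64: every SQE is exactly 64 bytes
```

A host writes entries like this into a submission queue in memory and rings a doorbell register; the controller then fetches and completes commands from many queues in parallel, which is the parallelism a hardware NVMe controller such as OpenExpress must implement.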

Studies on NVMe continue at both the academic and industrial levels; however, its poor accessibility is a drawback. Major information and communications technology (ICT) companies around the world spend astronomical sums to procure the intellectual property (IP) for the hardware NVMe controllers necessary to use NVMe. Such IP is not publicly disclosed, making it difficult for universities and research institutes to use it for research purposes.

Although a small number of U.S. Silicon Valley startups provide parts of their independently developed IP for research, the cost of usage is around 34,000 USD per month. The costs skyrocket even further because each copy of single-use source code purchased for IP modification costs approximately 84,000 USD.

In order to address these issues, a group of researchers led by Professor Myoungsoo Jung from the School of Electrical Engineering at KAIST developed a next generation NVMe controller technology that achieved parallel data input/output processing for SSDs in a fully hardware automated form.

The researchers presented their work at the 2020 USENIX Annual Technical Conference (USENIX ATC '20) in July, and released it as an open research framework named 'OpenExpress.'

This NVMe controller technology developed by Professor Jung's team comprises a wide range of basic hardware IP and key NVMe IP cores. To examine its actual performance, the team made an NVMe hardware controller prototype using OpenExpress, and designed all the logic provided by OpenExpress to operate at high frequency.

The field-programmable gate array (FPGA) memory card prototype developed using OpenExpress demonstrated increased input/output data processing capacity, supporting bandwidth of up to 7 gigabytes per second (GB/s). This makes it suitable for ultra-high speed and volume next generation memory device research.

In a test comparing various storage server workloads, the team's FPGA also showed 76% higher bandwidth and 68% lower input/output latency compared to Intel's new high-performance SSD (Optane SSD), which is sufficient for many researchers studying systems employing future memory devices. Depending on user needs, silicon devices can be synthesized as well, which is expected to further enhance performance.

The NVMe controller technology of Professor Jung's team can be freely used and modified under the OpenExpress open-source end-user agreement for non-commercial use by all universities and research institutes. This makes it extremely useful for research on next-generation memory compatible NVMe controllers and software stacks.

"With the product of this study being disclosed to the world, universities and research institutes can now use controllers that used to be exclusive to only the world's biggest companies, at no cost," said Professor Jung. He went on to stress, "This is a meaningful first step in research of information storage device systems such as high-speed and volume next generation memory."

Credit: 
The Korea Advanced Institute of Science and Technology (KAIST)

SwRI-led study indicates sand-sized meteoroids are peppering asteroid Bennu

image: This image shows the near-Earth asteroid Bennu as seen from NASA's OSIRIS-REx spacecraft. A new SwRI study suggests that Bennu experiences numerous impacts from sand-sized meteoroids orbiting the Sun.

Image: 
NASA/GSFC/University of Arizona

SAN ANTONIO -- Sept. 9, 2020 -- A new study published this month in JGR Planets posits that the major particle ejections off the near-Earth asteroid Bennu may be the consequence of impacts by small, sand-sized particles called meteoroids onto its surface as the object nears the Sun. The study's primary author is Southwest Research Institute scientist Dr. William Bottke, who used data from NASA's OSIRIS-REx mission.

Launched in 2016, NASA's OSIRIS-REx spacecraft is currently orbiting Bennu with the aim of briefly touching the surface to obtain a sample from the asteroid in October 2020, and then returning it to Earth.

"While in orbit, the spacecraft has been sending images of Bennu back to Earth," Bottke said. "One of the most significant things we've noticed is that the asteroid is frequently ejecting materials into space. Tiny rocks are just flying off its surface, yet there is no evidence that they are propelled by sublimating ice, as one might expect from a comet. The biggest events launch rocks as large as a few centimeters."

Even more curious is the fact that the observed major ejection events tend to occur in the late afternoon on Bennu. Determined to get to the bottom of these events, Bottke reached out to Althea Moorhead at NASA's Marshall Space Flight Center. Moorhead is a member of NASA's Meteoroid Environment Office, a group that monitors and models meteoroids that may be hazardous to spacecraft.

"Over the years, Althea and her team have built a computer model that determines the number of tiny particles impacting spacecraft," Bottke explained. "We used this software to calculate the number of meteoroid impacts Bennu would face in its current orbit."

Many meteoroids originate from comets. As comets approach the Sun, pieces break off as a consequence of solar heating. Some comets even break apart, producing far more small particles than asteroid collisions in the asteroid belt do. For this reason, comet fragments are thought to be the major source of meteoroids that fill the inner solar system.

Interpreting their modeling results, Bottke's study suggests that as Bennu draws closer to the Sun in its orbit, it experiences a higher number of meteoroid impacts. Moreover, sand-sized meteoroids are predicted to hit Bennu with the force of a shotgun blast about once every two weeks, with most striking in the head-on direction. Their impact location on Bennu corresponds to late afternoon and early evening.
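Taking the quoted rate at face value, a quick back-of-envelope calculation gives a feel for these numbers (this toy treats the rate as constant, whereas the study finds it rises as Bennu approaches the Sun):

```python
import math

# Stated rate: one shotgun-force meteoroid impact roughly every two weeks.
rate_per_day = 1 / 14.0

expected_per_year = rate_per_day * 365

# Modeling impacts as a Poisson process, the probability of at least one
# such impact during any given week:
p_one_week = 1 - math.exp(-rate_per_day * 7)

print(round(expected_per_year, 1))  # 26.1 impacts per year on average
print(round(p_one_week, 2))         # 0.39
```

Even at this modest rate, a weak, porous surface would let each strike loft a surprising amount of debris, which is the link the study draws to the observed ejection events.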

Furthermore, Bottke's study points out that the Lunar Atmosphere and Dust Environment Explorer (LADEE) previously made similar observations about impacts on the Moon. As with Bennu, most meteoroids hit the Moon head-on (with head-on defined with respect to the motion of the Earth-Moon system around the Sun). The key difference between Bennu and the Moon is how they rotate around their spin axes. The Moon spins west to east, so head-on impacts correspond to sunrise. Bennu spins in the opposite direction, so head-on impacts hit near dusk.

At first, Bottke's modeling work seemed to predict that meteoroids would eject too little material from Bennu to explain the OSIRIS-REx observations. However, a better match could be obtained if Bennu has a weak, porous surface. The possibility that Bennu has this property was recently strengthened by studies of the Bennu-like asteroid Ryugu, the target of Japan's Hayabusa2 sample return mission. Using explosives to launch a small projectile into Ryugu, the Hayabusa2 team produced a crater that was larger than most impact experts expected. If Bennu's surface is indeed similar to Ryugu's, meteoroid impacts should be capable of ejecting relatively large amounts of debris.

The OSIRIS-REx mission is led by the University of Arizona. NASA's Goddard Space Flight Center provides overall mission management and Lockheed Martin Space built the spacecraft and executes flight operations. OSIRIS-REx is a New Frontiers Program mission, administered by the Marshall Space Flight Center in Huntsville, Alabama, for NASA's Science Mission Directorate.

For more information, visit https://www.swri.org/planetary-science.

Credit: 
Southwest Research Institute

Making dog food more delectable by analyzing aromas

Dogs aren't known for being picky about their food, eating the same kibble day after day with relish. However, owners of pampered pooches want their pets to have the best possible culinary experience, especially for those rare finicky canines. Now, researchers reporting results from a pilot study in ACS' Journal of Agricultural and Food Chemistry have identified key aroma compounds in dog food that seem to be the most appealing to canines.

For dogs, palatability depends on a food's appearance, odor, taste and texture -- just as it does for people. Previous studies have suggested that odor is especially important for dogs. Some scientists have identified volatile compounds in dog food, but not much is known about how specific aroma compounds influence how readily the dog eats the food. Maoshen Chen and colleagues wanted to identify the key aroma compounds in six dog foods and correlate the compounds with dogs' intake of the foods.

The researchers began by feeding six adult beagles one of six foods for one hour each and determining how much the dogs ate. The intake of three of the foods was two to four times higher than that of the other three foods. Using mass spectrometry, the researchers found that 12 volatile aroma molecules were correlated, either positively or negatively, with the beagles' intake of the six foods. Then, the researchers added each aroma compound to an odorless food and gave the beagles a choice between food containing one of the compounds and the odorless food itself. From these experiments, the team determined that the dogs preferred food containing (E)-2-hexenal (which humans associate with an unpleasant, fatty odor), 2-furfurylthiol (sulfury, roasted, smoky odor) and 4-methyl-5-thiazoleethanol (meaty odor). In contrast, the dogs didn't care for food containing (E)-2-octenal (a slightly different unpleasant, fatty odor). Although other dog breeds and more subjects should be tested, these results could help dog food manufacturers formulate more palatable chow, the researchers say.

The authors acknowledge funding from the National Natural Science Foundation of China, the National Key R&D Program of China, the 111 Project, the National First-Class Discipline Program of Food Science and Technology, and the program of Collaborative Innovation Center of Food Safety and Quality Control in Jiangsu Province, China.

The American Chemical Society (ACS) is a nonprofit organization chartered by the U.S. Congress. ACS' mission is to advance the broader chemistry enterprise and its practitioners for the benefit of Earth and its people. The Society is a global leader in providing access to chemistry-related information and research through its multiple research solutions, peer-reviewed journals, scientific conferences, eBooks and weekly news periodical Chemical & Engineering News. ACS journals are among the most cited, most trusted and most read within the scientific literature; however, ACS itself does not conduct chemical research. As a specialist in scientific information solutions (including SciFinder® and STN®), its CAS division powers global research, discovery and innovation. ACS' main offices are in Washington, D.C., and Columbus, Ohio.

To automatically receive press releases from the American Chemical Society, contact newsroom@acs.org.

Credit: 
American Chemical Society

Mineral undergoes self-healing of irradiation damage

image: Tabular monazite (top right; ca. 2.6 mm size) on a xenotime crystal. Königsalm near Senftenberg, Lower Austria.

Image: 
© Martin Slama

In nature there are quite a few minerals that incorporate uranium and thorium in their crystal structure. This causes radioactive self-irradiation that, over geologic periods of time, may destroy the crystal and transform it into a glassy form. As early as 1893, the Norwegian mineralogist and geologist Waldemar Christofer Brøgger introduced the term "metamict" to describe this glassy state.

Self-irradiating minerals are currently a focus of international research, because structural radiation damage may significantly affect the physical and chemical properties of minerals. Understanding the causes of these property changes is crucial for the Earth sciences - one of the most important techniques for determining the ages of minerals and rocks is based on the radioactive decay of uranium - and in materials science, as radioactive minerals are analogues of host ceramics for the immobilization of radioactive waste.

Monazite heals itself

Until now it was not understood why some minerals (such as zircon, ZrSiO4) are often found in nature in an irradiation-vitrified state, whereas other species (such as monazite, CePO4) - in spite of even higher self-irradiation - never become metamict but, rather, are always observed in a moderately radiation-damaged state. This is explained first of all by the low stability of the monazite structure, which leaves it unable to accumulate damage over geologic periods of time. Lutz Nasdala illustrates this, greatly simplified, with a comparison to cheese: "It is easily possible, using a pencil, to prick a hole into a hard ('stable') Emmentaler cheese, whereas analogously produced holes in a soft Camembert cheese would 'heal' in no time," Nasdala said.

Helium ions create and heal radiation damage

It has long been supposed that the partial self-healing of monazite is not only caused by the low thermal stability of this mineral, but is also related to the action of natural alpha particles (that is, energetic helium nuclei emitted by an unstable nucleus in an alpha-decay event). The latter, however, appeared to contradict the observation that crystalline monazite is prone to alpha-irradiation damage.

In the new study, the research team was able to unravel the causes of the self-healing by conducting irradiation experiments. Helium ions with energies of millions of electron volts (analogues of natural alpha particles) create structural damage in crystalline monazite; in contrast, the same helium ions cause structural recovery in radiation-damaged monazite. Crystalline monazite thus corresponds to the "Emmentaler cheese", whereas radiation-damaged monazite behaves like "Camembert cheese".

Such strong dependence of mineral properties on small changes in the structural state has never been described before. One consequence for Earth sciences research is that experiments with synthetic (that is, non-radiation-damaged) monazite may yield results that are not necessarily relevant for the behaviour of this (always moderately radiation-damaged) mineral in the Earth's interior.

Credit: 
University of Vienna

Mysterious cellular droplets come into focus

image: Individual protein molecules comprising the condensate are highlighted using color.

Image: 
Han-Yi Chou, University of Illinois, Urbana-Champaign

The world inside the human cell grew a bit more interesting in recent years as the role of a new biological structure became clearer.

It was long believed that most important operations in the cell occur within organelles. "They're there to do certain functions. For instance, mitochondria generate the energy that everything runs on," explained Aleksei Aksimentiev, a professor of physics at the University of Illinois at Urbana-Champaign. "What is common to all of them is that they're surrounded by a lipid membrane. What people recently discovered is there are organelles that don't have lipid bilayers. They assemble spontaneously in the form of droplets. And those organelles have particular functions."

In recent years, with improved imaging capabilities, the roles, occurrence, and behavior of these membrane-less organelles have become clearer. In 2017 they were given a name: biological condensates. They are thought to play a role in DNA repair and aging, and researchers believe a number of neurological diseases are related to condensates not working properly, including amyotrophic lateral sclerosis (ALS), in which nerve cells break down, leading to loss of muscle function.

"Let's say you have DNA and it suddenly has a break. It's usually a really bad thing, because it cannot replicate, but there's a machinery that will come and repair it," he explained. "A bubble of condensate forms that miraculously attracts only the molecules that are required to repair the DNA. There are all kinds of different condensates and they all recruit the right molecules somehow."

How do these membrane-less organelles spontaneously form? And how do they recruit other molecules to help them?

The physics of this process appears similar to phase separation, like the way oil and water spontaneously form droplets under the right conditions, but with some differences. In ordinary phase separation, a change in temperature usually drives the demixing; in biology, it is a change in concentrations.
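As a toy illustration of this kind of demixing, the classic textbook Flory-Huggins theory (a simple polymer-solution model, not the one used in the research described here) predicts the interaction strength at which a solution of chain molecules becomes unstable and spontaneously splits into dense droplets and a dilute phase:

```python
import math

def spinodal_chi(phi, N):
    """Flory-Huggins spinodal: the interaction parameter chi above which a
    solution of N-segment chains at volume fraction phi is unstable and
    demixes into a dense and a dilute phase."""
    return 0.5 * (1.0 / (N * phi) + 1.0 / (1.0 - phi))

def critical_point(N):
    """Composition and interaction strength where demixing first appears."""
    phi_c = 1.0 / (1.0 + math.sqrt(N))
    chi_c = (1.0 + math.sqrt(N)) ** 2 / (2.0 * N)
    return phi_c, chi_c

# Longer chains (larger N) demix at weaker interactions and lower
# concentrations -- one reason long, sticky proteins form droplets readily.
phi_c, chi_c = critical_point(100)
```

At fixed interaction strength, raising the local concentration past the spinodal triggers droplet formation, mirroring how concentration changes, rather than temperature, drive condensate formation in cells.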

"We don't know exactly how it works," Aksimentiev said. "I'm specifically interested in how this recruitment happens, and how molecules recognize other molecules."

Aksimentiev is using the Frontera supercomputer at the Texas Advanced Computing Center (TACC), one of the fastest in the world, to better understand this process. Over the last decade, he and others developed the tools and methods to explore the behavior of biological systems at the atomic level using molecular dynamics simulations.

Aksimentiev is able to simulate biological systems with millions of interacting atoms in a realistic environment for microseconds or even milliseconds -- the timescales at which biological systems operate. Today's supercomputers allow larger, faster simulations, and permit scientists to ask and answer new questions.

Even by the standards of the field, biological condensates are challenging to study computationally. Unlike other ordered systems like proteins with known rigid structures, or disordered systems like water, biological condensates are what's known as 'partially disordered' -- a particularly hard type of structure to simulate.

Writing in the Journal of Physical Chemistry Letters in May 2020, Aksimentiev and graduate student Han-Yi Chou described coarse-grained molecular dynamics simulations on Frontera that charted the phase diagram (a graphical representation of the physical states of a substance under different conditions of temperature and pressure) of one particular biomolecular condensate -- fused in sarcoma (FUS). A nuclear DNA/RNA binding protein, FUS regulates different steps of gene expression, including transcription, splicing and mRNA transport. The research was supported by grants from the National Science Foundation and the National Institutes of Health.

The researchers showed that a particle-based molecular dynamics model can reproduce known phase separation properties of a FUS condensate, including its critical concentration and susceptibility to mutations.

They also showed that they could use chain collapse theory to determine the thermodynamic properties of the condensate and to link them to changes in the shape of individual condensate molecules.

The behavior of a biological condensate, with all its complex inter- and intramolecular interactions, can be described by a polymer physics model, they found. This makes computer modeling a useful tool for uncovering the behavior of these still-mysterious cellular actors.

Aksimentiev's research sets the stage for future studies that will elucidate the molecular mechanisms driving the formation of droplets in more complex biological condensates, like those that repair RNA. The work is one step on a long path to fully elucidate the mystery of biological condensates in cells -- another trick of nature slowly uncovered.

Credit: 
University of Texas at Austin, Texas Advanced Computing Center

The Lancet Global Health: Modelling study estimates health-care cost of COVID-19 in low- and middle-income countries at US$52 billion every four weeks

New modelling research, published in The Lancet Global Health journal, estimates that it could cost low- and middle-income countries (LMICs) around US$52 billion (equivalent to US$8.60 per person) over four weeks to provide an effective health-care response to COVID-19, assuming each country's reproduction number (the average number of new people each case infects) remained unchanged (table 2).

However, the sizeable costs of a COVID-19 response in the health sector are likely to escalate if transmission increases--rising to as much as US$62 billion (US$10.15 per person) over four weeks under a scenario where current restrictions are relaxed and transmission increases by 50%.

This compares to the US$4.5 billion each year (equivalent to 65 cents per person) that the Commission on a Global Health Risk Framework for the Future recommended the world spend on pandemic preparedness in 2016--with most of this investment designated for upgrading public health infrastructure and capabilities in LMICs [1].

The authors caution that given each country's reproductive number was fixed for the analysis, the ability of individual-level measures (eg, contact tracing, quarantine, use of cloth masks) to slow transmission of the virus was not taken into account, so the true costs for countries could be lower.

"The costs of a COVID-19 response in the health sector will escalate, particularly if transmission increases", says Dr Agnès Soucat, Director of the Department of Health Systems Governance and Financing at WHO, Switzerland. "So instituting early and comprehensive measures to limit the further spread of the virus will be vital if we are to conserve resources and sustain the response." [2]

"Given the lack of pandemic preparedness in low- and middle-income countries, and the limited resilience of their health systems, major investment will be needed to counter the COVID-19 outbreak--reflecting the constrained health capacity in countries which are facing a virus that has spread and established itself", says co-author Dr Lucy Boulanger from WHO, Switzerland. "The arguments for investing in preparedness are strong, juxtaposed against this massive price tag for the response to COVID-19, and coupled with the expected shock on the global economy." [2]

In February 2020, WHO outlined the priority public health measures required by countries to counter the COVID-19 pandemic in the Strategic Preparedness and Response Plan (SPRP; table 1) [3].

In this study, researchers modelled the future health-care costs of the full response plan to counter the COVID-19 outbreak in 73 LMICs (accounting for 93% of the total population in LMICs) under three scenarios--the status quo in which current measures to restrict movement and ensure physical and social distancing continue so transmission levels are maintained; and under scenarios where restrictions are relaxed and transmission is increased by 50%, or when public health and social measures are intensified and transmission is decreased by 50% (table 2).

The number of COVID-19 cases was calculated from June 26, 2020, for each of the three scenarios over four weeks (ie, until July 24) and 12 weeks (until Sept 18), using an SEIR model--which captures the rate at which people move between the states of being susceptible, exposed, infected, and recovered.
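The SEIR dynamics described above can be sketched in a few lines. This is a generic textbook discretisation with illustrative parameters, not the WHO team's actual model or inputs; scaling `beta` up or down by 50% plays the role of the study's increased- and decreased-transmission scenarios:

```python
def seir(beta, sigma, gamma, N, I0, days):
    """Minimal SEIR model integrated with daily Euler steps.

    beta  -- transmission rate (new exposures per infectious person per day)
    sigma -- rate of becoming infectious (1 / incubation period)
    gamma -- recovery rate (1 / infectious period)
    """
    S, E, I, R = N - I0, 0.0, float(I0), 0.0
    history = []
    for _ in range(days):
        new_exposures = beta * S * I / N
        S, E, I, R = (S - new_exposures,
                      E + new_exposures - sigma * E,
                      I + sigma * E - gamma * I,
                      R + gamma * I)
        history.append((S, E, I, R))
    return history

# Twelve-week horizon (84 days), status quo vs. relaxed restrictions;
# all parameter values here are purely illustrative.
baseline = seir(beta=0.5, sigma=0.2, gamma=0.1, N=1_000_000, I0=100, days=84)
relaxed = seir(beta=0.75, sigma=0.2, gamma=0.1, N=1_000_000, I0=100, days=84)
```

Because each transition moves people between compartments without creating or destroying them, the total S + E + I + R stays fixed at the population size throughout the run.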

The researchers then modelled the cost of healthcare staff, equipment (eg, personal protective equipment, diagnostic tests), and infrastructure (eg, upgrading laboratories for diagnostic testing, establishing field hospitals to treat COVID-19 patients) that would be accrued by implementing the actions in the strategic response plan.

The total health-care costs for the COVID-19 response in the status quo scenario were estimated at around US$52 billion over four weeks, or US$8.60 per person. This amounts to around 20% of the health expenditure in low-income countries in 2017 (US$41 per capita for a whole year).

For the decreased or increased transmission scenarios, the totals were about US$33 billion (US$5.42 per person) and US$62 billion (US$10.15 per person), respectively (table 3).

Over 12 weeks, the costs are projected to triple under the status quo (US$154 billion) and increased transmission scenarios (US$197 billion), and to reach around US$52 billion in the decreased transmission scenario. The similarity in the price tag of the status quo scenario for 4 weeks and the decreased transmission scenario for 12 weeks highlights the importance of working to reduce virus transmission and contain the cost of the response, researchers say.

The main cost drivers were clinical case management (54% of the overall cost; eg, field hospitals, biomedical equipment, drugs, safe burial teams), maintaining essential services (21%; eg, coordination teams, salaries, outreach teams, rented ambulances), rapid response and case investigation (14%; eg, contact-tracing teams), and infection prevention and control (9%; eg, protective equipment, masks, hand-washing stations; table 4).

"Our results emphasize that critical components of health systems need to exist when an outbreak occurs - including healthcare staff, laboratories, and mechanisms for coordination - as these are essential to deliver an effective response", says Dr Tessa Tan-Torres Edejer who led the research. "More work needs to be done at the country level to identify gaps in both preparedness and response for this and future pandemics." [2]

The authors note several limitations, including that the analysis did not include isolation, quarantine, or waste management costs, and the use of international market prices without freight, insurance, and import tariffs might also underestimate costs.

Credit: 
The Lancet

Generic cholesterol drugs save medicare billions of dollars, study finds

image: Generics

Image: 
UT Southwestern Medical Center

DALLAS - Sept. 9, 2020 - The switch from brand name to generic cholesterol medications that occurred between 2014 and 2018 has saved Medicare billions of dollars, even as the number of people on cholesterol-lowering drugs has increased, UT Southwestern scientists have calculated. Their data, published in the journal JAMA Cardiology, suggest that policymakers and clinicians could help cut Medicare costs even further by switching more patients to generic drugs.

"One of the most important contributors to our health care costs is expenditure on prescription drugs," says Ambarish Pandey, M.D., a cardiologist and assistant professor of internal medicine at UTSW. "The switch to generics is an effective strategy to cut the costs incurred by health systems."

Pandey is considered a thought leader in how to leverage information gleaned from electronic health records and other health databases. He was named a Texas Health Resources Clinical Scholar in 2018 and has previously published papers on how much extended-release drugs cost the U.S. health care system, the burden of heart failure and heart attack-associated hospital readmission and mortality in the Medicare population, and the development of novel strategies to predict and prevent heart failure in this population, among other topics.

"It's important for our health care system to find avenues to become more cost-efficient and accessible. Even though there is still a lot of work to be done, it is encouraging to see how quickly patients switched to generic options once they became available," says first author Andrew Sumarsono, M.D., an assistant professor of internal medicine at UTSW. "This rapid switch to generics saved Medicare a lot of money."

In the U.S., 95 million adults have total cholesterol levels higher than 200 mg/dL, and most are recommended to lower their low-density lipoprotein (LDL) cholesterol through a combination of lifestyle changes and LDL-lowering medications. The most popular of these are statins, currently prescribed to more than 35 million people in the U.S.

Between 2010 and 2018, patents on a number of statins and other LDL-lowering drugs expired, enabling drugmakers to begin producing generic versions. The widely prescribed brand-name drugs Crestor and Zetia, for instance, both had patents expire during this time frame.

In the new study, Pandey and his colleagues used the Medicare Part D Prescription Drug Event Database - which includes information on all outpatient drug spending for Medicare Part D beneficiaries. The data they analyzed spanned January 2014 through December 2018. Between 2014 and 2018, the number of Medicare Part D enrollees grew from 37.7 million to 44.2 million, and the total number of prescriptions for LDL-lowering drugs increased by 23 percent, from 20.5 million to 25.2 million.

Despite that increase in prescriptions for LDL-lowering drugs, total spending on them went down. The number of prescriptions for generic drugs rose by 35 percent, from 17.8 million to 24 million, and overall spending on statins declined by 52 percent, from $4.8 billion in 2014 to $2.3 billion in 2018.
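The percentage changes quoted above follow directly from the reported totals; a quick sanity check using the study's figures:

```python
# Medicare Part D figures for 2014 vs. 2018, as reported in the study
figures = {
    "all LDL-lowering prescriptions": (20.5e6, 25.2e6),
    "generic prescriptions": (17.8e6, 24.0e6),
    "statin spending (USD)": (4.8e9, 2.3e9),
}

for name, (start, end) in figures.items():
    change = 100 * (end - start) / start
    print(f"{name}: {change:+.0f}%")
# all LDL-lowering prescriptions: +23%
# generic prescriptions: +35%
# statin spending (USD): -52%
```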

"This really shows how prescription patterns can have a very big influence on health care costs," says Pandey.

During the timespan studied, Medicare still spent $9.6 billion on brand-name LDL-lowering medications and could have saved an additional $2.5 billion of that by switching to generics more quickly when they became available, the study found.

A newer class of LDL-lowering drugs, called PCSK9 inhibitors, became available in 2015; Pandey's group showed that prescriptions for these drugs remained relatively low but increased by 144 percent, from 25,569 to 62,476, between 2016 and 2018. Follow-up studies could investigate the effect PCSK9 inhibitors have had on Medicare spending and prescriptions in the time since, Pandey says. He'd also like to study how out-of-pocket costs for patients have changed over time.

"Statins are one of the most important drugs to study in this context because they're just so widely prescribed," he adds. "But there are also other drugs that certainly have substantial costs to the health care system and need to be studied in this respect as well."

"It's amazing to see these drug advances in medicine," Sumarsono says. "But we also want to make sure that everyone who needs these drugs can afford them."

Credit: 
UT Southwestern Medical Center

Urbanization and agriculture are land uses that most affect Brazil's rivers

image: Survey shows how a lack of planning may affect the availability of a natural resource that is already becoming scarce

Image: 
Kaline de Mello

Brazil has more freshwater than any other country, but this resource is dwindling because of climate change, rising consumption and inadequate treatment, among other factors. Worse, Brazil's rivers are increasingly polluted due to a lack of proper land use planning.

Agriculture and urbanization are the main culprits, closely followed by mining. Although mining occupies a small percentage of Brazil's territory, it has a huge impact on water quality, according to a literature review by a group of researchers published in Journal of Environmental Management.

The review was led by Kaline de Mello, a biologist at the University of São Paulo's Institute of Biosciences (IB-USP). Mello is supported by the São Paulo Research Foundation (FAPESP).

Researchers affiliated with the Federal Universities of the ABC (UFABC), Minas Gerais (UFMG) and São Carlos (UFSCar) in Brazil and the University of Massachusetts (UMass Amherst) and Oregon State University (OSU) in the United States also participated.

This study is the first to provide a nationwide overview of the impact of land use on water quality. "Most research offers projections of the impact of land use changes on the amount of water available and not on water quality. We set out to see what water quality will be like 30 years from now," said Ricardo Hideo Taniwaki, a professor at UFABC and a coauthor of the published article.

The authors evaluated the impacts of all possible future scenarios, ranging from worst-case to best-case scenarios for the impact of changes in land use on water quality while also considering climate change.

Extensive survey

The analysis was divided into stages. First, having collected land use and land cover data from the platform Mapbiomas, the researchers observed conservation of native vegetation and the extent of activities potentially affecting water quality, particularly agriculture, pasture, silviculture (forestry), mining and urbanization.

"Next, we separated the field studies that assessed the effects of the activity in question on nearby rivers in the various Brazilian biomes," Mello said. The parameters used to measure water quality included fecal bacteria, sediment, nitrogen, phosphorus, heavy metals, and other pollutants.

The second stage showed that degradation varies according to the scale or dimension used to evaluate it and that this should be taken into account when conservation action is planned. Land use impacts on water quality are evaluated in one or all of the following spatial dimensions: at the water sampling site, in the riparian vegetation and in the entire catchment area. "Catchment analysis appears to best reflect overall water quality," Taniwaki said.

The temporal dimension involves rainfall and other seasonal variations, such as temperature. "This is important in the context of climate change," Taniwaki said. "Heavier precipitation and longer droughts are expected. In the absence of best agricultural practice, river pollution will increase."

Finally, the article discusses mathematical models that predict future water quality. "We highlight models available in Brazil that can be used to simulate the impact of positive and negative measures, as well as the data required to do so," Mello said.

Impact by soil type

Pasture and cropland account for 28.8% of the territory and are found mainly in the Cerrado (42% of the total) and Atlantic Rainforest (62%) biomes. "In areas of pasture, soil compaction by animals affects water absorption. Surface runoff increases, and so does the volume of polluted water entering streams and rivers when it rains," Mello said.

Agricultural activities also affect runoff dynamics and increase the amount of pollutants such as nitrogen, phosphorus and other chemicals in water courses. "It's worth recalling that Brazil is one of the world's largest consumers of fertilizer and agrochemicals, which have a significant impact on surface water and groundwater," Mello said.

In urban areas, there are two main problems. "The soil is almost entirely sealed and made impervious by concrete and tarmac, so runoff carrying pollutants of all kinds, including heavy metals, enters the water courses when it rains, and Brazil has few stormwater treatment programs," Taniwaki stated.

Although urban areas occupy only 0.6% of Brazil's land mass, cities are major drivers of water quality degradation due to untreated sewage, which fills rivers with fecal bacteria, organic matter and other pollutants. Some 48% of the population is not connected to a domestic sewerage network, and only 10% of the largest cities treat more than 80% of the domestic and industrial waste they collect.

Mining also occupies a small percentage of the territory but has an enormous local impact on water quality, discharging heavy metals that are toxic to plants and animals as well as humans into water courses. This impact was evidenced by catastrophic tailings dam failures in Mariana and Brumadinho, in the state of Minas Gerais.

The Mariana disaster polluted more than 650 km of the Doce, one of Brazil's major rivers, affecting over 1 million people. Analysis of water from the Paraopeba, one of the rivers affected by the Brumadinho disaster, showed that after the accident, levels of lead and mercury were 21 times the acceptable limits.

"More than 40 mine tailings dams are at risk of similar accidents up and down the country," Taniwaki said.

Most endangered biomes

Loss of native plant cover is the main threat to water sources in all biomes, Mello noted, citing the state of rivers and streams in the Atlantic Rainforest biome, where 65% of the population lives. Only 26% of the original vegetation remains in this dwindling biome, and water quality is considered good in only 6.5% of its rivers.

The Amazon biome and the Cerrado are also cause for concern. Although much of the Amazon's native vegetation is still in place, it is distinctly endangered. "In 2019, the Amazon suffered its greatest forest cover loss in ten years, according to the National Institute for Space Research [INPE]," Mello said.

Deforestation in the region grew 108% in January 2020 compared with the same month of 2019. Only 19% of the original vegetation survives in the Cerrado. "More research is needed on water quality in these two biomes, which are suffering the most from the advance of the agricultural frontier," Mello said.

The future of water in Brazil

Public administrators and researchers can use the mathematical models available in the literature to predict future water quality in their regions and help make decisions about the kind of intervention that will be most effective in each specific situation. One of the tools highlighted in the article is multicriteria assessment, an approach in which civil society and private enterprise partner with the state in prioritizing areas to be restored at a time of financial austerity.

The quality of the available data must be improved in order for this analysis to be performed more reliably, and the researchers also argue that the quantity is insufficient and that far more data are needed. "It's hard to make predictions with the water quality and land use data we have now, and predictions are vital to public policy formulation," Taniwaki said.

"The estimates now available point to severe water quality degradation unless deforestation is halted and basic sanitation improves in the years ahead," declared Mello. The long-term negative consequences include increased public spending to treat polluted water before it is used or to transport it from more distant areas. This extra cost will have to be passed on to consumers in their water bills. Drastic changes in the other environmental services provided by rivers will also be required.

"On the other hand, simulations of restoration in Permanent Conservation Areas [APPs in Portuguese, mainly riparian forest] resulting from compliance with the Brazilian Forest Code point to increased water quality due to a reduction in sediments, nitrogen and phosphorus," according to Mello.

Hence, it is important to enforce environmental legislation and to plan agricultural and urban expansion. "The literature we reviewed also shows the negative effects of lowered standards, watered-down legislation, and less investment in research," Taniwaki said.

Credit: 
Fundação de Amparo à Pesquisa do Estado de São Paulo

Where rocks come alive: NASA's OSIRIS-REx observes an asteroid in action

video: Using data collected by NASA's OSIRIS-REx mission, this animation shows the trajectories of particles after their emission from asteroid Bennu's surface. The animation emphasizes the four largest particle ejection events detected at Bennu from December 2018 through September 2019. Additional particles, some with lifetimes of several days, that are not related to the ejections are also visible.

Image: 
M. Brozovic/JPL-Caltech/NASA/University of Arizona

It's 5 o'clock somewhere - and while here on Earth, "happy hour" is commonly associated with winding down and the optional cold beverage, that's when things get going on Bennu, the destination asteroid of NASA's OSIRIS-REx mission.

In a special collection of research papers published Sep. 9 in the Journal of Geophysical Research: Planets, the OSIRIS-REx science team reports detailed observations that reveal Bennu is shedding material on a regular basis. The OSIRIS-REx spacecraft has provided planetary scientists with the opportunity to observe such activity at close range for the first time ever, and Bennu's active surface underscores an emerging picture in which asteroids are quite dynamic worlds.

The publications provide the first in-depth look at the nature of Bennu's particle ejection events, detail the methods used to study these phenomena, and discuss the likely mechanisms at work that cause the asteroid to release pieces of itself into space.

The first observation of particles popping off the asteroid's surface was made in January 2019, mere days after the spacecraft arrived at Bennu. This event might have gone completely unnoticed had it not been for the keen eye of Carl Hergenrother, the mission's lead astronomer at the University of Arizona's Lunar and Planetary Laboratory and one of the lead authors of the collection.

Much like ocean-going explorers in centuries past, the space probe relies on stars to fix its position and remain on course during its years-long voyage. A specialized navigation camera onboard the spacecraft takes repeat images of background stars. By cross-referencing the constellations the spacecraft "sees" with programmed star charts, course corrections can be made as necessary.

Hergenrother was poring over these images that the spacecraft had beamed back to Earth when something caught his attention. The images showed the asteroid silhouetted against a black sky dotted with many stars - except there seemed to be too many.

"I was looking at the star patterns in these images and thought, 'huh, I don't remember that star cluster,'" Hergenrother said. "I only noticed it because there were 200 dots of light where there should be about 10 stars. Other than that, it looked to be just a dense part of the sky."

A closer inspection and an application of image-processing techniques unearthed the mystery: the "star cluster" was in fact a cloud of tiny particles that had been ejected from the asteroid's surface. Follow-up observations made by the spacecraft revealed the telltale streaks typical of objects moving across the frame, setting them apart from the background stars that appear stationary due to their enormous distances.
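The underlying principle is simple: between two exposures, stars stay put while ejected particles move, so subtracting aligned frames cancels the fixed stars and leaves only the movers. The toy sketch below illustrates that idea on synthetic frames; it is not the mission's actual image-processing pipeline, which uses far more sophisticated algorithms developed at the Catalina Sky Survey.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_frame(star_pixels, particle_pixels):
    """Synthetic 64x64 navcam-style frame: bright points on a noisy dark sky."""
    frame = rng.normal(10.0, 1.0, (64, 64))   # sky background noise
    for y, x in star_pixels + particle_pixels:
        frame[y, x] += 200.0                  # bright point source
    return frame

stars = [(5, 5), (20, 40), (50, 12)]          # fixed between exposures
frame1 = make_frame(stars, particle_pixels=[(30, 30)])
frame2 = make_frame(stars, particle_pixels=[(30, 33)])  # particle drifted

# Stationary stars cancel in the difference; a moving particle leaves
# a bright spot at its new position and a dark one at its old position.
diff = frame2 - frame1
movers = np.argwhere(np.abs(diff) > 100.0)
print(movers)  # old and new particle positions: (30, 30) and (30, 33)
```

In practice the frames must first be registered against the star catalog, and real detections are vetted against cosmic-ray hits and sensor artifacts before being counted as particles.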

"We thought that Bennu's boulder-covered surface was the wild card discovery at the asteroid, but these particle events definitely surprised us," said Dante Lauretta, OSIRIS-REx principal investigator and professor at LPL. "We've spent the last year investigating Bennu's active surface, and it's provided us with a remarkable opportunity to expand our knowledge of how active asteroids behave."

Since arriving at the asteroid, the team has observed and tracked more than 300 particle ejection events on Bennu. According to the authors, some particles escape into space, others briefly orbit the asteroid, and most fall back onto its surface after being launched. Ejections most often occur during Bennu's local two-hour afternoon and evening timeframe.

The spacecraft is equipped with a sophisticated set of electronic eyes - the Touch-and-Go Camera Suite, or TAGCAMS. Although its primary purpose is to assist in spacecraft navigation, TAGCAMS has now been placed into active duty spotting any particles in the vicinity of the asteroid.

Using software algorithms developed at the Catalina Sky Survey, which specializes in discovering and tracking near-Earth asteroids by detecting their motion against background stars, the OSIRIS-REx team found the largest particles erupting from Bennu to be about 6 centimeters (2 inches) in diameter. Due to their small size and low velocities - this is like a shower of tiny pebbles in super-slo-mo - the mission team does not deem the particles a threat to the spacecraft.

"Space is so empty that even when the asteroid is throwing off hundreds of particles, as we have seen in some events, the chances of one of those hitting the spacecraft is extremely small," Hergenrother said, "and even if that were to happen, the vast majority of them are not fast or large enough to cause damage."

During a number of observation campaigns between January and September 2019 dedicated to detecting and tracking mass ejected from the asteroid, a total of 668 particles were studied, with the vast majority measuring between 0.5 and 1 centimeters (0.2-0.4 inches), and moving at about 20 centimeters (8 inches) per second, about as fast - or slow - as a beetle scurrying across the ground. In one instance, a speedy outlier was clocked at about 3 meters (9.8 feet) per second.

On average, the authors observed one to two particles kicked up per day, with much of the material falling back onto the asteroid. Add to that the small particle sizes, and the mass loss becomes minimal, Hergenrother explained.

"To give you an idea, all of those 200 particles we observed during the first event after arrival would fit on a 4-inch x 4-inch tile," he said. "The fact that we can even see them is a testament to the capabilities of our cameras."

The authors investigated various mechanisms that could cause these phenomena, including released water vapor, impacts by small space rocks known as meteoroids and rocks cracking from thermal stress. The two latter mechanisms were found to be the most likely driving forces, confirming predictions about Bennu's environment based on ground observations preceding the space mission.

As Bennu completes one rotation every 4.3 hours, boulders on its surface are exposed to a constant thermo-cycling as they heat during the day and cool during the night. Over time, the rocks crack and break down, and eventually particles may be thrown from the surface. The fact that particle ejections were observed with greater frequency during late afternoon, when the rocks heat up, suggests thermal cracking is a major driver. The timing of the events is also consistent with the timing of meteoroid impacts, indicating that these small impacts could be throwing material from the surface. Either, or both, of these processes could be driving the particle ejections, and because of the asteroid's microgravity environment, it doesn't take much energy to launch an object from Bennu's surface.

Of the particles the team observed, some had suborbital trajectories, keeping them aloft for a few hours before they settled back down, while others flew off the asteroid into their own orbits around the sun.

In one instance, the team tracked one particle as it circled the asteroid for almost a week. The spacecraft's cameras even witnessed a ricochet, according to Hergenrother.

"One particle came down, hit a boulder and went back into orbit," he said. "If Bennu has this kind of activity, then there is a good chance all asteroids do, and that is really exciting."

As Bennu continues to unveil itself, the OSIRIS-REx team continues to discover that this small world is increasingly complex. These findings could serve as a cornerstone for future planetary missions that seek to better characterize and understand how these small bodies behave and evolve.

Credit: 
NASA/Goddard Space Flight Center

Designed antiviral proteins inhibit SARS-CoV-2 in the lab

video: B-roll of Dr. Longxing Cao of the UW Medicine Institute for Protein Design in Seattle at work conducting his studies of computer-designed, synthetic small protein binders as potential antivirals against the pandemic coronavirus. There are no live viruses in this lab.

Image: 
Ian Haydon/UW Medicine Institute for Protein Design

Computer-designed small proteins have now been shown to protect lab-grown human cells from SARS-CoV-2, the coronavirus that causes COVID-19.

The findings are reported today, Sept. 9, in Science.

In the experiments, the lead antiviral candidate, named LCB1, rivaled the best-known SARS-CoV-2 neutralizing antibodies in its protective actions. LCB1 is currently being evaluated in rodents.

Coronaviruses are studded with so-called Spike proteins. These latch onto human cells to enable the virus to break in and infect them. Drugs that interfere with this entry mechanism could lead to the treatment, or even the prevention, of infection.

Institute for Protein Design researchers at the University of Washington School of Medicine used computers to originate new proteins that bind tightly to SARS-CoV-2 Spike protein and obstruct it from infecting cells.

Beginning in January, more than two million candidate Spike-binding proteins were designed on the computer. Over 118,000 were then produced and tested in the lab.

"Although extensive clinical testing is still needed, we believe the best of these computer-generated antivirals are quite promising," said lead author Longxing Cao, a postdoctoral scholar at the Institute for Protein Design.

"They appear to block SARS-CoV-2 infection at least as well as monoclonal antibodies, but are much easier to produce and far more stable, potentially eliminating the need for refrigeration," he added.

The researchers created antiviral proteins through two approaches. First, a segment of the ACE2 receptor, which SARS-CoV-2 naturally binds to on the surface of human cells, was incorporated into a series of small protein scaffolds.

Second, completely synthetic proteins were designed from scratch. The latter method produced the most potent antivirals, including LCB1, which is roughly six times more potent on a per mass basis than the most effective monoclonal antibodies reported thus far.

Scientists from the University of Washington School of Medicine in Seattle and Washington University School of Medicine in St. Louis collaborated on this work.

"Our success in designing high-affinity antiviral proteins from scratch is further proof that computational protein design can be used to create promising drug candidates," said senior author and Howard Hughes Medical Institute Investigator David Baker, professor of biochemistry at the UW School of Medicine and head of the Institute for Protein Design. In 2019, Baker gave a TED talk on how protein design might be used to stop viruses.

To confirm that the new antiviral proteins attached to the coronavirus Spike protein as intended, the team collected snapshots of the two molecules interacting by using cryo-electron microscopy. These experiments were performed by researchers in the laboratories of David Veesler, assistant professor of biochemistry at the UW School of Medicine, and Michael S. Diamond, the Herbert S. Gasser Professor in the Division of Infectious Diseases at Washington University School of Medicine in St. Louis.

"The hyperstable minibinders provide promising starting points for new SARS-CoV-2 therapeutics," the antiviral research team wrote in their study pre-print, "and illustrate the power of computational protein design for rapidly generating potential therapeutic candidates against pandemic threats."

Credit: 
University of Washington School of Medicine/UW Medicine