
NIH funding for research related to every new cancer drug approved from 2010-2016 totals $64B

Federally funded research contributed to the science underlying each of the 59 new cancer drugs approved by the FDA from 2010-2016, according to a study from Bentley University. The article, titled "NIH funding for research underlying new cancer therapies," suggests that the level of NIH funding for cancer research is substantially higher than previously estimated.

The new report from the Center for Integration of Science and Industry, published in The Lancet Oncology, shows that the United States government invested $64 billion in research related to these 59 drugs. This included $54 billion for basic research related to the biological targets for these drugs and $9.9 billion for applied research related to the drugs themselves. This total is greater than the annual budget allocations for the National Cancer Institute (NCI) over the last decade, as well as previous estimates of total NIH spending on cancer in the NIH Research, Condition, and Disease Categorization (RCDC) database.

Overall, the NCI contributed $20 billion to research on new cancer drugs, representing 31% of the total for basic research and 43% of the total for applied research. The majority of the research leading to these drugs was funded by other institutes across the NIH, reflecting the "spillover" of basic research in areas such as immunology and endocrinology, which proved to have practical applications in treating cancer.
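As a rough arithmetic cross-check of the shares quoted above, the sketch below applies the reported percentages to the article's rounded totals of $54 billion (basic) and $9.9 billion (applied); the small gap versus the reported $20 billion reflects rounding in the published figures.

```python
# Sanity check using the article's rounded figures (an approximation, since the
# underlying study reports more precise numbers than this press release).
basic_total = 54.0    # $ billions, basic research on biological targets
applied_total = 9.9   # $ billions, applied research on the drugs themselves

nci_contribution = 0.31 * basic_total + 0.43 * applied_total
print(f"Estimated NCI contribution: ${nci_contribution:.1f}B")  # ~$21B, close to the reported $20B
```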

"This study suggests that the scale of the public sector contribution to the basic and applied research that leads to new cancer therapies is substantially greater than previously estimated, and demonstrates the importance of funding a broad base of basic research that may have unexpected applications," said Dr. Fred Ledley, Director of the Center for Integration of Science and Industry, and the senior author on this study. "While much of the research we identified in this study was originally aimed at understanding the immune system and treating immunological disease, this research ultimately provided insights that led to the development of innovative immunotherapies for cancer."

The study involved the analysis of 712 thousand research publications that were directly related to the 59 new cancer therapies approved from 2010-2016 or their biological targets. Of these, 37% were supported by the NIH. This support comprised 117 thousand fiscal years of funding totaling $63.9 billion. Of this total, 92% was for basic research on biological targets and systems, rather than the drugs themselves.

Credit: 
Bentley University

On the hunt for megafauna in North America

image: Researchers analysed ancient DNA from bone fragments and soil found inside Hall's Cave, located in central Texas.

Image: 
Curtin University, Mike Bunce

Research from Curtin University has found that pre-historic climate change does not explain the extinction of megafauna in North America at the end of the last Ice Age.

The research, published today in Nature Communications, analysed ancient DNA from bone fragments and soil found inside Hall's Cave, located in central Texas. The researchers discovered important genetic clues to the past biodiversity in North America and provided new insights into the causes of animal extinctions during the Ice Age.

The research was an international collaboration between Curtin University, University of Texas-Austin, Texas A&M University and Stafford Research Labs.

Lead researcher Mr Frederik Seersholm, Forrest Foundation Scholar and PhD candidate from Curtin's School of Molecular and Life Sciences, said the analysis tracks how biodiversity in Texas changed as temperatures dropped, and then recovered around 13,000 years ago.

"At the end of the last ice-age, Earth experienced drastic climate changes that significantly altered plant and animal biodiversity. In North America these changes coincided with the arrival of humans," Mr Seersholm said.

"When we combined our new data with existing fossil studies in the region, we obtained a detailed picture of the biodiversity turnover against the backdrop of both human predation and pre-historic climate changes.

"Our findings show that while plant diversity recovered as the climate warmed, large animal diversity did not recover.

"Of the large-bodied animals, known as megafauna, identified at the cave, nine became extinct and five disappeared permanently from the region.

"In contrast, small animals which are not believed to have been hunted intensely by humans, adapted well to the changing climate by migrating. Hence, the data suggests a factor other than climate may have contributed to the extinction of the large mammals."

While the research team acknowledges it is difficult to assess the exact impact of human hunting on the megafauna, they believe there is now sufficient evidence to suggest our ancestors were the main driver of the disappearance of ice age species such as the mammoth and sabre-toothed cat.

Mr Seersholm said the findings demonstrate how much information is stored in seemingly insignificant bone fragments.

"The study builds on years of research at Hall's cave, which have helped shape our understanding of the North American megafauna since the first analyses were conducted in the 1990s," Mr Seersholm said.

"By combining new genetic methods with classic stratigraphy and vertebrate palaeontology, our research adds to this story.

"We found that while small mammals and plants in the region seemed to be able to cope fine with the changing climate, the megafauna did not. Because humans are the only other major factor, we hypothesise that human hunting of megafauna was the driving force of the animals' decline."

Credit: 
Curtin University

Development of a novel approach 'Tracing Retrogradely the Activated Cell Ensemble'

image: The scheme shows the TRACE method, which labels active inputs to a specific brain area through a retrograde, activity-dependent viral labelling approach.

Image: 
Nathalie Krauth

A major interest in brain research is understanding behavioural experiences at the synaptic and circuit level. However, gaining insight into the neural circuits that underlie a specific behavioural experience is a challenge. The difficulty lies in determining which of the many neuronal inputs that a specific brain region receives drive a defined behaviour, and there are no direct methods available to overcome this challenge. Today, a common mapping approach in neuroscience is to use retrograde tracing viruses to identify all neuronal inputs to a population of neurons in a specific brain region. But neuroscientists are still left to rely on their experience, knowledge and previous findings to pinpoint the specific inputs that underlie a defined behaviour.

A collaborative study led by circuitry experts at the Department of Molecular Biology and Genetics, the Department of Biomedicine and the Centre for Proteins in Memory presents a new, unbiased and accurate strategy that selectively tags incoming inputs that are activated by a defined stimulus and project to a particular brain region. The new approach, TRACE, combines a retrograde virus with a Cre recombinase under the control of an activity-dependent promoter to express a fluorescent marker in input neurons.

Using mouse models, the Nabavi and Capogna groups have demonstrated the input specificity of their novel approach on four different brain circuits receiving defined sensory inputs. The researchers performed behavioural testing using optogenetic stimulation, odor-driven innate fear detection and shock-induced circuit experiments. They also imaged brain slices to investigate specific neural activity.

Their findings show that TRACE can identify active sensory neural inputs that underlie a defined behaviour. This new labelling system is an additional tool in the toolbox to gain access to neurons in the brain that are activated by a particular stimulation or experience.

Credit: 
Aarhus University

New experiment design improves reproducibility

image: Prof. Dr Holger Schielzeth from the University of Jena is part of the team that recommends the targeted inclusion of variations in the design of animal experiments in order to increase the area to which the results can be reliably transferred.

Image: 
(Photo: Jan-Peter Kasper/FSU)

Jena, Germany (02.06.2020) For some scientific disciplines, such as medical or drug research, experiments with live animals are still indispensable. Scientists are aware of their responsibilities in this sensitive area and strive to keep the number of experiments as low as possible. Extensive standardisation processes are supposed to increase the efficiency of the experiments, thus reducing the number needed. However, biological complexity, and in particular the context dependence of individual experiments, often makes it difficult to reproduce and generalise the results. In the journal "Nature Reviews Neuroscience", an international team led by the University of Bern now recommends ways of reducing the number of experiments.

Results often depend on context

"The reproducibility of results is a crucial element of science. Results are reproducible if research results obtained from an initial study can be confirmed in independent replicate studies," explains ecologist Prof. Holger Schielzeth from the University of Jena in Germany, one the study's co-authors. "A fundamental problem of biological research is that the results are often very dependent on context. We therefore propose integrating one of these influencing factors - namely biological variability - into the design of the experiment in order to produce more generally valid results."

Standardisation is limiting

Researchers currently standardise the conditions and characteristics of laboratory animals in experiments, such as for the administration of a potential drug, according to strict criteria. In doing this, they want to eliminate all influencing factors that have nothing to do with the immediate objective of the experiment and thus increase the reproducibility of the results. This standardised approach, however, limits the range of conditions to which the results obtained can be generalised. This means that more studies are necessary to confirm the results.

"We therefore recommend the targeted inclusion of contextual variation into the design of experiments, so as to increase the range to which the results can be reliably transferred," says Schielzeth. "This increases the potential for reproducibility and thus reduces the total number of experiments." A "systematic heterogenisation" of animal characteristics and environmental factors could be achieved in a modified version of the randomised block design. This involves pairing up treatments and experimental controls in small blocks, each block being tested in slightly different contexts.

Fewer follow-up studies needed

This arrangement enables researchers to find out whether specific results can be generalised or are to be attributed to influencing factors specific to the experiment. Scientists would be able to address biological variations within a study and, for example, take different sexes, age categories or housing conditions of the animals into consideration. This would give them more reliable findings from a single experiment. Further research on this new method should lead to better guidelines for future experiments.

"We are aware that this experiment design can lead to an increase in the number of animals used for experiments during an initial study," says Schielzeth. "However, much fewer follow-up studies are needed to verify the result, which leads to a significant reduction in the number of animals overall." The team therefore calls on research institutions and regulatory authorities to introduce systematic heterogenisation as a standard model for experiments.

Credit: 
Friedrich-Schiller-Universitaet Jena

New discovery could highlight areas where earthquakes are less likely to occur

Scientists from Cardiff University have discovered specific conditions that occur along the ocean floor where two tectonic plates are more likely to slowly creep past one another as opposed to drastically slipping and creating catastrophic earthquakes.

The team have shown that where fractures lie on the ocean floor, at the junction of two tectonic plates, sufficient water is able to enter those fractures and trigger the formation of weak minerals which in turn helps the two tectonic plates to slowly slide past one another.

The new findings, which have been published in the journal Science Advances, could potentially help scientists understand the size of stresses at specific fault lines and whether or not the tectonic plates could possibly trigger an earthquake.

This, in turn, could potentially contribute to solving one of the greatest challenges that faces seismologists, which is to be able to forecast earthquakes with enough precision to save lives and reduce the economic damage that is caused.

Earth's outer layer, the lithosphere, is made up of tectonic plates that shift over the underlying asthenosphere, like floats on a swimming pool, at rates of centimetres per year.

Stresses begin to build up where these plates meet and are relieved at certain times either by earthquakes, where one plate catastrophically slips beneath the other at a rate of meters per second, or by creeping whereby the plates slip slowly past one another at a rate of centimetres per year.

Scientists have for a long time been trying to work out what causes a particular plate boundary to either creep or to produce an earthquake.

It is commonly believed that the slip of tectonic plates at the juncture of an oceanic and continental plate is caused by a weak layer of sedimentary rock on the top of the ocean floor; however, new evidence has suggested that the rocks deeper beneath the surface in the oceanic crust could also play a part and that they may be responsible for creep as opposed to earthquakes.

In their study, the team from Cardiff University and Tsukuba University in Japan looked for geological evidence of creep in rocks along the Japan coast, specifically in rocks from oceanic crust that had been deeply buried in a subduction zone, but through uplift and erosion were now visible on the Earth's surface.

Using state-of-the-art imaging techniques the team were able to observe the microscopic structure of the rocks within the oceanic crust and use them to estimate the amount of stress that was present at the tectonic plate boundary.

Their results showed that the oceanic crust was in fact far weaker than previously assumed by scientists.

"This means that, at least in the ancient Japanese subduction zone, slow creep within weak, wet oceanic crust could allow the ocean lithosphere to slip underneath the overlying continent without earthquakes being generated," said lead-author of the study Christopher Tulley, from Cardiff University's School of Earth and Ocean Sciences.

"Our study therefore confirms that oceanic crust, typically thought to be strong and prone to deforming by earthquakes, may instead commonly deform by creep, providing it is sufficiently hydrated."

Credit: 
Cardiff University

Integrating behavioral health services into medical practices faces barriers

Integrating behavioral health services into physician medical practices faces cultural and financial barriers, but providing technical support and improved payment models may enhance the long-term sustainability of the approach, according to a new RAND Corporation study conducted in collaboration with the American Medical Association.

Examining a diverse group of 30 physician practices that have pursued behavioral health integration, researchers found that there were many reasons that practices incorporated mental health services, including wanting to improve quality by expanding access to behavioral health services.

But even within the study's sample of practices that had successfully adopted behavioral health integration, financial sustainability was a pervasive concern, regardless of the payment models used by the practices. The findings are published online by the Annals of Internal Medicine.

"We found that behavioral health integration is possible in a wide variety of medical practices, not just in primary care," said Dr. Peggy G. Chen, co-author of the study and a physician researcher at RAND, a nonprofit research organization. "The key factor in the success of behavioral health integration was adaptation to each practice's needs and resources."

"The COVID-19 pandemic has exposed and magnified the flaws in our mental health system and the true burden of mental illness in our country," said Dr. Patrice A. Harris, president of the American Medical Association. "Behavioral health care integration can help save lives and is a proven model that has many advantages over a more divided one. The AMA is committed to establishing a viable pathway for combining physical and behavioral health care to make a real impact in our nation's growing mental health crisis."

One in 5 adults in the U.S. has a clinically significant mental health or substance use disorder, yet many people do not receive treatment for their problems because of a shortage of mental health providers and lack of access to mental health services. One potential solution to the low levels of mental health treatment is integrating behavioral health into medical care.

Most approaches to behavioral health integration fall into two general archetypes: a co-located model where onsite behavioral health clinicians provide enhanced access within physician practices or an offsite model where behavioral health clinicians (usually psychiatrists) supervise onsite care managers who help nonbehavioral health clinicians meet their patients' behavioral health needs.

To explore the experiences of successful behavioral health integration, researchers interviewed leaders and clinicians from 30 physician practices in different parts of the country and from different medical specialties that have implemented behavioral health integration.

They also consulted with experts in clinical care, research and health policy related to behavioral health integration and vendors that provide behavioral telehealth services or technical integration assistance to physician practices.

Physician practice leaders reported positive effects of behavioral health integration on their practices, such as creating an increased sense of providing high-quality patient care and meeting more of their patients' needs.

Barriers to behavioral health integration, they said, included cultural differences with mental health providers and impediments to the flow of information between medical and behavioral health providers.

Researchers say that efforts to improve interprofessional training and collaboration may help address cultural barriers and facilitate patient care that addresses both medical and behavioral health needs. In addition, enhancements to electronic health records and clarification of privacy regulations may improve communication between behavioral and nonbehavioral health clinicians.

Although prior research has demonstrated a favorable return on investment for behavioral health integration, medical practices in the new study reported difficulty in estimating the specific effects of behavioral health integration on total medical expenses.

"Despite research evidence demonstrating the effectiveness of behavioral health integration, cultural, informational and financial challenges remain," Chen said. "Tailored, context-specific technical support to guide practices' efforts and payment models that improve the business case for those efforts may enhance the long-term sustainability of behavioral health integration."

Credit: 
RAND Corporation

New technique takes 3D imaging an octave higher

image: The CSU-developed harmonic optical tomography microscope.

Image: 
Jeff Field/Colorado State University

A collaboration between Colorado State University and University of Illinois at Urbana-Champaign resulted in a new, 3D imaging technique to visualize tissues and other biological samples on a microscopic scale, with potential to assist with cancer or other disease diagnoses.

Their technique, which allows specimens to generate light at double the frequency, or half the wavelength, of the incident light, is referred to as harmonic optical tomography and looks at 3D signals that are generated from the sample. The team's work is described in a paper, "Harmonic optical tomography of nonlinear structures," published online June 1 in Nature Photonics.

Harmonic optical tomography, or HOT, uses holographic information, which measures both the intensity and the phase delay of the light, to generate 3D images of a sample by exploiting a new physical mechanism.

"Our lab specializes in using holographic data to investigate live cells and tissues," said Gabriel Popescu, a professor of electrical and computer engineering at University of Illinois and director of the Quantitative Light Imaging Laboratory at the Beckman Institute for Advanced Science and Technology. "We wanted to extend this technique to nonlinear samples by combining the holographic data and new physics models."

Range of applications

Usually, images such as those captured by a cellphone camera flatten three-dimensional information onto a two-dimensional picture. Three-dimensional imaging that can peer into the interior of an object provides critical information for a diverse range of applications, such as medical diagnostics and finding cracks in oil wells and airplane wings, using tomographic X-ray and ultrasound methods.

In this collaboration, the team developed theoretical models to describe how to image the tissue and discovered a unique capability for 3D imaging that arises, counterintuitively, from illuminating the sample with blurry, out-of-focus laser light. The team designed and built a new system at Colorado State University to collect data, which was then reconstructed with computational imaging algorithms. The experiments verified an entirely new form of optical tomography and provided outstanding validation of the theoretical predictions.

"A key to the experimental demonstration of this new nonlinear tomographic imaging was a custom, high-power laser, designed and built by CSU graduate student Keith Wernsing," said Randy Bartels, professor in CSU's Department of Electrical and Computer Engineering and paper co-author. "This source was integrated into a custom off-axis holographic microscope that used a high numerical aperture condenser lens defocused for widefield illumination. It is this special illumination condition that allows the nonlinear optics to create the second-harmonic generation signal and obtain information to form a 3D image. This work is an exciting example of how close dialogue enables refinement of both theory and experimental design to produce innovative new concepts."

Added Varun Kelkar, an ECE graduate student who formerly worked with paper co-author Professor Kimani C. Toussaint, Jr.: "HOT started out as an interesting theoretical project I worked on with Professor Popescu as a part of his graduate level microscopy course in my first year of grad school. Developing the idea required synthesis of concepts from several sub-fields of optics that I learned throughout my undergrad and grad school. I am excited to see it mature into a functioning experimental prototype." Kelkar is currently a member of Professor Mark Anastasio's Computational Imaging Science Lab at University of Illinois.

Two types of samples

The researchers used two types of samples to test their theory, said Chengfei Hu, a graduate student in the Popescu group. The first was a manufactured crystal that is typically used for generating nonlinear signals. The second was a biological sample, a piece of muscle tissue. The technique is useful for looking at objects that are difficult to study by conventional imaging methods.

Collagen is an extremely bright generator of second-harmonic light, via the same process that makes the green light in a laser pointer. Because collagen is the most abundant protein in the human body, its nonlinear ability to change the frequency of light made it a natural target for this new tomographic imaging approach, Popescu said. "Most investigators look at it in 2D and not 3D," he said. "Using this technique, we can use the orientation of the collagen fibers as a reporter of how aggressive the cancer is."
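For a concrete sense of the frequency-doubling process mentioned here: a typical green laser pointer converts 1064 nm infrared light into 532 nm green light by second-harmonic generation, halving the wavelength because the frequency doubles. The short sketch below illustrates that relationship with generic numbers, not the wavelengths used in the HOT microscope.

```python
# Second-harmonic generation: the output has twice the frequency and hence
# half the wavelength of the incident light.
C = 299_792_458  # speed of light, m/s

def second_harmonic(wavelength_nm: float) -> float:
    """Return the second-harmonic wavelength (nm) for a given incident wavelength."""
    freq = C / (wavelength_nm * 1e-9)   # incident frequency, Hz
    return C / (2 * freq) * 1e9         # doubled frequency -> halved wavelength

print(second_harmonic(1064))  # 532.0 nm: the green light of a typical laser pointer
```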

According to Jeff Field, director of the Microscopy Core Facility at CSU and a research scientist in electrical engineering: "This new type of tomographic imaging could prove to be very valuable for a wide range of studies that currently rely on two-dimensional images to understand collagen fiber orientation, which has been shown to be prognostic for a number of types of cancer."

Simpler and faster

Field, who helped design and build the HOT microscope, compared this new tomographic strategy to other forms of tomography.

"In most tomographic imaging methods, like a CT (computed tomography) scan in the hospital, either the sample or the illumination must be rotated, which can be very challenging to implement on a microscopic scale," Field explained. "With this new method, the sample only needs to be translated in one direction, which simplifies the geometry significantly and minimizes misalignments, making it easier to apply to a wide array of applications."

Field went on to describe how much faster the new HOT microscope is at acquiring 3D data as compared to laser-scanning microscopy.

"The most common method for 3D second harmonic imaging is laser scanning, where a focused beam is moved pixel-by-pixel to form a 2D image. A 3D image is reconstructed from a 'stack' of these 2D images taken from different depths in the tissue," he said. "HOT also collects 2D images as a function of depth, but without the slow process of pixel-by-pixel scanning. This makes it possible to collect 3D images in a fraction of the time that is typically required by laser-scanning."

Unlike typical laser-scanning microscopes, "an additional benefit of HOT is that its speed makes it much less vulnerable to vibrations and unwanted microscope drift, which leads to sharper images and increased repeatability," said co-author Toussaint, a former professor in the College of Engineering at Illinois and now professor in the School of Engineering at Brown University.

Credit: 
Colorado State University

'A litmus paper for CO2:' Scientists develop paper-based sensors for carbon dioxide

image: This paper-based sensor turns orange as it is exposed to carbon dioxide.

Image: 
Al Meldrum.

A new sensor for detecting carbon dioxide can be manufactured on a simple piece of paper, according to a new study by University of Alberta physicists.

"You can basically think of it as a litmus paper for carbon dioxide," said Al Meldrum, professor in the Department of Physics and co-author on the study led by graduate student Hui Wang. "The work showed that you can make a sensitive carbon dioxide detector out of a simple piece of paper. One could easily imagine mass produced sensors for carbon dioxide or other gases using the same basic methods."

The sensor, which changes colours based on the amount of carbon dioxide in the environment, has many potential applications--from industries that make use of carbon dioxide to smart buildings. And due to its paper base, the sensor is low cost to create and provides a simple template for mass production.

"In smart buildings, carbon dioxide sensors can tell you about the occupancy and where people tend to congregate and spend their time, by detecting the carbon dioxide exhaled when we breathe," explained Meldrum. "This can help to aid in building usage and design. Carbon dioxide sensors currently can be very expensive if they are sensitive enough for many applications, so a cheap and mass-produced alternative could be beneficial for these applications."

While this research demonstrates the sensing ability and performance of the technology, a mass-producible sensor would require further design, optimization, and packaging.

Credit: 
University of Alberta

Study: Reflecting sunlight to cool the planet will cause other global changes

How can the world combat the continued rise in global temperatures? How about shading the Earth from a portion of the sun's heat by injecting the stratosphere with reflective aerosols? After all, volcanoes do essentially the same thing, albeit in short, dramatic bursts: When a Vesuvius erupts, it blasts fine ash into the atmosphere, where the particles can linger as a kind of cloud cover, reflecting solar radiation back into space and temporarily cooling the planet.

Some researchers are exploring proposals to engineer similar effects, for example by launching reflective aerosols into the stratosphere -- via planes, balloons, and even blimps -- in order to block the sun's heat and counteract global warming. But such solar geoengineering schemes, as they are known, could have other long-lasting effects on the climate.

Now scientists at MIT have found that solar geoengineering would significantly change extratropical storm tracks -- the zones in the middle and high latitudes where storms form year-round and are steered by the jet stream across the oceans and land. Extratropical storm tracks give rise to extratropical cyclones, and not their tropical cousins, hurricanes. The strength of extratropical storm tracks determines the severity and frequency of storms such as nor'easters in the United States.

The team considered an idealized scenario in which solar radiation was reflected enough to offset the warming that would occur if carbon dioxide were to quadruple in concentration. In a number of global climate models under this scenario, the strength of storm tracks in both the northern and southern hemispheres weakened significantly in response.

Weakened storm tracks would mean less powerful winter storms, but the team cautions that weaker storm tracks also lead to stagnant conditions, particularly in summer, and less wind to clear away air pollution. Changes in winds could also affect the circulation of ocean waters and, in turn, the stability of ice sheets.

"About half the world's population lives in the extratropical regions where storm tracks dominate weather," says Charles Gertler, a graduate student in MIT's Department of Earth, Atmospheric and Planetary Sciences (EAPS). "Our results show that solar geoengineering will not simply reverse climate change. Instead, it has the potential itself to induce novel changes in climate."

Gertler and his colleagues have published their results this week in the journal Geophysical Research Letters. Co-authors include EAPS Professor Paul O'Gorman, along with Ben Kravitz of Indiana University, John Moore of Beijing Normal University, Steven Phipps of the University of Tasmania, and Shingo Watanabe of the Japan Agency for Marine-Earth Science and Technology.

A not-so-sunny picture

Scientists have previously modeled what Earth's climate might look like if solar geoengineering scenarios were to play out on a global scale, with mixed results. On the one hand, spraying aerosols into the stratosphere would reduce incoming solar heat and, to a degree, counteract the warming caused by carbon dioxide emissions. On the other hand, such cooling of the planet would not prevent other greenhouse gas-induced effects such as regional reductions in rainfall and ocean acidification.

There have also been signs that intentionally reducing solar radiation would shrink the temperature difference between the Earth's equator and poles or, in climate parlance, weaken the planet's meridional temperature gradient, cooling the equator while the poles continue to warm. This last consequence was especially intriguing to Gertler and O'Gorman.

"Storm tracks feed off of meridional temperature gradients, and storm tracks are interesting because they help us to understand weather extremes," Gertler says. "So we were interested in how geoengineering affects storm tracks."

The team looked at how extratropical storm tracks might change under a scenario of solar geoengineering known to climate scientists as experiment G1 of the Geoengineering Model Intercomparison Project (GeoMIP), a project that provides various geoengineering scenarios for scientists to run on climate models to assess their various climate effects.

The G1 experiment assumes an idealized scenario in which a solar geoengineering scheme blocks enough solar radiation to counterbalance the warming that would occur if carbon dioxide concentrations were to quadruple.

The researchers used results from various climate models run forward in time under the conditions of the G1 experiment. They also used results from a more sophisticated geoengineering scenario with a doubling of carbon dioxide concentrations and aerosols injected into the stratosphere at more than one latitude. In each model they recorded the day-to-day change in air pressure at sea level at various locations along the storm tracks. These changes reflect the passage of storms and measure a storm track's energy.

"If we look at the variance in sea level pressure, we have a sense of how often and how strongly cyclones pass over each area," Gertler explains. "We then average the variance across the whole extratropical region, to get an average value of storm track strength for the northern and southern hemispheres."

An imperfect counterbalance

Their results, across climate models, showed that solar geoengineering would weaken storm tracks in both Northern and Southern hemispheres. Depending on the scenario they considered, the storm track in the Northern Hemisphere would be 5 to 17 percent weaker than it is today.

"A weakened storm track, in both hemispheres, would mean weaker winter storms but also lead to more stagnant weather, which could affect heat waves," Gertler says. "Across all seasons, this could affect ventilation of air pollution. It also may contribute to a weakening of the hydrological cycle, with regional reductions in rainfall. These are not good changes, compared to a baseline climate that we are used to."

The researchers were curious to see how the same storm tracks would respond to global warming alone, without the addition of solar geoengineering, so they ran the climate models again under several warming-only scenarios. Surprisingly, they found that, in the northern hemisphere, global warming would also weaken storm tracks, by the same magnitude as with the addition of solar geoengineering. This suggests solar geoengineering, and efforts to cool the Earth by reducing incoming heat, would not do much to alter global warming's effects, at least on storm tracks -- a puzzling outcome that the researchers are unsure how to explain.

In the Southern Hemisphere, there is a slightly different story. They found that global warming alone would strengthen storm tracks there, whereas the addition of solar geoengineering would prevent this strengthening, and even further, would weaken the storm tracks there.

"In the Southern Hemisphere, winds drive ocean circulation, which in turn could affect uptake of carbon dioxide, and the stability of the Antarctic ice sheet," O'Gorman adds. "So how storm tracks change over the Southern Hemisphere is quite important."

The team also observed that the weakening of storm tracks was strongly correlated with changes in temperature and humidity. Specifically, the climate models showed that in response to reduced incoming solar radiation, the equator cooled significantly as the poles continued to warm. This reduced temperature gradient appears to be sufficient to explain the weakening storm tracks -- a result that the group is the first to demonstrate.

"This work highlights that solar geoengineering is not reversing climate change, but is substituting one unprecedented climate state for another," Gertler says. "Reflecting sunlight isn't a perfect counterbalance to the greenhouse effect."

Adds O'Gorman: "There are multiple reasons to avoid doing this, and instead to favor reducing emissions of CO2 and other greenhouse gases."

Credit: 
Massachusetts Institute of Technology

Aluminum oxide crystal tested as a UV radiation sensor

image: Intensity and temperature. Detecting and measuring UV exposure in different environments was the aim of the research.

Image: 
Neilo Marcos Trindade and his team

Exposure to ultraviolet (UV) radiation is a risk factor for skin cancer and other diseases. The risk increases when sunlight, the natural source of UV radiation, is combined with artificial sources such as UV lamps used for medical therapy, among others. Detecting and measuring UV exposure in different environments was the aim of a research project conducted in Brazil at the São Paulo Federal Institute of Education, Science and Technology (IFSP) in collaboration with the University of São Paulo Physics Institute (IF-USP).

The researchers investigated the sensitivity to UV radiation of aluminum oxide doped with carbon and magnesium (Al2O3:C,Mg), as reported in the article "Thermoluminescence of UV-irradiated α-Al2O3:C,Mg" published in Journal of Luminescence.

"Carbon-doped aluminum oxide [Al2O3:C] was already well known for its acute sensitivity to various kinds of radiation, such as X-rays and beta and gamma rays. It is used for personal and environmental UV dosimetry. What we discovered was that the material also responds to UV when it is doped with magnesium as well as carbon," said Professor Neilo Marcos Trindade, first author of the article.

The response in question is thermoluminescence, light emission by a material when heated after being exposed to radiation. "Magnesium doping promotes a large number of defects in the crystal, and this makes the material respond even better to ionizing radiation and to nonionizing radiation like UV," Trindade said.

The investigation was conducted by Trindade and two undergraduates, Maicon Gois Magalhães and Matheus Cavalcanti dos Santos Nunes, with the collaboration of Elisabeth Mateus Yoshimura, a professor at IF-USP, and Luiz Gustavo Jacobsohn, a professor at Clemson University in the United States. It was supported by São Paulo Research Foundation - FAPESP via a Regular Research Grant awarded to Trindade, a Scientific Initiation Scholarship awarded to Magalhães, and a Scientific Initiation Scholarship awarded to Nunes.

Other findings

In addition to the main discovery, which enables these crystals to be used to detect UV, Trindade and his students obtained two other important findings. The first was that the material's response to UV radiation is analogous to its response to beta radiation, meaning that many measurements of beta radiation performed to date may have been affected by UV interference.

The second was that the two responses vary according to different patterns, enabling the UV interference problem to be solved in future devices. The material responds to beta radiation in a linear manner: its luminescence increases in direct proportion to its exposure to ionizing radiation. In the case of UV radiation, the luminescence of the material does not vary linearly. "There is a saturation point after which the luminescence ceases to intensify even as exposure continues," Trindade said.
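A toy numerical illustration of the two response patterns described above; the saturating functional form and all parameter values are assumptions chosen for illustration, since the paper reports the qualitative distinction rather than these numbers.

```python
import numpy as np

dose = np.linspace(0, 10, 6)  # arbitrary exposure units

# Beta-like response: thermoluminescence grows linearly with dose.
tl_beta = 2.0 * dose

# UV-like response: luminescence saturates; a simple saturating model for illustration.
tl_uv = 8.0 * (1 - np.exp(-dose / 2.0))

for d, b, u in zip(dose, tl_beta, tl_uv):
    print(f"dose={d:4.1f}  beta-like TL={b:5.1f}  UV-like TL={u:5.2f}")
```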

Credit: 
Fundação de Amparo à Pesquisa do Estado de São Paulo

Behaviors and traits that influence social status, according to evolutionary psychologists

AUSTIN, Texas - Beyond fame and fortune, certain traits and behaviors may have pervasive influence in climbing the social ladder, according to a study by evolutionary psychologists at The University of Texas at Austin.

The study of 2,751 individuals in 14 nations identified universally valued qualities, such as intelligence and honesty, that can heighten a person's social status. It also identified universal double-standards that socially reward men for certain sexual behaviors but punish women. The findings were published in the Journal of Personality and Social Psychology and fill an important gap in understanding the psychology behind who rises and falls within human societies.

"Humans live in a social world in which relative rank matters for nearly everything -- your access to resources, your ability to attract mates, and even how long you live," said UT Austin evolutionary psychologist David Buss, one of the study's lead authors. "From an evolutionary perspective, reproductively relevant resources flow to those high in status and trickle slowly, if at all, to those lower on the social totem pole."

The researchers compared people's impressions of 240 factors -- including acts, characteristics and events -- to determine what increased and impaired a person's esteem in the eyes of others. They found that certain qualities such as being honest, hard-working, kind, intelligent, having a wide range of knowledge, making sacrifices for others, and having a good sense of humor increased a person's social value.

"From the Gypsies in Romania to the native islanders of Guam, people displaying intelligence, bravery and leadership rise in rank in the eyes of their peers," said UT Austin psychology graduate student Patrick Durkee, who led the study with Buss. "But possessing qualities that inflict costs on others will cause your status to plummet, whether you live in Russia or Eritrea."

Being known as a thief, as dirty or unclean, as mean or nasty, acquiring a sexually transmitted disease, and bringing shame on one's family decreased a person's social status or value. These status-harming actions can also lead to a person being ostracized from the group -- "an action that would have meant near-certain death in ancestral environments," the researchers said.

"Although this study was conducted prior to the current pandemic, it's interesting that being a disease vector is universally detrimental to a person's status," Buss said. "Socially transmitted diseases are evolutionarily ancient challenges to human survival, so humans have psychological adaptations to avoid them. Lowering a person's social status is an evolutionarily ancient method of social distancing from disease vectors."

In considering universal gender differences in status criteria, the study found that the ability and willingness to protect others -- demonstrating bravery and physical formidability and taking risks to protect allies -- was more status-enhancing for men than for women. On the other hand, women were valued more socially for qualities relating to domestic skills and attractiveness.

Sexual strategies and mating practices also influenced status and showed strong gender differences. Being sexually promiscuous decreased the status of both genders, but hurt women's status considerably more even in the most sexually egalitarian cultures in the study, the researchers said. Conversely, attaining a committed long-term mate increased the status of both genders, but somewhat more so for women. And fidelity increased the status of both women and men substantially and equally.

The study also identified cultural differences in qualities influencing social status. For example, practicing witchcraft damages a person's status in Zimbabwe and Eritrea, but has virtually no impact on status in Estonia, Russia or the United States. And although valued universally, a good sense of humor contributes to a large boost in status in Poland; a moderate boost in China, South Korea and Japan; but only a slight boost in Eritrea.

Credit: 
University of Texas at Austin

Study reveals continuous pathway to building blocks of life

image: Generation of target molecules relevant to the origins of life in a complex chemical network driven by radiation.

Image: 
Ruiqin Yi, ELSI

Researchers have long sought to understand the origins of life on Earth. A new study conducted by scientists at the Institute for Advanced Study, the Earth-Life Science Institute (ELSI), and the University of New South Wales, among other participating institutions, marks an important step forward in the effort to understand the chemical origins of life. The findings of this study demonstrate how "continuous reaction networks" are capable of producing RNA precursors and possibly ultimately RNA itself--a critical bridge to life. A link to the paper, published by Proceedings of the National Academy of Sciences, can be found here: https://www.pnas.org/content/early/2020/06/01/1922139117

While many of the mechanisms that propagate life are well understood, the transition from a prebiotic Earth to the era of biology remains shrouded in mystery. Previous experiments have demonstrated that simple organic compounds can be produced from the reactions of chemicals understood to exist in the primitive Earth environment. However, many of these experiments relied on coordinated experimenter interventions. This study goes further by employing a model that is minimally manipulated to most accurately simulate a natural environment.

To conduct this work, the team exposed a mixture of very simple small molecules--common table salt, ammonia, phosphate, and hydrogen cyanide--to a high energy gamma radiation source. These conditions simulate radioactive environments made possible by naturally occurring radioactive minerals, which were likely much more prevalent on early Earth. The team also allowed the reactions to intermittently dry out, simulating evaporation in shallow puddles and beaches. These experiments returned a variety of compounds that may have been important for the origins of life, including precursors to amino acids and other small compounds known to be useful for producing RNA.

The authors use the term "continuous reaction network" to describe an environment in which intermediates are not purified, side products are not removed, and no new reagents are added after the initial starting materials. In other words, the synthesis of molecules occurs in a dynamic environment in which widely varied compounds are continuously being formed and destroyed, and these products react with each other to form new compounds.

Jim Cleaves, frequent IAS visitor and ELSI professor, stated, "These types of continuous reaction networks may be quite common in chemistry, but we are only now beginning to build the tools to detect, measure, and understand them. There is a lot of exciting work ahead."

Future work will focus on mapping out reaction pathways for other chemical substances and testing whether further cycles of radiolysis followed by dry-down can generate higher order chemical products. The team believes these models can help to determine what primitive planetary environments are most conducive to the formation of complex molecules. These studies could in turn help other scientists identify the best places to look for life beyond Earth.

Credit: 
Institute for Advanced Study

Army Research Laboratory supporting Texas A&M research on armor performance

The United States Army Research Laboratory is lending support to a Texas A&M University research project investigating potential improvement of ballistic performance of armor materials.

The project, led by Dr. Justin Wilkerson, assistant professor and James J. Cain '51 Faculty Fellow II in the J. Mike Walker '66 Department of Mechanical Engineering, focuses on identifying what damaging effects could be caused by particular flaws -- known as vacancies -- in the atomic structure of aluminum.

Wilkerson's research paper on the topic, "Evolving structure-property relationships in metals with nonequilibrium concentrations of vacancies," was recently featured on the cover of the Journal of Applied Physics. The paper was co-authored with Wilkerson's former postdoctoral advisee Dr. Sara Adibi.

Although similar research has been conducted by Dr. Celia Reina and coworkers from the California Institute of Technology, Wilkerson's study delves deeper into the subject by calculating the effect of changes due to vacancies over time on the material's mechanical properties.

"Mechanical behavior of these materials could not be understood via the Lattice Kinetic Monte Carlo simulations alone, which was what had been done prior to this," said Wilkerson. "To take the next step forward, we made use of supercomputing facilities to conduct a suite of large-scale molecular dynamics simulations."

Atoms are arranged in a highly ordered pattern referred to as a crystal lattice, and if an atom is missing from a perfect lattice structure, this defect is called a vacancy. At high temperatures, the vacancies can come together forming vacancy clusters.

Wilkerson said a large concentration of vacancies in a material may be generated under shock loading, which could impact ballistic performance metrics, including spall strength.

"The effect of vacancies on the mechanical behavior of materials on short timescales, such as microseconds, has remained largely unexplored," said Wilkerson. "Even on such short timescales, we find that vacancies may also play a significant role in the high-temperature failure of metals subject to very high tensile pressures."

A prospective idea from the findings is that it may be possible to use this knowledge to improve the ballistic performance of next-generation armors for the U.S. Army.

"Now that we better understand the importance of this mechanism to ballistic performance the next step is to develop material processing strategies that slow vacancy production rates in shocked materials," said Wilkerson.

Credit: 
Texas A&M University

Atmospheric scientists identify cleanest air on Earth in first-of-its-kind study

image: Aerosol filter samplers probe the air over the Southern Ocean on the Australian Marine National Facility's R/V Investigator

Image: 
Kathryn Moore/Colorado State University

Colorado State University Distinguished Professor Sonia Kreidenweis and her research group identified an atmospheric region unchanged by human-related activities in the first study to measure bioaerosol composition of the Southern Ocean south of 40 degrees south latitude.

Kreidenweis' group, based in the Department of Atmospheric Science, found the boundary layer air that feeds the lower clouds over the Southern Ocean to be pristine - free from particles, called aerosols, produced by anthropogenic activities or transported from distant lands. Their findings are published in Proceedings of the National Academy of Sciences.

Weather and climate are complex processes connecting each part of the world to every other region, and with climate changing rapidly as a result of human activity, it's difficult to find any area or process on Earth untouched by people. Kreidenweis and her team suspected the air directly over the remote Southern Ocean that encircles Antarctica would be least affected by humans and dust from continents. They set out to discover what was in the air and where it came from.

"We were able to use the bacteria in the air over the Southern Ocean as a diagnostic tool to infer key properties of the lower atmosphere," said research scientist Thomas Hill, coauthor on the study. "For example, that the aerosols controlling the properties of SO clouds are strongly linked to ocean biological processes, and that Antarctica appears to be isolated from southward dispersal of microorganisms and nutrient deposition from southern continents. Overall, it suggests that the SO is one of very few places on Earth that has been minimally affected by anthropogenic activities."

Samples were collected during the NSF-funded SOCRATES field campaign, led by research scientist and coauthor Paul DeMott. Graduate student Kathryn Moore sampled the air in the marine boundary layer, the lower part of the atmosphere that has direct contact with the ocean, aboard the Research Vessel Investigator as it steamed south from Tasmania to the Antarctic ice edge. Research scientist and first author Jun Uetake examined the composition of airborne microbes captured from the ship. The atmosphere is full of these microorganisms dispersed over hundreds to thousands of kilometers by wind.

Using DNA sequencing, source tracking and wind back trajectories, Uetake determined the microbes' origins were marine, sourced from the ocean. Bacterial composition also was differentiated into broad latitudinal zones, suggesting aerosols from distant land masses and human activities, such as pollution or soil emissions driven by land use change, were not traveling south into Antarctic air.

These results counter all other studies from oceans in the subtropics and northern hemisphere, which found that most microbes came from upwind continents. Plants and soil are strong sources of particles that trigger freezing of supercooled cloud droplets, known as ice-nucleating particles. This process reduces cloud reflectivity and enhances precipitation, increasing the amount of sunlight reaching the surface and altering Earth's radiative balance.

Over the Southern Ocean, sea spray emissions dominate the material available for forming liquid cloud droplets. Ice-nucleating particle concentrations, rare in seawater, are the lowest recorded anywhere on the planet.

The air over the Southern Ocean was so clean that there was very little DNA to work with. Hill attributed the quality of their results to Uetake and Moore's clean lab process.

"Jun and Kathryn, at every stage, treated the samples as precious items, taking exceptional care and using the cleanest technique to prevent contamination from bacterial DNA in the lab and reagents," Hill said.

Credit: 
Colorado State University

Genetic cause of difference in sexual development uncovered

Researchers at the Francis Crick Institute, the Institut Pasteur and their clinical collaborators have identified a cause of testicular tissue developing in people with female chromosomes.

"Differences in sexual development" (DSD) are genetic conditions in which there is a mismatch between the chromosomal sex, XY or XX, which are typically of males and females, and some aspect of their anatomy, for example, an XY individual born as female or XX individual born as male. In many of the cases the genetic cause is unknown. The prevalence is 1 in 2,500-4,000 individuals.

In this paper, published in Proceedings of the National Academy of Sciences USA, researchers looked into a complex condition where people with two X chromosomes, which is typical of females, have testicular tissue instead of ovaries, or gonads with a mixture of both.

They focused on a subset of people with this rare condition, who also do not have the Y chromosome gene (SRY) which is usually needed for the creation of male reproductive organs.

By analysing the DNA of 78 children with this condition, they found that 9% had mutations affecting a specific part of another gene called Wilms' Tumor 1 (WT1), which is located not on a sex chromosome but on one of the other 22 chromosome pairs.

Using human cell cultures, they found that these changes interfere with the delicate balance between the promotion of either the formation of male testis or female ovaries. This imbalance suppresses a protein (β-CATENIN) which is important to the development of ovaries, resulting in the balance shifting to favour the development of testicular tissue.

"There are still many unknowns when it comes to the development of sexual organs. And for some patients with differences in their sexual development, and their families, the condition can be confusing and upsetting".

"Understanding the underlying cause of sexual differences could help make it easier to identify hard to diagnose conditions and help families and doctors make more informed treatment and management decisions," says Anu Bashamboo, group leader of the Unit of Human Developmental Genetics, at the Institut Pasteur, Paris.

To test their findings, the researchers experimentally altered the WT1 gene in mice to mimic one of the human mutations. When this change was introduced into mice with two X chromosomes, which normally develop as females, the mice showed signs of male development.

"The WT1 gene was well known to be important for testis development. It's really interesting that we've found it also plays a crucial role in development of ovaries," says Robin Lovell-Badge, group leader of the Stem Cell Biology and Developmental Genetics Laboratory at the Crick.

"It appears to have a dual function, creating one protein with parts that contribute to male and others to female development," continues Robin. "Small changes in the balance of these two functions appear to alter the way sexual organs develop, even if our chromosomes are male or female."

"It is often difficult to understand how these differences develop in humans because these changes take place early on in development. By the time they are diagnosed, often well after birth, it is too late to study them. Recreating the mutation in mice gives us the opportunity to prove that it causes these differences and to learn why and how they occur in the presence of the mutation," says lead author Nitzan Gonen, group leader at The Mina and Everard Goodman Faculty of Life Sciences, Bar-Ilan University, Israel.

The researchers will continue to study the role of the WT1 gene in sexual development.

Credit: 
The Francis Crick Institute