Earth

Revising climate models with new aerosol field data

image: Instrumentation inlets and the view from the top of the tower at the Manitou Experimental Forest Observatory near Woodland Park, Colorado

Image: 
Delphine Farmer, CSU

Smoke from the many wildfires burning in the West has made air quality hazardous for millions of people in the United States. And it is the very tiniest of the aerosol particles in that air that make it particularly harmful to human health. But for decades, we haven't known how long these particles actually stay aloft.

New research by Colorado State University scientists is giving us a much better understanding of this process, which can help not only in air quality forecasting, but also in global climate modeling.

Aerosol particles, whether from wildfire smoke or car exhaust, play a large role in how much heat is absorbed or deflected by the atmosphere. However, we haven't entirely understood how quickly these tiny particles are pulled out of the air - especially in the absence of moisture. This has added substantial uncertainty to already-complex climate models.

Delphine Farmer, an associate professor in the Department of Chemistry in the CSU College of Natural Sciences, knew it was time to do better.

Farmer and her colleagues recently announced that they have been able to measure, in real-world environments - from forests to grasslands - the rate at which these important particles actually leave the atmosphere. Their findings first appeared online the week of October 5 in the Proceedings of the National Academy of Sciences.

"This work really highlights the importance and power of field measurements," Farmer said. "We can directly use observations from field studies to narrow the uncertainties in climate models, and to improve our understanding of climate-relevant processes."

Zeroing in on uncertainty

Aerosol particles fall out of the air in two main ways. The first and most common is known as "wet" deposition, when moisture plucks them out of the air, whether through cloud formation, snow, or rainfall. Scientists have had a fairly good handle on this force, which accounts for some 80% of the aerosol effect in the atmosphere.

But the other force, "dry" deposition, has been much more mysterious, although it plays a not-insignificant role globally. Because aerosols are so small (measured in nanometers and microns) they don't simply come tumbling down due to gravity. They can waft along in currents of air for a long time. Just how long, however, has been the question.

"When a particle is emitted into the atmosphere, the amount of time it hangs out in the air depends on these removal processes," Farmer said. This is crucial, she explained, because "the longer a particle hangs out in the atmosphere, the more opportunity it has to travel farther, or make clouds, or impact human health. So getting the removal process right is essential for predicting particle concentrations - and their effects."

Early results from theoretical calculations in the 1970s and '80s, and cruder measurements completed over smooth surfaces around 2000, have been fed into climate models for decades.

This is where Farmer, who has made a research career tracking atmospheric chemistry with high-resolution instruments, saw an opportunity for improvement.

Improved climate models - and human health

Farmer and her colleagues knew that, of course, the land - and even ocean - surface isn't all smooth. So they wanted to see what was actually happening to these particles in the real world.

In particular, they looked at the forces beyond gravity that were driving these aerosols' journeys. "For the small, climate- and health-relevant particles, turbulence in the atmosphere brings particles down to surfaces and allows those particles to get stuck," Farmer said.

And because of this, these small particles don't have a straight path to a surface - especially in a complex surface environment like a forest. Farmer explained it as each microscopic aerosol particle running its own gauntlet, "kind of like American Ninja Warrior, where the particle has to avoid hitting different obstacles in order to stay in the atmosphere. And each gauntlet is particularly challenging for different sizes of particles."

To see how these variously sized particles were faring in this obstacle course, the researchers deployed an ultra-high sensitivity aerosol spectrometer, which uses a laser to count particles. They set up measuring stations in a pine forest in the Manitou Experimental Forest in Colorado, and in grasslands in the Southern Great Plains in Oklahoma, to capture real-world data on these particles as they eventually landed.

"We measured how fast different particles run this gauntlet," Farmer explained. "Then we used those measurements to figure out which part of the gauntlet slowed different particles down."

They found a much narrower range of lifetimes for these important particles than earlier modeling had suggested. In fact, the old predictions assumed a faster removal of the very small particles (those smaller than 100 nanometers) and a slower removal of the larger particles (those larger than 400 nanometers).
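To make the connection between removal rate and lifetime concrete, here is a minimal sketch, assuming a well-mixed boundary layer and illustrative deposition velocities; the layer depth, the velocity values and the size classes are assumptions for demonstration, not numbers from the CSU study.

```python
# Illustrative only: converting an assumed dry-deposition velocity into an
# atmospheric residence time for particles in a well-mixed boundary layer.
# tau = H / v_d, where H is the layer depth and v_d the deposition velocity.

boundary_layer_height_m = 1000.0  # assumed well-mixed layer depth

# Assumed deposition velocities (m/s) for three hypothetical size classes.
deposition_velocity_m_per_s = {
    "ultrafine (<100 nm)": 1e-4,
    "accumulation (100-400 nm)": 3e-5,
    "coarse (>400 nm)": 1e-3,
}

for size_class, v_d in deposition_velocity_m_per_s.items():
    tau_days = boundary_layer_height_m / v_d / 86400.0  # seconds -> days
    print(f"{size_class}: lifetime against dry deposition ~ {tau_days:.0f} days")
```

Under these assumptions, a modest change in deposition velocity shifts a particle's lifetime by days to weeks, which is why constraining the removal rate matters so much for predicted concentrations.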

"This means that we may have been underestimating the aerosol indirect effect in models," Farmer said. "The good news is that we have been overestimating the uncertainty - we now know particle loss rates better."

The new findings can be applied to all sorts of uneven surfaces, from forests to grasslands to agricultural areas, even to choppy seas.

More aerosol effects over land

When integrating their findings into models of the aerosol effects globally, Farmer and her coauthors predict there will be more aerosol effect than previously assumed over certain land areas, including parts of North America, Europe, Asia, South America, Australia, and sub-Saharan Africa - and a lowering of the aerosol effect over oceans.

"It turns out that the particles' race to settle on a surface is pretty important for predicting radiative effects" and what the future climate might look like, Farmer said.

Their new data also suggest we've been underestimating the amount of the aerosols in the air that are most harmful to human health, those smaller than 2.5 micrometers (also known as PM2.5), which are, for example, the most hazardous component of wildfire smoke.

"Our revised [number] increases surface PM2.5 concentrations by 11% globally and 6.5% over land," Farmer and her collaborators wrote in their new paper. Which is important to know because "exposure to PM2.5 is linked to respiratory and cardiovascular diseases."

Coauthors on the study included Jeffery Pierce, an associate professor in the Department of Atmospheric Sciences in the Walter Scott, Jr. College of Engineering, and Kelsey Bilsback, a postdoctoral researcher there; as well as doctoral researchers in the Department of Chemistry Ethan Emerson, Anna Hodshire, and Holly DeBolt; and Gavin McMeeking from the Handix Scientific company in Boulder.

This important work also demonstrates just how advanced - and impactful - field measurement technologies are becoming.

"To me, the most exciting aspect of this work is that we are able to take real-world measurements over a forest and a grassland site and use them to directly improve our understanding of the climate system," Farmer said.

Credit: 
Colorado State University

Expanded newborn screening could save premature infants' lives

Expanding routine newborn screening to include a metabolic vulnerability profile could lead to earlier detection of life-threatening complications in babies born preterm, according to a study by UC San Francisco researchers. The new method, which was developed at UCSF, offers valuable and time-sensitive insights into which infants are at greatest risk during their most vulnerable time, immediately after birth.

The study, published in Nature Pediatric Research by scientists at the UCSF California Preterm Birth Initiative (PTBI-CA), assessed the records of 9,639 preterm infants who experienced at least one complication or mortality.

Using the results of standard newborn profiles and blood tests, they identified a combination of six newborn characteristics and 19 metabolites that, together, created a vulnerability profile that reliably identified preterm babies at substantially increased risk for death and severe illness.

"Our results point to a number of potential biological pathways that may play a key role in the development of negative outcomes in babies born preterm," said the study's lead author Scott Oltman, MS, epidemiologist, UCSF PTBI-CA. "If we can better understand these pathways, new treatments or preventative measures may be possible."

Metabolites are molecules such as glucose or thyroid stimulating hormone (TSH) that are naturally produced by our cells as we break down food or medications. In a newborn, these molecules may originate from the mother's bloodstream or be generated by the infant and can be used to assess whether the body is functioning normally.

Of particular note are the investigative team's findings that Black babies were 35 percent more likely than white babies to die or experience major complications, including serious breathing and digestive conditions known as respiratory distress syndrome and necrotizing enterocolitis.

"We are particularly excited about the potential for these metabolic models to help us address critical inequities in outcomes in Black infants," said senior author Laura Jelliffe-Pawlowski, PhD, MS, professor of Epidemiology and Biostatistics in the UCSF School of Medicine, and director of Discovery and Precision Health with PTBI-CA. "Going forward, we should be able to create personalized care plans for each baby born too early, which will help us reduce race/ethnic disparities in outcomes."

Advances in science have enabled even the most fragile preterm babies to survive in greater numbers and at younger gestational ages. In the United States, approximately 1 in 10 live infants is delivered preterm. However, preterm birth and related comorbidities are the leading cause of death for U.S. children under five years of age, with neonatal (newborn) deaths accounting for 46 percent of mortality in this age group.

Previous models attempting to predict complications after preterm birth have relied only upon the infant's gestational age, birthweight and other clinical characteristics. This study expanded the characteristics to include maternal factors, such as race, maternal age and education. It also identified 19 molecules, such as TSH and glycine, that contributed to prediction. These metabolites are routinely tested in newborn screens, but are not assessed as a composite. They play important roles in many biological pathways, including digestion, respiration and temperature regulation.
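As a rough illustration of what such a composite model looks like in practice, here is a minimal sketch, assuming a logistic-regression formulation and entirely synthetic data; the feature names, values and model choice are placeholders, not the UCSF team's actual specification.

```python
# Hedged sketch: combine newborn characteristics and routinely screened
# metabolites into one vulnerability score. All data below are random
# placeholders; the real study used records of 9,639 preterm infants.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 1000
X = np.column_stack([
    rng.normal(30, 3, n),      # gestational age (weeks), hypothetical
    rng.normal(1400, 300, n),  # birthweight (g), hypothetical
    rng.normal(0, 1, n),       # standardized TSH level, hypothetical
    rng.normal(0, 1, n),       # standardized glycine level, hypothetical
])
y = rng.integers(0, 2, n)      # placeholder outcome: death or major complication

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
risk_score = model.predict_proba(X_te)[:, 1]  # per-infant vulnerability score
print("Held-out AUC (random data, so ~0.5):", round(roc_auc_score(y_te, risk_score), 2))
```

The point of the sketch is only the structure: clinical characteristics and metabolite levels enter one model, and the model returns a single risk score per infant.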

"Some of the pathways we have identified may offer inroads for intervention and could eventually lead to fewer deaths and lessened short- and long-term disability in babies born too early," Jelliffe-Pawlowski said.

This study is paving the way for continued research on how these models could help preterm newborn babies. The next phase of this study is funded by the National Institutes of Health and begins this fall, running through 2025. It will enroll 100 very preterm babies in California and Iowa to test how well the newly identified metabolic models work in neonatal intensive care unit (NICU) settings. As part of this work, PTBI-CA researchers will collaborate with the Benioff Center for Microbiome Medicine to look at the microbiomes of babies in the new study to identify additional drivers of short- and long-term outcomes.

Credit: 
University of California - San Francisco

UM researchers help study largest estimated Greenland ice loss

MISSOULA - University of Montana researchers have contributed to a study forecasting significant ice loss in Greenland. According to the study just published in the journal Nature, Greenland will lose more ice this century than in the past 12,000 years if greenhouse gas emissions are not curbed.

Through a multiorganizational collaboration, the study brought together climate modelers, ice core scientists, remote sensing experts and paleoclimate researchers.

The team used ice sheet modeling to reconstruct the ancient climate, checking the accuracy of the model against real-world measurements taken by satellites, aerial surveys and field work. Focusing on the southwestern sector of the Greenland Ice Sheet, they traced the ice sheet from the start of the Holocene epoch 12,000 years ago and projected its future to 2100.

Jesse V. Johnson, a UM professor of computer science, and Jacob Downs, now a UM postdoctoral researcher in applied mathematics, joined the study.

While other researchers focused on how the ice sheet has changed over time, Johnson and Downs reconstructed past climate and temperatures by studying concentrations of gases trapped in the ice. By integrating data on ice sheet retreat and past temperatures into a numerical model of ice dynamics, they estimated how snowfall has fluctuated with temperature changes over the past 12,000 years and how that could affect the study's modeling results.
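The article does not spell out the snowfall-temperature relationship the researchers derived, but a common simplification when forcing an ice-sheet model is to scale accumulation with temperature. The sketch below illustrates that general idea only; the baseline accumulation and the sensitivity value are assumptions, not the UM team's parameters.

```python
# Toy illustration: scale snowfall (accumulation) with a reconstructed
# temperature anomaly when forcing an ice-dynamics model. The 5%-per-degree
# sensitivity and 0.30 m/yr baseline are assumed values for demonstration.
reference_accumulation_m_per_yr = 0.30
sensitivity_per_degC = 0.05

for dT in [-2.0, -1.0, 0.0, 1.0, 2.0]:  # hypothetical temperature anomalies (deg C)
    accumulation = reference_accumulation_m_per_yr * (1.0 + sensitivity_per_degC) ** dT
    print(f"dT = {dT:+.1f} C -> accumulation ~ {accumulation:.3f} m/yr")
```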

Working with such a large, diverse team was eye-opening to Johnson and Downs and helpful for contextualizing their research.

"There is so much incredible science that we were totally ignorant of," Johnson said. "We learned how the climate of the past can be found by measuring the waxes in leaves trapped in the mud under lakes and how micro-fossils found in the ocean can tell you the water's temperature. We really didn't know much about this area of paleo-climate proxies going into this project, but came away fascinated with what they can tell us about Earth's climate history."

The results of the Greenland Ice Sheet study, however, were sobering.

"Basically, we've altered our planet so much that the rates of ice sheet melt this century are on pace to be greater than anything we've seen under natural variability of the ice sheet over the past 12,000 years," said Jason Briner, professor of geology at the University at Buffalo and study lead.

Johnson said the results of the study can't be emphasized enough, and that comparing today's enormous potential loss with losses over the past 12,000 years is important for putting it into perspective.

"Such comparisons are critical in understanding what we are living through now," Johnson said. "These changes are much greater than what has been experienced in more than twice the recorded history of homo sapiens. We often wonder what our ancestors would have done when faced with similar circumstances. In this case, the answer is that we don't know. Our ancestors never experienced anything like this."

Briner said the answer lies in the world curbing greenhouse gas emissions. Under the high-emissions scenario known as RCP8.5, defined by the Intergovernmental Panel on Climate Change, the Greenland Ice Sheet is currently on track to lose ice at four times the highest rate of the past 12,000 years.

Under the reduced-emissions RCP2.6 scenario, however, the ice loss would be only slightly greater than anything the ice sheet has experienced over the past 12,000 years.

"Our findings are yet another wake-up call, especially for countries like the U.S.," Briner said. "Americans use more energy per person than any other nation in the world. Our nation has produced more of the CO2 that resides in the atmosphere today than any other country. I think Americans need to go on an energy diet."

Credit: 
The University of Montana

Infrared NASA imagery finds Chan-hom organizing, consolidating

image: On Oct. 6 at 0353 UTC (Oct. 5 at 11:53 p.m. EDT) NASA's Aqua satellite analyzed Tropical Storm Chan-hom using the Atmospheric Infrared Sounder or AIRS instrument. AIRS found coldest cloud top temperatures as cold as or colder than (purple) minus 63 degrees Fahrenheit (minus 53 degrees Celsius) around the consolidating center.

Image: 
NASA JPL/Heidar Thrastarson

NASA's Aqua satellite analyzed the large Tropical Storm Chan-hom as it tracked through the Northwestern Pacific Ocean. Aqua imagery showed the storm was consolidating, indicating a strengthening trend.

One of the ways NASA researches tropical cyclones is using infrared data that provides temperature information. The AIRS instrument aboard NASA's Aqua satellite captured a look at those temperatures in Chan-hom and gave insight into the size of the storm and its rainfall potential.

Cloud top temperatures provide information to forecasters about where the strongest storms are located within a tropical cyclone. The stronger the storms, the higher they extend into the troposphere, and the colder the cloud top temperatures. NASA provides that data to forecasters at the Joint Typhoon Warning Center so they can incorporate it into their forecasting.

On Oct. 6 at 0353 UTC (Oct. 5 at 11:53 p.m. EDT), NASA's Aqua satellite analyzed Tropical Storm Chan-hom using the Atmospheric Infrared Sounder or AIRS instrument. Forecasters at the Joint Typhoon Warning Center (JTWC) in Honolulu, Hawaii noted that animated enhanced infrared satellite imagery showed the system was consolidating. JTWC also noted an improvement in the strong bands of thunderstorms developing and wrapping into the low-level center.

AIRS found coldest cloud top temperatures as cold as or colder than minus 63 degrees Fahrenheit (minus 53 degrees Celsius) around the center of circulation. NASA research has shown that cloud top temperatures that cold indicate strong storms that have the capability to create heavy rain.
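As a small, self-contained illustration of how such a temperature threshold is applied to infrared data, here is a sketch; the pixel values are invented, and real AIRS retrievals are gridded swaths rather than a short list.

```python
# Flag cloud-top brightness temperatures at or below -63 F (-53 C), the level
# the article associates with storms strong enough to produce heavy rain.
# The temperature values are hypothetical stand-ins for AIRS pixels.
import numpy as np

threshold_c = -53.0
cloud_top_temps_c = np.array([-35.2, -48.9, -55.1, -61.7, -44.0, -58.3])

cold_pixels = cloud_top_temps_c <= threshold_c
print(f"{cold_pixels.sum()} of {cloud_top_temps_c.size} pixels at or below {threshold_c} C")
```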

Despite the consolidation, a microwave image captured at 7:36 a.m. EDT (1136 UTC) indicated that the convective banding remains fragmented.

Chan-hom's Status on Oct. 6

By 11 a.m. EDT (1500 UTC) on Oct. 6, Chan-hom was located near latitude 24.5 degrees north and longitude 137.2 degrees east, approximately 552 nautical miles east-southeast of Kadena Air Base, Okinawa, Japan. It was moving to the west and had maximum sustained winds near 55 knots (63 mph/102 kph).

Chan-hom's Forecast

Chan-hom is forecast to continue consolidating and organizing while it tracks generally to the northwest. It is expected to peak at typhoon strength in two days before weakening again.

Credit: 
NASA/Goddard Space Flight Center

UB study finds no apparent link between undocumented immigration and crime

BUFFALO, N.Y. - An analysis by a University at Buffalo-led team using two estimates of undocumented immigration suggests that, on average, this population reduced or had no effect on crime in 154 U.S. metropolitan areas studied, including places such as New York City, Chicago and Las Vegas.

"Even after estimating the undocumented immigrant population in U.S. metropolitan areas in two different ways, we found that undocumented immigrants had no significant effect on violent crime and actually had a significant negative effect on property crime," says Robert Adelman, an associate professor of sociology in UB's College of Arts and Sciences. "This suggests that increases in the undocumented population is accompanied by decreases, on average, in property crime in U.S. metropolitan areas."

The findings, published Oct. 3 in the Journal of Crime and Justice, are consistent with the results of an earlier 2017 study of the relationship between immigration and crime by an Adelman-led team. The earlier study used four decades of data on the documented foreign-born population in the U.S., which also showed, on average, no significant link between immigration patterns and increased crime in a sample of 200 U.S. metropolitan areas.

The new study does not, either explicitly or implicitly, address whether individual immigrants commit crimes, Adelman points out.

"People from all backgrounds commit crimes. However, the bulk of the evidence indicates that, at least at the metropolitan level, in places where there are more immigrants, there also seems to be more economic and cultural vitality."

The findings of Adelman's team about the relationship between immigration and crime are not isolated or unusual conclusions. The majority of similarly themed studies in recent decades also found no significant relationship between immigration and crime. However, those earlier studies by other researchers rarely explored the potential impact of undocumented immigration on crime, largely because until recently there were no reliable estimates of the size of the undocumented population.

The just-published analysis of the relationship between immigration and crime, though, used recent data compiled by the Pew Research Center and the Migration Policy Institute. The cross-sectional data provide a one-time, one-year snapshot that is illuminating, but they represent a different approach from the 40 years of data used in the longitudinal 2017 study. The strength of the current analysis is that it used two different estimates of undocumented immigrants -- and the results are overwhelmingly similar regardless of which estimate is considered.
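The general shape of such an analysis can be sketched as two parallel regressions, one per population estimate; everything below is synthetic and simplified, and the actual study includes control variables and more careful model specification.

```python
# Hedged sketch: regress a metro-level crime rate on an estimated undocumented-
# population share, once for each of two estimates, and compare coefficients.
# Data are synthetic; this is not the UB team's dataset or model.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n_metros = 154

pew_share = rng.uniform(0.01, 0.10, n_metros)            # hypothetical Pew-style estimate
mpi_share = pew_share + rng.normal(0, 0.005, n_metros)   # hypothetical MPI-style estimate
property_crime_rate = rng.normal(2500, 400, n_metros)    # synthetic, per 100,000 residents

for name, share in [("Pew", pew_share), ("MPI", mpi_share)]:
    fit = sm.OLS(property_crime_rate, sm.add_constant(share)).fit()
    print(f"{name}: coefficient = {fit.params[1]:.1f}, p = {fit.pvalues[1]:.3f}")
```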

"Because these data [in the current study] are not longitudinal, it's much more difficult to establish causality than when you have data that lets you look at an effect over time, but the findings are still useful because of the undocumented measures compared in the study," says Adelman. "There is a serious body of high quality scholarship among those who study immigration and crime whose work in general simply does not find this overwhelming negative portrait of immigrants that has been painted in the current political climate."

Adelman, an expert on patterns, trends and processes related to immigration who also serves as chair of UB's Department of Sociology, has long-standing interests in social, racial and economic inequality, with a particular interest in outcomes associated with immigrants and their integration into American society.

"Studying the link between immigration and crime is important because it's one of the factors that is misinterpreted in American society," says Adelman. "The full context of immigration is complex, with competing narratives and scholarship."

And the answers are not always straightforward, according to Adelman.

"Some groups benefit from immigration while others may realize competition from immigration, but all of this has to be placed on the table so that we can debate the issues with facts, data, and the scientific method," says Adelman. He was joined on this study by co-authors Yulin Yang, a postdoctoral fellow at Cornell University; Lesley Williams Reid, University of Alabama professor of criminology; James Bachmeier, Temple University associate professor of sociology; and Mike Maciag, public policy journalist, Maciag Research.

Credit: 
University at Buffalo

Holidays bring severe spike in nut allergies for children

image: A new study examining the link between peanut and tree-nut anaphylaxis in children and holidays found spikes at Halloween and Easter. The study, led by a team of researchers from the Montreal Children's Hospital of the McGill University Health Centre (MCH-MUHC), found that most were previously unknown allergies, calling for increased awareness.

Image: 
Canadian Medical Association Journal

A new study examining the link between peanut and tree-nut anaphylaxis in children and holidays found spikes at Halloween and Easter. The study, led by a team of researchers from the Montreal Children's Hospital of the McGill University Health Centre (MCH-MUHC), found that most were previously unknown allergies, calling for increased awareness.

"Identifying certain times associated with an increased risk of anaphylaxis - a serious and life-threatening allergic reaction - could help to raise community awareness, support and vigilance," says Melanie Leung, a fourth-year medical student at McGill University and Dr. Moshe Ben-Shoshan, a pediatric allergist and immunologist at the MCH-MUHC and scientist at the Research Institute of the MUHC, with coauthors. "This information would identify the best timing for public awareness campaigns to prevent allergic reactions."

Researchers compared anaphylaxis at Halloween, Easter, Christmas, Diwali, Chinese New Year and Eid al-Adha.

Data from across the country

The study included 1,390 patients visiting participating pediatric emergency departments between 2011 and 2020 in four Canadian provinces: British Columbia, Ontario, Quebec, and Newfoundland and Labrador. The median age of patients was 5.4 years and 62 percent were boys.

For peanut-triggered anaphylaxis, there was an 85 percent increase in daily average cases during Halloween and a 60 percent increase during Easter compared with the rest of the year. For anaphylaxis triggered by unknown nuts, there was a 70 percent increase during Halloween and Easter compared with the rest of the year. However, the researchers did not find an increase at Christmas, Diwali, Chinese New Year or Eid al-Adha.
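The "percent increase in daily average cases" comparison can be computed along the following lines; the daily counts and the seven-day holiday window here are assumptions for illustration, not the study's definitions.

```python
# Hedged sketch: daily average anaphylaxis cases during a holiday window
# versus the rest of the year. Counts are synthetic; the window is assumed.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
days = pd.date_range("2019-01-01", "2019-12-31", freq="D")
daily_cases = pd.Series(rng.poisson(0.4, len(days)), index=days)

in_window = (days >= "2019-10-28") & (days <= "2019-11-03")  # assumed Halloween window
holiday_mean = daily_cases[in_window].mean()
baseline_mean = daily_cases[~in_window].mean()
print(f"Change vs. rest of year: {100 * (holiday_mean / baseline_mean - 1):+.0f}%")
```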

"The difference in the anaphylaxis incidence among holidays may have been due to the social setting in which each holiday takes place," says Leung. "At Halloween and Easter, children often receive candies and other treats from people who may be unaware of their allergies. The absence of such an association at Christmas may be because Christmas is a more intimate celebration among family members and close friends, who are more vigilant regarding allergen exposure."

Canadian labelling may also be a factor, as individual packages of one-bite candies and snacks, which are exempt from labelling requirements listing ingredients, are popular at Halloween and Easter.

Education and awareness key to risk reduction

"Our findings suggest that educational tools to increase vigilance regarding the presence of potential allergens are required among children with food allergies, their families and lay people interacting with children who have food allergies. Newer strategies targeting intervals associated with high anaphylaxis risk are required," says Dr. Ben-Shoshan.

Credit: 
McGill University

Researchers identify process for regenerating neurons in the eye and brain

image: David Hyde in his lab.

Image: 
Matt Cashore/University of Notre Dame.

The death of neurons, whether in the brain or the eye, can result in a number of human neurodegenerative disorders, from blindness to Parkinson's disease. Current treatments for these disorders can only slow the progression of the illness, because once a neuron dies, it cannot be replaced.

Now, a team of researchers from the University of Notre Dame, Johns Hopkins University, Ohio State University and the University of Florida has identified networks of genes that regulate the process responsible for determining whether neurons will regenerate in certain animals, such as zebrafish.

"This study is proof of principle, showing that it is possible to regenerate retinal neurons. We now believe the process for regenerating neurons in the brain will be similar," said David Hyde, professor in the Department of Biological Sciences at Notre Dame and co-author on the study.

For the study, published in Science, the researchers mapped the genes of animals that have the ability to regenerate retinal neurons. For example, when the retina of a zebrafish is damaged, cells called the Müller glia go through a process known as reprogramming. During reprogramming, the Müller glia cells change their gene expression to become like progenitor cells, or cells that are used during early development of an organism. These progenitor-like cells can then become any cell type necessary to fix the damaged retina.

Like zebrafish, people also have Müller glia cells. However, when the human retina is damaged, the Müller glia cells respond with gliosis, a process that does not allow them to reprogram.

"After determining the varying animal processes for retina damage recovery, we had to decipher if the process for reprogramming and gliosis were similar. Would the Müller glia follow the same path in regenerating and non-regenerating animals or would the paths be completely different?" said Hyde, who also serves as the Kenna Director of the Zebrafish Research Center at Notre Dame. "This was really important, because if we want to be able to use Müller glia cells to regenerate retinal neurons in people, we need to understand if it would be a matter of redirecting the current Müller glia path or if it would require an entirely different process."

The research team found that the regeneration process only requires the organism to "turn back on" its early development processes. Additionally, researchers were able to show that during zebrafish regeneration, Müller glia also go through gliosis, meaning that organisms that are able to regenerate retinal neurons do follow a similar path to animals that cannot. While the network of genes in zebrafish was able to move Müller glia cells from gliosis into the reprogrammed state, the network of genes in a mouse model blocked the Müller glia from reprogramming.

From there, the researchers were able to modify zebrafish Müller glia into a similar state that blocked reprogramming, and to induce a mouse model to regenerate some retinal neurons.

Next, the researchers will aim to identify the number of gene regulatory networks responsible for neuronal regeneration and exactly which genes within the network are responsible for regulating regeneration.

Credit: 
University of Notre Dame

Fly larvae extract will replace antibiotics in fighting plant pathogens

image: MIPT doctoral student Heakal Mohamed holding a Petri dish used in the experiment.

Image: 
Natalia Arefieva/MIPT Press Office

Biotechnologists from MIPT have developed a method for extracting the active constituents from the fat of black soldier fly larvae. These compounds possess unique antimicrobial properties and can destroy bacteria that cause farm crop diseases and are resistant to antibiotics. The study was published in Microorganisms.

Bacteria cause a substantial share of the diseases affecting farm crops. The standard way of combating them is with antibiotics, but yearslong overuse has led to microbes developing resistance. Moreover, antibiotics do not always target only harmful bacteria; they may also kill the microbes that are beneficial to plants.

In a search for an alternative way of protecting plants from pathogenic bacteria, researchers from MIPT turned to flies of the species Hermetia illucens, commonly known as black soldier flies. The team's hypothesis was that the constituents of the fly larvae fat could be an antimicrobial agent.

Black soldier flies originate from South America but are now found in the wild across the globe. The larvae of H. illucens are mass-produced in insect factories to feed livestock and fish, mainly because the larvae have an undemanding diet and accumulate large amounts of protein and fat under a chitinous cover. The animals are fed either whole larvae without prior processing or a protein extract produced in any of several ways.

In their study, the MIPT biotechnologists used larval fat obtained by mechanical pressing. To extract its biologically active constituents, the team tested 20 different organic solvents in search of the one best suited for the purpose. Eventually, MIPT PhD student Heakal Mohamed and the project's principal investigator Elena Marusich selected a solvent composed of water, methanol, and hydrochloric acid. It enabled the extraction of more than 4% of the active fatty acids contained in the larval fat. Methanol facilitates the dissolving of fatty acids in water, and acidification stabilizes the resulting mixture. The technique proved 50 times more effective than all previously available methods.

"We found a way to mix the solvents in the right proportions for extracting the chemical compounds of interest," said Elena Marusich, deputy head of the Laboratory of Innovative Drugs and Agricultural Biotechnology at MIPT. "The resulting extract -- called AWME -- has antimicrobial properties. We have shown it to be more effective than antibiotics, so it could virtually replace antibiotics in agriculture for fighting phytopathogenic bacteria."

The researchers tested the antibacterial effect of their extract on five strains of pathogenic bacteria affecting plants. Experiments included growing bacteria on the surface of agar-containing Petri dishes. The researchers placed disks of filtering paper soaked in AWME of a specific concentration onto the growing bacterial lawn. The experiments revealed that harmful bacteria died in the presence of the fly fat extract.

The extract is stable enough to withstand prolonged storage in a refrigerator without losing its antimicrobial properties.

"A widespread use of our extract in agriculture will require additional experiments with other common plant pathogens, as well as research into the mechanisms underlying the extract's antibacterial activity. We want to express our immense gratitude to Gennady Ivanov, a true enthusiast and pioneer of H. illucens larva cultivation in Russia, way off from the fly's native South America. Gennady is the CEO of Biolaboratorium LLC, a resident of the Skolkovo Innovation Center, and the NordTechsad LLC, which received the 2019 Golden Autumn Award for the production of animal feed. His work and generosity made our research possible," said Sergey Leonov, who heads the Laboratory for the Development of Innovative Drugs and Agricultural Biotechnology at MIPT.

Credit: 
Moscow Institute of Physics and Technology

Scientist maps CO2 emissions for entire US to improve environmental policymaking

image: Emissions map of entire U.S. landscape at high space- and time-resolution with details on economic sector, fuel and combustion process.

Image: 
Courtesy Northern Arizona University

With intense wildfires in the western U.S. and frequent, intense hurricanes in the Gulf of Mexico, the nation is again affected by extreme weather-related events resulting from climate change. In response, cities, states and regions across the country are developing policies to reduce their emissions of greenhouse gases, chiefly carbon dioxide (CO2). Even though many state and local governments are committed to these goals, however, the emissions data they have to work with is often too general and too expensive to provide a useful baseline and target the most effective policy.

Professor Kevin Gurney of Northern Arizona University's School of Informatics, Computing, and Cyber Systems today published results in the Journal of Geophysical Research detailing greenhouse gas emissions across the entire U.S. landscape at high space- and time-resolution with details on economic sector, fuel and combustion process.

Gurney, who specializes in atmospheric science, ecology and public policy, has spent the past several years developing a standardized system, as part of the Vulcan Project, that quantifies and visualizes greenhouse gases emitted across the entire country down to individual power plants, neighborhoods and roadways, identifying problem areas and enabling better decisions about where to cut emissions most effectively. Leading up to the nationwide study, Gurney produced emissions maps of several different large cities, including the Los Angeles megacity, Indianapolis, the Washington, D.C./Baltimore metropolitan area and Salt Lake City.

Funded by NASA, Gurney developed the high-resolution emissions map as an effective tool for scientific and policy applications. His goal is to provide policymakers across the nation with a means to strategically address problem areas instead of taking an inefficient, costly approach.

"We're providing U.S. policymakers at national, state and local scales with a scalpel instead of a hammer. Policies that might be relevant to California are possibly less relevant for Chicago or New York. They need to have information that reflects their unique conditions but follows a rigorous, standardized scientific approach. In this way, they can have confidence in the numbers which, in turn, will stimulate smart investment in reducing emissions."

One of the strengths of Gurney's approach is validation by atmospheric monitoring of CO2 from ground-based and satellite instruments.

"By synthesizing the detail of building and road-scale emissions with the independence and accuracy of atmospheric monitoring," Gurney said, "we have the best possible estimate of emissions with the most policy-relevant detail."

An animated video of the Vulcan Project output is available online.

Because the system characterizes CO2 emissions across the entire U.S. landscape at one-kilometer resolution, from coast to coast, Gurney points out that it offers every U.S. city an inventory of its emissions. "By extracting all cities in the US from our data product, we can offer every city a consistent and comprehensive assessment of their emissions. Like the US weather forecasting system, this problem is best solved with a single systemic approach and shared with city stakeholders so they can do what they know how to do better than anyone - reduce emissions in ways that meet their individual needs," Gurney said.
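Conceptually, extracting a city inventory from such a gridded product amounts to summing the grid cells that fall inside the city boundary. The sketch below shows that idea only; the grid values and the rectangular "city" mask are invented, and the real Vulcan product carries sector, fuel and time detail that is ignored here.

```python
# Hedged sketch: sum a synthetic 1-km gridded CO2 emissions field over the
# cells inside a hypothetical, rectangular city boundary.
import numpy as np

rng = np.random.default_rng(3)
emissions_tC_per_km2 = rng.gamma(2.0, 50.0, size=(200, 200))  # synthetic 200 x 200 km domain

city_mask = np.zeros((200, 200), dtype=bool)
city_mask[80:120, 90:140] = True  # assumed city footprint

city_total = emissions_tC_per_km2[city_mask].sum()
print(f"City inventory: {city_total:,.0f} tonnes C/yr over {city_mask.sum()} km^2")
```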

Credit: 
Northern Arizona University

NASA gauges Tropical Storm Delta's strength in infrared

image: On Oct. 5 at 3:05 a.m. EDT (0705 UTC) NASA's Aqua satellite analyzed Tropical Storm Delta using the Atmospheric Infrared Sounder or AIRS instrument. AIRS found coldest cloud top temperatures as cold as or colder than (purple) minus 63 degrees Fahrenheit (minus 53 degrees Celsius) around the center of circulation.

Image: 
NASA JPL/Heidar Thrastarson

NASA's Aqua satellite analyzed Tropical Storm Delta in infrared imagery as it moved through the Caribbean Sea. The imagery provided cloud top temperatures to identify the strongest areas within the storm.

Potential Tropical Cyclone 26 formed in the Caribbean Sea by 5 p.m. EDT on Sunday, Oct. 4. Six hours later, the National Hurricane Center (NHC) classified it as Tropical Depression 26. By 8 a.m. EDT the next morning, satellite imagery helped confirm that the depression had strengthened into a tropical storm. At that time, it was given the name Delta.

Analyzing Delta's Temperatures and Strength

One of the ways NASA researches tropical cyclones is using infrared data that provides temperature information. The AIRS instrument aboard NASA's Aqua satellite captured a look at those temperatures in Delta and gave insight into the size of the storm and its rainfall potential.

Tropical cyclones do not always have uniform strength; some sides are stronger than others. The stronger the storms, the higher they extend into the troposphere, and the colder the cloud top temperatures. NASA provides that data to forecasters at NOAA's National Hurricane Center or NHC so they can incorporate it into their forecasting.

On Oct. 5 at 3:05 a.m. EDT (0705 UTC) NASA's Aqua satellite analyzed Tropical Storm Delta using the Atmospheric Infrared Sounder or AIRS instrument. AIRS found coldest cloud top temperatures as cold as or colder than (purple) minus 63 degrees Fahrenheit (minus 53 degrees Celsius) around the center of circulation. NASA research has shown that cloud top temperatures that cold indicate strong storms that have the capability to create heavy rain.

When meteorologists studied satellite imagery, including infrared imagery, they noted that deep convection (rising air that forms the thunderstorms that make up a tropical cyclone) had been steadily improving in both vertical depth and structure since 2 a.m. EDT. The cloud pattern was also becoming more circular, with upper-level outflow of air at the top of the storm now established in all quadrants. That is an indication of improvement in a storm's structure.

NHC noted, "However, there are still some indications in satellite imagery that the low-level and the mid-upper-level circulations are not yet vertically aligned, with the low-level center still located just inside the northern edge of the convective cloud shield."

Eight hours after the AIRS image, the convective structure of Delta continued to improve. Earlier microwave data and early-light visible satellite imagery showed that the center of the tropical cyclone re-formed farther south within the area of deep convection.

Warnings and Watches on Oct. 5

NOAA's National Hurricane Center has issued a number of watches and warnings for Delta on Oct. 5. A Hurricane Warning is in effect for Cuba's province of Pinar del Rio. A Tropical Storm Warning is in effect for the Cayman Islands including Little Cayman and Cayman Brac, and for the Isle of Youth.

A Hurricane Watch is in effect for the Cuban province of Artemisa and for the Isle of Youth. A Tropical Storm Watch is in effect for the Cuban province of La Habana.

Delta's Status

At 11 a.m. EDT (1500 UTC) on Oct. 5, the center of Tropical Storm Delta was located near latitude 16.4 degrees north and longitude 78.6 degrees west. Delta is centered about 135 miles (215 km) south of Negril, Jamaica and about 265 miles (425 km) southeast of Grand Cayman.

Delta is moving toward the west near 7 mph (11 kph), and a turn toward the west-northwest is forecast later today. A faster northwestward motion is expected on Tuesday (Oct. 6) and Wednesday (Oct. 7). Maximum sustained winds have increased to near 45 mph (75 kph) with higher gusts. The estimated minimum central pressure is 1002 millibars.

Delta's Forecast

NHC expects additional strengthening during the next few days, and Delta is expected to become a hurricane on Tuesday before it nears western Cuba. On the forecast track, the center of Delta is expected to move away from Jamaica later today, move near or over the Cayman Islands early Tuesday, and approach western Cuba Tuesday afternoon or evening. Delta is forecast to move into the southeastern Gulf of Mexico Tuesday night, and be over the south-central Gulf of Mexico on Wednesday.

Credit: 
NASA/Goddard Space Flight Center

How Hispanic and Asian populations influence US food culture

Media and academics often equate assimilation with the process of immigrants becoming more similar to U.S.-born populations over time and across generations, says University of Arizona researcher Christina Diaz.

"But assimilation is likely a two-way street. And we see this, but there have been no tests done," said Diaz, an assistant professor in the School of Sociology in the College of Social and Behavioral Sciences.

In a new study, Diaz and co-author Peter Ore, a graduate student in sociology, looked for evidence that the U.S. community is impacted by minority populations. They used ethnic restaurants - both national chains and local eateries - as test cases.

The study, "Landscapes of Appropriation and Assimilation: The Impact of Immigrant-Origin Populations on U.S. Cuisine," is published in the Journal of Ethnic and Migration Studies. A portion of the research was conducted while Diaz was a 2018 Career Enhancement Fellow through the Woodrow Wilson Foundation.

The researchers found strong evidence that Asian and Hispanic populations are important contributors to local food culture. Those populations predict the number of Hispanic and Asian local ethnic restaurants - but not chains - in a given county. The size of local Hispanic and Asian populations also is linked to non-ethnic ownership of ethnic restaurants, and the availability of local Asian and Hispanic cuisine is strongly associated with education levels of the white majority population.

Focusing on Food

Diaz says the study was a unique effort to investigate, on a national level, whether Asians and Hispanics exert cultural influence on local populations. Assimilation is difficult to empirically test, she said.

To tackle that problem, Diaz and Ore pooled county-level data from the U.S. census, the American Community Survey, the Economic Research Service, the Voting and Elections Collection from CQ Press, Reference USA and Nielsen marketing data.

"This paper was a big introduction to me of the complexities of creating this whole architecture of data from a lot of different sources," Ore said.

Diaz and Ore included both immigrants and U.S.-born persons in the ethnic groupings, because, "oftentimes those outside of the ethnic group tend to code ethnic people as foreigners regardless of where they're born, and also because food is an enduring cultural attribute that gets passed down through generations," Diaz said.

Diaz added that data analysis revealed the same pattern of findings when the ethnic grouping only included immigrants.

Why focus on restaurants to test assimilation? Assimilation scholars argue that cuisine is among the first markers of ethnicity to become absorbed in local communities, Diaz said.

"If we do not observe patterns that suggest Asians and Hispanics are associated with local tastes via restaurants, it is unlikely these populations will transform other dimensions of social life," Diaz said.

Diaz acknowledges that just because someone enjoys Mexican and Asian food doesn't mean they welcome immigrants.

"A greater acceptance of food ways is not going to be reflective of increasingly positive intergroup relations or dynamics," Diaz said. "This is a small initial step to provide spaces for people across different ethnic groups and different racial categories to potentially interact."

Models of Assimilation

Diaz and Ore tested three competing models of assimilation: relational assimilation, appropriative assimilation, and racial or ethnic threat.

With relational assimilation, the demand for ethnic products is linked to the ethnic population; when one increases, the other increases.

"This theory suggests ethno-racial hierarchies may potentially weaken with prolonged intergroup exposure," Diaz said.

With appropriative assimilation, an increase in ethnic products is unrelated to the growth in the ethnic population, suggesting that ethnic goods are being appropriated by the dominant groups without minority involvement.

"This would expand mainstream food preferences and possibly bolster the economic status of majority populations while doing very little to reduce structural disadvantages faced by minority populations," Diaz said.

In the third scenario, racial or ethnic threat, ethnic restaurants would be lower in areas with the highest concentration of immigrants. Some research suggests there is a "tipping point" when increased immigration results in "natives shying away from immigrant food or culture because of perceived political threat or competition for employment," Diaz said.

Diaz and Ore primarily found evidence for relational assimilation. Counties with proportionally larger numbers of Asians and Hispanics had significantly more Asian and Hispanic restaurants.

They ran various tests to ensure that the relationship between ethnic groups and ethnic restaurants was not driven solely by the Hispanic or Asian demand for ethnic cuisine.

"We found that restaurant availability is also highest in really diverse areas, so we have reason to believe that there really is something about these intergroup interactions that are fertile for ethnic restaurant demand," Diaz said.

Analysis of restaurant ownership resulted in an interesting finding: Those outside of the ethnic community were more likely to own Hispanic or Asian restaurants in areas with dense Hispanic or Asian populations.

"We interpret this as evidence that ethnic populations can transform tastes, demands and opportunities for those outside of the ethnic community," Diaz said.

Might this also be evidence of appropriation?

"We are agnostic about whether non-ethnic ownership is necessarily appropriation," Diaz said. "We are interpreting high rate of ownership among non-ethnics as relational assimilation because we see evidence of a heightened relationship in areas with a dense co-ethnic population."

The same cannot be said of fast-food ethnic chain restaurant ownership, where non-ethnic ownership was unrelated to the actual size of the ethnic community, suggesting appropriative assimilation.

Diaz and Ore did not find evidence of racial or ethnic threat.

"We suspect that restaurant spaces may be less likely to invoke hardened protest by majority groups than other markers of ethnicity, such as foreign-language programs in schools or employer preferences for specific skills," Diaz said.

Education's Impact

Another key finding is that the availability of local Asian and Hispanic restaurants in a community is strongly associated with the share of the majority populations with a college degree.

Research has shown that educated populations may be more likely to engage in cultural exchanges with immigrant and minority communities, particularly as consumers of ethnic products and services.

"More educated people tend to signal their status by presenting themselves as being eclectic or omnivorous," Ore said.

Diaz and Ore also analyzed Nielsen grocery data to obtain a secondary indicator of the impact of ethnic groups on food consumption and found the same pattern. An increase in the ethnic population resulted in an increase in ethnic grocery purchases by the majority population.

Diaz emphasizes that the study does not illustrate any decrease in the assimilation of immigrant groups.

"We're trying to shift the focus to demonstrate that the fabric of U.S. culture can transform as a result of immigration," Diaz said. "This doesn't mean that immigrants are assimilating any less."

Credit: 
University of Arizona

Pancreatic surgery: lower mortality with larger case volumes

For certain surgical procedures, can a correlation be shown between the volume of services provided per hospital and the quality of treatment results? This is the question addressed in eight commissions that the Federal Joint Committee (G-BA) awarded to the Institute for Quality and Efficiency in Health Care (IQWiG) in Germany. An IQWiG rapid report is now available for the seventh intervention to be tested, complex pancreatic surgery.

According to the findings, for complex pancreatic surgery a positive correlation can be inferred between the volume of services and the quality of treatment results: In hospitals with larger case volumes, the survival probabilities for patients are higher overall, fewer fatal complications occur and hospital stays are mostly shorter.

High-risk procedures usually performed as elective surgery

The pancreas produces both digestive secretions and hormones such as insulin, glucagon or somatostatin, which have a regulating effect on carbohydrate metabolism and digestion.

Surgical procedures of the pancreas are considered complex and thus high risk, and are usually performed as elective, i.e. planned, surgery. These procedures are largely performed due to complications caused by chronic inflammation or due to malignant neoplasms.

Between 2009 and 2014, about 35,000 complex surgical procedures of the pancreas were performed overall in Germany due to malignant neoplasms. The hospital mortality rates for patients who underwent complex pancreatic surgery in Germany between 2009 and 2013 were around 10 percent. The G-BA has set a minimum volume for complex pancreatic surgery in Germany, which currently stands at ten procedures per hospital location per year.

Positive correlation between volume of services and survival probability

The IQWiG project team identified 42 retrospective observational studies investigating the correlation between the volume of services and the quality of treatment results in complex pancreatic surgery. Of these studies, 36 contained usable data.

The data analysis showed that the overall survival probabilities for patients who undergo pancreatic surgery are higher when they are treated in hospitals with larger case volumes and by surgeons with more experience in this type of surgery. For the outcomes "treatment-related complications" and "length of hospital stay", a correlation between the volume of services and the quality of treatment results can also be shown at both the hospital and surgeon level, to the benefit of hospitals and surgeons with a high volume of services. For the outcomes "fatal complications" and "tumour-free resection margin", a positive correlation between the volume of services and the quality of treatment results can be inferred at the hospital level. For other outcomes, either no such correlation can be shown or no usable data are available.
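To make the volume-outcome idea concrete, here is a minimal sketch that groups hospitals by annual case volume and compares average in-hospital mortality across groups. The hospital counts, volumes and mortality values are synthetic and merely echo the direction of the reported relationship; the IQWiG report itself pooled published observational studies rather than raw hospital data.

```python
# Hedged sketch: compare mortality across hospital case-volume groups.
# All numbers are synthetic illustrations, not IQWiG's data.
import numpy as np
import pandas as pd

rng = np.random.default_rng(4)
n_hospitals = 300
annual_cases = rng.integers(1, 120, n_hospitals)
# Synthetic mortality that declines slightly with volume, echoing the reported pattern.
mortality = np.clip(0.12 - 0.0005 * annual_cases + rng.normal(0, 0.01, n_hospitals), 0.01, None)

hospitals = pd.DataFrame({"annual_cases": annual_cases, "mortality_rate": mortality})
hospitals["volume_group"] = pd.cut(
    hospitals["annual_cases"], bins=[0, 10, 25, 50, 120],
    labels=["<=10", "11-25", "26-50", ">50"],
)
print(hospitals.groupby("volume_group", observed=True)["mortality_rate"].mean().round(3))
```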

IQWiG found no meaningful studies examining the effects of specific minimum case volumes introduced into the health care system for complex pancreatic surgery on the quality of treatment results.

Process of report production

In February 2019, the G-BA commissioned IQWiG to prepare a report on the correlation between the volume of services and the quality of treatment results for complex pancreatic surgery in an accelerated procedure as a so-called rapid report. The work on the project started in January 2020. This rapid report was sent to the contracting agency, the G-BA, in August 2020.

Credit: 
Institute for Quality and Efficiency in Health Care

40 percent of Amazon could now exist as rainforest or savanna-like ecosystems

A larger part of the Amazon rainforest than previously thought is at risk of crossing a tipping point at which it could become a savanna-type ecosystem, according to new research. The research, based on computer models and data analysis, is published in the journal Nature Communications.

Rainforests are very sensitive to changes that affect rainfall for extended periods. If rainfall drops below a certain threshold, areas may shift into a savanna state.

"In around 40 percent of the Amazon, the rainfall is now at a level where the forest could exist in either state - rainforest or savanna, according to our findings," says lead author Arie Staal, formerly a postdoctoral researcher at the Stockholm Resilience Centre and the Copernicus Institute of Utrecht University.

The conclusions are concerning because parts of the Amazon region are currently receiving less rain than previously and this trend is expected to worsen as the region warms due to rising greenhouse gas emissions.

Staal and colleagues focused on the stability of tropical rainforests in the Americas, Africa, Asia and Oceania. With their approach they were able to explore how rainforests respond to changing rainfall.

"By using the latest available atmospheric data and teleconnection models, we were able to simulate the downwind effects of disappearance of forests for all tropical forests. By integrating these analyses over the entire tropics, the picture of the systematic stability of tropical forests emerged," says Obbe Tuinenburg, former assistant professor at the Copernicus Institute of Utrecht University and visiting scientist at the Stockholm Resilience Centre.

The team explored the resilience of tropical rainforests by looking at two questions: if all the forests in the tropics disappeared, where would they grow back? And the inverse: what would happen if rainforests covered the entire tropical region of Earth?

Such extreme scenarios could inform scientists about the resilience and stability of real tropical forests. They can also help us understand how forests will respond to the changing rainfall patterns as greenhouse gases in the atmosphere rise.

The researchers ran the simulations starting with no forests in the tropics across Africa, the Americas, Asia and Australia. They watched forests emerge over time in the models. This allowed them to explore the minimum forest cover for all regions.

Staal said, "The dynamics of tropical forests is interesting. As forests grow and spread across a region this affects rainfall - forests create their own rain because leaves give off water vapour and this falls as rain further downwind. Rainfall means fewer fires leading to even more forests. Our simulations capture this dynamic."

The team ran the models a second time, this time in a world where rainforests entirely covered the tropical regions of Earth. This is an unstable scenario because in many places there is not enough rainfall to sustain a rainforest. In many places the forests shrank back due to lack of moisture.

Staal says, "As forests shrink, we get less rainfall downwind and this causes drying leading to more fire and forest loss: a vicious cycle."

Finally the researchers explored what happens if emissions keep rising this century along a very high-emissions scenario used by the Intergovernmental Panel on Climate Change (IPCC).

Overall, the researchers found that as emissions grow, more parts of the Amazon lose their natural resilience, become unstable and more likely to dry out and switch to become a savanna-type ecosystem. They note that even the most resilient part of the rainforest shrinks in area. In other words, more of the rainforest is prone to crossing a tipping point as emissions of greenhouse gases reach very high levels.

"If we removed all the trees in the Amazon in a high-emissions scenario a much smaller area would grow back than would be the case in the current climate," says co-author Lan Wang-Erlandsson of the Stockholm Resilience Centre.

The researchers conclude that the smallest area that can sustain a rainforest in the Amazon contracts by a substantial 66 percent under the high-emissions scenario.

In the Congo basin, the team found that the forest is at risk of changing state everywhere and will not grow back once gone, although under a high-emissions scenario part of the forest becomes less prone to crossing a tipping point. But Wang-Erlandsson adds: "This area where natural forest regrowth is possible remains relatively small."

"We understand now that rainforests on all continents are very sensitive to global change and can rapidly lose their ability to adapt," says Ingo Fetzer of the Stockholm Resilience Centre. "Once gone, their recovery will take many decades to return to their original state. And given that rainforests host the majority of all global species, all this will be forever lost."

The researchers also found that the minimal and maximal extents of the rainforests of Indonesia and Malaysia are relatively stable, because their rainfall depends more on the surrounding ocean than on rain generated by the forest cover itself.

The study only explored the impacts of climate change on tropical forests. It did not assess the additional stress of deforestation in the tropics due to agricultural expansion and logging.

Credit: 
Stockholm Resilience Centre

Vaccine ingredients could be hiding in small molecule libraries

image: The researchers found a molecule that can be used as a vaccine adjuvant, strengthening the immune response when added to a vaccine.

Image: 
Mindy Takamiya/Kyoto University iCeMS

Many vaccines include ingredients called adjuvants that help make them more effective by eliciting a stronger immune response. Identifying potential adjuvants just got easier, thanks to an approach described by scientists at Kyoto University's Institute for Integrated Cell-Material Sciences (iCeMS) and colleagues in the journal Angewandte Chemie.

The team of chemists and biologists in Japan report they found a molecule that, when added to a vaccine, strengthens the immune response just as well as a commonly used adjuvant. Vaccine adjuvants are an essential part of clinically used antigen vaccines, such as influenza, hepatitis and cervical cancer vaccines.

"Adjuvants generate a robust and long-lasting immune response, but the ones currently in use, like aluminium salts and oil-in-water emulsions, were developed in the 1920s and we don't precisely understand how they work, which is why they are often called 'immunologists' dirty little secret,'" says iCeMS chemical biologist Motonari Uesugi, who led the study.

The new adjuvant was discovered by screening a library of 8,000 small molecules for their ability to self-assemble. Molecular self-assembly is the spontaneous self-organization of molecules through non-covalent bonds, which do not involve sharing electrons. It is a well-known concept in materials science that living organisms also use to perform complex biological functions.

"We hypothesized that structures that come together through molecular self-assembly might mimic structures in pathogens, like viruses, stimulating a similar immune response," says Uesugi.

The team found 116 molecules that can self-assemble and then screened them for the ability to increase interleukin-6 expression by macrophages. Macrophages are immune cells that detect and 'eat up' pathogens circulating in the body. They also release proteins, such as interleukin-6, that activate other immune cells.

The research led to the discovery of a molecule called cholicamide. This molecule self-assembled to form a virus-mimicking structure that is engulfed by macrophages and similar immune cells. The structures are transported into specialized vacuoles to combine with a specific receptor called toll-like receptor 7, which sparks a heightened immune response. Specifically, it leads to the release of immune-stimulating cues like interleukin-6.

Further investigations and comparisons demonstrated that cholicamide was just as potent in inducing an immune response as the adjuvant Alum when added to an influenza vaccine given to mice.

"Our study, to the best of our knowledge, is the first report of using a small molecule library for vaccine adjuvant discovery," says Uesugi. "We hope the new approach paves the way for discovering and designing self-assembling small molecule adjuvants against pathogens, including emerging viruses."

Further studies are needed to determine how cholicamide mimics the single RNA strands of viruses to activate toll-like receptor 7. The researchers also want to understand how cholicamide binds to the receptor to elucidate the effects of this interaction.

Credit: 
Kyoto University

Hidden DNA fragment the 'trigger switch' for male development

image: Left: an XY mouse lacking Sry-T, which developed as female. Right: an XX mouse carrying an Sry-T transgene, which developed as male.

Image: 
Makoto Tachibana, Osaka University.

Biology textbooks may need to be rewritten, after scientists found a new piece of DNA that is essential to forming male sex organs in mice.

An international research collaboration with The University of Queensland found that the Y-chromosome gene that makes mice male is made up of two different DNA parts, not one, as scientists had previously assumed.

Emeritus Professor Peter Koopman, of UQ's Institute for Molecular Bioscience, said the critical DNA fragment had been hidden from researchers for more than 30 years.

"Expression of the Y chromosomal gene Sry is required for male development in mammals and since its discovery in 1990 has been considered a one-piece gene," he said.

"Sry turns out to have a cryptic second part, which nobody suspected was there, that is essential for determining the sex of male mice. We have called the two-piece gene Sry-T."

The scientists tested their theory and found that XY mice lacking Sry-T developed as female, while XX mice carrying an Sry-T transgene developed as male.

The success rate for the experiments was almost 100 per cent.

Emeritus Professor Koopman said the discovery would change how basic biology and evolution are taught around the world.

"For the last 30 years, we've been trying to figure out how this works," he said.

"Sry is a master switch gene because it flicks the switch for male development, it gets the ball rolling for a whole series of genetic events that result in a baby being born as a male instead of female.

"This new piece of the gene is absolutely essential for its function; without that piece, the gene simply doesn't work.

"We've discovered something massively important in biology here, because without Sry there can be no sexual reproduction and hence no propagation and survival of mammalian species."

The discovery may apply to efforts to manipulate sex ratios in agriculture or in biological pest management. But Emeritus Professor Koopman was quick to point out that, for ethical and practical reasons, the discovery cannot be applied to human embryos.

"Once we understand better how males and females are specified in non-human species of mammals, then it offers the opportunity to influence that process," he said.

"The ability to select for the desired sex could dramatically increase efficiencies for agricultural industries such as the dairy industry (females) or the beef industry (males).

"People have been trying to figure out ways to skew to the desired sex in these industries for a long time, and now that we understand more about the fundamental mechanism of Sry it may be possible through genetic means."

Credit: 
University of Queensland