Tech

Fertilizer feast and famine

Commercial organic and synthetic nitrogen fertilizer helps feed around half of the world's population. While excessive fertilizer use poses environmental and public health risks, many developing nations lack access to it, leading to food insecurity, social unrest and economic hardship.

A team of scientists, led by the University of California, Davis, has published a study that identifies five strategies to tackle the problem. These include applying fertilizers more precisely, getting nitrogen to where it's needed most, removing nitrogen pollution from the environment, reducing food waste and empowering consumers to think about sustainable food options.

"We have a two-sided challenge and we can't just focus on one side and forget about the other," said lead author Ben Houlton, professor and director of the John Muir Institute of the Environment at UC Davis. "People not having access to fertilizer to grow food is as much of a problem as inefficient use of it."

TOO LITTLE

The cost of fertilizer is a major barrier in emerging market economies, particularly for farmers in sub-Saharan Africa and parts of Latin America. Government subsidies can help, but the research suggests the problem isn't just an economic one. Governments need to better coordinate their policies so that farmers can gain access to fertilizer and to the most advanced, sustainable precision-agriculture approaches.

A coordinated international policy is urgently needed, said Houlton. While groups like the International Nitrogen Initiative have made significant progress in advancing global nitrogen issues, the study calls for a formal research mandate similar to the United Nations' Intergovernmental Panel on Climate Change to solve the global nitrogen problem.

TOO MUCH

Feeding an expected 10 billion people by 2050 could increase fertilizer use by as much as 40 percent. Shifting fertilizer application practices will be key, said Houlton. Slow-release fertilizers, "fertigation" (applying fertilizer through irrigation water) and new sensor technologies and drones can help improve nitrogen efficiency. These techniques can be costly, presenting challenges to adoption.

"Similar to offering consumers rebates for buying the first electric cars, we need incentives for farmers to adopt these practices," Houlton said.

The study also discusses ways to remove nitrogen pollution from the environment, including river and floodplain restoration projects and buffer strips designed to improve water quality.

FOOD WASTE AND DIET

One-quarter of all food produced is wasted, and its disposal in landfills produces greenhouse gases such as methane and nitrous oxide. The research suggests repurposing food waste as animal feed or turning it into compost. The study also highlights the need to increase consumer awareness to reduce overbuying.

Another strategy for reducing nitrogen overuse is to empower consumers to understand sustainable food-growing practices and healthy food choices. Not all crops, dairy or meat are produced in the same way. The study suggests more research and life cycle assessments of how different growing practices affect nitrogen footprints, so consumers can identify options that make the most sense for their particular culture and values.

"Nitrogen as a problem is quite solvable" Houlton said. "The benefits of a sustainable nitrogen balance can materialize remarkably quickly, from assisting in humanitarian crises to slashing global climate pollutants, preserving Earth's biodiversity, and reducing toxic algae blooms in rivers, lakes and the sea."

Credit: 
University of California - Davis

Researchers take key step toward cancer treatments that leave healthy cells unharmed

image: By identifying a protein modification that specifically supports the proliferation and survival of tumor cells, researchers have opened up a possible avenue for new cancer therapies without the side effects that often accompany current cancer treatments.

Image: 
Maca Franco, OSU College of Science

CORVALLIS, Ore. - By identifying a protein modification that specifically supports the proliferation and survival of tumor cells, researchers have opened up a possible avenue for new cancer therapies without the side effects that often accompany current cancer treatments.

Depending on the kind of cancer and the type of treatment, a patient might suffer from many side effects, including anemia, loss of appetite, bleeding, bruising, constipation, delirium, diarrhea, fatigue, hair loss, nausea, sexual issues or bladder problems.

Scientists at Oregon State University, the University of Central Florida and New York University made the protein-modification discovery while studying neurofibromatosis type 2. The condition, commonly known as NF2, is characterized by the development of tumors of the nervous system called schwannomas.

"The hallmark of tumor cell behavior is their uncontrolled growth," said Maca Franco, professor of biochemistry and biophysics in OSU's College of Science. "Tumors cells need to constantly produce energy and building blocks to replicate."

Researchers led by Franco and Oregon State undergraduate student Jeanine Pestoni found that schwannoma cells produce an oxidant and nitrating agent, peroxynitrite, which modifies an amino acid, tyrosine, in proteins.

When tyrosine becomes nitrated in specific proteins, one effect is that the tumor cells' metabolism is reprogrammed, enabling them to proliferate.

"To sustain persistent growth, tumor cells change the way they produce energy and building blocks and present a signature metabolic phenotype that differs from that of normal cells," Franco said. "We discovered that peroxynitrite, the most powerful oxidant produced by cells, controls the metabolic changes that occur in tumor cells of the nervous system and supports their growth. We believe that there are specific proteins that when they become nitrated acquire a new function they did not have before, and this new function may control tumor growth."

Peroxynitrite is produced at high levels in "pathological conditions," she said - such as those found in tumors - but not in normal tissues.

"This opens up the exciting possibility of targeting peroxynitrite production exclusively in tumor cells as a new therapeutic strategy for the treatment of tumors of the nervous system, with minimal to no side effects on normal tissues," Franco added. "We are uncovering a completely new category of targets for the treatment of solid tumors, and not only tumors of the nervous system - it may have broader implications for the treatment of several cancer types. We can go after proteins that usually aren't modified in normal cells; we can target those modified proteins with inhibitors that don't affect normal cells, hopefully developing a treatment with minimal side effects."

Credit: 
Oregon State University

Missing link in algal photosynthesis found, offers opportunity to improve crop yields

video: This week, in the journal Proceedings of the National Academy of Sciences, a team from Louisiana State University (LSU) and the University of York report a long-unexplained step in the CCM of green algae--which is key to developing a functional CCM in food crops to boost productivity. The researchers discovered a missing link in the photosynthetic process of the green alga Chlamydomonas reinhardtii that could be used to boost crop productivity.

Image: 
RIPE Project

BATON ROUGE, La. -- Photosynthesis is the natural process plants and algae use to capture sunlight and fix carbon dioxide into energy-rich sugars that fuel growth, development, and in the case of crops, yield. Algae evolved specialized carbon dioxide concentrating mechanisms (CCM) to photosynthesize much more efficiently than plants. This week, in the journal Proceedings of the National Academy of Sciences, a team from Louisiana State University (LSU) and the University of York report a long-unexplained step in the CCM of green algae--which is key to developing a functional CCM in food crops to boost productivity.

"Most crops are plagued by photorespiration, which occurs when Rubisco--the enzyme that drives photosynthesis--cannot differentiate between life-sustaining carbon dioxide and oxygen molecules that waste large amounts of the plant's energy," said James Moroney, the Streva Alumni Professor at LSU and member of Realizing Increased Photosynthetic Efficiency (RIPE). "Ultimately, our goal is to engineer a CCM in crops to surround Rubisco with more carbon dioxide, making it more efficient and less likely to grab oxygen molecules--a problem that is shown to worsen as temperatures rise."

Led by the University of Illinois, RIPE is an international research project that is engineering crops to be more productive by improving photosynthesis with support from the Bill & Melinda Gates Foundation, the U.S. Foundation for Food and Agriculture Research (FFAR), and the U.K. Government's Department for International Development (DFID).

Whereas carbon dioxide diffuses across cell membranes relatively easily, bicarbonate (HCO3-) diffuses about 50,000 times more slowly due to its negative charge. The green algae Chlamydomonas reinhardtii, nicknamed Chlamy, transports bicarbonate across three cellular membranes into the compartment that houses Rubisco, called a pyrenoid, where the bicarbonate is converted back into carbon dioxide and fixed into sugar.

"Before now, we did not understand how bicarbonate crossed the third threshold to enter the pyrenoid," said Ananya Mukherjee, who led this work as a graduate student at LSU before joining the University of Nebraska-Lincoln as a postdoctoral researcher. "For years, we tried to find the missing component, but it turns out there are three transport proteins involved in this step--which were the missing link in our understanding of the CCM of Chlamydomonas reinhardtii."

"While other transport proteins are known, we speculate that these could be shared with crops more easily because Chlamy is more closely related to plants than other photosynthetic algae, such as cyanobacteria or diatoms," said Luke Mackinder, a lecturer at York who collaborated with the RIPE team on this work with support from the Biotechnology and Biological Sciences Research Council (BBSRC) and the Leverhulme Trust.

Creating a functional CCM in crops will require three things: a compartment to store Rubisco, transporters to bring bicarbonate to the compartment, and carbonic anhydrase to turn bicarbonate into carbon dioxide.

In a 2018 study, RIPE colleagues at The Australian National University demonstrated that they could add a compartment called a carboxysome, which is similar to a pyrenoid, in crops. Now this study completes the list of possible transport proteins that could shuttle bicarbonate from outside the cell to this carboxysome structure in crops' leaf cells.

"Our research suggests that creating a functional CCM in crops could help crops conserve more water and could significantly reduce the energy-taxing process of photorespiration in crops--that worsens as temperatures rise," Moroney said. "The development of climate-resilient crops that can photosynthesize more efficiently will be vital to protecting our food security."

Realizing Increased Photosynthetic Efficiency (RIPE) is engineering staple food crops to more efficiently turn the sun's energy into food to sustainably increase worldwide food production, with support from the Bill & Melinda Gates Foundation, the U.S. Foundation for Food and Agriculture Research, and the U.K. Government's Department for International Development.

Credit: 
Louisiana State University

Maya more warlike than previously thought

image: UC Berkeley's David Wahl and Lysanna Anderson of USGS with a local assistant taking a sediment sample from the center of Lake Ek'Naab from an inflatable platform. All the equipment had to be carried 2 kilometers down a steep jungle trail to the lake.

Image: 
Francisco Estrada-Belli, Tulane

The Maya of Central America are thought to have been a kinder, gentler civilization, especially compared to the Aztecs of Mexico. At the peak of Mayan culture some 1,500 years ago, warfare seemed ritualistic, designed to extort ransom for captive royalty or to subjugate rival dynasties, with limited impact on the surrounding population.

Only later, archeologists thought, did increasing drought and climate change lead to total warfare -- cities and dynasties were wiped off the map in so-called termination events -- and the collapse of the lowland Maya civilization around 1,000 A.D. (or C.E., Common Era).

New evidence unearthed by a researcher from the University of California, Berkeley, and the U.S. Geological Survey calls all this into question, suggesting that the Maya engaged in scorched-earth military campaigns -- a strategy that aims to destroy anything of use, including cropland -- even at the height of their civilization, a time of prosperity and artistic sophistication.

The finding also indicates that this increase in warfare, possibly associated with climate change and resource scarcity, was not the cause of the disintegration of the lowland Maya civilization.

"These data really challenge one of the dominant theories of the collapse of the Maya," said David Wahl, a UC Berkeley adjunct assistant professor of geography and a researcher at the USGS in Menlo Park, California. "The findings overturn this idea that warfare really got intense only very late in the game."

"The revolutionary part of this is that we see how similar Mayan warfare was from early on," said archaeologist Francisco Estrada-Belli of Tulane University, Wahl's colleague. "It wasn't primarily the nobility challenging one another, taking and sacrificing captives to enhance the charisma of the captors. For the first time, we are seeing that this warfare had an impact on the general population."

Total warfare

The evidence, reported today in the journal Nature Human Behaviour, is an inch-thick layer of charcoal at the bottom of a lake, Laguna Ek'Naab, in Northern Guatemala: a sign of extensive burning of a nearby city, Witzna, and its surroundings that was unlike any other natural fire recorded in the lake's sediment.

The charcoal layer dates from between 690 and 700 A.D., right in the middle of the classic period of Mayan civilization, 250-950 A.D. The date for the layer coincides exactly with the date -- May 21, 697 A.D. -- of a "burning" campaign recorded on a stone stela, or pillar, in a rival city, Naranjo.

"This is really the first time the written record has been linked to an event in the paleo data sets in the New World," Wahl said. "In the New World, there is so little writing, and what's preserved is mostly on stone monuments. This is unique in that we were able to identify this event in the sedimentary record and point to the written record, particularly these Mayan hieroglyphs, and make the inference that this is the same event."

Wahl, a geologist who studies past climate and is first author of the study, worked with USGS colleague Lysanna Anderson and Estrada-Belli to extract 7 meters of sediment cores from the lake. Laguna Ek'Naab, which is about 100 meters across, is located at the base of the plateau where Witzna once flourished and has collected thousands of years of sediment from the city and its surrounding agricultural fields. After seeing the charcoal layer, the archaeologists examined many of Witzna's ruined monuments still standing in the jungle and found evidence of burning in all of them.

"What we see here is, it looks like they torched the entire city and, indeed, the entire watershed," Wahl said. "Then, we see this really big decrease in human activity afterwards, which suggests at least that there was a big hit to the population. We can't know if everyone was killed or they moved or if they simply migrated away, but what we can say is that human activity decreased very dramatically immediately after that event."

This one instance does not prove that the Maya engaged in total warfare throughout the 650-year classic period, Estrada-Belli said, but it does fit with increasing evidence of warlike behavior throughout that period: mass burials, fortified cities and large standing armies.

"We see destroyed cities and resettled people similar to what Rome did to Carthage or Mycenae to Troy," Estrada-Belli said.

And if total warfare was already common at the peak of Mayan lowland civilization, then it is unlikely to have been the cause of the civilization's collapse, the researchers argue.

"I think, based on this evidence, the theory that a presumed shift to total warfare was a major factor in the collapse of Classic Maya society is no longer viable," said Estrada-Belli. "We have to rethink the cause of the collapse, because we're not on the right path with warfare and climate change."

'Bahlam Jol burned for the second time'

Though Mayan civilization originated more than 4,000 years ago, the Classic period is characterized by widespread monumental architecture and urbanization exemplified by Tikal in Guatemala and Dzibanché in Mexico's Yucatan. City-states -- independent states made up of cities and their surrounding territories -- were ruled by dynasties that, archaeologists thought, established alliances and waged wars much like the city-states of Renaissance Italy, which affected the nobility without major impacts on the population.

In fact, most archaeologists believe that the incessant warfare that arose in the terminal Classic period (800-950 A.D.), presumably because of climate change, was the major cause of the decline of Mayan cities throughout present-day El Salvador, Honduras, Guatemala, Belize and southern Mexico.

So when Wahl, Anderson and Estrada-Belli discovered the charcoal layer in 2013 in Laguna Ek'Naab -- a layer unlike anything Wahl had seen before -- they were puzzled. The scientists had obtained the lake core in order to document the changing climate in Central America, hoping to correlate climate changes with changes in human occupation and food cultivation.

The puzzle lingered until 2016, when Estrada-Belli and co-author Alexandre Tokovinine, a Mayan epigrapher at the University of Alabama, discovered a key piece of evidence in the ruins of Witzna: an emblem glyph, or city seal, identifying Witzna as the ancient Mayan city Bahlam Jol. Searching through a database of names mentioned in Mayan hieroglyphs, Tokovinine found that very name in a "war statement" on a stela in the neighboring city-state of Naranjo, about 32 kilometers south of Bahlam Jol/Witzna.

The statement said that on the day "... 3 Ben, 16 Kasew ('Sek'), Bahlam Jol 'burned' for the second time." According to Tokovinine, the connotation of the word "burned," or puluuy in Mayan, has always been unclear, but the date 3 Ben, 16 Kasew on the Mayan calendar, or May 21, 697, clearly associates this word with total warfare and the scorched earth destruction of Bahlam Jol/Witzna.

"The implications of this discovery extend beyond mere reinterpretation of references to burning in ancient Maya inscriptions," Tokovinine said. "We need to go back to the drawing board on the very paradigm of ancient Maya warfare as centered on taking captives and extracting tribute."

Three other references to puluuy or "burning" are mentioned in the same war statement, referencing the cities of Komkom, known today as Buenavista del Cayo; K'an Witznal, now Ucanal; and K'inchil, location unknown. These cities may also have been decimated, if the word puluuy describes the same extreme warfare in all references. The earlier burning of Bahlam Jol/Witzna mentioned on the stela may also have left evidence in the lake cores -- there are three other prominent charcoal layers in addition to the one from 697 A.D. -- but the date of the earlier burning is unknown.

Mayan archaeologists have reconstructed some of the local history, and it's known that the conquest of Bahlam Jol/Witzna was set in motion by a queen of Naranjo, Lady 6 Sky, who was trying to reestablish her dynasty after the city-state had declined and lost all its possessions. She set her seven-year-old son, Kahk Tilew, on the throne and then began military campaigns to wipe out all the rival cities that had rebelled, Estrada-Belli said.

"The punitive campaign was recorded as being waged by her son, the king, but we know it's really her," he said.

That was not the end of Bahlam Jol/Witzna, however. The city revived, to some extent, with a reduced population, as seen in the lake cores. And the emblem glyph was found on a stela erected around 800 A.D., 100 years after the city's destruction. The city was abandoned around 1,000 A.D.

"The ability to tie geologic evidence of a devastating fire to an event noted in the epigraphic record, made possible by the relatively uncommon discovery of an ancient Maya city's emblem glyph, reflects a confluence of findings nearly unheard of in the field of geoarchaeology," Wahl said.

Credit: 
University of California - Berkeley

NASA catches Tropical Storm Francisco's approach to landfall in southern Japan

image: On Aug. 5, 2019 at 9 a.m. EDT (1300 UTC), the MODIS instrument that flies aboard NASA's Aqua satellite showed the strongest storms (red) in Tropical Storm Francisco circling the center. There, cloud top temperatures were as cold as minus 70 degrees Fahrenheit (minus 56.6 Celsius).

Image: 
NASA/NRL

Infrared imagery from NASA's Aqua satellite showed that Tropical Storm Francisco had powerful thunderstorms capable of heavy rainfall around its center of circulation as it moved toward landfall in southern Japan.

On Aug. 5, 2019, the Japan Meteorological Agency issued warnings for the Amami region, Kyushu and Shikoku. Advisories were in effect for Chugoku, Kinki, Ogasawara, Okinawa and Tokai.

NASA's Aqua satellite used infrared light to analyze the strength of storms in Tropical Storm Francisco and found the strongest storms circling the center. Infrared data provides temperature information, and the strongest thunderstorms that reach high into the atmosphere have the coldest cloud top temperatures.

On Aug. 5 at 9 a.m. EDT (1300 UTC), the Moderate Resolution Imaging Spectroradiometer or MODIS instrument that flies aboard NASA's Aqua satellite found that the strongest thunderstorms had cloud top temperatures as cold as minus 70 degrees Fahrenheit (minus 56.6 Celsius). Cloud top temperatures that cold indicate strong storms with the potential to generate heavy rainfall.

Microwave satellite imagery revealed an eye had formed in the center of those powerful thunderstorms.

The Joint Typhoon Warning Center or JTWC noted at 11 a.m. EDT (1500 UTC) that Francisco had maximum sustained winds near 60 knots (69 mph/111 kph). Francisco was centered near 31.6 degrees north latitude and 132.4 degrees east longitude, approximately 203 nautical miles east-southeast of Sasebo, Japan. Francisco has tracked westward.

After the storm makes landfall in Kyushu, Japan, it is forecast to move into the southern Korean Peninsula and turn to the northeast as it becomes extratropical over the Sea of Japan.

Credit: 
NASA/Goddard Space Flight Center

Researchers forecast failure in disordered materials

Disordered materials - such as cellular foams, fiber and polymer networks - are popular in applications ranging from architecture to biomedical scaffolding. Predicting when and where these materials may fail could impact not only those materials currently in use, but also future designs. Researchers from North Carolina State University and the University of California Los Angeles were able to forecast likely points of failure in two-dimensional disordered laser-cut lattices without needing to study detailed states of the material.

The interior of disordered materials is formed by a network of connections between slender beams that intersect at various points - or nodes - throughout the material. Their structure allows for both compression and deformation, enabling them to withstand different types of force.

Estelle Berthier, postdoctoral researcher at NC State and lead author of a paper describing the research, set out to determine whether it is possible to predict where failure is most likely to occur in a disordered network. Berthier and co-author Karen Daniels, professor of physics at NC State, generated lattices based on the contact networks observed within granular materials and looked at a property known as geodesic edge betweenness centrality (GEBC).

"The importance of an edge in a network is in terms of its ability to connect different parts of network using the shortest path," Berthier says. "In our model lattice, when you connect each node of the network taking the shortest path, you use one of these beams, or edges. If you go through a particular edge a lot, then that edge has high centrality. Think about using the shortest path, or road, between two cities. The centrality value is the most popular road on that shortest path."

In collaboration with UCLA mathematician Mason Porter, the researchers used a computer algorithm to calculate the GEBC for the lattice and found that edges with a higher centrality value than the mean were the most likely to fail.
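
For readers who want to experiment with the idea, here is a minimal sketch of that kind of calculation using the NetworkX library: it builds a toy random geometric graph as a stand-in for a laser-cut lattice, weights each edge by its length so that shortest paths are geodesic, computes edge betweenness centrality and flags edges above the mean value. The graph, weights and mean threshold are illustrative assumptions drawn from this article's description, not the researchers' actual lattices or code.

```python
# Illustrative sketch (not the study's code): compute geodesic edge betweenness
# centrality on a toy network and flag edges above the mean value, the simple
# threshold the article describes for likely failure sites.
import networkx as nx

# Toy stand-in for a disordered lattice: a random geometric graph whose nodes
# are points in the plane and whose edges are "beams" between nearby points.
G = nx.random_geometric_graph(n=60, radius=0.25, seed=1)

# Weight each edge by its Euclidean length so "shortest path" means shortest
# geodesic distance through the network, as in GEBC.
pos = nx.get_node_attributes(G, "pos")
for u, v in G.edges():
    dx = pos[u][0] - pos[v][0]
    dy = pos[u][1] - pos[v][1]
    G[u][v]["length"] = (dx**2 + dy**2) ** 0.5

# Edge betweenness centrality: the fraction of shortest paths through each edge.
ebc = nx.edge_betweenness_centrality(G, weight="length")

mean_ebc = sum(ebc.values()) / len(ebc)
likely_failure_sites = [edge for edge, value in ebc.items() if value > mean_ebc]
print(f"{len(likely_failure_sites)} of {G.number_of_edges()} edges exceed the mean GEBC")
```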

"If you have higher traffic on a particular road, then there's more wear and tear," Berthier says. "Similarly, a higher centrality value means that a particular path within the material is dealing with more force 'traffic,' and should be monitored more closely or perhaps shored up in some way."

The researchers found that the GEBC values alone were enough to identify failure sites in the material.

"One of the things that surprised me about the results was that the calculations don't require us to know any of the materials' properties, just how the parts have been connected together," Daniels says. "Of course, we can make the predictions even stronger by including information about the physical interactions in our calculations."

Credit: 
North Carolina State University

Unique electrical properties in quantum materials can be controlled using light

image: This is a microscopic image of multiple electrodes on a sheet of Weyl semimetal, with red and blue arrows depicting the circular movement of the light-induced electrical current by either left- (blue) or right-circularly polarized light (red).

Image: 
Zhurun Ji

Insights from quantum physics have allowed engineers to incorporate components used in circuit boards, optical fibers, and control systems in new applications ranging from smartphones to advanced microprocessors. But, even with significant progress made in recent years, researchers are still looking for new and better ways to control the uniquely powerful electronic properties of quantum materials.

A new study from Penn researchers found that Weyl semimetals, a class of quantum materials, have bulk quantum states whose electrical properties can be controlled using light. The project was led by Ritesh Agarwal and graduate student Zhurun Ji in the School of Engineering and Applied Science in collaboration with Charles Kane, Eugene Mele, and Andrew M. Rappe in the School of Arts and Sciences, along with Zheng Liu from Nanyang Technological University. Penn's Zachariah Addison, Gerui Liu, Wenjing Liu, and Heng Gao, and Nanyang's Peng Yu, also contributed to the work. Their findings were published in Nature Materials.

A hint of these unconventional photogalvanic properties, or the ability to generate electric current using light, was first reported by Agarwal in silicon. His group was able to control the movement of electrical current by changing the chirality, or the inherent symmetry of the arrangement of silicon atoms, on the surface of the material.

"At that time, we were also trying to understand the properties of topological insulators, but we could not prove that what we were seeing was coming from those unique surface states," Agarwal explains.

Then, while conducting new experiments on Weyl semimetals, where the unique quantum states exist in the bulk of the material, Agarwal and Ji got results that didn't match any theories that could explain how the electrical field was moving when activated by light. Instead of the electrical current flowing in a single direction, the current moved around the semimetal in a swirling circular pattern.

Agarwal and Ji turned to Kane and Mele to help develop a new theoretical framework that could explain what they were seeing. After conducting new, extremely thorough experiments to iteratively eliminate all other possible explanations, the physicists were able to narrow the possible explanations to a single theory related to the structure of the light beam.

"When you shine light on matter, it's natural to think about a beam of light as laterally uniform," says Mele. "What made these experiments work is that the beam has a boundary, and what made the current circulate had to do with its behavior at the edge of the beam."

Using this new theoretical framework, and incorporating Rappe's insights on the electron energy levels inside the material, Ji was able to confirm the unique circular movements of the electrical current. The scientists also found that the current's direction could be controlled by changing the light beam's structure, such as changing the direction of its polarization or the frequency of the photons.

"Previously, when people did optoelectronic measurements, they always assume that light is a plane wave. But we broke that limitation and demonstrated that not only light polarization but also the spatial dispersion of light can affect the light-matter interaction process," says Ji.

This work not only allows researchers to better observe quantum phenomena, but also provides a way to engineer and control unique quantum properties simply by changing light beam patterns. "The idea that the modulation of light's polarization and intensity can change how an electrical charge is transported could be a powerful design idea," says Mele.

Future development of "photonic" and "spintronic" materials that transfer digitized information based on the spin of photons or electrons respectively is also made possible thanks to these results. Agarwal hopes to expand this work to include other optical beam patterns, such as "twisted light," which could be used to create new quantum computing materials that allow more information to be encoded onto a single photon of light.

"With quantum computing, all platforms are light-based, so it's the photon which is the carrier of quantum information. If we can configure our detectors on a chip, everything can be integrated, and we can read out the state of the photon directly," Agarwal says.

Agarwal and Mele emphasize the "heroic" effort made by Ji, including an additional year's measurements made while running an entirely new set of experiments that were crucial to the interpretation of the study. "I've rarely seen a graduate student faced with that challenge who was able not only to rise to it but to master it. She had the initiative to do something new, and she got it done," says Mele.

Credit: 
University of Pennsylvania

How wildfires trap carbon for centuries to millennia

Charcoal produced by wildfires could trap carbon for hundreds of years and help mitigate climate change, according to new research published today.

The extensive and unprecedented outbreak of wildfires in the Arctic and the vast amounts of CO2 they are emitting have been hitting the headlines across the world.

But a new Nature Geoscience study quantifies the important role that charcoal plays in helping to compensate for carbon emissions from fires. And the research team say that this charcoal could effectively 'lock away' a considerable amount of carbon for years to come.

In an average year, wildfires around the world burn an area equivalent to the size of India and emit more carbon dioxide to the atmosphere than global road, rail, shipping and air transport combined.

As vegetation in burned areas regrows, it draws CO2 back out of the atmosphere through photosynthesis. This is part of the normal fire-recovery cycle, which can take less than a year in grasslands or decades in fire-adapted forests.

In extreme cases, such as arctic or tropical peatlands, full recovery may not occur for centuries.

This recovery of vegetation is important because carbon that is not re-captured stays in the atmosphere and contributes to climate change.

Deforestation fires are a particularly important contributor to climate change as these result in a long-term loss of carbon to the atmosphere.

Now, a new study by researchers at Swansea University and Vrije Universiteit Amsterdam has quantified the important role that charcoal created by fires - known as pyrogenic carbon - plays in helping to compensate for carbon emissions.

Lead author Dr Matthew Jones, who recently joined the UEA's School of Environmental Sciences from Swansea University, said: "CO2 emitted during fires is normally sequestered again as vegetation regrows, and researchers generally consider wildfires to be carbon neutral events once full biomass recovery has occurred.

"However, in a fire some of the vegetation is not consumed by burning, but instead transformed to charcoal. This carbon-rich material can be stored in soils and oceans over very long time periods.

"We have combined field studies, satellite data, and modelling to better quantify the amount of carbon that is placed into storage by fires at the global scale."

The paper, which was co-authored by Dr Cristina Santin and Prof Stefan Doerr, from Swansea University, and Prof Guido van der Werf, of Vrije Universiteit Amsterdam, explained that, as well as emitting CO2 to the atmosphere, landscape fires also transfer a significant fraction of affected vegetation carbon to charcoal and other charred materials.

The researchers say this pyrogenic carbon needs to be considered in global fire emission models.

Dr Jones said: "Our results show that, globally, the production of pyrogenic carbon is equivalent to 12 per cent of CO2 emissions from fires and can be considered a significant buffer for landscape fire emissions.

"Climate warming is expected to increase the prevalence of wildfires in many regions, particularly in forests. This may lead to an overall increase in atmospheric CO2 emissions from wildfires, but also an increase in pyrogenic carbon storage. If vegetation is allowed to recover naturally then the emitted CO2 will be recaptured by regrowth in future decades, leaving behind an additional stock of pyrogenic carbon in soils, lakes and oceans.

"We expect any additional pyrogenic carbon to be trapped for a period of centuries to millennia, and although it will eventually return to the atmosphere as charcoal degrades, it is locked away and unable to affect our climate in the meantime.

"This brings some good news, although rising CO2 emissions caused by human activity, including deforestation and some peatland fires, continue to pose a serious threat to global climate."

There are still important questions to be answered about how a warmer, more drought-prone climate will affect the global extent of wildfires in the future.

For example, will there be more fire in Arctic peatlands, as we are experiencing this summer, and what proportion of CO2 emissions will be recaptured by future vegetation regrowth?

But this new research shows that pyrogenic carbon production should be considered as a significant product of fires and an important element of the global carbon cycle.

The study, "Global fire emissions buffered by the production of pyrogenic carbon," is published in the journal Nature Geoscience.

Credit: 
University of East Anglia

Rice lab produces simple fluorescent surfactants

image: Rice University chemists have produced an array of fluorescent surfactants for imaging, biomedical and manufacturing applications.

Image: 
Illustration by Ashleigh Smith McWilliams/Rice University

HOUSTON - (Aug. 5, 2019) - Laboratories use surfactants to separate things, and fluorescent dyes to see things. Rice University chemists have combined the two to simplify life for scientists everywhere.

The Wiess School of Natural Sciences lab of chemist Angel Martí introduced a lineup of eight fluorescent surfactants in Pure and Applied Chemistry. They're examples of what he believes will be a modular set of fluorescent surfactants for labs and industry.

Martí and Rice graduate student and lead author Ashleigh Smith McWilliams developed the compounds primarily to capture images of single nanotubes or cells as simply as possible.

"We can stain cells or carbon nanotubes with these surfactants," Martí said. "They stick to cells or nanotubes and now you can use fluorescent microscopy to visualize them."

Soaps and detergents are common surfactants. They are two-part molecules with water-attracting heads and water-avoiding tails. Put enough of them in water and they will form micelles, with the heads facing outward and the tails inward. (Similar structures form the protective, porous barriers around cells.)

McWilliams produced the surfactants by reacting fluorescent dyes with alcohol-based, nonpolar tails, which made the heads glow when triggered by visible light. When the compounds wrap around carbon nanotubes in a solution, they not only keep the nanotubes from aggregating but make them far easier to see under a microscope.

"Surfactants have been used for many different applications for years, but we've made them special by converting them to image things you can generally not see," Martí said.

"Fluorescent surfactants have been studied before, but the novel part of ours is their versatility and relative simplicity," McWilliams said. "We use common dyes and plan to produce these surfactants with an array of colors and fluorescent properties for specific applications."

Those could be far-reaching, Martí said.

"These can go well beyond imaging applications," he said. "For instance, clothing manufacturers use surfactants and dyes. In theory, they could combine those; instead of using two different chemicals, they could use one.

"I can also envision using these for water purification, where surfactant dyes can be tuned to trap pollutants and destroy them using visible light," Martí said. "For biomedical applications, they can be tuned to target specific cells and kill only those you radiate with light. That would allow for a localized way to treat, say, skin cancer."

Martí said his lab was able to confirm fluorescent surfactants are the real deal. "We were able to characterize the critical micelle concentration, the concentration at which micelles start forming," he said. "So we are 100% sure these molecules are surfactants."

Credit: 
Rice University

Geoengineering versus a volcano

image: This is ash from the 1991 eruption of Mount Pinatubo in the Philippines.

Image: 
Courtesy of Jackson K./USGS

Washington, DC-- Major volcanic eruptions spew ash particles into the atmosphere, which reflect some of the Sun's radiation back into space and cool the planet. But could this effect be intentionally recreated to fight climate change? A new paper in Geophysical Research Letters investigates.

Solar geoengineering is a theoretical approach to curbing the effects of climate change by seeding the atmosphere with a regularly replenished layer of intentionally released aerosol particles. Proponents sometimes describe it as being like a "human-made" volcano.

"Nobody likes the idea of intentionally tinkering with our climate system at global scale," said Carnegie's Ken Caldeira. "Even if we hope these approaches won't ever have to be used, it is really important that we understand them because someday they might be needed to help alleviate suffering."

He, along with Carnegie's Lei Duan (a former student from Zhejiang University), Long Cao of Zhejiang University, and Govindasamy Bala of the Indian Institute of Science, set out to compare the effects on the climate of a volcanic eruption and of solar geoengineering. They used sophisticated models to investigate the impact of a single volcano-like event, which releases particulates that linger in the atmosphere for just a few years, and of a long-term geoengineering deployment, which requires maintaining an aerosol layer in the atmosphere.

They found that regardless of how it got there, when the particulate material is injected into the atmosphere, there is a rapid decrease in surface temperature, with the land cooling faster than the ocean.

However, the volcanic eruption created a greater temperature difference between the land and sea than did the geoengineering simulation. This resulted in different precipitation patterns between the two scenarios. In both situations, precipitation decreases over land--meaning less available water for many people living there--but the decrease was more significant in the aftermath of a volcanic eruption than it was in the geoengineering case.

"When a volcano goes off, the land cools substantially quicker than the ocean. This disrupts rainfall patterns in ways that you wouldn't expect to happen with a sustained deployment of a geoengineering system," said lead author Duan.

Overall, the authors say that their results demonstrate that volcanic eruptions are imperfect analogs for geoengineering and that scientists should be cautious about extrapolating too much from them.

"While it's important to evaluate geoengineering proposals from an informed position, the best way to reduce climate risk is to reduce emissions," Caldeira concluded.

Credit: 
Carnegie Institution for Science

A new lens for life-searching space telescopes

image: University of Arizona researchers have designed a fleet of 35 powerful space telescopes that will search for the chemical signatures of life on other worlds.

Image: 
Nautilus team

The University of Arizona Richard F. Caris Mirror Laboratory is a world leader in producing the world's largest telescope mirrors. In fact, it is currently fabricating mirrors for the largest and most advanced Earth-based telescope: the Giant Magellan Telescope.

But there are size constraints, ranging from the mirror's own weight, which can distort images, to the size of the freeways and underpasses needed to transport finished pieces. Such giant mirrors are approaching their physical limits, and when they reach them, the UA will continue to be a global contributor to the art of gathering light and to drive change in the way astronomers observe the stars.

"We are developing a new technology to replace mirrors in space telescopes," said UA associate professor Daniel Apai, of Steward Observatory and the Lunar and Planetary Laboratory. "If we succeed, we will be able to vastly increase the light-collecting power of telescopes, and among other science, study the atmospheres of 1,000 potentially earth-like planets for signs of life."

Apai leads the space science half of the team, while UA professor Tom Milster, of the James C. Wyant College of Optical Sciences, leads the optical design of a replicable space telescope dubbed Nautilus. The researchers intend to deploy a fleet of 35 14-meter-wide spherical telescopes, each individually more powerful than the Hubble Space Telescope.

Each unit will contain a meticulously crafted 8.5-meter diameter lens, which will be used for astronomical observations. One use particularly exciting for Apai is analyzing starlight as it filters through planetary atmospheres, a technique which could reveal chemical signatures of life.

When combined, the telescope array will be powerful enough to characterize 1,000 extrasolar planets from as far away as 1,000 light years. Even NASA's most ambitious space telescope missions are designed to study a handful of potentially Earth-like extrasolar planets.

"Such a sample may be too small to truly understand the complexity of exo-earths," according to Apai and Milster's co-authored paper, which was published July 29 in the Astronomical Journal along with several other authors, including Steward Observatory astronomer Glenn Schneider and Alex Bixel, an astronomer and UA graduate student.

To develop Nautilus, Apai and Milster defined a goal and designed Nautilus to meet it.

"We wanted to search 1,000 potentially earth-like planets for signs of life. So, we first asked, what kinds of stars are most likely to host planets? Then, how far do we need to go in space to have 1,000 earth-like planets orbiting around them? It turned out that it's over 1,000 light years - a great distance, but still just a small part of the galaxy," Apai said. "We then calculated the light collecting power needed, which turned out to be the equivalent of a 50-meter diameter telescope."

The Hubble mirror is 2.4 meters in diameter and the James Webb Space Telescope mirror is 6.5 meters in diameter. Both were designed for different purposes and before exoplanets were even discovered.

"Telescope mirrors collect light - the larger the surface, the more starlight they can catch," Apai said. "But no one can build a 50-meter mirror. So we came up with Nautilus, which relies on lenses, and instead of building an impossibly huge 50-meter mirror, we plan on building a whole bunch of identical smaller lenses to collect the same amount of light."

The lenses were inspired by lighthouse lenses - large but lightweight - and include additional tweaks such as precision carving with diamond-tipped tools. The patented design, a hybrid between refractive and diffractive lenses, makes them more powerful and suitable for planet hunting, Milster said.

Because the lenses are lighter than mirrors, they are less expensive to launch into space and can be made quickly and cheaply using a mold. They are also less sensitive to misalignments, making telescopes built with this technology much more economical. Much like Ford did for cars, Ikea did for furniture, and SpaceX for rockets, Nautilus will use new technology, a simpler design, and lightweight components to provide cheaper and more efficient telescopes with more light-collecting power.

Nautilus telescopes also don't require any fancy observing technique.

"We don't need extremely high-contrast imaging. We don't need a separate spacecraft with a giant starshade to occult the planet host stars. We don't need to go into the infrared," Apai said. "What we do need is to collect lots of light in an efficient and cheap way."

In the last few decades, computers, electronics and data-collection instruments have all become smaller, cheaper, faster and more efficient. Mirrors, on the other hand, are exceptions to this growth as they haven't seen big cost reductions.

"Currently, mirrors are expensive because it takes years to grind, polish, coat and test," Apai said. Their weight also makes them expensive to launch. "But our Nautilus technology starts with a mold, and often it takes just hours to make a lens. We also have more control over the process, so if we make a mistake, we don't need to start all over again like you may need to with a mirror."

Additionally, risk would be distributed over many telescopes, so if something goes wrong, the mission isn't scrapped. Many telescopes remain.

"Everything is simple, cheap and replicable, and we can collect a lot of light," Apai said.

Apai and Milster have another vision if they succeed: "Using the low-cost, replicated space telescope technology, universities would be able to launch their own small, Earth- or space-observing telescopes. Instead of competing for bits of time on Hubble, they'd get their own telescope, controlled by their own teams," Apai said.

In January, Apai and Milster's team, along with UA assistant professor Dae Wook Kim and professor Ronguang Liang of the College of Optical Sciences and Jonathan Arenberg from Northrop Grumman Aerospace Systems, received $1.1 million from the Moore Foundation to create a prototype of a single telescope and test it on the 61-inch Kuiper Telescope on Mt. Bigelow by December 2020.

"The University of Arizona is just one of the few places in the world, and usually the first in the world, to generate such pioneering telescope systems," Milster said. "And it fits right in line with our history and our prominence in optical sciences and astronomy that we develop this technology."

Credit: 
University of Arizona

Reverse engineering the fireworks of life

video: This time-lapse video shows the growth and branching of microtubules, dubbed the 'fireworks of life' because these microscopic structures make up the skeleton of the cell. Graduate student Akanksha Thawani and three professors -- biologist Sabine Petry, engineer Howard Stone, and biophysicist Joshua Shaevitz -- succeeded in reverse-engineering the recipe for building these fireworks. Elapsed time is shown in seconds and the white scale bar shows 10 μm. (For reference, a human hair is 50 to 100 μm across.)

Image: 
Akanksha Thawani, Princeton University

Imagine standing in a lumberyard and being asked to build a house -- without blueprints or instructions of any kind. The materials are all in front of you, but that doesn't mean you have the first idea how to get from point A to point B.

That was the situation facing the Princeton biologists who are building microtubules, the skeleton of the cell, from scratch.

"We did not think it was possible," said Sabine Petry, an assistant professor of molecular biology. For years, Petry and the researchers in her lab have dazzled the biological world with videos of what they call "the fireworks of life," which show the branching and growth of these microscopic structures. "From making fireworks to getting to the recipe of how fireworks are made? We had imagined and brainstormed about it for five years." In that time, her team had painstakingly determined the fireworks' components, one protein at a time, and graduate student Akanksha Thawani had come up with a model for the sequence, but testing it seemed impossible.

But then the journal's reviewers told them they couldn't publish their model unless they proved it experimentally.

"Admittedly, after watching Akanksha work on this so long, when the referee asked for more work, I was skeptical that we could sort out the order of molecular attachments in any reasonable time," said Howard Stone, Princeton's Donald R. Dixon '69 and Elizabeth W. Dixon Professor of Mechanical and Aerospace Engineering and Thawani's co-adviser. "But Akanksha was focused and disciplined, and systematically tackled experiments that identified the order of the molecular attachments. It was stunning to follow her detective work."

"They asked us, and we wanted to get it published, so that did the trick," Petry said. "The review process gets a lot of bad press, but reviewers can sometimes push you to the next level." The results of their work appear in the journal eLife.

Building a house without blueprints

Microtubules are the bricks and mortar of the cell, used to build cell walls and the spindles of mitosis and meiosis -- without them, even single-celled organisms couldn't reproduce -- but until now, no one knew exactly how microtubules branch off each other. For a decade, researchers have known that the branching, caused as the microtubules grow from each other, was key to assembling spindles and making connections between the cell components.

"The missing piece for a decade or so has been this microtubule branching -- that microtubules don't grow just linearly, but they actually branch, and they can branch again and again, creating those fireworks," Petry said.

While Petry's team had identified the components necessary to build microtubules, they hadn't put together the sequence -- the recipe -- that spelled out exactly how to assemble them, at the molecular level, to make the spindles grow and branch into fireworks. And for the most part, that was fine. Biology did it for them. If they put the right components together, the fireworks just grew.

But how did it happen, exactly? That was the question that nagged at Thawani, a chemical and biological engineering graduate student doing her research in Petry's lab.

"For the longest time, I've been staring at them and wondering how this worked, from scratch," said Thawani, who recently won the prestigious Charlotte Elizabeth Proctor Fellowship for graduate students in their final year. "We start from no microtubules at all, and then, within 15 minutes, we have these beautiful structures. How do you generate a structure from those nanometer-sized proteins? What was it about their binding kinetics or their organization that would result in the structures that we see?"

Thawani was uniquely positioned to tackle these questions, having spent years studying chemical engineering and physics as well as molecular biology. She has essentially invented a new subspecialty between the three fields. "At the intersections between disciplines -- that's where the next, best science is," she said.

The eLife paper stands at that unusual crossroads: of the four authors, all except Thawani are principal investigators (PIs) of their own research labs, in three usually unrelated fields: Petry in biology; Stone in engineering; and Joshua Shaevitz, a professor of physics and the Lewis-Sigler Institute for Integrative Genomics.

"I don't know of many examples where there is one first author and then three PIs," said Petry. "I think that's a strength of Princeton. I don't know any other place where it's that easy to get three professors together to make a project happen."

The key, Thawani had realized, was creating a computer model based on precise measurements of the growth patterns of microtubules. That required imaging the fireworks with total internal reflection fluorescence (TIRF) microscopy, a strength of the Petry lab, which has developed techniques to optically isolate a 100 nanometer-thick region of the sample so that branching microtubules can be seen in a sea of background molecules. (For reference, a human hair is about 500 times wider than that.)

But even then, every pixel recorded by the camera included thousands of molecules. Thawani had to find a way to disaggregate the visual data to make single-molecule observations, which required months of complicated image analysis -- and help from Shaevitz, who has spent years on image analysis.

Ultimately, Thawani measured exactly when and where a single protein binds to an existing microtubule to start a new branch, as well as its rate of growth, looking at one molecule at a time.

"The traditional approach, where you change the amounts of different molecules in the branching reaction, doesn't allow you to figure out the order that things need to happen," said Shaevitz, who is also the co-director of the NSF-funded Center for the Physics of Biological Function. "By looking at individual molecules, we can literally watch the assembly piece by piece as it happens."

Thawani then created a computer model using those parameters. Other scientists have tried to model microtubule branching before, but none had access to such accurate measurements to test their model outputs against. She then tested various sequences that the researchers had brainstormed over the years, and the model ruled out all but one of them.

So now the research team had the ingredients -- proteins called TPX2, augmin and γ-TuRC -- as well as the sequence of steps, but the computer couldn't tell them which protein to add when. And as anyone who has assembled kit furniture or baked bread from scratch knows, doing the steps out of order just doesn't work.

The final twist

The experiments required by the reviewers revealed that Thawani and Petry's expectations were exactly backwards. "We went in thinking it had to be augmin first and then TPX2, but it turned out to be the other way around," Thawani said. "That was the twist."

With that discovery, the researchers had the complete recipe to generate microtubule fireworks: If TPX2 is deposited on existing microtubules, followed by binding augmin with γ-TuRC, then new microtubules will nucleate and branch.

As a final step, they confirmed that the proteins would bind with precisely the speed predicted by Thawani's computer model. "That was the third breakthrough," Petry said, "that those numbers matched, that what was predicted by her model in the computer was true for the biology."

"This work from Petry is really an important addition that will help drive the field forward," said Daniel Needleman, the Gordon McKay Professor of Applied Physics and a professor of molecular and cellular biology at Harvard University. "I think that this work, in combination with results from my group and from Jan Brugués (at the Max Planck Institute of Molecular Cell Biology and Genetics in Dresden), have really clarified the 'rules' for microtubule nucleation in spindles. The next step will be to figure out the molecular processes that govern those rules. Petry and colleges have setup a system that should really help to do that."

Looking back, Petry said, the work was "full of surprises, both experimentally and in what one can achieve and how it can be achieved. Revisiting this long-standing question, incorporating professors from three fields, the review process -- the whole system worked."

Credit: 
Princeton University

Technique uses magnets, light to control and reconfigure soft robots

Researchers from North Carolina State University and Elon University have developed a technique that allows them to remotely control the movement of soft robots, lock them into position for as long as needed and later reconfigure the robots into new shapes. The technique relies on light and magnetic fields.

"We're particularly excited about the reconfigurability," says Joe Tracy, a professor of materials science and engineering at NC State and corresponding author of a paper on the work. "By engineering the properties of the material, we can control the soft robot's movement remotely; we can get it to hold a given shape; we can then return the robot to its original shape or further modify its movement; and we can do this repeatedly. All of those things are valuable, in terms of this technology's utility in biomedical or aerospace applications."

For this work, the researchers used soft robots made of a polymer embedded with magnetic iron microparticles. Under normal conditions, the material is relatively stiff and holds its shape. However, researchers can heat up the material using light from a light-emitting diode (LED), which makes the polymer pliable. Once pliable, researchers demonstrated that they could control the shape of the robot remotely by applying a magnetic field. After forming the desired shape, researchers could remove the LED light, allowing the robot to resume its original stiffness - effectively locking the shape in place.

By applying the light a second time and removing the magnetic field, the researchers could get the soft robots to return to their original shapes. Or they could apply the light again and manipulate the magnetic field to move the robots or get them to assume new shapes.
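
(In control terms, the cycle is just an ordered sequence of light and field commands. The sketch below is a hypothetical illustration; set_led and set_field are assumed stand-ins for whatever drives the LED and the electromagnet, not an interface from the paper.)

    # Minimal sketch of the reconfiguration cycle described above.
    # set_led and set_field are hypothetical hardware hooks, assumed for illustration.
    import time

    def set_led(on: bool): ...
    def set_field(strength_mT: float, direction_deg: float): ...

    def reshape(strength_mT: float, direction_deg: float, soften_s=30, hold_s=60):
        """Soften the composite with light, shape it magnetically, then lock it."""
        set_led(True)                          # photothermal heating makes the polymer pliable
        time.sleep(soften_s)
        set_field(strength_mT, direction_deg)  # magnetic torque bends the soft robot
        time.sleep(hold_s)
        set_led(False)                         # cooling re-stiffens the polymer...
        set_field(0.0, 0.0)                    # ...so the shape holds with the field off

    def reset_shape(soften_s=30):
        """Return to the original shape: reheat with no field applied."""
        set_led(True)
        time.sleep(soften_s)                   # elasticity restores the as-made geometry
        set_led(False)

    # e.g. close a "grabber", move an object, then release it:
    # reshape(20.0, 90.0); ...; reset_shape()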

In experimental testing, the researchers demonstrated that the soft robots could be used to form "grabbers" for lifting and transporting objects. The soft robots could also be used as cantilevers, or folded into "flowers" with petals that bend in different directions.

"We are not limited to binary configurations, such as a grabber being either open or closed," says Jessica Liu, first author of the paper and a Ph.D. student at NC State. "We can control the light to ensure that a robot will hold its shape at any point."

In addition, the researchers developed a computational model that can be used to streamline the soft robot design process. The model allows them to fine-tune a robot's shape, polymer thickness, the abundance of iron microparticles in the polymer, and the size and direction of the required magnetic field before constructing a prototype to accomplish a specific task.
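
(A rough sense of how such a design model gets used: sweep candidate thicknesses, particle loadings and field strengths, and keep the combination whose predicted bending is closest to the target. The deflection formula below is a crude cantilever-scaling placeholder -- an assumption for illustration, not the researchers' model.)

    # Toy design sweep over soft-robot parameters. The deflection estimate is a
    # placeholder assumption, not the computational model from the paper.
    import itertools

    def predicted_bend(thickness_mm, particle_frac, field_mT, length_mm=20.0):
        """Toy estimate: magnetic torque scales with particle loading and field;
        bending stiffness grows with thickness cubed (classic cantilever scaling)."""
        torque = particle_frac * field_mT * length_mm
        stiffness = thickness_mm ** 3
        return 0.1 * torque / stiffness        # arbitrary-unit tip deflection (toy prefactor)

    target = 5.0   # desired deflection, arbitrary units
    best = min(
        itertools.product([0.3, 0.5, 0.8], [0.05, 0.10, 0.20], [10, 20, 40]),
        key=lambda p: abs(predicted_bend(*p) - target),
    )
    print("thickness_mm, particle_frac, field_mT =", best)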

"Next steps include optimizing the polymer for different applications," Tracy says. "For example, engineering polymers that respond at different temperatures in order to meet the needs of specific applications."

The paper, "Photothermally and Magnetically Controlled Reconfiguration of Polymer Composites for Soft Robotics," appears in the journal Science Advances. First author of the paper is Jessica Liu, a Ph.D. student at NC State. The paper was co-authored by Jonathan Gillen, a former undergraduate at NC State; Sumeet Mishra, a former Ph.D. student at NC State; and Benjamin Evans, an associate professor of physics at Elon University.

Credit: 
North Carolina State University

Sesame allergy is more common than previously known

CHICAGO --- Sesame allergy affects more than 1 million children and adults in the U.S., more than previously known, reports a new Northwestern Medicine study.

But sesame labeling is not currently required by law, as it is for the other top eight allergens such as peanut and milk, and sesame often appears on labels under potentially confusing names, such as tahini. This increases the risk of accidental ingestion.

The new study provides the first up-to-date estimates on the current prevalence of sesame allergy among U.S. children and adults in all 50 states.

"Our study shows sesame allergy is prevalent in the U.S. in both adults and children and can cause severe allergic reactions," said lead study author Dr. Ruchi Gupta, professor of pediatrics and of medicine at Northwestern University Feinberg School of Medicine and a physician at Ann & Robert H. Lurie Children's Hospital of Chicago. "It is important to advocate for labeling sesame in packaged food. Sesame is in a lot of foods as hidden ingredients. It is very hard to avoid."

Gupta also is director of the Center for Food Allergy and Asthma Research at Feinberg.

The paper will be published Aug. 2 in JAMA Network Open.

The study directly informs ongoing regulatory rule-making by the U.S. Food & Drug Administration, which is currently considering whether sesame should be added to the list of key food allergens for which mandatory product labeling is required. Unlike laws in other jurisdictions, including the European Union and Australia, current U.S. law does not require that sesame-containing products be labeled.

The Food Allergen Labeling and Consumer Protection Act of 2004 mandates labeling of the top eight allergenic foods and food groups -- peanut, milk, shellfish, tree nuts, egg, wheat, soy and finfish -- along with proteins derived from them.

More than 1.5 million children and adults in the U.S. (0.49% of the population) report a current sesame allergy, and more than 1.1 million (0.34% of the population) report either a physician-diagnosed sesame allergy or a history of sesame-allergic reaction symptoms, the study found.
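
(A quick arithmetic check, assuming a U.S. population of roughly 327 million -- a figure not taken from the study -- shows how those percentages map onto the reported counts.)

    # Sanity check: convert the reported prevalence percentages into counts.
    # The population figure is an outside assumption, not from the study.
    us_population = 327_000_000
    print(f"report a current sesame allergy:      {0.0049 * us_population / 1e6:.1f} million")
    print(f"diagnosed or convincing reaction hx:  {0.0034 * us_population / 1e6:.1f} million")
    # -> about 1.6 million and 1.1 million, consistent with the figures above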

The data also indicate that many individuals who report sesame allergies and experience potentially severe allergic reactions are not obtaining a clinical diagnosis of their allergies.

"Clinical confirmation of suspected food allergies is essential to reduce the risk of unnecessary allergen avoidance as well as ensure patients receive essential counseling and prescription of emergency epinephrine," said first author Christopher Warren, an investigator with the Center for Food Allergy and Asthma Research.

Unlike allergies to milk or egg, which often develop early in life and are outgrown by adolescence, sesame allergy affects children and adults to a similar degree. In addition, four in five patients with sesame allergy have at least one other food allergy. More than half have a peanut allergy, a third are tree-nut allergic, a quarter are egg-allergic and one in five are allergic to cow's milk.

Study investigators administered a survey via telephone and web to more than 50,000 U.S. households. The survey asked for detailed information about any suspected food allergies, including specific allergic reaction symptoms and details about clinical diagnosis, as well as demographic information. They obtained responses for a nationally representative sample of approximately 80,000 children and adults.

Credit: 
Northwestern University

Surgery simulators are key to assessment of trainees

image: Fifty participants were recruited from four stages of neurosurgical training: neurosurgeons, fellows and senior residents, junior residents, and medical students.

Image: 
Helmut Bernhard/The Neuro

Machine learning-guided virtual reality simulators can help neurosurgeons develop the skills they need before they step into the operating room, according to a new study.

Research from the Neurosurgical Simulation and Artificial Intelligence Learning Centre at The Neuro (Montreal Neurological Institute and Hospital) and McGill University shows that machine learning algorithms can accurately assess the capabilities of neurosurgeons during virtual surgery, demonstrating that virtual reality simulators using artificial intelligence can be powerful tools in surgeon training.

Fifty participants were recruited from four stages of neurosurgical training: neurosurgeons, fellows and senior residents, junior residents, and medical students. They performed 250 complex tumour resections using NeuroVR, a virtual reality surgical simulator developed by the National Research Council of Canada and distributed by CAE, which recorded all instrument movements at 20-millisecond intervals.

From this raw data, a machine learning algorithm derived performance measures such as instrument position and force applied, as well as outcomes such as the amount of tumour removed and blood loss, and used them to predict the level of expertise of each participant with 90 per cent accuracy. The top-performing algorithm could classify participants using just six performance measures.
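
(In outline, this is a supervised-classification problem: learn a mapping from simulator-derived performance measures to training stage, check it by cross-validation, and rank the measures to find a compact set. The sketch below uses made-up data, assumed feature names and a generic classifier; it is not one of the algorithms benchmarked in the study.)

    # Minimal sketch of classifying training stage from simulator metrics.
    # Data, feature names and classifier choice are illustrative assumptions.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    FEATURES = ["tumour_removed", "blood_loss", "instrument_force",
                "tip_path_length", "time_in_tumour", "max_speed"]

    # Stand-in data: 250 simulated resections, labels 0-3 for the four training stages.
    X = rng.normal(size=(250, len(FEATURES)))
    y = rng.integers(0, 4, size=250)

    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())

    # Rank features so a compact model (e.g. six measures) can be chosen.
    clf.fit(X, y)
    for name, importance in sorted(zip(FEATURES, clf.feature_importances_),
                                   key=lambda t: -t[1]):
        print(f"{name:>18s}  {importance:.3f}")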

This research, published in the Journal of the American Medical Association on Aug. 2, 2019, shows that the fusion of AI and VR neurosurgical simulators can accurately and efficiently assess the performance of surgeon trainees. This means that AI-assisted mentoring systems can be developed that focus on improving patient safety by guiding trainees through complex surgical procedures. These systems can determine areas that need improvement and how the trainee can develop these important skills before surgeons operate on real patients.

"Physician educators are facing increased time pressure to balance their commitment to both patients and learners," says Dr. Rolando Del Maestro, the lead author of the study. "Our study proves that we can design systems that deliver on-demand surgical assessments at the convenience of the learner and with less input from instructors. It may also lead to better patient safety by reducing the chance for human error both while assessing surgeons and in the operating room."

Credit: 
McGill University