Tech

Green Deal: Good for a climate-neutral Europe - bad for the planet

image: By importing large quantities of agricultural products, the EU is outsourcing environmental damage, the KIT researchers say.

Image: 
(Photo: Markus Breig, KIT)

Europe is to become the first climate-neutral continent by 2050 - this goal of the "Green Deal" was announced by the EU in late 2019. Carbon emissions are to be reduced, while forestation, sustainable agriculture, environmentally friendly transport, recycling, and renewable energies are to be expanded. In Nature, scientists at Karlsruhe Institute of Technology (KIT) now show that this "Green Deal" might be a bad deal for the planet, because the EU would outsource environmental damage through high imports of agricultural products. The researchers recommend measures to make the deal advance global sustainability. (DOI: 10.1038/d41586-020-02991-1)

The "Green Deal" adopted by the European Commission is to change European agriculture significantly in the next years and to contribute to making Europe the first climate-neutral continent. By 2030, about a quarter of all agricultural areas shall be farmed organically. Use of fertilizers and pesticides shall be reduced by 20 and 50 percent, respectively. In addition, the EU plans to plant 3 billion trees, to restore 25,000 km of rivers, and to reverse the decrease in populations of pollinators, such as bees or wasps. "These measures are important and reasonable," says Richard Fuchs from the Institute of Meteorology and Climate Research - Atmospheric Environmental Research (IMK-IFU), KIT's Campus Alpine in Garmisch-Partenkirchen. "But it will be also necessary to specify foreign trade goals. Otherwise, we will only outsource the problem and continue to damage our planet." The research team compared sustainability conditions abroad with those in Europe and recommended actions for a standardized procedure. 

Sustainability Standards Must Be Defined and Harmonized

According to the study, the European Union imports millions of tons of agricultural products every year. In 2019, one fifth of crops were imported from abroad, as were many meat and dairy products. However, the imports come from countries whose environmental legislation is far less stringent than Europe's. For instance, genetically modified organisms have been subject to strong limitations in EU agriculture since 1999. Still, Europe imports genetically modified soybeans and corn from Brazil, Argentina, the USA, and Canada, the study reveals. "On average, Europe's trading partners use more than twice as much fertilizer as we do. Use of pesticides has also increased in most of these countries," Fuchs says. In his opinion, the problem is that each nation defines sustainability in a different way. Things forbidden in Europe might be permitted elsewhere. "By importing goods from these countries, the EU just outsources environmental damage to other regions and earns the laurels for its green policy at home," the climate researcher points out. 

The KIT scientists recommend urgently harmonizing sustainability standards, strongly reducing the use of fertilizers and pesticides, and avoiding deforestation. "The EU cannot impose its standards on other countries, but it can demand that goods entering the European market meet EU requirements," Richard Fuchs says. 

Evaluation of CO2 Footprint Worldwide and Reduction of Meat Consumption

The researchers point out that Europe's CO2 footprint has to be evaluated on a global basis and then improved. Carbon balancing under the Paris Agreement only covers emissions caused by domestic production, not the emissions caused by producing imported goods abroad. 

Moreover, the scientists advocate reducing consumption of meat and dairy products, which would reduce imports of agricultural products. Domestic production in accordance with adequate standards should be strengthened. For this purpose, areas with low species diversity or areas not yet used for agriculture could be converted. This would reduce deforestation in the tropics, which is mainly caused by the creation of new farming areas. Harvest yields might be increased by the CRISPR gene editing technology, the team says. This technology improves the edible mass, height, and pest resistance of plants without introducing genes from another species. 

"Not all measures are easy to implement. Reorientation of agricultural production, however, would contribute to protecting Europe's food crops against global market fluctuations, disturbances of the supply chain, and some impactsd of climate change," Fuchs says. "Only then will the "Green Deal" be a good deal not only for a climate-neutral Europe, but also for our entire planet."

Credit: 
Karlsruher Institut für Technologie (KIT)

Applying environmental genomics to coral conservation

image: A coral reef in New Caledonia

Image: 
Oliver Selmoni, EPFL

Oceans are a bellwether for the planet's health, absorbing more than 90% of the excess heat trapped in the climate system. They demonstrate the extent to which rising temperatures are threatening coral reefs and other vital ecosystems that support biodiversity. In 2016 and 2017, an abrupt rise in surface temperatures in the Pacific Ocean caused mass bleaching on an unprecedented scale. Australia's Great Barrier Reef was especially hard-hit.

Bleaching occurs when heat stress disrupts the symbiotic relationship between corals and the tiny algae that live inside them, providing a source of nutrients for coral and giving them their color. Persistent bleaching can lead to coral death. In the past two decades, abnormal heatwaves caused entire sections of reef off the coast of Australia - measuring several kilometers in length - to turn white.

Scientists have already found that some reefs are better equipped to cope with recurring heat stress than others. For his thesis research, Oliver Selmoni, a doctoral assistant at EPFL's Laboratory of Geographic Information Systems (LASIG), applied the principles of environmental genomics to characterize this ability to adapt. Selmoni cross-referenced the results of genetic analyses of coral samples with ocean temperature data captured by satellites to determine what made some corals better able to withstand rising temperatures.

Building a study from the ground up

Having applied his method to pre-existing data on a coral species in Japan, Selmoni travelled to New Caledonia to build a new study from the ground up. He collected his own coral samples with the help of IRD scientists based in Nouméa. The findings were published in Scientific Reports on 12 November. "New Caledonia is home to the world's second-longest coral reef, stretching over 1,000 km," says Selmoni. "This relatively compact ecosystem is exposed to dramatic contrasts in environmental conditions, which makes it an ideal candidate for studying climate adaptation."

The study aimed to test two hypotheses. The first is that coral populations learn to adapt to warmer seas after experiencing prolonged heat stress over many years. "The longer higher temperatures persist, the more likely it is that climate-resilient traits will develop and be passed down from generation to generation," explains Selmoni. The second hypothesis relates to connectivity: corals reproduce by releasing larvae into the water, which are then carried in ocean currents. "Corals rely on nearby populations for survival. When a reef is destroyed by environmental stressors or human activity, larvae from elsewhere are needed to kick-start repopulation," he adds.

Establishing marine protected areas

Selmoni's first task was to assess the composition of the marine environment in New Caledonia, using satellite data stretching back 30 years. After selecting 20 sites with the greatest temperature contrasts, he headed into the field to collect samples. "We focused on three flagship Pacific coral species that are susceptible to bleaching and relatively easy to find," he recalls. "It was a huge undertaking: 3,000 km by road and another 1,000 km by boat!" Selmoni shared details of his experience on the EPFL Out There blog.

Using environmental genomics methods at LASIG, he found that the field observations supported his connectivity and adaptation hypotheses. "As expected, we observed a correlation between likelihood of adaptation and prolonged exposure to high heat stress. Conversely, corals in locations that had never experienced heat stress showed no climate-adaptive traits," explains Selmoni.

Looking ahead, the maps developed in the study could be used to establish new marine protected areas (MPAs) - zones where fishing, tourism, industry and other human activities are restricted - in places where, through connectivity, heat-resistant coral strains could populate reefs around the archipelago. Another option could be to select and grow climate-adaptive corals, then transplant them into nearby reefs that are less able to withstand rising temperatures, thereby accelerating the process of natural selection. "Over time, these hardier strains can help rebuild damaged reefs or make existing coral populations more resilient to bleaching," adds Selmoni.

Credit: 
Ecole Polytechnique Fédérale de Lausanne

New maps document big-game migrations across the western United States

LARAMIE, Wyo. - For the first time, state and federal wildlife biologists have come together to map the migrations of ungulates - hooved mammals such as mule deer, elk, pronghorn, moose and bison - across America's West. The maps will help land managers and conservationists pinpoint actions necessary to keep migration routes open and functional to sustain healthy big-game populations.

"This new detailed assessment of migration routes, timing and interaction of individual animals and herds has given us an insightful view of the critical factors necessary for protecting wildlife and our citizens," said USGS Director Jim Reilly.

The new study, Ungulate Migrations of the Western United States: Volume 1, includes maps of more than 40 big-game migration routes in Arizona, Idaho, Nevada, Utah and Wyoming.

"I'm really proud of the team that worked across multiple agencies to transform millions of GPS locations into standardized migration maps," said Matt Kauffman, lead author of the report and director of the USGS Wyoming Cooperative Fish and Wildlife Research Unit. "Many ungulate herds have been following the same paths across western landscapes since before the United States existed, so these maps are long overdue."

The migration mapping effort was facilitated by Department of the Interior Secretary's Order 3362, which has brought greater focus to the need to manage and conserve big-game migrations in the West. It builds on more than two decades of wildlife research enhanced by a technological revolution in GPS tracking collars. The research shows ungulates need to migrate in order to access the best food, which in the warmer months is in the mountains. They then need to retreat seasonally to lower elevations to escape the deep winter snow.

Big-game migrations have grown more difficult as expanding human populations alter habitats and constrain the ability of migrating animals to find the best forage. The herds must now contend with the increasing footprint of fences, roads, subdivisions, energy production and mineral development. Additionally, an increased frequency of droughts due to climate change has reduced the duration of the typical springtime foraging bonanza.

Fortunately, maps of migration habitat, seasonal ranges and stopovers are leading to better conservation of big-game herds in the face of all these changes. Detailed maps can help identify key infrastructure that affects migration patterns and allow conservation officials to work with private landowners to protect vital habitats and maintain the functionality of corridors.

The migration maps also help researchers monitor and limit the spread of contagious diseases, such as chronic wasting disease, which are becoming more prevalent in wild North American cervid populations such as deer, elk and moose.

"Arizona is excited to be part of this effort," said Jim deVos, assistant director for wildlife management with the Arizona Game and Fish Department. "This collaboration has allowed us to apply cutting-edge mapping techniques to decades of Arizona's GPS tracking data and to make those maps available to guide conservation of elk, mule deer and pronghorn habitat."

Many of these mapping and conservation techniques were pioneered in Wyoming. Faced with rapidly expanding oil and gas development, for more than a decade the Wyoming Game and Fish Department and the USGS Cooperative Research Unit at the University of Wyoming have worked together to map corridors to assure the continued movements of migratory herds on federal lands.

Migration studies have also reached the Wind River Indian Reservation, where researchers are collaborating with the Eastern Shoshone and Northern Arapaho Fish and Game to track mule deer and elk migrations and doing outreach to tribal youth. Director Reilly emphasized that the interactions with state agencies and the tribes, especially with the Wind River students, have been a hallmark of this effort and have been remarkably successful.

For example, the mapping and official designation of Wyoming's 150-mile Red Desert-to-Hoback mule deer migration corridor enabled science-based conservation and management decisions. Detailed maps also allowed managers to enhance stewardship by private landowners, whose large ranches are integral to the corridor. Partners funded fence modifications and treatments of cheatgrass and other invasive plants across a mix of public and private segments within the corridor.

"Just like Wyoming, Nevada has long valued our mule deer migrations," said Tony Wasley, director of the Nevada Department of Wildlife. "This effort has provided us with a new level of technical expertise to get these corridors mapped in a robust way. We look forward to using these maps to guide our stewardship of Nevada's mule deer migrations."

In 2018, the USGS and several western states jointly created a Corridor Mapping Team for USGS scientists to work side-by-side with state wildlife managers and provide technical assistance through all levels of government. With coordination from the Western Association of Fish and Wildlife Agencies and the information-sharing and technical support of the team, agency biologists from Arizona, Idaho, Nevada, Utah and Wyoming collaborated to produce migration maps for the five big-game species. In 2019, the Corridor Mapping Team expanded to include mapping work across all states west of the Rocky Mountains.

In addition to managers from the respective state wildlife agencies, the report was coauthored by collaborating biologists from the USDA Forest Service, the National Park Service, and the Bureau of Land Management, among others. The maps themselves were produced by cartographers from the USGS and the InfoGraphics Lab at the University of Oregon.

Credit: 
U.S. Geological Survey

Risk of childhood asthma by caesarean section is mediated through the early gut microbiome

The prevalence of caesarean section has increased globally in recent decades. While the World Health Organisation suggests that the procedure should be performed in less than 15% of births to prevent morbidity and mortality, the prevalence is higher in most countries. Children born by caesarean section have an increased risk of developing asthma and other immune-mediated diseases compared to children born by vaginal delivery. A link between caesarean section and later disease has been suggested to be mediated through microbial effects.

For the first time, in a new study published in Science Translational Medicine, researchers from Copenhagen Prospective Studies on Asthma in Childhood (COPSAC), the University of Copenhagen, the Technical University of Denmark and Rutgers University describe how delivery by caesarean section leads to a skewed gut microbiome and is associated with asthma development in the first six years of life.

Using the well-established Copenhagen Prospective Studies on Asthma in Childhood 2010 (COPSAC2010) mother-child cohort, the researchers analyzed the effects of delivery mode on the gut microbiome at multiple timepoints in the first year of life and explored whether perturbations of the microbiome can explain the delivery mode-associated risk of developing asthma during childhood.

Increased asthma risk was found in children born by caesarean section only if their gut microbiota at age 1 year still carried a caesarean section signature. The very early, though more pronounced, microbial perturbations showed no association with asthma.

"Even though a child is born by caesarean section and has an immense early microbial perturbation, this may not lead to a higher risk of asthma, if the microbiome matures sufficiently before age 1 year," says Jakob Stokholm, senior researcher and first author on the study.

He continues: "Our study proposes the perspective of restoring a caesarean section perturbed microbiome and thereby perhaps prevent asthma development in a child, who is otherwise in high risk. This study provides a mechanism for the known link between C-section birth and heightened risk of asthma: it is a one-two punch-abnormal early microbiota and then failure to mature."

Søren J. Sørensen, professor at the University of Copenhagen, adds:

"This study has implications for understanding the microbiota's role in asthma development after delivery by caesarean section and could in the future potentially lead to novel prevention strategies and targeted, efficient microbiota manipulation in children who had early perturbations of the microbiome."

Credit: 
University of Copenhagen

Landslide along Alaskan fjord could trigger tsunami

A glacier that had held an Alaskan slope in place for centuries is melting, releasing the soil beneath in what can be described as a slow-motion landslide, researchers say. But there's also the possibility of a sudden, catastrophic landslide that could cause a devastating tsunami.

In a study published last week, scientists noted that the slope on Barry Arm fjord in Prince William Sound in southcentral Alaska slid some 120 meters from 2010 to 2017. These are some of the first measurements to quantify how the slope is slipping there.

"We are measuring this loss of land before the tsunami occurs," said Chunli Dai, lead author of the paper and a research scientist at The Ohio State University's Byrd Polar and Climate Research Center.

The study was published in Geophysical Research Letters.

Landslides on slopes near glaciers generally occur when glacial ice melts, a phenomenon occurring more rapidly around the world because of climate change. Landslides can prompt tsunamis by sending massive amounts of dirt and rocks into nearby bodies of water. Such a landslide happened in 2017 in western Greenland, prompting a tsunami that killed four people.

Scientists estimate that a landslide at Barry Arm fjord could be about eight times larger than that Greenland landslide.

If the entire slope collapsed at once, the researchers found, tsunami waves could reach communities throughout the sound, which are home to hundreds of people and visitors, including fishermen, tourists and members of an indigenous Alaskan group called the Chugach.

For this study, researchers used satellite data to measure and monitor the size of the glacier that had covered the Barry Arm slope, and to measure the amount of land that had already been displaced, which was found to be directly linked to the Barry Arm glacier's melting. Then, they built models to identify the potential landslide risk.

The data showed that, from 1954 to 2006, Barry Glacier thinned by less than a meter per year. But after 2006, the melt rapidly increased, so that the glacier was thinning by about 40 meters per year. The glacier retreated rapidly from 2010 to 2017, the researchers found. The land's "toe" -- the bottom point of the falling slope -- had butted against the glacier in 2010. By 2017, that toe was exposed, and butted up against the water in Prince William Sound.

The researchers modeled potential tsunami scenarios and found that, if the land along that slope collapsed at once, the resulting tsunami would generate currents of 25 to 40 meters per second -- enough to cause significant damage to large cruise and cargo ships and fishing boats, and to overwhelm kayakers, all of which frequent Prince William Sound.

Waves could reach 10 meters in the nearby town of Whittier. The tsunami could disrupt fiber optic service to parts of Alaska, the researchers noted -- two of the five submarine fiber optic lines to Alaska run below Prince William Sound. And oil from the 1989 Exxon Valdez oil spill still lingers in sediment in Prince William Sound, meaning it is possible that a tsunami could send that oil back into the environment.

"If the slope fails at once, it would be catastrophic," said Dr. Bretwood Higman, a geologist with Ground Truth Alaska and co-author of the study.

When and if that massive landslide occurs depends on geology, climate and luck. An earthquake, prolonged rains, thawing permafrost or snowmelt could trigger one, the researchers said. (A 2018 earthquake in Alaska did not trigger a landslide, the researchers noted.)

"People are working on early-detection warnings, so if a landslide happens, people in nearby communities might at least get a warning," said Anna Liljedahl, an Alaska-based hydrologist with Woodwell Climate Research Center, and another co-author. "This kind of research might help with building those early-warning systems."

Credit: 
Ohio State University

Advanced atomic clock makes a better dark matter detector

image: Cartoon depicting a clock looking for dark matter

Image: 
Hanacek/NIST

JILA researchers have used a state-of-the-art atomic clock to narrow the search for elusive dark matter, an example of how continual improvements in clocks have value beyond timekeeping.

Older atomic clocks operating at microwave frequencies have hunted for dark matter before, but this is the first time a newer clock operating at higher optical frequencies, together with an ultra-stable oscillator to ensure steady light waves, has been harnessed to set more precise bounds on the search. The research is described in Physical Review Letters.

Astrophysical observations show that dark matter makes up most of the "stuff" in the universe but so far it has eluded capture. Researchers around the world have been looking for it in various forms. The JILA team focused on ultralight dark matter, which in theory has a teeny mass (much less than a single electron) and a humongous wavelength--how far a particle spreads in space--that could be as large as the size of dwarf galaxies. This type of dark matter would be bound by gravity to galaxies and thus to ordinary matter.

Ultralight dark matter is expected to create tiny fluctuations in two fundamental physical "constants": the electron's mass, and the fine-structure constant. The JILA team used a strontium lattice clock and a hydrogen maser (a microwave version of a laser) to compare their well-known optical and microwave frequencies, respectively, to the frequency of light resonating in an ultra-stable cavity made from a single crystal of pure silicon. The resulting frequency ratios are sensitive to variations over time in both constants. The relative fluctuations of the ratios and constants can be used as sensors to connect cosmological models of dark matter to accepted physics theories.

The JILA team established new limits on a floor for "normal" fluctuations, beyond which any unusual signals discovered later might be due to dark matter. The researchers constrained the coupling strength of ultralight dark matter to the electron mass and the fine-structure constant to be on the order of 10^-5 (1 in 100,000) or less, the most precise measurement ever of this value.

JILA is jointly operated by the National Institute of Standards and Technology (NIST) and the University of Colorado Boulder.

"Nobody actually knows at what sensitivity level you will start to see dark matter in laboratory measurements," NIST/JILA Fellow Jun Ye said. "The problem is that physics as we know it is not quite complete at this point. We know something is missing but we don't quite know how to fix it yet."

"We know dark matter exists from astrophysical observations, but we don't know how the dark matter connects to ordinary matter and the values we measure," Ye added. "Experiments like ours allow us to test various theory models people put together to try to explore the nature of dark matter. By setting better and better bounds, we hope to rule out some incorrect theory models and eventually make a discovery in the future."

Scientists are not sure whether dark matter consists of particles or oscillating fields affecting local environments, Ye noted. The JILA experiments are intended to detect dark matter's "pulling" effect on ordinary matter and electromagnetic fields, he said.

Atomic clocks are prime probes for dark matter because they can detect changes in fundamental constants and are rapidly improving in precision, stability and reliability. The cavity's stability was also a crucial factor in the new measurements. The resonant frequency of light in the cavity depends on the length of the cavity, which can be traced back to the Bohr radius (a physical constant equal to the distance between the nucleus and the electron in a hydrogen atom). The Bohr radius is also related to the values of the fine structure constant and electron mass. Therefore, changes in the resonant frequency as compared to transition frequencies in atoms can indicate fluctuations in these constants caused by dark matter.
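As a rough scaling sketch (not taken from the paper, and assuming fixed values of the reduced Planck constant and the speed of light), the chain of dependence described above can be written as:

```latex
a_0 = \frac{\hbar}{m_e c\,\alpha},
\qquad
\nu_{\mathrm{cav}} \propto \frac{c}{L} \propto \frac{1}{a_0}
\quad\Longrightarrow\quad
\frac{\delta\nu_{\mathrm{cav}}}{\nu_{\mathrm{cav}}}
  \approx \frac{\delta m_e}{m_e} + \frac{\delta\alpha}{\alpha}
```

In words: the cavity length scales with the Bohr radius of the atoms in the silicon crystal, so any slow drift or oscillation in the fine-structure constant or the electron mass would show up as a correlated shift of the cavity's resonant frequency relative to the atomic transition frequencies.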

Researchers collected data on the strontium/cavity frequency ratio for 12 days with the clock running 30% of the time, resulting in a data set 978,041 seconds long. The hydrogen maser data spanned 33 days with the maser running 94% of the time, resulting in a 2,826,942-second record. The hydrogen/cavity frequency ratio provided useful sensitivity to the electron mass although the maser was less stable and produced noisier signals than the strontium clock.

JILA researchers collected the dark matter search data during their recent demonstration of an improved time scale -- a system that incorporates data from multiple atomic clocks to produce a single, highly accurate timekeeping signal for distribution. As the performance of atomic clocks, optical cavities and time scales improves in the future, the frequency ratios can be re-examined with ever-higher resolution, further extending the reach of dark matter searches.

"Any time one is running an optical atomic time scale, there is a chance to set a new bound on or make a discovery of dark matter," Ye said. "In the future, when we can put these new systems in orbit, it will be the biggest 'telescope' ever built for the search for dark matter."

Credit: 
National Institute of Standards and Technology (NIST)

NIST designs a prototype fuel gauge for orbit

image: Many satellites perform highly important and lucrative tasks, but some may be decommissioned with fuel still in the tank due to the current methods of measuring fuel quantity. Fuel gauges with higher accuracy could help ensure that satellites stay operational for longer and more is made of their time in orbit.

Image: 
NASA Jet Propulsion Laboratory

Liquids aren't as well behaved in space as they are on Earth. Inside a spacecraft, microgravity allows liquids to freely slosh and float about.

This behavior has made fuel quantity in satellites difficult to pin down, but a new prototype fuel gauge engineered at the National Institute of Standards and Technology (NIST) could offer an ideal solution. The gauge, described in the Journal of Spacecraft and Rockets, can digitally recreate a fluid's 3D shape based on its electrical properties. The design could potentially provide satellite operators with reliable measurements that would help prevent satellites from colliding and keep them operational for longer.

"Every day that a satellite stays in orbit amounts to probably millions of dollars of revenue," said Nick Dagalakis, a NIST mechanical engineer and co-author of the study. "The operators want to utilize every drop of fuel, but not so much that they empty the tank."

Letting a satellite's tank run dry could leave it stranded in its original orbit with no fuel to avoid smashing into other satellites and producing dangerous debris clouds.

To reduce the probability of collision, operators save the last few drops of fuel to eject satellites into a graveyard orbit, hundreds of kilometers away from functioning spacecraft. They may be wasting fuel in the process, however.

For decades, gauging fuel in space has not been an exact science. One of the most frequently relied upon methods entails estimating how much fuel is being burned with each thrust and subtracting that amount from the volume of fuel in the tank. This method is quite accurate at the start when a tank is close to full, but the error of each estimate carries on to the next, compounding with every thrust. By the time a tank is low, the estimates become more like rough guesses and can miss the mark by as much as 10%.
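As a rough illustration of why this bookkeeping approach drifts (hypothetical numbers, not NIST's or any operator's actual model), the sketch below subtracts a slightly mis-modeled amount for each burn; the absolute error accumulates thrust after thrust, so the relative error on a nearly empty tank grows large even though each individual estimate is only off by a percent or two.

```python
import random

def bookkeeping_fuel(initial_kg, burns_kg, bias=0.01, noise=0.02, seed=1):
    """Estimate remaining fuel by subtracting a modeled amount for each burn.

    Each modeled burn is slightly off (a small systematic bias plus random
    noise), and no measurement ever corrects the running total, so the
    estimate drifts further from the truth with every thrust.
    """
    rng = random.Random(seed)
    true_kg = est_kg = initial_kg
    for burn in burns_kg:
        true_kg -= burn                                    # what actually left the tank
        est_kg -= burn * (1 + bias + rng.gauss(0, noise))  # what the model thinks left
    return true_kg, est_kg

true_kg, est_kg = bookkeeping_fuel(500.0, [0.5] * 900)     # 900 small thrusts
print(f"true: {true_kg:.1f} kg  estimated: {est_kg:.1f} kg  "
      f"error relative to remaining fuel: {abs(est_kg - true_kg) / true_kg:.0%}")
```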

Without reliable measurements, operators may be sending satellites with fuel still in the tank into an early retirement, potentially leaving a considerable amount of money on the table.

The concept of the new gauge -- originally devised by Manohar Deshpande, a technology transfer manager at NASA Goddard Space Flight Center -- makes use of a low-cost 3D imaging technique known as electrical capacitance volume tomography (ECVT).

Like a CT scanner, ECVT can approximate an object's shape by taking measurements at different angles. But instead of shooting X-rays, electrodes emit electric fields and measure the object's ability to store electric charge, or capacitance.

Deshpande sought the expertise of Dagalakis and his colleagues at NIST -- who had previous experience fabricating capacitance-based sensors -- to help make his designs a reality.

In the NanoFab clean room at NIST's Center for Nanoscale Science and Technology, the researchers produced sensor electrodes using a process called soft lithography, in which they printed patterns of ink over copper sheets with a flexible plastic backing. Then, a corrosive chemical carved out the exposed copper, leaving behind the desired strips of metal, Dagalakis said.

The team lined the interior of an egg-shaped container modeled after one of NASA's fuel tanks with the flexible sensors. Throughout the inside of the tank, electric fields emitted by each sensor can be received by the others. But how much of these fields end up being transmitted depends on the capacitance of whatever material is inside the tank.

"If you have no fuel, you have the highest transmission, and if you have fuel, you're going to have a lower reading, because the fuel absorbs the electromagnetic wave," Dagalakis said. "We measure the difference in transmission for every possible sensor pair, and by combining all these measurements, you can know where there is and isn't fuel and create a 3D image."

To test out what the new system's fuel gauging capabilities might look like in space, the researchers suspended a fluid-filled balloon in the tank, mimicking a liquid blob in microgravity.

Many liquids commonly used to propel satellites and spacecraft, such as liquid hydrogen and hydrazine, are highly flammable in Earth's oxygen-rich atmosphere, so the researchers opted to test something more stable, Dagalakis said.

At Deshpande's recommendation, they filled the balloons with a heat transfer fluid -- normally used for storing or dissipating thermal energy in industrial processes -- because it closely mimicked the electrical properties of space fuel.

The researchers activated the system and fed the capacitance data to a computer, which produced a series of 2D images mapping the location of fluid throughout the length of the tank. When compiled, the images gave rise to a 3D rendition of the balloon whose diameter differed from the actual balloon's diameter by less than 6%.

"This is just an experimental prototype, but that is a good starting point," Dagalakis said.

If further developed, the ECVT system could help engineers and researchers overcome several other challenges presented by liquid's behavior in space.

"The technology could be used to continuously monitor fluid flow in the many pipes aboard the International Space Station and to study how the small forces of sloshing fluids can alter the trajectory of spacecraft and satellites," Deshpande said.

Credit: 
National Institute of Standards and Technology (NIST)

COVID-19 shutdown effect on air quality mixed

image: UD's Cristina Archer is a professor in the College of Earth, Ocean and Environment, with a joint appointment between the Physical Ocean Science and Engineering (POSE) program of the School of Marine Science and Policy and the Department of Geography.

Image: 
Photo by Evan Krape

In April 2020, as Delaware and states across the country adopted social distancing measures to deal with the public health crisis caused by the coronavirus (COVID-19), University of Delaware professor Cristina Archer recalled having a bunch of people tell her that the skies looked bluer than usual.

This simple observation led Archer to investigate an important and complicated research question: Did the social distancing measures adopted in the United States, and the resulting lower number of people using various means of transportation, cause an improvement in air quality across the country?

Unfortunately, unlike those observed clear, blue skies, the answer is a bit murky.

"Just because people stayed at home and just because they drove less, it didn't necessarily mean that the air quality was better," said Archer, professor in the School of Marine Science and Policy in UD's College of Earth, Ocean and Environment (CEOE). "By some parameters the air quality was better, but by some other parameters, it was not. And it actually got worse in numerous places."

The results of the study were recently published in the Bulletin of Atmospheric Science and Technology. The study was led by researchers at UD, Penn State and Columbia University.

From UD, coauthors include doctoral students Maryam Golbazi and Nicolas Al Fahel. Other coauthors on the study are Guido Cervone, professor of geography, and meteorology and atmospheric science at Penn State, and Carolynne Hultquist, a former doctoral student in Cervone's laboratory and now a postdoctoral researcher at the Earth Institute of Columbia University.

To assess air quality, Archer focused on nitrogen dioxide and fine particulate matter, specifically PM2.5. The two pollutants are federally regulated and are both primary and secondary pollutants, meaning they can be emitted directly into the atmosphere or formed indirectly through chemical reactions.

Nitrogen dioxide is emitted during fuel combustion by all motor vehicles and airplanes, while particulate matter is emitted by airplanes and, among motor vehicles, mostly by diesel vehicles, such as commercial heavy-duty diesel trucks. Both are also emitted by fossil-fuel power plants, although particulate matter comes mostly from coal power plants.

"Nitrogen dioxide is a good indicator of traffic," said Archer. "We had evidence already from some preliminary papers in China that nitrogen dioxide had dropped significantly over the areas of China where a lockdown was in place."

The researchers compared measurements of these two pollutants in April 2020 against those in April over the five previous years -- from 2015-2019.

They chose April because pretty much every state had some kind of social distancing measure in place by April 1, which led to changes in lifestyle.

"Even in the states where there were not that many infections, there was a change in the mobility of people," said Archer.

To quantify social distancing, the researchers used a mobility index calculated and distributed by Descartes Labs, a predictive intelligence company that compiles large data sets from around the world. The company's algorithms estimate the maximum distance each person travels in a day by recording the user's location multiple times a day through selected apps on their smartphones.
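A minimal sketch of what such a maximum-distance mobility metric could look like for a single device on a single day (hypothetical coordinates; Descartes Labs' actual pipeline aggregates and anonymizes data across millions of devices):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in kilometers."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def max_daily_distance_km(pings):
    """Farthest a device strayed from its first recorded location of the day."""
    lat0, lon0 = pings[0]
    return max(haversine_km(lat0, lon0, lat, lon) for lat, lon in pings)

# hypothetical GPS pings (lat, lon) for one smartphone over one day
day = [(39.68, -75.75), (39.70, -75.74), (39.74, -75.55), (39.68, -75.75)]
print(f"mobility index for the day: {max_daily_distance_km(day):.1f} km")
```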

"One of the big uncertainties with trying to forecast future air quality is how the atmosphere will respond to lower emissions of certain pollutants," said Cervone. "COVID-19 gave us some insights into the effects of lower emission rates on the environment. We had this unique situation that showed us what happens if people stop driving."

In addition, they used 240 ground monitoring sites to measure nitrogen dioxide and 480 for particulate matter, as well as satellite data from the National Aeronautics and Space Administration's (NASA) Ozone Monitoring Instrument (OMI) to measure nitrogen dioxide across the total tropospheric column, the lowest layer of Earth's atmosphere.

Overall, they found large, statistically significant decreases of nitrogen dioxide at 65% of the monitoring sites, with the NASA OMI satellite data showing an average drop of 13% in 2020 over the entire country when compared to the mean of the previous five years.

The particulate matter concentrations, however, were not significantly lower in 2020 than in the past five years. They were also more likely to be higher than lower in April 2020 when compared to the previous five years.

"Not surprisingly, we found that the air quality improved in terms of nitrogen dioxide at all of the stations. So as soon as people started to stay home, traffic was reduced and the air quality was better for nitrogen dioxide," said Archer. "But when we looked at particulate matter, there was almost no difference. There was no improvement on average with respect to particulate matter and at 24% of the sites, April 2020 was worse than the previous five years for particulate matter concentrations."

Golbazi echoed these sentiments saying that the researchers "were expecting that the reduction in transportation would reduce nitrogen dioxide concentrations. However, our results about PM2.5 were surprising because we learned that this scale of reduction in human mobility did not really reduce the PM2.5 concentrations."

This is significant because while nitrogen dioxide is a precursor to other pollutants, such as tropospheric ozone and the nitric acid in acid rain, which eventually lead to negative impacts on human health, particulate matter is dangerous on its own.

Fine particulate matter is an air pollutant that consists of microscopic particles that pose a great risk to human health because they can directly penetrate into human lungs, the bloodstream and even the heart.

Archer said that they are working on a hypothesis as to why particulate matter might have been elevated while nitrogen dioxide decreased.

One possible explanation is that particulate matter is emitted by diesel vehicles, which handle most of the nation's deliveries. While gasoline-fueled passenger vehicle traffic was down in April, diesel truck and regular freight traffic didn't change significantly.

In addition, heating individual homes emits more particulate matter than heating office spaces.

"In the office, you're very unlikely to have a stove or a fireplace, whereas at home you are," said Archer. "And in April 2020, even though in many states it's already the warm season, in the Northeast, we had an incredibly cold April. So particulate matter is probably responding to those residential heating changes as well as the normality or even above normality of freight traffic and diesel vehicles."

Archer said the next steps in the research are to perform more studies to confirm their hypothesis. In addition, the work will be presented at the upcoming virtual American Geophysical Union fall meeting.

As for those bright, sunny days observed in Delaware at the start of the shutdown in April, Archer said that it was probably just a matter of perception.

"People were saying to me, 'It seems to me the skies are bluer,' but they were not," she said. "It was probably a coincidence that there were some sunny and clear days. In Delaware, actually, our skies were more polluted in April 2020 because of higher-than-average particulate matter concentrations."

Credit: 
University of Delaware

Scientists pinpoint two new potential therapeutic targets for rheumatoid arthritis

image: Dr Achilleas Floudas, front right, and Professor Ursula Fearon, left, with members of Trinity College Dublin's Molecular Rheumatology Group.

Image: 
Trinity College Dublin

A collaborative team of scientists has pinpointed two new potential therapeutic targets for rheumatoid arthritis - a painful inflammatory disease that affects an estimated 350 million people worldwide.

Rheumatoid arthritis (RA) is the most common form of inflammatory arthritis, affecting 1-2% of the world's population. It is characterised by progressive joint inflammation, damage and disability, which severely impacts a patient's quality of life. There is currently no cure.

Contrary to popular belief, RA is not a "disease of the elderly". Disease onset occurs in adults between 35-45 years of age, and it also afflicts children.

"B cells" are key cells of the immune system, which are responsible for the production of antibodies that fight infections. However, in RA, these B cells--for reasons not yet fully understood--fail to recognise friend from foe and thus attack the joints. This leads to the tell-tale joint inflammation that causes such pain in patients.

In the new study, just published in the international journal JCI Insight, the collaborative team made two key discoveries.

Led by Dr Achilleas Floudas and Professor Ursula Fearon from Trinity College Dublin's Molecular Rheumatology group in the School of Medicine, the team discovered a new cell population that is especially troublesome in people living with RA, and also learned how these cells accumulate in the joints.

Collectively, their work puts two potential new therapeutic targets for RA on the radar. Dr Floudas said:

"We discovered a novel population of B cells in the joints of patients with RA, and these cells are more inflammatory and invasive than those we knew before. Their damaging effects rely on the production of specific coded messages, in the form of proteins called cytokines and energy pathways within the cells, which essentially maintain their activation. Basically, they 'switch on', cause inflammation, and are maintained within the environment of the inflamed joint.

"We also discovered a new mechanism by which these B cells accumulate in the joint, by pinpointing the protein that seems to be responsible for attracting them to the joints. As a result, we now have two new potential targets for people living with RA. We are some way away from a therapeutic solution but if we can find a way of targeting these B cells and/or the protein that attracts them to the joints, we can one day hope to develop a therapy that could impact positively on millions of people living with RA."

Credit: 
Trinity College Dublin

New study outlines steps higher education should take to prepare a new quantum workforce

A new study outlines ways colleges and universities can update their curricula to prepare the workforce for a new wave of quantum technology jobs. Three researchers, including Rochester Institute of Technology Associate Professor Ben Zwickl, suggested steps that need to be taken in a new paper in Physical Review Physics Education Research after interviewing managers at more than 20 quantum technology companies across the U.S.

The study's authors from University of Colorado Boulder and RIT set out to better understand the types of entry-level positions that exist in these companies and the educational pathways that might lead into those jobs. They found that while the companies still seek employees with traditional STEM degrees, they want the candidates to have a grasp of fundamental concepts in quantum information science and technology.

"For a lot of those roles, there's this idea of being 'quantum aware' that's highly desirable," said Zwickl, a member of RIT's Future Photon Initiative and Center for Advancing STEM Teaching, Learning and Evaluation. "The companies told us that many positions don't need to have deep expertise, but students could really benefit from a one- or two-semester introductory sequence that teaches the foundational concepts, some of the hardware implementations, how the algorithms work, what a qubit is, and things like that. Then a graduate can bring in all the strength of a traditional STEM degree but can speak the language that the company is talking about."

The authors said colleges and universities should offer introductory, multidisciplinary courses with few prerequisites that will allow software engineering, computer science, physics, and other STEM majors to learn the core concepts together. Zwickl said providing quantum education opportunities to students across disciplines will be important because quantum technology has the opportunity to disrupt a wide range of fields.

"It's a growing industry that will produce new sensors, imaging, communication, computing technologies, and more," said Zwickl. "A lot of the technologies are in a research and development phase, but as they start to move toward commercialization and mass production, you will have end-users who are trying to figure out how to apply the technology. They will need technical people on their end that are fluent enough with the ideas that they can make use of it."

Zwickl's participation in the project was supported in part by funding RIT received from the NSF's Quantum Leap Challenge Institutes program. As a co-PI and lead on education and workforce development for the proposal, he said he is hoping to apply many of the lessons learned from the study to RIT's curriculum. He is in the process of developing two new introductory RIT courses in quantum information science as well as an interdisciplinary minor in the field.

Credit: 
Rochester Institute of Technology

Soccer players' head injury risk could be reduced with simple adjustments to the ball

video: The average soccer player heads the ball 12 times in a game, and each header carries up to 100g acceleration, enough to cause serious brain damage. A Purdue study led by Prof. Eric Nauman quantifies how inflating balls to lower pressures, and subbing them out when they get wet, has the potential to reduce the forces associated with head injury by about 20%.

Image: 
Purdue University/Jared Pike

WEST LAFAYETTE, Ind. -- Up to 22% of soccer injuries are concussions that can result from players using their heads to direct the ball during a game.

To reduce the risk of injury, a new study recommends limiting how hard a ball hits the head by inflating balls to lower pressures and subbing them out when they get wet.

The study, conducted by Purdue University engineers, found that inflating balls to pressures on the lower end of ranges enforced by soccer governing bodies such as the NCAA and FIFA could reduce forces associated with potential head injury by about 20%.

But if the ball gets too wet, it can quickly surpass the NCAA weight limit for game play and still produce a nasty impact, the researchers said.

"If the ball has too high of a pressure, gets too waterlogged, or both, it actually turns into a weapon. Heading that ball is like heading a brick," said Eric Nauman, a Purdue professor of mechanical engineering and basic medical sciences with a courtesy appointment in biomedical engineering.

Soccer governing bodies already regulate ball pressure, size, mass and water absorption at the start of a game, but Nauman's lab is the first to conduct a study evaluating the effects of each of these ball parameters on producing an impact associated with potential neurophysiological changes.

The results are published in the journal PLOS One. The researchers discuss the work in a video on YouTube at https://youtu.be/3b_19wW7K6A.

The study also evaluated ball velocity, finding that this variable actually contributes the most to how hard a ball hits. But ball pressure and water absorption would be more realistic to control.

"You can't control how hard a player kicks a ball. There are other ways to decrease those forces and still have a playable game," Nauman said.

A professional soccer player heads the ball about 12 times over the course of a single game and 800 times in games over an entire season, past studies have shown.

The lower end of NCAA and FIFA pressure ranges, which the researchers discovered could help reduce the ball's peak impact force, already aligns with pressures specified by the manufacturer on the ball. These specifications would provide an easy way to know if a pressure is low enough to reduce risk of head injury.

"The study really sheds light on the issue of how the weight and impact of the ball can change under different conditions. Sports governing bodies and manufacturers can use this research to further reduce the risk of lasting brain functional or structural injury as a result of head impacts accrued through soccer game play," said Francis Shen, a professor of law at the University of Minnesota whose research focuses on the intersection of sports concussions and the legal system.

Nauman and Shen met through the Big Ten-Ivy League Traumatic Brain Injury Collaboration, a multi-institutional research effort to better understand the causes and effects of sport-related concussion and head injuries.

In this study, Nauman's lab tested three soccer ball sizes - a 4, a 4.5 and a 5 - by kicking them against a force plate in a lab. Even though only size 5 balls are used by professional adults, the researchers also tested the smaller 4 and 4.5 sizes, which are played by kids under the age of 12, to evaluate how much the size of a ball contributes to peak impact force.

The study included 50 trials for each ball size at four different pressures, ranging from 4 psi to 16 psi. This range includes pressures below standard manufacturing specifications and near the limit of soccer governing body regulations.

Purdue graduate student Nicolas Leiva-Molano did 200 kicks per ball size for a total of 600 kicks.

To test water absorption, the researchers submerged each ball size for 90 minutes - the duration of a game regulated by soccer governing bodies. They weighed and rotated each ball every 15 minutes.

Within the first 15 minutes, a size 5 soccer ball had already exceeded the allowable weight gain cited in NCAA soccer rules.

Based on this study's findings, a size 4.5 soccer ball is the safest to play in terms of forces contributed by pressure, mass and water absorption. But reducing pressure and limiting water absorption made a difference for all three ball sizes.

"This was a very simple experiment. But there just hasn't been much data out there on these issues, and that's a huge problem," Nauman said.

The next step would be to replicate this experiment outside of the lab, ideally in partnership with a high school or college athletic conference, which would allow the researchers to study the effects of ball hits at different parameters before and after a season.

"There are lots of examples in sports where organizations have changed the rules to make the game safer. This new study suggests a simple way to further those efforts for safer equipment and game play," Shen said.

Credit: 
Purdue University

Research produces intense light beams with quantum correlations

image: Pump laser for production of quantum-correlated light beams

Image: 
Marcelo Martinelli / IF-USP

The properties of quantum states of light are already leveraged by such highly sophisticated leading-edge technologies as those of the latest sensitivity upgrades to LIGO, the Laser Interferometer Gravitational-Wave Observatory, deployed to detect gravitational waves since September 2015, or the encryption keys used for satellite on-board security.

Both solutions use crystals as noise-free optical amplifiers. However, the use of atomic vapors has been considered a more efficient alternative that enhances the accessibility of non-classical light states.

“We show that oscillators based on these atomic amplifiers can generate intense beams of light with quantum correlations,” Marcelo Martinelli, a researcher in the University of São Paulo’s Physics Institute (IF-USP), told Agência FAPESP. Martinelli is a co-author of an article published in Physical Review Letters describing the main results to date of a Thematic Project for which he is the principal investigator and which is supported by FAPESP.

Both crystals and atomic vapors can be used to produce quantum correlated pairs of light beams. Investigating the behavior of these sources is a challenge. The behavior of light below a certain level of power resembles that of the light produced by a bulb. Above a certain threshold, its characteristics are similar to those of a laser. “It’s as if the crystals or atomic vapor converted the light from a lamp into laser light. It’s easier to investigate this transition in the atomic medium than the crystalline medium since more intense beams can be produced in an atomic medium,” Martinelli said.

Optical cavities are used for this purpose. By controlling cavity geometry and atomic vapor temperature, Martinelli and collaborators were able to produce photon coupling in more open cavities.

“This offered two advantages in comparison with the old crystal-based cavities – more quantum efficiency so that the number of photons supplied by the output window easily surpassed the number of photons lost to the environment, and a chance to investigate more subtle details of the transition between light with heterogeneous frequencies and the production of intense laser-like beams. It was as if we had opened a window on to the quantum dynamics of the phase transition,” Martinelli said.

Potential applications include high-precision metrology with manipulation of the quantum noise in light and information encoding via quantum entanglement.

Credit: 
Fundação de Amparo à Pesquisa do Estado de São Paulo

Advancing fusion energy through improved understanding of fast plasma particles

image: Physicist Laura Xin Zhang with figures from her paper.

Image: 
Collage by Elle Starkman/PPPL Office of Communications.

Unlocking the zig-zagging dance of hot, charged plasma particles that fuel fusion reactions can help to harness on Earth the fusion energy that powers the sun and stars. At the U.S. Department of Energy's (DOE) Princeton Plasma Physics Laboratory (PPPL), an experimentalist and two theorists have developed a new algorithm, or set of computer rules, for tracking volatile particles that could advance the arrival of a safe, clean and virtually limitless source of energy.

Close interaction

"This is a success story about close interaction between theorists and experimentalists that shows what can be done," said Hong Qin, a principal theoretical physicist at PPPL. He and Yichen Fu, a theoretical graduate student whom he advises, collaborated on the algorithm with Laura Xin Zhang, an experimental graduate student and lead author of a paper that reports the research in the journal Physical Review E. Qin and Fu coauthored the paper.

Fusion powers the sun and stars by combining light elements in the form of plasma -- the state of matter composed of free electrons and atomic nuclei, or ions, that makes up 99 percent of the visible universe -- to release massive amounts of energy. Scientists around the world are seeking to produce controlled fusion on Earth as an ideal source for generating electricity.

The new PPPL algorithm helps track fast charged particles in the plasma. The particles could, for example, stem from the injection of high-energy neutral beams that are broken down, or "ionized" in the plasma and collide with the main plasma particles. "We care about this because we want to understand how these fast particles influence the plasma," Zhang said.

Neutral beams play many roles when broken down into fast plasma particles. "We use them to do all sorts of things," Zhang said. "They can heat and drive current in the plasma. Sometimes they create plasma instabilities and sometimes they reduce them. Our simulations are all part of understanding how these particles behave."

First a problem

When Zhang first tried simulating the fast particles, she ran into a problem. She used a classic algorithm that failed to conserve energy during what is called pitch-angle scattering, the deflection of particles as they collide. Such scattering is often observed in fusion plasma when electrons collide with ions that are roughly 2,000 times heavier, in collisions akin to ping-pong balls bouncing off basketballs.

For Zhang, the problem "was similar to trying to simulate the orbit of a planet," she recalled. Just as the energy of an orbit does not change, "you want an algorithm that conserves the energy of the scattered plasma particles," she said.

Conserving that energy is critical, said Qin, whom Zhang consulted. "If an algorithm that simulates the process does not conserve the energy of the particles, the simulation cannot be trusted," he said. He thus devised an alternative method, an explicitly solvable algorithm that conserves the energy of the particles, which Zhang went on to try.
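
To see why energy conservation is the crux, note that pitch-angle scattering should only rotate a particle's velocity vector, leaving its speed, and hence its kinetic energy, unchanged. The sketch below is a simplified illustration of that idea, not the PPPL algorithm reported in Physical Review E: a naive stochastic kick lets the speed drift upward over many collisions, whereas an update built from explicit rotations preserves it exactly.

```python
import numpy as np

rng = np.random.default_rng(1)
nu, dt, n_steps = 1.0, 1e-2, 1_000   # illustrative collision rate, step size, step count

def random_perp_unit(v):
    """Random unit vector perpendicular to v."""
    r = rng.normal(size=3)
    r -= r.dot(v) / v.dot(v) * v          # remove the component along v
    return r / np.linalg.norm(r)

def naive_step(v):
    """Naive kick: add a random transverse increment to v.
    The direction diffuses, but |v| (and hence the energy) drifts."""
    kick = np.sqrt(nu * dt) * rng.normal() * np.linalg.norm(v)
    return v + kick * random_perp_unit(v)

def rotation_step(v):
    """Energy-conserving kick: rotate v by a small random angle about an
    axis perpendicular to v, so |v| is preserved to machine precision."""
    theta = np.sqrt(nu * dt) * rng.normal()
    axis = random_perp_unit(v)
    # Rodrigues' rotation formula (last term vanishes because axis is perpendicular to v)
    return (v * np.cos(theta)
            + np.cross(axis, v) * np.sin(theta)
            + axis * axis.dot(v) * (1 - np.cos(theta)))

v_naive = np.array([1.0, 0.0, 0.0])
v_rot = np.array([1.0, 0.0, 0.0])
for _ in range(n_steps):
    v_naive = naive_step(v_naive)
    v_rot = rotation_step(v_rot)

print("naive    |v| after scattering:", np.linalg.norm(v_naive))  # drifts away from 1
print("rotation |v| after scattering:", np.linalg.norm(v_rot))    # stays at 1.0
```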

" I'm an experimentalist at heart and my approach to problems is to try it," she said. "So I ran a bunch of simulations and did all kinds of numerical experiments that showed the algorithm worked better than the classic algorithm that failed to conserve energy." However, the alternative method could not be proven theoretically.

Qin next handed the problem to graduate student Fu, who put together a clever mathematical proof of the correctness of the algorithm that could become a step to further solutions.

"The algorithm we developed is for a simplified model," Zhang said. "It drops several terms that will be important to include. But I am charging ahead and aiming to apply the algorithm we've developed to new plasma physics problems."

Credit: 
DOE/Princeton Plasma Physics Laboratory

Exoskeleton-assisted walking improves mobility in individuals with spinal cord injury

image: For this study, two types of exoskeletons were used by participants with spinal cord injury - Ekso GT, shown here, and Rewalk.

Image: 
Kessler Foundation

East Hanover, NJ. November 12, 2020. Exoskeletal-assisted walking is safe, feasible, and effective in individuals disabled by spinal cord injury, according to the results of a federally funded multi-site randomized clinical trial. The article, "Mobility skills with exoskeletal-assisted walking in persons with SCI: Results from a three-center randomized clinical trial" (doi: 10.3389/frobt.2020.00093), was published August 4, 2020 in Frontiers in Robotics and AI. It is available open access at https://www.frontiersin.org/articles/10.3389/frobt.2020.00093/full

The authors are Eun-Kyoung Hong, Pierre Asselin, MS, Steven Knezevic, Stephen Kornfeld, DO, and Ann M. Spungen, EdD, of the James J. Peters VA Medical Center, Gail Forrest, PhD, of Kessler Foundation, Peter Gorman, MD, and William Scott, PhD, of the University of Maryland School of Medicine, and Sandra Wojciehowski, PT, of Craig Hospital and Kessler Foundation. The study was conducted at three sites: James J. Peters VA Medical Center, Bronx, NY; Kessler Foundation, West Orange, NJ; and the University of Maryland.

Study investigators sought to establish guidelines for clinical exoskeletal-assisted walking programs for individuals with spinal cord injury. Their goal was to determine the number of exoskeleton training sessions needed by individuals with varied mobility deficits to gain adequate exoskeletal assisted walking skills and attain velocity milestones. Two powered exoskeletons were used in the study: the Ekso GT (Ekso Bionics), and ReWalk (ReWalk Robotics).

The 50 participants included individuals with tetraplegia and paraplegia, both motor complete and incomplete. In this randomized controlled trial, their performance was measured over a total of 36 sessions. Participants were randomized to Group 1 (exoskeletal-assisted walking) or Group 2 (usual activity) for 12 weeks; each group then crossed over to the other study arm. After 12, 24, and 36 sessions, their progress was measured with the 10-meter walk test (10MWT, in seconds), the 6-minute walk test (6MWT, in meters), and the Timed Up and Go test (TUG, in seconds).

The majority of participants mastered the ability to ambulate effectively with the assistance of the exoskeleton, according to Dr. Forrest, director of the Tim and Caroline Reynolds Center for Spinal Stimulation, and associate director of the Center for Mobility and Rehabilitation Engineering Research at Kessler Foundation. After 12 sessions, 31 (62%), 35 (70%), and 36 (72%) participants achieved the milestones established for the 10MWT, 6MWT, and TUG, respectively. After 36 sessions, the results improved, with 40 (80%), 41 (82%), and 42 (84%) of participants meeting the criteria for the 10MWT, 6MWT, and TUG, respectively.
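
As a back-of-the-envelope illustration of how these outcomes are tabulated, the short Python snippet below recomputes the reported proportions and converts a 10MWT time into an average walking velocity. The milestone thresholds and the example participant time are hypothetical placeholders, not the criteria defined in the trial.

```python
# Illustrative only: the per-participant time below is hypothetical;
# the trial's actual velocity milestones are defined in the published paper.
N_PARTICIPANTS = 50

# Counts of participants meeting the milestones after 36 sessions (from the release).
achieved_36 = {"10MWT": 40, "6MWT": 41, "TUG": 42}
for test, n in achieved_36.items():
    print(f"{test}: {n}/{N_PARTICIPANTS} = {100 * n / N_PARTICIPANTS:.0f}%")

def velocity_10mwt(seconds: float) -> float:
    """Average walking velocity (m/s) over the 10-meter walk test."""
    return 10.0 / seconds

# e.g. a hypothetical participant completing the 10MWT in 25 s
print(f"velocity: {velocity_10mwt(25.0):.2f} m/s")
```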

"Participants showed improvement regardless of level of injury, completeness, or duration of injury," noted Dr. Forrest, "indicating that exoskeletons can be used to improve mobility across a broad spectrum of individuals with neurological deficits caused by spinal cord injury. Our results can be used to guide the application of exoskeletons to spinal cord injury rehabilitation, and the timely acquisition of skills for the safe use of these devices for rehabilitation and community use."

Credit: 
Kessler Foundation

New prediction algorithm identifies previously undetected cancer driver genes

Irvine, CA - November 12, 2020 - A new study, led by researchers from the University of California, Irvine, has deepened the understanding of epigenetic mechanisms in tumorigenesis and revealed a previously undetected repertoire of cancer driver genes. The study was published this week in Science Advances.

Using a new prediction algorithm, called DORGE (Discovery of Oncogenes and tumor suppressoR genes using Genetic and Epigenetic features), researchers were able to identify novel tumor suppressor genes (TSGs) and oncogenes (OGs), particularly those with rare mutations, by integrating the most comprehensive collection of genetic and epigenetic data.

"Existing bioinformatics algorithms do not sufficiently leverage epigenetic features to predict cancer driver genes, despite the fact that epigenetic alterations are known to be associated with cancer driver genes," said senior author Wei Li, PhD, the Grace B. Bell chair and professor of bioinformatics in the
Department of Biological Chemistry at the UCI School of Medicine. "Our computational algorithm integrates public data on epigenetic and genetic alternations, to improve the prediction of cancer driver genes."

Cancer results from an accumulation of key genetic alterations that disrupt the balance between cell division and apoptosis. Genes with "driver" mutations that affect cancer progression are known as cancer driver genes, and can be classified as TSGs or OGs based on their roles in cancer progression.
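
Conceptually, this kind of driver-gene prediction can be framed as supervised classification over a gene-by-feature matrix that mixes mutation-derived and epigenetic features. The sketch below is a generic, simplified stand-in with made-up feature names and random data, not the published DORGE pipeline:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n_genes = 500

# Hypothetical per-gene features: genetic (mutation-derived) plus
# epigenetic (e.g. methylation or histone-mark summaries).
X = np.column_stack([
    rng.poisson(3, n_genes),   # missense mutation count (genetic)
    rng.poisson(1, n_genes),   # truncating mutation count (genetic)
    rng.random(n_genes),       # promoter methylation level (epigenetic)
    rng.random(n_genes),       # histone-mark peak breadth score (epigenetic)
])

# Labels: 0 = neutral gene, 1 = tumor suppressor gene (TSG).
# An analogous model would score oncogenes (OGs) separately.
y = rng.integers(0, 2, n_genes)

model = GradientBoostingClassifier(random_state=0)
scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
print("cross-validated AUROC:", scores.mean())  # ~0.5 here, since the labels are random
```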

This study demonstrated how cancer driver genes, predicted by DORGE, included both known cancer driver genes and novel driver genes not reported in current literature. In addition, researchers found that the novel dual-functional genes, which DORGE predicted as both TSGs and OGs, are highly enriched at hubs in protein-protein interaction (PPI) and drug/compound-gene networks.

"Our DORGE algorithm, successfully leveraged public data to discover the genetic and epigenetic alterations that play significant roles in cancer driver gene dysregulation," explained Li. "These findings could be instrumental in improving cancer prevention, diagnosis and treatment efforts in the future."

Credit: 
University of California - Irvine