Culture

New tool aims to assist military logistics in evacuating noncombatants

Researchers from the U.S. Army and North Carolina State University have developed a computational model that can be used to expedite military operations aimed at evacuating noncombatants, responding to disasters, or providing humanitarian relief.

"What sets this tool apart from other models is that it is designed for use in both planning and during operations," says Brandon McConnell, corresponding author of a paper on the new model and a research assistant professor in NC State's Edward P. Fitts Department of Industrial & Systems Engineering.

In short, the new model is designed to help officials determine what needs to be where and at what time in order to complete an evacuation as quickly as possible.

"In terms of specificity, we're talking about where a given truck will be at any point in time during an operation," McConnell says.
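As a loose illustration of that kind of time-phased output, the sketch below greedily dispatches a small truck fleet to assembly points and records a per-truck timeline. This is a hypothetical toy, not the authors' model; the sites, travel times, and capacities are invented, and a greedy dispatcher is generally not optimal.

```python
import heapq

# Hypothetical illustration (not the NC State model): greedily dispatch a
# small truck fleet to assembly points so every evacuee is picked up as
# early as possible, yielding a per-truck timeline -- "where a given
# truck will be at any point in time."
demand = {"site_A": 120, "site_B": 80, "site_C": 200}   # evacuees waiting
travel = {"site_A": 1.0, "site_B": 2.0, "site_C": 3.0}  # hours from the hub
CAPACITY = 40                                            # evacuees per trip

# each heap entry: (time the truck is next free at the hub, truck id)
trucks = [(0.0, t) for t in range(3)]
heapq.heapify(trucks)
timeline = []  # (truck, depart, site, return) records

for site in sorted(demand, key=travel.get):  # serve nearest sites first
    remaining = demand[site]
    while remaining > 0:
        free_at, truck = heapq.heappop(trucks)
        ret = free_at + 2 * travel[site]     # round trip to the site
        timeline.append((truck, free_at, site, ret))
        heapq.heappush(trucks, (ret, truck))
        remaining -= CAPACITY

makespan = max(ret for *_, ret in timeline)
print(f"trips: {len(timeline)}, evacuation complete at t = {makespan:.1f} h")
```

An optimization model of the kind described in the paper would instead search over such schedules to minimize the completion time, rather than committing to the first feasible dispatch.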

The research was developed with a focus on noncombatant evacuation operations in South Korea. However, it could be used in a wide variety of scenarios.

"There is a tremendous amount of complexity associated with the Army's South Korea noncombatant evacuation mission, and that presents a great opportunity for investigation and improvement," says U.S. Army Captain John Kearby, first author of the paper and a former graduate student at NC State. Kearby is currently an instructor at the U.S. Military Academy at West Point, but previously served as an engineer company commander in South Korea. "The goal of this research was, and is, to encourage the development of better and more robust evacuation plans."

"The existing simulation models are both sophisticated and detailed - they have been valuable tools for helping us study operations like these," McConnell says. "However, they're not designed to respond to rapidly changing scenarios.

"The new model can operate in near-real time, making it operationally relevant. After all, even the best plans need at least minor modifications during execution."

"The tool will need fine-tuning before it can be implemented - it would benefit from a user-friendly interface, for one thing - but it highlights the potential that operational models have for helping the military achieve its objectives both in and out of wartime," says Joseph Myers, mathematical sciences division chief at the Army Research Office, an element of U.S. Army Combat Capabilities Development Command's Army Research Laboratory.

The paper, "Modeling and Transportation Planning for U.S. Noncombatant Evacuation Operations in South Korea," is published in the Journal of Defense Analytics and Logistics. The paper was co-authored by Thom Hodgson, Michael Kay and Russel King of NC State's Fitts Department and the Center for Additive Manufacturing and Logistics.

Credit: 
North Carolina State University

How much does black carbon contribute to climate warming?

video: When carbon burns -- whether it's the charcoal on your barbecue or a forest fire -- soot is released into the atmosphere. But what goes up must come down, so what happens to soot?

Image: 
Michigan Tech

Researchers at Michigan Technological University and Brookhaven National Laboratory, along with partners at other universities, industry, and national labs, have determined that while the shape of particles containing black carbon does have some effect on atmospheric warming, it's important to account for the structural differences in soot particles, as well as how the particles interact with other organic and inorganic materials that coat black carbon as it travels through the atmosphere.

Published today in the Proceedings of the National Academy of Sciences, the article provides a framework that reconciles model simulations with laboratory and empirical observations, and that can be used to improve estimates of black carbon's impact on climate.

Every Black Carbon Particle is Unique

Black carbon's absorption of solar radiation is comparable to that of carbon dioxide. Yet black carbon only remains in the atmosphere for days to weeks, while carbon dioxide can remain in the atmosphere for hundreds of years.

For years, scientists' models have approximated black carbon particles as spheres that frequently become coated by other organic materials. The thought was that as soot particles travel through the atmosphere, the coating has what is called a "lensing effect": the coat focuses light down onto the black carbon, causing increased radiation absorption. And while soot particles are indeed coated in organic materials, that coating is not uniform from particle to particle.

"When you take an image under the microscope, the particles never look perfectly like a sphere with the same coating," said Claudio Mazzoleni, professor of physics at Michigan Tech and one of the article's co-authors. "If you do a numerical calculation about perfect spheres coated by a shell, a model will show an enhanced absorption of the black carbon particles by a factor of up to three."

Empirical studies of black carbon particles demonstrate that absorption is much less than models would suggest, calling into question the effectiveness of the model as well as our understanding of black carbon's climate forcing effect.

Research suggests that the organic coating is not fully spherical; depending on how the organic materials cling to a black carbon particle, the resulting shape can cause the particle to behave very differently from a differently shaped soot particle carrying the same amount of material. Even more important, the amount of coating can vary widely from particle to particle. Both attributes decrease the expected absorption enhancement.
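One piece of the reasoning can be made concrete with a toy calculation: if the absorption enhancement saturates as coating accumulates (a concave function of coating amount), then averaging the enhancement over a heterogeneous population of coatings gives a smaller mean than evaluating it at the mean coating. The enhancement function and coating distribution below are invented stand-ins, not the PNAS model.

```python
import random

# Toy illustration (not the particle-resolved model): a hypothetical
# saturating enhancement E(c), rising from 1 (bare soot) toward ~3
# (thickly coated), evaluated two ways over a heterogeneous population.
def enhancement(c):
    """Invented concave enhancement as a function of coating amount c."""
    return 1.0 + 2.0 * c / (c + 1.0)

random.seed(0)
# heterogeneous coating amounts (exponentially distributed, mean 1)
coatings = [random.expovariate(1.0) for _ in range(100_000)]

mean_c = sum(coatings) / len(coatings)
e_uniform = enhancement(mean_c)  # uniform-coating (mean-field) assumption
e_population = sum(map(enhancement, coatings)) / len(coatings)  # per-particle

print(f"uniform-coating enhancement: {e_uniform:.3f}")
print(f"population-mean enhancement: {e_population:.3f}")
```

Because the invented E(c) is concave, Jensen's inequality guarantees the population mean falls below the uniform-coating value, which is one way coating heterogeneity can pull modeled absorption down toward observations.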

Laura Fierce, an associate atmospheric scientist at Brookhaven National Laboratory, applied the particle-resolved model to account for particle heterogeneity while modeling black carbon.

"Whereas most aerosol models simplify the representation of particle composition, the particle-resolved model tracks the composition of individual particles as they evolve in the atmosphere," Fierce said. "This model is uniquely suited to evaluate error resulting from common approximations applied in global-scale aerosol models."

Less Effect on Climate Warming Than We Thought

Essentially, the researchers have introduced into climate modeling the diversity of organic and inorganic coatings on particles and the non-uniform nature of the particles themselves. By combining an empirical model with laboratory measurements, the model predicted a much lower enhancement in absorption by black carbon than previously thought. The updated modeling also brings the model's output much closer to what has been measured in the field.

"People think black carbon has a very strong warming effect on the atmosphere, which depends on absorption," Mazzoleni said. "If you have larger absorption, it contributes to warming and has greater climate impact. To understand how much black carbon contributes to the warming of climate, we need to understand these details because they can make a difference."

This research provides a path forward for improving predictions of black carbon's radiative effect on climate. Reducing black carbon emissions in the atmosphere can help reduce some of the effects of climate change. The results of this study suggest that a particle's absorption per mass is lower than previously thought, but how much black carbon is forcing atmospheric warming also depends on emissions levels, interactions with clouds and the distance a particle travels. And while reducing sooty emissions is significant, reducing carbon dioxide in the atmosphere is still of utmost importance.

Credit: 
Michigan Technological University

Scientists develop nanoconcrete for casting at subzero temperatures

image: A close-up view on huge scaffolding at the construction site for building the new Municipal Library. At the left the thick layers of scaffolds are necessary to support the forward horizontally front-facade.

Image: 
Fons Heijnsbroek (@fonsheijnsbroek51), Unsplash

Engineers from the Military Training Center at Far Eastern Federal University (FEFU, Vladivostok, Russia), together with colleagues from RUDN University, have developed a concrete mixture with nano-additives for monolithic construction up to ten stories high. The concrete can be cast in very humid climates at temperatures down to minus 5 degrees Celsius, and buildings constructed with it will not require major repairs for 50 years. The related article is published in Construction and Building Materials.

Casting concrete at low temperatures is a serious challenge for the construction industry. If the water in the mix freezes, the concrete loses its fluidity, which disrupts curing and promotes the formation of lumps in the slab. Casting at temperatures below plus five degrees Celsius already requires special technology, and any breach of that technology degrades the resulting monolithic structures, which then deteriorate prematurely.

Engineers of the FEFU Military Training Center (MTC), together with colleagues from RUDN University, proposed introducing into the concrete mix special additives (superplasticizers) whose properties are improved via nanotechnology. The development helps maintain the strength and durability of concrete structures erected during the cold season without making construction more expensive.

"The characteristics of the new nano-mixture meet the needs of civil engineering and generally exceed regulatory requirements. The mixture is suitable for casting civil structures up to 10 stories high under humid conditions, which makes it relevant for construction in humid continental, monsoon, and moderately cold climates. The mixture cures quickly, and the resulting concrete slab has a dense structure with no lumping and a pore size smaller than that of conventional concrete. Thus, the moisture that destroys ordinary concrete is not capable of penetrating the new one. The properties of the concrete slab remain unchanged for 50 years," explained one of the authors, Roman Fediuk, Lt. Col., professor at the FEFU MTC and winner of the XIII All-Russian contest "Engineer of the Year 2018."

The scientist added that the new concrete mixture, like previous developments, contains less cement, part of which is replaced with ash from energy production and sand-crushing screenings, making the concrete more environmentally friendly. At the same time, the technological properties of the new mixture match those of mixtures containing high-grade cement, which makes the development more cost-effective.

The engineers determined the proportions of the modified additives empirically; mathematical models then refined the calculations. A mixture with this verified composition resists frost and avoids the loss of fluidity that, at low temperatures, causes in-slab lumping of concrete (segregation) and a decrease in the strength of the cast structure. Moreover, the engineers used up to 40 percent less water in the mixture while increasing the strength and density of the concrete slab. The high density and small gel pore size are achieved not only through the nano-additives but also through a technology for additional grinding of the concrete particles; the grinder was also developed at FEFU.

The new mixture has already been field-tested: engineers used it to build a five-story parking garage. The concrete cured for 28 days under natural conditions, with temperatures ranging from plus five to minus six degrees, and the result met the stated standards.

As additives, the scientists used components already well known in the construction industry, whose properties they improved with nanoparticles. For example, they strengthened naphthalene formaldehyde resin with silicon dioxide, and the resulting concrete turned out to be stronger and to maintain its operational characteristics longer. Saponified wood resin and sodium nitrate are also components of the mixture.

A scientific school for the development of intelligent composites for special and civil engineering runs at the FEFU MTC. The main idea of the school is to design artificial materials similar to natural ones; for example, concrete should have the strength of natural stone. This approach is being developed within geonics (geomimetics), a field established by Professor Valery Lesovik of V.G. Shukhov Belgorod State Technological University, a corresponding member of the Russian Academy of Architecture and Construction Sciences. Engineers from Moscow, Kazan, and the Russian Far East are working on the method. The new concrete is designed in accordance with the principles of geomimetics, whose ultimate goal is to create new materials for a comfortable human environment.

At the next stage of the research, the scientists plan to develop a concrete mixture for casting at temperatures down to minus fifteen degrees Celsius.

Credit: 
Far Eastern Federal University

Antioxidant supplements do not improve male fertility, NIH study suggests

WHAT:
Antioxidant supplements do not improve semen quality among men with infertility, according to a new study supported by the Eunice Kennedy Shriver National Institute of Child Health and Human Development (NICHD), part of the National Institutes of Health. The study also found that antioxidant supplements likely do not improve pregnancy and live birth rates. The study appears in Fertility and Sterility.

Antioxidant supplements are commercially available to help treat male infertility, but research on their effects on semen quality and rates of pregnancy and live birth is limited. The new study reports on results from the Males, Antioxidants, and Infertility Trial (MOXI), a double-blind, randomized, placebo-controlled clinical study conducted at nine sites across the United States. The study enrolled 171 couples where the male partner had at least one abnormal reading on an analysis evaluating sperm concentration, mobility, shape and DNA quality; the female partners had normal fertility test results. Men received a placebo or an antioxidant supplement containing vitamins C, E and D, selenium, l-carnitine, zinc, folic acid and lycopene for at least three months and up to six months. MOXI was supported by NICHD's Reproductive Medicine Network.

The study team found no statistically significant differences in sperm concentration, mobility, shape and DNA quality between the placebo and antioxidant groups after three months. Furthermore, live birth rates did not seem to differ at six months between the antioxidant (15%) and placebo (24%) groups. However, the authors note that per the study design, recruitment was stopped before it reached the desired number of participants because no benefits were seen in the antioxidant group. Therefore, the authors only had enough participants to evaluate statistical differences in semen quality but not in pregnancy and live birth rates.

According to the authors, MOXI is the largest randomized, placebo-controlled trial to examine the effects of antioxidants, without additional assisted reproductive technology, on male infertility.

WHO:
Study author Esther Eisenberg, M.D., M.P.H., of NICHD's Fertility and Infertility Branch is available for interviews.

REFERENCE:
Steiner AZ et al. The effect of antioxidants on male factor infertility: the Males, Antioxidants, and Infertility (MOXI) randomized clinical trial. Fertility and Sterility DOI: 10.1016/j.fertnstert.2019.11.008 (2020)

Credit: 
NIH/Eunice Kennedy Shriver National Institute of Child Health and Human Development

Containing methane and its contribution to global warming

Methane is a gas that deserves more attention in the climate debate as it contributes to almost half of human-made global warming in the short-term. A new IIASA study shows that it is possible to significantly contribute to reduced global warming through the implementation of available technology that limits methane release to the atmosphere.

According to the study published in the journal Environmental Research Communications, it is possible to achieve reduced global warming in the near term by targeting methane through the fast implementation of technology to prevent its release to the atmosphere. This could mitigate some of the otherwise very costly impacts of climate change that are expected over the next few decades. To achieve the significant reductions in human-caused methane emissions needed to meet the Paris Agreement, however, we need to know exactly where emissions occur and from what sources, so that policymakers can start developing strategies to contain methane and its contribution to global warming.

"To develop policy strategies to mitigate climate change through reductions of global non-CO2 greenhouse gas emissions like methane, we need detailed inventories of the sources and locations of current man-made emissions, build scenarios for expected developments in future emissions, assess the abatement potential of future emissions, and estimate the costs of reducing emissions. In this study, we looked at global methane emissions and technical abatement potentials and costs in the 2050 timeframe," explains study lead author Lena Hoglund-Isaksson.

Using the IIASA Greenhouse Gases - Air Pollution Interactions and Synergies (GAINS) model, the researchers endeavored to find out how well the GAINS bottom-up inventory of methane emissions at country and source-sector level for 1990-2015 matches top-down estimates of the global concentration of methane measured in the atmosphere. In addition, they wanted to see how much methane would be emitted globally until 2050 if we take no further measures to reduce emissions.

The results show that at the global level, the GAINS methane inventory matches the top-down estimate of human-made methane emissions' contribution to the atmospheric concentration of methane quite well. A reasonable match between bottom-up and top-down budgets, both at the global and regional levels, is important for the confidence in bottom-up inventories, which are a prerequisite for policy strategies to be perceived as "certain enough" by stakeholders in climate mitigation.

The authors' analysis revealed a strong increase in emissions after 2010, which confirms top-down measurements of increases in the atmospheric methane concentration in recent years. According to this study, these are explained by increased methane emissions from shale gas production in North America, increased coal mining in countries outside of China, for instance, Indonesia and Australia, and increased generation of waste and wastewater from growing populations and economic development in Asia and Africa. In addition, the findings showed a small but steady increase in emissions from beef and dairy production in Latin America and Africa, highlighting how different the distribution of emission source sectors is across world regions.

The findings further show that without measures to control methane emissions, there would be a global emission increase of about 30% by 2050. While it would technically be possible to remove about 38% of these emissions by implementing available abatement technology, a significant amount of methane would still be released between 2020 and 2050, making it impossible for the world to stay below 1.5°C warming.

With that said, the researchers point out that technical abatement potentials can still be used to achieve considerable reductions in methane emissions in the near term and at a comparably low cost. Between 30% and 50% of future global methane emissions can be removed at a cost below 50 €/t CO2eq. The use of fossil fuels will, however, also have to be phased down to really make a difference. Technical abatement potentials are particularly limited in agriculture, which suggests that these emissions must be addressed through non-technical measures, such as behavioral changes to reduce milk and meat consumption, or institutional and socioeconomic reforms to address smallholder livestock herding as a means of risk management in Africa and South-East Asia.
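The logic of a marginal abatement cost curve can be sketched with a few lines of code: sort measures by cost and sum up the abatement potential available below a given price threshold. The measure names, costs, and potentials below are invented for illustration and are not the GAINS figures; only the aggregate ranges echo the study's headline numbers.

```python
# Illustrative marginal abatement cost (MAC) curve, not GAINS data.
# Hypothetical measures: (name, cost in EUR per t CO2eq, abatement
# potential as a share of projected 2050 methane emissions). The shares
# are chosen so the total matches the study's ~38% technical potential.
measures = [
    ("oil/gas leak detection & repair", 5,  0.12),
    ("coal mine gas recovery",          15, 0.08),
    ("landfill gas capture",            30, 0.10),
    ("wastewater treatment upgrades",   45, 0.05),
    ("livestock feed additives",        80, 0.03),
]

def abatable_below(threshold):
    """Total share of emissions removable at marginal cost <= threshold."""
    return sum(share for _, cost, share in measures if cost <= threshold)

print(f"removable below 50 EUR/t CO2eq: {abatable_below(50):.0%}")
print(f"full technical potential:       {abatable_below(10**6):.0%}")
```

With these invented numbers, 35% of emissions fall below the 50 €/t CO2eq threshold, consistent with the 30-50% range the study reports.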

"There is no one-size-fits-all solution for the whole world. In the Middle East and Africa, for instance, oil production is a major contributor to methane emissions with relatively extensive potentials for emission reductions at low cost. In Europe and Latin America, dairy and beef production are the main sources with relatively limited technical mitigation potentials, while in North America, emissions from shale gas extraction can be significantly contained at a low cost. Our study illustrates just how important it is to have a regional- and sector-specific approach to mitigation strategies," concludes Hoglund-Isaksson.

Credit: 
International Institute for Applied Systems Analysis

Cancer mechanics: How physical cues influence cell migration, metastasis, and treatment

image: The green represents collagen while areas of red represent breast cancer cells. From the research of Bo Sun

Image: 
Jihan Kim (Oregon State University), Yunsheng Cao (UCSD), Herbert Levin (Northeastern University), Wouter-Jan Rappel (UCSD) and Bo Sun (Oregon State University).

Please Note: The 2020 American Physical Society (APS) March Meeting that was to be held in Denver, Colorado from March 2 through March 6 has been canceled. The decision was made late Saturday (February 29), out of an abundance of caution and based on the latest scientific data available regarding the transmission of the coronavirus disease (COVID-19). See our official release on the cancelation for more details.

DENVER, COLO., FEBRUARY 28, 2020--Cancer cells are a product of their environment. The surrounding cells, extracellular matrix, and other features influence disease progression and spread of cancer cells to other parts of the body. Chemical cues like the presence of nutrients and oxygen have been studied for decades, but more recently, researchers have turned their attention to equally important physical cues.

New research on the tumor mechanical microenvironment will be presented at the 2020 American Physical Society March Meeting in Denver. Highlights include a study that looks at how the anisotropy of the extracellular matrix affects cancer cell migration, a novel optical tweezer-based tool that probes mechanical cues, and a model to find the best place within a tumor to inject a chemotherapy drug.

Mechanical Cues Drive Metastasis

Oregon State University researcher Bo Sun will present his findings on the complex interactions between cancer cell migration and the extracellular matrix, a 3D network of collagen, proteins, and other molecules that support the surrounding cells. Specifically, he will discuss the anisotropy of the extracellular matrix, and how a particular alignment of collagen fibers can create a cancer cell superhighway of sorts.

"For most research in the field so far, a dominant paradigm is that rigidity of the tissue environment is the dominant factor that guides cell migration. The cell wants to move to a more rigid region compared to a soft region," said Sun. "We're seeing that a different type of physical property, anisotropy of the environment, is more efficient in terms of guiding cell migration."

For instance, collagen fibers aligned circumferentially around a tumor cause the cancer cells to become trapped by this alignment, whereas fibers arranged radially provide a frictionless highway for cells to migrate outward. Studying the way cells move in their environment--and what factors drive that movement--has implications for understanding how cancer cells infiltrate new areas of the body.

Probing Tissues with Optical Tweezers

Biophysicist Kandice Tanner has repurposed a tool to measure the mechanical cues that may influence how tumor cells disseminate to different organs. The optical tweezer-based technique can probe the physical properties of cells in living animals with microscale resolution.

"Previously, characterization of the mechanical properties of tissues, cells, and extracellular matrix hydrogels was mainly obtained using bulk rheological or nanometer-scale techniques such as atomic force microscopy, and primarily for in vitro systems," said Tanner, an investigator at the National Cancer Institute, part of the National Institutes of Health. "These techniques are useful to assess material properties but do not possess the resolution that is needed to resolve length scales that are compatible with the micron-size protrusions used by cells to respond to external cues."

Tanner and her colleagues employed the technique to measure the viscoelastic properties of tissue in living zebrafish and 3D culture models of breast cancer progression. For the latter project, they used optical trap-based active microrheology to map internal cellular and external extracellular matrix mechanics with near simultaneity. They found that, unlike healthy cells, breast tumor cells do not match their mechanical properties to the surrounding microenvironment.

"As cancer cells migrate from their original primary tumor, they encounter many physical cues before establishing new lesions in different organs such as the patient's bones, brain, liver, or lungs," said Tanner. "We believe that by decoding the role of the physical cues, we can understand why some tumor cells are able to colonize one organ versus the other."

Modeling Cancer Drug Response

Cancer treatment relies heavily on trial and error, which can lead to unnecessary toxicity and cost. Models that predict how a cancer drug will diffuse throughout the tumor offer a possible solution for oncologists. Aminur Rahman develops these kinds of mechanistic models of drug response as a method for oncologists to choose the most effective treatment before administering any medication.

"Our mechanistic models are able to produce dose-response curves that oncologists would see from cell line and drug data pairs," said Rahman, a post-doctoral researcher at Texas Tech University. "We realized that perhaps this research could be used for computer-aided treatment strategies."

He will present the results of multiple projects in a poster presentation. The first investigates a model of drug distribution after injection directly into a solid tumor and its effect on cancer cell death. While the model assumes the tumor is spherical and homogeneous, brain tumors in particular tend to be highly inhomogeneous and anisotropic. The second study develops a more sophisticated computational model for inhomogeneous-anisotropic drug diffusion using real-world diffusion tensor MRI data.
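The homogeneous, spherical case mentioned above can be sketched with a standard explicit finite-difference scheme for radial diffusion. This is a generic textbook illustration, not Rahman's model; the tumor radius, diffusion coefficient, and injection profile are all invented.

```python
# Minimal sketch of drug diffusion in a homogeneous, spherical tumor:
#   dC/dt = D * (1/r^2) d/dr (r^2 dC/dr)
# solved by an explicit finite-difference scheme. All parameters are
# hypothetical; real tumors are inhomogeneous and anisotropic.
R, N = 1.0, 50           # tumor radius (cm) and number of grid intervals
D = 1e-3                 # diffusion coefficient (cm^2/s), invented
dr = R / N
dt = 0.2 * dr * dr / D   # well inside the explicit-scheme stability limit
r = [i * dr for i in range(N + 1)]

# initial condition: a bolus injected at the tumor center
C = [1.0 if ri < 0.1 else 0.0 for ri in r]

def step(C):
    new = C[:]
    for i in range(1, N):
        # flux form of the spherical Laplacian at interior points
        rp, rm = r[i] + dr / 2, r[i] - dr / 2
        lap = (rp**2 * (C[i+1] - C[i]) - rm**2 * (C[i] - C[i-1])) / (r[i]**2 * dr**2)
        new[i] = C[i] + dt * D * lap
    new[0] = new[1]          # symmetry (zero flux) at r = 0
    new[N] = 0.0             # drug cleared at the tumor boundary
    return new

for _ in range(2000):
    C = step(C)

print(f"concentration at r=0 after {2000*dt:.0f} s: {C[0]:.3f}")
```

Replacing the scalar D with a spatially varying tensor is what turns this toy into the inhomogeneous-anisotropic problem the second study addresses with diffusion tensor MRI data.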

"Because of the tumor's inhomogeneity and anisotropy, the center might not be the best place to inject drugs," said Rahman. "We looked at different injection sites, and it was not necessarily the center that would do the trick. In such cases, having a model would help an oncologist know where to inject the drug."

Credit: 
American Physical Society

Taking a bite out of food waste: Scientists repurpose waste bread to feed microbes

As much as a third of food produced for human consumption is wasted or lost globally every year. New research published in Frontiers in Microbiology suggests one way to take a big bite out of food waste is to use bread destined for the dumpster as a medium for cultivating microbial starters for the food industry.

While exact numbers regarding the amount of bread that is thrown away are hard to estimate, it is believed "hundreds of tons are wasted daily worldwide" from spoilage and other factors, including consumer preferences for products like crustless loaves.

The authors write that bread waste creates both economic loss and environmental impacts, as most of the waste ends up in landfills that emit greenhouse gases such as carbon dioxide and methane. The researchers propose repurposing all that discarded dough to feed the very microorganisms needed to kick start fermentation in food industries like bakeries, dairy and wine-making.

"We believe that the introduction of innovative bioprocessing technologies might be the key to unravel the burden of food waste [and] improving sustainability of the agro-food system," said team coordinator Dr. Carlo G. Rizzello at the University of Bari Aldo Moro in Italy.

Rizzello and his colleagues experimented with more than 40 different kinds of growing conditions to find the best combination for various bacteria, yeast and other microorganisms used in food fermentation. This involved discovering the right recipe of bread amount, enzymes and supplemental ingredients, as well as the ideal time and temperature for incubation.

The goal was to create a wasted bread medium (WBM) that would match or beat current production methods that rely on raw materials. And, in fact, the scientists did formulate a secret sauce using 50 percent waste bread that was appetizing to a wide variety of microorganisms, including bacteria used in yogurt production. Crucially, they estimate that the production cost of WBM is about a third that of conventional media.

"The protocol we were able to set up combines both the need for disposing of the huge amount of bread waste with that of cheap sources for media production, while fitting for the cultivation of several food industry starters, and it is patent pending," Rizzello said.

The idea is that the WBM protocols could be easily adopted by industrial bakeries, which currently rely on other companies to provide the starters. These businesses would benefit by "using their own waste to produce the medium and propagate the cultures, without modifying [or] adding any equipment to the existing technology," said lead author Michela Verni, who was responsible for the experimental design of the work.

"The strength of our study strictly relies on how easily applicable the protocol is, and proof of its feasibility is indeed the fact that the process is already scaled up at industrial level," she added. "Nevertheless, WBM offers a possibility for sustainable starter production to all the food industries working in the field of fermented foods and beverages."

Rizzello said WBM has applications beyond simple microbial cultivation. For example, it could be used as a food ingredient itself with a few tweaks to the WBM recipe and fermentation with different starters. Or it could serve as a substrate to feed microbes that produce specific compounds used in food supplements or cosmetics.

While WBM appears to be an effective medium for growing lactic acid bacteria and yeasts, Rizzello said further study is needed to determine if certain components or lack of some micronutrients might affect microbial metabolism in some significant way.

Credit: 
Frontiers

Conspiracy beliefs could increase fringe political engagement, shows new study

Washington, DC - Conspiracies abound in society and can have real-world impacts when they lead people to act, whether that means becoming more engaged politically or less engaged. Previous research linking conspiracy beliefs and political action has produced mixed results. Some studies show people with a conspiracy worldview are more likely to disengage politically, while others show they are more engaged.

New research appearing in Social Psychological and Personality Science finds that, for an average person, conspiracy beliefs lead to more willingness to engage in "non-normative" actions, like illegally blocking a public entryway, while avoiding more typical political engagement, such as voting.

"Once regular people accept the basic premises of a conspiracy worldview, they come to the conclusion that violent means of political engagement are a plausible consequence," says Roland Imhoff, a professor of social and legal psychology at the Johannes Gutenberg University in Germany. Imhoff is lead author of the study, conducted together with Lea Dieterle (University of Cologne) and Pia Lamberty (Johannes Gutenberg University).

"This finding, together with the observation that many radical and terrorist groups employ conspiracy rhetoric in their pamphlets, might suggest that seeing the world as governed by hidden and illegitimate forces is a driving force for radical violent action as it a) seems justified and b) non-violent means seem futile," says Imhoff.

The researchers conducted two experiments, one in Germany (194 people) and another with Amazon Mechanical Turk workers based in the United States (402 people).

In both experiments, people were assigned to imagine being in a particular type of society. Some were assigned a conspiracy-focused description that suggested a few powerful groups controlled the fate of millions, others read an intermediate scenario where people wondered if the media and politicians could be trusted, and another group read about a world view that governments and the media were trustworthy and transparent.

Each person was then asked a set of follow-up questions about what political actions they'd be willing to engage in, from "normative" actions like voting, participating in rallies, or contacting media or politicians, to "non-normative" actions such as destroying property, harming others, or other illegal behaviors.

In both studies, people presented with the high-conspiracy scenario were more willing to engage in non-normative political actions than those presented with the low-conspiracy scenario. Willingness to engage in normative political actions was higher for those who read the low-conspiracy scenario than for the other two groups.

Imhoff notes that these are hypothetical situations, and understanding how things might play out in the real world will require further research.

For future research, Imhoff suggests "we still lack a grip on the likely relevant differentiation between those who like or endorse conspiracies on social media or when we ask them to what extent they agree and those who actively produce, invent and disseminate conspiracy theories. Whether the latter category is of a similar psychological makeup as the former or just utilized conspiracy rhetoric for political goals is still not understood."

Credit: 
Society for Personality and Social Psychology

Ultrafast probing reveals intricate dynamics of quantum coherence

IMAGE: Three excitation pulses with wave vectors k1, k2, and k3 form three corners of a box, with the fourth pulse (the local oscillator, LO) on the fourth corner.

Image: 
FLEET

Ultrafast, multidimensional spectroscopy unlocks macroscopic-scale effects of quantum electronic correlations.

Researchers found that low-energy and high-energy states are correlated in LSCO (lanthanum strontium copper oxide), a layered superconducting material.

Exciting the material with an ultrafast (sub-100-femtosecond) pulse of light creates a quantum coherence in the material.

The strong correlation between the energy of this coherence and the optical energy of the emitted signal indicates a coherent interaction between the states at low and high energy.

This kind of coherent interaction, reported here for the first time, is the root of many intriguing and poorly-understood phenomena displayed by quantum materials.

It is one of the first applications of multidimensional spectroscopy to the study of correlated-electron systems such as high-temperature superconductors.

PROBING QUANTUM MATERIALS

The intriguing magnetic and electronic properties of quantum materials hold significant promise for future technologies.

However, controlling these properties requires an improved understanding of the ways in which macroscopic behaviour emerges in complex materials with strong electronic correlations.

Potentially useful electronic and magnetic properties of quantum materials with strong electronic correlations include the Mott transition, colossal magnetoresistance, topological insulator states, and high-temperature superconductivity.

Such macroscopic properties emerge out of microscopic complexity, rooted in the competing interactions between the degrees of freedom (charge, lattice, spin, orbital, and topology) of electronic states.

While measurements of the dynamics of excited electronic populations have been able to give some insight, they have largely neglected the intricate dynamics of quantum coherence.

In this new study, researchers applied multidimensional coherent spectroscopy to the challenge for the first time, utilising the technique's unique capability to differentiate between competing signal pathways, selectively exciting and probing low-energy excitations.

Researchers analysed the quantum coherence of excitations produced by hitting LSCO (lanthanum, strontium, copper and oxygen) crystals with a sequence of tailored, ultrafast beams of near-infrared light lasting less than 100 femtoseconds.

This coherence has unusual properties, lasts a surprisingly 'long' time of around 500 femtoseconds, and originates from a quantum superposition of excited states within the crystal.
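For context, the oscillation period of a coherence between two states is set by their energy separation, T = h/ΔE. A back-of-envelope sketch (the 50 meV splitting below is purely illustrative, not a value taken from the study):

```python
# Planck's constant in eV*s (CODATA value)
H_EV_S = 4.135667696e-15

def coherence_period_fs(delta_e_ev):
    """Oscillation period (femtoseconds) of a quantum coherence
    between two states separated by delta_e_ev electron-volts."""
    return H_EV_S / delta_e_ev * 1e15

# An illustrative 50 meV splitting gives a period of roughly 83 fs,
# so several oscillations fit within a ~500 fs coherence lifetime
period = coherence_period_fs(0.050)
```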

IMAGE: 2D spectrum showing the energy difference between the states in the quantum superposition, shown before, during and after pulse overlap.

"We found a strong correlation between the energy of this coherence and the optical energy of the emitted signal, which indicates a special coherent interaction between the states at low and high energy in these complex systems," says study author Jeff Davis (Swinburne University of Technology).

Because the number of available excitations affects the band structure of a crystal, the effective energy structure changes transiently during measurement, which links low-energy excitations and optically excited electronic states.

The study demonstrates that multidimensional coherent spectroscopy can interrogate complex quantum materials in unprecedented ways.

As well as representing a major advancement in ultrafast spectroscopy of correlated materials, the work has wider significance in optics/photonics, chemistry, nanoscience, and condensed-matter science.

THE STUDY

"Persistent coherence of quantum superpositions in an optimally doped cuprate revealed by 2D spectroscopy" was published in Science Advances in February 2020.

The authors acknowledge funding by the Australian Research Council (Future Fellowship and Centres of Excellence programs). Work was conducted at the Centre for Quantum and Optical Science (Swinburne University of Technology), Ruhr University (Germany) and University of Oxford (UK).

ULTRAFAST SPECTROSCOPY AT FLEET

Within FLEET, Jeff Davis uses ultrafast spectroscopy to study and control the microscopic interactions in 2D materials and how they lead to macroscopic behaviour.

In FLEET's third research theme, light-transformed materials, systems are temporarily driven out of thermal equilibrium to investigate the qualitatively different physics displayed and new capabilities for dynamically controlling their behaviour.

FLEET is an Australian Research Council-funded research centre bringing together over a hundred Australian and international experts to develop a new generation of ultra-low energy electronics.

Credit: 
ARC Centre of Excellence in Future Low-Energy Electronics Technologies

Exploring coronavirus risk factors and public health concerns

image: JACEP Open is a new official open access journal of the American College of Emergency Physicians (ACEP).

Image: 
ACEP

WASHINGTON, D.C.--Emergency physician-led teams are on the frontlines of coronavirus treatment, prevention and response. JACEP Open, a new official open access journal of the American College of Emergency Physicians (ACEP), explores coronavirus (COVID-19) concerns in two new analyses. The first paper explores risk factors for transmission while the second outlines broad public health concerns amplified during an outbreak.

"The impact of coronavirus is significant but pales in comparison to global influenza," said Dr. Matthew J. Fuller, MD, assistant professor and director of global health for the Division of Emergency Medicine, University of Utah and lead study author. "Lessons learned from past outbreaks are instructive while risk factors for transmission of coronavirus are still being assessed."

There are more confirmed cases of coronavirus thus far, but SARS (severe acute respiratory syndrome) and MERS (Middle East respiratory syndrome), which are in the same virus family, have higher fatality rates and bring on more severe illness. A three percent fatality rate has been reported for coronavirus, compared to 35 percent and 15 percent for MERS and SARS, respectively, according to "Novel Coronavirus 2019: Emergence and Implications for Emergency Care."

Patients admitted to a Wuhan, China hospital with confirmed coronavirus had symptoms like fever (83 to 98 percent) or cough (76 to 82 percent) and roughly one-third had shortness of breath. About one-third of those patients required intensive care, mostly for oxygen support.

Patients at high risk for contracting the virus include anyone with flu-like symptoms who recently traveled to China or came into close contact with somebody who recently traveled to China. No risk is identifiable simply from passing by a person with a confirmed case of coronavirus. Transmission from a person who has an early stage of the virus but does not yet show symptoms has not been confirmed, according to the analysis.

Should that be proven possible, it would mean the novel coronavirus could be transmitted during the incubation period, like chicken pox or measles, the authors note.

As with other respiratory viruses, human-to-human transmission is thought to occur via droplets produced by coughing or sneezing. Facial contact with contaminated surfaces (public water fountains, for example) is believed to contribute to the spread, but is less likely. The highest fatality rates are reported among elderly patients with multiple chronic conditions.

A second paper, "Coronavirus Disease 2019: International Public Health Considerations," discusses economic and social risks posed by this outbreak and others.

"Misinformation can spread just like a virus, obscuring communication from the international health community to medical professionals and the public," said Christopher J. Greene, MD, MPH, assistant professor of global health and international emergency medicine, University of Alabama Birmingham and lead study author. "Everyone would like to avoid a scenario where anxiety drives public behavior change."

In the initial response to an outbreak, global resource concentration can shift away from routine care toward outbreak management. One estimate referenced in the analysis concluded that reductions in service cost more than 10,600 lives during the Ebola outbreak. Indirect costs to affected regions (tourism, commerce) can also be substantial, Dr. Greene notes.

"Effective public communication helps ensure compliance with quarantine directives or other instructions. It's important for health professionals to break through the noise to encourage people, especially those potentially at risk, to take appropriate precautionary measures and heed the recommendations of health professionals," said Dr. Greene.

Credit: 
American College of Emergency Physicians

Astronomy student discovers 17 new planets, including Earth-sized world

image: Sizes of the 17 new planet candidates, compared to Mars, Earth, and Neptune. The planet in green is KIC-7340288 b, a rare rocky planet in the Habitable Zone

Image: 
Michelle Kunimoto

University of British Columbia astronomy student Michelle Kunimoto has discovered 17 new planets, including a potentially habitable, Earth-sized world, by combing through data gathered by NASA's Kepler mission.

Over its original four-year mission, the Kepler satellite looked for planets, especially those that lie in the "Habitable Zones" of their stars, where liquid water could exist on a rocky planet's surface.

The new findings, published in The Astronomical Journal, include one such particularly rare planet. Officially named KIC-7340288 b, the planet discovered by Kunimoto is just 1 ½ times the size of Earth - small enough to be considered rocky, instead of gaseous like the giant planets of the Solar System - and in the habitable zone of its star.

"This planet is about a thousand light years away, so we're not getting there anytime soon!" said Kunimoto, a PhD candidate in the department of physics and astronomy. "But this is a really exciting find, since there have only been 15 small, confirmed planets in the Habitable Zone found in Kepler data so far."

The planet's year is 142 ½ days long. It orbits its star at 0.444 astronomical units (AU; one AU is the distance between Earth and the Sun) - an orbit just larger than Mercury's - and receives about a third of the light Earth gets from the Sun.
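As a back-of-envelope check using the inverse-square law (the implied stellar luminosity below is our inference from the quoted numbers, not a figure from the study), a planet at 0.444 AU receiving a third of Earth's insolation must orbit a star much dimmer than the Sun:

```python
# Relative flux at the planet: F = L_star / d^2
# (L_star in solar luminosities, d in AU, F in units of Earth's insolation)
d_au = 0.444        # orbital distance from the study
f_rel = 1.0 / 3.0   # "about a third of the light Earth gets from the Sun"

l_star = f_rel * d_au ** 2   # implied stellar luminosity, ~0.066 L_sun
```

A star with roughly 7% of the Sun's luminosity is consistent with the cool dwarfs that dominate the Kepler sample.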

Of the other 16 new planets discovered, the smallest is only two-thirds the size of Earth - one of the smallest planets to be found with Kepler so far. The rest range in size up to eight times the size of Earth.

Kunimoto is no stranger to discovering planets: she previously discovered four during her undergraduate degree at UBC. Now working on her PhD at UBC, she used what is known as the "transit method" to look for the planets among the roughly 200,000 stars observed by the Kepler mission.

"Every time a planet passes in front of a star, it blocks a portion of that star's light and causes a temporary decrease in the star's brightness," Kunimoto said. "By finding these dips, known as transits, you can start to piece together information about the planet, such as its size and how long it takes to orbit."
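The transit signature Kunimoto describes can be sketched on a synthetic light curve. The 0.5% dip and its timing below are invented for illustration, but the depth-to-size relation (depth ≈ (Rp/Rs)²) is the standard one:

```python
import numpy as np

# Synthetic light curve: flat stellar brightness with one transit dip
flux = np.ones(1000)
flux[400:420] -= 0.005         # planet blocks 0.5% of the star's light

depth = 1.0 - flux.min()       # transit depth
radius_ratio = np.sqrt(depth)  # Rp/Rs ~ 0.071: planet ~7% of the star's radius
duration = np.sum(flux < 1.0)  # transit spans 20 time samples here
```

Real Kepler photometry is noisy, so actual searches fold many orbits together before measuring the dip; this sketch shows only the geometry of a single clean transit.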

Kunimoto also collaborated with UBC alumnus Henry Ngo to obtain razor-sharp follow-up images of some of her planet-hosting stars with the Near InfraRed Imager and Spectrometer (NIRI) on the Gemini North 8-metre Telescope in Hawaii.

"I took images of the stars as if from space, using adaptive optics," she said. "I was able to tell if there was a star nearby that could have affected Kepler's measurements, such as being the cause of the dip itself."

In addition to the new planets, Kunimoto was able to observe thousands of known Kepler planets using the transit method, and will be reanalysing the exoplanet census as a whole.

"We'll be estimating how many planets are expected for stars with different temperatures," said Kunimoto's PhD supervisor and UBC professor Jaymie Matthews. "A particularly important result will be finding a terrestrial Habitable Zone planet occurrence rate. How many Earth-like planets are there? Stay tuned."

Credit: 
University of British Columbia

How do zebrafish get their stripes? New data analysis tool could provide an answer

image: The new technique can analyze the output of models in a way that quantifies pattern features.

Image: 
Brown University

PROVIDENCE, R.I. [Brown University] -- The iconic stripes of zebrafish are a classic example of natural self-organization. As zebrafish embryos develop, three types of pigment cells move around the skin, eventually jostling into positions that form body-length yellow and blue stripes.

Scientists want to understand the genetic rules that direct this delicate dance, and a new algorithm developed by Brown University mathematicians could help them accomplish that. The algorithm, described this week in Proceedings of the National Academy of Sciences, is able to quantify various attributes of shapes and patterns, enabling scientists to more objectively test ideas about how zebrafish stripes -- and potentially other developmental patterns -- are formed.

"The overarching goal of studying zebrafish stripes is to understand the early development of organisms -- how genes express themselves to form structures and phenotypes," said Bjorn Sandstede, a professor in Brown's Division of Applied Mathematics and senior author of the research. "People have developed simulations to help understand these processes, but a challenge is that you're looking at a few zebrafish or a few images from simulations, and you're essentially eyeballing what the similarities and differences are. We wanted to create something that was automated and more objective."

Of stripes and spots

Zebrafish turn out to be great testbeds for evaluating how genetic changes can influence pattern formation. Their embryos are transparent and develop quickly, which gives scientists the opportunity to study stripe development in great detail. Over the years, researchers have found a number of genetic mutations that alter zebrafish pigment patterns. Some mutations change how straight the stripes are, some introduce little breaks in the stripes, and others create an array of spots rather than stripes. These mutations provide an opportunity to better understand the rules governing stripe formation.

These differing patterns are the result of changes in the way pigment cell types interact with each other and move around during development. To understand the rules these cells follow, scientists have developed computer models that simulate cellular movement and pattern formation. By tweaking the rules governing a simulation and then seeing if the output matches the patterns of real fish, scientists can start to figure out which rules matter.

Sandstede and Alexandria Volkening, who earned her Ph.D. at Brown and is now a postdoctoral researcher at Northwestern University, previously developed just such a simulation, and it has yielded new insights on stripe formation. But the new algorithm described in this latest paper, on which Volkening was a coauthor, provides a new way to evaluate the performance of that model and others, the researchers say.

The shape of data

The new algorithm employs a technique known as topological data analysis.

"This is a newer area of math and statistics that focuses on quantifying shape," said Melissa McGuirl, a graduate student at Brown and the study's lead author. "Essentially, it's a tool that allows us to track connected components and loops, which correspond to shape features representing spots or stripes."

In this case, those connected components are made up of individual pigment cells in images of zebrafish or from simulations of zebrafish stripe development. The algorithm evaluates the extent to which the position of each cell is correlated with others, and thus whether the cells are part of a pattern element -- a stripe, a spot or something else. The beauty of the technique, the researchers say, is that it can quantify patterns on a broad spectrum of spatial scales, from just a few individual cells to the whole fish.
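The connected-components piece of this analysis can be illustrated with a toy calculation at one fixed spatial scale (the full method uses persistent homology, which tracks how these counts change as the scale grows; the cell coordinates and radius below are invented). A simple union-find groups cells that lie within a distance threshold of each other:

```python
import math

def count_components(points, radius):
    """Count clusters of 2D points (cells): two cells are linked
    if they lie within `radius` of each other (union-find)."""
    parent = list(range(len(points)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    for i, (xi, yi) in enumerate(points):
        for j in range(i + 1, len(points)):
            xj, yj = points[j]
            if math.hypot(xi - xj, yi - yj) <= radius:
                ri, rj = find(i), find(j)
                if ri != rj:
                    parent[ri] = rj        # merge the two clusters

    return len({find(i) for i in range(len(points))})

# Two "stripes" of cells plus one isolated cell
cells = [(x, 0.0) for x in range(5)] + [(x, 5.0) for x in range(5)] + [(20.0, 20.0)]
n_components = count_components(cells, radius=1.5)   # the two stripes and the lone cell
```

At this radius the two rows of cells register as separate stripes; growing the radius until they merge is exactly the multi-scale information persistent homology records.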

"What we can do with this is determine a variety of descriptors that allow us to talk about things like how straight or curvy the stripes are, how many breaks there are in the stripes or what average cell-to-cell distances are," Sandstede said. "If there are spots, how many cells are included in each spot? Are they round or more elongated?"

With these more objective feature measures in hand, the researchers can better evaluate how well their model and others are capturing the dynamics of zebrafish pattern formation. And that, the researchers say, could lead to key insights about how genetic instructions manifest themselves in natural structures.

And the technique isn't just limited to zebrafish, Sandstede said.

"It's much more general than just zebrafish pigment cells," he said. "This is designed to quantify patterns and shapes, and it could really do that in any kind of system."

Credit: 
Brown University

Using a cappella to explain speech and music specialization

image: Figure shows original song (bottom left) and its spectrogram (above it, in blue). This spectrogram can be decomposed according to the amount of energy contained in spectral and temporal modulation rates (central panel). Auditory cortex on the right and left sides of the brain (right side of figure) decode melody and speech, respectively, because the melody depends more on spectral modulations and the speech depends more on temporal modulations.

Image: 
Robert Zatorre

Speech and music are two fundamentally human activities that are decoded in different brain hemispheres. A new study used a unique approach to reveal why this specialization exists.

Researchers at The Neuro (Montreal Neurological Institute-Hospital) of McGill University created 100 a cappella recordings, each of a soprano singing a sentence. They then distorted the recordings along two fundamental auditory dimensions - spectral and temporal dynamics - and had 49 participants distinguish the words or the melodies of each song. The experiment was conducted in two groups, of English and French speakers, to enhance reproducibility and generalizability. The experiment is demonstrated here: https://www.zlab.mcgill.ca/spectro_temporal_modulations/

They found that for both languages, when the temporal information was distorted, participants had trouble distinguishing the speech content, but not the melody. Conversely, when spectral information was distorted, they had trouble distinguishing the melody, but not the speech. This shows that speech and melody depend on different acoustical features.
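One crude way to degrade one modulation dimension while sparing the other (a stand-in to convey the idea, not the authors' actual filtering pipeline, and the spectrogram here is random test data) is to smooth a spectrogram along a single axis:

```python
import numpy as np

def smooth_along(spec, axis, width=9):
    """Moving-average along one axis of a spectrogram,
    suppressing modulations in that dimension only."""
    kernel = np.ones(width) / width
    return np.apply_along_axis(
        lambda row: np.convolve(row, kernel, mode="same"), axis, spec)

rng = np.random.default_rng(0)
spec = rng.random((64, 128))   # rows: frequency bins, cols: time frames

temporal_degraded = smooth_along(spec, axis=1)  # blurs temporal modulations (hurts speech)
spectral_degraded = smooth_along(spec, axis=0)  # blurs spectral modulations (hurts melody)
```

Smoothing across time frames removes the fast envelope changes speech relies on, while smoothing across frequency bins removes the fine spectral structure that carries melody, mirroring the double dissociation the study reports.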

To test how the brain responds to these different sound features, the participants were then scanned with functional magnetic resonance imaging (fMRI) while they distinguished the sounds. The researchers found that speech processing occurred in the left auditory cortex, while melodic processing occurred in the right auditory cortex.

Music and speech exploit different ends of the spectro-temporal continuum

Next, they set out to test how degradation in each acoustic dimension would affect brain activity. They found that degradation of the spectral dimension only affected activity in the right auditory cortex, and only during melody perception, while degradation of the temporal dimension affected only the left auditory cortex, and only during speech perception. This shows that the differential response in each hemisphere depends on the type of acoustical information in the stimulus.

Previous studies in animals have found that neurons in the auditory cortex respond to particular combinations of spectral and temporal energy, and are highly tuned to sounds that are relevant to the animal in its natural environment, such as communication sounds. For humans, both speech and music are important means of communication. This study shows that music and speech exploit different ends of the spectro-temporal continuum, and that hemispheric specialization may be the nervous system's way of optimizing the processing of these two communication methods.

Solving the mystery of hemispheric specialization

"It has been known for decades that the two hemispheres respond to speech and music differently, but the physiological basis for this difference remained a mystery," says Philippe Albouy, the study's first author. "Here we show that this hemispheric specialization is linked to basic acoustical features that are relevant for speech and music, thus tying the finding to basic knowledge of neural organization."

Credit: 
McGill University

Study: The opioid crisis may be far worse than we thought

image: New research in the journal Addiction shows that the deaths from opioid-related overdoses may be 28% higher than previously reported. This discrepancy is more pronounced in several states, including Alabama, Mississippi, Pennsylvania, Louisiana, and Indiana, where the estimated number of deaths more than doubles.

Image: 
University of Rochester Medical Center

New research appearing in the journal Addiction shows that the number of deaths attributed to opioid-related overdoses could be 28 percent higher than reported due to incomplete death records. This discrepancy is more pronounced in several states, including Alabama, Mississippi, Pennsylvania, Louisiana, and Indiana, where the estimated number of deaths more than doubles - obscuring the scope of the opioid crisis and potentially affecting programs and funding intended to confront the epidemic.

"A substantial share of fatal drug overdoses is missing information on specific drug involvement, leading to underreporting of opioid-related death rates and a misrepresentation of the extent of the opioid crisis," said Elaine Hill, Ph.D., an economist and assistant professor in the University of Rochester Medical Center (URMC) Department of Public Health Sciences and senior author of the study. "The corrected estimates of opioid-related deaths in this study are not trivial and show that the human toll has been substantially higher than reported, by several thousand lives taken each year."

Hill and her team - including co-authors Andrew Boslett, Ph.D., and Alina Denham, M.S., with URMC - found that almost 72 percent of unclassified drug overdoses that occurred between 1999 and 2016 involved prescription opioids, heroin, or fentanyl - translating into an estimated 99,160 additional opioid-related deaths.

Gaps in Death Records

Hill and Boslett first stumbled upon the discrepancy while studying the economic, environmental, and health impacts of natural resource extraction. Many of the regions hit hardest by the opioid crisis overlap with areas of shale gas development and coal mining. As part of her research, Hill was attempting to determine whether the shale boom improved or exacerbated the opioid crisis. However, as they started collecting data, they discovered that close to 22 percent of all drug-related overdoses were unclassified, meaning the drugs involved in the cause of death were not indicated.

A medical examiner or coroner becomes involved during any sudden and unexpected death of an otherwise healthy person and anyone suspected to have died from an unnatural cause. Under ideal circumstances, the cause of death is identified through a combination of evidence collected at the scene, a toxicological analysis of blood or tissue, and an autopsy. If the cause is determined to be drug-related, either accidental or a suicide, then the specific drugs identified in the person's system are recorded on the death certificate.

However, in practice, this process is expensive and time-consuming, dependent upon the resources and staffing available to the specific medical examiner's office, and potentially influenced by family members due to the stigma associated with opioid use. Additionally, the requirements to serve as a medical examiner or coroner vary nationally. In some states, the office is an elected position with no prerequisite for professional experience or training in forensic pathology.

Underreporting Concentrated in Several States

In the study, Hill and her colleagues obtained death records of individuals identified as having died from drug overdoses from the National Center for Health Statistics, part of the Centers for Disease Control and Prevention. In addition to the cause, the records also include any additional medical issues that might have contributed to the death. Employing a statistical analysis, the researchers were able to correlate the information in the death records of unclassified overdose deaths with contributing causes associated with known opioid-related deaths, such as previous opioid use and chronic pain conditions.
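The imputation idea can be shown with a toy calculation (the flags and counts below are invented, and the study's actual statistical model is more involved): among classified overdoses, estimate the probability of opioid involvement conditional on a contributing-cause flag, then apply those rates to the unclassified records.

```python
# (contributing_cause_flag, opioid_involved) for classified overdose deaths;
# flag 1 might mean a record notes prior opioid use or chronic pain
classified = [(1, True), (1, True), (1, False), (0, False), (0, True), (0, False)]
unclassified_flags = [1, 1, 0]   # unclassified deaths, where only the flag is known

def opioid_rate(flag):
    """Share of classified deaths with this flag that involved opioids."""
    outcomes = [op for f, op in classified if f == flag]
    return sum(outcomes) / len(outcomes)

# Expected number of opioid deaths hidden among the unclassified records
extra = sum(opioid_rate(f) for f in unclassified_flags)   # 2/3 + 2/3 + 1/3
```

Summed over real records, estimates of this kind are what produce the study's corrected state-level death counts.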

While the overall percentage of unclassified deaths declined over time - a trend the researchers speculate reflects a more focused effort by federal, state, and local officials to understand the scope of the crisis - the number remained high in several states. The new estimates show a pronounced increase in opioid-related deaths in states like Alabama, Mississippi, Pennsylvania, Louisiana, and Indiana. In fact, in each of these states, the number of opioid-related deaths increased by more than 100 percent.

In Pennsylvania, for example, the number of reported opioid-related deaths was 12,374, while the study estimates the actual number was 26,586. Consequently, the state's total number of deaths places it behind only California and Florida, states with significantly higher populations, and moves Pennsylvania from fifteenth to sixth in highest per capita death rate in 2016.
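The Pennsylvania figures check out arithmetically against the "more than 100 percent" claim:

```python
reported = 12_374    # reported opioid-related deaths in Pennsylvania
estimated = 26_586   # study's corrected estimate

increase = (estimated - reported) / reported   # ~1.15, i.e. a >100% increase
```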

"The underreporting of opioid-related deaths is very dependent upon location and this new data alters our perception of the intensity of the problem," said Hill. "Understanding the true extent and geography of the opioid crisis is a critical factor in the national response to the epidemic and the allocation of federal and state resources to prevent overdoses, treat opioid use disorders, regulate the prescription of opioid medications, and curb the illegal trafficking of drugs."

Credit: 
University of Rochester Medical Center

Mom's gut microbes affect newborn's metabolism, mouse models suggest

Using mouse models, scientists have discovered that a mother's gut microbiota may shape the metabolism of her offspring by providing environmental cues during pregnancy that fine-tune energy homeostasis in the newborn. These findings suggest targeting the maternal microbiota - for example, by recommending dietary changes - could offer a preemptive strategy to protect offspring from future metabolic disease. A balanced microbiome has been linked to good health, while disruptions or changes to the microbiome have been linked to several diseases and disorders, including obesity, heart disease and diabetes. Though the effect of maternal microbiota on an infant's health has been well-documented, much less is known about how the mother's gut microbes impact the offspring's metabolism at the embryonic stage. Ikuo Kimura and colleagues explored this question in a mouse model, specifically focusing on short-chain fatty acids (SCFAs), microbiota-derived metabolites that fuel cells and mediate communication between gut microbes and organs. They found that SCFAs produced by the pregnant mother's gut microbes dictated the differentiation of the offspring's neural, intestinal and pancreatic cells through cell signaling of GPR41 and GPR43, protein receptors on the surface of fat cells. This developmental process helped offspring maintain balanced energy levels, whereas offspring from mothers that lacked microbes altogether were highly susceptible to metabolic syndromes like obesity and glucose intolerance. One particular SCFA - propionate - played a vital role in preventing the development of metabolic disorder in the offspring, the researchers found. Supplementation with propionate may thus be a possible route for therapy, but the safety and efficacy of this application during pregnancy remain to be determined, says Jane Ferguson in a related Perspective that further details Kimura et al.'s study.

Credit: 
American Association for the Advancement of Science (AAAS)