Tech

A tree stump that should be dead is still alive; here's why

image: This image shows the Kauri tree stump in the study.

Image: 
Sebastian Leuzinger / iScience

Within a shrouded New Zealand forest, a tree stump keeps itself alive by holding onto the roots of its neighboring trees, exchanging water and resources through the grafted root system. New research, publishing July 25 in iScience, details how surrounding trees keep tree stumps alive, possibly in exchange for access to larger root systems. The findings suggest a shift from the perception of trees as individuals towards understanding forest ecosystems as "superorganisms."

"My colleague Martin Bader and I stumbled upon this kauri tree stump while we were hiking in West Auckland," says corresponding author Sebastian Leuzinger, an associate professor at the Auckland University of Technology (AUT). "It was odd, because even though the stump didn't have any foliage, it was alive."

Leuzinger and Bader, first author and an AUT senior lecturer, decided to investigate how the nearby trees were keeping the tree stump alive by measuring water flow in both the stump and the surrounding trees belonging to the same species. What they found is that the water movement in the tree stump was strongly negatively correlated with that in the other trees.
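To make the finding concrete, the kind of relationship the researchers describe can be sketched in a few lines: two water-flow (sap-flow) series, one for a neighbouring tree and one for the stump, that move in opposite directions. The series below are synthetic stand-ins, not the study's measurements.

```python
# Minimal sketch: a negative correlation between stump and neighbour sap flow.
# The data are synthetic; the study used field measurements of water movement.
import numpy as np

hours = np.arange(48)                        # two days of hourly readings
rng = np.random.default_rng(0)

# Neighbour transpires in daytime; the stump's flow rises when the neighbours rest.
neighbour_flow = np.maximum(np.sin((hours % 24 - 6) / 24 * 2 * np.pi), 0)
stump_flow = 0.2 - 0.15 * neighbour_flow + rng.normal(0, 0.01, hours.size)

r = np.corrcoef(neighbour_flow, stump_flow)[0, 1]
print(f"correlation between stump and neighbour flow: {r:.2f}")   # strongly negative
```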

These measurements suggest that the roots of the stump and surrounding conspecific trees were grafted together, Leuzinger says. Root grafts can form between trees once a tree recognizes that a nearby root tissue, although genetically different, is similar enough to allow for the exchange of resources.

"This is different from how normal trees operate, where the water flow is driven by the water potential of the atmosphere," Leuzinger says. "In this case, the stump has to follow what the rest of the trees do, because since it lacks transpiring leaves, it escapes the atmospheric pull."

But while root grafts are common between living trees of the same species, Leuzinger and Bader were interested in why a living kauri tree would want to keep a nearby stump alive.

"For the stump, the advantages are obvious--it would be dead without the grafts, because it doesn't have any green tissue of its own," Leuzinger says. "But why would the green trees keep their grandpa tree alive on the forest floor while it doesn't seem to provide anything for its host trees?"

One explanation, Leuzinger says, is that the root grafts formed before one of the trees lost its leaves and became a stump. The grafted roots expand the root systems of the trees, allowing them to access more resources such as water and nutrients, as well as increase the stability of the trees on the steep forest slope. As one of the trees stops providing carbohydrates, this may go unnoticed and thus allow the "pensioner" to continue its life on the backs of surrounding, intact trees.

"This has far-reaching consequences for our perception of trees - possibly we are not really dealing with trees as individuals, but with the forest as a superorganism," Leuzinger says.

During a drought, for example, trees with less access to water might be connected to those with more access to water, allowing them to share the water and increase their chances of survival. However, this interconnectivity could also allow for the rapid spread of diseases such as kauri dieback, Leuzinger says.

To better understand how root systems are formed between kauri stumps and living trees, Leuzinger says he hopes to find more instances of these types of stumps and to explore root grafting in intact trees, which will help expand their scope of research.

"This is a call for more research in this area, particularly in a changing climate and a risk of more frequent and more severe droughts," Leuzinger says. "This changes the way we look at the survival of trees and the ecology of forests."

Credit: 
Cell Press

Generic mobile phone chargers escalate risk of burn, electrocution

image: Generic phone charger escalates risk of burn, electrocution.

Image: 
<i>Annals of Emergency Medicine</i>

Electric currents generated by mobile phone chargers, particularly those from lower-cost generic manufacturers, are causing serious injuries. Generic mobile phone chargers are less likely to meet established safety and quality tests than their brand-name counterparts, according to analysis and case studies in Annals of Emergency Medicine.

"Generic phone chargers can cause burns or electrocutions," said Carissa Bunke, MD, a pediatric resident physician with University of Michigan C.S. Mott Children's Hospital and lead study author. "Teens and adolescents are particularly at risk of injury due to their frequent mobile device use. They should be advised to not sleep with their phones or mobile devices charging in bed and avoid leaving the charger plugged in when it is not connected to a phone."

In one case cited, a patient was thrown from his bed by electric current. Another involved a 19-year-old woman injured when the end of a charger touched her necklace, transmitting electric current and causing second-degree burns.

The analysis notes that for a study conducted by Electrical Safety First in the United Kingdom, Apple provided 64 generic chargers for safety testing. Fifty-eight percent of these generic chargers failed the electric strength test, indicating a breakdown of the insulation barrier. Another test cited in the analysis evaluated 400 generic iPhone chargers for electric shock safety risks. Of these, twenty-two samples were immediately damaged during the testing process and only three samples passed an electric strength test, a 99 percent failure rate.
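The cited percentages follow from simple arithmetic on the sample sizes given in the text; a throwaway check:

```python
# Reproducing the failure rates quoted above from the reported sample sizes.
apple_sample = 64                       # generic chargers supplied for the UK test
apple_failed = round(0.58 * apple_sample)
print(f"{apple_failed} of {apple_sample} failed the electric strength test")

generic_sample = 400                    # generic iPhone chargers in the second test
generic_passed = 3
failure_rate = (generic_sample - generic_passed) / generic_sample
print(f"failure rate: {failure_rate:.0%}")   # ≈ 99 percent
```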

"Even with a low-voltage device, if the current is high, then the electric shock can be severe," Dr. Bunke said.

Generally, patients with these types of injuries require medication to manage their pain and follow-up at their primary care provider or the burn center. In most instances, patients are checked for irregular heart rhythm or related side effects. Severe cases could involve extensive tissue damage or deep burns that require skin grafts. Complications from these types of injuries could include muscle breakdown, trouble breathing or airway damage, or cardiovascular injuries.

Credit: 
American College of Emergency Physicians

Physicists from IKBFU create metallic alloy for magnetic refrigerator

image: Alexander Kamantsev, Immanuel Kant Baltic Federal University.

Image: 
Immanuel Kant Baltic Federal University

Physicists at the Laboratory of Novel Magnetic Materials of the Immanuel Kant Baltic Federal University study magnetic materials and magnetostructural phase transitions with the aim of creating a new magnetic cooling technology. They have studied the properties of manganese-arsenic alloys, which have magnetocaloric characteristics.

It is well known that common refrigerators use freon, which is not environmentally friendly: the gas damages the ozone layer and contributes to the greenhouse effect.

Alexander Kamantsev, a researcher of the Laboratory of Novel Magnetic Materials (IKBFU):

We have discovered that the manganese-arsenic alloy is one of the best to use in the technology of solid state magnetic cooling at room temperature.

The results of the study were published in the high-ranking scientific journal Journal of Alloys and Compounds. Co-authors of the study are researchers from the Immanuel Kant Baltic Federal University and their colleagues from the Donetsk Institute for Physics and Engineering named after A. A. Galkin, the MISIS National University of Science and Technology, the Indian Institute of Technology Madras, Wrocław University of Science and Technology and the Kotelnikov Institute of Radioengineering and Electronics of the Russian Academy of Sciences.

Credit: 
Immanuel Kant Baltic Federal University

High-performance flow batteries offer path to grid-level renewable energy storage

A low-cost, high-performance battery chemistry developed by University of Colorado Boulder researchers could one day lead to scalable grid-level storage for wind and solar energy that could help electrical utilities reduce their dependency on fossil fuels.

The new innovation, described today in the journal Joule, outlines two aqueous flow batteries, also known as redox flow batteries, which use chromium and organic binding agents to achieve exceptional voltage and high efficiencies. The components are abundant in nature, offering future promise for cost-effective manufacturing.

"We're excited to report some of highest performing battery chemistries ever, beyond previous limits," said Michael Marshak, senior author of the study and an assistant professor in CU Boulder's Department of Chemistry. "The materials are low-cost, non-toxic and readily available."

Renewable energy sources provide a growing share of U.S. electrical production, but currently lack a large-scale solution for storing harvested energy and re-deploying it to meet demand during periods when the sun isn't shining and the wind isn't blowing.

"There are mismatches between supply and demand on the energy grid during the day," said Marshak, who is also a fellow in the Renewable and Sustainable Energy Institute (RASEI). "The sun might meet the grid's needs in the morning, but demand tends to peak in the late afternoon and continue into the evening after the sun has set. Right now, utility companies have to fill that gap by quickly revving up their coal and natural gas production, just like you'd take a car from zero to sixty."

Although lithium-ion batteries can provide power for smaller-scale applications, you would need millions of them to back up even a small fossil fuel power plant for an hour, Marshak says. While the lithium-ion chemistry is effective, it is ill-suited to meet the capacity of an entire wind turbine field or solar panel array.

"The basic problem with lithium ion batteries is that they don't scale very well," Marshak said. "The more solid material you add, the more resistance you add and then all of the other components need to increase in tandem. So in essence, if you want twice the energy, you need to build twice the batteries and that's just not cost-effective when you're talking about this many megawatt hours."

Flow batteries have been identified as a more promising avenue. Aqueous batteries keep their active ingredients separated in liquid form in large tanks, allowing the system to distribute energy in a managed fashion, similar to the way a gas tank provides steady fuel combustion to a car's engine when you push the pedal.

While there are some examples of flow batteries operating consistently for decades (such as in Japan), they have struggled to gain a broad foothold in commercial and municipal operations due in part to their unwieldy size, high operating costs and comparably low voltage.

"The size is less of an issue for grid-scale systems, because it would just be attached to an already large structure," Marshak said. "What matters is cost, and that's what we wanted to improve on."

The researchers went back to basics, re-examining flow battery chemistries that had been studied years ago, but abandoned. The key turned out to be combining organic binding agents, or chelates, with chromium ions in order to stabilize a potent electrolyte.

"Some people have taken this approach before, but hadn't paid enough attention to the binding agents," said Brian Robb, lead author of the new study and a doctoral student in the Department of Chemical and Biological Engineering (CHBE). "You need to tailor the chelate for the metal ion and we did a lot of work finding the right one that would bind them tightly."

Marshak, Robb and co-author Jason Farrell found that a customized chelate known as PDTA creates a "shield" around the chromium electron, preventing water from hampering the reactant and allowing one of the battery cells to disperse 2.13 volts--nearly double the operational average for a flow battery.
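As a rough illustration of why that voltage matters, the energy stored per litre of electrolyte scales directly with cell voltage. The sketch below uses the reported 2.13 V but assumes a 1 M active-species concentration and single-electron transfer purely for illustration; these are not figures from the study.

```python
# Back-of-envelope energy estimate for a single-electron redox flow cell.
# Only the 2.13 V figure comes from the article; the rest are assumptions.
FARADAY = 96485          # coulombs per mole of electrons
voltage = 2.13           # V, reported cell voltage
concentration = 1.0      # mol/L of active species (assumed)
electrons = 1            # electrons transferred per ion (assumed)

joules_per_litre = FARADAY * concentration * electrons * voltage
wh_per_litre = joules_per_litre / 3600
print(f"~{wh_per_litre:.0f} Wh per litre of electrolyte (one tank, ideal case)")
```

Doubling the voltage doubles this figure, which is why a 2.13 V chemistry is notable for a flow battery.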

PDTA is a spinoff of EDTA, an agent already used in some hand soap, food preservatives and municipal water treatments due to its bacteria-stymying properties. EDTA is considered non-toxic. The chemistry also uses the benign form of chromium, the same type used in stainless steel surgical instruments.

"We got this to work at the relatively neutral pH of 9, unlike other batteries which use highly corrosive acid that's difficult to work with and difficult to dispose of responsibly," Robb said. "This is more akin to laundry detergent."

"You could order 15 tons of these materials tomorrow if you wanted, because there are existing factories already producing them," Marshak added.

Marshak and Robb have filed a patent on the innovation with assistance from CU Boulder Venture Partners. They plan to continue optimizing their system, including scaling it up in the lab in order to cycle the battery for even longer periods of time.

"We've solved the problem on a fundamental level," Marshak said. "Now there are a lot of things we can try in order to keep pushing the performance limit."

Credit: 
University of Colorado at Boulder

When a fix for one vision problem causes another

As we age, our eyes lose their ability to focus up close. It's a condition called presbyopia, and it's both extremely common and relatively easy to fix, with solutions like reading glasses, bifocals, or progressive lenses.

Another common correction, called monovision, solves the problem with different lenses in each eye, one that focuses nearby, the other that focuses far away. To adjust for the blur differences caused by wearing different lenses, the brain suppresses the blurrier image and preferentially processes the sharper image, effectively enhancing the depth of field. Ten million people in the United States currently use monovision to correct their presbyopia, and as the population ages that number is likely to grow.

A team from the University of Pennsylvania and the Institute of Optics in Madrid, Spain, recently discovered that monovision can cause dramatic misperceptions of the distance and 3D direction of moving objects. What's more, the farther away the objects are and the faster they move, the larger the misperceptions become. The researchers, led by Penn neuroscientist Johannes Burge, published their findings in Current Biology.

"Imagine you're riding in a car, pulling up to an intersection," Burge says. "A cyclist in cross-traffic is going by at 15 miles per hour. If you calculate it out, the misperception of depth will be about nine feet. That's a big deal--that's the width of a traffic lane."

In general, Burge's lab studies how the human visual system processes the images that fall on the back of the eye when we're sitting in a room or walking down the street. Burge is particularly interested in understanding what enables people to perceive motion, depth, and blur. "It seems really simple. We open our eyes and see," he explains. "But as with most things, when you look under the hood to see how it actually works, it turns out to be a lot more complicated."

This new line of research is closely related to the Pulfrich effect, a 100-year-old perceptual illusion named for German physicist Carl Pulfrich. To understand this effect, picture a clock pendulum swinging from side to side. Viewed with one eye darkened, through a pair of sunglasses with one lens missing for example, the pendulum appears to move not side to side but along an elliptical trajectory that changes depth. The same effect occurs for images with different contrast, like looking through a pair of glasses with one fogged-up lens.

The illusion happens because the brain processes the darker (or less contrast-y) image milliseconds more slowly than the brighter (or more contrast-y) image. For moving images, the processing delay causes what's called a "neural binocular disparity," meaning the actual location of the image on the back of the eye doesn't match where the visual system estimates it to be. A similar principle explains how stereo-3D movies work.
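The size of the resulting depth error can be sketched with small-angle geometry: the delayed eye sees a moving target where it was a few milliseconds ago, and that lateral offset is read out as depth. The delay, viewing distance and eye separation below are assumed values chosen to land near the cyclist example in the quote above, not parameters reported by the researchers.

```python
# Small-angle sketch of the Pulfrich depth error for a laterally moving target.
# All parameter values are illustrative assumptions.
MPH_TO_MS = 0.44704

speed = 15 * MPH_TO_MS        # cyclist speed, m/s
delay = 0.004                 # interocular processing delay, s (assumed ~4 ms)
distance = 6.0                # viewing distance to the cyclist, m (assumed)
interocular = 0.064           # eye separation, m (typical adult value)

lateral_offset = speed * delay                       # how far the delayed image lags, m
depth_error = lateral_offset * distance / interocular
print(f"apparent depth error ≈ {depth_error:.1f} m ({depth_error / 0.3048:.1f} ft)")
```

With these assumptions the error comes out around 8 feet, in the ballpark of the nine-foot figure Burge describes.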

Brightness and contrast variation between the eyes cause the classic Pulfrich effect. Monovision induces blur differences between the eyes, and blur reduces contrast, so Burge and colleagues hypothesized that monovision would also lead to the Pulfrich effect. Using an apparatus called a haploscope--essentially, a laboratory version of a 3D movie house--they measured the effect of different monovision-like corrections. Surprisingly, participants experienced a reverse Pulfrich effect rather than a classic one. Instead of being processed more slowly, the blurry image was processed milliseconds faster than the sharp image.

Burge had a guess for why: "Blurring an image doesn't change the contrast uniformly," he says. "Instead it reduces the contrast of the fine details more than that of the coarse details."

He offers as an example what happens when looking through a camera lens. "As the image goes out of focus, first you lose the pinstripes in my shirt and the hairs of my eyebrow. Then you lose the medium details. And finally, the coarse details," he explains. "Neuroscience has shown that the brain processes fine details more slowly than coarse details, all else being equal. Thus, we reasoned that the blurry image gets processed faster because the fine details in the sharp image are slowing it down. Additional experiments showed this reasoning was correct."

With the reverse Pulfrich effect identified, the researchers wanted a fix. "A darker lens slows down processing. A blurry lens speeds it up," Burge says. "We thought, if you darken the blurring lens, the two effects may cancel out. And that's exactly what happens."

The current work answers some questions, but also brings up many more: How does overall light level impact the effect? Is it worse at dusk or nighttime? Does the brain adapt to the illusion once someone acclimates to monovision? It's great fodder for future research, work that could have real implications for public health and public safety.

Credit: 
University of Pennsylvania

Brains work in sync during music therapy -- study

image: Dr. Clemens Maidhof and Professor Jorg Fachner of Anglia Ruskin University (ARU), who led the research.

Image: 
Anglia Ruskin University (ARU)

For the first time researchers have been able to demonstrate that the brains of a patient and therapist become synchronised during a music therapy session, a breakthrough that could improve future interactions between patients and therapists.

The research, published in the journal Frontiers in Psychology, was carried out by Professor Jorg Fachner and Dr Clemens Maidhof of Anglia Ruskin University (ARU).

This is the first music therapy study to use a procedure called hyperscanning, which records activity in two brains at the same time, allowing researchers to better understand how people interact.

During the session documented in the study, classical music was played as the patient discussed a serious illness in her family. Both patient and therapist wore EEG (electroencephalogram) caps containing sensors, which capture electrical signals in the brain, and the session was recorded in sync with the EEG using video cameras.

Music therapists work towards "moments of change", where they make a meaningful connection with their patient. At one point during this study, the patient's brain activity shifted suddenly from displaying deep negative feelings to a positive peak. Moments later, as the therapist realised the session was working, her scan displayed similar results. In subsequent interviews, both identified that as a moment when they felt the therapy was really working.

The researchers examined activity in the brain's right and left frontal lobes where negative and positive emotions are processed, respectively. By analysing hyperscanning data alongside video footage and a transcript of the session, the researchers were able to demonstrate that brain synchronisation occurs, and also show what a patient-therapist "moment of change" looks like inside the brain.
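One simplified way to picture that analysis is to compute a left-right frontal asymmetry index for each person in short time windows and then correlate the two series; high correlation would indicate synchrony. The sketch below uses synthetic data and an assumed, much-reduced measure, not the authors' actual pipeline.

```python
# Toy hyperscanning-style synchrony measure on synthetic frontal asymmetry data.
# The analysis is an illustrative assumption, not the method used in the paper.
import numpy as np

rng = np.random.default_rng(0)
n_windows = 120                                            # e.g. one-second windows
shared_drift = np.cumsum(rng.normal(0, 0.1, n_windows))    # common emotional trajectory

patient_asym = shared_drift + rng.normal(0, 0.2, n_windows)     # hypothetical indices
therapist_asym = shared_drift + rng.normal(0, 0.2, n_windows)

sync = np.corrcoef(patient_asym, therapist_asym)[0, 1]
print(f"patient-therapist asymmetry synchrony r = {sync:.2f}")
```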

Lead author Jorg Fachner, Professor of Music, Health and the Brain at Anglia Ruskin University (ARU), said: "This study is a milestone in music therapy research. Music therapists report experiencing emotional changes and connections during therapy, and we've been able to confirm this using data from the brain.

"Music, used therapeutically, can improve wellbeing, and treat conditions including anxiety, depression, autism and dementia. Music therapists have had to rely on the patient's response to judge whether this is working, but by using hyperscanning we can see exactly what is happening in the patient's brain.

"Hyperscanning can show the tiny, otherwise imperceptible, changes that take place during therapy. By highlighting the precise points where sessions have worked best, it could be particularly useful when treating patients for whom verbal communication is challenging. Our findings could also help to better understand emotional processing in other therapeutic interactions."

Credit: 
Anglia Ruskin University

Researchers discover the science behind giving up

image: At the point of giving up, neurons in green get very active and suppress dopamine, a chemical associated with motivation, researchers found.

Image: 
Max Huffman

What happens when we give up?

Inside the brain, a group of cells known as nociceptin neurons get very active before a mouse's breakpoint. They emit nociceptin, a complex molecule that suppresses dopamine, a chemical largely associated with motivation.

The findings, reported July 25 in Cell, offer new insight into the complex world of motivation and reward.

The nociceptin neurons are located near an area of the brain known as the ventral tegmental area. The VTA contains neurons that release dopamine during pleasurable activities. Although scientists have previously studied the effects of fast, simple neurotransmitters on dopamine neurons, this study is among the first to describe the effects of this complex nociceptin modulatory system.

"We are taking an entirely new angle on an area of the brain known as VTA," said co-lead author Christian Pedersen, a fourth-year Ph.D. student in bioengineering at the University of Washington School of Medicine and the UW College of Engineering.

Researchers at the UW School of Medicine and Washington University School of Medicine, as well as colleagues at other universities, spent four years looking at the role of nociceptin in regulating motivation.

"The big discovery is that large complex neurotransmitters known as neuropeptides have a very robust effect on animal behavior by acting on the VTA," said Pedersen.

The researchers said this discovery could lead to helping people find motivation when they are depressed and, conversely, to decreasing the motivation for drugs in substance-abuse disorders such as addiction.

The discovery came from looking at the neurons in mice seeking sucrose. The mice had to poke their snout into a port to get sucrose. At first a single poke sufficed; then the requirement became two pokes, then five, and so on, increasing exponentially. Eventually, all the mice gave up. Neural activity recordings revealed that these "demotivation" or "frustration" neurons became most active when mice stopped seeking sucrose.
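The escalating task can be pictured as a simple loop: the poke requirement grows each time a reward is earned, until the animal's willingness to work runs out. The growth factor and effort budget below are hypothetical numbers, not the experiment's schedule.

```python
# Toy sketch of an escalating effort requirement and the resulting breakpoint.
# Growth rule and effort budget are assumptions for illustration only.
pokes_required = 1
effort_budget = 300           # hypothetical total pokes the mouse will invest
spent, rewards, last_completed = 0, 0, 0

while spent + pokes_required <= effort_budget:
    spent += pokes_required
    rewards += 1
    last_completed = pokes_required
    pokes_required = int(pokes_required * 2.5)   # requirement escalates: 1, 2, 5, 12, ...

print(f"rewards earned before giving up: {rewards}")
print(f"breakpoint (last requirement completed): {last_completed}")
```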

In mammals, the neural circuits that underlie reward seeking are regulated by mechanisms to keep homeostasis - the tendency to maintain internal stability to compensate for environmental changes. In the wild, animals are less motivated to seek rewards in environments where resources are scarce. Persistence in seeking uncertain rewards can be disadvantageous due to risky exposure to predators or from energy expenditure, the researchers noted.

Deficits within these regulatory processes in humans can manifest as behavioral dysfunctions, including depression, addiction, and eating disorders.

Senior author Michael Bruchas, professor of anesthesiology and pain medicine and of pharmacology at the University of Washington School of Medicine, is one of the principal faculty in UW's new Center for Neurobiology of Addiction, Pain, and Emotion. He said the findings could go a long way toward finding help for patients whose motivation neurons are not functioning correctly.

"We might think of different scenarios where people aren't motivated like depression and block these neurons and receptors to help them feel better," he said. "That's what's powerful about discovering these cells. Neuropsychiatric diseases that impact motivation could be improved."

Looking to the future, he said, these neurons could perhaps be modified in people seeking drugs or those that have other addictions.

Credit: 
University of Washington School of Medicine/UW Medicine

Make more with your 3D printers: from smooth surfaces to complex patterns

image: The top car was printed using the CurviSlicer program, while the one on the bottom was printed using a lower-quality program.

Image: 
©Inria

The production revolution envisioned by 3D printing visionaries is only a few steps away, when we will be able to print objects with whatever shape and properties we need. This summer, at the 2019 SIGGRAPH conference, we will move three steps closer as scientists from Inria Nancy-Grand Est present their new findings.

Scientists from the MFX team, led by Sylvain Lefebvre, research director at Inria, will present innovative, open-access algorithms that enable new possibilities on a regular fused-filament printer. From 3D printing nearly perfect round shapes, to producing flexible objects with complex elastic behaviors, to producing oriented grip patterns and multi-material structures: their presentations cover most of the fabrication process! The MFX team, together with colleagues from other Inria teams (Pixel, Maverick, and Imagine), is dedicated to empowering owners of even the most basic machine to come up with creative, state-of-the-art productions.

30 July 2019 - 11:51 am - Los Angeles Convention Centre, room 153

« High performance rendering » session

Thibault Tricard and Semyon Efremov will unveil the new generation of irregular pattern generation software, paving the way for an easier production of composite materials with different properties in different directions.

Read more: https://bit.ly/32N2xTG

[Procedural Phasor Noise - https://hal.inria.fr/hal-02118508]

31 July 2019 - 11:29 am - Los Angeles Convention Centre, room 150/151

« Fabrication » session

Jimmy Etienne will present innovative software that facilitates the production of round-shaped objects with a regular 3D printer.

Read more: https://bit.ly/2JLjqXx

[CurviSlicer: Slightly curved slicing for 3-axis printers - https://hal.inria.fr/hal-02120033]

31 July 2019 - 11:51 am - Los Angeles Convention Centre, room 150/151

« Fabrication » session

Jonàs Martinez will tell you everything about the production of metamaterials made of repeated star-shaped units, which show surprising properties that you definitely can't find in any natural material.

Read more: https://bit.ly/2GsCiIW

[Star-Shaped Metrics for Mechanical Metamaterial Design - https://hal.inria.fr/hal-02118846]

The MFX team, located in Nancy, France, is a joint team of Inria and Loria. Its members focus on challenges related to shape complexity in the context of Computer Graphics and Additive Manufacturing. They consider the entire chain, from modeling and visualization to interaction and part geometry processing before fabrication. They are at the origin of the IceSL open-access software.

Sylvain Lefebvre and his team will be available at any time, during the conference or afterwards, to answer your questions and discuss the new avenues their algorithms open up.

About Inria: Inria, the French national research institute for the digital sciences, promotes scientific excellence and technology transfer to maximise its impact.

It employs 2,400 people. Its 200 agile project teams, generally with academic partners, involve more than 3,000 scientists in meeting the challenges of computer science and mathematics, often at the interface of other disciplines. Inria works with many companies and has assisted in the creation of over 160 startups. It strives to meet the challenges of the digital transformation of science, society and the economy.

The Inria Nancy - Grand Est research centre was founded in 1986 to contribute to the economic revival of the region. It grew steadily from 7 project-teams and 50 people in 1990 to 21 project-teams and 450 people today, across three sites: Nancy, Strasbourg and Saarbrücken. The Inria Nancy - Grand Est research centre develops most of its scientific activities in partnership with the French National Centre for Scientific Research (CNRS), the University of Lorraine and the University of Strasbourg. It also maintains close ties with other research institutes and universities from the wider region, mainly in Saarbrücken and Luxembourg. Its research is structured around five broad topics that you can find online: https://bit.ly/2Y8bfbP

Credit: 
Inria Nancy-Grand Est

New model identifies most efficient logistics for military operations

image: US Army Spc. Gabriel Garzon, a motor transport operator assigned to Company J, 1st Battalion, 41st Infantry Regiment, 2nd Infantry Brigade Combat Team, 4th Infantry Division, ground guides a military vehicle, Sept. 6, 2018, in Afghanistan.

Image: 
U.S. Army photo by Staff Sgt. Neysa Canfield

RESEARCH TRIANGLE PARK, N.C. (July 25, 2019) -- Military deployments to environments lacking basic infrastructure - whether humanitarian missions or combat operations - involve extensive logistical planning. As part of a research project for the U.S. Army, researchers at North Carolina State University designed a model to help military leaders better account for logistical risk and uncertainty during operational planning and execution.

The research, published in Journal of Defense Modeling & Simulation, uses an enterprise resource planning system that handles everything from requisitions to shipment of supplies to inventory tracking, to create computational models that can be used to identify the most efficient means of meeting the military's logistical needs.

"This research lays the mathematical and operational foundation for construction of a network-based model that captures routing alternatives and characterizes solutions for capacity planning and resiliency analysis in near-real time," said Dr. Joseph Myers, Army Research Office mathematical sciences division chief at the U.S. Army Combat Capabilities Development Command's Army Research Laboratory. "This project will provide military logistics planners with capabilities that are currently lacking in prevalent logistics planning tools."

"These models would be particularly valuable during expeditionary operations, in which the military is seeking to establish its presence - and its supply chain - in an environment that is subject to a fair amount of uncertainty," said Professor Brandon McConnell, a research assistant professor in NC State's Edward P. Fitts Department of Industrial and Systems Engineering and the paper's coauthor. "The model that we've developed can not only facilitate the military's ability to efficiently determine what will be needed where, but can also assess risk in near real time in order to account for uncertainty."

The new model, called the Military Logistics Network Planning System, draws on three sources of information. First is logistical data from the enterprise resource planning system. Second is operational data, such as an operation's mission, organization and timeline. Third is data on "mission specific demand," meaning logistical requirements that are particular to a given mission and its environment. For example, a combat operation being conducted in a cold, damp environment would have different requirements than a humanitarian mission being conducted in a hot, dry environment.

The system also uses two factors to assess risk and determine how risk might affect military capacities. The first factor is the likelihood that an event will happen; the second is what the consequences of that event would be. For example, if two events are equally likely, the model gives more weight to the one that would have the greater adverse impact on military personnel and mission performance.
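In code, that weighting amounts to ranking events by likelihood times consequence. A toy sketch with hypothetical events and scores (the MLNPS itself is far richer):

```python
# Likelihood-times-consequence weighting, as described above.
# Event names, probabilities and consequence scores are hypothetical.
events = [
    # (name, probability of occurring, consequence score 1-10)
    ("supply convoy delayed", 0.30, 4),
    ("bridge on main route unusable", 0.30, 9),
    ("fuel demand spike", 0.10, 6),
]

# With equal likelihoods, the event with the larger consequence dominates.
ranked = sorted(events, key=lambda e: e[1] * e[2], reverse=True)
for name, probability, consequence in ranked:
    print(f"{name}: expected impact {probability * consequence:.2f}")
```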

"The MLNPS uses all of the available data, accounts for risk, then forecasts what the logistical outcomes will look like in reality," said McConnell, a former infantry captain in the U.S. Army who served two tours in Iraq. "The MLNPS can be used as a decision planning aid, allowing leaders to test-drive plans in order to identify courses of action that will best support carrying out an operation."

The MLNPS could also be used while an operation is being executed, as part of contingent logistical planning efforts that take place as circumstances change on the ground.

"Right now, the MLNPS is a robust proof-of-concept prototype, designed to demonstrate the potential value of powerful computational tools that can make use of [enterprise resource planning] systems," said McConnell. "Existing logistical tools are both valuable and powerful. However, I'm not aware of any other methods that make use of [enterprise resource planning] data and are also fast enough for operational use when time is of the essence."

The CCDC Army Research Laboratory (ARL) is an element of the U.S. Army Combat Capabilities Development Command. As the Army's corporate research laboratory, ARL discovers, innovates and transitions science and technology to ensure dominant strategic land power. Through collaboration across the command's core technical competencies, CCDC leads in the discovery, development and delivery of the technology-based capabilities required to make Soldiers more lethal to win our Nation's wars and come home safely. CCDC is a major subordinate command of the U.S. Army Futures Command.

Credit: 
U.S. Army Research Laboratory

Fracking likely to result in high emissions

Natural gas releases fewer harmful air pollutants and greenhouse gases than other fossil fuels. That's why it is often seen as a bridge technology to a low-carbon future. A new study by the Institute for Advanced Sustainability Studies (IASS) has estimated emissions from shale gas production through fracking in Germany and the UK. It shows that CO2-eq. emissions would exceed the estimated current emissions from conventional gas production in Germany. The potential risks make strict adherence to environmental standards vital.

In the last ten years natural gas production has soared in the United States. This is mainly due to shale gas, which currently accounts for about 60 per cent of total US gas production. Shale, a fine-grained, laminated, sedimentary rock, has an extremely low permeability, which in the past made it difficult - and uneconomical - to extract.

However, recent advancements in horizontal drilling and hydraulic fracturing have opened up previously unrecoverable shale gas reserves to large-scale, commercial production.

In light of experiences in the US and dwindling conventional gas reserves, the debate on shale gas has also taken centre stage in Europe. The purported climate advantages of shale gas over coal and the implications for domestic energy security have made fracking in shale reservoirs an interesting prospect for many European countries.

What emissions is shale gas production in Europe likely to cause?

IASS researcher Lorenzo Cremonese led a study that investigated the greenhouse gas and air pollutant emissions (including carbon dioxide, methane, carbon monoxide, nitrogen oxides, particulates and other volatile organic compounds) expected to result from future shale gas production in Germany and the UK.

A team of researchers from the University of Potsdam, TNO Utrecht, Freie Universität Berlin, and the IASS determined the amount of these chemical compounds that would be released into the atmosphere through fracking activities, based on estimated reservoir productivities, local capacity, and the technologies used. Their findings have been published in the international journal Elementa: Science of the Anthropocene.

To quantify total emissions, the authors assigned gas losses to each stage of upstream gas production. In the process, they also generated two plausible emission scenarios: a 'realistic' and an 'optimistic' scenario.

While methane leakage rates for the optimistic scenario approximate official figures in national inventories, the rates for the realistic scenario exceed them by a large margin. The emission intensity of shale gas in electricity generation is up to 35 per cent higher than estimates of the current emission intensity of conventional gas in Germany. The study also questions the accuracy of methane leakage estimates for current conventional gas production.
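A rough sketch of the arithmetic behind such comparisons: the CO2-equivalent intensity of gas-fired electricity rises with the upstream methane leakage rate, because leaked methane is a far more potent greenhouse gas than CO2. The plant and leakage figures below are illustrative assumptions, not values from the study.

```python
# Illustrative CO2-equivalent intensity of gas power as a function of methane leakage.
# All numbers are assumptions chosen for a back-of-envelope comparison.
GWP_CH4 = 30                 # approximate 100-year global warming potential of methane
combustion = 0.36            # kg CO2 per kWh of electricity from burning the gas (assumed)
gas_per_kwh = 0.13           # kg CH4 consumed per kWh of electricity (assumed)

def intensity(leak_rate):
    """CO2-eq per kWh, including methane leaked upstream of the power plant."""
    leaked = gas_per_kwh * leak_rate / (1 - leak_rate)
    return combustion + leaked * GWP_CH4

for scenario, leak in [("low-leakage scenario", 0.005), ("high-leakage scenario", 0.03)]:
    print(f"{scenario}: {intensity(leak):.2f} kg CO2-eq per kWh")
```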

Time to put the environmental risks of natural gas on the political agenda

At the same time, the results show that in all plausible scenarios, emissions of air pollutants like carbon monoxide, nitrogen oxides and particulate matter will have a negligible effect on overall national emissions of these substances. But unlike greenhouse gases, air pollutants have immediate health effects at local and regional level. They are the focus of another study currently being prepared.

The present study fills a gap in the scientific debate on European shale gas reserves and the consequences of exploiting them. "If shale gas becomes a reality in Europe, the risks arising from that will have to be minimised through strict adherence to environmental standards," explains Cremonese.

The study also provides valuable insights for the discussion on the climate effects of a new gas industry, and, more generally, on the question of if and how natural gas should play a role in the global energy transition.

"The major differences between the realistic and optimistic scenarios in terms of their anticipated emissions underline once again the importance of improving existing emissions reduction technologies and practices," says Cremonese. "In light of the climate crisis, the environmental risks posed by gas emissions need to move quickly onto the agenda in policymaking and in negotiations with the gas industry in order to keep the adverse effects of a European shale gas industry to an absolute minimum."

Credit: 
Research Institute for Sustainability (RIFS) – Helmholtz Centre Potsdam

Exposure to common chemicals in plastics linked to childhood obesity

WASHINGTON--Exposure to common chemicals in plastics and canned foods may play a role in childhood obesity, according to a study published in the Journal of the Endocrine Society.

Bisphenol S (BPS) and bisphenol F (BPF) are manufactured chemicals used in certain kinds of plastic, in the lining of aluminum-canned food and drinks, and in thermal paper from cash-register receipts. These chemicals have been used as a replacement for bisphenol A (BPA), a well-known endocrine-disrupting chemical that harms human health by interfering with the body's hormones.

"This research is significant because exposure to these chemicals is very common in the United States. BPS and BPF use is growing because manufacturers are replacing BPA with these chemicals, so that is contributing to the frequency of exposure," said the study's corresponding author, Melanie Jacobson, Ph.D., M.P.H, of NYU School of Medicine in New York, N.Y. "Although diet and exercise are still understood to the main drivers of obesity, this research suggests that common chemical exposures may also play a role, specifically among children."

In this study, researchers used data from the US National Health and Nutrition Examination Surveys to evaluate associations between BPA, BPS, and BPF and body mass outcomes among children and adolescents aged 6 to 19 years. Children who had greater levels of BPS and BPF in their urine were more likely to have obesity compared to children with lower levels.

"In a previous study, we found that the predecessor chemical to BPS and BPF--BPA--was associated with a higher prevalence obesity in U.S. children, and this study found the same trend among these newer versions of that chemical. Replacing BPA with similar chemicals does nothing to mitigate the harms chemical exposure has on our health," Jacobson said.

Credit: 
The Endocrine Society

Study finds new insights on overdose rates, county segregation, and socioeconomics

image: This is Dr. Cara Frankenfeld.

Image: 
George Mason University

New research from George Mason University finds that factors such as county poverty levels, social environment, employment rates, and racial or ethnic segregation affect overdose rates differently.

Fairfax, VA - A new study led by George Mason University’s College of Health and Human Services offers new insights into how county socioeconomics and segregation relate to drug overdose deaths.

The study found that socioeconomic factors and segregated counties may affect the rate of drug overdose deaths independently and differently among racial and ethnic groups. This is the first study of its kind to explore both of these influences at the county level.

Dr. Cara Frankenfeld from the Department of Global and Community Health and Dr. Timothy Leslie from Mason’s Department of Geography and Geoinformation Science led the study published in Annals of Epidemiology in April.

“The social environment where people live may have a critical influence on the health of the area,” Frankenfeld noted. “That’s why it’s so important to study the link between racially or ethnically segregated counties—which may be the result of structural racism—and drug overdose deaths.”

Segregated counties—those that differ from surrounding areas in race, ethnicity, unemployment, or poverty—were often linked to different rates of overdose deaths among ethnic and racial groups. For example, counties with more racial and ethnic diversity had fewer overdose deaths in blacks and Hispanics, but similar overdose deaths in whites. Counties with higher unemployment diversity (unemployed vs. employed) had more Hispanic, but not white or black, overdose deaths. Counties with more poverty than neighboring counties had more overdose deaths in blacks, but not in whites or Hispanics. The strongest link was for unemployment diversity, across all groups, with a 35% higher rate for each five percent increase in unemployment diversity.

Counties with more disabled civilians had more drug overdose deaths across all racial and ethnic groups. However, the link between other socioeconomic factors and drug overdose deaths often varied by racial and ethnic groups. For example, counties with more racial diversity, more unemployment diversity, and more uninsured residents had fewer black and Hispanic overdose deaths, but this was not observed for whites. Counties with higher incomes had fewer Hispanic overdose deaths but more black overdose deaths. Counties with more unemployment had fewer Hispanic deaths but more white deaths.

This study used data for the three racial/ethnic groups from the Centers for Disease Control and Prevention (CDC Wonder) Underlying Cause of Death (1999 to 2015) and the American Community Survey (2010-2014).

Few studies have assessed the influence of poverty segregation (higher poverty rates than neighboring counties) and poverty diversity (different rates within counties) on drug overdose deaths. Poverty was a strong factor, and the researchers recommend more work in this area.

The research is challenged by a small number of deaths at lower levels of geography, but researchers recommend future research at the individual level to look at the interaction between individual socioeconomic characteristics and social geography. This could help us better understand the impact of the social environment on substance abuse and drug overdose deaths.

Credit: 
George Mason University

Microbial manufacturing: Genetic engineering breakthrough for urban farming

image: This is a petri dish with colonies growing microbial cells (factories).

Image: 
DiSTAP, SMART (MIT's research enterprise in Singapore)

Singapore, July 25, 2019 - Researchers at SMART, MIT's research enterprise in Singapore, and the National University of Singapore (NUS) have developed a technology that greatly accelerates the genetic engineering of microbes used to manufacture chemicals for urban farming. The new technology enables faster, cheaper, more accurate, and near-scarless plasmid construction using standard, reusable parts, and is compatible with the most popular DNA assembly methods.

Explained in a paper titled "A standard for near-scarless plasmid construction using reusable DNA parts", to be published this month in the academic journal Nature Communications, the project is part of the SMART Interdisciplinary Research Group (IRG) on Disruptive & Sustainable Technologies for Agricultural Precision (DiSTAP). The IRG develops new technologies to enable Singapore, a city-state dependent on imported food and produce, to improve its agricultural yield and reduce external dependencies.

Kang Zhou, a DiSTAP Principal Investigator who is also an assistant professor at the NUS Department of Chemical and Biomolecular Engineering (ChBE), and Xiaoqiang Ma, a postdoctoral associate at SMART, led the development of the technology while working on ways to support their colleagues who were enhancing vegetable yield in the country's urban farms. They were exploring ways to use microbial fermentation to create fertilizers, nutrients and non-synthetic pesticides for urban farms in the form of small molecules.

"The objective of this study was to create a technology that can engineer microbes faster and at a lower cost," said Ma. "Current technology is expensive and time-consuming. Researchers have to order customised materials from suppliers which takes a while to arrive. They also often use only 1% of the material, leading to wastage. As each material is customised, researchers have to re-order each time, which further delays and add costs to the production."

The new Guanine/Thymine (GT) DNA assembly technology significantly changes things by enabling genetic engineers to reuse genetic materials. It provides a simple method for defining biological parts as standard DNA parts. Further, unlike previous attempts at creating standardised materials, which reached an accuracy of up to 50%, the GT technology achieves an accuracy of close to 90%. As a near-scarless plasmid construction method, the technology is also substantially faster, able to stitch together up to 7 DNA parts, as opposed to only 2 parts for other methods of similar accuracy.

"Being able to provide an accuracy of close to 90% for genetic materials while connecting up to 7 parts to a DNA is a game-changer in the creation of genetic materials by using standard parts," said Zhou. "We anticipate that the huge cost and time savings will enable the development of new fermentation processes that can manufacture green chemicals to make urban farming in Singapore more efficient and safer. This technology is also applicable to all genetic engineering fields outside of agriculture, and we are actively looking at ways we can deploy it for easy access."

In addition to commercialisation plans, the researchers are also planning to set up an e-commerce platform which can quickly create and distribute these genetic materials to researchers around the world. It will be the first such platform for reusable genetic engineering materials in the world.

Credit: 
Singapore-MIT Alliance for Research and Technology (SMART)

Tobacco-21 laws can lower smoking prevalence in the 18-20 age group

A new study published today by the scientific journal Addiction found that raising the legal age of sale of cigarettes from 18 to 21 in the U.S. was associated with a 39% reduction in the odds of regular smoking in 18- to 20-year-olds who had experimented with cigarettes. The reduction was even greater (50%) in those who had close friends who smoked when they were 16.

The study compares smoking prevalence among 18-20-year-olds versus 21-22-year-olds, in regions that did versus did not raise the legal age of tobacco sales to 21. In areas with tobacco-21 laws, 18-20-year-olds were much less likely to smoke than their same-age peers in areas without these policies. That differential was not evident for 21-22-year-olds, who would not have been bound by the sales restriction but should have been affected by other local factors that might explain the younger age group's differential smoking rate (e.g., other local tobacco policies, regional attitudes towards smoking).
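The comparison amounts to a difference-in-differences: the age-group gap in smoking in tobacco-21 areas, contrasted with the same gap where no such law applies. A toy sketch with made-up rates (the study itself works with odds ratios estimated from survey data):

```python
# Difference-in-differences sketch of the tobacco-21 comparison described above.
# The smoking rates are hypothetical, for illustration only.
smoking_rate = {
    ("tobacco21", "18-20"): 0.08,
    ("tobacco21", "21-22"): 0.14,
    ("no_policy", "18-20"): 0.13,
    ("no_policy", "21-22"): 0.14,
}

gap_policy = smoking_rate[("tobacco21", "18-20")] - smoking_rate[("tobacco21", "21-22")]
gap_control = smoking_rate[("no_policy", "18-20")] - smoking_rate[("no_policy", "21-22")]
did = gap_policy - gap_control
print(f"difference-in-differences: {did * 100:+.1f} percentage points")
```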

Lead author Abigail Friedman, assistant professor at the Yale School of Public Health, commented, "This research indicates that a 'social multiplier' effect may amplify the impact of tobacco-21 laws. While these policies were associated with a 39% drop in the odds of regular smoking overall, the reduction was larger among young people whose friends were likely to smoke before tobacco-21 laws were adopted. As peer smoking is a critical predictor of youth smoking, this study suggests that tobacco-21 laws may help reduce smoking among those most susceptible to tobacco use. This result supports raising the age of sale to 21 as a means to reduce young adult smoking and improve public health."

As of June 2019, sixteen U.S. states and over 400 localities have adopted tobacco-21 laws.

Credit: 
Society for the Study of Addiction

Supercomputers use graphics processors to solve longstanding turbulence question

image: This is a turbulent flow simulation displayed on a wraparound screen.

Image: 
Thomas Angus / Imperial College London

Advanced simulations have solved a problem in turbulent fluid flow that could lead to more efficient turbines and engines.

When a fluid, such as water or air, flows fast enough, it will experience turbulence - seemingly random changes in velocity and pressure within the fluid.

Turbulence is extremely difficult to study but is important for many fields of engineering, such as air flow past wind turbines or jet engines.

Understanding turbulence better would allow engineers to design more efficient turbine blades, for example, or make more aerodynamic shapes for Formula 1 cars.

However, current engineering models of turbulence often rely upon 'empirical' relationships based on previous observations of turbulence to predict what will happen, rather than a full understanding of the underlying physics.

This is because the underlying physics is immensely complicated, leaving many questions that seem simple unsolved.

Now, researchers at Imperial College London have used supercomputers, running simulations on graphics processors originally developed for gaming, to solve a longstanding question in turbulence.

Their result, published today in the Journal of Fluid Mechanics, means empirical models can be tested and new models can be created, leading to more optimal designs in engineering.

Co-author Dr Peter Vincent, from the Department of Aeronautics at Imperial, said: "We now have a solution for an important fundamental flow problem. This means we can check empirical models of turbulence against the 'correct' answer, to see how well they are describing what actually happens, or if they need adjusting."

The question is quite simple: if a turbulent fluid is flowing in a channel and it is disturbed, how does that disturbance dissipate in the fluid? For example, if water were suddenly released from a dam into a river and then shut off, what effect would that pulse of dam water have on the flow of the river?

To determine the overall 'average' behaviour of the fluid response, the team needed to simulate the myriad smaller responses within the fluid. They used supercomputers to run thousands of turbulent flow simulations, each requiring billions of calculations to complete.

Using these simulations, they were able to determine the exact parameters that describe how the disturbance dissipates in the flow and determined various requirements that empirical turbulence models must satisfy.
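Conceptually, the procedure resembles averaging many noisy realisations of a decaying disturbance and fitting the decay parameter to the ensemble mean. The toy sketch below assumes a simple exponential decay purely for illustration; the actual simulations solve the full turbulent flow equations.

```python
# Toy ensemble-averaging illustration: recover a decay rate from many noisy runs.
# The exponential model and all values are assumptions, not the paper's physics.
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0, 5, 200)
true_rate = 0.8
runs = np.exp(-true_rate * t) + rng.normal(0, 0.3, (1000, t.size))   # 1000 noisy "simulations"

ensemble_mean = runs.mean(axis=0)
mask = ensemble_mean > 0.05                      # fit only where the signal is well above the noise
fitted_rate = -np.polyfit(t[mask], np.log(ensemble_mean[mask]), 1)[0]
print(f"recovered decay rate ≈ {fitted_rate:.2f} (true value {true_rate})")
```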

Co-author Professor Sergei Chernyshenko, from the Department of Aeronautics at Imperial, said: "From my first days studying fluid mechanics I had some fundamental questions that I wanted to know the answers to. This was one of them, and now after 40 years I have the answer."

Credit: 
Imperial College London