Tech

Unbroken: New soft electronics don't break, even when punctured

image: Current passes through a self-healing circuit.

Image: 
Photo by Alex Parrish, Virginia Tech

Want a smartphone that stretches, takes damage, and still doesn't miss a call?

A team of Virginia Tech researchers from the Department of Mechanical Engineering and the Macromolecules Innovation Institute has created a new type of soft electronics, paving the way for devices that are self-healing, reconfigurable, and recyclable. These skin-like circuits are soft and stretchy, sustain numerous damage events under load without losing electrical conductivity, and can be recycled to generate new circuits at the end of a product's life.

Led by Assistant Professor Michael Bartlett, the team recently published its findings in Communications Materials, an open access journal from Nature Research.

Current consumer devices, such as phones and laptops, contain rigid materials that use soldered wires running throughout. The soft circuit developed by Bartlett's team replaces these inflexible materials with soft electronic composites and tiny, electricity-conducting liquid metal droplets. These soft electronics are part of a rapidly emerging field of technology that gives gadgets a level of durability that would have been impossible just a few years ago.

The liquid metal droplets are initially dispersed in an elastomer, a type of rubbery polymer, as electrically insulated, discrete drops.

"To make circuits, we introduced a scalable approach through embossing, which allows us to rapidly create tunable circuits by selectively connecting droplets," postdoctoral researcher and first author Ravi Tutika said. "We can then locally break the droplets apart to remake circuits and can even completely dissolve the circuits to break all the connections to recycle the materials, and then start back at the beginning."

The circuits are soft and flexible, like skin, continuing to work even under extreme damage. If a hole is punched in these circuits, the metal droplets can still transfer power. Instead of cutting the connection completely as in the case of a traditional wire, the droplets make new connections around the hole to continue passing electricity.

The circuits also stretch without losing their electrical connection: during the research, the team pulled the device to more than 10 times its original length without failure.

At the end of a product's life, the metal droplets and the rubbery materials can be reprocessed and returned to a liquid solution, effectively making them recyclable. From that point, they can be remade to start a new life, an approach that offers a pathway to sustainable electronics.

While a stretchy smartphone has not yet been made, rapid development in the field also holds promise for wearable electronics and soft robotics. These emerging technologies require soft, robust circuitry to make the leap into consumer applications.

"We're excited about our progress and envision these materials as key components for emerging soft technologies," Bartlett said. "This work gets closer to creating soft circuitry that could survive in a variety of real-world applications."

Credit: 
Virginia Tech

Elephants solve problems with personality

image: African savanna elephant Tembo, at the San Diego Zoo, was one of the elephant participants in the study.

Image: 
Lisa Barrett

Just as humans have their own individual personalities, new research in the Journal of Comparative Psychology shows that elephants have personalities, too. Moreover, an elephant's personality may play an important role in how well that elephant can solve novel problems.

The article was written by Lisa Barrett and Sarah Benson-Amram in the University of Wyoming's Animal Behavior and Cognition Lab, led by Benson-Amram.

The authors of the paper tested 15 Asian elephants and three African savanna elephants in three zoos across the country -- the San Diego Zoo, the Smithsonian's National Zoological Park and the Oklahoma City Zoo -- with the help of elephant caretakers.

Previous work from Barrett and Benson-Amram demonstrated that Asian elephants can use water as a tool to solve a novel problem -- and reach a tasty marshmallow reward -- in what's called the floating object task. This time, the authors designed new novel tasks, as well as personality tests, for the elephants.

"We took a comprehensive approach by using three different problem-solving tasks and three types of personality assessments to determine if individual personality played a role in which elephants were able to solve these tasks," Barrett says. "Since we couldn't give the elephants a personality test like the ones you're familiar with online, we had to think creatively."

The authors developed novel object tests, in which they presented the elephants with unfamiliar stimuli -- a Mylar balloon, a burned log and the scent of a predator (lion or hyena) -- and recorded the elephants' responses. You can watch videos of the novel object trials: balloon, burned log and urine. They also asked elephant caretakers to fill out a survey about the personalities of the animals in their care, and, finally, they observed the elephants interacting with one another in their zoo habitats.

From those assessments, Barrett and Benson-Amram learned that the surveys and observations were the most reliable methods to get at elephant personality. Overall, Barrett and Benson-Amram measured traits such as active, affectionate, aggressive, defiant, excitable, mischievous, shy and sociable, which have been studied in other animals as well.

"We were eager to see if the personality traits we uncovered through the surveys and observations predicted success on novel problem-solving tasks," Benson-Amram says. "The elephants had an opportunity to solve each task three times, and we measured if they learned to solve faster over time, and then we traced their success back to their personality type."

The three problem-solving tasks included the trap tube task, a test commonly used with primates that had never before been presented to a non-primate species. You can watch videos of the problem-solving trials: boxed ball, rod ball and trap tube.

Barrett and Benson-Amram found that elephants did learn to solve two out of the three tasks faster over time, even though the elephants only received three trials on each task. Traits including aggressiveness and activity were important predictors of problem-solving overall, but the personality traits measured did not significantly predict learning ability.

This study makes connections between two sources of individual variation, personality and cognition, in threatened species. One reason it is important to examine problem-solving in elephants is that they regularly face new problems they need to solve in the wild. For example, if certain traits enable elephants to overcome novel problems, elephants with those traits may be more likely to invade farmland and contribute to human-elephant conflict. With more research, managers could predict which elephants are likely to overcome or habituate to deterrents and devote more resources to tracking those animals.

The authors call for more work on different forms of personality assessments to determine which methods would be best for management of zoo and wild elephants.

"Research with free-ranging elephants can extend this study to determine which personality traits are most important for solving novel problems that elephants experience in the wild," says Barrett, a 2020 graduate of UW's Program in Ecology and the Department of Zoology and Physiology.

Credit: 
University of Wyoming

NIST method uses radio signals to image hidden and speeding objects

video: This demonstration of the m-Widar (micro-Wave image detection, analysis and ranging) system shows, in the video on the left, a person walking and later crouching and lying down in an anechoic chamber. The transmitters and receiver are in a vertical line on the right side of the chamber. The second video on the right shows the instrument's view of the same scene. About 21 seconds into the video, a wallboard is inserted between the person and the instrument in the anechoic chamber, to show that m-Widar can "see" through walls.

Image: 
NIST

Researchers at the National Institute of Standards and Technology (NIST) and Wavsens LLC have developed a method for using radio signals to create real-time images and videos of hidden and moving objects, which could help firefighters find escape routes or victims inside buildings filled with fire and smoke. The technique could also help track hypersonic objects such as missiles and space debris.

The new method, described in Nature Communications, could provide critical information to help reduce deaths and injuries. Locating and tracking first responders indoors is a prime goal for the public safety community. Hundreds of thousands of pieces of orbiting space junk are considered dangerous to humans and spacecraft.

"Our system allows real-time imaging around corners and through walls and tracking of fast-moving objects such as millimeter-sized space debris flying at 10 kilometers per second, more than 20,000 miles per hour, all from standoff distances," said physicist Fabio da Silva, who led the development of the system while working at NIST.

"Because we use radio signals, they go through almost everything, like concrete, drywall, wood and glass," da Silva added. "It's pretty cool because not only can we look behind walls, but it takes only a few microseconds of data to make an image frame. The sampling happens at the speed of light, as fast as physically possible."

The NIST imaging method is a variation on radar, which sends an electromagnetic pulse, waits for the reflections, and measures the round-trip time to determine distance to a target. Multisite radar usually has one transmitter and several receivers that receive echoes and triangulate them to locate an object.

"We exploited the multisite radar concept but in our case use lots of transmitters and one receiver," da Silva said. "That way, anything that reflects anywhere in space, we are able to locate and image."

Da Silva explains the imaging process like this:

To image a building, the actual volume of interest is much smaller than the volume of the building itself because it's mostly empty space with sparse stuff in it. To locate a person, you would divide the building into a matrix of cubes. Ordinarily, you would transmit radio signals to each cube individually and analyze the reflections, which is very time consuming. By contrast, the NIST method probes all cubes at the same time and uses the return echo from, say, 10 out of 100 cubes to calculate where the person is. All transmissions will return an image, with the signals forming a pattern and the empty cubes dropping out.

Da Silva has applied for a patent, and he recently left NIST to commercialize the system under the name m-Widar (microwave image detection, analysis and ranging) through a startup company, Wavsens LLC (Westminster, Colorado).

The NIST team demonstrated the technique in an anechoic (non-echoing) chamber, making images of a 3D scene involving a person moving behind drywall. The transmitter power was equivalent to 12 cellphones sending signals simultaneously to create images of the target from a distance of about 10 meters (30 feet) through the wallboard.

Da Silva said the current system has a potential range of up to several kilometers. With some improvements the range could be much farther, limited only by transmitter power and receiver sensitivity, he said.

The basic technique is a form of computational imaging known as transient rendering, which has been around as an image reconstruction tool since 2008. The idea is to use a small sample of signal measurements to reconstruct images based on random patterns and correlations. The technique has previously been used in communications coding and network management, machine learning and some advanced forms of imaging.

Da Silva combined signal processing and modeling techniques from other fields to create a new mathematical formula to reconstruct images. Each transmitter emits different pulse patterns simultaneously, in a specific type of random sequence, which interfere in space and time with the pulses from the other transmitters and produce enough information to build an image.
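The release describes the reconstruction only in general terms -- many transmitters emitting distinct random patterns at once, with a sparse scene recovered from a small number of measurements and their correlations. The toy sketch below illustrates that general idea with a compressed-sensing-style recovery (orthogonal matching pursuit over random Gaussian patterns); the problem sizes, the random patterns and the solver are illustrative assumptions, not da Silva's actual formula.

```python
import numpy as np

rng = np.random.default_rng(0)

n_cubes = 100        # voxels ("cubes") in the volume of interest
n_occupied = 3       # sparse scene: only a few cubes reflect
n_measurements = 30  # simultaneous random-pattern probes

# True scene: reflectivity of each cube (mostly empty space).
scene = np.zeros(n_cubes)
scene[rng.choice(n_cubes, n_occupied, replace=False)] = rng.uniform(1.0, 2.0, n_occupied)

# Each measurement mixes every cube with a different random pattern,
# standing in for interfering pulse sequences from many transmitters.
patterns = rng.standard_normal((n_measurements, n_cubes))
echoes = patterns @ scene

def omp(A, y, sparsity):
    """Orthogonal matching pursuit: greedy sparse recovery of x from y = A x."""
    residual, support = y.copy(), []
    x = np.zeros(A.shape[1])
    for _ in range(sparsity):
        support.append(int(np.argmax(np.abs(A.T @ residual))))
        coeffs, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coeffs
    x[support] = coeffs
    return x

recovered = omp(patterns, echoes, n_occupied)
print("occupied cubes (true):     ", np.flatnonzero(scene))
print("occupied cubes (recovered):", np.flatnonzero(recovered > 1e-6))
```

In this noiseless toy, 30 random-pattern measurements are enough to locate the three reflecting cubes out of 100 exactly, mirroring the idea that echoes from a handful of occupied cubes suffice to place a person in the volume.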

The transmitting antennas operated at frequencies from 200 megahertz to 10 gigahertz, roughly the upper half of the radio spectrum, which includes microwaves. The receiver consisted of two antennas connected to a signal digitizer. The digitized data were transferred to a laptop computer and uploaded to the graphics processing unit to reconstruct the images.

The NIST team used the method to reconstruct a scene from 1.5 billion samples per second, corresponding to an image frame rate of 366 kilohertz, or about 366,000 frames per second. By comparison, this is about 100 to 1,000 times more frames per second than a cellphone video camera.

With 12 antennas, the NIST system generated 4096-pixel images, with a resolution of about 10 centimeters across a 10-meter scene. This image resolution can be useful when sensitivity or privacy is a concern. However, the resolution could be improved by upgrading the system using existing technology, including more transmitting antennas and faster random signal generators and digitizers.
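Those two figures are mutually consistent if one assumes roughly one digitized sample per image pixel per frame (an assumption the release does not state explicitly):

\[
\frac{1.5\times 10^{9}\ \text{samples/s}}{4096\ \text{pixels/frame}} \approx 3.66\times 10^{5}\ \text{frames/s} = 366\ \text{kHz}.
\]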

In the future, the images could be improved by using quantum entanglement, in which the properties of individual radio signals become interlinked. Entanglement-based schemes such as radio-frequency quantum illumination could increase reception sensitivity.

The new imaging technique could also be adapted to transmit visible light instead of radio signals -- ultrafast lasers could boost image resolution but would lose the capability to penetrate walls -- or sound waves used for sonar and ultrasound imaging applications.

In addition to imaging of emergency conditions and space debris, the new method might also be used to measure the velocity of shock waves, a key metric for evaluating explosives, and to monitor vital signs such as heart rate and respiration, da Silva said.

Credit: 
National Institute of Standards and Technology (NIST)

A new concept stent that suppresses adverse effects with cells

image: Calcein-stained live OECs adhered on a bare metal stent (BMS) and a pDA/FN/ECM-coated stent, respectively.

Image: 
Korea Institute of Science and Technology (KIST)

Medical materials that can be inserted into the human body have been used for decades in the field of regenerative medicine - for example, stents that can help dilate clogged blood vessels and implants that can replace teeth or bones. The prolonged use of these materials can result in serious adverse effects and loss of various functions - for example, inflammatory responses, generation of fibrous tissues around the material, and generation of blood clots that block blood vessels.

Recently, a Korean research team has drawn attention for developing a technology that reduces these adverse effects by accumulating substances from the cell periphery on the surfaces of the materials. The Korea Institute of Science and Technology (KIST) announced that the research team of Dr. Yoon Ki Joung, from the Center for Biomaterials, has developed a material that allows substances present at the cell periphery to accumulate on the surface of implantable medical materials. The research was carried out in collaboration with the team of Professor Dong Keun Han of CHA University (President Dong-Ik Kim). Because the material can be loaded with therapeutic cells such as stem cells, it can also be used to deliver cell-based therapeutics to the desired sites.

The researchers coated the surface of the material with a compound (polydopamine) and a protein (fibronectin) that bond strongly both to the material and to biological matter, and then used the coated surfaces for cell culture. The cultured cells deposited constituents of the cell periphery (three types of extracellular matrix). The cells were then removed while the extracellular matrix was left intact, creating space for the attachment of the cells needed for medical purposes. Because the extracellular matrix has a high affinity for cells, it enables their adhesion and survival in vivo; it can therefore deliver the required cells to treatment sites effectively and mitigate the adverse effects caused by the implanted materials.

The researchers applied the developed material to the surface of a stent, a medical device used to dilate clogged blood vessels. Because stents physically expand blood vessels, they can wound the treated site and cause inflammation, blood clots or re-blockage. When the coated stent was loaded with endothelial progenitor cells, which can regenerate blood vessels, it showed excellent vasodilation, and the damaged inner walls of the blood vessels were regenerated as well. This regeneration of the inner walls reduced the rate of neointimal formation by more than 70%.

Dr. Yoon Ki Joung of KIST said, "This technology can be used to improve various materials that are inserted into the human body. Therefore, it is expected to provide a universal platform for the development of implantable diagnostic and treatment devices (that can potentially dictate the future of technology in the field) and medical devices such as stents and implants that require long-term implantation."

Credit: 
National Research Council of Science & Technology

Can a calculator predict your risk of dementia?

image: Researchers have built and validated an online calculator that empowers individuals 55 and over to better understand the health of their brain and how they can reduce their risk of being diagnosed with dementia in the next five years.

Image: 
Project Big Life

Canadian researchers at The Ottawa Hospital, the University of Ottawa, the Bruyère Research Institute and ICES have built and validated an online calculator that empowers individuals 55 and over to better understand the health of their brain and how they can reduce their risk of being diagnosed with dementia in the next five years.

Their process was published today in the Journal of Epidemiology and Community Health, and the calculator is available at projectbiglife.ca.

Dementia is an umbrella term for loss of memory and other thinking abilities severe enough to interfere with daily life. Every year, 76,000 new cases of dementia are diagnosed in Canada, a number expected to increase as the population ages.

There is no cure or treatment for dementia. However, about a third of dementia may be preventable through lifestyle factors like physical activity, healthy eating, reducing alcohol and tobacco use, and managing conditions like diabetes and high blood pressure.

The researchers based the dementia calculator on survey data from over 75,000 Ontarians.

"What sets this dementia risk calculator apart is that you don't need to visit a doctor for any tests," said Dr. Stacey Fisher, the lead author of the study who performed the research largely in Ottawa while she was a PhD student supervised by Dr. Doug Manuel and Dr. Peter Tanuseputro at The Ottawa Hospital. "People already have all the information they need to complete the calculator in the comfort of their home." Dr. Fisher is currently a postdoctoral fellow at the University of Toronto and Public Health Ontario.

Factors in the Dementia Population Risk Tool (DemPoRT) include:

Age

Smoking status and lifetime exposure

Alcohol consumption

Physical activity

Stress

Diet

Sense of belonging

Ethnicity

Immigration status

Socioeconomic status of the neighbourhood

Education

Activities where assistance is needed

Marital status

Number of languages spoken

Health conditions

The calculator can be used by individuals to assess their dementia risk and to help them modify their lifestyle. The researchers also hope policy makers will use the algorithm to do the same for the general population.

Through this research, the team has developed the first tool designed to predict dementia risk at the population level. It can predict the number of new cases in the community, identify higher-risk populations and inform dementia prevention strategies, and it will be used to support Canada's national dementia strategy. Because the algorithm relies on regularly collected health data and surveys, population health experts already have all the information they need to use it.
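The release does not describe DemPoRT's mathematical form. Survey-based risk algorithms of this kind typically combine self-reported predictors into a five-year risk through a survival-model-style formula, and the sketch below is a generic, hypothetical illustration of that pattern only: the predictors mirror a few items from the list above, but the coefficients, the baseline survival value and the function names are invented and are not DemPoRT's.

```python
from math import exp

# Hypothetical coefficients (log hazard ratios) for a few DemPoRT-style
# predictors -- invented for illustration, not the published model.
COEFFS = {
    "age_per_year_over_55": 0.09,
    "current_smoker": 0.30,
    "physically_inactive": 0.25,
    "diabetes": 0.35,
    "less_than_secondary_education": 0.20,
}
BASELINE_5YR_SURVIVAL = 0.97  # assumed dementia-free probability at reference values

def five_year_dementia_risk(profile: dict) -> float:
    """Proportional-hazards-style risk: 1 - S0(5) ** exp(linear predictor)."""
    linear_predictor = sum(COEFFS[k] * v for k, v in profile.items())
    return 1.0 - BASELINE_5YR_SURVIVAL ** exp(linear_predictor)

# Example: a hypothetical 68-year-old, physically inactive smoker with diabetes.
example = {
    "age_per_year_over_55": 13,
    "current_smoker": 1,
    "physically_inactive": 1,
    "diabetes": 1,
    "less_than_secondary_education": 0,
}
print(f"Illustrative 5-year risk: {five_year_dementia_risk(example):.1%}")
```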

"This tool will give people who fill it out clues to what they can do to reduce their personal risk of dementia," said Dr. Peter Tanuseputro, senior author of the study, and scientist at The Ottawa Hospital, investigator at the Bruyère Research Institute, adjunct scientist at ICES and assistant professor at the University of Ottawa. "The COVID-19 pandemic has also made it clear that sociodemographic variables like ethnicity and neighbourhood play a major role in our health. It was important to include those variables in the tool so policy makers can understand how different populations are impacted by dementia, and help ensure that any prevention strategies are equitable."

The dementia calculator will be added to a list of existing calculators on Project Big Life that help Canadians estimate their own life expectancy based on habits and lifestyle choices.

The calculator was based on data from the Statistics Canada Canadian Community Health Surveys housed at ICES. Currently designed for use in Canada, it can be adapted for any of the 100 countries around the world that collect health survey data.

Credit: 
The Ottawa Hospital

Researchers solve a puzzle to design larger proteins

image: (left) The strand order swapping in de novo design of larger alpha-beta proteins has been a long-standing problem for the research team.
(right) Backbone ensembles generated from folding simulations identified that backbone strain caused the strand swapping.

Image: 
NINS/IMS

A team from Japan and the United States has identified the design principles for creating large "ideal" proteins from scratch, paving the way for the design of proteins with new biochemical functions.

Their results appear June 24, 2021, in Nature Communications.

The team had previously developed principles to design small versions of what they call "ideal proteins," which are structures without internal energetic frustration.

Such proteins are typically designed with a molecular feature called beta strands, which serve a key structural role for the molecules. In previous designs, the researchers successfully designed alpha-beta proteins with four beta strands.

"The ideal proteins we have created so far are much more stable and more soluble than proteins commonly found in nature. We think these proteins will become useful starting points for designing new biochemical functions of interest," said co-first author Rie Koga, researcher in Exploratory Research Center of Life and Living Systems of Japan's National Institutes of Natural Sciences (NINS).

The team found that while the designed proteins were structurally ideal, they were too small to harbor functional sites.

"We set out to test the generality of the design principles we developed previously by applying them to the design of larger alpha-beta proteins with five and six beta strands," said co-first author Nobuyasu Koga, associate professor in the Institute for Molecular Science of NINS.

The results were puzzling. They found that their experimental structures differed from their computer models, resulting in proteins that folded differently by swapping the internal locations of their beta strands. The team struggled with the strand swapping puzzle, but by iterating between computational design and laboratory experiments, they reached a conclusion.

"We emphasize that experimental structure determination is important for iterative improvement of computational protein design," said co-first author Gaohua Liu, chief scientific officer of Nexomics Biosciences.

"Sometimes we learn the most from these ideal proteins when their experimental structures differ, rather than match, their intended design, since this can lead to a deeper understanding of the underlying principles", added Gaetano Montelione, co-author and professor of chemistry and chemical biology at Rensselaer Polytechnic Institute.

The strand swapping, they determined, was caused by strain imposed by the whole system on the foundational backbone structure. According to Nobuyasu Koga, the strain is global rather than localized at individual connections, and proteins can adjust the length and register of strands across the system to alleviate this backbone strain.

Next, the researchers plan to continue studying the trade-off between making proteins more functional and accepting what could be considered less-than-ideal structural qualities.

"We would like to design proteins with more complex functional sites by incorporating non-ideal features such as longer loops, which are important not only for function but also for relieving global backbone strain," said David Baker, co-author and professor of biochemistry at the University of Washington.

Credit: 
National Institutes of Natural Sciences

New knowledge of Earth's mantle helps to explain Indonesia's explosive volcanoes

image: Agung, a volcano in Bali, had an explosive eruption in 2018.

Image: 
O.L. Andersen

Indonesia's volcanoes are among the world's most dangerous. Why? Through chemical analyses of tiny minerals in lava from Bali and Java, researchers from Uppsala University and elsewhere have found new clues. They now understand better how the Earth's mantle is composed in that particular region and how the magma changes before an eruption. The study is published in Nature Communications.

Frances Deegan, the study's first author and a researcher at Uppsala University's Department of Earth Sciences, summarises the findings.

"Magma is formed in the mantle, and the composition of the mantle under Indonesia used to be only partly known. Having better knowledge of Earth's mantle in this region enables us to make more reliable models for the chemical changes in magma when it breaks through the crust there, which is 20 to 30 kilometres thick, before an eruption."

The composition of magma varies greatly from one geological environment to another, and has a bearing on the kind of volcanic eruption that occurs. The Indonesian archipelago was created by volcanism, caused by two of Earth's tectonic plates colliding there. In this collision, the Indo-Australian plate slides beneath the Eurasian plate at a rate of some 7 cm per year. This process, known as subduction, can cause powerful earthquakes. The tsunami disaster of 2004, for example, was caused by movements along this particular plate boundary.

Volcanism, too, arises in subduction zones. When the sinking tectonic plate descends into the mantle, it heats up and the water it contains is released, causing the surrounding rock to start melting. The result is volcanoes that are often explosive and, over time, build up arc-shaped groups of islands. Along the Sunda Arc, comprising Indonesia's southern archipelago, several cataclysmic volcanic eruptions have taken place. Examples are Krakatoa in 1883, Mount Tambora in 1815 and Toba, which had a massive super-eruption some 72,000 years ago.

Magma reacts chemically with the surrounding rock as it penetrates Earth's crust before breaking out at the surface, so its composition can vary widely among volcanoes. To get a better grasp of the origin of volcanism in Indonesia, the researchers wanted to determine the composition of the "primary" magma, that is, the magma derived directly from the mantle. Since samples cannot be taken directly from the mantle, the geologists studied minerals in lava recently ejected from four volcanoes: Merapi and Kelut in Java, and Agung and Batur in Bali.

Using the powerful ion beams from a secondary ion mass spectrometry (SIMS) instrument, an ultramodern form of mass spectrometer, the researchers examined crystals of pyroxene. This mineral is one of the first to crystallise from a magma. What they wanted to determine was the ratio of the oxygen isotopes 16O and 18O, which reveals a great deal about the source and evolution of magma.

"Lava consists of roughly 50 per cent oxygen, and Earth's crust and mantle differ hugely in their oxygen isotope composition. So, to trace how much material the magma has assimilated from the crust after leaving the mantle, oxygen isotopes are very useful," Frances Deegan says.

The researchers found that the oxygen composition of pyroxene minerals from Bali had hardly been affected at all during their journey through Earth's crust. Their composition was fairly close to their original state, indicating that a minimum of sediment had been drawn down into the mantle during subduction. An entirely different pattern was found in the minerals from Java.

"We were able to see that Merapi in Java exhibited an isotope signature very different from those of the volcanoes in Bali. It's partly because Merapi's magma interacts intensively with Earth's crust before erupting. That's highly important because when magma reacts with, for instance, the limestone that's found in central Java right under the volcano, the magma becomes full to bursting point with carbon dioxide and water, and the eruptions get more explosive. That may be why Merapi's so dangerous. It's actually one of the deadliest volcanoes in Indonesia: it's killed nearly 2,000 people in the past 100 years, and the most recent eruption claimed 400 lives," says Professor Valentin Troll of Uppsala University's Department of Earth Sciences.

The study is a collaboration among researchers at Uppsala University, the Swedish Museum of Natural History in Stockholm, the University of Cape Town in South Africa, the University of Freiburg in Germany and Vrije Universiteit (VU) Amsterdam in the Netherlands. The results of the study enhance our understanding of how volcanism in the Indonesian archipelago works.

"Indonesia is densely populated, and everything that gives us a better grasp of how these volcanoes work is valuable, and helps us to be better prepared for when the volcanoes erupt," says Frances Deegan.

Credit: 
Uppsala University

Some good news for those with migraines

UNIVERSITY OF TORONTO

TORONTO, ON - A new study from researchers at the University of Toronto found that 63% of Canadians with migraine headaches are able to flourish, despite the painful condition.

"This research provides a very hopeful message for individuals struggling with migraines, their families and health professionals," says lead author Esme Fuller-Thomson, who spent the last decade publishing on negative mental health outcomes associated with migraines, including suicide attempts, anxiety disorders and depression. "The findings of our study have contributed to a major paradigm shift for me. There are important lessons to be learned from those who are flourishing."

A migraine headache, which afflicts one in eight Americans, is the seventh most disabling disorder in the world. However, few studies have investigated the factors that are associated with mental health and well-being among those who experience them.

The University of Toronto study investigated optimal mental health in a large, representative sample of more than 2,000 Canadians with migraines. To be considered in excellent mental health, respondents had to meet three criteria:
1) almost daily happiness or life satisfaction in the past month, 2) high levels of social and psychological well-being in the past month, and 3) freedom from generalized anxiety disorder and depressive disorders, suicidal thoughts and substance dependence for at least the preceding full year.

"We were so encouraged to learn that more than three in every five migraineurs were in excellent mental health and had very high levels of well-being," says Fuller-Thomson, a Professor at both the Factor-Inwentash Faculty of Social Work and the Department of Family & Community Medicine at U of T and who the director of U of T's Institute for Life Course and Aging.

Those experiencing migraines who had at least one person in their lives in whom they could confide were four times more likely to be in excellent mental health than those without a confidant. In addition, those who turned to their religious or spiritual beliefs to cope with everyday difficulties had 86% higher odds of excellent mental health than those who did not use spiritual coping. The researchers also found that poor physical health, functional limitations and a history of depression were impediments to excellent mental health among those with migraines.

"Health professionals who are treating individuals with migraines need to consider their patients' physical health needs and possible social isolation in their treatment plans" says co-author Marta Sadkowski, a recent nursing graduate from the University of Toronto.

The researchers examined a nationally representative sample of 2,186 Canadian community-dwelling adults who reported that they had been diagnosed with migraines by a health professional. The data were drawn from Statistics Canada's Canadian Community Health Survey-Mental Health. This research was published online ahead of print this month in the Annals of Headache Medicine.

Credit: 
University of Toronto

Setting gold and platinum standards where few have gone before

image: Eight gold samples, four per panel, prior to assembly of the panels into a "stripline" target for Sandia National Laboratories' Z machine. There they were vaporized by the enormous pressures produced by Z's 20-million-ampere current pulse. This arrangement permits four measurements, one for each pair of samples, with one sample of each pair located at the same position on each panel.

Image: 
Photo by Leo Molina

ALBUQUERQUE, N.M. -- Like two superheroes finally joining forces, Sandia National Laboratories' Z machine -- generator of the world's most powerful electrical pulses -- and Lawrence Livermore National Laboratory's National Ignition Facility -- the planet's most energetic laser source -- in a series of 10 experiments have detailed the responses of gold and platinum at pressures so extreme that their atomic structures momentarily distorted like images in a fun-house mirror.

Similar high-pressure changes induced in other settings have produced oddities like hydrogen appearing as a metallic fluid, helium in the form of rain and sodium as a transparent metal. But until now there has been no way to accurately calibrate these pressures and responses, the first step to controlling them.

Said Sandia manager Chris Seagle, an author of a technical paper recently published by the journal Science, "Our experiments are designed to measure these distortions in gold and platinum as a function of time. Compression gives us a measurement of pressure versus density."

Following experiments on the two big machines, researchers developed tables of gold and platinum responses to extreme pressure. "These will provide a standard to help future researchers calibrate the responses of other metals under similar stress," said Jean-Paul Davis, another paper author and Sandia's lead scientist in the effort to reliably categorize extreme data.

Data generated by experiments at these pressures -- roughly 1.2 terapascals (a terapascal is 1 trillion pascals), an amount of pressure relevant to nuclear explosions -- can aid understanding the composition of exoplanets, the effects and results of planetary impacts, and how the moon formed.

The technical unit called the pascal is so small that it is often quoted in multiples of thousands, millions, billions or trillions. It may be easier to visualize the scale of these effects in terms of atmospheric pressure units: the pressure at the center of the Earth is approximately 3.6 million times the atmospheric pressure at sea level, or 3.6 million atmospheres. Z's data reached 4 million atmospheres, while the National Ignition Facility reached 12 million atmospheres.
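For reference, the atmosphere figures quoted here follow directly from the standard conversion 1 atm = 101,325 Pa, so the roughly 400-gigapascal and 1.2-terapascal regimes mentioned elsewhere in this article correspond to the 4 million and 12 million atmospheres cited above:

\[
400\ \mathrm{GPa} = \frac{4\times 10^{11}\ \mathrm{Pa}}{1.013\times 10^{5}\ \mathrm{Pa/atm}} \approx 3.9\times 10^{6}\ \mathrm{atm},
\qquad
1.2\ \mathrm{TPa} = \frac{1.2\times 10^{12}\ \mathrm{Pa}}{1.013\times 10^{5}\ \mathrm{Pa/atm}} \approx 1.2\times 10^{7}\ \mathrm{atm}.
\]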

The force of the diamond anvil

Remarkably, such pressures can be generated in the laboratory by a simple compression device called a diamond anvil.

However, "We have no standards for these extreme pressure ranges," said Davis. "While investigators see interesting events, they are hampered in comparing them with each other because what one researcher presents at 1.1 terapascals is only 0.9 on another researcher's scale."

What's needed is an underlying calibration tool, such as the numerical table these experiments helped to create, he said, so that scientists are talking about results achieved at the same documented amounts of pressure.

"The Z-NIF experiments will provide this," Davis said.

The overall experiments, under the direction of Lawrence Livermore researcher D. E. Fratanduono, relied on Z machine's accuracy as a check on NIF's power.

Z's accuracy, NIF's power

Z's force is created by its powerful shockless magnetic field, generated for hundreds of nanoseconds by its 20 million-ampere pulse. For comparison, a 120-watt bulb uses one ampere.

The accuracy of this method was used to recalibrate the higher pressures achieved using NIF methods.

NIF's pressures exceeded those at the core of the planet Saturn, which is 850 gigapascals. But its laser-compression experiments sometimes required a small shock at the start of the compression wave, raising the material's temperature, which can distort measurements intended to set a standard.

"The point of shockless compression is to keep the temperature relatively low for the materials being studied," said Seagle. "Basically the material does heat as it compresses, but it should remain relatively cool -- hundreds of degrees -- even at terapascal pressures. Initial heating is a troublesome start."

Another reason that Z -- which contributed half the number of "shots," or firings, and about one-third of the data -- was considered the standard for results up to 400 gigapascals was that Z's samples were roughly 10 times as large: 600 to 1,600 microns thick, compared with 60 to 90 microns on NIF. A micron is a thousandth of a millimeter.

Larger samples, slower pulses equal easier measurements

"Because they were larger, Z's samples were less sensitive to the microstructure of the material than were NIF's," said Davis. "Larger samples and slower pulses are simply easier to measure to high relative precision. Combining the two facilities really tightly constrained the standards."

Combining Z and NIF data meant that the higher-accuracy, but lower-intensity Z data could be used to pin down the low-to-medium pressure response, and with mathematical adjustments, reduce error on the higher-pressure NIF data.

"The purpose of this study was to produce highly accurate pressure models to approximately one terapascal. We did that, so this combination of facilities has been advantageous," said Seagle.

Credit: 
DOE/Sandia National Laboratories

International study of rare childhood cancer finds genetic clues, potential for tailored therapy

image: New findings suggest that children with rhabdomyosarcoma could benefit from tumor genetic testing.

Image: 
National Human Genome Research Institute

In children with rhabdomyosarcoma, or RMS, a rare cancer that affects the muscles and other soft tissues, the presence of mutations in several genes, including TP53, MYOD1, and CDKN2A, appear to be associated with a more aggressive form of the disease and a poorer chance of survival. This finding is from the largest-ever international study on RMS, led by scientists at the National Cancer Institute’s (NCI) Center for Cancer Research, part of the National Institutes of Health.

The study, published in the Journal of Clinical Oncology on June 24, provides an unprecedented look at data for a large cohort of patients with RMS, offering genetic clues that could lead to more widespread use of tumor genetic testing to predict how individual patients with this childhood cancer will respond to therapy, as well as to the development of targeted treatments for the disease.

“These discoveries change what we do with these patients and trigger a lot of really important research into developing new therapies that target these mutations,” said Javed Khan, M.D., of NCI’s Genetics Branch, who led the study.

“The standard therapy for RMS is almost a year of chemotherapy, radiation therapy, and surgery. These children get a lot of toxic treatments,” said the study’s first author, Jack Shern, M.D., of NCI’s Pediatric Oncology Branch. “If we could predict who's going to do well and who's not, then we can really start to tailor our therapies or eliminate therapies that aren't going to be effective in a particular patient. And for the children that aren't going to do well, this allows us to think about new ways to treat them.”

RMS is the most common type of soft tissue sarcoma in children. In patients whose cancer has remained localized, meaning that it has not spread, combination chemotherapies have led to a five-year survival rate of 70%-80%. But in patients whose cancer has spread or come back after treatment, the five-year survival rate remains poor at less than 30%, even with aggressive treatment.

Doctors have typically used clinical features, such as the location of the tumor in the body, as well as its size and to what extent it has spread, to predict how patients will respond to treatment, but this approach is imprecise. More recently, scientists have discovered that the presence of the PAX-FOXO1 fusion gene that is found in some patients with RMS is associated with poorer survival. Patients are now being screened for this genetic risk factor to help determine how aggressive their treatment should be.

Scientists have also begun using genetic analysis to dig more deeply into the molecular workings of RMS in search of other genetic markers of poorer survival. In this new study — the largest genomic profiling effort of RMS tumors to date — scientists from NCI and the Institute for Cancer Research in the United Kingdom analyzed DNA from tumor samples from 641 children with RMS enrolled over a two-decade period in several clinical trials. Scientists searched for genetic mutations and other aberrations in genes previously associated with RMS and linked that information with clinical outcomes. Among the patterns that emerged, patients with mutations in the tumor suppressor genes TP53, MYOD1, or CDKN2A had a poorer prognosis than patients without those mutations.

Using next-generation sequencing, researchers found a median of one mutation per tumor. Patients with two or more mutations per tumor had even poorer survival outcomes. In patients without the PAX-FOXO1 fusion gene, more than 50% had mutations in the RAS pathway genes, although RAS mutations did not appear to be associated with survival outcomes in this study.

The researchers believe that although they have identified the major mutations that may drive RMS development or provide information about prognosis, they have only scratched the surface in defining the genetics of this cancer, with many more mutations yet to be discovered. They note that more work is needed to identify targeted drugs for those mutations, and future clinical trials could incorporate genetic markers to more accurately classify patients into treatment groups. Two NCI-sponsored Children’s Oncology Group clinical trials are currently being developed using these markers, and all participants will have their tumors molecularly profiled.

The researchers hope that routine tumor genetic testing for rare cancers, such as RMS, will soon be a standard part of the treatment plan, as it is for more common cancers, such as breast cancer.

“Genetic testing is going to become the standard of care,” said Dr. Shern. “Instead of just the pathologists looking at these tumors, we’re now going to have molecular profiling, and that’s a leap forward.”

Credit: 
NIH/National Cancer Institute

Cyclone study improves climate projections

image: The northern westerly jet stream, indicated by the thick red arrow, is especially important in regulating the winter weather in the Northern Hemisphere.

Image: 
CC-0 NASA

Migrating storms and local weather systems known as cyclones and anticyclones have long been thought to shape the behavior and properties of our global weather system. However, the means to probe individual cyclones and anticyclones were limited. For the first time, researchers have demonstrated a new three-dimensional analytical methodology that can quantify the way individual cyclones and anticyclones impact broader weather systems. This study aids longer-term circulation and climate studies, including how storm characteristics may change in the future.

To many people, the term cyclone probably conjures up images of ferocious storm winds or the tornado central to the plot of The Wizard of Oz. However, cyclones and anticyclones are a broader set of large-scale weather phenomena that are, as it turns out, crucial to the way our global climate functions. Cyclones are simply systems of winds rotating around an area of low air pressure. These tend to indicate rainy or stormy weather and rotate clockwise south of the equator and counterclockwise to the north. Anticyclones, conversely, are weather systems rotating around areas of high pressure and tend to indicate calmer and sunnier weather. They rotate the opposite way to cyclones in either hemisphere.

"For decades now, atmospheric scientists have used a technique known as the Eulerian approach (named after 18th-century mathematician Leonhard Euler) to analyze long-term three-dimensional atmospheric data," said Project Research Associate Satoru Okajima from the Research Center for Advanced Science and Technology (RCAST) at the University of Tokyo. "However, the approach considers cyclones and anticyclones to be merely deviations from a background average, and not separate entities in their own right. Furthermore, anticyclones are often unconsciously disregarded, unlike cyclones, possibly because of their association with calmer weather."

Okajima, Professor Hisashi Nakamura from RCAST and Professor Yohai Kaspi from the Weizmann Institute of Science in Israel, adopted a novel technique to isolate data from winds rotating around an area of low or high air pressure from ever-present background winds such as the westerly jet stream, the fast-flowing air currents between 30 degrees and 60 degrees latitude in both hemispheres. This approach allowed the team to evaluate the effect local curvature, essentially the shape, of cyclones and anticyclones had on the westerly jet stream. This is in contrast to one of the previous standard ways to view these patterns called relative vorticity, which could not reveal such fine details.
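As a point of reference for the "relative vorticity" view that the new curvature-based decomposition improves on, the conventional quantity is straightforward to compute from a gridded wind field. The sketch below uses a made-up grid spacing and an idealized wind field, and it does not reproduce the study's curvature-based separation; it simply computes relative vorticity, zeta = dv/dx - du/dy, with finite differences and applies the Northern Hemisphere sign convention for cyclonic versus anticyclonic rotation.

```python
import numpy as np

# Toy gridded wind field on a uniform 100-km grid (illustrative values only).
dx = dy = 100e3  # metres
ny, nx = 80, 120
y, x = np.meshgrid(np.arange(ny) * dy, np.arange(nx) * dx, indexing="ij")

# An idealized cyclonic vortex embedded in a westerly jet.
x0, y0, r0 = 6.0e6, 4.0e6, 8.0e5
r2 = (x - x0) ** 2 + (y - y0) ** 2
u = 20.0 - 15.0 * (y - y0) / r0 * np.exp(-r2 / r0**2)   # zonal wind (m/s)
v = 15.0 * (x - x0) / r0 * np.exp(-r2 / r0**2)          # meridional wind (m/s)

# Relative vorticity zeta = dv/dx - du/dy (s^-1), via centred finite differences.
zeta = np.gradient(v, dx, axis=1) - np.gradient(u, dy, axis=0)

# Northern Hemisphere convention: positive zeta is cyclonic, negative anticyclonic.
cyclonic = zeta > 0
print(f"max |zeta| = {np.abs(zeta).max():.2e} s^-1, "
      f"cyclonic fraction of grid = {cyclonic.mean():.2f}")
```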

"It was extremely challenging to quantitatively separate the cyclones from the anticyclones, and many attempts along the way fell flat. But our successful method can now be applied to various climate-model simulations and will hopefully help researchers better project the future of our warming climate," said Okajima. "Climate science is important to all of us as it affects so many things. But it is also particularly interesting as it combines so many subfields like oceanography, hydrology, computer science, physics, chemistry and mathematics. I hope our contribution can be a useful tool to climate scientists for making predictions about our ever-changing world."

Credit: 
University of Tokyo

Mixed cultures for a greater yield

image: A two-species mixture with two alternating rows of oats and blue lupin in Spain.

Image: 
Christian Schoeb, Crop Diversity Experiment / ETH Zurich

Monocultures dominate arable land today, with vast areas given over to single elite varieties that promise a high yield. But planting arable land with just one type of crop has its disadvantages: these areas are easy game for fungal and insect pests, posing a threat to crops. To keep pests at bay, farmers are having to use resistant varieties and various pesticides.

Mixed cultures present a potential alternative to monocultures. Rather than having large expanses of land planted with just one species or variety, several species or varieties are sown alongside each other. However, as little research has been done into this method, especially from an agricultural perspective, mixed cultures are rare in arable farming.

A team led by ETH Zurich Professor Christian Schöb has now revealed that mixed cultures actually produce a much higher yield than monocultures in arable farming. Their study was recently published in the journal Nature Plants.

Applying an ecological principle

Mixed cultures draw on the ecological principle that ecosystems are able to perform their functions more effectively where there is greater biodiversity. These functions include regulating water balance, maintaining soil fertility and increasing plant productivity.

This also goes for agricultural ecosystems: "Research into agriculturally used meadows has shown that areas with a larger mix of plants are more productive than those with just one or a few species," Schöb says.

Until now, barely any comparable studies had focused on arable farming. That is why, together with his colleagues, Schöb set out to investigate whether this basic ecological mechanism would also have a bearing on arable land, specifically with regard to yield.

The researchers created two test gardens: one in Switzerland, on the University of Zurich's Irchel campus, and the other in the Spanish province of Extremadura. The latter has a much drier and warmer climate than Zurich, allowing the researchers to examine how the crops grow under potential future climate conditions.

Greater yield from mixing just two crops

In their experiment, the researchers tested mixtures of two or four different crops chosen from eight selected species comprising wheat, oat, quinoa, lentil, lupin, flax and false flax (an oilseed similar to rapeseed) as well as coriander. Only the seeds of the different species were used. The plants were sown 12 centimetres apart in alternating, parallel rows.

The researchers compared the seed mass from the mixed-culture crops with those from monocultures. They also measured plants' biomass based on their growth above ground.

The result speaks for itself: compared to monoculture farming, even a mixture of two species increased yield by 3 percent in Spain and 21 percent in Switzerland. Where the researchers had sown four species alongside each other, the yield increase was as high as 13 and 44 percent in Spain and Switzerland respectively.

The researchers explained that this additional yield primarily comes down to the biodiversity effect: a greater variety of plants results in a better use of available resources and more effective, natural pest control - the experiments were conducted without pesticides.

Plants put a lot of energy into leaves and stems

The researchers also noted, however, that the plants in mixed cultures developed more leaves or stems than in monocultures. In other words, the plants invested more energy and matter in producing vegetative biomass and proportionally less in producing seeds. Schöb explained that the plants have to make a compromise: the more effort they put into vegetative biomass, the less they have left for seeds. "Despite this, the plants still produced more seeds on balance than in a monoculture," says the agricultural researcher.

He attributed the fact that the plants invested more energy in creating vegetative biomass to the varieties used in the experiments. "The seeds are bred specifically for monocultures. This means the plants are designed to perform best when they grow among other plants of the same variety."

Schöb deems it likely that the potential for extra yield is even greater with seeds suited to mixed cultures.

Over time, people have changed most crops to produce larger fruit and higher yield under monoculture conditions. The tomatoes grown today are gigantic compared with their wild counterparts, which have fruit the size of blueberries. In order to get the best yield possible from plants grown in mixed cultures, current breeding methods - targeting cultivation in monocultures - have to be adjusted slightly.

Sowing the right seeds

As things stand, no seeds are produced or marketed specifically for use in mixed cultures. The researchers are therefore busy harvesting and testing seeds from their own experiments. "We want to repeat our experiments using these self-produced seeds so we can test whether selection in a mixed culture literally does bear fruit," Schöb says.

A change in agricultural practice is, however, required if mixed cultures are to gain ground. Among other things, machines need to be able to harvest different crops at the same time and also separate the different harvest products. "These machines exist, but they are few and far between, not to mention expensive. There is simply too little demand for them at present," Schöb says. Together with optimised seeds and the right machines, mixed cultures will present farmers with a real opportunity for the future.

Credit: 
ETH Zurich

'Subterranean estuaries' crucial to sustainable fishing and aquaculture industries

Pioneering research, led by a team from Trinity College Dublin and the Marine Research Institute of the Spanish Research Council (IIM-CSIC) in Vigo (Galicia, Spain), suggests "subterranean estuaries" may be critical in managing sustainable fishing and aquaculture - two growing industries of global importance.

Subterranean estuaries are analogous to surface water estuaries, where freshwater flowing out to sea mixes with seawater, but are instead located underground, invisible to the naked eye. Yet the newly published research shows these hidden features are very important in the ecology of coastal systems and in filtering pollutants - some of which have been slowly travelling to sea for decades having leached from agricultural soils.

The research, just published open access in Limnology and Oceanography, uncovered subterranean estuaries in the Ria de Vigo in Galicia (one of the most productive coastal ecosystems in Europe and leader in bivalve production for human consumption) and assessed their importance to the coastal environment.

By employing a selection of natural environmental tracers that carry the chemical fingerprints of groundwater sources on land out to sea, the team estimated that almost 25% of the continental freshwater discharged to the Ria de Vigo comes from this invisible source.
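The release does not give the tracer budget itself, but estimates of this kind usually rest on mixing or mass-balance relations. A minimal two-end-member version, a simplification of whatever multi-tracer budget the study actually used, expresses the groundwater-derived fraction implied by a measured tracer concentration C as:

\[
f_{\mathrm{SGD}} \;=\; \frac{C_{\mathrm{sample}} - C_{\mathrm{sea}}}{C_{\mathrm{gw}} - C_{\mathrm{sea}}},
\]

where \(C_{\mathrm{gw}}\) and \(C_{\mathrm{sea}}\) are the tracer's concentrations in the groundwater and seawater end-members.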

The Biogeochemistry Research Group of Trinity's School of Natural Sciences led the study (Project SUBACID). Explaining the significance of the work and its wider implications for Irish waters, Carlos Rocha, Professor in Environmental Change, said:

"Bivalve aquaculture is a strategic, expanding sector in Irish sustainable development and features highly in the national plans to diversify food production. While our work was conducted in the Ria de Vigo, this area was carefully selected because of its capability to support aquaculture and its biogeographic similarity to parts of the Irish coastline.

"These subterranean estuaries have a high capability to filter out pollutants, like fertilisers, from freshwater. Given the extent to which they supply large ecosystems with incoming freshwater, they have a much more important role to play than many would have believed."

Juan Severino Pino Ibánhez, researcher from the Marine Research Institute-CSIC (Spain), added:

"We will now focus in more detail on which specific ecosystem services these invisible structures provide, and how they might affect, for example, the ongoing threat to this industry posed by ocean acidification caused by anthropogenic CO2 emissions to the atmosphere.

"We are currently strengthening the collaborative network established with the Marine Research Institute of Vigo to elucidate the functioning of these hidden ecosystems and their role in coastal health and resilience. Lessons learnt in Vigo together with ongoing research made by our group in Irish coastal ecosystems will help to understand the future of Irish coastal ecosystem services and food production."

Credit: 
Trinity College Dublin

Ethane proxies for methane in oil and gas emissions

image: Once the amount of ethane emissions is determined, the researchers combine that information with the gas composition data from a particular basin to convert the solved ethane emissions into methane emissions.

Image: 
Zachary Barkley, Penn State

Measuring ethane in the atmosphere shows that the amount of methane going into the atmosphere from oil and gas wells and contributing to greenhouse warming is higher than the U.S. Environmental Protection Agency suggests, according to an international team of scientists who spent three years flying over three areas of the U.S. during all four seasons.

"Ethane is a gas that is related only to certain sources of methane," said Zachary R. Barkley, researcher in meteorology and atmospheric science, Penn State. "Methane, however, is produced by oil, gas and coal fields, but also by cow's digestive systems, wetlands, landfills and manure management. It is difficult to separate out fossil fuel produced and natural methane."

The Atmospheric Carbon and Transport (ACT) America data made it possible to quantify methane emissions from oil, gas and coal sources because the project measured not only methane but also ethane. The researchers note that methane identified with ethane can be reliably connected to fossil fuel sources; however, the ratio of ethane to methane does vary among individual sources.

"ACT America was conceived as an effort to improve our ability to diagnose the sources and sinks of global greenhouse gases, to improve the diagnosis," said Kenneth J. Davis, professor of atmospheric and climate science, Penn State. "We wanted to understand how greenhouse gases are moved around by weather systems in the atmosphere. Prior to ACT, there was no data to map out the distribution of gases in weather systems."

From 2017 through 2019, researchers flew data collection missions over three portions of the U.S. -- the central Atlantic states including Pennsylvania, New York, Virginia, West Virginia and Maryland; the central southern states including Arkansas, Louisiana, Texas, Alabama, Oklahoma and Mississippi; and the central midwestern states including Nebraska, South Dakota, Kansas, Minnesota, Iowa, Missouri, Wisconsin, Michigan, Ohio, Indiana and Illinois. The researchers covered all four seasons and tracked how weather systems moved carbon dioxide, methane, ethane and other gases around in the atmosphere.

The researchers are not the first to suggest that estimates of methane emissions are too low, but according to Barkley, they are the first to use ethane alone as a proxy. Ethane, although it acts as a greenhouse gas, stays in the atmosphere for only a few months before breaking down into other compounds, rather than the roughly 10 years that methane remains in the atmosphere. Ethane is therefore more of a problem for air pollution than for greenhouse warming.

"We didn't look at any of the methane data at all and we still see the same results as everyone else," he said.

Another difference is that most previous airborne studies looked at small areas, such as emissions from single sites or fields. ACT-America looked at multistate regions and encompassed over two-thirds of U.S. natural gas production.

"Ethane data consistently exceeds values that would be expected based on (U.S.) EPA Oil and gas leak rate estimates by more than 50%," the researchers report in a recent issue of the Journal of Geophysical Research: Atmospheres. The researchers add that comparing the combined fall, winter and spring ethane emission estimates to an inventory of oil and gas methane emissions, they estimate that the oil and gas methane emissions are larger than EPA inventory values by 48% to 76%.

The researchers used ethane-to-methane ratios from oil and gas production basins for this study.
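As the image caption above indicates, the conversion from ethane to methane relies on gas composition data for each basin. A minimal sketch of that scaling step, with hypothetical basin names, ratios and fluxes (none of them taken from the paper), might look like this:

```python
# Hypothetical sketch: convert a solved ethane emission estimate into a methane
# estimate using a basin-specific ethane-to-methane molar ratio derived from gas
# composition data. All numbers below are placeholders, not values from the study.

basin_ethane_to_methane = {   # mol ethane per mol methane in produced gas (hypothetical)
    "basin_A": 0.06,
    "basin_B": 0.10,
}

ethane_flux = {               # solved ethane emissions, mol per second (hypothetical)
    "basin_A": 1.2e3,
    "basin_B": 0.8e3,
}

# Dividing the ethane flux by the ethane-to-methane ratio yields the implied methane flux
methane_flux = {
    basin: ethane_flux[basin] / basin_ethane_to_methane[basin]
    for basin in ethane_flux
}

for basin, flux in methane_flux.items():
    print(f"{basin}: inferred methane emission ~ {flux:.1e} mol/s")
```

The key point is that the ratio is taken basin by basin, which is why the approach depends on knowing the local gas composition rather than a single national average.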

While carbon dioxide sources and sinks are spread across the Earth's surface, ethane and methane emissions come from specific, known locations on the ground. Deserts, oceans and upland ecosystems emit little ethane or methane, while active oil and gas fields have high emissions. When estimating trace gas emissions, researchers usually start from a first best guess and then run multiple iterations to minimize the difference between observed and simulated atmospheric concentrations of these gases.
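In the actual study this adjustment is done with an atmospheric transport model; purely as a schematic of that "first guess, then iterate" logic, the procedure can be reduced to rescaling a prior emission rate until simulated concentrations match the observations. The linear forward model and every number below are assumptions for illustration only.

```python
import numpy as np

# Schematic of the adjust-and-compare loop: start from a prior (first-guess) emission
# rate, simulate concentrations with a simple forward model, and iteratively rescale
# the emission to shrink the observed-minus-simulated mismatch. The linear model and
# all numbers are illustrative assumptions, not the study's transport model.

rng = np.random.default_rng(0)
true_emission = 2.0                      # unknown "truth" used to synthesise observations
footprint = rng.uniform(0.5, 1.5, 50)    # sensitivity of each observation to the source
observations = true_emission * footprint + rng.normal(0, 0.05, 50)

emission = 1.0                           # prior (first best guess)
for _ in range(20):
    simulated = emission * footprint
    # Multiplicative update: rescale by the ratio of observed to simulated enhancements
    emission *= observations.sum() / simulated.sum()

print(f"Optimised emission scale: {emission:.2f} (prior was 1.0)")
```

When the prior is too small, the loop ends up multiplying it by roughly the factor needed to reproduce the measured enhancements, which is the same intuition behind Barkley's remark below about doubling the modelled plume.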

Barkley notes that such signals can sometimes be hard to interpret, but that is not the case in this study.

"The data are there," said Barkley. "The smaller plume in the model when increased by a factor of two suddenly matches the real time data."

Credit: 
Penn State

Mount Sinai study finds that rotator cuff injuries account for nearly half of shoulder injuries among collegiate baseball players, identifies other risks

Paper Title: Analysis of Common Shoulder Injuries in Collegiate Baseball Players

Journal: The Physician and Sportsmedicine (June 23, 2021, online edition)

Authors: Alexis Chiang Colvin, MD, Professor, Department of Orthopedic Surgery at the Icahn School of Medicine at Mount Sinai; Daniel A. Charen, MD, Resident, Department of Orthopedic Surgery at the Icahn School of Medicine at Mount Sinai; and other coauthors.

Bottom Line: Baseball players are highly susceptible to shoulder injuries because of the cumulative microtrauma of repetitive overhead throwing. Mount Sinai researchers investigated shoulder injury rates and associated risk factors in men's National Collegiate Athletic Association (NCAA) baseball. This research supports previous studies that encourage injury prevention measures, such as identifying and treating players with rotator cuff weakness and decreased range of motion in the preseason, to reduce the risk of shoulder injuries during the regular season.

How: In this retrospective cohort study, researchers used the NCAA Injury Surveillance Program (ISP) database to examine common shoulder injury data in men's baseball over five seasons, from 2009-2010 through 2013-2014. The study examined the occurrence of common injuries, such as those involving the rotator cuff, labrum and biceps tendon. Researchers also looked at player position, need for surgery, and recurrence of injury, among other factors.
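The ISP records themselves are not reproduced here; as a rough sketch of the kind of tabulation the study describes, and assuming a flat table of injury records with hypothetical column names, the share of injuries by type and the season-ending rate could be computed along these lines:

```python
import pandas as pd

# Hedged sketch of the tabulation described above. The column names and the example
# records are hypothetical; the NCAA ISP data are not reproduced here.
records = pd.DataFrame({
    "injury_type": ["rotator_cuff", "labrum", "biceps_tendon", "rotator_cuff"],
    "position":    ["pitcher", "infield", "pitcher", "outfield"],
    "season_ending": [True, False, False, True],
    "required_surgery": [True, False, False, False],
})

# Share of shoulder injuries accounted for by each injury type
type_share = records["injury_type"].value_counts(normalize=True)

# Season-ending rate by injury type
season_ending_rate = records.groupby("injury_type")["season_ending"].mean()

print(type_share)
print(season_ending_rate)
```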

Results: The researchers found that rotator cuff injuries represented nearly half (46%) of all shoulder injuries and were significantly more likely to be season-ending. The Mount Sinai team also found that pitching and throwing were associated with rotator cuff tears, rotator cuff tendonitis, SLAP tears, and biceps tendonitis.

Why the Research Is Interesting: Mount Sinai researchers believe this is the first study to specifically analyze the epidemiology of shoulder injuries in NCAA baseball players.

Study Conclusions: Repetitive overhead throwing is a risk factor for shoulder injuries in collegiate baseball players, often leading to rotator cuff and biceps tendon injuries. While previous studies have evaluated strengthening and conditioning, pitch counts, and throwing mechanics, this research confirms there is still a high rate of shoulder injuries for repetitive overhead throwers. Mount Sinai researchers encourage physicians and trainers to refine and develop new injury prevention strategies to enhance care for collegiate baseball players.

Said Mount Sinai's Dr. Daniel Charen of the research:

"The majority of shoulder injuries in NCAA baseball players involve the rotator cuff. Unfortunately, players who ultimately sustain a rotator cuff tear are more likely to be out for the remainder of the baseball season. Although many important advances have been made in optimizing the shoulder health of baseball players, there is still an opportunity to improve injury prevention and treatment strategies for these repetitive overhead throwers."

Credit: 
The Mount Sinai Hospital / Mount Sinai School of Medicine