New research finds that ACOs are struggling to integrate social services with medical care

New findings from a Dartmouth-led study, published in the February issue of Health Affairs, show that despite effort and attention on the part of some healthcare providers to better address their patients' social needs--such as transportation, housing, and food--little progress is being made to integrate social services with medical care.

Given the substantial impact of social factors on health outcomes and medical costs, healthcare providers who join Accountable Care Organizations (ACOs)--which receive financial incentives for meeting quality and cost-reduction targets--have been considered among the most likely to pursue integration.

To understand how a highly motivated and committed group of such providers has fared at addressing their patients' social needs, the researchers collected qualitative data from 22 ACOs across the U.S. between 2015 and 2018. As early adopters, the group was actively working on initiatives to address social determinants of health.

However, even these ACOs faced substantial challenges in integrating social services with patient care. "First, we found that the ACOs were frequently 'flying blind,' lacking data on both their patients' needs and the capabilities of potential community partners," explains first author Genevra Murray, PhD, a research scientist at The Dartmouth Institute for Health Policy and Clinical Practice, who collaborated with colleagues from the University of California Berkeley and the University of North Carolina at Chapel Hill on the study.

In addition, partnerships between ACOs and community-based organizations, while critical, were only in the early stages of development. Innovation was further constrained by ACOs' difficulty in assessing the return on investments in social services, given short funding cycles and the lengthy time horizons needed to see results.

"Policies that could facilitate the integration of social determinants include providing sustainable funding, implementing local and regional networking initiatives to facilitate partnership development, and developing standardized data on community-based organizations' services and quality to aid providers that seek partners," says Murray.

Credit: 
The Geisel School of Medicine at Dartmouth

Tumbleweeds or fibrils: Tau proteins need to choose

image: A schematic shows the growth of tau oligomers implicated in Alzheimer's and Pick's diseases. Monomers of tau tend to aggregate along two channels, one leading to fibrils that form tangles and the other leading to amorphous clumps in neurons. Rice University researchers simulated the proteins in computational models to see how and where the branching happens.

Image: 
Center for Theoretical Biological Physics

HOUSTON - (Feb. 3, 2020) - New simulations by Rice University scientists tell a tale of two taus and how they relate to neurological disease.

Their work suggests tau proteins take either of two paths to form aggregates suspected of promoting, and perhaps causing, Alzheimer's and Pick's (aka frontotemporal dementia) diseases. Precisely why remains a mystery, but figuring it out offers the possibility of controlling their fates.

Tau proteins, particularly in neurons, primarily regulate microtubules, the filaments that serve as roadways for cargo inside a cell and facilitate division. But they come in many forms and, as it turns out, these can aggregate in distinct ways.

The study by biophysicist Peter Wolynes and his team in the Proceedings of the National Academy of Sciences is the first computational analysis to distinguish between tau proteins that form the solid fibrils found in the brains of Alzheimer's and Pick's patients and those that form disordered, tumbleweed-like clumps floating in the neurons' cytoplasm.

"There's a relationship between the form that turns into membraneless organelles (the tumbleweeds) and the form that becomes fibers," said Wolynes, a co-director of Rice's Center for Theoretical Biological Physics. "There seems to be two distinct pathways that the same tau molecule can follow, and the balance between the two pathways is affected by some biological process."

He said the models suggest that phosphorylation, which regulates many cellular signaling processes, may be the determining factor.

"Once we figure out which of these aggregates is the actual bad guy, then it should be possible to intervene in, say, the phosphorylation process itself in order to change the balance," Wolynes said.

The Rice lab used its coarse-grained AWSEM (associative memory, water-mediated, structure and energy model) analysis tool, which predicts how proteins fold, on a variety of tau protein models based on variants from patients. They found that the formation of fibers can be manipulated by phosphorylation, which occurs at a multitude of sites in the tau protein.

"We found phosphorylation encouraged the formation of the amorphous aggregate, but it didn't encourage the formation of the fiber -- by as much, anyway," Wolynes said.

Phosphorylation can also go off the rails, he said. "There are something like 20 sites along the tau protein that can be phosphorylated, but in general, only four or five of them are," Wolynes said. "But sometimes, they've become hyperphosphorylated, which means the enzymes responsible for the process do more. They end up with, say, 10 sites phosphorylated instead of five, and that may have some effect.

"If the extra phosphorylation can cause more of the disease, we would want to find out which kinases do the phosphorylation and try to inhibit them with a drug, like we do in treatment for cancer," he said.

Tau proteins exhibited another interesting characteristic called backtracking, a behavior the researchers have also seen in amyloid beta peptides, which are likewise implicated in Alzheimer's. Both tend to aggregate until energetic barriers force them to partially unfold and then seek another path to their final, most stable forms.

It's at that point of frustration that aggregating taus appear to branch into different directions, Wolynes said. One set of tau forms parallel fibrils that aggregate into the ordered plaques observed in patients' brains, while the other loosely aggregates into the floating clump. The researchers suggested the backtracking mechanism may be a universal feature in protein aggregation, a topic for future study.

The loose aggregates present their own challenges to scientists, Wolynes said.

"They're a physics question in the following sense: They're localized objects, but why don't they all just glom together and form one huge organelle, like what happens with drops of oil in water?" he said. "Is it just that it takes too long for them to move around? Is it that they're constantly being made and disassembled? And what determines their size?

"At this point, it's still a very basic and fairly simple story," Wolynes said. "The actual story of tau is still too complicated for us, but what we have learned so far is fairly straightforward."

Credit: 
Rice University

New device identifies high-quality blood donors

image: This is UBC mechanical engineering professor Hongshen Ma.

Image: 
University of British Columbia

Blood banks have long known about high-quality donors - individuals whose red blood cells stay viable for longer in storage and in the recipient's body.

Now a new device developed at UBC is showing promise as a method to identify these "super donors", potentially helping more than 4.5 million patients who need blood transfusions every year in Canada and the United States.

"We know that the deformability of red blood cells, or their ability to squeeze through small constrictions, is an important factor in the longevity of these cells in recipients of blood transfusions," explains lead researcher Hongshen Ma, a professor of mechanical engineering and biomedical engineering. "This is because cells that remain deformable longer can stay in circulation longer. But until now we didn't have a reliable way to measure this capability in donated red blood cells."

Ma and his team tested stored red blood cells from eight different donors using a custom-made microfluidics device to see how their deformability was maintained during storage. (Microfluidics is the study of the flow of liquids through channels narrower than a human hair.)

"We found that samples from two of the donors were significantly more stable - they remained deformable during the storage period - than other donors," said Ma. "We need to study this phenomenon further, but this result suggests that it will be possible to identify donors that can provide long circulating red blood cells for sensitive recipients."

"People who need frequent blood transfusions benefit tremendously from red blood cells that are able to appropriately circulate in the blood vessels to deliver oxygen. A method that can swiftly and accurately test the 'squeezability' of these cells can make transfusions safer for these patients and ultimately for anyone who needs a critical transfusion," said study author Dr. Mark Scott, a UBC clinical professor in pathology and laboratory medicine and senior scientist at the Centre for Innovation, Canadian Blood Services.

The team plans to work with Canadian Blood Services to test more donor samples in the near future to further develop their device and validate this result.

Credit: 
University of British Columbia

The one ring -- to track your finger's location

image: With continuous tracking, AuraRing can pick up handwriting -- potentially for short responses to text messages.

Image: 
Dennis Wise/University of Washington

Smart technology keeps getting smaller. There are smartphones, smartwatches and now, smart rings, devices that allow someone to use simple finger gestures to control other technology.

Researchers at the University of Washington have created AuraRing, a ring and wristband combination that can detect the precise location of someone's index finger and continuously track hand movements. The ring emits a signal that can be picked up on the wristband, which can then identify the position and orientation of the ring -- and the finger it's attached to. The research team published these results Dec. 11 in Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies.

"We're thinking about the next generation of computing platforms," said co-lead author Eric Whitmire, who completed this research as a doctoral student at the Paul G. Allen School of Computer Science & Engineering. "We wanted a tool that captures the fine-grain manipulation we do with our fingers -- not just a gesture or where your finger's pointed, but something that can track your finger completely."

AuraRing is composed of a coil of wire wrapped 800 times around a 3D-printed ring. A current running through the wire generates a magnetic field, which is picked up by three sensors on the wristband. Based on what values the sensors detect, the researchers can continuously identify the exact position of the ring in space. From there, they can determine where the user's finger is located.
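The pipeline above -- a known field model plus readings from fixed sensors -- can be sketched in miniature. The following is a hypothetical illustration, not the AuraRing implementation: it assumes a simplified, magnitude-only dipole falloff (|B| ∝ m/r³) in two dimensions and recovers the source position by brute-force grid search, whereas the real system solves for full 3D position and orientation from vector field measurements.

```python
# Hypothetical sketch of magnetic position tracking (not AuraRing's actual
# algorithm): locate a field source from field magnitudes at three sensors.
import itertools
import math

SENSORS = [(0.0, 0.0), (0.03, 0.0), (0.015, 0.026)]  # wristband sensors (m)
MOMENT = 1e-6  # assumed source strength (arbitrary units)

def field_at(sensor, source):
    """Simplified magnitude-only dipole falloff |B| ~ m / r^3."""
    r = math.dist(sensor, source)
    if r < 1e-9:
        return math.inf  # degenerate: source sits on top of a sensor
    return MOMENT / r**3

def locate(readings, step=0.001):
    """Brute-force search over a coarse grid for the best-fitting source."""
    grid = [i * step for i in range(-50, 101)]  # -5 cm to 10 cm
    best, best_err = None, math.inf
    for x, y in itertools.product(grid, grid):
        err = sum((field_at(s, (x, y)) - b) ** 2
                  for s, b in zip(SENSORS, readings))
        if err < best_err:
            best, best_err = (x, y), err
    return best

# Simulate a ring 5 cm / 4 cm from the wristband origin, then recover it.
true_pos = (0.05, 0.04)
readings = [field_at(s, true_pos) for s in SENSORS]
est = locate(readings)
```

In practice a least-squares solver would replace the grid search, but the structure is the same: a forward field model inverted against the sensor readings.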

"To have continuous tracking in other smart rings you'd have to stream all the data using wireless communication. That part consumes a lot of power, which is why a lot of smart rings only detect gestures and send those specific commands," said co-lead author Farshid Salemi Parizi, a doctoral student in electrical and computer engineering. "But AuraRing's ring consumes only 2.3 milliwatts of power, which produces an oscillating magnetic field that the wristband can constantly sense. In this way, there's no need for any communication from the ring to the wristband."

With continuous tracking, AuraRing can pick up handwriting -- potentially for short responses to text messages -- or allow someone to have a virtual reality avatar hand that mimics what they're doing with their actual hand. In addition, because AuraRing uses magnetic fields, it can still track hands even when they are out of sight, such as when a user is on a crowded bus and can't reach their phone.

"We can also easily detect taps, flicks or even a small pinch versus a big pinch," Salemi Parizi said. "This gives you added interaction space. For example, if you write 'hello,' you could use a flick or a pinch to send that data. Or on a Mario-like game, a pinch could make the character jump, but a flick could make them super jump."

The researchers designed AuraRing to be ready to use as soon as it comes out of the box and not be dependent on a specific user. They tested the system on 12 participants with different hand sizes. The team compared the actual location of a participant's finger to where AuraRing said it was. Most of the time, the system's tracked location agreed with the actual location within a few millimeters.

This ring and wristband combination could be useful for more than games and smartphones, the team said.

"Because AuraRing continuously monitors hand movements and not just gestures, it provides a rich set of inputs that multiple industries could take advantage of," said senior author Shwetak Patel, a professor in both the Allen School and the electrical and computer engineering department. "For example, AuraRing could detect the onset of Parkinson's disease by tracking subtle hand tremors or help with stroke rehabilitation by providing feedback on hand movement exercises."

The technology behind AuraRing is something that could be easily added to smartwatches and other wristband devices, according to the team.

"It's all about super powers," Salemi Parizi said. "You would still have all the capabilities that today's smartwatches have to offer, but when you want the additional benefits, you just put on your ring."

Credit: 
University of Washington

'Wristwatch' monitors body chemistry to boost athletic performance, prevent injury

image: The metabolite monitoring device, shown here, is the size of a wristwatch. The sensor strip, which sticks out in this photo, can be tucked back, lying between the device and the user's skin. The device can be used for everything from detecting dehydration to tracking athletic recovery, with applications ranging from military training to competitive sports.

Image: 
Murat Yokus, NC State University

Engineering researchers have developed a device the size of a wristwatch that can monitor an individual's body chemistry to help improve athletic performance and identify potential health problems. The device can be used for everything from detecting dehydration to tracking athletic recovery, with applications ranging from military training to competitive sports.

"This technology allows us to test for a wide range of metabolites in almost real time," says Michael Daniele, co-corresponding author of a paper on the work and an assistant professor of electrical and computer engineering at North Carolina State University and in the Joint Department of Biomedical Engineering at NC State and the University of North Carolina at Chapel Hill.

Metabolites are markers that can be monitored to assess an individual's metabolism. So, if someone's metabolite levels are outside of normal parameters, it could let trainers or health professionals know that something's wrong. For athletes, it could also be used to help tailor training efforts to improve physical performance.

"For this proof-of-concept study, we tested sweat from human participants and monitored for glucose, lactate, pH and temperature," Daniele says.

A replaceable strip on the back of the device is embedded with chemical sensors. That strip rests against a user's skin, where it comes into contact with the user's sweat. Data from the sensors in the strip are interpreted by hardware inside the device, which then records the results and relays them to a user's smartphone or smartwatch.

"The device is the size of an average watch, but contains analytical equipment equivalent to four of the bulky electrochemistry devices currently used to measure metabolite levels in the lab," Daniele says. "We've made something that is truly portable, so that it can be used in the field."

While the work for this paper focused on measuring glucose, lactate and pH, the sensor strips could be customized to monitor for other substances that can be markers for health and athletic performance - such as electrolytes.

"We're optimistic that this hardware could enable new technologies to reduce casualties during military or athletic training, by spotting health problems before they become critical," Daniele says. "It could also improve training by allowing users to track their performance over time. For example, what combination of diet and other variables improves a user's ability to perform?"

The researchers are now running a study to further test the technology when it is being worn by people under a variety of conditions.

"We want to confirm that it can provide continuous monitoring when in use for an extended period of time," Daniele says.

"While it's difficult to estimate what the device might cost consumers, it only costs tens of dollars to make. And the cost of the strips - which can last for at least a day - should be comparable to the glucose strips used by people with diabetes.

"We're currently looking for industry partners to help us explore commercialization options for this technology," Daniele says.

Credit: 
North Carolina State University

Blood test identifies risk of disease linked to stroke and dementia

image: MRI scans showing the average measurable difference in white matter brain damage between people with low inflammatory blood test scores (below the median) and those with high scores (above the median).

Image: 
UCLA Health

A UCLA-led study has found that levels of six proteins in the blood can be used to gauge a person's risk for cerebral small vessel disease, or CSVD, a brain disease that affects an estimated 11 million older adults in the U.S. CSVD can lead to dementia and stroke, but currently it can only be diagnosed with an MRI scan of the brain.

"The hope is that this will spawn a novel diagnostic test that clinicians can start to use as a quantitative measure of brain health in people who are at risk of developing cerebral small vessel disease," said Dr. Jason Hinman, a UCLA assistant professor of neurology and lead author of the paper, which is published in the journal PLOS ONE.

CSVD is characterized by changes to the brain's white matter -- the areas of the brain that have a high concentration of myelin, a fatty tissue that insulates and protects the long extensions of brain cells. In CSVD, small blood vessels that snake through the white matter become damaged over time and the myelin begins to break down. This slows the communication between cells in the brain and can lead to problems with cognition and difficulty walking. And if the blood vessels become completely blocked, it can cause stroke.

The disease is also associated with a heightened risk for multiple forms of dementia, including Alzheimer's disease.

Typically, doctors diagnose CSVD with an MRI scan after a person has experienced dementia or suffered a stroke. About a quarter of all strokes in the U.S. are associated with CSVD. But many cases of the disease go undiagnosed because of mild symptoms, such as trouble with walking or memory, that can often be attributed to normal aging.

In the new study, Hinman and colleagues focused on six proteins related to the immune system's inflammatory response and centered on a molecule called interleukin-18, or IL-18. They hypothesized that inflammatory proteins that damage the brain in CSVD may be detectable in the bloodstream.

The researchers measured the levels of the proteins in the blood of 167 people whose average age was 76.4, and who had either normal cognition or mild cognitive impairment. As part of their voluntary participation in the study, 110 participants also underwent an MRI brain scan and 49 received a more advanced scan called diffusion tensor imaging.

People whose MRI or diffusion tensor imaging tests showed signs of CSVD had significantly higher levels of the six blood proteins, the researchers discovered. If a person had higher-than-average levels of the six inflammatory proteins, they were twice as likely to have signs of CSVD on an MRI scan and 10% more likely to have very early signs of white matter damage. Moreover, for every CSVD risk factor that a person had -- such as high blood pressure, diabetes, or a previous stroke -- the inflammatory protein levels in their blood were twice as high, on average.

To confirm the results, the team performed the blood test in a group with a much higher risk for CSVD: 131 people who visited a UCLA Health emergency department with signs of stroke. Once again, the blood test results were correlated with white matter changes in the brain that were detected by an MRI.

"I was pleasantly surprised that we were able to associate blood stream inflammation with CSVD in two fairly different populations," Hinman said.

In MRI reports, the changes in the brain's white matter caused by CSVD are usually only categorized in general terms -- as mild, moderate or severe. The blood test is a step forward, Hinman said, because it provides a more quantitative scale for evaluating the disease. That means the blood test can be used to follow the progression of the disease or to identify people who are candidates for prevention efforts or treatments for CSVD.

"We're hopeful that this will set the field on more quantitative efforts for CSVD so we can better guide therapies and new interventions," Hinman said.

The blood test is not commercially available at this time.

Credit: 
University of California - Los Angeles Health Sciences

SwRI-led team identifies low-energy solar particles from beyond Earth near the Sun

image: Using data from NASA's Parker Solar Probe, an SwRI-led team identified low-energy particles -- the smoking gun pointing to interactions between slow- and fast-moving regions of the solar wind that accelerate high-energy particles from beyond the orbit of Earth. Using data from the Integrated Science Investigation of the Sun (IS⊙IS) instrument suite, they measured low-energy particles in the near-Sun environment that had likely traveled back toward the Sun, slowing against the tide of the solar wind while still retaining surprising energies.

Image: 
NASA/Johns Hopkins APL/Steve Gribben

SAN ANTONIO -- Feb. 3, 2020 -- Using data from NASA's Parker Solar Probe (PSP), a team led by Southwest Research Institute identified low-energy particles lurking near the Sun that likely originated from solar wind interactions well beyond Earth orbit. PSP is venturing closer to the Sun than any previous probe, carrying hardware SwRI helped develop. Scientists are probing the enigmatic features of the Sun to answer many questions, including how to protect space travelers and technology from the radiation associated with solar events.

"Our main goal is to determine the acceleration mechanisms that create and transport dangerous high-energy particles from the solar atmosphere into the solar system, including the near-Earth environment," said Dr. Mihir Desai, a mission co-investigator on the Integrated Science Investigation of the Sun (IS⊙IS) instrument suite, a multi-institutional project led by Principal Investigator Prof. Dave McComas of Princeton University. IS⊙IS consists of two instruments, the Energetic Particle Instrument-High (EPI-Hi) and the Energetic Particle Instrument-Low (EPI-Lo). "With EPI-Lo, we were able to measure extremely low-energy particles unexpectedly close to the solar environment. We considered many explanations for their presence, but ultimately determined they are the smoking gun pointing to interactions between slow- and fast-moving regions of the solar wind that accelerate high-energy particles from beyond the orbit of Earth. Some of those travel back toward the Sun, slowing against the tide of the outpouring solar wind but still retaining surprisingly high energies."

PSP, which will travel within 4 million miles of the Sun's surface, is collecting new solar data to help scientists understand how solar events, such as coronal mass ejections, impact life on Earth. During the rising portion of the Sun's activity cycle, our star releases huge quantities of energized matter, magnetic fields and electromagnetic radiation in the form of coronal mass ejections (CMEs). This material is integrated into the solar wind, the steady stream of charged particles released from the Sun's upper atmosphere. The high-energy solar energetic particles (SEPs) present a serious radiation threat to human explorers living and working outside low-Earth orbit and to technological assets such as communications and scientific satellites in space. The mission is making the first-ever direct measurements of both the low-energy source populations as well as the more hazardous, higher-energy particles in the near-Sun environment, where the acceleration takes place.

When the Sun's activity reaches a lull, roughly every 11 years, solar equatorial regions emit slower solar wind streams, traveling around 1 million miles per hour, while the poles spew faster streams, traveling twice as fast at 2 million miles per hour. Stream interaction regions (SIRs) are created at the boundaries between the fast and slow solar wind. Fast-moving streams tend to overtake slower streams that originate westward of them on the Sun, forming turbulent corotating interaction regions (CIRs) that produce shock waves and accelerated particles, not unlike those produced by CMEs.
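A back-of-envelope calculation shows why these interaction regions form near and beyond Earth's orbit. The stream speeds come from the article; the 48-hour release lag is an assumed illustrative value, and the one-dimensional catch-up model ignores the Sun's rotation and the streams' angular geometry:

```python
# Illustrative catch-up arithmetic: where does a fast solar wind stream
# overtake a slow stream released earlier along the same radial line?
V_SLOW = 1_000_000   # mph, slow equatorial stream (from the article)
V_FAST = 2_000_000   # mph, fast polar/coronal-hole stream (from the article)
LAG_H = 48           # hours between releases -- assumed, for illustration

# Slow parcel: r = V_SLOW * t.  Fast parcel: r = V_FAST * (t - LAG_H).
# Setting them equal gives the catch-up time and distance.
t_catch = V_FAST * LAG_H / (V_FAST - V_SLOW)  # hours after slow release
r_catch = V_SLOW * t_catch                    # miles from the Sun

AU_MILES = 93_000_000  # mean Earth-Sun distance
print(f"streams meet about {r_catch / AU_MILES:.1f} AU from the Sun")
```

With these assumed numbers the streams collide at roughly 1 AU, consistent with the article's picture of CIR particle acceleration happening around and beyond the orbit of Earth.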

"For the first time, we observed low-energy particles from these CIRs near the orbit of Mercury," Desai said. "We also compared the PSP data with data from STEREO, another NASA solar mission. By measuring the full range of energetic populations and correlating the data with other measurements, we hope to get a clear picture of the origin and the processes that accelerate these particles. Our next step is to integrate the data into models to better understand the origin of SEPs and other materials. Parker Solar Probe will solve many puzzling scientific questions -- and is guaranteed to generate new ones as well."

Credit: 
Southwest Research Institute

Making high-temperature superconductivity disappear to understand its origin

image: Brookhaven Lab physicists (from left to right) Genda Gu, Tonica Valla, and Ilya Drozdov at OASIS, a new on-site experimental machine for growing and characterizing oxide thin films, such as those of a class of high-temperature superconductors (HTS) known as the cuprates. Compared to conventional superconductors, HTS become able to conduct electricity without resistance at much warmer temperatures. The team used the unique capabilities at OASIS to make superconductivity in a cuprate sample disappear and then reappear in order to understand the origin of the phenomenon.

Image: 
Brookhaven National Laboratory

UPTON, NY--When there are several processes going on at once, establishing cause-and-effect relationships is difficult. This scenario holds true for a class of high-temperature superconductors known as the cuprates. Discovered nearly 35 years ago, these copper-oxygen compounds can conduct electricity without resistance under certain conditions. They must be chemically modified ("doped") with additional atoms that introduce electrons or holes (electron vacancies) into the copper-oxide layers and cooled to temperatures below 100 Kelvin (−280 degrees Fahrenheit)--significantly warmer temperatures than those needed for conventional superconductors. But exactly how electrons overcome their mutual repulsion and pair up to flow freely in these materials remains one of the biggest questions in condensed matter physics. High-temperature superconductivity (HTS) is among many phenomena occurring due to strong interactions between electrons, making it difficult to determine where it comes from.

That's why physicists at the U.S. Department of Energy's (DOE) Brookhaven National Laboratory studying a well-known cuprate containing layers made of bismuth oxide, strontium oxide, calcium, and copper oxide (BSCCO) decided to focus on the less complicated "overdoped" side, doping the material so heavily that superconductivity eventually disappears. As they reported in a paper published on Jan. 29 in Nature Communications, this approach enabled them to identify that purely electronic interactions likely lead to HTS.

"Superconductivity in cuprates usually coexists with periodic arrangements of electric charge or spin and many other phenomena that can either compete with or aid superconductivity, complicating the picture," explained first author Tonica Valla, a physicist in the Electron Spectroscopy Group of Brookhaven Lab's Condensed Matter Physics and Materials Science Division. "But these phenomena weaken or completely vanish with overdoping, leaving nothing but superconductivity. Thus, this is the perfect region to study the origin of superconductivity. Our experiments have uncovered an interaction between electrons in BSCCO that correlates one to one with superconductivity. Superconductivity emerges exactly when this interaction first appears and becomes stronger as the interaction strengthens."

Only very recently has it become possible to overdope cuprate samples beyond the point where superconductivity vanishes. Previously, a bulk crystal of the material would be annealed (heated) in high-pressure oxygen gas to increase the concentration of oxygen (the dopant material). The new method--which Valla and other Brookhaven scientists first demonstrated about a year ago at OASIS, a new on-site instrument for sample preparation and characterization--uses ozone instead of oxygen to anneal cleaved samples. Cleaving refers to breaking the crystal in vacuum to create perfectly flat and clean surfaces.

"The oxidation power of ozone, or its ability to accept electrons, is much stronger than that of molecular oxygen," explained coauthor Ilya Drozdov, a physicist in the division's Oxide Molecular Beam Epitaxy (OMBE) Group. "This means we can bring more oxygen into the crystal to create more holes in the copper-oxide planes, where superconductivity occurs. At OASIS, we can overdope surface layers of the material all the way to the nonsuperconducting region and study the resulting electronic excitations."

OASIS combines an OMBE system for growing oxide thin films with angle-resolved photoemission spectroscopy (ARPES) and spectroscopic imaging-scanning tunneling microscopy (SI-STM) instruments for studying the electronic structure of these films. Here, materials can be grown and studied using the same connected ultrahigh vacuum system to avoid oxidation and contamination by carbon dioxide, water, and other molecules in the atmosphere. Because ARPES and SI-STM are extremely surface-sensitive techniques, pristine surfaces are critical to obtaining accurate measurements.

For this study, coauthor Genda Gu, a physicist in the division's Neutron Scattering Group, grew bulk BSCCO crystals. Drozdov annealed the cleaved crystals in ozone in the OMBE chamber at OASIS to increase the doping until superconductivity was completely lost. The same sample was then annealed in vacuum in order to gradually reduce the doping and increase the transition temperature at which superconductivity emerges. Valla analyzed the electronic structure of BSCCO across this doping-temperature phase diagram through ARPES.

"ARPES gives you the most direct picture of the electronic structure of any material," said Valla. "Light excites electrons from a sample, and by measuring their energy and the angle at which they escape, you can recreate the energy and momentum of the electrons while they were still in the crystal."

In measuring this energy-versus-momentum relationship, Valla detected a kink (anomaly) in the electronic structure that follows the superconducting transition temperature. The kink becomes more pronounced and shifts to higher energies as this temperature increases and superconductivity gets stronger, but disappears outside of the superconducting state. On the basis of this information, he knew that the interaction creating the electron pairs required for superconductivity could not be electron-phonon coupling, as theorized for conventional superconductors. Under this theory, phonons, or vibrations of atoms in the crystal lattice, serve as an attractive force for otherwise repulsive electrons through the exchange of momentum and energy.

"Our result allowed us to rule out electron-phonon coupling because atoms in the lattice can vibrate and electrons can interact with those vibrations, regardless of whether the material is superconducting or not," said Valla. "If phonons were involved, we would expect to see the kink in both the superconducting and normal state, and the kink would not be changing with doping."

The team believes that something similar to electron-phonon coupling is at work here, but with a different excitation exchanged between electrons in place of phonons. The electrons appear to interact through spin fluctuations, excitations related to the electrons themselves. Spin fluctuations are changes in electron spin, the property that makes each electron behave like a tiny magnet pointing either up or down.

Moreover, the scientists found that the energy of the kink is less than that of a characteristic energy at which a sharp peak (resonance) in the spin fluctuation spectrum appears. Their finding suggests that the onset of spin fluctuations (instead of the resonance peak) is responsible for the observed kink and may be the "glue" that binds electrons into the pairs required for HTS.

Next, the team plans to collect additional evidence showing that spin fluctuations are related to superconductivity by obtaining SI-STM measurements. They will also perform similar experiments on another well-known cuprate, lanthanum strontium copper oxide (LSCO).

"For the first time, we are seeing something that strongly correlates with superconductivity," said Valla. "After all these years, we now have a better grasp of what may be causing superconductivity in not only BSCCO but also other cuprates."

Credit: 
DOE/Brookhaven National Laboratory

OU study finds the fingerprint of paddy rice in atmospheric methane concentration dynamics

NORMAN, OKLA. - A University of Oklahoma-led study shows that paddy rice (both area and plant growth) is significantly related to the spatial-temporal dynamics of atmospheric methane concentration in monsoon Asia, which contains 87% of the world's paddy rice fields.

Methane is one of the major greenhouse gases. It has an atmospheric lifetime of 12.4 years, and its global warming potential over a 20-year period is approximately 86 times that of carbon dioxide.

"Rice paddy is a large source of methane emission; however, it has been a challenging task to attribute relative role of rice paddy in the spatial distribution, seasonal dynamics and interannual variation of atmospheric methane concentration as measured by spaceborne sensors," said Xiangming Xiao, a member of the Earth Observation and Modeling Facility at OU and a professor in the Department of Microbiology and Plant Biology who coordinated this interdisciplinary study.

Over the past few years, researchers at OU developed annual paddy rice maps at 500-meter spatial resolution and quantified the spatial-temporal changes in rice paddy area in monsoon Asia during 2000-2015. By combining the annual paddy rice maps, rice plant growth data and atmospheric methane concentration (XCH4) data, the researchers found strong spatial consistency between rice paddy area and XCH4 and seasonal consistency between rice plant growth and XCH4, for both single-rice and double-rice fields. The study also revealed a decreasing trend in rice paddy area in monsoon Asia since 2007, suggesting that changes in rice paddy area could not be among the major drivers of the renewed XCH4 growth over that period.

The findings of this study demonstrate the importance of satellite-based paddy rice datasets in understanding the spatial-temporal dynamics of XCH4 in monsoon Asia. These annual maps of paddy rice are the first of their kind and could be used to further improve simulations of biogeochemical models that estimate methane emission from paddy rice fields, which are critically needed for analysis of spaceborne XCH4 data and simulations of atmospheric chemistry and transport models.

Credit: 
University of Oklahoma

Weather radar records drastic drop in mayfly populations

NORMAN, OKLA. - At the beginning of each summer, mayfly larvae emerge from bodies of water and shed their skin to become full-fledged mayflies, similar to how caterpillars become butterflies. Then, all at once, a swarm of these insects simultaneously takes flight to reproduce, serving as an important food source for birds.

Researchers at the University of Oklahoma, the University of Notre Dame and Virginia Tech applied radar technology - the same used for meteorology - to quantify the number of mayflies that emerged annually from two different bodies of water: the Upper Mississippi River and the Western Lake Erie Basin. Their goal was to characterize the size of these swarms using the same technique a meteorologist would use to quantify the amount of precipitation that may fall from a cloud.

Pulling radar data from the two locations over a span of eight years, the research team estimated that up to 88 billion mayflies emerge from each location annually. Although the initial study was only intended to quantify mayfly swarms, the researchers found a more than 50% decrease in population from 2012 to 2019 in these two Midwestern water bodies. The next steps are to investigate whether declines like this are widespread, and what may be causing such reductions in the mass emergence of this species of mayfly.

Credit: 
University of Oklahoma

Government grants deliver highest returns for college financing, says study

image: A new joint study of education policy in the United States shows that the existing student aid program of grants and subsidized loans increases both welfare and efficiency in the U.S. economy.

Image: 
Vancouver School of Economics/University of British Columbia

Merit-based grants are a government's best bet for providing effective student aid for long-term economic growth - increasing both welfare (measured in terms of long-term well-being outcomes) and efficiency, according to a new joint study from the University of British Columbia, Queen's, Princeton and Yale. The study focuses on current education policy in the United States, and finds that the current system of grants and loans has significant long-term value.

Figures from 2012 show the U.S. federal government spending around $150 billion annually on grants and loans. Given such a sizeable investment, the researchers wanted to test the effectiveness of that spending, and they found the current amount of federal aid to be extremely valuable.

"We found that a $1000 increase in grants per year for every student, which corresponds to roughly a 50 per cent increase on average, would lead to a long-run gain in GDP of close to one per cent," said study co-author Giovanni Gallipoli, an associate professor at the Vancouver School of Economics at UBC. "This is a comparatively large return on investment."

The study finds grants remain the most effective at improving the country's overall welfare, more so than loans or tax cuts. The study's economic modelling shows that one third of ability-tested grant recipients make an extra $2,300 per year in earnings over their entire careers, confirming the high return per dollar spent.

The researchers say there will be additional benefits if grant programs are further expanded, especially those based on academic performance and merit.

The researchers argue ability-tested grants work best because they prioritize students who are likely to see the highest returns to college attendance and who are most likely to complete a college education, irrespective of family and social background. Based on grades and test scores, these students have their tuition funded and go on to derive large gains from their degrees in the labour market.

The researchers do recognize that this method has potential flaws: students from well-off families could still have an advantage in receiving performance-based grants, because they have greater access to supports and resources, like tutors or mentors, that advance their cognitive skills and that are simply not available to working-class children with the same or similar abilities. For this reason, the researchers see a significant benefit in keeping a portion of need-based federal aid intact.

"This approach also benefits non-recipients, through overall economic growth," said Gallipoli. "One key finding of the study is that expanding post-secondary education for any given generation reduces the cost of human capital accumulation for future generations to come."

Removing tuition grants completely would result in a drop in college attainment of more than three percentage points, and reduce output and welfare by two and three per cent respectively.

"Without grants, the student body would possess lower skills, and the system would become much more reliant on parental wealth and transfers," said Gallipoli.

Gallipoli also said the study's findings could have implications for education policy outside of the U.S., including here in Canada.

"The U.S. system, through a mix of grants and subsidized loans, funds education for students from very poor backgrounds quite well, provided they do well academically," he said. "In Canada, similar funding doesn't really exist, nor is it as readily available or as expansive to such a group."

Credit: 
University of British Columbia

How ants get angry: Precise 'lock and key' process regulates aggression, acceptance

video: Scientists in the Department of Biological Sciences at Vanderbilt report definitive evidence of a specific mechanism within ants that is responsible for unlocking aggressive behaviors toward other ants.

Image: 
Vanderbilt University

For most social animals, even humans, the ability to distinguish friend versus foe can be a challenge that often can lead to knee-jerk aggression. But when it comes to ants getting aggressive, there's a more sophisticated method to their madness.

In a new study, published this month in the Journal of Experimental Biology, scientists in the Department of Biological Sciences at Vanderbilt report definitive evidence of a specific mechanism within ants that is responsible for unlocking aggressive behaviors toward other ants. The research--the first to pinpoint this mechanism and its precise role in ant biology--reports a social characteristic which could help account for their evolutionary success.

"Eusocial ants are one of the biggest success stories in evolutionary biology, thanks in no small part to their advanced organizational behaviors and complex social interactions," said Laurence Zwiebel, senior author of the paper and Cornelius Vanderbilt Chair in Biological Sciences. "For years, researchers have hypothesized that ants have specific chemical markers which play key roles in their interactions. What surprised us is that ants not only have these markers, but require these signals be very precisely decoded by specific receptors to trigger aggression."

One of the most important aspects of ant identity is the ability to distinguish nestmates from non-nestmates, which typically act as bad actors. To do this, ants rely on chemical markers made up of specific odorants on their bodies, which Zwiebel refers to as a "coat of many odors": complex odor blends that act as a sort of personal identifier for other ants.

In this study, the researchers discovered that ants have to smell and correctly decode these specific compounds on intruder ants from other colonies in order to "unlock" their aggressive behavior and defend their nest. This implies that ants default to acceptance and select aggression only if they are specifically triggered.

To study this "lock and key" mechanism, Zwiebel and graduate student Stephen Ferguson, the lead author on the study, gathered Camponotus floridanus ants from nine distinct colonies collected across the Florida Keys.

Before testing, the researchers used a chemical agent previously discovered by the Zwiebel lab to block or over-excite the ants' odorant receptors. They then set up mini dueling arenas in which two ants (either from the same or different colonies) could interact and, if they chose, fight. During each arena test, they filmed the ants and scored the interactions based on aggressive behaviors, the most common being lunging, biting and dragging.

While ants with normal receptors continued to recognize and fight ants from other colonies, ants with blocked or over-activated receptors displayed dramatically reduced aggressive behavior.

"Accepting friends and rejecting foes is one of the most important decisions an ant worker must make," said Ferguson. "Our study finds that unless there is a clear and unambiguous threat, ants are more likely to be accepting than they are to be aggressive. This process may have contributed to the evolutionary success of these insects, and there may be important lessons about tempering aggression for other social beings such as humans."

Credit: 
Vanderbilt University

Arctic permafrost thaw plays greater role in climate change than previously estimated

image: Aerial image of a permafrost peatland in Innoko National Wildlife Refuge in Alaska, interspersed with smaller areas of thermokarst wetlands.

Image: 
Miriam Jones, U.S. Geological Survey

Abrupt thawing of permafrost will double previous estimates of potential carbon emissions from permafrost thaw in the Arctic, and is already rapidly changing the landscape and ecology of the circumpolar north, a new CU Boulder-led study finds.

Permafrost, a perpetually frozen layer beneath the seasonally thawed surface of the ground, underlies 18 million square kilometers at high latitudes, about one quarter of all the exposed land in the Northern Hemisphere. Permafrost is estimated to contain 1,500 petagrams of carbon, equivalent to 1.5 trillion metric tons.
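That equivalence is simple unit arithmetic: one petagram is $10^{15}$ grams, or one billion metric tons, so

```latex
1{,}500~\mathrm{Pg} \times 10^{9}~\mathrm{t/Pg} = 1.5 \times 10^{12}~\mathrm{t} = 1.5~\text{trillion metric tons of carbon}
```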

The new study distinguishes between gradual permafrost thaw, which affects permafrost and its carbon stores slowly, and more abrupt types of permafrost thaw. Some 20% of the Arctic region has conditions conducive to abrupt thaw because of its ice-rich permafrost layer. Permafrost that abruptly thaws is a large emitter of carbon, releasing both carbon dioxide and methane, which is a more potent greenhouse gas than carbon dioxide. As a result, even though less than 5% of the Arctic permafrost region is likely to be experiencing abrupt thaw at any given time, its emissions will equal those of areas experiencing gradual thaw.

This abrupt thawing is "fast and dramatic, affecting landscapes in unprecedented ways," said Merritt Turetsky, director of the Institute of Arctic and Alpine Research (INSTAAR) at CU Boulder and lead author of the study published today in Nature Geoscience. "Forests can become lakes in the course of a month, landslides occur with no warning, and invisible methane seep holes can swallow snowmobiles whole."

Abrupt permafrost thaw can occur in a variety of ways, but it always represents a dramatic abrupt ecological shift, Turetsky added.

"Systems that you could walk on with regular hiking boots and that were dry enough to support tree growth when frozen can thaw, and now all of a sudden these ecosystems turn into a soupy mess," Turetsky said.

Why thawing permafrost matters

Permafrost contains rocks, soil, sand, and in some cases, pockets of pure ground ice. It stores on average twice as much carbon as is in the atmosphere because it holds the remains of life that once flourished in the Arctic, including dead plants, animals and microbes. This matter, which never fully decomposed, has been locked away in Earth's refrigerator for thousands of years.

As the climate warms, permafrost cannot remain frozen. Across roughly 80 percent of the circumpolar north, a warming climate is likely to trigger gradual permafrost thaw that unfolds over decades to centuries.

But in the remaining parts of the Arctic, where ground ice content is high, abrupt thaw can happen in a matter of months, with extreme consequences for the landscape and the atmosphere. This fast process is called "thermokarst" because a thermal change causes subsidence, producing a karst-like landscape known for its erosion and sinkholes.

Turetsky said this is the first paper to pull together the wide body of literature on past and current abrupt thaw across different types of landscapes.

The authors then used this information along with a numerical model to project future abrupt thaw carbon losses. They found that thermokarst always involves flooding, inundation, or landslides. Intense rainfall events and the open, black landscapes that result from wildfires can speed up this dramatic process.

The researchers compared carbon release from abrupt permafrost thaw to that from gradual permafrost thaw, trying to quantify a "known unknown." There are general estimates of gradual thaw's contribution to carbon emissions, but it was unclear how much of that release thermokarst would account for.

They also wanted to find out how important this information would be to include in global climate models. At present, there are no climate models that incorporate thermokarst, and only a handful that consider permafrost thaw at all. While large-scale models over the past decade have tried to better account for feedback loops in the Arctic, the Intergovernmental Panel on Climate Change (IPCC)'s most recent report only includes estimates of gradual permafrost thaw as an unresolved Earth system feedback.

"The impacts from abrupt thaw are not represented in any existing global model and our findings indicate that this could amplify the permafrost climate-carbon feedback by up to a factor of two, thereby exacerbating the problem of permissible emissions to stay below specific climate change targets," said David Lawrence, of the National Center for Atmospheric Research (NCAR) and a coauthor of the study.

The findings bring new urgency to including permafrost in all types of climate models, along with implementing strong climate policy and mitigation, Turetsky added.

"We can definitely stave off the worst consequences of climate change if we act in the next decade," said Turetsky. "We have clear evidence that policy is going to help the north and thus it's going to help dictate our future climate."

Credit: 
University of Colorado at Boulder

Helping patients with binge eating disorders: There's an app for that

image: Associate Professor, Psychiatry
Icahn School of Medicine at Mount Sinai

Image: 
Icahn School of Medicine at Mount Sinai

Behavioral therapy assisted by a smartphone app, delivered via telemedicine by a health coach, was an effective treatment for several symptoms of binge eating disorders, according to a study conducted by researchers from the Icahn School of Medicine at Mount Sinai and published this week in The American Journal of Psychiatry.

According to the American Psychiatric Association, psychiatric disorders characterized by binge eating, including binge eating disorder and bulimia nervosa, affect up to 6.5 million Americans. Cognitive behavior therapies (CBT) have demonstrated efficacy in individual, group, guided self-help, and pure self-help formats, yet they have limitations, including the need to attend in-person sessions consistently and the limited availability of trained therapists.

Since mobile technologies are increasingly available and popular among patients and clinicians, the researchers wondered if a digital treatment platform might offer a preferred and more accessible option that could serve as a cost-effective alternative to specialized treatments. The researchers focused on the Noom Monitor, a smartphone app developed to facilitate CBT with guided self-help (CBT-GSH).

"Through a previous pilot study of the Noom Monitor by our team, we know the platform is acceptable to patients, feasible to deliver, and when combined with CBT-GSH with a trained clinician, improves symptoms," said Thomas D. Hildebrandt, PsyD, Chief of the Center of Excellence in Eating and Weight Disorders at The Mount Sinai Hospital and lead author of the study. "The purpose of this study was to evaluate the robustness of the intervention when delivered by non-specialist health coaches in a community health care system via telemedicine. We were encouraged by the results that showed that this intervention is effective and can be scaled outside of specialty clinical programs."

Specifically, this randomized, controlled telemedicine trial compared 52-week outcomes of CBT-GSH plus the Noom Monitor versus standard of care, which included traditional psychiatric or medical care, in 225 members of an integrated health care system in the Pacific Northwest who had been diagnosed with binge eating disorder or bulimia nervosa. CBT-GSH treatment involved coaching sessions with a routine health coach and use of both a CBT-GSH self-help book and the Noom Monitor, which uses a customized self-monitoring system that tracks exercise, meals/snacks, compensatory behavior, body scrutinizing, craving, and weight. The research team found that patients receiving CBT-GSH plus Noom reported significantly greater reductions in objective binge days (about three fewer days per month) and achieved higher rates of remission (56.7 percent vs. 30 percent) than the control group receiving standard care (no specific eating disorder treatment). Similar patterns emerged for compensatory behaviors (vomiting, laxatives, excessive exercise), eating disorder symptoms (shape/weight/eating concerns, restraint), and clinical impairment.

For this study, all CBT-GSH coaching sessions were conducted via telephone and involved six sequential steps: establishing self-monitoring; regular eating (three meals/two snacks); alternative activities to binge eating/purging; problem-solving; reducing dietary restraint and overconcern with shape/weight; and relapse prevention. The first session lasted 60 minutes, and each subsequent session was 20-25 minutes. The first four sessions occurred weekly, while the following four sessions were biweekly. All coaches completed an eight-hour training led by Dr. Hildebrandt and another eating disorders specialist. Although coaches ended their intervention at 12 weeks, participants had access to the self-help manual and Noom Monitor beyond the coaching period and were encouraged to continue using the program until they achieved remission.

"In addition to providing an improvement in primary eating disorders symptoms, related depression, and impairment in functioning, the group treated with CBT-GSH plus the Noom Monitor had an increased remission rate beyond the intervention, suggesting that the effects of the intervention continued to facilitate changes within the follow-up period that were not observed among those who received standard care," said Dr. Hildebrandt. "Scaling and implementing empirically supported interventions have become an important priority across mental health conditions and our study shows that CBT-GSH via telemedicine is effective and scalable as an intervention for binge eating disorders."

Credit: 
The Mount Sinai Hospital / Mount Sinai School of Medicine

Value transformation framework model seeks to guide transition to value-based healthcare

February 3, 2020 - With a new focus on quality of care and outcomes achieved, healthcare organizations are challenged to make the transition to value-based care. A model called the Value Transformation Framework (VTF) provides a structured, step-by-step approach to help guide the shift to value-based healthcare, reports a paper in the Journal for Healthcare Quality (JHQ), the peer-reviewed journal of the National Association for Healthcare Quality (NAHQ). The journal is published in the Lippincott portfolio by Wolters Kluwer.

"The VTF framework shows promise in supporting health center efforts to adapt, transform, and balance competing demands as they advance value-based models of care," writes Cheryl Modica, PhD, MPH, BSN, Director of the Quality Center of the National Association of Community Health Centers (NACHC). Established in 1971, the NACHC serves as the national voice for America's Health Centers and as an advocate for health care access for the medically underserved and uninsured. The article appears as part of an upcoming JHQ Special Issue, devoted to the topic of 'Quality as a Business Strategy.'

'Actionable Pathway' to Improve Value via Infrastructure, Care Delivery, and People

In response to rising healthcare costs, demographic trends, and new technologies, the transition to value-based care is occurring throughout the healthcare system. "Value-based care reimburses providers based on quality of care, outcomes, and cost, as opposed to a fee-for-service model that reimburses based on the volume of services delivered," Dr. Modica explains.

But to date, there has been no clear, standardized, organizing framework that federally qualified health centers can use as an "actionable pathway" toward systemwide change to advance value. This important group of health centers provides care to approximately 28 million patients across the United States - largely low-income patients facing social and environmental risk factors.

The NACHC Quality Center developed the VTF model to support federally qualified health centers in making the transition to value-based care. The model seeks to guide systems change toward the "Quadruple Aim" goals of value-based care: improved healthcare outcomes, improved patient experience, improved staff experience, and reduced costs.

Based on evidence-based and promising practices, the VTF addresses three health center system domains: Infrastructure, Care Delivery, and People. Within each domain are five Change Areas, providing well-defined but flexible steps toward improvement. For each of the 15 Change Areas, the model provides concise, step-by-step Action Guides to advance health center transformation, available at http://www.nachc.org/clinical-matters/value-transformation-framework/

The VTF model was field-tested as part of a two-year Cancer Transformation Project funded by the Centers for Disease Control and Prevention (CDC). Evaluation found increases of 13.6 and 6.5 percentage points in colorectal and cervical cancer screening rates, respectively, during the first year. Further steps included feedback from health center stakeholders, leading to fine-tuning of the Change Areas and recommendations for further implementation. Initial results of VTF deployment in a national cohort of 115 health centers in 19 states will be available in 2020.

Although its intended audience is federally qualified health centers, "the steps and actions described in the VTF may also apply to other health care organizations and networks," Dr. Modica concludes. "If the VTF approach continues to demonstrate value, it can provide an actionable guide for systems change in advancing on the Quadruple Aim goals."

The Special Issue includes five additional papers illustrating the wide range of programs being implemented by interprofessional teams across the health continuum to improve the quality of care. "Few healthcare systems have the resources to do the important work to systematically develop innovative models of care resulting in improved quality and safety while maximizing reimbursement and decreasing associated cost of care," write Guest Editors Cathy E. Duquette, PhD, RN, NEA-BC, CPHQ, FNAHQ, and Nidia S. Williams, PhD, MBB, CPHQ, FNAHQ. "Despite the differences in healthcare settings and populations served, the future of healthcare value will be dependent on embracing quality as a business strategy."

Credit: 
Wolters Kluwer Health