Tech

Residential segregation associated with black-white disparity in firearm homicide rates

Residential segregation is linked to many racial disparities in health, including cancer, high blood pressure, and diabetes. Now, a new study led by Boston University School of Public Health (BUSPH) researchers suggests the likelihood of dying from gun violence can be added to the list of adverse health outcomes associated with structural racism in the US.

The study, published in the Journal of the National Medical Association, finds that states with greater residential segregation of black and white populations have higher racial disparities in firearm homicide fatalities. The study is one of the first to examine the relationship between racial residential segregation and firearm homicide fatalities at the state level over a 25-year period while controlling for multiple race-specific measures of deprivation in education, employment, economic status, and housing.

"It was important for us to analyze this at the state level, because in the past we've found that a black person living in Wisconsin has a 22-fold higher risk of being fatally shot compared to a white person, but in New Mexico a black person has a 2-fold higher risk of being fatally shot compared to a white person," says BUSPH pre-doctoral fellow Anita Knopov, the study's lead author.

The researchers used data on annual state-specific firearm homicide rates from the Centers for Disease Control and Prevention (CDC) from 1991 through 2015. To measure racial segregation at the state level, the researchers used a well-established measure called the "index of dissimilarity," which reflects the degree of racial integration within neighborhoods across a state. The scale runs from 0 to 100, with higher numbers representing higher levels of racial residential segregation.
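The release does not spell out how the index is computed, but the standard definition is half the sum, across a state's neighborhoods, of the absolute differences between each neighborhood's share of the black population and its share of the white population. The sketch below uses that textbook formula with invented neighborhood counts, purely to illustrate the 0-to-100 scale; it is not the study's code.

```python
# Minimal sketch of the standard index of dissimilarity (not the study's code).
# The neighborhood counts below are made up purely for illustration.

def dissimilarity_index(black_counts, white_counts):
    """Index of dissimilarity on a 0-100 scale across a state's neighborhoods."""
    total_black = sum(black_counts)
    total_white = sum(white_counts)
    # Half the summed absolute differences between each group's neighborhood shares.
    d = 0.5 * sum(
        abs(b / total_black - w / total_white)
        for b, w in zip(black_counts, white_counts)
    )
    return 100 * d

# Three hypothetical neighborhoods in a state: counts of black and white residents.
print(dissimilarity_index([800, 150, 50], [100, 400, 500]))  # ≈ 70 on the 0-100 scale
```

A value of 0 would mean both groups are spread identically across neighborhoods, while 100 would mean complete separation.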

For every 10-point increase in the index of dissimilarity, the researchers found the ratio of black to white firearm homicide fatality rates increased by 39 percent. After controlling for levels of white and black deprivation, greater racial segregation continued to be associated with lower white fatality rates and higher black fatality rates.

"Racial residential segregation is independently linked with the racial disparity in firearm homicides, even when other racial inequalities are accounted for, including unemployment, poverty, income, wealth, and single-parent families," says study co-author Michael Siegel, professor of community health sciences at BUSPH.

"These findings show that a history of structural racism over decades in the past has significant implications for the lives of black people today."

Credit: 
Boston University School of Medicine

Testosterone research brings new hope for cancer patients

Many cancer patients suffer from a loss of body mass known as cachexia. Approximately 20 percent of cancer-related deaths are attributed to the syndrome of cachexia, which in cancer patients is often characterized by a rapid or severe loss of fat and skeletal muscle. Dr. Melinda Sheffield-Moore, professor and head of the Department of Health and Kinesiology, along with researchers at the University of Texas Medical Branch, recently published research in the Journal of Cachexia, Sarcopenia and Muscle showing that the hormone testosterone is effective at combatting cachexia in cancer patients and improving quality of life.

These findings are important, as there are currently no established therapies targeting this loss of skeletal muscle, and without an intervention, patients lose muscle function and become fatigued and weakened.

"We hoped to demonstrate these patients would go from not feeling well enough to even get out of bed to at least being able to have some basic quality of life that allows them to take care of themselves and receive therapy," Dr. Sheffield-Moore said.

Dr. Sheffield-Moore said doctors sought her expertise in nutrition and metabolism when patients were losing tremendous amounts of weight from cancer cachexia. She said that previous nutrition-focused treatment failed to combat this severe loss of body mass, which led her team to investigate the hormone testosterone as an option to combat the often debilitating consequences of cancer cachexia.

"We already know that testosterone builds skeletal muscle in healthy individuals, so we tried using it in a population at a high risk of muscle loss, so these patients could maintain their strength and performance status to be able to receive standard cancer therapies." Dr. Sheffield-Moore said.

During this five-year National Cancer Institute-funded study, patients with a type of cancer known as squamous cell carcinoma were treated with standard-of-care chemotherapy and/or radiation in addition to seven weeks of treatment with either testosterone or placebo. Throughout the study, patients were monitored for changes in physical activity, muscle and fat mass, and tested for physical performance.

Patients in this study receiving testosterone maintained total body mass and increased lean body mass by 3.2 percent. Sustaining body mass is important, considering most patients experience a decrease in body mass of 20 percent or more, depending upon the type of cancer.

"Patients randomized to the group receiving testosterone as an adjuvant to their standard of care chemotherapy and/or radiation treatment also demonstrated enhanced physical activity," Dr. Sheffield-Moore said. "They felt well enough to get up and take care of some of their basic activities of daily living, like cooking, cleaning and bathing themselves."

Additionally, Dr. Sheffield-Moore's lab is currently analyzing skeletal muscle proteomic data from this study. "What the proteome [profile of proteins found in the muscle] tells us is which particular proteins in the skeletal muscles were either positively or negatively affected by testosterone or by cancer, respectively," Dr. Sheffield-Moore said. "It allows us to begin to dig into the potential mechanisms behind cancer cachexia."

Dr. Sheffield-Moore hopes this research will help cancer patients increase quality of life and maintain eligibility to receive standard of care therapy if cachexia ensues.

Credit: 
Texas A&M University

Longer contracts leverage the free fuel in solar power at little O&M cost

image: This is the Gemasolar Concentrated Solar Power (CSP) plant, owned by Torresol Energy, in Seville, Spain.

Image: 
© SENER

The world's first 35-year contract for day-or-night solar (ACWA Power's with DEWA in Dubai) also set a record-low price for solar with storage: just 7.3 cents per kWh.

Energy developers always look for ways to structure deals that reduce their costs. A key task in developing a utility-scale renewable energy project is finding every possible way to lower the price at which the power must be sold for the project to pencil out financially.

The advantage of renewable energy like solar and wind is that, with no future fuel purchases, there is no uncertain future fuel expense; being able to guarantee a set price over as long a period as possible would seem to leverage that advantage.

Normally solar contracts are only for 20 to 25 years. But in 2017, ACWA Power, a developer that is no stranger to innovative deal structures, applied out-of-the-box thinking on contract design to bid a record low price for solar with storage of just 7.3 cents per kilowatt hour for DEWA, in Dubai.

This ACWA Power PPA marked the first-ever 35-year contract for Concentrated Solar Power (CSP), the thermal form of solar that can operate a power block from its energy storage.

Did a 35-year solar power contract enable a lower price bid?

With a longer contract, there are more years of revenue over which to amortize the upfront costs of developing and permitting a new income-generating project. But how much did it actually reduce the price?

ETH Zürich Professor of Renewable Energy Policy Johan Lilliestam has calculated, in a paper online at Renewable Energy Focus, that the longer contract knocked as much as 2 cents per kWh off ACWA Power's DEWA bid in Dubai.

In "Concentrating solar power for less than USD 0.07 per kWh: finally the breakthrough?", Lilliestam and co-author Robert Pitz-Paal, co-director of the Institute of Solar Research at the German Aerospace Centre (DLR), attribute the cost reduction in part to the unusually long 35-year contract.

The paper states: "...with a more standard 20-year PPA, the LCOE would be USD 0.106 per kWh, which is about the same as declared by many Chinese stations under construction [7]. The long PPA duration thus directly reduces the LCOE by some 2 cents per kWh; in addition, it could help de-risking the investment by giving a very long-term perspective for investors, thus reducing the cost of capital."
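The release does not reproduce the authors' calculation, but the mechanism is easy to sketch: the same upfront capital cost gets spread over more years of discounted generation, so the levelized cost of electricity falls. The inputs below are invented for illustration and are not taken from the paper.

```python
# Rough sketch of how PPA length affects the levelized cost of electricity (LCOE).
# This is not the paper's model; every number below is hypothetical.

def lcoe(capex, annual_opex, annual_kwh, rate, years):
    """LCOE in $/kWh: discounted lifetime costs divided by discounted lifetime energy."""
    disc_costs = capex + sum(annual_opex / (1 + rate) ** t for t in range(1, years + 1))
    disc_energy = sum(annual_kwh / (1 + rate) ** t for t in range(1, years + 1))
    return disc_costs / disc_energy

capex = 400e6        # hypothetical upfront capital cost, $
annual_opex = 10e6   # hypothetical O&M spend, $/year
annual_kwh = 500e6   # hypothetical generation, kWh/year
rate = 0.05          # hypothetical discount rate

print(round(lcoe(capex, annual_opex, annual_kwh, rate, 20), 3))  # ~0.084 $/kWh over 20 years
print(round(lcoe(capex, annual_opex, annual_kwh, rate, 35), 3))  # ~0.069 $/kWh over 35 years
```

With these made-up numbers, stretching the contract from 20 to 35 years trims the LCOE by roughly 1.5 cents per kWh, the same order of magnitude as the 2 cents the authors attribute to the long PPA.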

But what additional costs might be incurred over a longer operating life?

In all energy-generating technologies, engineers must design components for a specific lifespan and have to prove that components will not fail within that time. Insurers guarantee components for a set time. The agreed 20-year design lifetime means engineers can design to meet one consistent requirement, ensuring that new components can be guaranteed to work reliably - and be insured - for that period.

Would the cost of replacing components outweigh the benefit of a 35-year contract? SENER knows what's involved in designing a project for greater longevity, as the engineering and construction firm for the 510 MW ACWA Power CSP project in Morocco, NOOR I, II and III.

SENER has been technology provider and contractor for 29 CSP projects and in three of those, it provided - roughly - all the technology and half the EPC (Engineering, procurement, and construction).

SENER's Gemasolar CSP project in Spain, the world's first commercial solar tower, has operated its day-and-night solar successfully since being grid-connected in 2011.

"I wouldn't say there is a major problem for designing a plant for 35 years," SENER Performance Guarantee Manager Sergio Relloso said. "In our plants we designed the components to last for 25 years and it is completely possible to last 35 years without a problem."

Relloso cautioned that higher O&M costs would be expected towards the end of a 35-year life, for example in major equipment like the steam generators in the power block. But most of the expenses he described would be normal O&M costs, such as in the thermal energy storage system that enables CSP to generate solar at night.

"The HTF for example; we normally replace a small quantity year-by-year in a trough project just because with HTF there is some degradation," he pointed out. ??"This is not the case with the salts in a tower project, because there you don't have such a high temperature near the degradation limit for the salts which top out at 565°C, while their limit is 600ºC."

ACWA Power's 35-year DEWA project will combine both trough (600 MW) and tower (100 MW) technologies. In overall durability, mirrors, or heliostats - in both technologies - would see negligible degradation, Relloso said.

"We are not seeing any measurable degradation in our plants in mirrors; they have operated very well and normally the mirrors last a long time," Relloso said, referencing SEGS.

"Mirrors have had a really good track record at SEGS. You would replace year by year the small number of mirrors that are broken maybe in a high wind event or during maintenance tasks. But the percentage of breakage of mirrors is in the range of .1% to .3% of mirrors in a year - it is a very normal operation to replace mirrors in a CSP plant."

In a trough project, the receiver tubes that run along the length of the parabolic mirrors would have a higher replacement rate, he said, because "the receiver tubes in a trough plant are not as simple as the mirrors. They could be subjected to more degradation."

But in both tower and trough technologies, Relloso said that all the metal components themselves would last - from the heliostat structures in the solar field to the pipe racks in the power block, as everything is adequately protected and designed for 35 years.

With the longer period at a known price, ACWA Power's interesting contract design leverages the key advantage of solar power generation: its costs are more predictable over the long term than those of fossil energy, because the fuel is free.

With its ability to dispatch its power whenever needed, solar thermal energy competes directly with natural gas, which is also a dispatchable form of thermal generation. Since CSP seems well suited to a 35-year lifespan, if the benefits outweigh the costs, longer contracts could enable lower costs going forward.

Credit: 
SolarPACES

Graphene could be key to controlling water evaporation

Graphene coatings may offer the ability to control the water evaporation process from various surfaces, according to new research.

The study, carried out by a team from the Chinese Academy of Sciences and the Collaborative Innovation Center of Quantum Matter (Beijing), looked at the interactions of water molecules with various graphene-covered surfaces.

It is published today in the journal 2D Materials.

Lead author Dr Yongfeng Huang, from the Chinese Academy of Sciences, said: "Water droplet evaporation is a ubiquitous and complicated phenomenon, and plays a pivotal role in nature and industry. Understanding its mechanism at the atomic scale, and controlling evaporation rate rationally is important for applications including heat transfer and body-temperature control. However, it remains a significant challenge."

The team's experiments showed that a graphene coating controls water evaporation by suppressing the evaporation rate on hydrophilic surfaces, and accelerating evaporation on hydrophobic ones.

Dr. Huang said: "More importantly, we found graphene is 'transparent' for evaporation. When a hydrophilic surface is coated with graphene, the contact line of the water droplet is dramatically shortened or elongated, because of adjustment in wetting angles. This leads to changes in the evaporation rate."

The researchers wanted to understand the 'transparency' in graphene-mediated evaporation, and uncover its underlying structure on the atomic scale. To do this, they conducted molecular dynamics simulations on water droplet evaporation, on surfaces with and without a graphene coating.

For the first time, they identified the atomic-scale mechanism for substrate-induced evaporation events. They found that a water molecule forms a precursor state at the contact line before it evaporates.

Dr. Huang explained: "Further analysis showed water density in evaporation transition states is largest at the contact line, then decreases exponentially as it goes away from the substrate. Single water desorption at the contact line dominates the droplet evaporation process. Since the graphene does not alter the binding energy of a single water molecule, it has negligible effects on evaporation per unit contact line.

"Our results are an important discovery on graphene-mediated evaporation, and also point to new ways to rationally control evaporation process, for realistic applications in heat transfer, printing and related areas."

Professor James Sprittles from the University of Warwick, UK, assessed the work. He said: "Using experiments supplemented with molecular dynamics simulations, Dr. Huang and co-workers have provided fascinating insights into the molecular mechanisms governing the evaporation of water droplets on technologically-relevant graphene-coated substrates.

"Their research shows that wettability is solely responsible for evaporation rate changes, and simultaneously opens up several interesting topics for future research, such as how molecular effects (e.g. precursor nanofilms and thermal fluctuations) can be incorporated into macroscopic modelling."

Credit: 
IOP Publishing

Research finds new molecular structures in boron-based nanoclusters

image: New research shows that clusters of boron and lanthanide atoms form an interesting and stable "inverse sandwich" structure.

Image: 
Wang Lab / Brown University

PROVIDENCE, R.I. [Brown University] -- Brown University researchers and collaborators from Tsinghua University in China have shown that nanoclusters made from boron and lanthanide elements form highly stable and symmetric structures with interesting magnetic properties.

The findings, published in Proceedings of the National Academy of Sciences on Monday, July 9, suggest that these nanoclusters may be useful as molecular magnets or assembled into magnetic nanowires. The research also helps shed light on the structure and chemical bonding of bulk boron lanthanides, which may help in engineering new boride materials.

"Boron lanthanides are an important class of materials used in electronics and other applications, but nanoclusters of boron lanthanides have not been studied," said Lai-Sheng Wang, a professor of chemistry at Brown and senior author of a paper describing the work. "We have just started to investigate these nanoclusters, and here we show that they can have an interesting 'inverse sandwich' structure with the right combination of boron and lanthanide atoms."

The structure -- a ring of bonded boron atoms with a single lanthanide atom bonded to each side -- emerged in clusters made from eight boron atoms and two atoms of either lanthanum or praseodymium (both members of the lanthanide group on the periodic table). Sandwich structures -- complexes in which two planar aromatic hydrocarbon molecules surround a single metal atom -- are well known in chemistry and their discovery earned a Nobel Prize in 1973. Inverse sandwich structures are known to form in uranium-organic molecular complexes, Wang says, but this is the first time the structure has been seen in boron lanthanides.

Wang's lab used a technique called photoelectron spectroscopy to study nanoclusters made of different chemical elements. The technique involves zapping atomic clusters with a high-powered laser. Each zap knocks an electron out of the cluster. By measuring the kinetic energy of those freed electrons, researchers can understand how the atoms in a cluster are bound together and infer the cluster's physical structure. To find the structures, Wang compared the photoelectron spectra with theoretical calculations done by quantum chemist Professor Jun Li and his students from Tsinghua.
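The bookkeeping behind that inference is the photoelectric energy balance: the binding energy of an ejected electron is the photon energy minus the electron's measured kinetic energy. A toy example, with a hypothetical measurement:

```python
# The energy balance behind photoelectron spectroscopy, with illustrative numbers only:
# binding energy = photon energy - measured kinetic energy of the ejected electron.
photon_energy_ev = 6.424            # e.g. a 193 nm ultraviolet laser photon, in eV
measured_kinetic_energy_ev = 3.1    # hypothetical kinetic energy of a detected electron, in eV
binding_energy_ev = photon_energy_ev - measured_kinetic_energy_ev
print(binding_energy_ev)            # ≈ 3.32 eV: how strongly that electron was bound in the cluster
```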

"We found that clusters made of eight boron and two lanthanide atoms are highly symmetric as inferred from their simple spectral patterns," Wang said. "In chemistry, whenever we find something that's highly symmetric it's very exciting."

That symmetry is produced, Wang said, by the nature of the chemical bonds that hold the structure together. The nature of those bonds also makes the clusters highly magnetic. That could make them useful in nano-electronics applications or elsewhere.

The research also helps to shed light on bulk borides, Wang says.

"It gives us a new way of understanding the bonding and structure of boride materials," he said. "By studying small units, we can gain insight into the bulk system, and I think we have gained some of that insight here."

Credit: 
Brown University

Can ultrashort electron flashes help harvest nuclear energy?

video: A video illustrating the experiments and findings of this study.

Image: 
Fabrizio Carbone/EPFL

The lab of Fabrizio Carbone at EPFL and their international colleagues have used ultrafast Transmission Electron Microscopy to take attosecond energy-momentum resolved snapshots (1 attosecond = 10^-18 seconds, or a quintillionth of a second) of a free-electron wave function. Though this was unprecedented in itself, the scientists also used their experimental success to develop a theory of how to create electron flashes within zeptosecond (10^-21 second) timeframes, using already existing technology. This breakthrough could allow physicists to increase the energy yield of nuclear reactions using coherent control methods, which rely on the manipulation of quantum interference effects with lasers and which have already helped advance fields like spectroscopy, quantum information processing, and laser cooling.

In fact, one of the most elusive phenomena in physics is the excitation of an atom's nucleus by the absorption of an electron. The process, known as "nuclear excitation by electron capture" (NEEC), was theoretically predicted forty years ago, though it proved difficult to observe experimentally.

But in February 2018, US physicists were finally able to catch a glimpse of NEEC in the lab. The work was hailed as ushering in new nuclear energy-harvesting systems, as well as explaining why certain elements like gold and platinum are so abundant in the universe.

The EPFL researchers in their publication suggest a way of potentially exploiting the several orders of magnitude in energy harvesting possibly present in the nucleus of an atom via coherent control of the NEEC effect. Such a method would be enabled by the availability of ultrashort (down to zeptosecond-scale) electron flashes. "Ideally, one would like to induce instabilities in an otherwise stable or metastable nucleus to prompt energy-producing decays, or to generate radiation," says Carbone. "However, accessing nuclei is difficult and energetically costly because of the protective shell of electrons surrounding them."

The authors state: "Our coherent control scheme with ultrashort electron pulses would offer a new perspective for the manipulation of nuclear reactions with potential implications in various fields, from fundamental physics to energy-related applications."

Credit: 
Ecole Polytechnique Fédérale de Lausanne

Geological records reveal sea-level rise threatens UK salt marshes, study says

image: A salt marsh in Tees Estuary, England, showing signs of erosion.

Image: 
Matthew Brain

Sea-level rise will endanger valuable salt marshes across the United Kingdom by 2100 if greenhouse gas emissions continue unabated, according to an international study co-authored by a Rutgers University-New Brunswick professor.

Moreover, salt marshes in southern and eastern England face a high risk of loss by 2040, according to the study published in Nature Communications.

The study is the first to estimate salt-marsh vulnerability using the geological record of past losses in response to sea-level change.

An international team of scientists, led by former Rutgers-New Brunswick Professor Benjamin Horton - now acting chair and a professor at the Asian School of the Environment at Nanyang Technological University - found that rising sea levels in the past led to increased waterlogging of the salt marshes in the region, killing the vegetation that protects them from erosion. The study is based on data from 800 salt-marsh soil cores. Tidal marshes rank among Earth's most vulnerable ecosystems.

"By 2100, if we continue upon a high-emissions trajectory, essentially all British salt marshes will face a high risk of loss. Reducing emissions significantly increase the odds that salt marshes will survive," said study co-author Robert E. Kopp, a professor in the Department of Earth and Planetary Sciences at Rutgers-New Brunswick and director of the Rutgers Institute of Earth, Ocean, and Atmospheric Sciences. Kopp led the development of the study's sea-level rise projections.

"Salt marshes, also called coastal wetlands, are important because they provide vital ecosystem services," said Horton. "They act as a buffer against coastal storms to protect the mainland and a filter for pollutants to decontaminate our fresh water. We also lose an important biodiversity hotspot. Salt marshes are important transitional habitats between the ocean and the land, and a nursery area for fish, crustacea, and insects. The take-home point from this paper is how quickly we are going to lose these ecologically and economically important coastal areas in the 21st century."

While the study looks at UK salt marshes, the counterparts in tropical environments such as Singapore are mangroves, which are just as vulnerable to sea-level rise as salt marshes.

"What is unknown is the tipping point that will cause a disintegration of mangroves to Singapore and elsewhere in Southeast Asia," Horton said. "We are currently collecting data to address the future vulnerability of mangroves to sea-level rise."

Credit: 
Rutgers University

Living in greener neighborhoods is associated with slower cognitive decline in the elderly

image: Elderly people living in greener neighborhoods have a slightly slower cognitive decline, study shows.

Image: 
Photo by Huy Phan

Contact with greenspace is known to have beneficial effects for mental health. A new study by the Barcelona Institute for Global Health (ISGlobal), a centre supported by the "la Caixa" Foundation, suggests that it may also play a positive role against cognitive decline in the elderly. In particular, this research, published in Environmental Health Perspectives, shows that the loss in cognitive functions expected as part of the ageing process is slightly slower in people who live in greener neighbourhoods.

Researchers performed a 10-year follow-up of 6,500 people aged 45 to 68 from the Whitehall II cohort in the UK. At three different timepoints during the course of the study, participants completed a battery of cognitive tests that assessed their verbal and mathematical reasoning, verbal fluency and short-term memory, as well as the decline in these functions. Neighbourhood greenspace for each participant was estimated using satellite images.

"There is evidence that the risk for dementia and cognitive decline can be affected by exposure to urban-related environmental hazards (such as air pollution and noise) and lifestyle (such as stress and sedentary behavior). In contrast, living near green spaces has been proposed to increase physical activity and social support, reduce stress, and mitigate exposure to air pollution and noise. Recent evidence has shown cognitive benefits of green space exposure in children, but studies on the possible relations of exposure to green spaces and cognitive decline in older adults are still very scarce and often have inconsistent results", says Carmen de Keijzer, ISGlobal researcher and first author of the study.

"Our data show that the decline in the cognitive score after the 10-years follow up was 4.6% smaller in participants living in greener neighbourhoods. Interestingly enough, the observed associations were stronger among women, which makes us think that these relations might be modified by gender", Carmen de Keijzer adds.

"The proportion of people over 60 years old in the world is expected to nearly double between 2015 and 2050 and the number of dementia cases has been predicted to grow at a similar pace worldwide. Although the differences in cognitive decline observed in our study are modest at individual level, they become much more significant if we consider these findings at population level", says Payam Dadvand, ISGlobal researcher and last author of the study. "If confirmed by future studies, our results may provide an evidence base for implementing targeted interventions aimed at decelerating cognitive decline in older adults residing in urban areas and hence improving their quality of life", he adds.

Credit: 
Barcelona Institute for Global Health (ISGlobal)

University of Montana ecology professor helps map climate corridors

MISSOULA - The corridors of land vital for many wildlife species in the face of climate change often are unprotected. Now, a recently published study from a University of Montana ecology professor and other researchers has tracked these shifting North American habitats.

Solomon Dobrowski, an associate professor of forest landscape ecology in UM's W.A. Franke College of Forestry & Conservation, was part of a team that used high-performance computing methods to map "climate corridors." Global Change Biology recently published the study at https://onlinelibrary.wiley.com/doi/abs/10.1111/gcb.14373. Climate corridors form the best route between current and future climate types. Because organisms need to avoid inhospitable climates, the corridors are often circuitous. Although previous studies have mapped climate connectivity areas over smaller regions, this is the first time scientists have mapped these areas over entire continents.
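The study's actual computation is not described here, but the underlying idea of a climate corridor can be sketched as a least-cost path across a grid in which inhospitable climates carry high resistance, which is why the resulting routes are often circuitous. The grid, resistance values and endpoints below are hypothetical.

```python
# Illustrative least-cost-path sketch of a "climate corridor" (not the study's code).
# Cells with hostile climate get high resistance, so the cheapest route detours around them.
import networkx as nx

# Hypothetical 5x5 resistance grid: 1 = hospitable climate, 50 = inhospitable.
resistance = [
    [1, 1, 50, 1, 1],
    [1, 1, 50, 1, 1],
    [1, 1, 50, 50, 1],
    [1, 1, 1, 1, 1],
    [1, 1, 50, 1, 1],
]

G = nx.grid_2d_graph(5, 5)  # 4-connected grid of cells, nodes are (row, col) tuples
for u, v in G.edges():
    # Cost of a step is the mean resistance of the two cells it connects.
    G[u][v]["weight"] = (resistance[u[0]][u[1]] + resistance[v[0]][v[1]]) / 2

start, goal = (0, 0), (0, 4)  # current climate cell -> future climate-analog cell
path = nx.shortest_path(G, start, goal, weight="weight")
print(path)  # detours south around the high-resistance column instead of crossing it
```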

The researchers found that routes funneled along north-south trending passes and valley systems and along the leeward or drier slopes of north-south trending mountain ranges. Climate connectivity areas, where many potential dispersal routes overlap, often are distinct from protected areas and poorly captured by existing conservation strategies. Many of these merit increased levels of protection due to pressures from human land use.

"The paleo-ecological record provides clear evidence of plants and animals moving large distances in response to climate changes of the past, but those changes occurred over long time periods and without the human pressures we see now," Dobrowski said.

The researchers hope results from this study will help land managers create more effective responses to climate change by identifying landscape features that promote connectivity among protected areas.

"Even as governments step up their commitment to reduce future greenhouse gas emissions, this information can help planners identify climate corridors whose conservation would reduce loss of species from the climate change that is already locked into the system from past emissions," said Carlos Carroll of the Klamath Center for Conservation Research, lead author on the study.

Existing parks and protected areas with high importance for climate connectivity include southern Mexico, the southwestern U.S., and western and arctic Canada and Alaska. The Great Plains, eastern temperate forests, and high arctic and western Canadian Cordillera also hold crucial climate connectivity areas.

The study's authors also included researchers from the U.S. Forest Service and the University of Alberta as part of the AdaptWest Project, a high-resolution database that maps climate change-related threats to biodiversity across North America. The database is used by conservation organizations and agencies such as the Wilderness Society and the U.S. National Park Service to assess climate change vulnerability in different regions of the U.S. and Canada.

Credit: 
The University of Montana

Study analyzes opioid overdose risk during and after pregnancy among Massachusetts women

A study of women giving birth in Massachusetts found a higher level of opioid use disorder than have studies conducted in other states. In a paper published in the journal Obstetrics & Gynecology, the research team - consisting of investigators from the Mass. Department of Public Health (DPH) and several academic medical centers, led by a MassGeneral Hospital for Children (MGHfC) physician - found that opioid overdose events decreased during pregnancy, reaching their lowest level during the third trimester, but then increased during the postpartum period, becoming significantly higher during the second six months after delivery.

"Our findings suggest we need to develop extended and long-term services to support women and families impacted by substance use disorder," says Davida Schiff, MD, MSc, an MGHfC pediatrician and the lead and corresponding author of the paper. "We need additional research to determine the best ways to improve retention in treatment and adherence to medication therapy after delivery, and we need to enhance our medical and public health infrastructure to provide support to women in achieving long-term recovery."

With the increasing levels of opioid-use disorder across the U.S., overdose deaths have quadrupled over the past 15 years, the authors note. In many states, opioid overdoses have been cited as major contributors to pregnancy-associated deaths. Estimates of opioid use disorder among pregnant women have ranged from 0.4 to 0.8 percent, and estimates for all women of reproductive age up to 2 percent. But pregnancy often serves as motivation for women to enter treatment for substance use, the standard of which is behavioral therapy combined with medications like methadone or buprenorphine.

While discontinuing medication therapy increases risks of relapse and overdose, there has been little data available on either the timing of overdose events or the relationship of medication therapy to relapse during pregnancy and after delivery. To explore those factors, along with assessing characteristics of women with opioid use disorder who gave birth in Massachusetts, the team took advantage of a Department of Public Health dataset developed in response to a 2015 mandate from the state legislature.

"This unique dataset - which links statewide resources including hospital discharge data, ambulance trip records, birth and death certificates, and addiction treatment data - combines a rich array of data sources and illustrates multiple factors contributing to overdose, particularly the impact of receiving medication treatment with methadone or buprenorphine," says co-author Dana Bernson, MPH, of the Mass. DPH. "Additionally we were able to include non-fatal overdose events that required medical attention, while other states have only reported overdose deaths."

From the dataset that included almost 178,000 deliveries of a live infant of 20 weeks or greater gestational age to Massachusetts resident women between Jan. 1, 2012 and Sept. 30, 2014, the research team identified 4,154 deliveries to women who had some evidence of an opioid use disorder in the year before delivery. While the 2.3 percent prevalence of opioid use disorder is more than double that reported in other states, the comprehensive dataset may have given a more accurate reflection of the level of opioid use disorder than previous studies have provided.

Among all women in the dataset, 184 experienced an opioid overdose event - defined as either admission to a health care facility for overdose treatment or a death certificate listing opioid overdose as the cause of death - during the year before or after delivery. Around 25 percent of women with overdose events experienced two to four overdoses, leading to a total of 242 overdose events, 11 of which were fatal, during the study period.

Compared to women with evidence of an opioid use disorder who did not experience an overdose event, those who did experience an overdose were more likely to be younger, single, unemployed, less educated and less likely to have received adequate prenatal care. They were also more likely to have evidence of homelessness or a diagnosis of anxiety or depression. The risk of an overdose event decreased as a pregnancy progressed, reaching its lowest level during the third trimester, but increased during the postpartum period, becoming highest from 7 to 12 months after delivery. In fact, 78 women with no evidence of opioid use disorder during the year before delivery experienced an overdose event during the postpartum period.

Based on insurance claims, prescription records and methadone treatment records, more than 64 percent of women with evidence of an opioid use disorder received some type of medication therapy during the year before delivery. Overall, across the entire study period, overdose rates for women receiving medication therapy were lower than those not receiving treatment.

"The first year postpartum is a particularly vulnerable year for women with opioid use disorder," says Schiff, who is an instructor in Pediatrics at Harvard Medical School. "Factors such as loss of access to specialized care, fragmented transitions from prenatal to postpartum providers, postpartum depression, other psychiatric disorders and homelessness can add to the normal stresses involved with having a new baby. Discontinuing medication therapy following delivery also may play a role in increased overdose events."

Massachusetts Commissioner of Public Health Monica Bharel, MD, MPH, a co-author of the Obstetrics & Gynecology paper, says that to effectively address the current opioid epidemic it's critical to gain a complete picture of the individuals who are at highest risk. "These findings help expand the lens from which we view the epidemic and allow us to tailor our policies and programs in ways that will increase opportunities for treatment and recovery for these women and their children."

Credit: 
Massachusetts General Hospital

Database analysis more reliable than animal testing for toxic chemicals


Advanced algorithms working from large chemical databases can predict a new chemical's toxicity better than standard animal tests, suggests a study led by scientists at Johns Hopkins Bloomberg School of Public Health.

The researchers, in the study that appears in the journal Toxicological Sciences on July 11, mined a large database of known chemicals they developed to map the relationships between chemical structures and toxic properties. They then showed that one can use the map to automatically predict the toxic properties of any chemical compound--more accurately than a single animal test would do.

The most advanced toxicity-prediction tool the team developed was on average about 87 percent accurate in reproducing consensus animal-test-based results--across nine common tests, which account for 57 percent of the world's animal toxicology testing. By contrast, repetitions of the same animal tests in the database were only about 81 percent reproducible--in other words, any given test had only an 81 percent chance, on average, of obtaining the same result for toxicity when repeated.

"These results are a real eye-opener--they suggest that we can replace many animal tests with computer-based prediction and get more reliable results," says principal investigator Thomas Hartung, MD, PhD, the Doerenkamp-Zbinden Chair and professor in the Department of Environmental Health and Engineering at the Bloomberg School.

The computer-based approach could also be applied to many more chemicals than animal testing, which could lead to wider safety assessments. Due to costs and ethical challenges only a small fraction of the roughly 100,000 chemicals in consumer products have been comprehensively tested.

Animals such as mice, rabbits, guinea pigs and dogs annually undergo millions of chemical toxicity tests in labs around the world. Although this animal testing is usually required by law to protect consumers, it is opposed on moral grounds by large segments of the public, and is also unpopular with product manufacturers because of the high costs and uncertainties about testing results.

"A new pesticide, for example, might require 30 separate animal tests, costing the sponsoring company about 20 million dollars," says Hartung, who also directs the Center for Alternatives to Animal Testing, which is based in the Bloomberg School's Department of Environmental Health and Engineering.

The most common alternative to animal testing is a process called read-across, in which researchers predict a new compound's toxicity based on the known properties of a few chemicals that have a similar structure. Read-across is much less expensive than animal testing, yet it requires expert evaluation and somewhat subjective analysis for every compound of interest.

As a first step towards optimizing and automating the read-across process, Hartung and colleagues two years ago assembled the world's largest machine-readable toxicological database. It contains information on the structures and properties of 10,000 chemical compounds, based in part on 800,000 separate toxicology tests.

"There is enormous redundancy in this database--we found that often the same chemical has been tested dozens of times in the same way, such as putting it into rabbits' eyes to check if it's irritating," says Hartung. This waste of animals, however, gave the researchers information they needed to develop a benchmark for a better approach.

For their study, the team enlarged the database and used machine-learning algorithms, with computing muscle provided by Amazon's cloud server system, to read the data and generate a "map" of known chemical structures and their associated toxic properties. They developed related software to determine precisely where any compound of interest belongs on the map, and whether--based on the properties of compounds "nearby"--it is likely to have toxic effects such as skin irritation or DNA damage.
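The published pipeline is not detailed in the release, but the flavor of an automated read-across prediction can be sketched with a nearest-neighbor classifier over binary structural fingerprints, in which a new compound inherits the toxicity label that dominates among its most structurally similar known neighbors. The fingerprints and labels below are invented for illustration; real fingerprints run to hundreds or thousands of bits.

```python
# Minimal read-across-style sketch (not the published RASAR pipeline):
# predict a toxicity label for a new compound from its nearest structural
# neighbors, using binary fingerprints and Jaccard (Tanimoto-style) distance.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Hypothetical 8-bit structural fingerprints and skin-irritation labels (1 = irritant).
fingerprints = np.array([
    [1, 0, 1, 1, 0, 0, 1, 0],
    [1, 0, 1, 0, 0, 0, 1, 0],
    [0, 1, 0, 0, 1, 1, 0, 1],
    [0, 1, 0, 1, 1, 1, 0, 0],
    [1, 1, 1, 1, 0, 0, 1, 0],
    [0, 0, 0, 0, 1, 1, 0, 1],
], dtype=bool)
labels = np.array([1, 1, 0, 0, 1, 0])

knn = KNeighborsClassifier(n_neighbors=3, metric="jaccard", algorithm="brute")
knn.fit(fingerprints, labels)

new_compound = np.array([[1, 0, 1, 1, 0, 0, 0, 0]], dtype=bool)
print(knn.predict(new_compound))        # [1]: all three nearest neighbors are irritants
print(knn.predict_proba(new_compound))  # share of neighbors voting for each label
```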

"Our automated approach clearly outperformed the animal test, in a very solid assessment using data on thousands of different chemicals and tests," Hartung says. "So it's big news for toxicology." Underwriter's Laboratories (UL), a company that specializes in developing public safety standards and testing against them, co-sponsored this work and is making the read-across software tool commercially available.

"One day perhaps, chemists will use such tools to predict toxicity even before synthesizing a chemical so that they can focus on making only non-toxic compounds," Hartung says.

"Machine learning of toxicological big data enables read-across structure activity relationships (RASAR) outperforming animal test reproducibility" was written by Tom Luechtefeld, Dan Marsh, Craig Rowlands, and Thomas Hartung.

Credit: 
Johns Hopkins Bloomberg School of Public Health

Rainy weather predicts bird distribution -- but climate change could disrupt it

image: Precipitation is the best predictor of Eastern Kingbirds' winter distribution.

Image: 
M. MacPherson

Understanding what environmental cues birds use to time their annual migrations and decide where to settle is crucial for predicting how they'll be affected by a shifting climate. A new study from The Auk: Ornithological Advances shows that for two species of flycatcher, one of the key factors is rain--the more precipitation an area receives, the more likely the birds are to be there during the non-breeding season.

Tulane University's Maggie MacPherson and her colleagues combined field techniques with species distribution models to investigate which environmental factors drove the migrations of Eastern Kingbirds and Fork-tailed Flycatchers. Using geolocators, devices that record a bird's daily location based on day length, they could track where individuals of each species went. The two species share similar behavior and habitat requirements, but differ in their range and migration strategies, and these strategies were compared to determine the influence of temperature, precipitation, and primary productivity (the amount of "green" vegetation). Precipitation turned out to be one of the most important predictors of their distribution, particularly in the non-breeding season.
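The paper's species distribution models are not specified in the release, but their core idea can be illustrated with a toy model that relates presence records to a single predictor such as precipitation. The survey data below are synthetic, not the study's.

```python
# Toy sketch of the core idea behind a species distribution model (not the study's models):
# relate presence/absence records to an environmental predictor such as precipitation.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic non-breeding-season survey sites: monthly precipitation (mm) and presence.
precip_mm = rng.uniform(20, 300, size=200)
presence_prob = 1 / (1 + np.exp(-(precip_mm - 150) / 40))  # wetter sites more likely occupied
presence = rng.binomial(1, presence_prob)

model = LogisticRegression().fit(precip_mm.reshape(-1, 1), presence)

# Predicted probability of occurrence at a dry site vs. a wet site.
print(model.predict_proba([[50.0], [250.0]])[:, 1])  # low at 50 mm, high at 250 mm
```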

MacPherson comments, "Although we understand how climate change is expected to affect regional temperature regimes, changes in patterns of seasonal precipitation remain unclear. As the locations of both species were positively correlated with the highest rainfall across the landscape during their non-breeding seasons, our research emphasizes the need for a better understanding of how flexible they may be in adjusting locations under new rainfall regimes. More research is needed to better understand how migratory birds relying on current rainfall regimes could benefit from climate-conscious conservation planning."

"In the face of climate change, having seasonal species distribution models like these is powerful for helping understand the biology of the species, and also for predicting how a population might change in size and geography in the future, or a species' flexibility to adjust its migratory timing," adds Mississippi State University's Auriel Fournier, an expert on species distribution models who was not involved in the study. "All of those predictions are vital for conservation planning and decision making. The use of two related species with different life history traits is also exciting, as it makes the results more broadly applicable."

Credit: 
American Ornithological Society Publications Office

Using light for next-generation data storage

image: Scientists Xuanzhao Pan and Dr. Nick Riesen demonstrating a novel optical data storage platform.

Image: 
Elizaveta Klantsataya

Tiny, nano-sized crystals of salt encoded with data using light from a laser could be the next data storage technology of choice, following research by Australian scientists.

The researchers from the University of South Australia and University of Adelaide, in collaboration with the University of New South Wales, have demonstrated a novel and energy-efficient approach to storing data using light.

"With the use of data in society increasing dramatically due to the likes of social media, cloud computing and increased smart phone adoption, existing data storage technologies such as hard drive disks and solid-state storage are fast approaching their limits," says project leader Dr Nick Riesen, a Research Fellow at the University of South Australia.

"We have entered an age where new technologies are required to meet the demands of 100s of terabyte (1000 gigabytes) or even petabyte (one million gigabytes) storage. One of the most promising techniques of achieving this is optical data storage."

Dr Riesen and University of Adelaide PhD student Xuanzhao Pan developed technology based on nanocrystals with light-emitting properties that can be efficiently switched on and off in patterns that represent digital information. The researchers used lasers to alter the electronic states, and therefore the fluorescence properties, of the crystals.

Their research shows that these fluorescent nanocrystals could represent a promising alternative to traditional magnetic (hard disk drive) and solid-state (solid-state drive) data storage or Blu-ray discs. They demonstrated rewritable data storage in crystals that are hundreds of times smaller than what is visible to the human eye.

"What makes this technique for storing information using light interesting is that several bits can be stored simultaneously. And, unlike most other optical data storage techniques, the data is rewritable," says Dr Riesen.

This 'multilevel data storage' - storing several bits on a single crystal - opens the way for much higher storage densities. The technology also allows very low-power lasers to be used, increasing its energy efficiency and making it more practical for consumer applications.
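The release does not describe the encoding scheme, but the generic logic of multilevel storage is that N distinguishable fluorescence levels per crystal carry log2(N) bits. Below is a hypothetical four-level mapping, not the actual device scheme.

```python
# Generic multilevel-encoding sketch (not the actual device scheme): with 4
# distinguishable fluorescence levels per nanocrystal, each crystal stores 2 bits.
from math import log2

LEVELS = [0.00, 0.33, 0.66, 1.00]          # hypothetical normalized fluorescence levels
BITS_PER_CRYSTAL = int(log2(len(LEVELS)))  # = 2

def encode(bits):
    """Group a bit string into 2-bit symbols and map each to a fluorescence level."""
    return [LEVELS[int(bits[i:i + 2], 2)] for i in range(0, len(bits), 2)]

def decode(levels):
    """Read levels back by choosing the nearest reference level for each crystal."""
    out = ""
    for level in levels:
        symbol = min(range(len(LEVELS)), key=lambda k: abs(LEVELS[k] - level))
        out += format(symbol, "02b")
    return out

written = encode("10110001")
print(written)                          # [0.66, 1.0, 0.0, 0.33]
print(decode([0.7, 0.95, 0.05, 0.3]))   # "10110001" despite small readout noise
```

Each doubling of the number of distinguishable levels adds one bit per crystal, which is where the density gain over binary storage comes from.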

"The low energy requirement also makes this system ideal for optical data storage on integrated electronic circuits," says Professor Hans Riesen from the University of New South Wales.

"These results showcase the benefits of establishing complementary research capabilities and infrastructure at collaborating universities - this has been a deliberate strategy in the photonics domain that is bearing fruit across a number of projects," says Professor Tanya Monro, DVC-R at the University of South Australia.

The technology also has the potential to push forward the boundaries of how much digital data can be stored through the development of 3D data storage.

"We think it's possible to extend this data storage platform to 3D technologies in which the nanocrystals would be embedded into a glass or polymer, making use of the glass-processing capabilities we have at IPAS," says Professor Heike Ebendorff-Heidepriem, University of Adelaide. "This project shows the far-reaching applications that can be achieved through transdisciplinary research into new materials."

Dr Riesen says: "3D optical data storage could potentially allow for up to petabyte level data storage in small data cubes. To put that in perspective, it is believed that the human brain can store about 2.5 petabytes. This new technology could be a viable solution to the great challenge of overcoming the bottleneck in data storage."

The research is published in the open access journal Optics Express.

Credit: 
University of South Australia

If you build it, the birds will come -- if it meets their criteria

image: California Gnatcatcher's habitat needs go beyond simply having the right plants in place.

Image: 
A. Fisher

A study published in The Condor: Ornithological Applications presents a case study on how bird surveys can better inform conservation and vegetation restoration efforts. Previous conservation methods have emphasized plants as the key to recreating habitat preferred by a sensitive animal. However, this study shows that there's more to the coastal sagebrush habitat of California Gnatcatchers than just having the right plants present. Abiotic components such as topography and soil are important drivers of the biotic components, including plants, which pair together to make the complete ecosystem these birds need. Given this more complete perspective, future conservation efforts would be wise to consider all of the variables that make up an animal's habitat.

The U.S. Fish and Wildlife Service's Clark Winchell and Colorado State University's Paul F. Doherty, Jr., set out to find a way to improve the traditional "single-species-oriented" conservation plan. They used bird survey data to more accurately identify favorable habitat for California Gnatcatcher occupancy and discovered that as the ratio of coastal sagebrush increased from 10% to 40%, the probability of colonization and presence of these birds tripled. The amount of openness in the sagebrush habitat also correlated with the birds' occupancy probability (30-40% openness was ideal for the birds). Elevation and soil texture also influenced suitable habitat, with lower elevations and loam or sandy loam soils most preferred. Winchell and Doherty also found that the gnatcatchers preferred southern aspects, shallow slopes, and inland areas over other options. Being so detailed and using such a fine scale allowed more specific areas to be identified as suitable for gnatcatchers. Thorough research such as this will better aid conservation efforts, both by informing where restoration might be most successful and by providing restoration targets.

Winchell comments, "Restoration ecologists are generally not gnatcatcher biologists, and vice versa. Sometimes we tend to place restoration projects where land becomes available after political negotiations. We may want to consider what is that parcel of land trying to tell us--what does the land want to be, so to speak--versus assuming we can dictate the final outcome for a location. Considering the entire functionality of the surrounding ecosystem, including the physical components, the biological community, and understanding the dynamism of the ecosystem will lead to improved restoration and wildlife management outcomes and our study is one small step in that direction."

These results correlating soil, vegetation, and gnatcatcher occupancy harken back to lessons that Aldo Leopold taught us--namely, to start with the land and work with the land when managing wildlife. Leopold's holistic approach to conservation included the soils, waters, plants, and animals and is still relevant today.

Credit: 
American Ornithological Society Publications Office

Balancing foreign judgments against domestic policies

SMU Office of Research and Tech Transfer - A legal scholar at the Singapore Management University (SMU) has outlined a set of guiding principles to help judges decide what to do if a foreign judgment comes into conflict with domestic public policy. These findings were recently published in the Journal of Private International Law.

For most legal cases, a judgment passed by a foreign court will be recognised by other common law countries if important conditions are met, such as if the judgment is final, and if the foreign court has international jurisdiction over the parties involved.

Yet, in a small number of cases, a foreign judgment may be refused recognition when found to be contrary to a country's public policy. Here, the public policy doctrine serves as a defence to foreign judgments, and can play a crucial role in protecting a community's interests.

Not everyone, however, is a fan of the public policy doctrine. Critics have said that it entangles judges with controversial moral and political issues, while its uncertain and ambiguous nature makes it a tool of last resort only when all other legal options are exhausted.

"The public policy doctrine in private international law has not really been the subject of much academic commentary, contributing to its relatively amorphous nature," said study author Mr. Kenny Chng Wei Yao, a lecturer of law at the SMU School of Law.

Mr. Chng examined cases across major common law jurisdictions in which the public policy doctrine had been invoked, successfully or otherwise. He then discussed whether the cases in which the public policy doctrine has been invoked are theoretically justifiable.

To provide practical guidance to judges, Mr. Chng proposed a set of two principles that the courts should consider in their application of the public policy doctrine: first, to uphold a universal norm of justice, and second, to protect community interests.

"I argue that there are two main principles that should ground the public policy doctrine. In the first principle, the courts are in fact thinking about universal norms of justice that are not specific to any community," Mr. Chng explained.

"The second principle acknowledges the sovereignty of individual nations, balancing out the first principle. Here, the public policy doctrine should be invoked to refuse recognition of a foreign judgment or law if it poses some kind of danger to community interests."

According to Mr. Chng, the relationship between the two principles can be governed by the concept of subsidiarity, an organising principle that has its roots in Catholic social thought and which also serves as a theoretical foundation for European Union law.

Credit: 
Singapore Management University