Tech

Clark University researchers: Development threatens tropical forests

Image: Anthony Bebbington. Credit: Clark University.

WORCESTER, Mass.--Tropical forests in the Amazon, Indonesia, and Mesoamerica face multiple threats from mining, oil and gas extraction, and massive infrastructure projects over the next two decades, according to a study by Clark University researchers and their international colleagues in the Proceedings of the National Academy of Sciences (PNAS). This encroachment threatens not only forests and biodiversity but also indigenous and rural communities.

Across the world, governments and investors are teaming up on huge regional development projects to build roads, railways, port facilities, and waterways. Their goal is to access areas deep within the forest that are rich with known and potential mineral, oil, coal, and natural gas reserves along with other natural resources, the authors point out.

"Governments have made framework policy commitments to national and cross-border infrastructure integration, increased energy production, and growth strategies based on further exploitation of natural resources. This reflects political settlements among national elites that endorse resource extraction as a pathway towards development," according to the authors of "Resource Extraction and Infrastructure Threaten Forest Cover and Community Rights," an article published December 3 in PNAS.

Five of the 12 authors have Clark ties, including lead investigator Anthony Bebbington, a Fellow of the National Academy of Sciences and the Milton P. and Alice C. Higgins Professor of Environment and Society and former director of Clark's Graduate School of Geography. Currently, Bebbington is serving as Australia Laureate Fellow at the University of Melbourne.

The other authors include Denise Humphreys Bebbington, research associate professor in Clark's International Development, Community and Environment Department; and three more researchers associated with the Graduate School of Geography: Laura Aileen Sauls, a doctoral candidate; John Rogan, associate professor; and Kimberly Johnson '16, M.S.'17, who graduated from the Accelerated B.A./Master's Degree Program in geographic information sciences and now works as a data mapping analyst at the Institute for Health Metrics and Evaluation in Seattle.

The Clark researchers partnered with researchers and officials from community- and environment-focused NGOs and academic institutions in Indonesia, Brazil, Peru, the Netherlands, El Salvador, and Australia.

The researchers used geospatial and qualitative data, including that gathered through interviews and workshops with stakeholders in Brazil, Indonesia, Peru, Mexico, Norway, and El Salvador.

Past research has shown that the expansion of infrastructure has led to deforestation, they say. But fewer investigators have studied how the complex interactions between two massive forces -- infrastructure investment and resource extraction -- have combined to threaten forests and communities, according to the authors.

"This rich body of work on deforestation and associated policy recommendations focuses far more on agriculture and forestry than on resource extraction or associated large-scale infrastructure," the authors explain. "There is even less analysis of the types of social and political relationships that have been created by these large-scale investments and which become self-perpetuating through lobbying and the re-entrenchment of power relations."

They stressed the urgency of this research, given the monumental infusion of government support for development across the world. In 2014, for example, the Group of 20 -- an international forum of governments, including the United States -- "committed to invest up to an additional $90 trillion in global infrastructure by 2030, and in 2016 committed to link infrastructure master plans across world regions," the authors note.

They urged researchers to shed light on the governmental corruption and human rights abuses that often accompany these massive investments, and to promote policy-making that protects forests and communities. In Honduras, for example, violence and murder surrounded the Agua Zarca hydroelectric conflict in 2016.

"Such investment exacerbates existing conflicts and creates new ones, reflected in the most extreme cases by the killings of environmental defenders," the researchers stress. Citing the work of Global Witness, they note, "Globally, 200 such killings were reported in 2016, and 207 were reported in 2017, the majority linked to contestations over mining, logging, hydropower, agro-industrial, and infrastructure projects."

To protect forests and human rights, the authors suggest "new and different approaches to development that prioritize these objectives while accommodating some resource extraction and agroindustry priorities."

Examples include zoning to set aside forest areas and maintain communities; energy strategies focused on reducing dam building and fossil fuel extraction and eliminating coal; community-based forest management; financial incentives to decrease forest conversion; promotion of socially and environmentally responsible manufacturing and production; and especially important, a substantial community and human rights agenda.

The authors spell out converging patterns of resource extraction and forest loss. The direct impacts of mining and oil and gas extraction on forests were limited from 2000 to 2014, they say. However, there were exceptions. Forests were particularly affected by coal mining in Sumatra and Kalimantan; iron ore mining, charcoal and pig iron production in Brazil; and artisanal and small-scale gold mining in Madre de Dios, Peru, along rivers in the Brazilian and Colombian Amazon, across Kalimantan, and in Nicaragua.

On the other hand, forest loss and degradation has resulted more from the indirect impacts of resource extraction, combined with infrastructure investment. When roads are built to access resources, the government "signals" that those areas might be settled and developed.

This phenomenon has occurred in places like the Petén region in northern Guatemala and in Madre de Dios in southeastern Peru, part of the Amazon Basin. After the construction of the Southern Interoceanic Highway linking Brazil and Peru, for example, artisanal and small-scale gold mining intensified. In the future, mineral development could expand elsewhere in the Amazon Basin given large-scale commitments to invest in roads, waterways, and railroads as well as phenomena such as Venezuela's designation of a "mining arc" covering 12 percent of its territory, now under military control with suspension of constitutional rights. In Indonesia, forests and communities in Kalimantan are threatened by coal mining and a proposed railway running through the forest to a coastal port.

Meanwhile, governmental policies that promote growth will have major impacts on forests and contribute to increased greenhouse gas emissions, the authors say. They cite as examples Brazil's Growth Acceleration Program, building highways, waterways, and hydroelectric power plants across the Amazon; Honduras' investments in hydroelectric energy, mining, and petroleum exploration; and Nicaragua's focus on gold mining and exports. This political commitment to infrastructure projects without sufficient consideration of possible social and environmental impacts is also apparent at a subnational level where regulations are even weaker. This has led to further overlaps with indigenous lands and protected areas as in the case of Loreto, Peru, notes co-author César Gamboa of Derecho, Ambiente y Recursos Naturales (Law, Environment and Natural Resources), an NGO based in Lima, Peru.

Added to that are multi-state infrastructure and energy agreements, including major initiatives that will increase transportation across the Amazon and Mesoamerica and deepen integration across the Indonesian archipelago.

"What happens to the forests of Amazonia, Indonesia, and Mesoamerica over the next two decades will depend on which claims over these forests prevail in these contestations over land use," the authors conclude.

Credit: Clark University

Why a curious crustacean could hold secret to making renewable energy from wood

Image: A gribble on a piece of wood. Credit: Claire Steele-King and Katrin Besser, University of York.

Scientists studying the digestive system of a curious wood-eating crustacean have discovered it may hold the key to sustainably converting wood into biofuel.

Gribble are small marine invertebrates that have evolved to perform an important ecological role: eating the abundant supplies of wood washed into the sea from river estuaries.

They can also be something of a marine menace, consuming the wood of boats and piers and causing considerable damage in the process.

Until now, the question of how gribble break through lignin - the highly resistant coating that wraps around the sugar polymers that compose wood - has been a mystery.

The team of scientists, led by the University of York, studied the hind gut of gribble and discovered that hemocyanins - the same proteins that make the blood of invertebrates blue - are crucial to their ability to extract sugars from wood.

The discovery brings researchers a step closer to identifying cheaper and more sustainable tools for converting wood into low carbon fuel - a promising alternative to fossil fuels like coal and oil.

Hemocyanins are a group of proteins better known for their role in transporting oxygen in invertebrates, in a similar way to haemoglobin in vertebrates. While haemoglobin binds oxygen through its association with iron atoms, giving blood its red colour, hemocyanins do this with copper atoms, producing a blue colour.

Oxygen is a highly reactive chemical, and gribble have harnessed the oxidative capabilities of hemocyanins to attack the lignin bonds that hold the wood together.

The research, which involved teams from the Universities of York, Portsmouth, Cambridge and Sao Paulo, has revealed that treating wood with hemocyanins more than doubles the amount of sugar released - matching the yield of the expensive and energy-consuming thermochemical pre-treatments currently used in industry.

Professor Simon McQueen-Mason, from the Department of Biology at the University of York, who led the research team, said: "Gribble are the only animal known to have a sterile digestive system. This makes their method for wood digestion easier to study than that of other wood-consuming creatures such as termites, which rely on thousands of gut microbes to do the digestion for them."

"We have found that gribble chew wood into very small pieces before using hemocyanins to disrupt the structure of lignin. GH7 enzymes, the same group of enzymes used by fungi to decompose wood, are then able to break through and release sugars."

With pressure mounting for global action to be taken on climate change, many countries are rapidly trying to de-carbonise by switching to renewable energy sources such as biofuels.

Woody plant biomass is the most abundant renewable carbon resource on the planet, and, unlike using food crops to make biofuels, its use doesn't come into conflict with global food security.

Co-author of the paper, Professor Neil Bruce, from the Department of Biology, said: "In the long term this discovery may be useful in reducing the amount of energy required for pre-treating wood to convert it to biofuel.

"The cellulase-enhancing effect of the haemocyanin was equivalent to that of thermochemical pre-treatments used in industry to allow biomass hydrolysis, suggesting new options for bio-based fuel and chemicals production."

Lead author of the report, Dr Katrin Besser, added "it is fascinating to see how nature adapts to challenges and this discovery adds to evidence that haemocyanins are incredibly versatile and multi-functional proteins."

Credit: University of York

Undercover investigation: Socio-economic survey of pangolin hunting in Assam, India

Image: A pangolin in its natural habitat in Assam, India. Credit: Neil D'Cruze.

Alarming footage captured by World Animal Protection and the Wildlife Conservation Research Unit (WildCRU) at University of Oxford reveals the heart-breaking moment a pangolin is brutally killed for its body parts to be sold on the black market in Assam, north-eastern India.

The footage was captured by an undercover researcher on their mobile phone. It shows a terrified pangolin hiding from hunters in a hollowed-out tree, clinging on as its tail is tugged. The hunters use axes to cut into the tree but, failing to remove the desperate animal, light a fire to smoke it out. As the pangolin starts to suffocate and lose consciousness, it makes a bolt for freedom but is captured, bagged, and taken to a hut where the next stage of the ordeal takes place. There the pangolin is repeatedly bludgeoned with a machete until it can barely move. Bleeding, and possibly still alive, it is then thrown into a cauldron of boiling water, where its tragic struggle comes to an end.

Pangolins are often referred to as the world's most trafficked mammal, and this footage demonstrates the huge cruelty the animals endure when hunted. The harrowing clip is part of a two-year study, conducted by researchers from World Animal Protection and the University of Oxford, into traditional hunting practices in the state of Assam, which borders Bhutan. The study is published in the open-access journal Nature Conservation.

Interviews conducted by researchers with over 140 local hunters found that pangolins were largely targeted for their scales, which sell at a premium: hunters earn the equivalent of four months' average salary for a single pangolin. The hunters from these communities were clearly unaware of the part they play in the international trafficking trade, while the illegal traders who then sell the animal products across borders on the black market go on to make a large profit.

Pangolin scales are used in traditional Asian medicine particularly in China and Vietnam. They are made of keratin, the same material that makes human fingernails and hair, and they have no proven medicinal value. Pangolin meat is also considered to be a delicacy in some countries, and the scales are also used as decorations for rituals and jewellery. They are considered to be at high risk of extinction primarily as a result of illegal poaching.

Dr Neil D'Cruze, Global Wildlife Advisor at World Animal Protection and lead researcher said:

"Suffocated with smoke, beaten and boiled alive - this is a terrifying ordeal and pangolins clearly suffer immensely.

"This footage shines a spotlight on how shocking the practice of hunting pangolins truly is. Not only is this a major conservation issue - it's a devastating animal welfare concern. If we want to protect pangolins from pain and suffering in the countries they come from, we need to tackle the illegal poaching trade."

Professor David Macdonald, WildCRU, Department of Zoology, Oxford University said:

"Increasing demand driven by traditional Asian medicine is making pangolins a lucrative catch. It's easy to see why they are being commercially exploited, as scales from just one pangolin can offer a life changing sum of money for people in these communities, but it's in no way sustainable. Wild pangolin numbers are beginning to plummet."

Reliable estimates of how many pangolins remain in the wild are lacking, although it is thought that over a million individual pangolins were taken from the wild between 2000 and 2013. There are eight species of pangolin, all of which are considered threatened with extinction on the IUCN Red List of Threatened Species.

World Animal Protection works tirelessly to prevent cruelty to animals around the world. Although it is well documented that pangolins are being hunted and trafficked, until now, the immense suffering and cruelty that these animals endure when they are hunted has remained relatively overlooked.

To combat the global trade in their bodies and scales, and to protect pangolins from the unimaginable suffering they endure, World Animal Protection is calling for:

- Strong enforcement of national and international laws;

- Removal of pangolins from the Pharmacopoeia of the People's Republic of China - the traditional medicine handbook for the industry;

- Investment in and promotion of herbal and synthetic alternatives;

- Combined and coordinated efforts by governments, NGOs and the traditional Asian medicine community to eliminate consumer demand for pangolin-based traditional Asian medicines, particularly in China and Vietnam;

- Support for alternative livelihoods, alleviation of poverty and education programmes within rural communities wherever pangolins are found globally, to stop the slaughter.

Credit: Pensoft Publishers

Salt-evolved zooplankton grow too slowly to block salt-induced algal blooms

Image: Although salt-tolerant zooplankton survive in moderately salty conditions, their slower growth meant they were unable to control algal blooms in the presence of the most common road salt, sodium chloride. Credit: Rensselaer.

TROY, N.Y. -- Small animals at the base of the freshwater food chain can rapidly adapt to salt pollution - from sources like winter road deicing, agriculture, and mining - but at a price. In a special December edition of Philosophical Transactions of the Royal Society B devoted to freshwater salt pollution, research shows that salt-adapted freshwater zooplankton grow 65 percent slower than regular zooplankton. Their slow growth cascades down the food chain in environments polluted with the most commonly found salt, triggering algal blooms.

"There's an upside and a downside to evolving salt tolerance," said Rick Relyea, principal investigator and professor and director of the Darrin Fresh Water Institute and member of the Center for Biotechnology and Interdisciplinary Studies at Rensselaer Polytechnic Institute. "The upside is the animals are more protected in moderately salt-polluted environments. But there is a cost for having this tolerance to salt, and as this work shows, one of the downsides is that the animals grow much more slowly."

The finding is one of two articles in the special edition "Salt in freshwaters: causes, ecological consequences and future prospects," published today. The issue explores how human activities that are increasing concentrations of salts in rivers, wetlands, and lakes "adversely affect freshwater biodiversity, and the ecosystem functions and services on which human societies rely."

Relyea's research team, which is investigating the effects of salt on aquatic environments through its work with the Jefferson Project on Lake George, also contributed an opinion piece calling attention to inadequate regulations. In "Regulations are needed to protect freshwater ecosystems from salinization," the research team pointed to regional inconsistencies in regulations governing acceptable levels of salinization, and a failure to differentiate between different types of salts, despite their vastly disparate environmental effects.

"If we want to prevent salt pollution from harming our freshwater ecosystems, we need consequential regulations informed by science that protect fresh waters across ecosystems, not political jurisdictions," said Matt Schuler, first author of the opinion article.

The research article, "Evolved tolerance to freshwater salinization in zooplankton: life-history, trade-offs, cross-tolerance and reducing cascading effects," is one of 14 Relyea's lab has published in the past two years on the effects of salt on aquatic ecosystems. In 2017, the team published research showing that a common species of zooplankton, Daphnia pulex, could evolve genetic tolerance to moderate levels of road salt in as little as two and a half months. Later that year, research conducted in cooperation with Rensselaer researcher Jennifer Hurley showed that the salt-adapted zooplankton had suppressed circadian rhythms.

"We knew there might be trade-offs, and we wanted to understand how those tradeoffs played out, from the individual organism to the ecosystem," said Bill Hintz, first author of the research article.

Using descendants of the same salt-tolerant zooplankton the team had raised the previous year, one finding was quickly apparent: tolerance is persistent. Although about 30 generations separated the original population from their descendants, both groups showed the same tolerance to salt.

Ordinary zooplankton, which eat algae, die in moderately salty conditions. This causes a "cascading effect" of salt pollution whereby salinized freshwater is susceptible to algal blooms. Although the salt-tolerant zooplankton survive in moderately salty conditions, their slower growth meant tolerant populations were unable to control algal blooms in the presence of the most common road salt, sodium chloride. However, salt-tolerant zooplankton were able to control algal blooms in the presence of moderate pollution from two alternative road salts: magnesium chloride and calcium chloride.

"Because the salt-tolerant Daphnia are able to survive, you would hope that you wouldn't see a big algal bloom. We see this for some salt types, but that's not what we're seeing for all salt types," said Relyea. "The zooplankton are protected, but it doesn't always stop the phytoplankton from blooming."

The team also established that, although the zooplankton evolved higher salt tolerance in the presence of sodium chloride, they were also more tolerant of other common salt pollutants, such as magnesium chloride and calcium chloride, a trait known as "cross-tolerance."

Credit: Rensselaer Polytechnic Institute

Saltier waterways are creating dangerous 'chemical cocktails'

Image: Salt applied to roadways as a wintertime deicer has been shown to make significant contributions to increased salinity in freshwater streams in the United States, Europe and elsewhere. New research suggests that saltier water also liberates toxic metals and harmful nitrogen-containing compounds from streambeds and soils, creating dangerous "chemical cocktails" that can be more damaging than individual pollutants alone. Credit: Joseph Galella/University of Maryland.

A recent study led by University of Maryland researchers found that streams and rivers across the United States have become saltier and more alkaline over the past 50 years, thanks to road deicers, fertilizers and other salty compounds that humans indirectly release into waterways. The team named this effect "Freshwater Salinization Syndrome."

New research from the same UMD-led group takes a closer look at the global, regional and local consequences of Freshwater Salinization Syndrome. The group found that salty, alkaline freshwater can release a variety of chemicals, including toxic metals and harmful nitrogen-containing compounds, from streambeds and soils in drainage basins. The results further suggest that many of these chemicals travel together throughout watersheds, forming "chemical cocktails" that can have more devastating effects on drinking water supplies and ecosystems when compared with individual contaminants alone.

The group's latest work, which includes field observations from the Washington, D.C. and Baltimore metropolitan areas, highlights the need for new and more comprehensive regulation and pollution management strategies. The research team published its findings December 3, 2018 in the journal Philosophical Transactions of the Royal Society B.

"The bottom line of our findings is that when humans add salt to waterways, that salt also releases a lot of dangerous collateral chemicals," said Sujay Kaushal, a professor of geology at UMD and lead author of the study. "It's clear that regulatory agencies need to find new ways to address these 'chemical cocktails' released by saltier water, rather than looking at individual freshwater pollutants one by one."

Salty, alkaline freshwater is already known to create big problems for drinking water supplies, urban infrastructure and natural ecosystems. For example, when Flint, Michigan, switched its primary water source to the Flint River in 2014, the river's high salt load combined with chemical treatments to make the water more corrosive, causing lead to leach from water pipes and creating that city's well-documented water crisis.

Kaushal and his colleagues' latest research project investigated the impacts of chemical cocktails created by saltier water in more detail. The group began by assessing previously published data from rivers in the U.S., Europe, Canada, Russia, China and Iran, substantially expanding the geographic boundaries of the researchers' previous work. Their analysis suggests that Freshwater Salinization Syndrome could be a global phenomenon, with the most conclusive support showing a steady trend of increased salt ions in both U.S. and European rivers. These trends trace back at least 50 years, with some data reaching back far enough to support a 100-year trend.
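Detecting that kind of long-term salinization ultimately comes down to fitting a trend to decades of concentration measurements. The sketch below illustrates the simplest version of such an analysis, an ordinary least-squares slope on annual mean chloride concentrations; the data are invented for illustration and are not the study's measurements:

```python
def ols_slope(years, values):
    """Ordinary least-squares slope of values against years,
    i.e. the average change in concentration per year."""
    n = len(years)
    my = sum(years) / n
    mv = sum(values) / n
    num = sum((y - my) * (v - mv) for y, v in zip(years, values))
    den = sum((y - my) ** 2 for y in years)
    return num / den

# Invented annual mean chloride concentrations (mg/L) for a river
# showing a steady rise over five decades.
years = list(range(1968, 2018))
chloride = [20 + 0.9 * (y - 1968) for y in years]

slope = ols_slope(years, chloride)  # mg/L per year
```

In practice, researchers use more robust trend tests that tolerate gaps and seasonality, but a positive, sustained slope over 50 or more years is the signature described here.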

"Given what we are finding, I continue to be surprised by the scope and magnitude of the recent degradation of Earth's surface waters," said study co-author Gene Likens, president emeritus of the Cary Institute of Ecosystem Studies and a distinguished research professor at the University of Connecticut. "The formation of novel chemical cocktails is causing deterioration far beyond my expectations."

In the snowy Mid-Atlantic states and New England, road salt applied to roadways in winter is a primary cause of Freshwater Salinization Syndrome. Kaushal and his colleagues took a deeper dive into the chemical consequences of road salt by performing detailed field studies in streams located near Washington, D.C. and Baltimore.

In one set of observations, the researchers sampled water from the Paint Branch stream near the UMD campus before, during and after a 2017 snowstorm. This aspect of the study allowed the team to trace the effects of road salt washed into the streams by the melting snow.

"We thought it would be interesting to get a view of the chemistry in an urban river throughout a snowstorm," said Kelsey Wood (B.S. '15, geology), a geology graduate student at UMD and a co-author of the study. "Salt concentrations during the snowstorm were surprisingly high--it was like we were analyzing sea water. But we weren't expecting such a high corresponding peak in metals."

Previous research has shown that very salty water can force metals--especially copper, cadmium, manganese and zinc--out of streambed soils and into stream water. In the Paint Branch stream, Kaushal and his colleagues noted large spikes in copper, manganese and zinc immediately following the snowstorm. In a similar set of observations in Washington, D.C.'s Rock Creek, the team observed notable spikes in cadmium, copper and zinc following other snowstorms.

In another series of experiments, the researchers artificially added salt to the Gwynns Falls stream near Baltimore to simulate what happens during a snowstorm and measured copper concentrations in the water before, during and after adding salt. The downstream data showed an instant spike in copper released from the streambed, suggesting a direct connection between the stream's salt content and copper in the water.

Salt ion concentrations can stay high for months following a storm, Kaushal added. This lengthens the amount of time that salt can draw metals from the soil, resulting in harmful cocktails of metals and salts transported far downstream.

"Looking at water quality data over several months in the winter, salt remains high and rarely has a chance to return to baseline before the next storm comes through and more salt is put on the roads," said Kaushal, who also has an appointment in UMD's Earth System Science Interdisciplinary Center. "This high salt load not only liberates metals and other contaminants, but there is also evidence that the initial salt pulse releases other salt ions from the streambed and soils, such as magnesium and potassium, which further contribute to keeping overall salt levels high."

In the heavily agricultural Midwest and areas of the Mid-Atlantic states, agricultural fertilizers are a significant cause of Freshwater Salinization Syndrome. To investigate further, the research team looked at water quality data from 26 different U.S. Geological Survey (USGS) monitoring sites along rivers in these areas.

These USGS stations collected data every 15 minutes on salinity, pH and nitrate ions--a harmful byproduct of agricultural fertilizers and other contaminants. These high-frequency measurements gave the research team valuable real-time insights, with several of the rivers showing a clear and nearly immediate connection between increased salinity and nitrate concentrations.
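With readings every 15 minutes, the link between salinity and nitrate can be checked with a straightforward correlation of the two time series. A minimal sketch of that kind of analysis follows; the readings are made up for illustration and are not the USGS data:

```python
import math
import statistics

def pearson_r(xs, ys):
    """Pearson correlation between two equal-length time series."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical 15-minute readings: specific conductance (a common
# salinity proxy, in microsiemens/cm) and nitrate (mg/L as N)
# rising and falling together as a salt pulse moves through.
conductance = [210, 230, 260, 310, 405, 520, 610, 580, 500, 430]
nitrate = [1.1, 1.2, 1.3, 1.6, 2.0, 2.6, 3.1, 2.9, 2.4, 2.1]

r = pearson_r(conductance, nitrate)
```

A correlation near 1 on co-timed series is consistent with the "clear and nearly immediate connection" the team reports, though the published analysis is more involved than this sketch.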

"To me, this study highlights the need to view salt as an emerging contaminant in freshwater," said Shahan Haq (B.S. '14, physical sciences), a geology graduate student at UMD and a co-author of the study. "Salt's ability to move heavy metals like copper from sediments into the water could have dangerous implications for our drinking water and could be toxic to wildlife. Our observations suggest that some rivers are already at risk, especially here in the eastern U.S. directly following road salt applications."

Credit: University of Maryland

Clinical trial to test male contraceptive gel’s efficacy launches

Three U.S. sites are enrolling couples in the first clinical trial to test the safety and efficacy of a gel for men to prevent unintended pregnancy. Today’s launch was announced jointly by the Population Council, the National Institute of Child Health and Human Development (NICHD), the Los Angeles Biomedical Research Institute and the University of Washington School of Medicine.

Force Push VR brings Jedi powers to life

Image: Force Push is a novel VR technique that allows users to move objects with unprecedented nuance. Credit: Virginia Tech.

Fans of the Star Wars franchise will have to wait more than a year to get their fix of Jedi-laden telekinetic spectacles on the big screen. The yet-to-be-titled Episode IX, the final installment of the space saga as envisioned in 1977, won't be released until December 2019.

In the interim, stalwart practitioners of Jedi ways and other Force-sensitive beings can look to the small screen and thank Virginia Tech researchers for a recently developed virtual reality technique called Force Push.

Force Push gives users the ability to move faraway objects with Yoda-like calm, nuance, and focus, using a new approach to remote object manipulation in VR.

"You basically push the object in the direction you want it to move to, just like in Star Wars when the Jedi masters try to move an object that's placed remotely, they can push or pull it," said Run Yu, Ph.D. candidate in the Department of Computer Science and the Institute for Creativity, Technology, and the Arts. Yu is first author on the recently published article in Frontiers in ICT detailing the research.

It's as simple as using subtle hand gestures to push, pull, or twirl objects. Users employ their bare hands using a natural gesture-to-action mapping for object manipulation in a VR setting.

"We wanted to try and do this without any device, just using your hands, and also do it with gestures in a way that's more playful," said Doug Bowman, the Frank J. Maher Professor of Computer Science and director of the Center for Human Computer Interaction.

Force Push provides a more physical, nuanced experience than traditional hand controllers allow in VR. It responds to the speed and magnitude of hand gestures to accelerate or decelerate objects in a way that users can understand intuitively.

The ability to respond to nuanced hand movement is due to the technique's novel physics-driven algorithms. Dynamically mapping rich features of input gestures to properties of physics-based simulation made the interface controllable in most cases. With Force Push, it's just as easy for users to apply the gentlest of nudges to an object as it is to throw a heavy object across the room. The researchers also believe the physics-based technique makes Force Push more plausible, so that users have a "realistic" experience of these magical powers.
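The paper's core idea, mapping features of an input gesture onto forces in a physics simulation, can be sketched roughly as follows. All names, constants, and the damped point-mass model here are illustrative assumptions, not the authors' implementation:

```python
import math

def gesture_to_force(hand_velocity, gain=2.5, exponent=1.5):
    """Map a push gesture's velocity (m/s, 3-vector) to a force on the
    remote object. A superlinear exponent keeps gentle nudges gentle
    while fast flicks produce large accelerations. Constants are
    illustrative, not taken from the published technique."""
    speed = math.sqrt(sum(v * v for v in hand_velocity))
    if speed < 0.05:  # dead zone: ignore hand-tracking jitter
        return (0.0, 0.0, 0.0)
    magnitude = gain * speed ** exponent
    return tuple(v / speed * magnitude for v in hand_velocity)

def step_object(position, velocity, force, mass=1.0, drag=0.8, dt=1 / 90):
    """Advance a simple damped point mass one frame (90 Hz, a typical
    VR refresh rate)."""
    accel = tuple(f / mass - drag * v for f, v in zip(force, velocity))
    velocity = tuple(v + a * dt for v, a in zip(velocity, accel))
    position = tuple(p + v * dt for p, v in zip(position, velocity))
    return position, velocity

# A gentle nudge and a fast flick yield very different force magnitudes.
gentle = gesture_to_force((0.2, 0.0, 0.0))
flick = gesture_to_force((2.0, 0.0, 0.0))
```

In the actual system the force would be handed to Unity's physics engine rather than integrated by hand; the point of the sketch is the nonlinear gesture-to-force mapping that makes both delicate nudges and hard throws feel natural.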

To perform user experiments, the team used an Oculus Rift CV1 headset for display and a Leap Motion controller for hand tracking. The virtual environment was developed in the Unity game engine, whose native physics engine drove the physics-based simulation of the Force Push interface.
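The published algorithms are not reproduced here, but the general idea of a physics-driven gesture mapping can be sketched in a few lines. This is a minimal illustration, with hypothetical function names, gains, and exponents chosen only to show how a non-linear speed-to-force mapping lets the same gesture deliver both gentle nudges and forceful throws:

```python
# Hypothetical sketch of a Force Push-style mapping (constants and names are
# illustrative, not from the published system): gesture speed is mapped
# non-linearly to a force fed into a simple physics simulation.

def gesture_to_force(hand_speed_m_s, gain=2.0, exponent=1.5):
    """Faster push gestures yield disproportionately stronger forces;
    slow gestures produce gentle nudges."""
    return gain * hand_speed_m_s ** exponent

def step_object(velocity, force, mass=1.0, drag=0.5, dt=1/90):
    """Advance the object's velocity one 90-Hz frame under force and drag."""
    accel = force / mass - drag * velocity
    return velocity + accel * dt

# A gentle nudge versus a vigorous push, each held for 10 frames:
gentle = 0.0
for _ in range(10):
    gentle = step_object(gentle, gesture_to_force(0.1))  # slow hand motion

strong = 0.0
for _ in range(10):
    strong = step_object(strong, gesture_to_force(2.0))  # fast hand motion

print(gentle, strong)  # the fast gesture imparts far more velocity
```

In a real engine the force would be applied to a rigid body along the gesture direction each frame; the point of the sketch is only that driving a physics simulation, rather than directly setting object positions, is what makes both delicate and forceful manipulation feel natural.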

"Every week we kind of tweak something different in order to make the experience feel right," said Bowman. "But now it feels really cool."

Credit: 
Virginia Tech

System can rapidly and accurately detect tumor margins during breast cancer surgery

Scientists from the RIKEN Cluster for Pioneering Research (CPR), Osaka University, and collaborators have developed a new rapid and inexpensive way to accurately detect the margins between cancer and non-cancerous tissue during breast surgery. Their system is noteworthy in that it can detect the morphology of the cells, differentiating between cells that are more or less dangerous.

Today, breast-conserving surgery is widely used for the treatment of breast cancer. As a result, finding exactly where a tumor ends and where the healthy tissue begins is an important--but difficult--task for cancer surgeons. Patients hope to keep as much of their healthy breast as possible, but not removing enough can lead to recurrences. At present, the most popular method for finding these boundaries is frozen section analysis, but it is time-consuming and labor-intensive: tissues have to be taken and examined during the surgery by a pathologist, in a process that can take as long as half an hour.

Now, however, in a study published in Advanced Science, scientists have developed a way to sensitively, selectively, and quickly detect surgical margins by using a "click-to-sense" acrolein probe that conjugates with the components of live breast cancer cells. Using resected stumps from patients during surgery, they found that the method is both sensitive, in that it is almost as accurate as pathology in identifying tumor tissue, and selective, in that it rarely misidentifies non-tumor tissue as tumor.

The secret to the method is acrolein, a highly toxic chemical that is generated in tumor cells and other cells undergoing oxidative stress. Previously, the group had developed an azide probe that "clicks" to acrolein and then can be made to glow, giving visual clues on the concentration of acrolein in cells.

Using this method, they analyzed tissues in real-time from a group of patients who had given consent to participating in the study. They took 30 stumps of cancerous tissue and 30 of normal tissue resected from patients during surgery, and examined them using the fluorescence-based acrolein probe. Using the optimal amount of the probe, they found that both the sensitivity and selectivity were 97 percent. According to Shinzaburo Noguchi of Osaka University, whose team performed the surgeries, "We were quite surprised that the probe developed by RIKEN could so accurately and rapidly identify tissues. This method seems to have the potential to be a great advance for breast-conserving breast cancer surgery."
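The arithmetic behind figures like these is straightforward. The sketch below shows how sensitivity and selectivity (specificity) are computed from a validation set; the confusion counts are hypothetical, chosen only to illustrate how percentages near the reported 97 percent arise from 30-sample groups:

```python
# Sensitivity and selectivity from confusion counts (hypothetical numbers,
# for illustration only -- the study's exact counts are not given here).

def sensitivity(true_pos, false_neg):
    # Fraction of actual tumor samples the probe flags as tumor.
    return true_pos / (true_pos + false_neg)

def selectivity(true_neg, false_pos):
    # Fraction of normal samples the probe correctly leaves unflagged.
    return true_neg / (true_neg + false_pos)

# e.g. 29 of 30 tumor stumps flagged, 29 of 30 normal stumps left clear:
print(round(sensitivity(29, 1), 3))   # ~0.967
print(round(selectivity(29, 1), 3))   # ~0.967
```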

Looking to the future, Katsunori Tanaka of RIKEN, whose team developed the probe, says, "We are also excited that our system has been able to identify other types of cancer cells as well. In the current study we focused on breast cancer, which has a high prevalence, and we are planning to move it into clinical trials now and to launch studies with other types of cancer as well."

Credit: 
RIKEN

A new light on significantly faster computer memory devices

A team of scientists from Arizona State University's School of Molecular Sciences and Germany have published in Science Advances online today an explanation of how a particular phase-change memory (PCM) material can work one thousand times faster than current flash computer memory, while being significantly more durable with respect to the number of daily read-writes.

PCMs are a form of computer random-access memory (RAM) that store data by altering the state of matter of the "bits" (millions of which make up the device) between liquid, glass and crystal states. PCM technology has the potential to provide inexpensive, high-speed, high-density, high-volume, nonvolatile storage on an unprecedented scale.

The basic idea and material were invented by Stanford Ovshinsky back in 1975, but applications have languished due to a lack of clarity about how the material can execute the phase changes on such short time scales and to technical problems in controlling the changes with the necessary precision. Now high-tech companies like Samsung, IBM and Intel are racing to perfect it.

The semi-metallic material under current study is an alloy of germanium, antimony and tellurium in the ratio of 1:2:4. In this work the team probes the microscopic dynamics in the liquid state of this PCM using quasi-elastic neutron scattering (QENS) for clues as to what might make the phase changes so sharp and reproducible.

On command, the structure of each microscopic bit of this PCM material can be made to change from glass to crystal or from crystal back to glass (through the liquid intermediate) on the time scale of a thousandth of a millionth of a second just by a controlled heat or light pulse, the former now being preferred. In the amorphous or disordered phase, the material has high electrical resistance, the "off" state; in the crystalline or ordered phase, its resistance is reduced 1000 fold or more to give the "on" state.

These elements are arranged in two-dimensional layers between activating electrodes, which can be stacked to give a three-dimensional array with particularly high active-site density, making it possible for the PCM device to function many times faster than conventional flash memory while using less power.

"The amorphous phases of this kind of material can be regarded as 'semi-metallic glasses,'" explains Shuai Wei, who at the time was conducting postdoctoral research in SMS Regents' Professor Austen Angell's lab as a Humboldt Foundation Fellowship recipient.

"Contrary to the strategy in the research field of 'metallic glasses,' where people have made efforts for decades to slow down crystallization in order to obtain the bulk glass, here we want these semi-metallic glasses to crystallize as fast as possible in the liquid, but to stay as stable as possible in the glass state. I think we now have a promising new understanding of how this is achieved in the PCMs under study."

A deviation from the expected

Over a century ago, Einstein wrote in his Ph.D. thesis that the diffusion of particles undergoing Brownian motion could be understood if the frictional force retarding the motion of a particle was that derived by Stokes for a round ball falling through a jar of honey. The simple equation:
D (diffusivity) = kBT/6πηr

where T is the temperature, η is the viscosity and r is the particle radius, implies that the product Dη/T should be constant as T changes. The surprising thing is that this seems to be true not only for Brownian motion, but also for simple molecular liquids whose molecular motion is known to be anything but that of a ball falling through honey!
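As a quick numerical sanity check on the Stokes-Einstein relation (the constants below are standard values and typical room-temperature properties of water, not figures from the study):

```python
import math

# Stokes-Einstein: D = kB * T / (6 * pi * eta * r), here evaluated for a
# 1-nm-diameter sphere diffusing in water at room temperature.
kB = 1.380649e-23   # Boltzmann constant, J/K
T = 298.0           # temperature, K
eta = 8.9e-4        # viscosity of water at ~25 C, Pa*s
r = 0.5e-9          # particle radius, m

D = kB * T / (6 * math.pi * eta * r)
print(D)  # on the order of 5e-10 m^2/s, a typical molecular diffusivity

# The invariant discussed in the text: D*eta/T reduces to kB/(6*pi*r),
# so it stays fixed as T (and with it eta) changes, if the relation holds.
invariant = D * eta / T
```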

"We don't have any good explanation of why it works so well, even in the highly viscous supercooled state of molecular liquids until approaching the glass transition temperature, but we do know that there are a few interesting liquids in which it fails badly even above the melting point," observes Angell.

"One of them is liquid tellurium, a key element of the PCM materials. Another is water, which is famous for its anomalies, and a third is germanium, a second of the three elements of the GST type of PCM. Now we are adding a fourth, the GST liquid itself, thanks to the neutron scattering studies proposed and executed by Shuai Wei and his German colleagues, Zach Evenson (Technical University of Munich, Germany) and Moritz Stolpe (Saarland University, Germany), on samples prepared by Shuai with the help of Pierre Lucas (University of Arizona)."

Another feature in common for this small group of liquids is the existence of a maximum in liquid density, famous in the case of water. A density maximum closely followed, during cooling, by a metal-to-semiconductor transition is also seen in the stable liquid state of arsenic telluride (As2Te3), a first cousin to the antimony telluride (Sb2Te3) component of the PCMs, all of which lie on the "Ovshinsky" line connecting antimony telluride (Sb2Te3) to germanium telluride (GeTe) in the three-component phase diagram. Can it be that the underlying physics of these liquids has a common basis?

It is the suggestion of Wei and coauthors that when germanium, antimony and tellurium are mixed together in the ratio of 1:2:4 (or in others along Ovshinsky's "magic" line), both the density maxima and the associated metal-to-non-metal transitions are pushed below the melting point and, concomitantly, the transition becomes much sharper than in other chalcogenide mixtures.

Then, as in the much-studied case of supercooled water, the fluctuations associated with the response function extrema should give rise to extremely rapid crystallization kinetics. In all cases, the high temperature state (now the metallic state), is the denser.

"This would explain a lot," enthuses Angell. "Above the transition the liquid is very fluid and crystallization is extremely rapid, while below the transition the liquid stiffens up quickly and retains the amorphous, low-conductivity state down to room temperature. In nanoscopic 'bits', it then remains indefinitely stable until instructed by a computer-programmed heat pulse to rise instantly to a temperature where, on a nanosecond time scale, it flash-crystallizes to the conducting state, the 'on' state."

Lindsay Greer at Cambridge University has made the same argument, couched in terms of a "fragile-to-strong" liquid transition.

A second, slightly larger heat pulse can take the "bit" instantaneously above its melting point; then, with no further heat input and in close contact with a cold substrate, it quenches at a rate sufficient to avoid crystallization and is trapped in the semi-conducting state, the "off" state.

"The high resolution of the neutron time of flight-spectrometer from the Technical University of Munich was necessary to see the details of the atomic movements. Neutron scattering at the Heinz Maier-Leibnitz Zentrum in Garching is the ideal method to make these movements visible," states Zach Evenson.

Credit: 
Arizona State University

Artificial magnetic field produces exotic behavior in graphene sheets

video: This is an animated image of a graphene sheet twisting on top of another sheet. The study, led by a young Brazilian researcher, was featured on the cover of Physical Review Letters.

Image: 
Jose Lado, Spanish researcher and coauthor of the study

A simple sheet of graphene has noteworthy properties due to a quantum phenomenon in its electron structure named Dirac cones in honor of British theoretical physicist Paul Dirac (1902-1984), who was awarded the Nobel Prize for Physics in 1933.

The system becomes even more interesting if it comprises two superimposed graphene sheets, and one is very slightly turned in its own plane so that the holes in the two carbon lattices no longer completely coincide.

For specific angles of twist, the bilayer graphene system displays exotic properties such as superconductivity (zero resistance to electrical current flow).

A new study conducted by Brazilian physicist Aline Ramires with Jose Lado, a Spanish-born researcher at the Swiss Federal Institute of Technology (ETH Zurich), shows that the application of an electrical field to such a system produces an effect identical to that of an extremely intense magnetic field applied to two aligned graphene sheets.

An article on the study has recently been published in Physical Review Letters and was selected to feature on the issue's cover. It can also be downloaded from the arXiv platform.

Ramires is a researcher at São Paulo State University's Institute of Theoretical Physics (IFT-UNESP) and the South American Institute for Fundamental Research (ICTP-SAIFR). She is supported by the São Paulo Research Foundation (FAPESP) through a Young Investigator grant.

"I performed the analysis, and it was computationally verified by Lado," Ramires said. "It enables graphene's electronic properties to be controlled by means of electrical fields, generating artificial but effective magnetic fields with far greater magnitudes than those of the real magnetic fields that can be applied."

The two graphene sheets must be close enough together for the electronic orbitals of one to interact with the electronic orbitals of the other, she explained.

This means a separation of approximately one angstrom (10⁻¹⁰ meter, or 0.1 nanometer), which is the distance between two carbon atoms in graphene.

Another requirement is a small angle of twist for each sheet relative to the other - less than one degree (α < 1°).

Although entirely theoretical (analytical and numerical), the study has clear technological potential, as it shows that a versatile material such as graphene can be manipulated in hitherto unexplored regimes.

"The artificial magnetic fields proposed previously were based on the application of forces to deform the material. Our proposal enables the generation of these fields to be controlled with much greater precision. This could have practical applications," Ramires said.

The exotic states of matter induced by artificial magnetic fields are associated with the appearance of "pseudo-Landau levels" in graphene sheets.

Landau levels - named after the Soviet physicist and mathematician Lev Landau (1908-1968), Nobel Laureate in Physics in 1962 - are a quantum phenomenon whereby in the presence of a magnetic field, electrically charged particles can only occupy orbits with discrete energy values. The number of electrons in each Landau level is directly proportional to the magnitude of the applied magnetic field.
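That proportionality is quantitative: a standard result (not specific to this study) is that each spin- and valley-resolved Landau level holds eB/h electron states per unit area, so the capacity of each level scales linearly with the field B. A short check:

```python
# Degeneracy of a single Landau level per unit area: n = e*B/h.
# This is the textbook result; the field values below are illustrative.

e = 1.602176634e-19   # elementary charge, C
h = 6.62607015e-34    # Planck constant, J*s

def landau_degeneracy_per_m2(B_tesla):
    """Electron states per square meter in one Landau level at field B."""
    return e * B_tesla / h

n_10T = landau_degeneracy_per_m2(10.0)
print(n_10T)   # ~2.4e15 states per square meter at 10 tesla

# Doubling the field doubles the capacity of each level:
assert landau_degeneracy_per_m2(20.0) == 2 * n_10T
```

The enormous effective fields produced by the electrical-field scheme would therefore correspond to pseudo-Landau levels of very high degeneracy, which is what concentrates the electron interactions.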

"These states are well-localized in space; when particles interact at these levels, the interactions are much more intense than usual. The formation of pseudo-Landau levels explains why artificial magnetic fields make exotic properties such as superconductivity or spin liquids appear in the material," Ramires said.

Credit: 
Fundação de Amparo à Pesquisa do Estado de São Paulo

Meeting the challenge of engaging men in HIV prevention and treatment

image: Navy Petty Officer 1st Class Oliver Arceo draws blood from a sailor for routine HIV testing.

Image: 
US Navy photo by Petty Officer 1st Class Marie Montez

WHAT:
A new commentary from National Institutes of Health scientists asserts that engaging men in HIV prevention and care is essential to the goal of ending the HIV pandemic. The article by Adeola Adeyeye, M.D., M.P.A., and David Burns, M.D., M.P.H., of the National Institute of Allergy and Infectious Diseases (NIAID) and Michael Stirratt, Ph.D., of the National Institute of Mental Health (NIMH) also discusses potential solutions.

Scientific research has proven that people with HIV who take antiretroviral therapy (ART) as prescribed and achieve and maintain an undetectable level of virus in the blood have effectively no risk of transmitting the virus to their HIV-uninfected sexual partners. Other research has shown that when HIV-uninfected people consistently take a single daily oral tablet of the antiretroviral drugs emtricitabine/tenofovir disoproxil fumarate, their risk of acquiring HIV infection is reduced by as much as 95 percent. The challenge is implementing these approaches, known as treatment as prevention and pre-exposure prophylaxis (PrEP), and other forms of HIV prevention in a timely manner among everyone who needs them.

The authors point out that in sub-Saharan Africa, men are less likely than women to know their HIV status, engage in HIV care in a timely manner, stay in care and maintain an undetectable level of virus in the blood. The authors also note that in the United States, disparities by age, race and ethnicity persist in the use of ART among men who have sex with men.

New strategies to engage men in HIV prevention and treatment must address three critical issues, the authors write. These are the lack of "touch points" where men naturally interact with the health care system; gender norms and prevailing constructs of masculinity, which typically subordinate health care to other concerns; and HIV stigma and discrimination. The authors describe innovative approaches being explored to overcome these challenges, including establishing HIV testing and care in workplaces and sports programs, ART home delivery, HIV self-testing, and the MenStar Coalition created by the President's Emergency Plan for AIDS Relief (PEPFAR), Unitaid, the Elton John AIDS Foundation and others to expand HIV diagnosis and treatment for men.

In addition, NIAID and NIMH are co-sponsoring two research Funding Opportunity Announcements designed to support development and testing of strategies to increase the engagement of men in HIV prevention and care domestically and globally. More information about these grant opportunities is available at PA-19-042 and PA-19-050.

Credit: 
NIH/National Institute of Allergy and Infectious Diseases

Triple combination cancer immunotherapy improves outcomes in preclinical melanoma model

image: Dr. Mehrotra is the co-scientific director of the oncology and immunotherapy programs in the Department of Surgery at the Medical University of South Carolina and a member of the Hollings Cancer Center.

Image: 
Sarah Pack. Medical University of South Carolina

Adoptive cell transfer (ACT) is a promising cancer immunotherapy that involves isolating T cells from cancer patients that are capable of targeting their tumor, selecting the more active T cells and expanding those in the lab, and then transfusing them back into patients. ACT is already available in the clinic for some diseases -- CAR T therapy, a form of ACT, was approved by the FDA in 2017 for children with acute lymphoblastic leukemia and adults with advanced lymphomas -- and many clinical trials of another form of ACT are under way in melanoma.

Although ACT has produced dramatic results in some of these patients, not all respond, and the therapy has thus far proven less effective against solid tumors. Optimizing ACT could enable more patients with more types of cancer to benefit from the promising therapy.

Combining ACT with a pan-PIM kinase inhibitor and a PD1 inhibitor improves outcomes in a preclinical model, report researchers at the Medical University of South Carolina (MUSC) in an article published online in October by Clinical Cancer Research. They showed that this triple combination treatment (PPiT) doubled the migration of anti-tumor T cells to the tumor site and quadrupled survival in mice compared to ACT alone.

"With this triple combination therapy, many more T cells persisted. That's important for ACT, because the longer the transfused T cells stay inside the host to fight tumor cells, the better," says Shikhar Mehrotra, Ph.D., senior author of the article, who is co-scientific director of the oncology and immunotherapy programs in the Department of Surgery at the Medical University of South Carolina and a member of the Hollings Cancer Center.

Of the two agents administered along with ACT as part of this triple combination therapy, PD1 inhibitors are far better known. Clinical successes with checkpoint inhibitors, including PD1 and PDL1 inhibitors, ushered in immunotherapy as the fifth pillar of cancer therapy, where it joined the ranks of chemotherapy, surgery, radiotherapy and targeted therapy. PD1 and PDL1 inhibitors take the brakes off the immune system, enabling its T cells "to see" tumors that had been hiding in plain sight.

In contrast, PIM kinase inhibitors are relative newcomers. PIM kinases are proteins that control many cellular processes, including energy use. A clinical roadblock for ACT has been the lack of energy shown by readministered T cells. Mehrotra and his team set out to determine whether targeting PIM kinase with an inhibitor could help these readministered cells maintain their energy longer.

"A T cell that starts proliferating is like any person who starts out fresh in the morning with a lot of energy," explains Mehrotra. "Just as the person may have less energy as the day goes on, the T cell can become 'tired' and less effective. We wondered whether the PIM kinase inhibitors could help prevent that from happening."

Mehrotra and his team targeted PIM kinases in T cells to make them act like a specific subtype of T cell, called a central memory T cell. Most ACT trials use rapidly expanding effector T cells (T cells that are ready to attack the tumor), but these T cells often become exhausted when put back in patients. Central memory T cells produce more lasting responses against tumor cells. When Mehrotra and his team blocked PIM kinases in T cells, the cells started acting like memory T cells, as demonstrated by an increase in cell populations that express central memory T cell markers.

"All cells require energy," says Mehrotra. "If you can control the way that T cells use their energy, you could potentially block them from becoming exhausted. In this case, we targeted PIM kinases and show that, in combination with checkpoint therapy and ACT, we get an improvement in T cell response and tumor control."

Indeed, in a mouse model, the triple combination therapy, or PPiT, better controlled the growth of established melanoma than ACT, checkpoint therapy, or PIM kinase inhibitors alone or dual combinations of ACT and a PIM kinase inhibitor or ACT and checkpoint therapy. In addition, more T cells infiltrated the tumor and had decreased expression of PD1, making it harder for tumors to turn them off.

"We ultimately want to be able to implement this therapeutic approach in the clinic," says Mehrotra. "However, we must first explore any potential side effects of the pan-PIM kinase inhibitors and determine whether a more selective inhibitor targeting just one type of PIM kinase might be as effective while posing fewer potential side effects."

Credit: 
Medical University of South Carolina

Researchers produce six antibodies to combat Zika virus

MAYWOOD, IL - Researchers have generated six Zika virus antibodies that could be used to test for and possibly treat a mosquito-borne disease that has infected more than 1.5 million people worldwide.

The antibodies "may have the dual utility as diagnostics capable of recognizing Zika virus subtypes and may be further developed to treat Zika virus infection," corresponding author Ravi Durvasula, MD, and colleagues report in a study published in the journal PLOS ONE.

Dr. Durvasula is professor and chair of the department of medicine of Loyola Medicine and Loyola University Chicago Stritch School of Medicine. First author is Adinarayana Kunamneni, PhD, a research assistant professor in Loyola's department of medicine.

Zika is spread mainly by mosquitos. Most infected people experience no symptoms or mild symptoms such as a rash, mild fever and red eyes. But infection during pregnancy can cause miscarriages, stillbirths and severe birth defects such as microcephaly.

Zika virus is a textbook example of an emerging disease that appears quickly, often in remote areas with little or no public health infrastructure. There is no effective vaccine or drug to treat the disease.

"The recent Zika virus outbreak is a health crisis with global repercussions," Drs. Durvasula, Kunamneni and colleagues write in the PLOS ONE study. "Rapid spread of the disease within the epidemic regions, coupled with migration of infected persons, has underscored the need for rapid, robust and inexpensive diagnostic tools and therapeutics."

Antibodies could be key to diagnosing and treating Zika virus. An antibody is a Y-shaped protein made by the immune system. When a virus, bacterium or other pathogen invades the body, antibodies bind to antigens associated with the bug, marking it for the immune system to destroy.

Using a technology called ribosome display, researchers generated six synthetic antibodies that bind to the Zika virus. The antibodies, which are inexpensive to produce, could be used in a simple filter paper test to detect the Zika virus in the field. (If the filter paper turns color, the Zika virus is present.)

Because the Zika virus is evolving, it's useful to have six different antibodies. In the event the virus mutates, it's likely at least one of the antibodies still would match the virus and thus could still be used in diagnosis and treatment.

An antibody-based test for the Zika virus likely would be cheap and fast, and thus could easily be used to monitor mosquito populations for Zika. If the virus is present in an area, officials could respond by stepping up mosquito-abatement efforts. They also could educate the public - especially women who are pregnant or could become pregnant - on how to avoid mosquito bites by applying mosquito repellent, wearing long pants and long-sleeve shirts, eliminating standing water, etc.

The antibodies are "neutralizing," meaning that when they bind to the Zika virus, they prevent the virus from infecting cells. This effectively renders the virus harmless. The neutralizing property potentially could lead to the development of a drug that an at-risk woman could take to prevent the virus from infecting her fetus.

It will take further research to validate the antibodies' potential for diagnosing and treating Zika virus, researchers said.

Credit: 
Loyola Medicine

Altering cancer metabolism helps treatments attack tumors

Restricting the ability of cancer cells to metabolise sugar could make oncolytic viruses* more effective at attacking them, suggests a study published today in the journal Cancer Research.

Viruses that are trained to attack cancer cells - known as oncolytic viruses - can kill tumours without affecting healthy cells nearby. They normally work by invading the cells, multiplying and destroying the tumour from inside. They are currently being tested in clinical trials**.

In this new study, a team of scientists exposed lung, ovarian and colon cancer cells, and mouse models, to conditions similar to those in the human body, and investigated how manipulating cell metabolism can make cancer more vulnerable to oncolytic viruses.

In the lab, scientists usually keep cells at the perfect temperature and provide them with lots of glucose, as it's easier to grow and store them this way. In this study, the researchers changed the lab conditions to make them reflect what actually happens in the human body, where sugar levels are much lower.

They found that oncolytic viruses worked better when less glucose was available. To investigate whether they could make the virus work even harder, the researchers then used a drug*** to restrict the cancer cells' ability to metabolise sugar - its energy source - to see if this optimised the virus's cancer killing capability. They found that reducing sugar levels allowed the virus to multiply much faster, making treatment more effective and destroying cancer quicker.

Arthur Dyer, lead author and Cancer Research UK-funded PhD student from the University of Oxford, said: "Our research in the lab showed that restricting the amount of sugar available to cancer cells makes these cancer-attacking oncolytic viruses work even better. We already know that this virus is effective against cancer - and this sugar-starving technique is a way to make it even better."

This approach may also improve how potential cancer drugs are investigated in the lab.

Arthur Dyer added: "When studying any kind of drug in the lab, we keep the cells in very high sugar conditions - it's a bit like soaking them in Lucozade. But this doesn't reflect the conditions that these cells would be exposed to in the body, which are normally much poorer - in cancer they're even worse because tumours typically have poor circulation. Our approach is more realistic in mimicking the conditions in the human body, which ultimately may help us to better predict how patients will respond to drugs well before any trials are planned."

However, the researchers caution that their early findings should not be misinterpreted by patients who are looking to optimise treatments.

Professor Len Seymour, Cancer Research UK-funded study author from the University of Oxford, explains: "It's important to remember that changing your diet is not enough to starve cancer cells of sugar. A lot of people think that carbohydrates are bad, but that's not the case - we need them, and cutting out sugar won't cure cancer. Because cancer gobbles up glucose so quickly, the cells are very vulnerable to attack from a drug that targets the sugar pathway. The same effect cannot be achieved by eliminating sugar from your diet."****

Dr David Scott, Cancer Research UK's director of discovery research, said: "By making treatments work more effectively, we hope that patients will be able to see positive results faster than before. The next step is to test whether this approach works in clinical trials, and to find out which cancers respond best."

The team are aiming to test their glucose-limiting approach to improving oncolytic virus treatment in clinical trials to assess whether it could be successfully implemented in cancer patients.

Credit: 
Cancer Research UK

Beyond bone mineral density: Additional bone traits predict risk for fracture

BOSTON - Every year more than 2 million older Americans experience a fragility fracture of the hip, spine or wrist. Usually the result of a fall from standing height or less, fragility fractures stem from underlying bone deterioration, not from high-impact forces. Loss of bone mineral density (BMD) - the condition known as osteoporosis - is one way bones can become fragile, and screening patients for osteoporosis is the current standard for determining fracture risk in older adults. However, low bone mineral density is not the only cause of bone fragility, and the majority of older adults who sustain a fragility fracture do not meet the diagnostic criteria for osteoporosis. Physicians currently lack validated means of assessing fracture risk in these patients.

In the largest prospective study of its kind, researchers from Beth Israel Deaconess Medical Center (BIDMC) and the Institute for Aging Research at Hebrew SeniorLife used high-resolution tomography imaging to assess whether other bone characteristics besides bone mineral density can be used to determine fracture risk. The team found that assessing the microstructure of the two different types of bone tissues - compact bone and spongy bone - may be useful to predict the incidence of fragility fractures in those who would not otherwise be identified as at risk. The study is published today in The Lancet Diabetes and Endocrinology.

"Older women are at particularly high risk of fracture. In fact, the number of women who will experience a fragility fracture in any given year exceeds the number who will experience a first-time stroke, breast cancer or myocardial infarction combined," said co-lead author Mary L. Bouxsein, PhD, Director of the Center for Advanced Orthopedic Studies at BIDMC. "Improved methods to identify those in whom fractures are common but whom standard clinical testing does not identify as high risk would allow us to target treatment to this important group and ultimately reduce fracture burden."

The multi-national study included more than 7,000 older women and men from five North American and European countries. Participants - drawn from eight research cohorts at the Mayo Clinic, the Framingham Study in Massachusetts and sites in France, Canada, Switzerland and Sweden that together comprise the Bone Microarchitecture International Consortium - underwent high-resolution scans of the bones of the arms and legs.

While just eight percent of participants met the diagnostic criteria for osteoporosis, 11 percent of participants experienced a fracture. Those who experienced fracture were more likely to be older, female, have a lower BMI, use osteoporosis medications and have a previous fracture. Participants who sustained fractures had worse bone measurements for nearly all parameters compared to those who did not fracture.

The scientists' analysis demonstrated that several measures of bone density and structure at different sites on the bone - including the density of the compact bone tissue and the thickness of the spongy bone tissue at the wrist - were predictive of fracture. Failure load, the load under which bone begins to break, was the bone characteristic most strongly associated with risk of fracture.

"Results from this large international cohort of women and men suggest deficits in density and structure throughout the bone contribute to fracture risk independently of bone mineral density and current risk assessment tools", said Lisa Samelson, PhD, who is an epidemiologist at the Institute for Aging Research at Hebrew SeniorLife and Harvard Medical School, and lead author of the study. "Further, assessment of these bone characteristics may be useful in those who would not otherwise be identified as being at high risk for fracture."

Credit: 
Beth Israel Deaconess Medical Center