Earth

Racial, ethnic minorities face greater vulnerability to wildfires

image: Battling fire during the Taylor Creek and Klondike Fires in the Rogue River-Siskiyou National Forest, Oregon, 2018.

Image: 
Kari Greer

Environmental disasters in the U.S. often hit minority groups the hardest.

When Hurricane Katrina slammed New Orleans in 2005, the city's black residents were disproportionately affected. Their neighborhoods were located in the low-lying, less-protected areas of the city, and many people lacked the resources to evacuate safely. Similar patterns have played out during hurricanes and tropical storms ever since.

Massive wildfires, which may be getting more intense due to climate change and a long history of fire-suppression policies, also have strikingly unequal effects on minority communities, a new study shows.

Researchers at the University of Washington and The Nature Conservancy used census data to develop a "vulnerability index" to assess wildfire risk in communities across the U.S. Their results, appearing Nov. 2 in the journal PLOS ONE, show that racial and ethnic minorities face greater vulnerability to wildfires compared with primarily white communities. In particular, Native Americans are six times more likely than other groups to live in areas most prone to wildfires.

"A general perception is that communities most affected by wildfires are affluent people living in rural and suburban communities near forested areas," said lead author Ian Davies, a graduate student in the UW School of Environmental and Forest Sciences. "But there are actually millions of people who live in areas that have a high wildfire potential and are very poor, or don't have access to vehicles or other resources, which makes it difficult to adapt or recover from a wildfire disaster."

This study is one of the first to integrate the physical risk of wildfire with the social and economic resilience of communities to see which areas across the country are most vulnerable to large wildfires. The approach takes 13 socioeconomic measures from the U.S. census -- including income, housing type, English fluency and health -- for more than 71,000 census tracts across the country and overlays them with wildfire potential based on weather, historical fire activity and burnable fuels on the landscape.

There aren't many studies looking at the societal impacts of massive wildfires, so the researchers relied on existing literature that examined other environmental disasters, mainly hurricanes, to identify socioeconomic factors that contributed to whether a person recovered from a disaster. Some of these factors include whether a person is above or below the poverty line, has a disability, is elderly, has a vehicle, and owns or rents their home.

All of these factors and additional data went into creating a vulnerability index that the research team used in combination with U.S. Forest Service assessments of wildfire potential to determine the vulnerability of 71,901 census tracts across the country.
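To make the general approach concrete, here is a minimal sketch of how census indicators might be combined with wildfire potential. The indicator values, percentile ranking and equal weighting are illustrative assumptions, not the study's published method.

```python
import numpy as np

def social_vulnerability(indicators):
    """Combine socioeconomic measures into one 0-1 index.

    `indicators` is an (n_tracts, n_measures) array in which higher
    values mean greater disadvantage (e.g., poverty rate, share of
    households without a vehicle). Equal weighting is an assumption.
    """
    # Percentile-rank each measure across tracts so units are comparable.
    ranks = indicators.argsort(axis=0).argsort(axis=0) / (len(indicators) - 1)
    return ranks.mean(axis=1)

# Toy data: 4 census tracts, 3 of the 13 census measures.
tracts = np.array([
    [0.10, 0.05, 0.02],
    [0.50, 0.40, 0.30],
    [0.20, 0.10, 0.05],
    [0.45, 0.30, 0.25],
])
fire_potential = np.array([0.9, 0.8, 0.1, 0.7])  # e.g., wildfire potential, 0-1

# Overlay: tracts scoring high on both axes are the most vulnerable.
combined = social_vulnerability(tracts) * fire_potential
print(np.argsort(combined)[::-1])  # tracts ranked most to least vulnerable
```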

"The argument that we and other scientists have made is natural disasters aren't completely natural -- they are products of both an environmental impact and the social, political and economic context in which the impact occurs," Davies said.

Overall, more than 29 million Americans -- many of whom are white and economically secure -- live with significant potential for extreme wildfires. However, within that segment, about 12 million people are considered "socially vulnerable" to wildfires based on the socioeconomic factors assessed in this study -- and for whom a wildfire could be devastating.

Additionally, they found that wildfire vulnerability is spread unequally across race and ethnicity. Communities that are mostly black, Hispanic or Native American experience 50 percent greater vulnerability to wildfires compared with other communities.

In the case of Native Americans, historical forced relocation onto reservations -- mostly rural, remote areas that are more prone to wildfires -- combined with greater levels of vulnerability due to socioeconomic barriers makes it especially hard for these communities to recover after a large wildfire.

"Our findings help dispel some myths surrounding wildfires -- in particular, that avoiding disaster is simply a matter of eliminating fuels and reducing fire hazards, or that wildfire risk is constrained to rural, white communities," said senior author Phil Levin, a UW professor in environmental and forest sciences and lead scientist at The Nature Conservancy in Washington. "We can see that the impacts of recent fires were exacerbated for low-income residents facing a shortage of affordable housing, for example, and for Hispanic residents for whom English is not their first language."

As the researchers dug into their results, they corroborated their findings with news reports from specific wildfire events. For example, they found that in the 2017 wildfire season, emergency agencies in cities throughout California struggled to release timely and correct bilingual information. During the 2014 wildfires in eastern and central Washington, language barriers also prevented Hispanic farm workers from receiving evacuation alerts from authorities, and the only Spanish-language radio station in the area reportedly never received the emergency notification.

The researchers hope these broad, nationwide results will spawn more detailed studies focused on individual communities and their wildfire risk. But equally important, they say, is for organizations and municipalities to take these socioeconomic factors into account when helping their communities prepare for wildfires. Offering cost-share programs for residents to prepare their homes for wildfires, distributing evacuation notices in multiple languages and creating jobs focused on thinning local forests or clearing out flammable brush are all ways in which communities can reduce their vulnerability to wildfires, they said.

"I think the question is, how do we take these sorts of activities that are ultimately about building community and make it so they are attractive and useful for people who are busy and would much rather use what spare time they have to spend with their families?" Levin said. "I think ultimately it's about connections, building relationships and breaking down cultural barriers that will bring us to a better outcome."

Credit: 
University of Washington

Ozone hole modest despite optimum conditions for ozone depletion

image: This time-lapse photo from Sept. 10, 2018, shows the flight path of an ozonesonde as it rises into the atmosphere over the South Pole from the Amundsen-Scott South Pole Station. Scientists release these balloon-borne sensors to measure the thickness of the protective ozone layer high up in the atmosphere.

Image: 
Robert Schwarz/University of Minnesota

The ozone hole that forms in the upper atmosphere over Antarctica each September was slightly above average size in 2018, NOAA and NASA scientists reported today.

Colder-than-average temperatures in the Antarctic stratosphere created ideal conditions for destroying ozone this year, but declining levels of ozone-depleting chemicals prevented the hole from being as large as it would have been 20 years ago.

"Chlorine levels in the Antarctic stratosphere have fallen about 11 percent from the peak year in 2000," said Paul A. Newman, chief scientist for Earth Sciences at NASA's Goddard Space Flight Center in Greenbelt, Maryland. "This year's colder temperatures would have given us a much larger ozone hole if chlorine was still at levels we saw back in the year 2000."

According to NASA, the annual ozone hole reached an average area coverage of 8.83 million square miles (22.9 million square kilometers) in 2018, almost three times the size of the contiguous United States. It ranks 13th largest out of 40 years of NASA satellite observations. Nations of the world began phasing out the use of ozone-depleting substances in 1987 under an international treaty known as the Montreal Protocol.

The 2018 ozone hole was strongly influenced by a stable and cold Antarctic vortex -- the stratospheric low pressure system that flows clockwise in the atmosphere above Antarctica. These colder conditions -- among the coldest since 1979 -- helped support formation of more polar stratospheric clouds, whose cloud particles activate ozone-destroying forms of chlorine and bromine compounds.

In 2016 and 2017, warmer temperatures in September limited the formation of polar stratospheric clouds and slowed the ozone hole's growth. In 2017, the ozone hole reached a size of 7.6 million square miles (19.7 million square kilometers) before starting to recover. In 2016, the hole grew to 8 million square miles (20.7 million square kilometers).

However, the current ozone hole area is still large compared to the 1980s, when the depletion of the ozone layer above Antarctica was first detected. Atmospheric levels of man-made ozone-depleting substances increased up to the year 2000. Since then, they have slowly declined but remain high enough to produce significant ozone loss.

NOAA scientists said colder temperatures in 2018 allowed for near-complete elimination of ozone in a deep, 3.1-mile (5-kilometer) layer over the South Pole. This layer is where the active chemical depletion of ozone occurs on polar stratospheric clouds. The amount of ozone over the South Pole reached a minimum of 104 Dobson units on Oct. 12 -- making it the 12th lowest year out of 33 years of NOAA ozonesonde measurements at the South Pole, according to NOAA scientist Bryan Johnson.

"Even with this year's optimum conditions, ozone loss was less severe in the upper altitude layers, which is what we would expect given the declining chlorine concentrations we're seeing in the stratosphere," Johnson said.

A Dobson unit is the standard measurement for the total amount of ozone in the atmosphere above a point on Earth's surface, and it represents the number of ozone molecules required to create a layer of pure ozone 0.01 millimeters thick at a temperature of 32 degrees Fahrenheit (0 degrees Celsius) at an atmospheric pressure equivalent to Earth's surface. A value of 104 Dobson units would be a layer that is 1.04 millimeters thick at the surface, less than the thickness of a dime.
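As a quick illustration of that definition, converting a column measurement in Dobson units to an equivalent layer thickness is a single multiplication; the following short sketch uses only the constants given above.

```python
# 1 Dobson unit = a 0.01 mm layer of pure ozone at 0 degrees Celsius and
# surface pressure, so column amounts convert directly to thickness.
MM_PER_DU = 0.01

for du in (104, 300):
    print(f"{du} DU -> {du * MM_PER_DU:.2f} mm of pure ozone")
# 104 DU -> 1.04 mm (the 2018 South Pole minimum)
# 300 DU -> 3.00 mm (a typical full atmospheric column)
```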

Prior to the emergence of the Antarctic ozone hole in the 1970s, the average amount of ozone above the South Pole in September and October ranged from 250 to 350 Dobson units.

What is ozone and why does it matter?

Ozone comprises three oxygen atoms and is highly reactive with other chemicals. In the stratosphere, roughly 7 to 25 miles (about 11 to 40 kilometers) above Earth's surface, a layer of ozone acts like sunscreen, shielding the planet from ultraviolet radiation that can cause skin cancer and cataracts, suppress immune systems and damage plants. Ozone can also be created in the lower atmosphere by photochemical reactions between sunlight and pollution from vehicle emissions and other sources, forming harmful smog.

NASA and NOAA use three complementary instrumental methods to monitor the growth and breakup of the ozone hole each year. Satellite instruments like the Ozone Monitoring Instrument on NASA's Aura satellite and the Ozone Mapping Profiler Suite on the NASA-NOAA Suomi National Polar-orbiting Partnership satellite measure ozone across large areas from space. The Aura satellite's Microwave Limb Sounder also measures certain chlorine-containing gases, providing estimates of total chlorine levels.

The total amount of ozone in the atmosphere is exceedingly small. All of the ozone in a column of the atmosphere extending from the ground to space would be 300 Dobson units, approximately the thickness of two pennies stacked one on top of the other.

NOAA scientists monitor the thickness of the ozone layer and its vertical distribution above the South Pole by regularly releasing weather balloons carrying ozone-measuring "sondes" up to 21 miles (~34 kilometers) in altitude, and with a ground-based instrument called a Dobson spectrophotometer.

Credit: 
NASA/Goddard Space Flight Center

University of Miami Miller School of Medicine experts offer more clarity on managing common ankle fractures

Although isolated lateral malleolus ankle fractures are fairly common, their management remains challenging for orthopaedic surgeons. A central question is whether surgery or a non-surgical approach is indicated for a particular patient.

A lot rides on whether the ankle is stable or not. However, no clinical examination or imaging strategy stands out as clearly superior for determining ankle stability after such an injury, said Amiethab A. Aiyer, M.D., an orthopaedic surgeon specializing in foot and ankle care at the University of Miami Miller School of Medicine.

"We want to be able to provide clinicians and residents ... with a much more up-to-date overview on what management options are available," Aiyer said.

So he and colleagues published a review article, "Management of Isolated Lateral Malleolus Fractures," in The Journal of the American Academy of Orthopaedic Surgeons.

In general, non-surgical treatment is indicated when an ankle is stable after injury. These are often called "Weber A" fractures, or the most stable type of ankle fracture in the Danis-Weber classification system. Weber A fractures occur below the level of the joint connecting the tibia and fibula bones, also known as the level of the syndesmosis.

In contrast, when injury occurs above the syndesmosis, these "Weber C" fractures tend to be unstable and require surgery.

In between is a clinical "gray area": Weber B fractures at the level of the syndesmosis. Management of these injuries remains a lot less clear-cut, which is why Aiyer and colleagues decided to address these fractures specifically.

They recommend assessing the medial clear space (MCS) on radiographs to gauge the presence of medial-sided injury. In general, a wider MCS signals less ankle stability after a fracture. In addition, the severity of patient pain ratings can help guide management.

Again, stability is central to assessment. "Unfortunately, determining ankle stability after an isolated lateral malleolus fracture remains a diagnostic dilemma," the researchers wrote.

"You have to tease out how to gauge stability," Aiyer said. Determining if the deltoid ligament, the primary stabilizer of the ankle, is another key.

Imaging to the Rescue?

Dynamic imaging with stress radiographs remains the standard practice to detect tibiotalar instability. The three main options - manual external rotation stress radiographs, gravity stress radiographs and weight bearing radiographs - each carry their own considerations.

For example, because individual clinicians apply pressure with manual stress exams, variability of pressure and reproducibility of findings remain challenges. Gravity stress radiographs, in contrast, rely on the constant force of gravity and cause less pain for patients, the researchers noted.

Weight-bearing radiographs can vary by the amount of weight placed by the clinician, but they incorporate the inherent stability of the ankle in a neutral position. "If the patient is able to place weight on their ankle and there is no radiographic evidence of deltoid compromise, generally the ankle injury is stable and does not require surgery."

"If they are not able to get a weight-bearing radiograph, in my mind, a gravity stress view may be more important," Aiyer said. He added a caveat: if the patient holds their foot up against gravity because they are guarding against pain, it can cause a false negative result.

Choosing the imaging approach best suited for each individual is recommended. "It's particularly important to consider a patient's medical background, the baseline quality of their bone, and the habitus of the patient," Aiyer said.

Refer If in Any Doubt

Dr. Aiyer and colleagues recommend that patients with a suspected isolated lateral malleolus fracture first undergo a history and physical exam.

In many instances, particularly in a community setting, orthopaedic surgeons do not have access to weight-bearing X-rays. "More often than not, the ability for people to tease this out is pretty good, but if you are not applying the right amount of force or the patient is in a lot of pain, you may miss an injury."

"Physicians can refer their patients to our academic medical center, for additional workup or to give patients a second opinion on the best course of action."

Future Research

Going forward, Aiyer would like to compare outcomes among patients managed with new technologies, including different plating options. Recently, manufacturers have released thinner plates, plates made of new materials and nailing devices. He added that it would be important to include a cost-benefit analysis when studying these new advances.

Credit: 
University of Miami Miller School of Medicine

Pseudarthrosis following single-level ACDF is five times more likely when a PEEK interbody device is used

CHARLOTTESVILLE, VA (OCTOBER 30, 2018). In spine surgery, "arthrodesis" is the term used to describe fusion of adjacent vertebrae following removal of an intervertebral disc. Arthrodesis is achieved by placing a bone graft or bone graft substitute between the vertebrae to bridge the empty space so that new bone can grow between them. "Pseudarthrosis" is the term used to describe failure of this expected new bone growth.

Researchers at the Department of Neurological Surgery, Oregon Health & Science University, found pseudarthrosis to be five times more likely after a polyetheretherketone (PEEK) interbody spacer device had been used to bridge the gap between vertebrae during cervical spine surgery than after a structural (bone) allograft had been used. The researchers' findings appear in a new article published today in the Journal of Neurosurgery: Spine: "Fivefold higher rate of pseudarthrosis with polyetheretherketone interbody device than with structural allograft used for 1-level anterior cervical discectomy and fusion," written by Katie L. Krause, MD, PhD, and colleagues.

Background

Anterior cervical discectomy and fusion (ACDF) is a common surgical procedure performed in patients experiencing pain or weakness due to a herniated or deteriorated intervertebral disc in the neck. During the operation, the surgeon approaches the cervical spine from the front of the body, thus avoiding the spinal cord, spinal nerves, and thick neck muscles. After reaching the spine, the surgeon removes the damaged disc, which lies between two adjacent vertebrae, and replaces it with a bone graft or an artificial graft packed with bone fragments or bone-inducing proteins that serves to "fuse" the two vertebrae together and stabilize the spine. ACDF can be performed at a single level (replacing one disc between two adjacent vertebrae with a graft) or at multiple levels (replacing multiple discs with grafts).

In this paper, the authors focus on two graft materials commonly used to bridge the gap between two vertebrae after disc removal: structural allograft and polyetheretherketone (PEEK). Structural allograft is a sterilized piece of bone obtained from a cadaver. It has no active bone cells or bone-inducing proteins itself, but acts as a natural scaffold over which bone can regrow. PEEK is a strong, biocompatible plastic. Although itself bioinert, this plastic scaffold is packed with bone shavings or proteins at the time of surgery to induce bone growth.

Present Study

In this retrospective study, the authors reviewed the cases of 127 patients who had undergone single-level ACDF and participated in at least 1 year of follow-up. Fifty-six (56) patients had received PEEK implants and 71 had received structural allografts. The goal of the study was to see which graft material was more effective at producing greater bone fusion following ACDF. This was done by reviewing follow-up radiographic images to identify which patients had pseudarthrosis, defined as a "lack of solid bone growth across the disc space at 1 or more years of radiographic follow-up," and by checking patient charts to see if repeated surgical intervention was required.

The authors found no significant differences in patient age, sex, or body mass index between the two patient groups (PEEK implant group and structural allograft group). There was also no significant difference in tobacco use between the two groups. Tobacco use was examined because there is substantial medical evidence that smoking has a negative effect on bone healing.

The authors found radiographic evidence of pseudarthrosis in 29 (52%) of the 56 patients with PEEK implants and in 7 (10%) of the 71 patients with structural allografts. They also found repeated surgery was warranted in 7 (24%) of the 29 PEEK-implant patients with pseudarthrosis, but in only 1 (14%) of the 7 allograft patients with pseudarthrosis. These findings show that when used in a single-level ACDF, PEEK implants were associated with a significantly higher rate of bone nonunion (lack of new bone growth) and a significantly higher rate of the need for additional surgery than structural allografts.
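The headline "fivefold" figure follows directly from those counts; a quick illustrative calculation (not from the paper itself):

```python
# Pseudarthrosis rates reported in the study.
peek_rate = 29 / 56        # PEEK implant group, ~52%
allograft_rate = 7 / 71    # structural allograft group, ~10%

risk_ratio = peek_rate / allograft_rate
print(f"PEEK {peek_rate:.0%} vs allograft {allograft_rate:.0%}: "
      f"{risk_ratio:.1f}x higher rate")   # -> roughly 5.3x
```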

Interestingly, the researchers cite literature on insurance reimbursement policies for ACDFs. The cost of an ACDF performed using a PEEK device can be reimbursed at a far greater amount than the cost of performing the same surgery using a structural allograft.

In light of the results of this study, the authors suggest that surgeons consider the risks of bone nonunion and the need for repeated surgery when choosing which graft to use in an ACDF. They also advocate addressing the discrepancy in reimbursement between surgeries involving PEEK implants and those involving structural allografts.

When asked about the paper, the study's senior author, Dr. Khoi Than, said, "Hundreds of thousands of PEEK implants are placed in patients' necks every year, and our work verifies my suspicion that many of them are not healing. I would encourage my fellow surgeons to consider using structural allograft instead of PEEK, despite the lower reimbursement, as the former is clearly the better option for our patients."

Credit: 
Journal of Neurosurgery Publishing Group

Study: Increasing frequency of ocean storms could alter kelp forest ecosystems

image: Kelp forests provide habitat for a diverse array of sea life, from finfish and shellfish to corals and sponges. Such biodiversity could change if ocean storms become more frequent.

Image: 
University of Virginia/University of California, Santa Barbara

A large-scale, long-term experiment on kelp forests off Southern California brings new insight to how the biodiversity of coastal ecosystems could be impacted over time as a changing climate potentially increases the frequency of ocean storms.

Researchers at the University of Virginia and the University of California, Santa Barbara experimentally mimicked the loss of undersea giant kelp forests at four locations off the coast of Santa Barbara and found that increasing storm frequency - as predicted by some climate change models - could dramatically alter the ratios and types of sea life that live along the California coastline.

"We found that the frequency of disturbance was the most important factor influencing kelp forest biodiversity, whereas the severity of disturbance in a given year played a minor role," said lead researcher Max Castorani, a professor of environmental sciences at UVA.

The study appears online Oct. 30 in the journal Ecology, and comes as scientists are trying to anticipate the ecological consequences of a changing climate. It is among the few long-term experiments to look into how kelp forests, which are major coastal marine habitats throughout the world, could change over time if climate model predictions play out in nature as many scientists expect. Some climate forecasts indicate that storm frequency and severity will increase, as already is happening in some regions.

To arrive at their findings, the researchers counted and measured more than 200 species of plants, invertebrates and fishes in large experimental and control kelp forests off Santa Barbara every three months over a nine-year period. They found that annual disturbances, in which kelp forests were experimentally cut back year after year as happens during severe winter storms with large waves, resulted in a doubling of smaller plants and invertebrates attached to the seafloor (algae, corals, anemones, sponges), but also resulted in 30 to 61 percent fewer fish and shellfish, such as clams, sea urchins, starfish, lobsters and crabs.

"Our findings surprised us because we expected that a single severe winter storm would result in big changes to kelp forest biodiversity," Castorani said. "Instead, the number of disturbances over time had the greatest impact because frequent disturbances suppress the recovery of giant kelp, with large consequences for the surrounding sea life."

As the largest of all algae, giant kelp grows up to 100 feet from the sea floor to the water's surface, creating a dense canopy, much like a terrestrial forest, that provides shading and shelter to organisms farther down in the water column and on the sea bottom. When the forest is destroyed by a large storm, the "understory" becomes brighter with sunlight, but is less physically complex and productive overall, affecting the balance of species diversity. While it is normal for this to periodically happen when large offshore storms drive destructive waves to the coastline, the forests typically recover rapidly. But greater frequency of storms would repeatedly hamper recovery, eventually resulting in vastly altered marine life.

Climate change forecasts predict increases in the frequency and severity of storms over the coming decades, potentially resulting in profound changes to kelp forest biodiversity, as the new study suggests. The repeated loss of giant kelp creates ecological "winners and losers," Castorani said. "Understory" creatures - the seaweeds, sponges, anemones and sea fans - are more likely to thrive, while several commercially and recreationally desirable fishes, crabs, lobsters, whelks and clams could decline.

The experiment was conducted at the National Science Foundation's Santa Barbara Coastal Long-Term Ecological Research site. The NSF funds numerous long-term research projects around the world designed to gain a big-picture view of changes to ecosystems over decades and beyond.

"It's a significant finding that the severity and frequency of disturbances influence kelp bed communities in different ways," said David Garrison, a director of the NSF's Long-Term Ecological Research program, which funded the study. "We need this kind of research to predict what future kelp bed communities will look like, and what ecosystem services they will provide."

Castorani said the nine-year study demonstrates the value of long-term ecological research for understanding environmental change as it occurs, like viewing a film of the environment rather than taking a snapshot that captures only a moment in time.

"Much of the focus of prior research has been on the response to a single event, but our new experiment shows the importance of studying repeated disturbances over many years," he said. "We would not have been able to understand the ecology of this system without the long-term support of NSF's LTER program."

Credit: 
University of Virginia

Bigger = better: Big bees fly better in hotter temps than smaller ones do

New Orleans (October 28, 2018)--Arizona State University researchers have found that larger tropical stingless bee species fly better in hot conditions than smaller bees do. Larger size may help certain bee species better tolerate high body temperatures. The findings run contrary to the well-established temperature-size "rule," which suggests that ectotherms--animals that rely on the external environment to control their body temperature--are larger in cold climates and smaller in hot ones. The research will be presented today at the American Physiological Society's (APS) Comparative Physiology: Complexity and Integration conference in New Orleans.

Insects fall into three categories:

ectotherms (reliant on environmental temperatures for their own body temperature),

poikilotherms (reliant on environmental temperatures but can control their own temperature--or thermoregulate--by sun- and shade-seeking or other behaviors), or

endotherms (which can physiologically warm themselves).

"Bees fall along this entire range," explained lead author Meghan Duell, a graduate student at Arizona State University. "Most [insects] employ some means of behavioral thermoregulation. As body size increases, it's more likely that insects will be able to behaviorally and physiologically thermoregulate, especially in flying insects. Bigger bees, like bumblebees or the larger species in the work I'm presenting, are partially endothermic. They can warm themselves by shivering their flight muscles to produce heat but do not constantly physiologically regulate body temperature."

Excessive heat, such as that in the rainforests of Panama, where the bees in this study originated, can limit a bee's ability to fly. "If bees stop flying as often in hot temperatures, the amount of time they have to forage (and therefore pollinate flowering plants) decreases. This can mean they aren't able to collect enough food to maintain the colony," Duell said. "On a large enough scale, this negatively impacts the overall bee population and the plants they pollinate while collecting pollen and nectar for food."

Therefore, better flying performance is an advantage for bees in hot climates. Bees that are unable to fly in hot conditions ultimately end up walking from flower to flower, which is far less efficient than flying and means they are subject to even hotter temperatures on the surfaces of flowers and leaves.

In the new study, Duell and her collaborator, Jon F. Harrison, PhD, measured air and thorax temperatures of 10 species of stingless bees--which varied in body mass between 2 and 120 milligrams--to assess how well bees fly at high temperatures and the variations seen based on body size. The researchers also measured leaf and flower surface temperatures and air temperatures in sun and shade within the bees' native tropical forest canopy.

With the temperature-size rule in mind, the researchers were expecting the smaller bees to perform better in hot weather. Surprisingly, the opposite was true: large bees seem to have adapted to the high temperatures by using their ability to maintain their own body heat. This flight performance advantage was also seen at cooler altitudes of the hot Panamanian rain forest.

"Essentially the bigger bees are exposed to higher temperatures--sometimes in excess of 10 degrees Celsius hotter than air temperature--because they produce a lot of heat while flying. That same heat producing ability gives them an advantage in cooler regions as well because they can be active earlier in the morning, later into the evening or on cooler days compared to smaller bees," Duell said.

Credit: 
American Physiological Society

Does the US discard too many transplantable kidneys?

San Diego, CA (October 27, 2018) -- Comparing transplant data between countries may help address the global organ shortage, according to a study that will be presented at ASN Kidney Week 2018, October 23-28, at the San Diego Convention Center. The study provides evidence that some kidneys discarded in the United States are a lost opportunity that could have benefited some patients.

Approximately 2,000 donated kidneys are discarded in the United States each year, despite a serious shortage of organs for transplantation. By studying transplant data from the United Network for Organ Sharing and from the French Organ Procurement Agency from 2004 to 2014, Olivier Aubert, MD, PhD, Alexandre Loupy, MD, PhD (Paris Translational Research Center for Organ Transplantation), and their colleagues compared kidney quality and outcomes between the United States and France.

During this period, 156,089 kidneys in the United States and 29,984 kidneys in France were procured for transplant. A much higher proportion of transplanted French kidneys were considered higher-risk organs (as measured by the kidney donor profile index, KDPI) compared with US organs. During the decade, the KDPI of US kidneys only increased modestly, while in France, a steadily rising KDPI reflected a trend of more aggressive organ use. Models predicted that many transplanted French kidneys would have had a high probability of discard in the US system. If US centers adopted greater willingness to accept kidneys from older donors and other higher-risk donor groups, this change would provide an additional 132,445 allograft life-years to US transplant candidates over 10 years.

"The global shortage of organs for transplantation is a major public health concern. In the US alone, approximately 100,000 individuals are waiting for a kidney transplant," said Dr. Loupy. "New, creative solutions to address this concern are needed. By comparing transplant practices in two countries, we provide fresh evidence that older deceased donor organs are a valuable underutilized resource."

Dr. Aubert noted that international comparisons of transplant practice offer a natural experiment "so that the successful innovations in each country can be rapidly identified and exported. Transplantation could benefit from additional studies that cross borders, as this one did."

Credit: 
American Society of Nephrology

Climate change a threat to even the most tolerant oysters

New Orleans (October 27, 2018)--Climate change-associated severe weather events may cause flooding that threatens the survival of the Olympia oyster, new research suggests. The findings will be presented today at the American Physiological Society's (APS) Comparative Physiology: Complexity and Integration conference in New Orleans.

Oceans around the world typically have a salt content (salinity) of around 3.5 percent, but the percentage varies more in shallow coastal waters affected by rainfall. Researchers studied three groups of Olympia oysters from different areas of the California coast where the influence of rainfall on seawater salinity varies. One group was native to a large estuary--a body of seawater near the mouth of a river--that was routinely exposed to freshwater flooding from extreme precipitation, which decreased the salinity of the oysters' surroundings. A second group lived in a small estuary that received much less freshwater exposure, and a third group lived far away from the large estuary, where salinity was also higher and more stable.

All organisms, including oysters, show increased expression of genes involved with DNA damage and protein unfolding in response to extreme stress. Protein unfolding is a process in which proteins lose their structure and become unstable, which, if not repaired, will eventually lead to the animal's death. Researchers study the Olympia oyster because it is a "foundation species," meaning the presence of oysters provides habitat for a large number of other smaller species and creates a much healthier ecosystem. If the oysters die out, all of the associated species will too. Because of the vital role oysters play in coastal ecosystems, researchers want to know if oysters living in certain areas are more tolerant of low salinity and therefore better equipped to survive climate change.

The research team exposed all three groups of oysters and their offspring to low-salinity seawater (around 0.5 percent salt) and measured their gene expression patterns. They found that the oysters living closest to the large estuary were more tolerant of a five-day exposure to low-salinity seawater. "More frequent exposure to freshwater in this region likely forced oysters to evolve new ways of surviving in low salinity," explained Tyler Evans, PhD, from California State University East Bay, and first author of the study.

This group expressed considerably higher levels of mRNA--genetic material that tells cells what new proteins to make--than the less-tolerant oysters that were accustomed to higher salinities. Proteins encoded by the mRNA control the activity of the oyster's cilia (hair-like structures on the gills that move back and forth to circulate fluid inside the oyster shell). The researchers predict this added cilia movement increases survival by allowing oysters to keep their shells closed (and the low-salinity seawater out) for longer amounts of time. However, climate change is a concern for the survival of even the most tolerant group of Olympia oysters due to the expected increase in the severity of extreme-precipitation events that would expose the oysters to even longer periods of low salinity. "Even oysters having garnered greater low-salinity tolerance via natural selection will be vulnerable to future freshwater flooding events," the research team wrote.

Credit: 
American Physiological Society

Mouse study suggests vaccine strategy for immunocompromised patients

A study led by Som Nanjappa at the University of Illinois College of Veterinary Medicine identifies a cellular target that may improve efficacy in vaccines designed to protect immunocompromised individuals from potentially deadly opportunistic infections.

The study, conducted in a mouse model and recently published in the Journal of Immunology, shows that a protein important in regulating immune response, called CBLB, can be targeted in combination with an inactivated vaccine to elicit immunity through a unique T cell pathway. This approach may lead to protective vaccines for immune-impaired patients, such as those undergoing chemotherapy or immunosuppressive therapy, or those with immune deficiencies.

While fungal pathogens rarely sicken healthy individuals, the incidence of fungal infections in people with HIV/AIDS or other immune deficiencies has risen sharply in recent years. This population is highly susceptible to fungal infections, resulting in as much as 70 percent mortality even when treated with antifungal medications.

"Because prevention is better than cure, the ideal solution would be to vaccinate immunocompromised individuals against such opportunistic infections," said Dr. Nanjappa. "Currently, however, there are no licensed fungal vaccines. Additionally, in order to be safe for use in immunocompromised patients, such a vaccine would need to be based on an inactivated rather than live pathogen. Yet inactivated vaccines stimulate a weaker immune response."

To address these obstacles to vaccine development, Dr. Nanjappa and his colleagues at the U. of I. and at the University of Wisconsin-Madison sought targets that could be used as adjuvants for fungal vaccines. Casitas B-lineage lymphoma-b (CBLB) is a critical negative regulator of T cell response. Targeting CBLB has been shown to help control chronic viral infections and tumors. The new paper reports on extensive analyses of the role of CBLB in CD8+ T cell immune response to various live and inactivated vaccines in mouse models that had been depleted of CD4+ T cells.

CD4+ T cells, sometimes called "helper" T cells, are required players in almost all the body's immune responses. They signal activity by other infection-fighting white blood cells, causing B cells to secrete antibodies, macrophages to destroy microbes, and CD8+ T cells (sometimes called "cytotoxic" or "killer" T-cells) to kill infected cells. CD4+ T cells also appear to play a critical role in the body's ability to fight off fungal infections.

Previous work by Nanjappa and colleagues showed that a live attenuated fungal vaccine can, in the absence of CD4+ T cells, stimulate some CD8+ T cells (type 1 and type 17) to take on some of the function of CD4+ T cells and generate long-term immunity against fungal pathogens in a mouse model.

Data published in the current study support the premise that adjuvants targeting a negative regulator of T cell response such as CBLB could provide lasting immunity against lethal fungal pathogens in a population deficient in CD4+ T cells. The study also showed that targeting CBLB invigorates the CD8+ T cell response to existing viral infection.

These findings may have broad translational potential for clinical applications for a variety of immunocompromised conditions, from transplantation and chemotherapy to the immunosuppressive stages of pregnancy.

Credit: 
University of Illinois at Urbana-Champaign, News Bureau

Diagnosing strokes is complicated by 'mimics' and 'chameleons'

MAYWOOD, IL - Stroke specialists often see conditions known as stroke "mimics" and "chameleons" that can complicate accurate diagnoses, Loyola Medicine neurologists report in the November 2018 issue of Neuroimaging Clinics of North America.

Stroke mimics are medical conditions that look like strokes, while chameleons are strokes that look like other conditions.

Diagnostic accuracy "may be complicated by the abundance of both 'stroke mimics' and 'stroke chameleons,'" neurologists Shannon Hextrum, MD, and José Biller, MD, wrote. Dr. Biller is professor and chair of the department of neurology of Loyola University Chicago Stritch School of Medicine. Dr. Hextrum completed a neurology residency at Loyola.

Drs. Biller and Hextrum examined mimics and chameleons associated with ischemic strokes, which account for about 85 percent of all strokes. Ischemic strokes are caused by blood clots that block blood flow to an area of the brain. (The other main type of stroke, hemorrhagic, is caused by bleeding in the brain.)

Permanent damage from an ischemic stroke can be minimized by quickly restoring blood flow. This can be done by administering the clot-busting IV drug tPA or by performing a minimally invasive surgery to remove the blood clot. But such treatments can do more harm than good if a patient is incorrectly diagnosed.

The exact prevalence of stroke mimics is unknown. According to previous studies, anywhere from 1.4 to 38 percent of patients admitted for suspected ischemic strokes actually have other conditions.

For example, Drs. Hextrum and Biller cite the case of a 79-year-old woman who experienced sudden weakness on the right side of her body and difficulty speaking - classic signs of a stroke. But a CT angiogram showed no evidence of stroke, and she later was correctly diagnosed as having viral encephalitis.

In another stroke mimic, a 60-year-old man had difficulty walking, speaking and reading. He also had vision problems that were preceded by a headache. The patient earlier had received radiation for a brain tumor. Rather than a stroke, he was experiencing SMART syndrome (stroke-like migraine attacks after radiation therapy).

A wide range of other conditions also can mimic ischemic strokes, including seizures, sepsis, low blood sugar, dizziness, vertigo, drug and alcohol toxicity and multiple sclerosis.

In treating an ischemic stroke, it's critically important to restore blood flow within the first few hours before brain cells die. But fewer than 10 percent of ischemic stroke patients receive a clot-busting drug and fewer still undergo surgery to remove the clot. One reason may be the prevalence of stroke chameleons.

Stroke chameleons with non-specific symptoms, such as nausea, vomiting and decreased mental activity, pose a particular challenge when triaging patients in the emergency room. But often, such patients also have neurologic deficits that can indicate a stroke.

Accurately diagnosing an ischemic stroke requires a detailed history and neurologic examination, which should not be rushed in an effort to speed administration of the clot-busting drug, Drs. Hextrum and Biller wrote.

They conclude: "Attention to subtleties of the neurologic examination and listening closely to patients remain critical for both diagnostic accuracy and development of sound clinical judgment."

Their paper is titled, "Clinical Distinction of Cerebral Ischemia and Triaging of Patients in the Emergency Department."

Credit: 
Loyola Medicine

How sleeping mammary stem cells are awakened in puberty

image: High resolution imaging of ducts in the mammary gland was critical for the discovery of how their growth is triggered in puberty.

Image: 
Walter and Eliza Hall Institute, Australia

Walter and Eliza Hall Institute researchers have discovered how the growth of milk-producing mammary glands is triggered during puberty.

Sleeping stem cells in the mammary gland are awoken by a protein dubbed FoxP1, according to the research that was published today in the journal Developmental Cell.

The research expands our knowledge of how the mammary gland - a component of the human breast - develops from stem cells, underpinning a better understanding of how defects in this process lead to breast cancer. The research was led by Dr Nai Yang Fu, Professor Jane Visvader and Professor Geoff Lindeman, who is also a medical oncologist at the Royal Melbourne Hospital and the Peter MacCallum Cancer Centre, in collaboration with Professor Gordon Smyth and his bioinformatics team.

AT A GLANCE

Stem cells - the cells that can give rise to a range of other cells types - are often found in a dormant state in our body, and little is known about how they are awakened into an activated state.

Our researchers discovered 'sleeping' mammary stem cells are awoken at puberty by a gene called FoxP1. This triggers the rapid growth and development of mammary glands.

Without FoxP1, the mammary stem cells are locked in a dormant state and the mammary glands cannot grow.

WAKING UP STEM CELLS

Stem cells in the mammary gland exist in a largely dormant or 'sleeping' state throughout life. In puberty, these stem cells need to be 'woken up' to drive the rapid expansion of the mammary gland, said Professor Visvader.

"The mammary stem cells are ready for a signal to start dividing," she said. "We discovered that a gene called FoxP1 is an essential part of this signal in puberty and the adult."

FoxP1 switches off the production of other proteins within cells - by repressing their genes.

"We discovered that FoxP1 switches off the production of one of the key proteins that keep mammary stem cells asleep. As the level of this protein drops, the stem cells wake up and begin to divide, driving mammary gland growth," Dr Fu said.

THE IMPORTANCE OF TEAM WORK

The project relied on collaboration between scientists with diverse skills, said Professor Visvader.

"This project brought together expertise in cell biology, developmental biology, bioinformatics and imaging to solve the question of how mammary stem cells are awoken in puberty and adult breast tissue.

"We're still looking for the precise connections linking female hormones and FoxP1, but we are one step closer to understanding the detailed process of breast development. This is also helping us to connect faulty cells that contribute to breast development with the development of breast cancer," she said.

Credit: 
Walter and Eliza Hall Institute

Scientists discover a new lead for mechanism of action of diabetes drug metformin

Canadian and British researchers have discovered how the frontline Type 2 diabetes drug metformin may work to help cells better take up and use glucose. Their study, published today in the prestigious journal Cell, may also explain other potential beneficial effects of metformin for prevention of a variety of chronic diseases, including cancers.

To show that metformin appeared to make the cells act as if they were starved of the essential mineral iron, biochemists at Université de Montréal used a new method to simultaneously probe how all of a cell's biochemical processes respond to the presence of a drug. Collaborating with researchers at the Francis Crick Institute in London, the UdeM team showed that metformin has a global effect on iron distribution in cells, resulting in alteration of essential biochemical processes.

The novel technology that made this discovery possible was developed in the lab of lead author Stephen Michnick, a biochemistry professor at UdeM and holder of a Canada Research Chair in cell architecture. "If you want to know what a drug or any other molecule is doing in the body, you need to survey everything going on in its cells at once," said Dr. Michnick. "Today there are several ways to do this, but our method, called hdPCA, has the merit of being extremely simple to perform and interpret, non-invasive and inexpensive; it can be done in almost any lab." The method can be deployed to rapidly predict and confirm how a drug might affect cells and simultaneously identify any liabilities the drug might have if introduced into humans.

"We'd chosen to use metformin, mostly because it was an interesting test case, having no clear mechanism of action,"added the study's first author, UdeM biochemist Bram Stynen. "The lead to effects of metformin on iron homeostasis was a bonus of this study. A connection between iron metabolism and diabetes was already suspected but no-one had ever showed a specific antidiabetic effect of metformin in living cells connected to iron homeostasis." Added collaborator Markus Ralser, a biochemist at Francis Crick, "this makes a lot of sense; glucose metabolism most likely emerged evolutionarily from iron-dependent chemical reactions; such chemical relationships don't disappear in evolution."

Further cell and animal studies will have to be done to pin down how important iron-starvation mimicking effects of metformin are to glucose metabolism and how this mechanism might be better exploited to improve diabetes treatments.

Credit: 
University of Montreal

Air pollution and noise increase risk for heart attacks

image: Considering transportation noise should be an integral part of any studies looking at the impact of air pollution on health.

Image: 
Jana Sönksen / Swiss TPH

Where air pollution is high, the level of transportation noise is usually also elevated. Not only does air pollution negatively impact health; car, train and aircraft noise also increases the risk for cardiovascular diseases and diabetes, as previous research has demonstrated. Studies that investigate the effect of air pollution without sufficiently accounting for the impact of noise on health might therefore overestimate the effect of air pollution. These are the results of a comprehensive study conducted by the Swiss Tropical and Public Health Institute (Swiss TPH), which was published today in the peer-reviewed European Heart Journal.

The study looked at the combined effects of air pollution and transportation noise on heart attack mortality by considering all deaths that occurred in Switzerland between 2000 and 2008. Analyses that only included fine particulates (PM2.5) suggest that the risk for a heart attack rises by 5.2% per 10 μg/m³ increase in the long-term concentration at home. Analyses that also account for road, railway and aircraft noise reveal that the risk for a heart attack attributable to fine particulates in fact increases considerably less: by 1.9% per 10 μg/m³ increase. These findings indicate that the negative effects of air pollution may have been overestimated in studies that fail to concurrently consider noise exposure.

"Our study showed that transportation noise increases the risk for a heart attack by 2.0 to 3.4% per 10 decibels increase in the average sound pressure level at home." said Martin Röösli, Head of the Environmental Exposures and Health Unit at Swiss TPH, and lead author of the published research. "Strikingly, the effects of noise were independent from air pollution exposure."

Effect of noise and air pollution are additive

The study also found that people exposed to both air pollution and noise are at the highest risk of heart attack. Hence, the effects of air pollution and noise are additive. "Public discussions often focus on the negative health effects of either air pollution or noise but do not consider the combined impact," said Röösli. "Our research suggests that both exposures must be considered at the same time." This has implications for both policy and future research. Röösli and co-researchers therefore recommend including transportation noise exposure in any further research related to air pollution and health, to avoid overestimating the negative effects of air pollution on the cardiovascular system.
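As a rough worked example of what these per-increment figures imply, the sketch below scales the reported estimates to other exposure changes. The log-linear scaling and the simple addition of excess risks are common epidemiological conventions assumed here, not details taken from the paper.

```python
def excess_risk(per_10_units, delta):
    """Excess relative risk for an exposure change `delta`, given a
    reported increase `per_10_units` per 10 units of exposure.
    Assumes log-linear scaling of the exposure-response estimate."""
    return (1 + per_10_units) ** (delta / 10) - 1

pm25 = excess_risk(0.019, 20)   # noise-adjusted PM2.5 estimate, +20 ug/m3
noise = excess_risk(0.030, 15)  # mid-range noise estimate, +15 dB

print(f"PM2.5 +20 ug/m3: +{pm25:.1%}")            # ~ +3.8%
print(f"noise +15 dB:    +{noise:.1%}")           # ~ +4.5%
print(f"combined (additive): +{pm25 + noise:.1%}")
```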

Data from across Switzerland

The study included all deaths (19,261) reported across Switzerland for the period 2000 to 2008. The air pollution (PM2.5) was modelled using satellite and geographic data, calibrated with air pollution measurements from 99 measurement sites throughout Switzerland. Nitrogen dioxide (NO2) was also modelled using 9,469 biweekly passive sampling measurements collected between 2000 and 2008 at 1,834 locations in Switzerland. Transportation noise was modelled with well-established noise propagation models (sonRoad, sonRAIL and FLULA 2) by Empa and n-sphere. The air pollution and transportation noise models were applied to the home address of each of the 4.4 million Swiss adult citizens (aged 30 years and above).

Credit: 
Swiss Tropical and Public Health Institute

Plump songbirds more likely to survive migration over Gulf of Mexico

URBANA, Ill. - A kilometer above Fort Morgan, Alabama, small migratory birds face a critical decision. Ahead lies a thousand kilometers of open water, the Gulf of Mexico, and a 22- to 24-hour flight without rest or food. On the other side, if they make it, they'll continue the journey to their South American winter habitat. For some, the journey will end in the waters of the Gulf.

With many migratory birds in decline, ornithologists are keen to identify "choke points" along their routes. Large geographic barriers like the Gulf are likely suspects, but survival rates across these barriers are difficult to estimate. A new study published in Proceedings of the Royal Society B provides the first survival estimates for small migratory birds crossing the Gulf, and the factors that explain whether or not they survive the crossing.

"We know a lot of birds die going across the Gulf because we see birds floating up on shore and in the stomach contents of sharks. We just don't know how many and how risky it is to go across the Gulf," says Mike Ward, lead author of the study, an associate professor in the Department of Natural Resources and Environmental Sciences at U of I, and avian ecologist at the Illinois Natural History Survey. "We figured out that survival depends on a combination of how fat they are - the fatter the better - and how much wind they have at their back."

Ward and his colleagues focused on Swainson's thrushes, small sparrow-sized birds that travel between Canada and South America twice each year. Some avoid the Gulf, opting to fly over Texas and mainland Mexico, but many more brave the treacherous shortcut between Fort Morgan and the Yucatan Peninsula. Why take the risk?

"They want to get to their wintering location as soon as possible because birds are territorial in the wintering grounds. They want to get to Columbia or Venezuela to get the best habitat for the winter," Ward explains.

In the study, Ward's colleagues captured Swainson's thrushes at Fort Morgan each fall for five years. For each of the 139 birds they caught, the researchers gauged fat reserves, determined sex, and used eyelash glue to attach a tiny radio transmitter to the bird's back. Meanwhile, Ward was on the Yucatan side erecting radio towers to pick up signals from the birds' transmitters.

Using sophisticated analyses, the team estimated survival probabilities for all departing birds. With data from both the birds detected and those not detected on the Yucatan side, they were able to determine the factors that predicted which individuals were likely to survive the crossing.

The researchers state, "Survival estimates varied with wind profit and fat, but generally, fat birds departing on days with favorable wind profits had an apparent survival probability of greater than 0.90, while lean individuals with no or negative wind profits had less than 0.33." In other words, the fatter the bird and the stronger the tailwind, the greater the probability of survival.
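A minimal sketch of how such covariate-dependent survival probabilities are often modelled: a logistic form, with made-up coefficients chosen only to reproduce the endpoints quoted above. The study's actual analysis is more sophisticated.

```python
import math

def survival_probability(fat_score, wind_profit,
                         b0=-1.2, b_fat=1.5, b_wind=0.4):
    """Toy logistic model: survival rises with fat reserves and tailwind.
    All coefficients are hypothetical, not estimated from the study's data."""
    return 1 / (1 + math.exp(-(b0 + b_fat * fat_score + b_wind * wind_profit)))

print(f"fat bird, strong tailwind: {survival_probability(2.5, 3.0):.2f}")  # > 0.90
print(f"lean bird, headwind:       {survival_probability(0.0, -2.0):.2f}") # < 0.33
```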

Going back to that moment of decision above Fort Morgan - to cross or not to cross - Ward says birds can usually tell if they're ready to make the trip.

"Birds that aren't fat enough know it. When they fly up in the sky at dusk, they circle around a little bit and head back north to find more food. The really fat ones - we call them little butterballs - fly up in the sky then start heading south. As long as they don't have a strong wind in their face, they should be fine. Individuals with intermediate levels of fat have to make a tough decision," he says.

Ward says that from a conservation perspective there's not much people can do about the wind, but conservation efforts can improve birds' chances of surviving the journey across the Gulf. What people can do is help birds get fat.

"If people throughout the migration corridor provide habitat and food sources for birds to add fat, they're facilitating their ability to cross the Gulf even if the winds aren't ideal. Whether it's planting native shrubs in your backyard, or setting aside a big tract of forest, I'm a big proponent that every small thing helps."

Credit: 
University of Illinois College of Agricultural, Consumer and Environmental Sciences

Investigating glaciers in depth

Global sea level is rising constantly. One factor contributing to this rise is the melting of the glaciers. However, although the surface area of the glaciers has been well mapped, there is often no information regarding their thickness, making it impossible to calculate their volume. As a result, we cannot accurately calculate the effects on sea levels. Dr. Johannes Fürst from the Institute of Geography at Friedrich-Alexander-Universität Erlangen-Nürnberg (FAU) has developed an approach which can be used to draw up regional ice thickness maps for glaciers. He has now produced such a map for Svalbard and published his findings in Geophysical Research Letters (DOI: 10.1029/2018GL079734).

New ice thickness map, new findings

The FAU geographer Johannes Fürst has gathered and evaluated data measured by a number of international research teams since the early 1980s. These measurements have been entered into the new map of glacier thickness on Svalbard, an archipelago to the north of Norway with the main island Spitsbergen. Whereas previous studies have only looked at individual thickness measurements in isolation, projecting the total ice volume on the basis of the surface area and just a few measurements, this map takes all available measurements into account in order to obtain a reliable estimate of the total ice volume. At 6,200 cubic kilometres, it is approximately one third smaller than previously presumed. Nevertheless, if this ice were to melt all of a sudden, it would still cause global sea levels to rise by 1.5 centimetres.
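The 1.5-centimetre figure can be checked with back-of-envelope arithmetic. In the sketch below, the ice density and global ocean area are standard approximate values, not numbers taken from the paper.

```python
ICE_VOLUME_KM3 = 6200            # new Svalbard estimate
ICE_TO_WATER = 917 / 1000        # ice density relative to water (approx.)
OCEAN_AREA_KM2 = 3.62e8          # global ocean surface area (approx.)

rise_mm = ICE_VOLUME_KM3 * ICE_TO_WATER / OCEAN_AREA_KM2 * 1e6  # km -> mm
print(f"sea-level equivalent: {rise_mm:.1f} mm")  # ~15.7 mm, about 1.5 cm
```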

Fürst also provides an associated map of error estimates. Errors may be a direct result of the thickness measurements; at these locations, they can readily be calculated. It is more difficult for positions away from the measurements. Starting from a given measurement point, errors are estimated along the flowline down the glacier based on flow speed and direction as well as local mass gain and loss. Researchers could also use this approach to calculate the ice-thickness uncertainty for regions where hardly any measurements have been taken.

'In order to calculate the future demise of glaciers accurately, we have to know the thickness of the glaciers. Until now, however, we have only had very rough estimates, which vary greatly. This is down to the lack of measurements taken worldwide. My approach, which can also be used for other glaciers, may help in this respect,' Johannes Fürst explains.

Data, data, data

There are nearly 1,700 glaciers on Svalbard. On these, one million point measurements of the ice thickness have already been collected, ranging in date from the 1980s to the present day. These measurements have mainly been provided by British, Spanish, Norwegian and Danish teams of researchers, but Polish, Icelandic, French and Japanese researchers have also collected valuable data. One method of determining the thickness of the ice cap is using radar. A radar signal is sent down through the ice. The longer the signal takes to return to the measuring device, the thicker the ice is. 'It's like ping-pong: the table tennis player hits the ball and waits until it comes back. The longer he has to wait, the further away the ball was,' explains Fürst. Another method involves making several boreholes through several hundred metres of ice. The extracted ice cores were used, for example, to study past fluctuations in temperature or precipitation. A mining company has also used drilling in order to better judge the risks of mining for coal underneath the glacier. Fürst has included all these measurements in the ice thickness map.
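The ping-pong analogy maps onto a simple time-of-flight calculation. The short sketch below assumes the commonly used radio-wave speed in glacier ice of roughly 168 metres per microsecond.

```python
WAVE_SPEED_ICE = 168.0  # m per microsecond, typical for glacier ice (assumed)

def ice_thickness_m(two_way_travel_time_us):
    # Halve the round trip: the radar signal travels down and back up.
    return WAVE_SPEED_ICE * two_way_travel_time_us / 2

print(f"{ice_thickness_m(1.6):.0f} m")  # a 1.6 us echo -> ~134 m of ice
```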

Going with the flow

The ice thickness map gives new insights into the dynamic ice loss of glaciers. When new snow falls, its weight compresses previous layers of snow and a new mass of ice is gradually formed. This ice then flows downglacier, in some places until it reaches the ocean. Huge icebergs regularly break off from the ice cliffs there. The mass lost there every year can only be estimated accurately if you know how thick the ice at the ice cliffs actually is. Johannes Fürst has calculated an average thickness of 135 metres for all marine-terminating glaciers in the Svalbard archipelago. The previous estimate was 214 metres. Thanks to the new map, we are now able to accurately estimate the dynamic ice loss from Svalbard glaciers.

Credit: 
Friedrich-Alexander-Universität Erlangen-Nürnberg