
Heart attack patients taken directly to heart centers have better long-term survival

Malaga, Spain - 3 March 2019: Heart attack patients taken directly to heart centres for lifesaving treatment have better long-term survival than those transferred from another hospital, reports a large observational study presented today at Acute Cardiovascular Care 2019, a European Society of Cardiology (ESC) congress. Directly admitted patients were older, suggesting that heart attacks in young adults, and particularly women, go unrecognised by paramedics and patients.

Study author Dr Krishnaraj Rathod, of Barts Health NHS Trust, London, UK, said: "The age at first heart attack is getting younger, partly because of lifestyle habits. The average age in our cohort is no longer 60, but around 40 years, and we even see patients in their 30s. Directly admitted patients were sicker but they were also older, indicating that paramedics may think heart attack is unlikely in younger adults. My message to them is 'in cases of doubt, repeat the 12-lead ECG and consider speaking to the heart attack centre'."

People in their 30s and 40s should not ignore heart attack symptoms, particularly women who often have atypical symptoms, he said. "Younger patients likely wait longer to call for help because if they have chest pain, heart attack is not the first thing they think of. If you are in any doubt, phone an ambulance."

The study from the London Heart Attack Group included 25,315 patients with ST-elevation myocardial infarction (STEMI), a serious type of heart attack where a major artery supplying blood to the heart is blocked. Rapid opening of the artery with a stent using primary percutaneous coronary intervention (PCI) improves survival, and guidelines advise taking STEMI patients directly to a primary PCI centre.

The study compared characteristics, time to primary PCI, and long-term outcomes of STEMI patients taken directly to a primary PCI hospital versus those transferred from another hospital. Patients with STEMI were treated with primary PCI between 2005 and 2015 at the eight primary PCI centres in London. Patient details were recorded at the time of the procedure in the British Cardiovascular Intervention Society dataset. Data on all-cause mortality were obtained from the Office for National Statistics.

A total of 17,580 (69%) patients were admitted directly to primary PCI centres and 7,735 (31%) were transferred from other hospitals. The time between call for help and first hospital admission was similar between the two groups. However, the median time from call for help to opening the blocked artery with primary PCI was 52 minutes longer in transferred patients compared to those admitted directly.

After a median follow-up of three years, patients admitted directly to a primary PCI centre were significantly less likely to have died than those transferred from another hospital (17.4% versus 18.7%). After adjusting for factors that could influence the risk of death including age, previous heart attack and diabetes, direct admission to a primary PCI hospital was associated with a 20% lower risk of all-cause death.

Dr Rathod said: "Our findings indicate that the superior survival in patients admitted directly to a primary PCI hospital was because there was a shorter gap between calling for help and receiving treatment."

"All patients with STEMI should be admitted directly to a primary PCI centre within 90 minutes of diagnosis by electrocardiogram (ECG), which is done by ambulance teams," he said. "Yet in our study nearly one-third were taken to another hospital first, indicating that a STEMI diagnosis was not made until patients reached that hospital, and they then had to be transferred. However, it must be noted that the rates of transfer directly to a primary PCI centre were better in the later years suggesting better identification of appropriate patients by healthcare staff."

Credit: 
European Society of Cardiology

Women call ambulance for husbands with heart attacks but not for themselves

Malaga, Spain - 3 March 2019: Women call an ambulance for husbands, fathers and brothers with heart attack symptoms but not for themselves. "It's time for women to take care of themselves too" is the main message of two studies from the Polish Registry of Acute Coronary Syndromes (PL-ACS) presented today at Acute Cardiovascular Care 2019, a European Society of Cardiology (ESC) congress.

The findings come ahead of International Women's Day on 8 March. This year's campaign theme - #BalanceforBetter - is a call-to-action for driving gender balance across the world. Ischaemic heart disease is the leading cause of death in both women and men, yet today's research shows disparities in management.

Professor Mariusz Gąsior, principal investigator of the registry, said: "Very often women run the house, send children to school, and prepare for family celebrations. We hear over and over again that these responsibilities delay women from calling an ambulance if they experience symptoms of a heart attack."

Dr Marek Gierlotka, registry coordinator, added: "In addition to running the household, women make sure that male relatives receive urgent medical help when needed. It is time for women to take care of themselves too."

A total of 7,582 patients with ST-elevation myocardial infarction (STEMI) were included in the analyses. STEMI is a serious type of heart attack where a major artery supplying blood to the heart is blocked. Faster restoration of blood flow translates into more salvaged heart muscle and less dead tissue, less subsequent heart failure, and a lower risk of death. Guidelines therefore recommend opening the artery with a stent within 90 minutes of diagnosis in the ambulance by electrocardiogram (ECG).

Overall, 45% of patients were treated within the recommended timeframe - these patients were less often women. After adjusting for factors that could influence the relationship, male sex remained an independent predictor of treatment within the recommended timeframe.

Patients within and outside the advised treatment window had similar rates of in-hospital mortality, but those treated promptly were less likely to have a left ventricle ejection fraction below 40% - meaning their heart was better able to pump blood and they had a lower chance of developing heart failure.

ECG results were transmitted from the ambulance to a heart attack centre in about 40% of patients. In women, the likelihood of ECG transfer rose with increasing age - from 34% in women aged 54 years and under to 45% in those aged 75 and above. In men, the rate of transfer was around 40% regardless of age.

Professor Gąsior said: "One of the reasons women are less likely than men to be treated within the recommended time period is that they take longer to call an ambulance when they have symptoms - this is especially true for younger women. In addition, ECG results for younger women are less often sent to the heart attack centre, which is recommended to speed up treatment."

Dr Gierlotka said: "More efforts are needed to improve the logistics of pre-hospital heart attack care in young women. Greater awareness should be promoted among medical staff and the general public that women, even young women, also have heart attacks. Women are more likely to have atypical signs and symptoms, which may contribute to a delay in calling for medical assistance."

Pain in the chest and left arm are the best known symptoms of heart attack. Women often have back, shoulder, or stomach pain. Call an ambulance if you have pain in the chest, throat, neck, back, stomach or shoulders that lasts for more than 15 minutes.

Credit: 
European Society of Cardiology

Don't ignore heart attack symptoms, especially while traveling

Malaga, Spain - 2 March 2019: Don't ignore heart attack symptoms while travelling, and keep emergency numbers at hand. That's the main message of a study presented today at Acute Cardiovascular Care 2019, a European Society of Cardiology (ESC) congress. Cardiovascular disease is the leading cause of natural death among people who are travelling, yet until now the long-term outlook for those who have a heart attack while on a trip has been unknown.

"If you are travelling and experience heart attack symptoms such as pain in the chest, throat, neck, back, stomach or shoulders that lasts for more than 15 minutes, call an ambulance without delay," said study author Dr Ryota Nishio, of the Department of Cardiology, Juntendo University Shizuoka Hospital, Izunokuni, Japan.

This observational study included 2,564 patients who had a heart attack and rapid treatment with a stent (percutaneous coronary intervention; PCI) between 1999 and 2015 at Juntendo University Shizuoka Hospital. The hospital is on the Izu peninsula, a popular tourist destination near Mount Fuji, and is the regional centre for PCI.

The researchers compared the demographic and clinical characteristics in residents versus people travelling. Patients were followed up for 16 years and the death rates were compared between groups. Mortality data were collected from medical records, telephone contact, and postal questionnaires.

A total of 192 patients (7.5%) were travelling at the onset of the heart attack. Patients who were travelling were younger and had a higher prevalence of ST-elevation myocardial infarction (STEMI), a serious type of heart attack in which a major artery supplying blood to the heart is blocked.

The median follow-up period was 5.3 years. Locals had a significantly higher rate of all-cause death (25.4%) compared to non-residents (16.7%; p = 0.0015) but the rate of death from cardiac causes was comparable between groups.

Heart attacks during a trip were associated with a 42% lower risk of long-term all-cause death than those that occurred in residents, after adjusting for age, sex, hypertension, diabetes, dyslipidaemia, chronic kidney disease, current smoking, prior heart attack, Killip class, and STEMI (adjusted hazard ratio 0.58; 95% confidence interval 0.38-0.83; p = 0.0020).
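The 42% figure follows directly from the reported hazard ratio: relative risk reduction is one minus the hazard ratio, applied to the point estimate and to each confidence limit. A minimal sketch in Python using the values quoted above (the helper function is ours, for illustration only):

```python
# Relative risk reduction implied by a hazard ratio (HR):
# an HR of 0.58 means a 1 - 0.58 = 42% lower hazard of the event.
def risk_reduction(hazard_ratio: float) -> float:
    """Return the relative risk reduction implied by a hazard ratio."""
    return 1.0 - hazard_ratio

# Values reported in the study text.
hr, ci_low, ci_high = 0.58, 0.38, 0.83

print(f"Point estimate: {risk_reduction(hr):.0%} lower risk")
print(f"95% CI: {risk_reduction(ci_high):.0%} to {risk_reduction(ci_low):.0%} lower risk")
```

Note that the confidence limits invert: the upper limit of the hazard ratio (0.83) gives the lower bound of the risk reduction, and vice versa.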

"Our study shows that long-term outcomes after a heart attack while travelling can be good if you get prompt treatment," said Dr Nishio. "It is important that, when you are over the immediate emergency phase, and return home, you see your doctor to find out how you can reduce your risk of a second event by improving your lifestyle and potentially taking preventive medication."

He continued: "We also found that overall, patients were more likely to die during follow-up if they were older, had prior heart attack, or had chronic kidney disease. If you fall into any of these groups or have other risk factors like high blood pressure, smoking or obesity, it is particularly important to make sure you know the emergency number at home and at any travel destination."

Dr Nishio noted that local patients had a higher rate of non-cardiac death, mainly due to cancer. "This may be because most non-residents were from urban areas where people tend to be more health conscious, actively seek medical advice, and have a greater choice of treatment than in remote areas like the Izu peninsula," he said. "In addition, having a heart attack while away from home is a traumatic event that may create a lasting impression and greater health awareness when patients return home."

Credit: 
European Society of Cardiology

New research opens door to more efficient chemical processes across spectrum of industries

CORVALLIS, Ore. - Chemical processes that are more efficient and less expensive may be coming to industries ranging from battery manufacturing to detergent production thanks to an Oregon State University researcher's work advancing metal oxides as catalysts.

The findings, by a collaboration that included scientists from the University of Delaware, were published in Nature Catalysis.

A catalyst increases the rate of a chemical reaction without being consumed by it, so a single catalyst can drive the reaction over and over. Catalysts are involved in the production of most industrially significant chemicals - plastics, dyes, explosives, fuels and more.

Catalysts have traditionally been based on precious metals such as platinum and palladium, explains Konstantinos Goulas, assistant professor of chemical engineering in the OSU College of Engineering and one of the authors of the study.

Those precious metals are expensive and, as catalysts for biomass conversion, "unselective" - that is, their ability to direct a reaction to yield a particular chemical is limited.

"That's why we undertook this study," Goulas said. "This work was inspired by our research on the conversion of biomass, such as wood and agricultural residues, into fuels and commodity chemicals. We wanted to understand the principles of biomass conversion using oxide-based catalysts, which previous studies had suggested were selective catalysts."

An oxide catalyst is a compound of oxygen and at least one other element. Oxides are very abundant and can be relatively inexpensive; most of the earth's crust, for example, consists of metal oxides.

By comparing how fast specific chemicals can be made on a variety of metal oxide catalysts, the team gained important insights related to what properties result in the best metal-oxide catalysts.

"Our study shows that oxide properties that are easy to determine, such as the Gibbs Free Energy of formation of the oxide, can predict the oxide's reactivity. This opens up new pathways for rational catalyst design and more efficient processes in many fields, from industrial chemistry to pollution abatement," Goulas said.
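The idea of screening by an easily tabulated property can be sketched as a simple descriptor ranking. Everything below is hypothetical: the oxide names and formation-energy values are placeholders, not data from the paper, and whether a less negative formation energy implies higher reactivity depends on the reaction in question; the ordering here is purely illustrative.

```python
# Descriptor-based screening: rank candidate oxide catalysts by an easily
# tabulated property (here, a hypothetical Gibbs free energy of formation
# per mole of oxygen, in kJ/mol). None of these values come from the study.
candidates = {
    "oxide_A": -250.0,
    "oxide_B": -180.0,
    "oxide_C": -310.0,
}

def rank_by_descriptor(oxides: dict) -> list:
    """Order oxides from least to most negative formation energy,
    i.e. from most to least easily reducible."""
    return sorted(oxides, key=oxides.get, reverse=True)

print(rank_by_descriptor(candidates))  # ['oxide_B', 'oxide_A', 'oxide_C']
```

The point of such a descriptor is that it replaces an expensive measurement or simulation with a cheap table lookup, which is what makes rational screening of many candidates practical.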

Credit: 
Oregon State University

New chemical probes advance search for new antibiotics

video: A video clip shows cell walls lighting up red in real time as a bacterium integrates a fluorescent probe during growth.

Image: 
Video courtesy of the VanNieuwenhze Lab, Indiana University

BLOOMINGTON, Ind. -- Indiana University researchers are advancing knowledge about how bacteria build their cell walls, work that could contribute to the search for new antibacterial drugs. They have created a new tool to observe living cells in real time under a microscope.

"If you look at the history, no one's really discovered a fundamentally new class of antibiotic for the past 40 to 50 years," said IU chemist Michael VanNieuwenhze, who led the study. "Antibiotic resistance is a significant and urgent public health threat, and we think that new ways to address it -- including this -- have significant value."

The need for new ways to study bacteria is driven in large part by the threat of bacterial resistance. According to the Centers for Disease Control and Prevention, at least 2 million people in the U.S. get an antibiotic-resistant infection each year, and at least 23,000 people die.

"This new technology takes advantages of specific cellular enzymes to stick colored dyes -- or 'probes' -- into the walls of bacterial cells," VanNieuwenhze said. "Since these same enzymes are inhibited by other well-known antibacterial compounds -- most notably, penicillin -- we could theoretically also use these probes to seek out entirely new classes of drugs that inhibit the same reaction."

VanNieuwenhze's lab has already created two other cellular probes patented through IU -- called FDAAs (fluorescent D-amino acids) and DAADs (D-amino acid dipeptides) -- which are in use in laboratories across the globe. The new class of probes, which builds upon these earlier advances, is called rotor-fluorogenic D-amino acids, or RfDAAs. IU has also filed for a patent on this technology.

The main advantage of RfDAAs is their ease of use and ability to show cellular activity in real time. This is because the probes don't require washing steps to remove unincorporated chemicals that blur the distinct boundaries between bacterial cells and their surrounding environment.

Instead, RfDAAs light up only when they're integrated into bacteria's cell walls as part of the regular growth process. The probes illuminate cell walls more quickly and clearly, without the extra steps that can stop cellular activity.

It's the difference between a snapshot and a video, VanNieuwenhze said. A video provides much more information about how cell walls grow, change and interact with their environment.

Already, VanNieuwenhze has launched a collaboration with the IU School of Medicine to apply these methods to the search for new inhibitors of bacterial cell wall synthesis, and the probes are also in use to study bacterial cell division, which may unveil new targets for antibiotic discovery. In addition, the biotechnology company Thermo Fisher Scientific recently purchased exclusive rights to market the two earlier probes as commercial products; other industrial groups have reached out about applying the technology to high-throughput screens designed to identify new drug leads.

Credit: 
Indiana University

Researchers discover clues to brain differences between males and females

BALTIMORE, MD., March 1 -- Researchers at the University of Maryland School of Medicine have discovered a mechanism for how androgens -- male sex steroids -- sculpt brain development. The research, conducted by Margaret M. McCarthy, Ph.D., who chairs the Department of Pharmacology, could ultimately help researchers understand behavioral development differences between males and females.

The research, published in Neuron, discovered a mechanism by which androgens, male sex steroids, sculpt the brains of male rats to produce behavioral differences, such as more aggression and rougher play behavior. "We already knew that the brains of males and females are different, and that testosterone produced during the second trimester in humans and late gestation in rodents contributes to the differences, but we did not know how testosterone has these effects," said Dr. McCarthy.

Jonathan Van Ryzin, PhD, a Postdoctoral Fellow, was lead author on this research conducted in Dr. McCarthy's lab.

A key contributor to the differences in play behavior between males and females is a sex-based difference in the number of newborn cells in the part of the brain called the amygdala, which controls emotions and social behaviors. The research showed that males have fewer of these newborn cells, because they are actively eliminated by immune cells.

In females, the newborn cells differentiated into a type of glial cell, the most abundant cell type in the central nervous system. In males, however, testosterone increased signaling at brain receptors that bind endocannabinoids, activating immune cells. The endocannabinoids prompted the immune cells to effectively eliminate the newborn cells in males. Female rats in the study were unaffected, suggesting that activation of the immune cells by the increased endocannabinoids in males was necessary for cell elimination. In this respect, the research suggests that cannabis use, which stimulates endocannabinoid signaling in the brain and nervous system, could impact fetal brain development, and that this impact could differ between male and female fetuses.

This study provides a mechanism for sex-based differences in social behaviors and suggests that differences in androgen and endocannabinoid signaling may contribute to individual differences in brain development and thus behavioral differences among people.

"These discoveries into brain development are critical as we work to tackle brain disorders as early in life as possible, even in pregnancy," said UMSOM Dean E. Albert Reece, MD, PhD, MBA, who is also the Executive Vice President for Medical Affairs, University of Maryland, and the John Z. and Akiko K. Bowers Distinguished Professor.

Credit: 
University of Maryland School of Medicine

How hair dye works (video)

WASHINGTON, March 1, 2019 -- Whether you need a disguise to run from the law or are just trying to emulate *NSYNC-era frosted tips, you may need some chemical assistance to put the hue in your do. To understand how these "shade-y" changes happen, you have to dive back into the history of chemistry. In this week's Reactions episode, get ready to learn everything you're "dyeing" to know about artificial hair color: https://youtu.be/zeReQ1wlcis.

Reactions is a video series produced by the American Chemical Society and PBS Digital Studios. Subscribe to Reactions at http://bit.ly/ACSReactions, and follow us on Twitter @ACSreactions.

The American Chemical Society, the world's largest scientific society, is a not-for-profit organization chartered by the U.S. Congress. ACS is a global leader in providing access to chemistry-related information and research through its multiple databases, peer-reviewed journals and scientific conferences. ACS does not conduct research, but publishes and publicizes peer-reviewed scientific studies. Its main offices are in Washington, D.C., and Columbus, Ohio.

To automatically receive press releases from the American Chemical Society, contact newsroom@acs.org.


Credit: 
American Chemical Society

Major genetic study confirms that many genes contribute to risk for Tourette's syndrome

A meta-analysis of multiple studies into the genetic background of Tourette's syndrome (TS) - a neurodevelopmental disorder characterized by chronic involuntary motor and vocal tics - finds that variants in hundreds of genes, working in combination, contribute to the development of the syndrome. It also suggests that Tourette's is part of a continuous spectrum of tic disorders, ranging from mild, sometimes transient tics to severe cases that can include psychiatric symptoms.

The report from an international team - led by investigators at Massachusetts General Hospital (MGH), the University of California at Los Angeles (UCLA), the University of Florida and Purdue University - also describes finding that individuals with more TS-associated variants are more severely affected, raising the possibility of predicting whether children with mild tic disorders will develop full-blown TS in the future.

"This study confirms that, for most patients, the underlying genetic basis of Tourette's syndrome is polygenic - that is, many genes working together to cause a disease," says Jeremiah Scharf, MD, PhD, of the Psychiatric & Neurodevelopmental Genetics Unit in the MGH Departments of Neurology and Psychiatry and the MGH Center for Genomic Medicine, co-senior and corresponding author of the report in the American Journal of Psychiatry. "This means that most people who have TS do not carry a single inactive gene but instead inherit hundreds of small DNA changes from both parents that combine to cause TS. This finding has multiple important implications, both scientifically as well as for patient advocacy and understanding of their symptoms."

While it is well known that most of the risk for TS is inherited, the few risk-associated gene variants that have been identified account for only a small percentage of cases. Many common gene variants acting in aggregate have been associated with increased disease risk, suggesting that large-scale, genome-wide association studies (GWAS) could clarify which potential risk genes do and which do not actually contribute to the development of TS.

To achieve the largest possible data set, the investigators combined results from the only published GWAS study with new data from three international genetics consortia - the Tourette Association of America International Consortium for Genetics, the Gilles de la Tourette GWAS Replication Initiative, and the Tourette International Collaborative Genetics Study - for a total of 4,819 individuals with TS and almost 9,500 unaffected control volunteers. A secondary analysis of data from the Iceland-based deCode genetics study compared more than 700 individuals with TS to more than 450 with other tic disorders and more than 6,000 controls.

The results of this analysis identified multiple gene variants - only one of which met genome-wide significance - associated with increased TS risk. Use of an aggregated polygenic risk score based on the identified risk variants to analyze every individual in the study - both with and without TS, as well as individuals with less severe tic disorders - confirmed that those inheriting more risk variants had more severe symptoms. However, the presence of TS-associated variants was not restricted to those with tic disorders.
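In its standard form, an aggregated polygenic risk score of this kind is a weighted sum of an individual's risk-allele counts, with per-variant weights taken from GWAS effect sizes. A minimal sketch, with invented variant names and weights (the study's actual variants and weights are not given in this release):

```python
# Toy polygenic risk score: sum over variants of (effect size x allele count).
# Variant IDs and weights below are hypothetical, for illustration only.
def polygenic_risk_score(genotypes: dict, weights: dict) -> float:
    """genotypes maps variant -> risk-allele count (0, 1, or 2);
    weights maps variant -> per-allele effect size (e.g. log odds ratio)."""
    return sum(w * genotypes.get(v, 0) for v, w in weights.items())

weights = {"rsA": 0.05, "rsB": 0.02, "rsC": 0.08}

carrier_many = {"rsA": 2, "rsB": 1, "rsC": 2}  # many risk alleles -> higher score
carrier_few  = {"rsA": 0, "rsB": 1, "rsC": 0}  # few risk alleles -> lower score

print(round(polygenic_risk_score(carrier_many, weights), 2))  # 0.28
print(round(polygenic_risk_score(carrier_few, weights), 2))   # 0.02
```

This illustrates the study's finding in miniature: individuals carrying more risk alleles accumulate a higher score, even though no single variant on its own is decisive.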

"Every one of the variants that contribute to developing TS is present in a significant proportion of the general population, which means that most people with TS do not have 'broken' or 'mutated' genes," says Scharf. "The movements and thoughts that individuals with TS have are the same ones that all of us have, but just to a greater degree. As doctors and researchers, we know that there is nothing that separates those with TS from other children and adults, and now we've shown this is actually true on a genetic level."

The development of a polygenic risk score for TS raises the future possibility of predicting whether the symptoms of children who develop tics, which typically worsen in early adolescence, will continue to be severe or will resolve as the child matures, something that is not currently possible. Future studies enrolling even larger groups of participants should improve this potential predictive ability.

"This study is an example of the great impact of collaborative research in finally understanding the causes of complex disease," says Peristera Paschou, PhD, associate professor in the Department of Biological Sciences at Purdue University and one of the co-senior authors. "As a next step, we are now expanding the analysis to an even larger sample of close to 12,000 patients with TS, again made possible thanks to widespread international collaboration. We hope that this will yield even higher power to further clarify the genetics of TS."

Co-lead author Jae Hoon Sul, PhD, of the Jane and Terry Semel Institute for Neuroscience and Human Behavior at UCLA, says, "We need larger sample sizes to pinpoint specific genes that cause TS, and there is an ongoing collaborative effort across the U.S., Canada and Europe to increase numbers to tens of thousands of individuals with TS, which will definitely improve our chance of finding more genes related to TS. Only by all of us working collaboratively can we reach that important goal."

Scharf, an assistant professor of Neurology at Harvard Medical School, notes that regions of the brain most likely to be affected by the risk-associated variants in the polygenic risk score are parts of a circuit involved in motor learning, planning and selection of appropriate movements or actions, areas previously suggested to contribute to TS and other tic disorders. "Studies of other polygenic disorders - both brain and non-brain based - have shown that even if a single gene variant plays only a small role in causing a disorder, every gene may be a candidate for understanding disease mechanisms and finding new treatments. We hope that by continuing to find new TS genes, we will be able to find new treatments that are more effective without causing the significant side effects associated with existing therapies."

Credit: 
Massachusetts General Hospital

Detailed new primate brain atlas could lead to disease insights

image: The Brain/MINDS atlas can be used in conjunction with other brain mapping and imaging methods. Here, mapped neural connections of interest (atlas annotations--colored) are laid on top of MRI scans (greyscale).

Image: 
CSHL

Cold Spring Harbor, NY -- The ability to comprehensively map the architecture of connections between neurons in primate brains has long proven elusive for scientists. But a new study, conducted in Japan with contributing neuroscientists from Cold Spring Harbor Laboratory (CSHL), has produced a 3D reconstruction of a marmoset brain, along with information about neuronal connectivity across the entire brain, at an unprecedented level of detail.

The study introduced new methodology, combining experimental and computational approaches, that helps account for significant variation between individual brains. It allows unique brain connectivity maps to be synthesized into a single reference brain. The resulting data set for the marmoset brain is an ideal jumping-off point for further studies, and scientists believe it may offer insights into human neural connectivity.

CSHL Professor Partha Mitra, who conceptualized and collaboratively led the study as part of Brain/MINDS research conducted at the RIKEN Center for Brain Science in Japan, explains that the endgame for any large-scale brain study is to learn more about human brain architecture and how disease can affect it. To do so, scientists must study a brain that is similar to a human's.

The brain architecture of marmosets more closely resembles that of humans than does the mouse brain, which has been the focus of similar efforts in the past. While mice are currently the mainstay for modeling human disease, the emergence of marmoset models of human neurological disorders has made marmosets a target of new research.

Among primates, the marmosets' relatively small brains lend themselves to thorough mapping of neural connections. And in comparison with extensively studied primates like the macaque, marmosets can be easier to study because their brain surfaces are flatter than the more folded cortical surfaces of larger primates.

The results of Mitra and colleagues' new study are detailed in the journal eLife.

"Brain connectivity studies have been carried out in the marmoset before," Mitra explains. "But we did not have complete three-dimensional digital data sets, showing connectivity patterns across several entire brains at the light-microscope resolution. The data we now have is completely unprecedented in scale and in information content."

With this new data and approach as a basis, Mitra and other neuroscientists are one step closer to making sense of the complex neural connections in the primate--and human--brain. The hope is that this line of research will eventually lead to fundamental therapeutic advances for human diseases.

Credit: 
Cold Spring Harbor Laboratory

Giving urban communities a voice in pollinator conservation initiatives

Get a sneak peek at these new scientific papers, publishing on March 4, 2019 in the Ecological Society of America's journal Frontiers in Ecology and the Environment.

Digging for ancient parasites in museum archives
Species origin is linked to extinction risk
Pollinator-friendly cities need to be human community-friendly, too
Is North America's "old growth" forest concept less important than we think?

Parasites hidden in museum specimens can teach us about diseases of the past and present

When ecologists respond to spreading infectious diseases, they need to establish a picture of the "normal" conditions they are trying to recover. According to a review published by researchers at the University of Washington and the Natural History Museum in London, the skeletons, fossils, and floating specimens found in museum and university collections provide a way for ecologists to track long-term shifts in parasitic infections. Many preserved specimens (such as frozen mammoth organs or fossilized dinosaur bones) also happen to contain preserved parasites. The authors explain how parasites can be examined using advanced imaging techniques and DNA analyses to reconstruct stories about diseases over time. 

Author Contact: Chelsea Wood (chelwood@uw.edu)

Harmon A, Littlewood TJ, and Wood CL. 2019. Parasites lost: using natural history collections to track disease change across deep time. Frontiers in Ecology and the Environment 17: https://esajournals.onlinelibrary.wiley.com/doi/10.1002/fee.2017.

Setting the record straight: non-native species are more frequently implicated in extinctions than native species

A number of papers published in the last two decades have argued against the use of species origin as a guiding principle for natural resource management, citing a lack of evidence that non-native species are truly a major cause of biological extinction or other environmental damage. A new analysis of the International Union for Conservation of Nature's Red List of Threatened Species shows that species classified as "alien" have in fact contributed to more plant and animal extinctions than have native species.

Author Contact: Tim Blackburn (t.blackburn@ucl.ac.uk)

Blackburn TM, Bellard C, and Ricciardi A. 2019. Alien versus native species as drivers of recent extinctions. Frontiers in Ecology and the Environment 17: https://esajournals.onlinelibrary.wiley.com/doi/10.1002/fee.2020.

Giving urban communities a voice in pollinator conservation initiatives

Parks, gardens, and vacant lots are ideal candidates for pollinator conservation sites, but in cities, the presence of undeveloped green spaces with lots of unmown grass and vegetation is sometimes viewed as a sign of poverty or neglect. Because tall plants offer concealment from onlookers, "pocket prairie" plots can even be viewed by residents as dangerous and as potential areas of criminal activity. A review by researchers from Ohio State University describes how scientists can connect with local communities to learn how to design public green spaces that are viewed as attractive and safe while still conserving populations of bees and other pollinators.

Author Contact: Mary Gardiner (gardiner.29@osu.edu)

Turo KJ and Gardiner MM. 2019. From potential to practical: conserving bees in urban public green spaces. Frontiers in Ecology and the Environment 17: https://esajournals.onlinelibrary.wiley.com/doi/10.1002/fee.2015.

Out with OLD growth, in with ecological contiNEWity

Forest managers in North America usually rely on tree age when deciding which old-growth forests have the most conservation value. However, a new article by researchers from the Canadian Museum of Nature and Memorial University of Newfoundland contends that "ancient woodlands" do not necessarily require old, stately trees to be considered ancient. Instead, the length of time the area has existed uninterrupted as a forest - regardless of the age of individual trees in the forest - is a better way to identify priority areas for conservation. The authors suggest that lichens, which tend to rely on old forests, could be a way for conservation biologists and forest managers to determine how long an area has been forested. Most biologists and managers do not have expertise in identifying lichen species, but improvements in image recognition software could make it more feasible for non-lichenologists to learn how to identify these cryptic species in the field.

Author Contact: Yolanda Wiersma (ywiersma@mun.ca)

McMullin RT and Wiersma YF. 2019. Out with OLD growth, in with ecological contiNEWity: new perspectives on forest conservation. Frontiers in Ecology and the Environment 17: https://esajournals.onlinelibrary.wiley.com/doi/10.1002/fee.2016.

Credit: 
Ecological Society of America

Population increases and climate change point to future US water shortages

WASHINGTON -- Climate change combined with population growth is setting the stage for water shortages in parts of the U.S. long before the end of the century, according to a new study in the AGU journal Earth's Future.

Even efforts to use water more efficiently in municipal and industrial sectors won't be enough to stave off shortages, say the authors of the new study. The results suggest that reductions in agricultural water use will probably play the biggest role in limiting future water shortages.

The new study is part of a larger 10-year U.S. Forest Service assessment of renewable resources including timber, rangeland forage, wildlife and water.

"The new study not only provides a best guess of future water supply and demand but also looks at what can we do to lessen projected shortages," said Thomas Brown, of the U.S. Forest Service Rocky Mountain Research Station in Colorado and the study's lead author.

To do that, the researchers used a variety of global climate models to look at future climate scenarios and how they will likely affect water supplies and demands. They also factored in population growth.

On the water supply side, the authors used a water yield model to estimate the amount of water that would become available for use across the country, and modeled how that water would be delivered to in-stream and off-stream uses or stored in reservoirs for future use.

The new study finds climate change and population growth are likely to present serious challenges in some regions of the U.S., notably the central and southern Great Plains, the Southwest and central Rocky Mountain States, and California, and also some areas in the South and the Midwest.

The heart of the new analysis is a comparison of future water supply versus estimated water demand in different water-using sectors, like industry and agriculture.

The study finds continued reductions in per-capita water use rates are likely in most water-use sectors, but will be insufficient to avoid impending water shortages because of the combined effects of population growth and climate change.

The study's authors looked at a variety of adaptive strategies for alleviating projected water shortages, like increasing reservoir storage capacity, pumping more water out of groundwater aquifers, and diverting more water from streams and rivers. Increasing the size of reservoirs does not look promising for fending off water shortages, especially in parts of the U.S. expected to get drier as climate change progresses.

"Where water is the limiting factor, a reservoir enlargement is unlikely to store any water," Brown said.

Further reductions in groundwater reserves and greater diversions of in-stream flows could help alleviate future shortages in many areas but come with serious social and environmental costs. If those costs are to be avoided, improvements in irrigation efficiency will need to become a high priority, and further transfers of water from agriculture to other sectors will likely be essential, the study's authors say.

Brown cautions that people should not read too much into the report regarding their local water supplies. The new study models large watersheds and does not look at what will happen on a city or county scale.

Credit: 
American Geophysical Union

New findings shed light on origin of upright walking in human ancestors

image: This is a fossil hominin talus from site GWM67 (2005) at the time of its discovery.

Image: 
Case Western Reserve University School of Medicine

The oldest distinguishing feature between humans and our ape cousins is our ability to walk on two legs - a trait known as bipedalism. Among mammals, only humans and our ancestors perform this atypical balancing act. New research led by a Case Western Reserve University School of Medicine professor of anatomy provides evidence for greater reliance on terrestrial bipedalism by a human ancestor than previously suggested in the ancient fossil record.

Scott W. Simpson, PhD, led an analysis of a 4.5 million-year-old fragmentary female skeleton of the human ancestor Ardipithecus ramidus that was discovered in the Gona Project study area in the Afar Regional State of Ethiopia.

The newly analyzed fossils document a greater, but far from perfect, adaptation to bipedalism in the Ar. ramidus ankle and hallux (big toe) than previously recognized. "Our research shows that while Ardipithecus was a lousy biped, she was somewhat better than we thought before," said Simpson.

Fossils of this age are rare and represent a poorly known period of human evolution. By documenting more fully the function of the hip, ankle, and foot in Ardipithecus locomotion, Simpson's analysis helps illuminate current understanding of the timing, context, and anatomical details of ancient upright walking.

Previous studies of other Ardipithecus fossils showed that it was capable of terrestrial bipedalism as well as being able to clamber in trees, but lacked the anatomical specializations seen in the Gona fossil examined by Simpson. The new analysis, published in the Journal of Human Evolution, thus points to a diversity of adaptations during the transition to how modern humans walk today. "The fact that Ardipithecus could both walk upright, albeit imperfectly, and scurry in trees marks it out as a pivotal transitional figure in our human lineage," said Simpson.

Key to the adaptation of bipedality are changes in the lower limbs. For example, unlike monkeys and apes, the human big toe is parallel with the other toes, allowing the foot to function as a propulsive lever when walking. While Ardipithecus had an offset grasping big toe useful for climbing in trees, Simpson's analysis shows that it also used its big toe to help propel it forward, demonstrating a mixed, transitional adaptation to terrestrial bipedalism.

Specifically, Simpson looked at the area of the joints between the arch of the foot and the big toe, enabling him to reconstruct the range of motion of the foot. While joint cartilage no longer remains for the Ardipithecus fossil, the surface of the bone has a characteristic texture which shows that it had once been covered by cartilage. "This evidence for cartilage shows that the big toe was used in a more human-like manner to push off," said Simpson. "It is a foot in transition, one that shows primitive, tree-climbing physical characteristics but one that also features a more human-like use of the foot for upright walking." Additionally, when chimpanzees stand, their knees are "outside" the ankle, i.e., they are bow-legged. When humans stand, the knees are directly above the ankle - which Simpson found was also true for the Ardipithecus fossil.

The Gona Project has conducted continuous field research since 1999. The study area is located in the Afar Depression portion of the eastern Africa rift and its fossil-rich deposits span the last 6.3 million years. Gona is best known as documenting the earliest evidence of the Oldowan stone tool technology. The first Ardipithecus ramidus fossils at Gona were discovered in 1999 and described in the journal Nature in 2005. Gona has also documented one of the earliest known human fossil ancestors - dated to 6.3 million years ago. The Gona Project is co-directed by Sileshi Semaw, PhD, a research scientist with the CENIEH research center in Burgos, Spain, and Michael Rogers, PhD, of Southern Connecticut State University. The geological and contextual research for the current research was led by Naomi Levin, PhD, of the University of Michigan, and Jay Quade, PhD, of the University of Arizona.

Credit: 
Case Western Reserve University

Using stardust grains, a new model for nova eruptions

What do tiny specks of silicon carbide stardust, found in meteorites and older than the solar system, have in common with pairs of aging stars prone to eruptions?

A collaboration between two Arizona State University scientists -- cosmochemist Maitrayee Bose and astrophysicist Sumner Starrfield, both of ASU's School of Earth and Space Exploration -- has uncovered the connection and pinpointed the kind of stellar outburst that produced the stardust grains.

Their study has just been published in The Astrophysical Journal.

The microscopic grains of silicon carbide -- a thousand times smaller than the average width of a human hair -- were part of the construction materials that built the Sun and planetary system. Born in nova outbursts, which are repeated cataclysmic eruptions by certain types of white dwarf stars, the silicon carbide grains are found today embedded in primitive meteorites.

"Silicon carbide is one of the most resistant bits found in meteorites," Bose said. "Unlike other elements, these stardust grains have survived unchanged from before the solar system was born."

Violent birth

A star becomes a nova -- a "new star" -- when it suddenly brightens by many magnitudes. Novae occur in pairs of stars where one star is a hot, compact remnant called a white dwarf. The other is a cool giant star so large its extended outer atmosphere feeds gas onto the white dwarf. When enough gas collects on the white dwarf, a thermonuclear eruption ensues, and the star becomes a nova.

Although powerful, the eruption doesn't destroy the white dwarf or its companion, so novae can erupt over and over, repeatedly throwing into space gas and dust grains made in the explosion. From there the dust grains merge with clouds of interstellar gas to become the ingredients of new star systems.

The Sun and solar system were born about 4.6 billion years ago from just such an interstellar cloud, seeded with dust grains from earlier stellar eruptions by many different kinds of stars. Almost all the original grains were consumed in making the Sun and planets, yet a tiny fraction remained. Today these bits of stardust, or presolar grains, can be identified in primitive solar system materials such as chondritic meteorites.

"The key that unlocked this for us was the isotopic composition of the stardust grains," Bose said. Isotopes are varieties of chemical elements that have extra neutrons in their nuclei. "Isotopic analysis lets us trace the raw materials that came together to form the solar system."

She added, "Each silicon carbide grain carries a signature of the isotopic composition of its parent star. This provides a probe of that star's nucleosynthesis -- how it made elements."

Bose collected published data on thousands of grains, and found that nearly all the grains grouped naturally into three main categories, each attributable to one kind of star or another.

But there were about 30 grains that couldn't be traced back to a particular stellar origin. In the original analyses, these grains were flagged as possibly originating in nova explosions.

But did they?

Making stardust

As a theoretical astrophysicist, Starrfield uses computer calculations and simulations to study various kinds of stellar explosions. These include novae, recurrent novae, X-ray bursts, and supernovae.

Working with other astrophysicists, he was developing a computer model to explain the ejected materials seen in the spectrum of a nova discovered in 2015. Then he attended a colloquium talk given by Bose before she had joined the faculty.

"I would not have pursued this if I hadn't heard Maitrayee's talk and then had our follow-up discussion," he said. That drew him deeper into the details of nova eruptions in general and what presolar grains could say about these explosions that threw them into space.

A problem soon arose. "After talking with her," Starrfield said, "I discovered our initial way of solving the problem was not agreeing with either the astronomical observations or her results.

"So I had to figure out a way to get around this."

He turned to multidimensional studies of classical nova explosions, and put together a wholly new way of doing the model calculations.

There are two major composition classes of nova, Starrfield said. "One is the oxygen-neon class which I've been working on for 20 years. The other is the carbon-oxygen class which I had not devoted as much attention to." The class designations for novae come from the elements seen in their spectra.

"The carbon-oxygen kind produce a lot of dust as part of the explosion itself," Starrfield said. "The idea is that the nova explosion reaches down into the white dwarf's carbon-oxygen core, bringing up all these enhanced and enriched elements into a region with high temperatures."

That, he said, can drive a much bigger explosion, adding, "It's really messy. It shoots out dust in tendrils, sheets, jets, blobs, and clumps."

Starrfield's calculations made predictions of 35 isotopes, including those of carbon, nitrogen, silicon, sulfur, and aluminum, that would be created by the carbon-oxygen nova outbursts.

It turned out that getting the right proportion of white dwarf core material and accreted material from the companion star was absolutely necessary for the simulations to work. Bose and Starrfield then compared the predictions with the published compositions of the silicon carbide grains.

This led them to a somewhat surprising conclusion. Said Bose, "We found that only five of the roughly 30 grains could have come from novae."

While this may seem a disappointing result, the scientists were actually pleased. Bose said, "Now we have to explain the compositions of the grains that didn't come from nova outbursts. This means there's a completely new stellar source or sources to be discovered."

And looking at the larger picture, she added, "We have also found that astronomical observations, computer simulations, and high-precision laboratory measurements of stardust grains are all needed if we want to understand how stars evolve. And this is exactly the kind of interdisciplinary science that the school excels at."

Credit: 
Arizona State University

Blood test could give two month warning of kidney transplant rejection

New research from the NIHR Guy's and St Thomas' Biomedical Research Centre has found a way to predict rejection of a kidney transplant before it happens, by monitoring the immune system of transplant patients.

The researchers have found that a signature combination of seven immune genes in blood samples can predict rejection earlier than current techniques. Monitoring these markers in transplant patients with regular blood tests could help doctors intervene before any damage to the organ occurs, and improve outcomes for patients.

A renal transplant offers the best treatment for patients whose kidneys have failed, with around 3,000 carried out annually in the UK. Acute rejection occurs when the body's immune system begins to attack the donated organ. This is a common complication in the first year after the transplant, affecting around 2 in 10 patients. It can affect the lifespan of the transplanted organ.

Currently, acute rejection can only be confirmed by taking a biopsy of the transplanted organ. While acute rejection can be treated, this can only be done when the organ is already affected and damage has already occurred.

Once the new technique is validated further, it has the potential to offer clinicians the use of a simple blood test to predict rejection. Being able to intervene before the event will help prevent damage to patients, and extend the life of the transplanted organ.

Dr Paramit Chowdhury, a consultant nephrologist at Guy's and St Thomas' and author on the paper said: "This advance could make a huge difference to our ability to monitor kidney transplant patients and treat rejection earlier. It may also save some patients from an unnecessary biopsy. It is a first step in getting a better insight into the status of a patient's immune system, allowing better tailoring of the patient's anti-rejection treatment.

"A big challenge at the moment is that even the best transplanted organ has a limited lifespan of up to 30 years. By being able to pick up signs of rejection early, we might increase the lifespan of the organ and help patients have a better quality of life, for longer."

The team recruited 455 patients who received a kidney transplant at Guy's Hospital and followed these patients over the first year of their transplant, collecting regular blood and urine samples. Using these samples and analysing the data over time, they developed a signature combination of seven genes that differentiated patients who developed rejection from those who did not.

They then tested for the signature via a blood test in a separate cohort of patients, and validated that it predicted transplant rejection.

The team also identified a six-gene signature for a less common complication. BK-virus nephropathy can look clinically similar to acute rejection, but requires a very different therapy - reducing immunosuppression. Being able to distinguish between these complications would allow clinicians to ensure that patients receive the most appropriate treatment.

Dr Maria Hernandez Fuentes, visiting senior lecturer at King's College London and author on the study, said: "Biomarkers are naturally occurring genes or proteins that appear in the blood, which can tell us what is happening in the body. This is vital in determining the best course of treatment for patients. We were able to monitor the genes that were being expressed in transplant patients and map how these reflected their clinical outcomes.

"Being able to tell the difference between BK-virus nephropathy and acute rejection, which can look very similar in patients, just shows how we can use these molecular techniques to complement clinical practice.

"Further evaluation will be needed to fully validate the technique is reliable enough for clinical use, and it will be exciting to develop this research further."

The research is published in the journal EBioMedicine. It was supported by the National Institute for Health Research Biomedical Research Centre at Guy's and St Thomas' and King's College London, the Medical Research Council Centre for Transplantation and the EU (Framework Programme 7). Anonymised clinical data was also provided by the NIHR Health Informatics Collaborative.

Credit: 
NIHR Biomedical Research Centre at Guy’s and St Thomas’ and King’s College London

Could medical marijuana help grandma and grandpa with their ailments?

MINNEAPOLIS - Medical marijuana may bring relief to older people who have symptoms like pain, sleep disorders or anxiety due to chronic conditions including amyotrophic lateral sclerosis, Parkinson's disease, neuropathy, spinal cord damage and multiple sclerosis, according to a preliminary study released today that will be presented at the American Academy of Neurology's 71st Annual Meeting in Philadelphia, May 4 to 10, 2019. The study found not only that medical marijuana may be safe and effective, but also that one-third of participants reduced their use of opioids. However, the study was retrospective and relied on participants' own reports of symptom relief, so a placebo effect may have played a role. Additional randomized, placebo-controlled studies are needed.

According to the Centers for Disease Control and Prevention, approximately 80 percent of older adults have at least one chronic health condition.

"With legalization in many states, medical marijuana has become a popular treatment option among people with chronic diseases and disorders, yet there is limited research, especially in older people," said study author Laszlo Mechtler, MD, of Dent Neurologic Institute in Buffalo, N.Y., and a Fellow of the American Academy of Neurology. "Our findings are promising and can help fuel further research into medical marijuana as an additional option for this group of people who often have chronic conditions."

The study involved 204 people with an average age of 81 who were enrolled in New York State's Medical Marijuana Program. Participants took various ratios of tetrahydrocannabinol (THC) to cannabidiol (CBD), the main active chemicals in medical marijuana, for an average of four months and had regular checkups. The medical marijuana was taken by mouth as a liquid extract tincture, capsule or in an electronic vaporizer.

Initially, 34 percent of participants had side effects from the medical marijuana. After an adjustment in dosage, only 21 percent reported side effects. The most common side effects were sleepiness in 13 percent of patients, balance problems in 7 percent and gastrointestinal disturbances in 7 percent. Three percent of the participants stopped taking the medical marijuana due to the side effects. Researchers said a ratio of one-to-one THC to CBD was the most common ratio among people who reported no side effects.

Researchers found that 69 percent of participants experienced some symptom relief. Among those, the most common improvements were in pain (49 percent experiencing relief), sleep symptoms (18 percent), neuropathy (15 percent) and anxiety (10 percent).

Opioid pain medication was reduced in 32 percent of participants.

"Our findings show that medical marijuana is well-tolerated in people age 75 and older and may improve symptoms like chronic pain and anxiety," said Mechtler. "Future research should focus on symptoms like sleepiness and balance problems, as well as efficacy and optimal dosing."

Credit: 
American Academy of Neurology