Study: US tornado frequency shifting eastward from Great Plains

Image: Victor Gensini. Credit: Northern Illinois University

DeKalb, IL - A new study finds that over the past four decades, tornado frequency has increased over a large swath of the Midwest and Southeast and decreased in portions of the central and southern Great Plains, a region traditionally associated with Tornado Alley.

The study, by meteorology professor Victor Gensini of Northern Illinois University and Harold Brooks of NOAA's National Severe Storms Laboratory in Norman, Okla., found significant decreasing trends in frequencies of both tornado reports and tornado environments over portions of Texas, Oklahoma and northeast Colorado.

Tornado Alley remains the top zone for tornadoes in the United States, but other areas, including the so-called Dixie Alley that includes much of the lower Mississippi Valley region, are catching up.

The researchers identified significant increasing trends of tornado reports and tornado environments in portions of Mississippi, Alabama, Arkansas, Missouri, Illinois, Indiana, Tennessee and Kentucky.

"Regions in the Southeast and Midwest are closing the gap when it comes to the number of tornado reports," said Gensini, who led the study published Oct. 17 in the Nature partner journal, Climate and Atmospheric Science.

"It's not that Texas and Oklahoma do not get tornadoes," Gensini said. "They're still the number one location in terms of tornado frequency, but the trend in many locations is down over the past 40 years."

The study examined tornado frequency trends in fine-scale resolution using two separate approaches, Gensini said.

The researchers tracked the number of tornado reports from 1979 to 2017, while also investigating regional trends in the daily frequency of tornado-environment formation over the same time period, using an index known as the Significant Tornado Parameter (STP). Frequently used for predicting severe weather, the index captures the coexistence of atmospheric ingredients favorable for producing tornadoes.
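The STP described above combines several atmospheric ingredients into a single number. As a rough illustration, here is a minimal Python sketch of one published fixed-layer formulation of the index; the normalizing constants and caps follow the version used operationally by the US Storm Prediction Center, and should be treated as assumptions of this sketch rather than details taken from the study.

```python
def significant_tornado_parameter(cape, lcl, srh, shear):
    """Fixed-layer Significant Tornado Parameter (one published formulation).

    cape  : mixed-layer CAPE (J/kg)
    lcl   : lifted condensation level height (m)
    srh   : 0-1 km storm-relative helicity (m^2/s^2)
    shear : 0-6 km bulk wind shear (m/s)
    """
    cape_term = cape / 1500.0

    # LCL term: capped at 1 below 1000 m, set to 0 above 2000 m
    if lcl < 1000.0:
        lcl_term = 1.0
    elif lcl > 2000.0:
        lcl_term = 0.0
    else:
        lcl_term = (2000.0 - lcl) / 1000.0

    srh_term = srh / 150.0

    # Shear term: zero below 12.5 m/s, capped at 30 m/s
    if shear < 12.5:
        shear_term = 0.0
    else:
        shear_term = min(shear, 30.0) / 20.0

    return cape_term * lcl_term * srh_term * shear_term
```

Values near or above 1 flag environments historically associated with significant tornadoes, which is why counting days with high STP serves as a people-independent proxy for tornado frequency.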

Both the number of actual tornado reports and the historical STP analysis showed the eastward uptick in tornado frequency.

"One could argue that because a region's population has increased, more tornadoes are sighted and reported," Gensini said. "But we also identified this eastward trend when using the STP index, which looks at the frequency of tornado environments and has nothing to do with people. This increases our confidence in the reporting trend that we're seeing."

The trend is important for understanding the potential for future tornado exposure, damage and casualties. Severe thunderstorms accompanied by tornadoes, hail and damaging winds cause an average of $5.4 billion of damage each year across the United States, and events with $10 billion or more in damages are no longer uncommon.

Previous research, including 2007 and 2016 studies by NIU professor Walker Ashley, has identified the Southeast as particularly vulnerable to tornadoes. Because of factors such as longer and larger tornado paths, expanding population density, mobile-home density and higher nighttime tornado probabilities, most tornado fatalities occur in the Southeast, particularly the mid-South region.

"We've shown the tornado frequency trend is increasing in the Midwest and Southeast," Gensini said. "While tornadoes can happen in all 50 states, if more tornadoes are happening in your area, you're more susceptible to one of these disasters.

"This could be taken into consideration when adopting building codes, identifying potentially impacted community assets, creating awareness and making emergency preparations," he added.

The researchers cannot say for sure whether the eastward shift in tornado reports and environments might be caused by natural or human-induced climate change.

"Clearly, there is a climate change signal here," Gensini said. "What's causing the change is still an open question."

Credit: Northern Illinois University

Researchers identify new approach for controlling dengue fever and Zika virus

Image: Alexander Raikhel in his office at UC Riverside. Credit: I. Pittalwala, UC Riverside

RIVERSIDE, Calif. -- Mosquitoes are the world's deadliest animals, killing thousands of people and causing millions of illnesses each year. To be able to reproduce and become effective disease carriers, mosquitoes must first attain optimal body size and nutritional status. 

A pair of researchers at the University of California, Riverside, have succeeded in using CRISPR-Cas9, a powerful tool for altering DNA sequences and modifying gene function, to decrease mosquito body size, moving the research one step closer to eliminating mosquitoes that carry dengue fever and Zika virus.

The researchers succeeded in postponing mosquito development, shortening the animal's lifespan, retarding egg development, and diminishing fat accumulation.

Alexander Raikhel, a distinguished professor of entomology, and Lin Ling, a postdoctoral scholar working with Raikhel, used CRISPR-Cas9 to disrupt the serotonin receptor Aa5HT2B in Aedes aegypti mosquitoes, the vectors of dengue fever, yellow fever, and Zika virus.

"Aa5HT2B controls insulin-like peptides," Raikhel said. "We were able to uncover the different roles that these peptides play in controlling body size and metabolism, and disrupt the gene associated with this receptor."

The team accomplished this, Raikhel said, by uncovering a key molecular pathway determining mosquito body size and metabolism.

"Mosquitoes of small size with diminished fat resources mature later and live shorter lives than nonmodified mosquitoes," he said. "Thus, these genetically engineered mosquitoes have low reproductive capacity and ability to transmit disease pathogens. These features of CRISPR-Cas9 mutant mosquitoes can be exploited for developing novel mosquito control approaches. Many challenges remain on the road, however, toward achieving this goal."

Study results appear in the Proceedings of the National Academy of Sciences.

Raikhel, the UC Presidential Chair and the Mir Mulla Endowed Chair in the Department of Entomology and a member of the National Academy of Sciences, explained that disease-transmitting female mosquitoes require a vertebrate blood meal to produce their eggs because egg development occurs only after a diet change from carbohydrate-rich nectar to protein-rich vertebrate blood.

Blood feeding, Raikhel added, boosts serotonin concentration and increases the level of the serotonin receptor Aa5HT2B in the "fat-body," the insect analog of vertebrate liver and adipose tissue. A target for hormones, the fat-body is the main nutrient sensor in insects. It links nutritional state, metabolism, and growth.

"Our study provides for the first time a link -- the serotonin receptor Aa5HT2B -- between blood feeding and the serotonin signaling that is specific to the fat-body," he said. "Aa5HT2B mediates serotonin action. Until now, the mechanisms of serotonin action specific to the fat-body were poorly understood. Understanding regulatory mechanisms that underlie determination of body size and metabolism is important for developing novel approaches to control mosquito populations and the diseases they carry."

One important question for further research is how CRISPR-Cas9 gene modification could be introduced into the wild mosquito population.

"This question is a topic of intense research in other laboratories," Raikhel said. "At UCR, we are continuing our efforts in identifying other key processes important for mosquito development that could be exploited for mosquito control."

Credit: University of California - Riverside

Penetrating the soil's surface with radar

Image: Algeo stands on the left with a tablet while his advisor, Lee Slater, drags ground penetrating radar equipment over the soil's surface. Credit: Chris Watts, Rothamsted Research

Ground penetrating radar isn't something from the latest sci-fi movie. It's actually a tool used by soil scientists to measure the amount of moisture in soil quickly and easily.

As with most technologies, it is getting better and new ways to use it are being tested. Jonathan Algeo, a graduate student at Rutgers University, has spent his studies making ground penetrating radar better for different uses, such as measuring soil moisture.

"It's a very common tool in research, agriculture, engineering, and the military for looking at buried objects and measuring water content," Algeo explains. "One of its main benefits is that it is very fast. One example is a tool with a wheel that allows the radar to take measurements as you drag it along the ground. In this way, you can very quickly take measurements across a large field or a line that's miles in length. Radar can be used quickly over a large area to answer many different questions."

The technology can be used to find underground tunnels, bedrock, or cracks in the metal supports of a bridge. In terms of soil, the questions can vary. How much water is near the surface? How does it vary throughout a field site? The near-surface water content can affect climate, so it's important for computer-based climate models as well.

Being able to measure soil moisture in a field can allow farmers to optimize water usage so they aren't using too much or too little, especially in dry areas where water is limited. Looking at the very shallow subsurface allows farmers to test the efficiency of their irrigation systems.

How does it do this? "Ground penetrating radar uses two antennae. One puts out a signal and another receives it," Algeo says. "The outgoing signal is similar to a microwave or cell phone signal. That signal travels in all directions, but most of the energy is directed into the ground. When there is a buried object or a change in material, the radar signal reflects back to the surface, where it is picked up by the other antenna."

He adds that when there is more water in the soil, the waves move slower. When there is less water, they move faster. A scientist can use information the antennae collect from the waves to estimate the water content of the soil.
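The velocity-to-moisture step Algeo describes can be illustrated with a short calculation. The sketch below assumes a reflector of known depth, converts the measured two-way travel time into a wave velocity and a dielectric permittivity, and then applies the widely cited Topp empirical relation between permittivity and volumetric water content. The function name and the scenario are illustrative, not taken from the study.

```python
C = 0.2998  # speed of light in free space, m/ns


def water_content_from_travel_time(depth_m, twt_ns):
    """Estimate volumetric water content from a GPR reflection.

    depth_m : known depth of a buried reflector (m)
    twt_ns  : measured two-way travel time of the reflection (ns)
    """
    velocity = 2.0 * depth_m / twt_ns     # wave speed in the soil, m/ns
    eps_r = (C / velocity) ** 2           # relative dielectric permittivity

    # Topp et al. (1980) empirical relation between permittivity and
    # volumetric water content, valid for many mineral soils
    theta = (-5.3e-2 + 2.92e-2 * eps_r
             - 5.5e-4 * eps_r ** 2 + 4.3e-6 * eps_r ** 3)
    return theta
```

Because wetter soil slows the wave, a longer travel time to the same reflector translates into a higher permittivity and a higher estimated water content, which is exactly the behavior the paragraph above describes.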

The equations and methods researchers use to estimate water content come in many different forms. Algeo's recent research tested which ones were best at estimating water content. The equations analyze the early time signal. These are the first radar waves to get back to the receiving antenna after going through just the top of the soil. The strength of this signal changes based on the water content of the top of the soil. It can be measured even in clay-rich soils where radar wouldn't normally be helpful.

Algeo and his team compared two methods of calculating a value for the early time signal to determine which, if either, was better at tracking changes in soil moisture. They found both methods were successful. This gives researchers the ability to quickly estimate water content across large field sites.

"In order for a method to get widespread use in industry, it needs to be proven beyond doubt by researchers like us," Algeo says. "We are trying to figure out all the details of where, how, and when early time signal analysis is most useful. This means users of ground penetrating radar will have another tool in their toolbox when they are trying to quickly measure subsurface water content."

"Ground penetrating radar is my favorite geophysical tool because we can get such a wide variety of information from the subsurface with it," he adds. "If there is a question about the subsurface, chances are it will be able to give you some insight into what's going on."

Credit: American Society of Agronomy

Adolescent THC exposure alters neurons/gene networks associated with psychosis risk

Corresponding Author: Yasmin Hurd, PhD, Director of The Addiction Institute of Mount Sinai, Icahn School of Medicine at Mount Sinai, New York; the paper includes additional coauthors.

Bottom Line: Young adults with exposure to THC (the psychoactive component of cannabis) during adolescence have alterations in the structure of neurons and gene expression within these brain cells (which are critical for maintaining synaptic plasticity) in the prefrontal cortex, a brain region that mediates decision-making and other cognitive functions.

Results: Adolescent THC exposure reduces the branching of prefrontal cortical neurons and the number of spines, which are critical for cellular communication. This adolescent exposure is also associated with a reorganization of the gene expression of specific genes that are predominantly related to neuron development, synaptic plasticity and chromatin organization (epigenetic mechanisms). The gene networks affected by THC exposure mimicked those networks observed to be impaired in the prefrontal cortex of individuals diagnosed with schizophrenia.

Why the Research Is Interesting: The findings demonstrate that adolescent THC exposure can induce long-term structural changes, thus altering the developmental trajectory of adult cortical cells along with altering gene networks that are similarly disturbed in individuals suffering from schizophrenia.

Who: Animal model with exposure to THC, the psychoactive ingredient of cannabis. Analysis of a gene expression database of human schizophrenia patients.

When: Animals were exposed during adolescence and their brains studied into adulthood.

What: The study measured gene expression and the structure of neurons in the prefrontal cortex.

How: Cells were examined under a microscope using a computerized system to determine the shape of the neurons in the prefrontal cortex. A laser was used to specifically capture neurons in the prefrontal cortex and the cells were sequenced to determine the expression of genes. Computational analysis was used to compare the gene expression networks in the animal model and those of individuals with schizophrenia.

Study Conclusions: Adolescent THC exposure reduced the structural complexity of cortical neurons and associated genes that regulate the development of neurons. These were accompanied by significant changes in genes related to the epigenetic mechanisms which regulate DNA openness and chromatin structure that determines whether genes are turned "on" or "off." Moreover, the gene expression networks that were altered were similar to those observed to be impaired in the prefrontal cortex of human subjects with schizophrenia, meaning that adolescent THC exposure may alter psychiatric vulnerability, particularly in individuals with overlapping genetic disturbances within THC-sensitive gene networks.

Paper Title: Adolescent exposure to Δ9-tetrahydrocannabinol alters the transcriptional trajectory and dendritic architecture of prefrontal pyramidal neurons

Said Mount Sinai's Dr. Yasmin Hurd of the research:

The study emphasizes that cannabis, particularly THC-prominent strains, has the capacity for long-term effects into adulthood, even after the drug is no longer in the body. These findings have important implications for the changing sociopolitical discussions regarding the recreational use of marijuana. The ability of THC to change the actual shape of developing neurons that are well-known to be essential for normal cortical communication is alarming. This emphasizes that even a drug that is not considered to be very harmful can alter the sensitivity of critical brain regions during adolescent development and, in particular, change the sensitivity of gene networks relevant to psychosis risk. More education is needed to inform teens about this and about other drugs that can impact the trajectory of the developing adolescent brain.

To request a copy of the paper or to schedule an interview with Dr. Yasmin Hurd, please contact Mount Sinai's Director of Media and Public Affairs, Elizabeth Dowling, at elizabeth.dowling@mountsinai.org or at 212-241-9200.

Journal: Molecular Psychiatry

Credit: The Mount Sinai Hospital / Mount Sinai School of Medicine

Study examines aspects of conscientious objection among nurses

One-on-one interviews with eight nurses in Ontario revealed that nurses making conscientious objections to ethically relevant policies lack concrete supports and need protection in healthcare practice settings.

The authors of the Journal of Advanced Nursing study noted that healthcare practice is becoming more ethically complex, and nurses need to be able to address issues of conscience about care with which they ethically disagree. For Canadian nurses, this need has recently been heightened by the national legalization of euthanasia, known in Canada as Medical Assistance in Dying.

"The situation of euthanasia in Canada highlights the need to attend to the freedom of conscience for nurses and all healthcare professionals. As voiced by nurses in this study who are experiencing conflicts of conscience in professional settings, freedom of conscience clauses are paramount to supporting healthcare professionals to practice with clear boundaries in supportive workplace environments where their conscientious objections are inclusively respected," said primary author Dr. Christina Lamb. "Importantly, conversations about conscience and conscientious objection need to occur on a routine basis in healthcare practice, so that respect for the human right to conscience becomes a forerunner in ethical conversations for healthcare professionals."

Credit: Wiley

For-profit nursing home residents more likely to be diagnosed with neglect issues

Residents receiving care in for-profit nursing homes are almost twice as likely to experience health issues caused by substandard care compared with clients living in not-for-profit facilities or in homes in the community, according to a new report in the journal Gerontology.

The researchers, led by Lee Friedman, associate professor of environmental and occupational health sciences in the University of Illinois at Chicago School of Public Health, also found that community-dwelling adults 60 years old and older who need assistance with tasks of daily living but do not live in a nursing home had the fewest clinical signs of neglect compared with those living in any type of nursing facility.

"We saw more -- and more serious -- diagnoses among residents of for-profit facilities that were consistent with severe clinical signs of neglect, including severe dehydration in clients with feeding tubes which should have been managed, clients with stage 3 and 4 bed sores, broken catheters and feeding tubes, and clients whose medication for chronic conditions was not being managed properly," Friedman said.

Previous studies have demonstrated that clinically diagnosed signs of neglect are more prevalent among residents of for-profit nursing homes compared with not-for-profit facilities, but these studies have focused on individual clinical signs, such as bed sores or injuries. Because these clinical signs rarely occur in isolation, these past studies likely underestimated the population of residents experiencing serious adverse health effects due to neglect.

Friedman and his colleagues looked at medical records for 1,149 patients aged 60 and older identified from five greater Chicago metropolitan area hospitals that serve about 10 percent of all patients in Illinois. Patients included in the study were seen at these hospitals between 2007 and 2011 for issues ranging from mild to severe that could be related to poor quality care. The researchers assessed the relationship between residence type -- community-dwelling, not-for-profit facility, for-profit facility -- and clinical signs of neglect. Community-dwelling residents live in private homes, often with family members or friends.

The researchers used the Clinical Signs of Neglect Scale (CSNS) -- a scale developed by Friedman and his colleagues -- to quantify health problems related to substandard care and health outcomes among individuals they identified living in private homes, nonprofit nursing homes and for-profit nursing homes. The scale lists about two dozen conditions, ranging from constipation and dehydration to more serious issues such as severe bed sores and broken catheter tubes.

"Substandard care is a form of neglect and falls within the definition of elder abuse," Friedman said. "We have a growing number of people who need services provided by nursing facilities, but the reality is that a third of nursing homes in Illinois receive below-average ratings by the Centers for Medicare & Medicaid Services. Substandard care puts residents at great risk for serious health issues."

The study by Friedman showed that residents of for-profit nursing facilities were diagnosed with more clinical signs of neglect and that these facilities were consistently inferior to not-for-profit nursing homes across numerous staffing, capacity and deficiency measures.

"For-profit nursing facilities pay their high-level administrators more, and so the people actually providing the care are paid less than those working at nonprofit places," he said. "So staff at for-profit facilities are underpaid and need to take care of more residents, which leads to low morale for staff, and it's the residents who suffer."

"More oversight of these facilities, both for-profit and not-for-profit, needs to occur together with improved screening and reporting of suspected cases of neglect by all parties," Friedman said.

"There needs to be better staffing and training for enforcing these measures. Performance improvement programs and quality assurance and assessment committees, tighter adherence to federal law by Central Management Services that ties Medicare and Medicaid reimbursement with quality of care, and pressure from insurance providers to limit costly outcomes could help reduce the unfortunate diagnoses we saw in our study."

Credit: University of Illinois Chicago

Why heart contractions are weaker in those with hypertrophic cardiomyopathy

When a young athlete dies of sudden cardiac arrest, chances are high that they suffered from familial hypertrophic cardiomyopathy (HCM). It is the most common genetic heart disease in the US and affects an estimated 1 in 500 people around the world. A protein called myosin acts as the molecular motor that makes the heart muscle contract. Researchers had suspected for some time that the R403Q mutation in one of the myosin genes is among those that play a role in causing HCM, but experiments using mouse models failed to show that this was the case. (Mice are often used in experiments because their behaviour, biology and genetic material resemble those of humans.)

An international team led by Professor Dilson Rassier from McGill's Department of Kinesiology and Physical Education has discovered, by working with transgenic rabbits carrying the R403Q mutation, that in these rabbits, individual myosin molecules and myofibrils (the basic rod-like filaments inside muscles) produce less force and a lower maximum velocity of contraction than those isolated from healthy hearts.

They reached this conclusion by using advanced techniques such as atomic force microscopy and molecular motility assays (which allow them to visualize the movements of these proteins in vitro) to look more closely at what was going on within myosin molecules and also within myofibrils.

"It's been difficult to gain a clear picture of what is going on within the myosin proteins with this mutation, simply because of the technical and experimental limitations of looking closely at objects of this minute size and measuring their force and motility (myosin molecules are about 19 nanometres, or 0.0000019 centimetres, long)," explains Rassier. "The results should help clinicians develop drugs and chemicals that target this specific function of myosin in the future."

Credit: McGill University

Top athletes weigh in on perceived effectiveness of anti-doping measures

When trying to determine how best to deter doping in competitive sports, who better to ask than the athletes themselves? A first-of-its-kind study in Frontiers in Psychology did precisely that by asking top-level German cyclists and field athletes to rate which anti-doping methods they perceived as the most effective. The athletes identified improved detection and diagnostics, increased bans for offenders, and anti-doping laws that make doping a criminal offense as the most important methods. Increased fines and leniency programs for offenders who cooperate in identifying other offending athletes were ranked as far less effective.

"Currently, hundreds of millions of US dollars are invested annually to fight doping -- and top athletes have to suffer enormous cuts in their private lives in order to carry out anti-doping controls in accordance with the rules," says Dr. Daniel Westmattelmann of the University of Münster in Germany. "But despite the numerous publications in the field of anti-doping, the effectiveness of the implemented measures remains largely unknown."

No reliable methods currently exist for determining just how prevalent doping is, but estimates range anywhere from 1% to more than 60%. Since its founding in 1999, the World Anti-Doping Agency has been attempting to fight doping primarily with deterrents such as the threat of detection and subsequent severe fines and bans.

Despite this, doping remains an ongoing problem -- causing many to ask whether these strategies are really working. While many researchers have explored aspects of this problem, Westmattelmann and his colleagues are the first to ask athletes themselves which of the many available strategies they believe are the most effective.

"Athletes should be at the center of the anti-doping fight and they are probably the best at assessing anti-doping measures," explains Westmattelmann.

The study included 42 professional cyclists and 104 track and field athletes in Germany. The athletes completed an online questionnaire that asked them to rank 14 different anti-doping measures on a scale of one to five.

The results show for the first time that athletes rank some measures, such as stricter controls and harsher punishments, as far more effective than others. The study also showed no significant differences between how male and female athletes responded, although there were marginal sport-specific differences.

Although these results are limited to German athletes, the authors hope such questionnaires will be used to poll athletes worldwide -- and that these insights will guide future anti-doping regulations.

"The results are based on self-reported and subjective data from only German athletes in two sports, but it would be very interesting to carry out equivalent studies in other countries," says Westmattelmann. "Anti-doping organizations can take this knowledge into account for the allocation of anti-doping budgets."

Credit: Frontiers

Factors linked with wellbeing and medication adherence in young adults with kidney failure

Washington, DC (October 16, 2018) -- A new study evaluates important aspects of psychological health in young adults with kidney failure. The findings, which appear in an upcoming issue of the Clinical Journal of the American Society of Nephrology (CJASN), point to the need for additional efforts to address the wellbeing of these patients.

In addition to affecting their physical health, kidney failure affects the psychosocial health of young people. With this in mind, Alexander Hamilton, MD (University of Bristol, UK) and his colleagues conducted a study to determine which factors influence mental wellbeing and medication adherence in young adults who have received a kidney transplant or are undergoing dialysis.

After analyzing data from the UK Renal Registry and online surveys completed by 417 young adults in the UK with transplants and 173 on dialysis, the investigators found that wellbeing was positively associated with extraversion, openness, independence, and social support, and negatively associated with neuroticism, negative body image, stigma, psychological morbidity, and dialysis. Higher medication adherence was associated with living with parents, conscientiousness, physician access satisfaction, patient activation, age, and male sex, and lower adherence with comorbidity, dialysis, education, ethnicity, and psychological morbidity.

"Worse outcomes for mental wellbeing and medication adherence were both associated with psychological morbidity and dialysis treatment, whereas social support and living with parents were associated with better outcomes. These findings are important because mental health problems appear under-recognized and may be treatable," said Dr. Hamilton. "Our results suggest a possible role for routine measurement of psychological health in young people, to avoid missing opportunities to identify and improve mental health. This could help identify those at higher risk of poor outcomes for close monitoring, greater psychosocial support, or targeted interventions."

Dr. Hamilton added that there has been much focus on programs to improve the transition from pediatric to adult care for kidney failure patients. "It is vital to understand which factors influence wellbeing and medication adherence, because by defining these we can seek interventions to improve areas of deficit," he said. "These areas really matter to patients."

In an accompanying Caregiver Perspective, Pam Duquette and Kelly Helm note that "clinics must create an environment where psychological health is consistently monitored and addressed, and patients and their caregivers are given tools to advocate for themselves. This study is a wonderful start and can be used as a stepping stone to further understand how a patient's environment affects their future care."

Also, an accompanying Patient Perspective by Amanda Grandinetti, MPH, stresses that it is critical to differentiate the nephrology care that adolescents and young adults receive from that of the general adult population.

Credit: American Society of Nephrology

The science of sustainability

Image: Land meets sea, Uruma City, Japan. Credit: © Ryo Yoshitake

The U.S. city of Louisville, Kentucky isn't known as a hotbed of environmental action and innovation, but that could change as it has recently become home to a first-of-its-kind collaboration between environmentalists, city leaders and public health professionals. The Green Heart Project, funded in part by the United States National Institutes of Health, will plant trees in neighborhoods throughout the city and monitor how they affect residents' health. It's a boundary-pushing medical trial--a controlled study of nature as a medical intervention.

Green Heart is just one project in one city, but it represents a new way of thinking about the role of conservation in solving human problems. It is part of an emerging model for cross-sector collaboration that aims to create a world ready for the sustainability challenges ahead.

Is this world possible? Here, we present a new science-based view that says "Yes"--but it will require new forms of collaboration across traditionally disconnected sectors, and on a near unprecedented scale.

Many assume that economic interests and environmental interests are in conflict. But new research makes the case that this perception of development vs. conservation is not just unnecessary but actively counterproductive to both ends. Achieving a sustainable future depends on our ability to secure both thriving human communities and abundant, healthy natural ecosystems.

The Nature Conservancy partnered with the University of Minnesota, CIRES at the University of Colorado Boulder, and 11 other organizations to ask whether it is possible to achieve a future where the needs of both people and nature are advanced. Can we actually meet people's needs for food, water and energy while doing more to protect nature?

A False Choice

To answer this question, we compared what the world will look like in 2050 if economic and human development progress in a "business-as-usual" fashion and what it would look like if instead we join forces to implement a "sustainable" path with a series of fair-minded and technologically viable solutions to the challenges that lie ahead.

In both options, we used leading projections of population growth and gross domestic product to estimate how demand for food, energy and water will evolve between 2010 and 2050. Under business-as-usual, we played out existing expectations and trends in how those changes will impact land use, water use, air quality, climate, protected habitat areas and ocean fisheries. In the more sustainable scenario, we proposed changes to how and where food and energy are produced, asking if these adjustments could result in better outcomes for the same elements of human well-being and nature. Our full findings are described in a peer-reviewed paper--"An Attainable Global Vision for Conservation and Human Well-Being"--published in Frontiers in Ecology and the Environment.

These scenarios let us ask, can we do better? Can we design a future that meets people's needs without further degrading nature in the process?

Our answer is "yes," but it comes with several big "ifs." There is a path to get there, but matters are urgent--if we want to accomplish these goals by mid-century, we'll have to dramatically ramp up our efforts now. The next decade is critical.

Furthermore, changing course in the next ten years will require global collaboration on a scale not seen perhaps since World War II. The widely held impression that economic and environmental goals are mutually exclusive has contributed to a lack of connection among key societal constituencies best equipped to solve interconnected problems--namely, the public health, development, financial and conservation communities. This has to change.

The good news is that protecting nature and providing water, food and energy to a growing world do not have to be either-or propositions. Our view, instead, calls for smart energy, water, air, health and ecosystem initiatives that balance the needs of economic growth and resource conservation equally. Rather than a zero-sum game, these elements are balanced sides of an equation, revealing the path to a future where people and nature thrive together.

Two Paths to 2050

This vision is not a wholesale departure from what others have offered. A number of prominent scientists and organizations have put forward important and thoughtful views for a sustainable future; but often such plans consider the needs of people and nature in isolation from one another, use analyses confined to limited sectors or geographies, or assume that some hard tradeoffs must be made, such as slowing global population growth, taking a reduction in GDP growth or shifting diets off of meat. Our new research considers global economic development and conservation needs together, more holistically, in order to find a sustainable path forward.

What could a different future look like? We've used as our standard the United Nations' Sustainable Development Goals (SDGs), a set of 17 measures for "a world where all people are fed, healthy, employed, educated, empowered and thriving, but not at the expense of other life on Earth." Our analysis directly aligns with ten of those goals. Using the SDGs as our guideposts, we imagine a world in 2050 that looks very different from the one we have today--and drastically different from the one we will face if we continue in business-as-usual fashion.

To create our assessment of business-as-usual versus a more sustainable path, we looked at 14 measurements including temperature change, carbon dioxide levels, air pollution, water consumption, food and energy footprints, and protected areas.

Over the next 30 years, we know we'll face rapid population growth and greater pressures on our natural resources. The statistics are sobering--with 9.7 billion people on the planet by 2050, we can expect a 54 percent increase in global food demand and a 56 percent increase in energy demand. While meeting these growing demands and achieving sustainability is possible, it is helpful to scrutinize where the status quo will get us.

The World Health Organization, World Economic Forum and other leading global development organizations now say that air pollution and water scarcity--environmental challenges--are among the biggest dangers to human health and prosperity. And our business-as-usual analysis makes clear what many already fear: that human development based on the same practices we use today will not prepare us for a world with nearly 10 billion people.

To put it simply, if we stay on today's current path, we risk being trapped in an intensifying cycle of scarcity--our growth opportunities severely capped and our natural landscapes severely degraded. Under this business-as-usual scenario, we can expect global temperature to increase 3.2°C; worsened air pollution affecting 4.9 billion more people; overfishing of 84 percent of fish stocks; and greater water stress affecting 2.75 billion people. Habitat loss continues, leaving less than 50 percent of native grasslands and several types of forests intact.

However, if we make changes in where and how we meet food, water and energy demands for the same growing global population and wealth, the picture can look markedly different by mid-century. This "sustainability" path includes global temperature increase limited to 1.6°C--meeting Paris Climate Accord goals--zero overfishing with greater fisheries yields, a 90 percent drop in exposure to dangerous air pollution, and fewer water-stressed people, rivers and agricultural fields. These goals can be met while natural habitats extend both inside and outside protected areas. All signatory countries to the Aichi Targets meet habitat protection goals, and more than 50 percent of all ecoregions' extents remain unconverted, except temperate grasslands (of which over 50 percent are already converted today).
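For readers who want the two scenarios side by side, the headline figures quoted above can be restated compactly. The snippet below is a plain restatement of this article's own numbers, not model output, and the structure is purely illustrative:

```python
# Headline 2050 projections quoted in this article, by scenario.
# These are the article's own figures, restated for comparison only.
scenarios = {
    "business_as_usual": {
        "warming_C": 3.2,               # projected global temperature increase
        "fish_stocks_overfished": 0.84, # 84 percent of fish stocks
        "water_stressed_people_bn": 2.75,
    },
    "sustainable": {
        "warming_C": 1.6,               # meets Paris Climate Accord goals
        "fish_stocks_overfished": 0.0,  # zero overfishing
        # exposure to dangerous air pollution drops 90 percent
    },
}

# The sustainable path halves projected warming relative to business-as-usual.
ratio = scenarios["sustainable"]["warming_C"] / scenarios["business_as_usual"]["warming_C"]
print(round(ratio, 2))
```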

What's Possible

Achieving this sustainable future for people and nature is possible with existing and expected technology and consumption, but only with major shifts in production patterns. Making these shifts will require overcoming substantial economic, social and political challenges. In short, it is not likely that the biophysical limits of the planet will determine our future, but rather our willingness to think and act differently by putting economic development and the environment on equal footing as central parts of the same equation.

Credit: 
University of Colorado at Boulder

Higher temperatures could help protect coral reefs

A new study in the journal Behavioral Ecology, published by Oxford University Press, suggests that higher water temperature, which increases the aggressiveness of some fish, could lead to better protection of some coral.

In the face of global warming, recent years have seen an increasing number of studies predicting the future of corals. It is well established that higher water temperatures lead many corals to die. Over the past century, global temperature has increased by 1°F. Meanwhile, research has shown that coral recovery can be significantly influenced by the behavior of species living around coral reefs.

Researchers here evaluated the relationship between fish behavior and coral performance using a farmerfish-coral system. Farmerfish (Stegastes nigricans) are aggressive damselfish found around coral reefs in tropical climates that defend gardens of algae from intrusion by other fish. This study tested the relationship between coral recovery rates and the level of aggression exhibited by farmerfish groups when defending their gardens. The researchers did so by planting small coral fragments into farmerfish territories with different levels of aggressiveness.

The researchers collected data from 29 farmerfish colonies in French Polynesia in 2016 and 2017. They evaluated the average aggressiveness of each farmerfish group as well as the group's reaction when intruders entered its territory.

Researchers found that more branching corals resided in the territories of aggressive farmerfish groups. In addition, corals experimentally planted into the territories of non-aggressive farmerfish suffered 80 percent more damage than the corals planted into the territories of aggressive groups.

Researchers also found that farmerfish groups composed of larger animals were more aggressive. However, follow-up analyses showed that group aggressiveness mattered more than group member size in determining coral success. Fish aggressiveness is therefore likely to be an important part of how coral reefs will grow and survive in future environments.

While warming oceans negatively impact a variety of biological processes, this study hints that warmer temperatures, which often increase fish aggressiveness, could enhance the protective function of farmerfish for nearby corals.

"Predicting the future of corals will require a systems approach. Failing to account for broader ecological processes, such as species interactions, could lead us to issue the wrong predictions about how some corals will fare in future environments," said the paper's author, Jonathan Pruitt. "Heating up many corals even mildly can negatively impact a variety of physiological processes. However, this study shows that small increases could provide greater protection by resident fishes. Obviously this can't go on for forever, though. At some point, all the protection in the world won't matter anything if the corals can't feed themselves."

Credit: 
Oxford University Press USA

Loss of a microRNA molecule boosts rice production

The wild rice consumed by our Neolithic ancestors was very different from the domesticated rice eaten today. Although it is unclear when humans first started farming rice, the oldest paddy fields--in the lower Yangzi River Valley--date back to 4000 BC. During its long history of cultivation, rice plants with traits that reduce yield or impede harvest (e.g., grain shattering) were weeded out, whereas those with traits that increase yield (e.g., highly branched flowering structures) were selected and propagated. Although the resulting rice plants are super-producers that feed much of the world's population, they rely on human assistance and cannot withstand harsh environmental conditions.

Scientists can examine the genetic basis for some of the changes that took place during rice domestication by comparing genes in cultivated rice plants with those in their wild rice relatives. Using this approach, several key genes that were altered during domestication, such as those affecting grain shattering, have been identified and studied. Most of these genes encode transcription factors that bind to other genes and regulate their activity.

A team of researchers from the National Centre for Biological Sciences, Tata Institute of Fundamental Research in India, led by Dr. P.V. Shivaprasad, wondered whether a class of molecular regulators known as microRNAs also contributed to the domestication of rice. MicroRNAs regulate specific target genes by binding to RNA copies of the gene and, together with other molecules, blocking their activity or chopping them into tiny fragments. In special cases, the resulting RNA fragments trigger a silencing cascade, shutting down the activity of genes that are similar to the initial target gene.

The researchers compared the microRNA populations of high-yielding indica rice lines with those of wild rice and several traditional rice varieties. One microRNA species stood out: miR397 accumulated to high levels in the flag leaves of wild rice, but was barely detectable in the other plants analyzed. The scientists showed that miR397 silenced several members of the laccase gene family via a silencing cascade. Laccase genes, of which there are 30 in the rice genome, encode proteins that promote woody tissue formation, thereby providing mechanical strength. By silencing a subset of these genes, miR397 greatly reduced the formation of woody tissue. Furthermore, when the scientists transgenically expressed the gene encoding miR397 in domesticated rice, the resulting plants were more similar to wild rice plants than to domesticated ones, with long, spindly stems; narrow, short leaves; few flowering structures; and hardly any rice grains. In effect, the team partially de-domesticated rice by increasing the levels of a single microRNA species.

These findings raise intriguing questions. If silencing several laccase genes by increasing miR397 levels negatively affects yield, would upregulating the expression of this same set of laccase genes boost grain production? In addition, would reducing the levels of miR397 in wild rice plants, and thereby lifting the repression of the laccase genes, improve yields, while retaining the traits that allow wild plants to thrive in harsh environments? "miR397 and laccase genes overlap with unknown genomic regions predicted to be involved in rice yield. Modifying their expression in wild species and cultivated rice would be useful in improving yield and other beneficial characters. We hope that our finding promotes future research to identify other changes associated with domestication of plants, spearheading further improvement in crops for the future," states Dr. Shivaprasad.

Credit: 
American Society of Plant Biologists

All in the family: Kin of gravitational wave source discovered

image: This image provides three different perspectives on GRB150101B, the first known cosmic analogue of GW170817, the gravitational wave event discovered in 2017. At center, an image from the Hubble Space Telescope shows the galaxy where GRB150101B took place. At top right, two X-ray images from NASA's Chandra X-ray observatory show the event as it appeared on January 9, 2015 (left), with a jet visible below and to the left; and a month later, on February 10, 2015 (right), as the jet faded away. The bright X-ray spot is the galaxy's nucleus.

Image: 
NASA/CXC

On October 16, 2017, an international group of astronomers and physicists excitedly reported the first simultaneous detection of light and gravitational waves from the same source--a merger of two neutron stars. Now, a team that includes several University of Maryland astronomers has identified a direct relative of that historic event.

The newly described object, named GRB150101B, was reported as a gamma-ray burst localized by NASA's Neil Gehrels Swift Observatory in 2015. Follow-up observations by NASA's Chandra X-ray Observatory, the Hubble Space Telescope (HST) and the Discovery Channel Telescope (DCT) suggest that GRB150101B shares remarkable similarities with the neutron star merger, named GW170817, discovered by the Laser Interferometer Gravitational-wave Observatory (LIGO) and observed by multiple light-gathering telescopes in 2017. 

A new study suggests that these two separate objects may, in fact, be directly related. The results were published on October 16, 2018 in the journal Nature Communications.

"It's a big step to go from one detected object to two," said study lead author Eleonora Troja, an associate research scientist in the UMD Department of Astronomy with a joint appointment at NASA's Goddard Space Flight Center. "Our discovery tells us that events like GW170817 and GRB150101B could represent a whole new class of erupting objects that turn on and off--and might actually be relatively common."

Troja and her colleagues suspect that both GRB150101B and GW170817 were produced by the same type of event: a merger of two neutron stars. These catastrophic coalescences each generated a narrow jet, or beam, of high-energy particles. The jets each produced a short, intense gamma-ray burst (GRB)--a powerful flash that lasts only a few seconds. GW170817 also created ripples in space-time called gravitational waves, suggesting that this might be a common feature of neutron star mergers.

The apparent match between GRB150101B and GW170817 is striking: both produced an unusually faint and short-lived gamma ray burst and both were a source of bright, blue optical light and long-lasting X-ray emission. The host galaxies are also remarkably similar, based on HST and DCT observations. Both are bright elliptical galaxies with a population of stars a few billion years old that display no evidence of new star formation. 

"We have a case of cosmic look-alikes," said study co-author Geoffrey Ryan, a postdoctoral researcher in the UMD Department of Astronomy and a fellow of the Joint Space-Science Institute. "They look the same, act the same and come from similar neighborhoods, so the simplest explanation is that they are from the same family of objects."

In the cases of both GRB150101B and GW170817, the explosion was likely viewed "off-axis," that is, with the jet not pointing directly towards Earth. So far, these events are the only two off-axis short GRBs that astronomers have identified.  

The optical emission from GRB150101B is largely in the blue portion of the spectrum, providing an important clue that this event is another kilonova, as seen in GW170817. A kilonova is a luminous flash of radioactive light that produces large quantities of important elements like silver, gold, platinum and uranium.

While there are many commonalities between GRB150101B and GW170817, there are two very important differences. One is their location: GW170817 is relatively close, at about 130 million light years from Earth, while GRB150101B lies about 1.7 billion light years away.

The second important difference is that, unlike GW170817, GRB150101B has no associated gravitational wave data. Without this information, the team cannot calculate the masses of the two objects that merged. It is possible that the event resulted from the merger of a black hole and a neutron star, rather than two neutron stars.

"Surely it's only a matter of time before another event like GW170817 will provide both gravitational wave data and electromagnetic imagery. If the next such observation reveals a merger between a neutron star and a black hole, that would be truly groundbreaking," said study co-author Alexander Kutyrev, an associate research scientist in the UMD Department of Astronomy with a joint appointment at NASA's Goddard Space Flight Center. "Our latest observations give us renewed hope that we'll see such an event before too long."

It is possible that a few mergers like the ones seen in GW170817 and GRB150101B have been detected previously, but were not properly identified using complementary observations in different wavelengths of light, according to the researchers. Without such detections--in particular, at longer wavelengths such as X-rays or optical light--it is very difficult to determine the precise location of events that produce gamma-ray bursts. 

In the case of GRB150101B, astronomers first thought that the event might coincide with an X-ray source detected by Swift in the center of the galaxy. The most likely explanation for such a source would be a supermassive black hole devouring gas and dust. However, follow-up observations with Chandra placed the event farther from the center of the host galaxy.

According to the researchers, even if LIGO had been operational in early 2015, it would very likely not have detected gravitational waves from GRB150101B because of the event's greater distance from Earth. All the same, every new event observed with both LIGO and multiple light-gathering telescopes will add important new pieces to the puzzle.

"Every new observation helps us learn better how to identify kilonovae with spectral fingerprints: silver creates a blue color, whereas gold and platinum add a shade of red, for example," Troja added. "We've been able identify this kilonova without gravitational wave data, so maybe in the future, we'll even be able to do this without directly observing a gamma-ray burst."

Credit: 
University of Maryland

Just how blind are bats? Color vision gene study examines key sensory tradeoffs

image: A new study led by Bruno Simões, Emma Teeling and colleagues has examined the evolution of color vision genes across a large and diverse group of bat species.

Image: 
Professor Gareth Jones

Could bats' cave-dwelling, nocturnal habits over eons have enhanced their acoustic echolocation abilities but also spurred their loss of vision?

A new study led by Bruno Simões, Emma Teeling and colleagues has examined this question in the evolution of color vision genes across a large and diverse group of bat species.

They show that the popular expression "blind as a bat" really doesn't hold true. Some bats with the most advanced type of echolocation appear to have traded UV vision for exquisite hearing, and all non-echolocating bats that live in caves have also lost UV vision. This suggests that bats are not blind, but that some have certainly favored other senses over vision.

"Bats' sensory abilities have long been a source of fascination for evolutionary biologists," said Emma Teeling, the corresponding author of the study, which appears in the advanced online edition of the journal Molecular Biology and Evolution. "Using phylogenetics and molecular biology we are now able to delve more deeply into the evolutionary price of acquiring echolocation and nocturnality."

Bats are not just the only mammals that can truly fly; they are also the only ones to rely on echolocation to navigate and find prey in the dark. Scientists have long argued that bat vision was traded off as a result of gaining this unique nocturnal sensory adaptation.

To better understand the sources of these tradeoffs, the research team performed DNA sequencing and analyzed the key vision genes in bats, including the SWS1 (short wavelength sensitive, for blue/UV light) and MWS/LWS (medium or long wavelength sensitive, for green, yellow and red light) opsin genes.

An opsin gene's job is to make the photosensitive retinal proteins that convert particular wavelengths of light into visual signals. The research team's opsin gene analyses surveyed the largest dataset to date in bats, representing 20 of the 21 existing bat families, chosen primarily for their diverse echolocation types and ecological niches.

In the study, the authors showed that, among the 111 species examined, loss of SWS1 gene function is more common in bats than previously thought, and they suggest this may be associated with the adoption of cave roosting, first traced back almost 30 million years. They found various mutations in the bat genomes that affected SWS1 gene function, with a completely nonfunctional gene found in two species.

Overall, spectral fine-tuning has completely eliminated sensitivity to short-wavelength visible light in the blue/UV range in 26 of the 111 species examined. The majority of Old World, cave-roosting bats were found to have a non-functional SWS1 opsin.

Selection on the blue-sensitive SWS1 opsin gene, however, was found to vary significantly among bat species. The research team found evidence of multiple genetic mutations in which different bat species have lost the function of the SWS1 gene. To identify these genetic roots, they used phylogenetic analyses to build gene trees based on the SWS1 results and compared signatures of selection between different ecological niches, such as echolocating vs. non-echolocating species and cave roosting vs. non-cave roosting.

"Our work supports previous hypotheses which suggest the pseudogenization [or loss] of the SWS1 opsin may be related to the adoption of advanced echolocation (high-duty cycle) and, cave roosting habits," said Teeling.

The authors confirmed that, when the SWS1 gene is present and working, it gives bats the ability to see UV light.

"Our spectral tuning analysis of the 11 sites responsible for light sensitivity in the SWS1 opsin gene in both ancestral and extant bat species, provide further support for the presence of UV vision in bats," said Teeling. "In closed-canopy, forest-dwelling mammals a UV sensitive SWS1 opsin is associated with a nocturnal lifestyle. Furthermore, our results demonstrate that this visual pigment has been UV sensitive in all bats since they first diverged from other placental mammals about 78 MYA."

Importantly, the new data makes clear that loss of the SWS1 gene function is not always associated with the acquisition of advanced echolocation, as has been previously suggested.

For the other visual genes, they found that among the 45 species examined, the MWS/LWS opsin gene is highly conserved across lineages and under strong evolutionary pressure to maintain its function.

"Our spectral tuning analysis of the 5 amino-acid sites responsible for the λ max revealed that the majority of bat MWS/LWS visual pigments are tuned to a long-wavelength (~555 - 560nm)," said Teeling. "This suggests that despite the acquisition of laryngeal echolocation and a long history of nocturnality, the MWS/LWS opsin gene has evolved under very strong functional constraint in bats."

"Bats are not blind, with most species capable of seeing in both the UV and middle range of the color spectrum. This suggests that vision is still an important means of sensory perception even in echolocating, nocturnal bats. However, acquisition of the most advanced type of echolocation does coincide with loss of UV vision in most bats and surprisingly cave-roosting drives loss of UV vision in the non-echolocating lineages. This suggests that sensory trade-offs are more complex that previously considered and that bats still make fascinating subjects to understand the evolution of the mammalian sensome!"

This study makes a strong contribution to the ongoing scientific debate regarding the importance of color vision for nocturnal animals.

Credit: 
SMBE Journals (Molecular Biology and Evolution and Genome Biology and Evolution)

Diabetics more at risk of death from alcohol, accidents, suicide

Diabetic patients are more likely to die from alcohol-related factors, accidents or suicide, according to a study published in the European Journal of Endocrinology. The findings suggest that the increased risk of death from these causes may be related to patients' mental health, which can be adversely affected by the psychological burden of living with and self-treating a debilitating disease that carries potentially serious complications.