Older Mexican American adults experiencing pain are at risk of developing frailty

Researchers funded by the National Institutes of Health have found that older Mexican Americans who suffered from pain were 1.7 times as likely to become frail as study participants who did not report pain. The study, published in the Journal of Pain by researchers at the University of Texas Medical Branch, Galveston, examined pain as a predictor of frailty in older Mexican American adults over an 18-year follow-up period. All participants were non-frail at the start of the study.
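As a rough illustration of the statistic above, a relative risk compares the incidence of frailty in the pain group with that in the no-pain group. The counts in this Python sketch are hypothetical, not the study's data:

```python
# Illustrative only: how a relative risk like the 1.7 reported above is
# computed. The counts below are hypothetical, not the study's data.

def relative_risk(exposed_cases, exposed_total, unexposed_cases, unexposed_total):
    """Risk in the exposed group divided by risk in the unexposed group."""
    risk_exposed = exposed_cases / exposed_total
    risk_unexposed = unexposed_cases / unexposed_total
    return risk_exposed / risk_unexposed

# Hypothetical cohort: 340 of 1,000 participants with pain became frail,
# versus 200 of 1,000 without pain.
rr = relative_risk(340, 1000, 200, 1000)
print(round(rr, 2))  # 1.7
```

The actual study reports an adjusted estimate from a longitudinal model, so this unadjusted ratio is only a conceptual stand-in.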

Frailty - defined by unintentional weight loss, weakness, exhaustion and slowness - can lead to poor health outcomes. Factors that led to a higher chance of becoming frail were older age, hip fracture, depressive symptoms and difficulty in performing activities of daily living. Female participants, those with higher education levels and those with better mental status were less prone to developing frailty, according to the researchers.

An estimated 100 million people in the United States have some form of pain, making it one of the most prevalent and expensive medical issues in America. In older adults, chronic pain is a risk factor for frailty, but little is known about the relationship between pain and frailty in older Hispanic populations in the U.S., the nation's fastest-growing segment of older adults. Older Mexican American adults are particularly prone to risk factors associated with pain, such as diabetes, obesity and disability. They also are more likely to have poor access to medical care and lower levels of health literacy.

"Older Mexican Americans are an underserved population with disparities in healthcare access and delivery and health risks associated with their demographic group," said NIMHD Director Eliseo J. Pérez-Stable, M.D. "This study identifies the need to effectively manage pain in Latino populations through culturally appropriate interventions."

Study data were sourced from the Hispanic Established Populations for the Epidemiological Study of the Elderly (H-EPESE), an ongoing longitudinal study of 3,050 Mexican Americans aged 65 years and older residing in five Southwestern states. Data were collected between 1995/96 and 2012/13.

At the start of data collection, participants were asked if they experienced pain in the previous month. Participants categorized as frail were not included in the study. As social and demographic factors, such as age, sex, marital status, literacy, mental health, disability and existing medical conditions may influence frailty, participants were also categorized along these variables.

"The relationship between social determinants, diabetes, physical function, mobility, frailty and pain in older Mexican Americans is complex and poorly understood. Early assessment and better pain management may prevent early onset of frailty in this group," said Kenneth Ottenbacher, Ph.D., study principal investigator at UTMB - Galveston.

Credit: 
NIH/National Institute on Minority Health and Health Disparities

Breakthrough in malaria research

image: The researchers specifically removed over 1,300 individual genes in the malaria parasite Plasmodium (in red, host cells in green) and were thus able to identify many new targets in the pathogen.

Image: 
Institut für Zellbiologie, Universität Bern

Despite great efforts in medicine and science, more than 400,000 people worldwide still die of malaria every year. The infectious disease is transmitted by the bite of mosquitoes infected with the malaria parasite Plasmodium. The genome of the parasite is relatively small, with about 5,000 genes. In contrast to human cells, Plasmodium parasites have only a single copy of each gene. Removing a gene from the parasite's genome therefore leads directly to a change in the parasite's phenotype. An international consortium led by Professors Volker Heussler from the Institute of Cell Biology (ICB) at the University of Bern and Oliver Billker from Umeå University in Sweden, formerly at the Sanger Institute in Great Britain, has taken advantage of this fact. The researchers carried out a genome-wide gene deletion study on malaria parasites: they specifically removed over 1,300 individual genes, observed the effects across the entire life cycle of the parasite and were thus able to identify many new targets in the pathogen. The study was published in the prestigious journal Cell.

Individual genetic codes accelerate research by decades

The researchers used a malaria mouse model established at the Institute of Cell Biology at the University of Bern. Each of the 1,300 parasite genes was replaced by an individual genetic code to analyze how the removal of each gene affects the parasite. The use of individual codes allows many parasites to be studied simultaneously and thus drastically shortens analysis time. After three years of research, the international consortium succeeded in systematically screening the genome of the parasite in all life cycle stages. "The deletion screen carried out jointly with the Sanger Institute enabled us to identify hundreds of targets, particularly in the parasite's metabolism," said Rebecca Stanway from the ICB, one of the lead authors of this study.
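The article does not describe the consortium's analysis pipeline, but the general logic of a barcoded deletion screen can be sketched as follows: each mutant carries a unique barcode, the barcode is counted before and after a life-cycle stage, and a drop in relative abundance flags the deleted gene as important for that stage. Gene names and read counts in this sketch are invented for illustration:

```python
import math

# Hypothetical sketch of barcode-based screen analysis: comparing a
# barcode's share of sequencing reads before and after a life-cycle
# stage gives a relative fitness score for the corresponding mutant.
# Gene names and counts below are invented for illustration.

def relative_fitness(input_counts, output_counts):
    """log2 ratio of each barcode's relative abundance, output vs input."""
    in_total = sum(input_counts.values())
    out_total = sum(output_counts.values())
    scores = {}
    for gene, n_in in input_counts.items():
        n_out = output_counts.get(gene, 0)
        # Pseudocount of 1 avoids log(0) for barcodes that drop out entirely.
        scores[gene] = math.log2(((n_out + 1) / out_total) / ((n_in + 1) / in_total))
    return scores

before = {"geneA": 5000, "geneB": 5000, "geneC": 5000}
after = {"geneA": 4800, "geneB": 90, "geneC": 5100}  # geneB mutants crash
scores = relative_fitness(before, after)
# A strongly negative score (geneB here) suggests the deleted gene is
# important for that stage of the life cycle.
```

The published screen used more sophisticated statistics, so this is only the conceptual core of the method.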

Model calculations extend experimental findings

To systematically analyze the large number of identified metabolic genes, the Bern researchers joined forces with Professor Vassily Hatzimanikatis of the EPFL in Lausanne and Professor Dominique Soldati-Favre of the University of Geneva to form the "MalarX" consortium, which is financially supported by the Swiss National Science Foundation. Using data from the malaria genome screen, the group at EPFL has calculated models that show essential metabolic pathways of the parasite. "Thanks to these models, it is now possible to predict which of the previously unexplored genes are vital for the parasite and are therefore suitable targets for malaria control," adds model expert Anush Chiappino-Pepe from the EPFL in Lausanne.

Some of these predictions were then experimentally confirmed by the Bern researchers in close collaboration with the group of Prof. Chris Janse at the University of Leiden in the Netherlands. "The genome-wide screen with the corresponding metabolic models represents a breakthrough in malaria research," said Magali Roques of the team in Bern. "Our results will support many malaria researchers worldwide. They can now concentrate on essential parasite genes and thus develop efficient drugs and vaccines against various stages of the parasite's life" added Dr Ellen Bushell, former scientist at the Sanger Institute.

Success through top infrastructure and international cooperation

According to Volker Heussler, this research approach was only possible by a combination of the enormous sequencing and cloning capacities at the Sanger Institute and the extraordinary infrastructure at the ICB in Bern where the entire life cycle of the malaria parasite is established. In addition, the ICB is equipped with an exceptional range of high-performance microscopes, which enable top-level research on the various life cycle stages of the parasite. Thanks to this excellent infrastructure, the laboratory of Volker Heussler has already published many internationally recognized studies on the early phase of parasite infection.

Twenty-two international scientists from the fields of molecular biology, parasitology, statistics and mathematical modelling participated in this project. "This illustrates the effort involved in conducting this study, analyzing the data and modelling the experimental findings to place them in a meaningful context," said Volker Heussler.

Credit: 
University of Bern

New laser opens up large, underused region of the electromagnetic spectrum

image: This is an artistic view of the QCL pumped THz laser showing the QCL beam (red) and the THz beam (blue) along with rotating N2O (laughing gas) molecules inside the cavity.

Image: 
Arman Amirzhan, Harvard SEAS

The terahertz frequency range -- which sits in the middle of the electromagnetic spectrum between microwaves and infrared light -- offers the potential for high-bandwidth communications, ultrahigh-resolution imaging, precise long-range sensing for radio astronomy, and much more.

But this section of the electromagnetic spectrum has remained out of reach for most applications. That is because current sources of terahertz frequencies are bulky and inefficient, have limited tuning ranges or must operate at low temperatures.

Now, researchers from the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS), in collaboration with MIT and the U.S. Army, have developed a compact, room temperature, widely tunable terahertz laser.

"This laser outperforms any existing laser source in this spectral region and opens it up, for the first time, to a broad range of applications in science and technology," said Federico Capasso, the Robert L. Wallace Professor of Applied Physics and Vinton Hayes Senior Research Fellow in Electrical Engineering at SEAS and co-senior author of the paper.

"There are many needs for a source like this laser, things like short-range, high-bandwidth wireless communications, very high-resolution radar, and spectroscopy," said Henry Everitt, Senior Technologist with the U.S. Army CCDC Aviation & Missile Center and co-senior author of the paper.

Everitt is also an Adjunct Professor of Physics at Duke University.

While most electronic or optical terahertz sources use large, inefficient and complex systems to produce the elusive frequencies with limited tuning range, Capasso, Everitt and their team took a different approach.

To understand what they did, let's go over some basic physics of how a laser works.

In quantum physics, excited atoms or molecules sit at different energy levels -- think of these as floors of a building. In a typical gas laser, a large number of molecules are trapped between two mirrors and brought to an excited energy level, aka a higher floor in the building. When they reach that floor, they decay, fall down one energy level and emit a photon. These photons stimulate the decay of more molecules as they bounce back and forth leading to amplification of light. To change the frequency of the emitted photons, you need to change the energy level of the excited molecules.
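The link between energy levels and emitted frequency in this picture is Planck's relation, f = ΔE / h: the larger the gap between the two "floors," the higher the photon's frequency. A minimal sketch, using a made-up 4 meV level spacing chosen only because it lands in the terahertz range:

```python
# Illustrative: the frequency of the emitted photon is set by the energy
# gap between the two levels, f = ΔE / h. The 4 meV gap below is a
# made-up value chosen to land in the terahertz range.

PLANCK_EV = 4.135667696e-15  # Planck constant in eV·s (CODATA)

def photon_frequency_thz(energy_gap_ev):
    """Frequency (in THz) of a photon emitted across a given energy gap."""
    return energy_gap_ev / PLANCK_EV / 1e12

f = photon_frequency_thz(4.0e-3)  # a hypothetical 4 meV level spacing
print(f"{f:.2f} THz")  # ≈ 0.97 THz
```

This is why tuning the pump to reach different rotational levels of the gas molecules tunes the output frequency.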

So, how do you change the energy level? One way is to use light. In a process called optical pumping, light raises molecules from a lower energy level to a higher one -- like a quantum elevator. Previous terahertz molecular lasers used optical pumps but they were limited in their tunability to just a few frequencies, meaning the elevator only went to a small number of floors.

The breakthrough of this research is that Capasso, Everitt and their team used a highly tunable, quantum cascade laser as their optical pump. These powerful, portable lasers, co-invented by Capasso and his group at Bell Labs in the 1990s, are capable of efficiently producing widely tunable light. In other words, this quantum elevator can stop at every floor in the building.

The theory to optimize the operation of the new laser was developed by Steven Johnson, Professor of Applied Mathematics and Physics at MIT, his graduate student Fan Wang and Everitt.

"Molecular THz lasers pumped by a quantum cascade laser offer high power and wide tuning range in a surprisingly compact and robust design," said Nobel laureate Theodor Hänsch of the Max Planck Institute for Quantum Optics in Munich, who was not involved in this research. "Such sources will unlock new applications from sensing to fundamental spectroscopy."

"What's exciting is that the concept is universal," said Paul Chevalier, a postdoctoral fellow at SEAS and first author of the paper. "Using this framework, you could make a terahertz source with a gas laser of almost any molecule and the applications are huge."

The researchers combined the quantum cascade laser pump with a nitrous oxide -- aka laughing gas -- laser.

"By optimizing the laser cavity and lenses, we were able to produce frequencies spanning nearly 1 THz," said Arman Amirzhan, a graduate student in Capasso's group and co-author of the paper.

"This result is one of a kind," said Capasso. "People knew how to make a terahertz laser before but couldn't make it broadband. It wasn't until we began this collaboration, after a serendipitous encounter with Henry at a conference, that we were able to make the connection that you could use a widely tunable pump like the quantum cascade laser."

This laser could be used in everything from improved skin and breast cancer imaging to drug detection, airport security and ultrahigh-capacity optical wireless links.

"I'm particularly excited about the possibility of using this laser to help map the interstellar medium," said Everitt. "Molecules have unique spectral fingerprints in the terahertz region, and astronomers have already begun using these fingerprints to measure the composition and temperature of these primordial clouds of gas and dust. A better ground-based source of terahertz radiation like our laser will make these measurements even more sensitive and precise."

Credit: 
Harvard John A. Paulson School of Engineering and Applied Sciences

Researchers block metastasis-promoting enzyme, halt spread of breast cancer

In a breakthrough with important implications for the future of immunotherapy for breast cancer, UC San Francisco scientists have found that blocking the activity of a single enzyme can prevent a common type of breast cancer from spreading to distant organs.

While studying a mouse model that replicates key features of early-stage human breast cancer, the researchers discovered that a ubiquitous enzyme called MMP9 is an essential component of the cancer's metastasis-promoting machinery, helping to create a hospitable environment for itinerant cancer cells to form new metastatic tumors.

"Metastasis is the biggest hurdle when it comes to successfully treating breast cancer, and solid tumors in general," said Vicki Plaks, PhD, now an assistant adjunct professor in the Department of Orofacial Sciences at UCSF. "Once a cancer becomes metastatic, there's really no cure, and the only option is to manage it as a chronic disease." Plaks co-led the team that made the discovery when still a postdoctoral fellow in the laboratory of Zena Werb, PhD, a professor of anatomy and associate director for basic science at the UCSF Helen Diller Family Comprehensive Cancer Center.

When they examined lung tissue in their mouse model, the researchers found that MMP9 is involved in remodeling healthy tissue and transforming it into a kind of safe haven for migrating breast cancer cells. When the cancer cells colonize these sites with the help of MMP9, they're able to start growing into new tumors.

The new study, published Nov. 14 in the journal Life Science Alliance, shows that these metastases can be stopped before they are able to lay the foundations for tumor growth. By administering an antibody that specifically targets and disrupts MMP9 activity, the scientists were able to prevent cancer from colonizing the lungs of mice. But interestingly, interfering with MMP9 had no effect on the primary tumor, which suggests that the enzyme's primary role in this scenario is helping existing malignancies metastasize and colonize other organs rather than promoting the growth of established primary tumors.

Prior to this study, Werb and others had found that MMP9 plays an important role in remodeling the extracellular matrix (ECM) -- a patchwork of biomolecules found outside of cells that provides structure and shape to organs, helps cells communicate with one another, and establishes a microenvironment that promotes cell health, among its many other functions. Although MMP9 was known to be involved in cancer, specifically in remodeling the ECM to build tumor niches that are hospitable to malignancies, its role in the earliest stages of metastasis had not been fully explored.

"Lots of studies that examined metastatic niche formation in breast cancer have focused on late-stage cancers, when the tumors are fairly progressed. What sets our study apart is that we chose to focus on processes that alter the tumor and metastatic microenvironment early on. This approach enabled us to show that MMP9 really matters in the early stages," said Mark Owyong, co-lead author of the new study with Jonathan Chou, MD, PhD, a clinical fellow in the UCSF School of Medicine. Owyong, Chou and Plaks conducted the research as members of the Werb lab.

The first hint that MMP9 might be involved in early-stage metastasis came from publicly available gene expression data from clinical breast cancer biopsies. While sifting through this data, the researchers noticed that MMP9 levels were elevated in metastatic disease.

To further investigate MMP9's role in metastasis, the researchers turned to a unique mouse model of "luminal B" breast cancer, which is among the most frequently diagnosed forms of the disease. "We selected the model because it's one of the few that captures the natural progression of breast cancer, closely mimicking the progression of the disease experienced by patients," Owyong said.

In a key set of experiments, the researchers injected tumor cells into mice that had early stage breast cancer but no discernible metastases. They found that the cells colonized the lungs and formed new tumor growth sites. But when these cells were injected into genetically identical mice without breast cancer, no metastases formed.

When the experiment was repeated in mice with early stage breast cancer whose MMP9 gene had been knocked out, there was a significant reduction in the size of metastatic lung tumors, though there was no effect on the primary breast tissue tumor. These findings suggest that MMP9 is required to promote metastasis, but not essential for continued growth of the primary tumor.

Similar results were seen when the researchers disrupted the activity of MMP9 with a unique antibody that specifically targets the activated form of the enzyme. The researchers injected tumor cells into these mice, followed by injections of the antibody every two days. At the end of the treatment regimen, the researchers inspected the mice and saw a significant reduction in the number and size of lung metastases in mice who received the antibody compared with those that didn't.

"This was a very promising result and suggests that a therapeutic paradigm focused on intercepting metastasis early might offer a new route for treating certain kinds of breast cancer," said Plaks.

The researchers also discovered that interfering with MMP9 activity helped recruit and activate cancer-fighting immune cells to metastatic sites, a result with important implications for treating certain types of metastatic breast cancer with immunotherapy.

Immunotherapies work by enlisting the body's immune system to find and kill cancer cells. But certain cancers -- including luminal B breast cancer, the main focus of the new study -- don't succumb to immunotherapy. According to Plaks, this is because, beyond their direct effects on metastatic growth, enzymes like MMP9 also play an important role in remodeling the ECM and building mesh-like barriers around metastatic sites that help to exclude immune cells. This may explain why some metastatic cancer cells are able to evade the immune onslaught triggered by immunotherapies.

But the new study shows that when MMP9 is incapacitated, metastatic sites may no longer be able to keep immune cells at bay. Plaks thinks that this represents an important step towards making breast cancer more susceptible to immunotherapies that have proven effective against other forms of cancer.

"These findings come at an exciting time in cancer immunology, with antibodies targeting MMP9 being actively explored for clinical use within the biotech industry," Plaks said. "There's been great interest in trying to use immunotherapy to treat metastatic breast cancers of the luminal B type, but so far, success has been limited. Our work indicates that a combination approach of immunotherapy with antibodies targeting MMP9 activity might actually succeed."

Credit: 
University of California - San Francisco

Future rainfall could far outweigh current climate predictions

image: Heavy rain falls across Dartmoor on Saturday 09 November 2019

Image: 
Lloyd Russell, University of Plymouth

Homes and communities across the UK have felt the full force of torrential downpours in recent weeks. And the UK's uplands could in future see significantly more annual rainfall than is currently being predicted in national climate models, according to new research by the University of Plymouth, UK.

Scientists analysed rainfall records from the 1870s to the present day and compared them against those featured in the Met Office's UK Climate Projections 2018 (UKCP18) report.

Their findings show that there has been a significant increase in spring, autumn and winter precipitation, greatest in upland windward areas of the region, with winter increases broadly consistent with UKCP18 projections.

However, their results show that for spring, summer and autumn precipitation there could be a large divergence by the mid-to-late 21st century, with the observed mismatch greatest in upland areas.

The study, published in Climate Research, was conducted by research student Thomas Murphy and academics from the University's School of Geography, Earth and Environmental Science and School of Biological and Marine Sciences.

Dr Paul Lunt, Associate Professor in Environmental Science and one of the study's authors, said: "Our study helps to contextualise the latest UK climate change projections, and suggests caution is required when making assumptions about climate impacts based on climate models. Current models predict that by 2050, summer rainfall on Dartmoor will fall by as much as 20%, but our results from past records show that in the uplands it is on an upward trajectory.

"This study shows there have been significant increases in spring, autumn, winter and annual precipitation for upland regions in South West England between 1879 and 2012. Meanwhile the moderate increases in summer precipitation represent a deviation from the drier summers predicted within current and previous climate models.

"In that regard, this research highlights the complex challenges facing those trying to predict the effects of climate change. Upland areas are among the most important UK regions in terms of biodiversity and carbon sequestration, but they are also the most vulnerable to increased precipitation."

Upland areas more than 300m above sea level cover around one third of the UK's land area and are considered of national and international importance due to their biodiversity and cultural heritage. They are also the source of 68 per cent of the UK's freshwater and have a significant role in flood risk mitigation.

Levels of precipitation in the uplands are typically twice the average of those for UK lowland sites. As a result, their contribution to flood risk and downstream flood damage is disproportionate to their catchment area.

The research focused on Dartmoor and Plymouth, aiming to scrutinise one of the longest running upland and lowland precipitation records anywhere in Western Europe. It also looked at shorter-term records from a number of upland sites.

The results show that over the past 130 years rainfall in upland areas has increased in all seasons, with spring, autumn and winter increasing by more than 12%. In Plymouth, rainfall has increased by more than 5% in all seasons apart from summer, where there has been a slight decrease. Annual levels in the uplands and Plymouth have risen by 11% (226mm) and 5% (46mm) respectively over the same period.
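The percentage figures above are simple relative differences between period means. A minimal sketch, with hypothetical rainfall values (only the 11% headline figure is taken from the article; the baseline and recent means are invented to match it):

```python
# Illustrative: the percentage changes quoted above are relative
# differences between period means. The rainfall values below are
# hypothetical, not the study's records.

def percent_change(baseline_mm, recent_mm):
    """Relative change of the recent mean versus the baseline mean, in %."""
    return (recent_mm - baseline_mm) / baseline_mm * 100

# Hypothetical annual means for an upland site (mm/year)
baseline = 2054.0   # late-19th-century mean
recent = 2280.0     # present-day mean
print(round(percent_change(baseline, recent), 1))  # 11.0
```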

The researchers also say that while these results are important at a local scale, they are also relevant for upland coastal sites across the North East Atlantic.

Credit: 
University of Plymouth

Inoculating against the spread of viral misinformation

image: A small group of anti-vaccine ad buyers has successfully leveraged Facebook to reach targeted audiences, new research shows.

Image: 
Kirian Villalta, UMD School of Public Health

In a year that has seen the largest measles outbreak in the US in more than two decades, the role of social media in giving a platform to unscientific anti-vaccine messages and organizations has become a flashpoint.

In the first study of public health-related Facebook advertising, newly published in the journal Vaccine, researchers at the University of Maryland, the George Washington University and Johns Hopkins University show that a small group of anti-vaccine ad buyers has successfully leveraged Facebook to reach targeted audiences and that the social media platform's efforts to improve transparency have actually led to the removal of ads promoting vaccination and communicating scientific findings.

The research calls attention to the threat of social media misinformation as it may contribute to increasing "vaccine hesitancy," which the World Health Organization ranks among the top threats to global health this year. This increasing reluctance or refusal to vaccinate threatens to reverse the progress made in halting vaccine-preventable diseases, such as measles, which has seen a 30% increase in cases globally.

The research team, co-led by UMD's Dr. Sandra C. Quinn, GW's Dr. David Broniatowski and JHU's Dr. Mark Dredze, examined more than 500 vaccine-related ads served to Facebook users and archived in Facebook's Ad Library. This archive, which became available in late 2018, catalogued ad content related to "issues of national importance." Their findings reveal that the majority (54%) of advertisements opposing vaccination were posted by just two groups funded by private individuals, the World Mercury Project and Stop Mandatory Vaccination, and emphasized the purported harms of vaccination.

"The average person might think that this anti-vaccine movement is a grassroots effort led by parents, but what we see on Facebook is that there are a handful of well-connected, powerful people who are responsible for the majority of advertisements. These buyers are more organized than people think," said Amelia Jamison, a faculty research assistant in the Maryland Center for Health Equity, and the study's first author.

In contrast, those ads promoting vaccination did not reflect a common or organized theme or funder, and were focused on trying to get people vaccinated against a specific disease in a targeted population. Examples included ads for a local WalMart's flu vaccine clinic or the Gates Foundation campaign against polio.

Yet, because Facebook categorizes ads about vaccines as "political," it has led the platform to reject some pro-vaccine messages. "By accepting the framing of vaccine opponents - that vaccination is a political topic, rather than one on which there is widespread public agreement and scientific consensus - Facebook perpetuates the false idea that there is even a debate to be had," said David Broniatowski, associate professor of engineering management and systems engineering at GW, and principal investigator of the study. "This leads to increased vaccine hesitancy, and ultimately, more epidemics."

"Worse, these policies actually penalize pro-vaccine content since Facebook requires disclosure of funding sources for 'political' ads, but vaccine proponents rarely think of themselves as political. Additionally, vaccine opponents are more organized and more able to make sure that their ads meet these requirements."

Facebook is a pervasive presence in the lives of many people, meaning its decisions about how to handle vaccine messaging have far-reaching and serious consequences, said Sandra Crouse Quinn, professor and chair of the Department of Family Science at UMD's School of Public Health, and a principal investigator on the study.

"In today's social media world, Facebook looms large as a source of information for many, yet their policies have made it more difficult for users to discern what is legitimate, credible vaccine information. This puts public health officials, with limited staff resources for social media campaigns, at a true disadvantage, just when we need to communicate the urgency of vaccines as a means to protect our children and our families," said Quinn.

The researchers note that the data gathered for this study from Facebook's Ad Archive was collected in December 2018 and February 2019, before Facebook's March 2019 announcement of updated advertising policies designed to limit the spread of vaccine-related misinformation. This study provides a baseline for comparing how the new policies may change the reach of ads from anti-vaccine organizations. Those standards, issued in response to the proliferation of anti-vaccination misinformation that coincided with measles outbreaks across the U.S. in early 2019, include blocking advertisements that contain false content about vaccines and disallowing advertisers from targeting ads to people "interested in vaccine controversies," as they were previously able to do.

Yet, the messengers may simply mutate their messages, virus-like, to avoid the tightening standards. "There is a whole set of ads that focus on themes of 'freedom' or 'choice' and that elude the Facebook rules around vaccine ads," Broniatowski said.

Jamison says that the research team will continue to study how anti-vaccine arguments are spreading on Facebook and how the company is responding to demands from public health organizations to clean up its act.

"While everyone knows that Facebook can be used to spread misinformation, few people realize the control that advertisers have to target their message," said Mark Dredze, a John C. Malone associate professor of computer science at Johns Hopkins. "For a few thousand dollars, a small number of anti-vaccine groups can micro-target their message, exploiting vulnerabilities in the health of the public."

Credit: 
University of Maryland

Nearly half of accused harassers can return to work

EAST LANSING, Mich. - What happens behind the scenes when employees are accused of harassment? New research from Michigan State University revealed that almost half of accused harassers can go back to work when disputes are settled by arbitrators - third parties who resolve disputes.

The findings, published in the Hofstra Labor & Employment Law Journal, closely examine the outcomes of arbitration awards involving accused harassers and provide insight into whether arbitration is the best way to address workplace harassment.

"With all of the issues our society is facing right now, I wanted to figure out why we weren't doing a better job addressing harassment in the workplace," said Stacy Hickox, associate professor in MSU's School of Human Resources and Labor Relations. "I knew that it was challenging for employees to bring a claim of harassment to employers but wanted to know what employers are doing about actually responding."

Hickox and co-author Michelle Kaminski, associate professor in MSU's School of Human Resources and Labor Relations, examined 60 arbitration cases in which employees accused of harassment were challenging their punishment. In most cases, the employee was discharged and seeking to return to work.

They found that the discharge was upheld in only 52% of the cases. In 13% of cases, the accused harassers were allowed back to work without any punishment. In the remaining cases, 12% of the accused could return to work without back pay, 20% had the discipline reduced to a suspension, and 2% had it reduced to a warning.
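The breakdown above can be tabulated as follows. The whole-case counts here are reconstructed approximately from the quoted percentages of the 60 cases, so they are an illustration rather than the study's raw data (the quoted percentages sum to 99%, leaving one case uncategorized):

```python
from collections import Counter

# Approximate reconstruction of the 60-case breakdown from the quoted
# percentages (52%, 13%, 12%, 20%, 2%); counts are rounded to whole
# cases, with one case left uncategorized. Illustrative only.

outcomes = (
    ["discharge upheld"] * 31
    + ["reinstated, no punishment"] * 8
    + ["reinstated, no back pay"] * 7
    + ["reduced to suspension"] * 12
    + ["reduced to warning"] * 1
    + ["other"] * 1
)
counts = Counter(outcomes)
for outcome, n in counts.most_common():
    print(f"{outcome}: {n}/{len(outcomes)} ({n / len(outcomes):.0%})")
```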

"I was very surprised by the number of people who were proven to be harassers and were allowed to come back to work," Hickox said. "It is interesting that the employer's anti-harassment policies play a part in whether the harasser's discipline was upheld. Policies that included specific examples of harassment were more often associated with the discipline being upheld."

The issue, Hickox said, is that accused harassers have rights as well and can claim they were disciplined without just cause. Some return to work because employers fail to provide enough proof that the harassment occurred, while others are reinstated because arbitrators are wedded to employers' policies, with no gray areas. Arbitrators also reinstate harassers because of their long tenure with the employer.

"To be sure that these harassers aren't allowed back into the workplace, employers need to look much more closely at harassment policies, as well as the power they give arbitrators in resolving these cases," Hickox said.

Hickox and Kaminski found that in many cases, a company thought its employee accused of harassment should have received a tougher punishment; however, arbitrators are bound by corporate policies, and if a company's policy doesn't clearly prohibit the conduct, the arbitrator can't enforce a penalty for it. Therefore, they recommend that anti-harassment policies be carefully crafted.

While arbitration can serve as a reasonable alternative to taking harassment cases to court, there are challenges that the current arbitration process presents. Arbitration takes place in private, which means that other employees and the public may never know the outcome. Additionally, Hickox said that most employees don't look very closely at new hire paperwork - or what rights they are signing away by agreeing to arbitration of all employment disputes.

"I believe that arbitration is a fair process and can be effective, but I'm a firm believer in consequences," Hickox said. "You can train people on harassment until they're blue in the face, but until there are clearer, more stringent policies from employers, the issue will continue."

Credit: 
Michigan State University

Drexel researchers create and stabilize pure polymeric nitrogen using plasma

video: Researchers at Drexel University's C&J Nyheim Plasma Institute have reported that liquid plasma can be used to produce an energy-dense material called polymeric nitrogen.

Image: 
Drexel University

Scientists have long theorized that the energy stored in the atomic bonds of nitrogen could one day be a source of clean energy. But coaxing the nitrogen atoms into linking up has been a daunting task. Researchers at Drexel University's C&J Nyheim Plasma Institute have finally proven that it's experimentally possible - with some encouragement from a liquid plasma spark.

As reported in the Journal of Physics D: Applied Physics, pure polymeric nitrogen - polynitrogen - can be produced by zapping a compound called sodium azide with a jet of plasma in the middle of a super-cooling cloud of liquid nitrogen. The result is six nitrogen atoms bonded together - a compound called ionic, or neutral, nitrogen-six - that is predicted to be an extremely energy-dense material.

"Polynitrogen is being explored for use as a 'green' fuel source, for energy storage, or as an explosive," said Danil Dobrynin, PhD, an associated research professor at the Nyheim Institute and lead author of the paper. "Versions of it have been experimentally synthesized - though never in a way that was stable enough to recover to ambient conditions or in pure nitrogen-six form. Our discovery using liquid plasma opens a new avenue for this research that could lead to a stable polynitrogen."

Previous attempts to generate the energetic polymer have used high pressure and high temperature to entice bonding of nitrogen atoms. But neither of those methods provided enough energy to excite the requisite ions - atomic bonding agents - to produce a stable form of nitrogen-six. And the polymeric nitrogen created in these experiments could not be maintained at a pressure and temperature close to normal, ambient conditions.

It's something like trying to glue together two heavy objects while only being strong enough to squeeze a few drops of glue out of the bottle. A bond strong enough to hold requires a force strong enough to squeeze out a lot of glue.

That force, according to the researchers, is a concentrated ion blast provided by liquid plasma.

Liquid plasma is the name given to an emission of ion-dense matter generated by a pulsed electrical spark discharged in a liquid environment - kind of like lightning in a bottle. Liquid plasma technology has barely been around for a decade, though it already holds a great deal of promise. It was pioneered by researchers at the Nyheim Institute, who have explored its use in a variety of applications, from health care to food treatment.

Because the plasma is encased in liquid, it is possible to pressurize the environment as well as control its temperature. This level of control is the key advantage that the researchers needed to synthesize polynitrogen because it allowed them to more precisely start and stop the reaction in order to preserve the material it produced. Dobrynin and his collaborators first reported their successful attempt to produce polynitrogen using plasma discharges in liquid nitrogen in a letter in the Journal of Physics D: Applied Physics over the summer.

In their most recent findings, the plasma spark sent a concentrated shower of ions toward the sodium azide - which contains nitrogen-three molecules. The blast of ions splits the nitrogen-three molecules from the sodium and, in the excited state, the nitrogen molecules can bond with each other. Not surprisingly, the reaction produces a good bit of heat, so putting the brakes on it requires an incredible blast of cold - the one provided by liquid nitrogen.

"We believe this procedure was successful at producing pure polynitrogen where others fell short, because of the density of ions involved and the presence of liquid nitrogen as a quenching agent for the reaction," Dobrynin said. "Other experiments introduced high temperatures and high pressures as catalysts, but our experiment was a more precise combination of energy, temperature, electrons and ions."

Upon inspection with a Raman spectrometer - an instrument that identifies the chemical composition of a material by measuring its response to laser stimulus - the plasma-treated material produced readings consistent with those predicted for pure polynitrogen.

"This is quite significant because until now scientists have only been able to synthesize stable polynitrogen compounds in the form of salts - but never in a pure nitrogen form like this at near-ambient conditions," Dobrynin said. "The substance we produced is stable at atmospheric pressure in temperatures up to about -50 Celsius."

Plasma in its original, gas-phase form has been under development for decades as a sterilization technology for water, food and medical equipment, and it is also being explored for coating materials. But this is the first instance of liquid plasma being used to synthesize a new material. This breakthrough could prove to be an inflection point in plasma research, at the Nyheim Institute and throughout the field.

"This discovery opens a number of exciting possibilities for producing polymeric nitrogen as a fuel source," said Alexander Fridman, PhD, John A. Nyheim Chair professor in Drexel's College of Engineering and director of the C&J Nyheim Plasma Institute and co-author of the paper. "This new, clean energy-dense fuel could enable a new age of automobiles and mass transportation. It could even be the breakthrough necessary to allow the exploration of remote regions of space."

Credit: 
Drexel University

Observing changes in the chirality of molecules in real time

image: Chiral molecules - compounds that are mirror images of each other -- play an important role in biological processes and in chemical synthesis. Chemists at ETH Zurich have now succeeded for the first time in using ultrafast laser pulses to observe changes in chirality during a chemical reaction in real time.

Image: 
ETH Zurich / Joachim Schnabl

Some molecules can exist in two mirror-image forms, similar to our hands. Although such so-called enantiomers have almost identical physical properties, they are not the same. This relationship of image and mirror image is called chirality (from the Greek cheir, for hand). In nature, however, often only one enantiomer exists, for example in amino acids, DNA or sugars. The enzymes that produce these molecules are themselves chiral and therefore produce only one type of enantiomer.

This preference of nature has far-reaching consequences. For example, the enantiomers of a drug can have completely different modes of action, such as being toxic or completely ineffective. The food and cosmetics industries are also interested in chirality because fragrances and flavors are perceived differently depending on the enantiomer. Chemists therefore often try to produce only one enantiomer or, if this is not possible, to separate mixtures of enantiomers.

To distinguish enantiomers from each other, chemists use polarized light, because the enantiomers rotate the plane of polarized light in opposite directions. The breaking or formation of chemical bonds takes place on a very short time scale, namely within a few femtoseconds (quadrillionths of a second). Existing measurement methods have not been able to monitor chirality over such short periods and thus follow a chemical process as it unfolds.

Understanding the reactions of chiral molecules

Researchers led by Hans Jakob Wörner, Professor at the Department of Chemistry and Applied Biosciences, have now developed a new method for observing changes in chirality directly during a chemical reaction in real time. The researchers have generated femtosecond laser pulses, with tailor-made temporally varying polarization, which are themselves chiral. This new approach enabled them for the first time to simultaneously achieve the necessary sensitivity to chirality and time resolution.

In their experiment, which the scientists reported in the journal PNAS, they excited the gaseous chiral molecule (R)-2-iodobutane with two ultra-short ultraviolet laser pulses. The excitation caused the bond between carbon and iodine to break. In this process, the 2-butyl radical is initially formed in a chiral conformation, which rapidly loses its chirality. With the help of the newly developed polarized laser pulses, they were then able to follow live how the chirality disappears after the bond breaks and the iodine atom is cleaved off.

According to the scientists, this new method can also be applied in the liquid or solid phase to observe extremely rapid changes in molecular chirality. The possibility of making chiral photochemical processes directly accessible on such short time scales makes it possible to better understand the reactions of chiral molecules. This could facilitate the development of new or improved methods for the production of enantiomerically pure compounds.

Credit: 
ETH Zurich

Diverging trends: Binge drinking and depression

November 14, 2019 -- Binge drinking among U.S. adolescents declined precipitously from 1991 to 2018, according to a new study at Columbia University Mailman School of Public Health. Depressive symptoms among U.S. adolescents have sharply increased since 2012. And for the first time in the past 40 years, binge drinking and depressive symptoms among adolescents are no longer associated. The findings are published online in the Journal of Adolescent Health.

"Comorbidity of depression and drinking is among the bedrocks of psychiatric epidemiology findings - until now. Our results suggest that we need to be re-thinking the connections between mental health and alcohol among young people," said Katherine M. Keyes, PhD, associate professor of epidemiology at Columbia Mailman School of Public Health.

Data were drawn from the U.S. nationally representative Monitoring the Future surveys from 1991-2018 for 58,444 school-attending 12th-grade adolescents. Binge drinking was measured as having more than five drinks during the past two weeks. Depressive symptoms were measured based on agreeing or disagreeing with statements that life is meaningless or hopeless.

The relationship between depressive symptoms and binge drinking decreased by 16 percent overall from 1991 to 2018: 24 percent among girls and 25 percent among boys. There has been no significant relationship between depressive symptoms and binge drinking among boys since 2009; among girls, the relationship has been positive throughout most of the study period.

The results suggest that, on average, the relationship between binge drinking and depressive symptoms is dynamically changing and decoupling, according to the researchers.

"Although comorbidity between alcohol consumption and mental health is complex, the landscape of the adolescent experience is changing in ways that may affect both consumption and mental health," observed Keyes. "The declining correlation between binge drinking and mental health is occurring during a time of unprecedented decreases in alcohol consumption among U.S. adolescents and increases in mental health problems. Therefore, the relationship between substance use and mental health may need to be reconceptualized for ongoing and future research."

Credit: 
Columbia University's Mailman School of Public Health

Global climate change concerns for Africa's Lake Victoria

image: Members of the team explored outcrops along Lake Victoria in western Kenya. These sediments were used to understand the history of the lake over the past 100,000 years.

Image: 
Emily Beverly

Global climate change could cause Africa's Lake Victoria, the world's largest tropical lake and source of the Nile River, to dry up in the next 500 years, according to new findings from a team of researchers led by the University of Houston. Even more imminent, the White Nile -- one of the two main tributaries of the Nile -- could lose its source waters in just a decade.

Using ancient sediment from outcrops along the edge of the lake, Emily Beverly, assistant professor of sedimentary geology at the UH College of Natural Sciences and Mathematics, along with researchers at Baylor University, generated a water-budget model to see how Lake Victoria's levels respond to changes in evaporation, temperature, rainfall and solar energy. Their findings, published in Earth and Planetary Science Letters, indicate a rapid lake level decline was very possible tens of thousands of years ago and could happen again in the future.

"Our model predicts that at current rates of temperature change and previous rates of lake level fall, Lake Victoria could have no outlet to the White Nile in as little as 10 years. Every major port in Lake Victoria could be landlocked within a century, and Kenya could lose access to the lake in 400 years," Beverly explained.

The result would significantly affect the economic resources supplied by the lake and the livelihoods of approximately 40 million people living in the Lake Victoria Basin.

Kenya and Tanzania depend on the lake's freshwaters to support their fishing industries; the lake yields more than one million tons of fish annually.

Uganda would be deprived of its primary source of electricity via hydropower and the water that sustains the Nile during non-flood stages.

The Kagera River, the main river flowing into Lake Victoria, feeds rainwater to Rwanda and Burundi, which rely on agriculture and livestock production.

Lake Victoria gets most of its water from rain, and each year, the area gets about 55 inches of rainfall. The sediment analyzed from along the lake shows rainfall levels from 35,000 to 100,000 years ago were about 28 inches, or almost half of what they are today. The water-budget model in the study shows low amounts of rainfall caused the lake to dry up at least three times in the past 100,000 years and could happen again.

"It's so warm there and the sun is so strong because you are at the equator that evaporation is very high," said Beverly. "If the water balance is thrown off, the lake can dry up very quickly. It doesn't take much of a drop in precipitation to change it."

This study was made possible with grants from the National Science Foundation, National Geographic Society, Leakey Foundation, Geological Society of America and Society for Sedimentary Geology totaling more than $200,000. Other collaborators include University of Connecticut, University of Utah and the University of Cambridge.

Credit: 
University of Houston

For some urban areas, a warming climate is only half the threat

Climate researchers predict that global temperatures will increase by as much as 2 degrees C by 2050 due to growing concentrations of greenhouse gases in the planet's atmosphere.

But for many of the world's urban areas, a temperature rise due to the burning of fossil fuels is only half of it.

A new study from the Yale School of Forestry & Environmental Studies (F&ES) projects that the growth of urban areas in the coming decades will trigger "extra" warming due to a phenomenon known as the urban heat island effect (UHI). According to their findings, urban expansion will cause the average summer temperature in these areas to increase about 0.5 to 0.6 degrees C -- but up to 3 degrees C in some locations.

This warming, they show, will increase extreme heat risks for about half of the world's future urban population -- particularly in tropical regions in the Southern Hemisphere, where climate models already project stronger warming due to greenhouse gas emissions and where there is less capacity for adaptation. In these vulnerable regions, the authors argue, policies that restrict or redistribute urban expansion and planning strategies that mitigate UHIs are needed to reduce the effects on human health, energy systems, ecosystems, and urban infrastructure.

"We know that 60 to 70 percent of the world's population will be living in urban areas by midcentury," said Kangning Huang, a Ph.D candidate at F&ES and lead author of the paper published in Environmental Research Letters. "So if you don't know the change in urban heat island and you only focus on climate change from greenhouse gas-emissions, then you are underestimating the level of warming and heat that two-thirds of the global population will be exposed to."
 

The paper was co-authored by Karen Seto, the Frederick C. Hixon Professor of Geography and Urbanization Science at F&ES, as well as Xia Li from East China Normal University in Shanghai and Xiaoping Liu from Sun Yat-Sen University in Guangzhou, China.

According to the authors, this research represents an important step toward understanding the future of urbanization. Although the UN's World Urbanization Prospects has offered a trove of data on future urban population growth, it lacks specific information on how that growth will be distributed within countries and how land uses will change.

Moreover, although the UN Intergovernmental Panel on Climate Change reports incorporate modeling studies on future warming due to greenhouse gas emissions, they do not provide global-scale projections of warming as a consequence of increasing UHIs.

The urban heat island effect is a phenomenon in which urban areas are significantly warmer than the surrounding countryside, inducing local climate effects that compound the global effects of greenhouse gas emissions.

In the new study, the authors predict the likely distribution of urban population growth based on a range of factors, including economic growth, topography, and transportation networks, assuming a continuation of historical trends in urban expansion. Then, using mathematical models, they forecast the intensification of the urban heat island effect down to local and regional scales. (The model produces spatial results at a resolution of five square kilometers.)

"If you want to do a comprehensive assessment of the risk of warming, you need to have this information," Huang said. "Climate impacts depend on location; it will be different, for instance, from Florida to California. So it's important to know where people are living and the resulting size of the urban areas."
 

Likewise, any future policy decisions will be dependent on local and regional conditions, he said. Although the paper doesn't make policy recommendations, it offers policymakers and scientists across the world a glimpse into what the future might look like in their regions, and an opportunity to craft policies that respond to those projections.

Among other findings, they report that more than 70 percent of new urban lands will concentrate in the more humid temperate and tropical zones. In North America and Europe, urban expansion will take place in temperate and cold zones, while in China, most of the expansion will occur in temperate zones in the country's eastern and southern sections. In Latin America, Southeast Asia, and Sub-Saharan Africa, the majority of new urban areas will be located in tropical zones.

While the UHI effects of this urban land expansion will vary largely across climate zones and regions, the hottest cores of medium-sized urban clusters (defined as 100 to 5,000 square kilometers) will see daily summertime temperature increases of 1 to 3 degrees C in some places. (According to their findings, UHI intensity tends to rise most steeply as urban areas reach a size of 100 square kilometers and then plateau at about 5,000 square kilometers: in other words, large cities typically have already reached their "heat island" plateau.) Many of these mid-sized cities, the authors note, may lack the capacity to adapt to stronger warming in comparison with larger cities.

The authors also estimate that urban areas will increase by up to 1.3 million square kilometers between 2015 and 2050, an increase of 171 percent over the global urban footprint in 2015. This would be the equivalent of building a new city the size of New York City every eight days for the next 35 years. More than 70 percent of the new urban lands will be concentrated in humid temperate and tropical zones, primarily in Asia and Africa.

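The quoted expansion figures are internally consistent; a back-of-envelope sketch (New York City's approximate land area of 780 square kilometers is an assumption not stated in the article):

```python
# Back-of-envelope check of the urban land expansion figures quoted above.
growth_km2 = 1.3e6  # projected urban land added, 2015-2050
nyc_km2 = 780       # approximate land area of New York City (assumed)
years = 35

cities = growth_km2 / nyc_km2           # NYC-sized cities' worth of new land
days_per_city = years * 365.25 / cities
print(f"one NYC-sized city every {days_per_city:.1f} days")  # roughly 8 days

# A 171% increase implies a 2015 global urban footprint of about:
footprint_2015 = growth_km2 / 1.71
print(f"implied 2015 footprint: {footprint_2015 / 1e6:.2f} million km^2")
```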
These findings will help complement a growing body of research into the potential effects of a warmer, more urbanized world, Seto said.

"This is the first and only study that projects urban land expansion globally through 2050. It fills a critical knowledge gap in understanding the resource demands of future urbanization, such as for land, energy, materials," she said. "At the same time, this study also helps us understand how cities can modify their local environment to mitigate the impacts of urban-induced changes in local and regional temperatures."

Credit: 
Yale School of the Environment

What felled the great Assyrian Empire? A Yale professor weighs in

image: Deportees after the Assyrian siege of Lachish, Judea (701 B.C.E.). Detail from bas-relief removed from Sennacherib's 'Palace Without Rival,' Nineveh, Iraq, and now in The British Museum.

Image: 
The British Museum

The Neo-Assyrian Empire, centered in northern Iraq and extending from Iran to Egypt -- the largest empire of its time -- collapsed after more than two centuries of dominance at the fall of its capital, Nineveh, in 612 B.C.E.

Despite a plethora of cuneiform textual documentation and archaeological excavations and field surveys, archaeologists and historians have been unable to explain the abruptness and finality of the historic empire's collapse.

Numerous theories about the collapse have been put forward since the city and its destruction levels were first excavated by archaeologists 180 years ago. But the mystery of how two small armies -- the Babylonians in the south and the Medes in the east -- were able to converge on Nineveh and completely destroy what was then the largest city in the world, without any reoccupation, has remained unsolved.

A team of researchers -- led by Ashish Sinha, California State University, Dominguez Hills, and using archival and archaeological data contributed by Harvey Weiss, professor of Near Eastern archaeology and environmental studies at Yale -- was able for the first time to determine the underlying cause for the collapse. By examining new precipitation records of the area, the team discovered an abrupt 60-year megadrought that so weakened the Assyrian state that Nineveh was overrun in three months and abandoned forever. The research was published in Science Advances on Nov. 13.

Assyria was an agrarian society dependent on seasonal precipitation for cereal agriculture. To its south, the Babylonians relied on irrigation agriculture, so their resources, government, and society were not affected by the drought, explains Weiss.

The team analyzed stalagmites -- a type of speleothem that grows up from a cave floor and is formed by the deposit of minerals from water -- retrieved from Kuna Ba cave in northeast Iraq. Speleothems can provide a history of climate through the oxygen and uranium isotope ratios of infiltrating water that are preserved in their layers. Oxygen in rainwater comes in two main varieties: heavy and light. The ratio of heavy to light oxygen isotopes is extremely sensitive to variations in precipitation and temperature. Over time, uranium trapped in speleothems turns into thorium, allowing scientists to date the speleothem deposits.

Weiss and the research team synchronized these findings with archaeological and cuneiform records and were able to document the first paleoclimate data for the megadrought that affected the Assyrian heartland at the time of the empire's collapse, when its less drought-affected neighbors invaded. The team's research also revealed that this megadrought followed a high-rainfall period that facilitated the Assyrian empire's earlier growth and expansion.

"Now we have a historical and environmental dynamic between north and south and between rain-fed agriculture and irrigation-fed agriculture through which we can understand the historical process of how the Babylonians were able to defeat the Assyrians," said Weiss, adding that the total collapse of Assyria is still described by historians as the "mother of all catastrophes."

Through the archaeology and history of the region, Weiss was able to piece together how the megadrought data were synchronous with Assyria's cessation of long-distance military campaigns and its construction of irrigation canals that were similar to those of its southern neighbors but restricted in their agricultural extent. Other texts noted that the Assyrians worried about their alliances with distant places while also fearing internal intrigue, notes Weiss.

"This fits into a historical pattern that is not only structured through time and space, but a time and space that is filled with environmental change," says Weiss. "These societies experienced climatic changes that were of such magnitude they could not simply adapt to them," he adds.

With these new speleothem records, says Weiss, paleoclimatologists and archaeologists are now able to identify environmental changes in the global historical record that were unknown and inaccessible even 25 years ago. "History is no longer two-dimensional; the historical stage is now three-dimensional," said Weiss.

Credit: 
Yale University

UNH researchers find climate change and turf seaweed causing 'patchy' seascape

image: The abundance of this type of turf seaweed could likely impact species habitats and the structure of the food web.

Image: 
Jennifer Dijkstra/UNH

DURHAM, N.H. – The effects of climate change are becoming more apparent, from the rapidly warming Gulf of Maine to more frequent and severe storms and the spread of invasive turf seaweed. Researchers at the University of New Hampshire have found that these environmental developments are contributing to the transformation of the seafloor into a lower, patchier seascape dominated by shrub-like seaweed, which could affect species habitats and the structure of the food web.

“These shifts in nature have created a perfect breeding ground for much bushier, or turf, seaweed to take root,” says Jennifer Dijkstra, research assistant professor in UNH’s Center for Coastal and Ocean Mapping. “Our earlier research showed a clear increase in invasive seaweed in areas once dominated by tall blades of kelp, which are important in the protection of sea life, but this new research showed us just how widespread this shift is and the effects the turf seaweed could potentially have on the ecosystem.”

In their research, recently published in the journal Ecosphere, the researchers used high-resolution underwater video of the seafloor taken by moving back and forth in a “lawnmower” pattern. The footage of various seafloor habitats, collected from over 100 square miles, was transformed into photomosaics to generate a spatial seascape map. In the imaging, they identified over 23 different types of seaweed and broke the map down into areas dominated by turf, kelp or mixed seaweed. Overall, the lower-lying turf seaweed made up a significantly larger share of the seascape.

“We also looked at temperature increase by decades and saw that the warmer water temperatures central to climate change are likely shortening the growing season of kelp, which prefers colder conditions, but the more dominant forms of turf seaweed can thrive in these temperatures,” said Dijkstra. “So, turf that is already more easily dislodged by storms can become loose, reproduce and travel to take advantage of open space year-round in the Gulf of Maine.”

The researchers also examined the relationship between seascape patterns and patchiness and their relation to the abundance of fish in each habitat. Results showed that patch size correlated directly with the abundance of fish across habitat types and that areas with more turf seaweed may host fewer observed fish, specifically mid-trophic level species like cunner. A habitat dominated by lower, bushier turf could leave fish fewer places to hide and increase the time and energy that species like cunner spend seeking and defending their shelter, or may even cause them to occupy less sheltered areas where the risk of predation is greater.

Researchers say the trend for increasing temperatures combined with more frequent and intense storms and fluctuations of available resources could create better conditions for the turf seaweed to prosper and grow and therefore increase the overall seascape patchiness. Their research indicates that these changes could propagate up the food web, specifically affecting those species that are residential and seek refuge and food within these habitats.

Co-authors on this study, all from UNH, are Yuri Rzhanov, research professor of ocean engineering; Brandon S. O’Brien, Ph.D. student; Kristen Mello ‘14, project research specialist; and Amber Litterer ‘16, Center for Coastal and Ocean Mapping.

This project was supported by a grant from NOAA (NA15NOS4200) and a UNH Research Engagement award.

The University of New Hampshire inspires innovation and transforms lives in our state, nation and world. More than 16,000 students from all 50 states and 71 countries engage with an award-winning faculty in top-ranked programs in business, engineering, law, health and human services, liberal arts and the sciences across more than 200 programs of study. As one of the nation’s highest-performing research universities, UNH partners with NASA, NOAA, NSF and NIH, and receives more than $110 million in competitive external funding every year to further explore and define the frontiers of land, sea and space.

Images to Download:

https://www.unh.edu/sites/default/files/historic_kelp_bed.jpg
CAPTION: Photo, taken in the 1990s, showing the tall blades of kelp seaweed that once dominated the ocean floor in the Gulf of Maine and offered protection for certain species of sea life. PHOTO CREDIT: Larry Harris/UNH

https://www.unh.edu/sites/default/files/turf_dominated_seascape.jpg
CAPTION: Shrub-like seaweed that now dominates the seascape of the Gulf of Maine. Pictured here is the low-lying invasive seaweed known as Dasysiphonia japonica. The abundance of this type of turf seaweed is likely to affect species habitats and the structure of the food web. PHOTO CREDIT: Jennifer Dijkstra/UNH

https://www.unh.edu/sites/default/files/kelp_forest.png
CAPTION: Close up view of a bed of kelp seaweed that once dominated the sea floor in the Gulf of Maine. PHOTO CREDIT: Jennifer Dijkstra/UNH

https://www.unh.edu/sites/default/files/foster.jpg
CAPTION: The multiple colors in this photomosaic represent the more fragmented nature of a turf dominated seaweed community. PHOTO CREDIT: Jennifer Dijkstra/UNH

https://www.unh.edu/sites/default/files/lungingbmpkm.jpg
CAPTION: The abundance of green in this photomosaic shows the more cohesive dominance of kelp in this seaweed community. PHOTO CREDIT: Jennifer Dijkstra/UNH

Journal

Ecosphere

Credit: 
University of New Hampshire

Stanford researchers explore how citizens can become agents of environmental change

image: People take part in an environmental education initiative at Ninigret National Wildlife Refuge in Rhode Island. (Image credit: US Fish & Wildlife Service)


If you like to walk in the woods, raft a river, dig in a garden or look at butterflies, you could become an agent of change.

Science and policy may not be enough to solve complex environmental challenges ranging from species extinction to water pollution, but actively engaged citizens could tip the balance, according to a new Stanford-led study that provides a blueprint for empowering people to turn the tide of environmental destruction. In Biological Conservation, the researchers outline four key facets of programs that have been successful in motivating and training people to have a meaningful impact.

"Effective environmental education moves people to persistent action through engaging with issues in relevant ways," said study lead author Nicole Ardoin, an associate professor at the Stanford Graduate School of Education and a senior fellow at the Stanford Woods Institute for the Environment. "Without it, making sustained change on environmental and sustainability issues simply is not possible."

While the rates of climate change and species extinction intensify, the U.S. outdoor recreation sector is growing more than one and a half times faster than the overall economy. Historic federal legislation passed earlier this year is opening more than a million acres of wildlands to public access. Increasingly, nonprofit groups and government agencies are harnessing this growing interest in nature as a force for conserving it. The question for them has been what kinds of activities or educational engagement have the most measurable impact.

To get at that question, the researchers analyzed reviews of more than 100 environmental education programs that addressed topics such as habitat protection and restoration, water quality, energy conservation, climate change and recycling. From this, the researchers gleaned four keys to maximizing the chance that programs will help citizens make meaningful environmental impacts: focus on locally relevant issues; collaborate with experts; incorporate action-oriented learning strategies and approaches, such as hands-on experiments and policy recommendations; and measure outcomes.

The researchers point to citizen science as an example of the four principles at work. At its best, citizen science involves members of the general public participating in aspects of science initiatives from design to implementation. Their hands-on engagement in the conservation effort yields reliable, usable data. These initiatives provide community members with avenues for collecting and measuring local-scale data, working directly with scientists and research managers on relevant research and documenting outcomes.

Incorporating the study's findings into new environmental education programs could help more conservation organizations and agencies involve the public effectively in improving environmental quality. It could also help cyclists, river rafters, hikers and others contribute directly to the nature they enjoy.

Environmental education: Keys to success

1) Focus on local environmental issues or locally relevant dimensions of global issues

Example: Students in a reforestation project created nurseries and participated in plantings on school grounds, as well as in other deforested parts of the community, with the goal of reconnecting with local ecosystems and enhancing student capabilities to serve as custodians.

2) Collaborate with scientists and resource managers

Example: A program brought together prison inmates with scientists, students and natural area managers to encourage lifelong learning, support ecological research and promote habitat restoration through plant production and captive rearing of animals in correctional facilities. Conservation practitioners provided the knowledge that resulted in raising and releasing approximately 550 frogs, 4,000 butterflies and 1 million plants.

3) Incorporate action elements into programs

Example: University students' sea turtle research and habitat assessment resulted in a proposal to the Mexican government for a marine protected area, with recommendations for fishing activity zones for resource and ecotourism uses.

4) Measure and report program outcomes

Example: Researchers in a reforestation education initiative reported not only the number of trees planted but also provided data on the survival rates of planted trees.

Credit: 
Stanford University