Tech

As rural western towns grow, so do their planning challenges

Nestled among state parks, red rock buttes and breathtaking mountain vistas, Sedona, Arizona, is one of the most popular resort towns in the American West.

Today, many Sedona visitors and residents frequently find themselves stuck in traffic, struggling to find parking or encountering crowds of people in the wilderness. Increased tourism might be one of the reasons for these issues, but another is a large commuter workforce, according to a new study by researchers at the University of Arizona and University of Utah.

Census data shows that of the roughly 7,000 jobs in Sedona, about 5,000 of them - 74% - are held by people who live outside the city limits in larger towns, such as Flagstaff and the Phoenix metropolitan area, or in adjacent rural communities in the Verde Valley. For some of those commuters, living in Sedona isn't an option due to a cost of living their jobs can't support, the researchers found.

Sedona's story is indicative of a trend unfolding in many rural gateway communities across the American West, according to the new study, which is published in the Journal of the American Planning Association. Booming tourism and a steady increase in new residents present unprecedented urban planning challenges.

Quantifying Years of Anecdotal Evidence

Planners, residents and public officials in gateway communities - which also include Bisbee, Arizona; Jackson, Wyoming; and Moab, Utah - have for years seen anecdotal evidence of the planning challenges that accompany population and tourism growth. The new study sought to quantify those challenges, said lead author Philip Stoker, assistant professor in the UArizona School of Landscape Architecture and Planning in the College of Architecture, Planning and Landscape Architecture.

"Personally, I've just noticed it from going to all these places," said Stoker, an expert on urban water use and natural resource management whose research focuses on the western U.S. "Moab, Utah, is one of my favorites and it's been highly impacted, so I had kind of a personal motivation to do this."

Stoker and his collaborators conducted in-depth interviews with 33 public officials and surveyed more than 300 others from about 1,500 gateway communities across the western U.S.; coastal communities were not included. Officials were asked specific questions about their communities' planning challenges and opportunities.

The researchers did not interview or survey residents of the communities they studied - only public officials. The reasons for that were both logistical and strategic, Stoker said: Officials' email addresses were public record, and they were therefore easier to contact. But getting feedback from officials also meant that the study's data came from people who had strong knowledge of their communities' development challenges.

Housing Affordability, 'Small-town-ness' Were Major Issues

Among the study's most significant findings: Issues related to housing affordability were top-of-mind, "pervasive and urgent" for nearly all survey respondents and interviewees. Nearly 83% of respondents reported that housing affordability was either "moderately problematic," "very problematic" or "extremely problematic." Nearly all interviewees called housing affordability a key issue for their communities.

Home buyers from larger cities were a major factor in the rising cost of housing in gateway communities, Stoker said. Many people from large metropolitan areas have turned to gateway communities for retirement, vacation homes and - increasingly - remote work, he said. With their larger incomes, they were willing to pay more for homes, driving up prices for surrounding properties.

"If you've been living there and growing up in this community and you don't have a job that's paying the salary of someone who's in, for example, downtown Seattle, you're going to be excluded from this community and your ability to invest in land and property if you haven't already," Stoker said.

Respondents also said they were concerned about the effects of growth on their communities' character or "small-town-ness," a quality that nearly 94% of survey respondents said was important.

On the other hand, the study found that roughly 12% of the communities studied were shrinking in population, which came with a new set of problems - a dwindling tax base that led to less money for infrastructure improvements and other crucial expenses.

Tourism Not a Pressing Issue, Most Officials Said

One finding that came as a surprise to the researchers: Reported tensions between long-term residents and tourists or between long-term residents and short-term residents were lower than expected. Only 16% of survey respondents said too much tourism was "extremely problematic." Interviewees talked about their "love-hate relationship" with tourism, and also called it a "double-edged sword" because of the economic benefits it brings.

"One of the anecdotal things we were hearing about is there's always this kind of old-timer-versus-newcomer dynamic in these communities," Stoker said. "Public officials across the questionnaire didn't report that it was as serious as we thought."

The caveat with that finding is that it came from public officials, Stoker said, adding that average citizens may have reported stronger tensions.

Implementing Solutions

Stoker co-authored the study with Lindsey Romaniello, who earned her master's degree in urban planning from the University of Arizona in May; Danya Rumore, director of the Environmental Dispute Resolution Program at the University of Utah; and Zacharia Levine, a Ph.D. student at the University of Utah.

Romaniello found out about the study during a class Stoker was teaching and immediately wanted to be involved.

"It was exactly up my alley and what I wanted to study," said Romaniello, a native of Ridgway, Colorado, near the famous ski-resort town of Telluride - another gateway community identified in the study.

"I'm mostly interested in rural places and rural community planning, specifically mountain towns and resort towns," added Romaniello, who is now a planner for Missoula County in Montana. "It was exactly what I was interested in."

Researchers hope they can use the feedback they've collected to call attention to the need for proactive planning in gateway communities.

"Our goal here was that if we can identify the problems, our next step is looking at what strategies can help these communities maintain and adapt to growth as it happens, and then control growth, too, so that it's not just happening to them," Stoker said.

Many communities are already getting to work, he added. Nearly all of the growth and planning issues identified in the study are regional issues, meaning that communities in the same areas should work together to tackle them, he said.

Credit: 
University of Arizona

Can sunlight convert emissions into useful materials?

Shaama Sharada calls carbon dioxide -- the worst offender of global warming -- a very stable, "very happy molecule."

She aims to change that.

In a study recently published in the Journal of Physical Chemistry A, Sharada and a team of researchers at the USC Viterbi School of Engineering seek to break CO2 apart and convert the greenhouse gas into useful materials like fuels or consumer products ranging from pharmaceuticals to polymers.

Typically, this process requires a tremendous amount of energy. However, in the first computational study of its kind, Sharada and her team enlisted a more sustainable ally: the sun.

Specifically, they demonstrated that ultraviolet (UV) light could be very effective in exciting an organic molecule, oligophenylene. Upon exposure to UV, oligophenylene becomes a negatively charged "anion," readily transferring electrons to the nearest molecule, such as CO2 -- thereby making the CO2 reactive and able to be reduced and converted into things like plastics, drugs or even furniture.

"CO2 is notoriously hard to reduce, which is why it lives for decades in the atmosphere," Sharada said. "But this negatively charged anion is capable of reducing even something as stable as CO2, which is why it's promising and why we are studying it."

The rapidly growing concentration of carbon dioxide in the earth's atmosphere is one of the most urgent issues humanity must address to avoid a climate catastrophe.

Since the start of the industrial age, humans have increased atmospheric CO2 by 45% through the burning of fossil fuels and other emissions. As a result, average global temperatures are now roughly 1 degree Celsius warmer than in the pre-industrial era. Greenhouse gases like CO2 trap the sun's heat in our atmosphere, warming our planet.

The research team from the Mork Family Department of Chemical Engineering and Materials Science was led by third-year Ph.D. student Kareesa Kron, supervised by Sharada, a WISE Gabilan Assistant Professor. The work was co-authored by Samantha J. Gomez from Francisco Bravo Medical Magnet High School, who took part through the USC Young Researchers Program, which allows high school students from underrepresented areas to participate in STEM research.

Many research teams are looking at methods to convert CO2 that has been captured from emissions into fuels or carbon-based feedstocks for consumer products ranging from pharmaceuticals to polymers.

The process traditionally uses either heat or electricity along with a catalyst to speed up CO2 conversion into products. However, many of these methods are energy intensive, which is not ideal for a process aiming to reduce environmental impacts. Using sunlight instead to excite the catalyst molecule is attractive because it is energy efficient and sustainable.

"Most other ways to do this involve using metal-based chemicals, and those metals are rare earth metals," said Sharada. "They can be expensive, they are hard to find and they can potentially be toxic."

Sharada said the alternative is to use carbon-based organic catalysts for carrying out this light-assisted conversion. However, this method presents challenges of its own, which the research team aims to address. The team uses quantum chemistry simulations to understand how electrons move between the catalyst and CO2 to identify the most viable catalysts for this reaction.
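
For readers curious what such a quantum chemistry calculation looks like in practice, the sketch below uses the open-source PySCF package (an assumption made for illustration; the paper's actual software, basis sets and methods are not described here). It compares the energy of neutral CO2 with that of its radical anion to get a crude vertical electron affinity - the kind of energetic quantity that determines how hard it is to push an electron onto CO2.

```python
# Minimal, illustrative PySCF sketch (not the authors' actual workflow):
# estimate the vertical electron affinity of CO2 by comparing the neutral
# molecule with its radical anion at the same (linear) geometry.
from pyscf import gto, scf

co2 = "O 0 0 0; C 0 0 1.16; O 0 0 2.32"  # bond lengths in angstrom

neutral = scf.RHF(gto.M(atom=co2, basis="6-31g", charge=0, spin=0)).run()
anion = scf.UHF(gto.M(atom=co2, basis="6-31g", charge=-1, spin=1)).run()

# A negative value means the extra electron is unbound at this crude level of
# theory, consistent with CO2 being notoriously hard to reduce.
print("Vertical electron affinity estimate (hartree):", neutral.e_tot - anion.e_tot)
```

A real screening study would use larger basis sets and correlated or excited-state methods; this toy example only shows the mechanics of such a calculation.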

Sharada said the work was the first computational study of its kind, in that researchers had not previously examined the underlying mechanism of moving an electron from an organic molecule like oligophenylene to CO2. The team found that it could speed up the reaction through systematic modifications of the oligophenylene catalyst - adding functional groups, or groups of atoms that impart specific properties when bonded to a molecule, that push electrons toward the center of the catalyst.

Despite the challenges, Sharada is excited about the opportunities for her team.

"One of those challenges is that, yes, they can harness radiation, but very little of it is in the visible region, where you can shine light on it in order for the reaction to occur," said Sharada. "Typically, you need a UV lamp to make it happen."

Sharada said that the team is now exploring catalyst design strategies that not only lead to high reaction rates but also allow for the molecule to be excited by visible light, using both quantum chemistry and genetic algorithms.

The research paper marks high school student Gomez's first co-authored publication in a prestigious peer-reviewed journal.

Gomez was a senior at the Bravo Medical Magnet school at the time she took part in the USC Young Researchers Program over the summer, working in Sharada's lab. She was directly mentored and trained in theory and simulations by Kron. Sharada said Gomez's contributions were so impressive that the team agreed she deserved an authorship on the paper.

Gomez said that she enjoyed the opportunity to work on important research contributing to environmental sustainability. She said her role involved conducting computational research, calculating which structures were able to significantly reduce CO2.

"Traditionally we are shown that research comes from labs where you have to wear lab coats and work with hazardous chemicals," Gomez said. "I enjoyed that every day I was always learning new things about research that I didn't know could be done simply through computer programs."

"The first-hand experience that I gained was simply the best that I could've asked for, since it allowed me to explore my interest in the chemical engineering field and see how there are many ways that life-saving research can be achieved," Gomez said.

Credit: 
University of Southern California

Giant leap for molecular measurements

image: Laser pulses lasting for mere femtoseconds (one-quadrillionth of a second) are stretched to the nanosecond (one-billionth of a second) range.

Image: 
© 2020 Ideguchi et al.

Spectroscopy is an important tool of observation in many areas of science and industry. Infrared spectroscopy is especially important in the world of chemistry where it is used to analyze and identify different molecules. The current state-of-the-art method can make approximately 1 million observations per second. UTokyo researchers have greatly surpassed this figure with a new method about 100 times faster.

From climate science to safety systems, manufacture to quality control of foodstuffs, infrared spectroscopy is used in so many academic and industrial fields that it's a ubiquitous, albeit invisible, part of everyday life. In essence, infrared spectroscopy is a way to identify what molecules are present in a sample of a substance with a high degree of accuracy. The basic idea has been around for decades and has undergone improvements along the way.

In general, infrared spectroscopy works by measuring infrared light transmitted or reflected from molecules in a sample. The sample's inherent vibrations alter the characteristics of the light in very specific ways, essentially providing a chemical fingerprint, or spectrum, which is read by a detector and analyzer circuit or computer. Fifty years ago the best tools could measure one spectrum per second, and for many applications this was more than adequate.

More recently, a technique called dual-comb spectroscopy achieved a measurement rate of 1 million spectra per second. However, in many instances more rapid observations are required to produce fine-grained data. For example, some researchers wish to explore the stages of certain chemical reactions that happen on very short time scales. This drive prompted Associate Professor Takuro Ideguchi from the Institute for Photon Science and Technology at the University of Tokyo and his team to create the fastest infrared spectroscopy system to date.

"We developed the world's fastest infrared spectrometer, which runs at 80 million spectra per second," said Ideguchi. "This method, time-stretch infrared spectroscopy, is about 100 times faster than dual-comb spectroscopy, which had reached an upper speed limit due to issues of sensitivity." Given there are around 30 million seconds in a year, this new method can achieve in one second what 50 years ago would have taken over two years.

Time-stretch infrared spectroscopy works by stretching a very short pulse of laser light transmitted from a sample. As the transmitted pulse is stretched, it becomes easier for a detector and accompanying electronic circuitry to accurately analyze. A key high-speed component that makes it possible is something called a quantum cascade detector, developed by one of the paper's authors, Tatsuo Dougakiuchi from Hamamatsu Photonics.

"Natural science is based on experimental observations. Therefore, new measurement techniques can open up new scientific fields," said Ideguchi. "Researchers in many fields can build on what we've done here and use our work to enhance their own understanding and powers of observation."

Credit: 
University of Tokyo

Drones can be a source of disturbance to wintering waterbird flocks

Newly published research in Bird Study, carried out by the British Trust for Ornithology (BTO) in Scotland, shows that wintering waterbirds such as ducks, geese, swans and wading birds can easily be scared into flight by drones.

In recent years, drone technology has improved rapidly, while at the same time the drones themselves have become ever cheaper and are produced in ever greater quantities. Drones are now being used for recreational photography, surveillance, ecological research, remote sensing and even to deliver packages. The mass proliferation of drones and the increasing likelihood of commercial and recreational drone use taking place close to wildlife creates a new and potentially significant source of disturbance to wild birds.

Such disturbance, which could affect rare and protected species, causes birds to waste energy and reduces their feeding time. In extreme cases, birds might stop using an area altogether, and be forced to feed elsewhere, where feeding opportunities may be poorer or the risk of predation higher. This could be particularly harmful during the cold winter months, when vast numbers of waterbirds come to Britain from the Arctic to feed up before the breeding season.

BTO scientists flew a commercially available quadcopter drone towards waterbird flocks in coastal, freshwater and arable crop farmland habitats. While one researcher flew the drone at a standard speed and height towards the flock, another observed the flocks through a telescope to record any responses to the drone as it approached, including alarm calls, signs of heightened alert levels and taking flight.

The BTO team found that larger flocks were more likely to take flight than smaller flocks, and large flocks also took flight at a greater distance from the drone than smaller flocks. This is probably because the larger the flock, the more likely there is to be a sensitive individual present - in almost all cases, once one bird had responded to the drone, the rest of the flock followed.
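
The "sensitive individual" explanation can be illustrated with a simple probability sketch (the per-bird response probability used below is an invented value, not an estimate from the study):

```python
# Illustrative only: if each bird independently takes flight with probability p
# as the drone approaches, the chance that at least one bird in a flock of n
# responds is 1 - (1 - p)**n, which rises quickly with flock size. Once one
# bird flushes, the rest of the flock tends to follow.
p = 0.05  # assumed per-bird response probability (not from the study)
for n in (10, 50, 200):
    print(f"flock of {n}: {1 - (1 - p) ** n:.2f}")
# flock of 10: 0.40, flock of 50: 0.92, flock of 200: 1.00 (approximately)
```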

The researchers also found that the habitat the birds were in had a strong effect on responses. Birds at inland lochs where there was already lots of human activity were very unlikely to respond to the drone, while birds at coastal sites were more likely to respond. Birds in arable farmland were particularly sensitive - flocks feeding in this habitat are probably most susceptible to disturbance because of the need to be on the lookout for predators.

Lead author, David Jarrett, said: "While we expected that the drone would cause large flocks to flush, we were surprised that birds hardly seemed to respond to the drone at all at those inland lochs where there was already lots of human activity taking place. Hopefully this research can be used to help inform guidance and regulations on drone use in proximity to wild birds."

Britain hosts internationally important flocks of waterbirds outside the breeding season. While it has been thought that drones could be useful in monitoring their numbers, the disturbance caused by such monitoring would have to be carefully evaluated. If drone use were to become more frequent at important sites for our wintering waterbirds, and birds did not become accustomed to this novel form of disturbance, then the resulting increases in energy expenditure and stress would be likely to negatively affect their populations.

Credit: 
Taylor & Francis Group

New York and California may have already achieved herd immunity -- Ben-Gurion U. researcher

NEW YORK...September 1, 2020 -- Ben-Gurion University of the Negev (BGU) data scientist Prof. Mark Last sees the end of the coronavirus peak in Israel and believes that New York and California may have reached herd immunity.

Prof. Last of the BGU Department of Software and Information Systems Engineering presented these findings virtually at the Artificial Intelligence and the Coronavirus workshop at the International Conference on Artificial Intelligence in Medicine (AIME) on August 26. He has been analyzing health data for the past 20 years.

The findings are based on the SIR model of infection dynamics, which is being used to project COVID-19 scenarios. In this model, the population is assigned to compartments labeled S, I or R (Susceptible, Infectious or Recovered). Such models can show how different public health interventions may affect the spread of an epidemic - for example, the most efficient way to distribute a limited number of vaccines in a given population.
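
For readers unfamiliar with the approach, the sketch below shows the basic mechanics of an SIR model (illustrative only; it is not Prof. Last's actual model, and the parameters are invented). The ratio beta/gamma plays the role of the basic reproduction number.

```python
# Minimal SIR sketch (illustrative; not the model described in the article).
# Susceptible (S), infectious (I) and recovered (R) compartments are updated
# day by day with simple difference equations.
def run_sir(population, infected0, beta, gamma, days):
    s, i, r = population - infected0, float(infected0), 0.0
    history = []
    for _ in range(days):
        new_infections = beta * s * i / population
        new_recoveries = gamma * i
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        history.append((s, i, r))
    return history

# Example run with beta/gamma = 1.14, roughly the reproduction number quoted below.
trajectory = run_sir(population=1_000_000, infected0=100, beta=0.114, gamma=0.1, days=730)
print("peak number infectious:", round(max(i for _, i, _ in trajectory)))
```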

In late June, New York State was close to reaching herd immunity, according to the SIR model - the point at which the effective reproduction number of the disease falls below one. Based on the steady decrease in reported mortality rates since then, Last estimated the basic reproduction number under the current social distancing restrictions at 1.14. The basic reproduction number is the average number of secondary infections an infected person will cause in a completely susceptible population.

At that time, New York had approximately 400,000 confirmed cases, implying 2.4 million (6x more) actual infections based on the results of serological tests conducted in the state.
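
A rough back-of-the-envelope check ties these figures together (this is illustrative, not part of the study; the New York State population of about 19.5 million is an assumed value). For a simple SIR-type model, the herd-immunity threshold is 1 - 1/R0:

```python
# Back-of-the-envelope check (illustrative; the NY population figure is assumed).
r0 = 1.14                        # reproduction number under restrictions, per the article
threshold = 1 - 1 / r0           # classic SIR herd-immunity threshold
actual_infections = 400_000 * 6  # confirmed cases scaled by the serology-based factor
ny_population = 19_500_000       # assumption, not stated in the article

print(f"herd-immunity threshold: {threshold:.1%}")                                   # ~12.3%
print(f"estimated share already infected: {actual_infections / ny_population:.1%}")  # ~12.3%
```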

Prof. Last says that these are similar to his estimates for California and Israel.

"In California, it appears that herd immunity was reached around July 15 with slightly more than 10% of their population (4.05 million) being infected," he says. "This means that their basic reproduction number R0 under current restrictions is only 1.1.

"In Israel, a further lockdown is not necessary if the current restrictions are maintained and there are no unusual spreading events," Last says. "If we maintain the current restrictions, then my model predicts that we are at the end of this peak, which should tail off at the end of August or the beginning of September. Moreover, according to my calculations, we need 1.16 million people with antibodies in order to achieve herd immunity and we are very close to that number," he says.

"If there is no unusual outbreak because of the return to school or mass indoor gatherings, then the infection rate will start dropping. While another lockdown would certainly reduce infection rates, there is no need at the present time since social and physical distancing is working to lower infection rates."

However, the outlook for patients admitted to intensive care units in Israel for COVID-19 is dire, with an estimated 80% mortality rate, according to Prof. Last's calculations. According to the World Health Organization, the global percentage is currently about 60%. In previous research unconnected to COVID-19, Last revealed that there is an average 20% mortality among all patients admitted to ICUs.

Prof. Last's model is based on the COVID-19 attributed deaths reported by the Israeli Ministry of Health on a daily basis and an estimation of the total number of infected people based on published results of serological tests rather than just on confirmed cases.

"We cannot know the actual number of cases of infection unless we test the entire population every day. Initial serological tests conducted in Israel indicate the ratio of confirmed cases to actual cases is about 1:10. Using those numbers, we now have slightly above one million people with antibodies in Israel and we need at least 1.2 million," he says.

Therefore, he is cautiously optimistic about the COVID-19 epidemic in Israel. "We are heading in the right direction, but it is important not to relax our restrictions or get overconfident," he warns.

Credit: 
American Associates, Ben-Gurion University of the Negev

Red fox displaces Arctic fox thanks to littering

image: The density of red foxes is increasing in Norway's mountainous areas. The more trash and food waste red foxes have access to, the greater their numbers. This photo was taken with a game camera and shows a red fox that has found food.

Image: 
NINA, game camera

Animal species that are at home in the high mountains are finding their habitats reduced and fragmented by roads. In addition, they face competition from scavengers from lower boreal areas that find their way to the mountains.

"More cabins, more tourism and increased car traffic means more litter and more roadkill. For the red fox, the crow and other scavengers, it means more tempting food," says Lars Rød-Eriksen, who is employed as a researcher in terrestrial ecology at NINA, the Norwegian Institute for Nature Research.

In his doctoral work at the Norwegian University of Science and Technology (NTNU), Rød-Eriksen surveyed road segments at Dovre, Saltfjellet and Hardangervidda to learn how wildlife is affected by the highways.

Roads = food

"We found that the red fox uses the road both to find food and to move from place to place. Especially in the winter, using the roadways is easier than travelling across the snowy terrain," he says.

"Using tracks in the snow and game cameras, we were able to document that the density of red foxes increases the closer to the road one gets. The more litter and food waste they have access to, the greater the number of red foxes that find their way to the area."

The researcher notes that the pattern is the opposite for Arctic foxes. "A lot of trash means few Arctic foxes. We found that the Arctic fox doesn't tend to stay close to the road. This is probably not because they aren't attracted to the road, but because the presence of the red fox makes them keep their distance."

Weaker species displaced

Small rodents are the Arctic fox's specialty fare, but it isn't "too fussy" to eat trash. In competing with the red fox, however, it falls short.

"The Arctic fox is also attracted to roads, but the red fox is bigger and dominates in the competition between the species. There are also examples of red foxes that have killed Arctic foxes. Increased access to food enables the red fox to establish itself in the high alpine zone. The search for food is especially intense in late winter," Rød-Eriksen says.

The crow is both a competitor and a useful helper for the fox. Often crows are the first to discover a treat, but foxes are observant and use the crows to guide them to where the food is.

Unwelcome in the mountains

"The red fox has existed in the mountains before. But it's an invasive species and can disrupt the natural alpine ecosystem if it establishes itself there permanently, like it seems to be doing now. The Arctic fox is already an endangered species, and it seems likely that the red fox is impacting other alpine species as well, such as ptarmigan, that are ground nesters. We call it a cascade effect when several species are affected," says Rød-Eriksen.

How about a litter law?

More roads and increased traffic also mean more roadkill. Rød-Eriksen believes it's easier to tackle the littering problem than the roadkill.

"Information campaigns can inform people about the consequences of throwing out and leaving trash and food scraps behind. A lot of people probably don't give any thought to how littering can negatively impact wildlife. Other countries have stricter legislation against littering. Maybe Norway should also consider it. Personally, I think it would be effective," says Rød-Eriksen.
 

To record the movements of red foxes and Arctic foxes during the winter, Rød-Eriksen used tracking, supplemented by a game camera with bait at different distances from the road. These methods yielded good and reliable findings.

Summertime proved more difficult. The crows found the prey before the fox and often managed to eat it before the fox could get to it. Rød-Eriksen also placed artificial bird nests containing a real quail egg and a fake egg made from modelling clay along the transects.

The idea was that bite marks in the fake, soft egg would reveal whether a fox or a crow had tried to eat it. Here too, the crow created problems that made the results less reliable during the summer. Rød-Eriksen plans to take a closer look at seasonal variations and more comparable methods in future studies.

Credit: 
Norwegian University of Science and Technology

Your paper notebook could become your next tablet

image: Purdue engineers developed a simple printing process that renders any paper or cardboard packaging into a keyboard, keypad or other easy-to-use human-machine interfaces.

Image: 
Purdue University/Ramses Martinez

WEST LAFAYETTE, Ind. - Innovators from Purdue University hope their new technology can help transform paper sheets from a notebook into a music player interface and make food packaging interactive.

Purdue engineers developed a simple printing process that renders any paper or cardboard packaging into a keyboard, keypad or other easy-to-use human-machine interfaces. This technology is published in the Aug. 23 edition of Nano Energy. Videos showing this technology are available at https://youtu.be/TfA0d8IpjWU, https://youtu.be/J0iCxjicJIQ and https://youtu.be/c9E6vXYtIw0.

"This is the first time a self-powered paper-based electronic device is demonstrated," said Ramses Martinez, an assistant professor in Purdue's School of Industrial Engineering and in the Weldon School of Biomedical Engineering in Purdue's College of Engineering. "We developed a method to render paper repellent to water, oil and dust by coating it with highly fluorinated molecules. This omniphobic coating allows us to print multiple layers of circuits onto paper without getting the ink to smear from one layer to the next one."

Martinez said this innovation facilitates the fabrication of vertical pressure sensors that do not require any external battery, since they harvest the energy from their contact with the user.

This technology is compatible with conventional large-scale printing processes and could easily be implemented to rapidly convert conventional cardboard packaging or paper into smart packaging or a smart human-machine interface.

"I envision this technology to facilitate the user interaction with food packaging, to verify if the food is safe to be consumed, or enabling users to sign the package that arrives at home by dragging their finger over the box to proper identify themselves as the owner of the package," Martinez said. "Additionally, our group demonstrated that simple paper sheets from a notebook can be transformed into music player interfaces for users to choose songs, play them and change their volume."

Credit: 
Purdue University

Stealing information from host plants: How the parasitic dodder plant flowers

image: Dodder (Cuscuta australis) flowers together with the host cucumber.

Image: 
GUO Han

About 4,000-5,000 parasitic plant species exist. Among these, dodders (Cuscuta, Convolvulaceae) are distributed worldwide. Compared with normal autotrophic plants, they have a unique morphology - they are rootless and leafless and carry out no or very little photosynthesis.

Flowering is critical to reproduction in higher plants. Leaves sense environmental factors, such as day length (photoperiod), and initiate flowering programs when the environment and internal physiology are appropriate.

However, sequencing of dodder genomes has indicated that dodders have lost many genes that are critically important for controlling flowering in autotrophic plants. Thus, dodders are likely to have an exceptional flowering mechanism.

Recently, researchers led by WU Jianqiang from the Kunming Institute of Botany of the Chinese Academy of Sciences uncovered the underlying mechanism for dodder flowering. The team first investigated the flowering time of the dodder Cuscuta australis and found that C. australis always synchronizes its flowering time with the flowering time of its hosts.

The FT gene is very well conserved, and it encodes a very important signaling protein that activates flowering. However, analysis of the FT gene in C. australis suggests that the dodder FT gene is a pseudogene (i.e., it is nonfunctional).

Using biochemical tools, the team further demonstrated that when the host expresses FT signals, the flowering-inducer FT can travel into C. australis and activate the flowering program of C. australis.

"The dodder does not flower autonomously; instead, when the host plant produces the FT signal protein to activate flowering, the host-produced FT protein is transported into the dodder, thereby activating dodder flowering," said WU.

This study reveals that by eavesdropping on host FT flowering signals, the dodder can synchronize its flowering with its hosts.

This behavior is important, as it may enable dodders to parasitize diverse host plants: If the dodder had a fixed flowering time much later than its host's, the host's nutrient level would usually have dropped because of seed development by the time the dodder was ready to flower, and the host might even die before the dodder flowers.

On the other hand, if the dodder flowers much earlier than the host, its growth ends prematurely, and it does not make as many seeds as a dodder that flowers at about the same time as its host.

This study sheds important light on the physiology, ecology, and evolution of dodders and may provide new strategies for biocontrol of parasitic weeds in agriculture and forestry.

Credit: 
Chinese Academy of Sciences Headquarters

Dodder uses the flowering signal of its host plant to flower

image: Dodder Cuscuta australis on a soybean host plant: The parasite is flowering and has already produced seed capsules. It uses its host's flowering signal for flower formation.

Image: 
Jingxiong Zhang, Kunming Institute of Botany, Chinese Academy of Sciences, China

The plant genus Cuscuta consists of more than 200 species that can be found almost all over the world. The parasites, known as dodder, but also called wizard's net, devil's hair or strangleweed, feed on other plants by attaching themselves to their hosts via a special organ, the haustorium, and withdrawing nutrients from them. They have neither roots nor leaves. Without leaves, they are hardly able to photosynthesize. Without roots they cannot absorb nutrients and water from the soil. On the other hand, they are integrated into the internal communication network of their host plants and can even pass on warning signals from plant to plant (see our press release Dodder: a parasite involved in the plant alarm system, July 24, 2017).

A team of scientists led by Jianqiang Wu, who has been the leader of a Max Planck Partner Group at the Kunming Institute of Botany, Chinese Academy of Sciences, now asked how the parasites manage to synchronize flowering with their hosts. They had observed that plants of the Australian dodder (Cuscuta australis) adjusted the time of their flowering to that of their respective host plant species.

Flower promoting signal FT from the host also determines the flowering time of the parasite

"The flowering time is controlled by leaves, as leaves can sense environmental cues and synthesize the flowering signal, a protein named FLOWERING LOCUS T (FT), which travels through the plant vascular system. We therefore wondered how a leafless parasite such as Cuscuta australis controls the timing of its flowering," says lead investigator Jianqiang Wu. In 2018, his team had sequenced the genome of C. australis and found that many genes important for regulation of flowering time were lost in C. australis genome. Therefore, C. australis seems to be unable to activate its own flowering mechanism.

Based on the fact that FT proteins are mobile signals, the researchers hypothesized that dodder eavesdrops on the flowering signals produced by the leaves of its host and uses them for producing its own flowers. To prove this eavesdropping scenario, they used genetically modified host plants in which the expression of FT genes had been altered, and this indeed affected the flowering time of the parasite. They also coupled the FT protein to a green fluorescent protein (GFP) as a tag and detected the host plant's flower promoting signal in the parasite: The tagged FT protein had migrated from host to parasite.

For dodder, the best strategy is to synchronize flowering with that of its host. If it flowers much later than its host does, it may not be able to produce seeds at all, as the nutrients in the host are rapidly drained by the host's reproductive tissues; the host may even die before the parasite starts to produce seeds. However, if dodder flowers too early, its growth is likely to end prematurely, and it may not produce as many seeds as dodder plants whose flowering time is synchronized with that of their hosts.

Regressive Evolution: Gene loss as an advantage

In the course of evolution, plant parasites have lost certain traits and "outsourced" physiological processes. As a result, various genes in their genomes may be lost. "This work establishes that for a plant parasite, losing control over flowering processes can be advantageous, as it allows the parasite to hijack its host's mobile flowering signals for its own use. It can thereby readily synchronize its physiology with that of its host", says co-author Ian Baldwin, director of the Department Molecular Ecology at the Max Planck Institute for Chemical Ecology. Because of the gene loss, dodder may be able to better adapt to the parasitic lifestyle and ultimately increase its fitness.

Credit: 
Max Planck Institute for Chemical Ecology

Being a selfish jerk doesn't get you ahead, research finds

The evidence is in: Nice guys and gals don’t finish last, and being a selfish jerk doesn’t get you ahead.

That’s the clear conclusion from research that tracked disagreeable people from college or graduate school to where they landed in their careers about 14 years later.

“I was surprised by the consistency of the findings. No matter the individual or the context, disagreeableness did not give people an advantage in the competition for power—even in more cutthroat, ‘dog-eat-dog’ organizational cultures,” said Berkeley Haas Prof. Cameron Anderson, who co-authored the study with Berkeley Psychology Prof. Oliver P. John, doctoral student Daron L. Sharps, and Assoc. Prof. Christopher J. Soto of Colby College.

The paper was published August 31 in the Proceedings of the National Academy of Sciences.

The researchers conducted two studies of people who had completed personality assessments as undergraduates or MBA students at three universities. They surveyed the same people more than a decade later, asking about their power and rank in their workplaces, as well as the culture of their organizations. They also asked their co-workers to rate the study participants’ rank and workplace behavior. Across the board, they found those with selfish, deceitful, and aggressive personality traits were not more likely to have attained power than those who were generous, trustworthy, and generally nice.

That’s not to say that jerks don’t reach positions of power. It’s just that they didn’t get ahead faster than others, and being a jerk simply didn’t help, Anderson said. That’s because any power boost they get from being intimidating is offset by their poor interpersonal relationships, the researchers found. In contrast, the researchers found that extroverts were the most likely to have advanced in their organizations, based on their sociability, energy, and assertiveness—backing up prior research.

“The bad news here is that organizations do place disagreeable individuals in charge just as often as agreeable people,” Anderson said. “In other words, they allow jerks to gain power at the same rate as anyone else, even though jerks in power can do serious damage to the organization.”

The age-old question of whether being aggressively Machiavellian helps people get ahead has long interested Anderson, who studies social status. It’s a critical question for managers, because ample research has shown that jerks in positions of power are abusive, prioritize their own self-interest, create corrupt cultures, and ultimately cause their organizations to fail. They also serve as toxic role models for society at large.

For example, people who read former-Apple CEO Steve Jobs’ biography might think, “Maybe if I become an even bigger asshole I’ll be successful like Steve,” the authors note in their paper. “My advice to managers would be to pay attention to agreeableness as an important qualification for positions of power and leadership,” Anderson said. “Prior research is clear: agreeable people in power produce better outcomes.”

While there’s clearly no shortage of jerks in power, there’s been little empirical research to settle the question of whether being disagreeable actually helped them get there, or is simply incidental to their success. Anderson and his co-authors set out to create a research design that would clear up the debate. (They pre-registered their analysis for both studies on aspredicted.org.)

What defines a jerk? The participants had all completed the Big Five Inventory (BFI), an assessment based on general consensus among psychologists of the five fundamental personality dimensions: openness to experience, conscientiousness, extraversion, neuroticism, and agreeableness. It was developed by Anderson’s co-author John, who directs the Berkeley Personality Lab. In addition, some of the participants also completed a second personality assessment, the NEO Personality Inventory-Revised (NEO PI-R).

“Disagreeableness is a relatively stable aspect of personality that involves the tendency to behave in quarrelsome, cold, callous, and selfish ways,” the researchers explained. “…Disagreeable people tend to be hostile and abusive to others, deceive and manipulate others for their own gain, and ignore others’ concerns or welfare.”

In the first study, which involved 457 participants, the researchers found no relationship between power and disagreeableness, no matter whether the person had scored high or low on those traits. That was true regardless of gender, race or ethnicity, industry, or the cultural norms in the organization.

The second study went deeper, looking at the four main ways people attain power: through dominant-aggressive behavior, or using fear and intimidation; political behavior, or building alliances with influential people; communal behavior, or helping others; and competent behavior, or being good at one’s job. They also asked the subjects’ co-workers to rate their place in the hierarchy, as well as their workplace behavior (interestingly, the co-workers’ ratings largely matched the subjects’ self-assessments).

This allowed the researchers to better understand why disagreeable people do not get ahead faster than others. Even though jerks tend to engage in dominant behavior, their lack of communal behavior cancels out any advantage their aggressiveness gives them, they concluded.
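
The offsetting-pathways logic can be made concrete with hypothetical numbers (the coefficients below are invented for illustration and are not the study's estimates):

```python
# Illustrative only: hypothetical path coefficients showing how two opposing
# pathways can cancel out. None of these numbers come from the study.
effect_on_dominant = 0.4    # disagreeableness -> more dominant-aggressive behavior (assumed)
effect_on_communal = -0.4   # disagreeableness -> less communal, generous behavior (assumed)
dominant_to_power = 0.3     # dominant-aggressive behavior -> power (assumed)
communal_to_power = 0.3     # communal behavior -> power (assumed)

net_effect = (effect_on_dominant * dominant_to_power
              + effect_on_communal * communal_to_power)
print("net effect of disagreeableness on power:", net_effect)  # 0.0 in this toy example
```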

Anderson noted that the findings don’t directly speak to whether disagreeableness helps or hurts people attain power in the realm of electoral politics, where the power dynamics are different than in organizations. But there are some likely parallels. “Having a strong set of alliances is generally important to power in all areas of life,” he said. “Disagreeable politicians might have more difficulty maintaining necessary alliances because of their toxic behavior.”

Credit: 
University of California - Berkeley Haas School of Business

Wearable device could help EMTs, surgeons assess hemorrhage blood loss

image: Researchers have shown that they can accurately assess blood loss by measuring seismic vibrations in the chest cavity and by detecting changes in the timing of heartbeats. That could lead to development of a smart wearable device that could be carried by ambulance crews and medics and made available in emergency rooms and surgical facilities.

Image: 
Georgia Tech

Emergency medical technicians (EMTs), military medics, and emergency room physicians could one day be better able to treat victims of vehicular accidents, gunshot wounds, and battlefield injuries thanks to a new device under development that may more accurately assess the effects of blood loss due to hemorrhage.

A research team has now shown that it can accurately assess blood loss by measuring seismic vibrations in the chest cavity and by detecting changes in the timing of heartbeats. The knowledge, developed in the laboratory, could potentially lead to development of a smart wearable device that could be carried by ambulance crews and medics and made available in emergency rooms and surgical facilities.

"We envision a wearable device that could be placed on a person's chest to measure the signs that we found are indicative of worsening cardiovascular system performance in response to bleeding," said Omer Inan, associate professor in the School of Electrical and Computer Engineering at the Georgia Institute of Technology. "Based on information from the device, different interventions such as fluid resuscitation could be performed to help a victim of trauma."

The research, supported by the Office of Naval Research, was reported July 22 in the journal Science Advances. It included collaborators from the Translational Training and Testing Laboratories in Atlanta, an affiliate of Georgia Tech, and the University of Maryland.

Blood loss can result from many different kinds of trauma, but the hemorrhage can sometimes be hidden from first responders and doctors. Heart rates are normally elevated in people suffering from trauma, and blood pressure -- now the most commonly used measure of hemorrhage -- can remain stable until the blood loss reaches a life-threatening stage.

"It's very difficult because the vital signs you can measure easily are the ones that the body tries very hard to regulate," Inan said. "Yet you have to make decisions about how much fluid to give an injured person, how to treat them -- and when there are multiple people injured -- how to triage those with the most critical needs. We don't have a good medical indicator that we can measure noninvasively at an injury or battlefield scene to help make these decisions."

Using animal models, Inan and graduate students Jonathan Zia and Jacob Kimball carefully studied seismic vibrations from the chest cavity and electrical signals from the heart as blood volume was gradually reduced. The researchers wanted to evaluate externally measurable indicators of cardiovascular system performance and compare them to information provided by catheters making direct measurements of blood volume and pressure.

The key indicator turned out to be a seismocardiogram, a measure of the micro-vibrations produced by heart contractions and the ejection of blood from the heart into the body's vascular system. But the researchers also saw changes in the timing of the heart's activity as blood volume decreased, providing another measure of a weakening cardiovascular system.

"The most important lower-level feature we found to be important in blood volume status estimation were cardiac timing intervals: how long the heart spends in different phases of its operation," Inan said. "In the case of blood volume depletion, the interval is an important indicator that you could obtain using signals from a wearable device."

In such a device, these noninvasive mechanical and electrical measures could be combined to show just how critical a patient's blood loss was. Machine learning algorithms would use the measurements to generate a simple numerical score in which larger numbers indicate a more serious condition.
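
As an illustration of that idea (not the team's actual algorithm; the features, training data and model choice below are assumptions), such a score could come from a regression model trained on timing-interval features:

```python
# Illustrative sketch: map noninvasive features (e.g., cardiac timing intervals
# derived from a seismocardiogram) to a 0-100 blood-volume-status score.
# The feature set, synthetic data and model choice are assumptions, not the
# published method.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Hypothetical training set: [pre-ejection period (ms), ejection time (ms), heart rate (bpm)]
X = rng.normal(loc=[90, 300, 80], scale=[10, 25, 8], size=(200, 3))
# Hypothetical target: 0 = stable, 100 = near cardiovascular collapse
y = np.clip(0.8 * (X[:, 0] - 90) - 0.3 * (X[:, 1] - 300) + 0.5 * (X[:, 2] - 80) + 50, 0, 100)

model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)
print("score for a new patient:", round(model.predict([[105, 270, 95]])[0]))
```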

"We would give an indicator that is representative of the overall status of the cardiovascular system and how close it is to collapse," Inan said. "If one patient is rated 50 and another is 90, first responders could give priority to the patient with the higher number."

Beyond emergency situations, the new assessment technique could be helpful with many types of surgery in which quickly identifying unseen blood loss could improve the outcome for patients.

In future work, Inan and his collaborators expect to create a prototype device that could take the form of a patch just 10 millimeters square. Additional electrical engineering will be needed to filter out the kinds of background noise likely to be found in real-world trauma situations, and for successful operation when the patient is being transported.

"Long-term, we want to partner with clinicians to do studies in humans where we would use the wearable patch and be able to take measurements when people were coming into the trauma bay, or even while EMTs were still deployed," Inan said. "This could become a new way of monitoring hemorrhage that could be used outside of clinical settings."

The researchers also want to study the opposite problem -- how to determine when enough fluid has been provided to an injured patient. Too much fluid can cause edema, similar to the conditions of heart failure patients whose lungs fill with liquid.

This material is based on work supported by the Office of Naval Research (ONR) under grant N000141812579. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the ONR.

Credit: 
Georgia Institute of Technology

Research shows how a diet change might help US veterans with Gulf War illness

A new study from American University shows the results from a dietary intervention in U.S. veterans suffering from Gulf War Illness, a neurological disorder in veterans who served in the Persian Gulf War from 1990 to 1991.

The veterans’ overall number of symptoms was reduced, and they experienced less pain and fatigue, after one month on a diet low in glutamate, a flavor enhancer commonly added to foods that also functions as an important neurotransmitter in the nervous system.

Because the symptoms of GWI are similar to those of fibromyalgia, the U.S. Department of Defense funds testing of treatments previously studied in fibromyalgia that could also help veterans suffering from GWI. The low glutamate diet was previously shown to reduce symptoms in fibromyalgia and thus was a candidate for this funding. There are no cures for either illness, and treatments are being sought for both to manage chronic pain. GWI is thought to be connected to nervous system dysfunction in veterans. In the Gulf War, soldiers were exposed to various neurotoxins such as chemical warfare agents, pyridostigmine bromide (PB) pills, pesticides, burning oil fields, and depleted uranium.

“Gulf War Illness is a debilitating disorder which includes widespread pain, fatigue, headaches, cognitive dysfunction, and gastrointestinal symptoms. Veterans with GWI have a reduced quality of life as compared to veterans who do not have the illness,” said AU Associate Professor of Health Studies Kathleen Holton, who explores how food additives contribute to neurological symptoms and is a member of AU’s Center for Behavioral Neuroscience. “In this study testing the low glutamate diet, the majority of veterans reported feeling better. We saw significant reductions in their overall number of symptoms and significant improvements in pain and fatigue.”

The study, published in the journal Nutrients, details the experiments in a clinical trial of 40 veterans with GWI. The study participants were randomized to either immediately start the low glutamate diet for one month, or to a control group. After completion of the one-month diet, participants were challenged with monosodium glutamate and placebo to see if symptoms returned.

The challenge with MSG versus placebo resulted in significant variability in response among participants, with some subjects worsening, while others actually improved. This suggests that while a diet low in glutamate can effectively reduce overall symptoms, pain, and fatigue in GWI, more research is needed to understand how the diet may be altering how glutamate is handled in the body, and the specific role that nutrients may play in these improvements.

The role of glutamate

Glutamate is most easily identified when it is in the form of the food additive MSG; however, it appears most commonly in American diets hidden under many other food additive names in processed foods. Americans also consume glutamate through some foods where it occurs naturally, such as soy sauce, fish sauce, aged cheeses like parmesan, seaweed, and mushrooms.

Glutamate is known to play a role in pain transmission, where it functions as an excitatory neurotransmitter in the nervous system. When there’s too much of it, it can cause disrupted signaling or kill cells, in a process called excitotoxicity. Previous research has shown that glutamate is high in pain processing areas of the brain in individuals with fibromyalgia and migraine. High concentrations of glutamate have also been linked to epilepsy, multiple sclerosis, Parkinson’s disease, ALS, cognitive dysfunction (including Alzheimer’s), and psychiatric issues such as depression, anxiety and PTSD.

In her research, Holton limits people's exposure to glutamate, while also increasing intake of nutrients known to protect against excitotoxicity. She analyzes how diet affects cognitive function, brain wave activity, brain glutamate levels, and brain function using MRI. In the study of veterans, the low glutamate diet was made up of whole foods low in additives and high in nutrients. Holton theorizes that the increased consumption of nutrients that are protective against excitotoxicity may have led to improved handling of glutamate in the nervous system. The study and diet tested in the veterans were similar to her previous studies, where she observed improvements in those with fibromyalgia, as well as in Kenyan villagers living with chronic pain.

It will take more research to determine if reducing exposure to glutamate can be used as a treatment for chronic widespread pain and other neurological symptoms in U.S. veterans with GWI. Holton is currently pursuing funding for her next grant, which will recruit 120 veterans for a Phase 3 clinical trial to confirm the study’s findings in a larger group, and further explore the mechanisms for these effects.

Credit: 
American University

Nanomaterial-based strategies for treatment of hypoxic tumors

image: Schematic illustration of strategies for treatment of hypoxic tumor with nanomaterials.

Image: 
©Science China Press

Hypoxia is a typical characteristic of most tumors because tumor tissue consumes oxygen faster than it can be supplied through the malformed and abnormal tumor vasculature. Hypoxia in tumor tissue increases the likelihood of metastasis and endows hypoxia-tolerant tumor cells with resistance to several tumor therapies, including chemotherapy, radiotherapy, photodynamic therapy, and immunotherapy.

Nanomaterials have developed rapidly, opening up new areas of biomedical application. Drug-loaded nanomaterials accumulate in tumor tissue more readily than small molecules, and nanomaterials are easy to modify and equip with additional functions, which is favorable for tumor therapy.

One strategy is to directly elevate the oxygen level in tumor tissue. Oxygen-carrying, oxygen-generating, and oxygen-economizing nanomaterials are used to relieve the hypoxic tumor environment. As the oxygen level in the tumor rises, hypoxia-tolerant tumor cells become less resistant and oxygen-dependent therapies become more effective.

Another strategy is to diminish oxygen dependence, since therapies that do not rely on oxygen are powerful weapons against hypoxic tumors. Free radicals are strongly oxidizing species that can induce cell death, so radical-generating nanomaterials can treat tumors without depending on oxygen. In addition, some gaseous molecules play an essential role in physiological modulation, and therapeutic gas-generating nanomaterials can control the delivery of such gases for hypoxic tumor therapy.

In a new overview published in the Beijing-based journal National Science Review, scientists at Wuhan University and South-Central University for Nationalities, China, present the latest advances in nanomaterials for the treatment of hypoxic tumors. Xian-Zheng Zhang et al. review two strategies: 1) elevating the oxygen level in the tumor with nanomaterials (oxygen-carrying, oxygen-generating, and oxygen-economizing nanomaterials) to enhance oxygen-dependent tumor therapies, and 2) diminishing oxygen dependence with nanomaterials (therapeutic gas-generating and radical-generating nanomaterials) for hypoxic tumor therapy.

They also outlined potential directions for the future development of nanomaterials to treat hypoxic tumors. "Oxygen-carrying or oxygen-generating nanomaterials can hardly alleviate hypoxia continuously for tumor therapy," Zhang said. "It is necessary to develop nanomaterials with controllable and sustained release of oxygen at the tumor site, which may be favorable for synergistic therapy."

Credit: 
Science China Press

Serengeti leopard population densities healthy but vary seasonally, study finds

image: Leopards in the Serengeti try to avoid interacting with lions and other top predators during the dry season, when prey is less abundant.

Image: 
Photo by Michael Jeffords and Sue Post.

CHAMPAIGN, Ill. -- A study of camera-trap data from Serengeti National Park in Tanzania found that leopard population densities in the 3.7-million-acre park are similar to those in other protected areas but vary between wet and dry seasons. The fluctuations appear to be driven by the abundance of prey and how this affects interactions with other large carnivores like lions, researchers report.

Despite the long history of wildlife research in the Serengeti, this is the first peer-reviewed study of leopard densities in the park, said Max Allen, a carnivore ecologist with the Illinois Natural History Survey at the University of Illinois at Urbana-Champaign, who led the research. Allen and his team analyzed data from Snapshot Serengeti, a large collaborative effort that uses hundreds of camera traps to collect data on large cats and other wildlife in the Serengeti. The team published the new findings in the journal Biodiversity and Conservation.

"In the wet season, when potential prey species like Thomson's gazelle and impala are available in abundance, leopards appear at higher densities," Allen said. "In the dry season, leopards seem to work harder to avoid other large carnivores that compete with them for less abundant food."

The team used Bayesian statistical methods to estimate leopard densities at each camera-trap site and across the study area overall.
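The release does not detail the underlying model, but as a rough, hypothetical illustration of how a Bayesian density estimate works, the sketch below fits a simple conjugate Gamma-Poisson model to invented camera-trap detection counts. The counts, the number of sites and the assumption that each site samples an effective area of 100 square kilometers are all made up for illustration; the study's actual analysis is more sophisticated.

import numpy as np

# Hypothetical illustration only: a conjugate Gamma-Poisson model for animal
# density. The counts, number of sites and effective sampling areas below are
# invented; the study's real analysis is considerably more sophisticated.
rng = np.random.default_rng(0)

counts = np.array([4, 6, 5, 7, 3, 6, 5, 4, 8, 5])   # assumed detections per camera-trap site
effort = np.ones_like(counts, dtype=float)           # assumed effective area per site, in units of 100 km^2

# Weakly informative Gamma(shape, rate) prior on density (leopards per 100 km^2).
shape_prior, rate_prior = 1.0, 0.2

# Conjugate update: the posterior is Gamma(shape + total detections, rate + total effort).
shape_post = shape_prior + counts.sum()
rate_post = rate_prior + effort.sum()

# Summarize the posterior with its mean and a 95% credible interval.
samples = rng.gamma(shape_post, 1.0 / rate_post, size=100_000)
lo, hi = np.percentile(samples, [2.5, 97.5])
print(f"posterior mean density: {shape_post / rate_post:.2f} leopards per 100 km^2")
print(f"95% credible interval: ({lo:.2f}, {hi:.2f}) per 100 km^2")

With a weak prior, the posterior mean is driven almost entirely by the total detections divided by the total sampling effort, which is one reason larger camera-trap networks yield tighter density estimates.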

"We found 5.72 and 5.41 leopards per 100-square-kilometers in the wet and dry seasons, respectively," Allen said. "These densities suggest the leopard populations are at moderately healthy levels in the Serengeti. This reflects the importance of large conservation areas for large carnivores, as leopard populations are generally declining across their range."

The results also highlight the importance of citizen-science projects for the conservation of wild species, Allen said. Snapshot Serengeti is one of the most effective citizen-science projects in the world, he said.

"Large carnivores at the top of the food chain play important roles in ecosystem regulation, and disease and population control," Allen said. "The human-induced changes to habitat availability and quality are accelerating the decline of large carnivores, which are already vulnerable because they have naturally low population densities at birth."

Understanding how carnivore populations are faring and what factors contribute to their success is essential to conserving them and the other wildlife in their ecosystem, Allen said. Capturing data about their habits through unobtrusive camera traps can lead to better management of the wild areas on which they depend.

Credit: 
University of Illinois at Urbana-Champaign, News Bureau

Team's flexible micro LEDs may reshape future of wearable technology

image: The flexible micro LEDs can be twisted (left) or folded (right). The LEDs, which can be peeled off and stuck to almost any surface, could help pave the way for the next generation of wearable technology.

Image: 
The University of Texas at Dallas

University of Texas at Dallas researchers and their international colleagues have developed a method to create micro LEDs that can be folded, twisted, cut and stuck to different surfaces.

The research, published online in June in the journal Science Advances, helps pave the way for the next generation of flexible, wearable technology.

Used in products ranging from brake lights to billboards, LEDs are ideal components for backlighting and displays in electronic devices because they are lightweight, thin, energy efficient and visible in different types of lighting. Micro LEDs, which can be as small as 2 micrometers across, provide higher resolution than other LEDs. Their size makes them a good fit for small devices such as smart watches, but they can also be bundled to work in flat-screen TVs and other larger displays. LEDs of all sizes, however, are brittle and typically can only be used on flat surfaces.

The researchers' new micro LEDs aim to fill a demand for bendable, wearable electronics.

"The biggest benefit of this research is that we have created a detachable LED that can be attached to almost anything," said Dr. Moon Kim, Louis Beecherl Jr. Distinguished Professor of materials science and engineering at UT Dallas and a corresponding author of the study. "You can transfer it onto your clothing or even rubber -- that was the main idea. It can survive even if you wrinkle it. If you cut it, you can use half of the LED."

Researchers in the Erik Jonsson School of Engineering and Computer Science and the School of Natural Sciences and Mathematics helped develop the flexible LED through a technique called remote epitaxy, which involves growing a thin layer of LED crystals on the surface of a sapphire crystal wafer, or substrate.

Typically, the LED would remain on the wafer. To make it detachable, researchers added a nonstick layer to the substrate, which acts similarly to the way parchment paper protects a baking sheet and allows for the easy removal of cookies, for instance. The added layer, made of a one-atom-thick sheet of carbon called graphene, prevents the new layer of LED crystals from sticking to the wafer.

"The graphene does not form chemical bonds with the LED material, so it adds a layer that allows us to peel the LEDs from the wafer and stick them to any surface," said Kim, who oversaw the physical analysis of the LEDs using an atomic resolution scanning/transmission electron microscope at UT Dallas' Nano Characterization Facility.

Colleagues in South Korea carried out laboratory tests of LEDs by adhering them to curved surfaces, as well as to materials that were subsequently twisted, bent and crumpled. In another demonstration, they adhered an LED to the legs of a Lego minifigure with different leg positions.

Bending and cutting do not affect the quality or electronic properties of the LED, Kim said.

The bendy LEDs have a variety of possible uses, including flexible lighting, clothing and wearable biomedical devices. From a manufacturing perspective, the fabrication technique offers another advantage: Because the LED can be removed without breaking the underlying wafer substrate, the wafer can be used repeatedly.

"You can use one substrate many times, and it will have the same functionality," Kim said.

In ongoing studies, the researchers also are applying the fabrication technique to other types of materials.

"It's very exciting; this method is not limited to one type of material," Kim said. "It's open to all kinds of materials."

Credit: 
University of Texas at Dallas