Tech

Researchers report high performance solid-state sodium-ion battery

image: Forming compatible interfaces between cathode active materials and solid electrolytes is important for high-performance all-solid-state batteries. The organic cathode demonstrated here is (electro)chemically and mechanically compatible with a sulfide electrolyte. Its moderate redox potential enables the reversible formation of a resistive active material-electrolyte interface.

Image: 
University of Houston

Solid-state sodium-ion batteries are far safer than conventional lithium-ion batteries, which pose a risk of fire and explosion, but their performance has been too weak to make them a practical alternative. Researchers Friday reported developing an organic cathode that dramatically improves both stability and energy density.

The improved performance, reported in the journal Joule, is related to two key findings:

The resistive interface between the electrolyte and cathode that commonly forms during cycling can be reversed, extending cycle life, and

The flexibility of the organic cathode allowed it to maintain intimate contact at the interface with the solid electrolyte, even as the cathode expanded and contracted during cycling.

Yan Yao, associate professor of electrical and computer engineering at the University of Houston and corresponding author of the paper, said the organic cathode - known as PTO, for pyrene-4,5,9,10-tetraone - offers unique advantages over previous inorganic cathodes. But he said the underlying principles are equally significant.

"We found for the first time that the resistive interface that forms between the cathode and the electrolyte can be reversed," Yao said. "That can contribute to stability and longer cycle life." Yao also is a principal investigator at the Texas Center for Superconductivity at UH. His research group focuses on green and sustainable organic materials for energy generation and storage.

Yanliang "Leonard" Liang, a research assistant professor in the UH Department of Electrical and Computer Engineering, said that reversibility of the interface is the key, allowing the solid-state battery to reach a higher energy density without sacrificing cycle life. Normally, a solid-state battery's ability to store energy is halted when the resistive cathode?electrolyte interface forms; reversing that resistance allows energy density to remain high during cycling, he said.

Lithium-ion batteries with their liquid electrolytes are able to store relatively high amounts of energy and are commonly used to power the tools of modern life, from cell phones to hearing aids. But the risk of fire and explosion has heightened interest in other types of batteries, and a solid-state sodium-ion battery offers the promise of increased safety at a lower cost.

Xiaowei Chi, a post-doctoral researcher in Yao's group, said a key challenge had been to find a solid electrolyte that is as conductive as the liquid electrolytes used in lithium-ion batteries. Now that sufficiently conductive solid electrolytes are available, a remaining challenge has been the solid interfaces.

One issue raised by a solid electrolyte: the electrolyte struggles to maintain intimate contact with a traditional rigid cathode as the latter expands and contracts during battery cycling. Fang Hao, a PhD student working in Yao's group, said the organic cathode is more pliable and thus able to remain in contact with the interface, improving cycling life. The researchers said the contact remained steady through at least 200 cycles.

"If you have reliable contact between the electrode and electrolyte, you will have a great chance of creating a high-performance solid-state battery," Hao said.

Credit: 
University of Houston

Study examines privacy policies, data sharing of popular apps for depression, smoking cessation

Bottom Line: This study looked at the privacy practices of popular apps for depression and smoking cessation. Researchers assessed the content of privacy policies and compared disclosures regarding data sharing with commercial third parties to actual behavior for 36 apps.

Authors: Kit Huckvale, M.B.Ch.B., M.Sc., Ph.D., UNSW (University of New South Wales) Sydney, Australia, and coauthors

(doi: 10.1001/jamanetworkopen.2019.2542)

Editor's Note: The article contains conflict of interest and funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.

Credit: 
JAMA Network

Airbnb's explosive growth jolts hotel industry's bottom line

image: Tarik Dogru, assistant professor at Florida State University's Dedman School of Hospitality, has a wide range of research interests including the sharing economy, hotel investment, behavioral finance and tourism economics.

Image: 
Tarik Dogru

TALLAHASSEE, Fla. -- Hospitality service Airbnb is fast becoming the 800-pound gorilla that's shaking up the hotel industry and forever changing it.

New research from Florida State University finds Airbnb's exponential growth worldwide is devouring an increasing share of hotel revenues and also driving down room prices and occupancy rates.

"This is the first study confirming the negative impact on those three key metrics in the hotel industry," said Tarik Dogru, an assistant professor at FSU's Dedman School of Hospitality. "Increased competition from Airbnb is mainly affecting hotel prices and revenues, but occupancy rates are also down slightly."

Over the past 10 years, Airbnb has grown into the world's largest online marketplace for lodgings and now qualifies as a disruptive innovation, Dogru said. His research, published in the journal Tourism Management, examined the effects of Airbnb in 10 major U.S. cities between 2008 and 2017.

The company's listings have grown more than 100 percent a year -- Airbnb now boasts six million listings in 81,000 cities worldwide -- and that dramatically larger supply of accommodations has significantly affected hotels' room prices and revenues.

Dogru found a 1 percent increase in Airbnb's supply lowered hotel revenues by between 0.02 and 0.04 percent, which he estimated in New York City alone would have amounted to a loss ranging between $91 million and $365 million in 2016. The lost revenues came mainly from economy and luxury hotel listings.
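A back-of-the-envelope sketch of how such an elasticity estimate translates into dollar terms. The elasticity range is the one quoted above; the baseline revenue and supply-growth figures below are hypothetical placeholders, so the output will not reproduce the study's exact numbers:

```python
# Back-of-the-envelope sketch: translating the quoted elasticity range into a
# dollar loss. The baseline revenue and supply growth below are HYPOTHETICAL
# placeholders, not figures from Dogru's study.

def revenue_loss(baseline_revenue, supply_growth_pct, elasticity):
    """elasticity = percent of hotel revenue lost per 1 percent growth in Airbnb supply."""
    percent_loss = supply_growth_pct * elasticity
    return baseline_revenue * percent_loss / 100.0

baseline = 10e9          # hypothetical annual hotel revenue for a large market, USD
supply_growth = 100.0    # hypothetical one-year growth in Airbnb listings, percent

low = revenue_loss(baseline, supply_growth, 0.02)
high = revenue_loss(baseline, supply_growth, 0.04)
print(f"estimated loss: ${low/1e6:.0f}M to ${high/1e6:.0f}M")
```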

That's a new finding for the luxury segment compared to previous research. Dogru attributes it to Airbnb's push to offer more luxury experiences. The company recently launched Beyond by Airbnb, a travel-planning service featuring high-end designer homes and other upscale properties, such as castles, treehouses, boats and yurts.

Dogru also found Airbnb negatively affected midscale hotels -- another new finding. Past research concluded room prices were so similar between midscale hotels and most Airbnb listings that consumers were not motivated to switch to Airbnb.

Not anymore. The study found increasing demand for authenticity in lodging, and travelers felt Airbnb properties were more authentic than franchised hotels.

"Consumers don't want to stay in a new city but in the same old hotel," he said. "More people want an authentic experience, and Airbnbs offer that -- the chance to engage with local people and have localized experiences."

Airbnb's business model has flourished for many reasons. Customers like having access to an enormous supply of properties and rooms at a wide variety of prices, often more competitive than hotels, and Airbnb collects commissions on every booking.

In addition, Dogru said the company does not follow conventional rules.

"Airbnb does not ensure the security of guests, it's not taxed in some jurisdictions, and it has flexibility to add new supply because of a lack of regulation," Dogru said. "Those are some reasons why Airbnb has a significant competitive advantage against the hotel industry because adding a new hotel to the market can take several years."

The lack of regulation in many Airbnb locations translates into less money for cities and local governments, and that reality raises questions for lawmakers and other policymakers because they cannot simply ban Airbnb.

"Decisions on how to regulate sharing-economy platforms will not be straightforward," Dogru said. "The application of excessive legislation and regulation driven by the interests of incumbent industries has the potential to stifle innovation that ultimately benefits the consumer. Lawmakers are clearly still grappling with the nuances of this emerging phenomenon."

Dogru believes local communities need Airbnb because it creates indirect economic benefits for restaurants, entertainment and leisure activities.

"You need Airbnb, especially when there is excess demand for rooms, but there must be regulation without killing innovation," he said. "Airbnb is still in its infancy, and more hosts are being added to its supply every day in many markets. Airbnb is indeed a disruptor, and it is here to stay."

Credit: 
Florida State University

Adding human touch to unchatty chatbots may lead to bigger letdown

Sorry, Siri, but just giving a chatbot a human name or adding humanlike features to its avatar might not be enough to win over a user if the device fails to maintain a conversational back-and-forth with that person, according to researchers. In fact, those humanlike features might create a backlash against less responsive humanlike chatbots.

In a study, the researchers found that chatbots with human features -- such as a human avatar -- but low interactivity disappointed the people who used them. However, people responded better to a less interactive chatbot that did not have humanlike cues, said S. Shyam Sundar, James P. Jimirro Professor of Media Effects, co-director of the Media Effects Research Laboratory and affiliate of Penn State's Institute for CyberScience (ICS).

High interactivity is marked by swift responses that match a user's queries and feature a threaded exchange that can be followed easily, according to Sundar.

"People are pleasantly surprised when a chatbot with low anthropomorphism -- fewer human cues -- has higher interactivity," said Sundar. "But when there are high anthropomorphic visual cues, it may set up your expectations for high interactivity -- and when the chatbot doesn't deliver that -- it may leave you disappointed."

On the other hand, improving interactivity may be more than enough to compensate for a less-humanlike chatbot. Even small changes in the dialogue, like acknowledging what the user said before providing a response, can make the chatbot seem more interactive, said Sundar.

"In the case of the low-humanlike chatbot, if you give the user high interactivity, it's much more appreciated because it provides a sense of dialogue and social presence," said lead author of the study, Eun Go, a former doctoral student at Penn State and currently assistant professor in broadcasting and journalism, Western Illinois University.

Because there is an expectation that people may be leery of interacting with a machine, developers typically add human names to their chatbots -- for example, Apple's Siri -- or program a human-like avatar to appear when the chatbot responds to a user.

The researchers, who published their findings in Computers in Human Behavior, currently online, also found that just mentioning whether a human or a machine is involved -- or, providing an identity cue -- guides how people perceive the interaction.

"Identity cues build expectations," said Eun Go. "When we say that it's going to be a human or chatbot, people immediately start expecting certain things."

Sundar said the findings could help developers improve acceptance of chat technology among users. He added that virtual assistants and chat agents are increasingly used in the home and by businesses because they are convenient for people.

"There's a big push in the industry for chatbots," said Sundar. "They're low-cost and easy-to-use, which makes the technology attractive to companies for use in customer service, online tutoring and even cognitive therapy -- but we also know that chatbots have limitations. For example, their conversation styles are often stilted and impersonal."

Sundar added the study also reinforces the importance of high interactivity, broadly speaking.

"We see this again and again that, in general, high interactivity can compensate for the impersonal nature of low anthropomorphic visual cues," said Sundar. "The bottom line is that people who design these things have to be very strategic about managing user expectations."

The researchers recruited 141 participants through Amazon Mechanical Turk, a crowdsourced site that allows people to get paid to participate in studies. The participants signed up for a specific time slot and reviewed a scenario. They were told that they were shopping for a digital camera as a birthday present for a friend. Then, the participants navigated to an online camera store and were asked to interact with the live chat feature.

The researchers designed eight different conditions by manipulating three factors to test users' reactions to the chatbot. The first factor was the identity of the chatbot: when the participant engaged in the live chat, a message appeared indicating that the user was interacting either with a chatbot or with a person. The second factor was the visual representation of the chatbot: in one condition, the chatbot included a humanlike avatar and in another, it simply had a speech bubble. Last, the chatbots featured either high or low interactivity when responding to participants, with the only difference being that a portion of the user's response was repeated in the high condition. In all cases, a human was interacting with the participant.
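A minimal sketch of the 2 x 2 x 2 between-subjects design described above; the factor labels mirror the article's description, while the variable names themselves are illustrative:

```python
# Minimal sketch of the 2 x 2 x 2 between-subjects design described in the article.
# Factor labels follow the article; variable names are illustrative.
from itertools import product

identity_cue  = ["chatbot", "human"]                  # what participants were told
visual_cue    = ["humanlike avatar", "speech bubble"]
interactivity = ["high", "low"]                       # high = part of the user's message echoed back

conditions = list(product(identity_cue, visual_cue, interactivity))
for i, condition in enumerate(conditions, start=1):
    print(i, condition)

assert len(conditions) == 8   # eight experimental conditions in total
```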

While this study was carried out online, the researchers said that observing how people interact with chatbots in a laboratory may be one possible step to further this research.

Credit: 
Penn State

Alerting patients to their risk of gum disease improves inflammation and dental hygiene

In a new study published today in the Journal of Periodontology, researchers found that using psychological techniques to communicate the risk of developing periodontal disease improved patients' dental hygiene over a three-month period. The intervention was also associated with reduced scores for gum inflammation.

Periodontal diseases are infections that cause inflammation of the structures around the teeth, including the gums, periodontal ligament and alveolar bone. In the earliest stage of periodontal disease -- gingivitis -- the inflammation is limited to the surface of the gums. In more severe forms of the disease - periodontitis - bone is destroyed around the teeth.

The team of scientists from King's College London's Faculty of Dentistry, Oral & Craniofacial Sciences tested a group of 97 adults with moderate periodontal disease who were registered patients at a London General Dental Practice.

They received either treatment as usual, an individualised report on their periodontal disease risk (PreViser™), or an individualised report plus a programme of goal-setting, planning and self-monitoring based on psychological theory.

The study found that over 12 weeks:

Dental plaque reduced significantly in the two groups with whom risk was communicated, but not in the "treatment as usual" group.

The percentage of areas that bled on examination (gum inflammation) reduced in all groups, but the effect was more pronounced in the groups that received the psychological intervention.

Frequency of interdental cleaning improved only in the intervention groups.

Lead author Dr. Koula Asimakopoulou, Reader in Health Psychology at King's College London said: "Our study shows that by adopting a simple psychological intervention, aided by the use of an online risk assessment tool, we can significantly improve measurable clinical outcomes and reduce initial signs of gum disease in patients seen routinely in General Dental Practice."

Dr Matthew Nolan, the dental practitioner who delivered the intervention noted: "Shaping how health information is presented to our patients appears to influence their subsequent behaviour. Patients are naturally concerned about their risk of periodontal disease; we have found that coupling their concern with a structured discussion of coping strategies and simple behaviour change techniques, may be a useful driving force in improving health outcomes within a routine dental consultation."

Dr Mark Ide, President of the British Society for Periodontology said: "This paper is interesting as it builds on research previously carried out at King's to show how useful a patient-focussed health care intervention can be in the real-life primary care setting."

"At a time when the best way to improve the periodontal health of the majority of people is being considered, this paper demonstrates how interdisciplinary teams of psychologists and dentists working together can deliver improvements in patients' oral health and periodontal status. Good daily oral care is a core element of achieving and maintaining good oral health, and this may have an impact on other aspects of health as well."

Credit: 
King's College London

National effort urged to overhaul 'broken' health data system

COLUMBUS, Ohio - Our system for protecting health data in the United States is fundamentally broken and we need a national effort to rethink how we safeguard this information, say three experts in data privacy.

In a perspective article in the April 18, 2019, issue of the New England Journal of Medicine, the experts call for an effort similar to what led to the Belmont Report in 1979, which laid the foundation for bioethics standards in the United States to protect human participants in research.

"Data scandals are occurring on a regular basis, with no end in sight," said Efthimios Parasidis, a co-author of the NEJM article and a professor at the Ohio State University's Moritz College of Law and College of Public Health.

"Data privacy laws for health information don't go far enough to protect individuals. We must rethink the ethical principles underlying collection and use of health data to help frame amendments to the law."

Parasidis wrote the article with Elizabeth Pike, Director of Privacy Policy in the Office of the Chief Information Officer at the U.S. Department of Health and Human Services; and Deven McGraw, chief regulatory officer at Citizen, a company that helps people collect, organize and share their medical records digitally. Previously, McGraw was Deputy Director for Health Information Privacy at the Office for Civil Rights in the U.S. Department of Health and Human Services, and Acting Chief Privacy Officer at the Office of the National Coordinator for Health Information Technology.

Parasidis said a process analogous to the Belmont Report would be a good blueprint to follow today.

The National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research produced the 1979 report, which resulted in Congress passing laws to protect people who participated in medical research.

"Indignities in human subjects research compelled the government to create a commission to propose ethical guidance for new laws. We are experiencing a rerun of what was happening then, with the scandals involving use of health data now rather than the use of human subjects," Parasidis said. "We need an equivalent response."

Currently, the Health Insurance Portability and Accountability Act (HIPAA) is the main law protecting the data of patients. But it doesn't apply to many of the new companies and products that regularly store and handle customer health information, including social-media platforms, health and wellness apps, smartphones, credit card companies and other devices and companies.

"All of this data held by digital health companies raises a lot of ethical concerns about how it is being used," Parasidis said.

For example, some life insurers are offering contracts that have policyholders wear products that continuously monitor their health, and the information can be used to increase a customer's premiums.

Most regulations require only that consumers be notified about how their information is used and give their consent.

"That system doesn't work. Very few people read the notice and most people just click agree without knowing what they're agreeing to," he said.

So how can health data privacy be fixed?

One idea would be to establish data ethics review boards, which would review projects in which health data are collected, analyzed, shared or sold, according to the authors of the NEJM article.

Parasidis said such boards could function as safeguards required in both public and private settings, from university medical centers to private life insurance companies.

These boards could consider the benefits and risks of the proposed data use and consider policies governing data access, privacy and security. Members could include project developers, data analysts and ethicists, as well as people whose data would be collected.

"Right now, everything is about compliance. Companies and institutions check the boxes, fill out the forms and don't really think about whether they're doing the right thing," Parasidis said.

"Deliberations about use of health data should take the ethical obligations to individuals and society into account. The law should mandate that this occurs."

Credit: 
Ohio State University

Why researchers are mapping the world's manure

image: Global distribution of manure-rich cultivated areas. Green shades represent manure-rich areas with the most potential for recycling phosphorus. Manure-rich cultivated grid cells were most abundant in India, China, Southeast Asia, Europe and Brazil. Smaller patches are seen in central and east Africa, central United States and Central America.

Image: 
Stevens Institute of Technology

(Hoboken, N.J. – April 16, 2019) – Farmers rely on phosphorus fertilizers to enrich the soil and ensure bountiful harvests, but the world’s recoverable reserves of phosphate rocks, from which such fertilizers are produced, are finite and unevenly distributed. Stevens Institute of Technology is spearheading an international effort to map the global flow of phosphorus -- from soil to crops, and from there to livestock and humans, and eventually into sewers and landfills – and jump-start efforts to recapture and recycle the vital nutrient.

In the April 2019 issue of Earth's Future, David Vaccari, director of the Stevens Institute's department of civil, environmental and ocean engineering, and his team map that process globally for the first time, and identify regional "hot spots" where there's both significant demand for fertilizers, and significant potential for recapturing phosphorus from animal and human waste.

The team not only shows that there are significant untapped opportunities for recycling phosphorus; in a field where a lack of well-integrated data has often impeded both local and regional planning, the work is also a breakthrough in bridging the global picture to the local, actionable level.

“If we want to get serious about phosphorus recycling, these are the places where we’re going to get the most bang for our buck,” said Vaccari.

This year, the world's farmers will use over 45 million metric tons of phosphorus fertilizers, much of which will be absorbed by crops, then eaten and excreted as waste by animals and people. Vaccari's team, including researchers from China, Australia, Canada, Sweden and the Netherlands, combined recently developed datasets to map global crop production alongside human and livestock population levels. They then divided the planet into a grid of 10-kilometer-wide blocks, combining detailed local insight with an unprecedented overview of global phosphorus flows.
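A minimal sketch, under illustrative assumptions, of the kind of grid-cell "hot spot" screening described above; the arrays, thresholds, and values below are placeholders rather than the study's actual datasets:

```python
# Illustrative sketch of flagging manure-rich cultivated grid cells, in the spirit
# of the mapping described above. All arrays, values, and thresholds here are
# HYPOTHETICAL stand-ins for the study's gridded datasets.
import numpy as np

rng = np.random.default_rng(0)
shape = (180, 360)                                   # coarse illustrative grid (the study used ~10 km cells)

cropland_frac    = rng.random(shape)                 # fraction of each cell under crops
manure_p_kg      = rng.gamma(2.0, 50.0, shape)       # manure phosphorus per cell, kg
human_waste_p_kg = rng.gamma(2.0, 10.0, shape)       # human-waste phosphorus per cell, kg

# Flag cells that are both cropland-rich and manure-rich (thresholds are arbitrary).
hotspots = (cropland_frac > 0.3) & (manure_p_kg > 100.0)

recyclable_p = np.where(hotspots, manure_p_kg + human_waste_p_kg, 0.0).sum()
print(f"hot-spot cells: {hotspots.sum()}, phosphorus available in them: {recyclable_p / 1e6:.1f} kt")
```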

Some 72 percent of croplands with significant manure production nearby, and 68 percent of croplands with significant human populations nearby, are in regions that are heavily dependent on imported phosphorus, including large swathes of major emerging economies such as India and Brazil. The study also identifies significant surpluses of phosphorus-rich waste in much of Asia, Europe, and the United States, suggesting that both developing and developed economies could benefit from increased recycling.

The results also show that at least five times as much phosphorus is contained in animal manure as human waste, suggesting that livestock operations are an abundant target for recycling efforts. Almost half of the world’s farmlands – about 12 percent of the planet’s landmass – are co-located with manure-rich livestock operations, suggesting that in many regions manure could be applied to fields directly, or processed using bio-digesters to extract phosphorus for efficient and economical transport to farms.

First-author Steve Powers, a researcher at Washington State University who conceived of the study, and Vaccari are now trying to figure out exactly how much phosphorus can be recaptured from animal and human waste, and identify other opportunities for more efficient phosphorus use. “If we can recycle more of this locally-available waste phosphorus back into agriculture, we might be able to keep it away from leak points while reducing our dependence on future fertilizer imports and mining,” said Powers.

“Ideally, the 45 million metric tons of phosphorus fertilizers used each year would be completely reused, and we’d harvest their maximum potential to support food production,” said Vaccari. “This work is a step toward understanding how to get to that point.”

Credit: 
Stevens Institute of Technology

How bacteria build an enzyme that destroys climate-changing laughing gas

New research from the University of East Anglia reveals how soil bacteria build the only known enzyme for the destruction of the potent global warming and ozone-depleting gas nitrous oxide.

Alongside carbon dioxide (CO2) and methane, the greenhouse gas nitrous oxide (N2O), commonly known as 'laughing gas', is now a cause for great concern, and there is much international focus on reducing emissions.

It is hoped that the findings, published today in the journal Chemical Science, will help pave the way for strategies to mitigate the damaging effects of this climate changing gas.

N2O has around 300 times the global warming potential of CO2 and stays in the atmosphere for about 120 years, where it accounts for around nine per cent of total greenhouse gas emissions.
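A small worked conversion using the roughly 300-fold warming-potential figure quoted above; the one-tonne emission is a hypothetical example, and real greenhouse-gas inventories use more precise GWP values:

```python
# CO2-equivalent conversion using the approximate factor quoted above.
# The emission amount is a hypothetical example.
GWP_N2O = 300                       # ~100-year global warming potential relative to CO2

n2o_emitted_tonnes = 1.0            # hypothetical emission of nitrous oxide
co2_equivalent_tonnes = n2o_emitted_tonnes * GWP_N2O
print(f"{n2o_emitted_tonnes:.0f} t N2O is roughly {co2_equivalent_tonnes:.0f} t CO2-equivalent")
```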

It also destroys the ozone layer with similar potency to the now banned chlorofluorocarbons (CFCs).

Atmospheric levels of N2O are rising year on year as microorganisms break down synthetic nitrogen fertilisers which are added to agricultural soil, to satisfy the food supply demands of an ever-increasing global population.

Prof Nick Le Brun from UEA's School of Chemistry, said: "It is well known that some bacteria can 'breathe' N2O in environments where oxygen (O2) is limited.

"This ability is entirely dependent on an enzyme called 'nitrous oxide reductase', which is the only enzyme known to destroy N2O. It is therefore very important for controlling levels of this climate-changing gas.

"We wanted to find out more about how soil bacteria use this enzyme to destroy nitrous oxide."

The part of the enzyme where N2O is consumed (called the 'active site') is unique in biology, consisting of a complex arrangement of copper and sulfur (a copper-sulfide cluster). Until now, knowledge of how this unusual active site is built by bacteria has been lacking.

The UEA team discovered a protein called NosL, which is required for the assembly of the copper-sulfide cluster active site and makes the enzyme active.

They found that bacteria lacking NosL still produced the enzyme but it contained less of the copper-sulfide active site. Furthermore, when the same bacteria were grown with copper in short supply, the active site was completely absent from the enzyme.

The team also showed that NosL is a copper-binding protein, indicating that it functions directly in supplying copper for the assembly of the copper-sulfide cluster active site.

Prof Le Brun said: "The discovery of the function of NosL is the first step towards understanding how the unique active site of nitrous oxide reductase is assembled. This is key information because when assembly goes wrong, inactive enzyme leads to release of N2O into the atmosphere."

The UEA team was led by Prof Nick Le Brun and Dr Andy Gates from UEA's School of Biological Sciences, and included the University's Vice Chancellor Prof David Richardson - also from the School of Biological Sciences. They are part of an international EU network focussed on understanding different aspects of N2O and the nitrogen cycle.

Dr Gates said: "Society is generally well aware of the need to address carbon dioxide emissions, but nitrous oxide is now emerging as a pressing global concern and requires researchers with different skill sets to work together to prevent further damaging effects of climate change.

"With increasing understanding of the enzymes that make and destroy N2O, we move closer to being able to develop strategies to mitigate the damaging effects of this climate changing gas on the earth's environment."

Credit: 
University of East Anglia

Climate engineering needs to look at the big picture, says researcher

image: Nadine Mengis, who until March 2019 was a Horizon Postdoctoral Fellow at Concordia

Image: 
Courtesy Nadine Mengis

Of all the different possible methods to combat anthropogenic climate change conceived of so far, among the least studied is climate engineering.

An umbrella term for large-scale projects designed to disrupt the Earth's carbon cycle or radiation balance, climate engineering has only relatively recently been included in the conversation about methods that could mitigate the harm caused by carbon emissions.

Research into various climate-engineering projects has grown, but according to Nadine Mengis, who until March 2019 was a Horizon Postdoctoral Fellow at Concordia's Matthews Climate Lab, too few of the studies produced to date have looked at the large-scale side effects these projects would have on interconnected variables.

In a new paper published in the journal Climatic Change, Mengis examines possible disruptions to the relationships between a large number of what she calls "Earth system variables" -- the aspects of the climate that are not the direct target of the engineering projects but will nonetheless be affected by them -- caused by three different climate-engineering methods.

The Big Three

Mengis, who changed her affiliation to Simon Fraser University this winter but completed her research for this paper while at Concordia, looked at three proposed climate-engineering methods for her study. Here is how each one works, in very broad terms:

First, solar radiation management. This involves actually adding aerosols into the stratosphere that would scatter incoming radiation from the sun and block some of the energy that would otherwise enter the Earth's system. Think of it as mimicking the effects of a volcanic explosion.

"In an unperturbed or un-managed climate, we would have an increasing temperature as a result of increasing CO2 concentrations," she says. "But if we manipulate the radiation balance of the planet, temperatures would level or go down, while CO2 levels remain unmitigated."

The second is ocean alkalinity enhancement. This involves grinding massive amounts of rocks and dumping them into the surface ocean, where they would absorb carbon dioxide via a chemical reaction.

This would lead to an increase in ocean alkalinity -- the ability to neutralize acid -- and subsequently raise oceanic pH levels, indicating lower levels of acidity. However, to be effective, Mengis says millions of tons of rock would have to be ground down and dumped in the ocean. Based on today's standards, it's not only impractical; ocean dumping is also illegal.

The third is large-scale afforestation. As the name implies, it is the exact opposite of deforestation -- but does not end with the planting of millions upon millions of trees.

Because trees only absorb large amounts of carbon when they are growing, says Mengis, these trees would have to be planted and then harvested, and the carbon in them sequestered. The cycle would have to be repeated constantly in order to be effective and would require, by some estimates, reforesting an area the size of Europe.

She is of course aware that none of these measures is possible to implement in the immediate future, but notes that these estimates are based on current emission rates. If humans reduce emissions dramatically, some form of these measures could be implemented at smaller scales.

Problems and solutions are global

Mengis stresses that the research on climate engineering methods remains far too narrow and is therefore of limited value for making well-informed decisions.

"I think we're skipping ahead several steps," she says. "There are issues we need to look at before getting to specifics like how climate engineering can help crop yields."

She hopes the scientific community will use her paper as an orientation to investigate the side effects these projects can cause in a more comprehensive, holistic way. As for the public, she hopes they will come to understand that the field is still comparatively young.

"There are still a lot of unknown unknowns," she says. "Things we don't even know about yet might be impacted, and I hope my paper sheds some light on that."

Credit: 
Concordia University

90 percent of teens killed by an intimate partner are girls

Intimate partner homicide among teens does occur and 90 percent of the victims are girls, according to a new study in JAMA Pediatrics.

"This is a public health issue that should be taken seriously," said lead author Avanti Adhia, a senior fellow at the Harborview Injury Prevention and Research Center at the University of Washington School of Medicine.

The study looked at data from the National Violent Death Reporting System from 2003-2016, which included 2,188 homicides of young people aged 11 to 18 in which the relationship between the victim and perpetrator was known.

Of these homicides, 150 (6.9 percent) were classified as intimate partner homicide.

"While not a common occurrence, it does occur more often than people realize," said Adhia.

Adhia said 90 percent of the perpetrators are male, and guns are the most common weapon used.

"The majority of the homicides occur in older adolescence between the ages of 16-18," she said. "A common circumstance is when a victim ends a relationship with the perpetrator or there is jealousy over the victim dating someone new. "

Adhia said another common scenario is an acute altercation or argument that ends in death by firearm or stabbing.

The data comes from 32 states and each state contributed data for a different number of years, so no trend analysis was available. But the dataset has been expanded to 50 states and more cases will be available in the future.

"Partly why I was interested in this topic is the perception that teen dating violence is less serious than intimate partner violence among adults," said Adhia. "But it's important to understand that things can escalate among teens as well."

She said evidence-based interventions should be implemented in school and community settings around awareness, communication skills in relationships and bystander intervention.

Credit: 
University of Washington School of Medicine/UW Medicine

The discrete-time physics hiding inside our continuous-time world

image: Markov processes have been used to model the accumulation of sand piles.

Image: 
Santa Fe Institute Press

Scientists believe that time is continuous, not discrete -- roughly speaking, they believe that it does not progress in "chunks," but rather "flows," smoothly and continuously. So they often model the dynamics of physical systems as continuous-time "Markov processes," named after mathematician Andrey Markov. Indeed, scientists have used these processes to investigate a range of real-world phenomena, from folding proteins to evolving ecosystems to shifting financial markets, with astonishing success.

However, invariably a scientist can only observe the state of a system at discrete times, separated by some gap, rather than continually. For example, a stock market analyst might repeatedly observe how the state of the market at the beginning of one day is related to the state of the market at the beginning of the next day, building up a conditional probability distribution of the state of the market on the second day given its state on the first.

In a pair of papers, one appearing in this week's Nature Communications and one appearing recently in the New Journal of Physics, physicists at the Santa Fe Institute and MIT have shown that in order for such two-time dynamics over a set of "visible states" to arise from a continuous-time Markov process, that Markov process must actually unfold over a larger space, one that includes hidden states in addition to the visible ones. They further prove that the evolution between such a pair of times must proceed in a finite number of "hidden timesteps", subdividing the interval between those two times. (Strictly speaking, this proof holds whenever that evolution from the earlier time to the later time is noise-free -- see paper for technical details.)
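As a minimal numerical illustration of that setup (not the authors' proofs): a continuous-time Markov process with rate matrix Q, observed every Δt, produces the two-time transition matrix exp(QΔt). The three-state generator below is an arbitrary example; the papers ask when an observed transition matrix over the visible states can arise this way at all without adding hidden states.

```python
# Minimal sketch: a continuous-time Markov process observed at discrete times.
# The 3-state rate matrix Q below is an arbitrary illustrative example.
import numpy as np
from scipy.linalg import expm

Q = np.array([[-0.6,  0.4,  0.2],     # rows sum to zero; off-diagonal rates >= 0
              [ 0.1, -0.3,  0.2],
              [ 0.3,  0.3, -0.6]])

dt = 1.0                               # observation interval
T = expm(Q * dt)                       # two-time (discrete) transition matrix

print(T)
print(T.sum(axis=1))                   # each row sums to 1: a valid stochastic matrix
```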

"We're saying there are hidden variables in dynamic systems, implicit in the tools scientists are using to study such systems," says co-author David Wolpert (Santa Fe Institute). "In addition, in a certain very limited sense, we're saying that time proceeds in discrete timesteps, even if the scientist models time as though it proceeds continually. The scientists may not have been paying attention to those hidden variables and those hidden timesteps, but they are there, playing a key, behind-the-scenes role in many of the papers those scientists have read, and almost surely also in many of the papers those scientists have written."

In addition to discovering hidden states and time steps, the scientists also discovered a tradeoff between the two; the more hidden states there are, the smaller the minimal number of hidden timesteps that are required. According to co-author Artemy Kolchinsky (Santa Fe Institute), "these results surprisingly demonstrate that Markov processes exhibit a kind of tradeoff between time versus memory, which is often encountered in the separate mathematical field of analyzing computer algorithms."

To illustrate the role of these hidden states, co-author Jeremy A. Owen (MIT) gives the example of a biomolecular process, observed at hour-long intervals: If you start with a protein in state 'a,' and over an hour it usually turns to state 'b,' and then after another hour it usually turns back to 'a,' there must be at least one other state 'c' -- a hidden state -- that is influencing the protein's dynamics. "It's there in your biomolecular process," he says. "If you haven't seen it yet, you can go look for it."

The authors stumbled on the necessity of hidden states and hidden timesteps while searching for the most energy-efficient way to flip a bit of information in a computer. In that investigation, part of a larger effort to understand the thermodynamics of computation, they discovered that there is no direct way to implement a map that both sends 1 to 0 and also sends 0 to 1. Rather, in order to flip a bit of information, the bit must proceed through at least one hidden state, and involve at least three hidden time steps.
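A small, self-contained check of that bit-flip obstruction, using a standard linear-algebra fact rather than the authors' full argument: for any real rate matrix Q, det(exp(Qt)) = exp(t·tr(Q)) > 0, while the deterministic bit flip has determinant -1, so no two-state continuous-time Markov process can reproduce it and hidden states are required.

```python
# Sketch of why a deterministic bit flip cannot be the snapshot of a 2-state
# continuous-time Markov process: det(expm(Q*t)) = exp(t * trace(Q)) > 0 for any
# real rate matrix Q, but the bit-flip map has determinant -1.
import numpy as np
from scipy.linalg import expm

flip = np.array([[0.0, 1.0],
                 [1.0, 0.0]])          # sends state 0 -> 1 and state 1 -> 0
print(np.linalg.det(flip))             # -1.0

Q = np.array([[-2.0,  2.0],            # an arbitrary valid 2-state rate matrix
              [ 5.0, -5.0]])
for t in (0.1, 1.0, 10.0):
    print(np.linalg.det(expm(Q * t)))  # always positive, never -1
```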

It turns out that any biological or physical system that "computes" outputs from inputs, like a cell processing energy or an ecosystem evolving, would conceal the same hidden variables as in the bit-flip example.

"These kinds of models really do come up in a natural way," Owen adds, "based on the assumptions that time is continuous, and that the state you're in determines where you're going to go next."

"One thing that was surprising, that makes this more general and more surprising to us, was that all of these results hold even without thermodynamic considerations," Wolpert recalls. "It's a very pure example of Phil Anderson's mantra 'more is different,' because all of these low-level details [hidden states and hidden timesteps] are invisible to the higher-level details [map from visible input state to visible output state]."

"In a very minor way, it's like the limit of the speed of light," Wolpert muses, "The fact that systems cannot exceed the speed of light is not immediately consequential to the vast majority of scientists. But it is a restriction on allowed processes that applies everywhere and is something to always have in the back of your mind."

Credit: 
Santa Fe Institute

New microscopy technique peers deep into the brain

video: Using a new microscopy technique researchers simultaneously recorded from the cortex (green) and hippocampus (blue) of a mouse brain. Bright areas correlate with cell activity.

Image: 
Laboratory of Neurotechnology and Biophysics at The Rockefeller University

In order to understand the brain, scientists must be able to see the brain--cell by cell, and moment by moment. However, because brains comprise billions of microscopic moving parts, faithfully recording their activity comes with many challenges. In dense mammalian brains, for example, it is difficult to track rapid cellular changes across multiple brain structures--particularly when those structures are located deep within the brain.

A novel microscopy technique, developed by Rockefeller scientists, integrates new and existing approaches to help build a more cohesive picture of the brain. Described in Cell, the technology captures cellular activity across large volumes of neural tissue, with impressive speed and at new depths.

Laser focused

For decades, brain imaging has been plagued by trade-offs. Some techniques produce beautiful images but fail to record neural activity in real time. Others can keep up with the brain's speed but have poor spatial resolution. And although there are tactics that successfully combine rapidity and image quality, they typically capture only a small number of cells.

"This is in part because the limits that govern these tradeoffs have not been explored or pushed in a systematic and integrated manner," says Alipasha Vaziri, head of the Laboratory of Neurotechnology and Biophysics.

Hoping to end the era of trade-offs, Vaziri recently endeavored to improve upon a technique known as two-photon (2p) microscopy. It involves the application of a laser that causes bits of brain tissue to fluoresce, or light up; and for many researchers, 2p has long been the gold standard for probing cellular activity in the brain.

Yet, this technique has limitations. Standard 2p microscopy requires point-by-point scanning of a given region, which results in slow imaging. To resolve this issue, Vaziri and his colleagues implemented a novel strategy that permits recording from multiple brain regions in parallel, while carefully controlling the size and shape of each spot recorded.

Another weakness of traditional 2p is that it measures only the surface, or cortex, of the brain, neglecting structures buried deep within the organ, such as the hippocampus, which is involved in storing memories.

"One of the biggest challenges in neuroscience is developing imaging techniques that measure the activity of deep brain regions while maintaining high resolution," says Vaziri.

Taking up this challenge, he decided to make use of a newer technology: three-photon (3p) microscopy. Whereas 2p doesn't reach beyond the surface, or cortex, of a mouse brain, 3p penetrates deeper regions. Called hybrid multiplexed sculpted light microscopy, or HyMS, Vaziri's latest innovation applies 2p and 3p concurrently, allowing researchers to generate a picture of rapid cellular activity across multiple layers of brain tissue.

Deep dive

In addition to its hybrid laser strategy, HyMS also integrates other recent technical and conceptual advancements in the field--a synergistic approach that, Vaziri says, guided the development of the technology. The goal, he says, was to maximize the amount of biological information that could be obtained through multi-photon excitation microscopy while minimizing the heat produced by this method. And when testing their new system, the scientists certainly obtained a lot of information.

HyMS boasts the highest frame rate of available 3p techniques, which means it can capture biological changes at record speed. And whereas previous techniques scanned only a single plane of tissue, this technology can obtain information from the entire tissue sample and allows users to record from as many as 12,000 neurons at once. Another advantage of HyMS is its ability to simultaneously measure activity from brain areas at different depths. Since different layers of the brain constantly exchange signals, says Vaziri, tracking the interplay between these regions is key to understanding how the organ functions.

"Before, people hadn't even been able to look at the activity of neurons over the entire depth of the cortex, which has multiple layers, all at the same time," he says. "With this technology you can actually see what the information flow looks like within the cortex, and between cortical and subcortical structures."

In addition to probing new depths, HyMS allows researchers to record brain activity from animals as they actively engage with their environment. In a recent experiment, for example, the researchers used the technology to record signals from thousands of mouse neurons as an animal walked on a treadmill or listened to sounds. The fact that they were able to obtain good recordings suggests that the technique may be used to monitor large cell populations as animals perform diverse tasks--an application that could help elucidate neural mechanisms underlying various aspects of behavior and cognition.

Further, says Vaziri, techniques like HyMS will be vital to researchers hoping to better understand how brains process information. Neurons in the brain are densely interconnected and information is often represented not by individual cells, but by states of the network.

"To understand the dynamics of a network," he says, "you need to get accurate measurements of big portions of the brain at a single-neuron level. That's what we've done here."

Credit: 
Rockefeller University

Continuing pneumococcal conjugate vaccine in Kenya at full price is cost-effective and could save thousands of lives

Co-led by the London School of Hygiene & Tropical Medicine and the KEMRI-Wellcome Trust Research Programme, the Pneumococcal Conjugate Vaccine Impact Study estimated that continuing the vaccine beyond 2022 would prevent - in the first ten years - more than 100,000 children and adults from contracting pneumococcal disease and save the lives of 14,000 children and adults who would otherwise have died.

The cost per year of healthy life saved (US $153) was considerably less than Kenya's annual gross domestic product (GDP) per person (US $1,790 in 2018), which meets - and far exceeds - the World Health Organization (WHO) GDP-related threshold for 'very cost-effective' health interventions.

The researchers say this study offers important evidence for policymakers at what is a crucial time for countries that have to decide whether to continue vaccine programmes that have benefited from Gavi's subsidy of the cost of vaccines once their economies grow and Gavi support diminishes.

Lead author Dr. John Ojal, of the London School of Hygiene & Tropical Medicine and the KEMRI-Wellcome Trust Research Programme, said: "If the Kenyan government decides not to continue the vaccine beyond January 2022, and children born after this time are no longer vaccinated with PCV, the incidence of invasive pneumococcal disease will rebound to that experienced before the vaccine was introduced. This is because there will be a growing pool of children who are not directly protected by vaccination and also not indirectly protected by vaccination of individuals around them."

PCV was introduced in Kenya in 2011 in order to protect children from pneumonia - one of the leading causes of death in children under five. Recent research has shown that PCV has reduced hospital admissions due to pneumonia by over a quarter.

At present, Kenya contributes just US $0.21 to the full cost of the vaccine, thanks to support from Gavi. By 2022, Kenya will enter an 'accelerated transition' phase, which will see the Gavi subsidy reduce rapidly until 2027, at which point Kenya will pay the full Gavi-negotiated price the country is eligible for. At US $3.05 per dose, this is a fraction of the per-dose cost of US $180 in the USA, but 15 times more than what Kenya is paying now.
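A quick arithmetic check of the figures quoted above; the per-dose prices, cost per healthy life year, and GDP figure are those reported in the article, but the comparison itself is a simple sketch, not the study's cost-effectiveness model:

```python
# Quick arithmetic check of the per-dose prices and the WHO-style cost-effectiveness
# comparison quoted above. A simple sketch, not the study's economic model.
current_contribution = 0.21   # USD per dose, Kenya's share under current Gavi support
full_price = 3.05             # USD per dose, full Gavi-negotiated price from 2027
us_price = 180.0              # USD per dose, quoted US price

print(f"full price vs current contribution: {full_price / current_contribution:.1f}x")  # ~14.5x ("15 times")
print(f"US price vs full Gavi price: {us_price / full_price:.0f}x")                     # ~59x

cost_per_healthy_life_year = 153.0    # USD, study estimate
gdp_per_capita = 1790.0               # USD, Kenya, 2018
# A WHO rule of thumb treats interventions costing less than GDP per capita per
# healthy life year gained as very cost-effective.
print(cost_per_healthy_life_year < gdp_per_capita)   # True
```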

To estimate whether continuing PCV beyond 2022 would be cost-effective, the researchers combined data from ongoing detailed health surveillance in Kilifi, Kenya, with mathematical modelling and health economics.

Anthony Scott, Professor of Vaccine Studies at the London School of Hygiene & Tropical Medicine and Principal Investigator of the Pneumococcal Conjugate Vaccine Impact Study said: "Many low-income countries that receive Gavi support - including Kenya - will soon have to make vital decisions about whether to continue PCV use at full costs as they transition away from Gavi support. Studies like these are crucial so that policy makers can make evidence-based decisions that will impact on the health of their populations."

Researchers estimated that continuing PCV beyond 2022 will cost Kenya US $15.8 million per year - more than double the current total vaccine budget. After transition away from Gavi support, Kenya's financial contribution to other Gavi-supported childhood vaccines, such as the pentavalent vaccine and those for yellow fever and rotavirus, will also have to increase.

John Ojal said: "This puts further stress on Kenya's health budget, so while our findings support an expansion of the vaccine budget, affordability may be a concern."

Work is ongoing to reduce the cost and improve the cost-effectiveness of the PCV programme: development of a cheaper vaccine, trials to reduce the number of required doses, and research by the KEMRI Wellcome Trust Research Programme and London School of Hygiene & Tropical Medicine to assess the effectiveness of smaller vaccine doses.

The authors acknowledge the limitations of the study, including that no local information was available about the proportion of children among non-hospitalised pneumonia patients who were treated as outpatients, and that the cost-effectiveness analysis is largely based on the assumption that reduction in disease will translate to reduction in mortality.

Credit: 
London School of Hygiene & Tropical Medicine

Petting zoos could potentially transmit highly virulent drug-resistant bacteria to visitors

New research presented at this year's European Congress of Clinical Microbiology & Infectious Diseases (ECCMID) in Amsterdam, Netherlands (13-16 April) shows that petting zoos can create a diverse reservoir of multidrug resistant (MDR) bacteria, which could lead to highly virulent drug-resistant pathogens being passed on to visitors.

The study is by Professor Shiri Navon-Venezia of Ariel University, Ariel, Israel and colleagues, and aimed to explore the prevalence, molecular epidemiology, and risk factors for animals in petting zoos becoming colonised by MDR bacteria. Petting zoos are a popular attraction around the world, allowing direct and indirect exposure of both children and adults to a diverse range of animal species. They differ from regular zoos in that, rather than visitors just looking at the animals, petting zoos are interactive, with children visiting, holding and petting the animals.

Extended spectrum beta-lactamase (ESBL) and AmpC-producing Enterobacteriaceae (AmpC-E), which are resistant to a number of commonly used antibiotics, have become a matter of great concern in both human and veterinary medicine, so understanding the likelihood of them colonising the animals is critical to evaluating the risk that may be posed to visitors.

The researchers did a study across 8 randomly chosen petting zoos geographically distributed throughout Israel, taking samples of faecal matter as well as from the body surface (skin, fur, or feathers) from 228 animals belonging to 42 different species. Genetic sequencing was used to identify both the species of bacteria in each sample, and the presence of ESBL and AmpC drug resistance genes. Zoo owners were given questionnaires about the ages and medical histories of their animals which were analysed to determine additional risk factors.

In total, 382 samples were collected from 228 animals, and 12% of the animals were found to be colonised with at least one ESBL/AmpC-producing bacterial strain, with 35 different recovered species of bacteria. The majority (77%) of the MDR bacteria were obtained from faeces, with the remaining 23% coming from skin, fur, or feathers. A quarter of the animals that tested positive for drug-resistant bacteria were colonised by more than one bacterial strain. Among the bacterial strains identified were the highly virulent E. coli ST656, which causes travellers' diarrhoea, and E. coli ST127, a frequent cause of urinary tract infections in humans.

Analysis of the data revealed that if an animal was treated with antibiotics it was seven times more likely to shed MDR bacteria. The study found that petting zoos provide a reservoir for a diverse range of ESBL/AmpC-E species, and are a potential source of these highly virulent pathogens, which may be transmitted to humans -- mostly children -- who visit these facilities.

Professor Navon-Venezia concludes: "Our findings demonstrate that animals in petting zoos can result in shedding and transmission of MDR pathogens that may cause illness for human visitors, even when the animals appear healthy. We recognise the high educational and emotional value of petting zoos for children, therefore, we strongly recommend that petting zoo management teams implement a strict hygiene and infection control policy, together with rationalised antibiotic policy, in order to reduce the risk of transmission between animals and visitors."

She adds: "Immediate actions by zoo operators should include installation of handwashing stations to ensure proper handwashing before and after petting animals, prohibiting food and drinking near animals, and also not allowing petting of animals receiving antibiotic treatment."

Credit: 
European Society of Clinical Microbiology and Infectious Diseases

DIY gravitational waves with 'BlackHoles@Home'

image: The BlackHoles@Home project uses highly efficient simulation grids so that binary black hole collisions can be modeled on desktop computers. The black dots represent the black hole horizons for two black holes of different masses.

Image: 
Image courtesy of Z.Etienne/WVU

WASHINGTON, D.C., April 13, 2019 -- Researchers hoping to better interpret data from the detection of gravitational waves generated by the collision of binary black holes are turning to the public for help.

West Virginia University assistant professor Zachariah Etienne is leading what will soon become a global volunteer computing effort. The public will be invited to lend their own computers to help the scientific community unlock the secrets contained in gravitational waves observed when black holes smash together.

LIGO's first detection of gravitational waves from colliding black holes in 2015 opened a new window on the universe, enabling scientists to observe cosmic events spanning billions of years and to better understand the makeup of the universe. For many scientists, the discovery also fueled expansion of efforts to more thoroughly test the theories that help explain how the universe works -- with a particular focus on inferring as much information as possible about the black holes prior to their collision.

First predicted by Albert Einstein in 1916, gravitational waves are ripples or disturbances in space-time that encode important information about changing gravitational fields.

Since the 2015 discovery, LIGO and Virgo have detected gravitational waves from eight additional black hole collisions. This month, LIGO and Virgo began new observing runs at unprecedented sensitivities.

"As our gravitational wave detectors become more sensitive, we're going to need to greatly expand our efforts to understand all of the information encoded in gravitational waves from colliding binary black holes," Etienne said. "We are turning to the general public to help with these efforts, which involve generating unprecedented numbers of self-consistent simulations of these extremely energetic collisions. This will truly be an inclusive effort, and we especially hope to inspire the next generation of scientists in this growing field of gravitational wave astrophysics."

His team -- and the scientific community in general -- needs computing capacity to run the simulations required to cover all possibilities related to the properties and other information contained in gravitational waves.

"Each desktop computer will be able to perform a single simulation of colliding black holes," said Etienne. By seeking public involvement through use of vast numbers of personal desktop computers, Etienne and others hope to dramatically increase the throughput of the theoretical gravitational wave predictions needed to extract information from observations of the collisions.

Black holes are characterized by two physical quantities: mass and spin. Spin, for example, can be broken down further into direction and speed. Etienne's colleagues, therefore, are examining a total of eight parameters when LIGO or Virgo detect waves from a collision of two black holes.

"The simulations we need to perform, with the public's help, are designed to fill large gaps in our knowledge about gravitational waves from these collisions by covering as many possibilities as we can for these eight parameters. Current black hole simulation catalogs are far too small to properly cover this wide space of possibilities," Etienne said.

"This work aims to provide a critical service to the scientific community: an unprecedented large catalog of self-consistent theoretical predictions for what gravitational waves may be observed from black hole collisions. These predictions assume that Einstein's theory of gravity, general relativity, is correct, and therefore will provide deeper insights into this beautiful and complex theory. Just to give you an idea of its importance -- if the effects of Einstein's relativity theory weren't accounted for, GPS systems would be off by kilometers per day, just to name one example."

Etienne and his team are building a website with downloadable software based on the same Berkeley Open Infrastructure for Network Computing, or BOINC, system used for the SETI@Home project and other scientific applications. The free middleware system is designed to help harness the processing power of thousands of personal computers across the globe. The West Virginia team has named their project BlackHoles@Home and expects to have it up and running later this year.

They have already established a website where the public can begin learning more about the effort: https://math.wvu.edu/~zetienne/SENR/.

Credit: 
American Physical Society