
Natural light flicker can help prevent detection

image: Wild Picasso triggerfish (Rhinecanthus aculeatus), a common reef fish.

Image: 
University of Bristol

Movement breaks camouflage, making it risky for anything trying to hide. New research, published today [1 April] in Proceedings of the Royal Society B, has shown that dynamic features common in many natural habitats, such as moving light patterns, can reduce the chance of a moving animal being located.

Dynamic illumination is particularly common in coral reefs, where patterns known as 'water caustics' play chaotically in the shallows. Researchers from the University of Bristol and the University of Queensland carried out behavioural experiments on the Great Barrier Reef, Australia.

Wild Picasso triggerfish (Rhinecanthus aculeatus), a common reef fish, were trained to locate and attack moving prey items within computer-simulated scenes on a waterproofed iPad. Each scene contained 'water caustics' that varied in motion (static or moving), scale (fine or coarse) and sharpness (sharp or diffuse), to reflect the diversity of water caustics seen in natural habitats.

The presence of water caustics significantly increased the time taken for triggerfish to attack moving prey items compared to static caustic controls. Moreover, manipulating the sharpness and scale of the water caustics suggested that this delay should be greatest in shallow water: scenes with fine-scale, sharp water caustics induced the longest attack latencies.

Dr Sam Matchette, a former PhD student in the University of Bristol's School of Biological Sciences and lead author, said: "Our research is the first to address the impacts of dynamic underwater illumination upon fish behaviour and directly assesses how visual features of water caustics can affect visually guided behaviour."

While being stationary remains the optimal strategy for the concealment of cryptic organisms, the findings here highlight conditions under which the disadvantage of moving can be reduced to some degree.

Dr Matchette added: "Due to the direct impact upon foraging efficiency, we predict that the presence of dynamic water caustics will have important consequences for decision-making regarding habitat choice and foraging by both wild prey and predators."

Credit: 
University of Bristol

Coercive measures are still frequently used in psychiatric care

The use of coercive measures in psychiatric care has decreased in recent years. However, a new study shows that coercive measures are still frequently used in Finland, and that periods of both seclusion and mechanical restraint are long. According to root-level data collected directly from psychiatric wards, the use of coercive measures is considerably more common than the Care Register for Health Care would suggest. The results of the register-based study analysing the use of coercive measures were published in the Nordic Journal of Psychiatry. The study was conducted in collaboration between the University of Eastern Finland, Niuvanniemi Hospital and Kuopio University Hospital.

Reducing the use of coercive measures is a significant goal in psychiatric care both in Finland and abroad. Yet coercive measures, such as seclusion, mechanical and physical restraint, and involuntary medication, are regularly used in psychiatric care. The most common reason for using coercive measures is violence or threat thereof, resulting from the patient's mental disorder.

The researchers collected data on the use of seclusion, mechanical and physical restraint, and involuntary medication in 2017 from all Finnish psychiatric wards offering specialised health care and from the wards of Finland's forensic psychiatry hospitals. A total of 140 psychiatric wards in 21 different organisations reported having used a coercive measure in 2017. Of these, 127 were psychiatric wards offering specialised health care in hospital districts.

Seclusion was the most commonly used coercive measure: seclusion was used by 109 wards a total of 4,006 times. The average duration of a seclusion period was nearly three days. The use of mechanical restraint was reported by 106 wards, but the frequency was considerably lower, amounting to 2,113 times. On average, the duration of a mechanical restraint episode was 17 hours. Involuntary medication was administered to patients 2,178 times by 95 wards, and the use of physical restraint was reported by 83 wards, amounting to a total of 1,064 times. The average duration of a physical restraint episode was less than one hour.

There were differences between organisations and wards in how they used coercive measures and reported on their use. In Finland, the use of seclusion and mechanical restraint must be regularly reported to the Regional State Administrative Agencies. The obligation to report does not apply to other coercive measures, although wards are required to collect and retain the related data for a period of two years. However, not all wards could provide data on the use of mechanical restraint and involuntary medication. Finland's forensic psychiatry hospitals, in contrast, were able to provide extensive data on all coercive measures used.

The root-level data on the use of coercive measures collected from psychiatric wards was considerably different from the data collected from the Care Register for Health Care for the same year.

"Some of the differences can be explained by the specific features of the system via which notifications are submitted to the Care Register for Health Care, but most discrepancies can probably be explained by the fact that not all coercive measures are entered in the system," says PhD student Emilia Laukkanen, Master of Health Sciences, from the University of Eastern Finland.

The study used root-level data on the use of coercive measures, i.e., data collected directly from psychiatric wards. Although data from the Care Register for Health Care can be used for annual comparisons, the researchers point out that the findings of the study highlight the importance of collecting data directly from wards.

Credit: 
University of Eastern Finland

PKU's Zhou Feng and collaborators report the performance of water conservation in China

image: Contribution of 14 socioeconomic drivers to the change in total water use during P1 (1965 to 1975), P2 (1975 to 1992), and P3 (1992 to 2013). The length of each bar reflects the contribution of each driver to water use change during the corresponding period, whereas the value below the bar indicates its contribution proportion. The effects of the 14 drivers are listed in the same order for all three periods: irrigation (blue), industrial (green), urban (red), and rural (purple) sectors. WUI stands for water use intensity.

Image: 
College of Urban and Environmental Sciences, Peking University

China's fast economic growth and accompanying rise in food demand are driving an increase in water use for agriculture and industry, thus threatening the country's water security. The findings of a new study underscore the value and potential of technological adoptions to help design targets and incentives for water scarcity mitigation measures.

Over the last century, human water use has been increasing at more than twice the rate of the global population itself, with around 77% of this growth taking place in developing countries. However, according to the authors of the study published in the Proceedings of the National Academy of Sciences of the United States of America (PNAS), a lack of spatially detailed datasets limits our understanding of historical water use trends and their key drivers, which makes future projections unreliable. As there are currently very few observation-based studies aimed at understanding the dynamics of historical water use, the authors endeavored to provide a detailed picture of how water use has evolved amid socioeconomic, technological, and policy changes, specifically in China. They provide evidence of the deceleration of human water use in the country and also attempt to quantify the importance of water-conserving technological adoptions.

"The key question we wanted to address was how human water use responds to socioeconomic development, climate change, and policy interventions over time and space. We looked at China, not only because the country has transitioned from an underdeveloped country to the second largest economy in the world, but also because it is home to some of the Earth's most water-stressed regions. Diverse water conservancy measures were developed since the 1980s to avoid a long-term water crisis, but it is not well known how water use is influenced by economic growth, structural transitions, and policy interventions," explains study lead author Zhou Feng, an Associate Professor at Peking University in China.

The researchers found that although China's water use doubled between 1965 and 2013, there was a widespread slowdown in the growth rates from 10.66 km3 per year before 1975, to 6.23 km3 per year in 1975 to 1992, and further down to 3.59 km3 per year in the following years. These decelerations were attributed to reduced water use in irrigation and industry, which partly offset the increase driven by pronounced socioeconomic growth. The adoption of highly efficient irrigation techniques such as drip or sprinkler irrigation systems and industrial water recycling technologies explained most of the observed reduction of water-use intensities across China. Without these technologies, China's freshwater withdrawals would have been 80% more than the actual water use over the last two decades.

While water-conserving technological adoptions can deliver the benefit of decoupling water use from socioeconomic development, studies in other countries have revealed an opposite relationship, where technological adoption has led to an increase in intensive farming and thereby an increase in water use. According to the study, the first reason for these inconsistent results could be that intensive farming practices, such as high planting density and sequential cropping, had already been well developed in many Chinese prefectures. The second reason may lie in the nature of land institutions in China, where additional intensification requiring a change in irrigation infrastructure has been difficult to adopt due to the high fixed costs involved for the small fields allocated to farmers.

The authors explain that in China, the technological adoptions were accompanied by policy interventions including about 40 laws, regulations, programs, and action plans. In addition, the growth of China's water use is very likely to continue to slow down, as the latest policy interventions provide a more stringent constraint to approach a peak of water withdrawal. However, uncertainties and potential future water scarcity will come from three aspects:

First, China's land institution is undergoing a rapid transition towards large-scale farming through the farmland transfer system issued in 2014 alongside the adoption of water-conserving irrigation planned to cover 75% of the irrigated area in 2030. These ongoing transitions may lead farmers to expand irrigated areas or shift to water-intensive crops, which could offset the savings due to future improvement of irrigation efficiency.

Second, the results indicate that the westward development of the industrial sector has worsened water scarcity in many arid and semi-arid regions. High industrial water recycling has already been adopted in almost all of these regions (>88%) except Xinjiang, so the potential for further water conservation is limited. Without stronger enforcement of caps on water withdrawal, the industrial sector may become the most important driver of continued growth in water use.

Lastly, China is urbanizing at an unprecedented rate and the increasing per-capita income, coupled with generalized tap water accessibility, will likely stimulate more water-intensive lifestyles and thereby increase domestic water use.

The deceleration of water use revealed in this study partly challenges the results from global hydrological models, which commonly suggest an increase of total water use across China over the period 1971 to 2010. Zhou points out that one reason for this bias may be that technological change factors were prescribed as constant over space and time, without consideration of policy interventions and actual technological adoption. It might, however, also be that socioeconomic activity data on China were simply disaggregated from national-scale statistics. The authors recommend extending survey-based reconstruction datasets of water use, like those presented in this study, to other regions in order to improve model drivers. In addition, the linkages identified between changes in water use and technological adoptions may also be useful in the design of more realistic future water withdrawal scenarios, with the ultimate goal of improving the global models used to assess water use targets and water scarcity mitigation.

"Modeling water use is very complex and we need much more regional data and coordination to improve our understanding of people and how they use water. The modeling community should work together to achieve this as it is crucial to identify the key drivers and mechanisms behind changing water use patterns across the world that help make future projections more reliable. Future policies to underpin water targets in for example, the UN Sustainable Development Goals framework, will be key to addressing the challenge of decoupling water use from socioeconomic development in China and other water-stressed countries," concludes study co-author and IIASA Acting Water Program Director, Yoshihide Wada.

Credit: 
Peking University

The candy-cola soda geyser experiment, at different altitudes

image: When researchers dropped a candy into a soda below sea level (left), the soda foamed less than when the same experiment was performed at more than 10,000 feet above sea level (right).

Image: 
Adapted from Journal of Chemical Education 2020, DOI: 10.1021/acs.jchemed.9b01177

Dropping Mentos® candies into a bottle of soda causes a foamy jet to erupt. Although science fair exhibitors can tell you that this geyser results from rapid degassing of the beverage induced by the candies, the precise means by which bubbles form hasn't been well characterized. Now, researchers reporting in ACS' Journal of Chemical Education used experiments in the lab and at various altitudes to probe the mechanism of bubble nucleation.

During production, soda is carbonated by sealing it under carbon dioxide pressure that is about four times the total air pressure. This causes carbon dioxide to dissolve in the beverage. When someone opens the container, carbon dioxide escapes from the space above the liquid, and the dissolved carbon dioxide slowly enters the gas phase, eventually causing the soda to go "flat." Mentos® greatly speed up this process: Carbon dioxide flows into tiny air bubbles on the rough surface of the candies, allowing the gas to rapidly jet to the surface of the soda. Thomas Kuntzleman and Ryan Johnson wondered if atmospheric pressure plays a role in carbon dioxide bubble formation. They reasoned that the answer could reveal more details of the process.

In the lab, the researchers added a Mentos® candy to water carbonated at different pressures and measured the mass lost from the liquid over time. They fit these data to an equation that allowed them to estimate that the bubbles on the surface of the candy were about 6 μm in diameter. In contrast to other candies, Mentos® could have a fortuitous balance between bubble size and the number of bubble sites that allows them to produce excellent fountains, the researchers suggest. Then, the researchers left the lab and examined the extent of soda foaming after candy addition at different altitudes, ranging from Death Valley (43 feet below sea level) to Pikes Peak (14,108 feet above sea level). They observed increased foam production at higher elevations; however, this effect could not be explained by the simple application of gas laws. Similar experiments could form the basis of classroom projects for students in general science through physical chemistry courses, the researchers say.
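The "simple application of gas laws" that failed to explain the altitude effect can be sketched as follows. Under Boyle's law alone, a bubble receiving a fixed amount of CO2 should expand in inverse proportion to ambient pressure, so the naive expectation is a fixed foaming ratio between the two sites. The pressures below come from a standard barometric approximation, not from the paper:

```python
from math import exp

# Naive Boyle's-law estimate of how much more a nucleated CO2 bubble
# should expand at altitude, using standard-atmosphere pressures.
# Illustrative only; the study found this simple picture insufficient.

def pressure_at_altitude(h_m, p0=101325.0, scale_height=8400.0):
    """Barometric approximation of atmospheric pressure (Pa) at altitude h_m metres."""
    return p0 * exp(-h_m / scale_height)

# Altitudes from the study: Death Valley (~13 m below sea level)
# and Pikes Peak (~4,301 m above sea level).
p_death_valley = pressure_at_altitude(-13)
p_pikes_peak = pressure_at_altitude(4301)

# Boyle's law (pV = constant for a fixed amount of gas) predicts the
# bubble volume scales inversely with ambient pressure.
expansion_ratio = p_death_valley / p_pikes_peak
print(f"Pressure at Death Valley: {p_death_valley / 1000:.1f} kPa")
print(f"Pressure at Pikes Peak:   {p_pikes_peak / 1000:.1f} kPa")
print(f"Naive bubble-expansion ratio: {expansion_ratio:.2f}x")
```

The observed foaming difference did not match this fixed-ratio baseline, which is consistent with the authors' conclusion that gas laws alone cannot account for the effect.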

Credit: 
American Chemical Society

Models explain changes in cooking meat

Meat is no ordinary solid. Made up of complex networks of moisture-saturated proteins, it displays some intriguing physical properties when it is cooked. Several studies in the past have attempted to recreate this behaviour in computer simulations, but because this demands so much computing power, they have only achieved simplified, one-dimensional recreations of the process, which aren't particularly accurate. In new research published in EPJ Plus, mathematicians led by Dr Hala Nelson at James Madison University show that by modelling meat as a fluid-saturated matrix of elastic proteins, which are deformed as the fluid moves, cooking behaviours can be simulated more precisely.

The insights gathered by the team could have numerous advantages, such as improvements in the safety regulations which govern the meat we consume; optimisation of its quality and flavour; and new ways to maximise its shelf life to ensure minimal wastage. In the team's model, the cooking process heats the fluid unevenly, causing it to move around and deform the protein matrix. In turn, the movement of the fluid is itself altered by this distortion. The result demonstrates fairly strong agreement with real observations - moisture is partially evaporated but is also pushed inwards from the meat surface during heating, causing the middle to swell.

Nelson and colleagues based their model on fundamental principles of conservation of mass, energy and momentum. They derived equations describing how polymers will behave when mixed with molecules of liquid, then fine-tuned their model's parameters until it was as realistic as possible. They then compared the outcomes of their simulations with experimental measurements of how thin steak slices shrink when cooked in the oven. In future studies, the team hopes to extend their simulations to 3D models. This would require far more computing power, but if achieved, could raise our level of understanding about the important food source.
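For a sense of what the simplified one-dimensional treatments criticised above look like, here is a minimal 1D heat-conduction sketch of a slab of meat in a hot oven. It captures only heat transport, none of the fluid-elastic coupling in the team's model, and every parameter value is illustrative rather than taken from the paper:

```python
# Minimal 1D heat-conduction sketch of meat heating, solved with an
# explicit finite-difference scheme. Earlier studies reduced cooking to
# one dimension in roughly this way; the new model additionally couples
# fluid flow to a deformable protein matrix. Parameters are illustrative.

def cook_1d(thickness=0.02, n=41, alpha=1.4e-7, t_oven=180.0, t_init=5.0,
            dt=0.05, steps=3600):
    """Return the temperature profile (deg C) across a slab after steps*dt seconds."""
    dx = thickness / (n - 1)
    r = alpha * dt / dx**2        # explicit scheme is stable only for r <= 0.5
    assert r <= 0.5, "explicit scheme unstable; reduce dt or coarsen grid"
    T = [t_init] * n
    T[0] = T[-1] = t_oven         # both surfaces held at oven temperature
    for _ in range(steps):
        Tn = T[:]
        for i in range(1, n - 1):
            Tn[i] = T[i] + r * (T[i + 1] - 2 * T[i] + T[i - 1])
        Tn[0] = Tn[-1] = t_oven
        T = Tn
    return T

profile = cook_1d()               # 3600 steps x 0.05 s = 3 minutes of heating
print(f"Centre temperature after 3 min: {profile[len(profile) // 2]:.1f} C")
```

Even this toy version shows the steep temperature gradient between surface and centre that drives the moisture transport the full model resolves.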

Credit: 
Springer

Stable perovskite LEDs one step closer

image: Heyong Wang, Ph.D.-student at Linköping University.

Image: 
Magnus Johansson

Researchers at Linköping University, working with colleagues in Great Britain, China and the Czech Republic, have developed a perovskite light-emitting diode (LED) with both high efficiency and long operational stability. The result has been published in Nature Communications.

"Light-emitting diodes based on perovskites are still not sufficiently stable for practical use, but we have brought them one step closer", says Professor Feng Gao, and head of research at the Division of Biomolecular and Organic Electronics, Linköping University.

Perovskites are a large family of semiconducting materials that have aroused the interest of scientists around the world. Their special crystal structure means that they have excellent optical and electronic properties, while they are both easy and cheap to manufacture. Most progress has been made in research into the use of perovskites in solar cells, but they are also well-suited for the manufacture of LEDs.

The efficiency of these LEDs, which measures the fraction of charge carriers injected into the material that are subsequently emitted as light, has increased considerably in recent years and will soon reach that of competing technologies. They are, however, not particularly stable, which means that so far they cannot be used in practice.
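In symbols, the figure of merit described above, commonly called the external quantum efficiency (EQE), is simply the photon-to-electron ratio:

```latex
\mathrm{EQE} = \frac{N_{\text{photons emitted}}}{N_{\text{electrons injected}}}
```

An EQE of 17.3%, as reported below, therefore means roughly one photon escapes the device for every six electrons driven through it.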

"Much remains to be done. Until now, most of the perovskite LEDs have either low efficiency or poor device stability", says Xiao-Ke LiU, research fellow in the Division of Biomolecular and Organic Electronics. He and Feng Gao are the principal authors of the article.

Many research groups have worked on this dilemma, without particular success. Now, researchers at LiU, working with colleagues in Great Britain, China and the Czech Republic, have found a way forward. They have used a perovskite that consists of lead, iodine and an organic substance, formamidinium. They have then embedded the perovskite into an organic molecule matrix to form a composite thin film.

"This molecule with two amino groups at its ends helps the other substances to form a high quality crystal structure that is characteristic for perovskites, and makes the crystal stable", says Heyong Wang, doctoral student in the Division of Biomolecular and Organic Electronics.

The new composite thin film has enabled the research group to develop LEDs with an efficiency of 17.3% with a long half-lifetime, approximately 100 hours.

Perovskites that contain lead and a halogen, in this case iodine, have the best light-emitting properties.

"We would very much like to get rid of the lead. So far we haven't found a good way to do this, but we are working hard on it, says Feng Gao.

The next steps are to test new combinations of different perovskites and organic molecules and to understand in detail how the nucleation and crystallisation processes occur. Different perovskites give light at different wavelengths, which is a requirement for the long-term goal of obtaining white light LEDs.

Credit: 
Linköping University

About the distribution of biodiversity on our planet

Since Charles Darwin, biologists have used the so-called "biotic interactions" hypothesis to explain, at least in part, why the tropics around the equator are so species rich. The hypothesis focuses on the importance of interactions between species for biodiversity. The general idea is that species interactions intensify towards the species-rich equator. Such interactions include those between a parasite and its host, or between a predator and its prey. The intuitively appealing hypothesis is: the stronger the interactions between species, the faster the evolutionary change, resulting in increased species diversity. Strong species interactions should further help maintain a high level of biodiversity. Testing this long-standing hypothesis has proven extremely difficult, and the results of past studies aiming to test it are mixed.

A new publication in Nature Communications now further challenges the general validity of the "biotic interactions" hypothesis. The study suggests that a specific but fundamental interaction between species - predation by large fish such as tunas or sharks - is stronger in the temperate zone than near the equator. According to the "biotic interactions" hypothesis, stronger interactions should be accompanied by a higher diversity of fish species - a pattern that is also not borne out by the study. The study was headed by Dr. Marius Roesti, who began the research at the University of British Columbia in Vancouver and is now working at the Institute of Ecology and Evolution at the University of Bern.

Analyzing more than 900 million predator attacks

To measure the interaction strength between large fish predators and small prey fish, the scientists analyzed four large data sets from pelagic (open ocean) longline fishing spanning all four major ocean basins (East and West Pacific, Atlantic, Indian Ocean). These data were used to infer how many predatory fish were caught per bait - a natural prey such as a mackerel or sardine. The researchers evaluated the catch of a fish predator as an attack by a predatory fish on its prey and hence, as an interaction between two species. "This investigation was only possible due to this extraordinary data set. The full data set spans the entire planet and records over 900 million catches of fish predators by longline fisheries over the past 55 years," comments Marius Roesti. The researchers then inferred where predatory fish bite most frequently and compared these results with the diversity of fish species.
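The per-bait accounting described above amounts to a catch-per-unit-effort metric binned by latitude. The toy sketch below illustrates the bookkeeping; all records, band names and cutoffs are invented for illustration, not taken from the study's 900-million-catch data set:

```python
from collections import defaultdict

# Hypothetical longline records: (latitude of set, hooks baited, predators caught).
# The real data set spans all four major ocean basins over 55 years.
records = [
    ( 2.0, 1000, 18), (-5.0, 1200, 20),   # near the equator
    (38.0, 1000, 55), (-42.0,  900, 48),  # temperate zone
    (65.0,  800, 10),                     # towards the poles
]

def band(lat):
    """Crude latitude banding (cutoffs are illustrative)."""
    a = abs(lat)
    return "equatorial" if a < 23.5 else "temperate" if a < 60 else "subpolar"

hooks = defaultdict(int)
caught = defaultdict(int)
for lat, n_hooks, n_caught in records:
    hooks[band(lat)] += n_hooks
    caught[band(lat)] += n_caught

# Each caught predator counts as one attack on a bait, i.e. one
# predator-prey interaction; catches per hook measure interaction strength.
for b in ("equatorial", "temperate", "subpolar"):
    print(f"{b:>10}: {caught[b] / hooks[b]:.4f} predators per baited hook")
```

With these made-up numbers the temperate band shows the highest catch rate, mirroring the qualitative pattern the study reports.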

Predatory fish attack most frequently in the temperate zone

The study found that predation by large fish is stronger at latitudes in the temperate zone than near the equator. "The latitudes with the relatively highest number of captured predators are in or near the temperate zone, and not near the equator. This result is generally true for all ocean basins and the entire period under investigation," says Roesti. Predation then decreased again towards the poles. Furthermore, the study shows that fish species richness is not highest where predation is strongest.

In a nutshell: pelagic fish predation is not strongest near the equator, but in the temperate zone. In turn, fish species richness peaks at the equator. These results contradict the general idea of the "biotic interactions" hypothesis, at least for pelagic fish predation.

Credit: 
University of Bern

AI as mediator: 'Smart' replies help humans communicate during pandemic

ITHACA, N.Y. - Daily life during a pandemic means social distancing and finding new ways to remotely connect with friends, family and co-workers. And as we communicate online and by text, artificial intelligence could play a role in keeping our conversations on track, according to new Cornell University research.

Humans having difficult conversations said they trusted artificially intelligent systems - the "smart" reply suggestions in texts - more than the people they were talking to, according to a new study, "AI as a Moral Crumple Zone: The Effects of Mediated AI Communication on Attribution and Trust," published online in the journal Computers in Human Behavior.

"We find that when things go wrong, people take the responsibility that would otherwise have been designated to their human partner and designate some of that to the artificial intelligence system," said Jess Hohenstein, a doctoral student in the field of information science and the paper's first author. "This introduces a potential to take AI and use it as a mediator in our conversations."

For example, the algorithm could notice things are going downhill by analyzing the language used, and then suggest conflict-resolution strategies, Hohenstein said.
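The mediation idea Hohenstein describes can be sketched in miniature: score the recent tone of a conversation and, if it trends negative, surface a de-escalating suggested reply. A real smart-reply system uses trained language models; the word lists, threshold and suggested phrasing below are all invented for illustration:

```python
import re

# Toy conversation mediator: crude word-list sentiment plus a trigger
# for a conflict-resolution suggestion. Purely illustrative.
NEGATIVE = {"never", "ridiculous", "wrong", "hate", "whatever"}
POSITIVE = {"thanks", "agree", "great", "sure", "sorry", "good"}

def tone(message):
    """Crude tone score: positive word count minus negative word count."""
    words = re.findall(r"[a-z']+", message.lower())
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def suggest_reply(history, window=3, threshold=-1):
    """Offer a de-escalating reply when recent tone dips to the threshold."""
    recent = sum(tone(m) for m in history[-window:])
    if recent <= threshold:
        return "I see your point -- can we take a step back?"
    return None

chat = ["Sounds good!", "No, that's wrong.", "This is ridiculous."]
print(suggest_reply(chat))  # the negative turn triggers a suggestion
```

The interesting design question the study raises is not the scoring itself but who gets blamed when such a suggestion goes wrong: the sender, or the system.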

The study was an attempt to explore the myriad ways - both subtle and significant - that AI systems such as smart replies are altering how humans interact. Choosing a suggested reply that's not quite what you intended to say, but saves you some typing, might be fundamentally altering the course of your conversations - and your relationships, the researchers said.

"Communication is so fundamental to how we form perceptions of each other, how we form and maintain relationships, or how we're able to accomplish anything working together," said co-author Malte Jung, assistant professor of information science and director of the Robots in Groups lab, which explores how robots alter group dynamics.

"This study falls within the broader agenda of understanding how these new AI systems mess with our capacity to interact," Jung said. "We often think about how the design of systems affects how we interact with them, but fewer studies focus on the question of how the technologies we develop affect how people interact with each other."

In addition to shedding light on how people perceive and interact with computers, the study offers possibilities for improving human communication - with subtle guidance and reminders from AI.

Hohenstein and Jung said they sought to explore whether AI could function as a "moral crumple zone" - the technological equivalent of a car's crumple zone, designed to deform in order to absorb the crash's impact.

"There's a physical mechanism in the front of the car that's designed to absorb the force of the impact and take responsibility for minimizing the effects of the crash," Hohenstein said. "Here we see the AI system absorb some of the moral responsibility."

Credit: 
Cornell University

A COVID-19 palliative care pandemic plan: An essential tool

OTTAWA - Palliative care physicians have created a coronavirus disease 2019 (COVID-19) palliative care plan as an essential tool to provide care and help manage scarce resources during the pandemic. The plan, which focuses on 8 critical elements -- "stuff," "staff," "space," "systems," "sedation," "separation," "communication" and "equity" -- is published in CMAJ (Canadian Medical Association Journal). https://www.cmaj.ca/content/cmaj/early/2020/03/31/cmaj.200465.full.pdf

Palliative care is a human right for patients. "The current COVID-19 pandemic will likely strain our palliative services beyond capacity," says Dr. James Downar, the head of the Division of Palliative Care at the University of Ottawa and a palliative care physician at The Ottawa Hospital and Bruyère Continuing Care. "We advise acting now to stockpile medications and supplies used in palliative care, train staff to meet palliative care needs, optimize our space, refine our systems, alleviate the effects of separation, have critical conversations, and focus on marginalized populations to ensure that all patients who require palliative care receive it."

"Many people already have advance care plans that stipulate that comfort measures are to be used if they become seriously ill," writes Dr. Downar with coauthors. "Other patients who are intubated and receiving mechanical ventilation but are not improving clinically will be extubated. A third group of patients may be denied ventilation because of resource scarcity."

The plan is an expansion of a framework developed by the US Task Force on Mass Casualty Critical Care for events with large numbers of injuries and casualties, with the addition of the last four elements: sedation, separation, communication and equity.

Stuff: medications should be stockpiled to provide comfort to patients on a larger scale, and for longer periods than usual. The authors suggest creating "palliative symptom management kits" for use by staff in long-term care facilities, paramedics and other health care professionals.

Staff: regional pandemic planning should include all health care professionals with palliative care training and involve educating others to offer palliative care.

Space: to accommodate large numbers of patients, it may be necessary to adapt specialized wards or nearby locations and ensure a quiet, peaceful environment for dying patients.

Systems: new triage systems and virtual care models may be used to allocate physicians and increase efficiency while reducing risk of infection.

Sedation: palliative sedation can provide comfort to people whose symptoms are unresponsive to standard comfort medications.

Separation: to lessen the sense of separation caused by isolation measures, use video calling and other technology to connect patients with family members.

Communication: open communication and an understanding of a patient's wishes are critical, as many may not want to receive life-sustaining measures.

Equity: it is important to ensure marginalized groups, including people with disabilities or trauma and those living in poverty, have access to palliative care during a pandemic.

"Any triage system that does not integrate palliative care principles is unethical. Patients who are not expected to survive should not be abandoned but must receive palliative care as a human right," the authors conclude.

Credit: 
Canadian Medical Association Journal

On Mars or Earth, biohybrid can turn carbon dioxide into new products

image: A device to capture carbon dioxide from the air and convert it to useful organic products. On left is the chamber containing the nanowire/bacteria hybrid that reduces carbon dioxide to form acetate. On the right is the chamber where oxygen is produced.

Image: 
UC Berkeley photo by Peidong Yang

If humans ever hope to colonize Mars, the settlers will need to manufacture on-planet a huge range of organic compounds, from fuels to drugs, that are too expensive to ship from Earth.

University of California, Berkeley, and Lawrence Berkeley National Laboratory (Berkeley Lab) chemists have a plan for that.

For the past eight years, the researchers have been working on a hybrid system combining bacteria and nanowires that can capture the energy of sunlight to convert carbon dioxide and water into building blocks for organic molecules. Nanowires are thin silicon wires about one-hundredth the width of a human hair, used as electronic components, and also as sensors and solar cells.

"On Mars, about 96% of the atmosphere is CO2. Basically, all you need is these silicon semiconductor nanowires to take in the solar energy and pass it on to these bugs to do the chemistry for you," said project leader Peidong Yang, professor of chemistry and the S. K. and Angela Chan Distinguished Chair in Energy at UC Berkeley. "For a deep space mission, you care about the payload weight, and biological systems have the advantage that they self-reproduce: You don't need to send a lot. That's why our biohybrid version is highly attractive."

The only other requirement, besides sunlight, is water, which on Mars is relatively abundant in the polar ice caps and likely lies frozen underground over most of the planet, said Yang, who is a senior faculty scientist at Berkeley Lab and director of the Kavli Energy Nanoscience Institute.

The biohybrid can also pull carbon dioxide from the air on Earth to make organic compounds and simultaneously address climate change, which is caused by an excess of human-produced CO2 in the atmosphere.

In a new paper to be published March 31 in the journal Joule, the researchers report a milestone in packing these bacteria (Sporomusa ovata) into a "forest of nanowires" to achieve a record efficiency: 3.6% of the incoming solar energy is converted and stored in carbon bonds, in the form of a two-carbon molecule called acetate: essentially acetic acid, or vinegar.

Acetate molecules can serve as building blocks for a range of organic molecules, from fuels and plastics to drugs. Many other organic products could be made from acetate inside genetically engineered organisms, such as bacteria or yeast.

The system works like photosynthesis, which plants naturally employ to convert carbon dioxide and water to carbon compounds, mostly sugar and carbohydrates. Plants, however, have a fairly low efficiency, typically converting less than one-half percent of solar energy to carbon compounds. Yang's system is comparable to the plant that best converts CO2 to sugar: sugar cane, which is 4-5% efficient.

Yang is also working on systems to efficiently produce sugars and carbohydrates from sunlight and CO2, potentially providing food for Mars colonists.

Watch the pH

When Yang and his colleagues first demonstrated their nanowire-bacteria hybrid reactor five years ago, the solar conversion efficiency was only about 0.4% -- comparable to plants, but still low compared to typical efficiencies of 20% or more for silicon solar panels that convert light to electricity. Yang was one of the first to turn nanowires into solar panels, some 15 years ago.

The researchers initially tried to increase the efficiency by packing more bacteria onto the nanowires, which transfer electrons directly to the bacteria for the chemical reaction. But the bacteria separated from the nanowires, breaking the circuit.

The researchers eventually discovered that the bugs, as they produced acetate, decreased the acidity of the surrounding water -- that is, increased a measurement called pH -- causing them to detach from the nanowires. Yang and his students then found a way to keep the water slightly more acidic to counteract the rising pH that results from continuous acetate production. This allowed them to pack many more bacteria into the nanowire forest, upping the efficiency by nearly a factor of 10. They were able to operate the reactor, a forest of parallel nanowires, for a week without the bacteria peeling off.
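The pH effect can be sketched with a toy calculation (a hedged illustration with assumed concentrations, not numbers from the study): because CO2 reduction at the cathode consumes protons, the hydrogen-ion concentration falls and the pH climbs as acetate accumulates, unless acid is added back.

```python
import math

def ph(h_conc):
    """pH from hydrogen-ion concentration (mol/L)."""
    return -math.log10(h_conc)

# Illustrative numbers only: CO2 reduction at the nanowire cathode
# consumes protons, so [H+] falls and pH rises as acetate accumulates.
h_start = 1e-6      # assumed starting [H+] (pH 6, slightly acidic)
h_consumed = 9e-7   # assumed protons consumed by the reaction (mol/L)

print(ph(h_start))               # pH 6.0 at the start
print(ph(h_start - h_consumed))  # pH rises to 7.0 -- bacteria detach
```

Keeping the medium slightly more acidic, as the team did, simply offsets this drift so the bacteria stay attached to the nanowires.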

In this particular experiment, the nanowires were used only as conductive wires, not as solar absorbers. An external solar panel provided the energy.

In a real-world system, however, the nanowires would absorb light, generate electrons and transport them to the bacteria glommed onto the nanowires. The bacteria take in the electrons and, similar to the way plants make sugars, convert two carbon dioxide molecules and water into acetate and oxygen.
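The overall reaction described above can be balanced as 2 CO2 + 2 H2O -> CH3COOH + 2 O2 (writing acetate in its protonated form, acetic acid); a quick element-count check confirms the bookkeeping:

```python
from collections import Counter

# Element counts per molecule
CO2    = Counter(C=1, O=2)
H2O    = Counter(H=2, O=1)
ACETIC = Counter(C=2, H=4, O=2)  # CH3COOH, the protonated form of acetate
O2     = Counter(O=2)

def totals(side):
    """Sum element counts over (coefficient, molecule) pairs."""
    out = Counter()
    for coeff, mol in side:
        for element, n in mol.items():
            out[element] += coeff * n
    return out

left  = totals([(2, CO2), (2, H2O)])    # 2 CO2 + 2 H2O
right = totals([(1, ACETIC), (2, O2)])  # CH3COOH + 2 O2
print(left == right)  # True: carbon, hydrogen and oxygen all balance
```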

"These silicon nanowires are essentially like an antenna: They capture the solar photon just like a solar panel," Yang said. "Within these silicon nanowires, they will generate electrons and feed them to these bacteria. Then the bacteria absorb CO2, do the chemistry and spit out acetate."

The oxygen is a side benefit and, on Mars, could replenish colonists' artificial atmosphere, which would mimic Earth's 21% oxygen environment.

Yang has tweaked the system in other ways -- for example, by embedding quantum dots that act as solar panels in the bacteria's own membranes, absorbing sunlight and obviating the need for silicon nanowires. These cyborg bacteria also make acetic acid.

His lab continues to search for ways to up the efficiency of the biohybrid, and is also exploring techniques for genetically engineering the bacteria to make them more versatile and capable of producing a variety of organic compounds.

Credit: 
University of California - Berkeley

New quantum technology could help diagnose and treat heart condition

Atrial fibrillation (AF) is a heart condition that causes an irregular and abnormally fast heart rate, potentially leading to blood clots, stroke, heart failure and other heart-related complications. While the causes of AF are unknown, it affects around one million people in the UK, with cases predicted to rise at great cost to the NHS.

Currently, AF is commonly diagnosed using an electrocardiogram (ECG), but this can only be done during an episode, so complementary means of diagnosis are needed.

AF is treated through a surgical procedure called 'catheter ablation', which carefully destroys the diseased area of the heart to interrupt abnormal electrical circuits. In 50% of cases, patients require further treatment.

Testing of the UCL-developed technology, published today in Applied Physics Letters, shows it can successfully image the conductivity of solutions mimicking biological tissues and could therefore be used to diagnose AF and identify areas of the heart where surgery should be targeted.

It would work by mapping the electrical conductivity of the heart in 2D to identify anomalies where the heart is misfiring.

Corresponding author, Dr Luca Marmugi (UCL Physics & Astronomy and UCLQ), said: "Atrial fibrillation is a serious condition about which surprisingly little is known. We hope to change this through our work with clinicians in terms of both diagnosis and treatment.

"Surgery to treat atrial fibrillation effectively cuts the wires to prevent a short circuit in the heart, resetting the irregular heartbeat to a normal one, and our technology would help to identify where the short circuit is. While not available in the clinic yet, we've shown, for the first time, that it is possible to map the conductivity of live tissues in small volumes to an unprecedented level of sensitivity and at room temperature."

The team imaged solutions with a conductivity comparable to that of live tissues, down to a sensitivity of 0.9 siemens per metre and a resolution of one centimetre, using an unshielded atomic magnetometer with an AC magnetic field. Each solution was 5 ml in volume, matching the volumes expected in AF diagnosis.
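One reason induction imaging suits biological samples is the electromagnetic skin depth: at tissue-like conductivities of about 1 S/m, an AC probe field penetrates far deeper than any organ. A minimal estimate using the standard skin-depth formula, with illustrative parameters rather than the study's actual operating frequency:

```python
import math

MU0 = 4 * math.pi * 1e-7  # vacuum permeability (H/m)

def skin_depth(sigma_s_per_m, freq_hz):
    """Skin depth delta = sqrt(2 / (mu0 * sigma * omega)): the depth at
    which an AC magnetic field decays to 1/e inside a conductor."""
    omega = 2 * math.pi * freq_hz
    return math.sqrt(2 / (MU0 * sigma_s_per_m * omega))

# At a tissue-like conductivity of 1 S/m and a 1 kHz field (illustrative),
# the skin depth is about 16 m, so the field passes through the whole body.
print(round(skin_depth(1.0, 1e3), 1))  # 15.9
```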

The signal was detected using rubidium-based quantum sensors, which the team developed specifically to image small volumes accurately and consistently over several days, with areas of brightness indicating high conductivity.

Being able to detect conductivity at less than one siemens per metre is a 50-fold improvement on previous imaging results and demonstrates that the technique is sensitive and stable enough to image biological tissues in an unshielded environment.

Co-author and group leader, Professor Ferruccio Renzoni (UCL Physics & Astronomy), said: "Electromagnetic induction imaging has been successfully used in a range of practical uses such as non-destructive evaluation, material characterization, and security screening, but this is the first time that it's been shown to be useful for biomedical imaging. We think it will be safe to use as it would expose organs, such as the heart, to one-billionth the magnetic field commonly used in MRI scanners.

"We've achieved a phenomenal level of sensitivity in an unshielded, room temperature environment, which brings us a lot closer to bringing this technology to the clinic. It was only possible by using quantum technologies and we are excited about the potential applications for improving clinical outcomes of atrial fibrillation."

The team envision an array of their quantum sensors that can be placed over the heart, giving readings in a matter of seconds.

The next step is for the team to collaborate with clinicians to integrate the technology into a tool for use in GP surgeries and hospitals.

Credit: 
University College London

Injuries from motorized scooters

What The Study Did: Motorized scooters are increasingly popular and, in this study, researchers analyzed medical information for 61 adults who visited a single emergency department with scooter-related injuries.

Authors: Jeffrey D. Riley, M.D., of HonorHealth Scottsdale Osborn Medical Center in Arizona, is the corresponding author.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

(10.1001/jamanetworkopen.2020.1925)

Editor's Note:  Please see the article for additional information, including other authors, author contributions and affiliations, conflicts of interest and financial disclosures, and funding and support.

Credit: 
JAMA Network

Examining racial disparities in prostate cancer survival

What The Study Did: Data for nearly 230,000 men were used in this study to examine variations in survival in prostate cancer by geographic areas in the United States.

Authors: Quoc-Dien Trinh, M.D., of Harvard Medical School and Brigham and Women's Hospital in Boston, is the corresponding author.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

(10.1001/jamanetworkopen.2020.1839)

Editor's Note:  The article includes conflict of interest and funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, conflicts of interest and financial disclosures, and funding and support.

Credit: 
JAMA Network

Novel cell-based cancer immunotherapy shows promise in early studies

Scientists have developed a new immunotherapy that eradicates solid tumours in mice without adverse side effects, according to a new study published today in eLife.

The newly developed chimeric antigen receptor-T cell (CAR-T cell therapy) could soon be tested in clinical trials. In addition, the researchers used a new mouse model that could be used to test the safety, efficacy and mechanisms of CAR-T cell treatments for patients with solid cancers in future.

In CAR-T cell therapy, immune cells called T cells are collected from a patient's blood sample and reprogrammed with CAR molecules that recognise cancer-specific antigens expressed on cancer cells. When the CAR-T cells are given back to the patient, they not only kill the antigen-expressing cancer cells directly but also switch on the immune system to fight the tumour.

"One of the challenges with developing CAR-T cells is that, sometimes, the CAR-T cells work against tumour antigens that are also present in lower amounts on normal cells, causing serious side effects which we only find out about in clinical trials," explains co-corresponding author Tomonori Yaguchi, Visiting Assistant Professor in the Division of Cellular Signaling, Institute for Advanced Medical Research, Keio University School of Medicine, Japan. "We proposed that if human CAR-T cells can cross-react with the mouse antigens and detect antigens found on normal mouse cells, we could test human CAR-T cells in mouse tumour models. This would allow more robust tests of both safety and effectiveness before the treatments reach clinical testing."

The team developed a CAR-T cell treatment targeting an antigen called glypican-1 (GPC1). This antigen is found in large amounts on several types of human tumour cells, and also exists in low amounts on normal human and mouse cells. When the scientists tested CAR-T cells on mice bearing mouse tumours, they found that the CAR-T cells effectively inhibited tumour growth without causing adverse side effects. In fact, for one of the mouse tumours, four out of five of the mice receiving CAR-T cell treatment remained completely tumour-free for at least 100 days. The team also found that the CAR-T cells enhanced immune responses against tumour antigens other than GPC1.

Among the most important new immunotherapy treatments for cancer are drugs called immune checkpoint inhibitors. These work by taking the brake off immune cells so that they can destroy cancer cells. When the team combined the CAR-T cells with a checkpoint inhibitor that blocks the activity of the PD-1 protein found on T cells, this further enhanced the anti-tumour effects of CAR-T cell treatment, even though the checkpoint inhibitor had no effect on its own. This suggests that using CAR-T cells targeting GPC1 alongside a checkpoint inhibitor could be an effective combination treatment for cancer.

"We have generated CAR-T cells targeting GPC1 in both humans and mice and shown their effectiveness in mouse solid tumour models," concludes senior author Yutaka Kawakami, Professor in the Division of Cellular Signaling, Institute for Advanced Medical Research, Keio University School of Medicine. "By establishing a new type of model, we were able to test the effectiveness, safety and anti-tumour mechanisms of CAR-T cells, showing the importance of choosing the most appropriate models for evaluating these novel types of cancer treatment."

Credit: 
eLife

Study shows potential for using fiber-optic networks to assess ground motions during earthquakes

A new study from a University of Michigan researcher and colleagues at three institutions demonstrates the potential for using existing networks of buried optical fibers as an inexpensive observatory for monitoring and studying earthquakes.

The study provides new evidence that the same optical fibers that deliver high-speed internet and HD video to our homes could one day double as seismic sensors.

"Fiber-optic cables are the backbone of modern telecommunications, and we have demonstrated that we can turn existing networks into extensive seismic arrays to assess ground motions during earthquakes," said U-M seismologist Zack Spica, first author of a paper published online Feb. 12 in the journal JGR Solid Earth.

The study was conducted using a prototype array at Stanford University, where Spica was a postdoctoral fellow for several years before recently joining the U-M faculty as an assistant professor in the Department of Earth and Environmental Sciences. Co-authors include researchers at Stanford and from Mexico and Virginia.

"This is the first time that fiber-optic seismology has been used to derive a standard measure of subsurface properties that is used by earthquake engineers to anticipate the severity of shaking," said geophysicist Greg Beroza, a co-author on the paper and the Wayne Loel Professor in Stanford's School of Earth, Energy & Environmental Sciences.

To transform a fiber-optic cable into a seismic sensor, the researchers connect an instrument called a laser interrogator to one end of the cable. It shoots pulses of laser light down the fiber. The light bounces back when it encounters impurities along the fiber, creating a "backscatter signal" that is analyzed by a device called an interferometer.

Changes in the backscatter signal can reveal how the fiber stretches or compresses in response to passing disturbances, including seismic waves from earthquakes. The technique is called distributed acoustic sensing, or DAS, and has been used for years to monitor the health of pipelines and wells in the oil and gas industry.
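In DAS, the measured quantity is an optical phase change between two points on the fiber, which maps to axial strain via the fiber's refractive index and photoelastic response. A hedged sketch of that conversion, using typical textbook values rather than parameters from this study:

```python
import math

def das_strain(delta_phase_rad, gauge_length_m,
               wavelength_m=1550e-9,  # typical telecom laser wavelength
               n_fiber=1.468,         # refractive index of silica fiber
               xi=0.78):              # photoelastic scaling factor
    """Axial strain from an optical phase change measured over one
    gauge length: eps = (lambda * dphi) / (4 * pi * n * xi * L)."""
    return (wavelength_m * delta_phase_rad) / (
        4 * math.pi * n_fiber * xi * gauge_length_m)

# A 1-radian phase change over a 10 m gauge length corresponds to a
# strain of roughly 1e-8 -- tens of nanostrain, within the range of
# ground motions that seismic monitoring must resolve.
print(f"{das_strain(1.0, 10.0):.2e}")  # ~1.08e-08
```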

The new study in JGR Solid Earth extends previous work with the 3-mile Stanford test loop by producing high-resolution maps of the shallow subsurface, which scientists can use to see which areas will undergo the strongest shaking in future earthquakes, Beroza said.

In addition, the study demonstrates that optical fibers can be used to sense seismic waves and obtain velocity models and resonance frequencies of the ground--two parameters that are essential for ground-motion prediction and seismic-hazard assessment. Spica and his colleagues say their results are in good agreement with an independent survey that used traditional techniques, thereby validating the methodology of fiber-optic seismology.
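The resonance frequency mentioned here is commonly estimated from a velocity model with the quarter-wavelength rule: a soft layer of shear-wave velocity Vs and thickness H over stiffer bedrock resonates at f0 = Vs / (4H). A short sketch with illustrative values, not figures from the study:

```python
def resonance_freq(vs_m_per_s, thickness_m):
    """Fundamental site resonance from the quarter-wavelength rule:
    f0 = Vs / (4 * H) for a soft layer over stiff bedrock."""
    return vs_m_per_s / (4 * thickness_m)

# Illustrative: a 25 m soft-sediment layer with Vs = 200 m/s resonates
# near 2 Hz -- structures with a matching natural period shake hardest.
print(resonance_freq(200.0, 25.0))  # 2.0
```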

This approach appears to have great potential for use in large, earthquake-threatened cities such as San Francisco, Los Angeles, Tokyo and Mexico City, where thousands of miles of optical cables are buried beneath the surface.

"What's great about using fiber for this is that cities already have it as part of their infrastructure, so all we have to do is tap into it," Beroza said.

Many of these urban centers are built atop soft sediments that amplify and extend earthquake shaking. The near-surface geology can vary considerably from neighborhood to neighborhood, highlighting the need for detailed, site-specific information.

Yet getting that kind of information can be a challenge with traditional techniques, which involve the deployment of large seismometer arrays--thousands of such instruments in the Los Angeles area, for example.

"In urban areas, it is very difficult to find a place to install seismic stations because asphalt is everywhere," Spica said. "In addition, many of these lands are private and not accessible, and you cannot always leave a seismic station standing alone because of the risk of theft.

"Fiber optics could someday mark the end of such large-scale and expensive experiments. The cables are buried under the asphalt and crisscross the entire city, with none of the disadvantages of surface seismic stations."

The technique would likely be fairly inexpensive, as well, Spica said. Typically, commercial fiber-optic cables contain unused fibers that can be leased for other purposes, including seismology.

For the moment, traditional seismometers provide better performance than prototype systems that use fiber-optic sensing. Also, seismometers sense ground movements in three directions, while optical fibers only sense along the direction of the fiber.

The 3-mile Stanford fiber-optic array and data acquisition were made possible by a collective effort from Stanford IT services, Stanford Geophysics, and OptaSense Ltd. Financial support was provided by the Stanford Exploration Project, the U.S. Department of Energy and the Schlumberger Fellowship.

The next phase of the project involves a much larger test array. A 27-mile loop was formed recently by linking optical fibers on Stanford's historic campus with fibers at several other nearby locations.

The other authors of the JGR Solid Earth paper are Biondo Biondi of Stanford, Mathieu Perton of Universidad Nacional Autónoma de México and Eileen Martin of Virginia Tech.

Study: Urban Seismic Site Characterization by Fiber-Optic Seismology

This news release includes information provided by Stanford University.

Credit: 
University of Michigan