Tech

A new method to study lithium dendrites could lead to better, safer batteries

image: A lithium dendrite is imaged and stress-tested under an atomic force microscope tip.

Image: 
Zhang Lab/Penn State

Lithium-ion batteries often grow needle-like structures between their electrodes that can short out the battery and sometimes cause fires. Now, an international team of researchers has found a way to grow and observe these structures, pointing toward ways to stop or prevent their formation.

"It is difficult to detect the nucleation of such a whisker and observe its growth because it is tiny," said Sulin Zhang, professor of mechanical engineering, Penn State. "The extremely high reactivity of lithium also makes it very difficult to experimentally examine its existence and measure its properties."

Lithium whiskers and dendrites are needle-like structures only a few hundred nanometers in thickness that can grow from the lithium electrode through either liquid or solid electrolytes toward the positive electrode, shorting out the battery and sometimes causing fire.

The collaborative team from China, Georgia Tech and Penn State successfully grew lithium whiskers inside an environmental transmission electron microscope (ETEM) using a carbon dioxide atmosphere. The reaction of carbon dioxide with lithium forms an oxide layer that helps stabilize the whiskers.

They report their results online this week in Nature Nanotechnology, in a paper titled "Revealing the growth and stress generation of lithium whiskers by in situ ETEM-AFM."

In a novel step, the team used an atomic force microscope (AFM) tip as the counter electrode; the integrated ETEM-AFM technique allows simultaneous imaging of whisker growth and measurement of the growth stress. If the growth stress is high enough, a whisker can penetrate and fracture the solid electrolyte, continue growing and eventually short-circuit the cell.
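
To put the measurement in concrete terms: the AFM cantilever acts as a calibrated spring, so its deflection gives a force, and dividing that force by the whisker's cross-sectional area gives the growth stress. The back-of-envelope sketch below illustrates the arithmetic; the spring constant, deflection and whisker radius are illustrative assumptions, not values from the paper.

    # Back-of-envelope conversion from AFM cantilever deflection to growth stress.
    # All numbers are illustrative assumptions, not values reported in the study.
    import math

    spring_constant = 5.0      # N/m, a typical AFM cantilever stiffness (assumed)
    deflection = 200e-9        # m, measured cantilever deflection (assumed)
    whisker_radius = 150e-9    # m, "a few hundred nanometers in thickness"

    force = spring_constant * deflection    # Hooke's law: F = k * x
    area = math.pi * whisker_radius ** 2    # circular cross-section assumed
    print(f"growth stress ~ {force / area / 1e6:.0f} MPa")   # ~14 MPa with these inputs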

"Now that we know the limit of the growth stress, we can engineer the solid electrolytes accordingly to prevent it," Zhang said. Lithium metal-based all-solid-state batteries are desirable because of greater safety and higher energy density.

This new technique will be welcomed by the mechanics and electrochemistry communities and be useful in many other applications, Zhang said.

Next up for the team is to look at the dendrite as it forms against a more realistic solid-state electrolyte under TEM to see exactly what happens.

Credit: 
Penn State

BPA replacement hinders heart function, study reveals

image: Professor Glen Pyle.

Image: 
University of Guelph

BPS, the chemical widely used to replace BPA, can hinder heart function within minutes of a single exposure, according to a new University of Guelph study.

The study is the first to show the instant effects bisphenol S (BPS) can have on the heart.

"We expected to find similar effects from BPS as we have with BPA, but not at the speed that it worked," said biomedical sciences professor Glen Pyle, who conducted the study with former master's student Melissa Ferguson. "This replacement chemical seems to be more potent."

Bisphenol A (BPA), a chemical used in plastic products, was banned from baby bottles in Canada in 2010 over concerns that it may leach into foods and cause hormone-related side effects. More manufacturers are now using BPS as a replacement in their products and labelling them as BPA-free.

When mice were given BPA or BPS in amounts that mimicked typical human exposure levels, their heart function worsened within minutes of exposure, especially in females.

These findings are concerning, as endocrine receptors and metabolic pathways are similar in mice and humans, said Pyle.

"This study raises concerns about the safety of BPS as a replacement for BPA."

It's particularly worrisome for people with coronary heart disease, high blood pressure, diabetes or obesity, because the effects of BPS could increase the chance of a heart attack or make a heart attack more severe, he added.

"If the heart is in a precarious position, when you add a stressor you can make it worse."

Published recently in the journal Scientific Reports, the study entailed treating mouse hearts with BPA and BPS at levels typically seen in people. Each chemical on its own was found to depress heart function by dampening heart contractions, slowing blood flow. However, BPS had a quicker impact, taking effect within five minutes of exposure.

"Previous research has looked at the chronic effects that can happen when exposed to BPS over days," said Pyle. "But we are the first to show how fast BPS can work. This is an important finding because it means you don't need to have a buildup of the chemical over time to experience its harmful effects."

BPA is found in plastics used for food packaging, including liners for metal cans and other containers, as well as in medical devices such as hospital intravenous lines and dental sealants.

Although the body gets rid of bisphenols quickly, their ubiquitous use in so many consumer goods means that exposure is essentially constant.

Pyle advocates banning the substitute chemical BPS from such consumer products as food and beverage packaging, toys and thermal paper receipts. He also suggests consumers reduce plastic use, including single-use plastics.

Credit: 
University of Guelph

New closed-loop system offers promise as novel treatment for post-bariatric hypoglycemia

BOSTON - (January 8, 2020) - Gastric bypass vastly improves the health of the patients who elect to receive the surgery. Post-bariatric hypoglycemia, however, can be a severe complication experienced by 10 to 30 percent of patients.

Researchers at Joslin Diabetes Center and Harvard John A. Paulson School of Engineering and Applied Sciences have developed a closed-loop system that automatically provides patients with an appropriate, as-needed dose of liquid glucagon to treat this condition. The system, composed of a continuous glucose monitor (CGM) and a glucagon pump that communicate via an algorithm-controlled application, would allow patients to go about their daily activities without fear of dipping into dangerously low blood sugar levels. The success of the system was reported on Nov. 13 in The Journal of Clinical Endocrinology & Metabolism.

"Post-bariatric hypoglycemia is a profoundly life-altering condition for patients. Having unpredictable hypoglycemia that people can't detect is really an unsafe situation," says Mary Elizabeth Patti, M.D., Associate Professor of Medicine at Harvard Medical School, Investigator at Joslin, and senior author on the paper. "This system provides a way to help individuals keep their glucose in a safe range."

Over two hundred thousand people in the United States have bariatric surgery each year. Some types of these surgeries not only shrink the size of the stomach, but also change the way food travels through the intestines. As a result, high levels of certain hormones are released from the intestine after eating, and these hormones increase insulin production. These changes, in part, account for the reduction in obesity-associated problems, including type 2 diabetes. But in some patients, the surgery can trigger the body to over-produce insulin, leading to sharp drops in blood glucose levels.

"Hypoglycemia can be very disabling," says Dr. Patti. "Since it is not predictable, people can't plan in advance for it. And if it happens repeatedly, people can become unaware that their glucose is low. And if the glucose is severely low, they may have alterations in brain function and may not be able to think clearly. With more severe hypoglycemia, they may have loss of consciousness and may require the assistance of someone else. It becomes quite a dangerous situation."

Current treatments for post-bariatric hypoglycemia include strictly regulated meal plans, and medications to reduce insulin production after meals. Once a low blood glucose develops, patients have to consume sugar. If the patient has lost consciousness, a family member may have to administer an emergency dose of glucagon, a medication that increases glucose. These treatments, however, are frequently not sufficient on their own and may lead to unhealthy swings in blood sugar.

"This new automated glucagon delivery system is an important development because it helps protect these patients from developing undetected or difficult to treat low blood sugars," says Christopher Mulla, MD, first author on the study. "Glucagon provides patients with a treatment that doesn't involve eating, which they're often afraid of doing, and it does not cause rebound high blood sugars, which can then trigger another low blood sugar."

The system grew from a collaboration between clinical and computational scientists at Joslin Diabetes Center and Harvard John A. Paulson School of Engineering and Applied Sciences. Work on the system began about four years ago, when Dr. Patti realized that the artificial pancreas algorithms developed to treat diabetes by study co-senior author Dr. Eyal Dassau, Director of the Biomedical Systems Engineering Research Group at the Harvard John A. Paulson School of Engineering and Applied Sciences, and his team could similarly be adapted to detect, treat and prevent severe hypoglycemia.

The team tested whether a glucagon pump and CGM could communicate to provide an adequate dose of glucagon to treat an impending low. During this first phase, glucagon doses were administered by the study physicians. In this newly published paper, the team closed the loop and allowed Dr. Dassau's algorithm to sense impending low blood sugar levels and automatically deliver an appropriate glucagon dose under supervision by the medical team.
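
In control-engineering terms, this is a sense-predict-actuate loop: project the glucose trend forward, and command a mini-dose when the projection crosses a threshold, with a lockout so doses don't stack. The Python sketch below shows the general shape of such a controller; the threshold, horizon, dose and lockout values are hypothetical placeholders, not the team's published algorithm.

    # Hypothetical sketch of a closed-loop hypoglycemia controller.
    # Threshold, horizon, dose and lockout are illustrative, not the study's algorithm.
    from collections import deque

    LOW_THRESHOLD = 75.0   # mg/dL considered an impending low (assumed)
    HORIZON_MIN = 15       # minutes ahead to project the trend (assumed)
    MINI_DOSE_UG = 150     # micrograms of glucagon per delivery (assumed)
    LOCKOUT_MIN = 30       # minimum minutes between doses (assumed)

    class GlucagonController:
        def __init__(self):
            self.readings = deque(maxlen=6)        # recent CGM samples, 5 min apart
            self.minutes_since_dose = float("inf")

        def update(self, cgm_mg_dl, dt_min=5):
            """Take one CGM reading; return micrograms of glucagon to deliver."""
            self.readings.append(cgm_mg_dl)
            self.minutes_since_dose += dt_min
            if len(self.readings) < 2:
                return 0.0
            # Linear projection of the recent trend out to the horizon
            slope = (self.readings[-1] - self.readings[0]) / (dt_min * (len(self.readings) - 1))
            projected = cgm_mg_dl + slope * HORIZON_MIN
            if min(projected, cgm_mg_dl) < LOW_THRESHOLD and self.minutes_since_dose >= LOCKOUT_MIN:
                self.minutes_since_dose = 0.0
                return float(MINI_DOSE_UG)         # command the pump: one mini-dose
            return 0.0

    ctrl = GlucagonController()
    doses = [ctrl.update(g) for g in (110, 100, 90, 82, 76)]   # a falling CGM trace
    print(doses)   # one mini-dose fires as the projection crosses the threshold;
                   # the lockout then prevents dose stacking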

"The way that we look at it, it is very similar to how in your car, you have an airbag," says Dr. Dassau. "You don't use that airbag every time that you stop at a traffic light, but when there is a severe event and there's a need to prevent catastrophe, the airbag will be deployed. We employing the same idea for the glucagon system: we detect, we analyze and then we deliver automatically a mini dose of glucagon."

Twelve patients participated in the study, which took place at Joslin's Clinical Research Center on two separate days. Upon arrival at Joslin, patients were hooked up to a CGM and a pump that was filled either with glucagon or a placebo. The study was double-blind, meaning neither the study team nor the patients knew which medication was being delivered on which day until the conclusion of the study. The team then induced hypoglycemia in each patient and allowed the algorithm to predict impending or detect current low blood sugar and deliver either glucagon or placebo. The results from each day were analyzed and compared.

"I was very pleased that the system was able to detect hypoglycemia consistently, that the patients were able to tolerate the small dose of glucagon that we used, and that it was effective," says Dr. Patti. "We used about a third of the usual emergency rescue glucagon dose, and that was sufficient to raise the glucose without causing a high glucose level."

Too high a dose of glucagon can lead to vomiting and other symptoms of hyperglycemia, which often occurs in patients given emergency-level doses for hypoglycemia. This new, closed-loop system significantly reduced the risk of over-treating. "That's one of the benefits of automation and running a closed loop. You can start with a very low dose of glucagon as it's needed, and add an additional small dose if indicated without overdosing," says Dr. Dassau.

The team has already started to adapt the algorithm from a computer application to a cell phone in preparation for the next phase of a clinical trial, which will send the entire system home with study participants to test in a real-world setting.

"We believe that it will provide a particularly helpful therapeutic option," says Dr. Patti. "Using the system to detect an upcoming severe low and treat it before it gets unsafe would be so important to improve safety and quality of life of patients with this type of hypoglycemia."

Credit: 
Joslin Diabetes Center

Harvard researchers help explain link between emotion and addictive substance use

CAMBRIDGE, MA -- What drives a person to smoke cigarettes - and keeps one out of six U.S. adults addicted to tobacco use, at a cost of 480,000 premature deaths each year despite decades of anti-smoking campaigns? What role do emotions play in this addictive behavior? Why do some smokers puff more often and more deeply or even relapse many years after they've quit? If policy makers had those answers, how could they strengthen the fight against the global smoking epidemic?

A team of researchers based at Harvard University now has fresh insights into these questions, thanks to a set of four interwoven studies described in a new report published in the Proceedings of the National Academy of Sciences: The studies show that sadness plays an especially strong role in triggering addictive behavior relative to other negative emotions like disgust.

The studies range from analysis of data from a national survey of more than 10,000 people over 20 years to laboratory tests examining the responses of current smokers to negative emotions. One study tested the volume and frequency of actual puffs on cigarettes by smokers who volunteered to be monitored as they smoked. While drawing from methodologies from different fields, the four studies all reinforce the central finding that sadness, more so than other negative emotions, increases people's craving to smoke.

"The conventional wisdom in the field was that any type of negative feelings, whether it's anger, disgust, stress, sadness, fear, or shame, would make individuals more likely to use an addictive drug," said lead researcher Charles A. Dorison, a Harvard Kennedy School doctoral candidate. "Our work suggests that the reality is much more nuanced than the idea of 'feel bad, smoke more.' Specifically, we find that sadness appears to be an especially potent trigger of addictive substance use."

Senior co-author Dr. Jennifer Lerner, the co-founder of the Harvard Decision Science Laboratory and Thornton F. Bradshaw Professor of Public Policy, Decision Science, and Management at Harvard Kennedy School, said the research could have useful public policy implications. For example, current anti-smoking ad campaigns could be redesigned to avoid images that trigger sadness and thus unintentionally increase cigarette cravings among smokers.

Lerner is the first tenured psychologist on the faculty of the Kennedy School. She was the Chief Decision Scientist for the U.S. Navy in 2018-19. Lerner has studied the impact of emotions on decision-making since the 1990s, examining issues including whether generalized negative emotions trigger substance abuse or whether a subset of specific emotions such as sadness are more important factors in addiction.

The other co-authors include Ke Wang, a doctoral student at the Kennedy School; Vaughan W. Rees, director of the Center for Global Tobacco Control at Harvard T.H. Chan School of Public Health; Ichiro Kawachi, the John L. Loeb and Frances Lehman Loeb Professor of Social Epidemiology at the Chan School; and Associate Professor Keith M.M. Ericson at the Questrom School of Business at Boston University. The work was funded by grants from the National Science Foundation and the National Institutes of Health.

Here are further details on the techniques and key findings of the four studies:

Examining data from a national survey that tracked 10,685 people over 20 years, the researchers found that self-reported sadness among participants was associated with being a smoker and with relapsing back into smoking one and two decades later. The sadder individuals were, the more likely they were to be smokers. Notably, other negative emotions did not show the same relationship with smoking.

Then the team designed an experiment to test causality: Did sadness cause people to smoke, or were negative life events causing both sadness and smoking? To test this, 425 smokers were recruited for an online study: one-third were shown a sad video clip about the loss of a life partner. Another third of the smokers were shown a neutral video clip, about woodworking; the final third were shown a disgusting video involving an unsanitary toilet. All participants were asked to write about a related personal experience. The study found that individuals in the sadness condition - who watched the sad video and wrote about a personal loss - had higher cravings to smoke than both the neutral group and the disgust group.

A similar approach in the third study measured actual impatience for cigarette puffs rather than mere self-reported craving. Similar to the second study, nearly 700 participants watched videos and wrote about life experiences that were either sad or neutral, and then were given hypothetical choices between having fewer puffs sooner or more puffs after a delay. Those in the sadness group proved to be more impatient to smoke sooner than those in the neutral group. That result built upon previous research findings that sadness increases financial impatience, measured with behavioral economics techniques.
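
Those "behavioral economics techniques" typically model impatience with hyperbolic discounting, in which a delayed reward's present value is V = A / (1 + kD), and a larger fitted k means steeper discounting, i.e., more impatience. The toy illustration below uses invented discount rates, not the study's data.

    # Hyperbolic discounting: V = A / (1 + k * D). Values are invented illustrations.
    def present_value(amount, delay_min, k):
        return amount / (1.0 + k * delay_min)

    # A higher discount rate k (more impatience) flips the preference toward
    # a smaller-sooner reward: 2 puffs now vs. 8 puffs in 60 minutes.
    for k in (0.01, 0.10):   # hypothetical rates for neutral vs. sad participants
        now, later = present_value(2, 0, k), present_value(8, 60, k)
        choice = "now" if now > later else "later"
        print(f"k = {k:.2f}: V(now) = {now:.2f}, V(later) = {later:.2f} -> choose {choice}")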

The fourth study recruited 158 smokers from the Boston area to test how sadness influenced actual smoking behavior. Participants had to abstain from smoking for at least eight hours (verified by carbon monoxide breath test). They were randomly assigned to sadness or neutral control groups; smokers sat in a private room at the Harvard Tobacco Research Laboratory, watched the sad video and wrote about great loss, or watched a neutral video and wrote about their work environment. Then they smoked their own brand through a device that tested the total volume of puffs and their speed and duration. The results: smokers in the sadness condition made more impatient choices and smoked greater volumes per puff.

Lerner said the research team was motivated in part by the deadly realities of smoking: tobacco use remains the leading cause of preventable death in the United States despite five decades of anti-smoking campaigns. The global consequences are also dire, with one billion premature deaths predicted across the world by the end of this century.

"We believe that theory-driven research could help shed light on how to address this epidemic," Dorison said. "We need insights across disciplines, including psychology, behavioral economics and public health, to confront this threat effectively."

Credit: 
Harvard Kennedy School

Machine learning shapes microwaves for a computer's eyes

image: An example of a wave pattern (right) and its intensity levels (left) developed by the machine learning algorithm to best illuminate the most important features of an object being identified.

Image: 
Mohammadreza Imani, Duke University

DURHAM, N.C. -- Engineers from Duke University and the Institut de Physique de Nice in France have developed a new method to identify objects using microwaves that improves accuracy while reducing the associated computing time and power requirements.

The system could provide a boost to object identification and speed in fields where both are critical, such as autonomous vehicles, security screening and motion sensing.

The new machine-learning approach cuts out the middleman: rather than creating an image for a human to analyze, it analyzes the raw data directly. It also jointly determines the optimal hardware settings for revealing the most important data while simultaneously discovering what that data is. In a proof-of-principle study, the setup correctly identified a set of 3D numbers using tens of measurements instead of the hundreds or thousands typically required.

The results appear online December 6 in the journal Advanced Science. The work is a collaboration between David R. Smith, the James B. Duke Distinguished Professor of Electrical and Computer Engineering at Duke, and Roarke Horstmeyer, assistant professor of biomedical engineering at Duke.

"Object identification schemes typically take measurements and go to all this trouble to make an image for people to look at and appreciate," said Horstmeyer. "But that's inefficient because the computer doesn't need to 'look' at an image at all."

"This approach circumvents that step and allows the program to capture details that an image-forming process might miss while ignoring other details of the scene that it doesn't need," added Aaron Diebold, a research assistant in Smith's lab. "We're basically trying to see the object directly from the eyes of the machine."

In the study, the researchers use a metamaterial antenna that can sculpt a microwave wavefront into many different shapes. In this case, the metamaterial is an 8x8 grid of squares, each of which contains electronic structures that allow it to be dynamically tuned to either block or transmit microwaves.

For each measurement, the intelligent sensor selects a handful of squares to let microwaves pass through. This creates a unique microwave pattern, which bounces off the object to be recognized and returns to another similar metamaterial antenna. The sensing antenna also uses a pattern of active squares to add further options to shape the reflected waves. The computer then analyzes the incoming signal and attempts to identify the object.

By repeating this process thousands of times for different variations, the machine learning algorithm eventually discovers which pieces of information are the most important as well as which settings on both the sending and receiving antennas are the best at gathering them.
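
Conceptually, this is end-to-end optimization: the antenna patterns are treated as trainable parameters in the same computational graph as the classifier, so gradients flow through a simulated physics model into the hardware settings. The PyTorch sketch below captures the idea with a random linear stand-in for wave propagation; all dimensions and data are hypothetical, and this is not the authors' code.

    # Hypothetical end-to-end sketch: jointly learn metamaterial on/off patterns
    # and a classifier. The "physics" is a fixed random linear stand-in, not the
    # authors' electromagnetic model; all dimensions and data are illustrative.
    import torch
    import torch.nn as nn

    N_ELEMENTS = 64       # the 8x8 grid of tunable squares
    N_MEASUREMENTS = 10   # "tens of measurements instead of hundreds or thousands"
    SCENE_DIM = 256       # flattened scene/object representation (assumed)
    N_CLASSES = 10        # e.g. the ten digits

    class JointSensorClassifier(nn.Module):
        def __init__(self):
            super().__init__()
            # Trainable logits; sigmoid gives a soft on/off state per square
            self.mask_logits = nn.Parameter(torch.randn(N_MEASUREMENTS, N_ELEMENTS))
            # Fixed random stand-in for propagation from antenna elements to scene
            self.register_buffer("propagation",
                                 torch.randn(N_ELEMENTS, SCENE_DIM) / SCENE_DIM ** 0.5)
            self.classifier = nn.Linear(N_MEASUREMENTS, N_CLASSES)

        def forward(self, scene):                    # scene: (batch, SCENE_DIM)
            masks = torch.sigmoid(self.mask_logits)  # (N_MEASUREMENTS, N_ELEMENTS)
            patterns = masks @ self.propagation      # illumination cast per measurement
            measurements = scene @ patterns.t()      # one scalar return per pattern
            return self.classifier(measurements)

    model = JointSensorClassifier()
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)
    loss_fn = nn.CrossEntropyLoss()
    for step in range(200):                          # toy loop on random data
        scene = torch.randn(32, SCENE_DIM)
        labels = torch.randint(0, N_CLASSES, (32,))
        opt.zero_grad()
        loss_fn(model(scene), labels).backward()
        opt.step()
    # After training, thresholding torch.sigmoid(model.mask_logits) gives the
    # on/off patterns to program into the physical sending/receiving antennas.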

"The transmitter and receiver act together and are designed together by the machine learning algorithm," said Mohammadreza Imani, research assistant in Smith's lab. "They are jointly designed and optimized to capture the features relevant to the task at hand."

"If you know your task, and you know what sort of scene to expect, you may not need to capture all the information possible," said Philipp del Hougne, a postdoctoral fellow at the Institut de Physique de Nice. "This co-design of measurement and processing allows us to make use of all the a priori knowledge that we have about the task, scene and measurement constraints to optimize the entire sensing process."

After training, the machine learning algorithm landed on a small group of settings that could help it separate the data's wheat from the chaff, cutting down on the number of measurements, time and computational power it needs. Instead of the hundreds or even thousands of measurements typically required by traditional microwave imaging systems, it could see the object in less than 10 measurements.

Whether or not this level of improvement would scale up to more complicated sensing applications is an open question. But the researchers are already trying to use their new concept to optimize hand-motion and gesture recognition for next-generation computer interfaces. There are plenty of other domains where improvements in microwave sensing are needed, and the small size, low cost and easy manufacturability of these types of metamaterials make them promising candidates for future devices.

"Microwaves are ideal for applications like concealed threat detection, identifying objects on the road for driverless cars or monitoring for emergencies in assisted-living facilities," said del Hougne. "When you think about all of these applications, you need the sensing to be as quick as possible, so we hope our approach will prove useful in making these ideas reliable realities."

Credit: 
Duke University

Ultrasound can make stronger 3D-printed alloys

image: 3D-printed titanium alloys under an electron microscope: the sample on the left, with large, elongated crystals, was printed conventionally, while the sample on the right, with finer, shorter crystals, was printed sitting on an ultrasonic generator.

Image: 
RMIT University

Researchers have used sound vibrations to shake metal alloy grains into tighter formation during 3D printing.

A study just published in Nature Communications shows that high-frequency sound waves can have a significant impact on the inner microstructure of 3D printed alloys, making them more consistent and stronger than those printed conventionally.

Lead author and PhD candidate from RMIT University's School of Engineering, Carmelo Todaro, said the promising results could inspire new forms of additive manufacturing.

"If you look at the microscopic structure of 3D printed alloys, they're often made up of large and elongated crystals," Todaro explained.

"This can make them less acceptable for engineering applications due to their lower mechanical performance and increased tendency to crack during printing."

"But the microscopic structure of the alloys we applied ultrasound to during printing looked markedly different: the alloy crystals were very fine and fully equiaxed, meaning they had formed equally in all directions throughout the entire printed metal part."

Testing showed these parts had a 12% improvement in tensile strength and yield stress compared with those made through conventional additive manufacturing.

The team demonstrated their ultrasound approach using two major commercial grade alloys: a titanium alloy commonly used for aircraft parts and biomechanical implants, known as Ti-6Al-4V, and a nickel-based superalloy often used in marine and petroleum industries called Inconel 625.

By simply switching the ultrasound generator on and off during printing, the team also showed how specific parts of a 3D printed object can be made with different microscopic structures and compositions, useful for what's known as functional grading.

Study co-author and project supervisor, RMIT's Distinguished Professor Ma Qian, said he hoped their promising results would spark interest in specially designed ultrasound devices for metal 3D printing.

"Although we used a titanium alloy and a nickel-based superalloy, we expect that the method can be applicable to other commercial metals, such as stainless steels, aluminium alloys and cobalt alloys," Qian said.

"We anticipate this technique can be scaled up to enable 3D printing of most industrially relevant metal alloys for higher performance structural parts or structurally graded alloys."

Credit: 
RMIT University

Getting to the heart of heart beats: Cardiac thin filament structure and function revealed

image: Schematic diagram of the muscle fiber structure (left) and the molecular structure of the entire thin filament (right)

Image: 
Osaka University

Osaka, Japan - Researchers at Osaka University used electron cryomicroscopy (cryoEM) to image essential cardiac muscle components, known as thin filaments, with unprecedented resolution. They also discovered the mechanism by which these filaments regulate the heartbeat via cardiac muscle contractions in the presence or absence of calcium ions by changing their conformations. This work may have application in the development of new drugs for treating heart conditions caused by mutations that affect these structures and functions.

The human heart is a remarkable organ, capable of pumping blood for a lifetime without rest. However, many of the details of its inner workings remain unknown, partly because the exact structures of its muscle proteins in their natural forms are difficult to image. This is especially true for "thin filaments" - tiny filamentous structures made up of proteins called actin, troponin, and tropomyosin - owing to their complex interactions and small size. It has long been known that cardiac muscle contraction is controlled by the repeated increase and decrease in the concentration of calcium ions within muscle cells and that the control of contraction is accomplished by changes in the structure of the thin filaments when these ions bind to them. However, the exact mechanism was unclear.

Now, by using cryoEM, a technique recognized with the 2017 Nobel Prize in Chemistry, researchers at Osaka University have revealed the highest-resolution structural images of these proteins to date. Conventional electron microscopy usually damages fragile biological samples, meaning that their native shape in the body cannot be determined. In cryoEM, by contrast, samples are flash-frozen, so proteins can be imaged while still in their native conformations.

"It has been very difficult to reveal the entire structure of the thin filament, but we succeeded in solving its structure using cryoEM and advanced image analysis," says first author Yurika Yamada. The Osaka team demonstrated how, in the absence of calcium ions, myosin access to the actin regions are blocked so that myosin heads cannot attach to them for muscle contraction. However, the binding of calcium ion changes the conformation of the thin filaments, exposing the attachment sites for contraction.

"Since many mutations in the component proteins of the thin filament are known to cause heart disease, including cardiac hypertrophy and cardiomyopathy, the revealed structures could provide a molecular and structural basis for novel drug design," explain senior authors Takashi Fujii and Keiichi Namba.

This research also highlights the power of cryoEM to reveal previously unseen anatomical detail with potential for yet unimagined medical breakthroughs.

Credit: 
Osaka University

Study finds deforestation is changing animal communication

Deforestation is changing the way monkeys communicate in their natural habitat, according to a new study.

This study, led by an anthropologist at the University of Waterloo, offers the first evidence in animal communication scholarship of differences in vocal behaviours in response to different types of forest edge areas.

Working in a tropical lowland rainforest in Costa Rica, the researchers examined how human-caused forest habitat changes have affected vegetation and, in turn, the rate and length of howling by the group-living howler monkey species.

Led by Laura Bolt, an adjunct professor of anthropology at Waterloo, the study compared how the communication behaviour of the mantled howler monkey differs in forest edges impacted by human activity, known as anthropogenic edges, compared to natural forest edges.

"Howler monkeys are well-known for making very loud, long-distance vocalizations called howls," said Bolt. "While howls are only produced by adult males, howl function is not entirely known, so we conducted our study to test the hypothesis that the intensity of howling by monkeys relates to defending ecological resources such as areas of richer vegetation or preferred feeding trees."

Anthropogenic areas were identified as areas within 50 meters of barbed wire fences marking the edge of the forest and the start of coconut plantations or cattle pasture, and natural forest edges as areas within 50 meters of a river.

The study found that males howl to defend high-quality resources, with notably longer durations of howling in the forest interior and at river edge areas where vegetation resources are richer. The researchers also found differences in howl length between river edge and anthropogenic edge areas, which is an important insight for conservation planning.

"Howler monkeys eat leaves and fruit, and if they are howling to defend these resources, we predicted that males would howl for longer durations of time when in a forest interior or near the river edge, where vegetation is richer compared to anthropogenic edge," said Bolt.

To conduct their study, the researchers collected data on mantled howler monkey howling behaviour from May to August in 2017 and 2018, following groups as they travelled across various edge and interior habitat zones. All monkey groups were well-habituated and did not react to the visible presence of the researchers.

With their evidence showing that anthropogenic deforestation is altering howler monkey behaviour, Bolt and her colleagues say that long-term howler monkey conservation initiatives should prioritize preservation of forest interior and river edge regions and re-forestation of human-caused forest edges.

"While it is yet unknown what implications these behavioural changes across different edge zones may have for monkey fitness," says Bolt, "our findings show that it is proximity to anthropogenic forest edge, rather than to naturally-occurring forest edge, that is changing howler monkey communication behaviour. This is just one of the many ways that howler monkeys are affected by deforestation."

Credit: 
University of Waterloo

'Resurrection ecology' of 600-year-old water fleas used to understand pollution adaptation

image: A hatching Daphnia (bottom) and a pair of dormant Daphnia eggs (top), isolated from an ephippium, a kind of protective case produced by the mother Daphnia in which the eggs get buried in the sediment.

Image: 
Dagmar Frisch

One of the leading threats to lakes since the rise of agriculture is fertilizer runoff, which raises phosphorus levels. High phosphorus can trigger devastating events like eutrophication, in which deadly algal blooms thrive on the phosphorus and, in the process, outcompete the rest of the lake for vital nutrients.

But learning how organisms adapt to eutrophication requires comparing samples from before and after the event, a seemingly impossible feat, since eutrophication only arose with modern agriculture about 100 years ago.

So, taking advantage of a unique genomic model organism, the tiny water flea Daphnia, an international team of researchers has now analyzed Daphnia from a phosphorus-rich Minnesota lake and compared them with revived, 600-year-old dormant Daphnia eggs found in the bottom sediments, to better understand how these creatures cope with a dramatic environmental change.

Dagmar Frisch and her colleagues, based in the University of Birmingham's School of Biosciences, used the 'resurrection ecology' of Daphnia and new analysis tools to perform the study. The team was only able to make these discoveries by comparing the responses of modern Daphnia with their 600-year-old ancestors.

Both the modern and the ancient samples studied came from the same lake in Minnesota where eutrophication first started at the beginning of the 20th century. "We used existing data and state-of-the-art analytical methods to connect patterns of gene expression with the physiological responses that allow these animals to deal with increased environmental phosphorus," said author Dagmar Frisch, an expert in environmental paleogenomics. "This allowed us to identify which part of the gene network was accountable for the newly evolved response."

"Because Daphnia is such a central species in aquatic ecosystems, our study ultimately improves our understanding of how aquatic ecosystems can mitigate some of the effects of eutrophication, one of the major global threats to freshwater environments," said co-author Dörthe Becker, an expert in environmental 'omics'.

They were able to show that a large cluster of several hundred genes in modern-day Daphnia is uniquely adapted to high phosphorus levels. Many of these genes are involved in vital, core metabolic pathways necessary for Daphnia survival.

"We used network analysis methods to find out which genes 'communicate' with others to form clusters (or "modules"), and how this gene communication has changed in a keystone species over the last 600 years. In addition, we were able to connect these modules with particular observed traits, which was achieved for the first time in resurrection ecology," said co-author Marcin Wojewodzic.

"Our study emphasizes that evolution is a result of molecular fine-tuning that happens on different layers, ranging from basic cellular responses to complex physiological traits" said Becker. "The approach we used allows a more holistic view of how animals can and do respond to environmental change, and by that improve our understanding of organisms as integrated units of biological organization," said Frisch.

Next, the team will continue to explore how these networks and other molecular processes, including epigenetics, play a role in evolutionary adaptation to changing environments.

Credit: 
SMBE Journals (Molecular Biology and Evolution and Genome Biology and Evolution)

Findings on education, malnutrition 'deeply disturbing'

SEATTLE--Despite progress toward global education targets, a new study reveals that 1 in 10 women ages 20-24 in low- and middle-income countries had zero years of schooling in 2017, and 1 in 6 had not completed primary school.

For the first time, researchers have mapped years of education and child malnutrition across all low- and middle-income countries at the level of individual districts. The findings include precision maps illuminating disparities within countries and regions often obscured by national-level analyses.

Nations with districts where high proportions of women had zero years of education in 2017 included Afghanistan, Niger, and the Gambia.

The research showed that gender inequality in education persists in many regions, with men achieving more years of education than women overall. A gap of more than three years between men and women was observed in nearly 140 districts in Yemen, Sudan, South Sudan, Nigeria, Kenya, the Democratic Republic of the Congo, Angola, and Afghanistan. The granularity of the estimates also shows where this gap varies widely between states, districts, or provinces.

The United Nations' Sustainable Development Goals set a target of universal secondary education by 2030. In 2017, fewer than 1% of the districts studied were close to meeting this goal for both men and women. The vast majority of those were in Uzbekistan, and the remainder in the Philippines.

"We know that education is closely related to people's health and well-being, particularly the health of mothers and children," said Dr. Simon I. Hay, senior author of the study and Director of the Local Burden of Disease (LBD) group at IHME. "This study enables all of us - teachers, educators, researchers, and policymakers - to look at disparities not just between countries, but at the level of individual communities."

Researchers also mapped "child growth failure" (CGF), defined as insufficient height and weight for a given age and exhibited by stunting, wasting, and underweight among children under 5.

The results show that 1 in 4 children living in the countries studied still suffered at least one dimension of malnutrition, and reveal inequality within countries doing well and those doing poorly. In Kenya, for example, there was a 9-fold difference in wasting between Tetu constituency and Turkana East constituency.

Predictions based on current trajectories for CGF estimate that only 5 low- and middle-income countries will achieve WHO goals for both stunting and wasting in all units. However, the analysis identifies priority regions where interventions could be targeted, as well as areas doing well. In Peru, the Presupuesto por Resultados (Results-Based Budgeting) program, which includes community-level strategies, has been praised by the World Bank as a key driver in halving stunting levels in less than three years.

Both studies, conducted by IHME, analyzed low- and middle-income countries from 2000 to 2017. Interactive maps accompanying the publications visualize the results for each country and allow users to look at change over time.

Many countries demonstrated progress in education. South Africa, Peru, and Colombia saw substantial improvement over the study period. Researchers found that progress on women's education nationally often correlated with improved equality within a country. But in some countries, including India and Nigeria, national progress occurred in tandem with increased inequality across the study period.

In India, the proportion of women aged 20-24 achieving secondary education increased from 11% to 37% during the study period, and in Nigeria, the same demographic progressed from 12% to 45% attainment. However, the analysis showed that the majority of progress was driven by urban regions, particularly Maharashtra, India, and Lagos, Nigeria. Nigeria remained one of the countries with the world's highest inequality in education in 2017.

The studies were published January 9 in print in the journal Nature. They build on previous mapping of African nations published in 2018.

Credit: 
Institute for Health Metrics and Evaluation

Online patient tool is associated with increased likelihood of receiving kidney transplant

image: Regular use of online patient portals by hemodialysis patients is associated with greater likelihood of receiving a renal transplant.

Image: 
American College of Surgeons

CHICAGO (January 8, 2020): Taking a more active role in one's own health is known to promote better outcomes, but it is especially critical for patients who are waiting for a kidney transplant. Patients with kidney failure who actively used an online patient portal to track the status of their health care improved their chances of getting a kidney transplant and shortened their wait times for an operation, according to an "article in press" published ahead of print on the website of the Journal of the American College of Surgeons.

Keeping up with medical treatments can be difficult for patients with kidney failure, yet failing to adhere to medication and treatment goals can jeopardize their health and delay a transplant procedure.

"We wanted to look at patient factors and tools outside the traditional medical system that can help dialysis patients achieve autonomy and better outcomes," said study coauthor Polina Zmijewski, MD, a general surgery resident at Rhode Island Hospital and the Warren Alpert Medical School of Brown University, both in Providence, R.I. After patients receive a kidney transplant, they must take suppressive medication to prevent organ rejection and attend regular medical appointments to guide them in maintaining a well-functioning transplanted kidney to stay healthy, she explained.

Online patient portals are beneficial because they allow users to conveniently access their personal health records, view lab tests, and schedule appointments. But these tools may also help improve clinical outcomes. Patient autonomy and proactive behavior are important to the process.

"It's like you get more invested in the game if you are able to keep score," said Dr. Zmijewski. "Our theory was that patients who were able to keep score by using an online patient portal would be more involved in their health care and would better comply with medical treatments, which would lead to better health, making it more likely that they would be the recipient of a kidney transplant."

For the study, researchers examined the medical records of 264 patients who were seen at two outpatient dialysis centers associated with Rhode Island Hospital. The patients were divided into two groups based on their use of MyLifespan, an online health record that allows patients to track their medical progress via the web or a smartphone.

Non-users were patients who never used the online tool; "active" users were patients who logged in at least once every two months, and as often as seven times per month.

Of the 264 patients who regularly received hemodialysis treatments, 38 were considered active MyLifespan users. Although the active users represented a small group, the researchers found they were a viable group to use for comparison. "For dialysis patients, their outcomes are easily measurable, whether or not they receive a renal [kidney] transplant," Dr. Zmijewski pointed out. The researchers compared the MyLifespan users with 226 non-users, matched for similar factors such as length of time on dialysis, age, sex, and race. Researchers then looked at whether active users were more likely to get a kidney transplant at three, four and five years after initiation of dialysis.

The researchers found that at three years from the start of hemodialysis, 5 percent of both users and non-users of the online tool received kidney transplants. However, the chances of receiving a transplant differed over longer periods of time. By four years, 23 percent of users received a kidney transplant versus 13 percent of non-users; and at five years, 40 percent of users received transplants, compared with only 14 percent of non-users.
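
As a quick plausibility check on those five-year figures, one can run a standard two-proportion test on counts reconstructed from the reported percentages (38 active users, 226 non-users). The reconstruction is approximate, and this is not the authors' analysis.

    # Approximate two-proportion check of the reported five-year transplant rates.
    # Counts are reconstructed from percentages; this is not the authors' analysis.
    from scipy.stats import chi2_contingency

    users_tx, users_n = round(0.40 * 38), 38          # ~15 of 38 active users
    nonusers_tx, nonusers_n = round(0.14 * 226), 226  # ~32 of 226 non-users

    table = [[users_tx, users_n - users_tx],
             [nonusers_tx, nonusers_n - nonusers_tx]]
    chi2, p, dof, expected = chi2_contingency(table)
    print(f"{users_tx}/{users_n} vs {nonusers_tx}/{nonusers_n}: p = {p:.4f}")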

"We found that patients who were active users of the portal tended to get kidney transplants more frequently, and they were more likely to receive transplants within five years of initiation of dialysis," Dr. Zmijewski said. The authors note that when a patient is more cooperative with medical treatments, the patient is "more likely to be healthy enough to undergo transplant surgery."

Another difference was how often patients logged into the online portal. While an exact shortened timeframe was not given, the researchers noted an overall positive correlation between frequency of logging in and a shorter time to transplant (p = 0.0067).

The connection can be explained by the fact that non-compliance with dialysis treatments may mean a weaker relationship with health care providers, poorer health and, ultimately, removal from the transplant wait list.

"The patient portal is a very good bridge from the dependent culture of hemodialysis to the independent culture of kidney transplant," said Dr. Zmijewski.

These results provide new evidence that patient portals that improve adherence to treatment goals are a valuable clinical tool that goes beyond convenience. For dialysis patients especially, portals can be used as a tool of autonomy that helps patients transition from dialysis to transplantation.

The take home message is that "improving patient involvement and empowering patients to participate in their own care is really the key to improving patient outcomes," said Dr. Zmijewski.

In addition to the small sample size, another limitation of the study was the inability of researchers to categorize the socioeconomic characteristics of users versus non-users due to the lack of data regarding the income and education level of the hemodialysis patients.

The next step will be to develop a pilot study that follows dialysis patients using the MyLifespan system. The aim of the project will be to more thoroughly examine if, and how, this type of intervention can help patients improve their health outcomes.

Credit: 
American College of Surgeons

Cellular clock regulating human spine development

video: Cell signaling molecules, lit up in green, peak each time the segmentation clock ticks in human induced pluripotent stem (iPS) cells.

Image: 
Pourquié lab/Harvard Medical School

More than 20 years ago, the lab of developmental biologist Olivier Pourquié discovered a sort of cellular clock in chicken embryos where each "tick" stimulates the formation of a structure called a somite that ultimately becomes a vertebra.

In the ensuing years, Pourquié and others further illuminated the mechanics of this so-called segmentation clock across many organisms, including creation of the first models of the clock in a lab dish using mouse cells.

While the work has improved knowledge of normal and abnormal spine development, no one has been able to confirm whether the clock exists in humans--until now.

Pourquié led one of two teams that have now created the first lab-dish models of the segmentation clock using stem cells derived from adult human tissue.

The achievements not only provide the first evidence that the segmentation clock ticks in humans but also give the scientific community the first in vitro systems enabling the study of very early spine development in humans.

"We know virtually nothing about human development of somites, which form between the third and fourth week after fertilization, before most people know they're pregnant," said Pourquié, professor of genetics in the Blavatnik Institute at Harvard Medical School and a principal faculty member of the Harvard Stem Cell Institute. "Our system should be a powerful one to study the underlying regulation of the segmentation clock."

"Our innovative experimental system now allows us to compare mouse and human development side by side," said Margarete Diaz-Cuadros, a graduate student in the Pourquié lab and co-first author of the Harvard Medical School-led paper, published Jan. 8 in Nature. "I am excited to unravel what makes human development unique."

Both models open new doors for understanding developmental conditions of the spine, such as congenital scoliosis, as well as diseases involving tissues that arise from the same region of the embryo, known as the paraxial mesoderm. These include skeletal muscle and brown fat in the entire body, as well as bones, skin and lining of blood vessels in the trunk and back.

Pourquié hopes that researchers will be able to use the new stem cell models to generate differentiated tissue for research and clinical applications, such as skeletal muscle cells to study muscular dystrophy and brown fat cells to study type 2 diabetes. Such work would provide a foundation for devising new treatments.

"If you want to generate systems that are useful for clinical applications, you need to understand the biology first," said Pourquié, who is also the Harvard Medical School Frank Burr Mallory Professor of Pathology at Brigham and Women's Hospital. "Then you can make muscle tissue and it will work."

Although scientists have derived many kinds of tissue by reprogramming adult cells into pluripotent stem cells and then coaxing them along specific developmental paths, musculoskeletal tissue proved stubborn. In the end, however, Pourquié and colleagues discovered that they could facilitate the transformation by adding just two chemical compounds to the stem cells while they were bathed in a standard growth culture medium.

"We can produce paraxial mesoderm tissue with about 90 percent efficiency," said Pourquié. "It's a remarkably good start."

His team created a similar model derived from embryonic mouse cells.

The HMS researchers were surprised to find that the segmentation clock began ticking in both the mouse and human cell dishes and that the cells didn't first need to be arranged on a 3D scaffold more closely resembling the body.

"It's pretty spectacular that it worked in a two-dimensional model," said Pourquié. "It's a dream system."

The team found that the segmentation clock ticks every 5 hours in the human cells and every 2.5 hours in the mouse cells. The difference in frequency parallels the difference in gestation time between mice and humans, the authors said.

Among the next projects for Pourquié's lab are investigating what controls the clock's variable speed and, more ambitiously, what regulates the length of embryonic development in different species.

"There are many very interesting problems to pursue," he said.

A third group publishing in the same issue of Nature uncovered new insights into how cells synchronize in the segmentation clock using mouse embryos engineered to incorporate fluorescent proteins.

Pourquié is senior author of the HMS-led paper. Postdoctoral researcher Daniel Wagner of HMS is co-first author. Additional authors are affiliated with Kyoto University, RIKEN Center for Brain Science and Brandeis University.

Pourquié has started a company called Anagenesis Biotechnologies based on protocols developed for this study. It uses high-throughput screening to search for cell therapies for musculoskeletal diseases and injuries.

Credit: 
Harvard Medical School

Smartphone cameras can speed up urinary tract infection diagnosis

image: Dr Nuno Reis from the University of Bath has developed a test to detect E. coli in urine samples that uses a smartphone camera

Image: 
University of Bath

Biological engineers at the University of Bath have developed a test that could help medics quickly diagnose urinary tract infections (UTIs), using a normal smartphone camera.

Similar in principle to a pregnancy test, the process can identify the presence of harmful E. coli bacteria in a urine sample in just 25 minutes. As well as being far faster than existing testing, it has the potential to be made portable and far cheaper than lab-based tests, which could make accurate UTI testing more widely available in developing nations and remote regions.

E. coli is present in 80 percent of bacterial UTIs, so if it is found it tells medical professionals that an antibiotic treatment is needed.

As well as a smartphone camera, the test, which could be adapted to detect a variety of bacterial infections, takes advantage of widely available reagents and new micro-engineered materials. Researchers say the simplicity of the test, which has now passed the proof-of-concept stage, could deliver a new way to quickly identify treatments for patients in poorer or remote regions.

Described in the journal Biosensors and Bioelectronics, the test uses antibodies to capture bacterial cells in very thin capillaries within a plastic strip, detecting and identifying the cells optically rather than through the microbiological methods currently used.

Dr Nuno Reis, from Bath's Department of Chemical Engineering, led the development of the test. He says: "The test is small and portable - so it has major potential for use in primary care settings and in developing countries.

"Currently, bacterial infections in UTIs are confirmed via microbiological testing of a urine sample. That is accurate, but time-consuming, taking several days. We hope that giving medical professionals the ability to quickly rule in or rule out certain conditions will allow them to treat patients more quickly and help them make better decisions about the prescription of antibiotics."

The lack of rapid diagnostics for UTIs has in many cases led to a catch-all prescription of potentially unnecessary antibiotics, which increases the risk of bacteria becoming resistant to treatment - accepted as one of the biggest threats to global health and development.

How the test works

The test is carried out by passing a urine sample over a ridged plastic micro-capillary strip, containing an immobilising antibody able to recognise E. coli bacterial cells. If E. coli is present in the sample, antibodies in the reagents will bind with it, stopping it from passing through the section of plastic strip. Finally, an enzyme is added that causes a change in colour that can be picked up by a smartphone camera.

The system also measures the concentration of E. coli in the sample by analysing an image taken by the camera. The procedure is simple and could be manually operated or fully automated without any need for a mains power supply.
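
In outline, that analysis step reduces to reading pixel intensities in the capillary region of the photo and mapping them to a concentration through a calibration curve built from samples of known concentration. A bare-bones sketch follows; the region coordinates, choice of color channel and calibration constants are all hypothetical.

    # Bare-bones sketch of estimating E. coli concentration from a smartphone photo.
    # The ROI, color channel and calibration constants are hypothetical placeholders.
    import numpy as np
    from PIL import Image

    def estimate_concentration(image_path, slope, intercept):
        img = np.asarray(Image.open(image_path).convert("RGB"), dtype=float)
        roi = img[100:200, 50:350]            # capillary strip region (assumed)
        # Darker color from the enzyme reaction = stronger signal (assumed readout)
        signal = 255.0 - roi[:, :, 1].mean()  # mean green-channel intensity, inverted
        return slope * signal + intercept     # linear calibration against standards

    # slope and intercept would come from photographing samples of known
    # concentration beforehand, e.g. np.polyfit(signals, known_concs, 1).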

Aims to overcome regulators' concerns on smartphone use

To date, bodies such as the United States Food & Drug Administration (FDA) have not granted approval to techniques that use smartphones - citing the potential for both non-lab conditions and software updates to the phone to make tests unscientific. But Dr Reis hopes that the way the test uses a variable scale to digitally compare the pixels within an image will convince regulators to allow the test to move toward eventual production.

Dr Reis adds that wealthier nations could also benefit from adopting the methodology, as it could make testing within primary care facilities such as GP surgeries more viable, reducing the need to send samples to central labs for testing.

He says: "The UK and wealthy countries have seen a big shift to decentralised diagnostics to reduce the load on national or regional labs and provide doctors with important tools to make informed diagnoses.

"Driving more of this will bring better outcomes to patients in terms of speeding up the process, but will also lower the cost to healthcare providers. We are not talking about replacing centralised diagnostics services but providing the first point of contact with affordable and rapid tools to support prescription of antibiotics and avoid their overuse."

The next step for the process is clinical trials, which will require collaboration with clinical and commercial partners. Beyond this, the team will shortly begin working on refining the test to allow for the detection of other bacteria and their concentrations, which will help prescribe correct dosages and avoid the overuse of antibiotics.

Dr Reis concludes: "The smartphone solves one of the biggest problems of decentralising diagnostics because its capabilities are actually very sophisticated in certain conditions. Smartphones offer the same functionality as scanners that have until now been available only in labs."

Credit: 
University of Bath

The start of biological spring in Africa is linked to the number of hours of sunshine

Experts from the University of Seville have published a study in which they determine that the start of the greening of vegetation in Africa (equivalent to the start of spring) is directly connected to the number of hours of sunshine per day; that is to say, it is the "photoperiod" that controls this process, and not the arrival of the first rains, as was believed until now. The work has been published in Communications Biology, a new journal from the Nature group.

Vegetation growth is driven by environmental factors such as photoperiod, precipitation, temperature, hours of sunshine and the availability of nutrients. In Africa, however, there is ambiguity as to which of these are the key drivers of vegetation growth, which generates uncertainty when predicting the impact of global warming on terrestrial ecosystems and when representing them in dynamic vegetation models.

Using satellite data, researchers from the Faculty of Geography and History at the University of Seville have carried out a systematic analysis of the relationship between vegetation phenology (that is, the timing of seasonal changes) and environmental factors. The study, covering different regions of Africa, has revealed that it is not just one but a combination of environmental factors that influences the start and end of the vegetation growing season, the most important of which is the number of hours of sunshine.

"Consequently, to improve our predictions on this impact of climate change, the role of the photoperiod should be incorporated into the modelling of vegetation, climate and ecosystems. In addition, it is necessary to define clearly the response of vegetation to the interaction between a signal level of hours of constant light and the year-on-year variation in other factors, especially in a changing climate", explains the University of Seville teacher Víctor F. Rodríguez.

Credit: 
University of Seville

Hundreds of novel viruses discovered in insects

image: In a current study, scientists discovered hundreds of new viruses in insects.

Image: 
Photo: Paraskevopoulou/Charité

New disease-causing viruses often come from animals. Well-known examples of this are the Zika virus transmitted by mosquitoes, bird flu viruses, and the MERS virus associated with camels. In order to identify new viral diseases quickly and prevent possible epidemics, DZIF scientists at Charité - Universitätsmedizin Berlin are targeting their search at viruses in animals. In a current study, they have now discovered hundreds of novel viruses in insects. The results have been published in PLOS Pathogens.

"Every new virus we find could be a cause of illnesses that was previously unknown, both in humans and in livestock," explains Prof. Dr. Christian Drosten, Director of the Institute of Virology on Campus Charité Mitte. The scientist is a specialist for virus discovery and diagnostics at the German Center for Infection Research (DZIF). For example, his team has defined the international standard approach for diagnosing MERS. He is currently focusing on rare virus diagnoses using new sequencing techniques. "The more viruses we identify and add to our database, the easier it is for us to recognise the cause of new and unusual illnesses," says Prof. Drosten.

In the current study, the research team has made use of the largest international transcriptome database on insects, a kind of catalog of gene activity, and searched the data it contains for virus genomes. While scientists have previously concentrated on mosquitoes and other blood-feeding insects, this study includes all groups of insects. The team systematically investigated viruses with negative-strand RNA genomes, a group that includes important pathogens, among them the viruses that cause Ebola and measles, as well as rabies and lung infections.

In a total of 1,243 insect species, the researchers discovered viruses that can be classified into at least 20 new genera. "This is probably the largest sample of animals ever screened for new viruses," says Prof. Drosten. The working group has already added the new insect viruses to its search databases. With the help of these data, it will now be possible to investigate cases of rare and unusual illnesses in humans. This includes patients who display all the symptoms of a viral infection, yet in whom no virus can be identified. "In such cases, we use high-throughput sequencing methods to search for all the viruses present in the patient," explains the virologist. "If the patient has a virus, we will find it, provided it is in our database or has similarities with a virus in our database." The chances of the search being successful will increase thanks to the addition of the new insect viruses.
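
At its core, the database search described here is sequence matching: reads from a patient sample are compared against known viral genomes, with tolerance for imperfect similarity. Real pipelines use alignment tools such as BLAST; the toy k-mer index below, with invented sequences and parameters, shows the underlying idea.

    # Toy k-mer index showing how sequencing reads can be matched against a viral
    # reference database. Sequences and k are invented placeholders.
    from collections import defaultdict

    K = 8   # k-mer length; real pipelines use larger k or alignment tools like BLAST

    def build_index(references):
        index = defaultdict(set)
        for name, seq in references.items():
            for i in range(len(seq) - K + 1):
                index[seq[i:i + K]].add(name)
        return index

    def classify_read(read, index):
        hits = defaultdict(int)
        for i in range(len(read) - K + 1):
            for name in index.get(read[i:i + K], ()):
                hits[name] += 1               # count shared k-mers per reference
        return max(hits, key=hits.get) if hits else None

    refs = {"toy_virus_A": "ATGGCGTACGTTAGCCGTATCGGA",
            "toy_virus_B": "TTGACCGGTATCCGATTGCAACGT"}
    index = build_index(refs)
    print(classify_read("GCGTACGTTAGCC", index))   # -> toy_virus_A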

As part of the DZIF project "Virus detection and preparedness", the scientists at Charité will continue to focus on anticipating and detecting future viral threats.

Credit: 
Charité - Universitätsmedizin Berlin