
Vitamin B1 deficiency a key factor in the development of alcohol-related dementia

(Vienna, 09 September 2020) A common consequence of chronically high alcohol consumption is a decline in cognitive function, which can even progress to full-blown dementia. However, we do not yet fully understand how alcohol damages the brain. A research group led by Stephan Listabarth from MedUni Vienna's Department of Psychiatry and Psychotherapy, Division of Social Psychiatry, has now developed a hypothesis whereby iron deposits in the brain - resulting from alcohol-induced vitamin B1 deficiency - can be regarded as key factors in cognitive decline. The work has been published in the leading journal "Alzheimer's & Dementia".

In Austria, around 5% of the population aged 15 and over are alcohol dependent. This means that approximately 365,000 people are affected by the dangerous health consequences associated with high alcohol consumption. One of these consequences is a decline in cognitive function, especially memory and abstraction, which is then referred to as alcohol-related dementia. However, we do not yet fully understand the exact pathomechanism, that is to say the way in which the brain is damaged by alcohol.

Researchers Stephan Listabarth, Daniel König and Benjamin Vyssoki from the Department of Psychiatry and Psychotherapy, Division of Social Psychiatry at MedUni Vienna and Simon Hametner from MedUni Vienna's Department of Neurology, Division of Neuropathology and Neurochemistry, have now advanced a plausible hypothesis to explain alcohol-induced brain damage: the cognitive deterioration is caused by iron deposits in the brain, and the administration of vitamin B1 could protect the brain from these deposits.

We know from various neurodegenerative diseases that iron deposits in the brain are responsible for nerve tissue damage. These deposits can also be detected in specific regions of the brain (including the basal ganglia) in people who drink a lot of alcohol. The hypothesis advanced by the study authors also offers an explanation as to why iron deposits are so prevalent in this patient group: high alcohol consumption leads to elevated iron levels in the blood and also to a deficiency of vitamin B1 (thiamine), which, among other things, is important for maintaining the blood-brain barrier. If these two situations coincide, more iron is deposited inside the brain, ultimately leading to oxidative tissue damage.

This newly described role of vitamin B1 in this process could represent a huge step forward in our understanding of the development of alcohol-related neurological damage and, in particular, could offer a new point of attack for preventive and therapeutic approaches. It would then be conceivable to give continuous vitamin B1 substitution in future as a preventive measure.

The researchers believe it would also be useful to evaluate the use of drugs to reduce iron levels (e.g. chelators), as is already done in other neurodegenerative diseases. The authors of the current work have already started planning a prospective clinical study to validate the above-mentioned relationship between alcohol dependency, vitamin B1 deficiency and cerebral iron deposits and to provide a basis for further research in the field of alcohol-related dementia in the future.

Credit: 
Medical University of Vienna

Climate engineering: Modelling projections oversimplify risks

Climate change is gaining prominence as a political and public priority. But many ambitious climate action plans foresee the use of climate engineering technologies whose risks are insufficiently understood. In a new publication, researchers from the Institute for Advanced Sustainability Studies in Potsdam, Germany, describe how evolving modelling practices are trending towards "best-case" projections. They warn that over-optimistic expectations of climate engineering may reinforce the inertia with which industry and politics have been addressing decarbonisation. In order to forestall this trend, they recommend more stakeholder input and clearer communication of the premises and limitations of model results.

The paper focuses on the models underpinning the Intergovernmental Panel on Climate Change (IPCC) Assessment Reports - the first port of call for mapping combinations of technologies, alternative pathways of deployment, and climatic impacts. The authors show how modelling of solar radiation management and carbon dioxide removal technologies tends toward "best-case" projections. According to their analysis, the poorly substantiated promises delivered by these projections influence research, policy, and industry planning in the near term and may already be entrenching carbon infrastructures. In the case of certain kinds of carbon dioxide removal, for example, the prospect of future carbon capture is sometimes wrongly seen as a substitute for present mitigation.

Climate models are not neutral

The researchers outline ways in which this trend can be forestalled. They propose mechanisms for increasing stakeholder input and strengthening political realism in modelling. "The portrayal of modelling as explorative, technically focused mappings for supporting decision making is simplistic. Modellers have to choose parameters and design scenarios. Their choices cannot be 'neutral' - scenarios reflect hidden judgments and create benchmarks for further conversation, whether in assessment, or in technology and policy development," says co-author Sean Low. For that reason, there needs to be more transparency about the ways in which models are constructed, perceived, and applied. Efforts to expand modelling "reality checks" with technology experts, social scientists, and a wide range of users are a pragmatic first step.

Glossing over fine print can lead to big problems

The scientific community must also be wary of the selective use of projections. Projections offer schemes that are stylised, optimised, and deceptively simple. By abstracting from possible technical failures and messy politics, they can create a false sense of certainty regarding the feasibility of a particular course of action. But it would be wrong to treat them as substitutes for existing climate action plans or as instruction manuals. Since modelling projections can offer only partial depictions of systemic risk, it is problematic if political and industry interests co-opt a stylised version for pre-existing agendas and gloss over the models' fine print.

Much governance work ahead

The authors emphasise the need for policy guardrails: "In climate governance the devil really does lie in the details. The inertia of the carbon economy requires that significant efforts are made to prevent particular and short-term interests undermining policy integrity," says co-author Matthias Honegger. In addition to more transparent modelling, a lot of careful policy development and governance work is needed to ensure that solar radiation management and carbon dioxide removal technologies play a constructive role in future climate policy.

Credit: 
Research Institute for Sustainability (RIFS) – Helmholtz Centre Potsdam

Humans, not climate, have driven rapidly rising mammal extinction rate

image: Verreaux's sifaka

Image: 
Tobias Andermann

Human impact can explain ninety-six percent of all mammal species extinctions of the last hundred thousand years, according to a new study published in the scientific journal Science Advances.

Over the last 126,000 years, there has been a 1600-fold increase in mammal extinction rates, compared to natural levels of extinction. According to the new study, this increase is driven almost exclusively by human impact.

Human impact larger than the effects of climate

The study further shows that even prehistoric humans already had a significant destructive impact on biodiversity - one that was even more destructive than the largest climatic changes of Earth's recent history, such as the last ice age.

"We find essentially no evidence for climate-driven extinctions during the past 126,000 years Instead, we find that human impact explains 96% of all mammal extinctions during that time", asserts Daniele Silvestro, one of the researchers.

This is at odds with views of some scholars, who believe that strong climatic changes were the main driving force behind most pre-historic mammal extinctions. Rather, the new findings suggest that in the past mammal species were resilient, even to extreme fluctuations in climate.

"However, current climate change, together with fragmented habitats, poaching, and other human-related threats pose a large risk for many species", says Daniele Silvestro.

Analyses based on large global data set

The researchers' conclusions are based on a large data set of fossils. They compiled and analyzed data on 351 mammal species that have gone extinct since the beginning of the Late Pleistocene. Among many others, these included iconic species such as mammoths, sabre-toothed tigers, and giant ground sloths. Fossil data provided by the Zoological Society of London were an important contribution to the study.

"These extinctions did not happen continuously and at constant pace. Instead, bursts of extinctions are detected across different continents at times when humans first reached them. More recently, the magnitude of human driven extinctions has picked up the pace again, this time on a global scale", says Tobias Andermann from the University of Gothenburg.

Extinction rates will increase further, if nothing is done

The current extinction rate of mammals is likely the largest extinction event since the end of the dinosaur era, according to the researchers. Using computer-based simulations, they predict that these rates will continue to rise rapidly - possibly reaching up to 30,000-fold above the natural level by the year 2100 - if current trends in human behavior and biodiversity loss continue.

"Despite these grim projections, the trend can still be changed. We can save hundreds if not thousands of species from extinction with more targeted and efficient conservation strategies. But in order to achieve this, we need to increase our collective awareness about the looming escalation of the biodiversity crisis, and take action in combatting this global emergency. Time is pressing. With every lost species, we irreversibly lose a unique portion of Earth's natural history", concludes Tobias Andermann.

Credit: 
University of Gothenburg

Lumpy proteins stiffen blood vessels of the brain

Deposits of a protein called "Medin", which manifest in virtually all older adults, reduce the elasticity of blood vessels during aging and hence may be a risk factor for vascular dementia. Experts from the German Center for Neurodegenerative Diseases (DZNE) and the Hertie Institute for Clinical Brain Research (HIH) at the University of Tübingen report on this in the scientific journal PNAS. The researchers regard these deposits as targets for future therapies. Their findings are based on studies in mice and the analysis of human tissue samples.

Nearly all people over the age of 50 are known to have tiny lumps of the protein Medin in the walls of their blood vessels. "These deposits are apparently a side effect of the aging process. They are predominantly found in the aorta and in blood vessels of the upper body, including those of the brain. Most surprisingly, in our study we could not only detect Medin particles in brain tissue samples from deceased individuals but also in old mice - despite the limited lifespan of these animals," said Dr. Jonas Neher, head of the current study and a scientist at the DZNE's Tübingen site and the HIH.

Medin is considered problematic, because it belongs to a group of molecules called "amyloids" that are often associated with pathological conditions. A prominent example is "amyloid beta", whose aggregates are involved in Alzheimer's disease. "It has been assumed for quite some time that Medin aggregates have an unfavorable effect on blood vessels and can contribute to vascular diseases. Recent studies support this hypothesis. According to these previous findings, older adults with vascular dementia show increased amounts of Medin deposits compared to healthy individuals," said Neher.

Sluggish vessels

However, despite these suspicious signs, there has not yet been conclusive evidence that the protein lumps are actually harmful. A research team led by Neher has now succeeded in proving this - enabled by their finding that Medin deposits also form in aging mice. Their study is the result of a collaborative effort also involving scientists from Frankfurt, Munich, Liverpool and London. "We investigated how quickly blood vessels in the brain can dilate. For this we compared aged mice that have Medin deposits with mice that genetically lack Medin and therefore do not develop Medin deposits," said Neher. Such studies are difficult to conduct in humans, he explained: "Almost all older adults have Medin aggregates. Therefore, it is almost impossible to compare people of about the same age with and without aggregates."

Neher's team observed that mice - analogous to humans - show increasing amounts of Medin particles in their blood vessels with advancing age. "In this respect, the mouse seems to adequately mimic the situation in humans," said Neher.

In mice, the researchers also found that when the brain is active and a higher blood supply is needed, blood vessels with Medin deposits expand more slowly than those without Medin. "Brain vessels with Medin appear to be less flexible and therefore react more sluggishly." However, the ability of the vessels to expand rapidly is important for regulating blood flow and providing the brain with an optimal supply of oxygen and nutrients, the researcher said. "If this ability is impaired, it can have far-reaching consequences for the functioning of organs." Medin deposits therefore seem to contribute to the deterioration of blood vessel function at an advanced age. "And this is probably not only the case in the brain, because the deposits also occur in other blood vessels and could therefore lead not only to vascular dementia but also to cardiovascular disease."

Neher cannot give a definite answer about the mechanisms by which the Medin particles act on the blood vessels. However, he does have a theory: "Fibers run in the vessel wall that allow the blood vessel to stretch and contract. Since the protein deposits are embedded in the vessel wall, they may interfere with the function of these elastic fibers."

Therapeutic target

Medin is derived in an as yet unknown way from a larger protein that is involved in the formation of new blood vessels, amongst other things. "If this precursor molecule could be stabilized with drugs, the production of Medin might be modulated. Alternatively, the break-down of Medin aggregates could be stimulated. This could help to maintain vascular and brain health in old age. However, there are no such medicines available to date," said Neher. "It is therefore important to see Medin as a risk factor that almost every older adult carries within him or herself. Although Medin affects a really large group of people, it has so far received little attention in therapy research. Our findings suggest that it should move more into the spotlight."

Credit: 
DZNE - German Center for Neurodegenerative Diseases

Development of photovoltaics that can be applied like paint for real-life application

image: Dr. Hae Jung Son's team at KIST implemented high-efficiency solar cell technology over a large area through a solution process method that utilizes spin coating.

Image: 
Korea Institute of Science and Technology (KIST)

Researchers in Korea have successfully developed a high-efficiency, large-area, solution processable organic solar cell by controlling the speed at which the solution of raw materials solidifies after being coated. The team led by Dr. Hae Jung Son from the Photo-electronic Hybrids Research Center of the Korea Institute of Science and Technology (KIST) announced that they have identified the difference in the mechanism of film formation between small-area and large-area organic solar cells in the solution process and, by resolving the related process issues, developed high-efficiency large-area organic photovoltaics.

If a photovoltaic material is made in the form of paint that can be applied to any surface, such as the exterior of a building or a car, it will be possible to achieve energy self-sufficiency and provide low-cost eco-friendly energy to those suffering from energy poverty. Not only that, it will be easy to utilize space for installation of photovoltaics even on urban buildings, and ideally, the photovoltaic panels will be maintained by re-applying the "paint."

When it comes to solution processable solar cells, which are made by coating a surface with the solar cell solution, the photoactive area that generates electricity still remains at laboratory scale (less than 0.1 cm²). When the coating is applied over a large area to produce enough electric power to be practical, performance and reproducibility drop because of material- and process-related limitations, and this has been an obstacle to commercialization.

Dr. Son's team at KIST revealed that commercially available organic materials crystallize easily, which makes them unsuitable for large-area solution processes. In the large-area solution process used in industry, the solvent in which the solar cell material is dissolved evaporates slowly as the film forms, resulting in agglomeration and other phenomena that lower the efficiency of the solar cell. In the spin coating method, the small-area process employed in laboratory research, the substrate is rapidly rotated during film formation to speed up solvent evaporation, which makes it possible to form a film without this loss of efficiency.

Based on this information, KIST researchers developed high-performance large-area organic photovoltaics by controlling the solvent evaporation rate following the coating step in a large-area solution process as a way to form a film optimized for solar cell performance. As a result, high-efficiency large-area organic photovoltaics with 30% higher power conversion efficiency than existing photovoltaics were attained.

Dr. Son said, "The core design principles of solar cell materials capable of high-quality large-area using the solution will accelerate the development of solution processable solar cells in the future. [This study] has contributed to not only raising the efficiency of next-generation solution processable solar cells but also the development of core technology for manufacturing large-area solar cell materials required for commercialization."

Credit: 
National Research Council of Science & Technology

Wild cousins may help crops battle climate change

image: A cross pollination is made by the field team in Lebanon to bring together modern varieties with wild relatives.

Image: 
Michael Major

Earth is getting hotter. Huge amounts of greenhouse gases are warming the planet and altering the climate. Heat waves are harsher. Droughts are longer. And some diseases and pests are stronger than ever.

All of that is bad news for many of Earth's inhabitants. But crops are especially vulnerable. We've bred them to depend on us, and they can succumb to many threats that are likely to get worse in the next century - all while we need more food to feed a growing population.

An international group of researchers set out to test how we can help our crops adapt in the coming decades. Their idea is to use wild crop relatives.

These cousins of domestic crops look like weeds and you have probably walked past them when hiking on mountain trails. You may have even seen them in the cracks of pavement in the cities. They have lived in harsh climates without any human help since the dawn of time.

Scientists hope that using crop wild relatives in breeding programs can add resilience to our domestic crops while keeping them delicious.

"Crop wild relatives have been selected by nature over millennia to withstand the very climatic stresses that we are trying to address, and hence present a new hope," says Filippo Bassi. Bassi is a scientist in Morocco at the International Center for Agricultural Research in the Dry Areas (ICARDA).

But it can be risky to change how breeders work. "Before making the final decision to shift investments from normal breeding to the use of crop wild relatives, it is critical to make sure that there is a real advantage in doing so," Bassi says.

To test this idea, Bassi's international team of scientists, coming from Africa, Europe, Asia and South America, focused on durum wheat.

The team gathered 60 unique varieties of wheat to expose to a battery of harsh tests. These included fungal diseases, drought and high temperatures. One-third of the wheat lines the team used were developed by combining wild relatives of wheat with strong, commercial varieties.

These wild relative-derived varieties of wheat were robust compared to more conventional varieties. About a third of wild relative varieties were resistant to the fungal disease Septoria, compared to just a tenth of the others. But conventional wheat varieties were more resistant to other diseases, like leaf rust, that have been the focus of past breeding programs.

Where the wild relative wheat varieties really shone was under drought and heat stress. During drought, the wild relative lines had larger grains, a critical adaptation and market trait for this crop. And, when the nutrient nitrogen was in short supply, the wild-derived lines produced a higher yield than the other wheat varieties.

"In the case of temperature, the crop wild relative presented a clear advantage with a yield increase of 42 percent under heat stress," says Bassi. "Yield losses to heat can be drastic, and the use of crop wild relatives to breed new varieties appears to be a very strategic approach to address this climatic challenge."

But resilience isn't the whole story. We depend on crops to make food. And crops are different from their wild cousins in large part because humans have selected crops over many centuries to adapt to their needs, including a preference for making delicious foods.

That is why Bassi's team also looked at the usefulness of the 60 wheat varieties for making pasta. Here, the wild-derived wheat lines were the least suitable for pasta making. "That's a disappointment," says Bassi. "But not a deal breaker."

"This does not prove that the use of crop wild relatives will inevitably result in poor industrial quality," says Bassi. "But rather that it is important for breeders to be aware of this risk and develop breeding strategies that address this issue."

Overall, durum wheat's wild relatives appeared useful. When crossed to elite commercial varieties, they provided increased resistance to heat, drought and some diseases. These are precisely the threats facing not just durum wheat, but most major crops in a warming world. That's good news for plant breeders -- and the public.

"The crop wild relatives showed great promise in terms of climate change adaptation," says Bassi. "I hope the public will be re-assured that breeders are testing all possible opportunities to prepare agriculture for climate challenges."

Credit: 
American Society of Agronomy

New perception metric balances reaction time, accuracy

PITTSBURGH--Researchers at Carnegie Mellon University have developed a new metric for evaluating how well self-driving cars respond to changing road conditions and traffic, making it possible for the first time to compare perception systems for both accuracy and reaction time.

Mengtian Li, a Ph.D. student in CMU's Robotics Institute, said academic researchers tend to develop sophisticated algorithms that can accurately identify hazards, but may demand a lot of computation time. Industry engineers, by contrast, tend to prefer simple, less accurate algorithms that are fast and require less computation, so the vehicle can respond to hazards more quickly.

This tradeoff is a problem not only for self-driving cars, but also for any system that requires real-time perception of a dynamic world, such as autonomous drones and augmented reality systems. Yet until now, there's been no systematic measure that balances accuracy and latency -- the delay between when an event occurs and when the perception system recognizes that event. This lack of an appropriate metric has made it difficult to compare competing systems.

The new metric, called streaming perception accuracy, was developed by Li, together with Deva Ramanan, associate professor in the Robotics Institute, and Yu-Xiong Wang, assistant professor at the University of Illinois at Urbana-Champaign. They presented it last month at the virtual European Conference on Computer Vision, where it received a best paper honorable mention award.

Streaming perception accuracy is measured by comparing the output of the perception system at each moment with the ground-truth state of the world.

"By the time you've finished processing inputs from sensors, the world has already changed," Li explained, noting that the car has traveled some distance while the processing occurs.

"The ability to measure streaming perception offers a new perspective on existing perception systems," Ramanan said. Systems that perform well according to classic measures of performance may perform quite poorly on streaming perception. Optimizing such systems using the newly introduced metric can make them far more reactive.

One insight from the team's research is that the solution isn't necessarily for the perception system to run faster, but to occasionally take a well-timed pause. Skipping the processing of some frames prevents the system from falling farther and farther behind real-time events, Ramanan added.

Another insight is to add forecasting methods to the perception processing. Just as a batter in baseball swings at where they think the ball is going to be -- not where it is -- a vehicle can anticipate some movements by other vehicles and pedestrians. The team's streaming perception measurements showed that the extra computation necessary for making these forecasts doesn't significantly harm accuracy or latency.
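A toy version of that forecasting idea, assuming simple constant-velocity motion (an assumption of this sketch, not necessarily the forecasting model used in the paper): rather than reporting where an object was in the last processed frame, the system extrapolates its position to the time at which the result will actually be used.

```python
# Toy latency-compensating forecast (illustrative; assumes constant velocity).

def forecast_position(last_pos, prev_pos, last_t, prev_t, target_t):
    """Linearly extrapolate an object's position to target_t from its two
    most recent detections."""
    velocity = (last_pos - prev_pos) / (last_t - prev_t)
    return last_pos + velocity * (target_t - last_t)

# A pedestrian detected at x = 2.0 m (t = 0.9 s) and x = 2.5 m (t = 1.0 s);
# processing takes 0.2 s, so report the position expected at t = 1.2 s.
print(forecast_position(2.5, 2.0, 1.0, 0.9, 1.2))  # 3.5 m, rather than the stale 2.5 m
```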

Credit: 
Carnegie Mellon University

Virtual tourism could offer new opportunities for travel industry, travelers

image: Dr. Arni S.R. Srinivasa Rao

Image: 
Phil Jones, Senior Photographer, Augusta University

A new proposal for virtual travel, using advanced mathematical techniques and combining livestream video with existing photos and videos of travel hotspots, could help revitalize an industry that has been devastated by the coronavirus pandemic, according to researchers at the Medical College of Georgia at Augusta University.

In a new proposal published in Cell Patterns, Dr. Arni S.R. Srinivasa Rao, a mathematical modeler and director of the medical school's Laboratory for Theory and Mathematical Modeling, and co-author Dr. Steven Krantz, a professor of mathematics and statistics at Washington University, suggest using data science to improve on existing television and internet-based tourism experiences. Their technique involves measuring and then digitizing the curvatures and angles of objects and the distances between them using drone footage, photos and videos, and could make virtual travel experiences more realistic for viewers and help revitalize the tourism industry, they say.

They call this proposed technology LAPO, or Live Streaming with Actual Proportionality of Objects. LAPO employs both information geometry - the measures of an object's curvatures, angles and area - and conformal mapping, which uses the measures of angles between the curves of an object and accounts for the distance between objects, to make images of people, places and things seem more real.
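As a purely illustrative aside (a sketch of the general mathematical idea, not of the LAPO pipeline itself): the defining property of a conformal map is that it preserves the angles at which curves cross. The short Python check below verifies this numerically for the analytic map f(z) = z², whose derivative rotates and scales both tangent directions equally, so the angle between them is unchanged.

```python
# Numerical check of angle preservation under the conformal map f(z) = z^2
# (an illustration of conformality in general, not the authors' method).

import cmath

def crossing_angle(dir1, dir2):
    """Angle between two direction vectors given as complex numbers."""
    return abs(cmath.phase(dir2 / dir1))

# Two curves crossing at z0 with tangent directions d1 and d2.
z0 = 1 + 1j
d1 = 1 + 0j
d2 = cmath.exp(1j * 0.7)        # direction rotated by 0.7 rad relative to d1

# Under f(z) = z^2 both tangents are multiplied by f'(z0) = 2*z0,
# so the angle between them is preserved.
fprime = 2 * z0
print(crossing_angle(d1, d2))                    # ~0.7
print(crossing_angle(fprime * d1, fprime * d2))  # ~0.7 as well
```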

"This is about having a new kind of technology that uses advanced mathematical techniques to turn digitized data, captured live at a tourist site, into more realistic photos and videos with more of a feel for the location than you would get watching amovie or documentary," says corresponding author Rao. "When you go see the Statue of Liberty for instance, you stand on the bank of the Hudson River and look at it. When you watch a video of it, you can only see the object from one angle. When you measure and preserve multiple angles and digitize that in video form, you could visualize it from multiple angles. You would feel like you're there while you're sitting at home."

Their proposed combination of techniques is novel, Rao says. "Information geometry has seen wide applications in physics and economics, but the angle preservation of the captured footage is never applied," he says.

Rao and Krantz say the technology could help mediate some of the pandemic's impact on the tourism industry and offer other advantages.

Those include cost-effectiveness, because virtual tourism would be cheaper; health safety, because it can be done from the comfort of home; time savings, because travel time is eliminated; accessibility, because tourism hotspots that are not routinely accessible to seniors or those with physical disabilities would become so; safety and security, because risks such as becoming a victim of crime while traveling are eliminated; and no need for special equipment - a standard home computer with a graphics card and internet access is all that's needed to enjoy a "virtual trip."

"Virtual tourism (also) creates new employment opportunities for virtual tour guides, interpreters, drone pilots, videographers and photographers, as well as those building the new equipment for virtual tourism," the authors write.

"People would pay for these experiences like they pay airlines, hotels and tourist spots during regular travel," Rao says. "The payments could go to each individual involved in creating the experience or to a company that creates the entire trip, for example."

Next steps include looking for investors and partners in the hospitality, tourism and technology industries, he says.

If the pandemic continues for several more months, the World Travel and Tourism Council, the trade group representing major global travel companies, projects a global loss of 75 million jobs and $2.1 trillion in revenue.

Credit: 
Medical College of Georgia at Augusta University

Green light therapy shown to reduce migraine frequency, intensity

image: A study by University of Arizona Health Sciences researchers found that green light therapy resulted in about a 60% reduction in the pain intensity of the headache phase and number of days per month people experienced migraine headaches.

Image: 
University of Arizona Health Sciences/Kris Hanning

New research from the University of Arizona Health Sciences found that people who suffer from migraine may benefit from green light therapy, which was shown to reduce the frequency and intensity of headaches and improve patient quality of life.

According to the Migraine Research Foundation, migraine is the third most prevalent illness in the world, affecting 39 million people in the United States and 1 billion worldwide.

"This is the first clinical study to evaluate green light exposure as a potential preventive therapy for patients with migraine, " said Mohab Ibrahim, MD, PhD, lead author of the study, an associate professor in the UArizona College of Medicine - Tucson's Department of Anesthesiology, Pharmacology, and Neurosurgery and director of the Chronic Pain Management Clinic. "As a physician, this is really exciting. Now I have another tool in my toolbox to treat one of the most difficult neurological conditions - migraine."

Overall, green light exposure reduced the number of headache days per month by an average of about 60%. A majority of study participants - 86% of episodic migraine patients and 63% of chronic migraine patients - reported a more than 50% reduction in headache days per month. Episodic migraine is characterized by up to 14 headache days per month, while chronic migraine is 15 or more headache days per month.

"The overall average benefit was statistically significant. Most of the people were extremely happy," Dr. Ibrahim said of the participants, who were given light strips and instructions to follow while completing the study at home. "One of the ways we measured participant satisfaction was, when we enrolled people, we told them they would have to return the light at the end of the study. But when it came to the end of the study, we offered them the option to keep the light, and 28 out of the 29 decided to keep the light."

Dr. Ibrahim and co-author Amol Patwardhan, MD, PhD, who are affiliated with the UArizona Health Sciences Comprehensive Pain and Addiction Center, have been studying the effects of green light exposure for several years. This initial clinical study included 29 people, all of whom experience episodic or chronic migraine and failed multiple traditional therapies, such as oral medications and Botox injections.

"Despite recent advances, the treatment of migraine headaches is still a challenge," said Dr. Patwardhan, an associate professor and the vice chair of research in the Department of Anesthesiology. "The use of a nonpharmacological therapy such as green light can be of tremendous help to a variety of patients that either do not want to be on medications or do not respond to them. The beauty of this approach is the lack of associated side effects. If at all, it appears to improve sleep and other quality of life measures."

During the study, patients were exposed to white light for one to two hours a day for 10 weeks. After a two-week break, they were exposed to green light for 10 weeks. They completed regular surveys and questionnaires to track the number of headaches they experienced and the intensity of those headaches, as well as quality of life measurements such as the ability to fall and stay asleep or to perform work.

Using a numeric pain scale of 0 to 10, participants noted that green light exposure resulted in a 60% reduction in pain, from 8 to 3.2. Green light therapy also shortened the duration of headaches, and it improved participants' ability to fall and stay asleep, perform chores, exercise, and work.

None of the study participants reported any side effects of green light exposure.

"In this trial, we treated green light as a drug," Dr. Ibrahim said. "It's not any green light. It has to be the right intensity, the right frequency, the right exposure time and the right exposure methods. Just like with medications, there is a sweet spot with light."

Dr. Ibrahim has been contacted by physicians from as far away as Europe, Africa and Asia, all asking for the green light parameters and schematic design for their own patients.

"As you can imagine, LED light is cheap," he said. "Especially in places where resources are not that available and people have to think twice before they spend their money, when you offer something affordable, it's a good option to try."

The paper, "Evaluation of green light exposure on headache frequency and quality of life in migraine patients: A preliminary one-way cross-over clinical trial," was published online by Cephalalgia, the journal of the International Headache Society.

"These are great findings, but this is where the story begins," Dr. Ibrahim said. "As a scientist, I am really interested in how this works because if I understand the mechanism, then I can utilize it for other conditions. I can use it as a tool to manipulate the biological systems to achieve as much as we can."

Credit: 
University of Arizona Health Sciences

Hoarding and herding during the COVID-19 pandemic

Rushing to stock up on toilet paper before it vanished from the supermarket aisle, stashing cash under the mattress, purchasing a puppy or perhaps planting a vegetable patch - the COVID-19 pandemic has triggered some interesting and unusual changes in our behavior.

Understanding the psychology behind economic decision-making, and how and why a pandemic might trigger responses such as hoarding, is the focus of a new paper published in the Journal of Behavioral Economics for Policy.

'Hoarding in the age of COVID-19' by behavioral economist Professor Michelle Baddeley, Deputy Dean of Research at the University of Technology Sydney (UTS) Business School, examines a range of cross-disciplinary explanations for hoarding and other behavior changes observed during the pandemic.

"Understanding these economic, social and psychological responses to COVID-19 can help governments and policymakers adapt their policies to limit negative impacts, and nudge us towards better health and economic outcomes," says Professor Baddeley.

Governments around the world have established behavioral insights units to help guide public policy and influence public decision-making and compliance.

Hoarding behavior, where people collect or accumulate things such as money or food in excess of their immediate needs, can lead to shortages, or in the case of hoarding cash, have negative impacts on the economy.

"In economics, hoarding is often explored in the context of savings. When consumer confidence is down, spending drops and households increase their savings if they can, because they expect bad times ahead," explains Professor Baddeley.

"Fear and anxiety also have an impact on financial markets. The VIX 'fear' index of financial market volatility saw a dramatic 564% increase between November 2019 and March 2020, as investors rushed to move their money into 'safe haven' investments such as bonds."

While shifts in savings and investments in the face of a pandemic might make economic sense, the hoarding of toilet paper, which also occurred across the globe, is more difficult to explain in traditional economic terms, says Professor Baddeley.

Behavioral economics reveals that our decisions are not always rational or in our long-term interest, and can be influenced by a wide range of psychological factors and unconscious biases, particularly in times of uncertainty.

"Evolved instincts dominate in stressful situations, as a response to panic and anxiety. During times of stress and deprivation, not only people but also many animals show a propensity to hoard."

Another instinct that can come to the fore, particularly in times of stress, is the desire to follow the herd, says Professor Baddeley, whose book 'Copycats and Contrarians' explores the concept of herding in greater detail.

"Our propensity to follow others is complex. Some of our reasons for herding are well-reasoned. Herding can be a type of heuristic: a decision-making short-cut that saves us time and cognitive effort," she says.

"When other people's choices might be a useful source of information, we use a herding heuristic and follow them because we believe they have good reasons for their actions. We might choose to eat at a busy restaurant because we assume the other diners know it is a good place to eat.

"However numerous experiments from social psychology also show that we can be blindly susceptible to the influence of others. So when we see others rushing to the shops to buy toilet paper, we fear of missing out and follow the herd. It then becomes a self-fulfilling prophesy."

Behavioral economics also highlights the importance of social conventions and norms in our decision-making processes, and this is where rules can serve an important purpose, says Professor Baddeley.

"Most people are generally law abiding but they might not wear a mask if they think it makes them look like a bit of a nerd, or overanxious. If there is a rule saying you have to wear a mask, this gives people guidance and clarity, and it stops them worrying about what others think.

"So the normative power of rules is very important. Behavioral insights and nudges can then support these rules and policies, to help governments and business prepare for second waves, future pandemics or other global crises."

Credit: 
University of Technology Sydney

Discovery challenges the foundations of gene therapy

image: Marti Cabanes-Creus in the Translational Vectorology lab at Children's Medical Research Institute.

Image: 
Children's Medical Research Institute, Westmead, Sydney Australia

A new publication by scientists from Children's Medical Research Institute has challenged one of the foundations of the gene therapy field and will help to improve strategies for treating serious genetic disorders of the liver.

The paper, titled "Restoring the natural tropism of AAV2 vectors for human liver", was published in Science Translational Medicine today.

Adeno-associated virus 2 (AAV2) is a viral vector that is used to deliver gene therapy to the liver. It works as a delivery vehicle to carry therapeutic DNA to the target cells in the body. The way it does this is by binding a 'receptor' on the target cell, a molecule that tells the vector it is in the right place and helps to deliver its cargo into cells. However, clinical trials targeting diseases of the liver have had an unexpectedly low success rate using this vector and now the researchers from CMRI appear to have discovered the reason.

The teams of Dr. Leszek Lisowski, Head of the Translational Vectorology Research Unit, and Prof Ian Alexander, Head of the Gene Therapy Research Unit, have found that the original AAV2, which is commonly used in preclinical and clinical studies, binds to its attachment receptor, heparan sulfate proteoglycans (HSPGs), but too tightly. Because HSPGs are found in many places in the body, not just on liver cells, the vector gets "trapped" before it reaches its intended destination. Therefore, very few vectors manage to deliver their therapeutic cargo to the liver, which greatly diminishes the therapeutic efficacy.

This led the CMRI teams to study naturally occurring adeno-associated viruses, which they found were much more successful at delivering the therapy into the liver. These viruses use another receptor that is yet to be discovered. CMRI researchers are now able to make vectors in the lab that use this better receptor, instead of HSPGs, potentially making the next generation of gene therapy targeting the liver vastly more successful.

"This really challenges a basic concept in our field that binding strongly to HSPG was essential for AAV's entry into human cells and suggests that vectors targeting the other receptor used by natural AAVs, of human liver origin, are likely to be more effective for clinical gene therapy applications'' Dr. Lisowski said. "The prototypical AAV2, discovered over 50yrs ago, is the serotype on which the entire field of AAV vectorology and gene therapy is based. Our discovery will shake the foundations of the field of AAV-based gene therapeutics and will mark the beginning of a new era not only for biomedical research, but most importantly, for millions of patients affected by genetic disorders"

"It sheds new light and challenges our previous understanding and corrects misconceptions about how the vector binds to the cells,'' he added.

Lead author on the publication, Dr. Marti Cabanes-Creus, said they could now move forward to improve on the use of vectors to help children with liver conditions. "It will help us understand previous clinical data and how to improve on these," he said.

"By having a better vector, we can increase the safety and improve the efficiency. Because a lower dose will be needed to achieve therapeutic efficacy, the cost of those therapies will be decreased, which is an additional benefit to the patients, their families, and the healthcare system."

Dr. Cabanes-Creus added that "The lessons learned can potentially be extended to other tissues, beyond the liver, making this a very impactful study which will change the trajectory of AAV-based gene therapies."

Credit: 
Children's Medical Research Institute

New method prevents quantum computers from crashing

image: Newly developed methods ensure that the loss of individual qubits does not disrupt a quantum computer.

Image: 
Uni Innsbruck/Harald Ritsch

Qubits--the carriers of quantum information--are prone to errors induced by undesired environmental interactions. These errors accumulate during a quantum computation and correcting them is thus a key requirement for a reliable use of quantum computers.

It is by now well known that quantum computers can withstand a certain amount of computational errors, such as bit flip or phase flip errors. However, in addition to computational errors, qubits might get lost altogether. Depending on the type of quantum computer, this can be due to actual loss of particles, such as atoms or ions, or due to quantum particles transitioning for instance to unwanted energy states, so that they are no longer recognized as a qubit. When a qubit gets lost, the information in the remaining qubits becomes scrambled and unprotected, rendering this process a potentially fatal type of error.

Detect and correct loss in real time

A team of physicists led by Rainer Blatt from the Department of Experimental Physics at the University of Innsbruck, in collaboration with theoretical physicists from Germany and Italy, has now developed and implemented advanced techniques that allow their trapped-ion quantum computer to adapt in real-time to loss of qubits and to maintain protection of the fragile stored quantum information. "In our trapped-ion quantum computer, ions hosting the qubits can be trapped for very long times, even days", says Innsbruck physicist Roman Stricker. "However, our ions are much more complex than a simplified description as a two-level qubit captures. This offers great potential and additional flexibility in controlling our quantum computer, but unfortunately it also provides a possibility for quantum information to leak out of the qubit space due to imperfect operations or radiative decay." Using an approach developed by Markus Müller's theoretical quantum technology group at RWTH Aachen University and Forschungszentrum Jülich, in collaboration with Davide Vodola from the University of Bologna, the Innsbruck team has demonstrated that such leakage can be detected and corrected in real-time. Müller emphasizes that "combining quantum error correction with correction of qubit loss and leakage is a necessary next step towards large-scale and robust quantum computing."

Widely applicable techniques

The researchers had to develop two key techniques to protect their quantum computer from the loss of qubits. The first challenge was to detect the loss of a qubit in the first place: "Measuring the qubit directly was not an option as this would destroy the quantum information that is stored in it", explains Philipp Schindler from the University of Innsbruck. "We managed to overcome this problem by developing a technique where we used an additional ion to probe whether the qubit in question was still there or not, without disturbing it", explains Martin Ringbauer. The second challenge was to adapt the rest of the computation in real-time in case the qubit was indeed lost. This adaptation is crucial to unscramble the quantum information after a loss and maintain protection of the remaining qubits. Thomas Monz, who led the Innsbruck team, emphasizes that "all the building blocks developed in this work are readily applicable to other quantum computer architectures and other leading quantum error correction protocols."
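A loose classical analogy (our own toy example, not the trapped-ion protocol) may help illustrate why detecting the loss matters so much: once the position of a missing carrier is known, the fault behaves like an erasure, which a code can recover from far more easily than from an error at an unknown location.

```python
# Classical toy analogy: a three-fold repetition code recovering from the
# loss of one carrier whose position is known (detected but not read out).

import random

def encode(bit):
    return [bit, bit, bit]

def lose_one(codeword):
    """Simulate loss of a random carrier; its position is flagged as None."""
    lost = random.randrange(len(codeword))
    return [None if i == lost else b for i, b in enumerate(codeword)]

def decode_with_known_loss(codeword):
    """Because the loss location is known, any surviving copy suffices."""
    survivors = [b for b in codeword if b is not None]
    return survivors[0]

damaged = lose_one(encode(1))
print(damaged, "->", decode_with_known_loss(damaged))  # always recovers 1
```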

Credit: 
University of Innsbruck

Feline leukaemia virus infection: A clinical and epidemiological enigma

image: The outcome of feline leukaemia virus infection portrayed as a set of balance scales. The most significant outcome is progressive infection, whereby the virus has the upper hand over the cat's immune response

Image: 
Regina Hofmann-Lehmann

Feline leukaemia virus (FeLV) is a gammaretrovirus that occurs worldwide in domestic cats, as well as small wild cats. It is associated with various serious, and sometimes fatal, diseases including anaemia, immunosuppression and certain cancers. First described over 55 years ago, FeLV has been the subject of intense research interest, which has led to increasingly robust diagnostic assays and efficacious vaccines. While the prevalence of this infection in domestic cats has fallen in many geographic regions, the disease is still something of an enigma and can spread quickly, particularly within naïve 'multi-cat' populations such as shelters and breeding catteries, as well as within pet homes with multiple cats. An important goal in reducing the prevalence further is to understand the FeLV status of every cat at risk of infection.

A state-of-the-art Premier Review published in the Journal of Feline Medicine and Surgery this month aims to contribute diagnostic expertise to veterinarians in practice by reviewing recent insights into infection pathogenesis, gained using molecular techniques.1 Writing for an international audience of veterinary practitioners and feline researchers, Professors Regina Hofmann-Lehmann, of the University of Zurich, Switzerland, and Katrin Hartmann, of LMU Munich, Germany, explain that not only are there several different outcomes of FeLV infection, but that these can vary over time. These outcomes, newly classified as 'progressive', 'regressive', 'focal' and 'abortive' infection, can, the authors suggest, be helpfully thought of in terms of a set of balance scales, with the cat's immune response on one side and the virus on the other.

From an epidemiological point of view, it is the progressively infected cat that is most significant. In these infections, the virus has the upper hand - these cats shed high numbers of FeLV particles and pose an infection risk to other cats. Regardless of their health status, progressively infected cats need to be kept apart from FeLV-naïve companions. From a clinical point of view, progressively infected cats are a priority too: they are at high risk of succumbing to potentially fatal disease; though, if well cared for, many can continue to live a healthy and happy life, sometimes for years.

Of the other possible outcomes, abortive infection is the most favourable for the cat - these cats have strong anti-FeLV immunity. Regressively infected cats will have developed a partially effective antiviral immune response that can keep the virus in check; however, they probably never clear the infection completely, and can shed virus, and thus pose an infection risk, in the early phase of infection or if reactivation occurs. In focal infection, which is comparatively rare, the cat's immune system keeps viral replication sequestered in certain tissues.

When it comes to FeLV testing, seemingly perplexing or 'discordant' test results are not uncommon, particularly in the early phase of infection, and can pose considerable challenges for the practitioner needing to establish the FeLV status and implement appropriate therapeutic and epidemiological measures. The authors discuss the most frequently used methods for FeLV detection, including free FeLV p27 antigen testing, viral RNA testing and FeLV provirus testing, focusing on when to test and how to interpret a positive or a negative result. The detection of anti-FeLV antibodies, including a point-of-care test for FeLV p15E introduced recently onto the European market, is also discussed. A diagnostic algorithm produced by the European Advisory Board on Cat Diseases (ABCD) that provides guidance on which test to choose in which scenario is incorporated within the review article.

As well as being expert members of the ABCD, both authors were members of an expert panel for recently published consensus guidelines from the American Association of Feline Practitioners (AAFP) on feline retrovirus testing and management,2 and together the guidelines and review article present the current state of knowledge about this potentially deadly virus. Discussing their ambition for their article, Professors Hofmann-Lehmann and Hartmann comment: 'We hope that this review will not only increase awareness of this fatal but preventable disease, but also help veterinarians in clinical practice when diagnosing this remarkable but tricky infection'.

Credit: 
SAGE

More cats might be COVID-19 positive than first believed, study suggests

A newly published study looking at cats in Wuhan, where the first known outbreak of COVID-19 began, shows more cats might be contracting the disease than first believed.

Researchers from Huazhong Agricultural University, in the Chinese city, took blood samples from 102 cats between January and March 2020, following the first outbreak. Nasal and anal swabs were also collected.

Reporting their findings in peer-reviewed journal Emerging Microbes & Infections, they show COVID-19 antibodies present in 15 of the blood samples taken from the cats. Of these, 11 cats had neutralizing antibodies - proteins that bind so successfully to a virus they block the infection.

None of the cats actually tested positive for COVID-19 or displayed obvious symptoms and, according to the results of return visits, none of these felines have died.

The sample included 46 cats abandoned at 3 animal shelters, 41 from 5 pet hospitals, and 15 from COVID-19 patient families.

The three cats with the highest levels of antibodies were all owned by patients who had been diagnosed with COVID-19, while there were also signs that cats had been infected by other cats among those that were abandoned (4) or housed in pet hospitals (4).

Commenting on the findings, lead author Meilin Jin states that whilst there is currently no evidence for cat-to-human transmission, precautions should be considered.

"Although the infection in stray cats could not be fully understood, it is reasonable to speculate that these infections are probably due to the contact with SARS-CoV-2 polluted environment, or COVID-19 patients who fed the cats.

"Therefore measures should be considered to maintain a suitable distance between COVID-19 patients and companion animals such as cats and dogs, and hygiene and quarantine measures should also be established for those high-risk animals."

The team assessed the type of antibody reactions in thorough detail and were able to describe the dynamic characteristics of the antibodies found.

Among their many findings about the antibodies, they saw that the type of reaction produced by the cats resembles that observed in seasonal coronavirus infections, implying that cats which have had a SARS-CoV-2 infection "remain at risk of re-infection".

The authors note that a similarly transient antibody response has also been observed in humans, and that their study should be used going forward as a "reference for the clinical treatment and prevention of COVID-19".

"We suggest that cats have a great potential as an animal model for assessing the characteristic of antibody against SARS-CoV-2 in humans," they add.

From here, the team state that more research is needed to establish the route of transmission of COVID-19 from humans to cats.

"Retrospective investigation confirmed that all of antibody positive samples were taken after the outbreak, suggesting that the infection of cats could be due to the virus transmission from humans to cats. Certainly, it is still needed to be verified via investigating the SARS-CoV-2 infections before this outbreak in a wide range of sampling," Jin states.

Credit: 
Taylor & Francis Group

Sampling the gut microbiome with an ingestible pill

Gut microbes affect human health, but there is still much to learn, in part because they're not easy to collect. But researchers now report in ACS Nano that they have developed an ingestible capsule that in rat studies captured bacteria and other biological samples while passing through the gastrointestinal (GI) tract.

Currently, researchers obtain gut microbes by collecting stool samples or using techniques such as colonoscopy or endoscopy. However, stool samples can't capture all the microorganisms in the upper GI tract, and they can't keep microbes from different parts of the tract separate. Colonoscopy and endoscopy are invasive procedures, which deters some patients. Sarvesh Kumar Srivastava and colleagues wanted to avoid these drawbacks by designing a device that could be swallowed and then eliminated.  

The researchers developed a self-polymerizing reaction system of poly(ethylene glycol) diacrylate monomer, iron chloride and ascorbic acid -- all loaded into tiny hollow cylinders. The cylindrical microdevices were packaged in miniature gelatin capsules, which were coated with a protective layer to prevent digestion in the stomach's acidic environment. After they were fed to rats, the capsules remained protected in the stomach but disintegrated in the small intestine's more-neutral pH, releasing the microdevices. Exposure to intestinal fluid caused the cylinders' chemical cargo to polymerize, forming a hydrogel that trapped microbes and protein biomarkers in its surroundings, much like an instant snapshot of the intestine. The devices, which didn't cause inflammation or toxicity, were then surgically removed -- a step that the researchers say will be replaced by natural elimination in future. High-throughput sequencing studies showed that the bacterial population the devices captured closely resembled that of the gut. The researchers also demonstrated that these tiny cylinders could be triggered over a range of pH to deliver biologics, like insulin, to cells in a petri dish in the presence of intestinal mucus. This technology could advance understanding of host-microbiome interactions, providing insight into associated GI disease progression and paving the way for personalized gut therapies, the team says.

Credit: 
American Chemical Society