Culture

Solving the mysteries of water and air underground

image: PVC pipes painted with rust, known as IRIS tubes, help track how much oxygen is in the soil. When there isn't enough oxygen, microbes convert the rust into a soluble form of iron, which washes away. These tubes are drying after being extracted from the experiment and rinsed.

Image: 
J.C. Fiola

Stand outside and look underneath your feet. There, perhaps under some grass, is the soil. On a dry day, the spaces in the soil near the surface are filled with air. And some distance farther down, those spaces are filled entirely with water. So, what's in between?

That's the capillary fringe. And it might just be the most important -- and mysterious -- thing you've never heard of.

Like a paper towel wicking up water from a surface, water rises above its natural level in soils through capillary action. A lot of chemical and microbial activity in the soil varies based on how much water or air is around. So, the capillary fringe controls many important functions in the soil.

"Important processes like contaminant breakdown and carbon storage depend on the amount of water and oxygen available," says Jaclyn Fiola, now a graduate student at Virginia Tech. "Understanding the conditions in the capillary fringe will help us predict where certain soil processes will occur."

Fiola and her team set out to better understand this strange region. But that's no easy feat. With the entire fringe underground, it's invisible. And even scientists have a hard time agreeing on where the fringe begins and ends. That's where lab experiments come in handy.

The team gathered two kinds of soil, one sandy and one loamy. The scientists packed this soil into five-gallon buckets with holes near the bottom to allow water to enter.

To track the key events in the capillary fringe, Fiola turned to cleverly simple systems. To study how much oxygen was in the soil, the researchers painted PVC pipes with rust-embedded paint. They inserted these pipes into the soil.

Wherever there wasn't enough oxygen, microbes would "breathe" rust instead. That would turn the rust into a different form of iron, which washes away. By measuring how much rust was left, the team could get a glimpse beneath the soil.

The researchers were surprised to find that the water rose the entire height of the buckets in both types of soils. That means the capillary fringe extended at least 9 inches, more than they were expecting.

They were also surprised that the PVC pipes had lost their rust well above the water table. "This means the soil in the capillary fringe at least 2 inches above the water table is behaving like soil in the water table even though it's not fully saturated," says Fiola.

"Based on the findings, the soil directly above the water table behaves a lot like the saturated soil within the water table," says Fiola.

The government defines wetlands partly by soils that are saturated near the surface. But if soils act like they're saturated even above the water table, that means more areas might act like wetlands and deserve protection.

Scientists also wanted to better understand how water and air in the capillary fringe can affect other soil processes. To track decomposition, they inserted wooden sticks into the soil. Researchers found that microbes eating the wooden sticks were finicky.

"Our results suggest that the microbes that carry out decomposition require ideal conditions - not too wet and not too dry," says Fiola. The wood was most eaten away in the middle of the buckets where it was moist.

"The capillary fringe is far too complicated to define based on one single measurement," says Fiola. Even though her team measured many different aspects of the fringe, those measurements didn't always agree with one another.

Soils are complex, especially outside of the lab. So now the researchers are planning to study the capillary fringe in more realistic conditions and in the field.

That future work might give us a better understanding of -- and appreciation for -- the fuzzy, complex, and vital in-between spaces beneath our feet.

Credit: 
American Society of Agronomy

UMN trial shows hydroxychloroquine has no benefit over placebo in preventing COVID-19

MINNEAPOLIS, MN - June 3, 2020 - Today, University of Minnesota Medical School researchers published the results from the first randomized clinical trial testing hydroxychloroquine for the post-exposure prevention of COVID-19.

The trial results, which were published in the New England Journal of Medicine, determined that hydroxychloroquine was not able to prevent the development of COVID-19 any better than a placebo. Further, 40% of trial participants taking hydroxychloroquine developed non-serious side effects -- predominantly nausea, upset stomach or diarrhea. However, the trial found no serious side effects or cardiac complications from taking hydroxychloroquine.

The randomized placebo-controlled trial, which rapidly launched on March 17, tested whether hydroxychloroquine could prevent COVID-19 infection in healthy persons after exposure to someone with COVID-19. The trial enrolled 821 non-hospitalized adults from across the U.S. and Canada who had been exposed to COVID-19 through someone living in their household or through their work as a healthcare worker or first responder. Half of the participants received five days of hydroxychloroquine while the other half received five days of a placebo. The trial was double-blind, meaning that neither the participants nor the researchers knew what the participants received. Participants were followed for two weeks to see who developed symptomatic COVID-19.

Overall, approximately 12% of those given hydroxychloroquine developed COVID-19 versus approximately 14% of those given the vitamin placebo (folate). This difference was not statistically significant, and even if it had been, it would equate to treating 42 persons with hydroxychloroquine in order to prevent one infection. There was no further benefit in preventing infection among those who also took zinc or vitamin C.
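For readers curious where the "42 persons" figure comes from: the number needed to treat is the reciprocal of the absolute difference in infection rates. A rough check using the rounded percentages quoted above (the trial's unrounded event rates yield the reported 42):

```latex
\text{NNT} = \frac{1}{p_{\text{placebo}} - p_{\text{hydroxychloroquine}}}
           \approx \frac{1}{0.14 - 0.12} = 50
```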

David Boulware, MD, MPH, the senior investigator of the trial and infectious disease physician at the University of Minnesota, launched the trial with the hope that an inexpensive, widely-available, oral medication could prevent or treat coronavirus early in the disease before people are sick enough to need hospital care.

"Our objective was to answer the question of whether hydroxychloroquine worked to prevent disease or did not work," Boulware said. "While we are disappointed that this did not prevent COVID-19, we are pleased that we were able to provide a conclusive answer. Our objective was to find an answer."

Credit: 
University of Minnesota Medical School

Mass General model projects sharp rise in alcohol-related liver disease

Alcohol-related liver disease is currently the most common reason for liver transplantation in the United States. In recent years, high-risk drinking of alcohol--defined as exceeding daily drinking limits (more than 4 or 5 standard drinks for women and men, respectively) at least weekly for a whole year--has increased in nearly all sociodemographic groups, especially in women.

A new analysis by researchers at Massachusetts General Hospital, Harvard Medical School and Georgia Tech indicates that cases of alcohol-related liver disease will rise dramatically in the coming years without drastic steps to reduce high-risk drinking rates.

To project the impact of alcohol-related liver disease over the next two decades, the team developed a model of drinking patterns and alcohol-related liver disease in individuals born between 1900 and 2012 in the United States, with projections up to 2040. The model relied on data from multiple sources: the National Epidemiologic Survey on Alcohol and Related Conditions, the National Institute on Alcohol Abuse and Alcoholism, the US National Death Index, the National Vital Statistics System and numerous published studies. The model was validated by showing that it closely reproduced the trends in deaths due to alcohol-related liver disease observed from 2005 to 2018.

In the Lancet Public Health analysis, future trends in alcohol-related liver disease were modelled under three potential scenarios based on the level of interventions to address high-risk drinking:

Without any changes in trends in alcohol consumption (status quo), more than 1 million people could die from alcohol-related liver disease by 2040, of whom 35% are projected to be younger than 55 years
Reducing high-risk drinking rates to 2001 levels could prevent 35,000 deaths during the same time period
In contrast, decreasing the high-risk drinking rate by 3.5% per year--similar to the decrease in the rate of tobacco consumption observed after implementation of policies and social interventions in the 1960s to reduce smoking in the U.S.--could prevent 299,000 deaths compared with the status quo scenario (a 30% decrease)

The analysis also projected that, in comparison with the status quo, decreasing the high-risk drinking rate by 3.5% per year could reduce cases of alcohol-related liver cancer and decompensated cirrhosis (a form of advanced liver disease) by 30% between 2019 and 2040.
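As a back-of-the-envelope illustration (not the authors' microsimulation, which tracks drinking patterns and liver-disease progression in detail), the sketch below simply compounds a 3.5% annual decline to show how much of the 2019 high-risk drinking rate would remain by a given year:

```python
# Illustrative arithmetic only, not the Lancet Public Health model: compound a
# 3.5% annual decline in the high-risk drinking rate from a normalised 2019 baseline.
def remaining_rate(annual_decline: float, years: int, start: float = 1.0) -> float:
    """Fraction of the baseline rate left after `years` of compounding decline."""
    return start * (1.0 - annual_decline) ** years

for year in (2025, 2030, 2040):
    print(f"{year}: {remaining_rate(0.035, year - 2019):.0%} of the 2019 rate")
# Under this scenario the rate falls to roughly 47% of its 2019 level by 2040.
```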

"Our study underscores the need to bring alcohol-related disease to the forefront of policy discussions and identify effective policies to reduce high-risk drinking in the U.S.," said senior author Jagpreet Chhatwal, PhD, a senior scientist at the MGH Institute for Technology Assessment and an assistant professor at Harvard Medical School.

Lead author Jovan Julien, MS, a PhD student at Georgia Tech, noted that the impact of high-risk drinking takes years to observe. "Our model highlights the long-term risk, especially for younger generations whose drinking continues to outpace older generations," he said. "Given the practical infeasibility of clinical trials to study long-term effects of increased alcohol consumption, data-driven modeling studies such as ours are valuable for understanding the implications of high-risk drinking and making informed policy decisions," added co-author Turgay Ayer, PhD, MSc, an associate professor at Georgia Tech.

Credit: 
Massachusetts General Hospital

A potential new weapon in the war against superbugs

University of Melbourne researchers are finding ways to beat dangerous superbugs with 'resistance resistant' antibiotics, and it could help in our fight against coronavirus (COVID-19) complications.

As bacteria evolve, they develop strategies that undermine antibiotics and morph into 'superbugs' that can resist most available treatments and cause potentially lethal infections.

The Melbourne team has shown that a newly discovered natural antibiotic, teixobactin, could be effective in treating bacterial lung conditions such as tuberculosis and those commonly associated with COVID-19.

Their work could pave the way for a new generation of treatments for particularly stubborn superbugs.

Teixobactin was discovered in 2015 by a team led by Professor Kim Lewis at Northeastern University in Boston. His company is now developing it as a human therapeutic.

The new University of Melbourne research, published in the journal mSystems, is the first to explain how teixobactin works in relation to the superbug methicillin-resistant Staphylococcus aureus (MRSA).

MRSA is among the bacteria responsible for several difficult-to-treat infections in humans, particularly secondary bacterial chest infections that follow viral illnesses such as COVID-19 and influenza.

University of Melbourne Research Fellow in anti-infectives Dr Maytham Hussein and Associate Professor Tony Velkov's team synthesised an aspect of teixobactin to produce a compound that showed excellent effectiveness against MRSA, which is resistant to the antibiotic methicillin.

Dr Hussein said that there was no way to stop bacteria like MRSA from developing resistance to antibiotics, as it was part of their evolution. This made combating them extremely challenging.

"The rise of multi drug-resistant bacteria has become inevitable," Dr Hussein said. "These bacteria cause many deadly infections, particularly in immunocompromised patients such as diabetic patients or those with cancers, or even elderly people with post-flu secondary bacterial infections."

The University of Melbourne team is the first to find that teixobactin significantly suppressed mechanisms involved in resistance to vancomycin-based antibiotics that are recommended for complicated skin infections, bloodstream infections, endocarditis, bone and joint infections, and MRSA-caused meningitis.

The development could lead to new lung infection treatments and Associate Professor Velkov said it would greatly facilitate the pre-clinical development of teixobactin.

"Bacteria often develop resistance towards antibiotics within 48 hours after exposure," Associate Professor Velkov said. "The bacteria failed to develop resistance towards this compound over 48 hours.

"These novel results will open doors to develop novel antibacterial drugs for the treatment of multi-drug resistant Gram-positive infections - bacteria with a thick cell wall - which are caused by certain types of bacteria."

Credit: 
University of Melbourne

Sharp drop in Australia's bad cholesterol levels -- but rising in Asia and Pacific

The proportion of Australians with bad cholesterol levels has dropped significantly, while Asian and Pacific countries recorded a sharp rise, according to the world's largest study on the condition.

University of Queensland researchers were part of an international team that analysed data from 102.6 million people in 200 countries, spanning the period from 1980 to 2018.

UQ's Professor Annette Dobson said the results showed total and non-HDL 'bad' cholesterol levels had fallen in high-income nations including Australia, North-western Europe and North America, but had risen in low and middle-income nations, particularly in East and Southeast Asia.

"Australian women ranked 146th highest in the world for non-HDL cholesterol in 2018, compared with 32nd highest in 1980," she said.

"While Australian men ranked 116th highest for non-HDL cholesterol in 2018, compared to 31st highest in 1980.

"High cholesterol is responsible for almost four million deaths worldwide each year with half of these being in East, South and Southeast Asia."

Cholesterol is a waxy, fat-like substance found in the blood that helps build healthy cells, but too much of it can lead to build-ups in blood vessels.

High non-HDL cholesterol can block blood supply and lead to heart attacks and strokes.

This type of cholesterol is usually caused by diets high in saturated fat but can be lowered effectively through the use of statins.

The researchers believe the findings suggest a need for pricing and regulatory policies that shift diets from saturated to non-saturated fats, while health systems must be prepared to treat those in need with effective medicines.

The data showed countries with the highest levels of non-HDL cholesterol now included China, Tokelau, Malaysia, the Philippines and Thailand.

But Australia may not be off the hook.

"While the study showed that non-HDL cholesterol levels had fallen over decades, Australia doesn't have up-to-date national data as blood samples are no longer included in the National Health Surveys, so we are in the dark about recent trends," Professor Dobson said.

Credit: 
University of Queensland

Terahertz radiation can disrupt proteins in living cells

Researchers from the RIKEN Center for Advanced Photonics, Tohoku University, National Institutes for Quantum and Radiological Science and Technology, Kyoto University, and Osaka University have discovered that terahertz radiation, contradicting conventional belief, can disrupt proteins in living cells without killing the cells. This finding implies that terahertz radiation, which was long considered impractical to use, may have applications in manipulating cell functions for the treatment of cancer, for example, but also that there may be safety issues to consider.

Terahertz radiation is a portion of the electromagnetic spectrum between microwaves and infrared light, which is often known as the "terahertz gap" because of the lack so far of technology to manipulate it efficiently. Because terahertz radiation is stopped by liquids and is non-ionizing--meaning that it does not damage DNA in the way that x-rays do--work is ongoing to put it to use in areas such as airport baggage inspections. It has generally been considered to be safe for use in tissues, although some recent studies have found that it may have some direct effect on DNA; however, it has little ability to penetrate into tissues, meaning that this effect would be limited to surface skin cells.

One issue that has remained unexplored, however, is whether terahertz radiation can affect biological tissues even after it has been stopped, through the propagation of energy waves into the tissue. The research group from RAP and the National Institutes for Quantum and Radiological Science and Technology recently discovered that the energy from the light could enter into water as a "shockwave." Considering this, the group decided to investigate whether terahertz light could also have an effect like this on tissue.

They chose to investigate using a protein called actin, which is a key element that provides structure to living cells. It can exist in two conformations, known as G-actin and F-actin, which have different structures and functions: F-actin is a long filament made up of polymerised chains of the protein. Using fluorescence microscopy, they looked at the effect of terahertz radiation on the growth of chains in an aqueous solution of actin, and found that it led to a decrease in filaments. In other words, the terahertz light was somehow preventing the G-actin from forming chains and becoming F-actin. They considered the possibility that the change was caused by a rise in temperature, but found that the small rise, of around 1.4 degrees Celsius, was not sufficient to explain it, and concluded that it was most likely caused by a shockwave. To further test the hypothesis, they performed experiments in living cells, and found that in the cells, as in the solution, the formation of actin filaments was disrupted. However, there was no sign that the radiation caused cells to die.

According to Shota Yamazaki, the first author of the study, published in Scientific Reports, "It was quite interesting for us to see that terahertz radiation can have an effect on proteins inside cells without killing the cells themselves. We will be interested in looking for potential applications in cancer and other diseases." He continues, "Terahertz radiation is coming into a variety of applications today, and it is important to come to a full understanding of its effect on biological tissues, both to gauge any risks and to look for potential applications."

Credit: 
RIKEN

Bees grooming each other can boost colony immunity

video: Honeybees engaging in allogrooming behavior.

Image: 
Dr Adele Bordoni, University of Florence

Honeybees that specialise in grooming their nestmates (allogroomers) to ward off pests play a central role in the colony, finds a new UCL and University of Florence study.

Allogroomer bees also appear to have stronger immune systems, possibly enabling them to withstand their higher risk of infection, according to the findings published in Scientific Reports.

Ectoparasites (parasites that live on the outside of a host's body, such as mites) are a growing threat to honeybees worldwide, so the researchers say that supporting allogrooming behaviour might be an effective pest control strategy.

Lead author Dr Alessandro Cini, who began the project at the University of Florence before moving over to UCL Centre for Biodiversity & Environment Research, said: "An ectoparasitic mite, Varroa destructor, represents a major global threat to bee colonies. By understanding how allogrooming practices are used to ward off parasites, we may be able to develop strategies to promote allogrooming behaviour and increase resilience to the parasites.

"Here, we found worker bees that specialise in allogrooming are highly connected within their colonies, and have developed stronger immune systems.

"We suspect that if more bees engaged in these allogrooming behaviours that ward off parasites, the colony as a whole could have greater immunity."

Among bees, allogrooming consists of a worker using its mouth to remove debris, which may include parasites and other pathogens, from the body of another member of its colony.

In bee colonies, different groups of worker bees conduct different activities - one such specialisation is allogrooming, although it was not previously known how specialised the groomer bees are, and how their physiology may be different.

The current study focused on Apis mellifera, commonly known as the western honeybee, which is the most common species of honeybee and also the world's most-used bee for both honey production and pollination in agriculture.

As allogrooming would likely put the grooming bees at an elevated risk of contracting pathogens and parasites, the researchers tested their immune systems, and found that their hemolymph (like blood, but for insects) could more effectively clear out potentially harmful bacteria than the immune systems of other bees in the colony.

Co-author Dr Rita Cervo from the University of Florence said: "By identifying a striking difference in the immune systems of the allogrooming bees, which are involved in tasks important to colony-wide immunity from pathogens, we have found a link between individual and social immunity."

The researchers found that allogroomer bees occupy a central position in the colony's social network, as they are more connected to bees across the colony than the average bee, enabling their grooming habits to benefit a large number of bees and keep the colony as pest-free as possible.

The researchers found that allogrooming is a relatively weak, transient specialty, as the groomer bees still devoted a similar amount of time to other tasks as the rest of the colony's worker bees. The researchers say this shows that bees can develop physiological differences narrowly tailored to specific tasks, while still maintaining a degree of plasticity enabling them to switch to other tasks as needed.

The researchers did not detect any differences in how well the allogroomer bees could detect when other bees needed grooming, as their antennae were not more finely-tuned to relevant odours. It is possible they can detect who needs grooming in other ways, such as by noticing the 'grooming invitation dance' whereby bees shake their whole body from side-to-side.

Credit: 
University College London

An optimal decision-making strategy emerges from non-stop learning

Unlike machines, the behaviour of animals and humans almost always has an element of unpredictability. Countless experiments have shown that our responses to the exact same challenge are sometimes faster, sometimes slower, sometimes correct and sometimes wrong.

In the field of neuroscience, this variability is often attributed to what is called "noise" - an ever-present "neural babble" that influences the way brains process and respond to incoming information.

A new collaborative study in rodents by a team of scientists from the Champalimaud Centre for the Unknown in Portugal, Harvard Medical School in the US, and the University of Geneva in Switzerland, shows that, in fact, this variability could sometimes be wrongly interpreted as noise. Instead, it may actually be the reflection of a behavioural strategy that was overlooked due to prior assumptions about how the subject should behave. Their results - published today (June 2nd) in the scientific journal Nature Communications - call into question what "optimal behaviour" really means.

An unexpected strategy

"It all started with a simple experiment", recalls Maria Inês Vicente, who collected the experimental data as part of her graduate work at the Champalimaud Centre for the Unknown and is currently working at Leiden University . "We took two different odours and created several mixtures of the two. During the experiment, the different mixtures were presented to the rats, one at a time. On each trial, the rats had to report which of two odours was more dominant. If it thought the answer was odour A, it would approach a water spout on the right, and if it opted for odour B, it would go to the left. Some mixtures had much more of one odour compared with the other, making it easier to tell which was more salient. Whereas in other mixtures, the difference was more subtle. If the rat got the correct answer, it received a water reward."

The researchers recorded how quickly the rats responded and whether their answer was right or wrong. To their surprise, when they analyzed the data, they realized that the rats' behaviour didn't follow a common decision-making rule. "In these types of tasks we tend to see a clear dependency between difficulty and decision time: on the harder, more subtle trials, animals (and humans) take longer to decide than on easy trials", says André Mendonça of the Champalimaud Centre for the Unknown. "Instead, our rats would take, on average, the same amount of time to make both hard and easy decisions."

"The explanation for this unexpected observation wasn't easy to come by", adds Jan Drugowitsch, a co-author affiliated with Harvard Medical School. "Finally, we found it by constructing a mathematical model that united separate branches in the field of decision-making. In a sense, our goal was to replicate the rats' behaviour in a 'machine's brain' with the hope of discovering the underlying variables that produced this surprising result."

The model revealed an unexpected strategy. On each trial, the rat was readjusting its behaviour according to the results of the previous trial. If the rat was correct in one trial, it would be biased towards the same odour in the next one. And vice versa, an incorrect response in one trial would lead to switching in the next.

Why did the animals adopt this particular strategy? "This strategy is consistent with a world-view where the environment is continuously changing, which leads the animals to update their decision-making approach on a trial-by-trial basis. From the outside, their behaviour appears highly variable, but in fact they were just adapting too quickly. That is why it would have been easy to wrongly interpret this variability as 'just' noise", Drugowitsch points out.
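A minimal sketch of this kind of trial-by-trial updating might look like the following. This is purely illustrative, with arbitrarily chosen parameter values; the study's actual mathematical model combines evidence accumulation with this learning rule and is considerably richer.

```python
import random

def simulate(n_trials=1000, bias_step=0.2, seed=0):
    """Toy win-stay / lose-switch sketch, not the paper's model.
    A positive bias nudges the next choice toward odour A, a negative bias toward B."""
    rng = random.Random(seed)
    bias = 0.0
    n_correct = 0
    for _ in range(n_trials):
        dominant = rng.choice("AB")                     # which odour truly dominates
        p_perceive_a = 0.7 if dominant == "A" else 0.3  # imperfect sensory evidence
        p_choose_a = min(max(p_perceive_a + bias, 0.0), 1.0)
        choice = "A" if rng.random() < p_choose_a else "B"
        correct = choice == dominant
        n_correct += correct
        # correct -> lean toward the same odour next trial; wrong -> lean the other way
        sign = 1 if choice == "A" else -1
        bias = bias_step * sign if correct else -bias_step * sign
    return n_correct / n_trials

print(f"fraction correct: {simulate():.2f}")
```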

Optimality is in the eye of the beholder

Why did the rats opt for a different strategy from the expected one? The authors explain that there are several reasons, the first being the nature of the task. "There isn't just one type of sensory discrimination task", says Mendonça. "Various elements in the design of the task may draw out different decision-making strategies. For instance, if we had asked rats to localize the side where a sound comes from instead of discriminating between odours, their strategy would have aligned with our initial expectation. This is because there is a 'built-in' right-left category in the brain for certain sensory modalities that are naturally spatially separated, but that's not the case for olfaction."

Another reason is confidence. "Just like humans, rats appear to evaluate their own decisions and change their behaviour accordingly. When you are very confident and end up making the correct decision, there's really not much to learn. But what happens when you're confident, but then find out that you're actually wrong? In this case, you should change your behaviour drastically. Which is precisely what we saw with our rats", says Zachary Mainen, one of the group leaders who headed the study and who is affiliated with the Champalimaud Centre for the Unknown.

According to the authors, another explanation for the rats' choice of strategy is their "hard-wired" circuitry for learning. "Ironically, if they did not constantly readjust their responses according to the outcome of the last trial, they would actually do better. In fact, what we were originally expecting them to do is to construct an 'odour A - odour B' category and implement it", points out Alex Pouget, who is a group leader at the University of Geneva and co-author of the study. "Still, the rats' strategy makes sense."

As the authors explain, this observation doesn't mean the rat is a maladapted animal. On the contrary, they claim that the scientific community should reconsider what it defines as "optimal behaviour". "Rats have evolved over millions of years to search and explore an ever-changing environment. Therefore, when we assess the behaviour of these animals, we should remember that it's not necessarily only about performance per se. Optimality should depend both on the problem at hand and the nature of the problem-solver", Pouget argues.

"We believe that our work is a good starting point for exploring further how different subfields of decision-making may interact. We also hope that other scientists will use and refine our models in follow-up experiments. It would be fascinating and informative to see when, how and why our model starts to fail. Making an error is an opportunity for learning something new, and that is both the result and take-home message of our study", Mendonça concludes.

Credit: 
Champalimaud Centre for the Unknown

Carnegie Mellon tool automatically turns math into pictures

image: A new software tool from Carnegie Mellon University turns abstract mathematical expressions into pictures that can be more easily understood.

Image: 
Carnegie Mellon University

PITTSBURGH-- Some people look at an equation and see a bunch of numbers and symbols; others see beauty. Thanks to a new tool created at Carnegie Mellon University, anyone can now translate the abstractions of mathematics into beautiful and instructive illustrations.

The tool enables users to create diagrams simply by typing an ordinary mathematical expression and letting the software do the drawing. Unlike a graphing calculator, these expressions aren't limited to basic functions, but can be complex relationships from any area of mathematics.

The researchers named it Penrose after the noted mathematician and physicist Roger Penrose, who is famous for using diagrams and other drawings to communicate complicated mathematical and scientific ideas.

"Some mathematicians have a talent for drawing beautiful diagrams by hand, but they vanish as soon as the chalkboard is erased," said Keenan Crane, an assistant professor of computer science and robotics. "We want to make this expressive power available to anyone."

Diagrams are often underused in technical communication, since producing high-quality, digital illustrations is beyond the skill of many researchers and requires a lot of tedious work.

Penrose addresses these challenges by enabling diagram-drawing experts to encode how they would do it in the system. Other users can then access this capability using familiar mathematical language, leaving the computer to do most of the grunt work.

The researchers will present Penrose at the SIGGRAPH 2020 Conference on Computer Graphics and Interactive Techniques, which will be held virtually this July because of the COVID-19 pandemic.

"We started off by asking: 'How do people translate mathematical ideas into pictures in their head?'" said Katherine Ye, a Ph.D. student in the Computer Science Department. "The secret sauce of our system is to empower people to easily 'explain' this translation process to the computer, so the computer can do all the hard work of actually making the picture."

Once the computer learns how the user wants to see mathematical objects visualized -- a vector represented by a little arrow, for instance, or a point represented as a dot -- it uses these rules to draw several candidate diagrams. The user can then select and edit the diagrams they want from a gallery of possibilities.
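The flavour of that rule-based approach can be sketched in ordinary Python. This is a hypothetical illustration of the idea, not Penrose's actual notation or API: each kind of mathematical object is mapped to a visual primitive, and several randomized candidate layouts are produced for the user to choose from.

```python
import random

# Hypothetical sketch of rule-based diagram generation (not Penrose's real
# languages or API): a "style" maps object kinds to visual primitives, and the
# system samples several candidate layouts for the user to pick from.
STYLE = {
    "Point":  lambda name, rng: {"shape": "dot", "label": name,
                                 "x": rng.uniform(0, 100), "y": rng.uniform(0, 100)},
    "Vector": lambda name, rng: {"shape": "arrow", "label": name,
                                 "x": rng.uniform(0, 100), "y": rng.uniform(0, 100),
                                 "dx": rng.uniform(-20, 20), "dy": rng.uniform(-20, 20)},
}

def candidate_diagrams(objects, n_candidates=3, seed=1):
    """objects: list of (kind, name) pairs, e.g. [("Point", "p"), ("Vector", "v")]."""
    rng = random.Random(seed)
    return [[STYLE[kind](name, rng) for kind, name in objects]
            for _ in range(n_candidates)]

for i, diagram in enumerate(candidate_diagrams([("Point", "p"), ("Vector", "v")])):
    print(f"candidate {i}: {diagram}")
```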

The research team developed a special programming language for this purpose that mathematicians should have no trouble learning, Crane said.

"Mathematicians can get very picky about notation," he explained. "We let them define whatever notation they want, so they can express themselves naturally."

An interdisciplinary team developed Penrose. In addition to Ye and Crane, the team included Nimo Ni and Jenna Wise, both Ph.D. students in CMU's Institute for Software Research (ISR); Jonathan Aldrich, a professor in ISR; Joshua Sunshine, an ISR senior research fellow; cognitive science undergraduate Max Krieger; and Dor Ma'ayan, a former master's student at the Technion-Israel Institute of Technology.

"Our vision is to be able to dust off an old math textbook from the library, drop it into the computer and get a beautifully illustrated book -- that way more people understand," Crane said, noting that Penrose is a first step toward this goal.

Credit: 
Carnegie Mellon University

Discovery of proteins that regulate interorganelle communication

image: Structural changes of MAM due to FKBP8.

Image: 
Korea Brain Research Institute

Korea Brain Research Institute (KBRI headed by Suh Pann-ghill) announced on the 2nd that the joint research team of KBRI (team led by Dr. Mun Ji-young), Seoul National University, and Pohang University of Science and Technology discovered the proteins that engage in the formation of MAM*, which is the cellular signaling hub.

* MAM (mitochondria-associated membrane): The membrane that connects mitochondria and endoplasmic reticulum (ER)

The research results were published in the May issue of the leading scientific journal Proceedings of the National Academy of Sciences of the United States of America, and the paper title and authors are as follows.

* Title: Contact-ID, a tool for profiling organelle contact sites, reveals regulatory proteins of mitochondrial-associated membrane formation

* Authors: Kwak Chul-hwan1, Shin Sang-hee1, Park Jong-seok1 (1first authors), Jung Min-kyo, Truong Thi My Nhung, Kang Myeong-gyun, Lee Chai-heon, Kwon Tae-hyuk, Park Sang-ki*, Mun Ji-young*, Kim Jong-seo*, and Rhee Hyun-woo* (*corresponding authors)

Diverse organelles exist in living cells, which serve their own functions and communicate with each other via membrane contact sites. In particular, proteins localized at MAMs, which are the most critical cellular contact sites of mitochondria* and ER*, engage in key functions such as lipid metabolism and autophagy.

* Mitochondria: A type of organelle that powers the cell's metabolic activities in the form of ATP

* Endoplasmic reticulum (ER): A type of organelle that produces proteins and delivers them to other parts of the cell

When mitochondria and ER come into contact with each other, MAM is formed to facilitate calcium transport. If an excessive amount of calcium is transported into mitochondria, it compromises their function and leads to the onset of disease. MAM has recently emerged as the center of attention after its role as a cellular signaling hub was discovered. In fact, genetic degeneration has been detected in the MAM of many patients suffering from degenerative neural diseases.

The joint research team invented a new technique (Contact-ID) for labeling and analyzing proteins localized at MAM and identified 115 MAM-specific proteins within live human cells through this technique.

Up until now, the centrifuge method has mainly been used to analyze the structure of MAM-specific proteins. However, this method had the disadvantages of excessive noise generated during the splitting process and low efficiency. The team's newly invented technique overcomes such challenges.

The large-area 3D scanning electron microscopy (SEM) system adopted by KBRI last year was also used to observe the three-dimensional MAM structure within cells. As a result, it was discovered that FKBP8, an outer mitochondrial membrane (OMM) protein, facilitates the creation of MAM connecting mitochondria and ER and plays an instrumental role in calcium transport.

This study is significant in that it discovered proteins capable of regulating an increase in mitochondrial calcium, which is now known as a common cause of neurodegenerative diseases. This study is expected to be utilized as a new milestone in developing treatment options for Alzheimer's disease, Parkinson's disease, etc., in the future.

Dr. Mun Ji-young, a joint corresponding author, and Dr. Jung Min-kyo, a joint author, of KBRI said, "Diseases occur when inter-organelle communication is hindered or disturbed. We succeeded in more accurately identifying MAM-specific proteins that engage in this process."

They added, "We conducted joint research on the functions of FKBP8 from among the identified proteins. Our follow-up study will be targeted at its activities as a key factor for delaying or preventing damage to mitochondria."

Credit: 
Korea Brain Research Institute

Spine surgeons face COVID-19 challenges worldwide

Spine surgeons across the world are experiencing effects of COVID-19, including canceled procedures, changes in clinical roles, anxiety and risk of exposure to the disease themselves due to insufficient protective equipment. An international team of researchers reported these findings recently in the Global Spine Journal.

Using a 73-question survey administered from March 27 to April 4, the researchers gathered information about spine surgeons worldwide, including topics such as personal perspectives, family life, attitudes and coping strategies; caring for patients; implications of government and leadership; financial impact; research; education and training; and future challenges and impact.

The study evaluated survey responses from 902 spine surgeons from the membership of AO Spine, the largest society of international spine surgeons in the world, which has more than 6,000 members from 91 countries and seven global regions. The study is the first to assess global variations in the impact of this pandemic among healthcare professionals, in this case spine surgeons.

With some geographic variations, the majority of the surgeons reported that 75 percent of surgical cases were canceled each week. Surgeons also reported experiencing elevated anxiety and uncertainty, and nearly 95 percent expressed a need for formal international guidelines about how to manage patients with COVID-19.

"Some institutes have some form of guidelines here and there, but standardized formal ones don't exist in the community and are needed," said Dr. Dino Samartzis, Associate Professor in the Department of Orthopaedic Surgery at Rush Medical College, and the primary investigator of the study. He further remarked, "This study shows that we need to prepare for other disasters moving ahead or risk history repeating itself."

The study showed that one in four surgeons reported working outside their normal scope of practice. About 83 percent of surgeons reported having access to COVID-19 testing, but only seven percent of surgeons reported being tested for COVID-19. In general, 13 percent of surgeons noted that they would not feel compelled, or would be unlikely, to disclose to patients if they themselves tested positive for COVID-19.

The study also found that 50 percent of the surgeons surveyed lacked adequate protective equipment. In addition, 37 percent of the surgeons reported having at least one chronic disease, such as high blood pressure, placing them at a higher risk from COVID-19.

The study also highlighted that up to 97 percent of all surgeons were interested in online education. "With the uncertainty of how this pandemic plays out in the upcoming months and the need to get back to a recovery phase for all spine specialists, traveling to meetings, workshops and other educational events will be a challenge," stated Samartzis. The study further noted that telemedicine has also gained prominence during this period.

Samartzis also noted that, "The study further brings to light that regional variations also exist. This is very important for institutes, academia and industry to take stock in their current and future planning, decision-making and more personalized approaches to help facilitate and provide the necessary technology for the spine specialists to optimize their patient care and outcomes."

Dr. Howard S. An, the Morton International Endowed Chair Professor and director of the Spine Surgery Fellowship program in the Department of Orthopaedic Surgery at Rush Medical College and a co-investigator of this study, noted, "During this pandemic, the spine specialist has been a forgotten soldier." An went on to state that, "The impact this has had on their practice and patients is far-reaching since a surgical practice, in particular, supports and works with so many health specialists and support staff, and manages patients in immense pain and disability looking for solutions. If their practice is affected, everyone else will be affected. Society as a whole will be, to some degree, affected even further."

Credit: 
Rush University Medical Center

Scientist captures new images of Martian moon Phobos to help determine its origins

image: Images capture the Mars moon Phobos during different phases -- waxing, waning and full -- including the three images recently processed by Edwards

Image: 
NASA/JPL-Caltech/ASU/NAU

Christopher Edwards, assistant professor in NAU's Department of Astronomy and Planetary Science, just processed new images of the Martian moon Phobos that give scientists insight into the physical properties of the moon and its composition. The images of the small moon, which is about 25 kilometers (15 miles) in diameter, were captured by NASA's 2001 Mars Odyssey orbiter. When reviewed in combination with three previously released images, these new images could ultimately help settle the debate over whether the planetary body is a "captured asteroid"--pulled into perpetual orbit around Mars--or an ancient chunk of Mars blasted off the surface by a meteorite impact.

Along with scientists at NASA's Jet Propulsion Lab and Arizona State University, Edwards used the Thermal Emission Imaging System (THEMIS) onboard the 2001 Mars Odyssey orbiter to capture the images from about 6,000 kilometers (3,700 miles) above the moon's surface to measure temperature variations during different phases--waxing, waning and full:

An image taken on December 9, 2019, shows the surface of Phobos at its maximum temperature, 81 degrees Fahrenheit (27 degrees Celsius).

An image taken on February 25, 2020, shows Phobos while in eclipse, where Mars' shadow completely blocked sunlight from reaching the moon's surface. This event resulted in some of the coldest temperatures measured on Phobos to date, with the coldest being about -189 degrees Fahrenheit (-123 degrees Celsius).

On March 27, 2020, Phobos was observed exiting an eclipse, when its surface was still warming up.

Edwards has been a part of the THEMIS team since 2003. All of the THEMIS infrared images are colorized and overlain on THEMIS visible images taken at the same time, except for the eclipse image, which is overlain on a synthetic visible image of what Phobos would have looked like if it hadn't been in complete shadow.

"The THEMIS instrument is designed to look at the composition and physical properties of the surface of Mars under various conditions using its multi-wavelength visible and infrared cameras," Edwards said.

From the new images, he said, "We're seeing that the surface of Phobos is relatively uniform and made up of very fine-grained materials. These observations are also helping to characterize the composition of Phobos, which appears to be mostly basaltic. Future observations will provide a more complete picture of the temperature extremes on the moon's surface."

Odyssey is the longest-operating spacecraft around Mars, and has been orbiting the Red Planet for more than 18 years.

"In an effort to continue advancing new science from the Odyssey mission as it matures," Edwards said, "a couple of years ago we proposed we could look at Phobos as part of our extended mission proposal. That requires a BIG spacecraft maneuver, rotating it 180 degrees into a geometry in which it was never intended to operate."

"As far as Phobos goes," he said, "its origins are enigmatic. The orbit it is in is not very stable, and some scientists have proposed that the moon has been destroyed and reformed multiple times because of its orbital position. It also turns out that the orbit's exact geometry makes it hard to capture--so some teams have proposed it is derived from Mars. How that happened is not clear, either! Perhaps it's from a big meteorite impact that ejected material into the orbit, and the material grouped together to form Phobos. So that's why we're looking for the physical properties of the surface, which might help identify locations where we could see the primary composition and not just the fine-grained dust."

Edwards added, "JAXA, Japan's space agency, is sending a whole mission to investigate Phobos and Deimos (Mars' other moon) called the Martian Moons eXploration (MMX), so we're providing some good reconnaissance data for that upcoming mission!"

Credit: 
Northern Arizona University

Stomach issues, history of substance abuse found in teen vaping study

image: "This is the first study on teens and EVALI from UT Southwestern, and one of the first in the country regarding clinical features of EVALI in the pediatric population," says corresponding author Devika Rao, M.D., pediatric pulmonologist at Children's Health and assistant professor of pediatrics in the division of respiratory medicine.

Image: 
UTSW

DALLAS – June 2, 2020 – A study of teens diagnosed with the vaping-linked respiratory disease EVALI revealed that most also had gastrointestinal symptoms and a history of psychosocial factors, including substance abuse, UT Southwestern researchers found in one of the first clinical reviews of its kind.

The investigation, published online by Pediatrics, described the treatment of 13 adolescents for vaping-related lung injury at Children’s Health in Dallas, UTSW’s pediatric teaching hospital. Vaping involves inhaling aerosol from a battery-powered device, also known as an electronic cigarette. Vaping marijuana has been linked to EVALI (e-cigarette, or vaping, product use-associated lung injury).

“This is the first study on teens and EVALI from UT Southwestern, and one of the first in the country regarding clinical features of EVALI in the pediatric population,” says corresponding author Devika Rao, M.D., pediatric pulmonologist at Children’s Health and assistant professor of pediatrics in the division of respiratory medicine. “We found that teenagers often presented with GI symptoms, which were just as frequent as respiratory symptoms. In some cases, these teens had abdominal CT scans that ended up showing abnormalities in the lung, which was the first clue of lung injury.”

Clinicians found that the nature of lung injury varied from mild to severe, and that there was a much larger proportion of female and Hispanic patients hospitalized with EVALI compared with published adult studies, Rao says.

“It may be that in the adolescent population these groups are more vulnerable to risky behaviors than what was previously thought. This serves as a reminder to clinicians that a teen with EVALI is not necessarily always going to be white and male,” she adds.

In January, the Food and Drug Administration moved to blunt vaping among teens by banning fruit- and mint-flavored products. Over the past two years, the country has experienced an alarming rise in vaping as well as a corresponding increase in associated lung injuries.

In early 2019, the Centers for Disease Control and Prevention (CDC) reported that more than 1 in 4 high school students and about 1 in 14 middle school students had used nicotine-containing e-cigarettes in the past 30 days.

About the same time, a National Institutes of Health-supported study by the University of Michigan found that twice as many high school students used e-cigarettes in 2018 compared with the previous year. Further, the study showed, 1 in 5 high school seniors reported having vaped nicotine at least once in the previous month – the largest single increase in the survey’s 44-year history, surpassing a mid-1970s surge in marijuana smoking.

Last summer, clinicians began seeing an uptick nationally in EVALI that was linked to products containing vitamin E acetate or tetrahydrocannabinol (THC). By early 2020, more than 2,800 hospitalized EVALI cases or deaths had been reported to the CDC. Despite these surges, however, the hallmarks of EVALI in adolescents are just beginning to be well characterized.

In the UTSW study, researchers from the pediatrics and emergency medicine departments studied 13 hospitalized adolescents diagnosed with confirmed or probable EVALI between December 2018 and November 2019. The majority (54 percent) were female, with a mean age of 15.9 years.

Both respiratory and gastrointestinal symptoms were reported in 85 percent of the teens. Vaping cannabinoid products was reported in 92 percent of patients, and vaping nicotine was reported in 62 percent.

The analysis included sociodemographic characteristics, clinical presentation, laboratory and imaging results, pulmonary function testing, oxygen requirements, and clinic follow-up, Rao says. Many teens were originally believed to have pneumonia or viral gastroenteritis-like illness, but actually had EVALI, which was diagnosed based on eliciting a vaping history from the teens. Treatment with glucocorticoids led to an improvement in both symptoms and lung function.

Investigations also revealed that a large percentage of the patients had a history of psychosocial stressors, including substance abuse and mood disorders. In addition, almost half of the cohort was Hispanic, though it was unclear whether that simply reflected the sizable Hispanic population in North Texas.

Charting each patient’s risk-associated behavior was challenging, she says, but worth the unified and ongoing effort.

“In taking care of hospitalized teens with EVALI, we found that they were very hesitant to disclose their vaping habits,” Rao says. “A multidisciplinary effort – discussion among emergency medicine physicians, hospitalist medicine physicians, pulmonologists, toxicologists, behavioral medicine specialists, and intensivists – is key to successful treatment of these patients.

“We know that teens can be vulnerable to lung injury from vaping, and we know many of them experience stressors that perhaps motivate them to engage in risky behavior. The next step is prevention – preventing teens from the desire to vape – and also helping teens who have been treated for EVALI so that they can stop vaping,” she adds. “Future plans for study include tracking the long-term effects of EVALI on lung function and studying current vaping habits of teens with a history of vaping and/or EVALI in the context of the novel coronavirus epidemic.”

Credit: 
UT Southwestern Medical Center

Researchers study genetic outcomes of great gray owl population in four states

image: University of Wyoming researchers led a study of great gray owls in a four-state region. The study, published in the May 31 online issue of Conservation Genetics, showed that range discontinuity could lead to genetic drift and subsequent loss of genetic diversity in these birds.

Image: 
Beth Mendelsohn

A University of Wyoming researcher led a study of great gray owls in a four-state region, showing that range discontinuity could lead to genetic drift and subsequent loss of genetic diversity in these birds.

Lower genetic diversity in these owls means they are more susceptible to changes in their environment and, thus, less able to adapt quickly.

"With lower genetic diversity, such owls have less ability to adapt to changes that include extreme fire effects on their habitats; human developments; stresses caused by diseases such as West Nile virus and trichomonas, a nasty parasite that damages their oral cavity and can lead to starvation; and other diseases," says Holly Ernest, a UW professor of wildlife genomics and disease ecology, and the Wyoming Excellence Chair in Disease Ecology in UW's Department of Veterinary Sciences and the Program in Ecology. "Another stress can be overzealous photographers who get too near nesting sites and scare great gray owl moms and dads off their nests and endanger the nestlings."

Ernest was the senior and corresponding author of a paper, titled "Population Genomic Diversity and Structure at the Discontinuous Southern Range of the Great Gray Owl in North America," that was published online May 31 in Conservation Genetics, a journal that promotes the conservation of biodiversity. The journal publishes original research papers, short communications, review papers and perspectives. Contributions include work from the disciplines of population genetics, molecular ecology, molecular biology, evolutionary biology, systematics, forensics and others.

Beth Mendelsohn, a 2018 UW Master of Science graduate in veterinary sciences and the Haub School of Environment and Natural Resources, from Missoula, Mont., is the paper's lead author. Mendelsohn is now a raptor biologist conducting research to improve owl and hawk conservation in Rocky Mountain ecosystems.

Other contributors to the paper are Bryan Bedrosian, research director of the Teton Raptor Center; Sierra Love Stowell, a research genomicist affiliated with Ernest's lab and a UW postdoctoral researcher from 2016-18; Roderick Gagne, a research scientist at Colorado State University and a UW postdoctoral researcher from 2015-17; Melanie LaCava, a UW Ph.D. candidate in the Program in Ecology and Department of Veterinary Sciences, from San Diego, Calif.; Braden Godwin, a 2019 UW Master of Science graduate in veterinary sciences and the Haub School of Environment and Natural Resources; and Joshua Hull, an adjunct associate professor of animal science at the University of California-Davis.

"This study constitutes the first genomic work on great gray owls and the first genetic analysis that includes Wyoming, which represents the southern extent of the species range in the Rocky Mountains and is impacted by habitat loss," Ernest says.

"When compared to some other bird species, we found that great gray owls seem to have lower genetic diversity," Mendelsohn says. "However, we have no evidence that the low genetic diversity is currently having a detrimental effect on the species. It may, however, affect the species' future adaptability."

Great gray owls studied live in northern and southern Oregon, California, Idaho and Wyoming. The study found that the populations that lacked connectivity to the rest of the breeding range -- those in California and Oregon -- had lower genetic diversity than the Rocky Mountain owl populations in Idaho and Wyoming. The owls in Idaho and Wyoming were connected to the core of the range. Owls in Wyoming showed greater genetic diversity than the other locations studied.

"We hypothesize that Wyoming owls live in their habitat range that is like a peninsula finger of habitat coming down, from north to south. From the main, larger habitat range to the north (Montana and up into Canada), they have habitat and range that are still connected, albeit a very tenuous connection, to the large main Montana/Canada habitat and range of great gray owls," Ernest says. "Wyoming owls exist on one of the most southern extents of great gray owl range that still has some connection with that larger Montana/Canada habitat."

By contrast, the great gray owl habitat in Oregon and California is cut off from the rest of the owls' range, which is mainly in boreal forests in Canada and around the globe in such regions as Russia, Ernest says. The owls' range becomes increasingly fragmented as it extends farther south. The northern boreal forest transitions to montane forest and borders on sagebrush and desert in Oregon and California.

As a result, these range-edge owl populations have a heightened susceptibility to disease; to human-caused mortality, such as from trees being cut down to build homes; to climate change; to erosion of genetic diversity; and to potential extinction. Small, isolated populations of these owls are more likely to experience inbreeding, which results in lower genetic diversity.

"Conservationists could learn from this that great gray owls in this part of their range may need extra protections and may be vulnerable to environmental change," Mendelsohn says.

During the study, 158 DNA samples from unique owl individuals were sequenced, Mendelsohn says. After filtering for quality and marker coverage, the research team retained 123 individuals that had enough high-quality sequencing reads. For some of the analyses, a subset of 78 individuals was used because known related individuals were removed.

The genomics part of this work took a little under 2.5 years at UW, Mendelsohn says.

"Prior to that, I worked as a field technician at the Teton Raptor Center and helped collect DNA samples from owls from 2013-16," she says. "The lead biologist on the project, Bryan Bedrosian, connected me with Holly for my master's work."

Credit: 
University of Wyoming

Astronomers capture a pulsar 'powering up'

image: A Monash University-led collaboration has, for the first time, observed the full, 12-day process of material spiralling into a distant neutron star, triggering an X-ray outburst thousands of times brighter than our Sun.

Image: 
NASA/JPL-Caltech

The research, led by PhD candidate Adelle Goodwin from the Monash School of Physics and Astronomy, will be featured at an American Astronomical Society meeting this week before it is published in Monthly Notices of the Royal Astronomical Society. Adelle leads a team of international researchers, including her supervisor, Monash University Associate Professor Duncan Galloway, and Dr David Russell from New York University Abu Dhabi.

The scientists observed an 'accreting' neutron star as it entered an outburst phase in an international collaborative effort involving five groups of researchers, seven telescopes (five on the ground, two in space), and 15 collaborators.

It is the first time such an event has been observed in this detail - in multiple frequencies, including high-sensitivity measurements in both optical and X-ray.

The physics behind this 'switching on' process has eluded physicists for decades, partly because there are very few comprehensive observations of the phenomenon.

The researchers caught one of these accreting neutron star systems in the act of entering outburst, revealing that it took 12 days for material to swirl inwards and collide with the neutron star, substantially longer than the two to three days most theories suggest.

"These observations allow us to study the structure of the accretion disk, and determine how quickly and easily material can move inwards to the neutron star," Adelle said.

"Using multiple telescopes that are sensitive to light in different energies we were able to trace that the initial activity happened near the companion star, in the outer edges of the accretion disk, and it took 12 days for the disk to be brought into the hot state and for material to spiral inward to the neutron star, and X-rays to be produced," she said.

In an 'accreting' neutron star system, a pulsar (a dense remnant of an old star) strips material away from a nearby star, forming an accretion disk of material spiralling in towards the pulsar, where it releases extraordinary amounts of energy - about the total energy output of the sun in 10 years, over the period of a few short weeks.

The pulsar observed is SAX J1808.4-3658, which rotates at a rapid 400 times per second and is located 11,000 light-years away in the constellation Sagittarius.

"This work enables us to shed some light on the physics of accreting neutron star systems, and to understand how these explosive outbursts are triggered in the first place, which has puzzled astronomers for a long time," said New York University Abu Dhabi researcher, Dr David Russell, one of the study's co-authors.

Accretion disks are usually made of hydrogen, but this particular object has a disk that is made up of 50% helium, more helium than most disks. The scientists think that this excess helium may be slowing down the heating of the disk because helium 'burns' at a higher temperature, causing the 'powering up' to take 12 days.

The telescopes involved include two space observatories, the Neil Gehrels Swift X-ray Observatory and the Neutron Star Interior Composition Explorer on the International Space Station, as well as the ground-based Las Cumbres Observatory network of telescopes and the Southern African Large Telescope.

Credit: 
Monash University