Earth

Novel method identifies areas most suitable for conservation of black lion tamarin

image: The study is a contribution to translocation initiatives that move groups of these animals to areas from which the species has disappeared

Image: 
Gabriela Cabral Rezende

By André Julião  |  Agência FAPESP – The black lion tamarin (Leontopithecus chrysopygus) once inhabited most forest areas in the state of São Paulo, Southeast Brazil, but currently occupies only some Atlantic Rainforest remnants there. In recent years, after various studies of the endangered species, environmental NGO Instituto de Pesquisas Ecológicas (IPÊ) moved groups of these animals to areas from which the species had disappeared. 

Similar initiatives have now been reinforced by a group of researchers at IPÊ, São Paulo State University (UNESP) and the Federal University of Mato Grosso (UFMT), who cross-tabulated climate data and data on landscape (forest cover) to determine the sites best suited for future translocation, a technique used by conservationists to bolster the viability and gene flow of endangered species. 

The study was supported by FAPESP and published in the American Journal of Primatology.

“We used climate and landscape data to try to predict which areas within the original distribution of the species are suitable for its conservation. The most widely used models use only climate data. Our study innovated by combining these two datasets in an approach that enabled us to identify the areas that are theoretically most suitable and compare them with the areas in which the species actually lives now,” said Laurence Culot, a professor at UNESP’s Institute of Biosciences (IB) in Rio Claro and principal investigator for the project “The effect of fragmentation on the ecological functions of primates”, which was funded by FAPESP and gave rise to the study. 

The first author of the published paper on the study is Gabriela Cabral Rezende, who conducted it as part of her PhD research at IB-UNESP with a scholarship from FAPESP.

The black lion tamarin originally inhabited a long strip of the state of São Paulo running through its southwestern and central portions and amounting in aggregate to 92,239 square kilometers (km²). The model created by the researchers identified only 2,096 km² of areas suitable for the species, of which it currently occupies less than 40%. In sum, the species now occupies less than 1% of its original distribution area.
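The area figures above can be checked with a quick back-of-the-envelope calculation (a sketch: the exact occupied area is not reported in the article, so only an upper bound based on the 40% figure is computed):

```python
# Back-of-the-envelope check of the area figures reported above.
original_km2 = 92_239   # original distribution of the black lion tamarin
suitable_km2 = 2_096    # area the model identified as suitable

# Share of the original range that the model still rates suitable
suitable_share = suitable_km2 / original_km2
print(f"Suitable share of original range: {suitable_share:.1%}")  # ~2.3%

# The species occupies less than 40% of the suitable area, so its current
# range is at most 40% of 2,096 km2 -- under 1% of the original range.
occupied_upper_bound_km2 = 0.40 * suitable_km2
print(f"Occupied area, upper bound: {occupied_upper_bound_km2:.0f} km2")
print(f"Share of original range: {occupied_upper_bound_km2 / original_km2:.2%}")
```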

“The numbers are alarming,” Rezende said. “However, at the same time, they show there are suitable areas not currently inhabited by the species. We used this information to design better targeted strategies.”

Forest restoration

To determine which areas would be most suitable for the species, the researchers used a methodology that correlated data for the locations in which it lives now with climate and landscape data for the area it originally occupied. The model pinpointed more areas that would be suitable in terms of climate than landscape, such as Pontal do Paranapanema in the southwest of the state and Upper Paranapanema in southeastern São Paulo.

“These areas should be prioritized for forest restoration, connecting the forest remnants inhabited by the species and benefiting other animals,” Culot said.

Areas that would be suitable in terms of both climate and landscape rank highest on the list of translocation priorities. Groups moved to these areas would be far more likely to be ecologically and genetically healthy in the medium to long run than those translocated elsewhere. Some areas in the southwest and southeast of the state matched these criteria and are therefore considered the top priority for conservation of the species.
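The prioritization logic described above can be sketched as a simple overlay of the two suitability layers. This is a hypothetical illustration only: the study's actual approach is a statistical species distribution model built from real climate and forest-cover data, not this toy rule, and "hypothetical area A" is an invented placeholder.

```python
# Toy illustration of ranking areas by combined climate/landscape suitability,
# following the priority logic described in the article.

def priority(climate_ok: bool, landscape_ok: bool) -> str:
    """Rank an area for conservation action."""
    if climate_ok and landscape_ok:
        return "translocation priority"       # best medium/long-term viability
    if climate_ok:
        return "forest restoration priority"  # climate fits, forest cover lacking
    return "low priority"

# Example classifications (suitability values here are illustrative)
areas = {
    "Pontal do Paranapanema": (True, False),  # suitable climate, fragmented forest
    "Upper Paranapanema":     (True, False),
    "hypothetical area A":    (True, True),
}
for name, (climate_ok, landscape_ok) in areas.items():
    print(f"{name}: {priority(climate_ok, landscape_ok)}")
```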

The researchers also stress the need to identify suitable areas in light of the climate change scenarios projected for the coming decades. According to a study by a different group, an area south of the Serra de Paranapiacaba (Paranapiacaba ridge, which is already inhabited by black lion tamarins) will undergo change that will make its climate more suitable for conservation of the species in the next 30-60 years. This is a large forest and considered highly suitable in landscape terms by Rezende et al., with major potential to serve as a habitat for populations of black lion tamarins in future, assuring their viability.

The species is known to be ecologically plastic and should be able to adapt to climate and landscape change, although the extent to which such change will affect its physiology is hard to predict. Its diet can shift toward fruit or small vertebrates, depending on availability. It can travel about 180 meters on the ground between forest fragments, although this incurs risks such as roadkill, predation, and attacks from domestic animals. It also plays an important role in forest regeneration by dispersing seeds.

For the researchers, it is particularly urgent to be as cost-effective as possible at a time when resources for species conservation are increasingly scarce – hence the importance of prioritizing areas and implementing properly targeted strategies.

“The study shows how theoretical models can be used in practice to help plan conservation policies and actions, increasing the likelihood of success,” Rezende said. “Although it focuses on a single species, the approach it describes is a powerful tool that can be used to establish conservation priorities for other species.”

Credit: 
Fundação de Amparo à Pesquisa do Estado de São Paulo

Drought of the century in the Middle Ages -- with parallels to climate change today?

image: In the journal Climate of the Past, researchers from the Leibniz Institutes for the History and Culture of Eastern Europe (GWZO) and Tropospheric Research (TROPOS) write that the 1302-07 weather patterns display similarities to the 2018 weather anomaly, in which continental Europe experienced exceptional heat and drought.

Image: 
Tilo Arnhold, TROPOS

Leipzig. The transition from the Medieval Warm Period to the Little Ice Age was apparently accompanied by severe droughts between 1302 and 1307 in Europe; this preceded the wet and cold phase of the 1310s and the resulting great famine of 1315-21. In the journal Climate of the Past, researchers from the Leibniz Institutes for the History and Culture of Eastern Europe (GWZO) and Tropospheric Research (TROPOS) write that the 1302-07 weather patterns display similarities to the 2018 weather anomaly, in which continental Europe experienced exceptional heat and drought. Both the medieval and recent weather patterns resemble the stable weather patterns that have occurred more frequently since the 1980s due to the increased warming of the Arctic. According to the Leibniz researchers' hypothesis based on their comparison of the 1302-07 and 2018 droughts, transitional phases in the climate are always characterized by periods of low variability, in which weather patterns remain stable for a long time.
The published study presents preliminary findings of the Freigeist Junior Research Group on the Dantean Anomaly (1309-1321) at the Leibniz Institute for the History and Culture of Eastern Europe (GWZO). Funded by the Volkswagen Stiftung, the group is investigating the rapid climate change in the early 14th century and its effects on late medieval Europe.

The Great Famine (1315-1321) is considered the largest pan-European famine of the past millennium. It was followed a few years later by the Black Death (1346-1353), the most devastating pandemic known, which wiped out about a third of the population. At least partially responsible for both of these crises was a phase of rapid climate change after 1310, called the "Dantean Anomaly" after the contemporary Italian poet and philosopher Dante Alighieri. The 1310s represent a transitional phase from the High Medieval Climate Anomaly, a period of relatively high temperatures, to the Little Ice Age, a long climatic period characterized by lower temperatures and advancing glaciers.

The Leipzig-based researchers are studying the regions of northern Italy, southeastern France, and east central Europe. These areas have been little studied with regard to the Great Famine thus far, but offer a variety of historical sources for the reconstruction of extreme meteorological events and their socio-economic effects, including how vulnerable societies were at the time. "We want to show that historical climate change can be reconstructed much better if written historical sources are incorporated alongside climate archives like tree rings or sediment cores. The inclusion of humanities research clearly contributes to a better understanding of the social consequences of climate change in the past and to drawing conclusions for the future," explains Dr Martin Bauch from the GWZO, who heads the junior research group.

The study now published evaluates a large number of historical sources: chronicles from present-day France, Italy, Germany, Poland, and the Czech Republic. Regional and municipal chronicles provided information on historical city fires, which were an important indicator of droughts. Administrative records from Siena (Italy), the County of Savoy (France) and the associated region of Bresse shed light on economic developments there. Using the data, it was possible, for example, to estimate wheat and wine production in the French region of Bresse and compare it with wheat production in England. Since these yields depend strongly on climatic factors such as temperature and precipitation, it is thus possible to draw conclusions about the climate in the respective production years.

While the summer of 1302 was still very rainy in central Europe, several hot, very dry summers followed from 1304 onwards. From the perspective of climate history, this was the most severe drought of the 13th and 14th centuries. "Sources from the Middle East also report severe droughts. Water levels in the Nile, for example, were exceptionally low. We therefore think that the 1304-06 drought was not only a regional phenomenon, but probably had transcontinental dimensions," reports Dr. Thomas Labbé from the GWZO.

Based on the recorded effects, the team reconstructed the historical weather conditions between the summer of 1302 and 1307. Through evaluations of the 2018 drought and similar extreme events, it is now known that, in such cases, a so-called "precipitation seesaw" usually prevails. This is the meteorological term for a sharp contrast between extremely high precipitation in one part of Europe and extremely low precipitation in another. "This is usually caused by stable high and low pressure areas that remain in one region for an unusually long time. In 2018, for example, very stable lows lay over the North Atlantic and southern Europe for a long time, which led to heavy precipitation there and an extreme drought in between in central Europe," explains meteorologist Dr Patric Seifert from TROPOS, who was responsible for reconstructing the large-scale weather situations for the study. The analysis of the possible large-scale weather situations indicates that between 1303 and 1307, a strong, stable high pressure system predominated over central Europe, which explains the extreme drought in these years.

The analysis of these historical weather situations is particularly interesting given the ongoing discussion about how climate change in the Arctic affects weather patterns in Europe. In recent decades, the Arctic has warmed more than twice as much as other regions of the world. This phenomenon, called "Arctic Amplification," is being studied by a DFG Collaborative Research Centre led by the University of Leipzig. One theory assumes that the disproportionate warming of the Arctic causes the temperature differences--and thus also the atmospheric dynamics--between the mid-latitudes and the region around the North Pole to decrease. As a result, according to a common hypothesis, weather patterns may persist longer than in the past. "Even if it was a phase of cooling in the Middle Ages and we are now living in a phase of man-made warming, there could be parallels. The transitional period between two climate phases could be characterized by smaller temperature differences between the latitudes and cause longer-lasting large-scale weather patterns, which could explain an increase in extreme events," Seifert cautions.

In their study, the researchers recorded a noticeable coincidence between the periods of drought and urban fires. Fires were a great danger for the densely constructed cities in the Middle Ages, where there were no fire brigades like there are today. The best documented fire between 1302 and 1307 was probably in Florence, where over 1,700 houses burned on 10 June 1304. Sources for Italy and France showed a correlation between extreme drought and fires. "We think our analysis is the first to find a correlation between fires and droughts over a two-hundred-year period. Large urban fires usually followed droughts by a year. The wooden structures in medieval houses did not dry out immediately. But once they did, they ignited very easily," explains Bauch. Contemporaries were also aware of the connection between drought and fire: during dry periods, citizens were obliged to place buckets of water next to their front doors--a primitive sort of fire extinguisher, to be kept available at all times. It was only later that municipalities organized fire brigades, for example in Florence around 1348. Major infrastructural measures in response to the droughts have survived in the cities of northern Italy: Parma and Siena invested in larger, deeper wells, and Siena also bought a harbor on the Mediterranean coast, which it expanded after the drought years of 1302-04 in order to be able to import grain and become less dependent on domestic production.
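The kind of lagged drought–fire relationship described above can be checked on any pair of yearly series with a simple shifted correlation. The series below are invented for illustration; the study's own reconstructions of droughts and documented city fires are not reproduced here.

```python
# Correlate a yearly drought index with urban-fire counts one year later.
# Both series below are hypothetical illustration data.

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

drought_index = [0.1, 0.9, 0.8, 0.2, 0.1, 0.7]  # hypothetical, by year
fire_counts   = [1,   0,   5,   4,   1,   0]    # hypothetical, same years

# Large fires tended to follow droughts by about a year,
# so shift the fire series back one year before correlating.
lag0 = pearson(drought_index, fire_counts)
lag1 = pearson(drought_index[:-1], fire_counts[1:])
print(f"same-year correlation: {lag0:.2f}, one-year-lag correlation: {lag1:.2f}")
```

With the illustrative data, the one-year-lag correlation is much stronger than the same-year one, mirroring the pattern the historians report.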

"According to our analysis, the drought of 1302-1307 was a once-in-a-century event with regard to its duration. No other drought reached these dimensions in the 13th and 14th centuries. The next event that came close was not until the drought of 1360-62, which stretched across Europe and for which there are indications in the historical record in Japan, Korea, and India," concludes Annabell Engel, M.A., from GWZO. In connection with global warming, researchers expect more frequent extreme events such as droughts. While numerous studies have already documented strong fluctuations in the 1340s, shortly before the plague epidemic, the first decade of the 14th century, unlike the 1310s, has been the focus of little research so far. The Leibniz researchers have now been able to show for the first time that exceptionally dry summers between 1302 and 1304 south of the Alps and between 1304 and 1307 north of the Alps were the result of stable weather conditions and disparately distributed precipitation. The study thus sheds new light on the first years of the 14th century with its dramatic changes and draws a link to modern climate changes. "However, it is difficult to draw conclusions about future climatic developments in the 21st century from our study. While climate fluctuations in the 14th century were natural phenomena, in the modern age, humans are exerting artificial influence on the climate, as well," note Bauch and Seifert.

Credit: 
Leibniz Institute for Tropospheric Research (TROPOS)

Climate change caused mangrove collapse in Oman

image: 6,000 years ago, mangroves were widespread in Oman. Today, only one particularly robust mangrove species remains there, and this is found in just a few locations.

Image: 
© Valeska Decker/University of Bonn

Most of the mangrove forests on the coasts of Oman disappeared about 6,000 years ago. Until now, the reason for this was not entirely clear. A recent study by the University of Bonn (Germany) now sheds light on this: It indicates that the collapse of coastal ecosystems was caused by climatic changes. In contrast, falling sea level or overuse by humans are not likely to be the reasons. The speed of the mangrove extinction was dramatic: Many of the stocks were irreversibly lost within a few decades. The results are published in the journal Quaternary Research.

Mangroves are trees that occupy a very special ecological niche: They grow in the so-called tidal range, meaning coastal areas that are under water at high tide and dry at low tide. Mangroves like a warm climate; most species do not tolerate sea surface temperatures below 24 °C (75°F). They are tolerant to salt, but only up to a tolerance limit that varies from species to species. "This is why we find them nowadays mostly in regions where enough rain falls to reduce salinization of the soil," explains Valeska Decker of the Institute for Geosciences at the University of Bonn, the lead author of the study.

Fossil finds prove that there used to be many mangrove lagoons on the coast of Oman. However, some 6,000 years ago these suddenly largely vanished - the reasons for this were previously disputed. Over the past few years, Decker traveled several times to the easternmost country of the Arabian Peninsula to pursue this question for her doctoral thesis. With the support of her doctoral supervisor Prof. Gösta Hoffmann, she compiled numerous geochemical, sedimentological and archaeological findings into an overall picture. "From our point of view, everything suggests that the collapse of these ecosystems has climatic reasons," she says.

Low pressure trough shifted to the south

Along the equator there is a low pressure trough, the Intertropical Convergence Zone, which is situated a little further north or south depending on the season. The Indian summer monsoon, for example, is linked to this zone. It is believed that about 10,000 years ago this zone was much further north than today, which meant the monsoon affected large parts of the Arabian Peninsula. Just over 6,000 years ago this low-pressure trough then shifted to the south, but the reason for this and how fast is still not completely clear.

"That this was the case has been well documented for several years," explains Decker. "Our results now indicate that this climate change had two effects: On the one hand, it caused salinization of the soil, which put the mangroves under extreme stress. On the other hand, the vegetation cover in the affected areas decreased in general due to the greater drought." This increased erosion: The wind carried large amounts of the barren soil into the lagoons. These silted up and successively dried up. The whole thing happened surprisingly fast: "The ecosystems probably disappeared within a few decades," stresses Decker. According to previous studies, the environmental changes were gradual. The mangrove ecosystems struggled until a certain threshold was reached and then collapsed within decades. Nowadays, the only mangroves in Oman are those of a particularly robust species and are found only in a few places.

She was able to exclude other possible causes for the disappearance of the mangroves in her study. For example, the researchers found no evidence of a drop in sea level 6,000 years ago that could have triggered the mangrove extinction. "Archaeological findings also speak against a man-made ecological catastrophe," she says. "It is true that there were humans living in the coastal regions who used the mangroves as firewood. However, they were nomads who did not build permanent settlements. This meant that their need for wood was relatively low - low enough to rule out overuse as a cause."

Decker and her colleagues now want to further investigate how much the annual precipitation changed and what impact this had on the region. To this end, the researchers plan to study the pollen that has persisted in the lagoon sediment for thousands of years. They want to find out how the vegetation changed as a result of the drought. The results could also be relevant for us: In many regions of the world, the climate is changing at a dramatic pace. Germany has also suffered increasingly from long droughts in recent years. Foresters are therefore already planning to plant more drought-resistant species in this country; this is a consequence of climate change that may leave long-term marks in the history of vegetation.

Credit: 
University of Bonn

Advanced materials in a snap

image: Sandia National Laboratories has developed a machine learning algorithm capable of performing simulations for materials scientists nearly 40,000 times faster than normal.

Image: 
Eric Lundin, Sandia National Laboratories

ALBUQUERQUE, N.M. -- If everything moved 40,000 times faster, you could eat a fresh tomato three minutes after planting a seed. You could fly from New York to L.A. in half a second. And you'd have waited in line at airport security for that flight for 30 milliseconds.

Thanks to machine learning, designing materials for new, advanced technologies could accelerate that much.

A research team at Sandia National Laboratories has successfully used machine learning -- computer algorithms that improve themselves by learning patterns in data -- to complete cumbersome materials science calculations more than 40,000 times faster than normal.

Their results, published Jan. 4 in npj Computational Materials, could herald a dramatic acceleration in the creation of new technologies for optics, aerospace, energy storage and potentially medicine while simultaneously saving laboratories money on computing costs.

"We're shortening the design cycle," said David Montes de Oca Zapiain, a computational materials scientist at Sandia who helped lead the research. "The design of components grossly outpaces the design of the materials you need to build them. We want to change that. Once you design a component, we'd like to be able to design a compatible material for that component without needing to wait for years, as it happens with the current process."

The research, funded by the U.S. Department of Energy's Basic Energy Sciences program, was conducted at the Center for Integrated Nanotechnologies, a DOE user research facility jointly operated by Sandia and Los Alamos national labs.

Machine learning speeds up computationally expensive simulations

Sandia researchers used machine learning to accelerate a computer simulation that predicts how changing a design or fabrication process, such as tweaking the amounts of metals in an alloy, will affect a material. A project might require thousands of simulations, which can take weeks, months or even years to run.

The team clocked a single, unaided simulation on a high-performance computing cluster with 128 processing cores (a typical home computer has two to six processing cores) at 12 minutes. With machine learning, the same simulation took 60 milliseconds using only 36 cores, equivalent to a speedup of roughly 42,000 times on comparable hardware. This means researchers can now learn in under 15 minutes what would normally take a year.
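The 42,000× figure follows from the raw timings once the core counts are equalized. This is a sketch of the arithmetic as reported; the normalization to equal hardware assumes simulation time scales linearly with core count, which the article implies but does not state.

```python
# Reproduce the reported ~42,000x speedup from the raw timings.
baseline_s = 12 * 60   # 12 minutes on 128 cores, unaided simulation
ml_s = 0.060           # 60 milliseconds on 36 cores, with machine learning
baseline_cores, ml_cores = 128, 36

raw_speedup = baseline_s / ml_s                        # ~12,000x, ignoring cores
# Normalize to equal hardware, assuming linear scaling with core count
equal_speedup = raw_speedup * (baseline_cores / ml_cores)
print(f"raw: {raw_speedup:,.0f}x, core-normalized: {equal_speedup:,.0f}x")
```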

Sandia's new algorithm arrived at an answer that was 5% different from the standard simulation's result, a very accurate prediction for the team's purposes. Machine learning trades some accuracy for speed because it makes approximations to shortcut calculations.

"Our machine-learning framework achieves essentially the same accuracy as the high-fidelity model but at a fraction of the computational cost," said Sandia materials scientist Rémi Dingreville, who also worked on the project.

Benefits could extend beyond materials

Dingreville and Montes de Oca Zapiain are going to use their algorithm first to research ultrathin optical technologies for next-generation monitors and screens. Their research, though, could prove widely useful because the simulation they accelerated describes a common event -- the change, or evolution, of a material's microscopic building blocks over time.

Machine learning previously has been used to shortcut simulations that calculate how interactions between atoms and molecules change over time. The published results, however, demonstrate the first use of machine learning to accelerate simulations of materials at relatively large, microscopic scales, which the Sandia team expects will be of greater practical value to scientists and engineers.

For instance, scientists can now quickly simulate how minuscule droplets of melted metal will glob together when they cool and solidify, or conversely, how a mixture will separate into layers of its constituent parts when it melts. Many other natural phenomena, including the formation of proteins, follow similar patterns. And while the Sandia team has not tested the machine-learning algorithm on simulations of proteins, they are interested in exploring the possibility in the future.

Credit: 
DOE/Sandia National Laboratories

Some English bulldogs thought to have cancer may have newly identified syndrome

image: Tootsie was one of 84 English bulldogs that supplied blood that helped researchers discover a non-cancerous syndrome, called polyclonal B-cell lymphocytosis, that could be mistaken for a common cancer.

Image: 
Colorado State University

DENVER/January 5, 2020 – Some English bulldogs diagnosed with a common cancer may instead have a newly described, non-cancerous syndrome called polyclonal B‐cell lymphocytosis. The discovery was made by Morris Animal Foundation-funded researchers at Colorado State University during a study to better understand B-cell chronic lymphocytic leukemia (BCLL). The team published their findings in the Journal of Veterinary Internal Medicine.

“This could save some dogs from being misdiagnosed, treated for cancer or even euthanized when they shouldn’t be,” said Dr. Anne Avery, Professor, Department of Microbiology, Immunology and Pathology at Colorado State University. “The dogs may look to their veterinarians like they have leukemia, based on original diagnostics, but they don’t actually have cancer.”

In a previous BCLL paper published by Dr. Avery’s team, they identified different breeds at an increased risk for that tumor type. One breed was English bulldogs, but the dogs had a unique presentation as compared to the other breeds. English bulldogs were significantly younger when they presented and also had differences in what their B-cells (antibody-producing white blood cells) expressed on their cell surface when analyzed by flow cytometry. That led researchers to wonder whether English bulldogs truly had BCLL or a different, previously unidentified disease.

For this retrospective study, the team identified 84 cases with increased numbers of B-cells in the blood, drawn from their database of 195 English bulldogs. The team analyzed the serum of these dogs to determine the types of antibodies they produced. Since many of the dogs had enlarged spleens, the team also took a closer look to see what kinds of cells expanded there.

Then researchers performed an assay to determine if the B-cells were identical or not. If they were identical, it would suggest they arose from a single cell and likely would have been cancerous. The team also examined the sexes of the dogs, as well as the ages and what clinical signs were present (if any) when the blood was drawn.

The team found that 70% of the dogs did not have cancer. These dogs tended to be young, some just 1 or 2 years old when they developed the syndrome. Three-quarters of the dogs were male, and more than half had enlarged spleens. Most of the cases had hyperglobulinemia, an excess of antibodies in the blood stream. The team hypothesized that this syndrome has a genetic basis.

“This important finding demonstrates that we shouldn’t assume that a high B-cell count always indicates cancer in English bulldogs,” said Dr. Janet Patterson-Kane, Morris Animal Foundation Chief Scientific Officer. “This is very important information for veterinarians who may initially see these patients in their clinic.”

The team is looking for evidence of this syndrome in other breeds, but they believe it is rare in dogs other than English bulldogs. The group’s next step is to look for the genetic mutation that leads to this syndrome. They would also like to follow the dogs for a longer period to learn if there are health consequences to persistently high B-cell numbers.

The findings in this new study clarify the results from the original BCLL research, according to Avery. Rather than being at high risk for BCLL as originally thought, English bulldogs develop a benign syndrome that has many similarities to leukemia. The syndrome almost certainly has an underlying genetic cause and does not appear to have a malignant clinical course.

Credit: 
Morris Animal Foundation

Routine eye scans may give clues to cognitive decline in diabetes

image: Ophthalmology Exam of Patient with Diabetes

Image: 
Beetham Eye Institute/Joslin Diabetes Center

BOSTON - (January 4, 2021) - As they age, people with diabetes are more likely to develop Alzheimer's disease and other cognitive disorders than are people without diabetes. Scientists at Joslin Diabetes Center now have shown that routine eye imaging can identify changes in the retina that may be associated with cognitive disorders in older people with type 1 diabetes.

These results may open up a relatively easy method for early detection of cognitive decline in this population, providing better ways to understand, diagnose and ultimately treat the decline, said George L. King, MD, Joslin's Chief Scientific Officer and senior author on a paper about the study in the Journal of Clinical Endocrinology & Metabolism.

Previous research had demonstrated an association between proliferative diabetic retinopathy (PDR, a complication of diabetes that can severely damage eyesight) and cognitive impairment in people with type 1 diabetes. "Since we knew that there were cellular changes in the retina that might reflect changes in the brain, we were interested to see whether imaging techniques that visualize those changes in the retina might be reflective of changes in cognitive functions," said Ward Fickweiler, MD, a Joslin postdoctoral fellow and first author on the paper.

The scientists drew on eye scans routinely gathered from patients as part of normal vision care at Joslin's Beetham Eye Institute. One set of scans was based on optical coherence tomography (OCT, a technique employing light to provide cross-sections of the retina). A second set of scans employed OCT angiography (OCTA, an extension of OCT technology that examines blood vessels in the retina). Both types of scans are non-invasive and widely available in eye clinics in the United States, and can be performed within minutes.

The study enlisted 129 participants in the Joslin Medalist Study, which examines outcomes among people who have had type 1 diabetes for 50 years or longer. These volunteers took a series of cognitive tests that included tasks probing memory function as well as psychomotor speed (assessing the time it took to arrange objects by hand).

Strikingly, the researchers found very strong associations between performance on memory tasks and structural changes in deep blood vessel networks in the retina. "Memory is the main cognitive task that is affected in Alzheimer's disease and cognitive decline, so that was exciting," Fickweiler said.

The Joslin team also discovered strong associations between PDR and psychomotor speed. This finding reinforced earlier outcomes that had been identified among a smaller group of Joslin Medalists, and provided details about related changes in retinal structure. Additionally, the researchers saw that PDR was associated with memory performance among the larger group of Medalists.

While these results need to be confirmed in larger clinical investigations, the routine eye exams do seem to detect the cognitive changes happening in people with diabetes, said Fickweiler.

Currently, other methods of detecting conditions such as Alzheimer's disease, including MRI scans, are difficult and expensive. People typically are tested only when they're showing symptoms of cognitive decline, and treatments at that stage generally don't offer much help.

"If you can detect the condition at an earlier stage, when they're still asymptomatic, that may benefit patients," Fickweiler said. Earlier detection also could aid the quest to develop better therapies for neurocognitive diseases.

The Joslin team plans to launch a larger prospective study to confirm the potential of eye imaging to pick up signs of cognitive decline over time. This research will include people with type 1 diabetes who are younger and haven't had the disease for as long as the Medalists. The scientists also will analyze MRI brain images and postmortem brain samples donated by Medalists.

Additionally, the investigators will look for common mechanisms that may inflict damage on brain and retina tissues, which share much of their early embryonic development pathways. Likely suspects in people with diabetes include impaired blood vessels and high or low levels of blood glucose. The autoimmunity that drives type 1 diabetes also might inflict other forms of harm, King said.

Notably, Joslin Medalists often display relatively low levels of the complications that can afflict those with long-term type 1 diabetes. For instance, almost half of Medalists don't develop advanced eye disease, and only one of the 129 Medalists in the eye-scan study may have Alzheimer's disease. "It is possible that in the Medalists, a shared mechanism alters the progression of the early stages of retinal and brain neurodegeneration, and provides protection against both PDR and Alzheimer's disease," Fickweiler speculated.

In addition to follow-up work in type 1 diabetes, King and his team plan to perform a similar study for people with type 2 diabetes. PDR also is associated with cognitive decline in this much larger group of patients, who also get OCT and OCTA eye scans as part of their regular vision care.

Credit: 
Joslin Diabetes Center

Identifying Canada's key conservation hot spots highlights problem

image: Despite the size of the country, new mapping suggests that less than 1% of Canada's land (0.6% of total area or approximately 56,000 km2) provides all ecosystem services (e.g. freshwater, climate regulation, recreation) in one area.

Image: 
Jake Dyson

To stop biodiversity loss, Canada recently committed to protecting 30% of its land and sea by 2030. But making conservation decisions about where to locate new protected areas is complicated. It depends on data both about biodiversity and about a range of benefits (e.g. freshwater, climate regulation, recreation) that people get from nature. Surprisingly, despite the size of the country, new mapping suggests that less than 1% of Canada's land (0.6% of total area or approximately 56,000 km2) is a hot spot, providing all these benefits in one place. Moreover, the study published today in Environmental Research Letters suggests that some of the most critical areas where people receive these key benefits from nature do not occur within currently protected areas and may be threatened by current or future natural resource extraction.

"This research is especially timely as it should help all levels of government design conservation plans that ensure that both people and nature thrive," says Elena Bennett, from McGill University's Bieler School of the Environment and one of the authors on a multi-institutional team that included researchers from the Universities of British Columbia, McGill and Carleton, and from the Yellowstone to Yukon Conservation Initiative (Y2Y).

Identifying key areas of Canada that provide ecosystem services

The paper highlights multiple places across Canada as important for one or more ecosystem services that include providing freshwater (such as for irrigation, drinking or hydroelectricity), climate regulation (as in the case of forests and wetlands that act as carbon sinks), or for nature-based human recreation. These include the forests of British Columbia and the Hudson Bay lowlands for above- and below-ground carbon; north-central Quebec, the eastern mountains of British Columbia, the eastern slopes of the Rockies in Alberta, and the north shore of Lake Superior for freshwater; and the Rocky Mountains, eastern Ontario, and southern Quebec for nature-based recreation.

"Canada is grappling with where and how to protect nature. Just one example of how this research could be used is in western Alberta. Our research shows that the Eastern Slopes of the Rockies is one of the most important places across the whole country for its combination of freshwater, carbon storage, and recreation -- not to mention important wildlife habitat -- and yet the same area is at risk from open-pit coal mining and other threats," says Dr. Aerin Jacob, co-author and conservation scientist, at the Yellowstone to Yukon Conservation Initiative.

A question of both supply and demand

Crucially and unusually, the mapping methods included both nature's capacity to supply these benefits as well as the human access and demand for them.

"Most research that studies the benefits people get from nature only evaluates where nature has the potential to supply these benefits. For example, where rain falls. Because our work also models and maps human access and demand, we could identify where people actually receive these benefits from nature. For example, the key locations producing water that people use for drinking, farming, or hydroelectricity," says Dr. Matthew Mitchell, lead author and Research Associate, Institute for Resources, Environment, and Sustainability, University of British Columbia. "Governments need to know both of these things in order to take action that protects human well-being. Research like this can help society do that."

Credit: 
McGill University

Natural products with potential efficacy against lethal viruses

image: Lifecycle of Coronavirus as represented by SARS-CoV-1/2 and MERS-CoV viruses

Image: 
Illustration: Jennifer Matthews/Scripps Oceanography

Researchers at Scripps Institution of Oceanography and the Skaggs School of Pharmacy and Pharmaceutical Sciences at the University of California San Diego have broken down the genomic and life-history traits of three classes of viruses that have caused endemic and global pandemics in the past, and identified natural products - compounds produced in nature - with the potential to disrupt their spread.

In a review appearing in the Journal of Natural Products, marine chemists Mitchell Christy, Yoshinori Uekusa, and William Gerwick, and immunologist Lena Gerwick describe the basic biology of three families of RNA viruses and how they infect human cells. These viruses use RNA instead of DNA to store their genetic information, a trait that helps them to evolve quickly. The team then describes the natural products that have been shown to have capabilities to inhibit them, highlighting possible treatment strategies.

"We wanted to evaluate the viruses that are responsible for these deadly outbreaks and identify their weaknesses," said Christy, the first author. "We consider their similarities and reveal potential strategies to target their replication and spread. We find that natural products are a valuable source of inhibitors that can be used as a basis for new drug development campaigns targeting these viruses."

The research team is from Scripps Oceanography's Center for Marine Biotechnology and Biomedicine (CMBB), which collects and analyzes chemical compounds found in marine environments for potential efficacy as antibiotics, anticancer therapies, and other products with medical benefit. A drug known as Marizomib entered the final stages of clinical trials as a potential treatment for brain cancers earlier in 2020. The drug came from a genus of marine bacteria that CMBB researchers had originally collected in seafloor sediments in 1990.

The researchers, funded by the National Institutes of Health and the UC San Diego Chancellor's Office, present an overview of the structure of viruses in the families Coronaviridae, Flaviviridae, and Filoviridae. Within these families are viruses that have led to COVID-19, dengue fever, West Nile encephalitis, Zika, Ebola, and Marburg disease outbreaks. The team then identifies compounds produced by marine and terrestrial organisms that have some demonstrated level of activity against these viruses. Those compounds are thought to have molecular architectures that make them potential candidates to serve as viral inhibitors, preventing viruses from penetrating healthy human cells or from replicating. The goal of the review, the researchers said, was to improve the process of drug development as new pandemics emerge, so that containing disease spread can accelerate in the face of new threats.

"It is simply common sense that we should put into place the infrastructure necessary to more rapidly develop treatments when future pandemics occur," the review concludes. "One such recommendation is to create and maintain international compound libraries with substances that possess antiviral, antibacterial, or antiparasitic activity."

To achieve that goal, the researchers realize that international agreements would need to be reached to address intellectual property issues, the rights and responsibilities of researchers, and other complex issues.

And while there has been remarkable progress in the development of vaccines for SARS-CoV-2 infection, effective antiviral drugs are also critically needed for managing COVID-19 infection in unvaccinated individuals or in cases where the efficacy of a vaccine decreases over time, the researchers said. While several candidate antiviral molecules have been investigated for use in the clinic, such as remdesivir, lopinavir-ritonavir, hydroxychloroquine, and type I interferon therapy, all have shown limited or no efficacy in large scale trials. Effective antiviral drugs are still much in need of discovery and development.

Credit: 
University of California - San Diego

Gum disease-causing bacteria borrow growth molecules from neighbors to thrive

BUFFALO, N.Y. - The human body is filled with friendly bacteria. However, some of these microorganisms, such as Veillonella parvula, may be too nice. These peaceful bacteria engage in a one-sided relationship with pathogen Porphyromonas gingivalis, helping the germ multiply and cause gum disease, according to a new University at Buffalo-led study.

The research sought to understand how P. gingivalis colonizes the mouth. The pathogen is unable to produce its own growth molecules until it achieves a large population in the oral microbiome (the community of microorganisms that live on and inside the body).

The answer: It borrows growth molecules from V. parvula, a common yet harmless bacterium in the mouth whose growth is not population dependent.

In a healthy mouth, P. gingivalis makes up a minuscule amount of the bacteria in the oral microbiome and cannot replicate. But if dental plaque is allowed to grow unchecked due to poor oral hygiene, V. parvula will multiply and eventually produce enough growth molecules to also spur the reproduction of P. gingivalis.

More than 47% of adults 30 and older have some form of periodontitis (also known as gum disease), according to the Centers for Disease Control and Prevention. Understanding the relationship between P. gingivalis and V. parvula will help researchers create targeted therapies for periodontitis, says Patricia Diaz, DDS, PhD, lead investigator on the study and Professor of Empire Innovation in the UB School of Dental Medicine.

"Having worked with P. gingivalis for nearly two decades, we knew it needed a large population size to grow, but the specific processes that drive this phenomenon were not completely understood," says Diaz, also director of the UB Microbiome Center. "Successfully targeting the accessory pathogen V. parvula should prevent P. gingivalis from expanding within the oral microbial community to pathogenic levels."

The study, which was published on Dec. 28 in the ISME Journal, tested the effects of growth molecules exuded by microorganisms in the mouth on P. gingivalis, including molecules from five species of bacteria that are prevalent in gingivitis, a condition that precedes periodontitis.

Of the bacteria examined, only growth molecules secreted by V. parvula enabled the replication of P. gingivalis, regardless of the strain of either microbe. When V. parvula was removed from the microbiome, growth of P. gingivalis halted. However, the mere presence of any V. parvula was not enough to stimulate P. gingivalis, as the pathogen was only incited by a large population of V. parvula.

Data suggest that the relationship is one-directional as V. parvula received no obvious benefit from sharing its growth molecules, says Diaz.

"P. gingivalis and V. parvula interact at many levels, but the beneficiary is P. gingivalis," says Diaz, noting that V. parvula also produces heme, which is the preferred iron source for P. gingivalis.

"This relationship that allows growth of P. gingivalis was not only confirmed in a preclinical model of periodontitis, but also, in the presence of V. parvula, P. gingivalis could amplify periodontal bone loss, which is the hallmark of the disease," says George Hajishengallis, DDS, PhD, co-investigator on the study and Thomas W. Evans Centennial Professor in the University of Pennsylvania School of Dental Medicine.

"It is not clear whether the growth-promoting cues produced by P. gingivalis and V. parvula are chemically identical," says Diaz. "Far more work is needed to uncover the identity of these molecules."

Credit: 
University at Buffalo

A plant's way to its favorite food

video: The video captures the growth of the Arabidopsis root tip supplemented with ammonium vs. nitrate.

Image: 
Krisztina Ötvös / IST Austria.

Like any other plant, Arabidopsis thaliana, or mouse-ear cress, needs nitrogen to survive and thrive. Like maize, beans and sugar beet, it prefers nitrogen in the form of nitrate, growing better on nitrate-rich soil. Pine and rice, by contrast, preferentially grow on ammonium, another form of the key macronutrient nitrogen. If the concentration or availability of the different forms of nitrogen fluctuates, plants have to adapt quickly. "One of the most important questions is, what is the role of plant hormones in adaptation to the nitrogen availability? How do the machineries within a plant cope with their changing environment?" asks Eva Benková, developmental biologist and Professor at the Institute of Science and Technology (IST) Austria.

Finding the balance

In search of answers, Krisztina Ötvös, postdoctoral fellow in the research team of Eva Benková, together with colleagues from the Universidad Politécnica de Madrid, the Pontifical Catholic University of Chile, the Austrian Institute of Technology and the University of Montpellier, looked at two extremes: They compared how Arabidopsis seedlings that were grown exclusively on ammonium reacted, once the scientists transferred them to media containing either ammonium or nitrate.

If a plant lives in suboptimal soil, it tries to maintain its root growth as long as possible to reach a more suitable form of nitrogen. The major processes that maintain root growth are cell proliferation in the meristem, a plant tissue consisting of undifferentiated cells, and cell expansion. The plant has to strike a good balance between the two. Provided with ammonium, the form of nitrogen Arabidopsis is not so fond of, the meristematic zone of the cress produced fewer cells. Instead, they elongated very quickly. "Once we moved the plants to the nitrate, suddenly the meristem became bigger, more cells were produced and there was a different kinetics in cell expansion," says Benková. "Now Arabidopsis could afford to put more energy into cell division and optimized its root growth differently."

Controlling the hormone flow

Whether the plant invests in cell proliferation or cell elongation is instructed by the level of auxin. This plant hormone is essential for all developmental processes. It is transported in a very controlled way from one cell to the next by special auxin transporters. The proteins that control the transport of auxin out of the cells, so-called efflux carriers, regulate the flow of auxin depending on which side of the cell they sit on. Benková and her team were especially interested in the auxin transporter PIN2, which mediates the flow of auxin at the very root tip. The researchers identified PIN2 as the main factor setting up the balance between cell division and cell elongation. "We observed that once we moved plants onto the nitrate, the localization of PIN2 changes. Thereby, it changes the distribution of auxin."

The activity of PIN2, on the other hand, is affected by its phosphorylation status. "What really surprised us was that one modification, the phosphorylation of such a big protein like an efflux carrier, can have such an important impact on the root behavior," Benková adds. Furthermore, the amino acid of PIN2 that is the target of the phosphorylation is present in many different plant species, suggesting that PIN2 might be universally involved in other species' strategies for adapting to changing nitrogen sources. In a next step, the researchers want to understand the machinery that controls the change of the phosphorylation status.

A very close look

"The present study is the result of the input of many different people, from cell biologists and computer scientists to people working in advanced microscopy. It really is a multidisciplinary approach," Eva Benková emphasizes. In order to take a close look at the processes within Arabidopsis' roots, for example, the biologists used a vertical confocal microscope - a tool specially adapted at IST Austria to suit the researchers' needs. Instead of a horizontal stage, the microscope uses a vertical one, which allows researchers to observe plants growing the way they naturally do - along the gravity vector. With its high resolution, Benková and her team were able to observe in real time how the cells within Arabidopsis' roots were dividing and expanding. In a previous project, researchers at IST Austria won Nikon's Small World in Motion video competition, showing live-tracking of a growing root tip of Arabidopsis thaliana under the microscope.

Credit: 
Institute of Science and Technology Austria

University of Miami leads groundbreaking trial for COVID-19 treatment

image: Camillo Ricordi, M.D., director of the Diabetes Research Institute (DRI) and Cell Transplant Center at the University of Miami Miller School of Medicine

Image: 
University of Miami Health System

University of Miami Miller School of Medicine researchers led a unique and groundbreaking randomized controlled trial showing that umbilical cord-derived mesenchymal stem cell infusions safely reduce the risk of death and speed recovery for the most severe COVID-19 patients, according to results published in STEM CELLS Translational Medicine in January 2021.

The study's senior author, Camillo Ricordi, M.D., director of the Diabetes Research Institute (DRI) and Cell Transplant Center at the University of Miami Miller School of Medicine, said treating COVID-19 with mesenchymal stem cells makes sense.

Results: treatment group vs. control group

The paper describes findings from 24 patients hospitalized at University of Miami Tower or Jackson Memorial Hospital with COVID-19 who developed severe acute respiratory distress syndrome. Each received two infusions given days apart of either mesenchymal stem cells or placebo.

"It was a double-blind study. Doctors and patients didn't know what was infused," Dr. Ricordi said. "Two infusions of 100 million stem cells were delivered within three days, for a total of 200 million cells in each subject in the treatment group."

Researchers found the treatment was safe, with no infusion-related serious adverse events.

Patient survival at one month was 91% in the stem cell treated group versus 42% in the control group. Among patients younger than 85 years old, 100% of those treated with mesenchymal stem cells survived at one month.

Dr. Ricordi and colleagues also found time to recovery was faster among those in the treatment arm. More than half of patients treated with mesenchymal stem cell infusions recovered and went home from the hospital within two weeks after the last treatment. More than 80% of the treatment group recovered by day 30, versus less than 37% in the control group.

"The umbilical cord contains progenitor stem cells, or mesenchymal stem cells, that can be expanded and provide therapeutic doses for over 10,000 patients from a single umbilical cord. It's a unique resource of cells that are under investigation for their possible use in cell therapy applications, anytime you have to modulate immune response or inflammatory response," he said. "We've been studying them with our collaborators in China for more than 10 years in Type 1 Diabetes, and there are currently over 260 clinical studies listed in clinicaltrials.gov for treatment of other autoimmune diseases."

Mesenchymal stem cells' potential to restore normal immune response

Mesenchymal cells not only help correct immune and inflammatory responses that go awry, they also have antimicrobial activity and have been shown to promote tissue regeneration.

"Our results confirm the powerful anti-inflammatory, immunomodulatory effect of UC-MSC. These cells have clearly inhibited the 'cytokine storm', a hallmark of severe COVID-19," said Giacomo Lanzoni, Ph.D, lead author of the paper and assistant research professor at the Diabetes Research Institute. "The results are critically important not only for COVID-19 but also for other diseases characterized by aberrant and hyperinflammatory immune responses, such as autoimmune Type 1 Diabetes."

When given intravenously, mesenchymal stem cells migrate naturally to the lungs. That's where therapy is needed in COVID-19 patients with acute respiratory distress syndrome, a dangerous complication associated with severe inflammation and fluid buildup in the lungs.

"It seemed to me that these stem cells could be an ideal treatment option for severe COVID-19," said Dr. Ricordi, Stacy Joy Goodman Professor of Surgery, Distinguished Professor of Medicine, and professor of biomedical engineering, microbiology and immunology. "It requires only an intravenous (IV) infusion, like a blood transfusion. It's like smart bomb technology in the lung to restore normal immune response and reverse life-threatening complications."

Early success with mesenchymal stem cells

When the pandemic emerged, Dr. Ricordi asked collaborators in China if they had studied mesenchymal stem cell treatment in COVID-19 patients. In fact, they and Israeli researchers reported great success treating COVID-19 patients with the stem cells, in many cases with 100% of treated patients surviving and recovering faster than those without stem cell treatment.

But there was widespread skepticism about these initial results, because none of the studies had been randomized, where patients randomly received treatment or a control solution (placebo), to compare results in similar groups of patients.

"We approached the FDA and they approved our proposed randomized controlled trial in one week, and we started as quickly as possible," Dr. Ricordi said.

Dr. Ricordi worked with several key collaborators at the Miller School, the University of Miami Health System, Jackson Health System, and collaborated with others in the U.S. and internationally, including Arnold I. Caplan, Ph.D., of Case Western Reserve University, who first described mesenchymal stem cells.

Next steps

The next step is to study use of the stem cells in COVID-19 patients who have not yet become severely ill but are at risk of having to be intubated, to determine if the infusions prevent disease progression.

The findings have implications for studies in other diseases, too, according to Dr. Ricordi.

Hyper-immune and hyper-inflammatory responses in autoimmune diseases might share a common thread with why some COVID-19 patients transition to severe forms of the disease and others don't.

"Autoimmunity is a big challenge for healthcare, as is COVID-19. Autoimmunity affects 20% of the American population and includes over 100 disease conditions, of which Type 1 Diabetes can be considered just the tip of the iceberg. What we are learning is that there may be a common thread and risk factors that can predispose to both an autoimmune disease or to a severe reaction following viral infections, such as SARS-CoV-2," he said.

The DRI Cell Transplant Center is planning to create a large repository of mesenchymal stem cells that are ready to use and can be distributed to hospitals and centers in North America, he said.

"These could be used not only for COVID-19 but also for clinical trials to treat autoimmune diseases, like Type 1 Diabetes," Dr. Ricordi said. "If we could infuse these cells at the onset of Type 1 Diabetes, we might be able to block progression of autoimmunity in newly diagnosed subjects, and progression of complications in patients affected by the disease long-term. We are planning such a trial specifically for diabetic nephropathy, a kidney disease that is one of the major causes of dialysis and kidney transplantation. We are also planning to do a study on umbilical cord mesenchymal stem cell transplantation in combination with pancreatic islets to see if you can modulate the immune response to an islet transplant locally."

Funding by The Cure Alliance made launching the initial trial possible, while a $3 million grant from North America's Building Trades Unions (NABTU) allowed Dr. Ricordi and colleagues to complete the clinical trial and expand research with mesenchymal stem cells.

"North America's Building Trades Unions (NABTU) has been a major supporter of the Diabetes Research Institute since 1984, when they started a campaign to fund, and build, our state-of-the-art research and treatment facility. NABTU has continued to support our work through the years, including our mesenchymal stem cell research that helped lead the way to this clinical trial," he said.

Credit: 
University of Miami Miller School of Medicine

Rare footage captured of jaguar killing ocelot at waterhole

video: Researchers captured rare footage of a male jaguar killing an ocelot, another predatory wild cat, at an isolated waterhole in the Maya Biosphere Reserve in Guatemala.

Image: 
Washington State University

PULLMAN, Wash. - In what may be a sign of climate-change-induced conflict, researchers have captured rare photographic evidence of a jaguar killing another predatory wild cat at an isolated waterhole in Guatemala.

In the footage, a male jaguar arrives near the waterhole and apparently lies in wait for an hour. It lets a potentially dangerous prey animal, a large tapir, pass by, but when the ocelot stops to drink, the jaguar pounces and carries off the smaller predator.

The event, detailed in a recent study published in the journal Biotropica, was captured in the Maya Biosphere Reserve in March 2019, a dry month in a drought year for the tropical forest, by wildlife ecologists from Washington State University and the Wildlife Conservation Society.

"Although these predator-on-predator interactions may be rare, there may be certain instances when they become more prevalent, and one of those could be over contested water resources," said Daniel Thornton, a WSU assistant professor and co-author on the paper. "People don't often think of tropical systems as being dry, but in many parts of the world, tropical rains are quite seasonal, and with climate change, some of these tropical ecosystems are expected to become even more seasonal. The more isolated and rare water resources become, the more they're going to become hotspots of activity."

Jaguars, which can weigh more than 200 pounds, typically prey on small animals like armadillos or peccaries. Ocelots, also carnivores, are smaller than their jaguar cousins at around 18 to 44 pounds, and their activity patterns overlap with the jaguars', particularly in the twilight periods of the day.

While some research has noted signs of ocelot in jaguar feces, until now, no known images have been captured of a jaguar directly killing an ocelot.

"These dramatic camera trap images clearly show the fierce competition wildlife face for precious resources like water," said Rony García-Anleu of WCS's Guatemala Program and a co-author of the study. "Unfortunately, climate change and associated droughts are predicted to worsen, which means tough times are ahead for wildlife that depend on watering holes for their survival."

The researchers had placed cameras at 42 waterholes in the area in 2018 and 2019. In the 2019 dry season only 21 had water, and none of those were within 10 km (6.2 miles) of this particular waterhole. At this same remote spot, scientists also recorded a fight between two jaguars and a jaguar attempting to attack a young tapir. They also noted that seven different jaguars frequented this waterhole, which is unusual for a species that normally avoids its peers and sticks to its own territory.

The jaguar-ocelot kill was captured as part of a larger monitoring project looking at the distribution of animals across the entire landscape in northern Guatemala, especially in relation to human pressures. Ironically, this waterhole was one that was far from any human community, but that did not mean it was necessarily unaffected by human activity.

"We have evidence that many things are happening related to climate change, but we might not be aware of every detail, of every consequence," said Lucy Perera-Romero, a WSU doctoral student and lead author on the study. "For example, in these beautiful, green forests, we may not be aware that water flow is a serious issue. It could be another source of mortality--apart from deforestation, from hunting, and from everything else that we do."

The Maya Forest is one of Mesoamerica's 5 Great Forests, spanning from Mexico to Colombia, collectively covering an area three times the size of Switzerland. The 5 Great Forests are all transboundary and represent Mesoamerica's most critical bastions for jaguars and other wildlife, and provide services such as carbon sequestration, clean water and food security to five million people.

Credit: 
Washington State University

Making therapeutic sense of antisense oligonucleotides

image: DNA/DNA double-stranded oligonucleotide is a new concept of oligonucleotide consisting of DNA-based antisense oligonucleotide and DNA-based complementary strand. When α-tocopherol is bound to this duplex and administered systemically, it reaches the mouse liver and subsequently the complementary DNA is degraded, and the released parent single-strand antisense oligonucleotide exerts a high target RNA suppression effect.

Image: 
Department of Neurology and Neurological Science, TMDU

Researchers from Tokyo Medical and Dental University (TMDU) and Ionis Pharmaceuticals, USA, report that replacing the RNA strand of a heteroduplex oligonucleotide with DNA may enhance the efficacy of antisense oligonucleotide-based drugs.

Tokyo, Japan - Antisense oligonucleotides (ASO) hold great promise for pharmacotherapy. Now, researchers at Tokyo Medical and Dental University (TMDU) and Ionis Pharmaceuticals, advancing their earlier work on a heteroduplex oligonucleotide (HDO) model, have demonstrated augmentation of ASO-based drugs by replacing the RNA strand with DNA.

Many drugs work by modifying specific disease-related proteins. Unfortunately, they may also affect non-targeted proteins causing side-effects that downgrade their safety and clinical applicability. Nucleic-acid therapeutics employs an emerging class of drugs including ASOs that target disease at the genetic level by suppressing the expression of pathogenic proteins. By modifying targets hitherto undruggable by conventional pharmacotherapy, they offer potential for treating intractable diseases such as spinal muscular atrophy and Huntington disease, with several candidates in clinical use and more on the horizon.

ASOs are synthetic single-stranded molecules, a few dozen nucleotides long, that regulate gene expression by binding to the "sense" strand of their mRNA targets. Arranging nucleotides, the building blocks of the genetic code, in the "antisense" (complementary and opposing) order allows an ASO to silence a specific RNA sequence and prevent production of a harmful protein.
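The sense/antisense relationship can be sketched in a few lines of code: an antisense strand is simply the reverse complement of its target sequence. The snippet below is purely illustrative (the target sequence is made up, and RNA-RNA base pairing is used for simplicity; it is not taken from the study):

```python
# Illustrative sketch: an antisense strand is the reverse complement
# of its "sense" target sequence (RNA uses U in place of T).

COMPLEMENT = {"A": "U", "U": "A", "G": "C", "C": "G"}

def antisense(mrna: str) -> str:
    """Return the antisense (reverse-complement) RNA sequence."""
    return "".join(COMPLEMENT[base] for base in reversed(mrna))

# A made-up target sequence, not one from the study:
target = "AUGGCUUACGGA"
print(antisense(target))  # UCCGUAAGCCAU
```

Applying the function twice recovers the original sequence, reflecting the symmetry of Watson-Crick pairing.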

The research team had earlier developed an HDO wherein the single-stranded ASO was hybridized to complementary RNA and conjugated with tocopherol. Toc-HDO(coRNA) proved more stable in serum, efficiently deliverable to target cells and more potent than the parent ASO. First author Yutaro Asami explains the rationale of the current study: "Since cellular uptake was mostly in the intact form and the parent ASO was released intracellularly, we proposed replacing the phosphodiester (PO) RNA of the complementary strand with PO DNA that is more stable and easier to manufacture."

The researchers bioengineered a DNA/DNA double-stranded oligonucleotide: Toc-HDO(coDNA). The relatively low DNase activity in serum promotes the molecule's stability, while intracellular DNase degradation of the complementary strand activates it. The efficacy of this molecular modification was evaluated using a murine hepatocyte uptake assay, quantitative real-time PCR measurement of RNA levels, and fluorescence-based determination of hepatic ASO concentrations. "We could establish the efficacy of Toc-HDO(coDNA) on mRNA expression levels in comparison with parent ASOs of varied compositions," says Asami. "Moreover, we also elucidated coDNA strand structure-activity relationships and degradation kinetics in mouse liver cells."

Senior author Professor Takanori Yokota looks into the future. "HDO technology promises personalized targeted therapy for several neurodegenerative and other intractable diseases. Our innovative molecular structural modifications, by enhancing clinical potency and safety, help enlarge the therapeutic toolkit on this versatile platform."

Credit: 
Tokyo Medical and Dental University

Neither liquid nor solid

image: Image of the position and orientation of ellipsoidal particles in clusters of a liquid glass.

Image: 
Research groups of Professor Andreas Zumbusch and Professor Matthias Fuchs

While glass is a truly ubiquitous material that we use on a daily basis, it also represents a major scientific conundrum. Contrary to what one might expect, the true nature of glass remains something of a mystery, with scientific inquiry into its chemical and physical properties still underway. In chemistry and physics, the term glass itself is a mutable concept: It includes the substance we know as window glass, but it may also refer to a range of other materials with properties that can be explained by reference to glass-like behaviour, including, for instance, metals, plastics, proteins, and even biological cells.

Though it may give that impression, glass is anything but a conventional solid. Typically, when a material transitions from a liquid to a solid state, its molecules line up to form a crystal pattern. In glass, this does not happen: instead, the molecules are effectively frozen in place before crystallization can occur. This strange and disordered state is characteristic of glasses across different systems, and scientists are still trying to understand exactly how this metastable state forms.

A novel state of matter: liquid glass

Research led by professors Andreas Zumbusch (Department of Chemistry) and Matthias Fuchs (Department of Physics), both based at the University of Konstanz, has just added another layer of complexity to the glass conundrum. Using a model system involving suspensions of tailor-made ellipsoidal colloids, the researchers uncovered a new state of matter, liquid glass, where individual particles are able to move yet unable to rotate - complex behaviour that has not previously been observed in bulk glasses. The results are published in the Proceedings of the National Academy of Sciences of the United States of America (PNAS) (date of publication: 19 January 2021; published online: 4 January 2021).

Colloidal suspensions are mixtures or fluids that contain solid particles which, at sizes of a micrometre (one millionth of a metre) or more, are bigger than atoms or molecules and therefore well-suited to investigation with optical microscopy. They are popular among scientists studying glass transitions because they feature many of the phenomena that also occur in other glass-forming materials.

Tailor-made ellipsoidal colloids

To date, most experiments involving colloidal suspensions have relied on spherical colloids. The majority of natural and technical systems, however, are composed of non-spherical particles. Using polymer chemistry, the team led by Andreas Zumbusch manufactured small plastic particles, stretching and cooling them until they achieved their ellipsoidal form, and then placed them in a suitable solvent. "Due to their distinct shapes our particles have orientation - as opposed to spherical particles - which gives rise to entirely new and previously unstudied kinds of complex behaviours", explains Zumbusch, who is a professor of physical chemistry and senior author on the study.

The researchers then went on to change particle concentrations in the suspensions, and tracked both the translational and rotational motion of the particles using confocal microscopy. Continues Zumbusch: "At certain particle densities orientational motion froze whereas translational motion persisted, resulting in glassy states where the particles clustered to form local structures with similar orientation". What the researchers have termed liquid glass is a result of these clusters mutually obstructing each other and mediating characteristic long-range spatial correlations. These prevent the formation of a liquid crystal which would be the globally ordered state of matter expected from thermodynamics.
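The degree to which orientations within a cluster "freeze" into alignment can be quantified with an orientational order parameter. One standard measure for rod-like particles in two dimensions, used here purely as an illustrative choice rather than the specific analysis of the paper, is the magnitude of the nematic order parameter, which is 1 for perfect alignment and near 0 for isotropic orientations:

```python
import math

def nematic_order_2d(angles):
    """Magnitude of the 2D nematic order parameter.

    Uses 2*theta so that orientations differing by 180 degrees
    (head-tail symmetric rods) count as identical.
    Returns 1.0 for perfect alignment, ~0 for isotropic orientations.
    """
    n = len(angles)
    c = sum(math.cos(2 * a) for a in angles) / n
    s = sum(math.sin(2 * a) for a in angles) / n
    return math.hypot(c, s)

# A perfectly aligned cluster vs. an isotropic set of orientations:
aligned = [0.3] * 100
isotropic = [math.pi * k / 100 for k in range(100)]
print(round(nematic_order_2d(aligned), 3))    # 1.0
print(round(nematic_order_2d(isotropic), 3))  # 0.0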

Two competing glass transitions

What the researchers observed were in fact two competing glass transitions - a regular phase transformation and a nonequilibrium phase transformation - interacting with each other. "This is incredibly interesting from a theoretical vantage point", comments Matthias Fuchs, professor of soft condensed matter theory at the University of Konstanz and the other senior author on the paper. "Our experiments provide the kind of evidence for the interplay between critical fluctuations and glassy arrest that the scientific community has been after for quite some time". A prediction of liquid glass had remained a theoretical conjecture for twenty years.

The results further suggest that similar dynamics may be at work in other glass-forming systems and may thus help to shed light on the behaviour of complex systems and molecules ranging from the very small (biological) to the very big (cosmological). It also potentially impacts the development of liquid crystalline devices.

Credit: 
University of Konstanz

In-utero exposures associated with increased risk of thyroid cancer

image: Professor Tone Bjørge, University of Bergen

Image: 
University of Bergen

A recent study by Prof. Tone Bjørge, University of Bergen, and her team shows that thyroid cancer risk is related to in-utero exposures.

Thyroid cancer is diagnosed at a younger age than most other malignancies, and its incidence is higher in women than in men.

"The only established modifiable risk factors for thyroid cancer are childhood exposure to ionizing radiation and obesity. Few in-utero and early-life risk factors have so far been identified," says Bjørge, professor at the Department of Global Public Health and Primary Care, University of Bergen.

Maternal hypothyroidism, hyperthyroidism, goiter, and benign thyroid neoplasms related to higher risk

The team conducted a nested case-control study using nationwide registry data from four Nordic countries (Denmark, Finland, Norway, and Sweden). The study included 2,437 thyroid cancer cases and 24,362 matched controls aged 0-48 years during 1967-2015.

"Maternal benign thyroid conditions such as hypothyroidism, hyperthyroidism, goiter, and benign thyroid neoplasms were strongly associated with thyroid cancer risk in offspring. Also, high birth weight, congenital hypothyroidism, maternal history of diabetes, and maternal postpartum haemorrhage were associated with increased risk", says Bjørge.
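In a nested case-control design like this, the strength of such associations is typically reported as an odds ratio computed from a 2x2 exposure-by-outcome table. The sketch below uses made-up counts for illustration only, not figures from the study:

```python
def odds_ratio(exposed_cases, unexposed_cases,
               exposed_controls, unexposed_controls):
    """Odds ratio from a 2x2 exposure-by-outcome table:
    (odds of exposure among cases) / (odds of exposure among controls)."""
    return ((exposed_cases / unexposed_cases)
            / (exposed_controls / unexposed_controls))

# Hypothetical counts (NOT from the study): suppose 100 of 2,437 cases
# and 500 of 24,362 controls had a given maternal exposure.
or_est = odds_ratio(100, 2437 - 100, 500, 24362 - 500)
print(round(or_est, 1))  # 2.0
```

An odds ratio above 1 indicates that the exposure is more common among cases than controls; published analyses would additionally adjust for the matching variables and report confidence intervals.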

Motivates further research

The study supports a link between in-utero exposures and an increased risk of thyroid cancer later in life.

"These findings should motivate additional research into early-life exposures that might cause thyroid cancer", says Bjørge.

Credit: 
The University of Bergen