Culture

Measuring the true cost of conservation

image: Estimated fair market value of all properties in the United States (3D visualization).

Image: 
Photo courtesy of Christoph Nolte, Boston University.

For decades, scientists have been warning about potential future effects of global climate change, including more frequent wildfires, longer periods of drought, and sharp increases in the number, duration, and intensity of tropical storms. And since the start of 2020, we've seen natural disasters in record-breaking numbers, from the wildfires that ravaged California and Colorado to record-setting streaks of consecutive days above 100 degrees in places like Arizona. Environmental concerns are steadily moving onto a broader national stage: issues of climate change and conservation received more attention during the first presidential debate on September 29, 2020 than in any other presidential debate in history.

When it comes to the topic of safeguarding the environment, Boston University Earth & Environment Assistant Professor Christoph Nolte is hardly a newcomer. He's spent the majority of his academic career studying the effectiveness of conservation, asking key questions about where concerted efforts take place and what difference they make for our world at large. To inform future decisions about conservation policy, Assistant Professor Nolte has now created the first high-resolution map of land value in the United States -- a tool he says will better estimate environmental conservation costs, inform policy recommendations, and help fellow academics conduct their own research on rebuilding and protecting what's left of our natural resources and the biodiversity within our ecosystems. Boston University interviewed Nolte to learn more about his research and its impact.

You've created the first high-resolution map of U.S. land value. What spurred this research?

I was dissatisfied with the quality of cost data in conservation research. Land conservation decisions are about trade-offs. If we wish to keep forest carbon on the ground, species' habitats intact, wetlands functional, or landscapes beautiful, conservation usually means that we also give up something: the benefits from alternative land uses. Arguments in favor of more conservation are widespread. For instance, Harvard University's E.O. Wilson suggests that 50% of the Earth should be protected. However, those points are incomplete if they are not also explicit about what we should be giving up where, who wins and who loses, and who gets to decide.

Ignoring costs can make us blind to the negative effects of regulation, often borne by those without a voice. In the case of voluntary conservation programs, ignoring cost can mean that we end up with a proposal, but insufficient funding. If we want to make informed societal decisions about conservation efforts, we need reliable, publicly accessible estimates of its cost.

Unfortunately, it is difficult to get good data for conservation cost. Conservation organizations don't freely share their financial records. Land prices can be good substitutes for data on conservation cost, but such data is valuable, sensitive, and unavailable to the public in most countries. So when large-scale land price data in the United States became available to academics for free for the first time, this created an exciting opportunity to create the first high-resolution map of land value and see how well it would predict conservation cost.

What data did you use to generate this map?

There are many datasets behind this map. Perhaps the most important is a nationwide database of properties and their sales. This dataset came from Zillow, the real estate company, which obtains the data from public records and makes it available to academics and nonprofits. In my research group, we developed a system that links this data to digital maps of property boundaries. This allows us to obtain detailed information on land characteristics: buildings, terrain, land cover, road access, water access, flood risk, local demographics, nearby amenities, and so on. This data is fed into a machine learning algorithm, which learns to predict sales prices from its knowledge of the characteristics of each property. After the algorithm is trained, I let it predict the sales prices of every property in the country. The result is this map.
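
To make that train-then-predict workflow concrete, here is a minimal sketch. The article does not specify the model, features, or software used, so the gradient-boosting regressor, the made-up feature names (building_area_m2, road_distance_km, and so on), and the synthetic data below are illustrative assumptions rather than the study's actual pipeline.

```python
# Sketch of the described workflow: learn sale prices from property
# characteristics, then predict a value for every property in the table.
# Model choice, feature names, and data are assumptions for illustration only.
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5_000  # stand-in for a nationwide table of properties with recorded sales

# Hypothetical property characteristics (buildings, terrain, access, risk, ...).
properties = pd.DataFrame({
    "building_area_m2": rng.gamma(2.0, 80.0, n),
    "elevation_m": rng.normal(300.0, 150.0, n),
    "road_distance_km": rng.exponential(1.0, n),
    "water_distance_km": rng.exponential(3.0, n),
    "flood_risk_score": rng.uniform(0.0, 1.0, n),
})
# Synthetic sale prices so the sketch runs end to end.
prices = (
    1_000.0 * properties["building_area_m2"]
    - 20_000.0 * properties["road_distance_km"]
    + rng.normal(0.0, 25_000.0, n)
)

X_train, X_test, y_train, y_test = train_test_split(
    properties, prices, test_size=0.2, random_state=0
)

model = GradientBoostingRegressor(random_state=0)
model.fit(X_train, y_train)  # learn price from property characteristics
print("held-out MAE:", mean_absolute_error(y_test, model.predict(X_test)))

# Once trained, the model is applied to every property, sold or not,
# which is what produces a wall-to-wall map of estimated values.
estimated_value = model.predict(properties)
```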

What does this map tell us about environmental conservation costs? Why is it essential to have accurate land value data?

I found that most of the cost estimates that were used in the literature have underestimated the cost of conservation in the United States. This underestimation is particularly large near cities, where land values tend to be much higher than previous proxies suggested. In other words, it means that we will need substantially higher levels of funding than previously assumed if we want to achieve certain environmental goals, such as protecting all floodplains from development or protecting species habitat in the face of climate change.

How can your research help to educate policymakers on future conservation plans and priorities? Why is this important?

I believe that it is good to be realistic about what a given level of funding can achieve. For instance, in August, Congress passed the Great American Outdoors Act, a historic bipartisan bill that makes $4.5 billion of federal funding available for land protection. If previous estimates of conservation cost were correct, this budget could help us reach proposed habitat protection needs for all species in the U.S. However, the new cost data suggests that even such an unprecedented budget covers merely 5% of what is actually needed. That's a big difference!

More accurate cost data can also change recommendations about where conservation investments should go. When I reproduced recent work, about one quarter of the sites recommended for species protection shifted from one location to another, for instance, from expensive Long Island to slightly less expensive Southeastern Massachusetts. While this result should be taken with a grain of salt, it shows that the quality of cost data matters. The good news is that the cost map is now published, so anyone can incorporate it in their analyses and revisit their earlier findings.

How can people best use this data when thinking about environmental conservation as it relates to their day-to-day lives?

Data on the cost of conservation helps us be real about the actual magnitude and severity of the conservation problem we are facing as a society. Many of us feel positively about the benefits that conserved lands provide. However, not all of us are willing to make sacrifices to protect these lands, whether it is by reducing our own ecological footprints, or by voting in favor of local land use regulations or measures that increase taxes to fund conservation. In the midst of this, we are exposed to advertising that suggests that we can cheaply "offset" our effects on the environment -- for instance, that we can become "carbon-neutral" for a few dollars by purchasing carbon offsets when we fly.

A closer look at many cheap offsetting schemes suggests that they don't actually reduce emissions by very much. But their existence has the side effect of getting our hopes up that there might be a cheap way around the conservation problem. In reality, win-win situations are rare, and trade-offs are real. This might be difficult to accept, but we should not ignore it, even if we so desperately want to feel good about our own levels of consumption.

Despite costs being much higher than originally estimated, why is conservation still such an important investment?

Answers to this question have two dimensions: science and values. Science helps us understand the consequences of our actions. If we want to mitigate climate change, we need to reduce greenhouse gas emissions. Land conservation can help, for instance, by conserving forests and wetlands, reducing urban sprawl, or increasing opportunities for local recreation. If we want to prevent species extinctions, we need to protect and restore threatened habitat, conserve lands in climate refugia, and build ecological corridors, so species can move as temperatures rise. And if we want to avoid damages from flooding, we should protect more floodplains from development.

Science tells us about the consequences of our actions, but it doesn't tell us what to do. The more difficult question is how much we, as individuals and as a society, care about these outcomes, and what we are willing to give up for them. There are 8 billion of us. What each of us cares about is shaped by our diverse beliefs and morals, our upbringing, the people in our lives, the media we consume, the things we enjoy doing, etc. Given my affiliation, it probably won't be a surprise that I feel positively about policies that reduce our collective human footprint, but it is unwise to elevate anyone's individual worldview to a standard. Instead, I think it is desirable to have a broader societal conversation among well-informed citizens who make those decisions together. My job, alongside that of thousands of other colleagues, is to provide the tools that can help us gain clarity about what's at stake.

Are there any other surprising findings? Could this data have other applications in areas outside of conservation (for example, real estate)?

I was surprised by the predictive power of the algorithm. As a validation step, I tested whether estimated land values could predict the actual cost of more than 4,000 public land acquisitions for conservation that were distributed all over the country. I expected that the predictions would outperform the proxies used in earlier studies, which they did. But to my surprise, the predictions even outperformed the estimates of tax assessors. Tax assessors are tasked with estimating the value of all properties in a given jurisdiction for taxation purposes, and part of that process often involves estimating the "fair market value" of each property. Because assessors work locally and know their area much better than a national dataset does, I would have expected their estimates to vastly outperform mine. Instead, I found that mine were 29% more accurate. This raises the question of why these differences exist and opens up interesting new avenues for scrutinizing existing methods of property taxation.
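
As a rough illustration of that validation step, one can compare both sets of estimates against the prices actually paid and measure the relative reduction in error. The article does not state which error metric was used, so the mean absolute error and the synthetic numbers in this sketch are assumptions; only the reported 29% figure comes from the interview.

```python
# Illustrative accuracy comparison against realized conservation acquisition
# costs. The error metric (MAE) and all numbers here are assumed, not from
# the study; the article only reports that the model was 29% more accurate.
import numpy as np

rng = np.random.default_rng(1)
n = 4_000  # roughly the number of public acquisitions used for validation

actual_cost = rng.lognormal(mean=12.0, sigma=1.0, size=n)           # prices paid
model_estimate = actual_cost * rng.lognormal(0.0, 0.30, size=n)     # tighter errors
assessor_estimate = actual_cost * rng.lognormal(0.0, 0.45, size=n)  # looser errors

mae_model = np.mean(np.abs(model_estimate - actual_cost))
mae_assessor = np.mean(np.abs(assessor_estimate - actual_cost))

improvement = 1.0 - mae_model / mae_assessor
print(f"model estimates reduce error by {improvement:.0%} relative to assessors")
```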

In your opinion, what is the single most important conservation issue facing the world today?

It would be difficult to answer this question without pointing towards climate change. It affects everything else that we think about in conservation. If we don't stop climate change, conserving species where they are today won't conserve them in the future. Future flooding and sea level rise will be a lot less manageable. And we worry a lot more about the conservation of forests and wetlands today because we know that their loss fuels this fire. At a project level, if you want to persuade people that conservation is a good idea, talking about rare or interesting species, beautiful landscapes, and recreational opportunities might get you more traction. But many of those efforts might be a drop in the bucket if we don't address this massive elephant in the room.

What do you hope people take away from this project? What are you planning to research next?

For me, for my students, and for colleagues at universities all over the country, the synthesis of this rich database has created novel opportunities for empirical research that were unthinkable just a few years ago. We currently support research on the economic risks from flooding, oil spills and hazardous waste, the economic impacts of land regulations, the benefits of water quality and priorities for emissions reduction from forest protection. My own curiosity has mostly to do with projects that help people protect the places and species they love: identifying opportunities for protection, scrutinizing the effectiveness of existing programs, and reducing the informational barriers to get conservation done. It is an exciting time to do this research, and I'm glad that so many fantastic colleagues around the country are interested in advancing the frontiers of knowledge together.

Credit: 
Boston University

CCNY & partners in quantum algorithm breakthrough

Researchers led by City College of New York physicist Pouyan Ghaemi report the development of a quantum algorithm with the potential to study a class of many-electron quantum systems using quantum computers. Their paper, entitled "Creating and Manipulating a Laughlin-Type ν=1/3 Fractional Quantum Hall State on a Quantum Computer with Linear Depth Circuits," appears in the December issue of PRX Quantum, a journal of the American Physical Society.

"Quantum physics is the fundamental theory of nature which leads to formation of molecules and the resulting matter around us," said Ghaemi, assistant professor in CCNY's Division of Science. "It is already known that when we have a macroscopic number of quantum particles, such as electrons in the metal, which interact with each other, novel phenomena such as superconductivity emerge."

However, until now, according to Ghaemi, tools to study systems with large numbers of interacting quantum particles and their novel properties have been extremely limited.

"Our research has developed a quantum algorithm which can be used to study a class of many-electron quantum systems using quantum computers. Our algorithm opens a new venue to use the new quantum devices to study problems which are quite challenging to study using classical computers. Our results are new and motivate many follow up studies," added Ghaemi.

On possible applications for this advancement, Ghaemi, who's also affiliated with the Graduate Center, CUNY, noted: "Quantum computers have witnessed extensive developments during the last few years. Development of new quantum algorithms, regardless of their direct application, will contribute to realizing applications of quantum computers.

"I believe the direct application of our results is to provide tools to improve quantum computing devices. Their direct real-life application would emerge when quantum computers can be used for daily life applications."

Credit: 
City College of New York

Dopamine surge reveals how even for mice, 'there's no place like home'

image: Randy Blakely, Ph.D., senior author, executive director of FAU's Brain Institute and a professor of biomedical science in FAU's Schmidt College of Medicine.

Image: 
Florida Atlantic University

By monitoring a well-replicated biomarker associated with reward, a study by neuroscientists from Florida Atlantic University provides evidence that the old adage, "There's no place like home," has its roots deep in the brain. The study demonstrates that a signal for pleasure - dopamine - rises rapidly when mice are moved from a simple recording chamber to their home cage, but less so when they are returned to a cage not quite like the one they knew. Prior studies have shown that rodents will actively choose their home cage over a look-alike environment. Using a sensor for dopamine placed in the mouse brain's key reward center, FAU scientists are the first to demonstrate that home evokes a surge of dopamine that mimics the response to a dose of cocaine.

The neurotransmitter dopamine is critical to motivational control and to directing behaviors that seek reward and have preferred outcomes. Dopamine release in the nucleus accumbens of humans and rodents, a primary site mediating natural rewards as well as addictive substances, is triggered initially by the reward itself but later occurs when reward-coupled cues are presented - a feature underlying reinforcement learning. The activity of dopamine neurons that drives dopamine release is sensitive to stimuli that signal valence and saliency and is key to motivational learning. In their report, the FAU scientists reveal that the simple act of "coming home" drives dopamine release.

For the study, researchers used a sensitive technique known as fiber photometry to capture second-to-second changes in dopamine in the mouse nucleus accumbens. Results of the study, published in the journal Neurochemistry International, reinforce the importance of this brain reward system in substance misuse as well as in everyday pleasures; anatomically speaking, the researchers say, "home is where the brain is."

"Our data provide clear evidence of a biochemical foundation for the reinforcing properties of home cage return. This simple environmental manipulation can provide a minimally-invasive approach to peel away aspects of reward circuitry connected to natural reinforcers - one that is critical to an animal's survival," said Randy Blakely, Ph.D., senior author, executive director of FAU's Brain Institute and a professor of biomedical science in FAU's Schmidt College of Medicine. "We think that monitoring the home cage-elicited release of dopamine provides a simple, but powerful paradigm for the study of how genetic and life events can lead to an inability to feel pleasure. The inability to feel pleasure is a major characteristic of mood disorders, and a simple test for the efficacy of medications or other treatments. The field of drug discovery needs simple, biomarker-based analogs of behavioral changes seen in people with mood disorders since we can't ask a mouse how it feels."

The researchers questioned whether their findings simply reflected leaving an unappealing environment or were truly a response to the positive aspects of a known and safe environment. They therefore examined whether dopamine surges arose when mice were moved from the plexiglass recording chamber to a clean cage with natural bedding matching the one they had been living in before the study. Dopamine release did occur, but it was neither as large nor as sustained as when the mice were returned to their home cage.

"We weren't really exploring home cage effects," said Felix Mayer, Ph.D., a post-doctoral fellow in Blakely's lab and lead author of the study. "However, we were struck as to how reliable the manipulation was in evoking dopamine release particularly when placed in the context of little or no rise in dopamine when the mice were moved from the home cage to the test chamber. We are excited now to see if the genetic models of brain disorders we study will impact this effect."

Credit: 
Florida Atlantic University

Catalyzing a zero-carbon world by harvesting energy from living cells

image: Krebs cycle metabolites fall in energy-rich carbon feedstock

Image: 
Issey Takahashi

The imminent environmental crisis calls for an urgent transition to a green economy. A team of scientists at Nagoya University, Japan, led by Professor Susumu Saito, has recently found an interesting way to make this happen -- by leveraging an important metabolic pathway in living cells. Their aim was to turn the energy-poor pathway products into biorenewable ones that can potentially power our world in a sustainable manner.

In most plants, animals, fungi, and bacteria, a pathway called the "Krebs cycle" is responsible for providing fuel for cells to carry out their functions. Operating in the mitochondria, this cycle ultimately results in the formation of both energy-rich compounds like NADH and FADH2 (which are used to power the organism) and energy-deficient metabolites like C4-, C5-, and C6-polycarboxylic acids (PCAs). Recently, the idea of converting these highly functionalized PCAs into biorenewable molecules has been explored by restoring the carbon-hydrogen (C-H) bonds that were lost in their creation. This would require the biomolecules to undergo reactions called "dehydration" and "reduction," that is, a reversal of the Krebs cycle -- a complicated process.

In their new study, which was published in Science Advances, Prof Saito and his team rose to the challenge by aiming to find an artificial "catalyst," a molecule that could facilitate this modification. They focused on a powerful, versatile precatalyst called a "phosphine-bipyridine-phosphine (PNNP)iridium (Ir)-bipyridyl complex." Prof Saito says, "Single-active-metal catalysts such as the (PNNP)Ir catalyst can facilitate the selective hydrogenation and dehydration of highly functionalized (highly oxidized and oxygenated) biomass feedstock like Krebs cycle metabolites."

When the scientists tested this precatalyst on C4-, C5-, and C6-polycarboxylic acids and other mitochondria-relevant metabolites, they found that C-H bonds were incorporated effectively into the metabolites via hydrogenation and dehydration reactions -- a feat otherwise very difficult to achieve. The restoration of C-H bonds means that energy-rich organic compounds can be generated from energy-poor materials that are abundant in nature. Moreover, the reactions yielded compounds called "diols" and "triols," which are useful as moisturizing agents and in building plastics and other polymers. The sole "waste" product of the reaction is water, giving us a clean source of energy. What's more, these complex processes can occur in a "one-pot" fashion, making the overall process efficient.

Prof Saito and his team are optimistic that their research will have important consequences for a future centered on renewable energy. Prof Saito says, "Wasteful carbon feedstocks like sawdust and rotten food contain a vault of different carboxylic acids and their potential derivatives. The molecular (PNNP)Ir catalyst can be used to make zero-emission materials. Many commodity plastics and polymer materials could be produced from biomass-based wasteful feedstock using the diols and triols obtained from the hydrogenation process."

With these findings, a greener, more carbon-neutral society is surely in sight.

Credit: 
Nagoya University

State-level lung cancer screening rates not aligned with lung cancer burden in the US

ATLANTA - NOVEMBER 12, 2020 - A new study reports that state-level lung cancer screening rates were not aligned with lung cancer burden. The report, appearing in JNCI: The Journal of the National Cancer Institute, provides the first population-based state-level screening data for all 50 states and finds lung cancer screening rates varied geographically by state.

Lung cancer continues to be the leading cause of cancer-related death in the United States, with an estimated 135,720 deaths expected in 2020. Lung cancer screening with low-dose CT (LDCT) has the potential to reduce cancer death and has been recommended for people with a heavy smoking history since 2013. However, previous studies show it is underutilized. For this study, investigators led by the American Cancer Society's Dr. Stacey Fedewa, and co-authored by members of the National Lung Cancer Roundtable, examined lung cancer screening rates with LDCT, and their growth, in all 50 states and Washington D.C. from 2016 to 2018. They also looked at how states' lung cancer screening rates correlated with lung cancer burden, sociodemographic status, and access to lung cancer screening.

"The increasing but low utilization of lung cancer screening reflects both ongoing efforts to screening eligible adults, and the many challenges to do so," said Dr. Fedewa. "Kentucky, which has supported screening implementation efforts, is unique as its screening rates are over twice the national average and four times that of other high lung cancer burden states like West Virginia and Arkansas."

Results show that screening rates did not track burden across regions: several Northeastern states with lower lung cancer burden (e.g. Massachusetts, Vermont, New Hampshire) had comparatively high screening rates, while several Southern and Western states with higher burden had comparatively low rates.

The authors say their findings show that while overall lung cancer screening rates increased nationally between 2016 and 2018, the rate was still low in 2018, with only 5-6% of eligible adults in the U.S. receiving a lung cancer low-dose CT (LDCT) scan. Relative to the national average, screening rate ratios were lower in 8 states, mostly in the West or South, and 50% higher in 13 states, mostly in the Northeast or Midwest, with Kentucky as the outlier.

The study also found that compared to the national average, lung cancer screening rates were about 20% lower in states with a high proportion of uninsured adults who smoked and 40% lower in states with a relatively low number of lung cancer screening facilities, suggesting that there may be critical gaps in access to lung cancer screening. With respect to sociodemographic factors, screening rates were positively correlated with the proportion of smokers who were female and negatively correlated with the proportion of smokers who were Hispanic: states with a higher share of Hispanic adults who smoked had a significantly lower screening rate ratio than the national average.

"Deliberate effort from various stakeholders such as policy makers, cancer control, health systems, and providers are needed to boost lung cancer screening rates among eligible adults with a heavy smoking history, a group facing multiple barriers to lung cancer screening and cancer care," said the authors. "If states know what their lung cancer screening rates are, they can set a goal and track progress toward it."

Credit: 
American Cancer Society

Nearly 1 in 5 cancer patients less likely to enroll in clinical trials during pandemic

Washington, D.C.--November 12, 2020--A significant portion of cancer patients may be less likely to enroll in a clinical trial due to the ongoing coronavirus pandemic. According to an article published this week in JAMA Oncology, nearly 1 in 5 cancer patients surveyed said the pandemic would make them less likely to enroll in a trial. The top reason given for not enrolling is fear of COVID-19 exposure.

"While most patients would still be willing to take part in a clinical trial during the pandemic, the fear of COVID-19 exposure that would come with participating in a clinical trial is poised to cause many otherwise interested patients from enrolling. This means that trials that already struggled to find enough patients are likely to see reduced enrollment as long as the pandemic continues," said Mark Fleury, co-author of the article and policy principle for emerging issues at the American Cancer Society Cancer Action Network (ACS CAN). "The barriers patients already faced pre pandemic made it challenging to take part in clinical trials. Now with the addition of COVID-19, it is even harder and we're likely to see long-term impacts on the pace of research."

The finding was based on a survey ACS CAN conducted of cancer patients and survivors between late May and mid-June. Later surveys showed COVID anxiety remains high among patients, and fear of contracting the virus was cited--along with facility closures--as one of the main reasons patients delayed cancer care. Cancer patients are among those most at risk for severe effects of the coronavirus.

"The pandemic caused many institutions to stop enrolling new patients on clinical trials, and the assumption was that once facilities reopened, they could get enrollment back to normal. What we've found is that so long as the pandemic is still underway, fewer patients are going to volunteer for clinical trials," said Fleury. "The solution is that we need to get the pandemic under control or find innovative ways like telemedicine visits so that patients can take part in clinical trials without feeling exposed to additional COVID-19 risks."

Credit: 
American Cancer Society

Individualized brain stimulation therapy improves language performance in stroke survivors

image: Dr. Jed Meltzer, Baycrest's Canada Research Chair in Interventional Cognitive Neuroscience and a neurorehabilitation scientist at Baycrest's Rotman Research Institute (RRI).

Image: 
Baycrest

Baycrest scientists are pioneering the use of individualized brain stimulation therapy to treat aphasia in recovering stroke patients.

Aphasia is a debilitating language disorder that impacts all forms of verbal communication, including speech, language comprehension, and reading and writing abilities. It affects around one-third of stroke survivors, but can also be present in those with dementia, especially in the form of primary progressive aphasia.

"Aphasia can be very isolating," says Dr. Jed Meltzer, Baycrest's Canada Research Chair in Interventional Cognitive Neuroscience and a neurorehabilitation scientist at Baycrest's Rotman Research Institute (RRI). "It can negatively affect people's personal relationships, and it often determines whether or not someone can continue working."

In a recent study published in the journal Scientific Reports, Dr. Meltzer and his team tested language performance and used magnetoencephalography (MEG) to measure brain waves in 11 stroke survivors with aphasia before and after they underwent brain stimulation therapy.

The scientists found that the participants had abnormal electrical activity in brain regions close to but outside the area destroyed by the stroke. This abnormal activity was mainly a shift to slower brain waves, a pattern they have also observed in individuals with dementia.

"We mapped that abnormal activity and targeted it using non-invasive brain stimulation," says Dr. Meltzer. "We found that the stimulation made the activity more normal - that is, faster - and improved language performance in the short term."

Previous research has demonstrated that brain stimulation can improve language performance in aphasia patients. However, this study is one of the first to link this performance improvement to changes in the brain activity surrounding the tissue destroyed by stroke. In other words, this study suggests not only that brain stimulation works in aphasia patients, but also that the reason it works may be because it addresses abnormalities in the brain surrounding the destroyed tissue.

Another novel aspect of this work is that the scientists targeted each individual's abnormal brain activity with the stimulation treatment. In contrast, the standard approach in previous studies has been to use the exact same treatment, targeting the same brain areas, on every patient.

"Our results demonstrate a promising method to personalize brain stimulation by targeting the dysfunctional activity outside of the destroyed brain tissue," says Dr. Meltzer. "Aphasia patients are highly variable in terms of where their brain damage is and what part of the brain should be stimulated for therapy. By mapping individuals' brain waves, we are finding ways to target the right area to improve their language performance."

While the participants in this study were stroke survivors, individuals with dementia have similar dysfunctional tissue in their brains, and the scientists are also examining the use of brain stimulation in this group.

Dr. Meltzer and his team looked at the immediate effects of single stimulation sessions in this study. As a next step, they have received funding from the Heart and Stroke Foundation to conduct a full-scale clinical trial looking at the longer-term impacts of repeated stimulation for stroke survivors with aphasia. However, this study has been suspended because of the restrictions on in-person research participation due to the COVID-19 pandemic. In the meantime, the scientists have pivoted to optimize other aspects of aphasia treatment.

With additional funding, the researchers could test different types of stimulation with more patients over more sessions, allowing them to make faster progress in developing this treatment for individuals with aphasia.

Credit: 
Baycrest Centre for Geriatric Care

Smaller than ever--exploring the unusual properties of quantum-sized materials

image: Using dendrimers as molecular templates to produce diverse arrangements of metal ions, SNPs of about 1 nm in diameter with precise indium-to-tin ratios can be readily obtained.

Image: 
Tokyo Tech

The development of functional nanomaterials has been a major landmark in the history of materials science. Nanoparticles with diameters ranging from 5 to 500 nm have unprecedented properties, such as high catalytic activity, compared to their bulk material counterparts. Moreover, as particles become smaller, exotic quantum phenomena become more prominent. This has enabled scientists to produce materials and devices with characteristics that had been only dreamed of, especially in the fields of electronics, catalysis, and optics.

But what if we go smaller? Sub-nanoparticles (SNPs) with particle sizes of around 1 nm are now considered a new class of materials with distinct properties due to the predominance of quantum effects. The untapped potential of SNPs caught the attention of scientists from Tokyo Tech, who are currently undertaking the challenges arising in this mostly unexplored field. In a recent study published in the Journal of the American Chemical Society, a team of scientists from the Laboratory of Chemistry and Life Sciences, led by Dr Takamasa Tsukamoto, demonstrated a novel molecular screening approach to find promising SNPs.

As one would expect, the synthesis of SNPs is plagued by technical difficulties, even more so for those containing multiple elements. Dr Tsukamoto explains: "Even SNPs containing just two different elements have barely been investigated because producing a system of subnanometer scale requires fine control of the composition ratio and particle size with atomic precision." However, this team of scientists had already developed a novel method by which SNPs could be made from different metal salts with extreme control over the total number of atoms and the proportion of each element.

Their approach relies on dendrimers (see Figure 1), a type of symmetric molecule that branches radially outwards like trees sprouting from a common center. Dendrimers serve as a template on which metal salts can be accurately accumulated at the base of the desired branches. Subsequently, through chemical reduction and oxidation, SNPs are precisely synthesized on the dendrimer scaffold. The scientists used this method in their most recent study to produce SNPs with various proportions of indium and tin oxides and then explored their physicochemical properties.

One peculiar finding was that unusual electronic states and oxygen content occurred at an indium-to-tin ratio of 3:4 (see Figure 2). These results were unprecedented even in studies of nanoparticles with controlled size and composition, and the scientists ascribed them to physical phenomena exclusive to the sub-nanometer scale. Moreover, they found that the optical properties of SNPs with this elemental proportion differed not only from those of SNPs with other ratios, but also from those of nanoparticles with the same ratio. As shown in Figure 3, the SNPs with this ratio were yellow instead of white and exhibited green photoluminescence under ultraviolet irradiation.

Exploring material properties at the sub-nanometer scale will most likely lead to their practical application in next-generation electronics and catalysts. This study, however, is just the beginning in the field of sub-nanometer materials, as Dr Tsukamoto concludes: "Our study marks the first-ever discovery of unique functions in SNPs and their underlying principles through a sequential screening search. We believe our findings will serve as the initial step toward the development of as-yet-unknown quantum sized materials." The sub-nanometric world awaits!

Credit: 
Tokyo Institute of Technology

C4 rice's first wobbly steps towards reality

image: Changing a plant like rice from the C3 photosynthetic pathway to the C4 photosynthetic pathway requires an interdisciplinary and international approach.

Image: 
Carl Davies

An international long-term research collaboration aimed at creating high-yielding and water-use-efficient rice varieties has successfully installed part of the photosynthetic machinery from maize into rice.

"We assembled five genes from maize that code for five enzymes in the C4 photosynthetic pathway into a single gene construct and installed it into rice plants," said lead author Dr Maria Ermakova, who works at The Australian National University (ANU), as part of the international C4 Rice Project, led by Oxford University.

Rice, one of the main world food staples, uses the less efficient C3 photosynthetic pathway. Scientists predict that the introduction of the more efficient C4 photosynthesis traits into rice can potentially increase photosynthetic efficiency by fifty percent, improve nitrogen use efficiency and double water use efficiency.

"Although introducing all the genes required to make C4 rice still a long way off, this is the first paper where we assembled a functional C4 biochemistry in rice, which is very exciting," said Dr Ermakova, from the ARC Centre of Excellence for Translational Photosynthesis (CoETP).

Using synthetic biology, scientists can introduce several genes at the same time, obtain a plant in just a year, and redesign their "constructs" very rapidly, in a matter of months. In sharp contrast, the old approach, which inserts a single gene at a time, can take several years.

"For me, the most important aspect of this paper is that we have mastered the technology that will help us in our journey towards C4 rice and now we can move forward to the next phase at a higher velocity than ever before," said CoETP's Deputy Director Professor Susanne von Caemmerer, one of the co-authors of this study.

Using the same kind of technique that Hal Hatch used in 1966 during the discovery of the C4 pathway, the team of researchers from the Max Planck Institute was able to follow the labeled CO2 on its way through the pathway.

"This is another key result, as we were able to prove that carbon dioxide is fixed using the C4 pathway. In other words, we achieved gene expression, but we also got the enzymes involved to be active and functioning in the plant in the right cells," says Professor von Caemmerer.

"Even though the plants we produced are not yet working very efficiently as C4, we now know that part of their photosynthesis is moving through the C4 pathway," she says.

"The research team include scientists with diverse expertise from microscopy to physiology, plant breeding and modelling" says Dr Florence Danila, who was in charge of the enzyme localisation using molecular microscopy techniques at the ANU C4 Rice Project Node.

"We started the C4 rice project ten years ago, involving sixteen labs in eleven countries. This particular research has taken us five years to complete, and the coordinated effort of several researchers from multiple organisations around the world, including Washington State University, Oxford University, Cambridge University, ANU and the Max Planck Institute," says Professor Robert Furbank, Director of the ARC Centre of Excellence for Translational Photosynthesis and one of the authors of the study.

"Our next step is to assemble a construct using sixteen genes, so we have lots of work to do. These are the first wobbly steps towards achieving C4 rice. These results show that we can manipulate a whole metabolic pathway. These results show that creating a functional C4 rice is possible," Professor Furbank says.

Credit: 
ARC Centre of Excellence for Translational Photosynthesis

Researchers generate a brain cell type crucial to support neural activity

video: Researchers have succeeded in generating human OLs from pluripotent stem cells derived from patients with nervous system diseases, specifically multiple sclerosis or ALS.

Image: 
University of Malaga

The loss of oligodendrocytes (OLs) -- highly specialized brain cells that produce myelin, an essential structure enabling the efficient transmission of electrical signals and supporting neural activity -- is a frequent condition in patients suffering from neurodegenerative diseases.

Researchers of the Department of Cellular Biology, Genetics and Physiology of the University of Malaga (UMA) have succeeded in generating human OLs from pluripotent stem cells derived from patients with nervous system diseases, specifically multiple sclerosis or ALS.

This new method is faster and more efficient, as it enables the generation of OLs in just three weeks. The finding is highlighted on the November cover of the scientific journal Nature Protocols.

"So far, no one has developed any treatment that reverts the loss of myelin and OLs in these patients, probably because there hasn't been an appropriate platform available to study these phenomena", says the researcher of the UMA Juan Antonio García-León, main author of this study.

According to this expert, the generated cells are equivalent to the OLs of a human brain, and they produced myelin around neurons when transplanted into the brain of an animal model.

Efficient drugs

García-León explains that these cells could be used to advance the search for effective treatments that promote myelination. In fact, he asserts, a biotechnology company is already using this new method to develop a drug to reverse the myelin loss involved in multiple sclerosis, something crucial to counteracting its symptoms and pathology.

Neurological diseases

Although the studies on the alteration of OLs and myelin in patients with neurological diseases such as Alzheimer's disease or schizophrenia are still limited, some recent work points to a fundamental role for these cells in those conditions.

The UMA is working in collaboration with other national and international R&D&I groups on applying this new technology to diseases in which the exact involvement of OLs is still unknown.

This study has been conducted by the R&D&I group of the UMA "NeuroAD" -member of the Biomedical Research Institute of Malaga (IBIMA) and the Network Center for Biomedical Research in Neurodegenerative Diseases (CIBERNED)- led by Professor Antonia Gutiérrez. The researchers of the UMA José Carlos Dávila and Laura Cáceres have also participated in the study. Likewise, this research has been developed in collaboration with the KU Leuven University (Belgium) and the Sorbonne University (Paris).

Credit: 
University of Malaga

A therapeutic option for glioblastoma using pH-sensitive nanomicelles

image: We conjugated DAVBNH to an aliphatic ketone-functionalized PEG-PAA block copolymer that spontaneously self-assembles in water to form a micelle.

Image: 
2020 Innovation Center of NanoMedicine

Summary:

Using practical nano-DDS technology, a polymeric nanomicelle that effectively delivers the potent mitotic inhibitor desacetyl vinblastine hydrazide (DAVBNH) to glioblastoma (GBM) was developed. GBM, which grows rapidly under anaerobic conditions, causes acidosis due to enhanced glycolysis, and the developed nanomicelle accurately senses this drop in pH and releases the encapsulated anticancer drug. Vinca alkaloids, represented by vinblastine, are long-known anticancer agents that strongly inhibit cell mitosis, but they are highly toxic to normal cells and poorly tolerated in treatment. This major issue was greatly mitigated by the nano-DDS technology. In an experiment using mice with intracranially transplanted GBM, the group treated with the DAVBNH-loaded nanomicelle showed a 2-fold improvement in 100% survival and a 2.6-fold improvement in 50% survival in comparison with the group treated with free DAVBNH. This result, published in "Biomaterials" (IF = 10.317, 2019) on October 23, suggests that the approach may also be clinically applicable to rapidly progressing cancers other than glioblastoma. https://doi.org/10.1016/j.biomaterials.2020.120463

Main Body:

November 12, 2020, Kawasaki, Japan: The Innovation Center of NanoMedicine (Director General: Prof. Kazunori Kataoka, Location: Kawasaki City, Japan, Abbreviation: iCONM) announced in "Biomaterials" (Impact Factor: 10.317 in 2019) that polymeric nanomicelles that selectively release anti-cancer drugs by exploiting the acidity inside glioblastoma (GBM) cells have been developed, based on the fact that the hydrogen ion concentration (pH) of GBM cells is lower than that of healthy tissues. GBM is a brain tumor with extremely fast disease progression and poor prognosis (5-year survival rate: 10.1%). Although some drug candidates are in clinical trials, there are currently no drug therapies that significantly improve overall survival. In addition, because the boundary with normal tissue is unclear, it is difficult to completely remove the tumor by surgery while preserving as much brain function as possible.

Recently, it has been noted that a treatment method for GBM using an electromagnetic pulse, called Tumor Treatment Fields (TTF), improved overall survival (from 10-14 months to 16-24 months). The mechanism is known to be mitotic inhibition based on the destruction of the mitotic spindle that forms during cell division. We therefore focused on the vinca alkaloids, represented by the anticancer drug vinblastine, which have been used as mitosis inhibitors for decades. This type of drug suppresses mitosis by inhibiting the polymerization of intracellular microtubules and shows strong cytotoxicity, but it affects not only cancer cells but also normal cells, resulting in various serious adverse events including myelosuppression. This led us to consider the selective delivery of vinblastine to tumor tissue using a nano-DDS (drug delivery system) based on polymeric micelles. In such a system, the drug must be released only after reaching the tumor tissue. Cancer cells usually sit in an anaerobic environment, cannot run the TCA cycle effectively to sustain themselves, and obtain their energy almost exclusively from glycolysis. This causes cancer tissues to accumulate acidic molecules, resulting in acidosis. The effect is more pronounced in faster-growing cells, so rapidly developing cancers such as GBM are more acidic. We wanted to take advantage of this property for drug release from the micelles.

A hydrazone bond, which cleaves under acidic conditions, was chosen as the linker; the conjugate was prepared by selecting a block copolymer bearing carbonyl groups such as ketones or aldehydes and using desacetyl vinblastine hydrazide (DAVBNH) as the vinca alkaloid. DAVBNH is known to be 6 times more potent than vinblastine in suppressing the growth of glioblastoma. Self-association in water of a PEG-PAA block copolymer with DAVBNH attached to an aliphatic ketone residue produced nanomicelles with an average diameter of 31 nm, containing DAVBNH in their inner core. When the stability of these micelles was examined in solutions of various acidities from pH 6.0 to pH 7.4, the amount of free DAVBNH differed significantly between pH 6.9 and pH 7.4. In other words, the drug can be released on the slightly acidic side by accurately sensing a subtle difference in pH. It was also found that when the ketone residue is changed to an aldehyde residue, the drug is not released until the pH drops below 5.0.

Using mice in which GL261-Luc cells, a type of GBM cell, had been transplanted intracranially, the micelle preparation of DAVBNH or the free drug was injected through the tail vein and the antitumor effect was examined. Compared with the free-drug group, the 100% survival rate was improved 2-fold and the 50% survival rate was improved 2.6-fold.

Credit: 
Innovation Center of NanoMedicine

Research finds that UK consumers dislike hormones in beef and chlorine washed chicken

New economic research from the University of Kent, University of Reading and IHS Markit, reveals the extent to which UK consumers dislike food produced using production methods such as hormones in beef and chlorine washed chicken.

The research also reveals that UK consumers highly value food production that adheres to food safety standards set by the EU as well as UK produced food. These findings are particularly relevant for post-Brexit trade deals and the ongoing debates about UK food standards.

The researchers conducted choice experiments for four food products, examining UK consumer attitudes toward food produced using several agricultural production methods currently prohibited in the UK, including chlorine washed chicken and beef from cattle raised using hormone implants.

These methods of food production are common in the USA but are prohibited under EU food safety regulations.

Results confirm that UK consumers dislike food produced using these production methods. In contrast, participants positively valued EU food safety standards as well as the UK as a country of origin for beef, chicken, pork and corn production.

These findings are timely given the status of the UK's post-Brexit agricultural trade negotiations and the ongoing debate in Parliament about the legislative basis of future food standards.

Professor Iain Fraser, Principal Investigator and Professor of Agri-Environmental Economics at the University of Kent, said: 'Our findings are a strong indicator of the expectations placed on food production by UK consumers. Methods of food production that fall short in terms of animal welfare draw a negative response from UK consumers, whilst in contrast the presence of EU food safety standards on packaging results in a positive response from consumers. Data from the same project also suggests that consumers tend to strongly value EU food standards regardless of their attitudes towards Brexit.

'As the UK continues to consider post-Brexit agricultural trade arrangements, as well as how to capture industry and public views within the Agricultural Bill currently going through Parliament, these findings support the need to maintain high UK food standards.'

Credit: 
University of Kent

Brain metastases cause severe brain damage that can be inhibited by treatment

Researchers from the University of Seville and the University of Oxford have described how the presence of brain metastases causes acute cerebrovascular dysfunction from the early stages of the disease. The study, whose main author was Manuel Sarmiento Soto, Marie Curie researcher and member of the Group on Mechanisms of Cell Death in Neurodegenerative Diseases at the University of Seville, shows that this alteration is chiefly caused by the activation of cells called astrocytes.

By using a specific treatment to block this activation, the researchers were able to return cerebrovascular flow to healthy levels. This improvement in blood flow around the metastases can limit the neurological deterioration associated with the progression of the disease and improve the otherwise poor life expectancy of these patients. In addition, the specific inhibitors used in this study are also being used in several ongoing clinical trials, which could significantly shorten the delay in translating the researchers' results into clinical use.

Astrocytes are the most abundant type of cell in the central nervous system. One of their critical functions is to protect neurons by ensuring a proper supply of nutrients from the blood flow. As an organ, the brain consumes high levels of energy, and any failure in the blood supply to a region of the brain can cause irreversible damage, leading to the onset of pathologies such as Alzheimer's disease or Parkinson's, or even death.

The key role played by the astrocytes in brain metastasis has been thoroughly proven. For example, in the specific case of brain metastases from breast cancer, when the tumour cells reach the brain they produce a significant activation of astrocytes causing a dysregulation of the afferent blood flow. This causes a decrease in brain perfusion in the areas adjacent to the brain metastases, with the consequent damage that this drop could cause to neurons and other brain cells.

Credit: 
University of Seville

Innovative machine-learning approach for future diagnostic advances in Parkinson's disease

image: Machine-learning based analysis of mitochondria interaction networks

Image: 
@LIH

Parkinson's disease (PD) is the second most common neurodegenerative disease, with patient numbers expected to double worldwide in the next 20 years. The detailed molecular and cellular mechanisms underlying its pathogenesis remain unclear, although recent evidence has pointed towards the role of mitochondrial dysfunction in the onset of the disease. Mitochondria -- small cellular 'subunits' involved in cell metabolism and energy generation -- constantly and dynamically interact with each other, forming perpetually changing networks known as mitochondria interaction networks (MINs). The researchers therefore sought to understand the correlation between the mitochondrial impairments observed in PD and any specific network topological changes in MINs, with the aim of advancing the early diagnosis and classification of PD patients.

"Since conventional analysis focusing on individual mitochondria has not provided satisfying insights into PD pathogenesis, our pioneering work has gone a step forward by investigating the interaction networks between these organelles", explains Dr Feng He, Group Leader of the Immune Systems Biology Group of the LIH Department of Infection and Immunity and corresponding author of the publication.

Leveraging their strong expertise in network analysis and machine learning, the scientists analysed a large 700 Gigabyte dataset of three-dimensional mitochondrial images of colonic neurons, collected from PD patients and healthy controls, and dopaminergic neurons, derived from stem cells. They found that particular network structure features within MINs were altered in PD patients compared to controls. For instance, in PD patients, mitochondria formed connected subnetworks that were generally larger than in healthy individuals. In line with this result, the efficiency of the energy and information transmission and distribution among the different mitochondria in PD patient MINs was significantly lower than in controls, suggesting that the longer 'transmission delays' were associated with the larger diameter of the components of the MINs observed in PD subjects. "These different topological patterns in MINs may mean that energy and information are possibly produced, shared and distributed less competently in the neuronal mitochondria of PD patients relative to healthy controls, suggesting their connection to mitochondrial damage, deficiencies and fragmentation typical of neurodegenerative disorders", adds Dr He.

Moreover, the research team found these different MIN patterns to be highly correlated with the commonly-used PD clinical scores of individual patients, i.e. the Unified Parkinson's Disease Rating Scale (UPDRS). Indeed, when applying a machine learning approach to analyse these MIN characteristics, the researchers observed that the use of a combination of those network features alone allowed them to accurately distinguish between PD patients and healthy controls.
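
For readers who want a concrete picture of this kind of analysis, the sketch below shows one plausible way to turn each mitochondria interaction network into a handful of topological features (size of the largest connected subnetwork, its diameter, global efficiency, mean degree) and to classify patients versus controls from those features alone. The graph construction, feature set, and random-forest classifier are assumptions for illustration; the study's actual imaging-to-network pipeline and model are not described in this article.

```python
# Hypothetical sketch: turn each mitochondria interaction network (MIN) into a
# small feature vector of topological measures, then train a classifier to
# separate PD patients from controls. The specific features, classifier, and
# random graphs below are illustrative assumptions, not the study's pipeline.
import numpy as np
import networkx as nx
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def min_features(graph):
    """Topological features of one MIN, e.g. component size and efficiency."""
    components = [graph.subgraph(c) for c in nx.connected_components(graph)]
    largest = max(components, key=len)
    return [
        len(largest) / graph.number_of_nodes(),   # relative size of largest subnetwork
        nx.diameter(largest),                     # diameter of largest component
        nx.global_efficiency(graph),              # proxy for transmission efficiency
        np.mean([d for _, d in graph.degree()]),  # mean number of interactions
    ]

def random_min(n_nodes, p, seed):
    """Stand-in for a network reconstructed from 3D mitochondrial images."""
    return nx.fast_gnp_random_graph(n_nodes, p, seed=seed)

# Synthetic cohort: "patients" get slightly denser networks than "controls".
graphs = [random_min(120, 0.03, s) for s in range(40)]          # controls
graphs += [random_min(120, 0.05, s + 100) for s in range(40)]   # patients
labels = np.array([0] * 40 + [1] * 40)

X = np.array([min_features(g) for g in graphs])
clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, labels, cv=5)
print("cross-validated accuracy:", scores.mean())
```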

"Our findings bring forward the potential of using particular mitochondrial network features as novel biomarkers for the early diagnosis and classification of PD patients, which might help develop a new health index. As a next step, we will explore how our results may offer new perspectives for the understanding of various other neurodegenerative diseases characterised by mitochondrial dysregulation, such as Huntington disease and Alzheimer's, making our work a true instance of translational and transversal research", states Prof Rejko Krüger, Director of Transversal Translational Medicine at LIH and contributing author of the study.

"This publication also constitutes a major step forward in the application of advanced machine-learning techniques to unravel the complex network interactions of cellular organelles for disease stratification. Indeed, data analytics and innovative digital technologies are a core priority area for our department and for LIH as a whole", concludes Prof Markus Ollert, Director of the Department of Infection and Immunity and contributing author of the paper.

The inter-disciplinary study relied on the close cooperation between clinicians, neuroscientists, network biologists, big data and machine learning experts from the Luxembourg Institute of Health (LIH), the Luxembourg Centre for Systems Biology (LCSB) and the Central Hospital of Luxembourg (CHL), particularly via clinical neurologist Dr Nico Diederich, as well as on the collaboration with other international partners such as the Instituto de Física Interdisciplinar y Sistemas Complejos IFISC (Spain).

Credit: 
Luxembourg Institute of Health

3D printing -- a 'dusty' business?

To close the substantial gaps in our knowledge, scientists at the German Federal Institute for Risk Assessment (BfR) are investigating which particles are released into the environment and what their properties are. Different substances are released into the air depending on the material used for printing. For example, BfR experts were able to detect particles of the widely-used plastic polylactic acid and copper crystals, among other substances.

The size of the particles was 50 nanometres (polylactic acid) and 120 to 150 nanometres (copper). This means that they are so small that they can get into the alveoli, the smallest branches of the lungs. The higher the temperature during "printing", the more particles were released. The BfR is now exploring whether "3D printer dust" poses a health risk.

Consumer safety regarding 3D printers was also the focus of an expert meeting (partly held online) that took place at the BfR on 28 August 2020. In addition to the BfR, the participating institutions were the German Federal Institute for Materials Research and Testing (BAM), the German Environment Agency (UBA), the Institute for Occupational Safety and Health of the German Social Accident Insurance (IFA) and the US Consumer Products Safety Commission (CPSC). Research institutes from Technische Universität Berlin, interest groups and members of the 3D printing association "3DDruck e.V.", in which users and manufacturers are organised, were also represented.

The event focused on fused deposition modelling (FDM, also known as fused filament fabrication, FFF). In this additive production process, a thermoplastic, the filament, is heated and then applied layer by layer to create the desired object.

The BfR, BAM and CPSC presented their initial results at the meeting. These showed that volatile components and particles are released during printing. Release is influenced by the materials used (plastic, dyes, additives) and the printing temperature. Investigations conducted by BAM and CPSC were carried out with 3D printers, while the BfR also tested 3D printing pens. There is little information available on possible health effects so far, so the BfR sees this as a core research area.

Different measures were discussed to reduce the release and ensure consumer protection. Other issues included possible risks in the subsequent treatment of 3D printed objects (e.g. through smoothing down) as well as the use of other 3D printing methods, such as stereolithography (SLA) or selective laser sintering (SLS). Better ways to distribute information regarding possible health risks to consumers were also discussed.

Credit: 
BfR Federal Institute for Risk Assessment