Earth

Ecofriendly nano-fabrication achieved with biodegradable chitosan nanoparticles

image: Nano-fabrication schematic for self-cleaning antireflective glass: ecofriendly nano-fabrication using chitosan nanoparticles compared with conventional nano-fabrication using synthetic polymer nanoparticles

Image: 
Korea Institute of Machinery and Materials (KIMM)

The Korea Institute of Machinery and Materials (KIMM, President Chun Hong Park) has succeeded in creating glass with self-cleaning and antireflective functions through a biodegradable chitosan nanoparticle coating. This is the first use of a biodegradable material in nanosphere lithography. The results of the study can be used to eliminate synthetic polymer nanoparticles, a form of microplastic waste associated with toxicity issues, from nano-fabrication.

Dr. Hyuneui Lim and Dr. Seung-Chul Park of the Department of Nature-Inspired Nano Convergence Systems under the Nano-Convergence Mechanical Systems Research Division used chitosan, obtained from crab shells, to develop an ecofriendly nanoparticle coating process, and published the results in ACS Applied Materials & Interfaces* on Oct. 30.

* Title of paper: Synthesis of Surface-Reinforced Biodegradable Chitosan Nanoparticles and Their Application in Nanostructured Antireflective and Self-Cleaning Surfaces

The team used biodegradable chitosan nanoparticles instead of polystyrene nanoparticles in the nanostructuring of antireflective, self-cleaning glass surfaces. The main advantage of the proposed process is that the ecofriendly material eliminates the microplastic waste generated by conventional nanostructuring processes.

With the growing interest in environmental issues, extensive research has been conducted to adopt ecofriendly approaches in nano-fabrication processes. However, it has been difficult to use biodegradable materials due to their weak intrinsic properties.

The team strengthened the physical properties of biodegradable chitosan nanoparticles through surface treatment and used them in place of polystyrene, the material most commonly employed in nanoparticle coating.

While chitosan particles have been used in food and drug delivery, this is the first application of chitosan to nanosphere lithography. The results are expected to lead to diverse applications in processes involving polymer nanoparticles.

To date, cheap, spherical polystyrene nanoparticles have been used to make nanostructures on surfaces. This process produces large amounts of waste containing plastic nanoparticles and has been a significant environmental concern.

Hyuneui Lim, head of the Department of Nature-Inspired Nano Convergence Systems, said, "The significance of this study lies in being the world's first ecofriendly nanosphere lithography that successfully reduces the use of nanoplastic particles. We expect it to have various applications in processes requiring the use of polymer nanoparticles."

The study received funding from the Convergence Research Program of the National Research Council of Science and Technology as a "preliminary study on Teflon-replacing high-temperature resistant superhydrophobic surface design and process technology", and from the Ministry of Trade, Industry and Energy for the "development of self-cleaning high-value-added color glass for solar modules in urban buildings."

Credit: 
National Research Council of Science & Technology

Compliance with Paris Agreement would limit loss of productivity in fishing, agriculture

What is the global impact of climate change on fishing and agriculture? An international team of scientists led by the CNRS, involving the University of Montpellier in France, has studied this question by applying climate models to worldwide data on employment, the economy, and food security. Their findings, published in Science Advances on 27 November, show that 90% of the global population may face decreases in productivity in both agriculture and fishing if greenhouse gas emissions are not reduced. On the other hand, most countries are in a position to limit these losses if emissions are drastically cut, as stipulated by the Paris Agreement.

By combining climate models with global employment, economic, and food security data, a group of scientists has analysed the potential effects of climate change on two key food sectors: agriculture and fishing.

Adopting the scenario of no reduction in greenhouse gas emissions, they have shown that roughly 90% of the worldwide human population--for the most part living in those countries most vulnerable to climate change and less able to adapt to it--would likely face productivity losses for agriculture and fishing, while less than 3% of the population would see simultaneous gains in productivity in their regions of the world by 2100. This scenario offers extremely little room for adaptation. It would be impossible to offset the impact on fishing by developing agriculture, or vice versa: both sectors would be hit hard.

Yet if the Paris Agreement is honoured, which would entail a drastic reduction in greenhouse gas emissions, the scientists conclude that the majority of countries--not just the most vulnerable, but also the majority of those responsible for the greatest emissions--would come out ahead. Though productivity would still be lost in many cases (affecting 60% of the population), the magnitude of this blow would be considerably lower. The most vulnerable nations would see only a fifth to a fourth of the losses they would otherwise suffer, giving them ample slack to implement adaptive strategies--e.g. diversification within an affected sector (by developing varieties that would be viable in the climate of tomorrow) or investment in sectors relatively unscathed by changing climate conditions, or even benefiting from them.

These findings thus suggest that making societies less vulnerable to the future effects of climate change requires quick action to attenuate it, along with strategic adaptation in regions where negative impacts appear inevitable.

Credit: 
CNRS

Underwater telecom cables make superb seismic network

image: Researchers employed 20 kilometers (pink) of a 52-kilometer undersea fiber-optic cable, normally used to communicate with an offshore science node (MARS, the Monterey Accelerated Research System), as a seismic array to study the fault zones under Monterey Bay. During the four-day test, the scientists detected a magnitude 3.4 earthquake 45 kilometers away in Gilroy and mapped previously uncharted fault zones (yellow circle).

Image: 
Nate Lindsey, UC Berkeley

Fiber-optic cables that constitute a global undersea telecommunications network could one day help scientists study offshore earthquakes and the geologic structures hidden deep beneath the ocean surface.

In a paper appearing this week in the journal Science, researchers from the University of California, Berkeley, Lawrence Berkeley National Laboratory (Berkeley Lab), the Monterey Bay Aquarium Research Institute (MBARI) and Rice University describe an experiment that turned 20 kilometers of undersea fiber-optic cable into the equivalent of 10,000 seismic stations along the ocean floor. During their four-day experiment in Monterey Bay, they recorded a magnitude 3.4 quake and seismic scattering from underwater fault zones.

Their technique, which they had previously tested with fiber-optic cables on land, could provide much-needed data on quakes that occur under the sea, where few seismic stations exist, leaving 70% of Earth's surface without earthquake detectors.

"There is a huge need for seafloor seismology. Any instrumentation you get out into the ocean, even if it is only for the first 50 kilometers from shore, will be very useful," said Nate Lindsey, a UC Berkeley graduate student and lead author of the paper.

Lindsey and Jonathan Ajo-Franklin, a geophysics professor at Rice University in Houston and a visiting faculty scientist at Berkeley Lab, led the experiment with the assistance of Craig Dawe of MBARI, which owns the fiber-optic cable. The cable stretches 52 kilometers offshore to the first seismic station ever placed on the floor of the Pacific Ocean, put there 17 years ago by MBARI and Barbara Romanowicz, a UC Berkeley Professor of the Graduate School in the Department of Earth and Planetary Science. A permanent cable to the Monterey Accelerated Research System (MARS) node was laid in 2009, 20 kilometers of which were used in this test while off-line for yearly maintenance in March 2018.

"This is really a study on the frontier of seismology, the first time anyone has used offshore fiber-optic cables for looking at these types of oceanographic signals or for imaging fault structures," said Ajo-Franklin. "One of the blank spots in the seismographic network worldwide is in the oceans."

The ultimate goal of the researchers' efforts, he said, is to use the dense fiber-optic networks around the world -- probably more than 10 million kilometers in all, on both land and under the sea -- as sensitive measures of Earth's movement, allowing earthquake monitoring in regions that don't have expensive ground stations like those that dot much of earthquake-prone California and the Pacific Coast.

"The existing seismic network tends to have high-precision instruments, but is relatively sparse, whereas this gives you access to a much denser array," said Ajo-Franklin.

Photonic seismology

The technique the researchers use is Distributed Acoustic Sensing (DAS), which employs a photonic device that sends short pulses of laser light down the cable and detects the backscattering created when the cable is strained by stretching. Using interferometry, they can measure the backscatter every 2 meters (6 feet), effectively turning a 20-kilometer cable into 10,000 individual motion sensors.

"These systems are sensitive to changes of nanometers to hundreds of picometers for every meter of length," Ajo-Franklin said. "That is a one-part-in-a-billion change."

Earlier this year, they reported the results of a six-month trial on land using 22 kilometers of cable near Sacramento emplaced by the Department of Energy as part of its 13,000-mile ESnet Dark Fiber Testbed. Dark fiber refers to optical cables that have been laid underground but are unused or leased out for short-term use, in contrast to the actively used "lit" internet. The researchers were able to monitor seismic activity and environmental noise and to obtain subsurface images at a higher resolution and larger scale than would have been possible with a traditional sensor network.

"The beauty of fiber-optic seismology is that you can use existing telecommunications cables without having to put out 10,000 seismometers," Lindsey said. "You just walk out to the site and connect the instrument to the end of the fiber."

During the underwater test, they were able to measure a broad range of frequencies of seismic waves from a magnitude 3.4 earthquake that occurred 45 kilometers inland near Gilroy, California, and map multiple known and previously unmapped submarine fault zones, part of the San Gregorio Fault system. They also were able to detect steady-state ocean waves -- so-called ocean microseisms -- as well as storm waves, all of which matched buoy and land seismic measurements.

"We have huge knowledge gaps about processes on the ocean floor and the structure of the oceanic crust because it is challenging to put instruments like seismometers at the bottom of the sea," said Michael Manga, a UC Berkeley professor of earth and planetary science. "This research shows the promise of using existing fiber-optic cables as arrays of sensors to image in new ways. Here, they've identified previously hypothesized waves that had not been detected before."

According to Lindsey, there's rising interest among seismologists to record Earth's ambient noise field caused by interactions between the ocean and the continental land: essentially, waves sloshing around near coastlines.

"By using these coastal fiber optic cables, we can basically watch the waves we are used to seeing from shore mapped onto the seafloor, and the way these ocean waves couple into the Earth to create seismic waves," he said.

To make use of the world's lit fiber-optic cables, Lindsey and Ajo-Franklin need to show that they can ping laser pulses through one channel without interfering with other channels in the fiber that carry independent data packets. They're conducting experiments now with lit fibers, while also planning fiber-optic monitoring of seismic events in a geothermal area south of Southern California's Salton Sea, in the Brawley seismic zone.

Credit: 
University of California - Berkeley

Research enables artificial intelligence approach to create AAV capsids for gene therapies

image: Improved AAV vector capsid for gene therapy engineered with a new machine-guided approach shows, in red, improvements in efficiency of viral production based on the average effect of insertions at all possible amino acid positions, with white showing neutral and blue showing deleterious positions. (Left: capsid viewed from outside, Right: cut-out to reveal inner positions).

Image: 
Eric Kelsic, Dyno Therapeutics

Cambridge, MA, November 28, 2019 -- Dyno Therapeutics, a biotechnology company pioneering the use of artificial intelligence in gene therapy, today announced a publication in the journal Science that demonstrates the power of a comprehensive machine-guided approach to engineering improved capsids for gene therapy delivery. The research was conducted by Dyno co-founders Eric D. Kelsic, Ph.D. and Sam Sinai, Ph.D., together with colleague Pierce Ogden, Ph.D., at Harvard's Wyss Institute for Biologically Inspired Engineering and the Harvard Medical School laboratory of George M. Church, Ph.D., a Dyno scientific co-founder. The publication, entitled "Comprehensive AAV capsid fitness landscape reveals a viral gene and enables machine-guided design," is available here: https://science.sciencemag.org/lookup/doi/10.1126/science.aaw2900

AAV (adeno-associated virus) capsids are presently the most commonly used vectors for gene therapy because of their established ability to deliver genetic material to patient organs and their proven safety profile. However, only a few naturally occurring AAV capsids exist, and they lack properties essential for optimal gene therapy, such as targeted delivery, evasion of the immune system, higher levels of viral production, and greater transduction efficiency. Starting at Harvard in 2015, the authors set out to overcome the limitations of current capsids by developing new machine-guided technologies to rapidly and systematically engineer a suite of new, improved capsids for widespread therapeutic use.

In the research published in Science, the authors demonstrate the advantages of their machine-guided approach to AAV engineering. Previous approaches have been limited by the difficulty of altering a complex capsid protein without breaking its function, and by a general lack of knowledge about how AAV capsids interact with the body. Historically, rather than addressing this challenge directly, the most popular approaches to capsid engineering have taken a roundabout route: generating libraries of new capsids by making random changes to the protein. However, since most random changes to the capsid decrease its function, such random libraries contain few viable capsids, much less improved ones. Recognizing the limitations of conventionally generated capsid libraries, the authors implemented a machine-guided approach: they gathered vast amounts of data with new high-throughput measurement technologies, used those data to learn how to build better libraries and, ultimately, produced synthetic capsids with optimized delivery properties.

Focusing on the AAV2 capsid, the authors generated a complete landscape of all single codon substitutions, insertions and deletions, then measured the functional properties important for in vivo delivery. They then used a machine-guided approach, leveraging these data to efficiently generate diverse libraries of AAV capsids with multiple changes that targeted the mouse liver and outperformed AAVs generated by conventional random mutagenesis. In the process, the authors' systematic efforts unexpectedly revealed a previously unrecognized protein encoded within the sequence of all the most popular AAV capsids, which they termed membrane-associated accessory protein (MAAP). The authors believe the protein plays a role in the natural life cycle of AAV.
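To make the idea concrete, here is a minimal sketch of the kind of additive model such a machine-guided approach can build on: fitness effects measured once for single changes are summed to predict which variants combining several changes are worth including in a library. All positions, residues and scores below are invented placeholders, not data from the paper.

```python
# Toy additive (linear) fitness model over capsid mutations - a sketch of
# machine-guided library design. All values are invented placeholders.
import itertools

single_mut_score = {            # (position, new residue) -> measured fitness effect
    (451, "A"): 0.3, (451, "K"): -1.2,
    (532, "T"): 0.5,
    (588, "G"): 0.1, (588, "R"): -0.8,
}

def predict_fitness(muts):
    """Additive model: the combined effect is the sum of single-mutation effects."""
    return sum(single_mut_score[m] for m in muts)

# Propose double mutants (at most one change per position) and keep those the
# model predicts are viable, instead of filling the library with random,
# mostly non-functional variants.
pairs = [p for p in itertools.combinations(single_mut_score, 2) if p[0][0] != p[1][0]]
viable = [p for p in pairs if predict_fitness(p) > 0]
print(viable)   # 3 of 8 candidate pairs survive the model's filter
```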

"This is just the beginning of machine-guided engineering of AAV capsids to transform gene therapy," underscores co-author Sam Sinai, Ph.D., Lead Machine Learning Scientist and co-founder of Dyno Therapeutics. "The success of the simple linear models used in this study has led us to pursue more data and higher capacity machine learning models, where the potential for improvement in capsid designs feels boundless."

"The results in the Science publication demonstrate, for the first time, the power of linking a comprehensive set of advanced techniques - large scale DNA synthesis, pooled in vitro and in vivo screens, next-generation sequencing readouts, and iterative machine-guided capsid design - to generate optimized synthetic AAV capsids," explains co-first and co-corresponding author Eric D. Kelsic, Ph.D., CEO and co-founder of Dyno Therapeutics. "At Dyno, our team is committed to advancing these technologies to identify capsids that meet the urgent needs of patients who can benefit from gene therapies."

Credit: 
The Yates Network

Illuminating seafloor seismology with existing 'dark' fiber-optic cables

A new fault system on the seafloor was discovered off California's coast by temporarily transforming a pre-existing underwater fiber-optic cable into an array of nearly 10,000 seismic sensors, according to a new study. The results showcase the potential of leveraging the extensive web of subsea fiber-optic telecommunication cables already spanning the ocean floor to monitor and record oceanographic and seismic processes in unprecedented detail. Deep below the surface, tectonic forces conspire to fracture and fold the Earth's crust. These rocks break and move at faults. Like geological scars, the surface of Earth is striated with faults; the largest and most lively - where rocks are actively snapping and shifting - are responsible for triggering destructive earthquakes and tsunamis. Deposits of oil and gas are often found along these structures, too. However, charting Earth's fault zones is challenging and many remain unknown, particularly those that lie on the bottom of the ocean. As a result, offshore seismic hazard potential is not fully understood, and information about offshore resources is incomplete. Distributed Acoustic Sensing (DAS), a type of fiber-optic sensing that uses pulses of laser light to continuously detect slight movements along optical fibers, has been used to measure seismic waves and map faults on land. Researchers have suggested it could be used to measure seafloor seismic activity as well. During a maintenance window, Nathaniel Lindsey and colleagues temporarily repurposed an undersea fiber-optic cable - part of the Monterey Accelerated Research System - to collect DAS measurements across the continental shelf off California's coast. According to Lindsey et al., DAS turned the decade-old cable into the equivalent of thousands of sensitive seismic sensors. Over the test's short duration, the authors were able to map a previously unknown fault system and observe several dynamic tidal and storm-driven processes in the water column above. Philippe Jousset discusses the study in a related Perspective.

Credit: 
American Association for the Advancement of Science (AAAS)

An agenda for multidisciplinary cyber risk research

The science of cyber risk is inherently interdisciplinary, argue Gregory Falco and colleagues in this Policy Forum; no single academic field can adequately address the related problems on its own. The researchers also propose a new multi-field model for addressing this risk. "Only through such multidisciplinary collaboration can the science of cyber risk systematically move forward," write the authors. Though often considered a technical issue in the wheelhouse of computer science, cyber risk encompasses a broad and increasingly complex gamut of digital technologies and information systems spanning many fields and sectors. As a field of study, however, cyber risk means different things to a variety of disparate academic fields that rarely coordinate across disciplinary boundaries. Here, Falco et al. make the case that cyber risk research can - and should - be systematically addressed through a common collaborative research agenda that equally involves the relevant academic fields, such as behavioral science, data science, economics, law, management science and political science. With input from experts and stakeholders across academia, industry and government agencies worldwide, Falco et al. developed a unified concept model for cyber risk, which establishes a core list of questions that must be addressed to understand cyber risk and identifies how each field can best contribute to the effort.

Credit: 
American Association for the Advancement of Science (AAAS)

Combine chemical probe resources to optimize biomedical research, scientists urge

A new report urges biomedical researchers to choose online resources carefully, taking into account their complementary strengths and weaknesses, when selecting small-molecule chemical probes to help answer their research questions.

In a special report published today (Thursday) in the journal Future Medicinal Chemistry, scientists at The Institute of Cancer Research, London, present the first comprehensive assessment of all the publicly available resources on chemical probes.

The report strongly recommends that researchers avoid general search engines and vendor catalogue information and instead use two kinds of online resource - expert reviews and computational Big Data approaches - to make the best decisions about which chemical probes to use in biomedical research and to avoid the poor selection of tools, which can lead to incorrect conclusions.

Small-molecule chemical probes are important tools that are widely used by scientists to modify - usually to inhibit - the activity of individual proteins in isolated cells or organisms and hence to determine their function. Chemical probes are also used to test the role of a particular protein in diseases such as cancer, and to help validate that protein as a target for future drugs.

Small-molecule probes provide an alternative to genetic technologies such as RNA interference and CRISPR - in fact, the chemical and genetic approaches are highly complementary and especially effective when used together.

Unfortunately, many small-molecule compounds that have been claimed as chemical probes, and often used very widely, are not sufficiently specific for the protein of interest - and using them can generate incorrect results. Some compounds hit a few extra protein targets, while others hit very many.

There is a pressing need to supply biomedical scientists with appropriate information so that they can select the best possible chemical probes for their experiments, helping to ensure that their research findings are robust.

A major challenge is that the information about potential chemical probes is scattered across many different scientific publications and other sources, making it difficult for scientists to make a fully informed choice - especially if the researcher is not an expert in chemical biology, pharmacology or drug discovery. Fortunately, help has been forthcoming over the last few years through the development of publicly available online resources on chemical probes.

The team of chemical probe researchers at The Institute of Cancer Research (ICR) discuss the various different web resources that supply information to guide the choice of chemical tools - finding that they all have their own strengths and limitations. They provide user-friendly advice on how to navigate these resources to select the best possible chemical probes for a researcher's needs.

The report stresses the value of resources that provide reviews written by chemical probe experts. It especially highlights the Chemical Probes Portal where chemical biology experts currently provide assessments on 192 chemical probes for 181 different proteins of interest and offer advice on how best to use them. The portal also has information on around 200 other small-molecule compounds that should no longer be employed, even though many of these may have been used extensively in the past.

But there are also some limitations to resources that rely on expert peer review. For instance, most of the probes assessed on the Chemical Probes Portal act on particular protein families (kinases, G-protein-coupled receptors, phosphodiesterases, epigenetic proteins and BCL2 family members) and many other protein families are not yet covered. In addition, it is not so easy to keep such resources up to date, since they rely on manual input from volunteer reviewers. Changes are under way to address these points and make other enhancements to the portal.

The report also emphasises the usefulness of resources that provide objective assessment of chemical probes using large-scale, quantitative computational analysis. It highlights Probe Miner, a public web-based resource that was launched by the ICR research team in 2017 with funding from Wellcome. Probe Miner has several key advantages. It is objective, data-driven, quantitative and very comprehensive - based on bioactivity data for more than 300,000 small molecules acting on more than 2,300 human proteins. The underlying databases are frequently updated, ensuring that analyses are carried out using the latest information.
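To illustrate what an objective, data-driven ranking can mean in practice, the toy Python sketch below ranks compounds for a target purely by a simple selectivity score derived from bioactivity data. Probe Miner's real scoring combines several factors (potency, selectivity, cellular activity and others); this simplified version uses selectivity alone, and every compound, target and value is invented:

```python
# Toy probe ranking: score = potency on the intended target minus the
# strongest off-target potency (pIC50 units; higher = more potent).
# All compounds, targets and values are invented for illustration.
bioactivity = {
    "probe_A": {"TARGET_X": 8.5, "OFF_1": 5.0, "OFF_2": 4.2},
    "probe_B": {"TARGET_X": 7.9, "OFF_1": 7.5},
    "probe_C": {"TARGET_X": 6.1, "OFF_1": 3.0},
}

def selectivity(compound, target):
    acts = bioactivity[compound]
    off = [v for k, v in acts.items() if k != target]
    return acts[target] - max(off, default=0.0)

for c in sorted(bioactivity, key=lambda c: selectivity(c, "TARGET_X"), reverse=True):
    print(c, round(selectivity(c, "TARGET_X"), 1))
# probe_A 3.5, probe_C 3.1, probe_B 0.4 - probe_B is potent but poorly selective
```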

But the researchers also warned that some data sources and chemical probes could be missed from these computational systems - and also that the rankings of probes could be difficult to interpret for biologists lacking detailed expertise in chemical biology.

The report concludes that the continued enhancement of online resources will improve the selection of high-quality chemical probes and increase the robustness of biomedical research. It recommends that the complementary expert-reviewed and computational data-driven resources should be used alongside each other to ensure the best decisions about which chemical tools to use - and that this could go a long way to addressing the major problems with the current misuse of chemical probes in biomedical science and drug discovery.

The authors also acknowledge that there is a significant challenge to make biologists aware of the resources available and to encourage their use. They encourage the wide dissemination of the recommendations in the report and uptake by researchers, research funders, journals and vendors to improve the quality and robustness of biomedical research.

Study co-leader Professor Bissan Al-Lazikani, Head of Data Science at The Institute of Cancer Research, London, said:

"Chemical probes are vital tools in biomedical research, playing a key role in understanding how proteins work and what impact they have in cancer cells. These chemical tools frequently also power the start of campaigns to discover new cancer drugs. So it's of the utmost importance that scientists are careful and thorough when choosing chemical probes for their experiments. Failure to do so can result in unreliable or misleading results.

"In the past five years, we have seen a rise in efforts to pull together all available data on the characteristics and quality of chemical probes in publicly accessible databases, including our own data-driven resource, Probe Miner.

"In our review of the various online chemical probe resources available we found that they all have their own merits, and that Big Data approaches are a major step forward in bringing together the most up-to-date evidence. We really need to be combining the different sources of information so that researchers can get the best possible information about chemical probes."

Study co-leader Dr Albert Antolin, Sir Henry Wellcome Postdoctoral Fellow in Systems Pharmacology at The Institute of Cancer Research, London, said:

"To be useful for research, chemical probes actually have to be more selective for a target than many drugs used to treat patients, where the effectiveness and safety of the treatment are the most important criteria. If researchers want to find out the exact role of a particular protein in the body, or in disease, then action on several targets can be misleading or completely unacceptable.

"There is a vast volume of information on potential chemical probes but this information is spread around in different formats, so biomedical researchers need help to find the best high-quality chemical tools for their research."

Study co-leader Professor Paul Workman, Chief Executive of The Institute of Cancer Research, London, said:

"The poor selection and use of chemical probes can lead to incorrect and misleading results. There have been cases where use of poor-quality compounds has led scientists down entirely the wrong track, wasting precious time and funding, and even at times slowing down the discovery of drugs for the treatment of patients.

"It's incredibly important that we spread the word about the importance of prudently choosing the best possible chemical probes in biomedical research, so we can put a stop to the ongoing use of outdated tools, and ensure that biomedical science leads to robust, reliable results.

"Using chemical probe resources curated by experts alongside those powered by Big Data approaches is the best way to select the right probe on the basis of the most recent evidence. We urge researchers, funders, scientific journals and vendors to utilise the recommendations in the report."

Credit: 
Institute of Cancer Research

What protects killer immune cells from harming themselves?

White blood cells, which release a toxic potion of proteins to kill cancerous and virus-infected cells, are protected from any harm by the physical properties of their cell envelopes, find scientists from UCL and the Peter MacCallum Cancer Centre in Melbourne.

Until now, it has been a mystery to scientists how these white blood cells - called cytotoxic lymphocytes - avoid being killed by their own actions, and the discovery could help explain why some tumours are more resistant than others to recently developed cancer immunotherapies.

The research, published in Nature Communications, highlights the role of the physical properties of the white blood cell envelope, namely the molecular order and electric charge, in providing such protection.

According to Professor Bart Hoogenboom (London Centre for Nanotechnology, UCL Physics & Astronomy and UCL Structural & Molecular Biology), co-author of the study: "Cytotoxic lymphocytes, or white blood cells, rid the body of disease by punching holes in rogue cells and by injecting poisonous enzymes inside. Remarkably, they can do this many times in a row, without harming themselves. We now know what effectively prevents these white blood cells from committing suicide every time they kill one of their targets."

The scientists made the discovery by studying perforin, which is the protein responsible for the hole-punching. They found that perforin's attachment to the cell surface strongly depends on the order and packing of the molecules - so-called lipids - in the membrane that surrounds and protects the white blood cells.

More order and tighter packing of the lipid molecules led to less perforin binding, and when the researchers artificially disrupted the lipid order in the white blood cells, the cells became more sensitive to perforin.

However, they also found that when the white blood cells were exposed to so much perforin that some of it stuck to their surface, the bound perforin still failed to kill the white blood cells, indicating that there must be another layer of protection. This turned out to be the negative charge of some lipid molecules sent to the cell surface, which bound the remaining perforin and blocked it from damaging the cell.

Joint first author, Dr Adrian Hodel, who screened and studied many membrane systems for this work, said: "We have long known that local lipid order can change how cells communicate with each other, but it was rather surprising that the precise physical membrane properties can also provide such an important layer of protection against molecular hole-punchers."

In Melbourne, joint first author Jesse Rudd-Schmidt, who focused on the characterisation of the white blood cells in Associate-Professor Ilia Voskoboinik's laboratory, said: "What we have found helps to explain how our immune system can be so effective in killing rogue cells. We are now also keen to investigate if cancer cells may use similar protection to avoid being killed by immune cells, which would then explain some of the large variability in patient response to cancer immunotherapies."

Credit: 
University College London

Researchers create 'smart' surfaces to help blood-vessel grafts knit better, more safely

image: McMaster University researcher Tohid Didar, a mechanical and biomedical engineer, led a team that has created a coating to make synthetic vascular grafts less prone to infection and clotting and more likely to integrate with the body.

Image: 
Georgia Kirkos, McMaster University

HAMILTON, ON, Nov. 27, 2019 - Researchers at McMaster University have created a new coating to prevent clotting and infection in synthetic vascular grafts, while also accelerating the body's own process for integrating the grafted vessels.

Variants of the coating material, described in two new articles published by the journals Small (published today) and ACS Biomaterials Science and Engineering (published Nov. 8), are "smart" coatings that line the vessels and prevent clot formation and bacterial adhesion while selectively attracting targeted cells that foster the growth of natural vessel walls, promoting faster, smoother healing.

Each article verifies the success of a different formulation of the coating: one designed for Dacron grafts (Small), the other for Teflon grafts (ACS Biomaterials Science and Engineering) - the two major materials used to make artificial vessels. The smart materials are made to coat the inner walls of new sections of replacement vessels typically deployed after injury or disease.

Synthetic materials currently used in vascular grafts can be problematic because their surface properties and texture can collect cells and initiate blood clotting, a risk which requires patients to use anti-coagulant drugs such as warfarin for long periods.

These surfaces can also accelerate the buildup of microbes that can cause infection.

"These surfaces repel non-desirable elements in the blood: infections and clotting," says Tohid Didar, the McMaster mechanical and biomedical engineer who led the research team. "The hope is that down the road we can use less and less anti-coagulant medication on patients and at the same time that we can assure that the site remains uninfected."

The researchers collaborated with Jeffrey Weitz of the Thrombosis and Atherosclerosis Research Institute and McMaster chemical engineer Zeinab Hosseini-Doust to test the new material in lab experiments using human tissue.

The components used in the material have already been approved for use in humans, which is expected to shorten the process for getting the new material approved for use in clinical settings.

Didar's team had previously developed selectively repellent surfaces for other applications, but this is the first for use in blood vessels, where infection, clotting and rejection make the use of these grafts challenging.

Credit: 
McMaster University

Glass from a 3D printer

image: Various glass objects created with a 3D printer.

Image: 
Photo: Group for Complex Materials / ETH Zurich

Producing glass objects using 3D printing is not easy. Only a few groups of researchers around the world have attempted to produce glass using additive methods. Some have made objects by printing molten glass, but the disadvantage is that this requires extremely high temperatures and heat-resistant equipment. Others have used powdered ceramic particles that can be printed at room temperature and then sintered later to create glass; however, objects produced in this way are not very complex.

Researchers from ETH Zurich have now used a new technique to produce complex glass objects with 3D printing. The method is based on stereolithography, one of the first 3D printing techniques developed during the 1980s. David Moore, Lorenzo Barbera, and Kunal Masania in the Complex Materials group led by ETH professor André Studart have developed a special resin that contains a plastic and organic molecules to which glass precursors are bonded. The researchers reported their results in the latest issue of the journal Nature Materials.

Light used to "grow" objects

The resin can be processed using commercially available Digital Light Processing technology. This involves irradiating the resin with UV light patterns. Wherever the light strikes the resin, it hardens, because the light-sensitive components of the polymer resin cross-link at the exposed points. The plastic monomers combine to form a labyrinth-like structure, creating the polymer. The ceramic-bearing molecules fill the interstices of this labyrinth.

An object can thus be built up layer by layer. The researchers can change various parameters in each layer, including pore size: weak light intensity results in large pores; intense illumination produces small pores. "We discovered that by accident, but we can use this to directly influence the pore size of the printed object," says Masania.

The researchers are also able to modify the microstructure layer by layer by mixing silica with borate or phosphate and adding it to the resin. Complex objects can thus be made from different types of glass, and different glasses can even be combined in the same object using this technique.

The researchers then fire the blank produced in this way at two different temperatures: at 600°C to burn off the polymer framework and then at around 1000°C to densify the ceramic structure into glass. During the firing process, the objects shrink significantly, but become transparent and hard like window glass.

Patent application submitted

These 3D-printed glass objects are still no bigger than a die. Large glass objects, such as bottles, drinking glasses or window panes, cannot be produced in this way - which was not actually the goal of the project, emphasises Masania.

The aim was rather to prove the feasibility of producing glass objects of complex geometry using a 3D printing process. However, the new technology is not just a gimmick. The researchers applied for a patent and are currently negotiating with a major Swiss glassware dealer who wants to use the technology in his company.

Credit: 
ETH Zurich

Nearly 40% of species are very rare and are vulnerable to climate change

image: Global hotspots of rare plant species.

Image: 
Patrick R. Roehrdanz, Moore Center for Science, Conservation International. Data from Enquist et al.

Almost 40% of global land plant species are categorized as very rare, and these species are most at risk for extinction as the climate continues to change, according to new University of Arizona-led research.

The findings are published in a special issue of Science Advances that coincides with the 2019 United Nations Climate Change Conference, also known as COP25, in Madrid. The COP25 is convening nations to act on climate change. The international meeting runs from Dec. 2 through Dec. 13.

"When talking about global biodiversity, we had a good approximation of the total number of land plant species, but we didn't have a real handle on how many there really are," said lead author Brian Enquist, University of Arizona professor of ecology and evolutionary biology.

Thirty-five researchers from institutions around the world worked for 10 years to compile 20 million observational records of the world's land plants. The result is the largest dataset on botanical biodiversity ever created. The researchers hope this information can help reduce loss of global biodiversity by informing strategic conservation action that includes consideration of the effects of climate change.

They found that there are about 435,000 unique land plant species on Earth.

"So that's an important number to have, but it's also just bookkeeping. What we really wanted to understand is the nature of that diversity and what will happen to this diversity in the future," Enquist said. "Some species are found everywhere - they're like the Starbucks of plant species. But others are very rare - think a small standalone café."

Enquist and his team revealed that 36.5% of all land plant species are "exceedingly rare," meaning they have been observed and recorded fewer than five times ever.

"According to ecological and evolutionary theory, we'd expect many species to be rare, but the actual observed number we found was actually pretty startling," he said. "There are many more rare species than we expected."

Moreover, the researchers found that rare species tend to cluster in a handful of hotspots, such as the Northern Andes in South America, Costa Rica, South Africa, Madagascar and Southeast Asia. These regions, they found, remained climatologically stable as the world emerged from the last ice age, allowing such rare species to persist.

But just because these species enjoyed a relatively stable climate in the past doesn't mean they'll enjoy a stable future. The research also revealed that these very rare-species hotspots are projected to experience a disproportionally high rate of future climatic changes and human disruption, Enquist said.

"We learned that in many of these regions, there's increasing human activity such as agriculture, cities and towns, land use and clearing. So that's not exactly the best of news," he said. "If nothing is done, this all indicates that there will be a significant reduction in diversity - mainly in rare species - because their low numbers make them more prone to extinction."

And it's these rare species that science knows very little about.

By focusing on identifying rare species, "this work is better able to highlight the dual threats of climate change and human impact on the regions that harbor much of the world's rare plant species and emphasizes the need for strategic conservation to protect these cradles of biodiversity," said Patrick Roehrdanz, a co-author on the paper and managing scientist at Conservation International.

Credit: 
University of Arizona

Nine climate tipping points now 'active,' warn scientists

More than half of the climate tipping points identified a decade ago are now "active", a group of leading scientists have warned.

This threatens the loss of the Amazon rainforest and the great ice sheets of Antarctica and Greenland, which are currently undergoing measurable and unprecedented changes much earlier than expected.

This "cascade" of changes sparked by global warming could threaten the existence of human civilisations.

Evidence is mounting that these events are more likely and more interconnected than was previously thought, leading to a possible domino effect.

In an article in the journal Nature, the scientists call for urgent action to reduce greenhouse gas emissions to prevent key tipping points, warning of a worst-case scenario of a "hothouse", less habitable planet.

"A decade ago we identified a suite of potential tipping points in the Earth system, now we see evidence that over half of them have been activated," said lead author Professor Tim Lenton, director of the Global Systems Institute at the University of Exeter.

"The growing threat of rapid, irreversible changes means it is no longer responsible to wait and see. The situation is urgent and we need an emergency response."

Co-author Johan Rockström, director of the Potsdam Institute for Climate Impact Research, said: "It is not only human pressures on Earth that continue rising to unprecedented levels.

"It is also that as science advances, we must admit that we have underestimated the risks of unleashing irreversible changes, where the planet self-amplifies global warming.

"This is what we now start seeing, already at 1°C global warming.

"Scientifically, this provides strong evidence for declaring a state of planetary emergency, to unleash world action that accelerates the path towards a world that can continue evolving on a stable planet."

In the commentary, the authors propose a formal way to calculate a planetary emergency as risk multiplied by urgency.

Tipping point risks are now much higher than earlier estimates, while urgency compares the time needed to react with the intervention time left to reduce the risk.
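In the authors' Nature commentary, these definitions take a compact form, sketched here: risk is probability times damage, and urgency is reaction time divided by the intervention time remaining,

\[ E \;=\; R \times U \;=\; p \times D \times \frac{\tau}{T} \]

where p is the probability of crossing a tipping point, D the resulting damage, τ the time needed to react, and T the intervention time left. If τ/T exceeds 1, control has effectively been lost; the authors argue both factors are now high enough to define a planetary emergency.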

Exiting the fossil fuel economy is unlikely before 2050, but with temperatures already 1.1°C above pre-industrial levels, Earth is likely to cross the 1.5°C guardrail by 2040. The authors conclude that this alone defines an emergency.

Nine active tipping points:

Arctic sea ice

Greenland ice sheet

Boreal forests

Permafrost

Atlantic Meridional Overturning Circulation

Amazon rainforest

Warm-water corals

West Antarctic Ice Sheet

Parts of East Antarctica

The collapse of major ice sheets on Greenland, West Antarctica and part of East Antarctica would commit the world to around 10 metres of irreversible sea-level rise.

Reducing emissions could slow this process, allowing more time for low-lying populations to move.

The rainforests, permafrost and boreal forests are examples of biosphere tipping points that, if crossed, result in the release of additional greenhouse gases, amplifying warming.

Despite most countries having signed the Paris Agreement, pledging to keep global warming well below 2°C, current national emissions pledges - even if they are met - would lead to 3°C of warming.

Although future tipping points and the interplay between them is difficult to predict, the scientists argue: "If damaging tipping cascades can occur and a global tipping cannot be ruled out, then this is an existential threat to civilization.

"No amount of economic cost-benefit analysis is going to help us. We need to change our approach to the climate problem."

Professor Lenton added: "We might already have crossed the threshold for a cascade of inter-related tipping points.

"However, the rate at which they progress, and therefore the risk they pose, can be reduced by cutting our emissions."

Though global temperatures have fluctuated over millions of years, the authors say humans are now "forcing the system", with atmospheric carbon dioxide concentration and global temperature increasing at rates that are an order of magnitude higher than at the end of the last ice age.

Credit: 
University of Exeter

Early antiretroviral treatment shrinks the HIV reservoir in infected infants

Starting antiretroviral therapy within hours of birth drastically shrinks the reservoir of HIV - an important step in efforts to cure infections - and improves antiviral immune responses in newborns with HIV, shows a two-year study of a unique cohort of ten infants in Botswana. The findings demonstrate that starting treatment earlier in life than current guidelines recommend could substantially improve health outcomes in infants with HIV, who experience irreversible immune damage if left untreated. HIV infections in newborns represent a huge health burden in developing countries; one study estimated that 300 to 500 infants are infected every day in sub-Saharan Africa. Because HIV infection in newborns can lead to rapid and fatal immune deficiency, the World Health Organization recommends that infected newborns receive antiretroviral treatment within weeks of birth. However, adhering to these recommendations can be difficult in low-resource and remote settings. To understand how the timing of antiretroviral therapy affects newborns, Pilar Garcia-Broncano and colleagues studied samples from infants over two years in Botswana, where around 24% of pregnant women are living with HIV. The subjects consisted of ten infected infants who began treatment on average seven hours after birth, ten infected infants who began treatment on average four months after birth, and 54 infants without HIV. The researchers saw that the earliest-treated infants showed a much smaller viral reservoir - the latent pool of the virus that persists throughout life - than the second infant group at week 96. Early treatment also granted other benefits: the earliest-treated infants showed more functional HIV-specific T cell responses and stronger antiviral responses in the innate immune system. Garcia-Broncano et al. note that follow-up studies of the infant cohort could reveal additional benefits of early antiretroviral treatment that might appear later in life.

Credit: 
American Association for the Advancement of Science (AAAS)

Chinese Academy of Sciences leads discovery of unpredicted stellar black hole

image: Figure LB-1: Accretion of gas onto a stellar black hole from its blue companion star, through a truncated accretion disk (Artist impression).

Image: 
YU Jingchuan, Beijing Planetarium, 2019.

Our Milky Way Galaxy is estimated to contain 100 million stellar black holes - cosmic bodies formed by the collapse of massive stars and so dense even light can't escape. Until now, scientists had estimated the mass of an individual stellar black hole in our Galaxy at no more than 20 times that of the Sun. But the discovery of a huge black hole by a Chinese-led team of international scientists has toppled that assumption.

The team, headed by Prof. LIU Jifeng of the National Astronomical Observatory of China of the Chinese Academy of Sciences (NAOC), spotted a stellar black hole with a mass 70 times greater than the Sun. The monster black hole is located 15 thousand light-years from Earth and has been named LB-1 by the researchers. The discovery is reported in the latest issue of Nature.

The discovery came as a big surprise. "Black holes of such mass should not even exist in our Galaxy, according to most of the current models of stellar evolution," said Prof. LIU. "We thought that very massive stars with the chemical composition typical of our Galaxy must shed most of their gas in powerful stellar winds, as they approach the end of their life. Therefore, they should not leave behind such a massive remnant. LB-1 is twice as massive as what we thought possible. Now theorists will have to take up the challenge of explaining its formation."

Until just a few years ago, stellar black holes could only be discovered when they gobbled up gas from a companion star. This process creates powerful X-ray emissions, detectable from Earth, that reveal the presence of the collapsed object.

The vast majority of stellar black holes in our Galaxy are not engaged in a cosmic banquet, though, and thus don't emit revealing X-rays. As a result, only about two dozen Galactic stellar black holes have been well identified and measured.

To counter this limitation, Prof. LIU and collaborators surveyed the sky with China's Large Sky Area Multi-Object Fiber Spectroscopic Telescope (LAMOST), looking for stars that orbit an invisible object, pulled by its gravity.

This observational technique was first proposed by the visionary English scientist John Michell in 1783, but it has only become feasible with recent technological improvements in telescopes and detectors.

Still, such a search is like looking for the proverbial needle in a haystack: only one star in a thousand may be circling a black hole.

After the initial discovery, the world's largest optical telescopes - Spain's 10.4-m Gran Telescopio Canarias and the 10-m Keck I telescope in the United States - were used to determine the system's physical parameters. The results were nothing short of fantastic: a star eight times heavier than the Sun was seen orbiting a 70-solar-mass black hole every 79 days.
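For readers curious how an invisible object is weighed from its companion's motion: radial-velocity monitoring of this kind rests on the standard binary mass function, reproduced below as a sketch. The article gives the orbital period (P, about 79 days) and the companion mass (M*, about 8 solar masses) but not the star's velocity semi-amplitude K or the orbital inclination i, so this is the general relation rather than the authors' actual fit:

\[ f(M) \;=\; \frac{P K^{3}}{2\pi G} \;=\; \frac{\left(M_{\mathrm{BH}} \sin i\right)^{3}}{\left(M_{\mathrm{BH}} + M_{\ast}\right)^{2}} \]

Measuring P and K from the star's Doppler shifts fixes the left-hand side; combining it with estimates of i and the companion mass M* then pins down the black hole mass M_BH.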

The discovery of LB-1 fits nicely with another breakthrough in astrophysics. Recently, the Laser Interferometer Gravitational-Wave Observatory (LIGO) and Virgo gravitational wave detectors have begun to catch ripples in spacetime caused by collisions of black holes in distant galaxies. Intriguingly, the black holes involved in such collisions are also much bigger than what was previously considered typical.

The direct sighting of LB-1 proves that this population of over-massive stellar black holes exists even in our own backyard. "This discovery forces us to re-examine our models of how stellar-mass black holes form," said LIGO Director Prof. David Reitze from the University of Florida in the U.S.

"This remarkable result along with the LIGO-Virgo detections of binary black hole collisions during the past four years really points towards a renaissance in our understanding of black hole astrophysics," said Reitze.

Credit: 
Chinese Academy of Sciences Headquarters

Discovery by Hebrew University scientists could revolutionize chemotherapy

image: Hebrew University Professor Alexander Binshtok in his lab.

Image: 
Hadas Parush/Flash 90

(Jerusalem, November 26, 2019)--It is a feeling that many who receive a cancer diagnosis can identify with: heartbreak and fear, followed by hopes that chemotherapy will save the day. Unfortunately, for many patients, chemo's painful side effects cause them to stop treatment prematurely.

Now, a research team headed by Professor Alexander Binshtok, head of the Pain Plasticity Research Group at the Hebrew University of Jerusalem's Faculty of Medicine and Edmond & Lily Safra Center for Brain Sciences, has developed a method that delivers chemotherapy drugs directly to malignant cells and bypasses healthy ones. This discovery could allow doctors to reduce chemo doses for patients, thereby reducing the unpleasant side-effects associated with the treatment, and improve treatment compliance and overall prognoses.

"Most anti-cancer treatments are not sufficiently specific, meaning they attack healthy cells together with the malignant ones they're trying to get rid of," explained Binshtok. "This leads to the many serious side-affects associated with chemo therapy. Eliminating cancerous cells while leaving healthy ones alone is an important step towards reduce patients' suffering."

The new findings were published in a recent issue of Frontiers in Pharmacology. The study focuses on the selective expression of the TRPV2 protein by cancer cells. When activated, TRPV2 opens a channel in the cell membrane. Binshtok and his team studied liver cancer cells and were able to successfully deliver a low dose of doxorubicin, a chemotherapeutic agent, through this channel and directly into the cancer cells. The new method targeted cancer cells without harming healthy ones, and in the future, the precision of this delivery method may allow doctors to prescribe lower chemo doses and relieve patients of some of the harsher effects of chemo.

"It's too early to make concrete predictions but we are hopeful this discovery will lead the way towards a new, more targeted delivery method for chemotherapy treatment, one that will drastically reduce patients' pain," Binshtok concluded.

Credit: 
The Hebrew University of Jerusalem