Tech

Hygiene rules are also effective against new coronavirus variants

The researchers found that, under laboratory conditions, the variants have a surface stability similar to that of the wild-type virus, but can be effectively eliminated by disinfection and thorough hand washing, heat or alcohol treatment. They report their results in the Journal of Infectious Diseases of 16 May 2021.

For this study, the team from the Department for Molecular and Medical Virology and the Chair of Materials Discovery and Interfaces at Ruhr-Universität Bochum (RUB) cooperated with the European Virus Bioinformatics Center Jena, the University Hospital Duisburg-Essen and Paracelsus Medical University Nuremberg.

The fact that viruses change genetically over time is well known. Variants of concern are those that give the virus an advantage, for example by allowing it to replicate faster, become more infectious or evade the immune response. The British and South African variants have accumulated several mutations that result in increased transmissibility and, in some cases, lead to more severe courses of disease. "Therefore, the question arose whether they also differ from the original variant in terms of their sensitivity to hygiene measures," explains Toni Meister from Ruhr-Universität Bochum.

Heat, soap, alcohol

For this reason, the team analysed how long the variants remain infectious on surfaces made of steel, silver, copper and on face masks and how they can be rendered harmless by means of soap, heat or alcohol.

It turned out that both variants, as well as the wild-type virus, could be inactivated when treated with at least 30 percent alcohol for at least 30 seconds. "Common disinfectants are therefore effective against all these variants," says Stephanie Pfänder from RUB. Thorough hand washing with soap could also lower the risk of infection. Heat works against the virus, too: after 30 minutes at 56 degrees Celsius, all variants were rendered harmless.

To find out whether the stability of the different variants on surfaces differs, the team analysed the amount of infectious virus particles on surfaces made of steel, copper and silver, and on surgical and FFP2 masks, over 48 hours. "The surface stability did not differ between the virus variants," points out Eike Steinmann from the Department for Molecular and Medical Virology at RUB. "As described several times before, copper in particular has a very strong antiviral effect." In conclusion, the team did not detect any differences among the variants in terms of their sensitivity to the various hygiene measures.

Credit: 
Ruhr-University Bochum

White roofs and more green areas would mitigate the effects of heat waves in cities

The frequency and intensity of heat waves in cities are increasing due to climate change, with serious negative impacts on the health and mortality of urban populations. Anthropogenic activities and urban materials affect heat accumulation in cities: solar radiation stored throughout the day in asphalt and buildings is released slowly during the night, generating significant heat stress. To face this growing problem, cities must establish effective mitigation strategies that can reduce temperatures during heat waves.

A study carried out by researchers from the Institut de Ciència i Tecnologia Ambientals of the Universitat Autònoma de Barcelona (ICTA-UAB) assesses the effectiveness of solutions such as the creation of cool (white) roofs on buildings and the expansion of urban green areas in the Barcelona Metropolitan Area (AMB). The results, recently published in the scientific journal Urban Climate, show that applying these two strategies in combination would achieve the greatest temperature reductions during these summer episodes.

To conduct the study, the researchers used a meteorological model that included eleven different typologies of urban areas in the AMB and simulated the heat wave recorded in July 2015, when daytime temperatures reached between 35°C and 40°C and night-time temperatures exceeded 25°C.

The study simulated different mitigation scenarios based on solutions such as the creation of cool roofs on residential and industrial buildings, or the expansion of green areas according to the targets set by the Urban Master Plan (PDU) of the AMB. Cool rooftops can be obtained by painting the roofs white to increase the albedo, that is, the fraction of incident radiation that the surface reflects rather than absorbs. For its part, the PDU calls for the addition of six urban parks and green areas totalling 255.64 ha by 2030, which would increase the vegetated share of the territory from 32.54% to 35.92%.

Four scenarios were analysed: increasing the albedo of certain rooftops (preferably flat, accessible roofs) to 0.85 by painting them white; expanding urban green areas with a daily irrigation of 2 l/m²; the same expansion with a higher irrigation of 5 l/m²; and a final scenario combining cool roofs with the additional green areas irrigated at 5 l/m².
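
The albedo effect behind the cool-roof scenarios is easy to quantify at the level of a single roof. The following minimal Python sketch is our own illustration, not part of the study; the incident irradiance and the dark-roof albedo of 0.10 are nominal assumptions, while 0.85 is the study's cool-roof target.

```python
# Back-of-envelope illustration of the cool-roof effect (not from the study).
# Albedo is the fraction of incident solar radiation a surface reflects;
# the remainder is absorbed and later re-released as heat.

INCIDENT_IRRADIANCE = 800.0  # W/m^2, nominal midday value (assumption)

def absorbed_power(albedo: float, irradiance: float = INCIDENT_IRRADIANCE) -> float:
    """Shortwave power absorbed per square metre of roof surface."""
    return (1.0 - albedo) * irradiance

dark_roof = absorbed_power(albedo=0.10)  # typical dark roof (assumed value)
cool_roof = absorbed_power(albedo=0.85)  # white-painted roof, as in the study

print(f"dark roof absorbs: {dark_roof:.0f} W/m^2")
print(f"cool roof absorbs: {cool_roof:.0f} W/m^2")
print(f"reduction: {100 * (dark_roof - cool_roof) / dark_roof:.0f}%")
# Roughly an 83% cut in absorbed shortwave energy at the roof surface.
```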

The results show that the scenario combining the two strategies has the greatest impact, with an average temperature reduction of 1.26°C. The reduction reaches 4.73°C during the day (at 3 pm) and 1.88°C at night (9 pm). This decrease in heat also has positive effects on energy consumption, with 26% less spending on air conditioning.

The first scenario, with cool roofs alone, reduces the average temperature by 0.67°C and is more effective during the day, reaching a maximum reduction of 3.83°C at 3 pm, compared with a maximum night-time decrease of 1.63°C (at 7 am). The strategy of increasing green areas reduces the temperature to a lesser extent, although it performs better with greater irrigation (a decrease of 0.15°C on average with irrigation of 2 l/m², compared to 0.61°C with irrigation of 5 l/m²). "We have seen that when irrigation is increased, the reduction in daytime temperature improves notably thanks to the cooling effect caused by evapotranspiration," says Joan Gilabert, lead author of the study.

Despite the temperature reductions seen in all the scenarios studied, the thermal regulation achieved by combining the two strategies (white roofs and green areas) has the greatest impact. "It combines the benefits of reducing the temperature at night-time due to more urban green areas with the reduction of daytime heat due to the increased albedo and irrigation, thereby abating the heat wave effects throughout a 24-hour period," explains Sergi Ventura, co-author of the study. He adds that white rooftops lower the temperatures in the central and denser urban areas, while parks help to reduce the heat in the areas closest to them.

This study exemplifies how urban modelling can help city-level decision-makers devise planning strategies to counteract the impacts of heat waves, which are expected to increase with global climate change and intensifying urbanization.

Credit: 
Universitat Autonoma de Barcelona

New testing platform for COVID-19 is an efficient and accurate alternative to gold-standard RT-qPCR tests

image: A) AriaDNA analyzer. B) Microchip for coronavirus disease 2019 detection with lyophilized reagents in the microwells displayed along with its packaging.

Image: 
Lumex Instruments Canada

Philadelphia, May 18, 2021 - Throughout the COVID-19 pandemic, supply chain shortages of reagents and test kits have limited the rapid expansion of the clinical testing needed to contain the virus. Investigators have now developed and validated a new microchip-based real-time PCR platform that uses 10-fold fewer reagents than Centers for Disease Control and Prevention (CDC)-approved tube-based RT-PCR tests and reports results in as little as 30 minutes. Its accuracy was 100 percent in clinical samples, investigators report in the Journal of Molecular Diagnostics, published by Elsevier.

"Sensitivity is critical for early detection of COVID-19 infection where the viral load is minimal to prevent further spreading of the disease. During this pandemic, numerous testing assays have been developed, sacrificing sensitivity for speed and cost," explains lead investigator Peter J. Unrau, PhD, Department of Molecular Biology and Biochemistry, Simon Fraser University, Burnaby, BC, Canada. "This research offers a cheaper, faster alternative to the most reliable and sensitive test currently used worldwide, without sacrificing sensitivity and reproducibility."

Researchers validated a microchip PCR technology for the detection of SARS-CoV-2 in clinical samples. Empty microchips with 30 microwells were manufactured from aluminum sheets and coated with surface modifiers. They were then filled with CDC-authorized primers and probes to detect SARS-CoV-2, individually packaged, and sent to a laboratory for sample validation and testing. Real-time qPCR was performed with a reaction volume of 1.2 microliters on a microchip-based PCR analyzer, with AriaDNA software used to control the instrument and collect the PCR results.

Nasopharyngeal swabs from eight patients with positive COVID-19 test results and 13 patients with negative results were collected at St. Paul's Hospital in Vancouver, Canada and tested with the microchip RT-qPCR kit. Of the 21 patient samples, eight tested positive, 12 tested negative, and one was invalid; that sample had tested negative in both the microchip RT-qPCR assay and hospital testing, but was deemed invalid under CDC standards because its human internal control was not detected. The microchip kit reduced the reaction volumes needed by 10-fold, resulting in lower reagent consumption and faster assay times (as little as 30 minutes compared to about 70 minutes), while maintaining the same gold-standard sensitivity as higher-volume techniques. Because the kit comes preloaded with SARS-CoV-2 primers and probes, it may further reduce operator-associated errors, improving the reliability of analysis in remote settings.
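
As a quick sanity check, the performance figures above follow directly from the reported counts. Here is a minimal Python sketch (the variable names are ours; it assumes zero discordant calls, as the 100 percent figure implies, and excludes the single invalid sample):

```python
# Concordance of the microchip RT-qPCR assay with hospital test results,
# computed from the counts reported above (the invalid sample is excluded).

true_positives = 8    # positive by both the microchip assay and hospital testing
true_negatives = 12   # negative by both
false_positives = 0   # assumed zero, as implied by the 100 percent figure
false_negatives = 0

valid = true_positives + true_negatives + false_positives + false_negatives

sensitivity = true_positives / (true_positives + false_negatives)
specificity = true_negatives / (true_negatives + false_positives)
concordance = (true_positives + true_negatives) / valid

print(f"valid samples: {valid}")            # 20 of the 21 collected
print(f"sensitivity:   {sensitivity:.0%}")  # 100%
print(f"specificity:   {specificity:.0%}")  # 100%
print(f"concordance:   {concordance:.0%}")  # 100%
```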

Available internationally, the low-energy (100 watt), compact, lightweight microchip analyzer and COVID-19 detection kits developed by Lumex Instruments Canada and validated by Dr. Unrau and his colleagues may enable point-of-care testing in remote locations, clinics, and airports.

"Although further testing of additional clinical samples and sample types may be needed before this assay can be widely deployed," Dr. Unrau says, "these preliminary results demonstrate a promising, versatile technology that can be easily configured and mobilized to detect infections of current and future emerging viruses, overcoming current bottlenecks and ensuring a faster response in the future."

Credit: 
Elsevier

Tulane researchers develop test that can detect childhood tuberculosis a year ahead

video: Lead study author Tony Hu, PhD, explains how his new screening technology has the potential to make a big difference in the fight against childhood tuberculosis by detecting cases much earlier so doctors can quickly begin treatment.

Video: 
Carolyn Scofield, Tulane University School of Medicine

Researchers at Tulane University School of Medicine have developed a highly sensitive blood test that can find traces of the bacterium that causes tuberculosis (TB) in infants a year before they develop the deadly disease, according to a study published in BMC Medicine.

Using only a small blood sample, the test detects a protein secreted by Mycobacterium tuberculosis, which causes TB infection. It can screen for all forms of TB and rapidly evaluate a patient's response to treatment, said lead study author Tony Hu, PhD, Weatherhead Presidential Chair in Biotechnology Innovation at Tulane University.

"This is a breakthrough for infants with tuberculosis because we don't have this kind of screening technology to catch early infections among those youngest groups who are most likely to be undiagnosed," Hu said. "I hope this method can be pushed forward quickly to reach these children as early as possible."

Each year, nearly a million children develop TB and 205,000 die of TB-related causes. More than 80% of childhood TB deaths occur in children under the age of 5. Most of these deaths occur because the disease goes undiagnosed: children with TB, particularly infants, usually have symptoms that are not specific to the disease. These children also have difficulty producing the respiratory samples that the best TB tests now in use rely on.

Even when it is possible to obtain these samples from children, they tend to be less useful for diagnosis, since they often contain much less of the bacteria than samples from adults, Hu said. His assay, by contrast, uses a small blood sample that can be easily obtained from children of any age to detect a specific protein (CFP-10) that the bacteria secrete to maintain the infection that develops into TB. Since this protein is present at very low levels in blood, the assay uses an antibody specific for this protein to enrich it away from the other proteins in blood, and a mass spectrometer to detect it with high sensitivity and accuracy.

Hu and his team used this test to screen stored blood samples collected from 284 HIV-infected children and 235 children without the virus who participated in a large clinical trial conducted between 2004 and 2008. The test identified children diagnosed with TB by the current gold-standard TB tests with 100% accuracy. It also detected 83.7% of the TB cases that were missed by these tests but later diagnosed through a standard checklist drawing on an array of other information collected by each child's physician (unconfirmed TB cases). The test also detected CFP-10 in 77% of blood samples collected 24 weeks before the children were diagnosed with TB by other methods, indicating its strong potential for early TB diagnosis. In some positive cases, the biomarker was detectable as early as 60 weeks before TB was confirmed.

The researchers are working to develop an inexpensive, portable instrument to read the test to allow it to be more easily used in resource-limited settings often encountered in areas where TB is prevalent.

Credit: 
Tulane University

New peanut has a wild past and domesticated present

image: Wild peanut variations in the lab of David Bertioli and Soraya Leal-Bertioli at the Center for Applied Genetic Technologies.

Image: 
Andrew Davis Tucker/UGA

The wild relatives of modern peanut plants can withstand diseases in ways that cultivated peanut cannot. Their genetic diversity means that they can shrug off the diseases that kill farmers' peanut crops, but they also produce tiny nuts that are difficult to harvest because they burrow deep into the soil.

Consider it a genetic trade-off: during its evolution, the modern peanut lost its genetic diversity and much of its ability to fight off fungi and viruses, but gained qualities that make peanuts so affordable, sustainable and tasty that people all over the world grow and eat them.

Modern peanut plants were created 5,000 to 10,000 years ago, when two diploid ancestors (plants with two sets of chromosomes) came together by chance, and became tetraploids (plants with four sets of chromosomes). While domesticated peanuts traveled around the world and show up in cuisine from Asia to Africa to the Americas, wild relatives stayed close to home in South America.

Over the past several years, researchers at the University of Georgia, particularly at the Wild Peanut Lab in Athens, have been homing in on the genetics of those wild relatives and detailing where those resiliency traits lie in their genomes. The goal has always been to understand the wilds well enough to make use of the advantageous ancient genes - the ones the relatives have, but peanut lost - while holding onto the modern traits that farmers need and consumers want.

"Most of the wild species still grow in South America," said Soraya Leal-Bertioli, who runs the Wild Peanut Lab with her husband, David Bertioli. "They are present in many places, but you don't just come across them on the streets. One has to have the 'collector's eye' to spot them in the undergrowth."

Those wild plants can't breed with peanut in nature any longer because they only have two sets of chromosomes.

"The wilds are ugly distant relatives that peanut does not want to mix with," Leal-Bertioli said, "but we do the match making."

Researchers in Athens and Tifton have successfully crossed some of those wild species together to create tetraploid lines that can be bred with peanut. Those new lines will give plant breeders genetic resources that will lead to a bumper crop of new varieties with disease resistance and increased sustainability. The newly released lines won't produce the peanuts that go into your PB&J tomorrow, but they are the parents of the plants that farmers will grow in coming years.

The Journal of Plant Registrations published the details of the first of these germplasm lines this month. The lines were created by a team led by the Bertiolis, who conduct peanut research through the College of Agricultural and Environmental Sciences' Institute of Plant Breeding, Genetics and Genomics. They also manage separate global research projects for the Feed the Future Innovation Lab for Peanut, a U.S. Agency for International Development project to increase the global food supply by improving peanut.

The new lines developed by the Bertiolis are resistant to early and late leaf spot, diseases that cost Georgia peanut producers $20 million a year, and root-knot nematode, a problem that few approved chemicals can fight. They are "induced allotetraploids," meaning they are made through a complex hybridization that converts the wild diploid species into tetraploids.

The second set of new lines comes from work done in Tifton and led by Ye (Juliet) Chu, a senior research associate in Peggy Ozias-Akins' lab within the CAES Department of Horticulture. These three lines are made from five peanut relatives and show resistance to leaf spot. One is also resistant to tomato spotted wilt virus, a disease that can destroy the entire crop in peanut varieties without natural resistance.

Creating the first fertile allotetraploids is a challenge, but then scientists can cross them with peanut and, through generations, select for the right traits. Plant breeders will be able to take these lines made from peanut's wild relatives and cross them with modern domesticated peanut to get the best of both - a plant that looks like peanut and produces nuts with the size and taste of modern varieties, but that has the disease-fighting ability of the wild species.

While plant breeders have known the value of the diversity in wild peanut species for decades, they couldn't keep track of those valuable wild genes until recently. The peanut industry in Georgia and other states has invested in work to sequence peanut and the two ancestor species, knowing that the work to understand the peanut genome would pay off. With genetic markers developed using the genome, breeders not only can tell that a plant has a desirable trait, they know what genome regions are responsible for that trait and can combine DNA profiling with traditional field selection to speed the complex process of developing a new variety.

"It streamlines everything. You can make a cross, which produces 1,000 seeds, but before planting them, their DNA can be profiled. That way you can see that only 20 of those plants are ideal for further breeding. Forty years ago, you'd have to plant them all, making the process much more cumbersome," David Bertioli said.

With ongoing work, the Journal of Plant Registrations will document the release of other peanut germplasm with resistance to important diseases. Releasing the lines, along with the molecular markers for their advantageous traits, provides the peanut breeding community with genetic resources to produce more resilient crops.

"In the past, we knew where we were going, but it was like everyone drew their own map," David Bertioli said. "Now, it's like we have GPS. (Scientists) can tell each other, 'Here are my coordinates. What are yours?' And all the data is published."

Credit: 
University of Georgia

Salk scientists reveal role of genetic switch in pigmentation and melanoma

LA JOLLA--(MAY 18, 2021) Despite only accounting for about 1 percent of skin cancers, melanoma causes the majority of skin cancer-related deaths. While treatments for this serious disease do exist, these drugs can vary in effectiveness depending on the individual.

A Salk study published on May 18, 2021, in the journal Cell Reports reveals new insights about a protein called CRTC3, a genetic switch that could potentially be targeted to develop new treatments for melanoma by keeping the switch turned off.

"We've been able to correlate the activity of this genetic switch to melanin production and cancer," says Salk study corresponding author Marc Montminy, a professor in the Clayton Foundation Laboratories for Peptide Biology.

Melanoma develops when pigment-producing cells that give skin its color, called melanocytes, mutate and begin to multiply uncontrollably. These mutations can cause proteins, like CRTC3, to prompt the cell to make an abnormal amount of pigment or to migrate and be more invasive.

Researchers have known that the CRTC family of proteins (CRTC1, CRTC2, and CRTC3) is involved in pigmentation and melanoma, yet obtaining precise details about the individual proteins has been elusive. "This is a really interesting situation where different behaviors of these proteins, or genetic switches, can actually give us specificity when we start thinking about therapies down the road," says first author Jelena Ostojic, a former Salk staff scientist and now a principal scientist at DermTech.

The researchers observed that eliminating CRTC3 in mice changed the animals' coat color, demonstrating that the protein is needed for melanin production. Their experiments also revealed that when the protein was absent in melanoma cells, the cells migrated and invaded less, meaning they were less aggressive, which suggests that inhibiting the protein could be beneficial for treating the disease.

The team characterized, for the first time, the connection between two cellular communications (signaling) systems that converge on the CRTC3 protein in melanocytes. These two systems tell the cell to either proliferate or make the pigment melanin. Montminy likens this process to a relay race. Essentially, a baton (chemical message) is passed from one protein to another until it reaches the CRTC3 switch, either turning it on or off.

"The fact that CRTC3 was an integration site for two signaling pathways--the relay race--was most surprising," says Montminy, who holds the J.W. Kieckhefer Foundation Chair. "CRTC3 makes a point of contact between them that increases specificity of the signal."

Next, the team plans to further investigate the mechanism by which CRTC3 affects the balance of melanocyte differentiation, to develop a better understanding of its role in cancer.

Credit: 
Salk Institute

Bipolar order: A straightforward technique to have more control over organic thin films

image: Combining bipolar electrochemistry with electrolytic micelle disruption makes it possible to produce shaped organic thin films. The approach involves wirelessly inducing a desired potential distribution on a plate in an electrolytic cell to control the 'popping' of bubble-like micelles, which release their cargo to automatically form a film. Customized thin films produced with this inexpensive strategy could unlock applications in sophisticated biosensor systems and optoelectronics.

Image: 
Tokyo Tech

Modern and emerging applications in various fields have found creative uses for organic thin films (TFs); some prominent examples include sensors, photovoltaic systems, transistors, and optoelectronics. However, the methods currently available for producing TFs, such as chemical vapor deposition, are expensive and time-consuming, and often require highly controlled conditions. As one would expect, making TFs with specific shapes or thickness distributions is even more challenging. Because unlocking this customizability could spur advances in many sophisticated applications, researchers are actively exploring new approaches for TF fabrication.

In a recent study published in Angewandte Chemie International Edition, a team of scientists from Tokyo Tech found a clever and straightforward strategy to produce organic TF patterns with controllable shape and thickness. The research was led by Associate Professor Shinsuke Inagi, whose group has been delving into the potential of bipolar electrochemistry for polymeric TF fabrication. In this branch of electrochemistry, a conducting object is submerged in an electrolytic cell, and the electric field generated by the cell's electrodes causes a potential difference to emerge across the surface of the object. This potential difference can be large enough to drive chemical reactions on the surface of the introduced (and now bipolar) object. Noting that the potential distribution on the bipolar object depends simultaneously on multiple factors, Prof. Inagi's team had previously leveraged this technique to achieve a good degree of control over fabricated polymeric TFs.
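
To first order, the driving force here follows a one-line formula: in a roughly uniform field E, a conducting plate of length L aligned with the field sees a potential drop of about ΔV = E·L across its surface. The Python sketch below is our own illustration with nominal numbers, not values from the paper.

```python
# First-order estimate of the potential drop induced across a bipolar
# electrode in a uniform field: delta_V ~ E * L. Nominal numbers only,
# not values from the paper.

import numpy as np

field_strength = 1.0  # V/cm, field imposed by the cell's driving electrodes
plate_length = 3.0    # cm, length of the bipolar plate along the field

delta_v = field_strength * plate_length
print(f"potential drop across the plate: ~{delta_v:.1f} V")

# The potential varies linearly along the plate, so each position sits at a
# different overpotential; that gradient is the handle used to shape the film.
x = np.linspace(0.0, plate_length, 7)                # positions along the plate
profile = field_strength * (x - plate_length / 2.0)  # centred at 0 V
print("local potential vs. solution (V):", np.round(profile, 2))
```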

Now, Yaqian Zhou, a Ph.D. candidate on Prof. Inagi's team, has combined bipolar electrochemistry with a strategy developed in the 1980s by Dr. Saji and colleagues, also from Tokyo Tech. This method, called 'electrolytic micelle disruption (EMD),' consists of encapsulating an organic compound inside spherical structures called micelles, which, like some soaps and detergents, are composed of surfactant molecules. These surfactant molecules readily lose electrons near a positively charged electrode; this destabilizes the micelles and releases the organic compounds trapped within, which then accumulate and form a film.

The team employed special bipolar electrochemical cells with different configurations to control the potential distribution induced wirelessly on a plate, creating, for example, a voltage gradient along one direction or a circular positive-potential zone. They then introduced micelles loaded with a desired organic compound. The key is that these micelles "popped" more frequently over the more positively charged regions of the bipolar plate. Thus, as the micelles released their cargo, the thin films that formed closely mirrored the induced voltage distribution, providing an interesting degree of customizability. "We managed to produce a variety of thickness-gradient and circular organic thin films in proof-of-concept experiments, which confirmed the validity of our proposed approach," highlights Prof. Inagi.

This novel strategy is remarkably inexpensive and makes customizable thin films much more accessible. Moreover, as Prof. Inagi explains, the technique is not limited to organic molecules and could be made compatible with polymers and carbon materials. "We've developed a promising tool for various applications that rely on thin films, not just in the field of luminescence, but also for more sophisticated areas like biosensor systems, due to the organic solvent-free and mild conditions required," he concludes. Hopefully, further improvements on this combined technique will help produce thin films that can satisfy all sorts of practical demands.

Credit: 
Tokyo Institute of Technology

A randomised trial comparing imaging-guided PCI with Orsiro vs Xience

Previous clinical trials suggested that ultra-thin strut biodegradable polymer sirolimus-eluting stents (BP-SES) may be associated with lower target lesion failure (TLF) when compared to durable polymer everolimus-eluting stents (DP-EES). However, the possible underlying mechanisms remain unclear. The all-comers CASTLE study was therefore designed to assess the role of imaging-guided percutaneous coronary intervention (PCI) in the difference in clinical outcomes between BP-SES and DP-EES.

BP-SES have ultra-thin struts (60 μm) and a biodegradable polymer, features that may provide potential advantages such as reduced vessel inflammation and thrombogenicity. Randomised clinical trials have yielded mixed results: the BIOFLOW-V and BIOSTEMI trials showed a lower risk of TLF with BP-SES compared to DP-EES, whereas the BIOSCIENCE trial was neutral. One possible explanation for these contradictory findings is the use of intracoronary imaging. The CASTLE investigators hypothesised that with imaging-guided PCI, the actual difference in clinical outcomes between BP-SES and DP-EES might be clarified.

The CASTLE study is an investigator-initiated, multicentre, single-blinded, randomised, non-inferiority clinical trial conducted in 69 centres in Japan. The population comprised patients with acute and chronic coronary syndromes. Patients were randomised in a 1:1 ratio to imaging-guided PCI (intravascular ultrasound or optical coherence tomography) with BP-SES (intervention group) or DP-EES (control group). The primary outcome was TLF (cardiovascular death, target vessel myocardial infarction, and clinically driven target lesion revascularisation) at 12-month follow-up. An independent clinical event committee adjudicated angiographies and clinical events. The prespecified non-inferiority margin was 3.3%.

The investigators reported an interim analysis covering ~70% of the follow-up. Between May 2019 and March 2020, 1,440 patients were randomised; 722 were allocated to BP-SES and 718 to DP-EES. The 12-month follow-up was completed in 69.1% of the BP-SES group and 68.6% of the DP-EES group. There were no significant differences between the groups in clinical or procedural characteristics. The trial included mainly chronic coronary syndromes (85%) and stent diameters ≥ 3 mm (66%), and imaging guidance was performed in at least 97.5% of the patients. At 30-day follow-up, there was no difference in TLF between BP-SES and DP-EES (5.0% vs 4.9%) or in its components. For the primary endpoint at 12-month follow-up, there was no difference in TLF between BP-SES and DP-EES (HR 0.59 [95% CI 0.26 to 1.36]).
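
To illustrate how a prespecified margin like the 3.3% above is applied, here is a back-of-envelope Python sketch using the 30-day TLF rates and arm sizes just reported. It uses a plain Wald approximation of the risk-difference confidence interval; this is our own exposition, not the CASTLE trial's actual statistical analysis (the trial's primary comparison is the 12-month hazard ratio).

```python
# Illustrative non-inferiority check on the risk difference, using the
# 30-day TLF rates and arm sizes reported above. A plain Wald
# approximation for exposition; not the trial's actual analysis.

from math import sqrt

n_bp, p_bp = 722, 0.050   # BP-SES arm: size and 30-day TLF rate
n_dp, p_dp = 718, 0.049   # DP-EES arm
MARGIN = 0.033            # prespecified non-inferiority margin (3.3%)

risk_diff = p_bp - p_dp
se = sqrt(p_bp * (1 - p_bp) / n_bp + p_dp * (1 - p_dp) / n_dp)
ci_low, ci_high = risk_diff - 1.96 * se, risk_diff + 1.96 * se

print(f"risk difference: {risk_diff:+.3f} (95% CI {ci_low:+.3f} to {ci_high:+.3f})")
# Non-inferiority is declared if the upper CI bound stays below the margin.
print("non-inferior at this margin:", ci_high < MARGIN)
```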

At least in this interim analysis, the data suggest that BP-SES and DP-EES may have similar clinical outcomes when PCI is performed under intracoronary imaging guidance. However, we should cautiously wait until the complete follow-up is performed to assess any potential difference between these two devices.

Credit: 
PCR

EBC MAIN trial results - what is new and what will change in left main stenting?

The European Bifurcation Club Left Main (EBC MAIN) trial addressed the issue of provisional single stent versus upfront double stenting in 467 patients with true bifurcation distal left main disease.

So far, only one other randomized trial, DKCRUSH-V (n=482), has addressed the same research question, showing better outcomes with an upfront two-stent strategy, more specifically the double-kissing crush (DK CRUSH) technique.

In terms of methodology, two aspects need to be considered for correct interpretation of the EBC MAIN trial results. First, in all included patients both the LAD and CX ostia were affected by significant disease on angiography. Second, as a strategy trial, EBC MAIN did not compare the implantation of one versus two stents; rather, it compared a provisional single-stent strategy, which could be extended to two stents under prespecified procedural conditions, with an upfront assignment to a two-stent technique.

What is new?

The primary message is that no difference in terms of the studied clinical outcomes was noted between the planned single stenting and the upfront use of two-stent techniques. Of note, 22% of patients randomized to a planned single-stent strategy were ultimately treated with two stents.

The primary composite endpoint of one-year death, myocardial infarction and target lesion revascularization occurred at a rate of 14.7% in the provisional group vs. 17.7% in the upfront two-stent group (hazard ratio 0.8, 95% confidence interval 0.5 to 1.3). Furthermore, no significant difference was detected for any of the individual components of the primary endpoint. The rates of stent thrombosis were similar: 1.7% in the provisional arm and 1.3% in patients treated with upfront double stenting.

What will change?

Given the overall neutral trial results, it is important to understand how EBC MAIN could be perceived as practice-changing. This comes on the backdrop of the 2018 European Society of Cardiology (ESC) Guidelines on myocardial revascularization recommendation to preferably use double-kissing crush (an upfront two-stent technique) over a planned single-stent strategy (provisional) in true left main bifurcations (Class of recommendation IIb, Level of evidence B).

The described recommendation was largely based on the results of a single randomized study, the DKCRUSH-V trial. In this respect, the EBC MAIN trial, presented on May 19 at EuroPCR 2021, adds important new data that deviate from the hitherto available randomized evidence on this topic.

Importantly and as highlighted by the authors, neutral findings of the EBC MAIN may provide support for the notion that even in true left main bifurcations the initial strategy of single stenting is not penalized by worse one-year outcomes as compared with upfront two-stent techniques.

The clinical value of these findings is reinforced by prior evidence from a pooled analysis of the BBC ONE and NORDIC trials, which associated upfront two-stent techniques with a higher long-term mortality risk compared with an initial single-stent strategy.

Credit: 
PCR

Scientists debut most efficient 'optical rectennas,' devices that harvest power from heat

Scientists at the University of Colorado Boulder have tapped into a poltergeist-like property of electrons to design devices that can capture excess heat from their environment--and turn it into usable electricity.

The researchers describe their new "optical rectennas" in a paper published today in the journal Nature Communications. These devices, which are too small to see with the naked eye, are roughly 100 times more efficient than similar tools used for energy harvesting. And they achieve that feat through a mysterious process called "resonant tunneling," in which electrons pass through solid matter without losing any energy.

"They go in like ghosts," said lead author Amina Belkadi, who recently earned her PhD from the Department of Electrical, Computer and Energy Engineering (ECEE).

Rectennas (short for "rectifying antennas"), she explained, work a bit like car radio antennas. But instead of picking up radio waves and turning them into tunes, optical rectennas absorb light and heat and convert it into power.

They're also potential game changers in the world of renewable energy. Working rectennas could, theoretically, harvest the heat coming from factory smokestacks or bakery ovens that would otherwise go to waste. Some scientists have even proposed mounting these devices on airships that would fly high above the planet's surface to capture the energy radiating from Earth to outer space.

But, so far, rectennas haven't been able to reach the efficiencies needed to meet those goals. Until now, perhaps. In the new study, Belkadi and her colleagues designed the first rectennas to harvest energy by way of resonant tunneling.

"We demonstrate for the first time electrons undergoing resonant tunneling in an energy-harvesting optical rectenna," she said. "Until now, it was only a theoretical possibility."

Study coauthor Garret Moddel, professor of ECEE, said that the study is a major advance for this technology.

"This innovation makes a significant step toward making rectennas more practical," he said. "Right now, the efficiency is really low, but it's going to increase."

An unbeatable problem

It's a development that Moddel, who has literally written the book on these devices, has been looking forward to for a long time. Rectennas have been around since 1964, when an engineer named William C. Brown used microwaves to power a small helicopter. They're relatively simple tools, made up of an antenna, which absorbs radiation, and a diode, which converts that energy into DC current.

"It's like a radio receiver that picks up light in the form of electromagnetic waves," he said.

The problem, however, is that to capture thermal radiation rather than just microwaves, rectennas need to be incredibly small, many times thinner than a human hair. And that smallness creates complications of its own: the smaller an electrical device is, the higher its resistance becomes, which can shrink the power output of a rectenna.

"You need this device to have very low resistance, but it also needs to be really responsive to light," Belkadi said. "Anything you do to make the device better in one way would make the other worse."

For decades, in other words, optical rectennas seemed like a no-win scenario. That is until Belkadi and her colleagues, who include postdoctoral researcher Ayendra Weerakkody, landed on a solution: Why not sidestep that obstacle entirely?

A ghostly solution

The team's approach relies on a strange property of the quantum realm.

Belkadi explained that in a traditional rectenna, electrons must pass through an insulator in order to generate power. These insulators add a lot of resistance to the devices, reducing the amount of electricity that engineers can get out.

In the latest study, however, the researchers decided to add two insulators to their devices, not just one. That addition had the counterintuitive effect of creating an energetic phenomenon called a quantum "well." If electrons hit this well with just the right energy, they can use it to tunnel through the two insulators--experiencing no resistance in the process. It's not unlike a ghost drifting through a wall unperturbed. A graduate student in Moddel's research group had previously theorized that such spectral behavior could be possible in optical rectennas, but, until now, no one had been able to prove it.

"If you choose your materials right and get them at the right thickness, then it creates this sort of energy level where electrons see no resistance," Belkadi said. "They just go zooming through."

And that means more power. To test the spooky effect, Belkadi and her colleagues arrayed a network of about 250,000 rectennas, which are shaped like tiny bowties, onto a hot plate in the lab. Then they cranked up the heat.

The devices were able to capture less than 1% of the heat produced by the hot plate. But Belkadi thinks that those numbers are only going to go up.

"If we use different materials or change our insulators, then we may be able to make that well deeper," she said. "The deeper the well is, the more electrons can pass all the way through."

Moddel is looking forward to the day when rectennas sit on top of everything from solar panels on the ground to lighter-than-air vehicles in the air: "If you can capture heat radiating into deep space, then you can get power anytime, anywhere."

Credit: 
University of Colorado at Boulder

Intensive agriculture could drive loss of bees and other tropical pollinators

image: Tropical butterfly in Malaysia

Image: 
Dr Tim Newbold, UCL

Pollinators in the tropics are less likely to thrive in intensive croplands, finds a new study led by UCL researchers, suggesting that bees and butterflies are at risk of major losses.

Across the globe, lower levels of land use intensity are good for pollinators, finds the new Nature Communications paper, which underlines the importance of sustainable land management in cities and agricultural regions.

As insect pollinators were found to be more than 70% less abundant in areas with intensive cropland, compared to wild sites, the researchers say that more sustainable agricultural practices are needed to avert widespread losses of bees and other valuable insects.

Lead author, PhD student Joe Millard (UCL Centre for Biodiversity & Environment Research, UCL Biosciences and Institute of Zoology, ZSL), said: "Pollinating species are thought to be in decline globally due to combined pressures of habitat loss and climate change. Here, we found that pollinators in tropical regions are the most likely to decline as croplands continue to expand and intensify, and as animals in the tropics are also particularly vulnerable to the impacts of climate change."

The researchers modelled the effect of land-use type and intensity on global pollinator biodiversity, using a database covering 303 studies, 12,170 sites (primarily across North and South America, Europe, and Africa), and 4,502 pollinating species, including insects, birds and bats.
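
A dataset of this shape, with sites nested within hundreds of independent source studies, is typically analysed with mixed-effects models. The Python sketch below shows the general pattern using statsmodels on simulated toy data; the column names and model specification are our own simplification, and the paper's actual models may differ.

```python
# Sketch of the kind of mixed-effects model used for such a dataset:
# land-use type and intensity as fixed effects, with the source study as
# a random effect (sites from the same study are not independent).
# Toy simulated data; the paper's actual specification may differ.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 300
data = pd.DataFrame({
    "land_use": rng.choice(["primary", "cropland", "urban"], n),
    "use_intensity": rng.choice(["minimal", "intense"], n),
    "study_id": rng.choice([f"study_{i}" for i in range(10)], n),
})
# Simulate lower abundance at intensely used cropland and urban sites.
penalty = ((data["land_use"] != "primary")
           & (data["use_intensity"] == "intense")).astype(float)
data["log_abundance"] = rng.normal(3.0, 0.5, n) - 1.5 * penalty

model = smf.mixedlm(
    "log_abundance ~ C(land_use) * C(use_intensity)",  # fixed effects
    data=data,
    groups=data["study_id"],                           # random intercept per study
).fit()
print(model.summary())
# The fixed-effect terms show how abundance shifts with land use,
# intensity, and their combination.
```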

The researchers found that overall, low levels of land use intensity appear to have beneficial effects for pollinators, even compared to natural vegetation, while increasing intensity of different land uses was associated with reductions in species richness (the number of different species) and total abundance. In urban areas across the globe, total abundance of pollinators declined by 62% from minimal to intense use.

In the tropics, species richness and total abundance of all pollinators combined decreased between natural vegetation and high-intensity cropland by 44% and 49% respectively. Insect pollinators are particularly vulnerable to increases in cropland intensity, with abundance declines of at least 70% for all insect pollinator orders in high-intensity croplands compared to primary vegetation.

The researchers also found varying impacts of fertilisers, as flies did well in areas with a greater fertiliser application rate, while bees and butterflies suffered.

Senior author Dr Tim Newbold (UCL Centre for Biodiversity & Environment Research, UCL Biosciences) said: "More than three quarters of globally important food crops are at least partly reliant on animal pollination, including nuts, berries, and fruits grown in tropical areas. Croplands are expected to continue expanding rapidly in the tropics, which could pose a serious risk to local pollinators. As a result, we may see reduced yields of the many tropical crops that depend on animal pollination.

"Agricultural land management needs to take a long-term outlook to avoid harming pollinators. While maintaining wilderness spaces, so that not all land in a region is given over to human uses, is vital, agriculture can also be done more sustainably without reducing crop yields. This can mean planting different crops close together, using biocontrol agents instead of insecticides to control pests, planting hedgerows, or agroforestry. And consumers can also play their part by choosing more sustainably farmed products."

Joe Millard added: "Our finding that low intensity urban areas, like villages and green spaces, actually had greater pollinator biodiversity than wilderness areas, shows that urban areas can be good habitats for pollinators, with careful management. Planting flowers in gardens, without using insecticides, can help out our local pollinators."

Credit: 
University College London

Gut check

We are truly never alone, not even within our own bodies. Human beings play host to trillions of bacteria, fungi, viruses, and other microorganisms that make up the human microbiome. In recent years, the mix of these resident bacteria, and the presence of specific bacterial species, has been linked to conditions ranging from obesity to multiple sclerosis.

Now, going a step further, researchers at Harvard Medical School and Joslin Diabetes Center have gone beyond microbial species. By analyzing the genetic makeup of bacteria in the human gut, the team has successfully linked groups of bacterial genes, or "genetic signatures," to multiple diseases.

The work brings scientists closer to developing tests that could predict disease risk or identify disease presence based on a sampling of the genetic makeup of a person's microbiome.

The findings, to be published May 18 in Nature Communications, link sets of bacterial genes to the presence of coronary artery disease, cirrhosis of the liver, inflammatory bowel disease, colon cancer, and type 2 diabetes. The analysis indicates that three of these conditions--coronary artery disease, inflammatory bowel disease, and liver cirrhosis--share many of the same bacterial genes. In other words, people whose guts harbor these bacterial genes seem more likely to have one or more of these three conditions.

The work represents a significant advance in the current understanding of the relationship between microbes residing in the human gut and specific diseases, the team said. If confirmed through further research, the results could inform the design of tools that could gauge a person's risk for a range of conditions based on analysis of a single fecal sample, they added.

"This opens a window for the development of tests using cross-disease, gene-based indicators of patient health," said first author Braden Tierney, a graduate student in the Biological and Biomedical Sciences program at HMS. "We've identified genetic markers that we think could eventually lead to tests, or just one test, to identify associations with a number of medical conditions."

The researchers caution that their study was not designed to elucidate exactly how and why these microbial genes may be linked to different diseases. Thus far, they said, it remains unclear whether these bacteria are involved in disease development or are mere bystanders in this process.

The goal of the study was to determine whether groups of genes could reliably indicate the presence of different diseases. These newly identified microbial genetic signatures, however, could be studied further to determine what role, if any, the organisms play in disease development.

"Our study underscores the value of data science to tease out complex interplay between microbes and humans," said study co-senior author Chirag Patel, associate professor of biomedical informatics in the Blavatnik Institute at HMS.

The researchers started out by collecting microbiome data from 13 groups of patients totaling more than 2,500 samples. Next, they analyzed the data to pinpoint linkages between seven diseases and millions of microbial species, microbial metabolic pathways, and microbial genes. By trying out a variety of modeling approaches--computing a total of 67 million different statistical models--they were able to observe what microbiome features consistently emerged as the strongest disease-associated candidates.
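
The underlying "fit many models, keep what persists" logic can be sketched in a few lines. The following toy Python example is our own illustration, nowhere near the study's 67 million model configurations: it fits L1-penalised logistic models to resampled simulated data and keeps the gene features that are selected consistently.

```python
# Toy version of the stability logic described above: fit many
# L1-penalised models on resampled data and keep the gene features that
# are selected consistently. Purely illustrative simulated data.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.utils import resample

rng = np.random.default_rng(0)
n_samples, n_genes = 400, 1000
X = rng.poisson(2.0, size=(n_samples, n_genes)).astype(float)  # gene abundances
# Disease status depends (noisily) on the first five genes only.
y = (X[:, :5].sum(axis=1) + rng.normal(0, 2, n_samples) > 10).astype(int)

n_rounds = 50
selection_counts = np.zeros(n_genes)
for _ in range(n_rounds):
    Xb, yb = resample(X, y)  # bootstrap resample of the samples
    clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
    clf.fit(Xb, yb)
    selection_counts += (clf.coef_[0] != 0)

stable = np.where(selection_counts / n_rounds > 0.8)[0]
print("consistently selected genes:", stable)  # ideally the signal genes 0-4
```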

Of all the various microbial characteristics--species, pathways, and genes--microbial genes had the greatest predictive power. In other words, the researchers said, groups of bacterial genes, or genetic signatures, rather than merely the presence of certain bacterial families, were linked most closely to the presence of a given condition.

Some of the main observations included:

Clusters of bacterial genes, or genetic signatures, rather than individual bacterial genes, appear implicated in various types of human disease.

Coronary artery disease, inflammatory bowel disease, and liver cirrhosis have similar gut microbiome genetic signatures.

Type 2 diabetes, by contrast, has a microbiome signature unlike any other phenotype tested.

The analysis did not find a consistent link between the presence of the bacterial species Solobacterium moorei and colon cancer--an association previously reported in numerous studies. However, the researchers did identify particular genes from a S. moorei subspecies associated with colorectal cancer. This finding indicates that gene-level analysis can yield biomarkers of disease with greater precision and more specificity compared with current approaches.

Patel said this result underscores the notion that it is not merely the presence of a given bacterial family that may portend risk, but rather the strains and gene signatures of the microbes that matter. The ability to identify interconnections with such precision will be critical for designing tests that can measure risk reliably, he added. In this specific example, a test intended to measure colon-cancer risk by merely detecting the presence of S. moorei in the gut may not be as reliable as a more refined test that measures bacterial genes to detect the specific S. moorei strains associated with colon cancer.

Two conditions--ear inflammation and benign tumors called adenomas--showed weak associations with the gut microbiome, suggesting that microorganisms residing in the human gut are unlikely to play a role in the development of these conditions or to be reliable indicators of their presence.

In a previous study, the HMS team used massive amounts of publicly available DNA-sequencing data from human oral and gut microbiomes to estimate the size of the universe of microbial genes in the human body. The analysis revealed that there may be more genes in the collective human microbiome than stars in the observable universe.

Given the sheer number of microbial genes that reside within the human body, the new findings represent a major step forward in understanding the complexity of the interplay between human diseases and the human microbiome, the researchers said.

"The ultimate goal of computational science is to generate hypotheses from a huge swath of data," said Tierney. "Our work shows that this can be done and opens up so many new avenues for research and inquiry that we are only limited by the time, people, and resources needed to run those tests."

Credit: 
Harvard Medical School

Greenhouse gas and aerosol emissions are lengthening and intensifying droughts

image: Droughts, such as the one impacting Devil's Punchbowl on the northern slope of the San Gabriel Mountains in Los Angeles County, have increased in duration and severity over the past century. In a new study in Nature Communications, researchers in UCI's Department of Civil & Environmental Engineering said that human-sourced greenhouse gases have been a significant factor in the growth and spread of the dry spells.

Image: 
Amir AghaKouchak / UCI

Irvine, Calif., May 17, 2021 -- Greenhouse gases and aerosol pollution emitted by human activities are responsible for increases in the frequency, intensity and duration of droughts around the world, according to researchers at the University of California, Irvine.

In a study published recently in Nature Communications, scientists in UCI's Department of Civil & Environmental Engineering showed that over the past century, the likelihood of stronger and more long-lasting dry spells grew in the Americas, the Mediterranean, western and southern Africa and eastern Asia.

"There has always been natural variability in drought events around the world, but our research shows the clear human influence on drying, specifically from anthropogenic aerosols, carbon dioxide and other greenhouse gases," said lead author Felicia Chiang, who conducted the project as a UCI graduate student in civil & environmental engineering.

Chiang, who earned her Ph.D. in 2020 and is now a postdoctoral scholar at NASA's Goddard Institute for Space Studies in New York, said that her team's research demonstrated significant shifts in drought characteristics - frequency, duration and intensity - due to human influence, or what they call "anthropogenic forcing."

The researchers used the recently released Coupled Model Intercomparison Project Phase 6 platform to run climate simulations showing how the length and strength of droughts change under various scenarios, including "natural-only" conditions and conditions with the addition of greenhouse gas and aerosol emissions.
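
Comparing drought characteristics across such scenarios starts with extracting the events themselves: frequency, duration and intensity fall out of finding runs where a moisture indicator stays below a threshold. The Python sketch below is our own minimal illustration on synthetic data, not the study's code; the threshold of -1 on a standardised precipitation index is a common convention, assumed here.

```python
# Minimal illustration of extracting drought frequency, duration and
# intensity from a simulated series: find runs where a standardised
# moisture index stays below a threshold. Synthetic data, not the study's.

import numpy as np

rng = np.random.default_rng(42)
spi = rng.normal(0.0, 1.0, 1200)  # toy monthly standardised precipitation index
THRESHOLD = -1.0                  # common convention for moderate drought

below = spi < THRESHOLD
# Locate the starts and (exclusive) ends of consecutive below-threshold runs.
edges = np.diff(below.astype(int))
starts = np.where(edges == 1)[0] + 1
ends = np.where(edges == -1)[0] + 1
if below[0]:
    starts = np.insert(starts, 0, 0)
if below[-1]:
    ends = np.append(ends, below.size)

durations = ends - starts                                    # months per event
intensities = np.array([spi[s:e].mean() for s, e in zip(starts, ends)])

print(f"frequency: {len(durations)} events in {spi.size // 12} years")
print(f"mean duration:  {durations.mean():.1f} months")
print(f"mean intensity: {intensities.mean():.2f} (SPI units)")
```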

The modeling experiments under natural-only conditions did not show regional changes in drought characteristics from the late 19th to late 20th centuries, according to the study. But when the team accounted for anthropogenic greenhouse gas and aerosol contributions, statistically significant increases occurred in drought hotspots in southern Europe, Central and South America, western and southern Africa and eastern Asia.

Examining the anthropogenic forcings separately, the team found that greenhouse gases had a bigger impact in the Mediterranean, Central America, the Amazon and southern Africa, while anthropogenic aerosols played a larger role in Northern Hemisphere monsoonal and sub-arctic regions. Chiang said human-emitted aerosols are essentially particles small enough to remain suspended in the air; they can come from power plants, car exhaust and biomass burning (fires set to clear land or to burn farm waste).

"Knowing where, how and why droughts have been worsening around the world is important, because these events directly and indirectly impact everything from wildlife habitats to agricultural production to our economy," said co-author Amir AghaKouchak, UCI professor of civil & environmental engineering and Earth system science. "Lengthy dry spells can even hamper the energy sector through disruptions to solar thermal, geothermal and hydropower generation."

Co-author Omid Mazdiyasni, who earned a Ph.D. in civil & environmental engineering at UCI in 2020 and is now a project scientist with the Los Angeles County Department of Public Works, said, "To make matters worse, droughts can be accompanied by heat waves, and high heat and low moisture can increase wildfire risk, which is already significant in the western United States."

Mazdiyasni said that while the research paints a gloomy picture of the unwanted impact of humans on the global environment, it points to a potential solution.

"If droughts over the past century have been worsened by human-sourced pollution, then there is a strong possibility that the problem can be mitigated by limiting those emissions," he said.

Credit: 
University of California - Irvine

Researchers reveal new tool to help prevent suicide

image: A rainbow emerging from behind mountains.

Image: 
Photo by Evgeny Tchebotarev from Pexels

A team of Welsh academics has developed a new method of supporting health professionals to make clinical decisions about people who may be at risk of taking their own lives.

While the UK has one of the lowest suicide rates in the world, suicide is still the biggest cause of death in men under 45, so being able to make a Structured Professional Judgement about who might attempt suicide, and knowing how to intervene, is vitally important.

Researchers at Swansea and Cardiff universities have put together the Risk of Suicide Protocol (RoSP) which guides a professional to look at 20 aspects of a person's life known to be related to suicide. They can then formulate what the person's problems are and how they can be helped.

In two studies, the team examined, first, whether the RoSP could distinguish suicides from accidental deaths among people known to mental health services who were living in the community and had died unexpectedly, and second, whether it could determine who would be likely to attempt suicide in a hospital caring for people at very high clinical risk.

The research, which has just been published by leading international journal Frontiers in Psychiatry, showed just how effective the RoSP is in both settings.

Professor Nicola Gray, from Swansea University, was working therapeutically with patients at very high clinical risk at the time of the study.

She said: "The RoSP came about as we were training healthcare professionals on how to identify and manage violence to others in the people they were caring for. The clinicians said their most significant clinical difficulty was in spotting and managing people's risk of harming themselves.

"We were asked to develop something to identify and improve safety planning in those at-risk people. Looking carefully at best practice guidelines we were able to put together a list of known risk indicators that were reasonably easy to identify for clinicians, and, importantly, could be the focus of intervention."

However, Professor Robert Snowden, of Cardiff University, said there is still need for caution: "We are never going to detect all cases or prevent all suicides. Many people die by suicide without ever seeing a professional. However, we hope the use of RoSP will help those that do see a mental health professional by providing a systematic review of the person's situation and clinical presentation.

"We need to look at settings such as A&E departments, prisons, GP surgeries, and other places where there may be people who are at risk of suicide."

Swansea University's Professor Ann John, who is Chair of the Welsh Government's national advisory group on suicide and self-harm prevention, was also involved in the research.

She added: "Prediction of suicidal behaviours is notoriously difficult and the use of clinical risk assessment tools to 'predict' future suicide risk is not recommended by NICE.

"The RoSP is a structured clinical assessment that supports clinicians in the identification of modifiable risk factors which can then be addressed. This is consistent with 'needs based approaches' and helps focus the assessment on a person's situation, how best to manage risk factors and develop a safety plan for any future crises. "

Credit: 
Swansea University

Slow research to understand fast change

Image: Since 1972, researchers have been conducting experimental controlled burns to understand the interaction of fire, grazing, and nutrients at the Konza Prairie LTER. (Photo: Barbara Van Slyke, Konza LTER, CC-BY 4.0)

In a world that's changing fast, the Long Term Ecological Research (LTER) Network can seem almost an anachronism. Yet the patience and persistence that have generated 40 years of careful, reliable science about the Earth's changing ecosystems may prove to be just what's needed in this rapidly shifting world. We can't wait for a crystal ball -- and we don't have to. By harnessing decades of rich data, scientists are beginning to forecast future conditions and plan ways to manage, mitigate, or adapt to likely changes in ecosystems that will impact human economies, health and wellbeing.

The National Science Foundation established the LTER Network more than 40 years ago to provide an alternative to funding models that favored constant innovation over continuity. The model has proven to be extraordinarily successful at both.

This month, in the Ecological Society of America's open-access journal Ecosphere, LTER researchers present examples of how changing populations -- of fish, herbs, trees, kelp, birds and more -- both reflect and influence the structure and resilience of ecosystems. The collection contains 25 vignettes of unexpected lessons drawn from long-term research on populations of plants, animals and microbes -- just one small slice of the usable knowledge being generated by this program.

Ecologist Peter Groffman, who led the special collection, says the program is well-positioned to detect big changes. "Climate change is affecting ecological systems in really complex ways that are difficult to see and assess," Groffman said. "Observing from one point in time or through one method only reveals a slice of the situation. The scientists in the LTER Network combine long-term observations, experiments, models and theory to build up a more comprehensive picture."

He also highlights the program's team-oriented approach as being instrumental to its success. "The collaborative and inclusive nature of the network greatly facilitates our ability to address the very hardest questions in ecology and environmental science, and to share that information with the world," he explained. "We appreciate the National Science Foundation's vision of a lasting network that makes this environmental problem-solving possible."

Examples cluster around five main themes:

State change. Ecologists have known for decades that one small additional push could tip an ecosystem from prairie to shrubland or from mangrove forest to estuary. But just recognizing a true state change, let alone anticipating and avoiding such transitions, has been largely out of reach. At the Konza Prairie LTER, experimental manipulations of fire frequency, grazing, climate and nutrients allow researchers to identify signs of an impending shift from prairie to juniper woodland, factors (such as decreased fire frequency) that would exacerbate these shifts, and the conditions that would be needed to support restoration. 
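
One widely used family of early-warning indicators for such transitions, not specific to the Konza analyses, relies on "critical slowing down": as a system nears a tipping point, it recovers more slowly from perturbations, so variance and lag-1 autocorrelation in a long-term record rise. The Python sketch below illustrates the idea on synthetic data; all parameters are invented.

    # Synthetic illustration of "critical slowing down" before a state change:
    # lag-1 autocorrelation rises as recovery from perturbations weakens.
    # Invented parameters; not the Konza Prairie team's actual analysis.
    import numpy as np

    rng = np.random.default_rng(0)

    def lag1_autocorr(x):
        x = x - x.mean()
        return float(x[:-1] @ x[1:] / (x @ x))

    n = 400
    state = np.zeros(n)                      # e.g. deviation in woody cover
    for t in range(1, n):
        recovery = 0.5 * (1 - t / n)         # weakens as the tipping point nears
        state[t] = (1 - recovery) * state[t - 1] + rng.normal(scale=0.1)

    window = 100
    print(f"lag-1 autocorrelation, early record: {lag1_autocorr(state[:window]):.2f}")
    print(f"lag-1 autocorrelation, late record:  {lag1_autocorr(state[-window:]):.2f}")
    # A sustained rise in the late-record value is the warning signal.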

Connectivity. Bringing together researchers from multiple disciplines in one location allows LTER researchers to discern the connections between air, water, plants, microbes, soil and humans that are difficult for individual researchers to capture. In the McMurdo Dry Valleys of Antarctica, summer temperatures normally hover around the freezing point. The summer of 2002 was slightly warmer and windier than average, but solar radiation held steady. Suddenly, streams flowed, soil organisms flourished, and lake productivity increased in this otherwise dry and frozen landscape. Liquid water closed the connections among landscape components that otherwise remained quite separate. LTER researchers, thoroughly familiar with the usual range of conditions, were able to capture and analyze every nuance of the change.

Time lags. Time lags are the ultimate "cold case" in ecology. Something -- a change in management practices, a fire or hurricane, the gain or loss of a species -- changes the way an ecosystem responds years or decades later. Without knowing the history of the system, it is easy to mistake the final straw for the underlying cause. Two decades of slowly warming sea surface temperatures, punctuated by the decadal excursions of El Niño and the Pacific Decadal Oscillation, appeared to produce no discernible change in krill populations of the California Current Ecosystem. Only when researchers also included the previous 40 years of observations by a related project did they detect an increase in krill populations -- driven by higher concentrations of nutrients in deeper ocean layers.
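
The statistical point -- that a slow trend can be invisible in a short record because a decadal oscillation dominates it -- is easy to demonstrate. The Python sketch below uses synthetic numbers in the spirit of the krill example; none of the values come from the California Current study.

    # A slow trend hidden under a decadal oscillation: a 20-year window sees
    # mostly the cycle, while a 60-year record recovers the trend. Synthetic data.
    import numpy as np

    rng = np.random.default_rng(1)
    years = np.arange(1950, 2010)
    t = years - years[0]
    index = (0.02 * t                               # slow underlying increase
             + np.sin(2 * np.pi * t / 20)           # ~decadal oscillation
             + rng.normal(scale=0.3, size=t.size))  # observation noise

    def slope_per_decade(x, y):
        return 10 * np.polyfit(x, y, 1)[0]

    print(f"trend from last 20 years: {slope_per_decade(years[-20:], index[-20:]):+.2f} per decade")
    print(f"trend from full 60 years: {slope_per_decade(years, index):+.2f} per decade")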

Cascading effects. Multi-layered, complex linkages in ecosystems can cause changes to propagate in ways that are difficult to anticipate. In the simplest example, reduced predation releases pressure on a population of grazers; more grazers survive and, in turn, decimate the plant life at the bottom of the food chain. At Gull Lake in Michigan, increased growth of Microcystis, a toxic cyanobacterium, followed the arrival of zebra mussels in the early 1990s. Typically, Microcystis does better in warmer water, yet as water temperatures increased at Gull Lake, Microcystis populations declined. Multiple experiments suggested that the zebra mussels facilitated the growth of Microcystis, perhaps by grazing on the cyanobacterium's competitors. When a heat wave caused a massive die-off of zebra mussels in Gull Lake, the microcystin toxin declined by ~80%, providing a whole-lake confirmation of the experimental results.
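
The "simplest example" above can be made concrete with a toy three-level food chain. The Python sketch below removes the top predator and shows grazers increasing while plants crash; the parameters are invented, and the model is not meant to describe Gull Lake or any LTER site.

    # Toy predator -> grazer -> plant chain: removing predators releases
    # grazers, which then suppress plants. Invented parameters only.
    def run(pred0, steps=20000, dt=0.01):
        plant, grazer, pred = 1.0, 0.5, pred0
        for _ in range(steps):
            d_plant = plant * (1.0 - plant) - 0.8 * plant * grazer
            d_grazer = 0.8 * plant * grazer - 0.3 * grazer - 0.6 * grazer * pred
            d_pred = 0.6 * grazer * pred - 0.2 * pred
            plant += dt * d_plant
            grazer += dt * d_grazer
            pred += dt * d_pred
        return plant, grazer

    plant_w, grazer_w = run(pred0=0.3)
    plant_wo, grazer_wo = run(pred0=0.0)
    print(f"with predators:    plant={plant_w:.2f} grazer={grazer_w:.2f}")
    print(f"without predators: plant={plant_wo:.2f} grazer={grazer_wo:.2f}")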

Resilience. What qualities allow an ecosystem to retain its basic functions in the face of changing conditions? Or to return to the same basic form after a major disturbance? With temperature, nutrients, storms, water, and biodiversity all changing at once in the real world, controlled experiments allow LTER researchers to disentangle interacting influences on resilience. The BioCON experiment, established in 1997 at Cedar Creek LTER, separates the effects of biodiversity loss, increased nitrogen, and increased carbon dioxide; nitrogen additions decreased species richness by 16%. In the related TeRaCON experiments, researchers found that high biodiversity can buffer microclimatic conditions by the equivalent of 2 degrees Celsius.
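
The logic of a factorial experiment like BioCON -- crossing every level of each factor so their effects can be separated -- can be sketched in a few lines of Python. The treatment levels and the response function below are invented placeholders, not BioCON's actual design or data.

    # Sketch of factorial logic: cross all factor levels, then estimate a
    # main effect as a mean difference across every other treatment.
    # Invented levels and responses; not BioCON's actual design or data.
    from itertools import product
    from statistics import mean

    factors = {
        "diversity": [1, 4, 16],            # planted species richness
        "nitrogen": ["ambient", "added"],
        "co2": ["ambient", "elevated"],
    }

    def fake_biomass(diversity, nitrogen, co2):
        return (diversity ** 0.5
                + (1.5 if nitrogen == "added" else 0.0)
                + (1.0 if co2 == "elevated" else 0.0))

    plots = [dict(zip(factors, combo), biomass=fake_biomass(*combo))
             for combo in product(*factors.values())]

    added = mean(p["biomass"] for p in plots if p["nitrogen"] == "added")
    ambient = mean(p["biomass"] for p in plots if p["nitrogen"] == "ambient")
    print(f"estimated nitrogen main effect: {added - ambient:+.2f}")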

Credit: 
Ecological Society of America