Tech

Study shows effectiveness of suppressing female fruit flies

image: Fluorescent red proteins glow in D. suzukii fruit flies. Fruit flies at bottom left have four copies of the female-lethal gene, while flies at top middle have two copies of the gene. Wild flies without the gene are at bottom right.

Image: 
Photos by Akihiko Yamamoto and Amarish Yadav, NC State University.

Populations of Drosophila suzukii fruit flies - so-called "spotted-wing Drosophila" that devastate soft-skinned fruit in North America, Europe and parts of South America - could be greatly suppressed with the introduction of genetically modified D. suzukii flies that produce only males after mating, according to new research from North Carolina State University.

D. suzukii are modified with a female-lethal gene that uses a common antibiotic as an off switch. Withholding the antibiotic tetracycline from the diet of larvae essentially eliminates the production of female D. suzukii offspring as the modified male flies successfully mate with females, says Max Scott, an NC State entomologist who is the corresponding author of a paper describing the research.

"We use a genetic female-lethal system - a type of sterile insect technique - that works when a common antibiotic is not provided in larval diets," Scott said. "If we feed the antibiotic to the larvae, both males and females survive. If we don't, almost no females survive." Scott and collaborators previously showed success using a similar method in New World screwworm flies.

The modified flies overexpressed genes that cause cell death. Researchers used a fluorescent red protein to mark the presence of the female-lethal genes.

In the study, one line of flies grown without feeding tetracycline produced 100% males, while another line produced 98% males. Meanwhile, control fly lines grown with the antibiotic produced approximately equal numbers of males and females.

"The technique worked more effectively than we expected," Scott said.

The study also tested how the introduction of males with the female-lethal gene would affect unmodified populations in lab cages. In one test, it took 10 generations to eliminate all female offspring. In a larger test, researchers placed 1,000 modified males twice weekly into cage populations containing approximately 150 to 200 pairs of wild-type flies. After eight weeks, the test cages produced no new eggs. Control cages continued to produce over 100 eggs per day at the end of the study.

The study shows that the genetically modified males both competed fairly well for the attention of fertile wild females and mated successfully with them under laboratory conditions. Scott added that the study also highlights that the female-lethal gene was passed on effectively.

Next steps could include contained trials in large cages in an NC State greenhouse, Scott said.

The study was published online in the journal Pest Management Science. Fang Li, Akihiko Yamamoto, Esther J. Belikoff, Amy Berger and Emily H. Griffith co-authored the paper. Funding for the work came from the National Institute of Food and Agriculture, U.S. Department of Agriculture Specialty Crops Research Initiative under agreement No. 2015-51181-24252 and a cooperative agreement with USDA-APHIS (award AP17PPQS&T00C165).

Credit: 
North Carolina State University

Microstructure found in beetle's exoskeleton contributes to color and damage resistance

image: Brilliant structural color in the exoskeleton of the flower beetle, Torynorrhina flammea. Photo courtesy of Ling Li.

Image: 
Virginia Tech

Beetles are creatures with built-in body armor. They are tiny tanks covered with hard shells, also known as exoskeletons, protecting their soft, skeleton-less bodies inside. In addition to providing armored protection, the beetle's exoskeleton offers functions like sensory feedback and hydration control. Notably, the exoskeletons of many beetles are also brilliantly colored and patterned, which enhances visual communication with other beetles and organisms.

Ling Li, lead investigator and assistant professor in mechanical engineering, has joined colleagues from six other universities to investigate the interplay between mechanical and optical performance in beetle exoskeletons. They discovered that the structures providing mechanical support also play a key role in the exoskeleton's optical performance. Their findings were published in the Proceedings of the National Academy of Sciences.

The team came together to answer two questions: how does the exoskeletal material achieve remarkable mechanical and optical functions at the same time, and which function dominates the structural design at nano- and micrometer scales?

Their focus was narrowed to a specific species: the flower beetle. This small scarab beetle lives in the rainforests of southeast Asia and is noted for displaying brilliant colors, ranging from deep blue and green to orange and red. These colorful shells are composed of two main layers that combine for protection, communication, and hydration.

How a beetle's colored armor works

Li and his team launched their research from knowledge of the beetle's shell composition: its outer exocuticle layer contains a unique microstructure only about 1/30 of a millimeter thick. It consists of a stack of horizontal nanoscale layers interspersed with vertical microscale pillars, providing the exoskeleton with optical coloration and mechanical strength at the same time.

Unlike pigment-based colorations, the optical appearance of the flower beetle results from the exoskeleton's microstructure. The nanolayered region consists of two alternating material compositions, which selectively reflect light of certain colors. This phenomenon is called structural color or photonic color.
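As a rough illustration of how such a multilayer produces color, the first-order Bragg condition for an alternating two-material stack at normal incidence gives the peak reflected wavelength. The refractive indices and layer thicknesses below are hypothetical placeholders, not measurements from the beetle study:

```python
# Illustrative Bragg condition for a two-material multilayer reflector.
# Indices and thicknesses are invented for illustration.

def bragg_peak_wavelength(n1, d1, n2, d2, order=1):
    """Peak reflected wavelength (nm) of an alternating bilayer stack
    at normal incidence: lambda = 2 * (n1*d1 + n2*d2) / order."""
    return 2.0 * (n1 * d1 + n2 * d2) / order

# Hypothetical chitin-like bilayer: two materials with slightly
# different refractive indices, thicknesses in nanometers.
lam = bragg_peak_wavelength(n1=1.55, d1=90.0, n2=1.40, d2=100.0)
print(f"Peak reflection near {lam:.0f} nm")  # green-yellow region
```

Thinner layers or lower refractive indices shift the peak toward blue, which is broadly consistent with different structural colors arising from variations in the layered microstructure.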

Structural color is a common strategy for producing coloration in nature, as seen in butterfly wings, bird feathers, and even some plants and mollusk shells. In 2015, Li and his colleagues discovered that a type of limpet found in Europe develops the iridescent blue color of its shell through a similar multilayered microstructure made of the mineral calcite, the same material found in chalk.

In addition to providing coloration, the exoskeletal shell of beetles needs to be strong and damage tolerant, Li explained. The flower beetle achieves this through the reinforcement of its shell's vertical micropillars. When the microstructure is pierced, the shell's micropillars hold a seal around the puncture site. This prevents the beetle's wing case from tearing, cracking, or delaminating. The micropillars can also spring back, reducing the size of the damage zone left by the intruding object after unloading.

Micropillars with multiple jobs

Knowing that the mechanical and optical functions were linked, the team sought to discover which of the two was primary.

By collaborating with Mathias Kolle from MIT, the team developed an optical modeling program to simulate the optical response of the beetle's microstructure. They found that the presence of micropillars, while reducing some degree of optical reflection, is able to redistribute the reflected light to a greater angular range. This contributes to the beetle's ability to "send out" the optical signals to its potential receivers.

At the same time, mechanically, the presence of micropillars increases the stiffness, strength, and mechanical robustness of the structure by preventing the formation of shear bands, improving the damage resistance of the outer layer, and localizing damage to the exoskeleton.

After gaining an understanding of the basic mechanisms for optical coloration and mechanical reinforcement, Li and his team studied how the arrangement and size of an exoskeleton's micropillars impact both factors.

They found that a balance had to be struck: If there were many micropillars, the mechanical strength would be improved, Li explained. However, this would degrade the structural color, because the area percentage of the horizontal multilayer would be reduced.

The final objective was to determine which property, optical or mechanical, is more optimized when evolution "designs" the microstructure. To answer this question, the team examined the microstructure of flower beetles from the same species group, but with different colors.

Optical function won the day. They found that the size and distribution of the micropillars in beetles of differing colors were indeed optimized for achieving the most efficient light redistribution. The improvement of mechanical properties, particularly the stiffness, appeared not to be optimized, since the microstructure was not entirely covered with the stiffer and stronger micropillars. This result indicated that optical performance took priority over mechanical performance during the evolution of this peculiar multilayer, micropillar structure.

"This work presents a remarkable example of how nature achieves multifunctionality with unique microstructural designs," said Li. "We believe the material strategies revealed in this work can be used in designing photonic coating materials with robust mechanical performance. Our interdisciplinary approach based on materials, optics, mechanics, and biology also offers an important avenue to understanding the evolution at a materials level."

Credit: 
Virginia Tech

Researchers identify muscle proteins whose quantity is reduced in type 2 diabetes

Globally, more than 400 million people have diabetes, most of them suffering from type 2 diabetes.

Before the onset of actual type 2 diabetes, people are often diagnosed with abnormalities in glucose metabolism that are milder than those associated with diabetes. The term used for such cases is prediabetes. Roughly 5-10% of people with prediabetes develop type 2 diabetes within a year of follow-up.

Insulin resistance in muscle tissue is one of the earliest metabolic abnormalities detected in individuals who are developing type 2 diabetes, and the phenomenon is already seen in prediabetes.

In a collaborative study, researchers from the University of Helsinki, the Helsinki University Hospital and the Minerva Foundation Institute for Medical Research investigated the link between skeletal muscle proteome and type 2 diabetes.

In the study, the protein composition of the thigh muscle was surveyed in men whose glucose tolerance varied from normal to that associated with prediabetes and type 2 diabetes. A total of 148 muscle samples were analysed.

The results were published in the journal iScience.

"Our study is the broadest report on human muscle proteomes so far. The findings confirm earlier observations that have exposed abnormalities in muscle mitochondria in connection with type 2 diabetes," says Docent Heikki Koistinen from the University of Helsinki, Helsinki University Hospital and Minerva Foundation Institute for Medical Research, who headed the study.

Protein concentration already decreases in prediabetes

The researchers utilised mass spectrometry, enabling them to identify over 2,000 muscle proteins.

According to the findings, the quantity of dozens of proteins had already changed in prediabetic study subjects.

The greatest changes were observed in connection with type 2 diabetes, where the levels of more than 400 proteins had changed, most of them dropping. Most of these proteins were associated with mitochondrial energy metabolism.

In fact, the results highlight the significance of mitochondria as prediabetes progresses toward type 2 diabetes.

"We found that the levels of mitochondrial muscle proteins are clearly reduced already in prediabetes," Koistinen notes.

The researchers also observed abnormalities, both in conjunction with prediabetes and type 2 diabetes, in the concentration of a range of phosphoproteins, which affect metabolism and muscle function.

Regular physical activity as targeted therapy

The researchers believe their new observations have multiple uses, including in the search for new drug targets.

"Still, there already exists an excellent and economical targeted therapy, since regular physical activity increases the number of muscle mitochondria and improves metabolism diversely," Koistinen points out.

Physical activity is also key when reducing the risk of developing diabetes.

"You can halve the risk of developing diabetes by losing weight, increasing physical activity and observing a healthy diet," Koistinen says.

Credit: 
University of Helsinki

Researchers pinpoint unique growing challenges for soybeans in Africa

URBANA, Ill. - Despite soybean's high protein and oil content and its potential to boost food security on the continent, Africa produces less than 1% of the world's soybean crop. Production lags, in part, because most soybean cultivars are bred for North and South American conditions that don't match African environments.

Researchers from the Soybean Innovation Lab (SIL), a U.S. Agency for International Development-funded project led by the University of Illinois, are working to change that. In a new study, published in Agronomy, they have developed methods to help breeders improve soybean cultivars specifically for African environments, with the intention of creating fast-maturing lines that will bolster harvests and profits for smallholder farmers.

"It is important for producers and breeders to know when a cultivar is going to mature: that moment when a plant is at full capacity and performing its best. We were motivated to fill in gaps of knowledge around maturity timing in Africa," says Guillermo Marcillo, postdoctoral researcher in the Department of Crop Sciences at U of I, and first author on the new study.

Marcillo and his collaborators capitalized on five years of SIL performance trials, encompassing 176 cultivars and experimental lines grown across 68 African sites. The trials are part of SIL's Pan-African Variety Trials (SIL-PAT) network, currently operating across 100 sites and 24 countries.

The researchers analyzed cultivar time-to-maturity against environmental variables, including temperature, day length, and elevation, using a statistical method known as a generalized additive model (GAM). The model was able to predict soybean time-to-maturity within 10 days for cultivars planted across Africa.

"The methodology we implemented is quite innovative, introducing data-driven algorithms and conventional breeding statistics to capture interactions between cultivars and environment in different areas," Marcillo says.

It was important to use a new statistical method to analyze the multi-environment dataset, according to Nicolas Martin, assistant professor in the Department of Crop Sciences and co-author on the study.

"Multi-environment trials allow breeders to analyze crop performance across diverse conditions, but also pose statistical challenges because of unbalanced data. Modern statistical methods, including GAM, can flexibly smooth a range of responses while retaining observations that could be lost under other approaches," he says.
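The paper's GAM was fit to the SIL-PAT trial data with proper spline smoothers; as a toy sketch of the underlying idea only, the snippet below fits an additive model y = α + f1(x1) + f2(x2) by backfitting, using a crude moving-average smoother on synthetic data. All variable names, coefficients, and noise levels here are invented for illustration:

```python
import numpy as np

def moving_average_smoother(x, y, frac=0.2):
    """Crude scatterplot smoother: for each point, average y over the
    nearest frac*n neighbours in x. Stands in for the spline smoothers
    a real GAM implementation would use."""
    n = len(x)
    k = max(1, int(frac * n))
    order = np.argsort(x)
    fitted = np.empty(n)
    for rank, idx in enumerate(order):
        lo, hi = max(0, rank - k), min(n, rank + k + 1)
        fitted[idx] = y[order[lo:hi]].mean()
    return fitted

def backfit_gam(X, y, n_iter=20):
    """Fit y ~ alpha + f1(x1) + ... + fp(xp) by backfitting."""
    n, p = X.shape
    alpha = y.mean()
    f = np.zeros((p, n))
    for _ in range(n_iter):
        for j in range(p):
            partial = y - alpha - f.sum(axis=0) + f[j]  # partial residual
            f[j] = moving_average_smoother(X[:, j], partial)
            f[j] -= f[j].mean()          # identifiability constraint
    return alpha, f

rng = np.random.default_rng(0)
n = 300
tmin = rng.uniform(10, 25, n)            # daily minimum temperature (C)
dlen = rng.uniform(-1.5, 1.5, n)         # change in day length (h)
# Hypothetical response: days to maturity with additive effects + noise
y = 110 - 1.5 * tmin + 8.0 * np.sin(dlen) + rng.normal(0, 2, n)

alpha, f = backfit_gam(np.column_stack([tmin, dlen]), y)
resid = y - alpha - f.sum(axis=0)
print(f"mean response {alpha:.1f} days, residual SD {resid.std():.2f}")
```

The fitted curves f1 and f2 recover the nonlinear temperature and day-length effects without assuming a parametric form, which is the flexibility Martin describes.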

The researchers found environmental factors, specifically daily minimum temperature and change in day length between planting and maturity, were far more important than genotypic differences in predicting time-to-maturity.

"Our study is the first systematic quantification of the effects of these two drivers in Africa," Marcillo says. "We know how temperature and day length affect maturity timing in the U.S., Brazil, and Argentina, but in Africa it's quite a big unknown. If we sent farmers cultivars from these regions, their environments might speed up or slow down maturity quite a bit. This knowledge is a big gain for Africa."

Michelle Da Fonseca Santos, who manages the SIL-PAT trial, notes minimum temperature depends on different factors in Africa compared to soybean-growing regions in North and South America.

"Minimum temperature is inversely related to elevation. In North and South America, we don't have much variation in elevation, so it's easy for us to divide maturity groups by latitude," she says. "But in Africa, for example in Kenya, if we try to plant the same lines in high-elevation locations, they take forever. So we might need to use different lines for different regions depending on elevation."

Time-to-maturity is only one factor breeders need to take into account when developing new cultivars for African environments. Fortunately, the GAM method is flexible.

"We focused on time of maturity in this first project, but our method can apply to many other traits of economic importance such as yield, grain quality, protein content, and oil content," Marcillo says.

The researchers credit SIL-PAT farmers and breeders for making their discoveries possible.

"We couldn't have done anything close to this magnitude without the great work of our SIL-PAT colleagues in assembling the data set. Working with the breeders in each region is almost like a love story. Doing this manually or independently would have been impossible," Martin says.

Da Fonseca Santos adds, "The database is strong because of the public-private partnerships - seed companies, processors, government agencies - we have across Africa. None of this would be possible without them. And it's all demand driven. They all come to us saying we need soybean. So, this work is beneficial on many levels."

Credit: 
University of Illinois College of Agricultural, Consumer and Environmental Sciences

CU Anschutz called a 'case study' for commercializing medical breakthroughs

image: CU Anschutz's Innovative Ecosystem

Image: 
CU Anschutz Medical Campus

A new study highlights the University of Colorado Anschutz Medical Campus as an example of how an academic medical center can turn groundbreaking research into commercial products that improve patient care and public health.

The paper, published recently in the Journal of Clinical and Translational Science, focuses on the unique ecosystem at CU Anschutz responsible for these innovations. And it specifically details the campus's collaborative culture and how biomedical research is commercialized.

The campus has successfully turned academic research into a variety of products. CU Anschutz, for example, developed two vaccines for shingles, Zostavax and Shingrix, and worked with industry partners to distribute them globally.

"At the CU Anschutz Medical Campus, we have created an ecosystem that fosters innovation and enables our faculty to turn big ideas into bold breakthroughs with transformative impacts on health and medicine," said Chancellor Donald M. Elliman, Jr. "Our investments in infrastructure and top talent in diverse fields, paired with a focus on building robust industry partnerships, are what allow us to translate research discoveries into new drugs and devices, novel approaches to care delivery and improved outcomes for patients everywhere."

The study notes two major engines driving this innovation: the Colorado Clinical and Translational Sciences Institute (CCTSI) and CU Innovations.

"By sharing best practices for translational research, our hope is to showcase how academic medical centers can develop and disseminate creative solutions to the many challenges facing healthcare systems," said author Ronald Sokol, MD, assistant vice chancellor for Clinical and Translational Science at the CU Anschutz Medical Campus and director of the CCTSI.

CCTSI sets out a road map for faculty that helps them translate research innovation and discoveries into commercially viable products or services. The CU Innovations program collaborates with academic and administrative offices on the campus to identify unique barriers to translation and find creative ways to partner with industry to commercialize discoveries.

"The close collaboration between CU Innovations and the CCTSI with a shared mission of supporting the translation of discoveries has led to major successes," said Kimberly Muller, executive director of CU Innovations. "At CU Innovations' core lies traditional patent licensing and patent management. However, in recent years greater emphasis has been placed on industry collaborations, providing gap funding, making technology development experts accessible to faculty and offering training programs."

Recently, Summit Biolabs, a commercial-stage molecular diagnostics company, partnered with CU Innovations to collaborate on developing saliva liquid-biopsy tests for early detection of head and neck cancer and diagnosis of viral contagions such as COVID-19. The collaboration involves partnering with CU Anschutz faculty and experts on research, development and commercialization of the tests.

"This collaboration broadens and strengthens Summit Biolabs' ability to bring to market life-changing saliva liquid-biopsy tests that ultimately enable better treatment and improved outcomes for patients," said Bob Blomquist, Chief Executive Officer at Summit Biolabs.

Credit: 
University of Colorado Anschutz Medical Campus

New 2D alloy combines five metals, breaks down CO2

image: Scanning transmission electron microscope images of a high entropy transition metal dichalcogenide alloy flake in its entirety and an atom-resolved section. Monochromatic images depict the distribution of different elements.

Image: 
Mishra Lab

A two-dimensional alloy material -- made from five metals as opposed to the traditional two -- has been developed by a collaboration between researchers at the McKelvey School of Engineering at Washington University in St. Louis and researchers at the College of Engineering at the University of Illinois at Chicago.

And, in a first for such a material, it has been shown to act as an excellent catalyst for reducing CO2 into CO, with potential applications in environmental remediation.

The research, from the lab of Rohan Mishra, assistant professor in the Department of Mechanical Engineering & Materials Science at Washington University, was published Saturday, June 26, in the journal Advanced Materials.

"We're looking at transforming carbon dioxide, which is a greenhouse gas, into carbon monoxide," Mishra said. "Carbon monoxide can be combined with hydrogen to make methanol. It could be a way to take CO2 from the air and recycle it back into a hydrocarbon."

The basis of this innovation is a class of materials known as transition metal dichalcogenides (TMDCs) -- combinations of a transition metal and a chalcogen, which can be sulfur, selenium or tellurium. When an alloy contains more than three metals at near-equal ratios, it's said to be "high entropy." Hence the wordy name of the material developed in Mishra's lab: high-entropy transition metal dichalcogenides.

TMDCs are not new. There has been interest in similar two-dimensional forms of these materials due to their unique optical and electronic properties, Mishra said. But he had a suspicion they could be used for something else.

"We've been looking at these, too, but exploring their potential for electrocatalysis," acting as a catalyst to facilitate chemical reactions. As they are effectively two-dimensional (about three atoms thick), they make for efficient catalysts; reactions occur on the surface of a material, and a two-dimensional material has a lot of surface area, and not much else. In an earlier study, also published in the journal Advanced Materials in 2020, the group had shown that two-metal TMDC alloys showed improved catalytic activity over individual TMDCs. "This led us to the question, can adding more metals to these alloys make even better catalysts?" Mishra said.

With 10 applicable transition metals and three chalcogens, there are 135 two-metal and 756 five-metal possible TMDC alloys. However, just like oil and water, not all combinations will mix together to form a homogenous alloy.
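These counts follow directly from binomial coefficients: choose 2 (or 5) of the 10 applicable transition metals, times 3 choices of chalcogen. A quick check:

```python
from math import comb

# Counting candidate TMDC alloys: choose k of the 10 applicable
# transition metals, paired with one of 3 chalcogens (S, Se, Te).
n_metals, n_chalcogens = 10, 3

two_metal = comb(n_metals, 2) * n_chalcogens   # 45 * 3
five_metal = comb(n_metals, 5) * n_chalcogens  # 252 * 3

print(two_metal, five_metal)  # 135 756
```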

"Without guidance from computations, experimentally determining which elemental combinations will give an alloy becomes a trial-and-error process that is also time consuming and expensive," Mishra explained.

The alchemist in this case was John Cavin, a graduate student in Washington University's Department of Physics in Arts & Sciences.

In the previous work, Cavin had shown which two transition metals can be combined, and at what temperatures, to form binary TMDC alloys.

"The question was, 'Could we even synthesize a TMDC alloy that had that many components?'" Cavin said. "And will they improve the reduction of CO2 into CO?"

To find out, he used quantum mechanical calculations to predict which combinations were most likely to improve the material's ability to catalyze CO2 reduction. Then he had to go further to determine if the material would be stable, but had no tools to do so. So, he developed one himself.

"I had to develop a thermodynamic model for predicting stable high-entropy TMDC alloys from the quantum mechanical calculations," Cavin said. These calculations were carried out using considerable supercomputing resources, made available by the Extreme Science and Engineering Discovery Environment network, which is supported by the National Science Foundation.

After years of development, the resulting analysis was sent to experimental collaborators at the University of Illinois at Chicago.

"At UIC, they could synthesize the materials that we predicted would form a high-entropy TMDC alloy," Mishra said. "Furthermore, one of them showed exceptional activity."

They may have other uses, too. UIC synthesized three of the four different TMDC alloys and will continue to analyze them.

"These are new materials, they have never before been synthesized," Mishra said. "They may have unanticipated properties."

The work stems from a DMREF grant from the National Science Foundation, awarded as part of the Materials Genome Initiative. Launched by President Barack Obama in 2011, the multi-agency initiative creates policy, resources and infrastructure that support U.S. institutions in discovering, manufacturing and deploying advanced materials efficiently and cost-effectively.

Credit: 
Washington University in St. Louis

Data-driven approach for a more sustainable utility rate structure

Many drivers use tollways to get from point A to point B because they are a faster and more convenient option. The fees associated with these roadways are higher during peak traffic hours of the day, such as during the commute to and from work. With this structure, drivers who are not adding to the heavy flow of traffic do not have to pay higher toll prices. However, those who utilize the toll road during more congested hours pay a premium to use the faster, more convenient highways.

Similarly, not everyone uses the same amount of electricity throughout the day. Peak load hours put more strain on the grid, and within those hours some users consume heavily while others conserve.

Dr. Le Xie, professor in the Department of Electrical and Computer Engineering and assistant director of energy digitization at the Energy Institute at Texas A&M University, together with two co-authors, is proposing a user-impact-tailored rate plan for utility companies that works much like toll-road pricing. The rate plan would benefit individuals who use less power or who use solar power to offset some of the strain on the distribution grid, while those who draw excess power from the grid during peak times would cover more of the delivery cost.

In a typical monthly utility bill, users can see how much energy is used over that month and the kilowatt-hour rate, which is then multiplied together to determine the payment due. Through this project, Xie aims to provide new options for modernizing the power delivery rate structure for the utility industry through a data-driven approach.

This research was published in the June issue of the journal Utilities Policy.

Xie explained that the costs associated with delivering power are not necessarily the same for every customer, and further, that not every customer is the same in terms of their impact on and contribution to the grid. He proposes the use of electric smart meters, which can record how much energy a consumer has used at a granularity of 15 minutes. This would give the utility company a more refined, closer-to-real-time picture of which customers contribute to the grid's peak consumption and which relieve stress on the grid through solar panels or lower consumption.

"If a particular customer is actually sending some of their solar panel energy back to the grid during the peak hours, they should be viewed as a positive asset to the grid, and that individual should be somehow compensated," Xie said. "And if someone is using their air conditioner an excessive amount when the grid is extremely stressed, then that individual is contributing to the strain on the grid and might need to pay a higher portion of the delivery cost."

Power distribution grids across the world are undergoing profound changes due to advances in grid edge technologies, such as solar panels and electric vehicles (EVs). The rate model Xie envisions would include a shift from charging end users based on their kilowatt-hour volumetric consumption and instead charging them a grid access fee that approximates the impact of end-users' time-varying demand on their local distribution network.
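As a toy sketch of the contrast between the two schemes (all rates, hours, and load profiles below are invented for illustration, not the paper's actual rate design):

```python
import numpy as np

# Hypothetical comparison of a flat volumetric tariff with a
# grid-impact ("access fee") tariff keyed to peak-coincident demand.

HOURS = np.arange(24)
PEAK = (HOURS >= 17) & (HOURS < 21)          # evening system peak

def volumetric_bill(load_kwh, rate=0.12):
    """Classic bill: total kWh times a flat per-kWh rate."""
    return rate * load_kwh.sum()

def access_fee_bill(load_kwh, energy_rate=0.08, peak_rate=0.50):
    """Smaller energy charge plus a fee keyed to demand coincident
    with the system peak; net export at peak (negative load, e.g.
    rooftop solar feeding the grid) is credited."""
    return energy_rate * np.clip(load_kwh, 0, None).sum() \
        + peak_rate * load_kwh[PEAK].sum()

ev_night = np.full(24, 0.5); ev_night[0:5] += 7.0   # EV charged off-peak
solar = np.full(24, 1.0); solar[10:16] -= 2.5       # midday PV export

for name, prof in [("EV, night charging", ev_night), ("solar home", solar)]:
    print(f"{name}: volumetric ${volumetric_bill(prof):.2f}, "
          f"access-fee ${access_fee_bill(prof):.2f}")
```

In this crude version, the night-charging EV profile pays less under the access-fee tariff, while a solar profile whose export misses the evening peak can pay more, echoing the study's finding that solar results are more varied and that pairing PV with battery storage and smart scheduling maximizes savings.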

The dataset used in the case study is a system of 200 residential demand profiles, with 50 EV homes, 50 solar photovoltaic (PV) homes, and 100 non-DER homes -- those without distributed energy resources (DERs), such as an electric vehicle, solar panel or battery storage device.

Typically, customers charge their EVs during the night (off-peak hours, low grid impact). So, it follows that over 90% of EV customers see reduced bills under the proposed rate model. For solar PV customers, the results are more varied. For PV customers to minimize grid impact, combining solar PV with battery storage under a smart scheduling algorithm for charging and discharging would achieve the highest cost savings on bills. As for non-DER customers, over 80% experienced a small reduction in their bills under the proposed rate.

"The most exciting part is how we can translate technology innovations into a real-world impact," Xie said. "That impact is going to pave the way for a more sustainable operation of the grid -- leading to a more sustainable future."

Credit: 
Texas A&M University

Hunting dark energy with gravity resonance spectroscopy

Dark Energy is widely believed to be the driving force behind the universe's accelerating expansion, and several theories have now been proposed to explain its elusive nature. However, these theories predict that its influence on quantum scales must be vanishingly small, and experiments so far have not been accurate enough to either verify or discredit them. In new research published in EPJ ST, a team led by Hartmut Abele at TU Wien in Austria demonstrates a robust experimental technique for studying one such theory, using ultra-cold neutrons. Named 'Gravity Resonance Spectroscopy' (GRS), their approach could bring researchers a step closer to understanding one of the greatest mysteries in cosmology.

Previously, phenomena named 'scalar symmetron fields' have been proposed as a potential candidate for Dark Energy. If they exist, these fields would be far weaker than gravity - currently the weakest fundamental force known to physics. Therefore, by searching for extremely subtle anomalies in the behaviours of quantum particles trapped in gravitational fields, researchers could prove the existence of these fields experimentally. Within a gravitational field, ultra-cold neutrons can assume several discrete quantum states, which vary depending on the strength of the field. Through GRS, these neutrons are made to transition to higher-energy quantum states by the finely tuned mechanical oscillations of a near-perfect mirror. Any shifts from the expected values for the energy differences between these states could then indicate the influence of Dark Energy.
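The discrete energies of a neutron bouncing above a mirror in Earth's gravity are a standard quantum mechanics result, set by the zeros of the Airy function. The snippet below reproduces the textbook level spacings (it is an illustration of the quantum bouncer, not the qBOUNCE analysis itself):

```python
import numpy as np
from scipy.special import ai_zeros

# Energy levels of an ultra-cold neutron bound by Earth's gravity
# above a mirror. The eigenenergies follow the zeros a_n of the
# Airy function Ai:  E_n = (m * g^2 * hbar^2 / 2)^(1/3) * |a_n|.

HBAR = 1.054_571_8e-34   # reduced Planck constant, J s
M_N = 1.674_927_5e-27    # neutron mass, kg
G = 9.81                 # gravitational acceleration, m/s^2

a_n = ai_zeros(4)[0]                      # first four zeros of Ai (negative)
E = (M_N * G**2 * HBAR**2 / 2) ** (1 / 3) * np.abs(a_n)
E_peV = E / 1.602_176_6e-19 * 1e12        # convert J -> peV

print(np.round(E_peV, 2))   # ≈ [1.41 2.46 3.32 4.08]
```

These pico-electronvolt energy differences are what GRS probes by driving transitions with the oscillating mirror, which is why even tiny symmetron-induced shifts would be detectable.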

In their study, Abele's team designed and demonstrated a GRS experiment named 'qBOUNCE,' which they based around a technique named Ramsey spectroscopy. This involved causing neutrons in an ultra-cold beam to transition to higher-energy quantum states - before scattering away any unwanted states, and picking up the remaining neutrons in a detector. Through precise measurements of the energy differences between particular states, the researchers could place far more stringent bounds on the parameters of scalar symmetron fields. Their technique now paves the way for even more precise searches for Dark Energy in future research.

Credit: 
Springer

Dartmouth research turns up the heat on 3D printing inks

image: A 3D-printed flower demonstrates the qualities of a multi-functional printing gel that responds to moisture.

Image: 
Photo from the Ke Functional Research Group at Dartmouth.

A process that uses heat to change the arrangement of molecular rings on a chemical chain creates 3D-printable gels with a variety of functional properties, according to a Dartmouth study.

The researchers describe the new process as "kinetic trapping." Molecular stoppers--or speed bumps--regulate the number of rings going onto a polymer chain and also control ring distributions. When the rings are bunched up, they store kinetic energy that can be released, much like when a compressed spring is released.

Researchers in the Ke Functional Materials Group use heat to change the distribution of rings and then use moisture to activate different shapes of the printed object.

The process of printing objects with different mechanical strengths using a single ink could replace the costly and time-consuming need to use several inks to print items with multiple properties.

"This new method uses heat to produce and control 3D inks with varieties of properties," said Chenfeng Ke, an assistant professor of chemistry and the senior researcher on the study. "It's a process that could make the 3D printing of complex objects easier and less expensive."

Most common 3D printing inks feature uniform molecular compositions that result in printed objects with a single property, such as one desired stiffness or elasticity. Printing an object with multiple properties requires the energy-intensive and time-consuming process of preparing different inks that are engineered to work together.

By introducing a molecular "speed bump," the researchers created an ink that changes the distribution of molecular rings over time. The uneven rings also transform the material from a powder into a 3D printing gel.

"This method allowed us to use temperature to create complex shapes and make them actuate at different moisture levels," said Qianming Lin, a graduate student researcher at Dartmouth and first author of the study.

A video demonstrating the research shows a flower printed using a 3D ink produced with the process. The flower closes when exposed to moisture. Different parts of the printed flower have different levels of flexibility created by the arrangement of molecular rings. The mixture of properties allows the soft petals to close while the firmer parts of the flower provide structure.

Printing the same flower using current 3D printing methods would come with the extra challenge of combining different printed materials.

"The different parts of this object come from the same printing ink," said Ke. "They have similar chemical compositions but different numbers of molecular rings and distributions. These differences give the product drastically different mechanical strengths and cause them to respond to moisture differently."

The study, published in Chem, accesses the energy-holding "meta-stable" states of molecular structures made of cyclodextrin and polyethylene glycol--substances commonly used as food additives and stool softeners. By installing the speed bumps on the polyethylene glycol, the 3D-printed objects become actuators that respond to moisture to change shape.

According to the research team, future efforts to refine the molecule will allow precision control of multiple meta-stable states, allowing for the printing of "fast-responsive actuators" and soft robots using sustainable energy sources, such as variation in humidity.

The resulting printed objects could be used for medical devices or in industrial processes.

Credit: 
Dartmouth College

Duke study reveals mechanisms of increased infectivity, antibody resistance of SARS-CoV-2 variants

DURHAM, N.C. - Combining structural biology and computation, a Duke-led team of researchers has identified how multiple mutations on the SARS-CoV-2 spike protein independently create variants that are more transmissible and potentially resistant to antibodies.

By acquiring mutations on the spike protein, one such variant gained the ability to leap from humans to minks and back to humans. Other variants -- including Alpha, which first appeared in the United Kingdom; Beta, which appeared in South Africa; and Gamma, first identified in Brazil -- independently developed spike mutations that enhanced their ability to spread rapidly in human populations and resist some antibodies.

The researchers published their findings in Science.

"The spike on the surface of the virus helps SARS-CoV-2 enter into host cells," said senior author Priyamvada Acharya, Ph.D., director of the Division of Structural Biology at the Duke Human Vaccine Institute.

"Changes on the spike protein determine transmissibility of the virus -- how far and quickly it spreads," Acharya said. "Some variations of the SARS-CoV-2 spike are occurring at different times and different places throughout the world, but have similar results, and it's important to understand the mechanics of these spike mutations as we work to fight this pandemic."

Acharya and colleagues -- including first author Sophie Gobeil, Ph.D., and co-corresponding author Rory Henderson, Ph.D., -- developed structural models to identify changes in the virus's spike protein. Cryo-electron microscopy allowed atomic level visualization, while binding assays enabled the team to create mimics of the live virus that directly correlated with its function in host cells. From there, the team used computational analysis to build models that showed the structural mechanisms at work.

"By building a skeleton of the spike, we could see how the spike is moving, and how this movement changes with mutations," Henderson said. "The different variant spikes are not moving the same way, but they accomplish the same task. The variants first appearing in South Africa and Brazil use one mechanism, while the UK and the mink variants use another mechanism."

All the variants showed increased ability to bind to the host, notably via the ACE2 receptor. The changes also created viruses that were less susceptible to antibodies, raising concerns that continued accumulation of spike mutations may reduce the efficiency of current vaccines.

Gobeil said the research illuminated the complexity of the virus: "It's amazing how many different ways the virus comes up with to be more infectious and invasive," she said. "Nature is clever."

Credit: 
Duke University Medical Center

Research rebuttal paper uncovers misuse of Holocaust datasets

image: Comparison of the original outlier identification model and three models derived from it. Due to the inapplicability of its assumptions to the considered dataset, the original model has no theoretical foundation. Three alternative models are less biased to size than the original model and produce opposing results.

Image: 
Melkior Ornik

Aerospace engineering faculty member Melkior Ornik is also a mathematician, a history buff, and a strong believer in integrity when it comes to using hard science in public discussions. So, when a story popped up in his news feed about a pair of researchers who developed a statistical method to analyze datasets and used it to purportedly refute the number of Holocaust victims from a concentration camp in Croatia, it naturally caught his attention.

Ornik is a professor in the Department of Aerospace Engineering at the University of Illinois Urbana-Champaign. He proceeded to study the research in depth and used the method to re-analyze the same data from the United States Holocaust Memorial Museum. Then he wrote a rebuttal paper debunking the researchers' findings.

Ornik's rebuttal is published in the same journal as the original article. He said the editor asked him to include a list of answers to some of the potential questions other scientists may have when they read his paper. A few weeks later, the journal placed a note on the original article stating that they do not endorse or share the authors' views, and recommended reading Ornik's paper.

"As scientists, as engineers, I think it's our duty to correct flawed and faulty science," Ornik said. "There is so much effort to get the public and policymakers to believe in the science, that when a math expert says they have proof, it brings credence to the argument. But when their claims are demonstrably not true, it's not good for science and it's not good for society. That's why it's especially important for scientists to challenge false findings when we discover them."

According to Ornik, some individuals promote the view that concentration camps either didn't exist or weren't used to kill people, or that the currently widely accepted numbers of victims have been substantially inflated. Most historians do not take the claims seriously in light of vast available data and evidence.

"For the authors of the original paper to claim that they have found mathematical proof that the list of victims of that camp was fabricated has obvious historical implications," Ornik said. "I think, to some extent the damage has already been done, but I felt the need to go on record with the assumptions, inaccuracies, and misuse of the raw museum data I found in the original research."

The paper Ornik responded to presents a novel method to identify anomalies across a set of histograms. Ornik said he does not dispute the merits of the method presented in the original paper, just its application to the Jasenovac concentration camp.

Ornik became suspicious of the paper's conclusions because the researchers had implied in one case that a smaller list naturally has a smaller outlier score, yet they still compared scores across lists of very different sizes to claim that the Jasenovac list, one of the biggest, was problematic.

"I started looking to see if there was some sort of a bias for the size and whether they were actually more likely to assign the flag of being problematic to a larger list or not. And it turns out, despite the authors' claims, they were," Ornik said. "The bigger lists are more likely to be computed to be problematic than the smaller lists when their method is applied to the data."
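The size bias Ornik describes can be illustrated with a toy simulation: lists drawn from the very same distribution receive larger raw total-variation scores simply because they are bigger. This is a simplified stand-in for, not a reproduction of, the original paper's model:

```python
# Toy illustration of size bias in an unnormalized total-variation score.
# Draw "victim lists" of different sizes from the SAME birth-year
# distribution, histogram them, and score each histogram by its discrete
# total variation (sum of |count differences| between adjacent bins).
# The raw score grows with list size, so comparing unnormalized scores
# across sizes systematically penalizes the biggest lists.
import random

random.seed(0)

def total_variation(counts):
    """Discrete total variation of a histogram's bin counts."""
    return sum(abs(counts[i + 1] - counts[i]) for i in range(len(counts) - 1))

def score(n_records, n_bins=40):
    counts = [0] * n_bins
    for _ in range(n_records):
        counts[random.randrange(n_bins)] += 1  # same underlying distribution
    return total_variation(counts)

for n in (100, 1_000, 10_000):
    print(n, score(n))   # the score climbs with list size alone
```

Because bin counts fluctuate on the order of the square root of the list size, even perfectly "clean" large lists accumulate a bigger raw score than small ones, which is the bias Ornik found when the method was applied to the museum data.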

Ornik, who commonly uses similar statistical analysis in aerospace applications, explained another reason that their statistical argument doesn't work.

"When you look at data, a collection of anything, and you want to figure out an outlier--something that's different­­--you need to assume that all of the pieces of data come from the same source, the same distribution. Take a list of victims by birth year. It would yield a graph of the ages of each person. Say 10 percent are older than 70 years old. Now, that distribution wouldn't be true for a list of deported children, for example, because that list, by definition, is structurally different. It is also different from a list of everyone who has an identity card. Identity cards are issued only to people who are not children. Yet, the lists that these researchers worked with came from a multitude of sources and include lists of children, lists of people getting married, lists of prisoners of war--things that by definition cannot have come from the same distribution."

Another major error in the original paper, Ornik said, is that some duplicate lists were treated as two separate lists. This meant that approximately 67 percent of their entire database was actually sub-lists of the larger list.

"The 7,000-plus lists published online by the Holocaust Museum are not curated," Ornik said. "For instance, there are two lists that contain exactly the same data; one is in Cyrillic and the other one uses the Latin alphabet. But they treated them as two separate lists. There are other lists that contain the same name, but there is no way of knowing if they are the same person or two different people born on the same day with identical names. They could have removed the very egregious errors in which a list is clearly duplicated but the rest, you would need access to the original historic data."

Both the original paper and Ornik's paper, "Comment on 'TVOR: Finding Discrete Total Variation Outliers Among Histograms,'" are published in IEEE Access.

Credit: 
University of Illinois Grainger College of Engineering

A new piece of the quantum computing puzzle

image: Jung-Tsung Shen, associate professor in the Department of Electrical & Systems Engineering, has developed a deterministic, high-fidelity, two-bit quantum logic gate that takes advantage of a new form of light. This new logic gate is orders of magnitude more efficient than the current technology

Image: 
Jung-Tsung Shen

Research from the McKelvey School of Engineering at Washington University in St. Louis has found a missing piece in the puzzle of optical quantum computing.

Jung-Tsung Shen, associate professor in the Preston M. Green Department of Electrical & Systems Engineering, has developed a deterministic, high-fidelity two-bit quantum logic gate that takes advantage of a new form of light. This new logic gate is orders of magnitude more efficient than the current technology.

"In the ideal case, the fidelity can be as high as 97%," Shen said.

His research was published in May 2021 in the journal Physical Review A.

The potential of quantum computers rests on two unusual quantum properties: superposition -- the ability of a quantum system to occupy many distinct states at the same time -- and entanglement -- two particles behaving as if they are correlated in a non-classical manner, despite being physically removed from each other.

Where voltage determines the value of a bit (a 1 or a 0) in a classical computer, researchers often use individual electrons as "qubits," the quantum equivalent. Electrons have several traits that suit them well to the task: they are easily manipulated by an electric or magnetic field and they interact with each other. Interaction is a benefit when you need two bits to be entangled -- letting the wilderness of quantum mechanics manifest.

But their propensity to interact is also a problem. Everything from stray magnetic fields to power lines can influence electrons, making them hard to truly control.

For the past two decades, however, some scientists have been trying to use photons as qubits instead of electrons. "If computers are going to have a true impact, we need to look into creating the platform using light," Shen said.

Photons have no charge, which leads to the opposite problems: they do not interact with the environment the way electrons do, but they also do not interact with each other. It has also been challenging to engineer effective ad hoc inter-photon interactions. Or so traditional thinking went.

Less than a decade ago, scientists working on this problem discovered that, even if they weren't entangled as they entered a logic gate, the act of measuring the two photons when they exited led them to behave as if they had been. The unique features of measurement are another wild manifestation of quantum mechanics.

"Quantum mechanics is not difficult, but it's full of surprises," Shen said.

The measurement discovery was groundbreaking, but not quite game-changing. That's because for every 1,000,000 photons, only one pair became entangled. Researchers have since been more successful, but, Shen said, "It's still not good enough for a computer," which has to carry out millions to billions of operations per second.

Shen was able to build a two-bit quantum logic gate with such efficiency because of the discovery of a new class of quantum photonic states -- photonic dimers, photons entangled in both space and frequency. His prediction of their existence was experimentally validated in 2013, and he has since been finding applications for this new form of light.

When a single photon enters a logic gate, nothing notable happens -- it goes in and comes out. But when there are two photons, "That's when we predicted the two can make a new state, photonic dimers. It turns out this new state is crucial."

High-fidelity, two-bit logic gate, designed by Jung-Tsung Shen.

Mathematically, there are many ways to design a logic gate for two-bit operations; these different designs are called equivalent. The specific logic gate that Shen and his research group designed is the controlled-phase gate (or controlled-Z gate). Its principal function is that the two photons that come out are in the negative of the state of the two photons that went in.

"In classical circuits, there is no minus sign," Shen said. "But in quantum computing, it turns out the minus sign exists and is crucial."

When two independent photons (representing two optical qubits) enter the logic gate, "The design of the logic gate is such that the two photons can form a photonic dimer," Shen said. "It turns out the new quantum photonic state is crucial as it enables the output state to have the correct sign that is essential to the optical logic operations."
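The action of a controlled-phase (controlled-Z) gate on two-qubit amplitudes can be written out directly. This sketch shows only the abstract gate matrix, not Shen's photonic implementation:

```python
# The controlled-phase (controlled-Z) gate flips the sign of the |11>
# amplitude and leaves the other two-qubit basis states untouched.
CZ = [
    [1, 0, 0, 0],   # |00> -> |00>
    [0, 1, 0, 0],   # |01> -> |01>
    [0, 0, 1, 0],   # |10> -> |10>
    [0, 0, 0, -1],  # |11> -> -|11>  (the crucial minus sign)
]

def apply(gate, state):
    """Apply a 4x4 gate matrix to a two-qubit amplitude vector."""
    return [sum(gate[i][j] * state[j] for j in range(4)) for i in range(4)]

# Only the |11> component picks up the minus sign:
print(apply(CZ, [0, 0, 0, 1]))      # -> [0, 0, 0, -1]

# On an equal superposition (|00> + |01> + |10> + |11>)/2:
print(apply(CZ, [0.5, 0.5, 0.5, 0.5]))  # -> [0.5, 0.5, 0.5, -0.5]
```

That conditional minus sign is exactly the non-classical operation a classical circuit cannot produce, and it is what the photonic-dimer state supplies in Shen's design.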

Shen has been working with the University of Michigan to test his design, which is a solid-state logic gate -- one that can operate under moderate conditions. So far, he says, results seem positive.

Shen says this result, while baffling to most, is clear as day to those in the know.

"It's like a puzzle," he said. "It may be complicated to do, but once it's done, just by glancing at it, you will know it's correct."

Credit: 
Washington University in St. Louis

Increased use of household fireworks creates a public health hazard, UCI study finds

Irvine, Calif., June 29, 2021 - Fireworks are synonymous in the United States with the celebration of Independence Day and other special events, but the colorful displays have caused a growing risk to public safety in recent years, according to a study by environmental health researchers at the University of California, Irvine.

Relying on real-time air quality measurements crowdsourced from a network of more than 750 automated sensors distributed throughout California, scientists from UCI's Program in Public Health found that short-term spikes of extremely high particulate-matter air pollution from widespread fireworks use occurred during late June and early July in both 2019 and 2020.

The increase was most pronounced in Southern California counties where fireworks regulations are less strict than in northern parts of the state and where the illegal use of do-it-yourself pyrotechnics is also more prevalent. This and other findings are the subject of a study published recently in the International Journal of Environmental Research and Public Health.

"You may have seen discussions on social media lately about people worrying for their pets on nights when the skies are filled with exploding fireworks, but we've found that there is a real threat to human well-being too," said co-author Jun Wu, UCI professor of public health. "And like many other environmental justice issues, we find the worst impacts among residents of low-income communities."

Aerial explosions cause the release of fine particles less than 2.5 micrometers in diameter. Airborne particulate matter of this size is hazardous because when inhaled, it can be absorbed by the lungs and passed to other tissues inside the body. Fireworks get their distinct colors from compounds containing barium, copper, magnesium, potassium and strontium. As rockets burst in the sky, they release these chemicals, trace redox-active metals and water-soluble ions, which inevitably fall on those below.

"These fine particles are known to cause a wide range of adverse health effects, including premature mortality, respiratory and cardiovascular diseases, adverse pregnancy outcomes, and neurological diseases," Wu said.

The UCI team used data accumulated via a statewide network of PurpleAir sensors, low-cost devices deployed in households. Utilizing this method, the researchers built a high-resolution map tracking levels of airborne particulate matter less than 2.5 micrometers in diameter before, during and after Fourth of July fireworks during the study period.

"The PurpleAir network includes sensors that monitor air continuously, which offers advantages over the traditional monitoring installations that are often positioned away from residential areas and take intermittent measurements that may miss peak days such as the Fourth of July," said lead author Amirhosein Mousavi, a postdoctoral scholar in UCI's Program in Public Health. "By taking data from a large, distributed sensor network that's always collecting data in neighborhoods where people from various socioeconomic profiles live, we were able to get a much clearer characterization of the health risks posed by do-it-yourself fireworks."
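The kind of aggregation described, turning continuous sensor readings into daily values and flagging holiday spikes, can be sketched as follows. The numbers below are invented, and the study's statistical methods are more involved than this toy threshold:

```python
# Toy sketch: flag days whose mean PM2.5 spikes far above the baseline.
# Values are invented; real analyses draw on the PurpleAir sensor feed
# and apply more careful statistics.
from statistics import median

daily_pm25 = {                 # hypothetical daily means, ug/m^3
    "2020-07-01": 12.0,
    "2020-07-02": 14.5,
    "2020-07-03": 15.1,
    "2020-07-04": 74.8,        # fireworks night
    "2020-07-05": 52.3,        # lingering smoke
    "2020-07-06": 16.0,
}

baseline = median(daily_pm25.values())           # robust to the spike days
spikes = [day for day, v in daily_pm25.items() if v > 3 * baseline]
print("baseline:", baseline, "spike days:", spikes)
```

Using the median rather than the mean keeps the baseline from being dragged up by the fireworks days themselves, so the July 4-5 spike stands out cleanly.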

The team found that among all 58 California counties, Los Angeles County experienced the highest daily PM2.5 levels around the Fourth of July holiday in both 2019 and 2020. They believe this was the result of larger numbers of individuals shooting off their own rockets in neighborhoods where they lived, as well as the nature of L.A.'s topography, which has long been known to facilitate the buildup of air pollution.

In addition, researchers believe they detected a COVID-19 effect in their data. PM2.5 concentrations on July 4 and 5 in 2020 were, on average, 50 percent higher than in 2019, likely due to the increased use of household-level fireworks during the pandemic lockdowns.

The team also learned that peak fireworks pollution was two times higher in communities with lower socioeconomic status, larger minority-group populations and higher asthma rates.

"This work highlights the important role that policy and enforcement can play in reducing fireworks-related air pollution and protecting public health," Wu said. "As there is a patchwork of different restrictions and regulations regarding fireworks in our state, it's clear that a more coordinated approach would help people breathe easier during times of celebration."

Credit: 
University of California - Irvine

Cooked crustaceans, cannabis and a budder way

This lobster tale begins a few years ago when the proprietor of a northeastern seafood restaurant publicly asserted that exposing lobsters to a little cannabis prior to cooking produced notable changes in their behavior and a less dramatic scene in the kitchen for all concerned, which was the Maine thing.

In a paper published online June 29, 2021 in the journal Pharmacology Biochemistry and Behavior, a team led by researchers at University of California San Diego School of Medicine reports on efforts to answer that burning, boiling and baked question. They obtained live lobsters (Homarus americanus) from a supermarket and exposed the crustaceans to up to 60 minutes of vaporized Δ9-tetrahydrocannabinol (THC) -- the principal psychoactive component of cannabis -- then measured THC levels in the animals' tissues and looked for behavioral changes, including thermal nociception, the perception of heat or cold.

The issue of whether lobsters feel pain when dropped into a cooking pot of boiling water is long-simmering. Though live cooking of lobsters has been banned in Switzerland (2018) and New Zealand (1999) based on that presumption, most scientists say the empirical evidence that crustaceans feel pain, or can even detect temperature changes, is far from clear.

A study at Queen's University in Belfast, published in 2013, exposed the claws of shore crabs (Carcinus maenas) to electrical shocks. The crabs learned to avoid them, a behavior the study authors said was "consistent with key criteria for pain experience."

A 2015 study at the University of Texas-Pan American found that Louisiana red swamp crayfish (Procambarus clarkii) displayed nociceptive behavior (responses to extreme temperatures) when briefly touched by a soldering iron at 129 degrees Fahrenheit, but not to low temperature or chemical stimuli, such as capsaicin, the active component in chili peppers.

The University of Texas authors concluded the crayfish possessed some nociceptive behaviors, but cautioned against over-interpreting the results. "We wish to be clear that we are not claiming crustaceans generally, or even crayfish specifically, feel pain," they wrote. "We are claiming that crayfish detect and respond to noxious high temperature stimuli in ways that they do not to other potentially noxious stimuli."

Michael A. Taffe, PhD, professor in the Department of Psychiatry at UC San Diego School of Medicine and senior author of the newest study, and colleagues had already documented in 2016 that inhaled THC produced anti-nociceptive effects in rats, blocking pain detection, and reduced the rodents' body temperature and physical activity. Given the restaurateur's claims, they were curious whether THC acted similarly upon lobsters, which are a well-established animal model for neurological research.

"Lobster models have contributed much to our understanding of neurotransmission and neuronal circuits," said Taffe, "most notably in the work of (Brandeis University neuroscientist) Eve Marder, who began her work when she was a graduate student at UC San Diego."

Though primarily an aquatic species, lobsters are able to survive for hours to days out of water if their gills (located inside their carapace or shell) remain moist enough to function. The UC San Diego researchers set up a sealed tank in which residing lobsters would be exposed to carefully calibrated levels of vaporized THC for 30 or 60 minutes.

The scientists subsequently measured levels of THC in tissue samples from some of the lobsters, including gills, brain, heart, liver, tail and claw. Claw samples were boiled for 10 minutes to determine if THC levels were reduced or eliminated by cooking.

The behaviors of other lobsters were monitored before and after THC exposure for changes, such as more or less time spent moving around, distance traveled and speed. Lobsters also had their claws and tails dipped in water ranging in temperature from ambient to 118 degrees Fahrenheit for up to 15 seconds. (Human showers typically range in water temperature from 105 to 112 degrees F.)

Taffe said the results were mixed. The primary finding, based on tissue samples, was that Maine lobsters can absorb vaporized THC into their bodies through gill respiration. They also exhibited less locomotor activity, similar to THC-exposed rodents. Responses to different water temperatures varied: the warmer the water, the more rapidly the lobsters moved their claws, tail or antennae away from the liquid, but THC exposure had very minimal effect on detection of a hot water stimulus in the study.

"This is the first direct evidence of thermal nociception in lobsters," said Taffe.

But does that constitute feeling pain? Taffe and colleagues cannot say with certainty: Detection of a stimulus is not synonymous with pain. Lobsters responded to THC exposure much less dramatically than rodents in both behavior and response to adverse stimuli.

"The assertions of the restaurateur that lobsters can be affected by vaping cannabinoids appears confirmed by their subsequent behavior," said Taffe, "but the impact of THC on thermal nociception was minimal. You would need to do more experimentation to fully investigate behavioral outcomes, including anxiety-like measures, which we did not address."

Credit: 
University of California - San Diego

Hackensack Meridian CDI scientists discover new tuberculosis treatment pathway

image: Dr. Thomas Dick, left, and colleagues at the CDI have identified a new treatment pathway for tuberculosis, a global killer.

Image: 
Hackensack Meridian Health

June 29, 2021 - Nutley, NJ - Scientists from the Hackensack Meridian Center for Discovery and Innovation, working with collaborators from across the globe, uncovered the mechanism of action of a novel anti-tuberculosis drug that they have helped develop.

The new findings show how the enzyme inhibitor triaza-coumarin, or TA-C, is metabolized by the TB germs, which makes it effective in inhibiting the disease from within, like in a "Trojan horse" attack, according to the new paper in the journal Proceedings of the National Academy of Sciences.

"This is a promising new direction of research," said Thomas Dick, member of the CDI faculty. "We are hoping this work can make a difference in the ongoing fight against TB."

"The scientists at the CDI who specialize in tuberculosis and other mycobacteria are at the vanguard of their specialty," said David Perlin, PhD, the chief scientific officer and senior vice president of the CDI. "Their promising new lines of research offer hope against a scourge that continues to kill in huge numbers, year after year."

Bacterial metabolism can cause intrinsic drug resistance - but it can also convert inactive parent drugs to bioactive derivatives, as is the case for several antimycobacterial "prodrugs." These "prodrugs" are biologically inactive compounds which are broken down by the bacteria to create the effective byproduct compounds within the bacterial cell.

The scientists show in the paper that intra-bacterial metabolism of TA-C, a new Mycobacterium tuberculosis dihydrofolate reductase (DHFR) inhibitor with moderate affinity for its target, boosts its on-target activity by two orders of magnitude. The TB germ takes up and metabolizes the TA-C - but the byproducts inhibit its function from within.

This is the first 'prodrug-like' antimycobacterial that possesses baseline activity in the absence of intracellular bio-activation. By describing how it works in this latest paper, the authors have provided the foundational basis of a novel class of DHFR inhibitors and uncovered a new antibacterial drug discovery concept.

This new methodology could be crucial in the ongoing fight against TB, which kills 1.3 million people across the globe annually, and which disproportionately afflicts the developing world. New drugs are desperately needed to fight drug-resistant strains of the disease.

Credit: 
Hackensack Meridian Health