Tech

Paleontologists discover new insect group after solving 150-year-old mystery

image: Wing of the new species Okanagrion hobani, from the McAbee fossil site in British Columbia, a damselfly-like insect of the new suborder Cephalozygoptera.

Image: 
Copyright Zootaxa, used by permission.

For more than 150 years, scientists have been incorrectly classifying a group of fossil insects as damselflies, the familiar cousins of dragonflies that flit around wetlands eating mosquitoes. While they are strikingly similar, these fossils have oddly shaped heads, which researchers have always attributed to distortion resulting from the fossilization process.

Now, however, a team of researchers led by Simon Fraser University (SFU) paleontologist Bruce Archibald has discovered they aren't damselflies at all, but represent a major new insect group closely related to them.

The findings, published today in Zootaxa, show that the distinctive shape of the insects' non-protruding, rounded eyes, set close to the head, is the defining feature of a suborder related to damselflies and dragonflies that the researchers have named Cephalozygoptera.

"When we began finding these fossils in British Columbia and Washington State, we also thought at first they must be damselflies," says Archibald.

But on closer inspection, the team noticed they resembled a fossil that German paleontologist Hermann Hagen wrote about in 1858. Hagen set the precedent of linking the fossil to the damselfly suborder despite its different head shape, which didn't fit with damselflies at all.

Damselflies have short and wide heads with eyes distinctively protruding far to each side. Hagen's fossil, however, had an oddly rounded head and eyes. But he assumed this difference was false, caused by distortion during fossilization.

"Paleontologists since Hagen had written that these were damselflies with distorted heads," Archibald says. "A few hesitated, but still assigned them to the damselfly suborder."

The SFU-led team, including Robert Cannings of the Royal British Columbia Museum, Robert Erickson and Seth Bybee of Brigham Young University and SFU's Rolf Mathewes, sifted through 162 years of scientific papers and discovered that many similar specimens have been found since Hagen's time.

They experienced a eureka moment when they realized the odd heads of their new fossils were, in fact, their true shape.

The researchers used the fossil's defining head shape to name the new suborder Cephalozygoptera, meaning "head damselfly".

The oldest known species of Cephalozygoptera lived among the dinosaurs of Cretaceous China, and the last known members of the suborder existed about 10 million years ago in France and Spain.

"They were important elements in food webs of wetlands in ancient British Columbia and Washington about 50 million years ago, after the extinction of the dinosaurs," says Archibald. "Why they declined and went extinct remains a mystery."

The team named 16 new species of Cephalozygoptera. Some of the fossils were found on the traditional land of the Colville Indian tribe of northern Washington, and so Archibald and his coauthors collaborated with tribal elders to name a new family of them. They called the family "Whetwhetaksidae", from the word "whetwhetaks", meaning dragonfly-like insects in the Colville people's language.

Archibald has spent 30 years combing the fossil-rich deposits of southern British Columbia and northern interior Washington. To date, in collaboration with others, he has discovered and named more than 80 new species from the area.

Credit: 
Simon Fraser University

Over 80% of Atlantic Rainforest remnants have been impacted by human activity

image: Biodiversity and biomass losses in the biome using data from 1,819 forest inventories. In terms of carbon storage, the losses correspond to the destruction of 70,000 km² of forest, representing some USD 2.6 billion in carbon credits

Image: 
Renato de Lima/USP

A Brazilian study published (http://www.nature.com/articles/s41467-020-20217-w) in Nature Communications shows that human activities have directly or indirectly caused biodiversity and biomass losses in over 80% of the remaining Atlantic Rainforest fragments.

According to the authors, in terms of carbon storage, the biomass erosion corresponds to the destruction of 70,000 square kilometers (km²) of forest - almost 10 million soccer pitches - or USD 2.3 billion-USD 2.6 billion in carbon credits. "These figures have direct implications for mechanisms of climate change mitigation," they state in the article.
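The quoted figures are mutually consistent, as a rough, illustrative check shows (the pitch area of about 7,000 m² is our assumption; the article does not state the conversion it used):

```python
# Illustrative sanity check of the reported equivalences (not from the study itself).
# Assumption: one soccer pitch covers roughly 7,000 m^2.
km2_lost = 70_000                              # biomass loss expressed as km^2 of forest
pitch_m2 = 7_000                               # assumed pitch area in m^2
pitches = km2_lost * 1_000_000 / pitch_m2      # convert km^2 to m^2, divide by pitch area
print(f"{pitches:,.0f} soccer pitches")        # about 10 million, as reported

credits_usd = (2.3e9, 2.6e9)                   # reported carbon-credit range
per_km2 = tuple(c / km2_lost for c in credits_usd)
print(f"USD {per_km2[0]:,.0f}-{per_km2[1]:,.0f} per km^2")
```

The implied value of roughly USD 33,000-37,000 per km² of forest-equivalent carbon is a back-calculation from the article's own totals, not a figure the authors state.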

Atlantic Rainforest remnants in Brazil are strung along its long coastline. The biome once covered 15% of Brazil, totaling 1,315,460 km². Only 20% of the original area is now left. The fragments are of varying sizes and have different characteristics.

To estimate the impact of human activity on these remnants, the researchers used data from 1,819 forest inventories conducted by several research groups.

"These inventories are a sort of tree census. The researchers go into the field and choose an area to survey, typically 100 meters by 100 meters. All the trees found within this perimeter are identified, analyzed, and measured," said Renato de Lima (https://bv.fapesp.br/en/pesquisador/668300/renato-augusto-ferreira-de-lima), a researcher at the University of São Paulo's Institute of Biosciences (IB-USP) and leader of the study. "We compiled all the data available in the scientific literature and calculated the average loss of biodiversity and biomass in the fragments studied, which represent 1% of the biome. We then used statistical methods to extrapolate the results to the fragments not studied, assuming that the impact would be constant throughout the Atlantic Rainforest biome."

After identifying the tree species in a fragment, the researchers estimated the size of their seeds and also what they call the "ecological or successional group". These two factors indicate how healthy the forest is, according to Lima. "There are hardy plants that require very little in the way of local resources and can grow on wasteland, pasture, forest borders, etc. These are known as pioneer species. A Brazilian example is the Ambay pumpwood [Cecropia pachystachya]," he said.

Pioneer tree species tend to produce seeds of smaller size, but in large numbers, because each seed has such a small chance of germinating. At the opposite extreme are climax species that flourish only in favorable environments, such as Brazilwood (Paubrasilia echinata) or various species of the genus Ocotea. These trees produce larger seeds with a substantial store of nutrients.

"This kind of seed requires a heavier investment by the parent tree in terms of energy," Lima said. "Areas in which climax species are present typically support more diversified fauna, so they serve as a marker of overall forest quality. Areas in which pioneer species predominate have probably been disturbed in the recent past."

The IB-USP group set out to show how the loss of late-successional species correlated with overall biodiversity loss and also with biomass loss, which represents the reduction in the forest's capacity to store carbon and keep this greenhouse gas out of the atmosphere. They found the forest fragments studied to have 25%-32% less biomass, 23%-31% fewer tree species, and 33%-42% fewer individuals belonging to late-successional, large-seeded, and endemic species.

The analysis also showed that biodiversity and biomass erosion were lower in strictly protected conservation units, especially large ones. "The smaller the forest fragment and the larger the edge area, the easier it is for people to gain access and disturb the remnant," Lima said.

On the positive side, degraded forest areas can recoup their carbon storage capacity if they are restored. "Combating deforestation and restoring totally degraded open areas such as pasturelands have been a major focus. These two strategies are very important, but we shouldn't forget the fragments in the middle," Lima said.

According to Paulo Inácio Prado (https://bv.fapesp.br/en/pesquisador/3487/paulo-inacio-de-knegt-lopez-de-prado), a professor at IB-USP and last author of the study, restored forest remnants can attract billions of dollars in investment relating to carbon credits. "Degraded forests should no longer be considered a liability. They're an opportunity to attract investment, create jobs and conserve what still remains of the Atlantic Rainforest," he said.

Lima believes this could be an attractive strategy for landowners in protected areas of the biome. "There's no need to reduce the amount of available arable land. Instead, we should increase the biomass in forest fragments, recouping part of the cost of restoration in the form of carbon credits," he said. "There will be no future for the Atlantic Rainforest without the owners of private properties. Only 9% of the remaining forest fragments are on state-owned land."

Database

According to Lima, the study began during his postdoctoral research, which was supported by São Paulo Research Foundation - FAPESP (https://bv.fapesp.br/en/bolsas/145695) and supervised by Prado. The aim was to identify the key factors that determine biodiversity and biomass loss in remnants of Atlantic Rainforest. "We found human action to be a major factor," he said. "We considered activities such as logging, hunting, and invasion by exotic species, as well as the indirect effects of forest fragmentation."

The data obtained from the 1,819 forest inventories used in the research is stored in a repository called TreeCo (http://labtrop.ib.usp.br/doku.php?id=projetos:treeco:start), short for Neotropical Tree Communities. Lima developed the database during his postdoctoral fellowship and still runs it. Its contents are described in an article published in Biodiversity and Conservation (https://link.springer.com/article/10.1007/s10531-015-0953-1). It is open to other research groups interested in sharing data on Neotropical forests.

"The repository became a byproduct of my postdoctoral project, and more than ten PhD and master's candidates are using it in their research," Lima said.

Credit: 
Fundação de Amparo à Pesquisa do Estado de São Paulo

After Hurricane Irma, soundscape reveals resilient reef ecosystem

A new study from North Carolina State University reveals that the soundscapes of coral reef ecosystems can recover quickly from severe weather events such as hurricanes. The work also demonstrates that non-invasive monitoring is an important tool in shedding further light on these key ecosystems.

Soundscape ecology is a relatively new way for researchers to keep tabs on a variety of habitats without direct interference. In underwater habitats like coral reefs, soundscapes allow continual monitoring of an ecosystem that is difficult to access. By deploying underwater microphones, or hydrophones, researchers can get an acoustic picture of the types of animals in the ecosystem, as well as their behavior patterns.

Kayelyn Simmons, a Ph.D. student at NC State, used soundscapes and underwater mapping to monitor two different reef sites in the Florida Keys from February to December 2017. She deployed and collected eight hydrophones every three months between the two sites: a pristine reef located at Eastern Sambo, and a fishing site located at Western Dry Rocks.

Hurricane Irma struck the Florida Keys as a Category 4 storm in September 2017. Simmons was able to retrieve two of the hydrophones - one from each site - in December. Unfortunately, the hydrophone retrieved from Western Dry Rocks had been compromised by the storm, rendering its post-storm data unusable.

"Prior to the hurricane, we were able to determine what the 'normal' sound patterns were in each habitat, so we knew what the baselines were in terms of species and behavior," Simmons says. "You can tell which species are present based on where their sounds are on the frequency band. Similarly, the amount of noise from each species can give you an idea of their numbers. So the soundscape is a good way to measure abundance and diversity."

Each study site had the same species present. For example, snapping shrimp, with their high frequency "Rice Krispies in milk" popping noises, were active in the periods between dusk and dawn; while grunts, grouper and snapper, with sounds in the lower frequency bands, were mainly active during the day. The hydrophones also captured spawning activity during the full moon.

Simmons analyzed the sound captured by the surviving Eastern Sambo hydrophone and discovered that even though the reef suffered physical damage from the hurricane, the residents and their activity levels began returning to normal approximately 24 to 48 hours after the storm passed.

"The acoustic energy exposure for the reef was as loud as a small boat circling in one spot for two weeks," Simmons says. "So we didn't record any fish noises during the four-day period that Irma came through due to acoustic masking from the storm. However, the snapping shrimp were back to pre-storm sound levels within 24 hours. The fish noises on the lower frequency were back within 72. And on the next full moon we heard normal spawning behavior.

"Overall, the research shows that the coral reef soundscape was resilient and able to recover from the storm quickly."

Credit: 
North Carolina State University

Parker Solar Probe offers stunning view of Venus

image: When flying past Venus in July 2020, Parker Solar Probe's WISPR instrument, short for Wide-field Imager for Parker Solar Probe, detected a bright rim around the edge of the planet that may be nightglow -- light emitted by oxygen atoms high in the atmosphere that recombine into molecules in the nightside. The prominent dark feature in the center of the image is Aphrodite Terra, the largest highland region on the Venusian surface. Bright streaks in WISPR, such as the ones seen here, are typically caused by a combination of charged particles -- called cosmic rays -- sunlight reflected by grains of space dust, and particles of material expelled from the spacecraft's structures after impact with those dust grains. The number of streaks varies along the orbit or when the spacecraft is traveling at different speeds, and scientists are still in discussion about the specific origins of the streaks here. The dark spot appearing on the lower portion of Venus is an artifact from the WISPR instrument.

Image: 
NASA/Johns Hopkins APL/Naval Research Laboratory/Guillermo Stenborg and Brendan Gallagher

NASA's Parker Solar Probe captured stunning views of Venus during its close flyby of the planet in July 2020.

Though Parker Solar Probe's focus is the Sun, Venus plays a critical role in the mission: The spacecraft whips by Venus a total of seven times over the course of its seven-year mission, using the planet's gravity to bend the spacecraft's orbit. These Venus gravity assists allow Parker Solar Probe to fly closer and closer to the Sun on its mission to study the dynamics of the solar wind close to its source.

But -- along with the orbital dynamics -- these passes can also yield some unique and even unexpected views of the inner solar system. During the mission's third Venus gravity assist on July 11, 2020, the onboard Wide-field Imager for Parker Solar Probe, or WISPR, captured a striking image of the planet's nightside from 7,693 miles away.

WISPR is designed to take images of the solar corona and inner heliosphere in visible light, as well as images of the solar wind and its structures as they approach and fly by the spacecraft. At Venus, the camera detected a bright rim around the edge of the planet that may be nightglow -- light emitted by oxygen atoms high in the atmosphere that recombine into molecules in the nightside. The prominent dark feature in the center of the image is Aphrodite Terra, the largest highland region on the Venusian surface. The feature appears dark because of its lower temperature, about 85 degrees Fahrenheit (30 degrees Celsius) cooler than its surroundings.

That aspect of the image took the team by surprise, said Angelos Vourlidas, the WISPR project scientist from the Johns Hopkins Applied Physics Laboratory (APL) in Laurel, Maryland, who coordinated a WISPR imaging campaign with Japan's Venus-orbiting Akatsuki mission. "WISPR is tailored and tested for visible light observations. We expected to see clouds, but the camera peered right through to the surface."

"WISPR effectively captured the thermal emission of the Venusian surface," said Brian Wood, an astrophysicist and WISPR team member from the U.S. Naval Research Laboratory in Washington, D.C. "It's very similar to images acquired by the Akatsuki spacecraft at near-infrared wavelengths."

This surprising observation sent the WISPR team back to the lab to measure the instrument's sensitivity to infrared light. If WISPR can indeed pick up near-infrared wavelengths of light, the unforeseen capability would provide new opportunities to study dust around the Sun and in the inner solar system. If it can't pick up extra infrared wavelengths, then these images -- showing signatures of features on Venus' surface -- may have revealed a previously unknown "window" through the Venusian atmosphere.

"Either way," Vourlidas said, "some exciting science opportunities await us."

For more insight into the July 2020 images, the WISPR team planned a set of similar observations of the Venusian nightside during Parker Solar Probe's latest Venus flyby on Feb. 20, 2021. Mission team scientists expect to receive and process that data for analysis by the end of April.

"We are really looking forward to these new images," said Javier Peralta, a planetary scientist from the Akatsuki team, who first suggested a Parker Solar Probe campaign with Akatsuki, which has been orbiting Venus since 2015. "If WISPR can sense the thermal emission from the surface of Venus and nightglow -- most likely from oxygen -- at the limb of the planet, it can make valuable contributions to studies of the Venusian surface."

Credit: 
NASA/Goddard Space Flight Center

Ancient skeletal hand could reveal evolutionary secrets

Evolutionary expert Charles Darwin and others recognized a close evolutionary relationship between humans, chimps and gorillas based on their shared anatomies, raising some big questions: how are humans related to other primates, and exactly how did early humans move around? Research by a Texas A&M University professor may provide some answers.

Thomas Cody Prang, assistant professor of anthropology, and colleagues examined the skeletal remains of Ardipithecus ramidus ("Ardi"), dated to 4.4 million years old and found in Ethiopia. One of Ardi's hands was exceptionally well-preserved.

The researchers compared the shape of Ardi's hand to hundreds of other hand specimens representing recent humans, apes and monkeys (measured from bones in museum collections around the world) to draw inferences about the kind of locomotor behavior used by the earliest hominins (fossil human relatives).

The results provide clues about how early humans began to walk upright and make similar movements that all humans perform today.

This discovery is described in a study published in the current issue of Science Advances.

"Bone shape reflects adaptation to particular habits or lifestyles - for example the movement of primates - and by drawing connections between bone shape and behavior among living forms, we can make inferences about the behavior of extinct species, such as Ardi, that we can't directly observe," Prang said.

"Additionally, we found evidence for a big evolutionary 'jump' between the kind of hand represented by Ardi and all later hominin hands, including that of Lucy's species (a famous 3.2 million-year-old well-preserved skeleton found in the same area in the 1970s). This 'evolutionary jump' happens at a critical time when hominins are evolving adaptations to a more human-like form of upright walking, and the earliest evidence for hominin stone-tool manufacture and stone-tool use, such as cut-marks on animal fossils, are discovered."

Prang said the fact that Ardi represents an earlier phase of human evolutionary history is important because it potentially shines light on the kind of ancestor from which humans and chimpanzees evolved.

"Our study supports a classic idea first proposed by Charles Darwin in 1871, when he had no fossils or understanding of genetics, that the use of the hands and upper limbs for manipulation appeared in early human relatives in connection with upright walking," he said. "The evolution of human hands and feet probably happened in a correlated fashion."

Since Ardi is such an ancient species, it might retain skeletal features that were present in the last common ancestor of humans and chimpanzees. If this is true, it could help researchers place the origin of the human lineage - in addition to upright walking - into a clearer light.

"It potentially brings us one step closer to an explanation for how and why humans evolved our form of upright walking," Prang said.

He added that the big change in hand anatomy between Ardi and all later hominins occurs at a time, roughly between 4.4 and 3.3 million years ago, coinciding with the earliest evidence of the loss of a grasping big toe in human evolution. This also coincides with the earliest known stone tools and stone cut-marked animal fossils.

He said it appears to mark a major change in the lifestyle and behavior of human relatives within this timeframe.

"We propose that it involves the evolution of more advanced upright walking, which enabled human hands to be modified by the evolutionary process for enhanced manual manipulation, possibly involving stone tools," Prang said.

Credit: 
Texas A&M University

UM scientists achieve breakthrough in culturing coral and sea anemone cells

image: Pocillopora damicornis from Saboga Island, Panama growing in the Cnidarian Immunity Lab at the Rosenstiel School

Image: 
Mike Connelly

MIAMI--Researchers have perfected the recipe for keeping sea anemone and coral cells alive in a petri dish for up to 12 days. The new study, led by scientists at the University of Miami (UM) Rosenstiel School of Marine and Atmospheric Science, has important applications to study everything from evolutionary biology to human health.

Cnidarians are emerging model organisms for cell and molecular biology research. Yet successfully keeping their cells alive in a laboratory setting has proved challenging, due to contamination from the many microorganisms that live within these marine organisms, or because whole tissues do not survive in a culture environment.

UM cell biologist Nikki Traylor-Knowles and her team used two emerging model organisms in developmental and evolutionary biology--the starlet sea anemone (Nematostella vectensis) and cauliflower coral (Pocillopora damicornis)--to find a more successful way to grow these cell cultures in a laboratory setting.

James Nowotny, a recent UM graduate mentored by Traylor-Knowles at the time, tested 175 cell cultures from the two organisms and found that their cells can survive, on average, for 12 days if they receive an antibiotic treatment before being cultured.

"This is a real breakthrough," said Traylor-Knowles, an assistant professor of marine biology and ecology at the UM Rosenstiel School. "We showed that if you treat the animals beforehand and prime their tissues, you will get a longer and more robust culture to study the cell biology of these organisms."

"This is the first time that individual cells from all tissues of coral or sea anemones were shown to survive in cell culture for over 12 days," said Nowotny, who is currently a graduate student at the University of Maryland.

There are over 9,000 species in the phylum Cnidaria, which includes jellyfish, sea anemones, corals, Hydra, and sea fans. Due to several unique attributes, such as radial symmetry, stinging cells known as nematocytes, and a body built from two dermal cell layers, there is growing interest in using these animals to study key aspects of animal development.

"We can also now grow coral cells and use them in experiments that will help improve our understanding of their health in a very targeted way," said Traylor-Knowles.

Credit: 
University of Miami Rosenstiel School of Marine, Atmospheric, and Earth Science

New shape-changing 4D materials hold promise for morphodynamic tissue engineering

image: The 4D material changes shape in response to water. The grey side of the material in the image absorbs water faster than the blue side, causing it to bend into a "C" shape.

Image: 
Yu Bin Lee

New hydrogel-based materials that can change shape in response to physiological stimuli, such as water, could be the next generation of materials used to bioengineer tissues and organs, according to a team of researchers at the University of Illinois Chicago.

In a new paper published in the journal Advanced Functional Materials, the research team that developed the substances -- led by Eben Alsberg, the Richard and Loan Hill Professor of Biomedical Engineering -- shows that the unique materials can curl into tubes in response to water, making them good candidates for bioengineering blood vessels or other tubular structures.

In nature, embryonic development and tissue healing often involve a high concentration of cells and complex architectural and organizational changes that ultimately give rise to final tissue morphology and structure.

For tissue engineering, traditional techniques have involved, for example, culturing biodegradable polymer scaffolds with cells in biochambers filled with liquid nutrients that keep the cells alive. Over time, when provided with appropriate signals, the cells multiply in number and produce new tissue that takes on the shape of the scaffold as the scaffold degrades. For example, a scaffold in the shape of an ear seeded with cells capable of producing cartilage and skin tissue may eventually become a transplantable ear.

However, a geometrically static scaffold cannot enable the formation of tissues that dynamically change shape over time, or facilitate interactions with neighboring tissues that change shape. Traditional scaffolds also typically do not use or support a high density of cells.

"Using a high density of cells can be advantageous in tissue engineering as this enables increased cell-cell interactions that can promote tissue development," said Alsberg, who also is professor of orthopaedics, pharmacology and mechanical and industrial engineering at UIC.

Enter 4D materials, which are like 3D materials, but they change shape when they are exposed to specific environmental cues, such as light or water. These materials have been eyed by biomedical engineers as potential new structural substrates for tissue engineering, but most currently available 4D materials are not biodegradable or compatible with cells.

To take advantage of the promise of 4D materials for bioengineering applications, Alsberg and colleagues developed novel 4D materials based on gelatin-like hydrogels that change shape over time in response to the addition of water and are cell-compatible and biodegradable, making them excellent candidates for advanced tissue engineering. The hydrogels also support very high cell densities, so they can be heavily seeded with cells.

In the paper, the researchers describe how exposure to water causes the hydrogel scaffolds to swell as the water is absorbed. The amount of swelling can be tuned by, for example, changing aspects of the hydrogel material such as its degradation rate or the concentration of cross-linked polymers -- strands of protein or polysaccharide in this case -- that comprise the hydrogels. The higher the polymer concentration and the degree of crosslinking, the less water a given hydrogel absorbs, and the more slowly it absorbs it, yielding a smaller and slower change in shape.

The researchers found that by layering hydrogels with different properties like a stack of paper, the difference in water absorption between the layers will cause the hydrogel stack to bend into a 'C' shaped conformation. If the stack bends enough, a tubular shape is formed, which resembles structures like blood vessels and other tubular organs.

They also found that it is possible to calibrate the system to control the timing and the extent of shape change that occurred. The researchers were able to embed bone marrow stem cells into the hydrogel at very high density -- the highest density of cells ever recorded for 4D materials -- and keep them alive, a significant advance in bioengineering that has practical applications.

In the paper, the researchers described how the shape-changing cell-laden hydrogel could be induced to become bone- and cartilage-like tissues. 4D bioprinting of this hydrogel was also implemented to obtain unique configurations to achieve more complex 4D architectures.

"Using our bilayer hydrogels, we can not only control how much bending the material undergoes and its temporal progression, but because the hydrogels can support high cell densities, they more closely mimic how many tissues form or heal naturally," said Yu Bin Lee, a biomedical engineering postdoctoral researcher and first author on the paper. "This system holds promise for tissue engineering, but may also be used to study the biological processes involved in early development."

Credit: 
University of Illinois Chicago

Can polar bears and narwhals cling on as the ice shrinks?

As part of the Journal of Experimental Biology's Special Issue dedicated to climate change, Anthony Pagano (San Diego Zoo Global, USA) and Terrie Williams (University of California, Santa Cruz, USA) discuss the impact of environmental change on two iconic polar species: the polar bear and the narwhal. Their review article is published in Journal of Experimental Biology at https://jeb.biologists.org/content/224/Suppl_1.

Mammals in the Polar Regions face an uncertain future as unprecedented warming drives catastrophic sea ice loss. Polar bears are being driven onto land, where, having lost access to sea ice and the highly calorific seals upon which they feed, they must depend on lower-calorie diets. The scientists say, 'A polar bear would need to consume approximately 1.5 caribou, 37 Arctic char, 74 snow geese, 216 snow goose eggs (i.e. 54 nests with 4 eggs per clutch) or 3 million crowberries to equal the digestible energy available in the blubber of one adult ringed seal'. They add, 'Few resources exist on land within the polar bears' range that could compensate for declines in seal feeding opportunities'.
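Taking the quoted figures at face value, the equivalences can be back-calculated into per-item energy ratios (a toy illustration; the normalization of one seal's blubber to 1.0 is our own, not the authors'):

```python
# Back-calculated energy ratios implied by the quoted prey equivalences.
# One adult ringed seal's blubber is normalized to 1.0; counts are from the article.
seal_blubber = 1.0
equivalents = {
    "caribou": 1.5,
    "Arctic char": 37,
    "snow geese": 74,
    "snow goose eggs": 216,
    "crowberries": 3_000_000,
}
# Digestible energy of a single prey item, as a fraction of one seal's blubber:
per_item = {prey: seal_blubber / count for prey, count in equivalents.items()}

# The article's nest figure checks out: 216 eggs at 4 eggs per clutch is 54 nests.
nests = equivalents["snow goose eggs"] / 4
print(nests)                    # 54.0
print(per_item["crowberries"])  # one berry is a vanishingly small fraction of a seal
```

The point of the exercise is simply that no terrestrial prey item comes close to seal blubber on a per-item basis, which is the authors' argument in quantitative form.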

Pagano and Williams have measured the energetic cost of movement for narwhals and polar bears and found that major ice loss translates into locomotor costs that are 3- to 4-fold greater than expected when sea ice cover is normal. This increase in energy consumption, coupled with the loss of access to the polar bear's main food source, leaves the bears particularly vulnerable to starvation.

The scientists also consider how the high costs of diving for narwhals, coupled with the loss of the reliable breathing holes upon which they depend as sea ice shifts unpredictably, have led to the mammals becoming trapped beneath the ice. The narwhal's slow swimming pace also leaves it vulnerable to attacks by killer whales in open water.

The scientists warn that the decline of both apex predators will 'lead to rapid changes in the Arctic marine ecosystem'.

Credit: 
The Company of Biologists

Treating rheumatoid arthritis with micromotors

image: Hydrogen-propelled micromotors (illustration, top, and microscope image, bottom) improved rheumatoid arthritis symptoms when injected into rats' joints. Scale bar, 20 μm.

Image: 
Adapted from Nano Letters 2021, DOI: 10.1021/acs.nanolett.0c04438

Rheumatoid arthritis is a chronic inflammatory disorder marked by joint pain, swelling and damage. Although medications, such as steroids, anti-inflammatory drugs and immunosuppressants, can help slow joint destruction and relieve pain, they have side effects and aren't completely successful. Now, researchers reporting in ACS' Nano Letters have developed magnesium-based micromotors propelled by hydrogen bubbles, which improved rheumatoid arthritis symptoms when injected into the joints of rats.

Scientists have linked rheumatoid arthritis development to the excess production of reactive oxygen species (ROS). ROS can oxidize and degrade cartilage and bone, as well as activate the expression of inflammatory cytokines. A new type of therapy, hydrogen gas, can neutralize ROS and decrease inflammatory cytokine levels when given to patients in drinking water. However, the gas is poorly soluble in body fluids and quickly eliminated when given orally, limiting its therapeutic effects. Fei Peng, Yingfeng Tu, Yingjia Li and colleagues wanted to find a way to produce and deliver hydrogen gas directly inside an inflamed joint.

The researchers based their system on magnesium-based micromotors -- tiny spheres that react with water to produce hydrogen bubbles, which propel the motors. They coated the micromotors with hyaluronic acid, a joint lubricant, leaving a small opening for the magnesium to react with water. When placed in simulated joint fluid, the micromotors showed prolonged, sustained release of hydrogen bubbles and could move on their own. The team then injected the micromotors into the joints of rats that served as an animal model of rheumatoid arthritis and used ultrasound to visualize them. Compared with uninjected rats, the treated rats showed less-swollen paws, reduced bone erosion and lower levels of inflammatory cytokines. Although the micromotors still need to be tested in humans, they show great potential for the therapy of rheumatoid arthritis and other inflammatory diseases, the researchers say.
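
The propulsion chemistry is the familiar magnesium-water reaction, Mg + 2H2O -> Mg(OH)2 + H2, which yields one mole of hydrogen per mole of magnesium. A back-of-envelope sketch of the gas yield from a single micromotor; the particle size and full-reaction assumption are illustrative, not figures from the paper:

```python
import math

# Stoichiometry of the propulsion reaction: Mg + 2 H2O -> Mg(OH)2 + H2,
# i.e. one mole of H2 gas per mole of Mg consumed.
MG_MOLAR_MASS = 24.305   # g/mol
MOLAR_VOLUME_L = 24.45   # L/mol for an ideal gas at 25 degC, 1 atm

def hydrogen_volume_ml(mg_mass_g: float) -> float:
    """Volume of H2 (mL) released by fully reacting a mass of Mg (g)."""
    return mg_mass_g / MG_MOLAR_MASS * MOLAR_VOLUME_L * 1000

# Hypothetical 20-um-diameter Mg sphere (density 1.738 g/cm3):
radius_cm = 10e-4
mass_g = 1.738 * (4 / 3) * math.pi * radius_cm**3
print(f"{hydrogen_volume_ml(mass_g):.1e} mL H2 per micromotor")
```

The tiny per-particle volume is the point: the gas emerges as a slow stream of microbubbles that both propels the motor and neutralizes ROS locally.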

Credit: 
American Chemical Society

Machine learning tool can predict malignancy in patients with multiple pulmonary nodules

PHILADELPHIA - A machine learning-based tool was able to predict the risk of malignancy among patients presenting with multiple pulmonary nodules and outperformed human experts, previously validated mathematical models, and a previously established artificial intelligence tool, according to results published in Clinical Cancer Research, a journal of the American Association for Cancer Research.

Tools currently available can predict malignancy in patients with single nodules; predictive tools for patients presenting with multiple nodules are limited.

"With the widespread adoption of thoracic computed tomography (CT) for lung cancer screening, the detection of multiple pulmonary nodules has become increasingly common," said study author Kezhong Chen, MD, vice professor in the Department of Thoracic Surgery at Peking University People's Hospital in China. Among patients presenting with a pulmonary nodule on a CT scan in a previous lung cancer screening trial, roughly 50 percent presented with multiple nodules, Chen said.

"Current guidelines recommend the use of clinical models that incorporate nodule and sociodemographic features to estimate the probability of cancer prior to surgical treatment. While there are several tools for patients who present with a single nodule, no such tool currently exists for patients with multiple nodules, representing an urgent medical need," Chen added.

To address this unmet need, the researchers set out to develop a machine learning-based model to predict the probability of lung malignancy among patients presenting with multiple pulmonary nodules. First, the study authors used data from a training cohort of 520 patients (comprising 1,739 nodules) who were treated at Peking University People's Hospital between January 2007 and December 2018. Using both radiographical nodule characteristics and sociodemographic variables, the authors developed a model, termed PKU-M, to predict the probability of cancer. The performance of the model was evaluated by calculating the area under the curve (AUC), where a score of 1 corresponds to a perfect prediction. In the training cohort, the model achieved an AUC of 0.91. Some of the top predictive features of the model included nodule size, nodule count, nodule distribution, and age of the patient.
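
The AUC used throughout these comparisons has a simple ranking interpretation: it is the probability that a randomly chosen malignant nodule receives a higher predicted risk than a randomly chosen benign one. A minimal rank-based sketch on toy scores (not the PKU-M model or its data):

```python
# Rank-based AUC: the fraction of (malignant, benign) pairs in which the
# malignant nodule gets the higher predicted risk (ties count half).
def auc(scores, labels):
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy predicted risks; label 1 = malignant, 0 = benign.
scores = [0.9, 0.8, 0.7, 0.6, 0.4, 0.3, 0.2]
labels = [1,   1,   0,   1,   0,   0,   0]
print(auc(scores, labels))  # 11 of 12 pairs ranked correctly, ~0.917
```

An AUC of 1 means every malignant nodule outranks every benign one; 0.5 is no better than chance, which is why the PKU-M model's 0.87-0.91 is a strong result.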

The model was then validated using data from a cohort of 220 patients (comprising 583 nodules) who underwent surgical treatment in six independent hospitals in China and Korea between January 2016 and December 2018. The performance of the PKU-M model in this cohort was similar to its performance in the training cohort, with an AUC of 0.89. The researchers also compared the performance of their model with four prior logistic regression-based models that were developed for the prediction of lung cancer. The PKU-M model outperformed all four of the prior models, whose AUC values ranged from 0.68 to 0.81.

Finally, the researchers conducted a prospective comparison between the PKU-M model, three thoracic surgeons, a radiologist, and a previously established artificial intelligence tool for the diagnosis of lung cancer called RX. This comparison was conducted on a separate cohort of 78 patients (comprising 200 nodules) who underwent surgical treatment at four independent hospitals in China between January 2019 and March 2019. Consistent with its performance in the training and validation cohorts, the PKU-M model achieved an AUC of 0.87, higher than that of the surgeons (AUCs ranging from 0.73 to 0.79), the radiologist (AUC of 0.75), and the RX model (AUC of 0.76).

"The increasing detection rate of multiple pulmonary nodules has led to an emerging problem for lung cancer diagnosis," said study author Young Tae Kim, MD, PhD, professor in the Department of Thoracic and Cardiovascular Surgery at Seoul National University Hospital and the Seoul National University College of Medicine in the Republic of Korea. "Because many nodules are found to be benign either after long-term follow-up or surgery, it is important to carefully evaluate these nodules prior to invasive procedures. Our prediction model, which was exclusively established for patients with multiple nodules, can help not only mitigate unnecessary surgery but also facilitate the diagnosis and treatment of lung cancer."

"Models are developed to assist in clinical diagnosis, which means that they should be practical," said study author Jun Wang, MD, professor in the Department of Thoracic Surgery at Peking University People's Hospital. "We therefore designed a web-based version of the PKU-M model, where clinicians can input several clinical and radiological characteristics and the software will automatically calculate the risk of malignancy in a specific patient. This tool can quickly generate an objective diagnosis and can aid in clinical decision-making."

Because this study used data only from Asian patients, its findings may not be generalizable to Western or other populations, a limitation of the study.

Credit: 
American Association for Cancer Research

International team identifies 127 glaucoma genes in largest study of its kind

In the largest genome-wide association study of glaucoma, comparing the genes of 34,179 people with the disease to those of 349,321 control subjects, an international consortium of researchers identified 44 new gene loci and confirmed 83 previously reported loci linked to glaucoma. Loci can be thought of as "genetic street addresses," denoting specific locations in the genome.

The study's authors hope the identification of these genes will lead to new treatment targets for this incurable eye disease that is a leading cause of blindness worldwide.

"These new findings come out of the highest-powered genome-wide association study of glaucoma to date, and show the power of team science and using big data to answer questions when research groups around the world join forces," said co-senior study author Janey L. Wiggs, MD, PhD, Associate Chief of Ophthalmology Clinical Research at Mass Eye and Ear, and the Paul Austin Chandler Professor of Ophthalmology and Vice Chair of Clinical Research at Harvard Medical School. "The number of genes identified will lead to the discovery of new biological pathways that can lead to glaucoma, and in turn, new targets for therapeutics," added Dr. Wiggs, the Associate Director of the Ocular Genomics Institute at Harvard Medical School and a member of the National Academy of Medicine.

The findings were published February 24 in Nature Communications.

Glaucoma affects more than 75 million individuals worldwide, including about 3 million people in the United States, and these numbers are expected to increase with the aging population. Glaucoma causes irreversible damage to the eye's optic nerve. This damage is often painless and hard to detect as it begins, but over time it can lead to vision loss. Primary open angle glaucoma is one of the most common forms of the disease and is highly hereditary, but the genes involved have been poorly understood.

First cross-ancestry comparison of glaucoma genes

For the first time in a glaucoma genome-wide association study, a cross-ancestry comparison was performed looking at genetic data from people of European, African and Asian descent. The researchers found the majority of loci that contribute to glaucoma were consistent across all three groups. Previous studies had mostly looked at gene data from people of European descent.

"Glaucoma rates are highest in African and Asian ancestry groups, but the largest genetic studies of glaucoma in the past focused on people of European ancestry," said lead author Puya Gharahkhani, Associate Professor in the Statistical Genetics group at QIMR Berghofer Medical Research Institute in Australia. "Those studies showed genetic tests could be used to help identify who would benefit from sight-saving early monitoring or treatment, but because of the narrow scope of the genetic data, we weren't sure until now that the genetic indicators were true for people of different ancestries."

The cross-ancestry data improved fine-mapping of causal variants linked to glaucoma. By integrating multiple lines of genetic evidence, the researchers implicated previously unknown biological processes that might contribute to the development of the disease.

Genetic findings may lead to better testing and clinical trial targets

Future initiatives for the research group will focus on using these genetic loci to improve screening and diagnosis of glaucoma, and one day, to develop new treatments.

Previous studies from the group used genes identified to develop polygenic risk scores, which are estimates of a person's disease risk. This new study can add a larger collection of genetic variants to improve this risk score's specificity. Such information can be important for screening patients who might be at risk for glaucoma, and for allowing patients with glaucoma to better understand their disease course based on their genetic profile. Studies are planned to identify how patients at higher genetic risk fare clinically, such as determining if polygenic risk score is linked to a need for eye drops or surgery.
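
At its simplest, a polygenic risk score is a weighted sum of a person's risk-allele counts across the identified loci. A minimal sketch with hypothetical variants and effect sizes (not the study's actual loci or weights):

```python
# Hypothetical per-allele effect sizes (log-odds scale); a real score
# would use the effect sizes estimated in the association study.
effect_sizes = {"rs0001": 0.12, "rs0002": -0.05, "rs0003": 0.30}

def polygenic_risk_score(genotype):
    """genotype maps a variant id -> risk-allele count (0, 1 or 2)."""
    return sum(w * genotype.get(v, 0) for v, w in effect_sizes.items())

patient = {"rs0001": 2, "rs0002": 1, "rs0003": 1}
print(round(polygenic_risk_score(patient), 2))  # 0.12*2 - 0.05 + 0.30 = 0.49
```

Adding more validated loci, as this study does, tightens the score's spread between truly high- and low-risk individuals, which is what improves its specificity for screening.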

Another future avenue for the research is identifying new causal genes and mechanisms for glaucoma, and using that information to develop therapeutic approaches to target those genes. The risk loci identified include genes that are highly expressed in relevant eye tissues, nerves, arteries and other tissues related to glaucoma.

Current glaucoma treatments aim to reduce eye pressure in order to slow the disease's progression and prevent permanent damage to the optic nerve; however, no treatment can stop or cure glaucoma. While some current clinical trials are looking at treatments targeting certain genes, these new findings may increase the number of targets and enable more precise treatments.

"Glaucoma is one of the most strongly genetic human diseases, which is why we are looking at the genetic architecture of the disease to find clues on how to prevent and treat it," said Professor Stuart MacGregor, the head of QIMR Berghofer's Statistical Genetics group and co-senior researcher on the study. "We're hopeful that understanding the biological processes and knowing which genes control them could help scientists develop new drugs in the future."

Credit: 
Mass Eye and Ear

Buckyballs on DNA for harvesting light

Organic molecules that capture photons and convert them into electricity have important applications for producing green energy. Light-harvesting complexes need two semiconductors: an electron donor and an acceptor. How well they work is measured by their quantum efficiency, the rate at which photons are converted into electron-hole pairs.

Quantum efficiency is lower than optimal if there is "self-quenching", where one molecule excited by an incoming photon donates some of its energy to an identical non-excited molecule, yielding two molecules at an intermediate energy state too low to produce an electron-hole pair. But if electron donors and acceptors are better spaced out, self-quenching is limited, so that quantum efficiency improves.
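
The trade-off can be sketched numerically: every self-quenching event sacrifices an excitation that would otherwise have produced an electron-hole pair. The quench fractions below are illustrative, not measured values:

```python
# Quantum efficiency = electron-hole pairs / photons absorbed. In this
# toy model a quenched excitation shares its energy with a neighbouring
# dye and is lost; only unquenched excitations yield a pair.
def quantum_efficiency(photons_absorbed, quench_fraction):
    pairs = photons_absorbed * (1 - quench_fraction)
    return pairs / photons_absorbed

print(quantum_efficiency(1000, 0.40))  # densely packed dyes: heavy self-quenching
print(quantum_efficiency(1000, 0.05))  # dyes spaced out on a scaffold
```

Spacing the donors and acceptors apart, which is exactly what the DNA scaffold below achieves, lowers the quench fraction and raises the efficiency.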

In a new paper in Frontiers in Chemistry, researchers from the Karlsruhe Institute of Technology (KIT) synthesize a novel type of organic light-harvesting supramolecule based on DNA. The double helix of DNA acts as a scaffold to arrange chromophores (i.e. fluorescent dyes) - which function as electron donors - and "buckyballs" - electron acceptors - in three dimensions to avoid self-quenching.

"DNA is an attractive scaffold for building light-harvesting supramolecules: its helical structure, fixed distances between nucleobases, and canonical base pairing precisely control the position of the chromophores. Here we show that carbon buckyballs, bound to modified nucleosides inserted into the DNA helix, greatly enhance the quantum efficiency. We also show that the supramolecule's 3D structure persists not only in the liquid phase but also in the solid phase, for example in future organic solar cells," says lead author Dr Hans-Achim Wagenknecht, Professor for Organic Chemistry at Karlsruhe Institute of Technology (KIT).

DNA provides regular structure, like beads on a helical string

As a scaffold, Wagenknecht and colleagues used single-stranded DNA: deoxyadenosine (A) and thymine (T) strands 20 nucleotides long. This length was chosen because theory suggests that shorter DNA oligonucleotides wouldn't assemble in an orderly way, while longer ones wouldn't be soluble in water. The chromophores were violet-fluorescent pyrene and red-fluorescent Nile red molecules, each bound noncovalently to a single synthetic uracil (U)-deoxyribose nucleoside. Each nucleoside was base-paired to the DNA scaffold, but the order of pyrenes and Nile reds was left to chance during self-assembly.

For the electron acceptors, Wagenknecht et al. tested two forms of "buckyballs" - also called fullerenes - which are known to have an excellent capacity for "quenching" (accepting electrons). Each buckyball was a hollow globe built from interlocking rings of five or six carbon atoms, for a total of 60 carbons per molecule. The first form of buckyball tested binds nonspecifically to the DNA through electrostatic charges. The second form - not previously tested as an electron acceptor - was covalently bound via a malonic ester to two flanking U-deoxyribose nucleosides, which allowed it to be base-paired to an A nucleotide on the DNA.

High quantum efficiency, including in solid phase

The researchers confirmed experimentally that the 3D structure of the DNA-based supramolecule persists in the solid phase: a crucial requirement for applications in solar cells. To this end, they tested supramolecules with either form of buckyball as the active layer in a miniature solar cell. The constructs showed excellent charge separation - the formation of a positive hole and a negative electron charge in the chromophore and their acceptance by nearby buckyballs - with either form of buckyball, but especially with the second. The authors attribute this to the second form's more specific binding to the DNA scaffold through canonical base-pairing, which should result in a smaller distance between buckyball and chromophore. This makes the second form the better choice for use in solar cells.

Importantly, the authors also show that the DNA-dye-buckyball supramolecule has strong circular dichroism, that is, it responds much more strongly to left- than to right-handed circularly polarized light, due to its complex 3D helical structure - even in the solid phase.

"I don't expect that everyone will have solar cells with DNA on their roof soon. But the chirality of DNA will be interesting: DNA-based solar cells might sense circularly polarized light in specialized applications," concludes Wagenknecht.

Credit: 
Frontiers

Measuring carbon nanotubes taken up by plants

image: Lettuce after exposure to carbon nanotubes.

Image: 
Kamol Das

Carbon nanotubes are tiny. They can be a hundred thousand times smaller than the width of a human hair. But they have huge potential.

Products manufactured using carbon nanotubes include rebar for concrete, sporting goods, wind turbines, and lithium batteries, among others.

Potential uses of carbon nanotubes could extend to diverse fields, such as agriculture, biomedicine and space science.

But as we use more carbon nanotubes to make things, we also increase the chances that these nanotubes enter different environments and ecosystems.

"That makes it important to understand how carbon nanotubes behave in these environments," says Yu Yang, a member of the Soil Science Society of America.

In a new study, Yang and his colleagues describe a way to measure levels of a specific kind of carbon nanotube in plant tissues. Their research was recently published in Journal of Environmental Quality.

Carbon nanotubes may make their way into agricultural fields and food products. There, they can pose a threat to human and environmental health.

"Knowing how to measure carbon nanotubes in the environment is crucial to understanding their environmental fate and effects," says Yang.

To mimic the nanotubes in the environment, Yang and colleagues grew hydroponic lettuce in the presence of carbon nanotubes. Then they analyzed the lettuce leaves for traces of carbon nanotubes.

Yang found this method could detect small amounts of carbon nanotubes in the leaves, stems and roots of the lettuce plants.

"We have developed a method to address the challenging issue of quantifying carbon nanomaterials in the environment," says Yang. "These findings can help guide the sustainable application of carbon nanotubes in natural environments."

The challenge in measuring carbon nanotubes in the environment is that they are made of carbon. All living things on Earth - including humans and plants - have carbon as a key building block.

The task Yang and colleagues faced was to distinguish carbon in living material from carbon in carbon nanotubes.

A single layer of carbon atoms arranged in a honeycomb pattern is called graphene. A carbon nanotube is a sheet of graphene rolled into a tiny cylinder.

Carbon nanotubes made of a single sheet of graphene are called single-walled nanotubes. Layering multiple tubes within others yields multi-walled carbon nanotubes.

Scientists can add different molecules to carbon nanotubes. Adding these molecules can change their characteristics. They might dissolve more easily in solvents, for example.

"Carbon nanotubes with molecules added on could be used in the fabrication of nanocomposites, biomedicine, and chemical or biological probes," says Yang.

In previous research, Yang's group quantified multi-walled carbon nanotubes in plants. But no one had measured whether this kind of carbon nanotube, with a specific molecule added, gets into plants.

The researchers used a technique called programmed thermal analysis. In this approach, materials are heated in a controlled manner in different environments - with or without oxygen, for example.

How different materials react to being heated in different environments can provide big clues about these materials.

Yang and colleagues found they could use programmed thermal analysis to detect the carbon in the nanotubes. Using these data, they could also tell apart the carbon in carbon nanotubes from the carbon in plants.
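
Conceptually, the separation works because plant biomass and nanotube carbon oxidize, releasing their carbon, in different temperature windows. A toy sketch of attributing the evolved carbon by temperature; all temperatures and signal values below are hypothetical:

```python
# Toy thermogram: (oven temperature in degC, carbon evolved in ug).
# Plant biomass burns off at lower temperatures than the more refractory
# nanotubes, so signal above an onset temperature is assigned to CNTs.
thermogram = [
    (300, 40.0), (400, 35.0), (500, 5.0),   # mostly plant carbon
    (600, 12.0), (700, 9.0),                # mostly nanotube carbon
]
CNT_ONSET_C = 550  # assumed oxidation onset for the nanotubes

plant_c = sum(c for t, c in thermogram if t < CNT_ONSET_C)
cnt_c = sum(c for t, c in thermogram if t >= CNT_ONSET_C)
print(f"plant carbon: {plant_c} ug, nanotube carbon: {cnt_c} ug")
```

The real method must also handle overlap between the two windows, which is part of what makes quantifying carbon nanomaterials in living tissue challenging.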

This is the first study to measure levels of this kind of carbon nanotube in plants using thermal analysis. "That's crucial for understanding carbon nanotube fate in the environment and estimating potential human exposure," says Yang.

Yang is now working on detecting even smaller amounts of carbon nanotubes in the environment.

"We also want to try to measure carbon nanotubes with different molecules added on," says Yang. He also plans to expand test materials beyond lettuce plants. "We want to test this approach in different environments."

Ultimately, the goal is to advance the application of carbon nanotubes. "Being able to accurately measure carbon nanotubes in the environment can promote their sustainable use," says Yang.

Credit: 
American Society of Agronomy

Mushrooms add important nutrients when included in the typical diet

image: First dietary modeling analysis of all three USDA Food Patterns investigates the effects of adding a serving of mushrooms

Image: 
Mushroom Council

February 24, 2021 - The second study published in as many months has identified another reason to add more mushrooms to the recommended American diet. The new research, published in Food & Nutrition Research (February 2021), found that adding mushrooms to U.S. Department of Agriculture (USDA) Food Patterns increased several micronutrients, including shortfall nutrients, while having minimal to no impact on overall calories, sodium or saturated fat.

Dr. Victor L. Fulgoni III and Dr. Sanjiv Agarwal looked at the nutritional effect of substituting an 84-gram serving of mushrooms for a serving of various foods that the 2015-2020 U.S. Dietary Guidelines recommend moderating, examining nutrient profiles in USDA's Healthy US-style, Mediterranean-style and Vegetarian Eating Patterns. This approach is similar to the one the USDA used in determining its Dietary Guidelines. For the mushroom serving, researchers looked at a composite of white, crimini and portabella mushrooms at a 1:1:1 ratio; one scenario including UV-light-exposed mushrooms; and one scenario including oyster mushrooms.

"Simply adding an 84-gram serving, or what would be the equivalent of 5 medium white mushrooms, to USDA Food Patterns increased several shortfall nutrients including potassium as well as other B vitamins and minerals and had minimal to no impact on overall calories, sodium or saturated fat," said Dr. Fulgoni.

Depending on the pattern type and calorie level, key findings include:

The addition of a serving (84 g) of mushrooms to the diet resulted in an increase in potassium (8%-12%), copper (16%-26%), selenium (11%-23%), riboflavin (12%-18%) and niacin (11%-26%), but had no impact on calories, carbohydrate, fat or sodium.

The addition of a serving (84 g) of oyster mushrooms increased vitamin D (8%-11%) and choline (10%-16%) in USDA Food Patterns.

Mushrooms exposed to UV-light to increase vitamin D levels to 200 IU/serving also increased vitamin D by 67%-90% in USDA Food Patterns.

A composite of white, crimini and portabella mushrooms at a 1:1:1 ratio would be expected to add 2.24 mg ergothioneine and 3.53 mg glutathione, while oyster mushrooms would provide 24.0 mg ergothioneine and 12.3 mg glutathione. (Note: neither the USDA Food Patterns nor USDA FoodData Central includes analytical data for either of these antioxidants at this time.)
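
The percentage increases above are simple ratios of per-serving nutrient content to baseline daily intake. A sketch of that calculation; the baseline intakes and per-serving amounts below are rough illustrative assumptions, not the study's data:

```python
# Percent increase from one 84-g mushroom serving over an assumed
# baseline daily intake; all values are illustrative placeholders.
baseline_mg = {"potassium": 2600, "copper": 1.1, "riboflavin": 1.1}
per_serving_mg = {"potassium": 270, "copper": 0.27, "riboflavin": 0.17}

for nutrient in baseline_mg:
    pct = 100 * per_serving_mg[nutrient] / baseline_mg[nutrient]
    print(f"{nutrient}: +{pct:.0f}%")
```

Because a mushroom serving adds almost no calories, sodium or saturated fat, the denominators for those values barely move, which is why the modeled impact there is minimal to zero.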

Results Mirror Similar Modeling Study

Drs. Fulgoni and Agarwal also modeled the addition of mushrooms to National Health and Nutrition Examination Survey (NHANES) 2011-2016 dietary data, looking at a composite of white, crimini and portabella mushrooms at a 1:1:1 ratio; one scenario including UV-light-exposed mushrooms; and one scenario including oyster mushrooms, for ages 9-18 years and 19+ years, based on an 84-g or ½-cup equivalent serving. Similar to the USDA Food Patterns, the NHANES data showed that the addition of a serving (84 g) of mushrooms to the diet resulted in an increase in dietary fiber (5%-6%), copper (24%-32%), phosphorus (6%), potassium (12%-14%), selenium (13%-14%), zinc (5%-6%), riboflavin (13%-15%), niacin (13%-14%), and choline (5%-6%) in both adolescents and adults, but had no impact on calories, carbohydrate, fat or sodium.

Looking specifically at vitamin D, the study shows that when commonly consumed mushrooms are exposed to UV light to provide 5 mcg vitamin D per serving, vitamin D intake could meet and slightly exceed the recommended daily value (98%-104%) for both the 9-18-year and 19+-year groups, as well as decrease inadequacy of this shortfall nutrient in the population. In addition, a serving of UV-light-exposed commonly consumed mushrooms decreased population inadequacy for vitamin D from 95.3% to 52.8% for the 9-18-year age group and from 94.9% to 63.6% for the 19+-year age group.

Mushrooms' Role in the Dietary Guidelines

Mushrooms are fungi - members of a third food kingdom, biologically distinct from the plant- and animal-derived foods that comprise the USDA Food Patterns - with a unique nutrient profile that provides nutrients common to both plant and animal foods. Although classified as a vegetable in food grouping systems, mushrooms are increasingly used in main entrees in plant-based diets, supporting consumers' efforts to follow food-based dietary guidance that recommends lowering intake of calories, saturated fatty acids and sodium while increasing intake of under-consumed nutrients, including fiber, potassium and vitamin D.

When considering mushrooms' role in diet quality and in helping consumers achieve healthy eating patterns, a previous analysis of NHANES 2001-2010 data found that mushroom intake was associated with higher intakes of several key nutrients and thus better diet quality. However, intake was low - about 21 g per day among mushroom consumers. Because of mushrooms' culinary versatility and unique nutrient profile, greater recognition of mushrooms in dietary guidance is an opportunity to improve diet quality, particularly by increasing consumption of vegetables.

"Results from this current research on modeling the nutritional impact of mushrooms on USDA healthy eating patterns are now available for consideration by the 2025-2030 Dietary Guidelines Advisory Committee," said Mary Jo Feeney, MS, RD, FADA and nutrition research coordinator to the Mushroom Council.

Mushrooms: A Nutrient Powerhouse

Often grouped with vegetables, mushrooms provide many of the nutrient attributes of produce, as well as attributes more commonly found in meat, beans or grains. According to the USDA's FoodData Central , one serving (5 medium/90g) of white, raw mushrooms contains 20 calories, 0g fat, 3g protein and is very low in sodium (0mg/

More Research from the Mushroom Council Still to Come

With mushrooms growing in awareness and consideration among consumers nationwide, in 2019, the Mushroom Council made a $1.5 million multi-year investment in research to help broaden understanding of the food's nutritional qualities and overall health benefits.

In addition to the analysis of mushrooms for bioactives/ergothioneine for inclusion in USDA FoodData Central database, additional research projects approved include:

Health promoting effects of including mushrooms as part of a healthy eating pattern.

Mushrooms' relationship with cognitive health in older adults.

Mushrooms' impact on brain health in an animal model.

Since 2002, the Council has conducted research that supports greater mushroom demand by discovering nutrient and health benefits of mushrooms. Published results from these projects form the basis for communicating these benefits to consumers and health influencers.

Credit: 
FLM Harvest

Quantum systems learn joint computing

image: This picture shows the two qubit modules (red atom between two blue mirrors) that have been interconnected to implement a basic quantum computation (depicted as light blue symbol) over a distance of 60 meters. The modules reside in different laboratories of the same building and are connected by an optical fiber. The computation operation is mediated by a single photon (flying red sphere) that interacts successively with the two modules.

Image: 
Stephan Welte, Severin Daiss (MPQ)

Today's quantum computers contain up to several dozen memory and processing units, the so-called qubits. Severin Daiss, Stefan Langenfeld, and colleagues from the Max Planck Institute of Quantum Optics in Garching have successfully interconnected two such qubits, located in different labs, into a distributed quantum computer by linking them with a 60-meter-long optical fiber. Over this distance they realized a quantum-logic gate - the basic building block of a quantum computer. This makes the system the world's first prototype of a distributed quantum computer.

The limitations of previous qubit architectures

Quantum computers are considerably different from traditional "binary" computers: future realizations of them are expected to easily perform specific calculations for which traditional computers would take months or even years - for example in the field of data encryption and decryption. While the performance of binary computers results from large memories and fast computing cycles, the success of the quantum computer rests on the fact that a single memory unit - a quantum bit, or "qubit" - can contain superpositions of different possible values at the same time. A quantum computer therefore does not calculate just one result at a time, but many possible results in parallel. The more qubits are interconnected in a quantum computer, the more complex calculations it can perform.

The basic computing operations of a quantum computer are quantum-logic gates between two qubits. Such an operation changes - depending on the initial state of the qubits - their quantum mechanical states. For a quantum computer to be superior to a normal computer for various calculations, it would have to reliably interconnect many dozens or even thousands of qubits for equally many thousands of quantum operations. Despite great successes, all laboratories are still struggling to build such a large and reliable quantum computer, since every additionally required qubit makes it much harder to build a quantum computer in a single set-up. Qubits are implemented, for instance, with single atoms, superconductive elements, or light particles, all of which need to be isolated perfectly from each other and from the environment. The more qubits are arranged next to one another, the harder it is to simultaneously isolate them and control them from the outside.

Data line and processing unit combined

One way to overcome the technical difficulties in the construction of quantum computers is presented in a new study in the journal Science by Severin Daiss, Stefan Langenfeld and colleagues from the research group of Gerhard Rempe at the Max Planck Institute of Quantum Optics in Garching. In this work supported by the Institute of Photonic Sciences (Castelldefels, Spain), the team succeeded in connecting two qubit modules across a 60-meter distance in such a way that they effectively form a basic quantum computer with two qubits. "Across this distance, we perform a quantum computing operation between two independent qubit setups in different laboratories," Daiss emphasizes. This enables the possibility to merge smaller quantum computers to a joint processing unit.

Simply coupling distant qubits to generate entanglement between them has been achieved in the past, but now the connection can additionally be used for quantum computations. For this purpose, the researchers employed modules consisting of a single atom as a qubit positioned between two mirrors. Between these modules they send a single light quantum, a photon, transported in the optical fiber. This photon is entangled with the quantum states of the qubits in the different modules. Subsequently, the state of one of the qubits is changed according to the measured state of the "ancilla photon", realizing a quantum mechanical CNOT operation with a fidelity of 80 percent. A next step would be to connect more than two modules and to host more qubits in the individual modules.
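
In the ideal case the distributed gate implements a standard CNOT, which flips the target qubit exactly when the control qubit is |1>. A minimal sketch of its action on a superposition input, ignoring the photonic link and the reported 80 percent fidelity:

```python
import math

# A two-qubit state as four amplitudes over the basis |00>, |01>, |10>, |11>.
# CNOT flips the target when the control (first qubit) is 1, i.e. it
# swaps the |10> and |11> amplitudes.
CNOT = [[1, 0, 0, 0],
        [0, 1, 0, 0],
        [0, 0, 0, 1],
        [0, 0, 1, 0]]

def apply(gate, state):
    return [sum(gate[i][j] * state[j] for j in range(4)) for i in range(4)]

# Control in the superposition (|0> + |1>)/sqrt(2), target in |0>:
amp = 1 / math.sqrt(2)
state = [amp, 0.0, amp, 0.0]          # the product state |+0>

bell = apply(CNOT, state)             # (|00> + |11>)/sqrt(2): entangled
print([round(a, 3) for a in bell])    # amplitudes on |00> and |11> only
```

Applied to a superposition, the gate produces exactly the kind of entangled Bell state that makes it a universal building block; the experiment realizes this operation with the photon mediating between the two distant atoms.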

Higher performance quantum computers through distributed computing

Team leader and institute director Gerhard Rempe believes the result will allow the technology to advance further: "Our scheme opens up a new development path for distributed quantum computing." It could enable, for instance, building a distributed quantum computer consisting of many modules with few qubits each, interconnected with the newly introduced method. This approach could circumvent the limitation that existing quantum computers face in integrating more qubits into a single setup and could therefore allow more powerful systems.

Credit: 
Max-Planck-Gesellschaft