Researchers find safeguards for quantum communications

Army researchers developed a new way to protect and safeguard quantum information, moving quantum networks a step closer to reality.

Quantum information science is a rapidly growing interdisciplinary field exploring new ways of storing, manipulating and communicating information. Researchers want to create powerful computational capabilities using new hardware that operates on quantum physics principles.

For the Army, these new quantum paradigms could lead to transformational capabilities for collecting, exchanging and processing vast amounts of information quickly, efficiently and securely on the dynamic battlefields of the future.

Drs. Dan Jones, Brian Kirby and Michael Brodsky from the U.S. Army Combat Capabilities Development Command's Army Research Laboratory, joined by Gabriele Riccardi and Professor Cristian Antonelli from the University of L'Aquila, studied sources of noise in quantum communication channels.

Noise is a common plague of any communication - anyone who has ever used a radio, a walkie-talkie or a phone has experienced noisy reception now and then, Brodsky said. Communication engineers devise intricate schemes to remove the noise and clean the transmitted signal as much as possible.

According to Brodsky, quantum communications are no different in their susceptibility to noise in communication channels. In fact, they are even more susceptible than regular classical communications because quantum signals are extremely low power.

"To engineer a useful quantum network, we need to understand how far, how fast and how reliably we could send quantum information," Brodsky said. "That requires understanding of the noise in communication channels."

As the team modeled, emulated, characterized and measured different types of noise in quantum channels, the researchers realized that while some quantum noise types are impossible to filter out, others could be removed quite easily.

Surprisingly, it turns out that the bad noise can be converted into good noise simply by adding a cheap extra component to the quantum channel. This extra control allows the researchers to tweak the channel and adjust the properties of the noise that masks the transmitted signal.
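
Recovering quantum information with a local operation has a textbook cousin: "Procrustean" local filtering, which probabilistically restores a partially degraded entangled pair. The sketch below works only that textbook example in NumPy, under an assumed input state; it is not the team's scheme, which is detailed in the New Journal of Physics paper cited below.

```python
import numpy as np

# "Procrustean" local filtering -- a textbook cousin of the local-filtering
# idea, NOT the team's actual scheme. A partially degraded entangled pair
# cos(theta)|00> + sin(theta)|11> is restored to a maximally entangled Bell
# state by a filter acting on one qubit only, at the price of the filter
# succeeding only some of the time.

theta = 0.3                                   # assumed asymmetry of the input pair
e00 = np.array([1.0, 0, 0, 0])
e11 = np.array([0, 0, 0, 1.0])
psi = np.cos(theta) * e00 + np.sin(theta) * e11
rho = np.outer(psi, psi)                      # input density matrix

# Local filter on qubit A: attenuate the over-weighted |0> amplitude.
F = np.diag([np.tan(theta), 1.0])
FA = np.kron(F, np.eye(2))                    # acts on qubit A, identity on B

rho_f = FA @ rho @ FA.T
p_success = np.trace(rho_f)                   # probability the filter succeeds
rho_f = rho_f / p_success

bell = np.array([1.0, 0, 0, 1.0]) / np.sqrt(2)
fidelity = bell @ rho_f @ bell                # overlap with (|00>+|11>)/sqrt(2)
print(f"success probability: {p_success:.3f}, post-filter fidelity: {fidelity:.3f}")
# -> fidelity 1.000: entanglement recovered on the trials where the filter clicks
```

The trade-off visible in the printout is the essence of local filtering: perfect fidelity is recovered, but only on the fraction of trials where the filter succeeds.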

The overall focus of the lab's Quantum Networking Group is to experimentally explore the most efficient and secure ways to create, store and process quantum information using state-of-the-art photonic technologies. The group's main workhorse is the quantum networking testbed it has built at the lab's headquarters in Adelphi, Maryland. Researchers use the testbed to test-drive various photonic approaches to the fast and robust delivery of quantum information over large distances.

"We approach our research quite uniquely by wearing system engineer hats," Jones said.

The group's research spans developing the architecture and operational principles of quantum networks, understanding and mapping the technological limitations to their practical implementation, and inventing methods and techniques to engineer around those limitations. The current results belong to the latter two categories.

The next projects in the pipeline focus on demonstrating an intriguing way to transmit quantum information completely error-free. Further down the line are plans to create a multi-user quantum network testbed deployed in the field and to demonstrate secure secret-sharing protocols between two distant metropolitan campuses.

The field of quantum information science is booming worldwide because it could deliver unsurpassed capabilities in computation, communication and networking. It offers new paradigms for handling information that would enable secure secret sharing, distributed network sensing and efficient decision making.

"Our research results are a step towards arming the warfighter of the future with quantum advantages and a good example of how operationalizing science results in transformational overmatch," Brodsky said.

The group summarized its research results in a paper, "Exploring classical correlation in noise to recover quantum information using local filtering," accepted by the peer-reviewed New Journal of Physics.

Credit: 
U.S. Army Research Laboratory

HKU study reveals the hidden fight within corals

image: Light and confocal images of Symbiodinium cells living in a host cell.

Image: 
Allisonmlewis / CC BY-SA

Researchers from the School of Biological Sciences and Swire Institute of Marine Science at the University of Hong Kong are working to understand how the coral symbiosis may respond to global warming through changes in its microbiome, specifically its symbiotic algae. Using a newly developed method, they revealed that the metabolic function of algae changes in response to competition with other species, which may be a determining factor in the success or failure of certain host-symbiont combinations.

The research, published in The ISME Journal, used single-celled algae (dinoflagellates) isolated from reef-building corals to understand how hotter ocean temperatures might influence their ability to compete against each other within their coral host. The work builds on decades of research that has homed in on certain algal species that confer heat resistance to their host. Why these heat-tolerant species are not more widespread has remained a mystery, until now.

"We know that the ability of corals to withstand warming oceans is related to their microbiome. You could say we are asking the same types of questions as a physician: Can we manipulate the host microbiome to improve coral health? Our paper demonstrates that the efficacy of probiotic treatments or assisted evolution might depend on how these microbes interact with each other" explains postdoctoral fellow Dr Shelby McIlroy who co-led the study with PhD student Jane Wong.

The experiments were conducted at two temperatures: a heated treatment to simulate a coral bleaching event and an unheated control. The researchers found that the heat-tolerant algae were poor competitors at both temperatures, adopting a "shelter-in-place" strategy of storing more fats and carbohydrates to persist through times of stress. At normal temperatures, the thermally sensitive species grew similarly whether the other species was present or not. With warming, however, competition triggered a marked increase in resource consumption, essentially restricting the availability of growth resources to competitors. The researchers suggest that thermally tolerant algae have failed to become more widespread because they are outcompeted in most scenarios and are simply the "last man standing" under conditions unsuitable for other species.

The researchers combined three established methods - Fluorescent In-Situ Hybridization (FISH), Flow Cytometry (Flow), and Stable Isotope Analysis (SIA) - to differentiate between two species of algae grown together in a mixed culture. After introducing isotopically labeled nutrients, the team allowed the cells to assimilate carbon and nitrogen before separating them for isotope analysis. In this way they could see whether one species was obtaining more resources for growth and reproduction than the other - evidence of competition. They called the method FFS.

"FFS is an exciting marriage of established methods. We applied it to an interesting question related to corals, but it can be adapted for any microbial community - such as the human gut. In doing so we can begin to assign metabolic functions to certain bacteria which are known to be present and may express certain genes but whose actual function remains unknown." said Dr David Baker, Associate Professor of School of Biological Sciences and Swire Institute of Marine Science who supervised the study.

Credit: 
The University of Hong Kong

Record efficiency for printed solar cells

image: Two-step roll-to-roll coating of perovskite thin films at Swansea University, where researchers from the SPECIFIC project have achieved record efficiency levels for printed solar cells.

Image: 
SPECIFIC/Swansea University

A new study reports the highest efficiency ever recorded for full roll-to-roll printed perovskite solar cells (PSCs), marking a key step on the way to cheaper and more efficient ways of generating solar energy.

A team at Swansea University's SPECIFIC Innovation and Knowledge Centre, led by Professor Trystan Watson, has reported using a roll-to-roll fabrication method for four layers of slot-die coated PSCs.

The PSCs gave a stable power output of 12.2% - the highest efficiency recorded to date for PSCs with four roll-to-roll printed layers.

Newcomers to the photovoltaic industry, PSCs have attracted remarkable attention from researchers around the globe. With efficiencies reaching levels similar to those of silicon photovoltaics (PV), the current market leader, attention has turned towards upscaling PSCs.

In contrast to silicon PV, which requires high temperature and high vacuum depositions, PSCs can be solution-processed at a low temperature, which significantly reduces the manufacturing cost.

Low temperature processing makes it possible to use plastic substrates to create flexible solar cells.

The ability to solution-process provides the opportunity to apply various well-developed printing and coating techniques:

Screen printing

Inkjet printing

Gravure printing

Slot-die coating

Spray coating

These advantages made it possible for Swansea University researchers to use roll-to-roll manufacturing for four layers of PSCs.

Slot-die coating provides several advantages over the alternatives: it is a pre-metred technique, which means the wet film thickness can be controlled before coating. It is also highly efficient in material usage, with minimal loss of material compared with spray coating or screen printing.
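
Because the technique is pre-metred, the wet film thickness follows directly from a mass balance: every bit of ink supplied to the head ends up on the moving substrate. Here is a back-of-envelope illustration in Python, with assumed numbers for flow rate, web speed and coating width rather than values from the study:

```python
# Mass balance behind the "pre-metred" property of slot-die coating: all ink
# pumped into the head ends up on the web, so the wet film thickness is fixed
# by the supply rate alone. The numbers below are assumptions, not values
# from the study.
flow_rate = 50e-9 / 60      # ink supply: 50 microlitres/min, in m^3/s
web_speed = 0.01            # substrate speed: 1 cm/s, in m/s
coat_width = 0.05           # width of the coated stripe: 5 cm, in m

wet_thickness = flow_rate / (web_speed * coat_width)   # metres
print(f"wet film thickness: {wet_thickness * 1e6:.2f} micrometres")  # ~1.67
```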

The solvents these inks normally require are toxic, and using them at an industrial scale demands extensive air handling to stay under safety limits, which can incur significant and unnecessary expense. For this reason, an acetonitrile-based system was used instead. This system also has a rheological advantage: its low viscosity and low surface tension result in better coatings.

Along with this, a ternary blend of solvents with high workplace exposure limits was introduced, replacing chlorobenzene for the deposition of the hole transport material. In this research, the PSCs gave a stable power output of 12.2%, the highest efficiency reported for PSCs with four roll-to-roll printed layers.

A complete solar cell for a chosen architecture requires coating five layers. In this case, four layers were coated using slot-die coating and the top contact was put on using thermal evaporation. Slot-die coating the fifth (top) contact without destroying any layers underneath has not yet been achieved. Solving this would enable the manufacture of a fully roll-to-roll printed PSC.

Rahul Patidar of SPECIFIC, lead researcher on the project, said:

"Perovskite solar cells aim to increase the efficiency and lower the cost of traditional solar energy generation. They have the potential to be highly efficient and relatively cheap to manufacture, so the aim is to improve fabrication methods for upscaling.

This study signifies the next step towards commercialisation."

Credit: 
Swansea University

Mirror image tumor treatment

Our immune system ought to be able to recognize and kill tumor cells. However, many tumors deceive the immune system. For example, they induce the so-called immune checkpoints of T-cells to shut down immune responses. In the journal Angewandte Chemie, scientists have now introduced a new approach for immunological tumor treatment. Their method is based on the specific blockade of an immune checkpoint by a stable "mirror-image" peptide.

T lymphocytes have a variety of immune checkpoints on their surface, some that crank up the immune system and others that suppress immune reactions when they "discover" suitable ligands on the surfaces of "checked" cells. One such immune checkpoint is the programmed cell death protein 1 (PD-1). If the PD-L1 ligand is bound to PD-1, the immune response is inhibited to prevent attack on healthy cells produced by the body. Unfortunately, many tumors "camouflage" themselves with a particularly large number of PD-L1, which protects them. Blocking the interaction between PD-1 and PD-L1 can normalize the cancer immunity in the microenvironment around tumors. However, previous therapeutic approaches had only limited success, and the tumors often developed resistance.

A newly discovered immune checkpoint known as TIGIT could provide an alternative point of attack. TIGIT reacts to a ligand named PVR with an immunosuppressive signal. A team of researchers at Zhengzhou University in Zhengzhou, Tsinghua University in Beijing, and Sun Yat-sen University in Shenzhen, led by Yanfeng Gao and Lei Liu, used RNA expression data from the Cancer Genome Atlas and the Gene Expression Omnibus dataset to discover that TIGIT is much more common than PD-1 in many tumors, including those resistant to anti-PD-1 therapy.

The researchers wanted to use a peptide as their new drug because peptides penetrate tissue more deeply than antibodies while achieving comparable affinities and specificities. They also cause significantly fewer undesired immunological side effects and are easier to produce. Their disadvantage is that they are rapidly broken down by proteases in the body. For this reason, the researchers decided to use "mirror-image" peptides, which are stable toward proteases. Amino acids can exist in the natural L configuration or its mirror image, the synthetic D configuration. D peptides made from D amino acids are significantly longer-lived than L peptides.

To find a suitable peptide, the researchers used the mirror-image phage display technique. In this method, very large numbers of different biotechnologically produced peptides are presented on the surfaces of phages (viruses that attack bacteria). Those that bind to the desired target molecule are selected and multiplied in bacteria. The phages then go through further selection cycles until only very strongly binding peptides remain. In mirror-image phage display, the peptides initially presented are L peptides, but they are selected against the mirror image of the target molecule. To this end, the researchers synthesized a portion of TIGIT in the D configuration. As the last step, they produced the mirror-image D version of the strongest-binding L peptide, which fit the key binding interface of the TIGIT/PVR complex perfectly.

The D-peptide developed by this technique, known as (D)-TBP-3, effectively blocks the interaction of TIGIT with PVR, is stable toward proteases, and inhibits growth and metastasis in anti-PD-1-resistant tumor models.

Credit: 
Wiley

Technique fishes valuable nutrients out of shrimp processing water

The seafood industry requires large amounts of water for food processing. Before used water is discharged, some organic matter, including protein, is typically removed. This sludge is usually landfilled or converted into biogas, which results in the valuable nutrients it contains being lost from the food chain. Now researchers report in ACS Sustainable Chemistry & Engineering a method to recover these nutrients from shrimp processing water so they can be incorporated in food or feed.

At present, food processing factories remove organic matter from water by first clumping it together with chemical treatments (coagulation) and then raising those clumps to the surface with a technique such as "dissolved air flotation" (DAF). Coagulation is traditionally carried out with iron or other non-food-grade flocculants that clean the water efficiently, but render the removed sludge unsuitable for food or feed purposes. One alternative is to filter the nutrients from the water using membranes, but the equipment is expensive and can clog. A more sustainable option is to switch to food-grade flocculants in combination with DAF. Although a few other studies have shown that such a combination could work, these were small-scale experiments. Ingrid Undeland and Bita Forghani of Chalmers University of Technology and colleagues wanted to scale up the combined food-grade flocculation-DAF process and assess the nutrient composition of the recovered biomass.

At a processing plant, the team treated shrimp processing water with alginate or carrageenan, edible flocculants derived from seaweed. The resulting particles were then collected via DAF and dried. The combination technique captured up to 98% of the protein present in the water, considerably more than flotation alone could collect. The recovered shrimp biomass contained up to 61% proteins and 23% total lipids. The researchers concluded the process could be used for recovering nutrients from shrimp processing water for later use in food or feed. 

Credit: 
American Chemical Society

Protein involved in corn's water stress response discovered

image: Crystal of the protein DRIK1 used in the study. It is involved in the mechanism of the plant's response to water and thermal stresses and to invasion by fungi

Image: 
GCCRC

Researchers affiliated with the Genomics for Climate Change Research Center (GCCRC), hosted by the University of Campinas (UNICAMP) in the state of São Paulo, Brazil, have discovered a protein involved in corn's resistance to dry weather, high temperatures, and fungal invasion.

This finding paves the way for the development of more drought-resistant plants and products that reduce losses in production at a time when global climate change threatens crop yields around the world. An article on the study is published in BMC Plant Biology.

The GCCRC is an Engineering Research Center (ERC) established by FAPESP and the Brazilian Agricultural Research Corporation (EMBRAPA).

The researchers named the new protein drought-responsive inactive kinase 1 (DRIK1). They also found a synthetic molecule that binds to DRIK1 and can be used in the future to breed plants in which the activity of the protein is naturally reduced or to develop products that inhibit the protein.

"Under normal conditions, the protein controls the plant's developmental mechanisms and inhibits stress-response genes. In dry weather or when the plant is attacked by pathogens, levels of the protein are reduced, and the necessary response is triggered to control the effects of water stress, thermal stress or pathogen attack," said Paulo Arruda (http://www.fapesp.br/cpe/home), a professor in UNICAMP's Institute of Biology (IB) and GCCRC's project leader.

To identify the molecule that binds to the protein, the researchers used a platform developed by UNICAMP's Center for Medicinal Chemistry (CQMED) to discover molecular targets for drugs. Led by Arruda, CQMED is also one of the National Science and Technology Institutes (INCTs) co-funded in the state of São Paulo by São Paulo Research Foundation - FAPESP and the National Council for Scientific and Technological Development (CNPq).

"CQMED's platform can search libraries for small molecules that inhibit specific proteins," Arruda explained. "In human health, this is important for the development of a new drug that inhibits a kinase protein involved in a disease, for example. We used the platform to identify a molecule that binds to the plant's protein kinase, and now we can study the function of the water stress response mechanism in which it is involved."

The researchers screened a library of 378 compounds that might bind to DRIK1 and identified a synthetic molecule with this capacity (ENMD-2076). They plan to modify it so that it can regulate DRIK1, increasing or decreasing its expression in plants.

The authors of the article also include Bruno Aquino, who worked as a postdoctoral intern at IB-UNICAMP with a scholarship from FAPESP; Viviane Cristina Heinzen da Silva, currently a postdoctoral intern in UNICAMP's Center for Molecular Biology and Genetic Engineering (CBMEG); and Katlin Brauer Massirer, CQMED's coprincipal investigator with Arruda.

Water stress response

To find the protein DRIK1, scientists searched a public database for genes related to the response to water stress in plants. They grew corn from seeds in a plant growth chamber for 15 days, watering some of the plants normally throughout the period. The others were divided into three groups and were not irrigated for nine, 12 or 14 days.

Samples of leaves and roots were RNA-sequenced. The researchers found that the water-stressed plants expressed less DRIK1 but that levels of the protein returned to normal when the plants were watered.

Information mined from the same database showed that DRIK1 probably behaves similarly in response to warmer temperatures and attacks by at least two different fungi.

The researchers also analyzed the protein's three-dimensional structure and mapped potentially important regions for the stress response function. In the future, these regions could serve as targets for compounds that modulate the protein's action mechanism.

Researchers are now working on the production of plants genetically engineered for altered expression of DRIK1 with the aim of obtaining varieties that are more drought-resistant.

"If we succeed in producing a variety that withstands water stress slightly more than others during a drought, it will be like having genetic insurance," Arruda said. "There will always be losses, but tons of food will be saved if these losses can be reduced."

Credit: 
Fundação de Amparo à Pesquisa do Estado de São Paulo

Tackling coral reefs' thorny problem

image: The crown-of-thorns starfish is a predator of coral.

Image: 
OIST

Researchers from the Okinawa Institute of Science and Technology Graduate University (OIST) have revealed the evolutionary history of the crown-of-thorns starfish - a predator of coral that can devastate coral reefs. Their findings shed light on how the populations of these starfish have changed over time and could potentially help reduce their ecological destruction.

A single crown-of-thorns starfish is formidable, with a large body covered in spiky, venomous thorns. But the species' true danger lies in its potent reproductive ability: a female crown-of-thorns starfish releases millions of eggs in a single spawning. This can quickly lead to plagues, with uncontrollably large numbers of starfish rapidly destroying vast areas of coral reef.

"Almost 40 years ago, Okinawa experienced a massive outbreak of crown-of-thorns starfish, where over 1.5 million starfish had to be removed by divers by hand," said Professor Noriyuki Satoh, senior author of the student and leader of the Marine Genomics Unit at OIST.

Although outbreaks have recently become less frequent around Okinawa and other subtropical islands in the Ryukyu Archipelago, they have become an increasingly large threat to the Great Barrier Reef in Australia, along with coral bleaching and tropical cyclones. These starfish outbreaks are becoming more common and more severe, as increasingly polluted and warmer waters aid the survival of the larvae.

In 2017, the OIST Marine Genomics Unit teamed up with Australian scientists to decode the genome of the crown-of-thorns starfish, with their results published in Nature. Now, in their latest study published in G3: Genes|Genomes|Genetics, the Marine Genomics Unit wanted to explore whether any information was recorded in the starfish genomes that could shed light on how and why these outbreaks occur.

The researchers collected crown-of-thorns starfish from coral reefs around three different islands in the Ryukyu Archipelago - Okinawa, Miyako and Iriomote. The scientists then sequenced the entire mitochondrial DNA, comprising over 16,000 nucleotide bases, and used differences in the sequences between individual starfish to construct an evolutionary tree.
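
For readers curious about the mechanics, building such a tree from aligned sequences can be sketched in a few lines of Biopython. This is a generic distance-based (neighbor-joining) workflow under assumed inputs - the file mito_aligned.fasta is hypothetical, and the study's actual phylogenetic pipeline may differ:

```python
# Generic distance-based tree building with Biopython; the alignment file is
# hypothetical and stands in for the study's aligned mitochondrial genomes.
from Bio import AlignIO, Phylo
from Bio.Phylo.TreeConstruction import DistanceCalculator, DistanceTreeConstructor

# One aligned mitochondrial genome (~16,000 bases) per starfish.
alignment = AlignIO.read("mito_aligned.fasta", "fasta")

calculator = DistanceCalculator("identity")    # distance = fraction of mismatched sites
distance_matrix = calculator.get_distance(alignment)

constructor = DistanceTreeConstructor()
tree = constructor.nj(distance_matrix)         # neighbor-joining tree

Phylo.draw_ascii(tree)                         # quick text rendering of the lineages
```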

The unit also performed the same analyses on two other starfish species - the blue starfish and the northern Pacific sea star. By comparing the crown-of-thorns starfish to these other two species, the scientists hoped to see whether their findings revealed anything unique to the crown-of-thorns starfish.

"The blue starfish is also a coral reef predator that lives in the same habitat as the crown-of-thorns starfish, but it doesn't produce these uncontrollable outbreaks," said Prof. Satoh. "Meanwhile, the northern Pacific sea star is the most common starfish in Japan and lives in colder waters around the Japanese mainland."

The scientists found that the evolutionary tree for the northern Pacific sea star showed that the species had split into two major lineages. Starfish collected from three different locations in the seas around the north-eastern regions of Japan were composed of individuals from one lineage, whilst a single population in the Seto Inland Sea in south-west Japan was formed of individuals from a second, more recent lineage.

"We believe that in a rare migration event, starfish larvae dispersed to the Seto Inland Sea. As these two areas are so separated, no migration occurred afterwards between the two populations, which resulted in the species splitting into two lineages," said Prof. Satoh. "Meanwhile, shorter range ocean currents kept individuals from the first lineage mixed between the nearby locations in the north-east of Japan."

For the blue starfish, the results were more surprising. The constructed evolutionary tree showed that the species had first split into two lineages, with the second lineage then diverging again into two smaller subgroups. But intriguingly, individuals from the two major lineages were found in both Okinawa and Ishigaki - the two areas in the Ryukyus where the blue starfish was collected. This means that two distinct starfish populations are living in the same geographic regions but are not breeding and mixing their genes. Prof. Satoh believes that this is strong evidence for there being two cryptic species of blue starfish - in other words, the starfish look the same despite being separate, non-breeding species.

The results also suggest that blue starfish migration occurs in both directions between Okinawa and Ishigaki. This was unexpected as the scientists had previously assumed that the powerful northeastern current flowing from Ishigaki towards Okinawa prevented starfish larvae from being carried in the opposite direction.

"For migration to readily occur in both directions, this suggests that the ocean currents in the Ryukyu Archipelago may be more complex that previously imagined," said Prof. Satoh.

The results from the evolutionary tree of the crown-of-thorns starfish also supported the idea of complex ocean currents in the region, with each crown-of-thorns starfish lineage also found in more than one geographic location. This has important implications for predicting where new outbreaks of crown-of-thorns starfish may occur in the Ryukyus, with the researchers now advocating for better understanding of the ocean currents in the area.

Overall, the evolutionary tree for the crown-of-thorns starfish looked significantly different from those of the other two starfish, underscoring key differences in the species' historical population dynamics. Despite being much younger than the other two species, diverging less than one million years ago, the crown-of-thorns starfish quickly fragmented into five small lineages. These findings suggest that the species underwent frequent genetic bottlenecks, in which the population was reduced to just a small number of individuals that then jumpstarted a new lineage.

"This implies that the starfish outbreaks are just one part of a larger 'boom and bust' population cycle, where if they are left to their natural devices, the starfish eat so much coral that they run out of food and die," said Prof. Satoh.

For their next steps, the Marine Genomics Unit is collaborating with Australian scientists to analyze crown-of-thorns starfish from the Great Barrier Reef. Instead of just using DNA in the mitochondria, the scientists aim to sequence the entire genome of each starfish, including DNA in the nucleus.

"Ultimately, we hope our findings can help us understand the population trends of the starfish better and the role of ocean currents in seeding new outbreaks," concluded Prof. Satoh. "This could potentially help us predict and therefore mitigate future outbreaks."

Credit: 
Okinawa Institute of Science and Technology (OIST) Graduate University

Enforcing gender quotas increases boardroom diversity and quality

Organisations required by national law to have significant female representation on their boards of directors show greater boardroom diversity and skill than those in countries that merely advise on quotas, according to research from City, University of London's Business School.

Dr Sonia Falconieri, Reader in Finance, and Chiara De Amicis, a PhD student in Finance at City, along with Dr Moez Bennouri from Montpellier Business School, studied the boards of British, French and Italian listed companies across a 14-year period.

Each country has its own respective laws on gender quotas as follows:

United Kingdom - a soft, voluntary target of at least 25 per cent female representation on FTSE100 boards was recommended by the Davies Report of 2011, later raised to 33 per cent in 2015. FTSE250 companies were also advised to achieve this by 2020.

France - gender quotas implemented in 2011 required listed and non-listed companies with more than 500 employees and revenues above 50 million euros to have a minimum of 20 per cent female representation on their boards of directors, rising to 40 per cent in 2014. Failure to comply results in voided board appointments and suspended attendance fees.

Italy - quotas came into law in 2012 requiring publicly listed companies to have at least 20 per cent of either gender on first renewal of their board of directors, and 33 per cent after the second renewal. Failure to comply results in large fines and even potential voiding of directorships.

The research found that the introduction of regulation was the single largest catalyst for an increase in the presence of women on boards, an effect felt significantly more sharply in France and Italy under mandatory rather than voluntary regimes. The data collected also showed a marked increase in compliance with these guidelines.

A second analysis, carried out to determine the 'quality' of these boards under gender quota guidelines, showed no deterioration in cases of high diversity, and in several cases quality improved where quotas were mandatory.

Dr Falconieri said:

"Boardroom diversity is crucial to the success and sustainability of an organisation.

"There is a risk that the current pandemic crisis could see countries that do not enforce quotas on gender diversity take a large backwards step in terms of female representation in the boardroom.

"Our study demonstrates that businesses are more compliant with gender diversity regulation if it is enforced. In addition to this, we find no evidence to suggest that the quality of boardrooms, normally associated with effective monitoring, deteriorates under this mandatory regulation.

"Despite this, gender quota regulations have not yet had an overall positive impact on the appointment of female executives or board chairs, which remains a great challenge and an obstacle to gender equality."

Credit: 
City St George’s, University of London

Evolutionary biologists find several fish adapt in the same way to toxic water

image: Kansas State University and Washington State University biologists study fish capable of living in water with highly toxic levels of hydrogen sulfide.

Image: 
Kansas State University

MANHATTAN, KANSAS -- Several species of fish have adapted to harsh environments using the same mechanism, which calls the role of evolutionary chance into question, according to a study by Kansas State University and Washington State University.

Michi Tobler, associate professor, Ryan Greenway, May 2019 doctoral graduate, and Nick Barts, doctoral student, all in the Division of Biology; Joanna Kelley, associate professor at Washington State University; and many additional collaborators recently published an article about repeated adaptations to extreme environments in Proceedings of the National Academy of Sciences.

"We are trying to understand how evolution and adaptation work," Tobler said. "We stumbled across these fish living in this highly toxic water. It is so toxic that it kills most other living things by binding to an enzyme in the mitochondria -- the powerhouse of cells -- and shuts off energy production at the cellular level."

The streams have high concentrations of hydrogen sulfide, a gas that is naturally dissolved in the water. Tobler and his collaborators found at least 10 different lineages of fish that have adapted to live in the extreme environment.

"Whether or not populations take the same path to adapting to novel environments is a long-standing question in evolutionary biology," Kelly said. "Our research shows that the same pathways have been modified in multiple different species of hydrogen sulfide adapted fishes."

All 10 adapted, regardless of location, using the same mechanism: tweaking the enzyme so the toxicant can't bind to it.

"The cool thing about these enzymes is all organisms have them," Tobler said. "We have them. Fungi have them. Plants have them. It's the universal way to make energy. Yet, it is this ancient pathway that has been conserved for so long that is modified in these fish."

According to Tobler, the fish also ramped up an existing detoxification mechanism inside the mitochondria so they can get rid of the hydrogen sulfide faster, allowing them to survive in toxic water where non-adapted fish of the same species cannot. The existence of multiple lineages of fish with this capability calls into question a view proposed by evolutionary biologist Stephen Jay Gould: that if evolution repeated itself, it would lead to different outcomes every time.

"Thirty years ago, Gould said 'if you could rewind the tape of life, you would get a different outcome every single time,' meaning that evolution would not find the same adaptive solutions every time," Tobler said. "What we actually found in all these lineages -- where the tape of life has been replayed as they were exposed to the same sources of selection -- is that evolution actually unfolds in very similar ways. I think it tells us something very fundamental about how organisms adapt and that adaptive solutions are possibly limited."

The researchers are able to compare the adapted fish living in the toxic water with ancestral populations living in the normal environment because there is no barrier between the habitats. Tobler said that as a consequence of adapting to the toxic environment, the fish are actually evolving into a new species. His graduate students have further research pending.

Credit: 
Kansas State University

New study shows colliding neutron stars may unlock mysteries of universe expansion

ORLANDO, July 8, 2020 - The National Science Foundation's Arecibo Observatory in Puerto Rico has proven itself instrumental in another major astronomical discovery.

An international team of scientists, led by the University of East Anglia in the United Kingdom, found an asymmetrical double neutron star system using the facility's powerful radio telescope. This type of star system is believed to be a precursor to merging double neutron star systems like the one that LIGO/Virgo (the Laser Interferometer Gravitational-Wave Observatory in the United States and the Virgo detector in Italy) discovered in 2017. The LIGO/Virgo observation was important because it confirmed the gravitational waves associated with merging neutron stars.

The work, published by the team today in the journal Nature, indicates that these specific kinds of double neutron star systems may be the key to understanding dead star collisions and the expansion of the universe.

"Back in 2017, scientists at LIGO/Virgo first detected the merger of two neutron stars," says physicist Robert Ferdman, who led the team. "The event caused gravitational-wave ripples through the fabric of space time, as predicted by Albert Einstein over a century ago. It confirmed that the phenomenon of short gamma-ray bursts was due to the merger of two neutron stars."

One of the unique aspects of the 2017 discovery and today's is that the double neutron star systems observed are composed of stars with very different masses. Current theories about the 2017 discovery were based on the stars' masses being equal or very close in size.

"The double neutron star system we observed shows the most asymmetric masses amongst the known merging systems within the age of the universe," says Benetge Perera, a UCF scientist at Arecibo who co-authored the paper. "Based on what we know from LIGO/Virgo and our study, understanding and characterizing of the asymmetric mass double neutron star population is vital to gravitational wave astronomy."

Perera, whose research is focused on pulsars and gravitational waves, joined the NSF-funded Arecibo Observatory in June 2019. The facility, which is managed by the University of Central Florida through a cooperative agreement with the NSF, offers scientists around the world a unique look into space because of its specialized instruments and its location near the equator.

The Discovery

The team discovered an unusual pulsar - one of deep space's magnetized spinning neutron-star 'lighthouses' that emits highly focused radio waves from its magnetic poles.

The newly discovered pulsar (known as PSR J1913+1102) is part of a binary system - which means that it is locked in a fiercely tight orbit with another neutron star.

"The Arecibo Observatory has a long legacy of important pulsar discoveries," says NSF Program Officer, Ashley Zauderer. "This exciting result shows how incredibly relevant the facility's unique sensitivity remains for scientific investigations in the new era of multi-messenger astrophysics."

Neutron stars are the dead stellar remnants of a supernova explosion. They are made up of the densest matter known - packing hundreds of thousands of times the Earth's mass into a sphere the size of a city like New York.

In about half a billion years the two neutron stars will collide, releasing astonishing amounts of energy in the form of gravitational waves and light.

That collision is what the LIGO/Virgo team observed in 2017. The event itself was not surprising, but the enormous amount of matter ejected from the merger and its brightness were unexpected, Ferdman said.

"Most theories about this event assumed that neutron stars locked in binary systems are very similar in mass," Ferdman says. "But this newly discovered binary is unusual because the masses of its two neutron stars are quite different - with one far larger than the other. Our
discovery changes these assumptions."

This asymmetric system gives scientists confidence that double neutron star mergers will provide vital clues about unsolved mysteries in astrophysics - including a more accurate determination of the expansion rate of the universe, known as the Hubble constant.

Credit: 
University of Central Florida

Preliminary study suggests tuberculosis vaccine may be limiting COVID-19 deaths

image: Luis Escobar, pictured, and two colleagues at the National Institutes of Health collected coronavirus mortality data from around the world. Photo courtesy of Luis Escobar for Virginia Tech.

Image: 
Virginia Tech

One of the emerging questions about the coronavirus that scientists are working to understand is why developing countries are showing markedly lower rates of mortality in COVID-19 cases than expected.

Research by Assistant Professor Luis Escobar of the College of Natural Resources and Environment and two colleagues at the National Institutes of Health suggests that Bacille Calmette-Guérin (BCG), a tuberculosis vaccine routinely given to children in countries with high rates of tuberculosis infection, might play a significant role in mitigating mortality rates from COVID-19. Their findings have been published in the Proceedings of the National Academy of Sciences.

"In our initial research, we found that countries with high rates of BCG vaccinations had lower rates of mortality," explained Escobar, a faculty member in the Department of Fish and Wildlife Conservation and an affiliate of the Global Change Center housed in the Fralin Life Sciences Institute. "But all countries are different: Guatemala has a younger population than, say, Italy, so we had to make adjustments to the data to accommodate those differences."

Escobar, working with NIH researchers Alvaro Molina-Cruz and Carolina Barillas-Mury, collected coronavirus mortality data from around the world. From that data, the team adjusted for variables, such as income, access to education and health services, population size and densities, and age distribution. Through all of the variables, a correlation held showing that countries with higher rates of BCG vaccinations had lower peak mortality rates from COVID-19.
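
As a rough illustration of this kind of covariate adjustment (the team's actual statistical models are described in the PNAS paper; the data file and variable names below are assumptions), a cross-country regression might look like this in Python:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Illustrative covariate adjustment. The file and column names are assumed,
# not the authors' exact model specification.
df = pd.read_csv("country_data.csv")           # one row per country (hypothetical)

model = smf.ols(
    "log_peak_mortality ~ bcg_coverage + income + education"
    " + health_access + pop_density + median_age",
    data=df,
).fit()

# Under the reported correlation, the bcg_coverage coefficient would be
# negative: higher vaccination coverage, lower peak COVID-19 mortality.
print(model.summary())
```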

One sample that stood out was Germany, which had different vaccine plans prior to the country's unification in 1990. While West Germany provided BCG vaccines to infants from 1961 to 1998, East Germany started their BCG vaccinations a decade earlier, but stopped in 1975. This means that older Germans -- the population most at risk from COVID-19 -- in the country's eastern states would have more protection from the current pandemic than their peers in western German states. Recent data shows this to be the case: western German states have experienced mortality rates that are 2.9 times higher than those in eastern Germany.

"The purpose of using the BCG vaccine to protect from severe COVID-19 would be to stimulate a broad, innate, rapid-response immunity," said Escobar, who noted that the BCG vaccines have already been shown to provide broad cross-protections for a number of viral respiratory illnesses in addition to tuberculosis.

Escobar stresses that the team's findings are preliminary, and that further research is needed to support their results and determine what the next steps should be for researchers. The World Health Organization noted that there is no current evidence that the BCG vaccine can protect people from COVID-19 infections, and stated that it does not currently recommend BCG vaccinations for the prevention of COVID-19. There are currently clinical trials underway to establish whether BCG vaccination in adults confers protection from severe COVID-19.

"We're not looking to advise policy with this paper," Escobar said. "This is, instead, a call for more research. We need to see if we can replicate this in experiments and, potentially, in clinical trials. We also need to come back to the data as we get more information, so we can reevaluate our understanding of the coronavirus pandemic."

Barillas-Mury, a chief researcher who specializes in mosquito-borne disease vectors, noted that establishing a link between BCG vaccines and COVID-19 case severity could result in attempts to stockpile doses of the BCG vaccine, placing countries with high tuberculosis rates at risk.

"If the BCG vaccine is protective, production would have to increase to meet the sudden spike in vaccine demand in order to prevent a delay in distribution to countries that very much need it to fight tuberculosis," she said.

While a direct correlation between BCG vaccinations and a reduction in coronavirus mortalities still needs to be understood more fully, researchers hold hope that the BCG vaccine might be able to provide at least short-term protections against severe COVID-19, particularly for front-line medical workers or high-risk patients. And, if BCG does provide short-term protection, there are longer term considerations about how countries could best utilize BCG vaccines to reduce mortality rates for future viral outbreaks that target the human respiratory system.

Credit: 
Virginia Tech

Programmable balloons pave the way for new shape-morphing devices

image: This programmable balloon could pave the way for new shape-morphing devices

Image: 
(Image courtesy of the Bertoldi Lab/Harvard SEAS)

Balloon shaping isn't just for kids anymore. A team of researchers from the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) has designed materials that can control and mold a balloon into pre-programmed shapes. The system uses kirigami sheets -- thin sheets of material with periodic cuts -- embedded into an inflatable device. As the balloon expands, the cuts in the kirigami sheet guide the growth, permitting expansion in some places and constricting it in others. The researchers were able to control the expansion not only globally to make large-scale shapes, but locally to generate small features.

The team also developed an inverse design strategy, an algorithm that finds the optimum design for the kirigami inflatable device that will mimic a target shape upon inflation.

"This work provides a new platform for shape-morphing devices that could support the design of innovative medical tools, actuators and reconfigurable structures," said Katia Bertoldi, the William and Ami Kuan Danoff Professor of Applied Mechanics at SEAS and senior author of the study.

The research is published in Advanced Materials.

An individual cut on a kirigami sheet contributes to the larger shape of the balloon like a pixel helps form an image on a 2D surface. The researchers found that by tuning the geometric parameters of these cuts, they could control and embed complex shapes.

"By only varying two parameters of the pixels, we can program all different kinds of crazy shapes into the kirigami balloons, including bends, twists and expansions," said Antonio Elia Forte, a postdoctoral fellow at SEAS and co-first author of the study. "Our strategy allows us to automatically design a morphable balloon starting from the shape that you need. It's a bottom-up approach that for the first time harnesses the elasticity of the material, not only kinematic."

Using these parameters, the researchers developed an inverse algorithm that could mix and match pixels of different width and height, or delete certain pixels altogether, to achieve the desired shape. By manipulating the parameters of individual pixels, the researchers were able to tune shapes at a significantly smaller scale. To demonstrate this, they programmed a balloon to mimic the shape of a squash (the experiments took place around Halloween) complete with the characteristic bumps and ridges along the side.
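
To make the inverse-design idea concrete, here is a deliberately toy version in Python. The forward model (how a cut's width and height set a row's inflated radius) is invented for illustration - the paper derives the real one from the sheet's mechanics - and the per-row grid search stands in for the actual optimization:

```python
import numpy as np

# Toy inverse design, not the authors' algorithm. Each row of kirigami
# "pixels" gets cut parameters (width, height); an invented forward model
# predicts that row's inflated radius, and a per-row search picks the
# parameters that best match the target shape.

def inflated_radius(width, height, r0=1.0):
    """Hypothetical forward model: wider cuts let a row expand more,
    taller cuts constrain it."""
    return r0 * (1.0 + 0.8 * width - 0.5 * height)

# Target profile: radius of each of 20 rows, e.g. a bumpy gourd-like shape.
target = 1.0 + 0.3 * np.sin(np.linspace(0.0, 2.0 * np.pi, 20))

# Precompute the radius for a grid of candidate (width, height) pixels.
widths, heights = np.meshgrid(np.linspace(0, 1, 51), np.linspace(0, 1, 51))
radii = inflated_radius(widths, heights)

# For each row, pick the pixel whose predicted radius best matches the target.
design = []
for r_target in target:
    i = np.unravel_index(np.abs(radii - r_target).argmin(), radii.shape)
    design.append((widths[i], heights[i]))

print(design[:3])   # (width, height) for the first three rows
```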

"By controlling the expansion at every level of the kirigami balloon, we can reproduce a variety of target shapes," Lishuai Jin, a graduate student at SEAS and co-first author of the paper.

The researchers also made kirigami balloons in the shapes of calabash gourds, hooks and vases, demonstrating that the approach is general enough to mimic any given shape.

Next, the researchers aim to use these kirigami balloons as shape-changing actuators for soft robots. The work lays a foundation for the design of structures at multiple scales: from micro-scale minimally invasive surgical devices to macro-scale structures for space exploration.

Credit: 
Harvard John A. Paulson School of Engineering and Applied Sciences

Soil studies can be helpful for border control

image: A soil tunnel shaft is used to access a tunnel under the US-Mexico border.

Image: 
College of ACES, University of Illinois.

URBANA, Ill. - Underground tunnels have been used by warriors and smugglers for thousands of years to infiltrate battlegrounds and cross borders. A new analysis published in the Open Journal of Soil Science presents a series of medieval and modern case studies to identify the most restrictive and ideal soil and geologic conditions for tunneling.

"Understanding the history of soil tunnels shows us that certain types of soils and geographies are uniquely suited for tunneling. Countries with warfare or smuggling issues, including the U.S.-Mexico border and Israeli borders, need detailed soil and hydrology maps of their borders to identify soil types, typographies, and thus areas where soil tunnels could be constructed," according to study co-author Kenneth Olson, professor emeritus and soil scientist in the Department of Natural Resource and Environmental Sciences at the University of Illinois.

Olson and co-author David Speidel looked at several tunnel systems throughout history, including examples in Syria, China, Cambodia, Vietnam, North Korea, South Korea, Iran, Iraq, Israel, Gaza, Egypt, Afghanistan, Mexico, and the United States.

The authors discuss the history of each area's tunnels, including construction and use. They detail the geological materials, bedrock, water tables, and climate for each tunnel network, and note its resilience or demise.

Using the case studies, the authors are able to identify site conditions that are most susceptible to soil tunneling and make specific recommendations for today's most vulnerable border crossings.

"Most cases of successful tunneling throughout history were in arid areas with a relatively low permanent water table," notes Olson. "These areas will need to be monitored for sound and vibrations to disrupt tunneling by smugglers."

Olson's previous work, explaining how soils and tunneling were an equalizer during the Vietnam War, caught the eye of several military groups, which led him to expand his soil tunnel warfare and smuggling research into this more recent study.

Olson is a Vietnam-era veteran who served in the U.S. Army from 1969 to 1973. Speidel is a U.S. Army Iraq, Bosnia-Herzegovina, and Vietnam-era veteran, as well as a retired USDA soil resource conservationist who was previously detailed to the Foreign Agricultural Service as a Civilian Response Corps agricultural advisor.

Credit: 
University of Illinois College of Agricultural, Consumer and Environmental Sciences

Learning more about particle collisions with machine learning

image: Schematic of ATLAS detector in the Large Hadron Collider.

Image: 
ATLAS Collaboration

The Large Hadron Collider (LHC) near Geneva, Switzerland became famous around the world in 2012 with the detection of the Higgs boson. The observation marked a crucial confirmation of the Standard Model of particle physics, which organizes the subatomic particles into groups similar to elements in the periodic table from chemistry.

The U.S. Department of Energy’s (DOE) Argonne National Laboratory has made many pivotal contributions to the construction and operation of the ATLAS experimental detector at the LHC and to the analysis of signals recorded by the detector that uncover the underlying physics of particle collisions. Argonne is now playing a lead role in the high-luminosity upgrade of the ATLAS detector for operations that are planned to begin in 2027. To that end, a team of Argonne physicists and computational scientists has devised a machine learning-based algorithm that approximates how the present detector would respond to the greatly increased data expected with the upgrade.

As the largest physics machine ever built, the LHC shoots two beams of protons in opposite directions around a 17-mile ring, accelerating them to near the speed of light, smashes them together and analyzes the collision products with gigantic detectors such as ATLAS. The ATLAS instrument is about the height of a six-story building and weighs approximately 7,000 tons. Today, the LHC continues to study the Higgs boson, as well as address fundamental questions about how and why matter in the universe is the way it is.

“Most of the research questions at ATLAS involve finding a needle in a giant haystack, where scientists are only interested in finding one event occurring among a billion others,” said Walter Hopkins, assistant physicist in Argonne’s High Energy Physics (HEP) division.

As part of the LHC upgrade, efforts are now progressing to boost the LHC’s luminosity — the number of proton-to-proton interactions per collision of the two proton beams — by a factor of five. This will produce about 10 times more data per year than what is presently acquired by the LHC experiments. How well the detectors respond to this increased event rate still needs to be understood. This requires running high-performance computer simulations of the detectors to accurately assess known processes resulting from LHC collisions. These large-scale simulations are costly and demand large chunks of computing time on the world’s best and most powerful supercomputers.

The Argonne team has created a machine learning algorithm that will be run as a preliminary simulation before any full-scale simulations. This algorithm approximates, in very fast and less costly ways, how the present detector would respond to the greatly increased data expected with the upgrade. It involves simulation of detector responses to a particle-collision experiment and the reconstruction of objects from the physical processes. These reconstructed objects include jets or sprays of particles, as well as individual particles like electrons and muons.

“The discovery of new physics at the LHC and elsewhere demands ever more complex methods for big data analyses,” said Doug Benjamin, a computational scientist in HEP. “These days that usually means use of machine learning and other artificial intelligence techniques.”

The analysis methods previously used for initial simulations did not employ machine learning and are time consuming, because they involve manually updating experimental parameters whenever conditions at the LHC change. They can also miss important data correlations for a given set of input variables to an experiment. The Argonne-developed algorithm learns, in real time while a training procedure is applied, the various features that would otherwise need to be introduced through detailed full simulations, thereby avoiding the need to handcraft experimental parameters. The method can also capture complex interdependencies of variables that earlier methods could not.
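
In spirit, such a surrogate can be as simple as a regression model trained to map truth-level particle properties to the detector-level response. The sketch below is a minimal stand-in rather than Argonne's algorithm: the "full simulation" is replaced by a toy smearing formula, and the network is an off-the-shelf scikit-learn regressor:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Minimal surrogate-model stand-in, NOT Argonne's algorithm: learn a fast map
# from truth-level jet kinematics to the detector-level response. The "full
# simulation" is replaced here by a toy smearing formula.
rng = np.random.default_rng(0)
n = 20_000
truth = np.column_stack([
    rng.uniform(20, 500, n),      # jet transverse momentum pT [GeV]
    rng.uniform(-2.5, 2.5, n),    # jet pseudorapidity eta
])
# Toy detector: response falls with |eta|, resolution worsens like sqrt(pT).
reco_pt = truth[:, 0] * (0.97 - 0.01 * np.abs(truth[:, 1])) \
          + rng.normal(0.0, 0.8 * np.sqrt(truth[:, 0]))

X_tr, X_te, y_tr, y_te = train_test_split(truth, reco_pt, random_state=0)
surrogate = make_pipeline(StandardScaler(),
                          MLPRegressor(hidden_layer_sizes=(64, 64),
                                       max_iter=500, random_state=0))
surrogate.fit(X_tr, y_tr)
print("held-out R^2:", surrogate.score(X_te, y_te))
```

Once trained on full-simulation samples, a model like this returns a detector response in microseconds instead of the minutes a detailed simulation can take, which is the point of running it as a preliminary step.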

“With our stripped-down simulation, you can learn the basics at comparatively little computational cost and time, then you can much more efficiently proceed with full simulations at a later date,” said Hopkins. “Our machine learning algorithm also provides users with better discriminating power on where to look for new or rare events in an experiment,” he added.

The team’s algorithm could prove invaluable not only for ATLAS, but for the multiple experimental detectors at the LHC, as well as other particle physics experiments now being conducted around the world.

Credit: 
DOE/Argonne National Laboratory

Monitoring for breast cancer after childhood chest radiation: When and how?

Chest radiation is used to treat children with Hodgkin and non-Hodgkin lymphoma as well as lung metastases in various solid tumors. But radiation itself is a potential cancer risk, including an increased risk for breast cancer later in life. Girls receiving chest radiation for childhood cancer face a breast cancer risk as high as 30 percent by age 50.

What is the best strategy for catching these breast cancers early? Annual screening with mammography and breast MRI is recommended, but practices vary from location to location, and the benefits, harms, and costs are unclear. A new study led by Jennifer M. Yeh, PhD, in the Division of General Pediatrics at Boston Children's Hospital, used modeling to compare results of different approaches. The findings appear in the Annals of Internal Medicine.

"Randomized clinical trials are often thought of as the 'gold standard,' but they aren't always feasible for screening studies, especially in rare high risk groups such as survivors of childhood cancer," Yeh says. "However, those developing guidelines need information on the potential benefits and harms of screening. Decision modeling can provide important insight on these health outcomes."

Yeh, with collaborators across the country, used two breast cancer simulation models that were developed to guide screening recommendations for women who are not cancer survivors. The two models, part of the Cancer Intervention and Surveillance Modeling Network (CISNET), were adapted using information from the Childhood Cancer Survivor Study (CCSS). The CCSS is a cohort study on outcomes, including breast cancer, in more than 24,000 survivors of childhood and adolescent cancers diagnosed between 1970 and 1999.

The models evaluated the following annual screening strategies:

no screening

digital mammography and breast MRI, starting at age 25 (as per current recommendations of the Children's Oncology Group), 30 or 35

MRI only, starting at age 25, 30, or 35.

The models assumed that women who were screened continued to be screened until age 74, and that those diagnosed with breast cancer received the best therapy available at the time.
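
To give a flavor of how such decision models compare strategies (the CISNET models themselves are far more detailed and carefully calibrated), here is a deliberately simplified Monte Carlo sketch in Python; every number in it is an assumption chosen only to echo the figures quoted in this article:

```python
import numpy as np

# Deliberately simplified Monte Carlo sketch of a screening model. Every
# number is an assumption chosen to echo the figures quoted in this article;
# the CISNET models are far more detailed and carefully calibrated.
rng = np.random.default_rng(1)
n = 100_000                        # simulated survivors
lifetime_risk = 0.30               # breast cancer risk cited for this cohort
fatality_late = 0.35               # assumed case fatality without screening
                                   #   (0.30 * 0.35 ~ the 10-11% death risk reported)
fatality_early = 0.15              # assumed case fatality if caught by screening

cancer = rng.random(n) < lifetime_risk

for label, sensitivity in [("no screening", 0.00),
                           ("annual MRI from age 30", 0.85),
                           ("annual MRI + mammography from 25", 0.95)]:
    caught = cancer & (rng.random(n) < sensitivity)     # tumors found early
    fatality = np.where(caught, fatality_early, fatality_late)
    deaths = int((cancer & (rng.random(n) < fatality)).sum())
    print(f"{label:35s} {deaths:6,d} deaths per {n:,} women")
```

The real models additionally track ages, screening schedules, false positives and treatment eras, which is what allows them to weigh harms and costs against the deaths averted.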

All annual screening approaches save lives

In the simulation, Yeh showed that without screening, childhood cancer survivors previously treated with chest radiation had a 10 to 11 percent lifetime risk of dying from breast cancer, as compared with a 2.5 percent risk among women in the general population.

Compared with no screening, all annual screening strategies prevented more than half of breast cancer deaths according to the models. A combined approach of both breast MRI and mammography, beginning at age 25, prevented the most deaths (an estimated 56 to 71 percent); MRI alone prevented slightly fewer (56 to 62 percent).

However, starting at age 25 also meant more screening tests, more false-positive findings, and more breast biopsies that turned out to be benign. For example, the researchers estimate that the average survivor screened with both MRI and mammography would have four to five false-positive screens and one to two breast biopsies over the course of her lifetime.

When the emotional stress and costs of additional screening and testing were factored into the analysis, starting at age 30 was the preferred strategy; relatively little was gained by starting at age 25. Either strategy reduced the risk of dying from breast cancer by at least half, with or without the addition of mammography to MRI.

Interpreting the findings

Lisa Diller, MD, chief medical officer of the Dana-Farber/Boston Children's Cancer and Blood Disorders Center, is an expert in the care of childhood cancer survivors and a co-author of the study. "Breast cancer screening is one of the most important issues that oncologists should discuss with survivors of childhood chest radiation," Diller says. "Having these data informs discussions with young women who are facing screening at a very young age. It both reassures them that waiting until age 30 might be reasonable and impresses upon them that this screening could be life-saving."

One limitation of the study was that it used data from childhood cancer survivors diagnosed between 1970 and 1986. Since then, cancer treatment has changed, including decreased doses and improved delivery of radiation, and breast cancer monitoring now includes digital breast tomosynthesis, also known as 3D mammography.

"Our model-based findings suggest that even if the risk of breast cancer declines by half with more recent changes in radiation dose and delivery, early initiation of screening still remains favorable for these high-risk survivors," Yeh says. "Ensuring survivors are aware of and have access to screening can save lives."

Credit: 
Boston Children's Hospital