Tech

Theory describes quantum phenomenon in nanomaterials

image: A quantum dot (yellow) is connected to two lead electrodes (blue). Electrons tunneling into the quantum dot from the electrodes interact with each other to form a highly correlated quantum state called a "Fermi liquid." Both the nonlinear electric current passing through the quantum dot and its fluctuations, which appear as noise, carry important signals that can unveil the underlying physics of this quantum liquid. The study clarifies that three-body correlations among the electrons evolve significantly and play an essential role in this state under external fields that break particle-hole or time-reversal symmetry.

Image: 
Rui Sakano

Theoretical physicists Yoshimichi Teratani and Akira Oguri of Osaka City University and Rui Sakano of the University of Tokyo have developed mathematical formulas that describe a physical phenomenon happening within quantum dots and other nanosized materials. The formulas, published in the journal Physical Review Letters, could be applied to further theoretical research about the physics of quantum dots, ultra-cold atomic gases, and quarks.

At issue is the Kondo effect. First described in 1964 by Japanese theoretical physicist Jun Kondo to explain the behavior of certain magnetic materials, the effect is now known to occur in many other systems, including quantum dots and other nanoscale materials.

Normally, the electrical resistance of a metal drops as the temperature drops. In metals containing magnetic impurities, however, this only happens down to a critical temperature, below which resistance rises as the temperature falls further.

Scientists were eventually able to show that, at very low temperatures near absolute zero, electron spins become entangled with the magnetic impurities, forming a cloud that screens their magnetism. The cloud's shape changes as the temperature drops further, leading to the rise in resistance. The cloud is deformed in the same way when other external 'perturbations', such as a voltage or magnetic field, are applied to the metal.

Teratani, Sakano and Oguri wanted to develop mathematical formulas to describe the evolution of this cloud in quantum dots and other nanoscale materials, which is not an easy task.

To describe such a complex quantum system, they started with a system at absolute zero, where a well-established theoretical model for interacting electrons, Fermi liquid theory, is applicable. They then added 'corrections' that describe how the system responds to external perturbations. Using this technique, they wrote formulas describing the electrical current through quantum dots and its fluctuations.
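Schematically (as a generic illustration of the structure of such low-energy Fermi-liquid expansions, not the authors' exact expressions), the differential conductance of a Kondo-correlated dot is expanded around zero temperature and zero bias as

$$ \frac{dI}{dV} \;\simeq\; G_0\left[\,1 \;-\; c_T\left(\frac{k_B T}{k_B T^{*}}\right)^{2} \;-\; c_V\left(\frac{eV}{k_B T^{*}}\right)^{2} \;+\;\cdots\right], $$

where $T^{*}$ is the characteristic Kondo energy scale and $G_0$ the linear conductance. In the authors' framework, the coefficients $c_T$ and $c_V$, together with additional terms asymmetric in the bias that appear once particle-hole or time-reversal symmetry is broken, are fixed by quasiparticle parameters and the three-body correlation functions, which is what makes those correlations measurable through the current and its noise.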

Their formulas indicate that electrons interact within these systems in two different ways that contribute to the Kondo effect. First, two electrons collide with each other, forming well-defined quasiparticles that propagate within the Kondo cloud. More significantly, a so-called three-body contribution arises: two electrons combine in the presence of a third, causing an energy shift of the quasiparticles.

"The formulas' predictions could soon be investigated experimentally", Oguri says. "Studies along the lines of this research have only just begun," he adds.

The formulas could also be extended to understand other quantum phenomena, such as quantum particle movement through quantum dots connected to superconductors. Quantum dots could be a key for realizing quantum information technologies, such as quantum computers and quantum communication.

Credit: 
Osaka City University

Researchers develop new way to break reciprocity law

image: One-way light transmission.

Image: 
Xuchen Wang / Aalto University

An international research team led by Aalto University has found a new and simple route to breaking the reciprocity law in the electromagnetic world: changing material properties periodically in time. The breakthrough could help create efficient nonreciprocal devices, such as compact isolators and circulators, that are needed for the next generation of microwave and optical communications systems.

When we look through a window and see our neighbour on the street, the neighbour can also see us. This is called reciprocity, and it is among the most ubiquitous physical phenomena in nature. The propagation of electromagnetic signals between two sources is always governed by the reciprocity law: if the signal from source A can be received by source B, then the signal from source B can also be received by source A with equal efficiency.
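In scattering-parameter terms (a standard textbook formulation rather than anything specific to this study), reciprocity between two ports means the transmission is identical in both directions,

$$ S_{21} \;=\; S_{12}, $$

and a nonreciprocal device is one for which $S_{21} \neq S_{12}$; an ideal isolator, for instance, has $|S_{21}| \approx 1$ and $|S_{12}| \approx 0$.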

Researchers from Aalto University, Stanford University, and the Swiss Federal Institute of Technology in Lausanne (EPFL) have demonstrated that the reciprocity law can be broken if a property of the propagation medium changes periodically in time. The propagation medium is the material through which light and other electromagnetic waves travel from one point to another.

The team theoretically demonstrated that, if the medium is shaped into an asymmetric structure and one of its physical properties is varied globally in time, the signal generated by source A can be received by source B but not the other way around. This creates a strong nonreciprocal effect, since the signal from source B cannot be received by source A.

'This is an important milestone in both the physics and engineering communities. We need one-way light transmission for a variety of applications, like stabilising laser operation or designing future communication systems, such as full-duplex systems with increased channel capacity,' says postdoctoral researcher Xuchen Wang from Aalto University.

Previously, creating a nonreciprocal effect has required biasing by external magnets, which makes devices bulky, temperature-unstable, and sometimes incompatible with other components. The new findings provide the simplest and most compact way yet to break electromagnetic reciprocity, without the need for bulky and heavy magnets.

'Such "time-only" variations allow us to design simple and compact material platforms capable of one-way light transmission and even amplification,' Xuchen explains.

The results are reported in Physical Review Letters on 22 December 2020. The study has received funding from the Academy of Finland, European Union's Horizon 2020 Future Emerging Technologies call (FETOPEN - RIA) under project VISORSURF, the Finnish Foundation for Technology Promotion, and the U.S. Air Force Office of Scientific Research MURI project (Grant No. FA9550-18-1-0379).

Credit: 
Aalto University

Plastic is blowing in the wind

As the plastic in our oceans breaks up into smaller and smaller bits without breaking down chemically, the resulting microplastics are becoming a serious ecological problem. A new study at the Weizmann Institute of Science reveals a troubling aspect of microplastics - defined as particles smaller than 5 mm across. They are swept up into the atmosphere and carried on the wind to far-flung parts of the ocean, including those that appear to be clear. Analysis reveals that such minuscule fragments can stay airborne for hours or days, spreading the potential to harm the marine environment and, by climbing up the food chain, to affect human health.

"A handful of studies have found microplastics in the atmosphere right above the water near shorelines," says Dr. Miri Trainic, in the groups of Prof. Ilan Koren of the Institute's Earth and Planetary Sciences Department in collaboration with that of Prof. Yinon Rudich of the same department, and Prof. Assaf Vardi of the Institute's Plant and Environmental Sciences Department. "But we were surprised to find a non-trivial amount above seemingly pristine water."

Koren and Vardi have been collaborating for a number of years on studies designed to understand the interface between ocean and air. While the way the oceans absorb materials from the atmosphere has been well studied, the process in the opposite direction - aerosolization, in which volatiles, viruses, algal fragments and other particles are swept from seawater into the atmosphere - had been much less investigated.

As part of this ongoing effort, aerosol samples were collected for study in the Weizmann labs during the 2016 run of the Tara research vessel, a schooner on which several international research teams at a time come together to study the effects of climate change, primarily on marine biodiversity. The Weizmann team affixed the inlet of their measuring equipment to the top of one of the Tara's masts (so as to avoid any aerosols produced by the schooner itself), and Dr. J. Michel Flores, of Koren's group, joined the mission to tend to the collecting as the schooner sailed across the North Atlantic Ocean.

Identifying and quantifying the microplastic bits trapped in their aerosol samples was far from easy, as the particles turned out to be hard to pick out under the microscope. To determine exactly what plastic was getting into the atmosphere, the team conducted Raman spectroscopy measurements, with the help of Dr. Iddo Pinkas of the Institute's Chemical Research Support, to establish the particles' chemical makeup and size. The researchers detected high levels of common plastics - polystyrene, polyethylene, polypropylene and more - in their samples. Then, from calculations of the shape and mass of the microplastic particles, combined with the average wind directions and speeds over the oceans, the team showed that the most likely source of these microplastics was plastic bags and other plastic waste discarded near shorelines hundreds of kilometers away, which had made its way into the ocean.

Checking the seawater beneath the sample sites showed the same type of plastic as in the aerosol, providing support for the idea that microplastics enter the atmosphere through bubbles on the ocean surface or are picked up by winds, and are transported on air currents to remote parts of the ocean.

"Once microplastics are in the atmosphere, they dry out, and they are exposed to UV light and atmospheric components with which they interact chemically," says Trainic. "That means the particles that fall back into the ocean are likely to be even more harmful or toxic than before to any marine life that ingests them."

"On top of that," adds Vardi, "some of these plastics become scaffolds for bacterial growth for all kinds of marine bacteria, so airborne plastic could be offering a free ride to some species, including pathogenic bacteria that are harmful to marine life and humans."

"The real amount of microplastic in the ocean aerosols is almost certainly greater than what our measurements showed, because our setup was unable to detect those particles below a few micrometers in size," says Trainic. "For example, in addition to plastics that break down into even smaller pieces, there are the nanoparticles that are added to cosmetics and which are easily washed into the ocean, or are formed in the ocean through microplastic fragmentation."

Size, in the case of plastic particles, does matter, and not only because lighter ones stay airborne for longer. When they do land on the water's surface, they are more likely to be eaten by equally small marine life, which, of course, cannot digest them. Thus, every one of these particles has the potential to harm a marine organism or to work its way up the food chain and into our bodies.
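A rough way to see why particle size controls airborne time is the Stokes settling velocity of a small sphere in still air. The sketch below is an illustrative estimate with assumed values for particle size and density, not a calculation from the study; it shows that a few-micrometer polyethylene particle falls well under a millimeter per second, so it can stay aloft for hours even before turbulence is taken into account.

```python
# Illustrative estimate only: Stokes settling velocity of a small microplastic
# sphere in still air. Particle size and density are assumed values, not data
# from the Weizmann study.

def stokes_settling_velocity(radius_m, particle_density,
                             fluid_density=1.2, viscosity=1.8e-5, g=9.81):
    """Terminal settling velocity (m/s) of a small sphere in a viscous fluid."""
    return (2.0 / 9.0) * (particle_density - fluid_density) * g * radius_m**2 / viscosity

radius = 2.5e-6     # 5-micrometer-diameter particle (assumed)
density = 950.0     # roughly polyethylene, kg/m^3 (assumed)

v = stokes_settling_velocity(radius, density)
hours_to_fall_10m = 10.0 / v / 3600.0
print(f"settling velocity: {v * 1000:.2f} mm/s")            # ~0.7 mm/s
print(f"time to fall 10 m in still air: {hours_to_fall_10m:.1f} hours")  # ~4 hours
```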

"Last, but not least, like all aerosols, microplastics become part of the large planetary cycles - for example, carbon and oxygen - as they interact with other parts of the atmosphere," says Koren. "Because they are both lightweight and long-lived, we will be seeing more microplastics transported in the air as the plastics that are already polluting our oceans break up - even if we do not add any further plastics to our waterways." he adds.

Credit: 
Weizmann Institute of Science

Light smokers may not escape nicotine addiction, study reveals

HERSHEY, Pa. -- Even people who consider themselves to be casual cigarette smokers may be addicted, according to current diagnostic criteria. Researchers at Penn State College of Medicine and Duke University found that many light smokers -- those who smoke one to four cigarettes per day or fewer -- meet the criteria for nicotine addiction and should therefore be considered for treatment.

"In the past, some considered that only patients who smoke around 10 cigarettes per day or more were addicted, and I still hear that sometimes," said Jonathan Foulds, professor of public health sciences and psychiatry and behavioral health, Penn State. "But this study demonstrates that many lighter smokers, even those who do not smoke every day, can be addicted to cigarettes. It also suggests that we need to be more precise when we ask about cigarette smoking frequency."

According to Jason Oliver, assistant professor of psychiatry and behavioral sciences, Duke University, when assessing nicotine addiction -- clinically referred to as 'tobacco use disorder' -- clinicians are encouraged to fully assess the 11 criteria listed in the 5th edition of the Diagnostic and Statistical Manual (DSM-5). As a shortcut, he said, clinicians more typically ask smokers how many cigarettes they smoke per day.

"Lighter smoking is correctly perceived as less harmful than heavy smoking, but it still carries significant health risks," Oliver said. "Medical providers sometimes perceive lighter smokers as not addicted and, therefore, not in need of treatment, but this study suggests many of them may have significant difficulty quitting without assistance."

The researchers examined an existing data set from the National Institutes of Health, including more than 6,700 smokers who had been fully assessed to find out if they met the DSM-5 criteria for tobacco use disorder. They found that 85% of the daily cigarette smokers were addicted to some extent -- either mild, moderate or severe addiction.

"Surprisingly, almost two thirds of those smoking only one to four cigarettes per day were addicted, and around a quarter of those smoking less than weekly were addicted," Foulds said.

The researchers found that the severity of cigarette addiction, as indicated by the number of criteria met, increased with the frequency of smoking, with 35% of those smoking one-to-four cigarettes per day and 74% of those smoking 21 cigarettes or more per day being moderately or severely addicted.

The findings appeared Dec. 22 in the American Journal of Preventive Medicine.

"This was the first time that severity of cigarette addiction has been described across the full range of cigarette use frequency," said Foulds, a Penn State Cancer Institute researcher.

Oliver added that the study highlights the high prevalence of tobacco use disorder even among those considered to be light smokers and provides a basis from which treatment can begin to target this population.

"Previous research has found that non-daily smokers are more likely than daily smokers to make a quit attempt," Oliver said. "Clinicians should ask about all smoking behavior, including non-daily smoking, as such smokers may still require treatment to successfully quit smoking. Yet, it is unclear the extent to which existing interventions are effective for light smokers. Continued efforts to identify optimal cessation approaches for this population remain an important direction for future research."

Credit: 
Penn State

High-brightness source of coherent light spanning from the UV to THz

image: Artistic impression of the spectrum of a mid-infrared pulse broadening in the background with the electric field of the generated pulse.

Image: 
ICFO/L.Maidment, U. Elu & J. Biegert

Analytical optical methods are vital to modern society because they permit the fast and secure identification of substances within solids, liquids or gases. These methods rely on the fact that light interacts with each substance differently in different parts of the optical spectrum. For instance, the ultraviolet range of the spectrum can directly access electronic transitions inside a substance, while the terahertz range is very sensitive to molecular vibrations.

Over the years, many techniques have been developed for hyperspectral spectroscopy and imaging, allowing scientists to observe how molecules fold, rotate or vibrate and thereby identify cancer markers, greenhouse gases, pollutants or even substances that could be harmful to us. These ultrasensitive techniques have proven very useful in applications such as food inspection and biochemical sensing, and even in cultural heritage, where they are used to investigate the structure of the materials in ancient objects, paintings and sculptures.

A standing challenge has been the absence of compact sources that cover such a large spectral range with sufficient brightness. Synchrotrons provide the spectral coverage, but they lack the temporal coherence of lasers, and such sources are available only at large-scale user facilities.

Now, in a recent study published in Nature Photonics, an international team of researchers from ICFO, the Max Planck Institute for the Science of Light, Kuban State University, and the Max-Born-Institute for Nonlinear Optics and Ultrafast Spectroscopy, led by ICREA Prof. Jens Biegert at ICFO, reports a compact, high-brightness mid-IR-driven source combining a gas-filled anti-resonant-ring photonic crystal fiber with a novel nonlinear crystal. The tabletop source provides a seven-octave coherent spectrum, from 340 nm to 40,000 nm, with a spectral brightness 2-5 orders of magnitude higher than that of one of the brightest synchrotron facilities.
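The seven-octave figure follows directly from the quoted wavelength limits, since an octave is a factor of two in frequency (equivalently, in wavelength):

$$ \log_{2}\!\left(\frac{40\,000\ \mathrm{nm}}{340\ \mathrm{nm}}\right) \;\approx\; 6.9\ \text{octaves}. $$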

Future research will leverage the few-cycle pulse duration of the source for the time-domain analysis of substances and materials, thus opening new opportunities for multimodal measurement approaches in areas such as molecular spectroscopy, physical chemistry or solid-state physics, to name a few.

Credit: 
Max-Planck-Gesellschaft

Research reveals compromised transfer of SARS-CoV-2-specific antibodies through placenta

BOSTON - Recent analyses indicate that pregnant women and newborns may face elevated risks of developing more severe cases of COVID-19 following SARS-CoV-2 infection. New research led by investigators at Massachusetts General Hospital (MGH) and published in Cell reveals lower than expected transfer of protective SARS-CoV-2 antibodies via the placenta from mothers who are infected in the third trimester. The cause may be alterations to these antibodies after they're produced--a process called glycosylation.

The results expand on the team's recent findings, published in JAMA Network Open, that pregnant women with COVID-19 do not pass the SARS-CoV-2 virus to their newborns but do pass relatively low levels of antibodies against it. For this latest study, the scientists compared maternal antibodies against influenza (flu), pertussis (whooping cough), and SARS-CoV-2, and how these antibodies transferred across the placenta. Influenza- and pertussis-specific antibodies were actively transferred in a relatively normal fashion. In contrast, transfer of SARS-CoV-2-specific antibodies to the baby was not only significantly reduced, but the transferred antibodies were less functional than those against influenza. The reduced transfer was observed only in third-trimester infections.

The scientists found that altered attachments of carbohydrates to the SARS-CoV-2-specific antibodies -- a process called glycosylation -- may be to blame for this reduced transfer from mother to fetus in the third trimester. The carbohydrate attachments on SARS-CoV-2-specific antibodies in maternal blood were different than those seen on influenza- and pertussis-specific antibodies. This carbohydrate pattern may cause the COVID-specific antibodies to be "stuck" in the maternal circulation, rather than transferred across the placenta via placental antibody receptors. Infection-induced increases in total maternal antibodies, as well as higher placental expression of an antibody receptor that attracts the carbohydrate pattern on the SARS-CoV-2-specific antibodies, helped to partially overcome the problem and facilitate the transfer of some functional antibodies from mother to fetus. Interestingly, some of the antibodies that transferred the best were also the most functional, activating natural killer cells that could help the newborn fight the virus if exposed.

The findings have implications for the design of vaccines against SARS-CoV-2 for pregnant women. "Vaccine regimens able to drive high levels of the COVID-specific antibodies with glycosylation patterns favored by the placenta for selective transfer to the fetus may lead to better neonatal and infant protection," says co-senior author Andrea Edlow, MD, MSc, a maternal-fetal medicine specialist at MGH and an assistant professor of Obstetrics, Gynecology, and Reproductive Biology at Harvard Medical School. Co-senior author and Core Member at the Ragon Institute of MGH, MIT and Harvard, Galit Alter, PhD, notes: "We are beginning to define the rules of placental antibody transfer of SARS-CoV-2 for the very first time -- catalyzing our ability to rationally design vaccines to protect pregnant women and their newborns."

In addition, understanding how antibody transfer varies by trimester may point to critical windows in pregnancy that may be most desirable for vaccination to optimize protection for both the mother and her infant.

Credit: 
Massachusetts General Hospital

Turning the heat down: Catalyzing ammonia formation at lower temperatures with ruthenium

image: The metal ruthenium, supported on lanthanide oxyhydrides, can efficiently catalyze the synthesis of ammonia at much lower temperatures than the traditional approach.

Image: 
Tokyo Tech

Nitrogen is an essential nutrient for plant growth. Although nitrogen makes up roughly 80% of Earth's atmosphere, it exists there as a gas and is therefore inaccessible to plants. To boost plant growth, especially in agricultural settings, chemical nitrogen fertilizers are therefore needed. A crucial step in the production of these fertilizers is the synthesis of ammonia, which involves a reaction between hydrogen and nitrogen in the presence of a catalyst.
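The underlying reaction is the exothermic equilibrium between nitrogen and hydrogen,

$$ \mathrm{N_2} + 3\,\mathrm{H_2} \;\rightleftharpoons\; 2\,\mathrm{NH_3}, \qquad \Delta H^{\circ} \approx -92\ \mathrm{kJ\ mol^{-1}}, $$

which is part of why better catalysts, rather than ever-higher temperatures, are the practical route to a more efficient process: heating the reactor further does not favor ammonia at equilibrium.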

Traditionally, ammonia production has been performed through the "Haber-Bosch" process, which, despite being effective, requires high temperature conditions (400-500°C), making the process expensive. Consequently, scientists have been trying to find a way to reduce the reaction temperatures of ammonia synthesis.

Recently, scientists have reported that ruthenium--a transition metal--is an efficient catalyst for ammonia synthesis, as it operates under milder conditions than traditional iron-based catalysts. There is a caveat, however: nitrogen molecules need to stick to the catalyst surface and dissociate into atoms before they can react with hydrogen to form ammonia. On ruthenium at low temperatures, hydrogen molecules often stick to the surface instead--a process called hydrogen poisoning--which impedes the production of ammonia. To work with ruthenium, therefore, it is necessary to suppress its hydrogen poisoning.

Fortunately, certain materials can boost the catalytic activity of ruthenium when used as a "catalyst support." A team of scientists from Tokyo Tech, Japan, recently revealed that lanthanide hydrides of the form LnH2+x are one such group of support materials. "The enhanced catalytic performance is realized by two unique properties of the support material. First, they donate electrons, which guide the dissociation of nitrogen on the ruthenium surface. Second, these electrons combine with hydrogen on the surface to form hydride ions, which readily react with nitrogen to form ammonia and release the electrons, suppressing hydrogen poisoning of ruthenium," explains Associate Prof. Masaaki Kitano, who led the study.

Suspecting that hydride ion mobility might have a role to play in ammonia synthesis, the team, in a new study published in Advanced Energy Materials, investigated the performance of lanthanide oxyhydrides (LaH3-2xOx)--reportedly fast hydride ion conductors at 100-400°C--as a support material for ruthenium, with the aim of uncovering the relationship between ammonia synthesis and hydride ion mobility.

They found that while the "bulk" hydride ion conductivity had little bearing on the activation of ammonia synthesis, the surface or "local" mobility of hydride ions did play a crucial role in catalysis by helping to build up a strong resistance against hydrogen poisoning of ruthenium. They also found that, compared with other support materials, lanthanum oxyhydrides required a lower onset temperature for ammonia formation (160°C) and showed a higher catalytic activity.

Furthermore, the team observed that the presence of oxygen stabilized the oxyhydride framework and the hydride ions against nitridation--the transformation of lanthanum oxyhydride to lanthanum nitride and its subsequent deactivation--which tends to impede catalysis and is a major drawback of hydride support materials. "The resistance to nitridation is a tremendous advantage, as it helps to preserve the electron-donating ability of the hydride ions for a longer duration of the reaction," comments Prof. Kitano.

The superior catalytic performance and lower synthesis onset temperature achieved using lanthanide oxyhydrides could thus be the much sought-after solution to turn the heat down on ammonia production!

Credit: 
Tokyo Institute of Technology

Breaking bad: how shattered chromosomes make cancer cells drug-resistant

image: In this scanning electron micrograph of inside the nucleus of a cancer cell, chromosomes are indicated by blue arrows and circular extra-chromosomal DNA are indicated by orange arrows.

Image: 
Image courtesy of Paul Mischel, UC San Diego

Cancer is one of the world's greatest health afflictions because, unlike some diseases, it is a moving target, constantly evolving to evade and resist treatment.

In a paper published in the December 23, 2020 online issue of Nature, researchers at University of California San Diego School of Medicine and the UC San Diego branch of the Ludwig Institute for Cancer Research, with colleagues in New York and the United Kingdom, describe how a phenomenon known as "chromothripsis" breaks up chromosomes, which then reassemble in ways that ultimately promote cancer cell growth.

Chromothripsis is a catastrophic mutational event in a cell's history that involves massive rearrangement of its genome, as opposed to a gradual acquisition of rearrangements and mutations over time. Genomic rearrangement is a key characteristic of many cancers, allowing mutated cells to grow or grow faster, unaffected by anti-cancer therapies.

"These rearrangements can occur in a single step," said first author Ofer Shoshani, PhD, a postdoctoral fellow in the lab of the paper's co-senior author Don Cleveland, PhD, professor of medicine, neurosciences and cellular and molecular medicine at UC San Diego School of Medicine.

"During chromothripsis, a chromosome in a cell is shattered into many pieces, hundreds in some cases, followed by reassembly in a shuffled order. Some pieces get lost while others persist as extra-chromosomal DNA (ecDNA). Some of these ecDNA elements promote cancer cell growth and form minute-sized chromosomes called 'double minutes.'"

Research published last year by scientists at the UC San Diego branch of the Ludwig Institute for Cancer Research found that up to half of all cancer cells in many types of cancers contain ecDNA carrying cancer-promoting genes.

In the latest study, Cleveland, Shoshani and colleagues employed direct visualization of chromosome structure to identify the steps in gene amplification and the mechanism underlying resistance to methotrexate, one of the earliest chemotherapy drugs and still widely used.

In collaboration with co-senior author Peter J. Campbell, PhD, head of cancer, aging and somatic mutation at the Wellcome Sanger Institute in the United Kingdom, the team sequenced the entire genomes of cells developing drug resistance, revealing that chromosome shattering jump-starts the formation of ecDNA carrying genes that confer resistance to anti-cancer therapy.

The scientists also identified how chromothripsis drives ecDNA formation after gene amplification inside a chromosome.

"Chromothripsis converts intra-chromosomal amplifications (internal) into extra-chromosomal (external) amplifications and that amplified ecDNA can then reintegrate into chromosomal locations in response to DNA damage from chemotherapy or radiotherapy," said Shoshani. "The new work highlights the role of chromothripsis at all critical stages in the life cycle of amplified DNA in cancer cells, explaining how cancer cells can become more aggressive or drug-resistant."

Said Cleveland: "Our identification of repetitive DNA shattering as a driver of anti-cancer drug resistance, and of the DNA repair pathways necessary for reassembling the shattered chromosomal pieces, has enabled rational design of combination drug therapies to prevent the development of drug resistance in cancer patients, thereby improving their outcomes."

The findings address one of the so-called nine Grand Challenges for cancer therapy development, a joint partnership between the National Cancer Institute in the United States and Cancer Research UK, the world's largest independent cancer research and awareness charity.

Credit: 
University of California - San Diego

Survival of the thickest: Big brains make mammal populations less dense

image: A Barbary macaque (Macaca sylvanus) in Gibraltar

Image: 
Manuela Gonzalez-Suarez/University of Reading

Mammals with big brains tend to be less abundant in local areas than those with smaller brains, new research has shown.

The University of Reading led an international team of scientists in the first study to consider the effect of brain size on why population densities of land mammals such as mice, monkeys, kangaroos and foxes vary so widely in local areas, even among similar creatures.

Using statistical models to test different scenarios for hundreds of species, they found an overall trend of mammals with larger brains occurring at lower densities. Where different species had similar diets and body masses, brain size was found to be the deciding factor.

Dr Manuela González-Suárez, associate professor in ecological modelling at the University of Reading, who led the study, said: "Although they are associated with being smarter, we found that bigger brains may actually hold mammals back from becoming the most abundant organisms in an area. This may be because bigger brains require more food and other resources, and therefore more space, to sustain them.

"Understanding which animals are more abundant in different areas is important for conservation. Low densities make species more likely to become extinct, while higher local abundance can increase exposure to some threats like roads.

"Brain size is not the only thing that influences mammal abundance. Different environments have different levels of stability and competing species, so these will also have an impact. Further research is required to see how the effect of brain size varies in these different environments.

"There are also some exceptions to the rule. For example, humans appear to have used their advanced intelligence to overcome resource limitations, through agriculture and food production. We can import foods from halfway round the world to allow us to theoretically live almost anywhere in large numbers. Some other brainy species may also be able to partially overcome these limitations."

Although body size and diet are known to influence population densities, scientists had previously disagreed over whether bigger brains increased population densities in local areas by allowing creatures to exploit new resources, or decreased them due to requiring additional resources.

In the new study, published in the Journal of Animal Ecology, the team tested the relationship between brain size, body mass, diet and population density for 656 non-flying terrestrial mammal species.
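The published analysis accounted for phylogeny, but the core relationship being tested can be sketched as a simple multiple regression on log-transformed traits. The sketch below is illustrative only: it uses randomly generated data and ordinary least squares rather than the study's phylogenetic models and 656-species dataset, and the coefficients are assumptions chosen to mimic the reported direction of the effect.

```python
# Illustrative sketch only: an ordinary least-squares version of the kind of
# model tested, log(population density) ~ log(body mass) + log(brain mass).
# Data are randomly generated; the study used phylogenetic models on real
# trait data for 656 non-flying terrestrial mammal species.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 200
log_body = rng.normal(0.0, 1.5, n)                      # log body mass
log_brain = 0.75 * log_body + rng.normal(0.0, 0.3, n)   # brains scale with body size
# Assumed signs for illustration: density falls with body mass and brain mass.
log_density = 2.0 - 0.7 * log_body - 0.5 * log_brain + rng.normal(0.0, 0.5, n)

X = np.column_stack([log_body, log_brain])
model = LinearRegression().fit(X, log_density)
print("coefficients (body, brain):", model.coef_)
print("intercept:", model.intercept_)
```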

Analysis revealed larger mammals with bigger brains and specialised diets were likely to be less locally abundant. The trend was particularly strong for primates and meat-eating mammals, but less clear in rodents and marsupials.

Examples from the study included the Barbary macaque - the species of monkey found in Gibraltar - which has an average body weight of 11 kg and a brain weighing 95 g, and whose average population density is 36 individuals per square kilometre. This density is nearly three times that of the siamang - a species of gibbon - which has the same average body weight and diet but a larger brain, weighing 123 g, and an average population density of 14 individuals per square kilometre.

Credit: 
University of Reading

Genetic engineering without unwanted side effects helps fight parasites

Around a third of the world's population carries Toxoplasma gondii, a parasite that puts people with a weakened immune system at risk and can trigger malformations in the womb. The single-celled pathogen also leads to economic losses in agriculture, with toxoplasmosis increasing the risk of abortion among sheep, for example.

The parasite has a complex life cycle and infests virtually all warm-blooded creatures, including wild rodents and birds. It is introduced into livestock, and thus into humans, exclusively via cats. Only in this main host do the infectious stages form; they are shed with the feces into the environment as encapsulated oocysts and from there enter the food chain.

"If we succeed in preventing the production of these oocysts, we can reduce the occurrence of toxoplasmosis among humans and animals," says Adrian Hehl, professor of parasitology and Vice Dean of Research and Academic Career Development at the University of Zurich's Vetsuisse Faculty. He and his research group have developed methods making an intervention of this sort possible.

Live vaccine protects cats from natural infection

In earlier research, the team already identified various genes that are responsible for the formation of oocysts. This has enabled them to develop a live vaccine for toxoplasmosis: the researchers can use the CRISPR-Cas9 gene editing scissors to switch off these essential genes and infect or inoculate cats with the modified parasites. These pathogens do not produce infectious oocysts, but still protect cats from natural infection with Toxoplasma in the wild.

Manipulation without side-effects

To make the sterile parasites, the researchers used the CRISPR-Cas9 gene editing scissors. While this enables precise modifications to the genetic material, the method as generally used can also have disadvantages, depending on the protocol: errors and unintended genetic alterations can creep in. Now the research group around Hehl reports that in Toxoplasma, such unwanted side effects can be avoided using a modified technique.

For CRISPR-Cas9 gene editing, scientists usually insert a ring-shaped piece of DNA, a so-called plasmid, into the cell. This contains all the information necessary to create the gene scissors and the elements that recognize the desired place in the genetic material. The cell thus produces all the components of the gene scissors itself. Afterwards, however, the plasmid remains in the cell and can trigger additional, unplanned genetic changes.
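For orientation, the "elements that recognize the desired place" are a guide RNA whose roughly 20-nucleotide spacer matches genomic sequence lying immediately upstream of an NGG PAM site (for the commonly used SpCas9 enzyme). The sketch below is a generic illustration of that targeting logic on a made-up sequence; it is not the actual guides or genes used in the Toxoplasma work.

```python
# Generic illustration of SpCas9 target-site selection: find NGG PAM sites in a
# DNA sequence and report the 20-nt protospacer immediately upstream of each.
# The example sequence is hypothetical, not from Toxoplasma gondii.

def find_spcas9_targets(seq, spacer_len=20):
    seq = seq.upper()
    targets = []
    for i in range(spacer_len, len(seq) - 2):
        if seq[i + 1:i + 3] == "GG":          # N-G-G PAM starting at position i
            targets.append((i - spacer_len, seq[i - spacer_len:i], seq[i:i + 3]))
    return targets

example = "ATGCGTACCGGTTAGCTAGCTAGGCTTACGGATCCGTAGCTAGCTAGGCATCGA"  # hypothetical
for start, spacer, pam in find_spcas9_targets(example):
    print(f"protospacer at {start}: {spacer}  PAM: {pam}")
```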

Gene scissors disappear without a trace

The method used by the Zurich team works differently. The researchers assemble the preprogrammed gene scissors outside the cell and then implant them directly into the parasites. After the genetic material has been manipulated, the components are very rapidly broken down completely, with only the desired edit remaining.

"Our approach isn't just quicker, cheaper and more efficient than conventional methods. It also enables the genomic sequence to be altered without leaving traces in the cell," explains Hehl. "This means we can now manufacture experimental live vaccines without plasmids or building in resistance genes."

Genetic engineering legislation lags behind

Given these results, Hehl questions the federal government's plans to make CRISPR-Cas9 genome editing subject to the existing law on genetic engineering (and the moratorium, which has been extended to 2025): "Our method is a good example of how this new technology differs from conventional approaches to genetic engineering." He says that it is now possible to inactivate a gene without leaving unwanted traces in the genetic material, in a way that is indistinguishable from naturally occurring mutations. Unlike many other controversial applications of genetic engineering, this procedure does not affect the production of food and thus does not constitute a direct intervention in the food chain.

Credit: 
University of Zurich

A groggy climate giant: subsea permafrost is still waking up after 12,000 years

image: Artistic diagram of the subsea and coastal permafrost ecosystems, emphasizing greenhouse gas production and release.

Image: 
Original artwork created for this study by Victor Oleg Leshyk at Northern Arizona University.

In the far north, as the last ice age ended, the swelling Arctic Ocean inundated vast swaths of coastal tundra and steppe ecosystems. Though the ocean water was only a few degrees above freezing, it started to thaw the permafrost beneath it, exposing billions of tons of organic matter to microbial breakdown. The decomposing organic matter began producing CO2 and CH4, two of the most important greenhouse gases.

Though researchers have been studying degrading subsea permafrost for decades, difficulties in collecting measurements and in sharing data across international and disciplinary divides have prevented an overall estimate of the amount of carbon involved and its rate of release. A new study led by Ph.D. candidate Sara Sayedi and senior researcher Dr. Ben Abbott at Brigham Young University (BYU), published in the IOP Publishing journal Environmental Research Letters, sheds light on the subsea permafrost climate feedback, generating the first estimates of circumarctic carbon stocks, greenhouse gas release, and the possible future response of the subsea permafrost zone.

Sayedi and an international team of 25 permafrost researchers worked under the coordination of the Permafrost Carbon Network (PCN), which is supported by the U.S. National Science Foundation. The researchers combined findings from published and unpublished studies to estimate the size of the past and present subsea carbon stock and how much greenhouse gas it might produce over the next three centuries.

Using a methodology called expert assessment, which combines multiple, independent plausible values, the researchers estimated that the subsea permafrost region currently traps 60 billion tons of methane and contains 560 billion tons of organic carbon in sediment and soil. For reference, humans have released a total of about 500 billion tons of carbon into the atmosphere since the Industrial Revolution. This makes the subsea permafrost carbon stock a potential giant ecosystem feedback to climate change.

"Subsea permafrost is really unique because it is still responding to a dramatic climate transition from more than ten thousand years ago," Sayedi said. "In some ways, it can give us a peek into the possible response of permafrost that is thawing today because of human activity."

Estimates from Sayedi's team suggest that subsea permafrost is already releasing substantial amounts of greenhouse gas. However, this release is mainly due to ancient climate change rather than current human activity. They estimate that subsea permafrost releases approximately 140 million tons of CO2 and 5.3 million tons of CH4 to the atmosphere each year. This is similar in magnitude to the overall greenhouse gas footprint of Spain.
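For a rough sense of scale, the two annual figures can be combined into a single CO2-equivalent number. This is an illustrative conversion, not part of the study's analysis, and it assumes a 100-year global warming potential of about 28 for methane (a common but not unique choice); the comparison to Spain's footprint is the study's own.

```python
# Back-of-envelope CO2-equivalent of the estimated annual subsea permafrost
# release. The GWP value is an assumption for illustration.
co2_mt = 140.0       # million tons of CO2 per year (study estimate)
ch4_mt = 5.3         # million tons of CH4 per year (study estimate)
gwp100_ch4 = 28.0    # assumed 100-year global warming potential of methane

co2e_mt = co2_mt + ch4_mt * gwp100_ch4
print(f"~{co2e_mt:.0f} Mt CO2-equivalent per year")  # roughly 290 Mt CO2e
```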

The researchers found that if human-caused climate change continues, the release of CH4 and CO2 from subsea permafrost could increase substantially. However, this response is expected to occur over the next three centuries rather than abruptly. Researchers estimated that the amount of future greenhouse gas release from subsea permafrost depends directly on future human emissions. They found that under a business-as-usual scenario, warming subsea permafrost releases four times more additional CO2 and CH4 compared to when human emissions are reduced to keep warming less than 2°C.

"These results are important because they indicate a substantial but slow climate feedback," Sayedi explained. "Some coverage of this region has suggested that human emissions could trigger catastrophic release of methane hydrates, but our study suggests a gradual increase over many decades."

Even if this climate feedback is relatively gradual, the researchers point out that subsea permafrost is not included in any current climate agreements or greenhouse gas targets. Sayedi emphasized that there is still a large amount of uncertainty about subsea permafrost and that additional research is needed.

"Compared to how important subsea permafrost could be for future climate, we know shockingly little about this ecosystem," Sayedi said. "We need more sediment and soil samples, as well as a better monitoring network to detect when greenhouse gas release responds to current warming and just how quickly this giant pool of carbon will wake from its frozen slumber."

https://iopscience.iop.org/article/10.1088/1748-9326/abcc29

This research was funded by the U.S. National Science Foundation and by BYU Graduate Studies.

Summary of the key scientific points:

Subsea permafrost has been thawing since the end of the last glacial period (~14,000 years ago) when it began to be inundated by the ocean

An international team of 25 permafrost researchers estimate that the subsea permafrost region currently traps 60 billion tons of methane and 560 billion tons of organic carbon in sediment and soil. However, the exact amount of these carbon stocks remains highly uncertain.

This carbon is already being released from the subsea permafrost region, though it remains unclear whether this is a natural response to deglaciation or if anthropogenic warming is accelerating greenhouse gas production and release.

The researchers estimate that currently, the subsea permafrost region releases approximately 140 million tons of CO2 and 5.3 million tons of CH4 to the atmosphere each year. This represents a small fraction of total anthropogenic greenhouse gas emissions--approximately equal to the greenhouse gas footprint of Spain.

Experts predict a gradual increase in emissions from subsea permafrost over the next three hundred years rather than an abrupt release.

The amount of greenhouse gas increase depends on how much human emissions are reduced. Experts estimate that approximately ¾ of the extra subsea emissions can be avoided if humans actively reduce their emissions compared to a no mitigation scenario.

This climate feedback is still virtually absent from climate policy discussions, and more field observations are needed to better predict the future of this system.

Quotes from other co-authors:

"I think there are three important messages from this study. First, subsea permafrost is probably not a climate time bomb on a hair trigger. Second, subsea permafrost is a potentially large climate feedback that needs to be considered in climate negotiations. Third, there is still a huge amount that we don't know about this system. We really need additional research, including international collaboration across northern countries and research disciplines."

Dr. Ben Abbott, senior researcher on the project, Brigham Young University

"This work demonstrates the power of science synthesis and networking by bringing together experts across a range of disciplines in order to assess our state of knowledge based on observations and models currently available. While scientific work will continue to be done to test these ideas, bringing knowledge together with this expert assessment provides an important baseline for shaping future research on subsea permafrost greenhouse gas emissions."

Dr. Ted Schuur, Lead investigator of the Permafrost Carbon Network, Northern Arizona University

"This expert assessment is a crucial contribution to the scientific literature in advancing our knowledge on subsea permafrost and potential greenhouse gas emissions from this so far understudied pool. Bringing together scientists from multiple disciplines, institutions, and countries has made it possible to move beyond individual datapoints or studies providing a much more comprehensive estimate of subsea permafrost. "

Dr. Christina Schädel, Co-Investigator of the Permafrost Carbon Network, Northern Arizona University

Credit: 
IOP Publishing

Artificial intelligence predicts gestational diabetes in Chinese women

WASHINGTON--Machine learning, a form of artificial intelligence, can predict which women are at high risk of developing gestational diabetes and lead to earlier intervention, according to a new study published in the Endocrine Society's Journal of Clinical Endocrinology & Metabolism.

Gestational diabetes is a common complication during pregnancy that affects up to 15 percent of pregnant women. High blood sugar in the mother can be dangerous for the baby and lead to complications like stillbirth and premature delivery. Most women are diagnosed with gestational diabetes during the second trimester, but some women are at high risk and could benefit from earlier intervention.

"Our study leveraged artificial intelligence to predict gestational diabetes in the first trimester using electronic health record data from a Chinese hospital," said study author He-Feng Huang Ph.D. of the Shanghai Jiao Tong University School of Medicine and the International Peace Maternity and Child Health Hospital in Shanghai, China. "These findings can help clinicians identify women at high risk of diabetes in early pregnancy and start interventions such as diet changes sooner. The artificial intelligence technology will continue to improve over time and help us better understand the risk factors for gestational diabetes."

The researchers analyzed nearly 17,000 electronic health records from a Chinese hospital collected in 2017, using machine learning models to identify women at high risk for gestational diabetes. They then tested their predictions against 2018 electronic health records and found the models successfully identified who would develop gestational diabetes. The models also found an association between low body mass and gestational diabetes.
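The paper's code is not reproduced here, but the workflow it describes (train models on one year of first-trimester records, then test on the following year) can be sketched as follows. Everything in the sketch is a placeholder: the data are synthetic, the features are generic examples, and the classifier is not necessarily the one the authors chose.

```python
# Illustrative sketch of the described workflow: fit a classifier on 2017-style
# first-trimester records, evaluate on 2018-style records. All data below are
# synthetic and the features are placeholders, not the study's variables.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)

def make_records(n):
    age = rng.normal(30, 5, n)
    bmi = rng.normal(23, 4, n)
    fasting_glucose = rng.normal(4.8, 0.5, n)
    # Assumed risk relation for illustration only.
    logit = -10.0 + 0.05 * age + 0.08 * bmi + 1.0 * fasting_glucose
    y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))
    return np.column_stack([age, bmi, fasting_glucose]), y.astype(int)

X_2017, y_2017 = make_records(17000)   # "training year"
X_2018, y_2018 = make_records(5000)    # "validation year"

clf = GradientBoostingClassifier().fit(X_2017, y_2017)
auc = roc_auc_score(y_2018, clf.predict_proba(X_2018)[:, 1])
print(f"held-out AUC: {auc:.2f}")
```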

Credit: 
The Endocrine Society

Regulatory RNAs promote breast cancer metastasis

image: Certain regulatory RNAs are involved in cancer cell movement and metastasis. On the right, cancer cells are on the move, with long, stiff actin filaments (green) acting like fingers that help them move. Paxillin (red) collects in patches at the cell's edge, sticking it to a surface. On the left, cells lacking one such regulatory RNA, MaTAR25, are flat. Their actin filaments and paxillin are disrupted (speckles instead of filaments or patches), so the cells are unable to move efficiently, reducing their ability to escape a tumor and metastasize.

Image: 
Kung-Chi Chang/Spector lab, 2020

Cold Spring Harbor Laboratory (CSHL) scientists have discovered a gene-regulating snippet of RNA that may contribute to the spread of many breast cancers. In animal experiments, the researchers could reduce the growth of metastatic tumors with a molecule designed to target that RNA and trigger its destruction. The same strategy, they say, could be used to develop a new breast cancer treatment for patients.

The study, led by CSHL Professor and Director of Research David Spector, was reported in the journal Nature Communications. In 2016, Spector and colleagues identified dozens of RNA molecules that were more prevalent in breast cancer cells than in noncancerous cells of the same type. All were long, non-coding RNAs (lncRNAs)--RNA molecules that do not encode proteins and are thought to play various regulatory roles inside cells. The current study investigated how one of these, Mammary Tumor-Associated RNA 25 (MaTAR 25), impacted breast cancer cells' behavior in mice.

Experiments by Kung-Chi Chang, a graduate student in Spector's lab, indicate the molecule contributes to cancer's progression in several ways--revving up cells' growth as well as their ability to migrate and invade tissue. These effects may be due to changes in the activity of the tensin1 gene, which the team found is one of MaTAR 25's targets. Tensin1 helps connect a cell's internal cytoskeleton to the external matrix that surrounds it and is therefore positioned to influence a cell's movement as well as its growth-regulating pathways.

[Watch "Metastatic cancer cells on the move. Or not.": https://www.youtube.com/watch?v=oSxFfM5n-bE]

To eliminate MaTAR 25, the researchers designed a small piece of nucleic acid that recognizes and binds to its sequence. Once bound, that molecule, known as an antisense oligonucleotide, alerts an enzyme inside cells to destroy the lncRNA. When the researchers injected this molecule into the bloodstream of mice, it reached tumor cells and degraded most of the MaTAR 25, with dramatic effects. Spector said:

"When we did histology on the tumors, we found that they were very necrotic, meaning there was a lot of cell death after this RNA was degraded. And obviously, that's an important finding, but equally, if not more important, we found a very significant reduction in metastasis to the lungs. So this, you know, really gave us some very exciting data that this RNA molecule has some potential as a therapeutic target."

Spector's team found that high levels of an analogous RNA called LINC01271 are associated with more aggressive disease in patient breast tumors. They are now investigating whether an antisense oligonucleotide that targets LINC01271 can interfere with tumor growth and metastases in patient-derived breast cancer models.

Credit: 
Cold Spring Harbor Laboratory

NIH neuroscientists isolate promising mini antibodies against COVID-19 from a llama

image: NIH scientists showed that anti-COVID-19 nanobodies from a llama may be an effective tool in the battle against the COVID-19 virus.

Image: 
Courtesy of Brody lab NIH/NINDS.

National Institutes of Health researchers have isolated a set of promising, tiny antibodies, or "nanobodies," against SARS-CoV-2 that were produced by a llama named Cormac. Preliminary results published in Scientific Reports suggest that at least one of these nanobodies, called NIH-CoVnb-112, could prevent infections and detect virus particles by grabbing hold of SARS-CoV-2 spike proteins. In addition, the nanobody appeared to work equally well in either liquid or aerosol form, suggesting it could remain effective after inhalation. SARS-CoV-2 is the virus that causes COVID-19.

The study was led by a pair of neuroscientists, Thomas J. "T.J." Esparza, B.S., and David L. Brody, M.D., Ph.D., who work in a brain imaging lab at the NIH's National Institute of Neurological Disorders and Stroke (NINDS).

"For years TJ and I had been testing out how to use nanobodies to improve brain imaging. When the pandemic broke, we thought this was a once in a lifetime, all-hands-on-deck situation and joined the fight," said Dr. Brody, who is also a professor at Uniformed Services University for the Health Sciences and the senior author of the study. "We hope that these anti-COVID-19 nanobodies may be highly effective and versatile in combating the coronavirus pandemic."

A nanobody is a special type of antibody naturally produced by the immune systems of camelids, a group of animals that includes camels, llamas, and alpacas. On average, these proteins are about a tenth the weight of most human antibodies. This is because nanobodies isolated in the lab are essentially free-floating versions of the tips of the arms of heavy chain proteins, which form the backbone of a typical Y-shaped human IgG antibody. These tips play a critical role in the immune system's defenses by recognizing proteins on viruses, bacteria, and other invaders, also known as antigens.

Because nanobodies are more stable, less expensive to produce, and easier to engineer than typical antibodies, a growing body of researchers, including Mr. Esparza and Dr. Brody, have been using them for medical research. For instance, a few years ago scientists showed that humanized nanobodies may be more effective at treating an autoimmune form of thrombotic thrombocytopenic purpura, a rare blood disorder, than current therapies.

Since the pandemic broke, several researchers have produced llama nanobodies against the SARS-CoV-2 spike protein that may be effective at preventing infections. In the current study, the researchers used a slightly different strategy than others to find nanobodies that may work especially well.

"The SARS-CoV-2 spike protein acts like a key. It does this by opening the door to infections when it binds to a protein called the angiotensin converting enzyme 2 (ACE2) receptor, found on the surface of some cells," said Mr. Esparza, who is also an employee of the Henry M. Jackson Foundation for the Advancement of Military Medicine and the lead author of the study. "We developed a method that would isolate nanobodies that block infections by covering the teeth of the spike protein that bind to and unlock the ACE2 receptor."

To do this, the researchers immunized Cormac five times over 28 days with a purified version of the SARS-CoV-2 spike protein. After testing hundreds of nanobodies they found that Cormac produced 13 nanobodies that might be strong candidates.

Initial experiments suggested that one candidate, called NIH-CoVnb-112, could work very well. Test tube studies showed that this nanobody bound to the ACE2 receptor-binding portion of the spike protein 2 to 10 times more strongly than nanobodies produced by other labs. Other experiments suggested that the NIH nanobody stuck directly to the ACE2 receptor-binding portion of the spike protein.

Then the team showed that the NIH-CoVnB-112 nanobody could be effective at preventing coronavirus infections. To mimic the SARS-CoV-2 virus, the researchers genetically mutated a harmless "pseudovirus" so that it could use the spike protein to infect cells that have human ACE2 receptors. The researchers saw that relatively low levels of the NIH-CoVnb-112 nanobodies prevented the pseudovirus from infecting these cells in petri dishes.

Importantly, the researchers showed that the nanobody was equally effective in preventing the infections in petri dishes when it was sprayed through the kind of nebulizer, or inhaler, often used to help treat patients with asthma.

"One of the exciting things about nanobodies is that, unlike most regular antibodies, they can be aerosolized and inhaled to coat the lungs and airways," said Dr. Brody.

The team has applied for a patent on the NIH-CoVnB-112 nanobody.

"Although we have a lot more work ahead of us, these results represent a promising first step," said Mr. Esparza. "With support from the NIH we are quickly moving forward to test whether these nanobodies could be safe and effective preventative treatments for COVID-19. Collaborators are also working to find out whether they could be used for inexpensive and accurate testing."

Credit: 
NIH/National Institute of Neurological Disorders and Stroke

Mapping out a transient atom

image: A view into the atomic-like quantum systems (AQS) experimental station at the SQS scientific instrument of the European XFEL, where the experiment was carried out.

Image: 
European XFEL

An international team from Germany, Sweden, Russia and the USA, led by scientists from European XFEL, has published the results of an experiment that could provide a blueprint for the analysis of transition states in atoms and molecules. This would open up new opportunities to gain insight into important processes such as photocatalysis, elementary steps in photosynthesis, and radiation damage.

It was the very first user experiment carried out at European XFEL's Small Quantum System (SQS) instrument. The scientists used high-resolution electron spectroscopy to capture a snapshot of the short-lived transient state produced when X-rays punch a hole in the very core of the atomic electron cloud. The results of the study, which was carried out on neon atoms, are the starting point for the analysis of transient states and have been published in Physical Review X.

The extremely short-lived transient state of core-excited neon lasts for just 2.4 femtoseconds. To put a femtosecond in context: a femtosecond is to a second as a second is to about 31.71 million years. "The European XFEL allows us to use a high number of laser pulses per second and a high pulse energy. This means we can bring a very high number of photons to the sample, which is crucial for probing such transient atomic states," explains Tommaso Mazza, the lead author of the paper.
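(To check the femtosecond comparison above: one second contains $10^{15}$ femtoseconds, and $10^{15}\ \mathrm{s} \div 3.15\times 10^{7}\ \mathrm{s/yr} \approx 3.17\times 10^{7}$ years, i.e. roughly 31.7 million years.)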

"We used intense X-ray pulses to first remove the electrons from the inner shell, or core, of a neon atom and then used a second photon from the same X-ray pulse to map out the 'hollow' atom," says Mazza. "This is the first time scientists are able to obtain information of the electronic structure of this core-hole transient state by X-ray induced electron spectroscopy, and, more precisely, by measuring the energy of the electrons emitted after the excitation by the second photon while smoothly changing the wavelength of the X-ray pulses," he adds.

Leading Scientist at SQS Michael Meyer underlines that the results of this paper along with a paper recently published in Science show the outstanding possibility to efficiently control and probe excitations of specific electronic subshells at the SQS instrument. "We can enable atomic or element specific excitations in molecular targets and independently investigate for each atom the influence on the photon-induced molecular dynamics," he says. Targeting a specific atom in a molecule allows scientists to gain deeper understanding of the behavior of individual building blocks in the molecular assembly under intense irradiation.

The European XFEL in the Hamburg area is a large international X-ray laser facility. Its 27,000 X-ray flashes per second and their high brilliance open up completely new opportunities for science. Research groups from around the world are able to map the atomic details of viruses, decipher the molecular composition of cells, take three-dimensional "photos" of the nanoworld, "film" chemical reactions, and study processes such as those occurring deep inside planets.

Credit: 
European XFEL