Turmeric could have antiviral properties

Curcumin, a natural compound found in the spice turmeric, could help eliminate certain viruses, research has found.

A study published in the Journal of General Virology showed that curcumin can prevent transmissible gastroenteritis virus (TGEV) - an alpha-group coronavirus that infects pigs - from infecting cells. At higher doses, the compound was also found to kill virus particles.

Infection with TGEV causes a disease called transmissible gastroenteritis in piglets, which is characterised by diarrhoea, severe dehydration and death. TGEV is highly infectious and is invariably fatal in piglets younger than two weeks, thus posing a major threat to the global swine industry. There are currently no approved treatments for alpha-coronaviruses and although there is a vaccine for TGEV, it is not effective in preventing the spread of the virus.

To determine the potential antiviral properties of curcumin, the research team treated experimental cells with various concentrations of the compound, before attempting to infect them with TGEV. They found that higher concentrations of curcumin reduced the number of virus particles in the cell culture.

The research suggests that curcumin affects TGEV in a number of ways: by directly killing the virus before it is able to infect the cell, by integrating with the viral envelope to 'inactivate' the virus, and by altering the metabolism of cells to prevent viral entry. "Curcumin has a significant inhibitory effect on TGEV adsorption step and a certain direct inactivation effect, suggesting that curcumin has great potential in the prevention of TGEV infection," said Dr Lilan Xie, lead author of the study and researcher at the Wuhan Institute of Bioengineering.

Curcumin has been shown to inhibit the replication of some types of virus, including dengue virus, hepatitis B and Zika virus. The compound has also been found to have a number of significant biological effects, including antitumor, anti-inflammatory and antibacterial activities. Curcumin was chosen for this research because of its low incidence of side effects, according to Dr Xie, who said: "There are great difficulties in the prevention and control of viral diseases, especially when there are no effective vaccines. Traditional Chinese medicine and its active ingredients are ideal screening libraries for antiviral drugs because of their advantages, such as convenient acquisition and low side effects."

The researchers now hope to continue their research in vivo, using an animal model to assess whether the inhibitory properties of curcumin would be seen in a more complex system. "Further studies will be required to evaluate the inhibitory effect in vivo and explore the potential mechanisms of curcumin against TGEV, which will lay a foundation for the comprehensive understanding of the antiviral mechanisms and application of curcumin," said Dr Xie.

Credit: 
Microbiology Society

Glaucoma study findings emphasise need for regular eye checks

People with early-stage glaucoma see the contrast of visible objects in a very similar way to people without the condition, a new study has shown.

Research by the University of Bradford (UK) demonstrated that the brain compensates for the changes in the eye caused by glaucoma, when looking at objects with everyday levels of contrast. The findings add to our understanding of why glaucoma patients report few early symptoms of the disease and may not seek testing until their disease is more advanced.

Glaucoma is a common eye condition, affecting half a million people in Britain, in which the optic nerve that connects the eye to the brain becomes damaged. It develops slowly over many years and affects peripheral vision first. If untreated, glaucoma results in permanent vision loss.

Glaucoma makes it harder to see contrast - the differences between shades of light and dark - so the eyes are less able to detect low contrast objects. But until now it's not been clear if this contrast sensitivity loss means that patients with glaucoma see visible objects in a different way from healthy people.

Now, the University of Bradford team has shown that people with glaucoma see detectable contrast in the same way as people with healthy vision, despite their measurable vision loss.

In the study, 20 participants with early- to moderate-stage glaucoma had their disease confirmed and their areas of peripheral vision loss mapped. They were then asked to respond to a screen display of patterned patches, adjusting the controls until an image in their poor areas of vision looked as bright or dim as a central patterned patch. An eye tracker was used to ensure each patient was looking in the correct place before the central patch could be seen. A control group of healthy participants was tested in the same way.

The researchers found that participants with glaucoma didn't see the image as paler, or 'greyed out' in any way; instead they saw it in exactly the same way as people with healthy vision. The results suggest that glaucoma patients' brains are compensating for damage to the optic nerve.

Dr Jonathan Denniss, a qualified optometrist and lecturer at the University of Bradford, led the study. He said: "This underlines why it's so important to get eyes tested routinely so that glaucoma can be picked up before damage is established. It ties in with the fact that people who have glaucoma initially don't report any symptoms: their brains are successfully overcoming a loss of contrast sight."

He added: "It's always struck me as strange that we all accept the need for routine dental checks to maintain the health of our teeth and mouth, but that routine eye checks among the general population are not considered as important. This is a reminder to get your eyes checked regularly, even if they seem to be fine."

Credit: 
University of Bradford

Key technology for mass-producing bio-aviation fuels from lignin to reduce greenhouse gas emissions

image: The team, led by Dr. Jeong-Myeong Ha of the Clean Energy Research Center at the Korea Institute of Science and Technology (KIST), has developed a technology that can be used to mass-produce aviation-grade fuels from wood wastes. The ability to produce aviation-grade fuel from wood waste is expected to help international aviation companies comply with the strict new emissions regulations scheduled to go into effect in 2027.

Image: 
Korea Institute of Science and Technology (KIST)

A Korean research team has developed a key technology for the mass-production of bio-aviation fuels. The team, led by Dr. Jeong-Myeong Ha of the Clean Energy Research Center at the Korea Institute of Science and Technology (KIST), has announced its successful development of a technology that can be used to mass-produce aviation-grade fuels from wood wastes. The ability to produce aviation-grade fuel from oil derived from wood waste--which up until now has been difficult due to the high viscosity of the oil--is expected to help international aviation companies comply with the strict new emissions regulations*, which are scheduled to go into effect in 2027.

*The ICAO (International Civil Aviation Organization) intends to regulate aviation industry greenhouse gas emissions starting in 2027.

Lignin constitutes 20 to 40 percent of lignocellulosic biomass such as wood and grass. Large volumes of lignin are generated as waste in the pulping processes used to produce paper. The pyrolysis of lignin produces an oil, which has little industrial utility due to its high viscosity. For this reason, lignin waste is typically used by paper mills as a low-grade boiler fuel, rather than as a high-grade fuel or as a raw material for chemical products.

The research team, led by Dr. Jeong-Myeong Ha of KIST, used hydrocracking** to prepare a hydrocracked lignin oil, which was then blended with raw lignin oil at a ratio of 7:3 (raw lignin oil to hydrocracked oil). Blending reduced the viscosity of the oil to roughly one-seventh of its original value (from 750 cP to 110 cP; for comparison, the viscosity of water is about 1 cP and that of cooking oil about 80 cP), allowing it to be used for industrial purposes.

** Hydrocracking: a technique used in the petrochemical and refining industry to break down crude oil fractions that are difficult to use directly as fuel.

The mixed oil prepared in this way can be recycled back into the hydrocracking process, enabling a continuous process for mass-producing bio-aviation fuels. Furthermore, the final fuel product is similar in composition to jet fuel: it has a lower freezing point than gasoline or diesel and a high energy density, making it well suited for use as a bio-aviation fuel.

Dr. Jeong-Myeong Ha of the KIST, who led the research, commented on the team's research, saying, "Despite the digital revolution, a sharp increase in global parcel volumes supports the global paper production. Conventional chemical reaction methods were unable to convert the large volumes of lignin wastes from paper mills into high quality fuels, but our research has opened up the potential for the mass-production of jet fuels from the otherwise useless lignin wastes." He added, "This achievement will allow Korea to proactively meet jet fuel greenhouse emissions regulations, which will go into effect starting from 2027."

Credit: 
National Research Council of Science & Technology

Blood iron levels could be key to slowing ageing, gene study shows

Genes linked to ageing that could help explain why some people age at different rates to others have been identified by scientists.

The international study using genetic data from more than a million people suggests that maintaining healthy levels of iron in the blood could be a key to ageing better and living longer.

The findings could accelerate the development of drugs to reduce age-related diseases, extend healthy years of life and increase the chances of living to old age free of disease, the researchers say.

Scientists from the University of Edinburgh and the Max Planck Institute for Biology of Ageing in Germany focused on three measures linked to biological ageing - lifespan, years of life lived free of disease (healthspan), and being extremely long-lived (longevity).

Biological ageing - the rate at which our bodies decline over time - varies between people and drives the world's most fatal diseases, including heart disease, dementia and cancers.

The researchers pooled information from three public datasets to enable an analysis in unprecedented detail. The combined dataset was equivalent to studying 1.75 million lifespans or more than 60,000 extremely long-lived people.

The team pinpointed ten regions of the genome linked to long lifespan, healthspan and longevity. They also found that gene sets linked to iron were overrepresented in their analysis of all three measures of ageing.

The researchers confirmed this using a statistical method - known as Mendelian randomisation - that suggested that genes involved in metabolising iron in the blood are partly responsible for a healthy long life.
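
Mendelian randomisation uses genetic variants as natural "instruments": because variants are fixed at conception, a variant that raises blood iron levels can be used to estimate the causal effect of iron on lifespan, largely free of confounding by lifestyle. As a rough illustration of the idea only - not the study's actual pipeline - the Python sketch below combines per-variant Wald ratios (the variant-outcome effect divided by the variant-exposure effect) with inverse-variance weighting; the effect sizes are made-up placeholders.

```python
import numpy as np

# Illustrative Mendelian randomisation sketch (not the published analysis).
# Each entry is one genetic variant: its effect on the exposure (blood iron),
# its effect on the outcome (e.g. lifespan), and the outcome standard error.
# All numbers are invented placeholders for demonstration.
beta_exposure = np.array([0.12, 0.08, 0.20])        # variant -> iron
beta_outcome = np.array([-0.030, -0.018, -0.055])   # variant -> lifespan
se_outcome = np.array([0.010, 0.008, 0.015])

# Wald ratio per variant: the causal effect of iron implied by that variant.
wald = beta_outcome / beta_exposure
wald_se = se_outcome / np.abs(beta_exposure)

# Inverse-variance-weighted (IVW) combination across variants.
weights = 1.0 / wald_se**2
ivw_estimate = np.sum(weights * wald) / np.sum(weights)
ivw_se = np.sqrt(1.0 / np.sum(weights))

print(f"IVW causal estimate: {ivw_estimate:.3f} +/- {ivw_se:.3f}")
```

A consistently negative estimate across many iron-raising variants would point in the same direction as the study's conclusion, although the published analysis is far more extensive than this toy calculation.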

Blood iron is affected by diet and abnormally high or low levels are linked to age-related conditions such as Parkinson's disease, liver disease and a decline in the body's ability to fight infection in older age.

The researchers say that designing a drug that could mimic the influence of genetic variation on iron metabolism could be a future step to overcome some of the effects of ageing, but caution that more work is required.

The study was funded by the Medical Research Council and is published in the journal Nature Communications with DOI 10.1038/s41467-020-17312-3.

Anonymised datasets linking genetic variation to healthspan, lifespan, and longevity were downloaded from the publicly available Zenodo, Edinburgh DataShare and Longevity Genomics servers.

Dr Paul Timmers from the Usher Institute at the University of Edinburgh, said: "We are very excited by these findings as they strongly suggest that high levels of iron in the blood reduces our healthy years of life, and keeping these levels in check could prevent age-related damage. We speculate that our findings on iron metabolism might also start to explain why very high levels of iron-rich red meat in the diet has been linked to age-related conditions such as heart disease."

Dr Joris Deelen from the Max Planck Institute for Biology of Ageing in Germany, said: "Our ultimate aim is to discover how ageing is regulated and find ways to increase health during ageing. The ten regions of the genome we have discovered that are linked to lifespan, healthspan and longevity are all exciting candidates for further studies."

Credit: 
University of Edinburgh

When many act as one, data-driven models can reveal key behaviors

image: Sample trajectories of the paths traveled by fluorescently tagged Myxococcus xanthus cells that were aggregating in mounds. The trajectories are superimposed on a fluorescent image in which the aggregates appear white.

Image: 
Image courtesy of C. Cotter/UGA

HOUSTON -- (July 16, 2020) -- Biology is rife with examples of collective behavior, from flocks of birds and colonies of bacteria to schools of fish and mobs of people. In a study with implications from oncology to ecology, researchers from Rice University and the University of Georgia have shown that data science can unlock subtle clues about the individual origins of collective behavior.

When a group of individuals move in synch, they can create patterns -- like the flocking of birds or "the wave" in a sports stadium -- that no single individual could make. While these emergent behaviors can be fascinating, it can be difficult for scientists to zero in on the individual actions that bring them about.

"You see emergent behaviors by looking at the group rather than the individual," said Rice bioengineer Oleg Igoshin, a theoretical biophysicist who has spent almost 20 years studying emergent behavior in cells -- be they cooperative bacteria, cancer cells or others.

In a study published online this week in the American Society for Microbiology journal mSystems, Igoshin and Rice alumnus Zhaoyang Zhang developed a method to assess which aspects of individual behavior give rise to emergent behavior.

To illustrate both the difficulty and the importance of understanding these individual contributions to the collective group, Igoshin uses the example of metastatic cancer, where a group of cells with a particular mutation are moving toward the surface of a tumor so they can break away and form a new tumor elsewhere.

"Most of these cells fail to escape the original tumor, and the question is, what determines which ones will succeed?" asked Igoshin, a professor of bioengineering and senior scientist at Rice's Center for Theoretical Biological Physics. "What property is a signal for emergence? Is it how fast they move? Is it how long they move before changing direction? Perhaps it's how frequently they stop. Or it could be a combination of several signals, each of which is too weak to bring about emergence on its own but which act to reinforce one another."

As a group, the potentially metastatic cancer cells share some key traits and abilities, but as individuals, their performance can vary. And in a large population of cells, these performance differences can be as stark as those between Olympic athletes and couch potatoes. Above all, it is this natural variation in individual performance, or heterogeneity, that makes it so difficult to zero in on individual behaviors that contribute to emergent behaviors, Igoshin said.

"Even for cells in a genetically homogeneous tumor, if you look at individuals there will be a distribution, some heterogeneity in performance that arises from some individuals performing 50% above average and others 50% below average," he said. "So the question is, 'With all of this background noise, how can we find the weak trends or signals associated with emergence?'"

Igoshin said the new method incorporates data science to overcome some weaknesses of traditional modeling. By populating their models with experimental data about the movements of individual cells, Igoshin said he and Zhang, who received his Ph.D. from Rice in May, simplified the search for individual behaviors that influence group behaviors.
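
The general structure of such a data-driven model can be sketched simply: rather than assuming identical cells, each simulated cell draws its own movement parameters (speed, stopping tendency and so on) from distributions measured in experiments, and the group-level outcome is then compared with and without a candidate individual behavior. The toy Python model below uses invented parameter distributions and a deliberately crude aggregation rule; it illustrates only the structure of the approach, not the authors' actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(n_cells=500, n_steps=1000, bias=0.05):
    """Toy agent-based aggregation model. Each cell gets its own speed and
    stopping probability drawn from placeholder 'experimental' distributions,
    performs a random walk, and (optionally) drifts weakly toward an
    aggregate at the origin. Returns the fraction of cells ending up inside
    the aggregate."""
    speeds = rng.lognormal(mean=0.0, sigma=0.5, size=n_cells)   # heterogeneous speeds
    p_stop = rng.uniform(0.0, 0.3, size=n_cells)                # heterogeneous stopping
    pos = rng.uniform(-50, 50, size=(n_cells, 2))               # initial positions

    for _ in range(n_steps):
        moving = rng.random(n_cells) > p_stop                   # stopped cells stay put
        angles = rng.uniform(0.0, 2.0 * np.pi, size=n_cells)
        step = np.stack([np.cos(angles), np.sin(angles)], axis=1) * speeds[:, None]
        toward = -pos / (np.linalg.norm(pos, axis=1, keepdims=True) + 1e-9)
        pos += np.where(moving[:, None], step + bias * speeds[:, None] * toward, 0.0)

    return float(np.mean(np.linalg.norm(pos, axis=1) < 10.0))

print("aggregated fraction with directional bias:   ", simulate(bias=0.05))
print("aggregated fraction without directional bias:", simulate(bias=0.0))
```

Comparing the two runs shows how a weak individual-level signal (here, a small directional bias) can be separated from the background heterogeneity by its effect on the collective outcome.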

To demonstrate the technique, they partnered with Lawrence Shimkets, whose lab at the University of Georgia (UGA) has spent years compiling data about the individual and group behaviors of the cooperative soil bacterium Myxococcus xanthus.

"They're predatory microbes, but they are smaller than many of the things they eat," Igoshin said of M. xanthus. "They work together, kind of like a wolf pack, to surround their prey and make the chemicals that will kill it and digest it outside of their bodies, turning it into molecules that are small enough for them to take in."

During times of stress, like when food is running short, M. xanthus exhibit a form of emergent behavior that has been studied for decades. Like lines of cars flowing into a city at rush hour, they stream together to form densely packed mounds that are large enough to see with the naked eye. Mound formation is an early step in the process of forming rugged long-lived spores that can reestablish the colony when conditions improve.

In a previous study, UGA's Chris Cotter, a co-author of the new study and a graduate student in Shimkets' group, tracked individual behaviors of wild-type cells and collaborated with Igoshin to develop a data-driven model that uncovers the cellular behaviors that are key to aggregation. In the new study, Cotter and Zhe Lyu, a former UGA postdoctoral researcher now at Baylor College of Medicine in Houston, collected mound-forming data from mixtures of three strains of M. xanthus: a naturally occurring wild type and two mutants. On their own, the mutants were incapable of forming mounds. But when a significant number of wild-type cells were mixed with the mutants, they were "rescued," meaning they integrated with the collective and took part in mound-building.

"One of the mutants is fully rescued while the other is only partially rescued, and the goal is to understand how rescue works," Igoshin said. "When we applied the methodology, we saw several things that were unexpected. For example, for the mutant that's fully rescued, you might expect that it behaves normally, meaning that all its properties -- its speeds, its behaviors -- will be exactly the same as the wild type. But that's not the case. What we found was that the mutant performed better than normal in some respects and worse in others. And those compensated for one another so that it appeared to be behaving normally."

Emergent behaviors in M. xanthus are well-studied, classic examples. In addition to showing that their method can unlock some of the mysteries of M. xanthus' behavior, Igoshin said the new study indicates the method can be used to investigate other emergent behaviors, including those implicated in diseases and birth defects.

"All we need is data on individuals and data on the emergent behavior, and we can apply this method to ask whether a specific type of individual behavior contributes to the collective, emergent behavior," he said. "It doesn't matter what kind of cell it is, and I think it could even be applied to study animals in ecological models. For instance, ecologists studying migration of a species into a new territory often collect GPS-tracking data. In principle, with enough data on individual behavior, you should be able to apply this approach to study collective, herd-level behaviors."

Credit: 
Rice University

Publicizing police killings of unarmed black people causes emotional trauma, says Rutgers study

A majority of college students of color show symptoms of post-traumatic stress disorder after watching social media videos of unarmed Black men being killed by police, a Rutgers study finds.

The study, published in the Journal of Black Studies, surveyed 134 college students in the United States, between the ages of 18 and 24, 77 percent of whom were Black or Latinx.

Researchers examined the students' engagements with police brutality videos, their reactions to police killings of unarmed Black men and boys, their own encounters with police and their perceptions about police violence.

They found that 90 percent of the students had observed several acts of police violence on various social media platforms, with many reporting that the videos were hard to watch and others expressing anger, frustration and fear over them.

Seventy-five percent of the students reported getting stopped by the police: 79 percent believed race was a factor and most felt high levels of anxiety during these encounters. Even the students who did not have direct interactions with the police were still fearful because they identified with victims they watched being killed by police and shared on social media.

The findings add to growing data by the U.S. Census Bureau that race or ethnicity is a factor in being stopped by police and other related police violence.

"In communities of color, we are already mistrustful of law enforcement, watching police kill individuals who look like us or members of our families is traumatic," said Felicia Campbell, the primary lead author, and Lecturer at the Yale School of Medicine.

The study found 65 percent of students surveyed reported police violence is an issue in their hometowns, and 63 percent admitted getting coached by family members on how to handle police encounters.

Previous studies have shown that Black people are three times more likely to be killed by police than white people and five times more likely to be unarmed when killed. Studies also show that police are more likely to draw and shoot their guns after seeing a Black person than a white person. Stereotypes of Black people, especially men, as inherently criminal may contribute to police killing Black people at disproportionate rates, studies suggest.

"Higher education is no longer immune to Black Lives Matter's message and can't ignore police brutality that affects students of color," said Pamela Valera, an Assistant Professor at Rutgers School of Public Health. "It impacts the mental health of these students."

Researchers said police reform is needed to stop the killing of Black men, women and children by law enforcement. They added that technology, which has led to an increase of videos of deadly encounters between police and unarmed Black men, has helped to promote calls for reform.

Credit: 
Rutgers University

Pre-brain surgery test protects language in some tumors

image: nrTMS involves stimulating the brain with rapid magnetic pulses to determine where the language centers are.

Image: 
Kazuya Motomura

A technique that maps a patient's language centers before going into surgery works best when their brain tumor is not in those areas. The finding, published by Nagoya University researchers in the journal Scientific Reports, refines understanding of the test's effectiveness and could help improve surgical planning.

Gliomas are common brain tumors that require extensive removal for patient survival. They often occur in regions involved in motor and language functions, so surgeons do their best to remove the tumors while protecting these vital regions. The language center is usually on the left side of the brain, but individual differences do exist.

Currently, surgeons try to protect the language regions by keeping the patient awake during surgery--the brain does not contain pain receptors--and applying direct cortical stimulation to various points on the brain while the patient simultaneously names what they see in an image. When an area involved in language is stimulated, the patient is unable to name the image. This method is very accurate for mapping the brain's language regions, but surgeons are looking for similarly accurate methods that can be done before an operation. This would help surgical planning and improve the prospects of the procedure.

Surgeons are experimenting with navigated repetitive transcranial magnetic stimulation (nrTMS) for language mapping. Before surgery, rapid magnetic stimulating pulses are applied to the head while the patient names what they see in images. However, the procedure's accuracy varies. Nagoya University neurosurgeon Kazuya Motomura and colleagues wanted to know why.

They analyzed how different clinical parameters, such as age, tumor type, tumor volume, and tumor involvement in the language centers, affect nrTMS accuracy in language mapping. They conducted nrTMS on 42 people with low-grade and 19 with high-grade gliomas. The tumor was in the left hemisphere in 50 patients and in the right in 11.

The factor that significantly impacted the procedure's accuracy in language center mapping was tumor involvement in these centers. Otherwise, nrTMS showed a good degree of reliability. The scientists also successfully used nrTMS to determine the side of the brain where the language center is mainly located in each patient.

"Our findings suggest that nrTMS language mapping could be a reliable method, particularly when the tumor is not involved in the classical language areas," says Motomura.

The approach needs further validation by testing it on larger numbers of patients in a randomized study.

Credit: 
Nagoya University

Fish reef domes a boon for environment, recreational fishing

image: Artificial reefs shown four months after installation.

Image: 
UNSW Science

In a boost for both recreational fishing and the environment, new UNSW research shows that artificial reefs can increase fish abundance in estuaries with little natural reef.

Researchers installed six manmade reefs in each estuary studied and found that overall fish abundance increased up to 20 times at each reef over a two-year period.

The study, published in the Journal of Applied Ecology recently, was funded by the NSW Recreational Fishing Trust.

The research was a collaboration between UNSW Sydney, NSW Department of Primary Industries (DPI) Fisheries and the Sydney Institute of Marine Science (SIMS).

Professor Iain Suthers, of UNSW and SIMS, led the research, while UNSW alumnus Dr Heath Folpp, of NSW DPI Fisheries, was lead author.

Co-author Dr Hayden Schilling, SIMS researcher and Conjoint Associate Lecturer at UNSW, said the study was part of a larger investigation into the use of artificial reefs for recreational fisheries improvement in estuaries along Australia's southeast coast.

"Lake Macquarie, Botany Bay and St Georges Basin were chosen to install the artificial reefs because they had commercial fishing removed in 2002 and are designated specifically as recreational fishing havens," Dr Schilling said.

"Also, these estuaries don't have much natural reef because they are created from sand. So, we wanted to find out what would happen to fish abundance if we installed new reef habitat on bare sand.

"Previous research has been inconclusive about whether artificial reefs increased the amount of fish in an area, or if they simply attracted fish from other areas nearby."

Fish reef domes boost abundance

In each estuary, the scientists installed 180 "Mini-Bay Reef Balls" - commercially made concrete domes with holes - divided into six artificial reefs with 30 units each.

Each unit measures 0.7m in diameter and is 0.5m tall, and rests on top of bare sand.

Professor Suthers said artificial reefs were becoming more common around the world and many were tailored to specific locations.

Since the study was completed, many more larger units - up to 1.5m in diameter - have been installed in NSW estuaries.

"Fish find the reef balls attractive compared to the bare sand: the holes provide protection for fish and help with water flowing around the reefs," Prof Suthers said.

"We monitored fish populations for about three months before installing the reefs and then we monitored each reef one year and then two years afterwards.

"We also monitored three representative natural reef control sites in each estuary."

Prof Suthers said the researchers observed a wide variety of fish using the artificial reefs.

"But the ones we were specifically monitoring for were the species popular with recreational fishermen: snapper, bream and tarwhine," he said.

"These species increased up to five times and, compared to the bare sand habitat before the reefs were installed, we found up to 20 times more fish overall in those locations.

"What was really exciting was to see that on the nearby natural reefs, fish abundance went up two to five times overall."

Dr Schilling said that importantly, their study found no evidence that fish had been attracted from neighbouring natural reefs to the artificial reefs.

"There was no evidence of declines in abundance at nearby natural reefs. To the contrary, we found abundance increased in the natural reefs and at the reef balls, suggesting that fish numbers were actually increasing in the estuary overall," he said.

"The artificial reefs create ideal rocky habitat for juveniles - so, the fish reproduce in the ocean and then the juveniles come into the estuaries, where there is now more habitat than there used to be, enabling more fish to survive."

The researchers acknowledged, however, that while the artificial reefs had an overall positive influence on fish abundance in estuaries with limited natural reef, there might also be species-specific effects.

For example, they cited research on yellowfin bream which showed the species favoured artificial reefs while also foraging in nearby seagrass beds in Lake Macquarie, one of the estuaries in the current study.

NSW DPI Fisheries conducted an impact assessment prior to installation to account for potential issues with using artificial reefs, including the possibility of attracting non-native species or removing soft substrate.

Artificial reef project validated

Dr Schilling said their findings provided strong evidence that purpose-built artificial reefs could be used in conjunction with the restoration or protection of existing natural habitat to increase fish abundance, benefiting recreational fishing and the restoration of urbanised estuaries.

"Our results validate NSW Fisheries' artificial reef program to enhance recreational fishing, which includes artificial reefs in estuarine and offshore locations," he said.

"The artificial reefs in our study became permanent and NSW Fisheries rolled out many more in the years since we completed the study.

"About 90 per cent of the artificial reefs are still sitting there and we now have an Honours student researching the reefs' 10-year impact."
Dr Schilling said the artificial reefs were installed between 2005 and 2007, but the research was only peer-reviewed recently.

Credit: 
University of New South Wales

Exotic neutrinos will be difficult to ferret out

image: The Cracow supercomputer Prometheus has helped researchers from the Institute of Nuclear Physics of the Polish Academy of Sciences track right-handed neutrinos. (Sources: Cyfronet, AGH)

Image: 
Sources: Cyfronet, AGH

An international team tracking the 'new physics' neutrinos has checked the data of all the relevant experiments associated with neutrino detections against Standard Model extensions proposed by theorists. The latest analysis, the first with such comprehensive coverage, shows the scale of challenges facing right-handed neutrino seekers, but also brings a spark of hope.

In all the processes involving neutrinos that have been observed, these particles exhibit a feature referred to by physicists as left-handedness. Right-handed neutrinos, which are the most natural extension of the Standard Model, are nowhere to be seen. Why? The latest, extremely comprehensive analysis, carried out by an international group of physicists including the Institute of Nuclear Physics of the Polish Academy of Sciences (IFJ PAN) in Cracow, helps to answer this question. For the first time, data from all the relevant experiments, directly and indirectly dedicated to neutrino detections, were included and checked against the ranges of parameters imposed by various theoretical extensions of the Standard Model.

The first subatomic particle, the electron, was observed more than 120 years ago. Since then, physicists have discovered a whole plethora of them. The richness of the building bricks of nature is explained on the assumption that the world consists of massive quarks, occurring in six flavours, and much less massive leptons, also in six flavours. Leptons include the electron, the muon (weighing 207 times the mass of the electron), the tau (3477 times the mass of an electron) and the corresponding three types of neutrinos.

Neutrinos interact extremely weakly with the rest of matter. They also show other features of particular importance for the shape of modern physics. It has recently been discovered that these particles oscillate, i.e. they are constantly transforming from one type to another. This phenomenon means the observed neutrinos must have a certain (though very low) mass. Meanwhile, the Standard Model, a modern theoretical tool describing subatomic particles with great accuracy, leaves no alternative: within its framework neutrinos cannot have any mass! This contradiction between theory and experiment is one of the strongest indications in favour of the existence of unknown subatomic particles. The mass of neutrinos, however, is not their only puzzling property.

"We learn about the presence of neutrinos by observing the decay products of various particles and comparing what we have recorded with what theory predicts. It turns out that in all cases indicating the presence of neutrinos, these particles always had the same helicity: 1/2, i.e. they were left-handed. This is interesting because other particles of matter can have both positive and negative spin. But there are no right-handed neutrinos to be seen! If they don't exist, then why not? And if they do, where are they hiding?" asks Dr. Marcin Chrzaszcz (IFJ PAN).

An article just published in the European Physical Journal C by an international team of physicists brings us closer to answering the above questions. Scientists from IFJ PAN, the European Organization for Nuclear Research - CERN, Université catholique de Louvain (Louvain-la-Neuve, Belgium), Monash University (Melbourne, Australia), Technische Universität München (Germany) and University of Amsterdam (Netherlands) carried out the most accurate analysis to date of the data collected in over a dozen of the most sophisticated experiments in subatomic physics, both those of a general nature and those directly dedicated to observing neutrinos (including PIENU, PS-191, CHARM, E949, NuTeV, DELPHI, ATLAS, CMS).

The researchers did not limit themselves to merely increasing the number of experiments and the amount of data processed. In their analysis, they considered the possibility of hypothetical processes proposed by theoreticians that require the presence of right-handed neutrinos. One of them was the seesaw mechanism associated with Majorana neutrinos.

In 1937, Ettore Majorana postulated the existence of a particle of matter that is its own antiparticle. Such a particle could not have an electric charge. Since neutrinos are the only particles of matter that do not carry electric charge, only a neutrino could be such a particle.

"The theory suggests that if Majorana neutrinos exist, there can also be a seesaw mechanism. This would mean that when neutrinos with one helicity state are not very massive, then neutrinos with the opposite helicity must have very large masses. So, if our neutrinos which are left-handed have very low masses, if they were to be Majorana neutrinos, in the right-handed version they would have to be massive. This would explain why we haven't seen them yet," says Dr. Chrzaszcz and adds that massive right-handed neutrinos are one of the candidates for dark matter.

The latest analysis, carried out using the specialized open-source GAMBIT package, took into account all currently available experimental data and the parameter ranges provided by various theoretical mechanisms. Numerically, it was extremely demanding. The seesaw mechanism alone meant that the calculations had to use floating-point numbers of quadruple rather than double precision. Ultimately, the volume of data reached 60 TB. The analysis had to be carried out on Prometheus, the fastest Polish computing cluster, managed by the Academic Computer Centre Cyfronet of the AGH University of Science and Technology.

The results of the analysis, financed on the Polish side from grants from the Foundation for Polish Science and the National Agency for Academic Exchange, do not inspire optimism. It turned out that despite many experiments and a huge amount of collected data, the possible parameter space was penetrated to only a small extent.

"We may find right-handed neutrinos in experiments that are just about to begin. However, if we are unlucky and right-handed neutrinos are hiding in the farthest recesses of parameter space, we may have to wait up to one hundred years for their discovery," says Dr. Chrzaszcz.

Fortunately, there is also a glimmer of hope. A trace of a potential signal was captured in the data that could be associated with right-handed neutrinos. At this stage it is very weak, and ultimately it may turn out to be just a statistical fluctuation. But what would happen if it wasn't?

"In this case, everything indicates that it would already be possible to observe right-handed neutrinos in the successor to the LHC, the Future Circular Collider. However, the FCC has a certain disadvantage: it would start working about 20 years after its approval, which in the best-case scenario may take place only next year. If it doesn't, we will have to arm ourselves with a great amount of patience before we see right-handed neutrinos," concludes Dr. Chrzaszcz.

Credit: 
The Henryk Niewodniczanski Institute of Nuclear Physics Polish Academy of Sciences

New very short-lived isotope 222Np is observed

image: Correlated α-decay chains of 222Np observed in the present work

Image: 
IMP

In a recent study, researchers at the Institute of Modern Physics (IMP) of the Chinese Academy of Sciences (CAS) and their collaborators reported the first discovery of 222Np, a new, very short-lived neptunium (Np) isotope, and validated the N = 126 shell effect in Np isotopes.

The experiment, led by Prof. GAN Zaiguo of IMP, was carried out at the Heavy Ion Research Facility in Lanzhou. The study was published in Physical Review Letters on July 13.

In the trans-lead neutron-deficient region, nuclides decay pervasively by emitting α particles. Therefore, α-decay spectroscopy, an old yet powerful tool in nuclear physics, has been employed generally to identify new heavy isotopes and to investigate the shell evolution and the nuclear structure of ground and excited states in this region.

In previous research, a weakening of the influence of the N = 126 shell closure towards Z = 92 (uranium) was observed experimentally. This raises the question of how the N = 126 shell evolves above Z = 92.

In the present work, researchers synthesized the very short-lived α-emitting isotope 222Np (N = 129), one of the Np isotopes just above the N = 126 closed shell. The new isotope was produced in a complete-fusion reaction and observed at the gas-filled recoil separator SHANS, where it was identified using recoil-α correlation measurements.

Researchers then established six α-decay chains and determined the decay properties of 222Np.

These experimental findings are important for improving the α-decay systematics of Np isotopes around N = 126. By combining the results with existing data, the researchers obtained convincing evidence for the stability of the N = 126 magic shell in Np isotopes, clarifying how robust this shell is in Np isotopes.

This work was supported by the Strategic Priority Research Program of CAS, the National Key R&D Program of China, the National Natural Science Foundation of China, etc.

Credit: 
Chinese Academy of Sciences Headquarters

Unnatural mRNAs modified with sulfur atoms boost the efficiency of protein synthesis

image: Phosphorothioate modification of mRNA increases the rate of translation initiation, and the molecular design for high translation efficiency was optimized. This result provides a useful molecular design guideline for increasing the amount of protein produced through chemical modification of mRNA.

Image: 
Nagoya University

Since mRNAs play a key role in protein synthesis in vivo, there is strong interest in using mRNAs as medicines and for in vitro protein synthesis. In particular, mRNA therapeutics are being developed for potential application in vaccine therapy(1) against coronaviruses. However, the efficiency of protein production with mRNAs in their natural form is not sufficient for certain purposes, including application to mRNA therapeutics. Therefore, mRNA molecules that allow more efficient protein production need to be developed.

A ribosome(2) repeats the following three steps to synthesize a protein in vivo using an mRNA as a template (translation reaction): 1) Initiation step: A ribosome binds to an mRNA to form a translation initiation complex; 2) Elongation step: The ribosome moves on the mRNA and links amino acids to synthesize a protein; and 3) Termination step: The protein synthesis process concludes, and the ribosome is liberated. In the translation reaction cycle, the initiation step takes the longest time.

A collaborative research group at Nagoya University, consisting of Professor Hiroshi Abe, Research Assistant Professor Naoko Abe and graduate student Daisuke Kawaguchi, together with Yoshihiro Shimizu, a team leader at RIKEN, has succeeded in developing modified messenger RNAs (mRNAs). The modified mRNAs contain sulfur atoms in place of the oxygen atoms of the phosphate moieties of natural mRNAs and are capable of supporting protein synthesis with increased efficiency. The researchers discovered that the modified mRNAs accelerated the initiation step of the translation reaction and improved the efficiency of protein synthesis by at least 20 times compared with natural-form mRNAs.

This method is expected to be used for large-scale synthesis of proteins as raw materials for the production of biomaterials. Moreover, the application of the results obtained in this study to eukaryotic translation systems enables the efficient production of mRNA therapeutics for protein replacement therapy(3) to contribute to medical treatments. Furthermore, there are virtually no previous reports on the molecular design of highly functional mRNAs; therefore, the successful design achieved in this study can guide a future direction of the molecular design of modified mRNAs.

Credit: 
Japan Science and Technology Agency

Scientists uncover SARS-CoV-2-specific T cell immunity in recovered COVID-19 and SARS patients

The study by scientists from Duke-NUS Medical School, in close collaboration with the National University of Singapore (NUS) Yong Loo Lin School of Medicine, Singapore General Hospital (SGH) and the National Centre for Infectious Diseases (NCID), was published in Nature. The findings suggest that infection with and exposure to coronaviruses induces long-lasting memory T cells, which could help in the management of the current pandemic and in vaccine development against COVID-19.

The team tested subjects who recovered from COVID-19 and found the presence of SARS-CoV-2-specific T cells in all of them, which suggests that T cells play an important role in this infection. Importantly, the team showed that patients who recovered from SARS 17 years ago after the 2003 outbreak, still possess virus-specific memory T cells and displayed cross-immunity to SARS-CoV-2.

"Our team also tested uninfected healthy individuals and found SARS-CoV-2-specific T cells in more than 50 percent of them. This could be due to cross-reactive immunity obtained from exposure to other coronaviruses, such as those causing the common cold, or presently unknown animal coronaviruses. It is important to understand if this could explain why some individuals are able to better control the infection," said Professor Antonio Bertoletti, from Duke-NUS' Emerging Infectious Diseases (EID) programme, who is the corresponding author of this study.

Associate Professor Tan Yee Joo from the Department of Microbiology and Immunology at NUS Yong Loo Lin School of Medicine and Joint Senior Principal Investigator, Institute of Molecular and Cell Biology, A*STAR added, "We have also initiated follow-up studies on the COVID-19 recovered patients, to determine if their immunity as shown in their T cells persists over an extended period of time. This is very important for vaccine development and to answer the question about reinfection."

"While there have been many studies about SARS-CoV-2, there is still a lot we don't understand about the virus yet. What we do know is that T cells play an important role in the immune response against viral infections and should be assessed for their role in combating SARS-CoV-2, which has affected many people worldwide. Hopefully, our discovery will bring us a step closer to creating an effective vaccine," said Associate Professor Jenny Low, Senior Consultant, Department of Infectious Diseases, SGH, and Duke-NUS' EID programme.

"NCID was heartened by the tremendous support we received from many previous SARS patients for this study. Their contributions, 17 years after they were originally infected, helped us understand mechanisms for lasting immunity to SARS-like viruses, and their implications for developing better vaccines against COVID-19 and related viruses," said Dr Mark Chen I-Cheng, Head of the NCID Research Office.

The team will be conducting a larger study of exposed, uninfected subjects to examine whether T cells can protect against COVID-19 infection or alter the course of infection. They will also be exploring the potential therapeutic use of SARS-CoV-2-specific T cells.

Credit: 
Duke-NUS Medical School

Quantum simulation: Particle behavior near the event horizon of a black hole

image: (a) Schematic of mapping the behavior of a fermion pair onto a photonic lattice. (b) The designed bi-layer waveguide lattice and the corresponding dispersion relation. (c) The cross-section profile of the fabricated lattices. (d) The predicted output probability distribution of a single-photon wave packet splitting into two parts and moving in opposite directions. (e) The output probability distributions for different excitation positions in the same lattice. (f) The separation distance increases linearly with the excitation position.

Image: 
©Science China Press

The vast universe has always stirred people's imagination and yearning. Black holes, among the most fascinating objects in the universe, are waiting to be explored and studied. However, owing to the limitations of technology, humans are still unable to travel into the depths of the universe, let alone reach the vicinity of a black hole.

Fortunately, based on the equivalence between the metric of curved space-time in general relativity and the electromagnetic parameters of materials, physicists have developed the method of transformation optics to simulate the curved space-time of a gravitational field. Scientists can now study and demonstrate the evolution of particles in curved space-time experimentally. Until now, however, these simulations have been limited either to the classical regime or to flat space, and quantum simulations related to general relativity have rarely been attempted. Quantum fields in the presence of gravity exhibit many striking phenomena, such as Hawking radiation.

Recently, Yao Wang and Xianmin Jin from Shanghai Jiao Tong University, together with Chong Sheng and Hui Liu from Nanjing University, made exciting progress in observing particle behavior near the event horizon of a black hole.

Using femtosecond laser direct-write technology, the on-site energy of each waveguide and the coupling between waveguides can be precisely controlled. Inspired by the concept of transformation optics, the team successfully constructed a one-dimensional artificial black hole using a single-layer photonic waveguide lattice with non-uniform coupling. Compared with the linear time evolution seen in flat space, the dynamics of single-photon wave packets near the horizon of the black hole accelerate exponentially, with an exponent that depends on the curvature of the black hole.
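
In the standard coupled-mode (tight-binding) description of such waveguide arrays, the light amplitude in each waveguide evolves along the propagation direction z according to a Schrödinger-like equation in which each waveguide couples only to its nearest neighbours, with position-dependent coupling strengths playing the role of the curved-space metric. The Python sketch below evolves a single-site excitation through a generic non-uniform lattice of this kind; the coupling profile is an invented example, not the authors' design.

```python
import numpy as np
from scipy.linalg import expm

# Generic tight-binding sketch of light in a waveguide lattice with
# position-dependent couplings (an invented profile, not the paper's design).
N = 61                                     # number of waveguides
n = np.arange(N)
C = 0.5 + 0.05 * n[1:]                     # coupling between waveguides n-1 and n

# Coupled-mode Hamiltonian with nearest-neighbour couplings: i da/dz = H a.
H = np.zeros((N, N))
H[np.arange(N - 1), np.arange(1, N)] = C
H[np.arange(1, N), np.arange(N - 1)] = C

a0 = np.zeros(N, dtype=complex)
a0[N // 2] = 1.0                           # excite a single waveguide

for z in (2.0, 4.0, 6.0):                  # propagation distances (arbitrary units)
    a = expm(-1j * H * z) @ a0             # evolve the amplitudes along z
    prob = np.abs(a) ** 2
    mean_pos = np.sum(n * prob)
    spread = np.sqrt(np.sum((n - mean_pos) ** 2 * prob))
    print(f"z = {z}: mean waveguide {mean_pos:.2f}, spread {spread:.2f}")
```

Because the propagation distance z plays the role of time in such lattices, tracking the wave packet at successive z values is the lattice analogue of watching a particle's trajectory evolve; in the experiment, the non-uniform couplings are engineered so that this evolution mimics motion near an event horizon.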

The team further designed a two-layer photonic waveguide lattice and experimentally observed the acceleration, generation, and evolution of fermion pairs near the event horizon of the black hole: a single-photon packet with positive energy successfully escaped from the black hole, while a single-photon packet with negative energy was captured. This result, which runs counter to the intuition that photons are always trapped by black holes, is closely analogous to Hawking radiation, which originates entirely from quantum effects associated with gravity. Due to vacuum fluctuations, particle-antiparticle pairs are generated near the event horizon of the black hole. Particles with negative energy fall into the black hole, while particles with positive energy escape. This causes the black hole to lose mass, and it appears as if the black hole has just emitted a particle.

Finally, the team believes that higher-dimensional curved space-times can be constructed on this experimental platform. For example, a two-dimensional waveguide array can be used to simulate three-dimensional space-time, and a two-dimensional waveguide array with photon polarization or frequency can be used to simulate four-dimensional space-time. Furthermore, because the propagation direction plays the role of time in this platform, dynamic metrics can also be emulated, such as the FRW metric, a model describing cosmic expansion as time evolves, and gravitational waves, which are ripples in spacetime.

Credit: 
Science China Press

Ethical recommendations for triage of COVID-19 patients

"A lack of intensive care ventilation units owing to rapidly increasing infection rates numbers among the most significant nightmare scenarios of the corona pandemic," says Mathias Wirth, Head of the Ethics Department in the Faculty of Theology at the University of Bern, because: "Shortages of supply can result in triage of patients suffering from severe cases of COVID-19 and thus force a life or death decision." Here, triage means favoring some COVID-19 patients over others depending on urgency and prognosis. Together with experts from Yale University, King's College London, Charité Berlin and Essen University Hospital, medical ethicist Mathias Wirth has prepared a statement on these difficult decisions. The statement was published in the American Journal of Bioethics (AJOB), the most frequently cited scientific journal in the entire field of ethics.

Triage is only ethically justifiable under very specific circumstances

The experts warn against implementing triage prematurely; even though triage allows for decisions based on fairness in extreme situations, it places significant strain on the affected patients, their relatives and medical personnel. To avoid it, every effort must be made to transfer seriously ill patients to other hospitals that are not facing shortages of supply - across national borders in case of emergency, according to the authors.

In concrete terms, Mathias Wirth's team of researchers recommend increased regional, national and even international collaboration in intensive care for COVID-19 patients in preparation for future waves of infection. "Just because triage is correct under some circumstances does not mean that it is correct under all circumstances," says Wirth. "There is no real and legitimate triage situation as long as treatment spaces are available elsewhere."

Negative decision requires special care

Secondly, a negative triage decision for individual people should not under any circumstances mean that their medical and psychological care is neglected. Quite the opposite: If they are deprived of a ventilator, maximum effort is required for their care and treatment, both for them and for their relatives.

The statement from Wirth et al. offers strong arguments to all stakeholders who advocate for more collaboration in future situations, because, according to the medical ethicists, the judgments associated with triage give too little consideration to moral problems. "The suffering that triage decisions involved for patients, relatives and medical personnel in the epicenters of the first wave attests to this," says Wirth. Thanks to the recommendations, triage planning can be classified more clearly as a last resort, meaning that alternatives must be afforded greater attention.

Credit: 
University of Bern

Composing creativity: Children benefit from new painting materials

image: Example of a full-composition painting converted to a binarized image

Image: 
JAIST

New research out of the Japan Advanced Institute of Science and Technology (JAIST) utilizes digital image analysis technology to shed light on some of the challenges children face when representing their imaginations through the medium of paint. The research also offers concrete insight into the development of children's psyches, and importantly, offers suggestions for educators to improve children's cognitive, spatial, and artistic abilities.

Painting is a recreational and creative activity enjoyed by children across the world. However, children's paintings also serve as crucial artifacts of their perceptions, as they contain the sum of what children see as "essential." The spatial relationships between objects in paintings represent painters' experiences and technical prowess, and tend to indicate shortcomings of either. These are often perceived by viewers of paintings, but until now, detailed evaluation and analysis of children's art has been subjective, relying on the trained eye of an artist or critic.

JAIST researchers Lan Yu and Yukari Nagai have developed and tested a digital analysis process by which children's paintings can be digitized, categorized, and then thoroughly analyzed. Content, scale, patterns, details, and the relationships between objects in the paintings are objectively quantified and calculated.
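
The release does not reproduce the researchers' code, but the first step of such an analysis - turning a scanned painting into a binarized image from which simple, objective quantities can be computed - can be sketched as below. The fixed threshold and the "painted-area fraction" metric are illustrative assumptions, not the authors' exact pipeline.

```python
import numpy as np
from PIL import Image

def binarize_painting(path, threshold=200):
    """Load a scanned painting, convert it to greyscale and binarize it.
    Pixels darker than `threshold` are treated as painted marks (1),
    lighter pixels as background (0). The threshold value here is an
    illustrative assumption, not a parameter from the study."""
    grey = np.asarray(Image.open(path).convert("L"))
    return (grey < threshold).astype(np.uint8)

def painted_fraction(binary):
    """Fraction of the canvas covered by painted marks - one simple,
    objective quantity that can be compared across ages and groups."""
    return float(binary.mean())

# Example usage with a hypothetical file name:
# marks = binarize_painting("child_painting_001.png")
# print(f"painted-area fraction: {painted_fraction(marks):.1%}")
```

From a binarized image like this, further measures - object counts, relative sizes and the spatial relationships the researchers describe - can be computed and compared statistically across age and gender groups.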

Trends identified among the subjects sampled, 182 girls and 234 boys in the Republic of China, indicate that children who have not received instruction display poor abilities to recognize and emulate spatial relationships. The younger the painter, the less likely that he or she will have developed these cognitive abilities. Lack of detail and an inability to imitate objects are identified as common features of children's painting. Statistical analysis of the boys and girls in this study indicated that they have different attitudes toward painting; girls preferred participating in painting more than boys did. It was also apparent that girls were more able to accurately imitate real objects when representing them in their paintings.

This research has implications for art education, suggesting that art teachers focus on several key elements useful to the cognitive development of young painters. Guiding children to paint objects outside could enhance their awareness of their natural environments. Focusing instruction on object size, and on the ability to recognize and control it, could remedy this crucial deficit in children's paintings. Specifically training students to imitate objects could yield significant improvements in the overall quality of children's paintings. Finally, instruction in composition and creative methods is correlated with increased enjoyment and a reduced perception of difficulty toward painting as a creative process.

Credit: 
Japan Advanced Institute of Science and Technology