
Through the looking glass: Artificial 'molecules' open door to ultrafast polaritonic devices

Researchers from Skoltech and the University of Cambridge have shown that polaritons, the quirky particles that may end up running the quantum supercomputers of the future, can form structures behaving like molecules - and these "artificial molecules" can potentially be engineered on demand. The paper outlining these results was published as a Letter in the journal Physical Review B.

Polaritons are quantum particles that consist of a photon and an exciton, another quasiparticle, marrying light and matter in a curious union that opens up a multitude of possibilities in next-generation polaritonic devices. Alexander Johnston, Kirill Kalinin and Natalia Berloff, a professor at the Skoltech Center for Photonics and Quantum Materials and the University of Cambridge, have shown that geometrically coupled polariton condensates, which appear in semiconductor devices, are capable of simulating molecules with various properties.

Ordinary molecules are groups of atoms held together by chemical bonds, and their physical properties differ quite drastically from those of their constituent atoms: consider the water molecule, H2O, versus elemental hydrogen and oxygen. "In our work, we show that clusters of interacting polaritonic and photonic condensates can form a range of exotic and entirely distinct entities - 'molecules' - that can be manipulated artificially. These 'artificial molecules' possess energy states, optical properties, and vibrational modes distinct from those of the condensates comprising them," Johnston, of the University of Cambridge Department of Applied Mathematics and Theoretical Physics, explains.

While running numerical simulations of two, three, and four interacting polariton condensates, the researchers noticed curious asymmetric stationary states in which not all of the condensates have the same density in the ground state. "Upon further investigation, we found that such states came in a wide variety of forms, which could be controlled by manipulating certain physical parameters of the system. This led us to propose such phenomena as 'artificial polariton molecules' and to investigate their potential uses in quantum information systems," Johnston says.

In particular, the team focused on an "asymmetric dyad", which consists of two interacting condensates with unequal occupations. When two of those dyads are combined into a tetrad structure, the latter is, in some sense, analogous to a homonuclear molecule - for instance, to molecular hydrogen H2. Furthermore, artificial polariton molecules can also form more elaborate structures, which could be thought of as "artificial polariton compounds."

"There is nothing preventing more complex structures from being created. Indeed, in our work we have found that there is a wide range of exotic, asymmetric states possible in tetrad configurations. In some of these, all condensates have different densities (despite all of the couplings being of equal strength), inviting an analogy with chemical compounds," Alexander Johnston notes.

In specific tetrad structures, each asymmetric dyad can be viewed as an individual "spin," defined by the orientation of the density asymmetry. This has interesting consequences for the system's degrees of freedom (the independent physical parameters required to define states); the "spins" introduce a discrete degree of freedom, in addition to the continuous degrees of freedom given by the condensate phases.
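
One way to make this concrete (the notation below is illustrative, not taken from the paper) is to write the configuration of such a tetrad as a mixed set of variables,

    \sigma_i \in \{+1, -1\} \; (i = 1, 2), \qquad \theta_j \in [0, 2\pi) \; (j = 1, \dots, 4),

where each discrete "spin" \sigma_i records which condensate of the i-th dyad carries the higher density, and the continuous \theta_j are the condensate phases mentioned above.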

The relative orientation of each of the dyads can be controlled by varying the coupling strength between them. Since quantum information systems can potentially gain accuracy and efficiency by exploiting a hybrid discrete-continuous system, the team proposed this tetrad structure as a potential basis for such a system.

"In addition, we have discovered a plethora of exotic asymmetric states in triad and tetrad systems. It is possible to seamlessly transition between such states simply by varying the pumping strength used to form the condensates. This property suggests that such states could form the basis of a polaritonic multi-valued logic system, which could enable the development of polaritonic devices that dissipate significantly less power than traditional methods and, potentially, operate orders of magnitude faster," Professor Berloff says.

Credit: 
Skolkovo Institute of Science and Technology (Skoltech)

Supertest evaluates performance of engineering students in Russia, the United States, India, China

image: Igor Chirikov, Affiliated Researcher at the HSE Institute of Education

Image: 
Igor Chirikov

A group of researchers representing four countries summed up the results of the Supertest, a large-scale study of the academic performance of engineering students in Russia, China, India, and the United States. It is the first study to track the progress of students in computer science and electrical engineering over the course of their studies with regard to their abilities in physics, mathematics, and critical thinking, and to compare the results across the four countries. The article about the study was published in Nature Human Behaviour.

The HSE Institute of Education played a key role not only in collecting and analyzing data from Russia, but also in developing uniform assessment tools in mathematics and physics for all countries. The Institute conducted a full range of psychometric studies in addition to substantiating the quality of the measurements and the comparability of results across different countries.

The Supertest was initiated by Stanford University in partnership with HSE University Moscow, the Educational Testing Service (ETS), and universities in China and India. The study authors include Prashant Loyalka, an Associate Professor at Stanford University and a leading researcher at the HSE International Laboratory for Evaluating Practices and Innovations in Education; Igor Chirikov, a senior researcher at the Center for Studies in Higher Education at UC Berkeley and an affiliated researcher of the HSE Institute of Education; and Elena Kardanova and Denis Federyakin, leading researchers at the Centre for Psychometrics and Measurements in Education at the HSE.

More than 30,000 undergraduate students participated in the study. The researchers collected a sample of students from elite and large universities, roughly equal in number for each country. In Russia, the sample included students from six Project 5-100 universities and 28 other universities. Their skill development was measured three times: upon entering university, at the end of their second year, and at the end of their studies.

The task of the specialists at the HSE Centre for Psychometrics and Measurements in Education was to develop tests with questions that would be neutral for students from different countries and would yield adequately comparable results across countries. 'Over the course of analyzing the test results, we have shown that we achieved both goals,' said Centre Director Elena Kardanova. 'Testing in different countries was conducted in accordance with the same rules, with the assistance of specially trained examiners. All students were offered the same incentives to participate. We additionally tested the sensitivity of the results to possible differences in student motivation.'

The Supertest showed that at the start of their studies, Russian students perform lower than Chinese students in mathematics and physics, but higher than students from India in mathematics. After two years of study, the gap between Russian and Chinese students narrows, while Indian students catch up with Russian students in mathematics.

One finding of the study that is cause for concern relates to engineering students' critical thinking skills. Upon entering university, Russian engineering students outperform Indian students while performing lower than Chinese students. In terms of developing these skills over the course of their studies, students of all three countries perform lower than students in the United States. 'We found that, as the students progress in their studies, their critical thinking skills remain approximately the same in Russia and India, but significantly decrease in China. American students, by contrast, show improvement,' said Igor Chirikov. This is a serious problem, the researchers note, because technologies change rapidly, and mastering new ones requires not only a firm grasp of the subject area but, above all, 21st-century skills.

Another unexpected result was the gradual decline of academic skills among engineering students in China. 'Students at Chinese universities have an extremely high level of skills upon enrollment, but over the course of their university studies this level decreases: this applies to physics, mathematics, and critical thinking. We observe this at both elite and large universities, albeit to different extents,' said Igor Chirikov. 'A possible explanation lies in the way undergraduate education is organized in China, where institutions put emphasis on lectures, and instructors are not as demanding as in Russia and India. As a result, students have less motivation to learn and are not held accountable for developing skills.'

The results of the Supertest provide insight into how graduates perform in the globally competitive market of future computer science and electronic engineering professionals. Each represented country is renowned for its engineering expertise, and it is the Chinese, American, Russian and Indian specialists who migrate to different countries that largely determine the technological progress around the world. The balance of power in this sphere of education can play a decisive role in who will win the technology race tomorrow.

Credit: 
National Research University Higher School of Economics

Swapping alpha cells for beta cells to treat diabetes

image: At left is a healthy islet with many insulin-producing cells (green) and few glucagon-producing cells (red). At right, this situation is altered in a diabetic islet with a heavy preponderance of glucagon-producing cells (red) and very few insulin-producing cells (green).

Image: 
UT Southwestern Medical Center

Blocking cell receptors for glucagon, the counter-hormone to insulin, cured mouse models of diabetes by converting glucagon-producing cells into insulin producers instead, a team led by UT Southwestern reports in a new study. The findings, published online in PNAS, could offer a new way to treat both Type 1 and Type 2 diabetes in people.

More than 34 million Americans have diabetes, a disease characterized by a loss of beta cells in the pancreas. Beta cells produce insulin, a hormone necessary for cells to absorb and use glucose, a type of sugar that circulates in the blood and serves as cellular fuel.

In Type 2 diabetes, the body's tissues develop insulin resistance, prompting beta cells to die of exhaustion as they secrete excess insulin to allow cells to take in glucose. In Type 1 diabetes, which affects about 10 percent of the diabetic population, beta cells die from an autoimmune attack. Both kinds of diabetes lead to severely elevated blood sugar levels that eventually cause a host of possible complications, including loss of limbs and eyesight, kidney damage, diabetic coma, and death.

Most treatments for diabetes focus on insulin, but its counterpart - the hormone glucagon that is produced by alpha cells in the pancreas - has received comparatively little attention, says study leader May-Yun Wang, Ph.D., assistant professor of internal medicine at UTSW. Glucagon binds to receptors on cells in the liver, prompting this organ to secrete glucose. Some recent studies have suggested that depleting glucagon or blocking its receptor can help research animals or humans with diabetes better manage their glucose levels. But how this phenomenon occurs has been unknown.

To answer this question, Wang and her colleagues, including William L. Holland, Ph.D., a former assistant professor of internal medicine at UTSW who is now at the University of Utah, and Philipp E. Scherer, Ph.D., professor of internal medicine and cell biology at UTSW and director of UTSW's Touchstone Center for Diabetes Research, used monoclonal antibodies - manmade proteins that act like human antibodies and help the immune system identify and neutralize whatever they bind to - against the glucagon receptor in mouse models of diabetes.

In one model, called PANIC-ATTAC (pancreatic islet beta-cell apoptosis through targeted activation of caspase 8), a genetic mutation causes beta cells to selectively die off when these mice receive a chemical treatment. Once these animals' beta cells were depleted, the researchers administered monoclonal antibodies against the glucagon receptor. Weekly treatment with the antibodies substantially lowered the rodents' blood sugar, an effect that continued even weeks after the treatments stopped.

Further investigation showed that the number of cells in the pancreas of these animals significantly increased, including beta cells. Searching for the source of this effect, the researchers used a technique called lineage tracing to label their alpha cells. When they followed these alpha cells through rounds of cell divisions, they found that treatment with monoclonal antibodies pushed some of the glucagon-producing alpha cell population to convert into insulin-producing beta cells.

Although the PANIC-ATTAC model shares the same beta cell loss that occurs in both Type 1 and Type 2 diabetes, it lacks the autoimmune attack that spurs Type 1 diabetes. To see if beta cells could rebound through alpha cell conversion under these circumstances, the researchers worked with a different mouse model, the nonobese diabetic (NOD) mouse, in which beta cells become depleted through an autoimmune reaction. When these animals were dosed with monoclonal antibodies, beta cells returned, despite active immune cells.

In a third animal model that more closely mimics a human system, the researchers injected human alpha and beta cells into immunodeficient NOD mice - just enough cells to produce sufficient insulin to make the animals borderline diabetic. When these mice received monoclonal antibodies against the glucagon receptor, their human beta cells increased in number, protecting them against diabetes, suggesting this treatment could do the same for people.

Holland notes that being able to push alpha cells to shift to beta cells could be especially promising for Type 1 diabetics. "Even after decades of an autoimmune attack on their beta cells, Type 1 diabetics will still have plentiful amounts of alpha cells. They aren't the cells in the pancreas that die," he says. "If we can harness those alpha cells and convert them into beta cells, it could be a viable treatment for anyone with Type 1 diabetes."

Being able to produce native insulin, adds Wang, could hold significant advantages over the insulin injections and pumps used by both Type 1 and Type 2 diabetics. Eventually, she says, similar monoclonal antibodies could be tested in diabetics in clinical trials.

"Even though Type 1 and Type 2 diabetics try their very best to keep glucose under control, it fluctuates quite massively throughout the day even with the best state-of-the-art pump," Wang says. "Giving them back their own beta cells could help restore much better natural regulation, greatly improving glucose regulation and quality of life."

Scherer holds the Gifford O. Touchstone, Jr. and Randolph G. Touchstone Distinguished Chair in Diabetes Research and the Touchstone/West Distinguished Chair in Diabetes Research.

Credit: 
UT Southwestern Medical Center

Antibodies deplete cancer cells in mice and human cell lines; reach previously inaccessible targets

Three studies - one each in Science, Science Translational Medicine, and Science Immunology - reveal the promise of newly engineered bispecific antibodies, including, in two cases, by demonstrating their power against previously inaccessible tumor cell targets for the first time. These bispecific antibodies, which simultaneously bind to tumor antigens and T cells, cleared cancer cells without damaging healthy cells in mouse tumor models and/or cell culture experiments across the three studies. The results highlight the therapeutic potential of this antibody type, which - unlike engineered immune cell therapies such as CAR T - does not have to be personalized; it could be used "off-the-shelf."
Some immunotherapy approaches against cancers rely on common cancer-related mutations to serve as antigens; they instigate an immune response to the cancer. Although it is one of the most common mutant tumor suppressor genes known in human cancers, the cancer-related p53 tumor suppressor gene has not been successfully targeted via this approach. This is largely because it is much more challenging to reactivate this and other mutant tumor suppressor genes than it is to deactivate cancer-driving genes. In Science, Emily Han-Chung Hsiue and colleagues successfully engineered a bispecific antibody to reactivate p53. Hsiue et al. first identified a distinct targetable fragment of the mutant tumor suppressor protein and characterized the structural basis for how the fragment is presented to T cells. They developed an antibody that could recognize this fragment of the mutant p53 protein and that did not cross-react with wild-type p53 in intact cells. The researchers then converted this antibody into a bispecific antibody format that attaches one portion to mutant p53 antigen on tumor cells and another to a T cell. In mice engrafted with human multiple myeloma cells, the antibody stimulated effective T cell killing of cancer cells that express the mutant p53 protein, resulting in tumor regression. As well, experiments in cell lines revealed the mutant p53 antigen-targeting bispecific antibody could activate T cells even when the neoantigens were expressed at very low levels on the tumor cell surface.
In Science Translational Medicine, Suman Paul and colleagues employed bispecific antibodies to target malignant T cells in T cell leukemias and lymphomas without harming healthy T cells - a challenging feat in cancer immunotherapy. Therapies that broadly target B cell antigens and cause near complete loss of both healthy and cancerous B cell populations have been remarkably successful and well-tolerated by patients, but broadly targeting T cells in the same way would result in harmful immunosuppression. Paul et al. hypothesized that targeting the beta chain region of T cell receptors could serve as a strategy to selectively eliminate malignant T cells while avoiding collateral damage to healthy T cells. They engineered bispecific antibodies to home in on either TRBV5-5 or TRBV12, two examples of the 30 beta chains that could be present on malignant T cells. They hypothesized that targeting either beta chain could clear cancer cells without harming healthy T cells that express any of the other 29 beta chain variable regions. In cell lines derived from patients with T-cell lymphomas and leukemias, application of these bispecific antibodies effectively killed malignant T cells and preserved healthy T cells. Similarly, in mouse models of human T cell cancers, in which mice received human T cells via intravenous injection, the antibodies promoted killing of malignant T cells without depleting healthy T cells, leading to major tumor regression.
In Science Immunology, Jacqueline Douglass and colleagues modified bispecific antibodies to recognize and kill tumor cells isolated in culture that bear extremely low levels of cancer-driving mutant RAS proteins on their surfaces. Targeting cancer-driving mutant genes like RAS has emerged as a promising strategy for formulating cancer drugs. However, the success of this treatment strategy is limited by the fact that some of these cancer-driving mutant proteins are expressed in low levels on tumor cells, making them incredibly difficult to detect - despite their ubiquity in multiple cancer types. Now, Douglass et al. have used a method called phage display to search for mutant RAS-specific antibodies from a human antibody library. Based on their findings, the researchers developed mutation-associated neoantigen-directed antibodies (MANAbodies) that target mutant RAS neoantigens on tumor cell surfaces. They grafted these mutant RAS-specific MANAbodies onto an optimized T cell-engaging bispecific antibody. Testing the MANAbody-bearing bispecific antibodies in human lung and pancreatic cancer cell lines, the researchers showed the antibodies could specifically recognize and kill tumor cells that bear extremely low levels of mutated RAS proteins, while having no effect on tumor cells that express wild-type or related mutated proteins.
"Although the studies of Hsuie et al., Douglass et al., and Paul et al. are promising for advancing [bispecific antibodies] into the clinic, there are other factors to consider before therapeutic efficacy can be fully realized," Jon Weidanz says. For example, he notes that these antibodies are small molecules that can be rapidly cleared from the blood, which will likely make it necessary to continuously infuse these kinds of drugs with an implanted pump or use other methods to extend the drug's persistence in the blood.

Credit: 
American Association for the Advancement of Science (AAAS)

School-based dental program reduces cavities by more than 50%

image: A new study finds that school-based dental care reduces cavities by more than 50 percent.

Image: 
©Sorel: Courtesy of NYU Photo Bureau

A school-based cavity prevention program involving nearly 7,000 elementary school students reduced cavities by more than 50 percent, according to a study led by researchers at NYU College of Dentistry. The findings are published March 1 in the Journal of the American Dental Association.

"The widespread implementation of oral health programs in schools could increase the reach of traditional dental practices and improve children's oral health--all while reducing health disparities and the cost of care," said Richard Niederman, DMD, professor and chair of the Department Epidemiology & Health Promotion at NYU College of Dentistry and the study's senior author.

Dental cavities are the most common chronic disease in children, and one in five elementary school children has at least one untreated cavity. While cavities can be prevented with dental visits and good at-home oral hygiene, some families experience barriers to seeing a dentist, including cost and parents having to take time off work.

"School-based cavity prevention programs eliminate these barriers by bringing basic dental care to children, rather than bringing children to care," said Niederman.

The study was conducted in 33 public, high-need elementary schools in Massachusetts, where dental hygienists provided care to 6,927 children. The services were provided at no cost to families.

Twice-yearly visits involved dental examinations followed by cavity prevention and treatment, including fluoride varnish, sealants, and minimally invasive fillings to stabilize cavities without drilling. Students also received oral hygiene instructions, toothbrushes, and fluoride toothpaste to take home. If more complex care was required, students were referred to local dentists. Notably, the procedures used do not create aerosols, which limits the risk of transmitting viruses through the air.

After six visits, the prevalence of untreated cavities decreased by more than 50 percent. In one group of schools, cavities were reduced from a baseline of 39 percent to 18 percent, and in a second group, cavities decreased from 28 percent to 10 percent. The prevention program reduced cavities in both baby and permanent teeth.

"In 2010, the federal government set a goal of reducing the prevalence of cavities in children by 10 percent by 2020. Our study shows that this is not only feasible, but also that a comprehensive school-based program can reduce cavities by five times their goal," said Niederman.

Recent economic analyses of school-based cavity prevention programs by researchers at NYU College of Dentistry, including one focusing on this program in Massachusetts, demonstrate that they are cost-effective and could save federal dollars. If this school-based program were implemented nationally, it could reduce Medicaid spending on children's oral health by as much as one-half.

The COVID-19 pandemic has interrupted most school-based dental care because of school closures and fear of creating aerosols, even as oral health care in dental practices has safely resumed with additional infection control measures in place. The researchers stress the importance of safely continuing school-based care, given its ability to prevent cavities using aerosol-free procedures.

Credit: 
New York University

Future of immunotherapy could be 'off-the-shelf' treatments

image: Jon Weidanz, director of the North Texas Genome Center and associate vice president of research at The University of Texas at Arlington.

Image: 
Randy Gentry/UTA

In a new commentary for the journal Science, an associate vice president for research at The University of Texas at Arlington argues that emerging protein-based immunotherapies could lead to highly effective "off-the-shelf" cancer treatments for more patients.

Jon Weidanz, who also is a professor in the College of Nursing and Health Innovation at UTA, is the author of a perspective regarding the development of cancer immunotherapies.

His article, "Targeting cancer with bispecific antibodies," will appear in the March 5 edition of Science. It evaluates the findings of three studies by researchers at Johns Hopkins University and proposes that an emerging method of protein-based immunotherapy that targets commonly occurring mutations in cancer cells or neoantigens--mutated antigens produced by tumor cells--could lead to treatments that are effective for oncology patients.

Immunotherapy, a method to treat illness by stimulating a person's immune system, is a developing alternative to traditional cancer treatments.

"Up until recently patients were limited to four treatment options: surgery, radiation, chemotherapy and targeted therapy," Weidanz said. "However, the holy grail has always been to develop strategies that would harness the power of the immune system to attack and destroy the cancer. With recent breakthroughs in immuno-oncology along with the new findings being published in Science, it does appear we are closing in on cancer with new immunotherapies."

As medicine has advanced, immunologists have discovered ways to engineer a person's T-cells, the white blood cells that fight and kill infected cells, to recognize and target cancer cells and eliminate them from the body. This approach has led to exciting advances in the field and remission in some patients. However, more work is required to make this form of T-cell therapy more broadly accessible.

Alternatively, researchers have developed approaches that stimulate the immune system without removing T-cells from the body. These "off-the-shelf" protein-based treatments, known as bispecific T-cell engaging antibodies, have proven effective in treating patients with acute lymphoblastic leukemia, a type of blood cancer.

"The ideal is to create protein molecules that have two arms. One arm can recognize the cancer cell and bind to it. The other arm binds to T-cells," Weidanz said. "The protein drug then brings the T-cells into proximity with the tumor cells, which activate the T-cells to destroy the tumor cells."

These two-armed, or bispecific, proteins would avoid healthy cells while destroying cancer cells. Weidanz argues that this method of protein-based immunotherapy could make a difference. The key comes down to the unique targets expressed by the cancer cells that the bispecific protein drug recognizes. Bispecific antibodies could bind to particular neoantigen targets found on tumor cells and recruit T-cells to destroy the cancer.

"The beauty of bispecific proteins is that you could manufacture those proteins and put them on the shelf as an immunotherapy agent," Weidanz said. "If a doctor sees that a patient's cancer expresses the neoantigen target, they could be treated immediately. It's still a personalized medicine, but would not require engineering T-cells."

An expert in immunology, Weidanz has more than 30 years of experience in biotechnology research with an emphasis on immunotherapy, especially related to oncology and product development to diagnose and treat cancer. His research lab at UTA investigates how the immune system identifies malignant cells with the goal of designing treatments that boost immune cells' ability to destroy cancerous cells.

"Dr. Weidanz's substantial expertise in the field of immunology will lead us into the next generation of cancer management," said James Grover, interim vice president of research. "The developments of his lab and those of his many talented colleagues across the nation make this a pivotal moment in the history of a devastating disease."

Weidanz said immunotherapy holds the promise of transforming cancer into a more manageable condition with better prognoses for patients.

"We are getting to a point where we will be able to make cancer more of a chronic disease," Weidanz said. "Now, we look at five-year survival. Maybe we can start looking at 15- or 20-year survival readouts because we're able to manage the disease with immunotherapies that are being developed. It's a very exciting time."

Credit: 
University of Texas at Arlington

Repurposed arthritis drug did not significantly improve severe COVID-19 pneumonia

image: Marketed as Actemra, tocilizumab is approved for treating arthritis and other inflammatory conditions.

Image: 
Photo courtesy of Roche

In a study published online February 25, 2021 in The New England Journal of Medicine, a repurposed drug used to treat arthritis did not significantly improve the outcomes of patients with severe COVID-19 pneumonia.

Results of the Phase III clinical trial, conducted by an international team led by senior author Atul Malhotra, MD, research chief of pulmonary, critical care and sleep medicine at UC San Diego Health, found that tocilizumab did not significantly improve clinical status or mortality rate at 28 days for participants who received it compared to a placebo.

"Although our trial was negative based on primary outcomes, we did see some benefits, including an improvement in length of stay of eight days with tocilizumab compared to placebo, as well as fewer days on the mechanical ventilator with our intervention," said Malhotra.

"Although it is important to be cautious in interpreting secondary outcomes, our trial helped in the design of subsequent studies which do show some improvement in outcomes with tocilizumab, particularly when given in combination with corticosteroids."

Marketed as Actemra, tocilizumab is an immunosuppressive drug used primarily to treat rheumatoid arthritis and systemic juvenile idiopathic arthritis, a severe form of the disease in children. The therapy works by using humanized monoclonal antibodies to specifically target and block cellular receptors for interleukin-6 (IL-6), a small protein or cytokine that plays an important role in triggering inflammation as an early immune response to disease.

In some patients with COVID-19, the immune response runs amok, overexpressing IL-6 and generating a "cytokine storm," which can lead to potentially life-threatening damage to lungs and other organs. Cytokine storms have been linked to a number of inflammatory diseases, from respiratory conditions caused by coronaviruses such as SARS and MERS to some forms of influenza to non-infectious diseases, such as multiple sclerosis and pancreatitis.

Researchers hoped that the heightened role of IL-6 in respiratory diseases and the fact that many severe cases of COVID-19 involve respiratory failure, hospitalization and death pointed to tocilizumab as a potentially effective therapy. Early case reports and retrospective observational studies buttressed that optimism.

The Phase III clinical trial, which began in April 2020 and was conducted in 62 hospitals in nine countries, involved 452 patients with confirmed cases of severe COVID-19 pneumonia, randomized into a group of 294 persons who received an intravenous infusion of tocilizumab and 144 persons who received a placebo. Malhotra expressed his gratitude to his team at UC San Diego as well as the countless individuals around the world who helped in the careful execution of the study.

The researchers found no significant difference in how the two groups fared, and no reduced mortality rate associated with tocilizumab, though they noted the trial was not designed to fully assess that outcome.

No safety issues arose regarding the use of tocilizumab, and the authors said study data suggested the treatment may have some therapeutic benefit during hospital stays and in shortening stays in intensive care units. In both cases, though, they said more research was required.

"Since this trial launched, a lot has been learned about the virus and about how COVID-19 manifests in different people, in different ways and stages," said Malhotra. "These findings need to be understood in that context. We looked at very sick patients. There are very few proven therapies for severe COVID-19. Tocilizumab and some monoclonal antibody treatments may still have utility in specific circumstances, but more work needs to be done.

"In fact, more work must be done. The need for effective treatments for patients with severe COVID-19 pneumonia remains a major challenge of this pandemic. Each new study brings us one step closer to putting that challenge behind us."

Credit: 
University of California - San Diego

The time is ripe! An innovative contactless method for the timely harvest of soft fruits

image: A pulsed laser is focused by a lens onto a point close to the surface of the fruit. The laser-induced plasma creates a shockwave that excites Rayleigh waves on the surface of the mango, which are then measured using laser Doppler vibrometers.

Image: 
Shibaura Institute of Technology

Most people are probably familiar with the unpleasant feeling of eating overripe or underripe fruit. Those who work in agriculture are tasked with ensuring a timely harvest so that ripeness is at an optimal point when the fruit is sold, both to minimize the amount of fruit that goes to waste and maximize the quality of the final product. To this end, a number of techniques to assess fruit ripeness have been developed, each with their respective advantages and disadvantages depending on the type of produce.

Although biochemical and optical methods exist, mechanical techniques are the most widely used. They indirectly assess ripeness based on the fruit's firmness. In turn, firmness is quantified by observing the vibrations that occur on the fruit when mechanical energy is accurately delivered through devices such as hammers, pendulums, or speakers. Unfortunately, these approaches fall short for softer fruits, which are more readily damaged by the contact devices used.

In a recent study published in Foods, a team of scientists from Shibaura Institute of Technology (SIT), Japan, addressed this issue with an innovative method for measuring the firmness of soft fruits using laser-induced plasma (LIP). This work follows up on a previous study in which LIP was used to quantify the firmness of harder fruits.

But what is LIP and how is it used? Plasma is a state of matter similar to the gaseous state but in which most particles have an electric charge. This energetic state can be produced in normal air by focusing a high-intensity laser beam onto a small volume. Because the generated plasma "bubble" is unstable, it immediately expands, sending out shockwaves at ultrasonic speeds. Professor Naoki Hosoya and colleagues at SIT had successfully used LIP shockwaves generated close to the surface of apples to excite a type of vibration called 0S2 mode, colloquially referred to as "football mode vibration" because of how the resulting deformation looks on spherical bodies. They then verified that the frequency of the 0S2 mode vibrations was correlated with the firmness of the fruit.

However, soft fruits do not exhibit 0S2 mode vibrations, so the team had to analyze an alternative type of oscillation: Rayleigh waves. These are waves that occur exclusively on the surface of bodies without penetrating far into the interior. Using Kent mangoes, a setup for generating LIP, and commercially available laser-based vibrometers, the scientists verified that the velocity at which Rayleigh waves propagate is directly related to the firmness of the mangoes. Because the propagation velocity markedly decreases with storage time, it provides a reliable way to indirectly assess ripeness.
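
As a rough guide to why propagation speed tracks firmness - the following is a textbook result for a homogeneous elastic solid, not a formula quoted from the study - the Rayleigh wave speed scales with the square root of the shear modulus:

    c_R \approx \frac{0.862 + 1.14\,\nu}{1 + \nu} \sqrt{\frac{G}{\rho}},

where G is the shear modulus (a measure of firmness), \rho is the density and \nu is Poisson's ratio. Softer, riper flesh has a lower G, so Rayleigh waves travel more slowly across it.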

The team went further and looked for the best position on the mangoes' surface to determine the velocity of Rayleigh waves. Mangoes, as well as other soft fruits, have large seeds inside, which can alter the propagation of surface waves in ways that are detrimental to measurements. "The results of our experiments indicate that Rayleigh waves along the 'equator' of the mango are better for firmness assessment compared to those along the 'prime meridian'," explains Hosoya. The experiments also revealed that cavities within the fruit's flesh or decay can greatly affect the results of the measurements. Thus, as Hosoya adds, they will keep investigating which is the best area to measure firmness in mangoes using their novel approach.

In short, the team at SIT has engineered an innovative strategy to assess the ripeness of soft fruits from outside. "Our system," remarks Hosoya, "is suitable for non-contact and non-destructive firmness assessment in mangoes and potentially other soft fruits that do not exhibit the usual 0S2 mode vibrations." Further refinement of such firmness assessment methods will hopefully make them more reliable and accessible for the agricultural industry. With any luck, their widespread adoption will ensure that fruits reach your plate only when the time is ripe!

Credit: 
Shibaura Institute of Technology

Study: Treatable sleep disorder common in people with thinking and memory problems

MINNEAPOLIS - Obstructive sleep apnea is a condition in which breathing is repeatedly interrupted during sleep. Research has shown people with this sleep disorder have an increased risk of developing cognitive impairment and Alzheimer's disease. Yet, it is treatable. A preliminary study released today, February 28, 2021, has found that obstructive sleep apnea is common in people with cognitive impairment. The study will be presented at the American Academy of Neurology's 73rd Annual Meeting, being held virtually April 17 to 22, 2021.

Cognitive impairment includes memory and thinking problems that affect concentration, decision making and learning new things. The risk of cognitive impairment increases as people age.

"Better sleep is beneficial to the brain and can improve cognitive skills. Yet in our study, we found that over half of the people with cognitive impairment had obstructive sleep apnea," said study author Mark I. Boulos, M.D., of the University of Toronto in Canada and member of the American Academy of Neurology. "We also found that those with the sleep disorder had lower scores on thinking and memory tests. Fully understanding how obstructive sleep apnea affects this population is important because with treatment, there is potential to improve thinking and memory skills as well as overall quality of life."

The study involved 67 people with an average age of 73 who had cognitive impairment. Participants completed questionnaires on sleep, cognition and mood. They also took a 30-point cognitive assessment to determine their level of cognitive impairment. Questions included identifying the date and the city they were in and repeating words they had been asked to remember earlier in the test. Scores on the test range from zero to 30. A score of 26 or higher is considered normal, 18-25 signifies mild cognitive impairment and 17 or lower signifies moderate to severe cognitive impairment.
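
For readers who want to see the banding spelled out, here is a minimal Python sketch that maps a raw score onto the categories described above (the cut-offs come from the study description; the function name is ours):

    def classify_cognition(score: float) -> str:
        """Band a 30-point cognitive assessment score into the study's categories."""
        if not 0 <= score <= 30:
            raise ValueError("score must be between 0 and 30")
        if score >= 26:
            return "normal"
        if score >= 18:
            return "mild cognitive impairment"
        return "moderate to severe cognitive impairment"

    # The average scores reported below both fall in the 'mild' band:
    print(classify_cognition(20.5))  # mild cognitive impairment
    print(classify_cognition(23.6))  # mild cognitive impairment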

Participants were given at-home sleep apnea tests to determine if they had obstructive sleep apnea. The at-home test uses a monitor to track breathing patterns and oxygen levels during sleep.

Researchers found that 52% of study participants had obstructive sleep apnea. People with the sleep disorder were 60% more likely to score lower on the cognitive test than people who did not have sleep apnea. People with the disorder had an average score of 20.5 compared to an average score of 23.6 for the people without the sleep disorder.

In addition, researchers found that the severity of obstructive sleep apnea corresponded with the degree of cognitive impairment as well as the quality of sleep for participants, including sleep time, how quickly they fell asleep, the efficiency of their sleep and how often they awoke at night.

"People with cognitive impairment should be assessed for obstructive sleep apnea because it can be treated by using a continuous positive airway pressure (CPAP) machine that helps keep the airway open at night," said Boulos. "However, not everyone who tries CPAP chooses to regularly use the therapy, and this may be a bigger challenge to people with thinking and memory problems. Future research should be directed toward determining ways to diagnose and manage the disease that are efficient and easy to use in people with cognitive impairment."

Credit: 
American Academy of Neurology

CUHK unveils balance between two protein counteracting forces in hereditary ataxias

image: Prpf19 degrades the disease protein of SCA3 and alleviates its toxicity. The Exoc7 protein restrains Prpf19 from carrying out this degradation, causing Prpf19 to lose its beneficial effects on SCA3.

Image: 
CUHK

Collaborating with the University of Oxford, Professor Ho Yin Edwin Chan's research team from the School of Life Sciences of The Chinese University of Hong Kong (CUHK) recently unveiled the counteracting relationship between pre-mRNA-processing factor 19 (Prpf19) and exocyst complex component 7 (Exoc7) in controlling the degradation of the disease protein and the neurodegeneration seen in a rare hereditary ataxia, spinocerebellar ataxia type 3 (SCA3). The research findings have been published in the prestigious scientific journal Cell Death & Disease.

Protein misfolding contributes to the pathogenesis of SCA3

Proteins play a significant role in the development of every cell in the human body, including neurons. Numerous studies have shown that protein misfolding and aggregation contribute to many human diseases. Proteins need to adopt the proper folding and architecture before they can execute their biological functions. Even a minor improper assembly of a protein may result in cellular malfunctioning, leading to toxic insoluble protein aggregates that cause disease. Progressive protein misfolding and aggregation interfere with the function of other normal proteins and are detected in the deteriorating neurons of SCA3 and other protein misfolding-induced disorders, including polyglutamine (polyQ) diseases.

Prpf19 is capable of degrading toxic expanded SCA3-polyQ protein

Professor Edwin Chan, Postdoctoral Fellow Dr. Zhefan Stephen Chen, and the team discovered that the nuclear-localised protein Prpf19 is responsible for scrutinising the quality of SCA3-polyQ, the disease protein of SCA3 or MJD. Potentiating the function of Prpf19 promotes the degradation of faulty SCA3-polyQ protein via a process called ubiquitin-proteasome degradation. This was shown to alleviate the toxicity of SCA3 cells and to improve neurodegeneration and nervous-system function in an animal model of SCA3 disease.

The rivalry between Prpf19 and Exoc7 in controlling neurodegeneration of SCA3

Exoc7 is a protein that controls the trafficking of proteins within cells, and it is also a known partner of Prpf19. The research team further discovered that Exoc7 shuttles into the cell nucleus, where it binds directly to Prpf19 and interferes with pre-mRNA splicing, causing Prpf19 to lose its beneficial effects in SCA3/MJD disease models. The coiled-coil domain of Exoc7, a special region of the protein, is crucial for this ability to restrain Prpf19 from functioning.

Professor Chan said, "The current study demonstrates an intricate relationship between Prpf19 and Exoc7, two crucial proteins in nerve cells. Elucidating the mechanism of action of the protein networks that govern protein aggregation will allow us to develop potential small molecules or activators targeting Prpf19, with the hope of providing novel strategies for curing SCA3/MJD and other neurodegenerative disorders. Today (28 February) marks Rare Disease Day 2021. SCA3/MJD belongs to the category of rare neurodegenerative diseases, and I hope our findings will also provide an insight into translational biomedicine research on rare diseases."

Credit: 
The Chinese University of Hong Kong

The human brain grew as a result of the extinction of large animals

image: Elephant hunting illustrations

Image: 
Dana Ackerfeld

A new paper by Dr. Miki Ben-Dor and Prof. Ran Barkai from the Jacob M. Alkow Department of Archaeology at Tel Aviv University proposes an original unifying explanation for the physiological, behavioral and cultural evolution of the human species, from its first appearance about two million years ago to the agricultural revolution (around 10,000 BCE). According to the paper, humans developed as hunters of large animals, causing their ultimate extinction. As they adapted to hunting small, swift prey animals, humans developed higher cognitive abilities, evidenced by the most obvious evolutionary change - the growth of brain volume from 650cc to 1,500cc. To date, no unifying explanation has been proposed for the major phenomena in human prehistory. The novel theory was published in the journal Quaternary.

In recent years, more and more evidence has accumulated indicating that humans were a major factor in the extinction of large animals, and consequently had to adapt to hunting smaller game, first in Africa and later in all other parts of the world. In Africa, 2.6 million years ago, when humans first emerged, the average size of land mammals was close to 500kg. Just before the advent of agriculture this figure had decreased by over 90% - down to several tens of kg.

According to the researchers, the decrease in the size of game and the need to hunt small, swift animals forced humans to display cunning and boldness - an evolutionary process that demanded increased volume of the human brain and later led to the development of language enabling the exchange of information about where prey could be found. The theory claims that all means served one end: body energy conservation.

The researchers show that, throughout most of their evolution, early humans were apex (top) predators, specializing in hunting large game. Representing most of the biomass available for hunting, these animals provided humans with high fat levels, an essential source of energy, and enabled a higher energy return than small game. In the past, six different species of elephants lived in Africa, comprising more than half of the biomass of all herbivores hunted by humans. Initial evidence from East Africa indicates that Homo sapiens only emerged in that area after a significant decline in the number of elephant species in certain regions. Comparing the size of animals found in archaeological cultures, representing different species of humans in East Africa, southern Europe and Israel, the researchers found that in all cases there was a significant decline in the prevalence of animals weighing over 200kg, coupled with an increase in the volume of the human brain.

"We correlate the increase in human brain volume with the need to become smarter hunters," explains Dr. Ben-Dor. For example, the need to hunt dozens of gazelles instead of one elephant generated prolonged evolutionary pressure on the brain functions of humans, who were now using up much more energy in both movement and thought processes. Hunting small animals, that are constantly threatened by predators and therefore very quick to take flight, requires a physiology adapted to the chase as well as more sophisticated hunting tools. Cognitive activity also rises as fast tracking requires fast decision-making, based on phenomenal acquaintance with the animals' behavior - information that needs to be stored in a larger memory."

"The evolutionary adaptation of humans was very successful," says Dr. Ben-Dor. "As the size of animals continued to decrease, the invention of the bow and arrow and the domestication of dogs enabled more efficient hunting of medium-sized and small animals - until these populations also dwindled. Toward the end of the Stone Age, as animals became even smaller, humans had to put more energy into hunting than they were able to get back. Indeed, this is when the Agricultural Revolution occurred, involving the domestication of both animals and plants. As humans moved into permanent settlements and became farmers, their brain size decreased to its current volume of 1,300-1,400cc. This happened because, with domesticated plants and animals that don't take flight, there was no more need for the allocation of outstanding cognitive abilities to the task of hunting."

Prof. Barkai: "While the chimpanzee's brain, for example, has remained stable for 7 million years, the human brain grew threefold, reaching its greatest size about 300,000 years ago. In addition to brain volume, evolutionary pressure caused humans to use language, fire and sophisticated tools such as bow and arrow, adapt their arms and shoulders to the tasks of throwing and hurling and their bodies to the prolonged chase, improve their stone tools, domesticate dogs and ultimately also domesticate the game itself and turn to agriculture."

Prof. Barkai adds: "It must be understood that our perspective is not deterministic. Humans brought this trouble upon themselves. By focusing on hunting the largest animals, they caused extinctions. Wherever humans appeared - whether Homo erectus or Homo sapiens - we see, sooner or later, mass extinction of large animals. Dependence on large animals had its price. Humans undercut their own livelihood. But while other species, like our cousins the Neanderthals, became extinct when their large prey disappeared, Homo sapiens decided to start over again, this time relying on agriculture."

Credit: 
Tel-Aviv University

The gut microbiome can predict changes in glucose regulation

A study carried out by researchers from the Institute of Genomics, University of Tartu revealed that the human gut microbiome can be used to predict changes in Type 2 diabetes-related glucose regulation up to four years in advance.

Type 2 diabetes is a metabolic disease characterized by elevated blood glucose levels that contributes to millions of deaths worldwide each year, and its prevalence is rapidly increasing. Type 2 diabetes is preceded by "prediabetes" - a condition in which glucose levels have started to rise but the progression of the disease can still be stopped and reversed. Therefore, early detection of disease progression is necessary, and previous research suggests that the gut microbiome could be used for that purpose, said Elin Org, last author of the paper and associate professor in genomics and microbiomics.

This study aimed to assess whether gut microbiome could be used to predict changes in metabolic parameters such as plasma insulin and glucose levels in the early stages of the disease. "This is one of the first studies that assess the role of the gut microbiome in type 2 diabetes over time," said Oliver Aasmets, the first author of the paper.

Results showed that gut microbiome can predict changes in glucose regulation primarily related to insulin levels and insulin secretion. "Our study design allowed us to compare predictions made a year and a half and four years ahead, which showed significant differences, giving input for further studies," said Aasmets. Furthermore, the study showed which microbes are the most useful for predicting changes in the metabolic parameters.
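
A minimal sketch of how such a prediction could be set up - shown here with synthetic data and scikit-learn purely for illustration, not as the authors' actual pipeline - might look like this:

    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)

    # Synthetic stand-in data: relative abundances of 200 microbial taxa for
    # 300 participants, plus the change in a metabolic parameter (for example,
    # fasting insulin) measured four years later.
    n_samples, n_taxa = 300, 200
    abundances = rng.dirichlet(np.ones(n_taxa), size=n_samples)
    delta_insulin = abundances[:, :5].sum(axis=1) * 10 + rng.normal(0, 1, n_samples)

    # Fit a random forest and estimate predictive performance by cross-validation.
    model = RandomForestRegressor(n_estimators=500, random_state=0)
    r2 = cross_val_score(model, abundances, delta_insulin, cv=5, scoring="r2")
    print("cross-validated R^2:", r2.mean())

    # Rank taxa by how much predictive signal they carry.
    model.fit(abundances, delta_insulin)
    top_taxa = np.argsort(model.feature_importances_)[::-1][:10]
    print("most informative taxa (indices):", top_taxa)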

"Using gut microbiome as a risk factor for predicting various diseases is a promising research area, but further studies in different populations and with larger sample sets are needed in order to validate the results and to further develop the prediction models," said Org.

Credit: 
Estonian Research Council

To sustain a thriving café culture, we must ditch the disposable cup

image: Nearly 300 billion takeaway coffee cups end up in landfill each year.

Image: 
Pixabay

Takeaway coffees - they're a convenient start for millions of people each day, but while the caffeine perks us up, the disposable cups drag us down, with nearly 300 billion ending up in landfill each year.

While most coffee drinkers are happy to make a switch to sustainable practices, new research from the University of South Australia shows that an absence of infrastructure and a general 'throwaway' culture is severely delaying sustainable change.

It's a timely finding, particularly given the new bans on single-use plastics coming into effect in South Australia today, and the likelihood of takeaway coffee cups taking the hit by 2022.

Lead researcher, UniSA's Dr Sukhbir Sandhu, says the current level of coffee cup waste is unsustainable and requires a commitment from individuals, retailers, and government agencies alike to initiate change.

"There's no doubt we live in a disposable society - so much of our lives is about convenient, on-the-run transactions. But such a speedy pace encourages the 'takeaway and throwaway' culture that we so desperately need to change," Dr Sandhu says.

"Educating and informing people about the issues of single-use coffee cups is effective - people generally want to do the right thing - but knowing what's right and acting upon it are two different things, and at the moment, there are several barriers that are impeding potential progress.

"For example, if your favourite coffee shop doesn't offer recyclable or compostable cups, it's unlikely to stop you from getting a coffee; we need that coffee hit and we need it now. So, strike one.

"Then, with the popularity of arty, patterned paper cups on the rise, you may think you're buying a recyclable option. But no - most takeaway coffee cups are in fact lined with a waterproof plastic, which is not only non-recyclable, but also a contaminant. Strike two.

"Finally, if you happen upon a coffee shop that does offer recyclable coffee cups, once you're finished, where do you put it? A lack of appropriate waste disposal infrastructure means that even compostable cups are ending up in landfill. Strike three.

"As it happens, compostable cups need to go into a green organics bin, but these bins might not be easily accessible in public settings like the standard shopping precincts."

While the South Australian government is moving in the right direction with its Replace the Waste campaign, changing our 'grab and go' culture is challenging.

"It's important to drive home clear, strong messages about single-use plastics and their impact on the environment," Dr Sandhu says.

"The more we can drive people to choose reusable cups, the more uptake we'll see. People like to mimic what their colleagues, friends and peers do, especially when it is the right thing."

Credit: 
University of South Australia

Climate change threatens European forests

In recent years, European forests have suffered greatly from extreme climate conditions and their impacts. More than half of Europe's forests are potentially at risk from windthrow, forest fire, insect attacks or a combination of these. This is the main result of a study by an international team of scientists with the participation of Henrik Hartmann from the Max Planck Institute for Biogeochemistry in Jena, Germany. Using satellite data and artificial intelligence, the scientists studied vulnerability to disturbances in the period between 1979 and 2018. In the light of ongoing climate change, their findings are very important for improving mitigation and adaptation strategies as well as forest management to make European forests more resilient for the future.

Forests cover a good third of Europe's land mass, play an important role in regulating the climate, and provide a wide range of ecosystem services to humans. However, in recent decades climate change has made forests increasingly vulnerable to disturbances. Forest structure and prevailing climate largely determine how vulnerable forests are to perturbations, and vulnerability to insect infestations has increased notably in recent decades, especially in northern Europe. The boreal coniferous forests in cold regions and the dry forests of the Iberian Peninsula are among the most fragile ecosystems.

Insect infestations increasingly put forests at risk

Henrik Hartmann, research group leader at the Max Planck Institute for Biogeochemistry, observes forest responses from an ecophysiological perspective and sums up: "The experience of recent years, especially since 2018, has clearly shown that the threat to forests posed by insect pests has particularly increased with ongoing climate change. There is a risk that further climate warming will increase this trend."

Extreme weather conditions such as heat waves and drought weaken trees and make them vulnerable to insect pests. "This finding is not new, and forests are normally well adapted to deal with occasional climate extremes. The fact that these extremes are now occurring so frequently and repeatedly makes the exception the norm, and forests cannot cope with that situation," the expert explains.

Old trees are particularly vulnerable

The study also shows that large and old trees are particularly vulnerable to climatic extremes. In recent drought years, this has also been observed in Central European beech forests, where an increasing number of old trees suddenly died. "This is because their water transportation system has to work under greater stress to move water from the soil through the roots and up into the crown high above. As a result, large trees suffer more from drought and are then more susceptible to disease."

Large and older trees are also preferred hosts for harmful insects. For example, the European spruce bark beetle, which mainly attacks adult spruce trees, prefers to fly to larger individuals. In addition, large trees also provide a greater area for wind attacks during storm events. "The results of the study are conclusive from both an ecological and an ecophysiological perspective," summarizes Henrik Hartmann.

Existing European forests will not necessarily disappear, but many of them could be severely damaged by anticipated climate change-induced disturbances and important ecosystem services could be impaired by the loss of especially large and old trees.

Credit: 
Max-Planck-Gesellschaft

Predicting the onset of Alzheimer's disease (AD) using deep learning-based Splice-AI

image: High-throughput total RNA-seq profile for PLCγ1 and PLCβ subfamily gene expression in the cortex

Image: 
KBRI

Korea Brain Research Institute (KBRI, President Suh Pann-ghill) announced that the research team led by Dr. Jae-Yeol Joo has discovered, for the first time, new cryptic splice variants and SNVs in the PLCg1 gene of AD-specific models using Splice-AI.

This research outcome was published in PNAS, a world-renowned academic journal.

* (Title) Prediction of Alzheimer's Disease-Specific phospholipase c gamma-1 SNV by Deep Learning-Based Approach for High-Throughput Screening

Alternative splicing variants regulate gene expression and influence diverse phenotypes. In particular, genetic variants arising from RNA splicing are frequently found in individuals with neurodevelopmental disorders.

Using the deep learning-based Splice-AI, the research team revealed splicing events hidden within the transcriptome of AD models.

Through deep learning, 14 novel alternative splicing sites were identified in the body of the PLCg1 gene, a key element of the signal transduction mechanism.

Specifically, Splice-AI analysis predicted a total of 14 splicing sites in the human PLCg1 gene; their delta scores and positions were analyzed, and a novel splicing site was identified in exon 26 of human PLCg1. (Exon 26 of the human PLCg1 gene is 100% identical, in terms of amino acid sequence, to exon 27 of the same gene in mice.)

* (Splicing) A form of RNA processing that regulates gene expression through the medium of genetic information

* (PLCg1, phospholipase c gamma-1) An essential protein involved in cell signal transduction and human cell growth and death

* (Exon) A part of a gene that contains protein synthesis information
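
Splice-AI itself is an openly available deep-learning tool that annotates variants in a VCF file with four "delta scores" (acceptor gain, acceptor loss, donor gain, donor loss). As a hedged illustration of how such predictions are typically produced and read - this reflects the publicly documented SpliceAI interface, not necessarily the KBRI team's exact workflow, and the example values are invented - one might do the following in Python:

    # Annotate candidate variants (e.g., in PLCG1, the official symbol for PLCg1)
    # with SpliceAI delta scores, using the documented command-line interface:
    #   spliceai -I variants.vcf -O variants.spliceai.vcf -R hg38.fa -A grch38

    def parse_spliceai_info(info_field: str):
        """Extract delta scores from a SpliceAI INFO annotation of the form
        SpliceAI=ALLELE|SYMBOL|DS_AG|DS_AL|DS_DG|DS_DL|DP_AG|DP_AL|DP_DG|DP_DL."""
        for entry in info_field.split(";"):
            if entry.startswith("SpliceAI="):
                fields = entry.split("=", 1)[1].split("|")
                return {
                    "gene": fields[1],
                    "DS_AG": float(fields[2]),  # acceptor gain
                    "DS_AL": float(fields[3]),  # acceptor loss
                    "DS_DG": float(fields[4]),  # donor gain
                    "DS_DL": float(fields[5]),  # donor loss
                }
        return None

    # Invented example annotation, for illustration only:
    example = "SpliceAI=G|PLCG1|0.07|0.00|0.81|0.02|12|-4|3|-28"
    print(parse_spliceai_info(example))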

Abnormal RNA processing was identified, together with an SNV in exon 27 of the PLCg1 gene, in the brains of AD mouse models.

The research team revealed for the first time that the SNV leads to an amino acid change in the protein encoded by exon 27. This region is very important for homeostasis because the mutated sequence is evolutionarily conserved across species, including human, ape, mouse, chicken and zebrafish. Moreover, AD-specific nucleotide alteration sites were distributed in histone modification regions of the PLCg1 gene body during brain development.

Dr. Joo said, "Emerging variants of the coronavirus were reported in England in December 2020 and are more transmissible than previously circulating viruses.

"These variant coronaviruses carry mutations that alter the amino acid sequence of their spike protein.

"Our research will provide valuable information and techniques for studying various human diseases. By combining brain research with AI technology, which is at the core of the Fourth Industrial Revolution, to understand various diseases including AD, we will be able to obtain critical information for diagnosis and treatment strategies."

Credit: 
Korea Brain Research Institute