Culture

Finnish study proposes a model to predict cryptocurrency defaults

image: Cryptocurrency coins

Image: 
Flickr CC by alpari.org

University of Vaasa (Finland) researchers propose a model that explains 87 percent of cryptocurrency bankruptcies after only one month of trading. It could potentially serve as a screening tool for investors who want to boost the overall performance of their cryptocurrency portfolios by avoiding unreliable coins.

Thousands of cryptocurrencies are available today. Interestingly, only a small percentage are actively traded, whereas the vast majority appear to be inactive or to have already failed. Default risk is one of the risks associated with investing in digital assets such as cryptocurrencies. This raises the question of which factors help predict whether a cryptocurrency will eventually go bust.

A research article by Klaus Grobys and Niranjan Sapkota of the University of Vaasa, recently published in the well-known journal Applied Economics, addresses this question.

In their study, the researchers examined all 146 available Proof-of-Work-based cryptocurrencies that started trading before the end of 2014 and tracked their performance until December 2018. They found that about 60 percent of those cryptocurrencies eventually defaulted. Their model reveals four factors that could signal whether or not a coin will succeed.

The first factor is a coin's first-day performance. Specifically, how an Initial Coin Offering (ICO) performs on the first day coins are made public appears to be an indicator of long-term success. The study finds some evidence that those coins achieving the most long-term success also had the best first-day performances.

The second factor is a coin's pre-mining activity, which takes place before the ICO launch. The study's findings indicate that a high level of pre-mining raises a red flag. Even if excessive pre-mining doesn't necessarily guarantee failure, it raises suspicions of a potential bait-and-switch, often referred to as a 'pump and dump'.

The third factor that might play an important role is developer anonymity. The study provides evidence that 79 percent of all defaulted cryptocurrencies were developed by anonymous developers. Interestingly, a related statistic indicates that 58 percent of new coin developers choose to remain anonymous.

Finally, the study also shows that rewards and supply matter: there appears to be a correlation between longevity and two other factors, lower minimum mining rewards and a lower coin supply. This result is rather puzzling when considering how critical coin mining is to the sustainability of any cryptocurrency: without mining, a cryptocurrency network could not be maintained.
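For readers who want a concrete picture of how such a screening tool might be assembled, the following is a minimal Python sketch, not the authors' published specification. It scores coins with a simple logistic regression over stand-ins for the four factors above; all column names and values are hypothetical placeholders, not data from the study.

    # Minimal screening sketch (illustration only, not the authors' model).
    import pandas as pd
    from sklearn.linear_model import LogisticRegression

    # Hypothetical feature table: one row per coin.
    coins = pd.DataFrame({
        "first_day_return": [0.45, -0.10, 0.05, 0.80, -0.30],  # ICO day-1 performance
        "premine_share":    [0.02,  0.40, 0.10, 0.01,  0.60],  # fraction of supply pre-mined
        "anonymous_dev":    [0,     1,    1,    0,     1],     # 1 = developers anonymous
        "min_block_reward": [12.5,  50.0, 25.0, 5.0,  100.0],  # minimum mining reward
        "total_supply":     [21e6,  1e9,  84e6, 18e6,  2e9],   # coin supply
        "defaulted":        [0,     1,    0,    0,     1],     # observed outcome (1 = default)
    })

    X = coins.drop(columns="defaulted")
    y = coins["defaulted"]

    # Fit the screening model and estimate a default probability per coin.
    model = LogisticRegression().fit(X, y)
    coins["p_default"] = model.predict_proba(X)[:, 1]

    # Coins whose predicted risk exceeds a chosen threshold could be screened out.
    print(coins[["p_default"]])

As the researchers stress below, any such model captures associations rather than causes, so a screen like this would be a filter, not a guarantee.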

"Our study is a first attempt to reveal potential links between factors that could relate to coin success or failure. The links that we established in our study are not necessarily causal and much more research needs to be done on this issue," says Dr. Klaus Grobys from the University of Vaasa.

Credit: 
University of Vaasa

Study: Ultra-thin fibres designed to protect nerves after brain surgery

image: Light microscope image of nimodipine fibers.

Image: 
Johanna Zech

The drug nimodipine could prevent nerve cells from dying after brain surgery. Pharmacists at Martin Luther University Halle-Wittenberg (MLU), in cooperation with neurosurgeons at University Hospital Halle (Saale) (UKH), have developed a new method that enables the drug to be administered directly into the brain with fewer side effects. Their findings were published in the European Journal of Pharmaceutics and Biopharmaceutics.

Brain surgery poses a major threat to nerve cells. Even slight injuries can kill the sensitive cells. The drug nimodipine could help prevent this. It is currently used to treat cerebral haemorrhages: it relaxes blood vessels, which can prevent cramping, and it also appears to stop nerve cells from dying. The research group led by Professor Karsten Mäder from the Institute of Pharmacy at MLU has now developed a system that enables the drug to be administered directly into the brain. “The neurosurgeons wanted the drug to be applied locally in order to reduce potential side effects,” explains Mäder.

His research group has integrated nimodipine into biodegradable polymer fibres. The fibres are only one to two micrometres thick; they degrade in the body, and the material they are made of is already widely used in medicine. “If you want to apply something directly to the nerves, it must be well tolerated,” says Mäder. This is because nerve cells are particularly sensitive. So far, the nimodipine-polymer fibres have been tested in the laboratory for stability and for their effect on different cell cultures. Mäder’s team of researchers has been able to show that the fibres release the active ingredient at a very constant rate. This is important because it prevents the side effects an overdose would cause.

Professor Christian Scheller’s research group in the Department of Neurosurgery at UKH then tested how the fibres affected various brain cells. The fibres exhibited no toxic effects. Under various stress conditions, such as heat or high salt concentrations, they reduced the number of cell deaths, in some cases drastically. Nerve cells particularly benefited from the treatment. “In the cell systems, we were able to show that the effect was as good as if we had added the active ingredient without the fibres, in other words intravenously,” says Scheller. However, the intravenous route has several disadvantages: the active ingredient degrades very quickly and has undesirable side effects, as it relaxes blood vessels not only in the brain but throughout the entire body, including the heart muscles. This can lead to dangerously low blood pressure if the dose is too high. Applying the drug directly to the brain could minimize these side effects because significantly less of the active ingredient is required.

The fibres could also be used outside the brain, says Scheller, for example in different types of operations where nerves are at risk.

Credit: 
Martin-Luther-Universität Halle-Wittenberg

Multifunctional e-glasses monitor health, protect eyes, control video game

image: Smart e-glasses can wirelessly monitor EEG and EOG signals, UV intensity, and body movements, while also acting as sunglasses and a human-machine interface.

Image: 
Adapted from ACS Applied Materials & Interfaces 2020, DOI: 10.1021/acsami.0c03110

Fitness tracker bracelets and watches provide useful information, such as step count and heart rate, but they usually can't provide more detailed data about the wearer's health. Now, researchers reporting in ACS Applied Materials & Interfaces have developed smart electronic glasses (e-glasses) that not only monitor a person's brain waves and body movements, but also can function as sunglasses and allow users to control a video game with eye motions.

Devices that measure electrical signals from the brain (electroencephalogram; EEG) or eyes (electrooculogram; EOG) can help diagnose conditions like epilepsy and sleep disorders, as well as control computers in human-machine interfaces. But obtaining these measurements requires steady physical contact between skin and sensor, which is difficult to achieve with rigid devices. Suk-Won Hwang and colleagues wanted to integrate soft, conductive electrodes into e-glasses that could wirelessly monitor EEG and EOG signals, ultraviolet (UV) intensity, and body movements or postures, while also acting as a human-machine interface.

The researchers built the glasses' frame with a 3D printer and then added flexible electrodes near the ears (EEG sensor) and eyes (EOG sensor). They also added a wireless circuit for motion/UV sensing on the side of the glasses and a UV-responsive, color-adjustable gel inside the lenses. When the sensor detected UV rays of a certain intensity, the lenses changed color and became sunglasses. The motion detector allowed the researchers to track the posture and gait of the wearer, as well as detect when they fell. The EEG recorded alpha rhythms of the brain, which could be used to monitor health. Finally, the EOG monitor allowed the wearer to easily move bricks around in a popular video game by adjusting the direction and angle of their eyes. The e-glasses could be useful for digital healthcare or virtual reality applications, the researchers say.
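The release does not describe the signal-processing behind the eye-controlled game, so the following Python sketch only illustrates how such an interface could work in principle: large deflections in a horizontal EOG channel are mapped to left/right commands via a simple amplitude threshold. The threshold, units and sample values are hypothetical.

    # Illustrative EOG-to-command mapping (hypothetical, not the authors' pipeline).
    import numpy as np

    def eog_to_commands(eog_uv, threshold_uv=150.0):
        """Return one command per sample: 'left', 'right', or None."""
        commands = []
        for sample in eog_uv:
            if sample > threshold_uv:
                commands.append("right")   # large positive deflection: gaze right
            elif sample < -threshold_uv:
                commands.append("left")    # large negative deflection: gaze left
            else:
                commands.append(None)      # gaze near center: no command
        return commands

    # Hypothetical snippet of a horizontal EOG channel (microvolts).
    signal = np.array([10, 20, 180, 200, 15, -170, -190, 5])
    print(eog_to_commands(signal))

In practice such a system would also filter the raw signal and debounce repeated commands, but the thresholding idea is the core of mapping eye movements to game input.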

Credit: 
American Chemical Society

Superworms digest plastic, with help from their bacterial sidekicks

image: Bacteria from the gut of superworms can degrade polystyrene (white material).

Image: 
Adapted from Environmental Science & Technology 2020, DOI: 10.1021/acs.est.0c01495

Resembling giant mealworms, superworms (Zophobas atratus) are beetle larvae that are often sold in pet stores as feed for reptiles, fish and birds. In addition to their relatively large size (about 2 inches long), these worms have another superpower: They can degrade polystyrene plastic. Now, researchers reporting in ACS' Environmental Science & Technology have linked this ability to a strain of bacteria that lives in the larvae's gut.

Polystyrene is used in packaging containers, disposable cups and insulating materials. When thrown in landfills or littered in the environment, the plastic takes several hundred years to completely break down. Recently, several studies have found that mealworms and superworms can ingest and degrade polystyrene within a few weeks. In mealworms, this ability was linked to a certain strain of polystyrene-degrading bacteria in the worms' gut. Jiaojie Li, Dae-Hwan Kim and colleagues wanted to search for similar bacteria in superworms.

The team placed 50 superworms in a chamber with polystyrene as their only carbon source, and after 21 days, the worms had consumed about 70% of the plastic. The researchers then isolated a strain of Pseudomonas aeruginosa bacteria from the gut of the worms and showed that it could grow directly on the surface of polystyrene and break it down. Finally, they identified an enzyme from the bacteria, called a serine hydrolase, that appeared to be responsible for most of the biodegradation. This enzyme, or the bacteria that produce it, could someday be used to help break down waste polystyrene, the researchers say.

The authors acknowledge funding from the CJ Blossom Idea Lab of the CJ Corporation, the Undergraduate Research Program at Daegu Gyeongbuk Institute of Science and Technology and the INGE funds of Gwangju Institute of Science and Technology.

Credit: 
American Chemical Society

New cancer immunotherapy targeting myeloid cells slows tumor growth

PHILADELPHIA - A new class of checkpoint inhibitors -- a type of immunotherapy -- that targets myeloid immune cells and slows tumor growth has been discovered by a team from the Perelman School of Medicine at the University of Pennsylvania and other institutions. Reporting in Nature Cancer, the researchers showed for the first time in human cells and a mouse model that inhibiting the c-Rel molecule in myeloid cells -- as opposed to the lymphoid cells that today's immunotherapies target -- blocked the production of immune suppressor cells and significantly shrank tumors.

Checkpoint inhibitors block proteins, called checkpoints, that are made by some types of immune system cells, such as T-cells. These checkpoints help keep immune responses from being too strong, but they often keep T-cells from killing cancer cells. These therapies have changed the cancer landscape by showing survival benefits where traditional therapies, like chemotherapy, may have failed. However, the number of patients who respond to these types of therapies remains limited, pushing researchers to explore a new class of inhibitors.

The team showed that in c-Rel-deficient mice tumor size and weight were reduced by up to 80 percent, and that administering the c-Rel inhibitor drug in another set of mice shrank tumors by up to 70 percent, compared to controls.

The findings not only show the potential of this new immunotherapy, but also point to a previously unknown pathway of cancer's assault on the body involving what are known as myeloid-derived suppressor cells (MDSCs). Cunning tumor cells, the authors found, hijack c-Rel to produce MDSCs that keep the immune system from attacking the cancer. The Penn-developed inhibitor releases that brake.

"c-Rel is generally considered to be a promotor of immune responses, not a suppressor. That's why this discovery is surprising and unexpected," said senior author Youhai H. Chen, MD, PhD, a professor of Pathology and Laboratory Medicine in the Perelman School of Medicine. "There are two big takeaways: conceptually, this is a new pathway of cancer development that wasn't known before. And we have shown that a new drug inhibitor targeting this pathway works as well, if not better, than the first generation of checkpoint blockers."

The discovery of c-Rel's role in cancer was serendipitous. Chen's lab was studying c-Rel's role in inflammation and autoimmune diseases when they observed a relationship with MDSCs. Already equipped with a c-Rel mouse model, they decided to follow the thread and investigate whether c-Rel played a role in cancer growth, too, given the known function of MDSCs.

The results, said Chen, whose lab has been studying c-Rel for nearly two decades, were striking.

Beyond significant shrinkage of tumors in mice, in another experiment, the researchers deleted the REL gene, which blocked tumor growth and reduced MDSCs in mice, suggesting c-Rel is required to generate MDSCs. Follow-up genetic sequencing also showed how c-Rel turns on pro-tumor gene signatures that suppress functions of the immune system in MDSCs.

The researchers then tested their c-Rel inhibitor drug in mice and found that it not only reduced tumor growth, but also enhanced the effects of anti-PD-L1 therapy, another checkpoint inhibitor, when given in combination with the c-Rel drug. That one-two punch approach had the strongest suppression of tumor growth.

Combination therapy has become a popular approach for treating cancer patients, especially for those who do not respond well to other treatments. "In patients who respond to anti-PD-L1 treatments, many of them still die after two years," Chen said. "If you could extend lives by adding another effective approach, that would be a big advance."

Using human cells, the team also showed that its c-Rel inhibitor drug blocked the development of MDSCs in vitro, suggesting that inhibition might help eliminate cancer in patients, according to the authors.

Now that the efficacy of the inhibitor has been demonstrated in the preclinical setting, Chen said, the next step will be studies to assess the safety of the drug, before moving on to further studies and clinical trials.

"This represents a new class of checkpoints belonging to a different type of cell in the immune system that could move the field of immunotherapy even further along," Chen said.

Credit: 
University of Pennsylvania School of Medicine

How do we disconnect from the environment during sleep and under anesthesia?

During sleep and under anesthesia, we rarely respond to external stimuli such as sounds, even though our brains remain highly active.

Now, a series of new studies by researchers at Tel Aviv University's Sackler Faculty of Medicine and Sagol School of Neuroscience find, among other important discoveries, that noradrenaline, a neurotransmitter secreted in response to stress, lies at the heart of our ability to "shut off" our sensory responses and sleep soundly.

"In these studies, we used different, novel approaches to study the filtering of sensory information during sleep and the brain mechanisms that determine when we awaken in response to external events," explains Prof. Yuval Nir, who led the research for the three studies.

The first study, published in the Journal of Neuroscience on April 1 and led by TAU doctoral student Yaniv Sela, calls into question the commonly accepted idea that the thalamus -- an important relay station for sensory signals in the brain -- is responsible for blocking the transmission of signals to the cerebral cortex.

"The shutdown of the thalamic gate is not compatible with our findings," says Sela whose study compares how neurons in different brain regions respond to simple and complex sounds while asleep or awake.

Using rat models, he found that the responses of neurons in the auditory cortex were similar when the rodents were awake or asleep. But when he examined the perirhinal cortex, related to complex conscious perception and memory associations, he found that neurons showed much weaker responses during sleep.

"Basic analysis of sound remains during sleep, but the sleeping brain has trouble creating a conscious perception of the stimulus," Sela adds. "Also, while we found that initial and fast responses are preserved in sleep, those that occur later and require communication between different regions in the cortex are greatly disrupted."

The second study, published on April 8 in Science Advances, finds that the locus coeruleus, a tiny region of the brainstem and the main source of noradrenaline secretion in the brain, plays a central role in our ability to disconnect from the environment during sleep. Led by TAU doctoral student Hanna Hayat at Prof. Nir's lab, the research was conducted in collaboration with Prof. Tony Pickering of Bristol University, Prof. Ofer Yizhar of the Weizmann Institute and Prof. Eric Kremer of the University of Montpellier.

"The ability to disconnect from the environment, in a reversible way, is a central feature of sleep," explains Hayat. "Our findings clearly show that the locus coeruleus noradrenaline system plays a crucial role in this disconnection by keeping a very low level of activity during sleep."

For the purpose of the research, the scientists used rat models to determine the level of locus coeruleus activity during sleep and which sounds, if any, would be responsible for waking up the rodents.

They found that the rats' varying levels of locus coeruleus activity accurately predicted whether the animals would awaken in response to sounds. The team then silenced locus coeruleus activity using optogenetics, which harnesses light to control neuronal activity, and found that the rats did not readily awaken in response to sound.

"When we increased the noradrenaline activity of the locus-coeruleus while a sound played in the background, the rats woke up more frequently in response, but when we decreased the activity of the locus coeruleus and played the same sound in the background, the rats only rarely woke up," says Hayat. "So we can say we identified a powerful 'dial' that controls the depth of sleep despite external stimuli.

"Importantly, our findings suggest that hyperarousal in some individuals who sleep lightly, or during periods of stress, may be a result of continued noradrenaline activity during sleep when there should only be minimal activity."

The third study, published on May 12 in the Proceedings of the National Academy of Sciences (PNAS), led jointly by TAU doctoral student Dr. Aaron Krom of Hadassah Hebrew University Medical Center and TAU doctoral student Amit Marmelshtein, focuses on our response to anesthesia and finds that the most significant effect of loss-of-consciousness is the disruption of communication between different cortical regions.

The study was the fruit of a collaboration between Prof. Nir, Prof. Itzhak Fried and Dr. Ido Strauss of TAU's Sackler Faculty of Medicine and Tel Aviv Sourasky Medical Center, and a team at Bonn University.

"Despite the routine use of anesthesia in medicine, we still do not understand how anesthesia leads to loss of consciousness; this is considered a major open question in biomedical research," explains Dr. Krom.

For the research, the scientists recorded the brain activity of epilepsy patients who had previously shown little to no response to drug interventions. The patients were hospitalized for a week and implanted with electrodes to pinpoint where in the brain their seizures originated. They were then anesthetized for the removal of the electrodes, and their neuronal activity was recorded while they listened to sounds through headphones. They were asked to perform a task until they lost consciousness, which allowed the researchers to examine how their brain activity changed, down to individual neurons, in response to sounds at the very moment they lost consciousness.

"We found that loss-of-consciousness disrupted communication between cortical regions such that sounds triggered responses in the primary auditory cortex, but failed to reliably drive responses in other regions of the cortex," adds Marmelshtein. "This is the first study to examine how anesthesia and loss of consciousness affect sensory responses at a resolution of individual neurons in humans. We hope that our results will guide future research, as well as attempts to improve anesthesia and develop instruments that can monitor the level of consciousness in anesthesia and other states of altered consciousness such as vegetative states and severe dementia."

"These studies advance our understanding of sensory disconnection during sleep and anesthesia," concludes Prof Nir. "Sleep disturbances are a major health issue and are frequent in aging, as well as in neurological and psychiatric disorders. It is important to test if our findings on varying noradrenaline levels can explain hyperarousal that characterizes condition such as anxiety disorders and PTSD, and if so to build on these findings to develop novel methods to improve sleep quality."

Credit: 
American Friends of Tel Aviv University

Same father, same face

image: Is the similarity of facial features in mandrills a consequence of selection? Using up-to-date artificial intelligence, scientists from the German Primate Center and the Institut des Sciences de l'Evolution de Montpellier (ISEM) tested this hypothesis.

Image: 
Paul Amblard

More like mom or dad? Human babies always attract curious looks and the question of whom the child resembles most. The answers vary depending on the degree of kinship, gender and the time of assessment. Mandrills, monkeys living in Equatorial Africa, may recognize facial features that signal relatedness better than humans do. Scientists at the German Primate Center - Leibniz Institute for Primate Research in Göttingen, together with colleagues from the Institut des Sciences de l'Evolution de Montpellier (ISEM), used up-to-date artificial intelligence (AI) to show that half-sisters who have the same father look more alike than half-sisters who share the same mother. The paternal half-sisters also have closer social relationships with each other than unrelated mandrills. This result provided the first evidence suggesting that interindividual resemblance has been selected to signal paternal kinship (Science Advances).

Throughout the animal kingdom, related conspecifics show similar features; some are the spitting image of each other. However, whether resemblance between kin merely reflects their genetic similarity or results from selection to facilitate their recognition is still unknown. A team of scientists led by Marie Charpentier of the ISEM in Montpellier, including Clémence Poirotte and Peter Kappeler of the German Primate Center in Göttingen, used artificial intelligence (deep learning) for the first time to examine the hypothesis that the similarity of the facial features of free-living mandrills is the result of selection. The database consisted of 16,000 portraits of mandrills, taken since 2012 as part of the Project Mandrillus in Gabon. This natural population of mandrills is the only one habituated to the presence of humans. Using a trained deep-learning algorithm, the researchers first identified the individuals and then quantified their facial resemblance. The results were subsequently related to relatedness data for the study animals.
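The release does not name the network the team trained, so the Python sketch below only illustrates the general idea: use a convolutional network as a face-feature extractor and treat the cosine similarity between two portraits' embeddings as a resemblance score. The ImageNet-pretrained backbone and the image file names are stand-ins, not the Mandrillus pipeline.

    # Illustrative face-resemblance scoring with a generic pretrained CNN.
    import torch
    import torchvision.models as models
    import torchvision.transforms as T
    from PIL import Image

    # Truncate an ImageNet-pretrained ResNet before its classification layer.
    backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
    backbone.fc = torch.nn.Identity()
    backbone.eval()

    preprocess = T.Compose([
        T.Resize(256), T.CenterCrop(224), T.ToTensor(),
        T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
    ])

    def embed(path):
        """Return a feature vector for one portrait."""
        img = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
        with torch.no_grad():
            return backbone(img).squeeze(0)

    def resemblance(path_a, path_b):
        """Cosine similarity between two portraits' embeddings."""
        a, b = embed(path_a), embed(path_b)
        return torch.nn.functional.cosine_similarity(a, b, dim=0).item()

    # Hypothetical portraits of two mandrills; higher scores mean the faces
    # look more alike, and scores could then be related to pedigree data.
    print(resemblance("mandrill_A.jpg", "mandrill_B.jpg"))

A dedicated face-recognition model trained on the mandrill portraits themselves, as in the study, would yield far more reliable scores than a generic backbone, but the similarity-of-embeddings principle is the same.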

Mandrills live in groups of more than 100 individuals in which the females are maternal relatives. They are familiar with each other and remain in the same family throughout their lives. Since reproduction in mandrill groups is mainly monopolized by the alpha male, young mandrills of similar age often have the same father. However, as members of different family groups within the large group, they should hardly know each other. Nevertheless, paternal half-sisters, as well as maternal half-sisters, interact with each other more often than unrelated animals. "This observation suggests that paternal half-sisters recognize each other as relatives by their facial features. Although maternal and paternal half-sisters share the same degree of genetic relatedness, the facial resemblance is stronger among paternally related females. We suspect that the similarity of facial features between paternal relatives has evolved to facilitate social discrimination and nepotism between kin," says Clémence Poirotte.

Credit: 
Deutsches Primatenzentrum (DPZ)/German Primate Center

Two anti-inflammatory drugs found that inhibit the replication of the COVID-19 virus

Since the outbreak of the COVID-19 pandemic and its rapid spread, the scientific community has been working to develop an effective treatment for the virus responsible for the disease. Finding drugs that can inhibit infection by SARS-CoV-2 is an essential step until a vaccine can definitively bring the spread of the virus to an end. In this regard, the URV's Cheminformatics and Nutrition research group has carried out a computational screening to predict whether any medicine already authorised for treating another pathology can inhibit the main protease of the virus (M-pro). This protease is a key target because it plays an essential role in the replication of the virus.

The study demonstrates that two anti-inflammatory drugs - Carprofen, a veterinary medicine, and Celecoxib, approved for human use - inhibit a key enzyme in the replication and transcription of the virus responsible for COVID-19. The aim of the study was to use computational techniques to analyze whether any of 6,466 drugs authorized by various drug agencies for human or veterinary use could inhibit the M-pro enzyme. This enzyme is a protease that cuts two polypeptides (generated by the virus itself) into a number of proteins that are essential for the reproduction of the virus. Some of the trials coordinated by the WHO against the COVID-19 pandemic also aim to inhibit M-pro, using two antiretrovirals, Lopinavir and Ritonavir (drugs initially designed to treat HIV).

As a result of the study conducted at the URV, 7 of these 6,466 drugs were predicted to inhibit M-pro. The results were shared with the international initiative COVID Moonshot, which selected 2 of the 7 compounds (Carprofen and Celecoxib) to test their ability to inhibit M-pro in vitro. The results show that at a concentration of 50 μM, Celecoxib and Carprofen inhibit the in vitro activity of M-pro by 11.90 and 4.0 percent, respectively. Both molecules could therefore be used as a starting point for further lead optimization to obtain even more potent derivatives.
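Percent inhibition in assays of this kind is conventionally reported as the reduction in enzyme activity relative to an uninhibited control. The short Python sketch below shows that calculation; the activity readings are hypothetical placeholders chosen only to illustrate figures of the magnitude reported.

    # Standard percent-inhibition calculation (illustrative values only).
    def percent_inhibition(activity_with_drug, activity_control):
        """Reduction in enzyme activity relative to an uninhibited control."""
        return 100.0 * (1.0 - activity_with_drug / activity_control)

    # Hypothetical M-pro activity readings (arbitrary fluorescence units).
    control = 1000.0
    print(percent_inhibition(881.0, control))   # ~11.9 % inhibition
    print(percent_inhibition(960.0, control))   # ~4.0 % inhibition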

The study by the Cheminformatics and Nutrition research group of the URV's Department of Biochemistry and Biotechnology was led by Drs. Gerard Pujadas and Santi Garcia-Vallvé with the collaboration of Drs. Aleix Gimeno, María José Ojeda-Montes and Adrià Cereto-Massagué, the PhD students Guillem Macip and Bryan Saldivar-Espinoza, and Júlia Mestres-Truyol, a student on the URV's double degree in Biotechnology and in Biochemistry and Molecular Biology. It has been published in the International Journal of Molecular Sciences (IJMS) and is the first study published worldwide on drug repositioning against SARS-CoV-2 M-pro in which the computational predictions are experimentally corroborated. The remaining 5 molecules are expected to be selected soon by COVID Moonshot so that their bioactivity can be tested as well.

Credit: 
Universitat Rovira i Virgili

In chimpanzees, females contribute to the protection of the territory

image: In Taï chimpanzees, both sexes participate in territorial maintenance via border patrols and engage in territorial conflicts with hostile neighbors.

Image: 
Liran Samuni

In many social species, including humans, large group sizes provide competitive advantages over smaller neighboring groups. Even so, a preponderant role of adult males in territoriality has often been assumed, a view most likely biased by an anthropocentric perspective.

In chimpanzees, one of our closest living relatives and, together with humans, one of the most territorial primate species, males are known to actively engage in inter-group conflicts and territorial behavior, while females are assumed to stay out of this male business. This pattern seems to hold in some eastern chimpanzee populations. However, past findings in western chimpanzees from the Taï National Park in Côte d'Ivoire already suggested that females play a more important part in territorial behavior than previously thought. What determines chimpanzees' competitive ability in territorial contests, given this variation in social systems, therefore remains unclear.

"To unravel the mechanisms by which group competition and dominance acts in social species, we need long-term data on several groups within a population", explains Sylvain Lemoine, first author of a new study exploring these mechanisms in western chimpanzees. By compiling more than 20 years of data on four neighboring communities of western chimpanzees from the Taï Chimpanzee Project (TCP), including information on ranging patterns and inter-group encounters, the researchers found that group size better explains the variation in the costs and benefits of between-group competition than the number of adult males.

Advantage for larger groups

Although an increase in the number of adult males was associated with territory gains, larger groups - including females, adolescents and juveniles - gain an advantage over smaller communities by accessing larger territories and suffering less pressure from neighbors. Larger and safer feeding grounds could then confer advantages on individuals in terms of reproductive success.

"The Taï chimpanzees are known to show high levels of gregariousness among males and females, associated with strong social bonds and cooperation between individuals", says Roman Wittig, one of the senior authors of the study and director of the TCP. In this population, qualified as 'bisexually-bonded', males and females of the same community occupy a similar territory, rather than being segregated in their own smaller home-ranges. Both sexes participate to territorial maintenance via border patrols and engage in territorial conflicts with hostile neighbors. Thus, grouping and socializing patterns in this population likely explain why group size, rather than just the number of males, confer their competitive ability in this population.

Fewer inter-community killings

"Adult males play a role in territory increase, but territorial maintenance, dominance over neighbors and their repulsion also require females and the rest of the group", adds Sylvain Lemoine. These findings may partly explain the relatively low level of inter-community killings observed in this population, as compared to eastern chimpanzees: when both opponent groups interact with all their members being engaged, the odds of imbalance of power are reduced, as well as the opportunities to kill a neighbor at low-risk.

Catherine Crockford, another senior author of this study, summarizes: "We are now accumulating compelling evidence that cooperation among non-related individuals, and among males and females, has likely been under selection due to competition between neighboring groups. These findings therefore have strong implications for understanding the evolution of cooperation in social species, and in humans in particular. Continuous and long-lasting research and conservation efforts are necessary if we want to understand the interplay between competition and cooperation in this emblematic species for human evolution, and how it relates to our own evolution."

Credit: 
Max Planck Institute for Evolutionary Anthropology

Study reports nursing home hip fracture rates stay persistently high

A recent study of hip fracture rates in U.S. nursing homes reports a slight rise in the rate of hip fractures among long-stay residents in recent years. Researchers looked at data collected between 2007 and 2015 and found that, despite a dip in 2013, rates have begun to rise again even though long-stay nursing home admissions have declined. Sarah T. Berry, M.D., M.P.H., Associate Director of the Musculoskeletal Research Center in the Hinda and Arthur Marcus Institute for Aging Research, is lead author of the paper, recently published in the Journal of Bone and Mineral Research. The purpose of the study was to describe trends in hip fracture rates and post-fracture mortality among 2.6 million newly admitted U.S. nursing home residents from 2007 to 2015, and to examine whether these trends could be explained by differences in resident characteristics. Understanding the prevalence of fractures and what puts individuals at risk is important to reducing the incidence of fractures in this vulnerable population.

Thirty-six percent of nursing home residents with hip fractures will die within six months, and another 17.3 percent of ambulatory residents will become completely disabled. Among survivors, infections and pressure ulcers are common, leading to functional decline and a diminished quality of life.

Given the high morbidity, mortality, and financial expense associated with them, hip fractures are a major public health concern. It is important, then, to characterize temporal trends in hip fractures to inform interventions and national policies aimed at reducing them.

"Our findings underscore the magnitude of the hip fracture problem in the U.S. and should prompt widespread interventions to reduce the suffering associated with hip fractures in older adults," said Dr. Berry.

The study's researchers can only speculate at this point as to why nursing home fracture rates have not declined during the past decade. Among non-institutionalized older adults, a ten-year downward trend in hip fracture incidence has been leveling out in the past few years. According to the paper, "available strategies exist to prevent falls and associated injuries, albeit, the success of falls prevention programs in the nursing home has been less than in the community setting. In general, nursing home residents are older and sicker, with more cognitive and functional impairment than community-dwellers. One possible explanation of these high rates is the underutilization of medications to treat osteoporosis."

Credit: 
Hebrew SeniorLife Hinda and Arthur Marcus Institute for Aging Research

Simple and readily available saline solution can reliably transport COVID-19 samples to testing labs

Philadelphia, May 27, 2020 - In a new peer-reviewed study appearing in The Journal of Molecular Diagnostics, published by Elsevier, investigators report that a simple salt solution commonly found in hospitals and clinical laboratories, phosphate buffered saline (PBS), can be used as a medium to reliably transport coronavirus-contaminated specimens to the testing laboratory for periods of up to 18 hours, which is often needed in busy clinical settings.

Clinical specimens suspected to contain viruses are usually shipped to testing laboratories in a complex mixture called virus transport medium (VTM). As the COVID-19 pandemic sweeps across the world, the availability of VTM has become severely limited, contributing to delays in diagnosis and rationing of diagnostic testing. Recognizing that SARS-CoV-2 viral RNA can remain stable on certain surfaces for up to 72 hours, scientists at Rutgers University hypothesized that PBS, which is inexpensive and commonly available in clinical settings, could be used instead.

"Necessity is the mother of invention," explains senior investigator Martin J. Blaser, MD, Department of Medicine, Rutgers Robert Wood Johnson Medical School, New Brunswick, NJ, USA; and Center for Advanced Biotechnology and Medicine, Rutgers University, Piscataway, NJ, USA. "We were facing a problem about specimens reaching the lab intact due to the nationwide shortage of VTM, so we endeavored to test a widely available solution to see if it could help us. But first, we had to determine whether or not it worked."

Three experimental procedures were performed using discarded respiratory secretions from 16 confirmed COVID-19 patients. Swabs were dropped into vials containing either PBS or VTM and transported to the laboratory for analysis using reverse-transcription quantitative PCR (RT-qPCR) testing to detect the presence of three specific SARS-CoV-2 genes in the samples. In the first procedure, eight samples from two subjects were harvested at the same time and transported in either VTM or PBS. The samples were processed immediately or after two hours at room temperature. SARS-CoV-2 was detected in all samples at similar levels for each patient, demonstrating that results were consistent for samples obtained and stored in identical manners.

In a parallel experiment, samples were tested after remaining at room temperature in either PBS or VTM for periods ranging from 0 to 18 hours. These experiments mimicked field conditions, in which specimens may remain in transport for longer periods. The researchers found that storage at room temperature had little effect on the values detected for the three SARS-CoV-2 genes in either PBS or VTM, and that this stability could be useful for labs that test for several SARS-CoV-2 genes with different processing times.

The researchers also examined samples transported in both PBS and VTM from an additional 12 patients with unknown viral loads. Again, they found the storage medium did not affect the detectability of the virus.

"That results for all three viral genes tested were strongly correlated across samples from multiple patients support the robustness of the entire testing pathway, including transport, establishing PBS as a dependable transport medium for use with clinical samples," notes Dr. Blaser. "Our contribution will allow for increased effective testing and transport from relatively long distances to testing labs, at lower cost."

The researchers note that expanded testing capacity would facilitate more widespread surveillance and containment of COVID-19 in communities, allowing fewer restrictions in work, travel, and social distancing.

Credit: 
Elsevier

Under pressure, black holes feast

New Haven, Conn. -- A new, Yale-led study shows that some supermassive black holes actually thrive under pressure.

It has been known for some time that when distant galaxies -- and the supermassive black holes within their cores -- aggregate into clusters, these clusters create a volatile, highly pressurized environment. Individual galaxies falling into clusters are often deformed during the process and begin to resemble cosmic jellyfish.

Curiously, the intense pressure squelches the creation of new stars in these galaxies and eventually shuts off normal black hole feeding on nearby interstellar gas. But not before allowing the black holes one final feast of gas clouds and the occasional star.

The researchers also suggested this rapid feeding might be responsible for the eventual lack of new stars in those environments. The research team said "outflows" of gas, driven by the black holes, might be shutting off star formation.

"We know that the feeding habits of central supermassive black holes and the formation of stars in the host galaxy are intricately related. Understanding precisely how they operate in different larger-scale environments has been a challenge. Our study has revealed this complex interplay," said astrophysicist Priyamvada Natarajan, whose team initiated the research. Natarajan is a professor of astronomy and physics in Yale's Faculty of Arts and Sciences.

The study is published in the Astrophysical Journal Letters. The first author is Angelo Ricarte, a former member of Natarajan's lab now at Harvard, who started this work as a Yale doctoral student. Co-authors are Yale Center for Astronomy and Astrophysics Prize postdoctoral associate Michael Tremmel and Thomas Quinn of the University of Washington.

The new study adds to a significant body of work from Natarajan's research group regarding how supermassive black holes form, grow, and interact with their host galaxies in various cosmic environments.

The researchers conducted sophisticated simulations of black holes within galaxy clusters using RomulusC, a cosmological simulation that Tremmel, Quinn and others developed.

Ricarte developed new tools for extracting information from RomulusC. While analyzing black hole activity in the cluster simulation, he said, he noticed "something weird happening once their host galaxies stopped forming stars. Surprisingly, I often spotted a peak in black hole activity at the same time that the galaxy died."

That "peak" would be the black hole's big, final feast, under pressure.

Tremmel said that "RomulusC is unique because of its exquisite resolution and the detailed way in which it treats supermassive black holes and their environments, allowing us to track their growth."

Credit: 
Yale University

Mouse model mimics SARS-CoV-2 infection in humans

A mouse model of infection with severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) reproduces features observed in human patients, researchers report May 26 in the journal Cell Host & Microbe. Using CRISPR/Cas9 gene editing technology, the researchers generated mice that produce human angiotensin-converting enzyme II (hACE2) -- the receptor that SARS-CoV-2 binds to and uses to enter human cells.

"A small animal model that reproduces the clinical course and pathology observed in COVID-19 patients is highly needed," says co-senior study author You-Chun Wang of the National Institutes for Food and Drug Control (NIFDC) in Beijing, China. "The animal model described here provides a useful tool for studying SARS-CoV-2 infection and transmission."

Wang and his collaborators used CRISPR/Cas9 to generate a mouse model that expresses hACE2. According to the authors, their mouse model has several advantages over other genetically engineered mice that express hACE2 for modeling SARS-CoV-2 infection. Instead of being randomly inserted, hACE2 is inserted precisely into a specific site on the X chromosome, where it completely replaces the mouse version of the protein. In addition, the model is genetically stable, with few differences among individuals. Moreover, the viral RNA loads in the lung are much higher, and the resulting distribution of hACE2 in various tissues better matches that observed in humans.

After being infected with SARS-CoV-2 through the nose, the genetically engineered mice showed evidence of robust viral RNA replication in the lung, trachea, and brain. "The presence of viral RNAs in brain was somewhat unexpected, as only a few COVID-19 patients have developed neurological symptoms," says co-senior study author Cheng-Feng Qin of the Academy of Military Medical Sciences (AMMS) in Beijing, China.

SARS-CoV-2 S protein, which binds to hACE2 to enter host cells, was also present in the lung tissue and brain cells. Moreover, the researchers identified the major airway cells targeted by SARS-CoV-2 as Clara cells that produce the protein CC10. "Our result provides the first line of evidence showing the major target cells of SARS-CoV-2 in the lung," says co-senior study author Yu-Sen Zhou of AMMS.

In addition, the mice developed interstitial pneumonia, which affects the tissue and space around the air sacs of the lungs, causing the infiltration of inflammatory cells, the thickening of the structure that separates air sacs, and blood vessel damage. Compared with young mice, older mice showed more severe lung damage and increased production of signaling molecules called cytokines. Taken together, these features recapitulate those observed in COVID-19 patients.

When the researchers administered SARS-CoV-2 into the stomach, two of the three mice showed high levels of viral RNA in the trachea and lung. The S protein was also present in lung tissue, which showed signs of inflammation. According to the authors, these findings are consistent with the observation that patients with COVID-19 sometimes experience gastrointestinal symptoms such as diarrhea, abdominal pain, and vomiting. However, a dose 10 times higher was required to establish infection through the stomach than through the nose.

Future studies using this mouse model may shed light on how SARS-CoV-2 invades the brain and how the virus survives the gastrointestinal environment and invades the respiratory tract. "The hACE2 mice described in our manuscript provide a small animal model for understanding unexpected clinical manifestations of SARS-CoV-2 infection in humans," says co-senior study author Chang-Fa Fan of NIFDC. "This model will also be valuable for testing vaccines and therapeutics to combat SARS-CoV-2."

Credit: 
Cell Press

Patterns in crop data reveal new insight about plants and their environments

AMES, Iowa - A recently published study led by Iowa State University scientists applied a fresh perspective to vast amounts of data on rice plants to find better ways to predict plant performance and new insights about how plants adapt to different environments.

The study, published in the academic journal Genome Research, unearthed patterns in datasets collected on rice plants across Asia, said Jianming Yu, professor of agronomy and Pioneer Distinguished Chair in Maize Breeding. Those patterns allowed the researchers to develop a matrix to help them predict the traits of rice plants depending on their genetics and the environment in which they're grown. The research could improve the ability of farmers to predict how crop varieties will perform in various environments, giving growers a better sense of stability and minimizing risk, Yu said.

"An organism's traits are determined by a combination of its genome, its environment and circumstances unique to that organism," said Tingting Guo, the first author of the paper and a research scientist in agronomy. "These are all complex factors. We're trying to see how to gain a deeper understanding of the process so that we can move up the pyramid of data: information, knowledge and wisdom. Accurately predicting traits is a natural extension of applying that wisdom."

The study analyzed data from 174 rice plants grown in nine different environments across Asia. The researchers analyzed the dataset using methods they'd previously developed for sorghum and found that temperatures early in the plants' growth play a major role in determining how long it takes the rice plants to flower, a trait called flowering time. Coupling this observation with genomic data, the researchers developed an index based on the temperature profile between nine and 50 days after planting to predict flowering time.
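A minimal Python sketch of the index idea described above: average the temperature over the 9-to-50-day window and fit a simple reaction norm for one line. The temperatures and flowering times below are hypothetical placeholders, and the authors' actual model additionally couples the index with genomic data across many lines.

    # Illustrative environmental index and reaction-norm fit (hypothetical data).
    import numpy as np

    def temperature_index(daily_mean_temps, start_day=9, end_day=50):
        """Mean temperature over the window 9-50 days after planting."""
        return float(np.mean(daily_mean_temps[start_day - 1:end_day]))

    # Hypothetical environments: daily mean temperatures (deg C) and observed
    # flowering times (days) for one rice line grown in each environment.
    environments = [np.full(120, t) + np.random.default_rng(0).normal(0, 1, 120)
                    for t in (22, 25, 28, 31)]
    flowering_days = np.array([95, 88, 80, 74])

    index = np.array([temperature_index(env) for env in environments])

    # Flowering time modeled as a linear function of the environmental index.
    slope, intercept = np.polyfit(index, flowering_days, deg=1)
    print(f"predicted flowering time at index 26.0: {slope * 26.0 + intercept:.1f} days")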

Analysis can apply to other crops

The study's findings hinge on the ability of the researchers to apply innovative analysis techniques to large, previously available datasets, rather than generating new data.

"Our starting point was data that already existed," said Xianran Li, an adjunct associate professor. "But we spent our time extracting new information from that data and taking the next critical step to explore the connection with a much larger dataset."

The analysis also found patterns in the geographic distribution of certain rice genetic haplotypes, or sets of DNA combinations that tend to be inherited together, among roughly 3,000 diverse rice plants. These geographic patterns revealed preferential adaptation to different temperature zones, Guo said. Regions with lower temperature were dominated by haplotypes sensitive to temperature changes, while the equatorial region had a majority of haplotypes that are less responsive to temperature, according to the study.

Yu and his colleagues have applied their data analysis methods to sorghum and rice plants, and he said similar approaches should carry over to corn and soybeans as well.

The fundamental question motivating the research is how the complex interplay between genotype and environment gives rise to phenotypic variation, Yu said. The team proposed a conceptual model that connects genes and the environment.

"It is time to bring the inseparable environmental context into how we define the effects of genes and gene networks. This model bridges the gaps among various research in individual gene discovery, field-level phenotypic plasticity, and genomic diversity and adaptation," Yu said. "We think this conceptual model also serves as a broad framework to move plant breeding forward."

Credit: 
Iowa State University

Avalanche photodiode from UVA and UT-Austin breaks performance record for LiDAR receivers

image: Epitaxial cross section of the avalanche photodiode design. Doping concentrations are given in cm⁻³.

Image: 
Joe C. Campbell

Charlottesville, Va. - Electrical and computer engineers at the University of Virginia and University of Texas-Austin have developed an avalanche photodiode that achieved record performance and has the potential to transform next generation night-vision imaging and Light Detection and Ranging (LiDAR) receivers. For LiDAR, the team's low-noise, two-micrometer avalanche photodiode enables higher-power operation that is eye-safe.

The peer-reviewed paper, "Low-noise high-temperature AlInAsSb/GaSb avalanche photodiodes for 2-μm applications," was published May 18, 2020, in Nature Photonics, a monthly journal covering the best research from all areas of light generation, manipulation and detection.

This breakthrough comes from a long-standing collaboration between Joe C. Campbell, Lucien Carr III Professor of electrical and computer engineering at UVA, and Seth R. Bank, Cullen Trust Professor at UT-Austin. Andrew H. Jones, a 2020 Ph.D. graduate advised by Campbell, and Stephen D. March, a Ph.D. student in Bank's research group, contributed to the research. The team's work was funded by the Defense Advanced Research Projects Agency and the Army Research Office.

The team used the novel optical and electrical characteristics of a digital alloy created in Bank's Laboratory for Advanced Semiconductor Epitaxy. Bank employed molecular beam epitaxy to grow the alloy, composed of aluminum, indium, arsenic and antimony. The alloy combines long-wavelength sensitivity, ultra-low noise and the design flexibility needed to achieve low dark currents, a combination not available with existing low-noise avalanche photodiode materials.

"Our ability to control the crystal growth process down to the single atom-scale enables us to synthesize crystals that are forbidden in nature, as well as design them to simultaneously possess the ideal combination of fundamental material properties necessary for efficient photodetection," Bank said.

The team's avalanche photodiode is an ideal solution for compact, high-sensitivity LiDAR receivers. Many LiDAR applications, such as robotics, autonomous vehicles, wide-area surveillance and terrain mapping, require high-resolution sensors that can detect greatly attenuated optical signals reflected from distant objects. Eye safety has limited the adoption of these next-generation LiDAR systems, however, because the requisite higher laser power poses an increased risk of eye damage.

"The 2-micrometer window is ideal for LiDAR systems because it is considered eye-safe and extends the detection reach." Campbell said. "I can envision our avalanche photodiode impacting numerous key technologies that benefit from high sensitivity detectors."

Credit: 
University of Virginia School of Engineering and Applied Science