
A small bacterial protein overlooked until now

image: Photosynthesis lies at the beginning of nearly all food chains. Like plants, cyanobacteria use light as an energy source and can carry out photosynthesis.

Image: 
Graphic: Wolfgang Hess

The biological process of photosynthesis lies at the beginning of nearly all food chains. It produces the oxygen we breathe and provides the energetic foundation for biotechnological processes that synthesize biofuels and chemical feedstocks. Researchers are therefore particularly interested in rapidly growing cyanobacteria. These organisms use light as an energy source and, like plants, can carry out photosynthesis. However, the photosynthetic protein complexes they require bind many nutrients. Vanessa Krauspe and Prof. Dr. Wolfgang Hess of the Genetics & Experimental Bioinformatics group at the University of Freiburg's Faculty of Biology, together with their collaborators, have discovered the small, previously unknown protein NblD, which is involved in recycling these nutrients. The researchers present their findings in the journal PNAS.

In addition to the pigment chlorophyll, cyanobacteria frequently also use phycobilisomes for photosynthesis. These complexes, which consist of proteins and another class of tetrapyrrole pigments, are considered the most effective light-harvesting structures found in nature. Using phycobilisomes is costly for the cell, however, because their macromolecular structures bind a huge amount of nutrients - nitrogen in particular. To recycle these nutrients under scarcity, for example when nitrogen supplies run short, cyanobacteria rely on sophisticated genetic programs that scientists had considered well characterized.

In a new approach aimed at taking a closer look at especially small genes and proteins, the Freiburg team was able to characterize NblD, a previously unknown small protein that binds its target with high affinity. NblD binds specifically to the phycocyanin beta-subunit of the phycobilisome. This gives cyanobacterial cells a mechanism for handling potentially dangerous intermediate products that arise during the recycling of phycobilisomes. Hess says, "The results illustrate the fact that especially small genes and proteins have been neglected hitherto and deserve a closer look."

Credit: 
University of Freiburg

New research about emerging 'COVID-19 personality types'

image: New research just published identifies and explores the impacts of salient viral or COVID-19 behavioural identities that are emerging.

Image: 
Tony Picher

New research by Mimi E. Lam (University of Bergen) just published in Humanities and Social Sciences Communications identifies and explores the impacts of salient viral or COVID-19 behavioural identities that are emerging.

"These emergent COVID-19 behavioural identities are being hijacked by existing social and political identities to politicize the pandemic and heighten racism, discrimination, and conflict," says Lam. She continues: "the COVID-19 pandemic reminds us that we are not immune to each other. To unite in our fight against the pandemic, it is important to recognize the basic dignity of all and value the human diversity currently dividing us."

"Only then, can we foster societal resilience and an ethical COVID-19 agenda. This would pave the way for other global commons challenges whose impacts are less immediate, but no less dire for humanity.

Lam argues that liberal democracies need an ethical policy agenda with three priorities: 1. to recognize the diversity of individuals; 2. to deliberate and negotiate value trade-offs; and 3. to promote public buy-in, trust, and compliance.

Some emergent "COVID-19 personality types":

Deniers: who downplay the viral threat, promoting business as usual

Spreaders: who want it to spread, herd immunity to develop, and normality to return

Harmers: who try to harm others by, for example, spitting or coughing at them

Realists: who recognise the reality of the potential harm and adjust their behaviours

Worriers: who stay informed and safe to manage their uncertainty and fear

Contemplators: who isolate and reflect on life and the world

Hoarders: who panic-buy and hoard products to quell their insecurity

Invincibles: often youth, who believe themselves to be immune

Rebels: who defiantly flout social rules restricting their individual freedoms

Blamers: who vent their fears and frustrations onto others

Exploiters: who exploit the situation for power, profit or brutality

Innovators: who design or repurpose resources to fight the pandemic

Supporters: who show their solidarity in support of others

Altruists: who help the vulnerable, elderly, and isolated

Warriors: who, like the front-line health-care workers, combat its grim reality

Veterans: who experienced SARS or MERS and willingly comply with restrictions

Credit: 
The University of Bergen

How vitamins, steroids and potential antivirals might affect SARS-CoV-2

Evidence is emerging that vitamin D - and possibly vitamins K and A - might help combat COVID-19. A new study from the University of Bristol, published in Angewandte Chemie, the journal of the German Chemical Society, has shown how they - and other antiviral drugs - might work. The research indicates that these dietary supplements and compounds could bind to the viral spike protein and so might reduce SARS-CoV-2 infectivity. In contrast, cholesterol may increase infectivity, which could explain why high cholesterol is considered a risk factor for serious disease.

Recently, Bristol researchers showed that linoleic acid binds to a specific site in the viral spike protein, and that by doing so, it locks the spike into a closed, less infective form. Now, a research team has used computational methods to search for other compounds that might have the same effect, as potential treatments. They hope to prevent human cells becoming infected by preventing the viral spike protein from opening enough to interact with a human protein (ACE2). New anti-viral drugs can take years to design, develop and test, so the researchers looked through a library of approved drugs and vitamins to identify those which might bind to this recently discovered 'druggable pocket' inside the SARS-CoV-2 spike protein.

The team first studied the effects of linoleic acid on the spike, using computational simulations to show that it stabilizes the closed form. Further simulations showed that dexamethasone - which is an effective treatment for COVID-19 - might also bind to this site and help reduce viral infectivity in addition to its effects on the human immune system.

The team then conducted simulations to see which other compounds bind to the fatty acid site. This identified some drugs that experiments have already found to be active against the virus, suggesting one mechanism by which they may prevent viral replication: locking the spike structure in the same way as linoleic acid does.

The findings suggested several drug candidates among available pharmaceuticals and dietary components, including some that have been found to slow SARS-CoV-2 reproduction in the laboratory. These have the potential to bind to the SARS-CoV-2 spike protein and may help to prevent cell entry.

The simulations also predicted that the fat-soluble vitamins D, K and A bind to the spike in the same way, making it less able to infect cells.

Dr Deborah Shoemark, Senior Research Associate (Biomolecular Modelling) in the School of Biochemistry, who modelled the spike, explained: "Our findings help explain how some vitamins may play a more direct role in combatting COVID than their conventional support of the human immune system.

"Obesity is a major risk factor for severe COVID. Vitamin D is fat soluble and tends to accumulate in fatty tissue. This can lower the amount of vitamin D available to obese individuals. Countries in which some of these vitamin deficiencies are more common have also suffered badly during the course of the pandemic. Our research suggests that some essential vitamins and fatty acids including linoleic acid may contribute to impeding the spike/ACE2 interaction. Deficiency in any one of them may make it easier for the virus to infect."

Pre-existing high cholesterol levels have been associated with increased risk for severe COVID-19. Reports that the SARS-CoV-2 spike protein binds cholesterol led the team to investigate whether it could bind at the fatty acid binding site. Their simulations indicate that it could bind, but that it may have a destabilising effect on the spike's locked conformation, and favour the open, more infective conformation.

Dr Shoemark continued: "We know that the use of cholesterol lowering statins reduces the risk of developing severe COVID and shortens recovery time in less severe cases. Whether cholesterol de-stabilises the "benign", closed conformation or not, our results suggest that by directly interacting with the spike, the virus could sequester cholesterol to achieve the local concentrations required to facilitate cell entry and this may also account for the observed loss of circulating cholesterol post infection."

Professor Adrian Mulholland, of Bristol's School of Chemistry, added: "Our simulations show how some molecules binding at the linoleic acid site affect the spike's dynamics and lock it closed. They also show that drugs and vitamins active against the virus may work in the same way. Targeting this site may be a route to new anti-viral drugs. A next step would be to look at effects of dietary supplements and test viral replication in cells."

Alison Derbenwick Miller, Vice President, Oracle for Research, said: "It's incredibly exciting that researchers are gaining new insights into how SARS-CoV-2 interacts with human cells, which ultimately will lead to new ways to fight COVID-19. We are delighted that Oracle's high-performance cloud infrastructure is helping to advance this kind of world-changing research. Growing a globally-connected community of cloud-powered researchers is exactly what Oracle for Research is designed to do."

The team included experts from Bristol UNCOVER Group, including Bristol's Schools of Chemistry, Biochemistry, Cellular and Molecular Medicine, and Max Planck Bristol Centre for Minimal Biology, and Bristol Synthetic Biology Centre, using Bristol's high performance computers and the UK supercomputer, ARCHER, as well as Oracle cloud computing. The study was supported by grants from the EPSRC and the BBSRC.

Credit: 
University of Bristol

Football and inclusion: It all comes down to the right motivational climate

image: Anne-Marie Elbe, Professor of Sports Psychology at Leipzig University.

Image: 
Photo: Swen Reichhold, Leipzig University

This is the conclusion of a recent study by an international team of researchers, including Anne-Marie Elbe, Professor of Sports Psychology at Leipzig University. The finding is of social importance because experiences in adolescence in particular have a formative influence on attitudes and behaviour in later life.

In sport, football is considered a model of inclusion. "Remarkably, to the best of our knowledge, theory and research on feelings of inclusion in (youth) team sports is lacking," the authors write in their study. They add that filling this gap is important, because team sports are not necessarily inclusive by nature.

For their study, the Danish-Dutch-German team of researchers interviewed 245 boys aged 10 to 16 about their experiences. "We focused [on them] because particularly in these age groups, positive intercultural contact experiences tend to lead to more positive intergroup attitudes in adulthood," said the authors. The subjects belong to two Dutch football clubs that train very diverse teams. The majority of the study participants - 61.6 per cent - were classified as having "minority" social status. This means that the player himself or at least one of his parents was born outside the Netherlands.

For the research team, inclusion consists of two components, explains Anne-Marie Elbe from Leipzig University: "How strongly do I feel I belong to a team? And how strongly do I feel that I can be myself - so act authentically with regard to things like my other cultural background?" This understanding of inclusion is based on existing research by other scholars.

"Our assumption in the study was that there would be a relationship between a person's feeling of inclusion and what kind of motivational climate exists in the team, so the climate created by the coach," said Elbe. A distinction is made between a performance-oriented motivational climate on the one hand, where the aim is to be better than other players in your own team, and a task-oriented motivational climate on the other. With task-based standards, the focus is on the individual player and improving his own skills. Motivating each player to learn is important: does he succeed in doing a task well, or at least not doing it worse than before?

Professor Anne-Marie Elbe and her team of researchers have now shown that the young players' sense of inclusion correlated positively with a task-oriented training climate, while it correlated negatively with a climate based on competition. When both types of training were used side by side, non-migrant players still managed to cope well - without this impacting too heavily on their sense of belonging. Among the "minority" players, however, it was observed that their sense of inclusion was only stable where there was a stronger focus on task-based standards, and the competitive approach within their own team was either not emphasised or emphasised only to a limited extent.

"So you can't say that being a member of a football team in itself has positive effects. In order to achieve positive effects through football training, the coach needs to behave in a certain way and create a specific climate during the training session. There is a lot of potential in this, and it is of enormous significance to society," said Anne-Marie Elbe. "Our study helps extend the quantitative research on inclusion and sport."

Credit: 
Universität Leipzig

Researchers illustrate the need for anti-racism in kidney care, research

(Boston)--There is a growing awareness of systematic inequality and structural racism in American society. Science and medicine are no exception, as evidenced by historical instances of discrimination and overt racism.

In a perspective piece in the Journal of the American Society of Nephrology, researchers from Boston University School of Medicine (BUSM) take an honest look at how the current practice of nephrology (kidney medicine) may have elements rooted in racist ideologies.

For twenty years, kidney function has been estimated based on lab tests and equations that consider black vs. non-black race. Many institutions are now reconsidering whether this practice is defensible, and several have stopped reporting kidney function based on racial identity. The researchers contemplate what other aspects of clinical practice and research may have subtle racist undertones.

Despite the fact that race is now understood as a social rather than biological construct, many examples in nephrology implicitly assume a biological basis for race. Examples include the use of race in estimating the risk for kidney stones in black vs. white individuals, for assessing the suitability of kidneys from black vs. white individuals for transplantation, and in studies of kidney function and physiology. "The practice and teaching of nephrology in graduate and medical school today continues to perpetuate an ideology that is non-scientific, misleading to students and trainees and ultimately, corrosive to society," explains corresponding author Sushrut S. Waikar, MD, the Norman Levinsky professor of medicine at BUSM.

According to Waikar, reporting kidney function separately for "black" and "white" patients is setting the stage for people to accept a biological basis for race. "Kidney function tests are among the most commonly reported tests by laboratories around the world. Tens of thousands of lab reports every day make a distinction between 'black' and 'white' kidney function. This may influence the way we think about race, leading to subtle and pervasive racism in everyday clinical medicine," he adds.

Waikar and Insa Schmidt, MD, MPH, nephrologists at Boston Medical Center, stress that physicians and scientists have a moral obligation to take a critical look at historical practices that may be rooted in racist ideology, and re-think the appropriate use of race in medicine. "We believe we have an obligation as doctors and researchers to be advocates for social justice and anti-racism. We also have to be honest and call out our own practices when they fall short of this ideal."

Credit: 
Boston University School of Medicine

Scientists identify locations of early prion protein deposition in retina

image: (left panel) Early in prion infection, a prion protein aggregate (magenta) blocks the entrance to a cilium (green) in a retinal photoreceptor. (lower right) In prion-infected retina, prion protein (magenta) accumulates under the horseshoe-shaped ribbon synapses (green) found in photoreceptor terminals.

Image: 
NIAID

WHAT:

The earliest eye damage from prion disease takes place in the cone photoreceptor cells, specifically in the cilia and the ribbon synapses, according to a new study of prion protein accumulation in the eye by National Institutes of Health scientists. Prion diseases originate when normally harmless prion protein molecules become abnormal and gather in clusters and filaments in the human body and brain.

Understanding how prion diseases develop, particularly in the eye because of its diagnostic accessibility to clinicians, can help scientists identify ways to slow the spread of prion diseases. The scientists say their findings, published in the journal Acta Neuropathologica Communications, may help inform research on human retinitis pigmentosa, an inherited disease with similar photoreceptor degeneration leading to blindness.

Prion diseases are slow, degenerative and usually fatal diseases of the central nervous system that occur in people and some other mammals. Prion diseases primarily involve the brain, but also can affect the eyes and other organs. Within the eye, the main cells infected by prions are the light-detecting photoreceptors known as cones and rods, both located in the retina.

In their study, the scientists, from NIH's National Institute of Allergy and Infectious Diseases at Rocky Mountain Laboratories in Hamilton, Montana, used laboratory mice infected with scrapie, a prion disease common to sheep and goats. Scrapie is closely related to human prion diseases, such as variant, familial and sporadic Creutzfeldt-Jakob disease (CJD). The most common form, sporadic CJD, affects an estimated one in one million people annually worldwide. Other prion diseases include chronic wasting disease in deer, elk and moose, and bovine spongiform encephalopathy in cattle.

Using confocal microscopy that can identify prion protein and various retinal proteins at the same time, the scientists found the earliest deposits of aggregated prion protein in cone photoreceptors next to the cilia, tube-like structures required for transporting molecules between cellular compartments. Their work suggests that by interfering with transport through cilia, these aggregates may provide an important early mechanism by which prion infection selectively destroys photoreceptors. At a later study timepoint, they observed similar findings in rods.

Prion protein also was deposited in cones and rods adjacent to ribbon synapses just before the destruction of these structures and death of photoreceptors. Ribbon synapses are specialized neuron connections found in ocular and auditory neural pathways, and their health is critical to the function of retinal photoreceptors in the eye, as well as hair cells in the ear.

The researchers say such detailed identification of disease-associated prion protein, and the correlation with retinal damage, has not been seen previously and is likely to occur in all prion-susceptible species, including people.

Next, the researchers hope to study whether similar findings occur in the retinas of people with other degenerative diseases characterized by misfolded host proteins, such as Alzheimer's and Parkinson's diseases.

Credit: 
NIH/National Institute of Allergy and Infectious Diseases

New technique identifies important mutations behind Lynch Syndrome

Colorectal cancer is the third most common form of cancer. While 90% of cases are in people older than 50, there is an as-yet unexplained rising incidence in younger people.

Family history ranks high among risk factors for developing colorectal cancer, and people with such a history are often advised to get more frequent screening tests or to start screening sooner than the recommended age of 45. Those with a family history of cancer often seek out genetic tests to look for mutations linked to cancer risk. However, those tests don't always provide helpful information.

In a new paper in the American Journal of Human Genetics, Jacob Kitzman, Ph.D., of the Department of Human Genetics at Michigan Medicine, and a team of collaborators describe a method for screening so-called genetic variants of uncertain significance in the hopes of identifying those mutations that could cause disease.

To do this, they used a genetic condition called Lynch syndrome, also known as hereditary non-polyposis colorectal cancer. Like BRCA1, a gene known to cause certain breast cancers, there are a handful of genes behind Lynch syndrome that have been well described. However, "there's a whole universe of possible genetic variants that can occur in genes associated with Lynch syndrome that we basically know nothing about," says Kitzman.

Because most mutations are rare in the human population, it can be difficult to tell if any particular one is problematic. And studying one variant in a lab at a time takes a lot of time--often too much to be useful for making clinical decisions.

Using a technique called deep mutational scanning, the research team set out to measure the impact of mutations in the gene MSH2, which when mutated, is one major cause of Lynch syndrome.

"The key advance is rather than doing one mutation at a time, we did it in a pooled format which allowed us to test about 18,000 mutations in a single batch," says Kitzman.

Using CRISPR-Cas technology, they deleted the normal copy of MSH2 from human cells and replaced it with a library of every possible mutation in the MSH2 gene. This created a mix of cells in which each cell carried a unique MSH2 mutation. The population was then treated with 6-thioguanine, a chemotherapy drug that killed only the cells carrying a functional variant of MSH2.

The counterintuitive idea, notes Kitzman, is that the surviving cells are the ones without functioning MSH2--which are the ones with mutations that are most likely to be disease-causing.
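To make the pooled readout concrete, here is a minimal, hypothetical sketch of the kind of scoring step such a screen implies: compare each variant's sequencing read counts before and after 6-thioguanine selection, so that variants enriched among the survivors are flagged as likely loss-of-function. The variant names, read counts, and pseudocount below are invented for illustration and are not taken from the study.

```python
# Hypothetical sketch of scoring a pooled (deep mutational scanning) screen:
# variants enriched after 6-thioguanine selection behaved like loss-of-function
# MSH2, while depleted variants behaved like functional MSH2.
import math

# variant -> (reads before selection, reads after selection); invented numbers
counts = {
    "variant_A": (1200, 5300),        # enriched  -> candidate loss of function
    "variant_B": (1500, 90),          # depleted  -> behaves like functional MSH2
    "synonymous_control": (2000, 110),
}

total_before = sum(before for before, _ in counts.values())
total_after = sum(after for _, after in counts.values())

def log2_enrichment(before, after, pseudocount=0.5):
    """Log2 change in a variant's relative frequency across the selection."""
    freq_before = (before + pseudocount) / total_before
    freq_after = (after + pseudocount) / total_after
    return math.log2(freq_after / freq_before)

for variant, (before, after) in counts.items():
    print(f"{variant}: log2 enrichment = {log2_enrichment(before, after):+.2f}")
```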

"We were basically trying to sit down and make the mutations we could so they could serve as a reference for ones that are newly seen or are amongst the thousands of variants of unknown significance identified in people from clinical testing," says Kitzman. "Until now, geneticists could not be sure whether these are benign or pathogenic."

The hope is that, combined with other patient-specific information, some of these variants can be reclassified, and the people carrying them notified that they should undergo more intensive screening.

Says Kitzman, "One of the next areas that will need some focus in the field of human genetics is to create these sorts of maps for many different genes where there is a clinical connection, so we can be more predictive when variants are found in an individual."

Credit: 
Michigan Medicine - University of Michigan

How is human behavior impacting wildlife movement?

image: Human behaviours strongly determine where wildlife can move and persist in a landscape, demonstrating the importance of 'anthropogenic resistance' in conservation planning.

Image: 
Conservation Biogeography, Humboldt-University Berlin

For species to survive in the wild, maintaining connectivity between populations is critical. Without 'wildlife corridors', groups of animals become isolated, unable to breed, and may die out. In assessing wildlife connectivity, many aspects of the landscape are measured, but the impact of human behaviour has largely been overlooked. Now an international team led by the University of Göttingen and Humboldt University Berlin introduces the concept of 'anthropogenic resistance', which they argue should be studied to ensure sustainable landscapes for wildlife and people in the future. Their perspective article was published in the journal One Earth.

Landscapes around the world are increasingly affected by rapid urbanization, deforestation and similar developments driven by human activity. So far, data collection has largely focused on measuring properties of the land, such as agriculture, urbanization, forest cover, crops, or elevation. Other impacts from people are usually lumped together in categories such as population density or distance from settlements or roads. The researchers propose that it is not merely the presence, absence, or number of people, but what people are actually doing, that affects wildlife movement. In fact, a range of psychological and socioeconomic factors can contribute to 'anthropogenic resistance'; examples include hunting, poaching and supplementary feeding.

For their study, the researchers looked at three case studies in detail: wolves in Washington State, leopards in Iran, and large carnivores in central India. The same concept can be applied to other species: for example, the Eurasian lynx, which is returning to its historical range, or roe deer, which use croplands for both shelter and food but reduce their presence during the hunting season. In some parts of the world, cultural and religious beliefs can result in tolerance of large carnivores, such as tigers and lions, despite substantial livestock losses and threats to human life. The researchers considered how beliefs, values and traditions affect wildlife in different areas, and they argue that these nuanced differences in human behaviour strongly determine where wildlife may move and persist in a landscape.

Professor Niko Balkenhol, from Wildlife Sciences at the University of Göttingen, explains, "Anthropogenic resistance is also relevant to the BearConnect project, which aims to understand the factors that determine connectivity in European populations of the brown bear. Bears are capable of moving across huge distances, as shown by bear JJ1, better known as 'Bruno', who travelled from the Italian Trento region all the way to Bavaria, where he was shot. It is important to note that, although Bruno was able to cross the physical landscape, he was stopped by the severe 'anthropogenic resistance' provided by humans who could not tolerate his behaviour."

"Our paper shows that 'anthropogenic resistance' is an important piece of the puzzle for connectivity-planning to ensure the functionality of corridors for wildlife and people," says Dr Trishna Dutta, senior author of the study, also from Wildlife Sciences at the University of Göttingen. Dutta goes on to say: "It reveals that there are advantages for social and natural scientists to collaborate in understanding the effects of 'anthropogenic resistance' in future studies."

Credit: 
University of Göttingen

A computational approach to understanding how infants perceive language

Languages differ in the sounds they use. The Japanese language, for example, does not distinguish between "r" and "l" sounds as in "rock" versus "lock." Remarkably, infants become attuned to the sounds of their native language before they learn to speak. One-year-old babies, for example, less readily distinguish between "rock" and "lock" when living in an environment where Japanese, rather than English, is spoken.

Influential scientific accounts of this early phonetic learning phenomenon initially proposed that infants group sounds into native vowel- and consonant-like phonetic categories through a statistical clustering mechanism known as "distributional learning."

The idea that infants learn consonant- and vowel-like phonetic categories has been challenged, however, by a new study published this week in the Proceedings of the National Academy of Sciences.

In the study, a multi-institutional team of cognitive scientists and computational linguists have introduced a quantitative modeling framework that is based on a large-scale simulation of the language learning process in infants. Using computationally efficient machine learning techniques, this approach allows learning mechanisms to be systematically linked to testable predictions regarding infants' attunement to their native language.

"Hypotheses about what is being learned by infants have traditionally driven researchers' attempts to understand this surprising phenomenon," says Thomas Schatz, a postdoctoral associate in the University of Maryland of Maryland Institute for Advanced Computer Studies (UMIACS) who was lead author of the study. "We propose to start from hypotheses about how infants might learn."

In addition to Schatz, the study's authors include Naomi Feldman, an associate professor of linguistics at the University of Maryland with an appointment in UMIACS; Sharon Goldwater, a professor in the Institute for Language, Cognition and Computation at the University of Edinburgh's School of Informatics; Xuân-Nga Cao, a research engineer at Ecole Normale Supérieure (ENS) in Paris and co-founder of the Langinnov and Gazouyi startups; and Emmanuel Dupoux, a professor who directs the Cognitive Machine Learning team at ENS.

For their study, the researchers simulated the learning process in infants by training a computationally efficient clustering algorithm on realistic speech input. The algorithm was fed spectrogram-like auditory features sampled at regular time intervals that were obtained from naturalistic speech recordings in a target language. In this study, American English and Japanese were the two languages used.
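As a rough illustration of what such a simulation involves (not the authors' actual model or code), the sketch below clusters frame-level "auditory features" with a Gaussian mixture model and then scores how differently two sound tokens are represented; the feature dimensions, number of clusters, and dissimilarity measure are all assumptions made for the example.

```python
# Minimal sketch of "distributional learning" on speech-like features:
# cluster unlabeled frames, then represent new tokens by their cluster posteriors.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Stand-in for spectrogram-like auditory features sampled at regular intervals
# from recordings in one target language (frames x feature dimensions).
frames = rng.normal(size=(5000, 13))

# Fit a clustering model to the raw frames, with no vowel/consonant labels.
model = GaussianMixture(n_components=50, covariance_type="diag", random_state=0)
model.fit(frames)

# A new sound token is represented by its average posterior over clusters;
# discrimination between two tokens can then be scored by comparing posteriors.
token_a = rng.normal(size=(20, 13))   # hypothetical frames for one token
token_b = rng.normal(size=(20, 13))   # hypothetical frames for another token
post_a = model.predict_proba(token_a).mean(axis=0)
post_b = model.predict_proba(token_b).mean(axis=0)
dissimilarity = np.abs(post_a - post_b).sum()
print(f"token dissimilarity under the learned model: {dissimilarity:.3f}")
```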

This yielded a candidate model for the early phonetic knowledge of, say, a Japanese infant, the researchers say. Next, they asked two questions of the trained models. Could they explain the observed differences in how Japanese- and English-learning infants discriminate speech sounds? And, did the models learn vowel- and consonant-like phonetic categories?

The dominant scientific accounts of early phonetic learning would have expected the answers to these questions to match (either both should be "yes" or both should be "no"). The researchers found that the answer to the first question was positive: Their models did account for infants' observed behavior, in particular for the Japanese infants' difficulty with distinguishing words like "rock" and "lock." The answer to the second question, however, was negative: The models were found to have learned speech units too brief and acoustically variable to correspond to vowel- and consonant-like phonetic categories.

These results suggest a striking reinterpretation of the existing literature on early phonetic learning. Difficulties in scaling up distributional learning of phonetic categories to realistic learning conditions may be better interpreted as questioning the idea that what infants learn are phonetic categories, rather than the idea that how infants learn is through pure distributional learning (the traditional interpretation).

Cognitive science has not traditionally made use of such large-scale modeling, says Schatz, but recent advances in computing power, large datasets, and machine-learning algorithms make this approach more feasible than ever before.

Schatz and Feldman are part of the Computational Linguistics and Information Processing (CLIP) Laboratory in UMIACS, where Feldman is the current director. The robust computing resources in the CLIP lab and the Cognitive Machine Learning lab in Paris were instrumental to the research project, Feldman says.

In conclusion, the researchers believe their computationally-based modeling approach--together with ongoing efforts in the field to collect empirical data on a large scale, such as large-scale recordings of infants' learning environments at home and large-scale assessment of infants' learning outcomes--opens the path toward a much deeper understanding of early language acquisition.

Credit: 
University of Maryland

New study investigates photonics for artificial intelligence and neuromorphic computing

Scientists have given a fascinating new insight into the next steps to develop fast, energy-efficient, future computing systems that use light instead of electrons to process and store information - incorporating hardware inspired directly by the functioning of the human brain.

A team of scientists, including Professor C. David Wright from the University of Exeter, has explored the future potential for computer systems by using photonics in place of conventional electronics.

The article is published today (January 29th 2021) in the prestigious journal Nature Photonics.

The study focuses on potential solutions to one of the world's most pressing computing problems: how to develop computing technologies that process data in a fast and energy-efficient way.

Contemporary computers are based on the von Neumann architecture in which the fast Central Processing Unit (CPU) is physically separated from the much slower program and data memory.

This means computing speed is limited and power is wasted by the need to continuously transfer data to and from the memory and processor over bandwidth-limited and energy-inefficient electrical interconnects - known as the von Neumann bottleneck.

As a result, it has been estimated that more than 50% of the power of modern computing systems is wasted simply in this moving around of data.

Professor C David Wright, from the University of Exeter's Department of Engineering and one of the co-authors of the study, explains: "Clearly, a new approach is needed - one that can fuse together the core information processing tasks of computing and memory, one that can incorporate directly in hardware the ability to learn, adapt and evolve, and one that does away with energy-sapping and speed-limiting electrical interconnects."

Photonic neuromorphic computing is one such approach. Here, signals are communicated and processed using light rather than electrons, giving access to much higher bandwidths (processor speeds) and vastly reducing energy losses.

Moreover, the researchers aim to make the computing hardware itself isomorphic with biological processing systems (brains) by developing devices that directly mimic the basic functions of brain neurons and synapses, then connecting these together in networks that can offer fast, parallelised, adaptive processing for artificial intelligence and machine learning applications.

The state-of-the-art of such photonic 'brain-like' computing, and its likely future development, is the focus of an article entitled "Photonics for artificial intelligence and neuromorphic computing" published in the prestigious journal Nature Photonics by a leading international team of researchers from the USA, Germany and UK.

Credit: 
University of Exeter

Researchers use AI to help businesses understand Code of Federal Regs, other legal docs

Researchers at the University of Maryland, Baltimore County (UMBC) have made strides in automated legal document analytics (ALDA) by creating a way to machine-process the Code of Federal Regulations (CFR). The CFR is a complex document containing policies related to doing business with the federal government. All business affiliates of the federal government must comply with the CFR. For government contracts to be equitably open to a broad range of businesses, policies within the CFR must be accessible.

This document automation is just one part of a broader project to help contractors and other entities manage and monitor their legal documents. Directed by Karuna Joshi, associate professor of information systems, the team has successfully managed to do a complete analysis of the CFR. Digital Government: Research and Practice recently published their methodology.

Automating document review through AI

The team's method for analyzing the CFR involves artificial intelligence (AI), which learns how to categorize information within the document, store it, and extract it when it is requested. Joshi and her team achieved this by creating a knowledge graph using Semantic Web technologies to represent all the key terms, rules, and regulations in the document. This basic framework enables users to ask an automated tool about a specific rule and be provided with the answer.

The semantic web language OWL, or Web Ontology Language, is used to represent concepts and to contextualize relationships. According to Joshi, the framework of the knowledge graph can be "adopted by federal agencies and businesses to automate their internal processes that reference the CFR rules and policies." To facilitate this, they will make it available in the public domain.

Question and answer

General users can interact with the knowledge graph through a kind of question-and-answer process, similar to how many people use Amazon's Alexa or Apple's Siri. For example, Joshi suggests that someone could ask a policy-related question like, "How many days at a minimum must a Request for Proposal (RFP) be posted open/available?" The system would query the CFR knowledge graph to find sections in the document that answer this question.
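As an illustration of how such a lookup over a Semantic Web knowledge graph might work (a minimal sketch using the open-source rdflib library, not UMBC's actual system), the example below stores one hypothetical CFR-style rule as RDF triples and retrieves it with a SPARQL query; the namespace, property names, rule identifier, and the 15-day figure are all invented placeholders.

```python
# Illustrative sketch: a couple of RDF triples for a hypothetical CFR rule,
# queried with SPARQL. Not the actual UMBC ontology or CFR content.
from rdflib import Graph, Literal, Namespace, RDF

CFR = Namespace("http://example.org/cfr#")
g = Graph()

# A made-up rule node with a topic and a numeric requirement.
rule = CFR.Rule_5_102
g.add((rule, RDF.type, CFR.Rule))
g.add((rule, CFR.topic, Literal("Request for Proposal posting period")))
g.add((rule, CFR.minimumDays, Literal(15)))

# A user question such as "How long must an RFP stay posted?" would be mapped
# to a structured query over the knowledge graph, for example:
results = g.query(
    """
    SELECT ?rule ?days WHERE {
        ?rule a cfr:Rule ;
              cfr:topic ?topic ;
              cfr:minimumDays ?days .
        FILTER(CONTAINS(LCASE(STR(?topic)), "request for proposal"))
    }
    """,
    initNs={"cfr": CFR},
)
for row in results:
    print(f"{row.rule} requires a minimum of {row.days} days")
```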

The researchers anticipate that this will be a highly useful system for any business subject to the CFR, because the automated process breaks down the document's legal complexity.

Access and accountability

This project to automate and support users' understanding of legal documents has been an ongoing effort by the UMBC team. Beyond the CFR, they seek to assist people with understanding legally binding contracts that they encounter every day, such as terms of service for major companies. Lavanya Elluri, a graduate student in information systems, adds, "Our research helps the organizations that use cloud services to understand the context from these textual documents quickly."

Many users have found their data being used without their knowledge, due to the information buried within terms of service and privacy policies. Joshi predicts that the tools her team is developing to help users better understand these documents will be essential to hold companies accountable for their data use.

Credit: 
University of Maryland Baltimore County

Americans like sports, but heterosexual men especially do

COLUMBUS, Ohio - Nearly nine out of 10 Americans say they enjoy sports at least a little, but heterosexual men more commonly identify as passionate sports fans, a new study suggests.

A survey of nearly 4,000 American adults found that only 11% said they did not identify as sports fans at all. Over 40% were passionate fans, identifying themselves as being "quite a bit" or "very much so" sports fans.

About 60% of heterosexual men in the survey identified as passionate sports fans, compared to about 40% of both heterosexual women and lesbians. About 30% of gay men reported being passionate sports fans.

"We found that U.S. adults respond overwhelmingly that they are sports fans," said Chris Knoester, co-author of the study and associate professor of sociology at The Ohio State University.

"Sports fandom is an ingrained part of our culture and central in the lives of many people."

The study, published this week in the Sociology of Sport Journal, was led by Rachel Allison, associate professor of sociology at Mississippi State University.

"One of the advantages of the survey data in this study is that it has a relatively large sample of individuals who identify as a sexual minority or as nonbinary in terms of their gender identity, which has not been the case in most previous studies," Allison said.

"It allowed us to show that while heterosexual men are particularly likely to identify as strong sports fans, there are substantial numbers of people across gender and sexual identities who are also passionate fans."

Survey data came from the National Sports and Society Survey (NSASS), sponsored by Ohio State's Sports and Society Initiative.

The survey was completed by 3,993 adults who volunteered to participate through the American Population Panel, run by Ohio State's Center for Human Resource Research. Participants, who came from all 50 states, answered the survey online between the fall of 2018 and spring of 2019.

Because NSASS participants are disproportionately female, white and Midwestern, the researchers also weighted the survey results to reflect the U.S. population more accurately. This resulted in modest increases of about 5% in the population estimates of the number of passionate sports fans.
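For readers unfamiliar with this kind of adjustment, the sketch below shows a bare-bones version of post-stratification weighting in Python; the demographic cells, population shares, and responses are invented placeholders, and the NSASS analysis itself was certainly more elaborate.

```python
# Minimal sketch of post-stratification weighting: scale each respondent by
# the ratio of their group's population share to its share in the sample.
import pandas as pd

# Hypothetical respondent-level data; "cell" is a demographic group label.
sample = pd.DataFrame({
    "cell": ["female_midwest", "female_midwest", "male_south", "female_west"],
    "passionate_fan": [1, 0, 1, 1],
})

# Invented target population shares for the same cells.
population_share = {"female_midwest": 0.20, "male_south": 0.35, "female_west": 0.45}

sample_share = sample["cell"].value_counts(normalize=True)
sample["weight"] = sample["cell"].map(lambda c: population_share[c] / sample_share[c])

weighted_estimate = (sample["passionate_fan"] * sample["weight"]).sum() / sample["weight"].sum()
print(f"weighted share of passionate fans: {weighted_estimate:.2f}")
```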

While there has been growing attention in the United States to women's sports, and to gay and lesbian participation in sports, there hasn't been good data on how a variety of gender and sexual identities are reflected in the larger sports fan community, Knoester said.

This study gives a preliminary look. About 27% of those surveyed identified as lesbian, gay, bisexual or a sexual identity other than heterosexual. About 3% of respondents identified as nonbinary.

Overall, heterosexual men tended to identify as "quite a bit" of a sports fan, the findings suggest. In contrast, heterosexual women, lesbians and gay men were more likely to say they were "somewhat" of a sports fan on average.

But while heterosexual men are clearly more likely to be big sports fans than gay men, lesbians and heterosexual women have similar interest in sports, according to the results.

"Identifying as lesbian does not seem to discourage sports fandom like identifying as gay does for men," Allison said.

The researchers also explored whether early childhood experiences shaped sports fandom in adults. As expected, people who said they thought of themselves as athletes during childhood and who frequently thought about sports were more likely to be fans as adults.

People who said they were mistreated in sports-related interactions during their lifetime - such as being called names or being bullied - were less likely to be sports fans as adults.

But the researchers did not find that childhood sports experiences or mistreatment accounted for gender and sexual identity differences in how much adults identified as sports fans.

Allison said it is clear that the historic masculine, heterosexual culture of sports is changing. She documented some of those changes in her book Kicking Center: Gender and the Selling of Women's Professional Soccer.

But she said the results of this new study suggest it may not have changed enough to make some women and sexual minorities comfortable identifying as sports fans.

"We've clearly moved beyond the era of open hostility to women, lesbians and gay men in sports," Allison said.

"But the extent to which we've moved from tolerant to fully inclusive cultures isn't necessarily clear. We may be in this period of transition."

Knoester and Allison said sports organizations on all levels, from professional to youth, still need to do more to be inclusive to individuals with different gender and sexual identities.

"You aren't born being a sports fan. The differences in fandom we found here in this study are socially and culturally produced to a great extent, and they can be changed," Knoester said.

Credit: 
Ohio State University

Scientists look to soils to learn how forests affect air quality, climate change

image: A map of sampling locations across the eastern United States used in the studies. The green dot indicates Moores Creek where AM and ECM dominated plots are located while the red dots indicate sites associated with the 54-plot gradient. The Moores Creek site contains 16 total plots (8-AM and 8-ECM dominated) while each site in the 54-plot gradient contains nine plots of varying AM/ECM composition.

Image: 
Graphic by Mushinski, et al.

Trees are often heralded as the heroes of environmental mitigation. They remove carbon dioxide from the atmosphere, which slows the pace of climate change, and sequester nutrients such as nitrogen, which improves water and air quality.

Not all tree species, however, perform these services similarly, and some of the strongest impacts that trees have on ecosystems occur below the surface, away from the eyes of observers. This complicates efforts to predict what will happen as tree species shift owing to pests, pathogens, and climate change as well as to predict which species are most beneficial in reforestation efforts.

Additionally, researchers have sought for years to understand how and why forests comprised of different mixtures of tree species differ in their functioning. Because of the large number of species on Earth, it is impractical to study each tree species' unique effects on carbon and nutrient cycling. Recently, there has been a push to classify trees into groups to help predict the consequences of tree species shifts.

Now, researchers at Indiana University -- in collaboration with scientists from West Virginia University, Jet Propulsion Laboratory, the University of Virginia, and the University of Warwick -- have found that classifying temperate forest trees based on the type of symbiotic fungi with which the trees associate can serve as a broad indicator of how the trees and forests function.

Nearly all trees associate exclusively with one of two types of mycorrhizal fungi. These specialized fungi form mutualistic relationships with tree roots, enhancing the tree's ability to obtain nutrients from soil in exchange for carbon from the tree. Because the type of fungi with which a tree associates both reflects and determines how the tree functions, grouping trees based on their mycorrhizal fungi has been proposed as a good way to classify them.

In two studies, published in Global Change Biology and Ecology Letters, the researchers reported that forest stands dominated by trees that associate with arbuscular mycorrhizal (AM) fungi differ from stands dominated by trees that associate with ectomycorrhizal (ECM) fungi in terms of how they store and retain carbon and nitrogen.

In the first study, the authors found that AM-associating trees such as maples, tulip trees, cherry, and ash, which produce fast-cycling detritus, promote soil microbial communities that have more genes capable of processing nitrogen. This leads to the release of nitrogen gases that reduce air quality. In contrast, ECM-associating trees such as oaks, hickories, beech, and hemlock produce slow-cycling detritus that promotes microbial communities with few nitrogen-cycling genes, leading to lower gaseous nitrogen losses.

To understand the link between tree species and the functioning of soil microbes near these trees, the researchers collected soils from 54 plots spread evenly across six forests in the eastern United States. Each site had both AM- and ECM-associating trees. They extracted DNA from the soils in each plot and looked for the abundance of genes critical to nitrogen cycling. They then placed soils in closed chambers in the laboratory to measure how much nitrogen gas is released from the soil and to determine whether this relates to the abundance of nitrogen-cycling genes.

"Regardless of which tree species were present, we found nearly 5-fold more nitrogen cycling potential in the plots dominated by AM trees," said Ryan Mushinski, the lead author of the study. "It's very exciting that the trend is consistent across the eastern United States, indicating we may be able to predict nitrogen-cycle activity, and more importantly the gaseous loss of nitrogen, in other temperate forests around the world."

Mushinski, who conducted the study as a postdoctoral researcher in the Department of Biology and O'Neill School of Public and Environmental Affairs at Indiana University, is continuing this work in his role as an assistant professor at the University of Warwick, U.K.

"Simplifying the complexity of forest soil, and being able to predict the spatial variability of soil emissions of nitrogen gases, was once thought to be an impossible task," said Jonathan Raff, an associate professor and atmospheric chemist in the O'Neill School and co-author of the study. "Some of these gases are very hard to measure," added Raff, whose lab made the measurements, "but these gases are incredibly important for air quality and climate change mitigation."

In the second study, led by Adrienne Keller, who was a Ph.D. student in the IU Department of Biology at the time of the study and is now a postdoctoral researcher at the University of Minnesota, the researchers found that forests dominated by AM trees enhance soil carbon storage by releasing carbon from their roots. Keller packed mesh cores with root-free soil and inserted the cores in the same 54 forest plots as Mushinski.

Because the soil inside the cores had a unique chemical signature, she was able to separate the carbon released from roots from the carbon already present in the soil. Keller found that roots of AM trees release more carbon to soil than the roots of ECM trees and that much of the root carbon sticks to the surface of soil minerals where it is protected from microbial decay. This means that root carbon may persist for decades or longer, especially in AM-dominated stands.

"It's challenging to measure how much carbon plants shuttle from their roots to the soil," Keller offered. "Here we were able to not only quantify the amount of root carbon sequestered in the soil, but also show that its magnitude rivals that of aboveground plant inputs."

"There's been a shift in our thinking over the past decade about what controls soil carbon storage," said Richard Phillips, professor of biology in the IU Department of Biology and co-author on both studies. "We used to think that slow decaying leaf detritus was the main driver of soil carbon storage, but we now know that fast-decaying compounds released by roots may be what causes soil carbon to persist." Phillips added.

While more work is needed to explore the generality of these patterns beyond eastern forests of the United States, the two studies indicate that as species come and go in our forests, the ecosystem consequences are likely to be difficult to predict. While AM trees may increase nitrogen-cycling rates--with negative consequences for things like air quality--they may also increase soil carbon storage which can, in turn, slow climate change. Given the number of initiatives to plant trees globally as part of global reforestation efforts (mostly to slow climate change), land managers would be wise to consider what's happening in the soils, where roots and soil microbes are carrying out critical but underappreciated ecosystem functions.

Credit: 
Indiana University

COVID unemployment assistance puts food on the table: BU study

Another wave of COVID-19 is putting millions out of work, while tens of millions more remain unemployed, and Congress debates aid.

Now, a new Boston University School of Public Health (BUSPH) study shows that unemployment help directly translates to people being able to put food on the table.

The CARES Act--passed in March of 2020--expanded unemployment insurance coverage, amount, and duration.

Published in JAMA Network Open, the study finds that receiving unemployment insurance cuts a person's risk of food insecurity by a third, and halves the likelihood of needing to eat less because of financial constraints. And receiving more coverage, such as the weekly $600 supplement included in CARES until last July, means an even bigger reduction in the risk of going hungry.

"There has long been a need to improve the proportion of people covered, the duration of coverage, and the amount of coverage in our unemployment insurance system. This paper speaks to the critical role that unemployment insurance can play in preventing people from facing food insecurity during a crisis," says study lead author Dr. Julia Raifman, assistant professor of health law, policy & management at BUSPH.

Raifman and colleagues used data from the Understanding Coronavirus in America study, looking at a sample of 2,319 people who had household incomes less than $75,000 and had been employed in February. By the end of July, 1,119 people (nearly half) had experienced unemployment.

Of those who lost their jobs, 415 reported food insecurity and 437 reported that they sometimes ate less because of financial constraints.

The researchers found that receiving unemployment insurance was associated with a 35.0% relative decline in a person's risk of food insecurity, and a 47.8% relative decline in the likelihood of having to eat less. Receiving larger amounts of unemployment insurance and/or the weekly $600 CARES supplement came with even more substantial declines in food insecurity and having to eat less.

The researchers also identified major disparities in who is facing food insecurity among those who have lost their jobs during COVID: 69.2% of Indigenous participants in the study reported food insecurity, as did 52.5% of Hispanic participants, 42.2% of Black participants, 40.3% of Asian participants, and 26.9% of non-Hispanic white participants.

They also found that 46.1% of households with kids faced food insecurity, compared to 32.8% of households without kids.

"It is heartbreaking that families with children are even more likely to face food insecurity," Raifman says. "The recent Booker/Pressley policy proposal to provide direct payments to children's families could make a big difference for their food security and short- and long-term health."

Credit: 
Boston University School of Medicine

Dewdrops on a spiderweb reveal the physics behind cell structures

video: Researchers in the laboratories of Princeton University scientists Joshua Shaevitz, Howard Stone, and Sabine Petry have discovered that surface tension drives the liquid-like protein TPX2 to form globules that nucleate the formation of branching microtubules during cell division. The paper detailing these discoveries appeared in the Jan. 28 issue of the journal Nature Physics. Here, a tabletop experiment shows how a uniform coating of glycerol on a wire transitions into beads. Withdrawing the wire quickly from the vial of glycerol (left) results in a thicker coating and bigger, more widely spaced beads, while withdrawing slowly (right) leads to a thinner coating and smaller, closer beads.

Image: 
Video by the authors: Sagar U. Setru, Bernardo Gouveia, Raymundo Alfaro-Aco, Joshua W. Shaevitz, Howard A. Stone and Sabine Petry

As any cook knows, some liquids mix well with each other, but others do not. For example, when a tablespoon of vinegar is poured into water, a brief stir suffices to thoroughly combine the two liquids. However, a tablespoon of oil poured into water will coalesce into droplets that no amount of stirring can dissolve. The physics that governs the mixing of liquids is not limited to mixing bowls; it also affects the behavior of things inside cells. It's been known for several years that some proteins behave like liquids, and that some liquid-like proteins don't mix together. However, very little is known about how these liquid-like proteins behave on cellular surfaces.

"The separation between two liquids that won't mix, like oil and water, is known as 'liquid-liquid phase separation', and it's central to the function of many proteins," said Sagar Setru, a 2021 Ph.D. graduate who worked with both Sabine Petry, a professor of molecular biology, and Joshua Shaevitz, a professor of physics and the Lewis-Sigler Institute for Integrative Genomics.

Such proteins do not dissolve inside the cell. Instead, they condense with themselves or with a limited number of other proteins, allowing cells to compartmentalize certain biochemical activities without having to wrap them inside membrane-bound spaces.

"In molecular biology, the study of proteins that form condensed phases with liquid-like properties is a rapidly growing field," said Bernardo Gouveia, a graduate student chemical and biological engineering, working with Howard Stone, the Donald R. Dixon '69 and Elizabeth W. Dixon Professor of Mechanical and Aerospace Engineering, and chair of the department. Setru and Gouveia collaborated as co-first authors on an effort to better understand one such protein.

"We were curious about the behavior of the liquid-like protein TPX2. What makes this protein special is that it does not form liquid droplets in the cytoplasm as had been observed before, but instead seems to undergo phase separation on biological polymers called microtubules," said Setru. "TPX2 is necessary for making branched networks of microtubules, which is crucial for cell division. TPX2 is also overexpressed in some cancers, so understanding its behavior may have medical relevance."

Individual microtubules are linear filaments that are rod-like in shape. During cell division, new microtubules form on the sides of existing ones to create a branched network. The sites where new microtubules will grow are marked by globules of condensed TPX2. These TPX2 globules recruit other proteins that are necessary to generate microtubule growth.

The researchers were curious about how TPX2 globules form on a microtubule. To find out, they decided to try observing the process in action. First, they modified the microtubules and TPX2 so that each would glow with a different fluorescent color. Next, they placed the microtubules on a microscope slide, added TPX2, and then watched to see what would happen. They also made observations at very high spatial resolution using a powerful imaging approach called atomic force microscopy.

"We found that TPX2 first coats the entire microtubule and then breaks up into droplets that are evenly spaced apart, similar to how morning dew coats a spider web and breaks up into droplets," said Gouveia.

Setru, Gouveia and colleagues found that this occurs because of something physicists call the Rayleigh-Plateau instability. Though non-physicists may not recognize the name, they will already be familiar with the phenomenon, which explains why a stream of water falling from a faucet breaks up into droplets, and why a uniform coating of water on a strand of spider web coalesces into separate beads.
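For orientation, the classic textbook criterion behind this instability can be stated as follows (a general result from fluid mechanics, not a formula taken from the Nature Physics paper), where R is the outer radius of the liquid-coated filament:

```latex
% Plateau-Rayleigh criterion: axisymmetric perturbations of a liquid cylinder
% or coating grow whenever their wavelength exceeds the circumference,
\[
  \lambda > 2\pi R .
\]
% For a thin film coating a fiber, linear (lubrication) theory further predicts
% a fastest-growing wavelength, which sets the droplet spacing,
\[
  \lambda_{\max} \approx 2\sqrt{2}\,\pi R ,
\]
% so a thicker coating (larger effective R) gives larger, more widely spaced
% droplets, consistent with the wire-coating demonstration in the video and with
% the dependence on coating thickness described below.
```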

"It is surprising to find such everyday physics in the nanoscale world of molecular biology," said Gouveia.

Extending their study, the researchers found that the spacing and size of TPX2 globules on a microtubule is determined by the thickness of the initial TPX2 coating -- that is, how much TPX2 is present. This may explain why microtubule branching is altered in cancer cells that overexpress TPX2.

"We used simulations to show that these droplets are a more efficient way to make branches than just having a uniform coating or binding of the protein all along the microtubule," said Setru.

"That the physics of droplet formation, so vividly visible to the naked eye, has a role to play down at the micrometer scales, helps establish the growing interface (no pun intended) between soft matter physics and biology," said Rohit Pappu, the Edwin H. Murty Professor of Engineering at Washington University in St. Louis, who was not involved in the study.

"The underlying theory is likely to be applicable to an assortment of interfaces between liquid-like condensates and cellular surfaces," adds Pappu. "I suspect we will be coming back to this work over and over again."

Credit: 
Princeton University