
Not falling far from tree: Ecologists study seed-to-seedling transitions

image: A close-up of Apeiba membranacea, commonly known as monkey comb, found in tropical forests of Central America.

Image: 
Steven Paton, Smithsonian Tropical Research Institute (STRI)

LOGAN, UTAH, USA -- Why are there so many species of plants? Why do some plants thrive, while others don't?

Utah State University ecologist Noelle Beckman and colleagues Philippe Marchand of the University of Quebec, Liza Comita of Yale University, Joseph Wright of the Smithsonian Tropical Research Institute in Panama, Richard Condit of Chicago's Field Museum of Natural History and internationally renowned ecologist Stephen P. Hubbell of the University of California, Los Angeles, explore these questions and recently published findings about seed-to-seedling transitions in the journal Ecology.

The team's research is supported by the Smithsonian Institution and the National Science Foundation.

The researchers studied the spatial characteristics of 24 tree species using data collected at STRI's Forest Dynamics Plot on Barro Colorado Island, located in the man-made Gatun Lake in the Panama Canal.

"Patterns of seed dispersal and seed mortality influence the spatial structure of plant populations and the local coexistence of competing species," says Beckman, assistant professor in USU's Department of Biology and the USU Ecology Center. "Most seeds are dispersed close to the parent tree, where mortality is also expected to be the highest, due to competition with siblings or the attraction of natural enemies."

Distance-dependent mortality in the seed-to-seedling transition is often observed in tropical forests, she says, but few studies have closely examined survival-distance curves.

For this study, Beckman and colleagues examined spatial patterns of seeds and surviving seedlings.

"The resulting spatial patterns can tell us something about the mechanisms creating these patterns and the potential for those mechanisms to allow different plant species to exist," she says.

The Janzen-Connell hypothesis, for example, is a widely tested explanation that suggests host-specific herbivores, pathogens and other natural enemies make areas near a parent tree inhospitable for seedling survival, resulting in more regular spacing of plants.

"This mechanism can allow species to coexist," Beckman says. "However, seed densities can be a lot higher underneath parent trees than farther away. Hence, even if a large fraction of seeds is killed by natural enemies, a large number of seedlings may survive under the tree compared to far away."

This spatial pattern of seed dispersal and surviving seedlings, she says, is called the Hubbell pattern (an ecological pattern described by Beckman's UCLA co-author).

"It suggests the strength of mortality experienced from the seed to seedling stage may not be sufficient to promote local diversity," Beckman says.
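The seed-density argument above can be made concrete with a toy calculation. All numbers here are invented for illustration and are not from the study: seed density is assumed to fall off exponentially with distance while per-seed survival improves, yet the absolute number of survivors still peaks under the tree.

```python
import math

# Invented dispersal and survival curves (not the study's data).
survivors = []
for dist in [1, 5, 10, 20]:  # distance from the parent tree, in meters
    seeds = 1000 * math.exp(-0.3 * dist)                 # far more seeds near the tree
    survival = 0.02 + 0.2 * (1 - math.exp(-0.1 * dist))  # natural enemies worst nearby
    survivors.append(round(seeds * survival, 1))

print(survivors)  # -> [28.9, 22.0, 7.3, 0.5]
```

Even though per-seed survival is lowest under the tree, surviving seedlings are most numerous there, which is the Hubbell pattern described in the text.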

Credit: 
Utah State University

Eating a vegetarian diet rich in nuts, vegetables, soy linked to lower stroke risk

People who eat a vegetarian diet rich in nuts, vegetables and soy may have a lower risk of stroke than people who eat a diet that includes meat and fish, according to a study published in the February 26, 2020, online issue of Neurology®, the medical journal of the American Academy of Neurology.

"Stroke is the second most common cause of death worldwide and a leading cause of disability," said study author Chin-Lon Lin, M.D., of Tzu Chi University in Hualien, Taiwan. "Stroke can also contribute to dementia. If we could reduce the number of strokes by people making changes to their diets, that would have a major impact on overall public health."

The study involved two groups of people from Buddhist communities in Taiwan where a vegetarian diet is encouraged, and smoking and drinking alcohol are discouraged. Approximately 30% of participants in both groups were vegetarians. Of the vegetarians, 25% were men. Researchers defined vegetarians as people who did not eat any meat or fish.

At the start of the study, the average age of all participants was 50 and none had experienced stroke. The first group of 5,050 people was followed for an average of six years. The second group of 8,302 people was followed for an average of nine years. Participants were given medical exams at the start of the study and asked about their diet.

Vegetarians ate more nuts, vegetables and soy than non-vegetarians and consumed less dairy. Both groups consumed the same amount of eggs and fruit. Vegetarians ate more fiber and plant protein. They also ate less animal protein and fat.

Researchers then looked at a national database to determine the numbers of strokes participants had during the course of the study.

In the first group of 5,050 people, there were 54 strokes. For ischemic strokes, which occur when blood flow to part of the brain is blocked, there were three strokes among 1,424 vegetarians, or 0.21%, compared to 28 strokes among 3,626 non-vegetarians, or 0.77%. After adjusting for age, sex, smoking and health conditions like high blood pressure and diabetes, researchers found vegetarians in this group had a 74% lower risk of ischemic stroke than non-vegetarians.

In the second group of 8,302 people, there were 121 strokes. For both ischemic and hemorrhagic strokes, also called bleeding strokes, there were 24 strokes among 2,719 vegetarians, or 0.88%, compared to 97 strokes among 5,583 non-vegetarians, or 1.73%. After adjusting for other factors, researchers found vegetarians in this group had a 48% lower risk of overall stroke than non-vegetarians, a 60% lower risk of ischemic stroke and a 65% lower risk of hemorrhagic stroke.
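The crude incidence figures quoted above follow directly from the reported counts. Note that the published 74% and 48% reductions are adjusted estimates, so the unadjusted ratio computed below only approximates them.

```python
# Crude (unadjusted) stroke incidence from the reported counts.
def pct(events, n):
    """Incidence as a percentage, rounded to two decimals."""
    return round(100 * events / n, 2)

# Cohort 1: ischemic strokes
print(pct(3, 1424))   # vegetarians      -> 0.21
print(pct(28, 3626))  # non-vegetarians  -> 0.77

# Cohort 2: all strokes
print(pct(24, 2719))  # vegetarians      -> 0.88
print(pct(97, 5583))  # non-vegetarians  -> ~1.74 (reported as 1.73)

# Crude risk ratio in cohort 1: roughly a 73% lower risk,
# close to the adjusted 74% reported in the study.
rr = (3 / 1424) / (28 / 3626)
print(round(1 - rr, 2))  # -> 0.73
```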

"Overall, our study found that a vegetarian diet was beneficial and reduced the risk of ischemic stroke even after adjusting for known risk factors like blood pressure, blood glucose levels and fats in the blood," said Lin. "This could mean that there is some other protective mechanism that may be protecting those who eat a vegetarian diet from stroke."

One limitation of the study was that the diet of participants was only assessed at the start of the study, so it is not known if participants' diets changed over time. Another limitation was that study participants did not drink or smoke, so results may not reflect the general population. Also, results from the study population in Taiwan may not be generalizable worldwide. Finally, there could be other factors, not accounted for, that might affect stroke risk.

Credit: 
American Academy of Neurology

New research uncovers potential pathway to slowing Alzheimer's

If we can overcome the loss of a process in the brain called "RNA editing", we may be able to slow the progress of Alzheimer's disease and other synaptic disorders, a new study has shown.

RNA editing is a genetic mechanism that modifies proteins essential in the connection between nerve cells in the brain, called synapses. RNA editing is deregulated in the brains of people with Alzheimer's disease, but whether this can cause disease is unknown.

In this study, the scientific team at the University of Technology Sydney Centre for Neuroscience & Regenerative Medicine (CNRM) replicated this deregulated process in the brains of mice, and discovered it led to the loss of synapses, as occurs in Alzheimer's.

The findings, published in the journal Molecular Brain, could have implications for a new way forward for ultimately treating Alzheimer's disease, says Professor Bryce Vissel, senior author of the study.

"Understanding the mechanisms leading to synapse loss is essential to understand how patients suffering from Alzheimer's disease start losing their memory capacities and how to prevent this from happening," Professor Vissel says.

"Many scientists consider that Alzheimer's results from the build-up of a substance called amyloid in the brain. Consequently, they've focused their studies on removing amyloid. However, the most important event is actually the loss of connections between nerve cells called synapses which are known to be essential for memory formation.

"Our study is extremely important because we now have shown a mechanism that can lead to loss of synapses as occurs in Alzheimer's disease."

Dr Gary Morris, a scientist who contributed to the study, says that because "synapses are important for learning, the loss of these synapses leads to memory loss".

"Our study suggests that if we can overcome the loss of RNA editing in the brain, we may potentially be able to slow the disease."

Professor Vissel says the team's next step is to see if they can rescue synapses and memory deficits in Alzheimer's disease by overcoming the loss of RNA editing in the Alzheimer's brain.

"We have good reason to think that this could ultimately be a highly beneficial approach for solving Alzheimer's and potentially other neurodegenerative diseases such as Parkinson's."

Credit: 
University of Technology Sydney

Do girls read better than boys? If so, gender stereotypes may be to blame

A new longitudinal study of fifth and sixth graders in Germany examined the relation between classmates' gender stereotypes and individual students' reading outcomes to shed light on how these stereotypes contribute to the gender gap in reading. The study concluded that students' own gender stereotypes had positive effects on girls' and negative effects on boys' reading-related outcomes, specifically their competence beliefs, motivation, and achievement in reading. Furthermore, classmates' gender stereotypes were also negatively related to boys' competence beliefs, motivation, and achievement in reading.

These findings come from researchers at the University of Hamburg. They appear in Child Development, a journal of the Society for Research in Child Development.

"It's a cycle of sorts," explains Francesca Muntoni, postdoctoral research associate at the University of Hamburg, who led the study. "Reading is first stereotyped as a female domain. This and other gender stereotypes that emphasize that girls are more competent in reading than boys significantly affect boys by causing them to devalue their actual reading ability while also having less motivation to read, which in turn impairs their reading performance."

In this longitudinal study, researchers collected two waves of information, once in fifth grade and once in sixth grade, on 1,508 students from 60 classes in Germany. Findings were based on data from 1) a student questionnaire assessing gender stereotypes and evaluating reading skills, degree of confidence in their ability to master aspects of good reading, and motivation to read, 2) a reading achievement test, and 3) information about students' socioeconomic status and ethnicity.

The study found that boys who held a strong stereotype favoring girls in reading were less motivated to read, held weaker beliefs about their own reading competence, and performed worse on the reading test. The same effects appeared for boys in classes whose students held a strong stereotype favoring girls in reading, and the effect of classmates' stereotypes was seen over and above the effect of individual stereotypes.

The study found fewer individual positive effects and no effects of classmates' stereotyping on reading-related outcomes for girls.

The study's authors caution that their findings are not causal. However, given the many experimental findings on the subject, they suggest that their study provides evidence of lasting negative effects of stereotypical beliefs in a classroom context. They also note that students' gender stereotypes were measured by self-reports, which may limit their accuracy. Finally, they point out that their study did not address how gender stereotypes are transmitted.

"To reduce socially determined gender disparities in reading, it may help to create classroom contexts that discourage students from acting on their stereotypical beliefs," says Jan Retelsdorf, professor of the psychology of learning and instruction at the University of Hamburg, who coauthored the study. "Teachers and parents might consider socializing boys and girls in ways that reduce stereotypical behaviors, and students could become aware of their gender stereotypes to counteract their effects on other students' outcomes and to create a gender-fair learning environment."

Credit: 
Society for Research in Child Development

Connectedness to nature makes children happier

A new study in Frontiers in Psychology, led by Dr Laura Berrera-Hernández and her team at the Sonora Institute of Technology (ITSON), has shown for the first time that connectedness to nature makes children happier due to their tendency to perform sustainable and pro-ecological behaviors.

As our planet faces growing threats from a warming climate, deforestation and mass species extinction, research on the relationships between humans and nature is increasingly urgent for finding solutions to today's environmental issues. Because younger generations will be the future custodians of the planet, researchers are investigating how to promote sustainable behaviors and develop environmental care in children. The researchers state that a disconnection from nature, termed 'nature deficit disorder', may contribute to the destruction of the planet, as the lack of a bond with the natural world is unlikely to result in a desire to protect it.

Berrera-Hernández describes 'connectedness to nature' as not just appreciating nature's beauty, but also "being aware of the interrelation and dependence between ourselves and nature, appreciating all of the nuances of nature, and feeling a part of it."

The study recruited 296 children between the ages of 9 and 12 from a northwestern Mexican city. All the participants were given a self-administered scale completed in school to measure their connectedness to nature, sustainable behaviors (pro-ecological behavior, frugality, altruism, and equity) and happiness. This included measuring their agreement with statements about their connectedness to nature, such as 'Humans are part of the natural world' and statements about their sustainable behaviors, such as 'I separate empty bottles to recycle'.

The researchers found that, in children, feeling connected to nature was positively associated with sustainable practices and behaviors, and with higher levels of self-reported happiness. This suggests that children who perceive themselves to be more connected to nature tend to perform more sustainable behaviors and, in turn, experience greater happiness. Previous research on adults had suggested a relationship between connectedness to nature, the development of pro-environmental behaviors, and the happiness derived from them.
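The chain suggested here (connectedness to nature, then sustainable behavior, then happiness) is a mediation-style relationship. The sketch below illustrates that idea on purely synthetic data with ordinary least squares; the coefficients, noise levels, and sample are invented and are not the study's data or its statistical model.

```python
import numpy as np

# Toy mediation structure mirroring the reported pattern:
# connectedness -> sustainable behavior -> happiness.
# All numbers are synthetic; none come from the study.
rng = np.random.default_rng(0)
n = 296  # same sample size as the study, for flavor only

connectedness = rng.normal(size=n)
behavior = 0.6 * connectedness + rng.normal(scale=0.8, size=n)
happiness = 0.5 * behavior + 0.1 * connectedness + rng.normal(scale=0.8, size=n)

def slope(x, y):
    """OLS slope of y on x (with intercept)."""
    X = np.column_stack([np.ones_like(x), x])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef[1]

total = slope(connectedness, happiness)  # total effect of connectedness

# Effect of connectedness after controlling for the mediator (behavior).
direct_X = np.column_stack([np.ones(n), connectedness, behavior])
coef, *_ = np.linalg.lstsq(direct_X, happiness, rcond=None)
direct = coef[1]

print(total > direct)  # the association shrinks once the mediator is held fixed
```

In a mediation pattern like the one the authors describe, the association between connectedness and happiness weakens once the mediator (sustainable behavior) is controlled for, which is what this toy regression shows.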

Despite the study's limitation of testing children from only one city, the results provide insight into the power of the positive psychology of sustainability in children. Deepening our understanding of the relationships between these variables may yield practical insights into the added psychological benefits of promoting sustainable behaviors in children. If we are to develop environmental care and concern in younger generations, initiatives that encourage and enable young people to spend more time in nature are essential.

Berrera-Hernández states: "Parents and teachers should promote children to have more significant contact or exposure to nature, because our results indicate that exposure to nature is related to the connection with it, and in turn, with sustainable behaviors and happiness."

Credit: 
Frontiers

Bifunctional nanobodies protect against botulinum neurotoxins including Botox

image: The illustration shows six nanobodies (VHHs) bound to botulinum neurotoxin (BoNT). Lam et al. report the crystal structures and neutralizing mechanisms of six unique nanobodies against two major human pathogenic BoNT type A and B. They then develop a platform for structure-based rational design of bifunctional nanobodies with superior antitoxin potencies.

Image: 
UCI School of Medicine

Irvine, Calif. - February 27, 2020 - A new study reveals the potential for developing novel antibody-based antitoxins against botulinum neurotoxins (BoNTs), including the most commonly used, yet most toxic, of them: Botox.

Published in Cell Reports, the paper is titled, "Structural insights into rational design of single-domain antibody-based antitoxins against botulinum neurotoxins." Led by Rongsheng Jin, PhD, a professor in the Department of Physiology & Biophysics at the University of California, Irvine, School of Medicine, this paper describes how the team first identified the neutralizing epitopes of six anti-BoNT nanobodies (VHHs) based on their crystal structures, then harnessed the structural findings to rationally design bifunctional nanobodies. Unlike ordinary nanobodies, bifunctional nanobodies are composed of two nanobodies joined so that both bind the toxin simultaneously.

Based on a mouse model, their findings revealed the bifunctional nanobodies protected mice with much greater potency than the simple combination of two nanobodies.

"In a nutshell, we establish a platform for structure-based rational design of bifunctional antitoxins against BoNTs," said Kwok-ho Lam, the first author and a project scientist in the Jin lab. "BoNTs can be misused as a bioweapon and thus have been classified as Tier 1 select agents by the Centers for Disease Control and Prevention, which is why there is urgent need for antitoxins."

Ironically, Botox is a type A botulinum neurotoxin (BoNT/A), and just one of the many different types of botulinum neurotoxins. BoNT/B is another botulinum neurotoxin approved for therapeutic uses, and yet another type, BoNT/E, is in clinical trials.

"Currently, the only available antitoxin remedies are polyclonal antibodies from horse or human serum, which have known health risks and are in limited supply. Monoclonal antibodies are still under development," said Jin. "And, while it isn't necessarily a cause for worry, the increasingly popular therapeutic uses of BoNT products also create risks of possible botulism resulting from the medical treatments where they are used."

Credit: 
University of California - Irvine

Metals could be the link to new antibiotics

image: 23 previously unexplored compounds containing metals such as silver, manganese, zinc, ruthenium and iridium have been found to have antibacterial and antifungal activity.

Image: 
Angelo Frei, Institute for Molecular Bioscience, UQ

Compounds containing metals could hold the key to the next generation of antibiotics to combat the growing threat of global antibiotic resistance.

University of Queensland researchers, working with a network of international collaborators, have discovered 23 previously unexplored compounds containing metals such as silver, manganese, zinc, ruthenium and iridium that have antibacterial and antifungal activity.

The study was led by Dr Mark Blaskovich, Dr Angelo Frei and Dr Johannes Zuegg of UQ's Centre for Superbug Solutions at the Institute for Molecular Bioscience.

"This is promising research because the scientific community is struggling to keep up with the pace of bacterial resistance," Dr Blaskovich said.

The team found that many of the metal compounds selectively kill bacterial cells, including the potentially deadly methicillin-resistant Staphylococcus aureus (MRSA), while leaving human cells unharmed.

"There are around 40 new antibiotics in clinical trials, which sounds encouraging until you compare this to the more than 1000 medicines and vaccines in clinical trials for cancer treatments," he said.

Dr Frei said almost 75 per cent of the antimicrobial medicines under development were derivatives of known and used antibiotics, making them potentially susceptible to existing bacterial resistance.

"Finding completely new types of antibiotics in these metal-containing compounds offers promise to outwit bacterial resistance, because they likely use different mechanisms which the bacteria have not encountered previously," Dr Frei said.

"In addition to activity against MRSA, some compounds were active against dangerous Gram-negative pathogens such as Escherichia coli and Acinetobacter baumannii, which have even fewer novel antibiotic treatments."

The research was conducted through the Community for Open Antimicrobial Drug Discovery (CO-ADD), which was established in the labs of Professor Matt Cooper, with funding from the Wellcome Trust and UQ, to offer a simple and free screening service to scientists worldwide.

"We embarked on a quest to tap into the millions of compounds sitting unused on laboratory shelves, discarded because they don't fit the mould for common drug design," Dr Blaskovich said.

"We test these compounds to see if they have an effect on bacterial and fungal pathogens.

"So far we have received and screened 300,000 compounds, including nearly 1000 metal-containing compounds, from over 300 academic groups across 47 countries."

The research team hopes the findings will bring prompt new investment in antimicrobial research.

"Many pharmaceutical companies are bowing out of antibiotic research as there is little return on investment," Dr Frei said.

"So it is vital to raise awareness that metal complexes are a prospective source of truly novel antibiotics with potential for combatting antimicrobial resistance."

This research has been published in Chemical Science and is free to read.

Credit: 
University of Queensland

A tadpole with a twist: Left-right asymmetric development of Oikopleura dioica

image: Model of left-right patterning in the larvacean embryo. It is well known that the dorsal-ventral axis shows 180° inversion between vertebrate and insect embryos in relation to the spatial pattern of BMP expression (purple). This larvacean can be regarded as an example of induction of 90° rotation for co-option of conventional BMP signaling to restrict neural gene expression on the left side.

Image: 
Osaka University

Osaka, Japan - How does a developing embryo, which is initially round, tell left from right? This basic process is still poorly understood. However, investigating unusual cases can help shed light on how this process occurs in animals. More than a century ago, German biologist Dr. H. C. Delsman described unusual left-right (L-R) patterning in the tadpole-like tunicate Oikopleura dioica. Now, researchers at Osaka University have uncovered the details of this process in O. dioica, reported in a new study published in the Proceedings of the National Academy of Sciences.

Bilateral symmetry is one of the most fundamental characteristics of members of the phylum Chordata, the group that includes O. dioica as well as all animals with backbones, although L-R patterning tends to emerge later in development. However, in the larvacean tunicate O. dioica, distinct from other chordates, L-R asymmetry first appears in the four-cell embryo stage and persists throughout development; the nerve cord, typically located on the dorsal side of chordates, forms instead on the animal's left side. This notable difference provides an opportunity to investigate the mechanisms that drive L-R patterning in chordates.

"Our study reveals that this larvacean uses calcium ion oscillation and expression of the right-sided bone morphogenetic protein (Bmp) gene for embryonic left-right patterning," explains first author Takeshi A. Onuma. "Intriguingly, Nodal, an evolutionarily conserved left-determining gene found in other chordates, is absent from the genome of O. dioica. As the larvacean develops, it is likely that its tail twists 90° counterclockwise relative to its trunk, with the tail nerve cord localized on its left side."

In most chordates, Nodal and Bmp create the gradient responsible for L-R determination in the developing embryo. The absence of Nodal, combined with the novel and early L-R patterning of this larvacean, is therefore of great interest for advancing understanding of the roles of Nodal, Bmp, and calcium ion oscillation, and the evolution of L-R patterning in early chordates.

According to senior author Hiroki Nishida, "between insects and vertebrates, the dorsal-ventral axis is inverted 180°, which is correlated with Bmp expression. However, the reason for this inversion is not yet well understood. In addition to revealing a novel left-right patterning process of a chordate species, our findings provide an example where the dorsal-ventral axis and Bmp expression lead to 90° rotation with left-right patterning."

Such examples of novel L-R patterning are key for unraveling some of the most fundamental questions about the earliest evolution and development of chordates and other animals.

Credit: 
Osaka University

Troubled waters

New research reveals the unseen environmental damage being done to coral reefs in the hotly contested South China Sea, as China and other nations jostle for control of the disputed sea lanes.

Professor Eric Wolanski and Dr Severine Chokroun from James Cook University in Australia are physical oceanographers, researching the distribution, circulation, and physical properties of water.

In a new scientific paper, they argue that the disputed Spratly Islands in the South China Sea are in even more serious trouble than first believed.

"The Spratlys are the sites of a military build-up and gross overfishing, mainly by China. Reefs and islands have been destroyed to construct military outposts to further territorial claims," said Professor Wolanski.

He said it was already known that dredging to construct the new islands had damaged the environment and the region was massively overfished. There are typically 100-150 Chinese fishing boats working every reef that China controls, compared to between 0.1 and 0.5 fishing boats per reef in the Great Barrier Reef.

"We looked at the flows of fish and coral larvae from damaged reefs that produce, or used to produce, larvae and which reefs received them and are now deprived of them."

The scientists determined the currents around the islands by using satellite data and then modelled the movement of larvae from and to every reef in the Spratly Islands archipelago.

"Reefs degraded or killed by island-building and overfishing produce less fish and coral larvae for those downstream. The levels vary, but in the most extreme case - Namyit Island - there are no more new coral and fish larvae getting through, due to all its sources of larvae being destroyed," said Professor Wolanski.
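The source-and-sink accounting Professor Wolanski describes can be sketched as a directed connectivity graph: degrade the source reefs and the downstream reefs lose their larval supply. The reef names and flow values below are invented for illustration and bear no relation to the actual Spratly reefs or the study's model output.

```python
# Toy larval-connectivity graph among hypothetical reefs.
reefs = ["A", "B", "C", "D"]
# flow[src][dst] = larval supply from reef src to reef dst (arbitrary units)
flow = {
    "A": {"B": 5.0, "C": 2.0},
    "B": {"C": 3.0, "D": 1.0},
    "C": {"D": 4.0},
    "D": {},
}
degraded = {"A", "B", "C"}  # reefs whose larval output has been lost

def inflow(reef):
    """Total larval supply reaching a reef from non-degraded sources."""
    return sum(out.get(reef, 0.0)
               for src, out in flow.items() if src not in degraded)

# Reefs that used to receive larvae but no longer do, because every
# one of their source reefs has been degraded.
cut_off = [r for r in reefs
           if inflow(r) == 0.0
           and any(out.get(r, 0.0) > 0 for out in flow.values())]
print(cut_off)  # -> ['B', 'C', 'D']
```

In this toy example, reefs B, C and D are in the same position as Namyit Island in the study: all of their larval sources have been destroyed.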

He said China does not provide scientists with access to the reefs it occupies, nor does it provide data on the health of coral and fish populations at these reefs. But it now appears that the ecosystem of the whole Spratly Islands archipelago is at risk of collapse or severe degradation.

"We've pinpointed a priority list of reefs for vital conservation measures in the Spratly Islands archipelago. We recognise the political difficulties, but we have defined the problem and we have the solution based on the example of the developing collaboration between the Philippines and Vietnam that manages some reefs in the archipelago.

"We hope it's not just wishful thinking that action will follow," said Professor Wolanski.

Credit: 
James Cook University

Researchers discover second type of schizophrenia

image: In a large clinical study, 60 percent of patients with schizophrenia (subtype 1) had decreased gray matter volumes throughout the brain compared to healthy people, which is the typical pattern seen in those with this disorder. However, researchers found that over a third of schizophrenia patients (subtype 2) did not present with this pattern. These brains had increased volumes of gray matter in the basal ganglia, but were otherwise similar to healthy controls.

Image: 
Penn Medicine

Penn Medicine researchers are the first to discover two distinct neuroanatomical subtypes of schizophrenia after analyzing the brain scans of over 300 patients. The first subtype showed widespread reductions in gray matter volume compared with healthy controls, while the second had volumes largely similar to those of normal brains. The findings, published Thursday in the journal Brain, suggest that, in the future, accounting for these differences could inform more personalized treatment options.

"Numerous other studies have shown that people with schizophrenia have significantly smaller volumes of brain tissue than healthy controls. However, for at least a third of patients we looked at, this was not the case at all -- their brains were almost completely normal," said principal investigator Christos Davatzikos, PhD, the Wallace T. Miller Professor of Radiology in the Perelman School of Medicine at the University of Pennsylvania. "In the future, we're not going to be saying, 'This patient has schizophrenia.' We're going to be saying, 'This patient has this subtype' or 'this abnormal pattern,' rather than having a wide umbrella under which everyone is categorized."

Schizophrenia is a poorly understood mental disorder that typically presents with hallucinations, delusions, and other cognitive issues -- though symptoms and responses to treatment vary widely from patient to patient. Up until now, attempts to study the disease by comparing healthy to diseased brains have neglected to account for this heterogeneity, which Davatzikos says has muddled research findings and undermined clinical care.

To better characterize the distinct brain differences within the schizophrenia patient population, Davatzikos established a research consortium that spanned three continents -- the United States, China, and Germany. The international cohort of study participants included 307 schizophrenia patients and 364 healthy controls, all of whom were 45 years old or younger.

Davatzikos and engineering colleagues then analyzed the brain scans using a machine learning method developed at Penn called HYDRA (Heterogeneity Through Discriminative Analysis). The approach helps to identify "true disease subtypes" by limiting the influence of confounding variables, such as age, sex, imaging protocols, and other factors, according to the study authors.

"This method enabled us to sub-categorize patients and find how they differed from the controls, while allowing us, at the same time, to dissect this heterogeneity and tease out multiple pathologies, rather than trying to find a dominant pattern," Davatzikos said.

After applying this machine learning method to the brain images, the researchers found that 115 patients with schizophrenia, or nearly 40 percent, did not have the typical pattern of reduced gray matter volume that has been historically linked to the disorder. In fact, their brains showed increased gray matter volume in the middle of the brain, in an area called the striatum, which plays a role in voluntary movement. When controlling for differences in medication, age, and other demographics, the researchers could not find any clear explanation for the variation.
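As a rough illustration of subtype discovery in general -- not the HYDRA method itself, which additionally uses the healthy controls as a reference and corrects for confounds such as age, sex, and imaging protocol -- a plain k-means clustering on synthetic "regional volume" data shows how two hidden neuroanatomical groups can be separated:

```python
import numpy as np

# Synthetic patient profiles over 2 brain "regions" (invented data):
# subtype 1 ~ reduced volumes everywhere, subtype 2 ~ near-normal.
rng = np.random.default_rng(42)
subtype1 = rng.normal(loc=[-2.0, -2.0], scale=0.3, size=(60, 2))
subtype2 = rng.normal(loc=[0.5, 0.5], scale=0.3, size=(40, 2))
patients = np.vstack([subtype1, subtype2])

def kmeans2(X, iters=50):
    """Plain 2-means clustering: returns a 0/1 label per row."""
    centers = X[[0, -1]].copy()  # init from the first and last rows
    for _ in range(iters):
        d = np.linalg.norm(X[:, None] - centers[None], axis=2)
        labels = d.argmin(axis=1)
        for k in (0, 1):
            centers[k] = X[labels == k].mean(axis=0)
    return labels

labels = kmeans2(patients)
# With these well-separated toy clusters, the split recovers both groups.
print(np.bincount(labels))  # -> [60 40]
```

The real analysis is far more involved, but the core idea is the same: let the data, rather than a single diagnostic label, define the groups.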

"The subtype 2 patients are very interesting, because they have demographic and clinical measures similar to subtype 1, and the only differences were their brain structures," said Ganesh Chand, PhD, a lead author and postdoctoral researcher in the radiology department at Penn.

There are a variety of antipsychotic medications available to manage the symptoms of schizophrenia, but how they will affect a particular patient -- positively or negatively -- is often a shot in the dark, according to study co-senior author Daniel Wolf, MD, PhD, an associate professor of Psychiatry at Penn.

"The treatments for schizophrenia work really well in a minority of people, pretty well in most people, and hardly at all in a minority of people. We mostly can't predict that outcome, so it becomes a matter of trial and error," Wolf said. "Now that we are starting to understand the biology behind this disorder, then we will hopefully one day have more informed, personalized approaches to treatment."

As to why an entire subset of patients with schizophrenia have brains that resemble healthy people, Davatzikos is not willing to speculate.

"This is where we are puzzled right now," Davatzikos said. "We don't know. What we do know is that studies that are putting all schizophrenia patients in one group, when seeking associations with response to treatment or clinical measures, might not be using the best approach."

Future research, he said, will provide a more detailed picture of these subtypes in relation to other aspects of brain structure and function, clinical symptoms, disease progression, and etiology.

Credit: 
University of Pennsylvania School of Medicine

Intervention to help GPs identify and treat patients with hepatitis C found to be effective

The first UK clinical trial to increase the identification and treatment of hepatitis C (HCV) patients in primary care has been found to be effective, acceptable to staff and highly cost-effective for the NHS. The University of Bristol-led Hepatitis C Assessment to Treatment Trial (HepCATT), published in the British Medical Journal today [27 February], provides robust evidence of effective action GPs should take to increase HCV testing and treatment.

The National Institute for Health Research funded trial assessed whether a multi-part intervention in GP practices could increase the identification and treatment of HCV-infected patients compared to usual care. It took place in South West England, with 22 practices randomised to intervention and 23 to the control arm.

An electronic algorithm was devised to flag patients with HCV risk markers and invite them for an HCV test by letter, or opportunistically through pop-up messages during consultations. Practice staff received HCV educational training, and HCV posters and leaflets were placed in waiting rooms to increase patient awareness.

Around five per cent of all patients were flagged with HCV risk markers. Sixteen per cent of the flagged patients were tested for HCV in HepCATT intervention practices, compared with ten per cent in control practices - a 59 per cent increase after adjusting for the characteristics of different practices. Five times as many patients were assessed for treatment in the HepCATT intervention practices compared with control practices.

The intervention was comparatively low cost, at an average of £624 per general practice and £3,165 per additional patient assessed at hepatology. The overall benefit - taking into account future reduction in chronic illness - was estimated at £6,212 per Quality Adjusted Life Year (QALY) gained, well below the average cost of an intervention in the NHS and the National Institute for Health and Care Excellence (NICE) threshold of £20,000 per QALY for recommending interventions.
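Cost-effectiveness figures like these come from an incremental cost-effectiveness ratio (ICER): extra cost divided by extra health gained in QALYs. A minimal worked example of the arithmetic — the QALY gain below is a hypothetical illustration, not the trial's actual input:

```python
# Standard ICER arithmetic: incremental cost / incremental QALYs.
extra_cost_per_patient = 3165.0  # £ per additional patient assessed (from the trial)
qalys_gained_per_patient = 0.51  # HYPOTHETICAL per-patient QALY gain, for illustration

icer = extra_cost_per_patient / qalys_gained_per_patient
print(f"ICER: £{icer:,.0f} per QALY")  # ≈ £6,206 per QALY with these inputs

NICE_THRESHOLD = 20000.0  # £ per QALY, NICE's usual upper bound
assert icer < NICE_THRESHOLD  # i.e. the intervention would count as cost-effective
```

With these assumed inputs the ratio lands close to the £6,212-per-QALY figure reported, and comfortably under the £20,000 NICE threshold.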

Matt Hickman, Professor in Public Health and Epidemiology and co-Director of NIHR Health Protection Research Unit in Evaluation of Interventions at the University of Bristol, who led the study, said:

"We know that scaling up hepatitis C case-finding and treatment alongside interventions that minimise transmission among people who inject drugs is critical for long-term prevention of chronic hepatitis C and hepatitis C-related disease and mortality. The HepCATT intervention had a modest impact but was highly cost-effective. We therefore recommend that it is considered for roll-out across the NHS, with further refinement and improvement before widescale implementation."

Professor Graham Foster from Queen Mary University of London and Clinical Lead for Hepatology at Barts Health, said: "Chronic hepatitis C infection is a major cause of liver disease and cancer. We are working to ensure that England is among the first countries in the world to eliminate the infection. Our primary care colleagues are key partners in the campaign and HepCATT provides the essential evidence base to allow us to expand testing into primary care in an affordable, cost-effective manner."

Dr Sema Mandal, Medical Consultant Epidemiologist lead for Hepatitis at Public Health England, said: "With nearly 100,000 people living with hepatitis C without a diagnosis it's vital that we optimise and implement new ways to enhance case finding in primary care. This new approach not only increases testing but ensures more people access life-saving treatments. Public Health England is working with NHS England and partners across academia to eliminate hepatitis C as a major public health threat and this new approach will help accelerate these efforts."

A qualitative evaluation of the study published in the British Journal of General Practice found that GPs valued the electronic algorithm, which provided them with a list of patients with HCV infection risk factors to target for testing, including risks the GPs may not previously have known about. GPs also appreciated the opportunity to discuss testing with patients, especially those who may not have been aware of their HCV risk. The training enhanced GPs' HCV awareness and knowledge of risk factors, which itself acted as a prompt for opportunistic testing.

GPs suggested refining the algorithm to weight risk factors, fully integrating the pop-up software with electronic patient record systems, and providing additional resources to screen lists and conduct tests.

Dr Jeremy Horwood, Associate Professor of Social Sciences and Health at the Centre for Academic Primary Care at the University of Bristol and ARC West, who led the qualitative evaluation, said: "With adequate resources and technology, primary care can play an important role in identifying patients with hepatitis C infection who have the potential to benefit from treatment. The cost-effective HepCATT intervention provides primary care with a range of tools to improve identification and care for HCV-infected patients and prevent HCV-related illness. This could help the UK reach the World Health Organization's target of 90 per cent of infected people knowing their status by 2030, and help stem the HCV epidemic."

Around 143,000 people in the UK have chronic HCV infection, 85 per cent of whom have a history of injecting drugs. As symptoms do not appear for several years, fewer than half of those infected are aware they have HCV and many are not receiving treatment, increasing the risk of liver damage and of passing the virus to others.

The National Institute for Health and Care Excellence (NICE) in England recommends that GPs should increase testing and treatment, especially among people who inject drugs. However, robust evidence of effective interventions is lacking and testing and treatment rates in many sites are low.

Credit: 
University of Bristol

Helpful interactions can keep societies stable

For half a century, scientists who have developed models of how ecological communities function have arrived at an unsettling conclusion. Their models' predictions--seen as classic tenets of community ecology--suggested that mutualistic interactions between species, such as the relationship between plants and pollinators, would lead to unstable ecosystems.

"In one of these classic theories," says Erol Akçay, an assistant professor of biology at Penn, "it says that if you have a lot of these mutualistic interactions, where if you increase the abundance of one species it will lead to an increase in the other, things tend to go out of equilibrium."

In a paper published this week in Nature Ecology and Evolution, Akçay and Jimmy Qian, a 2019 Penn graduate who worked in Akçay's lab when he was a student, challenge those assumptions. Their work shows that mutualism is compatible with stable communities and that the balance of mutualism with other types of interactions, including competitive and exploitative, plays determinative roles in the makeup, size, and stability of those communities.

"We argue that mutualisms are not inherently destabilizing," says Qian, now a medical student at Stanford University. "It's all about the balance of how much mutualism there is and how unique those mutualistic benefits are."

As an undergraduate, Qian worked with Akçay for more than two years on projects related to health and medicine. The current project emerged from initial attempts to model the community dynamics of a human microbiome.

"As we started reading the literature and building models, we realized there were questions in microbiome ecology that were generalizable to community ecology as a whole, which is where this paper ended up," says Qian.

Specifically, the researchers started looking more closely at the seminal work of Robert May, a renowned ecologist and physicist who argued that larger, more complex communities tend to be less stable. Stability in these models is a measurement of how likely a system is to return to an equilibrium if nudged away from it. For example, a stable community could withstand a disease reducing numbers of one of its species and come out the other side of the infection with its same species composition intact.
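May's complexity-stability result can be illustrated numerically: local stability means every eigenvalue of the "community matrix" (the Jacobian at equilibrium) has negative real part, and for large random matrices this becomes increasingly unlikely. A minimal sketch of that check (May-style random matrices, not the paper's model):

```python
import numpy as np

rng = np.random.default_rng(1)

def is_locally_stable(J):
    """An equilibrium is locally stable iff all eigenvalues of the
    Jacobian (community matrix) have negative real part."""
    return np.all(np.linalg.eigvals(J).real < 0)

def random_community(S, connectance, sigma):
    """May-style random community: S species, self-limitation of -1 on
    the diagonal, off-diagonal interactions drawn with probability
    `connectance` and standard deviation `sigma`."""
    J = np.where(rng.random((S, S)) < connectance,
                 rng.normal(0.0, sigma, (S, S)), 0.0)
    np.fill_diagonal(J, -1.0)
    return J

# Larger communities with the same interaction strength are stable
# less often -- the pattern behind May's complexity-stability argument.
for S in (10, 50, 250):
    frac = np.mean([is_locally_stable(random_community(S, 0.3, 0.2))
                    for _ in range(50)])
    print(S, frac)
```

The connectance and interaction-strength values here are arbitrary choices for illustration; varying them shifts where the stability threshold falls.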

In earlier studies, scientists showed that mutualistic interactions had destabilizing effects on communities and thus must only play a small role in ecosystems, alongside interactions that are either competitive or exploitative, like a predator-prey dynamic.

Yet one needn't look further than a coral reef or rainforest to see that the world is full of complex ecosystems. And from plant-pollinator interactions to human-microbiome relationships, mutualistic interactions also abound. So Akçay and Qian decided to dig deeper to see what these earlier models may have missed when it came to mutualism and ecosystem stability.

One thing they noticed was that earlier studies had assumed mutualistic interactions benefited the species involved in a linear fashion, without any saturation point. But in reality, the benefits of mutualism have a limit. For example, says Akçay, if you have more bees in an ecosystem, plants might get pollinated more and might produce more fruits. "But at some point," he says, "if the area is filled with bees and the plants are all pollinated, the plants will be limited by something else."
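The contrast between the two assumptions can be sketched with a saturating benefit curve of the familiar Michaelis-Menten form — an assumed functional shape for illustration, not the authors' exact equations:

```python
def linear_benefit(partner_abundance, b=0.1):
    # Benefit grows without bound -- the classic-model assumption
    # that makes mutualisms look destabilizing.
    return b * partner_abundance

def saturating_benefit(partner_abundance, b=0.1, half_saturation=10.0):
    # Benefit levels off at b: once pollinators are plentiful, plants
    # become limited by something else (light, nutrients, space).
    return b * partner_abundance / (half_saturation + partner_abundance)

for n in (1, 10, 100, 1000):
    print(n, linear_benefit(n), round(saturating_benefit(n), 4))
```

Under the linear form the benefit at 1,000 partners is 100 times the benefit at 10; under the saturating form it creeps up toward the ceiling of 0.1 and stops, which removes the runaway feedback.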

In addition to including this saturation point, Akçay and Qian attempted to make their new model more closely mimic the natural world by allowing the ecosystem to assemble gradually, adding species in a sequential manner. The classical models, in contrast, assumed that all the species came together in one fell swoop and then reached equilibrium.

"Of course, real communities don't assemble that way," says Akçay.

Their sequential assembly technique also allowed them to measure a different type of stability from the internal stability normally measured in these models, which they call external stability, or the ability of a community to resist invasion by a new species.

In their model, each time they added a species they would randomly assign it an interaction type--mutualistic, exploitative, or competitive--with all the other species in the community.
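A bare-bones sketch of that assembly loop — invented parameters and a plain random draw, not the authors' actual simulation — might look like:

```python
import random

random.seed(42)

INTERACTION_TYPES = ("mutualistic", "exploitative", "competitive")

def assemble(n_species, p_mutualism=0.3):
    """Add species one at a time; each newcomer is assigned a randomly
    drawn interaction type with every resident. p_mutualism biases the
    draw, with the remainder split between the other two types."""
    weights = (p_mutualism, (1 - p_mutualism) / 2, (1 - p_mutualism) / 2)
    interactions = {}
    for newcomer in range(n_species):
        for resident in range(newcomer):
            interactions[(resident, newcomer)] = random.choices(
                INTERACTION_TYPES, weights=weights)[0]
    return interactions

web = assemble(20)
counts = {t: sum(v == t for v in web.values()) for t in INTERACTION_TYPES}
print(counts)  # 20 * 19 / 2 = 190 pairwise interactions in total
```

Sweeping `p_mutualism` in a model like this is what lets one ask how the balance of interaction types shapes community size and stability.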

Their findings support the intuitive notion that mutualistic interactions have a place in a stable society.

"It's really the balance of the different interaction types between species that governs the community dynamics and stability," Qian says.

In their model, more mutualisms did not mean less internal stability, in contrast to what the classic models predicted. And mutualisms enhanced external stability in their analysis.

"So, they are actually more stable in an external sense because they are more resistant to invasions from outside," says Akçay. "And the reason is blindingly obvious in retrospect. If you have a community where most of the species are helping each other, each species will be abundant. If you are at this tiny population size and are trying to invade this community, it will be hard because your competitors are thriving."

While the new model is relatively simple and has room to be refined, Akçay and Qian say the results seem to be part of a shift in the community ecology field toward understanding that positive interactions in communities don't necessarily unsettle communities.

"These old, classical ecology questions still have legs," Akçay says.

Credit: 
University of Pennsylvania

'Low' socioeconomic status is the biggest barrier to STEM participation

A new study has found that socioeconomic status (SES) has the strongest impact on whether secondary school students study the STEM sciences.

The research team drew on data from over 4,300 pupils in Australia, and also examined Indigenous students, who were found to be less likely to study science subjects across the board.

Led by Dr Grant Cooper of RMIT University and Professor Amanda Berry of Monash University, the study - published in the International Journal of Science Education - highlighted the demographic predictors of secondary student science enrolment. They found that female students are much less likely to study physics, more likely to study biology, and have roughly average participation in other areas such as chemistry.

The study notes, however, that these categories are not mutually exclusive. Indigeneity, gender, and low SES can all manifest in the same student, complicating the results.

The authors point out a worrying lack of initiatives to improve low SES students' access to science. "Australia has one of the highest levels of school social segregation of all OECD countries, meaning schools mainly enrol students from low or high SES backgrounds."

This problem is worsened by a lack of diversity in school syllabi.

Dr Cooper and Professor Berry argue that "a significant challenge for educators and school leaders is the implementation of a science syllabus that meets the diverse needs of students, particularly for underrepresented cohorts, who are less likely to have access to valued cultural, social and science capital."

Lower SES schools are less likely to have enough resources, such as books, materials, and laboratories, to support student engagement in science. The researchers point to the Finnish education system, in which students from different socioeconomic backgrounds study together.

"A student's ease of access to, and a sustained immersion in cultural, social and science capitals facilitates a habitus and identity that embodies a sense that 'science is for me'."

Indigenous students face challenges in all forms of science except earth/space science, in which their participation was similar to other Australian children. The researchers suggest that this might be to do with cultural traits that emphasise a connection with the land.

"This result may be explained by Aboriginal Peoples' spiritual connectedness with Country, with land forming the basis of Aboriginal relationships, identities and cultural practices. Earth/space science syllabi commonly explore the interconnections between land, ocean and atmosphere."

This result may offer a clue as to how to better include Indigenous students in science, by incorporating Indigenous perspectives into the course content. They note that the Australian Curriculum, Assessment and Reporting Authority has attempted this.

"[They have introduced] new science elaborations addressing Aboriginal and Torres Strait Islander histories and cultures. An important purpose of these elaborations is the hope that ... 'Aboriginal and Torres Strait Islander students are able to see themselves, their identities and their cultures reflected in the curriculum of each of the learning areas, [and] can fully participate in the curriculum' (ACARA, 2018, para.1)."

While female students did show lower participation in physics, they were more involved than male students in biology, and about the same in other sciences. The researchers suggest that more needs to be done to encourage female involvement in STEM.

"Initiatives focusing on knowledge, ability, motivation and feelings of belonging could increase the interest and persistence in STEM education."

Credit: 
Taylor & Francis Group

UCLA engineers develop miniaturized 'warehouse robots' for biotechnology applications

image: Photo of the robots-on-a-chip system

Image: 
Wenzhou Yu & Haisong Lin/UCLA

UCLA engineers have developed minuscule warehouse logistics robots that could help expedite and automate medical diagnostic technologies and other applications that move and manipulate tiny drops of fluid. The study was published in Science Robotics.

The robots are disc-shaped magnets about 2 millimeters in diameter, designed to work together to move and manipulate droplets of blood or other fluids with precision. For example, the robots can cleave one large droplet of fluid into smaller drops that are equal in volume for consistent testing. They can also move droplets into preloaded testing trays to check for signs of disease. The research team calls these robots "ferrobots" because they are powered by magnetism.

The ferrobots can be programmed to perform massively parallelized and sequential fluidic operations at small-length scales in a collaborative manner. To control the robots' motion, electromagnetic tiles in the chip pull the ferrobots along desired paths, much like using magnets to move metal chess pieces from underneath a chess board.
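In software terms, steering one robot amounts to energizing a sequence of grid tiles along a precomputed path. A toy sketch of that control schedule (a hypothetical tile grid, not UCLA's actual control code):

```python
def path_to_tile_commands(path):
    """Turn a path of (row, col) grid cells into an activation schedule:
    energize the tile ahead of the robot, then release the one behind it,
    so the magnet is pulled along one cell at a time."""
    commands = []
    for prev, nxt in zip(path, path[1:]):
        commands.append(("on", nxt))    # pull the ferrobot forward
        commands.append(("off", prev))  # release the tile it just left
    return commands

# Move a ferrobot three cells to the right along row 0.
schedule = path_to_tile_commands([(0, 0), (0, 1), (0, 2), (0, 3)])
for cmd in schedule:
    print(cmd)
```

Running several robots concurrently then becomes a scheduling problem: paths must be timed so that no two robots need the same tile at the same moment, much as in warehouse-robot fleets.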

"We were inspired by the transformational impact of networked mobile robot systems on manufacturing, storage and distribution industries, such as those used to efficiently sort and transport packages at Amazon warehouses," said Sam Emaminejad, an assistant professor of electrical and computer engineering and the study's corresponding senior author. "So, we set out to implement the same level of automation and mobility in a microfluidic setting. But our 'factory floor' is much smaller, about the size of your palm, and our goods, the fluid droplets, are as small as a few tenths of a millimeter."

The "factory floor" is an index card-sized chip, designed by the researchers, with internal structures that help manipulate fluid droplets transported by the robots, as demonstrated in this video: https://www.youtube.com/watch?v=wuOHoJ1qaXs

"In the same way that mobile and cross-collaborative Amazon robots transformed the logistics-based industries, our technology could transform various biotech-related industries, including medical diagnostics, drug development, genomics, and the synthesis of chemicals and materials," said co-corresponding senior author Dino Di Carlo, UCLA's Armond and Elena Hairapetian Professor in Engineering and Medicine. "These fields have traditionally used refrigerator-sized 'liquid-handling' robots. Using our much smaller ferrobots, we have the potential to do a lot more experiments - and generate significantly more data - with the same starting materials and in the same amount of time."

The researchers showed in one of their experiments how an automated network of three robots could work in concert to move and manipulate droplets of human plasma samples on a chip in search of molecular markers that would indicate the presence of cancer.

"We programmed when and where the tiles were switched on and off to guide ferrobots through their designated routes," said Wenzhuo Yu, a UCLA electrical and computer engineering graduate student and a co-lead author on the paper. "This allows us to have several robots working in the same space, and at a relatively fast pace to accomplish tasks efficiently."

The robots moved at 10 centimeters per second and performed more than 10,000 cyclic motions during a 24-hour period in the experiments. In addition to transportation, other functions such as dispensing, merging and filtering of fluid samples were demonstrated as ferrobots interacted with structures on the chip.

Credit: 
University of California - Los Angeles

TRAX air quality study expands

image: A TRAX light rail train in Salt Lake City, Utah.

Image: 
University of Utah

For more than five years, University of Utah air quality sensors have hitched rides on TRAX light rail trains, scanning air pollution along the train's Red and Green Lines. Now the study, once a passion project of U researchers, has become a state-funded long-term observatory, with an additional sensor on the Blue Line into Sandy and Draper and additional insights into the events that impact the Salt Lake Valley's air, including summer fireworks and winter inversions.

In a new study published in Urban Science, researchers including Daniel Mendoza and Logan Mitchell report the latest from the TRAX Observation Project, including data validation studies that bolster the data's value for other researchers and three case studies from recent events showcasing the abilities of the mobile air quality sensors.

What's new: Blue Line and data validation

UTA's TRAX system consists of three light rail lines: red, green and blue. Up until November 2019, U sensors measuring ozone and particulate matter were installed only on the Red and Green Line trains, because both lines used the same train cars. These two lines travel through downtown Salt Lake City, the central I-15 corridor and the valley's west side. With an additional sensor on the Blue Line, however, air quality measurements now extend into the Salt Lake Valley's southeastern quadrant.

"That's a really important area of the valley," Mitchell says. "There's a lot of people down there." The Blue Line also goes up and down in elevation, just as the Red Line does as it ascends from downtown Salt Lake City to the U campus. "Since elevation is such a key part of the air quality and understanding the depth of the inversion on different days, under different conditions," he says, "it's going to be a really important piece of the dataset for us."

Extending into the south valley also allows researchers to learn more about how air masses move back and forth between Salt Lake and Utah counties, through the narrow Point of the Mountain passage.

"That's actually really critical because we sometimes have very different meteorological phenomena going on between the two valleys," Mendoza says. "We can now examine in our basin an exchange of air masses."

The other major development in the TRAX Observation Project is the validation of the data coming from the mobile sensors. This is an important step in a pioneering project such as this and, together with quality assurance and quality control protocols, serves as a certificate on the archived data now being made available to other researchers. It also assuages any concerns that the air turbulence caused by the moving train might skew the readings.

The experiment involved a stationary particulate matter sensor placed about 10 feet (3 m) from the rail line that would take readings whenever the TRAX trains were within 500 feet (150 m) of the sensors. Comparing the mobile and stationary readings, Mendoza says, showed 96% accuracy. "That really gives us a great deal of confidence that our TRAX sensors are actually performing really well compared to regulatory sensors and can be used for health studies, policy and so on," Mendoza says.
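One simple way to quantify agreement between paired mobile and stationary readings of this kind is a correlation and relative-error check. The data below is synthetic, for illustration only — it is not the study's data, and the study's 96% figure may be defined differently:

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic paired PM2.5 readings (ug/m^3): a stationary reference
# sensor vs. the train-mounted sensor passing within range.
stationary = rng.uniform(5, 60, 200)
mobile = stationary * rng.normal(1.0, 0.03, 200)  # ~3% multiplicative noise

r = np.corrcoef(stationary, mobile)[0, 1]
mean_rel_err = np.mean(np.abs(mobile - stationary) / stationary)
print(f"correlation: {r:.3f}, mean relative error: {mean_rel_err:.1%}")
```

With small sensor noise, the correlation sits near 1 and the mean relative error near the noise level; a real validation would pair each train pass with the nearest-in-time stationary reading before computing these statistics.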

Watching the fireworks

With five years of continued observations, the TRAX Observation Project has captured many air quality events. Mendoza, Mitchell and their colleagues document three particular events in their paper: an elevated ozone event from August 2019, a cold air pool inversion event in November 2019 and the fireworks on July 4, 2019.

The fireworks event was unique--it wasn't a phenomenon caused by an atmospheric event or by the geography of the Salt Lake Valley. It was an incidence of multiple point sources of particulate matter air pollution, allowing observation of how those plumes of particulate matter moved through the valley.

Following generally good air quality, hotspots of elevated pollution started appearing in the TRAX data between 10 and 11 p.m. on Independence Day. By midnight, the majority of the valley was experiencing moderate to unhealthy air quality.

Mendoza says that the train data shows not only the dispersion of the smoke--something you don't see in wintertime inversions, which have low atmospheric energy--but also the evening winds coming down Emigration Canyon on the valley's east side, which wash out some of the air pollution.

"These are examples of the kinds of things that we're seeing that you couldn't see with stationary monitors," Mitchell adds. "It's helping us understand where the gradients are in the valley, how they evolve through pollution events such as during the Fourth of July or an inversion or an ozone event. You can see the air masses moving around. You can see where the pollution is and how it moves from different parts of the valley."

Next steps

Next, Mitchell says, the team hopes to add sensors that measure oxides of nitrogen and carbon monoxide, both important components of atmospheric chemistry. They'd also like to expand the study to light rail trains in cities such as Portland or Denver.

"It would be really interesting for us to be able to compare the spatial patterns we're seeing here with another city that has different topography around it and a different mix of emission sources," Mitchell says, "so that we can understand how cities in general are being affected by these things and how that's similar or different from what's going on in Salt Lake City."

Credit: 
University of Utah