Culture

Female surgeon scientists claim more than their share of research grants

image: The Changing Face of Academic Surgery: Over-Representation of Females Amongst Surgeon-Scientists with R01 Funding.

Image: 
American College of Surgeons

While their ranks in academic surgery may not be robust, women surgeons are holding their own in surgical research, securing a greater percentage of National Institutes of Health (NIH) grants than their numbers would suggest. However, their overall numbers remain low, and academic medicine still needs to do more to encourage this emerging generation of surgeon-scientists, according to the authors of a new study of surgical research, who report their findings in an "article in press" on the Journal of the American College of Surgeons website ahead of print.

"Females are underrepresented in academic surgery but hold a greater than anticipated proportion of NIH funding," said corresponding study author Shayna Showalter, MD, FACS, associate professor of surgery, University of Virginia Health System, Charlottesville. "To me, this means that female surgeon-scientists are a crucial component of future surgical research. They have been able to succeed even in a very competitive research environment."

The study queried NIH R01 grants held in surgery departments as of October 2018 and found 212 active grants held by 159 principal investigators (PIs), 49 of them held by 42 women PIs. That means women represent 26.4 percent of these R01 grant holders while making up 19 percent of the academic surgical faculty, according to Association of American Medical Colleges data. The study chose R01 grants because they are the most common and oldest form of NIH grant, with a track record of productive, high-quality research.
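The headline share follows directly from the counts reported above; as a quick arithmetic check (the 19 percent faculty figure is the AAMC number cited in the study):

```python
# Counts reported in the study (as of October 2018).
total_pis = 159       # principal investigators holding R01 grants
women_pis = 42        # of whom women
total_grants = 212    # active R01 grants in surgery departments
grants_women = 49     # held by women PIs

share_women_pis = women_pis / total_pis   # women's share of R01 PIs
faculty_share = 0.19                      # AAMC: women on academic surgical faculty

print(f"Women are {share_women_pis:.1%} of R01 PIs "
      f"vs. {faculty_share:.0%} of academic surgical faculty")
```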

Women were more likely to be first-time grant recipients than men (73.5 vs. 54.8 percent, p=0.03) and less likely to have multiple grants or previous NIH funding (8.6 vs. 21.4 percent, p=0.03), the study found. "What I hope this shows is that we are potentially shifting away from the tradition of giving more funding to longstanding, proven researchers and that we continue to focus on awarding funding to a diverse group of accomplished researchers. We know that female surgeon-scientists are doing very good work," Dr. Showalter said. "Females in this study were twice as likely to be first-time grant recipients. As a community, we need to ensure that first-time grant holders continue to be taken seriously and are awarded NIH funding when appropriate."

However, funding for surgical research is shrinking. The study notes that the bias toward researchers with previous grants is "worrisome for all surgeons" because the number of funded R01 grants has declined 17 percent in recent years, with surgeon-led studies having a mean success rate considerably lower than the mean NIH funding rate (16.4 vs. 19.2 percent, p=0.011), according to a previously published study in the Journal of the American College of Surgeons.1

Another disparity the study uncovered was that female investigators had fewer published articles about their research than their male counterparts, a gap that persisted even when the study authors applied the grant impact metric, which controls for total amount of funding. "This difference may be related to the number of first-time grants and is consistent with prior knowledge that women in academic surgery have fewer publications in general than men," Dr. Showalter said.

The study found that women grant recipients were more likely to come from departments with a female chair (31 vs. 13.7 percent, p=0.01) or from a department whose faculty was more than 30 percent female (35 vs. 18.2 percent, p=0.03). The study authors recommend a number of strategies for academic surgery departments to nurture and promote female researchers. "One of the strategies is having strong mentorship and sponsorship programs," said Dr. Showalter. "We know that successful academic surgeons value mentorship, often having more than one mentor throughout their career."

Another strategy is for academic settings to continue to hire and promote female surgeons. A 2018 study found that women held only 7 percent of the full professorships among surgeon faculty at U.S. medical schools.2 "Institutions must continue to support the academic advancement of female surgeon-scientists and to advocate for females in leadership positions. This will allow for a strong group of women to mentor the women behind them," Dr. Showalter said. "But we do have a paucity of female leaders in high-powered positions, including chairs and deans of surgical departments and leaders within societies."

The study reveals that women researchers are doing high-quality work in surgery, Dr. Showalter said. "There are some great female surgeon-scientists, and we have many more with potential that will be crucial to the future of surgical research," she said. "As a community within academia, we need to continue to work to figure out the best way to support and promote a diverse faculty."

Credit: 
American College of Surgeons

New technology promises to revolutionize nanomedicine

image: Principal investigator Maxim Nikitin in the laboratory.

Image: 
Courtesy of Maxim Nikitin

Researchers from the Moscow Institute of Physics and Technology and their colleagues from Shemyakin-Ovchinnikov Institute of Bioorganic Chemistry and Prokhorov General Physics Institute of the Russian Academy of Sciences have developed a breakthrough technology to resolve a key problem that has prevented the introduction of novel drugs into clinical practice for decades. The new solution prolongs blood circulation for virtually any nanomedicine, boosting its therapeutic efficiency. The Russian researchers' study was published in Nature Biomedical Engineering and featured in the journal's News & Views section.

The development of medicinal chemistry since the late 19th century has led to the transition from traditional medicine to drugs with strictly defined chemical formulas. Despite being some 150 years old, this paradigm still underlies the vast majority of modern medications. Their active molecules tend to perform one simple function: activating or deactivating a certain receptor.

However, since the 1970s, many laboratories have been pursuing next-generation drugs that would perform multiple complex actions simultaneously: for example, identifying cancer cells via a range of biochemical cues, signaling the tumor location to the physician, and subsequently destroying all the malignant cells with toxins and heating.

Since one molecule cannot perform all of these functions, a larger supramolecular structure, or a nanoparticle, has to be used.

However, despite the enormous variety of nanomaterials developed to date, only the simplest ones with highly specific functions have made it into clinical practice. The main problem in using therapeutic nanoparticles has to do with the amazing efficiency of our immune system. Over millennia, evolution has perfected the human body's ability to eliminate nanosized foreign entities, from viruses to smoke particles.

When administered in reasonable doses, most artificial nanoparticles are cleared from the bloodstream by the immune system in mere minutes or even seconds. That means no matter how sophisticated the drugs are, the majority of the dose will not even have a chance to come in contact with the target, but will affect healthy tissues, usually in a toxic way.

In their recent paper, a team of Russian researchers led by Maxim Nikitin, who heads the Nanobiotechnology Lab at MIPT, proposed a groundbreaking universal technology that significantly prolongs the blood circulation and enhances the therapeutic efficiency of diverse nanoagents, without necessitating their modification.

The technology exploits the fact that the immune system continually eliminates the old, "expired" red blood cells -- about 1% per day in humans -- from the bloodstream. "We hypothesized that if we slightly intensified this natural process, we could trick the immune system. While it becomes busy clearing red blood cells, less attention is given to the clearance of the therapeutic nanoparticles. Importantly, we wanted to distract the immune system in the most gentle way, ideally via the body's innate mechanisms rather than by artificial substances," Maxim Nikitin said.

The team found an elegant solution: injecting mice with antibodies specific to red blood cells. Antibodies are the molecules forming the basis of the mammalian immune system; they recognize entities that need to be removed from the body, in this case RBCs. The hypothesis proved right, and a small dose of antibodies -- 1.25 milligrams per kilogram of body weight -- turned out to be very effective, extending the blood circulation of nanoparticles dozens of times over. The tradeoff was very moderate: the mice exhibited a mere 5% drop in RBC levels, half the decrease that would qualify as anemia.

The researchers found that their approach, called the "cytoblockade" of the mononuclear phagocyte system, was universally applicable to all nanoparticles. It prolonged the circulation times for tiny quantum dots measuring just 8 nanometers, medium-scale 100-nanometer particles, and large micron-sized ones, as well as the most advanced nanoagents approved for use on humans: the polymer-coated "stealth" liposomes, which disguise themselves under a highly inert polyethylene glycol coating to hide from the immune system. At the same time, the cytoblockade does not impair the body's ability to fend off bacteria (natural microparticles) in the bloodstream, both in small doses and in the case of sepsis.

There is a wide range of nanoparticle applications made possible by the new technology. In one series of experiments on mice, the researchers achieved a dramatic improvement in the so-called active delivery of nanoagents to cells.

It involves nanoparticles equipped with a special molecule that recognizes target cells -- for example, an antibody against the CD4 receptor, which identifies T cells. Drug delivery to these cells would be useful for treating autoimmune and other diseases. Inducing a cytoblockade in mice increased nanoparticle circulation time from the usual 3-5 minutes to over an hour. Without the cytoblockade, clearance was too rapid for any binding to the target cells to occur; with it, the agents showed exceptionally high targeting efficiency, on a par with that achieved in vitro. The experiment highlights the enormous potential of the new technology not only for enhancing the performance of nanosized agents, but for enabling those previously completely inefficient in vivo.

The team went on to demonstrate the applicability of their technology to cancer therapy, with the cytoblockade enabling up to 23 times more efficient magnetically guided delivery of nanoparticles to the tumor (fig. 1). This delivery technique uses a magnetic field to guide, focus, and retain magnetic agents within a tumor to reduce systemic toxicity; it is available for nanoparticles but not for molecules. The study reports an effective therapy of melanoma using liposomes loaded with magnetite and the chemotherapy drug doxorubicin, which were completely ineffective without the antibodies to red blood cells. Improved magnetic delivery was shown for five tumor types of different origin, including melanoma and breast cancer.

"We observed an improved nanoagent delivery with each type of cancer that we ran experiments for. It is particularly important that the method worked on human tumor cells introduced into mice," commented study co-author Ivan Zelepukin, a junior researcher at the RAS Institute of Bioorganic Chemistry and MIPT.

Notably, the new technology enabled a therapeutic improvement for a commercially available liposomal agent approved for use on humans. This means that the cytoblockade does not only open up new therapeutic opportunities but also enhances the ones existing now.

The authors of the paper mention that the enhanced nanoparticle performance closely correlated with the prolongation of blood circulation time. That correlation could be established using a highly sensitive method of magnetic particle quantification developed by the team. It enables detecting the kinetics of particle elimination from the bloodstream in a noninvasive way -- that is, without drawing blood.

"That method did more than allow us to make real-time measurements of particle content in the bloodstream. It enabled the whole study, because it would not have been possible to measure such an enormous number of nanoparticle kinetic profiles using any other existing method within a reasonable time," said Petr Nikitin, a co-author of the study and the head of the Biophotonics Lab at the General Physics Institute of RAS.

The newly developed technology is especially promising for clinical translation, because anti-D antibodies, which bind to RhD-positive red blood cells, have long been approved for the treatment of immune thrombocytopenia and the prevention of rhesus disease. Assessment of the new technology in humans can therefore begin in the near future using already approved medications.

"There is no doubt that the combined action of the nanomedicines with the existing anti-D or improved anti-RBC antibodies of the next generation should be examined in stringent clinical tests. However, we feel very optimistic about this technology and its applications to severe diseases that require targeted drug delivery, including cancer," Maxim Nikitin added. "Now that this complex seven-year study has been published, we will make every effort to translate it into clinical practice. For this reason, we are seeking collaborators and active colleagues interested in joining the team."

Since the cytoblockade technology is universal in terms of compatible nanoagents and does not require their modification, it has the potential to become substantially more productive than PEGylation, which was developed in the 1970s and has since given rise to a multibillion-dollar industry of "prolonged circulation" drugs, with dozens of clinically approved medications.

The authors believe that the proposed technology may open doors for in vivo use of the most advanced nanoagents with the primary focus on functionality rather than stealth characteristics. Novel biomedical nanomaterials fabricated according to the most progressive ideas in materials science may be instantaneously introduced into life science research in vivo and then rapidly perfected for clinical use.

Credit: 
Moscow Institute of Physics and Technology

Kidney transplant, the cost of accounting for patients' preferences

VENICE - From the moment a kidney becomes available, there are just 24 hours to identify the recipient and carry out the transplant. Waiting lists are extremely long, available organs are few and not all of them are 'ideal'. Furthermore, a recipient has the ability to 'refuse' a kidney, for instance by continuing with the dialysis treatment while waiting for a better moment, thus complicating the race against time to carry out the transplant.

This issue, which combines scarce resources with an array of delicate choices, could be handled more efficiently if the allocation system accounted for patients' preferences alongside parameters of compatibility, age and waiting time.

These are the findings of a study recently published in the Journal of Health Economics by Giacomo Pasini, professor of Economics at Ca' Foscari University of Venice, co-authored by Mesfin G. Genie - who received his PhD in Economics at Ca' Foscari and is now at the University of Aberdeen - and Antonio Nicolò of the University of Padua. The fieldwork was made possible by a strategic project fund granted by the University of Padua.

The economists interviewed 248 patients on the waiting list for a kidney transplant at the Kidney and Pancreas Transplantation Unit of the University of Padua's School of Medicine, one of the key centres in Italy, with over 2000 operations completed.

The experiment allowed the researchers to analyze the patients' preferences regarding different variables such as waiting time, post-graft life expectancy, infection risk and the neoplastic risk.

Patients who are not willing to take any risks, for example, might want to wait longer to get the 'perfect' kidney. Conversely, patients who find the dialysis treatment particularly unbearable, such as parents of young children whose life quality is seriously hindered by the number of hours they need to spend in the hospital, could be willing to accept an augmented risk kidney, just so they can get the transplant as soon as possible.

"We found a significant heterogeneity in the preferences of patients who are waiting for a transplant - explains Giacomo Pasini - Furthermore, we have proved how including patients' preferences in the kidney allocation algorithm would considerably improve both the patients' satisfaction and the expected graft survival"

The study also tackled the internationally relevant topic of the so-called "marginal" kidney - that is, a kidney that is potentially suitable but falls short of standard criteria.

Examples include kidneys from older deceased donors, or those of a young person who died in a car crash, where a residual infection risk cannot be excluded.

"Marginal kidneys are currently offered only to those patients who have been on the waiting list for a long time - stated Professor Pasini - our article suggests there might be more patients who are willing to accept a marginal kidney right away. Broadly speaking, we are putting forward a way to improve the kidney allocation system that could improve both the efficiency of the procedure (an increase in the number of transplants completed) and the life quality of the patients".

Credit: 
Università Ca' Foscari Venezia

Separating gamma-ray bursts: Students make critical breakthrough

image: The figure indicates how similar different GRBs are to each other. Points which are closer together are more similar, and points which are further away are more different. What we find is that there are two distinct groups, one orange and the other blue. The orange dots appear to correspond to "short" GRB, which have been hypothesized to be produced by mergers of neutron stars, and the blue dots appear to correspond to "long" GRB, which might instead be produced by the collapse of dying, massive stars.

Image: 
Johann Bock Severin, Christian Kragh Jespersen and Jonas Vinther

By applying a machine-learning algorithm, scientists at the Niels Bohr Institute, University of Copenhagen, have developed a method to classify all gamma-ray bursts (GRBs), rapid highly energetic explosions in distant galaxies, without needing to find an afterglow - by which GRBs are presently categorized. This breakthrough, initiated by first-year B.Sc. students, may prove key in finally discovering the origins of these mysterious bursts. The result is now published in Astrophysical Journal Letters.

Ever since gamma-ray bursts (GRBs) were accidentally picked up by Cold War satellites in the 1970s, the origin of these rapid bursts has been a significant puzzle. Although many astronomers agree that GRBs can be divided into shorter (typically less than 1 second) and longer (up to a few minutes) bursts, the two groups overlap. It has been thought that longer bursts might be associated with the collapse of massive stars, while shorter bursts might instead be caused by the merger of neutron stars. However, without the ability to cleanly separate the two groups and pinpoint their properties, it has been impossible to test these ideas.

So far, it has only been possible to determine the type of a GRB about 1% of the time, when a telescope could point at the burst location quickly enough to pick up residual light, called an afterglow. This step has been so crucial that astronomers have developed worldwide networks capable of interrupting other work and repointing large telescopes within minutes of the discovery of a new burst. One GRB was even associated with gravitational waves detected by the LIGO observatory, whose founders were awarded the 2017 Nobel Prize in Physics.

Breakthrough achieved using machine-learning algorithm

Now, scientists at the Niels Bohr Institute have developed a method to classify all GRBs without needing to find an afterglow. The group, led by first-year B.Sc. Physics students Johann Bock Severin, Christian Kragh Jespersen and Jonas Vinther, applied a machine-learning algorithm to classify GRBs. They identified a clean separation between long and short GRBs. Their work, carried out under the supervision of Charles Steinhardt, will bring astronomers a step closer to understanding GRBs.

This breakthrough may prove the key to finally discovering the origins of these mysterious bursts. As Charles Steinhardt, Associate Professor at the Cosmic Dawn Center of the Niels Bohr Institute explains, "Now that we have two complete sets available, we can start exploring the differences between them. So far, there had not been a tool to do that."

From algorithm to visual map

Instead of using a limited set of summary statistics, as had typically been done until then, the students encoded all available information on GRBs using the machine-learning algorithm t-SNE. The t-distributed stochastic neighbor embedding algorithm takes complex high-dimensional data and produces a simplified, visually accessible map while preserving the neighborhood structure of the dataset. "The unique thing about this approach," explains Christian Kragh Jespersen, "is that t-SNE doesn't force there to be two groups. You let the data speak for itself and tell you how it should be classified."
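For readers curious what such a pipeline looks like in practice, here is a minimal sketch using scikit-learn's TSNE. The feature vectors below are synthetic stand-ins invented for illustration; the actual study encoded full Swift light curves:

```python
import numpy as np
from sklearn.manifold import TSNE

rng = np.random.default_rng(0)

# Synthetic stand-ins for per-burst feature vectors: two overlapping
# populations mimicking "short" and "long" bursts in 50 dimensions.
short_like = rng.normal(loc=0.0, scale=1.0, size=(100, 50))
long_like = rng.normal(loc=1.5, scale=1.0, size=(100, 50))
features = np.vstack([short_like, long_like])

# t-SNE reduces the 50-D vectors to a 2-D "visual map", keeping points
# that are neighbors in the original space close together in the map.
embedding = TSNE(n_components=2, perplexity=30,
                 random_state=0).fit_transform(features)
print(embedding.shape)  # one 2-D point per burst, ready to plot and color
```

Plotting the embedding and coloring points by any known label is then enough to see whether the data fall into separate islands, as they did for the Swift bursts.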

Shining light on the data

The preparation of the feature space - the input you give the algorithm - was the most challenging part of the project, says Johann Bock Severin. Essentially, the students had to prepare the dataset in such a way that its most important features would stand out. "I like to compare it to hanging your data points from the ceiling in a dark room," explains Christian Kragh Jespersen. "Our main problem was to figure out from what direction we should shine light on the data to make the separations visible."

"Step 0 in understanding GRB's"

The students explored the t-SNE machine-learning algorithm as part of their first-year project, a first-year course in the physics bachelor's program. "By the time we got to the end of the course, it was clear we had quite a significant result," their supervisor Charles Steinhardt says. The students' t-SNE mapping cleanly divides all GRBs from the Swift observatory into two groups. Importantly, it classifies GRBs that were previously difficult to classify. "This essentially is step 0 in understanding GRBs," explains Steinhardt. "For the first time, we can confirm that shorter and longer GRBs are indeed completely separate things."

Without any prior theoretical background in astronomy, the students have discovered a key piece of the puzzle surrounding GRBs. From here, astronomers can start to develop models to identify the characteristics of these two separate classes.

Credit: 
University of Copenhagen

Lifetime discrimination and greater risk of high blood pressure in African Americans

Experiencing discrimination over a lifetime is associated with high blood pressure in African American adults, according to findings published this month in the journal Hypertension by researchers at the Urban Health Collaborative at Drexel University's Dornsife School of Public Health.

High blood pressure is linked with many life-threatening conditions, including stroke, heart disease and dementia, and is also associated with higher risk of severe illness from COVID-19. This connection between poor heart health and higher risk of severe symptoms suggests the findings may also provide insights into current racial disparities in patient outcomes during the pandemic.

The authors used survey responses from 1,845 African American adults living in Mississippi who participated in the Jackson Heart Study (and did not have high blood pressure at the start of the study). Those who had a systolic blood pressure of 140 mm Hg or higher, a diastolic blood pressure higher than 90 mm Hg, and/or were taking medicine to manage their blood pressure at any follow-up examination were considered to have developed hypertension.
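The study's case definition is a simple rule over each follow-up exam; a sketch of that rule (the function and argument names are illustrative, not from the study's dataset):

```python
def developed_hypertension(systolic_mmhg: float, diastolic_mmhg: float,
                           on_bp_medication: bool) -> bool:
    """Study definition at any follow-up exam: systolic >= 140 mm Hg,
    diastolic > 90 mm Hg, and/or taking blood pressure medication."""
    return systolic_mmhg >= 140 or diastolic_mmhg > 90 or on_bp_medication

print(developed_hypertension(138, 85, False))  # False: below both cutoffs, no meds
print(developed_hypertension(142, 85, False))  # True: systolic at/above 140
print(developed_hypertension(120, 80, True))   # True: on medication
```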

Participants had a baseline visit between 2000 and 2004 and two follow-up visits - one in 2005-2008 and the other in 2009-2013. At baseline, participants reported previous experiences of discrimination via survey. The researchers measured lifetime discrimination by counting the number of nine domains -- such as school/training, getting a job or housing, or at work -- in which unfair treatment was reported.

After adjusting for gender, age, socioeconomic status and other high blood pressure risk factors, the team found that individuals reporting medium levels of lifetime discrimination (one to two domains) had a 49 percent increased risk for hypertension, and those reporting high levels (three to nine domains) a 34 percent increased risk, compared with those who reported low levels (zero domains).
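The grouping and the adjusted risk increases reported above can be written out compactly; a sketch (the cutoffs and percentages come from the paragraph above; the function name is my own):

```python
def discrimination_level(domains_reported: int) -> str:
    """Categorize lifetime discrimination by how many of the nine
    domains involved reported unfair treatment."""
    if domains_reported == 0:
        return "low"
    if domains_reported <= 2:
        return "medium"
    return "high"

# Adjusted increase in hypertension risk relative to the low group.
risk_increase = {"low": 0.0, "medium": 0.49, "high": 0.34}

for n in (0, 2, 5):
    level = discrimination_level(n)
    print(f"{n} domains -> {level} (+{risk_increase[level]:.0%} risk)")
```

Note the non-monotonic pattern, visible in the table: the medium group's reported increase (49 percent) exceeds the high group's (34 percent).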

"Our findings in a large population show that the stress resulting from discrimination may have a major impact on the health of African Americans," said lead author Allana T. Forde, PhD, a postdoctoral research fellow at the Urban Health Collaborative.

The authors said the study could have implications for treating African American patients, but also renews attention towards the conditions in which people live and the significant influence of environment on health.

"Structural racism affects health in many ways. The experiences of discrimination investigated in this study represent only one of the many ways in which racism has measurable health consequences," said senior author and Urban Health Collaborative Director Ana Diez Roux, MD, PhD, dean and distinguished professor of Epidemiology at Dornsife School of Public Health. "Addressing racism is critical to promoting health and achieving health equity."

High blood pressure -- that is, 130/80 or above according to current guidelines -- affects 103 million American adults, nearly half of all adults. More than 40 percent of African American adults suffer from high blood pressure.

Credit: 
Drexel University

Paper: Mundane behavioral decisions, actions can be 'misremembered' as done

image: Mundane behaviors such as taking a daily medication can eventually create false memories of completing the task, said Dolores Albarracin, a professor of psychology and marketing at Illinois and the director of the Social Action Lab.

Image: 
Photo by L. Brian Stauffer

CHAMPAIGN, Ill. -- There's a reason why people often forget to take a daily medication or respond to that email they've been meaning to send, and it can be chalked up to the gulf between intention and actually completing an action, according to new research co-written by a University of Illinois at Urbana-Champaign expert who studies social psychology.

Mundane behaviors that are repeated over time and occur in the context of many other similar behaviors can lead people to conflate intentions and behaviors and create false memories of completing the task, said Dolores Albarracin, a professor of psychology and marketing at Illinois and the director of the Social Action Lab.

"Intentions and making plans typically improve task execution. We need them to function in society, to realize our goals and to get along with others," she said. "But when we form an intention in the moment such as 'I'm going to sign that form now,' and it's an activity we routinely perform, we want to complete the task when we form the intention. Otherwise, we don't actually sign the form. And the reason why is because the thought of wanting to sign the form can be misremembered as actually having signed it, in which case we'd be better off not having formed the intention to sign the form in the first place."

Across five studies, Albarracin and her co-authors investigated the previously unrecognized phenomenon of remembering having enacted a mundane behavioral decision when one only intended to do so, as well as its psychological mechanisms.

"Our aim was to develop a lab-analog procedure entailing relatively simple, repetitive and similar behavioral decisions to create the conditions hypothesized to produce high levels of error," Albarracin said.

Participants chose job candidates and either acted on the decision to hire them, generated an intention to hire them later or made a judgment that was irrelevant to the behavior.

Following a delay, participants were asked to report whether they had acted on the decision or simply intended to do so for each person they had seen.

"The methodology was carefully crafted to produce the necessary high level of errors we were studying, to keep irrelevant characteristics constant across conditions, and to systematically manipulate enactment versus intention," said Albarracin, also a professor of business administration at Illinois. "If intentions play a causal role in producing misreports of behavior, misreports should be more common in the intention than the control condition."

The first two experiments showed misreports and subsequent performance errors even when controlling for guessing. Experiments three and four demonstrated greater confusion when the physical involvement and mental criteria for intention and behavior were similar. And the fifth experiment indicated that monitoring whether one has acted on a decision is highly effective at reducing errors and more effective than monitoring intention.

"Our results highlight that behaviors will look to be more consistent with intentions when the behavior is routine," she said. "The finding implies we should be more aware of the potential for error in these similarly trivial behaviors."

The paper has implications for health care settings and any other situation where self-reporting of following through on an action is critical, Albarracin said.

"The fulfillment of routine, repeated behaviors can have meaningful consequences, and are part of, if not central to, many practical contexts," Albarracin said. "More generally, understanding the complexity of the intention-behavior link and the possible unexpected effects of intention formation is essential to promote beneficial behaviors in many domains, ranging from financial decisions to a person's health."

Credit: 
University of Illinois at Urbana-Champaign, News Bureau

Research underscores importance of global surveillance of plant pathogens

image: Current distribution of bacterial leaf streak of corn.

Image: 
Mary Ortiz-Castro, Terra Hartman, Teresa Coutinho, Jillian M. Lang, Kevin Korus, Jan E. Leach, Tamra Jackson-Ziems, and Kirk Broders

First spotted in the United States in 2014, bacterial leaf streak of corn is an emerging disease of corn that has now spread to ten states, including the top three corn-producing states of Illinois, Iowa, and Nebraska.

"I can remember vividly walking into multiple corn fields in western Nebraska with an extension agent in the summer of 2016. Bacterial leaf streak could be seen in almost every field, and in several of these fields greater than 80% of plants were infected," said Kirk Broders, a plant pathologist at Colorado State University. "The scale and potential severity of this disease were evident, and it was clear to all involved that day we needed to respond quickly."

While bacterial leaf streak of corn has been a known threat in South Africa for over 70 years, it is very new to both North and South America.

"The only scientific information pertaining to this disease on maize came from work done in South Africa, which primarily investigated host range on other African crops, such as sugarcane and banana," Broders explained. "As a result, when the disease was first reported in the U.S. in 2014 and rapidly spread throughout corn-growing regions, the research community was left with very few answers regarding the basic biology of this disease."

In response, Broders and his colleagues set out to synthesize all the current research on bacterial leaf streak of corn conducted by an international group of researchers over the last five years following the disease's emergence in North and South America. Published in Phytopathology, their article summarizes the spread of the bacteria from South Africa, how it infects its host, what plant tissues it can infect, and where initial inoculum comes from at the beginning of each crop season.

"The most surprising discovery was finding that the bacteria was also present and widespread in Argentina," Broders said. "And that the strains of the bacteria in Argentina and the US are carrying a piece of DNA that was likely incorporated from another bacterial pathogen of sorghum through horizontal gene transfer."

The story of bacterial leaf streak of corn also emphasizes another important point: in an increasingly globalized world, novel plant pathogens can more easily spread and evolve, a fact that underscores the importance of a rapid collaborative response to limit crop loss. Broders hopes that this work will encourage this kind of collaboration and demonstrate the growing importance of global surveillance of plant pathogens.

Read more in "Current Understanding of the History, Global Spread, Ecology, Evolution, and Management of the Corn Bacterial Leaf Streak Pathogen, Xanthomonas vasicola pv. Vasculorum," published in the June issue of Phytopathology, which features a Pathogen Spotlight series on bacterial leaf streak of corn containing seven articles.

Credit: 
American Phytopathological Society

Study shows how traumatic experiences can leave their mark on a person's eyes

New research by Welsh academics shows that a patient's pupils can reveal if they have suffered a traumatic experience in the past.

Post-traumatic stress disorder can occur when a person has experienced a traumatic event such as a car crash, combat stress, or abuse. Sufferers can be left with a greater sensitivity, or hyperarousal, to everyday events and an inability to switch off and relax.

The research, led by Dr Aimee McKinnon at Cardiff University and published in the journal Biological Psychology, looked for traces of these traumatic events in the eyes of patients suffering from PTSD. The team measured participants' pupils while they were shown threatening images, such as vicious animals or weapons, as well as neutral and even pleasant images.

The response of people with PTSD was different to other people, including people who had been traumatised but did not have PTSD.

At first, the pupils of people with PTSD failed to show the normal sharp constriction caused by changes in light level - but they then dilated more in response to the emotional stimuli than did the pupils of the other participants.

Another unexpected result was that the pupils of the patients with PTSD showed an exaggerated response not only to threatening stimuli but also to "positive" images, such as exciting sports scenes.

Swansea University's Professor Nicola Gray, who co-authored the paper along with Professor Robert Snowden of Cardiff University, believes this is an important finding.

She said: "This shows that the hyper-response of the pupil is in response to any arousing stimulus, and not just threatening ones. This may allow us to use these positive pictures in therapy, rather than relying upon negative images, that can be quite upsetting to the patient, and therefore make therapy more acceptable and bearable. This idea now needs testing empirically before it is put into clinical practice."

Dr McKinnon, who is now at Oxford University, added: "These findings allow us to understand that people with PTSD are automatically primed for threat and fear responses in any uncertain emotional context, and to consider what a burden this must be to them in everyday life.

"It also suggests that it is important for us to recognise that, in therapy, it is not just the fear-based stimuli that need deliberately re-appraising.

"If someone with PTSD is faced with any high-level of emotional stimulation, even if this is positive emotion, it can immediately trigger the threat system. Clinicians need to understand this impact of positive stimuli in order to support their service-users overcome the significant challenges they face."

Credit: 
Swansea University

Synapse-saving proteins discovered, opening possibilities in Alzheimer's, schizophrenia

image: Gek-Ming Sia, PhD, and colleagues in the Long School of Medicine at UT Health San Antonio discovered a new class of proteins that spare synapses from elimination. Increasing the numbers of these proteins could be a therapeutic target in treating Alzheimer's disease and schizophrenia.

Image: 
UT Health San Antonio

Researchers at The University of Texas Health Science Center at San Antonio (UT Health San Antonio) have discovered a new class of proteins that protect synapses from being destroyed. Synapses are the structures where electrical impulses pass from one neuron to another.

The discovery, published July 13 in the journal Nature Neuroscience, has implications for Alzheimer’s disease and schizophrenia. If the findings hold, increasing the number of these protective proteins could be a novel therapy for the management of those diseases, researchers said.

In Alzheimer’s disease, loss of synapses leads to memory problems and other clinical symptoms. In schizophrenia, synapse losses during development predispose an individual to the disorder.

“We are studying an immune system pathway in the brain that is responsible for eliminating excess synapses; this is called the complement system,” said Gek-Ming Sia, PhD, assistant professor of pharmacology in UT Health San Antonio’s Long School of Medicine and senior author of the research.

“Complement system proteins are deposited onto synapses,” Dr. Sia explained. “They act as signals that invite immune cells called macrophages to come and eat excess synapses during development. We discovered proteins that inhibit this function and essentially act as ‘don’t eat me’ signals to protect synapses from elimination.”

The system sometimes goes awry

During development, synapses are overproduced. Humans have the most synapses at the ages of 12 to 16, and from then to about age 20, there is net synapse elimination that is a normal part of the brain’s maturation. This process requires the complement system.

In adults, synapse numbers are stable, as synapse elimination and formation balance out. But in certain neurological diseases, the brain somehow is injured and begins to overproduce complement proteins, which leads to excessive synapse loss.

“This occurs most notably in Alzheimer’s disease,” Dr. Sia said.

In mouse models of Alzheimer’s disease, researchers have found that the removal of complement proteins from the brain protects it from neurodegeneration, he said.

“We’ve known about the complement proteins, but there was no data to show that there were actually any complement inhibitors in the brain,” Dr. Sia said. “We discovered for the first time that there are, that they affect complement activation in the brain, and that they protect synapses against complement activation.”

Future directions

Dr. Sia and his colleagues will seek to answer interesting questions, including:

Whether complement system biology can explain why some people are more resistant and more resilient against certain psychiatric disorders;
How the number of complement inhibitors can be changed and whether that could have clinical ramifications;
Whether different neurons produce different complement inhibitors, each protecting a certain subset of synapses.

Regarding the last question, Dr. Sia said:

“This could explain why, in certain diseases, there is preferential loss of certain synapses. It could also explain why some people are more susceptible to synapse loss because they have lower levels of certain complement inhibitors.”

The researchers focused on a neuronal complement inhibitor called SRPX2. The studies are being conducted in mice that lack the SRPX2 gene and, as a result, demonstrate complement system overactivation and excessive synapse loss.

Acknowledgments

This project is funded by a NARSAD Young Investigator Award from the Brain and Behavior Research Foundation, a grant from the William and Ella Owens Medical Research Foundation, a Rising STARs Award from The University of Texas System, and grants from two branches of the U.S. National Institutes of Health – the National Institute of Neurological Disorders and Stroke, and the National Institute on Deafness and Other Communication Disorders.

The endogenous neuronal complement inhibitor SRPX2 protects against complement-mediated synapse elimination during development

Qifei Cong, Breeanne M. Soteros, Mackenna Wollet, Jun Hee Kim and Gek-Ming Sia

First published: July 13, 2020, Nature Neuroscience

https://doi.org/10.1038/s41593-020-0672-0

The Long School of Medicine at The University of Texas Health Science Center at San Antonio is named for Texas philanthropists Joe R. and Teresa Lozano Long. The school is the largest educator of physicians in South Texas, many of whom remain in San Antonio and the region to practice medicine. The school teaches more than 900 students and trains 800 residents each year. As a beacon of multicultural sensitivity, the school annually exceeds the national medical school average of Hispanic students enrolled. The school’s clinical practice is the largest multidisciplinary medical group in South Texas with 850 physicians in more than 100 specialties. The school has a highly productive research enterprise where world leaders in Alzheimer’s disease, diabetes, cancer, aging, heart disease, kidney disease and many other fields are translating molecular discoveries into new therapies. The Long School of Medicine is home to a National Cancer Institute-designated cancer center known for prolific clinical trials and drug development programs, as well as a world-renowned center for aging and related diseases.

The University of Texas Health Science Center at San Antonio, also referred to as UT Health San Antonio, is one of the country’s leading health sciences universities and is designated as a Hispanic-Serving Institution by the U.S. Department of Education. With missions of teaching, research, patient care and community engagement, its schools of medicine, nursing, dentistry, health professions and graduate biomedical sciences have graduated more than 37,000 alumni who are leading change, advancing their fields, and renewing hope for patients and their families throughout South Texas and the world. To learn about the many ways “We make lives better®,” visit www.uthscsa.edu.

Stay connected with UT Health San Antonio on Facebook, Twitter, LinkedIn, Instagram and YouTube.

Journal

Nature Neuroscience

DOI

10.1038/s41593-020-0672-0

Credit: 
University of Texas Health Science Center at San Antonio

Uplifting of Columbia River basalts opens window on how region was sculpted

image: Joseph Canyon, 610 meters deep, in northeast Oregon, was formed by the Columbia River Flood Basalts some 15 million years ago. The photo was taken from the Joseph Canyon Viewpoint, off Oregon Highway 3, north of Enterprise.

Image: 
Image courtesy of Leif Karlstrom

EUGENE, Ore. - July 17, 2020 - Uplifting of Columbia River basalts has allowed University of Oregon researchers to better understand how magma shaped the region 14-16 million years ago and why greenhouse gases released during a series of volcanic eruptions did not trigger a global extinction event.

The insights, published in Scientific Reports, were drawn from analyses of oxygen and hydrogen isotopes in crustal material, a mix of magma and original rocks, that is now exposed by geological uplifting and erosion.

The Columbia River Flood Basalts represent the youngest continental flood basalt province on Earth and one of the best preserved. The province covers roughly 210,000 square kilometers, extending from eastern Oregon and Washington to western Idaho and part of northern Nevada.

Pivotal to the research were 27 samples from 22 different dikes - wall-like bodies of magma that cut through the sheeted lava flow landscape during the eruptions. A 10-meter-thick feeder dike into the Wallowa batholith, formed from a mix of basaltic magma and granite 16 million years ago, for example, likely acted as a magma conduit for up to seven years. It formed one of the largest surface lava flows and chemically altered about 100 meters of surrounding bedrock.

"We found that when hot basaltic magma intruded into the crust it boiled groundwater and volatilized everything in and near its path, causing chemical and isotopic changes in the rocks and the release of greenhouse gases," said Ilya Bindeman, a professor in the Department of Earth Sciences, who led the study.

Collectively, the effects of the heating throughout the flood-basalt region may have lasted about 150 years after magma stopped flowing, building the landscape that is visible today across the region, the seven-member research team from three countries concluded.

"The Columbia River basalts that are so dear to us in the Pacific Northwest," Bindeman said. "They are now uplifted and eroded to the level that allows us to sample the contacts of these basalts with the previous rocks. The same process today is happening every hour and everywhere under midocean ridges and also on continents. By studying these not-so-ancient rocks we have learned what is going on under our feet."

Computer modeling done with the chemical data suggests that hydrothermal heating of the region's original metasedimentary rocks - metamorphic rocks formed through the deposition and solidification of sediment - and of the relatively low levels of organic matter affected by the eruptions would have generated the release of about 18 gigatons of carbon dioxide and methane. One gigaton equals a billion metric tons.

The individual Columbia River basalt eruptions were each 10 to 100 times larger than Iceland's Eldgja and Laki eruptions of 934 and 1783, respectively, the largest in recorded history, noted study co-author Leif Karlstrom, a professor of earth sciences, who along with Bindeman is a member of the Oregon Center for Volcanology.

The Laki eruption, which killed thousands of people, released volcanically derived greenhouse gases that generated a year without summer followed by a warm year across Europe and North America, Karlstrom said.

While the Columbia River eruptions released 210,000 cubic kilometers of basaltic magma over 1.5 million years, leading to global climate impacts, the researchers concluded, they did not cause mass extinctions such as the one triggered by eruptions over a similar timescale about 250 million years ago that formed the Siberian Traps.

The difference, the research team theorized, is in the geology of the regions. The Columbia River basalt eruptions occurred in igneous crust that contained low levels of organic matter that could be released by hydrothermal heating. Eruptions in the Siberian Traps occurred in organics-rich sedimentary rocks.

While the new findings suggest that similar regional-scale groundwater circulation around dikes is a signature of flood basalt volcanism globally, the researchers noted, the consequences may not always be catastrophic on a broad scale.

In the Columbia River basalts and the likely related Yellowstone hotspot, hydrothermal circulation is manifested as subtle isotopic signals, a depletion of oxygen isotopes, in the rocks, the research team found.

Credit: 
University of Oregon

Coordinated exit strategies crucial to avoid virus second-wave in Europe

Research by the University of Southampton shows European countries need to work together when lifting lockdown measures, to prevent COVID-19 cases rising again on the continent.

A study by WorldPop, experts in population mapping, has found any resurgence of the virus would be brought forward by up to five weeks if well-connected countries prematurely ended their non-pharmaceutical interventions (NPIs), such as social distancing and self-isolation, without coordinating their efforts. This would leave less time to expand testing programmes and develop new treatments or vaccines. Detailed findings are published in the journal Science.

Lead author of the study, Dr Nick Ruktanonchai comments: "Our study shows the timing of any second epidemic across Europe depends on the actions of countries that are populous, well-connected and currently have strong interventions in place. The uncoordinated easing of NPIs can lead to much earlier secondary epidemics, while coordination can mean much higher likelihoods of eliminating all local cases."

Director of WorldPop, Professor Andy Tatem, says: "Intergovernmental organisations, such as the World Health Organization, have stressed the importance of international solidarity to share resources and expertise to combat COVID-19. Our results underline this and suggest that coordination between countries removing lockdown measures is vital. One country ending NPIs before others could lead to an accelerated resurgence of the disease."

The researchers used anonymised Vodafone mobile phone data and a Google mobility dataset to provide information on trends of population movement. They combined this with publicly available COVID-19 infection data. Using a sophisticated model, the team ran multiple exit strategy scenarios - each estimating the effect of relaxing different lockdown measures in different country combinations among 35 European countries, to examine how this affected virus spread in Europe over a six month period (April 2020 onwards).

The researchers concluded that if countries work together, it could greatly improve the likelihood of ending community transmission of COVID-19 throughout the continent. In particular, they showed that synchronizing intermittent lockdowns across countries would lead to half as many lockdown periods being necessary to achieve an end to transmission of the virus among people in Europe.

Across 1200 simulations, the researchers found that if countries synchronised implementation and relaxation of NPIs, an end to community transmission (over the six-month period) was always the most likely outcome. If this was achieved, it would shift the emphasis to testing, tracing and quarantining cases coming into the region from elsewhere.
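The advantage of synchronization can be illustrated with a toy model. The sketch below is not the WorldPop model, which used Vodafone and Google mobility data across 35 countries; it is a minimal two-region SIR simulation in which the transmission rates, recovery rate, cross-border mixing fraction, and three-week cycle length are all assumed values chosen for illustration:

```python
import numpy as np

def simulate(locked_a, locked_b, days=126, dt=0.1):
    """Toy two-region SIR model; locked_*(t) -> True when that region
    is under lockdown on day t. All parameters are illustrative."""
    gamma = 1 / 7                        # recovery rate (~1-week infectious period)
    beta_open, beta_lock = 0.30, 0.05    # assumed transmission rates
    mix = 0.01                           # fraction of contacts with the other region
    S = np.array([1.0, 1.0])             # susceptible fractions
    I = np.array([1e-4, 1e-4])           # infected fractions
    prevalence = []
    for step in range(int(days / dt)):
        t = step * dt
        beta = np.array([beta_lock if locked_a(t) else beta_open,
                         beta_lock if locked_b(t) else beta_open])
        lam = beta * ((1 - mix) * I + mix * I[::-1])  # force of infection
        S, I = S - lam * S * dt, I + (lam * S - gamma * I) * dt
        prevalence.append(I.sum())
    return np.array(prevalence)

cycle = 21  # three-week lockdown/relaxation phases
sync = simulate(lambda t: t // cycle % 2 == 0, lambda t: t // cycle % 2 == 0)
stag = simulate(lambda t: t // cycle % 2 == 0, lambda t: t // cycle % 2 == 1)
# Synchronized cycles drive total prevalence into a far deeper trough; in a
# staggered schedule the open region keeps reseeding the locked-down one.
print(f"min prevalence, synchronized: {sync.min():.2e}, staggered: {stag.min():.2e}")
```

In a stochastic version of such a model, the deep trough produced by synchronized lockdowns is what translates into a high probability of local elimination, which is the effect the study quantifies.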

The study also showed that certain countries have a higher potential to cause a resurgence of COVID-19 than others. France, Germany, Italy, Poland and the UK were all identified as posing a greater risk of triggering a new wave of infection.

Furthermore, the way in which each country would contribute to any resurgence varies. For example, Germany was shown as most likely to spark epidemics in neighbouring countries, whereas virus spread from France tends towards adversely affecting main population centres continent-wide. This suggests the most effective interventions may depend on the country considered. For instance, airport closures might be more useful for France, while limits on local travel may be more effective for Germany.

The researchers believe their approach could be used to study virus resurgence beyond Europe and plan to undertake future work to examine the effect of coordinated relaxation of NPIs at a global level.

Credit: 
University of Southampton

COVID-19: Viral shutdown of protein synthesis

Researchers from Munich and Ulm have determined how the pandemic coronavirus SARS-CoV-2 inhibits the synthesis of proteins in infected cells and shown that it effectively disarms the body's innate immune system. 

Although its name is relatively unspecific and indeed opaque, the Nonstructural Protein 1 (Nsp1) encoded by the coronavirus SARS-CoV-2, which is responsible for the current pandemic, has now been shown to have a devastating effect on host cells. Nsp1 is in fact one of the central weapons used by the virus to ensure its own replication and propagation in human hosts. Nsp1 was identified as a virulence factor following the outbreak of the related SARS coronavirus nearly 20 years ago, when it was shown to inhibit protein synthesis in infected cells. Now researchers based at Ludwig-Maximilians-Universitaet (LMU) in Munich and Ulm University Hospital have discovered what makes Nsp1 so potent. In a paper which appears in the journal Science, they describe the protein's mode of action in detail.

In all biological cells, the task of synthesizing proteins is performed by complex molecular machines known as ribosomes. Ribosomes interact with messenger RNAs (mRNAs), which serve as blueprints for protein synthesis, and translate the nucleotide sequence of each mRNA into the amino-acid sequence of the corresponding protein. The amino-acid sequence in turn determines the shape and biological function of each individual protein. Ribosomes consist of two distinct subunits, and Nsp1 binds to the smaller one - the 40S subunit. The mRNA initially binds to the small subunit, prior to the latter's interaction with the 60S subunit to form the cavity through which the mRNA is then threaded. The new study shows that one end of the Nsp1 protein interacts with the 40S subunit in such a way that it prevents binding of the mRNA. With the aid of high-resolution cryo-electron microscopy, Professor Roland Beckmann and his colleagues at the LMU Gene Center have shown in three-dimensional detail how Nsp1 binds tightly to a specific pocket in the small ribosomal subunit and inhibits the formation of functional ribosomes. Further experiments revealed that Nsp1 can also interact with specific configurational states of the fully assembled ribosome.

In addition, the team led by Konstantin Sparrer at Ulm University Hospital was able to show that the shutdown of protein synthesis leads to an almost complete collapse of one of the body's major lines of defense against the virus. Nsp1 inactivates the innate immune response by inhibiting a vital signaling cascade. The authors of the study hope that the insights gained will make it possible to find ways to neutralize the novel coronavirus, and thus mitigate the severity of the respiratory disease that it causes. One potential approach, they say, would be to develop a molecule that masks the viral protein's binding site. This should be feasible, since the Nsp1-binding pocket itself appears not to have an essential role in ribosomal function.

Credit: 
Ludwig-Maximilians-Universität München

Perspective: T cell responses to COVID-19 are a crucial target for research

While early research on the adaptive immune response to COVID-19 primarily looked at antibodies, more information is now emerging on how T cells react to the SARS-CoV-2 virus - addressing a crucial knowledge gap, say Daniel Altmann and Rosemary Boyton in a new Perspective. While antibody responses are generally much easier to study, T cells are known to play a more important role in protecting the body against viral infections. In the context of COVID-19, "antibody responses appear short-lived and T cell memory is potentially more durable," Altmann and Boyton say, leading them to argue that "it's time to admit that we really need the T cell data too." Seeking to assess the current state of knowledge on how T cells respond to the SARS-CoV-2 virus, the authors selected 9 studies - some published in peer-reviewed journals, while others are still under review and available as preprints - and summarized key takeaways and emerging points of consensus. "To fully understand population level immunity, screening for both antibody and T cell immunity using standardized testing methods would be beneficial," Altmann and Boyton conclude, noting that standardized tests to measure T cell immunity to SARS-CoV-2 could be designed using methods in common with established tests for T cell immunity to Mycobacterium tuberculosis.

Credit: 
American Association for the Advancement of Science (AAAS)

Coordination helps avoid continental COVID-19 resurgence, European modeling study shows

Coordinated lockdown strategies among countries are key to preventing resurgent COVID-19 outbreaks in continental Europe, a new modeling study shows. A continental epidemic could occur as many as five weeks earlier when well-connected countries with stringent existing interventions end their interventions prematurely, the study's authors say. As rates of new COVID-19 cases begin to decline in countries around the world, governments are considering how to ease restrictions without disease resurgence. This includes lifting non-pharmaceutical interventions such as social distancing policies and lockdown measures in a coordinated way that prevents international travel from causing resurgent epidemics. While such efforts are generally considered important, evidence informing how countries should implement them, and of their potential benefits, is lacking. To better quantify the value of coordinated exit strategies, Nick Ruktanonchai and colleagues took advantage of the way data from mobile phones can inform contact rates between people and the effect of interventions on human mobility. With data provided by Vodafone and Google, they modelled the spread of COVID-19 under different coordinated and uncoordinated exit strategy scenarios using an openly available epidemiological model of COVID-19 transmission. Across 1200 simulations, they found that synchronized cycles of interventions were always more likely to end community transmission. In the most striking example, synchronizing four cycles of three-week-long lockdowns led to local elimination of COVID-19 cases in 90% of simulations, while unsynchronized cycles only led to elimination 5% of the time. Importantly, say the authors, their simulations do not include any importation from other regions of the globe.
"The implications of our study extend well beyond Europe and COVID-19," say Ruktanonchai and colleagues, "broadly demonstrating the importance of communities coordinating easing of various non-pharmaceutical interventions for any potential pandemic."

Credit: 
American Association for the Advancement of Science (AAAS)

A call to arms: Enlisting private land owners in conservation

image: People recreating in Yellowstone National Park, the world's first protected area. Protected areas like Yellowstone are invaluable, but they are not enough to adequately conserve endangered species in the United States.

Image: 
Edward Hammill/Utah State University

In 1872 the United States created Yellowstone, the first National Park in the world. Since then many more parks, monuments, preserves, wildernesses and other protected areas have been created in the USA. Protected areas, like Yellowstone, are invaluable, but are they actually effective at preserving endangered species? And if not, how can future protected areas do better?

A team of ecologists at Utah State University published a study in Scientific Reports to answer these questions. They used computer models to determine if protected areas in the USA preserve enough land inhabited by endangered species to adequately ensure their future survival in the wild. As it is, the situation is problematic:

Of the 159 endangered mammal, bird, reptile and amphibian species in the continental USA, only 21 are adequately preserved by existing protected areas.

Creating new protected areas on public land is fraught with obstacles. Many protected areas are designated based on scenic beauty or lack of agricultural value and these criteria don't necessarily benefit at-risk species.

Unfavorable political climates can also present problems. Trisha Atwood, Assistant Professor in the Department of Watershed Sciences and Ecology Center, and study co-author explained, "There has been a huge political push in the USA to reduce protected areas such as National Monuments. However, our results suggest that we not only need to increase the spatial coverage of protected areas in the USA, but we also need to ensure that we are protecting the places that contain critical habitat for endangered species."

Another obstacle is the limited availability of public land. For example, in the state of Texas, 95% of the land is privately owned. And according to the study, even if all federal and state public lands were given protected area status, more than half of the at-risk species in the USA would still be in danger of extinction.
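The kind of gap analysis the study describes can be sketched in a few lines. Everything below is invented for illustration: the species, range sizes, protected areas, and the sliding-scale representation target (a common convention in conservation gap analyses, not necessarily this study's exact rule). Real analyses, including this one, work from mapped range and protected-area data:

```python
# Hypothetical gap analysis: a species counts as "adequately protected" only
# when protected areas cover at least a range-size-dependent target fraction
# of its habitat. All figures below are made up.
species_range_km2 = {"ocelot": 4_000, "red wolf": 1_500, "indigo snake": 9_000}
protected_km2 = {"ocelot": 300, "red wolf": 1_500, "indigo snake": 900}

def representation_target(range_km2):
    """Sliding scale: species with tiny ranges need ~100% coverage,
    wide-ranging species proportionally less (floor of 10%)."""
    if range_km2 <= 1_000:
        return 1.0
    if range_km2 >= 250_000:
        return 0.1
    return 1.0 - 0.9 * (range_km2 - 1_000) / (250_000 - 1_000)

adequate = [s for s, r in species_range_km2.items()
            if protected_km2[s] >= representation_target(r) * r]
print(adequate)  # only the red wolf's (invented) range is covered enough
```

Run over real range maps, a tally like this is how a study can conclude that only 21 of 159 species meet their targets, and rerunning it with candidate easements added shows how much private land could close the gap.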

"We are not suggesting that protected areas are doing a bad job," said Edward Hammill, Assistant Professor in the Department of Watershed Sciences and Ecology Center and study co-author, "what we are suggesting is that there are many opportunities to increase protection."

One of those opportunities is the creation of conservation easements on private land. Conservation easements are voluntary, legal agreements that restrict future development on private land. In exchange for contributing to conservation efforts, land owners retain their property rights and can receive tax benefits. One of the most important findings from the study is that with the help of private land owners the USA has not lost the capacity to adequately protect 100% of its endangered species.

"It is unlikely that adequate conservation of endangered species will be achieved by increasing federal protected areas," said Hammill. "Our research highlights that private land owners represent an alternative route to achieving conservation goals."

Atwood concluded, "These findings give me hope that we can still make a change for the better. But, if we are going to win the fight against extinction we are going to need the help of private land owners."

Credit: 
S.J. & Jessie E. Quinney College of Natural Resources, Utah State University