Genetic factors in chronic versus episodic migraine

image: TRPV1 1911A>G SNP distribution in the control group and in EM and CM patients. The CM group showed a high frequency of the AA genotype and an absence of the GG genotype.

Image: 
Kazan Federal University

According to existing estimates, migraine is a highly prevalent ailment, affecting about 15 percent of the global population at one time or another. In Russia, the figure is as high as 20 percent. Current diagnostics and treatment are strictly clinical, i.e., based on a patient's complaints.

This research was conducted by KFU's Neurobiology Lab and Gene and Cell Technologies Lab. The team searched for genetic markers of migraine for about two years. Colleagues from Saint Petersburg and Kazan State Medical University also contributed.

"Chronic migraine is a much more serious disease than episodic migraine. Migraine is prone to becoming a chronic syndrome, so it's important to detect those who may become afflicted in the risk groups with episodic migraines. Such patients may be assigned prevention medications to avoid chronic migraine. Thanks to this particular research, we found genetic factors protecting from the chronification of migraines," says neurologist, Junior Research Associate Aliya Yakubova.

For the testing, 46 patients with migraine and 50 individuals without the disease were selected. They donated blood for DNA sequencing, during which the scientists identified polymorphisms in the gene encoding the pain receptor TRPV1. In this case, the replacement of nucleotide A by nucleotide G leads to an amino acid change.

At this stage, the pilot studies have been completed. Recruitment will continue, and patients with chronic or episodic migraine can take part.

"We will test the results on a larger selection of individuals. If our takeaways stand ground, we'd like to introduce them into clinical practice. If a patient has genotypes AA or AG, which don't prevent migraine chronicity, they can receive long-term prevention therapies. If they have the GG genotype, chronicity is unlikely, and only symptomatic therapy can be prescribed," continues Yakubova.

The work revealed significant differences in these polymorphisms between patients with chronic and episodic migraine. This means that some degree of genetic predisposition exists, with risk and protective factors for chronification associated with the three genotypes. The patenting process for the technique is currently underway.

Credit: 
Kazan Federal University

SMART researchers develop fast and efficient method to produce red blood cells

image: Laboratory set-up of microfluidic sorting and purification of cells during red blood cell culture and manufacturing

Image: 
Singapore-MIT Alliance for Research and Technology (SMART)

Singapore, 14 September 2020 - Researchers from Singapore-MIT Alliance for Research and Technology (SMART), MIT's research enterprise in Singapore, have discovered a new way to manufacture human red blood cells (RBCs) that cuts the culture time by half compared to existing methods and uses novel sorting and purification methods that are faster, more precise and less costly.

Blood transfusions save millions of lives every year, but over half the world's countries do not have sufficient blood supply to meet their needs. The ability to manufacture RBCs on demand, especially the universal donor blood (O+), would significantly benefit those in need of transfusion for conditions like leukemia by circumventing the need for large volume blood draws and difficult cell isolation processes.

Easier and faster manufacturing of RBCs would also have a significant impact on blood banks worldwide and reduce dependence on donor blood, which carries a higher risk of infection. It is also critical for research into diseases such as malaria, which affects over 220 million people annually, and can even enable new and improved cell therapies.

However, manufacturing RBCs is time-consuming and creates undesirable by-products, with current purification methods being costly and not optimal for large scale therapeutic applications. SMART's researchers have thus designed an optimised intermediary cryogenic storage protocol that reduces the cell culture time to 11 days post-thaw, eliminating the need for continuous 23-day blood manufacturing. This is aided by complementary technologies the team developed for highly efficient, low-cost RBC purification and more targeted sorting.

In a paper titled "Microfluidic label-free bioprocessing of human reticulocytes from erythroid culture" recently published in the prestigious journal Lab on a Chip, the researchers explain the huge technical advancements they have made towards improving RBC manufacturing. The study was carried out by researchers from two of SMART's Interdisciplinary Research Groups (IRGs) - Antimicrobial Resistance (AMR) and Critical Analytics for Manufacturing Personalised-Medicine (CAMP) - co-led by Principal Investigators Jongyoon Han, a Professor at MIT, and Peter Preiser, a Professor at NTU. The team also included AMR and CAMP IRG faculty appointed at the National University of Singapore (NUS) and Nanyang Technological University (NTU).

"Traditional methods for producing human RBCs usually require 23 days for the cells to grow, expand exponentially and finally mature into RBCs," says Dr Kerwin Kwek, lead author of the paper and Senior Postdoctoral Associate at SMART CAMP. "Our optimised protocol stores the cultured cells in liquid nitrogen on what would normally be Day 12 in the typical process, and upon demand thaws the cells and produces the RBCs within 11 days."

The researchers also developed novel purification and sorting methods by modifying existing Dean Flow Fractionation (DFF) and Deterministic Lateral Displacement (DLD) techniques: a trapezoidal cross-section microfluidic chip for DFF sorting, and a unique sorting system built around an inverse L-shaped pillar structure for DLD sorting.

SMART's new sorting and purification techniques using the modified DFF and DLD methods leverage the RBCs' size and deformability for purification instead of spherical size. As most human cells are deformable, this technique can have wide biological and clinical applications, such as cancer cell and immune cell sorting and diagnostics.

On testing, the purified RBCs were found to retain their cellular functionality, as demonstrated by high malaria parasite infectivity, which requires highly pure and healthy cells for infection. This confirms that SMART's new RBC sorting and purifying technologies are ideal for investigating malaria pathology.

Compared with conventional cell purification by fluorescence-activated cell sorting (FACS), SMART's enhanced DFF and DLD methods offer comparable purity while processing at least twice as many cells per second at less than a third of the cost. In scale-up manufacturing processes, DFF is more optimal for its high volumetric throughput, whereas in cases where cell purity is pivotal, DLD's high precision feature is most advantageous.

"Our novel sorting and purification methods result in significantly faster cell processing time and can be easily integrated into current cell manufacturing processes. The process also does not require a trained technician to perform sample handling procedures and is scalable for industrial production," Dr Kwek continues.

The results of their research would give scientists faster access to final cell products that are fully functional with high purity at a reduced cost of production.

Credit: 
Singapore-MIT Alliance for Research and Technology (SMART)

Early steroids improve outcomes in patients with septic shock

September 14, 2020 - Some critically ill patients with septic shock need medications called vasopressors to correct dangerously low blood pressure. When high doses of vasopressors are needed or blood pressure isn't responding well, the steroid hydrocortisone is often used. In this situation, earlier treatment with hydrocortisone reduces the risk of death and other adverse outcomes, reports a study in SHOCK®: Injury, Inflammation, and Sepsis: Laboratory and Clinical Approaches, Official Journal of the Shock Society. The journal is published in the Lippincott portfolio by Wolters Kluwer.

For critical care specialists, the study provides new evidence on the optimal timing of steroid treatment for patients with vasopressor-dependent septic shock. "If hydrocortisone is to be initiated in patients with septic shock, it should be initiated within at least the first 24 hours after shock onset, and ideally within the first 12 hours," according to the new research by Gretchen L. Sacha, PharmD, and colleagues of the Cleveland Clinic.

New Evidence on Timing of Hydrocortisone for Septic Shock
The study included 1,470 patients with septic shock treated with hydrocortisone at Cleveland Clinic ICUs between 2011 and 2017. All patients required vasopressors to maintain near-normal blood pressure after fluid resuscitation.

Based on the timing of hydrocortisone therapy, patients were divided into five groups. About 39 percent started hydrocortisone within 0 to 6 hours after shock onset. Other groups started hydrocortisone at 6 to 12 hours (about 16 percent of patients), 12 to 24 hours (18 percent), 24 to 48 hours (13 percent), or after 48 hours (15 percent).
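
A minimal sketch, assuming a simple table of times from shock onset to the first hydrocortisone dose, of how patients could be binned into the five timing groups described above; the column names and example values are assumptions for illustration, not the study's analysis code.

```python
# Illustrative binning of hydrocortisone start times into the five groups
# described in the study (0-6, 6-12, 12-24, 24-48, >48 hours after shock
# onset). The data frame and its contents are assumptions for illustration.
import pandas as pd

patients = pd.DataFrame({"hours_to_hydrocortisone": [2.5, 7.0, 13.5, 30.0, 55.0]})

bins = [0, 6, 12, 24, 48, float("inf")]
labels = ["0-6 h", "6-12 h", "12-24 h", "24-48 h", ">48 h"]
patients["timing_group"] = pd.cut(
    patients["hours_to_hydrocortisone"], bins=bins, labels=labels
)

print(patients)
print(patients["timing_group"].value_counts().sort_index())
```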

To determine the optimal timing of hydrocortisone initiation, the researchers compared time alive and off vasopressors - that is, with blood pressure within the target range - among the groups, along with other key outcomes. Patient characteristics varied between groups, including higher sepsis severity scores in patients who started hydrocortisone earlier.

"Despite being more critically ill at baseline, patients who received hydrocortisone earlier had better clinical outcomes when compared to patients who received hydrocortisone later after shock onset," Dr. Sacha and coauthors write. Patients starting steroids earlier not only had more days alive and off vasopressors but also had lower mortality rates. For example, risk of in-hospital death was 48.5 percent for patients initiating hydrocortisone at 0 to 6 hours versus 59.0 percent for those who started treatment after 48 hours.

After adjustment for severity of illness and other factors, earlier hydrocortisone therapy was still associated with increased vasopressor-free days. In this analysis, the odds of death in the ICU were 40 percent lower in patients receiving hydrocortisone within 0 to 6 hours, compared to those who started treatment beyond 48 hours.

Sepsis occurs when the immune system mounts an overwhelming inflammatory response to infection in the blood or elsewhere in the body. Septic shock is present in patients who develop a sharp drop in blood pressure and other metabolic abnormalities, with a risk of progression to organ failure.

Hydrocortisone is recommended for some patients with vasopressor-dependent septic shock, but there is ongoing debate over how and when it should be used. Some smaller studies have suggested hydrocortisone is more likely to be beneficial when started earlier after shock onset.

The new study, in a relatively large sample of patients, highlights the importance of early hydrocortisone therapy. "Timing of hydrocortisone initiation in patients with septic shock appears to be crucial and hydrocortisone should be started within the first 12 h after shock onset," Dr. Sacha and coauthors write.

In contrast, starting hydrocortisone after 24 hours does not appear to have any benefit. The researchers conclude, "Future randomized studies should focus on the timing of hydrocortisone initiation, ensuring initiation within the first 12 hours from shock onset."

Credit: 
Wolters Kluwer Health

COVID-19 patients with sleep apnoea could be at additional risk

Relationship between obstructive sleep apnoea and poorer outcomes from COVID-19 identified in systematic review of studies by University of Warwick

Researchers advise that people with obstructive sleep apnoea should take the necessary precautions to reduce their exposure and follow their treatment plan diligently

1.5 million people in the UK currently diagnosed with the condition, but up to 85% of people could be undiagnosed

Researchers call for better recording and more data on the condition

People who have been diagnosed with obstructive sleep apnoea could be at increased risk of adverse outcomes from COVID-19 according to a new study from the University of Warwick.

The conclusion is drawn from a systematic review of studies that reported outcomes for COVID-19 patients that were also diagnosed with obstructive sleep apnoea. Published in the journal Sleep Medicine Reviews, the review highlights the need to further investigate the impact of the virus on those with the sleep condition and to better identify those currently undiagnosed with it.

Obstructive sleep apnoea is a condition characterised by complete or partial blockage of the airways during sleep when the muscles there become weaker. It is commonly diagnosed in people who snore or appear to stop breathing or make choking sounds during sleep, and those who are obese are particularly likely to experience it. If you are told that you make strange noises when you sleep or seem to stop breathing during sleep, you should speak to your GP about being referred to a sleep service to be checked for the condition. You can also find more information about the condition here: http://www.sleep-apnoea-trust.org and https://www.hope2sleep.co.uk

Many of the risk factors and comorbidities associated with sleep apnoea, such as diabetes, obesity and hypertension, are similar to those associated with poor COVID-19 outcomes. However, the researchers wanted to investigate whether being diagnosed with obstructive sleep apnoea conferred an additional risk on top of those factors.

The systematic review looked at eighteen studies published up to June 2020 on obstructive sleep apnoea and COVID-19; of these, eight mainly addressed the risk of death from COVID-19 and ten addressed the diagnosis, treatment and management of sleep apnoea. Although few studies of obstructive sleep apnoea in COVID-19 had been performed at the time, there is evidence to suggest that many patients who presented to intensive care had obstructive sleep apnoea, and in diabetic patients it may confer an increased risk that is independent of other risk factors. In one large study of patients with diabetes who were hospitalised for COVID-19, those being treated for obstructive sleep apnoea had a 2.8 times greater risk of dying by the seventh day after hospital admission.

Researchers believe that in the UK up to 85% of obstructive sleep apnoea cases go undetected, suggesting that the 1.5 million people currently diagnosed with the condition may be just the tip of the iceberg. With obesity and other related risk factors on the increase, the researchers believe that rates of obstructive sleep apnoea are also rising. The review highlights that the pandemic has had worldwide effects on the ongoing diagnosis, management and treatment of patients with this and other sleep conditions. Moving forward, it may be necessary to explore new diagnosis and treatment pathways for these individuals.

Lead author of the study Dr Michelle Miller of Warwick Medical School said: "Without a clear picture of how many people have obstructive sleep apnoea it is difficult to determine exactly how many people with the condition may have experienced worse outcomes due to COVID-19.

"This condition is greatly underdiagnosed, and we don't know whether undiagnosed sleep apnoea confers an even greater risk or not.

"It is likely that COVID-19 increases oxidative stress and inflammation and has effects on the bradykinin pathways, all of which are also affected in obstructive sleep apnoea patients. When you have individuals in which these mechanisms are already affected, it wouldn't be surprising that COVID-19 affects them more strongly."

Treatment for obstructive sleep apnoea with continuous positive airway pressure (CPAP) has been shown to have some beneficial effects on these mechanisms, and it is important that treatment is optimised for these individuals. In the UK, the British Sleep Society, together with the OSA Alliance, has released guidance on the use of CPAP during the pandemic.

The researchers feel it is important that those diagnosed with obstructive sleep apnoea are aware of the potential additional risk and are taking appropriate precautions to reduce their exposure to the virus. Further research is required to determine whether these individuals need to be added to the list of vulnerable groups that may need to shield if transmission of virus increases.

Dr Miller adds: "This is a group of patients that should be more aware that obstructive sleep apnoea could be an additional risk if they get COVID-19. Make sure you are compliant with your treatment and take as many precautions as you can to reduce your risk, such as wearing a mask, social distancing and getting tested as soon as you notice any symptoms. Now more than ever is the time to follow your treatment plan as diligently as possible.

"Hospitals and doctors should also be recording whether their patients have obstructive sleep apnoea as a potential risk factor, and it should be included in studies and outcomes data for COVID-19. We need more data to determine whether this is something we should be more concerned about."

Credit: 
University of Warwick

Asthma patients given risky levels of steroid tablets

More than one quarter of asthma patients have been prescribed potentially dangerous amounts of steroid tablets, with researchers warning this puts them at greater risk of serious side-effects.

Researchers, led by University of Queensland Professor John Upham, analysed data from the Pharmaceutical Benefits Scheme (PBS) to find out how often Australians with asthma were taking repeated courses of steroid tablets.

Professor Upham said the study looked at more than 120,000 cases where asthma patients were given one or more prescriptions for steroid tablets by their doctor between 2014 and 2018.

"Researchers found more than 25 per cent of those patients were more likely to have a chronic condition," Professor Upham said.

"Short courses of steroid tablets can be effective at treating asthma attacks in the short term, but it's becoming clear that repeated use may cause significant long term side-effects like diabetes, osteoporosis and cataracts.

Around 2.5 million Australians have asthma, with the condition affecting more women than men.

Professor Upham said the best way to prevent asthma attacks was by regularly using preventer inhalers.

"Unfortunately, our study found half of asthma patients given repeated scripts for steroid tablets were not using inhalers as often as they should," he said.

"Better approaches are needed to educate and support asthma patients, and encourage them use preventer inhalers regularly.

"This is the best way to avoid or minimise the need for steroid tablets, and the side effects they can produce."

Credit: 
University of Queensland

When methane-eating microbes eat ammonia instead

Some microorganisms, the so-called methanotrophs, make a living by oxidizing methane (CH4) to carbon dioxide (CO2). Ammonia (NH3) is structurally very similar to methane, so methanotrophs also co-metabolize ammonia and produce nitrite. While this process has been observed in cell cultures, the underlying biochemical mechanism was not understood. Boran Kartal, head of the Microbial Physiology Group at the Max Planck Institute for Marine Microbiology in Bremen, Germany, and a group of scientists from Radboud University in Nijmegen, The Netherlands, now shed light on an exciting missing link in the process: the production of nitric oxide (NO).

Nitric oxide is a highly reactive and toxic molecule with fascinating and versatile roles in biology and atmospheric chemistry. It is a signaling molecule, a precursor of the potent greenhouse gas nitrous oxide (N2O), an ozone-depleting agent in the atmosphere, and a key intermediate in the global nitrogen cycle. It now turns out that NO is also key to the survival of methanotrophs that encounter ammonia in the environment - which they do more and more as fertilizer input into nature increases. When methanotrophs co-metabolize ammonia, they initially produce hydroxylamine, which inhibits other important metabolic processes, resulting in cell death. Thus, methanotrophs need to get rid of hydroxylamine as fast as possible. "Carrying a hydroxylamine-converting enzyme is a matter of life or death for methane-eating microbes," Kartal says.

For their study, Kartal and his colleagues used a methanotrophic bacterium named Methylacidiphilum fumariolicum, which originates from a volcanic mud pot, characterized by high temperatures and low pH, in the vicinity of Mount Vesuvius in Italy. "From this microbe, we purified a hydroxylamine oxidoreductase (mHAO) enzyme," reports Kartal. "Previously it was believed that the mHAO enzyme oxidizes hydroxylamine to nitrite in methanotrophs. We now showed that it actually rapidly produces NO." The mHAO enzyme is very similar to the one used by "actual" ammonia oxidizers, which is quite astonishing, as Kartal explains: "It is now clear that enzymatically there is not much difference between aerobic ammonia- and methane-oxidizing bacteria. Using essentially the same set of enzymes, methanotrophs can act as de facto ammonia oxidizers in the environment. Still, how these microbes oxidize NO further to nitrite remains unknown."

The adaptation of the mHAO enzyme to the hot volcanic mud pots is also intriguing, Kartal believes: "At the amino acid level, the mHAO and its counterpart from ammonia oxidizers are very similar, but the protein we isolated from M. fumariolicum thrives at temperatures up to 80 °C, almost 30 °C above the temperature optimum of their 'actual' ammonia-oxidizing relatives. Understanding how such similar enzymes have such different temperature optima and ranges will be very interesting to investigate."

According to Kartal, production of NO from ammonia has further implications for methane-eating microbes: "Currently there are no known methanotrophs that can make a living out of ammonia oxidation to nitrite via NO, but there could be methanotrophs out there that found a way to connect ammonia conversion to cell growth."

Credit: 
Max Planck Institute for Marine Microbiology

Proximity to the southern border and DUI arrests in California

A new study of DUI arrests in California from the Prevention Research Center of the Pacific Institute for Research and Evaluation shows that arrests increase as distance to the southern border decreases, and that this may be due to greater availability of alcohol in the border area.

The article examines trends and population-level correlates of drinking-driving arrests from 2005 to 2017 in California. Drawing on arrest data from the California Department of Justice, demographic and community data from the U.S. Census, measures of alcohol outlet density, and distance to the U.S.-Mexico border, the authors found that:

Arrest rates among women and men showed an upward trend until 2008 and decreased after that year

DUI arrest rates were greater among Hispanics than Whites for younger age groups: 18-29 and 30-39

DUI arrest rates were positively related to proximity to the California/Mexico border and:

a higher percent of bar/pub outlets;

a higher percent of Hispanic population;

a higher percent of population 18-29, 30-39, and 40-49 years of age;

a higher percent of US-born population;

a higher percent of population with annual income of $100,000 or more;

a higher percent of the population below 150% of the federal poverty line; and

a higher level of law enforcement activities.

Lead author Raul Caetano notes: "These analyses were based on 2.3 million arrest records over 18 years, providing very stable results. Although DUI arrest rates increased as distance to the border decreased, in communities with larger percentages of Hispanics, DUI rates increased as distance to the border increased. This shows that the effects of the increased alcohol and drug availability at the border are complex and influenced by community factors".
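
As a rough illustration of the interaction Caetano describes (arrest rates falling with distance overall, but rising with distance where the Hispanic share of the population is large), the sketch below fits an ordinary least squares model with a distance-by-percent-Hispanic interaction term on simulated data. All variable names and values are assumptions for illustration, not the study's data or code.

```python
# Simulated illustration of a distance x percent-Hispanic interaction in
# DUI arrest rates. The data are synthetic; only the shape of the
# relationship described in the quote above is being illustrated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
communities = pd.DataFrame({
    "distance_km": rng.uniform(0, 300, n),   # distance to the border
    "pct_hispanic": rng.uniform(0, 100, n),  # percent Hispanic population
})
# Arrest rate falls with distance overall, but the slope reverses where
# the Hispanic share of the population is large (positive interaction).
communities["dui_rate"] = (
    50
    - 0.10 * communities["distance_km"]
    + 0.002 * communities["distance_km"] * communities["pct_hispanic"]
    + rng.normal(0, 5, n)
)

model = smf.ols("dui_rate ~ distance_km * pct_hispanic", data=communities).fit()
print(model.params)  # the distance_km:pct_hispanic term captures the reversal
```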

Credit: 
Pacific Institute for Research and Evaluation

Measuring brainwaves while sleeping can tell if you should switch antidepressants

Scientists have discovered that measuring brainwaves produced during REM sleep can predict whether a patient will respond to treatment for depression. This enables patients to switch to a new treatment rather than continue an ineffective treatment (and the depression) for weeks without knowing the outcome.

As study leader Dr Thorsten Mikoteit said, "In real terms it means that patients, often in the depths of despair, might not need to wait weeks to see if their therapy is working before modifying their treatment". This work is presented at the ECNP Congress.

Around 7% of adults suffer from depression (also known as MDD, major depressive disorder) in any one year. It is a huge health burden, costing economies hundreds of billions of euros/dollars each year. Around 27 million Europeans and 17 million Americans suffer from MDD every year.

The standard treatment is antidepressants, normally selective serotonin reuptake inhibitors (SSRIs) such as fluoxetine (Prozac). However, these can take weeks or months to show an effect, meaning that patients often have to face the depths of their depression for several weeks before even knowing whether the treatment they are taking will work. Around 50% of sufferers don't respond to initial antidepressant treatment, which means that after four weeks of ineffective treatment, doctors have to change the treatment strategy and then wait another four weeks for a response. Being able to predict the response as early as one week into treatment would be of huge benefit to depressed patients and would shorten the time to an effective treatment.

A team led by Dr Thorsten Mikoteit of the University of Basel has conducted a randomised controlled trial on 37 patients with major depression. All were treated with antidepressants; 15 were assigned to the control group, while the remaining 22 had their details given to the psychiatrist in charge of treatment. All then had their brainwaves monitored during REM sleep (technically, this was a measurement of prefrontal theta cordance in REM sleep). The psychiatrists in charge of the treatment-group patients were instructed to interpret the brainwaves to see whether the treatment was working and, if not, to change the treatment. The overall aim was a 50% reduction in symptoms of depression, measured by the standard Hamilton Depression Rating Scale.

Doctors tested patients as early as one week after starting treatment, to see if the brainwaves indicated that the antidepressant treatment was likely to work. Those patients who were unlikely to have successful treatment were immediately switched to a different treatment. After 5 weeks it was found that 87.5% of these patients had an improved response, as opposed to just 20% in the control group.

Thorsten Mikoteit said:

"This is a pilot study, but nevertheless it shows fairly significant improvements. We have been able to show that by predicting the non-response to antidepressants we were able to adapt the treatment strategy more or less immediately: this enables us to significantly shorten the average duration between start of antidepressant treatment and response, which is vital especially for seriously depressed patients.

It needs to be repeated with a larger group of patients to make sure that the results are consistent. Patients need to be in a situation where their REM sleep can be monitored, so this requires more care than just giving the pill and waiting to see what happens. This means that the treatment monitoring will be more expensive, although we anticipate that will be offset by being able to give the right treatment much earlier. We are working on ways of streamlining this.

What it does mean is that we may be able to treat the most at-risk patients, for example those at risk of suicide, much quicker than we can currently do. If this is confirmed to be effective, it will save lives."

Commenting, Professor Catherine Harmer, University of Oxford and ECNP Executive Committee member, said:

"Most of the time, patients need to wait for around 4 weeks before they can tell if they are responding to a particular antidepressant or not. This is a hugely disabling and lengthy process and often a different treatment then needs to be started. The study results presented by Mikoteit are interesting and suggest that it may be possible to tell if a treatment is working much more quickly - even after a week of treatment - by using a physiological measure of response (REM sleeping pattern). If this is replicated in larger, blinded study then it would have enormous implications for the future treatment of individuals with depression".

Professor Harmer was not involved in this work; this is an independent comment.

Credit: 
European College of Neuropsychopharmacology

Pandemic spawns 'infodemic' in scientific literature

PITTSBURGH--The science community has responded to the COVID-19 pandemic with such a flurry of research studies that it is hard for anyone to digest them all, underscoring a long-standing need to make scientific publication more accessible, transparent and accountable, two artificial intelligence experts assert in a data science journal.

The rush to publish results has resulted in missteps, say Ganesh Mani, an investor, technology entrepreneur and adjunct faculty member in Carnegie Mellon University's Institute for Software Research, and Tom Hope, a post-doctoral researcher at the Allen Institute for AI. In an opinion article in today's issue of the journal Patterns, they argue that new policies and technologies are needed to ensure relevant, reliable information is properly recognized.

Those potential solutions include ways to combine human expertise with AI as one way to keep pace with a knowledge base that is expanding geometrically. AI might be used to summarize and collect research on a topic, while humans serve to curate the findings, for instance.

"Given the ever-increasing research volume, it will be hard for humans alone to keep pace," they write.

In the case of COVID-19 and other new diseases, "you have a tendency to rush things because the clinicians are asking for guidance in treating their patients," Mani said. Scientists certainly have responded - by mid-August, more than 8,000 preprints of scientific papers related to the novel coronavirus had been posted in online medical, biology and chemistry archives. Even more papers had been posted on such topics as quarantine-induced depression and the impact on climate change from decreased transportation emissions.

At the same time, the average time to perform peer review and publish new articles has shrunk; in the case of virology, the average dropped from 117 to 60 days.

This surge of information is what the World Health Organization calls an "infodemic" - an overabundance of information, ranging from accurate to demonstrably false. Not surprisingly, problems such as the hydroxychloroquine controversy have erupted as research has been rushed to publication and subsequently withdrawn.

"We're going to have that same conversation with vaccines," Mani predicted. "We're going to have a lot of debates."

Problems in scientific publication are nothing new, he said. As a grad student 30 years ago, he proposed an electronic archive for scientific literature that would better organize research and make it easier to find relevant information. Many ideas continue to circulate about how to improve scientific review and publication, but COVID-19 has exacerbated the situation.

Some of the speed bumps and guard rails that Mani and Hope propose are new policies. For instance, scientists usually emphasize experiments and therapies that work; highlighting negative results, on the other hand, is important for clinicians and discourages other scientists from going down the same blind alleys. Identifying the best reviewers, sharing review comments and linking papers to related papers, retraction sites or legal rulings are among other ideas they explore.

Greater use of AI to digest and consolidate research is a major focus. Previous attempts to use AI to do so have failed in part because of the often figurative and sometimes ambiguous language used by humans, Mani noted. It may be necessary to write two versions of research papers - one written in a way that draws the attention of people and another written in a boring, uniform style that is more understandable to machines.

Mani said he and Hope have no illusions that their paper will settle the debate about improving scientific literature, but hope that it will spur changes in time for the next global crisis.

"Putting such infrastructure in place will help society with the next strategic surprise or grand challenge, which is likely to be equally, if not more, knowledge intensive," they concluded.

Credit: 
Carnegie Mellon University

Assessment of mental health of Chinese primary school students before, after school closing, opening during COVID-19 pandemic

What The Study Did: Psychological symptoms, nonsuicidal self-injury and suicidal ideation, plans, and attempts among children and adolescents were investigated in this observational study before the COVID-19 outbreak started (early November 2019) and two weeks after school reopening (mid-May 2020) in an area of China with low risk of COVID-19.

Authors: Ying Sun, M.D., of Anhui Medical University in the Anhui province of China, is the corresponding author.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

(doi:10.1001/jamanetworkopen.2020.21482)

Editor's Note: The article includes funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, conflict of interest and financial disclosures, and funding and support.

Credit: 
JAMA Network

Factors associated with suicide risk after leaving military service

What The Study Did: This observational study investigated demographic and military service characteristics associated with suicide risk among U.S. veterans after the transition from active military service to civilian life.

Authors: Mark A. Reger, Ph.D., of the Veterans Affairs Puget Sound Health Care System in Seattle, is the corresponding author.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

(doi:10.1001/jamanetworkopen.2020.16261)

Editor's Note: Please see the article for additional information, including other authors, author contributions and affiliations, conflict of interest and financial disclosures, and funding and support.

Credit: 
JAMA Network

Changes in premature deaths from drug poisonings, suicide, alcohol-induced causes in US

What The Study Did: Researchers compared changes from 2000 to 2017 in premature deaths in the U.S. due to drug poisonings, suicide and alcohol-induced causes by geographic areas and demographic characteristics.

Authors: Meredith S. Shiels, Ph.D., of the National Cancer Institute in Rockville, Maryland, is the corresponding author.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

(doi:10.1001/jamanetworkopen.2020.16217)

Editor's Note: The article includes conflicts of interest and funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, conflict of interest and financial disclosures, and funding and support.

Credit: 
JAMA Network

How does chronic stress induce bone loss?

image: The BNST(SOM)-VMH(SF-1)-NTS(Vglut2) neural circuit regulates chronic stress-induced bone loss.

Image: 
SIAT

Clinical studies have found that bone mineral density in patients with anxiety or depression is lower than in ordinary people.

The brain, commander of the body, receives and processes external signals, and then sends instructions to peripheral bones. But how does anxiety induce a decline in bone mineral density?

Researchers from the Shenzhen Institutes of Advanced Technology (SIAT) of the Chinese Academy of Sciences and their collaborators now have an answer. They found that a central neural circuit from the forebrain to the hypothalamus mediates chronic stress-induced bone loss via the peripheral sympathetic nervous system.

Their study was published in the Journal of Clinical Investigation on September 10.

The researchers found that isolation can significantly increase anxiety levels, thus inducing bone loss in human subjects.

Biochemical analysis showed that prolonged isolation increases the concentration of norepinephrine and decreases osteogenic markers in serum. These changes were consistent with the observation of elevated anxiety and reduced bone formation in subjects.

In order to identify the neural mechanism underlying chronic stress-induced bone loss, the research team used a mouse model where mice were subjected to unpredictable chronic mild stress.

They found that after four to eight weeks of chronic stress, the mice displayed significant anxiety behaviors. The bone mineral density of the mice in the stress group was significantly lower than in the control group.

These results confirmed the correlation between stress-induced anxiety and bone loss in experimental animals, and provided a good animal model for follow-up neural mechanism analysis.

Through extensive experiments, the researchers identified a population of inhibitory neurons expressing somatostatin in a forebrain nucleus known as the bed nucleus of the stria terminalis (BNST). These neurons were activated when animals showed anxiety behaviors and transmitted the "anxiety" information to neurons in the ventromedial hypothalamus (VMH).

"Activating the BNST-VMH neural circuit can simultaneously induce anxiety-like behaviors and generate bone loss in the mice, whereas inhibition of this circuit can prevent stress-induced anxiety and bone loss at the same time," said Prof. YANG Fan from SIAT, the co-first and co-corresponding author of the study.

Furthermore, the researchers discovered that glutamatergic neurons in the nucleus tractus solitarius (NTS) and the sympathetic nervous system are recruited to regulate stress-induced bone loss.

"This study provides a new perspective for the systematic study of the regulatory mechanism of brain homeostasis on metabolism and endocrine function of the body in special environments," said Prof. WANG Liping, Director of the Brain Cognition and Brain Disease Institute of SIAT.

Credit: 
Chinese Academy of Sciences Headquarters

Massive-scale genomic study reveals wheat diversity for crop improvement

image: Wheat grows at CIMMYT's experimental station in El Batán, near Mexico City. A new study analyzing the diversity of almost 80,000 wheat accessions reveals consequences and opportunities of selection footprints.

Image: 
© Eleusis Llanderal/CIMMYT

Researchers working on the Seeds of Discovery (SeeD) initiative, which aims to facilitate the effective use of genetic diversity of maize and wheat, have genetically characterized 79,191 samples of wheat from the germplasm banks of the International Maize and Wheat Improvement Center (CIMMYT) and the International Center for Agricultural Research in the Dry Areas (ICARDA).

The findings of the study published today in Nature Communications are described as "a massive-scale genotyping and diversity analysis" of the two types of wheat grown globally -- bread and pasta wheat -- and of 27 known wild species.

Wheat is the most widely grown crop globally, with an annual production exceeding 600 million tons. Approximately 95% of the grain produced corresponds to bread wheat and the remaining 5% to durum or pasta wheat.

The main objective of the study was to characterize the genetic diversity of CIMMYT and ICARDA's internationally available collections, which are considered the largest in the world. The researchers aimed to understand this diversity by mapping genetic variants to identify useful genes for wheat breeding.

From germplasm bank to breadbasket

The results show distinct biological groupings within bread wheats and suggest that a large proportion of the genetic diversity present in landraces has not been used to develop new high-yielding, resilient and nutritious varieties.

"The analysis of the bread wheat accessions reveals that relatively little of the diversity available in the landraces has been used in modern breeding, and this offers an opportunity to find untapped valuable variation for the development of new varieties from these landraces", said Carolina Sansaloni, high-throughput genotyping and sequencing specialist at CIMMYT, who led the research team.

The study also found that the genetic diversity of pasta wheat is better represented in the modern varieties, with the exception of a subgroup of samples from Ethiopia.

The researchers mapped the genomic data obtained from the genotyping of the wheat samples to pinpoint the physical and genetic positions of molecular markers associated with characteristics that are present in both types of wheat and in the crop's wild relatives.

According to Sansaloni, on average, 72% of the markers obtained are uniquely placed on three molecular reference maps and around half of these are in interesting regions with genes that control specific characteristics of value to breeders, farmers and consumers, such as heat and drought tolerance, yield potential and protein content.

Open access

The data, analysis and visualization tools of the study are freely available to the scientific community for advancing wheat research and breeding worldwide.

"These resources should be useful in gene discovery, cloning, marker development, genomic prediction or selection, marker-assisted selection, genome wide association studies and other applications," Sansaloni said.

Credit: 
International Maize and Wheat Improvement Center (CIMMYT)

Decreased MIR2911 absorption in human with SIDT1 polymorphism fails to inhibit SARS-CoV-2

In a new study in Cell Discovery, Liang Li and Chen-Yu Zhang's group at Nanjing University, together with two other groups, report that a SIDT1 polymorphism markedly decreases HD-MIR2911 absorption in humans. Exosomes isolated from volunteers carrying the SIDT1 polymorphism contain lower levels of HD-MIR2911 and fail to inhibit SARS-CoV-2 replication.

Previously, Chen-Yu Zhang's group identified SID1 transmembrane family member 1 (SIDT1) as a critical membrane protein mediating the absorption of dietary miRNAs, which is abolished in SIDT1-deficient mice. In the present study, they demonstrate that a significant fraction of the human population (16%) carries a SIDT1 polymorphism with an amino acid replacement (Val78Met). Functional analysis reveals that this polymorphism (SIDT1poly) undermines the low-pH-dependent uptake of exogenous miRNAs in vitro compared with the wild-type SIDT1 protein. In addition, people with SIDT1poly have lower serum levels of exogenous miRNAs (~10%) than the SIDT1wt group. The dynamic absorption of MIR2911 after oral administration of honeysuckle decoction also declines significantly in the SIDT1poly population. These findings point to a critical role of SIDT1 in dietary miRNA uptake in humans. Furthermore, people with SIDT1poly have lower levels of MIR2911 both in serum and in isolated exosomes. Exosomes from these polymorphic subjects show no effect on S-protein expression or virus replication. Notably, the one patient, out of six SARS-CoV-2 patients observed, who took significantly longer (17 days versus an average of 3.8 days) to become SARS-CoV-2 PCR-negative after MIR2911 antiviral therapy carries exactly this polymorphism (SIDT1-Val78Met). Although larger numbers of human subjects are needed to strengthen the conclusions, the data clearly show that MIR2911 is indispensable to the antiviral effect of honeysuckle decoction.

1. This study provides evidence that SIDT1 mediates dietary miRNA uptake in humans, further confirming its critical role in exogenous miRNA absorption.

2. Combined with their previous findings, this clearly shows that MIR2911 is both necessary and sufficient for the antiviral effect of honeysuckle decoction against SARS-CoV-2.

"We have demonstrated that absorbed MIR2911 in honeysuckle decoction necessarily and sufficiently inhibits SARS-CoV-2 replication in vitro and in human subjects". Liang Li said. "Therefore, we wish that people could discard prejudice to traditional medicine and perform clinic trail to help controlling COVID-19 pandemic. Basically, you reject MIR2911 in honeysuckle decoction, you reject life". Liang Li added.

Credit: 
Nanjing University School of Life Sciences