
Examining associations between ages of parents, grandparents and autism risk in children

What The Study Did: Older parental age has been associated with autism spectrum disorders (ASDs) in children; however, little is known about the association between grandparents' age at the birth of the parent and the risk of ASD in the grandchildren. This association was investigated in an observational study using data from Danish national health registries that included three generations and 1.4 million children born from 1990 to 2013.

Authors: Zeyan Liew, Ph.D., M.P.H., of the Yale School of Public Health in New Haven, Connecticut, is the corresponding author.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

(doi:10.1001/jamanetworkopen.2020.2868)

Editor's Note: The article includes funding/support disclosures. Please see the articles for additional information, including other authors, author contributions and affiliations, conflicts of interest and financial disclosures, and funding and support.

Credit: 
JAMA Network

How common is racial/ethnic discrimination in US surgical residency programs?

What The Study Did: Surveys from nearly 7,000 resident surgeons were used to evaluate how common racial/ethnic discrimination is in U.S. general surgery programs and how it's associated with burnout, thoughts of quitting and suicide.

Authors: Yue-Yung Hu, M.D., M.P.H., of the Feinberg School of Medicine at Northwestern University in Chicago, is the corresponding author.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

(doi:10.1001/jamasurg.2020.0260)

Editor's Note: The article includes conflict of interest and funding/support disclosures. Please see the articles for additional information, including other authors, author contributions and affiliations, conflicts of interest and financial disclosures, and funding and support.

Credit: 
JAMA Network

Prescribing an overdose: A chapter in the opioid epidemic

ROCHESTER, Minn. -- Research indicates that widespread opioid overprescribing contributed to the opioid epidemic. New research shows that this dangerous trend has apparently been coupled with another: inappropriate use of high-potency opioids.

A multi-institution research collaboration led by Mayo Clinic will publish its findings Wednesday, April 15, in JAMA Network Open. The study showed that more than half of Americans starting the most highly regulated opioids might be receiving inappropriate treatment.

"In pain management, there is a need to use a variety of treatment options, including -- when appropriate -- extended-release opioids and very strong immediate-acting opioids like fentanyl," says W. Michael Hooten, M.D., a Mayo Clinic anesthesiologist and pain medicine specialist. "However, these particular medications can cause a number of serious adverse effects, so extra safeguards are needed when these medications are prescribed." Dr. Hooten is a study co-author.

"One of the key factors in determining whether these drugs can be used safely is the presence of opioid tolerance in the patient who was prescribed one of these medications," says Dr. Hooten. "In other words, tolerance to some of the most dangerous adverse effects of opioids, including suppressing breathing and excessive sedation, develops only after a patient takes daily doses of opioids over time. Patients who are not opioid-tolerant should not be receiving high-potency fentanyl or extended-release opioid products because they are susceptible to these life-threatening adverse effects."

The medications examined in the study included high-dose, extended-release oxycodone; all doses of extended-release hydromorphone; fentanyl patches; and all varieties of transmucosal (oral or nasal delivery) fentanyl.

Data behind findings

To determine whether these medications were inappropriately used across the U.S., the study team used pharmacy and medical claims data, and linked electronic health records from the OptumLabs Data Warehouse. OptumLabs is a collaborative center for research and innovation co-founded by Optum Inc. and Mayo Clinic, and focused on improving patient care and patient value.
Examining pharmacy and medical claims data from 2007 to 2016, the investigators identified nearly 300,000 prescriptions for medications reserved for people with opioid tolerance. They removed records of people who had recently been hospitalized or who had received an opioid poisoning diagnosis within the preceding six months. They also removed people who did not have at least six months of continuous insurance claims information at the time of the prescription and people with certain missing demographic information.

The remaining 153,385 instances of new outpatient prescriptions of these reserved medications occurred among 131,756 people from across the U.S.

Fewer than 48% of these showed evidence of prior opioid tolerance.

"Our findings are concerning because it appears that many people starting to use these drugs may be at risk for some quite serious outcomes," says Molly Jeffery, Ph.D., the study's lead author. "In general, physicians are allowed to prescribe drugs off-label -- that is, without adhering to the indications or warnings included in the drug label. But these particular drugs are considered risky enough that the FDA (Food and Drug Administration) requires manufacturers to provide additional oversight and education to physicians to make sure they understand the risks associated with the drugs." Dr. Jeffery is also the scientific director of Emergency Medicine Research at Mayo Clinic.

Furthermore, regulations for one of the drug classes that the team studied -- transmucosal immediate-release fentanyl, or TIRFs -- require physicians who prescribe, and pharmacists who dispense, the drugs to complete a certification process and enroll each patient.

"This process is meant to ensure patient safety while preserving access to the drugs for people who really need them," she says. "All of the drugs we studied have appropriate uses, but that does require that the physicians prescribing the drugs know the risks."

A key contribution of this study is that researchers were able to look for additional evidence of opioid tolerance in linked electronic health records for about 15% of the group, or 20,044 patients. They determined that between 0.5% and 4% of episodes of use of the reserved medications had additional information in the patient's record that showed evidence of tolerance not indicated in claims data. This additional evidence includes prescriptions showing up in the electronic health record but not in claims.

"Critiques of prior studies using only claims data raised the possibility that patients might have opioid prescriptions that don't show up in insurance claims," notes Dr. Jeffery. "For example, if a person paid for a prescription with cash instead of submitting it to insurance."

Prescriptions noted in the electronic health record may, or may not, have been filled.

"We used natural language processing -- a type of artificial intelligence -- to look through physician notes in the patients' health records, and also looked at information about prescriptions written," says Dr. Jeffery. "We did not find substantial additional evidence that patients were opioid-tolerant when they started these drugs."

"There was no evidence of opioid tolerance in more than half of the patients in our study," Dr. Jeffery says. "Given how common that was, we wanted to use the clinical notes to look for reasons why physicians would have prescribed these drugs in people who were not opioid-tolerant."

The team's analysis of clinical notes didn't give any insight into physicians' reasons for prescribing to people who were not opioid-tolerant. However, the physicians on the research team had some guesses about what might be behind these prescriptions.

"My colleagues and I discussed the possibility that, in particular, fentanyl patches might have been prescribed for patients who have serious medical or surgical problems that limit the ability to swallow oral medications," says Dr. Hooten. "Also, fentanyl patches contain a three-day supply of the drug, so patches could be a possible solution for patients who may not be able to take oral medications on a scheduled basis."

Dr. Hooten hopes his fellow clinicians pay attention to these findings.

"I often treat people mired in addiction," he says. "As physicians, our first charge is to do no harm, and with opioids -- especially this group of medications -- the risk of harm is very real."

"It concerns me that I might have patients coming to me whose substance abuse disorder was exacerbated by inappropriate prescribing practices."

Credit: 
Mayo Clinic

First Gulf-wide survey of oil pollution completed 10 years after Deepwater Horizon

video: The University of South Florida (USF) conducted the first comprehensive baseline study of oil pollution in the Gulf of Mexico. USF marine scientists surveyed 10,000 fish and found oil exposure in all of them - the highest levels were detected in yellowfin tuna, golden tilefish and red drum.

Image: 
University of South Florida

ST. PETERSBURG, Fla. (April 15, 2020) -- Since the 2010 BP oil spill, marine scientists at the University of South Florida (USF) have sampled more than 2,500 individual fish representing 91 species from 359 locations across the Gulf of Mexico and found evidence of oil exposure in all of them, including some of the most popular types of seafood. The highest levels were detected in yellowfin tuna, golden tilefish and red drum.

The study, just published in "Nature Scientific Reports," represents the first comprehensive, Gulf-wide survey of oil pollution launched in response to the Deepwater Horizon spill. It was funded by a nearly $37 million grant from the independent Gulf of Mexico Research Initiative (GoMRI) to establish the Center for Integrated Modeling and Analysis of Gulf Ecosystems (C-IMAGE), an international consortium of professors, post-doctoral scholars and students from 19 collaborating institutions.

Over the last decade, USF scientists conducted a dozen research expeditions to locations off the United States, Mexico and Cuba examining levels of polycyclic aromatic hydrocarbons (PAHs), the most toxic chemical component of crude oil, in the bile of the fish. Bile is produced by the liver to aid in digestion, but it also acts as storage for waste products.

"We were quite surprised that among the most contaminated species was the fast-swimming yellowfin tuna as they are not found at the bottom of the ocean where most oil pollution in the Gulf occurs," said lead author Erin Pulster, a researcher in USF's College of Marine Science. "Although water concentrations of PAHs can vary considerably, they are generally found at trace levels or below detection limits in the water column. So where is the oil pollution we detected in tunas coming from?"

Pulster says it makes sense that tilefish have higher concentrations of PAH because they live their entire adult lives in and around burrows they excavate on the seafloor and PAHs are routinely found in Gulf sediment. However, their exposure has been increasing over time, as well as in other species, including groupers, some of Florida's most economically important fish. In a separate USF-led study, her team measured the concentration of PAHs in the liver tissue and bile of 10 popular grouper species. The yellowedge grouper had a concentration that increased more than 800 percent from 2011 to 2017.

Fish with the highest concentrations of PAH were found in the northern Gulf of Mexico, a region of increased oil and gas activity and in the vicinity of the Deepwater Horizon spill that gushed nearly four million barrels of oil over the course of three months in 2010. Oil-rich sediments at the bottom where much of the oil settled are resuspended by storms and currents, re-exposing bottom-dwelling fish.

Oil pollution hot spots were also found off major population centers, such as Tampa Bay, suggesting that runoff from urbanized coasts may play a role in the higher concentrations of PAHs. Other sources include chronic low-level releases from oil and gas platforms, fuel from boats and airplanes, and even natural oil seeps -- fractures on the seafloor that can ooze the equivalent of millions of barrels of oil per year.

"This was the first baseline study of its kind, and it's shocking that we haven't done this before given the economic value of fisheries and petroleum extraction in the Gulf of Mexico," said Steven Murawski, professor of fisheries biology at USF, who led the international research effort.

Despite the detected trends of oil contamination in fish bile and liver, fish from the Gulf of Mexico are safe to eat: they are rigorously tested for contaminants to ensure public safety, and oil contaminant levels in fish flesh remain well below public health advisory levels. Chronic PAH exposure, however, can prevent the liver from functioning properly, resulting in a decline in overall fish health.

These studies were made possible by BP's 10-year, $500 million commitment to fund independent research on the long-term effects of the Deepwater Horizon spill administered by the Gulf of Mexico Research Initiative. This year marks the end of that funding.

"Long-term monitoring studies such as these are important for early warning of oil pollution leaks and are vital for determining impacts to the environment in the case of future oil spills," Pulster said.

Credit: 
University of South Florida

Nanosensor can alert a smartphone when plants are stressed

CAMBRIDGE, MA -- MIT engineers have developed a way to closely track how plants respond to stresses such as injury, infection, and light damage, using sensors made of carbon nanotubes. These sensors can be embedded in plant leaves, where they report on hydrogen peroxide signaling waves.

Plants use hydrogen peroxide to communicate within their leaves, sending out a distress signal that stimulates leaf cells to produce compounds that will help them repair damage or fend off predators such as insects. The new sensors can use these hydrogen peroxide signals to distinguish between different types of stress, as well as between different species of plants.

"Plants have a very sophisticated form of internal communication, which we can now observe for the first time. That means that in real-time, we can see a living plant's response, communicating the specific type of stress that it's experiencing," says Michael Strano, the Carbon P. Dubbs Professor of Chemical Engineering at MIT.

This kind of sensor could be used to study how plants respond to different types of stress, potentially helping agricultural scientists develop new strategies to improve crop yields. The researchers demonstrated their approach in eight different plant species, including spinach, strawberry plants, and arugula, and they believe it could work in many more.

Strano is the senior author of the study, which appears today in Nature Plants. MIT graduate student Tedrick Thomas Salim Lew is the lead author of the paper.

Embedded sensors

Over the past several years, Strano's lab has been exploring the potential for engineering "nanobionic plants" -- plants that incorporate nanomaterials that give the plants new functions, such as emitting light or detecting water shortages. In the new study, he set out to incorporate sensors that would report back on the plants' health status.

Strano had previously developed carbon nanotube sensors that can detect various molecules, including hydrogen peroxide. About three years ago, Lew began working on trying to incorporate these sensors into plant leaves. Studies in Arabidopsis thaliana, often used for molecular studies of plants, had suggested that plants might use hydrogen peroxide as a signaling molecule, but its exact role was unclear.

Lew used a method called lipid exchange envelope penetration (LEEP) to incorporate the sensors into plant leaves. LEEP, which Strano's lab developed several years ago, allows for the design of nanoparticles that can penetrate plant cell membranes. As Lew was working on embedding the carbon nanotube sensors, he made a serendipitous discovery.

"I was training myself to get familiarized with the technique, and in the process of the training I accidentally inflicted a wound on the plant. Then I saw this evolution of the hydrogen peroxide signal," he says.

He saw that after a leaf was injured, hydrogen peroxide was released from the wound site and generated a wave that spread along the leaf, similar to the way that neurons transmit electrical impulses in our brains. As a plant cell releases hydrogen peroxide, it triggers calcium release within adjacent cells, which stimulates those cells to release more hydrogen peroxide.

"Like dominos successively falling, this makes a wave that can propagate much further than a hydrogen peroxide puff alone would," Strano says. "The wave itself is powered by the cells that receive and propagate it."
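The relay mechanism described above -- each cell that releases hydrogen peroxide triggers its neighbors, which then release more -- can be illustrated with a toy one-dimensional model. This sketch is purely illustrative; the cell count, wound position, and step count are invented, not taken from the study.

```python
# Toy 1D relay model of the hydrogen peroxide wave: each "firing" cell
# triggers its immediate neighbors on the next time step, so the signal
# propagates far beyond the initial puff at the wound site.

def propagate(n_cells, wound_site, steps):
    fired = {wound_site}   # cells that have released hydrogen peroxide
    front = {wound_site}   # cells that fired on the current step
    history = [sorted(fired)]
    for _ in range(steps):
        # every cell on the wave front triggers its two neighbors
        nxt = {c + d for c in front for d in (-1, 1)
               if 0 <= c + d < n_cells} - fired
        fired |= nxt
        front = nxt
        history.append(sorted(fired))
    return history

waves = propagate(n_cells=11, wound_site=5, steps=3)
print(waves[-1])  # after 3 steps the wave has spread 3 cells each way
```

Like the dominoes in Strano's analogy, a single local release is enough to set off a self-sustaining front.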

This flood of hydrogen peroxide stimulates plant cells to produce molecules called secondary metabolites, such as flavonoids or carotenoids, which help them to repair the damage. Some plants also produce other secondary metabolites that can be secreted to fend off predators. These metabolites are often the source of the food flavors that we desire in our edible plants, and they are only produced under stress.

A key advantage of the new sensing technique is that it can be used in many different plant species. Traditionally, plant biologists have done much of their molecular biology research in certain plants that are amenable to genetic manipulation, including Arabidopsis thaliana and tobacco plants. However, the new MIT approach is applicable to potentially any plant.

"In this study, we were able to quickly compare eight plant species, and you would not be able to do that with the old tools," Strano says.

The researchers tested strawberry plants, spinach, arugula, lettuce, watercress, and sorrel, and found that different species appear to produce different waveforms -- the distinctive shape produced by mapping the concentration of hydrogen peroxide over time. They hypothesize that each plant's response is related to its ability to counteract the damage. Each species also appears to respond differently to different types of stress, including mechanical injury, infection, and heat or light damage.

"This waveform holds a lot of information for each species, and even more exciting is that the type of stress on a given plant is encoded in this waveform," Strano says. "You can look at the real time response that a plant experiences in almost any new environment."

Stress response

The near-infrared fluorescence produced by the sensors can be imaged using a small infrared camera connected to a Raspberry Pi, a $35 credit-card-sized computer similar to the computer inside a smartphone. "Very inexpensive instrumentation can be used to capture the signal," Strano says.

Applications for this technology include screening different species of plants for their ability to resist mechanical damage, light, heat, and other forms of stress, Strano says. It could also be used to study how different species respond to pathogens, such as the bacteria that cause citrus greening and the fungus that causes coffee rust.

"One of the things I'm interested in doing is understanding why some types of plants exhibit certain immunity to these pathogens and others don't," he says.

Strano and his colleagues in the Disruptive and Sustainable Technology for Agricultural Precision interdisciplinary research group at the MIT-Singapore Alliance for Research and Technology (SMART), MIT's research enterprise in Singapore, are also interested in studying how plants respond to different growing conditions in urban farms.

One problem they hope to address is shade avoidance, which is seen in many species of plants when they are grown at high density. Such plants turn on a stress response that diverts their resources into growing taller, instead of putting energy into producing crops. This lowers the overall crop yield, so agricultural researchers are interested in engineering plants so they don't turn on that response.

"Our sensor allows us to intercept that stress signal and to understand exactly the conditions and the mechanism that are happening upstream and downstream in the plant that gives rise to the shade avoidance," Strano says.

Credit: 
Massachusetts Institute of Technology

ECMO physicians offer guidance in the context of resource-scarce COVID-19 treatment

image: ECMO may be used to support critically ill patients in the COVID pandemic.

Image: 
ATS

April 15, 2020 - Rapidly escalating numbers of COVID-19 patients suffering from respiratory failure threaten to overwhelm hospital capacity and force healthcare providers into making challenging decisions about the care they provide. Of particular interest is the role of ECMO - extracorporeal membrane oxygenation, a form of life support for patients with advanced lung disease - to support critically ill patients in the current pandemic.

In "ECMO Resource Planning in the Setting of a Pandemic Respiratory Illness," an open-access paper published in the Annals of the American Thoracic Society, ECMO physicians outline their approach for care.

Currently, there is no vaccine or treatment for COVID-19 beyond supportive care such as mechanical ventilation or, in severe cases, ECMO to maintain patients and provide a window for potential recovery. However, when demand far outpaces a hospital's ability to provide highly specialized, resource-intensive therapies such as ECMO, physicians must be prepared to determine when and if to offer such support.

"The key challenge in pandemic settings is to optimize resource utilization so that patients are appropriately triaged and cared for within a hospital and throughout the larger health care system," said Steven Keller, MD, PhD, senior author and ECMO physician in the Division of Pulmonary and Critical Care Medicine, Brigham and Women's Hospital. "This is a daunting task as it requires a level of planning and coordination not routine in our current system and is difficult to implement within a limited window for planning and without dedicated resources."

"However, achieving this coordination is vital to ensuring that patients who are likely to benefit most from ECMO, younger patients with severe respiratory failure but without other significant co-morbidities or evidence of multi-organ failure, are able to receive the support that will offer them the best opportunity for survival."

Dr. Keller and his co-author suggest the following guidelines to help medical centers respond to patients' needs as resources contract in the COVID-19 pandemic:

Mild Surge - Focus on increasing capacity:

o Develop criteria specific to pandemic for initiation and cessation of ECMO.

o Obtain necessary equipment and expand capacity.

o Collocate/regionalize ECMO patients.

o Implement staffing protocols that allow for ECMO specialists/RNs to care for more patients based on acuity.

o Collaborate with other local/regional ECMO centers.

Moderate Surge - Transition focus to determining allocation of scarce resources.

Major Surge - Limit or defer use of scarce resources.

"Planning for how to deploy these resources in advance will both optimize care for patients initiated on ECMO support as well as provide guidance for clinicians caring for patients in whom ECMO support is not an option in a resource-limited environment," said Dr. Keller.

To read the authors' perspective piece in its entirety, and to see the full collection of COVID-19 manuscripts in the ATS library, go here.

Credit: 
American Thoracic Society

High-res imaging with elastography may accurately detect breast cancer in surgical margins

Bottom Line: A high-resolution, three-dimensional imaging technique, when combined with quantitative measurement of tissue elasticity, could accurately detect cancer within the resected margins of surgical specimens taken from patients undergoing breast-conserving surgery.

Journal in Which the Study was Published: Cancer Research, a journal of the American Association for Cancer Research

Authors: Brendan F. Kennedy, PhD, associate professor in the School of Engineering at The University of Western Australia (UWA) and laboratory head of BRITElab at the Harry Perkins Institute of Medical Research in Perth, Western Australia; and Christobel Saunders, MBBS, professor in the School of Medicine at UWA

Background: "Despite living in the 'digital age,' surgeons must routinely rely on their eyesight and sense of touch to determine if they have removed the entire tumor during breast-conserving surgery," said Kennedy. "Due to lack of adequate tools, 20 to 30 percent of patients must return for additional surgery, resulting in substantial physical and financial burdens and increased risk of complications."

Beyond surgeons' native senses of sight and touch, X-rays are often used for the intraoperative detection of tumor in the margin of surgical specimens, said Saunders. "While useful in some cases, this method can't detect small microscopic traces of tumor that surgeons often miss," she said. "As a result, it is widely accepted that higher-resolution intraoperative detection techniques are needed."

Optical coherence tomography (OCT) is a type of imaging technique that generates three-dimensional images of tissue. "OCT can be described as the optical equivalent of ultrasound, using reflections of light waves rather than sound waves to form images of tissue microstructure," Kennedy explained. The images generated by OCT can be used to visualize and detect cancer in mastectomy tissues, but the sensitivity and specificity of this technique in breast-conserving surgery specimens in several recent studies was relatively low.

Cancer and its associated stroma are stiffer than benign tissues, Kennedy explained. OCT technology can also be used to measure tissue deformation under an applied force, allowing clinicians to ascertain the elasticity, or stiffness, of the surgical specimen. Quantitative micro-elastography (QME) is a variant of OCT that can generate three-dimensional maps of local elasticity. "We wanted to determine the diagnostic accuracy of QME for detecting tumor in the margins of breast-conserving surgical specimens, and compare this technique with images generated by OCT alone," Kennedy said.

How the Study was Conducted: In this study by Kennedy, Saunders, and colleagues, 90 patients were recruited who were undergoing surgical treatment for breast cancer. Of these patients, 83 received breast-conserving surgery, and seven received a mastectomy. Following surgery, simultaneous OCT and QME analyses were conducted on the resection margins of the freshly excised specimens. After imaging, the surgical specimens were submitted for standard histopathological processing. Histology sections were co-registered with the OCT and QME images to determine the types of tissue present in each scan.

To facilitate the comparison of OCT and QME images with histopathology, three-dimensional regions of interest (ROI) were selected from the scans. At least one ROI was selected for every surgical margin used in the study. Each ROI was determined to be positive for cancer if the pathologist identified tumor within 1 millimeter (mm) of the margin of the corresponding histology section.

To generate a set of pilot data, the researchers used all mastectomy surgical samples and surgical samples from 12 patients who received breast-conserving surgery. These data were used to train seven readers (two surgeons, two engineers, one medical sonographer, one pathology assistant, and one medical resident) to determine if images generated by OCT and QME indicated the presence of cancer in the surgical margins.

Surgical samples from the remaining 71 patients who received breast-conserving surgery were used to determine the ability of OCT and QME to detect cancer within 1mm of the surgical margins, as compared with gold-standard post-operative histology. Readers were blinded to the histopathology results. Sensitivity, specificity, and accuracy of each method were calculated for each reader, and aggregate results for all of the readers were determined. While inter-reader agreement was nearly perfect for QME, agreement was only moderate for OCT, Kennedy noted.

Results: Based on the aggregate results, OCT images resulted in a 69.0 percent sensitivity, 79.0 percent specificity, and 77.5 percent accuracy for detecting cancer within 1mm of the surgical margin. QME images resulted in a 92.9 percent sensitivity, 96.4 percent specificity, and 95.8 percent accuracy for detecting cancer within 1mm of the surgical margin.
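The three metrics reported above are standard functions of a reader's true/false positive and negative calls against the histopathology gold standard. A minimal sketch of that computation follows; the counts used in the example are hypothetical, not the study's raw data.

```python
# Sketch: deriving sensitivity, specificity, and accuracy from a 2x2
# confusion matrix of reader calls vs. gold-standard histology.
# tp/fp/tn/fn counts below are invented for illustration only.

def margin_metrics(tp, fp, tn, fn):
    """Diagnostic metrics from true/false positive and negative counts."""
    sensitivity = tp / (tp + fn)               # positives correctly called
    specificity = tn / (tn + fp)               # negatives correctly called
    accuracy = (tp + tn) / (tp + fp + tn + fn) # all calls that were correct
    return sensitivity, specificity, accuracy

sens, spec, acc = margin_metrics(tp=26, fp=5, tn=120, fn=2)
print(f"sensitivity={sens:.1%} specificity={spec:.1%} accuracy={acc:.1%}")
```

Aggregate results pool these counts across all seven readers before applying the same formulas.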

Author's Comments: "Imaging the microscale stiffness of tissue using QME has the potential to reduce re-excision rates in breast-conserving surgery," Kennedy said. "Further, by quantifying tissue stiffness, we remove the subjectivity that is inherent to the surgeon's sense of touch.

"The ideal scenario would be to perform the imaging in the surgical cavity immediately after the specimen has been removed," Kennedy continued. "This would give surgeons a direct indication of whether any tumor had been missed. As such, our next goal is to develop a handheld QME probe to enable intraoperative imaging."

Study Limitations: Roughly 12 percent of the selected ROIs were excluded from the study for a variety of reasons, including extensive thermal damage, imaging artifacts, or the presence of a rare form of mucinous ductal carcinoma in situ (DCIS), representing a limitation of the study. "We are working on ways to accommodate for each of these excluded ROIs to ensure that the technique is as robust as possible," Kennedy said.

Credit: 
American Association for Cancer Research

Probiotic intervention in ameliorating the altered CNS functions in neurological disorders

According to a WHO report, one in every four people is affected by a mental or neurological disorder at some point in their lives, making mental disorders among the leading causes of ill health and disability worldwide. In the present era of socioeconomic competition and related stress, there has been a significant rise in the incidence of neurological and psychiatric ailments, especially depression, bipolar disorder, acute anxiety and panic attacks. Many of these conditions are never talked about openly, as those affected directly or indirectly shy away or feel embarrassed; in the absence of proper and early diagnosis and treatment, the condition of the affected person worsens.

Against this backdrop, the need to investigate newer and safer intervention therapies with prophylactic and/or therapeutic effects is well understood. Recently, the role of the gut microbiota and its cross-talk with the human brain in modulating Central Nervous System (CNS) physiology and its optimal working has been highlighted. This review article, presented by Sharma and Kaur (Mehr Chand Mahajan DAV College for Women, India), focuses on the role and effect of regular intake of probiotic bacteria (through external administration of probiotic-rich foods, drinks, capsules, etc.) in strengthening overall gut health. The review gives a comprehensive insight into the potential of regular, fixed-dose intake of these beneficial bacteria to strengthen the gut microflora, thereby improving neurological manifestations or decreasing the incidence and severity of neurological and psychiatric disorders. It also delineates the underlying molecular and biochemical mechanisms through which probiotic bacteria ameliorate altered CNS functions in neurological and psychiatric disorders (anxiety, major depressive disorder, bipolar disorder, schizophrenia, autism spectrum disorder, cognitive impairments, etc.). Probiotics hold strong potential as an important dietary modification and as a useful intervention therapy with preventive and therapeutic value, and should be an integral part of the treatment protocols recommended for the target population by the physician.

Credit: 
Bentham Science Publishers

A more plant-based diet without stomach troubles: getting rid of FODMAPs with enzymes

A plant-based diet is a good choice for both climate and health. However, many plant-based products, especially legumes, contain FODMAP compounds that are poorly digestible and cause unpleasant intestinal symptoms. A study by VTT and Finnish companies succeeded in breaking down FODMAPs with enzymes and producing new, stomach-friendly plant-based food products.

FODMAPs are short-chain carbohydrate molecules that are poorly absorbed in the human small intestine. These non-absorbed compounds move along to the large intestine, where intestinal microbes feed on them. This results in the production of gases that cause symptoms, especially for those suffering from intestinal disorders, but also for many others. These problems are relatively common: irritable bowel syndrome alone is estimated to affect between 10% and 20% of the population.

Many foods containing FODMAPs are in themselves healthy and good sources of fibre, nutrients and vegetable proteins. However, those suffering from symptoms will often avoid these foods and miss out on their health benefits.

Enzymes to do away with FODMAPs

In a study funded by VTT, Gold&Green Foods, Raisio, Roal and Valio, VTT focused on two key FODMAP compounds: galactan and fructan. Galactan is abundant in, for example, legumes, while fructan is found in many cereals, among other things.

"We investigated whether these compounds can be removed from food by breaking them down with enzymes. We utilised both commercial enzymes and ones produced at VTT in the project. We used them to test the removal of FODMAPs from faba bean and pea protein concentrates as well as from rye, graham and wheat flour", says Senior Research Scientist Antti Nyyssölä from VTT.

The solution proved to work: there were only small amounts of FODMAPs remaining in the raw materials after enzymatic treatment.

"The method is similar to that used to make Hyla milk, in which lactose is broken down in advance. Similarly, enzymatic treatment can be used to remove FODMAPs from food."

New plant-based foods suitable for the FODMAP diet

The research project also tested whether enzymes work in connection with the preparation of food products. This would allow the food industry to eliminate harmful FODMAP compounds in their own processes. The project focused on testing plant-based spoonable products, meat analogues and bakery products to investigate different types of plant-based foods suitable for the FODMAP diet.

"The study showed that enzymes also work under a variety of conditions and in different food processes. This is interesting new information especially for legumes, as there are currently no similar legume-based foods suitable for the FODMAP diet on the market", says Nyyssölä.

"The results are most likely to be utilised next in the development of new food items, but also in academic research in order to verify the effects on intestinal symptoms with certainty", he continues.

Credit: 
VTT Technical Research Centre of Finland

How expectations influence learning

image: Burkhard Pleger (left) and Bin Wang collaborated on the studies.

Image: 
RUB, Marquard

During learning, the brain acts as a prediction engine that continually generates hypotheses about our environment and registers whether an assumption turns out to be true. A team of neuroscientists from Ruhr-Universität Bochum has shown that expectation during these predictions affects the activity of various brain networks. Dr. Bin Wang, Dr. Lara Schlaffke and Associate Professor Dr. Burkhard Pleger from the Neurological Clinic of Berufsgenossenschaftliches Universitätsklinikum Bergmannsheil report the results in two articles published in March and April 2020 in the journals Cerebral Cortex and Journal of Neuroscience.

The neuroscientists identified two key regions in the brain: the thalamus plays a central role in decision-making. The insular cortex, on the other hand, is particularly active when it is clear whether the right or wrong decision has been made. "The expectation during learning then regulates specific connections in the brain and thus the prediction for learning-relevant sensory perception," says Burkhard Pleger.

Focus on the decision making process

For the investigation, the team used a learning task that focuses on the decision-making process during the perception of skin contact. "It's like learning a computer strategy game using a game pad that gives sensory feedback to certain fingers in response to certain stimuli," says Pleger. "The point is that a certain touch stimulus leads to success, and that this has to be learned from stimulation to stimulation."

28 participants were given either tactile stimulus A or B on the index finger in each trial. At the push of a button, they then had to predict whether the subsequent tactile stimulus would be the same or not. The probabilities of A and B were constantly changing, which the participants had to learn from prediction to prediction.
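The structure of such a task can be illustrated with a small simulation. The drifting stimulus probability (here, a reversal every 50 trials) and the simple frequency-tracking "participant" are illustrative assumptions, not the study's actual design parameters:

```python
import random

# Hypothetical sketch of the task described above: on each trial a tactile
# stimulus A or B is delivered, and a simulated participant predicts whether
# the next stimulus will be the same as the previous one.
random.seed(1)
p_A = 0.8                       # probability of stimulus A (changes over time)
n_trials = 200
history = []                    # True where consecutive stimuli matched
correct = 0

prev = 'A' if random.random() < p_A else 'B'
for t in range(n_trials):
    # the stimulus probability reverses periodically, so the learner
    # must keep updating its estimate from prediction to prediction
    if t and t % 50 == 0:
        p_A = 1.0 - p_A
    nxt = 'A' if random.random() < p_A else 'B'

    # learner predicts "same" if recent trials suggest repeats are likely
    recent = history[-10:]
    p_same = sum(recent) / len(recent) if recent else 0.5
    prediction_same = p_same >= 0.5

    actually_same = (nxt == prev)
    correct += (prediction_same == actually_same)
    history.append(actually_same)
    prev = nxt

print(f"accuracy: {correct / n_trials:.2f}")
```

Because the true repeat probability here stays above chance, the frequency-tracking learner ends up correct on well over half the trials; a Bayesian learner with an explicit volatility estimate would adapt faster after each reversal.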

Strategy analysis

During the test, the participants' brain activity was examined using functional magnetic resonance imaging. The researchers were particularly interested in the trial runs in which the participants changed their decision-making strategy. They asked to what extent the change in expectations influenced brain activity.

Two brain regions stood out to the researchers: the thalamus and the insular cortex. The thalamus processes information that comes from the sensory organs or other areas of the brain and passes it on to the cerebrum. It is also called the gateway to consciousness.

A new role for the thalamus

Using functional magnetic resonance images, the researchers were able to show that different connections between the prefrontal cortex and the thalamus were responsible for maintaining a learning strategy or changing it. The higher the expectations before the decision, the more likely the strategy was to be maintained and the weaker these connections were. With low expectations, there was a change of strategy and the regions appeared to interact much more strongly with each other. "The brain appears to be particularly active when a learning strategy has to be changed, while it takes significantly less energy to maintain a strategy," concludes Pleger.

"So far, the thalamus has been viewed as a switch," adds the neuroscientist. "Our results underline its role in higher cognitive functions that help decision-making while learning. So the thalamus is not only a gateway to sensory consciousness, but rather it seems to link sensory consciousness to cognitive processes that serve, for example, to make decisions."

Affecting sensory perception

The insular cortex, on the other hand, is involved in perception, motor control, self-awareness, cognitive functions and interpersonal experiences. This region was particularly active when a participant had already made their decision and then found out whether they were right or wrong. "Different networks that are anchored in the insular cortex are regulated by expectations and thus seem to have a direct influence on future sensory perception," said Pleger.

Credit: 
Ruhr-University Bochum

Bees point to new evolutionary answers

image: A recently described Fijian bee species, Homalictus groomi (photo James Dorey).

Image: 
Courtesy James Dorey

Evolutionary biology aims to explain how new species arise and evolve to occupy myriad niches - but it is not a singular or simplistic story. Rare bees found in high mountain areas of Fiji provide evidence that they have evolved into many species, despite the fact they can't readily adapt to different habitats.

These bees - discovered by a team of researchers from Flinders University, South Australian Museum, UniSA and University of Adelaide - serve as a major warning about the impact of ongoing human-induced climate change and loss of biodiversity for different species.

The Fijian bees are locked into very specific habitats, and when these have contracted and split due to past climate change, the bee populations also became fragmented, with some isolated populations eventually turning into new species.

"The adaptation to new habitats and niches is often assumed to drive the diversification of species, but we found that Fijian bee diversity arose from an inability to adapt," says Flinders University's James Dorey, lead author on a new paper that explains this research.

The paper - "Radiation of tropical island bees and the role of phylogenetic niche conservatism as an important driver of biodiversity," by James Dorey, Scott Groom, Elisha Freedman, Cale Matthews, Olivia Davies, Ella Deans, Celina Rebola, Mark Stevens, Michael Lee and Michael Schwarz - has been published by Proceedings B journal.

"Our genetic data show how a single bee-colonisation in Fiji gave rise to over 20 endemic bee species largely constrained to cooler, high elevations," says Mr Dorey. "At least for Fijian bees, a relative inability to adapt has created a species-making machine."

New Fijian bee species evolved when a single ancestral species colonised and spread over lowland areas during cool periods but was later restricted to different mountaintops as the climate warmed and the lowlands became too hot for comfort. The isolated populations later became new species. Each subsequent climate cycle has the potential to generate new species.

"Perhaps, if Darwin had studied Fijian bees instead of Galapagos finches, he might have come to rather different conclusions about the origin of species", adds Flinders University's Associate Professor Mike Schwarz, who was part of the research group.

One of the major arguments at the core of evolutionary theory suggests that species arise from adaptive radiation into new niche spaces, with gene flow between the new and ancestral populations subsequently inhibited, eventually leading to speciation. As an alternative, phylogenetic niche conservatism points to the inability of a lineage to adapt to new or changing environments, in turn, promoting speciation when populations become isolated as their preferred habitats contract.

This is what the researchers found in the high mountain areas of Fiji. Of the 22 Fijian bee species they identified, most have very narrow elevational ranges (constrained by temperature) - and 14 species were only recovered from single mountain peaks.

"This demonstrates how slowly bees have adapted to new climates since the colonisation of Fiji," says Mr Dorey. "We further highlight that such phylogenetic signals could indicate climate-related extinction risks. Indeed, one Fijian bee species (Homalictus achrostus) appears to be at serious risk of extinction, with sightings becoming much rarer since its initial discovery in the 1970s. This raises concerns for the 13 other species that we have, so far, only found on single mountain tops. They have nowhere to go if climate continues to warm and represent 14 very good reasons to curb global greenhouse gas emissions."

Credit: 
Flinders University

Additions to resource industry underwater robots can boost ocean discoveries

image: An ROV fitted with an arm for collecting marine samples.

Image: 
AIMS

Underwater robots are regularly used by the oil and gas industry to inspect and maintain offshore structures. The same machines could be adapted to gather extra scientific information, thus boosting environmental and resource management capabilities, an Australian-led study has revealed.

Scientists from around the globe, led by Dianne McLean and Miles Parsons from the Australian Institute of Marine Science (AIMS), are urging closer ties between industry and researchers to maximise the use of the underwater robots, known as remotely operated vehicles (ROVs).

In a paper published in the journal Frontiers in Marine Science, they identify a range of instruments that can be easily added to the craft, including cameras, audio recorders and sample collectors.

The information gathered will significantly increase scientific and industry understanding of the impact of marine infrastructure, producing benefits for ecological management and regulatory compliance.

"This is a real win-win," said Dr McLean. "With some low-cost engineering and operational tweaks, industry and science can use ROVs to fuel new scientific discoveries. For instance, we could better understand the influence of structures such as platforms and pipelines in marine ecosystems - to the mutual benefit of the resource company and the environment."

The new research follows an earlier study that used adapted underwater vehicles to examine fish populations around a platform on the North West Shelf, 138 km offshore from Dampier.

In May this year, the AIMS team is set to extend the study, working with Santos to use ROVs to survey marine life around shallow water platform jackets.

The craft are routinely used to inspect thousands of industrial subsea structures around the world each year. They operate in shallow water, and at depths down to 3000 metres.

McLean, a fish ecologist and specialist in underwater video systems, and Parsons, an acoustics expert, teamed up with colleagues in Australia, the US, England and Scotland to identify feasible, cost-effective ways in which standard work-class ROVs could be adapted to expand their data-gathering capabilities.

These include the addition of extra sensors, cameras, acoustic transmitters and receivers, and sample collection devices.

"By partnering with experienced research scientists, industry can improve the quality of its ROV-derived data," says Dr Parsons.

Dr McLean said that the extra information, and the spirit of cooperation through which it was gathered, could be particularly useful when it came to complex engineering and environmental management challenges such as decommissioning large structures at the end of their working lives.

"From an industry point of view," she said, "these small additions to ROVs and their use for scientific surveys has the potential not only to improve environmental management, but also to facilitate more informed engagement with external stakeholders such as regulators and the public."

The research shows that small enhancements to the vehicles and how they are used now could provide substantial benefits to science and to resource companies in the long-term.

Credit: 
Australian Institute of Marine Science

Speeding-up quantum computing using giant atomic ions

image: Duality of trapped ion and Rydberg atom quantum technologies. Trapped Rydberg ions combine the key strengths of two very different quantum processors, trapped ion (above) and Rydberg atom (below), in one technology. This technology has the potential to speed up trapped ion quantum computers.

Image: 
Illustration by Elsa Wikander/Azote.

Trapped Rydberg ions can be the next step towards scaling up quantum computers to sizes where they can be practically usable, a new study in Nature shows.

Different physical systems can be used to make a quantum computer. Trapped ions that form a crystal have led the research field for years, but when the system is scaled up to large ion crystals this method gets very slow. Complex arithmetic operations cannot be performed fast enough before the stored quantum information decays.

A Stockholm University research group may have solved this problem by using giant Rydberg ions, 100 million times larger than normal atoms or ions. These huge ions are highly interactive and, therefore, can exchange quantum information in less than a microsecond.

"In a sense, Rydberg ions form small antennas for exchanging quantum information and thus make it possible to realize particularly fast quantum gates, which are the 'basic building blocks' of a quantum computer", explains Markus Hennrich, Department of Physics, Stockholm University, and group leader from the Stockholm University team. "The interaction between Rydberg ions is not based on crystal vibrations, as with ions trapped in crystal form, but on the exchange of photons. The fast interaction between the Rydberg ions can be used to create quantum entanglement."

"We used this interaction to carry out a quantum computing operation (an entangling gate) that is around 100 times faster than is typical in trapped ion systems", explains Chi Zhang, researcher at the Department of Physics, Stockholm University.

Theoretical calculations supporting the experiment have been conducted by Igor Lesanovsky and Weibin Li at University of Nottingham, UK and University of Tübingen, Germany.

"Our theoretical work confirmed that there is indeed no slowdown expected once the ion crystals become larger, highlighting the prospect of a scalable quantum computer", says Igor Lesanovsky from University of Tübingen.

Quantum computers are regarded as one of the key technologies of the 21st century. While conventional computers function according to the laws of classical physics, quantum computers work according to the rules of quantum mechanics. The ability of entangled quanta to exchange information without time delay makes them very fast and powerful. In the future, quantum computers could be used wherever complex calculations need to be solved, for example in the design of new medications or in artificial intelligence.

Credit: 
Stockholm University

Future dynamics prediction from short-term time series by anticipated learning machine

image: (a) The general principle of Anticipated Learning Machine (ALM). The observed attractor, a delay attractor and sampled nondelay attractors are all topologically conjugate with each other. Each sampled nondelay attractor preserves the dynamical information of the system in different ways. By integrating the information contained in these sampled nondelay attractors, we could find an accurate one-to-one map even under noise deterioration.
(b) Anticipated Learning Machine. For each future value, those maps are co-trained into a unified map Ψ. When the maps are trained, the weighted sum is used as the prediction. The predicted value is then used as the label when training other maps to predict the next time point. Clearly, ALM Ψ transforms spatial input X(tm) to temporal output Z(tm) at each point tm.

Image: 
©Science China Press

Making accurate predictions from observed data, in particular from short-term time series, is a central concern in disciplines ranging from molecular biology, neuroscience, geoscience and economics to atmospheric sciences, owing to limited data availability or time-variant non-stationarity. However, most existing methods require sufficiently long time-series measurements or a large number of samples, and no effective method has been available for prediction from short-term time series alone, because of the lack of information.

To address this issue, Prof. CHEN Luonan (Institute of Biochemistry and Cell Biology, Chinese Academy of Sciences), with Dr. CHEN Chuan (Sun Yat-sen University), Prof. MA Huanfei (Soochow University) and Prof. AIHARA Kazuyuki (University of Tokyo), proposed a new dynamics-based, data-driven method, the Anticipated Learning Machine (ALM), for achieving precise future-state predictions from short-term but high-dimensional data. The ALM is a multi-layered neural network in which high-dimensional variables serve as input neurons (multiple variables at a single time point) and a target variable serves as output neurons (a single variable at multiple time points). In this way, ALM transforms the recent correlation/spatial information of high-dimensional variables into future dynamical/temporal information of any target variable, i.e., via spatial-temporal information (STI) transformation equations.

Specifically, ALM can be trained to represent the randomly distributed embedding (RDE) map for the STI equations on a large number of generated training samples, using the Dropout scheme and the proposed consistent-training scheme, thus predicting the target variable accurately and robustly even from short-term data. Extensive experiments on short-term, high-dimensional data from both synthetic and real-world systems demonstrated the significantly superior performance of ALM over existing methods.
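The core spatial-to-temporal mapping can be sketched in a few lines. This is a deliberately simplified illustration of the STI idea, not the authors' method: a single linear least-squares map stands in for the multi-layered neural network with Dropout and consistent training, and the coupled-map system is a made-up example:

```python
import numpy as np

# Toy sketch of the spatial-temporal information (STI) transformation:
# map a high-dimensional snapshot X(t) (spatial input) to L future values
# of one target variable z (temporal output).
rng = np.random.default_rng(0)
D, T, L = 10, 60, 5              # variables, training time points, horizon

# Simulate a coupled logistic-map system (synthetic, for illustration only)
A = (rng.random((D, D)) < 0.2) * 0.1   # sparse, weak coupling matrix
x = rng.random(D)
traj = []
for _ in range(T + L):
    x = np.clip(3.7 * x * (1 - x) + A @ np.sin(x), 0.0, 1.0)
    traj.append(x.copy())
traj = np.array(traj)            # shape (T + L, D)

# Training pairs: snapshot X(t) -> next L values of target z = traj[:, 0]
X = traj[:T]                                          # (T, D) spatial input
Y = np.stack([traj[t + 1:t + 1 + L, 0] for t in range(T)])  # (T, L) output

# Fit one linear STI map by least squares (with a bias column); ALM instead
# co-trains many embedding maps into a unified neural-network map.
Xb = np.hstack([X, np.ones((T, 1))])
W, *_ = np.linalg.lstsq(Xb, Y, rcond=None)

pred = Xb @ W                    # L-step-ahead predictions for the target
print("training RMSE:", float(np.sqrt(np.mean((pred - Y) ** 2))))
```

Even this crude linear stand-in shows the shape of the problem: many variables at one time point go in, one variable at many future time points comes out, which is why short but high-dimensional records can suffice.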

Compared with traditional neural networks (or other machine-learning approaches), which excavate the historical statistics of the original high-dimensional system and thus require a large number of samples, ALM efficiently and robustly reconstructs the dynamics even from a small number of samples by constraining them to a low-dimensional space, an inherent property of such dissipative systems. By using nonlinear dynamics to transform the spatial information of all the measured high-dimensional variables into the temporal evolution of the target variable through the learned STI equations, ALM opens a new way toward dynamics-based machine learning, or "intelligent" anticipated learning.

"How to account for the strong nonlinearity and/or stochasticity of dynamical systems together with noisy observed data, and how to conduct more in-depth theoretical analysis and develop an appropriate framework that takes these issues into consideration, remain open and interesting problems for the future," the authors state.

Credit: 
Science China Press

Breastfeeding may lead to fewer human viruses in infants

PHILADELPHIA - Even small amounts of breastmilk strongly influence the accumulation of viral populations in the infant gut and provide a protective effect against potentially pathogenic viruses, according to researchers who examined hundreds of babies in a study from the Perelman School of Medicine at the University of Pennsylvania.

The findings expand upon prior research that suggests that breastfeeding plays a key role in the interaction between babies and the microbial environment. This latest research could influence strategies for the prevention of early gastrointestinal disorders, and encourage mothers to feed babies breastmilk even when mixed with formula. The findings are published in Nature.

Penn researchers measured the numbers and types of viruses in the first stool -- meconium -- and subsequent stools of newborns in the United States and Botswana using advanced genome sequencing and other methods. Upon delivery, babies had little or no colonization, but by one month of life populations of viruses and bacteria were well developed, with numbers of viruses reaching a billion per gram of gut contents. Most of the first wave of viruses turned out to be predators that grow in the first bacteria that colonize the infant gut. Later, at four months, viruses that can replicate in human cells and potentially make humans sick were more prominent in the babies' stools. A strong protective effect was seen for breastfeeding, which suppressed the accumulation of these potentially pathogenic viruses. Similar results were seen for infants from the US and Botswana. Another conclusion from this work was that breastmilk could be protective even when sometimes mixed with formula, compared to a formula-only diet.

"These findings can help us better understand why some babies get sick and develop life-threatening infections in their first months of life," said senior author Frederic Bushman, PhD, chair of the department of Microbiology.

The newborns' home country also played a part in the prevalence of viral infections. Babies from Botswana were more likely to have those potentially-harmful viruses in their stools at that 4-month mark compared to the stools of babies from the US.

"Location of the mom and baby seems to play a role, probably due to the kind and number of microorganisms babies are exposed to environmentally," said first author Guanxiang Liang, PhD, a postdoctoral researcher in the department of Microbiology. "Nevertheless, Botswana-born babies still seemed to benefit from breastfeeding, whether exclusive or in addition to formula consumption."

In the future, Bushman and Liang want to look at varying ages to see how development of the virome -- the virus population in the gut -- influences a child's growth, how virome colonization varies in infants around the world, and how virome colonization influences outcomes in preterm birth.

Credit: 
University of Pennsylvania School of Medicine