Oncotarget: Preoperative geriatric nutritional risk index is a useful prognostic indicator

Image: Cancer-specific survival curves based on GNRI according to pTNM stage. (A) pTNM stage I (n = 171), (B) pTNM stage II (n = 53), (C) pTNM stage III (n = 73).

Correspondence to - Noriyuki Hirahara - norinorihirahara@yahoo.co.jp

The cover for issue 24 of Oncotarget features Figure 4, "Cancer-specific survival curves based on GNRI according to pTNM stage," by Hirahara et al.

Volume 11, Issue 24 of Oncotarget reported that this study aimed to evaluate the relationship between the preoperative Geriatric Nutritional Risk Index (GNRI) and long-term outcomes in elderly gastric cancer patients.

In the univariate analyses, overall survival (OS) was significantly associated with American Society of Anesthesiologists Physical Status (ASA-PS), tumor size, tumor differentiation, pathological stage, carcinoembryonic antigen (CEA), C-reactive protein, postoperative complications, and GNRI. In the univariate analyses of cancer-specific survival (CSS), ASA-PS, tumor size, tumor differentiation, pathological stage, CEA, postoperative adjuvant chemotherapy, and GNRI were significantly associated with poor prognosis.

In the multivariate analysis, ASA-PS, tumor differentiation, pathological stage, and GNRI were significant independent prognostic factors of OS, whereas ASA-PS, pathological stage, and CEA were significant independent prognostic factors of CSS. The GNRI is significantly associated with both OS and CSS in elderly gastric cancer patients and is an independent predictor of OS. It is a simple, cost-effective, and promising nutritional index for predicting OS in elderly patients.

Dr. Noriyuki Hirahara from the Department of Digestive and General Surgery at Shimane University Faculty of Medicine said, "The tumor–node–metastasis (TNM) staging has been the global standard for estimating cancer cell dissemination."

The impact of the nutritional status on the outcome of cancer patients has been intensively studied in recent years, and several assessment tools have been proposed for nutritional screening.

However, the usefulness of these tools has not been fully evaluated in elderly patients.

The geriatric nutritional risk index was developed as a simple and objective nutritional assessment tool for hospitalized elderly patients based on their body weight and serum albumin level.
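
For concreteness, the index can be computed from just those two inputs. The sketch below uses the commonly cited Bouillanne formulation of the GNRI together with the Lorentz estimate of ideal body weight; the formula and the example values are illustrative and are not taken from the Hirahara study itself. Lower scores indicate higher nutritional risk.

```python
def lorentz_ideal_weight_kg(height_cm: float, is_male: bool) -> float:
    """Lorentz formula, one common way to estimate ideal body weight."""
    if is_male:
        return height_cm - 100 - (height_cm - 150) / 4
    return height_cm - 100 - (height_cm - 150) / 2.5

def gnri(albumin_g_per_l: float, weight_kg: float, ideal_weight_kg: float) -> float:
    """Geriatric Nutritional Risk Index (Bouillanne et al. formulation)."""
    weight_ratio = min(weight_kg / ideal_weight_kg, 1.0)  # capped at 1 when weight >= ideal
    return 1.489 * albumin_g_per_l + 41.7 * weight_ratio

# Hypothetical example values, not taken from the study:
ideal = lorentz_ideal_weight_kg(height_cm=165, is_male=True)
print(round(gnri(albumin_g_per_l=38, weight_kg=55, ideal_weight_kg=ideal), 1))  # ~94.0
```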

"The geriatric nutritional risk index was developed as a simple and objective nutritional assessment tool for hospitalized elderly patients based on their body weight and serum albumin level"

Therefore, the authors believe that the Geriatric Nutritional Risk Index accurately reflects the nutritional status of elderly cancer patients who are at risk of malnutrition because of their physiological frailty and vulnerability.

The principal aim of this study was to evaluate the prognostic significance of the preoperative Geriatric Nutritional Risk Index for estimating the postoperative outcomes of elderly gastric cancer patients.

The Hirahara research team concluded in their Oncotarget paper that, in Japan's aging society, an individualized treatment strategy for gastric cancer is indispensable because many elderly patients die of diseases other than their cancer.

Recently, sarcopenia has been reported to affect the incidence of adverse events with chemotherapy and the continuation of treatment, leading to a worse prognosis.

Sarcopenia, the age-related loss of skeletal muscle mass and strength, was identified based on cross-sectional computed tomography images at the L3 level.

However, the Geriatric Nutritional Risk Index can be easily calculated from routine laboratory data and physical measurements.

The clinical significance of GNRI, as an indicator of OS, will be increasingly important in the future.

Credit: 
Impact Journals LLC

Children with developmental disabilities more likely to develop asthma

Children with developmental disabilities or delay are more at risk of developing asthma, according to a new study published in JAMA Network Open led by public health researchers at The University of Texas Health Science Center at Houston (UTHealth) as part of the Center for Pediatric Population Health.

Using data from the 2016-2017 National Survey of Children's Health, the team examined 71,811 families with children aged 0 to 18. For the survey, parents were asked if their child had an asthma diagnosis, as well as one or more developmental disabilities - including behavioral disorders, motor disabilities, attention deficit hyperactivity disorder, vision impairment, hearing impairment, speech disabilities, cognitive disabilities, or an unspecified developmental delay.

The researchers discovered asthma prevalence was significantly higher in children with disabilities compared to their typically developing peers. Children with hearing loss had the greatest likelihood of having asthma, followed by those with cerebral palsy, and children with a learning disability. The study team also noted that ethnic minority children had higher odds of dual asthma/disability diagnoses compared to their non-Hispanic white peers.

"This research has shown that it's not just clinicians or pediatricians that should be aware that children with disabilities and delays may also have other health problems. It's also schools, after-school programs, and other communitywide programs," said Sarah Messiah, PhD, MPH, the study's senior author and professor of epidemiology, human genetics, and environmental sciences at UTHealth School of Public Health in Dallas. "It's equally important to understand these children may not always be able to communicate their discomfort, especially when it comes to asthma."

Messiah is director of the Center for Pediatric Population Health, a research collaboration between UTHealth School of Public Health and Children's Health. First author of the article is Luyu Xie, PharmD, of the School of Public Health and the Center for Pediatric Population Health.

In the U.S., more than 6 million children are diagnosed with asthma and nearly half have missed school due to the condition. Missing school has been linked to poor academic performance, which is often seen as a hallmark of a learning disability. The research illustrated the challenges of properly meeting the educational needs of students with a co-occurring diagnosis, and the role of adequate disease management and its link to academic success.

"Both asthma and disabilities in children are important determinants of school absenteeism, with the subsequent risk of educational delays. Asthma, when detected early and managed early, can lessen the impact it has on quality of life and missed school days," said George Delclos, MD, MPH, a co-author and professor of epidemiology, human genetics, and environmental sciences at the School of Public Health.

While current pediatric guidelines do not list a disability or delay as a risk factor for asthma, the study team suggests their findings could lead to more discussion of challenges children with asthma and a disability diagnosis face, and help to bridge the gap between their health care needs and increasing their quality of life.

"These results support advising pediatricians to screen for asthma in children with disabilities, so that interventions can be started sooner. This screening is particularly important to conduct in ethnic minorities with disabilities, given their even greater risk," said Delclos, Marcus M. Key, M.D. - Shell Occupational and Environmental Health Endowed Chair.

Credit: 
University of Texas Health Science Center at Houston

Including patients in hospital discharge communication would improve outcomes of care

Study by University of Warwick researchers shows UK patients are keen to receive discharge letters when they leave hospital

UK hospitals send discharge letters to GPs as part of care handover, but the practice of copying in patients is inconsistent

Better information for patients following discharge from hospital could improve the outcomes of care, including avoiding prescribing errors and potentially helping prevent readmissions

Sending discharge letters to patients as well as their GPs when they leave UK hospitals could make a substantial difference to patient outcomes, according to a new study by University of Warwick researchers.

Most patients are in favour of receiving the letters, according to interviews conducted with patients, which also highlighted that improvements in their content are needed to make them more useful.

The study, which forms part of a wider project into communications around the discharge of patients from secondary care, is published today in BMC Health Services Research by researchers from Warwick Medical School and the Centre for Applied Linguistics at the University of Warwick.

Discharge letters are produced by hospital clinicians when patients leave hospital following inpatient or outpatient care, and are sent to their GP as part of the handover process when they return to the community.

Lead author Dr Katharine Weetman, an IAS fellow from Warwick Medical School at the University of Warwick, said: "If a patient has been treated at hospital and they're being discharged, it means that something important has happened, such as a new diagnosis, a change in medication or new advice that they should follow. There are patient safety implications in terms of how such information is communicated, shared and managed in order that both they and their GPs are fully informed."

Although there are national guidelines, each hospital trust can have its own discharge policy so there is considerable variation in how discharge letters are managed. Hospitals can vary the template, content, who writes it (i.e. a consultant, junior doctor, nurse, etc.), the process for sending it out, whether it's paper or electronic format, and particularly whether or not they share it with patients. The NHS Plan (2000) and 'Copying letters to patients: good practice guidelines' (2003) by the Department of Health, as well as other initiatives and guidelines since then, encourage sending patients their discharge letters as good practice, but it's not specifically mandated or standardised.

For the study, the researchers spoke to 50 patients who had a recent experience in an inpatient, outpatient, accident and emergency department or day surgery setting, though most related to inpatient stays. 88% indicated that they would like to receive a discharge letter (compared to 64% who reported having received one), and 62% were specifically in favour of receiving a copy of the discharge letter that their GPs receive, as opposed to a personalised letter. Several gave examples of how that would have improved their experience.

Dr Weetman said: "Patients described issues like not knowing what their treatment plan is, not knowing what they need to do when they get home, when follow-up appointments would be occurring, or what their medication or exercise regime is. If they don't know what's happening then it can cause a great deal of anxiety, a likelihood that important advice and guidance may not be followed, and so affect their future health, wellbeing and use of NHS resources.

"Where they did receive discharge letters, patients wanted to see less technical jargon or acronyms, which are confusing to patients and may be unfamiliar to GPs too. They felt information should be organised into relevant, clearly labelled sections.

"Past research has shown that if patients or GPs aren't informed properly then adverse outcomes may occur, such as patients being readmitted to hospital because vital advice in the letter such as changes in medication or the need for follow-up blood tests has been overlooked."

An earlier study by the team found that GPs were also generally supportive of patients receiving their letters, citing reasons such as enabling patients to be more directly involved in their care and helping to ensure that any follow-up happens even if the GP has not received or acted on the advice in the letter for whatever reason.

The researchers also stress that it is important to maintain good practice in discharge communications even in a crisis such as the COVID-19 pandemic, when patient safety and avoiding additional pressure on the NHS are even more vital.

Professor Jeremy Dale, Head of the Unit of Academic Primary Care at Warwick Medical School and a GP in Coventry commented: "This study demonstrates the importance of the NHS empowering patients at all stages of their care with relevant information and advice. Patients want clear written information about the investigations, treatment and advice that they have received in hospital, together with the actions that they and their GPs are now being advised to follow. As the NHS rebuilds its services following the impact of COVID-19, the findings from this research indicate the importance of addressing this issue in order to ensure that NHS resources are used efficiently with the risks of error following discharge being minimised."

Dr Weetman adds: "Good communication doesn't become obsolete in a pandemic, it becomes more important. Whether they have received face-to-face care or a video or phone consultation with a specialist, people need to know what's going on, what they are diagnosed with, what is planned and what the next steps are, what they need to be doing to be as safe and healthy as they can be. In this way, they can optimise the beneficial impact of hospital care."

Credit: 
University of Warwick

Endogenous insulin production is preserved in Type 1 diabetes with anti-TNF drug

Image: Quattrin, UB Distinguished Professor of pediatrics and senior associate dean for research integration at the Jacobs School of Medicine and Biomedical Sciences at the University at Buffalo, led the study.

Image credit: Sandra Kicman, University at Buffalo

BUFFALO, N.Y. - In research led by a University at Buffalo pediatric endocrinologist, the drug golimumab preserved beta-cell function in children and young adults with newly diagnosed Type 1 diabetes, according to findings from a Phase 2 study.

The study also demonstrated that golimumab, an anti-tumor-necrosis-factor (TNF) therapy, reduced the amount of injected insulin required by children and young adults with newly diagnosed Type 1 diabetes by preserving their ability to produce insulin on their own, called endogenous insulin. The World Without Disease Accelerator, through Janssen Research & Development, LLC, funded the study.

Golimumab, marketed as Simponi®, is currently used in the treatment of rheumatoid arthritis, ulcerative colitis and other autoimmune conditions; however, it is not approved by the U.S. Food and Drug Administration for the treatment of Type 1 diabetes.

The findings were presented on June 13 at the annual meeting of the American Diabetes Association by the lead investigator, Teresa Quattrin, MD, UB Distinguished Professor in the Department of Pediatrics, senior associate dean for research integration in the Jacobs School of Medicine and Biomedical Sciences at UB and attending pediatric endocrinologist at the Diabetes Center at UBMD Pediatrics and John R. Oishei Children's Hospital.

"This study shows that golimumab is a potential disease-modifying agent for newly diagnosed patients with Type 1 diabetes," said Quattrin. "The main goal of the study was to see if golimumab could preserve beta-cell function in these newly diagnosed patients."

Measuring how well the pancreas is working

This was assessed by measuring the amount of C-peptide in patients' blood during a four-hour mixed meal tolerance test. Because C-peptide reflects only insulin made by the body and not injected insulin, C-peptide levels reveal how well the pancreas is producing insulin.

Patients treated with golimumab had a higher C-peptide level at week 52 compared to placebo. "This was statistically significant, thus the study met its primary goal," Quattrin said. "In fact, 41.1% of participants receiving golimumab had an increase or a less than 5% decrease in C-peptide, compared to only 10.7% in the placebo group."

Good control with less insulin

Nearly 43% of those who received golimumab were in partial diabetes remission (also known as the honeymoon phase) versus 7.1% of those receiving placebo. The definition of partial remission was based on insulin dose and blood sugar control levels as indicated by hemoglobin A1C, a measurement of average blood sugar levels over three months.

Quattrin explained that a child with Type 1 diabetes requires about 1 unit of insulin per kilogram of body weight per day. That means that a child weighing about 65 pounds typically requires about 30 units of injected insulin per day once they are out of the partial remission period, about 3-6 months after diagnosis.
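
As a quick check of that rule of thumb, the conversion below is simple unit arithmetic, not data from the trial:

```python
LB_PER_KG = 2.2046  # pounds per kilogram

def daily_insulin_units(weight_lb: float, units_per_kg: float = 1.0) -> float:
    """Rule-of-thumb daily insulin need (~1 unit per kg of body weight per day)."""
    return weight_lb / LB_PER_KG * units_per_kg

print(round(daily_insulin_units(65), 1))  # ~29.5 units/day for a 65 lb child, close to the ~30 quoted
```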

"In this study, both golimumab and placebo groups achieved good blood sugar control, but patients treated with golimumab achieved it with less insulin," said Quattrin. "During the 52 weeks, insulin dose increased only slightly for those on golimumab, 0.07 units per kilogram per day, versus 0.24 units per kilogram per day for those on placebo study. Moreover, in a post-hoc analysis, an analysis conducted after the conclusion of the clinical trial, those who were younger than 18 years had 36% fewer episodes where blood sugar was less than 54 mg per deciliter, designated by the American Diabetes Association as level 2 hypoglycemia," Quattrin said.

This is important clinically because low blood sugar reactions are dangerous and can even be fatal if untreated. Low blood sugars require immediate attention, often causing the child to be removed from class or recreation activities, compromising quality of life.

The drug is self-administered as a subcutaneous injection every 2 weeks. No serious side effects related to the study drug, such as serious infections, were reported.

The randomized, controlled clinical trial was conducted at 27 centers throughout the U.S., including at the Diabetes Center at UBMD Pediatrics and Oishei Children's Hospital in Buffalo. It involved 84 patients, aged 6 to 21 years, with two-thirds receiving golimumab and one-third receiving placebo starting within 100 days from diagnosis.

Throughout three decades as a leading researcher in pediatric endocrinology, Quattrin has been interested in finding ways to preserve beta-cell function in newly diagnosed patients with Type 1 diabetes.

The current study took place on the basis of positive findings in animal models, as well as Quattrin's work with patients treated at the Diabetes Center at UBMD Pediatrics and Oishei Children's Hospital. It confirms results published by her team in 2009, when, in a randomized pilot study, 10 patients received another TNF inhibitor and 8 received placebo starting within 28 days of diagnosis. The results of that small proof-of-concept study strongly suggested that this class of drugs might be able to preserve beta-cell function in newly diagnosed patients with Type 1 diabetes.

Credit: 
University at Buffalo

Study shows how caring responsibilities affect health and restrict ability to work

New research from the University of Southampton has highlighted inequalities faced by men and women over the age of fifty with caring responsibilities. As well as being more likely to be socio-economically disadvantaged, carers in this age group are more likely to experience problems with their mental and physical health than people who do not provide any care.

The study was led by the University's Medical Research Council (MRC) Lifecourse Epidemiology Unit. The team analysed data from more than 8,000 men and women who took part in the MRC's Health and Employment after Fifty (HEAF) Study.

Life expectancy in Europe is increasing and, combined with falling birth rates, has changed the shape of populations, with growing proportions of older people relative to those of working age. An ageing population also increases the demand for informal care. Whilst such care can save governments enormous health and social care costs, the research team wanted to examine its potential to reduce national economic productivity if it comes at the expense of the quality or quantity of work that those who provide care are able to carry out.

The findings of this research, published in the European Journal of Public Health, showed that nearly a fifth of men and over a quarter of women reported some form of caring responsibility. Those providing the highest levels of care were more likely to be disadvantaged in terms of both social class and education level compared to those without any caring responsibilities. Caregivers were also more likely to be unemployed or retired, and amongst those who do work, were more likely to be working part-time or to work shifts.

When they considered health outcomes in care providers, the team found that those who provided care for more than 20 hours per week were more likely to suffer from chronic obstructive pulmonary disease (COPD) and to experience musculoskeletal pain, depression and sleep problems.

Professor Karen Walker-Bone, Director of the MRC Versus Arthritis Centre for Musculoskeletal Health and Work at the University of Southampton, who led the study, said: "This study has shone new light on the disadvantages faced by those who have to care for their friends or family members, and the significant impact of caring on their own health and ability to work. Whilst governments have increased the state retirement age to encourage people to work longer, many have to drop out of the workplace in order to provide care to those who need it. It is vital that employers and Government make sure that sufficient support is available to carers so that they can remain healthy and productive and do not end up requiring care themselves."

Credit: 
University of Southampton

Shhhh, the whales are resting

Video: Drone video from the experiment shows a humpback whale mother and calf changing behavior when exposed to loud boat noise.

Video credit: KR Sprogis, S. Videsen, PT Madsen

We need new guidelines to shield whales from human-made noise and ensure them some peace and quiet. It is no good keeping whale-watching boats out of whales' sight if it is the noise from the boats' engines that disturbs the whales most. And whales can hear the boats' engines from far away, according to a Danish-Australian research team.

Whale-watching has become a multi-billion dollar business, and companies want to give passengers the best possible experience by positioning their boats close to the whales.

Public authorities around the globe have set restrictions on whale-watching boats in order to protect whales. For example, some countries require boats to keep a distance of at least 100 metres from the whales, or require them to stay behind or next to the whales at slow speed. However, scientific studies have shown that even when boats keep to these restrictions, the whales are still disturbed and change behaviour:

They dive, change course, swim faster, breathe more often, disperse and may make different sounds compared to usual.

Now, a team of researchers from Aarhus University in Denmark believe they have found an explanation: the engines in some of the boats are simply too loud. And authorities can now set noise emission standards to address this.

"Unlike humans, the dominant sense in whales is not sight - it is hearing. As such, a whale may not be able to see a whale-watch boat at 100 meter distance, but they are likely to hear it, so it makes sense to consider this when stipulating whale-watching guidelines," says Australian Kate R. Sprogis, biologist and Marie Sk?owdowska-Curie Fellow at Aarhus University.

Kate R. Sprogis led a team of researchers who, using underwater speakers and drone cameras, ran experiments to find out how much boat noise it takes for humpback whales to change their behaviour. The results have been published in the scientific journal eLife.

The experiments were carried out in Exmouth Gulf on the west coast of Australia, which has the largest population of humpback whales in the world. Exmouth Gulf is a resting area where the whales rely on their fat deposits during winter as they are not feeding, and where their calves suckle to become strong enough for the migration to the feeding ground in colder waters during summer.

Using a drone, the researchers would search for a mother and her calf and position their boat 100 metres away from them, as in a whale-watching scenario, whilst playing boat-engine noise at different levels. They completed a total of 42 controlled exposure experiments of this type and recorded all behavioural movements of the whales from the aerial perspective with the drone's camera.

At the highest boat-noise level of 172 decibels (a loud boat), at a distance of 100 metres from the whales, they found that the resting time of whale mothers dropped by 30%, their breathing rate doubled and their swimming speed rose by 37%.

However, they often returned to a state of resting when the boat noise moved further away.

Despite this, other research shows that repeated disturbance from humans can have long-term consequences for the whales: When whale mothers spend a great deal of energy negatively responding to underwater noise, they have less energy to feed their offspring, avert predators and unwanted males, and to migrate to their polar feeding ground to fatten up.

"For the calves, multiple disturbances can also mean that they don't get enough milk: In a short time, they have to grow big and strong enough to be able to cope with the migration to colder regions and to minimise the risk of being predated upon by sharks and killer whales," explains Kate R. Sprogis.

The researchers concluded that the noise level from a boat's engine should stay below 150 decibels (or, more specifically, 150 dB re 1 μPa RMS) to avoid impacting the humpback whales' behaviour.
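
Because the decibel scale is logarithmic, the gap between the loudest playback (172 dB) and the recommended ceiling (150 dB) corresponds to a large difference in acoustic intensity. The conversion below is generic acoustics arithmetic, not a calculation from the paper:

```python
def intensity_ratio(db_a: float, db_b: float) -> float:
    """Ratio of sound intensities implied by a decibel difference (10 * log10 scale)."""
    return 10 ** ((db_a - db_b) / 10)

print(round(intensity_ratio(172, 150)))  # the loud-boat playback is roughly 158x more intense
```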

Now, the researchers recommend introducing this noise emission standard, to ensure that loud boats are not disturbing whales unnecessarily.

They note that a number of whale-watching boats are already quiet and therefore comply with the recommendation.

Credit: 
Aarhus University

Newly discovered plant gene could boost phosphorus intake

Researchers from the University of Copenhagen have discovered an important gene in plants that could help agricultural crops collaborate better with underground fungi--providing them with wider root networks and helping them to absorb phosphorus. The discovery has the potential to increase agricultural efficiency and benefit the environment.

It is estimated that about 70 percent of phosphorus fertilizer used in Danish agriculture accumulates in soil, whereas only 30 percent of it reaches plants.

Quid pro quo--that's how one might describe the "food community" that the majority of plants have with mycorrhizal fungi. Plants allow fungi to live among their roots, while feeding them fat and sugar. And in return, fungi use their far-reaching hyphae (filamentous branches) to capture vital soil nutrients for plants, including the important mineral phosphorus.

Now, researchers at the University of Copenhagen's Department of Plant and Environmental Sciences have discovered an extraordinary plant gene, the CLE53 gene, which regulates cooperation between fungi and plants. The gene is central to a mechanism that controls how receptive plants are to working with mycorrhizal fungi. Down the road, this newfound knowledge could serve to deliver better harvests and reduced fertiliser use.

"Similar genes are found in all plants--including agricultural crops. So, by mutating or turning off the CLE53 gene in a crop plant, it is more likely for a plant to become symbiotically involved with a fungus. In doing so, it becomes possible to reduce the need for phosphorus fertilizers, as plants improve at absorbing preexistent phosphorus from soil," explains Assistant Professor Thomas Christian de Bang of the Department of Plant and Environmental Sciences.

The research has been published in the Journal of Experimental Botany.

Seventy percent of phosphorus fertilizer does not reach plants

Phosphorus is vital for all plants. However, the problem with phosphorus use in agriculture is that more of it is applied for fertilisation than can be absorbed by crops. It is estimated that about 70 percent of phosphorus fertilizer used in Danish agriculture accumulates in soil, whereas only 30 percent of it reaches plants. With rain, there is an ever present risk that some of the accumulated phosphorus will be discharged into streams, lakes and the sea.

Paradoxically, researchers have observed that when phosphorus levels in soil are high, plants are less likely to collaborate with fungi, meaning that they become worse at absorbing nutrients.

"Through a range of experiments, we have demonstrated that a plant does not produce the CLE53 gene if it lacks phosphorus. However, when the phosphorus levels in a plant are high, or if the plant is already symbiotically involved with a fungus, then the level of CLE53 increases. Our study demonstrates that CLE53 has a negative effect on a plant's ability to enter into symbiosis with a fungus, and thereby absorb phosphorus most effectively," says Thomas Christian de Bang.

Requires CRISPR approval

The genomic editing of plants is legal in a number of non-EU countries--e.g., China, the US, Switzerland and the UK. However, within the EU, there is no general acceptance of gene-editing methods, such as CRISPR, to alter plants and foodstuffs.

Therefore, the researchers' discovery has, for the time being, a poorer chance of being used in Denmark and the rest of the EU.

"One can use the technology in other parts of the world, and getting started would be relatively straightforward. My guess is that within five years, plants will be tested and refined in such a way that they become more symbiotically involved with fungi and absorb more phosphorus. Here in Denmark and throughout the EU, an acceptance is required for gene editing and an amended approach to approval procedures for these types of plants," says Thomas Christian de Bang.

Facts:

90% of all plants engage in symbiotic relationships with mycorrhizal fungi, which, popularly speaking, extend the root networks of plants, thus helping them to obtain enough phosphorus, water and other nutrients.

In order to benefit from the ability of mycorrhizal fungi to extract phosphorus from soil, a plant must feed the fungus with fat and sugar. To avoid spending too much energy on the fungus - for example, if the plant is experiencing high phosphorus levels or has already been colonised by a fungus - it may switch off the symbiosis.

It is estimated that Danish farms fertilise with roughly 30 kilos of phosphorus per hectare of land.

Of this, roughly 30 percent makes its way to crops, while the remaining 70 percent binds to soil.

With rain, some of this accumulated phosphorus is flushed away via surface runoff, into nearby streams, lakes and the sea. This increases algae growth and can kill both plants and wildlife.

Phosphorus is a finite natural resource, one that is expected to eventually be depleted.

The research is funded by the Novo Nordisk Foundation and the University of Copenhagen.

Previous research has shown that a similar mechanism exists for symbiosis between legumes and rhizobium bacteria. This involved a CLE gene as well, albeit a different one than the researchers have now discovered.

Credit: 
University of Copenhagen

Strangely ordinary strata

The Torridon sandstone in northwestern Scotland preserves six kilometers of river sediment from Precambrian times. But what sort of geological events were able to leave their mark for researchers to find 1 billion years later?

Intriguingly, it was not great floods or dramatic course changes -- mostly just the regular crawl of sand dunes across the river bottom. In fact, only a few months' worth.

This ordinariness of river deposits, or fluvial strata, has perplexed geologists for the better part of a century. Given just how little of a river's history gets preserved, researchers find it odd that records of the commonplace predominate, rather than evidence of the most extreme events. New research published in the journal Geophysical Research Letters, reveals the processes that may finally explain this enigma.

The study led by Vamsi Ganti, an assistant professor of geomorphology at UC Santa Barbara, touches on one of the longest running debates in the field of geology: catastrophism versus uniformitarianism. That is, whether the geologic record tends to be influenced more by large, infrequent events or by small but common occurrences.

When it comes to river deposits, catastrophism has a pretty intuitive argument. "If the probability that any event is preserved is low, then what is preserved should be somehow special," Ganti explained. However, scientists find this simply isn't true, even though less than 0.0001% of elapsed time is preserved.

"That's the reason that we call this the strange ordinariness of fluvial strata," said Ganti, "because it is strange that preserved events are so ordinary even though the time preservation is so extraordinary."

River morphology tends to self-organize into a hierarchy of levels, which Ganti and his colleagues believed was the key to understanding this strange ordinariness. Ripples and dunes move across river bottoms on the order of minutes and hours. The movement of sand bars happens over months and years, while rivers meander and jump their banks over years and centuries. At the most extreme end, sea level changes can accelerate erosion or promote sedimentation over the course of millennia.

Fortunately, scientists understand how each of these phenomena appear in the stratigraphic record based on modern observations. It turns out that these features vary in size from inch-high ripples to sea-level induced erosion that can scour hundreds of meters of sediment.

Ganti and his colleagues built a probabilistic model to test their hypothesis. They found that if all river processes happen at the same scales, only the most extreme events get preserved. However, as soon as they introduced a hierarchy, sediment from ordinary processes began filling in the erosion caused by phenomena one level higher.
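
The fill-in mechanism can be pictured with a toy "stack" column in which deposits add layers and erosion strips layers from the top. The sketch below is a minimal illustration of that bookkeeping, with made-up event sizes and frequencies; it is not the authors' probabilistic model. With rare deep scours refilled by frequent ordinary deposition, most of the preserved thickness ends up labelled "ordinary", echoing the hierarchy argument.

```python
import random

def run_column(events):
    """Toy stratigraphic column: ('deposit', thickness, label) adds a layer on top,
    ('erode', depth) strips that much thickness from the top of the column."""
    column = []  # list of (thickness, label), bottom to top
    for event in events:
        if event[0] == "deposit":
            _, thickness, label = event
            column.append((thickness, label))
        else:
            _, depth = event
            while depth > 0 and column:
                thickness, label = column.pop()
                if thickness > depth:
                    column.append((thickness - depth, label))  # layer only partly eroded
                    depth = 0
                else:
                    depth -= thickness
    return column

def ordinary_fraction(column):
    """Fraction of preserved thickness contributed by 'ordinary' layers."""
    total = sum(t for t, _ in column)
    return sum(t for t, label in column if label == "ordinary") / total if total else 0.0

# Made-up event mix: everyday dune migration, occasional thick flood deposits,
# and rare deep scours (a higher level of the hierarchy) that are later refilled.
random.seed(0)
events = []
for step in range(10_000):
    events.append(("deposit", 1.0, "ordinary"))
    if random.random() < 0.005:
        events.append(("deposit", 10.0, "extreme"))  # rare thick flood deposit
    if random.random() < 0.002:
        events.append(("erode", 150.0))              # rare deep scour

column = run_column(events)
print(f"ordinary share of preserved thickness: {ordinary_fraction(column):.2f}")
```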

The mystery was solved. "So long as you have a hierarchical organization in river dynamics, your strata will be ordinary," Ganti said.

Scientists have known about these different hierarchical levels in river morphology for quite some time, but no one had directly linked them to the ordinariness of river strata until now, Ganti explained. Before these results, sedimentologists were a bit like early biologists who knew about taxonomy -- species, genera, families, etc. -- without understanding the theory of evolution that explains the dynamics connecting them.

Events in one level can build up sediment -- in which case they are preserved -- or they can erode away sediment, which will then be filled in by ordinary events one level lower. So, while some extreme events are preserved, common phenomena dominate the stratigraphic record.

Ganti also realized that the relative timeframes over which the levels evolve determine what is preserved. For instance, take the relative rates of river migration versus avulsion, or how often the river jumps its banks. "If your migration is fast and your avulsion infrequent, then you keep reworking your deposits," Ganti explained. These systems tend to preserve only the most extreme channel elevations. "However, when you have an avulsion, you cannot rework that deposit anymore because you've jumped to a new location."

With this understanding, scientists can now use strata to compare how fast each level was evolving when a river was actually active. In fact, the results bolster the conclusions of Ganti's previous study, where he had demonstrated that Precambrian rivers could have been similar to the single-channel, meandering rivers we know today.

Scientists had long doubted this since there was no evidence preserved in the stratigraphic record. Many argued that such rivers would have needed plants to secure their banks, and land plants had yet to evolve. But rather than having no migration, in truth it's likely that these rivers meandered so often that their strata kept getting erased. Indeed, other scientists have found that rivers in un-vegetated landscapes migrate 10 times faster than those with vegetation.

Ganti's findings also have ramifications for the modern world, where climate change and sea level rise are altering the behavior of major river systems. To understand our future, many scientists look at deposits from rivers during the Paleocene-Eocene Thermal Maximum, when average temperatures abruptly jumped 5 to 8 degrees Celsius, comparable to modern climate change. Evidence suggests that rivers were more mobile then, and now we have the tools to determine why.

"We know that sediment supply to rivers is changing because of human-induced changes. But what we don't know is what trajectory we are sending rivers on in the long term," Ganti said.

"Are we going to just increase migration rates? Are we going to make avulsions more frequent? This difference matters, because it determines the flooding history and where you develop in the decades and centuries to come."

Credit: 
University of California - Santa Barbara

Graphics cards farm to help in search of new physics at LHCb

For the first time, data from LHCb, a major physics experiment, will be processed on a farm of GPUs. This solution is not only much cheaper, but it will help decrease the cluster size and process data at speeds up to 40 Tbit/s. The research paper has been published in Computing and Software for Big Science. https://cds.cern.ch/record/2717938/files/LHCB-TDR-021.pdf

An interdisciplinary task force of researchers from LHCb (LHC beauty) at CERN, one of the biggest international collaborations in high-energy physics, has suggested a novel way to process the enormous dataflow from the particle detector. The team consists of researchers from leading European and US universities; the Russian part of the team was represented by HSE University and the Yandex School of Data Analysis. The main goal of the proposal is to provide the collaboration with a robust, efficient and flexible solution that can deal with the increased data flow expected during the upcoming data-taking period.

The LHC and LHCb in particular were created for the purpose of searching for 'new physics', something beyond the Standard Model. While the research has achieved moderate success, hopes of finding completely new particles, such as WIMPs, have so far gone unfulfilled. Many physicists believe that in order to achieve new results, statistics on particle collisions at the LHC should be increased considerably. This requires not only new accelerator equipment - upgrades are currently underway and due to be completed by 2021-2022 - but also brand-new systems to process particle collision data. For an event on LHCb to be registered as correctly detected, the reconstructed track must match the one modelled by the algorithm; if there is no match, the data are excluded. About 70% of all collisions in the LHC are excluded this way, which means that serious computing capacity is required for this preliminary analysis.

A group of researchers, including Andrey Ustyuzhanin (https://www.hse.ru/en/staff/austyuzhanin), Nikita Kazeev (https://www.hse.ru/en/staff/kazeev), Mikhail Belous and Sergei Popov (https://www.hse.ru/en/org/persons/224906274) from HSE University, presented a new paper with an algorithm for a 'farm' of GPUs as a first high-level trigger (HLT1) for event registration and detection on the LHCb detector. The concept has been named Allen, after Frances Allen, a researcher in computational system theory and the first woman to receive the Turing Award.

Unlike previous triggers, the new system transfers data from CPUs to GPUs. These may include both professional solutions (such as Tesla GPUs, the most advanced on the market) and ordinary 'gamer' GPUs by NVIDIA or AMD. Thanks to this, the Allen trigger does not depend on one specific equipment vendor, which makes it easier to create and reduces costs. With the highest-performance systems, the trigger can process data at up to 40 Tbit/s.

In a standard scheme, information on all events goes from the detector to a zero-level (L0) trigger, which consists of programmable chips (FPGA). They perform selection at the basic level. In the new scheme, there will be no L0 trigger. The data immediately go to the 'farm', where each of the 300 GPUs simultaneously processes millions of events per second.
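
For a rough sense of scale, the quoted aggregate rate can be divided across the farm. This is back-of-the-envelope arithmetic using only the round numbers above, not an LHCb specification:

```python
# Assumed from the figures quoted in this article: ~40 Tbit/s aggregate into ~300 GPUs.
aggregate_tbit_per_s = 40
n_gpus = 300

per_gpu_gbit_per_s = aggregate_tbit_per_s * 1000 / n_gpus
print(f"~{per_gpu_gbit_per_s:.0f} Gbit/s per GPU")  # ~133 Gbit/s each
```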

After initial event registration and detection, only the selected data with valuable physical information go to the ordinary x86 processors of the second-level trigger (HLT2). This means that the main computational load of event classification is handled exclusively by the GPUs at the 'farm'.

This framework will help solve the event analysis and selection tasks more effectively: GPUs are built as massively parallel processors with many cores. While CPUs are geared towards sequential processing, GPUs are designed for massive simultaneous calculations. In addition, they are given a more specific and limited set of tasks, which adds to performance.

According to Denis Derkach (https://www.hse.ru/en/org/persons/148813333), head of the LHCb team at HSE University, thanks to the decision not to use CPUs, the new 'farm' is well suited for future LHCb data taking. In addition, Allen will cost significantly less than a similar system on CPUs. It will also be simpler than the previous event registration systems at accelerators.

The long-term benefit of the new approach is particularly important. Equipment for many physics experiments is currently being upgraded all around the world, and virtually every such upgrade leads to a growing flow of processed information. Previously, experiments did not use systems based exclusively on GPUs. But the advantages of Allen - a simpler architecture and lower cost - are so obvious that this approach will undoubtedly spread beyond the LHCb experiment.

Credit: 
National Research University Higher School of Economics

The rafts used by viruses

Rescue rafts are a lifesaver, although other types of rafts may put our lives in danger: that is the case with 'lipid rafts', which are exploited by coronaviruses to attack human cells. An interdisciplinary research group coordinated by the University of Trento and the University of Napoli - Federico II set out to understand what happens when a virus jumps on this type of raft to invade a cell.

To penetrate the human cell, the virus tricks the cell membrane that surrounds it. The membrane has a crucial role because it ensures the regular functioning of the cell, which is essential for tissue growth and development and for organ functionality. When a virus sneaks into a cell pretending to be something friendly - a ligand, namely a molecule that binds to a chemically affine receptor and forms a complex capable of causing a cellular response - the membrane responds by creating localized thickened zones, called 'lipid rafts'. That is where receptors find favourable sites for binding: receptors must change their configuration as they bind to their ligand, and this happens more easily across suitably stress-relieved zones of the cell membrane, namely on the rafts. These thickenings also turn out to be energetically favourable for the system, thereby becoming entryways for viruses and ligands in general. The researchers adopted a mechanobiological approach to explain how the microstructural properties of the membrane interact with biochemical processes to form lipid rafts.

The study may suggest new strategies to limit virus attacks and prevent or combat diseases like Sars and Covid-19 based on biomedical and engineering principles.

The research was conducted by Luca Deseri and Nicola Pugno, professors of the group of Mechanics of solids and structures of the Department of Civil, Environmental and Mechanical Engineering of the University of Trento, and by the team of Massimiliano Fraldi, professor of the Department of Structures for Engineering and Architecture of the University of Napoli - Federico II, in collaboration with researchers at Carnegie Mellon University and at the University of Pittsburgh, in the USA, and with the universities of Palermo and Ferrara, where experiments were carried out.

Deseri explained how viruses attack cells: "The cell membrane regulates the transport of nutrients and the removal of waste products, and acts as a barrier to keep toxic substances and pathogens, including viruses, out. Viruses SARS-CoV-1 and SARS-CoV-2, which caused the current Covid-19 pandemic, trick the membrane by showing specific anti-receptors that look like ligands, to which the cell's receptors usually bind, activating localized thickenings in the membrane, the 'lipid rafts', which are then used by viruses to enter the cell".

The findings contribute to the ongoing discussion on the diseases caused by coronaviruses of the SARS-Severe Acute Respiratory Syndrome type. "This study may suggest new strategies to identify innovative therapeutic approaches to prevent or fight the virus by integrating biomedical and mechanical knowledge", Deseri, Pugno and Fraldi remarked.

Credit: 
Università di Trento

'Relaxed' T cells critical to immune response

Image: Rice University scientists' simple model of T cell activation of the immune response shows the T cell binding, via a receptor (TCR), to an antigen-presenting cell (APC). If an invader is identified as such, the response is activated, but only if the "relaxation" time of the binding is long enough.

Image credit: Hamid Teimouri/Rice University

HOUSTON - (June 16, 2020) - Like finding that needle in the haystack every time, your T cells manage what seems like an improbable task: quickly finding a few invaders among the many imposters in your body to trigger its immune response.

T cells have to react fast and do so nearly perfectly to protect people from diseases. But first, they need a little "me" time.

Rice University researchers suggest that has to do with how T cells "relax" in the process of binding to ligands -- short, functional molecules -- that are either attached to the invaders or just resemble them.

The look-alikes greatly outnumber the antigen ligands attached to attacking pathogens. The theory by Rice chemist Anatoly Kolomeisky and research scientist and Rice alumnus Hamid Teimouri proposes that the T cell's relaxation time -- how long it takes to stabilize binding with either the invader or the imposter -- is key. They suggested it helps explain the rest of the cascading sequence by which invaders prompt the immune system to act.

The inappropriate activation of a T cell toward its own molecules leads to serious allergic and autoimmune responses.

The researchers' study appears in the Biophysical Journal.

T cells operate best within the parameters that control a "golden triangle" of sensitivity, specificity and speed. The need for speed seems obvious: don't let the invaders infect. Speed also matters because T cells spend so little time in the vicinity of antigen-presenting cells that they must recognize them quickly. Specificity is the most challenging, since self-ligand imposters can outnumber invaders by a factor of 100,000.

"It is amazing how T cells are able to react so fast and so selectively. This is one of the most important secrets of living organisms," said Kolomeisky, a professor and chairman of Rice's Department of Chemistry and a professor of chemical and biomolecular engineering.

Their approach was to build a stochastic (random) model that analyzed how T cell receptors bind step-by-step to the peptide major histocompatibility complexes (pMHC) on the surface of antigen-presenting cells. At a high enough concentration, the bound complexes trigger the immune cascade.

The mathematical model aligned with experimental results that suggest T cell activation depends on kinetic proofreading, a form of biochemical error correction. Proofreading slows down the relaxation for wrong molecules, and this allows the organism to start the correct immune response.
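
As background on kinetic proofreading itself, a textbook-style sketch is shown below: a bound receptor must pass through several sequential modification steps before signalling, and unbinding at any step resets the chain, so a modest difference in unbinding rate is amplified exponentially with the number of steps. The rate constants are illustrative, and this is the generic McKeithan-style picture, not the specific stochastic model in the Biophysical Journal paper.

```python
def activation_probability(k_proof: float, k_off: float, n_steps: int) -> float:
    """Probability that a bound receptor completes all proofreading steps
    before the ligand unbinds: (k_proof / (k_proof + k_off)) ** n_steps."""
    return (k_proof / (k_proof + k_off)) ** n_steps

k_proof = 1.0        # rate of each proofreading step (illustrative units)
k_off_correct = 0.1  # genuine antigen unbinds slowly
k_off_wrong = 1.0    # self-ligand look-alike unbinds 10x faster

for n in (1, 3, 5):
    p_correct = activation_probability(k_proof, k_off_correct, n)
    p_wrong = activation_probability(k_proof, k_off_wrong, n)
    print(f"N={n}: correct={p_correct:.3f}, wrong={p_wrong:.4f}, "
          f"discrimination={p_correct / p_wrong:.0f}x")
```

With these illustrative numbers, a 10-fold difference in unbinding rate becomes roughly a 20-fold difference in activation probability after five proofreading steps.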

While the theory helps explain the T cells' "absolute discrimination," it does not explain downstream biochemical processes. However, the researchers said timing may have everything to do with those as well.

In a "very speculative" suggestion, the researchers noted that when the binding speed of imposters matches that of invaders, triggering both biomolecular cascades, there's no immune response. When the more relaxed binding of pathogenic ligands lags behind, it appears more likely to reach a threshold that triggers the immune system. Kolomeisky said the concept could be validated through experimentation.

He and Teimouri wrote that many other aspects of T cell triggering need to be explored, including the roles of the cellular membranes where receptors are located, cell-cell communications, and cell topography during interactions. But having a simple quantitative model is a good start.

"Our theory can be extended to explore some important features of the T cell activation process," Kolomeisky said.

Credit: 
Rice University

Team led by Children's Hospital LA researcher generates developmental map of human T-cells

Chintan Parekh, MD, of The Saban Research Institute of Children's Hospital Los Angeles, has led a team of investigators that generated a comprehensive roadmap for how T-cells develop in the human thymus. The study will be published in the journal Immunity on June 16. T-cells are a type of white blood cell involved in immune response -- fighting off invaders like pathogens or cancer cells. Understanding human T-cell development is crucial for treating diseases arising from abnormal T-cell development, like leukemia and immunodeficiencies, and for developing highly effective immunotherapies, like CAR-T.

"While most previous studies have been done in mice, our study specifically reveals a high-resolution picture of human T-cell development," says Dr. Parekh. "Because of the biological differences between species, it's critical to specifically study human T-cells in order to generate the information we need to understand human disease and to design novel immunotherapies."

Using single-cell sequencing technology to study cells isolated from human thymic tissue, the investigators mapped the various stages of T-cell development in the human thymus, including the multitude of genes that switch on or off at each stage. They charted the different developmental routes that the most immature cells in the thymus may take as they progress to maturity and discovered stages of development and patterns of gene activity unique to humans.

This knowledge could lead to greater insights into diseases arising from T-cell deficiencies or abnormal T-cells such as immunodeficiency disorders (severe combined immunodeficiency disease or SCID), T-cell mediated autoimmune diseases (type 1 diabetes, rheumatoid arthritis) and leukemia (T-cell acute lymphoblastic leukemia).

The findings could also help in the advancement of immunotherapies like CAR-T therapy, regarded as one of the most significant advances in cancer treatment. A greater understanding of T-cell development is also needed to advance treatments that expedite recovery of the immune system in patients who have undergone bone marrow transplantation for treatment of cancer and other diseases.

The single-cell data for developing T-cells are available on the National Center for Biotechnology Information (NCBI) Gene Expression Omnibus (GEO) database. This public genomics data repository ensures that other researchers have access to the data so they can learn more about which genes regulate T-cell development and use that knowledge to understand T-cell diseases and design new immunotherapies.

Credit: 
Children's Hospital Los Angeles

Digitize your dog into a computer game

Video: Researchers at CAMERA at the University of Bath collected the movement data of a range of dogs to produce a model that could predict the poses of dogs from images taken by a single RGBD camera.

Video credit: University of Bath

Researchers from the University of Bath have developed motion capture technology that enables you to digitise your dog without a motion capture suit and using only one camera.

The software could be used for a wide range of purposes, from helping vets diagnose lameness and monitoring recovery of their canine patients, to entertainment applications such as making it easier to put digital representations of dogs into movies and video games.

Motion capture technology is widely used in the entertainment industry, where actors wear a suit dotted with white markers which are then precisely tracked in 3D space by multiple cameras taking images from different angles. Movement data can then be transferred onto a digital character for use in films or computer games.

Similar technology is also used by biomechanics experts to track the movement of elite athletes during training, or to monitor patients' rehabilitation from injuries. However, these technologies - particularly when applying them to animals - require expensive equipment and dozens of markers to be attached.

Computer scientists from CAMERA, the University of Bath's motion capture research centre, digitised the movement of 14 different breeds of dog, from lanky lurchers to squat pugs, all residents of the local Bath Cats' and Dogs' Home (BCDH).

Wearing special doggie motion capture suits with markers, the dogs were filmed under the supervision of their BCDH handlers doing a range of movements as part of their enrichment activities.

They used these data to create a computer model that can accurately predict and replicate the poses of dogs when they're filmed without wearing the motion capture suits. This model allows 3D digital information for new dogs - their shape and movement - to be captured without markers and expensive equipment, but instead using a single RGBD camera. Whereas normal digital cameras record the red, green and blue (RGB) colour in each pixel in the image, RGBD cameras also record the distance from the camera for each pixel.
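
To make the RGBD idea concrete: each pixel carries colour plus a depth value, and with the camera's intrinsic parameters that pixel can be lifted to a 3D point using the standard pinhole back-projection formula. The snippet below uses hypothetical intrinsics and a dummy frame; it is a generic illustration, not the CAMERA team's pipeline.

```python
import numpy as np

def backproject(u: int, v: int, depth_m: float, fx: float, fy: float, cx: float, cy: float):
    """Lift a pixel (u, v) with depth (metres) to a 3D point in the camera frame (pinhole model)."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return np.array([x, y, depth_m])

# Hypothetical 640x480 RGBD frame: 3 colour channels plus 1 depth channel per pixel.
rgbd = np.zeros((480, 640, 4), dtype=np.float32)
rgbd[240, 320, 3] = 2.5  # pretend the centre pixel is 2.5 m from the camera

point = backproject(u=320, v=240, depth_m=rgbd[240, 320, 3],
                    fx=525.0, fy=525.0, cx=319.5, cy=239.5)  # made-up intrinsics
print(point)  # 3D position of that pixel relative to the camera
```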

PhD researcher Sinéad Kearney said: "This is the first time RGBD images have been used to track the motion of dogs using a single camera, which is much more affordable than traditional motion capture systems that require multiple cameras.

"This technology allows us to study the movement of animals, which is useful for applications such as detecting lameness in a dog and measuring its recovery over time.

"For the entertainment industry, our research can help produce more authentic movement of virtual animals in films and video games. Dog owners could also use it to make a 3D digital representation of their pet on their computer, which is a lot of fun!"

The team presented their research at one of the world's leading AI conferences, the CVPR (Computer Vision and Pattern Recognition) conference, on 17 and 18 June.

The team has also started testing their method on computer-generated images of other four-legged animals including horses, cats, lions and gorillas, with some promising results. They aim in the future to extend their animal dataset to make the results more accurate; they will also be making the dataset available for non-commercial use by others.

Professor Darren Cosker, Director of CAMERA, said: "While there is a great deal of research on automatic analysis of human motion without markers, the animal kingdom is often overlooked.

"Our research is a step towards building accurate 3D models of animal motion along with technologies that allow us to very easily measure their movement. This has many exciting applications across a range of areas - from veterinary science to video games."

Credit: 
University of Bath

Continuous glucose monitoring reduces hypoglycemia in older adults with type 1 diabetes

Image: Laura Young, MD, PhD, associate professor of medicine, was UNC's principal investigator for this six-month clinical trial that shows the use of continuous glucose monitoring (CGM) reduces serious levels of hypoglycemia compared with standard monitoring by daily use of blood glucose finger-stick test strips.

Image credit: UNC School of Medicine

CHAPEL HILL, NC - Results from a six-month, multi-site clinical trial called the Wireless Innovation for Seniors with Diabetes Mellitus (WISDM) Study Group have been published by the Journal of the American Medical Association (JAMA).

Older adults with type 1 diabetes (T1D), a growing but under-studied population, are prone to hypoglycemia, particularly when diabetes is longstanding. Hypoglycemia can cause altered mental status and sometimes seizure or loss of consciousness, which can prove fatal. But according to this study, older adults who use continuous glucose monitoring (CGM) devices can significantly reduce the occurrence of hypoglycemia and severe hypoglycemic events while also reducing hemoglobin A1c.

The objective of the study was to determine if the use of CGM can reduce hypoglycemia among older adults with type 1 diabetes (T1D). A CGM device continuously measures blood sugar and provides real-time observation of glucose levels, trend direction, and alarms for when glucose drops to low levels or increases to high levels.

"Reducing hypoglycemia is an important aspect of management of T1D in older adults, many of whom have difficultly recognizing symptoms of hypoglycemia or cognitive impairment," said UNC's principal investigator Laura Young, MD, PhD, associate professor of medicine in the division of endocrinology and metabolism at the UNC School of Medicine.

The study was a randomized, controlled trial involving 203 men and women over the age of 60 at 22 clinical centers, including UNC-Chapel Hill. About half of participants were receiving insulin via an insulin pump and the other half used multiple daily injections of insulin. Half of participants were randomly assigned to a group using a Dexcom CGM device, and the other half to a control group using the standard finger-stick method with test strips for blood glucose monitoring.

The study found that the amount of time glucose levels were in the hypoglycemic range (below 70 mg/dL) was significantly reduced in the CGM group compared with the control group.

CGM users were far less likely to have a severe hypoglycemic event compared to the control group using blood glucose meter checks; 10 participants in the control group had a severe hypoglycemic event compared with only one participant in the CGM group. Five of the 10 severe hypoglycemic events reported in the control group involved seizure or loss of consciousness, which did not occur in the one CGM group event.

Reducing hypoglycemia did not come at the cost of worsening overall glucose control. The average hemoglobin A1c (HbA1c, an estimate of blood glucose control over a 3-month period) for the CGM group improved from 7.6% at the start of the study to 7.2% at six months. The control group HbA1c was 7.5% at the start of the study and 7.4% at six months. The American Diabetes Association recommends individualized HbA1c targets for older adults.

Very low or very high blood glucose episodes can be dangerous. Regular CGM use increased the amount of time in target range (glucose levels of 70 mg/dL to 180 mg/dL) by more than two hours a day.

Another striking discovery was that 81% of study participants were still using the CGM devices seven days per week at six months. These results are important because they demonstrate CGM can be used effectively in an older adult population to reduce hypoglycemia, as well as improve overall glucose control. Because hypoglycemia can lead to serious complications, including hospitalization and death, the reduction in severe hypoglycemic events could have important public health implications.

"For too long, the older population with diabetes has suffered from not-so-benign neglect from the medical community," Young said. "Despite the high prevalence of diabetes and its complications in this age group, very few studies have addressed the potential utility of new technologies in this population. In part, this may be due to the mistaken belief that older adults can't manage or benefit from advanced technologies. Our study shows quite the opposite. Not only does CGM improve safety in older adults, it actually improves overall glycemic control. Moreover, the overwhelming majority of patients in our study used CGM most days for the entire duration of the study, demonstrating a high level of comfort with the technology in this population."

In addition, the treatment effect on reducing hypoglycemia at six months was present in those using an insulin pump as well as in those using multiple daily injections of insulin. The effective use of CGM in patients on multiple daily injections has important implications, as these patients likely will not switch to more advanced technologies that automate insulin delivery.

Credit: 
University of North Carolina Health Care

TERAVOLT registry tracks outcomes among thoracic cancer patients sickened by COVID-19

Image: Leora Horn, MD, MSc, Ingram Associate Professor of Cancer Research at Vanderbilt-Ingram Cancer Center, who is a senior author of the study and a TERAVOLT consortium steering committee member.

Image credit: Vanderbilt University Medical Center

Credit: 
Vanderbilt University Medical Center