Culture

Long lost human relative unveiled

(Jerusalem, September 19, 2019)--If you could travel back in time to 100,000 years ago, you'd find yourself living among several different groups of humans, including Modern Humans (those anatomically similar to us), Neanderthals, and Denisovans. We know quite a bit about Neanderthals, thanks to numerous remains found across Europe and Asia. But exactly what our Denisovan relatives might have looked like had been anyone's guess for a simple reason: the entire collection of Denisovan remains includes three teeth, a pinky bone and a lower jaw. Now, as reported in the scientific journal Cell, a team led by Hebrew University of Jerusalem (HUJI) researchers Professor Liran Carmel and Dr. David Gokhman (currently a postdoc at Stanford) has produced reconstructions of these long-lost relatives based on patterns of methylation (chemical changes) in their ancient DNA.

"We provide the first reconstruction of the skeletal anatomy of Denisovans," says lead author Carmel of HUJI's Institute of Life Sciences. "In many ways, Denisovans resembled Neanderthals but in some traits they resembled us and in others they were unique."

Denisovan remains were first discovered in 2008 and have fascinated human evolution researchers ever since. They lived in Siberia and Eastern Asia, and went extinct approximately 50,000 years ago. We don't yet know why. That said, the genomes of present-day Melanesians and Aboriginal Australians contain up to 6% Denisovan DNA. Further, Denisovan DNA likely contributed to modern Tibetans' ability to live at high altitudes and to Inuits' ability to withstand freezing temperatures.

Overall, Carmel and his team identified 56 anatomical features in which Denisovans differ from modern humans and/or Neanderthals, 34 of them in the skull. For example, the Denisovan skull was probably wider than that of modern humans or Neanderthals. Denisovans likely also had a longer dental arch and no chin.

The researchers came to these conclusions after three years of intense work studying DNA methylation maps. DNA methylation refers to chemical modifications that affect a gene's activity but not its underlying DNA sequence. The researchers first compared DNA methylation patterns among the three human groups to find regions in the genome that were differentially methylated. Next, they looked for evidence about what those differences might mean for anatomical features--based on what's known about human disorders in which those same genes lose their function.

"In doing so, we got a prediction as to what skeletal parts are affected by differential regulation of each gene and in what direction that skeletal part would change--for example, a longer or shorter femur bone," Dr. Gokhman explained.

To test this ground-breaking method, the researchers applied it to two species whose anatomy is known: the Neanderthal and the chimpanzee. They found that roughly 85% of their trait reconstructions were accurate in predicting which traits diverged and in which direction they diverged. Then, they applied this method to the Denisovan and were able to produce the first reconstructed anatomical profile of the mysterious Denisovan.

As for the accuracy of their Denisovan profile, Carmel shared, "One of the most exciting moments happened a few weeks after we sent our paper to peer-review. Scientists had discovered a Denisovan jawbone! We quickly compared this bone to our predictions and found that it matched perfectly. Without even planning on it, we received independent confirmation of our ability to reconstruct whole anatomical profiles using DNA that we extracted from a single fingertip."

In their Cell paper, Carmel and his colleagues predict many Denisovan traits that resemble Neanderthals', such as a sloping forehead, long face and large pelvis, and others that are unique among humans, for example, a large dental arch and very wide skull. Do these traits shed light on the Denisovan lifestyle? Could they explain how Denisovans survived the extreme cold of Siberia?

"There is still a long way to go to answer these questions but our study sheds light on how Denisovans adapted to their environment and highlights traits that are unique to modern humans and which separate us from these other, now extinct, human groups," Carmel concluded.

Credit: 
The Hebrew University of Jerusalem

Persistent headache or back pain 'twice as likely' in the presence of the other

People with persistent back pain or persistent headaches are twice as likely to suffer from the other disorder as well, a new study from the University of Warwick has revealed.

The results, published in the Journal of Headache and Pain, suggest an association between the two types of pain that could point to a shared treatment for both.

The researchers from Warwick Medical School, funded by the National Institute for Health Research (NIHR), conducted a systematic review of fourteen studies with a total of 460,195 participants that attempted to quantify the association between persistent headaches and persistent low back pain. They found an association between having persistent low back pain and having persistent (chronic) headaches, with patients experiencing one typically being twice as likely to experience the other compared to people with neither headaches nor back pain. The association was also stronger for people affected by migraine.

The researchers focused on people with chronic headache disorders, those who have had headaches on most days for at least three months, and people with persistent low back pain who experience that pain day after day. These are two very common disorders that are leading causes of disability worldwide.

Around one in five people have persistent low back pain and one in 30 have chronic headaches. The researchers estimate that just over one in 100 people (or well over half a million people) in the UK have both.
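A rough back-of-envelope check shows how those figures fit together, treating the roughly doubled odds as an approximate doubling of the joint probability (a reasonable simplification at these low prevalences):

```python
# Figures from the paragraph above
p_back_pain = 1 / 5    # persistent low back pain
p_headache = 1 / 30    # chronic headache

# If the two disorders were independent:
p_both = p_back_pain * p_headache
print(f"independent: {p_both:.2%}")      # ~0.67%, about 1 in 150

# With the roughly twofold association reported in the review:
print(f"associated:  {2 * p_both:.2%}")  # ~1.33%, just over 1 in 100
```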

Professor Martin Underwood, from Warwick Medical School, said: "In most of the studies we found that the odds were about double - either way, you're about twice as likely to have headaches or chronic low back pain in the presence of the other. Which is very interesting because typically these have been looked at as separate disorders and then managed by different people. But this makes you think that there might be, at least for some people, some commonality in what is causing the problem.

"There may be something in the relationship between how people react to the pain, making some people more sensitive to both the physical causes of the headache, particularly migraine, and the physical causes in the back, and how the body reacts to that and how you become disabled by it. There may also be more fundamental ways in how the brain interprets pain signals, so the same amount of input into the brain may be felt differently by different people.

"It suggests the possibility of an underpinning biological relationship, at least in some people with headache and back pain, that could also be a target for treatment."

Currently, there are specific drug treatments for patients with persistent migraine. For back pain, treatment focuses on exercise and manual therapy, but can also include cognitive behavioural approaches and psychological support approaches for people who are very disabled with back pain. The researchers suggest that those types of behavioural support systems may also help people living with chronic headaches.

Professor Underwood added: "A joint approach would be appropriate because there are specific treatments for headaches and people with migraine. Many of the ways we approach chronic musculoskeletal pain, particularly back pain, are with supportive management by helping people to live better with their pain.

"We could look at developing support and advice programmes that are appropriate for this population. And being aware of this relationship has the potential to change how we think about managing these people in the NHS on an everyday basis. There is a need for doctors and other healthcare professionals to think that when treating one issue to ask about the other and tailor the treatment accordingly. For future research, there's probably work that needs to be done to understand what the underlying mechanisms behind this relationship are."

Credit: 
University of Warwick

SMART announces a revolutionary technology to study cell nanomechanics

image: Dr. Vijay Raj Singh, SMART Research Scientist, and Dr. Zahid Yaqoob, MIT LBRC Principal Investigator, studying tumor cells using the new confocal reflectance interferometric microscope.

Image: 
SINGAPORE-MIT ALLIANCE FOR RESEARCH AND TECHNOLOGY

New confocal reflectance interferometric microscope enables scientists to probe nuclear membrane mechanics within intact cells in a label-free fashion

Nuclear mechanics is known to play a key role in many diseases including cancer metastasis and genetic illnesses such as progeria and muscular dystrophy

The label-free technology does not affect cells, allowing cell screening for therapeutic applications in which cells are injected or implanted into the human body

Existing label-free technologies allow studying 'thin' cultured cells only; in contrast, the newly developed technology can enable scientists to study cells in biological tissues

Singapore, September 19, 2019 - Researchers at Singapore-MIT Alliance for Research and Technology (SMART) and MIT's Laser Biomedical Research Center (LBRC) have developed a new way to study cells, paving the way for a better understanding of how cancers spread and become killers.

The new technology was explained in a paper titled "Studying nucleic envelope and plasma membrane mechanics of eukaryotic cells using confocal reflectance interferometric microscopy", which was published in the prestigious academic journal Nature Communications. The new confocal reflectance interferometric microscope provides 1.5-micron depth resolution and height measurement sensitivity better than 200 picometers for high-speed characterization of nanometer-scale nuclear envelope and plasma membrane fluctuations in biological cells. It enables researchers to use these fluctuations to understand key biological questions, such as the role of nuclear stiffness in cancer metastasis and genetic diseases.

"Current methods for nuclear mechanics are invasive as they either require mechanical manipulation such as stretching or require injecting fluorescent probes that 'light up' the nucleus to observe its shape. Both these approaches would undesirably change cell's intrinsic properties, limiting study of cellular mechanisms, disease diagnosis, and cell-based therapies," said Dr. Vijay Raj Singh, SMART Research Scientist and Dr. Zahid Yaqoob, MIT LBRC Principal Investigator. "With the confocal reflectance interferometric microscope, we can study nuclear mechanics of biological cells without affecting their native properties."

While the scientists can study about a hundred cells in a few minutes, they believe that the system can be upgraded in the future to improve the throughput to tens of thousands of cells.

"Today, many disease mechanisms are not fully understood because we lack a way to look at how cells' nucleus changes when it undergoes stress," said Dr. Peter So, SMART BioSyM Principal Investigator, , MIT Professor, and LBRC MIT Director. "For example, people often do not die from the primary cancer, but from the secondary cancers that form after the cancer cells metastasize from the primary site - and doctors do not know why cancer becomes aggressive and when it happens. Nuclear mechanics plays a vital role in cancer metastasis as the cancer cells must 'squeeze' through the blood vessel walls into the blood stream and again when they enter a new location. This is why the ability to study nuclear mechanics is so important to our understanding of cancer formation, diagnostics, and treatment."

With the new interferometric microscope, scientists at LBRC are studying cancer cells as they undergo mechanical stress, especially during the extravasation process, paving the way for new cancer treatments. The scientists are also able to use the same technology to study the effect of 'lamin mutations' on nuclear mechanics; such mutations result in rare genetic diseases such as progeria, which causes accelerated aging in young children.

The confocal reflectance interferometric microscope has applications in other sectors as well. For example, this technology has the potential for studying cellular mechanics within intact living tissues. With the new technology, scientists could shed new light on biological processes within the body's major organs, such as the liver, allowing safer and more accurate cell therapies. Cell therapy is a major focus area for Singapore, with the government recently announcing a S$80m boost to the manufacturing of living cells as medicine.

Credit: 
Singapore-MIT Alliance for Research and Technology (SMART)

Alarming number of heart infections tied to opioid epidemic

DALLAS, Sept. 18, 2019 -- An alarming number of people nationwide are developing infections of either the heart's inner lining or valves, known as infective endocarditis, in large part due to the current opioid epidemic. This new trend predominantly affects young, white, poor men who also have higher rates of HIV, hepatitis C and alcohol abuse, according to new research published in the Journal of the American Heart Association, the open access journal of the American Heart Association.

Infective endocarditis occurs when bacteria or fungi in the blood stream enter the heart's inner lining or valves. Nearly 34,000 people receive treatment for this condition each year, of whom approximately 20% die. One of the major risk factors for infective endocarditis is drug abuse.

"Infective endocarditis related to drug abuse is a nationwide epidemic," said the study's senior author Serge C. Harb, M.D., assistant professor of medicine at Cleveland Clinic Lerner College of Medicine, Cleveland, Ohio. "These patients are among the most vulnerable--young and poor, and also frequently have HIV, hepatitis C and alcohol abuse."

Researchers analyzed data in the National Inpatient Sample registry from 2002-2016 on nearly one million hospitalized patients diagnosed with infective endocarditis to compare patients with heart infections related to drug abuse to those with heart infections from other causes. The registry is the largest publicly available database of U.S. hospitalizations.

During the 14 years studied, researchers found that the proportion of heart infections related to drug abuse nearly doubled in the United States, from 8% to 16%. All geographic regions saw increases, and the highest jump occurred in the Midwest, at nearly 5% per year.
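Those figures are of the same order: a doubling over roughly 14 years corresponds to about 5% compound growth per year, as a quick check shows.

```python
# Doubling of the drug-abuse-related share (8% -> 16%) over ~14 years
start_share, end_share, years = 0.08, 0.16, 14
annual_growth = (end_share / start_share) ** (1 / years) - 1
print(f"{annual_growth:.1%} per year")  # ~5.1%
```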

They also found those with infective endocarditis related to drug abuse:

Were predominantly young, white men (median age 38 years old);

Were poorer, with nearly 42% having a median household income in the lowest national quartile and about 45% covered by Medicaid;

Had higher rates of HIV, hepatitis C and alcohol abuse compared to patients with infective endocarditis who are not drug abusers;

Had longer hospital stays and higher health care costs; and

Were more likely to undergo heart surgery, yet less likely to die while hospitalized. Lower death rates are likely due to their significantly younger age.

"Nationwide public health measures need to be implemented to address this epidemic, with targeted regional programs to specifically support patients at increased risk," Dr. Harb said. "Specialized teams, including but not limited to cardiologists, infectious disease specialists, cardiac surgeons, nurses, addiction specialists, case managers and social workers, are needed to care for these patients. Appropriately treating the cardiovascular infection is only one part of the management plan. Helping these patients address their addictive behaviors with social supports and effective rehabilitation programs is central to improving their health and preventing drug abuse relapses."

Diseases that occurred more frequently among patients with heart infections from causes other than drug abuse included high blood pressure, diabetes, heart failure, kidney disease and lung disease. A study limitation is that the registry relies solely on diagnostic codes (ICD codes) and does not include hospital transfers. Another limitation is that the registry provided only general information by region, without details specific to states, cities and rural towns.

Credit: 
American Heart Association

Microbe chews through PFAS and other tough contaminants

image: In a series of lab tests, a relatively common soil bacterium has demonstrated its ability to break down the difficult-to-remove class of pollutants called PFAS.

Image: 
David Kelly Crow

In a series of lab tests, a relatively common soil bacterium has demonstrated its ability to break down the difficult-to-remove class of pollutants called PFAS, researchers at Princeton University said.

The bacterium, Acidimicrobium bacterium A6, removed 60% of PFAS, specifically perfluorooctanoic acid (PFOA) and perfluorooctane sulfonate (PFOS), in lab vials over 100 days of observation, the researchers reported in a Sept. 18 article in the journal Environmental Science and Technology. Because of health concerns and the chemicals' ubiquity, the EPA has recently opened a research effort into their impact on drinking water. Peter Jaffe, the lead researcher and a professor of civil and environmental engineering at Princeton, said the researchers were very encouraged to see these bacteria substantially degrade the famously recalcitrant class of chemicals, but cautioned that more work was needed before reaching a workable treatment.

"This is a proof of concept," said Jaffe, the William L. Knapp '47 Professor of Civil Engineering. "We would like to get the removal higher, and then go and test it in the field."

PFAS (per- and polyfluoroalkyl substances) have been widely used in products from non-stick pans to firefighting foam, and the Environmental Protection Agency has said there is evidence that exposure to PFAS is harmful to human health. Because of this, U.S. manufacturers have phased out several versions of PFAS in their products. But the substances are long-lived and extremely difficult to remove from soil and groundwater. In recent years, local governments have been seeking ways to reduce the amount of PFAS in water supplies.

Because of the strength of the carbon-fluorine bond, these chemicals are extremely difficult to remove through conventional means. But Jaffe and co-researcher Shan Huang, an associate research scholar at Princeton, suspected that Acidimicrobium A6 might be an effective remedy.

The researchers first began working with the bacterium several years ago, when they investigated a phenomenon in which ammonium broke down in acidic, iron-rich soils in New Jersey wetlands and similar locations. Because removing ammonium is a critical part of sewage treatment, the researchers wanted to understand what was behind the process, called Feammox. In their initial research in 2013, Jaffe and fellow researchers removed soil samples from the Assunpink wetland outside Trenton. They cultivated the samples in the lab with an eye to identifying the microorganisms responsible for the Feammox process. The researchers learned that the Feammox reaction occurred in the presence of Acidimicrobium A6, but it required several years of painstaking work to isolate this organism and grow it as a pure culture.

One challenge in working with Acidimicrobium A6 is the bacterium's demand for iron both to grow and eliminate compounds like ammonium. Jaffe, along with graduate students Weitao Shuai and Melany Ruiz, now a post-doctoral researcher at Rutgers, determined that they could substitute an electrical anode for the iron in lab reactors. This allowed the researchers to more easily grow these bacteria and work with them; it also presented a possible way to develop reactors for remediation in the absence of iron.

When they sequenced the Acidimicrobium A6 genome, the researchers noticed certain characteristics that opened the possibility that the bacterium could be effective in removing PFAS.

"We knew this was a big environmental challenge, to find an organism that could degrade these perfluorinated organics," Jaffe said.

To test their hypothesis, the researchers sealed samples of Acidimicrobium A6 in lab containers and then tested the bacteria's ability to break down the compounds in lab reactors.

After 100 days, the researchers stopped the test and determined that the bacteria had removed 60 percent of the contaminants and released an equivalent amount of fluoride in the process. Jaffe said the 100-day period was an arbitrary length selected for the experiment, and that longer incubations might result in more PFAS removal. The researchers also plan to vary conditions in the reactor to find the optimum conditions for PFAS removal.
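To put a rough number on what longer incubations might achieve, one can assume first-order decay kinetics. This is purely an illustrative extrapolation; the study does not report a rate law, and the true kinetics may differ.

```python
import math

removed = 0.60   # fraction of PFOA/PFOS removed after 100 days
t_obs = 100.0    # days

# Implied first-order rate constant: C(t)/C0 = exp(-k * t)
k = -math.log(1 - removed) / t_obs
print(f"k = {k:.4f}/day, half-life = {math.log(2) / k:.0f} days")

# Projected removal under the same (assumed) kinetics
for t in (150, 200, 300):
    print(f"{t} days: {1 - math.exp(-k * t):.0%}")
# 150 days: 75%, 200 days: 84%, 300 days: 94%
```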

Acidimicrobium A6 thrives in low oxygen conditions, which makes it particularly effective for soil and groundwater remediation and allows it to function without expensive aeration. However, these bacteria also require iron and acidic soil conditions. Jaffe said this could limit their deployment, but adjusting soil conditions could also allow the bacteria to function in areas that do not naturally meet these requirements. Noting previous work on ammonium reduction by Acidimicrobium A6 in soil columns, constructed wetlands, and the electrochemical reactors, Jaffe said the researchers believe this could also be done for PFAS remediation.

Jaffe said the researchers are also working with Mohammad R. Seyedsayamdost, an associate professor of chemistry, and colleagues in the chemistry department to better understand the enzymes involved in the defluorination process. Characterizing those enzymes could provide insights that increase effectiveness in remediation.

Credit: 
Princeton University, Engineering School

Scientists construct energy production unit for a synthetic cell

image: This is an artist's impression of a synthetic cell, with the ATP production system in green.

Image: 
Bert Poolman / BaSyC consortium

Scientists at the University of Groningen have constructed synthetic vesicles in which ATP, the main energy carrier in living cells, is produced. The vesicles use the ATP to maintain their volume and their ionic strength homeostasis. This metabolic network will eventually be used in the creation of synthetic cells - but it can already be used to study ATP-dependent processes. The researchers described the synthetic system in an article that was published in Nature Communications on 18 September.

'Our aim is the bottom-up construction of a synthetic cell that can sustain itself and that can grow and divide,' explains University of Groningen Professor of Biochemistry Bert Poolman. He is part of a Dutch consortium that obtained a Gravitation grant in 2017 from the Netherlands Organisation for Scientific Research to realize this ambition. Different groups of scientists are producing different modules for the cell and Poolman's group was tasked with energy production.

Equilibrium

All living cells produce ATP as an energy carrier, but achieving sustainable production of ATP in a test tube is no small task. 'In known synthetic systems, all components for the reaction were included inside a vesicle. However, after about half an hour, the reaction reached equilibrium and ATP production declined,' Poolman explains. 'We wanted our system to stay away from equilibrium, just like in living systems.'

It took three Ph.D. students in his group nearly four years to construct such a system. A lipid vesicle was fitted out with a transport protein that could import the substrate arginine and export the product ornithine. Inside the vesicle, enzymes were present that broke down the arginine into ornithine. The free energy that this reaction provided was used to link phosphate to ADP, forming ATP. Ammonium and carbon dioxide were produced as waste products that diffused through the membrane. 'The export of ornithine produced inside the vesicle drives the import of arginine, which keeps the system running for as long as the vesicles are provided with arginine,' explains Poolman.

Transport protein

To create an out-of-equilibrium system, the ATP is used to maintain ionic strength inside the vesicle. A biological sensor measures ionic strength and if this becomes too high, it activates a transport protein that imports a substance called glycine betaine. This increases the cell volume and consequently reduces the ionic strength. 'The transport protein is powered by ATP, so we have both production and use of ATP inside the vesicle.'
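The push-pull between ATP synthesis (fed by arginine import) and ATP consumption (by the osmoregulatory transporter) is what keeps the vesicle away from equilibrium. The toy simulation below illustrates that logic only; the rate constants and the simple one-ATP-per-arginine bookkeeping are assumptions for illustration, not values from the Nature Communications paper.

```python
# Toy mass-balance model of the vesicle's energy module.
def simulate(hours=16.0, dt=0.01, k_make=5.0, k_use=4.0):
    atp, adp = 0.0, 1.0  # arbitrary units; external arginine assumed unlimited
    t = 0.0
    while t < hours:
        made = k_make * adp * dt  # arginine breakdown regenerates ATP from ADP
        used = k_use * atp * dt   # glycine betaine import consumes ATP
        atp += made - used
        adp += used - made
        t += dt
    return atp, adp

atp, adp = simulate()
print(f"steady ATP fraction ~ {atp / (atp + adp):.2f}")
# As long as arginine is supplied, ATP holds at a non-equilibrium steady
# level instead of running down as in earlier closed-vesicle systems.
```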

The system was left to run for 16 hours in the longest experiment that the scientists have performed. 'This is quite long - some bacteria can divide after just 20 minutes,' says Poolman. 'The current system should suffice for a synthetic cell that divides once every few hours.' Eventually, different modules like this one will be combined to create a synthetic cell that will function autonomously by synthesizing its own proteins from a synthetic genome.

Artificial chromosome

The current system is based on biochemical components. However, Poolman's colleagues at Wageningen University & Research are busy collecting the genes needed for the production of enzymes used by the system and incorporating them into an artificial chromosome. Other groups are working on modules such as lipid and protein synthesis or cell division. The final synthetic cell should contain DNA for all these modules and operate them autonomously like a living cell, but in this case engineered from the bottom up and including new properties. However, this is many years away. 'In the meantime, we are already using our ATP-producing system to study ATP-dependent processes and advance the field of membrane transport,' says Poolman.

Credit: 
University of Groningen

New study is first to show long-term durability of early combination treatment for type 2 diabetes

A new study, VERIFY (Vildagliptin Efficacy in combination with metfoRmIn For earlY treatment of type 2 diabetes), presented at this year's Annual Meeting of the European Association for the Study of Diabetes (EASD) in Barcelona, Spain (16-20 Sept, 2019) and published simultaneously in The Lancet, is the first to show that early combination therapy using vildagliptin and metformin in patients newly diagnosed with type 2 diabetes (T2D) leads to better long-term blood sugar control and a lower rate of treatment failure than metformin alone (the current standard-of-care treatment for patients newly diagnosed with T2D).

Vildagliptin (also known by its trade names of Galvus and Zomelis) is an oral drug used to treat type 2 diabetes, and belongs to the class of drugs known as dipeptidyl peptidase-4 (DPP-4) inhibitors. By inhibiting this key enzyme, DPP-4 inhibitors promote secretion of insulin by the pancreas, and inhibit production of glucagon, and thus help control blood sugar and avoid hyperglycaemia.

Metformin has been the first-line treatment for T2D for several decades (the exact time varying by country), and belongs to the biguanide class of diabetes drugs. Currently, the first-line treatment recommended for type 2 diabetes is metformin monotherapy, with combination therapy only introduced later, following treatment failure.

This study, led by EASD President Professor David Matthews (Oxford Centre for Diabetes, Endocrinology and Metabolism, and Harris Manchester College, University of Oxford, UK) and colleagues, included 2001 patients from 254 centres in 34 countries, with 998 randomised to receive early combination therapy using vildagliptin and metformin, and 1003 randomised to receive initial metformin alone, across a 5-year treatment period (enrolment occurred between 2012 and 2014 and follow-up of the final patients was completed in 2019).

The study was divided into 3 periods. In study period 1, patients received either the early combination treatment with metformin (individual, stable daily dose of 1000 to 2000 mg, depending on the patient's tolerability) and vildagliptin 50 mg twice daily, or standard-of-care initial metformin monotherapy (individual, stable daily dose of 1000 to 2000 mg) and placebo twice daily.

Treatment response was monitored by patients visiting their centre every 13 weeks, when the patients' level of glycated haemoglobin (HbA1c -- a measure of blood sugar control) was assessed. If the initial treatment did not maintain HbA1c below 53 mmol/mol (7.0%) during period 1, confirmed at two consecutive scheduled visits 13 weeks apart, this was defined as treatment failure; patients in the metformin monotherapy group then received vildagliptin 50 mg twice daily in place of the placebo, while patients in the early combination therapy group continued on the combination.

This second period was thus a two-arm phase in which the allocated early combination therapy approach was tested against a later combination strategy of metformin with vildagliptin added if necessary. A second treatment failure, requiring insulin, was assessed as an endpoint when loss of glycaemic control was confirmed at two further visits; physicians would then move patients onto insulin therapy. Patients who did not fail in period 1 but maintained good glycaemic control (HbA1c below 53 mmol/mol, 7.0%) continued on their randomised study medication (early combination or initial metformin monotherapy) for up to five years.

The primary efficacy endpoint was the time from randomisation to initial treatment failure, defined as an HbA1c measurement of at least 53 mmol/mol (7.0%) at two consecutive scheduled visits, 13 weeks apart, during period 1.
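For readers switching between the two HbA1c units, the standard IFCC-NGSP master equation links them; the function below is a convenience sketch of that published conversion.

```python
def percent_to_mmol_per_mol(hba1c_percent: float) -> float:
    """Convert HbA1c from NGSP % to IFCC mmol/mol (master equation)."""
    return 10.929 * (hba1c_percent - 2.15)

print(round(percent_to_mmol_per_mol(7.0)))  # 53, the trial's failure threshold
print(round(percent_to_mmol_per_mol(6.5)))  # 48, the US diagnostic cut-off
```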

A total of 1598 (79.9%) patients completed the 5-year study: 811 (81.3%) in the early combination therapy group and 787 (78.5%) in the monotherapy group. Initial treatment failure during period 1 occurred in 429 (43.6%) patients in the combination treatment group and 614 (62.1%) patients in the monotherapy group. The median observed time to treatment failure in the monotherapy group was 36.1 months, while the median time to treatment failure for those receiving early combination therapy could only be estimated to be beyond the 5-year study duration, at 61.9 months. Both treatment approaches were safe and well tolerated.

The risk of losing blood sugar control (an HbA1c of 53 mmol/mol (7.0%) or more at two consecutive visits) was approximately halved in the early combination treatment group compared with the monotherapy group over the 5-year study duration (a statistically significant 49% relative risk (RR) reduction). During period 2, when patients in both groups were (or could be) receiving combination treatment, the relative risk of losing blood sugar control was also reduced, by 26%, among those randomised to receive the early combination treatment, compared with those who transferred to combination therapy after their first treatment failure.

This showed that the early combination strategy was superior to a sequential strategy involving later intensification of the failing monotherapy with combination therapy, as demonstrated by a durable effect on blood glucose levels. The authors believe the better long-term 'durability' of blood sugar control seen in the combination group could be due to the complementary mechanisms of action of the two drugs.

The authors say: "The findings of VERIFY support and emphasise the importance of achieving and maintaining early glycaemic control" and refer to previous studies (such as the UK Prospective Diabetes study), in which early treatment intensification was associated with a legacy effect, where the reduction in vascular complications in the intensive group was maintained or strengthened over 10 years after study completion. In the Diabetes and Aging epidemiology study, an HbA1c value above 6·5% (the threshold for T2D diagnosis in the USA) for the first year following diagnosis was associated with worse outcomes (increasing microvascular events and mortality risk) over the subsequent 10 years of follow-up.

The authors point out that durable HbA1c values below 6.5% are unlikely to be achieved with monotherapy alone. They say: "Real-world evidence has shown how delayed treatment intensification after monotherapy failure results in increasing time spent with avoidable periods of hyperglycaemia, raising a crucial barrier to optimised care. The durable effect we observed with an early combination strategy in the VERIFY study provides initial support for such an approach as an effective way to intensify blood sugar control early after diagnosis and potentially avoid future complications."

They conclude: "Early intervention with a combination therapy strategy provides greater and durable long-term benefits compared with the current standard-of-care monotherapy with metformin for patients with newly diagnosed type 2 diabetes."

Credit: 
Diabetologia

Study questions routine sleep studies to evaluate snoring in children

Pediatricians routinely advise parents of children who snore regularly and have sleepiness, fatigue or other symptoms consistent with sleep-disordered breathing to get a sleep study; this can help determine whether their child has obstructive sleep apnea, which is often treated with surgery to remove the tonsils and adenoids (adenotonsillectomy). Pediatricians often make surgery recommendations based on the results of this sleep study.

But a new finding from the University of Maryland School of Medicine (UMSOM) suggests that the pediatric sleep study -- used both to diagnose pediatric sleep apnea and to measure improvement after surgery -- may be an unreliable predictor of who will benefit from having an adenotonsillectomy.

About 500,000 children under age 15 have adenotonsillectomies every year in the U.S. to treat obstructive sleep apnea. The American Academy of Pediatrics (AAP) recommends the surgery as a first-line therapy to treat the condition, which can cause behavioral issues, cardiovascular problems, poor growth, and developmental delays. The premise is that surgically removing or reducing the severity of the obstruction to the upper airway will improve sleep and reduce other problems caused by the disorder.

In 2012, the AAP recommended that pediatricians should screen children who snore regularly for sleep apnea, and refer children suspected of having the condition for an overnight in-laboratory sleep study. The group also recommended an adenotonsillectomy based on the results of the test. But results from the new UMSOM study, published in the September issue of the journal Pediatrics, call into question those recommendations because the data they analyzed found no relationship between improvements in sleep studies following surgery and resolution of most sleep apnea symptoms.

"Resolution of an airway obstruction measured by a sleep study performed after an adenotonsillectomy has long been thought to correlate with improvement in sleep apnea symptoms, but we found this may not be the case," said study lead author Amal Isaiah, MD, PhD, an Assistant Professor of Otorhinolaryngology--Head and Neck Surgery and Pediatrics at UMSOM. "Our finding suggests that using sleep studies alone to manage sleep apnea in children may be a less than satisfactory way of determining whether surgery is warranted."

To conduct the study, Dr. Isaiah and his colleagues, Kevin Pereira, MD, from UMSOM and Gautam Das, PhD, at the University of Texas at Arlington conducted a new analysis of findings from 398 children, ages 5 to 9 years, who participated in the Childhood Adenotonsillectomy Trial (CHAT), a randomized trial published in 2013 that compared adenotonsillectomy with watchful waiting to treat sleep apnea. They found that resolution of sleep apnea, as determined by sleep study results, did not correlate with improvements in the majority of outcome measures including behavior, cognitive performance, sleepiness and symptoms of attention deficit hyperactivity disorder.

"This is an important finding that should be carefully considered by the pediatric medical community to determine whether recommendations concerning the management of sleep apnea need to be updated," said E. Albert Reece, MD, PhD, MBA, Executive Vice President for Medical Affairs, UM Baltimore, and the John Z. and Akiko K. Bowers Distinguished Professor and Dean, University of Maryland School of Medicine. "Practice guidelines, in every field of medicine, should reflect the current state of science."

In the CHAT trial, the researchers found that 79 percent of children who had the surgery had a normal sleep study 7 months later, compared to 46 percent of those who had watchful waiting. Sleep apnea resolved spontaneously in about half of the children who underwent watchful waiting. The trial also demonstrated no significant improvement in how children performed on cognitive tests assessing how well they could focus, analyze and solve problems, and recall what they had just learned.

The CHAT researchers did find, however, that those who had early adenotonsillectomy had improved symptoms, quality of life, and behavior.

Credit: 
University of Maryland School of Medicine

Sesame yields stable in drought conditions

image: This is a sesame flower growing atop a sesame plant in western Texas research fields.

Image: 
Irish Lorraine B. Pabuayon

Texas has a long history of growing cotton. It's a resilient crop, able to withstand big swings in temperature fairly well. However, growing cotton in the same fields year after year can be a bad idea. Nutrients can get depleted. Disease can lurk in the ground during the winter season, only to attack the following year. Thus, rotating cotton with other crops could be a better system.

Agronomists have been researching various alternative crops that will grow well in western Texas. This area overlies the Ogallala Aquifer, which has been hit extremely hard by drought over the past few decades. Another crop, sorghum, grows well with low water availability, but its yield can be greatly affected by drought conditions.

Irish Lorraine B. Pabuayon, a researcher at Texas Tech University (TTU), is on the team looking at an alternative crop for west Texas: sesame.

Like cotton and sorghum, sesame is also a "low-input" crop. This means it does not need a great deal of water, something that vegetable crops, corn and wheat need regularly and in large quantities.

"When introducing new crops to a water-limited system, it is important for growers to justify the water requirements of the new crops," says Pabuayon. "Properly determining the water requirements of the crops is important. Management decisions for wise use of limited water resources requires understanding a crop's moisture requirements."

Pabuayon and the TTU team found that even under conditions that lowered sorghum and cotton yields, sesame performed well. This could be good news for west Texas farmers.

"Our results showed that sesame yields were not significantly altered under water-deficit conditions," says Pabuayon. "Sesame continued to have consistent yields, even when water-deficit conditions decreased sorghum's yield by 25% and cotton's yield by 40%."

Having another crop that has good market value and can grow well during drought could benefit west Texas farmers. According to Pabuayon, sesame seeds are commonly used for food consumption and other culinary uses. The seeds are high in fat and are a good source of protein. Sesame is a major source of cooking oil. The remaining parts of sesame, after oil extraction, are good sources of livestock feed. Sesame has uses in the biodiesel industry, and even in cosmetics. This means there are multiple markets for the tiny seeds.

"Provided that the market price of sesame can support current yields, the results are favorable for low-input sesame production in west Texas," says Pabuayon. "However, the relatively low yields of sesame (per acre, compared to cotton and sorghum) suggest opportunities for additional genetic advancement. Currently, sesame varieties available for Texas are well-suited as an alternative crop for water-limited crop production systems.

Credit: 
American Society of Agronomy

Stabilizing neuronal branching for healthy brain circuitry

video: This is an illustrated video of the study covered in this press release, designed to be understood by a broad audience.

Image: 
Thomas Jefferson University

PHILADELPHIA - Neurons form circuits in our brain by creating tree-like branches to connect with each other. Newly forming branches rely on the stability of microtubules, a railway-like system important for the transport of materials in cells. The mechanisms that regulate the stability of microtubules in branches are largely unknown. New research from the Vickie & Jack Farber Institute for Neuroscience - Jefferson Health has identified a key molecule that stabilizes microtubules and reinforces new neuronal branches.

"Like the railways to a new city, stable microtubules transport valuable material to newly formed branches so that they can grow and mature," explains Dr. Le Ma, associate professor in the department of Neuroscience and senior author of the study. Microtubule stability is regulated by proteins called microtubule-associated proteins (MAPs), which include many subtypes. Previous work from Dr. Ma and Stephen Tymanskyj, a postdoctoral fellow in the lab, had identified a subtype called MAP7, and found that it was localized at sites where new branches are formed. This made it a good candidate for regulating microtubule stability.

In the new study, published August 7 in the Journal of Neuroscience, Dr. Tymanskyj and Dr. Ma used genetic tools to remove MAP7 from developing rodent sensory neurons and found that without MAP7, branches can still grow but retract more frequently. This means the branches cannot make complete and lasting connections without MAP7. The researchers also introduced more MAP7 protein to branches that had been cut by a laser and found that it could slow down or even prevent the retraction that usually happens in response to injury. This suggests that manipulating MAP7 could potentially rescue injured neuronal branches.

A key finding of the study demonstrated a unique property of MAP7 when it interacts with microtubules. The researchers found that in cells, MAP7 binds to specific regions of microtubules and makes them very stable, but avoids the microtubule ends, where individual building blocks are rapidly added or removed. This binding property prevents microtubules, or the cellular railway, from completely disassembling when branches retract. It also promotes steady re-assembly of microtubules to extend the cellular railway for subsequent branch growth. The study is the first to demonstrate this feature, which has not been observed for other MAPs.

Neuronal branches can be damaged by physical injury or toxicity. Understanding the role of MAP7 suggests new ways to reduce or avert that damage. "Our research has identified a new molecular mechanism of microtubule regulation in branch formation and has suggested a new target to potentially treat nerve injury," concludes Dr. Ma, who has already initiated new studies exploring this.

Credit: 
Thomas Jefferson University

Researchers develop thermo-responsive protein hydrogel

image: An illustration of how an engineered Q protein self-assembles to form fiber-based hydrogels at low temperature. These hydrogels have a porous microstructure that allows them to be used for drug delivery applications.

Image: 
NYU Tandon

BROOKLYN, New York, Tuesday, September 17, 2019 - Imagine a perfectly biocompatible, protein-based drug delivery system durable enough to survive in the body for more than two weeks and capable of providing sustained medication release. An interdisciplinary research team led by Jin Kim Montclare, a professor of biomolecular and chemical engineering at the NYU Tandon School of Engineering, has created the first protein-engineered hydrogel that meets those criteria, advancing an area of biochemistry critical not only to the future of drug delivery, but also to tissue engineering and regenerative medicine.

Hydrogels are three-dimensional polymer networks that reversibly transition from solution to gel in response to physical or chemical stimuli, such as temperature or acidity. These polymer matrices can encapsulate cargo, such as small molecules, or provide structural scaffolding for tissue engineering applications. Montclare is lead author of a new paper in the journal Biomacromolecules, which details the creation of a hydrogel comprised of a single protein domain that exhibits many of the same properties as synthetic hydrogels. Protein hydrogels are more biocompatible than synthetic ones, and do not require potentially toxic chemical crosslinkers.

"This is the first thermo-responsive protein hydrogel based on a single coiled-coil protein that transitions from solution to gel at low temperatures through a process of self-assembly, without the need for external agents," said Montclare. "It's an exciting development because protein-based hydrogels are much more desirable for use in biomedicine."

The research team conducted experiments encapsulating a model small molecule within their protein hydrogel, discovering that small molecule binding increased thermostability and mechanical integrity and allowed for release over a timeframe comparable to other sustained-release drug delivery vehicles. Future work will focus on designing protein hydrogels tuned to respond to specific temperatures for various drug delivery applications.

Credit: 
NYU Tandon School of Engineering

Guppies teach us why evolution happens

image: Natural settings allow comparative studies of guppies to see how the life histories of those that lived with and without predators differed.

Image: 
David Reznick / UCR

Guppies, a perennial pet store favorite, have helped a UC Riverside scientist answer a key question about evolution:

Do animals evolve in response to the risk of being eaten, or to the environment that they create in the absence of predators? Turns out, it's the latter.

David Reznick, a professor of biology at UC Riverside, explained that in the wild, guppies can migrate over waterfalls and rapids to places where most predators can't follow them. Once they arrive in safer terrain, Reznick's previous research shows they evolve rapidly, becoming genetically distinct from their ancestors.

"We already knew that they evolved quickly, but what we didn't yet understand was why," Reznick said. In a new paper published in American Naturalist, Reznick and his co-authors explain the reason the tiny fish evolve so quickly in safer waters.

To answer their questions, the scientists traveled to Trinidad, guppies' native habitat, and conducted an experiment. They moved guppies from areas in streams where predators were plentiful to areas where predators were mostly absent. Over the course of four years, they studied how the introduced guppies changed in comparison to ones from where they originated.

"If guppies evolve because they aren't at risk of becoming food for other fish, then evolution should be visible right away," Reznick said. "However, if in the absence of predators they become abundant and deplete the environment of food, then there will be a lag in detectable changes."

Guppies from all four streams were marked so they could be tracked over the course of four years. The scientists tracked the males, which tend to live about five months. They looked at the fishes' age and size at maturity, which are key traits affecting population growth.

They also tracked how the environment changed as the guppy populations expanded, focusing on the abundance of food such as algae and insects, as well as the presence of other nonpredator fish.

They found a two-to-three-year lag between when guppies were introduced and when males evolved, suggesting the second hypothesis was correct: guppies first changed their new environments, and then, as a result, they changed themselves.

"The speed of evolution makes it possible to study how it happens," Reznick said. "The new news is that organisms can shape their own evolution by changing their environment."

One of Reznick's current projects includes applying these concepts to questions about human evolution.

"Unlike guppies and other organisms, human population density seems to increase without apparent limit, which increases our impact on our environment and on ourselves," he said.

Credit: 
University of California - Riverside

Poor diabetes control costs the NHS in England £3 billion a year in potentially avoidable hospital treatment

Poor diabetes control was responsible for £3 billion in potentially avoidable hospital treatment in England in the operational year 2017-2018, according to new research comparing the costs of hospital care for 58 million people with and without diabetes.

The findings, being presented at this year's European Association for the Study of Diabetes (EASD) Annual Meeting in Barcelona, Spain (16-20 September), reveal that on average, people with type 1 diabetes require 6 times more hospital treatment (£3,035 per person per year), and those with type 2 diabetes twice as much care (£1,291; after adjusting for their older age), than people without diabetes (£510).

Other than age, diabetes is the largest contributor to healthcare cost and reduced life expectancy in Europe. In England, two-thirds of people with type 1 diabetes and a third of those with type 2 diabetes have poor control over their blood sugar levels, increasing the risk of multiple long-term health problems ranging from kidney disease to blindness, and the need for additional hospital care.

In this study, researchers used data from the NHS Digital Hospital Episode Statistics in England and the National Diabetes Audit (2017-2018) to compare the cost of hospital treatment provided to people with type 1 and type 2 diabetes to people without diabetes, after adjusting for the effect of age.

Data on elective (planned) and emergency admissions, outpatient visits, and accident and emergency department (A & E) attendances for 58 million people, including 2.9 million with type 2 diabetes and 243,000 with type 1 diabetes, between 2017 and 2018 were analysed. This included 90% of all hospital care provided across England.

Of total hospital costs of £36 billion in 2017-2018, the NHS in England spent around £5.5 billion on hospital care for people with diabetes. Of that sum, an estimated £3 billion (8%) was excess expenditure on diabetes (after accounting for age)--almost 10% of the NHS hospital budget.

Compared to people without diabetes, the average annual cost of elective care was more than two times higher for people with type 2 diabetes (£759 vs £331), and the average cost of emergency care was three times higher (£532 vs £179), having allowed for their age difference. Similarly, average costs for people with type 1 diabetes were five-fold greater for elective care (£1,657 vs £331) and eight-fold higher for emergency care (£1,378 vs £179).
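The fold differences quoted above follow directly from the per-person figures; a quick arithmetic check:

```python
# Per-person annual hospital costs from the paragraph above (GBP)
costs = {
    "elective":  {"no_diabetes": 331, "type_2": 759, "type_1": 1657},
    "emergency": {"no_diabetes": 179, "type_2": 532, "type_1": 1378},
}
for setting, c in costs.items():
    t2 = c["type_2"] / c["no_diabetes"]
    t1 = c["type_1"] / c["no_diabetes"]
    print(f"{setting}: type 2 x{t2:.1f}, type 1 x{t1:.1f}")
# elective: type 2 x2.3, type 1 x5.0
# emergency: type 2 x3.0, type 1 x7.7
```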

"People with diabetes are admitted to hospital more often, especially as emergencies, and stay on average longer as inpatients. These increased hospital costs, 40% of which come from non-elective and emergency care, are three times higher than the current costs of diabetes medication. Improved management of diabetes by GPs and diabetes specialist care teams could improve the health of people with diabetes and substantially reduce the level of hospital care and costs", says author Dr Adrian Heald from Salford Royal Hospital in the UK.

The authors note that the study did not include the indirect costs associated with diabetes, such as those related to increased death and illness, work loss, and the need for informal care.

Credit: 
Diabetologia

Patients with high blood sugar variability much more likely to die than those with stable visit-to-visit readings

New research presented at this year's Annual Meeting of the European Association for the Study of Diabetes (EASD) in Barcelona, Spain (16-20 Sept) shows that patients with the highest variability in their blood sugar control are more than twice as likely to die as those with the most stable blood sugar measurements. The study is by Professor Ewan Pearson, University of Dundee, UK and Dr Sheyu Li, West China Hospital, Sichuan University, Chengdu, China, and University of Dundee, UK, and colleagues.

Measuring glycated haemoglobin (HbA1c) in a patient's blood has for many years been a standard method for assessing blood sugar control over the previous weeks and months. Usually, the focus is on whether a patient's HbA1c level is at or below a treatment target. However, some patients have highly variable HbA1c, and others have stable HbA1c from visit to visit. It is unclear whether this variability in HbA1c is associated with altered prognosis, independent of a patient's average HbA1c from diagnosis. In this study, the authors aimed to investigate the association between visit-to-visit HbA1c variability and cardiovascular events and microvascular complications in patients with newly diagnosed type 2 diabetes.

The study retrospectively recruited patients from Tayside and Fife in the Scottish Care Information-Diabetes Collaboration (SCI-DC) who were observable from diagnosis and had at least five HbA1c measurements before the outcomes. The authors used a measure called the HbA1c variability score (HVS), calculated as the percentage of visit-to-visit HbA1c changes greater than 0.5% (5.5 mmol/mol) among all of an individual's HbA1c measurements.
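In code, the score reduces to a few lines. The sketch below is one plausible reading of that definition (the text leaves the exact denominator slightly ambiguous, so here the count of visit-to-visit changes is used); the patient readings are hypothetical.

```python
def hba1c_variability_score(readings, threshold=0.5):
    """HVS: percentage of visit-to-visit HbA1c changes exceeding the threshold.

    `readings` are HbA1c values in % units, in chronological order.
    """
    changes = [abs(b - a) for a, b in zip(readings, readings[1:])]
    if not changes:
        return 0.0
    return 100.0 * sum(c > threshold for c in changes) / len(changes)

visits = [7.1, 7.9, 7.2, 7.3, 8.1]      # hypothetical patient, five visits
print(hba1c_variability_score(visits))  # 75.0 -> falls in the 60-80% band
```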

Ten outcomes were studied, including the combined outcome of major adverse cardiovascular events (known as MACE), all-cause mortality, cardiovascular death, coronary artery disease (CAD), ischemic stroke, heart failure, diabetic retinopathy (DR), diabetic peripheral neuropathy (DPN), diabetic foot ulcer (DFU) and the new onset of chronic kidney disease (CKD). Statistical models adjusting for baseline characteristics were used to assess the association of HVS with outcomes.

For each outcome, the patients were divided into 5 groups, with the patients with the lowest variability (0-20%) as the reference. Compared with this group, patients with an HVS of more than 60% (the 60-80% and 80-100% groups) had increased risks of all the outcomes studied. This means that outcomes are worse for patients in whom more than 60% of HbA1c measurements differ by more than 0.5% from the previous measure.

When comparing the highest variability group (80-100%) with the lowest (0-20%), the highest group had a 2.4 times increased risk of each of the three outcomes of MACE, all-cause mortality and cardiovascular mortality. There was also a 2.6 times increased risk of coronary artery disease; a doubled risk of stroke; a tripled risk of heart failure, DPN and CKD; a five-fold increased risk of diabetic foot ulcer; and a seven-fold increased risk of DR. Adjustment for baseline characteristics confirmed the results.

The authors say: "Higher HbA1c variability is associated with increased risks of all-cause mortality, cardiovascular events and microvascular complication of diabetes independently of accumulated exposure of high HbA1c."

The authors say HbA1c variability varies across individuals, explaining: "A previous descriptive study we completed suggests higher HbA1c variability was associated with age, sex, body mass, social deprivation and treatment patterns, and this difference may explain some of the increased risk in those with high variability in HbA1c. Frequent fluctuation of HbA1c can be driven by multiple clinical factors, including variation in diet and lifestyle, changing to different anti-diabetic drugs and/or withdrawal of anti-diabetic treatment, and general healthcare quality."

They explain further: "High variation of HbA1c is more common in patients with a higher average level of HbA1c. However, the association with adverse outcomes seen with high HbA1c variability remains even after adjusting for this baseline difference. Thus, a highly variable HbA1c should be considered as a major risk factor for adverse outcomes, even if the average HbA1c is not too high. At this stage, it is important to emphasize that we can't say that the adverse outcomes are definitively caused by the increased variability in HbA1c, and therefore we cannot yet be sure that reducing HbA1c variability will reduce that risk."

Credit: 
Diabetologia

Teen e-cigarette use doubles since 2017

Data from the 2019 Monitoring the Future Survey of eighth, 10th and 12th graders show alarmingly high rates of e-cigarette use compared to just a year ago, with rates doubling in the past two years. University of Michigan, Ann Arbor, scientists who coordinate and evaluate the survey released the data early to The New England Journal of Medicine (NEJM) to notify public health officials working to reduce vaping by teens. The survey is funded by the National Institute on Drug Abuse (NIDA), part of the National Institutes of Health.

The new data show a significant increase in past-month vaping of nicotine in each of the three grade levels since 2018. In 2019, the prevalence of past-month nicotine vaping was more than 1 in 4 students in 12th grade, 1 in 5 in 10th grade, and 1 in 11 in eighth grade.

"With 25% of 12th graders, 20% of 10th graders and 9% of eighth graders now vaping nicotine within the past month, the use of these devices has become a public health crisis," said NIDA Director Dr. Nora D. Volkow. "These products introduce the highly addictive chemical nicotine to these young people and their developing brains, and I fear we are only beginning to learn the possible health risks and outcomes for youth."

"Parents with school-aged children should begin paying close attention to these devices, which can look like simple flash drives, and frequently come in flavors that are appealing to youth," said University of Michigan lead researcher Dr. Richard Miech. "National leaders can assist parents by stepping up and implementing policies and programs to prevent use of these products by teens."

Additional findings from the 2019 Monitoring the Future Survey, documenting the use of and attitudes about marijuana, alcohol and other drugs, will be released in December.

Credit: 
NIH/National Institute on Drug Abuse