Body's ageing process accelerated by DNA changes, study suggests

DNA changes throughout a person's life can significantly increase their susceptibility to heart conditions and other age-related diseases, research suggests.

Such alterations - known as somatic mutations - can impact the way blood stem cells work and are associated with blood cancers and other conditions.

A study suggests that these somatic mutations, and the diseases they cause, may make a person's biological age - how old their body appears - advance faster than their chronological age - the number of years they have been alive.

The study, by scientists from the Universities of Edinburgh and Glasgow, examined these changes and their potential effects in more than 1,000 older people from the Lothian Birth Cohorts (LBCs), born in 1921 and 1936.

The LBCs are a group of people - now in their 80s and 90s - who sat intelligence tests as 11-year-olds. They are some of the most intensively studied research participants in the world.

The scientists studied people whose biological and chronological ages were separated by a large gap.

They found that the participants with somatic mutations - around six per cent of the cohort - had a biological age almost four years older than those with no alterations.

Experts say they will now explore the link between these DNA changes and biological ageing acceleration.

The study, published in Current Biology, was funded by Alzheimer's Research UK.

Dr Tamir Chandra, Group Leader at the University of Edinburgh's MRC Human Genetics Unit, said: "Previously, somatic mutations have largely been studied in cancer. Our findings suggest they play a role in other diseases, which will change the way we study disease risk."

Credit: 
University of Edinburgh

The neurobiological mechanisms behind schizophrenia may depend on gender

The neurobiological pathophysiology of schizophrenia differs significantly between males and females, according to a new study. The findings suggest a possible need for more sex-specific treatments for schizophrenia. The study was the first to identify a number of sex-specific genes related to schizophrenia using neurons derived from induced pluripotent stem cells. The results were published in Nature Communications.

Co-ordinated by the University of Eastern Finland, the University of Helsinki and Karolinska Institutet, the study investigated the differences in gene and protein expression in neurons from identical twins discordant for schizophrenia and healthy controls, as well as between males and females. The researchers used induced pluripotent stem cell technology, where neurons were generated from pluripotent stem cells induced from study participants' skin cells.

Schizophrenia typically manifests after adolescence. Hundreds of genes are known to contribute to the risk of schizophrenia, but the neurobiological mechanisms leading to the onset of the illness are poorly understood. In the present study, researchers were able to identify disease-specific changes in neurons by comparing cells from monozygotic, genetically identical twin pairs in which one twin had schizophrenia and the other was healthy.

Schizophrenia was associated with alterations in several pathways, such as those related to glycosaminoglycan and neurotransmitter metabolism and GABAergic synapse. However, a large proportion of genes related to schizophrenia were expressed differentially in the cells of males and females.

According to the researchers, the results imply that the mechanisms involved in the development of schizophrenia differ at least partially between males and females, and these differences may matter in the choice of treatment. The fact that many genes related to schizophrenia are sex-specific may explain why symptoms appear after adolescence, when the expression of many sex-specific genes changes.

Neurons derived from induced pluripotent stem cells correspond to the developmental stage of the second trimester of pregnancy. Thus, the results of the present study indicate that schizophrenia-related brain changes may be present early in utero, and differences between monozygotic twins can also be observed already at this point.

Credit: 
University of Eastern Finland

Study finds increase in women giving TED talks but not ethnic minorities

Women gave more than half of TED talks in the first half of 2017, up from less than one-third in 2006, according to a new study published in Political Research Exchange. But the German research team also found that ethnic minorities remain under-represented as TED speakers, giving just one in five talks over the same time period.

"Our results raise some concerns, particularly about the representation of certain ethnic groups in these talks," says lead author Carsten Schwemmer from the University of Bamberg. "This highlights the importance of speaker diversity to reduce stereotypes about scientists and people driving societal change."

Since 2006, talks given at TED events and conferences have become an important means of communicating the latest developments in science, culture and society, with the talks streamed online to a global audience of millions. Many well-known scientists, politicians and businesspeople have given TED talks, including Stephen Hawking, Al Gore and Elon Musk. But this has raised concerns that TED talks are being dominated by white men and their interests. This is what Schwemmer and his colleague, Sebastian Jungkunz, set out to explore.

To do this, they used facial recognition technology to determine the gender and ethnicity of speakers giving 2,333 TED talks between 2006 and 2017, representing all those available on YouTube's main TED channel. They also applied automated text analysis to transcripts of the talks, to determine the main topics being discussed, and to 1.2 million comments left by viewers of the talks on YouTube, in order to assess reaction and feedback.

While this kind of analysis of the representation of, and attitudes towards, different groups in traditional media is quite common, it is much rarer for digital platforms. This study was one of the first to apply facial recognition techniques to social science research. In their study, the researchers also advocate for responsible use of such technology, and support initiatives like the Safe Face Pledge, which provides guidelines for the ethical use of facial analysis technology.

Schwemmer and Jungkunz found that the proportion of women giving TED talks had increased steadily since 2006, which they attributed to efforts by the TED organization to achieve a more balanced gender representation. But the proportion of speakers from non-white ethnic groups had remained fairly static over the same time, at just one in five. As a consequence, white men still made up just over half (56%) of all speakers between 2006 and the first half of 2017.

Talks discussing inequalities such as violence against women and racism, which the researchers thought would be particularly important for women and ethnic minorities, were also in a minority, accounting for just 3% of talks, although this proportion did rise over time. These talks also received more negative comments than talks on other topics, perhaps, say the researchers, because they often contained depressing rather than entertaining content.

The study also found that non-white TED speakers received more positive online comments than white speakers, whereas female speakers received more negative and hateful comments than male speakers.

"Digital content providers like TED media should increase their efforts to ensure that talking about science and important matters of societal change on a global stage does not remain a privilege of white people," concludes Schwemmer. "Otherwise, the under-representation of certain ethnic groups in the digital sphere can, much like in traditional media sources, further reinforce stereotypes and negative attitudes."

Credit: 
Taylor & Francis Group

Novel math could bring machine learning to the next level

image:  The new approach allows artificial intelligence to learn to recognize transformed images much faster.

Image: 
Diogo Matias

A team of Italian mathematicians, including one who is also a neuroscientist from the Champalimaud Centre for the Unknown (CCU), in Lisbon, Portugal, has shown that artificial vision machines can learn to recognize complex images spectacularly faster by using a mathematical theory that was developed 25 years ago by one of this new study's co-authors. Their results have been published in the journal Nature Machine Intelligence.

During the last decades, machine vision performance has exploded. For example, these artificial systems can now learn to recognise virtually any human face - or to identify any individual fish moving in a tank, in the midst of a large number of other almost identical fish which are also moving.

The machines we're talking about are, in fact, electronic models of networks of biological neurons, and their aim is to simulate the functioning of our brain, which is as good as it gets at performing these visual tasks - and this, without any conscious effort on our part.

But how do these neural networks actually learn? In the case of face recognition, for instance, they do it by acquiring experience about what human faces look like in the form of a series of portraits. More specifically, after being digitized into a matrix of pixel values (think of your computer monitor's RGB system), each image is "crunched" inside the neural network, which then manages to extract general, meaningful features from the set of sample faces (such as the eyes, mouth and nose).
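To make the pixel-matrix representation concrete, here is a minimal sketch (the toy values are made up for illustration): an image is just a grid of intensity numbers, which is typically flattened into a single vector before being fed to a network.

```python
import numpy as np

# A toy 3x3 grayscale "portrait": each entry is a pixel intensity in [0, 1].
image = np.array([[0.0, 0.5, 0.0],
                  [0.5, 1.0, 0.5],
                  [0.0, 0.5, 0.0]])

# Before being "crunched" by a network, the matrix is flattened into a
# single input vector, one value per pixel.
x = image.flatten()
print(x.shape)  # → (9,)
```

A real face image works the same way, just with far more pixels (and one matrix per colour channel).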

This learning (deep learning, in its more modern development) then enables the machine to spit out another set of values, which will in turn enable it, for instance, to identify a face it has never seen before in a databank of faces (much like a fingerprint database), and therefore to predict who that face belongs to with great accuracy.

The story of Clever Hans

But before the neural network can begin to perform this well, it is typically necessary to present it with thousands of faces (i.e. matrices of numbers). Moreover, much as these machines have been increasingly successful at pattern recognition, the fact is that nobody really knows what goes on inside them as they learn their task. They are, basically, black boxes. You feed them something, they spit out something, and if you designed your electronic circuits properly... you'll get the correct answer.

What this means is that it is not possible to determine which or how many features the machine is actually extracting from the initial data - and not even how many of those features are really meaningful for face recognition. "To illustrate this, consider the paradigm of the wise horse", says first author of the study Mattia Bergomi, who works in the Systems Neuroscience Lab at the CCU.

The story dates from the early years of the 20th century. It's about a horse in Germany called Clever Hans that, so his master claimed, had learned to do arithmetic and announce the result of additions, subtractions, etc. by tapping one of its front hooves on the ground the right number of times. Everyone who witnessed the horse's performance was convinced it could count (the event was even reported by the New York Times). But then, in 1907, a German psychologist showed that the horse was in fact picking up unconscious cues in its master's body language that were telling it when to stop tapping...

"It's the same with machine learning; there is no control over how it works or what it has learned during training", Bergomi explains. The machine, having no a priori knowledge of faces, just somehow does its stuff - and it works.

This led the researchers to ask: could there be a way to inject some knowledge of the real world (about faces or other objects) into the neural network, before training, in order to cause it to explore a more limited space of possible features instead of considering them all - including those that are impossible in the real world? "We wanted to control the space of learned features", Bergomi points out. "It's similar to the difference between a mediocre chess player and an expert: the former sees all possible moves, while the latter sees only the good ones", he adds.

Another way of putting it, he says, is by saying that "our study addresses the following simple question: When we train a deep neural network to distinguish road signs, how can we tell the network that its job will be much easier if it only has to care about simple geometrical shapes such as circles and triangles?".

The scientists reasoned that this approach would substantially reduce training time - and, not less importantly, give them a "whiff" of what the machine might be doing to obtain its results. "Allowing humans to drive the learning process of learning machines is fundamental to move towards a more intelligible artificial intelligence and reduce the skyrocketing cost in time and resources that current neural networks require in order to be trained", he remarks.

What's in a shape?

Here's where a very abstract and novel mathematical theory, called "topological data analysis" (TDA), enters the stage. The first steps in the development of TDA were taken in 1992 by the Italian mathematician Patrizio Frosini, co-author of the new study and currently at the University of Bologna. "Topology is one of the purest forms of math", says Bergomi. "And until recently, people thought that Topology would not be applied to anything concrete for a long time. Until TDA became famous in the last few years."

Topology is a sort of extended geometry that, instead of measuring lines and angles in rigid shapes (such as triangles, squares, cones, etc.), seeks to classify highly complex objects according to their shape. For a topologist, for example, a donut and a mug are the same object: one can be deformed into the other by stretching or compression.

Now, the thing is, current neural networks are not good at topology. For instance, they do not recognize rotated objects. To them, the same object will look completely different every time it is rotated. That is precisely why the only solution is to make these networks "memorise" each configuration separately - by the thousands. And it is precisely what the authors were planning to avoid by using TDA.

Think of TDA as being a mathematical tool for finding meaningful internal structure (topological features), in any complex "object" that can be represented as a huge set of numbers, by looking at the data through certain well-chosen "lenses" or filters. The data itself can be about faces, financial transactions or cancer survival rates. For faces in particular, by applying TDA, it becomes possible to teach a neural network to recognize faces without having to present it with each of the different orientations faces might assume in space. The machine will now recognize all faces as being a face, even in different rotated positions.
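The study's actual "lenses" come from TDA, but the value of a rotation-invariant descriptor can be illustrated with a much simpler, hypothetical one: a histogram of pixel intensities, which by construction cannot tell a rotated image apart from the original.

```python
import numpy as np

def intensity_histogram(image, bins=8):
    """A simple rotation-invariant descriptor: the histogram of pixel
    intensities ignores where each pixel sits, so rotating the image
    leaves it unchanged."""
    hist, _ = np.histogram(image, bins=bins, range=(0.0, 1.0))
    return hist

# A toy 4x4 "image" with random values in [0, 1).
rng = np.random.default_rng(0)
img = rng.random((4, 4))

# Rotating by 90 degrees rearranges pixels but keeps the same values...
rotated = np.rot90(img)

# ...so the descriptor is identical for both orientations.
assert np.array_equal(intensity_histogram(img), intensity_histogram(rotated))
```

A network restricted to looking through such a lens never needs to memorise each orientation separately - the price being that the lens must be chosen wisely, which is exactly what TDA formalises.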

It's a 5! No, it's a 7!

In their study, the scientists tested the benefits of combining machine learning and TDA by teaching a neural network to recognise hand-written digits. The results speak for themselves.

As these networks are bad topologists and handwriting can be very ambiguous, two different hand-written digits may prove indistinguishable for current machines - and conversely, two instances of the same hand-written digit may be seen by them as different.

That is why, to be performed by today's vision machines, this task requires presenting the network, which knows nothing about digits in the world, with thousands of images of each of the 10 digits, written with all sorts of slants, calligraphies, etc.

To inject knowledge about digits, the team built a set of a priori features that they considered meaningful (in other words, a set of "lenses" through which the network would "see" the digits), and forced the machine to choose among these lenses to look at the images. The number of images (that is, the time) needed for the TDA-enhanced neural network to learn to distinguish 5's from 7's - however badly written - while maintaining its predictive power, dropped to fewer than 50! "What we mathematically describe in our study is how to enforce certain symmetries, and this provides a strategy to build machine learning agents that are able to learn salient features from a few examples, by taking advantage of the knowledge injected as constraints", says Bergomi.

Does this mean that the inner workings of learning machines which mimic the brain will become more transparent in the future, enabling new insights on the inner workings of the brain itself? In any case, this is one of Bergomi's goals. "The intelligibility of artificial intelligence is necessary for its interaction and integration with biological intelligence", he says. He is currently working, in collaboration with his colleague Pietro Vertechi, also from the Systems Neuroscience Lab at CCU, on developing a new kind of neural network architecture that will allow humans to swiftly inject high-level knowledge into these networks to control and speed up their training.

Credit: 
Champalimaud Centre for the Unknown

Vintage film shows Thwaites Glacier ice shelf melting faster than previously observed

image: Professor Dustin Schroeder (foreground) and art historian Jessica Daniel splice 50-year-old film containing radar measurements of Antarctica into a reel in preparation for digital scanning at the Scott Polar Research Institute in the UK.

Image: 
Courtesy of Dustin Schroeder

Newly digitized vintage film has doubled how far back scientists can peer into the history of underground ice in Antarctica, and revealed that an ice shelf on Thwaites Glacier in West Antarctica is being thawed by a warming ocean more quickly than previously thought. This finding contributes to predictions for sea-level rise that would impact coastal communities around the world.

The researchers made their findings by comparing ice-penetrating radar records of Thwaites Glacier with modern data. The research appeared in Proceedings of the National Academy of Sciences Sept. 2.

"By having this record, we can now see these areas where the ice shelf is getting thinnest and could break through," said lead author Dustin Schroeder, an assistant professor of geophysics at Stanford University's School of Earth, Energy & Environmental Sciences (Stanford Earth) who led efforts to digitize the historical data from airborne surveys conducted in the 1970s. "This is a pretty hard-to-get-to area and we're really lucky that they happened to fly across this ice shelf."

Researchers digitized about 250,000 flight miles of Antarctic radar data originally captured on 35mm optical film between 1971 and 1979 as part of a collaboration between Stanford and the Scott Polar Research Institute (SPRI) at Cambridge University in the U.K. The data has been released to an online public archive through Stanford Libraries, enabling other scientists to compare it with modern radar data in order to understand long-term changes in ice thickness, features within glaciers and baseline conditions over 40 years.

Sea-level predictions

The information provided by historic records will help efforts like the Intergovernmental Panel on Climate Change (IPCC) in its goal of projecting climate and sea-level rise for the next 100 years. By being able to look back 40 to 50 years at subsurface conditions rather than just the 10 to 20 years provided by modern data, scientists can better understand what has happened in the past and make more accurate projections about the future, Schroeder said.

"You can really see the geometry over this long period of time, how these ocean currents have melted the ice shelf - not just in general, but exactly where and how," said Schroeder, who is also a faculty affiliate at the Stanford Woods Institute for the Environment. "When we model ice sheet behavior and sea-level projections into the future, we need to understand the processes at the base of the ice sheet that made the changes we're seeing."

The film was originally recorded in an exploratory survey using ice-penetrating radar, a technique still used today to capture information from the surface through the bottom of the ice sheet. The radar shows mountains, volcanoes and lakes beneath the surface of Antarctica, as well as layers inside the ice sheet that reveal the history of climate and flow.

Newly uncovered features

The researchers identified several features beneath the ice sheet that had previously only been observed in modern data, including ash layers from past volcanic eruptions captured inside the ice and channels where water from beneath the ice sheet is eroding the bottom of ice shelves. They also found that one of these channels had a stable geometry for over 40 years, information that contrasts with their findings about the Thwaites Glacier ice shelf, which thinned by between 10 and 33 percent between 1978 and 2009.

"The fact that we were able to have one ice shelf where we can say, 'Look, it's pretty much stable. And here, there's significant change' - that gives us more confidence in the results about Thwaites," Schroeder said.

The scientists hope their findings demonstrate the value of comparing this historical information to modern data to analyze different aspects of Antarctica at a finer scale. In addition to the radar data, the Stanford Digital Repository includes photographs of the notebooks from the flight operators, an international consortium of American, British and Danish geoscientists.

"It was surprising how good the old data is," Schroeder said. "They were very careful and thoughtful engineers and it's much richer, more modern looking, than you would think."

Credit: 
Stanford's School of Earth, Energy & Environmental Sciences

Men who live alone have problems taking 'blood thinning' drug

Paris, France - 2 Sept 2019: Living alone is associated with difficulties using the "blood thinner" warfarin in men, but not women, according to research presented today at ESC Congress 2019 together with the World Congress of Cardiology.(1)

"'Ask my wife' is a common reply among older men to questions about their medication, disease, and treatment," said Dr Anders N. Bonde of Gentofte University Hospital, Copenhagen, Denmark.

"Our study suggests that when it comes to anticoagulation control, men are more dependent on their partner than women," he added. "Women living alone often have better relationships with children or a broader network of people who could help them manage a demanding medication like warfarin. Furthermore, divorces are often more difficult for men than for women, and the largest percentage of patients with diagnoses related to alcohol abuse (which is known to be important for anticoagulation control) was found in men living alone in our study."

Warfarin is a common anticoagulant treatment worldwide for preventing stroke in atrial fibrillation, the most common heart rhythm disorder. It is a type of vitamin K antagonist, since it reduces blood clotting by blocking the action of vitamin K. Continuous blood monitoring with international normalised ratio (INR) measurements is required for warfarin to be safe and effective, since too little of the drug may allow a blood clot to form and cause a stroke, while too much causes bleeding.

Quality of INR control is usually measured as time in therapeutic range (TTR), meaning the percentage of time with optimal warfarin concentrations in the blood to prevent stroke and avoid bleeding. ESC guidelines advise being in the therapeutic range at least 70% of the time.(2) Warfarin can be challenging for patients, with dose adjustments needed to maintain high TTR coupled with a number of food and drug interactions.
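The press release does not spell out the calculation, but TTR is commonly estimated with the Rosendaal linear-interpolation method. A minimal sketch, assuming the usual therapeutic range of INR 2.0-3.0 for atrial fibrillation (the specific range is not stated in the article), might look like:

```python
def time_in_therapeutic_range(days, inrs, low=2.0, high=3.0):
    """Estimate TTR by linear interpolation between consecutive INR
    measurements (the Rosendaal method): each day between two tests is
    assigned an interpolated INR, and TTR is the percentage of days
    with an INR inside [low, high]."""
    in_range = total = 0
    for (d0, i0), (d1, i1) in zip(zip(days, inrs), zip(days[1:], inrs[1:])):
        span = d1 - d0
        for step in range(span):
            inr = i0 + (i1 - i0) * step / span
            total += 1
            if low <= inr <= high:
                in_range += 1
    return 100.0 * in_range / total

# A patient tested on days 0, 10 and 20 with INRs 1.5, 2.5 and 3.5:
# the interpolated INR drifts into range and back out again, so only
# about half the days are spent in the therapeutic window.
print(time_in_therapeutic_range([0, 10, 20], [1.5, 2.5, 3.5]))  # → 55.0
```

Against the ESC guideline target of at least 70%, this hypothetical patient - like the median man living alone in the study, at 57% - would count as poorly controlled.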

Previous studies on anticoagulation control have mainly identified clinical variables that predict low TTR - for example depression and cancer. But few studies have evaluated socioeconomic factors. This study examined the influence of cohabitation status on TTR in men and women with atrial fibrillation.

A total of 4,772 atrial fibrillation patients with six months of continuous warfarin use and INR monitoring were identified from Danish registers. Patients were divided according to sex and whether they lived alone or with others. The researchers calculated TTR for men living alone, men not living alone, women living alone, and women not living alone separately. Factors known to influence TTR, such as depression, cancer, interacting medication, and age, were measured and adjusted for in a regression model.

The study found that the median TTR in men living alone was 57%. After adjustment for other factors known to be important for TTR, this was still significantly lower - by 3.6% - than in cohabiting men.

Dr Bonde said: "Men living alone had poor anticoagulation control. The impact of living on their own was larger than several factors previously known to affect TTR, such as cancer, use of interacting medication, or heart failure."

Women living alone also had lower TTR than cohabiting women, but after adjustment the difference was very small (0.2%) and non-significant.

"We also found a significant interaction between sex and living alone in our model, meaning that cohabitation status was a strong and important predictor for TTR among men, but not among women," said Dr Bonde.

He concluded: "Men who live on their own may need extra support to use warfarin, such as education, home visits, telephone contacts, or additional follow-up visits. They might also consider using a newer type of drug, a non-vitamin K antagonist oral anticoagulant (NOAC), which is easier to manage and has fewer interactions with food and drugs compared to warfarin."

Credit: 
European Society of Cardiology

The Lancet: Non-physician health workers lead new approach to lowering risk of world's number one cause of death

A substantial reduction in the risk of cardiovascular disease, the world's leading cause of death, can be achieved in a year with a new comprehensive approach, according to a randomised controlled trial of 1,371 adults in two countries published in The Lancet and simultaneously presented at the ESC Congress 2019.

In the HOPE 4 trial, care was led by non-physician health workers who used a computer tablet to support decision-making and worked closely with physicians. Their care of people with high blood pressure was supported by counselling to improve health behaviours, free blood pressure-lowering drugs (a single pill combining two medications), a free statin, and help from family and friends acting as treatment supporters. This comprehensive approach reduced participants' blood pressure and risk of a future heart attack or stroke by more than twice as much as treatment from physicians alone.

Hypertension is the leading cause of cardiovascular disease globally, but it is not adequately controlled in communities worldwide - in both poor and rich countries.

"Governments around the world have agreed on an ambitious goal to reduce cardiovascular mortality by 30% by 2030. But we will only achieve this goal if we can find novel ways to remove all barriers to cardiovascular risk reduction, such as access to care, compliance with taking medications, and adoption of healthy behaviours." says Dr Jon-David Schwalm, McMaster University, Canada, who led the research. [1]

The new trial included 1,371 adults aged 50 or older from 30 urban and rural communities across Colombia and Malaysia. Participants had uncontrolled high blood pressure and a raised risk of cardiovascular disease. A pre-trial appraisal identified barriers to following through with medical treatment, including the time and expense of travelling to attend physician visits, lost wages, and costs of drugs.

For the intervention group, which included 14 communities, non-physician health workers led the screening, detection, treatment and control of cardiovascular risk factors. Health workers were able to respond to participants' needs, for example by visiting them at home, providing counselling to improve lifestyles and delivering medications. Prescription recommendations were checked by local physicians, who agreed with health workers' assessments 93% of the time.

Those in the intervention group also received lifestyle counselling from the health workers using guidance from a tablet-based app, support from family or friends they nominated to help them remember to take medications and to improve their likelihood of sticking to lifestyle changes, a free supply of single-pill antihypertensive medications combining two drugs, and a separate statin also provided free of charge.

Participants in the control group (16 communities) were recommended to see their local health-care provider as usual.

After 12 months, participants treated using the combined strategy reduced their estimated risk of developing cardiovascular disease in the subsequent 10 years by half. They also achieved a reduction in blood pressure of over 11 mm Hg and a reduction in LDL cholesterol of nearly 0.5 mmol/L. The proportion of individuals with their blood pressure under control - considered to be less than 140 mm Hg - increased by 69%. In comparison, the proportion of individuals in the control group with their blood pressure under control increased by only 30%. Previous studies have found that every 10 mm Hg reduction in blood pressure reduces the risk of a major cardiovascular event such as stroke, heart attack, or death by 25%.
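As a back-of-the-envelope sketch (not a calculation from the trial itself), the cited 25%-per-10-mm-Hg relationship can be applied to the observed 11 mm Hg drop, assuming the reduction compounds multiplicatively:

```python
def relative_risk(bp_drop_mmhg, rr_per_10=0.75):
    """Relative risk of a major cardiovascular event remaining after a
    blood-pressure drop, assuming each 10 mm Hg multiplies risk by 0.75
    (the 25% relative reduction cited from previous studies)."""
    return rr_per_10 ** (bp_drop_mmhg / 10.0)

# The intervention group's 11 mm Hg reduction would then translate to:
print(round(relative_risk(11.0), 2))  # → 0.73, i.e. roughly a 27% lower risk
```

This is a simplification - real risk depends on baseline pressure and many other factors - but it gives a feel for why an 11 mm Hg population-level drop matters.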

Participants in the intervention group attended 94% of planned visits, and treatment supporters were present at an average of 74% of them. Supporters were a spouse (48%), offspring (34%), other family (11%) or another individual (7%). At 12 months, 84% of participants were taking two or more antihypertensive medications, compared to 65% in the control group. Statin use was 84% at 12 months, compared to 38% in the control group. Participants assigned to the intervention also reported an increase in physical activity and improvements to their diet after 12 months. However, there were no significant differences in glucose concentrations, HDL cholesterol, smoking cessation rates, or weight between the two groups. There were no safety concerns with the treatment given.

"The unique design of our strategy demonstrates the value of a comprehensive approach which actively involves family and friends as treatment supporters and care being coordinated by trained non-physician health workers guided by a computer program on tablets for diagnosis and counselling," says Professor Patricio Lopez-Jaramillo from the University of Santander in Colombia. [1]

Fadhlina Majid from the Universiti Teknologi MARA in Malaysia adds: "We hope that our results help persuade governments to provide antihypertensive drugs and statins at low or no cost to people with high blood pressure and develop models of care that include trained non-physicians. Even in high-income countries, the detection, treatment, and control of hypertension and other cardiovascular disease risk factors needs substantial improvements." [1]

The authors note several limitations to their study, including the fact that the method of screening could be considered an intervention in itself. Screening involved a combination of door-to-door household visits, use of community outreach centres and local events in public spaces. It therefore led to the detection and treatment of hypertension in individuals who would otherwise have been unlikely to be identified. More people in the control communities may have received treatment than would normally have been the case, reducing the real differences in outcomes between them and the intervention group. The results are therefore likely to be an underestimate of the full effects of the strategy. Blinding was not feasible in the study, but the authors note they took several steps to avoid biases.

Writing in a linked Comment, Professor Tazeen Jafar from Duke-NUS Medical School, Singapore, says: "A 30% reduction in premature cardiovascular mortality by the year 2030, relative to 2015, is targeted by Sustainable Development Goal 3.4. HOPE 4 and similar highly important studies should prompt the scientific and legislative communities to rethink scale-up of large, evidence-based approaches to dramatically reduce the burden of uncontrolled hypertension and lower cardiovascular risk. Such bold strategies cannot be ignored."

Credit: 
The Lancet

How much carbon the land can stomach with more carbon dioxide in the air

image: The Yaluzangbu canyon.

Image: 
YU Liang

About 600 petagrams, or 600 billion tons of carbon (the weight of about 100 billion really big elephants), was emitted as carbon dioxide from 1750-2015 due to fossil fuel burning, cement production and land-use change. About one-third of this was absorbed by land ecosystems.

Plants pull carbon dioxide out of the atmosphere by "eating" it, i.e., converting the carbon dioxide into sugars and starches, aka photosynthesis. Fortunately, plants' appetite for carbon dioxide is pretty good: The more carbon dioxide we have in the air, the faster the plants eat.

When a terrestrial ecosystem absorbs more carbon from human carbon dioxide emissions than it emits, it is called a carbon sink; otherwise, it is a carbon source. Scientists have found that rising carbon dioxide concentration in the air enhances the land carbon sink, a process known as carbon dioxide fertilization. Quantifying carbon dioxide fertilization is critical for understanding and predicting how climate will affect and be affected by the carbon cycle.

Recently, researchers from 28 institutions in nine countries succeeded in quantifying carbon dioxide fertilization for the past five decades, using simulations from 12 terrestrial ecosystem models and observations from seven field carbon dioxide enrichment experiments.

They found that the sensitivity of the northern temperate carbon sink to rising carbon dioxide concentration is linearly related to the site-scale sensitivity across the models. Based on this emergent relationship, with field experiment observations as a constraint, the study estimated that for every 100-ppm increase in carbon dioxide in the air (equivalent to about one flea per one liter of water), the terrestrial carbon sink increases by 0.64 billion tons of carbon per year (equivalent to about 140 million really big elephants) in the temperate Northern Hemisphere, and 3.5 billion tons of carbon per year globally. The team also revealed that carbon dioxide fertilization is primarily responsible for the observed increase in the global terrestrial carbon sink.
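Under the linear sensitivity estimated above, the extra annual sink for a given CO2 rise is a simple proportion. A minimal sketch (the sensitivity figures are taken from the text; assuming the linear relationship holds over the chosen range is a simplification):

```python
# Sensitivities of the terrestrial carbon sink to atmospheric CO2,
# as estimated in the study (billion tons of carbon per year per
# 100-ppm CO2 increase).
GLOBAL_SENSITIVITY = 3.5         # Gt C/yr per 100 ppm, global
TEMPERATE_NH_SENSITIVITY = 0.64  # Gt C/yr per 100 ppm, temperate NH

def extra_sink(ppm_rise, sensitivity=GLOBAL_SENSITIVITY):
    """Extra annual carbon uptake (Gt C/yr) for a given CO2 rise,
    assuming the linear relationship holds over the range."""
    return sensitivity * ppm_rise / 100.0

# A 50-ppm rise, roughly the increase in atmospheric CO2 since the
# late 1980s:
print(extra_sink(50))                            # 1.75 Gt C/yr globally
print(extra_sink(50, TEMPERATE_NH_SENSITIVITY))  # 0.32 Gt C/yr, temperate NH
```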

"This study reduces uncertainty in the understanding of the carbon dioxide fertilization effect on terrestrial carbon sink," said co-author Dr. PIAO Shilong of the College of Urban and Environmental Sciences at Peking University. "The new approach and techniques in this study will be very useful to the scientific community in future research and studies."

"To explain further mechanisms underlying the carbon dioxide fertilization effect, more longer-term field experiments are required, particularly in boreal and tropical ecosystems. Joint effort between experimentalists and modelers is also necessary," said Dr. LIU Yongwen, lead author of the study and a research scientist at the Institute of Tibetan Plateau Research, Chinese Academy of Sciences.

Studies have shown the capacity of terrestrial ecosystems to absorb carbon dioxide is growing. This is good news since the process can slow the accumulation of carbon dioxide in the air and thus the pace of climate change. Credit should go to carbon dioxide fertilization, the extended growing season for vegetation, and reforestation, all of which help pull carbon from the atmosphere. At the same time, however, factors such as fire, heat waves and permafrost thawing - among other increasingly common global warming ills - are changing previous carbon sinks into carbon sources.

Credit: 
Chinese Academy of Sciences Headquarters

Lifestyle, not genetics, explains most premature heart disease

Paris, France - 2 Sept 2019: Physical inactivity, smoking, high blood pressure, diabetes, and high cholesterol play a greater role than genetics in many young patients with heart disease, according to research presented today at ESC Congress 2019 together with the World Congress of Cardiology.(1) The findings show that healthy behaviours should be a top priority for reducing heart disease even in those with a family history of early onset.

"Genetics are an important contributor to premature heart disease but should not be used as an excuse to say it is inevitable," said study author Dr Joao A. Sousa of Funchal Hospital, Portugal.

"In our clinical practice, we often hear young patients with premature heart disease 'seek shelter' and explanations in their genetics/family history," he added. "However, when we look at the data in our study, these young patients were frequently smokers, physically inactive, with high cholesterol levels and high blood pressure - all of which can be changed."

The study enrolled 1,075 patients under 50, of whom 555 had coronary artery disease (known as premature CAD). Specific conditions included stable angina, heart attack, and unstable angina. The average age was 45 and 87% were men. Risk factor levels and genetics in patients were compared to a control group of 520 healthy volunteers (average age 44, and 86% men). Patients and controls were recruited from the Genes in Madeira and Coronary Disease (GENEMACOR) database.

Five modifiable risk factors were assessed: physical inactivity, smoking, high blood pressure, diabetes, and high cholesterol. Nearly three-quarters (73%) of patients had at least three of these risk factors compared to 31% of controls. In both groups, the likelihood of developing CAD increased exponentially with each additional risk factor. The probability of CAD was 3, 7, and 24 times higher with 1, 2, and 3 or more risk factors, respectively.
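A quick check of the figures above (relative probabilities taken from the text; treating "3 or more" as a single step is a simplification): each added risk factor multiplies the probability of premature CAD by a roughly constant amount, which is what an exponential trend looks like.

```python
# Relative probability of premature CAD versus zero modifiable risk
# factors, as reported in the study: 1x, 3x, 7x and 24x for 0, 1, 2
# and 3+ risk factors respectively.
relative_risk = [1, 3, 7, 24]

# Exponential growth means each additional risk factor multiplies the
# probability by a roughly constant factor.
multipliers = [relative_risk[i + 1] / relative_risk[i] for i in range(3)]
print([round(m, 1) for m in multipliers])  # [3.0, 2.3, 3.4]
```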

All participants underwent genome sequencing. These data were used to develop a genetic risk score containing 33 variants thought to contribute to CAD or risk factors such as high blood pressure. The average score was higher in patients than controls. The score was also an independent predictor for premature CAD. However, the contribution of genetics to risk of CAD declined as the number of modifiable factors rose.

Dr Sousa said: "The findings demonstrate that genetics contribute to CAD. However, in patients with two or more modifiable cardiovascular risk factors, genetics play a less decisive role in the development of CAD."

He concluded: "Our study provides strong evidence that people with a family history of premature heart disease should adopt healthy lifestyles, since their poor behaviours may be a greater contributor to heart disease than their genetics. That means quit smoking, exercise regularly, eat a healthy diet, and get blood pressure and cholesterol levels checked."

Credit: 
European Society of Cardiology

Study shows metabolic surgery associated with lower risk of death and heart complications

video: Weight-Loss Surgery Associated with 40% Reduction in Risk of Death and Heart Complications in Patients with Diabetes and Obesity, Study Shows

Image: 
Cleveland Clinic

Monday, Sept. 2, 2019, PARIS: A large Cleveland Clinic study shows that weight-loss surgery performed in patients with type 2 diabetes and obesity is associated with a lower risk of death and major adverse cardiovascular events than usual medical care. These patients also lost more weight, had better diabetes control, and used fewer medications for treatment of their diabetes and cardiovascular disease than those undergoing usual medical care.

The observational study looked at nearly 2,300 patients who underwent metabolic surgery and 11,500 matched patients with similar characteristics who received usual medical care. Patients underwent one of four types of weight-loss surgery (also known as metabolic surgery): gastric bypass, sleeve gastrectomy, adjustable gastric banding, or duodenal switch.

The results were presented as a late-breaking study today at the European Society of Cardiology Congress and simultaneously published in the Journal of the American Medical Association (JAMA).

The primary endpoint of the study was the occurrence of death or one of five major complications associated with obesity and diabetes: coronary artery events, cerebrovascular events, heart failure, atrial fibrillation, and kidney disease. Over an eight-year period, patients undergoing metabolic surgery were 40 percent less likely to experience one of these events than those receiving usual medical care. Patients in the surgical group were 41 percent less likely to die from any cause.

"The striking results that we saw after metabolic surgery may be related to the patients' substantial and sustained weight loss," said Ali Aminian, M.D., a bariatric surgeon at Cleveland Clinic and lead author of the study. "However, there is a growing body of evidence to suggest that there are beneficial metabolic and hormonal changes after these surgical procedures that are independent of weight loss."

Patients who had metabolic surgery had an average of 15 percent greater weight loss and lower blood sugar levels. They used less diabetes medications, including insulin, and less heart medications such as blood pressure and cholesterol therapies compared with the non-surgery group.

"Cardiovascular complications from obesity and diabetes can be devastating. Now that we've seen these remarkable results, a well-designed randomized controlled trial is needed to definitively determine whether metabolic surgery can reduce the incidence of major heart problems in patients with type 2 diabetes and obesity," said Steven Nissen, M.D., Chief Academic Officer of the Heart & Vascular Institute at Cleveland Clinic and the study's senior author.

Nearly 40 percent of Americans have obesity, which is linked to type 2 diabetes, heart disease, and stroke. Adults with diabetes are two to four times more likely to die from heart disease than those without diabetes.

Credit: 
Cleveland Clinic

Lack of government action on NHS staffing undermines ambition to diagnose cancer early

In just one year, around 115,000 cancer patients in England are diagnosed too late to give them the best chance of survival, according to new calculations* from Cancer Research UK released today (Monday).

This means that nearly half of all cancers diagnosed with a known stage in England are diagnosed at stage 3 or 4. And of these, around 67,000 people are diagnosed at stage 4 - the most advanced stage - leaving them with fewer treatment options and less chance of surviving their disease.

There are lots of things that can influence how early or late someone is diagnosed, but workforce shortages are a large contributor. There is a desperate shortage of NHS medical staff trained to carry out tests that diagnose cancer, meaning that efforts by the health system to diagnose and treat cancer more swiftly are being thwarted.

Last year, the Government made an important pledge to improve the number of people diagnosed with early stage cancer - a jump from two in four diagnosed early to three in four by 2028 which could save thousands of lives. Cancer Research UK has calculated that to reach this target, an extra 100,000 patients must be diagnosed early each year by 2028.**

NHS staff are working tirelessly to offer the best care possible, and the NHS is implementing important new initiatives to address late diagnosis and improve staff efficiency. But there just aren't enough of the right staff available on the ground now, and there are no plans to significantly increase the numbers needed to transform the health service.

An earlier diagnosis can be the difference between life and death. If bowel cancer is diagnosed at the earliest stage, more than nine in 10 people will survive, but if it is diagnosed at the latest stage, just one in 10 people will survive their disease for at least five years.

Efforts to diagnose more patients at an early stage mean more people being referred urgently for tests, a vital shift for prompt diagnosis and treatment. But increasing referrals have left diagnostic staff under great pressure because of vacant posts, a lack of funding to train new doctors and growing lists of patients.

At least one in 10 of these posts is empty, so the Government needs to urgently invest in the cancer workforce if they plan to save more lives now, and in the future. Without the staff, the Government will not achieve its own ambition.

Emma Greenwood, Cancer Research UK's director of policy, said: "It's unacceptable that so many people are diagnosed late. Although survival has improved, it's not happening fast enough. More referrals to hospital means we urgently need more staff. The Government's inaction on staff shortages is crippling the NHS, failing cancer patients and the doctors and nurses who are working tirelessly to diagnose and treat them.

"By 2035, one person every minute will be diagnosed with cancer but there's no plan to increase the number of NHS staff to cope with demand now or the growing numbers in the future. Saving lives from cancer needs to be top of the agenda for the new Government and it must commit to investing in vital NHS staff now to ensure no one dies from cancer unnecessarily."***

An underpowered workforce is not the sole reason for late diagnoses. Other factors include symptoms being hard to spot, GPs having too little time to investigate people thoroughly, low uptake of screening programmes or the cancer being advanced when detected.

But right now, staff shortages are affecting every part of the pathway. According to work commissioned by Cancer Research UK, it is estimated that by 2027, the NHS needs:****

An additional 1,700 radiologists - people who report on imaging scans - increasing the total number to nearly 4,800

To nearly triple its number of oncologists - doctors specialising in treating patients with cancer - a jump from 1,155 to 3,000

Nearly 2,000 additional therapeutic radiographers - people who give radiotherapy to cancer patients - increasing the total to almost 4,800

By 2035, more than 500,000 people will be diagnosed with cancer in the UK, compared with nearly 360,000 today. With an ageing population, more tests will need to be carried out to diagnose more cancers and diagnose them earlier.

Dr Giles Maskell, Cancer Research UK's radiology expert, said: "We can feel the bottleneck tightening in the NHS - the pressure is mounting on diagnostic staff. We don't have nearly enough radiologists in the UK right now and far too many patients are waiting too long for scans and results.

"NHS staff are working as hard as they can, but we won't be able to care for the rising number of cancer patients unless the resources are found to train more specialist staff. Extra scanners are welcome, but they will achieve nothing without staff to run them and experts to interpret the scans. It's like buying a fleet of planes with no pilots to fly them."

Credit: 
Cancer Research UK

Childhood cholesterol, blood pressure, weight and smoking predict adult heart disease

Paris, France - 1 Sept 2019: The first reliable evidence of a link between major cardiovascular risk factors in children - serum cholesterol, blood pressure, body mass index (BMI), and smoking - with cardiovascular disease in adults is presented today at ESC Congress 2019 together with the World Congress of Cardiology.(1) The study highlights the need to lay the foundations for heart health early in life.

Study author Professor Terence Dwyer of the University of Oxford, UK, said: "While previous research has found connections between smoking and BMI in childhood and adult cardiovascular disease, there was no data for blood pressure or serum cholesterol. In addition, it has not been possible to assess the contributions of BMI and smoking while taking cholesterol and blood pressure into account."

The study used data from the International Childhood Cardiovascular Cohort (i3C) Consortium. Data was pooled on serum cholesterol, blood pressure, BMI, and smoking from seven child cohorts in the US, Australia, and Finland. Information was collected from 1971 onwards on approximately 40,000 children aged 3 to 19 who were followed up at around age 50 for cardiovascular events leading to hospitalisation (heart attack, stroke, peripheral arterial disease) or all-cause death.

Hazard ratios for cardiovascular events were calculated for each risk factor separately, and for all risk factors simultaneously. The latter analysis estimated how important each risk factor might be when all the others are taken into account.

For cardiovascular events leading to hospitalisation, the associations for risk factors were: (2)

BMI: 10% rise --> 20% higher risk of an event.

Systolic blood pressure: 10% rise --> 40% higher risk of an event.

Serum cholesterol: 10% rise --> 16% higher risk of an event.

Adolescent smoking --> 77% higher risk of an event.

In the simultaneous analysis of risk factors, the associations with future risk were reduced, suggesting a shared contribution to risk by each, although the smoking association remained relatively unchanged. The associations for death were very similar to those found for cardiovascular events.
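Under a proportional-hazards model, independent hazard ratios combine multiplicatively. A hedged sketch using the single-factor figures above (as the simultaneous analysis shows, the true joint associations are smaller, so this product is an upper-bound illustration, not the study's estimate):

```python
# Hazard ratios from the one-factor-at-a-time analyses above:
# a 10% rise in each continuous factor, plus adolescent smoking.
hr_per_10pct_rise = {"BMI": 1.20,
                     "systolic blood pressure": 1.40,
                     "serum cholesterol": 1.16}
hr_adolescent_smoking = 1.77

# If the effects were independent, the combined hazard ratio would be
# the product of the individual ones.  The study's simultaneous
# analysis found smaller joint associations, so this overstates the
# real combined risk.
combined = hr_adolescent_smoking
for hr in hr_per_10pct_rise.values():
    combined *= hr
print(round(combined, 2))  # 3.45
```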

Prof Dwyer said: "The study shows a very strong relationship between standard cardiovascular risk factors in children and the probability of a heart attack or stroke later in life. A meaningful proportion of that risk appears to be laid down in childhood and has a detrimental effect independent of adult risk factor levels. Programmes to prevent heart attacks and strokes should put more emphasis on promoting healthy lifestyles in children."

Credit: 
European Society of Cardiology

DNA methylation-based estimator of telomere length

image: Application of DNAmTL on hTERT-transduced primary human fibroblasts.

Image: 
Steve Horvath

Leukocyte DNAm TL is applicable across the entire age spectrum and is more strongly associated with age than measured leukocyte TL.

Leukocyte DNAm TL outperforms LTL in predicting: i) time-to-death, ii) time-to-coronary heart disease, iii) time-to-congestive heart failure, and iv) association with smoking history.

Dr. Steve Horvath, from the Department of Human Genetics, David Geffen School of Medicine at the University of California Los Angeles, USA; the Centre for Genomic and Experimental Medicine, Institute of Genetics and Molecular Medicine at the University of Edinburgh, UK; and the Department of Twin Research and Genetic Epidemiology at King's College London, UK, said: "Telomeres are repetitive nucleotide sequences at the end of chromosomes that shorten with replication of somatic cells."

Since the number of cell replications in vivo increases with age, telomere length is negatively correlated with the age of proliferating somatic cells.

Although both DNAm age and LTL are associated with chronological age, they exhibit only weak correlations with each other after adjusting for age, suggesting the distinct nature of their underlying mechanisms.
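The "weak correlation after adjusting for age" mentioned above is a partial correlation: correlate the two markers after regressing the age trend out of each. A minimal sketch on simulated data (the numbers below are synthetic, chosen only to mimic two markers that both track age but share no other signal; they are not from the study):

```python
import random
import statistics as st

random.seed(0)
n = 500
age = [random.uniform(20, 80) for _ in range(n)]
# Two age-driven markers with independent noise: DNAm age tracks
# chronological age, while telomere length declines with it.
dnam_age = [a + random.gauss(0, 4) for a in age]
telomere = [8.0 - 0.03 * a + random.gauss(0, 0.5) for a in age]

def corr(x, y):
    """Pearson correlation coefficient."""
    mx, my = st.mean(x), st.mean(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

def residuals(y, z):
    """Residuals of y after simple linear regression on z."""
    mz, my = st.mean(z), st.mean(y)
    beta = (sum((a - mz) * (b - my) for a, b in zip(z, y))
            / sum((a - mz) ** 2 for a in z))
    return [b - (my + beta * (a - mz)) for a, b in zip(z, y)]

def partial_corr(x, y, z):
    """Correlation of x and y after removing the linear effect of z."""
    return corr(residuals(x, z), residuals(y, z))

print(round(corr(dnam_age, telomere), 2))               # strongly negative
print(round(partial_corr(dnam_age, telomere, age), 2))  # near zero
```

The raw correlation is strongly negative only because both markers track age; once age is regressed out, almost nothing is left, mirroring the weak age-adjusted correlations reported for DNAm age and LTL.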

The authors show that DNAm TL correlates negatively with age in different tissues and cell types and outperforms TRF-based LTL in predicting mortality and time-to-heart disease, as well as being associated with smoking history and other age-related conditions.

They also validated the applicability of DNAm TL on a large-scale data set and uncovered associations of age-adjusted DNAm TL with diet and clinical biomarkers.

Monitoring cultured cells with or without telomerase revealed that DNAm TL records cell replication independently of telomere attrition.

The Horvath research team concluded, "Like epigenetic clocks, we expect that DNAm TL will become a useful biomarker in human interventional studies. A proof-of-concept study is provided by our preliminary analysis of omega-3 polyunsaturated fatty acid supplementation."

Credit: 
Impact Journals LLC

Guidelines on management of fast heartbeat published today

Paris, France - 31 Aug 2019: The European Society of Cardiology (ESC) Guidelines on supraventricular tachycardia are published online today in European Heart Journal,(1) and on the ESC website.(2) The document highlights how catheter ablation is revolutionising care for this group of common arrhythmias.

Supraventricular tachycardia (SVT) refers to a heart rate above 100 beats per minute (normal resting heart rate is 60 to 100). It occurs when there is a fault with the electrical system that controls the heart's rhythm. SVTs are frequent arrhythmias, with a prevalence of approximately 0.2% in the general population. Women have twice the risk of developing SVT compared with men, while people aged 65 or older have more than five times the risk of younger people.

SVTs usually start and stop suddenly. They arise in the atria of the heart and the conduction system above the ventricles, and are rarely life-threatening in the acute phase, unlike arrhythmias from the ventricles. However, most SVTs, if left untreated, are lifelong conditions that affect the heart's function, increase the risk of stroke, and affect quality of life. Symptoms include palpitations, fatigue, light-headedness, chest discomfort, shortness of breath, and altered consciousness.

The guidelines provide treatment recommendations for all types of SVTs. Drug therapies for SVT have not fundamentally changed since the previous guidelines were published in 2003.

But Professor Josep Brugada, Chairperson of the guidelines Task Force and professor of medicine, University of Barcelona, Spain, said: "We do have more data on the potential benefits and risks associated with several drugs, and we know how to use them in a safer way. In addition, some new antiarrhythmic drugs are available."

Antiarrhythmic drugs are useful for acute episodes. For long-term treatment these drugs are of limited value due to relatively low efficacy and related side-effects.

The main change in clinical practice over the last 16 years is related to the availability of more efficient and safe invasive methods for eradication of the arrhythmia through catheter ablation. This therapy uses heat or freezing to destroy the heart tissue causing the arrhythmia.

Professor Demosthenes Katritsis, Chairperson of the guidelines Task Force and director of the 3rd Cardiology Department, Hygeia Hospital, Athens, Greece, said: "Catheter ablation techniques and technology have evolved in a way that we can now offer this treatment modality to most of our patients with SVT."

SVT is linked with a higher risk of complications during pregnancy, and specific recommendations are provided for pregnant women. All antiarrhythmic drugs should be avoided, if possible, within the first trimester of pregnancy. However, if necessary, some drugs may be used with caution during that period.

"Pregnant women with persistent arrhythmias that do not respond to drugs, or for whom drug therapy is contraindicated or not desirable, can now be treated with catheter ablation using new techniques that avoid exposing themselves or their baby to harmful levels of radiation," said Prof Katritsis.

What should people do if they experience a fast heartbeat? "Always seek medical help and advice if you have a fast heartbeat," said Prof Brugada. "If SVT is suspected, you should undergo electrophysiology studies with a view to catheter ablation, since several of the underlying conditions may have serious long-term side effects and inadvertently affect your wellbeing. Prevention of recurrences depends on the particular type of SVT, so ask your doctor for advice. Catheter ablation is safe and cures most SVTs."

Credit: 
European Society of Cardiology

How to simulate softness

video: Co-senior author Charles Dhong describes a new UC San Diego study on how to recreate different levels of perceived softness.

Image: 
Liezel Labios/UC San Diego Jacobs School of Engineering

What factors affect how human touch perceives softness, like the feel of pressing your fingertip against a marshmallow, a piece of clay or a rubber ball? By exploring this question in detail, a team of engineers and psychologists at the University of California San Diego discovered clever tricks to design materials that replicate different levels of perceived softness.

The findings provide fundamental insights into designing tactile materials and haptic interfaces that can recreate realistic touch sensations, for applications such as electronic skin, prostheses and medical robotics. Researchers detail their findings in the Aug. 30 issue of Science Advances.

"We provide a formula to recreate a spectrum of softness. In doing so, we are helping close the gap in understanding what it takes to recreate some aspects of touch," said Charles Dhong, who co-led the study as a postdoctoral fellow at UC San Diego and is now an assistant professor in biomedical engineering at the University of Delaware. Dhong worked with Darren Lipomi, a professor of nanoengineering at UC San Diego and the study's co-corresponding author.

Based on the results from their experiments, the researchers created equations that can calculate how soft or hard a material will feel based on material thickness, Young's modulus (a measure of a material's stiffness), and micropatterned areas. The equations can also do the reverse and calculate, for example, how thick or micropatterned a material needs to be to feel a certain level of softness.

"What's interesting about this is that we've found two new ways to tune the perceived softness of an object--micropatterning and changing the thickness," Dhong said. "Young's modulus is what scientists typically turn to in terms of what's soft or hard. It is a factor, but now we show that it's only one part of the equation."

Recreating softness

The researchers began by examining two parameters engineers use to measure a material's perceived softness: indentation depth (how deep a fingertip presses into a material) and contact area between the fingertip and the material. Normally, these parameters both change simultaneously as a fingertip presses into an object. Touch a piece of soft rubber, for example, and the contact area will increase the deeper a fingertip presses in.

Dhong, Lipomi and colleagues were curious how indentation depth and contact area independently affect the perception of softness. To answer this question, they specially engineered materials that decoupled the two parameters and then tested them on human subjects.

The researchers created nine different elastomeric slabs, each with its own unique ratio of indentation depth to contact area. The slabs differed in amount of micropatterning on the surface, thickness and Young's modulus.

Micropatterning is key to the design. It consists of arrays of raised microscopic pillars dotted on the surface of the slabs. These tiny pillars allow a fingertip to press deeper without changing the contact area. This is similar to pressing against the metal pins of a Pinscreen toy, where arrays of pins slide in and out to make a 3D impression.

"By creating these micropatterned surface structures, we produce discontinuous regions of contact where the finger presses in that are much smaller than the shadow it would cast on the surface," Lipomi said.

The team tested the slabs on 15 subjects and instructed them to perform two tasks. In the first task, they presented subjects with multiple pairs of slabs and asked them to identify the softer one in each pair. In the second task, the researchers had subjects rank the nine slabs from softest to hardest.

Overall, the slabs that subjects perceived as softer were thicker, had little to no micropatterning on the surface, and had a low Young's modulus. Meanwhile, slabs that felt harder were thinner, had more micropatterning and a high Young's modulus.

Softness: a basic ingredient of touch

Experiments also led the researchers to an interesting conclusion: the perception of softness is a basic sensation, not a combination of other sensations.

"This means softness is a primary ingredient of the human sense of touch. It's like how we have RGB for color displays," Lipomi said. "If we can find the other 'pixels of touch,' can we combine them to make any tactile image we want? These are the fundamental things we would like to know going forward."

Credit: 
University of California - San Diego