New EU rules could make total diet replacement products unviable from 2022, study warns

From October 2022, the European Union (EU) will impose new nutritional requirements for total diet replacement (TDR) products which could make them unviable to produce and sell, according to new research being presented at The European and International Congress on Obesity (ECOICO 2020), held online this year from 1-4 September.

The study, which examined the feasibility of manufacturing TDR products to these new dietary compositional criteria, reveals substantial challenges, including reduced shelf life, worsened organoleptic properties such as mouthfeel and taste, and higher production costs.

"The mandatory nutrient values are markedly different to those which currently apply", explains Dr Kelly Johnston from King's College London and LighterLife UK Ltd, who led the research. "The sensory properties of the new formulation are likely to cause major compliance issues for consumers, and the diminished shelf life of some products would simply make them unviable to produce and sell."

The Foods for Specific Groups (FSG) Regulation--the nutritional criteria of which were set by the European Food Safety Authority (EFSA)--will apply to formula-based TDR products for weight control, which include both very low calorie and low calorie diet (VLCD and LCD) programmes [1]. These programmes are an established weight loss solution and have been available throughout the EU for more than 30 years.

The first controls on these specially formulated foods were introduced in 1977 under the concept of Foods for Particular Nutritional Uses (PARNUTS). The PARNUTS legislation (2009/39/EC) allowed the food industry to market specific products indicating their suitability for particular nutritional uses. However, in the absence of specific rules, differing interpretations of product types across the EU led to similar foods being marketed as PARNUTS foods in some countries and as foods for normal consumption in others. The new FSG Regulation was therefore introduced to ensure that uniform rules apply across the EU.

The new FSG Regulation stipulates that for TDR, the daily minimum protein content should increase from 50g to 75g per day (a 50% increase), while the values of the essential fatty acids (linoleic and alpha-linolenic) should more than double, from 3g to 11g and from 0.5g to 1.4g per day, respectively. There is also a new daily requirement for an extra 400mg of choline (an essential nutrient vital for many body functions, including nerve signalling and liver and muscle function).

In this study, researchers formulated 10 different VLCD TDR products (ie, bars, shakes, textured meals) to both the current formulation guidance and the new EU regulation, as well as a range of other "in-between" macronutrient versions. A team of trained sensory panellists working for an independent manufacturer were asked to compare organoleptic properties (ie, mouth feel, texture, appearance, aroma, and taste) and to examine the extent of quality changes between the different formulations, weekly, during storage over 10 weeks. The researchers also assessed production costs.

For the textured meal, the panel found that the increase in essential fatty acids negatively affected the organoleptic properties of the meal product--creating a "fatty mouthfeel" and an "unpalatable/unpleasant taste" [2]. This was not noted in either the current formulation or the 'in-between' test products.

In accelerated shelf-life tests, the panel reported unsatisfactory organoleptic profiles in 100% of new formulation samples compared with 20% in both the current formulation and 'in-between' samples. Additionally, rancidity was noted in new formulation samples from 4 weeks, whereas no rancidity was detected across the 10-week period for any other samples (see poster table 2).

Whilst the new recommended minimum protein content of 75g per day for TDR products was not reported to have any major adverse effects on sensory properties, the researchers estimate that the associated increase in raw material costs could increase the price to consumers by as much as 38%.

"We would like to see alternative compositional criteria based on the currently available evidence", says Dr Johnston, "specifically because of the lack of evidence for the new additional requirements of so much fat."

According to co-author Dr Anthony Leeds from Frederiksberg Hospital, Denmark, "At a time when total diet replacement has been shown to have a potential role in delivering type 2 diabetes remission with improved quality of life and reduced drug costs across Europe, these legislative changes will challenge the commercial viability of existing total diet replacement, and deprive consumers of the freedom to choose this safe, cost-effective option."

Credit: 
European Association for the Study of Obesity

CU Anschutz researchers shed light on split-second decision making

AURORA, Colo. (Aug. 31, 2020) - A little understood region of the cerebellum plays a critical role in making split-second 'go/no-go' decisions, according to a new study from researchers at the University of Colorado Anschutz Medical Campus.

"We wanted to know how this kind of decision making takes place," said the study's senior author Diego Restrepo, PhD, professor of cell and developmental biology at the University of Colorado School of Medicine. "How, for example, do you decide to swing or not swing at a fast ball in baseball?"

The study was published online today in Nature Communications.

Employing mice rather than ball players, Restrepo and his team used a multiphoton microscope that peered into the brains of the free-moving rodents as they decided whether or not to lick a water solution.

The researchers focused specifically on the molecular layer interneurons (MLIs) in the cerebellum. The mice were given a sugar water reward if they licked a water spout in the presence of a specific, pleasant odor and they avoided a timeout when they refrained from licking in the presence of unscented mineral oil.

At first, the MLI responses did not differ between odors. But with learning, the reward odor prompted a large increase in MLI calcium responses. When the stimuli were reversed, the MLI switched responses to the odors.

When the scientists intervened with chemogenetic agents to inhibit MLI activity, the mice floundered and became less effective in making 'go/no-go' decisions.

"Our data indicate that the MLIs have a role in learning valence," Restrepo said. "That is, it helps determine whether something is good for me or not."

The findings further illuminate the function of the cerebellum, long associated primarily with movement. But it also plays a key role in cognition and emotion and is associated with non-motor conditions such as autism spectrum disorders.

"A lot of learning goes on inside the cerebellum," Restrepo said. "The cerebellum may also be the place where quick choice arises."

This study shows that it also coordinates both motion and decision making: when to go and when not to go.

"We found an entire subset of brain cells that change after learning," Restrepo said. "It sheds further light on how the cerebellum functions and the complex web of connections that go into quick decision making."

Credit: 
University of Colorado Anschutz Medical Campus

Study: Portable, point-of-care COVID-19 test could bypass the lab

image: Illinois researchers developed a microfluidic cartridge for a 30-minute COVID-19 test. The cartridges are 3D-printed and could be manufactured quickly.

Image: 
Photo courtesy of Bill King, University of Illinois

CHAMPAIGN, Ill. — As COVID-19 continues to spread, bottlenecks in supplies and laboratory personnel have led to long waiting times for results in some areas. In a new study, University of Illinois, Urbana-Champaign researchers have demonstrated a prototype of a rapid COVID-19 molecular test and a simple-to-use, portable instrument for reading the results with a smartphone in 30 minutes, which could enable point-of-care diagnosis without needing to send samples to a lab.

“If such a device and test were available, we could test for COVID-19 at public events, auditoriums, large gatherings and potentially even at home for self-testing. The results could be sent back to the appropriate public health system for coordination,” said Rashid Bashir, a professor of bioengineering and the dean of the Grainger College of Engineering at Illinois. Bashir co-led the study with electrical and computer engineering professor Brian Cunningham and mechanical science and engineering professor Bill King.

Typical tests for SARS-CoV-2, the virus that causes COVID-19, take a sample from a patient with a long nasopharyngeal swab, put that swab into a substance called viral transport media, and send it to a lab for a multistep process of extracting, isolating and multiplying the telltale RNA inside the virus. This RNA multiplication process, called RT-PCR, requires several temperature fluctuation cycles, specialized equipment and trained personnel, Cunningham said.

As reported in the Proceedings of the National Academy of Sciences, the Illinois team used a simpler process, called LAMP (loop-mediated isothermal amplification), to analyze the viral transport media; it bypasses the RNA extraction and purification steps.

“LAMP only needs one temperature – 65 °C – so it is much easier to control,” said graduate student Anurup Ganguli, the first author of the study. “Also, LAMP works more robustly than PCR, especially when there are contaminants in the test sample. We can just briefly heat the sample, break open the virus and detect the genetic sequence that specifically identifies SARS-CoV-2.”

The researchers compared the LAMP assay with PCR, first using synthetic nasal fluid spiked with the virus and then with clinical samples. They found the results were in agreement with PCR results, and they documented the sensitivity and specificity of the LAMP test.
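Sensitivity and specificity are the standard measures for benchmarking a new assay against a reference test (here, RT-PCR). As a minimal sketch, with illustrative counts that are not the study's data:

```python
# Sketch: computing assay sensitivity and specificity from counts of
# agreements/disagreements with a reference test (e.g. RT-PCR).
# All numbers below are illustrative, not from the PNAS study.

def sensitivity(tp, fn):
    """True-positive rate: fraction of reference-positive samples detected."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """True-negative rate: fraction of reference-negative samples cleared."""
    return tn / (tn + fp)

# Example with made-up counts:
print(sensitivity(tp=18, fn=2))   # 0.9
print(specificity(tn=19, fp=1))   # 0.95
```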

Then, the researchers incorporated the LAMP assay onto a small 3D-printed microfluidic cartridge that has two input slots for syringes: one for the sample-containing viral transport media, one for the LAMP chemicals. Once the two are injected, they react within the cartridge.

“We use modern, high speed additive manufacturing to make these cartridges.  The entire thing can be quickly scaled up to hundreds of thousands of tests,” King said. “Production scale-up is typically the biggest obstacle for commercial applications of microfluidic cartridges, and we can overcome that obstacle using this new approach. Modern additive manufacturing is elastic and scalable, and it can be ramped up very quickly compared with legacy manufacturing technologies.”

The team is working with Fast Radius Inc., a Chicago-based technology company King co-founded, to manufacture the microfluidic cartridges.

The cartridge can be inserted into a hand-held portable instrument with a heating chamber, which heats the cartridge to 65 degrees Celsius for the duration of the reaction, and a smartphone cradle for reading the results. In approximately 30 minutes, a positive result will emit fluorescent light.

“The reader illuminates the liquid compartments with light from blue LEDs, while the phone’s rear-facing camera records a movie of the green fluorescent light being generated,” Cunningham said. 

See a video of the cartridge, hand-held device and process on YouTube.

The researchers demonstrated the portable device with additional clinical samples, and found the results matched those of the standard PCR lab procedure.

The researchers are exploring whether the assay would work with saliva samples to eliminate the need for nasopharyngeal swabs, and collecting more patient data as they consider next steps for regulatory approvals, Bashir said.

Credit: 
University of Illinois at Urbana-Champaign, News Bureau

Scientists show how brain flexibility emerges in infants

image: This image is of MRI data, showing neural flexibility over time.

Image: 
Biomedical Research Imaging Center at UNC-Chapel Hill.

Cognitive flexibility refers to the ability to readily switch between mental processes in response to external stimuli and different task demands. For example, while the brain is engaged in one task, an external stimulus may appear that requires switching attention to it; this capacity to move from one mental task to another is cognitive flexibility. Such flexibility can predict reading ability, academic success, resilience to stress, creativity, and lower risk of various neurological and psychiatric disorders. To shed light on the development of this critical cognitive process during early infancy, researchers at the UNC Biomedical Research Imaging Center (BRIC) at the UNC School of Medicine conducted a brain imaging study in infants to examine the emergence of neural flexibility, which refers to the frequency with which a brain region changes its role (its allegiance from one functional network to another). Neural flexibility is thought to underlie cognitive flexibility.
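One common way to quantify that definition is to count how often a region's network assignment changes across time windows. A minimal sketch, assuming we already have a sequence of community labels for one region (the study's actual multilayer-network pipeline is more involved than this):

```python
# Minimal sketch: neural flexibility as the fraction of time windows at
# which a brain region switches its functional-network (community)
# assignment. Labels and network names here are illustrative.

def neural_flexibility(labels):
    """Fraction of consecutive window pairs where the label changes."""
    if len(labels) < 2:
        return 0.0
    switches = sum(1 for a, b in zip(labels, labels[1:]) if a != b)
    return switches / (len(labels) - 1)

# Example: a region that switches allegiance twice across five windows
print(neural_flexibility(["DMN", "DMN", "attention", "attention", "DMN"]))  # 0.5
```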

Publishing their work in the Proceedings of the National Academy of Sciences (PNAS), the researchers show that brain regions with high neural flexibility appear consistent with the core brain regions that support cognitive flexibility processing in adults, whereas brain regions governing basic brain functions, such as motor skills, exhibit lower neural flexibility in adults, demonstrating the emergence of functionally flexible brains during early infancy.

For this study, the authors used magnetic resonance imaging to examine brain activity up to seven times in 52 typically developing infants under the age of two during natural sleep. The researchers, led by Weili Lin, PhD, director of BRIC, the Dixie Lee Boney Soo Distinguished Professor of Neurological Medicine, and Vice Chair of Basic Research in the UNC Department of Radiology, found that neural flexibility increased with age across the whole brain, and specifically in brain regions that control movement, potentially enabling infants to learn new motor skills. Neural flexibility also increased with age in brain regions involved in higher-level cognitive processes, such as attention, memory, and response inhibition, indicating continuing development of these functional networks as babies become toddlers.

The age-related increase in neural flexibility was highest in brain regions already implicated in cognitive flexibility in adults, suggesting that cognitive flexibility may start to develop during the first two years of life.

"Neural flexibility in these brain regions may reflect early developmental processes that support the later emergence of cognitive flexibility," Lin said. "What we've imaged, in essence, is the brain's flexibility setting the stage for later maturity of higher cognitive brain functions."

Additional analysis of brain regions with especially high neural flexibility revealed the presence of relatively weak and unstable connections from these regions to other parts of the brain, potentially showing how these regions can rapidly switch their allegiances between different functional networks. By contrast, neural flexibility in brain regions involved in visual functions remained relatively low throughout the first two years of life, suggesting that these regions had already matured.

Lower levels of neural flexibility (i.e., greater established brain maturity) of visual brain regions at three and 18 months of age were associated with better performance on cognitive and behavioral assessments at the age of five or six years.

These findings provide insights into the development of higher-level brain functions, and could be used to predict cognitive outcomes later in life. The developed approach of assessing neural flexibility non-invasively could also provide a new means to assess subjects with neurodevelopmental disorders.

Credit: 
University of North Carolina Health Care

Brain protein linked to seizures, abnormal social behaviors

image: Confocal image of a mouse brain tissue shows the astrocytes (red) and neurons (green).

Image: 
Ethell lab, UC Riverside.

A team led by a biomedical scientist at the University of California, Riverside has found a new mechanism responsible for the abnormal development of neuronal connections in the mouse brain that leads to seizures and abnormal social behaviors.

The researchers focused on the area of the brain called the hippocampus, which plays an important role in learning and social interactions, and on synapses, the specialized contacts between neurons.

Each neuron in the brain receives numerous excitatory and inhibitory synaptic inputs. The balance between excitation and inhibition in neuronal circuits, known as the E/I balance, is thought to be essential for circuit function and stability and for information processing in the central nervous system; its disruption can contribute to many neurological disorders, including epilepsy, autism spectrum disorder, and schizophrenia.

The researchers also focused on a protein called ephrin-B1, which spans the membrane surrounding the cell and plays a role in maintaining the nervous system. The goal of their study was to determine if the deletion or over-production of ephrin-B1 in astrocytes -- glial cells in the brain that regulate synaptic connections between neurons -- affects synapse formation and maturation in the developing hippocampus and alters the E/I balance, leading to behavioral deficits.

"We found the changes in the E/I balance are regulated by astrocytes in the developing brain through the ephrin protein," said Iryna Ethell, a professor of biomedical sciences in the UCR School of Medicine who led the mouse study. "Further, astrocytic ephrin-B1 is linked to the development of inhibitory networks in the hippocampus during a critical developmental period, which is a new and unexpected discovery. Specifically, we show the loss of astrocytic ephrin-B1 tilts the E/I balance in favor of excitation by reducing inhibition, which then hyperactivates the neuronal circuits. This hyperactivity manifests as reduced sociability in the mice and suggests they can serve as a new model to study autism spectrum disorder."

The findings, published in the Journal of Neuroscience, can further scientists' understanding of the mechanisms that lead to neurodevelopmental disorders, allowing researchers to discover novel interventions for treating these disorders by targeting astrocytes during a specific developmental period.

Ethell explained that astrocyte dysfunctions are also linked to synapse pathologies associated with neurodevelopmental disorders and neurodegenerative diseases such as Alzheimer's disease where early dysfunction in synaptic connections can also lead to neuron loss.

"How exactly astrocytes use the ephrin protein to control the development of neuronal networks remains to be explored in future studies," she said. "Our findings open a new inquiry into future clinical applications as impaired inhibition has been linked to several developmental disorders, including autism and epilepsy."

The report is the first to establish a link between astrocytes and the development of the E/I balance in the mouse hippocampus during early postnatal development.

"We provide new evidence that different ephrin-B1 levels in astrocytes influence both excitatory and inhibitory synapses during development and contribute to the formation of neuronal networks in the brain and associated behaviors," Ethell said.

She explained that synapses are building blocks of neural networks and function as fundamental information-processing units in the brain. Excitatory synapses are cell-cell connections that facilitate neuronal activity, she said, whereas inhibitory connections negatively regulate brain activity to coordinate brain responses, their timing, and specificity.

"Hyperactivity of neuronal networks resulting from the loss or impaired function of inhibitory synapses can lead to neural dysfunctions and seizures," she added. "Like a car without brakes, the brain without inhibitory neurons cannot function properly and becomes overactive, resulting in loss of body control."

Ethell acknowledged further investigation is needed to determine how exactly ephrin signaling in astrocytes alters inhibitory synapses, and specifically how astrocytes may contribute to these mechanisms.

"Given the widespread and growing research interest in the astrocyte-mediated mechanisms that regulate E/I balance in neurodevelopmental disorders, our findings establish a foundation for future studies of astrocytes in clinically relevant conditions," she said.

Credit: 
University of California - Riverside

Aspirated consonants may promote the spread of COVID-19, RUDN University linguist says

image: According to a linguist from RUDN University, the number of COVID-19 cases in a country might be related to the existence of aspirated consonants in its main language of communication. This data can help create more accurate models to describe the spread of COVID-19.

Image: 
RUDN

According to a linguist from RUDN University, the number of COVID-19 cases in a country might be related to the existence of aspirated consonants in its main language of communication. This data can help create more accurate models to describe the spread of COVID-19. The results of the study were published in the Medical Hypotheses journal.

COVID-19 is mainly spread through droplets of liquid coming from the respiratory passages of an infected person. The disease spreads faster through coughing or sneezing, as in these cases the speed of the droplets increases. However, a regular conversation can also lead to infection, and the amount of produced droplets depends on the sounds pronounced by an infected speaker. The exact sounds that add the most to the spread of COVID-19 and other viruses haven't been identified yet. A RUDN linguist suggested that they might include aspirated consonants, i.e. the sounds that are accompanied by exhalation.

The issue of a correlation between the spread of infections and the language of the infected people was first brought up in 2003, after the outbreak of SARS-CoV-1 in South China, when over 8,000 cases were registered in 26 countries. The USA accounted for 70 of them, but Japan did not have a single case, despite the fact that far more Japanese than US tourists were visiting China at the time (3.2 million vs 2.3 million). Some scientists suggested a linguistic explanation: the staff of Chinese stores spoke to US tourists in English, and to Japanese guests in Japanese. Georgios Georgiou from RUDN University found the same correlation for COVID-19.

Unlike in Japanese, in English the consonants [p], [t], and [k] are aspirated. When they are pronounced, numerous small droplets are released from the speaker's respiratory passages into the air, and such droplets may contain viral particles. Japanese has fewer aspirated consonants, so Japanese speakers produce fewer airborne droplets during a conversation. To test whether speakers of languages rich in aspirated consonants are more prone to COVID-19 infection, the RUDN linguist used the official data of 26 countries with 1,000+ registered COVID-19 cases as of March 23, 2020.

"Our study did not include Switzerland because it has several official languages. We also excluded countries with too many or too few cases per 1 mln residents (e.g. Italy and Japan, respectively) to avoid extreme values," said Georgios Georgiou, PhD, a postdoctoral researcher at the Department of General and Russian Linguistics, Philological Faculty of RUDN University.

The languages in the study were divided into two groups by the presence or absence of aspirated consonants. According to the scientists, although the groups did not show statistically significant differences, the countries that predominantly spoke the languages of the first group had more cases of COVID-19: 255 per 1 million residents (as opposed to 206 cases in the second group).
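The per-capita figures quoted above are a simple normalisation of case counts to population size. As a sketch, with illustrative numbers that are not the study's data:

```python
# Sketch of the per-capita normalisation used in such comparisons:
# confirmed cases scaled to cases per 1 million residents.
# The inputs below are made up for illustration.

def cases_per_million(cases, population):
    """Confirmed case count normalised to a population of 1 million."""
    return cases / population * 1_000_000

# Example: 12,750 cases in a country of 50 million residents
print(round(cases_per_million(cases=12_750, population=50_000_000)))  # 255
```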

"Although no clear relationship was observed, we do not rule out that the spread of COVID-19 can be partially due to the presence of aspirated consonants in a country's main language of communication. This can be a valuable insight for epidemiologists," added Georgios Georgiou.

However, the authors recognize that such a relationship is difficult to establish because of the study's limitations. For example, the different social distancing measures taken in each country, and the fact that the exact linguistic background of the speakers in each country is unknown, render the paper's initial assumption a hypothesis only. It is, however, a well-founded hypothesis that could be tested in a future large-scale study.

Credit: 
RUDN University

Mental health and the COVID-19 pandemic: A call for action

The Covid-19 outbreak has caused immense problems at a macro socioeconomic level throughout the world. It has affected not only Covid-19 patients but also the caregivers, physicians, paramedics and all others engaged in the fight against the pandemic.

The current pandemic is feared to affect the physical and mental health and well-being of people the world over. Physicians, nurses and others dealing directly with Covid-19 patients are likely to experience trauma and burnout, and the impact of quarantine and lockdown on mental health must also be accounted for. It is therefore essential to include mental health in national public health responses to the COVID-19 pandemic, to assist all those in need.

To view the letter please visit: https://benthamopen.com/ABSTRACT/TOPHJ-13-411

Credit: 
Bentham Science Publishers

Scoring system for the diagnosis of COVID-19

image: This scoring system for Covid-19 is simple, could be calculated in a few minutes, and incorporates the main possible data/findings of any patient.

Image: 
Dr. Mohamed Farouk Allam, Bentham Open

The spread of the Covid-19 pandemic has been unprecedented in the speed and severity with which it has affected most countries in the world. Given the magnitude and urgency of the response required, Dr. Allam proposes a scoring system to classify suspected patients and determine who requires follow-up, home isolation, quarantine or further investigation. A scoring system is essential because teams involved in finding and treating patients have had great difficulty collecting nasopharyngeal swab specimens from all suspected patients, and the costs of RT-PCR and CT scans are very high if doctors were to test entire populations.

The proposed scoring system, which can serve as a viable diagnostic tool for suspected patients, takes into account the Epidemiological Evidence of Exposure, Clinical Symptoms and Signs, and Investigations (if available). This simple scoring system for Covid-19 can be calculated in mere minutes from a patient's data.
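The letter itself defines the point values and cut-offs; the sketch below only illustrates the general shape of such a system, with hypothetical weights and thresholds that are not Dr. Allam's:

```python
# Hypothetical sketch of a three-component diagnostic score. The point
# values and triage thresholds below are placeholders, NOT the values
# proposed in Dr. Allam's letter.

def covid_suspicion_score(exposure_pts, symptom_pts, investigation_pts=0):
    """Sum of the three components; investigations default to 0 because
    lab/imaging results may be unavailable at the point of care."""
    return exposure_pts + symptom_pts + investigation_pts

def triage(total, low=4, high=8):
    """Map a total score to a suggested action (thresholds illustrative)."""
    if total >= high:
        return "further investigation (e.g. RT-PCR / CT)"
    if total >= low:
        return "home isolation and follow-up"
    return "routine follow-up"

score = covid_suspicion_score(exposure_pts=3, symptom_pts=4, investigation_pts=2)
print(score, "->", triage(score))  # 9 -> further investigation (e.g. RT-PCR / CT)
```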

To find out the details for the proposed Scoring System for Covid-19, please view the letter here: https://benthamopen.com/ABSTRACT/TOPHJ-13-413

Credit: 
Bentham Science Publishers

Scientists develop low-temperature resisting aqueous zinc-based batteries

image: Schematic illustration of the ZBBs with hybrid electrolytes at low temperatures and possible mechanism of how Zn2+-EG solvation interaction impacts the chemistry of the hybrid electrolyte

Image: 
CHANG Nana

Aqueous zinc-based batteries (ZBBs) are widely used for portable and grid-scale applications due to their high safety, low cost and high energy density.

However, inhomogeneous zinc deposition on the anode during charging and the formation of zinc dendrites decrease the cycling stability of ZBBs. Moreover, traditional aqueous electrolytes cannot work at low temperature because their ionic conductivity drops sharply, limiting the applicable temperature range of aqueous ZBBs.

Recently a research group led by Prof. LI Xianfeng from the Dalian Institute of Chemical Physics (DICP) of the Chinese Academy of Sciences (CAS) developed a low-temperature resisting, cost-effective, safe and eco-friendly hybrid electrolyte for aqueous ZBBs.

This work was published in Energy & Environmental Science.

The developed electrolyte, consisting of water (H2O), ethylene glycol (EG) and zinc sulfate (ZnSO4), exhibited high zinc-ion conductivity at low temperature.

"We demonstrated the unique solvation interaction of Zn2+ with EG through experiments together with theoretical calculation," said Prof. LI.

This interaction could not only enhance the hydrogen bonding between EG and H2O, providing the hybrid electrolyte with lower freezing point, but also weaken the solvation interaction of Zn2+ with H2O, achieving highly reversible Zn/Zn2+ chemistry and uniform zinc deposition.

Both the Zn-ion hybrid supercapacitors (ZHSCs) and Zn-ion batteries (ZIBs) with the hybrid electrolytes showed high energy densities, high power densities and long cycle life at -20 °C. This series of hybrid electrolytes with tunable EG-to-H2O ratios provides a good balance between performance and cost, enabling promising applications in various regions.

This work offers enlightenment for designing electrolytes for low-temperature energy storage devices. It was supported by the Natural Science Foundation of China and CAS Engineering Laboratory for Electrochemical Energy Storage.

Credit: 
Chinese Academy of Sciences Headquarters

A soft-hearted approach to healing

Tsukuba, Japan - Cardiovascular disease is the leading cause of death in humans. Worldwide, as the population ages, the burden of treating heart failure is increasing; opportunities for heart transplantation cannot keep pace. As adult cardiomyocytes (heart muscle cells) are terminally differentiated and do not proliferate, regeneration may be the answer. Support cells like fibroblasts can be directly reprogrammed in vitro, but these induced cardiomyocytes (iCMs) are less mature than those in vivo. Now, researchers at the University of Tsukuba have identified the roles of matrix stiffness and mechanotransduction in cardiac reprogramming, showing that matrices with softness comparable to native myocardium enhance the efficiency of this transformation by about 15%.

Mammalian myocardium is essentially incapable of regeneration following injury. Instead, a non-contractile fibrous scar forms to maintain structural integrity; compensatory hypertrophy is often inadequate to prevent eventual cardiac failure. Though cell transplantation holds potential, regeneration of damaged myocardium by direct reprogramming of cardiac fibroblasts, present in abundance, is a viable alternate strategy.

The researchers sought to investigate the signaling pathways for cardiac reprogramming as well as the underlying mechanotransductive processes (whereby cells convert physical stresses into electrochemical signals). They first prepared Matrigel-coated polystyrene dishes and Matrigel-based hydrogels to replicate extracellular matrices (ECM) with elasticities ranging from that of brain tissue to bone. These substrates were plated with transgenic mouse fibroblasts, which were later transduced with four cardiac transcription factors, Gata4, Mef2c, Tbx5 and Hand2 (GHMT), to generate iCMs.

"After four weeks, we found that soft substrates showed significantly more spontaneously beating iCMs, maximally on those matching native myocardium," says senior author, Professor Masaki Ieda. "Additionally, using innovative high-speed video microscopy and motion vector analysis, we demonstrated increased iCM contraction/relaxation velocities on those substrates, thus illustrating their functional maturation."

Further immunocytochemistry, western blot analysis, and fluorescence-activated cell sorting analysis helped elucidate the underlying mechanisms and signaling pathways. The researchers showed that soft ECM promotes cardiac reprogramming by inhibiting two related transcriptional co-activators, YAP and TAZ, thus suppressing fibroblast signatures. The upstream mechanotransduction pathway was also elucidated; suppression of YAP/TAZ was mediated by inhibition of integrins (transmembrane receptors that facilitate and signal ECM adhesion), Rho/ROCK (a kinase that modulates cell shape and movement), and actomyosin (a contractile protein-complex).

Professor Ieda explains the implications of their results: "Following a heart attack, the healing myocardium stiffens due to fibrosis. Understanding how matrix softness and mechanobiology affect cardiac reprogramming could inform clinical research. Direct reprogramming of cardiac fibroblasts may allow replacement of non-contractile scar by functional muscle in patients recovering from myocardial infarction."

Credit: 
University of Tsukuba

Can people with heart disease exercise safely?

Sophia Antipolis, France - 29 Aug 2020: The first recommendations on sports and physical activity in all types of heart disease are launched today by the European Society of Cardiology (ESC). The document is published online in European Heart Journal and on the ESC website.

"With rising levels of obesity and sedentary lifestyles, promoting physical activity is more crucial now than ever before," said Professor Antonio Pelliccia, Chairperson of the guidelines Task Force and chief of cardiology, Institute of Sports Medicine and Science, Rome, Italy. "Regular exercise not only prevents heart disease, but also reduces premature death in people with established heart disease."

"The chance of exercise triggering a cardiac arrest or heart attack is extremely low," said Professor Sanjay Sharma, Chairperson of the guidelines Task Force and professor of sports cardiology and inherited cardiac diseases, St. George's, University of London, UK. "People who are completely inactive and those with advanced heart disease should consult their doctor before taking up sports."

The document covers leisure exercise and competitive sports for people with heart disease and conditions which raise the risk of heart disease, such as obesity and diabetes. Advice is also given on exercise during pregnancy and in special settings such as at high altitude, in the deep sea, in polluted areas, and at extreme temperatures. The document states that traffic fumes are unlikely to lessen the benefits of physical activity to heart health.

In common with healthy adults of all ages, people with heart disease should exercise on most days, totalling at least 150 minutes per week of moderate intensity exercise. Moderate intensity means increasing your heart rate and breathing rate but still being able to hold a conversation.

For people who are obese or have high blood pressure or diabetes, the guidelines recommend strength-building exercise (for example, lifting light weights) at least three times a week plus moderate or vigorous aerobic exercise, such as cycling, running, or swimming.

Coronary artery disease is the most common type of heart disease and is caused by build-up of fatty deposits on the inner walls of the arteries. If the arteries become completely blocked this can cause a heart attack. Most people with coronary artery disease can play competitive or amateur sports.

"People with long-standing coronary artery disease who wish to take up exercise for the first time should see their doctor first," said Professor Pelliccia. "The aim is to tailor the intensity of activity according to the individual risk of causing an acute event such as a heart attack."

Regular, moderate physical activity is recommended to prevent the most common heart rhythm disorder - called atrial fibrillation. People with atrial fibrillation who are taking anticoagulants to prevent stroke should avoid contact sports due to the risks of bleeding.

People with pacemakers should not be discouraged from playing sports (except collision sports) because of the device. However, they need to tailor their choice according to the underlying disease.

Professor Pelliccia noted that anyone experiencing chest pain for more than 15 minutes should call an ambulance. He added: "If you find that exercise brings on palpitations or unusual shortness of breath or chest discomfort, scale back your activity and make an appointment to see your health professional."

Professor Sharma said: "Physical activity is good for everyone with heart disease and even small amounts are beneficial. We hope these guidelines will help patients and their health professionals choose the best and most enjoyable activities for them."

Credit: 
European Society of Cardiology

How to treat the most common heart attacks

Sophia Antipolis, France - 29 Aug 2020: One in five patients die within a year after the most common type of heart attack. European Society of Cardiology (ESC) treatment guidelines for non-ST-segment elevation acute coronary syndrome are published online today in European Heart Journal and on the ESC website.

Chest pain is the most common symptom, along with pain radiating to one or both arms, the neck, or jaw. Anyone experiencing these symptoms should call an ambulance immediately. Complications include potentially deadly heart rhythm disorders (arrhythmias), which are another reason to seek urgent medical help.

Treatment is aimed at the underlying cause. The most common cause is fatty deposits (atherosclerosis) that become surrounded by a blood clot, narrowing the arteries supplying blood to the heart. In these cases, patients should receive blood thinners and stents to restore blood flow. For the first time, the guidelines recommend imaging to identify other causes, such as a tear in a blood vessel leading to the heart.

Regarding diagnosis, there is no distinguishing change on the electrocardiogram (ECG), which may be normal. The key step is measuring a chemical in the blood called troponin. When blood flow to the heart is decreased or blocked, heart cells die, and troponin levels rise. If levels are normal, the measurement should be repeated one hour later to rule out the diagnosis. If elevated, hospital admission is recommended to further evaluate the severity of the disease and decide the treatment strategy.
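The repeat-measurement logic described above can be sketched as a small decision function. The function name and the single `upper_limit_ng_l` cut-off below are illustrative placeholders; the actual ESC rule-out algorithm uses assay-specific thresholds that are not reproduced here.

```python
def triage_troponin(t0_ng_l, t1h_ng_l, upper_limit_ng_l):
    """Sketch of the troponin-based triage described in the text.

    t0_ng_l:    troponin level at presentation
    t1h_ng_l:   troponin level one hour later (only used if t0 is normal)
    upper_limit_ng_l: illustrative cut-off, not an assay-specific ESC threshold
    """
    if t0_ng_l > upper_limit_ng_l:
        # Elevated troponin: admit to evaluate severity and decide treatment.
        return "admit"
    # Normal first result: repeat the measurement one hour later to rule out.
    if t1h_ng_l > upper_limit_ng_l:
        return "admit"
    return "rule out"
```

For example, a patient whose first measurement is already elevated is admitted without waiting for the repeat test, while two normal values one hour apart rule the diagnosis out.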

Given that the main cause is related to atherosclerosis, there is a high risk of recurrence, which can also be deadly. Patients should be prescribed blood thinners and lipid lowering therapies. "Equally important is a healthy lifestyle including smoking cessation, exercise, and a diet emphasising vegetables, fruits and whole grains while limiting saturated fat and alcohol," said Professor Jean-Philippe Collet, Chairperson of the guidelines Task Force and professor of cardiology, Sorbonne University, Paris, France.

Behavioural change and adherence to medication are best achieved when patients are supported by a multidisciplinary team including cardiologists, general practitioners, nurses, dietitians, physiotherapists, psychologists, and pharmacists.

The likelihood of triggering another heart attack during sexual activity is low for most patients, and regular exercise decreases this risk. Healthcare providers should ask patients about sexual activity and offer advice and counselling.

Annual influenza vaccination is recommended - especially for patients aged 65 and over - to prevent further heart attacks and increase longevity.

"Women should receive equal access to care, a prompt diagnosis, and treatments at the same rate and intensity as men," said Professor Holger Thiele, Chairperson of the guidelines Task Force and medical director, Department of Internal Medicine/Cardiology, Heart Centre Leipzig, Germany.

Credit: 
European Society of Cardiology

Cardiac biomarker shows stronger associations with kidney disease progression than BP

image: Visual Abstract: Strong associations for CXCL12, urine NGAL, and cardiac markers may even exceed that of systolic BP ≥140 mmHg, a well-established risk factor for CKD progression.

Image: 
© NKF

The primary goal of this study by Amanda H. Anderson et al. was to identify independent risk factors of CKD progression among participants with and without diabetes in a prospective CKD cohort study (N=3,379). Among those with diabetes, CKD progression rates approximately doubled with higher levels of the inflammatory chemokine CXCL12, the cardiac marker NT-proBNP, and the kidney injury marker urine NGAL. Among those without diabetes, rates increased over 1.5-fold with higher levels of high-sensitivity troponin T, NT-proBNP, and urine NGAL. The strength of these associations exceeded that of systolic blood pressure ≥140 mmHg, a well-established risk factor for kidney disease progression. These findings provide insights into potential mechanisms of CKD progression and will guide future research in defining subgroups at highest risk for CKD progression.

Credit: 
National Kidney Foundation

Cholesterol drug combinations could cut health risk for European patients

New findings from a large European study of patients in 18 countries, including the UK, show that while many patients are able to reduce their risk through taking statins, those at the highest risk of cardiovascular events may benefit from combinations of lipid-lowering therapies.

According to the authors, the study highlights a gap between current clinical guidelines and clinical practice for cholesterol management across Europe. They explain that even among patients who are already receiving optimal doses of statins, greater use of other, non-statin cholesterol-lowering drugs could help to further reduce cholesterol levels and potentially improve health outcomes for those most at risk.

The findings, which will be presented at the virtual meeting of the European Society of Cardiology 2020, are published in the journal European Journal of Preventive Cardiology.

Professor Kausik Ray, from Imperial's School of Public Health, who led the DA VINCI study, said: "In order to tackle the burden of cardiovascular disease, a global approach is needed. After diet and lifestyle, cholesterol lowering with medications is a key approach to lowering risk of heart disease and strokes. Based on trial data we have compelling evidence that lower cholesterol levels benefit those at highest risk particularly.

"Though statins are first line treatment, it is clear from our contemporary study that statins alone even when optimally used will not help the majority of patients achieve European Society of Cardiology cholesterol goals. Only one in five very-high risk patients achieve 2019 recommended goals and to improve this will require use of combination therapy of more than one drug. Currently less than 10% of very-high risk patients in Europe receive some form of combination therapy, 9% with ezetimibe and 1% with PCSK9 inhibitors."

High levels of low-density lipoprotein (LDL) cholesterol, or so-called 'bad' cholesterol, in the blood are a known risk factor for cardiovascular disease. While diet and lifestyle are important factors in reducing LDL cholesterol, many patients are at increased risk - such as those with diabetes, inherited conditions or who have previously had heart attack or stroke - and are prescribed cholesterol-lowering drugs, like statins, to reduce their cholesterol.

But a number of other classes of cholesterol-lowering drugs are available, which act on different elements of the body's cholesterol-metabolism. These treatments, such as ezetimibe, bempedoic acid, or PCSK9 inhibitors, can be used in combination with statins to further reduce LDL-cholesterol levels.

In the DA VINCI study, a consortium of researchers led by the Imperial Clinical Trials Unit at Imperial College London looked at patients across Europe who were prescribed lipid-lowering therapies.

In total, 5888 patients enrolled across 18 countries provided information at doctors' appointments or during hospital visits for the management of cardiovascular conditions. Information included lifestyle factors, previous cardiovascular events (such as heart attack or stroke), as well as measures of their current LDL cholesterol levels and any current lipid-lowering medications.

Current guidelines from the European Society of Cardiology (ESC)/European Atherosclerosis Society (EAS) recommend statins as first-line treatment for lowering LDL cholesterol. The guidance also recommends goals based on risk groupings: for very-high-risk patients, a 50% reduction in LDL-C levels together with achieving LDL-C below 1.4 mmol/L, in order to reduce the risk of additional cardiovascular events.
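The two-part goal for very-high-risk patients described above (at least a 50% reduction from baseline and an absolute level below 1.4 mmol/L) can be expressed as a minimal check; the function name and units are illustrative:

```python
def ldl_goal_met(baseline_mmol_l, current_mmol_l):
    """Sketch of the 2019 ESC/EAS very-high-risk LDL-C goal described above:
    at least a 50% reduction from baseline AND LDL-C below 1.4 mmol/L.
    Both conditions must hold; either alone is insufficient."""
    halved = current_mmol_l <= 0.5 * baseline_mmol_l
    below_target = current_mmol_l < 1.4
    return halved and below_target
```

Note that a patient starting at 2.4 mmol/L who reaches 1.3 mmol/L is below the absolute target but has not achieved the 50% reduction, so the goal is not met; this is why, as the study notes, baseline (untreated) lipid levels matter for assessing attainment.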

In the DA VINCI study the team reviewed how lipid-lowering therapies were used in primary and secondary care and the attainment of cholesterol reduction goals set out by the guidelines.

Analysis revealed that 84% of patients received statins as their primary lipid-lowering therapy only, with high-intensity statins used in approximately one-quarter (28%) of patients. Just 9% of patients were prescribed ezetimibe with moderate-intensity statins and just 1% of patients used PCSK9 inhibitors in combination with statins and/or ezetimibe.

They found that overall, less than half of patients were achieving the most recent cholesterol-lowering goals set out by guidelines. Among patients receiving high-intensity statins, 2019 LDL-C goals were achieved in 22% of patients with established cardiovascular disease. However, among the patients receiving statins with a PCSK9 inhibitor about two thirds attained the new lower ESC recommended cholesterol goals.

According to the authors, the findings highlight the potential for combinations of lipid lowering drugs to help close the gap and reduce the risk for millions of patients across Europe. They explain that reducing LDL cholesterol levels in very-high risk patients (from the observed levels of above 2mmol/L to below 1.4 mmol/L) could offer an 11% relative reduction in cardiovascular events and 5% relative reduction in mortality.

The authors add that untreated lipid levels were not available, so they were unable to quantify to what extent the ≥50% LDL-C reduction from baseline was achieved, and instead used high-intensity statin use as a proxy. They add that physician choice of lipid-lowering therapy, pre-treatment LDL cholesterol levels, and local prescribing restrictions could have influenced the observations.

Professor Ray added: "Over the last 15 years we have seen improvements in guideline implementation and control of cardiovascular risk factors. These were based on better first line treatment such as statins. Now, with lowering of cholesterol goals, our data suggest this will not be enough and we need to think about cholesterol in the same way as we look at blood pressure where often combinations of treatments are needed to optimise targets."

Credit: 
Imperial College London

Fidelity of El Niño simulation matters for predicting future climate

image: Future increase of El Niño and La Niña intensity leads to enhanced warming in the eastern tropical Pacific (left). Future decrease of El Niño and La Niña intensity leads to less warming in the eastern tropical Pacific (right).

Image: 
Data from NOAA.

A new study led by University of Hawai'i at Mānoa researchers, published in the journal Nature Communications this week, revealed that correctly simulating ocean current variations hundreds of feet below the ocean surface - the so-called Pacific Equatorial Undercurrent - during El Niño events is key in reducing the uncertainty of predictions of future warming in the eastern tropical Pacific.

Trade winds and the temperatures in the tropical Pacific Ocean experience large changes from year to year due to the El Niño-Southern Oscillation (ENSO), affecting weather patterns across the globe. For instance, if the tropical Pacific is warmer and trade winds are weaker than usual - an El Niño event - flooding typically occurs in California, and monsoon failures in India and East Asia are detrimental to local rice production. In contrast, during a La Niña the global weather patterns reverse, with cooler temperatures and stronger trade winds in the tropical Pacific. These natural climate swings affect ecosystems, fisheries, agriculture, and many other aspects of human society.

Computer models used to project future climate simulate both the long-term global warming driven by increasing greenhouse gas emissions and the short-term, year-to-year natural climate variations associated with El Niño and La Niña.

"There is, however, some model discrepancy on how much the tropical Pacific will warm," said Malte Stuecker, co-author and assistant professor in the Department of Oceanography and International Pacific Research Center at UH Mānoa. "The largest differences are seen in the eastern part of the tropical Pacific, a region that is home to sensitive ecosystems such as the Galapagos Islands. How much the eastern tropical Pacific warms in the future will not only affect fish and wildlife locally but also future weather patterns in other parts of the world."

Researchers have been working for decades to reduce the persistent model uncertainties in tropical Pacific warming projections.

Many climate models simulate El Niño and La Niña events of similar intensity. In nature, however, the warming associated with El Niño events tends to be stronger than the cooling associated with La Niña. In other words, while in most models El Niño and La Niña are symmetric, they are asymmetric in nature.
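The symmetry-versus-asymmetry distinction above is commonly quantified as the skewness of an ENSO index: zero for equally strong warm and cold events, positive when warm (El Niño) extremes dominate. The sketch below uses synthetic data, not real observations, to show how amplifying warm events produces positive skewness:

```python
import random

random.seed(0)
# Illustrative only: synthetic "ENSO index" anomalies, not real observations.
noise = [random.gauss(0.0, 1.0) for _ in range(5000)]
symmetric = noise                              # El Niño and La Niña equally strong
asymmetric = [n + 0.3 * n * n for n in noise]  # warm events amplified, as in nature

def skewness(x):
    """Third standardized moment: positive values mean warm (El Niño)
    extremes are stronger than cold (La Niña) extremes."""
    m = sum(x) / len(x)
    var = sum((v - m) ** 2 for v in x) / len(x)
    return sum((v - m) ** 3 for v in x) / len(x) / var ** 1.5

print(round(skewness(symmetric), 2))   # near zero: symmetric ENSO
print(round(skewness(asymmetric), 2))  # clearly positive: El Niño-dominant
```

A model whose simulated ENSO index has near-zero skewness is, in this sense, "too symmetric" compared with the observed record.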

In this new study, the scientists analyzed observational data and numerous climate model simulations and found that when the models simulate the subsurface ocean current variations more accurately, the simulated asymmetry between El Niño and La Niña increases--becoming more like what is seen in nature.

"Identifying the models that simulate these processes associated with El Niño and La Niña correctly in the current climate can help us reduce the uncertainty of future climate projections," said corresponding lead author Michiya Hayashi, a research associate at the National Institute for Environmental Studies, Japan, and a former postdoctoral researcher at UH Mānoa supported by the Japan Society for the Promotion of Science (JSPS) Overseas Research Fellowships. "Only one-third of all climate models can reproduce the strength of the subsurface current and associated ocean temperature variations realistically."

"Remarkably, in these models we see a very close relationship between the change of future El Niño and La Niña intensity and the projected tropical warming pattern due to greenhouse warming," noted Stuecker.

That is, the models that simulate a future increase of El Niño and La Niña intensity also show an enhanced warming trend in the eastern tropical Pacific due to greenhouse warming. In contrast, the models that simulate a future decrease of El Niño and La Niña intensity show less greenhouse gas-induced warming in the eastern part of the basin. The presence of this relationship indicates that those models are capturing a mechanism known to impact climate, signifying that they are more reliable. The relationship disappears entirely in the two-thirds of climate models that cannot simulate the subsurface ocean current variations correctly.

"Correctly simulating El Niño and La Niña is crucial for projecting climate change in the tropics and beyond. More research needs to be conducted to reduce the biases in the interactions between wind and ocean so that climate models can generate El Niño - La Niña asymmetry realistically," added Fei-Fei Jin, co-author and professor in the Department of Atmospheric Sciences at UH Mānoa.

"The high uncertainty in the intensity change of El Niño and La Niña in response to greenhouse warming is another remaining issue," said Stuecker. "A better understanding of Earth's natural climate swings such as El Niño and La Niña will result in reducing uncertainty in future climate change in the tropics and beyond."

Credit: 
University of Hawaii at Manoa