
Aim to exceed weekly recommended physical activity level to offset health harms of prolonged sitting

New research shows that increasing physical activity can counter the early death risk linked to long periods of sedentary time

The health harms associated with prolonged sitting can be offset by exceeding weekly recommended physical activity levels, says the World Health Organization (WHO) in new global guidelines on physical activity and sedentary behaviour [1], published in a special dedicated issue of the British Journal of Sports Medicine.

But all physical activity counts and is good for long term health, say the new guidelines.

It's the first time that a recommendation of this kind has been made. It reflects a large and growing body of evidence linking extensive sedentary time to serious ill health and a heightened risk of early death.

New data also released today, and published in the same special issue [2], show that adults who clock up long hours of sedentary time every day can counter these risks by increasing the amount of physical activity they do.

The new research, which tracked more than 44,000 people from four countries who wore activity trackers, reveals that a high daily tally of sedentary time (defined in this study as 10 or more hours) is linked to a significantly heightened risk of death, particularly among people who are physically inactive.

But 30 to 40 daily minutes of moderate to vigorous intensity physical activity substantially weakens this risk, bringing it down to levels associated with very low amounts of sedentary time, indicate the findings, which broadly confirm the recommendations set out in the 2020 World Health Organization Global Guidelines on Physical Activity and Sedentary Behaviour.

There's not enough evidence to recommend specific maximum thresholds for sedentary behaviour, say the guidelines. But everyone, irrespective of their age or abilities, should try to limit their daily sedentary time and replace it with physical activity of any intensity.

All physical activity counts. This could be anything from climbing the stairs instead of taking the lift, a walk around the block, a spot of gardening, or some household chores, to going for a run or bike ride, a high intensity interval training work-out, or team sport.

It all adds up to the weekly tally of 150-300 minutes of moderate intensity, or 75-150 minutes of vigorous intensity, physical activity that the WHO guidance recommends. But any amount of physical activity is better for health than none, it emphasises.

And those unable to meet these recommendations should start small and gradually build up the frequency, intensity, and duration of their physical activity over time, it says.
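The "equivalent combination" arithmetic behind these targets can be sketched in a few lines. The helper below is illustrative, not part of the WHO guidance; it assumes the conventional equivalence of one vigorous minute to roughly two moderate minutes, which is why the 75-150 vigorous range mirrors the 150-300 moderate range:

```python
def meets_who_guideline(moderate_min: float, vigorous_min: float) -> bool:
    """Check weekly aerobic activity against the WHO lower bound.

    Assumes 1 vigorous minute ~= 2 moderate minutes, so the combined
    total is expressed in moderate-equivalent minutes and compared
    with the 150-minute lower threshold.
    """
    moderate_equivalent = moderate_min + 2 * vigorous_min
    return moderate_equivalent >= 150

# 100 moderate + 30 vigorous = 160 moderate-equivalent minutes
print(meets_who_guideline(100, 30))  # True
print(meets_who_guideline(60, 20))   # 100 equivalent minutes: False
```

Under this sketch, someone doing an hour of brisk walking plus two 20-minute runs a week would clear the lower threshold.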

Boosting physical activity levels doesn't just benefit mental and physical health, and help to stave off the risk of an early death, it's also likely to benefit the global economy, through lower rates of presenteeism, higher productivity, and lower rates of working age sickness and death, indicates another study in this special issue.[3]

Doing at least 150 minutes of moderate intensity physical activity every week, which is the lower end of the range recommended in the new WHO guidelines, would increase global gross domestic product (GDP) by between 0.15%-0.24% a year between now and 2050, estimate the researchers.

That's worth US$314-446 billion a year, and US$6.0-8.6 trillion cumulatively over the 30 years, at 2019 prices.

The new guidelines, which aim to drive national policy and practice around the globe, involved more than 40 scientists from six continents. They provide a consensus on the latest science on the health impacts of physical activity and sedentary behaviour from early childhood through to older age, and update the WHO global recommendations on physical activity for health, published in 2010.

They highlight the importance of regularly undertaking both aerobic and muscle strengthening activities, and for the first time, make specific recommendations for important, but often neglected groups, including those who live with long term conditions or disabilities, pregnant women and new mothers.

Key recommendations for adults, including those living with long term conditions or disabilities at any age:

Aim to do 150-300 mins of moderate intensity, or 75-150 mins of vigorous-intensity physical activity, or some equivalent combination of moderate intensity and vigorous intensity aerobic physical activity every week.

Undertake muscle-strengthening activity (such as weights, core conditioning) at moderate or greater intensity on 2 or more days of the week.

Reduce sedentary behaviour and aim to exceed these weekly recommendations to offset health harms of prolonged sitting.

Older adults (65+) should do physical activity that emphasises functional balance and strength training at moderate or greater intensity on 3 or more days of the week, to enhance functional capacity and prevent falls.

Women should do regular physical activity throughout pregnancy and after the birth, to include various aerobic and muscle-strengthening activities. Gentle stretching may also be beneficial.

Light intensity physical activity doesn't cause a substantial increase in heart rate or breathing and includes activities such as strolling.

Moderate intensity physical activity increases heart rate and induces a degree of breathlessness where it's still possible to talk. Examples include brisk walking, dancing, or raking up leaves.

Vigorous intensity physical activity substantially increases heart rate and breathing rate. Examples include cycling, running/jogging, swimming, carrying heavy objects, walking up the stairs, digging the garden, playing tennis.

"Although the new guidelines reflect the best available science, there are still some gaps in our knowledge. We are still not clear, for example, where exactly the bar for 'too much sitting' is. But this is a fast paced field of research, and we will hopefully have answers in a few years' time," comments special issue co-editor Professor Emmanuel Stamatakis, of the University of Sydney.

He adds: "These guidelines are very timely, given that we are in the middle of a global pandemic, which has confined people indoors for long periods and encouraged an increase in sedentary behaviour.

"But people can still protect their health and offset the harmful effects of physical inactivity. As these guidelines emphasise, all physical activity counts and any amount of it is better than none.

"There are plenty of indoor options that don't need a lot of space or equipment, such as climbing the stairs, active play with children or pets, dancing, or online yoga or Pilates classes."

Co-editor of the special issue, and development lead on the guidelines, Professor Fiona Bull of the WHO, adds: "The most recent global estimates show that one in four (27.5%) adults and more than three-quarters (81%) of teenagers don't meet the recommendations for aerobic exercise, as outlined in the 2010 Global Recommendations. So there's an urgent need for governments to prioritise and invest in national initiatives and health and community services that promote physical activity.

"The publication of these new WHO global guidelines can support implementation of policy recommendations set out in the WHO's Global Action Plan on Physical Activity 2018-2030 to achieve its ambitious target of 15% improvement by 2030."

Credit: 
BMJ Group

Central trafficking compartment in neurons malfunctions in majority of Alzheimer's patients

NEW YORK, NY (Nov. 25, 2020)--Decades before the first symptoms of Alzheimer's appear, the brain's neurons start secreting tau proteins, one of the first changes known to occur in the course of the disease.

High levels of secreted forms of tau--which can be detected in spinal fluid and, as recently reported, in blood--are known to be the most reliable predictor of who will eventually develop Alzheimer's disease.

But a critical question about tau has remained unanswered: Why are neurons secreting tau in Alzheimer's disease?

"Since tau secretion is one of the earliest events in Alzheimer's, figuring out why that happens can tell us about the underlying mechanisms of the disease, which is critical for developing therapies. If tau is the smoke, in other words, what is the fire?" says Scott Small, MD, PhD, the Boris and Rose Katz Professor of Neurology at Columbia University Vagelos College of Physicians and Surgeons and director of the Alzheimer's Disease Research Center at Columbia University.

The Neuron's 'Grand Central Station' Is Commonly Defective in Alzheimer's

A new study from Small's laboratory found that, in many patients, tau secretion arises from tiny malfunctioning compartments inside the brain's neurons, suggesting that these malfunctioning compartments are commonly involved in the development of Alzheimer's disease.

These tiny compartments, called endosomes, function as a 'grand central station' and traffic proteins throughout a cell. The new study provides evidence that endosomal trafficking is disrupted in about 70% of the patients it examined, including those only displaying the first signs of Alzheimer's.

The results provide the first direct evidence that endosomal traffic disruption--which Small, Richard Mayeux, MD, chair of neurology at Columbia, and others had previously identified as one of Alzheimer's root causes--is commonly defective among patients with the disease.

"There was no question that endosomal dysfunction is a component of Alzheimer's, but just how often it's involved had been unknown," says Sabrina Simoes, PhD, assistant professor of neurological sciences at Columbia University Vagelos College of Physicians and Surgeons, who led the study.

"Our study suggests that if drugs that restore endosomal dysfunction can slow Alzheimer's, those drugs could help a large proportion of patients."

Moreover, the study's results and new biomarker tools can be used to investigate which predisposing genes or comorbid diseases, such as obesity and diabetes, disrupt endosomal trafficking and raise the risk of developing Alzheimer's.

What the Study Did

The researchers were looking for biomarkers of endosomal trafficking dysfunction to determine its presence in Alzheimer's patients. Using mice with the same endosomal trafficking defect, the researchers searched the animals' spinal fluid for proteins that differed from those in normal animals. Three proteins stood out: two were cleaved proteins known to be secreted by endosomes, called n-APLP1 and n-CHL1. The third was tau.

The study then examined the spinal fluid of people. First, investigators from Janssen Pharmaceuticals, collaborators in the study, developed new biomarkers of n-APLP1 and n-CHL1. Armed with accurate new biomarkers of these two proteins and with established biomarkers of tau (those that are in current use to diagnose patients), the investigators examined their relationship in healthy people. They found a remarkably tight relationship among the three proteins, suggesting that tau is normally secreted from the endosomal pathway.

The researchers then looked for the proteins in Alzheimer's patients and found that all three proteins are abnormally elevated in spinal fluid in approximately 70% of patients, even in those in the early "prodromal" stage of the disease.

Can Alzheimer's Be Slowed by Restoring Endosome Trafficking?

The study's identification of three new biomarkers of endosomal trafficking disruption can be used to accelerate Alzheimer's drug discovery and clinical trials by identifying patients with malfunctioning endosomes and measuring a drug candidate's ability to restore endosomal operations.

Researchers in Small's lab are working to find ways to improve endosomal trafficking and slow Alzheimer's disease. Several years ago, they identified compounds that do just that in neurons in a dish, and are now trying to develop these compounds as therapeutics. They are also developing gene therapies that can correct endosomal trafficking disruption in the brain.

The researchers are testing these therapeutics to see if they can improve the endosome's trafficking function in animal models of Alzheimer's disease.

More Information

The study appears in a paper titled, "Tau and other proteins found in Alzheimer's Disease spinal fluid are linked to retromer-mediated endosomal traffic," published online Nov. 25 in Science Translational Medicine.

Credit: 
Columbia University Irving Medical Center

Water-to-land transition in early tetrapods

image: The aerial scene depicts two Late Devonian early tetrapods – Ichthyostega and Acanthostega – coming out of the water to move on land. Footprints trail behind the animals to show a sense of movement.

Image: 
Original artwork created by scientific illustrator Davide Bonadonna.

The water-to-land transition is one of the most important and inspiring major transitions in vertebrate evolution. And the question of how and when tetrapods transitioned from water to land has long been a source of wonder and scientific debate.

Early ideas posited that drying pools of water stranded fish on land and that being out of water provided the selective pressure to evolve more limb-like appendages to walk back to water. In the 1990s newly discovered specimens suggested that the first tetrapods retained many aquatic features, like gills and a tail fin, and that limbs may have evolved in the water before tetrapods adapted to life on land. There is, however, still uncertainty about when the water-to-land transition took place and how terrestrial early tetrapods really were.

A paper published November 25 in Nature addresses these questions using high-resolution fossil data and shows that although these early tetrapods were still tied to water and had aquatic features, they also had adaptations that indicate some ability to move on land, though they may not have been very good at it, at least by today's standards.

Lead author Blake Dickson, PhD '20 in the Department of Organismic and Evolutionary Biology at Harvard University, and senior author Stephanie Pierce, Thomas D. Cabot Associate Professor in the Department of Organismic and Evolutionary Biology and curator of vertebrate paleontology in the Museum of Comparative Zoology at Harvard University, examined 40 three-dimensional models of fossil humeri (upper arm bone) from extinct animals that bridge the water-to-land transition.

"Because the fossil record of the transition to land in tetrapods is so poor we went to a source of fossils that could better represent the entirety of the transition all the way from being a completely aquatic fish to a fully terrestrial tetrapod," said Dickson.

Two thirds of the fossils came from the historical collections housed at Harvard's Museum of Comparative Zoology, which are sourced from all over the world. To fill in the missing gaps, Pierce reached out to colleagues with key specimens from Canada, Scotland, and Australia. Of importance to the study were new fossils recently discovered by co-authors Dr. Tim Smithson and Professor Jennifer Clack, University of Cambridge, UK, as part of the TW:eed project, an initiative designed to understand the early evolution of land-going tetrapods.

The researchers chose the humerus bone because it is not only abundant and well preserved in the fossil record, but it is also present in all sarcopterygians - a group of animals which includes coelacanth fish, lungfish, and all tetrapods, including all of their fossil representatives. "We expected the humerus would carry a strong functional signal as the animals transitioned from being a fully functional fish to being fully terrestrial tetrapods, and that we could use that to predict when tetrapods started to move on land," said Pierce. "We found that terrestrial ability appears to coincide with the origin of limbs, which is really exciting."

The humerus anchors the front leg onto the body, hosts many muscles, and must resist a lot of stress during limb-based motion. Because of this, it holds a great deal of critical functional information related to an animal's movement and ecology. Researchers have suggested that evolutionary changes in the shape of the humerus bone, from short and squat in fish to more elongate and featured in tetrapods, had important functional implications related to the transition to land locomotion. This idea has rarely been investigated from a quantitative perspective - that is, until now.

When Dickson was a second-year graduate student, he became fascinated with applying the theory of quantitative trait modeling to understanding functional evolution, a technique pioneered in a 2016 study led by a team of paleontologists and co-authored by Pierce. Central to quantitative trait modeling is paleontologist George Gaylord Simpson's 1944 concept of the adaptive landscape, a rugged three-dimensional surface with peaks and valleys, like a mountain range. On this landscape, increasing height represents better functional performance and adaptive fitness, and over time it is expected that natural selection will drive populations uphill towards an adaptive peak.

Dickson and Pierce thought they could use this approach to model the tetrapod transition from water to land. They hypothesized that as the humerus changed shape, the adaptive landscape would change too. For instance, fish would have an adaptive peak where functional performance was maximized for swimming and terrestrial tetrapods would have an adaptive peak where functional performance was maximized for walking on land. "We could then use these landscapes to see if the humerus shape of earlier tetrapods was better adapted for performing in water or on land" said Pierce.

"We started to think about what functional traits would be important to glean from the humerus," said Dickson. "Which wasn't an easy task as fish fins are very different from tetrapod limbs." In the end, they narrowed their focus on six traits that could be reliably measured on all of the fossils including simple measurements like the relative length of the bone as a proxy for stride length and more sophisticated analyses that simulated mechanical stress under different weight bearing scenarios to estimate humerus strength.

"If you have an equal representation of all the functional traits you can map out how the performance changes as you go from one adaptive peak to another," Dickson explained. Using computational optimization the team was able to reveal the exact combination of functional traits that maximized performance for aquatic fish, terrestrial tetrapods, and the earliest tetrapods. Their results showed that the earliest tetrapods had a unique combination of functional traits, but did not conform to their own adaptive peak.

"What we found was that the humeri of the earliest tetrapods clustered at the base of the terrestrial landscape," said Pierce. "indicating increasing performance for moving on land. But these animals had only evolved a limited set of functional traits for effective terrestrial walking."

The researchers suggest that the ability to move on land may have been limited due to selection on other traits, like feeding in water, that tied early tetrapods to their ancestral aquatic habitat. Once tetrapods broke free of this constraint, the humerus was free to evolve morphologies and functions that enhanced limb-based locomotion and the eventual invasion of terrestrial ecosystems.

"Our study provides the first quantitative, high-resolution insight into the evolution of terrestrial locomotion across the water-land transition," said Dickson. "It also provides a prediction of when and how [the transition] happened and what functions were important in the transition, at least in the humerus."

"Moving forward, we are interested in extending our research to other parts of the tetrapod skeleton," Pierce said. "For instance, it has been suggested that the forelimbs became terrestrially capable before the hindlimbs and our novel methodology can be used to help test that hypothesis."

Dickson recently started as a Postdoctoral Researcher in the Animal Locomotion lab at Duke University, but continues to collaborate with Pierce and her lab members on further studies involving the use of these methods on other parts of the skeleton and fossil record.

Credit: 
Harvard University, Department of Organismic and Evolutionary Biology

Early trial hints CAR T cells may combat solid tumors in children with neuroblastoma

A phase 1 trial involving 12 children with relapsed neuroblastoma - a hard-to-treat pediatric cancer - shows that anticancer CAR T cells displayed signs of efficacy against these tumors while avoiding damage to nerve tissue. The results suggest that CAR T cells should be further evaluated as treatments for neuroblastoma and other solid tumors, which have typically been difficult to target with T cell therapies. CAR T cells are immune cells from patients that have been engineered to hunt down cancer cells, and have shown great promise against blood cancers such as leukemia. However, it has been much more difficult to use CAR T cells against solid tumors, which are harder for the cells to penetrate. Here, Karin Straathof and colleagues performed a phase 1 trial to study the safety and performance of CAR T cells against neuroblastoma. They created CAR T cells that target the GD2 protein, which is common on neuroblastoma tumors, and administered the cells to 12 pediatric patients who had failed to respond to standard chemotherapies. None of the children showed measurable drops in the size of their tumors after one month, according to standard X-ray criteria, and some experienced immune-related side effects. However, three patients who also received two conditioning drugs showed some signs of antitumor immunity, and did not display any signs of damage to healthy nerve tissue bearing GD2. Straathof et al. note that further modifications to the CAR T cells will be necessary to boost their persistence within tumors, given the transient nature of the observed antitumor responses.

Credit: 
American Association for the Advancement of Science (AAAS)

Electromagnetic imaging reveals freshwater cache off Hawai'ian coast

Pointing toward a much-needed future reservoir of freshwater for the island of Hawai'i in the face of climate-driven drought, electromagnetic images of the island have revealed multilayered basalt, ash and soil formations that serve as a previously unknown conduit to move freshwater offshore to the submarine flank of the island. According to the study by Eric Attias and colleagues, these freshwater-saturated layers may extend as much as 4 kilometers offshore. Attias et al. decided to look more deeply into the transport of freshwater through the island's complex geology after noting a 40% discrepancy in the predicted volume of fresh groundwater. The researchers looked for this missing freshwater using a technique called marine controlled-source electromagnetic (CSEM) imaging, which can distinguish seawater from freshwater through measurements of the electrical resistivity of rock formations. Their images mapped out a pattern of alternating ash/soil and basalt layers of differing porosity that trap layers of fresh groundwater while forcing out seawater. As these layers dip under the ocean, they form a 3.5 cubic kilometer cache of freshwater west of Hawai'i. Electromagnetic studies of other volcanic islands across the globe suggest the presence of similarly layered offshore hydrogeological formations, the researchers note, indicating potential new sources of freshwater for these islands as well.

Credit: 
American Association for the Advancement of Science (AAAS)

For teens with migraine, sleeping in (a bit) may help

Research indicates that starting school later in the morning yields health and academic benefits for high schoolers, whose natural body clock tends toward late-to-bed, late-to-rise habits. While parents raise concerns about drowsy driving, irritation and impaired school performance, a new study led by researchers at UC San Francisco suggests another reason to push back the start time.

The researchers found that teens with migraines whose high schools started before 8:30 a.m. experienced an average 7.7 headache days per month. This was close to three more headache days than those with later school start times, the researchers reported in their study, which publishes in Headache: The Journal of Head and Face Pain on Nov. 25, 2020.

"Evidence suggests that there is a relationship between sleep and migraine," said first author Amy Gelfand, MD, a neurologist at the Pediatric Headache Program at UCSF Benioff Children's Hospitals, noting that 8-12 percent of adolescents suffer from the disease. "Getting adequate sleep and maintaining a regular sleep schedule may reduce the frequency of migraines."

The American Academy of Sleep Medicine advises that teens get eight to 10 hours of sleep a night. In recognition of adolescents' delayed circadian clock, the American Academy of Pediatrics recommends that middle and high schools start no earlier than 8:30 a.m. However, just 18 percent of public middle and high schools adhere to this recommendation, according to the Centers for Disease Control and Prevention.

To quantify what impact, if any, school start times have on migraine frequency, the researchers reached out to high schoolers via social media, offering a $10 gift card to complete a brief survey. Approximately 1,000 9th-12th graders whose headaches fit the criteria for migraine responded to the survey. They comprised 509 students who started school before 8:30 a.m. and 503 who started school after 8:30 a.m.

Both groups had an average 24-minute commute to school, with the earlier-start group waking up at 6:25 a.m. and beginning school at 7:56 a.m., and the later-start group waking up at 7:11 a.m. and beginning school at 8:43 a.m. Of note, the later-start group went to bed earlier on school nights - on average at 10:19 p.m., versus 10:58 p.m. in the earlier-start group.

The number of headache days per month averaged 7.7 for the earlier-start group and 4.8 for the later-start group. This difference narrowed to 7.1 versus 5.8 days when the researchers adjusted for risk factors such as inadequate sleep and skipping breakfast, as well as for gender, grade, homework volume and migraine medication use. Nevertheless, the 1.3-day difference between the two groups remained significant, said Gelfand, who is affiliated with the UCSF Weill Institute for Neurosciences.

"The magnitude of the effect size in this study is similar to that seen in studies of migraine prevention drugs," she said. "For example, in a trial of topiramate (Topamax) versus placebo in 12- to 17-year-olds with episodic migraine, those receiving the drug had an average of two migraine days in the last month, compared with 3.5 migraine days for those on placebo - a difference of 1.5 days."

Similarly, in two studies of adults receiving onabotulinum toxin A (Botox) injections to prevent chronic migraines, the difference between the number of headache days between the two treatment groups and each placebo group was 1.4 days and 2.3 days.

"If our findings are confirmed in future research, shifting to a later high school start time is a modifiable, society-level intervention that could translate to thousands of fewer migraine days and fewer missed days of school for teenagers," said Gelfand, who will be the new editor-in-chief of Headache in January.

In 2019, California became the first state to legislate that high schools begin no earlier than 8:30 a.m., a change that will be implemented in the 2022-23 academic year. "The COVID-19 pandemic has led to widescale changes in how students attend schools," said Gelfand. "As we rethink what a typical school day looks like, the time may be ripe for changing school start time as well."

Credit: 
University of California - San Francisco

Study characterizes suspected COVID-19 infections in emergency departments in the UK

Among patients reporting to hospital emergency departments (EDs) with suspected COVID-19 infection, important differences in symptoms and outcome exist based on age, sex and ethnicity, according to a new study published this week in the open-access journal PLOS ONE by Steve Goodacre of the University of Sheffield, UK, and colleagues.

Hospital EDs have played a crucial role during the COVID-19 pandemic in receiving acutely ill patients, determining their need for hospital admission, and providing treatment. Appropriate management of the heterogeneous population of patients who are suspected of having COVID-19 is an important challenge that needs to be informed by relevant data.

In the new paper, part of the Pandemic Respiratory Infection Emergency System Triage (PRIEST) study and funded by the National Institute for Health Research (NIHR), researchers collected a mixture of prospective and retrospective data from 22,445 people presenting to 70 EDs across the UK with suspected COVID-19 infection between March 26, 2020 and May 28, 2020. Data on sex, age, ethnicity, presenting symptoms, admission to hospital, COVID-19 result, organ support and death were available for each patient 30 days after initial presentation. This study is one of a number of COVID-19 studies that have been given urgent public health research status by the Chief Medical Officer/Deputy Chief Medical Officer for England.

On average, those included in the study were 58.4 years old; 50.4% were female and 84.75% were white. Adults admitted to hospital with confirmed COVID-19 were more than twice as likely to die or receive organ support as adults who did not have COVID-19, suggesting a worse outcome than from otherwise similar presentations. Compared to children aged 16 years and under, adults were sicker, with higher rates of hospital admission (67.1% vs 24.7%), COVID-19 positivity (31.2% vs 1.2%) and death (15.9% vs 0.3%). Men were more likely than women to be admitted to hospital (72.9% vs 61.4%), to require organ support (12.2% vs 7.7%) and to die (18.7% vs 13.3%).

In addition, ethnicity conveyed some differences--Black and Asian adults tended to be younger than White adults and, while they were less likely to be admitted to the hospital (Black 60.8%, Asian 57.3%, White 69.6%), they were more likely to require organ support (15.9%, 14.3%, 8.9%) and, importantly, more likely to have a positive COVID-19 test (40.8%, 42.1%, 30.0%).

The authors add: "Our findings show that people attending emergency departments with suspected COVID-19 were seriously ill, suggesting that policies aimed at diverting less serious cases away from hospitals were successful. We also showed that admission with COVID-19 carries a much higher risk of death or need for life-saving treatment than admission with similar conditions."

Credit: 
PLOS

New wheat and barley genomes will help feed the world

image: Associate Professor Ken Chalmers, University of Adelaide, inspecting grain grown in glasshouse trials

Image: 
University of Adelaide

An international research collaboration, including scientists from the University of Adelaide's Waite Research Institute, has unlocked new genetic variation in wheat and barley - a major boost for the global effort in breeding higher-yielding wheat and barley varieties.

Researchers from the 10+ Wheat Genomes Project, led by Professor Curtis Pozniak (University of Saskatchewan, Canada), and the International Barley Pan Genome Sequencing Consortium, led by Professor Nils Stein (Leibniz Institute of Plant Genetics and Crop Plant Research (IPK), Germany), have sequenced a suite of genomes of both cereals, published today in the journal Nature. They say it will open the doors to the next generation of wheat and barley varieties.

"Wheat and barley are staple food crops around the world but their production needs to increase dramatically to meet future food demands," says the University of Adelaide's Associate Professor Ken Chalmers who, together with his School of Agriculture, Food & Wine colleague Professor Emeritus Peter Langridge, led the Adelaide research. "It is estimated that wheat production alone must increase by more than 50% over current levels by 2050 to feed the growing global population." Professor Chengdao Li at Murdoch University also played a key role in the Australian component of the barley sequencing.

Today's published research brings scientists closer to unlocking the entire gene set - or pan genomes - of wheat and barley. Through understanding the full extent of genetic variation in these cereals, researchers and plant breeders will have the necessary tools to realise the required increased global production.

"Advances in genomics have accelerated breeding and the improvement of yield and quality in crops including rice and maize, but similar efforts in wheat and barley have been more challenging," says Professor Langridge. "This is largely due to the size and complexity of their genomes, our limited knowledge of the key genes controlling yield, and the lack of genome assembly data for multiple lines of interest to breeders.

"Modern wheat and barley cultivars carry a wide range of gene variants and diverse genomic structures that are associated with important traits, such as increased yield, drought tolerance and disease resistance.

"This variation cannot be captured with a single genome sequence. Only by sequencing multiple and diverse genomes can we begin to understand the full extent of genetic variation, the pan genome."

The two international projects have sequenced multiple wheat and barley varieties from around the world. The Adelaide component was supported by the Grains Research and Development Corporation (GRDC).

"The information generated through these collaborative projects has revealed the dynamics of the genome structure and previously hidden genetic variation of these important crops, and shown how breeders have achieved major improvements in productivity. This work will support the delivery of the next generations of modern varieties," Associate Professor Chalmers says.

The inclusion of two Australian wheat varieties, AGT-Mace (PBR) and Longreach-Lancer (PBR), reflecting the southern and northern growing areas respectively, means that potential genetic variation for adaptation to Australia's different production environments can be identified.

The University of Adelaide also sequenced three barley varieties with desirable traits such as high-yield and potential for tolerance to heat, frost, salinity and drought, and novel disease resistance.

"These genome assemblies will drive functional gene discovery and equip researchers and breeders with the tools required to bring the next generation of modern wheat and barley cultivars that will help meet future food demands," says Associate Professor Ken Chalmers.

Credit: 
University of Adelaide

Wheat diversity due to cross-hybridization with wild grasses

image: Great genetic diversity: There are more than 560,000 different varieties of bread wheat.

Image: 
Rebecca Leber, UZH

A variety of bread wheat that flourishes across Switzerland would remain just a poorly growing grass in India. This ability to adapt to regional climate conditions and environmental factors makes bread wheat the most commonly grown crop around the world. Its cultivation dates back around 8,000 years. Over time, more than 560,000 different varieties have developed. Seeds of each variety are stored in international seed vaults. Until now, however, the genetic factors responsible for the diversity and adaptability of wheat were largely unknown.

Genomes of 10 wheat varieties completely deciphered

Previously, the only bread wheat genome to have been decoded was from an old Chinese landrace. It has served as a model plant in research for many years but differs greatly in its properties from the more modern wheat varieties used in agriculture. Now an international consortium led by University of Saskatchewan wheat breeder Curtis Pozniak and involving more than 100 researchers from nine countries - including plant and evolutionary biologists from the University of Zurich (UZH) - has completely sequenced the genomes of 10 wheat varieties from North America, Asia, Australia and Europe. "The 10 varieties represent a significant portion of the worldwide variety of wheats. The genome data, which are freely available to all interested parties, constitute an important resource for humanity," says Beat Keller, professor at the UZH Department of Plant and Microbial Biology.

Chromosome fragments from wild grasses cross-hybridized

With around 100,000 genes on 21 chromosomes, the wheat genome is approximately five times larger than the human genome. Like other grains, modern common wheat carries multiple sets of chromosomes, the result of hybridization events that combined three different parent plants. "We were able to find numerous differences in the genome structure of the investigated wheat varieties. They differ in particular through large chromosome fragments that were at some point in the past introduced through cross-hybridization with wild grasses," adds UZH researcher Thomas Wicker, one of the corresponding authors of the study. While some of these fragments were transferred through targeted breeding, the source of most fragments is still unknown.

Crossing of species boundaries leads to diversity

When chromosome fragments from wild grasses are crossed into wheat, a species boundary is crossed. According to the researchers, this process is a significant biological factor behind the diversity and adaptability of wheat. This is exemplified by the large differences in the type and number of immune receptors they discovered in the genome sequences. "This variability shows that the different varieties have adapted to regionally varying plant diseases, such as viruses and fungi, or pests, such as insects," says Wicker.

Meet rising demand thanks to more targeted cultivation

According to Kentaro Shimizu, UZH professor at the Department of Evolutionary Biology and Environmental Studies, the triple chromosome set of bread wheat gives it another evolutionary advantage: "Single genes can change while other copies of the same gene retain their original function. The plant then has a greater repertoire of possibilities for adaptation." Alongside the discovery of genes for particular quality features and resistances of agronomic significance - exemplified by the Japanese cultivar Norin 61 in an additional publication by the UZH researchers - the "10+ Wheat Genome Project" also enables more targeted breeding of specific wheat varieties, which will help meet the rising worldwide demand in the future.

Credit: 
University of Zurich

Fruit flies reveal new insights into space travel's effect on the heart

image: Karen Ocorr, Ph.D., assistant professor in Sanford Burnham Prebys' Development, Aging and Regeneration Program and Neuroscience and Aging Research Center

Image: 
Sanford Burnham Prebys Medical Discovery Institute

Scientists at Sanford Burnham Prebys Medical Discovery Institute have shown that fruit flies that spent several weeks on the International Space Station (ISS)--about half of their lives--experienced profound structural and biochemical changes to their hearts. The study, published today in Cell Reports, suggests that astronauts who spend a lengthy amount of time in space--which would be required for formation of a moon colony or travel to distant Mars--could suffer similar effects and may benefit from protective measures to keep their hearts healthy. The research also revealed new insights that could one day help people on Earth who are on long-term bed rest or living with heart disease.

"For the first time, we can see the cellular and molecular changes that may underlie the heart conditions seen in astronaut studies," says Karen Ocorr, Ph.D., assistant professor in the Development, Aging and Regeneration Program at Sanford Burnham Prebys and co-senior author of the study. "We initiated this study to understand the effects of microgravity on the heart, and now we have a roadmap we can use to start to develop strategies to keep astronaut hearts strong and healthy."

Past studies have shown that under microgravity conditions, the human heart shifts from an oval to a more spherical shape. Space flight also causes the heart muscle to weaken (atrophy), reducing its ability to pump blood throughout the body. However, until now, human heart studies--completed using ultrasounds performed on the ISS--have been limited to a relatively small number of astronauts. While important, these studies didn't reveal the cellular and molecular changes that drive these transformations--information needed to develop countermeasures that will keep astronauts safe on prolonged flights.

"As we continue our work to establish a colony on the moon and send the first astronauts to Mars, understanding the effects of extended time in microgravity on the human body is imperative," says Sharmila Bhattacharya, Ph.D., senior scientist at NASA and a study author. "Today's results show that microgravity can have dramatic effects on the heart, suggesting that medical intervention may be needed for long-duration space travel, and point to several directions for therapeutic development."

Fruit flies are surprisingly good models for studying the human heart. The insects share nearly 75% of disease-causing genes found in humans, and their tube-shaped hearts mirror an early version of ours--which begins as a tube when we're in the womb and later folds into the four chambers with which we're familiar. Fortunately, fruit flies are also largely self-sustaining. All the food the flies needed for the duration of the trip was contained in special boxes designed for this study--allowing busy astronauts to focus on other tasks.

Journey to space

In the study, the scientists sent the special "vented fly boxes" containing vials filled with a few female and male fruit flies to the ISS for a one-month-long orbit. While in space, these flies produced hundreds of babies that experienced three weeks of microgravity--the human equivalent of three decades. The fruit flies that were born in space returned to Earth via a splashdown off the coast of Baja California. A member of the scientific team retrieved the flies from the Port of Long Beach and--very carefully--drove the specimens to Sanford Burnham Prebys' campus in La Jolla, California.

Once the flies arrived at the lab, the scientists sprang into action. Tests of heart function had to be taken within 24 hours of the return to Earth so gravity wouldn't interfere with the results. The researchers worked around the clock to measure the flies' ability to climb up a test tube; to capture videos of the beating hearts to measure contractility and heart rate; and to preserve tissue for future genetic and biochemical assays, including mapping gene expression changes that occurred in the heart.

Extensive tissue remodeling

This work revealed that the space flies had smaller hearts that were less contractile--reducing their ability to pump blood and mirroring symptoms seen in astronauts. The heart tissue also underwent extensive remodeling. For example, the normally parallel muscle fibers became misaligned and lost contact with the surrounding fibrous structures that permit the heart to generate force--resulting in impaired pumping.

"In the normal fly heart, the muscle fibers work like your fingers when they squeeze a tube of toothpaste. In the space flies, the contraction was like trying to get toothpaste out by pressing down instead of squeezing," explains Ocorr. "For humans, this could become a big problem."

To the scientists' surprise, the fibrous extracellular matrix (ECM) surrounding the heart of the space flies was significantly reduced. After a heart injury such as a heart attack, this supportive tissue is often overproduced and interferes with heart function. For this reason, the interplay between the ECM and the heart is an active area of research for heart scientists.

"We were very excited to find several ECM-interacting proteins that were dysregulated in the space flies," says Rolf Bodmer, Ph.D., director and professor in the Development, Aging and Regeneration Program at Sanford Burnham Prebys and co-senior author of the study. "These proteins weren't previously on the radar of heart researchers, so this could accelerate the development of therapies that improve heart function by reducing fibrosis."

The tip of the iceberg

Ocorr and Bodmer are still busy analyzing genetic and molecular data from this study and believe these insights are the "tip of the iceberg" for this type of research. Vision problems are common in astronauts, so the scientists are also analyzing eye tissue from the space flies. Another area of interest relates to the babies of the flies that were born in space, which would help reveal any inherited effects of space flight. While astronaut health is the primary goal, people on Earth may ultimately be the greatest beneficiaries of this pioneering work.

"I am confident that heart disease research is going to benefit from the insights we're gaining from these flights," says Ocorr. "Understanding how the heart functions in space is also going to teach us more about how the heart works and can break on Earth."

Credit: 
Sanford Burnham Prebys

From fins to limbs and water to land

image: The aerial scene depicts two Late Devonian early tetrapods - Ichthyostega and Acanthostega - coming out of the water to move on land. Footprints trail behind the animals to show a sense of movement.

Image: 
Davide Bonadonna

It's hard to overstate how much of a game-changer it was when vertebrates first rose up from the waters and moved onshore about 390 million years ago. That transition led to the rise of the dinosaurs and all the land animals that exist today.

"Being able to walk around on land essentially set the stage for all biodiversity and established modern terrestrial ecosystems," said Stephanie Pierce, Thomas D. Cabot Associate Professor of Organismic and Evolutionary Biology and curator of vertebrate paleontology in the Museum of Comparative Zoology. "It represents an incredibly important period of time in evolutionary history."

Scientists have been trying for more than a century to unravel exactly how this remarkable shift took place, and their understanding of the process is largely based on a few rare, intact fossils with anatomical gaps between them. A new study from Pierce and Blake Dickson, Ph.D. '20, looks to provide a more thorough view by zeroing in on a single bone: the humerus.

The study, published today, shows how and when the first groups of land explorers became better walkers than swimmers. The analysis spans the fin-to-limb transition and reconstructs the evolution of terrestrial movement in early tetrapods. These are the four-limbed land vertebrates whose descendants include extinct and living amphibians, reptiles, and mammals.

The researchers focused on the humerus, the long bone in the upper arm that runs down from the shoulder and connects with the lower arm at the elbow, to get around the dilemma of gaps between well-preserved fossils. Functionally, the humerus is invaluable for movement because it hosts key muscles that absorb much of the stress from quadrupedal locomotion. Most importantly, the bone is found in all tetrapods and the fishes they evolved from and is pretty common throughout the fossil record. The bone represents a time capsule of sorts, with which to reconstruct the evolution of locomotion since it can be examined across the fin-to-limb transition, the researchers said.

"We went in with the idea that the humerus should be able to tell us about the functional evolution of locomotion as you go from being a fish that's just swimming around and as you come onto land and start walking," Dickson said.

The researchers analyzed 40 3D fossil humeri for the study, including new fossils collected by collaborators at the University of Cambridge as part of the TW:eed Project. The team looked at how the bone changed over time and its effect on how these creatures likely moved.

The analysis covered the transition from aquatic fishes to terrestrial tetrapods. It included an intermediate group of tetrapods with previously unknown locomotor capabilities. The researchers found that the emergence of limbs in this intermediate group coincided with a transition onto land, but that these early tetrapods weren't very good at moving on it.

To understand this, the team measured the functional trade-offs associated with adapting to different environments. They found that as these creatures moved from water to land, the humerus changed shape, resulting in new combinations of functional traits that proved more advantageous for life on land than in the water.

That made sense to the researchers. "You can't be good at everything," Dickson said. "You have to give up something to go from being a fish to being a tetrapod on land."

The researchers captured the changes on a topographical map showing where these early tetrapods stood in relation to water-based or land-based living. The scientists said these changes were likely driven by environmental pressures as these creatures adapted to terrestrial life.

The paper describes the transitional tetrapods as having an "L-shaped" humerus that provided some functional benefit for moving on land, but not much. These animals had a long way to go to develop the traits necessary to use their limbs on land to move with ease and skill.

As the humerus continued to change shape, tetrapods improved their movement. The "L" shaped humerus transformed into a more robust, elongated, twisted form, leading to new combinations of functional traits. This change allowed for more effective gaits on land and helped trigger biological diversity and expansion into terrestrial ecosystems. It also helped establish complex food chains based on predators, prey, herbivores, and carnivores still seen today.

The analysis took about four years to complete. Quantifying how the humerus changed shape and function took thousands of hours on a supercomputer. The researchers then analyzed how those changes affected the functional performance of the limb during locomotion and the associated trade-offs.

The innovative approach represents a new way of viewing and analyzing the fossil record -- an effort Pierce said was well worth it.

"This study demonstrates how much information you can get from such a small part of an animal's skeleton that's been recorded in the fossil record and how it can help unravel one of the biggest evolutionary transformations that has ever occurred," Pierce said. "This is really cutting-edge stuff."

Credit: 
Harvard University

Obesity is not only the individual's responsibility

image: Statistical differences were found in all of the categories on the graph.

Image: 
(Adapted from Table 3 in Asahara et al. PLOS ONE, 2020.)

Research based on 5,425 citizens' responses to a questionnaire survey has shown that obesity is linked to various factors beyond the individual's current socioeconomic circumstances, including childhood experiences, particularly experiences of abuse. The study was carried out by the research group of Project Professor TAMORI Yoshikazu (Division of Creative Health Promotion) at the Kobe University Graduate School of Medicine.

Conventionally, there is a tendency to perceive individuals who are overweight as lacking the willpower to improve their lifestyle habits. However, this research study has revealed that in women, obesity in adulthood is linked not only to factors such as social environment (for example, economic circumstances and education), but also to childhood experiences, in particular abuse.

This suggests that improving child welfare, such as by increasing abuse prevention measures, will also help to prevent obesity in adults.

These research results are due to be published in the scientific journal PLOS ONE on November 25 at 2pm (EST).

Main Points

Analysis of 5,425 responses to a questionnaire survey distributed to 20,000 Kobe citizens revealed that, in women, obesity was related to the individual's social and economic background (factors such as marital status, economic circumstances, educational background, and childhood experiences of abuse by a parent).

The same connection was not found in results from male participants.

This study is the first in Japan to illuminate the connection between abuse during childhood and obesity in adulthood.

Socioeconomic factors also have an impact on obesity in women. It is therefore important to approach obesity prevention not only from a medical standpoint but also from a societal perspective, including the involvement of public authorities.

Research Background

Obesity is increasing worldwide against a background of lifestyle habits such as overeating and insufficient exercise. In Japan, approximately 1 in 3 men and 1 in 5 women are overweight. Obesity causes various disorders such as Type-2 diabetes, dyslipidemia, high blood pressure, heart disease, fatty liver, stroke and sleep apnea, thus shortening healthy life expectancy.

There is a strong relationship between obesity and lifestyle habits; however, studies overseas have reported that various aspects of individuals' social backgrounds also have an effect. In Japan, this kind of survey study had yet to be carried out. In addition, there are racial and cultural differences between Japan and other countries. Therefore, this study aimed to understand the relationship between obesity and social background based on the results of a survey carried out in Japan.

Revealing the relationship between obesity and the individual's social background will make a large contribution towards measures to tackle and prevent obesity.

Research Results

In 2018, the Kobe City authorities distributed a survey to 20,000 citizens aged between 20 and 64 about aspects such as their living conditions and health issues in order to gain an understanding of residents' health. Based on the results of this questionnaire, Professor Tamori investigated what sort of personal living conditions were connected to obesity, which has been called 'the cause of all kinds of diseases'.

There were more obese male respondents (27.2%) than female respondents (10.6%), which is the same as the national trend in Japan. When the researchers investigated the social and personal factors that were linked to obesity, they found that there were differences in the following categories between obese women and those with an average weight: employment status, household economic circumstances, educational background, extracurricular activities during middle school/high school, economic circumstances at age 15, and experiences of adversity during childhood. Furthermore, marital status, household economic circumstances, educational background, and experiences of adversity during childhood were factors that could predict the onset of obesity (Figure 1). On the other hand, no statistical differences were found between the surveyed categories in men (Figure 2).

Experiences of adversity during childhood included physical violence from a parent, insufficient food or clothing, and emotional trauma originating from a parent's comments or insults.

The results of this research have revealed that although obesity is more prevalent in men, the social background of the individual is strongly connected to the onset of obesity in women. In particular, this study is the first in Japan to show a connection between experiences of childhood abuse and obesity in adult women (see supplementary explanation).

Research Significance and Further Research

It has been reported that social and economic factors such as income and educational background are linked to obesity in women living in developed countries. The current study showed that in Kobe, a representative large city of Japan, obesity in women is also related to socioeconomic background.

The main causes of obesity are conventionally considered to be overeating and insufficient exercise. Consequently, there is a tendency to perceive those who are overweight as lacking in self-discipline and being weak-willed. However, this study has revealed that in women, the social background of the individual is also connected to the onset of obesity. This highlights the importance of taking social factors into account when implementing policies to tackle obesity.

Childhood abuse counselling cases are increasing in Japan. This research indicates that improving child welfare by, for example, strengthening measures against abuse, could be connected to obesity prevention.

Supplementary Explanation

Studies conducted overseas have reported that experience of childhood abuse (physical, psychological and sexual abuse, and neglect) not only leads to obesity in adulthood but is also linked to general ill health, such as increased habitual smoking. Those who have experienced abuse are believed to develop a reliance on sugary or high-fat 'tasty' foods more easily and to be more prone to overeating when stressed.

Researcher Comment (Project Professor TAMORI Yoshikazu)

There are differences in the way men and women perceive weight. 'Thinness' is more readily associated with 'health and beauty' in women than in men. Sufficient socioeconomic means are required to pursue this ideal of 'health and beauty', so it is probable that socioeconomic difficulties have a stronger impact on obesity in women. In addition, it is possible that the environment in which a person is raised predisposes them to becoming overweight through physiological changes, such as those resulting from hormone secretion. The existence of such changes has not yet been demonstrated, but it is likely that their impact differs between men and women.

Credit: 
Kobe University

RUDN University research team of mathematicians suggested a new decision making algorithm

image: A research team from RUDN University developed an algorithm to help large groups of people make optimal decisions in a short time.

Image: 
RUDN University

A research team from RUDN University developed an algorithm to help large groups of people make optimal decisions in a short time. They confirmed the efficiency of their model using the example of the market at which the outbreak of COVID-19 began. The model helped the administration and sellers agree on closing the market and reach a consensus about the sums of compensations in just three steps. An article about the algorithm was published in the Information Sciences journal.

Decision theory is a field of mathematics that studies the patterns of decision making and strategy selection. In the terms of mathematics, decision making is an optimization task with multiple criteria. Expert opinions, judgments, and possible risks are considered variables, and the relations between participants and the search for an optimal solution are expressed as mathematical operations. LSGDM is a model in decision theory that describes decision making situations with over 20 expert-level participants. Their opinions are affected by personal relations: for example, friends support each other's views. This increases the level of uncertainty because convincing the participants and reaching a consensus becomes more difficult. A research team of mathematicians from RUDN University suggested a method to eliminate this uncertainty.

"Thanks to today's technological developments, more and more people start to participate in decision-making processes. That is why LSGDM has become a burning issue for researchers. In LSGDM, participants represent different areas of interest, and therefore it takes longer for them to reach a consensus. The process requires a moderator capable of convincing all parties to adjust their opinions," said Prof. Enrique Herrera-Viedma, the research team's leader at RUDN University.

The solution suggested by his team is based on the so-called robust optimization technique, which is applied to optimization tasks that are sensitive to changes in the initial data (in this case, the personal relations between the participants). The mathematicians suggested a new way of grouping experts into clusters based on the strength of their relationships and the level of trust between them. The algorithm consists of several steps. First, the experts are clustered; then, the cluster whose opinion differs most from the collective judgment is identified; and after that, that opinion is corrected. The iterations are repeated until all participants agree on one solution. The method of opinion correction is irrelevant from the mathematical point of view; the only factor that matters is the unit negotiation cost: the amount of resources (time, money, etc.) that must be spent to reach the desired result.
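The iterative loop described above can be sketched in a few lines. This is an illustrative simplification, not the authors' actual robust-optimization formulation: the cluster assignments, the halfway adjustment rule, and the unit-cost bookkeeping are all assumptions made for the sake of the example.

```python
def reach_consensus(opinions, clusters, unit_cost=1.0, tol=0.05, max_iter=100):
    """Iteratively correct the most deviant cluster until consensus.

    opinions: dict mapping expert id -> numeric opinion (e.g. requested sum)
    clusters: list of lists of expert ids (the pre-computed clustering)
    Returns (consensus_value, total_negotiation_cost).
    """
    total_cost = 0.0
    collective = sum(opinions.values()) / len(opinions)
    for _ in range(max_iter):
        collective = sum(opinions.values()) / len(opinions)

        def cluster_mean(c):
            return sum(opinions[e] for e in c) / len(c)

        # Identify the cluster whose opinion differs most from the collective judgment.
        worst = max(clusters, key=lambda c: abs(cluster_mean(c) - collective))
        deviation = cluster_mean(worst) - collective
        if abs(deviation) <= tol:
            break  # consensus reached
        # Correct that cluster's opinion: move each member halfway toward
        # the collective judgment, charging the unit negotiation cost per
        # unit of opinion shifted (an assumed correction rule).
        for e in worst:
            shift = 0.5 * (collective - opinions[e])
            opinions[e] += shift
            total_cost += unit_cost * abs(shift)
    return collective, total_cost
```

With hypothetical numbers loosely echoing the market example (eight sellers requesting 200 to 900 yuan, grouped into four clusters), the loop converges to a consensus value between the extremes while accumulating the negotiation cost the administration would have to bear.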

The research team applied the model to a real-life example. After the outbreak of COVID-19, a seafood market in Wuhan had to be closed down. The administration was looking for an optimal solution: it had to compensate the sellers' losses while staying within the market's budget. The mathematicians chose 20 sellers who requested different sums of compensation for closing their stalls, from 200 to 900 yuan. The participants were divided into four clusters based on factors such as similarity of opinions, the proximity of their stalls to each other, and so on. The algorithm let the sellers and the administration reach a consensus in just three steps. The final sum of compensation was 880 yuan, and the negotiation cost for the market administration turned out to be the lowest compared with other existing models.

Credit: 
RUDN University

Bank-affiliated funds contribute to funding their parent banks in times of crisis

A study published recently in the journal Review of Financial Studies by researchers Javier Gil-Bazo, Sergio Mayordomo and Peter Hoffmann shows clear evidence that, in Spain, bank-affiliated mutual funds provided funding support to their parent companies via purchases of bonds in the primary market during the last crisis (2008-2012).

Cost-benefit calculation by conglomerates

The research is the first international study of the use of mutual funds managed by companies controlled by banks as an alternative source of funding for banking institutions. The researchers study whether conglomerates strategically weigh the benefits of this practice, and whether the availability of this alternative source of funding helps banks overcome periods of financial difficulty.

To minimize the cost of this practice, conglomerates more commonly channel the funding through funds aimed at retail investors and funds without performance fees. This suggests that this form of funding is the result of a strategic decision taken at the conglomerate level.

In their research, the authors found that the financial support provided by subsidiary mutual fund managers to their parent banks is greater in times of crisis and for riskier banks. According to the authors, the conflict of interest they analyse arises only in extraordinary circumstances such as those experienced by European and Spanish banks after Lehman Brothers went bankrupt in 2008.

While funding from affiliated funds is of little value to banks in normal times, it is valuable in times of financial crisis. The possibility of the parent bank using this alternative source of funding remains latent until a banking crisis occurs. The practice is more common among banks that rely more heavily on central bank liquidity, have a higher ratio of nonperforming loans, or have experienced a reduction in their credit rating.

Analysis of Spanish banks during the period 2000-2012

The study uses data from mutual funds affiliated with Spanish banks over the period 2000-2012. According to the authors, Spain is an especially suitable laboratory for this research for three reasons: its mutual fund industry is dominated by banks; the Spanish banking sector entered a period of serious crisis after the Lehman collapse in 2008, which increased banks' dependence on central banks and the competition for stable sources of funding; and finally because, unlike in the US, related-party transactions were not prohibited in Spain.

The study found that during the period 2000-2012, totalling all the funds of the same asset management group, affiliated funds' excess purchases of their parent bank's debt accounted, on average, for 2.85% of the total amount issued over the period analysed, corresponding to 514 million euros per bank, or 14,400 million euros overall. While this funding support is absent in normal times, it accounts for 7% of the amount issued in times of crisis (11,900 million euros overall). This funding allowed banks to ease their financial constraints and facilitated access to credit for Spanish businesses during the crisis.

Changes in regulation in Spain, which remains lax and does not favour fund investors

Unlike in the US, in Spain mutual funds are not prohibited from transacting with companies of the same group, as such transactions fall under the European regulatory framework on conflicts of interest in asset management. That regulation relies on rules of conduct to prevent conflicts of interest or to minimize their impact on investors. According to the authors, this approach may not be effective in protecting mutual fund investors.

In February 2018, the Comisión Nacional del Mercado de Valores (CNMV), the national securities market commission, established new rules whereby prior authorization is required for these transactions. Authorization must take into account any conflicts of interest that may arise, which means that asset management companies must explicitly declare that a transaction is made in the investors' best interests. The authors also propose a new regulatory approach that includes improving the transparency of related-party transactions and their potential costs for investors, and providing investors with tools that facilitate the comparison of yields, fees and risks between funds.

Credit: 
Universitat Pompeu Fabra - Barcelona

UB study identifies new functions of the gene behind the Machado-Joseph genetic disease

image: Rods (a type of photoreceptor) isolated from the retina of a control mouse (Atxn3 +/+) and a mouse with the silenced gene Atxn3 (Atxn3-/-). The outer segment, or neurosensory cilium (OS plus CC), is elongated when the ATXN3 protein is absent.

Image: 
V. Toulis et al./ Cell Reports

Ataxia is a rare disease of genetic origin, known for the neuromuscular alterations caused by the selective loss of neurons in the cerebellum, the part of the nervous system that controls movement and balance. UB researchers have identified new functions of the ataxin-3 gene (ATXN3), which causes Machado-Joseph disease, the most common type of ataxia, in the development of retinal photoreceptors. According to the researchers, these results are relevant not only for analysing the molecular causes of ataxia and designing potential therapies against the disease, but also for understanding other conditions, such as macular degeneration.

The study, published in the journal Cell Reports, is led by Gemma Marfany, professor at the Department of Genetics, Microbiology and Statistics, researcher at the Institute of Biomedicine of the UB (IBUB) and at the Rare Diseases Networking Biomedical Research Centre (CIBERER). Other participants include Vasileios Toulis, Sílvia García Monclús, Carlos de la Peña Ramírez, Rodrigo Arenas Galnares, Josep F. Abril and Alejandro Garanto, from the same department, as well as researchers from the University of Michigan and Wayne State University (United States).

Machado-Joseph disease, also known as spinocerebellar ataxia type 3, is caused by gain-of-function mutations in the ATXN3 gene. These mutations induce the formation of neurotoxic aggregates that lead to the progressive death of neurons in the cerebellum. However, little is known about the basic function of the ATXN3 gene. In this context, the UB team has spent more than twenty years researching the genetic causes of hereditary diseases of the retina, a neurosensory tissue that is part of the central nervous system and responsible for sight. "In previous studies, our group found that ATXN3 is expressed not only in the retina of the adult mouse but also during the development of this tissue, so we set out to determine the function of this gene in the retina", notes Gemma Marfany.

Alterations in the structure of the retina

With this objective, the researchers carried out several experiments in which they silenced ATXN3 in zebrafish embryos and later analysed the effects of knocking out this gene in a genetically modified mouse. According to the researchers, the results show that removing ATXN3 causes severe alterations in the structure of the retina, allowing them to identify a new function of the protein encoded by this gene.

"The model organisms we analysed, both fish and mice, presented an elongation of the photoreceptor cilium, the organelles that receive the light photons and that turn light energy into electric one", notes Gemma Marfany.

"Any alteration in these cilia -continues the researcher- alters the function of photoreceptors and can cause its death, and therefore, can cause blindness".

Moreover, the lack of the ATXN3 protein affects the retinal pigment epithelium, a layer of pigmented cells in the outer part of the retina that phagocytizes the tips of the cilia so they can renew themselves every day. According to the new study, without the ATXN3 gene this phagocytosis slows down, disrupting a renewal process that is necessary for the retina to function. "All these alterations can be explained if ATXN3 regulates the proteins needed for the formation and growth of the microtubules that cross the inside of the cell and act as railways, transporting the proteins required to build the cilium and to carry out phagocytosis properly", the researcher explains.

Impact on other eye diseases

The importance of these results lies in advancing the understanding of the molecular causes of rare diseases such as ataxia, as well as other conditions. "The study provides new insights into how basic functions are regulated in neurons, among which are the photoreceptors. The alteration of these functions contributes to other eye diseases, such as macular degeneration, an eye disorder that slowly destroys sight and affects a large proportion of the population", concludes Marfany.

Credit: 
University of Barcelona