
Community choirs reduce loneliness and increase interest in life for older adults

An innovative San Francisco program of community choirs for older adults found that singing in a choir reduced loneliness and increased interest in life, but did not improve cognition or physical function, according to a study by researchers at UC San Francisco.

The program -- Community of Voices -- was a collaboration among UCSF, the nonprofit San Francisco Community Music Center (CMC) and the San Francisco Department of Aging and Adult Services (DAAS) that aimed to assess whether arts-based social interventions could substantively improve quality of life for older adults.

"Our current health and social systems are not prepared to help support our rapidly increasing population of older adults," said lead author Julene Johnson, PhD, associate dean for research and professor in the UCSF School of Nursing. "There's a high percentage who experience loneliness and social isolation, and depression also is relatively high. There's a need to develop novel approaches to help older adults stay engaged in the community and also stay connected."

The nearly 50 million Americans aged 65 and older represented 15.2 percent of the total U.S. population in 2016, according to the U.S. Census Bureau. They are increasingly diverse -- nearly 22 percent currently come from racial/ethnic minority backgrounds, a share projected to reach almost a third by 2030 -- and are at increased risk for poor health outcomes. Previous studies have shown that social isolation and depression can exacerbate poor health.

One potential novel approach is to engage older adults in the arts: arts programs can be offered in the community, are relatively low cost to deliver, are engaging, and can be culturally tailored. One option is community choirs, as about 32.5 million U.S. adults regularly sing in choirs.

"Thanks to the vision and leadership of UCSF and Julene Johnson, we now have evidence-based research to support the value of choirs for older adults," said Sylvia Sherman, CMC program director.

In the Nov. 9, 2018, Journal of Gerontology: Psychological Sciences study, 12 federally supported senior centers in San Francisco were randomized into a weekly group choir program designed to engage adults age 60 and older cognitively, physically and socially. Over a three-year period (February 2012 to August 2015), 390 English- and Spanish-speaking participants were enrolled into either a group that started choirs immediately (208 members), or another group that initiated choirs six months later (182 members). Two-thirds of the participants were from diverse backgrounds, 20 percent reported financial hardship, and 60 percent had two or more chronic medical conditions.

The Community of Voices choirs were led by professional choir directors and accompanists. They identified music repertoire that was culturally tailored for each site, appropriate for older adults with various singing abilities, and challenging enough to facilitate growth and mastery over time. The 90-minute choir sessions included informal public performances.

During the study, singers took memory, coordination and balance tests and completed questionnaires about their emotional well-being. Researchers assessed outcomes at six months, along with health care costs.

Overall, the researchers found that older adults who sang in a choir for six months experienced significant reductions in loneliness and increases in interest in life. However, there were no substantial group differences in cognitive or physical outcomes or in health care costs. The overall six-month retention rate was 92 percent.

Each of the 12 choirs created for the UCSF trial program continues to sing as part of CMC's Older Adult Choir Program.

"We were a little surprised not to see improvements in cognitive and physical function, especially because the literature, although small, suggested there should be improvements," Johnson said. "However, our study is one of the first randomized controlled trials of a choir intervention, whereas the others were cross-sectional or did not randomly assign the participants."

More research is needed on how choirs improve well-being and the potential long-term health impacts, said Johnson, who served on a 25-person panel of the National Institutes of Health and John F. Kennedy Center for the Performing Arts on music and the brain, with results published in March 2018 in Neuron.

"Besides being one of the first arts-based randomized trials for older adults, our trial represents a new direction in translational research designed to address health disparities, in which interventions are designed and evaluated in community settings from the outset," Johnson said. "These study methods can be a model for future trials to engage and retain diverse older adults in research."

Credit: 
University of California - San Francisco

Personalized medicine helps eyes drain themselves

image: The Purdue University glaucoma drainage device is built with microactuators that vibrate when a magnetic field is introduced.

Image: 
Hyowon Lee/Purdue University

WEST LAFAYETTE, Ind. - Purdue University researchers have invented a new smart drainage device to help patients with glaucoma, a leading cause of blindness in the world, as they try to save their eyesight.

Glaucoma can be treated only with medications or surgical implants, both of which offer varying degrees of success in helping to improve sight and to relieve pressure buildup inside the eye. The U.S. Centers for Disease Control and Prevention says about 3 million Americans have glaucoma.

Implantable glaucoma drainage devices have grown in popularity in recent years, but only half of them are still operational after five years because microorganisms accumulate on the device during and after implantation, a problem known as biofouling.

"We created a new drainage device that combats this problem of buildup by using advances in microtechnology," said Hyowon "Hugh" Lee, an assistant professor in Purdue's Weldon School of Biomedical Engineering and a researcher at the Birck Nanotechnology Center, who led the research team. "It is able to clear itself of harmful bio-buildup. This is a giant leap toward personalized medicine."

The Purdue glaucoma drainage device is built with microactuators that vibrate when a magnetic field is introduced. The vibrations shake loose the biomaterials that have built up in the tube.

"We can introduce the magnetic field from outside the body at any time to essentially give the device a refresh," Lee said. "Our on-demand technology allows for a more reliable, safe and effective implant for treating glaucoma."

The Purdue technology is published in the latest issue of Microsystems and Nanoengineering. Another unique aspect of the Purdue device is its ability to vary flow resistance, which allows the drainage technology to customize treatment for each patient at different stages of glaucoma with varying degrees of pressure buildup inside the eye.
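To illustrate why controlling flow resistance matters, the sketch below applies the Hagen-Poiseuille relation to a hypothetical drainage tube: because resistance scales with the inverse fourth power of the lumen radius, small changes in the effective opening translate into large changes in the pressure drop across the device. The dimensions, viscosity and flow rate are illustrative assumptions, not the specifications of the Purdue implant.

```python
# Illustrative back-of-the-envelope calculation (not the Purdue device's specs):
# Hagen-Poiseuille resistance of a cylindrical drainage tube and the resulting
# pressure drop at a typical aqueous humor production rate.
import math

def tube_resistance(radius_m, length_m, viscosity_pa_s=7.0e-4):
    """Laminar-flow resistance of a cylinder: R = 8*mu*L / (pi*r^4)."""
    return 8.0 * viscosity_pa_s * length_m / (math.pi * radius_m**4)

AQUEOUS_FLOW = 2.5e-9 / 60.0     # ~2.5 microliters per minute, in m^3/s
PA_PER_MMHG = 133.322

for radius_um in (150, 100, 75):  # hypothetical effective lumen radii
    resistance = tube_resistance(radius_um * 1e-6, length_m=5e-3)
    drop_mmhg = AQUEOUS_FLOW * resistance / PA_PER_MMHG
    print(f"radius {radius_um:3d} um -> pressure drop {drop_mmhg:.3f} mmHg")
```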

Other members of the Purdue research team include Arezoo Ardekani, an associate professor of mechanical engineering, and Simon John from the Jackson Laboratory.

The work aligns with Purdue's Giant Leaps celebration, acknowledging the university's global advancements in health as part of Purdue's 150th anniversary. This is one of the four themes of the yearlong celebration's Ideas Festival, designed to showcase Purdue as an intellectual center solving real-world issues.

Researchers are working with the Purdue Office of Technology Commercialization to patent the technology. They are looking for partners to license it.

Credit: 
Purdue University

Cultural brain hypothesis may explain why brains have become bigger

A theory called the cultural brain hypothesis could explain extraordinary increases in brain size in humans and other animals over the last few million years, according to a study published in PLOS Computational Biology by Michael Muthukrishna of the London School of Economics and Political Science and Harvard University, and colleagues at the University of British Columbia and Harvard University.

Humans have extraordinarily large brains, which have tripled in size in the last few million years. Other animals also experienced a significant, though smaller, increase in brain size. These increases are puzzling, because brain tissue is energetically expensive: that is, a smaller brain is easier to maintain in terms of calories. Building on existing research on learning, Muthukrishna and colleagues analytically and computationally modeled the predictions of the cultural brain hypothesis and found that this theory not only explains these increases in brain size, but a variety of other relationships with group size, learning strategies, knowledge and life history.

The theory relies on the idea that brains expand to store and manage more information. Brains expand in response to the availability of information and calories. Information availability is affected by learning strategies, group size, mating structure, and the length of the juvenile period, which co-evolve with brain size. The model captures this co-evolution under different conditions and also describes the specific and narrow conditions that can lead to a take-off in brain size--a possible pathway that led to the extraordinary expansion in our own species. The authors called this set of predictions the cumulative cultural brain hypothesis. These theories were supported by tests using existing empirical data. Taken together, the findings may help explain the rapid expansion of human brains and other aspects of our species' life history and psychology.
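The flavor of those co-evolutionary dynamics can be conveyed with a deliberately simplified simulation. The toy sketch below (not the authors' published model; all parameters are hypothetical) lets brain size cap how much adaptive knowledge an individual can hold, lets knowledge raise fitness while brain tissue imposes a cost, and lets offspring acquire knowledge mainly by copying knowledgeable adults. Under such conditions brain size and knowledge can ratchet up together.

```python
# Toy illustration of brain-size / knowledge co-evolution (hypothetical
# parameters; a simplification, not the model published in the paper).
import numpy as np

rng = np.random.default_rng(0)

N, GENERATIONS = 200, 300   # group size and number of generations
BRAIN_COST = 0.2            # fitness cost per unit of brain size
FIDELITY = 0.9              # how faithfully knowledge is copied socially
DEMONSTRATORS = 3           # adults each juvenile learns from

brain = rng.uniform(1.0, 2.0, N)      # brain size = capacity to store knowledge
knowledge = rng.uniform(0.0, 1.0, N)  # adaptive knowledge, capped by brain size

for _ in range(GENERATIONS):
    # Knowledge pays off; brain tissue is energetically expensive
    fitness = np.clip(1.0 + knowledge - BRAIN_COST * brain, 1e-6, None)

    # Reproduction proportional to fitness; brain size is heritable with noise
    parents = rng.choice(N, size=N, p=fitness / fitness.sum())
    child_brain = np.clip(brain[parents] + rng.normal(0.0, 0.02, N), 0.1, None)

    # Oblique social learning: copy the most knowledgeable of a few adults,
    # imperfectly, plus a little individual learning, capped by brain capacity
    demos = rng.integers(0, N, size=(N, DEMONSTRATORS))
    copied = FIDELITY * knowledge[demos].max(axis=1)
    child_knowledge = np.minimum(copied + rng.uniform(0.0, 0.1, N), child_brain)

    brain, knowledge = child_brain, child_knowledge

print(f"mean brain size: {brain.mean():.2f}, mean knowledge: {knowledge.mean():.2f}")
```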

"This is a brand-new theory to explain the evolution of the human brain as well as brains more generally. It shows how various characteristics of a species are actually intrinsically connected through a common evolutionary process," says Muthukrishna. "The limits to larger brains is our ability to birth them, but as this theory suggests, this process is ongoing - we're now expanding our juvenile period, hitting a new biological limit in our ability to reproduce at an older age".

Next, the researchers plan to test the theory's predictions about individual, rather than social, learning, as well as to develop extensions of the theory.

Credit: 
PLOS

Common allergen, ragweed, will shift northward under climate change

image: By the 2050s, under a high-emissions scenario, most climate models project that the bright blue areas will be newly populated with ragweed and that the orange areas will have significantly less ragweed. Lighter blue shows that some of the 13 climate models predict expansion, while brown shows that some of the models predict a contraction. Gray is where ragweed is not present, and black indicates no change.

Image: 
Michael Case/University of Washington

New research from the University of Washington and the University of Massachusetts - Amherst looks at how the most common cause of sneezing and sniffling in North America is likely to shift under climate change.

A recent study published in the open-access journal PLOS ONE finds that common ragweed will expand its range northward as the climate warms, reaching places including New York, Vermont, New Hampshire, and Maine, while retreating from some current hot spots.

"It was surprising that nobody had looked at ragweed distributions in the U.S.: As climate conditions are changing, where will it spread to in the future?" said corresponding author Michael Case, who did the work as a postdoctoral researcher in the UW School of Environmental and Forest Sciences.

Ragweed is a native North American plant that thrives in open areas, moving quickly into disturbed areas. It produces copious fine-powder pollen from August to November, causing sneezing, runny noses, irritated eyes, itchy throats and headaches for people with hay fever.

Several studies of ragweed's future geographic distribution have been done in Europe, where people are concerned because this invasive species is expanding its range. This is the first study to consider future ragweed distribution in the United States.

Case's previous research looks at how climate change may influence the distribution of various species, mainly native trees in the Pacific Northwest. Co-lead author Kristina Stinson, an assistant professor of plant ecology at UMass Amherst, is an expert on ragweed, including mapping allergy hot spots in New England.

"One reason we chose to study ragweed is because of its human health implications. Ragweed pollen is the primary allergen culprit for hay fever symptoms in summer and fall in North

America, so it affects a lot of people," Stinson said.

For the new study, the two authors built a machine learning model using Maxent software that takes some 726 observations of common ragweed in the eastern U.S., drawn from an international biodiversity database, then combines those with climate information to identify conditions that allow the plant to thrive. Researchers next ran the model into the future using temperature and precipitation output from 13 global climate models under two different pathways for future greenhouse gas emissions.
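The general shape of that workflow -- fit a presence/background model on current climate, then project it onto future climate -- can be sketched as below. This is only an illustration: a logistic-regression classifier stands in for the Maxent software the authors actually used, and the occurrence and climate data are synthetic placeholders.

```python
# Minimal species-distribution-model sketch (synthetic data; a simple classifier
# stands in for Maxent, which the study actually used).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)

# Hypothetical climate predictors (e.g., mean temperature, annual precipitation)
presence = rng.normal([12.0, 1000.0], [2.0, 150.0], size=(726, 2))   # occurrence sites
background = rng.normal([8.0, 900.0], [5.0, 300.0], size=(5000, 2))  # background points

X = np.vstack([presence, background])
y = np.concatenate([np.ones(len(presence)), np.zeros(len(background))])

model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
model.fit(X, y)

# "Project" onto a future climate surface (here: the same cells, warmed by 2 C)
future_climate = background.copy()
future_climate[:, 0] += 2.0
suitability_now = model.predict_proba(background)[:, 1]
suitability_future = model.predict_proba(future_climate)[:, 1]
print(f"mean suitability now: {suitability_now.mean():.2f}, "
      f"2050s: {suitability_future.mean():.2f}")
```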

The results show that roughly 35 years from now, ragweed is projected to expand northward into places where it has not been documented, including upstate New York around Albany, as well as New Hampshire, Maine and Vermont.

While that news may be ominous, knowing the plant is coming may help those communities prepare.

"Weed control boards, for example, might include ragweed on their list to keep an eye out and monitor for," Case said. "Historically they might not have been looking for ragweed, but our study suggests maybe they should start looking for it."

The study only covers the region east of the Mississippi River because that's where there were enough ragweed observations to run the model. The plant is commonly found in Illinois, Florida and the eastern seaboard from Washington, D.C. to Rhode Island. It is possible that ragweed would also expand its range westward or north into Canada, Case said, but those areas were outside the scope of the study.

The study also finds regions where ragweed is prevalent today but will decline substantially in the future, including the southern Appalachian Mountains, central Florida and northeastern Virginia. And knowing that, too, might be useful.

"As the climate becomes less suitable, there may be opportunities to try and displace ragweed. Maybe that is the silver lining -- that there are some opportunities for those communities to actually get some headway on mitigating or even eradicating this species," Case said.

Models show an overall surge in ragweed in the eastern U.S. by the 2050s followed by a slight overall contraction from the 2050s to the 2070s, as temperature and precipitation become more variable.

"It is kind of an interesting case study of climate change effects: It's not all bad, it's not all good," Case said.

"We don't have a lot of models like this that tell us where individual species may go under different scenarios," Stinson said. "Ecologists are working on doing this type of study for more species, but there are not always enough data points from around the world; individual species data are rare. But ragweed happens to be quite abundant, which made this study feasible."

Credit: 
University of Washington

Embryos remember the chemicals that they encounter

video: Researchers used a special imaging technique to visualize the location of every nucleus in a lab-generated embryo (left). When exposed to Activin alone, cells along the embryo's border reacted briefly, but did not fully differentiate (right).

Image: 
Laboratory of Stem Cell Biology and Molecular Embryology at The Rockefeller University

We all start out as a clump of identical cells. As these cells divide and multiply, they gradually take on distinct identities, acquiring the traits necessary to form, for instance, muscle tissue, bone, or nerves. A recent study from Rockefeller scientists offers new insight into how these cellular identities are cultivated over the course of development.

According to the study, published in eLife, cells retain a memory of the chemical signals to which they are exposed. And, the researchers show, embryos that fail to form these memories remain a clump of clones, never realizing their unique biological potential.

Activating embryos

Over 25 years ago, Ali H. Brivanlou demonstrated that the protein Activin causes embryonic frog cells to take on traits specific to certain tissue types, a process called differentiation. For decades now, Activin has been thought to instigate the transition from homogenous clump to specialized cells.

"Activin was the textbook definition of a molecule that is necessary and sufficient for differentiation," says Brivanlou, the Robert and Harriet Heilbrunn Professor. "Researchers had shown that the dose of the protein determines cellular fate. At a very high dose, for example, you get gut and muscle; and at a very low dose, you get nerve tissue."

Despite ample evidence from animal studies, questions remained about how Activin guides development in human cells. Working with Brivanlou and Eric D. Siggia, the Viola Ward Brinning and Elbert Calhoun Brinning Professor, graduate fellow Anna Yoney set out to investigate whether the protein triggers differentiation in laboratory-generated human embryos. Developed from stem cells, these embryos mimic the behavior of human cells during the early stages of development.

The researchers expected these synthetic embryos to respond just like Brivanlou's frogs. Yet, after applying Activin to these cells, they observed, well, nothing.

"Anna put Activin on the embryos and we waited--and waited and waited. And absolutely nothing happened! That was shocking," says Brivanlou.

Memorable molecules

Undeterred, Yoney considered possible explanations for her results. "I thought, Ok, we don't get a response from Activin alone," she recalls. "What additional signals might we need to see differentiation?"

She ultimately homed in on WNT, a molecule known to regulate the movement of cells during development. In her next experiment, she exposed the cells to WNT before adding Activin; and, this time, they differentiated in the normal manner.

"The cells that saw WNT reacted to Activin with the full range of response--just like we see in the frog and other animals," says Brivanlou. "But cells that hadn't seen WNT were totally unresponsive, as if Activin wasn't even there."

The researchers concluded that differentiation requires both WNT and Activin signaling. Crucially, however, they showed that cells needn't be exposed to the two chemicals simultaneously.

"We blocked WNT signaling during the Activin treatment phase and found that the cells still differentiated," says Yoney. "So we concluded that the cells actually remembered that they had previously been exposed to WNT."

The researchers deemed this phenomenon "signaling memory" because WNT appears to permanently change cells that cross its path. Earlier research in this area failed to uncover evidence for embryonic memories because, says Brivanlou, most developmental biologists work with animal cells.

"In animal model systems, cells encounter a series of signals before people like me manipulate them. But Anna's artificial embryos came from stem cells that hadn't had this kind of exposure--and this makes them perfect tools for discovering the roles of other signals," he says. "As beautiful as the model systems are, sometimes they can lead you to miss things."

The researchers hope to further explore how and where cellular memories are stored. Yoney suspects that they are recorded in cells' nuclei as modifications to the epigenome, which controls the way that cells read out their DNA. Additional research in this area could have major implications for understanding development in humans and other species.

Credit: 
Rockefeller University

Recessive genes explain only small fraction of undiagnosed developmental disorders

The Deciphering Developmental Disorders study has discovered that only a small fraction of rare, undiagnosed developmental disorders in the British Isles are caused by recessive genes. The study by researchers from the Wellcome Sanger Institute and their collaborators estimated that only five per cent of the patients had inherited a disease-causing gene mutation from both parents, far fewer than previously thought. This will guide research and could lead to a better understanding of the risk for future pregnancies.

Published in Science today (8 November 2018), the study also revealed two new recessive genetic conditions, enabling clinicians to give diagnoses to the families involved and helping inform families with these genetic problems in the future.

Every year, thousands of babies are born who have differences in their genetic make up that stop them from developing normally. Most of these children have intellectual disabilities and many of them also have other problems such as seizures, autism and heart defects. The Deciphering Developmental Disorders (DDD) study aims to find diagnoses for children with as-yet unknown developmental disorders, and to understand the causes of these conditions.

Previous work had estimated that up to half the patients in the study had disorders due to new mutations, which are only present in the egg or sperm of one parent, rather than in the rest of that parent's cells. However, it was not known how many of the other disorders were due to recessive genes - where a patient needs to inherit a damaging genetic variant in the same gene from both parents in order to cause a disorder.

Working with clinicians as part of the DDD study, the researchers sequenced the genes from about 5,500 patients with these rare developmental disorders. They created a new method of analysing the data to identify recessive causes in known or as-yet undiscovered genes. Surprisingly, they found that recessive mutations only explain a small fraction of the disorders.
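The press release does not spell out the analysis, but one way such an estimate can be framed is to compare how many patients are observed to carry rare damaging variants on both copies of a gene with how many would be expected by chance given variant frequencies, attributing the excess to recessive causes. The sketch below uses entirely made-up numbers to illustrate that logic only; it is not the study's method or data.

```python
# Illustrative observed-vs-expected framing for a recessive contribution
# (all gene names, frequencies and counts are hypothetical placeholders).
n_patients = 5500

# Hypothetical per-gene cumulative frequencies of rare damaging alleles
allele_freqs = {"GENE_A": 0.004, "GENE_B": 0.002, "GENE_C": 0.001}

# Expected biallelic genotypes per patient if alleles pair up by chance (~q^2 per gene)
expected_per_patient = sum(q**2 for q in allele_freqs.values())
expected_cases = n_patients * expected_per_patient

observed_cases = 300   # hypothetical count of patients with biallelic genotypes
excess = observed_cases - expected_cases
print(f"expected by chance: {expected_cases:.1f}, observed: {observed_cases}")
print(f"estimated fraction with a recessive cause: {excess / n_patients:.1%}")
```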

Dr Hilary Martin, first author on the paper from the Wellcome Sanger Institute, said: "This study is the first unbiased estimate of the fraction of patients with recessive causes of rare developmental disorders. We found that only about five per cent of the patients in our study have disorders due to recessive genes, which is much lower than expected, and means there are probably other mechanisms we don't yet understand. Our result could also eventually lead to better personalised risk prediction for undiagnosed families thinking of having another child."

The different types of genetic causes give families very different risks for future children, so the information is key. Families with new mutations have only a very low risk of having further children with the developmental disorder. In contrast, parents who each carry a recessive mutation have a 25 per cent chance that any of their children will inherit two damaged copies of a gene and suffer from the condition, since each parent has a one-in-two chance of passing on their damaged copy.

The study identified two new recessive genes causing developmental disorders. The new diagnoses resulting from this discovery will provide explanations for the families involved, and help clinicians understand the disorders and identify them in any new patients.

Dr Jeff Barrett, senior author on the paper who carried out the work at the Wellcome Sanger Institute, said: "It is very important to families to find out what caused their child's developmental disorder. The families affected by these two new conditions will be able to contact and support each other, and the diagnoses could help guide possible treatments. The diagnoses also give the possibility for prenatal testing for the parents of these children to check future pregnancies."

The research revealed that new mutations and recessive genes together only accounted for about 55 per cent of the patients in the study, leaving the mechanisms for nearly half the cases still to be discovered.

Dr Matthew Hurles, leader of the Deciphering Developmental Disorders project from the Wellcome Sanger Institute, said: "This study was possible only because of the large number of patients and the expert clinicians we have been working with, and it showed that few patients had recessive causes of rare developmental diseases. This indicates that for many patients, more complicated genetic mechanisms may be involved. The results will allow us to target our research to benefit the maximum number of patients in our study, to help them and their families receive the diagnoses they need."

Credit: 
Wellcome Trust Sanger Institute

Study finds 'dual mobility' hip replacement implant reduces risk of dislocation

image: Dr. Westrich is director of research of the Adult Reconstruction and Joint Replacement Service at Hospital for Special Surgery.

Image: 
Hospital for Special Surgery

Hip replacement surgery is highly successful in relieving pain, restoring mobility and improving quality of life. More than 330,000 procedures are performed each year in the United States, and that number is expected to almost double by the year 2030.

As with all surgical procedures, the possibility of a complication exists, and dislocation is the most common problem. The risk of dislocation is higher in patients who have had a second hip replacement, known as revision surgery. Some people need a revision surgery many years after their first hip replacement when the original implant wears out. Hip instability after joint replacement is another reason a patient might need a revision surgery.

Research conducted by Dr. Geoffrey Westrich and colleagues at Hospital for Special Surgery and other joint replacement centers indicates that a newer type of artificial hip known as a "modular dual mobility" implant could be a good option for patients who need a revision surgery. Their study was presented at the annual meeting of the American Association of Hip and Knee Surgeons in Dallas this month.

"Although the concept of dual mobility was originally developed in France in the 1970s, the technology is relatively new in the United States," says Dr. Westrich, director of research of the Adult Reconstruction and Joint Replacement Service at HSS. "Our study found that the newer technology with modular dual mobility components offered increased stability, lowering the risk of dislocation, without compromising hip range of motion in patients having a revision surgery."

"Dual mobility" refers to the bearing surface of the implant - where the joint surfaces come together to support one's body weight. A hip replacement implant is a ball-in-socket mechanism, designed to simulate a human hip joint. Typical components include a stem that inserts into the femur (thigh bone), a ball that replaces the round head of the thigh bone, and a shell that lines the hip socket.

Modular dual mobility implants provide an additional bearing surface compared to a traditional implant. With the dual mobility hip, a large polyethylene plastic head fits inside a polished metal hip socket component, and an additional smaller metal or ceramic head is snap-fit within the polyethylene head.

"Currently, there are few large-scale outcome studies on the modular dual mobility device in revision hip replacement," Dr. Westrich noted. "We set out to determine the rate of dislocation and the need for another surgery following revision hip replacement using this implant and report on the functional outcomes."

The study included 370 patients who underwent revision hip replacement with the dual mobility implant between April 2011 and April 2017. The average patient age at the time of surgery was 65.8 years. Clinical, radiographic and patient-reported outcome information was collected.

To be included in the final report, patients needed to be seen for follow-up for at least two years after their surgery, and the average follow-up was 3.3 years. "At the latest follow-up, we found that surgery with the dual mobility implant resulted in a very low rate of instability for the revision patients, namely 2.9 percent, with good functional improvement and a low rate of reoperation," Dr. Westrich noted. "While longer-term follow-up is needed to fully assess the newer device, in our study there was clearly a benefit provided by the dual mobility implant in the first few years following revision surgery."

Credit: 
Hospital for Special Surgery

Mutant protein tackles DNA guardian to promote cancer development

image: Biomedical animation of a breast tumour

Image: 
Drew Berry, Walter and Eliza Hall Institute

Melbourne scientists have discovered how tumour development is driven by mutations in the most important gene in preventing cancer, p53.

The research revealed that in the early stages of cancer, mutant p53 'tackles' the normal p53 protein and blocks it from carrying out its protective role. As a result, p53 can no longer activate natural defences against cancer - such as the body's DNA repair process - increasing the risk of cancer developing.

The research was led by Dr Brandon Aubrey, Professor Andreas Strasser and Dr Gemma Kelly together with bioinformaticians Professor Gordon Smyth and Dr Yunshun Chen. The findings are published in this month's edition of Genes and Development.

At a glance

Researchers have discovered how mutations in p53 - found in half of all human cancers - drive cancer development.

The mutant p53 protein 'tackles' the normal protein and prevents it from carrying out its protective role, while permitting it to activate genes driving tumour growth.

The team is now examining whether the mutant p53 protein acts in the same way in established tumours, with important implications for cancer therapy.

Tackling DNA's guardian

p53 is known as the 'guardian of the genome' due to its role in protecting cells from cancer.

"p53 plays a critical role in many pathways that prevent cancer, such as repairing DNA or killing cells if they have irreparable DNA damage," Dr Kelly said.

"Genetic defects in p53 are found in half of all human cancers, but exactly how these changes disrupt p53 function has long been a mystery."

Dr Kelly said that there are normally two copies of the p53 gene in every cell.

"Early during cancer development, one copy of the gene may undergo a sudden and permanent change through mutation, while the other copy of the gene remains normal. This results in the cell making a mixture of normal and mutant versions of the p53 protein.

"We found that the mutant p53 protein can bind to and 'tackle' the normal p53 protein, blocking it from performing protective roles such as DNA repair. This makes the cell more likely to undergo further genetic changes that accelerate tumour development."

The team expected the mutant proteins to block all normal p53 activity, so the researchers were surprised to find that only certain p53-dependent pathways were affected.

"The mutant proteins are cunning: while they stop p53 from activating pathways that protect against cancer, they still allow p53 to activate pathways that promote tumour growth. p53's role in cancer is clearly more complicated than we had expected," Dr Kelly said.

A mystery resolved

Professor Strasser said the findings inform a longstanding debate about mutant p53.

"Scientists have been debating how mutant p53 contributes to the development of cancer for decades.

"One camp argues that mutant p53 acts by 'tackling' the normal protein and blocking its natural protective roles. The other camp argues that mutant p53 goes 'rogue' and performs new roles that promote tumour development."

"Our work clearly shows that during cancer development, the 'tackling' of normal p53 is most significant. This selectively disables certain but not all normal functions of p53," Professor Strasser said.

Clinical implications

The team is now investigating whether the same is true for established tumours, with important implications for drug treatments.

"Established tumours have often lost the normal copy of their p53 gene and only produce mutant p53 protein," Dr Kelly said.

"If mutant p53 acts by tackling normal p53, then it may no longer play a role in established tumours where no normal p53 is produced. This would mean that drugs that block mutant p53 would have no clinical benefit," she said.

"Conversely, if mutant p53 has new, cancer-promoting activities of its own in established tumours, then a drug that specifically blocks mutant p53 could be beneficial for treating thousands of patients."

Credit: 
Walter and Eliza Hall Institute

Satellite finds Tropical Cyclone 03S developing in Southern Indian Ocean

image: At 5:12 a.m. EDT (0912 UTC) on Nov. 6, the VIIRS instrument aboard NOAA's NOAA-20 satellite captured a visible image of Tropical Cyclone 03S in the Southern Indian Ocean.

Image: 
NASA/NRL

Tropical Cyclone 03S formed in the Southern Indian Ocean and the NOAA-20 satellite passed overhead and captured a visible image of the storm.

At 5:12 a.m. EDT (0912 UTC) on Nov. 6, the Visible Infrared Imaging Radiometer Suite (VIIRS) instrument aboard the NOAA-20 polar-orbiting satellite saw fragmented bands of thunderstorms around Tropical Storm 03S's center of circulation. The VIIRS image revealed that the bulk of clouds and storms were south of the center. The Joint Typhoon Warning Center noted, "The bulk of the deep convection remains concentrated southwestward of the low level circulation."

On Nov. 6 at 11 a.m. EDT (1500 UTC), 03S's maximum sustained winds were near 35 knots (40 mph/62 kph), making it a tropical storm. It was located approximately 641 nautical miles north of Port Louis, Mauritius, near 9.7 degrees south latitude and 58.9 degrees east longitude. 03S was moving west-southwestward and was expected to strengthen over the next several days.

NOAA-20 is the first in the JPSS series of satellites. JPSS is a collaborative program between the National Oceanic and Atmospheric Administration (NOAA) and its acquisition agent, the National Aeronautics and Space Administration (NASA). NOAA is responsible for managing and operating the JPSS program, and developing portions of the ground segment, while NASA is responsible for developing and building the JPSS instruments, spacecraft, and portions of the ground segment and providing launch services.

For more information about the JPSS series of satellites, visit: https://www.jpss.noaa.gov/

Credit: 
NASA/Goddard Space Flight Center

How invasive earthworm feces is altering US soils

Asian jumping earthworms are carving out territory all over the U.S. Midwest and East Coast, leaving in their wake changed soils that are just beginning to be studied.

The invasive earthworms are native to eastern Asia and are known by several names: Jumping worms, crazy worms, Alabama jumpers and snake worms. They have been making inroads into soils of North America since the 19th century, according to an information page by the Wisconsin Department of Natural Resources. They were first detected in Wisconsin in 2013.

"The way that these worms change the soil is something new. You can see very clearly that they have been there," said geoscientist Jenelle Wempner of the University of Wisconsin in Madison. "They leave little balls of soil. Imagine a soil surface covered with coffee grounds."

The little balls, or aggregates, left behind by the Asian earthworms are essentially worm feces, and they are a target of Wempner's research.

"A lot of soil I look at is worm poop," she said. How the worm droppings transform the soils they invade is important not only for understanding the effects of the worms on the land - like increasing erosion - but could potentially help in controlling the invasive worms.

"These soil aggregates lock up nutrients and chemically alter the soil composition," Wempner said. Her team used a scanning electron microscope to examine the minerals in the aggregates. Their preliminary data show a sharp increase in heavy metals (iron and aluminum) and nutrients like potassium and calcium in the aggregates, which makes them less accessible to plants.

They also examined the cocoons that the adult worms produce each year as they mature. They found that the cocoons have an outer layer with selective permeability - something that can be taken into account when developing chemical treatments to control worm populations.

"We have some answers," said Wempner, but it's just a start on studying the physical effects of the worms. "There is an adequate amount of support on ecological side of the research. But not so much on the physical science." Her work is supported by the University of Wisconsin.

Wempner will be presenting a poster about the ongoing research at the annual meeting of the Geological Society of America in Indianapolis, on Wednesday, November 7, 2018.

Credit: 
Geological Society of America

Enhanced views of Earth tectonics

Scientists from Germany's Kiel University and British Antarctic Survey (BAS) have used data from the European Space Agency's (ESA) Gravity field and steady-state Ocean Circulation Explorer (GOCE) mission to unveil key geological features of the Earth's lithosphere - the rigid outer layer that includes the crust and the upper mantle.

Published this week in the journal Scientific Reports, the study is a step forward in the quest to image the structure and setting of different continents using satellite gravity data, including Antarctica, the least understood piece of the whole plate tectonic puzzle. Satellite gravity provides a new tool to link the remote and ice-covered continent with the rest of the Earth. This improves our understanding of Antarctica's deep structure, which is particularly important, as the properties of its lithosphere can also influence the overlying ice sheets.

GOCE measures differences in horizontal and vertical components of the gravity field - known as gradients. These gradients can be complex to interpret and so the authors combined these to produce simpler 'curvature images' that reveal large-scale tectonic features of the Earth more clearly.
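As a rough illustration of how gradient components can be turned into curvature attributes, the sketch below computes a shape index and curvedness from the principal values of the horizontal gradient tensor. This is one common generic formulation, not necessarily the exact recipe used in the study, and the grid values are invented.

```python
# Generic curvature attributes from horizontal gravity-gradient components
# (one common formulation; the study's exact processing may differ).
import numpy as np

def curvature_attributes(txx, tyy, txy):
    """Shape index and curvedness from the 2x2 tensor [[Txx, Txy], [Txy, Tyy]]."""
    mean = 0.5 * (txx + tyy)
    diff = np.sqrt((0.5 * (txx - tyy))**2 + txy**2)
    k1, k2 = mean + diff, mean - diff              # principal values, k1 >= k2
    shape_index = (2.0 / np.pi) * np.arctan2(k1 + k2, k1 - k2)
    curvedness = np.sqrt(0.5 * (k1**2 + k2**2))
    return shape_index, curvedness

# Example on a small synthetic grid of gradients (values in Eotvos, made up)
txx = np.array([[1.2, -0.4], [0.3, 0.8]])
tyy = np.array([[-0.5, 0.9], [0.1, -0.2]])
txy = np.array([[0.2, 0.1], [-0.3, 0.4]])
si, c = curvature_attributes(txx, tyy, txy)
print("shape index:\n", si, "\ncurvedness:\n", c)
```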

Lead author Prof. Jörg Ebbing from Kiel University said:

"Our new satellite gravity gradient images improve our knowledge of Earth's deep structure. The satellite gravity data can be combined with seismological data to produce more consistent images of the crust and upper mantle in 3D. This is crucial to understanding how plate tectonics and deep mantle dynamics interact".

Fausto Ferraccioli, Science Leader of Geology and Geophysics at the British Antarctic Survey and co-author of the study, said,

"Satellite gravity is revolutionizing our ability to study the lithosphere of the entire Earth, including its least understood continent, Antarctica. In East Antarctica, for example, we now begin to see a more complex mosaic of ancient lithosphere provinces. GOCE shows us fundamental similarities but also unexpected differences between its lithosphere and other continents, to which it was joined until 160 million years ago".

The new study presents a view of the Earth's continental crust and upper mantle not previously achievable using global seismic models alone. The authors noted that, despite their similar seismic characteristics, there are contrasts in the gravity signatures of ancient parts of the lithosphere (known as cratons), indicating differences in their deep structure and composition. These features are important because, as the oldest cores of the lithosphere, they hold key records of Earth's early history.

Credit: 
British Antarctic Survey

Adolescent brain development impacts mental health, substance use

SAN DIEGO -- Advances in understanding adolescent brain development may aid future treatments of mental illness and alcohol and substance use disorders. The findings were presented at Neuroscience 2018, the annual meeting of the Society for Neuroscience and the world's largest source of emerging news about brain science and health.

Adolescence is a developmental period characterized by outsized risk-taking and reward-seeking behavior, including first alcohol and drug exposures, as well as the first emergence of symptoms such as depression and anxiety. And yet, much of the research on brain functions related to these conditions is performed on adults. As we gain a better understanding of adolescence-specific neurological causes of these conditions and behaviors, we increase the potential for early treatments and for interventions even before serious symptoms emerge.

Today's new findings show that:

A variant in an opioid receptor gene in the brain reduces the natural reward response in young adolescents before they have started using alcohol or other substances, indicating carriers of this genetic variant may be more susceptible to addiction (John W. VanMeter, abstract 281.06).

Childhood trauma impacts the development of critical brain networks during adolescence, elevating the risk for alcohol abuse (Sarita Silveira, PhD, abstract 645.04).

The strength of connections between the brain's reward and anti-reward systems corresponds to the severity of several important psychiatric symptoms in adolescents, including anxiety and depression (Benjamin Ely, abstract 320.11).

"The neuroscience advances presented today help expand our understanding of the connections between adolescent brain development and mental health issues, including alcohol and substance use," said press conference moderator Jay Giedd, MD, of the University of California, San Diego, who conducts research on the biological basis of cognition, emotion, and behavior with an emphasis on the teen years. "These advances provide potential new methods to identify young people who have biological susceptibility to addiction and mental illnesses, so we can implement intervention strategies even before problems emerge."

Credit: 
Society for Neuroscience

Researchers identify promising proteins for diagnostic, prognostic use in ALS

Researchers from North Carolina State University have identified proteins that may be useful in both earlier diagnosis of Amyotrophic Lateral Sclerosis (ALS) and in more accurate disease prognosis.

ALS, often referred to as Lou Gehrig's disease, is a progressive, neurodegenerative disease that affects the brain and spinal cord. Currently, there is no effective treatment or cure.

"The current average time from symptom onset to diagnosis for ALS patients is 1 to 1 ½ years," says Michael Bereman, assistant professor in biological sciences at NC State, leader of the proteomics core at NC State's Center for Human Health and the Environment (CHHE) and lead author of a paper describing the work. "A lot of time and money gets spent by patients during that time period. And once a patient has a diagnosis, the tools we currently have to monitor disease progression are very subjective."

Bereman, who was diagnosed with ALS in 2015, knows firsthand the shortcomings of the current methods of diagnosis and prognosis. So he and his NC State colleagues decided to look for biomarkers in ALS patients that could potentially speed diagnosis and give doctors a more accurate picture of disease progression.

The team obtained samples of cerebrospinal fluid (CSF) and blood plasma from 33 ALS patients and 30 healthy individuals. "We were looking for differences in protein abundance," Bereman says. "Essentially, if there were higher or lower levels of a particular protein in the ALS fluids, it was considered a protein of interest."

Using mass spectrometry, they identified over 1,000 different proteins in the fluids, then used advanced machine learning techniques to develop models that consisted of multiple proteins. "It was the combination of proteins that really improved upon the diagnostic and prognostic value of any single protein," Bereman says. Models developed from proteins found in CSF proved more useful than those from blood plasma. However, Bereman notes the potential clearly exists for an assay in blood plasma, which would be less invasive.
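A minimal sketch of the idea of combining many protein abundances into a single diagnostic model is shown below, assuming a per-sample table of measured intensities. The data are synthetic placeholders, and the simple penalized logistic regression is only a stand-in for whatever machine learning methods the team actually used.

```python
# Sketch: combine many protein abundances into one diagnostic classifier
# (synthetic placeholder data; not the study's pipeline or proteins).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_als, n_ctrl, n_proteins = 33, 30, 1000

# log2 abundances; a handful of proteins are shifted in the ALS group
X = rng.normal(0.0, 1.0, size=(n_als + n_ctrl, n_proteins))
X[:n_als, :5] += 1.0
y = np.array([1] * n_als + [0] * n_ctrl)

clf = make_pipeline(StandardScaler(), LogisticRegression(penalty="l2", max_iter=1000))
auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
print(f"cross-validated AUC: {auc.mean():.2f} +/- {auc.std():.2f}")
```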

Bereman selected two proteins that looked promising for both diagnostic and prognostic applications, then conducted further analysis to validate their usefulness as biomarkers. The proteins, chitinase-3-like 1 and alpha-1-antichymotrypsin, are associated with immune-system activation in the brain and thus could also be used as an objective way to measure the effectiveness of current therapies directed at tempering this pathway. Interestingly, immune-system activation is also known to play a role in other neurodegenerative diseases, such as Parkinson's and Alzheimer's, indicating the assays could potentially be used in these diseases as well.

"Our goal is to create a panel of protein targets that could give doctors a quicker path to diagnosis for ALS patients, as well as an objective way to measure disease progression, or to test the efficacy of new drugs," Bereman says. "Our next steps will be to look at changes in these proteins and their signaling pathways over time in fluids that have been longitudinally collected from ALS patients."

Credit: 
North Carolina State University

Study could help explain how childhood stress contributes to anxiety, depression

SAN DIEGO - New research could help explain why stress early in life can create vulnerabilities to mood and anxiety disorders later on.

The study, led by researchers at The Ohio State University, was presented Nov. 5 in San Diego at the annual Society for Neuroscience meeting, and highlights the important role of mast cells.

"These are immune cells involved in allergic reactions that historically were largely ignored by neuroscientists, but now we're finding in rodent models they could be responsible for some of the changes we see in neurodevelopment after a childhood trauma," said Kathryn Lenz, the study's senior author and an assistant professor of psychology at Ohio State.

Study lead author Angela Saulsbery said that she's especially interested in how this research might begin to draw molecular-level connections between adverse childhood experiences and adolescent and adult depression and anxiety.

"Mast cells may also be a viable target for prophylactic drugs that help prevent these psychological disorders in children who experience these traumatic events," Saulsbery said.

The researchers compared stressed rats to unstressed rats, and also compared animals based on sex. And they looked at the effects of prenatal stress, a single stressor after birth and chronic stress.

"We found that stress at different times had different effects - chronic exposure to stress is where we saw the significant differences in mast cell activity in the brain," Lenz said, adding that those animals had 30 percent more of the immune cells than their unstressed counterparts.

Chronic stress in the animals included being left alone without their mothers for periods of time.

Male animals had more mast cells overall, which is interesting because there is evidence that, in humans, males may be more vulnerable to serious problems stemming from early childhood trauma, Lenz said.

This new work in the Lenz lab focused on determining whether stress contributes to a more permeable blood-brain barrier, which might explain the surge in mast cells. These cells release histamine, a chemical usually associated with allergic responses, which could potentially alter brain development.

"Our lab is interested in early life stressors and how they impact mast cell function. We're trying to better understand how exposure to adverse childhood experiences might lead to problems later in life," said Lenz, who is part of the Institute for Behavioral Medicine Research at Ohio State's Wexner Medical Center.

"These childhood traumas, such as living in an abusive home or being neglected, can contribute to a wide array of problems down the road, including drug and alcohol addiction, depression and anxiety and even cardiovascular disease," she said.

Credit: 
Ohio State University

Molecular virologist fights influenza at the molecular level

image: 1918 "Spanish" flu

Image: 
UAB

BIRMINGHAM, Ala. - Molecular virologist Chad Petit, Ph.D., uses basic science to fight influenza -- through experiments at the atomic level.

This includes a deadly poultry influenza virus in China called the H7N9 avian flu virus. Since 2013, H7N9 has infected 1,625 people, killing 623. While not highly contagious for humans, just three mutations could change that, turning H7N9 into the feared Disease X, the term health experts use for the next unknown cause of a worldwide epidemic.

In research to improve influenza therapies against H7N9 and other influenza strains, Petit and his University of Alabama at Birmingham colleagues have detailed the binding site and mechanism of inhibition for two small-molecule experimental inhibitors of influenza viruses. Their report is published in the Journal of Biological Chemistry.

The two experimental inhibitors studied by Petit, a UAB assistant professor of biochemistry and molecular genetics, are small molecules whose precise mechanism of action was unknown. The inhibitors target the function of a key influenza protein called NS1, which has multiple roles to block the body's immune response during influenza infection. Thus, NS1 is essential to the survival and adaptability of the influenza virus.

Petit and colleagues used nuclear magnetic resonance, or NMR, spectroscopy to probe interactions of the inhibitors with NS1. They first showed that the inhibitors -- called A9 and A22 -- interacted with just one of the two independently folded domains of NS1, the NS1 effector domain.

The researchers noted that the structures of both small-molecule inhibitors were very similar to a fragment of a host protein called CPSF30 that the NS1 effector domain binds in order to short-circuit the body's immune response. Therefore, the researchers hypothesized that A9 and A22 block influenza viral replication and block NS1 function by interfering with the interaction between the NS1 effector domain and CPSF30.

NMR data revealed the particular amino acids of the NS1 effector domain that are involved in inhibitor binding. The researchers -- using two significantly different NS1 proteins from distinct influenza strains, including the H7N9 strain -- showed that similar sequences of amino acids in the two NS1 proteins were involved in inhibitor binding.
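One standard way NMR data are used to pinpoint binding-site residues is through chemical shift perturbations: backbone amide shifts of the free and inhibitor-bound protein are compared, and residues whose combined 1H/15N change is unusually large are flagged. The sketch below illustrates that calculation with invented residues and values; it is not the authors' actual data or necessarily their exact workflow.

```python
# Illustrative chemical-shift-perturbation (CSP) calculation for mapping a
# binding site; residue names and shift changes are hypothetical placeholders.
import numpy as np

# (residue: (delta 1H ppm, delta 15N ppm)) between free and inhibitor-bound NS1 ED
shifts = {
    "K110": (0.02, 0.10),
    "I117": (0.08, 0.45),
    "Q121": (0.12, 0.60),
    "G183": (0.01, 0.05),
}

def csp(d_h, d_n, n_scale=0.14):
    """Combined 1H/15N perturbation using a common nitrogen weighting factor."""
    return np.sqrt(d_h**2 + (n_scale * d_n)**2)

values = {res: csp(dh, dn) for res, (dh, dn) in shifts.items()}
cutoff = np.mean(list(values.values())) + np.std(list(values.values()))
perturbed = [res for res, v in values.items() if v >= cutoff]
print("residues implicated in inhibitor binding:", perturbed)
```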

The 1918 "Spanish" flu NS1 protein

Besides the Chinese H7N9 NS1, the other NS1 protein tested was the NS1 effector domain from the 1918 "Spanish" flu, which infected one-third of the world's population a century ago and killed 50 million to 100 million people.

The UAB researchers then used X-ray crystallography, led by UAB Microbiology assistant professor Todd Green, Ph.D., to determine the three-dimensional structure of the NS1 effector domain from the 1918 "Spanish" flu. This allowed them to map the A9/A22-binding site onto that structure, which confirmed their hypothesis -- A9 and A22 interact with the NS1 effector domain hydrophobic pocket that is known to bind the host protein CPSF30.

The crystallography data also showed that the NS1 effector domain is able to dimerize, using an interface different from two other known dimers of the NS1 effector domain. Biological significance of this new dimer form is unknown.

"Altogether, our findings provide strong evidence for the mechanism of action of two anti-influenza compounds that target NS1, and the findings contribute significant structural insights into NS1 that we hope will promote and inform the development and optimization of influenza therapies based on A9 and A22," Petit said.

The need for novel antiviral compounds is great. Each year, influenza strains kill 250,000 to 500,000 people worldwide, and the virus is noted for quick changes to produce pandemic strains that few people have immunity against. Viral resistance has limited the effectiveness of several earlier antiviral compounds that were developed to treat influenza.

Credit: 
University of Alabama at Birmingham