Culture

Study explains why some creams and cosmetics may cause a skin rash

Allergic reactions in the skin can be caused by many different chemical compounds found in creams, cosmetics, and other topical consumer products, but how they trigger the reaction has remained somewhat mysterious.

A new study suggests the way some chemicals displace natural fat-like molecules (called lipids) in skin cells may explain how many common ingredients trigger allergic contact dermatitis, and encouragingly, suggests a new way to treat the condition.

The study, led by researchers at Columbia University Irving Medical Center, Brigham and Women's Hospital, and Monash University, was published online today in Science Immunology.

Why some chemicals trigger dermatitis is a mystery

Poison ivy is a commonly known trigger for allergic contact dermatitis, an itchy skin rash. But many ingredients found in nonprescription topical products can trigger a similar type of rash.

An allergic reaction begins when the immune system's T cells recognize a chemical as foreign. T cells do not directly recognize small chemicals, and research suggests that these compounds need to undergo a chemical reaction with larger proteins in order to become visible to T cells.

“However, many small compounds in skincare products that trigger allergic contact dermatitis lack the chemical groups needed for this reaction to occur,” says study co-leader Annemieke de Jong, PhD, assistant professor of dermatology at Columbia University Vagelos College of Physicians and Surgeons.

"These small chemicals should be invisible to T cells, but they're not."

Skin cells unmask allergy-inducing chemicals

De Jong and her colleagues suspected that CD1a, a molecule that’s abundant on Langerhans cells (immune cells in the skin’s outer layer), might be responsible for making these chemicals visible to T cells.

In the current study, conducted with human cells in tissue culture, the researchers found that several common chemicals known to trigger allergic contact dermatitis were able to bind to CD1a molecules on the surface of Langerhans cells and activate T cells.

These chemicals included Balsam of Peru and farnesol, which are found in many personal care products, such as skin creams, toothpaste, and fragrances. Within Balsam of Peru, the researchers identified benzyl benzoate and benzyl cinnamate as the chemicals responsible for the reaction, and overall they identified more than a dozen small chemicals that activated T cells through CD1a.

“Our work shows how these chemicals can activate T cells in tissue culture, but we have to be cautious about claiming that this is definitively how it works in allergic patients,” de Jong says. “The study does pave the way for follow up studies to confirm the mechanism in allergic patients and design inhibitors of the response.”

New ideas for treatment

The CD1a molecule normally binds the skin's own naturally occurring lipids within its tunnel-like interior. These lipids protrude from the tunnel, creating a physical barrier that prevents CD1a from interacting with T cells.

Structural work done at Monash University showed that farnesol, one of the allergens identified in this study, can hide inside the tunnel of CD1a, displacing the natural lipids that normally protrude from the CD1a molecule. “This displacement makes the CD1a surface visible to the T cells, causing an immune reaction,” de Jong says.

This discovery raises the possibility that allergic contact dermatitis could be stopped by applying competing lipids to the skin to displace those triggering the immune reaction. “From previous studies, we know the identity of several lipids that can bind to CD1a but won’t activate T cells,” she says.

Currently, the only way to stop allergic contact dermatitis is to identify and avoid contact with the offending chemical. Topical ointments can help soothe the rashes, which usually clear up in less than a month. In severe cases, physicians may prescribe oral corticosteroids, anti-inflammatory agents that suppress the immune system, increasing the risk of infections and other side effects.

Credit: 
Columbia University Irving Medical Center

A possible path to improved bone-repair procedures

image: Heparin microparticles used to deliver bone morphogenetic protein are shown, in red, binding to a defective femur in a rat. The heparin-based mix kept the biomaterial used in the treatment localized to only the targeted area.

Image: 
Image courtesy of Marian Hettiaratchi

EUGENE, Ore. - Researchers are moving closer to a new approach for improving spinal fusion procedures and repairing broken or defective bones that avoids an over-production of bone that commonly occurs in current treatments.

In a preclinical study, researchers significantly reduced undesired bone growth outside of targeted repair areas in rat femurs by delivering a potent bone-forming protein called bone morphogenetic protein, or BMP, using a new biomaterial made from heparin.

A six-member research team - led by Marian H. Hettiaratchi, a bioengineer in the Phil and Penny Knight Campus for Accelerating Scientific Impact at the University of Oregon - described the approach in a paper published in the Jan. 3 issue of the online journal Science Advances.

Hettiaratchi began exploring the use of heparin microparticles to deliver BMP as a possible way to stop abnormal bone growth more than five years ago while a doctoral student at the Georgia Institute of Technology under the mentorship of co-authors Robert Guldberg and Todd McDevitt.

The traditional approach of using high doses of BMP alone has led to numerous complications in humans, including soft tissue inflammation and abnormal ossification.

For the new study, Hettiaratchi and colleagues fed their earlier results from experiments done in both rats and test tubes into computer simulations to explore ways to adjust their heparin-based approach in animal testing with levels of BMP comparable to dosages required in human bone-repair procedures.

"We focused on using doses that were more clinically relevant. In humans, the typical treatment uses 0.1 to 0.2 milligrams of BMP per kilogram of body weight, so we used the same amount in the rats," Hettiaratchi said. "Most research done in rats uses 10 times less BMP to repair bone, which isn't comparable to what's done in humans and doesn't exhibit the side effects of a clinical BMP dose."

Two different strengths of the combination were used, resulting in 40 to 50 percent reductions in abnormal ossification. The heparin microparticles contain heparin's long-chain linear polysaccharides, whose sulfate groups drive stronger binding affinity to BMP.

The heparin and BMP, mixed in an alginate hydrogel, were injected into a nanofiber mesh tube - created in Guldberg's lab to isolate a repair area and unveiled in Biomaterials in 2011 - already inserted into femoral defects in the rats. Human medical practice has relied on high doses of BMP injected into a collagen sponge, which leads to abnormal ossification in surrounding soft tissue as BMP rapidly escapes the sponge.

The findings represent a proof-of-concept for fine-tuning the approach rather than a route into clinical testing in humans, Hettiaratchi said. The eventual goal, she said, is to create synthetic heparin-like microparticles that achieve the same results while avoiding potential side effects of heparin.

"The problem with healing large bone defects clinically is that the BMP delivered using collagen sponges results in abnormal bone formation because the drug doesn't stay on the material," Hettiaratchi said. "Our new material retains much more of the BMP, keeping it localized. You don't get bone formation outside the targeted area."

Hettiaratchi joined the UO after completing a postdoctoral fellowship at the University of Toronto. Guldberg joined the UO's Knight Campus as executive director in August 2018. McDevitt is now in San Francisco, affiliated with the Gladstone Institute of Cardiovascular Disease and the University of California, San Francisco.

At Toronto, Hettiaratchi began pursuing the development of a synthetic material to localize protein delivery that would avoid potential side effects from heparin, a widely used anticoagulant that prevents blood clots. None of heparin's long list of known side effects has been seen in the rats, she noted. Another potential problem is that heparin's numerous sulfate groups might bind to other proteins not related to bone repair.

Ideally, she said, a synthetic heparin-like drug could be engineered to only bind to BMP. Such work will be the initial focus in her UO lab, which will open in early 2020.

Credit: 
University of Oregon

Supercharging tomorrow: the world's most efficient lithium-sulphur battery

image: Associate Professor Matthew Hill, Dr. Mahdokht Shaibani and Professor Mainak Majumder.

Image: 
Monash University

Monash University researchers have developed the world's most efficient lithium-sulphur battery, capable of powering a smartphone for five continuous days.

Prototype cells have been developed in Germany, with further testing in cars and solar grids to take place in Australia in 2020.

Researchers have filed a patent on the manufacturing process, which could help Australia capture a larger share of the lithium value chain.

Imagine having access to a battery that could power your phone for five continuous days, or enable an electric vehicle to drive more than 1,000 km without needing to "refuel".

Monash University researchers are on the brink of commercialising the world's most efficient lithium-sulphur (Li-S) battery, which could outperform current market leaders by more than four times, and power Australia and other global markets well into the future.

Dr Mahdokht Shaibani from Monash University's Department of Mechanical and Aerospace Engineering led an international research team that developed an ultra-high capacity Li-S battery that has better performance and less environmental impact than current lithium-ion products.

The researchers have filed a patent application (PCT/AU 2019/051239) for their manufacturing process, and prototype cells have been successfully fabricated by German R&D partner the Fraunhofer Institute for Material and Beam Technology.

Some of the world's largest manufacturers of lithium batteries in China and Europe have expressed interest in upscaling production, with further testing to take place in Australia in early 2020.

The study was published in Science Advances on Saturday, 4 January 2020 - the first research on Li-S batteries to feature in this prestigious international publication.

Professor Mainak Majumder said this development was a breakthrough for Australian industry and could transform the way phones, cars, computers and solar grids are manufactured in the future.

"Successful fabrication and implementation of Li-S batteries in cars and grids will capture a more significant part of the estimated $213 billion value chain of Australian lithium, and will revolutionise the Australian vehicle market and provide all Australians with a cleaner and more reliable energy market," Professor Majumder said.

"Our research team has received more than $2.5 million in funding from government and international industry partners to trial this battery technology in cars and grids from this year, which we're most excited about."

Using the same materials as in standard lithium-ion batteries, researchers reconfigured the design of sulphur cathodes so they could accommodate higher stress loads without a drop in overall capacity or performance.

Inspired by unique bridging architecture first recorded in processing detergent powders in the 1970s, the team engineered a method that created bonds between particles to accommodate stress and deliver a level of stability not seen in any battery to date.

Strong performance, along with lower manufacturing costs, an abundant supply of material, ease of processing and a reduced environmental footprint, makes this new battery design attractive for future real-world applications, according to Associate Professor Matthew Hill.

"This approach not only favours high performance metrics and long cycle life, but is also simple and extremely low-cost to manufacture, using water-based processes, and can lead to significant reductions in environmentally hazardous waste," Associate Professor Hill said.

Credit: 
Monash University

Study: US presidents play surprising role in driving corporate social responsibility

A new study by San Francisco State University Assistant Professor of Management Nara Jeong suggests that CEOs look to the White House for leadership on social responsibility -- but not the way you might expect. It turns out that corporate leaders are less likely to act on their values when they're in agreement with the president. And their social responsibility efforts increase when they don't agree with the leadership of the commander in chief.

Jeong studies CEO behavior and corporate social responsibility, which is defined in her latest research -- examining a decade of behavior starting in the mid-1990s -- as actions that "further some social good, beyond the interests of the firm and that which is required by law." She and the study's co-author found that liberal CEOs invest more in socially conscious activities, such as diversity initiatives and environmental conservation, when they feel those values are threatened.

"Republican presidents aren't as interested in those values, so business leaders think, 'We need to do more to promote and protect these values,'"Jeong said.

Conversely, when business leaders shared the same political beliefs as the president, support for socially conscious initiatives dropped. For left-leaning CEOs, who are more likely to engage in socially responsible activities, those efforts fell by an average of 18 percent, Jeong says.

Business leaders with the same political orientation as the president may have an expectation that the government "will deliver on the social values they hold dear," the study reported. As a result, these executives may feel empowered to focus more on their companies' financial performance, Jeong adds.

Jeong and her collaborator went a step further and tested whether politics encouraged companies to act irresponsibly. Examples could include increasing pollution, lowering emission standards or doing away with policies that protect minority employees. Yet Jeong found no evidence that firms engaged in such activities based on whether their politics were aligned or misaligned with the president.

To conduct their study, the researchers looked to Kinder, Lydenberg and Domini (KLD) -- an index that rates the social investments companies make. Categories KLD measures include environment, community involvement, product safety, excessive compensation of executives and diversity. They examined the activities of 752 CEOs between 1994 and 2005.

Next, they turned to the Federal Election Commission to track the CEOs' political donations over 10 years, a period that covers two presidential elections and several congressional election cycles. This helped them determine the political tendencies of each CEO. They also tracked whether the president was a Democrat or a Republican.

Jeong was surprised by her findings. "You think that the people who are committed to social responsibility will stay committed regardless of the context," she said. "[CEOs] may change their stance if the context changes."

Jeong wrote "The effects of political orientation on corporate social (ir)responsibility" with Lehman College Assistant Professor of Business and Economics Nari Kim. The study appeared in the journal Management Decision in November.

Credit: 
San Francisco State University

Yale study urges lifesaving drug treatment to combat Ukraine's HIV epidemic

New Haven, Conn. -- A new study led by Yale University researchers finds that scaling up use of methadone and buprenorphine -- medications for treating opioid use disorder known as opioid agonist therapies (OATs) -- could greatly reduce HIV transmission rates and prevent deaths in Ukraine, where the disease is epidemic among people who inject drugs.

The study was published in The Lancet.

Annual new HIV infections in Ukraine -- home to Eastern Europe and Central Asia's second largest HIV epidemic -- rose from 9,500 in 2010 to 12,000 in 2018, according to the study. New infections are likely to increase by approximately 60,000 over 10 years without additional interventions.

The researchers found that treating at least 20% of people with opioid use disorder who inject drugs -- the minimum recommended by the World Health Organization -- could, over 10 years, prevent more than 10,000 new HIV infections and nearly 18,000 deaths.

Currently, only 2.7% of people who inject drugs in Ukraine receive OATs, despite their proven effectiveness.
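
The gap between current and recommended coverage is straightforward to quantify; a minimal sketch using the two figures above:

```python
# Coverage figures taken from the article.
current_coverage = 0.027  # share of people who inject drugs receiving OAT now
target_coverage = 0.20    # WHO-recommended minimum

print(f"required scale-up: {target_coverage / current_coverage:.1f}x")  # ~7.4x
```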

"Opioid agonist treatments are one of the most effective treatments for opioid use disorder and preventing HIV infections," said co-author Lynn Madden, a Yale postdoctoral associate in internal medicine and head of a foundation focused on substance use disorders and mental illness. "In addition to treating opioid dependence, it substantially reduces drug use and injection frequency, lowers HIV transmission rates, and prevents death, including death due to overdose," she said.

Senior author Alexei Zelenev, Yale associate research scientist in medicine, said the healthcare system in Ukraine needs modernization, and HIV testing needs to be expanded, as only 56% of the population with HIV are aware of their infected status.

"High prevalence in people who inject drugs, criminalization of drug users, large injection networks, and suboptimal access to evidence-based treatment for opioid use disorder contribute to ongoing HIV transmission," he said.

Researchers obtained HIV epidemic profiles and regional data -- including OAT treatment -- for 23 regions of Ukraine. Their mathematical model evaluated the efficiency of current OAT treatment programs and assessed the effect of expanding those programs to treat 20% of the drug-injecting population.

Taking into account regional differences, the study showed that scaling up OAT in regions with large populations of people who inject drugs -- like Dnipropetrovsk, Odessa, and Kyiv -- would lead to the greatest reductions in infections and death, but that smaller regions not covered by the U.S. President's Emergency Plan for AIDS Relief (PEPFAR) remain highly vulnerable to HIV outbreaks and need to be considered when allocating resources.

PEPFAR is the U.S. government's response to the global HIV/AIDS epidemic.

Scaling up OAT programs requires initiative on several fronts, said Zelenev.

In addition to expanding capacity at existing treatment sites, he said that expansion of addiction treatment into primary care clinics, as well as through take-home pharmacy prescriptions, can offer pathways for increased access to effective treatment.

"The expansion of OAT has not been adequate," he said.

Amid the ongoing military conflict with Russia, Ukraine faces a difficult financial situation that exacerbates the public health crisis.

Frederick Altice, professor of medicine, epidemiology and public health at Yale, and a co-author, said the study reveals the importance of scaling up evidence-based treatments to prevent new HIV infections and death.

"Ukraine is a major country in the Eastern European and Central Asian region, the only region globally where new HIV infections and HIV-related deaths remain increasing," he said. "Findings from this study have important implications for other countries throughout the region where the HIV epidemic is similar. In nearby Russia, new HIV infections and deaths are increasing faster than in any other country in the region due to their complete bans on OATs -- one of the greatest HIV prevention tools we have available to us."

Credit: 
Yale University

Brassica crops best for crop rotation and soil health in potato production systems

image: Prevalent soilborne potato diseases

Image: 
Robert Larkin

Crop rotation is vital to any crop production system. Rotating crops maintains crop productivity and soil health by replenishing organic matter, nutrients, soil structure, and other properties while also improving water management and reducing erosion. Rotating crops also reduces the buildup of soilborne pathogens and diseases.

When implementing a crop rotation system, growers should consider crop type, rotation length, and crop sequence. In the webcast "Crop Rotation and Soil Health in Potato Production Systems," Robert Larkin summarizes the results of years-long crop management strategy studies conducted on potato fields in Maine to determine the most effective management practices.

Crop rotation moderates soilborne diseases in multiple ways. It breaks the host-pathogen cycle by replacing the host plant with a nonhost plant, and it stimulates microbial activity, diversity, and beneficial soil organisms. Crop rotation can also directly inhibit pathogens, either by stimulating microorganisms that are antagonistic to pathogens or through inhibitory compounds produced by the rotation crop itself.

For potato, specifically, Larkin recommends a 3-year rotation (or longer) with conservation tillage. Growers should grow a disease-suppressive crop, such as a Brassica crop or Sudangrass, prior to potato, and a cover crop, such as winter rye or ryegrass, following the rotation crop. He also recommends using a compost amendment to improve organic matter, soil properties, water availability, and yield.

Credit: 
American Phytopathological Society

Sustainable supply of minerals and metals key to a low-carbon energy future

image: Cobalt miner operating in the DRC.

Image: 
University of Sussex.

The global low-carbon revolution could be at risk unless new international agreements and governance mechanisms are put in place to ensure a sustainable supply of rare minerals and metals, a new academic study has warned.

The amount of cobalt, copper, lithium, cadmium, and rare earth elements needed for solar photovoltaics, batteries, electric vehicle (EV) motors, wind turbines, fuel cells, and nuclear reactors will likely grow at a rapid pace in the upcoming years. Even if alternatives are found for one metal, there will be reliance on another as the scope of possibilities is inherently limited by physical and chemical properties of elements.

However, with global supplies often heavily monopolized by a single country, confronted by social and environmental conflict, or concentrated in poorly functioning markets, there is a real possibility that a shortage of minerals could hold back the urgent need for a rapid upscaling of low-carbon technologies. In some cases, markets are providing misleading signals to investors that can lead to poor decisions. In other cases, the countries or regions supplying minerals are politically unstable.

In a new paper published in Science today [January 3], an international team of researchers makes a number of recommendations to help manage the demand for such low-carbon technology minerals, limit the environmental and public health damage of their extraction and processing, support social benefits, and ensure the benefits are shared more universally and equitably.

Benjamin K. Sovacool, Professor of Energy Policy at the University of Sussex, said: "Mining, metals, and materials extraction is the hidden foundation of the low-carbon transition. But it is far too dirty, dangerous, and damaging to continue on its current trajectory.

"The impacts to mining rightfully alarm many environmental campaigners as a large price to pay to safeguard a low-carbon future. But as the extraction through terrestrial mining becomes more challenging, the on-land reserves of some terrestrial minerals dwindle or the social resistance in some countries escalates, even oceanic or even space based mineral reserves will become a plausible source."

Although the new study calls for renewed attention to the existing conditions of terrestrial extraction and processing of metals, it also notes important prospects for cobalt and nickel on the continental shelf within states' Exclusive Economic Zones as well as on the outer continental shelf regions.

Within international waters, metallic nodules found in the vast Clarion-Clipperton Zone of the Pacific, as well as cobalt and tellurium crusts found on seamounts worldwide, provide some of the richest deposits of metals for green technologies.

But minerals in more pristine and distinctive ecosystems near hydrothermal vents should remain off-limits for mineral extraction for the foreseeable future, the researchers add.

Morgan Bazilian, Professor and Director of the Payne Institute for Public Policy, Colorado School of Mines, said: "As the global energy landscape changes, it is becoming more mineral and metal intensive. Thus, the sustainability and security of material supply chains is essential to supporting the energy transition. How we shape that pathway will have important consequences for everything from the environment, to development, and geopolitics."

The study's authors also recommend:

Enhance and coordinate international agreements on responsible mining and traceability in order to establish mineral supply justice.

Greatly expand the recycling and reuse of rare minerals to extend the lifetimes of products and stretch out reserves.

Diversify mineral supply scale to incorporate both small and large-scale operations while allowing miners to have control over mineral revenue through stronger benefit sharing mechanisms and access to markets.

Focus development donor policies to recognize the livelihood potential of mining in areas of extreme poverty rather than just regulating the sector for tax revenues.

Stipulate stronger Extended Producer Responsibility for products that use valuable rare minerals. This can ensure that responsibility for the entire lifespan of a product, including at the end of its usefulness, shifts from users or waste managers to major producers such as Apple, Samsung, and Toshiba.

Actively incorporate materials security of essential minerals and metals into formal climate planning, including establishing a list of "critical minerals" for energy security (as is already done to some degree by the European Union and the United States).

Saleem Ali, Blue and Gold Distinguished Professor of Energy and the Environment at the University of Delaware, said: "Our analysis is aimed at galvanizing international policy-makers to include mineral supply concerns for green technologies in climate change negotiations. We need to build on the resolution on mineral governance passed at the United Nations Environment Assembly in 2019 and operationalize a clear action plan on supply chain security for a low carbon transition."

Benoit Nemery, Emeritus Professor at the Centre for Environment and Health at KU Leuven, said: "Let's not achieve a low-carbon future at the expense of mineworkers and public health."

Factfile - The expected rising demands for a decarbonized future

Between 2015 and 2050, the global EV stock needs to jump from 1.2 million light-duty passenger cars to 965 million passenger cars.

For the same period, battery storage capacity needs to climb from 0.5 gigawatt-hours (GWh) to 12,380 GWh, while the amount of installed solar photovoltaic capacity must rise from 223 GW to more than 7,100 GW.

Another research study has predicted that, between 2015 and 2060, demand for materials will increase by 87,000% for EV batteries, 1,000% for wind power, and 3,000% for solar cells and photovoltaics.
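
A minimal sketch that back-computes the growth multipliers implied by the 2015-2050 figures above:

```python
# 2015 and 2050 figures as given in the factfile.
projections = {
    "EV stock (million cars)": (1.2, 965),
    "battery storage (GWh)": (0.5, 12_380),
    "solar PV capacity (GW)": (223, 7_100),
}
for name, (v2015, v2050) in projections.items():
    print(f"{name}: {v2015} -> {v2050} (~{v2050 / v2015:,.0f}x growth)")
```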

Credit: 
University of Sussex

Research offers new way to assess an organization's public relations

Communication and marketing experts place great weight on an organization's relationship with its public stakeholders, and a new tool allows organizations to better measure and describe the nature of these relationships.

"Traditionally, these relationships are measured using questionnaires, which provide only a static snapshot of how one party viewed an organization," says Yang Cheng, co-author of a paper on the work and an assistant professor of communication at North Carolina State University. "But questionnaires don't account for the organization's role in shaping the relationship, nor do questionnaires account for the dynamic nature of relationships.

"Our tool, called Contingent Organization-Public Relationships (COPR), accounts for both of those factors, and can help our field better understand both how and why relationships change over time. The COPR, as a toolkit, can be applied to evaluate relationships in not only positive and cooperative environments but also during conflicts or crises."

The COPR framework assesses relationships based on the stance of the organization on a given subject and the stance of the relevant publics on the same subject, with the understanding that each side will adopt a stance that best serves its interest. The stances are measured on a continuum that runs from "aggressive" to "accommodating."

The COPR can use these stances to describe a relationship as belonging to one of six well-defined categories. For example, if both parties have taken an aggressive stance, they have a "competing" relationship. But if one party is aggressive and the other party is accommodating, they have a "capitulating" relationship.
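
As a rough illustration of how such a classification could be automated, here is a hypothetical sketch; the numeric stance scale, the threshold, and the handling of the four modes not named in this article are assumptions for illustration only.

```python
# Hypothetical COPR-style classifier. Stances are placed on an assumed
# -1 (fully accommodating) .. +1 (fully aggressive) continuum; only the
# "competing" and "capitulating" modes are named in the article.

def stance_label(stance: float, threshold: float = 0.5) -> str:
    """Bucket a continuous stance score into a coarse label."""
    if stance >= threshold:
        return "aggressive"
    if stance <= -threshold:
        return "accommodating"
    return "intermediate"

def classify_relationship(org_stance: float, public_stance: float) -> str:
    labels = {stance_label(org_stance), stance_label(public_stance)}
    if labels == {"aggressive"}:
        return "competing"       # both parties aggressive
    if labels == {"aggressive", "accommodating"}:
        return "capitulating"    # one aggressive, one accommodating
    return "one of the four other COPR modes (not named here)"

print(classify_relationship(0.8, 0.9))   # -> competing
print(classify_relationship(0.8, -0.7))  # -> capitulating
```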

"We can determine each party's stance by mining datasets such as public discourse on social media, organizational actions, such as news releases or blog posts, and so on," Cheng says. "And COPR allows us to see how these relationships evolve in response to changing circumstances, such as during a concerted marketing push or after a crisis."

To demonstrate COPR's utility, the researchers conducted an analysis of the Red Cross in China from 2011 to 2014, as the organization grappled with a crisis concerning its credibility with Chinese audiences.

The paper, "Examining six modes of relationships in a social-mediated crisis in China: an exploratory study of contingent organization-public relationships (COPR)," is published in the Journal of Applied Communication Research. The paper was co-authored by Glen Cameron of the University of Missouri. The work was done with support from the Center for the Digital Globe and the School of Journalism at the University of Missouri.

Credit: 
North Carolina State University

Fewer offspring due to territorial conflicts

image: Territorial conflicts between neighboring groups have negative effects on gestating females and thus on unborn offspring.

Image: 
Liran Samuni

Humans and chimpanzees can both be extremely territorial, and territorial disputes between groups can turn violent, with individuals killing each other. In humans, such between-group competition can escalate to war and devastating loss of human life. Researchers from the Max Planck Institute for Evolutionary Anthropology studied wild Western chimpanzees to find out whether territorial behavior may have shaped counter-strategies. One important strategy that is evident in both humans and chimpanzees, but rare in the rest of the animal kingdom, is the capacity to work together in order to achieve a goal - for example, to defend a territory - even with individuals who are not one's kin.

The researchers tested whether the effects of territoriality - the pressure that neighboring groups exert on each other on one side, and the competitive capacity of a group on the other side - impact female reproductive success. Reproductive success is a measure of how many of one's genes pass into the next generation and therefore how much of an influence one's traits have on subsequent generations. Using long-term data on four neighboring chimpanzee communities that span several decades of these animals' lives, the researchers show that between-group competition has negative effects on wild female chimpanzees' reproductive success. Competition between groups seems to have a selective impact and could have helped shape associated traits in this species.

"We developed a new index of neighbor pressure that reflects the danger of intrusion by neighboring groups into one's territory", explains Sylvain Lemoine, first author of this study. "We show that high neighbor pressure during the time when females are supposed to resume reproduction is associated with a delay in reproduction, leading to longer intervals between births. We also show that having many males in a group is advantageous and speeds up reproduction".

The researchers also provide an extensive survival analysis and demonstrate that high neighbor pressure during pregnancy, but not during lactation, is associated with a reduced likelihood of offspring survival, suggesting that between-group competition has negative effects on gestating females and thus on unborn offspring. Groups of chimpanzees compete for space that encompasses important food resources, so a likely explanation of these findings is that females experience more stress when between-group competition is high, for example due to a loss of territory leading to nutritional deficiency, or due to direct exposure to neighbor group encounters, which is known to trigger stress responses in this species.

"These physiological mechanisms remain to be examined, as well as the potential efficiency of in-group cooperation to reduce the received pressure from neighbors, such as cooperative border patrols regularly observed in wild chimpanzees", adds Catherine Crockford, one of the senior authors of the study. "For highly territorial species, including humans, these findings shed light on how between-group competition could have acted as a selective pressure favoring the evolution of particular traits, such as group-level cooperation with non-kin, and how this could have shaped our ancestors", concludes senior author Roman Wittig. This study provides evidence for the underlying forces that could have shaped group cooperation in our ancestors by providing fitness advantages for those who are able to cooperate.

Credit: 
Max Planck Institute for Evolutionary Anthropology

Research identifies changes in neural circuits underlying self-control during adolescence

image: A study examining the relationship between structural and functional brain connectivity in 727 participants ages 8-23 years old revealed marked remodeling of structure-function coupling during youth.

Image: 
Graham Baum

PHILADELPHIA -- The human brain is organized into circuits that develop from childhood through adulthood to support executive function--critical behaviors like self-control, decision making, and complex thought. These circuits are anchored by white matter pathways that coordinate the brain activity necessary for cognition. However, little research exists to explain how white matter matures to support the activity that allows for improved executive function during adolescence--a period of rapid brain development.

Researchers from the Lifespan Brain Institute of the Perelman School of Medicine at the University of Pennsylvania and Children's Hospital of Philadelphia applied tools from network science to identify how anatomical connections in the brain develop to support neural activity underlying these key areas. The findings were published in the Proceedings of the National Academy of Sciences.

"By charting brain development across childhood and adolescence, we can better understand how the brain supports executive function and self-control in both healthy kids and those with different mental health experiences," said the study's senior author Theodore Satterthwaite, MD, an assistant professor of Psychiatry at Penn. "Since abnormalities in developing brain connectivity and deficits in executive function are often linked to the emergence of mental illness during youth, our findings may help identify biomarkers of brain development that predict cognitive and clinical outcomes later in life."

In this study, the researchers mapped structure-function coupling--the degree to which a brain region's pattern of anatomical connections supports synchronized neural activity. This can be thought of as a highway system, where the anatomical connections are the roads and the functional connections are the traffic flowing along them. Researchers mapped and analyzed multi-modal neuroimaging data from 727 participants ages 8 to 23 years, and three major findings emerged.
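
One common way to quantify this kind of coupling is to correlate, region by region, a structural connectivity profile with the corresponding functional connectivity profile. A minimal sketch follows, with random matrices standing in for real diffusion and fMRI connectomes; the study's actual pipeline may differ.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n_regions = 100

# Stand-ins for real connectomes: structural (diffusion) and functional
# (fMRI) connectivity matrices, here synthetic and loosely related.
structural = np.abs(rng.normal(size=(n_regions, n_regions)))
functional = structural + rng.normal(scale=0.5, size=(n_regions, n_regions))

# Per-region coupling: rank correlation between a region's structural
# connectivity profile and its functional connectivity profile.
coupling = np.array([
    spearmanr(structural[i], functional[i])[0] for i in range(n_regions)
])
print(f"mean regional structure-function coupling: {coupling.mean():.2f}")
```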

First, the team found that regional variability in structure-function coupling was inversely related to the complexity of the function a given brain area is responsible for. Higher structure-function coupling was found in parts of the brain that are specialized for processing simple sensory information, like the visual system. In contrast, there was lower structure-function coupling in complex parts of the brain that are responsible for executive function and self-control, which require more abstract and flexible processing.

Results showed that structure-function coupling also aligned with known patterns of brain expansion over the course of primate evolution. Previous work comparing human, ape, and monkey brains has shown that sensory areas like the visual system are highly conserved across primate species and have not expanded much during recent evolution. In contrast, association areas of the brain, such as the prefrontal cortex, have expanded dramatically over the course of primate evolution. This expansion may have allowed for the emergence of uniquely complex human cognitive abilities. The team found that the brain areas which expanded rapidly during evolution had lower structure-function coupling, while simple sensory areas that have been conserved in recent evolution had higher structure-function coupling.

Researchers also found that structure-function coupling increased throughout childhood and adolescence in complex frontal brain regions. These are the same regions that tend to have lower baseline structure-function coupling, are expanded compared to monkeys, and are responsible for self-control. The prolonged development of structure-function coupling in these regions may allow for improved executive function and self-control that develops into adulthood. Indeed, the team found that higher structure-function coupling in the lateral prefrontal cortex--a complex brain area which plays important roles in self-control--was associated with better executive function.

"These results suggest that executive functions like impulse control--which can be particularly challenging for children and adolescents--rely in part on the prolonged development of structure-function coupling in complex brain areas like the prefrontal cortex," explained lead author Graham Baum, PhD, a postdoctoral fellow at Harvard University, who was a Penn neuroscience PhD student during the time of the research. "This has important implications for understanding how brain circuits become specialized during development to support flexible and appropriate goal-oriented behavior."

Credit: 
University of Pennsylvania School of Medicine

Using gene therapy to treat chronic traumatic encephalopathy

image: The first peer-reviewed journal in the field of human gene therapy, providing all-inclusive coverage of the research, methods, and clinical developments that are driving today's explosion of gene therapy advances.

Image: 
Mary Ann Liebert, Inc., publishers

New Rochelle, NY, January 3, 2020--A new study shows the feasibility of using gene therapy to treat the progressive neurodegenerative disorder chronic traumatic encephalopathy (CTE). The study, which demonstrated the effectiveness of direct delivery of gene therapy into the brain of a mouse model of CTE, is published in Human Gene Therapy, a peer-reviewed journal from Mary Ann Liebert, Inc., publishers. Click here to read the full-text article free on the Human Gene Therapy website through February 3, 2020.

Ronald Crystal and colleagues from Weill Cornell Medical College, New York, NY, coauthored the article entitled "Anti-Phospho-Tau Gene Therapy for Chronic Traumatic Encephalopathy."

There is currently no treatment for CTE, which is caused by repeated trauma to the central nervous system (CNS), such as that suffered by soldiers, athletes in contact sports, and accident victims. The resulting inflammation leads to the accumulation of hyperphosphorylated forms of Tau protein (pTau). Crystal et al. developed an adeno-associated virus (AAV) vector to deliver an anti-pTau antibody to the CNS. They showed that delivery of AAVrh.10anti-pTau directly into the hippocampus of brain-injured mice was associated with a significant reduction in pTau levels across the CNS. They propose that doses could be scaled up and this strategy could be effective in humans as well.

"CTE is much more prevalent than was initially realized, and there is currently no therapy available," says Editor-in-Chief Terence R. Flotte, MD, Celia and Isaac Haidak Professor of Medical Education and Dean, Provost, and Executive Deputy Chancellor, University of Massachusetts Medical School, Worcester, MA. "This new work from the Crystal laboratory is potentially ground-breaking as a means to remove the offending Tau phoshoprotein."

Credit: 
Mary Ann Liebert, Inc./Genetic Engineering News

Kids twice as likely to eat healthy after watching cooking shows with healthy food

audio: A new study found kids who watched a child-oriented cooking show featuring healthy food were 2.7 times more likely to make a healthy food choice than those who watched a different episode of the same show featuring unhealthy food.

Image: 
Journal of Nutrition Education and Behavior

Philadelphia, January 8, 2020 - Television programs featuring healthy foods can be a key ingredient in leading children to make healthier food choices now and into adulthood.

A new study in the Journal of Nutrition Education and Behavior, published by Elsevier, found kids who watched a child-oriented cooking show featuring healthy food were 2.7 times more likely to make a healthy food choice than those who watched a different episode of the same show featuring unhealthy food.

Researchers asked 125 10- to 12-year-olds, with parental consent, at five schools in the Netherlands to watch 10 minutes of a Dutch public television cooking program designed for children, and then offered them a snack as a reward for participating. Children who watched the healthy program were far more likely to choose one of the healthy snack options - an apple or a few pieces of cucumber - than one of the unhealthy options - a handful of chips or a handful of salted mini-pretzels.
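
To see how a figure like "2.7 times more likely" can arise from such a design, here is a sketch with invented counts; the article reports only the final ratio, so the numbers below are hypothetical.

```python
# Hypothetical counts for illustration only; not the study's raw data.
def healthy_rate(healthy_choices: int, group_size: int) -> float:
    """Share of a group that chose a healthy snack."""
    return healthy_choices / group_size

rate_healthy_episode = healthy_rate(40, 60)    # watched the healthy episode
rate_unhealthy_episode = healthy_rate(16, 65)  # watched the unhealthy episode

print(f"relative likelihood: {rate_healthy_episode / rate_unhealthy_episode:.1f}x")
```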

"The findings from this study indicate cooking programs can be a promising tool for promoting positive changes in children's food-related preferences, attitudes, and behaviors," said lead author Frans Folkvord, PhD, of Tilburg University,Tilburg, Netherlands.

This study was conducted at the children's schools, which could represent a promising alternative venue for teaching children healthy eating behaviors. Prior research has found youth are more likely to eat nutrient-rich foods, including fruits and vegetables, if they are involved in preparing the dish, but modern reliance on ready-prepared foods and a lack of modeling by parents in preparing fresh foods have led to a drop in cooking skills among kids.

"Providing nutritional education in school environments instead may have an important positive influence on the knowledge, attitudes, skills, and behaviors of children," Dr. Folkvord said.

This study indicates that the visual prominence of healthier options -- in both food choice and portion size -- on TV cooking programs leads young viewers to crave those healthier choices and then act on those cravings.

The effect that exposure to healthier options has on children is strongly influenced by personality traits. For example, children who don't like new foods are less likely to show a desire for healthier choices after watching a TV program featuring healthier foods than children who enjoy trying new foods. As they grow older, though, children start to feel more responsible for their eating habits and can fall back on information they learned earlier. Researchers believe this may indicate that watching programs with healthier options can still have a positive impact on children's behavior, even if that impact is delayed.

"Schools represent the most effective and efficient way to reach a large section of an important target population, which includes children as well as school staff and the wider community," Dr. Folkvord commented. "Positive peer and teacher modeling can encourage students to try new foods for which they exhibited distaste previously."

Poor dietary habits during childhood and adolescence have multiple negative effects on several health and wellness indicators, including achievement and maintenance of healthy weights, growth and development patterns, and dental health.

"The likelihood of consuming fruits and vegetables among youth and adults is strongly related to knowing how to prepare most fruits and vegetables. Increased cooking skills among children can positively influence their consumption of fruit and vegetables in a manner that will persist into adulthood," Dr. Folkvord added.

Credit: 
Elsevier

Versatile bile acids

Could bile acids--the fat-dissolving juices churned out by the liver and gallbladder--also play a role in immunity and inflammation?

The answer appears to be yes, according to two separate Harvard Medical School studies published in Nature.

The findings of the two studies, both conducted in mice, show that bile acids promote the differentiation and activity of several types of T cells involved in regulating inflammation and linked to intestinal inflammatory conditions. They also reveal that gut microbes are critical for converting bile acids into immune-signaling molecules.

The work suggests possible therapeutic pathways for modulating intestinal inflammation, a process that underlies the development of autoimmune conditions such as inflammatory bowel disease, commonly referred to as IBD.

The first study, led by immunologist Jun Huh and published Nov. 27 in Nature, reveals that bile acids exert their immune-modulating effect by interacting with immune cells in the gut. Once bile acids leave the gallbladder and complete their fat-dissolving duties, they make their way down the digestive tract, where they are modified into immune-regulatory molecules by gut bacteria. The modified bile acids then activate two classes of immune cells: regulatory T cells (Tregs) and effector helper T cells, specifically Th17 cells, each responsible for modulating the immune response by either curbing or promoting inflammation.

Under normal conditions, the levels of proinflammatory Th17 cells and anti-inflammatory Treg cells balance each other, maintaining a degree of protection against pathogens without causing too much tissue-damaging inflammation. These cells play a key role in the context of intestinal infection. Th17 cells ignite inflammation to quell the infection, while Tregs curb inflammation once the threat has subsided. Unrestrained, the activity of Th17 can also lead to aberrant inflammation that promotes autoimmune disease and damages the intestine.

In their experiments, the researchers took undifferentiated, or naïve, mouse T cells and exposed them to various bile acid metabolites one at a time. The experiments showed that two separate bile acid molecules exerted different effects on T cells--one molecule promoted Treg differentiation, while another molecule inhibited Th17 cell differentiation. When the researchers administered each molecule to mice, they observed that the animals' Th17 and Treg cell levels fell and rose accordingly. Additionally, the researchers found that the two bile acid byproducts are also present in human stool, including stool from people with IBD--a finding that suggests the same mechanism may be at play in humans.

"Our findings identify an important regulatory mechanism in gut immunity, showing that microbes in our intestines can modify bile acids and turn them into regulators of inflammation," said Huh, assistant professor of immunology in the Blavatnik Institute at HMS.

If affirmed in further studies, the results can inform the development of small-molecule therapies that target Treg and Th17 cells as a way to control inflammation and treat autoimmune diseases affecting the gut.

The second study, published Dec. 25 in Nature and led by Dennis Kasper, focused on a subset of inflammation-taming regulatory T cells, or Tregs, that arise in the colon as a result of exposure to gut microbes. By contrast, most other T cells originate in the thymus.

Low levels of colonic regulatory T cells (colonic Tregs) have been linked to the development of autoimmune conditions such as IBD and Crohn's disease.

Kasper's experiments demonstrate that gut microbes and diet work in concert to modify bile acids, which in turn affect the levels of colonic Tregs in mice. They also show that low levels of Treg cells, induced by a lack of bile acids or a deficiency in bile acid sensors, make animals prone to developing inflammatory colitis--a condition that mimics human IBD.

To test the hypothesis that gut bacteria convert bile acids produced in response to food into immune-signaling molecules, the researchers silenced bile acid-converting genes in various gut microbes and then introduced both the modified and unmodified microbes into mice specially bred to have germ-free guts. Animals whose guts were populated by microbes lacking bile acid-converting genes had notably lower levels of Treg cells. The researchers then fed the animals either nutrient-rich meals or minimal food.

Animals with normal microbe populations in their guts that were receiving minimal food had lower levels of colonic Tregs and lower bile acid levels than mice eating rich food. Yet animals with germ-free guts receiving rich food also had low levels of Treg cells--a finding which shows that both gut microbes and food-derived bile acids are required to modulate immune cell levels.

To test whether bile acids are directly involved in immune cell regulation, the researchers then mixed various bile acid molecules with the drinking water of animals that had low Treg cell levels and minimal diets. Several weeks later, these animals had an increase in the levels of inflammation-curbing Treg cells.

In a final step, the researchers gave three groups of mice a compound that induces colitis. One group was fed a minimal diet, another group received nutrient-rich meals and a third group received minimal food and drank water supplemented with bile acid molecules. As expected, only mice fed minimal diets not supplemented by bile acid molecules developed colitis. The experiment confirmed that bile acids play a critical role in Treg regulation, intestinal inflammation and colitis risk.

"Our results demonstrate an elegant three-way interaction between gut microbes, bile acids and the immune system," said Kasper, who is professor of immunology in the Blavatnik Institute at HMS and the William Ellery Channing Professor of Medicine at HMS and Brigham and Women's Hospital. "Importantly, our work suggests it is plausible to think of harnessing certain gut bacteria as a way to modulate disease risk."

Credit: 
Harvard Medical School

Structured, salary-only compensation plan for physicians is a model for pay equity

ROCHESTER, Minn. -- Gender pay equity in the field of medicine remains elusive. Gender-based pay differences have been shown to persist, even when controlling for experience, clinical productivity, academic rank and other factors. These inequities result in significantly lower lifetime earnings, job burnout and negative attitudes toward work, and adverse effects on the profession and society.

One model for eliminating pay disparities among physicians is a structured, salary-only plan that incorporates national benchmarks, and standardized pay steps and increments, such as the plan that is used at Mayo Clinic.

A Mayo Clinic study set out to assess how well the institution adheres to its own compensation model and achieves pay equity. The study reviewed data for all permanent staff physicians employed at Mayo Clinic in Arizona, Florida and Minnesota who were in clinical roles as of January 2017. Each physician's pay, demographics, specialty, full-time equivalent status, benchmark pay, leadership roles and other factors were collected and analyzed.

Among 2,845 physicians, pay equity was affirmed in 96% of the cases, according to the analysis, which is published in Mayo Clinic Proceedings. All physicians whose salaries were not in the predicted range were evaluated further and found to have the appropriate compensation, most often due to unique or blended departmental appointments. Of the 80 physicians -- 2.8% of the total -- with higher compensation than predicted by the model, there was no correlation with gender, race or ethnicity. The same was true of the 35 physicians -- 1.2% -- who had lower-than-predicted compensation.
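
A hedged sketch of the audit logic described above: compare each physician's actual salary against the salary the structured model predicts, and flag anyone outside a tolerance band for further review. The step formula, tolerance, and records here are illustrative assumptions, not Mayo Clinic's actual model.

```python
# Illustrative only: the real compensation model's benchmarks, steps,
# and tolerances are not published in this article.
def predicted_salary(benchmark: float, step: int, increment: float = 0.02) -> float:
    """Structured pay: national benchmark plus standardized step increases."""
    return benchmark * (1 + increment) ** step

def needs_review(actual: float, predicted: float, tolerance: float = 0.05) -> bool:
    """Flag salaries falling outside the predicted range."""
    return abs(actual - predicted) / predicted > tolerance

physicians = [
    {"id": "A", "benchmark": 300_000, "step": 3, "actual": 318_000},
    {"id": "B", "benchmark": 300_000, "step": 3, "actual": 360_000},
]
for p in physicians:
    pred = predicted_salary(p["benchmark"], p["step"])
    status = "review further" if needs_review(p["actual"], pred) else "within range"
    print(f"physician {p['id']}: predicted {pred:,.0f}, actual {p['actual']:,}: {status}")
```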

"Our analysis is unique and to our knowledge the first to demonstrate that a structured compensation model achieved equitable physician compensation by gender, race and ethnicity, while also meeting the practice, education and research goals of a large academic medical center such as Mayo Clinic," says Sharonne Hayes, M.D., a Mayo Clinic cardiologist and the study's first author. "The analysis of this long-standing salary-only model was reassuring, not only that it was equitable, but that we as an organization adhere to our own standards."

A structured compensation program has been used for physician salaries at Mayo Clinic for more than 40 years to remove financial incentives to do more than is necessary or less than desired for the patient. The step-based model is designed to ensure that salaries are market-competitive; advance efforts to recruit and retain staff; and support the mission, vision and values of the organization. There are no incentives or bonus pay, and nonsalary compensation and benefits are consistent across Mayo Clinic locations and specialties.

Of the 2,845 physicians whose compensation was analyzed, 861 were women and 722 were nonwhite. More men than women held one of the compensated leadership positions or had past leadership roles -- 31.4% of men were in that category, compared with 15.9% of women -- and more men than women were in the highest compensated specialties.

The study calls for health care organizations to systematically define the drivers and incentives of physician compensation, to assess whether those drivers unfairly exclude or disadvantage certain groups -- whether women, racial or ethnic minorities, or certain medical specialties -- and then to develop processes that can achieve equity and values alignment.

"While solutions to persistent pay inequities are different for each organization, leadership must be committed to addressing those inequities by identifying and consistently tackling biases," says Gianrico Farrugia, M.D., president and CEO, Mayo Clinic, and a study co-author. "Furthermore, absolute gender pay equity will only be realized when women achieve parity in the most highly compensated specialties and leadership roles."

Credit: 
Mayo Clinic

Health ranks as top concern for veterans immediately after military service

image: In this US Air Force image, an airman reunites with his daughter following a deployment. A new study by Veterans Affairs researchers and colleagues has found that health ranks as the top concern for veterans newly separated from military service.

Image: 
USAF Airman 1st Class Ericka Woolever

In the months after separating from military service, most veterans are less satisfied with their health than with their work or social relationships, found a study by Veterans Affairs researchers. While the veterans surveyed were mostly satisfied with their work and social well-being, a majority were dealing with chronic physical health conditions and a third reported chronic mental health conditions.

According to Dr. Dawne Vogt of the VA Boston Healthcare System and Boston University, lead author on the study, the results highlight the importance of addressing veterans' health concerns early.

"What remains to be seen is whether those veterans with health conditions--which were more commonly experienced by deployed veterans--continue to maintain high levels of well-being in other life domains over time," she says. "Given that it is well-established that health problems can erode functioning in other life domains, it may be that these individuals experience declines in their broader well-being over time."

The results appear Jan. 2, 2020, in the American Journal of Preventive Medicine.

More than 200,000 U.S. service members transition out of military service each year. Researchers have pointed to the early transition period as a critical time to address challenges veterans may face in readjusting to civilian life.

To investigate which of these challenges are most pressing to newly separated veterans, researchers from the VA National Center for PTSD and colleagues surveyed almost 10,000 veterans from a population-based roster of all separating service members.

All participants left the military in the fall of 2016. Veterans were surveyed about three months after their separation, and then six months after that.

The researchers found that the biggest concern was health. At both three and nine months after leaving the military, 53% of participants said they had chronic physical health conditions. About 33% reported chronic mental health conditions at both time points.

The most commonly reported health conditions were chronic pain, sleep problems, anxiety, and depression. Slightly more than half of participants said they had reduced satisfaction with their health between when they first left the military and a few months later. Health satisfaction did not change much between three and nine months after separation.

While physical and mental health was a concern for many veterans, most reported high vocational and social well-being. The majority of participants said they were satisfied with their work and social relationships and that they were functioning well in these areas. According to Vogt, the fact that most participants had high work and social satisfaction "highlights the resilience of the veteran population, and should provide some reassurance to those concerned about the well-being of newly separated veterans."

More than three-quarters of participants said they were in an intimate relationship in the months after they left the military. Almost two-thirds reported that they had regular contact with their friends and extended family and that they were involved in their broader communities.

Over half of participants had found work three months after military separation. While most participants reported high work satisfaction, the study group showed an overall decline in work functioning over the first year after military separation. Functioning declined even though overall employment rates increased. The researchers hypothesized that this decline in work functioning could be due to health concerns, which are known to erode broader well-being over time.

The study also found differences in well-being based on other factors. Enlisted veterans showed consistently poorer health, vocational, and social well-being than officers. Veterans who had deployed to a war zone had more health concerns than veterans who did not deploy.

There were also several differences between men and women. Male veterans were more likely to be employed than female veterans both three and nine months after leaving the military. Men were also more likely to report hearing conditions, high blood pressure, and high cholesterol. Women were more likely to endorse mental health conditions at nine months post-separation. They also reported more depression and anxiety at both timepoints.

The researchers have shared their findings with the VA Transition Assistance Program (TAP), which helps veterans transition back to civilian life. The program is jointly managed by VA and the departments of Defense and Labor, in coordination with the departments of Education and Homeland Security, as well as the U.S. Office of Personnel Management and the U.S. Small Business Administration. According to Vogt, the results could help TAP and other programs that assist veterans with readjustment decide how to allocate their resources. Vogt writes that the findings "suggest that maybe we don't need as much focus on promoting employment right now, and need more emphasis on treatment of mental/physical health conditions."

The researchers say their findings have implications not only for VA but for the wide spectrum of organizations nationwide--more than 40,000 in all--that provide programs, services, and support for veterans making their transition back to civilian life. Historically, much of the support for veterans leaving the military has primarily focused on providing employment and educational assistance and informing veterans of their benefits. But the findings suggest that veterans' health concerns should be prioritized, says Vogt. Interventions should also target at-risk subgroups of veterans. The researchers concluded that addressing newly separated veterans' health concerns could promote their broader well-being and longer-term readjustment.

Vogt points out the importance of addressing veterans' readjustment challenges before they worsen and have a chance to erode broader well-being. She says this may require re-evaluating support methods. "Given that most transition support is targeted to veterans with the most acute or chronic concerns," she says, "this recommendation may require rethinking how veteran programs prioritize their efforts. While it makes sense to target resources to those with greatest need, it is better to support individuals before their concerns become chronic when we can."

Work is underway to expand on this study using the same study group. The research team is analyzing how veterans' health and well-being changes in the second and third year after leaving service, as well as how veterans' initial health status impacts their subsequent well-being in other areas.

Credit: 
Veterans Affairs Research Communications