Culture

Study finds no gender discrimination when leaders use confident language

PULLMAN, Wash. - People tend to listen to big talkers, whether they are women or men. Yet women are less likely than men to favor assertive language, according to a new study led by Washington State University economist Shanthi Manian.

The study, published in Management Science on Feb. 17, found that participants in an experiment more often followed advice when the people giving it used assertive "cheap talk": statements that cannot be verified as true. Such statements are common in job seekers' cover letters, for example, "I have extremely strong problem-solving skills."

The experiment participants followed the advice at similar rates regardless of the leader's gender -- even though they thought other people would be less likely to follow the advice of the female leaders.

"It was surprising. We didn't see actual discrimination: the subjects themselves seemed to respond about the same to men and women," said Manian. "Yet, after the experiment was over, and we asked the participants what they thought we'd find, many of them expected discrimination."

For the experiment, Manian and Ketki Sheth, an economist at University of California, Merced, recruited about 1,000 people to play a difficult online game. The players were paired with either a male or female leader, who gave advice online on how to play the game to earn the biggest reward.

Everyone had the same interactions with their leaders, except for two features--the gender of the leader and the assertiveness of the leaders' language. The same advice was couched in language that ranged from less assertive, using statements such as "You probably have better problem-solving skills than I do, but here is what I am thinking," to more assertive, such as "If you listen to my advice, I can assure you that my skills and experiences will help you perform well in this game."

Of the roughly 1,000 players, about half were UC Merced students and half were recruited through Amazon's Mechanical Turk online crowdsourcing portal.

While all the leaders gave the same good advice, the more assertive a leader's statements were, the more likely people were to follow them.

However, the researchers found that the participants did not discriminate by gender, regardless of the language the leader used. The people paired with a female leader who used assertive cheap talk were just as likely to follow the advice as those paired with a male leader using the same language -- even though most participants characterized such language as more masculine.

Previous research has found that people who violate gender norms may be punished or face backlash. In this experiment, however, the participants were just as likely to listen to the female leaders and found them no less likeable.

After the game was played, the researchers asked the participants whether they thought the leaders' gender would matter. People were much more likely to believe that others would follow the advice from a male leader than a female leader, even if the advice was identical.

They were also asked what type of language they would choose if they were leaders in the game. While most men and women avoided the least assertive language, men were ten percentage points more likely than women to prefer the most assertive option.

Sheth said the results raise questions about why women prefer less assertive language and what price they may pay for that aversion.

"The fact that the subjects expected discrimination suggests that it's hard for people to know when discrimination is going to happen," said Sheth.

Credit: 
Washington State University

Understanding cellular clock synchronization

image: Mice without a brain clock lose the synchronization between the different organs, as shown in the bioluminescence profile (right). In the liver, however, synchronization is maintained.

Image: 
© UNIGE

Circadian clocks, which regulate the metabolic functions of all living beings over a period of about 24 hours, are among the most fundamental biological mechanisms. In humans, their disruption causes many metabolic diseases, such as diabetes and serious liver diseases. Although scientists have been studying this mechanism for many years, little is known about how it works. Using an observation tool based on bioluminescence, a research team from the University of Geneva (UNIGE) was able to demonstrate that the cells composing a particular organ can remain in phase even in the absence of the central brain clock or of any other clock in the body. Indeed, the scientists managed to restore circadian function in the liver of completely arrhythmic mice, demonstrating that neurons are not unique in their ability to coordinate. The results are published in the journal Genes & Development.

For a long time, the scientific community considered that circadian rhythms were entirely controlled by a central clock located in the brain, before discovering, a few years ago, that every cell of the body contains a small molecular clock. "Nevertheless, the brain clock was deemed indispensable for the synchronization of all peripheral clocks," says Ueli Schibler, honorary professor at the UNIGE Faculty of Science, who initiated the work. "However, the usual research tools did not allow us to test this hypothesis. To do so, we must be able to follow in real time, over a relatively long period, the expression of the circadian genes of an animal with or without a functional brain clock," explains Flore Sinturel, researcher at the Department of Medicine of the UNIGE Faculty of Medicine and first author of the work.

Bioluminescence to study circadian rhythms

As early as 2013, Professor Schibler's team developed a completely new technology, now commercially available, which makes it possible to monitor the activity of a specific organ and the circadian rhythms that control it. "We were inspired by the principle of bioluminescence that can be observed in fireflies, for example," he explains. "Our mice carry a circadian reporter gene that produces an enzyme, luciferase. We then add luciferin, a substance which, when oxidized by the luciferase, causes photon emission, to their drinking water." Light is then captured by a photomultiplier that records the number of photons emitted per minute and thus tracks the expression of the circadian reporter gene over time.
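The photon-count recordings described above are, in essence, noisy time series carrying a roughly 24-hour oscillation. As a purely illustrative sketch (not the team's actual analysis code; all names and numbers below are hypothetical), the amplitude and peak time of such a rhythm can be estimated with a simple cosine fit:

```python
import numpy as np

def fit_phase(counts, minutes, period_h=24.0):
    """Least-squares fit of a cosine with a fixed 24 h period.

    Returns (amplitude, phase_h), where phase_h is the time of peak
    expression in hours after the start of the recording.
    """
    t = minutes / 60.0                      # convert to hours
    w = 2 * np.pi / period_h
    # Linear model: counts ~ m + a*cos(w*t) + b*sin(w*t)
    X = np.column_stack([np.ones_like(t), np.cos(w * t), np.sin(w * t)])
    m, a, b = np.linalg.lstsq(X, counts, rcond=None)[0]
    amplitude = np.hypot(a, b)
    phase_h = (np.arctan2(b, a) / w) % period_h
    return amplitude, phase_h

# Simulated two-day recording (one count per minute) peaking 6 h after start
minutes = np.arange(0, 2 * 24 * 60, 1.0)
rng = np.random.default_rng(0)
counts = (500
          + 120 * np.cos(2 * np.pi * (minutes / 60.0 - 6.0) / 24.0)
          + rng.normal(0, 10, minutes.size))
amp, phase = fit_phase(counts, minutes)
print(round(phase, 1))   # ~6.0 h, recovering the simulated peak time
```

Fixing the period and fitting only amplitude and phase keeps the problem linear, so a single least-squares solve suffices.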

Liver clock cells are phase-coupled without receiving timing cues

After the central clock was removed, the scientists observed that the clocks in the body fell out of phase with one another. At the level of a single organ -- the liver in this case -- however, the mice retained a robust and coordinated circadian rhythmicity. So, while the central clock is needed to keep all the organs in the same phase, the cells of a single organ communicate enough to maintain a coordinated rhythmicity among themselves. "While it was thought that only neurons had strong enough connections to ensure this circadian coordination, we are now demonstrating that this is not the case," says Flore Sinturel. "This puts the singularity of the central clock into perspective."
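The degree to which cells of one organ stay "in phase" can be summarized with a synchronization index such as the Kuramoto order parameter, which is 1 when all cells peak together and 0 when their phases are scattered. A minimal, purely illustrative sketch (the phase values are made up, not study data):

```python
import numpy as np

def sync_index(phases_h, period_h=24.0):
    """Kuramoto order parameter for a set of circadian peak times (hours)."""
    angles = 2 * np.pi * np.asarray(phases_h) / period_h
    # Magnitude of the mean unit vector: 1 = synchronized, 0 = scattered
    return abs(np.exp(1j * angles).mean())

print(round(sync_index([6.0, 6.2, 5.9, 6.1]), 2))    # tightly clustered -> 1.0
print(round(sync_index([0.0, 6.0, 12.0, 18.0]), 2))  # evenly scattered -> 0.0
```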

The scientists then confirmed their discovery: in arrhythmic mice, i.e. mice with no circadian clocks whatsoever, the researchers succeeded in restoring rhythmicity in the liver alone, without touching the other organs. "This allowed us to show that a clock restored in one organ works and has rhythms, even in the absence of all the other clocks in the body," she explains. They now want to understand how these cells stay in the same phase when they receive no information from the brain or from other external signals. Their hypothesis: a form of coupling based on an exchange of molecules between these cells.

Credit: 
Université de Genève

3D microscopy clarifies understanding of body's immune response to obesity

image: Xiaohui Zhang, left, Andrew Smith, Kelly Swanson, Erik Nelson, Mark Anastasio and Junlong Geng are part of a team working to clarify the relationship between obesity and inflammation while on the hunt for obesity-fighting drug therapies.

Image: 
Photo by L. Brian Stauffer.

CHAMPAIGN, Ill. -- Researchers who focus on fat know that some adipose tissue is more prone to inflammation-related comorbidities than others, but the reasons why are not well understood. Thanks to a new analytical technique, scientists are getting a clearer view of the microenvironments found within adipose tissue associated with obesity. This advance may illuminate why some adipose tissues are more prone to inflammation - leading to diseases like type 2 diabetes, cancer and cardiovascular disorders - and help direct future drug therapies to treat obesity.

In a new study, University of Illinois Urbana-Champaign bioengineering professors Andrew Smith and Mark A. Anastasio, molecular and integrative physiology professor Erik Nelson and nutritional sciences professor Kelly Swanson detail the use of the new technique in mice. The results are published in the journal Science Advances.

Inflammation in adipose tissue presents itself as round complexes of inflammatory tissue called crownlike structures. Previous studies have shown that body fat that contains these structures is associated with worse outcomes of obesity and related metabolic disorders, the study reports.

Previously, researchers were confined to 2D slices of tissue and traditional microscopy, limiting what they could learn about these structures.

To get a better view, the team combined a special type of microscopy that uses a 3D sheet of light rather than a beam, a fat-clearing technique that renders tissue optically transparent, and deep-learning algorithms that help process the large amount of imaging data produced.

The researchers found that the crownlike appearance that gives these structures their name is, in reality, more like a 3D shell or a set of concentric spheres surrounding an empty core, Smith said.

"Using our new technique, we can determine the crownlike structures' volume, the specific number of cells associated with them, as well as their size, geometry and distribution," Smith said.
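Measurements like these (counts, volumes and size distributions of distinct 3D objects in a segmented image) are typically obtained by connected-component labeling of a binary mask. The sketch below is a hypothetical illustration of that general step on a synthetic mask, not the team's pipeline:

```python
import numpy as np
from scipy import ndimage

# Synthetic binary 3D mask standing in for a deep-learning segmentation
# of crownlike structures (True = voxel belongs to a structure)
mask = np.zeros((20, 20, 20), dtype=bool)
mask[2:5, 2:5, 2:5] = True        # small structure: 3*3*3 = 27 voxels
mask[10:16, 10:16, 10:16] = True  # large structure: 6*6*6 = 216 voxels

# Label each connected structure, then measure its volume in voxels
labels, n = ndimage.label(mask)
volumes = ndimage.sum(mask, labels, index=range(1, n + 1))

print(n, list(map(float, sorted(volumes))))  # 2 [27.0, 216.0]
```

Multiplying voxel counts by the physical voxel size converts these volumes to real units; centroids and bounding boxes from the same labeling give the geometry and spatial distribution.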

This ability led the team to discover that obesity tends to be associated with a prevalence of rare, massive crownlike structures that are not present in the lean state.

"These very large crownlike structures are clustered together and located in the center of the tissue," Smith said. "And there is no way we could have analyzed this before using our new technique."

Smith said the research may lead to new drug therapies and new ways to evaluate patients' metabolic health.

"Right now, we know that some patients are overweight but metabolically healthy, while others are underweight and metabolically unhealthy," Smith said. "We believe that having the ability to look deep into the microenvironments within fat tissue may unlock some of the reasons why this is."

Credit: 
University of Illinois at Urbana-Champaign, News Bureau

Dr. Frederick Boop presents at the ISPN 2020 Virtual Meeting

Understanding the molecular biology of brain tumors is key to prognosis and treatment said Le Bonheur Neuroscience Institute Co-Director Frederick Boop, MD, in his presentation "How Molecular Biology Impacts Clinical Practice" at the International Society for Pediatric Neurosurgery (ISPN) 2020 Virtual Meeting.

"Historically we have depended on what we see under a microscope to differentiate tumor types and determine prognosis and therapy," said Boop. "We know now that what we see doesn't necessarily predict how these tumors are going to behave."

Physicians are able to send a piece of a child's tumor to FoundationOne, an FDA-approved tissue-based broad companion diagnostic (CDx) for solid tumors, which reports the genomic alterations of that particular tumor. The report on each genetic aberration includes its significance, the best available treatment with its mechanism of action, and studies open for enrollment.

Boop's work manipulating tumors based on molecular genetics began more than 35 years ago, shrinking prolactinomas before he turned to neurosurgery. Boop and his team now use a molecular biological approach with medulloblastomas, low-grade gliomas, congenital glioblastomas and many more types of brain tumors. Closer study of molecular genetics has revealed different variants within each type of tumor, each with a different treatment approach and prognosis based on the genetic variant. Further study is needed into the treatment side effects and long-term consequences of some of these therapies.

"As neurosurgeons, it is important for us to get tissue to the lab in every instance for us to understand what's going on so that these children can have a chance," said Boop.

For tumors that can't be removed surgically but for which tissue is needed for diagnostics, biopsies provide a better understanding of the tumor and guide its treatment. Previously, neurosurgeons avoided these biopsies because it was believed that the cells required lay closest to necrotic areas, where a biopsy could cause catastrophic complications. A better understanding of tumors means that a tumor can now be biopsied in a safer area to obtain its molecular profile.

"Molecular genetics has completely changed our field and will continue to do so," said Boop. "There may come a time when the role for surgeons is much less than it is today."

Credit: 
Le Bonheur Children's Hospital

Friends fur life help build skills for life

image: A new UBC Okanagan study finds children reap the benefits of working with therapy dogs

Image: 
UBC Okanagan

A new UBC Okanagan study finds children not only reap the benefits of working with therapy dogs -- they enjoy it too.

"Dog lovers often have an assumption that canine-assisted interventions are going to be effective because other people are going to love dogs," says Nicole Harris, who conducted this research while a master's student in the School of Education. "While we do frequently see children improve in therapy dog programs, we didn't have data to support that they enjoyed the time as well."

Harris was the lead researcher in the study that explored how children reacted while participating in a social skill-training program with therapy dogs.

The research saw 22 children from the Okanagan Boys and Girls Club take part in a series of sessions to help them build their social skills. Over six weeks, the children were accompanied by therapy dogs from UBC Okanagan's Building Academic Retention through K9s (BARK) program as they completed lessons.

Each week the children were taught a new skill, such as introducing themselves or giving directions to others. The children would first practice with their assigned therapy dog before running through the exercise with the rest of the group. In the final phase, the children -- accompanied by their new furry friend and volunteer handler -- would practice their new skills with university students located in the building.

"Therapy dogs are often able to reach children and facilitate their growth in surprising ways. We saw evidence of this in the social skills of children when they were paired with a therapy dog," says Dr. John-Tyler Binfet, associate professor in the School of Education and director of BARK. "The dogs helped create a non-threatening climate while the children were learning these new skills. We saw the children practice and hone their social skills with and alongside the dogs."

While the children were learning and practising their new skills, the research team collected data.

"Findings from our observations suggested that canine-assisted social and emotional learning initiatives can provide unique advantages," says Harris. "Our team saw that by interacting with the therapy dogs, the children's moods improved and their engagement in their lessons increased."

In fact, 87 per cent of the research team rated the children as very or extremely engaged during the sessions.

At the end of the six weeks, Harris interviewed eight children, aged 5 to 11, who regularly attended the sessions. Each child indicated the social skill-training program was an enjoyable and positive experience and that the dogs were a meaningful and essential part of it.

One participant noticed that the children behaved better at the sessions than at their regular after-school care program, and they thought it was because the children liked being around the dogs.

Half of the children mentioned ways that they felt the dogs helped with their emotional well-being, with one participant crediting a dog with helping him "become more responsible and control his silliness."

As a full-time elementary school teacher, Harris notes that schools have become increasingly important in helping students develop social and emotional skills, and this research could contribute to the development of future school-based or after-school programs.

"Dogs have the ability to provide many stress-reducing and confidence-boosting benefits to children," says Harris. "It was really heartwarming to see the impact the program had on the kids."

Credit: 
University of British Columbia Okanagan campus

FSU College of Medicine researcher develops new possibilities to prevent sudden cardiac death

image: Assistant Professor of Biomedical Sciences Stephen Chelko, right, works in his laboratory with graduate student Maicon Landim-Vieira. Chelko's lab has published research providing important new insights about arrhythmogenic cardiomyopathy, a leading cause of sudden cardiac death among young athletes.

Image: 
Mark Bauer/FSU College of Medicine

Nearly a half-million people a year die from sudden cardiac death (SCD) in the U.S. -- the result of malfunctions in the heart's electrical system.

A leading cause of SCD in young athletes is arrhythmogenic cardiomyopathy (ACM), a genetic disease in which healthy heart muscle is replaced over time by scar tissue (fibrosis) and fat.

Stephen Chelko, an assistant professor of biomedical sciences at the Florida State University College of Medicine, has developed a better understanding of the pathological characteristics behind the disease, as well as promising avenues for prevention. His findings are published in the current issue of Science Translational Medicine.

Individuals with ACM carry a mutation that causes arrhythmias, which ordinarily are non-fatal if managed and treated properly. However, Chelko shows that exercise not only amplifies those arrhythmias but also causes extensive cell death. For these patients, the only option is to avoid taking part in what should be a healthy and worthwhile endeavor: exercise.

"There is some awful irony in that exercise, a known health benefit for the heart, leads to cell death in ACM subjects," Chelko said. "Now, we know that endurance exercise, in particular, leads to large-scale myocyte cell death due to mitochondrial dysfunction in those who suffer from this inherited heart disease."

Several thousand mitochondria reside in nearly every cell in the body, processing oxygen and converting food into energy. Considered the powerhouse of the cell (they produce 90 percent of the energy our bodies need to function properly), mitochondria also play an important protective antioxidant role.

As mitochondria fail to function properly, and myocyte cells in the heart die, healthy muscles are replaced by scar tissue and fatty cells. Eventually, the heart's normal electrical signals are reduced to an erratic and disorganized firing of impulses from the lower chambers, leading to an inability to properly pump blood during heavy exercise. Without immediate medical treatment, death occurs within minutes.

Chelko's research gets to the heart of the process involved in mitochondrial dysfunction.

"Ultimately, mitochondria become overwhelmed and expel 'death signals' that are sent to the nucleus, initiating large-scale DNA fragmentation and cell death," Chelko said. "This novel study unravels a pathogenic role for exercise-induced, mitochondrial-mediated cell death in ACM hearts."

In addition to providing a better understanding of the process involved, Chelko discovered that cell death can be prevented by inhibiting two different mitochondrial proteins. One such approach utilizes a novel targeting peptide developed for Chelko's research by the National Research Council in Padova, Italy.

That discovery opens avenues for the development of new therapeutic options to prevent myocyte cell death, cardiac dysfunction and the pathological progression leading to deadly consequences for people living with ACM.

Credit: 
Florida State University

Researchers ID blood protein that sheds light on common, post-operative complication

BOSTON - Delirium is a form of acute confusion characterized by poor attention, disorientation, impaired memory, delusions, and abrupt changes in mood and behavior. The syndrome is common among older adults, particularly those who have recently undergone surgery, critically ill patients in the ICU, and older patients with multiple health issues. Moreover, patients who experience delirium are at increased risk of long-term cognitive decline. Recently, clinicians and scientists have recognized that delirium is one of the first signs of COVID-19 infection in older patients and that it occurs frequently in patients with severe COVID-19 disease.

In a new study led by an interdisciplinary team of gerontologists, geriatricians, precision medicine experts, and bioinformaticians at Beth Israel Deaconess Medical Center (BIDMC), researchers identified a single protein present in the blood that is associated with increased risk of post-operative delirium. The finding, published in the Journal of Gerontology: Medical Sciences, sheds light on a potential pathophysiological mechanism underlying delirium and paves the way for a non-invasive, cost-effective test to guide prediction, diagnosis and monitoring of delirium. While further study is needed, pre-operative blood tests for these proteins could help physicians determine which patients are at higher risk for developing delirium.

"Delirium is associated with more complications, longer hospitalizations, increased risk of long-term cognitive decline, dementia and mortality, and costs the U.S. healthcare system an estimated $182 billion each year," said first author Sarinnapha Vasunilashorn, PhD, Assistant Professor of Medicine at BIDMC and Harvard Medical School (HMS).

"Despite its pervasiveness, delirium remains a clinical diagnosis with no established tests to diagnose the condition," said co-senior author Towia Libermann, PhD, Director of the BIDMC Genomics, Proteomics, Bioinformatics and Systems Biology Center. "The discovery of a reliable biomarker could change that."

Vasunilashorn, also a member of the Department of Epidemiology at the Harvard T.H. Chan School of Public Health, and colleagues used a cutting-edge proteomics platform, SOMAscan -- a large-scale quantitative analysis of protein expression levels -- to evaluate proteins present in blood from a patient cohort called SAGES (Successful Aging after Elective Surgery). Sponsored by the National Institute on Aging, SAGES follows 560 noncardiac surgical patients ages 70 and older with the goal of identifying novel biomarkers of delirium and its associated long-term cognitive outcomes.

"SAGES participants have been very generous with their time, participating in interviews to test their memory and thinking and donating small amounts of blood, before and immediately after their major elective surgery," said co-senior author Edward Marcantonio, MD, Section Chief for Research in the Division of General Medicine at BIDMC and Professor of Medicine at HMS. "We are now analyzing this stored blood with novel techniques, such as SOMAscan, to understand the biological basis of delirium, an incredibly challenging clinical problem."

The researchers' analysis of more than 1,300 proteins revealed a single protein (known as chitinase-3-like-protein-1, or CHI3L1/YKL-40) that was present at higher concentrations in the blood both before and after surgery in patients who experienced delirium as compared with patients who did not develop postoperative delirium. This protein -- itself linked to aging and age-related conditions including Alzheimer's disease -- plays a critical role in the body's type 2 immune response.

The team also found that patients who had high pre-operative levels of the protein CHI3L1/YKL-40 combined with high post-operative levels of an immune-related protein called interleukin-6 (or IL-6) were at increased risk of delirium.

"Our study specifically highlights the involvement of this highly specific immune activating protein in postoperative delirium, which may also play a role in COVID-19 associated delirium," said Libermann, who is also an Associate Professor of Medicine at Harvard Medical School. "In addition to providing a promising candidate for a delirium biomarker, our findings suggest a possible link between delirium, aging and Alzheimer's disease."

Credit: 
Beth Israel Deaconess Medical Center

Wolves, dogs and dingoes, oh my

image: A) Person holding the front paws of a dingo spread wide. B) Shows a dingo climbing rocks.

Image: 
Lyn Watson

The dog is generally considered the first domesticated animal, and its ancestor is generally considered to be the wolf, but where the Australian dingo fits into this framework is still debated, according to a retired Penn State anthropologist.

"Indigenous Australians understood that there was something different about the dingoes and the colonial dogs," said Pat Shipman, retired adjunct professor of anthropology, Penn State. "They really are, I think, different animals. They react differently to humans. A lot of genetic and behavioral work has been done with wolves, dogs and dingoes. Dingoes come out somewhere in between."

Wolves, dogs and dingoes are all members of the family Canidae and are called canids. In most animals, hybridization between closely related species either does not happen or, as with female horses and male donkeys, produces usually infertile offspring such as mules. However, many canid species, including wolves, dingoes and dogs, can interbreed and produce fertile offspring. This makes defining species boundaries in canids more difficult.

Domestic dogs came to the Australian continent in 1788 with the first 11 ships of convicts, but dingoes were already there, as were aboriginal Australians, who arrived on the continent about 65,000 years ago. A large portion of dingoes in Australia today have domestic dog in their ancestry, but according to fossil evidence, dingoes came to Australia at least 4,000 years ago. Shipman believes that date may be even earlier, but no older fossils have yet been found.

"Part of the reason I'm so fascinated with dingoes is that if you see a dingo through American eyes you say, 'that's a dog,'" said Shipman. "In evolutionary terms, dingoes give us a glimpse of what started the domestication process."

Shipman reports her analysis of wolves, dogs and dingoes in a January 2021 special issue of the Anatomical Record.

Dingoes, and the closely related New Guinea singing dogs, look like the default definition of dog, but they are not dogs.

"There is a basic doggy look to dingoes," said Shipman.

Genetically and behaviorally they differ from dogs and are more like wolves in their inability to digest starches and their relationships with humans.

Most domestic dogs evolved along with humans as humans became agriculturalists and moved to a diet containing large amounts of starch, whether from maize, rice, potatoes or wheat. Their genome changed to allow the digestion of these starches. Dingoes, like wolves, have very few of the genes for starch digestion.

While indigenous Australians stole dingo puppies from their dens and raised them, these puppies generally left human homes at maturity and went off to breed and raise offspring. The ability to closely bond with humans is limited in dingoes, although present in dogs. Native Australians also did not manipulate dingo breeding, which is a hallmark of domestication.

Dingoes are also well-adapted to the Australian outback and fare well in that environment. Domestic dogs that become feral do not survive well in the outback.

"Aboriginal Australians were not well-regarded as holders of knowledge or special skill when Europeans came to the continent," said Shipman. "So, no one thought to ask them about dingoes. Even recently, asking aboriginals for their scientific or behavioral knowledge really was not common."

However, aboriginal Australians have a long history of living with dingoes in their lives. Many people argue that dingoes are just dogs -- strange dogs, but just dogs, said Shipman. But, according to aboriginals, dingoes are not dogs.

With dingoes showing behaviors somewhere between wolves and dogs and exhibiting only a slight genetic ability to consume starchy foods or tolerate captivity, Shipman concluded that "a dingo is a wolf on its way to becoming a dog that never got there."

Credit: 
Penn State

Protein linked to Alzheimer's, strokes cleared from brain blood vessels

image: Amyloid deposits (blue) in mouse brain tissue and blood vessels are reduced after treatment with an antibody that targets the protein APOE (right), a minor component of amyloid deposits, compared to a placebo antibody (left). Amyloid deposits in the brain increase the risk of dementia and strokes. Researchers at Washington University School of Medicine in St. Louis have identified an antibody that clears amyloid deposits from the brain without raising the risk of brain bleeds.

Image: 
Monica Xiong

As people age, a normal brain protein known as amyloid beta often starts to collect into harmful amyloid plaques in the brain. Such plaques can be the first step on the path to Alzheimer's dementia. When they form around blood vessels in the brain, a condition known as cerebral amyloid angiopathy, the plaques also raise the risk of strokes.

Several antibodies that target amyloid plaques have been studied as experimental treatments for Alzheimer's disease. Such antibodies also may have the potential to treat cerebral amyloid angiopathy, although they haven't yet been evaluated in clinical trials. But all of the anti-amyloid antibodies that have successfully reduced amyloid plaques in Alzheimer's clinical trials also can cause a worrisome side effect: an increased risk of brain swelling and bleeds.

Now, researchers at Washington University School of Medicine in St. Louis have identified an antibody that, in mice, removes amyloid plaques from brain tissue and blood vessels without increasing risk of brain bleeds. The antibody targets a minor component of amyloid plaques known as apolipoprotein E (APOE).

The findings, published Feb. 17 in Science Translational Medicine, suggest a potentially safer approach to removing harmful amyloid plaques as a way of treating Alzheimer's disease and cerebral amyloid angiopathy.

"Alzheimer's researchers have been searching for decades for therapies that reduce amyloid in the brain, and now that we have some promising candidates, we find that there's this complication," said senior author David Holtzman, MD, the Andrew B. and Gretchen P. Jones Professor and head of the Department of Neurology. "Each of the antibodies that removes amyloid plaques in clinical trials is a little different, but they all have this problem, to a greater or lesser degree. We've taken a different approach by targeting APOE, and it seems to be effective at removing amyloid from both the brain tissue and the blood vessels, while avoiding this potentially dangerous side effect."

The side effect, called ARIA, for amyloid-related imaging abnormalities, is visible on brain scans. Such abnormalities indicate swelling or bleeding in the brain caused by inflammation, and can lead to headaches, confusion and even seizures. In clinical trials for anti-amyloid antibodies, roughly 20% of participants develop ARIA, although not all have symptoms.

Anti-amyloid antibodies work by alerting the immune system to the presence of unwanted material -- amyloid plaques -- and directing the cleanup crew -- inflammatory cells known as microglia -- to clear out such debris. ARIA seems to be the result of an overenthusiastic inflammatory response. Holtzman and first author Monica Xiong, a graduate student, suspected that an antibody that targets only a minor part of the amyloid plaque might elicit a more restrained response that clears the plaques from both brain tissue and blood vessels without causing ARIA.

Fortunately, they had one such antibody on hand: an antibody called HAE-4 that targets a specific form of human APOE that is found sparsely in amyloid plaques and triggers the removal of plaques from brain tissue. To determine whether HAE-4 also removes amyloid from brain blood vessels, the researchers used mice genetically modified with human genes for amyloid and APOE4, a form of APOE associated with a high risk of developing Alzheimer's and cerebral amyloid angiopathy. Such mice develop abundant amyloid plaques in brain tissue and brain blood vessels by the time they are about six months old. Along with Holtzman and Xiong, the research team included co-authors Hong Jiang, PhD, a senior scientist in Holtzman's lab, and Gregory J. Zipfel, MD, the Ralph G. Dacey Distinguished Professor of Neurological Surgery and head of the Department of Neurosurgery, among others.

Experiments showed that eight weeks of treating mice with HAE-4 reduced amyloid plaques in brain tissue and brain blood vessels. Treatment also significantly improved the ability of brain blood vessels to dilate and constrict on demand, an important sign of vascular health.

Amyloid plaques in brain blood vessels are dangerous because they can lead to blockages or ruptures that cause strokes. The researchers compared the number of brain bleeds in mice treated for eight weeks with either HAE-4 or aducanumab, an anti-amyloid antibody that is in phase 3 clinical trials for Alzheimer's. The mice had a baseline level of tiny brain bleeds because of their genetic predisposition for amyloid buildup in blood vessels. But aducanumab significantly increased the number of bleeds while HAE-4 did not.

Further investigation revealed that HAE-4 and aducanumab initially elicited immune responses against amyloid plaques that were similar in strength. But mice treated with the anti-APOE antibody resolved the inflammation within two months, while inflammation persisted in mice treated with the anti-amyloid antibody.

"Some people get cerebral amyloid angiopathy and never get Alzheimer's dementia, but they may have strokes instead," Holtzman said. "A buildup of amyloid in brain blood vessels can be managed by controlling blood pressure and other things, but there isn't a specific treatment for it. This study is exciting because it not only shows that we can treat the condition in an animal model, but we may be able to do it without the side effects that undermine the effectiveness of other anti-amyloid therapeutics."

Credit: 
Washington University School of Medicine

Radiological images confirm 'COVID-19 can cause the body to attack itself'

image: MRI image of a patient's shoulder. The red arrow points to inflammation in the joint. The COVID virus triggered rheumatoid arthritis in this patient with prolonged shoulder pain after other COVID-19 symptoms resolved.

Image: 
Northwestern University

CHICAGO --- Muscle soreness and achy joints are common symptoms among COVID-19 patients. But for some people, symptoms are more severe, long-lasting and even bizarre, including rheumatoid arthritis flares, autoimmune myositis or "COVID toes."

A new Northwestern Medicine study has, for the first time, confirmed and illustrated the causes of these symptoms through radiological imaging.

"We've realized that the COVID virus can trigger the body to attack itself in different ways, which may lead to rheumatological issues that require lifelong management," said corresponding author Dr. Swati Deshmukh.

The paper will be published Feb. 17 in the journal Skeletal Radiology. The study is a retrospective review of data from patients who presented to Northwestern Memorial Hospital between May 2020 and December 2020.

"Many patients with COVID-related musculoskeletal disorders recover, but for some individuals, their symptoms become serious, are deeply concerning to the patient or impact their quality of life, which leads them to seek medical attention and imaging," said Deshmukh, an assistant professor of musculoskeletal radiology at Northwestern University Feinberg School of Medicine and a Northwestern Medicine musculoskeletal radiologist. "That imaging allows us to see if COVID-related muscle and joint pain, for example, are not just body aches similar to what we see from the flu -- but something more insidious."

Imaging (CT, MRI, ultrasound) can help explain why someone might have prolonged musculoskeletal symptoms after COVID, directing them to seek the right physician for treatment, such as a rheumatologist or dermatologist.

In some cases, radiologists may even suggest a COVID diagnosis based on musculoskeletal imaging in patients who previously didn't know they contracted the virus, Deshmukh said.

What does the imaging look like?

"We might see edema and inflammatory changes of the tissues (fluid, swelling), hematomas (collections of blood) or devitalized tissue (gangrene)," Deshmukh said. "In some patients, the nerves are injured (bright, enlarged) and in others, the problem is impaired blood flow (clots)."

How can imaging lead to better treatment?

"I think it's important to differentiate between what the virus causes directly and what it triggers the body to do," Deshmukh said. "It's important for doctors to know what's happening in order to treat correctly."

For example, Deshmukh said, if a patient has persistent shoulder pain that started after contracting COVID, their primary care provider might order an MRI/ultrasound. If a radiologist knows COVID can trigger inflammatory arthritis and imaging shows joint inflammation, then they can send a patient to a rheumatologist for evaluation.

"Some doctors request imaging for patients with 'COVID toes,' for example, but there wasn't any literature on imaging of foot and soft tissue complications of COVID," Deshmukh said. "How do you find something if you're unsure of what to look for? So in our paper, we discuss the various types of musculoskeletal abnormalities that radiologists should look for and provide imaging examples."

Credit: 
Northwestern University

World's oldest DNA reveals how mammoths evolved

image: The illustration represents a reconstruction of the steppe mammoths that preceded the woolly mammoth, based on the genetic knowledge we now have from the Adycha mammoth.

Image: 
Beth Zaiken/CPG

An international team led by researchers at the Centre for Palaeogenetics in Stockholm has sequenced DNA recovered from mammoth remains that are up to 1.2 million years old. The analyses show that the Columbian mammoth that inhabited North America during the last ice age was a hybrid between the woolly mammoth and a previously unknown genetic lineage of mammoth. In addition, the study provides new insights into when and how fast mammoths became adapted to cold climate. These findings are published today in Nature.

Around one million years ago there were no woolly or Columbian mammoths, as they had not yet evolved. This was the time of their predecessor, the ancient steppe mammoth. Researchers have now managed to analyse the genomes from three ancient mammoths, using DNA recovered from mammoth teeth that had been buried for 0.7-1.2 million years in the Siberian permafrost.

This is the first time that DNA has been sequenced and authenticated from million-year-old specimens, and extracting the DNA from the samples was challenging. The scientists found that only minute amounts of DNA remained in the samples and that the DNA was degraded into very small fragments.

"This DNA is incredibly old. The samples are a thousand times older than Viking remains, and even pre-date the existence of humans and Neanderthals", says senior author Love Dalén, a Professor of evolutionary genetics at the Centre for Palaeogenetics in Stockholm.

The age of the specimens was determined using both geological data and the molecular clock. Both these types of analyses showed that two of the specimens are more than one million years old, whereas the third is roughly 700 thousand years old and represents one of the earliest known woolly mammoths.
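The molecular-clock approach mentioned above can be sketched in a few lines. This is a hedged illustration only: the function, the substitution count and the per-year rate below are assumed example values for the generic formula (time since divergence is roughly genetic distance divided by twice the substitution rate), not figures from the study.

```python
# Hypothetical illustration of molecular-clock dating (example values,
# not the study's actual data).
def molecular_clock_age(subs_per_site, rate_per_site_per_year):
    """Estimate years since two lineages diverged from a common ancestor.

    Genetic distance accumulates along both branches, hence the factor of 2.
    """
    return subs_per_site / (2 * rate_per_site_per_year)

# Assumed example: 0.001 substitutions per site, rate of 4e-10 subs/site/year.
age = molecular_clock_age(0.001, 4e-10)
print(round(age))  # -> 1250000, i.e. roughly 1.25 million years
```

In practice, researchers calibrate such estimates against independent evidence, which is why the study combined the molecular clock with geological dating of the permafrost layers.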

An unexpected origin of the Columbian mammoth

Analyses of the genomes showed that the oldest specimen, which was approximately 1.2 million years old, belonged to a previously unknown genetic lineage of mammoth. The researchers refer to this as the Krestovka mammoth, based on the locality where it was found. The results show that the Krestovka mammoth diverged from other Siberian mammoths more than two million years ago.

"This came as a complete surprise to us. All previous studies have indicated that there was only one species of mammoth in Siberia at that point in time, called the steppe mammoth. But our DNA analyses now show that there were two different genetic lineages, which we here refer to as the Adycha mammoth and the Krestovka mammoth. We can't say for sure yet, but we think these may represent two different species", says the study's lead author Tom van der Valk.

The researchers also suggest that it was mammoths that belonged to the Krestovka lineage that colonised North America some 1.5 million years ago. In addition, the analyses show that the Columbian mammoth that inhabited North America during the last ice age was a hybrid. Roughly half of its genome came from the Krestovka lineage and the other half from the woolly mammoth.

"This is an important discovery. It appears that the Columbian mammoth, one of the most iconic Ice Age species of North America, evolved through a hybridisation that took place approximately 420 thousand years ago", says co-lead author Patrícia Pečnerová.

Evolution and adaptation in the woolly mammoth

The second million-year-old genome, from the Adycha mammoth, appears to have been ancestral to the woolly mammoth. The researchers could therefore compare its genome with the genome from one of the earliest known woolly mammoths that lived 0.7 million years ago, as well as with mammoth genomes that are only a few thousand years old. This made it possible to investigate how mammoths became adapted to a life in cold environments and to what extent these adaptations evolved during the speciation process.

The analyses showed that gene variants associated with life in the Arctic, such as hair growth, thermoregulation, fat deposits, cold tolerance and circadian rhythms, were already present in the million-year-old mammoth, long before the origin of the woolly mammoth. These results indicate that most adaptations in the mammoth lineage happened slowly and gradually over time.

"To be able to trace genetic changes across a speciation event is unique. Our analyses show that most cold adaptations were present already in the ancestor of the woolly mammoth, and we find no evidence that natural selection was faster during the speciation process", says co-lead author David Díez-del-Molino.

Future research

The new results open the door for a broad array of future studies on other species. About one million years ago was a period when many species expanded across the globe. This was also a time period of major changes in climate and sea levels, as well as the last time that Earth's magnetic poles changed places. Because of this, the researchers think that genetic analyses on this time scale have great potential to explore a wide range of scientific questions.

"One of the big questions now is how far back in time we can go. We haven't reached the limit yet. An educated guess would be that we could recover DNA that is two million years old, and possibly go even as far back as 2.6 million. Before that, there was no permafrost where ancient DNA could have been preserved", says Anders Götherström, a professor in molecular archaeology and joint research leader at the Centre for Palaeogenetics.

Credit: 
Stockholm University

Body shape, beyond weight, drives fat stigma for women

A woman's body shape--not only the amount of fat--is what drives stigma associated with overweight and obesity.

Fat stigma is a socially acceptable form of prejudice that contributes to poor medical outcomes and negatively affects educational and economic opportunities. But a new study has found that not all overweight and obese body shapes are equally stigmatized. Scientists from Arizona State University and Oklahoma State University have shown that women with abdominal fat around their midsection are more stigmatized than those with gluteofemoral fat on the hips, buttocks and thighs. The work will be published on February 17 in Social Psychological and Personality Science.

"Fat stigma is pervasive, painful and results in huge mental and physical health costs for individuals," said Jaimie Arona Krems, assistant professor of psychology at OSU and first author on the paper. "We found that even when women are the same height and weight, they were stigmatized differently--and this was driven by whether they carried abdominal or gluteofemoral fat. Indeed, in one case, people stigmatized obese women with gluteofemoral fat more than objectively smaller women with abdominal fat. This finding suggests that body shape is sometimes even more important than overall size in driving fat stigma."

The location of fat on the body determines body shape and is associated with different biological functions and health outcomes. Gluteofemoral fat in young women can indicate fertility, while abdominal fat can accompany negative health outcomes like diabetes and cardiovascular disease.

"When people try to understand what others are like, and what characteristics they possess, they often rely on easily visible cues to make their best guesses. We've known for a long time that people use weight as such a cue. Given that different fats, on different parts of the body, are associated with different outcomes, we wanted to explore whether people also systematically use body shape as a cue," said Steven Neuberg, co-author of the study and Foundation Professor and Chair of the ASU Department of Psychology.

To test how the location of fat on the body affected stigma, the research team created illustrations of underweight, average-weight, overweight and obese bodies that varied in both size and shape. The illustrations of higher-weight bodies had either gluteofemoral or abdominal fat.

The study participants stigmatized obese women more than overweight women and also overweight women more than average-weight women. But women with overweight who weighed the same were less stigmatized when they carried gluteofemoral fat than when they carried abdominal fat. This same pattern held for women with obesity, suggesting that body shape, in addition to overall body size, drives stigmatization.

The research team also tested the impact of body shape in driving stigma in different ethnicities and cultures. White and Black Americans, and also participants in India, all showed the same pattern of stigmatizing women carrying abdominal fat.

"The findings from this study are probably not surprising to most women, who have long talked about the importance of shape, or to anyone who has read a magazine article on 'dressing for your shape' that categorizes body shapes as apples, pears, hourglasses and the like," Krems said. "Because theories have not focused on body shape, we haven't tested for its importance and have missed one of the major drivers of fat stigma for some time. It is important to put data to this idea so we can improve interventions for people with overweight and obesity."

Credit: 
Arizona State University

One in five has a mutation that provides superior resilience to cold

image: Håkan Westerblad, professor of cellular muscle physiology at the Department of Physiology and Pharmacology, Karolinska Institutet, Sweden. Photo: Mats Rundgren

Image: 
Mats Rundgren

Almost one in five people lacks the protein α-actinin-3 in their muscle fibre. Researchers at Karolinska Institutet in Sweden now show that more of the skeletal muscle of these individuals comprises slow-twitch muscle fibres, which are more durable and energy-efficient and provide better tolerance to low temperatures than fast-twitch muscle fibres. The results are published in the scientific journal The American Journal of Human Genetics.

Skeletal muscle comprises fast-twitch (white) fibres that fatigue quickly and slow-twitch (red) fibres that are more resistant to fatigue. The protein α-actinin-3, which is found only in fast-twitch fibres, is absent in almost 20 per cent of people - nearly 1.5 billion individuals - due to a mutation in the gene that codes for it. In evolutionary terms, the presence of the mutated gene increased when humans migrated from Africa to the colder climates of central and northern Europe.

"This suggests that people lacking α-actinin-3 are better at keeping warm and, energy-wise, at enduring a tougher climate, but there hasn't been any direct experimental evidence for this before," says Håkan Westerblad, professor of cellular muscle physiology at the Department of Physiology and Pharmacology, Karolinska Institutet. "We can now show that the loss of this protein gives a greater resilience to cold and we've also found a possible mechanism for this."

For the study, 42 healthy men between the ages of 18 and 40 were asked to sit in cold water (14 °C) until their body temperature had dropped to 35.5 °C. During cold water immersion, researchers measured muscle electrical activity with electromyography (EMG) and took muscle biopsies to study the protein content and fibre-type composition.

The results showed that the skeletal muscle of people lacking α-actinin-3 contains a larger proportion of slow-twitch fibres. On cooling, these individuals were able to maintain their body temperature in a more energy-efficient way. Rather than activating fast-twitch fibres, which results in overt shivering, they increased the activation of slow-twitch fibres that produce heat by increasing baseline contraction (tonus).

"The mutation probably gave an evolutionary advantage during the migration to a colder climate, but in today's modern society this energy-saving ability might instead increase the risk of diseases of affluence, which is something we now want to turn our attention to," says Professor Westerblad.

Another interesting question is how the lack of α-actinin-3 affects the body's response to physical exercise.

"People who lack α-actinin-3 rarely succeed in sports requiring strength and explosiveness, while a tendency towards greater capacity has been observed in these people in endurance sports," he explains.

One limitation of the study is that it is harder to study mechanisms in human studies at the same level of detail as in animal and cell experiments. The physiological mechanism presented has not been verified with experiments at, for example, the molecular level.

Credit: 
Karolinska Institutet

Study: Including videos in college teaching may improve student learning

Washington, February 17, 2021--As higher education institutions worldwide transition to new methods of instruction, including the use of more pre-recorded videos, in response to the COVID-19 pandemic, many observers are concerned that student learning is suffering as a result. However, a new comprehensive review of research offers some positive news for college students. The authors found that, in many cases, replacing teaching methods with pre-recorded videos leads to small improvements in learning and that supplementing existing content with videos results in strong learning benefits. The study was published today in Review of Educational Research, a peer-reviewed journal of the American Educational Research Association.

Study authors Michael Noetel, Shantell Griffith, Taren Sanders, Philip D. Parker, Borja del Pozo Cruz, and Chris Lonsdale at Australian Catholic University, and Oscar Delaney at the University of Queensland, analyzed 105 prior studies with a pooled sample of 7,776 students. The prior studies had used randomized controlled trials to compare the impact of videos (such as recorded lectures or highly edited clips that included audio and visual elements) on learning with the impact of other forms of instruction, including face-to-face lectures, tutorials, or assigned readings. Studies in which the use of video could not be isolated from other variables--for example, in "flipped" classrooms where lectures were more interactive due to increased student engagement--were excluded by the authors.

"Overall, when students got videos instead of the usual forms of teaching, the average grade increased from a B to a B+," said Noetel, a research fellow at Australian Catholic University. "When they got videos in addition to their existing classes, the effect was even stronger, moving students from a B to an A."

Videos were found to be more effective for teaching skills than for transmitting knowledge. On a skills assessment, videos improved student scores by about 5 points out of 100. For learning knowledge, videos were about as good as existing teaching methods, increasing student scores by about 2 points.
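The score-to-grade relationship the authors describe can be made concrete with a minimal sketch. The grade cutoffs and the baseline score below are assumptions for illustration (the study does not publish a grading scale); the point is simply that a roughly 5-point gain on a 100-point scale can move an average from a B to a B+:

```python
# Hypothetical grading scale to illustrate how a small average score gain
# can shift a letter grade (cutoffs are assumed, not from the study).
def letter_grade(score):
    cutoffs = [(97, "A+"), (93, "A"), (90, "A-"), (87, "B+"), (83, "B"), (80, "B-")]
    for cutoff, grade in cutoffs:
        if score >= cutoff:
            return grade
    return "C or below"

print(letter_grade(84))      # assumed baseline average -> "B"
print(letter_grade(84 + 5))  # after a ~5-point skills gain -> "B+"
```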

The results were robust across different teaching methods (e.g., lectures, tutorials, homework), course subjects, types of video (e.g., case demonstrations, recorded lectures), lengths of the video experiments, and amount of time between the experiments and follow-up assessments of student learning.

"In a slightly concerning finding for my job as an academic, videos were even better than face-to-face classes with a teacher, although only by a little," said Noetel. "Still, this surprised us because we thought classes would be more effective, not less."

"Obviously some valuable learning activities are best done face-to-face, like role-plays and class discussion," said Noetel. "But our results show many forms of learning can be done better and more cost-effectively via video. Shifting the 'explaining' bits to videos allows the rich, interactive work to take up more of the precious face-to-face time with students."

The authors noted that videos might be more effective than face-to-face classes with comparable interactivity because students are able to engage at their own pace and in their own time, without being overloaded.

"Because each student is in charge of the controls, videos may allow learners to stop themselves from becoming overloaded, pause to take notes, rewind, or go faster if they're bored," said Noetel. "Video may also increase student motivation by allowing increased autonomy and self-direction. It's nice to be able to learn when and where you want; it can fit in better with life."

The authors noted that videos often show things more authentically than lectures can, by providing real-life demonstrations instead of artificial demonstrations in class. This may explain why the videos in the study were more effective for teaching skills than for transmitting knowledge.

According to the authors, college policies should focus on incentivizing staff to create and share high-quality video resources, funding the infrastructure for creating quality videos, and supporting students with less access to technology.

"Even after the pandemic ends, college instructors will find value in incorporating video into their teaching," said Noetel. "Ensuring that those videos are of high quality and that all students have equal access to them will provide significant long-term benefits."

Credit: 
American Educational Research Association

High-risk gene for neurodevelopmental disorders linked to sleep problems in flies

image: The mutation of a gene (ISWI) that has been associated with neurodevelopmental disorders like autism spectrum disorder led to marked sleep disturbances in fruit flies.

Image: 
Matthew Kayser, Penn Medicine

PHILADELPHIA - The mutation of a gene that has been associated with neurodevelopmental disorders like autism spectrum disorder led to marked sleep disturbances in fruit flies, according to a new study from scientists in the Perelman School of Medicine at the University of Pennsylvania. The findings, published Wednesday in Science Advances, provide further evidence that sleep is linked to early neurodevelopmental processes and could guide future treatments for patients.

While sleep disruption is a commonly reported symptom across neurodevelopmental disorders, including autism, it is often treated clinically as a "secondary effect" of other cognitive or behavioral problems, according to senior author Matthew Kayser, MD, PhD, an assistant professor of Psychiatry and Neuroscience at Penn, who led the study with Natalie Gong and Leela Chakravarti Dilley, both MD/PhD students.

"Our paper shows that sleep problems are not arising because of these other issues, but rather, this gene acts in different brain circuits, at different periods of time during development, to independently give rise to each of these symptoms," Kayser said. "Which is to say, we're guessing that the genetic constellation or signaling pathway that leads to disorders like autism or depression can also lead to sleep problems in humans."

To identify a correlation between sleep and neurodevelopment, Kayser and his research team genetically manipulated Drosophila, or fruit flies, by individually "knocking down" each of 218 genes that have been strongly associated with risk for neurodevelopmental disorders in humans. They then observed how the flies -- a remarkably powerful model for biomedical research -- reacted.

After observing the flies' behavioral patterns, they saw that knocking down the gene Imitation SWItch/SNF (ISWI) made the fruit flies almost entirely unable to sleep. ISWI in fruit flies is homologous to SMARCA1 and SMARCA5 genes in humans that have been linked to various neurodevelopmental disorders. In addition to sleep deficits, the researchers found that knocking down ISWI also led to memory problems and social dysfunction. Surprisingly, the ISWI gene was found to act in different cells of the fly brain during distinct developmental times to independently affect each of these behaviors.

Importantly, even though sleep deficits appear to arise directly from dysfunction of a given gene, Kayser said that previous research suggests treatments like cognitive behavioral therapy for insomnia are still likely to be effective.

"Even if problems like sleep disruption or insomnia arise from really early problems in the brain's wiring, we have every reason to believe that we can use existing treatments," Kayser said.

The findings support the idea that treating sleep problems in children with neurodevelopmental disorders could potentially improve other symptoms. Future work will examine the potential for leveraging sleep as a modifiable risk factor in mitigating the severity of neurodevelopmental disorders.

"Now that we know that sleep deficits are a primary characteristic of early developmental origin in neurodevelopmental disorders, we can start to ask," Kayser said, "whether improving sleep will also improve memory and social function."

Credit: 
University of Pennsylvania School of Medicine