Essential virulence proteins of corn smut discovered

image: Disease symptoms on the leaf of an Ustilago maydis infected maize plant.

Image: 
MPI f. Terrestrial Microbiology/ Geisel

To infect its host plant maize, the fungal parasite Ustilago maydis uses a complex of seven proteins. Numerous findings reveal an essential role of the complex in causing disease and suggest a widespread occurrence in fungal plant pathogens.

Each year, fungal plant pathogens such as rusts, rice blast and mildews destroy huge amounts of cereal crops that could feed millions of people. Many of these fungi are biotrophic pathogens: instead of killing their host plants, they manipulate host cells to ensure that they sustain fungal growth. Among these pathogens, the corn smut fungus Ustilago maydis has emerged as a model for basic research on biotrophic fungi.

During infection, U. maydis releases an entire cocktail of so-called effectors, which either function in the interaction zone between fungus and host or are delivered into plant cells. Effector proteins suppress plant immunity, alter plant biosynthesis pathways and re-initiate cell division in leaf tissue, leading to prominent tumor-like structures from which the fungus spreads its spores. At present, the mechanism by which effectors of plant-pathogenic fungi enter plant cells remains a mystery.

Over many years, researchers led by Regine Kahmann at the Max Planck Institute for Terrestrial Microbiology have worked on elucidating the molecular function of effectors. In the present study, they identified five fungal effectors plus two transmembrane proteins, which together form a stable protein complex. If even one of these seven proteins is missing, the infection process stops entirely. Such a strong effect is highly unusual for effectors, which individually usually make only a modest contribution to virulence. Mutants lacking complex members fail to downregulate host immunity, suggesting an involvement of the complex in effector delivery. Localization experiments, conducted in part with collaboration partners in the US and at the Philipps-Universität in Marburg, revealed that complex proteins reside in structures extending from the fungus into host cells.

Essential for infection

These and other findings, together with the observation that expression of the complex is co-regulated with the infection process, point towards a central, if not universal, role of the protein complex. "We consider it likely that the effector complex in fact acts as a transmembrane structure that helps pathogens to deliver effector proteins into host cells," says Regine Kahmann. "Such devices are well known from bacteria, but not from fungi so far."

However, direct proof is tough to come by. "This would require showing that certain effectors fail to enter the plant cell when the complex is missing, something we cannot really prove at the moment, since mutants lacking the complex are immediately attacked by the plant immune system and die after entering the plant," says Nicole Ludwig, lead author of the study, which was published in the journal Nature Microbiology. "We hope that in the near future the reconstitution of the complex we have already achieved will pave the way to studying its atomic structure and presumed function in effector delivery."

But the essential role of the complex also has an applied aspect, as Regine Kahmann points out. "Because the complex is indispensable for infection, it is an attractive target for stopping the disease by developing new fungicides." First steps towards this goal have already been taken by setting up a high-throughput screen, a collaboration between the Max Planck researchers in Marburg and the Compound Management and Screening Center (COMAS) at the MPI for Molecular Physiology in Dortmund. The screen yielded several candidate compounds, the most promising of which successfully inhibited disease symptoms not only of U. maydis but also of a rust fungus. Supported by Max Planck Innovation, a patent was recently filed at the European Patent Office, illustrating the transfer of basic research to potential applications in agriculture.

Credit: 
Max-Planck-Gesellschaft

Study finds racial disparities in concussion symptom knowledge among college athletes

May 7, 2021 - Among collegiate football players and other athletes, Black athletes recognize fewer concussion-related symptoms than their White counterparts, reports a study in the May/June issue of the Journal of Head Trauma Rehabilitation (JHTR). The official journal of the Brain Injury Association of America, JHTR is published in the Lippincott portfolio by Wolters Kluwer.

"Despite NCAA concussion education requirements for athletes, Black collegiate-athletes were found to have lower concussion symptom knowledge than White collegiate-athletes," according to the new research by Jessica Wallace, PhD, MPH, LAT, ATC, of the University of Alabama, Tuscaloosa, and colleagues. The study also finds racial differences in the sources of concussion symptom knowledge.

Need for 'equitable strategies' to communicate concussion information

In the study, 768 collegiate athletes - about 83 percent White and 17 percent Black - completed a standard assessment of concussion symptom knowledge. Participants included 196 football players, of whom 59 percent were White and 41 percent were Black.

Scores on the concussion symptom knowledge questionnaire were compared for Black and White athletes, with adjustment for other characteristics. The athletes were also asked about their sources of information about concussion symptoms.

"Black athletes were more likely to have lower concussion symptom knowledge scores than White athletes," according to the authors. The difference was small: average scores on the 34-point questionnaire were 27.7 for White athletes and 25.9 for Black athletes. However, the between-group difference remained statistically significant after adjustment for other factors. Results for football players alone were similar to those for the full sample.

Concussion symptom knowledge varied across a range of sleep, mood, physical, and cognitive symptoms of concussion. In particular, Black athletes were less likely to correctly recognize feeling like "in a fog," nausea or vomiting, and feeling more irritable/angry as symptoms of concussion.

Most athletes identified athletic trainers as a main source of concussion information. White athletes were somewhat more likely to mention other school-based professionals (such as teachers or nurses), online medical sources, or the National Collegiate Athletic Association (NCAA). Black athletes were more likely to cite referees as a source of concussion information.

Basic knowledge of concussion symptoms is fundamental to prompt recognition and reporting of concussions. Studies have suggested that 50 to 80 percent of concussions may go unreported.

In previous studies, Dr. Wallace and colleagues reported racial disparities in concussion knowledge among high school and youth athletes. Concussion knowledge was lowest for athletes attending schools of lower socioeconomic status and without access to athletic trainers.

These differences in concussion knowledge may extend to the college level, the new study suggests. The disparity may be related to historic and systemic bias/discrimination and inequitable access to healthcare and other resources, or to racial distrust in medical resources and healthcare overall. "All of these contributing factors of inequity and disparity at the community level, youth sports level, and high school sports level could impact exposure and/or receptiveness to concussion education materials that may ultimately explain the lower concussion symptom knowledge of Black collegiate-athletes," Dr. Wallace and coauthors write.

The researchers emphasize the need for strategies to ensure equitable access to concussion education and prevention that do not perpetuate disparity: "Moving forward, a conscious attempt is needed to redevelop concussion education initiatives as racially, culturally, and linguistically inclusive, addressing the needs of all collegiate athletes equally and equitably."

The NCAA provides educational resources on concussion: https://www.ncaa.org/sport-science-institute/concussion-educational-resources.

The full study, "Assessing Differences in Concussion Symptom Knowledge and Sources of Information Among Black and White Collegiate-Athletes," is available in JHTR.
DOI: 10.1097/HTR.0000000000000672

Credit: 
Wolters Kluwer Health

Breaching the blood-brain barrier to deliver precious payloads

image: Georgia Tech Mechanical Engineering Ph.D. student Yutong Guo (left) and her mentor, assistant professor Costas Arvanitis, have developed a way to use ultrasonics to treat brain disease.

Image: 
Ashley Ritchey, Georgia Tech

RNA-based drugs have the potential to change the standard of care for many diseases, making personalized medicine a reality. This rapidly expanding class of therapeutics is cost-effective, fairly easy to manufacture, and able to go where no drug has gone before, reaching previously undruggable pathways.

Mostly.

So far, these promising drugs haven't been very useful in getting through to the well-protected brain to treat tumors or other maladies.

Now a multi-institutional team of researchers, led by Costas Arvanitis at the Georgia Institute of Technology and Emory University, has figured out a way: using ultrasound and RNA-loaded nanoparticles to get through the protective blood-brain barrier and deliver potent medicine to brain tumors.

"We're able to make this drug more available to the brain and we're seeing a substantial increase in tumor cell death, which is huge," said Arvanitis, assistant professor in the Wallace H. Coulter Department of Biomedical Engineering (BME) and Georgia Tech's George W. Woodruff School of Mechanical Engineering (ME).

Arvanitis, whose collaborators include researchers and clinicians from Emory's School of Medicine and the University of Cincinnati College of Medicine, is the corresponding author of a new paper published in the journal Science Advances that describes the team's development of a next-generation, tunable delivery system for RNA-based therapy in brain tumors.

"Our results were very positive, but if you think I'm excited, you haven't talked to oncologists - they're 10 times as excited," Arvanitis said.

The roots of this project go back to when he and the paper's lead author, ME grad student Yutong Guo, arrived at Georgia Tech in August 2016.

"From the start, I was very interested in the application of ultrasonics in treating brain disease," said Arvanitis, who linked up with Emory physician Tobey MacDonald, director of the Pediatric Neuro-Oncology Program at the Aflac Cancer and Blood Disorders Center, and one of the paper's co-authors. "Our main question was, can we use ultrasound to deliver drugs to tumors? Because that is a major challenge."

RNA drugs have two major weaknesses: limited circulation time and limited uptake by cells. To overcome these challenges, the drugs are packaged in robust nanocarriers, typically 100 nm in size, to improve their bioavailability. Still, these nanocarriers have typically been too large to penetrate the blood-brain barrier, the tightly-connected and selective endothelial cells surrounding blood vessels in the brain, until now a locked door to RNA drugs.

But now, Arvanitis and his colleagues have found a way to get the drug safely across.

Using mouse models, the team deployed a modified version of ultrasound, the diagnostic imaging technique that uses sound waves to create images of internal body structures, such as tendons, blood vessels, organs and, in the case of pregnant women, babies in utero. The researchers combined this technology with microbubbles -- tiny gas pockets in the bloodstream, designed as vascular contrast agents for imaging -- which vibrate in response to ultrasound waves, changing the permeability of blood vessels.

"Focusing multiple beams of ultrasound energy onto a cancerous spot caused the microbubbles' vibrations to actually stretch, pull, or shear the tight junctions of endothelial tissue that make up the blood-brain barrier, creating an opening for drugs to get through," Guo said.

It's a technique that biomedical ultrasound researchers have been refining for more than a decade, and recent clinical trials have demonstrated its safety. But there hasn't been much evidence for selective and effective delivery of nanoparticles and their payloads directly into brain tumor cells. And even when blood-borne drugs succeed in penetrating the blood-brain barrier, if they are not taken up by the cancer cells, the job isn't complete.

Arvanitis and his team packaged siRNA, a drug that can block the expression of genes that drive tumor growth, in lipid-polymer hybrid nanoparticles, and combined that with the focused ultrasound technique in pediatric and adult preclinical brain cancer models. Using single-cell image analysis, they demonstrated a more than 10-fold improvement in delivery of the drug, reducing harmful protein production and increasing tumor cell death in preclinical models of medulloblastoma, the most common malignant brain tumor in children.

"This is completely tunable," Arvanitis said. "We can fine tune the ultrasound pressure to attain a desired level of vibration and by extension drug delivery. It's non-invasive, because we are applying sound from outside the brain, and it's very localized, because we can focus the ultrasound to a very small region of the brain."

Current standard treatments for brain tumors come with potentially awful side effects, Arvanitis said, "however, this technology can provide treatment with minimal side effects, which is very exciting. Now we are moving forward to try and identify what components are missing to translate this technology to the clinic."

Credit: 
Georgia Institute of Technology

Some meat eaters disgusted by meat

Some meat eaters feel disgusted by meat, according to a new study.

University of Exeter scientists showed food pictures to more than 700 people, including omnivores (who eat meat and other foods), flexitarians (who try to eat less meat) and vegetarians.

About 7% of meat eaters (15% of flexitarians and 3% of omnivores) had a "fairly strong disgust response" to images of meat dishes commonly eaten in the UK, like roast chicken or bacon.

As a group, omnivores rated meat images about twice as disgusting on average as pictures of carbohydrate-rich foods like bread, chips and rice.

Based on the findings, the researchers say harnessing the "yuk factor" may be more effective than relying on willpower for anyone who wants to eat less meat.

"We were surprised to find that so many people are grossed out by meat - even people who eat meat all the time," said Elisa Becker, of the University of Exeter.

"Our results don't explain why these people eat meat, but it's possible that habits, family and cultural traditions all play a part.

"Meat consumption is increasingly seen as unsustainable, unhealthy and unethical, and many people want to eat less meat.

"If you're trying to cut down your meat intake, sheer willpower may not be enough - but harnessing the 'yuk factor' could be the way to go."

The study's 711 participants - 402 omnivores, 203 flexitarians and 106 vegetarians - each completed a survey and took a rapid-response task (measuring instinctive reactions) to test their levels of "meat disgust".
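These group sizes are consistent with the headline figure quoted earlier that about 7% of meat eaters showed a fairly strong disgust response. A quick back-of-envelope check, weighting the subgroup rates (3% of omnivores, 15% of flexitarians) by group size and assuming the reported percentages are exact:

```python
# Group sizes and "fairly strong disgust" rates as reported in the study
omnivores, flexitarians = 402, 203
rate_omni, rate_flex = 0.03, 0.15

# Weighted average over all meat eaters (omnivores + flexitarians)
disgusted = omnivores * rate_omni + flexitarians * rate_flex
overall = disgusted / (omnivores + flexitarians)
print(f"{overall:.1%}")  # about 7% of meat eaters
```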

"Meat liking" was also measured. About 75% of omnivores - and more than 20% of vegetarians - showed a fairly strong liking for meat.

To be classified as having "fairly strong" meat disgust, participants had to rate six meat images closer to "very much" than "not at all" on a sliding scale of disgust, and also had to show evidence of meat disgust on the rapid-response task.

Among flexitarians - the only group attempting to reduce their meat intake - meat disgust was a better predictor than self-control (measured in a separate questionnaire) of reduced meat-eating.

Meat disgust was also associated with reduced intake over the following six months.

"We hope that this information can help us develop new interventions to help people reduce their meat intake," said Professor Natalia Lawrence, of the University of Exeter.

"Not everyone wants to reduce their meat consumption - but for those who do, we are working on computer tasks that might help them harness the power of disgust in a fun way.

"It's important to note that our study does not establish causation - so further research is needed to find out whether meat disgust causes people to eat less meat, or whether avoiding meat allows these negative emotional responses to develop or be expressed."

Becker added: "It's interesting to note that almost all of us experience meat disgust from time to time - for example when we see unfamiliar meats or dishes made from parts of animals we don't usually eat, like squirrel meat or beef heart.

"Humans may have evolved a degree of meat disgust because eating spoilt meat can be much more dangerous than eating a carrot that's a bit off."

Credit: 
University of Exeter

How bullying and obesity can affect girls' and boys' mental health

image: Sofia Kanders, Ph.D. student at the Department of Neuroscience, Uppsala University.

Image: 
Uppsala University

Depressive symptoms are more common in teenage girls than in their male peers. However, boys' mental health appears to be affected more if they suffer from obesity. Irrespective of gender, bullying is a considerably greater risk factor than being overweight for developing depressive symptoms. These conclusions are drawn by researchers at Uppsala University who monitored adolescents for six years in a questionnaire study, now published in the Journal of Public Health.

"The purpose of our study was to investigate the connection between body mass index (BMI) and depressive symptoms, and to take a close look at whether being subjected to bullying affects this relationship over time. We also wanted to investigate whether any gender differences existed," says Sofia Kanders, a PhD student at Uppsala University's Department of Neuroscience.

In the study, young people born in Västmanland County replied to questions about their height, weight and depressive symptoms on three separate occasions (2012, 2015 and 2018). The respondents' mean age was 14.4 years on the first occasion and 19.9 years on the last.

Based on BMI, the adolescents were divided into three groups: those with normal weight, overweight and obesity respectively. They were also grouped according to the extent of their depressive symptoms.

Overall, regardless of their weight, the girls stated more frequently that they had depressive symptoms. In 2012, 17 per cent of the girls and 6 per cent of the boys did so. By 2015, the proportions of adolescents with these symptoms had risen to 32 per cent for the girls and 13 per cent for the boys. The corresponding figures for 2018 were 34 and 19 per cent respectively.

A higher BMI did not, as far as the researchers could see, affect the girls' mental well-being to any great extent. Among the boys, however, the pattern observed was entirely different.

"When we analysed girls and boys separately, we saw that for boys with obesity in 2012, the risk for having depressive symptoms in 2015 was, statistically, five times higher than for normal-weight boys. In the girls we found no such connection," Kanders says.

The study has been unable to answer the question of what causes this gender difference, and the researchers think more research is needed in this area.

The young respondents were also asked about bullying -- for example, to state whether, in the past year, they had been physically exposed to blows and kicks, teased or excluded, subjected to cyberbullying (abusive texting or other electronic or web bullying), or bullied by an adult at school.

In every analysis, exposure to bullying was associated with a higher risk of depressive symptoms. This connection was also evident six years later, especially in overweight boys. The researchers believe that these results seem to indicate a gender difference in how BMI and bullying together drive development of future depressive symptoms.

"One key conclusion and take-home message from our study is that bullying can affect mental illness for a long time to come, which therefore makes preventive measures against bullying in schools extremely important," Kanders says.

Credit: 
Uppsala University

Consumption of pornography is widespread among young Internet users

Nearly four-fifths of 16- and 17-year-olds have encountered pornographic content on the Internet

Pornography is a multibillion-dollar business. Pornographic content is virtually ubiquitous on the Internet, and surveys suggest that 25% of all searches lead to explicit content. Given the size of the market, it's not surprising that young people are drawn to such sites, which are only a couple of clicks away.

Professor Neil Thurman of the Department of Media and Communication (IfKW) at Ludwig-Maximilians-Universitaet (LMU) in Munich, in collaboration with statistician Fabian Obster (Universität der Bundeswehr München), has carried out a study on the use of pornographic sites by young people. Based on a sample of 1,000 British adolescents, the survey also provides pointers for regulators and legislators in Germany.

Overall, 78% of users between the ages of 16 and 17 reported that they had encountered pornography on the Internet. Moreover, many of them stated that they visited dedicated pornographic websites frequently. Those who participated in the survey reported that, on average, they had last visited such a site 6 days prior to filling in the questionnaire, and many said they had watched pornographic videos or viewed picture galleries on that very day. Analysis of the responses indicated that adolescents spent an average of 2 hours per month on commercial pornographic websites, almost always accessed on their smartphones or tablets. The survey also revealed that young consumers are turning to social media portals for access to explicit material. Adolescent users of online pornography are more likely to be male.

Well acquainted with VPNs and the Tor browser

In Germany, the UK, France and Canada, efforts are now underway to regulate access to legal online pornography, and in some cases measures have already been implemented. These include provisions for mandatory age verification prior to the admission of users to such websites. But, according to Thurman's survey, around half of the respondents had used VPNs or the Tor browser. Both tools anonymize connection data, thus allowing country-specific restrictions to be circumvented.

"At present, the online pornography market is highly concentrated. It is dominated by a few global firms. Indeed, only a handful of websites account for the majority of consumption," Thurman says. In the context of measures to protect minors, he therefore suggests that, in addition to country-specific measures, pressure should also be placed on the major global publishers of pornography to encourage them to introduce effective age restrictions in all the markets in which they operate. Furthermore, similar regulations should be applied, as is already happening in the UK, to social-media platforms.

Credit: 
Ludwig-Maximilians-Universität München

Migratory songbirds climb to extreme altitudes during daytime

The great reed warbler normally migrates by night during its month-long migration from northern Europe to Sub-Saharan Africa. However, researchers have now discovered that on the few occasions when it continues to fly during the daytime, it flies at extremely high altitudes (up to 6300 metres). One possible explanation for this unexpected and consistent behaviour could be that the birds want to avoid overheating. The study is published in Science.

Most of the many millions of songbirds that migrate every year between Europe and Africa fly by night and spend the daytime hours resting and eating. Some species, which normally only fly by night, occasionally fly for over 24 consecutive hours to avoid having to stop in inhospitable locations with insufficient access to food, such as deserts and seas. The great reed warbler, Acrocephalus arundinaceus, is such a species. During its month-long migration, it can fly for up to 34 consecutive hours without landing.

A research team from Lund University in Sweden, University of Copenhagen and the Nature Research Centre in Vilnius has used unique data loggers developed at Lund University to study these long migratory flights almost minute by minute. The researchers used the data loggers, weighing no more than 1.2 grams, to track 14 great reed warblers which were captured during the summer at Lake Kvismaren, close to Örebro in Central Sweden.

The data loggers collected data over a full year, thereby providing the researchers with valuable information. As the great reed warblers migrated to and from Africa, the data loggers continuously stored information about altitude, activity, and sunrise and sunset. Altitude was recorded once per hour, and every five minutes the data loggers saved information about whether the birds were flying, looking for food on the ground or resting.
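At those sampling rates, a year of logging yields a substantial record even from a 1.2-gram device. A rough count, assuming the quoted rates and a 365-day year:

```python
DAYS = 365

# One altitude reading per hour, one activity reading every 5 minutes
altitude_samples = DAYS * 24
activity_samples = DAYS * 24 * 12

print(altitude_samples, activity_samples)  # 8760 105120
```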

Based on this information, the researchers calculated that the great reed warblers flew at an average altitude of 2400 metres at night. On the few occasions on which the birds flew for more than one consecutive night, they climbed 3000 metres at dawn to fly at an average altitude of 5400 metres during the day. They remained cruising at that altitude until dusk, when they dived 3000-4000 metres and continued their flight at an altitude of around 2000 metres the following night.

Sissel Sjöberg and Dennis Hasselquist at Lund University led the study, and the results came as a surprise to them and their colleagues. They describe the great reed warblers' behaviour as very consistent, indicating that the birds may have to fly at more than twice their night-time altitude to manage migrating during daylight hours at all.

There is no definitive explanation as yet. However, the research team rules out the previously predominant explanations for choice of altitudes in migratory birds: winds and air temperature.

"At altitudes over 2000 metres, winds and air temperatures do not change merely going from day to night or vice versa", says Sissel Sjöberg.

Nor are there any high mountains along the migration route that could push the birds up to extreme altitudes. Two possible explanations could be that during daytime, the great reed warblers see further from a great height and that they can avoid birds of prey. But the researchers also suggest another explanation: the birds climb to more than double the altitude during the day to avoid becoming overheated when they are exposed to the additional effect of the sun's heat radiation.

"It is colder higher up in the air and when the birds rise to 5400 metres, they reach a layer of air with a temperature of around minus 9 degrees centigrade. That is 22 degrees colder than the altitude at which the birds fly during the night", says Sissel Sjöberg.
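The quoted figures imply a temperature gradient close to the standard atmospheric lapse rate. A quick check using only the numbers in the article (the ~6.5 °C/km reference value is an outside assumption, not from the study):

```python
# Altitudes and temperatures quoted in the article
day_alt, t_day = 5400, -9.0    # metres, °C at daytime cruising altitude
night_alt = 2400               # metres, night-time altitude
t_night = t_day + 22.0         # "22 degrees colder" up high -> about +13 °C at night level

# Implied temperature drop per kilometre of climb
lapse = (t_night - t_day) / ((day_alt - night_alt) / 1000)
print(f"{lapse:.1f} °C/km")  # about 7.3, near the standard ~6.5 °C/km lapse rate
```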

"The wings of migratory songbirds beat several times per second, so they are working extremely hard, which makes their bodies very warm regardless of whether they are flying by day or by night. But if they fly by day, they are also exposed to the heat from the sun's radiation; that is what we suggest they are compensating for when they climb to a much colder layer of air at daytime. It is likely that they would not manage to fly by day without becoming overheated if they did not climb to these extreme altitudes", says Dennis Hasselquist.

The researchers are continuing to study the great reed warblers. In time, they hope to find answers to the question of why the birds behave in this way, which might also help to explain why most birds that migrate to the tropics during the winter fly almost entirely by night, whereas they are active during the day and sleep at night for the rest of the year.

Credit: 
Lund University

Overcoming tab overload

If you are reading this, chances are you have several other tabs open in your browser that you mean to get to eventually.

Internet browser tabs are a major source of friction on the internet. People love them. People hate them. For some users, tabs bring order and efficiency to their web browsing. For others, they spiral out of control, shrinking at the top of the screen as their numbers expand.

A research team at Carnegie Mellon University recently completed the first in-depth study of browser tabs in more than a decade. They found that many people struggle with tab overload, an underlying reason being that while tabs serve a variety of functions, they often do so poorly.

"Browser tabs are sort of the most basic tools that you use on the internet," said Joseph Chee Chang, a postdoctoral fellow in the School of Computer Science's Human Computer Interaction Institute (HCII) and a member of the research team. "Despite being so ubiquitous, we noticed that people were having all sorts of issues with them."

The team will present their paper, "When the Tab Comes Due: Challenges in the Cost Structure of Browser Tab Usage," at the Association for Computing Machinery's Conference on Human Factors in Computing Systems (CHI 2021), May 8-13.

For the study, the team conducted surveys and interviews with people about their tab use. The study details why people kept tabs open, including using them as reminders or fearing they would have to search for the information again. It also looked at why people closed tabs, knowing that tab overload can strain a person's attention and computer resources. About 25% of the participants in one aspect of the study reported that their browser or computer crashed because they had too many tabs open.

The researchers found that people felt invested in the tabs they had open, making it difficult for them to close the tabs even as they started to feel overwhelmed or ashamed by how many they had open.

Tabs first showed up in web browsers in 2001 and haven't changed much since. The internet, however, has. There is about a billion times more information on the web now than there was 20 years ago. Today, one tab could house an email inbox. Another could be used for a music or video player. Articles stashed away to read later could be in other tabs, as could restaurant reviews or information for an upcoming trip. Add in social media sites, news or other pages used for work or play, and it is easy to have a dozen or more tabs or windows open at any given time.

Tabs, it turns out, aren't the best tool for assisting with complex work and life tasks that people perform on the internet. Their simple list structure makes it difficult for users to jump between sets of tasks throughout the day. And despite people using tabs as an external form of memory, they do not capture the rich structure of their thoughts. Researchers found that while users complained about being overwhelmed by the number of tabs they queued up to work on later, they also didn't want to move them out of sight, as they worried about never going back to them.

"People feared that as soon as something went out of sight, it was gone," said Aniket Kittur, a professor in the HCII and head of the research team. "Fear of this black hole effect was so strong that it compelled people to keep tabs open even as the number became unmanageable."

Tab overload also arises from sense-making and decision tasks that require a person to absorb information from many sources, stitch it together and come to a conclusion. For instance, if someone is researching what camera to buy, that person may search several different reviews, how-to guides and shopping sites to compare models.

"Managing this sort of task is really one of the most important aspects of productivity in our lives," Kittur said. "And the number one tool that everyone uses for it is tabs, even though they don't do a good job."

The team believes that today's browsers do not offer a good tool for managing all the information and tasks people head to the internet for. To fix this, they created Skeema, an extension for the Google Chrome browser that reimagines tabs as tasks.

The extension helps users group their tabs into tasks and then organize, prioritize and switch between them. Skeema uses machine learning to make suggestions for grouping open tabs into tasks and supports nested tasks and complex decision-making.
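The grouping idea can be sketched with a toy heuristic. Everything below is illustrative: the similarity measure, threshold, and example titles are assumptions for the sketch, not Skeema's actual machine-learning method.

```python
def tokenize(title):
    """Lowercase a tab title and split it into a set of word tokens."""
    return set(title.lower().split())

def jaccard(a, b):
    """Similarity between two token sets: |intersection| / |union|."""
    return len(a & b) / len(a | b)

def group_tabs(titles, threshold=0.2):
    """Greedily group tab titles into candidate tasks: a tab joins the
    first group whose representative (first) title is similar enough,
    otherwise it starts a new group."""
    groups = []  # each group is a list of titles
    for title in titles:
        tokens = tokenize(title)
        for group in groups:
            if jaccard(tokens, tokenize(group[0])) >= threshold:
                group.append(title)
                break
        else:
            groups.append([title])
    return groups

tabs = [
    "best mirrorless camera 2021 review",
    "mirrorless camera buying guide",
    "cheap flights to lisbon",
    "things to do in lisbon",
]
print(group_tabs(tabs))  # two groups: a camera task and a trip task
```

A real system would use richer signals (page content, browsing history, learned embeddings), but even this word-overlap sketch shows how a flat list of tabs can be folded into a handful of task-shaped piles.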

Users of an early version of the tool significantly reduced the number of tabs and windows they kept open, reported much less stress connected to tabs, and remained more focused on the task at hand. Many of the early beta testers started using the tool daily to manage the tabs and tasks in their lives.

"Our task-centric approach allowed users to manage their browser tabs more efficiently, enabling them to better switch between tasks, reduce tab clutter and create task structures that better reflected their mental models," Chang said. "As our online tasks become increasingly complex, new interfaces and interactions that can merge tab management and task management in a browser will become increasingly important. After 20 years of little innovation, Skeema is a first step toward making tabs work better for users."

Credit: 
Carnegie Mellon University

Losing an only child is more devastating than losing a spouse, according to a study of Chinese parents

Which wound cuts deeper: the loss of an only child or the loss of a spouse? A new study led by researchers at NYU Rory Meyers College of Nursing and Fudan University suggests that Chinese parents find the loss of an only child to be approximately 1.3 times as psychologically distressing as the loss of a spouse. The findings are published in the journal Aging & Mental Health.

Older adults in China rely heavily on family support, particularly from their adult children. Filial piety--the Confucian idea describing a respect for one's parents and responsibility for adult children to care for their parents as they age--is a central value in traditional Chinese culture.

In the 1970s, China introduced a one-child policy to slow population growth, resulting in hundreds of millions of families with only children. While the policy ended in 2016, its consequences will be felt for decades, particularly by families who experience the loss of a child.

"The death of a child has been recognized as one of the most challenging and traumatic events for a parent," said Bei Wu, PhD, Dean's Professor in Global Health at NYU Rory Meyers College of Nursing and co-director of the NYU Aging Incubator, as well as the study's senior author. "Within the cultural context of China, the death of an only child is devastating not only due to the emotional loss, but also the loss of financial and instrumental support that is critical to older adults."

The death of a spouse is also recognized as a distressing life event, forcing older adults to navigate both the emotional loss and the shattering of a married couple's social and economic circumstances. In this study, Wu and her colleagues wanted to examine whether the loss of a spouse had a similar impact on psychological well-being as the loss of an only child, and whether the presence of one mitigated the absence of the other.

The researchers analyzed data from a 2013 survey conducted in Shanghai involving more than 1,100 adults, including 128 parents who lost their only child. The survey evaluated the impact of the loss of a spouse or child on participants' psychological well-being, including depression, loneliness, and life satisfaction.

They found that adults who lost their only child but had a living spouse experienced more psychological distress than those who lost their spouse but had a living child. This effect appeared to be stronger in women than in men.

Losing an only child resulted in 1.37 times the level of loneliness and 1.51 times the level of depression as losing a spouse, and life satisfaction was 1.14 times worse for those who lost an only child vs. their spouse. Adults whose children and spouse were both alive had better psychological well-being than those who experienced loss.

"Our findings demonstrate that the loss of an only child carries more psychological weight than the loss of a spouse in Chinese culture," said Wu.

Wu and her colleagues recommend increasing access to professional mental health services for adults who experience loss, as well as developing culturally relevant interventions to address social isolation and loneliness among older Chinese adults.

Credit: 
New York University

Damage to white matter is linked to worse cognitive outcomes after brain injury

video: This video is a map of edge density, a measure of the density of white matter connections between different brain regions. The color scale goes from dark purple where there are only sparse connections, to a yellowish white where the connections are most densely packed. A University of Iowa study suggests that damage to highly connected regions of white matter is more predictive of cognitive impairment than damage to highly connected gray matter hubs.

Image: 
Justin Reber, University of Iowa

A new University of Iowa study challenges the idea that gray matter (the neurons that form the cerebral cortex) is more important than white matter (the myelin-covered axons that physically connect neuronal regions) when it comes to cognitive health and function. The findings may help neurologists better predict the long-term effects of strokes and other forms of traumatic brain injury.

"The most unexpected aspect of our findings was that damage to gray matter hubs of the brain that are really interconnected with other regions didn't really tell us much about how poorly people would do on cognitive tests after brain damage. On the other hand, people with damage to the densest white matter connections did much worse on those tests," explains Justin Reber, PhD, a UI postdoctoral research fellow in psychology and first author on the study. "This is important because both scientists and clinicians often focus almost exclusively on the role of gray matter. This study is a reminder that connections between brain regions might matter just as much as those regions themselves, if not more so."

The new study, published in PNAS, analyzes brain scans and cognitive function tests from over 500 people with localized areas of brain damage caused by strokes or other forms of brain injury. Looking at the location of the brain damage, also known as lesions, the UI team led by Reber and Aaron Boes, MD, PhD, correlated the level of connectedness of the damaged areas with the level of cognitive disability the patient experienced. The findings suggest that damage to highly connected regions of white matter is more predictive of cognitive impairment than damage to highly connected gray matter hubs.

Network hubs and brain function

Research on cognition often focuses on networks within the brain, and how different network configurations contribute to different aspects of cognition. Various mathematical models have been developed to measure the connectedness of networks and to identify hubs, or highly connected brain regions, that appear to be important in coordinating processing in brain networks.
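The hub idea can be illustrated with a toy example. This is purely schematic: a "hub" here is simply the node with the highest degree (number of connections), whereas the study's actual graph measures (such as edge density) are far more sophisticated.

```python
# Tiny undirected "brain network" as an adjacency map from each
# region to the set of regions it connects to.
adjacency = {
    "A": {"B", "C", "D"},  # A connects to three regions -> the hub
    "B": {"A", "C"},
    "C": {"A", "B"},
    "D": {"A"},
}

# Degree of each node = how many connections it has.
degree = {node: len(neighbors) for node, neighbors in adjacency.items()}

# The hub is the most-connected node.
hub = max(degree, key=degree.get)
print(hub, degree[hub])  # A 3
```

The study's point is that damage landing on a node like "A" (a gray matter hub) turned out to matter less for cognition than damage landing on the densest bundles of edges themselves.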

The UI team used these well accepted mathematical models to identify the location of hubs within both gray and white matter from brain imaging of normal healthy individuals. The researchers then used brain scans from patients with brain lesions to find cases where areas of damage coincided with hubs. Using data from multiple cognitive tests for those patients, they were also able to measure the effect hub damage had on cognitive outcomes. Surprisingly, damage to highly connected gray matter hubs did not have a strong association with poor cognitive outcomes. In contrast, damage to dense white matter hubs was strongly linked to impaired cognition.

"The brain isn't a blank canvas where all regions are equivalent; a small lesion in one region of the brain may have very minimal impact on cognition, whereas another one may have a huge impact. These findings might help us better predict, based on the location of the damage, which patients are at risk for cognitive impairment after stroke or other brain injury," says Boes, UI professor of pediatrics, neurology, and psychiatry, and a member of the Iowa Neuroscience Institute. "It's better to know those things in advance as it gives patients and family members a more realistic prognosis and helps target rehabilitation more effectively."

UI registry is a unique resource for neuroscientists

Importantly, the new findings were based on data from over 500 individual patients, which is a large number compared to previous studies and suggests the findings are robust. The data came from two registries: one from Washington University in St. Louis, which provided data from 102 patients, and the Iowa Neurological Registry based at the UI, which provided data from 402 patients. The Iowa registry is over 40 years old and is one of the best-characterized patient registries in the world, with close to 1,000 subjects whose cognition has been characterized through hours of paper-and-pencil neuropsychological tests, along with detailed brain imaging to map brain lesions. The registry is directed by Daniel Tranel, PhD, UI professor of neurology, and one of the study authors.

Reber notes that the study also illustrates the value of working with clinical patients as well as healthy individuals in terms of understanding relationships between brain structure and function.

"There is a lot of really excellent research using functional brain imaging with healthy participants or computer simulations that tell us that these gray matter hubs are critical to how the brain works, and that you can use them to predict how well healthy people will perform on cognitive tests. But when we look at how strokes and other brain damage actually affect people, it turns out that you can predict much more from damage to white matter," he says. "Research with people who have survived strokes or other brain damage is messy, complicated, and absolutely essential, because it builds a bridge between basic scientific theory and clinical practice, and it can improve both.

"I cannot stress enough how grateful we are that these patients have volunteered their time to help us; without them, a lot of important research would be impossible," he adds.

Credit: 
University of Iowa Health Care

Can federated learning save the world?

Training the artificial intelligence models that underpin web search engines, power smart assistants and enable driverless cars consumes megawatts of energy and generates worrying carbon dioxide emissions. But new ways of training these models are proving to be greener.

Artificial intelligence models are used increasingly widely in today's world. Many carry out natural language processing tasks - such as language translation, predictive text and email spam filters. They are also used to empower smart assistants such as Siri and Alexa to 'talk' to us, and to operate driverless cars.

But to function well these models have to be trained on large sets of data, a process that includes carrying out many mathematical operations for every piece of data they are fed. And the data sets they are being trained on are getting ever larger: one recent natural language processing model was trained on a data set of 40 billion words.

As a result, the energy consumed by the training process is soaring. Most AI models are trained on specialised hardware in large data centres. According to a recent paper in the journal Science, the total amount of energy consumed by data centres made up about 1% of global energy use over the past decade - roughly the consumption of 18 million US homes. And in 2019, a group of researchers at the University of Massachusetts estimated that training one large AI model used in natural language processing could generate around the same amount of CO2 emissions as five cars would generate over their total lifetime.

Concerned by this, researchers at the University of Cambridge set out to investigate more energy-efficient approaches to training AI models. Working with collaborators at the University of Oxford, University College London, and Avignon Université, they explored the environmental impact of a different form of training - called federated learning - and discovered that it had a significantly greener impact.

Instead of training the models in data centres, federated learning involves training models across a large number of individual machines. The researchers found that this can lead to lower carbon emissions than traditional learning.

Senior Lecturer Dr Nic Lane explains how it works when the training is performed not inside large data centres but over thousands of mobile devices - such as smartphones - where the data is usually collected by the phone users themselves.

"An example of an application currently using federated learning is the next-word prediction in mobile phones," he says. "Each smartphone trains a local model to predict which word the user will type next, based on their previous text messages. Once trained, these local models are then sent to a server. There, they are aggregated into a final model that will then be sent back to all users."

And this method has important privacy benefits as well as environmental benefits, points out Dr Pedro Porto Buarque De Gusmao, a postdoctoral researcher working with Dr Lane.

"Users might not want to share the content of their texts with a third party," he explains. "In federated learning, we can keep data local and use the collective power of millions of mobile devices together to train AI models without users' raw data ever leaving the phone."

"And besides these privacy-related gains," says Dr Lane, "in our recent research, we have shown that federated learning can also have a positive impact in reducing carbon emissions.

"Although smartphones have much less processing power than the hardware accelerators used in data centres, they don't require as much cooling power as the accelerators do. That's the benefit of distributing the training of models across a wide pool of devices."

The researchers recently co-authored a paper on this called 'Can Federated Learning save the planet?' and will be discussing their findings at an international research conference, the Flower Summit 2021, on 11 May.

In their paper, they offer the first-ever systematic study of the carbon footprint of federated learning. They measured the carbon footprint of a federated learning setup by training two models -- one in image classification, the other in speech recognition -- using a server and two chipsets popular in the simple devices targeted by federated methods. They recorded the energy consumption during training, and how it might vary depending on where in the world the chipsets and server were located.

They found that while there was a difference between CO2 emission factors among countries, federated learning under many common application settings was reliably 'cleaner' than centralised training.

Training a model to classify images in a large image dataset, they found any federated learning setup in France emitted less CO2 than any centralised setup in both China and the US. And in training the speech recognition model, federated learning was more efficient than centralised training in any country.

Such results are further supported by an expanded set of experiments in a follow-up study ('A first look into the carbon footprint of federated learning') by the same lab that explores an even wider variety of data sets and AI models. And this research also provides the beginnings of necessary formalism and algorithmic foundation of even lower carbon emissions for federated learning in the future.

Based on their research, the researchers have made available a first-of-its-kind 'Federated Learning Carbon Calculator' so that the public and other researchers can estimate how much CO2 is produced by any given pool of devices. It allows users to detail the number and type of devices they are using, which country they are in, which datasets and upload/download speeds they are using and the number of times each device will train on its own data before sending its model for aggregation.
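The kind of estimate such a calculator produces can be approximated with a simple energy-times-carbon-intensity model. All of the numbers and the formula below are illustrative assumptions, not the calculator's actual methodology.

```python
def federated_co2_kg(num_devices, device_power_w, hours_per_round,
                     rounds, carbon_intensity_kg_per_kwh,
                     comm_kwh_per_round=0.0):
    """Rough CO2 estimate for a federated training run:
    compute energy (kWh) across all devices and rounds, plus an
    optional per-round communication-energy term, multiplied by
    the carbon intensity of the local electricity grid."""
    compute_kwh = num_devices * device_power_w / 1000 * hours_per_round * rounds
    comm_kwh = comm_kwh_per_round * rounds
    return (compute_kwh + comm_kwh) * carbon_intensity_kg_per_kwh

# Example: 1,000 phones drawing 3 W for 0.1 h per round, over
# 100 rounds, on a low-carbon grid (~0.06 kg CO2/kWh, an
# illustrative figure in the range of France's grid):
print(federated_co2_kg(1000, 3.0, 0.1, 100, 0.06))  # 1.8 kg CO2
```

Plugging in a fossil-heavy grid's intensity instead immediately shows why the researchers found such large differences between countries.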

They also offer a similar calculator for estimating the carbon emissions of centralised machine learning.

"The development and usage of AI is playing an increasing role in the tragedy that is climate change," says Dr Lane, "and this problem will only worsen as this technology continues to proliferate through society. We urgently need to address this which is why we are keen to share our findings showing that federated learning methods can produce less CO2 than data centres under important application scenarios.

"But even more importantly, our research also shines a light as to how federated learning should evolve towards being even more broadly environmentally friendly. Decentralized methods like this will be key in the invention of future sustainable forms of AI in the years ahead."

Credit: 
University of Cambridge

Researchers produce laser pulses with record-breaking intensity

image: Researchers created high-intensity pulses using the petawatt laser (pictured) at the Center for Relativistic Laser Science (CoReLS) in the Republic of Korea. This high intensity laser will allow scientists to examine astrophysical phenomena such as electron-photon and photon-photon scattering in the lab.

Image: 
Chang Hee Nam, CoReLS

WASHINGTON -- Researchers have demonstrated a record-high laser pulse intensity of over 10²³ W/cm² using the petawatt laser at the Center for Relativistic Laser Science (CoReLS), Institute for Basic Science in the Republic of Korea. It took more than a decade to reach this laser intensity, which is ten times that reported by a team at the University of Michigan in 2004. These ultrahigh intensity light pulses will enable exploration of complex interactions between light and matter in ways not possible before.

The powerful laser can be used to examine phenomena believed to be responsible for high-energy cosmic rays, which have energies of more than a quadrillion (10¹⁵) electronvolts (eV). Although scientists know that these rays originate from somewhere outside our solar system, how they are made and what is forming them has been a longstanding mystery.

"This high intensity laser will allow us to examine astrophysical phenomena such as electron-photon and photon-photon scattering in the lab," said Chang Hee Nam, director of CoReLS and professor at Gwangju Institute of Science & Technology. "We can use it to experimentally test and access theoretical ideas, some of which were first proposed almost a century ago."

In Optica, The Optical Society's (OSA) journal for high impact research, the researchers report the results of years of work to increase the intensity of laser pulses from the CoReLS laser. Studying laser-matter interactions requires a tightly focused laser beam, and the researchers were able to focus the laser pulses to a spot size of just over one micron, less than one fiftieth the diameter of a human hair. The new record-breaking laser intensity is comparable to focusing all the light reaching Earth from the sun onto a spot of 10 microns.
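The sun-to-spot comparison can be checked with a rough back-of-envelope calculation, using round values assumed here rather than taken from the paper: a solar constant of about 1.36 kW/m², an Earth radius of about 6.37 × 10⁶ m, and a 5 μm spot radius.

```latex
% Total sunlight intercepted by Earth's cross-section:
P = S\,\pi R_\oplus^2
  \approx (1.36\times10^{3}\,\mathrm{W/m^2}) \times \pi \times (6.37\times10^{6}\,\mathrm{m})^2
  \approx 1.7\times10^{17}\,\mathrm{W}

% Intensity if focused onto a spot of radius r = 5\,\mu\mathrm{m}:
I = \frac{P}{\pi r^2}
  \approx \frac{1.7\times10^{17}\,\mathrm{W}}{\pi\,(5\times10^{-6}\,\mathrm{m})^2}
  \approx 2\times10^{27}\,\mathrm{W/m^2}
  = 2\times10^{23}\,\mathrm{W/cm^2}
```

The result lands at roughly 10²³ W/cm², consistent with the record intensity reported.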

"This high intensity laser will let us tackle new and challenging science, especially strong field quantum electrodynamics, which has been mainly dealt with by theoreticians," said Nam. "In addition to helping us better understand astrophysical phenomena, it could also provide the information necessary to develop new sources for a type of radiation treatment that uses high-energy protons to treat cancer."

Making pulses more intense

The new accomplishment extends previous work in which the researchers demonstrated a femtosecond laser system, based on Ti:Sapphire, that produces 4 petawatt (PW) pulses with durations of less than 20 femtoseconds while focused to a 1 micrometer spot. This laser, which was reported in 2017, produced a power roughly 1,000 times larger than all the electrical power on Earth in a laser pulse that only lasts twenty quadrillionths of a second.

To produce high-intensity laser pulses on target, the generated optical pulses must be focused extremely tightly. In this new work, the researchers applied an adaptive optics system to precisely compensate for optical distortions. The system uses deformable mirrors -- whose reflective surface shape can be controlled -- to correct distortions in the laser and generate a beam with a very well-controlled wavefront. The researchers then used a large off-axis parabolic mirror to achieve an extremely tight focus, a process that requires delicate handling of the focusing optical system.

"Our years of experience gained while developing ultrahigh power lasers allowed us to accomplish the formidable task of focusing the PW laser with the beam size of 28 cm to a micrometer spot to accomplish a laser intensity exceeding 1023 W/cm2," said Nam.

Studying high-energy processes

The researchers are using these high-intensity pulses to produce electrons with an energy over 1 GeV (10⁹ eV) and to work in the nonlinear regime in which one electron collides with several hundred laser photons at once. This process is a type of strong field quantum electrodynamics called nonlinear Compton scattering, which is thought to contribute to the generation of extremely energetic cosmic rays.

They will also use the radiation pressure created by the ultrahigh intensity laser to accelerate protons. Understanding how this process occurs could help develop a new laser-based proton source for cancer treatments. Sources used in today's radiation treatments are generated using an accelerator that requires a huge radiation shield. A laser-driven proton source is expected to reduce the system cost, making the proton oncology machine less costly and thus more widely accessible to patients.

The researchers continue to develop new ideas for enhancing the laser intensity even further without significantly increasing the size of the laser system. One way to accomplish this would be to find a new way to reduce the laser pulse duration. As lasers with peak powers ranging from 1 to 10 PW are now in operation and several facilities reaching 100 PW are being planned, there is no doubt that high-intensity physics will progress tremendously in the near future.

Credit: 
Optica

Slender-snouted Besanosaurus was an 8 m long marine snapper

image: The skull of the type specimen of Besanosaurus leptorhynchus is characterized by extreme longirostry (i.e., thin elongate snout), and equipped with tiny pointed teeth, perfect for catching small fish and extinct cousins of squids with rapid snapping moves of the head and jaws.

Image: 
Gabriele Bindellini and Marco Auditore, © Museo di Storia Naturale di Milano.

Middle Triassic ichthyosaurs are rare, and mostly small in size. The new Besanosaurus specimens described in the peer-reviewed journal PeerJ – the Journal of Life and Environmental Sciences – by Italian, Swiss, Dutch and Polish paleontologists provide new information on the anatomy of this fish-like ancient reptile, revealing its diet and exceptionally large adult size: up to 8 meters, a real record among all marine predators of this geological epoch. In fact, Besanosaurus is the earliest large-sized marine diapsid - the group to which lizards, snakes, crocodiles, and their extinct cousins belong - with a long and narrow snout.

Besanosaurus leptorhynchus was originally discovered near Besano (Italy) three decades ago, during systematic excavations led by the Natural History Museum of Milan. The PeerJ article re-examines its skull bones in detail and assigns five additional fossils to this species: two previously undescribed fossil specimens, and two fossils previously referred to a different species (Mikadocephalus gracilirostris), which turns out to be invalid owing to the lack of significant anatomical differences from Besanosaurus.

The six specimens vary mainly in size and likely represent different growth stages. According to this re-analysis, Besanosaurus is the oldest and basal-most representative of a group of ichthyosaurs known as shastasaurids.

All specimens, housed in museums in Milan, Zurich, and Tübingen were collected in the last century from the bituminous black shales of the Monte San Giorgio area (Italy/Switzerland, UNESCO World Heritage), which were deposited some 240 million years ago at the oxygen-depleted bottom of a peculiar marine basin. The locality is famous worldwide for its rich fossil fauna that, besides ichthyosaurs, includes many other marine and semiaquatic reptiles, a variety of fish, and hard-shelled invertebrates.

“The extremely long and slender rostrum suggests that Besanosaurus primarily fed on small and elusive prey, feeding lower in the food web than an apex predator: a novel ecological specialisation never reported before this epoch of the Triassic in a large diapsid reptile. This might have triggered an increase of body size and lowered competition among the diverse ichthyosaurs that co-existed in this part of the Tethys Ocean”, says Gabriele Bindellini of the Earth Science Dept. of Milan University, first author of this study.

“Studying these fossils was a real challenge. All Besanosaurus specimens have been extremely compressed by deep time and rock pressure, so we used advanced medical CT scanning, photogrammetry techniques and comparisons with other ichthyosaurs to reveal their hidden anatomy and reconstruct their skulls in 3D, bone by bone”, remarks Cristiano Dal Sasso of the Natural History Museum of Milan, senior author of the PeerJ article, who in 1996 originally described and named Besanosaurus.

Interestingly, the Italian researchers started re-studying the Milan Besanosaurus roughly at the same time an international team including Andrzej Wolniewicz (IP PAS, Warsaw), Feiko Miedema (SMNS, Stuttgart), and Torsten Scheyer (UZH, Zurich) started working on the Swiss specimens. “Rather than doing parallel studies, we pooled our data and efforts and pulled on the same string, to enhance our understanding of these fascinating extinct animals”, adds Torsten Scheyer.

Credit: 
PeerJ

Men with chest pain receive faster, more extensive medical attention than women

Among younger adults visiting the emergency department for chest pain, women may be getting the short end of the stick. Compared with men of similar age, women were triaged less urgently, waited longer to be seen, and were less likely to undergo basic tests or be hospitalized or admitted for observation to diagnose a heart attack, according to new research being presented at the American College of Cardiology's 70th Annual Scientific Session.

The study is the first to examine emergency room management of chest pain specifically among younger adults (age 18-55 years). Heart disease is the leading cause of death in women and is becoming more common in younger adults. About one-third of women who were hospitalized for a heart attack in the past two decades were under the age of 55, a proportion that has grown in recent years.

"Women should trust their instincts," said Darcy Banco, MD, an internal medicine resident at NYU Langone Health and the study's lead author. "Women should seek care right away if they experience new chest discomfort, difficulty breathing, nausea, vomiting, fatigue, sweating or back pain, as these could all be signs of a heart attack. The most important thing a woman can do is to seek medical care if she is worried and to ask specific questions of her doctor."

Chest discomfort is the most common symptom of a heart attack in both men and women, but research shows that women can have a broader range of accompanying symptoms that may not initially be recognized as a sign of a heart attack. Chest discomfort caused by a heart attack can be perceived as pain, pressure, tightness or another uncomfortable sensation.

The study is based on data collected by the National Hospital Ambulatory Medical Care Survey between 2014 and 2018. Researchers extrapolated the data to represent an estimated 29 million emergency department visits for chest pain in the U.S. among adults aged 18-55; women comprised nearly 57% of those visits.

Researchers found that women reporting chest pain were equally likely to arrive at the hospital by ambulance but significantly less likely than men to be triaged as emergent. On average, women waited about 11 minutes longer to be evaluated by a clinician. Women were also significantly less likely to undergo an electrocardiogram (EKG), the standard initial test used to diagnose a heart attack, or to receive cardiac monitoring or be seen by a consultant, such as a cardiologist.

Medical guidelines recommend that all patients with possible heart attack symptoms receive an EKG within 10 minutes of arrival in the emergency department to minimize the time to treatment.

"Time is very important when you're treating heart attacks," Banco said. "The longer people wait, the worse their outcomes can be."

The study did not examine the reasons why women with chest pain were treated differently than men. Banco suggested that pre-conceived notions of risk--rather than overt discrimination--likely play a role. Historically, heart attacks have been most common in older men, and clinicians may be less likely to suspect a heart attack among patients outside of that demographic. Banco suggested clinicians should appreciate that younger women represent a growing portion of heart attack patients.

"We, as health care providers, should continue to learn about how best to triage and diagnose patients with heart attacks, particularly among those who have historically been under-diagnosed or under-treated," Banco said. "We are learning that heart attacks take many forms. We need to continue to raise awareness and make sure all patients are diagnosed and treated properly, even if they're not the 'classic' demographic for a heart attack. [This knowledge] will help us improve care for all."

Credit: 
American College of Cardiology

High-risk, disadvantaged groups face barriers to preventing spread of COVID-19

Social factors such as education, financial stability, food security and the neighborhood where someone resides were strongly correlated with whether or not individuals with heart disease adopted measures to prevent the spread of COVID-19, including wearing masks and working from home, according to a study presented at the American College of Cardiology's 70th Annual Scientific Session. The researchers say the findings draw attention to longstanding challenges related to social determinants of health.

"Unless we look at COVID-19 through the lens of social determinants of health, we may not optimize our yield from interventions, and we might not be reaching the group of individuals who need these interventions the most," said Kobina Hagan, MD, a postdoctoral fellow at Houston Methodist Research Institute and the study's lead author.

The research is based on data from the COVID-19 Household Impact Survey, which assessed COVID-19 preventive strategies along with health and sociodemographic factors among more than 25,000 U.S. adults. The researchers analyzed responses from just over 2,000 survey respondents who reported a history of heart disease, heart attack or stroke. According to the U.S. Centers for Disease Control and Prevention, heart disease increases the risk of severe illness from COVID-19. The researchers grouped survey respondents into quartiles reflecting social risk factors based on income and financial security, employment, education, health insurance status, food insecurity and neighborhood quality.

The largest gap was seen in the degree to which respondents reported having flexibility in their work arrangements, defined as being able to work from home or to cancel or postpone work activities. Compared with those with the most favorable social risk profile, those with the greatest social adversity were 46% less likely to report flexibility at work. Individuals with the greatest social adversity were also 31% less likely to engage in all social distancing measures (canceling or postponing social activities and avoiding crowded public places, restaurants and contact with high-risk people) and 17% less likely to engage in all personal protective measures (wearing a face mask, washing hands and keeping a 6-foot distance from those outside their household). These differences remained significant even after accounting for demographics and comorbidities.

The study was conducted before COVID-19 vaccines were available, at a time when public health experts primarily recommended social distancing and other measures to slow the disease's spread. However, similar trends are likely at play in the context of vaccine access, researchers said.

"I think we are repeating the same mistakes and expecting better results," Hagan said. "I hope we can come to a point where we as a society take social disparities into account in our decision making. We have to prioritize these socially disadvantaged individuals in our public policy programs, including vaccine delivery, to reduce the disparities in COVID-19 risk and outcomes."

Although the survey did not assess why respondents did or did not adopt COVID-19 mitigation measures, Hagan said many of the factors reflect circumstances that are often outside a person's control, such as the ability to work from home. For measures that involve more personal choice, he suggested that low health literacy may play a role in the lower uptake among disadvantaged groups.

"We as a society have ignored all the disparities and inequities that were happening during calmer times, even in cardiovascular disease," Hagan said. "2020 was a time when we could no longer ignore the disparities. We need to focus on holistic strategies to effectively fight this pandemic and ensure those not afforded the privilege of personal protection, social distancing and work flexibilities are prioritized with vaccine outreach to avoid further compounding existing health inequalities."

Credit: 
American College of Cardiology