Culture

Artificial intelligence in art: a simple tool or creative genius?

image: In October 2018, a work of art titled Edmond de Belamy, which was created with the help of an intelligent algorithm, was auctioned for 432,500 USD at Christie's Auction House.

Image: 
Obvious (collective), Public Domain

Intelligent algorithms are used to create paintings, write poems, and compose music. According to a study by an international team of researchers from the Massachusetts Institute of Technology (MIT) and the Center for Humans and Machines at the Max Planck Institute for Human Development, whether people perceive artificial intelligence (AI) as the ingenious creator of art or simply another tool used by artists depends on how information about AI art is presented. The results were published in the journal iScience.

In October 2018, a work of art titled Edmond de Belamy, which was created with the help of an intelligent algorithm, was auctioned for 432,500 USD at Christie's Auction House. According to Christie's auction advertisement, the portrait was created by artificial intelligence (AI). The media often described this as the first work of art not created by a human but rather autonomously by a machine. The proceeds were not given to the machine but instead to the French artists' collective Obvious. This collective had fed an algorithm with pictures of real paintings by human painters and trained it to create images autonomously. They then selected a certain picture, printed it, gave it a name, and marketed it. However, the programmers who developed the artificial neural networks and algorithms used were not mentioned, nor did they receive any of the proceeds from the sale of the painting.

"Many people are involved in AI art: artists, curators and programmers alike. At the same time, there is a tendency - especially in the media - to endow AI with humanlike characteristics. According to the reports you read, creative AI autonomously creates ingenious works of art. We wanted to know whether there is a connection between this humanization of AI and the question of who gets credit for AI art", Ziv Epstein, PhD student at the MIT Media Lab and first author of the study, explained.

To this end, the researchers informed almost 600 participants about how AI art is created and asked who should receive recognition for the work of art. At the same time, they determined the extent to which each participant humanizes AIs. The individual answers varied greatly. But on average, people who humanized AI and did not perceive it merely as a tool also felt that AI should receive recognition for the AI art and not the people involved in the creation process.

When asked which people deserve the most recognition in the process of creating AI art, respondents initially credited the artists who provided the learning algorithms with data and trained them. Only then were curators named, followed by technicians who programmed the algorithms. And finally, the "crowd" (i.e. the mass of Internet users who produce the data material with which AIs are often trained) was mentioned. Respondents who humanized the AI gave more recognition to the technicians and the crowd, but proportionally less to the artists. A similar picture emerged when respondents were asked who is responsible, for example, when an AI artwork violates copyright. Here, too, those who humanized the AIs placed more responsibility on the AIs.

A key finding of the study is that whether people humanize AIs can be actively manipulated by changing the language used to report on AI systems in art. The creative process can be described as one in which the AI, supported only by an artistic collaborator, conceives and creates new works of art. Alternatively, it can be described as one in which an artist conceives the artwork and the AI merely executes simple commands given by the artist. The different descriptions changed the degree of humanization, and with it which of the human actors the participants credited and held responsible for AI art.

"Because AI is increasingly penetrating our society, we will have to pay more attention to who is responsible for what is created with AI. In the end, there are humans behind every AI. This is particularly relevant when the AI malfunctions and causes damage - for example, in an accident involving an autonomous vehicle. It is therefore important to understand that language influences our view of AI and that a humanization of AI leads to problems in assigning responsibility", says Iyad Rahwan, director of the Center for Humans and Machine at the Max Planck Institute for Human Development and co-author of the study.

Credit: 
Max Planck Institute for Human Development

A first in-depth look at the latent virus reservoir of individuals living with HIV

image: Nadia Roan (left), Xiaoyu Luo (center), and Jason Neidleman (right) mapped out an atlas of latent reservoir cells that could open new avenues for studying or targeting the main barrier to an HIV cure.

Image: 
Gladstone Institutes

SAN FRANCISCO, CA--September 29, 2020--The latent reservoir is the last bastion of HIV's resistance to a cure. But it is difficult to destroy because it is invisible: the cells in the reservoir harbor virus that is dormant, so they don't have any viral proteins on their surface that would give them away.

As a result, scientists have struggled to learn what the reservoir looks like in individuals with HIV. And without this knowledge, they harbor little hope of being able to target the reservoir with therapies that could eliminate or reduce it, thus ridding people of HIV infection for good.

To fish out reservoir cells, scientists have to reawaken the virus by activating cells they collect from infected individuals. Once awake, the virus produces proteins that mark the surface of its host cells, which gives researchers a handle to find and study these cells. However, the very process of reactivating the virus leads to changes to the cells' biology that obscure their original identity. And so, the true identity of the cells making up the latent reservoir--also called latent cells--has remained elusive.

To overcome this problem, Gladstone Scientist Nadia Roan, PhD, has taken advantage of an approach she developed previously to backtrack reactivated reservoir cells to their original latent state. With this approach, Roan and her team have mapped out an atlas of the reservoir cells of eight individuals living with HIV, which they recently reported in the journal eLife.

"Our findings challenge some previously held assumptions about the makeup of the reservoir," says Roan, who is also an associate professor of urology at UC San Francisco. "In addition, our detailed map of reservoir cells will make it easier to find these cells in infected individuals, which will fundamentally change how the latent reservoir can be studied."

Method to the Madness

Previous investigations suggest that the reservoir consists in large part of memory T cells, a subset of cells in the immune system that retain the memory of past infections. These cells can remain for a long time in the body in a quiescent state, waiting for a new infection by a previously encountered virus to wake them up. That makes them the perfect hiding place for HIV.

But memory T cells come in many types, and a commonly held view among scientists is that the latent reservoir consists of a random assortment of memory T cells rather than a specific subset.

"If so, targeting latently infected cells as a curative strategy would be all the more difficult," says Roan.

Most previous studies of the reservoir have relied on the examination of only a few proteins on the surface of cells. With Roan's approach, however, her team can follow nearly 40 proteins at once, which greatly increases their ability to distinguish even closely related cells. The researchers can also compare populations of cells before and after reactivation, thus matching each reactivated cell to the pre-activation cell that resembles it most, as if going back in time.

"It's somewhat like implementing facial recognition technology on cells," says Roan. "You can think of it as having a photo of someone in their fifties, and trying to identify them in their high school yearbook. Although individuals' looks change as they age, you can typically still recognize them by looking for a combination of their traits. Similarly, latent cells change as they are reactivated, but they still retain some of their original identity in a way that we can capture by tracking 40 proteins at once."

Roan's team carried out this analysis on millions of cells collected from eight individuals under antiretroviral therapy. The cells came from the donors' blood and gut, the latter of which is thought to be a primary site of viral persistence in individuals on this treatment. The scientists also obtained cells from the lymph nodes of one donor.

"Reservoir cells reside in the blood but also in various tissues in the body," says Jason Neidleman, a senior research associate in Roan's lab and co-first author of the study. "We wanted to know how reservoir cells in the blood compare to those from other sites."

The team first built an atlas of the CD4+ T cells (the type of T cells HIV can infect) in each infected donor, based on the assortment of the 40 proteins these cells contained. Then, they mapped each reactivated cell from each individual against the corresponding atlas to find the most similar atlas cell. This most similar cell is thought to represent the original state of the latently infected cell, before it reawakened from latency.

"Somewhat to our surprise, we found that the blood reservoir is not randomly distributed among memory T cells," says Xiaoyu Luo, PhD, a scientist at Gladstone and co-first author of the study. "Instead, reservoir cells in the blood samples map to a few distinct areas on the atlas. What's more, reservoir cells from different donors mapped near one another, indicate that they share common features."

When comparing the blood, lymph node, and gut samples, the team found important differences between blood and tissue cells, but also some shared markers, in particular between gut and lymph node cells.

"The existence of shared features across people and tissue types give us hope that we can one day design therapies that target large fractions of the reservoir at once and will work for many infected individuals," says Roan.

Toward a Deeper Understanding of the Reservoir

For now, the team is keen to use their findings to learn more about the reservoir.

"One of the problems that has galled the field is that most reservoir cells harbor defective versions of the HIV genome," says Roan. "These cells do not constitute the most clinically relevant reservoir of HIV, because even after they are reactivated, they do not produce infectious virus."

This situation makes it difficult to home in on the reservoir cells that truly matter--those with a viral genome competent for replication and infection--which can represent as little as 1 percent of the reservoir population.

However, Roan and her team found that when they used the shared markers they identified to extract cells from donor samples, they could obtain populations of cells where more than 50 percent of the reservoir cells contained intact viral genomes.

"These results suggest that intact and defective virus are kept in different subsets of cells," says Roan. "And now that we can more readily identify the reservoir cells capable of producing infectious virus, we can begin to elucidate how these cells persist in an infected individual over time."

Another problem of infectious reservoir cells is that they are very rare to begin with--perhaps as few as one in a million CD4+ T cells--which makes it all the more difficult to obtain enough replication-competent reservoir cells to carry out experiments. The markers Roan's team identified alleviate this problem by allowing researchers to increase the proportion of infectious reservoir cells in a donor's sample by 100-fold or more.

"By increasing our access to infectious reservoir cells this much, we open up the possibility of conducting a variety of previously impossible experiments that could greatly refine our understanding of reservoir cells," says Roan. "In particular, it might allow for the discovery of unanticipated--and perhaps even unique--markers of latent cells, which could speed up the design of new therapies for HIV eradication."

Credit: 
Gladstone Institutes

How the Humboldt squid's genetic past and present can secure its future

image: A study of the Humboldt squid's genetic stocks led by Hiroshima University marine biologists and in collaboration with researchers from Peru found that there is no north-south divide for this cephalopod's population. As warming waters affect their migration routes and push these large cephalopods to stretch toward the poles, they risk exposing themselves to more fishery fleets that are trying to satisfy the growing global appetite for squids. The researchers are calling for more international cooperation among governments along this squid's migration route to ensure sustainable fishing.

Image: 
Dr. Mitsuo Sakai

A group of marine biologists is pushing for more international collaboration to manage the Humboldt squid population after their study to identify its genetic stocks revealed its vulnerability to overfishing by fleets trying to feed the world’s hunger for squids.

Hiroshima University marine biologist Gustavo Sanchez led a team of researchers to find out the genetic structure of the Humboldt squid population in the Eastern Pacific Ocean using two types of DNA markers — the mitochondrial ND2 gene and nuclear microsatellite loci.

The team found that Humboldt squids could trace their population back to three historical matrilineages that spread out during the late Pleistocene, and that the species has at least two contemporary genetic stocks homogeneously co-distributed in the northern and southern hemispheres.

Different genetic stocks within a species are usually defined by where they feed and breed. But in Humboldt squids, DNA markers showed no north-south divide. The equator does not serve as a natural barrier separating the genetic stocks of these fast swimmers, which risk capture by different fishery fleets along their migration route.

“In our study, we identified at least two genetic stocks co-distributed in the northern and southern hemispheres of the Eastern Pacific Ocean. Our results suggest that rather than independent marine policies from each country, the sustainability of this squid requires an international marine policy,” Sanchez said.

To ensure sustainable fishing, countries in South America where the squid is traditionally found have established yearly catch quotas. But the study found this approach to be ineffective, especially as catch restrictions are absent in international waters on the squid’s migration path.

“Countries fishing this squid have established catch quotas with no consideration that the total amount varies from year to year, and that the amount of squid caught influences the number of squids next year. By doing so, the genetic contribution of the offspring every year will also clearly fluctuate. In such a situation, there is a risk of having a genetic erosion with a smaller number of squids which are also less likely to adapt rapidly to the changing environment,” he remarked.

“From our study, it is also clear that the squids caught by different countries belong to at least two different populations, with likely different genetic contributions to the next generation. Catching these squids without knowing that their genetic contributions differ is also very risky.”

A grim warning

Both warm tropical waters and the cooler Humboldt current, which runs from Tierra del Fuego at the southernmost tip of the South American mainland upwards to the northern coast of Peru, play a role in the Humboldt squid’s life cycle.

The squid seeks warm waters near the equator to spawn its clusters of neutrally buoyant eggs. But it needs nutrient-rich cool waters, where the squids go on a feeding frenzy, growing from one-millimeter paralarval specks into enormous predators over 1.2 meters long.

These squids typically spawn only once during their one-year lifespan then die, making their future volatile if fishing goes unchecked. And such fears are not farfetched. 

Its western Pacific relative, the Japanese flying squid, has already suffered this fate. Years of overfishing, poor regulatory oversight, and the changing climate have depleted its population at such an alarming rate that the yearly catch of Japanese fishermen dropped over 70%, from more than 200,000 tons in 2011 to 53,000 tons in 2017. The shortage worries the fishing town of Hakodate, whose identity and economy are intertwined with the squid.

“The population of the Japanese flying squids has decreased, and this is because along the distribution of this squid you have a lot of fleets from Japan, China, Korea, and Taiwan, some with high capacity for catching this squid. Countries like China with massive distant-water fishing fleets can move anywhere outside their national jurisdiction to catch this squid. If you have the technology you can go to international waters and catch anything,” Sanchez said.


He said Hakodate’s experience could be a grim warning of things to come for his home country, Peru.

“The Humboldt squid is the second most important economical species in Peru. That means that when we have less squid, that will affect also the economy of the country, particularly the economy of the fisherman that depends on this squid,” he said.

Historical clues

Over 90 percent of the warming on Earth in the past 50 years has happened in the ocean, and the rate at which it is heating up is accelerating. Warming oceans due to climate change have driven sea creatures toward the poles.

The Humboldt squid population itself has expanded its migratory path. It recently stretched its route farther north toward Alaska and south to the tip of Chile, exposing these cephalopods, which hunt in packs of up to 1,200, to fishing boats in each territory along the way, as well as to technologically advanced vessels waiting in international waters.

Sanchez’s team found a similar pattern of historical population expansion under extreme climate conditions when they looked at the mitochondrial DNA of the squid. They found that warming global temperatures 30,000 years ago, which thawed Ice Age glaciers, contributed to a sea-level rise favorable for the Humboldt squid population to spread out. The event, which coincided with a decrease in the population of sperm whales, their natural predators, led to a population expansion for the squids.

Although the squid is quick to adapt, warmer temperatures mean less food, a smaller size at maturity, and fewer eggs to replenish its population.

Securing Humboldt squids’ future

Much about this large squid species, including its conservation status, is still unknown. But given its economic significance to fishing communities and its important role in the marine ecosystem as food for diverse species, the new knowledge of its genetic stocks can help inform future marine policies to manage its population.

“The Humboldt squid supports the largest squid fishery in the world and is heavily caught in the Eastern Pacific Ocean by several countries, including countries from Asia like Japan, Korea, China, and Taiwan. This squid is one of the most commercial squids in the world, and it sustains the economy of many countries,” Sanchez said.

“Identifying genetic stocks, also known as genetically different groups, through population genetics is very important for implementing marine policies that control the total catch of this squid. The high migratory capacity of this squid is the main challenge in identifying the exact number of genetic stocks, and more genetic resources and sampling are required to clearly reveal this number,” he added.

Credit: 
Hiroshima University

Rapeseed instead of soy burgers: researchers identify a new source of protein for humans

Rapeseed has the potential to replace soy as the best plant-based source of protein for humans. In a recent study, nutrition scientists at Martin Luther University Halle-Wittenberg (MLU) found that rapeseed protein consumption has beneficial effects on human metabolism comparable to those of soy protein; its effects on glucose metabolism and satiety were even better. Another advantage: the proteins can be obtained from the by-products of rapeseed oil production. The study was published in the journal Nutrients.

For a balanced and healthy diet, humans need protein. "It contains essential amino acids which cannot be synthesized in the body," says Professor Gabriele Stangl from the Institute of Agricultural and Nutritional Sciences at MLU. Meat and fish are important sources of high-quality proteins. However, certain plants can also provide valuable proteins. "Soy is generally considered the best source of plant protein as it contains a particularly beneficial composition of amino acids," says Stangl.

Her team investigated whether rapeseed, which has a comparably beneficial composition of amino acids, could be an alternative to soy. Rapeseed also contains phytochemicals - chemical compounds produced by plants - which could have beneficial effects on health, says Stangl. "So far, only a few data on the effect of rapeseed protein intake in humans have been available," adds the scientist. In comparison to soy, rapeseed has several other advantages: it is already being cultivated in Europe, and the protein-rich by-products of rapeseed oil production could be used as ingredients for new food products. These by-products are currently used exclusively for animal feed.

In a study with 20 participants, the team investigated the effect of ingested rapeseed and soy proteins on human metabolism. Before the interventions, the participants were asked to document their diets for a few days. Then they were invited to eat a specifically prepared meal on three separate days: noodles with tomato sauce that either contained no additional protein or was enriched with soy or rapeseed protein. After the meal, blood was regularly drawn from the participants over a six-hour period. "By using this study design, we were able to assess the acute metabolic response of each study participant to the dietary treatments," says Stangl.

The study showed: "The rapeseed protein induced effects on metabolic parameters and cardiovascular risk factors comparable to soy protein. Rapeseed even produced a slightly more beneficial insulin response in the body," says nutritionist Christin Volk from MLU. Another benefit was that the participants felt full for longer after eating the rapeseed protein. "To conclude, rapeseed appears to be a valuable alternative to soy in the human diet," says Volk.

The only drawback: "Rapeseed protein, in contrast to soy protein, has a mustard flavour," says Volk. Therefore, rapeseed is more suitable for the production of savoury foods rather than sweet foods, explains the researcher.

Credit: 
Martin-Luther-Universität Halle-Wittenberg

Scientists help reboot 50 years of plant advice to solve one of nature's biggest challenges

image: Linum narbonense - a distant relative.

Image: 
University of Portsmouth

Scientists from the University of Portsmouth and Royal Botanic Gardens, Kew, have come up with a formula to help plant breeders and farmers around the world grow crops in a more sustainable way.

The new checklist, which has just been published in the Botanical Journal of the Linnean Society, will guide plant breeders to better understand the species they are trying to improve. It will also help them find ways to increase growth and yield of crops using wild plant species from which they were once domesticated.

There is an urgent and critical need for changes in farming techniques due to the growing challenges of global warming faced by crop producers. Plants that were selected and bred to suit certain climates now need more help from the humans who are damaging the environment in which they grow.

Study author Dr Rocio Perez-Barrales, Senior Lecturer in the School of Biological Sciences at the University of Portsmouth, said: "When the human race first domesticated crops, the climate and environment were completely different - what we are seeing in the last 50 years is a rapid change in climate. The world is now frequently facing catastrophic climate events like droughts, and in the UK we are now seeing some crops being harvested up to a month earlier than they used to be.

"When plants were domesticated, they were artificially selected for a specific desirable trait. Artificial selection and farming have led to quality improvements in foods such as meat, milk, and fruit. However, over hundreds of years, there has been a negative impact to this process - a reduction in plant genetic diversity.

"Scientists believe genetic diversity is important for plants to cope with a change in environment. This leads to a choice of using an artificial process such as the use of pesticides, to protect crops against pests. An alternative for plant breeders is to use wild crop relatives and use the natural genetic variation in those species that protects them against the natural enemies.

"Climate change is altering the way crops behave. Crops have lost so much genetic diversity they are less able to adapt and respond to climate change. Scientists are now looking at wild crop relatives to see what traits can be improved to make crops better adapted to the current environmental challenges."

The researchers re-visited guidelines set out 50 years ago that have since become outdated. They used this classification as a basis to deliver a new method of improving crops, without destroying the very few natural environments left in the world to grow food.

Dr Perez-Barrales explained: "Some crops have just a few closely related species, whilst others might have a hundred or so. For example, linseed has more than 150 related species, and the challenge is how do we select the relevant traits and from what wild relatives? In answering this question, we realised that we needed to learn more from the biology of the species, which can only be done by using a modern classification developed with the latest science. The classification developed in the early 1970s needed to be updated, and in effect rebooted, to integrate this modern information."

This new toolset for crop breeders relies on identifying which wild crop relative needs to be explored in order to improve the crops. Dr Perez-Barrales said: "There may be a demand to grow linseed, for example, in countries at different latitudes. Linseed (Linum usitatissimum) was domesticated in the Middle East 10,000 years ago, and we can grow it in England because it naturally captured genes from pale blue flax, Linum bienne, allowing the crop to grow in northern and colder environments. My research looks at the natural variation in flowering of wild Linum species to see if we can use it to improve linseed. That way the right genes can be selected and introduced into the crop, something that plant breeders do regularly. These new guidelines will help plant breeders become more sustainable and efficient. We believe it is the future of farming."

This paper recommends guidelines for plant breeders to select the right species for improving a crop. The guidelines include:

1. To understand the genetic diversity between species (genetic distance) and ascertain how closely related the target species are. This is equivalent to human genealogy: you are more closely related to your siblings than to your cousins. The more closely related the species, the better, because there will be fewer genetic barriers.

2. To understand if there is genetic compatibility between species. This includes understanding variation in chromosome numbers. For two plant species to successfully cross, they need to have the same number of chromosomes. As with humans, if the chromosome numbers don't match, there will be problems in reproduction. However, plants can have very different chromosome numbers, ranging from 14 to more than 100. It is important to understand chromosome variation so we can understand their compatibility.

3. To gather all the information on pollination biology, reproduction, and the mechanisms that avoid inbreeding. Plants can prevent self-pollination and inbreeding, so that the pollen of a flower does not fertilise the ovules of the same flower. Just as in animals, inbreeding can cause genetic diseases. But the mechanisms that avoid selfing can create barriers between the crop and the wild relative, making it difficult to create new hybrids that could be tested to validate the newly improved crop.

Dr Juan Viruel from the Royal Botanic Gardens, Kew, said: "In this study we advise plant breeders to use phylogenetic distance metrics, cytogenetic compatibility data (for example, chromosome number and ploidy) and information about the breeding system to shortlist wild species for plant breeding programmes. With this information we can better select the wild species to improve our crops. It is an invaluable checklist for plant breeders and will help production of crops in a more sustainable way."
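As a rough illustration of how these three criteria might be combined in practice, the sketch below filters a list of candidate wild relatives. It is not part of the published checklist; the species records, thresholds, and field names are hypothetical values chosen for the example.

```python
from dataclasses import dataclass

@dataclass
class WildRelative:
    name: str
    phylo_distance: float    # phylogenetic distance to the crop (smaller = more closely related)
    chromosomes: int         # chromosome count
    ploidy: int              # e.g., 2 for diploid
    self_incompatible: bool  # breeding-system barrier to crossing

def shortlist(candidates, crop_chromosomes=30, crop_ploidy=2, max_distance=0.2):
    """Keep wild relatives that are closely related, cytogenetically
    compatible, and whose breeding system does not block hybridization."""
    return [
        s for s in candidates
        if s.phylo_distance <= max_distance      # criterion 1: genetic distance
        and s.chromosomes == crop_chromosomes    # criterion 2: chromosome match
        and s.ploidy == crop_ploidy
        and not s.self_incompatible              # criterion 3: breeding system
    ]

# Example with linseed (Linum usitatissimum) and two relatives; all numbers
# here are illustrative, not measured values.
candidates = [
    WildRelative("Linum bienne", 0.05, 30, 2, False),
    WildRelative("Linum narbonense", 0.40, 28, 2, True),
]
print([s.name for s in shortlist(candidates)])   # -> ['Linum bienne']
```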

Credit: 
University of Portsmouth

Videos most effective in communicating with parents about secondhand smoke risks

The best way to communicate with parent smokers about the risks of secondhand smoke to their children is to use videos depicting the risks, as well as solutions to reduce those risks.

Those were the findings of a study, published in the Journal of Health Communication, which demonstrates that video messages, compared to text-only messages, are significantly more effective in influencing parent smokers’ intentions to protect their offspring from exposure to secondhand smoke.

Diseases caused by secondhand smoke kill 41,000 people in America annually (Centers for Disease Control and Prevention, 2017). Of the 58 million Americans exposed to secondhand smoke, infants and children are the most vulnerable to adverse health effects.

To assess which format of communication may be most effective in getting people not only to stop smoking, but to stop smoking near their children, experts from the Center for Tobacco Studies at Rutgers University and The SIDS Center of New Jersey assessed 623 adult daily smokers living with children aged 0–15 in the US.

The participants were assigned to view messages that differed by recommendation (stopping altogether, known as cessation, vs. cessation plus exposure reduction) and format (video vs. text-only), or to a no-message control group.

All messages delivered to participants discussed the health risks of secondhand smoke. The cessation messages encouraged smokers to seek information about getting help to quit and for couples to work together to help each other quit, while the “cessation plus exposure reduction” messages additionally discussed strategies for parent smokers to protect their children from secondhand smoke, like smoking outdoors, washing toys, surfaces, and clothes exposed to secondhand smoke, and not smoking in a car with children.

Results showed:

Parent smokers who saw video messages had significantly greater intentions to reduce secondhand smoke exposure to their children, as well as discuss the issue and risks of secondhand smoke with other parents, compared to parent smokers that saw text-only messages.
Parents who saw videos also had significantly greater intentions to quit smoking altogether and get help to quit smoking compared to parents who did not view a message.
Parent smokers who viewed either message recommendation reported greater harm perceptions, perceptions of self-efficacy to protect children from secondhand smoke, and intentions to get help to quit smoking than the no-message control group.
Cessation plus exposure reduction recommendations elicited greater quit intentions than the no-message control.

Reporting on their findings, lead author Dr. Jennah Sontag stated, “All forms of communication tested in this study were effective in influencing perceptions and intentions about the risks of secondhand smoke for children, indicating that even communication that lasts only one to two minutes or is presented in text form may influence parent smokers to reduce or eliminate secondhand smoke exposure to children.

“But our research suggests that sharing this information visually through a video can be especially effective, and these videos should include recommendations for both cessation and ways to protect children from secondhand smoke.

“We already know that visual messages are better than text-only formats of tobacco-related risk communication. However, this study clearly demonstrates that visual portrayals of the potential health risks of secondhand smoke exposure among infants and children - from asthma, ear infections, and respiratory problems to SIDS - may have influenced greater harm perceptions due to the severity of these outcomes.”

It is estimated that two in five children in the US are exposed to secondhand smoke. However, notable disparities exist: children from low-income communities, 3- to 11-year-olds, and black children are at the highest risk of exposure. Several environmental factors play a role in this exposure, including a lack of smokefree laws in public spaces or smokefree rules at home, and living in multiunit housing. US healthcare expenditures attributable to secondhand smoke exposure among children living in public housing exceeded $180 million in 2011 alone.

Only 55% of smoking mothers quit smoking during pregnancy, and 70% of these relapse into smoking after childbirth.

Sontag believes these types of videos are more likely to reach wider audiences – due to their shareability – and therefore should be considered for future anti-smoking and secondhand smoke communications.

“Video shows the greatest potential to influence perceptions and intentions related to reducing secondhand smoke exposure. To reach parent smokers, such messaging can be used by practitioners in clinical settings, such as waiting rooms and examination rooms, which allow for follow-up discussions, or in family-based education and community organizations. Because videos are more appropriate for the digital/social media landscape, messages can be further disseminated among parents and other caregivers,” she adds.

The paper calls for future research to assess the extent to which secondhand smoking-related communication that uses videos and includes recommendations for cessation and exposure reduction influences parent smokers to inquire with their doctor about cessation options. “It is also important to assess whether strategies that protect children from secondhand smoke are carried out long term by parent smokers who do not quit,” it states.

Limitations of the study include that over 70% of participants were female. In addition, parent smokers viewed only one message one time on a computer screen in a controlled experimental setting. Messages viewed in a natural environment (e.g., clinics, hospitals) and multiple exposures may produce different results. Additionally, 38% of participants reported that they never smoke inside the home, implying that these participants are already exhibiting one of the exposure-reduction behaviors that reduces secondhand smoke exposure to children; recruitment of only parent smokers that always smoke inside the home may have produced different results.



This study was funded by the New Jersey Commission on Cancer Research.

About Taylor & Francis Group

Taylor & Francis Group partners with researchers, scholarly societies, universities and libraries worldwide to bring knowledge to life. As one of the world’s leading publishers of scholarly journals, books, ebooks and reference works our content spans all areas of Humanities, Social Sciences, Behavioural Sciences, Science, Technology and Medicine.

 

From our network of offices in Oxford, New York, Philadelphia, Boca Raton, Boston, Melbourne, Singapore, Beijing, Tokyo, Stockholm, New Delhi and Cape Town, Taylor & Francis staff provide local expertise and support to our editors, societies and authors and tailored, efficient customer service to our library colleagues.

Journal

Journal of Health Communication

DOI

10.1080/10810730.2020.1797947

Credit: 
Taylor & Francis Group

Rodent ancestors combined portions of blood and venom genes to make pheromones

Experts who study animal pheromones have traced the evolutionary origins of genes that allow mice, rats and other rodents to communicate through smell. The discovery is a clear example of how new genes can evolve through the random chance of molecular tinkering and may make identifying new pheromones easier in future studies. The results, representing a genealogy for the exocrine gland-secreting peptide (ESP) gene family, were published by researchers at the University of Tokyo in the journal Molecular Biology and Evolution.

Researchers led by Professor Kazushige Touhara in the University of Tokyo Laboratory of Biological Chemistry previously studied ESP proteins that affect mice's social or sexual behavior when secreted in one mouse's tears or saliva and spread to other animals through social touch.

Recently, Project Associate Professor Yoshihito Niimura led a search for the evolutionary origin of ESP genes using the wide variety of fully sequenced animal genomes available in modern DNA databases. Niimura looked for ESP genes in 100 different mammals and found them only in two evolutionarily closely related families of rodents: the Muridae family of mice, rats and gerbils, and the Cricetidae family of hamsters and voles.

Notably, the Cricetidae had a few ESP genes, usually all grouped together in the same stretch of DNA, while the Muridae had both that same small group of ESP genes and a second, larger group of additional ESP genes.

"We can imagine about 35 million years ago, the common ancestor of Muridae and Cricetidae formed the first ESP genes. Eventually, approximately 30 million years ago, the ancestor of Muridae duplicated and expanded these ESP genes. So now mice have many more ESP genes than the Cricetidae rodents," said Niimura.

To identify the source of what formed the first ESP gene, researchers compared additional genome sequences. They uncovered how random chance copied uniquely functional portions of two other genes, then coincidentally pasted them next to each other.

The DNA sequence of a gene includes portions called exons, which later become the functional protein, and other portions called introns, which do not become protein. Introns and exons are spaced throughout the gene with no apparent organization, introns interrupting essential functional portions of exons. Therefore, if a single exon were randomly copied and pasted elsewhere in the genome, any resulting protein fragment would have no meaningful function.

However, if an exon-only version of a gene were copied and reinserted into the genome, the chances of that new sequence remaining functional become much greater. Cells do create exon-only versions of genes, called mRNA, as part of the normal process of making protein from genes, and cells do possess machinery, likely left over from viral infections, that can copy mRNA back into the DNA strand.

"This is not the normal way of things in cells, but it is a common source of evolution. We guess this is what happened to make ESP genes because the whole functional portion of the ESP gene is one exon, no intron interruption," said Niimura.

Specifically, the research team discovered for the first time that ESP proteins contain an uncommon spiral shape characteristic of alpha-globin, a component of the iron-carrying hemoglobin protein in blood. DNA sequence comparisons revealed that multiple alpha-globin gene exons spliced together show a subtle but distinctive similarity to the ESP gene sequence.

"It doesn't matter that hemoglobin is the source of the ESP pheromone. Any protein can become a pheromone if it is used for species-specific communication," said Niimura.

Regardless of its shape, no protein can function without being in the proper location. In ESP proteins, the alpha-globin-derived portion is attached to a signaling portion, which directs the protein to be secreted from salivary and tear glands. Researchers identified the ESP genes' location signaling sequence as resembling that of CRISP2, a gene expressed in mammalian reproductive tracts and salivary glands as well as the venom gland of some snakes.

The hemoglobin and CRISP genes are both ancient genes that existed in the shared evolutionary ancestor of vertebrates - all animals with a backbone - over 500 million years ago. The genetic shuffling that created ESP genes occurs relatively frequently in the cells of all organisms, but for these changes to become inherited evolutionary traits, the changes must occur in the sex cells so they can be passed on to future generations.

"The creation of new genes is not done from scratch, but nature utilizes pre-existing material. Evolution is like a tinkerer, using old things and broken parts to create some new device with a useful function," said Niimura.

Niimura and his colleagues plan to use their new understanding of the evolution of this one family of pheromones to direct their search for new pheromones. The short length of many known pheromone genes makes it likely that similar pheromones are overlooked in standard genome searches. They also predict that salivary and tear glands, often overlooked because their small size makes them inconvenient tissues to study, may contain interesting future discoveries.

Credit: 
University of Tokyo

AI taught to rapidly assess disaster damage so humans know where help is needed most

image: This photo shows the distribution of damage estimated by the convolutional neural network model for Mashiki town in the 2016 Kumamoto earthquake (L) and Nishinomiya City in the 1995 Kobe earthquake (R). Hiroshima University researchers created a post-disaster damage assessment CNN model that does not need pre-disaster images to make an evaluation.

Image: 
Hiroyuki Miura

Researchers at Hiroshima University have taught an AI to look at post-disaster aerial images and accurately determine how battered the buildings are — a technology that crisis responders can use to map damage and identify extremely devastated areas where help is needed the most.

Quick action in the first 72 hours after a calamity is critical in saving lives. And the first thing disaster officials need to plan an effective response is accurate damage assessment. But anyone who has seen aftermath scenes of a natural catastrophe knows the many logistical challenges that can make on-site evaluation a danger to the lives of crisis responders.

Using a convolutional neural network (CNN) — a deep learning algorithm inspired by the human brain’s image recognition process — a team led by Associate Professor Hiroyuki Miura of Hiroshima University’s Graduate School of Advanced Science and Engineering trained an AI to finish in an instant a task that usually requires us to devote crucial hours and personnel at a time when resources are scarce.

Previous CNN models that assess damage require both before and after photos to give an evaluation. But Miura’s model doesn’t need pre-disaster images. It only relies on post-disaster photos to determine building damage.

It works by classifying buildings as collapsed, non-collapsed, or blue tarp-covered, based on the seven-level damage scale (D0-D6) that the Architectural Institute of Japan used in the 2016 Kumamoto earthquakes.

A collapsed building is defined as D5–D6 or major damage. Non-collapse is interpreted as D0–D1 or negligible damage. Intermediate damage, which was rarely considered in previous CNN models, is designated as D2–D3 or moderate damage.
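To make the setup concrete, a three-way patch classifier of this general kind might look like the sketch below. This is a minimal illustration in PyTorch, not the authors' published architecture; the layer sizes, patch size, and class ordering are assumptions for the example.

```python
import torch
import torch.nn as nn

# Minimal three-class CNN sketch: classifies an aerial image patch of a single
# building as collapsed (D5-D6), non-collapsed (D0-D1), or blue tarp-covered
# (a proxy for moderate D2-D3 damage).
class DamageCNN(nn.Module):
    def __init__(self, n_classes=3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(64, 128, kernel_size=3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(128, n_classes)

    def forward(self, x):                # x: (batch, 3, H, W) post-disaster patches
        return self.classifier(self.features(x).flatten(1))

model = DamageCNN()
patches = torch.randn(8, 3, 64, 64)     # dummy batch of 64x64 RGB patches
logits = model(patches)                 # (8, 3) class scores
print(logits.argmax(dim=1))             # predicted damage class per building
```

Note that, as the article emphasizes, only post-disaster imagery enters the model; no pre-disaster reference image is needed.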

Researchers trained their CNN model using post-disaster aerial images and building damage inventories by experts during the 1995 Kobe and 2016 Kumamoto earthquakes.

The researchers overcame the challenge of identifying buildings that suffered intermediate damage after confirming that blue tarp-covered structures in photos used to train the AI predominantly represented D2-D3 levels of devastation.

Since ground truth data from field investigations by structural engineers were used to teach the AI, the team believes its evaluations are more reliable than those of other CNN models that depended on visual interpretations by non-experts.

When they tested it on post-disaster aerial images of the September 2019 typhoon that hit Chiba, results showed that the damage levels of approximately 94% of buildings were correctly classified.

Now, the researchers want their AI to outdo itself by making its damage assessment more powerful.

“We would like to develop a more robust damage identification method by learning from more training data obtained from various disasters, such as landslides and tsunamis,” Miura said.

“The final goal of this study is the implementation of the technique in real disaster situations. If the technique is successfully implemented, it can immediately provide accurate damage maps - showing not only the damage distribution but also the number of damaged buildings - to local governments and governmental agencies.”

Credit: 
Hiroshima University

Delirium a key sign of COVID-19 in frail, older people

A new analysis by researchers at King's College London, using information from the COVID Symptom Study app and from patients admitted to St Thomas' Hospital in London, has shown that delirium - a state of acute confusion associated with a higher risk of serious illness and death - is a key symptom of COVID-19 in frail, older people.

The findings, published in the journal Age and Ageing, highlight that doctors and carers should be aware of delirium as a possible early warning sign of COVID-19 in the elderly, even in the absence of more typical symptoms such as cough or fever.

Led by clinical fellow and geriatrician Dr Rose Penfold at King's College London, the researchers analysed data from two groups of people aged 65 or over from March through May. The first group included 322 patients admitted to hospital who had tested positive for COVID-19, while the second comprised 535 users of the COVID Symptom Study app who reported having had a positive test result.

They found that older adults admitted to hospital who were classified as frail according to a standard scale were more likely to have had delirium as one of their symptoms than people of the same age who were not classed as frail. Delirium, along with tiredness and breathlessness, was also more common in frailer users of the COVID Symptom Study app with COVID-19, compared with fitter people of the same age.

A third of app users experiencing delirium did not report suffering the 'classic' COVID-19 symptoms of cough and fever, while delirium was the only symptom for around one in five (18.9%) of hospitalised patients.

Frailty in the group of hospitalised patients was measured using the Clinical Frailty Scale (CFS) test, which is administered by a doctor. COVID Symptom Study App users were asked to complete a short questionnaire asking about their health, which is comparable to the CFS.

This is the first study showing that delirium is a likely symptom of COVID-19 in frail older adults, although the precise biological connection between the two conditions still needs to be understood. The findings also highlight the need for systematic assessment of frailty for older people, along with awareness and screening for delirium for this vulnerable population in hospitals, care homes and the community.

Dr Rose Penfold from King's College London said: "Older, frailer people are at greater risk from COVID-19 than those who are fitter, and our results show that delirium is a key symptom in this group. Doctors and carers should watch out for any changes in mental state in elderly people, such as confusion or strange behaviour, and be alert to the fact that this could be an early sign of coronavirus infection."

Dr Claire Steves from King's College London said: "The past six months have shown us that COVID-19 can spread catastrophically through care homes. Knowing that delirium is a symptom in frail, elderly people will help families and carers spot the signs of COVID-19 earlier, act appropriately, and put in place infection control measures such as isolation, increased hygiene and personal protective equipment to protect this highly vulnerable group."

Professor Tim Spector, Professor of Genetic Epidemiology at King's College London and COVID Symptom Study lead, said: "In April we upgraded the COVID Symptom Study app to allow users to log health reports on behalf of friends and family who aren't able to access the app. This significantly increased the number of older people in the study, providing vital insights. We're hugely grateful to all our users and urge everyone to download the app and log their health and that of their loved ones on a daily basis as we move towards the winter months."

Credit: 
King's College London

Safe flight: New method detects onset of destructive oscillations in aircraft turbines

image: The efforts of the research team from TUS and JAXA for early flutter detection in aircraft turbines would help the development of safer and more eco-friendly turbine designs.

Image: 
Tokyo University of Science

Despite humanity's remarkable engineering prowess, sometimes completely unexpected or poorly understood physical phenomena can rapidly lead to catastrophic failures. Such was the case with Braniff International Airways Flight 542 in 1959 and Northwest Orient Airlines Flight 710 in 1960, in which both aircraft spontaneously disintegrated in mid-air due to a mechanical phenomenon known as "flutter."

In aerospace research, flutter generally refers to undesired and self-sustained vibrations in turbine blades that can readily grow out of control, destroying them along with the engine, and even the aircraft's wings. It is not very surprising that flutter remains an area of active research and one of the main concerns when designing turbines. In fact, flutter has been placed once again under the spotlight in a project (advanced-fan-jet-research: aFJR) launched by the Japan Aerospace Exploration Agency (JAXA) aimed at designing highly efficient and environment-friendly turbines.

In a new study published in Physical Review Applied, scientists from the Tokyo University of Science (TUS), in collaboration with researchers from JAXA, tackle the problem of developing a novel methodology for early detection of flutter at the blade design stage. Dr. Hiroshi Gotoda (corresponding author of the paper) from the Department of Mechanical Engineering at TUS explains the problem at hand and how they tried to solve it: "The onset of cascade flutter has impeded the technological development of advanced jet engines and its early detection is a long-standing problem in current aerospace propulsion engineering. Our main aim was to explore the applicability of a methodology combining complex networks and synchronization to detect a precursor of cascade flutter."

The main idea behind their approach is that the turbine fan can be mathematically modeled as a complex network of interrelated oscillators and that flutter is ultimately the result of the progressive synchronization of more and more blades as a result of increased airflow going through the turbine. In another study, published in the Journal of Applied Physics, the same group had explored an artificial intelligence-based method for detecting the onset of flutter from time-series data using the permutation entropy of the system, which is a measure of the randomness of the turbine's complex dynamics. In their current work, they demonstrate that a network representation of the system based on synchronization is closely related to the actual oscillatory behavior of the blades.
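Permutation entropy, the randomness measure mentioned above, is straightforward to compute from a time series: the signal is embedded into short ordinal patterns and the Shannon entropy of their distribution is taken. The following is a generic sketch of the measure; the embedding settings (m = 3, delay = 1) are illustrative and not necessarily those used in the study.

```python
import math
from collections import Counter
import numpy as np

def permutation_entropy(x, m=3, delay=1):
    """Normalized permutation entropy of a 1-D time series: near 0 for a
    perfectly regular signal, near 1 for fully random ordinal structure."""
    patterns = Counter()
    for i in range(len(x) - (m - 1) * delay):
        window = x[i : i + m * delay : delay]
        patterns[tuple(np.argsort(window))] += 1   # ordinal pattern of the window
    total = sum(patterns.values())
    h = -sum((c / total) * math.log(c / total) for c in patterns.values())
    return h / math.log(math.factorial(m))         # normalize by log(m!)

t = np.linspace(0, 100, 5000)
print(permutation_entropy(np.sin(t)))                                    # low: periodic signal
print(permutation_entropy(np.random.default_rng(0).normal(size=5000)))   # near 1: noise
```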

Through experiments on an actual turbine test rig conducted at JAXA's Altitude Test Facility, the research team found that, before the onset of flutter, one particular blade begins to act as a "central hub" in the network and adjacent blades start to oscillate in sync with it. This "local" synchronization quickly expands and leads to the collective synchronization of all blades, resulting in potentially catastrophic "flutter."

In this context, the network representation of the system proposed in this study serves two important purposes, as explained by Dr. Gotoda, "We demonstrate the applicability of two local and global measures as potential detectors of cascade flutter: the connecting strength between individual network nodes and the network's synchronization parameter. The former is valid for specifying the dominant blades for the onset of cascade flutter. In contrast, the latter, which ranges from 0 to 1, is more suitable for determining a threshold for this onset."
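The global synchronization measure described here, ranging from 0 to 1, behaves like the classic Kuramoto order parameter for coupled oscillators. As an analogy only, not the paper's exact definition, such a quantity can be computed from blade oscillation phases as follows:

```python
import numpy as np

def order_parameter(phases):
    """Kuramoto-style synchronization measure: 0 for fully incoherent
    blade oscillations, 1 for perfect phase locking."""
    return np.abs(np.mean(np.exp(1j * np.asarray(phases))))

rng = np.random.default_rng(1)
incoherent = rng.uniform(0, 2 * np.pi, size=32)       # 32 blades, random phases
locked = np.full(32, 0.3) + rng.normal(0, 0.05, 32)   # nearly identical phases

print(order_parameter(incoherent))  # close to 0
print(order_parameter(locked))      # close to 1
```

A threshold on such a measure, rising toward 1 as more blades lock together, is the kind of early-warning indicator the quote describes.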

The combined findings of these new studies shed light on the complex phenomenon of flutter and contribute to the academic systemization of nonlinear problems in the field of aeronautical engineering and related nonlinear science. They could represent promising techniques for the early detection of flutter onset at the blade design stage. The efforts of this research team from TUS and JAXA would help the development of safer and more eco-friendly turbine designs.

Credit: 
Tokyo University of Science

Large contact tracing study in Science finds children are active transmitters of COVID-19

A team of investigators from CDDEP and the governments of Tamil Nadu and Andhra Pradesh studied disease transmission patterns in 575,071 individuals exposed to 84,965 confirmed cases of COVID-19. The study, based on data collected by tens of thousands of contact tracers in the states of Andhra Pradesh and Tamil Nadu, is the largest and most comprehensive analysis of COVID-19 epidemiology to date.

Andhra Pradesh (population 50 million) and Tamil Nadu (population 68 million) are among the Indian states with the largest healthcare workforce and public health expenditures per capita, and are known for their effective primary healthcare delivery models. Both states initiated rigorous disease surveillance and contact tracing early in response to the pandemic. Procedures include syndromic surveillance and SARS-CoV-2 testing for all individuals seeking care for severe acute respiratory illness or influenza-like illness at healthcare facilities; delineation of 5km "containment zones" surrounding cases for daily house-to-house surveillance to identify individuals with symptoms; and daily follow-up of all contacts of laboratory-confirmed or suspect COVID-19 cases, with the aim of testing these individuals 5-14 days after their contact with a primary case, irrespective of symptoms, to identify onward transmission.

The AP-TN study found that:

COVID transmission

1. Risk of transmission from an index case to a close contact ranges from 2.6% in the community to 9.0% in the household and does not differ significantly with respect to the age of the index case.
2. Infection probabilities ranged from 4.7% for low-risk contact types to 10.7% for high-risk ones. Same-age contacts were associated with the greatest infection risk.
3. The study found a high prevalence of infection among children who were contacts of cases around their own age; this finding of enhanced infection risk among individuals exposed to similar-age cases was also apparent among adults.
4. Not all infected individuals transmit COVID-19. Prospective follow-up testing of exposed contacts revealed that 70% of infected individuals did not infect any of their contacts, while 8% of infected individuals accounted for 60% of observed new infections. This study presents the largest empirical demonstration of superspreading that we are aware of.
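The skew in item 4 - 70% of cases infecting no one while 8% account for 60% of new infections - is the signature of an overdispersed offspring distribution. The toy simulation below reproduces this kind of pattern with a negative binomial model; the parameters are illustrative and were not fitted to the study's data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Offspring distribution: secondary infections per index case, negative
# binomial with mean R and small dispersion k (heavy overdispersion).
R, k, n_cases = 1.0, 0.2, 100_000
secondary = rng.negative_binomial(n=k, p=k / (k + R), size=n_cases)

print("share infecting no one:", np.mean(secondary == 0))   # roughly 0.7 here

# Fraction of all transmission caused by the most infectious 8% of cases
order = np.sort(secondary)[::-1]
top8 = order[: int(0.08 * n_cases)].sum() / secondary.sum()
print("share of transmission from top 8%:", top8)
```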

Mortality

1. Case-fatality ratios ranged from 0.05% at ages 5-17 years to 16.6% at ages ≥85 years.
2. Men were 62% more likely to die than women.
3. 63% of those who died had at least one co-morbidity. 36% had two or more co-morbidities. 45% of those who died were diabetic.
4. Unlike observations in high-income settings, deaths in India are concentrated at ages 50-64 years. Comparing case fatality ratios (CFR) across age groups, the CFR in India is higher in the 40-70 age group than in any of four comparison countries. For the age category above 80, the CFR in India is in line with other countries, indicating a potential survival effect.
5. In contrast to the long hospital stays reported in high-income settings, the median time from admission to death was 6 days (compared with 13 days in the United States).
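
For reference, the case-fatality ratio used throughout this section is simply deaths divided by confirmed cases within a group. A minimal sketch, with hypothetical counts chosen only to roughly reproduce the ratios reported above:

```python
# Case-fatality ratio (CFR) = deaths / confirmed cases, per age group.
# The counts below are hypothetical, not the study's data.
cases  = {"5-17": 20_000, "50-64": 40_000, "85+": 600}
deaths = {"5-17": 10,     "50-64": 2_000,  "85+": 100}

for age in cases:
    print(f"ages {age}: CFR = {deaths[age] / cases[age]:.2%}")
```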

Effect of the Lockdown

1. The implementation of India's country-wide lockdown was associated with substantial reductions in the reproduction number Rt, an effect that had not previously been demonstrated empirically.
2. Case-fatality ratios (the proportion of cases that died) decreased over the course of the epidemic. Individuals who tested positive in July were 26% less likely to die than those who tested positive in March and April, and those who tested positive in May and June were 13% less likely to die.

According to the director of CDDEP, Dr. Ramanan Laxminarayan, "This study was made possible by the significant contact tracing effort in Andhra Pradesh and Tamil Nadu, which involved tens of thousands of healthcare workers. The results on disease transmission and mortality have the potential to inform policy to fight COVID-19. The study also speaks to the capacity of research emerging from India to help inform the global response to COVID-19".

Credit: 
Center for Disease Dynamics, Economics & Policy

Researchers exploit weaknesses of master game bots

UNIVERSITY PARK, Pa. -- If you've ever played an online video game, you've likely competed with a bot -- an AI-driven program that plays on behalf of a human.

Many of these bots are created using deep reinforcement learning, in which algorithms learn to achieve a complex goal through a reward system. But, according to researchers in the College of Information Sciences and Technology at Penn State, game bots trained by deep reinforcement learning can be defeated easily by attackers who use deception.

To highlight this risk, the researchers designed an algorithm to train an adversarial bot, which was able to automatically discover and exploit weaknesses of master game bots driven by reinforcement learning algorithms. Their bot was then trained to defeat a world-class AI bot in the award-winning computer game StarCraft II.
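
The team's StarCraft II attack is far more sophisticated than anything that fits here, but the core idea - using reinforcement learning to train an adversarial policy against a fixed victim policy - can be sketched in a few lines. The toy game, the victim's bias, and the hyperparameters below are hypothetical illustrations, not the researchers' setup.

```python
# Toy sketch: train an adversarial policy with REINFORCE to exploit
# a fixed, predictable victim policy in a matching-pennies game.
import numpy as np

rng = np.random.default_rng(0)

def victim_action():
    # Fixed victim with an exploitable weakness: plays action 0 70% of the time.
    return rng.choice(2, p=[0.7, 0.3])

theta = np.zeros(2)   # adversary's policy logits
lr = 0.1

for _ in range(2000):
    probs = np.exp(theta) / np.exp(theta).sum()   # softmax policy
    a = rng.choice(2, p=probs)
    # Adversary is rewarded for matching the victim's move.
    reward = 1.0 if a == victim_action() else -1.0
    grad = -probs                                  # grad of log pi(a) ...
    grad[a] += 1.0                                 # ... is one_hot(a) - probs
    theta += lr * reward * grad                    # REINFORCE update

probs = np.exp(theta) / np.exp(theta).sum()
print("learned action probabilities:", np.round(probs, 2))
# The adversary converges on action 0, exploiting the victim's bias.
```

In the real attack the victim is a deep network and the environment a full game, but the training loop - observe the victim, act, collect reward, update the adversarial policy - is conceptually similar.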

"This is the first attack that demonstrates its effectiveness in real-world video games," said Wenbo Guo, a doctoral student studying information sciences and technology. "With the success of deep reinforcement learning in some popular games, like AlphaGo in the game Go and AlphaStar in StarCraft, more and more games are starting to use deep reinforcement learning to train their game bots."

He added, "Our work discloses the security threat of using deep reinforcement learning trained agents as game bots. It will make game developers more careful about adopting deep reinforcement learning agents."

Guo and his research team presented their algorithm in August at Black Hat USA - a conference that is part of the most technical and relevant information security event series in the world. They also publicly released their code and a variety of adversarial AI bots.

"By using our code, researchers and white-hat hackers could train their own adversarial agents to master many -- if not all -- multi-party video games," said Xinyu Xing, assistant professor of information sciences and technology at Penn State.

Guo concluded, "More importantly, game developers could use it to discover the vulnerabilities of their game bots and take rapid action to patch those vulnerabilities."

Credit: 
Penn State

Children hold leaders primarily responsible, not entitled

image: Researchers explored how young children conceptualize leadership, specifically whether they view leaders primarily as more entitled individuals or more responsible individuals, relative to non-leaders. The findings showed that children expected a leader to contribute more to a joint goal than their non-leader partner, expected a leader to withdraw an equal share from the common prize, and judged a leader more harshly than a non-leader for not adhering to these two behaviors.

Photo: Prof. Gil Diesendruck, Department of Psychology, Bar-Ilan University

Image: 
Bar-Ilan University

Important strides have been made in recent years in uncovering children's knowledge of hierarchical social relationships. Much of this work has focused on dominance, a hierarchy in which individuals defer to others out of fear or intimidation.

In a new study published in Child Development, researchers from Central European University and Bar-Ilan University begin to explore how young children conceptualize leadership, a hierarchy based on voluntary deference and respect. Specifically, they examined whether children view leaders primarily as more entitled individuals (deserving preferential treatment) or as more responsible individuals, relative to non-leaders. Such concepts could underlie children's predictions and evaluations of leaders' and non-leaders' behavior in a variety of situations.

The researchers presented five-year-old Israeli children with stories about a group that elects a leader and goes to an amusement park. Each story involved a collaborative situation. In one, rides required two protagonists - thus establishing a joint goal - and to activate a ride, the protagonists had to put coins into its piggy-bank box. Children were shown a hierarchical dyad (a leader and a non-leader) and an egalitarian dyad (two non-leaders), and their respective coins, in turn. Children anticipated how many coins each protagonist would contribute to activate the ride and judged the acceptability of two 'actual' contributions (small or large) by the leader (hierarchical dyad) and the non-leader (egalitarian dyad). The other story was similar, except that protagonists only had to ride in dyads, and dyads fulfilling this requirement received a prize. Children's anticipation of relative withdrawals from the joint prize, and their acceptability judgments of 'actual' small and large withdrawals by a leader (hierarchical dyad) and a non-leader (egalitarian dyad), were measured.

Results showed that children expected a leader to contribute more to a joint goal than their non-leader partner, expected a leader to withdraw an equal share (not more) from the common prize, and judged a leader more harshly than a non-leader for not adhering to these two behaviors. "These findings show that children view leaders as more responsible (not more entitled), relative to non-leaders, in two collaborative contexts," said the study's first author, Dr. Maayan Stavans, of Central European University, who collaborated on the research with Prof. Gil Diesendruck, of the Department of Psychology at Bar-Ilan University. "Our focus on collaborative situations extends prior research, which presented primarily competitive or neutral contexts. It is also a first demonstration that children understand leadership conferred by election."

"Because children were not asked about familiar kinds of authority figures, such as teachers and parents, nor about situations in their everyday lives, their responses reflect sophisticated ideas about leadership applied to novel protagonists and situations. While impressive, we have yet to understand how children come to think leaders have increased responsibility, at what point do they begin to represent leaders' increased entitlement (as adults do!), and how dependent are these representations on context," added Dr. Stavans.

Meanwhile, the researchers say that parents and educators can capitalize on the finding that children attribute increased responsibility to leaders in two ways. First, they, as 'leaders', should behave toward their child 'followers' in line with these ideas, to reinforce them. Second, once children begin to attribute increased entitlement to leaders, parents and educators can explain that position perks come with increased responsibility and can be lost when that responsibility goes unfulfilled, holding leaders accountable when needed.

Credit: 
Bar-Ilan University

New research on how fungal cells respond to stress

Researchers at the University of Maryland, Baltimore County (UMBC) have published new findings in Molecular and Cellular Proteomics on critical cellular processes triggered when cells respond to environmental stress. Mark Marten, professor of chemical, biochemical, and environmental engineering, led the research team, which identified three coordinated pathways involved in the response to cell wall stress in filamentous fungi. Cynthia Chelius, who recently earned her Ph.D. in chemical engineering at UMBC, is the first author on the paper.

A previous NSF grant supported the work, which Marten conducted with Ranjan Srivastava of the University of Connecticut and Steven Harris of the University of Manitoba.

Numerous species of filamentous fungi are pathogens that can make people sick, especially people who are immunocompromised. Other species of fungi play an important role in the development of pharmaceuticals and enzymes, and in agriculture, where fungi can help improve the quality of soil and make nutrients more readily available to crops. By understanding how cells work and respond to stress, researchers can reverse-engineer processes that could have a broad range of applications.

To understand how the fungal cell walls respond to environmental stressors, Marten and his team studied what he describes as the cell's "software"--rules that control how the cell behaves. When fungi experience stress, Marten's team found an increase in the number of septa (or cross-hyphal bulkheads) created. "When you stress cells, they sense it and try to protect themselves," Marten explains. He adds that fungi try to repair damage to their cell walls so that they can resume normal growth and function.

The study used a multi-omic methodology, which the researchers say can be applied to studying how signaling networks in cells work more generally. The methodology allowed the researchers to build a more detailed understanding of how cells respond to stressors. They found that when cell walls experience stress, a coordinated response unfolds through several pathways. By combining short time-scale phosphoproteomic sampling with longer time-scale transcriptomic sampling, the researchers were able to see a broader view of how cells respond to stress.
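
The paper's analysis pipeline is not spelled out here, but the basic move in such a multi-omic study - joining fast phosphoproteomic measurements to slower transcriptomic measurements by gene - might look like the sketch below. The gene names, column names, and values are hypothetical placeholders.

```python
# Hedged sketch: align a short time-scale phosphoproteomic snapshot
# with a longer time-scale transcriptomic snapshot by gene ID.
# All identifiers and values here are hypothetical.
import pandas as pd

phospho = pd.DataFrame({
    "gene": ["geneA", "geneB", "geneC"],
    "phospho_fold_change_10min": [2.4, 1.1, 0.6],
})
transcript = pd.DataFrame({
    "gene": ["geneA", "geneB", "geneC"],
    "mRNA_fold_change_1h": [1.2, 3.0, 2.1],
})

# Join the two omics layers to spot genes whose early phosphorylation
# response precedes a later transcriptional response.
merged = phospho.merge(transcript, on="gene")
responders = merged[
    (merged.phospho_fold_change_10min > 2) | (merged.mRNA_fold_change_1h > 2)
]
print(responders)
```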

Marten's, Srivastava's, and Harris's teams will continue to collaborate on related research, supported by a new three-year, $1.23 million grant from the National Science Foundation. This work will further explore how filamentous fungi repair their cell walls when exposed to stressors.

The team will examine how the parts of the fungal cell are assembled and how fungal gene regulatory networks function. They hope to understand how proteins in cells interact with each other, and how cells can turn on and off certain parts of their DNA to respond to stress.

"We were excited to see the results from this paper, as they both revealed a novel connection between different aspects of gene regulation in fungi and served as the basis for a new hypothesis regarding gene regulation in our most recent NSF Collaborative Research Award," says Marten.

Credit: 
University of Maryland Baltimore County

Why have fewer long-term care residents died from COVID-19 in BC than Ontario?

A comparison of long-term care (LTC) systems in Ontario and British Columbia found that BC was better prepared for the pandemic and responded in a more coordinated and decisive manner, leading to far fewer deaths than in Ontario.

The article is published in CMAJ (Canadian Medical Association Journal).

As of September 10, 2020, Ontario had reported 5965 resident cases in LTC homes and 1817 resident deaths from COVID-19, compared with just 466 cases and 156 deaths in BC homes.

"The BC long-term care system before the pandemic was better prepared to minimize SARS-CoV-2 transmission and respond to outbreaks," says lead author Michael Liu, medical and graduate student at Harvard University, Boston, Massachusetts, and the University of Oxford, Oxford, United Kingdom.

In a comparison of the two provinces' preparedness and response to COVID-19, the authors found that BC's health system had several strengths over Ontario's.

For example, before the pandemic, the average combined per diem funding per LTC resident in BC was $222 compared with $203 in Ontario. Long-term care residents were more likely to live in shared rooms in Ontario (63%) than in BC (24%). Links between hospitals, LTC and public health were stronger in BC, and the organizational structure of the health system was relatively stable compared with Ontario, which was undergoing significant change with the merging of regional entities and several provincial agencies into Ontario Health.

"BC overall was better prepared for the pandemic, and elected leaders and public health officials responded faster and more decisively with measures to limit transmission of SARS-CoV-2 into long-term care homes," says Dr. Irfan Dhalla, a physician at St. Michael's Hospital, Unity Health Toronto, and the University of Toronto.

The authors recommend that governments ensure clear, consistent communications; respond rapidly and proactively; ensure that disparities between for-profit and non-profit homes do not affect quality of care; move to single rooms; ensure that infection prevention and control teams can support LTC homes during outbreaks; and consider organizational structures that support integration between LTC, public health and hospitals.

"Residents of long-term care homes will always be vulnerable to infections," says Dr. Dhalla. "Our analysis highlights policies and practices that, if implemented, could help protect these vulnerable seniors from a second wave of COVID-19 as well as other infectious diseases."

Credit: 
Canadian Medical Association Journal