'Fun size' Cas9 nucleases hold promise for easier genome editing

Researchers from Skoltech and their colleagues from Russia and the US have described two new, compact Cas9 nucleases, the cutting components of CRISPR-Cas systems, that will potentially expand the Cas9 toolbox for genome editing. One of the two nucleases has been shown to work in human cells and thus can have biomedical applications. The paper was published in the journal Nucleic Acids Research.

CRISPR-Cas, the genome editing technology borrowed from bacteria, relies on Cas nucleases; these enzymes, when guided by CRISPR RNAs, can degrade target genetic sequences -- they are the blades in the Nobel Prize-winning 'genetic scissors.' In research applications, the most popular Cas9 nuclease is the Type II-A SpCas9 from Streptococcus pyogenes. It is efficient and relatively simple, as one large protein both binds crRNA and cleaves DNA; it also requires only a short PAM -- a protospacer adjacent motif, the string of nucleotides flanking the target site that the enzyme uses to locate and 'read' it.

But SpCas9 is a large protein, which creates a problem when one wants to, say, use an adeno-associated viral (AAV) particle as a vehicle for delivering the 'genetic scissors' into a cell. Ideally one would want to fit both the gene encoding the Cas protein and the sequences for guide RNAs into a single particle, and that size limit requires shorter Cas9 varieties. Those shorter nucleases, however, tend to require longer and more complex PAMs, so researchers face a tradeoff between protein size and choice of targets.
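
As a rough illustration of why PAM length matters, the short Python sketch below counts candidate PAM sites in a made-up DNA fragment for patterns of increasing length. The genome fragment is invented; NGG is SpCas9's well-known PAM, NNGG is the short PAM cited below for PpCas9-like enzymes, and the longest pattern simply stands in for the more restrictive PAMs that many compact Cas9s require.

import re

# Toy genome fragment (hypothetical sequence, for illustration only)
genome = "ATGCGGTTGGCCGTAAGGTCAGGAATTCCGGAGGTGACCTTGGACGGTGTCATGCAGG"

# 'N' (any base) is written as [ACGT]; longer, more specific PAMs match fewer sites.
pam_patterns = {
    "NGG (short, SpCas9-like)": "[ACGT]GG",
    "NNGG (short, PpCas9-like)": "[ACGT][ACGT]GG",
    "NNNNGATT (longer, stand-in for a restrictive PAM)": "[ACGT]{4}GATT",
}

for name, pattern in pam_patterns.items():
    # Each PAM occurrence marks a potential cleavage site next to it; count overlaps too.
    n_sites = len(re.findall(f"(?=({pattern}))", genome))
    print(f"{name}: {n_sites} candidate sites in a {len(genome)}-bp fragment")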

In their new paper, Iana Fedorova, who recently defended her Skoltech PhD, and research scientist Aleksandra Vasileva at the Skoltech Severinov lab and their colleagues describe two new small Cas9 nucleases derived from Defluviimonas sp. 20V17, a bacterium living in hydrothermal vents, and Pasteurella pneumotropica, a common bacterium found in rodents and other mammals. These nucleases happen to be small enough for an AAV vector and to have relatively short PAMs, a 'best of both worlds' option when it comes to Cas9 enzymes.

The new nucleases belong to Type II-C CRISPR-Cas systems, whose Cas9 effectors are usually smaller than SpCas9. These proteins adopt a conserved bilobed architecture similar to other Cas9 proteins, but they also have unique features: they lack several insertion subdomains and have a smaller Wedge domain (the domain that interacts with the single-guide RNA scaffold) -- and thus are more compact.

"Indeed, type II-C effectors tend to require longer PAM sequences, but it is just an observation based on a limited number of Type II-C effectors characterized to date. For example, the recently found SauriCas9 from Staphylococcus auricularis, similarly to PpCas9, requires a short PAM (5'-NNGG-3'). And most likely more Type II-C Cas9 enzymes requiring short PAMs will be found soon. These small Cas9 proteins with different PAM requirements expand the number of potentially editable DNA targets in eukaryotic and prokaryotic genomes," Fedorova says.

In vitro studies and experiments in bacteria showed that these two nucleases are efficient at cleaving DNA, and the P. pneumotropica Cas9 nuclease is active in human cells. It also turned out to be fairly similar to other Cas9 nucleases which had been shown to work in eukaryotic cells, Nme1Cas9 and Nme2Cas9. Although more research is needed to establish the efficiency of these nucleases, the authors believe they may present a viable alternative to the more conventional nucleases used in microbial engineering and biomedical genome editing.

Iana Fedorova notes that preliminary studies of PpCas9 off-targeting (unintended modifications) have shown that this enzyme has reasonable specificity. But to show that PpCas9 is specific enough to be considered as a genome editing tool, additional studies using more sophisticated methods are needed.

"Moreover, it looks like PpCas9 demonstrates selectivity in targeting of different genes in cells. It may reduce the range of possible PpCas9 genomic targets, and the nature of this bias is a subject of further studies," she adds.

Credit: 
Skolkovo Institute of Science and Technology (Skoltech)

Genomic analysis of early SARS-CoV-2 outbreak in Boston shows role of superspreading events

SARS-CoV-2 was introduced to the Boston area of Massachusetts many times in early 2020, according to a new analysis of virus genomes, but only a small number of importations - including one related to an international business conference - led to most cases there. Because viruses circulating at the conference happened to be marked by distinct genomic signatures, the study's authors were able to track the event's downstream effects far beyond the superspreading event itself. Their work "provides powerful evidence of the importance of superspreading events in shaping the course of this pandemic," they write.

As the COVID-19 pandemic continues, countries around the world seek to improve their control of the spread of the virus. A better understanding of transmission dynamics - including superspreading events - could help. Jacob Lemieux and colleagues used genomic epidemiology to investigate the introduction and spread of SARS-CoV-2 in the Boston, Massachusetts area. Data came from nearly all confirmed early cases in the Boston area, as well as from putative superspreading events involving an international conference held locally from February 26 to 27 and from close-quarters living environments.

Using SARS-CoV-2-positive samples collected in the state from March to May 2020, they generated 778 complete virus genomes and reconstructed phylogenetic relationships among them. Using ancestral trait inference, they identified 122 putative importation events into Massachusetts from four continents. Through further analysis, they confirmed two presumed superspreading events in the Boston area, one at the business conference and one at a skilled nursing facility. One variant associated with the conference spread extensively through Boston, the analysis shows. It began to appear in multiple other US states by March, and by November, viruses containing this variant could be found in 29 states. The superspreading event at the skilled nursing facility led to significant mortality in that vulnerable population but to little broader spread, the authors report.

The study "illustrates the role of chance in the trajectory of an epidemic," say the authors. "[A] single introduction had an outsized effect on subsequent transmission because it was amplified by superspreading in a highly mobile population very early in the outbreak," they say. By contrast, other early introductions led to very little onward transmission. The authors conclude: "This study provides clear evidence that superspreading events may profoundly alter the course of an epidemic and implies that prevention, detection and mitigation of such events should be a priority for public health efforts."
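
As a rough illustration of how a distinctive genomic signature makes this kind of tracing possible, the Python sketch below flags genomes carrying a marker substitution associated with one introduction and tallies where they were sampled. The marker position, sequences and locations are invented for illustration and are not the variants or data from the study.

from collections import Counter

# Hypothetical marker: a single nucleotide substitution at position 12 ('T')
MARKER_POS, MARKER_BASE = 12, "T"

# Toy records of (sampling location, aligned genome fragment) -- invented data
samples = [
    ("Massachusetts", "ACGTACGTACGATACGT"),
    ("Massachusetts", "ACGTACGTACGACACGT"),
    ("Florida",       "ACGTACGTACGATACGT"),
    ("Texas",         "ACGTACGTACGACACGT"),
]

# Count, per location, how many sampled genomes carry the marker lineage
carriers = Counter(loc for loc, seq in samples if seq[MARKER_POS] == MARKER_BASE)
print("Locations where the marked lineage was sampled:", dict(carriers))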

Credit: 
American Association for the Advancement of Science (AAAS)

Dartmouth researchers work to reduce child-directed food marketing on educational websites

A new article, published in the American Journal of Preventive Medicine by a team of researchers and advocates including Dartmouth faculty, asserts that current gaps in the regulation of commercial educational websites are exposing children to unhealthy food marketing.

"Our main issue is that if the USDA (United States Department of Agriculture) not deem products to be nutritious enough to sell in schools, companies shouldn't be allowed to advertise them on school-issued devices that children now need to use because of the COVID-19 pandemic," explains Jennifer Emond, PhD, MS, an assistant professor of biomedical data science and of pediatrics at Dartmouth's Geisel School of Medicine and lead author on the paper.

After doing extensive research on the topic, the study team visited several of the most popular commercial educational websites between March and July 2020. They scanned for the presence of unhealthy food marketing and found a number of ads for sugary cereals, fast food, and packaged kids' meals on websites such as ABCya!, which offers more than 400 educational games and apps for children in pre-kindergarten to grade 5.

As part of their research project, the core group, which now includes a coalition of more than 20 academic, public health, education, and advocacy organizations, wrote letters to the major companies involved and their governing agency, the Children's Food and Beverage Advertising Initiative (CFBAI), requesting that they stop food marketing on educational platforms for children.

They also reached out to the USDA, asking that the agency issue guidance on the topic and to help strengthen schools' local wellness policies, which traditionally have been focused on prohibiting unhealthy foods and beverages in classrooms and on school grounds.

As a result, a number of major food companies--including McDonald's, General Mills, Kraft Heinz, and Kellogg, which advertise products like Happy Meals, Lunchables, and Frosted Flakes--pledged to stop food marketing on educational websites for the remainder of the year.

While these were notable steps for the companies and CFBAI, which is responsible for the self-regulation of child-directed food marketing in the U.S., the pledges aren't permanent and they allow for many exceptions, says Emond, who adds that the group plans to make a new request to the USDA to limit this type of marketing under the incoming Biden administration.

"We want to see the USDA make sure that the food marketing policies within schools also cover school-issued devices," she says. "And we want the companies to follow suit--to not market their products on any computer, app, platform, or website that promotes itself as educational curriculum."

Fran Fleming-Milici, PhD, a co-author on the study and director of marketing initiatives at the University of Connecticut Rudd Center for Food Policy and Obesity, adds: "When parents and teachers direct children to educational platforms to support online learning, exposure to the marketing of junk food should not be part of that experience. Children's education and health should be put ahead of companies' profits."

Credit: 
The Geisel School of Medicine at Dartmouth

Computational method provides faster high-resolution mass spectrometry imaging

image: Top, hyperspectral visualization with data from a standard 9-hour experiment compared with hyperspectral visualization with data from a proposed 1-hour experiment.

Image: 
Courtesy the Beckman Institute for Advanced Science and Technology.

A new computational mass spectrometry imaging method enables researchers to achieve high mass resolution and high spatial resolution for biological samples while acquiring data roughly an order of magnitude faster.

Researchers at the Beckman Institute for Advanced Science and Technology developed a subspace mass spectrometry imaging approach that accelerates data acquisition -- without sacrificing quality -- through a model-based reconstruction strategy.

The technique, which was developed using animal models, could have important implications for many applications, including analytical chemistry and clinical studies, with results available in a fraction of the time. It also can detect a wide range of biomolecules -- from small molecules such as neurotransmitters and amino acids to larger molecules such as lipids or peptides.

The paper "Accelerating Fourier Transform-Ion Cyclotron Resonance Mass Spectrometry Imaging Using a Subspace Approach" was published in the Journal of the American Society of Mass Spectrometry.

"Fourier transform-ion cyclotron resonance is a really powerful instrument, providing the highest mass resolution," said Yuxuan Richard Xie, a bioengineering graduate student at the University of Illinois Urbana-Champaign, who is first author on the paper. "But one disadvantage of FT-ICR is it's very slow. So essentially, if people want to achieve a certain mass resolution, they have to wait days to acquire data sets. Our computational approach speeds up this acquisition process, potentially from one day to maybe one to two hours -- basically a tenfold increase in data acquisition speed."

"Our method is changing the way that we acquire the data," Xie said. "Instead of acquiring mass spectra per pixel, the technique recognizes the redundancy in the high-dimensional imaging data and uses a low-dimensional subspace model to exploit this redundancy to reconstruct multispectral images from only a subset of the data."

Xie collaborated with Fan Lam, an assistant professor of bioengineering, and Jonathan V. Sweedler, the James R. Eiszner Family Endowed Chair in Chemistry and the director of the School of Chemical Sciences, who are co-principal investigators on the paper. Daniel Castro, a graduate student in molecular and integrative physiology, also contributed.

"We have been using subspace models in our MRI and MR spectroscopic imaging work for a long time," Lam said. "It is really nice to see that it also has great potentials for a different biochemical imaging modality."

"The ability to acquire enhanced chemical information and the locations of the chemicals in a complex sample such as a section of a brain becomes enabling for our neurochemical research," Sweedler said.

The subspace imaging concept was pioneered by Zhi-Pei Liang, a professor of electrical and computer engineering and full-time Beckman faculty member, who is a world-leading expert in MRI and MRSI.

The research continues as researchers seek to apply the technique to 3D imaging. "(The approach) could have a much larger impact for the scientific community for 3D imaging of larger areas, such as the brain," Xie said. "Because if we do 50 slices on FT-ICR, it would take weeks right now, but (with this technique) we can achieve decent coverage maybe within days.

"I believe that computational imaging, especially the data driven approach, is like a new shining star. It's getting more and more powerful, and we should definitely utilize some of those techniques for chemical analysis of tissue through mass spectrometry imaging."

Credit: 
Beckman Institute for Advanced Science and Technology

COVID-19 found in the cornea: Are transplants a transmission risk?

SARS-CoV-2, the virus that causes COVID-19, has been found in conjunctival swabs and tears of infected patients, according to a new study published in The Ocular Surface.

The discovery prompted a research team including Shahzad Mian, M.D., an ophthalmologist at Kellogg Eye Center, to analyze the prevalence of the virus in human post-mortem ocular tissues. The results: the virus can infiltrate the cornea -- the clear outer layer of the eye -- including tissue that could be used for transplantation in the U.S., raising concerns that the disease could be transmitted to a healthy recipient.

Of the 132 ocular tissues from 33 donors intended for surgery in Michigan, Illinois, Ohio and New Jersey, 13% were positive for viral ribonucleic acid (RNA), a molecule similar to DNA. The donors were either known to have had the virus or had shown symptoms without a positive nasopharyngeal swab.

Studies have shown that COVID-19 patients hold much of the virus in the upper respiratory tract, so there's a strong possibility the virus could contaminate the outer layers of the eye via respiratory droplets after coughing, sneezing or hand-to-eye contact, according to Mian.

"There's no evidence to suggest COVID-19 can be transmitted from a corneal transplant, but our data assures us that a screening process to determine who's positive for the virus and who isn't is important to make sure we do everything in case there is a potential risk of transmission," Mian adds.

The findings also demonstrate the critical importance of post-mortem nasopharyngeal swab testing for detecting COVID-19 before transplantation. The study's donors were divided into three groups:

Group 1

This group was positive for COVID-19 after receiving a nasopharyngeal swab at the time of corneal recovery.

Group 2

This group was primarily made up of donors from early in the pandemic when testing wasn't widely available. The majority of these donors had a negative COVID-19 test.

Group 3

This group didn't have signs or symptoms of COVID-19 and tested negative, but they also spent extended amounts of time with someone who tested positive.

This is significant because 15% of the corneal samples from Group 2 presented with COVID-19 RNA, despite having a negative nasopharyngeal swab test. In fact, this was even higher than the presence of coronavirus-infected corneal tissue from Group 1, which only had a positivity rate of 11% despite the donors having positive nasopharyngeal swab tests.

None of the tissues from Group 3's two donors had a presence of COVID-19 RNA.

Lowering the presence of COVID-19 in the cornea

Aside from establishing a screening process, Mian set out to discover if there was a way to lower the presence of COVID-19 in the donor tissue -- another strategy that could diminish transmission risk.

"An initial study goal was to test the effectiveness of povidone-iodine, a disinfectant, in inactivating COVID-19," Mian says.

This was done by recovering donors' right eyes without povidone-iodine cleaning and applying the Eye Bank Association of America's recommended double povidone-iodine soak procedure to the left eyes.

The procedure involves soaking the cornea in 5% povidone-iodine for five minutes, then flushing with sterile saline fluid. This is repeated after a sample of corneal tissue is harvested. All of the eyes that underwent this disinfecting tested negative for COVID-19 RNA, compared to one of the right eye swabs that tested positive.

However, the team is unable to conclude that povidone-iodine is at all effective in reducing COVID-19 in corneal tissue because this procedure was only performed on 10 patients.

"A larger study is needed to confirm our findings, but we're excited about this research's potential implications. These questions are important in keeping our patients healthy and safe," Mian says.

It's unclear whether the presence of COVID-19 RNA is due to ocular surface infection or due to transport of the virus from the upper respiratory tract via the tear ducts. It's also unclear whether COVID-19 can replicate in corneal cells and what changes occur in these cells when infected.

"The takeaway I hope other physicians have when reading this study is that following elaborate donor screening procedures to mitigate the risk of pathogen transmission during transplant, as well as testing post-mortem donors for COVID-19 specifically, when there is no COVID-19 nasal swab testing, is critically important as professionals in this field."

Credit: 
Michigan Medicine - University of Michigan

Hubble identifies strange exoplanet that behaves like the long-sought "Planet Nine"

image: This 11-Jupiter-mass exoplanet called HD106906 b occupies an unlikely orbit around a double star 336 light-years away and it may be offering clues to something that might be much closer to home: a hypothesized distant member of our Solar System dubbed "Planet Nine." This is the first time that astronomers have been able to measure the motion of a massive Jupiter-like planet that is orbiting very far away from its host stars and visible debris disc.

Image: 
ESA/Hubble, M. Kornmesser

The 11-Jupiter-mass exoplanet called HD106906 b occupies an unlikely orbit around a double star 336 light-years away and it may be offering clues to something that might be much closer to home: a hypothesized distant member of our Solar System dubbed "Planet Nine." This is the first time that astronomers have been able to measure the motion of a massive Jupiter-like planet that is orbiting very far away from its host stars and visible debris disc.

The exoplanet HD106906 b was discovered in 2013 with the Magellan Telescopes at the Las Campanas Observatory in Chile's Atacama Desert. However, astronomers did not then know anything about the planet's orbit. Determining it required something only the Hubble Space Telescope could do: track the vagabond's motion over 14 years with extraordinary precision.

The exoplanet resides extremely far from its host pair of bright, young stars -- more than 730 times the distance of Earth from the Sun. This wide separation made it enormously challenging to determine the 15 000-year-long orbit in such a short time span of Hubble observations. The planet is creeping very slowly along its orbit, given the weak gravitational pull of its very distant parent stars.
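
For a sense of why the orbit is so slow, Kepler's third law ties orbital period to orbital size and the mass of the host stars. The numbers in the Python snippet below are rough assumptions, not values from the study: the ~730 au separation is treated as the semi-major axis, and the combined stellar mass is taken to be roughly 2.7 solar masses, so the result is only an order-of-magnitude check against the quoted 15 000-year orbit.

# Kepler's third law in solar units: P[yr]^2 = a[au]^3 / M_total[M_sun]
a_au = 730.0        # assumed semi-major axis, using the quoted separation as a stand-in
m_total = 2.7       # assumed combined mass of the two host stars (solar masses)

period_years = (a_au ** 3 / m_total) ** 0.5
print(f"Approximate orbital period: {period_years:,.0f} years")
# ~12,000 years: the same order of magnitude as the ~15,000-year orbit quoted above
# (an eccentric orbit's semi-major axis need not equal the current separation).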

The Hubble team behind this new result was surprised to find that the remote world has an extreme orbit that is highly inclined, elongated and external to a dusty debris disc that surrounds the exoplanet's twin host stars. The debris disc itself is extraordinary, perhaps because of the gravitational tug of the errant planet. This study was led by Meiji Nguyen of the University of California, Berkeley.

"To highlight why this is weird, we can just look at our own Solar System and see that all of the planets lie roughly in the same plane," explained Nguyen. "It would be bizarre if, say, Jupiter just happened to be inclined 30 degrees relative to the plane that every other planet orbits in. This raises all sorts of questions about how HD 106906 b ended up so far out on such an inclined orbit."

The prevailing theory to explain how the exoplanet arrived at such a distant and strangely inclined orbit is that it formed much closer to its stars, about three times the distance that Earth is from the Sun. However, drag within the system's gas disc caused the planet's orbit to decay, forcing it to migrate inward toward its stellar hosts. The gravitational forces from the whirling twin stars then kicked it out onto an eccentric orbit that almost threw it out of the system and into the void of interstellar space. Then a star passed very close to this system, stabilising the exoplanet's orbit and preventing it from leaving its home system. Candidate passing stars had been previously identified using precise distance and motion measurements from the European Space Agency's Gaia survey satellite.

This scenario to explain HD106906 b's bizarre orbit is similar in some ways to what may have caused the hypothetical Planet Nine to end up in the outer reaches of our own Solar System, beyond the Kuiper Belt. Planet Nine could have formed in the inner Solar System and then been kicked out by interactions with Jupiter. However, Jupiter would very likely have flung Planet Nine far beyond Pluto. Passing stars may have stabilised the orbit of the kicked-out planet by pushing the orbit path away from Jupiter and the other planets in the inner Solar System.

"It's as if we have a time machine for our own Solar System going back 4.6 billion years to see what may have happened when our young Solar System was dynamically active and everything was being jostled around and rearranged," explained team member Paul Kalas of the University of California, Berkeley.

To date, astronomers have only circumstantial evidence for the existence of Planet Nine. They've found a cluster of small celestial bodies beyond Neptune that move in unusual orbits compared to the rest of the Solar System. This configuration, some astronomers think, suggests that these objects were shepherded together by the gravitational pull of a huge, unseen planet. An alternative hypothesis is that there is not one giant perturber, but instead the imbalance is due to the combined gravitational influence of much smaller objects.

"Despite the lack of detection of Planet Nine to date, the orbit of the planet can be inferred based on its effect on the various objects in the outer Solar System," explained team member Robert De Rosa of the European Southern Observatory in Santiago, Chile who led the study's analysis. "This suggests that if a planet was indeed responsible for what we observe in the orbits of trans-Neptunian objects it should have an eccentric orbit inclined relative to the plane of the Solar System. This prediction of the orbit of Planet Nine is similar to what we are seeing with HD 106906b."

Scientists using the upcoming NASA/ESA/CSA James Webb Space Telescope plan to get additional data on HD106906 b to better understand the planet's system. Astronomers want to know where and how the planet formed and whether the planet has its own debris system around it, among other questions.

"There are still a lot of open questions about this system," added De Rosa. "For example, we do not conclusively know where or how the planet formed. Although we have made the first measurement of orbital motion, there are still large uncertainties on the various orbital parameters. It is likely that both observers and theorists alike will be studying HD 106906 for years to come, unraveling the many mysteries of this remarkable planetary system."

Credit: 
ESA/Hubble Information Centre

When it comes to feeling pain, touch or an itch, location matters

image: From left: Martyn Goulding and Graziana Gatto.

Image: 
(L) Yolanda Leenders-Goulding; (R) Salk Institute.

LA JOLLA--(December 10, 2020) When you touch a hot stove, your hand reflexively pulls away; if you miss a rung on a ladder, you instinctively catch yourself. Both motions take a fraction of a second and require no forethought. Now, researchers at the Salk Institute have mapped the physical organization of cells in the spinal cord that help mediate these and similar critical "sensorimotor reflexes."

The new blueprint of this aspect of the sensorimotor system, described online in Neuron on November 11, 2020, could lead to a better understanding of how it develops and can go awry in conditions such as chronic itch or pain.

"There's been a lot of research done at the periphery of this system, looking at how cells in the skin and muscles generate signals, but we didn't know how that sensory information is trafficked and interpreted once it reaches the spinal cord," says Martyn Goulding, a professor in Salk's Molecular Neurobiology Laboratory and holder of the Frederick W. and Joanna J. Mitchell Chair. "This new work gives us a fundamental understanding of the architecture of our sensorimotor system."

Reflexive behaviors--seen even in newborn babies--are considered some of the simplest building blocks for movement. But reflexes must quickly translate information from sensory neurons that detect touch, heat and painful stimuli to motor neurons, which cause the muscles to take action. For most reflexes, the connections between the sensory neurons and motor neurons are mediated by interneurons in the spinal cord, which serve as sort of "middlemen," thereby saving time by bypassing the brain. How these middlemen are organized to encode reflexive actions is poorly understood.

Goulding and his colleagues turned to a set of molecular engineering tools they've developed over the past decade to examine the organization of these spinal reflexes in mice. First, they mapped which interneurons were active when mice responded reflexively to sensations, like itch, pain or touch. They then probed the function of interneurons by turning them on and off individually and observing how the resulting reflex behaviors were affected.

"What we found is that each sensorimotor reflex was defined by neurons in the same physical space," says postdoctoral researcher Graziana Gatto, the first author of the new paper. "Different neurons in the same place, even if they had very different molecular signatures, had the same function, while more similar neurons located in different areas of the spinal cord were responsible for different reflexes."

Interneurons in the outermost layer of the spinal cord were responsible for shuttling reflexive messages related to itch between sensory and motor cells. Deeper interneurons relayed messages of pain--causing a mouse to move a foot touched by a pin, for instance. And the deepest set of interneurons helped mice reflexively keep their balance, stabilizing their body to prevent falling. But within each spatial area, neurons had varying molecular properties and identities.

"These reflexive behaviors have to be very robust for survival," says Goulding. "So, having different classes of interneurons in each area that contribute to a particular reflex builds redundancy into the system."

By demonstrating that the location of each interneuron type within the spinal cord matters more than the cell's developmental origin or genetic identity, the team tested and confirmed an existing theory about how these reflex systems are organized.

Now that they know the physical architecture of the interneuron circuits that make up these different reflex pathways, the researchers are planning future studies to reveal how messages are conveyed and how the neurons within each space interact with each other. This knowledge is now being used to probe how pathological changes in the somatosensory system result in chronic itch or pain. In an accompanying paper, Gatto and Goulding collaborated with Rebecca Seal of the University of Pittsburgh to map the organization of neurons that generate different forms of chronic pain.

Credit: 
Salk Institute

Black churches are trusted messengers of COVID-19 information to their communities

ROCHESTER, Minn. -- U.S. public health officials have reported that Black communities are disproportionately affected by the COVID-19 pandemic, with higher infection and mortality rates than the general population. These disparities relate to the prevalence of underlying chronic diseases, and social and economic inequality, according to Mayo experts. Now, as the number of COVID-19 cases across the U.S. surges, Mayo Clinic researchers are working closely with Black churches to address disparities in emergency preparedness and to provide access to culturally relevant, evidence-based health information. The early results of this research were published Thursday, Dec. 10, in Preventing Chronic Disease, the Centers for Disease Control and Prevention's (CDC) public health journal.

More than 100 of these Black churches in Rochester and the Twin Cities are active in the FAITH! program (Fostering African-American Improvement in Total Health), which LaPrincess Brewer, M.D., has led at Mayo Clinic since 2012. Dr. Brewer is a cardiologist and health disparities researcher and is first author on the paper.

FAITH! is an academic-community partnership focused on tackling health disparities, particularly related to cardiovascular disease within Black communities. When the pandemic began to threaten these communities in early 2020, Dr. Brewer and her FAITH! partners quickly pivoted to focus their work on the pandemic and position Black churches as strongholds of COVID-19 information and preparedness.

"Black churches have long been more than places of worship to their communities," says Dr. Brewer. "They serve as strongholds for disseminating trusted information, including health information, in their communities."

The paper in Preventing Chronic Disease details these efforts. The researchers used the CDC Crisis and Emergency Risk Communication framework, a strategy previously tested in another recent Mayo COVID-19 community-based research study.

The research began with an emergency resource needs assessment. Needs identified were financial support, food and utilities, and COVID-19 health information. Nearly all respondents (97%) expressed the desire to receive COVID-19 health information through their FAITH! partnership via email and social media.

The researchers then distributed emergency preparedness manuals and American Red Cross emergency preparedness starter kits to each participating church and helped them establish COVID-19 emergency preparedness teams to serve their congregations. They also trained a pair of communication leaders who were already trusted messengers within the community to deliver culturally relevant, evidence-based messages according to each church's emergency resource needs.

Feedback on program feasibility and acceptability was overwhelmingly positive from church and communication leaders.

"Because the congregants know that the FAITH project is tailored toward the African American community, one of the things is it's reliable. It's believed to be reliable and trustworthy, so that's the primary thing," says one of the participating church leaders whose name was redacted in the study.

The researchers hope the community-based approach outlined in their paper can help others plan and implement effective campaigns and initiatives related to COVID-19 or other pressing health concerns among high-risk populations.

Credit: 
Mayo Clinic

Research examines impact of hurricanes on hospitalizations, medical providers

More older adults are hospitalized in the month following hurricanes while fewer primary care doctors, surgeons and specialists are available in some of their communities in the long term, according to a pair of University of Michigan studies.

The findings are noteworthy as the population of older adults is rapidly growing alongside increasing impacts from climate change, such as extreme weather events, the U-M researchers say.

The study results can help inform policy around disaster preparedness and response planning for health care systems, says lead author Sue Anne Bell, assistant professor at the U-M School of Nursing.

The first study looked at hospitalizations among adults 65 and older in the month after each of eight of the largest hurricanes to strike the U.S. in recent years. It found that hospitalizations for any reason among these older adults rose after every storm, by as little as 10% (Hurricane Irene, 2011) and as much as 23% (Hurricane Sandy, 2012). People 85 and older were significantly more likely to be hospitalized, as were older adults living in poverty.

When researchers removed the first three days after hurricanes Irene and Sandy that might account for admissions for injuries and trauma, hospitalizations remained significantly elevated.

"We can surmise that the stronger the hurricane, the greater the impact will be on individuals and communities," Bell said. "But even a small storm can cause great damage to a community that is not prepared. Areas we plan to include in future studies are state and local policies around emergency preparedness."

Another study will analyze the most common reasons for admission, says Bell, who encourages older adults to plan for disasters by taking small, regular steps before an emergency.

"Include in your grocery budget a few items each month to start building a supply of food and water," Bell said. "A gallon of water is less than a dollar, for example. Buying a can opener and a few canned goods can be a good start, and the next time you go to the grocery, think of another item or two to add to your stash.

"Another way is to communicate with family, friends and neighbors. Let them know your evacuation plans in the event of an emergency. Now is a great time to plan a family Zoom meeting and talk about those plans."

Bell was surprised that hospitalizations increased across all eight disasters.

"Considering there are generally more than 100 disasters occurring in the U.S. annually, including steps to support the health of older adults in disaster preparedness is important," she said.

The second study compared the numbers of medical providers one year before and five years after hurricanes Katrina and Sandy. Counties affected by Katrina saw 3.6 fewer primary care physicians per 10,000 residents; 5.9 fewer medical specialists per 10,000 residents; and 2.1 fewer surgeons per 10,000 residents.

The availability of nurse practitioners didn't change and helped to offset the decline in physicians. Counties affected by Hurricane Sandy saw no significant decrease in providers.

"While both storms were catastrophic events for their communities, areas affected by Hurricane Katrina already had a higher economic burden than those in Hurricane Sandy, and therefore their recovery was more protracted," Bell said. "Communities that are more affluent have more means to recover, and these communities are going to be more attractive to health care providers to live and work."

Bell says the findings confirm that communities, especially in economically disadvantaged areas, should include guidelines to attract and retain providers in their disaster plans.

"As they recover, these communities may need health care services the most," she said.

The hospitalization study examined Medicare claims data, and the provider study examined data from the National Plan and Provider Enumeration System.

Credit: 
University of Michigan

The psychology of causality

Like a parent being pestered with endless questions from a young child, most people will now and again find themselves following an infinite chain of cause and effect when considering what led to some particular event. And while many factors can contribute to an event, we often single out only a few as its causes. So how do we decide?

That's the topic of a recent paper by Tadeg Quillien, a doctoral student in the Department of Psychological and Brain Sciences. The study, published in the journal Cognition, outlines how a factor's role in an event influences whether or not we consider it to be a cause of that event.

In his paper, Quillien constructs a mathematical model of causal judgement that reproduces people's intuitions better than any previous model. And in addition to providing theoretical insights, understanding how we reason about causality has major implications for how we approach problems overall.

Intuitively speaking, the factor that plays the strongest role in determining an outcome is generally considered its cause. Indeed, across different studies, philosophers and psychologists have observed people ranking the causes of an event. For instance, if a match is found at the scene of a forest fire, people usually say the match caused the blaze, even though the oxygen in the air was also necessary for the fire to start.

"But what do we mean by 'the strongest role'?" Quillien asked. "This is still a very hazy notion, and making it more precise has, for decades, been a source of headaches for philosophers and psychologists trying to understand causal judgment."

Quillien approached this question by considering what evolutionary purpose our causal reasoning serves. "At least one of the functions of causal judgement is to highlight the factors that are most useful in predicting an outcome," Quillien proposed, "as well as the factors that you can manipulate to affect the outcome."

The process reminded him of a scientist seeking to understand how different phenomena are related. Scientists can run controlled experiments with many different cases to quantify correlations and determine an effect size, which is the association between one variable and another.

But if we accept that this is what the mind is trying to do, a problem arises. Scientists rely on many observations before arriving at a judgment. They can't compute an effect size from a single occurrence. And yet, people generally have no trouble making one-off causal judgements.

Quillien believes that this paradox can be resolved with the following hypothesis. When people make a causal judgment, they are unconsciously imagining the different ways that an event could have unfolded. "These counterfactuals give you the data that you need in order to compute this measure of effect size," he said.

Guided by these ideas, Quillien designed a simple mathematical model of how people make causal judgments. To test his model, he analyzed data from an experiment conducted by Harvard psychologist Adam Morris and his colleagues. The experiment used a lottery game to explore the effect of probability and logical structure on people's causal intuitions.

"The probability of events affects our sense of causation in a strange way," Quillien explained. Say a professor, Carl, wants funding for a project. His request is reviewed by his department chairs, Alice and Bill, both of whom have to approve it. Alice approves nearly every application, but Bill is notorious for rejecting most of them. The question is, if Carl receives his funding, who's most responsible?

Most people would say Bill caused Carl's request to be approved, since getting his endorsement has more bearing, in general, on receiving funding.

However, change just one detail, and people's intuitions flip. If Carl only needs the approval of one or the other of his colleagues, and still gets both, then people attribute Carl's funding to Alice. In this case, her more reliable support was the strongest factor in whether Carl's project was funded.

In their experiment, Morris and his colleagues were able to precisely quantify this effect that an event's probability had on people's causal judgment. Their conclusion was surprising, and no psychological theory at the time could explain their results, Quillien said.

When he re-analyzed their data, Quillien found that his mathematical model closely matched how Morris's participants had assigned causality to the various events. In fact, it matched the data better than any other model to date.

The results highlight how probability and logical structure together inform our causal intuition. When both votes are necessary for Carl to get funding, it will happen only if the most stringent committee member is on board. As a result, people attribute a positive outcome to the less likely vote. By contrast, in situations where a single vote is enough, the approval of the more permissive faculty member is what most often determines the outcome. "We are attuned to causes that tend to co-occur with the effects," Quillien said.
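
A minimal Monte Carlo sketch of this intuition is shown below in Python, with illustrative approval probabilities of 0.9 for Alice and 0.1 for Bill; it is not the exact model or parameters from the paper. Sampling many imagined "worlds" and checking how strongly each vote co-varies with the outcome shows the flip: under "both must approve," funding tracks Bill's vote, while under "either suffices," it tracks Alice's.

import random

random.seed(0)
P_ALICE, P_BILL = 0.9, 0.1   # illustrative approval probabilities (assumed)
N = 100_000                  # number of imagined counterfactual "worlds"

def correlation(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    vx = sum((x - mx) ** 2 for x in xs) / n
    vy = sum((y - my) ** 2 for y in ys) / n
    return cov / (vx * vy) ** 0.5

for rule, outcome in [("both must approve", lambda a, b: a and b),
                      ("either suffices",   lambda a, b: a or b)]:
    alice = [int(random.random() < P_ALICE) for _ in range(N)]
    bill = [int(random.random() < P_BILL) for _ in range(N)]
    funded = [int(outcome(a, b)) for a, b in zip(alice, bill)]
    print(f"{rule}: corr(Alice, funded) = {correlation(alice, funded):.2f}, "
          f"corr(Bill, funded) = {correlation(bill, funded):.2f}")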

The way in which we reason about causality has practical implications. Consider the example of the forest fire again. Fires need three things to burn: oxygen, fuel and an ignition source. But our minds don't give these factors equal weight.

"While we may not have an exact model of how forest fires work, we still have this sense that oxygen is there all the time, and the forests are not always on fire," Quillien said. "So the correlation between oxygen and fire is relatively low." The same reasoning applies to the fuel, namely the wood in the trees. But introduce a match to the equation, and the forest is much more likely to catch ablaze.

The method of causal judgement that Quillien outlines in his work is good at guiding us toward the match: a factor with high predictive power that we might even be able to control. However, our intuition can sometimes lead us astray when we try to gain a more complete understanding of the world.

"If you want a deep understanding of how fire works, you need to factor in the role of oxygen," Quillien said. "But if your intuitive sense of causation is screaming at you that oxygen does not matter, then that might lead you to ignore some of the important factors in the world."

Causal reasoning is a ubiquitous feature of cognition, and Quillien plans to further investigate how our sense of causation influences other aspects of our psychology and worldview. "We explain almost everything in terms of cause and effect," he said. "As a consequence, many of the concepts that we use to make sense of the world have causation as a building block."

"If we can understand the concept of causation, then we can potentially understand the way a lot of other concepts work as well."

Credit: 
University of California - Santa Barbara

Nanocylinder vibrations help quantify polymer curing for 3D printing

image: Colorized plot of the light-assisted curing of a polymer over five seconds, as measured with NIST's custom atomic force microscope with a nanocylinder probe. Darker colors indicate a higher level of conversion from a liquid resin to a polymer. The magenta block at left represents the light fixture that initiates the reaction.

Image: 
NIST

In a step toward making more accurate and uniform 3D-printed parts such as personalized prosthetics and dental materials, researchers at the National Institute of Standards and Technology (NIST) have demonstrated a method of measuring the rate at which microscopic regions of a liquid raw material harden into a solid plastic when exposed to light.

NIST's custom atomic force microscope (AFM) with a nanometer-scale, cylinder-shaped tip revealed that the complex process of curing resins, as they react under light to form polymers, requires controlling how much of the light's energy goes into forming the polymer and how much the polymer spreads out, or diffuses, during 3D printing.

Described in a new paper, the NIST experiments showed that overall light-exposure conditions, not just the total optical energy as often assumed, control how far the polymer diffuses. For example, increasing light intensity for a constant or shorter duration reduced resin-to-polymer conversion and could distort the shape of a printed part. The measurements required only a few microliters of resin, offering a way to reduce the costs of making and testing novel resins.

"This research really digs into the unique process and materials science insight afforded by our new metrology techniques," project leader Jason Killgore said.

The work builds on the NIST team's prior development of a related AFM method -- sample-coupled-resonance photorheology (SCRPR) -- that measures how and where a material's properties change in real time at the smallest scales during the curing process. Those measurements were made with conventional, tapered AFM probes, which have angled sides and therefore can't reliably measure localized liquid flow or thickness, technically referred to as viscosity.

Now, NIST researchers have quantified viscosity, conversion and diffusion by use of a cylindrical AFM probe, which has straight sides surrounded by consistent liquid flow. The probe's vibrations, as they perturb the resin, are reduced by an amount that depends on cylinder length and liquid viscosity. The increase in liquid resin viscosity is related to conversion, enabling measurements of the polymer's evolution in space and time.

Researchers used computational fluid dynamics to model the force slowing down, or damping, the oscillating nanocylinder and the resulting changes in its speed to determine the amount of resin affected by the motion. By relating SCRPR damping to resin viscosity and conversion, researchers made spatial maps of conversion versus time for different exposure conditions.

The AFM was equipped with a light modulator that directed patterned light from an LED to the resin sample. Measurements of the conversion of a fast-curing resin showed polymer accumulating tens of micrometers away from the light source within a few seconds of exposure, indicating the extent and speed of diffusion. The size of the light pattern was important; wider features led to higher conversion at a given light intensity and duration (see image).

SCRPR has attracted interest from industry. So far one company has visited NIST to use the instrumentation, Killgore said.

Credit: 
National Institute of Standards and Technology (NIST)

Germans want open communication of uncertainty in the coronavirus pandemic

The COVID-19 pandemic has once again highlighted the uncertainty inherent in science. The results of a Germany-wide study conducted by researchers at the Max Planck Institute for Human Development and Charite - Universitaetsmedizin Berlin show that most Germans want to be openly informed about this uncertainty. The results have now been published in the journal JAMA Network Open.

Since the SARS-CoV-2 virus was first identified in December 2019, new scientific findings on the spread of the virus, symptoms of COVID-19, and new treatments have been reported almost daily. What is valid one day may be outdated the next. Likewise, predictions on how the number of infections will develop by Christmas and what effects the current 'circuit breaker' lockdown will have, as well as current estimates of the reproduction number (R), are anything but certain.

"Politicians and health experts sometimes shy away from communicating scientific uncertainty, fearing that it will generate mistrust. But presenting uncertain aspects of the pandemic as certain may also have negative effects on citizens' trust if those reports later prove to be invalid," says Odette Wegwarth, lead author of the study and Senior Research Scientist in the Center for Adaptive Rationality at the Max Planck Institute for Human Development as well as associated researcher in the Institute of Medical Sociology and Rehabilitation Science at Charite.

To investigate people's preferences for health communications with varying degrees of scientific uncertainty in the context of COVID-19, a team of researchers from the Max Planck Institute for Human Development and Charite ran an online survey with a representative sample of 2,011 Germans. Participants were shown four scenarios that communicated information on the future course of the pandemic with varying magnitudes of scientific uncertainty. In the version with the highest degree of uncertainty, information on, for example, current infections, deaths, and the R number was communicated in terms of ranges rather than precise values. The text also emphasized that "it is uncertain whether the differences observed are due to random fluctuation or are the first signs of the onset of a second wave of coronavirus infections."

In contrast, the version with the lowest degree of uncertainty reported precise values and stressed that "this development in case numbers leaves no doubt that a second wave of infections has already begun." Each version concluded with the same appeal, namely, to continue taking preventive measures to protect risk groups, such as wearing a face mask in public places.

Participants were then asked which of the versions would be most suitable for informing people about the future course of the COVID-19 pandemic. The largest group of respondents (32%) chose the version expressing the highest degree of uncertainty. This version was also the most likely to persuade people to comply with containment measures. Overall, more than half of the participants (54%) preferred one of the versions that conveyed numerical and/or verbal uncertainty over the versions that did not. The version that left uncertainty unmentioned proved the least popular, being chosen by just 21% of respondents. Interestingly, communication expressing uncertainty appeared to be particularly effective for motivating those who are currently skeptical of governmental containment measures to comply with those measures.

"To better engage with those people who are currently skeptical about the government's coronavirus measures, the government and the media should have the courage to communicate uncertainties more openly," recommends Gert G. Wagner, co-author of the study and Max Planck Fellow at the MPI for Human Development.

Credit: 
Max Planck Institute for Human Development

Artificial Chemist 2.0: quantum dot R&D in less than an hour

image: This color wheel of quantum dots highlights some of the colors that can be made with Artificial Chemist 2.0.

Image: 
Milad Abolhasani, NC State University

A new technology, called Artificial Chemist 2.0, allows users to go from requesting a custom quantum dot to completing the relevant R&D and beginning manufacturing in less than an hour. The tech is completely autonomous, and uses artificial intelligence (AI) and automated robotic systems to perform multi-step chemical synthesis and analysis.

Quantum dots are colloidal semiconductor nanocrystals, which are used in applications such as LED displays and solar cells.

"When we rolled out the first version of Artificial Chemist, it was a proof of concept," says Milad Abolhasani, corresponding author of a paper on the work and an assistant professor of chemical and biomolecular engineering at North Carolina State University. "Artificial Chemist 2.0 is industrially relevant for both R&D and manufacturing."

From a user standpoint, the whole process essentially consists of three steps. First, a user tells Artificial Chemist 2.0 the parameters for the desired quantum dots. For example, what color light do you want to produce? The second step is effectively the R&D stage, where Artificial Chemist 2.0 autonomously conducts a series of rapid experiments, allowing it to identify the optimum material and the most efficient means of producing that material. Third, the system switches over to manufacturing the desired amount of the material.

"Quantum dots can be divided up into different classes," Abolhasani says. "For example, well-studied II-VI, IV-VI, and III-V materials, or the recently emerging metal halide perovskites, and so on. Basically, each class consists of a range of materials that have similar chemistries.

"And the first time you set up Artificial Chemist 2.0 to produce quantum dots in any given class, the robot autonomously runs a set of active learning experiments. This is how the brain of the robotic system learns the materials chemistry," Abolhasani says. "Depending on the class of material, this learning stage can take between one and 10 hours. After that one-time active learning period, Artificial Chemist 2.0 can identify the best possible formulation for producing the desired quantum dots from 20 million possible combinations with multiple manufacturing steps in 40 minutes or less."

The researchers note that the R&D process will almost certainly become faster every time people use it, since the AI algorithm that runs the system will learn more - and become more efficient - with every material that it is asked to identify.
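
The Python sketch below is a very loose illustration of the learn-then-optimize idea described above, not the actual algorithms inside Artificial Chemist 2.0: the single synthesis parameter, the toy "chemistry," and the two-stage loop are all invented. It first spends a small experimental budget exploring parameter space (the active learning stage), then fits a simple surrogate model and picks the formulation predicted to hit the requested emission colour.

import numpy as np

rng = np.random.default_rng(1)

# Hidden "true" relationship between a hypothetical synthesis parameter (e.g. a
# precursor ratio) and the emission peak of the resulting quantum dots, in nm.
def run_experiment(x):
    return 620 - 180 * x + 60 * x**2 + rng.normal(0, 2)   # toy chemistry + noise

target_nm = 530.0                         # user-requested emission colour
candidates = np.linspace(0.0, 1.0, 201)   # formulations the robot could try

# Active-learning stage: spread a small budget of experiments over parameter space,
# always measuring wherever we currently know least (farthest from any tried point).
tried_x, tried_y = [0.0, 1.0], [run_experiment(0.0), run_experiment(1.0)]
for _ in range(6):
    dist = np.min(np.abs(candidates[:, None] - np.array(tried_x)[None, :]), axis=1)
    x_next = candidates[np.argmax(dist)]
    tried_x.append(x_next)
    tried_y.append(run_experiment(x_next))

# Exploitation stage: fit a simple surrogate model and pick the formulation
# predicted to land closest to the requested colour.
coeffs = np.polyfit(tried_x, tried_y, deg=2)
predicted = np.polyval(coeffs, candidates)
best_x = candidates[np.argmin(np.abs(predicted - target_nm))]
print(f"Suggested formulation parameter: {best_x:.3f} "
      f"(predicted peak {np.polyval(coeffs, best_x):.1f} nm)")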

Artificial Chemist 2.0 incorporates two chemical reactors, which operate in series. The system is designed to be entirely autonomous, and allows users to switch from one material to another without having to shut down the system. Video of how the system works can be found at https://youtu.be/e_DyV-hohLw.

"In order to do this successfully, we had to engineer a system that leaves no chemical residues in the reactors and allows the AI-guided robotic system to add the right ingredients, at the right time, at any point in the multi-step material production process," Abolhasani says. "So that's what we did.

"We're excited about what this means for the specialty chemicals industry. It really accelerates R&D to warp speed, but it is also capable of making kilograms per day of high-value, precisely engineered quantum dots. Those are industrially relevant volumes of material."

Credit: 
North Carolina State University

Restorative justice preferred among the Enga in Papua New Guinea

image: A group of Enga men offer pigs to contribute to the compensation for the victim for harm done.

Image: 
Polly Wiessner

All human groups create systems for regulating cultural norms to maintain cooperation in society. Most large-scale populations employ a punitive judicial system where a third party doles out punishment. Yet, there's little evidence that this system achieves cooperation in a community. Advocates have long called for a more restorative justice system that repairs harm done to victims and reintegrates wrongdoers back into society. From an evolutionary perspective, humans have developed capacities to resolve disputes through restoration such as language, empathy and causal reasoning, and theory of mind to understand others and their perspectives.

"There's a lot of dismay with our current punitive criminal justice system," said lead author Polly Wiessner, anthropologist at the University of Utah and Arizona State University. "Many people have argued that punishment is necessary to maintain the norms and values of society. But when you work in small societies, you just don't see that."

In a study published online Dec. 7, 2020, in the journal PNAS, Wiessner analyzed 10 years of third-party regulation among the Enga of Papua New Guinea. The Enga, a small-scale horticultural society, navigate both formal, western courts, which administer punitive justice, and customary village courts, which negotiate restorative settlements. The vast majority of disputes and violations go through the village courts. Wiessner's team of Enga research colleagues documented 333 village court cases concerning assault, marriage, and land and property violations. Results show that village courts overwhelmingly emphasize restorative justice, allowing both sides and community members to tell their side of the story and vent their feelings. There was no instance of summoning the police or recommending a jail sentence. Instead, the community assisted the wrongdoer in paying compensation to the victim to make up for harm done, and supported reintegrating the offender back into society. In small-scale societies, people recognize that every member is valuable and has something to give; however, repeated offenders and free riders receive ever less community support.

"In large-scale societies, it's anonymous--judges have no repercussions for handing out harsh sentences in the way that community leaders do. Judges don't take the time to know the person. In small-scale societies, the community knows the person, needs them and wants to keep them in the fold" said Wiessner.

Enga society is facing rapid change with the impact of globalization and technology, high-powered weapons, mobile phones, internet, cash and modern transport. With these developments come changes in patterns of interaction and communication between people of different ages, genders and social standings. The community forums in customary courts allow people to discuss change and update norms to adapt to the future without losing sight of their heritage.

As one senior Enga magistrate, Anton Yongapen, explained, "We have no law books. We must just listen to the different sides and use our heads and hearts to apply custom in appropriate ways to bring about justice today. Justice must not only be done but be seen to be done by community, else our goal of bringing about peace and harmony will not be achieved."

Restorative justice has some drawbacks--it fails to remove truly dangerous people from society and it's hard to scale up beyond the community to large populations. While difficult, the U.S. and other countries have already seen considerable success with restorative justice in youth courts where there's broad support to avoid losing young offenders' possible future contributions to society. Wiessner suggests that today many have a yearning for integrating restorative measures, which has led to significant attempts to reintroduce amends and reconciliation into our judicial system.

Credit: 
University of Utah

Which product categories and industries benefit most from social advertising

CATONSVILLE, MD, December 10, 2020 - New research from a team of scientists at four leading universities sheds new light on the effectiveness of social advertising across specific product categories, identifying which categories tend to benefit most from social advertising and which may not.

"Social advertising is the placement of social cues or endorsements in ads shown to friends of those who have engaged with a brand or product," said Huang. "Some examples include Facebook's social advertising that places the images and names of Facebook friends who have liked a brand in their ads. Or, Google's Shared Endorsement ads that do the same thing, placing the names, images and product ratings of others in product search results.

The researchers compared advertising without these social cues to advertising that included them, arriving at findings that can help marketers develop and implement more effective advertising strategies.

The research study to be published in the December issue of the INFORMS journal Marketing Science, "Social Advertising Effectiveness Across Products: A Large-Scale Field Experiment," is authored by Shan Huang of the University of Washington, Sinan Aral of Massachusetts Institute of Technology (MIT), Yu Jeffrey Hu of Georgia Institute of Technology, and Erik Brynjolfsson of Stanford University.

"We found some categories like clothing, car and food products exhibited significantly stronger social advertising effectiveness," said Aral. "These are products that are experiential and sometimes carry certain levels of status, which matters among friends when making purchase decisions. We then found that other goods we call 'Search Goods' did not perform any better with or without social advertising. Search goods are products that are easier to evaluate using nonsocial information about the product's characteristics, such as what can be found online. Examples of these types of products are commodities like batteries, home appliances or in some cases consumer technology."

Most similar research in this area has centered on advertising and social proof involving a single product at a time. This study is broader and farther reaching.

The researchers examined how social influence and ad engagement varies by comparing 71 products in 21 categories using a random sample of more than 37 million users of a large social network (WeChat). They focused on WeChat Moments ads, a type of social ad that is displayed in WeChat users' news feeds. WeChat is considered a world-leading mobile social networking platform with more than 1 billion monthly active users. Click-through rates were used as a critical measure of social advertising performance.
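
To make the outcome measure concrete, here is a small Python sketch, with invented counts rather than the study's data, of how click-through rates for ads with and without social cues might be compared for one product category, including a simple two-proportion z-test for whether the lift is statistically meaningful.

from math import sqrt, erf

# Invented example counts for one product category (not the study's data)
clicks_social, impressions_social = 1_450, 100_000   # ads showing friends' endorsements
clicks_plain, impressions_plain = 1_200, 100_000     # identical ads without social cues

ctr_social = clicks_social / impressions_social
ctr_plain = clicks_plain / impressions_plain
lift = (ctr_social - ctr_plain) / ctr_plain

# Two-proportion z-test for the difference in click-through rates
p_pool = (clicks_social + clicks_plain) / (impressions_social + impressions_plain)
se = sqrt(p_pool * (1 - p_pool) * (1 / impressions_social + 1 / impressions_plain))
z = (ctr_social - ctr_plain) / se
p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))   # two-sided normal p-value

print(f"CTR with social cues: {ctr_social:.2%}, without: {ctr_plain:.2%}")
print(f"Relative lift: {lift:.1%}, z = {z:.2f}, p = {p_value:.4f}")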

"In general, we found that certain goods that rely on status-driven consumption displayed strong social advertising effectiveness," said Hu. "And, we identified multiple situations where social advertising effectiveness varied across relative characteristics of those who view the ads, along with their friends shown in the ads. Ultimately, this tells us there is no one-size-fits-all approach."

"One of the more interesting findings was how status goods compared to nonstatus goods," added Brynjolfsson. "Social influence is more relevant for status goods, not always because we learn about the product and its quality from our friends, but because as we make purchase decisions, we make relative comparisons to our friends to create our own social identity. In other words, products that can't influence our social status are less likely to perform better simply through social advertising."

Credit: 
Institute for Operations Research and the Management Sciences