Research advances emerging DNA sequencing technology

image: Dr. Moon Kim, the Louis Beecherl Jr. Distinguished Professor in the Erik Jonsson School of Engineering and Computer Science, and fellow researchers developed a nanopore sequencing platform that, for the first time, can detect the presence of nucleobases, the building blocks of DNA and RNA.

Image: 
The University of Texas at Dallas

Nanopore technology shows promise for making it possible to develop small, portable, inexpensive devices that can sequence DNA in real time. One of the challenges, however, has been to make the technology more accurate.

Researchers at The University of Texas at Dallas have moved closer toward this goal by developing a nanopore sequencing platform that, for the first time, can detect the presence of nucleobases, the building blocks of DNA and RNA. The study was published online Feb. 11 and is featured on the back cover of the April print edition of the journal Electrophoresis.

"By enabling us to detect the presence of nucleobases, our platform can help improve the sensitivity of nanopore sequencing," said Dr. Moon Kim, professor of materials science and engineering and the Louis Beecherl Jr. Distinguished Professor in the Erik Jonsson School of Engineering and Computer Science.

Currently, most DNA sequencing is done through a process that involves preparing samples in the lab with fluorescent dye and using lasers to determine the sequence of the four nucleobases, the fundamental units of the genetic code: adenine (A), cytosine (C), guanine (G) and thymine (T). Each nucleobase emits a different wavelength when illuminated, allowing scientists to determine the sequence.

In nanopore sequencing, a DNA sample is uncoiled, and the hairlike strand is fed through a tiny hole, or nanopore, typically in a fabricated membrane. As it moves through the nanopore, the DNA strand disturbs the electrical current flowing through the membrane. The current responds differently based on the characteristics of a DNA molecule, such as its size and shape.

"The electrical signal changes as the DNA moves through the nanopore," Kim said. "We can read the characteristics of the DNA by monitoring the signal."

One of the challenges in advancing nanopore sequencing has been the difficulty of controlling the speed of the DNA strand as it moves through the nanopore. The UT Dallas team's research focused on addressing that by fabricating an atomically thin solid-state -- or nonbiological -- membrane coated with titanium dioxide, water and an ionic liquid to slow the speed of the molecules through the membrane. The water was added to the liquid solution to amplify the electrical signals, making them easier to read.

"By enabling us to detect the presence of nucleobases, our platform can help improve the sensitivity of nanopore sequencing."

The next step for researchers will be to advance the platform to identify each nucleobase more quickly. Kim said the platform also opens possibilities for sequencing other biomolecules.

"The ultimate goal is to have a hand-held DNA sequencing device that is fast, accurate and can be used anywhere," Kim said. "This would reduce the cost of DNA sequencing and make it more accessible."

Credit: 
University of Texas at Dallas

Light, in addition to ocean temperature, plays role in coral bleaching

image: Staghorn corals in the reef flat off Hagåtña appear bleached as a response to stress from environmental changes. As one of Guam's dominant reef-builders whose habitat experiences temperatures up to 97 degrees Fahrenheit, this species was used in a University of Guam study published in February that found that shade can mitigate the effects of heat stress on corals.

Image: 
University of Guam

A study by University of Guam researchers has found that shade can mitigate the effects of heat stress on corals. The study, which was funded by the university's National Science Foundation EPSCoR grant, was published in February in the peer-reviewed Marine Biology Research journal.

"We wanted to see what role light has in coral bleaching," said UOG Assistant Professor Bastian Bentlage, the supervisor and co-author of the study. "Usually, people talk about temperature as a cause for bleaching, but we show that both light and temperature work together."

Previous UOG research led by Laurie J. Raymundo found that more than one-third of all coral reefs in Guam were killed from 2013 to 2017 over the course of multiple bleaching events. Coral bleaching is the process in which corals stressed by environmental changes expel the essential symbiotic algae that live in their tissues, causing them to turn white and often die.

This latest study examined the resilience of staghorn corals (Acropora cf. pulchra) in heightened seawater temperatures. This species of coral is one of Guam's dominant reef-builders, and its habitats experience temperatures up to 97 degrees Fahrenheit during the hottest months of the year, leaving it vulnerable to bleaching episodes and population decline.

A team of researchers -- including lead author Justin T. Berg, a UOG graduate student studying biology; Charlotte M. David, an undergraduate student from the University of Plymouth (England); and Melissa Gabriel, a UOG graduate student studying environmental science -- took coral samples from the Hagåtña reef flat and examined their health in the UOG Marine Laboratory under normal and elevated temperatures.

"One group was subjected to consistent baseline temperatures observed on Guam's reef flats," Bentlage said, "and another was set to temperatures that are projected to become the new normal over the next couple of decades."

The researchers found that the corals took three weeks to recover from a week-long heat stress event. The experiment was then replicated to see how the corals would react if they were given shade while subjected to warmer temperatures.

"We found that when we put the shading over coral with increased seawater temperatures, it greatly increased photosynthetic yield of the symbiotic algae. Shade made a huge difference for coral health when you have high temperatures," Berg said.

Implications for reef management

Shading is a practice already used in coral nurseries, Bentlage said, but it may not be practical to shade whole reefs in the ocean. Future studies can look into practical ways to reduce the impact of light on corals, particularly as they recover from periods of elevated temperatures.

"We saw the corals recover rather slowly," Berg said. "The length of recovery indicates that corals are vulnerable during this time and management efforts may be particularly necessary during this period to reduce coral mortality."

Berg said the new knowledge may also help inform the best locations to successfully outplant corals.

"For example, slightly turbid waters could provide some shading to corals, making them less likely to bleach during periods of elevated sea surface temperatures," Berg said.

Credit: 
University of Guam

Many Hispanics died of COVID-19 because of work exposure

COLUMBUS, Ohio - Hispanic Americans have died of COVID-19 at a disproportionately high rate compared to whites because of workplace exposure to the virus, a new study suggests.

It's widely documented that Hispanics are overrepresented among workers in essential industries and occupations ranging from warehousing and grocery stores to health care and construction, much of which kept operating when most of the country shut down last spring.

The analysis of federal data showed that, considering their representation in the U.S. population, far higher percentages of Hispanics of working age - 30 to 69 years old - have died of COVID-19 than whites in the same age groups. A separate look at case estimates showed a similar pattern of unequally high COVID-19 infection rates for Hispanics - meaning that the elevated deaths in the working-age Hispanic population are consistent with elevated exposure to the virus.

"There was no evidence before this paper that really demonstrated that the excess cases were precisely in these working age groups," said Reanne Frank, professor of sociology at The Ohio State University and co-author of the study.

"Particularly for front-line and essential workers, among whom Hispanics are overrepresented, COVID-19 is an occupational disease that spreads at work. Hispanics were on the front lines and they bore a disproportionate cost."

Identifying a link between essential work and a higher rate of COVID-19 deaths should lead to better workplace protections, said study co-author D. Phuong (Phoenix) Do, associate professor of public health policy and administration at the University of Wisconsin-Milwaukee.

"If we know the source of the spread, then we can tackle it head on," she said. "This finding is applicable to any disease that is highly infectious. We can't stop the economy - we've learned that. There has to be a way to protect the workers and enforce protection."

All analyses were based on the most recent data as of Sept. 30, 2020. The research is published in the journal Demographic Research.

Because COVID-19 death rates are highest among older ages, the fact that a much higher percentage of Hispanics are in the younger age groups compared to whites meant that excess Hispanic deaths were initially masked. Centers for Disease Control and Prevention (CDC) age-adjusted data from 2020 showed that Hispanics constituted 19% of the population, but almost 41% of COVID-19 deaths.

When it became apparent that COVID-19 deaths were disproportionately high among minorities, commentators frequently suggested that unequal access to quality health care, higher levels of pre-existing conditions and multigenerational households were key causes, along with exposure as front-line workers.

At the time, however, "there wasn't any case data to support this workplace vulnerability hypothesis, which to us seemed most compelling in trying to understand the excess deaths among Hispanics," said Frank, also a faculty affiliate in Ohio State's Institute for Population Research.

Using CDC death counts stratified by age within racial/ethnic groups, the researchers compared the proportion of COVID-19 deaths attributed to whites and Hispanics with each group's relative population size. Nationally and in most states, in every range below age 75, Hispanic deaths were disproportionately high and deaths among whites were disproportionately low. One example from national data: Hispanics ages 35 to 44 and 55 to 64 experienced a higher-than-expected proportion of deaths of 15.4 and 8 percentage points, respectively. In contrast, whites in those same age groups faced mortality advantages of 23 and 17 percentage points, respectively.
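
The core calculation is simple enough to sketch. In the hypothetical Python example below, the counts are placeholders rather than the CDC data used in the study; disproportionality in an age group is simply the group's share of deaths minus its share of the population, in percentage points.

```python
# Illustrative sketch of the comparison described above: a group's share of
# COVID-19 deaths minus its share of the population, in percentage points.
# The counts below are placeholders, not the CDC figures used in the study.

def excess_share(deaths_in_group, total_deaths, pop_in_group, total_pop):
    """Positive values mean disproportionately high mortality for the group."""
    death_share = 100 * deaths_in_group / total_deaths
    pop_share = 100 * pop_in_group / total_pop
    return death_share - pop_share

# Hypothetical age stratum: 45% of deaths vs. 25% of the population
print(excess_share(deaths_in_group=450, total_deaths=1_000,
                   pop_in_group=250_000, total_pop=1_000_000))  # +20.0 points
```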

Turning to CDC case surveillance data, the researchers found the same patterns at the county level. Overall and within each age group, whites were disproportionately underrepresented among COVID-19 cases, while Hispanics were overrepresented, with the greatest excess in cases among those of working age: 30 to 59.

Among the reported cases, Hispanics had fewer pre-existing health conditions than whites, and there were no significant differences between working-age Hispanics and whites in the percentage of infections that resulted in death. Hence, the researchers said, the case data do not support pre-existing comorbidities or lower-quality health care as driving factors in the excess Hispanic mortality.

"If case fatality rates are comparable across racial and ethnic groups, and they are, but we see big differences in the amount of death, which we do, then we have to focus on differential exposure," Do said. "So what we see is that these two patterns are consistent with higher case burden being the driving factor of the higher mortality burden among Hispanics.

"The evidence does not support the other hypotheses. The data in this case supported the workplace exposure hypothesis but not unequal access to health care or unequal quality of care, not pre-existing conditions, and not multigenerational household exposure."

The researchers said the patterns revealed in the data ideally will discourage what amounts to victim-blaming - attributing an unequally high rate of COVID-19 deaths among Hispanics to risks associated with individual health behaviors or living arrangements rather than their overrepresentation in the essential workforce, often in low-wage jobs.

"There's this impulse when we're trying to understand racial health disparities - even new ones like COVID that appeared very quickly - to obscure the role of structural factors, which includes work environments," Frank said. "This evidence can hopefully set the record straight about why the Hispanic community, along with other groups overrepresented among front-line workers, took such a heavy hit from this pandemic - that it was because they were doing their jobs, and putting themselves on the line."

Contacts:
Phoenix Do, dphuong@uwm.edu
Reanne Frank, frank.219@osu.edu

Written by Emily Caldwell, Caldwell.151@osu.edu

Credit: 
Ohio State University

Quality improvement project boosts depression screening among cancer patients

image: Jason Fish, M.D.

Image: 
UT Southwestern Medical Center

DALLAS - April 28, 2021 - Depression screening among cancer patients improved by 40 percent to cover more than 90 percent of patients under a quality improvement program launched by a multidisciplinary team at UT Southwestern Medical Center and Southwestern Health Resources.

Cancer patients with depression are at an increased risk of mortality and suicide compared with those without depression. Although rates vary based on cancer type and stage, depression is estimated to affect 10 to 30 percent of patients with cancer, compared with 7 to 8 percent of adults without a diagnosis or history of cancer, and to impact men and women equally.

Due to the higher risk, medical and scientific authorities including the National Institutes of Health, the Institute of Medicine, and the National Comprehensive Cancer Network recommend routine screening to identify untreated symptoms of depression in cancer patients.

"Identifying those with depressive symptoms through earlier detection, diagnosis, and treatment can greatly improve the quality of life for these patients and their families, and prevent minor symptoms from progressing to severe psychopathology and potential self-harm," says Jason Fish, M.D., chief medical officer at Southwestern Health Resources and associate professor of internal medicine at UT Southwestern. "The findings from our study have the potential ability to not only positively impact treatment outcomes and slow disease progression, but to save health care resources."

A multidisciplinary team collaboratively applied Lean Six Sigma methods and tools among more than 14,000 oncology patients within oncology and psychiatry clinics in the Southwestern Health Resources network and at UT Southwestern's Harold C. Simmons Comprehensive Cancer Center.

The ongoing quality improvement initiative enhanced screening and follow-up rates in individual clinics by more than 40 percent and achieved the project goal of reaching 90 percent of patients in fewer than six months, according to Fish, who oversees quality and performance improvement activities for Southwestern Health Resources, a clinically integrated health care network formed by UT Southwestern and Texas Health Resources. If the ending performance rate of 89.8 percent had been in effect at the beginning of the project, an additional 1,290 patients could have received screening in a single month, the authors wrote.
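
As a rough illustration of that counterfactual, the short Python sketch below multiplies a monthly screening volume by the difference between the ending and starting rates. The monthly volume and starting rate shown are assumptions chosen only to be roughly consistent with the reported figures; they are not numbers from the study itself.

```python
# Back-of-the-envelope sketch of the counterfactual described above: how many
# more patients would be screened in one month at the ending rate versus the
# starting rate. The monthly volume and starting rate here are assumptions.

def additional_screened(monthly_patients, start_rate, end_rate):
    return round(monthly_patients * (end_rate - start_rate))

print(additional_screened(monthly_patients=14_000, start_rate=0.806, end_rate=0.898))
```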

Credit: 
UT Southwestern Medical Center

Considerable gap in evidence around whether portable air filters reduce the incidence of COVID-19 and other respiratory infections

There is an important absence of evidence regarding the effectiveness of a potentially cost-efficient intervention to prevent indoor transmission of respiratory infections, including COVID-19, warns a study by researchers at the University of Bristol.

Respiratory infections such as coughs, colds and influenza are common in all age groups and can be either viral or bacterial. Bacteria and viruses can become airborne via talking, coughing or sneezing. The current global coronavirus (COVID-19) pandemic is also spread primarily by airborne droplets, and to date has led to over three million deaths worldwide.

Controlling how we acquire and transmit respiratory infections is of huge importance, particularly within indoor environments such as care homes, households, schools/day care, office buildings and hospitals where people are in close contact. Several manufacturers of portable air filters have claimed their products remove potentially harmful bacteria and viruses from indoor air, including COVID-19 viral particles. However, there is often no detailed evidence provided on their websites to corroborate their claims for potential consumers to review before purchasing.

A team of UK researchers from the University of Bristol reviewed previous studies to investigate whether portable air filters used in any indoor setting can reduce incidence of respiratory infections and thus, whether there is any evidence to recommend their use in these settings to reduce the spread of COVID-19 and other respiratory infections. The team also explored whether portable air filters in indoor settings capture airborne bacteria and viruses within them, and if so, what specifically is captured.

The researchers found no studies investigating the effects of portable, commercially available air filters on the incidence of respiratory infections in any indoor community setting. Two studies reported removal or capture of airborne bacteria in indoor settings (an office and an emergency room), demonstrating that the filters captured airborne bacteria and reduced their concentration in the air. Neither study tested for the presence of viruses in the filters or for a reduction in viral particles in the air.

The study, funded by Professor Alastair Hay's National Institute for Health Research Senior Investigator Award and published in PLoS One, was a systematic review of studies published after 2000 reporting (i) effects of portable air filters on incidence of respiratory infection, or (ii) whether filters capture and/or remove aerosolised bacteria and viruses from the air, including information on what is captured. Studies of non-portable air filters were excluded.

Lead author, Dr Ashley Hammond, an Infectious Disease Epidemiologist at the Centre for Academic Primary Care, University of Bristol, said: "Our study highlights the considerable gap in evidence related to the effectiveness of portable air filters in preventing respiratory infections, including COVID-19. Whilst we found some evidence suggesting use of air filters could theoretically contribute to reducing the spread of COVID-19 and other respiratory infections by capturing airborne particles, there is a complete absence of evidence as to whether they actually reduce the incidence of these infections."

Professor Alastair Hay, a GP and Professor of Primary Care at the Centre for Academic Primary Care, University of Bristol, and the research group lead, said: "Randomised controlled trials are urgently needed to demonstrate the effects of portable air filters on incidence of respiratory infections, including COVID-19. The main research questions should focus primarily on whether use of portable air filters in any indoor environment can reduce respiratory infections compared to those environments without portable air filters."

Credit: 
University of Bristol

Was North America populated by 'stepping stone' migration across Bering Sea?

video: A paleotopographic reconstruction accounting for Glacial Isostatic Adjustment digitally explores an archipelago about 1400 km long that likely existed from 30,000 BP to 8000 BP.

Image: 
Dobson, et al

LAWRENCE -- For thousands of years during the last ice age, generations of maritime migrants paddled skin boats eastward across shallow ocean waters from Asia to present-day Alaska. They voyaged from island to island and ultimately to shore, surviving on bountiful seaweeds, fish, shellfish, birds and game harvested from coastal and nearshore biomes. Their island-rich route was possible due to a shifting archipelago that stretched almost 900 miles from one continent to the other.

A new study from the University of Kansas in partnership with universities in Bologna and Urbino, Italy, documents the newly named Bering Transitory Archipelago and then points to how, when and where the first Americans may have crossed. The authors' stepping-stones hypothesis depends on scores of islands that emerged during the last ice age as sea level fell when ocean waters were locked in glaciers and later rose when ice sheets melted. The two-part study, just published in the open-access journal Comptes Rendus Geoscience, may answer what writer Fen Montaigne calls "one of the greatest mysteries of our time . . . when humans made the first bold journey to the Americas."

The "stepping-stones" idea hinges on retrospective mapping of sea levels while accounting for isostacy -- deformation of the Earth's crust due to the changing depth and weight of ice and water, reaching its greatest extreme during the Last Glacial Maximum about 20,500 years ago.

"We digitally discovered a geographic feature of considerable size that had never been properly documented in scientific literature," said principal author Jerome Dobson, professor emeritus of geography at KU. "We named it the Bering Transitory Archipelago; it existed from about 30,000 years ago through 8,000 years ago. When we saw it, we immediately thought, 'Wow, maybe that's how the first Americans came across.' And, in fact, everything we've tested seems to bear that out -- it does seem to be true."

For more than a decade, researchers have pondered a mystery within a mystery. Mitochondrial DNA indicates that migrants were isolated somewhere for up to 15,000 years on their way over from Asia to North America. The Beringian Standstill Hypothesis arises from the fact that today Native American DNA is quite different from Asian DNA, a clear indication of genetic drift of such magnitude that it can only have happened over long periods of time in nearly complete isolation from the Asian source population. The Bering Transitory Archipelago provides a suitable refugium with internal connectivity and outward isolation.

Dobson said people crossing the Bering Sea probably didn't have sails but could have been experienced in paddling skin boats like the kayaks and umiaks that Inuits use today.

"They probably traveled in small groups," he said, "either from Asia or islands off the coast of Asia. Some maritime people are known to have existed 27,000 years ago on northern Japanese islands. They probably were maritime people -- not just living on islands, but actually practicing maritime culture, economy and travel."

Dobson recently received the American Geographical Society's Cullum Geographical Medal (the same gold medal that Neil Armstrong won for flying to the moon and Rachel Carson won for writing "Silent Spring"). He named and continuously champions "aquaterra" -- all lands that were exposed and inundated repeatedly during the Late Pleistocene ice ages -- thus creating a zone of archeological promise scattered offshore from all coastal regions around the globe.

Recently, Dobson and co-authors Giorgio Spada of the University of Bologna and Gaia Galassi of Urbino University "Carlo Bo" applied an improved Glacial Isostatic Adjustment model to nine global choke points, meaning isthmuses and straits that have funneled transport and trade throughout history. Significant human migrations are known to have occurred across some of them, including "Beringia"-- all portions of the Bering Sea that were exposed before, during and after the Last Glacial Maximum.

"These Italian ocean scientists read my 'Aquaterra' paper and took it upon themselves to refine the boundaries of aquaterra for the whole world at coarse resolution and for Beringia itself at fine resolution," Dobson said. "Later we agreed to join forces and tackle those nine global choke points. At the end of that study, we suddenly spotted these islands in the Bering Sea, and that became our focus. This had an immediate potential because it could be a real game-changer in terms of all sciences understanding how migration worked in the past. We found startling results in certain other choke points and have begun analyzing them as well."

In Beringia, the three investigators contend, this action produced a "conveyor belt" of islands that rose from the sea and fell back again, pushing bands of people eastward. "The first islands to appear were just off the coast of Siberia," the KU researcher said. "Then islands appeared ever eastward. Most likely migrants kept expanding eastward, too, generally to islands within view and an easy paddle away."

By 10,500 years ago, when the Bering Strait itself first appeared, almost all islands in the west had submerged. Only three islands remained, and paddling distances had increased accordingly. Thus, occupants were forced to evacuate, and they faced a clear choice: return to Asia, which they knew to be populated and may even have left due to population pressures and resource constraints, or paddle east to less known territory, perhaps less populated islands with ample resources.

To fully confirm the idea set forth in the new paper, Dobson said researchers from many fields will need to collaborate as one geographer and two ocean scientists have done here.

"We ourselves are at a stage where we definitely need underwater confirmation," he said. "No doubt underwater archaeologists by title will prevail in that quest, but other disciplines, specialties and fields are essential. Working together plus scouring diverse literature, we presented a fundamentally new physical geography for scientists to contemplate. That should entice every relevant discipline to question conventional theory and explore new ideas regarding how, when and where people came to North America. More broadly, aquaterra can serve as a unifying theme for understanding human migrations, demic expansions, evolutionary biology, culture, settlement and endless other topics."

Credit: 
University of Kansas

New cell atlas of COVID lungs reveals why SARS-CoV-2 is deadly and different

NEW YORK, NY (April 29, 2021)--A new study is drawing the most detailed picture yet of SARS-CoV-2 infection in the lung, revealing mechanisms that result in lethal COVID-19. The findings may also explain long-term complications and show how COVID-19 differs from other infectious diseases.

Led by researchers at Columbia University Vagelos College of Physicians and Surgeons and Herbert Irving Comprehensive Cancer Center, the study found that in patients who died of the infection, COVID-19 unleashed a detrimental trifecta of runaway inflammation, direct destruction and impaired regeneration of lung cells involved in gas exchange, and accelerated lung scarring.

Though the study looked at lungs from patients who had died of the disease, it provides solid leads as to why survivors of severe COVID may experience long-term respiratory complications due to lung scarring.

"It's a devastating disease, but the picture we're getting of the COVID-19 lung is the first step towards identifying potential targets and therapies that disrupt some of the disease's vicious circuits. In particular, targeting cells responsible for pulmonary fibrosis early on could possibly prevent or ameliorate long-term complications in survivors of severe COVID-19," says Benjamin Izar, MD, PhD, assistant professor of medicine, who led a group of more than 40 investigators to complete in several months a series of analyses that usually takes years.

This study and a companion paper led by researchers at Harvard/MIT, to which the Columbia investigators also contributed, were published in the journal Nature on April 29.

Study Creates Atlas of Cells in COVID Lung

The new study differs from other investigations in that it directly examines lung tissue (rather than sputum or bronchial washes) using single-cell molecular profiling that can identify each cell in a tissue sample and record each cell's activity, resulting in an atlas of cells in COVID lung.

"A normal lung will have many of the same cells we find in COVID, but in different proportions and different activation states," Izar says. "In order to understand how COVID-19 is different compared to both control lungs and other forms of infectious pneumonias, we needed to look at thousands of cells, one by one."

Izar's team examined the lungs of 19 individuals who died of COVID-19 and underwent rapid autopsy (within hours of death)--during which lung and other tissues were collected and immediately frozen--and the lungs of non-COVID-19 patients. In collaboration with investigators at Cornell University, the researchers also compared their findings to lungs of patients with other respiratory illnesses.

Drugs Targeting IL-1beta May Reduce Inflammation

Compared to normal lungs, lungs from the COVID patients were filled with immune cells called macrophages, the study found.

Typically during an infection, these cells chew up pathogens but also regulate the intensity of inflammation, which helps in the fight.

"In COVID-19, we see expansion and uncontrolled activation of macrophages, including alveolar macrophages and monocyte-derived macrophages," Izar says. "They are completely out of balance and allow inflammation to ramp up unchecked. This results in a vicious cycle where more immune cells come in causing even more inflammation, which ultimately damages the lung tissue."

One inflammatory cytokine in particular, IL-1beta, is produced at a high rate by these macrophages.

"Unlike other cytokines such as IL-6, which appears to be universally prevalent in various pneumonias, IL-1beta production in macrophages is more pronounced in COVID-19 compared to other viral or bacterial lung infections," Izar says. "That's important because drugs exist that tamp down the effects of IL-1beta."

Some of these drugs are already being tested in clinical trials of COVID patients.

Severe COVID also Prevents Lung Repair

In a typical infection, a virus damages lung cells, the immune system clears the pathogen and the debris, and the lung regenerates.

But in COVID, the new study found that not only does SARS-CoV-2 virus destroy alveolar epithelial cells important for gas exchange, the ensuing inflammation also impairs the ability of the remaining cells to regenerate the damaged lung. Though the lung still contains cells that can do the repairs, inflammation permanently traps these cells in an intermediate cell state and leaves them unable to complete the last steps of differentiation needed for replacement of mature lung epithelium.

"Among others, IL-1b appears to be a culprit in inducing and maintaining this intermediate cell state," says Izar, "thereby linking inflammation and impaired lung regeneration in COVID-19. This suggests that in addition to reducing inflammation, targeting IL-1beta may help take the brakes off cells required for lung repair."

Preventing Accelerated Fibrosis

The researchers also found a large number of specific fibroblast cells, called pathological fibroblasts, that create rapid scarring in COVID-19 lungs. When the fibroblast cells fill the lung with scar tissue, a process called fibrosis, the lung has less space for cells involved in gas exchange and is permanently damaged.

Given the importance of pathological fibroblasts in the disease, Izar's team closely analyzed the cells to uncover potential drug targets. An algorithm called VIPER, developed previously by Dr. Andrea Califano, chair of systems biology at Columbia University Vagelos College of Physicians and Surgeons, identified several molecules in the cells that play an important role and could be targeted by existing drugs.

"This analysis predicted that inhibition of STAT signaling could alleviate some of the deleterious effects caused by pathological fibroblasts," Izar says.

"Our hope is that by sharing this analysis and massive data resource, other researchers and drug companies can begin to test and expand on these ideas and find treatments to not only treat critically ill patients, but also reduce complications in people who survive severe COVID-19."

Team Effort by Several Columbia Labs

"Pulling this study together in such a short period of time was only possible with the help of several teams of researchers at Columbia," Izar says.

Critically, in the first few months of the pandemic, Columbia's Department of Pathology & Cell Biology decided to flash-freeze many tissues from deceased COVID patients to preserve the cells' molecular state. Hanina Hibshoosh, MD, director of the department's tissue bank, initiated the collaboration with Izar's lab, which has expertise in conducting single-cell analyses with frozen tissue. Pathologist Anjali Saqi, MD, professor of pathology & cell biology, was also instrumental in procuring and evaluating the samples.

Jianwen Que, MD, PhD, professor of medicine, and his laboratory provided expertise in identifying and characterizing cells in the lung and their regenerative potential. Fibrosis expert Robert Schwabe, MD, associate professor of medicine, was essential in dissecting mechanisms by which COVID-19 propelled lung scarring. "We are incredibly grateful to all the labs contributing to this effort and very fortunate to be at Columbia with all the necessary expertise at hand in one collaborative environment."

Credit: 
Columbia University Irving Medical Center

Implementing Industry 4.0 in SMEs by focusing on the customer

image: The University of the Basque Country's BDI research group has developed a methodology to help SMEs create new software services aligned with Industry 4.0, to enable them, through low-cost options, to change the way in which customers interact with companies

Image: 
123RF

Small and medium-sized manufacturing enterprises (SMEs) face many obstacles and difficulties (economic, technical, cultural, etc.) when it comes to implementing Industry 4.0. "These are transition processes that are economically costly, and in which SMEs often come up against technical and cultural problems, as they are not cognizant of how to make this transition, or of the benefits their companies stand to gain by implementing Industry 4.0," explained the UPV/EHU pre-doctoral researcher Víctor Ramírez-Durán.

While several pieces of work address the incorporation of Industry 4.0 technologies into the sphere of the product lifecycle and supply chain, which SMEs could use as a reference, this is not the case as far as the customer life cycle is concerned. In this respect, in a piece of work conducted by the UPV/EHU's BDI research group, a methodology has been developed to assist professionals in SME software departments in the task of creating new software services aligned with Industry 4.0; using low-cost options, this would enable them to change the way customers interact with companies and the experiences they have while interacting with them. "Industry 4.0 technologies can help to better understand customers, improve their experience when interacting with products, and facilitate enhanced after-sales support," said Ramírez, author of the work.

Semantic technologies and 3D visualization

The proposed methodology describes a series of well-specified phases and activities that are easy to understand and implement, and which can make this transition possible by using minimal financial resources and taking advantage of new technologies. "With this methodology, it is possible to identify the shortcomings that a company may have with regard to customers, and to analyse how services can be generated in a way that would improve how the customer is treated. The methodology is general; i.e. it does not say which services need to be implemented but helps to spot the shortcomings that can be remedied by generating new services," says Ramírez.

The methodology is mainly based on the use of semantic technologies and 3D visualization. Semantic technologies, in this case ontologies -- networks or systems that specify the existing relationships between the concepts of a domain or area of knowledge -- provide a high degree of flexibility for the description of knowledge, and also facilitate inference and reasoning processes that are difficult to achieve using traditional databases. On the other hand, 3D visualization technologies offer improved visual representation, including better graphics and navigation controls, allowing the user to enjoy an enhanced, interactive experience.

As the researcher explained, this methodology could be used, for example, to create an improved catalogue of a company's products through the 3D visualization of models and advanced navigation options, "with precise descriptions and characteristics of different parts of the model. It could also be used to create a search module which suggests to the customer which product he or she needs". Another of the examples highlighted by Ramírez could be a virtual technician to improve customer service once the product has been purchased. However, Ramírez pointed out that "the methodology is designed in a practical way so that each company can carry out an analysis allowing it to solve a range of problems in relation to its customers. All these benefits will improve the relationship between the customer and the company and achieve a high degree of customer loyalty".

Additional information

Víctor Julio Ramírez-Durán is a pre-doctoral researcher in the Department of Computer Languages and Systems (LSI) of the Faculty of Informatics of the UPV/EHU. His thesis supervisors are the lecturer Idoia Berges and the professor Arantza Illarramendi.

Credit: 
University of the Basque Country

Single-cell CRISPR technology deciphers role of chromatin accessibility in cancer

image: CRISPR-sciATAC is a novel integrative genetic screening platform that jointly captures CRISPR gene perturbations and single-cell chromatin accessibility genome-wide. The new method harnesses the programmability of the gene editing system CRISPR to knock-out nearly all chromatin-related genes in parallel, offering researchers deeper insights into the role of DNA accessibility in cancer and in rare diseases involving chromatin.

Image: 
Sanjana Lab of New York Genome Center and NYU

NEW YORK, NY (April 29, 2021) - In a new resource for the scientific community, published today in Nature Biotechnology, researchers in the lab of Neville Sanjana, PhD, at the New York Genome Center (NYGC) and New York University (NYU) developed CRISPR-sciATAC, a novel integrative genetic screening platform that jointly captures CRISPR gene perturbations and single-cell chromatin accessibility genome-wide. With this technology, they profile changes in genome organization and create a large-scale atlas of how loss of individual chromatin-altering enzymes impacts the human genome. The new method harnesses the programmability of the gene editing system CRISPR to knock-out nearly all chromatin-related genes in parallel, offering researchers deeper insights into the role of DNA accessibility in cancer and in rare diseases involving chromatin.

Recent advances in single-cell technologies have given scientists the ability to profile chromatin, the complex of DNA and proteins that resides within the nucleus of individual cells. Chromatin is often called the "gatekeeper" of the genome because its proteins act as packaging elements for the DNA, either promoting or refusing access to it. This controls gene expression processes in the cell, such as turning on or off specific genes. Changes in the chromatin landscape have been linked to diverse human traits and diseases, most notably cancer.

In an initial demonstration of CRISPR-sciATAC, the Sanjana Lab team designed a CRISPR library to target 20 chromatin-modifying genes that are commonly mutated in different cancers, including breast, colon, lung and brain cancers. Many of these enzymes act as tumor suppressors, and their loss results in global changes in chromatin accessibility. For example, the group showed that loss of the gene EZH2, which encodes a histone methyltransferase, resulted in an increase in gene expression across several previously silenced developmental genes.

"The scale of CRISPR-sciATAC makes this dataset very unique. Here, in a uniform genetic background, we have accessibility data capturing the impact of every chromatin-related gene. This provides a detailed map between each gene and how its loss impacts genome organization with single-cell resolution," said Dr. Noa Liscovitch-Brauer, a postdoctoral fellow in Sanjana's lab at the New York Genome Center and NYU and the study's co-first author.

In total, the team targeted more than 100 chromatin-related genes and developed a "chromatin atlas" that charts how the genome changes in response to loss of these proteins. The atlas shows that different subunits within each of the 17 chromatin remodeling complexes targeted can have different effects on genome accessibility. Surprisingly, nearly all of these complexes have subunits where loss triggers increased accessibility and other subunits with the opposite effect. Overall, the greatest disruption in transcription factor binding sites, which are important functional elements in the genome, was observed after loss of AT-rich interactive domain-containing protein 1A (ARID1A), a member of the BAF complex. Mutations in BAF complex proteins are estimated to be involved in 1 out of every 5 cancers.

In addition to the CRISPR-sciATAC method, the team also developed a suite of computational methods to map the dynamic movements of the nucleosomes, which are the protein clusters that DNA is wrapped around. When there are more nucleosomes, the DNA is tightly wound and less available to bind transcription factors. This is exactly what the team found at specific transcription factor binding sites involved in cell proliferation after CRISPR knock-out of ARID1A. When targeting a different chromatin-modifying enzyme, these same sites underwent an expansion in nucleosome spacing, demonstrating the dynamics of nucleosome positioning at specific sites in the genome. The CRISPR-sciATAC method allowed the team to systematically explore this genome plasticity for multiple chromatin-modifying enzymes and transcription factor binding sites.

"We really focused on making CRISPR-sciATAC an accessible technique -- we wanted it to be something that any lab could do. We produced most of the key enzymes in-house and used simple methods for single-cell isolation that do not require microfluidics or single-cell kits," said Dr. Antonino Montalbano, a former postdoctoral fellow in Sanjana's lab at the New York Genome Center and NYU and the study's co-first author.

To develop the CRISPR-sciATAC technology, the researchers used a mix of human and mouse cells to create a tagging/identification process that allowed them to split and barcode the nuclei of cells as well as capture the single-guide RNAs required for CRISPR targeting. The work builds off prior single-cell combinatorial indexing ATAC-seq (sciATAC-seq) work from Dr. Jay Shendure at the University of Washington and other groups developing new single-cell genomics methods. CRISPR-sciATAC also uses a unique, easy-to-purify transposase that was developed in the NYGC's Innovation Technology Lab. A key technical hurdle was optimizing experimental conditions to simultaneously capture the CRISPR guide RNAs and genome fragments for accessibility profiling while also keeping the nuclear envelope of each cell intact.

"Integrating chromatin accessibility profiling into the genome-wide CRISPR screens provides a new lens for us to understand gene regulation," said Dr. Sanjana, Core Faculty Member, NYGC, Assistant Professor of Biology, NYU, and Assistant Professor of Neuroscience and Physiology, NYU Grossman School of Medicine, the study's senior author. "With CRISPR-sciATAC, we have a comprehensive view into how specific chromatin-modifying enzymes and complexes change accessibility and orchestrate the interactions that control gene expression. Chromatin sets the stage for gene expression, and here we can measure the impact of different mutations on chromatin rapidly. We hope this atlas will be a broadly useful resource for the community and that CRISPR-sciATAC will be used to produce similar atlases in other biological systems and disease contexts."

Credit: 
New York Genome Center

How SARS-CoV-2 hijacks human cells to evade immune system

image: Human enzyme METTL3 adds methyl groups to introduce m6A in SARS-CoV-2's RNA. That modification prevents the virus' RNA from triggering inflammatory molecules known as cytokines. METTL3 also leads to increased expression of pro-viral genes -- those that encode proteins needed for SARS-CoV-2 replication and survival.

Image: 
UC San Diego Health Sciences

Researchers at University of California San Diego School of Medicine have discovered one way in which SARS-CoV-2, the coronavirus that causes COVID-19, hijacks human cell machinery to blunt the immune response, allowing it to establish infection, replicate and cause disease.

In short, the virus' genome gets tagged with a special marker by a human enzyme that tells the immune system to stand down, while at the same time ramping up production of the surface proteins that SARS-CoV-2 uses as a "doorknob" to enter cells.

The study, published April 22, 2021 in Cell Reports, helps lay the groundwork for new anti-viral immunotherapies -- treatments that work by boosting a patient's immune system, rather than directly killing the virus.

"It's very smart of this virus to use host machinery to simultaneously go into stealth mode and get inside more cells," said Tariq Rana, PhD, professor and chief of the Division of Genetics in the Department of Pediatrics at UC San Diego School of Medicine and Moores Cancer Center. "The more we know about how the virus establishes itself in the body, the better equipped we are to disrupt it."

In human cells, genes (DNA) are transcribed into RNA, which is then translated into proteins, the molecules that make up the majority of cells. But it's not always so straightforward. Cells can chemically modify RNA to influence protein production. One of these modifications is the addition of methyl groups to adenosine, one of the building blocks that make up RNA. Known as N6-methyladenosine (m6A), this modification is common in humans and other organisms, including viruses.

In contrast to humans, the entire genomes of some viruses, including SARS-CoV-2, are made up of RNA instead of DNA. And rather than carry around the machinery to translate that into proteins, the coronavirus gets human cells to do the work.

Rana and his team previously discovered that m6A plays an important role in HIV and Zika virus infections. In their latest study, the researchers discovered that the human enzyme METTL3 adds methyl groups to introduce m6A in SARS-CoV-2's RNA. That modification prevents the virus' RNA from triggering inflammatory molecules known as cytokines. To the team's surprise, METTL3's activity also led to increased expression of pro-viral genes -- those that encode proteins needed for SARS-CoV-2 replication and survival, such as ACE2, the cell surface receptor that the virus uses to enter human cells.

"It remains to be seen why our cells help the virus out like this," Rana said.

When the team removed METTL3 from cells in the laboratory, using gene silencing or other methods, they saw the reverse -- a pro-inflammatory molecule known as RIG1 bound the viral RNA, more inflammatory cytokines were produced, and pro-viral genes were inhibited. Ultimately, inhibiting METTL3 suppressed viral replication.

To see how this mechanism plays out in the real world, the team compared post-mortem lung samples from COVID-19 patients and healthy lung biopsies. In patients who had died from severe COVID-19, the team found, METTL3 expression was lower and inflammatory genes were elevated. That makes sense in later stages of COVID-19, Rana said, because cytokine storm -- the excessive activation of the patient's own immune system -- is known to worsen the disease.

"It's like there are two phases of the infection -- in the first, the virus needs METTL3 to help it evade the immune response," he said, "but in the second phase, once the virus is replicating like crazy, it's better to downregulate METTL3."

Rana's team is now validating their findings in animal models, and developing METTL3 inhibitors to test as potential experimental therapies for COVID-19.

"We hope that by manipulating m6A levels in the virus, we might be able to time the innate immune response in a way that benefits patients with COVID19, especially for the mild or moderate patients who haven't developed cytokine storm," Rana said. "The challenge is that cells have many other enzymes like METTL3, known as methyltransferases, so inhibiting it would need to be done very specifically, at a specific time."

Credit: 
University of California - San Diego

Finding the optimal way to repay student debt

image: The shaded areas denote situations in which income-based payments would minimize the cost for direct undergraduate, graduate, and PLUS loans, according to the time until loan forgiveness in an income-based scheme (on the vertical axis, in years) and loan balance (on the horizontal axis, in U.S. dollars). The light gray area represents immediate enrollment and the dark gray area represents later enrollment after a period of maximum payments. In areas with no shading, the optimal choice is to make maximum payments until the loan is fully paid off. The vertical lines denote the maximum loan amounts that are currently allowed for those schemes, indicating that it is always best to make maximum payments on direct undergraduate loans and often best to do so for direct graduate loans as well.

Image: 
Figure courtesy of Paolo Guasoni, Yu-Jui Huang, and Saeed Khalili.

The burden of student loans in the U.S. continues to grow unabated, currently accounting for a total of $1.7 trillion in household debt among nearly 45 million borrowers. "The introduction of income-based repayment over the past decade has made student loans rather complicated products," Paolo Guasoni of Dublin City University said. As borrowers navigate this complex process, they face long-term consequences; people with student debt are less likely to own homes or become entrepreneurs, and generally postpone their enrollment in graduate or professional studies. Though legislative reform is necessary to combat this problem on a grand scale, individual borrowers can take steps to repay their loans with minimal long-term costs.

In a paper published in April in the SIAM Journal on Financial Mathematics, Guasoni--along with Yu-Jui Huang and Saeed Khalili (both of the University of Colorado, Boulder)--developed a strategy for minimizing the overall cost of repaying student loans. "In the literature, we found mostly empirical studies discussing what borrowers are doing," Huang said. "But what we wanted to know was rather: how should a borrower repay to minimize debt burden?"

Students become responsible for repaying their loans a few months after they graduate or unenroll, and must contend with the loan growing at a national fixed interest rate. One option for borrowers is to repay their balances in full by a fixed maturity -- the date at which a loan's final payment is due. Another is to enroll in an income-based scheme, in which monthly payments are only due if the borrower has an income above a certain subsistence threshold. If payments are required, they are proportional to the amount the borrower makes above that threshold. After roughly 20 to 25 years, any remaining balance is forgiven but taxed as ordinary income. "The tension is between postponing payments until forgiveness and letting interest swell the loan balance over time," Guasoni said. The tax cost of delaying payments increases exponentially with longer timeframes until forgiveness, potentially offsetting the supposed savings.

The intuitive approach for many borrowers may be to pay off small loans as quickly as possible, since even minimum payments would extinguish the balance by the end of its term, making forgiveness irrelevant. Similarly, one may wish to minimize the payments for a large loan through an income-based scheme, especially if the loan will be forgiven in a few years anyway. However, the situation is not always as simple as it seems. "The counterintuitive part is that, if your loan is large and forgiveness is far away, it may be better to maximize payments over the first few years to keep the loan balance from exploding," Huang said. "Then you can switch to income-based repayment and take advantage of forgiveness."

To investigate what is truly the optimal way to pay back a student loan, the authors created a mathematical model of a borrower who took out a federal student loan--the most common type of student loan--with a constant interest rate. The model assumes that the borrower is able to repay the loan under its original term and even possibly make additional payments; otherwise, they would have no choice but to enroll in an income-based scheme. Quickly paying off the loan leads to lower costs from compounding interest. However, the borrower's motivation to do so is contradicted by the possibility of the remaining balance being forgiven and taxed in the future, which encourages them to delay payment until the forgiveness date.

The mathematical model revealed several possible approaches for a borrower who wishes to minimize the overall cost of their loan. "The optimal strategy is to either (i) repay the loan as quickly as possible [if the initial balance is sufficiently low], or (ii) maximize payments up to a critical horizon (possibly now) and then minimize them through income-based repayment," Guasoni said. The critical horizon occurs when the benefits of forgiveness begin to outweigh the compounding costs of interest on the loan balance. For large loans with a high interest rate--which are common for professional degrees--the savings from the strategy of high initial payments followed by enrollment in an income-based scheme can be substantial, for those that are able to afford such a plan.

The authors provided an example of a dental school graduate with a balance of $300,000 in Direct PLUS loans that carry an interest rate of 7.08 percent (according to the American Dental Education Association, 83 percent of dental school graduates have student loan debt, with an average balance of $292,169). This graduate has a starting salary of $100,000 that will grow four percent annually, and is able to repay at most 30 percent of the income that they make above the subsistence level. If they kept up such maximal payments, they would repay the loan in less than 20 years with a total cost of $512,000.

The example graduate could also immediately enroll in income-based repayment, paying only 10 percent of the income that they make above subsistence. After 25 years, their balance would equal $1,053,000 due to compounding interest. This balance would be forgiven and taxed as income at a 40 percent rate, yielding a total cost of $524,000. Alternatively, the graduate could use the authors' suggested strategy and repay 30 percent of their income above subsistence for around nine years, then switch to the income-based repayment scheme. The remaining balance to be forgiven after 25 years would then be $462,000, leading to a total cost from payments and tax of $490,000 -- the lowest of all the strategies. The reduction in the balance through multiple years of high payments curbs the balance's ensuing growth during the period of minimum payments.
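
The comparison of the three strategies can be sketched in a few lines of Python. The simulation below uses simplified annual compounding, an assumed $20,000 subsistence threshold and a 40 percent tax on the forgiven balance; it is meant to illustrate the structure of the comparison, not to reproduce the paper's exact figures, and the function names and parameters are invented for this sketch.

```python
# Illustrative sketch only: compares the three repayment strategies described
# above under simplified annual compounding. The subsistence threshold, tax
# rate, and compounding convention are assumptions, not figures from the paper.

def total_cost(balance, rate, salary, salary_growth, pay_fraction_by_year,
               subsistence=20_000, horizon=25, tax_rate=0.40):
    """Simulate annual payments; any balance left at the horizon is forgiven and taxed."""
    paid = 0.0
    for year in range(horizon):
        balance *= 1 + rate                       # interest accrues on the balance
        income_above = max(salary - subsistence, 0)
        payment = min(pay_fraction_by_year(year) * income_above, balance)
        balance -= payment
        paid += payment
        salary *= 1 + salary_growth
        if balance <= 0:
            return paid
    return paid + tax_rate * balance              # forgiven balance taxed as income

balance, rate = 300_000, 0.0708
salary, growth = 100_000, 0.04

strategies = {
    "maximum payments (30%)":        lambda yr: 0.30,
    "income-based from start (10%)": lambda yr: 0.10,
    "30% for 9 years, then 10%":     lambda yr: 0.30 if yr < 9 else 0.10,
}

for name, frac in strategies.items():
    print(f"{name}: total cost ~ ${total_cost(balance, rate, salary, growth, frac):,.0f}")
```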

Future research could further explore the more complicated factors of student debt repayment. The authors' model is deterministic -- it does not account for the fact that interest rates could change in the future. However, interest rates can increase or decrease, which may compel borrowers to refinance or delay payments. Further work is necessary to determine the influence of such changes on optimal debt repayment.

This research illuminated the way in which borrowers' choices in their loan repayments can have a sizable impact on overall costs, especially given compounding interest. "If you have student loans, you should consider your specific options carefully and see what the total cost would be under different strategies," Guasoni said. Huang agreed, noting that their proposed strategy may be especially beneficial for the large loans that are often held by law and dental school graduates. "Each loan is slightly different," he said. "Our model does not capture every possible detail, but it helps to focus the attention on two possibilities: quickest full repayment or enrollment in an income-based scheme, possibly after a period of high payments." A careful, mathematical consideration of the approach to loan repayment can help borrowers make decisions that will benefit them in the years to come.

Source article: Guasoni, P., Huang, Y.-J., & Khalili, S. (2021). American student loans: Repayment and valuation. SIAM J. Finan. Math., 12(2), SC-16-SC-30.

Credit: 
Society for Industrial and Applied Mathematics

New optical hydrogen sensors eliminate risk of sparking

Hydrogen, a clean, renewable alternative to fossil fuels, is part of a sustainable-energy future -- and it is very much already here. However, lingering concerns about flammability have limited the widespread use of hydrogen as a power source for electric vehicles. Previous advances have minimized the risk, but new research from the University of Georgia now puts that risk in the rearview mirror.

Hydrogen vehicles can refuel much more quickly and go farther without refueling than today's electric vehicles, which use battery power. But one of the final hurdles to hydrogen power is securing a safe method for detecting hydrogen leaks.

A new study published in Nature Communications documents an inexpensive, spark-free, optical-based hydrogen sensor that is more sensitive -- and faster -- than previous models.

"Right now, most commercial hydrogen sensors detect the change of an electronic signal in active materials upon interaction with hydrogen gas, which can potentially induce hydrogen gas ignition by electrical sparking," said Tho Nguyen, associate professor of physics in the Franklin College of Arts and Sciences, a co-principal investigator on the project. "Our spark-free optical-based hydrogen sensors detect the presence of hydrogen without electronics, making the process much safer."

Not just for cars

Hydrogen power has many more applications than powering electric vehicles, and flammability-mitigating technologies are critical to all of them. Robust sensors for detecting hydrogen leaks and controlling hydrogen concentration are important at every stage of a hydrogen-based economy -- production, distribution, storage and use -- in petroleum processing and production, fertilizer, metallurgical applications, electronics, environmental sciences, and health- and safety-related fields.

The three key problems for hydrogen sensors are response time, sensitivity and cost. Current mainstream technology for optical H2 sensors requires an expensive monochromator to record a spectrum, followed by analysis of the resulting spectral shift.

"With our intensity-based optical nano sensors, we go from detection of hydrogen at around 100 parts-per-million to 2 parts-per-million, at a cost of a few dollars for a sensing chip," Tho said. "Our response time of .8 seconds is 20% faster than the best available optical device reported in the literature right now."

How it works

The new optical device relies on a nanofabricated nanosphere template covered with a palladium-cobalt alloy layer. Any hydrogen present is quickly absorbed by the alloy, changing how much of an LED's light passes through the sensor; a silicon detector records the intensity of the transmitted light.
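The article does not describe the readout electronics or software, but the practical appeal of an intensity-based sensor is that the signal is a single photodetector reading rather than a full spectrum. The sketch below is purely illustrative -- the calibration points, function names and numbers are hypothetical, not taken from the study -- and simply shows how a drop in transmitted intensity could be mapped to a hydrogen concentration via a pre-measured calibration curve.

import bisect

# Hypothetical calibration table: (fractional drop in transmitted light,
# hydrogen concentration in parts per million). A real table would be
# measured on the fabricated sensing chip.
CALIBRATION = [(0.000, 0), (0.002, 2), (0.010, 10), (0.050, 100), (0.200, 1000)]

def h2_ppm(baseline_intensity, measured_intensity):
    """Estimate hydrogen concentration from the relative drop in transmitted
    intensity by linear interpolation over the calibration table."""
    drop = max(0.0, 1.0 - measured_intensity / baseline_intensity)
    drops = [d for d, _ in CALIBRATION]
    i = bisect.bisect_left(drops, drop)
    if i == 0:
        return float(CALIBRATION[0][1])
    if i == len(CALIBRATION):
        return float(CALIBRATION[-1][1])
    (d0, c0), (d1, c1) = CALIBRATION[i - 1], CALIBRATION[i]
    return c0 + (c1 - c0) * (drop - d0) / (d1 - d0)

print(h2_ppm(1.00, 0.999))  # small drop in transmission -> low ppm estimate
print(h2_ppm(1.00, 0.95))   # larger drop -> higher ppm estimate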

"All metals tend to absorb hydrogen, but by finding the suitable elements with a right balance in the alloy and engineering the nanostructure to amplify subtle changes in light transmission after hydrogen absorption, we were able to set a new benchmark for how fast and sensitive these sensors can be," said George Larsen, a senior scientist at Savannah River National Laboratory and co-principal investigator on the project. "All while keeping the sensor platform as simple as possible."

Credit: 
University of Georgia

NSU researcher part of a flagship study on vertebrate genomes

video: Genome 10K Project

Image: 
Genome 10K Project

Study Take-Aways

Novel discoveries have implications for characterizing biodiversity across all life, for conservation, and for human health and disease.

o This finding provides novel avenues of research to increase immune defenses, particularly relevant for emerging infectious diseases, such as the current COVID-19 pandemic.

The flagship paper presented whole-genome sequence analyses of 16 vertebrate species to illustrate high-quality, near error-free, near-complete, low-cost reference genome assemblies.

o Though nearly 400 species have been sequenced at some level, today's quality represents a quantum leap in the precision of sequence detail and discovery.

FORT LAUDERDALE/DAVIE, Fla. - Two decades ago, the full human genome sequence was released, funded by international government and philanthropic sources at a cost of billions of dollars.

Fast forward to 2008: driven by the need for better genome understanding and the precipitous drop in sequencing costs, the Genome 10K Community of Scientists (G10K) was established to promote and ensure the genome analysis of 10,000 vertebrate species. The G10K-sponsored Vertebrate Genomes Project has embraced the dramatic improvements in sequencing biotechnologies of the last few years to expand production of high-quality reference genome assemblies for all ~70,000 living vertebrates in the coming years.

Today, the G10K-sponsored Vertebrate Genomes Project (VGP) announces its flagship study and associated publications focused on genome assembly quality and standardization for the field of genomics. The study includes 16 diploid, high-quality, near error-free and near-complete vertebrate reference genome assemblies for species across all taxa with backbones (i.e., mammals, amphibians, birds, reptiles and fishes), drawn from five years of piloting the first phase of the VGP.

In a special issue of Nature, along with simultaneous companion papers published in other scientific journals, the VGP details numerous technological improvements based on these 16 genome assemblies. In this new study, the VGP demonstrates the feasibility of setting and achieving high-quality reference genome metrics using a state-of-the-art automated approach that combines long-read sequencing and long-range chromosome scaffolding with novel algorithms that put the pieces of the genome assembly puzzle together. To date, the current VGP pipelines have led to the submission of 129 diploid assemblies representing the most complete and accurate versions of those species' genomes, and the project is on the path to generating thousands of genome assemblies, demonstrating feasibility not only in quality standardization but also in scale.

Some of the animals that were part of this study included, but were not limited to:

Mammals: Pale spear-nosed bat, Egyptian fruit bat; Canada lynx; vaquita; platypus;

Birds: Zebra finch; kakapo; Anna's hummingbird;

Reptile: Goode's thornscrub tortoise;

Fish: Zig-zag eel; climbing perch; blunt-snouted clingfish.

"When we first started the G10K idea, we gathered a small handful of diverse field zoologists together with genome-centric computer scientists, pledging to work together to develop genome sequence data for thousands of the world's vertebrates," said Stephen O'Brien, Ph.D., a professor and research scientist at Nova Southeastern University's (NSU) Halmos College of Arts and Sciences. "We wanted to offer a gift for the next generation of genome scientists. Today the dream of genome empowerment of so many living species took a giant leap forward."

O'Brien is the co-founder of the Genome 10K Consortium, the Chief Scientific Officer at the Theodosius Dobzhansky Center for Genome Bioinformatics at St. Petersburg State University, Russia, and a member of the National Academy of Sciences.

The G10K-VGP's approach combines assembly pipelines with manual curation to fix misassemblies, major gaps, and other errors, which informs the iterative development of better algorithms. For example, the VGP helped reveal high levels of false gene duplications, losses or gains, due mostly to algorithms not properly separating maternal and paternal chromosomes. One solution includes a trio binning approach of using DNA from the parents to separate out the paternal and maternal sequences in the offspring. For cases where parental data is unavailable, another solution developed by the VGP and collaborators is an algorithm called FALCON-Phase that reduces the computational complexity of phasing maternal and paternal DNA sequences at chromosome scale.
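The trio-binning idea can be illustrated with a short sketch. This is a toy illustration of the general technique, not the VGP pipeline or the FALCON-Phase algorithm: the offspring's reads are assigned to the maternal or paternal haplotype according to which parent's unique k-mers they contain (the k-mer size and function names below are illustrative choices).

K = 21  # a commonly used k-mer size; an illustrative choice, not the VGP's setting

def kmers(seq, k=K):
    """All k-length substrings of a DNA sequence."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def parent_specific_kmers(maternal_reads, paternal_reads):
    """k-mers observed in one parent's reads but absent from the other's."""
    mat = set().union(*(kmers(r) for r in maternal_reads))
    pat = set().union(*(kmers(r) for r in paternal_reads))
    return mat - pat, pat - mat

def bin_read(read, mat_only, pat_only):
    """Assign an offspring read to a haplotype bin by counting hits against
    each parent's unique k-mers; reads with no informative k-mers stay unassigned."""
    hits_mat = len(kmers(read) & mat_only)
    hits_pat = len(kmers(read) & pat_only)
    if hits_mat > hits_pat:
        return "maternal"
    if hits_pat > hits_mat:
        return "paternal"
    return "unassigned"

Assembling the two bins separately then avoids collapsing or duplicating the maternal and paternal copies of a region, which is the kind of error the consortium reports as false gene gains and losses.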

"When I was asked to take on leadership of the G10K-VGP in 2015, I emphasized the need to work with technology partners and genome assembly experts on approaches that produce the highest quality data possible, as it was taking months per gene for my students and postdocs to correct gene structure and sequences for their experiments, which was causing errors in our biological studies", said Erich Jarvis, lead of the VGP sequencing hub at The Rockefeller University, Chair of the G10K and a Howard Hughes Medical Institute Investigator. "For me this was not only a practical mission, but a moral imperative."

Kerstin Howe, lead of the curation team at the Wellcome Sanger Institute in the UK, said: "Our new approach to produce structurally validated, chromosome-level genome assemblies at scale will be the foundation of ground-breaking insights in comparative and evolutionary genomics."

"It truly was a challenge to design a pipeline applicable to highly diverged genomes - our largest genome, 5GB in size, broke almost every tool commonly used in assembly processes," said Arang Rhie, from the National Human Genome Research Institute, National Institutes of Health, Bethesda, Maryland, who is the first author of the flagship paper. "The extreme level of heterozygosity or repeat contents posed a big challenge. This is just the beginning; we are continuously improving our pipeline in response to new technology improvements."

Adam Phillippy, chair of the VGP genome assembly and informatics working group of more than 100 members and head of the Genome Informatics Section of the National Human Genome Research Institute, National Institutes of Health, Bethesda, Maryland, USA, added: "Completing the first vertebrate reference genome, human, took over 10 years and $3 billion. Thanks to continued research and investment in DNA sequencing technology over the past 20 years, we can now repeat this amazing feat multiple times per day for just a few thousand dollars per genome."

Specific to conservation, and in collaboration with the Māori in New Zealand and officials in Mexico, genomic analyses of the kākāpō, a flightless parrot, and the vaquita, a small porpoise and the most endangered marine mammal, respectively, suggest evolutionary and demographic histories of purging harmful mutations in the wild. The implication of these long-term small population sizes at genetic equilibrium gives hope for these species' survival.

Richard Durbin, a Professor at the University of Cambridge and lead of the VGP sequencing hub at the Wellcome Sanger Institute in the UK, said: "These studies mark the start of a new era of genome sequencing that will accelerate over the next decade to enable genomic applications across the whole tree of life, changing our scientific interactions with the living world."

The G10K-VGP consortium involves hundreds of international scientists from more than 50 institutions in 12 countries working together since the VGP was initiated in 2016, and it is exemplary in its scientific cooperation, extensive infrastructure and collaborative leadership. Additionally, as the first large-scale eukaryotic genome project to produce reference genome assemblies meeting a specific minimum quality standard, the VGP has become a working model for other large consortia, including Bat 1K, the Global Invertebrate Genome Alliance (GIGA), the Pan Human Genome Project, the Earth BioGenome Project, the Darwin Tree of Life and the European Reference Genome Atlas, among others.

"The VGP project is at the vanguard of the creation of a genomic catalog in analogy with Linnaeus' classification of life, said Gene Myers, lead of the VGP sequencing hub at the Max Planck Institute in Dresden, Germany. "I and my colleagues in Dresden are excited to be contributing superb genome reconstructions with the funding of the Max-Planck Society of Germany."

As a next step, the VGP will continue to work collaboratively across the globe and with other consortia to complete Phase 1 of the project: approximately one representative species from each of roughly 260 vertebrate orders, with each order separated by a minimum of 50 million years from a common ancestor with the other Phase 1 species. The VGP intends to create comparative genomic resources with these 260 species, including reference-free whole-genome alignments, that will provide a means to understand the detailed evolutionary history of these species and create consistent gene annotations. Genome data are generated primarily at three sequencing hubs that have invested in the mission of the VGP: The Rockefeller University's Vertebrate Genome Lab, New York, USA; the Wellcome Sanger Institute, UK; and the Max Planck Institute, Germany.

Phase 2 will focus on representative species from each vertebrate family and is currently in the process of sample identification and fundraising. The VGP has an open-door policy and welcomes others to join its efforts, whether by fundraising, collecting samples, generating genome assemblies, or contributing their own genome assemblies that meet the VGP metrics as part of the overall mission.

The VGP collaborated with, and tested many protocols from, genome sequencing companies -- some of whose scientists are also co-authors of the flagship study -- including Pacific Biosciences, Oxford Nanopore Technologies, Illumina, Arima Genomics, Phase Genomics, and Dovetail Genomics. The VGP also collaborated with DNAnexus and Amazon to generate a publicly available VGP assembly pipeline and host the genomic data in the Genome Ark database. The genomes, annotations and alignments are also available in international public genome browsing and analyses databases, including the National Center for Biotechnology Information Genome Data Viewer, Ensembl genome browser, and UC Santa Cruz Genomics Institute Genome Browser. All data are open source and publicly available under the G10K data use policies.

Other novel biological discoveries from the 16 genomes in the flagship paper, and from the 25 genomes in total across over 20 papers in this first wave of publications, include:

Corrections of false gene or chromosome losses, where previous assemblies missed 30% to 50% of GC-rich protein-coding gene regulatory regions, which had been considered part of the 'dark matter' of the genome;

Newly identified chromosomes in the zebra finch and platypus;

Complete and error-free mitochondrial genomes for most species, some generated in single-molecule sequences without the need for assembly;

Wild sex chromosome evolution in monotreme mammals and birds;

Genetic variations between humans and marmosets that have implications for marmosets as an emerging non-human primate model system for biomedical research;

Lineage-specific changes shaping the evolution of bird and mammal genomes in the duck, emu, platypus and echidna; and

Proposal for a universal evolution-based revised nomenclature for the oxytocin and vasotocin ligand and receptor families.

Credit: 
Nova Southeastern University

More stringent public health measures associated with lower COVID-19 cases, deaths

As state and local policymakers and politicians enacted stay-at-home orders last March in response to the coronavirus pandemic, more stringent public health measures directly correlated with lower virus case numbers during the first two months of the pandemic, a recent study has found.

The study, "More Stringent Measures Against Covid-19 Are Associated With Lower Cases and Deaths in Florida and Miami-Dade," was recently published in the American Journal of Emergency Medicine.

Utilizing The New York Times' GitHub repository of cases and deaths and the COVID-19 Government Response Stringency Index developed by Oxford University's Blavatnik School of Government for the period between March 11, 2020, and March 26, 2021, the researchers found that mitigation efforts to reduce infections, hospitalizations and deaths -- including the closure of schools and businesses, the adoption of social distancing measures, the use of facial coverings, and stay-at-home orders -- were effective in keeping overall COVID-19 cases and deaths lower than they would have been had these efforts not been implemented.

"As we analyzed the data, it was striking to us how the adoption of more stringent policy measures aimed at reducing the spread of COVID-19 infection largely coincided with fewer cases and deaths," said Alex R. Piquero, chair and professor in the Department of Sociology and Arts & Sciences Distinguished Scholar at the University of Miami and corresponding author on the study.

Once the restrictions were loosened, however, the researchers noted an initial surge in transmission between the Memorial Day and Independence Day holidays. When statewide stringency reached its lowest level in October 2020, a second surge in coronavirus cases followed immediately afterward.

Any changes in the stringency index were predictive of changes in the incidence of coronavirus cases 10 to 17 days later, the researchers noted.
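The underlying check is straightforward to reproduce in outline. The sketch below is illustrative rather than the authors' analysis code: it computes the correlation between day-to-day changes in the stringency index and changes in reported cases a fixed number of days later, for lags of 10 to 17 days. The file name and column names are assumptions.

import pandas as pd

# Hypothetical daily data: date, Oxford stringency index, new reported cases.
df = pd.read_csv("florida_daily.csv", parse_dates=["date"], index_col="date")

stringency_change = df["stringency_index"].diff()
case_change = df["new_cases"].diff()

for lag in range(10, 18):
    # Correlate today's change in stringency with the change in cases `lag` days later.
    r = stringency_change.corr(case_change.shift(-lag))
    print(f"lag {lag:2d} days: correlation = {r:+.2f}")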

Credit: 
University of Miami

Six out of every 10 teachers believe that changing the design of the classroom is key to improving learning

image: Science tells us that we learn better and more by collaborating, and therefore the space must favour this

Image: 
(Photo: Smartclassroomproject.com)

The image of rows of chairs and desks facing a teacher at a blackboard has been a reality for decades. However, research reveals that this way of organizing classroom furniture is not the best way to support learning, especially given the needs of 21st-century students, who, according to the OECD, require a social environment that fosters autonomy, flexibility, decision-making capacity and the connection of knowledge, whether individually or through teamwork.

Six out of every 10 teachers also believe that changing the design of the classroom is key to improving learning. This was the result of a recent study conducted by researchers from the Universitat Oberta de Catalunya (UOC), Universitat Autònoma de Barcelona (UAB), Universitat de Barcelona (UB), Universitat de Vic (UVic) and Universidad Simón Bolívar (USB), in which 847 preschool, primary and secondary school teachers from 40 schools participated. "We assume that's what the spaces should be like without giving it much thought or without connecting what we're innovating in terms of methodology with the place in which we're going to put that into practice," said Guillermo Bautista, member of the Smart Classroom Project research group at the UOC and principal investigator of the study. That, he argues, is why the Smart Learning Space needs to become a reality: "a space that meets any learning need or proposal, that is flexible and not zoned, in which physical and psychological well-being are prioritized as the foundations upon which the learning activity can take place, in which the pupils play a proactive and autonomous role," said Bautista.

Several studies have already acknowledged the benefits of a suitably designed classroom. This was one of the reasons why the Consorci d'Educació de Barcelona began replacing the furniture in 487 classrooms a few weeks ago, while also reorganizing the spaces to create motivating environments that encourage discovery. As the authors of the UOC-led study point out, the skills and learning needs of today's pupils not only oblige us to rethink our teaching practices and the inclusion of digital resources; they also require changes in the learning spaces in general.

Guillermo Bautista illustrates this with an example: science tells us that we learn better, and learn more, by collaborating, so the space must support that collaboration and interaction, also taking into account what research tells us about collaborative learning. If an activity is organized around groups of four students working on a challenge or a project, the space should allow each group to collaborate and give it a certain amount of autonomy to use the resources it needs -- to move, look around, experiment and self-organize. "This means that not all of the groups will be doing the same thing at the same time, and the same resources will not be necessary for everyone. The activity in the classroom is diverse and the space must constantly respond to this organizational diversity of use, resources, movements," he explained.

However, the long-held assumption that the classroom simply is the way it is has meant that few changes have been proposed. And when changes finally are proposed, their direction is not easy to decide upon, "and that is why our research is necessary, to help establish criteria so that the space is changed with guarantees," he said.

Changes also needed in the design of secondary school classrooms

Currently, most teachers rate the organization of their classroom environment negatively. This is one of the findings of the study, in which current classrooms received low or moderate scores for their suitability as comprehensive learning spaces. But differences exist between levels of education: the design of preschool and primary learning spaces is generally more flexible, collaborative and personal, the authors of the study affirm, and they point to a possible reason for this scenario. "It is precisely in the infant and primary stages where teaching trends such as those applied since the early 20th century (in which the spaces, their layout, furniture, etc. were already linked to clear educational meanings) have been most present and usually more visible," said Angelina Sánchez-Martí, researcher on the Smart Classroom Project and Serra Húnter professor at the Universitat Autònoma de Barcelona.

On the other hand, the traditional layout of spaces is much more entrenched among secondary school centres and teachers. That is why the study's authors see it as a positive sign that the research confirmed there are teachers and centres at this stage of education who are aware that their spaces do not correspond with the methodologies they want to implement. "The Smart Spaces that we have implemented as part of the research are co-designed with thoroughness and rigour, seeking to meet the highest learning objectives and results proposed by each centre. And these spaces are needed at all stages," said Bautista.

Another notable result is that teachers are especially critical when appraising the integration of technology in classrooms. In the opinion of the study's authors, this finding is not surprising, as "it is precisely the new technologies that are threatening the traditional times and spaces, and therefore demand great flexibility and a constant adaptation to change, as well as a reformulation of the learning spaces," said Sánchez-Martí. She added that the possibilities that technological integration offers for creating new ways to relate and learn "completely clash with the very standardized design derived from the idea that schools must be based on classrooms per se, when this does not necessarily have to be the case."

Credit: 
Universitat Oberta de Catalunya (UOC)