Earth

Changes in climate and land cover affecting European migratory bird populations

image: Swallows are a common sight across the UK. Every year, they make the long journey from Europe to South Africa.

Image: 
Philip Stephens

A new study led by the Department of Biosciences at Durham University, UK, is the first large-scale assessment of how recent changes in both climate and land cover have impacted populations of migrating birds.

Global declines in the numbers of individuals of many migratory species are thought to be caused by a combination of climate change and habitat loss on both their breeding and non-breeding grounds, as well as changes to areas they use to refuel whilst on migration. Understanding of which factors are key in causing recent declines, and in which areas changes are having most impact, remains poor.

Using data on the long-term population trends of 61 short-distance and 39 long-distance European breeding migratory bird species, the researchers related changes in climate and land cover across the birds' breeding and non-breeding grounds over a 36-year period to their population trends.

The study showed that populations of migratory birds were most affected by changes in climate on their European breeding grounds, but in the areas they migrate to after the breeding season, changes in land cover had the greater impact.

The combined effects of changes in climate and land cover account for approximately 40 percent of the variation in the population trends of migratory birds, which means that other factors, such as changes in habitat quality, probably also have a substantial impact on population changes.

Professor Stephen Willis, who led the study, said: "For years, people have suspected that climate and land cover changes are major drivers of population trends of migratory birds.

"Here we show, for the first time, that for long-distance migrants moving between Europe and Africa, it is a combination of European climate change and African land cover change that is key to the population declines of many such species over recent decades.

"In the UK, we have seen major declines in many migratory bird species that come here to breed from their African wintering grounds. For example, the Turtle Dove declined by 95% between 1992 and 2017, and the Nightingale declined by 56% between 1995 and 2018."

Lead author, Dr Christine Howard added: "The relatively minor role of recent climate changes on African non-breeding grounds for long distance migrants was surprising but probably reflects the less extreme climatic changes there compared to Europe.

"The fact that a lot of the variation in population trends remains unexplained in our study suggests that other factors, such as agricultural intensification, are probably also impacting populations, along with changes at migratory stopping points, including hunting."

The researchers say that to stop the declines of European migrant birds, an integrated approach must consider all processes affecting them across the different grounds they inhabit throughout the year.

Credit: 
Durham University

Dynamic membranes set to solve problems of liquid waste treatment

image: Graph of the particle size distribution of the dispersed phase of the PS suspension in an aqueous solution of acetone

Image: 
Kazan Federal University

The co-authors, Associate Professor Dinar Fazullin and Associate Professor Gennady Mavrin, have been researching membrane elements for water purification for ten years. The area is highly relevant because of the large volumes of liquid waste generated and the lack of specialized membrane types for treating them.

In the article, the co-authors explain that, to make the membrane more resistant to aggressive media and mechanically stronger, they produced dynamic membranes with a surface layer of polystyrene on a substrate of porous hydrophilic polytetrafluoroethylene (PTFE). A 'dynamic' membrane is a composite membrane formed when suspended microparticles or dissolved substances in the solution being processed deposit as a semi-permeable layer on the surface of a porous base; this layer remains in dynamic equilibrium with the solution. The main advantages of dynamic membranes are repairability and equipment durability.

The polystyrene particles deposited on the substrate are less than 70 nm in size. Scanning electron micrographs of the membranes show that the surface of the dynamic membrane is covered with spherical polystyrene particles less than 100 nm across. After the polystyrene layer is applied to the substrate, the specific productivity of the membranes decreases and the hydrophobicity of the membrane's surface layer increases.

A dynamic membrane was used to separate spent cutting fluid. The separation efficiency of the emulsion was evaluated by the removal of oil products and fatty acids. After ultrafiltration, the concentration of oil products in the membrane filtrate decreased. The calculated retention capacity of the initial PTFE microfiltration membrane for oil products is no more than 49%; applying a dynamic layer of polystyrene raises the retention capacity to 95%, which is 5.5% higher than that of the commercial ultrafiltration membrane UPM-100. The retention capacity of other commercial ultrafiltration membranes for emulsions ranges from 82 to 99%. The dynamic membrane achieves its high retention of oil products at a specific productivity of 167 dm³/(m²·h), which exceeds the performance of commercial ultrafiltration membranes. HPLC was used to measure the fatty acid content of the emulsion and its filtrates; the dynamic membrane retained more than 68% of the fatty acids.
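The retention percentages quoted above follow from the standard retention (rejection) formula, R = (1 − C_filtrate/C_feed) × 100%. A minimal Python sketch, using hypothetical feed and filtrate concentrations (the article does not report the raw concentration values):

```python
def retention_capacity(c_feed: float, c_filtrate: float) -> float:
    """Retention (rejection) R = (1 - C_filtrate/C_feed) * 100%.

    c_feed: concentration of oil products in the feed emulsion (e.g. mg/dm3)
    c_filtrate: concentration remaining in the membrane filtrate
    """
    if c_feed <= 0:
        raise ValueError("feed concentration must be positive")
    return (1.0 - c_filtrate / c_feed) * 100.0


# Hypothetical values chosen only to reproduce the quoted percentages:
# a feed of 200 mg/dm3 reduced to 102 mg/dm3 corresponds to the bare
# PTFE substrate's ~49% retention; reduced to 10 mg/dm3, to the
# dynamic membrane's 95%.
print(retention_capacity(200.0, 102.0))  # 49.0
print(retention_capacity(200.0, 10.0))   # 95.0
```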

The hydrophobic dynamic membranes obtained can be used effectively, at high specific productivity, to separate spent emulsions and dispose of liquid waste. The method developed for producing the composite membrane makes it possible to obtain membranes with the required pore sizes and to repeatedly replace a contaminated dynamic layer with a new one.

In the future, the researchers plan to work on stabilizing the dynamic layer of the membrane by physical means, since the stability of the surface layer is an important factor in the process of efficient separation of emulsions. It is also planned to study the development of new composite membranes for water treatment and water purification processes.

Credit: 
Kazan Federal University

Decline in US bird biodiversity related to neonicotinoids, study shows

image: Yijia Li (left) and Madhu Khanna, University of Illinois.

Image: 
College of ACES, University of Illinois

URBANA, Ill. - Bird biodiversity is rapidly declining in the U.S. The overall bird population has decreased by 29% since 1970, while grassland birds have declined by an alarming 53%.

Valuable for so much more than flight and song, birds hold a key place in ecosystems worldwide. When bird numbers and varieties dwindle, pest populations increase and much-needed pollination decreases. Those examples alone negatively impact food production and human health.

Likely reasons for the far-reaching and devastating declines include intensified agricultural production, use of pesticides, conversion of grassland to agricultural land, and climate change. A new study from University of Illinois points to increased use of neonicotinoid insecticides as a major factor in the decline, says Madhu Khanna, distinguished professor in agricultural and consumer economics at U of I and co-author on the paper, published in Nature Sustainability.

Khanna says numerous studies have shown neonicotinoids - nicotine-based pesticides - negatively affect wild bees, honey bees, and butterflies, but large-scale studies on the pesticide's impact on birds have been limited. She speaks more about the topic in a podcast from the Center for the Economics of Sustainability at Illinois.

"This represents the first study at a national scale, over a seven-year time period, using data from hundreds of bird species in four different categories - grassland birds, non-grassland birds, insectivores, and non-insectivores," she says.

"We found robust evidence of the negative impact of neonicotinoids, in particular on grassland birds, and to some extent on insectivore birds after controlling for the effects of changes in land use."

Khanna and co-authors Yijia Li, a graduate student at U of I, and Ruiqing Miao, assistant professor at Auburn University, analyzed bird populations from 2008 to 2014 in relation to changes in pesticide use and agricultural crop acreage.

The authors found that an increase of 100 kilograms in neonicotinoid usage per county (a 12% increase on average) contributed to a 2.2% decline in populations of grassland birds and a 1.6% decline in insectivorous birds. By comparison, an additional 100 kilograms of non-neonicotinoid pesticides was associated with a 0.05% decrease in grassland birds and a 0.03% decline in non-grassland birds, insectivorous birds, and non-insectivorous birds.

Since impacts accumulate, the authors estimate that, for example, 100 kilograms of neonicotinoid use per county in 2008 reduced cumulative grassland-bird populations by 9.7% by 2014. These findings suggest that neonicotinoid use has a relatively large effect on the population declines of these birds and that the impacts grow over time.
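To see why a 2.2% annual effect can add up to a much larger cumulative loss, consider how repeated proportional declines compound multiplicatively. The sketch below is a naive compounding illustration, not the paper's econometric model (whose cumulative estimate of 9.7% is smaller than simple compounding would give):

```python
def cumulative_decline(annual_decline_pct: float, years: int) -> float:
    """Cumulative % decline if the same annual % decline repeats each year."""
    remaining_fraction = (1.0 - annual_decline_pct / 100.0) ** years
    return (1.0 - remaining_fraction) * 100.0


# A 2.2% annual decline sustained over the six year-to-year intervals
# from 2008 to 2014 compounds to roughly a 12.5% cumulative loss
# under this naive model.
print(cumulative_decline(2.2, 6))
```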

According to the study, the adverse impacts on bird populations were concentrated in the Midwest, Southern California, and Northern Great Plains.

The researchers say the effect of neonicotinoids could result directly from birds consuming treated crop seeds, and indirectly by affecting the insect populations they feed on. Consumption of just a few seeds is enough to cause long-term damage to the birds' reproduction and development.

The study included data on bird populations and species diversity from the North American Breeding Bird Survey, a comprehensive database covering about 3,000 bird survey routes across the United States. The researchers correlated the bird data with pesticide use, as well as with satellite data on agricultural crop acreage and urban land use.

They examined whether intensified agricultural production and conversion of grassland to agricultural land also contributed to the bird decline. Results showed a small negative effect on grassland birds related to cropland expansion, but no significant effect on other types of birds.

While the use of other pesticides has been flat or declining, neonicotinoid usage has grown exponentially over the past two decades. Neonicotinoids are considerably more toxic to insects and persist longer in the environment, the researchers note.

"This research provides compelling support for the re-evaluation of policies permitting the use of neonicotinoids by the U.S. Environmental Protection Agency by incorporating considerations of the implications of these pesticides for bird habitats," the authors conclude.

Credit: 
University of Illinois College of Agricultural, Consumer and Environmental Sciences

Historical redlining linked to premature births, lower birth weight babies

Past discriminatory housing practices may play a role in perpetuating the significant disparities in infant and maternal health faced by people of color in the U.S., suggests a new study by researchers at the University of California, Berkeley.

For decades, banks and other lenders used redlining maps to deny loans to people living in neighborhoods deemed too risky for investment. These maps, first drawn in 1935 by the government-sponsored Home Owners' Loan Corp. (HOLC), shaded neighborhoods in one of four colors -- from green representing the lowest risk to red representing the highest risk. These designations were based, in part, on the race and socioeconomic status of each neighborhood's residents.

To investigate the link between historical redlining and infant and maternal health today, the team obtained birth outcome data for the cities of Los Angeles, Oakland and San Francisco between 2006 and 2015 and compared them to HOLC redlining maps.

They found that adverse birth outcomes -- including premature births, low birth weight babies and babies who were small for their gestational age -- occurred significantly more often in neighborhoods with worse HOLC ratings.

"Our results highlight how laws and policies that have been abolished can still assert health effects today," said Rachel Morello-Frosch, a professor of public health and of environmental science, policy and management at UC Berkeley and senior author of the study, which appeared online this month in the journal PLOS ONE. "This suggests that if we want to target neighborhood-level interventions to improve the social and physical environments where kids are born and grow, neighborhoods that have faced historical forms of discrimination, like redlining, are important places to start."

Non-Hispanic Black women living in the U.S. are one-and-a-half times more likely to give birth to premature babies than their white counterparts and are more than twice as likely to have babies with a low birth weight. Hispanic women face similar, though less dramatic, disparities, compared to non-Hispanic white women.

While the legacy of public and private disinvestment in redlined neighborhoods has led to well-documented disparities in income level, tree canopy coverage, air pollution and home values in these communities, the long-term health impacts of redlining are just now starting to be explored.

"Children born during the time of our study would be the great-great-grandchildren of those who were alive at the time of redlining, whose options of where to live would have been determined by redlining maps," said study lead author Anthony Nardone, a medical student in the UC Berkeley-UCSF Joint Medical Program. "We chose to look at birth outcomes because of the stark inequities that exist across race in the U.S. today, inequities that we believe are a function of long-standing institutional racism, like historical redlining."

Earlier work led by Nardone showed that residents of neighborhoods with the worst HOLC rating were more than twice as likely to visit the emergency room for asthma as residents of neighborhoods with the highest HOLC rating. And a recent study from the Harvard School of Public Health found a link between redlining and preterm births in New York City.

In the new study, the team found that neighborhoods with the two worst HOLC ratings -- "definitely declining" and "hazardous" -- had significantly worse birth outcomes than those with the best HOLC rating.

However, Los Angeles neighborhoods rated "hazardous" showed slightly better birth outcomes than those with the second worst, or "definitely declining," rating. In San Francisco and Oakland, neighborhoods with these two ratings showed similar birth outcomes.

This pattern might be attributed to the effects of gentrification on previously redlined neighborhoods, the authors surmised. They added that people in the hardest hit neighborhoods may also rely more on community support networks, which can help combat the effects of disinvestment.

"We also saw different results by metropolitan area and slightly different results by maternal race," Morello-Frosch said. "This suggests that maybe the underlying mechanisms of the effect of redlining differ by region and should be investigated further."

Credit: 
University of California - Berkeley

TGen review suggests postmenopausal women at risk for nonalcoholic fatty liver disease

PHOENIX, Ariz. -- Aug. 14, 2020 -- A review article authored by a researcher at the Translational Genomics Research Institute (TGen), an affiliate of City of Hope, suggests that following menopause, women are at higher risk for developing nonalcoholic fatty liver disease (NAFLD), a chronic condition caused by the build-up of excess fat in the liver not caused by alcohol.

NAFLD is the most common cause of liver damage, and can lead to liver cirrhosis and death. It also is one of the leading indicators for liver transplants. And, it is common, affecting nearly 1 in 4 people across the globe. It often is associated with obesity, abnormally high amounts of lipids in the blood, and type 2 diabetes.

In the U.S., the number of NAFLD cases is expected to grow to more than 100 million within the next decade. Already, the total annual cost among Americans is estimated at $292 billion.

"Even without taking into consideration the indirect costs of the disease, such as lost work-related productivity, it is clear that NAFLD places a substantial burden on the United States healthcare system," said Dr. Johanna DiStefano, a Professor and head of TGen's Diabetes and Fibrotic Disease Unit, and the study's senior author.

Dr. DiStefano's review of more than 60 epidemiological, clinical and experimental studies, published this week in the journal Endocrinology, suggests that the risk of NAFLD is greater among postmenopausal women than premenopausal women. Menopause is the time when women are no longer able to have children, most often after age 45. During menopause, a woman's ovaries stop producing the hormones estrogen and progesterone. Women reach menopause when they have not had a period for one year.

Significantly, the level of an endocrine hormone called estradiol, or E2, which is produced by the ovaries, declines significantly following menopause. E2 is the major female sex hormone involved in the regulation of the estrous and menstrual female reproductive cycles.

"It is likely that the loss of protection conferred by estrogens, combined with other factors, underlie the increased NAFLD risk in post-menopausal women," Dr. DiStefano said.

NAFLD can progress to a more dangerous condition called nonalcoholic steatohepatitis (NASH), which indicates there is both inflammation and liver cell damage, along with fat in the liver. The number of NASH cases in the U.S. also is expected to climb to a projected 27 million by 2030. Among women, NASH is now the leading indication for liver transplantation, which is the most effective treatment strategy against NASH, though NAFLD recurrence in transplant patients is high.

"The mortality rate is rising among women with NAFLD, and more are dying from cirrhosis, suggesting that many women have NASH, rather than just NAFLD," Dr. DiStefano said.

Hope for women: avoiding liver disease

The review also suggests that normal-weight women with lipid, glucose and insulin levels within normal ranges are at low risk for developing NAFLD.

"Efforts to emphasize healthy diet and regular physical activity should be urged in middle-aged women as they approach menopause to prevent the development of NAFLD," Dr. DiStefano said.

In addition, postmenopausal women may potentially benefit from treatment options, such as hormone replacement therapy (HRT). However, the effects of different hormone combinations, including the start of therapy, the duration of therapy, dosages and even how the treatments are administered "represent a critical gap in clinical research," according to the review.

"Clinical studies with focused outcomes are necessary to determine if postmenopausal hormonal manipulation or other treatments can prevent or treat NAFLD in at-risk women," Dr. DiStefano said.

Credit: 
The Translational Genomics Research Institute

Traces of ancient life tell story of early diversity in marine ecosystems

image: USask paleobiologists Gabriela Mángano and Luis Buatois (left to right) in the field in Morocco examining 478-million-year-old marine rocks

Image: 
Xiaoya Ma

If you could dive down to the ocean floor nearly 540 million years ago, just past the point where waves begin to break, you would find an explosion of life--scores of worm-like animals and other sea creatures tunneling complex holes and structures in the mud and sand--where before the environment had been mostly barren.

Thanks to research published today in Science Advances by a University of Saskatchewan (USask)-led international research team, this rapid increase in biodiversity--one of two such major events across a 100-million-year timespan 560 to 443 million years ago--is part of a clearer picture emerging of Earth's ancient oceans and life in them.

"We can see from the trace fossils--tracks, trails, borings, and burrows animals left behind--that this particular environment of the ocean floor, the offshore, served as a 'crucible' for life," said USask paleobiologist Luis Buatois, lead author of the article. "Over the next millions of years, life expanded from this area outwards into deeper waters and inwards into shallower waters."

The research is the culmination of over 20 years of work from Buatois and the team which examined hundreds of rock formations in locations across every continent.

"Until now, these two events--the Cambrian Explosion and the Great Ordovician Biodiversification Event--have been understood mostly through the study of body fossils--the shells, carapaces and the bones of ancient sea creatures," said Buatois. "Now we can confidently say that these events are also reflected in the trace fossil record which reveals the work of those soft-bodied creatures whose fleshy tissues rot very quickly and so are only very rarely preserved."

For the first time, the team has shown evidence of animals actively "engineering" their ecosystem--through the construction of abundant and diverse burrows on the sea floor of the world's oceans in this ancient time.

"Never underestimate what animals are capable of doing," said USask paleobiologist Gabriela Mángano, co-author of the paper. "They can modify their physical and chemical environment, excluding other animals or allowing them to flourish by creating new resources. And they were definitely doing all these things in these ancient seas."

The trace fossil-producing animals' engineering efforts may have laid the foundation for greater diversity in marine life. The researchers identified a 20-million-year time lag during the Cambrian Explosion (the time when most of the major groups of animals first appear in the fossil record) between diversification in trace fossils and in animal body fossils, suggesting the later animals exploited changes which enabled them to diversify even more.

The research also helps resolve a big question from the geochemical record, which indicated much of the ancient ocean was depleted of oxygen and unsuitable for life. Like oceans today, the Cambrian ocean had certain areas that were full of life, while others lacked the necessary conditions to support it.

"The fact that trace fossil distribution shows that there were spots where life flourished adjacent to others devoid of animal activity all through the early Cambrian period is a strong argument in favor of the idea that zones with enough oxygen to sustain a diversity of animals co-existed with oxygen-depleted waters in deeper areas," said Mángano. "It's a situation similar to what happens in modern oceans with oxygen minimum zones in the outer part of the continental shelf and the upper part of the continental slope, but oxygenated ones in shallower water."

The research could provide new insights from an evolutionary perspective into the importance of extensive rock formations of a similar vintage found in Canada and elsewhere, and help society to prepare for coming challenges.

"Understanding changes that took place early in the history of our planet may help us to face present challenges in modern oceans, particularly with respect to oxygen changes," said Buatois.

Credit: 
University of Saskatchewan

Exploring connections between ovarian cancer and blood cells

image: Illustration of organ-on-chip at work, with different sections indicating the flow of drugs, tumor cells, endothelium, and platelets. The OvCa-Chip gives researchers an easier window to view the biological processes between tumors and platelets and could be used to test new anticancer drugs on a surface the size of a USB drive.

Image: 
Texas A&M University College of Engineering

Dr. Abhishek Jain, assistant professor in the Department of Biomedical Engineering and the Department of Medical Physiology in the College of Medicine, collaborated with researchers from the Departments of Gynecologic Oncology and Cancer Biology at MD Anderson Cancer Center to gain a better understanding of the interaction among ovarian cancer tumors, blood vessels and platelets. They found that tumors break the blood vessel barriers so that they can communicate with the blood cells, such as platelets. When these tumors come into contact with platelets, they can then metastasize, or begin to spread to other sites in the body.

The collaborative research was recently published in the journal Blood Advances.

Researchers already knew that platelets are one of the initiators of ovarian cancer metastasis, but not what led to the introduction of the platelets to the tumor cells. Instead of struggling to view this relationship in animal models, Jain's team brought a new solution to the table: organ-on-a-chip research.

Organs-on-a-chip are microfluidic medical devices the size of a USB drive. The team designed the OvCa-Chip to give researchers an easier window into the biological processes between tumors and platelets.

In an interview with the International Society on Thrombosis and Hemostasis, Jain explained that "it basically is a microenvironment where ovarian tumor cells can be co-cultured along with their blood vessels, and then they can interact with blood cells. Once we learn about these interactions, we can then move forward to look into how drugs will impact these kinds of interactions."

Viewing the interaction between tumors and blood vessels on the OvCa-Chip led the researchers to an extraordinary result -- the tumor cells systematically broke down the endothelial cells, which are the barrier that lines the interior surface of blood vessels and prevents exterior interaction with blood cells. Once this barrier was gone, blood cells and platelets entered the tumor microenvironment and could be recruited for metastasis.

Harnessing this knowledge could change how clinicians approach ovarian cancer treatment, Jain said, suggesting that anti-vascular drugs could be considered along with anticancer treatments. A benefit of the organ-on-a-chip is that it can also test these novel drug treatments and drug combinations.

Another application of the chips could be diagnostics.

"You have to understand that these are chips that are living. They contain living cells. The advantage is that these are all actually human samples," Jain stated in the interview. "So what we think the future for this technology is, is perhaps we can advance it in the direction of personalized medicine where we could actually take stem cells from patients and other patient-derived cells and make this entire chip from a single patient."

Credit: 
Texas A&M University

A watershed moment for US water quality

COLUMBUS, Ohio - A new federal rule that determines how the Clean Water Act is implemented leaves millions of miles of streams and acres of wetlands unprotected based on selective interpretation of case law and a distortion of scientific evidence, researchers say in a new publication.

In a Policy Forum article published in the Aug. 14 issue of Science, the researchers assert that the Navigable Waters Protection Rule undermines the spirit - if not the letter - of the Clean Water Act by protecting only waters that have a permanent hydrologic surface connection to rivers, lakes and other large "navigable" bodies of water. Also omitted from consideration is maintaining the integrity of the biological and chemical quality of the nation's waters, protections that are explicitly called for in the Clean Water Act.

"It's so important to say, right out of the gates, that the new rule does not protect water in the way that the Clean Water Act was intended to protect water," said lead author Mažeika Sullivan, director of the Schiermeier Olentangy River Wetland Research Park at The Ohio State University.

The rule went into effect on June 22.

Left unprotected under the new rule are stand-alone wetlands across the country whose collective area is approximately the size of West Virginia. Also losing federal protection are millions of miles of ephemeral streams - those that flow only after precipitation - including more than 95 percent of Arizona's streams, among them many tributaries that flow into the Grand Canyon.

The change means that now-unprotected waters may be subjected to a variety of harmful human activities such as dredging or filling in waters for development, or even unpermitted dumping of industrial waste into streams or wetlands. Some potential results: higher risk for floods, loss of biodiversity, and threats to drinking water and recreational fishing.

"We're talking about major roll-backs in protections that limit activities that impair, pollute and destroy these systems," said Sullivan, also associate professor in Ohio State's School of Environment and Natural Resources, who co-authored the article with colleagues specializing in aquatic science, conservation science and environmental law.

"And it comes at a time when we're really starting to understand multiple stressors on water - not just urbanization or climate change or pollution, but how all these factors interact. And now we're removing protections and potentially undermining decades of taxpayer investment in improving water quality.

"It's a travesty, not just for us now, but for future generations. It could really be a watershed moment in that sense."

Legal battles have been waged for years over which non-navigable U.S. waters should be protected under the Clean Water Act, and the U.S. Supreme Court weighed in with opinions in a 2006 case. Justice Antonin Scalia argued that non-navigable waters should be covered by federal law only if they have a "relatively permanent" flow and a continuous surface connection to traditionally protected waters. Justice Anthony Kennedy suggested a non-navigable water body should be protected if it has a "significant nexus" to a traditional navigable waterway - meaning it can affect the physical, biological and chemical integrity of downstream waters.

In 2015, the Obama administration implemented the Clean Water Rule, which classified all tributaries and most wetlands as "waters of the United States" that fall under federal jurisdiction. At the heart of that rule was a Connectivity Report produced by the Environmental Protection Agency, backed by a review of more than 1,200 scientific publications and input from 49 technical experts. The science supported protection for isolated or intermittent systems that, if polluted or destroyed, would decrease water quality downstream. Sullivan was a member of the EPA Scientific Advisory Board that confirmed the scientific underpinnings of the report and the rule.

The language of the new Navigable Waters Protection Rule instead harkens back to Scalia's 2006 opinion, protecting waters with "relatively permanent" surface flows and excluding from federal jurisdiction all groundwater and all ephemeral bodies of water, as well as others.

"So what's extremely concerning from a policy standpoint is that the federal government is, at least in part, leaving science aside," Sullivan said. "This idea of connectivity is one of the most crucial components of the science that has largely been ignored in this rule. There are magnitudes of connectivity - it could be frequency or how long it lasts. There are also different types of connectivity: biological, chemical and hydrologic.

"Further, just because a waterbody may be less connected to another doesn't necessarily mean it's less important for water quality."

For human recreation and well-being, Sullivan said, small streams and wetlands are critical, both in their own right and because they support larger, downstream ecosystems such as rivers, lakes and reservoirs.

"There are tendrils that extend into every aspect of our lives, from how we recreate and how we live, to our economy, with cultural implications for a lot of folks in the U.S. Water is fundamental to people's sense of place and where they belong," he said.

Sullivan and colleagues cited an April 2020 Supreme Court decision that may influence outcomes of the more than 100 pending lawsuits filed in opposition to the new rule. In County of Maui v. Hawaii Wildlife Fund, the court affirmed for the first time that pollutants that travel through groundwater and then emerge into surface waters are covered by the Clean Water Act.

Until the litigation is sorted out, the authors urged mobilization of grassroots efforts among watershed councils, other agencies and academics to conserve and protect water - a tall order, Sullivan acknowledged, when it comes to staying coordinated and coming up with resources.

"We're going to have to start thinking about this in a very different way," he said. "Everybody needs clean water, right? This isn't a political issue."

Credit: 
Ohio State University

Heavy class A drug use linked to heightened risk of sight loss in US military

Heavy use of class A drugs, such as heroin, methamphetamine, or cocaine, is linked to a heightened risk of partial or total blindness among US military personnel, finds research published online in the journal BMJ Military Health.

The condition, which affects the ability to read, drive, and recognise faces, is more common among military personnel than it is among civilians, possibly because of certain risk factors that are unique to active service in the US Armed Forces, say the researchers.

Several behavioural and health factors are associated with sight loss. These include HIV infection; high blood pressure; smoking; diabetes; poor diet; sedentary lifestyle; cancer; depression; gum disease; and social and economic deprivation.

It's not clear whether illicit drug use might compound these factors or represent an independent risk factor in its own right, or what role experience of military service might have.

To explore this further, the researchers drew on responses to the nationally representative National Health and Nutrition Examination Surveys for the years 2013 to 2018.

They compared dietary, lifestyle, and drug use factors in 106 serving or former soldiers and 1572 civilians.

The responses showed that those with experience of military service were significantly more likely to use illicit drugs than the civilians surveyed: 21% vs 13.5%. And proportionally more of them reported partial or total blindness than the civilians.

They were also more likely to have several other risk factors associated with sight loss, including HIV positivity, higher blood pressure, more years of smoking and lower levels of physical activity.

The researchers then ran the data through three separate analyses.

The first showed that experience of military service was associated with a heightened risk of sight loss, while the second showed that older age, lower educational attainment, and lower household income were risk factors.

In the third analysis, health conditions, lifestyle factors, socioeconomic factors, military service and illicit drug use were all put into the mix, with experience of military service and illicit drug use emerging as the strongest risk factors, followed by HIV positivity.
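The headline proportions allow a rough back-of-the-envelope comparison. A minimal sketch of the odds-ratio arithmetic, for illustration only; the study's actual analyses were staged multivariable models on the survey data, not this calculation:

```python
# Odds ratio from the proportions reported in the article: 21% of respondents
# with military experience reported illicit drug use, versus 13.5% of civilians.
# (Group sizes from the article: 106 military vs 1572 civilian respondents.)

def odds(p):
    """Convert a proportion to odds."""
    return p / (1 - p)

p_military, p_civilian = 0.21, 0.135
odds_ratio = odds(p_military) / odds(p_civilian)
# An odds ratio above 1 indicates higher odds of illicit drug use among those
# with military experience; the study's regressions then adjust comparisons
# like this one for age, income, education and health covariates.
```

This unadjusted figure is why the staged analyses matter: each successive model tests whether the association survives once other risk factors are accounted for.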

The researchers highlight several limitations to their study findings, including no information on the type, extent, or cause of sight loss or the timing of drug use. And there was little information on whether respondents were serving personnel or veterans.

"However, given the zero tolerance policy on drug use in the military, it is likely our military sample was composed entirely of veterans," they point out.

"This study indicates that (1) self-reported vision loss among service members or veterans is more prevalent than among civilians with no military experience, and (2) self-reported vision loss in military service members or veterans may be associated with severe or prolonged illicit drug use, such as heroin, methamphetamine, or cocaine," write the researchers.

And they conclude: "Given the relatively high prevalence of severe illicit drug use among service members or veterans compared with civilians, and the long term impacts on the eye, there is a need for medical and behavioural health programmes that provide vision screenings to drug-using veterans."

Credit: 
BMJ Group

How a protein stops cells from attacking their own DNA

Viruses multiply by injecting their DNA into a host cell. Once it enters the intracellular fluid, this foreign material triggers a defense mechanism known as the cGAS-STING pathway. The protein cyclic GMP-AMP Synthase (cGAS), which is also found inside the fluid, binds to the invading DNA to create a new molecule. This, in turn, binds to another protein called Stimulator of Interferon Genes (STING), which induces an inflammatory immune response.

Sometimes, the material contained inside the fluid - and in contact with the cGAS protein - comes not from a virus but from the cell itself, for instance after the nucleus has accidentally ruptured. When this happens, the cGAS-STING pathway isn't activated. Scientists at EPFL have demonstrated how cells are able to respond differently to their own DNA and to genetic material from a pathogen - and avoid attacking the wrong target. Their discovery, published in a paper in the journal Science, sheds new light on the complex processes at work in the body's inflammatory response.

The team, led by Prof. Andrea Ablasser and working with colleagues from the laboratories of Prof. Beat Fierz and Prof. Selman Sakar, uncovered new insights into the key role of a small protein known as Barrier-to-Autointegration Factor (BAF). They showed that, by binding to the inoffensive DNA, BAF prevents the cGAS protein from doing the same, thereby stopping the cGAS-STING pathway in its tracks.

BAF strengthens the cell nucleus, connecting the nuclear envelope (or membrane) to the DNA inside. Experiments have shown that when this protein is removed from lab-grown cells, the nucleus ruptures. This breach releases the genetic material into the intracellular fluid, where it comes into contact with the cGAS protein and triggers the cGAS-STING pathway - just as if it were foreign DNA.

There are various ways to cause a nucleus to rupture, such as by applying mechanical pressure. But according to Baptiste Guey, one of the paper's lead authors, only one of these methods - removing the BAF protein - induces an immune response. "We can therefore conclude that BAF plays a key role in preventing the cell from attacking its own DNA," says Guey.

The protein's inhibitor role is vitally important: although the cGAS-STING pathway helps the body fight off infections, it also needs to be kept in check. "Nuclei do occasionally rupture, but cells are able to repair the damage," says Marilena Wischnewski, another lead author of the paper. "If cGAS bound to the DNA every time that happened, the consequences would be more serious."

The dangers of an overactive cGAS-STING pathway can be seen in Aicardi-Goutières syndrome: a rare and usually fatal genetic disorder that induces an excessive inflammatory response as if the body's cells were under constant attack from invading pathogens.

BAF is also believed to play a role in some types of tumor. According to Wischnewski, a high concentration of the protein in cancer cells may be associated with a poorer prognosis. "It could be that BAF makes tumors more resistant," she explains. "By preventing activation of the cGAS-STING pathway, it might allow cancer cells to evade the body's immune system."

The protein is found in varying quantities in different types of cells. The team is planning to dig deeper into these variations as they seek to understand how different tissue types respond to infection and inflammation.

Credit: 
Ecole Polytechnique Fédérale de Lausanne

Warming Greenland ice sheet passes point of no return

image: Icebergs near Greenland form from ice that has broken off--or calved--from glaciers on the island. A new study shows that the glaciers are losing ice rapidly enough that, even if global warming were to stop, Greenland's glaciers would continue to shrink.

Image: 
Photo courtesy Michalea King

COLUMBUS, Ohio - Nearly 40 years of satellite data from Greenland shows that glaciers on the island have shrunk so much that even if global warming were to stop today, the ice sheet would continue shrinking.

The finding, published today, Aug. 13, in the journal Nature Communications Earth and Environment, means that Greenland's glaciers have passed a tipping point of sorts, where the snowfall that replenishes the ice sheet each year cannot keep up with the ice that is flowing into the ocean from glaciers.

"We've been looking at these remote sensing observations to study how ice discharge and accumulation have varied," said Michalea King, lead author of the study and a researcher at The Ohio State University's Byrd Polar and Climate Research Center. "And what we've found is that the ice that's discharging into the ocean is far surpassing the snow that's accumulating on the surface of the ice sheet."

King and other researchers analyzed monthly satellite data from more than 200 large glaciers draining into the ocean around Greenland. Their observations show how much ice breaks off into icebergs or melts from the glaciers into the ocean. They also show the amount of snowfall each year--the way these glaciers get replenished.

The researchers found that, throughout the 1980s and 90s, snow gained through accumulation and ice lost through melting or calving were mostly in balance, keeping the ice sheet intact. Through those decades, the researchers found, the ice sheet generally lost about 450 gigatons (about 450 billion tons) of ice each year from flowing outlet glaciers, which was replaced with snowfall.

"We are measuring the pulse of the ice sheet--how much ice glaciers drain at the edges of the ice sheet--which increases in the summer. And what we see is that it was relatively steady until a big increase in ice discharging to the ocean during a short five- to six-year period," King said.

The researchers' analysis found that the baseline of that pulse--the amount of ice being lost each year--started increasing steadily around 2000, so that the glaciers were losing about 500 gigatons each year. Snowfall did not increase at the same time, and over the last decade, the rate of ice loss from glaciers has stayed about the same--meaning the ice sheet has been losing ice more rapidly than it's being replenished.
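The bookkeeping behind these figures is simple arithmetic: net mass balance is accumulation minus discharge. A minimal sketch using the article's round numbers (the real analysis uses monthly, per-glacier satellite estimates, not annual totals like these):

```python
# Toy ice-sheet mass-balance bookkeeping with the round numbers quoted in
# the article. Net balance = accumulation (snowfall) - discharge (calving
# plus melt at outlet glaciers); negative values mean the sheet shrinks.

def net_balance(accumulation_gt, discharge_gt):
    """Annual net mass change in gigatons."""
    return accumulation_gt - discharge_gt

# 1980s-90s: roughly in balance (~450 Gt/yr lost, ~450 Gt/yr replaced).
balanced = net_balance(450, 450)

# Post-2000: discharge stepped up to ~500 Gt/yr with no matching gain
# in snowfall, leaving a persistent annual deficit.
deficit = net_balance(450, 500)

# Cumulative loss over a decade at the post-2000 rate.
decade_loss = sum(net_balance(450, 500) for _ in range(10))
```

The sign of that deficit, rather than its exact size, is the "tipping point" the study describes: once discharge consistently exceeds accumulation, the sheet loses mass even in average years.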

"Glaciers have been sensitive to seasonal melt for as long as we've been able to observe it, with spikes in ice discharge in the summer," she said. "But starting in 2000, you start superimposing that seasonal melt on a higher baseline--so you're going to get even more losses."

Before 2000, the ice sheet would have about the same chance to gain or lose mass each year. In the current climate, the ice sheet will gain mass in only one out of every 100 years.

King said that large glaciers across Greenland have retreated about 3 kilometers on average since 1985--"that's a lot of distance," she said. The glaciers have shrunk back enough that many of them are sitting in deeper water, meaning more ice is in contact with water. Warm ocean water melts glacier ice, and also makes it difficult for the glaciers to grow back to their previous positions.

That means that even if humans were somehow miraculously able to stop climate change in its tracks, ice lost from glaciers draining ice to the ocean would likely still exceed ice gained from snow accumulation, and the ice sheet would continue to shrink for some time.

"Glacier retreat has knocked the dynamics of the whole ice sheet into a constant state of loss," said Ian Howat, a co-author on the paper, professor of earth sciences and distinguished university scholar at Ohio State. "Even if the climate were to stay the same or even get a little colder, the ice sheet would still be losing mass."

Shrinking glaciers in Greenland are a problem for the entire planet. The ice that melts or breaks off from Greenland's ice sheets ends up in the Atlantic Ocean--and, eventually, all of the world's oceans. Ice from Greenland is a leading contributor to sea level rise--last year, enough ice melted or broke off from the Greenland ice sheet to cause the oceans to rise by 2.2 millimeters in just two months.

The new findings are bleak, but King said there are silver linings.

"It's always a positive thing to learn more about glacier environments, because we can only improve our predictions for how rapidly things will change in the future," she said. "And that can only help us with adaptation and mitigation strategies. The more we know, the better we can prepare."

Credit: 
Ohio State University

FEFU scientists propose to restore neural tissue with hydrogels based on modified pectins

image: FEFU School of Biomedicine, Vadim Kumeiko's lab

Image: 
FEFU press office

Far Eastern Federal University (FEFU) scientists have developed implantable hydrogels based on plant polysaccharides (pectins). They can play the role of an artificial extracellular matrix, a special network of molecules that fills the space between body cells. The development is intended to be used as a medium for growing tissues and organs, as well as for drug delivery and for brain recovery after the removal of malignant tumors (glioblastomas). A related article appears in the International Review of Neurobiology.

The hydrogels developed at the FEFU School of Biomedicine (FEFU SBM) are plant carbohydrate-based materials made of pectins modified by bioengineering methods. They are suitable for neural tissue restoration after malignant transformation caused by brain tumors, and after damage caused by trauma or neurodegenerative diseases involving cell death or loss of functional activity of cells and their environment.

"Some variations of our extracellular hydrogel matrices are capable of suppressing the cell proliferation in glioma, a malignant brain tumor. Their chemical modifications can be used to secure the normal neural stem cells potential, "preserving" them in an undifferentiated state, saving their viability and potential for the future. This is interesting for the development of cellular biotechnologies in regenerative medicine," says the head of the research group Vadim Kumeiko, Deputy Director for Development of the FEFU SBM. "Certainly, bioengineering solutions associated with the use of extracellular matrices from pectins need to be carefully checked. However, we expect that in the future our hydrogels can be implanted in the brain tumor resection area in order to kill the tumor cells remaining after the operation."

The scientist explained that in the human body, the extracellular space is a complex molecular network, i.e. a matrix, which consists of two main components: protein and carbohydrate. The neural system matrix differs from the matrix of many other tissues because it has more carbohydrate and resembles marmalade or marshmallow in its physicochemical properties. This distinguishes it from the more elastic and rigid, protein-dominated matrix typical of connective tissues. It is almost impossible for cells to move along the carbohydrate matrix.

"This is the way nature intended so that in adult organisms brain cells do not easily migrate to new areas and do not form new electrical connections too quickly. Because if they do, it can lead, for example, to the loss of memory, acquired skills, and knowledge. Imagine you know a foreign language in the evening, but during the night cells migrated and you forgot everything," explains Vadim Kumeiko.

The problem is that tumor cells control the rigidness of its extracellular space by adding protein components to it. By doing that, they pave a pathway for themselves to run away in order to metastasize and form new tumors in other areas of the body.

"The matrix with a predominance of the carbohydrate component, implanted after the tumor was removed, will not only inhibit the growth and proliferation of cells, but will also be perfect as a delivery vehicle for highly toxic drugs. Such drugs will be released from it gradually, causing less harm to the body as a whole, and killing the remaining tumor cells. At the next stage, it is possible to stimulate the regeneration and growth of neural cells by injecting a tougher matrix containing a large proportion of proteins into the operated area," says Vadim Kumeiko.

The scientist clarified that the approach was proposed by his research group earlier in Frontiers in Bioengineering and Biotechnology; the new article provides partial experimental justification of the concept.

In the future, scientists plan to investigate the effect of pectin matrix composition on the rate of drug release, and what combination of carbohydrate and protein components will allow restoring neural tissue without scars or excessive density typical for tumor tissue.

In general, so far there are very few materials in the world that are approved for neural tissue bioengineering and clinical practice. Mainly, they are intended for the regeneration of the peripheral rather than the central nervous system.

Credit: 
Far Eastern Federal University

Reconstructing global climate through Earth's history

image: Sea surface temperatures in shallow, semi-enclosed seas today are warmer than they should be for their locations. Many paleotemperature data come from settings like these, raising the possibility that ancient temperatures suggest a warmer ancient Earth.

Image: 
Syracuse University

A key component when forecasting what the Earth's climate might look like in the future is the ability to draw on accurate temperature records of the past. By reconstructing past latitudinal temperature gradients (the difference in average temperature between the equator and the poles) researchers can predict where, for example, the jet stream, which controls storms and temperatures in the mid-latitudes (temperate zones between the tropics and the polar circles), will be positioned. The trouble is, many of the existing data are biased toward particular regions or types of environments, not painting a full picture of Earth's ancient temperatures.

Researchers from the Department of Earth and Environmental Sciences, including Emily Judd '20 Ph.D., Thonis Family Assistant Professor Tripti Bhattacharya and Professor Linda Ivany, have published a study titled, "A dynamical framework for interpreting ancient sea surface temperatures," in the journal "Geophysical Research Letters," to help account for the offset between location-biased paleoclimate data and the 'true' average temperature at a given latitude through Earth's history. Their work was funded by the National Science Foundation.

According to Judd, accurate temperature estimates of ancient oceans are vital because they are the best tool for reconstructing global climate conditions in the past, including metrics like mean global temperature and the latitudinal temperature gradient. While climate models provide scenarios of what the world could look like in the future, paleoclimate studies (study of past climates) provide insight into what the world did look like in the past. Seeing how well the models we use to predict the future can simulate the past tells us how confident we can be in their results. It is therefore of utmost importance to have thorough, well-sampled data from the ancient past.

"By understanding how latitudinal temperature gradients have changed over the course of Earth's history and under a variety of different climate regimes, we can start to better anticipate what will happen in the future," says Judd.

To determine ancient temperatures, geologists study proxies, which are chemical or biological traces that record temperatures from sedimentary deposits preserved on the sea floor or continents. Due to the recycling of ancient seafloor into the Earth's mantle, there is an 'expiration date' on the availability of seafloor data. Most ancient temperature proxies therefore come from sediments that accumulated on continental margins or in shallow inland seas where records can persist for much longer.

Judd, Bhattacharya and Ivany use temperature data from modern oceans to reveal consistent, predictable patterns where the ocean surface is warmer or cooler, or more or less seasonal, than otherwise expected at that latitude.

"The biggest offsets happen to be in the two settings that are most represented in the geologic past," says Ivany. "Knowing how those regions are biased in comparison to the global mean allows researchers to better interpret the proxy data coming from the ancient Earth."

Data from shallow, semi-restricted seas (e.g., the Mediterranean and Baltic Seas) show that sea surface temperatures there are warmer than in the open ocean. A key finding of their paper is therefore that estimates of global mean temperature from the Paleozoic Era (~540-250 million years ago), a time when the majority of data come from shallow seas, are likely unrealistically hot.

Even in the more recent geologic past, the overwhelming majority of sea surface temperature estimates come from coastal settings, which they demonstrate are also systematically biased in comparison to open ocean temperatures.
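The offset the authors describe can be illustrated with invented numbers: if proxy sites cluster in warm, semi-enclosed shallow seas, a naive average of those sites overstates the true zonal mean. All values below are hypothetical, not from the study:

```python
# Hypothetical illustration of the sampling bias: sea surface temperatures
# (deg C) at imagined sites in a single mid-latitude band.
from statistics import mean

open_ocean = [16.0, 15.5, 16.5, 15.0, 16.0]   # well-mixed open-ocean sites
shallow_sea = [19.5, 20.0, 19.0]              # semi-restricted shallow-sea sites

# The "true" zonal mean averages over all environments at this latitude.
true_zonal_mean = mean(open_ocean + shallow_sea)

# If the geologic record preserves only the shallow-sea sites (as is common
# for deep-time proxies), the reconstructed mean carries a warm bias.
reconstructed = mean(shallow_sea)
warm_bias = reconstructed - true_zonal_mean
```

Correcting proxy compilations then amounts to estimating an offset like `warm_bias` for each depositional setting and latitude, which is the kind of framework the paper proposes.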

In order to have a more accurate record of average ocean temperature at a given latitude, Bhattacharya says researchers must account for the incomplete nature of paleotemperature data. "Our work highlights the need for the scientific community to focus sampling efforts on under-sampled environments," says Bhattacharya. "New sampling efforts are essential to make sure we are equally sampling unique environmental settings for different intervals of Earth's history."

According to Judd, the paleoclimate community has made major advances toward understanding ancient climates in the past few decades. New, faster, and cheaper analytical techniques, as well as a surge in expeditions that recover ocean sediment cores, have led to massive compilations of ancient sea surface temperature estimates. Despite these advancements, there are still significant disagreements between temperature estimates from different locations within the same time interval and/or between temperature estimates and climate model results.

"Our study provides a framework within which to reconcile these discrepancies," says Judd. "We highlight where, when and why temperature estimates from the same latitudes may differ from one another and compare different climate models' abilities to reconstruct these patterns. Our work therefore lays the groundwork to more holistically and robustly reconstruct global climate through Earth's history."

Credit: 
Syracuse University

Busting Up the Infection Cycle of Hepatitis B

image: Prof. Jodi A. Hadden-Perilla

Image: 
Photo courtesy of Jodi Hadden-Perilla

Researchers at the University of Delaware, using supercomputing resources and collaborating with scientists at Indiana University, have gained new understanding of the virus that causes hepatitis B and the "spiky ball" that encloses the virus's genetic blueprint.

The research, which has been published online ahead of print by the American Chemical Society journal ACS Chemical Biology, provides insights into how the capsid -- a protein shell that protects the blueprint and also drives the delivery of it to infect a host cell -- assembles itself.

Computer simulations performed by the UD scientists investigated the effects of a mutation that impairs the assembly process. Together with collaborators, the researchers revealed that the region of the protein that contains the mutation, the spike, can communicate with the region of the protein that links with other subunits to assemble the capsid. They found evidence that a change in the shape of the capsid protein switches it into an "on" state for assembly.

Scientists believe that the capsid is an important target in developing drugs to treat hepatitis B, a life-threatening and incurable infection that afflicts more than 250 million people worldwide.

"The capsid looks like a spiky ball, with 120 protein dimers that assemble to form it; each dimer contains a spike," said Jodi A. Hadden-Perilla, assistant professor in UD's Department of Chemistry and Biochemistry and a co-author of the new paper. "The capsid is key to the virus infection cycle. If we could disrupt the assembly process, the virus wouldn't be able to produce infectious copies of itself."

The Indiana University researchers had been studying the dimers, which are two-part, T-shaped molecular structures, and investigating whether a mutation could activate or deactivate a switch to turn on the capsid's assembly mechanism. They worked with Hadden-Perilla's group, which ran computer simulations to explain how changes in the protein structure induced by the mutation affected the capsid's ability to assemble.

"What we learned is that this mutation disrupts the structure of the spike at the top of the dimer," Hadden-Perilla said. "This mutation slows down assembly, which actually involves a region of the protein that is far away from the spike. It's clear that these two regions are connected. A change in the shape of the protein, particularly at the spike, may actually activate or deactivate assembly."

Her team did its work using the National Science Foundation-supported Blue Waters supercomputer at the University of Illinois at Urbana-Champaign, the largest supercomputer on any university campus in the world, to perform what are known as all-atom molecular dynamics simulations.

Molecular dynamics simulations allow researchers to study the way molecules move in order to learn how they carry out their functions in nature. Computer simulations are the only method that can reveal the motion of molecular systems down to the atomic level and are sometimes referred to as the "computational microscope."
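At its core, a molecular dynamics simulation repeatedly updates atomic positions and velocities from computed forces. A toy velocity-Verlet sketch for a single particle on a harmonic spring illustrates the idea; the all-atom capsid simulations themselves involve millions of atoms and specialized software, so this is a conceptual reduction only:

```python
# Velocity Verlet integration of x'' = -(k/m) x, the standard integrator
# family used in molecular dynamics, reduced to one particle on a spring.
import math

def velocity_verlet(x, v, k=1.0, m=1.0, dt=0.01, steps=1000):
    """Integrate the equations of motion; returns final position and velocity."""
    a = -(k / m) * x
    for _ in range(steps):
        x += v * dt + 0.5 * a * dt * dt    # position update
        a_new = -(k / m) * x               # force/acceleration at new position
        v += 0.5 * (a + a_new) * dt        # velocity update with averaged accel.
        a = a_new
    return x, v

# Starting from x=1, v=0, the particle oscillates; total energy stays nearly
# constant, which is why Verlet-type integrators are standard in MD codes.
x, v = velocity_verlet(1.0, 0.0)
energy = 0.5 * v * v + 0.5 * x * x   # kinetic + potential, initially 0.5
```

Energy conservation over long runs is the property that lets such simulations track molecular motions faithfully enough to act as a "computational microscope."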

The paper, titled "The integrity of the intradimer interface of the Hepatitis B Virus capsid protein dimer regulates capsid self-assembly," can be viewed on the journal's website.

From Colombia to UD
For doctoral student Carolina Pérez Segura, a co-author of the paper, working with data from the supercomputer simulations was the kind of research experience that first brought her to the University of Delaware and then inspired her to stay.

She examined numerous simulations and vast amounts of data to investigate the effect of the mutation and "made some important discoveries," Hadden-Perilla said. "We threw her into the deep end in my brand-new research group [last summer], and she did a great job."

Pérez Segura came to UD as a participant in the University's Latin American Summer Research Program. A graduate of the Universidad Nacional de Colombia (National University of Colombia), she had never before left Colombia; in fact, the trip marked her first time traveling by plane. She planned to conduct research under Hadden-Perilla's mentorship for a couple of months and then return home.

But, she said, the experience was so meaningful to her that she canceled her plane ticket home and stayed on to work as a visiting scholar with Hadden-Perilla while applying to UD's doctoral program in chemistry. She was accepted and began her studies during spring semester.

It was her fascination with computational chemistry that brought her to Delaware, she said, and the work with supercomputers that made her decide to continue that research.

"While I was an undergraduate, I chose that branch of chemistry as the kind of career I wanted," said Pérez Segura, who worked with a research group in the field, on a smaller scale, in Colombia. "When I was introduced to the idea that math and physics can help you understand biological processes, I knew that was what I wanted to do.

"I thought it was really amazing to be able to explain biological processes with numbers and computers. I wanted to learn more, and here, there's so much more opportunity to learn it."

Although the social and travel restrictions imposed by the coronavirus (COVID-19) pandemic have limited her ability to fully experience American life and culture, she said her experience at UD remains very positive. She's eager to be able to go out more, practice her English and feel a part of American culture, but meanwhile, she's busy with exciting research, she said.

She's currently also working on research that Hadden-Perilla is conducting into the virus that causes COVID-19.

"It's unusual for a student to be accepted into our graduate program 'off-cycle,' beginning in spring semester," Hadden-Perilla said. "But Carolina is exceptional."

Credit: 
University of Delaware

Breast cancer 'ecosystem' reveals possible new targets for treatment

image: Microscopy image of a new subtype of cells, discovered within triple negative breast cancer by Garvan researchers.

Image: 
Sunny Wu

Researchers from the Garvan Institute of Medical Research have uncovered four new subtypes of cells within triple negative breast cancer, which contain promising new therapeutic targets for the aggressive disease.

Using cellular genomics, the team revealed one of the new cell types produces molecules that suppress immune cells, which may help cancer cells evade the body's immune system. The discovery, published in EMBO Journal, could lead to a new class of therapies for triple negative breast cancer.

"Patients with triple negative breast cancers have a poor prognosis, in large part because treatment approaches have advanced very slowly," says senior author Associate Professor Alex Swarbrick, Head of the Tumour Progression Laboratory at Garvan. "We've analysed individual cells in patient tumour samples to gain unprecedented insights into what makes up a tumour, allowing us to identify subtypes of cells and investigate their role in disease."

Triple negative breast cancer: new targets for therapy

By definition, triple negative breast cancers lack three receptors (for oestrogen, progesterone and the HER2 protein) that are targeted with specialised therapies in other breast cancers. This leaves patients with limited treatment options and poor outcomes - a substantial proportion of patients diagnosed with the cancer die within five years of diagnosis.

"In our study, we searched for new potential targets for therapy by analysing the individual cells inside triple negative breast tumours. This includes not only the cancer cells themselves, but also the surrounding host cells, such as immune and connective tissue cells, which can be thought of as the cancer 'ecosystem' that supports a tumour to grow and spread," explains Sunny Wu, first author of the study.

The researchers used next-generation sequencing of 24,271 individual cells extracted from biopsy samples of five triple negative breast cancer patients, and detected over 6000 unique RNA molecules in every cell, creating a snapshot of each cell's gene activity.

By analysing the profiles of active genes, the researchers revealed four cell subtypes of stromal cells, which form the connective tissues in the body. Previous studies in triple negative breast cancers had generally considered there to be only one type of stromal cell.
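Conceptually, subtypes emerge because each cell's profile of active genes is a vector that can be grouped by similarity. A hypothetical nearest-centroid sketch makes the idea concrete; the gene activity values and the reference profiles below are invented for illustration, not the clustering pipeline used in the study:

```python
# Assign cells (gene -> activity maps) to the closest reference profile.

def nearest_centroid(cell, centroids):
    """Return the name of the reference profile nearest to this cell."""
    def dist(a, b):
        genes = set(a) | set(b)
        return sum((a.get(g, 0.0) - b.get(g, 0.0)) ** 2 for g in genes)
    return min(centroids, key=lambda name: dist(cell, centroids[name]))

# Invented reference profiles for two stromal subtypes; CXCL12 is the
# T-cell-suppressing chemokine the article associates with iCAFs.
centroids = {
    "iCAF": {"CXCL12": 5.0, "ACTA2": 0.5},
    "myCAF": {"CXCL12": 0.5, "ACTA2": 5.0},
}

cell = {"CXCL12": 4.2, "ACTA2": 0.8}   # high CXCL12: inflammatory-like profile
label = nearest_centroid(cell, centroids)
```

Real single-cell analyses work the same way in spirit, but cluster thousands of cells over thousands of genes with dimensionality reduction rather than two hand-picked markers.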

Cancer's 'ecosystem' suppresses the immune system

The researchers investigated further to understand what role these four cell subtypes might play in triple negative breast cancers. They revealed surprising interactions between the signalling molecules produced by the stromal cells and immune cells.

"Our findings suggest that there is significant crosstalk between the immune system and stromal cells, which were generally thought to have only a structural role in cancers," says Associate Professor Swarbrick.

For instance, the team found that one of the cell subtypes they discovered in the breast cancers, so-called inflammatory cancer-associated fibroblasts or 'iCAFs', released the chemokine CXCL12, a signalling molecule known to suppress the anti-tumour activity of T cells.

"This is significant because immunotherapy - which is designed to activate the patient's immune system against a tumour - has limited response in many patients with triple negative breast cancer," Associate Professor Swarbrick adds. "If iCAFs are suppressing T cells in triple negative breast cancer, and we can remove this interaction, T cells will be more susceptible to activation and more likely to attack the cancer."

New directions for cancer treatment

The researchers say that combining immunotherapy with a treatment that stops the interaction between stromal cells and immune cells holds promise for improving the treatment of triple negative breast cancer.

The team will now analyse further breast cancer samples to gain an even better understanding of the cells that make up triple negative breast cancers, how they interact, and how those interactions might be disrupted to stop disease progression.

"Pathologists have been describing cancers under the microscope for more than 150 years, but we still only have a shallow understanding of the cells that are there," says Associate Professor Swarbrick. "Cellular genomics is showing us that what we once thought of as one cell type is in reality a diversity of cell types, which will have a significant impact on how we tailor treatments in future."

The data generated in this study form part of the Breast Cancer Cell Atlas, an ambitious initiative, undertaken in collaboration with the Garvan-Weizmann Centre for Cellular Genomics, to sequence more than a million individual cells from patient breast tumours and create the most comprehensive view of breast cancer cells yet.

Credit: 
Garvan Institute of Medical Research