Bacteria killed by new light-activated coating

To stop the spread of disease, the new coating could be used on phone screens and keyboards, as well as the insides of catheters and breathing tubes, which are a major source of healthcare-associated infections (HCAIs).

The most well known HCAIs are caused by Clostridioides difficile (C. difficile), methicillin-resistant Staphylococcus aureus (MRSA) and Escherichia coli (E. coli). They commonly occur during in-patient medical or surgical treatment, or from visiting a healthcare setting, and they pose a serious health threat, making them a key priority for the NHS to address.

The research, published today in Nature Communications, is the first to show a light-activated antimicrobial coating successfully killing bacteria under low-intensity ambient light (300 lux), such as that found in wards and waiting rooms. Previously, similar coatings needed intense light (3,000 lux), like that found in operating theatres, to activate their killing properties.

The new bactericidal coating is made of tiny clusters of chemically modified gold embedded in a polymer with crystal violet - a dye with antibacterial and antifungal properties.

First author, Dr Gi Byoung Hwang (UCL Chemistry), said: "Dyes such as crystal violet are promising candidates for killing bacteria and keeping surfaces sterile as they are widely used to disinfect wounds. When exposed to bright light, they create reactive oxygen species, which in turn kill bacteria by damaging their protective membranes and DNA. This is amplified when they are paired with metals such as silver, gold and zinc oxide."

"Other coatings have effectively killed bacteria but only after exposure to UV light, which is dangerous to humans, or very intense light sources, which aren't very practical. We are surprised to see just how effective our coating is in killing both S. aureus and E. coli in ambient light, making it promising for use in a variety of healthcare environments," added Professor Ivan Parkin (UCL Chemistry), senior author and Dean of UCL Mathematical & Physical Sciences.

The team of chemists, chemical engineers and microbiologists created the bactericidal coating using a scalable method and tested how well it killed S. aureus and E. coli against control coatings and under different lighting conditions.

Sample surfaces were treated with either the bactericidal coating or a control coating before being inoculated with 100,000 colony forming units (CFU) per ml of either S. aureus or E. coli. The growth of the bacteria was investigated in the dark and under white light of between 200 and 429 lux.

They found that in ambient light, a control coating of crystal violet in a polymer alone did not kill either bacterium. However, in the same lighting conditions, the bactericidal coating led to a 3.3 log reduction in the growth of S. aureus after six hours and a 2.8 log reduction in the growth of E. coli after 24 hours.
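For readers unfamiliar with log-reduction figures, the arithmetic can be sketched in a few lines of Python. The numbers below simply re-derive what a 3.3 log reduction means for the 100,000 CFU/ml inoculum used in the study; the final CFU count is computed from the reported reduction, not taken from the paper.

```python
import math

def log_reduction(initial_cfu, final_cfu):
    """Log10 reduction in viable bacteria: 1 log = 90% killed, 3 log = 99.9%."""
    return math.log10(initial_cfu / final_cfu)

initial = 100_000                   # CFU/ml inoculum, as in the study
final = initial / 10**3.3           # ~50 CFU/ml surviving after a 3.3 log reduction

print(round(log_reduction(initial, final), 1))   # 3.3
print(f"{1 - final / initial:.4%} killed")        # 99.9499% killed
```

A 2.8 log reduction works the same way: roughly 99.84% of the inoculated bacteria are no longer viable.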

"E. coli was more resistant to the bactericidal coating than S. aureus as it took longer to achieve a significant reduction in the number of viable bacteria on the surface. This is presumably because E. coli has a cell wall with a double membrane structure whereas S. aureus only has a single membrane barrier," explained study co-author Dr Elaine Allan (UCL Eastman Dental Institute).

The team unexpectedly discovered that the coating kills bacteria by producing hydrogen peroxide - a relatively mild reagent used in contact lens cleaning solutions. It works by chemically attacking the cell membrane, and therefore takes longer to work on bacteria with more layers of protection.

"The gold clusters in our coating are key to generating the hydrogen peroxide, through the action of light and humidity. Given the clusters contain only 25 atoms of gold, very little of this precious metal is required compared to similar coatings, making our coating attractive for wider use," commented senior author Professor Asterios Gavriilidis (UCL Chemical Engineering).

Credit: 
University College London

Colorectal cancer burden shifting to younger individuals

March 5, 2020--ATLANTA-- The burden of colorectal cancer is swiftly shifting to younger individuals as incidence increases in young adults and declines in older age groups, according to the latest edition of Colorectal Cancer Statistics 2020, a publication of the American Cancer Society. A sign of the shift: the median age of diagnosis has dropped from age 72 in 2001-2002 to 66 during 2015-2016; in other words, half of all new diagnoses are now in people 66 or younger.

Colorectal cancer (CRC) is the third most commonly diagnosed cancer and the third leading cause of cancer death in both men and women in the United States. Rapid declines in CRC incidence occurred in people 50 and older during the 2000s, largely because of increased screening with colonoscopy, which can prevent cancer by removing premalignant polyps.

In recent years, however, declines in incidence have been confined to people 65 and older, among whom rates dropped by 3.3% per year from 2011 through 2016. Among those 50 to 64, declines in incidence of 2% to 3% per year during the 2000s have reversed in recent years, with rates increasing by 1.0% per year during 2011 through 2016. This is similar to the uptick occurring in people under 50, in whom incidence rates have been increasing since the mid-1990s and rose by 2.2% per year from 2011 to 2016. These increases likely reflect elevated disease risk in generations born since 1950 that is being carried forward over time as people age, a phenomenon referred to as a birth cohort effect.

Although rising incidence in those under 50 was previously driven by rectal tumors, in the most recent five years of data (2012-2016), incidence rates rose by 1.8% per year for tumors in the proximal and distal colon as well as in the rectum. Rising incidence in people younger than 65 is driven by trends in non-Hispanic whites, although rates in American Indians/Alaska Natives are also increasing steeply.

In 2020, about 18,000 cases of CRC (12% of the total) will be diagnosed in people under 50, the equivalent of 49 new cases per day. In addition, 3,640 CRC deaths (10 per day) are expected in 2020 in this age group, partly owing to delays in diagnosis; 1 in 4 patients (26%) younger than 50 is diagnosed with metastatic disease, compared to 19% of those 65 and older.

CRC death rates overall have been decreasing since the late 1940s in women, but only since 1980 in men, likely reflecting differences in incidence trends, which are unavailable prior to 1975. However, incidence and mortality trends have been very similar between the sexes over the past three decades.

Like incidence, CRC mortality patterns vary by age, with rapid decreases in the oldest age groups and increasing trends in young adults. Over the past 10 data years (2008-2017), death rates declined by 3% per year in people 65 years and older but by only 0.6% per year in people 50 to 64, while increasing by 1.3% per year in those under 50. The uptick in CRC death rates in adults under 50, which is most rapid among non-Hispanic whites (2% per year), began around 2004 and was preceded by declines of 1% to 2% per year since at least 1975. Rapid declines in overall CRC death rates of approximately 3% per year during the 2000s decelerated to 1.8% per year from 2012 to 2017, perhaps reflecting slower gains in screening uptake and lower rates of first-time testing, as well as rising trends in younger adults.

Wide differences in the prevalence of CRC risk factors, such as smoking and excess body weight, as well as access to high-quality health care, including screening, result in large disparities. The incidence rate during 2012-2016 ranged from a low of 30 per 100,000 among Asians/Pacific Islanders to 46 per 100,000 among blacks and 89 per 100,000 among Alaska Natives.

The pattern for mortality is the same, although the magnitude of the disparity is twice as large as that for incidence. For example, CRC incidence rates are about 20% higher in blacks (46 per 100,000) than in non-Hispanic whites (39 per 100,000) whereas death rates are almost 40% higher in blacks (19 versus 14 per 100,000). Even more striking are death rates among Alaska Natives (40 per 100,000), which are double those among blacks. American Indians/Alaska Natives are the only racial group in which overall CRC death rates are not declining.
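The "twice as large" comparison follows directly from the reported rates. A short sketch, using only the per-100,000 figures quoted above, shows how the relative disparities are computed:

```python
# Rates per 100,000, as reported for 2012-2016 (incidence) and recent mortality
incidence = {"black": 46, "nh_white": 39}
mortality = {"black": 19, "nh_white": 14, "alaska_native": 40}

def pct_higher(a, b):
    """Percent by which rate a exceeds rate b, rounded to the nearest whole percent."""
    return round(100 * (a / b - 1))

print(pct_higher(incidence["black"], incidence["nh_white"]))  # 18 ("about 20% higher")
print(pct_higher(mortality["black"], mortality["nh_white"]))  # 36 ("almost 40% higher")
print(mortality["alaska_native"] / mortality["black"])        # ~2.1 ("double")
```

The mortality gap (roughly 36%) is about twice the incidence gap (roughly 18%), which is the pattern the report highlights.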

Of all racial/ethnic groups, black patients are the most likely to be diagnosed with distant-stage CRC (25% vs 20% of non-Hispanic whites and Asians/Pacific Islanders) and also have the lowest overall 5-year survival rate (60% vs 66% among non-Hispanic whites and 68% among Asians/Pacific Islanders). These disparities are largely driven by socioeconomic inequalities that result in differences in access to early detection and the receipt of timely, high quality treatment.

Two-thirds (66%) of individuals 50 and older were current for colorectal cancer screening in 2018, ranging from 58% in Puerto Rico and 60% in Wyoming to 76% in Massachusetts. However, less than half of people ages 50-54 years were up-to-date, which is particularly concerning given increasing CRC incidence and mortality in this age group. Screening is also low among immigrants in the U.S. less than 10 years (26%), people who are uninsured (30%) or Medicaid-insured (53%), and Asian Americans (55%).

"Although overall colorectal cancer incidence and mortality continue to decline, this progress is increasingly confined to older age groups and is marred by vast disparities," said Rebecca Siegel, MPH, lead author of the report. "Unfortunately, tools that are very effective at reducing the burden of this disease are not being fully utilized. One in three people 50 and older is not up-to-date on screening; many of them have never been screened at all. We could save countless lives by increasing access to screening in rural and other low-income areas, especially in Alaska, and incentivizing primary care clinicians to ensure that all patients 45 and older are screened, as well as facilitating healthier lifestyles in our communities.

"More timely diagnosis among younger patients remains critical while we await answers to why CRC incidence is rising in young and middle-aged adults," said Ms. Siegel.

Credit: 
American Cancer Society

New insights into evolution: Why genes appear to move around

Scientists at Uppsala University have proposed an addition to the theory of evolution that can explain how and why genes move on chromosomes. The hypothesis, called the SNAP Hypothesis, is presented in the scientific journal PLOS Genetics.

Life originated on earth almost four billion years ago and diversified into a vast array of species. How has this diversification occurred? The Theory of Evolution together with the discovery of DNA and how it replicates provide an answer and a mechanism. Mutations in DNA occur from generation to generation and can be selected if they help individuals to adapt better to their environment. Over time, this has led to the separation of organisms into the different species that now inhabit all the different ecosystems of the planet.

Current theory (that evolution involves mistakes made when replicating a gene) explains how genes can mutate over time and acquire new meanings. However, a mystery in biology is that the relative locations of genes on chromosomes also changes over time. This is very obvious in bacteria, where different species often have the same genes in very different relative locations. Since the origin of life, genes have apparently been changing location. The questions are, how and why do genes move their relative locations?

Now, scientists at Uppsala University have proposed an addition to the theory of evolution that can explain how and why genes move on chromosomes. The hypothesis, called the SNAP Hypothesis, is based on the observation that tandem duplications of sections of chromosome occur very frequently in bacteria (more than a million times more frequently than most mutations). These duplications are lost spontaneously unless they are selected. Selection to maintain a duplication can occur whenever bacteria find themselves in a sub-optimal environment, where having two copies of a particular gene could increase fitness (for example, if the duplicated region includes a gene that increases growth rate on a poor nutrient).

Duplications typically contain hundreds of genes, even if only one is selected. The scientists Gerrit Brandis and Diarmaid Hughes argue that mutations can quickly accumulate in the hundreds of non-selected genes, including genes that are normally essential when there is only a single copy in the chromosome. Once two different essential genes are inactivated, one in each copy of the duplication, the duplication can no longer be lost. From this point on, the bacteria will have many genes unnecessarily duplicated, and mutations to inactivate or delete them will be positively selected because they increase fitness.

Over time, all of the unnecessary duplicated genes may be lost by mutation, but this will happen randomly in each copy of the duplication. By this process of random loss of unnecessary duplicated genes in each copy of the duplication, the relative order of the remaining genes can be completely changed. The SNAP process can rearrange gene order very rapidly and it may contribute to separating different species.
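The random-loss step at the heart of the SNAP Hypothesis can be illustrated with a toy simulation (an illustrative sketch, not code from the paper): duplicate a block of genes in tandem, then delete each redundant gene from one randomly chosen copy, and the survivors can end up in a new relative order.

```python
import random

def snap_rearrangement(genes, seed=None):
    """Toy model of the SNAP process: tandem-duplicate a gene block, then
    randomly lose each redundant gene from one copy or the other."""
    rng = random.Random(seed)
    duplicated = genes + genes          # tandem duplication of the whole block
    n = len(genes)
    keep = [True] * len(duplicated)
    for i in range(n):
        # each gene survives in exactly one of the two copies, chosen at random
        lost_copy = rng.choice([i, i + n])
        keep[lost_copy] = False
    return [g for g, k in zip(duplicated, keep) if k]

genes = ["A", "B", "C", "D", "E"]
print(snap_rearrangement(genes, seed=1))  # same five genes, possibly reordered
```

Every run keeps exactly one copy of each gene, yet the relative order of the survivors depends on which copy each gene happened to survive in, which is how the process scrambles gene order without inventing any new genes.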

Credit: 
Uppsala University

New model improves management of wetland, floodplain and river habitats

image: USU researchers developed a model that can help improve wetland and river habitats.

Image: 
USU

Wetlands, floodplains and aquatic habitats are some of Utah's most important ecosystems. They are home to many bird, plant and fish species, and they provide unique outdoor recreation opportunities.

But in recent years these habitats have faced mounting pressure from encroaching land use and increased demand for water. Now researchers at Utah State University are developing new tools that help preserve and increase the area and quality of wetland, floodplain and aquatic habitats.

USU Associate Professor David Rosenberg and Ayman Alafifi, a water resources engineer at Brown and Caldwell, have developed an innovative computer model that helps water and wetland managers make better, data-based decisions. Their work was recently published in the journal Environmental Modelling & Software. The study is part of a multi-year research effort focused on creating more effective habitat management tools.

The Watershed Area of Suitable Habitat systems model, known as WASH, takes hydrologic, topographic, ecological and management data and helps managers identify when, where and how to allocate water, financial resources and vegetation management to improve the area and quality of the three habitat types. Recommendations are subject to constraints, including water availability, vegetation growth, infrastructure and existing demands.

"It's important to manage water and vegetation together in these areas," said Rosenberg. "Managing water and vegetation together helps us identify synergies and trade-offs across the three habitat types."

Rosenberg and Alafifi recently tested the model along the lower Bear River in Northern Utah, the largest source of water to the Great Salt Lake. The results show potential to increase aquatic habitat area during all months of the year, an increase in floodplain habitat during spring and summer, and an increase in wetland habitat during critical summer months when water is most scarce.

"These habitats and their well-being affect many residents in Utah because Utahns like to fish, hunt, bird and recreate," said Rosenberg. "People value these outdoor areas and want to see them improved."

Rosenberg and Alafifi worked in collaboration with a group of stakeholders, including the Nature Conservancy, Trout Unlimited, PacifiCorp, Bear River Land Conservancy and Cache County. These stakeholders gave researchers information about the habitats, shared water and vegetation management on the river and provided feedback on model results.

Credit: 
Utah State University

Oncotarget: Inducible knock-out of BCL6 in lymphoma cells results in tumor stasis

image: BCL6 knock-out in a DLBCL xenograft induces tumor stasis. Tumor xenografts were established in C.B-17 SCID mice by subcutaneous injection of inducible SU-DHL-4 Cas9 BCL6 and control sgRNA cells. Mice were randomized to receive drinking water with DOX (2 mg/kg) plus 5% sucrose (DOX on) or 5% sucrose only (DOX off). (A) After 5 days DOX treatment tumors from four mice were harvested and analyzed for Cas9 GFP induction using flow cytometry. Cas9-GFP-induced cells are indicated in green, non-induced cells in red. (B-E) Tumor-bearing mice were treated with DOX for 8 days after which tumors from control and BCL6 knock-out tumors were harvested 17/20 days after start of DOX treatment, respectively. Tumor volumes from (B) BCL6 sgRNA tumors (n = 10 DOX off, n = 7 DOX on) and (C) control (n = 10 DOX off, n = 8 DOX on) were measured. *p

Image: 
Manfred Koegl -- manfred.koegl@boehringer-ingelheim.com

The cover for issue 9 of Oncotarget features Figure 6, "BCL6 knock-out in a DLBCL xenograft induces tumor stasis," by Schlager, et al.

Read more: An oncogenic role for BCL6 in the initiation of DLBCL has been shown, as constitutive expression of BCL6 in mice recapitulates the pathogenesis of human DLBCL. Conditional BCL6 deletion in established DLBCL tumors in vivo induced significant tumor growth inhibition, with initial tumor stasis followed by slow tumor growth kinetics.

Dr. Manfred Koegl from Boehringer Ingelheim RCV GmbH & Co KG, Vienna, Austria said, "DLBCL is an aggressive and genetically diverse B-cell neoplasm in adults resulting in a biologically and clinically heterogeneous disease."

Such genetic alterations include translocations that fuse its coding sequence to heterologous promoters, point mutations in BCL6 promoter negative regulatory elements or mutations that affect BCL6 transcription, acetylation-mediated BCL6 inactivation or BCL6 degradation.

Constitutive BCL6 expression within GC B-cells leads to the development of DLBCL in mice that mimics the disease observed in patients, suggesting that BCL6 is sufficient to initiate cancer.

A variety of BCL6 inhibitors have been previously reported, several of which have demonstrated that the BTB domain of BCL6 is amenable to targeting with peptide and small molecule inhibitors as well as PROTACs.

Importantly, we found that the anti-proliferative activity of BCL6 degraders such as BI-3802 on tissue culture cells is generally higher than that of BCL6 inhibitors despite their equipotent BCL6 binding affinities.

Addressing this question, we report on the establishment of an inducible BCL6 knock-out DLBCL model, which allows studying the phenotype of BCL6 loss in DLBCL xenografts in vivo.

The Koegl research team concluded in their Oncotarget Research Paper, "our findings have important implications for understanding the impact of BCL6-targeted therapies in DLBCL. According to our studies, it is reasonable to predict that the treatment of DLBCL with BCL6 degraders results in significant tumor growth inhibition and at least tumor stasis. The observed magnitude of effects of BCL6 blockade in monotherapy might provide a rationale for therapeutic combinations with other targeted and/or chemotherapeutic agents. Our CRISPR/Cas9 BCL6 knock-out model represents a valuable pre-clinical tool to evaluate such combination approaches."

Credit: 
Impact Journals LLC

Travel history should become routine in medical assessments to slow pandemics' spread

DALLAS - March 4, 2020 - Integrating travel history information into routine medical assessments could help stem the rapidly widening COVID-19 epidemic, as well as future pandemics, infectious disease specialists recommend in the Annals of Internal Medicine.

Trish Perl, M.D., M.Sc., Chief of Infectious Diseases and Geographic Medicine at UT Southwestern Medical Center, and Connie Savor Price, M.D., of the University of Colorado School of Medicine, say it's time to add travel history to routine information such as temperature and blood pressure collected in electronic medical records.

"We have the infrastructure to do this easily with the electronic medical record, we just need to implement it in a way to make it useful to the care teams," says Perl, who studies outbreaks and pandemics. "Once the infrastructure is built, we'll also need to communicate what is called 'situational awareness' to ensure that providers know what geographic areas have infections so that they can act accordingly."

A simple, targeted travel history can help put infectious symptoms in context for physicians and caregiver teams, and, if deemed appropriate, trigger more detailed history, further testing, and rapid implementation of protective measures for others in affected households, co-workers or other daily contacts, and health care personnel. Shared electronic health records also can integrate travel history with computerized decision-making support to suggest specific diagnoses in recent travelers, the authors note, in much the same way as trained medical teams routinely ask about tobacco exposure to ascertain levels of cancer and heart disease risk.

The emergence of novel respiratory diseases in the past two decades - including Severe Acute Respiratory Syndrome (SARS) in 2002-2003, Middle East Respiratory Syndrome (MERS) in 2012-2013, Western Africa-based Ebola in 2014, and now COVID-19 from China - demonstrates the need for change. With each wave, "the urgent threat of communicable diseases comes with significant morbidity and mortality, tremendous health care disruptions and resource utilization, and collateral economic and societal costs," Perl and Price write.

"MERS and SARS were associated with very specific travel. MERS was associated with travel to the Arabian Peninsula, and SARS was associated with travel primarily to Hong Kong, Singapore, and Beijing," Perl says. "Currently COVID is similar in that there are geographic clusters, but those lines may be blurring as the outbreak expands. The challenges and potential stress on the public health infrastructure, including the hospitals which are part of this, will be notable in that we could see large numbers of patients. Our role will not only be to care for these patients but to communicate to them the strategies that they can use to protect themselves."

The Annals commentary suggests that a simple script could be strategically and carefully developed to elicit clues for emerging infectious diseases and information about current emerging pathogen threats. The information could be collected along with the four gold standard vital signs - temperature, heart rate, respiratory rate, and blood pressure - currently used to help U.S.-based medical teams assess patients' health status, triage to appropriate care, determine potential diagnoses, and predict recovery.
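The kind of decision support the commentary envisions could be as simple as a rule that compares recent travel against a public-health watch list. The sketch below is purely hypothetical: the region names, 14-day window, and function are illustrative assumptions, not anything proposed in the Annals piece or implemented in any EHR.

```python
from datetime import date, timedelta

# Hypothetical watch list; in practice this would come from public-health feeds
AFFECTED_REGIONS = {"Hubei", "Lombardy", "Daegu"}
INCUBATION_WINDOW = timedelta(days=14)  # illustrative screening window

def travel_flag(regions_visited, return_date, today=None):
    """Flag a patient whose recent travel overlaps current outbreak regions."""
    today = today or date.today()
    recent = (today - return_date) <= INCUBATION_WINDOW
    return recent and bool(set(regions_visited) & AFFECTED_REGIONS)

print(travel_flag(["Hubei"], date(2020, 3, 1), today=date(2020, 3, 5)))  # True
print(travel_flag(["Paris"], date(2020, 3, 1), today=date(2020, 3, 5)))  # False
```

A positive flag would not diagnose anything; as the authors describe, it would simply trigger a more detailed history, testing, and protective measures.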

"The current outbreak is an opportune time to consider adding travel history to the routine. The COVID outbreak is clearly moving at a tremendous pace, with new clusters appearing daily," says Perl, who holds the Jay P. Sanford Professorship in Infectious Diseases at UTSW. "This pace is a signal to us that it is a matter of time before we will see more of these infections in the U.S. What is different with this outbreak is that this virus is more fit and transmissible and hence there has been much more transmission."

Credit: 
UT Southwestern Medical Center

NASA tracks ex-Tropical Cyclone Esther over Northern Territory

image: On Mar. 4, 2020, the MODIS instrument that flies aboard NASA's Aqua satellite provided a visible image of Esther's remnant clouds centered over the Barkly area of the Northern Territory.

Image: 
NASA Worldview

NASA's Aqua satellite continues to provide forecasters with visible imagery of ex-tropical cyclone Esther's remnant clouds and storms, now over the Barkly Region of Australia's Northern Territory.

On March 4, the Moderate Resolution Imaging Spectroradiometer or MODIS instrument that flies aboard NASA's Aqua satellite provided a visible image of Esther's remnant clouds that showed the center over the Barkly region of the Northern Territory.

The Barkly Region is located in the west central part of the territory and the region's main town is Tennant Creek. The region covers an area of 124,600 square miles (322,713 square km). Esther's clouds stretched into Queensland.

On March 4, the Australian Bureau of Meteorology (ABM) issued a Flood Watch for the Tanami Desert, Central Desert, MacDonnell Ranges, Barkly, Georgina River and Simpson Desert.

At 10:38 a.m. ACST on Mar. 4 (8:08 p.m. EST on Mar. 3), ABM cautioned that widespread disruption to roads was expected to continue in central parts of the territory for the next few days.

Ex-tropical cyclone Esther is expected to move across the southern Barkly District and into southwestern Queensland later today, Mar. 4.

ABM noted, "During the past 48 hours widespread rainfall totals of 50 to 150 mm (2 to 6 inches) have been recorded in many parts of the southern Northern Territory with some totals even higher, mainly in the Tanami Desert and Barkly. Rainfall is expected to contract east during today with 20 - 50 mm (0.8 to 2 inch) daily totals expected into tomorrow in the upper Georgina River and Simpson Desert."

Many areas have experienced localized flooding and areas of inundation during the past 48 hours.  ABM noted, "Many roads including major transport routes in the flood watch area are expected to continue to be affected and become or remain impassable with some communities and homesteads becoming isolated."

Credit: 
NASA/Goddard Space Flight Center

Exciting apparatus helps atoms see the light

image: The scientists used a device called a magneto-optical trap (MOT) to capture and cool Rubidium atoms, which were then excited to a Rydberg state.

Image: 
OIST

Researchers in the Light-Matter Interactions for Quantum Technologies Unit at the Okinawa Institute of Science and Technology Graduate University (OIST) have generated Rydberg atoms - unusually large excited atoms - near nanometer-thin optical fibers. Their findings, published recently in Physical Review Research, mark progress toward a new platform for quantum information processing, which has the potential to revolutionize material and drug discoveries and provide more secure quantum communication.

Due to their extraordinary susceptibility to electric and magnetic fields, Rydberg atoms have long piqued physicists' interests. Used in conjunction with optical nanofibers, these hyper-sensitive atoms could play an instrumental role in new types of scalable quantum devices. However, Rydberg atoms are notably difficult to control.

"The main aim of the study was to bring Rydberg atoms into proximity with the nanofibers," said Krishnapriya Subramonian Rajasree, a PhD student at OIST and the first author of the study. "This set-up creates a new system for studying interactions between Rydberg atoms and nanofiber surfaces."

Unusual atoms

To carry out their research, the scientists used a device called a magneto-optical trap to capture a cluster of rubidium (Rb) atoms. They reduced the temperature of the atoms to approximately 120 microkelvin, fractions of a degree above absolute zero, and ran a nanofiber through the atom cloud.

Then, the scientists excited the Rb atoms to a more energetic Rydberg state, using a 482 nm beam of light traveling through the nanofiber. These Rydberg atoms, which formed around the nanofiber surface, are greater in size than their ordinary counterparts. When the atoms' electrons gained energy, they moved further from the atomic nucleus, creating larger atoms. This unusual size heightens the atoms' sensitivity to their environment and to the presence of other Rydberg atoms.

Through their experiment, the scientists brought the Rydberg atoms within mere nanometers of the optical nanofiber, enabling increased interaction between the atoms and light travelling in the nanofiber. Due to their abnormal properties, the Rydberg atoms escaped the magneto-optical trap. The scientists were able to understand aspects of Rydberg atom behavior by examining how the loss of atoms depended on the power and wavelength of the light.

The ability to use light travelling in an optical nanofiber to excite and then control Rydberg atoms may help pave the way toward methods of quantum communication, while also heralding incremental progress toward quantum computing, the scientists said.

"Understanding interactions between light and Rydberg atoms is crucial," said Dr. Jesse Everett, a post-doctoral scholar at OIST and a co-author of the study. "Harnessing these atoms could enable the secure routing of communication signals using very small amounts of light."

Moving forward, the researchers hope to further study properties of the Rydberg atoms in conjunction with optical nanofibers. In future studies, they intend to look at Rydberg atoms that are even bigger in size, to explore the possibilities and limits of this system.

Credit: 
Okinawa Institute of Science and Technology (OIST) Graduate University

Birds of a feather better not together

image: Geographic variation in bird biodiversity and stability over two decades. Figure by Catano et al from paper available at https://royalsocietypublishing.org/doi/10.1098/rspb.2019.2520

Image: 
Courtesy Proceedings of the Royal Society B

Diversity plays a key role in maintaining the stability of plant and animal life in an area. But it's difficult to scale up smaller experiments to understand how changes will impact larger ecosystems.

A new study of North American birds from Washington University in St. Louis finds that the regional stability of ecosystems over time depends on both the total number of species present in a locality and on the variation in species identities among localities.

The results have implications for maintaining a diverse portfolio of local species in the face of major environmental threats -- like climate change, biological invasions, intensifying land use and other human and natural disturbances. "Homogenization" may threaten ecosystems at larger geographic scales, the research suggests. The study is published March 4 in the Proceedings of the Royal Society B.

"Species diversity is changing in more complicated ways than just going up or down, or the total number of species," said lead author Christopher P. Catano, a recent PhD graduate of Washington University and current postdoctoral research associate at Michigan State University. "One of the most critical and conspicuous ways that diversity is changing is by changing the distribution of species across space."

Recent high-profile studies have tried to show how and why biodiversity is changing at regional to global scales. Scientists have sounded alarms about net losses of insect, fish, bird and plant species. Many fear that such losses may alter the functioning of ecosystems and upend their ability to provide critical goods and services that humans rely on.

"The study provides some of the first evidence to suggest that local biodiversity loss and biotic homogenization will impact the functioning and stability of ecosystems at macroscales," said Jonathan Myers, associate professor of biology in Arts & Sciences.

A diverse portfolio -- of birds

The greater the number of species in an ecosystem, the more that ecosystem tends to be stable. The general reason why this happens is something called an insurance effect.

In concept, it's similar to why investors might want to have a diverse set of stocks in their portfolio. If one stock performs poorly, there's a chance that it will be buffered by others that perform better than expected.

"Over time, these different stocks -- or these different species, in the context of an ecosystem -- can compensate for one another," Catano said. "The fact that species may respond differently to the same environmental change is what gives that insurance effect."

This idea is widely accepted by biologists. The basic mechanism has been confirmed by lots of studies over time, and it underlies many arguments to promote or conserve biodiversity.

But it's not without its limitations. For example, much of the supporting research was conducted at a relatively small scale. One notable experiment was completed on plots of land that measured only about half as long as a bowling alley, each neatly planted with varying numbers and species of seeds.

Real life is not so clean. The ground is not perfectly flat, rivers cut through breeding areas, and animals and seeds travel across large areas. Ecosystem management often occurs within and across large tracts of land.

"At larger scales, it's not just the number of species that's potentially varying, it's also the identity or the composition of those species across space," Catano said.

"So there are two components to regional stability," Catano said. "One is how stable your average local community is. And the second is how differently those local communities respond through time -- relative to each other."

Observations at a larger geographic scale

Catano and Myers decided to try to test the relative importance of these two factors -- the number of species, and the amount of site-to-site variation in species composition -- in determining the stability of ecosystems at a larger scale.

The researchers used 20 years of observational data from the North American Breeding Bird Survey, a joint effort of the U.S. Geological Survey and Environment Canada. They focused on 342 species of songbirds in 1,675 breeding bird "communities" -- the census-block-like geographic units routinely sampled in the Breeding Bird Survey -- distributed across 35 large bird-conservation regions.

The researchers used the production of total bird biomass over time as their measure of ecosystem stability.

They found that species count does matter for stability -- but site-to-site variation in species composition matters three times as much at the larger geographic scale of bird conservation regions.

And what happens at a larger scale with birds is likely to affect humans, too. "Birds are major consumers of insect pests that limit production of plants and therefore ecosystems. Also, a lot of plants are dependent on seed dispersal by birds," Catano said. "There's a lot of critical services that are mediated by birds."

A landscape that supports variation

The results have implications for conservation, the researchers said.

Land use managers have a key role to play in promoting variation across an area, the researchers said, by taking advantage of management techniques that introduce or maintain environmental heterogeneity.

"An example might be through grazing," Catano said. "Animals graze somewhat patchily, and those patches create meaningful variation in the composition of other species. Fire management, or controlled burns, is another measure that, when done appropriately, can increase the resource heterogeneity in an ecosystem.

"These are things that land managers and conservation practitioners are already doing," he added. "This just cues them into this other dimension of biodiversity that's often overlooked when they assess the success of their restoration or management effort.

"Changes that lead to something like biotic homogenization could be destabilizing for ecosystems, even if it doesn't lead to the loss of species."

Credit: 
Washington University in St. Louis

Expanding the plasmonic painter's palette

image: An image of two colorful parrots was created by mixing red, blue and green structural colors.

Image: 
Adapted from ACS Nano 2020, DOI: 10.1021/acsnano.9b07523

By blending paints in their palette, artists can create a broad spectrum of colors with subtly different hues. However, scientists who wish to create a similar range of structural colors, like those found on butterfly wings, are much more limited. Now, researchers reporting in ACS Nano have developed a new method for mixing plasmonic red, blue and green to yield a virtually unlimited number of colors that could be used for new types of displays. 

Unlike pigments, structural colors get their hues by reflecting light from microscopic textures. Scientists can create some of these colors by putting metal nanoparticles onto surfaces in various patterns. These "plasmonically induced" colors are less susceptible to fading than pigments, and they might be useful for new types of paint, electronic displays and anti-counterfeiting measures. But producing a gamut of structural colors with smooth transitions between hues and tones has been challenging. Therefore, Dimos Poulikakos, Hadi Eghlidi and colleagues wanted to develop a new plasmonic color-mixing approach that would allow countless color variations.

The researchers began with a palette of three primary colors (red, green and blue). They made pixels of each color by arranging silver nanorods in lattice patterns on glass surfaces. The lengths and widths of the nanorods, and the distances between nanorods in the horizontal direction, determined whether the pixel was red, green or blue. The researchers adjusted the brightness of each color by varying the vertical distance between nanorods in the lattice. When the team interwove three of the primary color lattices in a single pixel and varied the vertical distances to adjust brightness, they could generate 2,456 unique colors with a pixel size of 4.26 × 4.26 μm. The researchers demonstrated the method by reproducing an image of two colorful parrots and a black-and-white photograph of Marie Curie.
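The additive mixing at the heart of this scheme can be illustrated with a short sketch. Here each interwoven primary lattice is assumed to contribute its reflected color scaled by a brightness weight between 0 and 1 (in the paper, that weight is set physically by the vertical nanorod spacing; the weights and RGB values below are purely illustrative):

```python
# Nominal reflected colors of the three primary nanorod lattices
# (illustrative sRGB values, not measured spectra).
PRIMARIES = {
    "red":   (255, 0, 0),
    "green": (0, 255, 0),
    "blue":  (0, 0, 255),
}

def mix(weights):
    """Additively combine the three primaries, each scaled by its
    brightness weight, clipping to the displayable 0-255 range."""
    r = sum(w * PRIMARIES[name][0] for name, w in weights.items())
    g = sum(w * PRIMARIES[name][1] for name, w in weights.items())
    b = sum(w * PRIMARIES[name][2] for name, w in weights.items())
    return tuple(min(255, round(c)) for c in (r, g, b))

# Equal parts red and green give yellow...
print(mix({"red": 1.0, "green": 1.0, "blue": 0.0}))  # (255, 255, 0)
# ...and fine-grained brightness steps open up in-between hues.
print(mix({"red": 1.0, "green": 0.8, "blue": 0.6}))
```

Because each primary's brightness can be stepped through many levels independently, the number of distinguishable mixtures grows combinatorially, which is how a three-color palette yields thousands of hues.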

Credit: 
American Chemical Society

Fighting hand tremors: First comes AI, then robots

BROOKLYN, New York, Wednesday, March 4, 2020 - Robots hold promise for the many people whose quality of life is severely affected by neurological movement disorders. Now researchers have tapped artificial intelligence techniques to build an algorithmic model that will make the robots more accurate, faster, and safer when battling hand tremors.

Their model, which is ready for others to deploy, appears this month in Scientific Reports, an online journal of Nature. The international team reports the most robust techniques to date to characterize pathological hand tremors symptomatic of the common and debilitating motor problems affecting a large number of aging adults. One million people throughout the world have been diagnosed with Parkinson's disease, just one of the neurodegenerative diseases that can cause hand tremors.

While technology such as sophisticated wearable exoskeleton suits and neurorehabilitative robots could help people offset some involuntary movements, these robotic assistants need to precisely predict involuntary movements in real time - a lag of merely 10 or 20 milliseconds can thwart effective compensation by the machine and in some cases may even jeopardize safety.

Enter the big dataset collected at the London (Ontario) Movement Disorders Centre and the team's pioneering machine learning model, which they named PHTNet, for "Pathological Hand Tremors using Recurrent Neural Networks". Using small sensors, they analyzed the hand motions of 81 patients in their 60s and 70s, then applied a novel data-driven deep neural network modeling technique to extract predictive information applicable to all patients.
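The prediction task such robots must solve can be sketched in miniature. The toy model below is a plain linear autoregressive predictor on a synthetic tremor-like signal, not the authors' PHTNet recurrent network or their patient data; it only illustrates what "predicting the hand's motion 20 milliseconds ahead" means in practice:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "tremor": a 4 Hz oscillation sampled at 100 Hz plus sensor noise.
fs, seconds = 100, 10
t = np.arange(fs * seconds) / fs
signal = np.sin(2 * np.pi * 4 * t) + 0.05 * rng.normal(size=t.size)

# Predict 2 samples (20 ms) ahead from the previous 20 samples.
window, horizon = 20, 2
X = np.array([signal[i:i + window]
              for i in range(len(signal) - window - horizon)])
y = signal[window + horizon:]

# Fit the linear predictor by least squares on the first 8 seconds,
# then evaluate on the held-out final 2 seconds.
split = 8 * fs
coef, *_ = np.linalg.lstsq(X[:split], y[:split], rcond=None)

pred = X[split:] @ coef
rmse = np.sqrt(np.mean((pred - y[split:]) ** 2))
print(f"20 ms ahead prediction RMSE: {rmse:.3f}")
```

A real tremor signal is nonstationary and patient-specific, which is why the study turned to a deep recurrent network trained on sensor data from 81 patients rather than a fixed linear model like this one.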

Their paper details the artificial intelligence model and training, and reports a 95% confidence rate over 24,300 samples.

"Our model is already at the ready-to-use stage, available to neurologists, researchers, and assistive technology developers," said co-author S. Farokh Atashzar, who is now an NYU Tandon assistant professor and who began exploring the use of robots coupled with artificial intelligence while conducting doctoral and post-doctoral research in Canada. "It requires substantial computational power, so we plan to develop a low-power, cloud-computing approach that will allow wearable robots and exoskeletons to operate in patients' homes. We also hope to develop models that require less computational power and add other biological factors to the inputs."

Credit: 
NYU Tandon School of Engineering

Robot uses artificial intelligence and imaging to draw blood

image: This tabletop robotic device can accurately steer needles and catheters into tiny blood vessels with minimal supervision.

Image: 
Martin Yarmush and Alvin Chen

Rutgers engineers have created a tabletop device that combines a robot, artificial intelligence and near-infrared and ultrasound imaging to draw blood or insert catheters to deliver fluids and drugs.

Their most recent research results, published in the journal Nature Machine Intelligence, suggest that autonomous systems like the image-guided robotic device could outperform people on some complex medical tasks.

Medical robots could reduce injuries and improve the efficiency and outcomes of procedures, as well as carry out tasks with minimal supervision when resources are limited. This would allow health care professionals to focus more on other critical aspects of medical care and enable emergency medical providers to bring advanced interventions and resuscitation efforts to remote and resource-limited areas.

"Using volunteers, models and animals, our team showed that the device can accurately pinpoint blood vessels, improving success rates and procedure times compared with expert health care professionals, especially with difficult to access blood vessels," said senior author Martin L. Yarmush, Paul & Mary Monroe Chair & Distinguished Professor in the Department of Biomedical Engineering in the School of Engineering at Rutgers University-New Brunswick.

Getting access to veins, arteries and other blood vessels is a critical first step in many diagnostic and therapeutic procedures. They include drawing blood, administering fluids and medications, introducing devices such as stents and monitoring health. The timeliness of procedures can be critical, but gaining access to blood vessels in many people can be quite challenging.

Failures occur in an estimated 20 percent of procedures, and difficulties increase in people with small, twisted, rolling or collapsed blood vessels, which are common in pediatric, elderly, chronically ill and trauma patients, the study says. In these groups, the first-stick accuracy rate is below 50 percent and at least five attempts are often needed, leading to delays in treatment. Bleeding complications can arise when major adjacent arteries, nerves or internal organs are punctured, and the risk of complication rises significantly with multiple attempts. When nearby blood vessels are inaccessible, more invasive approaches such as central venous or arterial access are often required.

The robotic device can accurately steer needles and catheters into tiny blood vessels with minimal supervision. It combines artificial intelligence with near-infrared and ultrasound imaging to perform complex visual tasks, including identifying the blood vessels from the surrounding tissue, classifying them and estimating their depth, followed by motion tracking. In other published work, the authors have shown that the device can serve as a platform to merge automated blood-drawing and downstream analysis of blood.

Next steps include more research on the device in a broader range of people, including those with normal blood vessels and those with blood vessels that are difficult to access.

"Not only can the device be used for patients, but it can also be modified to draw blood in rodents, a procedure which is extremely important for drug testing in animals in the pharmaceutical and biotech industries," Yarmush said.

Credit: 
Rutgers University

Household chemical use linked to child language delays

COLUMBUS, Ohio - Young children from low-income homes whose mothers reported frequent use of toxic chemicals such as household cleaners were more likely to show delays in language development by age 2, a new study found.

In addition, the children scored lower on a test of cognitive development. These developmental delays were evident even when the researchers took into account factors such as the education and income of mothers, which are also linked to their children's language and cognitive skills.

The findings provide additional evidence of the need for pediatricians and other health care providers to counsel parents of young children to restrict their use of toxic household chemicals, said Hui Jiang, lead author of the study and senior research associate at The Ohio State University.

"We found that a significant percentage of mothers with young children may commonly expose their children to toxic household chemicals, possibly because they are unaware that such materials may be harmful," said Jiang, who is with Ohio State's Crane Center for Early Childhood Research and Policy.

The study was published online recently in the journal Clinical Pediatrics.

The researchers used data on 190 families from the Kids in Columbus Study, a Crane Center research project that followed children born into low-income families in Columbus for five years after birth.

When they first started the study, mothers were asked about their use of household chemicals such as floor and toilet cleaners and solvents during pregnancy. They were asked again when their child was 14 to 23 months old. Mothers also reported whether they had mold in the home, their use of pesticides, and neighborhood pollution sources.

Children's language development was measured when they were between 14 and 23 months old and again when they were 20 to 25 months old. The researchers used a standardized test that examines children's understanding and expression of language - for example, recognition of objects and people, following directions, and naming objects and pictures.

Findings showed that neighborhood pollution, mold in the house and pesticide use were not significantly linked to child outcomes.

But the more household chemicals mothers reported using regularly after childbirth, the lower the child language and cognitive outcomes at 2 years of age.

There was no link between chemical use during pregnancy and child outcomes, possibly because mothers reported using significantly fewer chemicals during pregnancy.

Exposure to toxic chemicals was reported by about 20 percent of mothers during pregnancy, but that increased to 30 percent when their children were between 1 and 2 years old. Mothers also reported using more household chemicals after childbirth.

"A lot of mothers seem to know to limit exposure to toxic chemicals during pregnancy, but once their child is born, they may think it is no longer a problem," Jiang said.

But research has shown these early years of a child's life are key in many ways, said Laura Justice, co-author of the study and professor of educational psychology at Ohio State.

"When kids reach about 2 years old, that is a peak time for brain development," said Justice, who is executive director of The Crane Center.

"If the use of toxic chemicals is interfering with that development, that could lead to problems with language and cognitive growth."

While many mothers may use household cleaners and other toxic chemicals when their children are young, low-income mothers may face particular challenges, Jiang said.

For example, they often live in smaller apartments where it may be more difficult to keep children away from chemicals, particularly while they are cleaning.

Jiang noted that this study simply analyzed the relationship between mothers' use of toxic chemicals and later child development and as such can't prove that chemical use caused the developmental delays.

"Future studies are need to more carefully examine the mechanisms through which household toxicants may disrupt early language development," she said.

The findings do show that pediatricians need to emphasize that pregnancy is not the only time for mothers to be concerned about chemical use, Justice said.

"Parents need to understand the delicacy of brain development in the first several years of life and their children's susceptibility to chemical exposure," she said.

Credit: 
Ohio State University

Study reveals improving survival rates after liver transplantation in the UK

In the past two decades, death rates after liver transplantation have dropped by more than half in the UK, according to a recent analysis of almost 10,000 liver transplant recipients published in BJS (British Journal of Surgery). During this time period, survival over the first 3 years has improved to 83.1% in 2012-2016 (from 71.7% in 1997-2001) for patients who had transplants for cancer and to 90.7% (from 79.6%) for those transplanted for benign diseases.

"The increase in survival after liver transplantation in the last 20 years can be explained by a combination of factors. There are improvements in short-time survival that are probably related to surgical technique and perioperative care, and improvements in long-term outcomes that are linked to developments in immunosuppression and follow-up care," said study coauthor Professor Nigel Heaton, MBBS, FRCS, of King's College Hospital NHS Foundation Trust.

Credit: 
Wiley

City of Hope creates innovative platform for landmark study, opening data to more people

DUARTE, Calif. -- A $12 million federal grant enabled City of Hope and collaborators to deploy a novel cloud-computing platform, making an immense amount of data from a historic 25-year study more accessible and user-friendly.

The ongoing California Teachers Study that first began in 1995 has already given researchers a bevy of data on how to better prevent and treat cancers, heart conditions and Alzheimer's disease. In the past, this data was available only to a select few researchers. Opening the data to researchers worldwide and making it user-friendly will fast-track scientific discoveries that can improve the quality of life for people around the world, said James Lacey Jr., Ph.D., M.P.H., director of the Division of Health Analytics at City of Hope and one of the principal investigators of the study.

"We might be one of the first in the world to use secure cloud computing to build a data commons for an observational study," Lacey said, adding that observational studies are expensive, so synthesizing data from disparate sources and making the information widely available is one way to ensure that federal grant dollars "get more mileage."

"City of Hope continues to lead the health provider pack when it comes to collaborating with cutting-edge technology companies to deploy solutions that accelerate the translation of precision medicine into disease prevention and, potentially, therapies for patients," Lacey said. Precision medicine is a personalized approach to disease prevention and treatment that takes into account each person's specific genes, environment and lifestyle choices.

The study, published on Feb. 12 in the journal Cancer Epidemiology, Biomarkers and Prevention, provides a roadmap for other population health experts who want to broaden the reach and potential impact of their own research. The novel open cloud-computing platform City of Hope, San Diego Supercomputer Center (SDSC) and UC San Diego created for the California Teachers Study has simplified the process for understanding the incidence and distribution of disease. As a result, scientists can more quickly detect patterns and trends that could be translated into better health for individuals and the public.

The California Teachers Study was created in 1995 and enrolled 133,479 current and former public-school teachers or administrators. They agreed to have their health and lifestyle tracked to help understand why teachers historically have higher rates of breast cancer. The study has since expanded to address other cancers including colon, pancreatic and bladder, as well as heart disease and even Parkinson's and Alzheimer's disease. More than 190 published studies have resulted from the data.

This is an example of how sharing is really caring, Lacey said. "Cancer, heart disease and Alzheimer's disease are big problems that need the combined brainpower of the brightest minds around the world. Cloud computing directly helps cancer researchers store, share, analyze and use their data in new and more efficient ways. In short, our open website allows interested individuals to securely access, explore and generate discoveries with our California Teachers Study data."

The new platform shortens the time needed to launch a research project from weeks to days, Lacey said. Previously, every research project had to be custom built, but now with the data commons framework, users can get started quickly, apply workflow templates for their projects and start analyzing the data right away.

"It is gratifying to see the grant for the California Teachers Study infrastructure successfully deliver on its promise of building a secure, cloud-based data commons platform for the cancer epidemiology research community," said Sandeep Chandra, director of SDSC's Sherlock Division and senior author of the study. "What is more exciting is the potential of how this data commons can serve as a model for other current, and future, observational studies through adoption of this framework; thereby, reducing time and investment to deploy data management and analysis capabilities."

Elena Martinez, Ph.D., professor in the Department of Family Medicine and Public Health at UC San Diego and one of the principal investigators of the study added, "The newly implemented California Teachers Study infrastructure exemplifies what is possible when leveraging the knowledge and experience of population scientists who work alongside data scientists to move research into the 21st century. I am proud to be a part of an innovation leader that will serve as a model for future observational studies."

Credit: 
City of Hope