
Remote neuropsychology tests for children shown effective


Administering neuropsychology evaluations to children online in the comfort of their own homes is feasible and delivers results comparable to tests traditionally performed in a clinic, a new study led by UT Southwestern researchers and Children's Health indicates. The finding, published online this month in the Archives of Clinical Neuropsychology, could help expand access to specialists and reduce barriers to care, particularly as the popularity of telemedicine grows during the COVID-19 pandemic.

Patients with a variety of neurological disorders require periodic neuropsychological evaluations to track their cognition, academic skills, memory, attention, and other variables. Typically, these tests are done in clinics, often by specialists in these disorders.

However, explains Lana Harder, Ph.D., ABPP, associate professor of psychiatry and neurology at UTSW, many patients travel hundreds of miles to access specialists for their care - a major expense and inconvenience that can also cause fatigue and potentially influence the results. Harder also leads the neuropsychology service and is the neuropsychology training director at Children's Health.

Research on adults has shown that these evaluations can be done effectively, with the examiner and patient in different rooms. However, those tests were conducted in controlled clinic or laboratory settings rather than patients' homes, where distractions and technological glitches could confound results. Plus, none of the earlier studies involved children, a population that has its own unique challenges.

To evaluate whether teleneuropsychology evaluations could be effectively performed with children at home, Harder, along with Benjamin Greenberg, M.D., professor of neurology and pediatrics at UTSW and co-director with Harder of the Pediatric CONQUER Program at Children's, and their colleagues recruited 58 patients primarily from the Pediatric Demyelinating Disease Program at Children's Medical Center Dallas. This clinic treats patients with neurological autoimmune disorders that target myelin, an insulating layer on nerve cells that is critical to their function. The disorders include transverse myelitis, multiple sclerosis, acute disseminated encephalomyelitis, optic neuritis, and neuromyelitis optica. The patients ranged in age from 6 to 20 and traveled up to 2,033 miles for visits to the clinic.

Each child received the same 90-minute neuropsychology battery twice - once at home and once at the clinic - spaced apart by about 16 days. Half the group received the home test first; the other half got the clinic test first.
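For readers curious how such a counterbalanced crossover is typically set up, here is a minimal Python sketch; the participant IDs, seed, and function name are hypothetical illustrations, not details from the study.

```python
import random

def assign_test_order(participant_ids, seed=0):
    """Counterbalanced crossover: randomly assign half the group to
    take the home test first and half to take the clinic test first,
    so order and practice effects cancel out across the sample."""
    rng = random.Random(seed)          # fixed seed: reproducible assignment
    ids = list(participant_ids)
    rng.shuffle(ids)
    half = len(ids) // 2
    orders = {pid: ("home", "clinic") for pid in ids[:half]}
    orders.update({pid: ("clinic", "home") for pid in ids[half:]})
    return orders

# Example with 58 hypothetical participants, matching the cohort size.
orders = assign_test_order(range(1, 59))
```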

For the home test, children received a packet of testing materials prior to their test date and, if they did not have a computer or tablet at home, borrowed a tablet from the researchers' office in advance. For both tests, parents or other caregivers left the room, allowing the patient and researcher to interact one on one.

The home-based environment posed unique challenges compared with the clinic, Greenberg explains: Any distraction, from a barking dog to a doorbell, or a technological glitch, such as a poor internet connection, could invalidate the results. Distractions and technology problems did occur intermittently during remote sessions, but they were typically fleeting and did not interfere with testing.

When the researchers compared the results obtained from the home- and clinic-based tests, no significant differences were found.

But it's not enough to show that home-based testing is comparable to clinic testing, Harder notes - patients and their caregivers must also be willing to embrace remote testing for it to be feasible. To that end, the researchers gave each patient and their caregivers a survey to assess their satisfaction with the videoconference-based test. The vast majority (94 percent of caregivers and 90 percent of participants) responded that they were satisfied with home-based testing. Given a choice between remote and in-person testing, most indicated no preference.

Teleneuropsychology testing still needs to be evaluated over a broader age range and array of conditions and measures before it becomes a staple in the field, Harder says. But having this option could eventually spare children long journeys to access specialists, or the exposure risks of in-person visits - a boon during the era of COVID-19, she adds.

"This model could allow these young and often medically fragile children to stay put but still receive the care that they need," Harder says.

Credit: 
UT Southwestern Medical Center

Choanozoan and picozoan marine protists are probably virus eaters - study

image: Sampling seawater in the Gulf of Maine.

Image: 
Ramunas Stepanauskas and coauthors

Viruses occur in astronomic numbers everywhere on Earth, from the atmosphere to the deepest ocean. Surprisingly, considering the abundance and nutrient-richness of viruses, no organisms are known to use them as food. In Frontiers in Microbiology, researchers publish the first compelling evidence that two groups of ecologically important marine protists, choanozoans and picozoans, are virus eaters, catching their "prey" through phagocytosis (i.e. engulfing).

"Our data show that many protist cells contain DNA of a wide variety of non-infectious viruses but not bacteria, strong evidence that they are feeding on viruses rather than on bacteria. That came as a big surprise, as these findings go against the currently predominant views of the role of viruses and protists in the marine food webs," says corresponding author Dr Ramunas Stepanauskas, Director of the Single Cell Genomics Center at Bigelow Laboratory for Ocean Sciences in East Boothbay, Maine, USA.

Stepanauskas and colleagues sampled surface seawater from two sites: the Northwestern Atlantic in the Gulf of Maine, USA in July 2009, and the Mediterranean off Catalonia, Spain in January and July 2016. They used modern single-cell genomics tools to sequence the total DNA from 1,698 individual protists in the water. Each of the resulting Single Amplified Genomes (SAGs) consists of the genome of a single protist, with or without associated DNA: for example, from symbionts, ingested prey, or viruses or bacteria sticking to its exterior. The technique is very sensitive, but doesn't directly show the type of relationship between a protist and its associates.

The researchers found a range of protists including alveolates, stramenopiles, chlorophytes, cercozoans, picozoans, and choanozoans. Nineteen percent of SAGs from the Gulf of Maine and 48% of those from the Mediterranean were associated with bacterial DNA, suggesting that these protists had eaten bacteria. More common were viral sequences, found in 51% of SAGs from the Gulf of Maine and 35% of those from the Mediterranean, with a frequency of 1-52 virus types per protist. Most were from viruses known to infect bacteria - presumably representing parasites of the protists' bacterial prey.

But choanozoans and picozoans, which occurred only in the Gulf of Maine sample, were different. These groups, neither of which has chloroplasts, are poorly known. Choanozoans (3-10 μm; also known as choanoflagellates) are of great evolutionary interest as the closest living relatives of animals and fungi. The tiny (up to 3 μm) picozoans were first discovered twenty years ago and were originally known as picobiliphytes. Until now, their food sources were a puzzle, as their feeding apparatus is too small for bacteria - but ample for viruses, most of which are smaller than 150 nm.

Every single one of the choanozoan and picozoan SAGs was associated with viral sequences from bacteriophages and CRESS-DNA viruses, yet mostly without any bacterial DNA, and the same sequences were found across a great diversity of species.

"It is very unlikely that these viruses are capable of infecting all the protists in which they were found," says Dr Julia Brown, a researcher at the Bigelow Laboratory for Ocean Sciences and coauthor on the study.

The authors conclude that choanozoans and picozoans probably routinely eat viruses.

"Viruses are rich in phosphorus and nitrogen, and could potentially be a good supplement to a carbon-rich diet that might include cellular prey or carbon-rich marine colloids," says Brown. "The removal of viruses from the water may reduce the number of viruses available to infect other organisms, while also shuttling the organic carbon within virus particles higher up the food chain. Future research might consider whether protists that consume viruses accumulate DNA sequences from their viral prey within their own genomes, or consider how they might protect themselves from infection."

Credit: 
Frontiers

Sky islands and tropical alpine sunflowers at risk of disappearing

image: Páramo with Espeletia plants

Image: 
Andrés Cortés, Santiago Madriñán and coauthors

As temperatures rise around the world, many species may escape the heat by migrating to higher elevations. But what will happen to those species that are already as high as there is to go?

A new study in Frontiers in Ecology and Evolution is among the first to predict the vulnerability of ecosystems in the Andes to both climate change and human activities. The researchers focused on biodiversity hotspots, called Páramos, and the most diverse plant species of these ecosystems -- relatives of the sunflower in the genus Espeletia. The researchers' models predict that these habitats will shrink substantially in the next 30 years without conservation efforts. Beyond this potential loss of biodiversity, the shrinkage is likely to harm the human populations that rely on these ecosystems as well.

"Páramos are one of the fastest evolving biodiversity hotspots on earth and they are one of the most threatened," says co-leading author Dr Andrés Cortés, of the Colombian Corporation for Agricultural Research, together with Dr Santiago Madriñán, who is an expert on Páramos at the Universidad de los Andes in Colombia. "Páramos are also the main water supplier of wetland ecosystems and densely populated areas, hence, disregarding the future of the Páramos may jeopardize overall food and water safety in the northern Andes."

Páramos, or "sky islands," are tropical high-elevation ecosystems that lie above the tree line, ca. 2,800 - 5,000 m above sea level, but below the permanently frozen mountaintops. Over a few million years, the species that inhabit these areas have adapted to extreme variations in temperature, water availability and sunlight exposure.

As a result of these conditions, there are now over 3,000 plant species throughout the South American Páramos and these areas continue to be among the fastest evolving ecosystems in the world. But scientists are only beginning to understand whether these species can evolve fast enough to keep up with climate change.

The team selected Espeletia as a representative genus because it is one of the most diverse and successful plant genera endemic to the Páramos, as well as iconic with its unbranched trunk topped with a rosette of leaves. The researchers used the most up-to-date computer modelling to predict what the distribution of 28 species of Espeletia would look like in 2050.

By adding in other factors such as nature reserves, surrounding forests, population density, agriculture and mining, the researchers found that some Páramos were particularly vulnerable and they also confirmed the limited opportunities for Espeletia species to migrate or adapt.
Future work is needed to understand the role of microhabitats and other species, but these findings highlight the importance of protecting these areas.

"We hope that our findings might assist future conservation efforts, such as promoting more sustainable land uses by empowering local communities and developing ecotourism, which are also essential to relieve human impact on tropical-alpine plant diversity in the northern Andes," says the team. "In the most pessimistic scenario, if Espeletia and Páramos were lost forever, science would lose an underexplored laboratory to study evolution happening at incredible rates -- it would be just as if The Galápagos Islands disappeared."

Credit: 
Frontiers

Climate pledges 'like tackling COVID-19 without social distancing'

Current global pledges to tackle climate change are the equivalent of declaring a pandemic without a plan for social distancing, researchers say.

In the Paris Agreement, nations agreed to limit global warming to "well below 2°C".

But University of Exeter scientists say governments are engaged in "climate hypocrisy" by publicly supporting the agreement while subsidising the fossil fuel industry, destroying forests and pursuing other harmful policies.

Writing in the journal Global Sustainability, they highlight two other crises - ozone depletion and the COVID-19 pandemic - and call for similar action on the climate crisis.

The call comes as world leaders including UK Prime Minister Boris Johnson discuss climate action and a "sustainable recovery" from the pandemic at the UN General Assembly.

"Restoring the ozone layer and minimising the COVID-19 pandemic both required governments to enact specific legislation to address the precise causes of these problems," said Professor Mark Baldwin, of Exeter's Global Systems Institute (GSI).

"By contrast, Paris Agreement commitments are the equivalent of intending to restore the ozone layer without a plan for eliminating ozone-depleting substances, or intending to end the COVID-19 pandemic without a plan for social distancing to reduce the spread of the virus.

"We know the climate crisis is caused mainly by fossil fuels.

"Current climate and energy policies are therefore nonsensical because they condemn greenhouse gas emissions by individuals while promoting fossil fuel production.

"Today we have governments publicly supporting the Paris Agreement, but simultaneously opening new coal mines, destroying forests, supporting fracking, subsidising the fossil fuel industry and supporting fossil fuel projects in the developing world."

Professor Tim Lenton, director of the GSI, said: "The fundamental reason we are not solving the climate crisis is not a lack of green energy solutions - it is that many governments continue energy strategies that prioritise fossil fuels.

"These entrenched energy policies subsidise the discovery, extraction, transport and sale of fossil fuels, with the aim of ensuring a cheap, plentiful, steady supply of fossil energy into the future.

"Some governments are introducing policies to reduce demand for fossil fuels and shift to green energy sources, but these policies are not enough.

"Green energy is not yet replacing fossil fuels - it is merely augmenting it. Energy from both fossil fuels and green sources is increasing.

"Individual behaviour choices - such as diets and modes of travel - are important, but more fundamental is to replace the supply of fossil fuels with green energy."

The researchers call for a "comprehensive global plan" to solve the climate crisis.

They make seven recommendations:

1. End all government subsidies to the fossil fuel industry.

2. Ban all exploration for new oil/gas/coal reserves anywhere in the world.

3. Enforce a policy that no public money can be spent on fossil fuel infrastructure anywhere in the world.

4. Stop justifying fossil fuel use by employing carbon offset schemes.

5. Redirect most fossil fuel subsidies to targeted programmes for enabling the transition to a green energy economy.

6. Minimise reliance on future negative-emissions technologies. They should be the subject of research, development, and potentially deployment, but the plan to solve the climate crisis should proceed on the premise that they will not work at scale.

7. Trade deals: Do not buy products from nations that destroy rainforests in order to produce cheaper, greater quantities of meat and agricultural products for export.

Professor Baldwin added: "To bring about real change, we must address complex issues involving politics, fake news, human behaviour, government subsidies, taxes, international trade agreements, human rights, lobbying by the fossil fuel industry, and disinformation campaigns."

Credit: 
University of Exeter

World-first study links obesity with reduced brain plasticity

A world-first study has found that severely overweight people are less likely to be able to re-wire their brains and find new neural pathways, a discovery that has significant implications for people recovering from a stroke or brain injury.

In a new paper published in Brain Sciences, researchers from UniSA and Deakin University show that brain plasticity is impaired in obese people, making it less likely that they can learn new tasks or remember things.

Using a series of experiments involving transcranial magnetic stimulation, the researchers tested 15 obese people aged between 18 and 60, comparing them with 15 people in a healthy-weight control group.

Repeated pulses of magnetic stimulation were applied to the brain to see how strongly it responded. The healthy-weight control group recorded significant neural activity in response to the stimulation, suggesting a normal brain plasticity response. In contrast, the response in the obese group was minimal, suggesting its capacity to change was impaired.

UniSA researcher Dr Brenton Hordacre says the findings provide the first physiological evidence of a link between obesity and reduced brain plasticity.

Obesity is defined using body mass index (BMI), which divides weight in kilograms by the square of height in metres to estimate body fat. An adult with a BMI between 25 and 29.9 is considered overweight; a BMI of 30 or above is obese.
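As a worked illustration of that definition, here is a minimal sketch in Python; the formula and cut-offs are the standard ones described above, while the example values are invented.

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight in kilograms divided by height in metres, squared."""
    return weight_kg / height_m ** 2

def classify(value: float) -> str:
    if value < 25:
        return "under/healthy weight"
    if value < 30:
        return "overweight"   # BMI 25-29.9
    return "obese"            # BMI 30 and above

example = bmi(95.0, 1.75)     # invented values -> about 31.0
print(f"BMI {example:.1f}: {classify(example)}")   # BMI 31.0: obese
```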

"Obesity is already associated with a raft of adverse health effects, including a higher risk of cardiovascular disease, metabolic disorders and dementia," Dr Hordacre says.

"For the first time, we found that obesity was associated with impaired brain function, adding further support for the need to address the obesity epidemic.

"A growing number of people are obese - 650 million according to the World Health Organization - which not only has health consequences but is a serious financial burden for global health systems," he says.

"These new findings suggest that losing weight is particularly important for healthy brain ageing or for recovery in people who suffer strokes or brain injuries, where learning is fundamental for recovery."

Stroke is the third most common cause of death in Australia and the leading cause of disability, affecting speech, cognition and memory.

The ability of the brain to find new pathways is crucial to recovery, Dr Hordacre says. Worldwide, 15 million people suffer strokes each year, a third of whom die.

Credit: 
University of South Australia

Study shows that cycling is associated with reduced risk of both all-cause and cardiovascular mortality among people with diabetes

New research presented at this year's Annual Meeting of the European Association for the Study of Diabetes (EASD) shows that cycling reduces the risk of all-cause and cardiovascular mortality among people with diabetes, and could be a useful addition to existing physical activity referral schemes for patients with the disease.

The research was conducted by Dr Mathias Ried-Larsen and colleagues at the Centre for Physical Activity Research, Rigshospitalet, Copenhagen, Denmark. It analysed the association between time spent cycling and the risk of both all-cause and cardiovascular mortality, and also examined whether changes in time spent cycling affected those risks.

The risk of premature death from cardiovascular causes in people with diabetes is well established; however, individuals with the disease also have higher all-cause mortality, and there are few effective preventive measures to reduce this risk.

This prospective cohort study drew on a pair of questionnaire-based surveys administered in eight Western European countries as part of the European Prospective Investigation into Cancer and Nutrition. The initial baseline survey took place between 1992 and 2000, with participants sent a follow-up questionnaire five years after they completed the first one.

A total of 7,513 adults had self-reported or confirmed diabetes at the baseline of the study, of whom 5,506 went on to complete the second questionnaire and were selected by Dr Ried-Larsen and his team for inclusion in their research. The primary and secondary outcomes they investigated were all-cause and cardiovascular mortality, respectively. Other factors included in their analysis were the average time an individual spent cycling at baseline measured in minutes per week, and the change in cycling status between baseline and the second survey.

During a total of 111,840 person-years of follow-up, 1,684 deaths from all causes were registered among the study group. The authors found that, compared to the reference group of people who reported no cycling at baseline, all-cause mortality risk was 25%, 24%, 31%, and 24% lower for participants who cycled for 1-59 min/week, 60-149 min/week, 150-299 min/week and 300+ min/week, respectively.

The analysis of the effect of a change in cycling status was based on 58,493 person-years of follow-up, during which 990 deaths of study participants from all causes were registered. Compared to people who reported no cycling at both examinations, those who cycled at baseline but later stopped showed no difference in all-cause mortality; mortality was 35% lower in initial non-cyclists who took up cycling, and 44% lower in people who reported cycling in both questionnaires. Similar results were observed for cardiovascular mortality.
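To make the rate arithmetic concrete, here is a small sketch using only the figures reported above; the crude-rate calculation is illustrative and is not how the study derived its adjusted risk estimates.

```python
# Crude all-cause mortality rate in the baseline analysis.
deaths, person_years = 1684, 111840
rate = deaths / person_years * 1000
print(f"{rate:.1f} deaths per 1,000 person-years")   # ~15.1

# Reported reductions in all-cause mortality risk vs. non-cyclists,
# by weekly cycling time at baseline.
risk_reduction = {
    "1-59 min/week":    0.25,
    "60-149 min/week":  0.24,
    "150-299 min/week": 0.31,
    "300+ min/week":    0.24,
}
```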

The authors found that "Cycling was associated with lower all-cause and cardiovascular mortality risk among people with diabetes independent of practicing other types of physical activity."

They add: "Participants who took up cycling between the baseline and second survey had a significantly lower risk of both all-cause and cardiovascular mortality compared to consistent non-cyclists."

They conclude: "As starting cycling decreases risk of both all-cause and cardiovascular mortality among persons with diabetes, these findings suggest that cycling could be considered as an addition to existing physical activity referral schemes to increase physical activity in the clinical care of diabetes."

Credit: 
Diabetologia

Oncotarget: A comprehensive analysis of clinical trials in pancreatic cancer

image: Summary of the time and cost for drug development (modified from DiMasi et al. [2016]). Costs factor in the 11.83% success rate through all three trials.

Image: 
Correspondence to - Jordan M. Winter - Jordan.Winter@UHhospitals.org

The cover for issue 38 of Oncotarget features Figure 3, "Summary of the time and cost for drug development (modified from DiMasi et al. [2016])," by Katayama et al., which reported that pancreatic cancer is the most aggressive common cancer and is desperately in need of novel therapies.

In this study, the Oncotarget authors perform the first comprehensive analysis of the current clinical trial landscape in pancreatic cancer to better understand the pipeline of new therapies.

Studies were curated and categorized according to the phase of the study, the clinical stage of the study population, type of intervention under investigation, and biologic mechanism targeted by the therapy under study.

As of May 18, 2019, there were 430 total active therapeutic interventional trials testing 590 interventions.

Only a small minority of trials are in phase III testing. Of the interventions, 189 are immunotherapies, 69 target cell signaling pathways, 154 target cell cycle or DNA biology, and 35 target metabolic pathways.

Dr. Jordan M. Winter from The Case Western Reserve University School of Medicine as well as The University Hospitals Seidman Cancer Center and Case Comprehensive Cancer Center said, "Pancreatic ductal adenocarcinoma (PDAC) is the most aggressive of the common cancers."

The National Cancer Institute allocates roughly $6 billion annually to cancer research, and just over $100 million of that is dedicated to studying pancreatic cancer.

Other agencies and organizations like the Pancreatic Cancer Action Network, American Cancer Society, and the Department of Defense contribute significantly to this mission, likely adding more than $20 million per year in totality.

Along these lines, the authors argue that, given the additive toxicities of chemotherapeutic combinations, the field has likely neared a survival ceiling for patients in the absence of new discoveries that target other aspects of cancer biology.

Patients, family members, primary care providers, and oncologists battling together against pancreatic cancer often consider the same important questions: what new treatments are coming down the pike, and how soon will they arrive?

This work is necessary to better anticipate the timeframe for novel therapies against pancreatic cancer to reach the market. More importantly, this 20,000-foot view provides a foundation to discuss optimal resource allocation, with the principal goal of accelerating the pace of innovation and improving patient outcomes.

The Winter Research Team concluded in their Oncotarget Research Paper that the majority of PDAC trials are focused on immunotherapy, chemotherapy, and radiation.

Following the herd has not yet worked well for PDAC research; immunotherapy and precision therapy have yet to strongly impact this disease.

Finally, this compendium focuses on therapeutic studies and not early detection.

It is possible that the greatest advance in the future could be the discovery of an effective PDAC biomarker.

If PDAC could be detected at the pre-invasive PanIN 3 stage, therapeutic trials of invasive cancer would become inconsequential.

Credit: 
Impact Journals LLC

Scientists develop forecasting technique that could help advance quest for fusion energy

image: An artist's rendition of a disrupting tokamak plasma in front of computer code

Image: 
Elle Starkman / PPPL Office of Communications

Bringing the power of the sun to Earth requires sound theory, good engineering, and a little finesse. The process entails trapping charged, ultra-hot gas known as plasma so its particles can fuse and release enormous amounts of energy. The most widely used facilities for this process are doughnut-shaped tokamaks that hold plasma in place with strong magnets that are precisely shaped and positioned. But errors in the shaping or placement of these magnets can lead to poor confinement and loss of plasma, shutting down fusion reactions.

Now, an international group of researchers led by physicists at the U.S. Department of Energy's (DOE) Princeton Plasma Physics Laboratory (PPPL) has developed a technique that forecasts how tokamaks might respond to these unwanted magnetic errors. These forecasts could help engineers design fusion facilities that efficiently create a virtually inexhaustible supply of safe and clean fusion energy to generate electricity.

Fusion combines light elements in the form of plasma -- the hot, charged state of matter composed of free electrons and atomic nuclei -- and generates massive amounts of energy in the stars. Scientists aim to reproduce and control this process on Earth.

The team formulated a rule known as a scaling law that helps infer the properties of future tokamaks from present devices. The law is derived largely from three years of experiments on the DIII-D National Fusion Facility that General Atomics operates for the DOE in San Diego. Researchers also drew upon a database of error field effects maintained by ITER's International Tokamak Physics Activity group, which coordinates fusion research around the world.

Now, data from additional devices spanning a range of sizes are needed to increase confidence in extrapolating the scaling law to predict how large error fields can be before they disrupt ITER, the multinational tokamak being built in France to demonstrate the viability of fusion energy.
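The release does not reproduce the scaling law itself, but the underlying technique -- fit a power law to measurements from machines of different sizes, then extrapolate to a larger device -- can be sketched as follows. All numbers, variable names, and the assumed functional form are hypothetical placeholders, not values from the paper.

```python
import numpy as np

# Hypothetical (machine size, tolerable error field) pairs from
# existing tokamaks, assuming a power law: limit = c * R**alpha.
major_radius_m = np.array([1.67, 1.75, 1.85, 2.96])   # illustrative
error_limit    = np.array([8e-4, 7e-4, 6e-4, 4e-4])   # illustrative, normalised

# A power law is a straight line in log-log space, so fit one there.
alpha, log_c = np.polyfit(np.log(major_radius_m), np.log(error_limit), 1)

# Extrapolate to a larger machine, e.g. ITER's roughly 6.2 m major radius.
predicted_limit = np.exp(log_c) * 6.2 ** alpha
print(f"alpha = {alpha:.2f}, extrapolated limit ~ {predicted_limit:.1e}")
```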

Formation of error fields

Irregularities in the shaping or placement of a tokamak's magnets can produce error fields that trigger a disruption in the plasma, causing it to suddenly escape from the magnetic fields and release lots of energy. "The question is how large an error field ITER can tolerate without disrupting," said Nikolas Logan, PPPL physicist and lead author of a paper reporting the results in Nuclear Fusion. "We want to prevent disruptions in ITER because they could both interfere with fusion reactions and damage the walls."

Since ITER is under construction, the researchers used a mash-up of two computer codes to model the effects of error fields on plasmas for tokamaks in South Korea, China, the United Kingdom, and other countries, strengthening the errors until the plasmas disrupted. The researchers hoped to find patterns allowing them to formulate a simple rule that would help make conjectures about future error field disruptions in tokamaks being built.

The combined codes modeled the plasma more accurately than each individual code could do on its own. The TM1 code developed by Germany's Max Planck Institute for Plasma Physics solves equations that model chaotic plasma behavior in cylinder shapes, while the Ideal Perturbed Equilibrium Code (IPEC) code developed at PPPL models plasma in a tokamak shape. "By combining these codes, we were able to simulate a wide range of conditions that could occur in a variety of devices, including ITER," said PPPL physicist Qiming Hu, one of the paper's authors. "It's important to get accurate forecasts for ITER because no current machine is that size."

"This work extends our knowledge of the effects of error fields in fusion devices," said Raffi Nazikian, head of the ITER and Tokamak department at PPPL. "The combination of numerical and experimental analysis provides a compelling basis for assessing the importance of error fields in ITER and future reactors."

Next steps

Logan and Hu hope to gather more information from tokamak experiments to make the scaling law more precise, enabling it to forecast plasma performance in both the core and edge regions of the plasma. "This is not an alarm-bell paper," said Logan. "It just helps physicists and engineers know how carefully they need to consider prospective error fields before putting lots of power into ITER."

Collaborators included researchers from General Atomics, the Institute of Plasma Physics of the Czech Academy of Sciences, the Institute of Plasma Physics of the Chinese Academy of Sciences, Korea's Ulsan National Institute of Science and Technology, the United Kingdom's Culham Centre for Fusion Energy, Italy's Consorzio RFX, Germany's Max Planck Institute for Plasma Physics, and the Plasma Science and Fusion Center at the Massachusetts Institute of Technology.

Credit: 
DOE/Princeton Plasma Physics Laboratory

Tandon researchers develop method to create colloidal diamonds

image: Complex structures comprising colloidal diamonds. The researchers applied strands of DNA to colloid surfaces. When colloids collide with each other in a liquid bath, the DNA snags and the colloids are linked.

Image: 
David Pine Lab

BROOKLYN, New York, Wednesday, September 23, 2020 - The colloidal diamond has been a dream of researchers since the 1990s. These structures -- stable, self-assembled formations of minuscule materials -- have the potential to make light waves as useful as electrons in computing, and hold promise for a host of other applications. But while the idea of colloidal diamonds was developed decades ago, no one was able to reliably produce the structures. Until now.

Researchers led by David Pine, professor of chemical and biomolecular engineering at the NYU Tandon School of Engineering and professor of physics at NYU, have devised a new process for the reliable self-assembly of colloids in a diamond formation that could lead to cheap, scalable fabrication of such structures. The discovery, detailed in "Colloidal Diamond," appearing in the September 24 issue of Nature, could open the door to highly efficient optical circuits leading to advances in optical computers and lasers, light filters that are more reliable and cheaper to produce than ever before, and much more.

Pine and his colleagues, including lead author Mingxin He, a postdoctoral researcher in the Department of Physics at NYU, and corresponding author Stefano Sacanna, associate professor of chemistry at NYU, have been studying colloids and the possible ways they can be structured for decades. These materials, made up of spheres hundreds of times smaller than the diameter of a human hair, can be arranged in different crystalline shapes depending on how the spheres are linked to one another. Each colloid attaches to another using strands of DNA glued to surfaces of the colloids that function as a kind of molecular Velcro. When colloids collide with each other in a liquid bath, the DNA snags and the colloids are linked. Depending on where the DNA is attached to the colloid, they can spontaneously create complex structures.

This process has been used to create strings of colloids and even colloids in a cubic formation. But these structures did not produce the Holy Grail of photonics -- a band gap for visible light. Much as a semiconductor's band gap blocks electrons in certain energy ranges, a photonic band gap filters out certain wavelengths of light. Filtering light in this way can be reliably achieved by colloids if they are arranged in a diamond formation, a process long deemed too difficult and expensive to perform at commercial scale.

"There's been a great desire among engineers to make a diamond structure," said Pine. "Most researchers had given up on it, to tell you the truth - we may be the only group in the world who is still working on this. So I think the publication of the paper will come as something of a surprise to the community."

The investigators, including Etienne Ducrot, a former postdoc at NYU Tandon, now at the Centre de Recherche Paul Pascal - CNRS, Pessac, France; and Gi-Ra Yi of Sungkyunkwan University, Suwon, South Korea, discovered that they could use a steric interlock mechanism that would spontaneously produce the necessary staggered bonds to make this structure possible. When these pyramidal colloids approached each other, they linked in the necessary orientation to generate a diamond formation. Rather than going through the painstaking and expensive process of building these structures through the use of nanomachines, this mechanism allows the colloids to structure themselves without the need for outside interference. Furthermore, the diamond structures are stable, even when the liquid they form in is removed.

The discovery was made because He, a graduate student at NYU Tandon at the time, noticed an unusual feature of the colloids he was synthesizing in a pyramidal formation. He and his colleagues drew out all of the ways these structures could be linked. When they happened upon a particular interlinked structure, they realized they had hit upon the proper method. "After creating all these models, we saw immediately that we had created diamonds," said He.

"Dr. Pine's long-sought demonstration of the first self-assembled colloidal diamond lattices will unlock new research and development opportunities for important Department of Defense technologies which could benefit from 3D photonic crystals," said Dr. Evan Runnerstrom, program manager, Army Research Office (ARO), an element of the U.S. Army Combat Capabilities Development Command's Army Research Laboratory.

Runnerstrom explained that potential future advances include applications for high-efficiency lasers with reduced weight and energy demands for precision sensors and directed energy systems, and precise control of light for 3D integrated photonic circuits or optical signature management.

"I am thrilled with this result because it wonderfully illustrates a central goal of ARO's Materials Design Program -- to support high-risk, high-reward research that unlocks bottom-up routes to creating extraordinary materials that were previously impossible to make."

The team, which also includes John Gales, a graduate student in physics at NYU, and Zhe Gong, a postdoc at the University of Pennsylvania and formerly a graduate student in chemistry at NYU, is now focused on seeing how these colloidal diamonds can be used in a practical setting. They are already creating materials using their new structures that can filter out optical wavelengths in order to prove their usefulness in future technologies.

Credit: 
NYU Tandon School of Engineering

UK lockdown and air pollution: Nitrogen dioxide halved but sulphur dioxide doubled

A University of Liverpool study of air pollution in the UK during the first 100 days of lockdown has revealed that whilst nitrogen dioxide levels were cut by half, levels of sulphur dioxide increased by over 100%.

Researchers from the University's School of Environmental Sciences analysed data from the Department for Environment, Food & Rural Affairs (DEFRA) air-quality sensors and UK Met Office stations to see how lockdown measures had affected levels of nitrogen dioxide, sulphur dioxide, particulate matter (PM2.5) and ozone, and compared them with data from the past seven years.

The study revealed that during this period (23 March to 13 June 2020) nitrogen dioxide (NO2) levels were cut by half, consistent with the reduction in vehicle emissions. More surprisingly, though, the analysis found that levels of sulphur dioxide (SO2), typically created by UK industry and in sharp decline for years, were more than double those of previous years.

Researchers also explored the localised effects of lockdown on air quality in seven large UK cities: London, Glasgow, Belfast, Birmingham, Manchester, Newcastle and Liverpool.

This revealed that NO2 levels fell in all of the cities, by 37-41% on average, with a slightly greater reduction of 44% in Glasgow. However, northern cities were found to experience greater increases in sulphur dioxide.

Lockdown in the UK came into effect on 23 March 2020, when the Prime Minister, Boris Johnson, told the country that people 'must' stay at home and certain businesses must close.

This resulted in a significant reduction in motor vehicle usage, with traffic falling to 69% of normal levels on the first day of lockdown. It reached a low of 23% on 13 April before steadily climbing back to 77% by 100 days into the lockdown. The first 100 days of lockdown also coincided with higher temperatures and lower humidity.
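The comparison the researchers describe -- lockdown-window concentrations set against the same calendar window in earlier years -- can be sketched in a few lines of pandas. The file and column names here are hypothetical; the actual DEFRA processing pipeline is not described in this release.

```python
import pandas as pd

# Hypothetical hourly sensor file with columns: timestamp, no2, so2.
df = pd.read_csv("defra_hourly.csv", parse_dates=["timestamp"])
df["year"] = df["timestamp"].dt.year
doy = df["timestamp"].dt.dayofyear
window = df[(doy >= 83) & (doy <= 165)]        # ~23 March to ~13 June

lockdown = window[window["year"] == 2020].mean(numeric_only=True)
baseline = window[window["year"].between(2013, 2019)].mean(numeric_only=True)

for gas in ["no2", "so2"]:
    change = 100 * (lockdown[gas] - baseline[gas]) / baseline[gas]
    print(f"{gas.upper()}: {change:+.0f}% vs 2013-2019 baseline")
```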

Lecturer in contemporary and dynamic processes Dr Jonny Higham, who led the study, said: "The results of our analysis are surprising. It is evident that the reduction in motor vehicles and human activity had a substantial impact on air quality, as demonstrated by the reduction in nitrogen dioxide. However, although lockdown reduced one pollutant, there has been a big increase in another.

"We think these changes could be driven by an in-balance in the complex air chemistry near to the surface exacerbated by the meteorological conditions in particularly low humidity levels and changes in pollutions concentrations.

"It is important to note that the complex and relatively stable air composition in the near surface layer can be disrupted in a short period of time by the significant reduction of primary emissions from human activities. For the case of UK, getting cleaner air from a large NO2 reduction may not be as straightforward as it seems."

Credit: 
University of Liverpool

Study shows impact of climate change on Neotropical freshwater ecosystems

image: Researchers from six countries in the Americas explored bromeliad microcosms, showing how drought and flood affect the functioning of aquatic ecosystems, especially at the bottom of the food chain

Image: 
Gustavo Quevedo Romero / UNICAMP

To understand how climate change may affect different ecosystems, 27 researchers from Brazil, Argentina, Colombia, Costa Rica, French Guiana and Puerto Rico, among other countries, conducted experiments in seven different locations involving the aquatic environment in the tank (centrally-located water-holding cup) of bromeliads, a habitat for insect larvae and other small organisms.

They discovered that, contrary to what might be expected, the smallest organisms at the bottom of the food chain suffer the most from rainfall instability, one of the expected effects of climate change, rather than the large organisms at the top. The study was supported by the São Paulo Research Foundation - FAPESP, and the findings are published in Nature Communications.

According to ecologist Gustavo Quevedo Romero, first author of the article and a professor in the Animal Biology Department at the University of Campinas's Biology Institute (IB-UNICAMP) in the state of São Paulo, Brazil, multiple-site studies designed to understand how climate change may affect ecosystems are rare. This type of approach is necessary to achieve a better understanding of how each geographic region and ecosystem will be affected. The findings from the experiments with bromeliads contradict those of much previous ecological research focusing on climate change and species resilience.

"We manipulated the quantity and frequency of rainfall in the microcosms [in the bromeliad tank] according to climate change models for the decades ahead. In the same experiment, we created both drought and flood conditions, as well as an index of hydrological stability in each bromeliad. The sample, therefore, included bromeliads with more stable hydrological conditions, in which water volume varied little over time, and others with more unstable conditions," Romero said. "We found that hydrological instability affected smaller organisms negatively. The largest amounts of these were found in microcosms with more stable conditions. The largest predators were found in bromeliads exposed to the worst periods of drought."

The scientific literature suggests larger animals suffer most from climate change, especially because they have less room to forage. "But we showed that in this case the bottom of the food chain may be more sensitive, and changes there can modify the trophic levels above," he said.

The researchers were surprised to find a single consistent pattern of susceptibility to climate change across all seven study sites despite local variations: predators always benefited from drier conditions, while small organisms were adversely affected in small environments and favored in larger environments with more rainfall. "Our study sites included Santa Fé, Argentina, which is arid, and locations in Costa Rica and French Guiana where drought is unusual, and the pattern was the same," he said.

Other results were also geographically consistent. "The predator biomass to prey biomass ratio, or the biomass pyramid representing the amount of potential food available for each trophic level in an ecosystem, was the same in all plants in all study sites and consistent throughout the geographical space regardless of species pool and the degree to which the organisms were drought-adapted," he said.

Instability

The study shows that climate change causes instability in food webs, especially when it is associated with drought. "More predators in a bromeliad with less water intensify the top-down predation effect on prey communities. This destabilizes the food web and may lead to local extinction of both predator and prey species," Romero said. In sum, although predators benefit in the smaller habitats caused by drought, these environments are more unstable and more vulnerable to extinction and ecosystem collapse.

The study methodology classified the fauna inhabiting bromeliad aquatic ecosystems into apex (top) predators, mesopredators (mid-ranking predators), and detritivores (small animals that feed on dead organic material, especially plant detritus). "We saw no impact of climate change on mesopredators, which were not affected by the treatments we administered," Romero said. In the bromeliad microcosm analyzed, mid-ranking predators were represented by small insect larvae that feed on other small organisms and are generally considered opportunistic species.

Like detritivores, filter feeders are small organisms. These are aquatic animals that feed on particles or other small organisms strained out of water and circulated through their system. Both of these groups benefited from steady hydrological conditions but were adversely affected by unstable conditions and exceptionally heavy rainfall.

"For example, heavy downpours in lakes and in bromeliad tanks lead to overflowing and loss of microorganisms and other nutrients via flushing," Romero said. "We showed in other studies that flushing and flooding can lead to significant losses of bacteria, microinvertebrates, and nutrients such as nitrate and phosphate."

Microcosm

Bromeliads, Romero explained, are natural environments in which scientists are able to explore various aspects of ecosystems, thanks to their small size and the ease with which they can be handled, as well as the fact that the results obtained in them can often be extrapolated for larger systems such as lakes or lagoons. Studies conducted in these larger environments tend to produce similar results to those performed in the bromeliad microcosm.

"Bromeliads are found throughout the Neotropics, and although their tanks are small environments a forest bromeliad community can accumulate as much as 50,000 liters of water per hectare, serving as a water source for birds, mammals and other terrestrial animals, and as an entire ecosystem for organisms that live in aquatic environments," he said.

This aquatic environment, mainly inhabited by small insects and crustaceans, is what the researchers explored. "The apex predators, especially dragonfly larvae, are larger animals, while the detritivores are tiny insects like mosquitoes and midges," he explained. "The apex predators feed on these smaller organisms, which feed on particles in suspension and detritus that falls from trees."

Common protocol

The research group, which also included scientists from France, Canada, the US, and the UK, held two meetings to decide on the experiment design. The same protocol was implemented at all seven study sites. At each site, they selected 30 bromeliads with more than 100 ml of tank capacity so that they could be colonized by the large predators. The species were the most common locally, as each region's bromeliad community comprises different species.

The 210 plants were washed and disinfected to remove all detritus and macroorganisms, as well as microorganisms such as bacteria and fungi. "We cleaned the bromeliads out very thoroughly so as to start from scratch," Romero said. "To initiate colonies and communities in the experimental ecosystems, we evenly divided the previously removed fine and coarse detritus among the 30 plants and stocked each bromeliad with the same community in terms of invertebrate families and functional groups. We then took them back into the field and installed them under individual transparent plastic shelters to ensure that the results were not affected by rain."

Each site's rainfall regime was simulated in terms of a five-year regional average for volume and frequency. "We manipulated the amount and frequency of rainfall during a two-month period at each site," Romero said. "We had an average for each geographical region. Below that we considered drought, and above it flooding. We manipulated the extremes, drought, and flood. No one had done that previously."
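A minimal sketch of that manipulation logic, assuming a baseline watering schedule derived from the regional average; the schedule and multipliers below are invented for illustration and are not the treatment levels used in the experiment.

```python
# Hypothetical regional-average schedule: mm of simulated rain per day.
baseline_mm = [12, 0, 8, 0, 0, 15, 0]   # one illustrative week

def treatment(schedule, amount_factor, skip_every=None):
    """Scale rainfall volume and optionally thin its frequency.
    amount_factor < 1 pushes towards drought, > 1 towards flooding;
    skip_every=n zeroes out every nth watering event."""
    out, event = [], 0
    for mm in schedule:
        if mm > 0:
            event += 1
            if skip_every and event % skip_every == 0:
                out.append(0.0)          # skipped event: lower frequency
                continue
        out.append(mm * amount_factor)
    return out

drought = treatment(baseline_mm, 0.5, skip_every=2)  # less, rarer rain
flood   = treatment(baseline_mm, 2.0)                # double the volume
```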

After two months the plants were taken to the laboratory, where the researchers recorded water turbidity (organic and inorganic suspended matter resulting from detritivore activity), a surrogate for total nutrient availability in freshwater ecosystems. They then dissected each bromeliad by removing each leaf, washing it separately in running water, and filtering this water through sieves before analyzing its content.

Credit: 
Fundação de Amparo à Pesquisa do Estado de São Paulo

Magnetic 'T-Budbots' made from tea plants kill and clean biofilms (video)

image: Magnetically propelled microbots can clean biofilms and kill bacteria.

Image: 
American Chemical Society

Biofilms -- microbial communities that form slimy layers on surfaces -- are difficult to treat and remove, often because the microbes release molecules that block the entry of antibiotics and other therapies. Now, researchers reporting in ACS Applied Materials & Interfaces have made magnetically propelled microbots derived from tea buds, which they call "T-Budbots," that can dislodge biofilms, release an antibiotic to kill bacteria, and clean away the debris. Watch a video of the T-Budbots here.

Many hospital-acquired infections involve bacterial biofilms that form on catheters, joint prostheses, pacemakers and other implanted devices. These microbial communities, which are often resistant to antibiotics, can slow healing and cause serious medical complications. Current treatment includes repeated high doses of antibiotics, which can have side effects, or in some cases, surgical replacement of the infected device, which is painful and costly. Dipankar Bandyopadhyay and colleagues wanted to develop biocompatible microbots that could be controlled with magnets to destroy biofilms and then scrub away the mess. The team chose Camellia sinensis tea buds as the raw material for their microbots because the buds are porous, non-toxic, inexpensive and biodegradable. Tea buds also contain polyphenols, which have antimicrobial properties.

The researchers ground some tea buds and isolated porous microparticles. Then, they coated the microparticles' surfaces with magnetite nanoparticles so that they could be controlled by a magnet. Finally, the antibiotic ciprofloxacin was embedded within the porous structures. The researchers showed that the T-Budbots released the antibiotic primarily under acidic conditions, which occur in bacterial infections. The team then added the T-Budbots to bacterial biofilms in dishes and magnetically steered them. The microbots penetrated the biofilm, killed the bacteria and cleaned the debris away, leaving a clear path in their wake. Degraded remnants of the biofilm adhered to the microbots' surfaces. The researchers note that this was a proof-of-concept study, and further optimization is needed before the T-Budbots could be deployed to destroy biofilms in the human body.

Credit: 
American Chemical Society

Halt post-disturbance logging in forests

image: Burned eucalypt forest in Australia. Avoiding overall post-disturbance logging after such major disturbances can help to maintain biodiversity.

Image: 
(Photo: Simon Thorn / University of Wuerzburg)

Storms, fires, bark beetles: Many forests around the world are increasingly affected by these and other natural disturbances. It is common practice to eliminate the consequences of these disturbances - in other words, to harvest damaged trees as quickly as possible. Spruce trees attacked by bark beetles are removed from the forest, as are dried-out beeches and trees thrown to the ground by storms.

"However, this practice is an additional disturbance that has a negative impact on biodiversity," says Dr. Simon Thorn, forest ecologist from Julius-Maximilians-Universität (JMU) Würzburg in Bavaria, Germany. During such logging operations, soil is damaged, most dead wood is removed and structures such as folded up root plates are lost. "That is why a certain proportion of such disturbed forests should be excluded from overall logging operations," Thorn says.

Evidence-based benchmarks calculated for the first time

Forests in which natural disturbances are preserved without human intervention are among the most threatened habitats in the world. "Up to now, there have been no evidence-based benchmarks on what proportion of land in a naturally disturbed forest should be left in order to promote the biodiversity of plants, birds, insects and fungal species," says the JMU scientist.

To close this gap, an international research team led by Simon Thorn has analyzed a global dataset on natural forest disturbances. In the journal Nature Communications, the scientists conclude that if around 75 percent of a naturally disturbed forest area is left uncleared, 90 percent of its original species richness will be preserved. If only half of a disturbed forest is left untouched, around a quarter of the species will be lost. "These numbers can serve as a simple rule of thumb for leaving natural disturbances in forests unlogged," says Thorn.

Credit: 
University of Würzburg

NASA finds Tropical Storm Lowell's center north of strongest side

image: On Sept. 23 at 5:55 a.m. EDT (0955 UTC), the MODIS instrument aboard NASA's Aqua satellite gathered temperature information about Tropical Storm Lowell's cloud tops. MODIS found the most powerful thunderstorms (red) were east and south of the center, where temperatures were as cold as or colder than minus 70 degrees Fahrenheit (minus 56.6 Celsius).

Image: 
NASA/NRL

NASA's Aqua satellite used infrared light to identify the strongest storms and coldest cloud top temperatures in Tropical Storm Lowell and found them south of the center of circulation.

Lowell is moving through the Eastern Pacific Ocean and far from land areas. There are no coastal watches or warnings in effect.

Infrared Data Reveals Powerful Storms

On Sept. 23 at 5:55 a.m. EDT (0955 UTC), the Moderate Resolution Imaging Spectroradiometer or MODIS instrument aboard NASA's Aqua satellite gathered temperature information about Lowell's cloud tops. Infrared data provides temperature information, and the strongest thunderstorms that reach high into the atmosphere have the coldest cloud top temperatures.

MODIS found the most powerful thunderstorms were south of the center, where temperatures were as cold as or colder than minus 70 degrees Fahrenheit (minus 56.6 Celsius). Cloud top temperatures that cold indicate strong storms with the potential to generate heavy rainfall.
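The threshold test described here is simple to express in code. The sketch below assumes a hypothetical array of MODIS cloud-top brightness temperatures in kelvin; only the minus 70 degrees Fahrenheit (minus 56.6 Celsius) cut-off comes from the text.

```python
import numpy as np

def strong_storm_mask(cloud_top_k: np.ndarray) -> np.ndarray:
    """Flag pixels whose cloud-top temperature is at or below
    -70 F (-56.6 C, about 216.5 K), the threshold cited for
    storms with heavy-rainfall potential."""
    threshold_k = (-70 - 32) * 5 / 9 + 273.15   # Fahrenheit -> kelvin
    return cloud_top_k <= threshold_k

# Hypothetical 2x2 scene of cloud-top temperatures in kelvin.
scene = np.array([[210.0, 220.0],
                  [215.0, 230.0]])
print(strong_storm_mask(scene))   # [[ True False], [ True False]]
```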

Lowell's center is located near the northern edge of a band of strong thunderstorms and deep convection which extends over the eastern and southern portion of the circulation due to moderate northwesterly wind shear in the mid-levels below the cirrus layer (high cloud).

Lowell's Status on Sept. 23

At 5 a.m. EDT (0900 UTC) on Sept. 23, the center of Tropical Storm Lowell was located near latitude 19.8 degrees north and longitude 119.9 degrees west. Lowell is centered about 680 miles (1,090 km) west-southwest of the southern tip of Baja California, Mexico.

Lowell is moving toward the west-northwest near 12 mph (19 kph), and this general motion is expected to continue today.  Maximum sustained winds have increased to near 50 mph (85 kph) with higher gusts. The estimated minimum central pressure is 1001 millibars.

Lowell's Forecast

NOAA's National Hurricane Center expects Lowell to move in a westward motion beginning early Thursday, Sept. 24 and continue into the weekend. Little change in strength is forecast during the next few days.

NASA Researches Tropical Cyclones

Hurricanes/tropical cyclones are the most powerful weather events on Earth. NASA's expertise in space and scientific exploration contributes to essential services provided to the American people by other federal agencies, such as hurricane weather forecasting.

For more than five decades, NASA has used the vantage point of space to understand and explore our home planet, improve lives and safeguard our future. NASA brings together technology, science, and unique global Earth observations to provide societal benefits and strengthen our nation. Advancing knowledge of our home planet contributes directly to America's leadership in space and scientific exploration.

For forecast updates on hurricanes, visit: http://www.hurricanes.gov

By Rob Gutro
NASA's Goddard Space Flight Center

Credit: 
NASA/Goddard Space Flight Center

Some polar bears in far north are getting short-term benefit from thinning ice

A small subpopulation of polar bears lives on what used to be thick, multiyear sea ice far above the Arctic Circle. The roughly 300 to 350 bears in Kane Basin, a frigid channel between Canada's Ellesmere Island and Greenland, make up about 1-2% of the world's polar bears.

New research shows that Kane Basin polar bears are doing better, on average, in recent years than they were in the 1990s. The study, published Sept. 23 in Global Change Biology, finds the bears are healthier as conditions are warming because thinning and shrinking multiyear sea ice is allowing more sunlight to reach the ocean surface, which makes the system more ecologically productive.

"We find that a small number of the world's polar bears that live in multiyear ice regions are temporarily benefiting from climate change," said lead author Kristin Laidre, a polar scientist at the University of Washington Applied Physics Laboratory's Polar Science Center.

If greenhouse gases continue to build up in the atmosphere and the climate keeps warming, within decades these polar bears will likely face the same fate as their southern neighbors already suffering from declining sea ice.

"The duration of these benefits is unknown. Under unmitigated climate change, we expect the Kane Basin bears to run into the same situation as polar bears in the south -- it's just going to happen later," Laidre said. "They'll be one of the last subpopulations that will be negatively affected by climate change."

All of the world's 19 polar bear subpopulations, including Kane Basin, are experiencing a shorter on-ice hunting season, according to a 2016 study led by Laidre. This makes it hard for the animals, which can weigh more than 1,200 pounds as adults, to meet their nutritional needs. Polar bears venture out on sea ice to catch seals. In summer when the sea ice melts, the polar bears fast on land.

Laidre led a recent study showing that in the Baffin Bay polar bear subpopulation, which includes about 2,800 bears living just south of Kane Basin, adult females are thinner and are having fewer cubs as the summer open-water season -- when they must fast on land -- grows longer.

"Kane Basin is losing its multiyear ice, too, but that doesn't have the same effect on the polar bears' ability to hunt," Laidre said. "Multiyear ice becomes annual ice, whereas annual ice becomes open water, which is not good for polar bears."

The new paper looked at Kane Basin bears using satellite tracking data and direct physical measurements, comparing the period from 1993 to 1997 with a more recent period, from 2012 to 2016. The body condition, or fatness, improved for all ages of males and females. The average number of cubs per litter, another measure of the animals' overall health, was unchanged.

Satellite tags showed the Kane Basin polar bears traveled across larger areas in recent years, covering twice as much distance and ranging farther from their home territory.

"They now have to move over larger areas," Laidre said. "The region is transitioning into this annual sea ice that is more productive but also more dynamic and broken up."

Observations show a profound shift in the sea ice in Kane Basin between the two study periods. In the 1990s, about half the area was covered in multiyear ice in the peak of summer, while in the 2010s the region was almost completely annual ice, which melts to open water in summer.

Even though there's now more open water, the marine ecosystem has become more productive. Annual sea ice allows more sunlight through, so more algae grow, which supports more fish and in turn attracts seals.

"Two decades ago, scientists hypothesized that climate change could temporarily benefit polar bears in multiyear ice regions over the short term, and our observations support that," Laidre said.

The subpopulation on the other side of Ellesmere Island, in Canada's Norwegian Bay, could be in a similar situation, she said, though no data exist for those animals.

If conditions continue to warm, these northernmost polar bears will likely face the same fate as their southern neighbors; farther north, Kane Basin bears have only much deeper water to turn to.

"It's important not to jump to conclusions and suggest that the High Arctic, which historically was covered by multiyear sea ice, is going to turn into a haven for polar bears," said Laidre, who is also an associate professor in the UW School of Aquatic and Fishery Sciences. "The Arctic Ocean around the North Pole is basically an abyss, with very deep waters that will never be as productive as the shallower waters to the south where most polar bears live.

"So we are talking about temporary benefits in a limited area and to a very small number of bears."

Credit: 
University of Washington