
Evaluating the effect of plain afforestation project and future spatial suitability in Beijing

image: Boxplots of the maximum NDVI at the patch scale in 2012, for patches categorized as consistently decreasing (30 in total) (a) and increasing (33 in total) (b) after the afforestation. The numbers on the x-axis indicate the ID of each forest segment.

Image: 
©Science China Press

Taking the "One Million-Mu (666 km2)" Plain Afforestation (Phase I) Project as an example, the authors characterized the growth status and temporal trends of forest patches using time-series NDVI. They selected forest patches greater than 50 ha in size as the main objects of analysis. Using the maximum NDVI in the base year (2012), they calculated the change ratio of forest patches in different years. These ratios were then divided into three categories using the natural-break approach: decreasing, relatively stable (no notable change), and increasing. Based on these trend categories, they analyzed the initial status of vegetation growth in the base year 2012 (Fig. 1). They found that the greenness of forest patches categorized as decreasing (30 in total) was relatively high in 2012 (i.e., ranging from 0.6 to 0.8), and that these patches had relatively low heterogeneity (Fig. 1a). This suggests that the vegetation was already growing well before afforestation. By contrast, forest patches categorized as increasing (33 in total) had relatively low greenness in 2012 and high heterogeneity (e.g., from 0.2 to 0.8) across pixels within each patch (Fig. 1b).
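As a rough sketch (not the authors' code), the change-ratio classification described above can be expressed in a few lines. The study derived its category boundaries with the natural-break (Jenks) approach; the fixed thresholds and segment IDs below are hypothetical stand-ins for illustration only.

```python
# Illustrative sketch: classify forest patches by the change ratio of their
# maximum NDVI relative to the 2012 base year. Thresholds are placeholders
# for the natural-break boundaries used in the study.

def change_ratio(ndvi_year, ndvi_base):
    """Relative change of a patch's maximum NDVI versus the base year."""
    return (ndvi_year - ndvi_base) / ndvi_base

def classify(ratio, low=-0.05, high=0.05):
    """Map a change ratio to one of the three trend categories."""
    if ratio < low:
        return "decreasing"
    if ratio > high:
        return "increasing"
    return "stable"

# Hypothetical (base-year NDVI, later-year NDVI) pairs per segment:
patches = {"seg_01": (0.72, 0.61), "seg_02": (0.35, 0.52), "seg_03": (0.55, 0.56)}
trends = {k: classify(change_ratio(cur, base)) for k, (base, cur) in patches.items()}
print(trends)
```

A patch whose greenness falls well below its 2012 baseline lands in "decreasing", mirroring the categories analyzed in Fig. 1.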

The authors found that the growth status of some forest patches worsened during the Phase I afforestation project. Farmland patches with relatively high greenness and low heterogeneity were likely to deteriorate because of the disturbance caused by afforestation. The project also occupied large areas of fertile farmland, reducing its connectivity and compactness. These findings are helpful for future urban greenspace planning.

Drawing on the results of Phase I and the impact of urbanization on green space, they constructed a series of spatial variables and generated a suitability map for the next "New Round of One Million-Mu (666 km2) Afforestation" project (Phase II). The key to this evaluation is to maximize the suitability of vegetation growth based on the spatial distribution of patches, the initial growing status of vegetation (e.g., forest coverage, NDVI value, heterogeneity), and the temporal trend of greenspace over a given historical period. They modeled the distribution of forest patches in the Phase II project and assessed their suitability from an urban-planning perspective (Fig. 2). In addition, basic farmland is not allowed to be afforested, while abandoned built-up areas are preferentially considered for afforestation. Unlike traditional approaches that operate at the pixel level, they used segments as the basic modeling unit in this study to represent the shape of forest patches appropriately. Their modeling results suggest that most afforested areas are located in peripheral districts in the plain area of Beijing. Meanwhile, human intervention that occupies farmland around existing forest patches is not recommended.

The authors derived the spatial map of reforestation areas at the segment scale for the Phase II afforestation project (Fig. 3). Comparing the Phase I and Phase II results shows that their overall patterns are similar: most of the forest patches are in Tongzhou, Daxing, Shunyi, and other districts with flat terrain. These regions are not only a planning focus for urban greening but also the main afforestation zones suggested in the Beijing Master Plan (2016-2035).

This study evaluated the implementation of urban green space planning using a multi-source dataset. Unlike traditional approaches that focus only on the current status of vegetation, the authors proposed a dynamic approach that evaluates the growth status of green space while taking into account the trend of greenness over the years. They developed a modeling approach to explore the spatial distribution of urban greenspace, considering patch distribution, the initial growing status of vegetation, temporal trends of growth, and particular planning constraints. The derived results can support decision making in urban greenspace planning, and the proposed evaluation framework can be extended to the national scale, offering scientific insights into the monitoring and planning of regional afforestation projects.

Credit: 
Science China Press

True size of prehistoric mega-shark finally revealed

image: Palaeoartist reconstruction of a 16m adult Megalodon. Reconstruction by Oliver E. Demuth.

Image: 
Oliver E. Demuth

A new study led by Swansea University and the University of Bristol has revealed the size of the legendary giant shark Megalodon, including fins that are as large as an adult human.

There is a grim fascination in determining the size of the largest sharks, but this can be difficult for fossil forms where teeth are often all that remain.

Today, the most fearsome living shark is the Great White, at over six metres (20 feet) long, which bites with a force of two tonnes.

Its fossil relative, the big tooth shark Megalodon, star of Hollywood movies, lived from 23 to around three million years ago, was over twice the length of a Great White and had a bite force of more than ten tonnes.

The fossils of the Megalodon are mostly huge triangular cutting teeth bigger than a human hand.

Jack Cooper and colleagues from Swansea University and the University of Bristol used a number of mathematical methods to pin down the size and proportions of this monster, by making close comparisons to a diversity of living relatives with ecological and physiological similarities to Megalodon.

The project was supervised by shark expert Dr Catalina Pimiento from Swansea University and Professor Mike Benton, a palaeontologist at the University of Bristol. Dr Humberto Ferrón from Bristol also collaborated.

Jack Cooper, who will now start his PhD at Swansea University, said: "I have always been mad about sharks. As an undergraduate I worked and dived with Great Whites in South Africa - protected by a steel cage of course. It's that sense of danger, but also that sharks are such beautiful and well-adapted animals, that makes them so attractive to study.

"Megalodon was actually the very animal that inspired me to pursue palaeontology in the first place at just six years old, so I was over the moon to get a chance to study it.

"This was my dream project. But to study the whole animal is difficult considering that all we really have are lots of isolated teeth."

Previously the fossil shark, known formally as Otodus megalodon, was only compared with the Great White. Jack and his colleagues, for the first time, expanded this analysis to include five modern sharks.

Dr Catalina Pimiento said: "Megalodon is not a direct ancestor of the Great White but is equally related to other macropredatory sharks such as the Makos, Salmon shark and Porbeagle shark, as well as the Great White. We pooled detailed measurements of all five to make predictions about Megalodon."

Professor Benton added: "Before we could do anything, we had to test whether these five modern sharks changed proportions as they grew up. If, for example, they had been like humans, where babies have big heads and short legs, we would have had some difficulties in projecting the adult proportions for such a huge extinct shark.

"But we were surprised, and relieved, to discover that, in fact, the babies of all these modern predatory sharks start out as little adults, and they don't change in proportion as they get larger."

Jack Cooper added: "This means we could simply take the growth curves of the five modern forms and project the overall shape as they get larger and larger - right up to a body length of 16 metres."

The results suggest that a 16-metre-long Otodus megalodon likely had a head around 4.65 metres long, a dorsal fin approximately 1.62 metres tall and a tail around 3.85 metres high.

This means an adult human could stand on the back of this shark and would be about the same height as the dorsal fin.
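The projection logic the team describes (constant proportions during growth) can be sketched as simple linear scaling. The fractions below are back-calculated from the published 16 m estimates; this is an illustration of isometric scaling, not the researchers' actual model.

```python
# Sketch of isometric scaling: because the modern analogue sharks keep
# constant proportions as they grow, each body part is a fixed fraction of
# total length. Fractions are derived from the reported 16 m adult estimates
# (head 4.65 m, dorsal fin 1.62 m, tail 3.85 m).

PROPORTIONS = {            # part length as a fraction of total body length
    "head": 4.65 / 16.0,
    "dorsal_fin": 1.62 / 16.0,
    "tail": 3.85 / 16.0,
}

def project_parts(total_length_m):
    """Scale every part linearly with total length (isometric growth)."""
    return {part: round(frac * total_length_m, 2)
            for part, frac in PROPORTIONS.items()}

print(project_parts(16.0))   # the adult Megalodon estimate
print(project_parts(6.0))    # a Great-White-sized individual, for comparison
```

Under this assumption, halving the body length halves every part, which is why the growth curves of the five modern sharks could be projected "larger and larger" without changing shape.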

The reconstruction of the size of Megalodon body parts represents a fundamental step towards a better understanding of the physiology of this giant, and the intrinsic factors that may have made it prone to extinction.

Credit: 
Swansea University

New dating of Nebra sky disk

image: The condition of the Nebra sky disk before being transferred to the Landesmuseum Halle an der Saale.

Image: 
Hildegard Burri-Bayer

FRANKFURT. The Nebra sky disk is one of Germany's most significant archaeological finds and was included in the UNESCO Memory of the World Register in 2013. According to the finders, it was discovered in an illegal excavation in 1999 together with Bronze Age swords, axes and bracelets. This discovery context was important for the scientific dating, as the disk itself could not be dated either scientifically or archaeologically by comparison with other objects. Many years of investigation by several research groups therefore attempted to verify both the attribution to the supposed discovery site and the common origin of the objects, independently of the vague information given by the looters.

Rupert Gebhard, Director of the Archäologische Staatssammlung in Munich, and Rüdiger Krause, Professor of Prehistory and Early European History at Goethe University Frankfurt, have now extensively analysed the discovery circumstances and research results on the Nebra sky disk. Their conclusion: the site that was until now considered the discovery site, and which was investigated in subsequent excavations, is with high probability not where the looters found the disk. Furthermore, there is no convincing evidence that the Bronze Age swords, axes and bracelets form an ensemble of common origin. It must therefore be assumed that this is not a typical Bronze Age deposit and that the disk was not found together with the other objects in an original state at the excavation site.

According to the archaeologists, this means that the disk must be investigated and evaluated as an individual find. Culturally and stylistically, the sky disk cannot be fitted into the Early Bronze Age motif world of the beginning of the second millennium B.C. On the contrary, clearer references can be made to the motif world of the Iron Age of the first millennium B.C. Given this divergent evidence and their new assessment, Gebhard and Krause argue that all previous, sometimes far-reaching cultural-historical conclusions must be discussed anew with an open mind, and that the disk must be interpreted and evaluated in different contexts than before. The basis for this must be the publication of all previously unpublished data and facts.

Credit: 
Goethe University Frankfurt

Researchers probe Soldier sleep deprivation effects

image: The study shows how the complex set of molecular and fluid dynamics that comprises the glymphatic system - the brain's unique process of waste removal - is synchronized with the master internal clock that regulates the sleep-wake cycle.

Image: 
University of Rochester Medical Center

RESEARCH TRIANGLE PARK, N.C. - A new Army-funded study looks at the effects of sleep deprivation, which can greatly affect Soldiers on the battlefield.

Research conducted at the University of Rochester Medical Center and funded by the Army Research Office, an element of the U.S. Army Combat Capabilities Development Command's Army Research Laboratory, suggests that people who rely on sleeping during daytime hours are at greater risk for developing neurological disorders.

The study, published in Nature Communications, details how the complex set of molecular and fluid dynamics that comprises the glymphatic system - the brain's unique process of waste removal - is synchronized with the master internal clock that regulates the sleep-wake cycle.

"Establishing a role for communication between astrocytes and the significant impacts of circadian timing on glymphatic clearance dynamics represent a major step in understanding the fundamental process of waste clearance regulation in the brain," said Dr. Frederick Gregory, a program manager for ARO's neurophysiology of cognition initiative. "This knowledge is crucial to developing future countermeasures that offset the deleterious effects of sleep deprivation and addresses future multi-domain military operation requirements for Soldiers to sustain performance over longer periods without the ability to rest."

The glymphatic system, first discovered by the URMC Nedergaard lab in 2012, is a network that piggybacks on the brain's blood circulation and consists of layers of plumbing, with the inner blood vessel encased by a 'tube' that transports cerebrospinal fluid. The system pumps the fluid through brain tissue primarily during sleep, washing away toxic proteins and other waste.

"These findings show that glymphatic system function is not solely based on sleep or wakefulness, but by the daily rhythms dictated by our biological clock," said neuroscientist Maiken Nedergaard, M.D., D.M.Sc., co-director of the Center for Translational Neuromedicine at URMC and senior author of the study.

The research team and others have shown the role that blood pressure, heart rate, circadian timing, and depth of sleep play in the glymphatic system's function and the chemical signaling that occurs in the brain to turn the system on and off. They have also shown how disrupted sleep or trauma can cause the system to break down and allow toxic proteins to accumulate in the brain, potentially giving rise to a number of neurodegenerative diseases, such as Alzheimer's.

Circadian rhythms, 24-hour body clocks, are maintained in a small area of the brain called the suprachiasmatic nucleus. This clock regulates several important biological functions, including the sleep-wake cycle.

The new study, conducted in mice, showed that when the animals were anesthetized all day long, their glymphatic system still only functioned during their typical rest period - mice are nocturnal, so their sleep-wake cycle is the opposite of humans.

"Circadian rhythms in humans are tuned to a day-wake, night-sleep cycle," said Dr. Lauren Hablitz, first author of the new study and a research assistant professor in the Center for Translational Neuromedicine at URMC. "Because this timing also influences the glymphatic system, these findings suggest that people who rely on cat naps during the day to catch up on sleep or work the night shift may be at risk for developing neurological disorders. In fact, clinical research shows that individuals who rely on sleeping during daytime hours are at much greater risk for Alzheimer's and dementia along with other health problems."

The study singles out cells called astrocytes that play multiple functions in the brain. Scientists believe that astrocytes in the suprachiasmatic nucleus help regulate circadian rhythms. Astrocytes also serve as gatekeepers that control the flow of cerebrospinal fluid throughout the central nervous system. The results of the study suggest that communication between astrocytes in different parts of the brain may share the common goal of optimizing the glymphatic system's function during sleep.

The researchers also found that during wakefulness, the glymphatic system diverts cerebrospinal fluid to lymph nodes in the neck. Because the lymph nodes are key waystations in the regulation of the immune system, the research suggests that cerebrospinal fluid may represent a fluid clock that helps wake up the body's infection fighting capabilities during the day.

Credit: 
U.S. Army Research Laboratory

New mathematical method shows how climate change led to fall of ancient civilization

image: This figure shows the settlements of the Indus Valley Civilization during different phases of its evolution. RIT Assistant Professor Nishant Malik developed a mathematical method that shows climate change likely caused the rise and fall of the ancient civilization.

Image: 
RIT

A Rochester Institute of Technology researcher developed a mathematical method that shows climate change likely caused the rise and fall of an ancient civilization. In an article recently featured in the journal Chaos: An Interdisciplinary Journal of Nonlinear Science, Nishant Malik, assistant professor in RIT's School of Mathematical Sciences, outlined the new technique he developed and showed how shifting monsoon patterns led to the demise of the Indus Valley Civilization, a Bronze Age civilization contemporary to Mesopotamia and ancient Egypt.

Malik developed a method to study paleoclimate time series, sets of data that tell us about past climates using indirect observations. For example, by measuring the presence of a particular isotope in stalagmites from a cave in South Asia, scientists were able to develop a record of monsoon rainfall in the region for the past 5,700 years. But as Malik notes, studying paleoclimate time series poses several problems that make it challenging to analyze them with mathematical tools typically used to understand climate.

"Usually the data we get when analyzing paleoclimate is a short time series with noise and uncertainty in it," said Malik. "As far as mathematics and climate is concerned, the tool we use very often in understanding climate and weather is dynamical systems. But dynamical systems theory is harder to apply to paleoclimate data. This new method can find transitions in the most challenging time series, including paleoclimate, which are short, have some amount of uncertainty and have noise in them."

There are several theories about why the Indus Valley Civilization declined--including invasion by nomadic Indo-Aryans and earthquakes--but climate change appears to be the most likely scenario. But until Malik applied his hybrid approach-- rooted in dynamical systems but also draws on methods from the fields of machine learning and information theory--there was no mathematical proof. His analysis showed there was a major shift in monsoon patterns just before the dawn of this civilization and that the pattern reversed course right before it declined, indicating it was in fact climate change that caused the fall.
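The core task Malik's method addresses, finding a transition in a short, noisy series, can be illustrated with a deliberately simple sketch. This is NOT the hybrid dynamical-systems approach from the paper; it is a crude sliding-window change-point estimate on synthetic data, shown only to make the idea of a "transition in a paleoclimate time series" concrete.

```python
# Generic sketch: locate the point where a short, noisy series shifts regime,
# e.g. a monsoon proxy moving from a wetter to a drier state. Synthetic data.
import random

random.seed(42)
# Hypothetical proxy record: a wetter regime followed by a drier one, plus noise.
series = [1.0 + random.gauss(0, 0.1) for _ in range(50)] + \
         [0.4 + random.gauss(0, 0.1) for _ in range(50)]

def detect_shift(x, w=10):
    """Return the index where the means of the preceding and following
    windows of width w differ most (a crude change-point estimate)."""
    def mean(v):
        return sum(v) / len(v)
    best_i, best_gap = None, -1.0
    for i in range(w, len(x) - w):
        gap = abs(mean(x[i - w:i]) - mean(x[i:i + w]))
        if gap > best_gap:
            best_i, best_gap = i, gap
    return best_i

print(detect_shift(series))  # should land near the true transition at index 50
```

Real paleoclimate series are far messier, which is exactly why Malik's method adds machine-learning and information-theoretic machinery on top of dynamical systems theory rather than relying on windowed means.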

Malik said he hopes the method will allow scientists to develop more automated methods of finding transitions in paleoclimate data and leads to additional important historical discoveries. The full text of the study is published in Chaos: An Interdisciplinary Journal of Nonlinear Science.

Credit: 
Rochester Institute of Technology

Innovative biocontainment unit shows promise to protect healthcare workers

image: Dr. Jason Chang, assistant professor in the department of emergency medicine at the University of Pittsburgh Medical Center, performs a simulated intubation with the individual biocontainment unit developed by the U.S. Army Combat Capabilities Development Command's Army Research Laboratory and UPMC.

Image: 
Courtesy photo

The U.S. Army partnered with the University of Pittsburgh Medical Center to create a biocontainment unit that could help protect healthcare workers caring for COVID-19 patients.

Researchers from the U.S. Army Combat Capabilities Development Command's Army Research Laboratory and UPMC created an individual biocontainment unit that uses negative pressure to suction the air from around a patient and filter out viral particles. This prevents environmental contamination and limits exposure to SARS-CoV-2.

"Outside of the current pandemic, the IBU could be rapidly deployed to isolate patients with any respiratory illness," said study co-author Dr. David Turer, a plastic surgeon who recently completed his residency at UPMC. "It's easy to see this technology used to contain influenza, MERS, or tuberculosis, particularly in places lacking advanced hospital infrastructure."

The device and the results of safety testing are described in a study published today in the Annals of Emergency Medicine. This research was first reported by the Army in April during an effort to identify solutions to help combat the spread of COVID-19.

At that time, initial approaches to minimize viral spread involved the use of plexiglass barriers, such as intubation boxes, to limit healthcare worker exposure when inserting a breathing tube down a patient's throat. While these barriers may mitigate exposure to larger droplets, the research team hypothesized that they do little to stop the spread of smaller aerosolized viral particles.

Army researcher and study co-author Dr. Cameron Good, Turer and a team of colleagues developed prototype IBUs and tested them by performing simulated medical procedures. Using validated techniques adopted from the medical research laboratory community, they tested the IBU and a plexiglass intubation box for their ability to contain virus-sized particles from a simulated COVID-19 patient.

"Greater than 99.99% of the virus-sized aerosols were trapped by the IBU and prevented from escaping into the room," Good said. "When we tested the passive intubation box, we observed more than three times the aerosol concentration outside the box--where the healthcare provider is located--than inside the box. It is not safe to use these intubation boxes without actively filtering the air."
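The two figures quoted above can be worked through as back-of-the-envelope arithmetic. The particle counts below are hypothetical, chosen only to be consistent with the reported ">99.99% trapped" and "more than three times the concentration outside" results.

```python
# Back-of-the-envelope sketch of the two containment metrics, using
# hypothetical particle-concentration readings (not the study's raw data).

def fraction_trapped(released, escaped):
    """Share of released virus-sized aerosols kept inside the enclosure."""
    return 1.0 - escaped / released

def outside_inside_ratio(outside, inside):
    """Aerosol concentration outside the barrier relative to inside."""
    return outside / inside

# Hypothetical readings consistent with the reported figures:
print(fraction_trapped(released=100_000, escaped=5))      # 0.99995, i.e. >99.99%
print(outside_inside_ratio(outside=330.0, inside=100.0))  # 3.3, i.e. >3x outside
```

The second metric is the counterintuitive one: a passive box can concentrate escaping aerosols near the provider, which is why the team concluded active filtration is necessary.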

The Food and Drug Administration recently revoked an emergency use authorization for passive plexiglass intubation barriers and mandated the use of negative pressure systems, such as the IBU, to prevent viral spread.

The team is actively developing a portable vacuum and filter system that can run on a battery pack for use in austere environments where energy resources are limited, which is of particular interest for military and humanitarian applications.

"The ability to isolate COVID-19 patients at the bedside is key to stopping viral spread in medical facilities and onboard military ships and aircraft, particularly to limit transmission through close quarters or shared ventilation systems," Good said.

The FDA is considering a recently submitted emergency use authorization. Once granted, hospitals and military units will be able to use IBUs immediately to protect healthcare workers caring for COVID-19 patients and to prepare for future surges.

"None of this would have been possible without the extremely dedicated clinicians and engineers who rapidly designed, built, tested and validated the equipment," Good said. "I want to thank Dr. Robert Turer [David Turer's brother] from Vanderbilt University Medical Center, Nick Karlowsky from Filtech, Inc., Drs. Lucas Dvoracek, J. Peter Rubin and Jason Chang from UPMC, and Ben Schilling and Dr. Heng Ban from the University of Pittsburgh. It truly takes a team."

Credit: 
U.S. Army Research Laboratory

Comprehensive look at US fuel economy standards shows big savings on fuel and emissions

image: The graph shows vehicle miles traveled versus fuel consumption from 1965 through 2018 in the United States. While travel increased significantly during that time, fuel use dropped due, in large part, to the fuel economy standards and the fuel efficiency technologies that were developed and implemented to meet the standards.

Image: 
Graph courtesy of the researchers (Rebecca Ciez); Redesign by Bumper DeJesus

In one of the first comprehensive assessments of the fuel economy standards in the United States, Princeton University researchers found that, over their 40-year history, the standards helped reduce reliance on foreign oil producers, cut greenhouse gas emissions, and saved consumers money.

Using data on household spending, oil use, and greenhouse gas emissions, the researchers found that the standards (known as the CAFE standards), first enacted in 1975 to reduce dependence on foreign oil after the oil crisis, set well-defined societal objectives and were cost-effective, fair, durable and adaptive. The standards required automakers to produce more efficient vehicles over time, increasing the number of miles per gallon required of their vehicle fleets. The researchers estimate that the standards saved $5 trillion in fuel costs and prevented 14 billion metric tons of carbon from being released into the atmosphere, the equivalent of the United States eliminating all emissions from all sectors for nearly three years.

"It has been one of the most effective policies to date," said Judi Greenwald, a co-author of the study, former top U.S. Department of Energy official and non-resident fellow at Princeton University's Andlinger Center for Energy and the Environment.

The paper, coauthored by Greenwald, Rebecca Ciez and David Greene, was published on August 23 in the journal Energy Policy. Ciez was a Distinguished Postdoctoral Fellow at the Andlinger Center and Greene is a research professor in the Department of Civil and Environmental Engineering at the University of Tennessee, Knoxville. Ciez has accepted a position as assistant professor in mechanical engineering and environmental and ecological engineering at Purdue University.

"There really hasn't been any comprehensive lookback to day one of the standards to consider what their impacts have been, how they changed over time, whether the potential threats to their effectiveness materialized or not, and their overall impact," said Greene.

The researchers noted that the policies helped, in part, to keep the rate of yearly growth in U.S. gasoline consumption to 0.2% since 1975. The policy, in addition to fluctuations in gas prices, reduced oil imports and saved 2 trillion gallons of gasoline, enough to fuel all the light-duty vehicles in the United States for fifteen years.

"These standards have been remarkably effective from both an environmental perspective and an energy security perspective, and most people don't realize it," said Greenwald.

The authors said these types of regulations are more effective at improving fuel economy than other policy tools, like a gasoline tax, because they don't rely on the consumer to make the long-term fuel-efficient choice and, therefore, gain cost benefits at the pump. The fuel economy standards move the calculation to regulators and require that manufacturers improve fuel economy across their product lines using technologies that may cost a little more but save consumers much more on fuel in the long run.

A prior study by Greene found that over the lifetime of the policy, the technology for efficiency upgrades increased the cost of cars by an average of $4,800, but yielded $16,000 in savings for consumers at the pump.
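For concreteness, the per-vehicle arithmetic from that earlier study works out as follows (a simple restatement of the two published figures, nothing more):

```python
# Per-vehicle cost-benefit arithmetic from Greene's earlier study:
# efficiency technology added ~$4,800 to the average car but saved ~$16,000
# in fuel over the vehicle's lifetime.

tech_cost = 4_800        # average added vehicle cost (USD)
fuel_savings = 16_000    # average lifetime savings at the pump (USD)

net_benefit = fuel_savings - tech_cost
benefit_cost_ratio = fuel_savings / tech_cost

print(f"net benefit per vehicle: ${net_benefit:,}")
print(f"benefit-cost ratio: {benefit_cost_ratio:.1f}x")
```

Roughly every dollar spent on efficiency technology returned more than three dollars in fuel savings, which is the core of the authors' cost-effectiveness argument.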

Dan Sperling, founding director of the Institute of Transportation Studies at the University of California, Davis, who is unaffiliated with the study, called it an "important and authoritative history and analysis." "There is nothing like this in the literature," said Sperling, who is also the Distinguished Blue Planet Prize Professor of Civil Engineering and Environmental Science and Policy at UC Davis and a member of the California Air Resources Board.

Greenwald said the standards have evolved in ways that continue to benefit and serve the public and have endured various administrations and political tides. It is a testament to their initial design, as well as regulators' adaptive responses to changing circumstances over time. In 2010, two sets of vehicle standards affecting automakers, one for greenhouse gas emissions and one for fuel efficiency, were harmonized so that manufacturers could meet one set of standards when designing new vehicles.

The analysis concludes with a recommendation to continue increasing the stringency of the standards based on the best available data and analysis, as regulators have done historically. The most recent rules promulgated by the Trump administration loosen the fuel efficiency requirements by dropping the annual efficiency increase from five percent to one and a half percent through 2026. Given that transportation is the largest source of U.S. greenhouse gas (GHG) emissions and that people keep their cars for approximately 10 years, the researchers said this would severely stymie environmental progress. The Rhodium Group, an independent research organization unaffiliated with the study, estimates that the policy change would achieve only one-fifth of the greenhouse gas reductions that the Obama-era policy would have achieved.

Ciez pointed to the 1990s as an example of what can happen when fuel targets are effectively frozen. With gas prices cheap, automakers produced bigger, faster, and more polluting cars, and gas-guzzling vehicles hit the road in massive numbers. Car companies made SUVs and vehicles with quicker acceleration times, which became very popular among American drivers. Ciez said that without the standards, there is little incentive for automakers to focus on fuel economy rather than horsepower or vehicle comfort. The standards have spurred technological innovation, allowing cars to provide all three attributes - power, comfort, and efficiency - at a reasonable cost.

Regardless of what happens over the next four years, Sperling said, the authors have provided "a model for assessing other policies."

In their closing statement, the authors contextualized this moment in history:

"It is likely that the United States is in the middle, not the end, of the story of the adaptive response of the vehicle CAFE and GHG standards."

Credit: 
Princeton University, Engineering School

Children can have COVID-19 antibodies and virus in their system simultaneously

With many questions remaining around how children spread COVID-19, Children's National Hospital researchers set out to improve the understanding of how long it takes pediatric patients with the virus to clear it from their systems, and at what point they start to make antibodies that work against the coronavirus. The study, published Sept. 3 in the Journal of Pediatrics, finds that the virus and antibodies can coexist in young patients.

"With most viruses, when you start to detect antibodies, you won't detect the virus anymore. But with COVID-19, we're seeing both," says Burak Bahar, M.D., lead author of the study and director of Laboratory Informatics at Children's National. "This means children still have the potential to transmit the virus even if antibodies are detected."

She adds that the next phase of research will be to test if the virus that is present alongside the antibodies can be transmitted to other people. It also remains unknown if antibodies correlate with immunity, and how long antibodies and potential protection from reinfection last.

The study also assessed the timing of viral clearance and immunologic response. It found the median time from viral positivity to negativity, when the virus can no longer be detected, was 25 days. The median time to seropositivity, or the presence of antibodies in the blood, was 18 days, while the median time to reach adequate levels of neutralizing antibodies was 36 days. Neutralizing antibodies are important in potentially protecting a person from re-infection of the same virus.

This study used a retrospective analysis of 6,369 children tested for SARS-CoV-2, the virus that causes COVID-19, and 215 patients who underwent antibody testing at Children's National between March 13, 2020, and June 21, 2020. Out of the 215 patients, 33 had co-testing for both the virus and antibodies during their disease course. Nine of the 33 showed presence of antibodies in their blood while also later testing positive for the virus.

Also of note, researchers found patients 6 through 15 years old took a longer time to clear the virus (median of 32 days) compared to patients 16 through 22 years old (median of 18 days). Females in the 6-15 age group also took longer to clear the virus than males (median of 44 days for females compared to median of 25.5 days for males).

Although there is emerging data regarding this timing in adults with COVID-19, there is far less data when it comes to the pediatric population. The findings being gathered by Children's National researchers and scientists around the world are critical to helping understand the unique impact on children and their role in viral transmission.

"The takeaway here is that we can't let our guard down just because a child has antibodies or is no longer showing symptoms," says Dr. Bahar. "The continued role of good hygiene and social distancing remains critical."

Credit: 
Children's National Hospital

Unmanned aerial vehicles help wheat breeders

image: Margaret Krause operates an unmanned aerial vehicle at the International Maize and Wheat Improvement Center (CIMMYT) in Ciudad Obregón, Mexico.

Image: 
José Manuel Reyes Mendoza

Breeding programs for crops with limited per-plant seed yield require one or more generations of seed increase to generate sufficient quantities for sowing replicated yield trials. The ability to accurately discard low potential lines at these early stages may reduce spending on costly yield testing.

Breeders typically rely on visual selection at these stages because extensive measurement of plant traits is difficult due to the large number of lines under evaluation. However, recent advances in remote sensing have made high-throughput data collection increasingly feasible.

Authors of a recent Crop Science article leveraged unmanned aerial vehicles (UAVs) to record the normalized difference vegetation index (NDVI), a measure of plant health, at the seed increase stage of the International Maize and Wheat Improvement Center's (CIMMYT) wheat breeding program. NDVI measurements were heritable and moderately correlated with grain yield, and results showed that selection based on NDVI would have outperformed visual selection.
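The NDVI itself has a simple closed form: the normalized difference of near-infrared and red reflectance. A minimal sketch of the calculation (the band values below are illustrative stand-ins, not the study's data):

```python
import numpy as np

# NDVI = (NIR - Red) / (NIR + Red); values fall in [-1, 1], with higher
# values indicating denser, healthier vegetation.
def ndvi(nir, red, eps=1e-9):
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)  # eps guards against zero reflectance

# Illustrative per-pixel reflectance values for a 2x2 image patch.
nir_band = [[0.60, 0.50], [0.70, 0.40]]
red_band = [[0.10, 0.20], [0.10, 0.30]]
print(np.round(ndvi(nir_band, red_band), 3))
```

In a breeding context, a per-plot summary of these per-pixel values (for example the plot mean) is what would be compared against visual selection scores.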

Harnessing UAV-collected traits to inform selection at the early stages may improve resource-use efficiency in breeding programs and/or increase rates of genetic gain. As remote sensing technologies become increasingly automated and scalable, breeders will have access to comprehensive suites of traits with which to develop integrative selection strategies.

Credit: 
American Society of Agronomy

Wearable, portable invention offers options for treating antibiotic-resistant infections

image: Purdue University innovators created a wearable invention that offers options for treating antibiotic-resistant infections and wounds.

Image: 
Purdue University/Rahim Rahimi

WEST LAFAYETTE, Ind. - The rapid increase of life-threatening antibiotic-resistant infections has resulted in challenging wound complications with limited choices of effective treatments. About 6 million people in the United States are affected by chronic wounds.

Now, a team of innovators from Purdue University has developed a wearable solution that allows a patient to receive treatment without leaving home. The Purdue team's work is published in the journal Frontiers in Bioengineering and Biotechnology.

A video showing the technology is available at https://youtu.be/UMZpDwYQZJM.

"We created a revolutionary type of treatment to kill the bacteria on the surface of the wound or diabetic ulcer and accelerate the healing process," said Rahim Rahimi, an assistant professor of materials engineering at Purdue. "We created a low-cost wearable patch and accompanying components to deliver ozone therapy."

Ozone therapy is a gas-phase antimicrobial treatment used by a growing number of patients in the U.S. In most cases, ozone treatments require patients to travel to a clinical setting to be treated by trained technicians.

"Our breathable patch is applied to the wound and then connected to a small, battery powered ozone-generating device," Rahimi said. "The ozone gas is transported to the skin surface at the wound site and provides a targeted approach for wound healing. Our innovation is small and simple to use for patients at home."

Credit: 
Purdue University

Zooming in on dark matter

image: Projected dark matter density map, created using a simulation measuring 2.4 billion light years on each side. The intermediate square (top right) is just under a million light years across. The smallest square (bottom left) is the deepest zoom: it is only 783 light years across, equivalent to 500 times the size of the solar system.
In the intermediate square (top right) the largest dark matter haloes have a mass similar to that of a rich galaxy cluster (a million trillion times the mass of the Sun). In the smallest square (bottom left) the smallest clearly visible haloes have a mass comparable to that of the Earth (0.000003 times the mass of the Sun).

Image: 
Dr Sownak Bose, Center for Astrophysics, Harvard University

Cosmologists have zoomed in on the smallest clumps of dark matter in a virtual universe - which could help us to find the real thing in space.

An international team of researchers, including Durham University, UK, used supercomputers in Europe and China to focus on a typical region of a computer-generated universe.

The zoom they were able to achieve is the equivalent of being able to see a flea on the surface of the Moon.

This allowed them to make detailed pictures and analyses of hundreds of virtual dark matter clumps (or haloes) from the very largest to the tiniest.

Dark matter particles can collide with dark matter anti-particles near the centre of haloes where, according to some theories, they are converted into a burst of energetic gamma-ray radiation.
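As a rough illustration of why halo centres dominate such a signal: the annihilation rate scales with the square of the dark matter density, so a relative luminosity can be sketched by integrating density squared over the halo volume. The NFW-style profile and the scale parameters below are assumptions for illustration, in arbitrary relative units, not values from the study:

```python
import numpy as np

# Assumed NFW-like density profile: rho(r) = rho_s / [(r/r_s)(1 + r/r_s)^2].
# rho_s and r_s are illustrative scale parameters, not fitted values.
def nfw_density(r, rho_s=1.0, r_s=1.0):
    x = r / r_s
    return rho_s / (x * (1.0 + x) ** 2)

# Relative annihilation luminosity: L ~ integral of rho(r)^2 * 4*pi*r^2 dr.
def annihilation_luminosity(r_max, rho_s=1.0, r_s=1.0, n=200_000):
    r = np.linspace(1e-6, r_max, n)  # start just above r = 0 (density diverges there)
    f = nfw_density(r, rho_s, r_s) ** 2 * 4.0 * np.pi * r ** 2
    # Trapezoidal integration, written out for portability across NumPy versions.
    return float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(r)))

# Extending the integration radius tenfold adds only a small fraction of
# extra luminosity: most of the signal comes from the dense centre.
print(annihilation_luminosity(1.0), annihilation_luminosity(10.0))
```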

Their findings, published in the prestigious journal Nature, could mean that these very small haloes could be identified in future observations by the radiation they are thought to give out.

Co-author Professor Carlos Frenk, Ogden Professor of Fundamental Physics at the Institute for Computational Cosmology, at Durham University, UK, said: "By zooming in on these relatively tiny dark matter haloes we can calculate the amount of radiation expected to come from different sized haloes.

"Most of this radiation would be emitted by dark matter haloes too small to contain stars and future gamma-ray observatories might be able to detect these emissions, making these small objects individually or collectively 'visible'.

"This would confirm the hypothesised nature of the dark matter, which may not be entirely dark after all."

Most of the matter in the universe is dark and, apart from the gamma radiation its particles may emit in exceptional circumstances, completely different in nature from the matter that makes up stars, planets and people.

The universe is made of approximately 27 per cent dark matter with the rest largely consisting of the equally mysterious dark energy. Normal matter, such as planets and stars, makes up a relatively small five per cent of the universe.

Galaxies formed and grew when gas cooled and condensed at the centre of enormous clumps of this dark matter - so-called dark matter haloes.

Astronomers can infer the structure of large dark matter haloes from the properties of the galaxies and gas within them.

The biggest haloes contain huge collections of hundreds of bright galaxies, called galaxy clusters, weighing about a thousand trillion times the mass of our Sun.

However, scientists have no direct information about smaller dark matter haloes that are too tiny to contain a galaxy. These can only be studied by simulating the evolution of the Universe in a large supercomputer.

The smallest are thought to have the same mass as the Earth according to current popular scientific theories about dark matter that underlie the new research.

The simulations were carried out using the Cosmology Machine supercomputer, part of the DiRAC High-Performance Computing facility in Durham, funded by the Science and Technology Facilities Council (STFC), and computers at the Chinese Academy of Sciences.

By zooming-in on the virtual universe in such microscopic detail, the researchers were able to study the structure of dark matter haloes ranging in mass from that of the Earth to a big galaxy cluster.

Surprisingly, they found that haloes of all sizes have a very similar internal structure and are extremely dense at the centre, becoming increasingly spread out, with smaller clumps orbiting in their outer regions.

The researchers said that without a scale bar it was almost impossible to tell an image of a dark matter halo of a massive galaxy from one of a halo with a mass a fraction of the Sun's.

Co-author Professor Simon White, of the Max Planck Institute of Astrophysics, Germany, said: "We expect that small dark matter haloes would be extremely numerous, containing a substantial fraction of all the dark matter in the universe, but they would remain mostly dark throughout cosmic history because stars and galaxies grow only in haloes more than a million times as massive as the Sun.

"Our research sheds light on these small haloes as we seek to learn more about what dark matter is and the role it plays in the evolution of the universe."

Credit: 
Durham University

Novel technology for the selection of single photosynthetic cells

image: PhenoChip - a microfluidic device for the single-cell phenotyping of unicellular phototrophs such as microalgae and cyanobacteria.

Image: 
Lars Behrendt

You might need a microscope to witness the next agricultural revolution. New research, published in the journal Science Advances, demonstrates how microfluidic technologies can be used to identify, isolate and propagate specific single photosynthetically active cells for fundamental industry applications and improved ecosystem understanding.

Natural environments are inherently dynamic and require photosynthetic organisms to adapt their physiology to make optimal use of available resources and grow to the best of their abilities. However, not all photosynthetic organisms are equally efficient in this physiological fine-tuning, and where some, for example, succumb to the effects of temperature stress, others persist and grow.

In agriculture, humans have taken advantage of this phenotypic heterogeneity in natural plant populations for thousands of years: the selective breeding of more resistant or productive plant phenotypes has given rise to many of our modern crops and has sustained much of human progress.

While microalgae and cyanobacteria have a similar potential for bioenergy production and biosynthesis of food and chemicals, until now, the tools for their selection have been blunt and unwieldy, relying on bulk culture - akin to selecting for traits in wheat at the level of the landscape.

In this new study, a team of researchers from Sweden, Denmark and Switzerland reports on a novel microfluidic technology called 'PhenoChip' which allows for the identification and selection of unicellular phototrophs under relevant environments.

"Similar to our ancestors selecting a more drought-resistant plant, we can now pick and propagate single phenotypes and start asking fundamental questions. What mechanism causes this phenotype to emerge? Does it persist over many generations? Can we use it to obtain increased biomass yields for biotechnological applications or select resilient phenotypes from natural environments?" says first author Lars Behrendt, Assistant Professor at the Department of Environmental Toxicology at Uppsala University.

In a first proof-of-concept application, the team used PhenoChip on single cells essential to the health of coral reefs, ecosystems currently under pressure from climate change. In their study, they exposed cells of the coral symbiont Symbiodinium to thermal and chemical treatments, both relevant to the onset of coral bleaching. This enabled the identification of single cells with elevated resilience to rising temperatures and the selection of cells that maintained specific phenotypes for several generations.

PhenoChip's assisted evolution of Symbiodinium could thus help ongoing initiatives aiming to mitigate threats to coral reefs resulting from projected changes in sea surface temperatures and other stressors.

"Conceivably we could use PhenoChip to create a 'library' of desired Symbiodinium phenotypes and try to supply these symbionts--which have not been genetically manipulated but were selected for being more naturally robust--to bleached corals under laboratory conditions. While we don't yet know whether this would improve the ability of corals to recover and persist in the face of future stress, it's an exciting thought," says Behrendt.

Credit: 
Uppsala University

Effect of dexamethasone on days alive, ventilator-free in patients with COVID-19, acute respiratory distress syndrome

What The Study Did: This randomized clinical trial in Brazil of 299 patients with COVID-19 and moderate or severe acute respiratory distress syndrome (ARDS) examined if intravenous dexamethasone plus standard care compared with standard care alone would increase the number of days patients were alive and free from mechanical ventilation.

Authors: Luciano C. P. Azevedo, M.D., Ph.D., of Hospital Sirio-Libanes in São Paulo, Brazil, is the corresponding author.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

(doi:10.1001/jama.2020.17021)

Editor's Note: The article includes conflicts of interest and funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, conflict of interest and financial disclosures, and funding and support.

Credit: 
JAMA Network

Effect of hydrocortisone on death, organ support in patients with severe COVID-19

What The Study Did: This randomized clinical trial of patients with severe COVID-19 was stopped early after results from another trial were released, but it investigated whether intravenous hydrocortisone (administered either as a seven-day fixed-dose course or only when shock was clinically evident) increased the number of organ support-free days within 21 days.

Authors: Derek C. Angus, M.D., M.P.H., of the University of Pittsburgh, is the corresponding author.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

(doi:10.1001/jama.2020.17022)

Editor's Note: The article includes conflicts of interest and funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, conflict of interest and financial disclosures, and funding and support.

Credit: 
JAMA Network

Association between treatment with corticosteroids, risk of death among critically ill patients with COVID-19

What The Study Did: The results of seven randomized clinical trials with 1,703 critically ill patients with COVID-19 were combined to estimate the association between administration of corticosteroids compared with usual care or placebo and the risk of death after 28 days.

Authors: Jonathan A.C. Sterne, M.A., M.Sc., Ph.D., of the University of Bristol in England, is the corresponding author.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

(doi:10.1001/jama.2020.17023)

Editor's Note: The article includes conflicts of interest and funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, conflict of interest and financial disclosures, and funding and support.

Credit: 
JAMA Network