
Research shows potential for zero-deforestation pledges to protect wildlife in oil palm

New research has found that environmental efforts aimed at eliminating deforestation from oil palm production have the potential to benefit vulnerable tropical mammals.

These findings, published by Conservation Letters, were drawn from an international collaboration led by Dr Nicolas Deere from the University of Kent's Durrell Institute of Conservation and Ecology (DICE), and including the University of Melbourne, University of York, Universiti Malaysia Sabah, and South East Asia Rainforest Research Partnership.

In a promising development, the Roundtable on Sustainable Palm Oil (RSPO) has recently committed to zero-deforestation on plantations certified as sustainable, as a way to prohibit forest clearance during the development of new agricultural areas. This is achieved using the High Carbon Stock (HCS) Approach, a land-use planning tool that aims to protect patches of well-connected, high quality forest.

In their study of forest patches in Borneo, Dr Deere and his colleagues found that patches larger than 100 hectares, which are afforded the highest conservation priority under High Carbon Stock protocols, supported larger populations of threatened species particularly sensitive to deforestation, such as sun bears and orang-utans.

Camera-trap data showed that mammal populations were largest in larger and better-connected forest patches. However, hunting and degraded forest quality compromised the suitability of patches for many species, underlining the importance of accounting for the impacts of human activities on wildlife in agricultural areas.

While the study highlights the potential for zero-deforestation approaches to contribute to wildlife conservation, the feasibility of protecting patches large enough to sustain sufficient numbers of species was called into question.

The study found that in 100-hectare patches (the minimum criterion for high-priority conservation status under HCS), only 35% of the mammal species that would be present in continuous forest would be protected. In fact, patches would need to be 30 times larger to support the full mammal community, and larger still once the effects of hunting were considered. Preserving forest patches of this size is simply unrealistic in most plantations.

Dr Deere said: 'The HCS protocols seem to work well in prioritising patches of wildlife-friendly forest within oil palm plantations, but it's not enough for many of the species we studied. A switch in emphasis towards joining up forest patches and managing them together across farmland landscapes would really help wildlife conservation in the long term.'

Credit: 
University of Kent

Smart materials are becoming smarter

image: Smart materials are becoming smarter.

Image: 
Immanuel Kant Baltic Federal University

A researcher from Immanuel Kant Baltic Federal University, together with colleagues, has developed a composite material that can change its temperature and other properties under the influence of magnetic and electric fields. The material is safe for human health and, thanks to these properties, could be used to manufacture implants (or surface coatings for them) that also work as sensors. The article was published in the journal Scientific Reports.

Composites are a class of materials that consist of heterogeneous components (metals, ceramics, glass, plastic, carbon, etc.) and combine their properties. To create such a material, a filler with a certain stability and rigidity is placed into a flexible matrix. Varying the composition and the matrix-filler ratio produces a wide range of materials with given sets of characteristics. Composites may be used in many fields, from construction to energy, medicine and space research. Polymer composites are currently considered among the most promising smart materials for biomedical applications.

A researcher from Immanuel Kant Baltic Federal University together with his team used this approach to develop smart materials for biological implants. The authors of the study wanted the implant to act as a sensor, e.g. to measure a patient's body temperature and other health indicators in real time, and also to release drugs into a patient's body in given amounts and at given intervals. To create such an implant, the scientists had to find a combination of materials with the required properties. In their recent study the team described a composite material constructed from Gd5(Si,Ge)4 magnetic nanoparticles incorporated into a polyvinylidene fluoride (PVDF) matrix.

PVDF is a flexible and biocompatible (i.e. harmless for the body) polymer that is used as a surgical suture material. It also possesses piezoelectric properties: when PVDF is stretched or compressed, electric voltage occurs in it (this is called direct piezoeffect), and when voltage is applied to it, the material changes in size (reverse piezoeffect). Due to these properties, PVDF is effectively used in sensors. Moreover, it has also been used to create new magnetoelectric materials, such as composite multiferroics. The magnetic and ferroelectric characteristics of such materials are mutually manageable, i.e. their electrical properties can be controlled with a magnetic field, and magnetic characteristics - with an electric one. Thanks to its properties, PVDF may be used as a basis for implant coating or even the implants themselves.

"The novelty of our approach lies in the use of specific magnetic particles as a filler of a piezopolymer matrix. Along with magnetic properties they also possess the magnetocaloric effect, i.e. change their temperature under the influence of a magnetic field. Magnetocaloric materials are a promising basis for the development of alternative cooling systems, the so-called 'magnetic freezers'. It's also recently been suggested that they could be used in biomedical applications," said Karim Amirov, a Candidate of Physics and Mathematics, a senior researcher at the Laboratory for New Magnetic Materials, Kant Baltic Federal University. According to him, to create magnetoelectric smart composites, magnetocaloric substances are added to PVDF (dissolved in the dimethylformamide solvent) and evenly spread. After that the polymer is dried down in line with a specific temperature and time protocol. The result is a flexible piezopolymer plate of a given shape with incorporated magnetic particles. Such a plate can be easily cut with scissors.

Thus, the use of the new magnetocaloric particles led to the development of a smart composite material combining magnetoelectric and magnetocaloric properties. The former make the material a sensor that detects both magnetic and electric fields, while the latter turn it into a heating or cooling element controlled by changes in the magnetic field.

Credit: 
Immanuel Kant Baltic Federal University

Record-breaking terahertz laser beam

image: Claudia Gollner and the laser system at TU Wien.

Image: 
TU Wien

Terahertz radiation is used for security checks at airports, for medical examinations and also for quality checks in industry. However, radiation in the terahertz range is extremely difficult to generate. Scientists at TU Wien have now succeeded in developing a terahertz radiation source that breaks several records: it is extremely efficient, and its spectrum is very broad - it generates different wavelengths from the entire terahertz range. This opens up the possibility of creating short radiation pulses with extremely high radiation intensity. The new terahertz technology has now been presented in the journal Nature Communications.

The "Terahertz Gap" Between Lasers and Antennas

"Terahertz radiation has very useful properties," says Claudia Gollner from the Institute of Photonics at TU Wien. "It can easily penetrate many materials, but unlike X-rays, it is harmless because it is not ionising radiation".

From a technical point of view, however, terahertz radiation is located in a frequency region that is very hard to access - a sort of no man's land between two well-known areas: radiation with higher frequencies can be generated by ordinary solid-state lasers, while low-frequency radiation, as used in mobile communications, is emitted by antennas. The greatest challenges lie exactly in between, in the terahertz range.

In the laser laboratories of TU Wien, a great deal of effort must therefore be put into generating the desired high-intensity terahertz radiation pulses. "Our starting point is the radiation of an infrared laser system. It was developed at our Institute and it is unique in the world," says Claudia Gollner. First, the laser light is sent through a so-called non-linear medium. In this material, the infrared radiation is modified, part of it is converted into radiation with twice the frequency.

"So now we have two different types of infrared radiation. These two kinds of radiation are then superimposed. This creates a wave with an electric field with a very specific asymmetric shape," says Gollner.

Turning Air into Plasma

This electromagnetic wave is intense enough to rip electrons out of the molecules in the air. The air turns into a glowing plasma. The special shape of the wave's electric field then accelerates the electrons in such a way that they produce the desired terahertz radiation.

"Our method is extremely efficient: 2.3% of the supplied energy is converted into terahertz radiation - that is orders of magnitude more than can be achieved with other methods. This results in exceptionally high THz energies of almost 200 μJ," says Claudia Gollner. Another important advantage of the new method is that a very broad spectrum of terahertz radiation is generated. Very different wavelengths throughout the terahertz range are emitted simultaneously. This produces extremely intense short radiation pulses. The larger the spectrum of different terahertz wavelengths, the shorter and more intense pulses can be generated.

Numerous Possible Applications

"This means that for the first time a terahertz source for extremely high intensity radiation is now available", says Andrius Baltuska, the head of the research group at the Vienna University of Technology. "Initial experiments with zinc-telluride crystals already show that terahertz radiation is excellently suited to answer important questions from material science in a completely new way. We are convinced that this method has a great future."

Credit: 
Vienna University of Technology

Brazilian wildfire pollution worsens air quality in distant cities -- study

Wildfires in south eastern Brazil produce airborne pollution that worsens air quality in major cities such as Sao Paulo - cancelling out efforts to improve the urban environment and posing health risks to citizens, according to a new study.

The planet is frequently affected by smoke from fires caused by humans and natural processes. Australia, California and other regions are prone to seasonal wildfires, and smoke from wildfires and agricultural burns worsens air quality in places up to 2,000 km away.

Most wildfires in Brazil occur in the dry season between July and September, in the Amazon and Cerrado regions - mostly agriculture-related fires - and in the Pampas. Depending on the weather, long-range transport of smoke affects the air quality of small and large cities downwind of the fire spots, including the 'megacity' of Sao Paulo.

Burning biomass produces increased quantities of low-lying ozone due, in part, to the South Atlantic subtropical high-pressure system. Transported considerable distances from the fires, this pollution further contributes to poor air quality and smog in cities such as Sao Paulo.

Researchers from the University of Birmingham, the Federal University of Technology, Londrina, Brazil, and the University of Stockholm published their findings in the Journal of Environmental Management.

Professor Roy Harrison, from the University of Birmingham, commented: "The state of Sao Paulo has led with progressive measures to curb air pollution, such as controlling sulphur dioxide from industrial sources and enforcing standards for cleaner vehicles and fuels.

"However, present results indicate that policies targeting the reduction of biomass burning are of utmost importance to improve urban air quality, particularly in densely populated areas where high pollutant concentrations are frequently observed."

Besides affecting air quality and increasing the risk of death from respiratory causes, ozone is a short-lived climate forcer - an atmospheric compound with a warming effect but a shorter lifetime than carbon dioxide. Reducing ozone levels therefore has two main benefits: it improves air quality and reduces the impact on climate.

Atmospheric emission data suggest that emissions from biomass burning make up a substantial part of the precursors for ozone (O3) formation.

Dr. Admir Créso Targino, from the Federal University of Technology, commented: "We need enhanced governance at regional, national and international levels to combat biomass burning practices in Brazil and its neighbouring countries.

"Not only would the population health benefit from such a measure, but also the regional climate, as ozone and particulate matter generated by the fires are short-lived climate forcers. Such an approach would be well-aligned with the Paris Agreement that aims to limit global warming to below 2OC compared to the pre-industrial period - a critical measure in the fight against climate change."

Researchers combined in situ ozone data, measured in the states of Sao Paulo and Parana from 2014 to 2017, with information about a range of co-pollutants such as NOx, PM2.5 and PM10 to identify sources, transport and geographical patterns in the air pollution data.

Ozone concentrations peaked in September and October - linked to biomass burning and enhanced photochemistry. Long-range transport of smoke contributed to between 23 and 41 per cent of the total ozone during the pollution events.

Credit: 
University of Birmingham

Platelets instead of spheres make screens more economical

image: UV light shines on a pane of glass, coated with several layers of two-dimensional semiconductor nanoplatelets, which emits blue light.

Image: 
ETH Zurich / Jakub Jagielski

QLED screens have been on the market for a few years now. They are known for their bright, intense colours, which are produced using what is known as quantum dot technology: QLED stands for quantum dot light emitting diode. Researchers at ETH Zurich have now developed a technology that increases the energy efficiency of QLEDs. By minimising the scattering losses of light inside the diodes, a larger proportion of the light generated is emitted to the outside.

Conventional QLEDs consist of a multitude of spherical semiconductor nanocrystals, known as quantum dots. In a screen, when these nanocrystals are excited from behind with UV light, they convert it into coloured light in the visible range. The colour of light each nanocrystal produces depends on its material composition.

However, the light these spherical nanocrystals emit scatters in all directions inside the screen; only about one-fifth of it makes its way to the outside world and is visible to the observer. To increase the energy efficiency of the technology, scientists have been trying for years to develop nanocrystals that emit light in only one direction (forward, towards the observer) - and a few such light sources already exist. But instead of spherical crystals, these sources are composed of ultra-thin nanoplatelets that emit light only in one direction: perpendicular to the plane of the platelet.

If these nanoplatelets are arranged next to each other in a layer, they produce a relatively weak light that is not sufficient for screens. To increase the light intensity, scientists are attempting to superimpose several layers of these platelets. The trouble with this approach is that the platelets begin to interact with each other, with the result that the light is again emitted not only in one direction but in all directions.

Stacked and insulated from each other

Chih-Jen Shih, Professor of Technical Chemistry at ETH Zurich, and his team of researchers have now stacked extremely thin (2.4 nanometres) semiconductor platelets in such a way that they are separated from each other by an even thinner (0.65 nanometre) insulating layer of organic molecules. This layer prevents quantum-physical interactions, which means that the platelets emit light predominantly in only one direction, even when stacked.

"The more platelets we pile on top of each other, the more intense the light becomes. This lets us influence the light intensity without losing the preferred direction of emission," says Jakub Jagielski, a doctoral student in Shih's group and first author of the study published in Nature Communications. That is how the scientists managed to produce a material that for the first time emits high-intensity light in only one direction.

Very energy-efficient blue light

Using this process, the researchers have produced light sources for blue, green, yellow and orange light. They say that the red colour component, which is also required for screens, cannot yet be realised with the new technology.

In the case of the newly created blue light, around two-fifths of the light generated reaches the eye of the observer, compared to only one-fifth with conventional QLED technology. "This means that our technology requires only half as much energy to generate light of a given intensity," Professor Shih says. For other colours, however, the efficiency gain achieved so far is smaller, so the scientists are conducting further research with a view to increasing this.
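The "half as much energy" figure follows directly from the two outcoupling fractions quoted above; a quick check using only those numbers:

```python
# Fraction of generated light that reaches the observer (from the article)
conventional_outcoupling = 1 / 5        # spherical quantum dots
stacked_platelet_outcoupling = 2 / 5    # new directional blue emitter

# Energy needed for a given visible intensity scales inversely with outcoupling.
relative_energy = conventional_outcoupling / stacked_platelet_outcoupling
print(f"energy required vs. conventional QLED: {relative_energy:.0%}")  # 50%
```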

Compared to conventional LEDs, the new technology has another advantage, as the scientists emphasise: the novel stacked QLEDs are very easy to produce in a single step. It is also possible to increase the intensity of conventional LEDs by arranging several light-emitting layers on top of each other; however, this needs to be done layer by layer, which makes production more complex.

Credit: 
ETH Zurich

New drug prevents liver damage, obesity and glucose intolerance in mice on high-fat diet

WASHINGTON (January 20, 2020) -- Mice given a new drug targeting a key gene involved in lipid and glucose metabolism could tolerate a high-fat diet regimen (composed of 60% fat from lard) without developing significant liver damage, becoming obese, or disrupting their body's glucose balance. The study by Georgetown Lombardi Comprehensive Cancer Center researchers appeared January 20, 2020, in Cell Death and Differentiation.

The U.S. Centers for Disease Control and Prevention (CDC) estimates that there are 4.5 million adults in the U.S. diagnosed with liver disease every year. Nonalcoholic fatty liver disease, or NAFLD, can evolve into a more serious condition known as nonalcoholic steatohepatitis, or NASH, which can lead to chronic inflammation, scarring of the liver and cirrhosis, and eventually to hepatocellular carcinoma. While NAFLD can be reversed in its early stages with weight loss and dietary adjustments, it becomes intractable in later stages.

There is no standard therapy for NASH, although many drugs are being evaluated in clinical trials. Because liver disease has now reached epidemic proportions, researchers at Georgetown developed a small molecule that inhibits the activity of a key gene, Slc25a1, which they hypothesized plays an important role in fatty liver disease.

"Our research takes on a definite urgency when you consider that about 25 percent of adults in the U.S. have NAFLD," said Maria Laura Avantaggiati, MD, associate professor of oncology at Georgetown Lombardi. In addition, while NAFLD can be reversed with dietetic adjustments, it is difficult for these individuals to undergo dramatic life-style changes, posing a challenge to halt NAFLD evolution to NASH."

One of the investigators' key steps was to administer the new drug, CTPI-2, either as a preventive treatment in mice fed the high-fat diet before NASH developed, or as a reversal treatment in mice with significant liver damage. The latter setting reflects what is seen in people who seek medical advice when the disease is already present. Administration of CTPI-2 almost completely prevented the evolution to NASH and obesity in mice on the high-fat diet, compared with mice that did not receive the drug. At later stages of the disease, CTPI-2 also reversed liver damage, induced weight loss and restored the glucose metabolic profile.

"The results were quite dramatic as the livers of most of the mice that received CTPI-2 nearly resembled the normal livers of animals fed with a regular diet," said Avantaggiati. "In addition, CTPI-2 normalized glucose metabolism, leading us to hypothesize that the drug could also have applications in the treatment of diabetes, but this aspect will need further study."

To confirm their findings, the investigators developed a genetically modified mouse with Slc25a1 inactivated in the liver. Mice with the inactive Slc25a1 gene were partially protected from fatty liver disease, as if they had been treated with CTPI-2, confirming the importance of this gene in liver damage induced by the high-fat diet.

"We have established that CTPI-2 has anti-inflammatory activity and has anti-tumor activity towards several types of cancer, said Avantaggiati. "We now need to establish if CTPI-2 can also halt the progression to hepatocellular carcinoma."

Credit: 
Georgetown University Medical Center

Wisdom of the crowd? Building better forecasts from suboptimal predictors

Tokyo, Japan - Researchers at the University of Tokyo and Kozo Keikaku Engineering Inc. have introduced a method for enhancing the power of existing algorithms to forecast the future of unknown time series. By combining the predictions of many suboptimal forecasts, they were able to construct a consensus prediction that tended to outperform existing methods. This research may help provide early warnings for floods, economic shocks, or changes in the weather.

Time series data are a familiar part of our daily lives. A gyrating graph might represent the water level of a river, the price of a stock, or the daily high temperature in a city, to name just a few. Advance knowledge of the future movements of a time series could be used to avert or prepare for undesirable events. However, forecasting is extremely difficult because the underlying dynamics that generate the values are nonlinear (even if assumed to be deterministic) and therefore subject to wild fluctuations.

Delay embedding is a widely used method to help make sense of time series data and attempt to predict future values. This approach takes a sequence of observations and "embeds" them in a higher-dimensional space by combining the current value with evenly spaced lagged values from the past. For example, to create a three-dimensional delay embedding of the S&P 500 closing price, you can take the closing prices today, yesterday, and the day before as the x-, y-, and z-coordinates, respectively. However, the possible choices for embedding dimension and delay lag make finding the most useful representation for making forecasts a matter of trial and error.
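As a concrete illustration of the construction just described, here is a minimal sketch of a delay embedding in Python; the function name and the sample closing prices are ours, purely for illustration.

```python
import numpy as np

def delay_embed(series, dim=3, lag=1):
    """Return delay vectors (x[t], x[t-lag], ..., x[t-(dim-1)*lag]) as rows."""
    x = np.asarray(series, dtype=float)
    n = len(x) - (dim - 1) * lag
    if n <= 0:
        raise ValueError("series too short for this choice of dim and lag")
    return np.column_stack([x[(dim - 1 - k) * lag : (dim - 1 - k) * lag + n]
                            for k in range(dim)])

closes = [100.0, 101.5, 99.8, 102.3, 103.1]   # hypothetical closing prices
print(delay_embed(closes, dim=3, lag=1))
# each row: (today, yesterday, day before) -> one point in 3-D space
```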

Now, researchers at the University of Tokyo and Kozo Keikaku Engineering Inc. have shown a way to select and optimize a collection of delay embeddings so that their combined forecast does better than any individual predictor. "We found that the 'wisdom of the crowd,' in which the consensus prediction is better than each on its own, can be true even with mathematical models," first author Shunya Okuno explains.
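The paper's actual selection and optimisation procedure is not reproduced here; the sketch below only illustrates the general "wisdom of the crowd" idea of averaging several suboptimal delay-embedding predictors into a consensus, using a simple nearest-neighbour (analogue) forecast as a stand-in predictor.

```python
import numpy as np

def nn_forecast(series, dim, lag, horizon=1):
    """One-step analogue forecast from a single delay embedding: find the
    historical delay vector closest to the latest one and return the value
    that followed it `horizon` steps later."""
    x = np.asarray(series, dtype=float)
    n = len(x) - (dim - 1) * lag
    # rows: (x[t], x[t-lag], ..., x[t-(dim-1)*lag]) for the last n time steps
    emb = np.column_stack([x[(dim - 1 - k) * lag : (dim - 1 - k) * lag + n]
                           for k in range(dim)])
    library, query = emb[:-horizon], emb[-1]
    targets = x[(dim - 1) * lag + horizon:]   # value that followed each library row
    nearest = np.argmin(np.linalg.norm(library - query, axis=1))
    return targets[nearest]

def consensus_forecast(series, embeddings, horizon=1):
    """'Wisdom of the crowd': average forecasts from several (dim, lag) choices."""
    return float(np.mean([nn_forecast(series, d, l, horizon) for d, l in embeddings]))

# e.g. consensus_forecast(river_levels, embeddings=[(2, 1), (3, 1), (3, 2)])
```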

The researchers tested their method on real-world flood data, as well as theoretical equations with chaotic behavior. "We expect that this approach will find many practical applications in forecasting time series data, and reinvigorate the use of delay embeddings," senior author Yoshito Hirata says. Forecasting a future system state is an important task in many different fields including neuroscience, ecology, finance, fluid dynamics, weather, and disaster prevention, hence this work has potential for use in a wide range of applications.

Credit: 
Institute of Industrial Science, The University of Tokyo

On the edge between science and art: Historical biodiversity data from Japanese 'gyotaku'

image: Gyotaku rubbing from Chiba Prefecture.

Image: 
Yusuke Miyazaki

Historical biodiversity data are being obtained from museum specimens, literature, classic monographs and old photographs, yet those sources can be damaged, lost or incomplete. This creates a need for additional, even non-traditional, sources.

Biodiversity observations are made not only by researchers but also by citizens, though these data are often poorly documented or not publicly accessible. Nowadays, such data can be found mostly through online citizen-science projects.

In Japan, many recreational fishers have recorded their memorable catches as 'gyotaku' (魚拓), which means fish impression or fish rubbing in English. 'Gyotaku' is made directly from the fish specimen and usually includes information such as sampling date and locality, the name of the fisher, witnesses, the fish species (frequently its local name), and the fishing tackle used. The art has existed since the late Edo period; currently, the oldest known 'gyotaku' material is a collection held by the Tsuruoka City Library and made in 1839.

Traditionally, 'gyotaku' is printed using black writing ink, but over the last decades colour versions have been developed and are now used for art and educational purposes. However, these colour prints are made purely as art and rarely include specimen data, sampling locality or date.

With modern technology, it is becoming rarer and rarer for people to use 'gyotaku' to record their "fishing impressions". The number of independently run fishing shops is decreasing, and the number of original 'gyotaku' prints and of recreational fishers may soon begin to decrease as well.

Smartphones and cameras are significantly reducing the number of 'gyotaku' being produced, while the data in old art pieces are in danger of being lost or scattered among private collections. That is why research on existing 'gyotaku' as a data source is needed.

A Japanese research team led by Mr. Yusuke Miyazaki conducted multiple surveys of recreational fishing shops in different regions of Japan to determine whether 'gyotaku' information is available across the whole country, from subarctic to subtropical regions, and to gather historical biodiversity data from it.

In total, 261 'gyotaku' rubbings showing 325 printed individual specimens were found in the targeted shops, and these data were integrated into the 'gyotaku' database. Distributional data for a total of 235 individuals were obtained in the study.

The observed species compositions reflected the biogeography of the regions and were representative enough to identify rare Red-listed species in particular areas. Some of the studied species are listed as endangered in national and prefectural Red Lists, which prohibit capturing, holding, receiving, transferring or otherwise interacting with the species without the prefectural governor's permission. Given the rarity of these threatened species in some regions, 'gyotaku' are probably important vouchers for estimating historical population status and the factors behind decline or extinction.

"Overall, the species composition displayed in the 'gyotaku' approximately reflected the fish faunas of each biogeographic region. We suggest
that Japanese recreational fishers may be continuing to use the 'gyotaku' method in addition to digital photography to record their memorable catches", concludes author of the research, Mr. Yusuke Miyazaki.

Credit: 
Pensoft Publishers

Local water availability is permanently reduced after planting forests

image: This is a shallow river bed in Buderim Forest Park, Queensland, Australia.

Image: 
Laura Bentley

River flow is reduced in areas where forests have been planted and does not recover over time, a new study has shown. Rivers in some regions can completely disappear within a decade. This highlights the need to consider the impact on regional water availability, as well as the wider climate benefit, of tree-planting plans.

"Reforestation is an important part of tackling climate change, but we need to carefully consider the best places for it. In some places, changes to water availability will completely change the local cost-benefits of tree-planting programmes," said Laura Bentley, a plant scientist in the University of Cambridge Conservation Research Institute, and first author of the report.

Planting large areas of trees has been suggested as one of the best ways of reducing atmospheric carbon dioxide levels, since trees absorb and store this greenhouse gas as they grow. While it has long been known that planting trees reduces the amount of water flowing into nearby rivers, there has previously been no understanding of how this effect changes as forests age.

The study looked at 43 sites across the world where forests have been established, and used river flow as a measure of water availability in the region. It found that within five years of planting trees, river flow had reduced by an average of 25%. By 25 years, rivers had gone down by an average of 40% and in a few cases had dried up entirely. The biggest percentage reductions in water availability were in regions in Australia and South Africa.

"River flow does not recover after planting trees, even after many years, once disturbances in the catchment and the effects of climate are accounted for," said Professor David Coomes, Director of the University of Cambridge Conservation Research Institute, who led the study.

Published in the journal Global Change Biology, the research showed that the type of land where trees are planted determines the degree of impact they have on local water availability. Trees planted on natural grassland where the soil is healthy decrease river flow significantly. On land previously degraded by agriculture, establishing forest helps to repair the soil so it can hold more water and decreases nearby river flow by a lesser amount.

Counterintuitively, the effect of trees on river flow is smaller in drier years than wetter ones. When trees are drought-stressed they close the pores on their leaves to conserve water, and as a result draw up less water from the soil. In wet weather the trees use more water from the soil, and also catch the rainwater in their leaves.

"Climate change will affect water availability around the world," said Bentley. "By studying how forestation affects water availability, we can work to minimise any local consequences for people and the environment."

Credit: 
University of Cambridge

Spider-Man-style robotic graspers defy gravity

image: A wall-climbing robot uses the zero-pressure difference method to form suction.

Image: 
Xin Li and Kaige Shi

WASHINGTON, January 17, 2020 -- Specially designed vacuum suction units allow humans to climb walls. Scientists have developed a suction unit that can be used on rough surfaces, no matter how textured, and that has applications in the development of climbing robots and robotic arms with grasping capabilities.

Traditional methods of vacuum suction and previous vacuum suction devices cannot maintain suction on rough surfaces due to vacuum leakage, which leads to suction failure.

Researchers Xin Li and Kaige Shi developed a zero-pressure difference (ZPD) method to enhance the development of vacuum suction units. Their method overcame leakage limitations by using a high-speed rotating water ring between the surface and suction cup to maintain the vacuum. They discuss their work in this week's Physics of Fluids, from AIP Publishing.

"There are many applications of our design, but we think the wall-climbing robot will be the most useful," said Li. "Compared to other wall-climbing robots, the robot with our ZPD-based suction unit achieves surprising improvement in performance."

The centrifugal force of the rotating water eliminates the pressure difference at the boundary of the vacuum zone to prevent vacuum leakage. It can maintain a high vacuum pressure inside the suction cup.
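To make that balance concrete: in a rigidly rotating ring of water the radial pressure rise is dp/dr = ρω²r, so the rotation speed needed to offset a given vacuum pressure difference can be estimated as below. All numbers are made-up illustrative values, not the authors' design parameters.

```python
import math

rho = 1000.0                 # water density, kg/m^3
r_in, r_out = 0.02, 0.03     # inner/outer radius of the water ring, m (assumed)
delta_p = 40_000.0           # vacuum pressure difference to sustain, Pa (~0.4 atm)

# Integrating dp/dr = rho * w^2 * r across the ring:
# delta_p = 0.5 * rho * w^2 * (r_out**2 - r_in**2)
omega = math.sqrt(2 * delta_p / (rho * (r_out**2 - r_in**2)))
rpm = omega * 60 / (2 * math.pi)
print(f"required rotation: ~{rpm:.0f} rpm")   # ~3800 rpm for these toy values
```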

Their ZPD suction unit is energy-efficient and smaller and lighter than traditional suction units. The researchers tested their unit with three different suction sizes and applications: on a robotic arm to grip and handle objects, on a hexapod wall-climbing robot and as a Spider-Man-like wall-climbing device.

"The next step in this research is to cut down the water consumption. If the water consumption can be reduced, the suction unit will work for a very long time with little water so that the wall-climbing robot could carry its own water instead of being connected to a supply," said Li.

Credit: 
American Institute of Physics

NJIT scientists measure the evolving energy of a solar flare's explosive first minutes

Toward the end of 2017, a massive new region of magnetic field erupted on the Sun's surface next to an existing sunspot. The powerful collision of magnetic energy produced a series of potent solar flares, causing turbulent space weather conditions at Earth. These were the first flares to be captured, in their moment-by-moment progression, by NJIT's then recently opened Expanded Owens Valley Solar Array (EOVSA) radio telescope.

In research published in the journal Science, the solar scientists who recorded those images have pinpointed for the first time ever exactly when and where the explosion released the energy that heated spewing plasma to energies equivalent to 1 billion degrees in temperature.

With data collected in the microwave spectrum, they have been able to provide quantitative measurements of the evolving magnetic field strength directly following the flare's ignition and have tracked its conversion into other energy forms - kinetic, thermal and superthermal - that power the flare's explosive 5-minute trip through the corona.

To date, these changes in the corona's magnetic field during a flare or other large-scale eruption have been quantified only indirectly, from extrapolations, for example, of the magnetic field measured at the photosphere--the surface layer of the Sun seen in white light. These extrapolations do not permit precise measurements of the dynamic local changes of the magnetic field in the locations and at time scales short enough to characterize the flare's energy release.

"We have been able to pinpoint the most critical location of the magnetic energy release in the corona," said Gregory Fleishman, a distinguished research professor of physics in NJIT's Center for Solar-Terrestrial Research and author of the paper. "These are the first images that capture the microphysics of a flare--the detailed chain of processes that occur on small spatial and time scales that enable the energy conversion."

By measuring the decline in magnetic energy and the simultaneous strength of the electric field in the region, they were able to show that the two accord with the law of energy conservation, and were thus able to quantify the particle acceleration that powers the solar flare, including the associated eruption and plasma heating.

These fundamental processes are the same as those occurring in the most powerful astrophysical sources, including gamma ray bursts, as well as in laboratory experiments of interest to both basic research and the generation of practical fusion energy.

With 13 antennas working together, EOVSA takes pictures at hundreds of frequencies in the 1-18 GHz microwave range within a second. This enhanced ability to peer into the mechanics of flares opens new pathways to investigate the most powerful eruptions in our solar system, which are ignited by the reconnection of magnetic field lines on the Sun's surface and powered by stored energy in its corona.

"Microwave emission is the only mechanism that is sensitive to the coronal magnetic field environment, so the unique, high-cadence EOVSA microwave spectral observations are the key to enabling this discovery of rapid changes in the magnetic field," noted Dale Gary, a distinguished professor of physics at NJIT, EOVSA's director and a co-author of the paper. "The measurement is possible because the high-energy electrons traveling in the coronal magnetic field dominantly emit their magnetic-sensitive radiation in the microwave range."

Before EOVSA's observations, there was no way to see the vast region of space over which high-energy particles are accelerated and then become available for further acceleration by the powerful shock waves driven by the flare eruption, which, if directed at Earth, can destroy spacecraft and endanger astronauts.

"The connection of the flare-accelerated particles to those accelerated by shocks is an important piece in our understanding of which events are benign and which pose a serious threat," Gary said.

Just over two years after the expanded array began operating, it is automatically generating microwave images of the Sun and making them available to the scientific community on a day-to-day basis. As solar activity increases over the course of the 11-year solar cycle, they will be used to provide the first daily coronal magnetograms, maps of magnetic field strength 1,500 miles above the Sun's surface.

Credit: 
New Jersey Institute of Technology

NASA water vapor imagery shows Tino's heavy rain potential over Fiji

image: NASA's Aqua satellite passed over Tropical Cyclone Tino in the Southern Pacific Ocean on Jan. 17 at 7:50 a.m. EST (12:50 UTC). The highest concentrations of water vapor (brown) and coldest cloud top temperatures were around the center of circulation and over Fiji and surrounding islands.

Image: 
NASA/NRL

When NASA's Aqua satellite passed over the Southern Pacific Ocean it gathered water vapor data that provided information about the intensity of Tropical Cyclone Tino.

Tropical Cyclone Tino formed near Fiji in the Southern Pacific Ocean and NASA's Aqua satellite provided meteorologists with a look at the water vapor content of the storm showing potential for heavy rain.

On January 17, 2020, many warnings and watches were in effect from the Fiji Meteorological Service. A tropical cyclone warning is in force for Cikobia, Vanua Levu, Taveuni and nearby smaller islands, Yasawa, and the Lau and Lomaiviti Groups. A tropical cyclone alert remains in force for the eastern half of Viti Levu. A storm warning is in force for Lakeba, Cicia, Tuvuca, Nayau, Oneata, Moce, Komo, Kabara, Namuka-1-Lau, Fulaga and Ogea. A gale warning remains in force for Cikobia, Vanua Levu, Taveuni and nearby smaller islands, the eastern half of Viti Levu, Yasawa, and the rest of the Lau and Lomaiviti groups. A strong wind warning remains in force for the rest of the Fiji Group.

On Jan. 17 at 7:50 a.m. EST (12:50 UTC), the Moderate Resolution Imaging Spectroradiometer or MODIS instrument aboard NASA's Aqua satellite gathered water vapor content and temperature information on Tropical Storm Tino. The MODIS image showed highest concentrations of water vapor and coldest cloud top temperatures were around the center of circulation and over Fiji and surrounding islands. Coldest cloud top temperatures were as cold as or colder than minus 70 degrees Fahrenheit (minus 56.6 degrees Celsius) in those storms. Storms with cloud top temperatures that cold have the capability to produce heavy rainfall.

Water vapor analysis of tropical cyclones tells forecasters how much potential a storm has to develop. Water vapor releases latent heat as it condenses into liquid. That liquid becomes clouds and thunderstorms that make up a tropical cyclone. Temperature is important when trying to understand how strong storms can be. The higher the cloud tops, the colder the cloud tops and the stronger the storms.

On Jan. 17 at 4 a.m. EST (0900 UTC), Tropical Storm Tino was located near latitude 16.3 degrees south and longitude 179.4 degrees east, about 178 nautical miles north-northeast of Suva, Fiji. Tino was moving to the southeast with maximum sustained winds near 55 knots (62 mph/102 kph).

Tino is forecast to move southeast while strengthening to 60 knots (69 mph/111 kph). After a day or two, the storm will become extra-tropical while weakening.

NASA's Aqua satellite is one in a fleet of NASA satellites that provide data for hurricane research.

Tropical cyclones/hurricanes are the most powerful weather events on Earth. NASA's expertise in space and scientific exploration contributes to essential services provided to the American people by other federal agencies, such as hurricane weather forecasting.

By Rob Gutro
NASA's Goddard Space Flight Center

Credit: 
NASA/Goddard Space Flight Center

'Melting rock' models predict mechanical origins of earthquakes

video: Researchers twist rock discs against one another under large amounts of pressure at high speeds to simulate what happens during earthquakes at fault lines. New models from Duke engineers are the first that can accurately reproduce how the amount of friction decreases as the speed of the rock slippage increases and the rock undergoes a phase change.

Image: 
Giulio DiToro (University of Padova), Elena Spagnuolo and Stefano Aretusini (National Institute of Geophysics and Volcanology, Rome)

DURHAM, N.C. -- Engineers at Duke University have devised a model that can predict the early mechanical behaviors and origins of an earthquake in multiple types of rock. The model provides new insights into unobservable phenomena that take place miles beneath the Earth's surface under incredible pressures and temperatures, and could help researchers better predict earthquakes -- or even, at least theoretically, attempt to stop them.

The results appear online on January 17 in the journal Nature Communications.

"Earthquakes originate along fault lines deep underground where extreme conditions can cause chemical reactions and phase transitions that affect the friction between rocks as they move against one another," said Hadrien Rattez, a research scientist in civil and environmental engineering at Duke. "Our model is the first that can accurately reproduce how the amount of friction decreases as the speed of the rock slippage increases and all of these mechanical phenomena are unleashed."

For three decades, researchers have built machines to simulate the conditions of a fault by pushing and twisting two discs of rock against one another. These experiments can reach pressures of up to 1450 pounds per square inch and speeds of one meter per second, which is the fastest underground rocks can travel. For a geological reference point, the Pacific tectonic plate moves at about 0.00000000073 meters per second.

"In terms of ground movement, these speeds of one meter per second are incredibly fast," said Manolis Veveakis, assistant professor of civil and environmental engineering at Duke. "And remember that friction is synonymous with resistance. So if the resistance drops to zero, the object will move abruptly. This is an earthquake."

In these experiments, the surface of the rocks either begins to turn into a sort of gel or to melt, lowering the coefficient of friction between them and making their movement easier. It's been well established that as the speed of these rocks relative to one another increases to one meter per second, the friction between them drops like a rock, you might say, no matter the type. But until now, nobody had created a model that could accurately reproduce these behaviors.

In the paper, Rattez and Veveakis describe a computational model that takes into account the energy balance of all the complicated mechanical processes taking place during fault movement. They incorporate weakening mechanisms caused by heat that are common to all types of rock, such as mineral decomposition, nanoparticle lubrication and melting as the rock undergoes a phase change.

After running all of their simulations, the researchers found that their new model accurately predicts the drop in friction associated with the entire range of fault speeds from experiments on all available rock types including halite, silicate and quartz.

Because the model works well for so many different types of rock, it appears to be a general model that can be applied to most situations, which can reveal new information about the origins of earthquakes. While researchers can't fully recreate the conditions of a fault, models such as this can help them extrapolate to higher pressures and temperatures to get a better understanding of what is happening as a fault builds toward an earthquake.

"The model can give physical meaning to observations that we usually cannot understand," Rattez said. "It provides a lot of information about the physical mechanisms involved, like the energy required for different phase transitions."

"We still cannot predict earthquakes, but such studies are necessary steps we need to take in order to get there," said Veveakis. "And in theory, if we could interfere with a fault, we could track its composition and intervene before it becomes unstable. That's what we do with landslides. But, of course, fault lines are 20 miles underground, and we currently don't have the drilling capacity to go there."

Credit: 
Duke University

Sanitary care by social ants shapes disease outcome

image: Argentine ant workers.

Image: 
Gert Brovad

Who wins in a competition is largely dependent on the opponent faced, yet the role of the environment in which the battle takes place should not be underestimated either. In sports, some skiers profit from icy over snowy grounds and some tennis players are weaker on sand than on grass. Similarly, within our bodies, the immune system sets the environment for the competition between multiple pathogens that infect us at the same time.

It has long been known that an immune response can bias the competitive outcome of competing pathogens, as it may affect one pathogen more than another. Professor Sylvia Cremer and her team at the Institute of Science and Technology Austria (IST Austria) have now provided the first evidence that it is not only the immune system of the host individual that shapes the competitive outcome of coinfecting pathogens within the insect body, but that the social context can have a similar effect.

Survival of the fastest

Solitary species have to fight disease alone. In contrast, in groups of social species--including bees, ants and termites--nestmates often assist infected individuals by providing sanitary care, thereby creating an environment of "social immunity". The Cremer group discovered that--besides the immune system of the individual host itself--the sanitary care provided by ants to their fungus-exposed colony members modulates the pathogen competition inside the host's body, changing the success of pathogen outgrowth after infection.

Testing a number of different pathogen combinations, Cremer's team found that one fungal pathogen species that was very successful in winning the competition in individually reared ants was much less successful when the ants were reared together with healthy colony members. The researchers discovered that this bias introduced by care-providing nestmates was not caused by selective grooming of one pathogen species over the other. Rather, the fungal spores differed in their susceptibility to the ants' grooming: spores that quickly enter their host's body turned out to be less susceptible to grooming than spores that need more time to penetrate the body surface. Because of this slower germination speed, the respective pathogen was exposed to the ants' grooming for longer than its otherwise weaker competitor.

Professor Sylvia Cremer summarizes: "If one pathogen species takes longer to germinate, this leaves the ants more time and increases their chance to groom it off. Hence a fast germination reduces the time window for the ants to perform successful sanitary care and can shift the balance towards winning the competition against a slower-germinating pathogen species."
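One way to make this time-window argument concrete is a toy calculation in which spore removal by grooming acts only until the spore germinates; the model, rates and times below are purely illustrative and are not the analysis used in the study.

```python
import math

def infection_probability(germination_time_h, grooming_rate_per_h):
    """Toy model: a spore establishes an infection if it is not groomed off
    before it germinates; removal is treated as a constant-rate random process."""
    return math.exp(-grooming_rate_per_h * germination_time_h)

# A fast-germinating pathogen is exposed to allogrooming for less time
# than a slow one, so more of its spores establish an infection.
for label, t_germ in [("fast germinator", 12.0), ("slow germinator", 36.0)]:
    p = infection_probability(t_germ, grooming_rate_per_h=0.05)
    print(f"{label}: {p:.0%} of spores escape grooming")   # ~55% vs ~17%
```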

Social care-giving beats self-cleaning

Pathogens of the genus Metarhizium infect insects by attaching to the body surface of their hosts as spores and then germinating. Germinated spores grow a plug-like structure that produces both pressure and lytic enzymes to break through the host's body surface. They then grow into the host, replicate, kill the host with toxins and produce millions of new spores that cause the next round of infections. Grooming helps ants to effectively prevent these infections.

First author and IST Austria postdoc Barbara Milutinović explains: "The ants use their mouthparts to pluck the infectious spores off the body surface of their nestmates. Such social allogrooming is much more efficient than self-grooming, as some body parts cannot be reached by oneself--as we all know from our own experience when we try to scratch an itchy spot on our own back." As the Cremer group found, in the presence of grooming nestmates this social allogrooming can induce a shift in the pathogen community inside the host--and thus alter the disease outcome.

Credit: 
Institute of Science and Technology Austria

Green in tooth and claw

Go ahead, take a big bite.

Hard plant foods may have made up a larger part of early human ancestors' diet than currently presumed, according to a new experimental study of modern tooth enamel from Washington University in St. Louis.

Scientists often look at microscopic damage to teeth to infer what an animal was eating. This new research -- using experiments looking at microscopic interactions between food particles and enamel -- demonstrates that even the hardest plant tissues scarcely wear down primate teeth. The results have implications for reconstructing diet, and potentially for our interpretation of the fossil record of human evolution, researchers said.

"We found that hard plant tissues such as the shells of nuts and seeds barely influence microwear textures on teeth," said Adam van Casteren, lecturer in biological anthropology in Arts & Sciences, the first author of the new study in Scientific Reports. David S. Strait, professor of physical anthropology, is a co-author.

Traditionally, eating hard foods is thought to damage teeth by producing microscopic pits. "But if teeth don't demonstrate elaborate pits and scars, this doesn't necessarily rule out the consumption of hard food items," van Casteren said.

Humans diverged from non-human apes about seven million years ago in Africa. The new study addresses an ongoing debate surrounding what some early human ancestors, the australopiths, were eating. These hominin species had very large teeth and jaws, and likely huge chewing muscles.

"All these morphological attributes seem to indicate they had the ability to produce large bite forces, and therefore likely chomped down on a diet of hard or bulky food items such as nuts, seeds or underground resources like tubers," van Casteren said.

But most fossil australopith teeth don't show the kind of microscopic wear that would be expected in this scenario.

The researchers decided to test it out.

Previous mechanical experiments had shown how grit -- literally, pieces of quartz rock -- produces deep scratches on flat tooth surfaces, using a device that mimicked the microscopic interactions of particles on teeth. But there was little to no experimental data on what happens to tooth enamel when it comes in contact with actual woody plant material.

For this study, the researchers attached tiny pieces of seed shells to a probe that they dragged across enamel from a Bornean orangutan molar tooth.

They made 16 "slides" representing contacts between the enamel and three different seed shells from woody plants that are part of modern primate diets. The researchers dragged the seeds against enamel at forces comparable to any chewing action.

The seed fragments made no large pits, scratches or fractures in the enamel, the researchers found. There were a few shallow grooves, but the scientists saw nothing that indicated that hard plant tissues could contribute meaningfully to dental microwear. The seed fragments themselves, however, showed signs of degradation from being rubbed against the enamel.

This information is useful for anthropologists who are left with only fossils to try to reconstruct ancient diets.

"Our approach is not to look for correlations between the types of microscopic marks on teeth and foods being eaten -- but instead to understand the underlying mechanics of how these scars on tooth surface are formed," van Casteren said. "If we can fathom these fundamental concepts, we can generate more accurate pictures of what fossil hominins were eating."

So those big australopith jaws could have been put to use chewing on large amounts of seeds -- without scarring teeth.

"And that makes perfect sense in terms of the shape of their teeth" said Peter Lucas, a co-author at the Smithsonian Tropical Research Institute, "because the blunt low-cusped form of their molars are ideal for that purpose."

"When consuming many very small hard seeds, large bite forces are likely to be required to mill all the grains," van Casteren said. "In the light of our new findings, it is plausible that small, hard objects like grass seeds or sedge nutlets were a dietary resource for early hominins."

Credit: 
Washington University in St. Louis