
Tunable rainbow light trapping in ultrathin resonator arrays

image: Spectral (a-c) and spatial (d-f) field profiles of groove arrays with gradients in (a,d) groove length, (b,e) groove width and (c,f) both groove length and groove width. Each gradient design produces broadband field enhancement, with the spatial and spectral profiles determined by the geometries of the constituent grooves. (g-h) SEM images of the nanoscale metal-dielectric-metal grooves with a minimum groove width of 5 nm, allowing for extreme light concentration.

Image: 
Nazir Kherani

Through the manipulation of quasiparticles known as surface plasmon polaritons (SPPs), metallic nanostructures can control light at the nanoscale, confining it to ultrathin gaps and magnifying its intensity up to 1,000 times. This light localization amplifies light-matter interactions and has wide-ranging applications in nanoscale optics, including low-concentration molecular sensing, enhanced photocatalysis and super-resolution optics. However, current approaches to designing and fabricating these nanostructures lack control over the device's optical response and often suffer from poor light confinement, limiting the applications that can benefit from this light-trapping effect.

A new paper published in Light: Science & Applications by a team of scientists led by Professor Nazir Kherani at the University of Toronto outlines novel design and fabrication techniques optimized for light localization across the visible spectrum, also known as rainbow light trapping. The nanostructures consist of grooves grouped into arrays, with gradients in groove geometry in various configurations providing broadband electromagnetic field enhancement. Using an analytical model, combinations of these grooves are designed to produce devices that control both the spectral and spatial distribution of localized light. The team demonstrated the capabilities of their design by creating various arrays of ultrathin metallic grooves with groove widths down to 5 nm, allowing for extreme light confinement. The combination of precise analytical design, versatile device fabrication and extreme field localization should allow these structures to be applied widely for the enhancement of many optical phenomena.
The researchers highlight the versatility of their techniques as key to the significance of their work. "The analytical design approach significantly reduces the time required to develop new devices compared to the numerous iterative simulations currently used. Additionally, the bottom-up fabrication technique can be easily modified to create nanostructures for any application while producing very strong electric fields," the scientists state. The researchers hope that this versatility will allow their work to have a wide-ranging impact in the field of nanoscale optics.

Credit: 
Light Publishing Center, Changchun Institute of Optics, Fine Mechanics And Physics, CAS

Curtin collision models impact the future of energy

A new Curtin University-created database of electron-molecule reactions is a major step forward in making nuclear fusion power a reality, by allowing researchers to accurately model plasmas containing molecular hydrogen.

The Curtin study, published in the Atomic Data and Nuclear Data Tables journal, is supplying data to the International Thermonuclear Experimental Reactor (ITER) - one of the largest scientific projects in the world aimed at developing fusion technology for electricity production on Earth.

Lead researcher, PhD candidate and Forrest Scholar Liam Scarlett from the Theoretical Physics Group in Curtin's School of Electrical Engineering, Computing and Mathematical Sciences said his calculations and the resulting collision database will play a crucial role in the development of fusion technology.

"Our electron-molecule collision modelling is an exciting step in the global push to develop fusion power - a new, clean electricity source. Fusion is the nuclear reaction which occurs when atoms collide and fuse together, releasing huge amounts of energy. This process is what powers the Sun, and recreating it on Earth requires detailed knowledge of the different types of collisions which take place in the fusion plasma - that's where my research comes in," Mr Scarlett said.

"We developed mathematical models and computer codes, and utilised the Perth-based Pawsey Supercomputing Centre to calculate the probabilities of different reactions taking place during collisions with molecules. The molecules we looked at here are those which are formed from atoms of hydrogen and its isotopes, as they play an important role in fusion reactors.

"Until now, the available data were incomplete; however, our molecular collision modelling has produced an accurate and comprehensive database of more than 60,000 electron-molecule reaction probabilities which, for the first time, has allowed a team in Germany to create an accurate model for molecular hydrogen in the ITER plasma.

"This is significant because their model will be used to predict how the plasma will radiate, leading to a better understanding of the plasma physics, and the development of diagnostic tools which are vital for controlling the fusion reaction."

The research project was funded by the United States Air Force Office of Scientific Research as part of an international research endeavour to harness fusion power as a future energy source.

Research supervisor and co-author Professor Dmitry Fursa, from Curtin's School of Electrical Engineering, Computing and Mathematical Sciences, said fusion power is attractive due to its virtually unlimited fuel supply (hydrogen) and the lack of long-lived radioactive waste or carbon emissions.

"Fusion is one of the biggest projects in the world right now. You can harness an enormous amount of energy from the reaction that occurs when you take hydrogen atoms and fuse them together," Professor Fursa said.

"This new and comprehensive electron-molecule collision modelling has provided a solid basis for other researchers to continue their work into developing an efficient reactor to re-create the Sun's fusion process here on Earth."

Credit: 
Curtin University

A semiconductor chip detects antigen concentrations at one part per quadrillion by molar mass

image: IoT sensor captures and detects trace amounts of antigens on a nano-film surface

Image: 
COPYRIGHT (C) TOYOHASHI UNIVERSITY OF TECHNOLOGY. ALL RIGHTS RESERVED.

Overview:

Associate Professor Kazuhiro Takahashi and Assistant Professor Yong-Joon Choi of the Department of Electrical and Electronic Information Engineering at Toyohashi University of Technology have developed a chip that can sense antigens at one part per quadrillion by molar mass. The chip was created using semiconductor micromachining technology. Antigens derived from diseases and present in blood and saliva adhere to the surface of a flexibly deformable nanosheet. The force generated by interactions among the adhered antigens is then converted into nanosheet deformation, allowing specific antigens to be detected. Because the sensor chip is made with semiconductor technology at the millimeter scale, it is expected to contribute to telemedicine by functioning as an IoT biosensor that allows antigen and antibody tests to be carried out at home.

Details:

The measuring device simply and quickly detects diseases using a minuscule amount of blood, urine, saliva or other bodily fluid, and will be a vital tool for accurately diagnosing diseases, verifying treatment results, and checking for recurrence and metastasis. Research is being conducted into biosensors that can measure treatment results and pathological reactions by detecting DNA, RNA and proteins contained in such fluids. This technology has recently attracted interest across the globe, with antigen and antibody detection widely used to detect and determine the presence of novel coronavirus infections. Furthermore, among COVID-19 patients, reports suggest that those with severe symptoms show differences in the concentrations of multiple blood proteins compared to those with mild symptoms. By examining such markers, this technology is expected to be used to predict illness severity. Current detection devices are not digitized and require visual confirmation of color changes using a labeling agent. Reading the wide range of markers is time-consuming, which has made implementation in IoT devices difficult.

The research team is developing a micro sensor chip that checks for diseases using a flexibly deformable nanosheet made with semiconductor micromachining technology. First, an antibody that captures the targeted antigens is fixed onto the nanosheet, and the deformation of the thin film caused by electric repulsion among the adhered antigens is measured. Because sensitivity improves as the membrane to which the antigens adhere becomes thinner and softer, organic nanosheets roughly twice as soft as semiconductor silicon are used; this is expected to roughly double the sensor sensitivity compared with conventional silicon-based sensors. In addition, development is continuing on signal detection technology that uses a smartphone camera to detect nanosheet deformation.

With this sensor, which is designed to detect subtle changes in the adhesion of biomolecules, the antibody must be fixed to the nanosheet in advance in order to capture the antigen, and issues related to film degradation can make this process difficult. The research team optimized the density of antibodies adhering to a nanomembrane of adjustable thickness, creating a biosensor that detects only the targeted antigens with high sensitivity. Moreover, since deformation of the nanosheet caused by adhered molecules can be detected in real time, the technology is expected to allow for quick detection of disease-derived molecules. The biosensor developed in this project was used in an experiment to detect albumin, a protein contained in blood. The experiment successfully detected one femtogram of antigen contained in one milliliter (a molar concentration of about 15 attomolar). With a minimum detection limit almost equivalent to that of large-scale detection devices that use labeling agents, this device is expected to allow for ultra-sensitive detection on a portable scale.
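The reported detection limit can be checked with a short mass-to-molarity conversion. Note that the molar mass of human serum albumin (~66,500 g/mol) is an assumed value, not a figure given in the article:

```python
# Convert the reported detection limit (1 femtogram of albumin in 1 mL)
# into a molar concentration. The molar mass of human serum albumin
# (~66,500 g/mol) is an assumption, not taken from the article.
albumin_molar_mass = 66_500.0  # g/mol (assumed)

mass_detected = 1e-15  # grams (1 femtogram)
volume = 1e-3          # litres (1 millilitre)

moles = mass_detected / albumin_molar_mass  # ~1.5e-20 mol
concentration = moles / volume              # mol/L
print(f"{concentration:.1e} M")             # ~1.5e-17 M, i.e. ~15 attomolar
```

The result, roughly fifteen billionths of a billionth of a mole per litre, matches the attomolar figure quoted above.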

Future Outlook:

Going forward, the research team plans to conduct trials using semiconductor sensors to detect markers for severe symptoms of COVID-19 infection. In addition to blood-based detection, chemical sensors are being developed to detect odors and chemical substances. The team believes these new, small-scale sensor devices can contribute to an IoT-based society. By replacing the probe molecule on the surface of the nanosheet, the technology can be used to detect viruses as well as a variety of other biomarkers. By making these biosensors common in society, the team aims to contribute to telemedicine, allowing doctor diagnoses to be performed from home.

Credit: 
Toyohashi University of Technology (TUT)

Danish researchers develop budget optical ammonia sensor

In collaboration with the Technical University of Denmark (DTU), the Department of Engineering at Aarhus University has developed photonic sensor technology that can pave the way for a portable, reliable and, above all, inexpensive device for detecting ammonia and other gases in agriculture.

Working with chemists and chemical engineers, photonics researchers from Aarhus University and DTU have developed a new sensor system that could contribute to a significant reduction in air pollution in Denmark.

The researchers have developed an integrated optical sensor, based on modern telecommunication technology, that measures ammonia in the air using a laser, a gas sensor and hollow-core optical fibres.

The new sensor is described in the journal Photonics (MDPI).

"Our device demonstrates that it is possible to carry out continuous ammonia monitoring for the agricultural sector, and because it is based on mature telecommunications technology, the system can be built at very low cost. At the same time, the system is very compact, and it meets the need for a portable, reliable and above all inexpensive system for detecting ammonia," says Andreas Hänsel, a postdoc from Aarhus University and part of the Photonic Integrated Circuits research group.

Development of the sensor technology is still ongoing, and focus is currently on increasing the sensitivity of the equipment. Andreas Hänsel also says that, although the new sensor has been developed to detect ammonia, it can easily be configured to detect a wide range of other gases, including greenhouse gases.

The new technology has been developed as part of the Ecometa project, which has received DKK 12.5 million funding from Innovation Fund Denmark.

In collaboration with leading Danish technology companies, the project researchers have been looking for new methods and developed technology to measure and reduce air pollution from the agricultural sector for several years.

Livestock production is today responsible for a significant part of Danish air pollution, primarily from ammonia, which is one of the largest environmental problems for agriculture. However, emissions of ammonia are not measured at all today at farm level, as such measurements are rather expensive.

"Ecometa is a very relevant research project with great prospects for agriculture. The ability to continuously and cost-effectively track the development in ammonia emissions from agriculture provides completely new opportunities for the industry to experiment on reducing the emissions," says Kent Myllerup, Head of Pig Innovation at SEGES and chairman of the steering committee for Ecometa.

Today, air pollution from agriculture is based on standard figures, but a future prospect of continuous farm level measurements could be an incentive in the green transformation of the industry, explains Associate Professor Anders Feilberg, who heads Aarhus University's part of the project:

"The new technology takes us one step closer to enabling farmers to monitor their emissions continuously. With accurate monitoring of ammonia emissions from sheds and stables, farmers can streamline operations far better. This takes us closer to emissions-based regulation using measured emissions, and it can significantly reduce the environmental impact of agriculture," says Feilberg.

The project will run until 2021, and it will help comply with Denmark's international commitments for the environment, especially with regard to ammonia.

Credit: 
Aarhus University

New-generation solar cells developed at TalTech contribute to the green revolution

image: Researchers at TalTech Laboratory of Photovoltaic Materials

Image: 
TalTech

The European Union is determined to undertake a major reform known as the European Green Deal, with the aim of making Europe the first climate-neutral continent by 2050. The biggest changes will take place in the energy production sector, which stands on the brink of a complete transition to renewable energy sources, including solar energy. To boost the power output of solar cells to the terawatt scale, technologies that leave a smaller ecological footprint, are more efficient and offer a wider range of applications need to be developed alongside the first-generation silicon-based solar cells that currently dominate the market.

TalTech's photovoltaic materials and optoelectronic materials physics research groups published an article in the journal Solar Energy titled "The effect of S/Se ratio on the properties of Cu2CdGe(SxSe1-x)4 microcrystalline powders for photovoltaic applications", which focused on the development of new-generation monograin layer solar cells.

One of the authors of the article, Head of the TalTech Laboratory of Photovoltaic Materials, Senior Researcher Marit Kauk-Kuusik says, "Unlike the widespread silicon-based solar panels, the next-generation solar cells are made of very thin layers of material. To build such solar cells, semiconductors with very good light-absorbing properties must be used. As is known, light absorption in silicon is rather poor, thus requiring relatively thick absorber layers, which make solar cells heavy and rigid. Our research focused on the analysis of the potential applications of the Cu2CdGe(SxSe1-x)4 semiconductor in the production of solar energy. In this study, we focused on the effect of the sulfur/selenium (S/Se) ratio on the optoelectronic properties of the absorber material in order to maximise the spectral sensitivity range."

A solar cell works on the principle of the photovoltaic effect, i.e. light is converted directly into electricity. A solar cell absorber should absorb light as efficiently as possible, ideally harnessing the full spectrum of wavelengths in solar radiation. In addition, the absorption coefficient of the absorber material must be as high as possible, so that even a very thin layer of the absorber absorbs all the incident light. This in turn means that less material is required to produce an absorber than in the case of a lower absorption coefficient. Therefore, while absorbers made of silicon, a material with a low absorption coefficient, are 150-200 μm thick, the layers of modern absorber materials based on monograin powders can be 5-10 times thinner (i.e. 10-30 μm thick). This also automatically reduces the weight of the solar cell.
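The link between absorption coefficient and required thickness follows from the Beer-Lambert law. A minimal sketch, using order-of-magnitude absorption coefficients chosen for illustration rather than figures from the study:

```python
import math

# Thickness needed to absorb 95% of incident light, from the Beer-Lambert
# law: I/I0 = exp(-alpha * d)  =>  d = ln(1 / (1 - f)) / alpha.
# The absorption coefficients below are illustrative order-of-magnitude
# values, not figures from the study.
def thickness_um(alpha_per_cm, absorbed_fraction=0.95):
    d_cm = math.log(1.0 / (1.0 - absorbed_fraction)) / alpha_per_cm
    return d_cm * 1e4  # convert cm to micrometres

print(round(thickness_um(1e2)))     # weakly absorbing material: ~300 um
print(round(thickness_um(1e4), 1))  # strongly absorbing material: ~3.0 um
```

A hundredfold increase in the absorption coefficient shrinks the required absorber thickness a hundredfold, which is the logic behind replacing thick silicon with thin layers of a strongly absorbing semiconductor.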

A lower weight of solar cells also means a decrease in material consumption, which is far from unimportant in the current era of growing environmental awareness and green revolution. "It is important to consistently search for new alternatives to the silicon-based solar cells that have been used for decades," Marit Kauk-Kuusik says. The trend is towards environmental friendliness and overall sustainability. In addition to reduced material consumption and weight, the new solutions are also much more innovative. The keywords remain high performance, lightness, flexibility and durability.

While conventionally costly vacuum evaporation or sputtering technologies have been widely used to produce solar cells, the unique monograin powder technology applied by TalTech material researchers does not require any high vacuum equipment. Microcrystalline powder is synthesized by molten salt method in quartz ampoules in a special chamber furnace. The mass obtained is washed and sieved into narrow size fractions by a special sieving system and the synthesized high-quality microcrystalline powder, monograin powder, is used for the production of solar cells.

Marit Kauk-Kuusik says, "The monograin powder produced by our powder technology consists of microcrystals that form miniature solar cells in a large module. This provides major advantages over silicon-based solar panels: the material is lightweight, flexible, can be semi-transparent, while being environmentally friendly and significantly less expensive."

Environmentally friendly energy production has become vital in the light of the green revolution and sustainable consumption. Renewable energy, where solar energy plays an increasingly significant role, is an important keyword here.

"The power conversion efficiency of the solar cell developed as a result of our research is 6.4%, which is the highest published performance for Cu2CdGe(SxSe1-x)4-based solar cells and slightly higher than that of the world's first silicon-based cell developed decades ago. Thus, it is a promising result," Kauk-Kuusik says. She is also convinced that, unlike silicon, for which it took 30 to 40 years to achieve higher power conversion efficiency, this material will reach such results in a much shorter period of time.

Credit: 
Estonian Research Council

Multi-center, multi-tracer PET studies harmonized to detect neuroinflammation in ALS

image: Surface-based analysis between participants with ALS and HVs for 18F-DPA714 SUVR40-60, 11C-PBR28 SUVR60-90, and both tracers (18F-DPA714 SUVR40-60 combined with 11C-PBR28 SUVR60-90) using WB without ventricles as pseudo reference region.

Image: 
Images created by Van Weehaeghe et al. JNM. 2020.

Reston, VA--A novel ALS (amyotrophic lateral sclerosis) study has pooled data from multiple sites to effectively visualize neuroinflammation, which is key to developing drugs to treat the disease. Pooling data acquired from different scanners, different neuroinflammation positron emission tomography (PET) markers and different sites enhanced researchers' ability to detect neuroinflammation in ALS patients. This research was published in the November issue of The Journal of Nuclear Medicine.

ALS is a rare and fatal neurodegenerative disease that causes progressive weakness, respiratory failure and eventual death. Developing drugs to treat the disease is uniquely challenging because it is so rare. "In rare diseases such as ALS, only a limited pool of participants is available to participate in imaging studies," noted Donatienne Van Weehaeghe, MD, PhD, researcher in the department of imaging and pathology at University Hospital Leuven in Leuven, Belgium. "Therefore, conducting collaborative research across various sites and bringing in data to a common analysis pool is valuable to accelerate imaging biomarker development."

The study investigated two second-generation translocator protein (TSPO) tracers, 18F-DPA714 and 11C-PBR28, that are currently being developed in the United States and Europe as promising ALS biomarkers. Researchers first sought to validate the established 11C-PBR28 PET pseudo reference analysis technique (which is used as a substitute for full dynamic modeling) for 18F-DPA714; they then evaluated whether multicenter pooling of 18F-DPA714 and 11C-PBR28 data was feasible.

ALS patients and healthy volunteers from the United States and Belgium were recruited for the study and underwent dynamic 18F-DPA714 or 11C-PBR28 PET/MRI (magnetic resonance imaging). Data from the 18F-DPA714 or 11C-PBR28 images were analyzed, and results were compared.

The pseudo reference analysis technique was found to produce results comparable to those of gold standard PET analyses obtained by full dynamic modeling. The most sensitive pseudo reference region was whole brain without ventricles. Analysis of the 18F-DPA714 and 11C-PBR28 data from multiple sites showed a much greater power to detect inflammation compared to individual site data alone.
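At its core, the pseudo reference technique is a ratio: mean tracer uptake in a target region divided by mean uptake in the pseudo reference region (here, whole brain without ventricles). A minimal sketch, with all uptake values invented for illustration:

```python
# Standardized uptake value ratio (SUVR) relative to a pseudo reference
# region, used in place of full dynamic kinetic modelling.
# All uptake values below are invented for illustration.
def suvr(target_uptake, reference_uptake):
    """Mean target uptake normalized by mean uptake in the
    pseudo reference region (e.g. whole brain without ventricles)."""
    mean = lambda values: sum(values) / len(values)
    return mean(target_uptake) / mean(reference_uptake)

motor_cortex = [1.8, 2.0, 1.9]               # hypothetical regional uptake
whole_brain_no_ventricles = [1.0, 1.1, 0.9]  # hypothetical reference uptake
print(round(suvr(motor_cortex, whole_brain_no_ventricles), 2))  # 1.9
```

Because the ratio cancels scanner- and subject-level scaling factors, it is far easier to harmonize across sites than absolute uptake values, which is what made the multicenter pooling in this study tractable.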

"In this exciting study, we have shown the ability to pool together and analyze brain neuroinflammation PET imaging data acquired at multiple institutions with varying scanner capabilities, using state-of-the-art analytical tools. This is the essential first step for bringing cutting-edge research closer to ALS patients globally and for accelerating the pace of biomarker readouts for future ALS clinical trials," said Suma Babu, MBBS, MPH, assistant professor of neurology at Harvard Medical School and physician investigator at Massachusetts General Hospital in Boston, Massachusetts. "This approach could reduce time and travel burden for patients, allowing them to participate in novel biomarker research while remaining close to home. From a scientific study conduct standpoint, this approach retains scientific rigor, increases statistical power, reduces trial durations and reduces risks of attrition."

"Developing mechanistic central nervous system biomarkers that can be acquired across multiple study sites would greatly accelerate the pace of finding effective treatments for neurodegenerative diseases, including ALS," said Nazem Atassi, MD, MMSc, associate professor of neurology at Harvard Medical School and head of early neuro-development at Sanofi-Genzyme.

PET imaging of neuroinflammation is relevant to multiple neurological conditions, not just ALS. "The ability to combine data across different radiotracers allows researchers to build on the foundation laid by prior research without the need to start from scratch every time with a new radioligand. If ongoing and future collaborative research in this field is successful, it could directly impact the use of PET imaging markers in future clinical trials testing anti-neuroinflammatory medications in ALS and other neurological conditions," remarked Van Weehaeghe.

Credit: 
Society of Nuclear Medicine and Molecular Imaging

Skoltech scientists run a 'speed test' to boost production of carbon nanotubes

image: Skoltech researchers have investigated the procedure for catalyst delivery used in the most common method of carbon nanotube production, chemical vapor deposition (CVD), offering what they call a "simple and elegant" way to boost productivity and pave the way for cheaper and more accessible nanotube-based technology.

Image: 
Pavel Odinev/Skoltech

Skoltech researchers have investigated the procedure for catalyst delivery used in the most common method of carbon nanotube production, chemical vapor deposition (CVD), offering what they call a "simple and elegant" way to boost productivity and pave the way for cheaper and more accessible nanotube-based technology. The paper was published in the Chemical Engineering Journal.

Single-walled carbon nanotubes (SWCNTs), tiny rolled-up sheets of graphene just one atom thick, hold huge promise for applications in materials science and electronics. That is why so much effort is focused on perfecting the synthesis of SWCNTs: from physical methods, such as using laser beams to ablate a graphite target, all the way to the most common CVD approach, in which metal catalyst particles are used to "strip" a carbon-containing gas of its carbon and grow the nanotubes on these particles.

"The road from raw materials to carbon nanotubes requires a fine balance between dozens of reactor parameters. The formation of carbon nanotubes is a tricky and complex process that has been studied for a long time, but still keeps many secrets," explains Albert Nasibulin, a professor at Skoltech and an adjunct professor at the Department of Chemistry and Materials Science, Aalto University School of Chemical Engineering.

Various ways of enhancing catalyst activation, in order to produce more SWCNTs with the required properties, have already been suggested. Nasibulin and his colleagues focused on the injection procedure, namely on how to distribute ferrocene vapor (a commonly used catalyst precursor) within the reactor.

They grew their carbon nanotubes using the aerosol CVD approach, using carbon monoxide as a source of carbon, and monitored the synthesis productivity and SWCNT characteristics (such as their diameter) depending on the rate of catalyst injection and the concentration of CO2 (used as an agent for fine-tuning). Ultimately the researchers concluded that "injector flow rate adjustment could lead to a 9-fold increase in the synthesis productivity while preserving most of the SWCNT characteristics", such as their diameter, the share of defective nanotubes, and film conductivity.

"Every technology is always about efficiency. When it comes to CVD production of nanotubes, the efficiency of the catalyst is usually out of sight. However, we see a great opportunity there and this work is only a first step towards an efficient technology," Dmitry Krasnikov, senior research scientist at Skoltech and co-author of the paper, says.

Credit: 
Skolkovo Institute of Science and Technology (Skoltech)

Early human landscape modifications discovered in Amazonia

image: Aerial view of a research site called Severino Calazans.

Image: 
Martti Pärssinen

In 2002, Professor Alceu Ranzi (Federal University of Acre) and Professor Martti Pärssinen (University of Helsinki) decided to form an international research team to study large geometric earthworks, called geoglyphs, in the Brazilian state of Acre in southwestern Amazonia. It soon became apparent that a pre-colonial civilization unknown to international scholars had built geometric ceremonial centers and sophisticated road systems there. This civilization flourished in the rainforest 2,000 years ago. The discovery supported Prof. William Balée's (Tulane University) theory of early human impacts on the current Amazonian tropical forest composition, which radically altered the notion of a pristine Amazon rainforest.

Now, the team has published an article in Antiquity demonstrating that the earthwork-building civilization had a much longer human history behind it than expected. The team members show that humans regularly used fire to clear small open patches in the rainforest, activities that began quite soon after the last Ice Age ended, thousands of years before the first geoglyphs were constructed. Thanks to the charcoal left by humans in the Amazonian soil over the last 10,000 years, it was possible to systematically measure carbon-13 isotope values from many samples. Using these values, taken from archaeologically dated charcoal, past vegetation and precipitation could be estimated. The results published in Antiquity indicate that the forest's main vegetation and precipitation remained largely unchanged over the last ten thousand years, until the 20th century. No evidence of drier periods or natural or artificial savannah formations was observed before the current colonization started to penetrate southwestern Amazonia from the turn of the 19th and 20th centuries onward. Hence, the authors argue that theories of extensive savannah formations in southwestern Amazonia during the current Holocene period rest on a false interpretation of the connection between charcoal accumulation and natural fires during drier climatic periods. These interpretations have not taken into account the millennial human presence in Amazonia.

Alceu Ranzi says: "It is possible that the open patches were intended to attract large mammals such as giant sloths and mastodons, until the megafauna disappeared forever. In addition, ash and charcoal fertilized the soil, and open areas were prepared for growing palm fruits, vegetables and root plants useful for human subsistence." Martti Pärssinen adds: "It is probably not a coincidence that today southwestern Amazonia is considered one of the most important centers of domestication: cassava/manioc, squash, chili pepper and peach palm seem to have been domesticated there almost 10,000 years ago. In every case, domestication processes left important fingerprints on Amazonian forest composition. Therefore there is no such thing as virgin rainforest."

In general, the study shows that indigenous peoples of the Amazon have been able to use their environment in a sustainable manner. Pärssinen says that "there is no indication that large areas of Holocene forest would have been deforested before the second half of the 20th century. Deforestation is a current phenomenon."

Credit: 
University of Helsinki

AI reduces computational time required to study fate of molecules exposed to light

image: This is Shirin Faraji, Associate Professor in Theoretical Chemistry at the Zernike Institute for Advanced Materials, University of Groningen.

Image: 
Sylvia Germes

Light-induced processes are critical in transformative technologies such as solar energy harvesting, as well as in photomedicine and photoresponsive materials. Theoretical studies of the dynamics of photoinduced processes require numerous electronic structure calculations, which are computationally expensive. Scientists from the University of Groningen developed machine learning-based algorithms, which reduce these computations significantly. The Open Source software package that they developed, PySurf, was presented in a paper in the Journal of Chemical Theory and Computation on 24 November.

How do molecules behave when they are exposed to light? Knowledge of this process is not only central to crucial processes in nature, such as photosynthesis and vitamin D production, but it is also critical for the rational design of new molecules with specific photoresponsive properties.

Machine learning

Yet, despite great advances in hardware and computational methods, calculations of the interaction between light and molecules are still a challenge, explains Shirin Faraji, Associate Professor in Theoretical Chemistry and lead author of the paper. 'The high-level electronic structure calculations are already very costly for medium-sized molecules; typical chromophores have around thirty heavy atoms.' Including the influence of the environment at the quantum mechanical level on such a system is practically impossible.

'Current software searches the entire conformational space, but we use machine learning to exclude parts of this conformational space search, making it a very smart search,' Faraji explains. 'Our software, therefore, requires several orders of magnitude less computational time than existing direct dynamics software.' In the paper, the developers report the photodynamics of two benchmark molecules, SO2 and pyrazine, and show that their results are comparable to those obtained using simulations that are based entirely on quantum dynamics.
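The speed-up Faraji describes rests on a general idea: run the expensive method at a small number of points, train a cheap surrogate model on those results, then query the surrogate everywhere else. The sketch below illustrates only that idea; the one-dimensional toy potential and the kernel ridge regressor are stand-ins chosen for compactness, not PySurf's actual interpolation machinery.

```python
import numpy as np

def expensive_energy(x: np.ndarray) -> np.ndarray:
    """Stand-in for a costly electronic-structure call
    (a simple Morse-like one-dimensional potential)."""
    return (1.0 - np.exp(-(x - 1.0))) ** 2

def rbf_kernel(a: np.ndarray, b: np.ndarray, gamma: float = 5.0) -> np.ndarray:
    """Gaussian kernel between two sets of one-dimensional points."""
    return np.exp(-gamma * (a[:, None] - b[None, :]) ** 2)

# Run the "expensive" method at only 8 geometries...
x_train = np.linspace(0.5, 3.0, 8)
y_train = expensive_energy(x_train)

# ...and fit a kernel ridge surrogate to those points.
K = rbf_kernel(x_train, x_train)
alpha = np.linalg.solve(K + 1e-8 * np.eye(len(x_train)), y_train)

# The surrogate now answers 200 queries at negligible cost.
x_query = np.linspace(0.5, 3.0, 200)
y_pred = rbf_kernel(x_query, x_train) @ alpha

max_err = np.max(np.abs(y_pred - expensive_energy(x_query)))
print(f"max surrogate error: {max_err:.3f}")
```

Real photodynamics surrogates are trained on energies and gradients in many nuclear coordinates rather than one, but the economics are the same: eight expensive calls stand in for two hundred.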

Quantum chemistry

Furthermore, the software package was developed from scratch and is easy to adapt for specific purposes, for example by using plug-in and workflow engines. Faraji: 'A PhD student could easily dig into the code and develop a specific algorithm, for example, a new neural-network-based algorithm.'

Faraji contributed code to several software packages, most notably Q-Chem, one of the world's leading quantum chemistry software programs, and is currently a member of the Q-Chem Board of Directors. The new PySurf package will interface with Q-Chem, but also with other electronic structure software. PySurf is Open Source, which means that it is available as a free download together with the manual, and Faraji's team will provide support for users.

First release

The PySurf software is the result of a project funded by a personal grant to Faraji from the Dutch Research Council (NWO) Vidi programme. Faraji: 'We are only a year and a half into this five-year project. So, the current version is just the first release. We continue to work on the program to optimize it and to create a user-friendly interface.'

Credit: 
University of Groningen

Scientists warn of the social and environmental risks tied to the energy transition

image: Demonstration against hydropower utility in Winnipeg, Manitoba, Canada (nov 2019).

Image: 
Daniela Del Bene

Meeting the most ambitious 1.5°C climate goal requires a rapid phaseout of fossil fuels and mass deployment of renewables. However, new international research by the Institute of Environmental Science and Technology of the Universitat Autònoma de Barcelona (ICTA-UAB) warns that green energy projects can be as socially and environmentally conflictive as fossil fuel projects. While renewable energies are often portrayed as environmentally sustainable, this new study cautions about the risks associated with the green energy transition, arguing for an integrated approach that redesigns energy systems in favor of social equity and environmental sustainability. The research, which analyzes protests over 649 energy projects, was recently published in the journal Environmental Research Letters.

The study, authored by an international group of researchers with a large presence of the ICTA-UAB and led by Dr. Leah Temper, from McGill University, draws on data from the Global Atlas of Environmental Justice (EJAtlas), an online database by ICTA-UAB that systematizes over 3000 ecological conflicts. The research examines what energy projects are triggering citizen mobilizations, the concerns being expressed as well as how different groups are impacted, and the success of these movements in stopping and modifying projects.

The study finds that conflicts over energy projects disproportionately impact rural and indigenous communities and that violence and repression against protesters were rife, with assassinations of activists occurring in 65 cases, or 1 out of 10 cases studied. However, the study also points to the effectiveness of social protest in stopping and modifying energy projects, finding that over a quarter of projects facing social resistance turn out to be either cancelled, suspended, or delayed. Furthermore, it highlights how communities engage in collective action as a means of shaping energy futures, making claims for localization, democratic participation, shorter energy chains, anti-racism, climate-justice-focused governance, and Indigenous leadership.

According to Dr. Temper, "the study shows that the switch from fossil fuels to green energy is not inherently socially and environmentally benign and demonstrates how communities are standing up to demand a say in energy systems that works for them. These results call for action to ensure that the costs of decarbonization of our energy system do not fall on the most vulnerable members of our society." The study urges climate and energy policymakers to pay closer attention to the demands of collective movements to meaningfully address climate change and to move towards a truly just transition.

The study finds that amongst low-carbon energy projects, hydropower is the most socially and environmentally damaging, leading to mass displacement and high rates of violence. Of the 160 cases of hydropower plants from 43 countries studied, almost 85% are either high or medium intensity. Indigenous peoples are particularly at risk and are involved in 6 out of 10 cases. Co-author Dr. Daniela Del Bene, from ICTA-UAB, urges caution around large-scale renewables. "The case of hydropower dams shows that even less carbon-emitting technologies can cause severe impacts and lead to intense conflicts, including violence and assassinations of opponents. The energy transition is not only a matter of what technology or energy source to use but also of who controls and decides upon our energy systems", she says.

On the other hand, wind, solar, and geothermal renewable energy projects were the least conflictive and involved lower levels of repression than other projects.

According to co-author Sofia Avila, "conflicts around mega wind and solar power infrastructures are not about 'blocking' climate solutions but rather about 'opening' political spaces to build equitable approaches towards a low-carbon future. For example, in Mexico, long-lasting claims of injustice around an ambitious Wind Power Corridor in Oaxaca have spearheaded citizen debates around a just transition, while different proposals for cooperative and decentralized energy production schemes are emerging in the country."

According to Prof. Nicolas Kosoy, from McGill University, "participation and inclusiveness are key to resolving our socio-environmental crises. Both green and brown energy projects can lead to ecological devastation and social exclusion if local communities and ecosystems rights continue to be trampled upon."

The study argues that place-based mobilizations can point the way towards responding to the climate crisis while tackling underlying societal problems such as racism, gender inequality, and colonialism. According to Dr. Temper, addressing the climate crisis calls for more than a blind switch to renewables: demand-side reduction is necessary, but it needs to work in tandem with supply-side approaches such as moratoria and leaving fossil fuels in the ground. "Equity concerns need to be foremost in deciding on unminable and unburnable sites. Instead of creating new fossil fuel and green sacrifice zones, there is a need to engage these communities in redesigning just energy futures", she says.

Credit: 
Universitat Autonoma de Barcelona

How are older adults coping with the mental health effects of COVID-19?

image: McLean researchers indicate that older adults may be withstanding the mental health strains of the COVID-19 pandemic better than other age groups

Image: 
McLean Hospital

Older adults are especially vulnerable to the effects of the COVID-19 pandemic--with higher risks of severe complications and death, and potentially greater difficulties accessing care and adapting to technologies such as telemedicine. A viewpoint article published in the Journal of the American Medical Association notes that there's also a concern that isolation during the pandemic could be more difficult for older individuals, which could exacerbate existing mental health conditions. Information gathered over the past several months suggests a much more nuanced picture, however.

"Over the spring and summer of 2020, we were struck by a number of individual studies from all over the world that reported a consistent theme: Older adults, as a group, appeared to be withstanding the strains on mental health from the pandemic better than all other age groups," said lead author Ipsit Vahia, MD, medical director of the Geriatric Psychiatry Outpatient Services and the Institute for Technology in Psychiatry at McLean Hospital. "In this article, we highlight some of these studies and discuss resilience in older adults and what factors may be driving it."

Resilience may reflect an interaction among internal factors--such as an individual's stress response, cognitive capacity, personality traits, and physical health--and external resources like social connections and financial stability. For older adults experiencing isolation during the pandemic, having more meaningful relationships seems to be more important than having more interactions with others, and maintaining these relationships may require the use of technology to connect with loved ones.

Resilience can be supported through increased physical activity, enhanced compassion and emotional regulation, and greater social connectivity. Technology can play an important role in achieving these. "It can help maintain social connectivity, provide access to care via telemedicine, and also facilitate a range of other activities that may help cope with isolation," said Vahia. "It is increasingly becoming important for clinicians to assess patients' access and proficiency with technology as a part of care."

The authors stressed that although findings from the early months of the pandemic are encouraging and provide cause for cautious optimism, they may not reflect individual realities. "Older adults are a highly diverse group, and each person's response to the stresses of the pandemic depends on a unique set of circumstances," Vahia explained. "In addition, the current studies may not reflect specific high-risk populations with unique stressors, such as those living in underserved areas or those suffering with dementia or caregivers for people with dementia."

Importantly, the pandemic continues without a defined timeline or clear end in sight. The longer-term effects of COVID-19 on older adults' mental health, especially in countries with very high rates of disease, are unclear.

Credit: 
McLean Hospital

In a holiday season unlike any other, avoid unfounded claims about suicide

image: Figure 1. Percentage of holiday-season stories supporting the holiday-suicide myth vs. those debunking it, in 21 holiday seasons running from November 1999 through January 2020. Excludes stories citing both in a coincidental manner.

Image: 
Annenberg Public Policy Center of the University of Pennsylvania

The holiday season usually has the lowest monthly suicide rates. And while the COVID-19 pandemic has increased risk factors associated with suicide, the media and the public should be careful this holiday season not to make unfounded claims about suicide trends.

Last year, during the holidays and before the pandemic, about half of the newspaper stories that connected the holidays and suicide contained misinformation falsely perpetuating the myth that suicide increases over the holidays, according to a new analysis by the Annenberg Public Policy Center (APPC) of the University of Pennsylvania.

"News stories must always be careful not to sensationalize suicide," said APPC Research Director Dan Romer, who has been tracking holiday-season press coverage about suicide for over two decades. "Creating the false impression that suicide is more likely than it is can have adverse consequences for already-vulnerable individuals. This potential danger is even more critical this year, when COVID-19 has caused hundreds of thousands of deaths in the United States alone."

Though the pandemic has increased factors associated with suicide such as unemployment, financial stress, and social isolation, there is no clear evidence to date that it has increased the suicide rate, so caution is needed when reporting on this, as noted recently in The Lancet Psychiatry. In addition, a study found that suicide rates did not increase during the Massachusetts lockdown in March, April, and May, while a BMJ editorial reviewing national suicide trends during the early months of the pandemic found either no rise or a fall in suicide rates in high-income countries, including the United States.

Suicide, the myth, and the media

For decades, the holiday season has had some of the lowest suicide rates, according to the U.S. Centers for Disease Control and Prevention (CDC). CDC data from 2018, the latest available, show that the month with the lowest average daily suicide rate was December (12th in suicide rate). The next-lowest rates were in November (11th) and January (10th). In 2018, the highest rates were in June, July, and August - respectively, 1st, 2nd, and 3rd. (See Figure 2 and Table 1.)

The 2019-2020 APPC analysis found that the 33 articles linking the holidays and suicide were almost evenly split - 17 upheld the myth (52%) while 16 debunked it (48%). (See Figure 1.) By contrast, the prior year a greater percentage debunked the myth (68%) than supported it (32%).

APPC has analyzed news coverage of the holiday-suicide myth for 21 years, since the 1999-2000 holiday season. In most years, more newspapers upheld the myth than debunked it. In only three of those years did more than 60 percent of the news stories debunk the myth.

Perpetuating the holiday-suicide myth

Romer noted that the inaccurate stories this year predominantly appeared in smaller news outlets. These stories frequently featured unsupported claims made by community members, columnists, and letters-to-the-editor writers.

"News outlets often publish well-intentioned stories and features at this time of year designed to help people who find the holiday season emotionally difficult," Romer said. "But they need to be careful to avoid a false connection between the holiday or winter blues that some people experience and suicide. Years of U.S. data show that the suicide rate drops at this time of year."

The false connection between the holidays and suicide can be seen in examples such as these:

According to The Aegis, in Bel Air, Md., a councilman from the city of Havre de Grace, Md., "noted that suicides in the wider community tend to increase around the Christmas and New Year's holidays, so he urged local residents to contact Harford County's Crisis Hotline..." (December 4, 2019)

A column in the Eunice (La.) News on "suicide prevention and facts" coming from the parish sheriff's office says that "there is no single cause for suicide but the risk of suicide often rises during holidays." (December 14, 2019)

A "thumbs up, thumbs down" column in the Standard-Examiner of Ogden, Utah, reads in part: "Thumbs down: This time of year can be difficult for many, whether it's due to a recent family loss, greater sense of loneliness, or other personal struggles. This means it's typical to see increasing numbers of suicide attempts and death by suicide in Utah." (December 28, 2019)

The Hutchinson News in Kansas ran a column on the holiday mood in which the writer said: "Years ago, when I served as a pastoral counselor, in a private therapy practice with two clinical psychologists, our most demanding time was the holiday season. Indeed, suicides spiked during that time of year." (December 17, 2019)

Debunking the myth

Not all stories make the false connection. For instance:

In a story about mental health concerns at holiday time, a reporter for the Sun Chronicle (Attleboro, Mass.) included a quote that debunked the myth: "While many believe that more people kill themselves around the holidays, Annemarie Matulis, director of the Bristol County Regional Coalition for Suicide Prevention, said research shows that more people commit suicide in the warmer months. 'It is a myth that more people die by suicide this time of year,' Matulis said." (December 29, 2019)

A story in the Star-Courier (Kewanee, Ill.) about suicide prevention noted: "According to the Centers for Disease Control and Prevention, the holiday months usually have some of the lowest suicide rates, but an Annenberg Public Policy Center analysis of newspaper coverage showed 64% of stories perpetuated the myth that suicides increase during the holidays." (December 13, 2019)

COVID-19 and Seasonal Affective Disorder

Some news stories discuss the "winter blues" and seasonal affective disorder (SAD). But it is possible to reference the holiday blues without drawing connections to an increase in suicide. For example, a column in November by New York Times health writer Jane Brody successfully described the effects of the seasonal blues without suggesting that it leads to suicide.

The COVID-19 pandemic led to increased stress levels and has left people feeling both more anxious and depressed, Romer said. But in times of stress, people also find support from friends and family that can help to overcome those risks for suicide.

Romer said it's important that reporters and news organizations debunk the myth, because letting people think that suicide is more likely during the holidays can have contagious effects on individuals who are contemplating suicide. National recommendations for reporting on suicide advise journalists not to promote information that can increase contagion, such as reports of epidemics or seasonal increases, especially when the claim has no basis in fact. These recommendations, developed by journalism and suicide-prevention groups along with the Annenberg Public Policy Center, say that reporters should consult reliable sources such as the CDC on suicide rates and provide information about resources that can help people in need.

Journalists helping to dispel the holiday-suicide myth can provide resources for readers who are in crisis or who know someone who may be. Among those offering valuable information are the CDC (http://www.cdc.gov/ViolencePrevention/suicide/holiday.html), the Suicide Prevention Resource Center (http://www.sprc.org) and the Substance Abuse and Mental Health Services Administration (https://www.samhsa.gov/find-help/suicide-prevention). The U.S. National Suicide Prevention Lifeline is 800-273-TALK (8255). The Federal Communications Commission has approved a plan to create a three-digit phone number, 988, for suicide prevention, but it is not yet in operation.

Methodology

News and feature stories linking suicide with the holidays were identified through searches of both the LexisNexis and NewsBank databases. The searches used the terms "holiday" and "suicide" and (Christmas or Thanksgiving or New Year*) from November 15, 2019, through January 31, 2020.

The search originally used only the LexisNexis database, but was expanded in 2019 to include NewsBank to provide greater coverage of the U.S. press. Using this expanded database search, a reanalysis of prior years, including 2015-16, 2016-17, and 2017-18, did not substantially alter the proportion of stories that debunked or supported the myth.

Researchers determined whether the stories supported the link, debunked it, or showed a coincidental link. Coincidental stories were eliminated. Only domestic suicides were counted; overseas suicide bombings, for example, were excluded.

Credit: 
Annenberg Public Policy Center of the University of Pennsylvania

Shrinking massive neural networks used to model language

You don't need a sledgehammer to crack a nut.

Jonathan Frankle is researching artificial intelligence -- not noshing pistachios -- but the same philosophy applies to his "lottery ticket hypothesis." It posits that, hidden within massive neural networks, leaner subnetworks can complete the same task more efficiently. The trick is finding those "lucky" subnetworks, dubbed winning lottery tickets.

In a new paper, Frankle and colleagues discovered such subnetworks lurking within BERT, a state-of-the-art neural network approach to natural language processing (NLP). As a branch of artificial intelligence, NLP aims to decipher and analyze human language, with applications like predictive text generation or online chatbots. In computational terms, BERT is bulky, typically demanding supercomputing power unavailable to most users. Access to BERT's winning lottery ticket could level the playing field, potentially allowing more users to develop effective NLP tools on a smartphone -- no sledgehammer needed.

"We're hitting the point where we're going to have to make these models leaner and more efficient," says Frankle, adding that this advance could one day "reduce barriers to entry" for NLP.

Frankle, a PhD student in Michael Carbin's group at the MIT Computer Science and Artificial Intelligence Laboratory, co-authored the study, which will be presented next month at the Conference on Neural Information Processing Systems. Tianlong Chen of the University of Texas at Austin is the lead author of the paper, whose collaborators include Zhangyang Wang, also of the University of Texas at Austin, as well as Shiyu Chang, Sijia Liu, and Yang Zhang, all of the MIT-IBM Watson AI Lab.

You've probably interacted with a BERT network today. It's one of the technologies that underlies Google's search engine, and it has sparked excitement among researchers since Google released BERT in 2018. BERT is a method of creating neural networks -- algorithms that use layered nodes, or "neurons," to learn to perform a task through training on numerous examples. BERT is trained by repeatedly attempting to fill in words left out of a passage of writing, and its power lies in the gargantuan size of this initial training dataset. Users can then fine-tune BERT's neural network to a particular task, like building a customer-service chatbot. But wrangling BERT takes a ton of processing power.

"A standard BERT model these days -- the garden variety -- has 340 million parameters," says Frankle, adding that the number can reach 1 billion. Fine-tuning such a massive network can require a supercomputer. "This is just obscenely expensive. This is way beyond the computing capability of you or me."

Chen agrees. Despite BERT's burst in popularity, such models "suffer from enormous network size," he says. Luckily, "the lottery ticket hypothesis seems to be a solution."

To cut computing costs, Chen and colleagues sought to pinpoint a smaller model concealed within BERT. They experimented by iteratively pruning parameters from the full BERT network, then comparing the new subnetwork's performance to that of the original BERT model. They ran this comparison for a range of NLP tasks, from answering questions to filling the blank word in a sentence.
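The prune-and-compare loop described above is a variant of iterative magnitude pruning, the standard way of hunting for lottery tickets. The sketch below shows only the generic mask-update step on a random weight matrix; the layer size and the 20%-per-round schedule are arbitrary illustrative choices, not the paper's actual setup, and a real experiment would retrain or evaluate the subnetwork between rounds.

```python
import numpy as np

def magnitude_prune(weights: np.ndarray, mask: np.ndarray, fraction: float) -> np.ndarray:
    """Return an updated binary mask that removes `fraction` of the
    smallest-magnitude weights still active under the current mask."""
    surviving = np.abs(weights[mask == 1])
    threshold = np.quantile(surviving, fraction)
    new_mask = mask.copy()
    new_mask[np.abs(weights) <= threshold] = 0
    return new_mask

rng = np.random.default_rng(0)
weights = rng.normal(size=(100, 100))  # stand-in for one layer's weight matrix
mask = np.ones_like(weights)

# Five rounds of removing 20% of the remaining weights leaves roughly
# 0.8^5, or about a third, of the network: a candidate subnetwork.
for _ in range(5):
    mask = magnitude_prune(weights, mask, 0.2)

sparsity = 1 - mask.mean()
print(f"sparsity: {sparsity:.2f}")  # ~0.67
```

In the lottery ticket setting, the surviving weights would then be rewound to their original initialization and retrained to see whether the subnetwork matches the full model's accuracy.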

The researchers found successful subnetworks that were 40 to 90 percent slimmer than the initial BERT model, depending on the task. Plus, they were able to identify those winning lottery tickets before running any task-specific fine-tuning -- a finding that could further minimize computing costs for NLP. In some cases, a subnetwork picked for one task could be repurposed for another, though Frankle notes this transferability wasn't universal. Still, Frankle is more than happy with the group's results.

"I was kind of shocked this even worked," he says. "It's not something that I took for granted. I was expecting a much messier result than we got."

This discovery of a winning ticket in a BERT model is "convincing," according to Ari Morcos, a scientist at Facebook AI Research. "These models are becoming increasingly widespread," says Morcos. "So it's important to understand whether the lottery ticket hypothesis holds." He adds that the finding could allow BERT-like models to run using far less computing power, "which could be very impactful given that these extremely large models are currently very costly to run."

Frankle agrees. He hopes this work can make BERT more accessible, because it bucks the trend of ever-growing NLP models. "I don't know how much bigger we can go using these supercomputer-style computations," he says. "We're going to have to reduce the barrier to entry." Identifying a lean, lottery-winning subnetwork does just that -- allowing developers who lack the computing muscle of Google or Facebook to still perform cutting-edge NLP. "The hope is that this will lower the cost, that this will make it more accessible to everyone ... to the little guys who just have a laptop," says Frankle. "To me that's really exciting."

Credit: 
Massachusetts Institute of Technology

Alcohol-free hand sanitizer just as effective against COVID as alcohol-based versions

image: An illustration of hand sanitizer in use

Image: 
BYU Photo

A new study from researchers at Brigham Young University finds that alcohol-free hand sanitizer is just as effective at disinfecting surfaces from the COVID-19 virus as alcohol-based products.

The BYU scientists who conducted the study suspected that the CDC's preference for alcohol sanitizer stemmed from as-yet limited research on what really works to disinfect SARS-CoV-2. To explore other options, they treated samples of the novel coronavirus with benzalkonium chloride, which is commonly used in alcohol-free hand sanitizers, and several other quaternary ammonium compounds regularly found in disinfectants. In most of the test cases, the compounds wiped out at least 99.9% of the virus within 15 seconds.
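For context, the "at least 99.9% within 15 seconds" benchmark corresponds to a 3-log (thousand-fold) reduction in infectious virus, which is how disinfectant efficacy is conventionally quantified. The titers in this quick sketch are hypothetical, chosen only to show the arithmetic, not taken from the study's data:

```python
import math

def log_reduction(initial_titer: float, final_titer: float) -> float:
    """Log10 drop in infectious virus titer after treatment."""
    return math.log10(initial_titer / final_titer)

# Hypothetical titers (e.g., in TCID50/mL) before and after a 15-second exposure:
reduction = log_reduction(1e6, 1e3)
inactivated = 1 - 10 ** -reduction
print(f"{reduction:.1f}-log reduction -> {inactivated:.1%} inactivated")
```

A 4-log reduction would likewise correspond to 99.99% inactivation, each extra log cutting the surviving virus by another factor of ten.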

"Our results indicate that alcohol-free hand sanitizer works just as well, so we could, maybe even should, be using it to control COVID," said lead study author Benjamin Ogilvie.

Alcohol-free hand sanitizers, which are also effective against common cold and flu viruses, have a number of advantages over their alcohol-based counterparts, Ogilvie explained.

"Benzalkonium chloride can be used in much lower concentrations and does not cause the familiar 'burn' feeling you might know from using alcohol hand sanitizer. It can make life easier for people who have to sanitize hands a lot, like healthcare workers, and maybe even increase compliance with sanitizing guidelines," he said.

In the face of shortages, "having more options to disinfect hospitals and public places is critical," added Ph.D. student Antonio Solis Leal, who conducted the study's experiments.

Switching to alcohol-free hand sanitizer is logistically simple as well.

"People were already using it before 2020," said BYU professor and coauthor Brad Berges. "It just seems like during this pandemic, the non-alcohol-based hand sanitizers have been thrown by the wayside because the government was saying, 'we don't know that these work,' due to the novelty of the virus and the unique lab conditions required to run tests on it."

Since benzalkonium chloride typically works well against viruses surrounded by lipids--like COVID--the researchers believed that it would be a good fit for disinfecting the coronavirus.

To test their hypothesis, they put COVID samples in test tubes and then mixed in different compounds, including a 0.2% benzalkonium chloride solution and three commercially available disinfectants containing quaternary ammonium compounds, as well as soil loads and hard water.

Working fast to simulate real-world conditions--because hand sanitizer has to disinfect quickly to be effective--they neutralized the disinfecting compounds, extracted the virus from the tubes, and placed the virus particles on living cells. The virus failed to invade and kill the cells, indicating that it had been deactivated by the compounds.

"A couple of others have looked at using these compounds against COVID," said Berges, "but we're the first to actually look at it in a practical timeframe, using four different options, with the realistic circumstance of having dirt on your hands before you use it."

The team believes their findings "may actually provide a change in government directions about hand sanitizer," Berges said.

Ogilvie hopes that reintroducing alcohol-free sanitizers into the market can relieve the shortages--and reduce the chances of people encountering some potentially "sketchy" alcohol sanitizers that have cropped up in response to the demand.

"Hand sanitizer can play an especially important role in controlling COVID," he concluded. "This is information that could affect millions of people."

Credit: 
Brigham Young University

Geoscientists use zircon to trace origin of Earth's continents

image: Jesse Reimink, assistant professor of geosciences at Penn State, is among a team of researchers to use the mineral zircon to help understand how the Earth's continents formed billions of years ago.

Image: 
Penn State

Geoscientists have long known that some parts of the continents formed in the Earth's deep past, but the speed at which land rose above global seas -- and the exact shapes that land masses formed -- have so far eluded experts.

But now, through analyzing roughly 600,000 mineral analyses from a database of about 7,700 different rock samples, a team led by Jesse Reimink, assistant professor of geosciences at Penn State, thinks they're getting closer to the answers.

The researchers say that Earth's land masses began to slowly rise above sea level about 3 billion years ago. When their interpretation is combined with previous work, including work from other Penn State researchers, it suggests that continents took roughly 500 million years to rise to their modern heights, according to findings recently published in Earth and Planetary Science Letters.

To reach this conclusion, the scientists applied a unique statistical analysis to crystallization ages of the mineral zircon, which is reliably dateable and is frequently found in sedimentary rocks. While these researchers did not date the samples themselves, the samples had all been dated using the uranium-lead decay system. This method measures the amount of lead in a sample and, from the well-established rate of uranium decay, calculates the age of the crystal. When zircon forms, no lead is incorporated into its structure, so any lead present is from uranium decay.
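Because a newly formed zircon contains uranium but no lead, every lead-206 atom measured today came from uranium-238 decay, and the decay law gives the crystallization age directly: t = ln(1 + Pb/U) / lambda. A minimal sketch using the accepted uranium-238 decay constant (the Pb/U ratio in the example is illustrative):

```python
import math

# Decay constant of uranium-238, per year (half-life ~4.468 billion years)
LAMBDA_U238 = math.log(2) / 4.468e9

def u_pb_age(pb206_u238_ratio: float) -> float:
    """Crystallization age in years from the measured 206Pb/238U atomic
    ratio, assuming all lead is radiogenic (true for zircon, which
    excludes lead when it crystallizes)."""
    return math.log(1.0 + pb206_u238_ratio) / LAMBDA_U238

# A measured ratio of 0.5 corresponds to a crystal roughly 2.6 billion years old.
print(f"{u_pb_age(0.5) / 1e9:.2f} Gyr")  # 2.61 Gyr
```

In practice geochronologists cross-check this age against the parallel uranium-235 to lead-207 decay chain, which is what makes zircon dates so reliable.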

The minerals found in the sedimentary rock samples originally formed in older magmas but, through erosion and transport, traveled in rivers and were eventually deposited in the ocean where they were turned into sedimentary rock beneath the surface of the sea floor. The ages of zircons retrieved from individual rock samples can be used to tell the type of continent they were eroded from.

The ages of zircons from Eastern North American rocks are, for instance, different from those of land masses such as Japan, which was formed by much more recent volcanic activity.

"If you look at the Mississippi River, it's eroding rocks and zircons from all over North America. It's gathering mineral grains that have a massive age range, from as young as a million years to as old as a few billion years," Reimink said. "Our analysis suggests that as soon as sediments started to be formed on Earth, they were formed in sedimentary basins with a similarly large age range."

Sediments are formed from weathering of older rocks, and carry the signature of past landmass in time capsules such as zircons. The research doesn't uncover the overall size of primordial continents, but it does speculate that modern-scale watersheds were formed as early as 2.7 billion years ago.

"Our research matches nicely with the preserved rock record," Reimink said.

This finding is critical for a few reasons. First, knowing when and how the continents formed advances research on the carbon cycle in the land, water and atmosphere. Second, it gives us clues as to the early origins of Earth. That could prove useful as we discover more about life and the formation of other planets. Earth is a life-sustaining planet, in part, because of how continental crust influences our atmospheric and oceanic composition. Knowing how and when these processes occurred could hold clues to the creation of life.

"Whenever we're able to determine processes that led to our existence, it relates to the really profound questions such as: Are we unique? Is Earth unique in the universe? And are there other Earths out there," Reimink said. "These findings help lead us down the path to the answers we need about Earth that allow us to compare our planet to others."

Credit: 
Penn State