Tech

New solar panel design could lead to wider use of renewable energy

Designing solar panels with a checkerboard surface pattern increases their ability to absorb light by 125 per cent, a new study says.

Researchers say the breakthrough could lead to the production of thinner, lighter and more flexible solar panels that could power more homes and be used in a wider range of products.

The study - led by researchers from the University of York and conducted in partnership with NOVA University of Lisbon (CENIMAT-i3N) - investigated how different surface designs affected the absorption of sunlight in solar cells, which, when combined, form solar panels.

Scientists found that the checkerboard design improved diffraction, which enhanced the probability that light is absorbed and then converted into electricity.
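
For the optical intuition behind that claim, the textbook grating relation below sketches why a periodic surface pattern boosts absorption. This is a generic illustration, not the study's own analysis; the symbols (period Λ, wavelength λ, refractive index n) are standard textbook quantities.

```latex
% Textbook grating relation (an illustrative sketch, not the study's own analysis).
% A surface pattern with period \Lambda diffracts normally incident light of
% wavelength \lambda into orders m inside a cell of refractive index n:
\[
  n \sin\theta_m = \frac{m\lambda}{\Lambda}, \qquad m = \pm 1, \pm 2, \dots
\]
% Orders steered beyond the critical angle \arcsin(1/n) are trapped by total
% internal reflection, lengthening the light path and raising absorption.
```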

The renewable energy sector is constantly looking for new ways to boost the light absorption of solar cells in lightweight materials that can be used in products from roof tiles to boat sails and camping equipment.

Solar-grade silicon - used to create solar cells - is very energy intensive to produce, so creating slimmer cells and changing the surface design would make them cheaper and more environmentally friendly.

Dr Christian Schuster from the Department of Physics said: "We found a simple trick for boosting the absorption of slim solar cells. Our investigations show that our idea actually rivals the absorption enhancement of more sophisticated designs - while also absorbing more light deep in the plane and less light near the surface structure itself.

"Our design rule meets all relevant aspects of light-trapping for solar cells, clearing the way for simple, practical, and yet outstanding diffractive structures, with a potential impact beyond photonic applications.

"This design offers potential to further integrate solar cells into thinner, flexible materials and therefore create more opportunity to use solar power in more products."

The study suggests the design principle could have an impact not only in the solar cell and LED sectors but also in applications such as acoustic noise shields, wind-break panels, anti-skid surfaces, biosensing and atomic cooling.
 

Dr Schuster added: "In principle, we would deploy ten times more solar power with the same amount of absorber material: ten times thinner solar cells could enable a rapid expansion of photovoltaics, increase solar electricity production, and greatly reduce our carbon footprint.

"In fact, as refining the silicon raw material is such an energy-intensive process, ten times
thinner silicon cells would not only reduce the need for refineries but also cost less, hence
empowering our transition to a greener economy." 

Data from the Department for Business, Energy & Industrial Strategy shows renewable energy - including solar power - made up 47% of the UK's electricity generation in the first three months of 2020. 

Credit: 
University of York

'Danger molecule' associated with being obese, female and black in younger adults

image: Circulating levels of the "danger molecule" HMGB1 are higher in younger black adults than whites and in females than males, and increase with weight and age.

Image: 
Kim Ratliff, Production Coordinator, Augusta University

A "danger molecule" is higher in the blood of younger black adults than whites, females than males and increases with weight and age, researchers report in the first large, longitudinal study associating circulating HMGB1 levels with obesity, inflammation promoters and early indicators of cardiovascular risk in humans.

Higher HMGB1 levels were consistently associated with higher blood levels of the established inflammation molecule C-reactive protein and with stiffer arteries, an often-early indicator of vascular damage, across all groups, Medical College of Georgia researchers report in the journal Arteriosclerosis, Thrombosis, and Vascular Biology.

The findings point to the potential of circulating HMGB1 levels as a sound biomarker of cardiovascular risk as well as the use of HMGB1 antibodies or inhibitors to prevent or treat chronic inflammation, obesity and cardiovascular disease, corresponding author Dr. Yanbin Dong, geneticist and cardiologist and his colleagues write.

"We think it's an initiator for the inflammation cascade, which should make its blood levels really good information for patients and physicians," says Dong, a faculty member in the MCG Department of Medicine and its Georgia Prevention Institute.

HMGB1, or High Mobility Group Box-1, is one of the body's so-called "danger molecules."

These damage-associated molecular patterns, or DAMPs, should reside inside the nucleus or cytoplasm of your cells, where they may have a positive role. Inside cells, HMGB1, for example, can help manage the architecture of our chromosomes.

But like other DAMPs, when HMGB1 is released by injured or otherwise stressed cells -- including as a result of significant mental stress -- the body treats it much as it would a virus, activating an immune response that can escalate and produce an unhealthy, chronic state of inflammation. In fact, this excessive immune response is what happens with the lung damage and resulting cytokine storm in COVID-19, says Dr. Li Chen, postdoctoral fellow with Dong and the study's first author.

"HMGB1 is like a biomarker of the danger and stress you face," Chen says.

In the largest human study of HMGB1, they looked at 489 individuals with an average age of about 25 when they had the first of at least four blood samples drawn over the course of 8.5 years at the Georgia Prevention Institute. The participants are part of the longitudinal Georgia Stress and Heart Study, which seeks to learn more about the development of cardiovascular risks, and had enrolled as healthy 5- to 16-year-olds.

The researchers found obesity most closely associated with high levels of HMGB1 over time, Dong says. Animal studies, including their own, have shown that just 12 weeks of a high-fat diet increases blood levels of HMGB1. Levels of the danger molecule have been shown to be high in individuals with a high waist-hip ratio, meaning the waist and hip are closer to the same size and the overall shape is characterized as apple-shaped; pear-shaped, with less fat around the middle, is generally considered a healthier distribution of fat. A high waist-hip ratio is also already associated with higher blood pressure and higher levels of the small cytokine interleukin 6, which is secreted by immune cells in response to injury or infection, and researchers in Turkey have found higher HMGB1 levels in heart patients compared to healthy individuals.

It's known that fat cells themselves are a sort of double DAMP hazard because they secrete the danger molecules both while they are alive and as they die, and individuals with obesity tend to have more fat cells. HMGB1 released from fat cells under either circumstance activates nearby immune cells, which also secrete HMGB1, which recruits even more immune cells, creating a vicious cycle of chronic inflammation that plays an early and key role in hypertension and cardiovascular disease, Dong says.

Blood levels of HMGB1 started out higher in blacks and females in the study and stayed higher throughout the course of the 8.5 years, the researchers say. Blacks tend to have more severe cardiovascular and cerebrovascular problems starting at a younger age than whites, and an early, heightened inflammatory response is likely a factor, Dong says, so their findings were consistent with those trends. But the female findings were surprising.

"Females tend to be healthier than males by a lot of markers," Chen says. "But this one is in the opposite direction." At least before menopause, females tend to have fewer cardiovascular problems than males. The researchers suspect that generally higher percentages of body fat in females might help explain the higher HMGB1 levels, but that younger females may have unique cardiovascular protections, like estrogen.

Increases in HMGB1 across the board were accompanied by increases in proinflammatory factors like C-reactive protein, interleukin 6 and tumor necrosis factor, as well as stiffer arteries and generally higher blood pressure. There can be bad synergy between those factors; for example, interleukin 6 prompts the liver to make C-reactive protein, an established inflammatory factor that some physicians already look at in the blood along with other known risks like high cholesterol and lipid levels. The researchers found higher C-reactive protein consistently associated with increasing age, female sex and obesity. There were also strong associations between high levels of C-reactive protein and arterial stiffness.

The increase that came with aging also was not a surprise, they say. "As you age, you have more cell death, generally speaking," Dong says.

"This is the first study to demonstrate the age, sex and race differences in circulating HMGB1," they write. "The increasing circulating concentrations of HMGB1 with age suggest a potential role of HMGB1 in the pathogenesis of chronic, low-grade inflammation, obesity and subclinical (early) (cardiovascular disease) risk."

HMGB1 is one of the most studied DAMPs, and it's known to have a lot of roles in functions like inflammation, cell differentiation and tumor cell migration. But its role in chronic, low-grade inflammation, which plays a major role in many disease states from heart disease to stroke to cancer, has not been well studied in humans.

Dong and Chen's future work also likely includes pursuing HMGB1 as a potential biomarker for blood vessel disease like heart attack and stroke, as well as a potential prognostic indicator for how these diseases will progress and how patients are responding to treatment.

Credit: 
Medical College of Georgia at Augusta University

Genomic study reveals evolutionary secrets of banyan tree

image: The banyan tree Ficus microcarpa produces aerial roots that give it its distinctive look. A new study reveals the genomic changes that allow the tree to produce roots that spring from its branches.

Image: 
Photo by Gang Wang

CHAMPAIGN, Ill. -- The banyan fig tree Ficus microcarpa is famous for its aerial roots, which sprout from branches and eventually reach the soil. The tree also has a unique relationship with a wasp that has coevolved with it and is the only insect that can pollinate it.

In a new study, researchers identify regions in the banyan fig's genome that promote the development of its unusual aerial roots and enhance its ability to signal its wasp pollinator.

The study, published in the journal Cell, also identifies a sex-determining region in a related fig tree, Ficus hispida. Unlike F. microcarpa, which produces aerial roots and bears male and female flowers on the same tree, F. hispida produces distinct male and female trees and no aerial roots.

Understanding the evolutionary history of Ficus species and their wasp pollinators is important because their ability to produce large fruits in a variety of habitats makes them a keystone species in most tropical forests, said Ray Ming, a plant biology professor at the University of Illinois, Urbana-Champaign, who led the study with Jin Chen of the Chinese Academy of Sciences. Figs are known to sustain at least 1,200 bird and mammal species. Fig trees were among the earliest domesticated crops and appear as sacred symbols in Hinduism, Buddhism and other spiritual traditions.

The relationship between figs and wasps also presents an intriguing scientific challenge. The body shapes and sizes of the wasps correspond exactly to those of the fig fruits, and each species of fig produces a unique perfume to attract its specific wasp pollinator.

To better understand these evolutionary developments, Ming and his colleagues analyzed the genomes of the two fig species, along with that of a wasp that pollinates the banyan tree.

"When we sequenced the trees' genomes, we found more segmental duplications in the genome of the banyan tree than in F. hispida, the fig without the aerial roots," Ming said. "Those duplicated regions account for about 27% of the genome."

The duplications increased the number of genes involved in the synthesis and transport of auxins, a class of hormones that promote plant growth. The duplicated regions also contained genes involved in plant immunity, nutrition and the production of volatile organic compounds that signal pollinators.

"The levels of auxin in the aerial roots are five times higher than in the leaves of trees with or without aerial roots," Ming said. The elevated auxin levels appear to have triggered aerial root production. The duplicated regions also include genes that code for a light receptor that accelerates auxin production.

When they studied the genome of the fig wasp and compared it with those of other related wasps, the researchers observed that the wasps had retained and preserved genes for odorant receptors that detect the same scent compounds the fig trees produce. These genomic signatures are a signal of coevolution between the fig trees and the wasps, the researchers report.

Ming and his colleagues also discovered a Y chromosome-specific gene that is expressed only in male plants of F. hispida and three other fig species that produce separate male and female plants, a condition known as dioecy.

"This gene had been duplicated twice in the dioecious genomes, giving the plants three copies of the gene. But Ficus species that have male and female flowers together on one plant have only one copy of this gene," Ming said. "This strongly suggests that this gene is a dominant factor affecting sex determination."

Credit: 
University of Illinois at Urbana-Champaign, News Bureau

Novel digital dashboard improves cancer case review efficiency

image: An MU Health Care tumor board meets to discuss cases. Photo taken prior to the COVID-19 pandemic.

Image: 
Justin Kelley

Multidisciplinary tumor boards are vital to cancer treatment plans, bringing together clinicians from different specialties to guide patient treatment and improve outcomes. However, compiling the relevant data for each case is time-consuming and requires contributions from multiple team members. To optimize the process, researchers at the MU School of Medicine partnered with Roche Diagnostics to evaluate a cloud-based product called NAVIFY® Tumor Board that integrates all relevant clinical data for a tumor board into a single digital dashboard accessible to everyone. During a 16-month clinical study of the dashboard, researchers found NAVIFY Tumor Board significantly reduced the amount of time doctors and nurses across multiple specialties spent preparing for 227 tumor board meetings involving 1,866 patient cases.

"In addition to saving time, the NAVIFY digital tumor board solution resulted in less variability in preparation time," said Richard Hammer, MD, professor of pathology at the MU School of Medicine and vice chair of clinical affairs in the Dept. of Pathology and Anatomical Sciences. "The improvements were sustained and became more significant over time, decreasing administrative burdens of meeting preparation."

Hammer evaluated case preparation time during four phases: before NAVIFY Tumor Board implementation, after manual implementation, after partial electronic medical record (EMR) integration and after a stable EMR integration phase. The study found a 30% preparation time reduction across three cancer categories with full integration compared to pre-implementation. The biggest time savings involved the breast tumor board, where nurse navigators reduced their preparation time by 69%.

"Institutions with dedicated nurses preparing for cases will likely benefit the most," Hammer said. "This dashboard enables easy access to clinical data, which may support optimal decision-making. In addition, it reduces costs for both patients and hospitals, which is currently under analysis."

Hammer's team is also in the process of submitting data on the impact of the NAVIFY clinical decision support software on case discussion time during tumor board meetings. Future studies will investigate its impact on the quality of case discussions.

"As the first reference site for NAVIFY Tumor Board in the U.S., we are already hosting other institutional leaders to help them implement this software," Hammer said. "This is the wave of the future, where we are using digital clinical decision support software to enhance how we care for patients, while improving efficiency, standardizing the preparation of cases and making them available to clinicians at any time."

In addition to Hammer, the study's co-authors included MU School of Medicine colleague Lincoln Sheets, MD, assistant research professor. Hammer received research funding and an honorarium from Roche and serves as an advisor for the company.

Credit: 
University of Missouri-Columbia

Graphene detector reveals THz light's polarization

image: Artist's rendering of a phase-sensitive terahertz interferometer.

Image: 
Daria Sokol/MIPT Press Office

Physicists have created a broadband detector of terahertz radiation based on graphene. The device has potential applications in communications and next-generation information transmission systems, as well as security and medical equipment. The study came out in the ACS journal Nano Letters.

The new detector relies on the interference of plasma waves. Interference as such underlies many technological applications and everyday phenomena. It determines the sound of musical instruments and causes the rainbow colors in soap bubbles, along with many other effects. The interference of electromagnetic waves is harnessed by various spectral devices used to determine the chemical composition, physical and other properties of objects -- including very remote ones, such as stars and galaxies.

Plasma waves in metals and semiconductors have recently attracted much attention from researchers and engineers. Like the more familiar acoustic waves, the ones that occur in plasmas are essentially density waves, too, but they involve charge carriers: electrons and holes. Their local density variation gives rise to an electric field, which nudges other charge carriers as it propagates through the material. This is similar to how the pressure gradient of a sound wave impels the gas or liquid particles in an ever-expanding region. However, plasma waves die down rapidly in conventional conductors.

That said, two-dimensional conductors enable plasma waves to propagate across relatively large distances without attenuation. It therefore becomes possible to observe their interference, yielding much information about the electronic properties of the material in question. The plasmonics of 2D materials has emerged as a highly dynamic field of condensed matter physics.

Over the past 10 years, scientists have come a long way in detecting THz radiation with graphene-based devices. Researchers have explored the mechanisms of T-wave interaction with graphene and created prototype detectors whose characteristics are on par with those of similar devices based on other materials.

However, studies have so far not looked in detail at how detectors interact with T-rays of a particular polarization, even though devices sensitive to the waves' polarization would be of use in many applications. The study reported here experimentally demonstrated how the detector's response depends on the polarization of incident radiation, and its authors explained why this is the case.

Study co-author Yakov Matyushkin from the MIPT Laboratory of Nanocarbon Materials commented: "The detector consists of a silicon wafer 4 by 4 millimeters across, and a tiny piece of graphene 2 by 5 thousandths of a millimeter in size. The graphene is connected to two flat contact pads made of gold, whose bow-tie shape makes the detector sensitive to the polarization and phase of incident radiation. Besides that, the graphene layer also meets another gold contact at the top, with a nonconductive layer of aluminum oxide interlaid between them."

In microelectronics, this structure is known as a field-effect transistor (fig. 1), with the two side contacts usually referred to as the source and drain. The top contact is called a gate.

Terahertz radiation is a narrow band of the electromagnetic spectrum between microwaves and far-infrared light. From the applications standpoint, an important feature of T-waves is that they pass through living tissue and undergo partial absorption but cause no ionization and therefore do not harm the body. This sets THz radiation apart from X-rays, for example.

Accordingly, the applications traditionally considered for T-rays are medical diagnostics and security screening. THz detectors are also used in astronomy. Another emerging application is data transmission at THz frequencies. This means the new detector could be useful in establishing the 5G and 6G next-generation communication standards.

"Terahertz radiation is directed at an experimental sample, orthogonally to its surface. This generates photovoltage in the sample, which can be picked up by external measurement devices via the detector's gold contacts," commented study co-author Georgy Fedorov, deputy head of the MIPT Laboratory of Nanocarbon Materials. "What's crucial here is what the nature of the detected signal is. It can actually be different, and it varies depending on a host of external and internal parameters: sample geometry, frequency, radiation polarization and power, temperature, etc."

Notably, the new detector relies on the kind of graphene already produced industrially. Graphene comes in two types: it can either be mechanically exfoliated or synthesized by chemical vapor deposition (CVD). The former type has higher quality, fewer defects and impurities, and holds the record for charge carrier mobility, a crucial property for semiconductors. However, it is CVD graphene that the industry can already manufacture at scale, making it the material of choice for devices intended for mass production.

Another co-author of the study, Maxim Rybin from MIPT and the Prokhorov General Physics Institute of the Russian Academy of Sciences, is the CEO of graphene manufacturer Rusgraphene. He had this to say about the technology: "The fact that we observed plasma wave interference in CVD graphene means such graphene-based THz detectors are fit for industrial production. As far as we know, this is the first observation of plasma wave interference in CVD graphene, so our research has expanded the material's potential industrial applications."

The team showed that the nature of the new detector's photoresponse has to do with plasma wave interference in the transistor channel. Wave propagation begins at the two opposite ends of the channel (fig. 2), and the special geometry of the antenna makes the device sensitive to the polarization and phase of the detected radiation. These features mean the detector could prove useful in building communication and information transmission systems that operate at THz and sub-THz frequencies.
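
As a rough cartoon of that interference picture, the sketch below sums two damped plasma waves launched from opposite ends of a channel and shows how the combined signal depends on their relative phase. Every parameter here is invented for illustration; the real photoresponse also involves the antenna geometry and the transistor's nonlinearity.

```python
# Toy model: two damped counter-propagating waves interfering in a channel.
# All parameters are illustrative, not the device's actual values.
import numpy as np

L = 1.0                          # channel length (arbitrary units)
x = np.linspace(0, L, 500)       # positions along the channel
k = 2 * np.pi * 3 / L            # plasma wavevector
decay = 2.0 / L                  # damping rate of the plasma waves

def response(phase):
    left = np.exp(-decay * x) * np.cos(k * x)                    # wave from source
    right = np.exp(-decay * (L - x)) * np.cos(k * (L - x) + phase)  # wave from drain
    return np.mean((left + right) ** 2)  # crude stand-in for photoresponse

for phi in (0.0, np.pi / 2, np.pi):
    print(f"relative phase {phi:.2f} rad -> response {response(phi):.3f}")
```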

The study reported in this story was co-authored by researchers from the MIPT Laboratory of Nanocarbon Materials and their colleagues from Moscow State Pedagogical University, Ioffe Institute of the Russian Academy of Sciences, and the University of Regensburg, Germany. This research was supported by the Russian Foundation for Basic Research and the Russian Ministry of Science and Higher Education.

Credit: 
Moscow Institute of Physics and Technology

Scientists reconstruct beetles from the Cretaceous

image: Micro-CT reconstruction of Mysteriomorphus pelevini

Image: 
D. Peris & R. Kundrata et al. / Scientific Reports

About a year ago, researchers found fossil specimens of beetles in an amber deposit in Myanmar and described from them a new beetle family that lived about 99 million years ago. However, the scientists had not been able to fully describe the morphology of the insects in the amber sample, which is why the beetles were given the mysterious name Mysteriomorphidae. An international research team led by the University of Bonn (Germany) and Palacky University (Czech Republic) has now examined four newly found specimens of the Mysteriomorphidae using computed tomography and has been able to reconstruct them. The results allow conclusions to be drawn about the evolution of the species during the Cretaceous period. The study has been published in the journal Scientific Reports.

Small creatures enclosed in amber can provide scientists with important information about past times, some of which date back many millions of years. In January 2019, the Spanish paleontologist Dr. David Peris, one of the two main authors of the study, collected several amber samples from the northern state of Kachin in Myanmar during a scientific trip to China and found beetle specimens from the same group as the Mysteriomorphidae.

Some of the newly found specimens showed a very good state of preservation - a good prerequisite for David Peris and his colleagues to carry out a virtual reconstruction of one of the beetles using computed tomography (CT scanning). This technique, commonly used in paleontology, allows researchers to study many small features of the fossils - even internal structures such as genitalia, if preserved.

While David Peris and his colleagues started to study and describe the morphology, i.e. the outer shape of the beetles, another research group also described the new family of Mysteriomorphidae using further specimens that also came from the amber deposit in Myanmar. "However, the first study left some open questions about the classification of these fossils which had to be answered. We used the opportunity to pursue these questions with new technologies," explains David Peris, now a researcher at the Institute for Geosciences and Meteorology at the University of Bonn.

"We used the morphology to better define the placement of the beetles and discovered that they were very closely related to Elateridae, a current family," explains Dr. Robin Kundrata from Palacky University, the second main author of the study and also an expert on this group of beetles. The scientists discovered important diagnostic characters that these beetle lineages share on mouthparts, thorax and abdomen.

Analysis of the evolution of beetles

Apart from the morphology, the researchers also analyzed the evolutionary history of the beetles. Earlier models had suggested that the beetles had a low extinction rate throughout their long evolutionary history, even during the Cretaceous period. However, the researchers compiled a list of fossil beetle groups described from Cretaceous amber that, like Mysteriomorphidae, are known only as fossils from that time and did not survive the end of the Cretaceous period.

Background: During the Cretaceous period, flowering plants spread all over the world, replacing the old plants in the changing environment. This spread of plants opened new possibilities for many associated animals and also drove the development of new living beings, for example pollinators of flowers. However, most previous theories had not considered that animal species that were well adapted to the old plants came under pressure to adapt to the new resources and possibly became extinct. "Our results support the hypothesis that beetles, and perhaps some other groups of insects, suffered a decrease in their diversity during the time of the plant revolution," states David Peris.

Credit: 
University of Bonn

Researchers find increases in nitrous oxide emissions, outpacing global predictions

The term "greenhouse gas" is often used interchangeably with carbon dioxide, due to its prevalence in our atmosphere - more than 80 percent of all greenhouse gas emissions, estimates the Environmental Protection Agency. But another greenhouse gas, nitrous oxide (N2O), can have effects with far greater impact.

And, according to a recent study, N2O emissions are increasing at a "devastating" rate, faster than predicted by the Intergovernmental Panel on Climate Change.

In a paper published in Nature, a large, multinational team of researchers associated with the Global Carbon Project -- including Peter Raymond, professor of ecosystem ecology at Yale School of the Environment (YSE), and postdoctoral fellow Taylor Maavara -- explains that existing inventories of N2O emissions don't provide a full picture of its prevalence. Using "bottom-up" and "top-down" approaches, the researchers have provided a global look at N2O emissions, accounting for naturally occurring sources of nitrous oxide and attributing anthropogenic sources, both of which had been omitted from previous inventories.

"Nitrous oxide is often seen as the third most important greenhouse gas" behind carbon dioxide and methane, says Maavara. "Not as much attention is paid to nitrous oxide, but it's extremely important." In addition to being an ozone depleting chemical, nitrous oxide, she explains, can take more than a century to completely break down in the atmosphere and has a climate warming potential nearly 300 times higher than carbon dioxide.

Developing a more comprehensive inventory, the researchers found growing N2O emissions in emerging economies -- particularly Brazil, China and India -- due in large part to agricultural activity, the cause of nearly 70 percent of global human-derived N2O emissions over the past decade. As populations grow and more food is needed, the researchers predict that N2O emissions will continue to grow if not mitigated.

"It's going to be difficult because we need food," says Maavara, who suggests more sustainable practices, such as best management practices for farming that focus on more precise timing and applications of fertilizer.

But, even then, Maavara says positive results could take decades.

"Even if the world is willing, there's not a quick fix. It's going to take a long time to change what's been done to the soil. We want to emphasize that this is a problem now so we can begin developing the incremental solutions."

Credit: 
Yale School of the Environment

Coordinated efforts on Twitter to interfere in US elections are foreign-based

A coordinated effort on Twitter to influence the upcoming U.S. presidential election -- using trolls (fake personas that spread hyper-partisan themes) and super-connectors (highly-networked accounts) -- aims to sow distrust, exacerbate political divisions and undermine confidence in American democracy, according to a new RAND Corporation report.

While researchers say they cannot definitively attribute this year's election interference to a specific actor, the tactics they observed on Twitter mirror Russia's longstanding strategy of playing off existing partisan tensions to create a sense of disunity among U.S. voters, and they also further Russia's interests.

"Social media has made it cheaper and easier for foreign actors to mount increasingly sophisticated attacks on our democracy and our political discourse," said William Marcellino, the study's lead author and a social and behavioral scientist at RAND, a nonprofit, nonpartisan research group. "Many Americans are immersed in online conversations that have been shaped artificially, and that are giving them a false and distorted picture of the world."

The RAND report is the second of a four-part series intended to help policymakers and the public understand - and mitigate - the threat of online foreign interference in national, state and local elections. The first report concluded that the main goal of foreign interference is to paralyze the American political process by driving people to extreme positions that make it ever more difficult to reach consensus.

The latest study used software tools developed by RAND to analyze a very large dataset of 2.2 million tweets from 630,391 unique Twitter accounts collected between Jan. 1 and May 6, 2020. The analysis found that troll and super-connector accounts overwhelmingly cluster in certain Twitter communities engaged in political conversations around the election.
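
RAND's detection tools are not public, but the passage describes a recognizable recipe: build an interaction graph, cluster it into communities, and flag unusually well-connected accounts. A generic sketch of that recipe, with made-up data and a crude degree-based "super-connector" heuristic, might look like this using the networkx library:

```python
# Generic sketch of community detection plus hub-flagging on a retweet graph.
# This is an illustration of the general approach, not RAND's actual pipeline.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

def analyze(retweets):
    """retweets: iterable of (retweeter, original_author) account pairs."""
    g = nx.Graph()
    g.add_edges_from(retweets)

    # Cluster accounts into conversation communities by modularity.
    communities = list(greedy_modularity_communities(g))

    # Crude "super-connector" heuristic: top 1% of accounts by degree.
    by_degree = sorted(g.degree, key=lambda kv: kv[1], reverse=True)
    cutoff = max(1, len(by_degree) // 100)
    super_connectors = {node for node, _ in by_degree[:cutoff]}
    return communities, super_connectors

# Toy data: five accounts, four retweet interactions.
communities, hubs = analyze([("a", "b"), ("a", "c"), ("d", "c"), ("e", "b")])
print(len(communities), "communities; super-connectors:", hubs)
```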

The pro-Donald Trump community had the highest percentage of both types of accounts; trolls in this community were strongly supportive of the president, as well as of QAnon content and other content that favored the Trump candidacy.

In the pro-Vice President Biden community, which also had among the highest concentrations of troll and super-connector accounts, trolls were anti-Biden, either criticizing Biden or praising Bernie Sanders.

This orchestrated activity may have worked in favor of President Trump, and against the candidacy of Vice President Biden, according to the report. Targeting both sides of the political spectrum also is a strategy that is consistent with prior Russian efforts to meddle in U.S. elections.

The researchers encourage social media platforms to adapt and embrace emerging methods of detecting election interference efforts, including the combination of network analyses and machine learning used in this study.

"New technologies may have made it easier for foreign actors to carry out malign influence efforts, but technological innovation can also help us combat them," Marcellino said. "We've detected interference in prior elections, but we've been closing the barn door too late -- after an election. Our study shows that it is possible to detect, and respond to, these efforts before an election."

Researchers also recommend publicizing the threat of online election interference broadly, in print and on the radio and TV, to make Americans aware of ongoing, most likely foreign efforts to manipulate them and undermine their confidence in democracy.

Publicizing details about the target audiences (e.g., supporters of President Trump or supporters of former Vice President Biden), as well as specific tactics (e.g., sharing attack memes), could further help protect Americans from online manipulation, according to the report.

This research was sponsored by the California Governor's Office of Emergency Services.

The report, "Foreign Interference in the 2020 Election: Tools for Detecting Online Election Interference," is available at http://www.rand.org. Other authors of the study are Christian Johnson, Marek N. Posard and Todd Helmus.

The RAND National Security Research Division conducts research and analysis on defense and national security topics for the U.S. and allied defense, foreign policy, homeland security, and intelligence communities and foundations and other non-governmental organizations that support defense and national security analysis.

Credit: 
RAND Corporation

SwRI scientists study the rugged surface of near-Earth asteroid Bennu

image: As NASA's OSIRIS-REx spacecraft's Touch-And-Go asteroid sample collection attempt approaches, Southwest Research Institute scientists have helped determine what the spacecraft can expect to return from the near-Earth asteroid Bennu's surface. SwRI scientists also played a role in sample site selection, including the primary site Nightingale shown here.

Image: 
NASA/Goddard/University of Arizona

SAN ANTONIO -- As the days count down to NASA's OSIRIS-REx spacecraft's Touch-And-Go asteroid sample collection attempt, Southwest Research Institute scientists have helped determine what the spacecraft can expect to return from the near-Earth asteroid Bennu's surface. Three papers published online by Science on Oct. 8 discuss the color, reflectivity, age, composition, origin and distribution of materials that make up the asteroid's rough surface.

On October 20, the spacecraft will descend to the asteroid's boulder-strewn surface, touch the ground with its robotic arm for a few seconds and collect a sample of rocks and dust - marking the first time NASA has grabbed pieces of an asteroid for return to Earth. SwRI scientists played a role in the selection of the sample sites. The first attempt will be made at Nightingale, a rocky area 66 feet in diameter in Bennu's northern hemisphere. If this historic attempt is unsuccessful, the spacecraft will try again at a secondary site.

Since the spacecraft arrived at Bennu in 2018, scientists have been characterizing the asteroid's composition and comparing it to other asteroids and meteorites. The mission discovered carbon-bearing compounds on Bennu's surface, a first for a near-Earth asteroid, as well as minerals containing or formed by water. Scientists also studied the distribution of these materials, globally and at the sample sites.

"Our recent studies show that organics and minerals associated with the presence of water are scattered broadly around Bennu's surface, so any sample returned to Earth should contain these compounds and minerals," said SwRI's Dr. Vicky Hamilton, a coauthor on all three papers. "We will compare the sample's relative abundances of organics, carbonates, silicates and other minerals to those in meteorites to help determine the scenarios that best explain Bennu's surface composition."

Asteroid Bennu is a dark rubble pile held together by gravity and thought to be the collisional remnant of a much larger main-belt object. Its rubble-pile nature and heavily cratered surface indicate that it has had a rough-and-tumble life since being liberated from its much larger parent asteroid millions or even billions of years ago.

"Boulders strewn about near the Nightingale site have bright carbonate veins," Hamilton said. "Bennu shares this compositional trait with aqueously altered meteorites. This correlation suggests that at least some carbonaceous asteroids were altered by percolating water in the early Solar System."

The boulders on Bennu have diverse textures and colors, which may provide information about their variable exposure to micrometeorite bombardment and the solar wind over time. Studying color and reflectance data provides information about the geologic history of planetary surfaces.

"Bennu's diverse surface includes abundant primitive material potentially from different depths in its parent body plus a small proportion of foreign materials from another asteroid family littered about its surface," said SwRI's Dr. Kevin Walsh, a coauthor of one of the papers. "In addition, both the primary and back-up sample sites, Nightingale and Osprey, are situated within small spectrally reddish craters that are thought to be more pristine, having experienced less space weathering than most of Bennu's bluish surface."

The OSIRIS-REx team is also comparing Bennu to Ryugu, another near-Earth asteroid. Both asteroids are thought to have originated from primitive asteroid families in the inner main belt. The Japan Aerospace Exploration Agency launched Hayabusa2 in 2014 and rendezvoused with near-Earth asteroid Ryugu in 2018. After surveying the asteroid for a year and a half, the spacecraft collected samples and is expected to return to Earth December 6, 2020.

The sample returned by OSIRIS-REx, combined with the surface context maps OSIRIS-REx has collected, will improve interpretations of available ground and space telescope data for other primitive dark asteroids. Comparing returned Bennu samples with those of Ryugu will be instrumental for understanding the diversity within, and history of, asteroid families and the entire asteroid belt.

Credit: 
Southwest Research Institute

UCI, others see agriculture as major source of increase in atmospheric nitrous oxide

image: Global N2O budget 2007-2016

Image: 
Global Carbon Project

Irvine, Calif. - An international team of researchers - including Earth system scientists at the University of California, Irvine - recently completed the most thorough review yet of nitrous oxide, from emission to destruction in the planet's atmosphere.

In addition to confirming that the 20 percent increase in the amount of the greenhouse gas since the start of the Industrial Revolution can be totally attributed to humans, the team expressed doubt about the ability to reduce emissions or mitigate their future impacts.

In a study published this week in Nature, the researchers document the details of human-sourced N2O emissions and how they have intensified by 30 percent over the past four decades, the dominant share coming from synthetic nitrogen fertilizers and animal manure used in agriculture. The paper notes that emerging economies - particularly Brazil, China and India - are becoming major emitters of the gas as they increase their food production. Once released into the air, nitrous oxide remains for about 116 years.

The buildup of N2O, which is generated naturally as well as through human activities and is also known to deplete stratospheric ozone, creates a dilemma for nations working to put the brakes on runaway climate change, according to co-author Michael Prather, Distinguished Professor of Earth system science at UCI.

"Nitrous oxide emissions are increasing faster than any scenarios considered by the Intergovernmental Panel on Climate Change - so fast that if left unchecked, they along with carbon dioxide will push the rise in global mean temperature to well above 2 degrees Celsius from pre-industrial levels, the nominal goal of the Paris climate agreement," he said.

Scientists estimate that to achieve this end-of-century objective, cumulative CO2 emissions should not exceed about 700 gigatons throughout the next 80 years; to limit global warming to an even more ambitious 1.5 degrees, this sum drops to 500 gigatons.

"There is some cautious optimism that these goals are achievable if we can curtail fossil fuel emissions and take steps to actively remove CO2 from the atmosphere," Prather noted.

Nitrous oxide is a considerably more stubborn greenhouse gas, however. The study found that these emissions have always exceeded expectations, and with nitrogen use being tied to food production, there appear to be limited ways of significantly mitigating the gas's release.

Further, unlike CO2, N2O is not taken up by plants through photosynthesis, nor is it chemically reactive enough to be sequestered through geoengineering methods. Prather said that the path to controlling N2O emissions is through more efficient use of fertilizer and of the land on which livestock are raised.

According to him, if current nitrous oxide emissions continue, they will be equivalent to about 230 gigatons of cumulative CO2 emissions by the end of the century - almost half of the total CO2 emissions allowed to remain within a 1.5-degree global temperature increase.

"The most optimistic scenarios call for N2O emissions to be cut to the equivalent of 178 gigatons of CO2," Prather said. "But our paper shows that this will be difficult to achieve because N2O emissions are not tied to the carbon sector but to agriculture, which must continue to expand with population and gross domestic product growth."

He said the study is an important step in confronting the challenge presented by this uncooperative greenhouse gas.

"Compared to the well-established understanding of the contributions of carbon dioxide and methane to climate change, nitrous oxide had long been assumed to be the most 'uncertain' of greenhouse gases," Prather said. "With this paper, we can confirm that N2O sources and sinks are now well-defined and attributed, so tackling the task of reducing emissions can be accompanied by appropriate checks and validation procedures."

He said that the often-forgotten, third-most-important greenhouse gas may become a long-term threat to controlling climate change.

Credit: 
University of California - Irvine

The Marangoni Effect can be used to obtain freshwater from the sea

The Achilles' heel of water desalination technologies is the crystallization of salt particles within the various components of the device. This clogging phenomenon causes a reduction in performance over time, thus limiting the durability of these devices. Tackling this problem is important to ensure a constant production of freshwater over time. Recently, innovative nanostructured materials with anti-clogging properties have been proposed, with the potential of limiting salt accumulation. However, the high cost of these materials makes large-scale production of commercial prototypes difficult.

Starting from this problem, a team of engineers from the Energy Department of the Politecnico di Torino (SMaLL), in collaboration with the Massachusetts Institute of Technology (MIT), has thoroughly studied the mechanisms underlying the transport of salt particles in desalination devices. The study started after noting an inconsistency between experimental observations and classical theoretical models of salt transport. In particular, the engineers of the Politecnico di Torino, after more than two years of numerical and laboratory research funded by the Compagnia di San Paolo (MITOR project) and the CleanWaterCenter (CWC), have shown that this large difference in the salt transport is due to the so-called Marangoni effect. Based on this discovery, the researchers of the Politecnico di Torino (Matteo Morciano, Matteo Fasano, Eliodoro Chiavazzo and Pietro Asinari, who also holds the position of Scientific Director of the National Institute of Metrological Research - INRiM) and of MIT (Svetlana V. Boriskina) have created a prototype capable of desalting seawater in a sustainable way and spontaneously removing the salt accumulated during operation.

The Marangoni effect is a phenomenon that also occurs in nature and can be observed in everyday life: "In an aqueous solution, liquid molecules interact with each other through intermolecular bonds that generate forces called 'cohesion forces'. Two solutions with different concentrations will have different cohesion forces. The presence of this concentration variation, and therefore of cohesion forces, causes the liquid to flow away from regions of low concentration, generating a re-mixing process. This effect is responsible for the 'tears' of wine that are observed on the walls of the glass when it is swirled. The Marangoni effect, due to a change in concentration in the liquid, can therefore be engineered and exploited to increase the re-mixing of solutions with different concentrations. In our desalination device (where the treated solutions are based on sea water at different concentrations), this phenomenon makes it possible to avoid the accumulation of salt in the evaporators, ensuring constant and lasting productivity of distilled water, and safeguarding the components subject to deterioration. Our strategy was therefore to design a device capable of taking full advantage of this effect, achieving a further step towards future commercial applications of the device", explains Matteo Morciano, researcher at the Energy Department of the Politecnico di Torino and first author of the research.

In the current version and considering an area for the absorption of solar energy of about one square meter, the desalination device can supply more than 15 litres of water per day. Furthermore, thanks to the Marangoni effect, the salt removal process is up to 100 times faster than predictions based on spontaneous diffusion, thus favouring a rapid restoration of the properties of the components.
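
A back-of-envelope energy balance, with generic assumptions about insolation and latent heat that are not from the paper, shows why roughly 15 litres per day from one square metre implies that the device must reuse the heat released when vapor condenses:

```python
# Rough energy balance for the quoted ~15 litres/day per square metre.
# Both numbers below are generic assumptions, not values from the paper.
insolation_MJ = 5 * 3.6        # ~5 kWh/m2/day of solar energy, in MJ
latent_heat_MJ_per_kg = 2.4    # energy needed to evaporate 1 kg of water

single_stage = insolation_MJ / latent_heat_MJ_per_kg  # ~7.5 kg/day upper bound
print(f"single-stage evaporation limit: {single_stage:.1f} L/day")

# Exceeding that limit implies the device recovers the latent heat of
# condensation and reuses it, with an effective gain of roughly:
print(f"implied heat-recovery factor for 15 L/day: {15 / single_stage:.1f}x")
```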

The results of this research, published in the prestigious journal Energy and Environmental Science [*], may have important implications in the design of a new generation of desalination materials and devices, allowing them to spontaneously 'self-clean' the accumulated salt and guaranteeing stable and long-lasting performance. Further research is currently underway at the CleanWaterCenter of the Politecnico di Torino, with the aim of making the prototype industrializable and more versatile.

Credit: 
Politecnico di Torino

Arctic weather observations can improve hurricane track forecast accuracy

image: A balloon carrying a radiosonde is released automatically over the Arctic Ocean

Image: 
Jun Inoue (NIPR)

In 2017, Category 5 Hurricane Irma devastated islands of the Lesser and Greater Antilles before turning northward and ultimately making landfall in southwestern Florida. Forecasting the timing and position of that northward turn was critical for Floridians to prepare for the storm's impact, but the uncertainty surrounding prediction of the upper-level trough that would steer the turn made this difficult. Collecting additional meteorological data, including measurements from locations as distant as the Arctic, could help meteorologists forecast the tracks of future tropical cyclones like Irma.

In a new study published in the journal Atmosphere, a research team led by the Kitami Institute of Technology compared the accuracy of operational medium-range ensemble forecasts for 29 Atlantic hurricanes from 2007 to 2019, with a focus on hurricanes that moved northward in response to upper-level atmospheric circulation over the mid-latitudes and approached the United States.

Although hurricane track forecasting has significantly improved in recent decades, there are still significant errors in some cases, and the consequences can be severe. In particular, uncertainty regarding the paths of upper-level troughs with strong winds in the mid-latitudes can lead to greater uncertainty when they influence the tracks of tropical cyclones. The research team found that in cases of hurricanes steered by upper-level troughs, forecasting errors of the hurricanes' central positions were larger than those in cases not influenced by upper-level troughs.

Lead author Kazutoshi Sato explains, "During the forecast period of Hurricane Irma in 2017, there was a large meander of the jet stream over the North Pacific and North Atlantic, which introduced large errors into the forecasts. When we included additional radiosonde observation data collected in the Arctic by the Research Vessel Mirai in the late summer of 2017, the error and ensemble spread of the upper-level trough at the initial forecast time were reduced, which increased the accuracy of the track forecast for Irma."

The researchers also investigated the effect of including additional dropsonde data collected by the United States Air Force Reserve Command and the Aircraft Operations Center of the National Oceanic and Atmospheric Administration over the Atlantic Ocean near Hurricane Irma in 2017. Hurricane forecast accuracy was improved both by dropsonde measurements near the hurricanes and by radiosonde observations over the Arctic Ocean.
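
For readers unfamiliar with ensemble verification, the sketch below computes the two quantities the passage refers to, the error of the ensemble-mean track and the ensemble spread, in a deliberately simplified planar form; operational centers use great-circle distances and more careful definitions.

```python
# Minimal illustration of ensemble-mean track error and ensemble spread.
# Positions are treated as points in a plane; real verification uses
# great-circle distances between forecast and observed storm centers.
import numpy as np

def track_metrics(ensemble, truth):
    """ensemble: (members, times, 2) forecast positions; truth: (times, 2)."""
    mean_track = ensemble.mean(axis=0)                    # ensemble-mean track
    error = np.linalg.norm(mean_track - truth, axis=-1)   # error vs. best track
    spread = np.linalg.norm(ensemble - mean_track, axis=-1).mean(axis=0)
    return error, spread

# Toy data: 20 ensemble members, 5 forecast times, 2D positions.
ens = np.random.randn(20, 5, 2) + np.arange(5)[None, :, None]
truth = np.stack([np.arange(5), np.arange(5)], axis=1).astype(float)
err, spr = track_metrics(ens, truth)
print("error by lead time:", err.round(2), "| spread:", spr.round(2))
```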

According to co-author Jun Inoue, an associate professor at the National Institute of Polar Research, "Our findings show that developing a more efficient observation system over the higher latitudes will be greatly beneficial to tropical cyclone track forecasting over the mid-latitudes, which will help mitigate the human casualties and socioeconomic losses caused by these storms."

Credit: 
Research Organization of Information and Systems

Setting a TRAP for pandemic-causing viruses

image: The TRAP display method "fishes" through a library of trillions of synthetic proteins for those that can target SARS-CoV-2. The approach was able to identify proteins that can be used in testing for the virus and potentially in treating people infected with COVID-19.

Image: 
Hiroshi Murakami

A research team led by Nagoya University scientists in Japan has developed an approach that can quickly find synthetic proteins that specifically bind to important targets, such as components of the SARS-CoV-2 virus. The method was published in the journal Science Advances and could be used to develop test kits or for finding treatments.

"We developed a laboratory technique for rapid selection of synthetic proteins that strongly bind to SARS-CoV-2," says Nagoya University biomolecular engineer Hiroshi Murakami. "High-affinity synthetic proteins can be used to develop sensitive antigen tests for SARS-CoV-2 and for future use as neutralization antibodies in infected patients."

Murakami and his colleagues had previously developed a protein selection lab test called TRAP display, which stands for 'transcription-translation coupled with association of puromycin linker.' Their approach skips two time-consuming steps in another commonly used technique for searching through synthetic protein libraries. But their investigations indicated there was a problem with the puromycin linker.

In the current study, the team improved their technique by modifying the puromycin linker. Ultimately, they were able to use their TRAP display to identify nine synthetic proteins that bind to the spike protein on SARS-CoV-2's outer membrane. The approach took only four days compared to the weeks it would take using the commonly used messenger RNA display technology.

TRAP display involves using a large number of DNA templates that code for and synthesize trillions of proteins carrying random peptide sequences. The synthetic proteins are linked to DNA with the help of the modified puromycin linker and then exposed to a target protein. When the whole sample is washed, only the synthetic proteins that bind to the target remain. These are then placed back into the TRAP display for further rounds until only a small number of very specific target-binding synthetic proteins are left.
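
The round-by-round enrichment logic described above can be captured in a toy model: if each wash keeps binders with much higher probability than non-binders, the binder fraction of the re-amplified pool grows geometrically. The survival rates below are illustrative assumptions, not measured values from the study.

```python
# Toy model of iterative selection: each round keeps binders with high
# probability and washes away most non-binders, then the survivors are
# re-amplified. All rates are illustrative assumptions only.
binder_fraction = 1e-9   # true binders per library member at the start
keep_binder = 0.5        # chance a true binder survives a wash
keep_background = 1e-4   # chance a non-binder sticks around anyway

f = binder_fraction
for rnd in range(1, 7):
    surviving_binders = f * keep_binder
    surviving_background = (1 - f) * keep_background
    f = surviving_binders / (surviving_binders + surviving_background)
    print(f"round {rnd}: binder fraction = {f:.3g}")
# The pool converges to nearly all binders within a handful of rounds,
# mirroring how repeated TRAP rounds leave only specific target-binders.
```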

The researchers investigated the nine synthetic proteins that were found to bind to SARS-CoV-2. Some were specifically able to detect SARS-CoV-2 in nasal swabs from COVID-19 patients, indicating they could be used in test kits. One also attaches to the virus to prevent it from binding to the receptors it uses to gain access to human cells. This suggests this protein could be used as a treatment strategy.

"Our high-speed, improved TRAP display could be useful for implementing rapid responses to subspecies of SARS-CoV-2 and to other potential new viruses causing future pandemics," says Murakami.

Credit: 
Nagoya University

Men less likely to see food as national security issue amid pandemic

PULLMAN, Wash. - On average, men not only showed less empathy toward temporary agricultural laborers, known as H-2A guest workers, but also were less likely to see food supply and production as issues of national security, according to a study led by a Washington State University researcher.

This particular finding relating to gender stood out from the rest of the study's results. The survey was conducted before and during the COVID-19 pandemic.

The study was published in the journal Applied Economic Perspectives and Policy by Jeff Luckstead, assistant professor in the WSU School of Economic Sciences, and Rodolfo M. Nayga and Heather A. Snell, both at the University of Arkansas.

The gender anomaly notwithstanding, the study found that on average, people did shift their views toward food being a national security issue during the pandemic. They were also more empathetic toward H-2A workers because of the crisis.

Researchers found that gender played a strong role in other ways, too. On average, men believed that stay-at-home orders and related economic impacts were not justified. Men were also found to have viewed the shelter-in-place restrictions as an over-reaction on the part of local and state officials. Respondents' political views on immigration did not change, the study found.

"The surprising part was how gender played a strong role in influencing responses," Luckstead said. "It was the only statistically significant factor for all the questions we asked."

Specializing in agricultural trade and policy analysis, Luckstead also studies immigration and its role in agriculture and food production.

Luckstead and his co-authors posed nine questions to the pool of respondents. The questions were broken into two sets: questions asked before and during the COVID-19 outbreak, and questions asked only during the pandemic.

Among other questions, the researchers asked respondents to rank their bias on immigration policy from very liberal to very conservative. They also asked the importance respondents placed on agricultural food production during the coronavirus crisis.

Other questions explored whether or not shelter-in-place orders were a matter of over-reacting or under-reacting, and whether or not any economic damage caused by stay-at-home orders was justified.

The researchers screened out respondents who made more than $50,000 annually, those with advanced degrees, and retirees.

"We wanted to sample a domestic audience who would most likely be candidates for agricultural field work," Luckstead said. He added that domestic workers in his survey categories are vastly under-represented in the agricultural field work economy.

In terms of the big picture, Luckstead added that it is important to understand how low-skilled domestic workers in this labor pool view food, food-production, and supply, especially in the context of a pandemic.

Because these domestic workers are largely underrepresented in agricultural field work, it is important to understand why they aren't working in these labor sectors, particularly given the high unemployment rates stemming from the COVID-19 crisis.

"It is interesting to see that while attitudes generally shifted because of the pandemic, gender really stood out as a significant difference in attitudes," Luckstead said.

Credit: 
Washington State University

Musical training can improve attention and working memory in children - study

image: Dr Kausel and colleagues analyzing the fMRI results

Image: 
L. Kausel and coauthors

Neuroscientists have found new evidence that learning to play an instrument may be good for the brain. Musically trained children perform better at attention and memory recall and have greater activation in brain regions related to attention control and auditory encoding, executive functions known to be associated with improved reading, higher resilience, greater creativity, and a better quality of life. These results are published in the open-access journal Frontiers in Neuroscience.

A team led by Dr Leonie Kausel, a violinist and neuroscientist at the Pontifical Catholic University of Chile and the Universidad del Desarrollo Chile, tested the attention and working memory of 40 Chilean children between 10 and 13 years of age. Twenty played an instrument, had had at least two years of lessons, practiced at least two hours a week and regularly played in an orchestra or ensemble. Twenty control children, recruited from public schools in Santiago, had had no musical training other than in the school curriculum. Their attention and working memory were assessed through the previously developed and validated "bimodal (auditory/visual) attention and working memory (WM) task". During this task, Kausel et al. monitored the children's brain activity with functional magnetic resonance imaging (fMRI), detecting small changes in blood flow within the brain.

There was no difference between the two groups in reaction time. However, musically trained children did significantly better on the memory task.

"Our most important finding is that two different mechanisms seem to underlie the better performance of musically trained children in the attention and WM memory task," says Kausel. "One that supports more domain-general attention mechanisms and another that supports more domain-specific auditory encoding mechanisms."

Here, "domain" refers to how sensorial modalities -- types of senses such as heat, sound, or light -- are encoded by the brain, while domain-specific vs. -general means that only one vs. more than one sensorial modality is processed, and "mechanism" refers to the neurochemical processes that occur. Both mechanisms seem to have improved function in musically trained children. For the domain-specific mechanism, brain regions that are more active include the inferior frontal gyrus and the supramarginal gyrus - in the front and center-front of the brain, both part of the so-called "phonological loop", a working memory system involved in auditory processing, establishing auditory-motor connections, and tonal and verbal auditory working memory. For the domain-general mechanism, a more active brain region is probably the fronto-parietal control network, a large-scale network composed of various brain regions that deals with executive function, goal-oriented, and cognitively-demanding tasks.

Kausel et al. suspect music training increases the functional activity of these brain networks.

"The next step of the project is to establish the causality of the mechanisms we found for improving attention and working memory," says Kausel. "We also aim to make a longitudinal study on musical training with children, evaluating attention and working memory, and the possibility to evaluate a musical training intervention on ADHD children."

Does this mean you should sign your kids up for music classes?

"Of course, I would recommend that," Kausel agrees. "However, I think parents should not only enrol their children because they expect that this will help them boost their cognitive functions, but because it is also an activity that, even when very demanding, will provide them with joy and the possibility to learn a universal language."

How the study was done

Kausel et al. adapted the bimodal attention and WM task from Johnson & Zatorre (2006), Neuroimage 31:1673-81. They asked participants to focus on either one, both, or neither stimulus of a pair - a visual abstract figure and a short melody - presented simultaneously for 4 s ("encoding phase"). Two seconds later, they asked them to recall both by means of a yes/no recognition task ("memory retrieval phase"). They also measured accuracy of responses and reaction time.

fMRI is a non-invasive technique that measures brain activity in real time: increased blood flow to a region implies increased activity. To determine activity associated with paying attention, Kausel et al. subtracted fMRI data acquired from "passive" trials (i.e. when children passively observe the bimodal stimuli, without a memory recall task) from those acquired during "active" trials (i.e. when children paid attention to auditory and/or visual stimuli). From this, they could identify brain regions associated with paying attention and memory encoding, activated during the encoding phases.
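
In schematic form, the subtraction described above amounts to differencing the average volumes of the two trial types. The sketch below uses random data and a made-up threshold; real pipelines (SPM, FSL and the like) fit statistical models voxel by voxel rather than taking a raw difference.

```python
# Schematic "active minus passive" contrast on synthetic fMRI-like volumes.
# Data and threshold are invented; this only illustrates the subtraction idea.
import numpy as np

rng = np.random.default_rng(0)
active = rng.normal(0.2, 1.0, size=(30, 64, 64, 36))   # 30 "active" trial volumes
passive = rng.normal(0.0, 1.0, size=(30, 64, 64, 36))  # 30 "passive" trial volumes

# Average each condition over trials, then difference the mean volumes:
contrast = active.mean(axis=0) - passive.mean(axis=0)  # attention-related signal
print("voxels above (arbitrary) threshold:", int((contrast > 0.5).sum()))
```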

Credit: 
Frontiers