Tech

Satellite sees tropical cyclone Vayu centered off India's coastline

image: NOAA's JPSS satellite passed over the Northern Indian Ocean, and the VIIRS instrument captured a visible image that showed the eye off the Gujarat coast of India.

Image: 
NOAA/NRL

Tropical Cyclone Vayu's eye was just off the western coast of India when the NOAA-20 satellite passed overhead and captured a visible image of the storm.

The Visible Infrared Imaging Radiometer Suite (VIIRS) instrument aboard the NOAA-20 polar orbiting satellite provided a visible image of the storm. The VIIRS image revealed an eye surrounded by powerful thunderstorms. However, the majority of the strongest storms appeared to be displaced southwest of the eye.

The Joint Typhoon Warning Center or JTWC noted that dry air is wrapping into the system, which will limit its ability to intensify further.

At 11 a.m. EDT (1500 UTC) on June 13, 2019, Vayu had maximum sustained winds near 95 knots (109 mph/176 kph). Vayu's eye was centered near 20.9 degrees north latitude and 68.8 degrees east longitude, approximately 264 nautical miles south-southeast of Karachi, Pakistan. Vayu had been tracking north-northwestward.

JTWC forecasters noted that "the track is expected to shift westward in the near term as another subtropical ridge (elongated area of high pressure) located over Saudi Arabia builds in to the northwest." In one and a half days, increasing outside winds and more dry air moving into the system are expected to begin weakening the storm.

Credit: 
NASA/Goddard Space Flight Center

Mysterious Majorana quasiparticle is now closer to being controlled for quantum computing

image: A scanning tunneling microscope (STM) was used to visualize Majorana quasiparticles (green peaks) occurring at the ends of topological edge channels (yellow regions) at the atomic steps of a bismuth thin film grown on a superconducting surface. Small magnetic clusters are seen as small bumps decorating the corner of these edges. At the interfaces between the magnetic clusters and the edge channel, experiments detected robust Majorana quasiparticles, but only when the cluster magnetization points along the channel.

Image: 
Yazdani Lab at Princeton University

As mysterious as the Italian scientist for whom it is named, the Majorana particle is one of the most compelling quests in physics.

Its fame stems from its strange properties - it is the only particle that is its own antiparticle - and from its potential to be harnessed for future quantum computing.

In recent years, a handful of groups including a team at Princeton have reported finding the Majorana in various materials, but the challenge is how to manipulate it for quantum computation.

In a new study published this week, the Princeton team reports a way to control Majorana quasiparticles in a setting that also makes them more robust. The setting - which combines a superconductor and an exotic material called a topological insulator - makes Majoranas especially resilient against destruction by heat or vibrations from the outside environment. What is more, the team demonstrated a way to turn on or off the Majorana using small magnets integrated into the device. The report appeared in the journal Science.

"With this new study we now have a new way to engineer Majorana quasiparticles in materials," said Ali Yazdani, Class of 1909 Professor of Physics and senior author on the study. "We can verify their existence by imaging them and we can characterize their predicted properties."

The Majorana is named for physicist Ettore Majorana, who predicted the existence of the particle in 1937 just a year before mysteriously disappearing during a ferry trip off the Italian coast. Building on the same logic with which physicist Paul Dirac predicted in 1928 that the electron must have an antiparticle, later identified as the positron, Majorana theorized the existence of a particle that is its own antiparticle.

Typically, when matter and antimatter come together, they annihilate each other in a violent release of energy. But Majoranas, when they appear in pairs, one at each end of a specially designed wire, can be relatively stable and interact weakly with their environment. The pairs store quantum information at two distinct locations, making it relatively robust against disturbance, because changing the quantum state requires operations at both ends of the wire at the same time.

This capability has captivated technologists who envision a way to make quantum bits - the units of quantum computing - that are more robust than current approaches. Quantum systems are prized for their potential to tackle problems impossible to solve with today's computers, but they require maintaining a fragile state called superposition, which, if disrupted, can result in system failures.

A Majorana-based quantum computer would store information in pairs of particles and perform computation by braiding them around each other. The results of computation would be determined by annihilation of Majoranas with each other, which can result in either the appearance of an electron (detected by its charge) or nothing, depending on how the pair of Majoranas have been braided. The probabilistic outcome of the Majorana pair annihilation underlies its use for quantum computation.

The challenge is how to create and easily control Majoranas. One of the places they can exist is at the ends of a single-atom-thick chain of magnetic atoms on a superconducting bed. In 2014, reporting in Science, Yazdani and collaborators used a scanning tunneling microscope (STM), in which a tip is dragged over atoms to reveal the presence of quasiparticles, to find Majoranas at both ends of a chain of iron atoms resting on the surface of a superconductor.

The team went on to detect the Majorana's quantum "spin," a property shared by electrons and other subatomic particles. In a report published in Science in 2017, the team stated that the Majorana's spin property is a unique signal with which to determine that a detected quasiparticle is indeed a Majorana.

In this latest study, the team explored another predicted place for finding Majoranas: in the channel that forms at the edge of a topological insulator when it is placed in contact with a superconductor. Superconductors are materials in which electrons can travel without resistance, and topological insulators are materials in which electrons flow only along the edges.

The theory predicts that Majorana quasiparticles can form at the edge of a thin sheet of topological insulator that comes in contact with a block of superconducting material. The proximity of the superconductor coaxes electrons to flow without resistance along the topological insulator edge, which is so thin that it can be thought of as a wire. Since Majoranas form at the end of wires, it should be possible to make them appear by cutting the wire.

"It was a prediction, and it was just sitting there all these years," said Yazdani. "We decided to explore how one could actually make this structure because of its potential to make Majoranas that would be more robust to material imperfections and temperature."

The team built the structure by evaporating a thin sheet of bismuth topological insulator atop a block of niobium superconductor. They placed nanometer-sized magnetic memory bits on the structure to provide a magnetic field, which derails the flow of electrons, producing the same effect as cutting the wire. They used STM to visualize the structure.

When using their microscope to hunt for the Majorana, however, the researchers were at first perplexed by what they saw. Some of the time they saw the Majorana appear, and other times they could not find it. After further exploration they realized that the Majorana only appears when the small magnets are magnetized in the direction parallel to the direction of electron flow along the channel.

"When we began to characterize the small magnets, we realized they are the control parameter," said Yazdani. "The way the magnetization of the bit is oriented determines whether the Majorana appears or not. It is an on-off switch."

The team reported that the Majorana quasiparticle that forms in this system is quite robust because it occurs at energies that are distinct from those of the other quasiparticles that can exist in the system. The robustness also stems from its formation in a topological edge mode, which is inherently resistant to disruption. Topological materials derive their name from the branch of mathematics that studies properties preserved when objects are deformed by stretching or bending; because the edge mode is protected by such properties, electrons flowing in a topological material continue moving around any dents or imperfections.

Credit: 
Princeton University

The surprising reason why some lemurs may be more sensitive to forest loss

image: Researchers report that the microbes living in the guts of leaf-eating lemurs like this one are largely shaped by the forests where they live, a finding that could make some species less resilient to deforestation.

Image: 
David Haring, Duke Lemur Center

DURHAM, N.C. -- Duke University scientists have given us another way to tell which endangered lemur species are most at risk from deforestation -- based on the trillions of bacteria that inhabit their guts.

In a new study, researchers compared the gut microbes of 12 lemur species across the island of Madagascar, where thousands of acres of forest are cleared each year to make way for crops and pastures.

The team found that some lemurs harbor gut microbes that are more specialized than others' for the forests where they live, helping the lemurs digest their leafy diets.

The findings suggest that such lemurs may find it harder to adapt to fragmented forests or new locales in the wake of habitat change, because their ability to digest the specific mix of plants growing there depends on having the right microbes.

The study was published June 12 in the journal Biology Letters.

Researchers are trying to tease apart the influence of various factors that shape the balance of microbes in the gut, in part because of studies showing that an animal's gut microbiome affects its health.

"Gut microbes perform crucial functions," said first author Lydia Greene, who conducted the research as part of her Ph.D. dissertation at Duke.

Led by Greene and professor of evolutionary anthropology Christine Drea, the study compared the gut microbiomes of 12 species representing two branches of the lemur family tree, brown lemurs and sifaka lemurs.

Both groups of lemurs eat plant-based diets culled from hundreds of species of trees. But while brown lemurs eat mostly fruit, sifakas are known for the ability to eat leaves full of fiber and tannins. Sifakas' intestines are teeming with mostly friendly bacteria that help them break down the tough leaves they eat, turning plant fiber into nutrients the lemurs use to stay healthy.

Using feces collected by a network of lemur-tracking colleagues working at seven sites across Madagascar, the team sequenced the DNA of gut bacteria from 128 lemurs to figure out which microbes were present.

The stool samples revealed striking differences. The fruit-eating brown lemurs harbored similar collections of gut microbes regardless of where they lived on the island. But the microbial makeup inside the guts of the leaf-eating sifakas varied from place to place, and in ways that couldn't be attributed to genetic relatedness between lemur species. Instead, what mattered most was where they lived: Microbes that were common in lemurs living in dry forest were rare or absent in rainforest dwellers, and vice versa.

The patterns they found may also explain "why so many brown lemurs have adapted to captivity, but only one species of sifaka" has been successfully reared in zoos and sanctuaries, Greene said.

"They have specialized diets and are completely reliant on having the right microbes" to extract nutrients and energy from the food they eat, Greene said.

"If you look at any one of these fruit-eating species and take away its forest, theoretically it could move next door," Drea said. "The leaf specialists may not be able to."

Credit: 
Duke University

Carbon-neutral fuels move a step closer

The carbon dioxide (CO2) produced when fossil fuels are burned is normally released into the atmosphere. Researchers working on synthetic fuels - also known as carbon-neutral fuels - are exploring ways to capture and recycle that CO2. At EPFL, this research is spearheaded by a team led by Professor Xile Hu at the Laboratory of Inorganic Synthesis and Catalysis (LSCI). The chemists have recently made a landmark discovery, successfully developing a high-efficiency catalyst that converts dissolved CO2 into carbon monoxide (CO) - an essential ingredient of all synthetic fuels, as well as plastics and other materials. The researchers published their findings in Science on 14 June.

Replacing gold with iron

The new process is just as efficient as previous technologies, but with one major benefit. "To date, most catalysts have used atoms of precious metals such as gold," explains Professor Hu. "But we've used iron atoms instead. At extremely low currents, our process achieves conversion rates of around 90%, meaning it performs on a par with precious-metal catalysts."

"Our catalyst converts such a high percentage of CO2 into CO because we successfully stabilized iron atoms to achieve efficient CO2 activation," adds Jun Gu, a PhD student and lead author of the paper. To help them understand why their catalyst was so highly active, the researchers called on a team led by Professor Hao Ming Chen at National Taiwan University, who conducted a key measurement of the catalyst under operating conditions using synchrotron X-rays.

Closing the carbon cycle

Although the team's work is still very much experimental, the research paves the way for new applications. At present, most of the carbon monoxide needed to make synthetic materials is obtained from petroleum. Recycling the carbon dioxide produced by burning fossil fuels would help preserve precious resources, as well as limit the amount of CO2 - a major greenhouse gas - released into the atmosphere.

The process could also be combined with storage batteries and hydrogen-production technologies to convert surplus renewable power into products that could fill the gap when demand outstrips supply.

Credit: 
Ecole Polytechnique Fédérale de Lausanne

Novel communications architecture for future ultra-high speed wireless networks

image: Figure 1. (first above) Photo of the open space measurement setup of Fig. 2, showing the client, AP2 (access point 2) and AP5 (access point 5).

Image: 
IMDEA Networks

The radio frequency spectrum, the basis for wireless telecommunications, is a finite resource that needs to be managed effectively to satisfy the demands posed by the exponential growth in wireless internet access.

IMDEA Networks researchers have developed a novel communications architecture for future ultrafast wireless networks that promises to achieve data rates previously only possible with optical fiber.

Facebook initiated the Terragraph project, which uses a mesh of reconfigurable millimeter-wave links to provide reliable, high-speed Internet access in urban and suburban environments. It previously experimented with networks of solar-powered drones with millimeter-wave backhaul and interconnection links to provide connectivity in areas with little infrastructure. The Loon project by Alphabet (Google) uses high-altitude balloons with millimeter-wave links for the same purpose. Millimeter-wave technology also has extremely interesting properties for large-scale networks of small satellites that provide worldwide connectivity, such as the planned Starlink network of SpaceX and PointView Tech (Facebook), and is very likely to be used in such networks. As the density and capacity of such networks increase, the scalability results of this ERC project will be of high practical relevance.

"The ground breaking protocols and algorithms we have developed provide key elements for the scalability of future wireless networks," says Joerg Widmer. "In analogy to the evolution of wired Ethernet from a shared medium to a fully switched network, we envision that future wireless networks will consist of many highly directional LOS (line-of-sight) channels for communication between access points (APs) and end devices". Thus, the architecture of future millimeter-wave networks will be characterized by being ultra-dense and highly scalable. "In order to deal with the extremely dynamic radio environments where channels may appear and disappear over very short time intervals, SEARCHLIGHT uses angle information to rapidly align the directional millimeter-wave antennas," explains Dr. Widmer. "The architecture integrates a location system and learns a map of the radio environment, which allows to rapidly select the most suitable access point and antenna beam pattern and allocate radio resource using predicted location as context information. Access points are deployed ubiquitously to provide continuous connectivity even in face of mobility and blockage and the project developed very low overhead network management mechanisms to cope with the high device density."

Dr. Widmer has recently been awarded an H2020 Marie Sklodowska-Curie Innovative Training Network grant on "Millimeter-wave Networking and Sensing for Beyond 5G", and within this project his group will follow up on the promising work started during the ERC grant. The work of the prestigious ERC grant also led to a collaboration project funded by Huawei on millimeter-wave and low-frequency channel correlation, and to a sub-contracted project on a "Millimeter-wave SDR-based Open Experimentation Platform" within the framework of the H2020 project "Orchestration and Reconfiguration Control Architecture" (ORCA), to extend an FPGA-based platform to less powerful hardware and enable remote access and experimentation for teaching and research.

Credit: 
IMDEA Networks Institute

Understanding social structure is important to rewilding

Increasing the success of wildlife translocations is critical, given the escalating global threats to wildlife. A study published in May 2019 in the journal Global Ecology and Conservation highlights the influence of a species' social structure on translocation success, and it provides a template for incorporating social information in the rehabilitation and release planning process. Using elephants as a model, the study--"Increasing Conservation Translocation Success by Building Social Functionality in Released Populations"--highlights the need to include animal social structure as an integral part of conservation plans, in order to assure better animal welfare and program success.

"Understanding of the complexity of social behavior in both wild and captive populations has greatly expanded over recent years," says Shifra Goldenberg, Ph.D., Smithsonian Conservation Biology Institute ecologist and San Diego Zoo Institute for Conservation Research fellow. "This information offers valuable insight into the social processes underpinning species' demography and behavior, and should be applied to enhance the success of their management."

The study offers wildlife managers a framework in which to analyze and evaluate social relationships within a translocation program, which entails evaluation of an individual animal's social interactions before, during and after human intervention. This framework echoes the recommendations of the International Union for Conservation of Nature (IUCN) as part of its One Plan Approach to Conservation. In addition to detailing how this concept can be applied, the study also uses information gathered from previous translocation efforts for African and Asian elephants to illustrate the need for this approach.

"Elephants are exceptional candidate species for release efforts; despite the support and broader benefits inherent to elephant release projects, elephants are challenging animals to translocate," says Megan Owen, Ph.D., director of Population Sustainability at the San Diego Zoo Institute for Conservation Research. "Elephants are highly social animals that exhibit frequent fission and fusion in their aggregation patterns. They have clear social preferences among their multigenerational associates, which are usually close relatives, but elephants can establish strong bonds with nonrelatives in the absence of family."

The paper suggests that improved data collection on individual animal social structure before translocation and rewilding can significantly inform decision-making and increase post-translocation success--including the consideration and mitigation of problems that can arise post translocation, such as human-animal conflict.

"Careful consideration of the ways in which social relationships shape how wildlife use landscapes can be an important tool for conservation translocations, whether the species is territorial and solitary or highly interactive," adds Goldenberg.

Credit: 
San Diego Zoo Wildlife Alliance

A new paradigm of material identification based on graph theory

image: The simplified graph and the actual crystal structure (upper right) of spinel Co3O4.

Image: 
Science China Press

The Materials Genome Initiative (MGI) and the National Materials Genome Project have been launched by the American and Chinese governments over the past decade. One of the major goals of these missions is to facilitate the identification of materials data in order to speed material discovery and development. Current methods are promising candidates for identifying structures effectively, but they have limited ability to handle all structures accurately and automatically in big materials databases, because differing material sources and measurement errors lead to variations in bond lengths and bond angles.

Feng Pan and his colleagues from Peking University Shenzhen Graduate School propose a new paradigm based on graph theory (the GT scheme) to improve the efficiency and accuracy of material identification; it focuses on the "topological relationships" among structures rather than on the values of bond lengths and bond angles.

In the GT scheme, the researchers first simplify each crystal structure into a graph consisting only of vertices and edges: atoms become vertices, and atoms joined by actual chemical bonds are "connected" with edges. If the simplified graphs of two structures are isomorphic, the GT scheme treats them as one structure. Using this method, automatic deduplication of a big materials database was achieved for the first time, identifying 626,772 unique structures among 865,458 original structures.
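As an illustration only (this is not the authors' code), the core idea can be sketched with networkx: represent each structure as an element-labeled bond graph and merge structures whose graphs are isomorphic. The input format and helper names are assumptions made for the example.

```python
# Illustrative sketch of graph-based structure deduplication (not the authors' implementation).
# Assumes each structure is given as a list of element symbols plus a list of bonded index pairs.
import networkx as nx
from networkx.algorithms.isomorphism import GraphMatcher, categorical_node_match

def to_bond_graph(atoms, bonds):
    """Build the simplified graph: atoms -> vertices (labeled by element), bonds -> edges."""
    g = nx.Graph()
    for i, element in enumerate(atoms):
        g.add_node(i, element=element)
    g.add_edges_from(bonds)
    return g

def same_topology(g1, g2):
    """Two structures count as one if their bond graphs are isomorphic, matching vertices
    only by chemical element (bond lengths and angles are deliberately ignored)."""
    matcher = GraphMatcher(g1, g2, node_match=categorical_node_match("element", None))
    return matcher.is_isomorphic()

def deduplicate(structures):
    """Keep one representative per isomorphism class (quadratic scan, kept simple for clarity)."""
    unique = []
    for atoms, bonds in structures:
        g = to_bond_graph(atoms, bonds)
        if not any(same_topology(g, kept) for kept in unique):
            unique.append(g)
    return unique
```

A production-scale deduplication over hundreds of thousands of structures would first bucket candidates by cheap invariants (composition, degree sequence) to avoid the all-pairs comparison shown here.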

Moreover, the GT scheme has been extended to solve more advanced problems, such as identifying highly distorted structures, distinguishing structures with strong similarity, and classifying complex crystal structures in materials big data. Compared with traditional structural chemistry methods, the GT scheme addresses these issues much more easily, which enhances the efficiency and reliability of material identification.

Using this artificial-intelligence technique, the researchers aim to achieve high-throughput calculation, preparation and detection for materials databases. The GT scheme moves beyond traditional material research methods and accelerates development in the field.

Credit: 
Science China Press

Butting out: Researchers gauge public opinion on tobacco product waste

image: This is professor Janet Hoek.

Image: 
University of Otago

Requiring cigarettes to contain biodegradable filters, fining smokers who litter cigarette butts and expanding smoke-free outdoor areas are the measures the public considers most likely to reduce tobacco product waste, new University of Otago research reveals.

The recent survey of 800 New Zealanders, including both smokers (386) and non-smokers (414), shows most respondents consider cigarette butts toxic to the environment and hold smokers primarily responsible.

However, Lead Researcher Professor Janet Hoek from the Departments of Public Health and Marketing says when knowledge of butt non-biodegradability increased, so too did the proportion of respondents holding tobacco companies responsible for tobacco product waste.

"Tobacco companies have framed smokers as responsible for tobacco product waste, yet it is tobacco companies that knowingly manufacture a toxic and environmentally harmful product," Professor Hoek says.

"Not only do cigarette filters mislead smokers about the risks they face, they are also a pervasive litter item; recent news reports suggest annually more than six million cigarette butts are discarded in New Zealand," she says.

"Because cigarette filters comprise non-biodegradable cellulose acetate, they accumulate where they are littered, leaching toxic chemicals into the soil before being swept into waste water systems and out to sea where they contribute to accumulating plastic mountains."

Cigarette filters were first introduced in the 1950s and 60s in response to growing evidence that smoking caused lung cancer and other serious and often fatal illnesses. Advertising at the time declared filtered cigarettes were better for people's health, a perception that has persisted despite evidence filters bring no health benefits to smokers and cause major environmental damage.

The survey is believed to be the first in New Zealand to gauge public perceptions of the problem tobacco product waste causes and what appropriate responses might be.

Professor Hoek says smokers and non-smokers varied in their perceptions of the measures tested. Overall, most respondents saw mandating biodegradable filters (81 per cent) and fines (74 per cent) as effective in reducing tobacco waste. However, non-smokers were more likely than smokers to see increased smoke-free spaces (70 per cent) as effective, while smokers were more likely than non-smokers to see educative measures, such as social marketing campaigns (72 per cent) or on-pack labels (55 per cent), as likely to be effective.

The researchers call for strategies that increase awareness of tobacco companies' role in creating tobacco product waste, which has received little attention.

"Greater knowledge of how tobacco companies contribute to this environmental problem could foster political support for producer responsibility measures that require these companies to manage the waste their products cause," the research states.

"At the same time, we must also continue to foster smoking cessation and decrease uptake, as reducing smoking prevalence presents the best long-term solution to addressing the problem."

Credit: 
University of Otago

Concert of magnetic moments

image: The spins (red and blue arrows) in distant magnetic layers can couple due to the discovered long-range interaction, which is illustrated as a white string connecting two spins. This coupling leads to a clockwise canting between the spins in the two layers. Its depicted mirror image with the opposite sense of canting is not found in the considered systems.

Image: 
Forschungszentrum Jülich / Jan-Philipp Hanke

An international collaboration between researchers from Germany, the Netherlands, and South Korea has uncovered a new way in which the electron spins in layered materials can interact. In their publication in the journal Nature Materials, the scientists report a hitherto unknown chiral coupling that is active over relatively long distances. As a consequence, spins in two different magnetic layers separated by non-magnetic materials can influence each other even though they are not adjacent.

Magnetic solids form the foundation of modern information technology. For example, these materials are ubiquitous in memories such as hard disk drives. Functionality and efficiency of these devices is determined by the physical properties of the magnetic solids. The latter originate from the "concert of spins" - the interactions between microscopically small magnetic moments within the material. Understanding and directing this concert is a fundamental question for both research and applications.

Two magnetic materials can influence each other over long distances even if they are not in direct contact. In the past, such a long-range interaction has been observed, for example, in heterostructures of magnetic iron layers that are separated by a thin chromium spacer. A unique fingerprint of the so-called interlayer coupling is the parallel or antiparallel alignment of the magnetic moments in the iron layers. This phenomenon is technologically important since the electrical resistance of the two possible configurations is drastically different - which is known as giant magnetoresistance effect. It is used in operating magnetic memories and sensors, and in 2007 the Nobel Prize in Physics was awarded jointly to Peter Grünberg and Albert Fert for their discovery.

A group of scientists has now extended this "concert of spins" by a new type of long-range interlayer coupling. They report in the journal Nature Materials that the discovered interaction leads to a special configuration of the magnetic moments, which is neither parallel nor antiparallel but has a specific chirality. This means that the resulting arrangement of spins is not identical to its mirror image - just as our left hand is different from our right hand. Such chiral interactions in crystals are very rare in nature. Using theoretical simulations on the supercomputer JURECA in Jülich, the researchers identified the interplay between the crystal structure and relativistic effects as the origin of the observed chiral interlayer coupling. This flavor of the "concert of spins" offers novel opportunities for engineering complex magnetic configurations that could be used to store and process data more efficiently in the future.

Credit: 
Forschungszentrum Juelich

NASA's Fermi mission reveals its highest-energy gamma-ray bursts

video: This animation shows the most common type of gamma-ray burst, which occurs when the core of a massive star collapses, forms a black hole, and blasts particle jets outward at nearly the speed of light. Viewing into a jet greatly boosts its apparent brightness. A Fermi image of GRB 130427A ends the sequence.

Watch on YouTube: https://www.youtube.com/watch?v=7uN1AjMui5k

Download in HD: https://svs.gsfc.nasa.gov/13220

Image: 
NASA's Goddard Space Flight Center

For 10 years, NASA's Fermi Gamma-ray Space Telescope has scanned the sky for gamma-ray bursts (GRBs), the universe's most luminous explosions. A new catalog of the highest-energy blasts provides scientists with fresh insights into how they work.

"Each burst is in some way unique," said Magnus Axelsson, an astrophysicist at Stockholm University in Sweden. "It's only when we can study large samples, as in this catalog, that we begin to understand the common features of GRBs. These in turn give us clues to the physical mechanisms at work."

The catalog was published in the June 13 edition of The Astrophysical Journal and is now available online. More than 120 authors contributed to the paper, led by Axelsson, Elisabetta Bissaldi at the National Institute of Nuclear Physics and Polytechnic University in Bari, Italy, and Nicola Omodei and Giacomo Vianello at Stanford University in California.

GRBs emit gamma rays, the highest-energy form of light. Most GRBs occur when some types of massive stars run out of fuel and collapse to create new black holes. Others happen when two neutron stars, superdense remnants of stellar explosions, merge. Both kinds of cataclysmic events create jets of particles that move near the speed of light. The gamma rays are produced in collisions of fast-moving material inside the jets and when the jets interact with the environment around the star.

Astronomers can distinguish the two GRB classes by the duration of their lower-energy gamma rays. Short bursts from neutron star mergers last less than 2 seconds, while long bursts typically continue for a minute or more. The new catalog, which includes 17 short and 169 long bursts, describes 186 events seen by Fermi's Large Area Telescope (LAT) over the last 10 years.
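As a minimal illustration of the duration cut described above (the 2-second threshold is the standard one used for this classification; the sample durations below are invented):

```python
# Minimal sketch of duration-based GRB classification; example durations are hypothetical.
def classify_grb(duration_seconds):
    """Short bursts (< 2 s) are associated with neutron star mergers;
    long bursts (>= 2 s) with the collapse of massive stars."""
    return "short" if duration_seconds < 2.0 else "long"

for duration in (0.1, 1.8, 60.0, 36000.0):   # seconds
    print(duration, classify_grb(duration))
```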

Fermi observes these powerful bursts using two instruments. The LAT sees about one-fifth of the sky at any time and records gamma rays with energies above 30 million electron volts (MeV) -- millions of times the energy of visible light. The Gamma-ray Burst Monitor (GBM) sees the entire sky that isn't blocked by Earth and detects lower-energy emission. All told, the GBM has detected more than 2,300 GRBs so far.

Below is a sample of five record-setting and intriguing events from the LAT catalog that have helped scientists learn more about GRBs.

1. GRB 081102B

The short burst 081102B, which occurred in the constellation Boötes on Nov. 2, 2008, is the briefest LAT-detected GRB, lasting just one-tenth of a second. Although this burst appeared in Fermi's first year of observations, it wasn't included in an earlier version of the collection published in 2013.

"The first LAT catalog only identified 35 GRBs," Bissaldi said. "Thanks to improved data analysis techniques, we were able to confirm some of the marginal observations in that sample, as well as identify five times as many bursts for the new catalog."

2. GRB 160623A

Long-lived burst 160623A, spotted on June 23, 2016, in the constellation Cygnus, kept shining for almost 10 hours at LAT energies -- the longest burst in the catalog. But at the lower energies recorded by Fermi's GBM instrument, it was detected for only 107 seconds. This stark difference between the instruments confirms a trend hinted at in the first LAT catalog. For both long and short bursts, the high-energy gamma-ray emission lasts longer than the low-energy emission and happens later.

3. GRB 130427A

The highest-energy individual gamma ray detected by Fermi's LAT reached 94 billion electron volts (GeV) and traveled 3.8 billion light-years from the constellation Leo. It was emitted by 130427A, which also holds the record for the most gamma rays -- 17 -- with energies above 10 GeV.

A popular model proposed that charged particles in the jet, moving at nearly the speed of light, encounter a shock wave and suddenly change direction, emitting gamma rays as a result. But this model can't account for the record-setting light from this burst, forcing scientists to rethink their theories.

The original findings on 130427A show that the LAT instrument tracked its emission for twice as long as indicated in the catalog. Due to the large sample size, the team adopted the same standardized analysis for all GRBs, resulting in slightly different numbers than reported in the earlier study.

4. GRB 080916C

The farthest known GRB occurred 12.2 billion light-years away in the constellation Carina. Researchers calculate that this explosion, called 080916C, contained the power of 9,000 supernovae.

Telescopes can observe GRBs out to these great distances because they are so bright, but pinpointing their exact distance is difficult. Distances are only known for 34 of the 186 events in the new catalog.

5. GRB 090510

The known distance to 090510 helped test Einstein's theory that the fabric of space-time is smooth and continuous. Fermi detected both a high-energy and a low-energy gamma ray at nearly the same instant. Having traveled the same distance in the same amount of time, they showed that all light, no matter its energy, moves at the same speed through the vacuum of space.

"The total gamma-ray emission from 090510 lasted less than 3 minutes, yet it allowed us to probe this very fundamental question about the physics of our cosmos," Omodei said. "GRBs are really one of the most spectacular astronomical events that we witness."

What's missing?

GRB 170817A marked the first time light and ripples in space-time, called gravitational waves, were detected from the merger of two neutron stars. The event was captured by the Laser Interferometer Gravitational Wave Observatory (LIGO), the Virgo interferometer and Fermi's GBM instrument, but it wasn't observed by the LAT because the instrument was switched off as the spacecraft passed through a region of Fermi's orbit where particle activity is high.

"Now that LIGO and Virgo have begun another observation period, the astrophysics community will be on the lookout for more joint GRB and gravitational wave events" said Judy Racusin, a co-author and Fermi deputy project scientist at NASA's Goddard Space Flight Center in Greenbelt, Maryland. "This catalog was a monumental team effort, and the result helps us learn about the population of these events and prepares us for delving into future groundbreaking finds."

The Fermi Gamma-ray Space Telescope is an astrophysics and particle physics partnership managed by NASA's Goddard Space Flight Center in Greenbelt, Maryland. Fermi was developed in collaboration with the U.S. Department of Energy, with important contributions from academic institutions and partners in France, Germany, Italy, Japan, Sweden and the United States.

Credit: 
NASA/Goddard Space Flight Center

Earth's heavy metals result of supernova explosion, University of Guelph Research Reveals

That gold on your ring finger is stellar - and not just in a complimentary way.

In a finding that may overthrow our understanding of where Earth's heavy elements such as gold and platinum come from, new research by a University of Guelph physicist suggests that most of them were spewed from a largely overlooked kind of star explosion far away in space and time from our planet.

Some 80 per cent of the heavy elements in the universe likely formed in collapsars, a rare but heavy-element-rich form of supernova explosion resulting from the gravitational collapse of old, massive stars typically about 30 times the mass of our sun, said physics professor Daniel Siegel.

That finding overturns the widely held belief that these elements mostly come from collisions between neutron stars or between a neutron star and a black hole, said Siegel.

His paper co-authored with Columbia University colleagues appears today in the journal Nature.

Using supercomputers, the trio simulated the dynamics of collapsars, or old stars whose gravity causes them to implode and form black holes.

Under their model, massive, rapidly spinning collapsars eject heavy elements whose amounts and distribution are "astonishingly similar to what we observe in our solar system," said Siegel. He joined U of G this month and is also appointed to the Perimeter Institute for Theoretical Physics, in Waterloo, Ont.

Most of the elements found in nature were created in nuclear reactions in stars and ultimately expelled in huge stellar explosions.

Heavy elements found on Earth and elsewhere in the universe from long-ago explosions range from gold and platinum, to uranium and plutonium used in nuclear reactors, to more exotic chemical elements such as neodymium found in consumer items such as electronics.

Until now, scientists thought that these elements were cooked up mostly in stellar smashups involving neutron stars or black holes, as in a collision of two neutron stars observed by Earth-bound detectors that made headlines in 2017.

Ironically, said Siegel, his team began working to understand the physics of that merger before their simulations pointed toward collapsars as a heavy element birth chamber. "Our research on neutron star mergers has led us to believe that the birth of black holes in a very different type of stellar explosion might produce even more gold than neutron star mergers."

What collapsars lack in frequency, they make up for in generation of heavy elements, said Siegel. Collapsars also produce intense flashes of gamma rays.

"Eighty per cent of these heavy elements we see should come from collapsars. Collapsars are fairly rare in occurrences of supernovae, even more rare than neutron star mergers - but the amount of material that they eject into space is much higher than that from neutron star mergers."

The team now hopes to see its theoretical model validated by observations. Siegel said infrared instruments such as those on the James Webb Space Telescope, set for launch in 2021, should be able to detect telltale radiation pointing to heavy elements from a collapsar in a far-distant galaxy.

"That would be a clear signature," he said, adding that astronomers might also detect evidence of collapsars by looking at amounts and distribution of heavy element s in other stars across our Milky Way galaxy.

Siegel said this research may yield clues about how our galaxy began.

"Trying to nail down where heavy elements come from may help us understand how the galaxy was chemically assembled and how the galaxy formed. This may actually help solve some big questions in cosmology as heavy elements are a nice tracer."

This year marks the 150th anniversary of Dmitri Mendeleev's creation of the periodic table of the chemical elements. Since then, scientists have added many more elements to the periodic table, a staple of science textbooks and classrooms worldwide.

Referring to the Russian chemist, Siegel said, "We know many more elements that he didn't. What's fascinating and surprising is that, after 150 years of studying the fundamental building blocks of nature, we still don't quite understand how the universe creates a big fraction of the elements in the periodic table."

Credit: 
University of Guelph

Making the 'human-body internet' more effective

image: Experiment setup to understand how characteristics of human body communication can be improved.

Image: 
Dairoku Muramatsu & Yoshifumi Nishida Source: Equivalent Circuit Model Viewed from Receiver Side in Human Body Communication

Wireless technologies such as Wi-Fi and Bluetooth have made remote connectivity easier, and as electronics become smaller and faster, the adoption of "wearables" has increased. From smart watches to implantables, these devices interact with the human body in ways that are very different from those of a computer. However, they both use the same protocols to transfer information, making them vulnerable to the same security risks. What if we could use the human body itself to transfer and collect information? This area of research is known as human body communication (HBC). Now, scientists from Japan report HBC characteristics specific to impedance and electrodes, which, according to them, "have the potential to improve the design and working of devices based on HBC."

First, let's understand exactly how HBC works and why it represents a more "secure" network. HBC is safer because it uses a lower-frequency signal that is sharply attenuated depending on the distance. The closed nature of the transmission results in lower interference and higher reliability, and, therefore, more secure connections. Having the device interact directly with the body also means that it has reliable biomedical applications.

HBC technologies use electrodes instead of antennas to couple signals to the human body. This can be used to conduct an electric field from a transmitter to a receiver, and thus to communicate data. HBC receivers work very similarly to radio frequency receivers; however, it is much more difficult to determine their input impedance. This matters because knowing the input impedance allows scientists to maximize the received signal power.

The most important factors are the arrangement of electrodes and the distance between the transmitter and the receiver. These affect the output impedance and the equivalent source voltage of the system, ultimately having an impact on the received signal power. The signal emanates from the transmitter electrode and goes through the body. The body's conductivity couples the field to the environment and this serves as the return path for the transmitted signal.
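For intuition only, here is a minimal sketch (not the authors' published equivalent circuit model) that treats the body-coupled link as a Thevenin source: an equivalent source voltage and output impedance, set by the electrode arrangement and transmitter-receiver distance, driving the receiver's input impedance. All component values are invented for the example.

```python
# Minimal Thevenin-style sketch of how output and input impedance set received power
# in an HBC link. This is NOT the published equivalent circuit model; all values are
# hypothetical and chosen only to show the trend.

def received_power(v_source_rms, z_out, z_in):
    """Power delivered to the receiver input.
    I = V_s / (Z_out + Z_in); P = |I|^2 * Re(Z_in), with V_s given as an RMS phasor."""
    current = v_source_rms / (z_out + z_in)
    return abs(current) ** 2 * z_in.real

v_s = 1.0                       # equivalent source voltage (V, RMS), hypothetical
z_in = complex(200, -50)        # receiver input impedance (ohms), hypothetical
# A larger transmitter electrode separation is modeled here as a higher output impedance,
# which reduces the power reaching the same receiver.
for z_out in (complex(500, -300), complex(1500, -900)):
    print(f"Z_out = {z_out} ohm -> P_rx = {received_power(v_s, z_out, z_in):.2e} W")
```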

In their study, the team of Japanese scientists -- Dr Dairoku Muramatsu (Tokyo University of Science), Mr. Yoshifumi Nishida, Prof Ken Sasaki, Mr. Kentaro Yamamoto (all from the University of Tokyo), and Prof Fukuro Koshiji (Tokyo Polytechnic University) -- set out to analyze these characteristics by constructing an equivalent circuit model of the signal transmission that goes from the body to an off-body receiver through touch.

The signal electrodes of both the transmitter and the receiver, as well as the ground electrode of the transmitter, were attached to the body. The ground electrode of the receiver was left "floating" in air. This was unlike other contemporary HBC configurations, in which both ground electrodes are left floating in air. The researchers found that impedance increases with increasing distance between the transmitter electrodes. Interestingly, they also found that the size of the receiver ground was another factor that affected transmission. They report that capacitive coupling between receiver ground and human body increases as the former gets larger.

The findings of this study are important as they enable scientists to design more efficient HBC devices, which are better tuned to the human electric field and, hopefully, better suited for user interaction.

Consider the way we interact with most technologies. Keyboards, screens, switches, and wires dominate the way we communicate, and despite smartphones and touchscreens, the basics of user interface, or "soft ergonomics," have hardly changed in the last few decades. We still sit behind desks for hours and stare at monitors. Our connectivity is very dependent on wireless signals, and thus, the open nature of these networks makes our data vulnerable to hacker attacks.

By using the human body itself as a network, HBC could potentially change this.

As Dr Muramatsu and Mr. Nishida put it, "Because the electric field used in HBC has the property of being sharply attenuated with respect to distance, it hardly leaks to the surrounding space during signal transmission. Thus, using this human body communication model makes it possible to communicate with excellent confidentiality and without generating electromagnetic noise. However, one major drawback of HBC is that it cannot be used for high-speed data communication. Thus, the focus should be on applications of HBC that transmit relatively low-capacity data, such as authentication information and biomedical signals, for long periods with low power consumption."

Credit: 
Tokyo University of Science

The start of a new era in stem cell therapy

image: The research of Assoc. Prof. Tamer Önder of Koç University and his team has shortened the time required for producing stem cells from skin cells and increased the success rate of the transformation ten-fold.

Image: 
Tamer Onder & Team

ISTANBUL, TURKEY - A recent study published in the April 8 issue of Nature Chemical Biology improves on the "Cellular Reprogramming" method developed by Nobel Laureate in Medicine and Physiology Prof. Shinya Yamanaka, making it possible to produce cells in a considerably shorter time and with greater success. Yamanaka's method, which is referred to as "Cellular Reprogramming", obtains pluripotent cells, similar to the ones we know exist in the very early stages of the embryo. Since such cells are obtained by transforming existing cells of the body (such as skin cells), they are referred to as induced pluripotent stem cells, or iPS in short.

While transformative and hugely important, Yamanaka's reprogramming method needed improvement in two regards. First, the transformation of cells took a long time, around 3-4 weeks. Second, the rate of successful reprogramming was rather low: around one in a hundred thousand.

Now, thanks to the joint work of Assoc. Prof. Tamer Önder of Koç University School of Medicine and doctoral students Ayyub Ebrahimi and Kenan Sevinç, along with Prof. Udo Oppermann of Oxford University and his team, this waiting period has been shortened and the success rate has increased.

Because the viruses used to transfer the Yamanaka factors into skin cells sometimes act unpredictably and insert themselves into arbitrary parts of the chromosomes, Assoc. Prof. Önder investigated the use of certain chemicals instead of viruses. After targeted trials, the team observed that two chemicals produced the desired results in turning skin cells into stem cells. This meant that two of the four Yamanaka factors were no longer necessary. Applying the method with two factors instead of four reduced the waiting period to approximately a week and, even more importantly, increased the success rate by up to ten-fold.

The next phase of the research will involve eliminating the other two Yamanaka factors as well. This would make the method much easier to apply in clinical settings: because viruses would no longer be needed, there would be no danger of interfering with the wrong gene or inadvertently suppressing the effects of a particular gene.

Credit: 
Koc University

BTI researchers discover interactions between plant and insect-infecting viruses

image: This is a green peach aphid (Myzus persicae) feeding on a husk tomato (Physalis floridana) plant.

Image: 
Mariko Alexander

Aphids and the plant viruses they transmit cause billions of dollars in crop damage around the world every year. Researchers in Michelle Heck's lab at the USDA Agricultural Research Service and Boyce Thompson Institute are examining the relationship at the molecular level, which could lead to new methods for controlling the pests.

Heck's group used recently developed small RNA sequencing techniques to better understand how plant viruses interact with aphids. In an unanticipated discovery, Heck and her team uncovered what may be the first example of a plant virus and an insect virus cooperating to increase the likelihood that both viruses will spread to other plant and aphid hosts.

The work was published in the May 22 issue of Phytobiomes journal.

The researchers focused on the green peach aphid (Myzus persicae), which transmits more than 100 different plant viruses and feeds on a wide variety of crops, including peaches, tomatoes, potatoes, cabbage, corn and numerous others.

Potato leafroll virus (PLRV) is of particular concern because it can reduce potato yield by more than 50%, causing 20 million tons of annual global yield losses.

"The most interesting finding of this research is that PLRV suppressed the aphid's immune system, and this suppression was mediated by a single virus protein, the P0 protein," says Jennifer Wilson, a co-first author on the paper. Wilson is a PhD candidate in the School of Integrative Plant Science (SIPS) at Cornell University and is conducting her thesis research with Dr. Heck.

P0 is a PLRV protein that is expressed inside plant tissue but not inside the aphids. While P0 had been previously shown to suppress plants' immune systems, the protein's impact on the insect's immune system was a surprise to the researchers.

"We don't know if the aphids ingest P0 from the plant or not, but we do know that when P0 is present in the plant, the aphids' immune systems are suppressed," explains Wilson.

One critical result of the insect's immune system being hampered is an increase in the proliferation of an insect virus, Myzus persicae densovirus (MpDNV). The researchers also found that aphids with more copies of MpDNV were more likely to have wings.

Because green peach aphids rarely have wings until the weather turns colder in the fall, this increase in winged insects could mean increased spread of PLRV and MpDNV to new hosts all summer long, a synergistic effect that wouldn't happen as much if the aphids were infected with only one of the viruses.

"We think we have found the first example of cooperation between a plant virus and an insect virus," Wilson says. "This cooperation may lead to increased transmission of both viruses."

Wilson and Heck are currently working to test this hypothesis by repeating the experiments in aphids not infected with MpDNV, which Wilson collected last summer from farms in upstate New York.

Future work could include figuring out how MpDNV and the P0 protein could be used to control virus transmission by aphids.

"Developing strategies to block virus transmission in the field is a major goal of our research program," said Heck, also an adjunct assistant professor in SIPS. "Wilson's thesis work is paradigm shifting. Stay tuned for more exciting stuff from her in the near future."

Credit: 
Boyce Thompson Institute

Can break junction techniques still offer quantitative information at the single-molecule level?

image: This is a schematic representation of a typical conductance-distance trace in the opening process of the two electrodes. The total conductance (G, red solid line) of a single-molecule junction is composed of the through-space tunneling (Gs, transparent black line) and through-molecule tunneling (Gm, transparent red line) contributions. After breaking down of the junction, a gold-molecule-solution-gold channel (Gc, transparent green line) appears. The three gray areas show regions in which the conductance cannot be measured.

Image: 
©Science China Press

Single-molecule break junction techniques offer unique insights into charge transport at the molecular level. The conductance through a single-molecule junction consists of the through-space tunneling and the through-molecule tunneling conductance. However, the existence of through-space tunneling, which is ubiquitous at the single-molecule level, makes the quantitative extraction of the intrinsic molecular signal challenging.

Despite their powerful capability to contact individual molecules, single-molecule break junction techniques are intrinsically unable to directly determine the actual signatures of the junction, such as binding geometries, the number of molecules in the junction, and the interaction between the junction and neighboring molecules.

The widely accepted method for extracting the conductance of a single molecule is to gather statistics over thousands of break junction processes; typically, a conductance histogram is plotted to find the most probable conductance. However, to what extent should we trust the obtained conductance, given that the result may be affected by many stochastic events in the break junction process, such as the junction formation probability and early rupture of the molecular junction?

What's more, as the focus of state-of-the-art single-molecule break junction measurements has gradually shifted from strong-interaction to weak-interaction systems (such as supramolecular junctions) and from static processes to more dynamic processes (such as diffusion and chemical reactions), can break junction techniques still offer quantitative information at the single-molecule level in these systems?

Very recently, Professor Wenjing Hong's group at Xiamen University, working together with Prof. Jielou Liao's group at the University of Science and Technology of China, explored in detail the quantitative characterization capability of break junction techniques in single-molecule systems through an analytical model (Figure 1). The model describes the conductance changes during the opening process of the two gold electrodes and validates the conductance and displacement analyses used in break junction experiments on OAE-type molecular junctions. Based on this model, they demonstrated that the break junction technique can detect the conductance of molecular systems with weak interactions, even under low junction formation probabilities and early rupture of the formed junction before it reaches a fully stretched configuration. Using the established simulation approach, they further showed that the break junction technique can offer a quantitative understanding of molecular assembly, diffusion and even reaction processes through complementary conductance and displacement analyses.
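To make the picture concrete, the following is a toy sketch (not the authors' analytical model) of a single opening trace built from the three channels named in the figure caption: through-space tunneling Gs, the through-molecule plateau Gm, and the gold-molecule-solution-gold channel Gc that appears after the junction breaks. All decay constants and conductance values are invented for illustration.

```python
# Toy conductance-distance trace inspired by the caption of Figure 1. This is an
# illustrative sketch, not the authors' analytical model; every parameter is made up.
import numpy as np

G0 = 1.0                      # conductance in units of the conductance quantum G0
BETA_SPACE = 2.0              # through-space tunneling decay constant (1/nm), assumed
BETA_SOLUTION = 0.8           # decay of the gold-molecule-solution-gold channel (1/nm), assumed
G_MOLECULE = 1e-4 * G0        # plateau conductance of the molecular junction, assumed
D_RUPTURE = 1.2               # displacement (nm) at which the molecular junction breaks, assumed

def total_conductance(d):
    """Sum of through-space (Gs), through-molecule (Gm), and post-rupture (Gc) channels."""
    g_space = G0 * np.exp(-BETA_SPACE * d)                 # Gs: direct tunneling between electrodes
    g_molecule = np.where(d < D_RUPTURE, G_MOLECULE, 0.0)  # Gm: plateau until the junction ruptures
    g_channel = np.where(d >= D_RUPTURE,                   # Gc: channel through molecule and solution
                         G_MOLECULE * np.exp(-BETA_SOLUTION * (d - D_RUPTURE)), 0.0)
    return g_space + g_molecule + g_channel

displacements = np.linspace(0.0, 2.0, 201)  # electrode opening distance in nm
trace = total_conductance(displacements)    # one synthetic opening trace
```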

Credit: 
Science China Press