Tech

Even bacteria need their space: Squished cells may shut down photosynthesis

image: The cyanobacteria cells near the inside of this colony fluoresce more than those at its edges.

Image: 
Kristin A. Moore and Jeffrey C. Cameron

Introverts take heart: When cells, like some people, get too squished, they can go into defense mode, even shutting down photosynthesis.

In a study published today, a team at the University of Colorado Boulder took advantage of a new microscopic technique to follow the lives of individual bacteria as they grew and divided in complex colonies.

The researchers discovered something unexpected in the process: Whenever these single-celled organisms, a type of cyanobacteria or blue-green algae, got too smushed, they began to switch off the machinery that was essential for them to turn sunlight into sugar.

The tiny organisms, in other words, slowed down their growth in a big way, said Jeffrey Cameron, an assistant professor in the Department of Biochemistry and coauthor of the new research.

"If a cell is between a rock and a hard place, internally everything says, 'Yes, I have nutrients. I want to grow,'" said Cameron, also of the Renewable and Sustainable Energy Institute (RASEI) at CU Boulder. "But there also has to be a feedback that says, 'I need to turn off photosynthesis so I don't expand and rupture.'"

The findings, which appear in Nature Microbiology, provide a new window on this process that sustains most life on Earth--and, in particular, how organisms regulate photosynthesis when the going gets tough. Cameron hopes that his team's results will also help scientists to develop custom-made microbes that could one day turn light into electricity or even construct living buildings.

Bacterial pancakes

As Cameron explains, it all began almost by accident.

He launched this research several years ago with a deceptively simple goal in mind: He and his colleagues wanted to track the behavior of individual cells within a bacterial colony throughout their entire lifecycle.

That was a difficult feat--for simple-looking organisms, cyanobacteria can form pretty complex structures.

"The cells on the outside of a colony are exposed to a lot of light, while those on the inside have low light exposure," Cameron said. "Over time as they grow and accumulate more cells, they shade themselves."

To be able to see every single cell in a growing colony, Cameron developed a method of culturing cyanobacteria so that they spread out like flat pancakes. When he started peering at these two-dimensional growths under a microscope, he noticed something odd: The more the colonies grew, and the more the bacteria inside began to squeeze together, the more they began to glow, or "fluoresce," under a certain type of light.

The microbes, he explained, were shedding excess energy out to their environment, almost like a person perspiring on a packed city bus.

"I dropped everything and spent the next four years figuring out what was happening," Cameron said.

The key, he and his colleagues discovered, was that the cells in a colony weren't all glowing the same. The cyanobacteria on the inside of a colony, for example, fluoresced a lot more than those on the fringes. They also grew a lot slower, dividing in two at about half the rate of their exterior cousins.

Put differently, when cells get squished, they shine.

"When the cells become confined, and they can't expand, they become highly fluorescent," Cameron said.

Tiny antennae

So why were those interior cells perspiring so much?

To answer that question, you have to get to know the phycobilisome. This teeny, protein-based structure is the antenna of the cell. Many of these phycobilisomes sit inside cyanobacteria, where they collect sunlight and funnel that energy to the reaction sites where it can be converted into glucose, or sugar.

Or that's what usually happens. Cameron and his colleagues found that when their cyanobacteria became too confined, they started to shed their phycobilisomes.

"If the cyanobacteria got more light than they could use for photosynthesis, these antennae would literally pop off," Cameron said.

The microbes couldn't use all that energy, so to keep from getting glutted, they turned off photosynthesis.

The group's results, he said, show just how dynamic single-celled organisms can be: They have a lot of tools for staying healthy in a changing social environment.

Understanding that social environment could also one day help scientists put photosynthesis to work--tapping sunlight to design more sustainable buildings or make other biology-based tools.

"We might be able to develop small-scale machines that are using light to perform computations or work," Cameron said.

It's a whole new way to think about catching some rays.

Credit: 
University of Colorado at Boulder

A key development in the drive for energy-efficient electronics

image: A sample of the advanced material being prepared for muon spin spectroscopy.

Image: 
University of Leeds

Scientists have made a breakthrough in the development of a new generation of electronics that will require less power and generate less heat.

It involves exploiting the complex quantum properties of electrons - in this case, their spin state.

In a world first, the researchers - led by a team of physicists from the University of Leeds - have announced in the journal Science Advances that they have created a 'spin capacitor' that is able to generate and hold the spin state of electrons for a number of hours.

Previous attempts have only ever held the spin state for a fraction of a second.

In electronics, a capacitor holds energy in the form of electric charge. A spin capacitor is a variation on that idea: instead of holding just charge, it also stores the spin state of a group of electrons - in effect it 'freezes' the spin position of each of the electrons.

That ability to capture the spin state opens up the possibility of new devices that store information so efficiently that storage hardware could become very small. A spin capacitor measuring just one square inch could store 100 terabytes of data.
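
As a rough sanity check of that figure - our own back-of-envelope arithmetic, not a calculation from the paper - 100 terabytes per square inch corresponds to less than a square nanometre per stored bit:

```python
# Back-of-envelope check (illustrative only): what does "100 terabytes
# per square inch" imply about the area available to each stored bit?

INCH_IN_NM = 2.54e7                      # 1 inch = 25.4 mm = 2.54e7 nm
capacity_bits = 100e12 * 8               # 100 TB expressed in bits
area_nm2 = INCH_IN_NM ** 2               # one square inch in nm^2

area_per_bit = area_nm2 / capacity_bits
print(f"area per bit: {area_per_bit:.2f} nm^2")        # ~0.81 nm^2
print(f"cell pitch:   {area_per_bit ** 0.5:.2f} nm")   # ~0.90 nm
```

That pitch is comparable to the roughly 1 nm diameter of a single buckyball, consistent with the idea of storing spin states at a molecular interface.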

Dr Oscar Cespedes, Associate Professor in the School of Physics and Astronomy who supervised the research, said: "This is a small but significant breakthrough in what could become a revolution in electronics driven by exploitation of the principles of quantum technology.

"At the moment, up to 70 per cent of the energy used in an electronic device such as a computer or mobile phone is lost as heat, and that is the energy that comes from electrons moving through the device's circuitry. It results in huge inefficiencies and limits the capabilities and sustainability of current technologies. The carbon footprint of the internet is already similar to that of air travel and increases year on year.

"With quantum effects that use light and eco-friendly elements, there could be no heat loss. It means the performance of current technologies can continue to develop in a more efficient and sustainable way that requires much less power."

Dr Matthew Rogers, one of the lead authors, also from Leeds, commented: "Our research shows that the devices of the future may not have to rely on magnetic hard disks. Instead, they will have spin capacitors that are operated by light, which would make them very fast, or by an electric field, which would make them extremely energy efficient.

"This is an exciting breakthrough. The application of quantum physics to electronics will result in new and novel devices."

The paper, "Reversible spin storage in metal oxide-fullerene heterojunctions," is published in Science Advances.

How a spin capacitor works

In conventional computing, information is coded and stored as a series of bits: zeroes and ones. On a hard disk, those zeroes and ones are stored as changes in the polarity of tiny magnetized regions on the disk.

With quantum technology, spin capacitors could write and read information coded into the spin state of electrons by using light or electric fields.

The research team were able to develop the spin capacitor by using an advanced materials interface made of a form of carbon called buckminsterfullerene (buckyballs), manganese oxide and a cobalt magnetic electrode. The interface between the nanocarbon and the oxide is able to trap the spin state of electrons.

The time it takes for the spin state to decay has been extended by using the interaction between the carbon atoms in the buckyballs and the metal oxide in the presence of a magnetic electrode.

Some of the world's most advanced experimental facilities were used as part of the investigation.

The researchers used the ALBA Synchrotron in Barcelona, which uses electron accelerators to produce synchrotron light that allows scientists to visualise the atomic structure of matter and to investigate its properties. Low-energy muon spin spectroscopy at the Paul Scherrer Institute in Switzerland was used to monitor local spin changes under light and electrical excitation within billionths of a metre inside the sample. A muon is a sub-atomic particle.

The results of the experimental analysis were interpreted with the assistance of computer scientists at the UK's Science and Technology Facilities Council, home to one of the UK's most powerful supercomputers.

The scientists believe the advances they have made can be built on, most notably towards devices that are able to hold spin state for longer periods of time.

Credit: 
University of Leeds

Is Niagara Falls a barrier against fish movement?

image: The brown bullhead (Ameiurus nebulosus) is one of seven native fish species collected by the Canada Department of Fisheries and Oceans from above and below Niagara Falls and used in this study.

Image: 
© Canada Department of Fisheries and Oceans

New research shows that fishes on either side of Niagara Falls--one of the most powerful waterfalls in the world--are unlikely to breed with one another. Knowing how well the falls serves as a barrier to fish movement is essential to conservation efforts to stop the spread of invasive aquatic species causing ecological destruction in the Great Lakes. The study is published today in the journal Molecular Ecology.

"In the past 50 years or so, aquatic invasive species have expanded in the Great Lakes as a tremendous conservation concern, causing billions of dollars' worth of damage," said Nathan Lujan, lead author of the study and a Gerstner Scholar at the American Museum of Natural History. "Both Canadian and American authorities are concerned about the potential impact of these species on the Great Lakes and are very interested in installing barrier technologies in the Niagara River that would slow or stop their spread."

For more than 11,000 years, since glaciers retreated from North America, most water flowing through the Great Lakes has crossed Niagara Falls, which has a flow rate of more than 750,000 gallons per second. There's one other way water can get through this constriction point: the Welland navigation canal, which was built about 200 years ago and features a series of locks that bring vessels from one side of the falls to the other. The canal is relatively small compared to the Niagara River, but questions remain about how significant it and the falls are in allowing fishes to move from upstream to downstream, and vice versa. The leading idea is to install a combination of barrier technologies in the Welland Canal, including electricity, sound, light, and possibly physical barriers, to inhibit fish movement.

"If you're going to spend potentially hundreds of millions of dollars on installing barrier technologies and fishes can go right over the falls, then that's obviously not a good use of resources," Lujan said. "If people can survive it in a barrel, you'd think a fish could."

To investigate these questions, Lujan and colleagues examined the DNA of seven native fish species to determine whether populations above and below Niagara Falls interbreed or are reproductively isolated. By gathering data from throughout the fishes' genomes, they found that populations of all species are genetically distinct on opposite sides of the falls.

Then they modeled how DNA from different populations mixes, and determined that in four species there has been no significant migration past Niagara Falls since the falls first formed about 11,000 years ago. Two other species showed some indication of migration past the falls, yet the models indicated that no species had bypassed the falls via the Welland Canal.

"These results should reassure policymakers that infrastructure being considered to prevent the movement of invasive aquatic species will not impact native species, and that the falls themselves are an effective barrier to both upstream and downstream movement of aquatic species," Lujan said. "Additional measures to prevent fish movement can safely be restricted to the Welland Canal."

Credit: 
American Museum of Natural History

Peak District grasslands hold key to global plant diversity

Scientists from the University of Sheffield use Peak District grassland to answer a globally important question

The team discovered how a large variety of plants are able to grow in one ecosystem when it is low in key nutrients

The findings show that plants are able to share nutrients, which could explain high levels of biodiversity in many ecosystems around the world

Using grassland soil from the Peak District, scientists at the University of Sheffield have found that plants are able to co-exist because they share key nutrients.

In a study published in Nature Plants, the team investigated how some ecosystems can have high biodiversity when all of these plants are competing for the same nutrients. They looked especially at ecosystems which are high in biodiversity but low in phosphorus, an essential nutrient for plant growth.

To do this they used soil taken from a Peak District limestone grassland that is low in phosphorus. They then injected different chemical forms of phosphorus into the soil, which allowed them to track which plants took up which form.

Their findings show that plants are able to share out the phosphorus by each preferring to take it up in a different form. This sharing is known as resource partitioning.

Professor Gareth Phoenix, from the University of Sheffield's Department of Animal and Plant Sciences, who led the study, said: "The plants had different preferences for the various phosphorus compounds. Some showed greater uptake from the inorganic form, phosphate; some preferred mineral-bound phosphorus compounds such as calcium phosphate; and others were better at using the organic compound DNA. Critically, this means the plants can co-exist because they are using different chemical forms of phosphorus in the soil. In other words, they are sharing the phosphorus.

"Our research answers the global question of how we get very high levels of plant species biodiversity, especially in ecosystems with very low amounts of soil phosphorus. By helping to understand how we get high levels of biodiversity, we can also better protect ecosystems and conserve their biodiversity."

The research used different radioactive phosphorus compounds to directly trace phosphorus from the soil into the plants, accurately identifying which soil compound the phosphorus in each plant came from. Only very tiny amounts of radioactive phosphorus are needed to detect uptake, meaning the team could observe the natural behaviour of the plants in natural soil, with the small amounts of phosphorus found in nature.

Credit: 
University of Sheffield

UM scientists play a direct role in identification of forests for protection in Borneo

image: A Borneo rainforest identified in the study as critical for protection.

Image: 
Jedediah Brodie

MISSOULA - An international team of researchers, including two from the University of Montana, is working to help identify priority forest areas for protection on Borneo.

The government of the state of Sabah in Malaysian Borneo has set an ambitious target of securing 30% of Sabah's land area under protection by 2025. The study identified important forest areas across the state that are critical for protection. These areas are rich in threatened rainforest animals and plants. The forests also store large amounts of carbon, helping to slow global warming.

In total, the priority forests identified are about the size of Glacier National Park, almost twice the size of the Yorkshire Dales National Park in the United Kingdom.

Lead author Dr. Sara Williams from UM said, "What makes our study different is that the research is being used to inform conservation planning decisions about to be taken by government agencies.

"The priority forest areas account for 5% of the land area of the state of Sabah. The approach that we came up with, selecting areas for protection strategically rather than randomly, protects around 12% of most of the statewide conservation features that we were interested in, which includes species ranges, forest carbon and landscape connectivity."

Creating land corridors that link protected areas together will mean animals can move from the large parks across the Indonesian border into the flagship conservation areas in the center of Sabah.

"Allowing species to move from the hottest lowland areas to the cooler mountains will be especially important for conserving rainforest species in the long term, as they will need to keep up with a warming climate," said Dr. Sarah Scriven from the University of York.

UM's Dr. Jedediah Brodie said, "Protected areas are generally situated based on scenic beauty or political expediency. But putting new protected areas in the best places for biodiversity, as we've done here, more than doubles their conservation benefits."

Credit: 
The University of Montana

April's SLAS Technology is now available

image: Cover image of SLAS Technology

Image: 
David James Group

Oak Brook, IL - Just released is the April edition of SLAS Technology, featuring the cover article, "CURATE.AI: Optimizing Personalized Medicine with Artificial Intelligence," by Agata Blasiak, Ph.D., Jeffrey Khong, Ph.D., and Theodore Kee, Ph.D. (National University of Singapore and The N.1 Institute for Health). In this review, the authors explain an alternative to the current limited and suboptimal big-data decision-making tools used to help medical teams determine a patient's drug and dose recommendation.

In their review, Blasiak, Khong and Kee introduce CURATE.AI, an AI-derived mechanism-independent technology platform built to address the challenges in personalized dosing. CURATE.AI profiles are dynamically generated for an individual patient based on only that patient's data, drug doses and phenotypic outputs. The profile is then used to recommend drug doses towards a desired response. Drug doses may be reduced, yet drug efficacy can increase with an accompanying drop in the toxicity levels.
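
To make the profile idea concrete, here is a minimal, hypothetical sketch - ours, not the CURATE.AI implementation - in which a small patient-specific dose-response curve (a simple quadratic fit, an assumption for illustration) is built from that patient's own data and then queried for a dose recommendation; all numbers are invented:

```python
# Hypothetical sketch of an individual dose-response profile -- not the
# CURATE.AI implementation. One patient's past doses and measured
# phenotypic outputs define a small personal model, which is then
# queried for the dose closest to a desired response.

import numpy as np

doses = np.array([10.0, 20.0, 40.0, 60.0])       # invented past doses
responses = np.array([0.21, 0.38, 0.55, 0.50])   # invented outputs

profile = np.polyfit(doses, responses, deg=2)    # patient-specific fit

candidates = np.linspace(5, 80, 300)             # doses to consider
predicted = np.polyval(profile, candidates)
target = 0.55                                    # desired phenotypic output
best = candidates[np.argmin(np.abs(predicted - target))]
print(f"recommended dose: {best:.1f}")
```

As the patient's state drifts, refitting the profile to the latest dose-response pairs would shift the recommendation - the kind of dynamic re-calibration the review describes.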

Although different approaches aim to individualize drug selection, less focus has been given to personalizing the dose for the identified drug or treatment. When drugs are given at suboptimal doses, effectiveness can be impaired or absent. This also happens to be a major cause of clinical trial failure and poor response rates from patients. Additionally, the same patient's medical state will be different from one day to the next, which means the originally selected dose will require readjustment over time. "No two people are the same - an unavoidable reality that has complicated medical care throughout time. The treatment that works for one patient may fail for another," says Blasiak. "With advances in engineering, the medical team is more and more equipped to tailor the treatment to an individual."

Credit: 
SLAS (Society for Laboratory Automation and Screening)

Stroke: When the system fails for the second time

image: The second lesion, in which large parts of the left hemisphere are not working anymore (A, dark grey), increased the contribution of the right brain. The individual impairment predicted the activation on the right side (A, yellow). The stronger the fibre connection between the sister areas (B, red) on the right side, the less the patient was affected by the interruption on the left.

Image: 
MPI CBS

It is now widely known that the brain is much more malleable than once thought. Even after a stroke or brain injury, the brain often succeeds in finding a new balance between the failed regions and the functions they serve. Commonly, neighbouring regions are activated, as well as homologues on the other side of the brain. During language processing, the homologues of the left-dominant language areas are usually less active and are kept in check by the dominant half - until an emergency occurs.

Until now, it was unclear whether these mechanisms also apply in the event of a second attack. Does the brain retain its capacity to adapt? This is important, as up to 15 percent of those affected will have a second stroke. In addition, there was disagreement about whether an activated right brain is generally good for healing. Some studies suggest that involvement of the right hemisphere helps recovery, at least in the short term. Others had shown, however, that a loss of language areas in the left half can literally inhibit the right half. In that case the contribution of the right hemisphere has nothing to do with language and can cause confusion; the brain gets out of step. Further, studies had also found that patients are better off if the overactive half is restrained by inhibitory magnetic stimulation: activity gradually shifts back to the left hemisphere, which regains the upper hand.

Scientists at the Max Planck Institute for Human Cognitive and Brain Sciences (MPI CBS) in Leipzig, Germany, have now found that the brain areas on the right side also become more active when there is a second injury in the left language areas. "In the recovered brain, the right side's contribution was still small after the first impairment. After the second event, in which large parts of the left hemisphere are not working anymore, its role becomes much more important," explains Gesa Hartwigsen, research group leader at the MPI CBS and first author of the article, which has been published in the open access journal eLife. "The second lesion increased the contribution of the right brain," Hartwigsen said.

The scientists examined these relationships in 12 patients in whom the regions for processing properties of sound in the left hemisphere were injured. The incident had happened at least six months prior, so their brains had had the opportunity to regenerate and adapt to the new situation. The researchers simulated the second disruption using so-called transcranial magnetic stimulation, which can briefly halt certain areas of the brain through electrical stimuli. It can be used to simulate how the brain would react if certain areas actually failed due to a stroke or other events - and how this affects the ability to recognize sounds, for example. To do this, Hartwigsen and her team used a simple decision task: the participants heard the word "cat" and had to decide whether it consisted of one or two syllables. The individual impairment predicted the activation on the right side. The researchers also found that the stronger the fibre connection between the sister areas on the right side, the less the patient was affected by the interruption on the left.

"These results show that after large-scale disturbances, in which large parts of the left hemisphere no longer function as they should, the right hemisphere probably plays a beneficial role. Often, there is a lot of tissue in the left half of the brain that only works to a limited extent and needs support from the right side. "Other studies show that recovery is helped when the activated right side later down regulates itself and thus contributes to normalization on the left side", said Hartwigsen. On the other hand, if the right half remains permanently up-regulated, healing is delayed.

Findings on how the damaged brain adapts to repeated injury could help to improve the therapy of stroke patients in the long term. "This may make it possible, at some point, to assess whether it would be more helpful to regulate specific areas up or down," says Hartwigsen, confidently.

Credit: 
Max Planck Institute for Human Cognitive and Brain Sciences

Skoltech scientists develop a new cathode material for metal-ion batteries

image: The new cathode material, based on titanium fluoride phosphate, delivers superior energy performance and stable operation at high discharge currents.

Image: 
Skoltech

Researchers from the Skoltech Center for Energy Science and Technology (CEST) created a new cathode material based on titanium fluoride phosphate, which enabled achieving superior energy performance and stable operation at high discharge currents.

Nowadays, the rapid development of electric transport and renewable energy sources calls for commercially accessible, safe and inexpensive energy storage solutions based on metal-ion batteries. The high price of the existing lithium-ion technology is its major hurdle, further exacerbated by speculation that the world may soon run out of the lithium and cobalt essential to the production of the cathode - the battery's key component, which determines its functional characteristics and energy performance.

The search for an alternative technology involves tremendous effort toward creating batteries using more accessible and less expensive elements, such as potassium, instead of lithium. As for cobalt, it can be replaced by the more common and environmentally friendly iron, manganese and even titanium.

The 10th most common element in the Earth's crust, titanium is mined all over the world, and the main titanium-containing reagents are easily available, stable and non-toxic. But despite these obvious advantages, titanium's low electrochemical potential, which limits a battery's attainable specific energy, has long been a major stumbling block to using titanium compounds in cathode materials.

Skoltech scientists succeeded in creating a commercially attractive advanced cathode material based on titanium fluoride phosphate, KTiPO4F, exhibiting a high electrochemical potential and unprecedented stability at high charge/discharge rates.

Professor Stanislav Fedotov: "This is an exceptional result that literally destroys the dominant paradigm, long present in the 'battery community', stating that titanium-based materials can perform as anodes only, due to titanium's low potential. We believe that the discovery of the 'high-voltage' KTiPO4F can give fresh impetus to the search for and development of new titanium-containing cathode materials with unique electrochemical properties."

Professor Artem Abakumov, Director of CEST: "From the perspective of inorganic chemistry and solid state chemistry, this is an excellent example showing once again that rather than blindly following the generally accepted dogmas, we should look at things with eyes wide open. If you choose the right chemical composition, crystal structure and synthesis method, the impossible becomes possible and you can find new materials with unexpected properties and new opportunities for practical applications. This has been brilliantly demonstrated by Professor Fedotov and his team."

Credit: 
Skolkovo Institute of Science and Technology (Skoltech)

New 3D view of methane tracks sources

image: NASA's new 3-dimensional portrait of methane shows the world's second largest contributor to greenhouse warming as it travels through the atmosphere. Combining multiple data sets from emissions inventories and simulations of wetlands into a high-resolution computer model, researchers now have an additional tool for understanding this complex gas and its role in Earth's carbon cycle, atmospheric composition, and climate system. The new data visualization builds a fuller picture of the diversity of methane sources on the ground as well as the behavior of the gas as it moves through the atmosphere.

Image: 
NASA/Scientific Visualization Studio

NASA's new 3-dimensional portrait of methane concentrations shows the world's second largest contributor to greenhouse warming, the diversity of sources on the ground, and the behavior of the gas as it moves through the atmosphere. Combining multiple data sets from emissions inventories, including fossil fuel, agricultural, biomass burning and biofuels, and simulations of wetland sources into a high-resolution computer model, researchers now have an additional tool for understanding this complex gas and its role in Earth's carbon cycle, atmospheric composition, and climate system.

Since the Industrial Revolution, methane concentrations in the atmosphere have more than doubled. After carbon dioxide, methane is the second most influential greenhouse gas, responsible for 20 to 30% of Earth's rising temperatures to date.

"There's an urgency in understanding where the sources are coming from so that we can be better prepared to mitigate methane emissions where there are opportunities to do so," said research scientist Ben Poulter at NASA's Goddard Space Flight Center in Greenbelt, Maryland.

A single molecule of methane is more efficient at trapping heat than a molecule of carbon dioxide, but because methane's lifetime in the atmosphere is shorter and carbon dioxide concentrations are much higher, carbon dioxide remains the main contributor to climate change. Methane also has many more sources than carbon dioxide; these include the energy and agricultural sectors, as well as natural sources from various types of wetlands and water bodies.

"Methane is a gas that's produced under anaerobic conditions, so that means when there's no oxygen available, you'll likely find methane being produced," said Poulter. In addition to fossil fuel activities, primarily from the coal, oil and gas sectors, sources of methane also include the ocean, flooded soils in vegetated wetlands along rivers and lakes, agriculture, such as rice cultivation, and the stomachs of ruminant livestock, including cattle.

"It is estimated that up to 60% of the current methane flux from land to the atmosphere is the result of human activities," said Abhishek Chatterjee, a carbon cycle scientist with Universities Space Research Association based at Goddard. "Similar to carbon dioxide, human activity over long time periods is increasing atmospheric methane concentrations faster than the removal from natural 'sinks' can offset it. As human populations continue to grow, changes in energy use, agriculture and rice cultivation, livestock raising will influence methane emissions. However, it's difficult to predict future trends due to both lack of measurements and incomplete understanding of the carbon-climate feedbacks."

Researchers are using computer models to try to build a more complete picture of methane, said research meteorologist Lesley Ott with the Global Modeling and Assimilation Office at Goddard. "We have pieces that tell us about the emissions, we have pieces that tell us something about the atmospheric concentrations, and the models are basically the missing piece tying all that together and helping us understand where the methane is coming from and where it's going."

To create a global picture of methane, Ott, Chatterjee, Poulter and their colleagues used methane data from emissions inventories reported by countries, NASA field campaigns, like the Arctic Boreal Vulnerability Experiment (ABoVE) and observations from the Japanese Space Agency's Greenhouse Gases Observing Satellite (GOSAT Ibuki) and the Tropospheric Monitoring Instrument aboard the European Space Agency's Sentinel-5P satellite. They combined the data sets with a computer model that estimates methane emissions based on known processes for certain land-cover types, such as wetlands. The model also simulates the atmospheric chemistry that breaks down methane and removes it from the air. Then they used a weather model to see how methane traveled and behaved over time while in the atmosphere.
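
The source-minus-sink bookkeeping such models perform can be illustrated with a zero-dimensional toy - a deliberately simplified sketch with assumed round numbers, not NASA's modeling system, which resolves geography, transport and detailed chemistry:

```python
# Toy "box model" of atmospheric methane: concentration rises until the
# chemical sink (destruction over an ~9-year lifetime) balances emissions.
# Illustrative numbers only; not the NASA modeling system described above.

lifetime = 9.0      # assumed atmospheric lifetime of methane, years
emissions = 220.0   # assumed net emissions, in ppb of concentration per year
c = 1800.0          # assumed starting concentration, ppb

dt = 0.1            # time step, years
for _ in range(int(100 / dt)):            # integrate 100 years forward
    c += (emissions - c / lifetime) * dt  # sources minus sink

print(f"steady state: {emissions * lifetime:.0f} ppb")  # 1980 ppb
print(f"after 100 yr: {c:.0f} ppb")                     # ~1980 ppb
```

Methane's short lifetime is why its concentration responds within decades to emission changes, in contrast to carbon dioxide.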

The data visualization of their results shows methane's ethereal movements and illuminates its complexities both in space, over various landscapes, and with the seasons. Once methane is lofted into the atmosphere, high-altitude winds can transport it far beyond its sources.

When they first saw the data visualized, several locations stood out.

In South America, the Amazon River basin and its adjacent wetlands flood seasonally, creating an oxygen-deprived environment that is a significant source of methane. Globally, about 60% of methane emissions come from the tropics, so it's important to understand the various human and natural sources, said Poulter.

Over Europe, the methane signal is not as strong as over the Amazon. European methane sources are influenced by the human population and the exploration and transport of oil, gas and coal from the energy sector.

In India, rice cultivation and livestock are the two driving sources of methane. "Agriculture is responsible for about 20% of global methane emissions and includes enteric fermentation, which is the processing of food in the guts of cattle, mainly, but also includes how we manage the waste products that come from livestock and other agricultural activities," said Poulter.

In China, economic expansion and a large population drive high demand for oil, gas and coal for industry, as well as for agricultural production; these are the country's underlying sources of methane.

The Arctic and high-latitude regions are responsible for about 20% of global methane emissions. "What happens in the Arctic doesn't always stay in the Arctic," Ott said. "There's a massive amount of carbon that's stored in the northern high latitudes. One of the things scientists are really concerned about is whether or not, as the soils warm, more of that carbon could be released to the atmosphere. Right now, what you're seeing in this visualization is not very strong pulses of methane, but we're watching that very closely because we know that's a place that is changing rapidly and that could change dramatically over time."

"One of the challenges with understanding the global methane budget has been to reconcile the atmospheric perspective on where we think methane is being produced versus the bottom-up perspective, or how we use country-level reporting or land surface models to estimate methane emissions," said Poulter. "The visualization that we have here can help us understand this top-down and bottom-up discrepancy and help us also reduce the uncertainties in our understanding of the global methane budget by giving us visual cues and a qualitative understanding of how methane moves around the atmosphere and where it's produced."

The model data of methane sources and transport will also help in the planning of both future field and satellite missions. Currently, NASA has a planned satellite called GeoCarb that will launch around 2023 to provide geostationary space-based observations of methane in the atmosphere over much of the western hemisphere.

Credit: 
NASA/Goddard Space Flight Center

Seeing is believing: Visualizing differences in RNA biology between single cells

Tsukuba, Japan - A long-standing quest in medicine has been to understand how the same disease presents differently among various patients. Although it has been thought that a central aspect of these differences is a heterogeneity between single cells within a large population of cells, it has remained a challenge to visualize this heterogeneity for a better understanding of single-cell biology. In a new study published in BMC Genomics, researchers from the University of Tsukuba reveal that they have developed a novel tool that enables the visualization of how single cells handle and process RNA.

The genetic information of an organism is stored within DNA, which contains the code for making the other molecules that make all cells and organs of the body functional. Interestingly, only 1% of DNA makes up genes, from which proteins are produced via RNA intermediaries. There is much debate on the role of the remaining DNA, but different types of RNA are thought to be produced from it that direct the fate of the cell. Even though each cell of the body contains the same DNA, how cells read and process DNA to make RNA can differ quite dramatically between single cells. This has been especially well known for the transcriptome, which includes all RNA produced from genes, but not so much for other RNA.

"Genes have been the main focus of biological research for a long time," says lead author of the study Haruka Ozaki. "We wanted to focus on what we call read coverage of single-cell RNA sequencing (scRNA-seq) data, which also includes RNA that are not products of genes. Although we can measure the amount of different RNA a single cell produces by scRNA-seq technologies, we wanted to come up with a new method that also visualizes specifically read coverage, because only then we can get a full picture of RNA biology and how it contributes to cell biology at the single-cell level."

To achieve their goal, the researchers developed a new computational tool that they called Millefy. Millefy uses existing scRNA-seq data to visualize read coverage of single cells as a heat map, illustrating differences between individual cells on a relative scale. The researchers first demonstrated the utility of Millefy in a well-established mouse embryonic stem cell model by showing heterogeneity of read coverage between developing cells. They then applied Millefy to cancer cells from patients with triple-negative breast cancer, a particularly aggressive type of breast cancer. Not only did Millefy show heterogeneity between cancer cells in general, but it revealed heterogeneity in a specific aspect of RNA biology that had previously been unknown.
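
The core visual idea - cells as rows, genomic positions as columns, colour as read depth - can be sketched with simulated data as below. This is a generic illustration of a read-coverage heat map, not Millefy's actual interface:

```python
# Simulated read-coverage heat map: each row is one cell, each column a
# genomic bin, and colour encodes read depth. Generic illustration with
# invented data -- not Millefy's API or output.

import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
n_cells, n_bins = 60, 200
pos = np.arange(n_bins)

# Two simulated subpopulations covering the same locus differently,
# e.g. one group favouring an alternative transcript end.
shared = np.exp(-((pos - 60) / 25.0) ** 2)
alternative = np.exp(-((pos - 150) / 20.0) ** 2)
coverage = np.vstack(
    [rng.poisson(30 * shared) for _ in range(30)] +
    [rng.poisson(30 * (0.5 * shared + alternative)) for _ in range(30)]
)

plt.imshow(coverage, aspect="auto", cmap="viridis")
plt.xlabel("genomic position (binned)")
plt.ylabel("single cells")
plt.colorbar(label="reads")
plt.title("Read coverage per cell (simulated)")
plt.tight_layout()
plt.show()
```

In such a plot, cell-to-cell heterogeneity that is invisible in gene-level counts - here, the two blocks of rows - stands out immediately.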

"Our approach simplifies the investigation of cellular heterogeneity in RNA biology using scRNA-seq data," says Ozaki. "Our findings could help identify what makes single cells individual, which would help us understand why patients with the same disease are often treated with varying success. Additionally, to enable rapid progress in field, we made Millefy publicly available to the scientific community."

Credit: 
University of Tsukuba

How do you power billions of sensors? By converting waste heat into electricity

video: These thermoelectric semiconductor chips were constructed in a precisely and accurately refined process using a high-precision chip mounter.

Image: 
Osaka University

Osaka, Japan - Interconnected healthcare and many other future applications will require internet connectivity between billions of sensors. The devices that will enable these applications must be small, flexible, reliable, and environmentally sustainable. Researchers must develop new tools beyond batteries to power these devices, because continually replacing batteries is difficult and expensive.

In a study published in Advanced Materials Technologies, researchers from Osaka University have revealed how the thermoelectric effect, or converting temperature differences into electricity, can be optimally used to power small, flexible devices. Their study has shown why thermoelectric device performance to date has not yet reached its full potential.

Thermoelectric power generators have many advantages. For example, they are self-sustaining and self-powered, have no moving parts, and are stable and reliable. Solar power and vibrational power do not have all of these advantages. Aviation and many other industries use the thermoelectric effect. However, applications to thin, flexible displays are in their infancy.

Many researchers have optimized device performance solely from the standpoint of the thermoelectric materials themselves. "Our approach is to also study the electrical contact, or the switch that turns the device on and off," explains Tohru Sugahara, corresponding author of the study. "The efficiency of any device critically depends on the contact resistance."

In their study, the researchers used advanced engineering to make a bismuth telluride semiconductor on a 0.4-gram, 100-square-millimeter flexible, thin polymer film. The device weighs less than a paperclip and is smaller than an adult fingernail. The researchers obtained a maximum output power density of 185 milliwatts per square centimeter. "The output power meets standard specifications for portable and wearable sensors," says Sugahara.

However, approximately 40% of the possible output power from the device was lost because of contact resistance. In the words of Sugahara: "Clearly, researchers should focus on improving the thermal and electrical contact resistance to improve power output even further."
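
Reading those figures together - our arithmetic, which assumes the quoted 185 mW/cm² is the output net of the contact loss, a detail the article does not spell out:

```python
# Quick arithmetic on the reported figures -- illustrative only, and it
# assumes the 185 mW/cm^2 maximum is the output net of contact losses.

area_cm2 = 100.0 / 100.0        # 100 mm^2 device = 1 cm^2
power_density = 185.0           # reported maximum, mW/cm^2
measured_mw = power_density * area_cm2

lost_fraction = 0.40            # ~40% of possible output lost at contacts
possible_mw = measured_mw / (1.0 - lost_fraction)

print(f"measured output:     {measured_mw:.0f} mW")   # 185 mW
print(f"with ideal contacts: {possible_mw:.0f} mW")   # ~308 mW
```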

Japan's Society 5.0 initiative, aimed at helping everyone live and work together, proposes that the entirety of society will become digitalized. Such a future requires efficient ways to interconnect our devices. Technological insights such as those from co-lead author Ekubaru and Sugahara are necessary to make this dream a reality.

Credit: 
Osaka University

Ultrathin but fully packaged high-resolution camera

image: The ultrathin camera.

Image: 
Professor Ki-Hun Jeong, KAIST

The unique structures of biological vision systems in nature have inspired scientists to design ultracompact imaging systems. A research group led by Professor Ki-Hun Jeong has made an ultracompact camera that captures high-contrast, high-resolution images. Fully packaged with micro-optical elements such as inverted micro-lenses, multilayered pinhole arrays, and gap spacers on the image sensor, the camera boasts a total track length of 740 μm and a field of view of 73°.

Inspired by the eye structure of Xenos peckii, a parasitic insect found on paper wasps, the research team completely suppressed optical noise between micro-lenses while reducing camera thickness. The camera has successfully demonstrated high-contrast, clear array images acquired from tiny micro-lenses. To further enhance the quality of the captured image, the team combined the arrayed images into one image through super-resolution imaging.
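
The release does not detail the reconstruction algorithm, but the simplest classical way to fuse several shifted copies of a scene is "shift-and-add" super-resolution, sketched here on synthetic data as an illustration of the general technique rather than the authors' pipeline:

```python
# Toy shift-and-add super-resolution on synthetic data -- an illustration
# of the general technique, not the authors' reconstruction pipeline.

import numpy as np

def shift_and_add(views, shifts, factor):
    """Fuse low-res views with known (dy, dx) sub-pixel shifts onto a
    grid `factor` times finer, averaging wherever samples coincide."""
    h, w = views[0].shape
    hi_sum = np.zeros((h * factor, w * factor))
    hi_cnt = np.zeros_like(hi_sum)
    for img, (dy, dx) in zip(views, shifts):
        ys = np.arange(h) * factor + round(dy * factor)  # target rows
        xs = np.arange(w) * factor + round(dx * factor)  # target cols
        hi_sum[np.ix_(ys, xs)] += img
        hi_cnt[np.ix_(ys, xs)] += 1
    return hi_sum / np.maximum(hi_cnt, 1)

# Four 32x32 views of a 64x64 scene, each offset by half a low-res pixel:
rng = np.random.default_rng(1)
scene = rng.random((64, 64))
offsets = [(0, 0), (0, 1), (1, 0), (1, 1)]          # in fine-grid pixels
views = [scene[dy::2, dx::2] for dy, dx in offsets]
shifts = [(dy / 2, dx / 2) for dy, dx in offsets]   # in coarse pixels
print(np.allclose(shift_and_add(views, shifts, 2), scene))  # True
```

Here the four offset samplings interleave exactly, so the fine grid is recovered; with real micro-lens images the shifts must be estimated and the fusion regularized.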

An insect's compound eye has superior visual characteristics, such as a wide viewing angle, high motion sensitivity, and a large depth of field, while maintaining a small visual structure with a small focal length. Unlike conventional compound eyes, the eyes of Xenos peckii, an endoparasite of paper wasps, have hundreds of photoreceptors in a single lens. In particular, the eye structure of an adult Xenos peckii exhibits hundreds of photoreceptors on an individual eyelet and offers engineering inspiration for ultrathin cameras or imaging applications because it has higher visual acuity than other compound eyes.

For instance, cameras inspired by the Xenos peckii eye provide spatial resolution 50 times higher than those based on conventional arthropod compound eyes. In addition, the effective image resolution of the Xenos peckii eye can be further improved using the image overlaps between neighboring eyelets. This unique structure offers higher visual resolution than other insect eyes.

The team achieved high-contrast and super-resolution imaging through a novel arrayed design of micro-optical elements comprising multilayered aperture arrays and inverted micro-lens arrays directly stacked over an image sensor. This optical component was integrated with a complementary metal oxide semiconductor image sensor.

This is the first demonstration of super-resolution imaging in which a single integrated image with high contrast and high resolving power is reconstructed from high-contrast array images. The team expects that this ultrathin arrayed camera can be applied in mobile devices, advanced surveillance vehicles, and endoscopes.

Professor Jeong said, "This research has led to technological advances in imaging technology. We will continue to strive to make significant impacts on multidisciplinary research projects in the fields of microtechnology and nanotechnology, seeking inspiration from natural photonic structures."

Credit: 
The Korea Advanced Institute of Science and Technology (KAIST)

Researchers observe ultrafast processes of single molecules for the first time

image: The research group Femtosecond Dynamics at the Institute of Experimental Physics has once again achieved success in quantum physics. From left to right: the study authors Pascal Heim, Bernhard Thaler and Markus Koch with the former head of the institute Wolfgang Ernst and the colleagues Stefan Cesnik, Leonhard Treiber and Michael Stadlhofer.

Image: 
© Lunghammer - TU Graz

Markus Koch, head of the research group Femtosecond Dynamics at the Institute of Experimental Physics at TU Graz, and his team develop new methods for time-resolved femtosecond laser spectroscopy to investigate ultrafast processes in molecular systems. In 2018 the group demonstrated for the first time that photo-induced processes can be observed inside a helium nanodroplet, a nanometer-sized droplet of superfluid helium that serves as a quantum solvent. For their investigations, the researchers placed a single indium atom inside the droplet and analysed the reaction of the system with the pump-probe principle. The atom was excited with an ultrashort laser pulse, triggering the rearrangement of the helium environment within femtoseconds (10⁻¹⁵ seconds). A time-delayed second laser pulse probed this development and provided information on the behavior of the system.

Successful next step

Using the same technique, Koch and his colleagues Miriam Meyer, Bernhard Thaler and Pascal Heim, visualized the movement of single, isolated molecules inside a helium droplet for the first time. The researchers formed an indium dimer molecule inside a helium droplet by loading it successively with two indium atoms. They then triggered a vibration in the molecule by photoexcitation and observed the movement of the nuclei in real time with the same pump-probe technique.

The researchers consider two aspects of the experiment particularly important: First, it demonstrates that such experiments are able to observe ultrafast intramolecular processes - i.e. processes that occur within an excited molecule.

Helium has little influence on embedded molecules

Second, the group discovered that the influence of superfluid helium on molecular vibrations is significantly weaker than with conventional solvents, such as water or methanol. Intramolecular processes are usually influenced by interactions with the environment and in conventional solvents this interaction is so strong that intramolecular processes cannot be observed, as Bernhard Thaler explains: "The quantum fluid helium, which has a temperature of only 0.4 K (note: minus 272.75 degrees Celsius), is truly special, as the perturbation on the embedded molecule is very low. Additionally, fragile molecules, which often break apart in other techniques, are stabilized due to the cooling mechanism and can now be investigated."

Markus Koch wants to extend the method to complex molecules

"We see great potential in helium nanodroplets because they offer wonderful opportunities for creating molecular systems," said Koch, explaining why he and his team develop this method for femtosecond studies. In the next step, the Femtosecond Dynamics group aims for more complex systems. "The structure of indium molecules, which we used as a model system, is very simple but in the future we want to look at technologically relevant molecules, which are more complex. I consider this as promising approach to molecular engineering, where future materials are developed by manipulating the quantum behavior of their molecular constituents."

Credit: 
Graz University of Technology

Graphite nanoplatelets on medical devices kill bacteria and prevent infections

image: Graphite nanoplatelets integrated into plastic medical surfaces kill 99.99 per cent of the bacteria that try to attach, without damaging human cells, which are around 25 times larger than bacteria.

Image: 
Yen Strandqvist/Chalmers

Graphite nanoplatelets integrated into plastic medical surfaces can prevent infections, killing 99.99 per cent of bacteria which try to attach - a cheap and viable potential solution to a problem which affects millions, costs huge amounts of time and money, and accelerates antibiotic resistance. This is according to research from Chalmers University of Technology, Sweden, in the journal Small.

Every year, over four million people in Europe are affected by infections contracted during health-care procedures, according to the European Centre for Disease Prevention and Control (ECDC). Many of these are bacterial infections which develop around medical devices and implants within the body, such as catheters, hip and knee prostheses or dental implants. In the worst cases, implants need to be removed.

Bacterial infections like this can cause great suffering for patients, and cost healthcare services huge amounts of time and money. Additionally, large amounts of antibiotics are currently used to treat and prevent such infections, costing more money, and accelerating the development of antibiotic resistance.

"The purpose of our research is to develop antibacterial surfaces which can reduce the number of infections and subsequent need for antibiotics, and to which bacteria cannot develop resistance. We have now shown that tailored surfaces formed of a mixture of polyethylene and graphite nanoplatelets can kill 99.99 per cent of bacteria which try to attach to the surface," says Santosh Pandit, postdoctoral researcher in the research group of Professor Ivan Mijakovic at the Division of Systems Biology, Department of Biology and Biotechnology, Chalmers University of Technology.

Infections on implants are caused by bacteria that travel around in the body in fluids such as blood, in search of a surface to attach to. When they land on a suitable surface, they start to multiply and form a biofilm - a bacterial coating.

Previous studies from the Chalmers researchers showed how vertical flakes of graphene, placed on the surface of an implant, could form a protective coating, making it impossible for bacteria to attach - like spikes on buildings designed to prevent birds from nesting. The graphene flakes damage the cell membrane, killing the bacteria. But producing these graphene flakes is expensive, and currently not feasible for large-scale production.

"But now, we have achieved the same outstanding antibacterial effects, but using relatively inexpensive graphite nanoplatelets, mixed with a very versatile polymer. The polymer, or plastic, is not inherently compatible with the graphite nanoplatelets, but with standard plastic manufacturing techniques, we succeeded in tailoring the microstructure of the material, with rather high filler loadings, to achieve the desired effect. And now it has great potential for a number of biomedical applications," says Roland Kádár, Associate Professor at the Department of Industrial and Materials Science at Chalmers.

The nanoplatelets on the surface of the implants prevent bacterial infection but, crucially, without damaging healthy human cells. Human cells are around 25 times larger than bacteria, so while the graphite nanoplatelets slice apart and kill bacteria, they barely scratch a human cell.

"In addition to reducing patients' suffering and the need for antibiotics, implants like these could lead to less requirement for subsequent work, since they could remain in the body for much longer than those used today," says Santosh Pandit. "Our research could also contribute to reducing the enormous costs that such infections cause health care services worldwide."

In the study, the researchers experimented with different concentrations of graphite nanoplatelets in the plastic material. A composition of around 15-20 per cent graphite nanoplatelets had the greatest antibacterial effect, provided that the morphology was highly structured.

"As in the previous study, the decisive factor is orienting and distributing the graphite nanoplatelets correctly. They have to be very precisely ordered to achieve maximum effect," says Roland Kádár.

Credit: 
Chalmers University of Technology

Challenge and desire in Antarctic meteorology and climate

image: A map of the AWS network on Antarctica as of 2019.

Image: 
Sam Batzli, University of Wisconsin-Madison

The outcomes of the 13th and 14th Workshops on Antarctic Meteorology and Climate (WAMC), as well as the 3rd and 4th Year of Polar Prediction (YOPP) Meetings, were discussed in an article published in the peer-reviewed journal Advances in Atmospheric Sciences.

The main themes explored during the WAMC workshops were Antarctic meteorological observations, atmospheric numerical modeling, meteorological and climate research, and forecasting and operational services. Meteorologists from around the world presented their research on numerical weather prediction and its Antarctic applications, along with the implementation and status of observing systems (e.g. automatic weather stations, Doppler radars), and recent Antarctic Peninsula warming.

Throughout these meetings, participants raised challenges and aspirations for improving the observed data and the accuracy of ongoing research. Open and easy access to the datasets collected on and around the continent is one of the major aspirations being worked on, though there are issues with the available data centers. The YOPP is the flagship activity of the decade-long Polar Prediction Project, initiated by the WMO's World Weather Research Programme to improve predictive skill in both polar regions. Its activities include an increase in radiosonde and buoy launches to collect more data for use in forecast models, reanalysis models, and other research.

Credit: 
Institute of Atmospheric Physics, Chinese Academy of Sciences