Tech

As seen in movies, new meta-hologram can be used as a communication tool

image: Meta-hologram optical device that operates in forward and backward directions.

Image: 
Junsuk Rho (POSTECH)

Hologram techniques are already part of everyday life. Hologram stickers that prevent the counterfeiting of banknotes, augmented-reality navigation projected onto a car's windshield to guide the driver, and virtual-reality games that let a user play in a lifelike virtual world are just a few examples. Recently, a thinner, lighter meta-hologram that operates in both forward and backward directions has been developed. In the movie Black Panther, the people of the Wakanda Kingdom communicate with each other through holograms; that scene may soon become reality, with people in different locations exchanging information holographically.

Junsuk Rho, a professor in the departments of mechanical engineering and chemical engineering at POSTECH, and his student Inki Kim have developed a multifunctional, monolayer meta-holographic optical device that creates different holographic images depending on the direction of the light incident on it. Their work was featured as the cover story of the January 2020 issue of Nanoscale Horizons.

Televisions and beam projectors transmit only the intensity of light, but holographic techniques record both the intensity and the phase of light, allowing images to be played back in three-dimensional space. If metamaterials are used, nanostructures can be given whatever size and shape is desired, so light intensity and phase can be controlled at the same time. A meta-hologram has pixels as small as 300 to 400 nanometers, yet it can display very high-resolution holographic images with a larger field of view than existing hologram projectors such as spatial light modulators.
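
The field-of-view advantage follows from basic diffraction theory; this worked example is ours, not from the release. A hologram's maximum diffraction half-angle is set by its pixel pitch $p$ and the wavelength $\lambda$:

$$\theta_{\max} = \arcsin\!\left(\frac{\lambda}{2p}\right)$$

For green light ($\lambda = 532$ nm), a 350 nm pixel pitch gives $\theta_{\max} \approx 50$ degrees, while a typical spatial light modulator pitch of about 8 micrometers gives only about 1.9 degrees.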

However, conventional meta-holograms can display images only when the incident light travels in one direction; when light arrives from the other direction, they display nothing.

To solve this problem, the research team used two different types of metasurfaces. One metasurface was designed to carry phase information when incident light traveled in the forward direction, and the other to operate when light traveled in the backward direction. As a result, the researchers confirmed that the device could display different images in real time depending on the direction of the light.
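
As a minimal sketch of the concept (our illustration, not the authors' exact design), a direction-multiplexed metasurface can be modeled as a single aperture whose transmitted phase profile depends on the propagation direction, each far-field image being the Fourier transform of the corresponding complex field:

$$E^{\mathrm{fwd}} \propto \mathcal{F}\{A\,e^{i\phi_{\mathrm{fwd}}(x,y)}\}, \qquad E^{\mathrm{bwd}} \propto \mathcal{F}\{A\,e^{i\phi_{\mathrm{bwd}}(x,y)}\}$$

Here $\phi_{\mathrm{fwd}}$ and $\phi_{\mathrm{bwd}}$ are the independent phase maps seen by forward- and backward-propagating light, and $A$ is the aperture amplitude.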

In addition, the team applied dual magnetic resonances and antiferromagnetic resonances, phenomena that occur in silicon nanopillars, to the nanostructure design to overcome the low efficiency of conventional meta-holograms. The new meta-hologram demonstrated a diffraction efficiency above 60% (over 70% in simulation) and produced clear, high-quality images.

Furthermore, because the new meta-hologram is made of silicon, it can easily be produced with conventional semiconductor manufacturing processes.

The meta-hologram operating in both forward and backward directions is expected to become a new hologram platform that can transmit various information to multiple users in different locations, overcoming the limits of conventional devices, which could transmit only one image to one location.

Junsuk Rho, who is leading research on metamaterials, said, "Microscopic, ultrathin, ultralightweight flat optical devices based on metasurfaces are an impressive technology with great potential, as they can not only perform the functions of conventional optical devices but also demonstrate multiple functions depending on how the metasurface is designed. In particular, we developed a meta-hologram optical device that operates in forward and backward directions and can transmit different visual information to multiple users in different locations simultaneously. We anticipate that this new development can be employed in many applications, such as holograms for performances, entertainment, exhibitions, automobiles and more."

Credit: 
Pohang University of Science & Technology (POSTECH)

Studying the geometry of a common skin disease

image: The wheal patterns were mapped.

Image: 
Photo courtesy of Sungrim Seirin-Lee, Hiroshima University

Hives afflict 1 in 5 people, but the exact mechanisms behind the itchy red rashes are not well known.

The research team studied the patterns of hives in patients and reproduced them using a mathematical model called a reaction-diffusion model, a common prototype for understanding how patterns develop. The researchers' model is a single-equation type that had never before been used to generate complex patterns.
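
As a minimal sketch of the general model class (our illustration; the reaction term and all parameters below are generic stand-ins, not the equation from the study), the following Python snippet integrates a one-dimensional reaction-diffusion equation and lets domains of high and low activity emerge from random initial conditions:

```python
import numpy as np

def simulate(n=200, steps=2000, dt=0.01, dx=1.0, D=1.0, seed=0):
    """Explicit-Euler integration of du/dt = D * d2u/dx2 + f(u)."""
    rng = np.random.default_rng(seed)
    u = 0.3 + 0.2 * (rng.random(n) - 0.5)        # noise around the threshold
    for _ in range(steps):
        # Periodic-boundary Laplacian via circular shifts
        lap = (np.roll(u, 1) - 2 * u + np.roll(u, -1)) / dx**2
        f = u * (1 - u) * (u - 0.3)              # generic bistable reaction term
        u = u + dt * (D * lap + f)
    return u

print(simulate().round(2))                       # domains of ~0 and ~1 emerge
```

Regions that start above the threshold grow toward full activation, those below decay away, and diffusion sets the width of the fronts between them; richer variants of such models can produce wheal-like patterns of the kind studied here.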

In response to injury, allergens, or stress, hives can form when cells in the skin called mast cells release a compound called histamine. The red swollen marks (known as wheals) can range from a few millimeters to the size of a hand or even larger.

While research has shown that histamine itself helps mast cells release histamine, this study considers for the first time that certain mechanisms might also inhibit histamine release and that there may be more going on behind the disease than previously thought.

"Our model succeeded in creating complex pattern of urticaria (hives), which is a very surprising result from both mathematical and biological points of views," said lead author and Associate Professor Sungrim Seirin-Lee.

To create the equation, the researchers induced rashes in eight healthy volunteers, measured the time it took for the rashes to form, and determined the velocity of formation. The team then looked at 14 patients with urticaria and analyzed them using the same model as the healthy volunteers.

Rather than relying solely on biological studies to investigate hives, which often requires inducing hives in patients, the mathematical focus provides a new avenue for skin disease research. In the future, the mathematical model could possibly be used as a tool to find the molecules which play a role in the inhibition process, as well.

"Finding the mechanism of urticaria is difficult only by biological methods," said Seirin-Lee. "Thus, we tried a completely different approach, mathematics. The approach using mathematical model for urticaria is the first trial in the world."

Ultimately, the findings from the study will help put together a more detailed picture of how the common skin disease develops and how to effectively deliver treatments.

Credit: 
Hiroshima University

Iron nanorobots show their true mettle

image: By combining low-power magnetic fields, which agitate the nanowires, with laser heating and drug delivery, target cells can be killed efficiently.

Image: 
2019 KAUST

Drug-coated iron nanowires that can be guided to the site of a tumor using an external magnetic field before activating a three-step cancer-killing mechanism could provide an effective option for cancer therapy.

Co-developed by KAUST researchers, these nanowires release their drug cargo inside cancer cells, while also punching holes in the cell's membrane and delivering a blast of heat. While the combination therapy maximizes cancer cell death, its highly targeted nature should minimize side effects.

Iron was the obvious material to make the nanowires, says Jürgen Kosel, who leads the group at KAUST, which includes Jasmeen Merzaban and Boon Ooi, and who co-led the work with researchers from CIC biomaGUNE in San Sebastian, Spain.

The first consideration is safety. "Iron, in molecular form, is a native material in our bodies, essential for oxygen transport," Kosel explains. The nanowires comprise an iron core, coated with an iron oxide shell. "Iron-oxide-based nanomaterials have been approved by regulatory bodies for use in magnetic resonance imaging and as a dietary supplement in cases of nutrition deficiency," he says.

In addition to their biocompatibility, the magnetic properties of iron-based materials are a key benefit. "Using harmless magnetic fields, we can transport them; concentrate them in the desired area; rotate or make them vibrate, such as we did in this study; and even detect them through magnetic resonance imaging," says Aldo Martínez-Banderas, a member of Kosel's team. Applying low-power magnetic fields, the team agitated the nanowires in a way that opened the membrane of target cells, inducing cell death.

An additional advantage is that core-shell nanowires strongly absorb near-infrared light, heating up as they do so. Because light at this wavelength can penetrate far into the body, the nanowires could be heated using lasers directed at the tumor site. "The core-shell nanowires showed an extremely high photothermal conversion efficiency of more than 80 percent, which translated into a large intracellular heat dose," Martínez-Banderas says.

Finally, the anticancer drug doxorubicin was attached to the nanowires via pH-sensitive linkers. As the tumor environment is typically more acidic than healthy tissue, the linker selectively degraded in or near tumor cells, releasing the drug where it is needed. "The combination of treatment resulted in nearly complete cancer cell ablation and was more effective than individual treatments or the anticancer drug alone," Martínez-Banderas says.

"Taken together, the capabilities of iron-based nanomaterials make them very promising for the creation of biomedical nanorobots, which could revolutionize healthcare," Kosel adds. "While this might seem futuristic, the developments are well on their way."

Credit: 
King Abdullah University of Science & Technology (KAUST)

Research offers promise for treating schizophrenia

image: Gregory Strauss

Image: 
UGA

Research by a University of Georgia psychologist shows that targeting one particular symptom of schizophrenia has a positive effect on other symptoms, offering significant promise for treating an aspect of schizophrenia that currently has no pharmaceutical options.

A team led by Gregory Strauss published a study confirming that successfully treating the symptom avolition--reduced motivation--has a positive effect on other negative symptoms of schizophrenia. The results, published in Schizophrenia Bulletin, were based on a phase 2b trial of the compound roluperidone by Minerva Neurosciences.

"There's a lot of hope that Minerva's phase 3 trial will show a similar improvement in negative symptoms," said Strauss, assistant professor in the Franklin College of Arts and Sciences. "This could be the first drug that receives an indication for negative symptoms of schizophrenia from the Food and Drug Administration, which is perhaps the biggest need in the field of psychiatry. It would be a monumental benefit to the lives of people with schizophrenia."

Schizophrenia is the leading medical cause of functional disability worldwide, according to several population-based studies of health. People with functional disability struggle to hold a job, build social relationships and maintain independent activities of daily living. In the U.S., functional disability can also entail receiving government-supported disability funds.

"The government spends a tremendous amount of money every year on functional disability," Strauss said. "Negative symptoms are the strongest predictor of functional disability, but no medication has received FDA approval for treating them. Therefore, they are a critical treatment target."

Strauss has published more than 125 studies exploring the symptoms of schizophrenia. A 2018 paper published in JAMA Psychiatry demonstrated that negative symptoms are not a singular construct, as has long been assumed, but reflect five distinct domains: avolition; anhedonia (reduced pleasure); asociality (reduction in social activity); blunted affect (reduction in outwardly expressed emotion in the face and voice); and alogia (reduced speech). Each domain constitutes a separate treatment target.

In a 2019 study published in Schizophrenia Bulletin, Strauss sought to identify which domain is most critical to target in treatment trials. He brought in Binghamton University's Hiroki Sayama and Farnaz Zamani Esfahlani to conduct a network analysis, an advanced mathematical approach from the field of engineering and complex systems science. Historically, researchers have looked at how symptoms function in isolation, but network analysis has revealed that they can have dynamic causal interactions with each other. Even if a drug doesn't decrease the severity of a symptom, it may serve a valuable function in changing the interactions among symptoms, Strauss said.
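
As a rough sketch of the kind of computation involved (our illustration, with invented numbers; this is not the authors' pipeline, data or exact centrality metric), the snippet below builds a weighted network over the five negative-symptom domains from a hypothetical correlation matrix and ranks them by strength centrality, the sum of each node's edge weights:

```python
import numpy as np
import networkx as nx

domains = ["avolition", "anhedonia", "asociality", "blunted_affect", "alogia"]

# Hypothetical symptom-correlation matrix (symmetric; values invented).
corr = np.array([
    [1.0, 0.6, 0.5, 0.4, 0.4],
    [0.6, 1.0, 0.5, 0.3, 0.2],
    [0.5, 0.5, 1.0, 0.3, 0.2],
    [0.4, 0.3, 0.3, 1.0, 0.5],
    [0.4, 0.2, 0.2, 0.5, 1.0],
])

G = nx.from_numpy_array(corr - np.eye(len(domains)))   # zero diagonal: no self-loops
G = nx.relabel_nodes(G, dict(enumerate(domains)))

# Strength centrality: sum of edge weights attached to each node.
strength = dict(G.degree(weight="weight"))
for node, s in sorted(strength.items(), key=lambda kv: -kv[1]):
    print(f"{node:>14}: {s:.2f}")
```

With these made-up weights, avolition comes out most central, mirroring the paper's finding that the other domains are tightly coupled to it.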

The study results indicated that avolition is a highly central domain within the negative symptom construct, suggesting that the other negative symptoms are tightly coupled to this domain, and if it is treated successfully, the entire constellation of negative symptoms might improve.

Strauss' most recent study, also published in Schizophrenia Bulletin, applied network analysis to Minerva Neurosciences' clinical trial data. In the trial, the company observed that roluperidone produced a significant reduction in negative symptoms. The team's analysis of the data revealed that avolition was the most central domain for the active treatment group, suggesting that when the drug improved avolition, all other negative symptoms improved as a result.

"This study suggests that future drug development should target mechanisms of avolition in particular," Strauss said. "If that domain is successfully improved, it might be possible to improve all negative symptoms and subsequently reduce functional disability."

Strauss serves as a consultant with Minerva Neurosciences. He co-developed and validated the key clinical outcome measure used in their trial but was not involved with developing roluperidone.

Credit: 
University of Georgia

Rethinking land conservation to protect species that will need to move with climate change

image: A high alpine landscape in Glacier Peak Wilderness in Washington state. This is an example of an area that likely will be important for plant and animal species as the climate warms.

Image: 
University of Washington

All plants and animals need suitable conditions to survive. That means a certain amount of light, a tolerable temperature range, and access to sources of food, water and shelter.

Many of the existing efforts to protect plant and animal species across the United States rely on information about where these species currently live. For example, if a rare bird species such as the snowy plover is found in a specific location along the Washington coast, conservationists try to protect it from human development where it lives.

But as climate change disrupts the status quo, most animals and plants will need to move to cooler or otherwise more suitable environments to survive. How does this affect efforts to protect biodiversity?

A new study by the University of Washington and The Evergreen State College analyzes whether accounting for climate change in conservation planning can protect future biodiversity more effectively than current approaches, and what the costs of implementing these solutions might be. The authors found that many species of animals and plants will likely need to migrate under climate change, and that conservation efforts will also need to shift to be effective. The paper was published Jan. 27 in the journal Philosophical Transactions of the Royal Society B.

"We are going to need to protect different places if we want to protect biodiversity in the future," said lead author Joshua Lawler, a UW professor in the School of Environmental and Forest Sciences. "We need to think about where species will go as the climate changes, and then plan for that. The business-as-usual planning process isn't going to work."

The research team looked at 1,460 different species of plants, birds, mammals, reptiles and amphibians across the continental U.S., considering whether current and potential future protected habitats are suitable for each species. The team found that unless climate change impacts are considered explicitly, 14% of the species would not have a viable place to live under climate change. This is because current protections focus on where species are today, not where they will need to be in the future as temperatures warm.

"Our findings show that species are going to shift around, and we are going to have to put some of our conservation efforts in different places -- and that will come at a cost," Lawler said.

For the past two decades, researchers have been trying to figure out how conservation planning can account for species moving under climate change. This research team considered three proposed suggestions for how to accomplish this, analyzing the potential costs and effectiveness associated with implementing each one.

With climate change advancing, there's an urgent need to devise plans -- and implement them by protecting important landscapes, the researchers said.

"Climate change effects that were originally projected to be decades in the future are starting to become apparent in the present day. This is not an abstract concept anymore," said co-author John Withey, a professor at Evergreen. "We need to take action as soon as possible, thinking about where species may need to go under climate change, and providing corridors through which they can move."

The researchers first looked at costs and efforts associated with selecting specific plants and animals, then protecting land where they are now and where they will likely need to live in the future. Modeling this information for species such as the Townsend's chipmunk, western rattlesnake and yellow-billed magpie, they found it would cost about 60% more than solely protecting their current habitats.

Then they looked at more general approaches, considering costs to protect landscapes with rare or disappearing climatic conditions that are likely to provide refuge for rare species as the climate changes. Many of these sites are at higher elevations, such as alpine meadows. They also factored in "climate corridors" that would potentially allow species to move safely to new locations.

Protecting these sites won't cost much more, the authors found, likely because many of the landscapes identified as important under climate change are already located in protected national parks, wilderness areas, fish and wildlife refuges and private conservation areas held by land trusts.

"It was encouraging to see that there were some climate-based solutions that didn't increase the cost substantially," said co-author Julia Michalak, a UW research scientist in the School of Environmental and Forest Sciences.

The authors hope this analysis will be helpful for land trusts to determine which areas should be considered high priority for conservation. While their study highlights parts of the country that will need more conservation attention under climate change, they caution that the paper isn't intended to help pinpoint specific new parks to protect.

"This paper is pointing out that we might be missing opportunities or places where conservation is going to be needed in the face of climate change," Withey said. "Another hope is that we can start capturing places that would protect species and would allow species to move without increasing our costs too much."

Credit: 
University of Washington

Upper-plate earthquakes caused uplift along New Zealand's Northern Hikurangi Margin

image: Trench at Pakarae River mouth marine terrace, North Island, New Zealand

Image: 
GNS Science

Earthquakes along a complex series of faults in the upper plate of New Zealand's northern Hikurangi Subduction Margin were responsible for coastal uplift in the region, according to a new evaluation of local marine terraces.

The findings, reported in the Bulletin of the Seismological Society of America, could shape new evaluations of seismic hazard in New Zealand. They suggest that earthquakes rupturing multiple faults may contribute more than subduction earthquakes to damaging uplift in the area.

Using radiocarbon and other methods to date the marine terraces at two North Island sites, Puatai Beach and Pakarae River mouth, Nicola Litchfield of GNS Science and her colleagues conclude that the uplift events that created the terraces occurred at different times between the two sites. This suggests that the uplift was not the result of subduction earthquakes or single-fault upper plate earthquakes.

The pattern of uplift seen in the marine terraces led the researchers to map new offshore faults in the region, which they think may be one source of these upper-plate earthquakes, said Litchfield.

The Hikurangi Subduction Margin lies along the eastern edge of North Island, where the Pacific and Australian tectonic plates collide and the Pacific plate slips under the island. Recent New Zealand earthquakes involving multiple fault ruptures and coastal deformation, such as the magnitude 7.8 Kaikoura earthquake in 2016, have prompted seismologists to evaluate the mechanisms behind these complicated sequences, Litchfield said--especially along the remote areas of the northern Margin where there have been fewer studies overall.

Marine terraces are created when shorelines are raised above sea level by uplift at the coast, and they record the time and amount of uplift. Based on the geological evidence from other sites in New Zealand, "we are confident that each terrace represents an individual earthquake," Litchfield said.

Previous radiocarbon dating studies suggested that the youngest marine terraces at Puatai Beach and Pakarae River mouth were created at the same times. But Litchfield and colleagues decided to revisit these dates with a more comprehensive examination of the terraces. At each site, the researchers did extensive trenching "to see what the stratigraphy was, and to carefully sample, and then use more than radiocarbon dating techniques to get high resolution ages," said Litchfield.

The researchers were able to use a layer of volcanic ash, along with radiocarbon dating of beach shells, to determine ages for each terrace at each site. At Puatai Beach, the terraces correspond to three earthquakes that occurred between 1710 and 1770 years ago, 910 and 1100 years ago, and 250 and 420 years ago. At Pakarae River mouth, the terraces correspond to earthquakes that took place between 530 and 660 years ago and between 1290 and 1490 years ago.

The different terrace ages at each site combined with modeling of uplift from earthquakes on newly mapped offshore faults allowed the researchers to rule out a subduction earthquake or single-fault upper-plate earthquake as the cause of uplift.

Researchers will need to learn more about the extent and orientation of the newly mapped offshore faults, and model how they might rupture together, to fully evaluate how they impact overall seismic hazard, Litchfield said. "Simply having more faults offshore, some of them quite close, means that there is more earthquake and tsunami hazard," she noted. "We don't know yet how that might be balanced by the fact that there is less subduction earthquake hazard in the model, though."

Credit: 
Seismological Society of America

Hungry for hutia? Our taste for Bahamas' 'most peaceable rodent' shaped its diversity

image: Geocapromys ingrahami, the Bahamian hutia, flourished on the islands for millennia, but today only one population remains. Ancient DNA reveals a human side to the hutia's story: Indigenous people moved hutias to new islands, exploiting them for food and shaping their diversity and distribution.

Image: 
Kristen Grace/Florida Museum

GAINESVILLE, Fla. -- The Bahamian hutia, a large Caribbean rodent with a blissed-out disposition, presents a curious case study in how human food preferences can drive biodiversity, sometimes shaping it over 1,000 years.

The hutia, which resembles a bristly beanbag, flourished in the Bahamas for millennia, the islands' only native terrestrial mammal. Today, only one population of Geocapromys ingrahami remains, divided among the scrubland and limestone cliffs of three small cays, and the species is considered vulnerable by the International Union for Conservation of Nature.

Humans have played a prominent and paradoxical role in the hutia's boom-to-bust story, according to a new study led by Florida Museum of Natural History researchers.

Hutias were a savory source of red meat for the Lucayans, the islands' earliest inhabitants, who arrived around AD 800-1000. Now, ancient DNA and radiocarbon dating suggest the Lucayans transported hutias from the Great Bahama Bank - where hutias landed during the last ice age, likely after rafting from Cuba - to Bahamian islands hutias had not inhabited previously, exploiting them as food. The findings illuminate how humans historically and actively shaped hutia diversity and distribution.

"We're a funny, picky species with food," said Michelle LeFebvre, Florida Museum assistant curator of South Florida archaeology and ethnography. "Our preference for what we eat has probably had a much bigger impact on global biodiversity over time than we appreciate, and hutias provide one example of that."

Under Lucayan care, hutia populations thrived. But when Europeans landed in the Bahamas, they introduced new predators, such as cats, and competitors, such as rats and mice. Development subsequently degraded hutia habitat, and the species might have vanished had conservationists not moved hutias to protected areas.

"We have not stopped messing with hutias over the past 1,000 years. We can't help ourselves," said the study's lead author Jessica Oswald, a postdoctoral researcher at the Florida Museum and the University of Nevada-Reno. "For better or worse, Bahamian hutias are only alive today because of humans. We moved them to tiny islands where they're protected, and probably the only way they are going to survive is if they live on cays devoid of people."

Oswald and David Steadman, Florida Museum curator of ornithology, often found hutia fossils in the roosts of now-extinct giant owls on Great Bahama Bank, but hutias were notably absent from pre-human fossil sites on Bahamian islands beyond that bank. Meanwhile, LeFebvre was studying hutia bones excavated from ancient Lucayan trash heaps and found evidence that certain hutia populations were eating crops, such as corn.

The researchers began comparing notes and decided to investigate the patterns they were uncovering by conducting the first ancient DNA study of the Bahamian hutia.

"We knew we had a question to chase," LeFebvre said. "And then it turned out that there was a major human footprint."

Traces of human influence in hutia DNA

While G. ingrahami is the only hutia in the Bahamas, Cuba is the historical and modern center of hutia diversity and home to 10 of the 13 species living today. More than half of known hutia species have gone extinct over the past 12,000 years, but the group once boasted a range of habitats and sizes, from a 440-pound heavyweight in the Lesser Antilles to a pygmy-sized species found on Cuba.

Their temperaments also vary. Observing G. ingrahami, biologist Garrett Clough described it as a "most peaceable rodent" - but not all hutia species are mellow.

"On Cuba, there's one species people treat as a pet and other species that could scratch your eyes out," LeFebvre said.

The researchers' findings supported previous evidence that the Bahamian hutia's closest living relative is G. brownii, a vulnerable species on Jamaica. But they hypothesize the Bahamian hutia is a descendant of a Cuban species that reached the Bahamas about 10,000 years ago when low sea levels closed the distance between the islands to about 12 miles.

"Because so many hutias have recently gone extinct, we will need more ancient DNA from extinct species to test this hypothesis," Oswald said.

Radiocarbon-dated fossils showed that hutias lived on the Great Bahama Bank before humans arrived - but none of the fossils outside of the bank were older than about AD 1300-1400, several centuries after human settlement. The researchers also found a striking genetic similarity between a population on Eleuthera, an island on the Great Bahama Bank, and a population on Abaco, part of the Little Bahama Bank. The two banks never connected, even when sea levels were at their lowest, suggesting people had ferried hutias across the channel.

"We would expect those two populations to be genetically distinct because they're isolated, but instead, they look like they're from the same population," Oswald said. "You're unravelling a mystery with DNA from fossils and trying to figure out 'whodunnit.' That's what makes this so fun."

A genetic split between hutias in the northern Bahamas and those south of Long Island is a puzzle the researchers intend to explore further.

"Did some kind of human selection impact that? Could other hutias have been introduced or made it over from Cuba?" LeFebvre said. "Genetics sets you on the path to answer those kinds of questions."

Taking the long view of hutia conservation

The Bahamian hutia was believed to be extinct before Clough traveled to East Plana Cay, a small strip of uninhabited land, in 1966. The hutias he observed there were thought to be the last remaining examples of indigenous G. ingrahami. This study suggests, however, that hutias may have been introduced to the cay by the Lucayans, LeFebvre said.

While the Bahamian hutia is a survivor, its existence is also delicate, susceptible to hurricanes, disease, invasive species and landscape changes. Its conservation needs place its future in the hands of humans.

But the value in studies like this one is the millennia-long view it offers conservation decision-makers, LeFebvre said.

"Considering truly long-term anthropogenic influences on biodiversity is a crucial step to building and planning conservation efforts," she said.

For Oswald, a paleontologist and evolutionary biologist who is an expert in ancient DNA analysis, the project offered a unique opportunity to collaborate with archaeologists.

"The combination of these fields is powerful. Michelle had evidence that humans were transporting hutias, and then we used data from an evolutionary field to help answer these questions. It speaks to what you can do when you bring together different fields."

Credit: 
Florida Museum of Natural History

Nanoparticle chomps away plaques that cause heart attacks

image: The dotted line outlines the atherosclerotic artery and the green represents our nanoparticles, which are in the plaque. The red indicates macrophages, which is the cell type that the nanoparticles are stimulating to eat the debris.

Image: 
Bryan Smith, Michigan State University

Michigan State University and Stanford University scientists have invented a nanoparticle that eats away - from the inside out - portions of plaques that cause heart attacks.

Bryan Smith, associate professor of biomedical engineering at MSU, and a team of scientists created a "Trojan Horse" nanoparticle that can be directed to eat debris, reducing and stabilizing plaque. The discovery could be a potential treatment for atherosclerosis, a leading cause of death in the United States.

The results, published in the current issue of Nature Nanotechnology, showcase a nanoparticle that homes in on atherosclerotic plaque owing to its high selectivity for a particular immune cell type: monocytes and macrophages. Once inside the macrophages in those plaques, it delivers a drug agent that stimulates the cell to engulf and eat cellular debris. Basically, it removes the diseased and dead cells in the plaque core. By reinvigorating the macrophages, the treatment reduces and stabilizes plaque.

Smith said that in future clinical trials the nanoparticle is expected to reduce the risk of most types of heart attacks, with minimal side effects owing to the unprecedented selectivity of the nanodrug.

Smith's studies focus on intercepting the signaling of the receptors in the macrophages and sending a message via small molecules using nano-immunotherapeutic platforms. Previous studies have acted on the surface of the cells, but this new approach works intracellularly and has been effective in stimulating macrophages.

"We found we could stimulate the macrophages to selectively eat dead and dying cells - these inflammatory cells are precursor cells to atherosclerosis - that are part of the cause of heart attacks," Smith said. "We could deliver a small molecule inside the macrophages to tell them to begin eating again."

This approach also has applications beyond atherosclerosis, he added.

"We were able to marry a groundbreaking finding in atherosclerosis by our collaborators with the state-of-the-art selectivity and delivery capabilities of our advanced nanomaterial platform. We demonstrated the nanomaterials were able to selectively seek out and deliver a message to the very cells needed," Smith said. "It gives a particular energy to our future work, which will include clinical translation of these nanomaterials using large animal models and human tissue tests. We believe it is better than previous methods."

Smith has filed a provisional patent and will begin marketing it later this year.

Note for media: Please include a link to the original paper in online coverage: https://www.nature.com/articles/s41565-019-0619-3

Credit: 
Michigan State University

Airborne measurements point to low EPA methane estimates in south central US

image: In a series of flights during NASA's ACT-America campaign, Penn State researchers measured methane plumes in the atmosphere over portions of south central US. The researchers found measurements of methane emissions from the oil and natural gas industry are higher than EPA estimates.

Image: 
David Kubarek, Penn State

Approximately twice as much methane is seeping into the atmosphere as the Environmental Protection Agency estimates from oil and gas facilities in the south central U.S., according to a series of measurements taken by meteorologists using NASA aircraft.

In six flights through the region, researchers used onboard instruments from two planes to collect data roughly 1,000 feet above ground. They flew through massive methane plumes concentrated by regional weather patterns and used sample points and weather models to determine the actual methane concentrations of the plumes. These concentrated plumes were discovered during the Atmospheric Carbon and Transport-America (ACT-America) campaign, a much broader Penn State-led effort to understand greenhouse gas sources and sinks.

Researchers found methane from oil and gas facilities to be 1.1 to 2.5 times greater than EPA estimates for the region that includes Arkansas, Texas, Louisiana and Oklahoma. In another key finding, scientists showed how frontal systems in the atmosphere can be used to track methane from much larger areas at the surface because large plumes of methane concentrations come together along the frontal boundary.

"When we flew across cold fronts, one thing we noticed was that warm air was being pulled up and funneling the region's greenhouse gases into large plumes," said Zach Barkley, researcher in meteorology and atmospheric science, Penn State. "We fed data from these plumes into our weather models and, when we compared the data with the EPA inventory, we saw there was a discrepancy."

Methane comes from many sources -- including wetlands, animal agriculture and the oil and natural gas industry -- so the researchers used ethane measurements to determine the source. Ethane is found almost exclusively alongside methane produced by the natural gas industry, so the researchers used it to rule out methane from animal agriculture and other natural sources. The findings are reported in a recent issue of Geophysical Research Letters.

The EPA uses a bottom-up approach to estimate methane emissions from industry by applying a value to each well and transport component. Penn State researchers used a top-down approach, meaning the emissions were measured at their endpoint, the atmosphere.
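
The difference between the two accounting styles can be made concrete with a toy calculation (all numbers below are invented for illustration; they are not the study's data or the EPA's emission factors):

```python
# Hypothetical illustration of bottom-up vs. top-down methane accounting.

# Bottom-up: assign an emission factor to each source type and sum.
sources = {"gas_well": (12000, 1.2), "compressor": (800, 15.0)}  # (count, kg CH4/hr each)
bottom_up = sum(count * rate for count, rate in sources.values())

# Top-down: total flux inferred from airborne plume sampling plus a transport model.
top_down = 31000.0  # kg CH4/hr, hypothetical atmospheric estimate

print(f"bottom-up inventory: {bottom_up:.0f} kg/hr")
print(f"top-down estimate:   {top_down:.0f} kg/hr")
print(f"ratio (top-down / bottom-up): {top_down / bottom_up:.2f}")
```

A bottom-up inventory is only as reliable as its source counts and per-source emission factors, which is exactly the sampling problem Barkley describes next.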

"The one issue with the bottom-up approach is if you can't sample enough sources to get an accurate representation of the average," Barkley said. "When you multiply by all of the different devices and components across the U.S., you could potentially come up with a number that's not accurate."

The region is important to combating greenhouse gas emissions at large, Barkley said, because it accounts for nearly 40 percent of the man-made methane emissions in the U.S. The region is a hotspot for both natural gas extraction and animal agriculture. Methane is an important greenhouse gas with 34 times the warming potential of carbon dioxide over a 100-year period, according to the Intergovernmental Panel on Climate Change.

Barkley said there are also problems with the top-down approach to measuring methane. It is more expensive and does not identify which sources are emitting methane. He said the approach is more of a check on the accuracy of the existing approach.

But it does point to areas to target for greenhouse gas reduction.

"If oil and gas emissions are off by a factor of two, that means that oil and gas are very significantly the highest man-made source of methane emissions in the U.S. and would be a prime area to target for reducing methane emissions, particularly if we find relatively few sources contributing significantly to the bulk of the emissions," Barkley said. "If we can figure out how to target those sources and fix them, that could be a significant reduction of greenhouse gas emissions coming from the oil and gas industry."

Credit: 
Penn State

Method detects defects in 2D materials for future electronics, sensors

image: A laser beam (yellow) reflects off a 2D material (orange) highlighting a grain boundary defect in the atomic lattice.

Image: 
MRI/Penn State

To further shrink electronic devices and to lower energy consumption, the semiconductor industry is interested in using 2D materials, but manufacturers need a quick and accurate method for detecting defects in these materials to determine if the material is suitable for device manufacture. Now a team of researchers has developed a technique to quickly and sensitively characterize defects in 2D materials.

Two-dimensional materials are atomically thin, the most well-known being graphene, a single-atom-thick layer of carbon atoms.

"People have struggled to make these 2D materials without defects," said Mauricio Terrones, Verne M. Willaman Professor of Physics, Penn State. "That's the ultimate goal. We want to have a 2D material on a four-inch wafer with at least an acceptable number of defects, but you want to evaluate it in a quick way."

The researchers -- who represent Penn State, Northeastern University, Rice University and Universidade Federal de Minas Gerais in Brazil -- found a solution in laser light combined with second harmonic generation, a phenomenon in which light shone on the material is re-emitted at double the original frequency. They add dark field imaging, a technique in which extraneous light is filtered out so that defects shine through. According to the researchers, this is the first time dark field imaging has been used in this way, and it provides three times the brightness of the standard bright field imaging method, making it possible to see types of defects previously invisible.
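
For background, the scaling here is standard nonlinear optics rather than a result of the paper: the second harmonic intensity grows with the square of the incident intensity through the material's second-order susceptibility $\chi^{(2)}$,

$$I(2\omega) \propto |\chi^{(2)}|^{2}\, I(\omega)^{2}$$

Because $\chi^{(2)}$ depends on crystal orientation, grains oriented differently emit second harmonic light with different phases, which is why grain boundaries and edges stand out once the uniform background is filtered out in the dark field.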

"The localization and identification of defects with the commonly used bright field second harmonic generation is limited because of interference effects between different grains of 2D materials," said Leandro Mallard, a senior author on a recent paper in Nano Letters and a professor at Universidade Federal de Minas Gerais. "In this work we have shown that by the use of dark field SHG we remove the interference effects and reveal the grain boundaries and edges of semiconducting 2D materials. Such a novel technique has good spatial resolution and can image large area samples that could be used to monitor the quality of the material produced in industrial scales."

Vincent H. Crespi, Distinguished Professor of Physics, Materials Science and Engineering, and Chemistry, Penn State, added, "Crystals are made of atoms, and so the defects within crystals -- where atoms are misplaced -- are also of atomic size.

"Usually, powerful, expensive and slow experimental probes that do microscopy using beams of electrons are needed to discern such fine details in a material," said Crespi. "Here, we use a fast and accessible optical method that pulls out just the signal that originates from the defect itself to rapidly and reliably find out how 2D materials are stitched together out of grains oriented in different ways."

Another coauthor compared the technique to finding a particular zero on a page full of zeroes.

"In the dark field, all the zeroes are made invisible so that only the defective zero stands out," said Yuanxi Wang, assistant research professor at Penn State's Materials Research Institute.

The semiconductor industry wants to have the ability to check for defects on the production line, but 2D materials will likely be used in sensors before they are used in electronics, according to Terrones. Because 2D materials are flexible and can be incorporated into very small spaces, they are good candidates for multiple sensors in a smartwatch or smartphone and the myriad of other places where small, flexible sensors are required.

"The next step would be an improvement of the experimental setup to map zero dimension defects -- atomic vacancies for instance -- and also extend it to other 2D materials that host different electronic and structural properties," said lead author Bruno Carvalho, a former visiting scholar in Terrones' group,

Credit: 
Penn State

Simple test identifies patients at high risk for future dialysis or transplant

A low-cost test that screens for excess protein in the urine has been shown to accurately identify patients at higher risk for progressive kidney disease after being hospitalized for acute kidney injury, according to a new study by researchers at UC San Francisco.

Targeting treatment to these high-risk patients could reduce the future need for dialysis or kidney transplants and save patients and the U.S. health care system billions of dollars, the researchers said.

In a study of more than 1,500 patients with and without acute kidney injury (AKI) -- a condition in which the kidneys suddenly cannot filter waste from the blood -- those with higher levels of protein in the urine three months after hospital discharge were found to have a 1.5-times greater risk of future kidney disease.

The findings appear online Jan. 27, 2020, in JAMA Internal Medicine.

Acute kidney injury can occur at any age and has various causes and complex symptoms. It affects more than 200,000 patients in the United States each year and 13.3 million worldwide. Patients who recover have a greater likelihood of recurrence, kidney disease progression, kidney failure, heart disease and even death. About 20 percent of patients develop chronic kidney disease within three to five years.

Prior research found that acute kidney injury was associated with a $5.4 billion to $24 billion increase in U.S. hospitalization costs. The most expensive patients were those requiring dialysis, for whom costs were $11,016 to $42,077 higher per hospitalization than for patients without acute kidney injury. AKI also results in an additional 3.2 days of hospitalization per patient, with costs higher than myocardial infarction and gastrointestinal bleeding and comparable to stroke, pancreatitis and pneumonia.

Proteinuria, or excess protein in the urine, is an important indicator of kidney function that can signal early kidney disease. However, it is not commonly measured in patients after AKI has occurred, despite being an inexpensive, non-invasive test that is frequently used by clinicians in other settings.

"There should be much more emphasis on the testing of proteinuria after AKI to identify high-risk patients," said lead author Chi-yuan Hsu, MD, MSc, professor and chief of nephrology at UCSF. "This simple test carries important prognostic information not conveyed by serum creatinine."

"Too many providers rely on serum creatinine alone to assess the health of the kidneys, but they should not be falsely assured by the latter," Hsu said. "Having a more complete picture of kidney health is necessary for proper clinical decision-making."

In the JAMA Internal Medicine study, Hsu and his colleagues provided results from their study of 1,538 hospitalized adults, equally divided between having or not having acute kidney injury. Patients in this decade-long study were enrolled between December 2009 and February 2015, with semiannual phone contact and an annual in-person visit through November 2018.

During an average follow up of 4.7 years, 138 patients overall (9 percent) had kidney disease progression, and 58 patients had end-stage renal disease. Of those with AKI, 97 patients (12.6 percent) had kidney disease progression. Those with proteinuria after three months were found to have a 1.5-times higher risk of disease progression.

In conjunction with testing, therapies to reduce proteinuria, such as blood pressure control or medications, may reduce adverse outcomes, said Hsu, who also is an adjunct investigator with the Kaiser Permanente Division of Research. Kaiser Permanente/UCSF was one of four North American clinical centers involved in the study.

"Most patients with acute kidney injury are unaware of their condition, lack understanding of its natural history or predisposing factors, and desire more information," Hsu said. "However, few discharge communications currently provided to these patients explain the condition or provide recommendations for care, which is needed throughout the care continuum."

Credit: 
University of California - San Francisco

Another reason to reduce man-made ozone: To cool a warming planet

image: Vegetation absorbs both CO2 and O3, but the O3 inhibits photosynthesis and so reduces the amount of CO2 plants can take up, leaving more behind in the atmosphere. Ozone is a product of fossil fuel combustion from key economic sectors, including energy, industry and transportation, and is formed by reactions between pollutant gases and sunlight. Reducing ozone will help vegetation to grow better and take up more CO2, while also reducing unhealthy pollutants such as nitrogen oxides.

Image: 
Ben Felzer/Lehigh University

While elected officials in the U.S. debate a proposed "Green New Deal" and U.S. President Donald Trump derides "prophets of doom" in Davos, environmental scientists continue to gather evidence about how changes to industry could mitigate the harms of climate change.

In a News and Views article in Nature Climate Change ("Cleaner Air is a Win-Win," 10.1038/s41558-019-0685-4), Lehigh University Professor of Earth and Environmental Science Benjamin S. Felzer highlights the importance of a new analysis based on Earth system modelling, which shows that cleaning up ozone precursors within specific economic sectors can increase the mitigation potential of the land carbon sink by enhancing the ability of vegetation to remove carbon dioxide from the atmosphere through photosynthesis (Unger, N., Zheng, Y., Yue, X. & Harper, K. L. Nat. Clim. Change https://doi.org/10.1038/s41558-019-0678-3 (2020)). Global climate models, he notes, indicate that ozone limits photosynthesis and vegetation growth in polluted regions such as the eastern United States, eastern China and Europe, which in turn limits the ability of these regions to act as carbon sinks.

Felzer writes: "The study [by Nadine Unger et al]...assesses the effect of reducing ozone precursors in seven different economic emission sectors, the most important of which turn out to be energy (electricity and heat production from fossil fuel burning), industry (fossil fuels burned on site), road transportation and agriculture.

Unger et al. ran an Earth system model, linking climate to atmospheric chemistry, to explore the global effects on photosynthesis of reducing emissions from these sectors by 50%. Ozone pollution resulted in 9-13% reductions in photosynthesis in the aforementioned polluted regions. Cleaning up ozone precursors in the transportation, energy, industrial or agricultural sectors led to 13-16% gains in photosynthesis in eastern China, and 16-23% gains in the eastern United States and Europe due to the transportation and energy sectors. Benefits were 2-3 times larger in croplands and grasslands than forests. A 50% reduction in ozone pollution from just the transportation and energy sectors resulted in an increase in photosynthesis equivalent to the amount of carbon lost by fire each year."

According to Felzer, Unger and colleagues ultimately conclude that the mitigation potential resulting from addressing ozone pollution would result in a 15% increase in the size of the current land sink for carbon.

How reducing ozone precursors could slow down the impacts of climate change

From the perspective of human health impacts, there is "good" ozone and "bad" ozone. Natural ozone in the second major layer of Earth's atmosphere has a protective effect for humans, blocking the sun's harmful ultraviolet (UV) rays. Man-made ozone, a byproduct of fossil fuel production and other industrial processes, gets trapped in the atmospheric layer closest to earth and has been shown to be harmful to human health, as well as to plants, trees and crops.

Man-made ozone at ground-level inhibits plant photosynthesis by directly damaging some of the plant cells responsible for it.

"It affects different plants differently, for example doing more damage to crops than to trees at similar doses...," he writes. "Global climate models indicate that ozone limits photosynthesis and vegetation growth in polluted regions such as the eastern United States, eastern China, and Europe...This then reduces the carbon sequestration potential of these regions..."

Reducing ozone, concludes Felzer, will help vegetation grow better and take up more carbon dioxide, while also reducing unhealthy pollutants such as nitrogen oxides and volatile organic compounds (VOCs).

Political debates and rhetoric aside, it is a conclusion that supports reducing man-made ozone for the health of humans, as well as the planet on which all life depends.

Credit: 
Lehigh University

Detection of very high frequency magnetic resonance could revolutionize electronics

image: Jing Shi is a professor in the Department of Physics and Astronomy at UC Riverside.

Image: 
I. Pittalwala, UC Riverside.

RIVERSIDE, Calif. -- A team of physicists has discovered an electrical detection method for terahertz electromagnetic waves, which are extremely difficult to detect. The discovery could help miniaturize the detection equipment on microchips and enhance sensitivity.

Terahertz is a unit of electromagnetic wave frequency: One gigahertz equals 1 billion hertz; 1 terahertz equals 1,000 gigahertz. The higher the frequency, the faster the transmission of information. Cell phones, for example, operate at a few gigahertz.

The finding, reported today in Nature, is based on a magnetic resonance phenomenon in anti-ferromagnetic materials. Such materials, also called antiferromagnets, offer unique advantages for ultrafast and spin-based nanoscale device applications.

The researchers, led by physicist Jing Shi of the University of California, Riverside, generated a spin current, an important physical quantity in spintronics, in an antiferromagnet and were able to detect it electrically. To accomplish this feat, they used terahertz radiation to pump up magnetic resonance in chromia to facilitate its detection.

In ferromagnets, such as a bar magnet, electron spins point in the same direction, up or down, giving the material its collective strength. In antiferromagnets, the atomic arrangement is such that the electron spins cancel each other out, with half of the spins pointing up and the other half pointing down.

The electron has a built-in spin angular momentum, which can precess the way a spinning top precesses around a vertical axis. When the precession frequency of electrons matches the frequency of electromagnetic waves generated by an external source acting on the electrons, magnetic resonance occurs and is manifested in the form of a greatly enhanced signal that is easier to detect.
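
As textbook background (not a formula quoted from the study), the antiferromagnetic resonance frequency is exchange-enhanced relative to that of a ferromagnet:

$$\omega_{\mathrm{AFMR}} \approx \gamma\sqrt{2 H_E H_A}$$

where $\gamma$ is the gyromagnetic ratio, $H_E$ the exchange field and $H_A$ the anisotropy field. Because $H_E$ is very large in antiferromagnets, $\omega_{\mathrm{AFMR}}$ falls in the sub-terahertz to terahertz range, which is why 0.24 terahertz radiation can match the precession frequency in chromia.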

In order to generate such magnetic resonance, the team of physicists from UC Riverside and UC Santa Barbara worked with 0.24 terahertz of radiation produced at the Institute for Terahertz Science and Technology's Terahertz Facilities at the Santa Barbara campus. This closely matched the precession frequency of electrons in chromia. The magnetic resonance that followed resulted in the generation of a spin current that the researchers converted into a DC voltage.

"We were able to demonstrate that antiferromagnetic resonance can produce an electrical voltage, a spintronic effect that has never been experimentally done before," said Shi, a professor in the Department of Physics and Astronomy.

Shi, who directs the Department of Energy-funded Energy Frontier Research Center Spins and Heat in Nanoscale Electronic Systems, or SHINES, at UC Riverside, explained that subterahertz and terahertz radiation are a challenge to detect. Current communication technology uses gigahertz microwaves.

"For higher bandwidth, however, the trend is to move toward terahertz microwaves," Shi said.  "The generation of terahertz microwaves is not difficult, but their detection is. Our work has now provided a new pathway for terahertz detection on a chip."

Although antiferromagnets are statically uninteresting, they are dynamically interesting. Electron spin precession in antiferromagnets is much faster than in ferromagnets, resulting in frequencies that are two to three orders of magnitude higher than those of ferromagnets, thus allowing faster information transmission.

"Spin dynamics in antiferromagnets occur at a much shorter timescale than in ferromagnets, which offers attractive benefits for potential ultrafast device applications," Shi said.

Antiferromagnets are ubiquitous and more abundant than ferromagnets. Many ferromagnets, such as iron and cobalt, become antiferromagnetic when oxidized. Many antiferromagnets are good insulators with low dissipation of energy. Shi's lab has expertise in making ferromagnetic and antiferromagnetic insulators.

Shi's team developed a bilayer structure composed of chromia, an antiferromagnetic insulator, with a layer of metal on top to serve as the detector sensing signals from the chromia.

Shi explained that electrons in chromia remain local. What crosses the interface is information encoded in the precessing spins of the electrons.

"The interface is critical," he said. "So is spin sensitivity."

The researchers addressed spin sensitivity by focusing on platinum and tantalum as metal detectors. If the signal from chromia originates in spin, platinum and tantalum register the signal with opposite polarity. If the signal is caused by heating, however, both metals register the signal with identical polarity.
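
This opposite-polarity test is consistent with detection through the inverse spin Hall effect, the standard mechanism for converting a spin current into a voltage in heavy metals (the release does not name the effect, so this reading is our gloss): the transverse electric field follows

$$\mathbf{E} \propto \theta_{\mathrm{SH}}\,\mathbf{J}_s \times \boldsymbol{\sigma}$$

where $\mathbf{J}_s$ is the spin current, $\boldsymbol{\sigma}$ the spin polarization and $\theta_{\mathrm{SH}}$ the metal's spin Hall angle. Platinum and tantalum have spin Hall angles of opposite sign, so a genuine spin signal reverses polarity between the two detectors, while a thermoelectric artifact does not.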

"This is the first successful generation and detection of pure spin currents in antiferromagnetic materials, which is a hot topic in spintronics," Shi said. "Antiferromagnetic spintronics is a major focus of SHINES."

Credit: 
University of California - Riverside

Cutting road transport pollution could help plants grow

Cutting emissions of particular gases could improve conditions for plants, allowing them to grow faster and capture more carbon, new research suggests.

A cocktail of gases - including nitrogen oxides, carbon monoxide, volatile organic compounds and methane - combines in the atmosphere to form ozone.

Ozone at the Earth's surface limits photosynthesis, reducing plants' ability to grow.

University of Exeter researchers say cutting emissions of ozone-forming gases offers a "unique opportunity" to create a "natural climate solution".

A 50% cut in emissions of these gases from the seven largest human-made sources - including road transport (the largest emitter) and energy production - would help plants contribute to "negative carbon emissions", the study says.

"Ecosystems on land currently slow global warming by storing about 30% of our carbon dioxide emissions every year," said Professor Nadine Unger, of the University of Exeter.

"This carbon capture is being undermined by ozone pollution.

"Our findings suggest the largest losses of plant productivity are in the eastern United States, Europe and eastern China, which all have high levels of surface ozone pollution.

"The impact on plant growth in these areas is estimated to be 5-20% annually."

Ozone is not emitted directly but forms in the atmosphere during complex chemical reactions of carbon monoxide, methane, non-methane volatile organic compounds and nitrogen oxides.

The seven areas of human activity that emit the largest amounts of these gases are agriculture, residential, energy, industry, road transportation, waste/landfill and shipping.

The study says a target of cutting these specific emissions by 50% is "large but plausible", citing examples of cuts already made in some industries.

"Deep cuts in air pollutant emissions from road transportation and the energy sector are the most effective mitigation measures for ozone-induced loss of plant productivity in eastern China, the eastern United States, Europe and globally," said Professor Unger.

"Our results suggest mitigation of ozone vegetation damage is a unique opportunity to contribute to negative carbon emissions, offering a natural climate solution that links fossil fuel emission abatement, air quality and climate.

"However, achieving these benefits requires ambitious mitigation efforts in multiple sectors."

Credit: 
University of Exeter

Research suggests benefits of conservation efforts may not yet be fully visible

The time it takes for species to respond to conservation measures - known as an 'ecological time lag' - could be partly masking any real progress that is being made, experts have warned.

Global conservation targets to reverse declines in biodiversity and halt species extinctions are not being met, despite decades of conservation action.

Last year, a UN report on global biodiversity warned one million species are at risk of extinction within decades, putting the world's natural life-support systems in jeopardy.

The report also revealed we were on track to miss almost all of the 2020 nature targets that had been agreed a decade earlier under the global Convention on Biological Diversity.

But work published today in the journal Nature Ecology and Evolution offers new hope: in some cases, conservation measures may not be failing; it may simply be too early to see the progress that is being made.

Led by Forest Research together with the University of Stirling, Natural England and Newcastle University, the study's authors highlight the need for 'smarter' biodiversity targets that account for ecological time-lags. Such targets would help distinguish between cases where conservation interventions are on track to succeed but need more time for the benefits to be realised, and those where current conservation actions are simply insufficient or inappropriate.

Lead researcher Dr Kevin Watts of Forest Research said:

"We don't have time to wait and see which conservation measures are working and which ones will fail. But the picture is complicated and we fear that some conservation actions that will ultimately be successful may be negatively reviewed, reduced or even abandoned simply due to the unappreciated delay between actions and species' response.

"We hope the inclusion of time-lags within biodiversity targets, including the use of well-informed interim indicators or milestones, will greatly improve the way that we evaluate progress towards conservation success.

"Previous conservation efforts have greatly reduced the rate of decline for many species and protected many from extinction and we must learn from past successes and remain optimistic: conservation can and does work, but at the same time, we mustn't be complacent. This work also emphasises the need to acknowledge and account for the fact that biodiversity may still be responding negatively to previous habitat loss and degradation."

'Rebalancing the system'

Ecological time-lags relate to the rebalancing of a system following a change, such as the loss of habitat or the creation of new habitat.

Dr Watts added: "The system is analogous to a financial economy: we are paying back the extinction 'debt' from past destruction of habitats and now waiting for the 'credit' to accrue from conservation actions. What we're trying to avoid now is going bankrupt by intervening too late and allow the ecosystem to fail."

Using theoretical modelling, along with data from a 'woodland bird' biodiversity indicator, the research team explored how species with different characteristics (e.g. habitat generalists vs specialists) might respond over time to landscape change caused by conservation actions.
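
To make the idea of a time-lag concrete, here is a purely illustrative sketch (our own, not the WrEN project's model; rates and units are invented) in which a generalist and a specialist species both recover toward the same carrying capacity after habitat creation, but at different speeds:

```python
import numpy as np

# Illustrative time-lag model: occupancy relaxes toward a new carrying
# capacity K after habitat creation at t = 0, at a species-specific rate.
years = np.arange(0, 51)
K = 1.0                             # post-restoration carrying capacity (relative units)

def occupancy(rate):
    """Exponential approach to K; a slower rate means a longer ecological time-lag."""
    return K * (1 - np.exp(-rate * years))

generalist = occupancy(rate=0.20)   # colonizes new habitat quickly
specialist = occupancy(rate=0.04)   # needs mature, high-quality habitat

for t in (5, 10, 25, 50):
    print(f"year {t:2d}: generalist {generalist[t]:.2f}, specialist {specialist[t]:.2f}")
```

An assessment at year 10 would find the specialist at roughly a third of its eventual occupancy even though the intervention is on track, which is the kind of premature negative review the authors warn against.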

The authors suggest the use of milestones that mark the path towards conservation targets. For instance, the ultimate success of habitat restoration policies could be assessed firstly against the amount of habitat created, followed by the arrival of generalist species. Then, later colonisation by specialists would indicate increased habitat quality. If a milestone is missed at any point, the cause should be investigated and additional conservation interventions considered.

Philip McGowan, Professor of Conservation Science and Policy at Newcastle University and Chair of the IUCN Species Survival Commission Post-2020 Biodiversity Targets Task Force, said we need to "hold our nerve."

"Ultimately, ten years is too short a time for most species to recover.

"There are many cases where there is strong evidence to suggest the conservation actions that have been put in place are appropriate and robust - we just need to give nature more time.

"Of course, time isn't something we have. We are moving faster and faster towards a point where the critical support systems in nature are going to fail.

Almost 200 of the world's governments and the EU signed the Convention on Biological Diversity's 10-year plan to protect some of the world's most threatened species, which was launched in 2010.

"A new plan is being negotiated during 2020 and it is critical that negotiators understand the time that it takes to reverse species declines and the steps necessary to achieve recovery of species on the scale that we need," says Professor McGowan.

"But there is hope."

Simon Duffield, Natural England, adds:

"We know that natural systems takes time to respond to change, whether it be positive, such as habitat creation, or negative such as habitat loss, degradation or increasingly climate change. How these time lags are incorporated into conservation targets has always been a challenge. We hope that this framework takes us some way towards being able to do so."

The research was conducted as part of the Woodland Creation and Ecological Networks (WrEN) project.

Professor Kirsty Park, co-lead for the WrEN project, said:

"This research is timely as there is an opportunity to incorporate time-lags into the construction of the Convention on Biological Diversity Post-2020 Global Biodiversity Framework. We need to consider realistic timescales to observe changes in the status of species, and also take into account the sequence of policies and actions that will be necessary to deliver those changes".

Credit: 
Newcastle University