Tech

Army project turns to nature for help with self-healing material

image: An Army-funded project developed a self-healing material patterned after squid ring teeth protein that could be used to repair materials under continual repetitive movement such as robotic machines, prosthetic legs, ventilators and personal protective equipment like hazmat suits.

Image: 
Penn State University

RESEARCH TRIANGLE PARK, N.C. -- An Army-funded project developed a self-healing material patterned after squid ring teeth protein. The biodegradable biosynthetic polymer could be used to repair materials that are under continual repetitive movement such as robotic machines, prosthetic legs, ventilators and personal protective equipment like hazmat suits.

"Materials that undergo continual repetitive motion often develop tiny tears and cracks that can expand, leading to catastrophic failure," said Dr. Stephanie McElhinny, biochemistry program manager, Army Research Office, an element of the U.S. Army Combat Capabilities Development Command's Army Research Laboratory. "With a self-healing bio-based synthetic material, any sites of damage that emerge can be repaired, extending the lifetime of the system or device."

The research at Penn State University and the Max Planck Institute for Intelligent Systems in Stuttgart, Germany, funded in part by ARO and published in Nature Materials, produced high-strength synthetic proteins that mimic those found in nature. The researchers surveyed large libraries of novel proteins created by assembling repetitive sequences from the squid ring teeth protein in different configurations.

Squid ring teeth are circular predatory appendages located on the suction cups of squid, used to grasp prey. If the teeth are broken, they can heal themselves. The soft segments of the proteins help the broken proteins fuse back together in water, while the hard segments reinforce the structure and keep it strong.

"Our goal is to create self-healing programmable materials with unprecedented control over their physical properties using synthetic biology," said Melik Demirel, professor of engineering science and mechanics at Penn State and the paper's co-author.

Repeated activity wears on soft robotic actuators, but these machines' moving parts need to be reliable and easily fixed. Now a team of researchers has created a biosynthetic polymer, patterned after squid ring teeth, that is self-healing and biodegradable, yielding a material suited not only to actuators but also to hazmat suits and other applications where tiny holes could pose a danger.

Current strategies for material self-healing have significant limitations, including requirements for potentially hazardous chemicals, loss in functionality of the healed material relative to the original state, and long healing times, often greater than 24 hours.

"We were able to reduce a typical 24-hour healing period to one second, so our protein-based soft robots can now repair themselves immediately," said Abdon Pena-Francesch, Humboldt postdoctoral fellow in the physical intelligence department at the Max Planck Institute for Intelligent Systems and lead author of the paper. "In nature, self-healing takes a long time. In this sense, our technology outsmarts nature."

The self-healing polymer heals with the application of water and heat, although Demirel said that it could also heal using light.

"Self-repairing physically intelligent soft materials are essential for building robust and fault-tolerant soft robots and actuators in the near future," said Metin Sitti, director, physical intelligence department at the Max Planck Institute for Intelligent Systems.

By adjusting the number of tandem repeats, the researchers created a soft polymer that healed rapidly and retained its original strength. They also created a polymer that is 100% biodegradable and 100% recyclable into the same, original polymer.

"We want to minimize the use of petroleum-based polymers for many reasons," Demirel said. "Sooner or later we will run out of petroleum and it is also polluting and causing global warming. We can't compete with the really inexpensive plastics. The only way to compete is to supply something the petroleum-based polymers can't deliver and self-healing provides the performance needed."

Demirel explained that while many petroleum-based polymers can be recycled, they are recycled into something different. For example, polyester t-shirts can be recycled into bottles, but not into polyester fibers again.

Just as the squid that the polymer mimics biodegrades in the ocean, the biomimetic polymer will biodegrade. With the addition of an acid such as vinegar, the polymer can also be recycled into a powder that is manufacturable into the same soft, self-healing polymer.

"This research illuminates the landscape of material properties that become accessible by going beyond proteins that exist in nature using synthetic biology approaches," McElhinny said. "The rapid and high-strength self-healing of these synthetic proteins demonstrates the potential of this approach to deliver novel materials for future Army applications, such as personal protective equipment or flexible robots that could maneuver in confined spaces."

Credit: 
U.S. Army Research Laboratory

Medical journals' commercial publishing contracts may lead to biased articles

AUSTIN, Texas -- Scientists have long been concerned that the common practice of medical journals accepting commercial payments from pharmaceutical companies may lead to pro-industry bias in published articles. According to new research at The University of Texas at Austin, scientists were right to be concerned, but they were focusing on the wrong type of payments.

In a new article published by PLOS ONE, researchers reviewed 128,781 articles published in 159 different medical journals for markers of pro-industry bias, evaluating whether accepting advertising revenue, fulfilling reprint contracts or being owned by a large multinational publishing firm made a journal more likely to publish articles favorable to industry. They found that articles published in journals that accept reprint fees are nearly three times more likely to be written by authors who receive industry payments.

"All the available literature suggests that ad revenue should be the real concern, but that's not what we found," said S. Scott Graham, lead author of the study and assistant professor of rhetoric at UT Austin. "I was honestly surprised by the findings here. There's a famous story about one company pulling a multimillion-dollar contract from the Annals of Internal Medicine because they didn't like an article published in the journal."

A long-standing body of research has found that articles written by authors with financial conflicts of interest are more likely to discuss pharmaceutical products favorably. So Graham and his team built the first-ever machine-learning system and database to track individual conflicts of interest in English-language disclosure statements, deciphered from plain language descriptions such as "Doctor X received funds from Company A and Company B."

"This is, in some ways, a classic digital humanities research problem," said Graham, who attributed the study's success to his background in rhetoric. "I have colleagues in literature and history who train computers to read 10,000 novels or 100,000 documents in an archive. We used the same kind of techniques to develop a computer system that can read conflicts of interest."
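As a rough illustration of this kind of text mining, a rule-based extractor over a single disclosure sentence might look like the sketch below. This is an assumption-laden toy, not the team's actual machine-learning system; the function name, regex, and sentence format are all hypothetical.

```python
import re

def extract_conflicts(statement):
    """Toy extractor: pull (author, [companies]) pairs from a
    plain-language disclosure sentence. Illustrative only; the
    study used a trained machine-learning system, not regexes."""
    pattern = re.compile(
        r"(?P<author>(?:Dr\.|Doctor)\s+\w+)\s+received\s+funds\s+from\s+"
        r"(?P<companies>[^.]+)"
    )
    results = []
    for m in pattern.finditer(statement):
        # Split "Company A, Company B and Company C" into a list of names.
        companies = re.split(r",\s*|\s+and\s+", m.group("companies").strip())
        results.append((m.group("author"), [c for c in companies if c]))
    return results

print(extract_conflicts("Doctor X received funds from Company A and Company B."))
# -> [('Doctor X', ['Company A', 'Company B'])]
```

A real system must cope with far messier phrasing ("consulting fees," "honoraria," negations like "no conflicts to declare"), which is why the team trained a model rather than writing rules.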

The team found that articles published in journals that accept reprint fees are 2.81 times more likely to be written by authors who receive industry payments. They also found that accepting advertising revenue or being owned by a large publishing firm had no effect on the likelihood that any given article would represent a conflict of interest for the author.
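A figure like "2.81 times more likely" is commonly derived as an odds ratio from a 2x2 contingency table. The counts below are hypothetical, chosen only to show the arithmetic; they are not the study's data.

```python
def odds_ratio(a, b, c, d):
    """Odds ratio for a 2x2 table:
                      conflict   no conflict
    reprint fees         a            b
    no reprint fees      c            d
    """
    return (a / b) / (c / d)

# Hypothetical counts: 281 of 1,000 articles in reprint-fee journals have
# a paid author, versus 122 of 1,000 in journals without reprint contracts.
print(round(odds_ratio(281, 719, 122, 878), 2))  # -> 2.81
```

Note that an odds ratio of 2.81 is close to, but not the same thing as, a 2.81-fold difference in raw probability; the two converge only when the outcome is rare.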

The researchers also investigated whether there was any relationship between a journal's commercial practices and the number of author conflicts per article, finding that articles published in journals that only accept reprint contracts had 1.52 more conflicts per article on average. And articles published in journals that only make advertising space available had 1.13 fewer conflicts per article on average.

The team also found that articles published in journals owned by large publishing companies had 3.2 more conflicts of interest on average. However, many of these journals also accepted advertising and reprint fees, the researchers said.

"If we're going to make sure that medical journals are publishing the best science available, we need to focus on the commercial relationships that actually have an effect," Graham said. "The issue with reprints also suggests that academics may need to take open access publishing even more seriously."

Credit: 
University of Texas at Austin

Mars 2020 mission to be guided by USGS astrogeology maps

image: This oblique view looks to the west from above the Jezero crater floor, over the fan-shaped delta deposit, and into the valley that cuts through the crater rim. The Perseverance Mars rover will land near this delta to search for evidence of past life and collect samples that could be returned to Earth by a later mission. Deltas form when flowing water is slowed down by encountering standing water, causing sediment to be deposited. On Earth, deltas are excellent at concentrating and preserving evidence of life, making this delta on Mars an appealing target. The Mars 2020 spacecraft will carry the same image mosaic used to generate the view shown here onboard and use it to steer itself away from hazards on the surface such as cliffs and dune fields. The seamless mosaic is composed of multiple precisely aligned images from the Context Camera on the Mars Reconnaissance Orbiter, and has a resolution of 6 meters per pixel. The large crater on the delta has a diameter of roughly 1 km.

Image: 
NASA/MSSS/USGS

When NASA’s Perseverance rover lands on Mars next year, it will be equipped with some of the most precise maps of Mars ever created, courtesy of the USGS Astrogeology Science Center. Not only are the new maps essential for a safe landing on Mars, but they also serve as the foundation upon which the science activities planned for the Mars mission will be built.

“Exploration is part of human nature and USGS has a long history and enduring interest in researching planets other than our own,” said USGS director and former NASA astronaut Jim Reilly. “These maps will help the Perseverance mission unlock the mysteries of the red planet's past and guide future missions.”

Perseverance is expected to launch this week. The mission’s goals are to search for evidence of past life and habitable environments in Jezero crater and collect and store samples that, for the first time in history, could be returned to Earth by a future mission.

To safely land on the rugged Martian landscape, the spacecraft will use a new technology called “Terrain Relative Navigation.” As it descends through the planet’s atmosphere, the spacecraft will use its onboard maps to know precisely where it is and avoid hazards. For the navigation to work, the spacecraft needs the best possible maps of the landing site and surrounding terrain.

“As much as we would love to manually steer the spacecraft as it lands, that’s just not possible,” said Robin Fergason, USGS research geophysicist. “Mars is so far away -- some 130 million miles at the time of landing -- that it takes several minutes for radio signals to travel between Mars and Earth. By using the maps we created, the spacecraft will be able to safely steer itself instead.”
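The core idea behind map-based self-steering can be sketched as a toy template search: compare what the descent camera sees against the onboard map and find the best-matching offset. This is an illustrative sketch only; the function names and the brute-force method are assumptions, not NASA's actual Terrain Relative Navigation algorithm.

```python
import numpy as np

def locate(descent_img, onboard_map):
    """Toy terrain-relative localization: slide the descent-camera
    patch over the onboard map and return the (row, col) offset with
    the smallest sum of squared differences."""
    ph, pw = descent_img.shape
    mh, mw = onboard_map.shape
    best, best_rc = np.inf, (0, 0)
    for r in range(mh - ph + 1):
        for c in range(mw - pw + 1):
            ssd = np.sum((onboard_map[r:r+ph, c:c+pw] - descent_img) ** 2)
            if ssd < best:
                best, best_rc = ssd, (r, c)
    return best_rc

rng = np.random.default_rng(0)
terrain = rng.random((40, 40))     # stand-in for the 6 m/pixel map
patch = terrain[12:20, 25:33]      # "descent image" cut from the map
print(locate(patch, terrain))      # -> (12, 25)
```

Once the lander knows its position on the map, it can compare that position against the pre-mapped hazard locations and divert if necessary.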

The USGS developed two new maps for the Mars mission. The first is a high-resolution (25-cm per pixel) map that researchers have used to accurately map surface hazards at the landing site. This map will serve as the base map for mission operations and to plot where the rover will explore after landing. The second map is a lower resolution (6-meters per pixel) map that spans the landing site and much of the surrounding terrain. This will be used onboard the spacecraft, along with the locations of the hazards from the high-resolution map, to help it land safely. The maps have been aligned with unprecedented precision to each other and to global maps of Mars to ensure that the maps show the hazards exactly where they really are.

The new maps are based on images collected by the Mars Reconnaissance Orbiter’s Context Camera and the High-Resolution Imaging Science Experiment (HiRISE) camera and are available online here.

For more details about these new maps of the Perseverance rover landing site, read the abstract. For the latest news about the mission, visit the NASA website.

USGS provides science for a changing world. For more information, visit www.usgs.gov.

When you’re planning to explore someplace new, it’s always a good idea to bring a map so you can avoid dangerous terrain. This is true whether you’re heading out for a hike on Earth or you’re landing a rover on Mars! When NASA’s Perseverance rover lands on Mars in 2021, it will be equipped with the most precise maps of Mars ever created, courtesy of the USGS Astrogeology Science Center.

Safe Landing

USGS maps will help the Mars 2020 spacecraft steer itself to a safe landing near an ancient river delta, the ideal spot to search for evidence of past life.

Guide Rover Exploration

The Perseverance rover will help unlock the mysteries of the red planet's history and guide future missions. USGS created this high-precision map of the Mars 2020 landing site in Jezero crater, continuing our long history of helping to explore our solar system.

Sources

Background: An artist's concept of NASA's Ingenuity Mars Helicopter attempting its first test flight on the Red Planet; the agency's Mars 2020 Perseverance rover will be close by. NASA/JPL-Caltech
Left: Mars 2020 Terrain Relative Navigation HiRISE Orthorectified Image Mosaic. USGS
Right: Global color coverage of Mars at a scale of 1 km/pixel. NASA/JPL/USGS

Credit: 
U.S. Geological Survey

Laser inversion enables multi-materials 3D printing

image: Laser beam transmitting upwards through glass.

Image: 
John Whitehead/Columbia Engineering

New York, NY--July 27, 2020--Additive manufacturing--or 3D printing--uses digital manufacturing processes to fabricate components that are light, strong, and require no special tooling to produce. Over the past decade, the field has experienced staggering growth, at a rate of more than 20% per year, printing pieces that range from aircraft components and car parts to medical and dental implants out of metals and engineering polymers. One of the most widely used manufacturing processes, selective laser sintering (SLS), prints parts out of micron-scale material powders using a laser: the laser heats the particles to the point where they fuse together to form a solid mass.

"Additive manufacturing is key to economic resilience," says Hod Lipson, James and Sally Scapa Professor of Innovation (Mechanical Engineering). "All of us care about this technology--it's going to save us. But there's a catch."

The catch is that SLS technologies have been limited to printing with a single material at a time: the entire part has to be made of just that one powder. "Now, let me ask you," Lipson continues, "how many products are made of just one material? The limitation of printing in only one material has been haunting the industry and blocking its expansion, preventing it from reaching its full potential."

Wondering how to solve this challenge, Lipson and his PhD student John Whitehead used their expertise in robotics to develop a new approach to overcome these SLS limitations. By inverting the laser so that it points upwards, they invented a way to enable SLS to use multiple materials at the same time. Their working prototype, along with a print sample that contained two different materials in the same layer, was recently published online by Additive Manufacturing as part of its December 2020 issue.

"Our initial results are exciting," says Whitehead, the study's lead author, "because they hint at a future where any part can be fabricated at the press of a button, where objects ranging from simple tools to more complex systems like robots can be removed from a printer fully formed, without the need for assembly."

Selective laser sintering traditionally has involved fusing together material particles using a laser pointing downward into a heated print bed. A solid object is built from the bottom up, with the printer placing down a uniform layer of powder and using the laser to selectively fuse some material in the layer. The printer then deposits a second layer of powder onto the first layer, the laser fuses new material to the material in the previous layer, and the process is repeated over and over until the part is completed.

This process works well if there is just one material used in the printing process. But using multiple materials in a single print has been very challenging, because once the powder layer is deposited onto the bed, it cannot be unplaced, or replaced with a different powder.

"Also," adds Whitehead, "in a standard printer, because each of the successive layers placed down are homogeneous, the unfused material obscures your view of the object being printed, until you remove the finished part at the end of the cycle. Think about excavation and how you can't be sure the fossil is intact until you completely remove it from the surrounding dirt. This means that a print failure won't necessarily be found until the print is completed, wasting time and money."

The researchers decided to find a way to eliminate the need for a powder bed entirely. They set up multiple transparent glass plates, each coated with a thin layer of a different plastic powder. They lowered a print platform onto the upper surface of one of the powders, and directed a laser beam up from below the plate and through the plate's bottom. This process selectively sinters some powder onto the print platform in a pre-programmed pattern according to a virtual blueprint. The platform is then raised with the fused material, and moved to another plate, coated with a different powder, where the process is repeated. This allows multiple materials to either be incorporated into a single layer, or stacked. Meanwhile, the old, used-up plate is replenished.
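The plate-swapping loop described above can be sketched as a simple controller routine. All names here are hypothetical, chosen for illustration; this is not the authors' control code.

```python
def print_part(layers, plates):
    """Sketch of the inverted-laser SLS loop.
    layers: list of (material, pattern) tuples, one per layer;
    plates: dict mapping material name -> powder-coated plate id.
    Returns a log of the actions a controller would perform."""
    log = []
    for material, pattern in layers:
        plate = plates[material]
        log.append(f"lower platform onto {plate}")            # part hangs above the glass
        log.append(f"sinter '{pattern}' up through {plate}")  # laser fires from below
        log.append(f"raise platform and replenish {plate}")   # recoat plate for reuse
    return log

log = print_part(
    [("nylon", "ring"), ("TPU", "core"), ("nylon", "ring")],
    {"nylon": "plate-A", "TPU": "plate-B"},
)
print(len(log))  # 9 actions for 3 layers
```

Because the part hangs from the platform and each plate is recoated after use, the same loop handles any sequence of materials, including different materials within one layer.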

In the paper, the team demonstrated their working prototype by generating a 50-layer-thick, 2.18 mm sample out of thermoplastic polyurethane (TPU) powder with an average layer height of 43.6 microns, and a multi-material nylon and TPU print with an average layer height of 71 microns. These parts demonstrated both the feasibility of the process and the capability to make stronger, denser materials by pressing the plate hard against the hanging part while sintering.

"This technology has the potential to print embedded circuits, electromechanical components, and even robot components. It could make machine parts with graded alloys, whose material composition changes gradually from end to end, such as a turbine blade with one material used for the core and different material used for the surface coatings," Lipson notes. "We think this will expand laser sintering towards a wider variety of industries by enabling the fabrication of complex multi-material parts without assembly. In other words, this could be key to moving the additive manufacturing industry from printing only passive uniform parts, towards printing active integrated systems."

The researchers are now experimenting with metallic powders and resins in order to directly generate parts with a wider range of mechanical, electrical, and chemical properties than is possible with conventional SLS systems today.

Credit: 
Columbia University School of Engineering and Applied Science

X-rays recount origin of oddball meteorites

image: X-ray experiments at Berkeley Lab's Advanced Light Source helped scientists to establish that the parent planetesimal of rare meteorites, like the one shown here, had a molten core, a solid crust, and a magnetic field similar in strength to the Earth's magnetic field.

Image: 
Carl Agee/Institute of Meteoritics, University of New Mexico; background edited by MIT News

X-ray experiments at Lawrence Berkeley National Laboratory (Berkeley Lab) played a key role in resolving the origin of rare, odd meteorites that have puzzled scientists since their discovery a half-century ago. Known as type IIE iron meteorites, they appear to have originated from a parent body that had a composition featuring both fully melted and unmelted parts - other meteorite types display only one composition.

Researchers used Berkeley Lab's Advanced Light Source (ALS), a synchrotron user facility that produces light ranging from infrared to X-rays for a variety of experiments, to produce 3D reconstructions of the detailed patterns of magnetic orientation imprinted in samples from two of these rare meteorites. Only nine of these meteorites have been found on Earth.

The magnetization pockets pointed to their likely origin in a large "planetesimal" - an object that took shape during the formative stage of our solar system - that indeed was both unmelted and melted. The object likely had a molten metallic core, a solid crust, and a magnetic field that may have rivaled the Earth's in strength. The study was published online July 24 in Science Advances.

The ALS experiments used an X-ray microscopy technique known as XPEEM, coupled with a technique known as XMCD that produced greater contrast, to generate more than 500 images of the 3D-reconstructed magnetization for each sample.

"We helped the research team to get this quantitative information about the direction and magnitude of the magnetization" in the samples, said Berkeley Lab's Andreas Scholl, who first met with researchers on the team in 2015. Scholl is a senior staff scientist, beamline scientist, and science deputy at the ALS.

Rajesh Chopdekar, an ALS project scientist and a beamline scientist, noted that the X-rays were tuned specifically to measure the magnetic iron present in the samples. Thousands of magnetic images were compiled to generate statistical confidence in the magnitude and direction of the ancient magnetic field imprinted into the meteorites.

Credit: 
DOE/Lawrence Berkeley National Laboratory

Stopping listeria reproduction 'in its tracks'

image: Corresponding authors Francisco Robles, associate professor of mechanical engineering technology at the University of Houston College of Technology; and Sujata Sirsat, assistant professor at UH's Conrad N. Hilton College of Hotel and Restaurant Management

Image: 
University of Houston

Listeria contaminations can send food processing facilities into full crisis mode with mass product recalls, federal warnings and even hospitalization or death for people who consume the contaminated products. Destroying the bacterium and stopping its spread can be challenging because of the formation of biofilms, or communities of resistant bacteria that adhere to drains or other surfaces.

Researchers at the University of Houston are reporting in the Journal of Environmental Chemical Engineering that cobalt-doped titanium-dioxide (CoO-TiO2) stops the reproduction of Listeria monocytogenes in both light and dark conditions. This bacteriostatic effect could lead to bacterial control in food products that are not only sealed but also protected from light, such as Tetra Pak cartons, cans and dark glass or plastic bottles.

"The addition of cobalt, a heavy metal, drastically improved the effectiveness of titanium-dioxide because now it works under regular human conditions -- sunlight, fluorescent light such as light bulbs and even in 'the absence of light,' like in a freezer," said Francisco Robles, lead author for the study and associate professor of mechanical engineering technology.

Titanium-dioxide has long been an effective catalyst in the chemical industry with many applications, but it has limitations because ultraviolet light is needed to make it work, according to Robles. "UV light sources are in short supply in sunlight and producing it is expensive and a health hazard (e.g. carcinogen), so we set out to find a solution. Making it effective under natural light conditions is significant, and free," he said.

A naturally occurring mineral, titanium-dioxide is often used in the food industry as an additive or whitening agent for sauces, dressings and powdered foods and is considered safe by the U.S. Food and Drug Administration. It's also used in sunscreen for its protective effects against UV/UVB rays from the sun.

Sujata Sirsat, study co-author and assistant professor at UH's Conrad N. Hilton College of Hotel and Restaurant Management, believes cobalt-doped titanium-dioxide, whether manufactured directly into food packaging or added to food products, could potentially reduce the risk for large listeria outbreaks in food processing environments.

"Listeria is a rare foodborne pathogen that can survive in refrigerated conditions. So, if you had a contaminated bowl of potato salad, not only can listeria survive, it can increase in numbers potentially causing a serious health issue. The cobalt-doped titanium dioxide can potentially stop the spread in its tracks," said Sirsat, an expert in food safety and public health, who said toxicity testing is needed to determine its safety in food products.

An estimated 1,600 people get listeriosis each year from eating foods contaminated with Listeria monocytogenes, and about 260 people die, according to the U.S. Centers for Disease Control and Prevention. The CDC has led investigations of 19 multistate Listeria monocytogenes outbreaks involving fruits, vegetables, deli meats, cheeses and more since 2011. The infection is most likely to sicken pregnant women and their newborns, adults over 65 and people with weakened immune systems.

The researchers believe cobalt-doped titanium-dioxide could have a wide range of applications beyond bacteria control. "You could coat hospital plates with it to make them incapable of forming bacteria or coat the packaging of milk and other dairy products. You could even add it to paint to make bacteria-controlled paint. The possibilities are tremendous," said Robles, who has been studying the effects of the chemical compound for nearly 15 years.

Credit: 
University of Houston

NASA tracks Hanna's soaking path into Mexico

image: On July 26 at 3:35 p.m. EDT (1935 UTC) NASA's Aqua satellite analyzed Hanna in infrared light using the AIRS instrument. The strongest storms with the coldest cloud top temperatures were located over extreme south Texas and northeastern Mexico. Coldest cloud top temperatures were as cold as or colder than (purple) minus 63 degrees Fahrenheit (minus 53 degrees Celsius).

Image: 
NASA JPL/Heidar Thrastarson

NASA's Aqua satellite provided infrared data on Tropical Depression Hanna while imagery from NASA-NOAA's Suomi NPP satellite was used to create an animation showing its movement from Texas to Mexico. Infrared data can reveal the location of powerful storms that generate heavy rainfall and Hanna drenched Texas upon landfall over the weekend of July 25-26.

Tracking Hanna's Path to Mexico

Visible imagery of Hurricane Hanna from July 23 to 26, taken from the Visible Infrared Imaging Radiometer Suite (VIIRS) instrument aboard NASA-NOAA's Suomi NPP satellite was compiled and made into an animation using NASA's Worldview application. The imagery showed Hanna's landfall in east central Texas and its track to the southwest into north central Mexico.

NASA's Earth Observing System Data and Information System (EOSDIS) Worldview application provides the capability to interactively browse over 700 global, full-resolution satellite imagery layers and then download the underlying data. Many of the available imagery layers are updated within three hours of observation, essentially showing the entire Earth as it looks "right now."

Hanna the Rainmaker Breaks Records

Hanna has been a big rainmaker. On July 25, the National Weather Service (NWS) at Corpus Christi, Texas reported 2.57 inches of rainfall from Hanna. Although not a record, it was a lot of rain.

On July 26, the NWS in Brownsville, Texas reported a new record for rainfall of 3.46 inches, breaking the old record of 2.74 inches that was set in 1890. Record rainfall was also recorded at McAllen Miller International Airport where the NWS reported 4.52 inches of rainfall on July 26, breaking the old record of 1.41 inches set in the year 2000.

An Infrared Look at Hanna's Rainmaking Capability

One of the ways NASA researches tropical cyclones is using infrared data that provides temperature information. The AIRS instrument aboard NASA's Aqua satellite captured a look at those temperatures in Hanna and gave insight into the size of the storm and its rainfall potential.

Cloud top temperatures provide information to forecasters about where the strongest storms are located within a tropical cyclone. The stronger the storms, the higher they extend into the troposphere, and the colder the cloud top temperatures.

On July 26 at 3:35 p.m. EDT (1935 UTC) NASA's Aqua satellite analyzed the storm using the Atmospheric Infrared Sounder or AIRS instrument. The strongest storms with the coldest cloud top temperatures were located over extreme south Texas and northeastern Mexico. AIRS found coldest cloud top temperatures as cold as or colder than minus 63 degrees Fahrenheit (minus 53 degrees Celsius). NASA research has shown that cloud top temperatures that cold indicate strong storms that have the capability to create heavy rain.
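The paired Fahrenheit and Celsius values quoted above follow from the standard conversion formula, shown here as a quick check:

```python
def f_to_c(f):
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (f - 32) * 5 / 9

print(round(f_to_c(-63)))  # -> -53, matching the AIRS threshold above
```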

NASA provides the AIRS infrared data to forecasters at NOAA's National Hurricane Center or NHC so they can incorporate it into their forecasting. The AIRS instrument is one of six instruments flying on board NASA's Aqua satellite, launched on May 4, 2002.

More Heavy Rainfall from Hanna

The strong storms seen in infrared imagery continue to generate heavy rain and that rainfall is expected in parts of south Texas and in northern Mexico on July 27.

NHC forecasters said, "Hanna is expected to produce the following rain accumulations and flood threats through Monday: Far south Texas can expect an additional 1 to 2 inches. The northern Mexican states of Coahuila, Nuevo Leon, and Tamaulipas can expect 4 to 8 inches. Flash flooding and mudslides are possible across these Mexican states. In addition, the northern Mexican states of northern Zacatecas, northern San Luis Potosi, and eastern Durango can expect 1 to 2 inches."

Hanna's Status on July 27, 2020

NOAA's National Hurricane Center (NHC) reported at 5 a.m. EDT (0900 UTC) on July 27 that the center of Tropical Depression Hanna was located near latitude 24.1 north, longitude 102.9 west. Hanna was centered about 65 miles (105 km) north of Fresnillo, Mexico. The estimated minimum central pressure was 1004 millibars. The depression was moving toward the west near 5 mph (7 kph), and this motion is expected to continue today. Maximum sustained winds were near 25 mph (35 kph) with higher gusts.

Hanna's Forecast

NHC noted, "Ocean swells generated by Hanna will continue to affect much of the Texas and northeastern Mexico coasts early today. These swells may cause rip current conditions. Hanna will weaken into a remnant low today."

For more than five decades, NASA has used the vantage point of space to understand and explore our home planet, improve lives and safeguard our future. NASA brings together technology, science, and unique global Earth observations to provide societal benefits and strengthen our nation. Advancing knowledge of our home planet contributes directly to America's leadership in space and scientific exploration.

Credit: 
NASA/Goddard Space Flight Center

NIST expands database that helps identify unknown compounds in milk

image: Remoroza operates a mass spectrometer, a laboratory instrument used to identify chemical compounds. "We want to find out as many details as we can about milk because it is so important, but so little is known about its chemistry," Remoroza said.

Image: 
R. Press/NIST

Got milk? Most people have seen the famous ads featuring celebrities that highlight the importance of drinking milk for building strong bones. Research shows that milk has other benefits, especially for babies, such as helping them grow and strengthening their immune systems. But scientists still don't understand exactly how milk does these things.

Solving that mystery starts with identifying the compounds in milk. To support that effort, researchers at the National Institute of Standards and Technology (NIST) have recently doubled the size of a reference library that includes examples of a certain type of carbohydrate found in milk from humans and several other animals. The expansion of the library will help scientists identify the unknown compounds in their own milk samples. The researchers published their new findings in Analytical Chemistry.

The composition of milk varies from mother to mother, but in general human milk contains 87% water and 13% nutrients, including fats, proteins and carbohydrates. Milk researchers often focus on a type of carbohydrate called oligosaccharides, one of the many different sugars in milk. These sugars are known to have a biological effect, such as providing energy for growing babies or contributing to organ development.

"Babies cannot chew or swallow solid food, so they are highly dependent on milk for growth. It's a miracle compound," said NIST chemist Connie Remoroza.

Scientists analyze oligosaccharides because determining which ones are present is the first step toward understanding how they affect cells, tissues and biological processes.

The first version of the human milk oligosaccharide (HMO) library, released in 2018, consisted of 74 oligosaccharides. To build it, Remoroza and her colleagues analyzed components in a milk sample from NIST Standard Reference Material (SRM) 1953. They used a process called liquid chromatography to separate the sample into its finer components and an instrument called a mass spectrometer to create chemical fingerprints known as mass spectra.

The team then compared those unknown spectra against a massive database of 1.3 million spectra of 31,000 compounds called the NIST Tandem Mass Spectral Library. It's part of the larger NIST Mass Spectral Library, which was updated recently in a new version called NIST20.

The team identified 80 additional oligosaccharides, bringing the total to 154. The new HMO library also incorporated milk samples from SRM 1954, Organic Contaminants in Fortified Human Milk.

Researchers were also able to identify new compounds that had never been reported in milk before, such as one type of oligosaccharide that contained 15 monosaccharide units, which are the building blocks of carbohydrates.

After liquid chromatography and mass spectrometry produce the raw data, the data are processed to extract mass spectra and identify the unknown compounds. "The identification of unknown compounds depends on state-of-the-art methods. Many oligosaccharides are now known because of improved sensitivity of the mass spectrometers, combined with NIST search software," said Remoroza.
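The matching step described here — comparing an unknown spectrum against library spectra — is commonly implemented as a similarity search over peak lists. The sketch below is a minimal, illustrative version using cosine similarity; the compound names and peak values are hypothetical, and NIST's actual search software is far more sophisticated.

```python
import math

# Toy library spectra: {m/z: relative intensity}. Values are invented for illustration.
LIBRARY = {
    "2'-fucosyllactose": {163: 0.4, 325: 1.0, 488: 0.7},
    "lacto-N-tetraose": {202: 0.3, 364: 1.0, 707: 0.5},
}

def cosine_score(a, b):
    """Cosine similarity between two sparse spectra (dicts of m/z -> intensity)."""
    peaks = set(a) | set(b)
    dot = sum(a.get(mz, 0.0) * b.get(mz, 0.0) for mz in peaks)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def best_match(query, library):
    """Return the library compound whose spectrum best matches the query."""
    return max(library.items(), key=lambda kv: cosine_score(query, kv[1]))[0]

query = {163: 0.35, 325: 1.0, 488: 0.65}   # "unknown" spectrum from a milk sample
print(best_match(query, LIBRARY))           # → 2'-fucosyllactose
```

In practice, search engines also weight peaks by m/z, match within a mass tolerance, and report a ranked hit list rather than a single best match.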

Once the unknown compounds are identified, they are included in the milk library.

Aside from human milk, Remoroza and her colleagues have also expanded the coverage of other types of mammalian milk. They analyzed four different nonhuman samples thanks to collaborations with the Smithsonian's National Zoo and Conservation Biology Institute and the Philippine Carabao Center.

The Smithsonian provided an African lion sample, and the Philippine Carabao Center provided milk from a Saanen goat and Asian water buffalo. Bovine (cow) milk samples came from NIST's very own SRMs 1549a and 1849a. The NIST researchers identified 90 oligosaccharides from these samples, 25 of which were also found in human milk.

Prior research has been done on these animal samples, but relatively few oligosaccharides were reported. Thanks to the development of new instruments, better methods for isolating oligosaccharides, and the new HMO library, scientists can now identify more oligosaccharides in their samples.

"Many researchers will hopefully be able to find the NIST milk mass spectral library useful for analyzing their samples," said Remoroza.

This milk library is especially useful to infant formula manufacturers. "Scientists are interested in identifying the oligosaccharides in milk because they want to determine if these can be added to infant formula so babies can now get the essential nutrients," said Remoroza.

The work isn't done yet. Researchers at NIST will continue to identify the different types of oligosaccharides in human milk and expand their collection of nonhuman mammalian milk. They'll soon be analyzing milk from black and white pigs (in collaboration with Mariano Marcos State University in the Philippines), rhesus monkeys (with the University of Wisconsin) and dolphins (with the National Oceanic and Atmospheric Administration, or NOAA).

Credit: 
National Institute of Standards and Technology (NIST)

How the zebrafish got its stripes

image: Zoom into the zebrafish's alternating pattern and the stripes of colour resolve into individual pigment cells like a pointillist painting.

Image: 
Adapted by Kit Yates from Wikimedia and inset by Jennifer Owen

Animal patterns - the stripes, spots and rosettes seen in the wild - are a source of endless fascination, and now researchers at the University of Bath have developed a robust mathematical model to explain how one important species, the zebrafish, develops its stripes.

In the animal kingdom, the arrangement of skin pigment cells starts during the embryonic stage of development, making pattern formation an area of keen interest not only for a lay audience but also for scientists - in particular, developmental biologists and mathematicians.

Zebrafish are invaluable for studying human disease. These humble freshwater minnows may seem to have little in common with mammals but in fact they show many genetic similarities to our species and boast a similar list of physical characteristics (including most major organs).

Zebrafish also provide fundamental insights into the complex, and often wondrous, processes that underpin biology. Studying their striking appearance may, in time, be relevant to medicine, since pattern formation is an important general feature of organ development. Therefore, a better understanding of pigment pattern formation might give us insights into diseases caused by disruption to cell arrangements within organs.

The new mathematical model devised in Bath paves the way for further explorations into pigment patterning systems, and their similarity across different species. Pigmentation in zebrafish is an example of an emergent phenomenon - one in which individuals (cells in this case), all acting according to their own local rules, can self-organise to form an ordered pattern at a scale much larger than one might expect. Other examples of emergent phenomena in biology include the flocking of starlings and the synchronised swimming seen in schools of fish.

Dr Kit Yates, the mathematician from Bath who led the study, said: "It's fascinating to think that these different pigment cells, all acting without coordinated centralised control, can reliably produce the striped patterns we see in zebrafish. Our modelling highlights the local rules that these cells use to interact with each other in order to generate these patterns robustly."

"Why is it important for us to find a correct mathematical model to explain the stripes on zebrafish?" asks Professor Robert Kelsh, co-author of the study. "Partly, because pigment patterns are interesting and beautiful in their own right. But also because these stripes are an example of a key developmental process. If we can understand what's going on in the pattern development of a fish embryo, we may be able to gain deeper insight into the complex choreography of cells within embryos more generally."

The stripes of an adult 'wild type' zebrafish are formed from pigment-containing cells called chromatophores. There are three different types of chromatophore in the fish, and as the animal develops, these pigment cells shift around on the animal's surface, interacting with one another and self-organising into the stripy pattern for which the fish are named. Occasionally, mutations appear, changing how the cells interact with each other during pattern development and resulting in spotty, leopard-skin or maze-like labyrinthine markings.

Scientists know a lot about the biological interactions needed for the self-organisation of a zebrafish's pigment cells, but there has been some uncertainty over whether these interactions offer a comprehensive explanation for how these patterns form. To test the biological theories, the Bath team developed a mathematical model that incorporated the three cell types and all their known interactions. The model has proven successful, predicting the pattern development of both wild type and mutant fish.

Mathematicians have been trying to explain how zebrafish stripes form for many years, but many previous modelling attempts have been unable to account for the broad range of observed fish mutant patterns. Jennifer Owen, the scientist responsible for building and running the model, said: "One of the benefits of our model is that, due to its complexity, it can help to predict the developmental defects of some less understood mutants. For example, our model can help to predict the cell-cell interactions that are defective in mutants such as leopard, which displays spots."
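The Bath model itself involves three interacting cell types and is far richer than anything shown here, but the general idea of local rules generating large-scale pattern can be illustrated with a classic short-range activation / long-range inhibition cellular automaton on a ring of cells. All parameters below are arbitrary, chosen only for illustration; this is not the authors' model.

```python
import random

def step(cells, r_act=2, r_inh=6, w_act=1.0, w_inh=0.8):
    """One synchronous update: pigmented neighbours within r_act promote
    pigmentation, while pigmented cells in the annulus out to r_inh suppress it."""
    n = len(cells)
    new = []
    for i in range(n):
        near = sum(cells[(i + d) % n] for d in range(-r_act, r_act + 1))
        far = sum(cells[(i + d) % n] for d in range(-r_inh, r_inh + 1)) - near
        new.append(1 if w_act * near - w_inh * far > 0 else 0)
    return new

random.seed(1)
cells = [random.randint(0, 1) for _ in range(60)]  # random initial pigmentation
for _ in range(30):
    cells = step(cells)
print("".join("#" if c else "." for c in cells))    # bands of pigmented cells
```

Each cell follows only its local rule, yet ordered bands emerge at a scale set by the interaction radii — the emergent behaviour the article describes.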

Credit: 
University of Bath

Researchers build first AI tool capable of identifying individual birds

video: Sociable weavers building their communal nest, with bounding boxes illustrating the individual identification performed by the computer.

Image: 
Katia Bougiouri and André Ferreira

New research demonstrates for the first time that artificial intelligence (AI) can be used to train computers to recognise individual birds, a task humans are unable to do. The research is published in the British Ecological Society journal Methods in Ecology and Evolution.

"We show that computers can consistently recognise dozens of individual birds, even though we cannot ourselves tell these individuals apart. In doing so, our study provides the means of overcoming one of the greatest limitations in the study of wild birds - reliably recognising individuals," said Dr André Ferreira of the Center for Functional and Evolutionary Ecology (CEFE), France, lead author of the study.

In the study, researchers from institutes in France, Germany, Portugal and South Africa describe the process for using AI to individually identify birds. This involves collecting thousands of labelled images of birds and then using these data to train and test AI models. The study represents the first successful attempt to do this in birds.

The researchers trained the AI models to recognise images of individual birds in wild populations of great tits and sociable weavers and in a captive population of zebra finches, some of the most commonly studied birds in behavioural ecology. After training, the AI models were tested on new images of these individuals and achieved an accuracy of over 90% for the wild species and 87% for the captive zebra finches.

In animal behaviour studies, individually identifying animals is one of the most expensive and time-consuming factors, limiting the scope of behaviours and the size of the populations that researchers can study. Current identification methods like attaching colour bands to birds' legs can also be stressful to the animals.

These issues could be solved with AI models. Dr André Ferreira said: "The development of methods for automatic, non-invasive identification of animals completely unmarked and unmanipulated by researchers represents a major breakthrough in this research field. Ultimately, there is plenty of room to find new applications for this system and answer questions that seemed unreachable in the past."

For AI models to be able to accurately identify individuals they need to be trained with thousands of labelled images. Companies like Facebook are able to do this for human recognition because they have access to millions of pictures of different people that are voluntarily tagged by users. But, acquiring such labelled photographs of animals is difficult and has created a bottleneck in research.

The researchers were able to overcome this challenge by building feeders with camera traps and sensors. Most birds in the study populations carried a passive integrated transponder (PIT) tag, similar to the microchips implanted in pet cats and dogs. Antennae on the bird feeders were able to read the identity of the bird from these tags and trigger the cameras.
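As a rough sketch of how such a feeder can turn tag reads into a labelled training set with no manual annotation (the tag IDs and filenames below are invented for illustration):

```python
# Hypothetical feeder events: (PIT tag ID read by the antenna, camera frame
# captured when that read triggered the camera).
EVENTS = [
    ("tag-0147", "frame_001.jpg"),
    ("tag-0033", "frame_002.jpg"),
    ("tag-0147", "frame_003.jpg"),
]

def label_images(events):
    """Group camera-trap frames by the PIT tag detected at capture time,
    yielding a labelled dataset suitable for training an identification model."""
    dataset = {}
    for tag_id, frame in events:
        dataset.setdefault(tag_id, []).append(frame)
    return dataset

dataset = label_images(EVENTS)
print(dataset["tag-0147"])   # → ['frame_001.jpg', 'frame_003.jpg']
```

The tag read supplies the ground-truth label, which is what lets the researchers accumulate thousands of labelled images automatically.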

Being able to distinguish individual animals from each other is important for the long-term monitoring of populations and protecting species from pressures such as climate change. While some species such as leopards have distinct patterns that allow humans to recognise them by eye, most species require additional visual identifiers, such as colour bands attached to birds' legs, for us to tell them apart. Even then, methods like this are extremely time consuming and error prone.

AI methods like the one shown in this study use a type of deep learning known as convolutional neural networks, which are well suited to image classification problems. In ecology, these methods have previously been used to identify animals at the species level, as well as individual primates, pigs and elephants. Until now, however, this approach had not been explored in smaller animals like birds.

The authors caution that the AI model is only able to re-identify individuals it has been shown before. "The model is able to identify birds from new pictures as long as the birds in those pictures are previously known to the models. This means that if new birds join the study population the computer will not be able to identify them." said Dr André Ferreira.
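This closed-set limitation can be illustrated with a toy nearest-neighbour classifier over feature vectors (not the authors' CNN; the embeddings and distance threshold below are invented for illustration): a query is assigned to the closest known individual, and anything too far from every known bird cannot be identified.

```python
import math

# Hypothetical feature vectors for three known birds (e.g., CNN embeddings).
KNOWN = {
    "bird-A": [0.9, 0.1, 0.2],
    "bird-B": [0.1, 0.8, 0.3],
    "bird-C": [0.2, 0.2, 0.9],
}

def identify(features, known, threshold=0.5):
    """Assign a query to the nearest known individual, or 'unknown' if no
    known bird is close enough -- mirroring the model's limitation that
    birds outside the training set cannot be identified."""
    name, d = min(((n, math.dist(features, v)) for n, v in known.items()),
                  key=lambda t: t[1])
    return name if d <= threshold else "unknown"

print(identify([0.85, 0.15, 0.25], KNOWN))   # → bird-A
print(identify([0.5, 0.5, 0.5], KNOWN))      # → unknown
```

A new bird joining the population would simply fall outside the threshold for every known class, so it must first be added to the training data before it can be recognised.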

The appearance of individual birds can change over time, for instance through moulting, and it is not yet known how this will affect the performance of the AI model. Images of the same bird taken months apart could be mistakenly identified as different individuals.

The authors add that both these limitations can be overcome with large enough datasets containing thousands of images of thousands of individuals over long periods of time, which they are currently trying to collect.

Credit: 
British Ecological Society

Hospitalized COVID-19 Patients Have Low Risk of Stroke

PHILADELPHIA--While initial reports suggested a significant risk of stroke in patients hospitalized with COVID-19, a new paper published in Stroke from Penn Medicine shows a low risk of stroke in patients hospitalized for COVID-19. Notably, the majority of afflicted patients had existing risk factors, such as high blood pressure and diabetes. These findings provide more clarity about the role COVID-19 plays in causing stroke in a diverse population of the United States.

"While there was initial concern for a high number of strokes related to COVID-19, that has not been borne out. Importantly, while the risk for stroke in COVID-19 patients is low, it's mostly tied to pre-existing conditions--so physicians who do see stroke in hospitalized COVID-19 patients must understand the virus is not the only factor, and it's necessary to follow through with normal diagnostic testing," said Brett Cucchiara, MD, an associate professor of Neurology in the Perelman School of Medicine at the University of Pennsylvania, and senior author of the paper. "However, there are still many unknowns and we need to continue investigating the linkage between stroke and COVID-19, particularly considering the racial disparities surrounding the disease."

To evaluate the risk and incidence of stroke in COVID-19 hospitalized patients, researchers analyzed data from 844 COVID-19 patients admitted to the Hospital of the University of Pennsylvania, Penn Presbyterian Medical Center, and Pennsylvania Hospital between March and May. The team also analyzed the data for cases of intracranial hemorrhage (bleeding in the brain).

Researchers found that 2.4 percent of patients hospitalized for COVID-19 had an ischemic stroke--the most common type of stroke, typically caused by a blood clot in the brain. Importantly, the majority of these stroke patients had existing risk factors, such as high blood pressure (95 percent) and a history of diabetes (60 percent), and traditional stroke mechanisms, such as heart failure. Additionally, over one-third had a history of a previous stroke.

Researchers say the results suggest that these cerebrovascular events in hospitalized COVID-19 patients are likely tied to existing conditions, and not the sole consequence of the virus. However, other factors could be at play and require continued research. While the precise mechanisms linking cerebrovascular events to COVID-19 remain uncertain at this time, it has recently been reported that the viral infection, SARS-CoV-2, causes inflammation and a hypercoagulable state (excessive blood clotting)--both could be potential mechanisms leading to stroke.

The population of patients for the study was unique as well, with a more diverse cohort compared to previous reported studies. Black patients accounted for 68 percent of the study population and of the hospitalized patients who had a stroke, 80 percent were Black.

"This aligns with the data we're seeing on the racial disparities of the virus across our country," said Cucchiara. "We worry that this could further indicate the higher risks associated with COVID-19 in Black populations, much more so than white. So far, we don't understand the disproportionate effect we're seeing, but the disparities in infection rates and outcomes is incredibly important to figure out and address."

In addition to the incidence of stroke, the research team found that 0.9 percent of hospitalized COVID-19 patients had intracranial hemorrhage. While the rate of stroke in hospitalized COVID-19 patients is comparable to studies in Wuhan, China and Italy, the rate of intracranial hemorrhage, which has not previously been reported, is higher than investigators expected. The authors note this could be tied to the increasing use of anticoagulant therapy (blood thinners) in COVID-19 patients, and requires additional exploration.

Notably, there was a relatively long duration of time from initial COVID-19 symptoms to diagnosis of ischemic stroke, at an average of 21 days. This finding is consistent with increasing evidence of a hypercoagulable state which evolves over the initial weeks of the disease in many patients, and requires further study, the authors note.

The cohort of patients had an average age of 59 years, and the mean age of the ischemic stroke patients was 64 years, with only one patient under age 50. This finding differs significantly from early reports that raised concern there might be a high rate of stroke among younger patients.

Credit: 
University of Pennsylvania School of Medicine

Potential therapeutic effects of dipyridamole in the severely ill patients with COVID-19

image: Dipyridamole bound to the SARS-CoV-2 protease Mpro. The drug was identified via virtual screening and bioassay validation and suppressed viral replication in vitro. Dipyridamole supplementation was associated with significantly decreased concentrations of D-dimers, increased lymphocyte and platelet recovery in the circulation, and markedly improved clinical outcomes in comparison to the control patients.

Image: 
Acta Pharmaceutica Sinica B

Severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection can cause acute respiratory distress syndrome, hypercoagulability, hypertension, and multiorgan dysfunction. In recent months, SARS-CoV-2 has gradually spread to more than 200 countries and regions, resulting in more than 500,000 deaths globally.

Effective antivirals with a safe clinical profile are urgently needed to improve the overall prognosis. In an analysis of a randomly collected cohort of 124 patients with COVID-19, the authors found that hypercoagulability, as indicated by elevated concentrations of D-dimers, was associated with disease severity. By virtually screening a U.S. FDA-approved drug library, the authors identified the anticoagulant agent dipyridamole (DIP) in silico, which suppressed SARS-CoV-2 replication in vitro.
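Virtual screening of this kind typically ranks library compounds by a predicted binding (docking) score against the target structure, with the best-scoring candidates passed on to lab validation. The snippet below is a minimal illustration of that ranking step only; the compounds and scores are invented, not taken from the study.

```python
# Hypothetical docking scores (kcal/mol; lower = stronger predicted binding)
# for a few approved drugs screened against an Mpro structure.
SCORES = {
    "dipyridamole": -9.2,
    "aspirin": -5.1,
    "metformin": -4.3,
    "warfarin": -7.0,
}

def top_hits(scores, n=2):
    """Rank compounds by docking score (most negative first) and return
    the n best candidates for experimental follow-up."""
    return sorted(scores, key=scores.get)[:n]

print(top_hits(SCORES))   # → ['dipyridamole', 'warfarin']
```

Real screens score thousands of compounds against a 3D protein structure and still require bioassay confirmation, as was done here before the clinical trial.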

In a proof-of-concept trial involving 31 patients with COVID-19, DIP supplementation was associated with significantly decreased concentrations of D-dimers, increased lymphocyte and platelet recovery in the circulation, and markedly improved clinical outcomes in comparison to the control patients.

Credit: 
Compuscript Ltd

Cycad plants provide an important 'ecosystem service'

image: Researchers from the University of Guam and Montgomery Botanical Center have revealed how cycad plants, including Micronesia's native Cycas micronesica, create niche soil microhabitats through changes in nitrogen, carbon, and phosphorus concentrations.

Image: 
University of Guam

A study published in the June 2020 edition of the peer-reviewed journal Horticulturae shows that cycads, which are in decline and among the world's most threatened group of plants, provide an important service to their neighboring organisms. The study, completed by researchers from the Western Pacific Tropical Research Center at the University of Guam and the Montgomery Botanical Center in Miami, found that at least two cycad species share nitrogen and carbon through the soil, thereby creating habitable environments for other organisms.

"The new knowledge from this study shows how loss of cycad plants from natural habitats may create detrimental ripple effects that negatively influence the other organisms that evolved to depend on their ecosystem services," said Patrick Griffith, executive director of the Montgomery Botanical Center.

Cycad plants host nitrogen-fixing cyanobacteria within specialized roots. The tiny microbes willingly share the newly acquired nitrogen with their hosts as their contribution to a symbiosis that benefits both organisms.

Research teams at the University of Guam have long been studying the nutrient relations of Cycas micronesica throughout its endemic range, according to Adrian Ares, associate director of the Western Pacific Tropical Research Center.

"This unique arborescent cycad species is of cultural and ecological importance, and the findings illuminate new knowledge about the ecosystem services that are provided by the plant," Ares said.

The study focused on the soil concentrations of three elements that influence the growth and development of living organisms. In soils near the cycad plants, nitrogen and carbon increased to concentrations that exceeded those of soils distant from the plants. In contrast, phosphorus concentrations were depleted in the soils near the cycad plants compared with the distant soils.

"In addition to the direct contributions of carbon and nitrogen to the bulk soils, the chemical changes imposed by the cycad plants created niche habitats that increased spatial heterogeneity in the native forests," Ares said, adding that ecosystems with high biodiversity are generally more resistant to damage by threats and more resilient after the negative impacts.

The niche spaces created by the cycad plants provide the soil food web with a microhabitat that differs from the surrounding forest soils. These soils imprinted by the cycad plants benefit the organisms that exploit spaces characterized by greater nitrogen levels relative to phosphorus and greater carbon levels relative to phosphorus. Scientists call these elemental relationships "stoichiometry," and much has been studied about the importance of these relationships to organismal health and productivity.
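The stoichiometric ratios in question are simple element-to-element quotients. A minimal sketch of how N:P and C:P ratios might be compared between the two soil zones (the concentrations below are hypothetical, not the study's measurements):

```python
# Hypothetical element concentrations (mg/kg soil) near vs. distant from cycads.
near = {"N": 2100.0, "C": 28000.0, "P": 310.0}
distant = {"N": 1400.0, "C": 19000.0, "P": 420.0}

def stoichiometry(sample):
    """N:P and C:P mass ratios used to characterise a soil microhabitat."""
    return {"N:P": sample["N"] / sample["P"], "C:P": sample["C"] / sample["P"]}

for name, sample in (("near", near), ("distant", distant)):
    r = stoichiometry(sample)
    print(f"{name}: N:P = {r['N:P']:.1f}, C:P = {r['C:P']:.1f}")
```

With nitrogen and carbon enriched and phosphorus depleted near the plants, both ratios come out higher in the cycad-imprinted soils, which is the pattern the study describes.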

The model cycad plants that were employed for the study included two of the cycad species that are native to the United States.

"This study was apropos because the Montgomery Botanical Center is positioned within Zamia integrifolia habitat in Miami, Fla., and the Western Pacific Tropical Research Center is within Cycas micronesica habitat in Mangilao, Guam," Griffith said.

The Florida species is the only cycad species that is native to the continental United States, and the Guam species is the only Cycas species native to the United States.

"Both research teams were gratified to successfully answer questions that were asked of the botanical denizens that have long resided in the respective local forests," Griffith said.

Credit: 
University of Guam

Wetter than wet: Global warming means more rain for Asian monsoon regions

image: Map of current rainfall (mm day-1) and wind (m s-1). Vectors show wind in the lower troposphere. The tropical monsoon region lies upwind of Japan.

Image: 
Tokyo Metropolitan University

Tokyo, Japan - Researchers from Tokyo Metropolitan University studied how the weather will change with global warming in Asian monsoon regions using a high-resolution climate simulation. The region is home to a large population, and the monsoons are a major driver of global water cycles. They explicitly simulated cloud formation and dissipation, and found significantly increased precipitation over the monsoon "trough," with tropical disturbances such as typhoons and concentrated water vapor playing key roles.

As the world braces itself for the impact of global warming, it is now more vital than ever to have an accurate, detailed picture of how exactly the climate will change. This applies strongly to the Asian monsoon regions, where vast amounts of annual precipitation make it an important part of global energy and water cycles. As home to a large proportion of the world population, detailed, local predictions for the scale and nature of monsoons and tropical disturbances such as typhoons/cyclones have the potential to inform disaster mitigation strategies and key policymaking.

A team led by Assistant Professor Hiroshi Takahashi sought to address this by using a high-resolution climate model known as NICAM (Non-hydrostatic ICosahedral Atmospheric Model) to study the detailed evolution of weather in the Asian monsoon regions. The model's key strength is an explicit account of cloud formation and dissipation based on physical principles, e.g. accounting for the convective effects that give rise to cumulonimbus clouds and subsequent precipitation when the air pressure drops. This level of detail allowed the team to study future precipitation patterns due to Asian monsoons with unprecedented accuracy.

The team's simulation of 30 years of global warming shows significantly elevated levels of precipitation in the monsoon "trough," a zone spanning northern India, the Indochina peninsula, and the western parts of the North Pacific. It is well known that global warming leads to more precipitation, driven mostly by more water vapor in the atmosphere. However, the different features of each region mean that the changes are far from uniform. For example, the study found that it was not clear whether "monsoon westerlies" were enhanced, but it did find more cyclones in the trough, enough to account for the increased precipitation. Concurrent with the increased precipitation, they also found distinct trends in water vapor over the monsoon region.

Furthermore, the team focused on the effect of sea surface temperature. Previous studies often applied a global, uniform increase in temperature plus the regional variations created by the El Nino effect. To separate their effects, they added them separately in two independent simulations, concluding that it was the former, a global increase in sea surface temperature, that contributed most strongly to the increased precipitation.

The effects of the monsoon season in Asia can be devastating. Examples include events close to home for the team, such as the 2018 and 2020 floods in western Japan and other East Asian countries. With these region-specific findings, their work may play an important role in global disaster mitigation, infrastructure development and policy decisions.

Credit: 
Tokyo Metropolitan University

Vacancy dynamics on CO-covered Pt(111) electrodes

Platinum arguably is the most important electrocatalyst material, not only because it is the best single element catalyst in a variety of important electrocatalytic reactions but also due to its relatively high stability. However, in the corrosive environment of real electrocatalysis systems, such as fuel cells, even platinum can structurally degrade. Moreover, the presence of strongly adsorbing species, in particular carbon monoxide (CO), can substantially increase these degradation effects.

A team led by Prof. CHEN Yanxia from the University of Science and Technology of China (USTC) of CAS, in cooperation with Prof. Olaf Magnussen, reported in situ video-STM observations of additional point defects in the presence of a dynamic CO adlayer. The STM observations presented in this work provide direct insights into the defects' dynamic behavior and formation mechanisms. The results were published in Chemical Communications on June 17.

Adsorbed CO is known to interact with Pt electrodes, causing a relaxation of Pt surface atoms and a weakening of the Pt-Pt bond. In situ scanning tunneling microscopy (STM) studies of Pt(111) in the presence of dissolved CO found that initially disordered steps were transformed into perfectly straight (111) steps by potential cycling into the CO oxidation regime. CO can increase Pt surface mobility during CO oxidation, which can reduce the number of available low-coordinated sites, leading to a restructuring of the Pt steps.

In previous work, the team studied the atomic-scale structural dynamics of CO adlayers on Pt(111) electrodes in CO-saturated 0.1 M H2SO4 using in situ video-rate scanning tunneling microscopy (STM) and density functional theory (DFT).

In the recent work, their video-STM measurements of Pt(111) in CO-saturated 0.1 M H2SO4 revealed specific defects within the apparent (1 × 1)-CO adlayer, which the team assigns to vacancies in the topmost Pt layer. The presence of these Pt vacancies, as well as the observed fluctuations at the Pt steps, shows that even under very benign conditions, i.e., for the particularly stable Pt(111) electrode surface in the CO pre-oxidation regime, the presence of CO can induce some structural degradation.

Thus, CO may affect the stability of Pt electrocatalysts already at potentials as low as 0.30 V vs. Ag/AgCl, which may be relevant, for example, in direct methanol fuel cells. The atomic-scale data obtained by video-STM provide fundamental insights into structure-activity and structure-stability relationships, which may contribute to the knowledge-based design of better Pt electrocatalysts.

In addition, their results show that for suitable systems, the dynamics of individual electrode point defects, such as vacancies, can be directly investigated in an electrochemical environment.

The observed dynamic behavior suggests a complex interaction between adsorbed CO and surface vacancies, which should also affect the electrochemical reactivity of CO and needs to be explored in future experimental and theoretical studies.

Credit: 
University of Science and Technology of China