Tech

Perinatal hypoxia associated with long-term cerebellar learning deficits and Purkinje cell misfiring

image: This photo shows Aaron Sathyanesan, Ph.D., study co-lead author; Joseph Abbah, study co-author; Srikanya Kundu, Ph.D., study co-lead author; and Vittorio Gallo, Ph.D., Children’s Chief Research Officer and the study’s senior author.

Image: 
Children’s National Health System

Oxygen deprivation associated with preterm birth leaves telltale signs on the brains of newborns in the form of alterations to cerebellar white matter at the cellular and the physiological levels. Now, an experimental model of this chronic hypoxia reveals that those cellular alterations have behavioral consequences.

Chronic sublethal hypoxia is associated with locomotor miscoordination and long-term cerebellar learning deficits in a clinically relevant model of neonatal brain injury, according to a study led by Children's National Health System researchers published online Aug. 13, 2018, by Nature Communications. Using high-tech optical and physiological methods that allow researchers to turn neurons on and off, along with an advanced behavioral tool, the research team finds that Purkinje cells fire significantly less often after injury due to perinatal hypoxia. However, an off-the-shelf medicine now used to treat epilepsy enables those specialized brain cells to regain their ability to fire, improving locomotor performance.

Step out of the car onto the pavement, hop up to the level of the curb, stride to the entrance, and climb a flight of stairs. Or, play a round of tennis. The cerebellum coordinates such locomotor performance and muscle memory, guiding people of all ages as they adapt to a changing environment.

"Most of us successfully coordinate our movements to navigate the three-dimensional spaces we encounter daily," says Vittorio Gallo, Ph.D., Children's Chief Research Officer and the study's senior author. "After children start walking, they also have to learn how to navigate the environment and the spaces around them."

These essential tasks, Gallo says, are coordinated by Purkinje cells, large neurons located in the cerebellum that are elaborately branched like interlocking tree limbs and represent the only source of output for the entire cerebellar cortex. The rate of development of the fetal cerebellum dramatically increases at a time during pregnancy that often coincides with preterm birth, which can delay or disrupt normal brain development.

"It's almost like a short circuit. Purkinje cells play a very crucial role, and when the frequency of their firing is diminished by injury the whole output of this brain region is impaired," Gallo says. "For a family of a child who has this type of impaired neural development, if we understand the nature of this disrupted circuitry and can better quantify it, in terms of locomotor performance, then we can develop new therapeutic approaches."

The research team leveraged a fully automated, computerized apparatus that looks like a ladder placed on a flat surface, encased in glass, with a darkened box at either end. Both the hypoxic and control groups had training sessions during which they learned how to traverse the horizontal ladder, coaxed out of the darkened room by a gentle puff of air and a light cue. Challenge sessions tested their adaptive cerebellar locomotor learning skills. The pads they strode across were pressure-sensitive, allowing the apparatus to analyze individual stepping patterns and predict how long it should take each subject to complete the course.

During challenge sessions, obstacles were presented in the course, announced by an audible tone. If learning was normal, then the response to the tone paired with the obstacle would be a quick adjustment of movement, without breaking stride, says Aaron Sathyanesan, co-lead author. Experimental models exposed to perinatal hypoxia showed significant deficits in associating that tone with the obstacle.

"With the control group, we saw fewer missteps during any given trial," Sathyanesan says. "And, when they got really comfortable, they took longer steps. With the hypoxic group, it took them longer to learn the course. They made a significantly higher number of missteps from day one. By the end of the training period, they could walk along all of the default rungs, but it took them longer to learn how to do so."

Purkinje cells fire two different kinds of spikes. Simple spikes are a form of constant activity as rhythmic and automatic as a heartbeat. Complex spikes, by contrast, occur less frequently. Sathyanesan and co-authors say that some of the deficits that they observed were due to a reduction in the frequency of simple spiking.

Two weeks after experiencing hypoxia, the hypoxic group's locomotor performance remained significantly worse than that of the control group, and delays in learning could still be seen five weeks after hypoxia.

Gamma-aminobutyric acid (GABA), a neurotransmitter, excites immature neurons before and shortly after birth but soon afterward switches to having an inhibitory effect within the cerebellum, Sathyanesan says. The research team hypothesizes that reduced levels of excitatory GABA during early development lead to long-term motor problems. Using an off-the-shelf drug to increase GABA levels immediately after hypoxia dramatically improved locomotor performance.

"Treating experimental models with tiagabine after hypoxic injury elevates GABA levels, partially restoring Purkinje cells' ability to fire," Gallo says. "We now know that restoring GABA levels during this specific window of time has a beneficial effect. However, our approach was not specifically targeted to Purkinje cells. We elevated GABA everywhere in the brain. With more targeted and selective administration to Purkinje cells, we want to gauge whether tiagabine has a more powerful effect on normalizing firing frequency."

Credit: 
Children's National Hospital

Scientists identify enzyme that could help accelerate biofuel production

image: The red alga C. merolae grown in culture in the laboratory

Image: 
Sousuke Imamura

Researchers at Tokyo Institute of Technology have homed in on an enzyme belonging to the glycerol-3-phosphate acyltransferase (GPAT) family as a promising target for increasing biofuel production from the red alga Cyanidioschyzon merolae.

Algae are known to store up large amounts of oils called triacylglycerols (TAGs) under adverse conditions such as nitrogen deprivation. Understanding precisely how they do so is of key interest to the biotechnology sector, as TAGs can be converted to biodiesel. To this end, scientists are investigating the unicellular red alga C. merolae as a model organism for exploring how to improve TAG production.

A study led by Sousuke Imamura at the Laboratory for Chemistry and Life Science, Institute of Innovative Research, Tokyo Institute of Technology (Tokyo Tech), has now shown that an enzyme called GPAT1 plays an important role in TAG accumulation in C. merolae even under normal growth conditions -- that is, without the need to induce stress.

Remarkably, the team demonstrated that TAG productivity could be increased by more than 56 times in a C. merolae strain overexpressing GPAT1 compared with the control strain, without any negative effects on algal growth.

Their findings, published in Scientific Reports, follow up previous research by Imamura and others that had suggested two GPATs, GPAT1 and GPAT2, may be closely involved in TAG accumulation in C. merolae.

"Our results indicate that the reaction catalyzed by the GPAT1 is a rate-limiting step for TAG synthesis in C. merolae, and would be a potential target for improvement of TAG productivity in microalgae," the researchers say.

The team plans to continue exploring how GPAT1 and GPAT2 might both be involved in TAG accumulation. An important next step will be to identify transcription factors that control the expression of individual genes of interest.

"If we can identify such regulators and modify their function, TAG productivity will be further improved because transcription factors affect the expression of a wide range of genes including GPAT1-related genes," they say. "This kind of approach based on the fundamental molecular mechanism of TAG synthesis should lead to successful commercial biofuel production using microalgae."

Credit: 
Tokyo Institute of Technology

Intensifying Hurricane Lane examined by GPM satellite

image: On Aug. 17, 2018, at 1:26 a.m. EDT (0526 UTC) the GPM core satellite revealed the location of Lane's forming eye wall. Very heavy precipitation was shown in the forming eye wall. Intense rainfall was also found falling at a rate of over 128 mm (5 inches) per hour in convective storms wrapping around the southern side of the intensifying tropical cyclone.

Image: 
NASA/JAXA/Hal Pierce

Heavy rainfall and towering cloud heights were the findings when Hurricane Lane was scanned by the Global Precipitation Measurement mission or GPM core observatory satellite on Aug. 17. Lane strengthened to a Category 2 hurricane on the Saffir-Simpson Hurricane Wind Scale.

GPM passed above Lane in the Eastern Pacific Ocean on Aug. 17, 2018, at 1:26 a.m. EDT (0526 UTC). Lane was intensifying and data collected by GPM's Microwave Imager (GMI) and Dual-Frequency Precipitation Radar (DPR) instruments revealed the location of its forming eye wall. Very heavy precipitation was shown by DPR in the forming eye wall. Intense rainfall was also found falling at a rate of over 128 mm (5 inches) per hour in convective storms wrapping around the southern side of the intensifying tropical cyclone. GPM is a joint mission between NASA and the Japan Aerospace Exploration Agency, JAXA.

At NASA's Goddard Space Flight Center in Greenbelt, Maryland, the GPM satellite's radar (DPR Ku Band) data were used to create a close-up view of the 3D structure of precipitation in Lane's forming eye wall. Powerful convective storms on Lane's southern side are also shown stretching to heights above 12.5 km (7.75 miles).

At Goddard, a simulated 3D flyby around intensifying hurricane Lane was also created. That animation was derived from GPM's radar data (DPR Ku Band). It showed the location of the forming eye wall and the towering convective storms in the feeder band on the southern side of the tropical cyclone.

At 11 a.m. EDT (1500 UTC), the center of Hurricane Lane was located near 11.2 degrees north latitude and 132.9 degrees west longitude. Lane was moving toward the west near 16 mph (26 km/h). A motion between west and west-northwest is expected during the next few days, and Lane is forecast to cross into the Central Pacific basin on Saturday.

Maximum sustained winds have increased to near 100 mph (155 kph) with higher gusts. Continued rapid strengthening is expected for the next 24 hours.

Hurricane Lane is predicted by the National Hurricane Center (NHC) to continue intensifying during the next few days as it moves into the Central Pacific Ocean. Lane, like Hurricane Hector, is predicted to pass to the south of the Hawaiian Islands. Lane is forecast to become a major hurricane within the next couple of days.

Credit: 
NASA/Goddard Space Flight Center

NASA sees Tropical Storm Bebinca along Vietnam's coast

image: On Aug. 16 10:45 a.m. EDT (1445 UTC) the MODIS instrument aboard NASA's Terra satellite captured this infrared image of Tropical Storm Bebinca. MODIS found cloud top temperatures of strong thunderstorms (yellow/green) around the center as cold as or colder than minus 80 degrees Fahrenheit (minus 62.2 Celsius).

Image: 
NASA/NRL

Tropical Storm Bebinca showed powerful, heavy rain-making thunderstorms on infrared satellite imagery when NASA's Terra satellite saw the storm along the northern Vietnam coast.

The Vietnam Hydrological Administration (VHA) has issued a tropical storm warning for coastal Vietnam.

Infrared light provides valuable temperature data to forecasters, and cloud top temperatures give clues about the highest, coldest and strongest storms within a tropical cyclone.

On Aug. 16 at 10:45 a.m. EDT (1445 UTC) the Moderate Resolution Imaging Spectroradiometer or MODIS instrument aboard NASA's Terra satellite captured an infrared image of Tropical Storm Bebinca. MODIS found cloud top temperatures of strong thunderstorms around the center and just offshore. Those cloud tops were as cold as or colder than minus 80 degrees Fahrenheit (minus 62.2 Celsius). NASA research has shown that cloud tops that cold can produce heavy rainfall.

At 11 a.m. EDT (1500 UTC) on Aug. 16, Bebinca had maximum sustained winds near 69 mph (60 knots/111 kph). It was located near 20.0 degrees north latitude and 107.3 degrees east longitude, approximately 99 nautical miles southeast of Hanoi, Vietnam. Bebinca has tracked toward the west and is expected to continue in that direction through landfall.

The Joint Typhoon Warning Center forecast calls for a landfall just south of Hanoi late on Aug. 16 or early on Aug. 17.

Credit: 
NASA/Goddard Space Flight Center

Most wear-resistant metal alloy in the world engineered at Sandia National Laboratories

image: Sandia National Laboratories researchers Michael Chandross, left, and Nic Argibay show a computer simulation used to predict the unprecedented wear resistance of their platinum-gold alloy, and an environmental tribometer used to demonstrate it.

Image: 
Sandia National Laboratories

ALBUQUERQUE, N.M. -- If you're ever unlucky enough to have a car with metal tires, you might consider a set made from a new alloy engineered at Sandia National Laboratories. You could skid -- not drive, skid -- around the Earth's equator 500 times before wearing out the tread.

Sandia's materials science team has engineered a platinum-gold alloy believed to be the most wear-resistant metal in the world. It's 100 times more durable than high-strength steel, making it the first alloy, or combination of metals, in the same class as diamond and sapphire, nature's most wear-resistant materials. Sandia's team recently reported their findings in Advanced Materials. "We showed there's a fundamental change you can make to some alloys that will impart this tremendous increase in performance over a broad range of real, practical metals," said materials scientist Nic Argibay, an author on the paper.

Although metals are typically thought of as strong, when they repeatedly rub against other metals, like in an engine, they wear down, deform and corrode unless they have a protective barrier, like additives in motor oil.

In electronics, moving metal-to-metal contacts receive similar protections with outer layers of gold or other precious metal alloys. But these coatings are expensive. And eventually they wear out, too, as connections press and slide across each other day after day, year after year, sometimes millions, even billions of times. These effects are exacerbated the smaller the connections are, because the less material you start with, the less wear and tear a connection can endure before it no longer works.

With Sandia's platinum-gold coating, only a single layer of atoms would be lost after a mile of skidding on the hypothetical tires. The ultradurable coating could save the electronics industry more than $100 million a year in materials alone, Argibay says, and make electronics of all sizes and across many industries more cost-effective, long-lasting and dependable -- from aerospace systems and wind turbines to microelectronics for cell phones and radar systems.
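As a back-of-the-envelope check (not a calculation from the Sandia paper), the equator claim and the atoms-per-mile claim are roughly consistent with each other. The sketch below assumes an atomic layer of about 0.25 nanometers, which is an assumption for illustration only:

```python
# Back-of-the-envelope check of the two claims above (illustrative only; the
# atomic-layer thickness is an assumption, not a figure from the Sandia paper).
EQUATOR_KM = 40_075        # circumference of Earth's equator
LAPS = 500                 # "skid around the equator 500 times"
MILE_KM = 1.609344
ATOM_LAYER_NM = 0.25       # assumed thickness of one atomic layer

skid_km = EQUATOR_KM * LAPS              # total skidding distance, ~20 million km
skid_miles = skid_km / MILE_KM           # ~12.5 million miles
layers_lost = skid_miles                 # one atomic layer lost per mile of skidding
depth_lost_mm = layers_lost * ATOM_LAYER_NM * 1e-6   # nm -> mm

print(f"{skid_miles:.2e} miles skidded, ~{depth_lost_mm:.1f} mm of coating lost")
```

With those assumptions, 500 laps of the equator at one atomic layer per mile works out to a few millimeters of lost material, roughly the scale of a tire tread, so the two statements hang together.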

"These wear-resistant materials could potentially provide reliability benefits for a range of devices we have explored," said Chris Nordquist, a Sandia engineer not involved in the study. "The opportunities for integration and improvement would be device-specific, but this material would provide another tool for addressing current reliability limitations of metal microelectronic components."

New metal puts an old theory to rest

You might be wondering how metallurgists for thousands of years somehow missed this. In truth, the combination of 90 percent platinum with 10 percent gold isn't new at all.

But the engineering is new. Argibay and coauthor Michael Chandross masterminded the design and the new 21st century wisdom behind it. Conventional wisdom says a metal's ability to withstand friction is based on how hard it is. The Sandia team proposed a new theory that says wear is related to how metals react to heat, not their hardness, and they handpicked metals, proportions and a fabrication process that could prove their theory.

"Many traditional alloys were developed to increase the strength of a material by reducing grain size," said John Curry, a postdoctoral appointee at Sandia and first author on the paper. "Even still, in the presence of extreme stresses and temperatures many alloys will coarsen or soften, especially under fatigue. We saw that with our platinum-gold alloy the mechanical and thermal stability is excellent, and we did not see much change to the microstructure over immensely long periods of cyclic stress during sliding."

Now they have proof they can hold in their hands. It looks and feels like ordinary platinum, silver-white and a little heavier than pure gold. Most important, it's no harder than other platinum-gold alloys, but it's much better at resisting heat and a hundred times more wear resistant.

The team's approach is a modern one that depended on computational tools. Argibay and Chandross' theory arose from simulations that calculated how individual atoms were affecting the large-scale properties of a material, a connection that's rarely obvious from observations alone. Researchers in many scientific fields use computational tools to take much of the guesswork out of research and development.

"We're getting down to fundamental atomic mechanisms and microstructure and tying all these things together to understand why you get good performance or why you get bad performance, and then engineering an alloy that gives you good performance," Chandross said.

A slick surprise

Still, there will always be surprises in science. In a separate paper published in Carbon, the Sandia team describes the results of a remarkable accident. One day, while measuring wear on their platinum-gold, an unexpected black film started forming on top. They recognized it: diamond-like carbon, one of the world's best man-made coatings, slick as graphite and hard as diamond. Their creation was making its own lubricant, and a good one at that.

Diamond-like carbon usually requires special conditions to manufacture, and yet the alloy synthesized it spontaneously.

"We believe the stability and inherent resistance to wear allows carbon-containing molecules from the environment to stick and degrade during sliding to ultimately form diamond-like carbon," Curry said. "Industry has other methods of doing this, but they typically involve vacuum chambers with high temperature plasmas of carbon species. It can get very expensive."

The phenomenon could be harnessed to further enhance the already impressive performance of the metal, and it could also potentially lead to a simpler, more cost-effective way to mass-produce premium lubricant.

Credit: 
DOE/Sandia National Laboratories

Low bandwidth? Use more colors at once

image: New ultrathin nanocavities with embedded silver strips have streamlined color production, and therefore broadened possible bandwidth, for both today's electronics and future photonics.

Image: 
Purdue University image/Alexander Kildishev

The rainbow is not just colors - each color of light has its own frequency. The more frequencies you have, the higher the bandwidth for transmitting information.

Using only one color of light at a time on an electronic chip currently limits technologies based on sensing changes in scattered color, such as detecting viruses in blood samples, or processing airplane images of vegetation when monitoring fields or forests.

Putting multiple colors into service at once would mean deploying multiple channels of information simultaneously, broadening the bandwidth of not only today's electronics, but also of the even faster upcoming "nanophotonics" that will rely on photons - fast and massless particles of light - rather than slow and heavy electrons to process information with nanoscale optical devices.

IBM and Intel have already developed supercomputer chips that combine the higher bandwidth of light with traditional electronic structures.

As researchers engineer solutions for eventually replacing electronics with photonics, a Purdue University-led team has simplified the manufacturing process that allows utilizing multiple colors at the same time on an electronic chip instead of a single color at a time.

The researchers also addressed another issue in the transition from electronics to nanophotonics: The lasers that produce light will need to be smaller to fit on the chip.

"A laser typically is a monochromatic device, so it's a challenge to make a laser tunable or polychromatic," said Alexander Kildishev, associate professor of electrical and computer engineering at Purdue University. "Moreover, it's a huge challenge to make an array of nanolasers produce several colors simultaneously on a chip."

This requires downsizing the "optical cavity," which is a major component of lasers. For the first time, researchers from Purdue, Stanford University and the University of Maryland embedded so-called silver "metasurfaces" - artificial materials thinner than light waves - in nanocavities, making lasers ultrathin.

"Optical cavities trap light in a laser between two mirrors. As photons bounce between the mirrors, the amount of light increases to make laser beams possible," Kildishev said. "Our nanocavities would make on-a-chip lasers ultrathin and multicolor."

Currently, a different thickness of an optical cavity is required for each color. By embedding a silver metasurface in the nanocavity, the researchers achieved a uniform thickness for producing all desired colors. Their findings appear in Nature Communications.

"Instead of adjusting the optical cavity thickness for every single color, we adjust the widths of metasurface elements," Kildishev said.

Optical metasurfaces could also ultimately replace or complement traditional lenses in electronic devices.

"What defines the thickness of any cell phone is actually a complex and rather thick stack of lenses," Kildishev said. "If we can just use a thin optical metasurface to focus light and produce images, then we wouldn't need these lenses, or we could use a thinner stack."

Credit: 
Purdue University

Quantum material is promising 'ion conductor' for research, new technologies

image: This graphic depicts new research in which lithium ions are inserted into the crystal structure of a quantum material called samarium nickelate, suggesting a new avenue for research and potential applications in batteries, 'smart windows' and brain-inspired computers containing artificial synapses.

Image: 
Purdue University image/Yifei Sun

Researchers have shown how to shuttle lithium ions back and forth into the crystal structure of a quantum material, representing a new avenue for research and potential applications in batteries, "smart windows" and brain-inspired computers containing artificial synapses.

The research centers on a material called samarium nickelate, which is a quantum material, meaning its performance taps into quantum mechanical interactions. Samarium nickelate is in a class of quantum materials called strongly correlated electron systems, which have exotic electronic and magnetic properties.

The researchers "doped" the material with lithium ions, meaning the ions were added to the material's crystal structure.

The addition of lithium ions causes the crystal to expand and increases the material's conduction of the ions. The researchers also learned that the effect works with other types of ions, particularly sodium ions, pointing to potential applications in energy storage.

Findings are detailed in a paper appearing this week in Proceedings of the National Academy of Sciences.

"The results highlight the potential of quantum materials and emergent physics in the design of ion conductors," said Shriram Ramanathan, a Purdue University professor of materials engineering who is leading the research. "There is a lot of research now going on to identify solid-state ion conductors for building batteries, for example. We showed that this general family of materials can hold these ions, so we established some general principles for the design of these sorts of solid-state ion conductors. We showed that ions like lithium and sodium can move through this solid material, and this opens up new directions for research."

Applying a voltage caused the ions to occupy spaces between atoms in the crystal lattice of the material. The effect could represent a more efficient method to store and conduct electricity. Such an effect could lead to new types of batteries and artificial synapses in "neuromorphic," or brain-inspired, computers. Moreover, the ions remained in place after the current was turned off, a "non-volatile" behavior that might be harnessed for computer memory.

Adding lithium ions to the crystal structure also changes the material's optical properties, suggesting potential applications as coatings for "smart windows" whose light transmission properties are altered when voltage is applied.

The research paper's lead authors are Purdue materials engineering postdoctoral research associate Yifei Sun and Michele Kotiuga, a postdoctoral fellow in the Department of Physics and Astronomy at Rutgers University. The work was performed by researchers at several research institutions. A complete listing of co-authors is available in the abstract. To develop the doping process, materials engineers collaborated with Vilas Pol, a Purdue associate professor of chemical engineering and materials engineering, and Purdue graduate student Dawgen Lim.

The research findings demonstrated behavior related to the "Mott transition," a quantum mechanical effect describing how the addition of electrons can change the conducting behavior of a material.

"As we add more electrons to the system the material becomes less and less conducting, which makes it a very interesting system to study, and this effect can only be explained through quantum mechanics," Ramanathan said.

Kotiuga's contribution to the work was to study the electronic properties of lithium-doped samarium nickelate as well as the changes to the crystal structure after doping.

"My calculations show that undoped samarium nickelate is a narrow-gapped semiconductor, meaning that even though it is not metallic, electrons can be excited into a conducting state without too much trouble," she said. "As lithium is added to samarium nickelate the lithium ion will bind to an oxygen and an electron localizes on a nearby nickel-oxygen octahedron, and when an electron has localized on every nickel-oxygen octahedron the material is converted into an insulator. This is a rather counterintuitive result: the added electrons to the system make the material more insulating."

The material's crystal structure was characterized using a synchrotron-radiation light source research facility at Argonne National Laboratory.

The researchers had been working on the paper for about two years and plan to further explore the material's quantum behavior and potential applications in brain-inspired computing.

Credit: 
Purdue University

Female mosquitoes get choosy quickly to offset invasions

Certain female mosquitoes quickly evolve more selective mating behavior when faced with existential threats from other invasive mosquito species, with concurrent changes to certain genetic regions, according to new research from North Carolina State University. The findings shed light on the genetics behind insect mating behavior and could have implications for controlling mosquito pests that plague humans.

At issue is the displacement of Aedes aegypti (yellow fever) mosquitoes by a cousin species, Aedes albopictus (Asian tiger), which occurred in the southeastern United States in the 1980s. In this "battle of the Aedes," the invading A. albopictus decimated A. aegypti populations throughout the Southeast, leaving smaller A. aegypti populations in Key West, Florida, Arizona and a few other southern locales. A. aegypti mosquitoes carry and spread many diseases that harm humans, including Zika, dengue fever and chikungunya.

Part of the takeover was attributed to how the larvae of each species grew; A. albopictus mosquitoes seemed to be able to outcompete the native mosquitoes. But another factor also played a huge role in the battle: When A. aegypti females mated with A. albopictus males - a genetic no-no - those females became sterile for life, a process called "satyrization." A. albopictus females didn't face the same fate; no offspring were produced when they mated with A. aegypti males, but they were later able to be fertile when mating with males of their own species.

Martha Burford Reiskind, research assistant professor in the Department of Applied Ecology at NC State and corresponding author of a paper describing the research, and colleagues wanted to understand more about how A. aegypti females respond to this type of threat and what happens in their genetic blueprint as their responses change.

The researchers found that A. aegypti females quickly - in just six generations - became more picky when selecting mates, eschewing A. albopictus males for males of their own species. This response occurred when A. aegypti females were exposed to cousin males in the lab and in the wild. Geographic location didn't seem to make a difference: The female mosquitoes in both Florida and Arizona exhibited similar genetic changes.

"We wanted to know what genes were involved in the evolution of this choosiness in female A. aegypti mosquitoes," Burford Reiskind said. "We can now look at certain gene regions and feel confident that they are involved in mating behavior."

Choosiness had its costs, though. Burford Reiskind said choosy female A. aegypti mosquitoes mated later in their brief lifespans - most live for two or three weeks - and were generally smaller.

"Invasive species are often seen as better competitors for scant resources, but that doesn't seem to be the case for these mosquitoes," Burford Reiskind said. "This study suggests other mechanisms are at play."

Burford Reiskind hopes to continue learning more about the genes involved in mating behaviors by conducting a larger-scale study, perhaps in places where A. aegypti and A. albopictus mosquitoes live in relatively equal densities.

Credit: 
North Carolina State University

Individuals shot by police exhibit distinct patterns of recent prior arrests - and hospitalizations

Ann Arbor, August 16, 2018 – A study published in the American Journal of Preventive Medicine found that more than 50 percent of people with assault-related firearm injuries or legal intervention (LI) firearm injuries, meaning injuries resulting from law enforcement actions, and over 25 percent of individuals with self-inflicted or unintentional firearm injuries were arrested, hospitalized, or both in the two years prior to being shot. While LI firearm injuries are comparatively rare, they are on the rise, increasing 10 percent nationally in the United States over the last decade. The study's findings contribute important evidence that can be used to reduce and prevent these injuries and deaths.

“Looking at arrest and hospital records in Seattle, we identified some patterns among intent-specific firearm injuries. Notably, individuals shot by police had arrest histories similar to those shot in assaults and homicides and medical histories similar to those with self-inflicted firearm injuries. The individuals who are injured in LI encounters also exhibit disruptive, impulsive, and conduct-related disorders more frequently than individuals in our control group and increased risk of interpersonal and self-directed violence,” explained Brianna M. Mills, PhD, Harborview Injury Prevention and Research Center (HMC), University of Washington, Seattle, WA, USA. “Our results show how many firearm injuries occur after a series of encounters with institutions that are meant to help individuals in crisis, but unfortunately have failed to deter the situation leading to the injury. We believe that each of these encounters represents an opportunity for more effective interventions and long-term solutions.”

The case-control study conducted by Dr. Mills and colleagues identified 763 individuals who sustained firearm injuries and 335 who had motor vehicle passenger injuries in Seattle over a five-year period (2010-2014). Data were obtained from the HMC trauma registry and Washington State death records. Diagnoses from prior hospitalizations were extracted from the Comprehensive Hospital Abstract Reporting System (CHARS) maintained by the Washington State Department of Health, which documents inpatient treatment and observation stays in all state-licensed acute care, long-term, and cancer specialty hospitals in Washington, including psychiatric units. Arrest data came from the Washington State Identification System criminal history database.

Patients were grouped by firearm injury cause: assault-related (58.1 percent), self-inflicted (28.6 percent), unintentional (9.3 percent), and legal intervention (4.1 percent). The control group comprised people injured as passengers in motor vehicle collisions. The records for the two years prior to the incidents were mined for information about misdemeanor and felony arrests, as well as diagnoses of substance abuse (alcohol, marijuana, and other drugs), and mental disorders (psychosis, depression/anxiety, and impulsivity/conduct disorder). By comparing arrest history, substance use, and mental disorders within the same geographic region and time period in a single investigation, the study helped to clarify how each contributes to the risk of sustaining intent-specific firearm injuries. It is important to note that the findings should be interpreted as providing information on non-causal markers of increased risk, rather than causal risk factors.

Key findings:

In the two years prior to being shot, individuals with fatal and non-fatal LI firearm injuries were:

22 times more likely to have an impulsivity/conduct disorder diagnosis than passengers injured in a motor vehicle collision

11 times more likely to have been diagnosed with a substance use disorder than passengers injured in a motor vehicle collision

7 times more likely to have a prior felony arrest, depression/anxiety diagnosis, or psychosis diagnosis than passengers injured in a motor vehicle collision
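For context, "X times more likely" figures from a case-control design of this kind are odds ratios: the odds of a given prior diagnosis or arrest among the firearm-injury cases divided by the odds among the motor vehicle passenger controls. The study's published estimates come from adjusted regression models, so the unadjusted sketch below, with invented counts, is only meant to show the basic arithmetic:

```python
def odds_ratio(cases_exposed, cases_unexposed, controls_exposed, controls_unexposed):
    """Odds ratio from a 2x2 case-control table: (case odds) / (control odds)."""
    return (cases_exposed / cases_unexposed) / (controls_exposed / controls_unexposed)

# Hypothetical counts, chosen only to illustrate the arithmetic -- NOT the study's data.
# "Exposed" here means, e.g., a prior impulsivity/conduct disorder diagnosis.
example = odds_ratio(cases_exposed=11, cases_unexposed=20,
                     controls_exposed=8, controls_unexposed=320)
print(f"Odds ratio: {example:.1f}")   # -> 22.0 with these invented numbers
```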

While there is increasing public awareness of the scope and burden of firearm injury in the US, police shootings, in particular, have received relatively less attention in research over the past decades than other types of firearm injuries. Because they are occurring with increasing frequency, with profound negative implications for individuals, their families, and communities where they occur, this study is an important step for building − and using − an evidence base to understand the connections and distinctions between different types of firearm injuries.

These findings highlight a potential role for medical and law enforcement professionals in preventing LI injuries as they encounter vulnerable individuals. Screening instruments for mental disorders or risk of firearm violence can be validated and potentially adapted for use in multiple settings. Tools to identify and utilize intervention opportunities during medical or legal encounters may help prevent later law enforcement confrontations. Coordination between police and mental health services, including de-escalation training for officers tasked as first responders and specialized crisis response teams with decision-making authority given to qualified mental health professionals, may lead to better avoidance of confrontations between police and individuals in crisis. Police departments, including Seattle’s, have begun to make some of these changes with good results.

“Moving the conversation forward requires understanding the connections and distinctions between different types of firearm injuries so we can create focused and effective interventions,” commented Dr. Mills. “We invite other researchers to collaborate with us to replicate the study in other cities and determine if some of the patterns we uncovered are generalizable or specific to Seattle.”

Credit: 
Elsevier

ShareBackup could keep data in the fast lane

image: Rice University computer scientist Eugene Ng led the development of ShareBackup, a hardware and software solution to help data centers recover from failures without slowing applications.

Image: 
Jeff Fitlow/Rice University

HOUSTON - (Aug. 16, 2018) - Anyone who has ever cursed a computer network as it slowed to a crawl will appreciate the remedy offered by scientists at Rice University.

Rice computer scientist Eugene Ng and his team say their solution will keep data on the fast track when failures inevitably happen.

Ng introduced ShareBackup, a strategy that would allow shared backup switches in data centers to take on network traffic within a fraction of a second after a software or hardware switch failure.

He will present a peer-reviewed paper on the work this week at the SIGCOMM 2018 conference in Budapest, Hungary. The paper is online and available for download.

Ng said the idea would solve a common annoyance among data professionals, scientists and everyone who relies on a network to deliver results day in and day out.

"A data network consists of servers and network switches," said Ng, a professor of computer science and electrical and computer engineering. "Switches move data packets to where they need to go. But things fail, especially in large-scale data centers with thousands of pieces of hardware."

The usual response to a failed switch is to shunt the flow of data to another line. "Generally, the network has multiple paths for connecting servers so, just like if there's a closure on the highway, we'd drive around it. This is a conventional, natural approach that makes a lot of sense: You reroute around the failure to get where you need to go."

But sometimes that other road is congested and everything slows down. "Data centers aren't the internet; they're not about people surfing websites," Ng said. "They're about supporting data-intensive applications like data mining or machine learning. And a lot of these applications have stringent performance deadlines, so blindly rerouting traffic could be the wrong thing to do in a data center."

Rather than the expensive option of installing redundant switches throughout a network, the Ng lab's strategy would put fast switches and software in strategic locations that could pick up the traffic from a failed switch in a microsecond. When that problem is resolved, the team's software makes the backup switch available to handle another failure.

The switch is fast enough -- the failure-recovery time is 0.73 milliseconds, including latency from hardware and control systems -- that most users would never know that part of the system had failed.
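The bookkeeping behind that idea, a small shared pool of backup switches that a controller assigns to whichever production switch fails and reclaims once the failure is resolved, can be sketched in a few lines. This is a conceptual illustration only, not code from the ShareBackup system, and the class and switch names are made up:

```python
class BackupPool:
    """Toy model of shared-backup failover (illustrative only)."""
    def __init__(self, backup_switches):
        self.free = list(backup_switches)      # idle backups, shared across many switches
        self.assigned = {}                     # failed switch -> backup covering it

    def on_failure(self, failed_switch):
        if not self.free:
            raise RuntimeError("no backup available; fall back to rerouting")
        backup = self.free.pop()
        self.assigned[failed_switch] = backup  # backup takes over the failed switch's traffic
        return backup

    def on_repair(self, failed_switch):
        backup = self.assigned.pop(failed_switch)
        self.free.append(backup)               # backup returns to the shared pool

pool = BackupPool(["bkp-0", "bkp-1"])          # a few backups cover many production switches
print(pool.on_failure("tor-17"))               # tor-17 fails -> bkp-1 takes its traffic
pool.on_repair("tor-17")                       # after reboot/diagnosis, bkp-1 is freed
```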

"The reality is that the fraction of devices that fail at any given time is very small, and most of these failures can be addressed by things like rebooting the device," Ng said. "Sometimes the software gets screwed up and a simple power cycle will bring it back. These failures may also not last long.

"These are the characteristics we're trying to exploit," he said. "Because of that, we can get away with having very few devices back up a large number of devices."

Ng said ShareBackup could save data centers time and money not only by maintaining full bandwidth but also by helping to analyze problems, including misconfigurations that commonly lead to network failure.

"Part of our work is to help data centers figure out what went wrong in the network," he said. "Once the backup is activated, you can take the failed device out of the production network and test it to identify which component caused the problem.

"Now, if we take two devices out and can't figure out which went bad, both need to be replaced," he said. "It's very likely only one of the devices is having the problem. Our software can diagnose these devices in a semiautomatic manner, and if one of the parts is good, it can be reinstated."

Credit: 
Rice University

MSU plant sciences faculty part of international discovery in wheat genome sequence

image: Hikmet Budak, Montana State University professor and Montana Plant Sciences Endowed Chair with the Department of Plant Sciences and Plant Pathology, was one of 200 international collaborators who published an article in Science detailing the entire genome sequence of bread wheat.

Image: 
MSU Photo by Adrian Sanchez-Gonzalez

BOZEMAN - A Montana State University faculty member dedicated to researching cereal genetics and genomics for Montana farmers is part of an international research team that published an article detailing the entire genome sequence of bread wheat. The International Wheat Genome Sequencing Consortium published the article in the prestigious journal Science this week. It is the result of 13 years of collaborative international research.

The article will pave the way for the production of wheat varieties better adapted to climate challenges, with higher yields, enhanced nutritional quality and improved sustainability, according to the consortium.

Hikmet Budak, Winifred Asbjornson Plant Sciences Chair in the MSU College of Agriculture and a member of the IWGSC board of directors, was one of more than 200 scientists from 73 research institutions in 20 countries who authored the research article.

"The publication has so many implications not only in science, but in countries facing food insecurity all over the world," Budak said. "This could lead to higher incomes for farmers, better nutrition for world populations and new wheat varieties. The research also offers immense potential for the scientific world to create new discoveries when it comes to agricultural food production and security."

At MSU, Budak and colleagues in the Department of Plant Sciences and Plant Pathology recently sequenced a Montana barley cultivar, Hackett, and they're currently working on sequencing a Montana winter wheat cultivar, Yellowstone.

Sequencing the bread wheat genome was long considered an impossible task, due to its enormous size - five times larger than the human genome - and complexity - bread wheat has three sub-genomes, and more than 85 percent of the genome is composed of similar elements. The article presents the reference genome of the bread wheat variety Chinese Spring. The DNA sequence ordered along the 21 wheat chromosomes is the highest quality genome sequence produced to date for wheat.

According to the Food and Agriculture Organization, in order to meet future demands of a projected world population of 9.6 billion by 2050, wheat productivity must increase by 1.6 percent each year. In order to preserve biodiversity, water and nutrient resources, the majority of this increase must be achieved through crop and trait improvement on currently cultivated land.
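To put the 1.6 percent figure in perspective, a rough compounding calculation (illustrative only, assuming the target applies every year from 2018 through 2050) implies roughly a two-thirds increase in total yield:

```python
# Rough compounding check (illustrative; assumes the 1.6 percent target applies
# each year over the 32 growing seasons from 2018 to 2050).
annual_gain = 0.016
years = 2050 - 2018
total_increase = (1 + annual_gain) ** years - 1
print(f"Cumulative increase needed by 2050: {total_increase:.0%}")   # about 66%
```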

A key crop for food security, wheat is the staple food of more than a third of the global human population and accounts for almost 20 percent of the total calories and protein consumed by humans worldwide, more than any other single food source, according to the FAO.

With the reference genome sequence now completed, crop breeders have at their disposal new tools to address these challenges, as they will be able to more rapidly identify genes and regulatory elements underlying complex agronomic traits such as yield, grain quality, resistance to fungal diseases and tolerance to abiotic stress. In turn, they can produce hardier wheat varieties.

It's expected that the availability of a high-quality reference genome sequence will boost wheat improvement over the next decades, with benefits similar to those observed with maize and rice after their reference sequences were produced, according to the IWGSC.

"The publication of the wheat reference genome is the culmination of the work of many individuals who came together under the banner of the IWGSC to do what was considered impossible," said Kellye Eversole, executive director of the IWGSC. "The method of producing the reference sequence and the principles and policies of the consortium provides a model for sequencing large, complex plant genomes and reaffirms the importance of international collaborations for advancing food security."

Credit: 
Montana State University

New manufacturing technique could improve 'coffee ring' problem in inkjet printing

image: For figure A, there is particle transport to the apex of the target droplet and a dense center deposit is formed. For figure B, the final mapped deposit is more uniform.

Image: 
Langmuir

BINGHAMTON, N.Y. - A new manufacturing technique developed by researchers from Binghamton University, State University of New York may be able to avoid the "coffee ring" effect that plagues inkjet printers.

The outer edges of the ring that a coffee mug leaves behind are darker than the inside of the ring. That's because the solute is separated from the liquid during the evaporation process. That's what's called the coffee ring effect.

This same effect can happen with printers as well. When printing a text document, each letter consists of an outline and a filled interior. While it may not be visible to the untrained eye, the outlines are actually darker than the interiors. This happens during the drying process, just like the coffee ring effect, and researchers have wanted to find a way to remove this difference in pigmentation.

Assistant Professor Xin Yong, Professor Tim Singler and Associate Professor Paul Chiarot from Binghamton University's Mechanical Engineering Department recently made a discovery that could eliminate the difference between the outline and the inside that happens during evaporation.

They published their work, titled "Interfacial Targeting of Sessile Droplets Using Electrospray," in the journal Langmuir.

The team of experimentalists and theorists worked together to study the flow inside and on the surface of drying droplets to understand more about the coffee ring effect and how to avoid it. Using a unique technique called electrospray, they applied a high voltage to a liquid to produce an aerosol to add nanoparticles to the droplets. Nanoparticles are often useful for researchers due to their small size and large surface area.

This technique allowed for a more even dispersal of ink and stopped the coffee ring effect.

"Not only does this study help us understand how to avoid the coffee ring effect, it also tells us more about the phenomena that occur during evaporation that lead to the coffee ring effect," said Yong.

In the research paper, the team said, "To our knowledge, we are among the first to use electrospray for this purpose to explore interfacial particle transport and to elucidate the role of surfactants in governing particle motion and deposit structure."

While this difference in quality may not affect the standard user's print job, it will have a substantial effect on the capabilities of additive manufacturing and biotechnology, where printing films in a uniform way is extremely important.

Credit: 
Binghamton University

Mapping the future direction for quantum research

The way research in quantum technology will be taken forward has been laid out in a revised roadmap for the field.

Published today in the New Journal of Physics, leading European quantum researchers summarise the field's current status, and examine its challenges and goals.

In the roadmap:

Dr Rob Thew and Professor Nicolas Gisin look at quantum communication, and when we may expect to see long-distance quantum key distribution networks

Professor Frank Wilhelm, Professor Daniel Esteve, Dr Christopher Eichler, and Professor Andreas Wallraff examine the state of quantum computing, and the timescale for delivering large-scale quantum computational systems

Professor Jens Eisert, Professor Immanuel Bloch, Professor Maciej Lewenstein and Professor Stefan Kuhr trace quantum simulation from Richard Feynman's original theory through to the advances in technology needed to meet future challenges, and the pivotal role quantum simulators could play

Professor Fedor Jelezko, Professor Piet Schmidt, and Professor Ian Walmsley consider the field of quantum metrology, sensing and imaging, and look at a variety of physical platforms to implement this technology

Professor Frank Wilhelm and Professor Steffen Glaser examine quantum control, and the long-term goal of gaining a thorough understanding of optimal solutions, and developing a software layer enhancing the performance of quantum hardware

Professor Antonio Acín and Professor Harry Buhrman highlight the current status and future challenges of quantum software and theory, structured along three main research directions: quantum software for computing, quantum software for networks, and theory.

Introducing the collection of articles, Dr. Max Riedel and Professor Tommaso Calarco note: "Within the last two decades, quantum technologies have made tremendous progress, moving from Nobel Prize-winning experiments on quantum physics into a cross-disciplinary field of applied research.

"One success factor for the rapid advancement of quantum technology is a well-aligned global research community with a common understanding of the challenges and goals. In Europe, this community has profited from several EC funded coordination projects, which, among other things, have coordinated the creation of a quantum technology roadmap.

"We hope this updated summary proves useful to our colleagues in the research community, as well as anyone with a broader interest in the progress of quantum technology."

Credit: 
IOP Publishing

Diving robots find Antarctic seas exhale surprising amounts of carbon dioxide in winter

image: Researcher Stephen Riser (left) drops a float into Antarctica's Southern Ocean during a 2016-2017 cruise.

Image: 
Greta Shum/ClimateCentral

The open water nearest the sea ice surrounding Antarctica releases significantly more carbon dioxide in winter than previously believed, a new study has found. Researchers conducting the study used data gathered over several winters by an array of robotic floats diving and drifting in the Southern Ocean around the southernmost continent.

The effort is part of the Southern Ocean Carbon and Climate Observations and Modeling (SOCCOM) project, a six-year, $21 million program funded by the National Science Foundation (NSF) through its Office of Polar Programs (OPP) and based at Princeton University. NOAA and NASA provide additional support for SOCCOM, and NSF's OPP coordinates all U.S. research on the southernmost continent through the U.S. Antarctic Program.

As part of the SOCCOM project, researchers from the University of Washington, Princeton University and several other oceanographic institutions wanted to learn how much carbon dioxide was transferred by the surrounding seas. The floats made it possible to gather data during the peak of the Southern Hemisphere's winter from a place that remains poorly studied, despite its role in regulating the global climate.

"These results came as a really big surprise, because previous studies found that the Southern Ocean was absorbing a lot of carbon dioxide," said lead author Alison Gray, an assistant professor of oceanography at the University of Washington. "If that's not true, as these data suggest, then it means we need to rethink the Southern Ocean's role in the carbon cycle and in the climate."

From data-poor to data-rich

The Southern Ocean region plays a unique role in the global climate. It is one of the few places where water that has spent centuries in the deep ocean travels all the way up to the surface to rejoin the surface currents and connect with the atmosphere.

Carbon atoms move between rocks, rivers, plants, oceans and other sources in a planet-scale life cycle. Learning the rate of these various transfers helps to predict the long-term levels of carbon dioxide.

Obtaining data from this region is extremely difficult. The Southern Ocean is among the world's most turbulent bodies of water, and storms in Antarctica are some of the fiercest on the planet. In winter, the circumpolar current and winds rip around the continent with no landmass to slow them. According to Gray, the average storm lasts four days, and the average time between storms is seven days.

Yet data from the Southern Ocean are vital to building a comprehensive global picture of how atmospheric carbon dioxide interacts with the polar oceans.

"Antarctic waters, until now, have been a data-poor region for these kinds of measurements," said Peter Milne, OPP's program manager for ocean and atmospheric science. "SOCCOM, using technologies that previously were unavailable to researchers, is already proving its worth by gathering information that otherwise would remain largely unobtainable."

Previous winter measurements in the region had come mainly from ships traveling across the Drake Passage between South America and Antarctica to supply polar research stations. Those data were few and far between.

"After four years of SOCCOM, the vast majority of information about the chemistry of the Southern Ocean is coming from these floats," Gray said. "We have more measurements from the past few years than all the decades that came before."

Floating data

The floating instruments that collected the new observations drift with the currents and control their buoyancy to collect observations at different depths. The instruments dive down to 1 kilometer and float with the currents for nine days. Next, they drop even farther, to 2 kilometers, and then rise back to the surface while measuring water properties. After surfacing they beam their observations back to shore via satellite.

Unlike more common Argo floats, which only measure ocean temperature and salinity, the SOCCOM floats also monitor dissolved oxygen, nitrogen and pH, or the relative acidity of the water. The new paper uses the pH measurements to calculate the amount of dissolved carbon dioxide, and then uses that to figure out how strongly the water is absorbing or emitting carbon dioxide to the atmosphere.
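The final step in that chain, turning a sea-air pCO2 difference into a flux, is commonly done with a bulk formula that multiplies the pCO2 difference by a gas solubility and a wind-speed-dependent transfer velocity. The sketch below is a heavily simplified illustration of that kind of formula with assumed cold-water coefficients; the SOCCOM analysis derives pCO2 from float pH and estimated alkalinity and uses more careful parameterizations, so the numbers here are placeholders only:

```python
def co2_flux_mol_m2_yr(u10_m_s, delta_pco2_uatm):
    """Very simplified bulk sea-air CO2 flux (positive = outgassing to the atmosphere).

    Uses a common quadratic wind-speed parameterization of the gas transfer
    velocity (k = 0.251 * U10^2 * (Sc/660)^-0.5, in cm/hr) and rough cold-water
    values for the CO2 Schmidt number and solubility. These coefficients are
    approximations for illustration, not the ones used in the SOCCOM analysis.
    """
    sc = 2100.0                                  # approx. CO2 Schmidt number near 0 C
    k_cm_hr = 0.251 * u10_m_s**2 * (sc / 660.0) ** -0.5
    k_m_yr = k_cm_hr * 0.01 * 24 * 365           # cm/hr -> m/yr
    k0 = 0.063e-3                                # approx. solubility, mol m^-3 uatm^-1, cold seawater
    return k_m_yr * k0 * delta_pco2_uatm

# Winter-like illustration: strong winds, surface water ~30 uatm above the atmosphere.
print(f"{co2_flux_mol_m2_yr(u10_m_s=10.0, delta_pco2_uatm=30.0):.2f} mol C m^-2 yr^-1")
```

With these placeholder inputs the formula gives an outgassing flux of a couple of moles of carbon per square meter per year, which conveys the sense in which water arriving at the surface with excess dissolved carbon "exhales" CO2 to the atmosphere.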

Looking at circles of increasing distance from the South Pole, the authors find that in winter the open water next to the sea-ice covered waters around Antarctica is releasing significantly more carbon dioxide than expected to the atmosphere.

"It's not surprising that the water in this region is outgassing, because the deep water is exceptionally rich in carbon," Gray said. "But we underestimated the magnitude of the outgassing because we had so little data from the winter months. That means the Southern Ocean isn't absorbing as much carbon as we thought."

The newly published study analyzes data collected by 35 floats between 2014 and 2017. Gray is now analyzing newer data from more instruments to identify seasonal or multiyear trends, where the patterns might change from one year to the next.

"There is definitely strong variability on decadal scales in the Southern Ocean," Gray said. "And the models are really all over the place in this region. The SOCCOM floats are now providing data at times and places where before we had virtually nothing, and that is invaluable for constraining the models and understanding these trends."

Paper co-author and SOCCOM director Jorge Sarmiento at Princeton University said: "This is science at its most exciting -- a major challenge to our current understanding made possible by extraordinary observations from the application of new technologies to study previously unexplored regions of the ocean."

Sarmiento added that observations from these technologies have implications for understanding the global carbon cycle.

"We find that the Southern Ocean is currently near neutral with respect to removal of carbon from the atmosphere, contrary to previous studies, which suggest there is a large uptake of carbon by the Southern Ocean," Sarmiento said. "These results can be reconciled if there is a corresponding unobserved carbon uptake waiting to be discovered somewhere else in the ocean."

Gray conducted the research as a postdoctoral researcher in Sarmiento's research group.

The paper with the study's results will be published today, Aug. 14, in Geophysical Research Letters, a journal of the American Geophysical Union.

Credit: 
U.S. National Science Foundation

Evening preference, lack of sleep associated with higher BMI in people with prediabetes

People with prediabetes who go to bed later, eat meals later and are more active and alert later in the day -- those who have an "evening preference" -- have higher body mass indices compared with people with prediabetes who do things earlier in the day, or exhibit morning preference. The higher BMI among people with evening preference is related to their lack of sufficient sleep, according to a University of Illinois at Chicago-led study.

The results of the study -- which looked at Asian participants and was led by Dr. Sirimon Reutrakul, associate professor of endocrinology, diabetes and metabolism in the UIC College of Medicine -- are published in the journal Frontiers in Endocrinology.

Prediabetes is a condition where blood sugar levels are higher than normal but not yet high enough to be Type 2 diabetes. Without modifications to diet and exercise, patients with prediabetes have a very high risk of developing Type 2 diabetes.

Lack of sufficient sleep has been previously linked to an increased risk for numerous health conditions, including obesity and diabetes. Evening preference has also been linked to higher weight and higher risk for diabetes.

Reutrakul and her colleagues wanted to investigate the relationship between morning/evening preference and BMI -- a measure of body fat in relation to height and weight -- among people with prediabetes.

"Diabetes is such a widespread disease with such an impact on quality of life, that identifying new lifestyle factors that might play into its development can help us advise patients with an early stage of the disease on things they can do to turn it around and prevent prediabetes from becoming full-blown diabetes," said Reutrakul.

A total of 2,133 participants with prediabetes enrolled in the study. Their morning/evening preference was assessed through a questionnaire.

Participants who scored high in "morningness" answered questions indicating that they preferred to wake up earlier, have activities earlier, and felt more alert earlier in the day compared with those who scored high on "eveningness." Sleep duration and timing were obtained using a questionnaire, and the extent of social jet lag was evaluated for each participant. Social jet lag reflects a shift in sleep timing between weekdays and weekends. Greater social jet lag (e.g., a larger shift in sleep timing) has previously been shown to be associated with higher BMI in some populations. The average age of the participants was 64 years, and the average BMI was 25.8 kilograms per square meter. Average sleep duration was about seven hours per night.
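Social jet lag is commonly quantified as the shift in the midpoint of sleep between free days (weekends) and work days. The sketch below illustrates that calculation with invented bed and wake times; the questionnaire-based measure used in this study may differ in detail:

```python
from datetime import datetime, timedelta

def sleep_midpoint(bedtime: str, waketime: str) -> float:
    """Midpoint of a sleep episode as hours past midnight (handles sleep spanning midnight)."""
    fmt = "%H:%M"
    start = datetime.strptime(bedtime, fmt)
    end = datetime.strptime(waketime, fmt)
    if end <= start:                       # woke up the following day
        end += timedelta(days=1)
    mid = start + (end - start) / 2
    return (mid.hour + mid.minute / 60) % 24

def social_jet_lag(workday_mid: float, free_day_mid: float) -> float:
    """Absolute shift in sleep midpoint between free days and work days, in hours."""
    diff = abs(free_day_mid - workday_mid)
    return min(diff, 24 - diff)            # take the shorter way around the clock

# Hypothetical example (not a participant from the study):
work_mid = sleep_midpoint("23:30", "06:30")   # midpoint 03:00
free_mid = sleep_midpoint("01:00", "09:00")   # midpoint 05:00
print(f"Social jet lag: {social_jet_lag(work_mid, free_mid):.1f} h")   # 2.0 h
```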

The researchers found that for participants younger than 60 years of age, higher levels of social jet lag were associated with a higher BMI. Among participants older than 60, those with more evening preference had higher BMIs, an effect partly due to insufficient sleep but not social jet lag. Evening preference was directly associated with higher BMI in this group.

"Timing and duration of sleep are potentially modifiable," said Reutrakul. "People can have more regular bedtimes and aim to have more sleep, which may help reduce BMI and the potential development of diabetes in this high-risk group."

Credit: 
University of Illinois Chicago