Tech

Inconsistencies between electronic health record, physicians' observed behaviors

Bottom Line: A study of nine emergency department residents reports inconsistencies between the electronic medical record and physicians' behaviors observed and recorded during patient encounters. Resident physicians were shadowed by trained observers for 20 encounters in this study conducted at emergency departments in two academic medical centers. The study quantified the review of systems (when patients are asked questions about different organs) and physical examinations documented by physicians and what observers confirmed. Physicians documented a median of 14 systems during the review of systems, while audio recordings confirmed a median of five. For physical examination, physicians documented a median of eight systems, while observers confirmed a median of 5.5. The study notes the electronic medical record could be prone to inaccuracy in those areas because of autopopulated information. Electronic medical records are used to generate bills. Limitations of the study include the small number of resident physicians, whose behavior may not represent that of attending emergency physicians, and the possibility that observers missed some physician actions. Further research could help to determine whether such inconsistencies are widespread.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

Authors: Carl T. Berdahl, M.D., M.S., of the National Clinician Scholars Program, University of California, Los Angeles, and coauthors, including David L. Schriger, M.D., M.P.H., of the University of California, Los Angeles, who is an associate editor of JAMA.

(doi:10.1001/jamanetworkopen.2019.11390)

Editor's Note: The article includes conflict of interest and funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.

Credit: 
JAMA Network

How often do hospitals, physician practices screen patients for food, housing, other social needs?

What The Study Did: National survey data helped the authors of this study examine how common it is for U.S. hospitals and physician practices to screen patients for social needs such as food insecurity, housing instability, utility and transportation needs, and experience with interpersonal violence.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

Authors: Taressa K. Fraze, Ph.D., of Dartmouth College in Lebanon, New Hampshire, is the corresponding author.

(doi:10.1001/jamanetworkopen.2019.11514)

Editor's Note: The article includes conflict of interest and funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.

Credit: 
JAMA Network

Developmental psychology -- One good turn deserves another

Tit for tat, do unto others, one good turn deserves another. These are only some of the familiar expressions which articulate the belief that reciprocity is a basic principle of social interactions. Developmental psychologists Monika Wörle and Professor Markus Paulus have now asked whether and when children learn to regard reciprocity as a norm. The notion of reciprocity is fundamental to many types of social interaction and serves to stabilize social relations. "Our study was designed to answer the question of whether children already believe that one should reciprocate a good turn or, put another way, that one has a duty to do so," says Markus Paulus.

The researchers studied two groups of children. One group was made up of 47 children between the ages of 3 and 4, the other consisted of 45 5- to 6-year-olds. They were presented with various scenarios in which puppets were shown either displaying reciprocity or not. Then the participants were asked to evaluate the puppets' behavior. In addition, the authors took account of spontaneous reactions to the scenes they were shown - comments such as "But that's not fair!" when a puppet behaved in what the onlooker regarded as a mean-spirited manner.

"Our study indicates that children between the ages of 3 and 4 regard prosocial behavior as a general norm. They value generosity and benevolence, and this belief is not contingent on the notion of reciprocity. From around the age of 5, they begin to develop a more complex concept of fairness, which now encompasses the idea of reciprocity. This suggests that the older age group has developed a reciprocity norm, in which a mutual obligation to reciprocate favorable treatment is incorporated. At this age, children regard the principle of tit-for-tat as just and proper," Markus Paulus explains.

Credit: 
Ludwig-Maximilians-Universität München

The future of 'extremely' energy-efficient circuits

image: Microphotograph of a 32-bit AQFP bitonic sorter generated by the proposed auto synthesis framework. This circuit contains 7557 Josephson superconducting junctions, which is the largest auto-designed system-level AQFP circuit.

Image: 
Yokohama National University

Data centers are processing data and dispensing the results at astonishing rates and such robust systems require a significant amount of energy -- so much energy, in fact, that information communication technology is projected to account for 20% of total energy consumption in the United States by 2020.

To meet this demand, a team of researchers from Japan and the United States has developed a framework to reduce energy consumption while improving efficiency.

They published their results on July 19 in Scientific Reports, a Nature journal.

"The significant amount of energy consumption has become a critical problem in modern society," said Olivia Chen, corresponding author of the paper and assistant professor in the Institute of Advanced Sciences at Yokohama National University. "There is an urgent requirement for extremely energy-efficient computing technologies."

The research team used a digital logic process called Adiabatic Quantum-Flux-Parametron (AQFP). The idea behind the logic is that direct current should be replaced with alternating current. The alternating current acts as both the clock signal and the power supply - as the current switches directions, it signals the next time phase for computing.

The logic, according to Chen, could improve conventional communication technologies with currently available fabrication processes.

"However, there lacks a systematic, automatic synthesis framework to translate from high-level logic description to Adiabatic Quantum-Flux-Parametron circuit netlist structures," Chen said, referring to the individual processors within the circuit. "In this paper, we mitigate that gap by presenting an automatic flow. We also demonstrate that AQFP can achieve a reduction in energy use by several orders of magnitude compared to traditional technologies."

The researchers proposed a top-down framework for computing decisions that can also analyze its own performance. To do this, they used logic synthesis, a process by which they direct the passage of information through logic gates within the processing unit. Logic gates can take in a little bit of information and output a yes or no answer. The answer can trigger other gates to respond and move the process forward, or stop it completely.
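As a loose illustration of the kind of structure a synthesis flow produces (this is not the authors' AQFP framework, and the gate set and circuit below are hypothetical examples), a gate-level netlist can be represented and evaluated in a few lines of Python:

```python
# Toy gate-level netlist evaluator: an illustrative sketch only, not the
# authors' AQFP synthesis framework. Gate set and circuit are hypothetical.

GATES = {
    "AND": lambda a, b: a & b,
    "OR":  lambda a, b: a | b,
    "NOT": lambda a: 1 - a,
}

def evaluate(netlist, inputs):
    """Evaluate an ordered netlist of (output_wire, gate, input_wires)."""
    wires = dict(inputs)
    for out, gate, ins in netlist:
        wires[out] = GATES[gate](*(wires[w] for w in ins))
    return wires

# A half-adder built from AND/OR/NOT: sum = (a OR b) AND NOT(a AND b)
half_adder = [
    ("ab",     "AND", ["a", "b"]),
    ("not_ab", "NOT", ["ab"]),
    ("a_or_b", "OR",  ["a", "b"]),
    ("sum",    "AND", ["a_or_b", "not_ab"]),
    ("carry",  "AND", ["a", "b"]),
]

out = evaluate(half_adder, {"a": 1, "b": 1})
print(out["sum"], out["carry"])  # 1 + 1 -> sum 0, carry 1
```

A real synthesis framework would additionally annotate each gate with timing and energy models before mapping it onto AQFP cells, but the netlist data structure is the common starting point.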

With this basis, the researchers developed a computation logic that takes the high-level understanding of processing and how much energy a system uses and dissipates and describes it as an optimized map for each gate within the circuit model. From this, Chen and the research team can balance the estimation of power needed to process through the system and the energy that the system dissipates.

According to Chen, this approach also compensates for the cooling energy needed for superconducting technologies and reduces the energy dissipation by two orders of magnitude.

"These results demonstrate the potential of AQFP technology and applications for large-scale, high-performance and energy-efficient computations," Chen said.

Ultimately, the researchers plan to develop a fully automated framework to generate the most efficient AQFP circuit layout.

"The synthesis results of AQFP circuits are highly promising in terms of energy-efficient and high-performance computing," Chen said. "With the future advancing and maturity of AQFP fabrication technology, we anticipate broader applications ranging from space applications and large-scale computing facilities such as data centers."

Credit: 
Yokohama National University

CBD may alleviate seizures, benefit behaviors in people with neurodevelopmental conditions

CHAPEL HILL, NC - September 18, 2019 - Cannabidiol (CBD), a marijuana plant extract, is commonly used to improve anxiety, sleep problems, pain, and many other neurological conditions. Now UNC School of Medicine researchers show it may alleviate seizures and normalize brain rhythms in Angelman syndrome, a rare neurodevelopmental condition.

Published in the Journal of Clinical Investigation, the research conducted using Angelman syndrome animal models shows that CBD could benefit kids and adults with this serious condition, which is characterized by intellectual disability, lack of speech, brain rhythm dysfunction, and deleterious and often drug-resistant epilepsy.

"There is an unmet need for better treatments for kids with Angelman syndrome to help them live fuller lives and to aid their families and caregivers," said Ben Philpot, PhD, Kenan Distinguished Professor of Cell Biology and Physiology and associate director of the UNC Neuroscience Center. "Our results show CBD could help the medical community safely meet this need."

CBD, which is a major phytocannabinoid constituent of cannabis, has already been shown to have anti-epileptic, anti-anxiety, and anti-psychotic effects. And in 2018, the FDA approved CBD for the treatment of seizures associated with two rare forms of epilepsy, but little is known about the potential anti-seizure and behavioral effects of CBD on Angelman syndrome.

The Philpot lab is a leader in the creation of genetically modified mouse models of neurodevelopmental disorders, and they use these models to identify new treatments for various diseases, such as Rett, Pitt-Hopkins, and Angelman syndromes.

In experiments led by first author Bin Gu, PhD, a postdoctoral fellow in the Philpot lab, the UNC-Chapel Hill researchers systematically tested the beneficial effects of CBD on seizures, motor deficits, and brain activity abnormalities - as measured by EEG - in mice that genetically model Angelman syndrome, with the expectation that this information could guide eventual clinical use.

The researchers found that a single injection of CBD substantially lessened seizure severity in mice when the seizures were experimentally triggered by elevated body temperature or loud sounds. A typical anti-convulsant dose of CBD (100 mg/kg) caused mild sedation in mice but had little effect on motor coordination or balance. CBD also restored the normal brain rhythms which are commonly impaired in Angelman syndrome.

"We're confident our study provides the preclinical framework necessary to better guide the rational development of CBD as a therapy to help lessen seizures associated with Angelman syndrome and other neurodevelopmental disorders," Gu said.

Philpot and Gu added that patients and families should always seek advice from their physician before taking any CBD products, and that a human clinical trial is needed to fully understand its efficacy and safety.

Credit: 
University of North Carolina Health Care

Brain-computer interfaces without the mess

image: A cap containing a new type of EEG electrode can be used to control a toy car with brain waves.

Image: 
Adapted from Nano Letters 2019, DOI: 10.1021/acs.nanolett.9b02019

It sounds like science fiction: controlling electronic devices with brain waves. But researchers have developed a new type of electroencephalogram (EEG) electrode that can do just that, without the sticky gel required for conventional electrodes. Even better, the devices work through a full head of hair. The researchers report the flexible electrodes, which could someday be used in brain-computer interfaces to drive cars or move artificial limbs, in the ACS journal Nano Letters.

Often used to diagnose seizure disorders and other neurological conditions, EEGs are machines that track and record brain wave patterns. To conduct an EEG, technicians typically use a very sticky gel to attach electrodes to different regions of the patient's scalp. However, this gel is difficult to wash out of hair and sometimes irritates the skin. In addition, hair interferes with the electrical signals. Ming Lei, Bo Hong, Hui Wu and colleagues wanted to develop an EEG electrode that is flexible, robust and gel-free. Such an electrode could help patients, but also might allow people to someday control devices with their brains.

To make the electrodes, the researchers placed silver nanowires in a commercially available melamine sponge. The electrodes cost only about 12 cents each to make and could be mass-produced. The team assembled 10 electrodes into a flexible silicon cap and measured their performance when worn by people with shaved or hairy heads. On hairless skin, the new electrodes recorded brain waves as well as conventional ones. What's more, the flexibility of the electrodes allowed them to perform similarly on hairy and hairless skin, unlike the conventional devices. A volunteer wearing the cap could control a toy car with her mind, making it go forward, backward, left or right. The electrodes are mechanically stable through different cycles and movements and are also resistant to heat and sweat, the researchers say. 

Credit: 
American Chemical Society

People with autism show atypical brain activity when coordinating visual and motor information

image: People with autism performed a precision grip-force test like this one while being scanned inside an MRI machine.

Image: 
Life Span Institute/Leilani Photographs

LAWRENCE -- A new study in the Journal of Neurophysiology by researchers at the University of Kansas Life Span Institute is the first to look at functional brain activity in people with autism spectrum disorder (ASD) while they performed precision visuomotor behavior -- in this case, a grip-force test.

The authors found new evidence that sensorimotor changes in people with autism involve abnormal cortical and subcortical organization "that may contribute to key clinical issues in patients."

People with autism performed a precision grip-force test while being scanned inside an MRI machine. They watched a display containing two horizontal bars set against a black background. The subjects controlled the bars in specific ways by pressing a device in their right hand. So did a control group of people without ASD.

"In areas of the brain for dynamically incorporating and adjusting your motor behavior based on information you're receiving, those circuits were deficient," said lead author Kathryn Unruh, a postdoctoral researcher at KU's Life Span Institute and Kansas Center for Autism Research and Training (K-CART). "But then we also show that people with autism are potentially compensating for those deficits by using other areas of the brain."

While ASD is diagnosed based on deficits in social-communication skills and the presence of certain restricted and repetitive behaviors, those are difficult for researchers to objectively measure, as opposed to brain activity during visuomotor tasks, Unruh said.

"Motor behaviors are deficient across individuals with autism, regardless of their level of functional ability," she said. "Sometimes it may look like something very subtle in their eye movements that you would never be able to see without special equipment. It could look like handwriting problems or sometimes could also look like having problems with more general motor coordination, like playing sports."

The precision grip test used by the researchers allowed them to isolate and examine one task and its associated brain activity as they measured differences among 20 subjects with ASD and 18 without.

"We're able to quantify this very precisely," Unruh said. "Trying to put a number on someone's social ability or their communication -- it is very difficult. So, this is an attractive way of measuring behavior. Here, we're getting a much closer approximation of what the brain is actually doing."

Senior author Matt Mosconi, director of K-CART, an associate scientist in the Life Span Institute and associate professor in the Clinical Child Psychology Program at KU, said in ASD patients sensorimotor problems can be frustrating for them, and they often go overlooked because communication and behavioral issues are the things others usually focus on.

"Sensorimotor issues, or difficulties coordinating and controlling our movements, are common in ASD and often a major source of frustration as they affect many of our daily activities," he said. "Studying sensorimotor issues is therefore important for understanding the diverse challenges experienced by individuals with ASD."

Not only did the study show that the brain is organized differently in individuals with ASD in terms of its function for basic sensorimotor behaviors, but also that these functions can differ among people with autism.

"Importantly, as we know every individual with ASD shows different sets of skills and challenges, we also found differences in brain organization varied across our individuals highlighting the importance of testing measures of brain function in relation to different behaviors, rather than just relying on simple comparisons of individuals with ASD and individuals without ASD," Mosconi said.

The researchers found ASD patients' ability to rapidly integrate multisensory information and precisely adjust motor output is compromised. Further, reduced ability "to maintain steady-state levels of sensorimotor output may contribute to multiple developmental issues affecting social-communication abilities and cognitive processing."

Credit: 
University of Kansas

Porcupinefish inspires sturdy superhydrophobic material

Nature has evolved a dazzling array of materials that help organisms thrive in diverse habitats. Sometimes, scientists can exploit these designs to develop useful materials with similar or completely new functions. Now, researchers reporting in ACS Applied Materials & Interfaces have made a durable and flexible super-water-repelling material inspired by spiky porcupinefish skin.

Superhydrophobic materials are extremely water repellent, causing droplets of water that fall on them to roll off or even bounce off. Such surfaces could be used for a variety of applications, such as self-cleaning, anti-icing and corrosion prevention. The materials typically owe their water repellency to tiny, needle-shaped structures on their surfaces. However, these micro- or nanotextured surfaces are fragile and easily damaged by bending. In addition, the prickly structures can be scratched or sliced off. Drawing inspiration from the spiny yet flexible skin of the porcupinefish, Yoshihiro Yamauchi, Masanobu Naito and colleagues wanted to develop a hardier superhydrophobic structure. Although porcupinefish skin itself is not superhydrophobic, fabricating the spines out of a hydrophobic compound and shrinking them down to the micrometer scale might make them so, the researchers reasoned.

To develop their superhydrophobic material, the team prepared microscale porcupinefish-inspired spines made of zinc oxide. Then, to give the material elasticity, they added a silicone polymer, which combined with the spines to form a porous framework. The material, which could be molded into various shapes or coated onto other surfaces, was not only superhydrophobic but also highly flexible. Unlike other superhydrophobic materials, the porous structure retained its water repellency after being repeatedly bent or twisted. And because the structures existed throughout the material, not just on the surface, scratching or slicing didn't affect the material's repellency, either. The flexibility and porosity of the material helps cushion against mechanical impacts and deformation, the researchers say.

Credit: 
American Chemical Society

Coastal birds can weather the storm, but not the sea

How can birds that weigh less than a AA battery survive the immense power of Atlantic hurricanes? A new study in Ecology Letters finds that these coastal birds survive because their populations can absorb impacts and recover quickly from hurricanes--even storms many times larger than anything previously observed.

"Coastal birds are often held up as symbols of vulnerability to hurricanes and oil spills, but many populations can be quite resilient to big disturbances," explains lead author Dr. Christopher Field, a postdoctoral fellow at the University of Maryland's National Socio-Environmental Synthesis Center (SESYNC). "The impacts of hurricanes, in terms of populations rather than individual birds, tend to be surprisingly small compared to the other threats that are causing these species to decline."

Field and colleagues from five other universities studied the resilience of four species of coastal birds, including the endangered Saltmarsh Sparrow. The researchers developed simulations that allowed them to explore how disturbances like hurricanes would affect the birds' populations over time. They started with models that project population sizes into the future based on the species' birth and death rates. The research team then subjected these populations to simulated hurricanes that killed a certain number of birds. Because they were using computational simulations, the researchers were able to look at the full range of potential hurricane sizes--from storms that caused no bird deaths to storms that were more severe than anything ever observed.
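In spirit, that kind of simulation can be sketched in a few lines of Python. This is an illustrative toy, not the study's model: the growth rate, noise level, and storm mortality below are invented placeholders.

```python
import random

# Illustrative sketch only, not the authors' model: project a declining
# population forward, optionally imposing a one-year hurricane that kills
# a fixed fraction of birds. All rates here are made-up placeholders.

def project(pop, years, growth=-0.01, storm_year=None, storm_mortality=0.0, seed=0):
    rng = random.Random(seed)
    trajectory = [pop]
    for year in range(1, years + 1):
        pop *= 1 + growth + rng.gauss(0, 0.02)  # growth plus demographic noise
        if year == storm_year:
            pop *= 1 - storm_mortality          # one-time storm impact
        trajectory.append(pop)
    return trajectory

baseline = project(10_000, years=20)
storm = project(10_000, years=20, storm_year=5, storm_mortality=0.33)
```

Comparing many such paired trajectories across a range of `storm_mortality` values is, roughly, how one asks whether a storm pushes a population off its long-term trajectory.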

The researchers found that the four coastal species were able to absorb the impacts of storms across a wide range in severity. For example, the study found that a storm could cause mortality for a third of Saltmarsh Sparrows and Clapper Rails in one year, and it would still be unlikely that their populations would deviate significantly from their trajectories over time.

Resilience can be defined in many ways, so Field and colleagues borrowed concepts from classical ecology and applied them to bird populations. They used these concepts to better understand the risk that these species could face from storms that are strengthening because of climate change. The research team looked not only at the ability of populations to absorb impacts, but also the birds' ability to recover over time after large disturbances. Two of the species in the study, Saltmarsh Sparrows and Clapper Rails, are declining, largely from increased coastal flooding caused by higher sea levels. The researchers found that populations were often able to recover from large storms within 20 years, even when populations continued to decline from other threats, such as regular flooding.

If coastal birds are resilient to hurricanes, does that mean they will be resilient to climate change? "It's tempting to focus on dramatic events like hurricanes, especially as they get stronger from climate change," Field says. "But less visible threats like sea-level rise and increased coastal flooding are here to stay, and they are going to continue to drive coastal birds, like Saltmarsh Sparrows, toward extinction."

Dr. Chris Elphick, a coauthor on the study from the University of Connecticut, suggests that there are lessons here for people too. “After a big event like a hurricane, we often rush to rebuild and improve coastal resilience without thinking as much as we perhaps should about the longer term chronic changes in the system. Obviously, we need to respond to the damage done, but addressing the gradual, less noticeable changes, may be just as important to coastal communities in the long run.”

Credit: 
University of Connecticut

Scientists forecasted late May tornado outbreak nearly 4 weeks in advance

image: The 757 tornado warnings (red polygons) issued by NOAA's National Weather Service from May 17 to May 30 of this year.

Image: 
Northern Illinois University

DeKalb, Ill. -- A team of scientists reports that they accurately predicted the nation's extensive tornado outbreak of late May 2019 nearly four weeks before it began.

The team's study, detailing factors that went into the forecast, was published recently in the journal Geophysical Research Letters.

"This is the first documented successful long-range forecast for an extended period of tornado activity in the U.S.," said lead author Victor Gensini, a professor of meteorology at Northern Illinois University.

Gensini said extended-range predictions are the "new frontier of forecasting."

"In our field, there's a big push to accurately predict all kinds of extreme weather events well in advance," Gensini said.

"If we can better anticipate when and where these extreme events may be occurring, it gives us a better chance to mitigate their impacts. We think any additional lead time could be extremely valuable to emergency response teams, insurance companies, and numerous other sectors of industry."

May 17 through May 29 proved to be an unusually active period of severe weather in the United States--even for a time of the year known to produce violent storms.

During the 13-day stretch, 374 tornadoes occurred, more than tripling the 1986-2018 average of 107 for this period. In total, 757 tornado warnings were issued by NOAA's National Weather Service, and seven fatalities were reported. The outbreak contributed significantly to the second highest monthly (E)F1+ tornado count (220) on record for May since reliable tornado counts began in the early 1950s.

The central and southern Great Plains, along with the lower Great Lakes region, including Pennsylvania and Ohio, were particularly hard hit by the tornadic storms.

Five years ago, Gensini and colleagues formed an Extended Range Tornado Activity Forecast (ERTAF) team to conduct research on sub-seasonal, or extended-range, forecasting. Its current members include Paul Sirvatka of the College of DuPage and current study co-authors David Gold of IBM-Global Business Services, John T. Allen of Central Michigan University and Bradford S. Barrett of the United States Naval Academy.

Studies in recent years by the team and other scientists used historical weather-pattern records to develop methodologies for predicting the likelihood of severe weather across the continental United States weeks in advance.

From April 28 on, the ERTAF team highlighted the likelihood of an active period of severe weather three to four weeks into the future. The prediction was especially notable given the pre-season expectation of below-average frequencies of U.S. tornadoes due to the presence of weak El Niño conditions in the tropical Pacific Ocean.

"It's important to note that this was a single successful extended-range forecast--we're not going to get every one of these correct," Gensini said. "But our work does create a pathway to forecasting severe weather with these extended lead times. These are usually forecasts of opportunity, meaning that they are not always possible."

Gensini said the ERTAF team, which posts forecasts on its website every Sunday evening during tornado season, has had many other successful forecasts that were two to three weeks in advance. They chose to publish on this example because of the magnitude of the storms and textbook nature of the chain of events.

"This is the first extended-range forecast that has been fully scientifically dissected," Gensini said. "We wanted to make sure it's documented."

The forecast process is complex. It looks for signals in two atmospheric indices--the Madden-Julian Oscillation, an eastward moving disturbance of winds, rain and pressure, and the Global Wind Oscillation, a collection of climate and weather information that measures atmospheric angular momentum, or the degree of waviness in the jet stream.

Recurring modes within both oscillations occasionally provide enhanced predictability of future potential for severe weather frequency, the researchers said.

The conditions that resulted in the tornado outbreak began thousands of miles away as thunderstorms over the Indian Ocean and Maritime Continent. The storms progressed into the equatorial Pacific, leading to an enhancement of the jet stream--a key signal the scientists were looking for. The jet stream then crashed like a wave, breaking over western North America into a wavy pattern.

"This process often leads to a thermal trough over the western U.S. that connects downstream to a thermal ridge, creating a rollercoaster-like jet stream pattern," Gensini said. "Those types of weather patterns have long been known to be most favorable for tornado outbreaks."

From beginning to end, the pattern progressed as the researchers expected.

"It doesn't always happen that way, and we have a lot of work to do to make this methodology robust, but every year we learn something new," Gensini said.

Credit: 
Northern Illinois University

DNA 'origami' takes flight in emerging field of nano machines

image: DNA mechanotechnology expands the opportunities for research involving biomedicine and materials sciences, says Khalid Salaita, right, professor of chemistry at Emory University and co-author of the article, along with Aaron Blanchard, left, a graduate student in the Salaita Lab.

Image: 
Emory University

Just as the steam engine set the stage for the Industrial Revolution, and micro transistors sparked the digital age, nanoscale devices made from DNA are opening up a new era in bio-medical research and materials science.

The emerging uses of DNA mechanical devices are described in a "Perspective" article in the journal Science by Khalid Salaita, a professor of chemistry at Emory University, and Aaron Blanchard, a graduate student in the Wallace H. Coulter Department of Biomedical Engineering, a joint program of Georgia Institute of Technology and Emory.

The article heralds a new field, which Blanchard dubbed "DNA mechanotechnology," to engineer DNA machines that generate, transmit and sense mechanical forces at the nanoscale.

"For a long time," Salaita says, "scientists have been good at making micro devices, hundreds of times smaller than the width of a human hair. It's been more challenging to make functional nano devices, thousands of times smaller than that. But using DNA as the component parts is making it possible to build extremely elaborate nano devices because the DNA parts self-assemble."

DNA, or deoxyribonucleic acid, stores and transmits genetic information as a code made up of four chemical bases: adenine (A), guanine (G), cytosine (C) and thymine (T). The DNA bases have a natural affinity to pair up with each other -- A with T and C with G. Synthetic strands of DNA can be combined with natural DNA strands from bacteriophages. By moving around the sequence of letters on the strands, researchers can get the DNA strands to bind together in ways that create different shapes. The stiffness of DNA strands can also easily be adjusted, so they remain straight as a piece of dry spaghetti or bend and coil like boiled spaghetti.
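The pairing rule itself is simple enough to capture in a toy sketch (the sequences below are arbitrary examples, not strands from the research):

```python
# Toy model of Watson-Crick pairing (A-T, C-G). Sequences are arbitrary
# examples; real strand design also weighs stiffness, length, and context.

PAIR = {"A": "T", "T": "A", "C": "G", "G": "C"}

def complement(seq):
    """Complementary bases for a strand, read in the same direction."""
    return "".join(PAIR[b] for b in seq)

def binds(strand_a, strand_b):
    """In this toy model, two strands hybridize when one is the
    reverse complement of the other (strands pair antiparallel)."""
    return strand_b == complement(strand_a)[::-1]

print(binds("GATTACA", "TGTAATC"))  # True
```

Designers exploit exactly this predictability: by choosing sequences so that only the intended strand pairs can form, they steer the strands into assembling the target shape.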

The idea of using DNA as a construction material goes back to the 1980s, when biochemist Nadrian Seeman pioneered DNA nanotechnology. This field uses strands of DNA to make functional devices at the nanoscale. The ability to make these precise, three-dimensional structures began as a novelty, nicknamed DNA origami, resulting in objects such as a microscopic map of the world and, more recently, the tiniest-ever game of tic-tac-toe, played on a DNA board.

Work on novelty objects continues to provide new insights into the mechanical properties of DNA. These insights are driving the ability to make DNA machines that generate, transmit and sense mechanical forces.

"If you put together these three main components of mechanical devices, you begin to get hammers and cogs and wheels and you can start building nano machines," Salaita says. "DNA mechanotechnology expands the opportunities for research involving biomedicine and materials science. It's like discovering a new continent and opening up fresh territory to explore."

Potential uses for such devices include drug delivery devices in the form of nano capsules that open up when they reach a target site, nano computers and nano robots working on nanoscale assembly lines.

The use of DNA self-assembly by the genomics industry, for biomedical research and diagnostics, is further propelling DNA mechanotechnology, making DNA synthesis inexpensive and readily available. "Potentially anyone can dream up a nano-machine design and make it a reality," Salaita says.

He gives the example of creating a pair of nano scissors. "You know that you need two rigid rods and that they need to be linked by a pivot mechanism," he says. "By tinkering with some open-source software, you can create this design and then go onto a computer and place an order to custom synthesize your design. You'll receive your order in a tube. You simply put the tube contents into a solution, let your device self-assemble, and then use a microscope to see if it works the way you thought that it would."

Salaita's lab is one of only about 100 around the world working at the forefront of DNA mechanotechnology. He and Blanchard developed the world's strongest synthetic DNA-based motor, which was recently reported in Nano Letters.

A key focus of Salaita's research is mapping and measuring how cells push and pull to learn more about the mechanical forces involved in the human immune system.

Salaita developed the first DNA force gauges for cells, providing the first detailed view of the mechanical forces that one molecule applies to another molecule across the entire surface of a living cell. Mapping such forces may help to diagnose and treat diseases related to cellular mechanics. Cancer cells, for instance, move differently from normal cells, and it is unclear whether that difference is a cause or an effect of the disease.

In 2016, Salaita used these DNA force gauges to provide the first direct evidence for the mechanical forces of T cells, the security guards of the immune system. His lab showed how T cells use a kind of mechanical "handshake" or tug to test whether a cell they encounter is a friend or foe. These mechanical tugs are central to a T cell's decision for whether to mount an immune response.

"Your blood contains millions of different types of T cells, and each T cell is evolved to detect a certain pathogen or foreign agent," Salaita explains. "T cells are constantly sampling cells throughout your body using these mechanical tugs. They bind and pull on proteins on a cell's surface and, if the bond is strong, that's a signal that the T cell has found a foreign agent."

Salaita's lab built on this discovery in a paper recently published in the Proceedings of the National Academy of Sciences (PNAS). Work led by Emory chemistry graduate student Rong Ma refined the sensitivity of the DNA force gauges. Not only can they detect these mechanical tugs at a force so slight that it is nearly one-billionth the weight of a paperclip, they can also capture evidence of tugs as brief as the blink of an eye.

The research provides an unprecedented look at the mechanical forces involved in the immune system. "We showed that, in addition to being evolved to detect certain foreign agents, T cells will also apply very brief mechanical tugs to foreign agents that are a near match," Salaita says. "The frequency and duration of the tug depends on how closely the foreign agent is matched to the T cell receptor."

The result provides a tool to predict how strong of an immune response a T cell will mount. "We hope this tool may eventually be used to fine tune immunotherapies for individual cancer patients," Salaita says. "It could potentially help engineer T cells to go after particular cancer cells."

Credit: 
Emory Health Sciences

Army research uncovers law-like progression of weapons technologies

image: New Army research uncovers trends in the progression of weapons systems, possibly helping to predict future systems.

Image: 
US Army graphic

ABERDEEN PROVING GROUND, Md. (Sept. 18, 2019) -- Anticipating the technology and weapon systems of our future Army might not be entirely daunting, new Army research finds.

Trends in the progression of weapon systems from the early crossbowman to a musket to a military tank might help predict our future systems, according to a new study to be published in the Journal of Defense Modeling and Simulation, "Towards Universal Laws of Technology Evolution: Modeling Multi-century Advances in Mobile Direct Fire Systems."

"A number of law-like regularities are known to apply to both technological and naturally emerging complex systems," said Dr. Alexander Kott, author of the paper and a researcher at the U.S. Army Combat Capabilities Development Command's Army Research Laboratory. "Identifying these regularities may help long-range technology forecasting, which this paper illustrates by exploring two systems that might appear 30 years in the future."

Certain performance measures of technological systems often exhibit exponential--and sometimes superexponential--patterns of growth over time, Kott said. A particularly well-known example of such a regularity is Moore's Law, which states that a performance measure of a computer chip doubles approximately every two years. Many other technologies follow a similar law of exponential growth.
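A doubling law of this kind can be written as a one-line function. This sketch assumes the commonly quoted two-year doubling time; it is an illustration of exponential growth, not a figure from the paper:

```python
def growth_factor(years: float, doubling_time: float = 2.0) -> float:
    """Exponential growth: the measure doubles every `doubling_time` years."""
    return 2.0 ** (years / doubling_time)

# Ten years at a two-year doubling time gives five doublings: a 32x gain.
print(growth_factor(10.0))  # -> 32.0
```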

So-called allometric relations are another class of law-like regularities. Often, a universal relation exists between the scale of an organism and its various attributes, applicable across multiple organisms of widely different scales, Kott said. For example, Kleiber's Law states that for the vast majority of animals--from a tiny mouse to a huge elephant--the organism's metabolic rate scales approximately to the 3/4 power of the organism's mass, and the data for all such organisms fall on the same curve.
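The 3/4-power scaling is easy to make concrete. In this sketch the normalization constant `k` is an assumed illustrative value, not a quantity from the article; only the exponent comes from Kleiber's Law:

```python
def metabolic_rate(mass_kg: float, k: float = 3.4) -> float:
    """Kleiber's Law: metabolic rate scales as mass ** (3/4).
    `k` (roughly watts for a 1 kg animal) is an illustrative constant."""
    return k * mass_kg ** 0.75

# A 16-fold increase in mass raises metabolic rate only 8-fold (16**0.75 == 8).
print(metabolic_rate(16.0) / metabolic_rate(1.0))  # -> 8.0
```

Because the constant cancels in such ratios, the sublinear exponent alone explains why animals of wildly different sizes fall on the same curve.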

This research explores whether a single regularity of technological growth might apply to technologies of widely different scales, over a period of multiple centuries. Kott investigated a collection of diverse weapon systems he describes as the mobile direct-fire systems. These include widely different families of technologies that span the period of 1300-2015 CE: Soldiers armed with weapons ranging from bows to assault rifles; foot artillery and horse artillery; towed anti-tank guns; self-propelled anti-tank and assault guns; and tanks.

Ultimately, this research finds that, indeed, a single, uncomplicated regularity describes the historical growth of this extremely broad collection of systems. Multiple, widely different families of weapon systems--from a bowman to a tank--fall approximately on the same curve, a simple function of time. Unlike a conventional curve of exponential growth with time, this regularity also depends on the physical scale (specifically, the mass) of the technological artifacts. This suggests a general model that unites allometric relations (such as Kleiber's Law) and exponential growth relations (such as Moore's Law).
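A model with that general shape could be sketched as follows. The functional form and every constant here are assumptions for illustration only, not the fitted regularity from Kott's paper:

```python
import math

def performance(mass_kg: float, year: float,
                k: float = 1.0, b: float = 0.75,
                rate: float = 0.05, year0: float = 1300.0) -> float:
    """Hypothetical regularity combining allometric scaling with mass
    (mass ** b, Kleiber-style) and exponential growth in time
    (exp(rate * t), Moore-style). All parameters are illustrative."""
    return k * mass_kg ** b * math.exp(rate * (year - year0))

# Heavier systems sit higher on the curve, and all systems climb with time.
print(performance(1000.0, 1500.0) > performance(1000.0, 1300.0))  # -> True
```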

"To my knowledge, no prior research describes a regularity in the temporal growth of technology that covers such widely different technologies, of widely different physical scales, and over such a long period of history," Kott said. "However, such a regularity should be taken with a degree of caution. You cannot use it as a design guide. There is a lot more to a good system than a very parsimonious figure of performance we use in our model. Interpretations of the model require care."

This research suggests a possibility that even broader collections of technology families might evolve historically in accordance with what might be called universal laws of technological evolution, and provides related research questions for further investigation.

"What I find interesting about the findings of this paper," said Dr. Bruce West, the U.S. Army's chief mathematician, "is that from the evolutionary allometric perspective, this is the first set of empirical data that demonstrate the existence of a strongly time-dependent allometric coefficient. I anticipated such time dependency in my earlier papers, and here is a clear empirical confirmation."

Kott muses about this law-like but previously unrecognized trend.

"In hindsight," he said, "this multi-century, multi-scale regularity may not be all that surprising, but somehow nobody noticed this previously. Perhaps, the future is not a silent mystery. It speaks to us from the past, softly."

Credit: 
U.S. Army Research Laboratory

NASA-NOAA satellite provides forecasters a view of tropical storm Jerry's structure

image: On Sept. 17 NASA-NOAA's Suomi NPP satellite passed over newly developed Tropical Depression 10 as it was strengthening into a tropical storm, and found strong bands of thunderstorms over the southern and southwestern portions of the circulation, with limited storm activity over the remainder of the cyclone.

Image: 
NASA/NOAA/NRL

Tropical Storm Jerry is the latest in a line of tropical cyclones to develop in the North Atlantic Ocean this season. NASA-NOAA's Suomi NPP satellite passed overhead and provided forecasters with a view of its structure that helped confirm it was organizing.

Tropical Depression 10 formed in the Central Atlantic by 11 a.m. EDT on Sept. 17. At 12:48 p.m. EDT (1648 UTC) that day, the Visible Infrared Imaging Radiometer Suite (VIIRS) instrument aboard Suomi NPP provided a visible image of Tropical Depression 10, which later became Tropical Storm Jerry. The VIIRS image showed that the storm had developed a more rounded circulation than earlier in the day, indicating it had become better organized. The image also showed strong bands of thunderstorms over the southern and southwestern portions of the circulation, with limited storm activity over the remainder of the cyclone.

Hurricanes are the most powerful weather event on Earth. NASA's expertise in space and scientific exploration contributes to essential services provided to the American people by other federal agencies, such as hurricane weather forecasting.

By 5 a.m. EDT on Sept. 18, the depression strengthened enough to become a tropical storm. At that time, it was renamed as Tropical Storm Jerry.

NOAA's National Hurricane Center (NHC) said, "At 11 a.m. EDT (1500 UTC) on Sept. 18 the center of Tropical Storm Jerry was located near latitude 14.6 degrees North and longitude 49.2 degrees West. That puts Jerry's center about 855 miles (1,375 km) east of the Leeward Islands. Jerry is moving toward the west-northwest near 14 mph (22 kph). Maximum sustained winds have increased to near 50 mph (85 kph) with higher gusts. Jerry is forecast to become a hurricane by late Thursday, with little change in strength anticipated on Friday and Saturday. The estimated minimum central pressure is 1002 millibars."

NASA and NOAA satellite data have shown that there are warm waters and light wind shear in Jerry's path which will enable it to strengthen. However, there is also dry air around the tropical storm which is forecast to limit the intensification for now.

NHC said, "A west-northwest motion at a slightly faster forward speed is expected over the next few days. On the forecast track, the system will be near the northern Leeward Islands Friday and pass north of Puerto Rico on Saturday [Sept. 21]."

Credit: 
NASA/Goddard Space Flight Center

Scientists set to start $10M project to create health diagnosis tool for bees

image: Pictured here: Associate Professor Amro Zayed, York University.

Image: 
York University

TORONTO, September 18, 2019 - When Canada's honey bees are thriving, they produce honey and pollinate valuable crops like blueberries, apples and hybrid canola seeds.

But the health of honey bees is declining, with more than a quarter of honey bee colonies dying each winter. These deaths have left beekeepers and government regulators struggling to find ways to quickly diagnose, manage and improve bee health.

The solution could be a new bee health diagnosis tool being created as part of a research project led by bee genomics expert Associate Professor Amro Zayed, of York University, along with Professor Leonard Foster, of the University of British Columbia. On October 1, they will launch a $10 million project to develop a new health assessment and diagnosis platform, supported by Ontario Genomics and Genome Canada.

"We need to think of innovative solutions to fix the bee health crisis. The current tools are just not cutting it," said Zayed in the Department of Biology, Faculty of Science.

Honey bees produce 90 million pounds of honey each year and are needed to pollinate some of Canada's most lucrative crops. Their pollination services are valued at $5.5 billion per year in Canada alone.

The causes of bee decline are complex, variable, and difficult to identify. But beekeepers and government regulators need to rapidly identify the stressors impacting specific populations before they can make changes to improve bee health. Currently, the industry uses post-mortem analysis to test for the presence of a few known pathogens or toxins in dead colonies. These tests are often expensive, time consuming, and provide an incomplete picture of the stressors affecting bee health.

The research team is looking to modernize the industry by delivering a tool to quickly assess bee health in living colonies that would allow loss-mitigating strategies to be implemented.

"You can identify the stressors affecting a colony, not by searching for the stressor itself, but by looking for specific signatures of stress in the bee - what we call biomarkers," explained Zayed. "The biomarker approach has a lot of potential for quickly screening stressors affecting bees before colonies decline."

The researchers will use genomic tools to measure stressor-induced changes in bees to identify biomarkers for specific stressors. By the end of the project, the researchers envision a system where beekeepers can send their samples for biomarker testing and receive a report with both a health assessment and information on the most effective management strategies, which can then be applied in the field to improve the health of their colonies.

The research team is comprised of 22 researchers from across Canada including researchers from Agriculture and Agri-Food Canada (AAFC), University of Manitoba, University of Guelph and University of Laval. The project is funded through Genome Canada's Large-Scale Applied Research Project Competition: Genomics Solutions for Agriculture, Agri-food, Fisheries and Aquaculture. Funding partners include Genome Canada, AAFC, Genome British Columbia and Genome Quebec.

Credit: 
York University

Study: Spend more on housing, teens in foster care are less likely to be homeless, jailed

State spending on transitional housing supports for youth "aging out" of foster care can make a big difference in preventing homelessness, incarceration, substance abuse and early childbirth, according to a new study by social work researchers at Case Western Reserve University.

"Regardless of your high risk, if you're aging out of foster care and you live in a state that spends above average on housing support, you are less likely to be homeless or jailed compared with states that spend less," said Dana M. Prince, the national study's lead author and an assistant professor at the Jack, Joseph and Morton Mandel School of Applied Social Sciences.

The idea was straightforward: Prince and a team of researchers used national survey data from 7,449 teenagers between 18 and 19 years old who were "aging out" of foster care. Policies about how young people transition out of foster care vary from state to state; 27 states currently have a federally approved extended foster care plan.

The data was then used to determine whether there was a link between how much states spent in federal funds earmarked for transitional housing and negative outcomes associated with aging out, such as homelessness.

For this study, researchers looked at funds from the John H. Chafee Foster Care Independence Program, a federal program to help current and former foster care youths become self-sufficient. Colloquially, the funds are known to practitioners as "Chafee dollars."

However, not all the states actually spend the funds available to them, Prince said. That was taken into account.

"Our research provides some initial compelling evidence that spending higher allotments on housing provides better outcomes, but also keeps these young people out of jail and off the streets," Prince said.

"We also found that if you're a kid living in a state with higher spending on low-income and unstable housing renters," she said, "you're also less likely to become a young parent and have substance-abuse issues than if you live in a state with lower (transitional housing) spending."

The findings were recently published in the Journal of Adolescence.

Of the national sample:

* 20% of the teens reported being homeless within the past two years;

* 14.4% got a referral for substance abuse;

* 22.4% had been incarcerated;

* and 12.3% had given birth or fathered a child.

"This confirms a lot of what we had already known: There are a lot of things that happen to young people as they transition out of foster care," Prince said. "They're moving around, often multiple times, experiencing all sorts of negative outcomes."

She said the research has implications for states supporting teens aging out of foster care--across a spectrum of social and human service sectors--while planning a budget.

"Extension of foster care services isn't just a policy thing," Prince said. "We know that when foster kids are allowed to stay in foster care past 18, to 19 to 23 years old, that there are better outcomes for these kids."

Credit: 
Case Western Reserve University