Tech

More stroke awareness, better eating habits may help reduce stroke risk for young adult African-Americans

DALLAS, Feb. 12, 2020 -- Young African American adults are experiencing higher rates of stroke compared to others due to high blood pressure, diabetes and obesity, yet their perception of their stroke risk is low, according to preliminary research to be presented at the Nursing Symposium of the American Stroke Association's International Stroke Conference 2020 - Feb. 18-21 in Los Angeles. The conference is a world premier meeting for researchers and clinicians dedicated to the science of stroke and brain health.

Researchers analyzed the results of a self-completed questionnaire by 116 young African American adults to get a better understanding of their personal stroke risk, ability to live healthy, nutrition/eating patterns and the role of health literacy in decreasing their chances of stroke. They concluded that increased education about healthy lifestyle choices and better eating habits are important in helping reduce stroke risk among this group.

Most young adults in the study had unhealthy eating habits based on the American Heart Association's Life's Simple 7 recommendations for a healthier lifestyle, despite having high levels of health literacy and a perceived ability to live healthy. In addition, 53% had an inaccurate perception of their stroke risk.

The research is a secondary data analysis of the Stroke Counseling for Risk Reduction (SCORRE) study, which was published in August 2019.

"Nutrition habits are very important to the health of our society and are difficult to change, regardless of levels of health literacy," said Stacy Perrin, Ph.D., a student investigator of the SCORRE study, and a nurse with several certifications, including the Certified Stroke Registered Nurse credential, at Grady Health System in Atlanta, Georgia. "This research supports the need to target young adults to increase their awareness of the importance of healthy food choices to lower stroke risk."

Study participants were mostly female African American college students, age 25 on average. While participants averaged three modifiable stroke risk factors, such as poor nutrition, lack of exercise, being overweight and elevated blood pressure readings, 69% reported their future stroke risk as low or no risk. Perception of health and risk were measured on a scale of 0 to 10. On the Association's Life's Simple 7 nutrition recommendations, which include five categories (eating more than 4 cups of fruits and vegetables a day, for example), participants averaged a low score of 1.6 out of 5.

Among other results:

Health literacy levels, with an average 4.4 score out of 6.0, and perceived self-competence to live healthy, with an average 5.9 score out of 7.0, were high.

There was no association between health literacy and accuracy of perceived stroke risk.

Higher health literacy did not positively impact the ability to live healthy or eating patterns. However, higher perceived risk of future stroke and lower perceived ability to live healthy were significantly associated with poorer eating patterns.

"If people think they're not at risk of a stroke, they are less likely to change their behavior to reduce the risk because they don't believe anything is wrong. This perception is not healthy," Perrin said.

The National Institute of Nursing Research at the National Institutes of Health funded the SCORRE study, led by principal investigator Dawn Aycock, Ph.D., R.N., ANP-BC. Author disclosures are available in the abstract.

Some of the limitations were that the sample was mostly women with some college education and the study relied on self-reports of dietary intake.

Credit: 
American Heart Association

Smoking rates falling in adults, but stroke survivors' smoking rates remain steady

DALLAS, Feb. 12, 2020 -- While the rate of Americans who smoke tobacco has fallen steadily and significantly over the last two decades, the rate of stroke survivors who smoke has not changed significantly, according to preliminary research to be presented at the American Stroke Association's International Stroke Conference 2020 - Feb. 19-21 in Los Angeles, a world premier meeting for researchers and clinicians dedicated to the science of stroke and brain health.

"Smoking cessation should be at the top of the list of preventive measures for people with a prior stroke, because prior research shows that those who continue smoking are more likely to die, or have a heart attack or second stroke," said Neal S. Parikh, M.D., lead author of the study and assistant professor of neurology at Weill Cornell Medicine in New York City. "The importance of smoking cessation should be stressed alongside the importance of taking blood-thinning medication (such as aspirin, clopidogrel or an anticoagulant) and cholesterol-lowering medication as prescribed, and ensuring blood pressure is controlled."

The researchers examined data from more than 49,000 adults age 20 or older who participated in the National Health and Nutrition Examination Survey (NHANES) between 1999 and 2016. Of those, one in 37 people reported a prior stroke (average age 65, 57% women). Comparing active smoking between those with a prior stroke and the overall population, the researchers found:

Over the time period analyzed, 24% of stroke survivors reported active smoking, compared with 22% of all participants;

Between the 1999-2000 survey and the 2015-16 survey, the percentage of active smokers in the overall population declined steadily and significantly, from roughly 25% to 19%; and

In contrast, the percentage of stroke survivors who smoked did not change significantly, going from roughly 23% to 26% between the two surveys.

"The persistent rate of smoking among stroke survivors, who should be motivated to quit, speaks to the highly addictive nature of nicotine. It's important for stroke survivors to know that several effective, FDA-approved medications to assist patients with smoking cessation are available. Patients should ask their health care providers to help them quit smoking and seek smoking cessation programs and resources through their cities and states where available," Parikh said.

The results are similar to a 2015 analysis on heart attack patients from the same national survey, which found that the proportion of patients with a prior heart attack who smoked did not decrease between 1999 and 2012.

"The lack of progress in smoking cessation among stroke survivors reinforces the need for aggressive cessation interventions," Parikh said.

This study is limited by the small number of patients reporting prior stroke in the survey. In addition, the survey relied on participants' self-reporting of prior strokes, so some people with minor strokes may have been missed.

"This study, while based on self-reported data, raises concerns about an important gap in the completeness of secondary prevention efforts for stroke in the U.S. While data from the American Heart Association's Tobacco Center for Regulatory Science and from others indicates that many of the public are aware that smoking is a risk factor for cardiovascular events including stroke, these disappointing figures make it clear that we have more work to do to increase this knowledge, and perhaps more importantly, to help patients turn that knowledge into effective action for smoking cessation," said Rose Marie Robertson, M.D., co-principal investigator of the Center, who was not involved in this study. "Multi-episode counseling, FDA-approved nicotine replacement therapy and medications to address cravings, when indicated, should be offered during rehabilitation to all stroke survivors who smoke or use other tobacco products."

Credit: 
American Heart Association

Blacks, Hispanics of Caribbean descent have higher stroke risk than white neighbors

DALLAS, Feb. 12, 2020 -- Both Blacks and Hispanics of Caribbean descent living in Northern Manhattan have a significantly higher risk of stroke than their non-Hispanic, white neighbors, according to preliminary research to be presented at the American Stroke Association's International Stroke Conference 2020 - Feb. 19-21 in Los Angeles, a world premier meeting for researchers and clinicians dedicated to the science of stroke and brain health.

While previous research documented an increased stroke risk among Blacks and Mexican Americans, studies in Northern Manhattan have been the first to document the heightened risk for Hispanics of Caribbean descent.

Researchers from the University of Miami and Columbia University analyzed stroke risk in almost 3,300 people (average age 69; 37% men; 24% Black; 21% white; 52% Hispanic) participating in the Northern Manhattan Study (NOMAS), an ongoing, community-based study that started in 1993 and is focused on stroke rates and risk factors.

Over an average follow-up time of more than 13 years, 460 participants had strokes, the majority of which were ischemic strokes (caused by a clot in an artery feeding the brain). Researchers also found:

Overall, men had a 48% higher rate of stroke than women, even after adjusting for socioeconomic status and stroke risk factors;

Compared with non-Hispanic whites, Blacks had a 50% increased risk of stroke, even after adjusting for sociodemographics, including education and insurance;

The disparity in stroke risk between Blacks and whites was highest in women and persisted into old age (70 and older);

Compared with non-Hispanic whites, Hispanics had a 50% increased risk of stroke, a disparity that was substantially reduced after adjusting for socioeconomic status, except among women 70 and older; and

By age 85, the highest stroke incidence rate was in Hispanics.

"Previous research has suggested that racial and ethnic disparities in stroke risk are greater at younger ages and dissipate as people get older, so we were surprised to find that the differences remained strong in women over 70 years old," said Hannah Gardener, Sc.D., lead author of the study, an epidemiologist and associate scientist in neurology at the University of Miami Miller School of Medicine in Miami, Florida. "Disparities in stroke risk among elderly minorities are persistent. Identifying minority populations at a higher risk for stroke and targeting their modifiable risk factors are public health priorities."

In addition to socioeconomic status, the study adjusted for the following stroke risk factors: smoking, physical activity, alcohol consumption, body mass index, high blood pressure, high cholesterol and diabetes.

"It's important for everyone to know their stroke risk factors, take their prescribed medications and make lifestyle changes that can reduce their risk," Gardener said. "Risk factor management starting at or before middle age is key in reducing stroke risk, especially among blacks and Hispanics who are at increased risk."

Credit: 
American Heart Association

Genetics enhance sex's role as a stroke, heart attack risk factor

DALLAS, Feb. 12, 2020 -- Genetics enhances the role sex plays in determining risk for stroke and heart attack in healthy middle-aged adults (ages 40 to 60), according to preliminary research to be presented at the American Stroke Association's International Stroke Conference 2020 - Feb. 19-21 in Los Angeles, a world premier meeting for researchers and clinicians dedicated to the science of stroke and brain health.

Currently, the genetic risk score employed in this study is widely used in research; however, it is not yet used in the clinical care of patients. The findings support the possibility that genetic scores could be used to better assess the likelihood of stroke and heart attack and to develop and implement preventive efforts for middle-aged people without obvious risks.

"The risk of heart attack or stroke increases rapidly during middle age," said Guido J. Falcone, M.D., Sc.D., M.P.H., lead author of the study and assistant professor of neurology at the Yale School of Medicine in New Haven, Connecticut. "Identifying healthy middle-aged adults at higher risk to have a stroke or heart attack is important because it opens the possibility of intervening early and avoiding many years of disability."

The researchers analyzed data from more than 300,000 people (59% women) enrolled in the UK Biobank, an open-access, UK-based resource that recruited half a million people in the United Kingdom (aged 40-69) between 2006 and 2010. All UK Biobank volunteers agreed to undergo medical testing, provide detailed health information and have their health followed in the future. None of the participants in this analysis had high blood pressure, diabetes or high cholesterol at the time they enrolled. No participants had a prior heart attack or stroke, and all were screened for the presence of 68 gene variants that are known to be associated with the risk of heart attack or stroke. These data were used to generate a single score that allowed each study participant to be categorized as having low, intermediate or high genetic risk.
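The scoring approach described above, a weighted sum over many risk variants binned into low, intermediate and high categories, can be sketched in a few lines of code. The variant IDs, weights and cutoffs below are purely illustrative placeholders, not values from the study.

```python
# Hypothetical sketch of a polygenic risk score: a weighted sum of
# risk-allele counts (0, 1 or 2 per variant), binned into categories.
# Variant IDs, weights and cutoffs are illustrative, not from the study.

def polygenic_score(genotype, weights):
    """Weighted sum of risk-allele counts for the variants in `weights`."""
    return sum(w * genotype.get(variant, 0) for variant, w in weights.items())

def risk_category(score, low_cut, high_cut):
    """Bin a raw score into low / intermediate / high genetic risk."""
    if score < low_cut:
        return "low"
    if score < high_cut:
        return "intermediate"
    return "high"

weights = {"rs0001": 0.12, "rs0002": 0.08, "rs0003": 0.20}  # illustrative
person = {"rs0001": 2, "rs0002": 1, "rs0003": 0}            # allele counts

score = polygenic_score(person, weights)
print(round(score, 2), risk_category(score, low_cut=0.2, high_cut=0.5))
# 0.32 intermediate
```

In the study itself, 68 variants fed the score; this sketch mirrors the structure of that calculation, not its actual numbers.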

By age 60, there were 1,954 strokes and 3,792 heart attacks. Analyzing the influence of sex and genetics, researchers found:

The proportion of people who had a heart attack or stroke was approximately 1 in 800 by age 40, 1 in 220 by age 50, and 1 in 76 by age 60;

Compared to people at low genetic risk, strokes and heart attacks increased 22% in those at intermediate risk and increased 52% among those at high genetic risk;

Men were three times as likely as women to have a stroke or heart attack; and

Compared to women at low genetic risk, men at high genetic risk were four times as likely to have a stroke or heart attack.

"We were very surprised by the results, especially the synergy between genetic burden and sex," Falcone said. "I am cautiously optimistic. We already know that treatment for high blood pressure, high cholesterol and diabetes can help prevent heart attacks and strokes. People without these conditions who are at high genetic risk could also benefit from some type of early intervention, and the findings from our study may help us develop preventive measures for this population."

Results from this study, which included an almost entirely white, British population enrolled in the UK Biobank, may not be generalizable to other racial and ethnic groups. "These results would apply to a specific group of Americans, and we would need to test all these things in other ethnic or racial groups, including Native Americans, African Americans, Hispanics and Asians," Falcone said.

Credit: 
American Heart Association

Small altitude changes could cut the climate impact of aircraft

Contrails -- the white, fluffy streaks in the sky that form behind planes -- can harm the environment. Now, scientists report in ACS' Environmental Science & Technology that small flight path adjustments could reduce the climate impact of these emissions.

Contrails materialize at cruising altitude when aircraft emit black carbon particles from incomplete fuel combustion, providing a surface on which moisture condenses to form ice particles. Most contrails last only a few minutes. But if the atmosphere is supersaturated with ice, these tracks can spread and mix with other contrails and cirrus clouds, forming "contrail cirrus" that last up to 18 hours. Previous studies suggested contrails and the clouds they induce have as much of a warming impact on climate as planes' CO2 emissions. Marc Stettler and colleagues wanted to refine the models to more accurately predict the characteristics and impact of contrails, and to evaluate mitigation strategies, such as altering flight paths.

The team combined two models: one they recently developed to estimate black carbon emissions for specific aircraft engine types and power settings, and another that uses detailed weather information to estimate the characteristics and climate impact of contrails from individual flights. In a study of Japan's airspace, they showed that 80% of the warming caused by contrails could be traced to a mere 2.2% of flights. They then showed that making only 1.7% of aircraft fly 2,000 feet higher or lower than their originally planned flight path could limit the formation of contrails, reducing their warming effect by 59.3%.

Diversions can also make flight paths less efficient, leading to increased CO2 emissions, and previous studies that considered lateral diversions indicated this could offset any reduction in contrails. But by targeting the few flights that cause the most warming contrails and making only small altitude changes, the team found the reduction in contrail warming outweighed the CO2 penalty. In the long term, if conventional engines were also replaced with cleaner-burning engines, overall contrail impact could be reduced by 91.8%, the researchers say.
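The targeting logic above can be illustrated with a back-of-envelope calculation. The percentages are those quoted from the study; the fleet size and the per-flight CO2 penalty are invented placeholders, so the output shows the shape of the argument rather than a real estimate.

```python
# Toy model of targeted contrail avoidance: divert only the few flights
# that cause most of the warming, and check that the contrail reduction
# outweighs a small CO2 penalty. Fleet size and penalty are invented.
total_flights = 1000
warming_units = 100.0                # total contrail warming (arbitrary units)

# Study figure: 2.2% of flights cause 80% of contrail warming.
big_hitters = round(0.022 * total_flights)        # 22 flights
warming_from_big_hitters = 0.80 * warming_units   # 80.0 units

# Study figure: diverting 1.7% of flights by 2,000 ft cuts warming 59.3%.
diverted = round(0.017 * total_flights)           # 17 flights
contrail_reduction = 0.593 * warming_units        # 59.3 units

# Invented CO2 penalty per diverted flight, in the same warming units.
co2_penalty_per_flight = 0.1
net_benefit = contrail_reduction - diverted * co2_penalty_per_flight
print(round(net_benefit, 1))   # 57.6 units of warming avoided, net of CO2
```

The calculation stays net-positive only because so few flights are diverted; blanket diversions would multiply the CO2 penalty without adding much contrail reduction, which is the study's central point.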

Credit: 
American Chemical Society

Second wind: New technology to help diagnose and manage respiratory diseases

video: This is a video illustrating, in real-time, the lung function of case study mice using the non-invasive technology.

Image: 
Monash University

Monash University researchers have developed radical non-invasive X-ray technology to help diagnose, treat and manage respiratory lung diseases.

The technology, now being commercialised by med-tech company 4Dx Limited, allows researchers to see the movement of air through the lungs in real-time, improving their capacity to assess functional airflow in both healthy and diseased lungs.

The technology can pinpoint localised areas of deficiency in a lung, offering potential for faster and more accurate diagnoses.

Monash University researchers in Australia have developed radical non-invasive technology that can be used to diagnose respiratory lung diseases, such as cystic fibrosis and lung cancer, and potentially fast-track treatments for patients.

Researchers have for the first time taken technology usually confined to high-tech synchrotron facilities into a common laboratory setting, and applied new four-dimensional X-ray velocity (XV Technology) imaging to provide high-definition and sensitive real-time images of airflow through the lungs in live organisms.

The study, led by Dr Rhiannon Murrie from the Department of Mechanical and Aerospace Engineering at Monash University, shows the likely impact this technology could have on respiratory disease detection, monitoring and treatment through non-invasive and non-terminal means.

The technology also has the potential to see whether treatments for respiratory illnesses are working much earlier.

The technology has since been commercialised by Australian-based med-tech company 4Dx Limited, led by CEO and former Monash University researcher Professor Andreas Fouras. The technology has been upscaled for human clinical trials taking place in the USA, with Phase I already completed successfully.

The study was published in the internationally renowned Nature Research journal Scientific Reports in January 2020.

"The early diagnosis and ongoing monitoring of genetic and chronic lung diseases, such as cystic fibrosis, asthma and lung cancer, is currently hampered by the inability to capture the spatial distribution of lung function in a breathing lung," Dr Murrie said.

"Since pulmonary function tests are measured at the mouth, these tests are unable to localise where in the lung any change in function originates. Additionally, CT scans, while providing quality 3D images, cannot image the lung while it is breathing, which means airflow through the airways and into the lung tissue cannot be measured."

Research by Dr Murrie and a multi-disciplinary collaboration of physicists, engineers, biologists and clinicians is changing this approach to the diagnosis and treatment of lung diseases by determining functional lung movement and airflow in live mice, acquired through X-ray technology at 30 frames per second.

A comparison of a cystic fibrosis mouse model against a healthy control mouse allowed researchers to observe a dramatic reduction in lung aeration in the left lung of the diseased mouse largely due to an obstructed airway path.

Researchers were able to pinpoint the exact locations where lung deficiencies were present and the location of the obstruction causing the restricted airflow.

The successful trial opens up avenues for respiratory diseases to be diagnosed, treated and managed earlier than current technology allows and at a lower radiation dose than current CT scanning.

"The ability to perform this technique in the lab makes longitudinal studies on disease progression and treatment development feasible at readily accessible facilities across the world," Dr Murrie said.

"This finding is an exciting step in advancing the understanding of lung diseases and treatments that affect millions of people globally, and particularly for those with cystic fibrosis, which affects more than 70,000 people worldwide."

Professor Fouras said: "I am pleased to see this technology, originally developed at Monash University, and now being commercialised to maximise clinical impact, also enabling cutting-edge medical research like this."

Credit: 
Monash University

Something from nothing: Using waste heat to power electronics


Tsukuba, Japan--Collecting energy from environmental waste heat such as that lost from the human body is an attractive prospect to power small electronics sustainably. A thermocell is a type of energy-harvesting device that converts environmental heat into electricity through the thermal charging effect. Although thermocells are inexpensive and efficient, so far only low output voltages--just tens of millivolts (mV)--have been achieved and these voltages also depend on temperature. These drawbacks need to be addressed for thermocells to reliably power electronics and contribute to the development of a sustainable society.

A University of Tsukuba-led research team recently improved the energy-harvesting performance of thermocells, bringing this technology a step closer to commercialization. Their findings are published in Scientific Reports.

The team developed a thermocell containing a material that exhibited a temperature-induced phase transition of its crystal structure. Just above room temperature, the atoms in this solid material rearranged to form a different crystal structure. This phase transition resulted in an increase in output voltage from zero to around 120 mV, representing a considerable performance improvement over that of existing thermocells.

"The temperature-induced phase transition of our material caused its volume to increase," explains Professor Yutaka Moritomo, senior author of the study. "This in turn raised the output voltage of the thermocell."

The researchers were able to finely tune the phase transition temperature of their material so that it lay just above room temperature. When a thermocell containing this material was heated above this temperature, the phase transition of the material was induced, which led to a substantial rise of the output voltage from zero at low temperature to around 120 mV at 50 °C.

As well as tackling the problem of low output voltage, the thermocell containing the phase transition material also overcame the issue of a temperature-dependent output voltage. Because the increase of the output voltage of the thermocell induced by the thermal phase transition was much larger than the temperature-dependent fluctuations of output voltage, these fluctuations could be ignored.
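As a toy illustration of why the fluctuations become ignorable, one can model the output voltage as a large step at the transition temperature plus a small temperature-dependent term. Every number here is an invented placeholder except the roughly 120 mV step reported above.

```python
# Toy model (not from the paper): output voltage = phase-transition step
# + a small temperature-dependent fluctuation. The transition temperature
# and fluctuation slope are invented; the ~120 mV step is from the article.
def output_voltage_mv(temp_c, transition_c=45.0):
    step = 120.0 if temp_c >= transition_c else 0.0  # phase-transition jump
    fluctuation = 0.5 * (temp_c - 25.0)              # invented drift term, mV
    return step + fluctuation

below = output_voltage_mv(40.0)   # fluctuation only
above = output_voltage_mv(50.0)   # step dominates
print(below, above)   # 7.5 132.5
```

Because the step is an order of magnitude larger than the drift term, the cell's output above the transition is effectively constant, which is the temperature-independence the researchers describe.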

"Our results suggest that thermocell performance can be strongly boosted by including a material that exhibits a phase transition at a suitable temperature," says Professor Moritomo. "This concept is an attractive way to realize more efficient energy-harvesting devices."

The research team's design combining thermocell technology with an appropriately matched phase transition material leads to increased ability to harvest waste heat to power electronics, which is an environmentally sustainable process. This design has potential for providing independent power supplies for advanced electronics.

Credit: 
University of Tsukuba

How kirigami can help us study the muscular activity of athletes

image: Elastic kirigami patch for electromyographic analysis of the palm muscle during baseball pitching

Image: 
Waseda University

The upcoming Tokyo Olympic and Paralympic Games in 2020 represent a big opportunity for governments to promote a healthy lifestyle and sports, and the turn of the decade is a great opportunity to showcase how recent technological developments can help us understand human motion during sports. In this regard, the combination of high-speed cameras and surface electromyographic sensors, which record the electromyographic activity of palm muscles, has been employed to obtain a better understanding of the fine control athletes and sportspeople exert over their palm muscles.

However, conventional devices for surface electromyography employ small electrodes that are attached to the skin and wires, which restrict free movement. All-in-one modules containing electrodes, amplifiers, and wireless transmitters help to solve this issue only to some extent; these modules are not suitable for certain parts of the body, like the palms or soles. During pitching in baseball, for example, the ball is in direct contact with palm muscles, and integrated modules cannot be employed without being a nuisance to the user. Even if skin-like electrodes were used, the high forces and friction involved would break them apart. This has limited electromyographic studies to other parts of the arms and legs.

To address this problem, a joint research team from Waseda University and Kitasato University in Japan drew inspiration from a traditional Japanese artform called kirigami to prepare a durable skin-like patch for measuring the electromyographic activity of palm muscles, publishing their findings in NPG Asia Materials. Unlike the better-known origami, kirigami crafts contain both paper folds and cuts. Interestingly, it is possible to employ the kirigami technique to create ultrathin insulated conductive sheets that are also largely bendable and stretchable. "By cutting a conductive sheet in a special kirigami pattern and sealing it with silicone rubber, we have managed to create elastic and insulated wirings that minimized the mechanical mismatch between skin and device during exercise," reports Dr. Kento Yamagishi from Waseda University (currently at the Singapore University of Technology and Design), the lead author of the paper. These wires were combined with another of their previous inventions - conductive nanosheets that can be used on the palm or soles without problems.

These two devices together form an elastic kirigami patch that can capture electromyographic signals in difficult areas and carry them to a Bluetooth device placed in a less-obtrusive zone, such as the forearm. The research team tested their invention by measuring electromyographic signals from one of the palm muscles of an experienced baseball player throwing curveballs and fastballs, finding significant differences between the two types of throw. "Our elastic kirigami patch will serve as a minimally perceivable device to investigate the activity of the palm muscles of athletes without interfering with their performance," remarks Assist. Prof. Tomoyuki Nagmi of Kitasato University. "This surface electromyographic measurement system will enable the analysis of motion in unexplored palm muscle areas, leading to a better understanding of muscular activity in a wide range of sports and even artistic or musical performances," concludes Assoc. Prof. Toshinori Fujie of Waseda University (currently at Tokyo Institute of Technology), who led the research.

There are also potential applications in medical research for currently unexplainable motor disorders, such as the yips. A better understanding of our own bodies during exercise could help us perform better and lead a healthier lifestyle.

Credit: 
Waseda University

Circular reasoning

image: Assistant Professor, Department of Neurosciences, The University of New Mexico School of Medicine

Image: 
The University of New Mexico

In Biology 101 you were taught that inside each cell, tiny strands of a molecule called RNA "transcribe" the genetic code in your DNA, the first step in the process of building the proteins that make up your body.

But in recent years the picture has grown more complicated. It turns out there are also circular, non-coding forms of RNA that regulate various aspects of gene expression, allowing many of our 20,000 genes to make more than one form of each protein.

Now, University of New Mexico scientists have shown that reduced levels of a type of circular RNA found in the brain with the ungainly name of circHomer1a are associated with schizophrenia and bipolar disorder. Lower levels of the molecule in the human frontal cortex were also seen to be correlated with earlier onset of schizophrenia symptoms.

In a paper newly published in the journal Molecular Psychiatry, the researchers report that lowering levels of circHomer1a in the mouse brain's frontal cortex results in impaired gene expression related to the function of the synapses - junctions at the ends of neurons that enable them to "talk" to their neighbors.

The team also showed that in mice, reduction of circHomer1a in the frontal cortex diminished their cognitive flexibility - the ability to respond to changing circumstances. This impairment is commonly seen in people with bipolar disorder.

"These mice can learn and discriminate, but when it comes time for them to adjust their behavior, they are very deficient," said lead author Nikolaos Mellios, MD, PhD, assistant professor in the Department of Neurosciences. "It takes them way more trials to reverse their behavior."

Recent studies have shown that the brains of mammals are rich in circular RNAs, and it appears that they powerfully shape the way genes are transcribed to RNA and translated into proteins, Mellios said.

"They don't produce any proteins, but there's emerging research showing that they have important regulatory roles," he said. "They are like conductors in an orchestra. You need these circular RNAs to fine-tune the expression of multiple genes."

Because circular RNAs are more stable in the body than their linear cousins, they could potentially serve as biomarkers to help clinicians diagnose disease, Mellios said. And, there's a possibility that treatments could be developed to enhance their effects in the brain.

"Our lab is working on ways to find the proper approach to manipulate these circular RNAs specifically in patients," he said. "We're also trying to find which drugs will specifically change these circular RNAs as a treatment."

Credit: 
University of New Mexico Health Sciences Center

'Atomic dance' reveals new insights into performance of 2D materials

A team of Northwestern University materials science researchers has developed a new method to view the dynamic motion of atoms in atomically thin 2D materials. The imaging technique, which reveals the underlying cause of the performance failure of a widely used 2D material, could help researchers develop more stable and reliable materials for future wearables and flexible electronic devices.

These 2D materials - such as graphene and borophene - are a class of single-layer, crystalline materials with widespread potential as semiconductors in advanced ultra-thin, flexible electronics. Yet due to their thin nature, the materials are highly sensitive to external environments, and have struggled to demonstrate long-term stability and reliability when utilized in electronic devices.

"Atomically thin 2D materials offer the potential to dramatically scale down electronic devices, making them an attractive option to power future wearable and flexible electronics," said Vinayak Dravid, Abraham Harris Professor of Materials Science and Engineering at the McCormick School of Engineering.

The study, titled "Direct Visualization of Electric Field induced Structural Dynamics in Monolayer Transition Metal Dichalcogenides," was published on February 11 in the journal ACS Nano. Dravid is the corresponding author on the paper. Chris Wolverton, the Jerome B. Cohen Professor of Materials Science and Engineering, also contributed to the research.

"Unfortunately, electronic devices now operate as a kind of 'black box.' Although device metrics can be measured, the motion of single atoms within the materials responsible for these properties is unknown, which greatly limits efforts to improve performance," added Dravid, who serves as director of the Northwestern University Atomic and Nanoscale Characterization (NUANCE) Center. The research allows a way to move past that limitation with a new understanding of the structural dynamics at play within 2D materials receiving electrical voltage.

Building on a previous study in which they used nanoscale imaging to observe heat-induced failure in 2D materials, the team used electron microscopy, a high-resolution, atomic-scale imaging method, to track the movement of atoms in molybdenum disulfide (MoS2). This well-studied material was originally used as a dry lubricant in greases and friction materials, but it has recently gained interest for its electronic and optical properties. When the researchers applied an electric current to the material, they observed its highly mobile sulfur atoms continuously migrating to vacant sites in the crystal, a phenomenon they dubbed the "atomic dance."

That movement, in turn, caused the MoS2's grain boundaries -- a natural defect created where two crystallites within the material meet -- to separate, forming narrow channels for the current to travel through.

"As these grain boundaries separate, you are left with only a couple narrow channels, causing the density of the electrical current through these channels to increase," said Akshay Murthy, a PhD student in Dravid's group and the lead author on the study. "This leads to higher power densities and higher temperatures in those regions, which ultimately leads to failure in the material."

"It's powerful to be able to see exactly what's happening on this scale," Murthy continued. "Using traditional techniques, we could apply an electric field to a sample and see changes in the material, but we couldn't see what was causing those changes. If you don't know the cause, it's difficult to eliminate failure mechanisms or prevent the behavior going forward."

With this new way to study 2D materials at the atomic level, the team believes researchers could use the imaging approach to synthesize materials that are less susceptible to failure in electronic devices. In memory devices, for example, researchers could observe how the regions where information is stored evolve as electric current is applied, and adapt the design of those materials for better performance.

The technique could also help improve a host of other technologies, from transistors in bioelectronics to light emitting diodes (LEDs) in consumer electronics to photovoltaic cells that comprise solar panels.

"We believe the methodology we have developed to monitor how 2D materials behave under these conditions will help researchers overcome ongoing challenges related to device stability," Murthy said. "This advance brings us one step closer to moving these technologies from the lab to the marketplace."

Credit: 
Northwestern University

DIY tools TalkBox and SenseBox help people with disabilities to communicate

Researchers at the University of Maryland, Baltimore County (UMBC) have developed do-it-yourself (DIY) assistive technology prototypes that are revolutionizing how people with disabilities can access tools that will help them interact with the world.

Foad Hamidi, assistant professor of information systems, and his collaborators at York University in Canada and the Pamoja Community Based Organization in Kenya have created research-based assistive technology platforms for people with different abilities and in different cultural contexts to learn how to use simple computers to communicate. Importantly, the development of platform prototypes has been grounded in close collaboration among researchers and community members in Kenya and the U.S. The Institute of Electrical and Electronics Engineers (IEEE) has published the results in IEEE Pervasive Computing.

In the field of assistive technology, costs often prohibit many people with disabilities and their families from accessing useful communication technologies. Existing tools that facilitate communication are especially hard to individualize and can be costly, explains Hamidi. However, computers have steadily become less expensive to distribute and easier to use. This makes computer-based assistive technologies more accessible to people with disabilities, both inside and outside of the U.S.

Hamidi and his team have worked to develop and test two platforms: SenseBox and TalkBox. These platforms are open source and only require a Raspberry Pi (an inexpensive microcomputer), low-cost sensors, and a speaker to operate.

TalkBox allows users to communicate by touching images on an attached surface to play audio files stored within the system. The images and sounds can be customized during assembly, depending on an individual's unique needs. For example, TalkBox can be adapted to fit on a wheelchair, and it can include individualized visual elements. The TalkBox could display illustrations of faces showing different expressions, which a student could use to express an emotion. Numerous adjustments are available to the user, making the technology extremely customizable.

SenseBox relies on a similar model of stimuli being translated into audio, but it operates using tactile objects, which are recognized by sensors. These tactile objects are embedded with radio frequency identification (RFID) tags, similar to how objects are tagged in stores. The objects can be 3-D printed, which permits extensive customization.
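The core of the SenseBox design described above is a lookup from scanned RFID tag to audio clip. The sketch below illustrates that idea in Python; the tag IDs, file paths, and function name are invented for illustration, and the actual open-source platform may organize this differently.

```python
from typing import Optional

# Hypothetical mapping from RFID tag IDs (read from tactile objects) to
# pre-recorded audio clips stored on the Raspberry Pi. All IDs and paths
# here are placeholders, not values from the real TalkBox/SenseBox code.
TAG_TO_AUDIO = {
    "04A1B2C3": "/home/pi/sounds/favorite_song.wav",
    "04D4E5F6": "/home/pi/sounds/hello.wav",
}

def audio_for_tag(tag_id: str) -> Optional[str]:
    """Return the audio file for a scanned tag, or None if the tag is unknown."""
    return TAG_TO_AUDIO.get(tag_id)

# On the device, a scan event would trigger playback (e.g. by invoking a
# command-line player); here we just resolve the clip for a scanned tag.
print(audio_for_tag("04A1B2C3"))
```

Because the mapping is just data, customizing the device for a new user amounts to swapping tags and audio files, which is what makes the DIY approach so adaptable.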

TalkBox was successfully used in Kenya by a special education teacher who was able to input the names of all of his students onto the device to be used in class. This application of the device led to a noticeable increase in participation and inclusion. The success of the tool within that classroom has already led to an increased interest in the technology for other potential stakeholders in Kenya. The researchers hope to work with community members in Kenyan universities and healthcare facilities to expand the availability of this tool, and help stakeholders learn how to use it.

In the U.S., SenseBox was used by a speech-language pathologist and a nonverbal client with low vision and autism spectrum disorder. The client was able to play his favorite music by holding the desired CD case to the device, which was a major stepping stone in his communication. Previously, he had difficulty using other devices to achieve this same goal of playing his favorite artist.

The success of these DIY devices rests on the fact that people with limited experience using technology can quickly learn how to use the tools and teach others how to use them. Hamidi and his research partners see their close collaboration with those who will be using TalkBox and SenseBox as essential to ensuring the tools are tailored to meet their needs.

The researchers continue to explore how best they can scale up the use of these new tools to support people with disabilities who are seeking new ways to communicate in a broad range of cultural contexts.

Credit: 
University of Maryland Baltimore County

Software updates slowing you down?

image: Schematic illustrating how Muzahid's deep learning algorithm works. The algorithm is ready for anomaly detection after it is first trained on performance counter data from a bug-free version of a program.

Image: 
Texas A&M Engineering

We've all shared the frustration -- software updates that are intended to make our applications run faster inadvertently end up doing just the opposite. These bugs, dubbed in the computer science field as performance regressions, are time-consuming to fix since locating software errors normally requires substantial human intervention.

To overcome this obstacle, researchers at Texas A&M University, in collaboration with computer scientists at Intel Labs, have developed a completely automated way of identifying the source of errors caused by software updates. Their algorithm, based on a specialized form of machine learning called deep learning, is not only turnkey, but also quick, finding performance bugs in a matter of a few hours instead of days.

"Updating software can sometimes turn on you when errors creep in and cause slowdowns. This problem is even more exaggerated for companies that use large-scale software systems that are continuously evolving," said Dr. Abdullah Muzahid, assistant professor in the Department of Computer Science and Engineering. "We have designed a convenient tool for diagnosing performance regressions that is compatible with a whole range of software and programming languages, expanding its usefulness tremendously."

The researchers described their findings in Advances in Neural Information Processing Systems 32, the proceedings of the Neural Information Processing Systems conference held in December.

To pinpoint the source of errors within software, debuggers often check the status of performance counters within the central processing unit. These counters are special-purpose registers that track how the program executes on the computer's hardware, such as how it uses memory. So, when the software runs, counters keep track of the number of times it accesses certain memory locations, how long it stays there and when it exits, among other things. Hence, when the software's behavior goes awry, counters are again used for diagnostics.

"Performance counters give an idea of the execution health of the program," said Muzahid. "So, if some program is not running as it is supposed to, these counters will usually have the telltale sign of anomalous behavior."

However, newer desktops and servers have hundreds of performance counters, making it virtually impossible to keep track of all of their statuses manually and then look for aberrant patterns that are indicative of a performance error. That is where Muzahid's machine learning comes in.

By using deep learning, the researchers were able to monitor data coming from a large number of the counters simultaneously by reducing the size of the data, which is similar to compressing a high-resolution image to a fraction of its original size by changing its format. In the lower dimensional data, their algorithm could then look for patterns that deviate from normal.
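The idea described above can be illustrated with a toy example. The sketch below stands in PCA (a simple linear dimensionality reduction) for the paper's deep learning model, and random numbers for real counter traces; it is only meant to show the compress-then-flag-deviations pattern, not the actual method or data from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for performance-counter traces from a bug-free program:
# rows are time samples, columns are counters.
normal = rng.normal(0, 1, size=(500, 50))

# "Training": learn a low-dimensional basis from the normal data, playing
# the role of the compression step described in the article.
mean = normal.mean(axis=0)
_, _, vt = np.linalg.svd(normal - mean, full_matrices=False)
basis = vt[:5]  # keep only 5 components

def reconstruction_error(x):
    """How poorly a sample is explained by the low-dimensional model."""
    centered = x - mean
    return np.linalg.norm(centered - (centered @ basis.T) @ basis, axis=1)

# Set an anomaly threshold from the normal data's own errors.
threshold = np.percentile(reconstruction_error(normal), 99)

# Simulate a performance regression: one counter drifts far out of range.
anomaly = rng.normal(0, 1, size=(1, 50))
anomaly[0, 7] += 25.0

print(reconstruction_error(anomaly)[0] > threshold)  # the regression is flagged
```

The deep learning model in the study serves the same purpose as the linear basis here, but can capture far richer nonlinear patterns across hundreds of counters.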

When their algorithm was ready, the researchers tested if it could find and diagnose a performance bug in a commercially available data management software used by companies to keep track of their numbers and figures. First, they trained their algorithm to recognize normal counter data by running an older, glitch-free version of the data management software. Next, they ran their algorithm on an updated version of the software with the performance regression. They found that their algorithm located and diagnosed the bug within a few hours. Muzahid said this type of analysis could take a considerable amount of time if done manually.

In addition to diagnosing performance regressions in software, Muzahid noted that their deep learning algorithm has potential uses in other areas of research as well, such as developing the technology needed for autonomous driving.

"The basic idea is once again the same, that is being able to detect an anomalous pattern," said Muzahid. "Self-driving cars must be able to detect whether a car or a human is in front of it and then act accordingly. So, it's again a form of anomaly detection and the good news is that is what our algorithm is already designed to do."

Other contributors to the research include Dr. Mejbah Alam, Dr. Justin Gottschlich, Dr. Nesime Tatbul, Dr. Javier Turek and Dr. Timothy Mattson from Intel Labs.

Credit: 
Texas A&M University

Social control among immune cells improves defense against infections

image: This image depicts T cells interacting with each other. Cell surfaces are labeled in red, cell nuclei in blue and receptors mediating communication in green.

Image: 
Immunity journal

A simple mechanism, previously known from bacteria, ensures that the immune system strikes a balance between the rapid expansion of immune cells and the prevention of an excessive self-damaging reaction after an infection. This has now been deciphered by scientists at the University Hospital of Freiburg (Germany) and colleagues from the Netherlands and Great Britain. An infection quickly activates T-cells, which leads to their proliferation. The research team has now shown that these cells are able to perceive each other and - based on their density - jointly determine whether or not they should continue to proliferate. The newly discovered mechanism could also help to improve cancer immunotherapies. The study was published in the scientific journal Immunity on 11 February 2020.

Cooperation among immune cells

"We showed that these immune cells perceive and regulate each other. The immune cells act as a team and not as autonomously acting individualists," said Dr. Jan Rohr, head of the study and scientist at the Centre for Immunodeficiency (CCI) at the University Hospital of Freiburg. "This principle of density control of immune cells is simple and very effective. This makes it reliable and at the same time hopefully accessible for therapeutic approaches," said Rohr. At low density, the T-cells support each other in their proliferation. As soon as a threshold value of cell density is reached, the mutual support turns into mutual inhibition, which prevents further cell proliferation. This mechanism leads to the efficient amplification of initially weak immune reactions, but is also able to prevent excessive and potentially dangerous immune reactions.

Immunotherapies could become even more effective

This finding casts new light on certain cancer immunotherapies. Tumors protect themselves by suppressing the immune system. To circumvent this, therapies have been developed in which T-cells are taken from patients, strengthened and expanded in the laboratory, and finally returned to the patient. These therapies usually administer high cell counts to make the treatment particularly effective. "It is possible that the immune cells switch each other off if they are administered in high numbers. A repeated administration of lower numbers of immune cells may fight the tumour cells more effectively," says Rohr. The extent to which this might help to improve current immunotherapies will have to be investigated in further studies.

In their study, the scientists investigated immune cells in the laboratory using microscopic time-lapse imaging and genetic analyses. The mechanisms found were then used by researchers at the University of Leiden, Netherlands, to develop a mathematical model of cell-cell interactions. Finally, the mechanisms found were tested in animal models. "These different research approaches complemented and supported each other very well," said the project leader from the University of Freiburg.

Credit: 
University of Freiburg

Revenge is more enjoyable than forgiveness -- at least in stories

COLUMBUS, Ohio - When it comes to entertainment, people enjoy seeing bad guys get their punishment more than seeing them be forgiven, a new study reveals.

But even though they don't enjoy the forgiveness stories as much, people do find these narratives more meaningful and thought-provoking than ones in which the bad guys receive their just deserts.

"We like stories in which the wrongdoers are punished and when they get more punishment than they deserve, we find it fun," said Matthew Grizzard, lead author of the study and assistant professor of communication at The Ohio State University.

"Still, people appreciate stories of forgiveness the most, even if they don't find them to be quite as fun."

The study was published online recently in the journal Communication Research and will appear in a future print edition.

The study involved 184 college students who read short narratives that they were told were plots to possible television episodes.

The students read 15 narratives: one-third in which the villain was treated positively by the victim; one-third in which the villain received a just punishment; and one-third in which the villain was punished beyond what would have been a suitable penalty for the crime.

For example, one story involved a person stealing $50 from a co-worker. Participants read one of three possible endings.

In one scenario, the victim bought coffee for the thief (under-retribution/forgiveness); in another, the victim stole a $50 bottle of whiskey from the thief (equitable retribution); and in the third version the victim both stole his money back and downloaded porn onto the thief's work computer (over-retribution).

Immediately after reading each scenario, the participants were asked if they liked or disliked the narrative. More people liked the equitable retribution stories than those that involved under- or over-retribution, Grizzard said.

The researchers also timed how long it took the readers to click the like or dislike button on the computer after reading each of the narratives.

They found that readers took less time to respond to stories with equitable retribution than to stories with under- or over-retribution.

"People have a gut-level response as to how they think people should be punished for wrongdoing and when a narrative delivers what they expect, they often respond more quickly," Grizzard said.

When the punishment did not fit the crime, the participants took a bit longer to respond to the story with a like or dislike. But the reason for the delay appeared to differ between stories with under-retribution and stories with over-retribution, Grizzard said, and the next part of the study suggests why.

After the participants read all 15 narratives, they rated each story for enjoyment ("This story would be a good time, fun, entertaining") and appreciation ("This story would be meaningful, moving, thought-provoking").

Participants thought stories in which the bad guys were over-punished would be the most enjoyable and those in which the bad guys were forgiven would be the least enjoyable to watch. Equitable punishment was in the middle.

But they also said they would appreciate the stories about forgiveness more than the other two types of narratives.

So the participants may have paused slightly before responding to the forgiveness stories to reflect, because they saw them as more meaningful, Grizzard said.

But while they also paused for the over-punishment narratives, they did not find them more meaningful, only more enjoyable, he said. That suggests the pause may have been simply to savor the extra punishment the villain received.

"It appears to be the darker side of just enjoying the vengeance," he said.

Overall, the results suggest that a fair and just retribution is the "intuitive moral standard" that comes to us easily and naturally, according to Grizzard.

"But seeing a lack of punishment requires a level of deliberation that doesn't come to us naturally. We can appreciate it, even if it doesn't seem particularly enjoyable."

Credit: 
Ohio State University

Researchers: Synthetic chemicals in soils are 'ticking time bomb'

A growing health crisis fueled by synthetic chemicals known as per- and polyfluoroalkyl substances, or PFAS, in groundwater has garnered much attention in the last few years.

The reported levels could be "just the tip of the iceberg," as most of the chemicals are still migrating down slowly through the soil, according to Bo Guo, University of Arizona assistant professor of hydrology and atmospheric sciences.

Nearly 3,000 synthetic chemicals belong to the PFAS class. They have been used since the 1940s in food packaging, water-resistant fabrics, non-stick products, pizza boxes, paints, firefighting foams and more, according to the Environmental Protection Agency.

The chemicals don't break down in the environment, nor in the body, and a growing number of research papers have shown that PFAS contamination in water sources is widespread in the United States and that exposure is harmful to health.

"Because PFAS are in a lot of consumer and industrial products, they can get into wastewater. Treatment plants are not designed to treat these compounds, so these chemicals just stay in that water to get reused. It's sprayed on soccer fields or used to recharge aquifers, for example," said Mark Brusseau, professor of environmental science. "PFAS can also get into the biosolids, which are land-applied as fertilizer, so there are all these sources, which means they could have entered the environment at many different time periods and repeatedly."

To understand how the chemicals migrate through the soil between the land surface and groundwater - an area called the vadose zone - University of Arizona researchers developed a novel mathematical model to simulate the different complex processes that affect the transport and retention of these chemicals.

Their findings are published in the journal Water Resources Research.

Their model showed that the majority of PFAS chemicals accumulate in places where air contacts the surface of water trapped in the soil, which significantly slows the chemicals' downward march to groundwater. The researchers also found that, unexpectedly, the chemicals move more slowly through coarse-grained soils than through fine-grained soils.
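A back-of-the-envelope calculation illustrates why retention at air-water interfaces slows PFAS so dramatically. The retardation-factor form below (solid-phase sorption plus air-water interfacial adsorption) is standard in the vadose-zone transport literature; all numbers are invented placeholders, not parameters or results from this study.

```python
# Hypothetical soil and chemical parameters (placeholders for illustration).
rho_b = 1.6     # soil bulk density, g/cm^3
theta_w = 0.15  # volumetric water content (partially saturated soil)
K_d = 1.0       # solid-phase sorption coefficient, cm^3/g
A_aw = 100.0    # air-water interfacial area per unit volume, cm^2/cm^3
K_aw = 0.01     # air-water interfacial adsorption coefficient, cm

# Retardation factor: how many times more slowly the chemical travels than
# the water itself. The third term is the air-water interface contribution
# that the article identifies as the dominant slowing mechanism.
R = 1 + (rho_b * K_d) / theta_w + (A_aw * K_aw) / theta_w

water_velocity = 50.0  # assumed downward water flux, cm/year
print(water_velocity / R)  # effective PFAS velocity, cm/year
```

With these illustrative numbers the chemical moves at only a small fraction of the water's speed, consistent with the picture of PFAS lingering in the soil for decades on the way to groundwater.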

"This means that the majority of PFAS are still in the soil, and they are migrating down slowly in a way similar to a ticking time bomb," said Guo, the study's lead author.

Previous observations showed that PFAS chemicals were moving slowly through the soil before reaching the groundwater, but no one understood why. The model defines the mechanisms behind the extremely slow migrations seen in the field.

"This has big implications for focusing remediation," Guo said. "So far, groundwater has been the focus, but should we actually focus on soil, which is where most of the PFAS are and will be for a long time? Or do we wait and remediate the groundwater for decades or centuries?"

The model can work for any of the PFAS chemicals, but the researchers specifically simulated PFOS, or perfluorooctanesulfonate, which is commonly found in firefighting foam and is of primary concern.

"One of our objectives in the future would be to apply the model to different sites," said, Brusseau, who co-authored the study with Guo and Jicai Zeng, a postdoctoral researcher in Guo's group. "Then hopefully it will be useful for policy makers, regulators, environmental consultants to do assessments."

Credit: 
University of Arizona