Tech

Working women healthier even after retirement age

In a new study, Jennifer Caputo, a research scientist at the Max Planck Institute for Demographic Research, found that women who worked consistently during their prime midlife working years had better physical health than non-working women later in life. Working women were also less depressed over the following decades as they entered old age, and they even lived longer, according to the study recently published in the journal Demography.

Jennifer Caputo and her coauthors analyzed data from the National Longitudinal Survey of Mature Women in the US. The survey began in 1967 with about 5,100 women aged 30-44 and followed them until they were 66-80 years old in 2003. Their analyses showed that women who regularly worked for pay during the first 20 years of the study reported fewer physical health limitations and symptoms of depression as they aged over the next 16 years than women who didn't work for pay, including housewives. They also had a more than 25 percent lower risk of having died by 2012.
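
A result like the 25 percent figure typically comes out of a survival model fit to the follow-up data. The following is a rough sketch of that kind of analysis, not the authors' code; the column names and DataFrame contents are hypothetical toy data:

```python
# Minimal survival-analysis sketch of a "lower risk of having died"
# comparison, using the lifelines package. Toy data only; the real
# study used the NLS Mature Women files and adjusted for covariates.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "years_followed": [36, 40, 45, 33, 42, 38],  # years until death or censoring
    "died":           [1, 0, 1, 0, 1, 0],        # 1 = died during follow-up
    "worked_midlife": [0, 1, 0, 0, 1, 1],        # worked for pay in first 20 years
})

cph = CoxPHFitter()
cph.fit(df, duration_col="years_followed", event_col="died")
cph.print_summary()
# A hazard ratio of about 0.75 on `worked_midlife` would correspond to
# the reported "more than 25 percent lower risk" of death.
```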

Bad experiences at work - still healthier than non-workers

Consistently negative experiences with work did appear to take a toll on women's health later on. Those who perceived discrimination at work, didn't particularly like their jobs, and said they did not feel committed to their work had poorer physical and mental health as they aged. However, women with these experiences were still healthier in late life than non-workers.

The study also covers a historical period in which the gender composition of the labor force was changing rapidly. Caputo comments, "Many women in this study went to work in low-status or traditionally male-dominated fields. It is perhaps especially telling that despite these less equitable conditions, they were healthier later in life than women who didn't work outside the home."

Women's health benefitted by being employed

The authors also found that taking into account income, occupational class, and hours worked did not fully explain why working women were healthier and lived longer than non-working women.

"Our findings support the conclusion that women's health is benefitted by being employed, regardless of their economic situation and even if they don't always have the best working experiences," says Caputo. She adds: "For the first time we were able to show a positive long-term relationship between working at midlife and health over many following years, even past the age of retirement."

Credit: 
Max-Planck-Gesellschaft

Low back pain accounts for a third of new emergency department imaging in the US

image: Values in table are percentages.

Image: 
American Journal of Roentgenology (AJR)

Leesburg, VA, December 18, 2019--The use of imaging for the initial evaluation of patients with low back pain in the emergency department (ED) continues to occur at a high rate--one in three new emergency visits for low back pain in the United States--according to an article published ahead-of-print in the February 2020 issue of the American Journal of Roentgenology (AJR).

"Although there has been a modest decline," wrote Jina Pakpoor of the University of Pennsylvania, "in 2016, approximately one in three patients still continued to receive imaging in the ED. Further, significant geographic variation exists between differing states and regions of the United States."

Pakpoor and colleagues identified emergency department visits for patients with low back pain billed to insurance by querying IBM's MarketScan Commercial Claims and Encounters research database for patients 18-64 years old.

After excluding patients with concomitant encounter diagnoses suggesting trauma, as well as those with previous visits for back pain, the researchers used Current Procedural Terminology codes to identify three imaging modalities: radiography, CT, and MRI.

Of the 134,624 total encounters meeting Pakpoor's inclusion criteria, imaging was obtained in 44,405 (33.7%) visits, and the imaging rate decreased from 34.4% to 31.9% between 2011 and 2016 (odds ratio per year, 0.98 [95% CI, 0.98-0.99]).
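
As a back-of-the-envelope check (ours, not the authors'), the per-year odds ratio is consistent with the endpoint figures: the odds of imaging were 0.344/0.656 ≈ 0.52 in 2011 and 0.319/0.681 ≈ 0.47 in 2016, a ratio of about 0.89 over the period, and 0.89^(1/5) ≈ 0.98 per year.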

During the five-year study period, 30.9% of patients underwent radiography, 2.7% of patients underwent CT, and 0.8% of patients underwent MRI for evaluation of low back pain.

Imaging utilization also varied significantly by geographic region.

Acknowledging that further research is necessary "to understand the underlying reasons for persistent use of potentially unwarranted imaging in the emergency setting," Pakpoor concluded: "our results indicate that the use of imaging for the evaluation of patients with low back pain in the ED is moderately declining but continues to occur at an overall high rate."

Credit: 
American Roentgen Ray Society

Comparing heirloom and modern wheat effects on gut health

Amid concerns about gluten sensitivity, increasing numbers of people are avoiding wheat. Most have not been diagnosed with a wheat-related medical condition, yet they seem to feel better when they don't eat gluten-containing foods. A possible explanation is that modern varieties of wheat are responsible. But now, researchers reporting in ACS' Journal of Agricultural and Food Chemistry have shown that a popular modern variety does not impair gastrointestinal health in mice compared with heirloom wheat.

When people with celiac disease or other forms of gluten sensitivity eat wheat, they experience gastrointestinal distress and inflammation. However, little is known about whether eating wheat could cause gastrointestinal problems in healthy people. Some have speculated that selective breeding of wheat might have altered the grain in a way that negatively affects gut health. From the late 1800s to 1940s, a variety known as "Turkey" was the major wheat grown in the U.S. Then, selective breeding created new types with higher yields and resistance to pests and pathogens. The "Gallagher" variety, introduced in 2012, is now one of the most widely grown bread wheats in the U.S. Great Plains region. Brett Carver, Brenda Smith and colleagues wondered whether eating the modern Gallagher variety would increase gastrointestinal problems in healthy mice relative to a blend of two heirloom wheats, Turkey and "Kharkof."

To simulate a Western-type diet, which has itself been linked to chronic inflammation and disease, the researchers fed mice chow that was high in sugar and fat. Then, they added either heirloom or modern wheat to the food, at a level that resembled normal-to-high human consumption. Signs of gut inflammation were similar between mice fed the heirloom and modern varieties, although heirloom wheat slightly reduced levels of the pro-inflammatory cytokine interleukin-17. However, modern Gallagher wheat improved the structure of villi -- fingerlike projections that absorb nutrients -- in a specific region of the small intestine compared with heirloom wheat. These findings indicate that a modern wheat variety did not compromise gut barrier function or contribute to inflammation in healthy mice compared with its heirloom predecessors, the researchers say.

Credit: 
American Chemical Society

Mentoring project deepened student learning, commitment

image: Consuelo Waight, associate professor of human development at the University of Houston, found that students in her graduate-level course on organization development reported a deeper connection to the field after working with a mentor.

Image: 
University of Houston

Pairing graduate students with professionals working in their field resulted in deeper learning and inspired passion for the work, according to new research from the University of Houston.

"By taking the students outside the classroom, they saw the relevance and meaningfulness of what they were learning. That motivated them beyond case studies," said Consuelo Waight, associate professor of human development at UH and corresponding author of a paper on the research, tied to a formal mentoring program for students in a graduate-level course on organization development and published in the journal Mentoring & Tutoring: Partnership in Learning. "Organization development was now personal. It was not a concept in a book."

Organization development is an outgrowth of the discipline of human resources, focused on leading and executing change, whether that involves the merger of two companies, training new employees or other upheaval. Responding to change has always been a core skill in the business world, Waight said, involving the ability to manage how external changes affect a company as well as how the company is affected by internal changes, such as a restructuring or onboarding a new manager.

Waight began asking students in her introductory graduate-level class to find mentors working on organization development in the business world when she first taught the class more than a decade ago. Students are required to keep a diary of their interactions, as well as the thoughts and insights arising from those interactions.

She knew it was a powerful experience and ultimately used the diaries to create a dataset to quantitatively demonstrate the impact on students. Mayura Pandit-Tendulkar, associate director of learning for Emeritus Institute of Management, is co-author of the work.

Most research involving academic mentoring has looked at peer mentoring for the purposes of persistence, advancement, achievement, identity and success, Waight said. This study instead considered how forming a relationship with a practitioner in the student's field can boost learning and understanding of the field.

A second takeaway, Waight said, was that student-kept diaries can be a viable source of research data.

The mentoring project isn't an internship; internships are typically focused on objectives set by the employer. In this project, students set the agenda, and the mentor agrees to share insights on those topics, often inviting the student to sit in on company meetings, sharing artifacts, and otherwise helping them immerse themselves in the field.

Waight and Pandit-Tendulkar concluded that the experience helped the students better understand organization development in several ways: a deeper understanding of the concept; discovery of how widely the concept is used throughout organizations; and recognition that the theories taught in class are put into action in the workplace.

As one student wrote in a diary, "Initially, the universal aspect of OD (organization development) that my mentor spoke about surprised me. I had been considering the practice of OD in a more capsulated way - as a practice that only human resource departments and the like would use. However, speaking with my mentor expanded my view of OD."

Credit: 
University of Houston

Researchers discover how ant species uses abdomen for extra power during jumps

image: Researchers at the University of Illinois and the University of Pennsylvania have shown that a species of ant uses abdominal rotation to power their jumping.

Image: 
Adrian Smith

Researchers in the department of entomology at the University of Illinois have shown how a species of ant uses its abdomen to add speed to its jump, in a recent study published in Integrative Organismal Biology. With a name like Gigantiops destructor, one might expect this ant species to be large or aggressive, but these relatively shy ants common to South America are anything but. Compared to other notable Amazonian ants such as bullet, army and leafcutter ants, Gigantiops are smaller, less confrontational, and often overlooked as one walks through the rainforest. However, these ants are capable of a rather unique behavior - they travel through their leaf litter habitats by jumping - and rotating their abdomens to power part of that process.

Gigantiops destructor is one of only four types of ants that are known to use their legs to jump as a form of locomotion. The ants use their legs to make precise directional jumps, aided by how they move their abdomens, as the new study shows. "It had been previously thought that these ants were swinging their abdomen above their body during takeoff to help power their jumps," said study co-author Josh Gibson, a graduate student in entomology. "Until now, no one had actually tested it, so that's what we set out to do in this paper." Gibson co-authored the study with Dr. Andrew Suarez (entomology; evolution, ecology and behavior) and Dajia Ye (alumna and current graduate student at the University of Pennsylvania).

Using high speed cameras to study the ants in slow motion, the authors compared the jumps of unencumbered ants to the jumps of ants who had been sedated and had their abdomens glued in place, as well as to ants who had been sedated and sham-treated as a control. By limiting their ability to move, the researchers hoped to understand how the ants utilized this body movement for jumping.

The results indicate that moving their abdomens helps the ants jump farther, higher, and faster overall. This is particularly helpful to the ants as they try to navigate the detritus on a forest floor. "These ants forage in leaf litter, and it can be quite difficult if the ants have to walk up and down each leaf," said Gibson. "Jumping improves their ability to travel much more quickly through the litter, and their abdominal rotation powers those jumps, at least in part."

Interestingly, the abdomen rotation does not seem to be involved as much in stabilizing the body of the ant during the jump - which is where the legs come in. "Looking at future work, we could look at the center of body mass on the ants and see how they use their legs to stabilize themselves after they jump," added Gibson. "Learning more about how these ants move and how they use their bodies to accomplish that can help us gain a better understanding of how this could apply to other species, or provide insight into bio-inspired design concepts."

Credit: 
University of Illinois College of Liberal Arts & Sciences

Saccharin derivatives give cancer cells a not-so-sweet surprise

Saccharin received a bad rap after studies in the 1970s linked consumption of large amounts of the artificial sweetener to bladder cancer in laboratory rats. Later, research revealed that these findings were not relevant to people. And in a complete turnabout, recent studies indicate that saccharin can actually kill human cancer cells. Now, researchers reporting in ACS' Journal of Medicinal Chemistry have made artificial sweetener derivatives that show improved activity against two tumor-associated enzymes.

Saccharin, the oldest artificial sweetener, is 450 times sweeter than sugar. Recently, scientists showed that the substance binds to and inhibits an enzyme called carbonic anhydrase (CA) IX, which helps cancer cells survive in the acidic, oxygen-poor microenvironments of many tumors. In contrast, healthy cells make different -- but very similar -- versions of this enzyme called CA I and II. Saccharin and another artificial sweetener called acesulfame K can selectively bind to CA IX over CA I and II, making them possible leads for anti-cancer drugs with minimal side effects. Alessio Nocentini, Claudiu Supuran and colleagues wondered whether they could make versions of the artificial sweeteners that show even more potent and selective inhibition of CA IX and another tumor-associated enzyme, CA XII.

The team designed and synthesized a series of 20 compounds that combined the structures of saccharin and acesulfame K and also added various chemical groups at specific locations. Some of these compounds showed greater potency and selectivity toward CA IX and XII than the original sweeteners. In addition, some killed lung, prostate or colon cancer cells grown in the lab but were not harmful to normal cells. These findings indicate that the widely used artificial sweeteners could be promising leads for the development of new anticancer drugs, the researchers say.

Credit: 
American Chemical Society

Study reveals molecular features of anxiety in the brain

Boston, MA -- Nearly 40 million people in the U.S. have an anxiety disorder. While treatment options exist, treatment success varies, and many people do not respond until weeks or months after they begin antidepressants. Other medications, such as benzodiazepines, can relieve symptoms quickly but can have side effects and risks, especially if taken over the longer term. Better treatment is needed, but the search for new therapies has lagged over the decades, in part because of the limitations of preclinical models. Investigators from Brigham and Women's Hospital have taken a new approach to the search, developing a rational, computationally inspired method for the preclinical study of anxiety. The team's efforts have been fruitful, uncovering more than 209 genes whose activity changes across anxiety categories, as well as new targets for drug development. Results are published in Translational Psychiatry.

"Treatment for acute, in-the-moment anxiety hasn't evolved much in the last 50 years, and that's in part due to the challenges of recapitulating human anxiety in preclinical models and the petri dish," said co-corresponding author Ilana Braun, MD, chief of the Psychosocial Oncology Service in the Department of Psychiatry at the Brigham. "Animal models have exaggerated symptoms, and this limits our ability to study anxiety and its neurochemistry."

To improve the study of anxiety, Braun collaborated with Aaron Goldman, PhD, associate bioengineer in the Brigham's Division of Engineering in Medicine. Goldman, who uses computational modeling to study cancer, saw the opportunity to bring the tools from his field of study to bear on the challenges of studying anxiety disorders.

"In cancer, we use computational models. We take mathematical modeling and computational approaches and marry these together with gene expression RNA sequencing or protein analysis," said Goldman, a co-corresponding author. "We realized we could take the tools in our lab and apply it to anxiety to develop a more rational way to address the challenges and identify inherent features of anxiety."

Braun, Goldman and colleagues developed their approach by first studying genetically and developmentally normal mice. The mice performed a series of behavioral tests, and the investigators selected the mouse strain that had the most variance in their performance. They then used computational modeling and RNA sequencing together in a unique fashion to stratify the animals based on their social and behavioral choices, categorizing the mice as having low, medium and high anxiety.
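
The low/medium/high grouping can be pictured with a simple clustering sketch. This is a generic stand-in for the team's computational stratification, not their actual pipeline, and the behavioral scores below are simulated:

```python
# Toy stratification of mice into three anxiety groups from behavioral
# test scores via k-means. Generic illustration only; the study paired
# computational modeling with RNA sequencing in its own way.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Rows = mice; columns = hypothetical standardized test scores
# (e.g., exploration time, social approach).
scores = rng.normal(size=(30, 2))

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scores)
for group in range(3):
    print(f"cluster {group}: {np.sum(labels == group)} mice")
# The clusters would then be ranked by mean anxiety-like behavior into
# low, medium, and high anxiety before comparing gene expression.
```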

The team found a large degree of molecular variation in the amygdala of the mice, with more than 209 genes that had different levels of activity across the three categories of anxiety. These included changes in genes associated with synaptic plasticity (learning and memory), and genes involved in the expression of hormones such as estradiol (the strongest form of estrogen) and prolactin (a hormone tied to pregnancy and breastfeeding). The team also found changes pointing to G-protein coupled receptors, including one that has been tied to blood vessel formation but never before to anxiety states. The team conducted further analyses to assess highly interconnected sets of genes and potential druggable anxiety targets.

"One of the beautiful things about this computational approach is that it is multi-faceted. Many tests just capture one aspect of anxiety. But here we can test exploration and sociability and capture information about temperament," said Braun. "The result is a treasure trove of genes and pathways. Our next step will be to further interrogate the genes of interest that are druggable."

Credit: 
Brigham and Women's Hospital

Air travel reduces local investment bias, benefits investors and firms, study shows

Easy access to air travel has not only flattened the world, it also has flattened the bias toward investing locally, according to new research from the University of Notre Dame.

"Mobility of population is death to localism," as stated in the study "Investment in a Smaller World: The Implications of Air Travel for Investors and Firms," forthcoming in Management Science from Zhi Da, professor of finance in Notre Dame's Mendoza College of Business. The paper is one of the first to quantify the impact of a "flattening world" on financial outcomes such as local investment bias and firm cost of capital.

"We find that investors in one location -- say Austin, Texas -- are more likely to invest in companies in a faraway location -- for example, San Jose, California -- when the two locations are better connected by air traffic," Da says. "This investment reduces the risk for Austin investors since they now hold a more diversified portfolio. It also reduces the cost of capital for a San Jose company by approximately 1 percent, as its funding sources become more diverse.

"More broadly," he explains, "our findings suggest that the mobility of the population and the resulting exchange of ideas and efficient capital flows are good for both the investors and the companies."

The team extends the literature on local investment bias where proximity is simply measured by geographic distance. They argue air travel has made geographic distance less relevant. For example, the distance between Chicago and San Francisco is more than twice that between South Bend and Tallahassee, but the former location pair is better connected than the latter due to the availability of direct flights.

"When we study the investment flow between two cities," Da says, "the air traffic is probably more relevant than the geographic distance."

One could argue the improved economic conditions in Austin and San Jose are what led to increased air traffic and investment flows between the two cities. Put simply, air traffic is the symptom, rather than the cause.

"We tackle this issue by focusing on the initiation of connecting flights (between Austin and San Jose) attributable to the 1997 opening of an air hub in Los Angeles," Da says. "In other words, we focus on variation in air traffic between two peripheral airports in a network whose connectivity is re-optimized in response to the addition of a central airport. Such variation is less likely driven by economic conditions of the peripheral cities hosting airports."

The timing supports this interpretation: investment at destinations served by these connecting flights increases after, not before, their initiation. The improved air traffic has a bigger impact on companies in smaller and less accessible cities. Their average cost of capital is reduced by 1.5 percent as a result, compared with 1 percent for an average firm.

Credit: 
University of Notre Dame

Chemicals in vaping flavors cause widespread damage to lung tissue

New research appearing in the journal Scientific Reports unpacks the list of chemicals that comprise flavored e-liquids and pods used in vaping and details their harmful effects on lung tissue, including inflammation and genetic damage that could indicate long-term risk for respiratory disease and even cancer.

"While names like mango, cucumber, and mint give the impression that the flavors in e-juices are benign, the reality is that these sensations are derived from chemicals," said Irfan Rahman, Ph.D., a professor in the University of Rochester Medical Center's (URMC) Department of Environmental Medicine and lead author of the study. "These findings indicate that exposure to these chemicals triggers damage and dysfunction in the lungs that are a precursor to long-term health consequences."

Other than propylene glycol and vegetable glycerin, which form the base of vaping liquids, and nicotine, most manufacturers do not disclose the chemical compounds used to create the flavors in vaping products.

Employing mass spectrometry, the researchers identified almost 40 different chemicals present in various combinations in seven flavors manufactured by JUUL. These include hydrocarbons and volatile organic compounds, many of which have industrial uses and are known to be harmful if inhaled.

JUUL - which accounts for more than 70 percent of all vaping product sales in the U.S. - has recently halted sales of most of its flavored pods and several states, including New York, are in the process of banning these products. However, many other companies and independent vape shops continue to manufacture and sell an estimated 8,000 different flavored e-juices and pods.

In the study, researchers exposed human lung tissue - including bronchial epithelial cells, which play an important role in the exchange of gases, and monocytes, an infection-fighting cell in the immune system - to aerosolized vapor from the flavor pods. They observed that the chemicals provoked inflammation and degraded the integrity of the epithelial cells, a condition that could eventually lead to acute lung injury and respiratory illness. Exposure also damaged DNA in the cells, a potential precursor to cancer. The study showed that menthol flavor, which JUUL continues to sell, is equally as harmful as other flavors.

"Vaping technology has only existed for a short period of time and its use, particularly among younger people, has only recently exploded," said Rahman. "This study gives further evidence that vaping - while less harmful than combustible tobacco in the short run - is placing chronic users on the path to significant health problems later in life."

Rahman helps lead the WNY Center for Research on Flavored Tobacco Products (CRoFT), a partnership between researchers at URMC and Roswell Park Comprehensive Cancer Center in Buffalo to study the health effects of one of the fastest-growing trends in tobacco use. The Center is supported by a $19 million grant from the federal Food and Drug Administration's Tobacco Centers of Regulatory Science program.

Credit: 
University of Rochester Medical Center

Can good sleep patterns offset genetic susceptibility to heart disease and stroke?

image: Dr. Lu Qi is director of the Tulane University Obesity Research Center at Tulane School of Public Health and Tropical Medicine.

Image: 
Paula Burch-Celentano

Getting a good night's sleep could be beneficial for long-term health. A pioneering new study led by Dr. Lu Qi, director of the Tulane University Obesity Research Center, found that even if people had a high genetic risk of heart disease or stroke, healthy sleep patterns could help offset that risk. The study is published in the European Heart Journal.

The researchers looked at genetic variations known as SNPs (single nucleotide polymorphisms) that were already known to be linked to the development of heart disease and stroke. They analysed the SNPs from blood samples taken from more than 385,000 healthy participants in the UK Biobank project and used them to create a genetic risk score to determine whether the participants were at high, intermediate or low risk of cardiovascular problems.
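
A genetic risk score of this kind is usually a weighted sum of risk alleles, one term per SNP. Below is a minimal sketch with made-up weights and genotypes; the study's actual variants and effect sizes come from its UK Biobank analysis:

```python
# Polygenic-risk-score sketch: score = sum over SNPs of
# (effect-size weight x count of risk alleles carried, 0/1/2).
# SNP weights and genotypes below are hypothetical.
import numpy as np

weights = np.array([0.12, 0.08, 0.31])      # per-SNP effect sizes (e.g., log odds)
genotypes = np.array([
    [0, 1, 2],   # participant A: risk-allele counts at three SNPs
    [2, 2, 1],   # participant B
    [1, 0, 0],   # participant C
])

scores = genotypes @ weights                # one risk score per participant
tertiles = np.quantile(scores, [1/3, 2/3])  # cut points for three groups
risk_group = np.digitize(scores, tertiles)  # 0 = low, 1 = intermediate, 2 = high
print(scores, risk_group)
```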

The researchers followed the participants for an average of 8.5 years, during which time there were 7,280 cases of heart disease or stroke.

"We found that compared to those with an unhealthy sleep pattern, participants with good sleeping habits had a 35% reduced risk of cardiovascular disease and a 34% reduced risk of both heart disease and stroke," Qi says. Researchers say those with the healthiest sleep patterns slept 7 to 8 hours a night, without insomnia, snoring or daytime drowsiness.

When the researchers looked at the combined effect of sleep habits and genetic susceptibility on cardiovascular disease, they found that participants with both a high genetic risk and a poor sleep pattern had a more than 2.5-fold greater risk of heart disease and a 1.5-fold greater risk of stroke compared to those with a low genetic risk and a healthy sleep pattern. This meant that there were 11 more cases of heart disease and five more cases of stroke per 1000 people a year among poor sleepers with a high genetic risk compared to good sleepers with a low genetic risk. However, a healthy sleep pattern compensated slightly for a high genetic risk, with just over a two-fold increased risk for these people.

A person with a high genetic risk but a healthy sleep pattern had a 2.1-fold greater risk of heart disease and a 1.3-fold greater risk of stroke compared to someone with a low genetic risk and a good sleep pattern, while someone with a low genetic risk but an unhealthy sleep pattern had a 1.7-fold greater risk of heart disease and a 1.6-fold greater risk of stroke.
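
Those relative risks can be translated into rough absolute numbers. If good sleepers with low genetic risk develop heart disease at some baseline rate r per 1,000 people per year, the 2.5-fold figure puts poor sleepers with high genetic risk at about 2.5r, so the reported excess of 11 cases per 1,000 per year implies 1.5r ≈ 11, or a baseline of roughly 7 cases per 1,000 per year (a back-calculation on our part, not a figure reported in the paper).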

"As with other findings from observational studies, our results indicate an association, not a causal relation," Qi says. "However, these findings may motivate other investigations and, at least, suggest that it is essential to consider overall sleep behaviors when considering a person's risk of heart disease or stroke."

Credit: 
Tulane University

Research provides new design principle for water-splitting catalysts

image: A new study shows that hydrogen atoms are loosely bound and highly mobile on the surface of a platinum catalyst during the water splitting reaction. The findings explain why platinum is so good at catalyzing this particular reaction, and could aid in the design of catalysts made of materials that are cheaper and more plentiful than platinum.

Image: 
Peterson Lab / Brown University

PROVIDENCE, R.I. [Brown University] -- Scientists have long known that platinum is by far the best catalyst for splitting water molecules to produce hydrogen gas. A new study by Brown University researchers shows why platinum works so well -- and it's not the reason that's been assumed.

The research, published in ACS Catalysis, helps to resolve a nearly century-old research question, the authors say. And it could aid in designing new catalysts for producing hydrogen that are cheaper and more plentiful than platinum. That could ultimately help in reducing emissions from fossil fuels.

"If we can figure out how to make hydrogen cheaply and efficiently, it opens the door to a lot of pragmatic solutions for fossil-free fuels and chemicals," said Andrew Peterson, an associate professor in Brown's School of Engineering and the study's senior author. "Hydrogen can be used in fuel cells, combined with excess CO2 to make fuel or combined with nitrogen to make ammonia fertilizer. There's a lot we can do with hydrogen, but to make water splitting a scalable hydrogen source, we need a cheaper catalyst."

Designing new catalysts starts with understanding what makes platinum so special for this reaction, Peterson says, and that's what this new research aimed to figure out.

Platinum's success has long been attributed to its "Goldilocks" binding energy. Ideal catalysts hold on to reacting molecules neither too loosely nor too tightly, but somewhere in the middle. Bind the molecules too loosely and it's difficult to get a reaction started. Bind them too tightly and molecules stick to the catalyst's surface, making a reaction difficult to complete. The binding energy of hydrogen on platinum just happens to perfectly balance the two parts of the water-splitting reaction -- and so most scientists have believed it's that attribute that makes platinum so good.

But there were reasons to question whether that picture was correct, Peterson says. For example, a material called molybdenum disulfide (MoS2) has a binding energy similar to platinum, yet is a far worse catalyst for the water-splitting reaction. That suggests that binding energy can't be the full story, Peterson says.

To find out what was happening, he and his colleagues studied the water-splitting reaction on platinum catalysts using a special method they developed to simulate the behavior of individual atoms and electrons in electrochemical reactions.

The analysis showed that the hydrogen atoms that are bound to the surface of platinum at the "Goldilocks" binding energy don't actually participate in the reaction at all when the reaction rate is high. Instead, they nestle themselves within the surface crystalline layer of the platinum, where they remain inert bystanders. The hydrogen atoms that do participate in the reaction are far more weakly bound than the supposed "Goldilocks" energy. And rather than nestling in the lattice, they sit atop the platinum atoms, where they're free to meet up with each other to form H2 gas.

It's that freedom of movement for hydrogen atoms on the surface that makes platinum so reactive, the researchers conclude.

"What this tells us is that looking for this 'Goldilocks' binding energy isn't the right design principle for the high activity region," Peterson said. "We suggest that designing catalysts that put hydrogen in this highly mobile and reactive state is the way to go."

Credit: 
Brown University

New coating hides temperature change from infrared cameras

image: Taken with a long-wave infrared camera, this image of researchers in Mikhail Kats' lab shows distinct color variations across areas that are warmer (faces and bodies) and cooler (the table).

Image: 
Image courtesy of the Kats group

MADISON, Wis. -- An ultrathin coating developed by University of Wisconsin-Madison engineers upends a seemingly universal rule of thermal radiation: the hotter an object gets, the brighter it glows.

The new coating -- engineered from samarium nickel oxide, a unique tunable material -- employs a bit of temperature trickery.

"This is the first time temperature and thermal light emission have been decoupled in a solid object. We built a coating that 'breaks' the relationship between temperature and thermal radiation in a very particular way," says Mikhail Kats, a UW-Madison professor of electrical and computer engineering. "Essentially, there is a temperature range within which the power of the thermal radiation emitted by our coating stays the same."

Currently, that temperature range is fairly small, between approximately 105 and 135 degrees Celsius. With further development, however, Kats says the coating could have applications in heat transfer, camouflage and, as infrared cameras become widely available to consumers, even in clothing to protect people's personal privacy.

Kats, his group members, and their collaborators at UW-Madison, Purdue University, Harvard University, Massachusetts Institute of Technology and Brookhaven National Laboratory published details of the advance this week in the Proceedings of the National Academy of Sciences.

The coating itself emits a fixed amount of thermal radiation regardless of its temperature. That's because its emissivity -- the degree to which a given material emits light at a given temperature -- actually goes down as the temperature rises, canceling out the increase in the material's intrinsic thermal radiation, says Alireza Shahsafi, a doctoral student in Kats' lab and one of the lead authors of the study.
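
In simplified gray-body terms (our gloss, not the paper's full analysis), the power radiated per unit area is P = ε(T)·σ·T⁴, the Stefan-Boltzmann law. Holding P constant across a temperature window therefore requires the emissivity ε(T) to fall roughly as 1/T⁴ over that window, which is the steep, engineered decrease the samarium nickel oxide coating provides between about 105 and 135 degrees Celsius.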

"We can imagine a future where infrared imaging is much more common, negatively impacting personal privacy," Shahsafi says. "If we could cover the outside of clothing or even a vehicle with a coating of this type, an infrared camera would have a harder time distinguishing what is underneath. View it as an infrared privacy shield. The effect relies on changes in the optical properties of our coating due to a change in temperature. Thus, the thermal radiation of the surface is dramatically changed and can confuse an infrared camera."

In the lab, Shahsafi and fellow members of Kats' group demonstrated the coating's efficacy. They suspended two samples -- a coated piece of sapphire and a reference piece with no coating -- from a heater so that part of each sample was touching the heater and the rest was suspended in much cooler air. When they viewed each sample with an infrared camera, they saw a distinct temperature gradient on the reference sapphire, from deep blue to pink, red, orange and almost white, while the coated sapphire's thermal image remained largely uniform.

A team effort was critical to the project's success. Purdue collaborator Shriram Ramanathan's group synthesized the samarium nickel oxide and performed detailed materials characterization. Colleagues at MIT and at Brookhaven National Laboratory used the bright light of a particle-accelerating synchrotron to study the coating's atomic-level behavior.

Credit: 
University of Wisconsin-Madison

Stevia remains the most discussed low/zero-calorie sweetener

image: Stevia Remains the Most Discussed Low/Zero-Calorie Sweetener

Image: 
International Stevia Council

December 18, 2019 (Brussels, Belgium) - The International Stevia Council (ISC) recently unveiled data from its 2019 Online Conversation & Trends Analysis to identify and better understand the attitudes and perceptions around the sweetener stevia in English- and Spanish-speaking countries. The results: the online social conversation doubled.

The association worked with Kellen, a professional services firm, to conduct the ISC Conversation & Trends Analysis; Kellen's researchers used Crimson Hexagon, an AI-powered consumer insights company, to analyze data from 2017 to 2018.

"There has been a dramatic increase in the total amount of the online conversation talking about stevia. In fact, it has doubled in English- and in Spanish-speaking countries," explains the International Stevia Council's Executive Director, Maria Teresa Scardigli.

The Analysis revealed that, compared to other low- and no-calorie sweeteners, stevia remains the most talked-about. This was particularly notable in the language-specific data. In English-speaking countries, stevia mentions increased significantly, going from 101,697 to 258,669. In Spanish-speaking countries, the mentions also doubled, rising from 38,965 to 77,535.

Positive/neutral sentiment around stevia in English-speaking countries remained at nearly 80 percent, with positive sentiment increasing to nearly 40 percent, an increase of 5 percent.

Positive/neutral sentiment around stevia in Spanish-speaking countries remained at 82 percent, with positive sentiment increasing to 34 percent, an increase of 11 percent.

Since the last Analysis, covering 2013-2015, topics of conversation have shifted in both English- and Spanish-speaking countries: fewer mentions in relation to recipes and taste, and more in conjunction with weight-loss assistance and diabetes/blood sugar management. Many popular diet and fitness apps, including MyFitnessPal, Diabetes Daily, and Obesity Help, feature prominently in the English-speaking conversations about stevia.

Credit: 
Kellen Communications - NY

A new way to optimize sleep and light exposure can reduce jet lag and improve alertness

TROY, N.Y. -- Whether you're traveling for work or for fun, nothing ruins the start of a trip quite like jet lag. Engineers affiliated with the Lighting Enabled Systems & Applications (LESA) Center at Rensselaer Polytechnic Institute have developed a way to deliver personalized advice using smart wearable technology that would help travelers adjust more quickly.

In a series of articles, including one published today in PLOS ONE, the researchers explain how they have developed and demonstrated a series of algorithms that can analyze biometric information recorded by a smart device and then recommend the best combination of sleep and light to help a person readjust their circadian rhythm.

"Using these algorithms and a mathematical model of a person's circadian rhythm, we have the ability to compute the best light to adjust your circadian rhythm and foster your well-being. This opens the opportunity to create a smart and healthy environment," said Agung Julius, an associate professor of electrical, computer, and systems engineering at Rensselaer and one of the authors on this paper.

The same, he said, goes for determining the sleep a person needs -- both how much and when they should get it.

Circadian rhythms are master internal clocks that help regulate many of our physiological processes, including sleep, metabolism, hormone secretion, and even how our brain functions. Energy, alertness, and other biological processes can suffer when that rhythm doesn't align with the clock one is actually trying to follow.

The Department of Defense is funding this research because of the benefits the researchers' findings could bring to the alertness of service members.

"The circadian and sleep processes are also very tightly related to your mental state and how alert you are," Julius said. "If you try to do something in the wrong time of day, your alertness is not going to be as effective as if you do it in the right time of day as defined by your circadian clock."

Julius explained that a person's circadian rhythm variation is typically determined using information gathered from a blood or saliva test that measures levels of the hormone melatonin. The problem with that traditional approach is that obtaining the results takes time and doesn't allow for instant analysis.

The LESA team, which includes John Wen, head of the Department of Electrical, Computer, and Systems Engineering at Rensselaer and co-author on this paper, has been working on algorithms that process data -- like heart rate and body temperature -- that can be collected from wearable smart technology and converted into an estimate of a person's circadian rhythm variation.

"The question is whether that kind of data can give you as accurate an estimation as the clinical standard," Julius said.

What the team has found and demonstrated is that the estimates their algorithms generated are in line with clinical hormone measurement techniques. Julius said these findings are indicative that the team's approach works.

"This work is important, because it characterizes the fundamental processes the human body uses to synchronize circadian and sleep processes. By developing biosensing analytics to characterize circadian phase, it is now possible to optimize the efficient use of light with appropriate spectral properties to help optimize and maintain human health and performance," said Robert Karlicek, the director of the LESA Center. "This will be important to other work related to lighting and health in LESA's clinical research test beds at Thomas Jefferson University and the University of New Mexico."

Credit: 
Rensselaer Polytechnic Institute

Single-molecule detection of cancer markers brings liquid biopsy closer to clinic

image: Each dot seen in this PRAM image represents one microRNA that has bound to the sensor.

Image: 
Image courtesy of Nantao Li

CHAMPAIGN, Ill. -- A fast, inexpensive yet sensitive technique to detect cancer markers is bringing researchers closer to a "liquid biopsy" - a test using a small sample of blood or serum to detect cancer, rather than the invasive tissue sampling routinely used for diagnosis.

Researchers at the University of Illinois developed a method to capture and count cancer-associated microRNAs, or tiny bits of messenger molecules that are exuded from cells and can be detected in blood or serum, with single-molecule resolution. The team published its results in the Proceedings of the National Academy of Sciences.

"Cancer cells contain gene mutations that enable them to proliferate out of control and to evade the immune system, and some of those mutations turn up in microRNAs," said study leader Brian Cunningham, an Illinois professor of electrical and computer engineering. Cunningham also directs the Holonyak Micro and Nanotechnology Lab at Illinois.

"There are specific microRNA molecules whose presence and concentration is known to be related to the presence and aggressiveness of specific types of cancer, so they are known as biomarkers that can be the target molecule for a diagnostic test," he said.

Cunningham's group developed a technique named Photonic Resonator Absorption Microscopy (PRAM) to capture and count microRNA biomarkers. In collaboration with professor Manish Kohli at the Moffitt Cancer Center in Florida, they tested PRAM on two microRNAs that are known markers for prostate cancer.

They found it was sensitive enough to detect small amounts that would be present in a patient's serum, yet also selective enough to detect the marker among a cocktail of molecules that also would be present in serum.

"One of the main challenges of biosensing is to maintain sensitivity and selectivity at the same time," said Nantao Li, a graduate student and co-first author. "You want it to be sensitive enough to detect very small amounts, but you don't want it to pick up every RNA in the blood. You want this specific sequence to be your target."

PRAM achieves both qualities by combining a molecular probe and a photonic crystal sensor. The probe very specifically pairs to a designated microRNA and has a protective cap that comes off when it finds and binds to the target biomarker. The exposed end of the probe can then bind to the sensor, producing a signal visible through a microscope.

Each individual probe that binds sends a separate signal that the researchers can count. This means researchers are able to detect much smaller amounts than traditional methods like fluorescence, which need to exceed a certain threshold to emit a measurable signal. Being able to count each biomarker also carries the added benefit of allowing researchers to monitor changes in the concentration of the biomarker over time.
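
The digital-counting idea can be sketched with basic image processing: threshold the image and count connected bright spots, each spot taken as one bound probe. This is a generic illustration on simulated data, not the PRAM analysis code:

```python
# Single-molecule "digital counting" sketch: each connected bright
# region above threshold counts as one binding event. Simulated image.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(1)
image = rng.normal(0.0, 0.05, size=(64, 64))  # background noise
for _ in range(12):                           # paint 12 fake "molecules"
    r, c = rng.integers(2, 62, size=2)
    image[r - 1:r + 2, c - 1:c + 2] += 1.0    # 3x3 bright spot

mask = image > 0.5                            # threshold well above noise
labeled, n_spots = ndimage.label(mask)        # count connected components
print(f"counted {n_spots} spots")             # ~12 unless spots overlap
```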

"With PRAM, we squirt a sample into a solution and get a readout within two hours," said postdoctoral researcher Taylor Canady, a co-first author of the study. "Other technologies that produce single-molecule readouts require extra processing and additional steps, and they require a day or more of waiting. PRAM seems like something that could be much more feasible clinically. In addition, by using an optical signal instead of fluorescence, we could one day build a miniaturized device that doesn't need a trained laboratory technician."

The PRAM approach could be adapted to different microRNAs or other biomarkers, the researchers say, and is compatible with existing microscope platforms.

"This approach makes the idea of performing a 'liquid biopsy' for low-concentration cancer-related molecules a step closer to reality," Cunningham said. "This advance demonstrates that it is possible to have an inexpensive and routine method that is sensitive enough to require only a droplet of blood. The results of the test might tell a physician whether a regimen of chemotherapy is working, whether a person's cancer is developing a new mutation that would make it resistant to a drug, or whether a person who had been previously treated for cancer might be having a remission."

Credit: 
University of Illinois at Urbana-Champaign, News Bureau