Tech

A computer that understands how you feel

image: Kragel is combining machine learning with brain imaging to learn more about how images impact emotions.

Image: 
Glenn Asakawa/CU Boulder

Could a computer, at a glance, tell the difference between a joyful image and a depressing one?

Could it distinguish, in a few milliseconds, a romantic comedy from a horror film?

Yes, and so can your brain, according to research published this week by University of Colorado Boulder neuroscientists.

"Machine learning technology is getting really good at recognizing the content of images - of deciphering what kind of object it is," said senior author Tor Wager, who worked on the study while a professor of psychology and neuroscience at CU Boulder. "We wanted to ask: Could it do the same with emotions? The answer is yes."

Part machine-learning innovation, part human brain-imaging study, the paper, published Wednesday in the journal Science Advances, marks an important step forward in the application of "neural networks" - computer systems modeled after the human brain - to the study of emotion.

It also sheds new light on how and where images are represented in the human brain, suggesting that what we see - even briefly - could have a greater and swifter impact on our emotions than we might assume.

"A lot of people assume that humans evaluate their environment in a certain way and emotions follow from specific, ancestrally older brain systems like the limbic system," said lead author Philip Kragel, a postdoctoral research associate at the Institute of Cognitive Science. "We found that the visual cortex itself also plays an important role in the processing and perception of emotion."

THE BIRTH OF EMONET

For the study, Kragel started with an existing neural network, called AlexNet, which enables computers to recognize objects. Using prior research that identified stereotypical emotional responses to images, he retooled the network to predict how a person would feel when they see a certain image.

He then "showed" the new network, dubbed EmoNet, 25,000 images ranging from erotic photos to nature scenes and asked it to categorize them into 20 categories such as craving, sexual desire, horror, awe and surprise.

EmoNet could accurately and consistently categorize 11 of the emotion types. But it was better at recognizing some than others. For instance, it identified photos that evoke craving or sexual desire with more than 95 percent accuracy. But it had a harder time with more nuanced emotions like confusion, awe and surprise.

Even a simple color elicited a prediction of an emotion: When EmoNet saw a black screen, it registered anxiety. Red conjured craving. Puppies evoked amusement. If there were two of them, it picked romance. EmoNet was also able to reliably rate the intensity of images, identifying not only the emotion an image might elicit but how strong it might be.

When the researchers showed EmoNet brief movie clips and asked it to categorize them as romantic comedies, action films or horror movies, it got it right three-quarters of the time.

WHAT YOU SEE IS HOW YOU FEEL

To further test and refine EmoNet, the researchers then brought in 18 human subjects.

As a functional magnetic resonance imaging (fMRI) machine measured their brain activity, they were shown 4-second flashes of 112 images. EmoNet saw the same pictures, essentially serving as the 19th subject.

When activity in the neural network was compared to that in the subjects' brains, the patterns matched up.

"We found a correspondence between patterns of brain activity in the occipital lobe and units in EmoNet that code for specific emotions. This means that EmoNet learned to represent emotions in a way that is biologically plausible, even though we did not explicitly train it to do so," said Kragel.

The brain imaging itself also yielded some surprising findings. Even a brief, basic image - an object or a face - could ignite emotion-related activity in the visual cortex of the brain. And different kinds of emotions lit up different regions.

"This shows that emotions are not just add-ons that happen later in different areas of the brain," said Wager, now a professor at Dartmouth College. "Our brains are recognizing them, categorizing them and responding to them very early on."

Ultimately, the researchers say, neural networks like EmoNet could be used in technologies to help people digitally screen out negative images or find positive ones. The approach could also be applied to improve computer-human interactions and help advance emotion research.

The takeaway for now, says Kragel:

"What you see and what your surroundings are can make a big difference in your emotional life."

Credit: 
University of Colorado at Boulder

NASA finds two areas of strength in Tropical Storm Nari

image: On July 26 at 8:20 a.m. EDT (1220 UTC), the MODIS instrument that flies aboard NASA's Terra satellite showed two areas of strongest storms (yellow) in Tropical Storm Nari north and south of center. Cloud top temperatures in those areas were as cold as minus 50 degrees Fahrenheit (minus 45.5 Celsius).

Image: 
NASA/NRL

NASA's Terra satellite found two small areas of strength in Tropical Storm Nari on July 26 as it began to affect Japan.

NASA's Terra satellite uses infrared light to analyze the strength of storms by providing temperature information about the system's clouds. The strongest thunderstorms that reach high into the atmosphere have the coldest cloud top temperatures.

On July 26 at 8:20 a.m. EDT (1220 UTC), the Moderate Imaging Spectroradiometer or MODIS instrument that flies aboard NASA's Terra satellite gathered infrared data on Nari, formerly known as Tropical Storm 07W. There were two areas of strongest storms in Tropical Storm Nari, and they were north and south of the center of circulation. In those areas, thunderstorms had cloud top temperatures as cold as minus 50 degrees Fahrenheit (minus 45.5 Celsius). That northernmost area of strong storms was located over the Kyoto, Osaka and Wakayama Prefectures of Japan.

At 5 a.m. EDT (0900 UTC), the center of Tropical Storm Nari was located near latitude 30.9 degrees north and longitude 136.3 degrees east. That's about 314 nautical miles southwest of Yokosuka, Japan. The tropical storm was moving toward the north-northwest. Maximum sustained winds were near 40 mph (35 knots/64 kph).

The Joint Typhoon Warning Center (JTWC) forecast for Nari brings the storm northward, with a turn to the east in 12 hours. JTWC said, "The system is expected to maintain intensity prior to landfall in Honshu. The system is expected to dissipate by 48 hours due to passage over land and cooler water to the east of Honshu."

Credit: 
NASA/Goddard Space Flight Center

For salmonella detection, genomic tool emerges as a key

ITHACA, N.Y. - The world's food supply will become safer as the food industry shifts to high-resolution, whole-genome sequencing - which examines the full DNA of a given organism all at once. This move to make sequencing ubiquitous will lead to the consistently reliable detection of salmonella.

A paper published in the journal Frontiers in Microbiology - co-authored by researchers from Cornell University and the Mars Global Food Safety Center (GFSC), Beijing - illuminates these breakthroughs.

"Salmonella is the foodborne pathogen with the biggest public health and economic impact globally. It's one of the major causes of diarrhea all around the world," said Martin Wiedmann, food safety professor and Cornell Institute for Food Systems faculty fellow. "Salmonella can be mild or it can cause death, as its severity depends on salmonella's serotypes [distinct variations] - and that's what we're trying to find out."

The paper describes how the food industry around the world should use molecular methods more often for subtyping and characterizing salmonella. The paper compared older subtyping practices - some practices going back to the 1930s - to the newer whole-genome sequencing, a method that can analyze a wider, more complete swath of the genome.

With this technique, scientists can more precisely identify a particular strain of salmonella and determine the origin and the path of the disease's outbreak, Wiedmann said.

For example, in early July the U.S. Food and Drug Administration and the federal Centers for Disease Control and Prevention (CDC) used the technique, as they began investigating a suspected salmonella link between pig-ear dog treats and humans - due to people handling the treats. At the time, there were 45 cases of salmonella in 13 states, with 12 people hospitalized, according to the CDC.
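In practice, outbreak tracing with whole-genome sequencing often boils down to counting differences between isolates: isolates whose core genomes differ by only a handful of single-nucleotide polymorphisms (SNPs) likely share a recent common source. The toy sketch below counts pairwise SNP differences between aligned sequences and flags close matches; the sequences, the threshold and the names are illustrative assumptions, not data or methods from the study.

```python
# Toy illustration of SNP-distance clustering of salmonella isolates
# (sequences and threshold are hypothetical; real schemes use full genomes).
from itertools import combinations

def snp_distance(seq_a: str, seq_b: str) -> int:
    """Count positions where two aligned genome sequences differ."""
    assert len(seq_a) == len(seq_b), "sequences must be aligned to equal length"
    return sum(1 for a, b in zip(seq_a, seq_b) if a != b)

# Hypothetical aligned core-genome fragments from four isolates.
isolates = {
    "patient_1": "ACGTACGTACGTACGT",
    "patient_2": "ACGTACGTACGAACGT",  # 1 SNP from patient_1
    "pet_treat": "ACGTACGTACGAACGT",  # identical to patient_2
    "unrelated": "ACGTTCGTGCGAACTT",  # several SNPs away
}

CLUSTER_THRESHOLD = 2  # toy cutoff; real cutoffs are organism- and scheme-specific

for (name_a, seq_a), (name_b, seq_b) in combinations(isolates.items(), 2):
    d = snp_distance(seq_a, seq_b)
    verdict = "likely related" if d <= CLUSTER_THRESHOLD else "probably unrelated"
    print(f"{name_a} vs {name_b}: {d} SNPs -> {verdict}")
```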

"Whole-genome sequencing is rapidly becoming the method of choice for salmonella subtyping," said microbiologist Silin Tang, the paper's lead author and senior research scientist at the Mars Global Food Safety Center. "Foods often reach the consumer through increasingly complex supply chains, creating many opportunities where food safety could be compromised. Recent cases highlight the need to reinforce salmonella control measures in the food industry, including rapid and accurate tracking of contamination sources with appropriate subtyping tools."

Tang said there is an opportunity for the food industry to create a talent pipeline of expertise in bioinformatics - the use of software tools to understand biological data - to harness the full potential of whole-genome sequencing technologies.

Further, Wiedmann said, this paper represents the translation of academic analysis into a useful industry application. "This will help industry implement better ways to proactively address the complications and complexity of salmonella," he said.

Credit: 
Cornell University

Two therapeutic targets identified for deadly lung cancer

video: Salk scientists discover a pair of enzymes that drive non-small-cell lung cancer by promoting inflammation.

Image: 
Salk Institute

LA JOLLA--(July 26, 2019) The vast majority of deadly lung cancer cases (85 percent) are termed non-small-cell lung carcinomas (NSCLCs), which often contain a mutated gene called LKB1. Salk Institute researchers have now discovered precisely why inactive LKB1 results in cancer development. The surprising results, published in the online version of Cancer Discovery on July 26, 2019, highlight how LKB1 communicates with two enzymes that suppress inflammation, in addition to cell growth, to block tumor growth. The findings could lead to new therapies for NSCLC.

"For the first time, we've found specific direct targets for LKB1 that prevent lung cancer and discovered--very unexpectedly--that inflammation plays a role in this tumor growth," says Professor Reuben Shaw, director of the Salk Cancer Center and senior author of the paper. "With this knowledge we can hopefully develop new treatments for this large fraction of lung cancer patients."

When functioning normally, LKB1 acts as a tumor suppressor, actively preventing cancer from forming in the first place. Scientists knew that LKB1 works like the captain of a relay team, passing cellular signals, like a baton, to enzymes called kinases, which then pass the signal to other enzymes in a chain reaction. LKB1 captains a team of 14 different kinase teammates. But which of these kinases is specifically responsible for carrying out LKB1's tumor-suppressive function had been unclear for the more than 15 years since LKB1 was first identified as a major gene disrupted in lung cancer. In 2018, the Shaw lab solved the first step of this molecular whodunnit by showing that two of the 14 teammates (the main enzymes known to control metabolism and growth) were surprisingly not as important to LKB1's ability to block lung cancer as most scientists had assumed. That left 12 kinase teammates as potentially important, but almost nothing was known about them.

"This is was like a cancer detective case. We suspected that one of these 12 kinases was likely the key to the tumor suppressing effects of LKB1, but we were not sure which one," says Pablo Hollstein, first author on the paper and a postdoctoral fellow at Salk.

To figure it out, the team used CRISPR technology combined with genetic analysis to inactivate each suspected kinase one at a time and then in combinations. They observed how the inactivations affected tumor growth and development in both cell cultures of NSCLC cells and in a genetic NSCLC mouse model. The experiments pointed the researchers to two kinases: one called SIK1 had the strongest effect in stopping tumors from forming. When SIK1 was inactivated, tumor growth increased; and when a related kinase, SIK3, was also inactivated, the tumor grew even more aggressively.

"Discovering that of the 14 kinases it was SIK1 and SIK3 that were the most critical players is like discovering that the relatively unknown backup quarterback who almost never plays is actually one of the most important quarterbacks in the history of the sport," says Shaw.

LKB1 is also known to play a role in suppressing inflammation in cells generally, so the researchers were intrigued to discover that SIK1 and SIK3 were specifically inhibiting the cellular inflammation response in lung cancer cells. Thus, when LKB1 or SIK1 and SIK3 become mutated in tumors, inflammation is increased, driving tumor growth.

In a related vein, Salk Professor Marc Montminy recently published a paper along with Shaw, identifying metabolic switches to which SIK1 and SIK3 "pass the baton," revealing three steps of the relay started by LKB1.

"By attacking the problem of lung cancer from different angles, we have now defined a single direct route that underpins how the disease develops in many patients," says Shaw, who holds the William R. Brody Chair. "We have been working on this project since I started my lab in 2006, so it is incredibly rewarding and astonishing to find that inflammation is a driving force in tumor formation in this very clearly defined set of lung cancers. This discovery highlights the nature of scientific research and how important it is to commit to pursuing difficult, complicated problems, even if it takes over 10 years to get an answer."

Next, the researchers plan to further investigate how these kinase-driven switches in inflammation trigger lung tumor growth in NSCLC.

Credit: 
Salk Institute

Study: Sizzling Southwest summers can cause pavement burns in seconds

When temperatures throughout the sizzling Southwestern U.S. climb to over 100 degrees, the pavement can get hot enough to cause second-degree burns on human skin in a matter of seconds.

In a new study published in the Journal of Burn Care & Research, a team of surgeons from the UNLV School of Medicine reviewed all pavement burn admissions into a Las Vegas area burn center over five years. The team compared the outdoor temperatures at the time of each patient admission to, in essence, determine how hot is too hot.

"Pavement burns account for a significant number of burn-related injuries, particularly in the Southwestern United States," the study authors wrote. "The pavement can be significantly hotter than the ambient temperature in direct sunlight and can cause second-degree burns within two seconds."

For the study, researchers identified 173 pavement-related burn cases from 2013 to 2017. Of those, 149 cases were isolated pavement burns and 24 involved other injuries, including those from motor vehicle accidents. More than 88 percent (153) of the incidents occurred when temperatures were 95 degrees or higher, with the risk increasing exponentially as temperatures exceeded 105 degrees.

That's because pavement in direct sunlight absorbs radiant energy, making it significantly hotter and potentially dangerous. Study authors say that pavement on a 111-degree day, for example, can get as hot as 147 degrees in direct sunlight. For reference, a fried egg becomes firm at 158 degrees.

And while it seems like a no-brainer to stay off a hot sidewalk, for some it's unavoidable - including victims of motor vehicle accidents, people with mobility issues or medical episodes who have fallen to the ground, or small children who may not know better.

The takeaway - summer in the desert is no joke, and more education is needed to warn people of the risks of hot pavement, particularly as temperatures creep above 100 degrees.

"This information is useful for burn centers in hotter climates, to plan and prepare for the coordination of care and treatment," says study lead author Dr. Jorge Vega. "It can also be used for burn injury prevention and public health awareness, including increased awareness and additional training to emergency medical service and police personnel when attending to pavement burn victims in the field."

The study, "A 5-Year Review of Pavement Burns from a Desert Burn Center," was published in the July/August 2019 issue of the Journal of Burn Care & Research. [WARNING: study contains graphic imagery]

Credit: 
University of Nevada, Las Vegas

Stanford physicists discover new quantum trick for graphene: Magnetism

Sometimes the best discoveries happen when scientists least expect it. While trying to replicate another team's finding, Stanford physicists recently stumbled upon a novel form of magnetism, predicted but never seen before, that is generated when two honeycomb-shaped lattices of carbon are carefully stacked and rotated to a special angle.

The authors suggest the magnetism, called orbital ferromagnetism, could prove useful for certain applications, such as quantum computing. The group describes their finding in the July 25 issue of the journal Science.

"We were not aiming for magnetism. We found what may be the most exciting thing in my career to date through partially targeted and partially accidental exploration," said study leader David Goldhaber-Gordon, a professor of physics at Stanford's School of Humanities and Sciences. "Our discovery shows that the most interesting things turn out to be surprises sometimes."

The Stanford researchers inadvertently made their discovery while trying to reproduce a finding that was sending shockwaves through the physics community. In early 2018, Pablo Jarillo-Herrero's group at MIT announced that they had coaxed a stack of two subtly misaligned sheets of carbon atoms - twisted bilayer graphene - to conduct electricity without resistance, a property known as superconductivity.

The discovery was a stunning confirmation of a nearly decade-old prediction that graphene sheets rotated to a very particular angle should exhibit interesting phenomena.

When stacked and twisted, graphene forms a superlattice with a repeating interference, or moiré, pattern. "It's like when you play two musical tones that are slightly different frequencies," Goldhaber-Gordon said. "You'll get a beat between the two that's related to the difference between their frequencies. That's similar to what you get if you stack two lattices atop each other and twist them so they're not perfectly aligned."

Physicists theorized that the particular superlattice formed when graphene is rotated to 1.1 degrees causes the normally varied energy states of electrons in the material to collapse, creating what they call a flat band where the speed at which electrons move drops to nearly zero. Thus slowed, the motions of any one electron become highly dependent on those of others in its vicinity. These interactions lie at the heart of many exotic quantum states of matter.
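The geometry behind that "beat" is easy to quantify: two identical lattices twisted by a small angle produce a moiré pattern whose period is roughly the lattice constant divided by the twist angle in radians. The short calculation below, using graphene's lattice constant of about 0.246 nanometers, is a back-of-the-envelope illustration rather than anything taken from the paper.

```python
# Back-of-the-envelope moiré superlattice period for twisted bilayer graphene.
import math

LATTICE_CONSTANT_NM = 0.246  # graphene lattice constant in nanometers

def moire_period(theta_degrees: float) -> float:
    """Moiré period (nm) of two identical lattices twisted by a small angle."""
    theta = math.radians(theta_degrees)
    return LATTICE_CONSTANT_NM / (2 * math.sin(theta / 2))

for angle in (1.1, 1.2):
    print(f"twist {angle:.1f} deg -> moiré period ~{moire_period(angle):.1f} nm")
# twist 1.1 deg -> ~12.8 nm; twist 1.2 deg -> ~11.7 nm
```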

"I thought the discovery of superconductivity in this system was amazing. It was more than anyone had a right to expect," Goldhaber-Gordon said. "But I also felt that there was a lot more to explore and many more questions to answer, so we set out to try to reproduce the work and then see how we could build upon it."

A series of fortunate events

While attempting to duplicate the MIT team's results, Goldhaber-Gordon and his group introduced two seemingly unimportant changes.

First, while encapsulating the honeycomb-shaped carbon lattices in thin layers of hexagonal boron nitride, the researchers inadvertently rotated one of the protective layers into near alignment with the twisted bilayer graphene.

"It turns out that if you nearly align the boron nitride lattice with the lattice of the graphene, you dramatically change the electrical properties of the twisted bilayer graphene," said study co-first author Aaron Sharpe, a graduate student in Goldhaber-Gordon's lab.

Secondly, the group intentionally overshot the angle of rotation between the two graphene sheets. Instead of 1.1 degrees, they aimed for 1.17 degrees because others had recently shown that twisted graphene sheets tend to settle into smaller angles during the manufacturing process.

"We figured if we aim for 1.17 degrees, then it will go back toward 1.1 degrees, and we'll be happy," Goldhaber-Gordon said. "Instead, we got 1.2 degrees."

An anomalous signal

The consequences of these small changes didn't become apparent until the Stanford researchers began testing the properties of their twisted graphene sample. In particular, they wanted to study how its magnetic properties changed as its flat band - that collection of states where electrons slow to nearly zero - was filled or emptied of electrons.

While pumping electrons into a sample that had been cooled close to absolute zero, Sharpe detected a large electrical voltage perpendicular to the flow of the current when the flat band was three-quarters full. Known as a Hall voltage, such a voltage typically only appears in the presence of an external magnetic field - but in this case, the voltage persisted even after the external magnetic field had been switched off.

This anomalous Hall effect could only be explained if the graphene sample was generating its own internal magnetic field. Furthermore, this magnetic field couldn't be the result of aligning the up or down spin state of electrons, as is typically the case for magnetic materials, but instead must have arisen from their coordinated orbital motions.

"To our knowledge, this is the first known example of orbital ferromagnetism in a material," Goldhaber-Gordon said. "If the magnetism were due to spin polarization, you wouldn't expect to see a Hall effect. We not only see a Hall effect, but a huge Hall effect."

Strength in weakness

The researchers estimate that the magnetic field near the surface of their twisted graphene sample is about a million times weaker than that of a conventional refrigerator magnet, but this weakness could be a strength in certain scenarios, such as building memory for quantum computers.

"Our magnetic bilayer graphene can be switched on with very low power and can be read electronically very easily," Goldhaber-Gordon said. "The fact that there's not a large magnetic field extending outward from the material means you can pack magnetic bits very close together without worrying about interference."

Credit: 
Stanford University - School of Humanities and Sciences

Tobacco industry has bumped up prices beyond that required by tax changes

The tobacco industry has bumped up the prices for its products beyond that required by tax changes, even when tax rises were large and unexpected, reveal the findings of research published online in the journal Tobacco Control.

'Roll your own' tobacco had the highest industry driven price rises, despite higher levels of illicit trade for these products.

This refutes industry's stated concerns that price rises fuel the illicit tobacco trade - an argument they have used to lobby against tax hikes, say the researchers.

Before 2010 the tobacco industry regularly bumped up cigarette prices over and above the level required by tax rises, accounting for almost 50% of the total price increase in the UK.

Since then manufactured cigarette sales have fallen by 17%, while sales of cheaper 'roll your own' have increased by 46%.

But the government has been reluctant to increase taxes further on 'roll your own' tobacco for fear of pushing smokers towards the illegal market, which is already larger than that for manufactured cigarettes.

To explore the extent to which price rises since 2010 have been due to tax increases or industry strategies to boost profits, the researchers analysed UK data on inflation, tax rates, and sales of 'roll your own' tobacco and manufactured cigarettes between 2010 and 2015.

Between 2010 and 2012 there were large and unexpected tax increases, and industry-driven price changes were small, accounting for 16% and 20%, respectively, of the price increase for manufactured and 'roll your own' tobacco. Changes were similar across pack sizes and quality.

But in 2013-2015, when tax increases were smaller and planned, almost a third (33%) of the price increase for manufactured cigarettes was industry-driven, rising to nearly half (48%) of the price hike for 'roll your own' tobacco.

"This implies that the industry does not believe [its] own argument that higher taxes/prices encourage illicit tobacco purchasing. This is further supported by a higher proportion of the total price increase being attributable to industry revenue increases for 'roll your own', despite the illicit market share for 'roll your own' being substantially higher," they write.

Much larger absolute tax and price increases were applied to premium brands for both manufactured cigarettes and 'roll your own' tobacco.

But industry extended this price differential further by raising the consumer price considerably above the cost of the tax increase, known as 'overshifting,' on more expensive products, and by absorbing the tax hike, known as 'undershifting,' on cheaper products, resulting in a more than threefold difference in price increase.
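The decomposition behind these figures is simple arithmetic: whatever part of a retail price rise is not explained by the tax rise is attributed to industry revenue. The snippet below shows the calculation with made-up numbers; the prices, the single-tax simplification, and the omission of inflation and VAT adjustments are illustrative assumptions, not values from the study.

```python
# Illustrative decomposition of a cigarette price rise into tax-driven and
# industry-driven components (all figures are hypothetical).

def decompose_price_rise(old_price, new_price, old_tax, new_tax):
    """Split a retail price increase into the tax increase and the remainder
    kept by industry ("overshifting" if positive, "undershifting" if negative)."""
    price_rise = new_price - old_price
    tax_rise = new_tax - old_tax
    industry_rise = price_rise - tax_rise
    return price_rise, tax_rise, industry_rise

# Hypothetical pack of 20 manufactured cigarettes, prices in GBP.
price_rise, tax_rise, industry_rise = decompose_price_rise(
    old_price=7.50, new_price=8.10, old_tax=5.80, new_tax=6.20
)
print(f"total price rise:  {price_rise:.2f}")
print(f"  due to tax:      {tax_rise:.2f} ({tax_rise / price_rise:.0%})")
print(f"  industry-driven: {industry_rise:.2f} ({industry_rise / price_rise:.0%})")
```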

"The ability of the industry to do this is bad for public health as it means smokers are not faced with a quit-inducing sudden jump in retail prices," note the researchers.

The findings also suggest that there is still scope to further increase taxes, even in a high tax and high price environment like the UK.

"If the tobacco industry is still able to increase its revenue (and hence profits) per pack, then the government should be able to further increase taxes to deal with the harms from tobacco," the researchers write.

Such increases should be sudden and unexpected for maximum impact, they advise.

"Unexpected and large tax increases compromise the tobacco's industry ability to manipulate prices, and hence should become a key feature of tobacco taxation."

Credit: 
BMJ Group

The Lancet Diabetes & Endocrinology: Testosterone may significantly improve sexual function and sexual wellbeing in postmenopausal women

However, authors note that non-oral formulations are preferred because of the adverse lipoprotein effects of oral testosterone. So far, adverse side effects of non-oral formulations appear to be restricted to small weight gain, mild acne and increased hair growth, but more research on long-term effects is needed.

The most comprehensive systematic review and meta-analysis of testosterone treatment for women undertaken to date, including 46 reports on 36 trials involving 8,480 women and published in The Lancet Diabetes & Endocrinology journal, suggests that testosterone can significantly improve sexual wellbeing for postmenopausal women. Benefits include improved sexual desire, function and pleasure, together with reduced concerns and distress about sex.

Although best known as a male hormone, testosterone is important for female sexual health, contributing to libido and orgasm as well as helping to maintain normal metabolic function, muscle strength, cognitive function and mood. Levels decline naturally over a woman's lifespan, and can also drop sharply following surgical menopause. Prior research has suggested that testosterone therapy can improve sexual function in women, but the available formulations have been designed for men and evidence for their safety or for adverse side-effects in women is scant.

"Our results suggest it is time to develop testosterone treatment tailored to postmenopausal women rather than treating them with higher concentrations formulated for men," says senior author Professor Susan Davis from Monash University, Australia. "Nearly a third of women experience low sexual desire at midlife, with associated distress, but no approved testosterone formulation or product exists for them in any country and there are no internationally-agreed guidelines for testosterone use by women. Considering the benefits we found for women's sex lives and personal wellbeing, new guidelines and new formulations are urgently needed." [1]

In this study, scientists reviewed 46 reports about 36 randomised controlled trials, conducted between January 1990 and December 2018 and involving 8,480 participants aged 18 to 75 years, approximately 95% of whom were post-menopausal. The trials compared testosterone treatment to a placebo or to an alternative hormone treatment such as oestrogen, with or without progestogen.

The authors reviewed the effects of treatments on sexual function and on measures of heart, cognitive and musculoskeletal health. The authors also looked for other serious side effects such as increased risk of heart disease or breast cancer, as well as the impact on mood and wellbeing, other measures of breast health such as mammographic density, metabolic effects, lipid profiles, and the development of androgenic effects such as increased hair growth.

As there were few studies available for premenopausal women (three studies in 226 women), the authors noted that no conclusions could be drawn about the efficacy of testosterone treatment for sexual dysfunction in this group, and larger studies are needed to inform clinical recommendations.

In 15 studies involving 3,766 naturally and surgically postmenopausal women, consistent beneficial effects were seen for all measures of sexual function. Testosterone treatment resulted in an increase in the frequency of satisfactory sexual events. Treatment significantly increased sexual desire, pleasure, arousal, orgasm, responsiveness to sexual stimuli and self-image. Women treated with testosterone also showed reduced measures of sexual concerns and sexually-associated distress.

"The beneficial effects for postmenopausal women shown in our study extend beyond simply increasing the number of times a month they have sex," says Professor Davis. "Some women who have regular sexual encounters report dissatisfaction with their sexual function, so increasing their frequency of a positive sexual experience from never, or occasionally, to once or twice a month can improve self-image and reduce sexual concerns, and may improve overall wellbeing." [1]

The study found no beneficial effects on cognitive measures, bone mineral density, body composition or muscle strength. No benefits were seen for depressive mood, irrespective of menopausal status, or for psychological wellbeing. However, the number of women included in these studies was small, and further research is needed.

No serious adverse effects on postmenopausal women were recorded for glucose or insulin in the blood, for blood pressure, or for measures of breast health. However, only limited data were available for breast cancer risk and further research is needed to clarify the effects. With non-oral testosterone, the authors found no effects on lipid profiles or metabolic variables such as cholesterol (10 studies involving 1774 women). However, oral formulations of testosterone increased LDL cholesterol, and reduced HDL cholesterol, overall cholesterol and triglycerides (a type of fat associated with an increased risk of heart disease). Postmenopausal women treated with testosterone were also not more likely to experience a serious cardiovascular event such as a heart attack or stroke (9 clinical trials with 4,063 women).

Although an increase in acne was shown in 11 studies involving 3,264 women, and an increase in hair growth was shown in 11 studies involving 4,178 women, the number of participants who withdrew from clinical trials due to these side-effects was not greater for women treated with testosterone compared with placebo, suggesting the effects are mild and not a major concern for participants. Five studies involving 2,032 participants indicated that testosterone treatment was associated with some weight gain. The authors recommend that patients are advised of these effects so they can make an individual choice whether to go ahead with testosterone treatment.

In a linked Comment, Dr Rossella Nappi from the University of Pavia, Italy, writes about the potential benefits of testosterone highlighted in the study: "Notwithstanding these findings, we must gain insight into the therapeutic role of testosterone for women by designing adequate long-term studies to address benefits and risk in specific clinical conditions relevant to healthy female longevity. In particular, there is an urgent need in the area of sexual medicine to ensure gender equality in treating effectively those women with female sexual dysfunction clearly related to hypoandrogenic states. However, products specifically approved in women should become available to achieve this goal; at present, only male formulations are available, with clinicians adjusting the dose to the female circulating testosterone range."

Credit: 
The Lancet

Science snapshots: Chromosomes, crystals, and drones

image: This is an artistic representation of centromeres.

Image: 
Sasha Langley and Charles Langley

Exploring Human Origins in the Uncharted Territory of Our Chromosomes

A group of geneticists from Berkeley Lab, UC Davis, UC Santa Cruz, and UC Berkeley is unraveling new details about human evolution by studying the uniquely regulated portions of our chromosomes that surround the centromeres.

These stretches of DNA - termed centromere-proximal regions (CPRs) - are largely composed of highly repetitive, mostly non-gene-coding sequences that are protected from sequence shuffling during reproductive cell division.

Hypothesizing that these protected genomic regions might generate large haplotypes (groups of neighboring genes and sequences that are inherited as a single unit from generation to generation), Berkeley Lab researcher Sasha Langley and her co-authors used a database of diverse modern human genomes to investigate variation in CPRs.

Their analysis, published in eLife, revealed that centromere haplotypes - or "cenhaps" - are indeed present. Like other parts of the genome, cenhaps harbor genetic material, including functional genes, introduced when our ancestors hybridized with other hominin species; yet these sequences are surprisingly massive compared with other archaic genetic remnants.

"Interestingly, one of the Neanderthal cenhaps contains a lot of unique variation in genes that shape our sense of smell," said Langley. "And in some individuals, we found evidence of an even more ancient cenhap that appears to be derived from a previously unknown early hominin relative."

The authors conclude that cenhaps provide a great tool for exploring functional differences in CPRs and the history of early hominins, even when available fossils contain limited intact DNA.

Crystal With a Twist: Scientists Grow Spiraling New Material

With a simple twist of the fingers, one can create a beautiful spiral from a deck of cards. In the same way, scientists at Berkeley Lab and UC Berkeley have created new inorganic crystals made of stacks of atomically thin sheets that unexpectedly spiral like a nanoscale card deck.

Their surprising structures, reported in a new study in the journal Nature, may yield unique optical, electronic and thermal properties, including superconductivity, the researchers say.

These helical crystals are made of stacked layers of germanium sulfide, a semiconductor material that, like graphene, readily forms sheets that are only a few atoms thick. Such "nanosheets" are usually referred to as "2D materials."

X-ray analyses for the study were performed at Berkeley Lab's Advanced Light Source, and the crystal's twist angles were measured at the Molecular Foundry.

"No one expected 2D materials to grow in such a way. It's like a surprise gift," said Jie Yao, a faculty scientist in Berkeley Lab's Materials Sciences Division and assistant professor of materials science and engineering at UC Berkeley. "We believe that it may bring great opportunities for materials research."

Read the original UC Berkeley news release by Kara Manke.

Drones Will Fly for Days With This New Technology

Researchers with Berkeley Lab and UC Berkeley just broke another record in thermophotovoltaic efficiency, an achievement that could lead to ultralight engines that can power drones for days.

For the past 15 years, the efficiency of converting heat into electricity with thermophotovoltaics - an ultralight alternative power source that could allow drones and other unmanned aerial vehicles to operate continuously for days - has been stalled at 23 percent.

Recently, the team of researchers led by corresponding author Eli Yablonovitch recognized that a highly reflective mirror installed on the back of a photovoltaic cell can reflect low energy infrared photons to reheat the thermal source, providing a second chance for a high-energy photon to be created and generate electricity. This groundbreaking discovery - reported on July 16 in the Proceedings of the National Academy of Sciences - has allowed the researchers to raise the efficiency of thermophotovoltaics to an unprecedented 29 percent.
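A toy energy-bookkeeping model helps show why photon recycling raises efficiency: below-bandgap photons bounced back by the mirror reheat the emitter rather than being wasted, so only the unreflected fraction counts as heat consumed. The sketch below is a simplified illustration under that assumption; the power values, conversion fraction and reflectivities are hypothetical and do not reproduce the published 29 percent figure.

```python
# Toy energy bookkeeping for thermophotovoltaic photon recycling
# (all power values and efficiencies are hypothetical).

def tpv_efficiency(p_above_gap, p_below_gap, cell_conversion, mirror_reflectivity):
    """Efficiency = electrical output / net heat drawn from the emitter.

    Below-bandgap photons reflected by the back mirror return to the emitter
    and reheat it, so only the unreflected fraction counts as consumed heat.
    """
    electrical_out = cell_conversion * p_above_gap
    net_heat = p_above_gap + (1.0 - mirror_reflectivity) * p_below_gap
    return electrical_out / net_heat

# Hypothetical emitter spectrum: 15 W above the cell's bandgap, 85 W below it.
for reflectivity in (0.90, 0.95, 0.99):
    eta = tpv_efficiency(15.0, 85.0, cell_conversion=0.40,
                         mirror_reflectivity=reflectivity)
    print(f"mirror reflectivity {reflectivity:.2f} -> efficiency {eta:.1%}")
```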

According to Yablonovitch, who is a senior faculty scientist in Berkeley Lab's Materials Sciences Division and a professor of electrical engineering and computer science at UC Berkeley, the current study builds on work that he and students published in 2011, which found that the key to boosting solar cell efficiency is, counterintuitively, by externally extracting light from an intense internal luminescent photon gas.

The researchers are now aiming to reach 50 percent thermophotovoltaic efficiency in the future by applying these new scientific concepts.

Read the original UC Berkeley news release by Linda Vu.

Credit: 
DOE/Lawrence Berkeley National Laboratory

The origin and future of spam and other online intrusions

video: The digital history of spam.

Image: 
USC Viterbi School of Engineering

From a confidence trick originating in the late 19th century, to sophisticated AI that can manipulate reality, recreating anyone's face or voice with almost pinpoint accuracy--spam has come a long way.

But what does the future of digital spam look like, what risks could it pose to our personal security and privacy, and what can we do to fight it?

In a new paper, which appeared in the August 2019 issue of Communications of the ACM (CACM), Emilio Ferrara, a USC research assistant professor in computer science and research team leader at USC Viterbi's Information Sciences Institute, tracks the evolution of digital spam and explores its complex, and often surprising, history.

"The fight against spam is a constant arms race," said Ferrara, who specializes in computational social sciences and is an expert in social media bots. "Scams not only exploit technical vulnerabilities; they exploit human ones."

Social media spam bots, which automatically produce content and interact with humans, have allowed spammers to scale their operations to an unprecedented level. (Ferrara explores this in his 2016 CACM paper, The Rise of Social Bots).

Since bots have been used for a variety of nefarious scenarios, from manipulation of political discussions to the spread of conspiracy theories and false news, the stakes are high. In the future, Ferrara believes that deepfake technologies could be abused by well-resourced spammers to create AIs pretending to be human.

Milestones in Spam History:

The term "spam" is internet slang that refers to unsolicited commercial email (UCE).

The first reported case of spam occurred in 1898, when the New York Times reported unsolicited messages circulating in association with an old swindle.

The first reported case of email spam occurred in 1979, attributed to Digital Equipment Corporation and circulated to 400 users of ARPANET, the precursor network of the modern internet.

The term "spam" was coined in 1994, based on a now-legendary Monty Python's Flying Circus sketch, where a crowd of Vikings sings progressively louder choruses of "SPAM! SPAM! SPAM!"

Facts:

Billions of spam emails are sent every day.

Email spam "detection algorithms" are approximately 98% accurate, but new breeds of spam are continually evolving.

Last year, Facebook said it deleted 1.23 billion spam posts in 2018's third quarter.

Credit: 
University of Southern California

How and why resistance training is imperative for older adults

image: Pictured here is a graphic.

Image: 
Michigan Medicine

ANN ARBOR, Mich. - For many older adults, resistance training may not be part of their daily routine, but a new position statement suggests it is vital to improving their health and longevity.

"When you poll people on if they want to live to 100 years old, few will respond with a 'yes'," says Maren Fragala, Ph.D., director of scientific affairs at Quest Diagnostics and lead author of the position statement.

"The reason mainly being that many people associate advanced age with physical and cognitive decline, loss of independence and poor quality of life," adds Mark Peterson, Ph.D., M.S., FACSM, an associate professor of physical medicine and rehabilitation at Michigan Medicine and one of the senior authors of the statement.

The position statement, published in the Journal of Strength and Conditioning Research and supported by the National Strength and Conditioning Association, highlights the benefits of strength and resistance training in older adults for healthier aging.

Fragala explains that while aging does take a toll on the body, the statement provides evidence-based recommendations for successful resistance training, or exercise focused on building muscle endurance, programs for older adults.

"Aging, even in the absence of chronic disease, is associated with a variety of biological changes that can contribute to decreases in skeletal muscle mass, strength and function," Fragala says. "Such losses decrease physiologic resilience and increase vulnerability to catastrophic events."

She adds, "The exciting part about this position statement is that it provides evidence-based recommendations for resistance training in older adults to promote health and functional benefits, while preventing and minimizing fears."

Practical applications

The position statement provides 11 practical applications divided into four main components: program design variables, physiological adaptations, functional benefits, and considerations for frailty, sarcopenia and other chronic conditions.

The applications include suggestions on training types and amounts of repetitions and intensities, patient groups that will need adaptations in training models, and how training programs can be adapted for older adults with disabilities or those residing in assisted living and skilled nursing facilities.

"Current research has demonstrated that resistance training is a powerful care model to combat loss of muscle strength and mass in the aging population," says Peterson, a member of the University of Michigan Institute for Healthcare Policy & Innovation and Michigan Center on the Demography of Aging.

"We demonstrate in this position statement just how much resistance training can positively affect physical functioning, mobility, independence, chronic disease management, psychological wellbeing, quality of life and healthy life expectancy. We also provide recommendations for how to optimize resistance training programs to ensure safety and effectiveness."

Fragala adds that the benefits of participating in resistance training as an older adult outweigh the risks.

"The coauthors of this paper and the hundreds of other prolific researchers whose work we synthesized in this position statement have found that in most cases, the vast benefits of resistance training largely outweigh the risks when training is properly implemented," Fragala says.

Empowering healthy aging

The authors are proud to have the support of the National Strength and Conditioning Association for the statement.

"Too few of older Americans participate in resistance training, largely because of fear, confusion and a lack of consensus to guide implementation," Peterson says. "By having this consensus statement supported by the National Strength and Condition Association, we hope it will have a positive impact on empowering healthier aging."

Credit: 
Michigan Medicine - University of Michigan

Researchers create model to predict risk of low blood sugar in people with diabetes

image: A new study identifies risk factors that could help clinicians recognize patients with diabetes who are most likely to have low blood sugar. The predictive risk model, developed and tested by researchers from Regenstrief Institute, Indiana University School of Medicine and Merck is the first to combine nearly all known and readily assessed risk factors for hypoglycemia.

Image: 
Regenstrief Institute

INDIANAPOLIS - A new study identifies the risk factors that could help healthcare providers recognize patients being treated for diabetes who are most likely to have low blood sugar. The predictive risk model, developed and tested by researchers from Regenstrief Institute, Indiana University School of Medicine and Merck, known as MSD outside of the United States and Canada, is the first to combine nearly all known and readily assessed risk factors for hypoglycemia.

Many patients with diabetes, especially those with recurring episodes of low blood sugar, are unaware when it occurs, despite the risk of serious adverse events including cognitive impairment, coma and death. Being able to identify patients at high risk may provide an opportunity to intervene and prevent hypoglycemia as well as long-term consequences.

Diabetes is one of the most common non-communicable diseases in the world. The U.S. Centers for Disease Control and Prevention estimate that more than 30 million Americans had diabetes in 2015. Low blood sugar, known as hypoglycemia, occurs in 20 to 60 percent of patients with diabetes. It has substantial negative effects on a person's mental and physical health, including the cardiovascular system.

According to the study, the strongest predictors of hypoglycemia are:

Recent infections

Using insulin other than long-acting insulin

Recent occurrences of hypoglycemia

Dementia

The variables associated with the lowest risk for low blood sugar were long-acting insulin in combination with certain other drugs, as well as being 75 years of age or older, which the authors noted was surprising.

"Knowledge of these factors could assist clinicians in identifying patients with higher risk of hypoglycemia, allowing them to intervene to help their patients in lowering that risk," said Michael Weiner, M.D., MPH, director of the Regenstrief Institute William M. Tierney Center for Health Services Research and the senior author of the study. "Some factors influencing hypoglycemia may not be immediately obvious. In addition, reassessing hypoglycemia risk as a patient's health status changes may be important as new factors are identified."

Study Methods

In this retrospective cohort study, researchers gathered data from 10 years of electronic medical records covering nearly 39,000 patients with diabetes who received outpatient care at Eskenazi Health in central Indiana. Study participants were 56 percent female, 40 percent African-American and 39 percent uninsured. The researchers used laboratory tests, diagnostic codes and natural language processing to identify episodes of hypoglycemia.

The scientists found that natural language processing was useful in identifying hypoglycemia, because there were not always laboratory tests to confirm the episode. Instead, hypoglycemia was often recorded only in narrative clinical notes. The study authors believe that their risk prediction model, incorporating natural language processing, could be useful to researchers, clinical administrators and those who are measuring population health.
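A risk model of this kind is often implemented as a regression over binary risk factors drawn from the record. The sketch below fits a logistic regression on a handful of the predictors named above; the synthetic data, the feature coding and the scikit-learn setup are purely illustrative and are not the model the researchers published.

```python
# Illustrative hypoglycemia risk model: logistic regression over a few of the
# predictors named in the article (data and coefficients are synthetic).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

# Binary features per patient: recent infection, non-long-acting insulin use,
# prior hypoglycemia, dementia, age >= 75.
X = rng.integers(0, 2, size=(n, 5))

# Synthetic outcome loosely following the article's direction of effect:
# the first four raise risk, advanced age lowers it in this toy example.
logit = -2.0 + X @ np.array([0.8, 1.0, 1.2, 0.6, -0.4])
y = rng.random(n) < 1 / (1 + np.exp(-logit))

model = LogisticRegression().fit(X, y)

# Predicted risk for a patient with a recent infection and prior hypoglycemia.
patient = np.array([[1, 0, 1, 0, 0]])
print(f"predicted hypoglycemia risk: {model.predict_proba(patient)[0, 1]:.2f}")
```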

Future applications

"This study has implications for clinical support," continued Dr. Weiner. "The predictive model could lead to changes in practice as well as new strategies to help patients lower their risk of hypoglycemia."

Dr. Weiner and his team are now studying the implementation of a clinical decision support tool that uses information from electronic health records to alert clinicians when their patients have hypoglycemia risk factors. In addition, they are conducting an outpatient study that uses wearable devices to monitor and record the actions and continuous glucose levels of people with diabetes. The information collected includes physical activity, diet and adherence to medication regimens, data typically not available in medical records. The goal is to identify patterns that allow healthcare providers to predict hypoglycemia earlier.

"Predictive modeling of hypoglycemia for clinical decision support in evaluating outpatients with diabetes mellitus" was published online June 25 in Current Medical Research and Opinion, a peer-reviewed journal. Funding for the research was provided by Merck & Co., Inc.

Credit: 
Regenstrief Institute

NASA's Terra satellite finds Tropical Storm 07W's strength on its eastern side

image: On July 25, 2019 at 9:15 a.m. EDT (1315 UTC), the MODIS instrument that flies aboard NASA's Terra satellite showed the strongest storms in Tropical Storm 07W were east of the elongated center, where cloud top temperatures (shown in red) were as cold as minus 70 degrees Fahrenheit (minus 56.6 Celsius).

Image: 
NASA/NRL

Wind shear can push clouds and thunderstorms away from the center of a tropical cyclone and that's exactly what infrared imagery from NASA's Terra satellite shows is happening in newly formed Tropical Storm 07W.

NASA's Terra satellite used infrared light to analyze the strength of storms and found the bulk of them on the eastern side of the storm. Infrared data provides temperature information, and the strongest thunderstorms that reach high into the atmosphere have the coldest cloud top temperatures.

On July 25 at 9:15 a.m. EDT (1315 UTC), the Moderate Imaging Spectroradiometer or MODIS instrument that flies aboard Terra gathered infrared data on 07W and showed the strongest thunderstorms had cloud top temperatures as cold as minus 70 degrees Fahrenheit (minus 56.6 Celsius). Cloud top temperatures that cold indicate strong storms with the potential to generate heavy rainfall.

The storm is being affected by moderate vertical wind shear from the southwest. In general, wind shear is a measure of how the speed and direction of winds change with altitude. Tropical cyclones are like rotating cylinders of winds. Each level needs to be stacked on top of the others vertically in order for the storm to maintain strength or intensify. Wind shear occurs when winds at different levels of the atmosphere push against the rotating cylinder of winds, weakening the rotation by pushing it apart at different levels. Wind shear can displace the clouds and showers of the system from around the center.

The Joint Typhoon Warning Center or JTWC noted at 11 a.m. EDT (1500 UTC) on July 25 that Tropical Storm 07W was located near 27.5 degrees north latitude and 137.4 east longitude, about 483 miles south-southwest of Yokosuka, Japan. 07W is moving to the north and has maximum sustained winds near 35 knots (40 mph/62 kph).

The JTWC forecast calls for 07W to move north. Once it reaches Japan, the system is expected to turn to the east-northeast and dissipate.

Credit: 
NASA/Goddard Space Flight Center

Underwater glacial melting is occurring at higher rates than modeling predicts

video: Video made from time-lapse photos taken daily from March 31, 2016, to Aug. 8, 2016 from LeConte Glacier camp. The glacier moves from right to left, with the ice front, or terminus, retreating backwards as summer progresses, even as ice flows rapidly towards the ice front.

Image: 
Video by Jason Amundson, University of Alaska Southeast

EUGENE, Ore. -- July 25, 2019 -- Researchers have developed a new method to allow for the first direct measurement of the submarine melt rate of a tidewater glacier, and, in doing so, they concluded that current theoretical models may be underestimating glacial melt by up to two orders of magnitude.

In a National Science Foundation-funded project, a team of scientists, led by University of Oregon oceanographer Dave Sutherland, studied the subsurface melting of the LeConte Glacier, which flows into LeConte Bay south of Juneau, Alaska.

The team's findings, which could lead to improved forecasting of climate-driven sea level rise, were published in the July 26 issue of the journal Science.

Direct melting measurements previously have been made on ice shelves in Antarctica by boring through to the ice-ocean interface beneath. In the case of vertical-face glaciers terminating at the ocean, however, those techniques are not available.

"We don't have that platform to be able to access the ice in this way," said Sutherland, a professor in the UO's Department of Earth Sciences. "Tidewater glaciers are always calving and moving very rapidly, and you don't want to take a boat up there too closely."

Most previous research on the underwater melting of glaciers relied on theoretical modeling, measuring conditions near the glaciers and then applying theory to predict melt rates. But this theory had never been directly tested.

"This theory is used widely in our field," said study co-author Rebecca H. Jackson, an oceanographer at Rutgers University who was a postdoctoral researcher at Oregon State University during the project. "It's used in glacier models to study questions like: how will the glacier respond if the ocean warms by one or two degrees?"

To test these models in the field, the research team of oceanographers and glaciologists deployed a multibeam sonar to scan the glacier's ocean-ice interface from a fishing vessel six times in August 2016 and five times in May 2017.

The sonar allowed the team to image and profile large swaths of the underwater ice, where the glacier drains from the Stikine Icefield. Also gathered were data on the temperature, salinity and velocity of the water downstream from the glacier, which allowed the researchers to estimate the meltwater flow.

They then looked for changes in melt patterns that occurred between the August and May measurements.
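Conceptually, the direct measurement comes from differencing repeated scans of the ice face: the face's apparent retreat between two surveys, plus the ice that flowed into the face over that interval, is ice that must have melted (setting aside calving). The toy calculation below makes that bookkeeping explicit; the numbers and the neglect of calving are simplifying assumptions for illustration, not the study's actual processing chain.

```python
# Toy melt-rate bookkeeping from two repeat sonar profiles of a glacier face
# (all numbers hypothetical; real processing must also account for calving).

def submarine_melt_rate(face_retreat_m, ice_flow_speed_m_per_day, days_between_scans):
    """Average melt rate (m/day) at a point on the ice face between two scans.

    Ice melted = ice advected toward the face + observed retreat of the face.
    """
    ice_supplied = ice_flow_speed_m_per_day * days_between_scans
    ice_melted = ice_supplied + face_retreat_m
    return ice_melted / days_between_scans

# Example: the face retreated 2 m over 5 days while ice flowed in at 1 m/day.
rate = submarine_melt_rate(face_retreat_m=2.0,
                           ice_flow_speed_m_per_day=1.0,
                           days_between_scans=5.0)
print(f"average submarine melt rate ~{rate:.1f} m/day")
```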

"We measured both the ocean properties in front of the glacier and the melt rates, and we found that they are not related in the way we expected," Jackson said. "These two sets of measurements show that melt rates are significantly, sometimes up to a factor of 100, higher than existing theory would predict."

There are two main categories of glacial melt: discharge-driven and ambient melt. Subglacial discharge occurs when large volumes, or plumes, of buoyant meltwater are released below the glacier. The plume combines with surrounding water to pick up speed and volume as it rises swiftly against the glacial face. The rising current steadily eats away at the glacier face, undercutting the glacier before eventually diffusing into the surrounding waters.

Most previous studies of ice-ocean interactions have focused on these discharge plumes. The plumes, however, typically affect only a narrow area of the glacier face, while ambient melt instead covers the rest of the glacier face.

Predictions have estimated ambient melt to be 10-100 times less than the discharge melt, and, as such, it is often disregarded as insignificant, said Sutherland, who heads the UO's Oceans and Ice Lab.

The research team found that submarine melt rates were high across the glacier's face over both of the seasons surveyed, and that the melt rate increases from spring to summer.

While the study focused on one marine-terminating glacier, Jackson said, the new approach should be useful to any researchers who are studying melt rates at other glaciers. That would help to improve projections of global sea level rise, she added.

"Future sea level rise is primarily determined by how much ice is stored in these ice sheets," Sutherland said. "We are focusing on the ocean-ice interfaces, because that's where the extra melt and ice is coming from that controls how fast ice is lost. To improve the modeling, we have to know more about where melting occurs and the feedbacks involved."

Credit: 
University of Oregon

NASA finds one burst of energy in weakening Depression Dalila

image: On July 25 at 5:20 a.m. EDT (0920 UTC), the MODIS instrument that flies aboard NASA's Aqua satellite showed strongest storms in Dalila were in a small area north of the center. There cloud top temperatures were as cold as minus 50 degrees Fahrenheit (minus 45.5 Celsius).

Image: 
NASA/NRL

Infrared imagery from NASA's Aqua satellite found just a small area of cold clouds in thunderstorms within weakening Tropical Depression Dalila, enough to maintain it as a tropical cyclone.

NASA's Aqua satellite uses infrared light to analyze the strength of storms by providing temperature information about the system's clouds. The strongest thunderstorms that reach high into the atmosphere have the coldest cloud top temperatures.

On July 25 at 5:20 a.m. EDT (0920 UTC), the Moderate Imaging Spectroradiometer or MODIS instrument that flies aboard NASA's Aqua satellite gathered infrared data on Dalila. There was still a small area of strong thunderstorms with cloud top temperatures as cold as minus 50 degrees Fahrenheit (minus 45.5 Celsius). The National Hurricane Center or NHC noted, "Dalila is still technically a tropical cyclone based on the development of new convection within 70-75 nautical miles northeast of the center." That thunderstorm development was enough to maintain its classification as a tropical cyclone.

The NHC said, "At 5 a.m. EDT (0900 UTC), the center of Tropical Depression Dalila was located near latitude 21.6 degrees north and longitude 120.4 degrees west. That's about 675 miles (1,090 km) west of the southern tip of Baja California, Mexico. The depression is moving toward the northwest near 6 mph (9 kph) and this motion is expected to continue this morning. The estimated minimum central pressure is 1009 millibars. Maximum sustained winds remain near 30 mph (45 kph) with higher gusts.

Weakening is forecast during the next couple of days, and Dalila is expected to become a post-tropical remnant low later today.

For updated forecasts, visit: https://www.nhc.noaa.gov

Credit: 
NASA/Goddard Space Flight Center