
Women now seen as being as competent as men, if not more so

WASHINGTON -- Women have come a long way in the United States over the last 70 years, to the point where they are now seen as being as competent as men, if not more so, according to research published by the American Psychological Association.

"Challenging traditional claims that stereotypes of women and men are fixed or rigid, our study joins others in finding stereotypes to be flexible to changes in social roles," said Alice Eagly, PhD, of Northwestern University and lead author on the study. "As the roles of women and men have changed since the mid-20th century, so have beliefs about their attributes."

The research was published in American Psychologist, APA's flagship journal.

Eagly and her coauthors conducted a meta-analysis of 16 nationally representative public opinion polls involving more than 30,000 U.S. adults from 1946 to 2018. They looked at three types of traits - communion (e.g., compassion, sensitivity), agency (e.g., ambition, aggression), and competence (e.g., intelligence, creativity) - and whether participants thought each trait was truer of women or men or equally true of both.

Competence stereotypes changed dramatically over time. For example, in one 1946 poll, only 35% of those surveyed thought men and women were equally intelligent, and of those who believed there was a difference, more thought men were the more competent sex. In contrast, in one 2018 poll, 86% believed men and women were equally intelligent, 9% believed women were more intelligent and only 5% believed men were more intelligent.

Communal stereotypes viewing women as more compassionate and sensitive than men strengthened over time. In contrast, agency stereotypes viewing men as more ambitious and aggressive than women did not significantly change over time.

"These current stereotypes should favor women's employment because competence is, of course, a job requirement for virtually all positions. Also, jobs increasingly reward social skills, making women's greater communion an additional advantage," said Eagly. "On a less positive note, most leadership roles require more agency than communion. Therefore, the lesser agency ascribed to women than men is a disadvantage in relation to leadership positions."

Eagly theorized that the considerable change in competence beliefs derives, in part, from the changing roles of men and women. Women's labor force participation has increased from 32% in 1950 to 57% in 2018, while men's participation has fallen from 82% to 69%. Women also now earn more bachelor's, master's and doctoral degrees than do men, unlike decades ago.

"Our interpretation of these findings is that women's increasing labor force participation and education underlie the increase in their perceived competence, but that occupational segregation and the division of domestic roles underlie the findings for communion and agency," she said.

As women entered paid employment in large numbers, their jobs remained concentrated in occupations that reward social skills or offer contribution to society. Women also spend approximately twice as much time on domestic work and child care as men on average, according to Eagly. In contrast, men are concentrated in leadership roles and in occupations that require physical strength, competition, interaction with things, and analytical, mathematical and technical skills.

"Observation of these stark differences in the typical roles of women and men causes people to ascribe different traits to them, as shown in other research studies. Gender stereotypes thus reflect the social position of women and men in society but change when this social position shifts," she said.

Credit: 
American Psychological Association

Scientists hope genetic research will lead to new breakthroughs in weed control

image: An article featured in the journal Weed Science sheds important new light on the genetics and potential control of Palmer amaranth and waterhemp -- two troublesome Amaranthus species weeds that are resistant to multiple herbicides.

Image: 
www.wssa.net

July 18, 2019 - An article featured in the journal Weed Science sheds important new light on the genetics and potential control of Palmer amaranth and waterhemp - two troublesome Amaranthus species weeds that are resistant to multiple herbicides.

While most Amaranthus species are monoecious and contain both male and female flowers on a single plant, Palmer amaranth and waterhemp are dioecious. Some plants are female, while others are male. This reproductive difference promotes outcrossing and genetic diversity, which can fuel herbicide-resistant populations.

A team based at the University of Illinois recently sequenced DNA from both male and female Palmer amaranth and waterhemp plants to explore dioecy and the genetic basis of sex determination. The data sets they compiled from sex-specific and sex-biased sequences were able to distinguish male from female plants across multiple, geographically distinct Palmer amaranth and waterhemp populations with 95 percent or greater accuracy.
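The article itself gives the actual marker sequences; as a hedged illustration of how presence/absence markers of this kind are used, a classifier can simply count how many male-specific sequences are detected in a sample. The marker names and threshold below are hypothetical, not the published markers:

```python
# Illustrative sketch only: the marker names and decision threshold here are
# hypothetical. The idea is that a plant carrying male-specific sequences is
# called male; one lacking them is called female.

MALE_SPECIFIC_MARKERS = {"MSY_1", "MSY_2", "MSY_3"}  # hypothetical names

def predict_sex(detected_sequences, min_hits=2):
    """Call a plant male if at least min_hits male-specific markers are found."""
    hits = len(MALE_SPECIFIC_MARKERS & set(detected_sequences))
    return "male" if hits >= min_hits else "female"

print(predict_sex({"MSY_1", "MSY_3", "other_locus"}))  # -> male
print(predict_sex({"other_locus"}))                    # -> female
```

Requiring two or more hits rather than one is a simple guard against false positives from off-target amplification; the reported 95-percent-plus accuracy suggests the real markers are far more discriminating than this toy rule.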

This new genetic-level data is expected to be of great benefit to researchers who are interested in the biology, evolution and control of both Palmer amaranth and waterhemp.

"We hope that having a better understanding of weed genetics will open up new control strategies that haven't yet been considered," says Patrick J. Tranel, Ph.D., a professor at the University of Illinois. "For example, it might be possible to manipulate Palmer amaranth or waterhemp genes so that all offspring are male, causing the collapse of a local weed population."

To learn more, you can read the article "Sex-specific markers for waterhemp (Amaranthus tuberculatus) and Palmer amaranth (Amaranthus palmeri)" in Weed Science, vol. 67, issue 4, online.

Credit: 
Cambridge University Press

'Trojan horse' anticancer drug disguises itself as fat

image: New drug delivery system disguises a common chemotherapy drug as a long-chain fatty acid. Thinking the drugs are tasty fats, tumors invite the drug inside. Once there, the targeted drug activates, immediately suppressing tumor growth.

Image: 
Nathan Gianneschi/Northwestern University

EVANSTON, Ill. -- A stealthy new drug-delivery system disguises chemotherapeutics as fat in order to outsmart, penetrate and destroy tumors.

Thinking the drugs are tasty fats, tumors invite the drug inside. Once there, the targeted drug activates, immediately suppressing tumor growth. The drug also is lower in toxicity than current chemotherapy drugs, leading to fewer side effects.

"It's like a Trojan horse," Northwestern University's Nathan Gianneschi, who led the research. "It looks like a nice little fatty acid, so the tumor's receptors see it and invite it in. Then the drug starts getting metabolized and kills the tumor cells."

The study will be published July 18 in the Journal of the American Chemical Society (JACS). Gianneschi is the Jacob and Rosalind Cohn Professor of Chemistry in Northwestern's Weinberg College of Arts and Sciences. Cassandra E. Callmann is the paper's first author. A current postdoctoral fellow at Northwestern, Callmann was a graduate student in Gianneschi's laboratory during the research.

To develop the targeting system, Gianneschi and his team engineered a long-chain fatty acid with two binding sites, one on each end, able to attach to drugs. The fatty acid and its hitchhiking drugs are then hidden inside human serum albumin (HSA), which carries molecules, including fats, throughout the body.

The body's cellular receptors recognize the fats and proteins supplied by the HSA and allow them inside. Quick-growing and hungry, cancer cells consume the nutrients much faster than normal cells. When the cancer cells metabolize the hidden drug, they die.

"It's like the fatty acid has a hand on both ends: one can grab onto the drug and one can grab onto proteins," Gianneschi said. "The idea is to disguise drugs as fats so that they get into cells and the body is happy to transport them around."

In the study, the researchers used the drug delivery system to carry a common, FDA-approved chemotherapy drug, paclitaxel, into tumors in a small animal model. Disguised as fat, the drug entered and completely eliminated the tumors in three types of cancer: bone, pancreatic and colon.

Even better: the researchers found they could deliver 20 times the dose of paclitaxel with their system, compared to two other paclitaxel-based drugs. But even at such a high quantity, the drug in Gianneschi's system was still 17 times safer.

"Commonly used small-molecule drugs get into tumors -- and other cells," Gianneschi said. "They are toxic to tumors but also to humans. Hence, in general, these drugs have horrible side effects. Our goal is to increase the amount that gets into a tumor versus into other cells and tissues. That allows us to dose at much higher quantities without side effects, which kills the tumors faster."

Credit: 
Northwestern University

Monitoring air quality after Fourth of July fireworks

The U.S. recently celebrated the Fourth of July with dazzling fireworks displays in many cities. After the "oohs" and "ahhs" faded, some people might have wondered how the lingering gunpowder-scented smoke affected air quality. Now researchers reporting in ACS Earth and Space Chemistry have conducted detailed measurements and found increased levels of several pollutants after an Independence Day fireworks event in Albany, New York.

According to the American Pyrotechnics Association, about 254 million pounds of fireworks exploded in consumer and public displays in 2017. Previous studies have shown that fireworks festivities around the world can cause very high short-term air pollution, which could have harmful effects on the respiratory system in humans. However, most of these studies used filter-based methods to collect air over 12- or 24-hour time periods, so they didn't provide real-time information. James Schwab and colleagues wanted to conduct a detailed investigation on air quality before, during and after a large fireworks display in Albany, New York -- a city of about 100,000 people that typically has relatively clean air.

The researchers collected minute- and hour-averaged air samples from two sites in uptown and downtown Albany from June 27 to July 7, 2017, and analyzed pollutants by mass spectrometry. The peak levels of submicron particulate matter were more than eight times higher after the fireworks display than before. The team also observed a large spike in potassium levels -- from the black powder used as a propellant in fireworks -- on the night of July 4, which peaked at 350 times the background level for 2-3 hours and lingered until the next morning. The levels of other substances including organics, nitrate and sulfate also increased in the hours following the display. The team estimated that emissions during the fireworks show were about 10 times higher than the hourly emissions rate from vehicles in the Albany area. The researchers say that additional studies, including those assessing human health impacts, are needed.

Credit: 
American Chemical Society

How common is long-term opioid use after job injury?

What The Study Did: This observational study included 46,000 injured workers in Tennessee who weren't taking opioids at the time of their injury and looked at how common long-term opioid use was and what factors were associated with it.

Authors: Zoe Durand, Ph.D., of the Tennessee Department of Health in Nashville, is the corresponding author.

(doi:10.1001/jamanetworkopen.2019.7222)

Editor's Note: The article includes conflict of interest and funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.

Credit: 
JAMA Network

DistME: A fast and elastic distributed matrix computation engine using GPUs

image: Schematic diagrams of (a) 3D matrix multiplication through CuboidMM and (b) data processing using a GPU.

Image: 
DGIST

DGIST announced on July 4 that Professor Min-Soo Kim's team in the Department of Information and Communication Engineering has developed DistME (Distributed Matrix Engine), a technology that can analyze 100 times more data, 14 times faster, than existing technologies. The new technology is expected to be used in machine learning, which requires big data processing, and in various industry fields that analyze large-scale data.

'Matrix' data, which expresses numbers in rows and columns, is the most widely used form of data in fields such as machine learning* and science and technology. 'SystemML' and 'ScaLAPACK' are regarded as the most popular technologies for analyzing matrix data, but the processing capability of existing technology has recently reached its limits as data sizes grow. Matrix multiplication, which is required for big data analysis, is especially difficult with the existing methods because they cannot perform elastic analysis and processing and require a huge amount of network data transfer.

In response, Professor Kim's team developed a distributed matrix multiplication method that differs from the existing ones. Called CuboidMM, this method formulates matrix multiplication as a 3D hexahedron, then partitions it into multiple pieces called cuboids and processes them. The optimal size of a cuboid is determined flexibly according to the characteristics of the matrices - their size, dimension and sparsity - so as to minimize the communication cost. CuboidMM not only generalizes all the existing methods but can also perform matrix multiplication with minimum communication cost. In addition, Professor Kim's team combined it with GPUs (graphics processing units), dramatically enhancing the performance of matrix multiplication.
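The DistME implementation itself is not shown in this release; as a minimal sketch of the cuboid idea, matrix multiplication C = A x B can be viewed as a 3D i-j-k iteration space and partitioned into blocks ("cuboids"), each computing an independent partial product that is accumulated into C. In the real engine each cuboid could be assigned to a different worker or GPU; the block sizes below are arbitrary and the code is an illustration, not the published system:

```python
# Sketch of cuboid-partitioned matrix multiplication (not the DistME code).
# The triple loop over (i, j, k) is split into bi x bj x bk cuboids; each
# cuboid computes an independent partial product, which is what makes the
# scheme easy to distribute across workers or GPUs.

def cuboid_matmul(A, B, bi=2, bj=2, bk=2):
    n, m = len(A), len(A[0])          # A is n x m
    p = len(B[0])                     # B is m x p
    C = [[0.0] * p for _ in range(n)]
    for i0 in range(0, n, bi):
        for j0 in range(0, p, bj):
            for k0 in range(0, m, bk):
                # One cuboid: a sub-volume of the iteration space.
                for i in range(i0, min(i0 + bi, n)):
                    for j in range(j0, min(j0 + bj, p)):
                        for k in range(k0, min(k0 + bk, m)):
                            C[i][j] += A[i][k] * B[k][j]
    return C

A = [[1, 2, 3], [4, 5, 6]]
B = [[7, 8], [9, 10], [11, 12]]
print(cuboid_matmul(A, B))  # same result as an ordinary matrix multiply
```

Choosing the cuboid shape trades off how much of A and B each worker must receive against how many partial results for C must be combined afterward; that trade-off is the communication cost CuboidMM minimizes.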

The DistME technology developed by Professor Kim's team increases processing speed by combining CuboidMM with GPUs; it is 6.5 and 14 times faster than ScaLAPACK and SystemML, respectively, and can analyze matrix data 100 times larger than SystemML can. It is expected to open new applications of machine learning in areas that need large-scale data processing, including online shopping malls and social networking services.

Professor Kim of the Department of Information and Communication Engineering said, "Machine learning, which has been drawing worldwide attention, faces limits in the speed and scale at which matrix-form big data can be analyzed. The information processing technology developed here overcomes those limits and will be useful not only in machine learning but also in a wider range of scientific and technical data analysis applications."

Donghyoung Han, a Ph.D. student in the Department of Information and Communication Engineering, participated in the research as first author, and the work was presented on July 3 at ACM SIGMOD 2019, the top-renowned academic conference in the database field, held in Amsterdam, Netherlands.

* Machine learning: a research field, and the computer programs it produces, in which information-processing ability improves through learning from data and experience.

Credit: 
DGIST (Daegu Gyeongbuk Institute of Science and Technology)

New insight into microRNA function can give gene therapy a boost

Scientists at the University of Eastern Finland and the University of Oxford have shown that small RNA molecules occurring naturally in cells, i.e. microRNAs, are also abundant in cell nuclei. Previously, microRNAs were thought to reside mainly in the cytoplasm. The scientists also discovered that microRNA concentrations in cell nuclei change as a result of hypoxia. The findings strongly suggest that microRNAs play a role in the expression of genes in the cell nucleus. This observation is crucial for the development of novel gene therapy, among other things. The study was published in Scientific Reports today.

The scientists profiled the distribution of microRNAs in different parts of endothelial cells, discovering that a large share of microRNAs is enriched in cell nuclei. When the scientists exposed the cell culture to hypoxia, they noticed that changes in the concentrations of individual microRNAs mostly took place either in the cytoplasm or in the cell nucleus. This profiling study is the first of its kind, and it shows that microRNAs play a more central role in regulating gene expression in cells than previously thought. For instance, the scientists found that microRNA-210, a molecule previously strongly associated with hypoxia, is in fact especially abundant in the cell nucleus. This observation sheds light on previously unknown mechanisms that cells use to adapt to hypoxia.

MicroRNAs weaken the expression of their target genes by binding to the ends of their messenger RNAs in the cytoplasm. This phenomenon, known as RNA interference, was discovered through work recognised with a Nobel Prize in 2006.

At the University of Eastern Finland, Dr Mikko Turunen and colleagues showed already ten years ago that synthetic microRNA molecules can regulate genes of therapeutic importance in animal models by targeting their impact on genes' regulatory areas in cell nuclei. This discovery, along with subsequent research, led the scientists to assume that structurally similar microRNAs occurring naturally in cells also play a role in the regulation of genes in the cell nucleus.

"It is highly significant that these microRNAs targeting the cell nucleus can also increase the expression of genes, which is opposite to what happens in RNA interference. This is a very important finding in view of novel gene therapy, for example," Dr Mikko Turunen from the University of Eastern Finland points out.

Credit: 
University of Eastern Finland

Stanford researchers identify possible drug target for deadly heart condition

A genetic mutation linked to dilated cardiomyopathy, a dangerous enlargement of the heart's main pumping chamber, activates a biological pathway normally turned off in healthy adult hearts, according to a study by researchers at the Stanford University School of Medicine.

Chemically inhibiting the pathway corrected the mutation's effects in patient-derived heart cells in a lab dish, the study found. The researchers accomplished this with drugs already approved by the Food and Drug Administration.

The findings, which will be published online July 17 in Nature, suggest that existing drugs could one day be repurposed to treat dilated cardiomyopathy. More broadly, the study demonstrates how patient-derived heart cells can help scientists better study the heart and screen new candidate drugs.

"With 10 milliliters of blood, we can make clinically usable amounts of your beating heart cells in a dish," said the study's senior author, Joseph Wu, MD, PhD, director of the Stanford Cardiovascular Institute and a pioneer of the technique. "And if you tell me you're taking some kind of medication for your heart -- like beta-blockers or statins -- we can add that to see how it affects your heart. That's the beauty of this approach."

The researchers studied heart muscle cells grown from patients with a genetic mutation associated with dilated cardiomyopathy. Heart cells with a mutation in lamin, which forms part of the nuclear envelope, failed to beat properly -- just like in patients with the disease. The scientists found that the defect was the result of a surge in the platelet-derived growth factor pathway. This pathway is important in the formation of blood vessels and normally only activates when the heart first forms or is under stress. Treating heart cells with existing drug inhibitors of the pathway restored regular, rhythmic beating.

Don't stop the beat

In dilated cardiomyopathy, the heart's main pumping chamber, the left ventricle, expands so much that the heart can no longer beat regularly. Patients experience shortness of breath, chest pain and, in severe cases, sudden and deadly cardiac arrest. Approximately 1 in every 250 Americans suffers from a form of dilated cardiomyopathy whose exact cause is not known, though 20% to 35% of these cases run in families.

Previous studies correlated mutations in lamin to familial dilated cardiomyopathy, but it seemed like an odd connection. Lamin forms part of the nuclear envelope, a structure that separates DNA from the rest of the cell and regulates the movement of molecules in and out of the nucleus -- not exactly an obvious candidate for regulating heart function.

"We were puzzled," said Wu, the Simon H. Stertzer, MD, Professor and professor of medicine and of radiology. "Why would a mutation in a nuclear envelope protein not involved in squeezing of the heart, such as sarcomere protein, or in electrophysiology of the heart, such as an ion channel, lead to dilated cardiomyopathy?"

To solve the mystery, the researchers needed to study the lamin mutation in heart muscle cells. Excising a tissue sample from a patient's heart, an invasive medical procedure, was not a good option. Mouse tissue was another possibility, but mouse findings don't always hold up in humans.

Instead, the scientists generated heart cells by turning back the clock on patient-derived skin cells to make induced pluripotent stem cells, which can become any of the specialized cells found throughout the body. While the researchers used skin cells in the study, Wu said that the same technique can also be done with 10 milliliters of blood -- roughly two teaspoons.

Heart muscle cells grown in a dish pulse rhythmically, just as they do in the body. But cells from members of a family with lamin mutations and a history of dilated cardiomyopathy beat noticeably off-rhythm and had irregular electrical activity. The defect could be fixed by swapping in a normal copy of the gene with a gene-editing technology. Introducing the mutation into cells from healthy patients caused those cells to beat off-rhythm too. Cells with the lamin mutation had abnormal levels of calcium, a key ion that regulates muscle contractions.

Getting back on rhythm

As part of the nuclear envelope, lamin interacts with a tightly packed form of DNA known as heterochromatin. Interestingly, the researchers found by various DNA sequencing techniques that cells with the lamin mutation had fewer regions of heterochromatin. Since DNA packing affects what genes get activated or shut off, the researchers looked at gene-activation patterns to see which pathways went awry in cells with the mutation -- and what they could do about it.

"Although we did all this sequencing and other experiments, without a specific target, we cannot provide the right therapy," said the study's lead author, Jaecheol Lee, PhD, a former postdoctoral scholar who is now an assistant professor at the School of Pharmacy at Sungkyunkwan University in South Korea.

They found nearly 250 genes that were more highly activated in mutated cells than in normal cells. Many of the genes were part of the platelet-derived growth factor, or PDGF, pathway. When the researchers tested heart tissue from dilated cardiomyopathy patients with a lamin mutation, they saw signs that the same pathway was activated.

But did activation of the PDGF pathway cause abnormal rhythms or the other way around? To test this, the researchers treated heart cells with two drugs, crenolanib and sunitinib, that inhibit a key PDGF receptor. After treatment, heart cells with the lamin mutation began beating more regularly, and their gene-activation patterns more closely matched those of cells from healthy donors.

These two drugs are FDA-approved for treating various cancers. But previous work from Wu's team shows that the drugs may damage the heart at high doses, which will make finding the right dose or a safer alternative critical.

The current study is part of a broader effort by the researchers to use these patient-derived cells in a dish to screen for and discover new drugs. It's why the Wu lab has generated heart muscle cells from over 1,000 patients, including Wu, his son and daughter.

"Our postdocs have taken my blood and differentiated my pluripotent stem cells into my brain cells, heart cells and liver cells," Wu said. "I'm asking them to test some of the medications that I might need to take in the future."

Credit: 
Stanford Medicine

Improving the odds of synthetic chemistry success

image: This is the process for developing predictive models of chemical reactions.

Image: 
Jolene Reid and Matthew Sigman.

Chemistry is more than just mixing compound A with compound B to make compound C. There are catalysts that affect the reaction rate, as well as the physical conditions of the reaction and any intermediate steps that lead to the final product. If you're trying to make a new chemical process for, say, pharmaceutical or materials research, you need to find the best of each of these variables. It's a time-consuming trial-and-error process.

Or, at least, it was.

In a new publication in Nature, University of Utah chemists Jolene Reid and Matthew Sigman show how analyzing previously published chemical reaction data can predict how hypothetical reactions may proceed, narrowing the range of conditions chemists need to explore. Their algorithmic prediction process, which includes aspects of machine learning, can save valuable time and resources in chemical research.

"We try to find the best combination of parameters," Reid says. "Once we have that we can adjust features of any reaction and actually predict how that adjustment will affect it."

Trial and error

Previously, chemists who wanted to carry out a reaction that hadn't been tried before, such as a reaction to attach a particular small molecule to a particular spot on a larger molecule, approached the problem by looking up a similar reaction and mimicking the same conditions.

"Almost every time, at least in my experience, it doesn't work well," Sigman says. "So then you systematically change the conditions."

But with several variables in each reaction--Sigman estimates around seven to 10 in a typical pharmaceutical reaction--the number of possible combinations of conditions becomes overwhelming. "You cannot cover all of this variable space with any type of high throughput operation," Sigman says. "We're talking billions of possibilities."

Narrowing the field

So, Sigman and Reid looked for a way to narrow the focus to a more manageable range of conditions. For their test case, they looked at reactions involving molecules that are mirror images of each other (in the same way your right and left hands are) and that select more for one configuration than the other. Such a reaction is called "enantioselective," and Sigman's lab studies the types of catalysts involved in enantioselective reactions.

Reid collected published scientific reports of 367 reactions involving imines, which contain a basic nitrogen atom, and used machine learning algorithms to correlate features of the reactions with how selective they were for the two different forms of imines. The algorithms looked at the reactions' catalysts, solvents and reactants, and constructed mathematical relationships between those properties and the final selectivity of the reaction.

"There's a pattern hidden beneath the surface of why it works and doesn't work with this condition, this catalyst, this substrate, and so on," Sigman says.

"The key to our success is that we use information from many reactions," Reid adds.

Easing the pain

How well does their predictive model work? It successfully predicted the outcomes of 15 reactions involving one reactant that wasn't in the original set, and the outcomes of 13 reactions where both a reactant and catalyst type were not in the original set. Finally, Reid and Sigman looked at a recent study that conducted 2,150 experiments to find the optimal conditions of 34 reactions. Without dirtying a single beaker, Reid and Sigman's model arrived at the same results and same optimal catalyst.

Reid looks forward to applying the model to predicting reactions involving large, complex molecules. "Often you find that new methodologies aren't fine-tuned to complex systems," she says. "Possibly we could do that now by predicting beforehand the best kind of catalyst."

Sigman adds that predictive models can lower the barriers to new drug development.

"The pharmaceutical industry doesn't want to invest money into something that they don't know if it's going to work," he says. "So, if you have an algorithm that suggests this has a high probability of working, you ease the pain."


Credit: 
University of Utah

Scientists identified the metabolic features specific to the autistic brain

Skoltech scientists looked into the differences in the concentrations of multiple metabolites in healthy humans and individuals suffering from Autism Spectrum Disorder (ASD), gaining a deeper insight into the molecular processes that take place in the brain of autistic individuals. The results of the study were published in Nature's Communications Biology journal.

ASD is a range of nervous system disorders that manifest themselves primarily through impairment of cognitive functions and social communication and interaction abilities. The underlying molecular mechanisms of ASD are still poorly understood.

Scientists from the Skoltech Center for Neurobiology and Brain Restoration (CNBR), the Icahn School of Medicine at Mount Sinai (ISMMS, New York, USA), the Max Planck Institute in Potsdam, and the Cologne Institute (Germany) studied metabolites - tiny molecules produced by biochemical reactions in the body - in the prefrontal cortex of both healthy people and individuals with ASD, and compared the results with tests of the same brain region in macaques and chimpanzees. The study used mass spectrometry, a highly accurate and sensitive analytical technique, to register and measure the concentrations of 1,366 different molecules clustered in 16 metabolic pathways.

Using samples from healthy people as a reference, the scientists discovered multiple differences in metabolite concentrations between autistic and healthy humans. Interestingly, most of those differences are known to relate to metabolic pathways identified earlier in urine and blood samples taken from autistic individuals. When the brain metabolites of humans are compared with those of other mammals, including chimpanzees and macaques, it becomes clear that the marked differences between healthy and autistic individuals fall in metabolic pathways affected by multiple human-specific evolutionary changes. This leads the scientists to believe that autism tends to disrupt evolutionarily novel mechanisms.

"Some earlier studies clearly pointed to the differences in metabolite concentrations in urine and blood, but fell short of establishing a possible connection to the brain processes. Our team focused on the prefrontal cortex, where we identified a host of ASD-specific metabolic features. We compared metabolites in the human brain to those in the brains of chimpanzees and macaques and found that ASD affects evolutionarily novel metabolic pathways," says one of the authors of the study and Assistant Professor at Skoltech, Ekaterina Khrameeva.

Credit: 
Skolkovo Institute of Science and Technology (Skoltech)

Higher iron levels may boost heart health -- but also increase risk of stroke

Scientists have helped unravel the protective - and potentially harmful - effect of iron in the body.

In a series of early-stage studies examining genetic data from over 500,000 people, a team of international scientists, led by Imperial College London, explored the role that iron plays in over 900 diseases.

The results reveal that naturally higher iron levels are associated not only with a lower risk of high cholesterol, but also with a reduced risk of arteries becoming furred with a build-up of fatty substances.

However, the research, funded by the Wellcome Trust, also revealed potential risks associated with naturally higher iron levels. These included a higher risk of blood clots related to slow blood flow, a common cause of stroke, and a higher risk of bacterial skin infection.

Dr Dipender Gill, lead author of the study from Imperial's School of Public Health, said: "Iron is a crucial mineral in the body, and is essential for carrying oxygen around the body. However, getting the right amount of iron in the body is a fine balance - too little can lead to anaemia, but too much can lead to a range of problems including liver damage."

Dr Gill cautioned that the study only looked at naturally occurring iron levels in the body, related to genetic variation between individuals, and did not investigate the effect of taking iron supplements. He advises anyone considering starting or stopping iron supplements to speak to their doctor first.

In the studies, the research team used a genetic technique called Mendelian Randomization to investigate the link between iron levels and the risk of disease. In this process, they sifted through genetic data from thousands of people to identify genetic 'variants' associated with naturally higher iron levels. They then investigated whether people who carry these variants, called single-nucleotide polymorphisms, also had higher or lower risk of a range of conditions and diseases, such as high cholesterol and atherosclerosis.
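The core arithmetic of this kind of Mendelian randomization analysis can be illustrated with a toy Wald-ratio / inverse-variance-weighted calculation. The per-variant numbers below are invented for illustration, not taken from the study:

```python
import numpy as np

# Hypothetical summary statistics for three iron-raising variants (invented):
# effect of each SNP on serum iron (in standard-deviation units)
beta_iron = np.array([0.30, 0.25, 0.40])
# effect of the same SNPs on disease risk (log odds ratio) and its standard error
beta_disease = np.array([-0.06, -0.05, -0.09])
se_disease = np.array([0.02, 0.02, 0.03])

# Wald ratio: the iron-disease effect implied by each variant on its own
wald = beta_disease / beta_iron

# Inverse-variance-weighted (IVW) estimate pools the per-variant ratios
weights = (beta_iron / se_disease) ** 2
ivw = np.sum(wald * weights) / np.sum(weights)
ivw_se = np.sqrt(1.0 / np.sum(weights))
print(f"pooled effect of genetically higher iron: {ivw:.3f} (SE {ivw_se:.3f})")
```

A negative pooled estimate here would correspond to the study's finding that genetically higher iron is associated with lower atherosclerosis risk; the same machinery, run against clot or infection outcomes, can return positive estimates.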

The results, published in the Journal of the American Heart Association and PLOS Medicine, revealed that naturally higher iron levels were associated with reduced risk of both high cholesterol and atherosclerosis.

Atherosclerosis is a potentially serious condition where the arteries become clogged with fatty substances. This can reduce the flow of blood in the arteries, and in some cases can lead to a block in flow to the brain (triggering a stroke), or the heart (triggering heart attack).

However, the picture was complicated by further findings from the same study, which revealed high iron levels may be linked to a risk of clots related to slow blood flow, which can increase the risk of certain types of stroke and the condition deep vein thrombosis.

And to add to this, the studies also revealed higher iron levels may also be linked to an increased risk of bacterial skin infections.

So what is going on?

Dr Gill said these findings now need to be investigated in patient trials. He explained: "These studies reveal new avenues of research, and present many questions. We are still unclear on how iron affects cholesterol levels, narrows arteries and forms blood clots, but we have ideas. One possibility is that the lower cholesterol levels may be linked to the reduced risk of arteries becoming furred. Furthermore, higher iron levels may cause blood clots to arise when flow is reduced, possibly explaining the increased chance of clots."

He adds that previous research suggests that iron may also play a role in bacterial replication and virulence, which may be linked to the increased risk of skin infections.

Credit: 
Imperial College London

New 'Majorana Photons' identified

image: Majorana-radially polarized twisted photon.

Image: 
Robert R. Alfano & Yury Budansky

Hailed as a pioneer by Photonics Media for his previous discoveries of the supercontinuum and Cr tunable lasers, City College of New York Distinguished Professor of Science and Engineering Robert R. Alfano and his research team are claiming another breakthrough with a new super class of photons dubbed "Majorana photons." These could lead to enhanced information on quantum-level transitions and imaging of the brain and its workings.

Alfano's group based its research on the fact that photons, while possessing salient properties of polarization, wavelength, coherence and spatial modes, take on several forms. "Photons are amazing and are not all the same," Alfano states.

Their focus "was to use a 'special super form' of photons, which possess the entangled twists of both polarization and wavefront, to probe and propagate deeper into brain tissues, microtubules and neuron cells, giving more fundamental information about the brain than conventional photon forms."

These unique photons can travel with different wavefronts. They also have a vortex, where the wavefront twists and the polarization is non-homogeneous across the beam diameter. These beams are called Cylindrical Vector Vortex Beams (CVVB).

Among these CVVB photons, the Alfano team identified a new "super special" class called classically entangled photon beams. These photons locally mix both types of circular polarization with +L and -L orbital angular momentum, and are entangled with their own anti-photon. Two examples stand out: the radial and azimuthal optical beams.
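In standard textbook notation (not taken from Alfano's paper, and with convention-dependent signs and pairings), the lowest-order radial and azimuthal vector vortex beams are non-separable superpositions of circular polarization states $\hat{e}_L, \hat{e}_R$ and orbital angular momentum phases $e^{\pm i\varphi}$ (i.e. $\ell = \pm 1$):

```latex
\begin{align*}
\mathbf{E}_{\text{radial}}    &\propto \tfrac{1}{\sqrt{2}}\left(\hat{e}_L\, e^{+i\varphi} + \hat{e}_R\, e^{-i\varphi}\right),\\
\mathbf{E}_{\text{azimuthal}} &\propto \tfrac{i}{\sqrt{2}}\left(\hat{e}_L\, e^{+i\varphi} - \hat{e}_R\, e^{-i\varphi}\right).
\end{align*}
```

Because the polarization and the spatial (orbital angular momentum) degrees of freedom cannot be factored apart, such a beam is "classically entangled" in the sense used above.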

Alfano named them "Majorana Photons," after Ettore Majorana, an Italian theoretical physicist and protégé of Enrico Fermi, who worked on neutrino masses.

"The 'super special photon" will play an important role in understanding the fundamental and quantum processes in materials, deeper penetration and to advance applications in photo detection sensing, information, communication and future computers," said Alfano, a prolific inventor whose research has led to advancements in ultrafast laser science and nonlinear optical imaging, since 1970.

Credit: 
City College of New York

'Artificial intelligence' fit to monitor volcanoes

image: This is an interferogram of the December 2018 eruption of Etna in southern Italy, based on Sentinel-1 satellite images. Interferograms spatially map ground surface movements.

Image: 
MOUNTS system, Data: ESA Sentinel, edited: Sébastien Valade, GFZ

More than half of the world's active volcanoes are not monitored instrumentally. Hence, even eruptions that could have set off an early alarm can occur without people at risk having any warning of the upcoming disaster. As a first step towards a volcano early warning system, a research project headed by Sébastien Valade from the Technical University of Berlin (TU Berlin) and the GFZ German Research Centre for Geosciences in Potsdam has led to a new volcano monitoring platform which analyses satellite images using, amongst other methods, "artificial intelligence" (AI). Through tests with data from recent events, Valade and his colleagues demonstrated that their platform, called MOUNTS (Monitoring Unrest from Space), can integrate multiple sets of diverse data types for comprehensive volcano monitoring. The team's results were published in the journal Remote Sensing.

Of the roughly 1,500 active volcanoes worldwide, up to 85 erupt each year. Due to the cost and difficulty of maintaining instrumentation in volcanic environments, less than half of the active volcanoes are monitored with ground-based sensors, and even fewer are considered well-monitored. Volcanoes considered dormant or extinct are commonly not instrumentally monitored at all, but may experience large and unexpected eruptions, as was the case for the Chaitén volcano in Chile in 2008, which erupted after 8,000 years of inactivity.

Eruptions often preceded by precursory signals

Satellites can provide crucial data when ground-based monitoring is limited or lacking completely. Continuous long-term observations from space are key to better recognizing signs of volcanic unrest. Eruptions are often - but not always - preceded by precursory signals which may last a few hours to a few years. These signals can include changes in the seismic behaviour, ground deformation, gas emissions, temperature increase or several of the above.

"Apart from seismicity, all of these can be monitored from space by exploiting various wavelengths across the electromagnetic spectrum", says Sébastien Valade, leader of the MOUNT project. It is funded by GEO.X, a research network for geosciences in Berlin and Potsdam founded in 2010, and conducted at TU Berlin and GFZ. "With the MOUNTS monitoring system, we exploit multiple satellite sensors in order to detect and quantify changes around volcanoes", he adds. "And we also integrated seismic data from GFZ's worldwide GEOFON network and from the United States Geological Survey USGS."

Part of the project was to test whether AI algorithms could be successfully integrated into the data analysis procedure. These algorithms were mainly developed by Andreas Ley from TU Berlin, who applied so-called artificial neural networks to automatically detect large deformation events. The researchers trained the networks with computer-generated images mimicking real satellite images. From this vast number of synthetic examples, the software learned to detect large deformation events in real satellite data it had never seen before. This field of data science is called 'machine learning'.
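The train-on-synthetic-data idea can be sketched in a few lines of NumPy. This is not the MOUNTS code: the images are toy wrapped-phase patterns, and a simple fringe-density threshold stands in for the neural network, but the workflow (generate labelled synthetic interferograms, fit a detector on them, apply it to unseen scenes) is the same:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 64  # image size in pixels
x, y = np.meshgrid(np.linspace(-1, 1, N), np.linspace(-1, 1, N))

def synthetic_interferogram(deforming):
    """Toy wrapped-phase image. A Gaussian 'subsidence bowl' adds steep
    phase gradients (dense fringes); the background is a gentle random
    ramp mimicking atmospheric noise."""
    phase = rng.normal() + rng.normal(scale=0.5) * x + rng.normal(scale=0.5) * y
    if deforming:
        phase += 15.0 * np.exp(-(x**2 + y**2) / 0.15)
    return np.angle(np.exp(1j * phase))  # wrap phase to (-pi, pi]

def fringe_density(img):
    """Mean absolute wrapped phase difference between horizontal neighbours."""
    d = np.angle(np.exp(1j * (img[:, 1:] - img[:, :-1])))
    return np.abs(d).mean()

# "Train" on computer-generated labelled examples
train = [(synthetic_interferogram(lbl), lbl) for lbl in [0, 1] * 50]
f0 = np.mean([fringe_density(im) for im, lbl in train if lbl == 0])
f1 = np.mean([fringe_density(im) for im, lbl in train if lbl == 1])
threshold = (f0 + f1) / 2  # stand-in for the learned decision boundary

# Evaluate on fresh synthetic scenes the detector has not seen
test = [(synthetic_interferogram(lbl), lbl) for lbl in [0, 1] * 25]
accuracy = np.mean([(fringe_density(im) > threshold) == lbl for im, lbl in test])
print(f"detection accuracy on held-out synthetic scenes: {accuracy:.2f}")
```

The appeal of the approach is that labelled training data can be generated in unlimited quantity, whereas real, ground-truthed deformation events are rare.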

"For us, this was an important 'test balloon' to see how we can integrate machine learning into the system," says Andreas Ley. "Right now, our deformation detector just solves a single task. But our vision is to integrate several AI tools for different tasks. Since these tools usually benefit from being trained on large amounts of data, we want to make them learn continuously from all the data the system gathers on a global scale."

MOUNTS monitors 17 volcanoes worldwide

The main challenges he and his co-authors had to deal with were handling the large amounts of data, and software engineering issues. "But these problems can be solved", says Sébastien Valade. "I am deeply convinced that in the not so far future, automated monitoring systems using AI and data from different sources like satellite remote sensing and ground-based sensors will help to warn people in a more timely and robust fashion."

Already today, the analysis provided by the MOUNTS monitoring platform allows for a comprehensive understanding of various processes in different climatic and volcanic settings across the globe: from the propagation of magma beneath the surface to the emplacement of volcanic material during the eruption, as well as the morphological changes of affected areas, and the emission of gases into the atmosphere. The researchers successfully tested MOUNTS on a number of recent events like the Krakatau eruption in Indonesia in 2018 or eruptions in Hawaii and Guatemala, to name a few.

The system currently monitors 17 volcanoes worldwide including the Popocatépetl in Mexico and Etna in Italy. The website of the platform is freely accessible, and - thanks to the global coverage and free access to the underlying data - can easily incorporate new data.

Credit: 
GFZ GeoForschungsZentrum Potsdam, Helmholtz Centre

Increases in social media use and television viewing associated with increases in teen depression

A new study by a team of CHU Sainte-Justine and Université de Montréal scientists has revealed that social media use and television viewing are linked to increases in adolescent depressive symptoms.

Changes in adolescents' social media use and television viewing predict increases in symptoms of depression. The study, published July 15 in JAMA Pediatrics, revealed that a higher-than-average frequency of social media use and television viewing over four years predicts more severe symptoms of depression over that same time frame. Over and above a potential common vulnerability linked to both sets of behaviours, the study demonstrated that when teens' social media use and television viewing surpassed their own overall mean level of use in a given year, their depression symptoms also increased in that same year. Thus, the more time adolescents spend on social media and in front of the television, the more severe their symptoms of depression become. Video gaming and other computer use, including internet browsing, were also measured in the study, but were not identified as predictors of depression in adolescence.

The study tested three explanatory hypotheses: Displacement, Upward Social Comparison, and Reinforcing Spirals. The data from teens appeared to conform with the latter two hypotheses: There was no evidence that screen time affected adolescent depression by reducing their involvement in physical activities, but there was evidence that interacting with media outlets that were more conducive to promoting upward social comparisons was particularly associated with reductions in self-esteem, which then explained increases in depressive symptoms. The study also found evidence that social media, and not other screen-based activities, might further promote depressive symptoms in those already experiencing depressive symptoms, through a reinforcing spiral process.

Consistent with previous hypotheses

These results are consistent with previous hypotheses about how depression develops. "Social media and television are forms of media that frequently expose adolescents to images of others operating in more prosperous situations, such as other adolescents with perfect bodies and a more exciting or rich lifestyle. Furthermore, based on reinforcing spirals theory, people seek out and select information congruent with their current state-of-mind. The algorithmic features of television viewing and in particular, social media, create and maintain a feedback loop by suggesting similar content to users based on their previous search and selection behaviour. Thus, the more one's depressive state influences their viewing choices, the more similar content is being suggested and provided, and the more likely one will be continuously exposed to such content, therewith maintaining and enhancing depression," explains the study's lead author, Elroy Boers, post-doctoral researcher at UdeM's Department of Psychiatry.

This study could have important implications for how youth and families choose to regulate digital screen time in order to prevent and reduce symptoms of depression. "Many people attribute increasing rates of depression among young people in North America to the recent introduction of mobile digital devices to our society. The study's findings indicate social media use and television viewing are important predictors of depression in adolescence. While our results are based on an observational research design, the statistical approach that we used to test possible causal effects robustly controlled for any potential common underlying vulnerability to high levels of screen time and depression. Furthermore, the effects could be explained through mediation analyses, which further supports a causal hypothesis. Nevertheless, more research is needed, including research with experimental designs, to confirm that exposure to social media causes elevated rates of depression in young people," said Dr. Patricia Conrod, senior author and Professor of Psychiatry at Université de Montréal, and Tier 1 Canada Research Chair at CHU Sainte-Justine.

Screen time and depression

Dr Conrod's team followed almost 4,000 Canadian teenagers from ages 12 to 16 years who were part of the Co-Venture Trial. Each year of high school, teens were asked to self-report time spent in front of digital screens and specify the amount of time spent engaging in four different types of screen activities (social media, television, video gaming and computer use).

Moreover, the teenagers completed self-reported questionnaires on various depressive symptoms between ages 12 and 16. After data collection, state-of-the-art statistical analyses were performed to assess the between-person and within-person associations between screen time and depression in adolescence. These analyses augment standard analyses by modelling the year-to-year changes in both sets of problems, thus taking into account possible common vulnerability and possible natural developmental changes in each set of behaviours or symptoms.
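The person-mean-centering idea behind such between-person/within-person decompositions can be sketched in a few lines. The numbers below are made up for illustration and are not the study's data:

```python
import numpy as np

# Hypothetical panel: 3 teens x 4 yearly self-reports of daily social media hours
hours = np.array([[2.0, 3.0, 2.5, 3.5],
                  [5.0, 6.0, 5.5, 6.5],
                  [1.0, 1.5, 1.0, 2.5]])

# Between-person component: each teen's own mean across the four years,
# capturing stable differences between adolescents
between = hours.mean(axis=1, keepdims=True)

# Within-person component: each year's deviation from that teen's own mean,
# i.e. "more use than usual for this teen in this year"
within = hours - between

print("person means:", between.ravel())
print("yearly deviations:\n", within)
```

Relating same-year depression scores to the within-person deviations, rather than to raw hours, is what lets an analysis of this kind separate year-to-year change from a stable between-person vulnerability shared by screen time and depression.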

"Our research reveals that increased time spent using some forms of digital media in a given year predicts depressive symptoms within that same year," said Conrod. This is highly encouraging from a prevention perspective, she added. "Early identification of vulnerability to depression gives clinicians and parents a large window of time in which to intervene. Regulating teens' social media and television use might be one way to help young people manage depressed mood or vulnerability to depressive symptoms."

Conrod and her colleagues hope that this study will help guide the design of new intervention strategies for at-risk youth, before the symptoms become clinically significant.

Credit: 
University of Montreal

Can videogames promote emotional intelligence in teenagers?

image: Games for Health Journal breaks new ground as the first journal to address this emerging and increasingly important area of health care.

Image: 
(c) 2019 Mary Ann Liebert, Inc., publishers

New Rochelle, NY, July 15, 2019--A new study has shown that videogames, when used as part of an emotional intelligence training program, can help teenagers evaluate, express, and manage their own emotions immediately after the training. The study design, interpretation of results, and implications of these findings are published in Games for Health Journal, a peer-reviewed publication from Mary Ann Liebert, Inc., publishers. The full-text article is available free on the Games for Health Journal website through August 15, 2019.

The article entitled "Can Videogames Be Used to Promote Emotional Intelligence in Teenagers? Results from EmotivaMente, a School Program" was coauthored by Claudia Carissoli and Daniela Villani, Università Cattolica del Sacro Cuore (Milan, Italy). The researchers developed an emotional intelligence training program that integrated videogames as experience-based learning tools. The experimental group of teenagers participated in eight sessions and their emotional competency was evaluated before beginning the program, at the end of the training, and three months later. The researchers provide recommendations for future research based on the results of this study.

"Games for health have been designed to address an increasing variety of issues. A relatively new health issue is emotional intelligence, which has implications for various health problems, including coping with stress," says Tom Baranowski, PhD, Editor-in-Chief of Games for Health Journal, from USDA/ARS Children's Nutrition Research Center, and Department of Pediatrics, Baylor College of Medicine, Houston, TX. "Carissoli and Villani created a videogame, EmotivaMente, to enhance emotional intelligence among adolescents, perhaps the group that could benefit most. Their preliminary evaluation indicated that playing the game enhanced the students' evaluation and expression of emotions. This is an important first step in designing a game to learn to manage emotions. While the impact was limited, further enhancements to the game may have substantial additional effects. Stay tuned!"

Credit: 
Mary Ann Liebert, Inc./Genetic Engineering News