Culture

Individuals with criminal records may stay in their jobs longer

In sales and customer service positions, employees with criminal records may stay in their jobs longer and be less likely to leave, according to a study published in the IZA Journal of Labor Policy.

Researchers at Northwestern University investigated the possible relationship between having a criminal record and job performance by evaluating data from employees in sales or customer service jobs in call centres in the US. They found employees with a criminal record stayed in their roles on average 19 days longer than those who did not have a criminal record.

Deborah Weiss, the corresponding author of the study, said: "In sales and customer service positions, turnover is a major labor cost. Our study found that employees with criminal records had a longer tenure and were less likely to quit their jobs voluntarily than other workers. This finding suggests that individuals with a criminal record represent an untapped productivity pool."

The authors suggest that employees with a criminal record may stay in their jobs longer because they have fewer job prospects outside of their current role.

Deborah Weiss said: "Job applicants with criminal records are much less likely than others to receive an offer of employment. Six months after release from prison, 50 to 80 percent of the formerly incarcerated remain unemployed. Some of those who are offered employment may stay longer because they have no other options and others may feel a sense of loyalty or gratitude to an employer who has given them a second chance."

The researchers also found a 34% increased chance of misconduct among employees with a criminal record in sales jobs, but not in customer service jobs, suggesting that performance and tenure for these employees may be better in customer service roles than in sales roles. The authors suggest that despite this higher misconduct rate, sales employees with a criminal record may still be a good investment for employers: they estimated that hiring a worker with a criminal record for a sales job increased expected theft-related costs by about $43, while saving the same employer about $746 in turnover costs on that worker.
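The authors' cost comparison reduces to a simple expected-value calculation; as a minimal illustration using only the two per-worker figures quoted above (the function name is ours, not the study's):

```python
# Per-worker figures quoted in the study, in USD.
THEFT_COST_INCREASE = 43  # expected extra theft-related cost of a sales hire
TURNOVER_SAVINGS = 746    # expected savings in turnover costs on that worker

def expected_net_benefit(theft_cost, turnover_savings):
    """Expected net benefit per hire, in USD: savings minus extra costs."""
    return turnover_savings - theft_cost

print(expected_net_benefit(THEFT_COST_INCREASE, TURNOVER_SAVINGS))  # 703
```

On these figures, the expected turnover savings outweigh the expected theft costs by roughly $700 per sales hire.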

Deborah Weiss said: "Finding gainful employment for individuals with a criminal background is an important public priority: without such employment, reoffending is almost inevitable. While our study may not entirely dispel employers' fears that hiring applicants with a criminal record may carry risks, our findings suggest that there are unexploited opportunities to hire applicants with a record in a way that makes sense both on efficiency and on moral grounds."

The researchers used data on 58,977 applicants hired for sales or customer service jobs in call centres in the US, collected by a hiring consultancy from May 2008 to January 2014. The authors evaluated possible associations between having a criminal record or not having a criminal record and job performance, misconduct and time spent in the job.

The authors caution that the study only evaluated data from employees in sales and customer service jobs, which may limit the generalizability of the results to other positions. The observational nature of the study also does not allow for conclusions about cause and effect.

Credit: 
BMC (BioMed Central)

Regrowing dental tissue with stem cells from baby teeth

image: Stem cells extracted from baby teeth were able to regenerate dental pulp (shown, with fluorescent labeling) in young patients who had injured one of their adult teeth.

Image: 
Courtesy of University of Pennsylvania

Sometimes kids trip and fall, and their teeth take the hit. Nearly half of children suffer some injury to a tooth during childhood. When that trauma affects an immature permanent tooth, it can hinder blood supply and root development, resulting in what is essentially a "dead" tooth.

Until now, the standard of care has entailed a procedure called apexification that encourages further root development, but it does not replace the lost tissue from the injury and, even in a best-case scenario, causes root development to proceed abnormally.

New results of a clinical trial, jointly led by Songtao Shi of the University of Pennsylvania and Yan Jin, Kun Xuan, and Bei Li of the Fourth Military Medical University in Xi'an, China, suggest that there is a more promising path for children with these types of injuries: using stem cells extracted from the patient's baby teeth. The work was published in the journal Science Translational Medicine.

"This treatment gives patients sensation back in their teeth. If you give them a warm or cold stimulation, they can feel it; they have living teeth again," says Shi, professor and chair in the Department of Anatomy and Cell Biology in Penn's School of Dental Medicine. "So far we have follow-up data for two, two and a half, even three years and have shown it's a safe and effective therapy."

Shi has been working for a decade to test the possibilities of dental stem cells after discovering them in his daughter's baby tooth. He and colleagues have learned more about how these dental stem cells, officially called human deciduous pulp stem cells (hDPSC), work and how they could be safely employed to regrow dental tissue, known as pulp.

The Phase I trial, conducted in China, which has a research track for clinical trials, enrolled 40 children who had each injured one of their permanent incisors and still had baby teeth. Thirty were assigned to hDPSC treatment and 10 to the control treatment, apexification.

Those who received hDPSC treatment had tissue extracted from a healthy baby tooth. The stem cells from this pulp were allowed to reproduce in laboratory culture, and the resulting cells were implanted into the injured tooth.

Upon follow-up, the researchers found that patients who received hDPSCs had more signs than the control group of healthy root development and thicker dentin, the hard part of a tooth beneath the enamel. Blood flow increased as well.

At the time the patients were initially seen, all had little sensation in the tissue of their injured teeth. A year following the procedure, only those who received hDPSCs had regained some sensation. Examining a variety of immune-system components, the team found no evidence of safety concerns.

As further support of the treatment's efficacy, the researchers had the opportunity to directly examine the tissue of a treated tooth when the patient reinjured it and had to have it extracted. They found that the implanted stem cells regenerated different components of dental pulp, including the cells that produce dentin, connective tissue, and blood vessels.

"For me the results are very exciting," Shi says. "To see something we discovered take a step forward to potentially become a routine therapy in the clinic is gratifying."

It is, however, just a first step. While using a patient's own stem cells reduces the chances of immune rejection, it's not possible in adult patients who have lost all of their baby teeth. Shi and colleagues are beginning to test the use of allogenic stem cells, or cells donated from another person, to regenerate dental tissue in adults. They are also hoping to secure FDA approval to conduct clinical trials using hDPSCs in the United States.

Eventually, they see even broader applications of hDPSCs for treating systemic disease, such as lupus, which Shi has worked on before.

"We're really eager to see what we can do in the dental field," Shi says, "and then building on that to open up channels for systemic disease therapy."

Credit: 
University of Pennsylvania

The Lancet: Dairy consumption linked to lower rates of cardiovascular disease and mortality

Dairy consumption of around three servings per day is associated with lower rates of cardiovascular disease and mortality, compared to lower levels of consumption, according to a global observational study of over 130,000 people in 21 countries, published in The Lancet.

In addition, the study found that people who consumed three servings of whole fat dairy per day had lower rates of mortality and cardiovascular disease compared to those who consumed less than 0.5 serving of whole fat dairy per day.

The findings are consistent with previous meta-analyses of observational studies and randomised trials, but stand in contrast to current dietary guidelines which recommend consuming 2-4 servings of fat-free or low-fat dairy per day, and minimising consumption of whole-fat dairy products for cardiovascular disease prevention.

Cardiovascular disease is the leading cause of mortality worldwide. The authors conclude that the consumption of dairy should not be discouraged and should even perhaps be encouraged in low-income and middle-income countries where dairy consumption is low.

"Our findings support that consumption of dairy products might be beneficial for mortality and cardiovascular disease, especially in low-income and middle-income countries where dairy consumption is much lower than in North America or Europe," says lead author Dr Mahshid Dehghan, McMaster University, Canada.

The Prospective Urban Rural Epidemiological (PURE) study included data from 136,384 individuals aged 35-70 years in 21 countries [1]. Dietary intakes were recorded at the start of the study using country-specific validated food questionnaires. Participants were followed up for an average of 9.1 years. During this time, there were 6,796 deaths and 5,855 major cardiovascular events.

One standard serving of dairy was equivalent to a glass of milk at 244g, a cup of yoghurt at 244g, one slice of cheese at 15g, or a teaspoon of butter at 5g.
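Because each product has its own gram weight per serving, gram intakes must be converted to servings product by product. A minimal sketch using only the serving sizes quoted above (the `servings` helper is illustrative, not from the study):

```python
# Gram weights per standard serving, as quoted in the study.
GRAMS_PER_SERVING = {
    "milk": 244,     # one glass
    "yoghurt": 244,  # one cup
    "cheese": 15,    # one slice
    "butter": 5,     # one teaspoon
}

def servings(product, grams):
    """Convert a gram intake of a single dairy product into standard servings."""
    return grams / GRAMS_PER_SERVING[product]

print(servings("cheese", 45))  # 3.0 -- three slices of cheese count as 3 servings
print(servings("milk", 488))   # 2.0 -- two glasses of milk
```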

Dairy consumption was highest in North America and Europe (368g/day or above 4 servings of total dairy per day) and lowest in south Asia, China, Africa and southeast Asia (147, 102, 91 and 37g/day respectively - less than 1 serving of total dairy per day).

Participants were grouped into four categories: no dairy (28,674 people), less than 1 serving per day (55,651), 1-2 servings per day (24,423), and over 2 servings per day (27,636).

Compared to the no intake group, the high intake group (mean intake of 3.2 servings per day) had lower rates of total mortality (3.4% vs 5.6%), non-cardiovascular mortality (2.5% vs 4%), cardiovascular mortality (0.9% vs 1.6%), major cardiovascular disease (3.5% vs 4.9%), and stroke (1.2% vs 2.9%). There was no difference in the rates of myocardial infarction between the two groups (1.9% vs 1.6%).

Among those who consumed only whole-fat dairy, higher intake (mean intake of 2.9 servings of whole fat dairy per day) was associated with lower rates of total mortality (3.3% vs 4.4%) and major cardiovascular disease (3.7% vs 5.0%), compared to those who consumed less than 0.5 servings whole-fat dairy per day.

Higher intake of milk and yoghurt (above 1 serving per day) was associated with lower rates of the composite outcome, which combines total mortality and cardiovascular disease (milk: 6.2% vs 8.7%; yoghurt: 6.5% vs 8.4%), compared to no consumption. The differences in the composite outcome for butter and cheese were not significant as intake was lower than for milk and yoghurt.

The authors say that more research is now needed into why dairy might be associated with lower rates of cardiovascular disease. The recommendation to consume low-fat dairy is based on the presumed harms of saturated fats on a single cardiovascular risk marker (LDL cholesterol). However, evidence suggests that some saturated fats may be beneficial to cardiovascular health, and dairy products also contain other potentially beneficial compounds, including specific amino acids, unsaturated fats, vitamins K1 and K2, calcium, magnesium, potassium, and potentially probiotics. Assessments of dairy's effect on cardiovascular health should therefore consider the net effect of all these elements on health outcomes.

Limitations include that diets were self-reported. While multiple weighted food records may be more accurate, they require extensive training, motivation, awareness and literacy, which limits their practicality for such a large, long-term study. The authors also note that diet was measured at baseline and that eating habits may have changed over time. However, they add that the association between milk intake at three years of follow-up and cardiovascular disease was similar to the analyses using baseline information, suggesting that repeated measurement would be unlikely to alter the findings.

Writing in a linked Comment, Jimmy Chun Yu Louie (University of Hong Kong), and Anna M Rangan (University of Sydney) conclude that dairy dietary guidelines do not need to change just yet. They write: "The results from the PURE study seem to suggest that dairy intake, especially whole-fat dairy, might be beneficial for preventing deaths and major cardiovascular diseases. However, as the authors themselves concluded, the results only suggest the "consumption of dairy products should not be discouraged and perhaps even be encouraged in low-income and middle-income countries." It is not the ultimate seal of approval for recommending whole-fat dairy over its low-fat or skimmed counterparts. Readers should be cautious, and treat this study only as yet another piece of the evidence (albeit a large one) in the literature."

Credit: 
The Lancet

UK heart failure patients twice as likely to die as their Japanese counterparts

Heart failure is common, and becoming more so as populations age. It is the primary diagnosis in more than 80,000 admissions to hospital in the UK; more than 200,000 in Japan; and more than 1 million in the US.

It's thought that cultural differences may have a role in differences in death rates for heart failure around the globe. To look at this in more detail, the researchers compared the death rates of 894 heart failure patients admitted to hospital in the UK with 3781 admitted to hospital in Japan.

To compare patients with a similar severity of heart failure, the researchers looked at the risk factors associated with a heightened risk of death in patients with the condition in previously published studies.

The factors most strongly associated with the risk of death were systolic blood pressure (the amount of pressure in the arteries when the heart muscle contracts) and blood levels of sodium, urea (a measure of protein turnover and kidney function) and creatinine (a measure of kidney function).

They then compared death rates in hospital, and 1, 3, and 6 months after admission.

Although both UK and Japanese patients were of similar age, UK patients had more severe heart failure, as judged by these key risk factors. They were also more likely to have ischaemic heart disease (narrowed arteries) and COPD (chronic lung disease).

UK patients were much more likely to die at all the time points measured than were Japanese patients. Much of this difference could be attributed to British patients being sicker at the time of admission. The threshold for hospital admission in the UK seems to be higher than it is in Japan, note the researchers.

But even after accounting for observed differences in risk, British patients were more than twice as likely to have died at 6 months than patients in Japan.

This is an observational study, and as such, can't establish exactly why British patients fared so much worse than patients in Japan. Differences in the quality of care after discharge, attitudes to medical advice and taking medicines, lifestyle, diet or genes might all have influenced outcomes, suggest the researchers.

"Explaining the differences in outcome among countries, cultures and health services might provide insights that could improve care and outcome and inform healthcare policy decisions," they conclude.

Credit: 
BMJ Group

New insight on rotavirus mechanics could lead to improved treatments

image: This is an artistic rendition of the rotavirus particle dissection process, performed with atomic force microscopy.

Image: 
Image created by Scixel (http://scixel.es/), under the instructions of D. Luque and P. J. de Pablo.

Researchers have provided new insight on the mechanics of a virus that causes severe diarrhea and sickness in young children, according to a report published in eLife.

The study, from the Autonomous University of Madrid, Carlos III Health Institute and National Center for Biotechnology, Spain, could open up new avenues for developing effective treatments for rotavirus, which commonly infects children up to five years old. It is the first paper to detail the interplay between the function and mechanical properties of a 'multilayered' virus.

Virus particles enclose their genetic material in a protein shell designed to protect, shuttle and release the genome at the host cell. The structure of virus particles therefore needs to be strong enough to protect the viral genome in environments outside the cell, and to withstand attacks from the host immune system, to ensure successful infection.

Many double-stranded RNA viruses, such as rotavirus, isolate their genome within a core shell that incorporates its own molecular machinery to allow the genome to replicate and spread. Some viruses take this a step further and build extra concentric protein layers that function in other ways, such as to help bind and penetrate their target cells.

"The complete particle of rotavirus is formed by three independent protein shells. This particle and the subviral particles containing one or two protein layers play distinct roles during infection," explains lead author Manuel Jiménez-Zaragoza, Research Assistant in the Department of Physics of Condensed Matter at the Autonomous University of Madrid. "We wanted to see how the interactions between the layers that define these different particles work together during the virus replication cycle."

Although previous studies have revealed how to purify two-layer protein particles, the authors of the current work have developed a novel way to purify single-layer particles, allowing them to be studied individually. After purifying these subviral particles, the team used a scanning probe system called atomic force microscopy, which involves using a small, sharp stylus to deform the virus particles. This allowed them to study the strength and stability of individual triple, double and single-layered particles.

They discovered a strong interaction between the external and middle layers, which they say is critical for the protection of the complete virus particle. Meanwhile, the interactions between the middle and inner layers help the virus to copy its genetic information into messenger RNA inside host cells, a process known as transcription.

"Our findings reveal how the biophysical properties of the three protein shells are fine-tuned to enable rotavirus to be carried among host cells," says senior author Pedro de Pablo, Associate Professor at the Autonomous University of Madrid. "We believe this could prove valuable in offering new avenues for the development of novel antiviral strategies."

Credit: 
eLife

Stress linked to more advanced disease in some leukemia patients

Patients with chronic lymphocytic leukemia (CLL) who feel more stress also have more cancer cells in their blood and elevated levels of three other markers of more advanced disease.

A new study of 96 patients is the first to link stress with biological disease markers in patients with CLL.

"All four variables we measured are related to prognosis in CLL patients, so they have a lot of relevance," said Barbara L. Andersen, lead author of the study and professor of psychology at The Ohio State University.

"It's more evidence of the importance of managing stress in cancer patients."

The study appeared Aug. 1 in the journal Cancer.

CLL is the most common leukemia in adults, and accounts for about one-third of adult leukemia in the United States.

The study involved patients who were entering a trial at Ohio State's Arthur G. James Cancer Hospital for ibrutinib, now approved by the U.S. Food and Drug Administration. At the time of the study, the drug was in early trials to treat the disease. Data collection was done before patients received the first dose.

All patients completed a survey that measured their cancer-related stress. They were asked questions like how often they had intrusive thoughts about their cancer, how often they tried to avoid thinking about it and how often they felt jumpy and easily startled.

The researchers took blood samples and calculated absolute lymphocyte counts (ALC), which is a measure of healthy and malignant cells circulating in the blood. This measure is often elevated in patients with CLL and is used as a marker of disease severity. They also measured levels of eight different cytokines, which are proteins involved in the body's immune response. All of these cytokines can promote unhealthy levels of inflammation in patients with cancer.

Results showed that more stress in the patients was associated with a higher number of circulating cancerous cells and higher levels of three cytokines: tumor necrosis factor alpha, interleukin 16 and chemokine ligand 3 (CCL3).

CCL3 is a particular kind of cytokine called a chemokine. It helps facilitate the development of CLL cells in places like the spleen and lymph nodes, where leukemia cells are produced.

"Chemokines have not been used in studies like this before and it is a novel way of checking for the link between stress and disease," Andersen said.

Stress was linked to disease severity even after the researchers took into account several other important factors that also play a role in disease progression, including gender, the number of prior treatments and the presence of a genetic marker (del17p) that is associated with harder-to-treat CLL.

"The fact that stress shows an effect on CLL even after we controlled for other factors suggests it may be relevant to the course of CLL," Andersen said.

Why did the other five cytokines the researchers studied not show an effect in this study?

Andersen noted that this was the first study of its kind done with leukemia patients. Many of the other cytokines have been found to have effects in solid tumors and might not work the same way in blood cancers.

The researchers are continuing to follow these patients and will examine the relationship between stress and these same responses throughout treatment, Andersen said.

Credit: 
Ohio State University

Coral bleaching increases disease risk in threatened species

image: Staghorn corals grown in Mote Marine Laboratory's underwater nursery in the Florida Keys, US.

Image: 
Conor Goulding/Mote Marine Laboratory

Bleaching events caused by rising water temperatures could increase mortality among a coral species already threatened by disease, says new research by Mote Marine Laboratory and Penn State, US, published in eLife.

The study on the species Acropora cervicornis, known as the staghorn coral, emphasizes the need for maintaining genetic diversity while at the same time increasing resilience within the species, as part of restoration efforts to help prevent further loss in the Florida region.

Once prevalent throughout the Florida Reef Tract, the staghorn coral has suffered substantial declines over the last several decades due to increasing ocean temperatures and disease outbreaks, with no evidence of natural recovery. The Florida Reef Tract is currently estimated to be worth over $6 billion to the state economy, providing over 70,000 jobs and attracting millions of tourists into Florida each year - but many of these ecosystem services will be lost if the living coral is not restored.

"With imminent threats to the staghorn coral, it is now the focus of restoration efforts throughout much of the Florida region, thanks to the existence of some coral genotypes that are more resilient to threats than others," says lead author Dr. Erinn Muller, Program Manager and Science Director of the Elizabeth Moore International Center for Coral Reef Research and Restoration at Mote Marine Laboratory, Florida. "However, there could be tradeoffs associated with these resilient traits, such as heat-tolerant corals being highly susceptible to disease infection.

"Previous studies showed there are certain staghorn genotypes resistant to white band disease. However, it is still unclear how high water temperatures caused by climate change influence disease resistance and what role, if any, the algae that live and interact with the corals - their 'algal symbionts' - play in stress resistance. We therefore wanted to see what percentage of staghorn corals within the lower Florida Keys are disease resistant, and how this resistance changes during a warm-water event that leads to coral bleaching."

To do this, Muller and her team exposed the same staghorn coral genotypes to white band-diseased tissue before and during a coral bleaching event. They found that, in the absence of bleaching, around 25% of the population tested was resistant to the disease. However, when the corals were exposed to it during the bleaching event, their mortality rate doubled.

Interestingly, the team, which included researchers from Mote Marine Laboratory and Penn State, also found that two coral genotypes were resistant to the disease even while bleached. The level of bleaching within these genotypes was not related to disease susceptibility or to their algal symbiont strain, suggesting there are no direct tradeoffs between heat tolerance and disease resistance.

"While we are working on reducing carbon dioxide emissions that cause climate change and ocean warming as fast as possible, our best chance at enhancing adaptation of corals and their symbionts to their warming environments is to promote genetic diversity of coral and symbiont populations," said Iliana Baums, Associate Professor of Biology at Penn State, an expert in coral molecular ecology.

Baums developed the genetic methods used in this research to fingerprint the strains of symbionts associated with each coral colony. Such high-resolution methods have not yet been widely applied in coral experiments. They allowed the team to disentangle the responses of the coral host and the symbiont genotype to the multiple stressors tested.

"Together, our findings show that the staghorn coral's susceptibility to temperature stress creates an increased risk in death from disease, and that only two of the genotypes tested may maintain or gain disease resistance under high temperatures," said Muller. "As recurring warming events may cause continued loss of these resistant genotypes, it is crucial that restoration efforts focus on maintaining high genetic diversity to help keep these corals alive in a warming climate."

Credit: 
eLife

Beyond deep fakes: Transforming video content into another video's style, automatically

image: Researchers at Carnegie Mellon University have devised a way to automatically transform the content of one video into the style of another, making it possible to transfer the facial expressions of one person to the video of another person, or even a cartoon character.

Image: 
Carnegie Mellon University

PITTSBURGH-- Researchers at Carnegie Mellon University have devised a way to automatically transform the content of one video into the style of another, making it possible to transfer the facial expressions of comedian John Oliver to those of a cartoon character, or to make a daffodil bloom in much the same way a hibiscus would.

Because the data-driven method does not require human intervention, it can rapidly transform large amounts of video, making it a boon to movie production. It can also be used to convert black-and-white films to color and to create content for virtual reality experiences.

"I think there are a lot of stories to be told," said Aayush Bansal, a Ph.D. student in CMU's Robotics Institute. Film production was his primary motivation in helping devise the method, he explained, enabling movies to be produced more quickly and cheaply. "It's a tool for the artist that gives them an initial model that they can then improve," he added.

The technology also has the potential to be used for so-called "deep fakes," videos in which a person's image is inserted without permission, making it appear that the person has done or said things that are out of character, Bansal acknowledged.

"It was an eye opener to all of us in the field that such fakes would be created and have such an impact," he said. "Finding ways to detect them will be important moving forward."

Bansal will present the method today at ECCV 2018, the European Conference on Computer Vision, in Munich. His co-authors include Deva Ramanan, CMU associate professor of robotics.

Transferring content from one video to the style of another relies on artificial intelligence. In particular, a class of algorithms called generative adversarial networks (GANs) have made it easier for computers to understand how to apply the style of one image to another, particularly when they have not been carefully matched.

In a GAN, two models are created: a discriminator that learns to detect what is consistent with the style of one image or video, and a generator that learns how to create images or videos that match a certain style. When the two work competitively -- the generator trying to trick the discriminator and the discriminator scoring the effectiveness of the generator -- the system eventually learns how content can be transformed into a certain style.

A variant, called cycle-GAN, completes the loop, much like translating English speech into Spanish, then translating the Spanish back into English, and evaluating whether the twice-translated speech still makes sense. Using cycle-GAN to analyze the spatial characteristics of images has proven effective in transforming one image into the style of another.
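The round-trip check at the heart of cycle-GAN can be sketched numerically. In the toy example below, the "generators" G and F are hypothetical stand-ins (simple array transforms, not trained neural networks), used only to show how the cycle-consistency error that the method penalizes is scored:

```python
import numpy as np

# Hypothetical stand-ins for trained generators G: X -> Y and F: Y -> X.
# In a real cycle-GAN these would be neural networks trained adversarially.
def G(x):
    """'Translate' content from domain X into domain Y."""
    return x * 2.0 + 1.0

def F(y):
    """'Translate' content from domain Y back into domain X."""
    return (y - 1.0) / 2.0

def cycle_consistency_loss(x, G, F):
    """Mean L1 distance between x and its round-trip reconstruction F(G(x))."""
    return np.mean(np.abs(F(G(x)) - x))

x = np.linspace(0.0, 1.0, 8)  # a stand-in "image" from domain X
print(round(cycle_consistency_loss(x, G, F), 10))  # 0.0 -- F inverts G, lossless cycle
```

When F fails to undo G, the loss grows, which is the training signal that pushes the twice-translated content to stay faithful to the original.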

That spatial method still leaves something to be desired for video, with unwanted artifacts and imperfections cropping up in the full cycle of translations. To mitigate the problem, the researchers developed a technique, called Recycle-GAN, that incorporates not only spatial, but temporal information. This additional information, accounting for changes over time, further constrains the process and produces better results.

The researchers showed that Recycle-GAN can be used to transform video of Oliver into what appears to be fellow comedian Stephen Colbert and back into Oliver. Video of Oliver's face can also be transformed into a cartoon character. Recycle-GAN copies not only facial expressions, but also the movements and cadence of the performance.

The effects aren't limited to faces, or even bodies. The researchers demonstrated that video of a blooming flower can be used to manipulate the image of other types of flowers. Or clouds that are crossing the sky rapidly on a windy day can be slowed to give the appearance of calmer weather.

Such effects might be useful in developing self-driving cars that can navigate at night or in bad weather, Bansal said. Obtaining video of night scenes or stormy weather in which objects can be identified and labeled can be difficult, he explained. Recycle-GAN, on the other hand, can transform easily obtained and labeled daytime scenes into nighttime or stormy scenes, providing images that can be used to train cars to operate in those conditions.

Credit: 
Carnegie Mellon University

NASA's SDO spots 2 lunar transits in space

image: SDO captured these images in a wavelength of extreme ultraviolet light that shows solar material heated to more than 10 million degrees Fahrenheit. Extreme ultraviolet light is typically invisible to the human eye, but satellites like SDO allow us to observe the swirling movement in the Sun's atmosphere visible only in these wavelengths. View animation: https://www.nasa.gov/sites/default/files/thumbnails/image/2_2018-09-10_1...

Image: 
NASA/Goddard/SDO

On Sept. 9, 2018, NASA's Solar Dynamics Observatory, SDO, saw two lunar transits as the Moon passed in front of the Sun. A transit happens when a celestial body passes between a larger body and an observer. The first lunar transit lasted one hour, from 4:30 p.m. to 5:30 p.m. EDT, and obscured 92 percent of the Sun at the peak of its journey. The second transit happened several hours later, at 9:52 p.m., and lasted a total of 49 minutes, ending at 10:41 p.m. EDT. This transit obscured only 34 percent of the Sun at its peak.
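The quoted durations follow from simple time arithmetic on the start and end times:

```python
from datetime import datetime

fmt = "%H:%M"
# First transit: 4:30 p.m. to 5:30 p.m. EDT
first = datetime.strptime("17:30", fmt) - datetime.strptime("16:30", fmt)
# Second transit: 9:52 p.m. to 10:41 p.m. EDT
second = datetime.strptime("22:41", fmt) - datetime.strptime("21:52", fmt)

print(first.total_seconds() / 60)   # 60.0 minutes
print(second.total_seconds() / 60)  # 49.0 minutes
```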

Watch the movie here to see how -- from SDO's perspective -- the Moon appears to go in one direction and then switch direction to cross the Sun again. The Moon does not, of course, actually change direction, but it appears to do so from SDO's perspective because the spacecraft's orbit essentially catches up with and passes the Moon during the first transit.

Because the Moon does not have an atmosphere, when a lunar transit occurs no light from the Sun gets distorted, allowing for a distinct view of the Moon's surface. Although it looks smooth from far away, the surface of the Moon is rugged, sprinkled with craters, valleys and mountains.


Credit: 
NASA/Goddard Space Flight Center

New innovation improves the diagnosis of dizziness

image: The new vibrating device improves the diagnosis of dizziness.

Image: 
Johan Bodell/Chalmers University of Technology

Half of over-65s suffer from dizziness and problems with balance. But some tests to identify the causes of such problems are painful and can risk hearing damage. Now, researchers from Chalmers University of Technology, Sweden, have developed a new testing device using bone conduction technology that offers significant advantages over the current tests.

Hearing and balance have something in common. For patients with dizziness, this relationship is used to diagnose issues with balance. Commonly, a 'VEMP' test (Vestibular Evoked Myogenic Potentials) needs to be performed. A VEMP test uses loud sounds to evoke a muscle reflex contraction in the neck and eye muscles, triggered by the vestibular system - the system responsible for our balance. The Chalmers researchers have now used bone conducted sounds to achieve better results.

"We have developed a new type of vibrating device that is placed behind the ear of the patient during the test," says Bo Håkansson, a professor in the research group 'Biomedical signals and systems' at Chalmers. The vibrating device is small and compact in size, and optimised to provide an adequate sound level for triggering the reflex at frequencies as low as 250 Hz. Previously, no vibrating device has been available that was directly adapted for this type of test of the balance system.

In bone conduction transmission, sound waves are transformed into vibrations through the skull, stimulating the cochlea within the ear, in the same way as when sound waves normally go through the ear canal, the eardrum and the middle ear. Bo Håkansson has over 40 years of experience in this field and has previously developed hearing aids using this technology.

Half of over-65s suffer from dizziness, but the causes can be difficult to diagnose for several reasons. In 50% of those cases, dizziness is due to problems in the vestibular system. But today's VEMP methods have major shortcomings, and can cause hearing loss and discomfort for patients.

For example, the VEMP test uses very high sound levels, and may in fact cause permanent hearing damage itself. And, if the patient already suffers from certain types of hearing loss, it may be impossible to draw any conclusions from the test. The Chalmers researchers' new method offers significant advantages.

"Thanks to this bone conduction technology, the sound levels which patients are exposed to can be minimised. The previous test was like a machine gun going off next to the ear - with this method it will be much more comfortable. The new vibrating device provides a maximum sound level of 75 decibels. The test can be performed at 40 decibels lower than today's method using air conducted sounds through headphones. This eliminates any risk that the test itself could cause hearing damage," says postdoctoral researcher Karl-Johan Fredén Jansson, who made all the measurements in the project.
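Because the decibel scale is logarithmic, the 40-decibel reduction quoted above translates into a very large drop in sound intensity. A quick sketch, assuming the standard definition of the decibel as 10·log10 of an intensity ratio:

```python
def intensity_ratio(db_difference):
    # A difference of D dB corresponds to an intensity ratio of 10**(D/10).
    return 10 ** (db_difference / 10)

# The new device peaks at 75 dB, 40 dB below the air-conducted method,
# implying the conventional test runs at about 115 dB.
old_level_db = 75 + 40
print(old_level_db)             # 115
print(intensity_ratio(40))      # 10000.0 -- ten thousand times less intensity
```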

The benefits also include safer testing for children, and the ability to diagnose the origin of dizziness in patients with impaired hearing due to chronic ear infections or congenital malformations of the ear canal and middle ear.

The vibrating device is compatible with standardised equipment for balance diagnostics in healthcare, making it easy to start using. The cost of the new technology is also estimated to be lower than the corresponding equipment used today.

A pilot study has been conducted and recently published. The next step is a larger patient study, under recently granted ethical approval, in collaboration with Sahlgrenska University Hospital in Gothenburg, which will also include 30 participants with normal hearing.

Credit: 
Chalmers University of Technology

Change your diet to save both water and your health

image: The potential impact on water resources of shifting to healthy vegetarian diets, visualised for 35 000 municipalities in France. The map has been adjusted to reflect population size of each geographical entity.

Image: 
European Union, 2018

Shifting to a healthy diet is not only good for us, but it also saves a lot of precious fresh water, according to a new study by the JRC published in Nature Sustainability.

Compared to existing diets, the water required to produce our food could be reduced by 11% to 35% for healthy diets containing meat, by 33% to 55% for healthy pescetarian diets, and by 35% to 55% for healthy vegetarian diets.
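As a rough illustration of what those percentages mean in absolute terms, the ranges can be applied to a hypothetical baseline dietary water footprint. The 3,000 litres per person per day baseline below is an assumption chosen for the example, not a figure from the study.

```python
# Assumed baseline dietary water footprint, litres per person per day.
baseline_l_per_day = 3000

# Reduction ranges reported in the study (low, high fractions).
reductions = {
    "healthy diet with meat": (0.11, 0.35),
    "healthy pescetarian diet": (0.33, 0.55),
    "healthy vegetarian diet": (0.35, 0.55),
}

for diet, (low, high) in reductions.items():
    print(f"{diet}: {baseline_l_per_day * (1 - high):.0f}"
          f"-{baseline_l_per_day * (1 - low):.0f} litres/day")
```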

Researchers compared these three diet patterns, defined by respective national dietary guidelines, to the current actual food consumption, using available data from more than 43 thousand areas in France, the UK and Germany.

They found that eating more healthily could substantially reduce the water footprint of people's diets, consistent across all the geographical entities analysed in the study.

The study is the most detailed nationwide food consumption-related water footprint ever made, taking into account socio-economic factors of food consumption, for existing and recommended diets.

Influences on the food we eat

The scientists also show how individual food consumption behaviour - and its related water footprint - depends strongly on socio-economic factors like age, gender and education level.

They found interesting correlations between such factors and both the water footprint of specific foods and their resulting impact on overall water footprints.

For example, the study shows how in France, the water footprint of milk consumption decreases with age across the municipalities analysed.

Across London, they show a strong correlation between the water footprint of wine consumption and the percentage of the population of each area with a high education level.

Background

The water footprint is defined as the total volume of freshwater that is used to produce goods consumed, food in this particular case.

The scientists used national dietary surveys to assess differences in food product group consumption between regions and socio-economic factors within regions.

The diet scenarios analysed in the study take into account total daily energy and protein requirements as well as maximum daily fat amounts.

They are based upon national dietary guidelines, in which for every food product group specific recommendations are given according to age and gender.

By downscaling national water footprints to the lowest possible administrative boundaries within a country, the scientists provide a useful tool for policy makers at various levels.

The methodology could also be applied to other footprint assessments - like the carbon, land or energy footprints related to food consumption.

Animal products - and especially meat - have a high water footprint.

The average European diet is characterised by overconsumption in general, particularly of animal products.

A healthy diet would contain less sugar, crop oils, meat and animal fats, and more vegetables and fruit.

Due to the numerous negative impacts of an intensive livestock production system on the planet's resources and ecosystems, as well as the growing demands of non-western countries for animal products, moving to a more resource-efficient (and healthier) vegetable-rich diet in the EU is a necessity.

Credit: 
European Commission Joint Research Centre

Immune cells destroy healthy brain connections, diminish cognitive function in obese mice

image: Obesity leads to cognitive impairment by activating microglial cells, which consume otherwise functional synapses in the hippocampus, according to a study of male mice published in JNeurosci. The research suggests that microglia may be a potential therapeutic target for one of the lesser known effects of this global health epidemic on the brain.

Image: 
Cope et al., <i>JNeurosci</i> (2018)


Nearly two billion adults worldwide are overweight, more than 600 million of whom are obese. In addition to increasing risk of conditions such as diabetes and heart disease, obesity is also a known risk factor for cognitive disorders including Alzheimer's disease. The cellular mechanisms that contribute to cognitive decline in obesity, however, are not well understood.

Elise Cope and colleagues replicated previous research by demonstrating that diet-induced obesity in mice impairs performance on cognitive tasks dependent on the hippocampus, causes loss of dendritic spines -- the neuronal protrusions that receive signals from other cells -- and activates microglia. Using genetic and pharmacological approaches to block microglial activity, the researchers established that microglia are causally linked to obesity-induced dendritic spine loss and cognitive decline. The results suggest obesity may drive microglia into a synapse-eating frenzy that contributes to the cognitive deficits observed in this condition.

Credit: 
Society for Neuroscience

The universality of shame

Shame on you. These three simple words can have a devastating effect on an individual's psyche.

But why is that? How is the feeling of shame generated, and what is its purpose? Some theorists argue that feeling shame is a pathology, a condition to be cured. Others dismiss it as a useless, ugly emotion.

A research team at the University of Montreal and UC Santa Barbara's Center for Evolutionary Psychology (CEP), however, suggests something altogether different. Shame, they argue, was built into human nature by evolution because it served an important function for our foraging ancestors.

Living in small, highly interdependent bands, the researchers explain, our ancestors faced frequent life-threatening reversals, and they counted on their fellow band members to value them enough during bad times to pull them through. So being devalued by others -- deemed unworthy of help -- was literally a threat to their survival. Therefore, when considering how to act, it became critical to weigh the direct payoff of a potential action (e.g., how much will I benefit by stealing this food?) against its social costs (e.g., how much will others devalue me if I steal the food -- and how likely is it that they will find out?).

The researchers hypothesized that the intensity of anticipated shame people feel is an internally generated prediction of just how much others will devalue them if they take a given action. Moreover, if this feature was part of human nature, it should be observed everywhere -- in every culture.

To test for universality, they selected a linguistically, ethnically, economically and ecologically diverse set of cultures scattered around the world. In these 15 traditional, small-scale societies, the researchers found that the intensity of shame people feel when they imagine various actions (stealing, stinginess, laziness, etc.) accurately predicts the degree to which those actions would lead others in their social world to devalue them. Their findings appear in the Proceedings of the National Academy of Sciences.

The Function of Feelings

"In a world without soup kitchens, police, hospitals or insurance, our ancestors needed to consider how much future help they would lose if they took various actions that others disapprove of but that would be rewarding in other ways," said lead author Daniel Sznycer, an assistant professor of psychology at the University of Montreal. "The feeling of shame is an internal signal that pulls us away from acts that would jeopardize how much other people value our welfare."

Noted Leda Cosmides, a professor of psychology at UC Santa Barbara, co-director of the CEP and a co-author of the paper, "For this to work well, people can't just stumble about, discovering after the fact what brings devaluation. That's too late. In making choices among alternative actions, our motivational system needs to implicitly estimate in advance the amount of disapproval each alternative action would trigger in the minds of others."

A person who did only what others wanted would be selected against, the authors point out, because they would be completely open to exploitation. On the other hand, a purely selfish individual would be shunned rapidly as unfit to live with in this highly interdependent world -- another dead end.

"This leads to a precise quantitative prediction," said John Tooby, a professor of anthropology at UC Santa Barbara, CEP co-director and a coauthor of the paper. "Lots of research has shown that humans can anticipate personal rewards and costs accurately, like lost time or food. Here we predicted that the specific intensity of the shame a person would anticipate feeling for taking an action would track how much others in their local world would negatively evaluate the person if they took that specific act.

"The theory we're evaluating," he continued, "is that the intensity of shame you feel when you consider whether to take a potential action is not just a feeling and a motivator; it also carries vital information that seduces you into making choices that balance not only the personal costs and benefits of an action but also its social costs and benefits. Shame takes the hypothetical future disapproval of others, and fashions it into a precisely calibrated personal torment that looms the closer the act gets to commission or discovery."

A Universal Human Quality

According to the authors, shame -- like pain -- evolved as a defense. "The function of pain is to prevent us from damaging our own tissue," said Sznycer. "The function of shame is to prevent us from damaging our social relationships, or to motivate us to repair them if we do."

As a neural system, shame inclines you to factor in others' regard alongside private benefits so the act associated with the highest total payoff is selected, the authors argue. A key part of the argument is that this neurally based motivational system is a part of our species' biology. "If that is true, we should be able to find this same shame-devaluation relationship in diverse cultures and ecologies all around the world, including in face-to-face societies whose small scale echoes the more intimate social worlds in which we think shame evolved," Sznycer noted.

To test this hypothesis, the team collected data from 15 traditional small-scale societies in four continents. The people in these societies speak very different languages (e.g., Shuar, Amazigh, Icé-tód), have diverse religions (e.g., Hinduism, Shamanism), and make a living in different ways (e.g., hunting, fishing, nomadic pastoralism). If shame is part of universal, evolved human nature, the research should find that the emotion closely tracks the devaluation of others, for each specific act, in each community; but if shame is more akin to a cultural invention like agriculture or the alphabet, present in some places but not others, they should find wide variation from place to place in this relationship. Indeed, anthropologists have long proposed that some cultures are guilt-oriented, some are fear-oriented, and some are shame-honor.

Yet, the authors found the predicted relationships everywhere they tested. "We observed an extraordinarily close match between the community's negative evaluation of people who display each of the acts or traits they were asked about and the intensities of shame individuals anticipate feeling if they took those acts or displayed those traits," Sznycer said. "Feelings of shame really move in lockstep with the values held by those around you, as the theory predicts."
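The kind of relationship being described -- anticipated shame tracking community devaluation across a set of acts -- can be illustrated with a toy correlation. All the ratings below are invented for this sketch; the study used real survey responses from the 15 societies.

```python
from math import sqrt

def pearson(xs, ys):
    # Plain Pearson correlation coefficient, no third-party libraries.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical ratings on a 1-5 scale for five acts
# (e.g., stealing, stinginess, laziness, cowardice, infidelity).
devaluation = [4.8, 3.1, 2.9, 3.6, 4.5]  # how much the community devalues each act
shame = [4.6, 3.3, 2.7, 3.4, 4.7]        # anticipated shame for the same acts

r = pearson(devaluation, shame)
print(round(r, 2))   # strongly positive: shame tracks devaluation
```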

Further studies, he added, have demonstrated that it is specifically shame -- as opposed to other negative emotions -- that tracks others' devaluation. "Moral wrongdoing is not necessary," said Sznycer. "In other research we showed that individuals feel shame when others view their actions negatively, even when they know they did nothing wrong."

Notably, anticipated shame mirrored not only the disapproval of fellow community members, but also the disapproval of (foreign) participants in each of the other societies. For example, the shame expressed by the Ik forager-horticulturalists of Ikland, Uganda, mirrored not only the devaluation expressed by their fellow Iks, but also the devaluation of fishermen from the Island of Mauritius, pastoralists from Khövsgöl, Mongolia, and Shuar forager-horticulturalists of the Ecuadorian Amazon. What's more, shame mirrored the devaluation of foreigners living nearby in geographic or cultural space just as well as it mirrored the devaluation of foreigners living farther and farther away -- another indication of shame's universality.

Credit: 
University of California - Santa Barbara

Prescribing antibiotics for children with cough does not reduce hospitalization risk

Doctors and nurses often prescribe antibiotics for children with cough and respiratory infection to avoid return visits, symptoms getting worse or hospitalisation. In a study published in the British Journal of General Practice today [Tuesday 11 September], researchers from the Universities of Bristol, Southampton, Oxford and King's College London found little evidence that antibiotics reduce the risk of children with cough ending up in hospital, suggesting that this is an area in which unnecessary antibiotic prescribing could be reduced.

The team, funded by the National Institute for Health Research, analysed data from a study of 8,320 children (aged three months to 15 years) who had presented to their GP with cough and other respiratory infection symptoms to see whether adverse outcomes occurred within 30 days of seeing their GP.

Sixty-five children (0.8 per cent) were hospitalised and 350 (four per cent) revisited their GP due to a worsening of symptoms.

Compared with no antibiotics, there was no clear evidence that antibiotics reduced hospitalisation for children, supporting similar research findings in adults. However, there was evidence that a strategy of delayed antibiotic prescribing (giving parents or carers a prescription and advising they wait to see if symptoms worsened before using it) reduced the number of return visits to the GP.

Immediate antibiotics were prescribed to 2,313 children (28 per cent) and delayed antibiotics to 771 (nine per cent).
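The reported proportions are straightforward to check against the study's sample of 8,320 children:

```python
# Counts reported in the study, out of 8,320 children.
n = 8320
counts = {
    "hospitalised": 65,
    "revisited GP": 350,
    "immediate antibiotics": 2313,
    "delayed antibiotics": 771,
}

for outcome, count in counts.items():
    print(f"{outcome}: {100 * count / n:.1f}%")
```

The printed percentages (0.8, 4.2, 27.8 and 9.3) round to the figures quoted in the article.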

Dr Niamh Redmond, from the Centre for Academic Primary Care at the University of Bristol and NIHR CLAHRC West, and lead author of the study, said: "The good news is that most children who present to their GP with acute cough and respiratory infection symptoms are at low risk of hospitalisation. We know that GPs, for a variety of reasons, commonly prescribe antibiotics in these cases as a precautionary measure. However, our study shows that antibiotics are unlikely to reduce this already small risk. This means that along with other strategies, there is real potential to reduce unnecessary antibiotic prescribing, which is a major contributor to the growing public health threat of antimicrobial resistance.

"If a GP or nurse is considering antibiotic prescribing for a child presenting with acute cough, a delayed prescription may be preferable as we have shown that delayed prescribing reduces the need for further consultations."

Credit: 
University of Bristol

Acute critical illness increases risk of kidney complications and death

Acute critical illness in people without previous renal disease puts them at risk of kidney complications as well as death, according to a study published in CMAJ (Canadian Medical Association Journal).

"[P]atients with acute critical illness without apparent underlying renal disease -- a group traditionally considered to be at low risk of renal diseases -- have clinically relevant long-term renal risks," write Dr. Shih-Ting Huang and Dr. Chia-Hung Kao, Taichung Veterans General Hospital and China Medical University, Taichung, Taiwan, with coauthors.

Most studies have looked at patients with pre-existing kidney disease; this study instead compared data on 33 613 Taiwanese patients with acute critical illness and no pre-existing kidney disease against 63 148 controls, examining medium-term renal outcomes. More than half the patients (53%) were over age 65 and two-thirds (67%) had high blood pressure. Patients who had experienced acute critical illness were at increased risk of renal complications, developing chronic kidney disease and end-stage renal disease, with septicemia and septic shock being the strongest risk factors. Of the critically ill patients in the study, 335 developed end-stage renal disease, a rate of 21 per 10 000 person-years compared with 4.9 per 10 000 person-years in the control group.
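A quick calculation shows what those incidence figures imply: the end-stage renal disease rate in the critically ill group was roughly four times that of the controls.

```python
# Rates reported in the study, per 10,000 person-years.
rate_critical = 21 / 10_000   # critically ill group
rate_control = 4.9 / 10_000   # control group

rate_ratio = rate_critical / rate_control
print(round(rate_ratio, 1))   # roughly a four-fold higher rate
```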

Patients who developed chronic kidney disease and end-stage renal disease were at a higher risk of death.

The authors suggest clinicians monitor kidney function at 30-90 days in patients with acute critical illness without pre-existing renal disease, and at least yearly thereafter.

"Renal complications and subsequent mortality in acute critically ill patients without pre-existing renal disease" is published September 10, 2018.

Credit: 
Canadian Medical Association Journal