
Joint hypermobility related to anxiety, in animals too

The relation between collagen laxity and anxiety in humans is widely known, but it had never before been observed in other species. A team of researchers led by professors Jaume Fatjó and Antoni Bulbena from the Department of Psychiatry and Legal Medicine at the UAB, the Hospital del Mar Medical Research Institute (IMIM) and the UAB Affinity Foundation Chair in Animals and Health analysed a set of 13 animal behaviour characteristics and hip joint mobility in a total of 5,575 domestic dogs. The results point to an association between hip joint hypermobility and brain activation linked to emotions in dogs, similar to what has been observed in people.

In the case of humans, researchers observed that this relation manifests as anxiety, fear, agoraphobia and panic, and that it is linked to hypermobility through the indirect effects of emotional and mental states via a dysregulation of the brain's autonomic responses, which intensifies emotional states. Scientists maintain that this excitability, or emotional reactivity, is one of the risk factors for anxiety disorders.

"Many years ago our research group discovered a relation between hypermobility and anxiety, in which people with more joint mobility and flexibility also tended to have more problems with anxiety. Now for the first time we are able to demonstrate that this association also exists in a non-human species", explains Professor Fatjó.

In the new study, published in Scientific Reports, of the Nature group, researchers presented the first evidence of an association between hip joint hypermobility and behavioural alterations in a non-human species, suggesting an ancient evolutionary link that could be a universal trait among mammals.

Credit: 
Universitat Autonoma de Barcelona

Boomers back on the dating scene seek cosmetic procedures to put their best face forward

video: Download HD broadcast quality news package, b-roll and TV script here: http://bit.ly/2Xy5p3Y

Image: 
American Society of Plastic Surgeons

The rise of dating websites and apps has changed the dating scene, leading singles to make judgments on potential dates as fast as they can swipe their screens. With about 20 million baby boomers actively using at least one of these dating services, many want to look their best and feel confident in the modern dating world. New statistics released by the American Society of Plastic Surgeons show that more Americans age 55 and older are seeking cosmetic procedures, with nearly 50,000 more procedures performed on patients in this age group in 2018 than in the previous year.

"I've definitely noticed more baby boomers making appointments to explore which procedures can help them look as young as they feel," said Anne Taylor, MD, a board-certified plastic surgeon in Columbus. "They're not necessarily looking to turn back the clock to their twenties, but just to make some subtle changes to put their best face forward. For example, a lot of my patients have expressed concerns about their neck or double chin, especially when it comes to looking downward to take a photo or video chat on their phones."

Dr. Taylor says boomers are interested in a wide range of procedures, both surgical and minimally invasive. The statistics show that surgeries such as liposuction (up 4%), hair transplantation (up 18%) and breast augmentation (up 4%) were popular among older patients in 2018, while smaller tweaks to freshen their look, such as Botox (up 3%) and fillers (up 3%), also increased significantly among those 55 and older.

Fifty-five-year-old Jody Pinkerton says she chose to have a tummy tuck and breast augmentation to help her feel more like herself. After her divorce, her new look also gave her the confidence to create an online dating profile and begin meeting new people.

"Online dating can be scary, and you really need to be able to put yourself out there," said Pinkerton. "People are initially judging you on a few photos, and I wanted to make sure I could have current photos that really represented who I am and made a good first impression."

Having the courage to explore the modern dating scene eventually led Pinkerton to her husband, whom she met online. "I have found my soulmate, and I don't know if our paths would have ever crossed had I not been on that site," she said.

"My boomer patients are confident and accomplished men and women, and they just want to look in the mirror and see the person they feel they are inside," said Dr. Taylor. "Finding the right cosmetic procedures can give them that extra boost of confidence when venturing into a new chapter of their lives."

With two-thirds (66%) of all facelifts in 2018 performed on patients 55 and older, along with nearly half (48%) of all eyelid surgeries, the rise in cosmetic procedures among older Americans is expected to continue. Statistics show procedures among this age group have steadily increased from 2 to 6 percent each year for nearly a decade. "It's a trend that's showing a shift in our culture," said Alan Matarasso, MD, president of the American Society of Plastic Surgeons. "Older Americans are vibrant and living their lives to the fullest, and they want the way they look to reflect that. That's where a board-certified plastic surgeon can help."

Dr. Matarasso stresses the importance of choosing a board-certified surgeon, who must meet the most rigorous safety and ethical standards and who is best qualified to offer a wide range of options to fit each patient's needs.

"Choosing a board-certified plastic surgeon is critical to your safety, your comfort and your happiness," said Dr. Matarasso. "There are a lot of choices when it comes to cosmetic procedures, and the extensive training that these doctors go through allows them to offer a full spectrum of options as well as the latest technology to find the very best plan to reach the goals of each individual."

Credit: 
MediaSource

Scientists identify genes associated with biliary atresia survival

CINCINNATI -- Scientists at Cincinnati Children's Hospital Medical Center have identified an expression pattern of 14 genes at the time of diagnosis that predicts two-year, transplant-free survival in children with biliary atresia - the most common diagnosis leading to liver transplants in children.

The researchers also found that the antioxidant N-acetyl-cysteine (NAC) reduced liver injury and fibrosis (excess fibrous connective tissue) in mice with biliary atresia and increased survival times.

Biliary atresia is a rare disease of the liver and bile ducts that occurs in infants. Symptoms of the disease appear or develop about two to eight weeks after birth. When a baby has biliary atresia, bile flow from the liver to the gallbladder is blocked. This causes the bile to be trapped inside the liver, eventually causing liver failure.

"The relationship between a 14-gene signature at diagnosis and two-year survival provides insight into staging of liver disease and the development of new therapies," says Jorge Bezerra, MD, director of Gastroenterology, Hepatology and Nutrition at Cincinnati Children's. "A particularly appealing possibility is the design of a clinical trial to activate the glutathione pathway - a molecule highly expressed in infants with biliary atresia. The activation of the pathway by the antioxidant NAC has the potential to improve bile flow and block fibrosis development."

The study is published online in the journal Gastroenterology.

The researchers obtained liver biopsies and clinical data from infants with cholestasis - a reduction or stoppage of bile flow. The infants were enrolled in a prospective study of the Childhood Liver Disease Research Network, funded by the National Institutes of Health, or were evaluated at Cincinnati Children's. Liver biopsies were obtained at the time of diagnosis of biliary atresia. The researchers performed RNA sequencing on the biopsies and used the results to develop a prognostic index.

The scientists administered NAC to neonatal mice with biliary atresia and fibrosis, which decreased bilirubin and liver fibrosis. Bilirubin is an orange-yellow substance made during the normal breakdown of red blood cells. Bilirubin passes through the liver and is eventually excreted from the body. Higher levels, however, can indicate liver problems.

"We don't yet know if NAC is safe and effective in young babies with biliary atresia. Future clinical trials are needed before use in clinical practice," says Dr. Bezerra.

Credit: 
Cincinnati Children's Hospital Medical Center

'Alexa, monitor my heart': Researchers develop first contactless cardiac arrest AI system for smart speakers

image: The researchers envision a contactless system that works by continuously and passively monitoring the bedroom for an agonal breathing event. If it detects agonal breathing, it can call for help.

Image: 
Sarah McQuate/University of Washington

Almost 500,000 Americans die each year from cardiac arrest, when the heart suddenly stops beating.

People experiencing cardiac arrest will suddenly become unresponsive and either stop breathing or gasp for air, a sign known as agonal breathing. Immediate CPR can double or triple someone's chance of survival, but that requires a bystander to be present.

Cardiac arrests often occur outside of the hospital and in the privacy of someone's home. Recent research suggests that one of the most common locations for an out-of-hospital cardiac arrest is in a patient's bedroom, where no one is likely around or awake to respond and provide care.

Researchers at the University of Washington have developed a new tool to monitor people for cardiac arrest while they're asleep without touching them. A new skill for a smart speaker -- such as Google Home or Amazon Alexa -- or a smartphone lets the device detect the gasping sound of agonal breathing and call for help. On average, the proof-of-concept tool, which was developed using real agonal breathing instances captured from 911 calls, detected agonal breathing events 97% of the time from up to 20 feet (6 meters) away. The findings are published June 19 in npj Digital Medicine.

"A lot of people have smart speakers in their homes, and these devices have amazing capabilities that we can take advantage of," said co-corresponding author Shyam Gollakota, an associate professor in the UW's Paul G. Allen School of Computer Science & Engineering. "We envision a contactless system that works by continuously and passively monitoring the bedroom for an agonal breathing event, and alerts anyone nearby to come provide CPR. And then if there's no response, the device can automatically call 911."

Agonal breathing is present for about 50% of people who experience cardiac arrests, according to 911 call data, and patients who take agonal breaths often have a better chance of surviving.

"This kind of breathing happens when a patient experiences really low oxygen levels," said co-corresponding author Dr. Jacob Sunshine, an assistant professor of anesthesiology and pain medicine at the UW School of Medicine. "It's sort of a guttural gasping noise, and its uniqueness makes it a good audio biomarker to use to identify if someone is experiencing a cardiac arrest."

The researchers gathered sounds of agonal breathing from real 911 calls to Seattle's Emergency Medical Services. Because cardiac arrest patients are often unconscious, bystanders recorded the agonal breathing sounds by putting their phones up to the patient's mouth so that the dispatcher could determine whether the patient needed immediate CPR. The team collected 162 calls between 2009 and 2017 and extracted 2.5 seconds of audio at the start of each agonal breath to come up with a total of 236 clips. The team captured the recordings on different smart devices -- an Amazon Alexa, an iPhone 5s and a Samsung Galaxy S4 -- and used various machine learning techniques to boost the dataset to 7,316 positive clips.

"We played these examples at different distances to simulate what it would sound like if the patient was at different places in the bedroom," said first author Justin Chan, a doctoral student in the Allen School. "We also added different interfering sounds such as sounds of cats and dogs, cars honking, air conditioning, things that you might normally hear in a home."
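
One common way to mix interfering sounds into positive clips is at a controlled signal-to-noise ratio. The NumPy sketch below is a hypothetical illustration of that idea; the function name, the SNR-based mixing scheme, and the sample rate are assumptions, not the study's actual augmentation pipeline.

```python
import numpy as np

def mix_at_snr(clip, noise, snr_db):
    """Add an interfering recording (e.g., a honking car) to a positive
    clip at a target signal-to-noise ratio in dB. Hypothetical sketch,
    not the study's augmentation code."""
    noise = np.resize(noise, clip.shape)       # match the clip's length
    sig_power = np.mean(clip ** 2)
    noise_power = np.mean(noise ** 2) + 1e-12  # avoid divide-by-zero
    # Scale noise so 10*log10(sig_power / scaled_noise_power) == snr_db.
    scale = np.sqrt(sig_power / (noise_power * 10 ** (snr_db / 10)))
    return clip + scale * noise

rng = np.random.default_rng(0)
sample_rate = 16_000                                  # assumed rate
clip = rng.standard_normal(int(2.5 * sample_rate))    # a 2.5-second clip
noise = rng.standard_normal(int(4.0 * sample_rate))   # longer interference
augmented = mix_at_snr(clip, noise, snr_db=10.0)
```

Repeating this over many noise types and SNR levels is one way a few hundred source clips can be expanded into thousands of training examples.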

For the negative dataset, the team used 83 hours of audio data collected during sleep studies, yielding 7,305 sound samples. These clips contained typical sounds that people make in their sleep, such as snoring or obstructive sleep apnea.

From these datasets, the team used machine learning to create a tool that could detect agonal breathing 97% of the time when the smart device was placed up to 6 meters away from a speaker generating the sounds.

Next the team tested the algorithm to make sure that it wouldn't accidentally classify a different type of breathing, like snoring, as agonal breathing.

"We don't want to alert either emergency services or loved ones unnecessarily, so it's important that we reduce our false positive rate," Chan said.

For the sleep lab data, the algorithm incorrectly categorized a breathing sound as agonal breathing 0.14% of the time. The false positive rate was about 0.22% for separate audio clips, in which volunteers had recorded themselves while sleeping in their own homes. But when the team had the tool classify something as agonal breathing only when it detected two distinct events at least 10 seconds apart, the false positive rate fell to 0% for both tests.
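
The two-event rule is simple enough to sketch. Below is a minimal, hypothetical Python version of the idea: an alarm fires only when the classifier has flagged agonal breathing at two distinct times at least 10 seconds apart.

```python
def confirmed_alarm(detection_times, min_gap_s=10.0):
    """Return True only if two distinct detections are separated by at
    least `min_gap_s` seconds -- the false-positive filter described
    above. A sketch of the idea, not the paper's implementation."""
    if len(detection_times) < 2:
        return False
    times = sorted(detection_times)
    # Two qualifying events exist iff the earliest and latest detections
    # are at least min_gap_s apart.
    return times[-1] - times[0] >= min_gap_s

confirmed_alarm([4.2, 16.0])   # two detections 11.8 s apart -> alarm
confirmed_alarm([4.2, 6.1])    # too close together -> no alarm
```

Requiring two well-separated detections is a standard debouncing trick: a single spurious classification (a snore, a passing car) cannot trigger the alarm on its own.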

The team envisions this algorithm could function like an app, or a skill for Alexa that runs passively on a smart speaker or smartphone while people sleep.

"This could run locally on the processors contained in the Alexa. It's running in real time, so you don't need to store anything or send anything to the cloud," Gollakota said.

"Right now, this is a good proof of concept using the 911 calls in the Seattle metropolitan area," he said. "But we need to get access to more 911 calls related to cardiac arrest so that we can improve the accuracy of the algorithm further and ensure that it generalizes across a larger population."

The researchers plan to commercialize this technology through a UW spinout, Sound Life Sciences, Inc.

"Cardiac arrests are a very common way for people to die, and right now many of them can go unwitnessed," Sunshine said. "Part of what makes this technology so compelling is that it could help us catch more patients in time for them to be treated."

Credit: 
University of Washington

Atrial fibrillation linked to increased risk of dementia, even in stroke-free patients

audio: Professor Boyoung Joung and Professor Gregory Lip discuss their research in nearly 263,000 people that shows atrial fibrillation is linked to an increased risk of developing dementia.

Image: 
Professor Boyoung Joung and Professor Gregory Lip

Atrial fibrillation (AF) is linked to an increased risk of dementia, even in people who have not suffered a stroke, according to the largest study to investigate the association in an elderly population.

In addition, the study, which is published in the European Heart Journal [1] today (Wednesday), found that AF patients who took oral anticoagulants to prevent blood clots had a decreased risk of dementia.

Atrial fibrillation (an irregular and often abnormally fast heartbeat) is the most common heart rhythm problem among elderly people, and more than half of AF patients are aged 80 or older. It increases the risk of stroke, other medical problems, and death. As populations age, the incidence of AF is expected to increase, and there has been increasing, although inconsistent, evidence that AF may contribute to the development of thinking problems and dementia.

The current study looked at 262,611 people aged 60 or over, who were free of AF and dementia in 2004. The researchers collected data from the Korea National Health Insurance Service Senior cohort and followed the participants until the end of 2013. During that time, AF was diagnosed in 10,435 of them. In those who developed AF, 24.4% (2,522) developed dementia during the follow-up period compared to 14.4% (36,322) who developed dementia among the AF-free people.

Professor Boyoung Joung, professor of cardiology and internal medicine at Yonsei University College of Medicine, Seoul, Republic of Korea, who led the research, said: "We found that the people who developed atrial fibrillation had a 50% increased risk of developing dementia compared to those who did not develop the condition; this increased risk remained even after we removed those who suffered a stroke from our calculations. This means that, among the general population, an extra 1.4 people per 100 of the population would develop dementia if they were diagnosed with atrial fibrillation. The increased risk was present both in people younger than 70 and in those aged 70 or older.

"We also found that atrial fibrillation increased the risk of Alzheimer's disease by 30% and more than doubled the risk of vascular dementia. However, among people who developed atrial fibrillation and who took oral anticoagulants, such as warfarin, or non-vitamin K anticoagulants, such as dabigatran, rivaroxaban, apixaban or edoxaban, the risk of subsequently developing dementia reduced by 40% compared to patients who did not take anticoagulants."

The researchers say this is the largest study to investigate the link between AF and dementia in people aged 60 and over who did not have AF and had not suffered a stroke at the time of inclusion in the study. It also has the longest follow-up, averaging more than six years.

"With these large figures, we can be sure of our findings," said co-author, Professor Gregory Lip, professor of cardiovascular medicine at the University of Liverpool, UK, and an adjunct professor at Yonsei University College of Medicine. "We also believe that our results can apply to other populations too, as they confirm similar findings of a link between atrial fibrillation and dementia in studies of people in Western and European countries.

"Our study suggests that the strong link between atrial fibrillation and dementia could be weakened if patients took oral anticoagulants. Therefore, doctors should think carefully and be readier to prescribe anticoagulants for atrial fibrillation patients to try to prevent dementia."

Prof Joung concluded: "Dementia is an untreatable disease and so prevention is important. This study confirms that atrial fibrillation is a risk factor for the development of dementia. Therefore, the prevention of atrial fibrillation may be a means to reduce the incidence of dementia. In patients with atrial fibrillation, our results suggest we can reduce the incidence of dementia through the use of anticoagulants. It is expected that non-vitamin K anticoagulants, which have a significantly lower risk of cerebral haemorrhage than warfarin, may be more effective than warfarin in terms of dementia prevention and this will be answered by an ongoing clinical trial. Furthermore, investigations should be carried out into whether aggressive rhythm control, such as catheter ablation, helps to prevent dementia." [2]

The researchers point out that their findings show only that there is an association between AF and dementia, and not that AF causes dementia. A possible mechanism for how AF might cause dementia is that AF patients often have alterations to the blood vessels in their brains, possibly from suffering mini strokes that are too small to show any outward symptoms, and this cerebrovascular damage could be implicated in the onset of dementia.

Limitations of the study include the fact that the researchers were unable to define the type of AF the patients had (paroxysmal or persistent); that AF can occur without symptoms and so some cases may have been missed; that the researchers did not have any information on the treatment of AF (successful treatment could lessen the risk of dementia) or 'real world' data on blood pressure; and, finally, although they adjusted for factors that could confound the results, unidentified confounding factors might remain.

Credit: 
European Society of Cardiology

One day of employment a week is all we need for mental health benefits -- study

As automation advances, predictions of a jobless future have some fearing unrest from mass unemployment, while others imagine a more contented work-free society.

Aside from economic factors, paid employment brings other benefits - often psychological - such as self-esteem and social inclusion. Now, researchers at the universities of Cambridge and Salford have set out to define a recommended "dosage" of work for optimal wellbeing.

They examined how changes in working hours were linked to mental health and life satisfaction in over 70,000 UK residents between 2009 and 2018.

The study, published today in the journal Social Science and Medicine, shows that when people moved from unemployment or stay-at-home parenting into paid work of eight hours or less a week, their risk of mental health problems reduced by an average of 30%.

Yet researchers found no evidence that working any more than eight hours provided further boosts to wellbeing. The full-time standard of 37 to 40 hours was not significantly different to any other working time category when it came to mental health.

As such, they suggest that to get the mental wellbeing benefits of paid work, the most "effective dose" is only around one day a week - as anything more makes little difference.

"We have effective dosage guides for everything from Vitamin C to hours of sleep in order to help us feel better, but this is the first time the question has been asked of paid work," said study co-author Dr Brendan Burchell, a sociologist from Cambridge University who leads the Employment Dosage research project.

"We know unemployment is often detrimental to people's wellbeing, negatively affecting identity, status, time use, and sense of collective purpose. We now have some idea of just how much paid work is needed to get the psychosocial benefits of employment - and it's not that much at all."

Supporting the unemployed in a future with limited work is the subject of much policy discussion, such as proposals for a universal basic income. However, the researchers argue that employment should be retained across the adult population, with working weeks dramatically shortened so that work can be redistributed.

"In the next few decades we could see artificial intelligence, big data and robotics replace much of the paid work currently done by humans," said Dr Daiga Kamerāde, study first author from Salford University and Employment Dosage researcher.

"If there is not enough for everybody who wants to work full-time, we will have to rethink current norms. This should include the redistribution of working hours, so everyone can get the mental health benefits of a job, even if that means we all work much shorter weeks."

"Our findings are an important step in thinking about how much paid work people might need in a future with little work to go round," she said.

The study used data from the UK Household Longitudinal Study to track the wellbeing of 71,113 individuals between the ages of 16 and 64 as they changed working hours over the nine-year period. People were asked about issues such as anxiety and sleep problems to gauge mental health.

Researchers also found that self-reported life satisfaction in men increased by around 30% with up to eight hours of paid work, although women didn't see a similar jump until working 20 hours.

They note that "the significant difference in mental health and wellbeing is between those with paid work and those with none", and that the working week could be shortened considerably "without a detrimental effect on the workers' mental health and wellbeing".

The team offer creative policy options for moving into a future with limited work, including "five-day weekends", working just a couple of hours a day, or increasing annual holiday from weeks to months - even having two months off for every month at work.

They also argue that working hour reduction and redistribution could improve work-life balance, increase productivity, and cut down CO2 emissions from commuting. However, they point out that reduction of hours would need to be for everyone, to avoid increasing socioeconomic inequalities.

"The traditional model, in which everyone works around 40 hours a week, was never based on how much work was good for people. Our research suggests that micro-jobs provide the same psychological benefits as full-time jobs," said co-author and Cambridge sociologist Senhu Wang.

"However, the quality of work will always be crucial. Jobs where employees are disrespected or subject to insecure or zero-hours contracts do not provide the same benefits to wellbeing, nor are they likely to in the future."

Dr Burchell added: "If the UK were to plough annual productivity gains into reduced working hours rather than pay rises, the normal working week could be four days within a decade."

Credit: 
University of Cambridge

Appearance of deep-sea fish does not signal upcoming earthquake in Japan

The unusual appearance of deep-sea fish like the oarfish or slender ribbonfish in Japanese shallow waters does not mean that an earthquake is about to occur, according to a new statistical analysis.

The study published in the Bulletin of the Seismological Society of America contradicts long-held Japanese folklore that deep sea fish sightings are a sign of an imminent earthquake, say Yoshiaki Orihara of Tokai University in Japan and colleagues.

When the researchers examined the relationship between deep-sea fish appearances and earthquakes in Japan, however, they found only one event that could have been plausibly correlated, out of 336 fish sightings and 221 earthquakes.

"As a result, one can hardly confirm the association between the two phenomena," the authors write in the BSSA paper.

The study included data from November 1928 to March 2011, looking for deep-sea fish appearances within 10 and 30 days before earthquakes that occurred within 50 and 100 kilometers of the sighting.

They confined their search to earthquakes of magnitude 6.0 or larger, since these are the earthquakes that have been linked to "precursor phenomena like unusual animal behavior in previous reports," said Orihara.

There were no recorded deep-sea fish appearances before an earthquake of magnitude 7.0, and no earthquakes with a magnitude greater than 6.0 occurred within 10 days of a deep-sea fish appearance.
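
The matching rule the study describes can be expressed compactly. The sketch below, with an assumed record layout, flags a sighting as a plausible precursor when an M 6.0 or larger quake follows within the time window and distance limit; it illustrates the pairing logic, not the authors' actual code or catalog format.

```python
from datetime import date
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    p1, p2 = radians(lat1), radians(lat2)
    dphi, dlmb = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(p1) * cos(p2) * sin(dlmb / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def plausible_precursor(sighting, quakes, max_days=30, max_km=100.0, min_mag=6.0):
    """True if any sufficiently large quake follows the sighting within
    the time window and distance limit. Assumed record layout:
    sighting = (date, lat, lon); quake = (date, lat, lon, magnitude)."""
    s_date, s_lat, s_lon = sighting
    for q_date, q_lat, q_lon, mag in quakes:
        days_after = (q_date - s_date).days
        if (mag >= min_mag and 0 <= days_after <= max_days
                and haversine_km(s_lat, s_lon, q_lat, q_lon) <= max_km):
            return True
    return False

sighting = (date(2010, 2, 1), 37.0, 137.0)          # illustrative sighting
quakes = [(date(2010, 2, 20), 37.3, 137.4, 6.4)]    # M6.4, 19 days later
plausible_precursor(sighting, quakes)
```

Applied across 336 sightings and 221 earthquakes, a rule of this kind is how the authors could count how many pairs, if any, line up.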

Orihara became interested in the deep-sea fish stories after the 2011 magnitude 9.0 Tohoku earthquake in Japan. If the stories were true, he said, deep-sea fish appearances could be used in disaster mitigation efforts.

"From this motivation, we started compiling the event catalog for statistical study," said Orihara. "There were some previous papers surveying deep-sea fish appearances, but their reports were insufficient for a statistical study. To collect a larger number of events, we focused on local newspapers, which have often reported them."

The researchers scoured a digitized database of newspaper articles to find mentions of the unusual appearance of deep-sea fish that folklore says will appear before an earthquake, including oarfish, several kinds of ribbonfish, dealfish and the unicorn crestfish.

Orihara said that he and his colleagues expect that their results "will have an influence on people who believe the folklore," and they hope that their findings will be shared throughout Japan.

Credit: 
Seismological Society of America

Cool halo gas caught spinning like galactic disks

image: This is an artist's conception of gas streams (blue) feeding a galactic disk. The inflow fuels new star formation, and because the infalling gas is spinning, the size of the disk grows.

Image: 
James Josephides, Swinburne Astronomy Productions

Maunakea, Hawaii - A group of astronomers led by Crystal Martin and Stephanie Ho of the University of California, Santa Barbara, has discovered a dizzying cosmic choreography among typical star-forming galaxies; their cool halo gas appears to be in step with the galactic disks, spinning in the same direction.

The researchers used the W. M. Keck Observatory to obtain the first-ever direct observational evidence that corotating halo gas is not only possible, but common. Their findings suggest that the whirling halo gas will eventually spiral in towards the disk.

"This is a major breakthrough in understanding how galactic disks grow," said Martin, Professor of Physics at UC Santa Barbara and lead author of the study. "Galaxies are surrounded by massive reservoirs of gas that extend far beyond the visible portions of galaxies. Until now, it has remained a mystery how exactly this material is transported to galactic disks where it can fuel the next generation of star formation."

The study, published in today's issue of the Astrophysical Journal, presents the combined results for 50 standard star-forming galaxies observed over a period of several years.

Nearly a decade ago, theoretical models predicted that the angular momentum of the spinning cool halo gas partially offsets the gravitational force pulling it towards the galaxy, thereby slowing down the gas accretion rate and lengthening the period of disk growth.

The team's results confirm this prediction, showing that the angular momentum of the halo gas is high enough to slow the infall rate but not so high as to shut off feeding of the galactic disk entirely.
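
In schematic terms (a textbook-level sketch, not the models' actual equations), a halo gas parcel at radius r with specific angular momentum j, falling toward a galaxy of enclosed mass M, feels a net radial acceleration

```latex
a_{\mathrm{eff}}(r) = -\frac{GM}{r^2} + \frac{j^2}{r^3}
```

Accretion is slowed but not halted while the rotational term only partially offsets gravity, i.e. while j^2 < GMr; if j^2 reached GMr, the gas would circularize at that radius and stop falling in. The observations place the halo gas in the first regime.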

METHODOLOGY

The astronomers first obtained spectra of bright quasars behind star-forming galaxies to detect the invisible halo gas by its absorption-line signature in the quasar spectra. Next, the researchers used Keck Observatory's laser guide star adaptive optics (LGSAO) system and near-infrared camera (NIRC2) on the Keck II telescope, along with Hubble Space Telescope's Wide Field Camera 3 (WFC3), to obtain high-resolution images of the galaxies.

"What sets this work apart from previous studies is that our team also used the quasar as a reference 'star' for Keck's laser guide star AO system," said co-author Ho, a physics graduate student at UC Santa Barbara. "This method removed the blurring caused by the atmosphere and produced the detailed images we needed to resolve the galactic disks and geometrically determine the orientation of the galactic disks in three-dimensional space."

The team then measured the Doppler shifts of the gas clouds using the Low Resolution Imaging Spectrometer (LRIS) at Keck Observatory, as well as obtaining spectra from Apache Point Observatory. This enabled the researchers to determine what direction the gas is spinning and how fast. The data proved that the gas is rotating in the same direction as the galaxy, and the angular momentum of the gas is not stronger than the force of gravity, meaning the gas will spiral into the galactic disk.
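
The Doppler measurement itself rests on a one-line formula. A minimal sketch, using the non-relativistic approximation and illustrative numbers (the Mg II 2796 Å line is a common choice for cool halo gas, but the wavelengths below are made up for the example):

```python
C_KM_S = 299_792.458  # speed of light in km/s

def radial_velocity_km_s(lambda_obs, lambda_rest):
    """Non-relativistic Doppler shift:
    v = c * (lambda_obs - lambda_rest) / lambda_rest.
    Positive v means the gas is receding along the line of sight."""
    return C_KM_S * (lambda_obs - lambda_rest) / lambda_rest

lambda_rest = 2796.35                              # angstroms (Mg II)
lambda_obs = lambda_rest * (1 + 100.0 / C_KM_S)    # shifted redward
v = radial_velocity_km_s(lambda_obs, lambda_rest)  # ~ +100 km/s
```

Comparing the sign and size of such velocities on either side of a galaxy's axis is what reveals whether the absorbing gas rotates with the disk.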

"Just as ice skaters build up momentum and spin when they bring their arms inward, the halo gas is likely spinning today because it was once at much larger distances where it was deposited by galactic winds, stripped from satellite galaxies, or directed toward the galaxy by a cosmic filament," said Martin.

NEXT STEPS

The next step for Martin and her team is to measure the rate at which the halo gas is being pulled into the galactic disk. Comparing the inflow rate to the star formation rate will provide a better timeline of the evolution of normal star-forming galaxies, and explain how galactic disks continue to grow over very long timescales that span billions of years.

Credit: 
W. M. Keck Observatory

Does greater immersion in virtual reality lead to a better experience?

image: Cyberpsychology, Behavior, and Social Networking is an authoritative peer-reviewed journal published monthly online with Open Access options and in print that explores the psychological and social issues surrounding the Internet and interactive technologies.

Image: 
(c) 2019 Mary Ann Liebert, Inc., publishers

New Rochelle, NY, June 18, 2019--Contrary to current industry trends to develop more immersive virtual reality systems, a new study found that a more immersive environment or the presence of real-world distractions could have surprising effects on a participant's recall, description of the virtual encounter, and how positive they feel about the experience. The design, results, and implications of this timely study are published in Cyberpsychology, Behavior, and Social Networking, a peer-reviewed journal from Mary Ann Liebert, Inc., publishers. Click here to read the full-text article free on the Cyberpsychology, Behavior, and Social Networking website through July 18, 2019.

"The Effects of Immersion and Real-World Distractions on Virtual Social Interactions" was coauthored by Jeremy Bailenson, Catherine Oh, and Fernanda Herrera, Stanford University, CA. The researchers designed a study in which participants interacted with a virtual agent in either an immersive or non-immersive virtual environment with three levels of real-world distractions: no distractions; passive exposure to the sound of a ringing cell phone; actively answering a ringing cell phone. Whereas increased immersion had a positive effect on feeling present in the virtual reality interaction, it had a negative effect on recognition and recall.

"This very important study highlights again the critical need to understand the role of realism during the VR experience. The clarity of the latest PlayStation characters, for example, or those seen in recently released movies and video clips begs the question of whether the uncanny valley has been overcome," says Editor-in-Chief Brenda K. Wiederhold, PhD, MBA, BCB, BCN, Interactive Media Institute, San Diego, California and Virtual Reality Medical Institute, Brussels, Belgium. "Despite over 20 years of research, we still do not fully understand how levels of realism can differentially affect human perception and the underlying psychological milieu. Many VR clinicians have shown that in certain cases, less granular or admittedly more cartoonish avatars and graphics produce better clinical outcomes. On the other hand, in certain types of surgical training, realistic VR is both justified and necessary. There remain many unanswered issues and a lack of good objective measures for our feeling of presence, and this paper plays a crucial role in furthering our knowledge."

Credit: 
Mary Ann Liebert, Inc./Genetic Engineering News

Record-low fertility rates linked to decline in stable manufacturing jobs

image: Three measures of the level of manufacturing business activity show a long-term decline over the last few decades. A change in 1998 in how businesses are classified creates a jump in the numbers that year.

Image: 
Nathan Seltzer

MADISON - As the Great Recession wiped out nearly 9 million jobs and 19 trillion dollars in wealth from U.S. households, American families experienced another steep decline -- they had fewer children.

Yet as employment eventually climbed and wages rose, fertility did not. The fertility rates for American women have continued to decline in the years since the recession ended and reached an all-time low of 1.7 children per woman in 2018.

This continuous decline puzzled scientists, who had previously observed that economic recoveries tend to buoy fertility rates that decline during recessions. The Great Recession seemingly halted that trend.

New research by University of Wisconsin-Madison sociologist Nathan Seltzer identifies a link between the long-term decline in manufacturing jobs -- accelerated during the Great Recession -- and reduced fertility rates. Analyzing every birth in America at the county level across 24 years, Seltzer found that the share of businesses in goods-producing industries better predicted a metropolitan area's fertility rate than the region's unemployment rate.

The link between manufacturing jobs and fertility rates was especially strong for Hispanic women, a larger share of whom work in goods-producing industries than do women from other racial or ethnic groups. Manufacturing business activity was a stronger predictor of fertility than the unemployment rate was for all racial groups.

The results suggest that fertility rates are unlikely to return to pre-recession levels as manufacturing businesses continue to represent a decreasing share of the American economy.

"These structural trends are driving this increased financial precarity and influencing women's and couples' decisions to have children," says Seltzer, who published his findings this week in the journal Demography. "Metro areas that experienced steeper declines in goods-producing businesses were more likely to experience steeper declines in fertility rates."

The link between manufacturing businesses and fertility remained even when adjusting for other factors that affect fertility rates, such as education levels, marriage rates and the share of Hispanic women in the U.S. who were born in Mexico, which has been declining.

Seltzer analyzed births from the National Vital Statistics System across all 381 Census Bureau-designated metropolitan areas from 1991 to 2014, an area representing 85 percent of the U.S. population. He converted that data into a measure called the total fertility rate, or TFR, a calculation of the expected number of children the average woman will have in her lifetime based on current birth rates. A TFR of about 2.1 children per woman is required to maintain a stable population size without any immigration.
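The TFR calculation described above can be sketched in a few lines: age-specific fertility rates are summed across the reproductive ages, with each 5-year age group contributing five years of exposure at its rate. The rates below are illustrative placeholders, chosen only so the total lands near the article's 1.7 figure, not data from the study:

```python
# Total fertility rate (TFR) from age-specific fertility rates (ASFRs).
# Rates are births per 1,000 women per year, by 5-year age group.
# These numbers are illustrative, not from the National Vital Statistics System.
asfr_per_1000 = {
    "15-19": 17, "20-24": 68, "25-29": 95,
    "30-34": 100, "35-39": 53, "40-44": 12, "45-49": 1,
}

# Each 5-year age group contributes 5 years of exposure at its rate.
tfr = 5 * sum(asfr_per_1000.values()) / 1000
print(f"TFR = {tfr:.2f} children per woman")
```

The same synthetic-cohort logic underlies the 2.1 replacement threshold: it is the lifetime total implied by current rates, not a count of actual completed families.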

The Census Bureau provided data on the proportion of goods-producing or service sector businesses in an area. Business numbers provide the most reliable information about job opportunities in an industry.

From 1991 to 2014, goods-producing businesses declined from 18.3 percent of all businesses to 14.2 percent. Seltzer found that this decline accounted for a loss in TFR from a low of 0.08 for white women to a high of 0.21 for black women.

Throughout the Great Recession and subsequent recovery, from 2006 to 2014, manufacturing businesses declined by 2.4 percentage points. Over the same period, Hispanic women's fertility fell 24 percent. Other racial groups saw a fertility decline of 7 to 8 percent.

Seltzer found that the loss of goods-producing businesses during this period accounted for a quarter to almost half of the decline in TFR for women, depending on their race or ethnicity. The unemployment rate explained a much smaller fraction of these fertility declines.

"Once you account for the share of goods-producing businesses in an area, you find that it better explains declines in fertility than the unemployment rate does," says Seltzer, a doctoral candidate in the sociology department.

This is the first study to compare how structural changes in the economy and cyclical changes in unemployment rates affect fertility nationwide. Earlier studies had identified transient effects on fertility based on a glut of foreign imports or the effects of local oil booms. Most previous research focused on links between cyclical recessions and fertility without accounting for long-term economic trends.

It remains to be seen whether the record-low TFR represents delayed or foregone childbearing. A full account of a generation's fertility is only possible after women pass their childbearing years, designated as ages 15 to 49. But even a temporary drop in childbearing can reduce lifetime fertility for a particular population.

Fertility, in turn, affects many social and economic factors, such as the available workforce and the support of social welfare programs such as Social Security.

"U.S. manufacturing is what built the middle class," says Seltzer. "As those industries have declined, there's been little growth in jobs at a comparable skill and income level. Increasingly, people have to find work in the service sector that provides worse pay and less financial stability."

Seltzer notes that manufacturing jobs are not inherently family friendly. But in the postwar era, they have tended to provide low- and middle-skill workers with secure, middle-income positions. Without subsequent growth in middle-skill, middle-income service sector jobs, the decline in manufacturing has left many workers worse off.

"In the U.S. right now, there's this sense that the economy is growing, unemployment is down," says Seltzer. "Despite these positive economic indicators, we're still going through this transition of deindustrialization that leads to more precarious work for many people."

Credit: 
University of Wisconsin-Madison

Latest artificial intelligence research from China in Big Data

image: Big Data, published quarterly online with open access options and in print, facilitates and supports the efforts of researchers, analysts, statisticians, business leaders, and policymakers to improve operations, profitability, and communications within their organizations.

Image: 
(c) 2019 Mary Ann Liebert, Inc., publishers

New Rochelle, NY, June 18, 2019--China is among the leaders in the rapidly advancing artificial intelligence field, and its broad range of cutting-edge research expertise is on display in this special issue on "Artificial Intelligence in China" of Big Data, a peer-reviewed journal from Mary Ann Liebert, Inc., publishers. Click here to read the special issue free on the Big Data website through July 18, 2019.

Co-Guest Editors Weiping Zhang, PhD, Zhejiang University (China) and Mohit Kumar, PhD, Rostock University (Germany) organized the unique and timely collection of articles in this special issue.

Featured in the special issue is the article entitled "Abnormal Data Region Discrimination and Cross-Monitoring Points Historical Correlation Repair of Water Intake Data," coauthored by Huifeng Xue, Xi'an University of Technology and China Academy of Aerospace System Scientific and Engineering (Beijing), Qiaoyun Liu, Xi'an University of Technology, Junjie Hou, China Academy of Aerospace System Scientific and Engineering, and Yi Wan, Ministry of Water Resources (Beijing). The researchers analyze the characteristics of abnormal data distribution and show how the data from current monitoring points do not maximally correlate with historical data from corresponding points. They use sample data from recent years to demonstrate that application of the Abnormal Data Region Discrimination algorithm and the Cross Monitoring-Points Historical Correlation Repair method can correctly identify the abnormal data region and repair the abnormal data.

Yao Yu and Junhui Zhao, East China Jiaotong University (Nanchang), and Wu Lenan, Southeast University (Nanjing), collaborated on the article entitled "Multiple Targets Tracking with Big Data-Based Measurement for Extended Binary Phase Shift Keying Transceiver". The researchers proposed using Doppler measurements of target velocity in combination with target range information to improve the ability to detect multiple targets accurately in a noisy environment with an extended binary phase shift keying (EBPSK) transmit-receive system - a high-resolution radar tracking system. In a simulated experiment, they showed significant enhancement in the tracking performance of the big Doppler data association method. The target velocity measurements support the likelihood of the EBPSK transceiver-generated information, helping to distinguish actual targets from false targets or clutter measurements.

Big Data Editor-in-Chief Zoran Obradovic, PhD, Carnell Professor of Data Analytics, Temple University, (Philadelphia, PA) states: "China's spending on research has increased 8-fold since 2000. The overall results of this increased research activity are evident in many fields and are particularly impressive in the area of Artificial Intelligence and Big Data. This special issue provides an excellent opportunity to read about a range of ongoing AI-related developments across multiple big data-related fields in China."

Credit: 
Mary Ann Liebert, Inc./Genetic Engineering News

Surgery to straighten a deviated septum improves quality of life

Surgery to straighten a deviated nasal septum, also known as septoplasty, is worthwhile. Patients with a deviated (crooked) septum breathe more easily after this operation and their quality of life improves. Yet the effects of this procedure had never been systematically investigated, and specialists have long debated its benefits. Now, researchers at Radboud university medical center have settled the debate with a study published June 18 in The Lancet.

The septum divides the nose into two halves. If the septum is deviated, various complaints can occur, such as chronic nasal obstruction, difficulty in breathing, or snoring. These complaints are so common that septoplasty is the most frequently performed ENT operation in adults. However, the effectiveness of this treatment has never been systematically investigated with a control group. As a result, the benefits of septoplasty have long been debated amongst healthcare professionals and policy makers. In England, the discussion even led to restrictions for the reimbursement of this procedure.

Radboud university medical center researchers Machteld van Egmond, Maroeska Rovers, and Niels van Heerbeek are the first to compare the outcomes of two groups of patients following different treatment strategies: septoplasty and non-surgical management, that is medication or watchful waiting. Machteld van Egmond, lead researcher for the project: "This procedure has been performed for many years, but in previous studies only surgical patients were assessed. Without a control group, you never know whether an observed improvement is actually the result of the intervention, or due to other factors, such as the natural course of the condition."

The researchers investigated the effect of septoplasty in more than 200 adults with nasal obstruction and a deviated septum. The study was performed in two academic medical centers and 16 secondary referral hospitals in the Netherlands. Half of the patients underwent septoplasty (surgical group), and the other half non-surgical management (control group). The researchers then measured the effects of both treatment strategies on patients' quality of life and airflow through the nose. These measurements were performed at 3, 6, 12, and 24 months after the start of the treatment.

The quality of life of the surgical group had already improved at three months. They experienced fewer limitations in daily life due to nasal problems. Nasal airflow also increased after the operation. This positive effect was still present after two years. The patients had fewer nasal symptoms, fewer colds, breathed more easily, and slept better. All the effects were largest at six months after the operation, but persisted until the end of the study.

Credit: 
Radboud University Medical Center

Study: How arousal impacts physiological synchrony in relationships

A team of researchers led by a member of the Colorado School of Public Health faculty at the Anschutz Medical Campus examined what type of social interaction is required for people to display physiological synchrony--mutual changes in autonomic nervous system activity. The study also looked at whether the levels of autonomic arousal people share predicts affiliation and friendship interest between people.

The findings are published in Scientific Reports.

"In a variety of situations, people appear more social with one another when their autonomic nervous systems are in sync. However, this is the first study to show that, although people display physiological synchrony across social contexts, how much arousal people share can vary, differentially impacting social outcomes like perceived similarity and friendship interest," said Chad Danyluck, PhD, postdoctoral fellow at the Colorado School of Public Health.

Danyluck adds, "Physiological synchrony has been found in a variety of relationships and environments, from married couples arguing to military units and sports teams coordinating their behaviors. Understanding whether and how shared arousal brings people together may help us hone the development of programs targeting team leadership and social cohesion in work environments. I am particularly interested in the role of physiological synchrony in fostering friendship interest across ethnic and racial divides."

Consistent with prior work, this study observed physiological synchrony in both branches of the autonomic nervous system and across cooperative and competitive social contexts. The authors' general interest in physiological synchrony is in how synchrony during social interactions relates to social processes that ultimately lead to friendship.

The study found that different social contexts caused different levels of physiological arousal, meaning that the branches of the autonomic nervous system became either more or less reactive in response to the experimental task. However, in every condition, strangers quickly went "in sync" and did so in each branch of the autonomic nervous system whether they were high or low in arousal. Whether the social or physiological context of synchrony contributed to social outcomes, however, depended on which branch of the autonomic nervous system displayed synchrony.

The findings show that sharing similar amounts of sympathetic arousal was sufficient to increase perceptions of similarity--a precursor to friendship--regardless of social context and no matter the arousal levels partners shared. One possible explanation for this finding is that patterns of sympathetic arousal may correlate with observable body movements (and by extension a lack of arousal may correlate with a lack of body movement) that might predict perceived similarity if shared among partners. By comparison, people for whom parasympathetic synchrony and parasympathetic reactivity was high generally reported more friendship interest when the social context permitted conversation than when it did not. In other words, when parasympathetic activity increased during a social interaction, parasympathetic synchrony only mattered for the development of friendship between strangers who could converse.

Using data from 134 strangers interacting in pairs, the researchers manipulated two features of social context to test their impact on synchrony in sympathetic and parasympathetic reactivity. Participants completed a knot-tying task within a collective reward ("cooperation") or individual reward ("competition") framework while conversing or not ("talking" condition). Autonomic reactivity varied by features of social context. Synchrony occurred across social contexts in both autonomic branches. The researchers then examined how synchrony predicted affiliation.

Credit: 
University of Colorado Anschutz Medical Campus

Brain anatomy links cognitive and perceptual symptoms in autism

image: Gray matter volume and neuronal density in the right posterior superior parietal lobule correlated with higher percept stability (reds) and more rigid cognition (blues). The overlapping region is in yellow.

Image: 
RIKEN

Neuroscientists at the RIKEN Center for Brain Science (CBS) and University College London have found an anatomical link between cognitive and perceptual symptoms in autism. Published in the Journal of Neuroscience, the study identified a posterior region of the brain whose size--amount of gray matter--is related to both cognitive rigidity and overly stable visual perception, two symptoms of autism that until now were only conceptually related.

Mental inflexibility is a hallmark symptom of autism spectrum disorder (ASD). This is best seen in restricted and repetitive behaviors, which are required for an ASD diagnosis. These behaviors can range from stereotyped and repetitive movements to repetitive thoughts. Perception in people with ASD can also be less flexible than in others. This can be understood by considering a line drawing of a transparent cube (called a Necker cube, see Figure). When looking at this drawing, the 3-D structure of the cube seems to spontaneously invert; the front becomes the back and then becomes the front again. This type of perceptual switching is called "bistable" perception. In the case of autism, however, perception is often overly stable, and does not switch back and forth as often as it does in others.

The team of researchers sought to find a physical neuroanatomical link between these two characteristics of autism. They recruited people with and without ASD to perform two simple computer-based tests and an MRI scan. The first computer test assessed perceptual stability. Participants viewed a bistable image in which the front and back of a cylindrical shape switch back and forth. The second test evaluated cognitive rigidity, and was designed specifically for this study. Participants were shown shapes on a display and asked to choose a rule to follow: select the brightest shape or a specific shape. The researchers counted the numbers of times each participant reported a switch in perception during the first test and the number of times they spontaneously switched rules in the second test. These measures allowed the researchers to quantify perceptual stability and cognitive rigidity for each participant.

As expected, they found that perception of the bistable image switched much less frequently in people with ASD than in the control participants. They also found that people with ASD repeated the same rule choice--brightness or shape--for longer periods of time before switching rules. A control switching test in which participants were told to switch rules did not differ between groups, meaning that switching rules was not difficult for those with ASD, but that when acting freely, they chose to switch less often than the other participants.

The results from the rule-switching task were particularly encouraging. As first author Takamitsu Watanabe from RIKEN CBS explains, "cognitive rigidity in high-functioning autism is known to be difficult to detect and quantify in conventional psychological paradigms. Here, we overcame this issue with a new spontaneous task-switching test." With these results, the team was confident that their tests were good measures of perceptual stability and cognitive rigidity.

The team then took these individual scores and tested whether they correlated with the brain anatomy seen on the MRI scans. They found that one part of the brain in particular was related to both perceptual stability and cognitive rigidity. Lower density of neurons in the posterior superior parietal lobule was associated with both less frequent perceptual switching and less frequent rule switching, and was also associated with the severity of the participant's restricted and repetitive behaviors.

"We think that the posterior superior parietal lobule is the neural basis for both overly stable perception and cognitive inflexibility, two seemingly different symptoms in autism," says Watanabe. "Knowing the importance of this brain region, we can now work to identify how it produces its effects and test whether manipulating its neural activity can mitigate these ASD symptoms."

Credit: 
RIKEN

Researchers lay out plan for managing rivers for climate change

image: Researchers use thermistors in a PVC housing to monitor stream temperatures.

Image: 
Kevin Bladon

CORVALLIS, Ore. - New strategies for river management are needed to maintain water supplies and avoid big crashes in populations of aquatic life, researchers argue in a perspective piece published today in Nature.

The scientists say a fresh approach is necessary as the climate warms, which has led to historic die-offs like the January 2019 event in the Murray-Darling Basin of Australia that saw severe water shortages bring hardship to residents and kill millions of fish.

"The world's rivers are facing tough times," said the perspective's lead author, Jonathan Tonkin, who just completed a post-doctoral appointment in Oregon State University's College of Science. "Iconic species like the Murray cod, the largest freshwater fish in Australia, are in danger of vanishing. In a 2018 heat wave in Germany and Switzerland, thousands of fish died. The multiyear drought in California has restricted water supplies and wreaked havoc on wetlands, riparian forests, fish and other aquatic life."

Tonkin and his co-authors outline a four-part plan for an "adaptive" approach to river management - moving beyond simply monitoring ecosystems to understanding the biological mechanisms at play.

"We need to develop forecasting tools that project how key species, life stages and ecosystems respond to environmental changes," said co-author David Lytle, professor of integrative biology in the OSU College of Science. "We can't just track things like species diversity and population abundance and compare them to historical averages - often by the time negative trends are detected, it's too late to turn them around."

The answer, the authors assert, is developing "process-based" models that can track and predict how ecosystems change when conditions - like smaller river flows - change. The models can be tailored to life stages of populations, whole communities of species and sequences of events, enabling tipping points to be identified.
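A process-based model in this spirit can be very small: next year's abundance is this year's survivors plus recruitment that only occurs when river flow crosses a biological threshold (a flood cue for spawning, say). Everything below, species dynamics and parameter values alike, is an illustrative sketch, not the authors' model:

```python
# Minimal "process-based" population model: constant yearly survival plus
# flow-dependent recruitment. All parameters are illustrative assumptions.
def project_population(n0, flows, survival=0.7, flow_threshold=50.0, recruits=120.0):
    """Project abundance year by year; recruitment happens only in years
    when flow meets the threshold (e.g., a flood cue for spawning)."""
    n = n0
    trajectory = [n]
    for flow in flows:
        n = survival * n + (recruits if flow >= flow_threshold else 0.0)
        trajectory.append(n)
    return trajectory

wet_years = project_population(300, [80, 75, 90, 85, 70])
dry_years = project_population(300, [40, 35, 30, 45, 20])
print(round(wet_years[-1], 1), round(dry_years[-1], 1))
```

Running the same mechanism under wet and dry flow sequences makes the tipping-point logic visible: with no years above the recruitment threshold, abundance decays geometrically toward collapse, which is exactly the kind of trajectory a purely retrospective comparison to historical averages would flag too late.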

"For example, a drier future with fewer and smaller floods has been projected to reorganize and simplify the interactions between riparian plant species in the dryland river systems of the American Southwest," said Tonkin, now at the University of Canterbury in New Zealand. "The projected changes could reduce communities' resistance to climate change and ability to ward off invasions by non-native species. Knowing that, managers can intervene before a problem takes hold."

Building effective models entails a long-term funding commitment to gather more data on the basic biology of riverine species, which means years of field monitoring. Until that data store is built up, keeping the models simpler and finding ways to make connections across gaps in the data can help.

"Species with similar life histories or characteristics possibly respond in similar ways to changing river conditions, so it's possible that studies of one species could inform models and management elsewhere," Tonkin said.

But ultimately the construction of the best models requires more information, which is why the authors argue that data collection is the top next step for river scientists and managers.

The other three steps are:

Describe key processes in models. Scientists need to better articulate the relationships between ecosystem attributes and water flow patterns.

Focus management on bottlenecks. Intervene in ways that keep populations from crashing during extreme flows while focusing on the most vulnerable life stages, not just population abundance.

Be clear about uncertainties. Quantify the level of trust that can be placed in models' predictions, and update models regularly as new data become available.

"Freshwater biodiversity is disappearing," Tonkin said. "Climate change is magnifying the pressures on river ecosystems brought on by urbanization, invasive species and pollution. As the crisis worsens, we need to change how we study, model and manage rivers to safeguard the services they provide to humanity and all of the planet."

Credit: 
Oregon State University