
'Good' bacteria may prevent -- and reverse -- food allergy

BOSTON - June 24, 2019 -- A study by scientists at Boston Children's Hospital and Brigham and Women's Hospital, published today in Nature Medicine, makes a strong case that the national epidemic of food allergy is caused by the absence of certain beneficial bacteria in the human gut. "The loss of these bacteria acts as a switch that makes children susceptible to food allergy," says Talal Chatila, MD, director of the Food Allergy Program at Boston Children's and a senior author on the paper.

But the study, conducted primarily in mice, also points the way toward treatments that may protect children from developing food allergies -- and reverse the disease in people who already have it. "We're hoping this will lead to a treatment for food allergy, not just a preventative approach," says co-senior author Rima Rachid, MD, assistant director of the Food Allergy Program in Boston Children's Division of Immunology.

The study, which also tested human gut bacteria, was carried out by Azza Abdel-Gadir, PhD, a former postdoc, and Emmanuel Stephen-Victor, PhD, a current postdoc in Chatila's lab, both first co-authors on the paper, in collaboration with first co-author Georg Gerber, MD, PhD, and senior co-author Lynn Bry, MD, PhD, both of Brigham and Women's Hospital.

For reasons that remain a mystery, the number of Americans who suffer from food allergy has risen sharply over the last decade to as many as 32 million, according to one recent estimate. Nearly 8 percent of children in the U.S. -- about two in every classroom -- are affected.

One hypothesis is that certain Western lifestyle factors -- an increase in births by Caesarean section, a decline in breastfeeding, increased use of antibiotics and smaller family sizes, for example -- are disrupting the normal microbial balance in the gut, depriving babies of the "good" bacteria that prepare the immune system to recognize food as harmless.

Rachid began testing this hypothesis by studying gut bacteria in babies with and without food allergies. Her team collected stool samples from 56 food-allergic patients and 98 matched controls. Gerber and his colleagues at Brigham and Women's Hospital analyzed those samples for changes in bacterial content. The work revealed that the bacteria in the feces of babies with food allergies were different from those of controls. But did those bacterial differences play a role in their food allergies?

To find out, the team transplanted fecal bacteria from the babies into a special strain of allergy-prone mice. They fed the mice small doses of chicken egg protein to sensitize their immune systems to this allergen, then challenged the mice with a large dose.

The results: Mice that had been given fecal bacteria from food-allergic babies went into the life-threatening reaction called anaphylaxis. "The fecal bacteria from food-allergic subjects did not protect against food allergy, whereas the bacteria from control subjects did," Chatila says.

To find out which bacteria might be offering that protection, the team turned to Bry at Brigham and Women's. Bry provided a mix of six bacterial species from the order Clostridiales, which previous studies had suggested might protect against food allergy. When these bacteria were given to the mice, the animals were protected from food allergy to chicken egg protein, whereas mice given other common bacteria were not. "If you give them the right bacteria, the Clostridia, they're completely resistant to food allergy," Chatila says.

Bry then provided a second mix of unrelated bacteria from the order Bacteroidales. It too was protective. And finally, when the team treated mice that already had food allergy with the Clostridiales or Bacteroidales mixes, they found those therapies completely suppressed the animals' allergic reactions.

Chatila believes the study proves that the loss of protective gut bacteria is a critical factor in food allergy. "At the very least it is a fundamental mechanism. And more likely, in my mind, it is the fundamental mechanism on which other things can be layered," he says.

While previous studies have suggested that certain bacteria can protect against food allergies, Chatila and his colleagues go a step further, describing the specific immunological pathway by which the bacteria act in mice. It begins with a protein, known as MyD88, that serves as a "microbial sensor" in the immune system's regulatory T cells.

"You need the bacteria to give particular signals that are picked up by nascent regulatory T cells in the gut," Chatila explains. Those signals trigger a chain reaction that changes the gut regulatory T cells into a specific type, known as ROR-gamma regulatory T cells, that protect against food allergies. As a result of this work, Chatila says, "we now have a fundamental concept of how food allergy happens" -- a theory he hopes other scientists will now test.

Chatila and Rachid believe their findings will eventually lead to new treatments that prevent the development of food allergies in newborns at risk. The treatments might take the form of probiotics -- mixes of beneficial bacteria -- or drugs that prime the immune system in the same way.

And for the millions who already suffer from food allergies, the same treatments may be able to reverse their disease. "Remember," Chatila says, "in adult mice that had become food-allergic, we could suppress their disease by introducing the good bacteria, which means to us there is the potential to treat somebody with established food allergy and reset their immune system in favor of tolerance."

Ultimately, Chatila cautions, the promising results in mice will have to be duplicated in humans. But that may happen soon. Rachid is already conducting a first-of-its-kind clinical trial at Boston Children's to test the safety and efficacy of fecal transplants in adults with peanut allergy. And Chatila notes that several companies are already preparing bacterial mixes for clinical trials. "If the race continues with the same intensity, or accelerates, I think you'll see a product on the market within five years," he predicts.

Credit: 
Boston Children's Hospital

Popular strategy for raising pregnancy rates in IVF fails to deliver improvement in large trial

Vienna, 24 June 2019: The increasingly popular trend for fertility clinics to freeze all IVF embryos for later transfer has been shown in a large multicentre randomised trial to offer no improvement in delivery rates over traditional 'fresh' embryo transfers. 'Our findings give no support to a general freeze-all strategy in normally menstruating women,' said investigator Dr Sacha Stormlund from Copenhagen University Hospital in Denmark, who presents the results today at the 35th Annual Meeting of ESHRE.

'The results of this trial were as we expected,' said Dr Stormlund, 'namely, to see similar pregnancy rates between the fresh and freeze-all treatment groups. So I think it can now plausibly be said that there is no indication for a general freeze-all strategy in women with regular menstrual cycles who are not at immediate risk of overstimulation in IVF.'

Behind the study lies an increasingly adopted strategy in assisted reproduction to freeze all embryos generated in a first cycle and transfer them after thawing in a later cycle (rather than as fresh embryos in the initial cycle). Some recent reports from registries in Japan and the USA have suggested that there are now more frozen embryo transfers in IVF than fresh, with numbers still growing and many clinics claiming that outcomes can be improved with a freeze-all approach. However, despite the enthusiasm of clinics, studies investigating the freeze-all vs fresh question have so far produced inconsistent results.

This study, which aimed to test the claim of improved outcome in a general IVF patient population, was a large randomised trial involving 460 IVF patients at eight clinics in Denmark, Sweden and Spain and powered to provide a robust result. The patients were randomly assigned to test the two different treatment approaches, both of them with single blastocyst transfer: freeze-all with frozen embryo transfer in a subsequent cycle versus the control group with fresh transfer.

Results showed that the ongoing pregnancy rate per randomised patient after the first single blastocyst transfer was similar in the two groups: 26.1% in the freeze-all group and 28.8% in the fresh transfer group, a statistically non-significant difference suggesting that a general freeze-all policy will bring no patient benefit in terms of pregnancy outcome. Live birth rates were also comparable.
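
As a rough illustration of how such a two-arm comparison is assessed (this is not the trial's own statistical analysis, and the arm sizes below are assumed to be roughly equal at 230 patients each out of the 460 randomised), a simple two-proportion test on the reported rates gives a p-value far above the usual 0.05 threshold:

```python
# Illustrative only: arm sizes are assumed (~230 per arm of 460 randomised patients).
from scipy.stats import chi2_contingency

n_freeze, n_fresh = 230, 230
preg_freeze = round(0.261 * n_freeze)   # ~60 ongoing pregnancies in the freeze-all arm
preg_fresh = round(0.288 * n_fresh)     # ~66 ongoing pregnancies in the fresh-transfer arm

table = [[preg_freeze, n_freeze - preg_freeze],
         [preg_fresh, n_fresh - preg_fresh]]
chi2, p, dof, expected = chi2_contingency(table)
print(f"p-value: {p:.2f}")              # well above 0.05, consistent with 'no significant difference'
```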

'I think we can now reasonably say, based on our results and those from other recent trials, that in normally ovulating patients there is no apparent benefit from a freeze-all strategy in IVF,' said Dr Stormlund. 'However, the evidence derived from another large trial in women with polycystic ovary syndrome and at risk of responding excessively to stimulation suggests a considerable freeze-all benefit both in terms of live birth and ovarian hyperstimulation syndrome.'

This latter component of safety was an additional secondary endpoint to the study. In the freeze-all cycles ovulation of the mature eggs was triggered with a reproductive hormone-suppressing drug known as a GnRH agonist; in the conventional fresh transfer group ovulation was triggered traditionally with hCG. And in this part of the study there was an apparent patient benefit from the freeze-all approach in terms of patient safety. Twenty-five of the 220 patients in the fresh transfer group were judged at risk of ovarian hyperstimulation syndrome (OHSS) towards the end of their treatment cycle and, to remove the risk, they were switched to the freeze-all group.

'The occurrence of OHSS was not systematically assessed in women allocated to the freeze-all group,' explained Dr Stormlund, 'as any risk in this group was practically eliminated. Indeed, there were no signs of OHSS in the freeze-all group. The 25 patients judged at high risk of OHSS in the fresh transfer group were switched from the fresh to the freeze-all group as a safety measure and according to trial protocol.'

The latest European figures from ESHRE put the reported incidence of OHSS at 0.3%, making it the most common complication of IVF. 'However,' said Dr Stormlund, 'in a recent study from our group in Denmark, we found a considerably higher rate of severe OHSS of 5.1%. That's why in this study we aimed to reduce the risk of OHSS with a freeze-all plan with GnRH trigger and a cancellation policy in the fresh embryo transfer group for women at high risk of OHSS.'

However, Dr Stormlund insisted that there is still no reason to recommend the freeze-all strategy for patients not at immediate risk of OHSS. Only in patients with anovulatory infertility should an immediate freeze-all strategy be considered. 'At present,' she said, 'there is only sufficient evidence to recommend freeze-all for patients diagnosed with polycystic ovary syndrome. Otherwise, in women with regular cycles, we see no benefit.'

Credit: 
European Society of Human Reproduction and Embryology

Calibration method improves scientific research performed with smartphone cameras

WASHINGTON -- Although smartphones and other consumer cameras are increasingly used for scientific applications, it's difficult to compare and combine data from different devices. A new easy-to-use standardized method makes it possible for almost anyone to calibrate these cameras without any specialized equipment, helping amateurs, science students and professional scientists to acquire useful data with any consumer camera.

"The low cost of consumer cameras makes them ideal for projects involving large-scale deployment, autonomous monitoring or citizen science," said Olivier Burggraaff, who led the research team from Leiden University in the Netherlands who developed the calibration method. "Our standardized calibration method will make it easier for anyone to use a consumer camera to do things like measure pollution by detecting aerosol particles in the air."

In The Optical Society (OSA) journal Optics Express, the multi-institutional group of researchers report their new standardized calibration method and database, called SPECTACLE (Standardized Photographic Equipment Calibration Technique And CataLoguE), which can be used for smartphones, digital single-lens reflex cameras and cameras aboard drones. The database allows users to upload calibration data from their cameras for others to use.

"SPECTACLE includes many do-it-yourself (DIY) methods, which we found provided results comparable to professional methods that require high-end laboratory equipment," said Burggraaff.

Improving citizen science

The standardized calibration method was developed in response to a need that arose as Burggraaff and his Leiden University colleagues were developing citizen science methods to measure optical water quality using a smartphone add-on called iSPEX (Spectropolarimeter for Planetary EXploration), which they originally developed to measure air pollution. This add-on allows a smartphone camera to measure extra optical information such as hyperspectral and polarimetric data. SPECTACLE and iSPEX are part of MONOCLE (Multiscale Observation Networks for Optical monitoring of Coastal waters, Lakes and Estuaries), a project funded by the European Commission aimed at creating sustainable solutions for measuring optical water quality.

"To use smartphone cameras to measure water quality we need to understand them well because each manufacturer and each device has its own characteristics," said Burggraaff. "SPECTACLE brings together many existing calibration methods and applies them for the first time to consumer cameras, which will make it much easier for other developers and for us to use these cameras for scientific purposes."

Although calibration methods for consumer cameras have been developed previously, these efforts were often hampered by a lack of access to the software or available information about the devices. For example, until recently it wasn't possible to access data straight from the camera sensor -- so-called RAW data -- or to control many camera settings such as focus or exposure. However, new versions of iOS and Android allow both.

"As part of SPECTACLE, we are developing a framework for both operating systems to make measurements using RAW data and process these on the phone, which simply was not possible a few years ago," said Burggraaff.

DIY vs. laboratory methods

To test the new calibration methods, the researchers compared them with established methods using several cameras. They found, for example, that the DIY method for measuring how the lens distributes light on the sensor, known as flat fielding, agreed to within 5 percent with results from the standard method, which requires an integrating sphere in a laboratory setup. The DIY method involved taping paper over the camera and acquiring images of the sun or a computer screen.
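
For readers who want to see what the correction itself looks like, the sketch below shows generic flat-field division in Python with NumPy. It is illustrative only, not the SPECTACLE code, and it assumes the RAW frames have already been loaded as floating-point arrays:

```python
import numpy as np

# Generic flat-field correction (illustrative, not the SPECTACLE implementation).
# `raw` is a science image; `flat_frames` is a stack of images of a uniform source,
# e.g. paper taped over the lens while pointed at the sun or a computer screen.
def flat_field_correct(raw: np.ndarray, flat_frames: np.ndarray) -> np.ndarray:
    flat = flat_frames.mean(axis=0)   # average the flat frames to suppress noise
    flat /= flat.mean()               # normalise so the overall signal level is preserved
    return raw / flat                 # divide out the lens's spatial response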

They also tested the spectral response curves of a smartphone camera with the iSPEX attached and were able to achieve results within 4 percent of the professional measurement method, which requires an expensive and difficult-to-operate monochromator. The calibration of a single camera can take half a day with a monochromator, but the DIY method required simply taking a single picture of a piece of printer paper in the sun.

"We tested a number of cameras and found interesting differences and similarities between them," said Burggraaff. "For example, the cameras' responses to different wavelengths of light, known as spectral response curves, were very similar among most cameras except for a few devices that showed differences that could influence how the cameras sense and reproduce colors, even when imaging the exact same scene."

The researchers plan to apply the SPECTACLE methodology to a much larger number of cameras to fill in the database and get a broader idea of camera properties. This will be done by the researchers as well as anyone who wants to upload their calibration data into the database. They are also continuing to develop the iSPEX smartphone add-on to improve its ability to acquire water and air pollution measurements. This involves advancing its physical design and the algorithms for retrieving scientific results from its data while using the SPECTACLE methods and database to combine data from different smartphones.

Credit: 
Optica

How Facebook and Google avoided FEC ad disclaimers during 2016 presidential election

image: Katherine Haenschen teaches courses on political communication and social media at Virginia Tech. Photo by Jason Jones.

Image: 
Virginia Tech

A cloak of mystery often shrouds the inner workings of technological giants, but sometimes clarity is in plain sight. A Virginia Tech research team recently uncovered conclusive details about the roles Facebook, Google, and the Federal Election Commission played in digital advertising around the U.S. presidential election of 2016.

WATCH VIDEO: https://youtu.be/hGkCnLgBWUU

Katherine Haenschen, an assistant professor of communication in the College of Liberal Arts and Human Sciences, and Jordan Wolf, a 2018 graduate of Virginia Tech's master's program in communication, collaborated on the first academic research study to look specifically at how Facebook and Google brought the Federal Election Commission's efforts to regulate digital political advertising to a deadlock.

Haenschen and Wolf wanted to know what motivated Facebook and Google to seek disclaimer exemptions from the Federal Election Commission and why the independent regulatory agency failed to regulate digital advertisements leading up to the 2016 election. Their study -- recently published by Telecommunications Policy, the International Journal of Digital Economy, Data Sciences and New Media -- explored how the two platforms avoided disclosing who paid for advertisements related to the election.

The research team analyzed digitized versions of primary-source documents on the Federal Election Commission's website to understand how Facebook, Google, and the commission perceived the need for online advertising disclaimers before the election. The authors searched through advisory opinions, which are official commission responses to questions about the application of federal campaign finance law to specific situations. The team identified three advisory opinions comprising 114 documents relevant to their study.

The analysis by Haenschen and Wolf uncovered persistent themes. The platforms showed, for example, both a desire to maximize profit and a leaning toward technological constraints as an excuse for noncompliance -- or a lack of willingness to change advertisement sizes to accommodate disclaimers. The authors also noted two other themes: the potential for digital ads to deceive the public and the use of digital tools to win elections.

The researchers then investigated the documents from each of the advisory opinions to determine which themes dominated in terms of the platforms and the commission's response to them. The team found that in 2010, the commission had its first opportunity to address disclaimers in digital advertising when Google requested an advisory opinion. Despite requirements developed in the 1970s that called for the reporting to the commission of paid political advertising expenditures for print, television, and radio, along with disclaimers on political advertisements identifying who paid for them, the commission allows for two exemptions.

"The first exception is a small-items exemption for items -- such as lapel pins and bumper stickers -- that don't have room for a disclaimer," Haenschen said. "The other is the impracticable exemption for media in which using a disclaimer makes little sense. This is basically limited to skywriting, water-tower signage, and apparel."

Marc E. Elias, an election lawyer, represented Google in its request for a small-items exemption for text-only search ads. In addition, Elias wanted to know whether linking the advertisements to the sponsor's website, where the disclosure would appear, would be a sufficient alternative.

"A thematic analysis of documents from Facebook and Google reveal that both platforms were primarily motivated by profit to seek an exemption, basing their need on their business model of selling character- and size-limited ads, rather than any inherent technological limitations of the medium," Haenschen and Wolf concluded in their paper. The authors noted that both platforms neglected to argue that the telecommunications industry does not regulate ad sizes in terms of character-based limits.

The outcome was that the Federal Election Commission voted to issue a narrow advisory opinion confirming that Google's practice of including a link to a website containing a full disclaimer satisfied the commission's requirements. However, a split along partisan lines began to surface: the three Republicans on the commission supported an impracticable exemption, while two Democrats and an Independent found no technological justification for one. The advisory opinion left other advertisers with minimal guidance on how to comply with disclaimer requirements for digital political advertising.

Facebook, also represented by Elias, subsequently sought both a small-items exemption and an impracticable exemption for its digital political ads. Drawing on the Google experience, Facebook did not propose any alternative means of providing the disclaimers. This only widened the partisan rift within the commission, which became deadlocked. Its members issued no advisory opinion, and the advertisements ultimately went unregulated.

Haenschen and Wolf's research led them to conclude that the digital platforms manipulated the Federal Election Commission's system to make greater profits from political advertising.

Where, the researchers asked, does this leave the American public in 2019? In 2010, they noted, there appeared to be little interest when the commission announced notices of potential rulemaking on internet ad disclaimers. Only 14 comments were logged. In 2017, however, the commission received 149,772 comments to its updated public notices.

In a turnaround, the platforms are working on their own solutions. Facebook now requires a disclaimer on all ads with political content, verifies the identity and physical address of the payer, and claims to release all ads to the public in an archive. Google has put similar practices in place. But Google attorney Elias continues to lobby against requiring platforms to include disclaimers on digital political ads.

Credit: 
Virginia Tech

Widely available antibiotics could be used in the treatment of 'superbug' MRSA

image: Scanning electron micrograph of mouse intestine infected with Staphylococcus aureus.

Image: 
David Goulding (Wellcome Sanger Institute)

Some MRSA infections could be tackled using widely-available antibiotics, suggests new research from an international collaboration led by scientists at the University of Cambridge and the Wellcome Sanger Institute.

Since the discovery of penicillin, the introduction of antibiotics to treat infections has revolutionised medicine and healthcare, saving millions of lives. However, widespread use (and misuse) of the drugs has led some bacteria to develop resistance, making the medicines less effective. With few new antibiotics in development, antibiotic resistance is widely considered a serious threat to the future of modern medicine, raising the spectre of untreatable infections.

One of the most widely used and clinically important groups of antibiotics is the family that includes penicillin and penicillin derivatives. The first type of penicillin resistance occurred when bacteria acquired an enzyme, known as a beta-lactamase, which destroys penicillin. To overcome this, drug manufacturers developed new derivatives of penicillin, such as methicillin, which were resistant to beta-lactamase.

In the escalating arms race, one particular type of bacteria known as Methicillin-resistant Staphylococcus aureus - MRSA - has developed widespread resistance to this class of drugs. MRSA has become a serious problem in hospital- and community-acquired infections, forcing doctors to turn to alternative antibiotics, or a cocktail of different drugs which are often less effective, and raising concerns that even these drugs will in time become ineffective.

In previous research, a team of researchers in Cambridge identified an isolate of MRSA (a sample grown in culture from a patient's infection) that showed susceptibility to penicillin in combination with clavulanic acid. Clavulanic acid is a beta-lactamase inhibitor, which prevents the beta-lactamase enzyme destroying penicillin; it is already used as a medicine to treat kidney infections during pregnancy.

In a study published today in Nature Microbiology, a team of scientists from the UK, Denmark, Germany, Portugal, and the USA used genome sequencing technology to identify which genes make MRSA susceptible to this combination of drugs. They identified a number of mutations (changes in the DNA sequence) centred around a protein known as penicillin-binding protein 2a, or PBP2a.

PBP2a is crucial to MRSA strains as it enables them to keep growing in the presence of penicillin and other antibiotics derived from penicillin. Two of these mutations reduced PBP2a expression (the amount of PBP2a produced), while two other mutations increased the ability of penicillin to bind to PBP2a in the presence of clavulanic acid. Overall the effect of these mutations means that a combination of penicillin and clavulanic acid could overcome the resistance to penicillin in a proportion of MRSA strains.

The team then looked at whole genome sequences of a diverse collection of MRSA strains and found that a significant number of strains - including the USA300 clone, the dominant strain in the United States - contained both mutations that confer susceptibility. This means that infections caused by one of the most widespread strains of MRSA could be treatable with a combination of drugs already licensed for use.

Using this knowledge, the researchers used a combination of the two drugs to successfully treat MRSA infections in moth larvae and then mice. Their next step will be to conduct the further experimental work required for a clinical trial in humans.

Dr Mark Holmes from the Department of Veterinary Medicine at the University of Cambridge, a senior author of the study, says: "MRSA and other antibiotic-resistant infections are a major threat to modern medicine and we urgently need to find new ways to tackle them. Developing new medicines is extremely important, but can be a lengthy and expensive process. Our work suggests that already widely-available medicines could be used to treat one of the world's major strains of MRSA."

First author Dr Ewan Harrison, from the Wellcome Sanger Institute and the University of Cambridge, adds: "This study highlights the importance of genomic surveillance - collecting and sequencing representative collections of bacterial strains. By combining the DNA sequencing data generated by genomic surveillance with laboratory testing of the strains against a broad selection of antibiotics, we may find other unexpected chinks in the armour of antibiotic-resistant bacteria that might give us new treatment options."

The research was funded by the Medical Research Council (MRC), Wellcome and the Department of Health.

Dr Jessica Boname, Head of Antimicrobial Resistance at the MRC, says: "This study demonstrates how a mechanistic understanding of resistance and access to clinical data can be used to find new ways to contain and control infections with existing resources."

Credit: 
University of Cambridge

Clouds dominate uncertainties in predicting future Greenland melt

image: The Greenland ice sheet at sunset - taken by the research team.

Image: 
University of Bristol

New research led by climate scientists from the University of Bristol suggests that the representation of clouds in climate models is as, or more, important than the amount of greenhouse gas emissions when it comes to projecting future Greenland ice sheet melt.

Recent research shows that the whole of the Greenland ice sheet could be gone within the next thousand years, raising global sea level by more than seven metres.

However, most of the predictions about the future of the Greenland ice sheet focus on the impact of different greenhouse gas emission scenarios on its evolution and sea level commitment.

New research published today in the journal Nature Climate Change shows that in a warming world, cloud microphysics play as important a role as greenhouse gases and, for high emission scenarios, dominate the uncertainties in projecting the future melting of the ice sheet.

The difference in potential melt caused by clouds mainly stems from their ability to control the longwave radiation at the surface of the ice sheet.

They act like a blanket. The highest melt simulation has the thickest blanket (thickest clouds) with the strongest warming at the surface, which leads to twice as much melt.

Conversely, the lowest melt simulation has the thinnest blanket (thinnest clouds), which in turn leads to less longwave warming at the surface and less melt over Greenland.

The uncertainties in Greenland Ice Sheet melt due to clouds could, by the end of the 21st century, equate to 40,000 gigatons of extra ice melt. This is equivalent to 1,500 years of domestic water supply for the USA and 11 cm of global sea level rise.
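
As a back-of-the-envelope check (assuming an ocean surface area of roughly 3.6 x 10^14 square metres and that all the meltwater reaches the ocean), 40,000 gigatons does indeed correspond to about 11 cm:

$$\Delta h \approx \frac{4 \times 10^{16}\,\mathrm{kg}}{10^{3}\,\mathrm{kg\,m^{-3}} \times 3.6 \times 10^{14}\,\mathrm{m^{2}}} \approx 0.11\,\mathrm{m} \approx 11\,\mathrm{cm}.$$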

PhD student Stefan Hofer, from the University of Bristol's School of Geographical Sciences and a member of the Black and Bloom and Global Mass projects, is the lead author of the new study.

He said: "Until now we thought that differences in modelled projections of the future evolution of the Greenland Ice Sheet were mainly determined by the amount of our future greenhouse gas emissions.

"However, our study clearly shows that the uncertainties in our predictions of Greenland melt are equally dependent on how we represent clouds in those models.

"Until the end of the 21st century, clouds can increase or decrease the sea level rise coming from Greenland Ice Sheet by 11 cm."

The main message of the paper is that clouds are the principal source of uncertainties in modelling future Greenland melt and consequent sea level contribution.

Ten percent of the global population live in coastal areas threatened by global sea level rise. Therefore, constraining the uncertainties due to clouds in sea level rise predictions will be needed for more accurate mitigation plans.

Stefan Hofer added: "Observations of cloud properties in the Arctic are expensive and can be challenging.

"There are only a handful of long-term observations of cloud properties in the Arctic which makes it very challenging to constrain cloud properties in our climate models.

"The logical next step would be to increase the amount of long-term observations of cloud properties in the Arctic, which then can be used to improve our climate models and predictions of future sea level rise."

Credit: 
University of Bristol

An 'awe-full' state of mind can set you free

An induced feeling of awe, or state of wonder, may be the best strategy yet for alleviating the discomfort that comes from uncertain waiting.

Kate Sweeny's research explores the most excruciating form of waiting: the period during which one awaits uncertain news, the outcome of which is beyond one's control. It's waiting for news from a biopsy, or whether you aced -- or tanked -- the exam. That's distinguished from waiting periods such as when looking for a new job, when you have at least some control over the outcome.

Her research has found some clues for alleviating those difficult periods. Meditation helps, as does engaging in "flow" activities -- those that require complete focus, such as a video game.

"However, meditation is not for everyone, and it can be difficult to achieve a state of flow when worry is raging out of control," Sweeny and her team assert in their latest related research, published recently in The Journal of Positive Psychology.

Sweeny, a professor of psychology at UC Riverside, has discovered what may be the best strategy yet to alleviate that most uncomfortable purgatory of waiting: awe, defined in the research as a state of wonder, a transportive mindset brought on by, for example, beautiful music or a deeply affecting film.

The research drew from two studies, for a total of 729 participants. In the first test, participants took a faux intelligence assessment. In the second test, participants believed they were awaiting feedback on how other study participants perceived them.

In both cases, they watched one of three movies that inspired varying levels of awe. The first was an "awe induction" video: a high-definition recording of a sunrise set to instrumental music. The second was a positive control video meant to elicit happy feelings, but not awe; it showed cute animal couples. The third was a neutral video, in this case about how padlocks are made.

Researchers found that those exposed to the awe-induction video experienced significantly greater positive emotion and less anxiety during the period waiting for IQ test results and peer assessments.

"Our research shows that watching even a short video that makes you feel awe can make waiting easier, boosting positive emotions that can counteract stress in those moments," Sweeny said.

Sweeny said the research can be used to devise strategies for maximizing positive emotion and minimizing anxiety during the most taxing periods of waiting. Because the concept of awe has only recently received attention in psychology, the research also is the first to stress its beneficial effects during stressful waiting periods, opening new opportunities for study.

"Now that we know we can make people feel better through brief awe experiences while they're waiting in the lab, we can take this knowledge out into the real world to see if people feel less stressed when they watch "Planet Earth" or go to an observatory, for example, while they're suffering through a difficult waiting period," Sweeny said.

Credit: 
University of California - Riverside

SLAS Discovery announces its July feature article, '3D Cell-Based Assays for Drug Screens: Challenges in Imaging, Image Analysis, and High-Content Analysis'

image: In July's SLAS Discovery feature article, '3D Cell-Based Assays for Drug Screens: Challenges in Imaging, Image Analysis, and High-Content Analysis,' Tijmen H. Booij, Ph.D., Screening Specialist for NEXUS Personalized Health Technologies (Switzerland), discusses the switch from using 2D to 3D cell cultures in drug discovery to more accurately mimic human physiological conditions and improve the success rates of drugs in the early stages of preclinical drug discovery.

Image: 
David James Group

Oak Brook, IL - In July's SLAS Discovery feature article, "3D Cell-Based Assays for Drug Screens: Challenges in Imaging, Image Analysis, and High-Content Analysis," Tijmen H. Booij, Ph.D., Screening Specialist for NEXUS Personalized Health Technologies (Switzerland), discusses the switch from using 2D to 3D cell cultures in drug discovery to more accurately mimic human physiological conditions and improve the success rates of drugs in the early stages of preclinical drug discovery.

In traditional target-based discovery, drugs are designed to focus on a single molecular target, whereas phenotypic drug discovery screens cellular systems. The latter strategy could be used as an alternative to pre-select drugs for the clinic, allowing for drug discovery at a target-agnostic level. This approach, however, is mainly used with 2D cell cultures that do not fully represent human physiology and has historically been shown to prioritize the wrong drugs. To overcome this limitation, 3D cell cultures have been developed that may more accurately mimic physiological conditions.

In his review, Dr. Booij details the switch from 2D to 3D cell cultures and the challenges for high-throughput screening and high-content analysis. Using high-content analysis with 3D cell cultures in high-throughput screens has long been difficult due to technological limitations, and screens with 3D cell cultures have mostly used simple readouts that limit the amount of information that can be extracted. In addition, there is an enormous range of 3D cell culture techniques, which has large consequences for the ease of collecting quality data.

The introduction of more relevant cell models in early preclinical drug discovery, combined with high-content imaging and automated analysis, is expected to increase the quality of compounds progressing to preclinical stages in the drug development pipeline. Overcoming these challenges will enable front-loading the drug discovery pipeline with better biology, extracting the most from that biology and improving translation between in vitro and in vivo models. This is expected to reduce the proportion of compounds that fail in vivo testing due to a lack of efficacy or to toxicity.

Access to July's SLAS Discovery special issue is available at https://journals.sagepub.com/toc/jbxb/24/6 through August 20. For more information about SLAS and its journals, visit http://www.slas.org/journals.

Credit: 
SLAS (Society for Laboratory Automation and Screening)

A wearable vibration sensor for accurate voice recognition

video: Voice authentication and voice remote control system

Image: 
POSTECH

A voice-recognition feature can easily be found on mobile phones these days. Oftentimes, a speech recognition application activates unintentionally in the middle of a meeting or a conversation in the office. At other times, it does not activate at all, no matter how many times we call out to it. This is because a mobile phone recognizes voice with a microphone that detects sound pressure, which is easily affected by surrounding noise and other obstacles.

Professor Kilwon Cho of Chemical Engineering and Professor Yoonyoung Chung of Electronic and Electric Engineering at POSTECH successfully developed a flexible and wearable vibration-responsive sensor. When this sensor is attached to the neck, it can precisely recognize voice through the vibration of the neck skin and is not affected by ambient noise or the volume of sound.

Conventional vibration sensors recognize voice through air vibration, and their sensitivity decreases due to mechanical resonance and damping effects, so they are not capable of measuring voice quantitatively. As a result, ambient sound or obstacles such as a mouth mask can affect the accuracy of voice recognition, and such sensors cannot be used for security authentication.

In this study, the research group demonstrated that voice pressure is proportional to the acceleration of neck-skin vibration at sound pressure levels from 40 to 70 dBSPL, and they developed a vibration sensor utilizing the acceleration of skin vibration. The device, which consists of an ultrathin polymer film and a diaphragm with tiny holes, can sense voice quantitatively by measuring the acceleration of skin vibration.
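
For context, sound pressure level in dBSPL is defined relative to a reference pressure of 20 micropascals,

$$L_p = 20\log_{10}\!\left(\frac{p}{p_{0}}\right)\,\mathrm{dB},\qquad p_{0} = 20\,\mu\mathrm{Pa},$$

so the tested 40 to 70 dBSPL range corresponds to sound pressures of roughly 2 to 63 millipascals.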

They also demonstrated that the device can accurately recognize voice without vibrational distortion, even in a noisy environment, at very low voice volume and with a mouth mask worn.

This research can be further extended to various voice-recognition applications such as electronic skin, human-machine interfaces and wearable vocal healthcare monitoring devices.

Professor Kilwon Cho explained the significance of the study: "This research is very meaningful in that it developed a new voice-recognition system which can quantitatively sense and analyze voice and is not affected by the surroundings. It is a step forward from conventional voice-recognition systems that could only recognize voice qualitatively."

Credit: 
Pohang University of Science & Technology (POSTECH)

How to bend waves to arrive at the right place

image: Waves do not always spread uniformly into all directions, but can form a remarkable 'branched flow'. At TU Wien (Vienna) a method has now been developed to control this phenomenon.

Image: 
TU Wien

In free space, the light wave of a laser beam propagates on a perfectly straight line. Under certain circumstances, however, the behavior of a wave can be much more complicated. In the presence of a disordered, irregular environment a very strange phenomenon occurs: An incoming wave splits into several paths, it branches in a complicated way, reaching some places with high intensity, while avoiding others almost completely.

This kind of "branched flow" was first observed in 2001. Scientists at TU Wien (Vienna) have now developed a method to exploit this effect. The core idea of this new approach is to send a wave signal exclusively along one single pre-selected branch, such that the wave is hardly noticeable anywhere else. The results have now been published in the journal PNAS.

From Quantum Particles to Tsunamis

"Originally, this effect was discovered when studying electrons moving as quantum waves through tiny microstructures," says Prof. Stefan Rotter from the Institute of Theoretical Physics at TU Wien. "Such structures, however, are never perfect and they always come with certain imperfections; and surprisingly, these imperfections cause the electron wave to split up into branches - an effect which is called branched flow."

Soon it turned out that this wave phenomenon does not only occur in quantum physics. In principle it can occur with all types of waves and on completely different length scales. If, for example, laser beams are sent into the surface of a soap bubble, they split into several partial beams, just like tsunami waves in the ocean: the latter do not spread regularly across the ocean, but instead travel in a complicated, branched pattern that depends on the random shape of the corrugated seabed. As a result, it can happen that a distant island is hit very hard by a tsunami, while a neighboring island is only reached by much weaker wave fronts.

"We wanted to know whether these waves can be manipulated in such a way that they only travel along one single selected branch, instead of propagating along a whole branched network of paths in completely different directions", says Andre Brandstötter (TU Wien), first author of the publication. "And as it turns out, it is indeed possible to target individual branches in a controlled way."

Analyze and Adapt

The new procedure takes only two steps: First, the wave is allowed to branch out on all possible paths as usual. At one of the locations that are reached with high intensity, the wave is measured in detail. The method developed at the TU Wien can then be used to calculate how the wave has to be shaped at the origin, so that in the second step it can be sent along one selected path, while avoiding all other paths.

"We used numerical simulations to show how to find a wave that behaves exactly the way we want it to. This approach can be applied using a variety of different methods," says Stefan Rotter. "You can implement it with light waves that are adjusted with special mirror systems or with sound waves that you generate with a system of coupled loudspeakers. Sonar waves in the ocean would also be a possible field of application. In any case, the necessary technologies are already available."

With this new method, all these different types of waves could be sent out along a single trajectory pre-selected from a complex network of paths. "This trajectory doesn't even have to be straight," explains Andre Brandstötter. "Many of the possible paths are curved - the irregularities of the surroundings act like a set of lenses by which the wave is focused and deflected again and again."

Even pulsed signals can be sent along these special paths, such that information can be transmitted in a targeted manner. This guarantees that a wave signal arrives exactly where it is supposed to be received; at other locations it can hardly be detected, which makes eavesdropping much more difficult.

Credit: 
Vienna University of Technology

Novel Chinese nanogenerator takes cue from electric eels

image: Underwater wireless multi-site human motion monitoring system based on BSNG.

Image: 
TAN Puchuan

Researchers from the Beijing Institute of Nanoenergy and Nanosystems and the University of Chinese Academy of Sciences have developed a bionic stretchable nanogenerator (BSNG) that takes inspiration from electric eels.

The scientists hope the new technology will meet the tough demands of wearable equipment applications for stretchability, deformability, biocompatibility, waterproofness and more.

BSNG, which uses technology that mimics the structure of ion channels on the cytomembrane of electric eels' electrocytes, has two broad applications: Besides providing a potential power source for wearable electronic devices underwater and on land, it can also be used for human motion monitoring due to its excellent flexibility and mechanical responsiveness.

The study was published online in Nature Communications on June 19.

BSNG is based on a mechanically sensitive bionic channel that relies on the stress mismatch between polydimethylsiloxane and silicone. Like its eel counterpart, BSNG can generate an open circuit voltage up to 10 V underwater. It can also generate an open circuit voltage up to 170 V under dry conditions.

BSNG's bionic structure and material ensure superior stretchability. For example, BSNG maintained stable output performance without attenuation after 50,000 uniaxial tensile tests (tensile rate of 50%).

To prove the practicability of the technology, the researchers built an underwater wireless motion monitoring system based on BSNG.

Through this system, the motion signals produced by different swimming strokes can be synchronously transmitted, displayed and recorded. As for the energy-harvesting application, the researchers demonstrated an underwater rescue scenario based on BSNG.

Wearable integrated BSNGs can collect mechanical energy from human motion and convert it into electrical energy to store in capacitors. In case of emergency, a rescue signal light can be lit remotely by tapping the alarm trigger on the chest.

Due to its excellent properties, BSNG holds great promise for use in electronic skin, soft robots, wearable electronic products and implantable medical devices.

Credit: 
Chinese Academy of Sciences Headquarters

Surrey researchers clear runway for tin based perovskite solar cells

Researchers at the University of Surrey believe their tin based perovskite solar cell could clear the runway for solar panel technology to take off and help the UK reach its 2050 carbon neutral goal.

As countries look to get to grips with climate change, solar cell technology is rapidly growing in popularity as an environmentally friendly energy alternative. Most commercial solar panels use silicon as the light absorber, which makes the panels rigid, heavy and costly.

Perovskites - a relatively new class of materials - are cheap and have proven to be more efficient at absorbing light than silicon. Unlike silicon, perovskites can be fabricated using solution-processable "inks" that allow production of efficient, thin (semi-transparent) and flexible solar panels using low cost materials, while also allowing cell fabrication through roll-to-roll printing. This technology allows for a wide variety of affordable solar panel options, from on-wall panels to window panes. Despite the excellent performance of perovskite solar cells, they do contain toxic lead as an ingredient - which has led environmentally conscious scientists to explore ways of reducing the toxicity of the technology while maintaining its high efficiency.

In a study published in the Journal of Materials Chemistry, researchers from Surrey's Advanced Technology Institute (ATI) detail how they have produced a solar cell that contains 50 percent less lead, replaced with the more innocuous tin. By fine-tuning their tin-based solar cell, the researchers were able to create a device that absorbs infrared light in a similar manner to silicon cells. They also found that stacking lead-only cells with the mixed lead-tin cells can deliver power conversion results that outperform those of silicon-only cells.

Indrachapa Bandara, lead author of the study and PhD student at ATI, said: "We are starting to see that many countries are treating the threat of climate change with the seriousness it deserves. If we are to get a handle on the problem and put the health of our planet on the right track, we need high-performing renewable energy solutions.

"Our study has shown that tin based perovskite solar cells have an incredible amount of potential and could help countries such as the United Kingdom reach its target of becoming carbon neutral by 2050."

Director of the ATI at the University of Surrey and corresponding author Professor Ravi Silva said: "Using solar panels will ultimately allow each of us to contribute to not just solving the energy crisis, but hugely reducing the impact of fossil fuels on climate change. Tin-based perovskite photovoltaics is an upcoming technology that promises major improvements to environmentally friendly and efficient solar panels at a low cost. Our new findings point researchers in the field to gaining higher efficiencies while reducing the toxic impact of the absorber materials."

Credit: 
University of Surrey

Play games with no latency

image: Figure 1. Overview of Geometric Compensation

Image: 
© KAIST

One of the most challenging issues for game players looks set to be resolved soon with the introduction of a zero-latency gaming environment. A KAIST team developed a technology that helps game players maintain zero-latency performance. The new technology transforms the geometry of game elements according to the amount of latency.

Latency in human-computer interactions is often caused by various factors related to the environment and performance of the devices, networks, and data processing. The term 'lag' is used to refer to any latency during gaming which impacts the user's performance.

Professor Byungjoo Lee at the Graduate School of Culture Technology, in collaboration with Aalto University in Finland, presented a mathematical model that predicts players' behavior by capturing the effects of latency on players. This cognitive model is capable of predicting a user's success rate when there is latency in a 'moving target selection' task, which requires button input in a time-constrained situation.

The model predicts the players' task success rate when latency is added to the gaming environment. Using these predicted success rates, the design elements of the game are geometrically modified to help players maintain similar success rates as they would achieve in a zero-latency environment. In fact, this research succeeded in modifying the pillar heights of the Flappy Bird game, allowing the players to maintain their gaming performance regardless of the added latency.
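
A minimal sketch of how such geometric compensation could work is shown below. The logistic success-rate model is a made-up stand-in, not the published cognitive model, and the gap parameter is a hypothetical design element analogous to the pillar gap in Flappy Bird:

```python
# Illustrative geometric compensation with a hypothetical success-rate model.
import math
from scipy.optimize import brentq

def success_rate(gap_px: float, latency_ms: float) -> float:
    # Made-up model: latency effectively shrinks the gap the player can exploit.
    effective_gap = gap_px - 0.5 * latency_ms
    return 1.0 / (1.0 + math.exp(-(effective_gap - 100.0) / 10.0))

def compensated_gap(gap_px: float, latency_ms: float) -> float:
    # Enlarge the gap until the predicted success rate matches the zero-latency value.
    target = success_rate(gap_px, 0.0)
    return brentq(lambda g: success_rate(g, latency_ms) - target,
                  gap_px, gap_px + 10.0 * latency_ms + 1.0)

print(compensated_gap(120.0, 80.0))   # a 120 px gap widens to about 160 px to offset 80 ms of lag
```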

Professor Lee said, "This technique is unique in the sense that it does not interfere with a player's gaming flow, unlike traditional methods which manipulate the game clock by the amount of latency. This study can be extended to various games such as reducing the size of obstacles in the latent computing environment."

Credit: 
The Korea Advanced Institute of Science and Technology (KAIST)

Music students do better in school than non-musical peers

High school students who take music courses score significantly better on math, science and English exams than their non-musical peers, according to a new study published in the Journal of Educational Psychology.

School administrators needing to trim budgets often look first to music courses, because the general belief is that students who devote time to music rather than math, science and English will underperform in those disciplines.

"Our research proved this belief wrong and found the more the students engage with music, the better they do in those subjects," said UBC education professor and the study's principal investigator, Peter Gouzouasis. "The students who learned to play a musical instrument in elementary and continued playing in high school not only score significantly higher, but were about one academic year ahead of their non-music peers with regard to their English, mathematics and science skills, as measured by their exam grades, regardless of their socioeconomic background, ethnicity, prior learning in mathematics and English, and gender."

Gouzouasis and his team examined data from all students in public schools in British Columbia who finished Grade 12 between 2012 and 2015. The data sample, made up of more than 112,000 students, included those who completed at least one standardized exam for math, science and English, and for whom the researchers had appropriate demographic information--including gender, ethnicity, neighbourhood socioeconomic status, and prior learning in numeracy and literacy skills. Students who studied at least one instrumental music course in the regular curriculum counted as students taking music. Qualifying music courses are courses that require previous instrumental music experience and include concert band, conservatory piano, orchestra, jazz band, concert choir and vocal jazz.
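
The phrase "regardless of their socioeconomic background, ethnicity, prior learning ... and gender" refers to covariate adjustment. The sketch below shows the general form of such an adjusted comparison; the column names and synthetic data are hypothetical and are not the study's dataset or model:

```python
# Illustrative covariate-adjusted comparison on synthetic data (not the study's analysis).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 1000
df = pd.DataFrame({
    "exam_score": rng.normal(70, 10, n),
    "music_course": rng.integers(0, 2, n),      # 1 = took instrumental music
    "gender": rng.choice(["F", "M"], n),
    "ses": rng.normal(0, 1, n),                 # neighbourhood socioeconomic status
    "prior_achievement": rng.normal(0, 1, n),
})
fit = smf.ols("exam_score ~ music_course + C(gender) + ses + prior_achievement", data=df).fit()
print(fit.params["music_course"])               # association with exam scores after adjustment
```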

The researchers found the predictive relationships between music education and academic achievement were more pronounced for those who took instrumental music rather than vocal music. The findings suggest skills learned in instrumental music transfer very broadly to the students' learning in school.

"Learning to play a musical instrument and playing in an ensemble is very demanding," said the study's co-investigator Martin Guhn, an assistant professor in UBC's school of population and public health. "A student has to learn to read music notation, develop eye-hand-mind coordination, develop keen listening skills, develop team skills for playing in an ensemble and develop discipline to practice. All those learning experiences, and more, play a role in enhancing the learner's cognitive capacities, executive functions, motivation to learn in school, and self-efficacy."

The researchers hope that their findings are brought to the attention of students, parents, teachers and administrative decision-makers in education, as many school districts over the years have emphasized numeracy and literacy at the cost of other areas of learning, particularly music.

"Often, resources for music education--including the hiring of trained, specialized music educators, and band and stringed instruments--are cut or not available in elementary and secondary schools so that they could focus on math, science and English," said Gouzouasis. "The irony is that music education--multiple years of high-quality instrumental learning and playing in a band or orchestra or singing in a choir at an advanced level--can be the very thing that improves all-around academic achievement and an ideal way to have students learn more holistically in schools."

Credit: 
University of British Columbia

Nutrition is the missing ingredient in home health today, new study shows

image: A new study shows that implementing a nutrition care plan for patients in home healthcare that included nutritional drinks reduced 90-day hospitalizations by 18%.

Image: 
Abbott

Real-world data from Advocate Health Care and Abbott shows prioritizing nutrition care for home health patients helped keep them out of the hospital

Improved health outcomes from nutrition intervention and educational support could help save millions of dollars in healthcare costs annually

New research from Advocate Health Care and Abbott found that prioritizing nutrition care* for home health patients at risk for malnutrition had a dramatic impact on helping keep them out of the hospital - resulting in millions of dollars in healthcare cost savings. Nearly 5 million Americans annually rely on home healthcare to recover from an illness, injury or hospitalization.1,† While healthcare providers are constantly striving to improve patients' health and minimize hospitalizations, nutrition is often not top of mind, yet it plays a critical role in helping adults bounce back and resume their normal routine.

In the first-of-its-kind study, published today in the Journal of Parenteral and Enteral Nutrition, more than 1,500 home health patients were followed for 90 days.2 The study found that when patients at risk for malnutrition received a comprehensive nutrition care program, including nutrition drinks, to aid in their recovery:

Risk of being hospitalized was significantly reduced by 24% in the first 30 days, nearly 23% after 60 days, and 18% after 90 days.
Healthcare costs were reduced by more than $2.3 million or about $1,500 per patient at risk for malnutrition treated over the course of 90 days.‡

''Our goal as a home healthcare provider is to help patients get back on their feet as quickly as possible and to keep them out of the hospital,'' said Katie Riley, R.N., vice president, post acute chief nursing officer for Advocate Aurora Health and the lead study author. ''While the primary reason people come to home health isn't because they're malnourished or at risk, we have found that when we do pay attention to their nutrition care, it helps promote their strength and prevents them from going back to the hospital, which ultimately reduces healthcare costs.''

A RECIPE FOR RECOVERY

As many as 1 in 3 home health patients are at risk of malnutrition, which can impact their recovery or cause further health issues.1,3 But malnutrition often goes unrecognized as it can be invisible to the eye and can occur in both underweight and overweight individuals. Therefore, more healthcare systems are starting to focus efforts on the identification and management of malnourished or at-risk patients through regular monitoring and follow up.

"It's clear that nutrition can be a simple, cost-effective tool to improve patient outcomes,'' said Suela Sulo, Ph.D., health outcomes researcher at Abbott and a study author. "Healthcare systems are driven to improve patient care while reducing costs. Our research shows that prioritizing nutrition across different settings of care - or from hospital to home - can significantly cut costs while improving patients' health."

While home health often helps jumpstart the road to recovery, it's even more effective when patients are given the necessary nutrition education and tools to take their health by the reins, even after they stop receiving visits from clinicians.

''Educating people on the benefits of proper nutritional care can empower them to continue thinking about their nutrition and drinking their supplements,'' said Gretchen VanDerBosch, R.D., a lead registered dietitian at Advocate Health Care and a study author. ''By maintaining proper nutrition, patients have greater strength, heal faster, have fewer falls and reduced readmissions.''

Credit: 
MediaSource