Hidden 'rock moisture' possible key to forest response to drought

image: Scientists have found underground water reservoirs that can mitigate drought effects.

Image: 
Wikimedia/USFS

A little-studied, underground layer of rock may provide a vital reservoir for trees, especially in times of drought, report scientists funded by the National Science Foundation (NSF) and affiliated with The University of Texas (UT) at Austin and the University of California, Berkeley.

The study, published today in the journal Proceedings of the National Academy of Sciences (PNAS), looked at the water stored inside the layer of weathered bedrock that lies under soils in mountain forest ecosystems.

This transitional zone beneath soils and above groundwater is often overlooked when it comes to studying hydrologic processes, but researchers found that the water contained in the fractures and pores of the rock could play an important role in the water cycle at local and global levels.

"There are significant hydrologic dynamics in weathered bedrock environments, but traditionally they are not investigated because they are hard to access," said lead PNAS author Daniella Rempe, a geoscientist at UT Austin. "Our study was designed to investigate this region."

Researchers found that water in bedrock can sustain trees through droughts even after the soil has become parched.

At a field site in Northern California's Mendocino County, scientists found that up to 27 percent of annual rainfall was stored as "rock moisture," the water clinging to cracks and pores in the bedrock.

The impact of rock moisture varies, the researchers said, depending on region and topography. But it likely explains why trees in the study area showed little effect from the severe 2010-2015 drought, which killed more than 100 million trees throughout California.

"How trees can survive extended periods of severe drought has been a mystery," said Richard Yuretich, director of NSF's Critical Zone Observatory (CZO) program. The research was conducted at the NSF Eel River Critical Zone Observatory, one of nine NSF CZO research sites across the country.

"This study reveals a significant reservoir of trapped water that had gone unnoticed in the past," says Yuretich. "Research of this kind can help greatly in managing natural resources during times of environmental stress."

To conduct the study, scientists monitored rock moisture from 2013 to 2016 at nine wells drilled into weathered bedrock along a steep, forested hillside. They used a neutron probe, a precision tool that measures the amount of water in a sample area by detecting hydrogen.

They found that the weathered rock layer built up a supply of 4 to 21 inches of rock moisture during the winter wet season, depending on the well.
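As a rough, back-of-the-envelope illustration of how these figures fit together (the annual rainfall total below is an assumed value for a rainy coastal Northern California site, not one reported in the study), the 27 percent storage fraction can be applied to a hypothetical wet-season total:

```python
# Illustrative sanity check only: annual_rainfall_in is a hypothetical
# value, not a figure from the study.
annual_rainfall_in = 70.0   # assumed annual rainfall, in inches
stored_fraction = 0.27      # up to 27 percent of rainfall held as rock moisture

rock_moisture_in = round(annual_rainfall_in * stored_fraction, 1)
# ~18.9 inches, which falls within the 4-21 inch range measured at the wells
```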

The maximum amount of rock moisture in each well stayed about the same throughout the study period, which included a significant drought year. The finding indicates that total rainfall does not influence the maximum rock moisture level.

"It doesn't matter how much it rains in the winter; rock moisture builds up to the same maximum value," Rempe said. "That leads to the same amount of water being available every summer for use by trees."

Researchers also found that average rock moisture across the wells exceeded average soil moisture across all measurement locations.

"Soils are important, but when it comes to determining if a place is going to experience water stress, it could be the underlying rock that matters most," Rempe said. "This is the first time this has been demonstrated in a multi-year field study."

The potential for rock moisture to return to the atmosphere through transpiration by trees or to trickle down into groundwater indicates that it could affect the environment and climate on a larger scale.

The study provides a glimpse into rock moisture at a small, intensive research site, according to paper co-author William Dietrich of the University of California, Berkeley. He said the data collected during the study should be a starting point for more research. "The future paths are many. Now we have just one well-studied site."

The research was also supported by the Keck Foundation and the University of California Reserve System.

Credit: 
U.S. National Science Foundation

Demographics can help identify migrants to Canada at high risk of TB

Demographic characteristics can help identify groups of immigrants in Canada at high risk of tuberculosis (TB), according to new research in CMAJ (Canadian Medical Association Journal).

"Screening latent TB infection based on demographic factors at the time of immigration is a necessary first step toward eliminating TB in migrants to Canada," says Dr. James Johnston of the University of British Columbia and the BC Centre for Disease Control in Vancouver, BC.

The study examined TB outcomes in permanent residents of Canada who lived in BC between 1985 and 2012. Researchers sought to identify groups at highest risk of TB based on demographic characteristics when immigrating to Canada. TB rates increased with older age at the time of immigration, with people aged 65 years and older having the highest rates. People who emigrated from regions with highest TB incidence had TB rates in Canada more than 21 times that of people coming from regions with the lowest TB incidence.

"Our study adds to the understanding of long-term TB incidence in migrant populations in Canada by showing that rates remain elevated up to two decades after migration," write the authors.

The authors suggest that latent TB screening and treatment could be practical and high-impact, and could help reduce TB incidence in some high-risk groups.

Credit: 
Canadian Medical Association Journal

Imaging plays key role in evaluating injuries at Olympics

image: Images in a sprinter with acute anterior thigh pain sustained while training. (a) Ultrasound image of anterior thigh shows complete rupture of proximal rectus femoris muscle (arrowheads) with major distal retraction (arrows). Origin of proximal tendon (arrowheads) is located at anterior inferior iliac spine (?). (b) Fat-suppressed T2-weighted MR imaging demonstrates distal retraction of proximal rectus femoris muscle (arrows).

Image: 
Radiological Society of North America

OAK BROOK, Ill. - The Olympic Games give elite athletes a chance at athletic triumph, but also carry a risk of injury. When injuries occur, it is critical that they be evaluated quickly. Onsite imaging services play an important role in the management of Olympic athletes with sports-related injuries and disorders, according to a new study published online in the journal Radiology.

"The Olympic Summer Games are considered the most important sporting event worldwide. Competing athletes are at the peak of their careers and have trained and practiced for years to be able to participate in the games at a high level," said lead author Ali Guermazi, M.D., Ph.D., professor and vice chair in the Department of Radiology at Boston University School of Medicine, in Boston, Mass., and musculoskeletal radiologist at Boston Medical Center. "Unfortunately, these elite athletes are at risk for injury, and the medical teams onsite will do anything to ensure a fast return to competition or initiate the next appropriate measures in light of more severe injuries."

The Rio de Janeiro 2016 Summer Olympic Games drew more than 11,000 athletes from 206 different countries. During the games, a total of 1,015 radiologic examinations were performed on participating athletes.

"Imaging is paramount for determining whether or not an injured athlete is able to return to competition," Dr. Guermazi said. "Anticipated absence from competition or training is often based on imaging findings. In cases of severe injury, imaging will further help in determining the best therapeutic approach."

Dr. Guermazi and colleagues set out to describe the occurrence of imaging-depicted sports-related stress injuries, fractures, and muscle and tendon disorders, and to document the usage of imaging with X-ray, ultrasound and MRI.

"We wished to elucidate further what types of injuries athletes are incurring, as reflected by imaging, and also emphasize utilization rates of imaging services during the Olympic Games," he said.

The researchers collected and analyzed data from the imaging exams. These data were categorized according to gender, age, participating country, type of sport and body part.

The results showed that 1,101 injuries occurred in 718 of the 11,274 athletes. Of the 1,015 imaging exams performed, 304 (30 percent) were X-ray, 104 (10.2 percent) were ultrasound, and 607 (59.8 percent) were MRI.
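A quick arithmetic check, using only the counts quoted above, confirms that the reported percentages are internally consistent:

```python
# Counts of imaging exams performed at the Rio 2016 Games, from the study.
total_exams = 1015
counts = {"X-ray": 304, "ultrasound": 104, "MRI": 607}

# Each modality's share of all exams, as a percentage.
shares = {m: round(100 * n / total_exams, 1) for m, n in counts.items()}
# {"X-ray": 30.0, "ultrasound": 10.2, "MRI": 59.8}

# Share of all competing athletes with at least one imaging-evaluated injury.
athletes_imaged_pct = round(100 * 718 / 11274, 1)  # 6.4 percent
```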

"The relevance of imaging is stressed by the fact that a large number of advanced imaging exams were requested, with MRI comprising nearly 60 percent of all imaging performed for diagnosis of sports-related injuries," Dr. Guermazi said.

Athletes from Europe underwent the most exams with 103 X-rays, 39 ultrasounds and 254 MRIs, but - excluding 10 athletes categorized as refugees - athletes from Africa had the highest utilization rate (14.8 percent). Among the sports, gymnastics (artistic) had the highest percentage of athletes who utilized imaging (15.5 percent), followed by Taekwondo (14.2 percent) and beach volleyball (13.5 percent). Athletics (track and field) had the most examinations (293, including 53 X-rays, 50 ultrasounds and 190 MRIs).

"In some sports, like beach volleyball or Taekwondo, the high utilization rate was somewhat unexpected," Dr. Guermazi said. "These numbers may help in planning imaging services for future events and will also help in analyzing further why some sports are at higher risk for injury and how these injuries can possibly be prevented."

The lower limb was the most common location of imaging-depicted sports-related injuries overall, and imaging of lower extremities was the most common exam. The second most common location was the upper limb.

Among muscle injuries, 83.9 percent affected muscles from the lower extremities. The sports most prone to muscle injuries were athletics, soccer (football) and weightlifting. Athletics also accounted for 34.6 percent of all tendon injuries.

Eighty-four percent of stress injuries were seen in the lower extremities. Stress injuries were most commonly seen in athletics, volleyball, artistic gymnastics and fencing. Fractures were most commonly found in athletics, hockey and cycling. Nearly half were upper extremity fractures.

"Two peaks of imaging utilization were observed, on the fifth and 12th days of the games," Dr. Guermazi said. "This likely corresponds with the timing of judo and athletics events, with both sports showing high proportional utilization rates. These findings will help to plan for increased availability of imaging services during those expected peaks."

Overall, imaging was used to help diagnose sports-related injuries in 6.4 percent of athletes competing in the Olympic Games. High utilization of ultrasound and MRI implies that organizers of future Olympic programs should ensure wide availability of these imaging modalities.

"Imaging continues to be crucial for establishing fast and relevant diagnoses that help in medical decision making during these events," Dr. Guermazi said.

Credit: 
Radiological Society of North America

Colorectal cancer: Combined analysis enhances risk prediction

If first-degree relatives are affected by colorectal cancer, this indicates a person's own elevated risk of developing bowel cancer. The same holds true for people who have large numbers of genetic risk markers in their genome. Both factors are usually used alternatively, not combined, to predict risk. Scientists from the German Cancer Research Center (DKFZ) in Heidelberg have now shown that a combination of family history and an analysis of genetic markers helps determine a person's colorectal cancer risk more precisely.

When the conversation is about colon cancer screening, physicians often ask their patients about cancer cases among close family members. Colorectal cancer in first-degree relatives is considered a known risk factor for the disease. Besides family history, a person's profile of genetic risk markers also provides information about his or her personal cancer risk. This scientific term refers to very small gene variants (called single nucleotide polymorphisms), each of which, taken on its own, has only minimal relation to the probability of suffering disease. Taken together, however, they have a significant impact.

"It is a widely held notion that genetic risk profile and family history essentially both provide the same, redundant information," says Hermann Brenner, an expert for colorectal cancer prevention from the DKFZ. In a current study, Brenner has now modified this concept and has proven that it is beneficial to use both information sources in combination.

Korbinian Weigl, an epidemiologist who is the first author of the study, analyzed the family history and, in a parallel effort, a panel of 53 known genetic risk markers in approximately 4,500 participants of the DACHS* study. About half of the cases studied were colorectal cancer patients. Weigl discovered that the two factors deliver largely independent results:

The risk for disease in participants with colorectal cancer cases in first-degree relatives was about twice as high as that of participants with no family history of bowel cancer.

The risk for disease in the group of participants with the largest numbers of genetic risk markers in their genomes was three times higher than that of study participants with the smallest numbers of risk variants.

The risk in participants with a positive family history who also display numerous genetic risk markers multiplied: Colorectal cancer is six times as prevalent in them as in people who have no family history of bowel cancer and exhibit only small numbers of risk markers in their genomes.

No significant association was found between the numbers of genetic risk variants and family history.
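If the two risk factors act independently, their relative risks combine multiplicatively, which matches the roughly six-fold risk reported above. This is an illustrative calculation, not the study's statistical model:

```python
# Approximate relative risks reported in the study.
rr_family_history = 2.0   # colorectal cancer in a first-degree relative
rr_genetic_markers = 3.0  # highest burden of genetic risk markers

# Under independence, relative risks multiply.
rr_combined = rr_family_history * rr_genetic_markers  # 6.0, i.e. six-fold
```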

"People often assume that familial colorectal cancer is primarily related to genetic predisposition," Brenner said. "This is true in a small proportion of colorectal cancer cases, particularly those with onset at early age." However, in the majority of colorectal cancer cases, this is not the case. Brenner thinks this might be due to the fact that cancer cases in the family primarily reflect common non-genetic factors such as smoking or a physically inactive lifestyle.

In addition, large numbers of genetic risk markers increase the risk of developing cancer in people with and without a family history of bowel cancer alike.

"The result clearly shows us that the combination of both factors has the potential to make risk prediction for colorectal cancer significantly more precise," Brenner sums up. "The more accurately we can predict a person's risk, the better we can adjust preventive measures. For people with high risks, for example, it would be beneficial to start colorectal cancer screening much earlier than at age 50."

Credit: 
German Cancer Research Center (Deutsches Krebsforschungszentrum, DKFZ)

Microbiota-gut-brain axis is at epicenter of new approach to mental health

image: OMICS: A Journal of Integrative Biology addresses the latest advances at the intersection of postgenomics medicine, biotechnology and global society, including the integration of multi-omics knowledge, data analyses and modeling, and applications of high-throughput approaches to study complex biological and societal problems.

Image: 
Mary Ann Liebert, Inc., publishers

New Rochelle, NY, February 26, 2018--The functional gut microbiome provides an exciting new therapeutic target for treating psychiatric disorders such as anxiety and trauma-related conditions. Innovative methods for studying and intervening in gut microbiome composition and activity to treat mental illness and maintain mental health are presented in a timely review article that is part of the "Microbiome Special Issue: Food, Drugs, Diagnostics, and Built Environments" of OMICS: A Journal of Integrative Biology, the peer-reviewed interdisciplinary journal published by Mary Ann Liebert, Inc., publishers. The special issue is available free on the OMICS website until March 26, 2018.

In the article "The Gut Microbiome and Mental Health: Implications for Anxiety- and Trauma-Related Disorders," Stefanie Malan-Muller, Stellenbosch University (Stellenbosch, South Africa) and coauthors review the emerging findings of microbiome research in psychiatric disorders. They encourage the mental health community to embrace microbiome science as a new frontier of biological psychiatry and postgenomic medicine.

"Culturomics: A New Kid on the Block of OMICS to Enable Personalized Medicine," an article coauthored by Manousos Kambouris, The Golden Helix Foundation (London, U.K.) and University of Patras (Patras, Greece) and colleagues, examine the new field of culturomics and how it may widen the scope of microbiology and expand its contributions to diagnostics and personalized medicine. The researchers also explore potential applications in agriculture, environmental sciences, pharmacomicrobiomics, and biotechnology innovation and the value of the Big Data produced by culturomics.

Kieran O'Doherty, University of Guelph, Canada, and coauthors discuss the opportunity and challenges for integrating human microbiome research into clinical practice to improve the diagnosis and treatment of patients with inflammatory bowel disease. The authors describe how patient-specific microbiome data could help guide therapeutic decision-making in the article entitled "Human Microbiome and Learning Healthcare Systems: Integrating Research and Precision Medicine for Inflammatory Bowel Disease."

Vural Özdemir, MD, PhD, DABCP, Editor-in-Chief of OMICS, has commented that "microbiome projects are booming and unraveling the microbial dark matter from humans and animals to built habitats on our planet. Microbiome research will continue to bring about insights into human health, disease susceptibility, and mechanisms of person-to-person variations in response to food, drugs, vaccines, and other health interventions, thus supporting precision medicine and the discovery of innovative diagnostics."

Credit: 
Mary Ann Liebert, Inc./Genetic Engineering News

People rationalize policies as soon as they take effect

People express greater approval for political outcomes as soon as those outcomes transition from being anticipated to being actual, according to new research published in Psychological Science, a journal of the Association for Psychological Science. Findings from three field studies indicate that people report more favorable opinions about policies and politicians once they become the status quo.

"When we anticipate that something is going to happen, and then it actually happens, we immediately start to find ways of twisting our perceptions to make ourselves feel better about it, more so than we were doing when we merely anticipated this new thing," says study author Kristin Laurin of The University of British Columbia. "This suggests that public opinion polls about any potential new initiative are going to overestimate how much opposition will actually emerge once the new initiative takes effect."

In previous research, Laurin had found that people rationalize things that they feel stuck with, including situations they cannot physically escape or that are stable.

"It occurred to me that one thing that really makes you feel stuck with a new reality is when it becomes, so to speak, real--when it goes from being something that is hypothetically supposed to happen in the future, to something that is part of the present," she explains.

In one preliminary study, Laurin used Facebook ads to recruit participants in San Francisco and survey them about a citywide ban on selling plastic water bottles. Participants who completed the survey on the Tuesday immediately following the ban reported more positive attitudes toward the policy than did those who completed the survey on the Tuesday preceding the ban.

A second study expanded on these findings, surveying participants in Ontario about a new law that banned smoking in parks and restaurant patios. The participants, all of whom lived in Ontario and smoked, answered questions about their smoking habits and reported the proportion of their smoking that occurred in certain locations. They also saw information from the Ontario government website outlining the new legislation, including the date it would go into effect and the locations where smoking would be prohibited.

Participants who completed a follow-up survey after the new law went into effect changed their estimates: they reported having spent less of their time smoking in the newly prohibited areas than they had indicated in the first survey. Participants who completed the follow-up survey just before the law took effect reported essentially the same smoking behavior in both surveys.

These findings suggest that participants may have adjusted their reporting after the new law went into effect so as to minimize the extent to which the law affected them.

In a third study, Laurin surveyed Americans twice before Donald Trump's inauguration as President of the United States and once in the three days following it. The participants reported their attitudes toward the Trump presidency and how psychologically real Trump's presidency felt to them.

Once again, participants reported more positive attitudes after the inauguration compared with before, when the inauguration was anticipated but had not yet happened. An increase in how psychologically real the presidency felt seemed to account for this positive shift in attitudes.

Intriguingly, even participants who evaluated Trump's inauguration performance negatively and those who had preferred a different candidate for president showed the same pattern of results - these participants also tended to report more favorable attitudes as soon as Trump took office.

The findings may seem counterintuitive, but they fall in line with Laurin's hypothesis:

"It's hard to understand why seeing someone give what you judge to be a disappointing performance would make you more confident in their abilities to lead the country. It's not rational--it's rationalization: When something becomes a part of the present reality, even when it's something you don't like, you find ways of tricking yourself into thinking it's not quite so bad," she explains.

These studies shed light on how we might feel about new political realities, but they have implications that apply in a variety of scenarios.

"If you're going to have a new boss at work, or if you have to start a new diet for medical reasons, or if you're about to have a new baby--the message is that your 'psychological immune system' will likely kick in and make you feel better about any unpleasant aspect of these new realities once they actually take hold," Laurin concludes.

Credit: 
Association for Psychological Science

Switching from smoking to heated tobacco products reduces exposure to toxicants

A clinical study conducted by scientists at British American Tobacco has revealed that when smokers switch completely from cigarettes to glo, their exposure to certain cigarette smoke toxicants is significantly reduced, in some cases to levels comparable to those seen in smokers who quit smoking completely.

These results add to evidence suggesting that glo may present a substantially reduced risk compared with smoking conventional cigarettes.

glo is a tobacco heating product (THP) designed to heat rather than burn tobacco. This means it does not produce smoke and certain toxicants associated with tobacco combustion are substantially reduced. Previous studies revealed toxicant levels in the heated tobacco vapour from glo to be around 90-95% less than in cigarette smoke.

'Products like glo are very new and consumers and regulators alike understandably want as much information as possible about them. Understanding how vapour from glo compares to cigarette smoke is, therefore, a core component of our scientific research,' said Dr James Murphy, Head of Reduced Risk Substantiation at British American Tobacco. 'Clinical studies, which are studies involving real people, are an extremely important component of that,' he said.

Because glo vapour contains lower levels of toxicants than cigarette smoke, it should in principle expose consumers to far lower levels of toxicants. The results of this study indicate that this is indeed the case. The results are presented today at the annual conference of the Society for Nicotine and Tobacco Research in Baltimore, Maryland, USA.

Clinical Study

This clinical study was conducted in Japan because THPs like glo are popular there. One hundred and eighty people participated in the study, which was conducted over a period of eight days in a clinic. All had smoked for at least three years prior to enrolment.

For the first two days, study participants continued to smoke as normal and their urine was collected to measure levels of chemicals. Blood and breath were also collected for analysis.

For the next five days, participants were randomly allocated to either continue smoking, switch to using a THP or quit smoking. Urine, blood and breath samples were again collected for analysis.

Exposure to certain smoke toxicants was determined by measuring the levels of certain chemicals in the urine. These could be the toxicants themselves or their metabolites, the compounds the body breaks them down into, known as biomarkers of exposure. Toxicants measured included those identified by the World Health Organisation as being of concern in cigarette smoke.

The results show that the concentration of certain chemicals in the urine was reduced in smokers who switched to glo. In some cases, these reductions matched those seen in smokers who quit (Figure 1). This suggests that smokers who switched to glo were exposed to fewer toxicants; in some cases, their exposure was the same as that of smokers who quit altogether.

'These results are very encouraging,' explains Murphy. 'The next step will be to determine whether this reduction in exposure translates to a reduced biological effect, and in turn a reduction in adverse health effects for those smokers who switch completely to glo,' he said.

Future clinical studies will test for markers of biological effect, like cholesterol levels or heart rate (i.e. measurements that give an indication of general health). A reduction in biomarkers of biological effect could suggest that a reduction in exposure is having a positive impact on reducing the adverse health risks of smokers who switch completely.

'The results of one test are important,' said Murphy, 'but it is the combination of the results of many different tests that start to give us a real feel for the bigger picture and the potential for glo to be reduced risk compared to a conventional cigarette.'

Credit: 
R&D at British American Tobacco

PSU study: Pro-diversity policies make companies more innovative and profitable

While some may see corporate diversity initiatives as nothing more than glitzy marketing campaigns, a PSU business school professor's research shows that companies that hire a more diverse set of employees are rewarded with a richer pipeline of innovative products and a stronger financial position.

A variety of businesses large and small have launched initiatives to attract a more diverse and inclusive workforce. But few studies have measured how diversity practices actually affect a company's bottom line. In her paper "Do Pro-Diversity Policies Improve Corporate Innovation?" published in the journal Financial Management in January 2018 and co-authored with two colleagues from North Carolina State University, Jing Zhao, assistant professor of finance in The School of Business at Portland State University, attempts to measure diversity's impact on Corporate America.

The study showed that companies that promote a diverse workforce and a culture of inclusion, specifically by attracting and retaining minorities, women, people with disabilities and LGBTQ employees, were more efficient at generating new products and patents. Zhao found that diverse hiring practices widen the pool from which a firm can recruit talented and creative employees, and that the wider the range of views, backgrounds, experiences and expertise a company attracted, the more innovative it was at solving problems. A diverse, inclusive culture also attracts and retains talent and boosts employee morale, Zhao's research found.

Zhao and her coauthors scoured and analyzed data on publicly traded U.S. firms, looking at new product introductions, patents and other company milestones. They showed that pro-diversity policies enhance future firm value by spurring innovation.

"Top corporate leaders, academics and policy makers have long been wondering about the real economic benefits of corporate diversity policies," Zhao said. "Many didn't see how hiring a more diverse workforce positively affected shareholder value. Now we have strong evidence that creating a more diverse workplace today results in more innovative outcomes for companies tomorrow."

Zhao's findings also showed that companies with pro-diversity policies weathered the 2008 economic downturn far better than those without them.

Credit: 
Portland State University

Crop-saving soil tests now at farmers' fingertips

image: This simple on-site equipment uses a polymerase chain reaction technique to let farmers do cheap, fast and accurate on-site soil testing for crop-damaging pathogens.

Image: 
Washington State University

PULLMAN, Wash. - Soil pathogen testing - critical to farming, but painstakingly slow and expensive - will soon be done accurately, quickly, inexpensively and onsite, thanks to research that Washington State University plant pathologists are sharing.

As the name implies, these tests detect disease-causing pathogens in the soil that can devastate crops.

Until now, the tests have required large, expensive equipment or lab tests that take weeks.

The soil pathogen analysis process is based on polymerase chain reaction (PCR) tests, which are highly specific and sensitive but have traditionally been possible only in a laboratory.

The new methods, designed by WSU plant pathologists, are not only portable and fast, but utilize testing materials easily available to the public. A paper by the researchers lists all the equipment and materials required to construct the device, plus instructions on how to put it all together and conduct soil tests.

Responding to growers' needs

"We've heard from many growers that the time it takes to obtain results from soil samples sent to a lab is too long," said Kiwamu Tanaka, assistant professor in WSU's Department of Plant Pathology. "The results come back too late to be helpful. But if they can get results on site, they could make informed decisions about treatments or management changes before they even plant their crop."

Some diseases from soil pathogens may not be visible until weeks after the crop has sprouted, Tanaka said. That could be too late to treat the disease or could force farmers to use more treatments.

Magnetic breakthrough

WSU graduate student Joseph DeShields, a first author on the paper, said it took about six months of work to get their device to work in the field. It relies on magnets to capture pathogens' DNA from the soil.

"It turns out, it's really hard to separate and purify genetic material from soil because soil contains so much material for PCR tests," said DeShields "So we were thrilled when we made that breakthrough."

Rachel Bomberger, a WSU plant diagnostician who helped develop the concepts behind the testing, said she's impressed by what Tanaka and the team accomplished.

"We removed a huge stumbling block when it comes to soil testing," said Bomberger, one of the co-authors on the paper. "We found the missing piece that makes the testing systems work in the field without expensive lab equipment or testing materials."

Worldwide application

The system was tested on potato fields around eastern Washington, Tanaka said, but it will work on soil anywhere in the world.

"It's a really versatile method," he said. "You could use it for nationwide pathogen mapping or look at the distribution of pathogens around the country. We started small, but this could have huge implications for testing soil health and disease."

Tanaka said it was important for this discovery to be available in an open-access video journal.

"We're always concerned about helping every grower and the industry as a whole," Tanaka said. "We want everybody to look at this and use it, if they think they'll benefit from it."

Credit: 
Washington State University

Ice chips only? Study questions restrictions on oral intake for women in labor

February 23, 2018 - At most US maternity units, women in labor are put on nil per os (NPO) status--they're not allowed to eat or drink anything, except ice chips. But new nursing research questions that policy, showing no increase in risks for women who are allowed to eat and drink during labor. The study appears in the March issue of the American Journal of Nursing, published by Wolters Kluwer.

"The findings of this study support relaxing the restrictions on oral intake in cases of uncomplicated labor," write Anne Shea-Lewis, BSN, RN, of St. Charles Hospital, Port Jefferson, N.Y., and colleagues. Adding to the findings of previous reports, these results suggest that allowing laboring women to eat and drink "ad lib" doesn't adversely affect maternal and neonatal outcomes.

No Increase in Complications with 'Ad lib' Oral Intake During Labor

The researchers analyzed the medical records of nearly 2,800 women in labor admitted to one hospital from 2008 through 2012. At the study hospital, one practice group of nurses and doctors had a policy of allowing laboring women to eat and drink ad lib (ad libitum, or "as they please"). Another four practice groups kept all patients NPO (nil per os, or "nothing by mouth").

Recommendations to restrict oral intake during labor reflect concerns over the risk of vomiting and aspiration (inhalation) in case general anesthesia and surgery are needed. However, with advances in epidural and spinal anesthesia, the use of general anesthesia during labor has become rare (and, if needed, much safer than before).

The study compared maternal and child outcomes in about 1,600 women who were kept NPO (except for ice chips) with 1,200 who were allowed to eat and drink ad lib during labor. The two groups were "sufficiently equivalent" for comparison. The women's average age was 31 years. Before delivery, a "preexisting medical condition" complicating pregnancy was identified in 14 percent of the NPO group compared with 20 percent of the ad lib group.

Even though the women in the NPO group started out with fewer medical problems, they had a significantly higher incidence of complications during labor and birth, compared with the ad lib group. The women in the NPO group were also significantly more likely to give birth via unplanned cesarean section.

Other outcomes--including requiring a higher level of care after delivery and the newborns' condition as measured by Apgar score--were not significantly different between groups. Analysis using a technique called propensity score matching, comparing groups of women with similar risk factors, yielded similar results.
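For readers unfamiliar with the technique, propensity score matching pairs each subject in one group with a subject in the other whose estimated probability of being in that group (given her risk factors) is closest, so outcomes are compared between similar women. The sketch below is a minimal illustration on simulated data loosely inspired by the study's covariates (maternal age, preexisting condition); it is not the authors' analysis or their records.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 400
age = rng.normal(31, 5, n)              # covariate: maternal age
cond = rng.binomial(1, 0.17, n)         # covariate: preexisting condition
# Group assignment (ad lib intake = 1) loosely depends on the covariates
logit = 0.05 * (age - 31) + 0.6 * cond
treated = rng.binomial(1, 1 / (1 + np.exp(-logit)))

# Step 1: fit a logistic model P(treated | covariates) by gradient descent
X = np.column_stack([np.ones(n), age - 31, cond])
w = np.zeros(3)
for _ in range(2000):
    p = 1 / (1 + np.exp(-X @ w))
    w -= 0.01 * X.T @ (p - treated) / n
ps = 1 / (1 + np.exp(-X @ w))           # estimated propensity scores

# Step 2: match each treated subject to the control with the nearest score
t_idx = np.flatnonzero(treated == 1)
c_idx = np.flatnonzero(treated == 0)
matches = {i: c_idx[np.argmin(np.abs(ps[c_idx] - ps[i]))] for i in t_idx}

# Balance check: matched pairs should have nearly equal scores
gap = float(np.mean([abs(ps[i] - ps[j]) for i, j in matches.items()]))
print(f"mean propensity gap after matching: {gap:.4f}")
```

Once matched, outcomes (such as unplanned cesarean rates) would be compared within the matched pairs rather than across the raw groups.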

The findings add to those of previous studies suggesting that restrictions on eating and drinking during labor could be safely relaxed in uncomplicated cases. "Yet in keeping with current guidelines, most obstetricians and anesthesiologists in the United States continue to recommend restrictions on oral intake for laboring women," Anne Shea-Lewis and colleagues write.

"Our findings support permitting women who are at low risk for an operative birth to self-regulate their intake of both solid food and liquids during labor," the researchers add. They note some limitations of their study, especially the fact that the women weren't randomly assigned to NPO or ad lib groups.

The authors hope their study will lead to reconsideration of current recommendations to keep women NPO during the "often long and grueling" process of labor and delivery. "Restricting oral intake to a laboring woman who is hungry or thirsty may intensify her stress," Anne Shea-Lewis and colleagues conclude. "Conversely, allowing her to eat and drink ad lib during labor can contribute to both her comfort and her sense of autonomy."

Credit: 
Wolters Kluwer Health

How cities heat up

CAMBRIDGE, Mass. - The arrangement of a city's streets and buildings plays a crucial role in the local urban heat island effect, which causes cities to be hotter than their surroundings, researchers have found. The new finding could provide city planners and officials with new ways to influence those effects.

Some cities, such as New York and Chicago, are laid out on a precise grid, like the atoms in a crystal, while others such as Boston or London are arranged more chaotically, like the disordered atoms in a liquid or glass. The researchers found that the "crystalline" cities had a far greater buildup of heat compared to their surroundings than did the "glass-like" ones.

The study, published today in the journal Physical Review Letters, found that these differences in city patterns, which the researchers call "texture," were the most important determinant of a city's heat island effect. The research was carried out by MIT and National Center for Scientific Research (CNRS) senior research scientist Roland Pellenq, who is also director of MSE>2 (MultiScale Material Science for Energy and Environment), a joint MIT/CNRS/Aix-Marseille University laboratory; professor of civil and environmental engineering Franz-Josef Ulm; research assistant Jacob Sobstyl; MSE>2 senior research scientist T. Emig; and M.J. Abdolhosseini Qomi, assistant professor of civil and environmental engineering at the University of California at Irvine.

The heat island effect has been known for decades. It essentially results from the fact that urban building materials, such as concrete and asphalt, absorb heat during the day and radiate it back at night, much more than vegetation-covered areas do. The effect can be quite dramatic, adding as much as 10 degrees Fahrenheit to nighttime temperatures in places such as Phoenix, Arizona. In such places this effect can significantly increase health problems and energy use during hot weather, so a better understanding of what produces it will be important in an era when ever more people are living in cities.

The team found that using mathematical models that were developed to analyze atomic structures in materials provides a useful tool, leading to a straightforward formula to describe the way a city's design would influence its heat-island effect, Pellenq says.

"We use tools of classical statistical physics," he explains. The researchers adapted formulas initially devised to describe how individual atoms in a material are affected by forces from the other atoms, and they reduced these complex sets of relationships to much simpler statistical descriptions of the relative distances of nearby buildings to each other. They then applied them to patterns of buildings determined from satellite images of 47 cities in the U.S. and other countries, ultimately ending up with a single index number for each -- called the local order parameter -- ranging between 0 (total disorder) and 1 (perfect crystalline structure), to provide a statistical description of the cluster of nearest neighbors of any given building.

For each city, they had to collect reliable temperature data, which came from one station within the city and another outside it but nearby, and then determine the difference.

To calculate this local order parameter, physicists typically have to use methods such as bombarding materials with neutrons to locate the positions of atoms within them. But for this project, Pellenq says, "to get the building positions we don't use neutrons, just Google maps." Using algorithms they developed to determine the parameter from the city maps, they found that the cities varied from 0.5 to 0.9.
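The team's actual order parameter comes from statistical-physics pair-correlation methods applied to building positions; as a rough stand-in, the sketch below scores layout regularity from each building's nearest-neighbor distances, giving values near 1 for a perfect grid and lower values for a jittered, "glass-like" layout. The scoring formula and all names here are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

def regularity(points, k=4):
    """Score in (0, 1]: 1 when every building's k nearest-neighbor
    distances are identical (a perfect grid), lower as the layout
    becomes more disordered."""
    pts = np.asarray(points, dtype=float)
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)              # exclude self-distances
    knn = np.sort(d, axis=1)[:, :k]          # k nearest-neighbor distances
    spread = knn.std(axis=1) / knn.mean(axis=1)  # per-building variation
    return float(1.0 / (1.0 + spread.mean()))

# A perfect grid ("crystal") vs. the same grid with random jitter ("glass")
grid = [(i, j) for i in range(10) for j in range(10)]
rng = np.random.default_rng(1)
glassy = [(x + rng.uniform(-0.4, 0.4), y + rng.uniform(-0.4, 0.4))
          for x, y in grid]

score_grid = regularity(grid)
score_glassy = regularity(glassy)
print(score_grid, score_glassy)
```

The same idea scales to building footprints extracted from map data: score each city's layout, then correlate the score with its measured urban-rural temperature difference.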

The differences in the heating effect seem to result from the way buildings reradiate heat that can then be reabsorbed by other buildings that face them directly, the team determined.

Especially for places such as China where new cities are rapidly being built, and other regions where existing cities are expanding rapidly, the information could be important to have, he says. In hot locations, cities could be designed to minimize the extra heating, but in colder places the effect might actually be an advantage, and cities could be designed accordingly.

"If you're planning a new section of Phoenix," Pellenq says, "you don't want to build on a grid, since it's already a very hot place. But somewhere in Canada, a mayor may say no, we'll choose to use the grid, to keep the city warmer."

The effects are significant, he says. The team evaluated all the states individually and found, for example, that in the state of Florida alone urban heat island effects cause an estimated $400 million in excess costs for air conditioning. "This gives a strategy for urban planners," he says. While in general it's simpler to follow a grid pattern, in terms of placing utility lines, sewer and water pipes, and transportation systems, in places where heat can be a serious issue, it can be well worth the extra complications for a less linear layout.

This study also suggests that research on construction materials may offer a way forward to properly manage heat interaction between buildings in cities' historical downtown areas.

Credit: 
Massachusetts Institute of Technology

Deep learning reconstructs holograms

Deep learning, which uses multi-layered artificial neural networks for automated analysis of data, has experienced a true renaissance over the last decade. It is one of the most exciting forms of machine learning and is behind several recent leapfrog advances in technology, including real-time speech recognition and translation as well as image/video labeling and captioning, among many others. Especially in image analysis, deep learning shows significant promise for automated search and labeling of features of interest, such as abnormal regions in a medical image.

Now, UCLA researchers have demonstrated a new use for deep learning: this time to reconstruct a hologram and form a microscopic image of an object. In a recent article published in Light: Science & Applications, a journal of Springer Nature, the researchers demonstrated that a neural network can learn to perform phase recovery and holographic image reconstruction after appropriate training. This deep learning-based approach provides a fundamentally new framework for holographic imaging: compared to existing approaches, it is significantly faster to compute and reconstructs improved images of the objects from a single hologram, requiring fewer measurements.

This research was led by Dr. Aydogan Ozcan, an associate director of the UCLA California NanoSystems Institute and the Chancellor's Professor of electrical and computer engineering at the UCLA Henry Samueli School of Engineering and Applied Science, along with Dr. Yair Rivenson, a postdoctoral scholar, and Yibo Zhang, a graduate student, both at the UCLA electrical and computer engineering department.

The authors validated this deep learning-based approach by reconstructing holograms of various samples, including blood and Pap smears (used for screening of cervical cancer) as well as thin sections of tissue samples used in pathology, all of which demonstrated successful elimination of spatial artifacts that arise from the phase information lost during the hologram recording process. Stated differently, after its training the neural network has learned to extract and separate the spatial features of the true image of the object from undesired light interference and related artifacts. Remarkably, this deep learning-based hologram recovery has been achieved without any modeling of light-matter interaction or a solution of the wave equation. "This is an exciting achievement since traditional physics-based hologram reconstruction methods have been replaced by a deep learning-based computational approach," said Rivenson.
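The UCLA system uses a deep convolutional network trained on real hologram/image pairs; as a toy illustration of the core idea only, the NumPy sketch below trains a tiny two-layer network to invert a synthetic "recording" process from examples, with no physical model of light propagation anywhere in the code. Every detail (the mixing matrix, network size, data) is an assumption for demonstration, not the published architecture.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 64                                   # flattened 8x8 "images"
images = rng.random((32, d))             # synthetic ground-truth objects
mix = rng.normal(size=(d, d)) / np.sqrt(d)
holograms = images @ mix                 # stand-in "recording" process

W1 = rng.normal(size=(d, d)) * 0.1       # two-layer network weights
W2 = rng.normal(size=(d, d)) * 0.1

def forward(h):
    a = np.maximum(h @ W1, 0)            # ReLU hidden layer
    return a @ W2, a

out0, _ = forward(holograms)
mse0 = float(np.mean((out0 - images) ** 2))   # error before training

for _ in range(300):                     # plain gradient-descent training
    out, a = forward(holograms)
    err = out - images                   # mean-squared-error gradient
    gW2 = a.T @ err
    gW1 = holograms.T @ ((err @ W2.T) * (a > 0))
    W1 -= 1e-3 * gW1 / len(images)
    W2 -= 1e-3 * gW2 / len(images)

recon, _ = forward(holograms)
final = float(np.mean((recon - images) ** 2))
print(f"MSE before: {mse0:.4f}  after: {final:.4f}")
```

The network learns the hologram-to-image mapping purely from training pairs, which mirrors, in miniature, how the real system sidesteps the wave equation.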

These results are broadly applicable to any phase recovery and holographic imaging problem, and this deep learning based framework opens up a myriad of opportunities to design fundamentally new coherent imaging systems, spanning different parts of the electromagnetic spectrum, including visible wavelengths as well as the X-ray regime, added Ozcan, who is also an HHMI Professor with the Howard Hughes Medical Institute.

Credit: 
Light Publishing Center, Changchun Institute of Optics, Fine Mechanics And Physics, CAS

On second thought, the Moon's water may be widespread and immobile

image: If the Moon has enough water, and if it's reasonably convenient to access, future explorers might be able to use it as a resource.

Image: 
NASA's Goddard Space Flight Center

A new analysis of data from two lunar missions finds evidence that the Moon's water is widely distributed across the surface and is not confined to a particular region or type of terrain. The water appears to be present day and night, though it's not necessarily easily accessible.

The findings could help researchers understand the origin of the Moon's water and how easy it would be to use as a resource. If the Moon has enough water, and if it's reasonably convenient to access, future explorers might be able to use it as drinking water or to convert it into hydrogen and oxygen for rocket fuel or oxygen to breathe.

"We find that it doesn't matter what time of day or which latitude we look at, the signal indicating water always seems to be present," said Joshua Bandfield, a senior research scientist with the Space Science Institute in Boulder, Colorado, and lead author of the new study published in Nature Geoscience. "The presence of water doesn't appear to depend on the composition of the surface, and the water sticks around."

The results contradict some earlier studies, which had suggested that more water was detected at the Moon's polar latitudes and that the strength of the water signal waxes and wanes according to the lunar day (29.5 Earth days). Taken together, those findings led some researchers to propose that water molecules can "hop" across the lunar surface until they enter cold traps in the dark reaches of craters near the north and south poles. In planetary science, a cold trap is a region so cold that water vapor and other volatiles coming into contact with the surface will remain stable for an extended period of time, perhaps up to several billion years.

The debates continue because of the subtleties of how the detection has been achieved so far. The main evidence has come from remote-sensing instruments that measured the strength of sunlight reflected off the lunar surface. When water is present, instruments like these pick up a spectral fingerprint at wavelengths near 3 micrometers, which lies beyond visible light and in the realm of infrared radiation.

But the surface of the Moon also can get hot enough to "glow," or emit its own light, in the infrared region of the spectrum. The challenge is to disentangle this mixture of reflected and emitted light. To tease the two apart, researchers need to have very accurate temperature information.

Bandfield and colleagues came up with a new way to incorporate temperature information, creating a detailed model from measurements made by the Diviner instrument on NASA's Lunar Reconnaissance Orbiter, or LRO. The team applied this temperature model to data gathered earlier by the Moon Mineralogy Mapper, a visible and infrared spectrometer that NASA's Jet Propulsion Laboratory in Pasadena, California, provided for India's Chandrayaan-1 orbiter.
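A rough numerical illustration of why the temperature model matters: once the surface temperature is known, the Moon's own thermal emission at the ~3-micrometer water feature can be modeled with Planck's law and subtracted from the measured radiance, leaving the reflected component that carries the water signature. All values below (temperature, emissivity, radiances) are illustrative assumptions, not mission data or the team's actual pipeline.

```python
import math

H, C, K = 6.626e-34, 2.998e8, 1.381e-23   # Planck, speed of light, Boltzmann

def planck(wavelength_m, temp_k):
    """Blackbody spectral radiance, W / (m^2 * sr * m)."""
    a = 2 * H * C**2 / wavelength_m**5
    return a / (math.exp(H * C / (wavelength_m * K * temp_k)) - 1)

lam = 3e-6                      # the ~3-micrometer water feature
surface_temp = 380.0            # hot lunar daytime temperature, K (illustrative)
emissivity = 0.95               # assumed surface emissivity

thermal = emissivity * planck(lam, surface_temp)   # the Moon's own "glow"
reflected_true = 5.0e5          # hypothetical reflected component
measured = thermal + reflected_true                # what the instrument sees
reflected = measured - thermal  # recovered once temperature is known
print(f"thermal={thermal:.3e}  reflected={reflected:.3e}")
```

An error in the assumed temperature shifts the subtracted thermal term and leaks directly into the inferred water signal, which is why the accurate Diviner-based temperature model was central to the analysis.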

The new finding of widespread and relatively immobile water suggests that it may be present primarily as OH, a more reactive relative of H2O that is made of one oxygen atom and one hydrogen atom. OH, also called hydroxyl, doesn't stay on its own for long, preferring to attack molecules or attach itself chemically to them. Hydroxyl would therefore have to be extracted from minerals in order to be used.

The research also suggests that any H2O present on the Moon isn't loosely attached to the surface.

"By putting some limits on how mobile the water or the OH on the surface is, we can help constrain how much water could reach the cold traps in the polar regions," said Michael Poston of the Southwest Research Institute in San Antonio, Texas.

Sorting out what happens on the Moon could also help researchers understand the sources of water and its long-term storage on other rocky bodies throughout the solar system.

The researchers are still discussing what the findings tell them about the source of the Moon's water. The results point toward OH and/or H2O being created by the solar wind hitting the lunar surface, though the team didn't rule out that OH and/or H2O could come from the Moon itself, slowly released from deep inside minerals where it has been locked since the Moon was formed.

"Some of these scientific problems are very, very difficult, and it's only by drawing on multiple resources from different missions that are we able to hone in on an answer," said LRO project scientist John Keller of NASA's Goddard Space Flight Center in Greenbelt, Maryland.

Credit: 
NASA/Goddard Space Flight Center

Study explores emerging role of NAD+ in innate and adaptive immune responses

Researchers at Brigham and Women's Hospital (BWH) have discovered a new cellular and molecular pathway that regulates CD4+ T cell response--a finding that may lead to new ways to treat diseases that result from alterations in these cells. Their discovery, published online in the Journal of Allergy and Clinical Immunology, shows that administering nicotinamide adenine dinucleotide (NAD+), a natural molecule found in all living cells, shuts off the capacity of dendritic cells and macrophages to dictate CD4+ T cell fate. The researchers found that NAD+ administration regulated CD4+ T cells exclusively via mast cells (MCs), cells that have previously been described mainly in the context of allergy.

"This is a novel cellular and molecular pathway that is distinct from the two major pathways that were previously known. Since it is distinct and since it has the ability to regulate the immune system systemically, we can use it as an alternative to bypass the current pathways," said Abdallah ElKhal, PhD, BWH Department of Surgery, senior study author.

CD4+ T helper cells and dendritic cells play a central role in immunity. Alterations or aberrant dendritic cells and T cell responses can lead to many health conditions including autoimmune diseases, infections, allergy, primary immunodeficiencies and cancer.

To date, two major pathways have been described that regulate CD4+ T cell response. The first, described by Peter C. Doherty and Rolf M. Zinkernagel (1996 Nobel Prize winners), showed the requirement of the MHC-TCR signaling machinery. More recently, a second mechanism involving Pathogen- or Damage-Associated Molecular Patterns (PAMPs or DAMPs) was unraveled by Bruce A. Beutler and Jules A. Hoffmann (2011 Nobel Prize winners). Importantly, both pathways require antigen-presenting cells (APCs), in particular dendritic cells (DCs) or macrophages (Mφ). ElKhal's novel pathway is distinct from the two previous ones and may offer a path forward for novel therapeutic approaches.

For the current study, BWH researchers performed pre-clinical trials using an experimental infection model. They showed that mast cell-mediated CD4+ T cell response protects against lethal doses of infection (Listeria monocytogenes). Mice treated with NAD+ had a dramatically increased survival rate when compared to the non-treated group.

"Collectively, our study unravels a novel cellular and molecular pathway that regulates innate and adaptive immunity via MCs, exclusively, and underscores the therapeutic potential of NAD+ in the context of a myriad of diseases including autoimmune diseases, hemophilia, primary immunodeficiencies and antimicrobial resistance," said Elkhal.

Credit: 
Brigham and Women's Hospital

Evaluation of I-TOPP examines outcomes of transdisciplinary doctoral training program

URBANA, Ill. - Over the past 30 years, the prevalence of overweight and obesity has doubled in 2- to 5-year-olds and tripled in children aged 6 to 11 years. To address this public health concern, in 2011, the USDA funded the Illinois Transdisciplinary Obesity Prevention Program (I-TOPP), a joint doctoral/Master of Public Health (MPH) degree program, at the University of Illinois with the goal of training future leaders to address the problem of childhood obesity.

Although transdisciplinary doctoral training programs in academic settings are relatively new, these types of research approaches are increasingly being used to address complex research areas, such as childhood obesity.

"We know the causes of childhood obesity are multifactorial, involving both genetic and environmental causes," says Sharon Donovan, professor in the Department of Food Science and Human Nutrition at U of I and director of I-TOPP. "Of the environmental factors, family routines, nutrition, food security, physical activity, sedentary behavior, and sleep are all important.

"To tackle such a multifaceted issue, many perspectives need to be brought to the table, necessitating a transdisciplinary approach," she adds.

Because Donovan and her fellow researchers were undertaking a new approach to doctoral training, they wanted to evaluate the education process as well as the outcomes. To understand the barriers and benefits of transdisciplinary doctoral training--versus focusing on a single discipline--the researchers conducted focus groups with faculty and students at the start of the program and again five years in.

A paper focusing on the perspectives of faculty and students in the program, published in Palgrave Communications, describes some of the perceived benefits and barriers to transdisciplinary education. Some of the benefits cited were greater collaboration and networking, more guidance and support from advisors, newly broadened ways of thinking, and expanded opportunities for learning and research.

Some of the barriers cited by students included time concerns: feeling that they had too much to do and not enough time to do it, as well as feeling that they were under greater pressure compared with their traditional counterparts. "While both the faculty and students acknowledged the benefits of I-TOPP, it is important to think about ways to lower the barriers to transdisciplinary training in order to be successful," Donovan adds.

Previous research has shown that the timing of publications from transdisciplinary research can be delayed due to the need for the team to come together and the nature of the complex questions the teams often undertake. Thus, the researchers were interested in determining if that was the case for I-TOPP.

A second paper, recently published in PLOS ONE, shows that the program's success in training doctoral students has included higher-impact research publications by I-TOPP students, more collaborators (co-authors) on those papers, and more disciplines represented when compared to the publications of students in traditional doctoral programs. Publication impact indicators were significantly higher for I-TOPP students, including higher citations in Google Scholar and Scopus.

Publication productivity was somewhat, though not significantly, higher for I-TOPP students, as well.

The program's transdisciplinary approaches span beyond the expertise of instructors and researchers within academia and also involve community stakeholders. These approaches, which are often a component of team science, teach students to master and then integrate broad methods to find solutions to complex public health problems such as childhood obesity.

"Our students work with the community to find real-world solutions when it comes to research," explains Anna-Sigrid Keck, program coordinator and lead author on both papers. "It's really applied research that the students are working on during their doctoral training. Students in the I-TOPP program take the disciplinary foundation, and create new thinking and new hypotheses, and then merge them together. That takes more work but the publication impact is an indication that it might be worth the extra work effort."

The program has 11 doctoral students, who were enrolled in three cohorts in 2011, 2012, and 2013. Seven I-TOPP students have already begun or have accepted prestigious grant-funded post-doctoral positions. Another student recently accepted a faculty position at Boston College, the first faculty position for one of the program's graduates.

Keck says the intent is to continue following the careers of I-TOPP graduates over the next 10 years, continuing to compare them to traditional doctoral students. "Even now there are publication differences, but I think the real impact will be 5-10 years out," Keck says.

Donovan adds, "When we started I-TOPP in 2011, we proposed that the graduates of the program would be ideally positioned to undertake complex public health problems, due to the combined PhD, MPH degree, and the transdisciplinary educational approach. Given the high-quality institutions where we have placed our graduates, including the Baylor College of Medicine, Boston College, Harvard, Northeastern University, Northwestern University, the University of Iowa, and the University of Minnesota, we are beginning to see that promise fulfilled."

Credit: 
University of Illinois College of Agricultural, Consumer and Environmental Sciences