Tech

ASU Online science course brings to life a new way of teaching

image: Students taking the Habitable Worlds online course use interactive simulators, like this one, to poke and prod scientific models, helping them understand concepts better than static images. This Stellar Nursery simulator allows students to create stars and watch as they live and die.

Image: 
ASU

Arizona State University's School of Earth and Space Exploration recently released new research on its flagship Smart Course, Habitable Worlds, published in the peer-reviewed journal Astrobiology. The study found that the course's student-centered, exploration-focused design resulted in high course grades and demonstrable mastery of content.

Created for non-science majors, Habitable Worlds (HabWorlds) uses interactive simulations and virtual field trips to introduce astronomy, biology, chemistry, geology, and physics to students as they explore the search for life beyond Earth. The online course was created by the Center for Education Through eXploration (ETX) at ASU, with support from NASA and the National Science Foundation. Since 2011, it has been taken by more than 5,000 ASU students and adopted by instructors at nearly 40 other institutions globally.

In the published research, course co-creator Lev Horodyskyj and the team of researchers, instructors, and programmers who created HabWorlds describe the digital design philosophy they used to develop and implement the course.

"With HabWorlds, we wanted to bring to life a new way of teaching science. Our goal was to create an interactive, game-like science course that teaches science as it really is -- a systematic process of exploring the unknown, not just memorization of known facts," said Ariel Anbar, fellow co-creator of the course and Director of ETX.

Through Smart Sparrow's learning platform, instructors can access learner analytics captured throughout the learning experiences. They can identify common trouble areas for their students and make rapid design and content improvements to ensure students meet the learning objectives of every lesson.

"Because of the way HabWorlds is designed and our ability to use Smart Sparrow's course analytics to dig deeper into what students do throughout the lesson, we can clearly measure their understanding and mastery of content as demonstrated through their behaviors," said Horodyskyj.

The research also shared student learning outcomes and evaluation data from the past six years. Analysis showed that the consistently higher grades earned in the course are a result of better concept mastery and more efficient problem-solving, not of simplified concepts or easier grading than in traditional instruction.

"What's fascinating about this research is it provides evidence that the students who received high grades are actually reasoning more effectively that those who didn't, based on how they behaved in the gameful learning system," said Dror Ben-Naim, CEO of Smart Sparrow. "This demonstrates the kinds of novel assessments that are made possible with technology that's married with smart design."

Credit: 
Arizona State University

Researchers provide potential explanation for declines in brown bear populations

Animals may fall into what are called evolutionary and ecological traps when they make poor decisions using seemingly reliable environmental cues. For example, animals may select habitats to occupy based on food availability, but mortality may be highest in habitats with the highest food availability. A new Mammal Review article examines how the brown (grizzly) bear can fall into such traps in human-modified landscapes, which may contribute to decreases in brown bear populations.

In their article, researchers describe evolutionary and ecological traps for brown bears, and they propose mechanisms by which traps may affect the dynamics and viability of brown bear populations. There are six potential trap scenarios: food resources close to human settlements; agricultural landscapes; roads; artificial feeding sites; hunting by humans; and other human activities (including ecotourism and reindeer husbandry).

"Despite the interest in large carnivore conservation in human-modified landscapes, the emergence of traps and their potential effects on the conservation of large carnivore populations has frequently been overlooked," said lead author Dr. Vincenzo Penteriani, of the Spanish National Research Council (CSIC), in Spain. "More effort should thus be put into the consideration that traps may be behind the unexpected decreases of brown bear and other large carnivore populations in human-modified landscapes."

Credit: 
Wiley

Rare coastal martens under high risk of extinction in coming decades

CORVALLIS, Ore. - The coastal marten, a small but fierce forest predator, is at a high risk for extinction in Oregon and northern California in the next 30 years due to threats from human activities, according to a new study.

The study, published today in the online journal PeerJ, will be available to federal and state wildlife agencies as they consider whether distinct geographic population segments of the coastal marten warrant state or federal listing as threatened or endangered, said Katie Moriarty, a certified wildlife biologist and a lead co-author of the study.

"Martens are like the river otters of the woods," Moriarty said. "But they can be vicious little critters, too. When you capture one and it's growling at you from inside a cage, there is no mistaking its intent. They're the size of kittens and act like they'll attack a pit bull."

Some threats to coastal martens include trapping and being hit by cars, said Moriarty, an Oregon State University graduate now with the U.S. Department of Agriculture Forest Service Pacific Northwest Research Station. Martens are trapped for their fur throughout Oregon with no bag limit.

"This study provides the most conclusive evidence yet of risk to a coastal marten population," she said. "It's the only robust population estimate of a marten population in the Pacific states."

Martens are rare in the coastal forests of Oregon and northern California. A different subspecies of marten thrives in the high-elevation forests of the Cascade mountains. Martens resemble a cross between a fox and a mink, with bushy tails and large paws with partially retractable claws. Coastal martens were petitioned for listing under the U.S. Endangered Species Act in 2010, but the petition was withdrawn from consideration by the U.S. Fish and Wildlife Service in 2015. Last year, the U.S. District Court for Northern California rejected the withdrawal, and the service is now collecting information on marten populations ahead of a decision due in October.

"This marten population is now so small that it is in imminent danger of extinction, which would leave martens without a source population to recolonize the central and northern coast of Oregon," said Taal Levi, a professor of wildlife biology in the Department of Fisheries and Wildlife in OSU's College of Agricultural Sciences and a co-author on the study.

Martens once ranged through coastal forests from Oregon to the northern California wine country. Extensive surveys revealed that coastal martens are now restricted to two populations: one in southern Oregon and northern California, and another small population in the Oregon Dunes National Recreation Area, west of U.S. Highway 101 in central Oregon.

To determine how many martens are in the Oregon Dunes, the researchers live-trapped and attached radio collars to 10 adult martens (six females, four males) and set 31 remotely triggered cameras in the study area that could identify unique patterns on the collars. Statistical models were then used to estimate the number of martens from the frequency with which uniquely marked martens were seen on camera.

Their population assessment revealed that the central Oregon population of coastal martens is likely fewer than 87 adults divided into two subpopulations separated by the Umpqua River. Using a population viability analysis, they concluded that the extinction risk for a subpopulation of 30 martens ranged from 32 percent to 99 percent.
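For a sense of how such estimates work, consider the classic Chapman (bias-corrected Lincoln-Petersen) mark-resight formula. The sketch below is only a minimal illustration with hypothetical counts; the study itself relied on more sophisticated statistical models of camera-resighting frequencies.

    # Minimal Chapman mark-resight sketch (hypothetical counts; the study
    # used more sophisticated models of camera-resighting frequencies).
    def chapman_estimate(n_marked, n_resighted, n_marked_resighted):
        """Bias-corrected Lincoln-Petersen estimate of population size."""
        return (n_marked + 1) * (n_resighted + 1) / (n_marked_resighted + 1) - 1

    # Suppose 10 collared martens, 25 camera detections, 8 with visible collars.
    print(round(chapman_estimate(10, 25, 8)))  # ~31 animals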

In the short term, limiting human-caused deaths of the coastal martens would have the greatest impact on the animal's survival, said Moriarty, who has studied the animals for several years. In the long term, the species requires more habitat, which perhaps could be accomplished by making the adjacent federal land in Siuslaw National Forest suitable for martens.

Credit: 
Oregon State University

Cocoa bean roasting can preserve both chocolate health benefits, taste

image: Researchers studied the impact of whole-bean roasting on the compounds that provide health benefits and aroma-related chemistry under a range of roasting conditions. Their findings suggest that cocoa roasting can be optimized to increase the content of some of the health-boosting compounds while maintaining a favorable taste profile.

Image: 
Pablo Merchán Montes/Unsplash

Manipulating the temperature at which, and the length of time for which, cocoa beans are roasted can simultaneously preserve and even boost the potency of some bioactive and antioxidant compounds while protecting desired sensory aspects of chocolate, according to Penn State researchers.

That finding flies in the face of previous studies that indicate that roasting always results in a reduction in the polyphenol content in the beans. Cocoa polyphenols are believed to have a positive influence on human health, especially with regard to cardiovascular and inflammatory diseases, metabolic disorders and cancer prevention.

Chocolate, a food usually consumed for pleasure, in recent years has been reconsidered as a source of healthy compounds, noted lead researcher Joshua Lambert, associate professor of food science. The goal of the study, he explained, was to learn whether the roasting of cocoa beans could both preserve preferred flavor characteristics and boost health benefits.

Researchers investigated the impact of whole-bean roasting on the polyphenol content, aroma-related chemistry and pancreatic lipase inhibitory activity of cocoa under a range of roasting conditions. The inhibition of pancreatic lipase activity is a potential anti-obesity strategy.

Pancreatic lipase breaks down triglycerides into fatty acids, which then get absorbed through the lining of the small intestine. A pancreatic lipase inhibitor prevents the formation of fatty acids and therefore prevents absorption of dietary fats into the body.

In the study, total phenolics, epicatechin, and smaller proanthocyanidins were reduced by roasting at temperatures under 302 degrees Fahrenheit, Lambert pointed out. By contrast, roasting at 302 F or above increased the levels of catechin and larger proanthocyanidins, which have a greater ability to inhibit pancreatic lipase.

Consistent with these changes, the researchers found that cocoa roasted at 338 F showed greater pancreatic lipase inhibitory activity than cocoa roasted at lower temperatures. Cocoa aroma-related compounds increased with roasting above 212 F, whereas deleterious sensory-related compounds formed at the most severe roasting temperature, 338 F.

The research findings suggest that cocoa roasting can be optimized to increase the content of some polyphenols and boost pancreatic lipase-inhibiting activity, while maintaining a favorable aroma profile, Lambert pointed out.

"Our results show that if you look at the individual polyphenolic content or the individual polyphenol compounds in cocoa, roasting causes some of them to go down while some of them go up," he said. "It is more complicated than saying that roasting leads to a decrease in phenolic content, and that by extension roasting reduces the health beneficial effects of cocoa."

The findings of the research, which were recently published online in Food Chemistry, will be of interest to chocolate makers, Lambert believes, because of an increasing demand for chocolate products offering enhanced health benefits. He cited as an example the Mars company's CocoaVia cocoa-extract supplement, which promises to deliver 375 mg of cocoa flavanols -- antioxidants -- in each serving to promote good health.

Beyond cocoa and chocolate, Lambert predicted that, going forward, more attention will be focused on how processing can affect the health-promoting properties of food.

"The effects of roasting and processing are complex and it's important to better understand what's going on in terms of the effect of the processing on the chemistry of the food," Lambert said. "We need to know how processing really affects the biological activity rather than to make an assumption that processing is always bad and that unprocessed or minimally-processed foods are always more healthful."

A case in point: studies show that the human body absorbs lycopene better from tomato sauce than from raw tomatoes -- cooking tomatoes improves the bioavailability of the antioxidant, he said.

Credit: 
Penn State

Artificial intelligence helps to predict likelihood of life on other worlds

image: Composite image showing an infrared view of Saturn's moon Titan, taken from NASA's Cassini spacecraft. Some measures suggest that Titan has the highest habitability rating of any world other than Earth, based on factors such as availability of energy, and various surface and atmosphere characteristics.

Image: 
NASA / JPL / University of Arizona / University of Idaho

Developments in artificial intelligence may help us to predict the probability of life on other planets, according to new work by a team based at Plymouth University. The study uses artificial neural networks (ANNs) to classify planets into five types, estimating a probability of life in each case, an approach that could be used in future interstellar exploration missions. The work is presented at the European Week of Astronomy and Space Science (EWASS) in Liverpool on 4 April by Mr Christopher Bishop.

Artificial neural networks are systems that attempt to replicate the way the human brain learns. They are one of the main tools used in machine learning, and are particularly good at identifying patterns that are too complex for a biological brain to process.

The team, based at the Centre for Robotics and Neural Systems at Plymouth University, have trained their network to classify planets into five different types, based on whether they are most like the present-day Earth, the early Earth, Mars, Venus or Saturn's moon Titan. All five of these objects are rocky bodies known to have atmospheres, and are among the most potentially habitable objects in our Solar System.

Mr Bishop comments, "We're currently interested in these ANNs for prioritising exploration for a hypothetical, intelligent, interstellar spacecraft scanning an exoplanet system at range."

He adds, "We're also looking at the use of large area, deployable, planar Fresnel antennas to get data back to Earth from an interstellar probe at large distances. This would be needed if the technology is used in robotic spacecraft in the future."

Atmospheric observations - known as spectra - of the five Solar System bodies are presented as inputs to the network, which is then asked to classify them in terms of the planetary type. As life is currently known only to exist on Earth, the classification uses a 'probability of life' metric which is based on the relatively well-understood atmospheric and orbital properties of the five target types.

Bishop has trained the network with over a hundred different spectral profiles, each with several hundred parameters that contribute to habitability. So far, the network performs well when presented with a test spectral profile that it hasn't seen before.
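As a rough illustration of how such a classifier operates (this is not the team's actual network, data or labels), a small feed-forward network can be trained on labelled spectra and then asked for class probabilities on a profile it has not seen before; everything below, including the synthetic "spectra", is assumed for the sketch.

    # Sketch: classify spectra into five planetary types with a small
    # feed-forward neural network. Synthetic stand-in data throughout.
    import numpy as np
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(0)
    CLASSES = ["present-day Earth", "early Earth", "Mars", "Venus", "Titan"]

    # Stand-in training set: 120 spectra, 200 wavelength bins, 5 labels.
    X_train = rng.normal(size=(120, 200))
    y_train = rng.integers(0, 5, size=120)

    net = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0)
    net.fit(X_train, y_train)

    # Class probabilities for an unseen spectral profile.
    unseen = rng.normal(size=(1, 200))
    for label, p in zip(CLASSES, net.predict_proba(unseen)[0]):
        print(f"{label}: {p:.2f}")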

"Given the results so far, this method may prove to be extremely useful for categorising different types of exoplanets using results from ground-based and near Earth observatories" says Dr Angelo Cangelosi, the supervisor of the project.

The technique may also be ideally suited to selecting targets for future observations, given the increase in spectral detail expected from upcoming space missions such as ESA's Ariel Space Mission and NASA's James Webb Space Telescope.

Credit: 
Royal Astronomical Society

Smokers have worse diets than non-smokers

Smokers have worse quality diets than former smokers or non-smokers, according to a study published in the open access journal BMC Public Health.

Dr Jacqueline Vernarelli at Fairfield University, Connecticut and Dr R. Ross MacLean at Yale University evaluated data from 5,293 US adults and found that smokers consumed around 200 more calories a day than non-smokers or former smokers, despite eating significantly smaller portions of food.

Dr Vernarelli commented: "Smokers had diets that were high in energy density, meaning they consumed smaller amounts of food containing a greater number of calories. Non-smokers consumed more food which contained fewer calories."

The researchers found that people who had never smoked consumed around 1.79 kcal per gram of food, daily smokers consumed 2.02 kcal/g and non-daily smokers consumed 1.89 kcal/g. The researchers also found that former smokers consumed more calories per gram of food (1.84 kcal/g) than those who had never smoked, but the former smokers' dietary energy density was still significantly lower than that of current smokers. These findings suggest that any amount of cigarette consumption could be associated with poorer diet quality.
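Energy density itself is a simple ratio: total calories divided by total grams of food eaten. A minimal sketch with made-up meals shows how figures like those above are computed (beverages were handled separately in the study).

    # Dietary energy density = total kcal / total grams of food.
    # Hypothetical meals; beverages are excluded, as in the study.
    meals = [
        {"food": "cheeseburger", "kcal": 550, "grams": 195},
        {"food": "fries",        "kcal": 365, "grams": 115},
        {"food": "apple",        "kcal": 95,  "grams": 182},
    ]

    total_kcal = sum(m["kcal"] for m in meals)
    total_grams = sum(m["grams"] for m in meals)
    print(f"energy density: {total_kcal / total_grams:.2f} kcal/g")  # ~2.05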

The calorie-dense diets consumed by the smokers in this study often included fewer fruits and vegetables, which means their intake of vitamin C was likely to be lower. The authors suggest that this deficiency could put smokers at further risk of cardiovascular disease and cancer, presenting a major public health concern.

The researchers also suggest that a diet low in energy density could help prevent weight gain after quitting smoking.

Dr Vernarelli explained: "We know from the literature that concerns about weight gain are barriers to quitting smoking, and we know that diets high in energy density are associated with higher body weight. Our results suggest that addressing the energy density in diets of current smokers may be a good target for interventions as part of a larger smoking cessation plan."

The researchers used data from 5,293 adults who took part in the National Health and Nutrition Examination Survey, a program of studies designed to assess the health and nutritional status of adults and children in the US. The dietary data used in the study were based on participants' recall of what they had eaten in the previous 24 hours. The mean dietary energy density (kcal/g) was calculated after adjusting for age, sex, race, educational attainment, socioeconomic status, beverage energy density, physical activity and BMI.

The authors caution that the study's use of self-reported survey data may have introduced information and recall bias. The cross-sectional nature of the study does not allow for conclusions about cause and effect between diet quality and smoking.

Credit: 
BMC (BioMed Central)

Three-month-old infants can learn abstract relations before language comprehension

EVANSTON, Ill. --- Three-month-old babies cannot understand words and are just learning to roll over, yet they are already capable of learning abstract relations. In a new study, Northwestern University researchers show for the first time that 3-month-old infants can learn same and different relations.

"Recent theories have suggested that humans' fluency in relational learning -- our ability to make comparisons between objects, events or ideas -- may be the key difference in mental ability between us and other animals," said Dedre Gentner, professor of psychology in the Weinberg College of Arts and Sciences at Northwestern and a senior author of the study. "While some non-human primates can learn abstract relations, they require extensive training -- sometimes thousands of trials -- to do so."

Erin Anderson, lead author of the study and a graduate student in psychology at Northwestern, said, "We know that by four years of age, children can detect and use relations like same and different. What we didn't know was when this ability begins. In this study, we asked whether young infants could form abstract relations."

The researchers began by showing the infants multiple pairs of toys. Half the infants saw pairs of toys that were the same (such as two Elmo dolls); the other half saw pairs with different toys (such as an Elmo doll next to a pink block). Then they tested what infants had learned by measuring their looking time at pairs that they had never seen before -- one pair showing the relation they had seen, and the other pair showing the relation they had not seen. The results showed that infants looked longer at the relation they had not already seen -- even when the objects involved were new to the infants.

"If infants had only learned the pairs we showed them, but not the relation between the toys, they should have looked equally between these pairs of new toys," Anderson said. "What we found, however, was that infants looked longer at pairs showing the unfamiliar relation, showing they had abstracted the relations in a few as six trials. These results constitute the earliest evidence for abstract learning in humans."

Some of the results may be particularly surprising, the researchers noted.

"Contrary to the typical idea that more examples are always better for learning, infants in our study performed better when they had fewer examples, but more chances to compare between them," said Susan Hespos, professor of psychology at Northwestern and a senior author on the study. "Specifically, 3-month-olds learned the relations when they saw two alternating pairs (AA, BB), but not when they saw six pairs (AA, BB, CC, DD, EE, FF) during the learning phase. At this early age, comparison and repetition is more helpful for learning an abstract relation."

The researchers argue that the repetition helped the infants go beyond the objects to focus on the relation between them. This finding is consistent with previous research in children and adults, which shows that attention to individual objects can interfere with relational learning.

"Finding this same pattern suggests that this is a lifelong process from infancy to adulthood," Anderson said. "Until now, it was not clear whether adults' sophisticated relational processing stemmed from experience with language and human culture. That this ability is present at 3 months of age shows that analogical processes exist prior to language learning."

The researchers speculate that language learning may even capitalize on this pre-existing relational ability.

Credit: 
Northwestern University

Astrophysicists map the infant universe in 3-D and discover 4,000 early galaxies

image: This is a view of the COSMOS field in the constellation of Sextans, seen in infrared light. This corresponds closely to the region of the sky studied in the new work.

Image: 
ESO/UltraVISTA team. Acknowledgement: TERAPIX/CNRS/INSU/CASU

Astronomers today announce one of the largest 3D maps of the infant Universe, in a presentation at the European Week of Astronomy and Space Science in Liverpool. A team led by Dr David Sobral of Lancaster University made the chart using the Subaru telescope in Hawaii and the Isaac Newton telescope in the Canary Islands. Looking back in time to 16 different epochs between 11 and 13 billion years ago, the researchers discovered almost 4000 early galaxies, many of which will have evolved into galaxies like our own Milky Way.

Light from the most distant galaxies takes billions of years to reach us. This means that telescopes act as time machines, allowing astronomers to see galaxies in the distant past. The light from these galaxies is also stretched by the expansion of the Universe, increasing its wavelength to make it redder. This so-called redshift is related to the distance of the galaxy. By measuring the redshift of a galaxy, astronomers can thus deduce its distance, how long its light has taken to reach us and hence how far back in time we are seeing it.
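The redshift-to-time conversion can be sketched in a few lines with the astropy library, assuming a standard Planck cosmology is available (the redshift values below are illustrative, not the survey's actual slices).

    # Sketch: redshift -> lookback time under a standard cosmology.
    # Requires astropy; redshifts here are illustrative only.
    from astropy.cosmology import Planck18

    for z in (2.5, 4.0, 5.8):
        t = Planck18.lookback_time(z)
        print(f"z = {z}: light emitted {t:.1f} ago")
    # Redshifts of roughly 2.5-6 give 11-13 billion years, matching the
    # epochs quoted above.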

In the new work the team used filters to sample particular wavelengths of light, and hence specific epochs in the history of the Universe.

Sergio Santos, a Lancaster PhD student and team member, comments: "We used large amounts of data taken with 16 special filters on wide field cameras and processed them here in Lancaster to literally slice the Universe in cosmic time and time-travel to the distant past with 16 well defined cosmic time destinations."

Dr Sobral adds: "These early galaxies seem to have gone through many more 'bursts' when they formed stars, instead of forming them at a relatively steady rate like our own galaxy. Additionally, they seem to have a population of young stars that is hotter, bluer and more metal-poor than those we see today."

Sobral and his team found galaxies that existed when the Universe was only 7 to 20% of its current age, and these hence provide crucial information about the early phases of galaxy formation.

The researchers also found that these early galaxies are incredibly compact. "The bulk of the distant galaxies we found are only about 3 thousand light years across in size, while our Milky Way is about 30 times larger. Their compactness likely explains many of their exciting physical properties that were common in the early Universe", comments Ana Paulino-Afonso, a PhD student in Lancaster and Lisbon. "Some of these galaxies should have evolved to become like our own and thus we are seeing what our galaxy may have looked like 11 to 13 billion years ago."

The team searched for distant galaxies emitting Lyman-alpha radiation, using 16 different narrow and medium band filters over the COSMOS field, which is one of the most widely studied regions of sky outside our Milky Way, located in the direction of the constellation of Sextans. The Lancaster-led team includes young researchers from Leiden, Lisbon and California. The team also publish their findings in two papers in the journal Monthly Notices of the Royal Astronomical Society and the data are now publicly available for other astronomers to make further discoveries.

Credit: 
Royal Astronomical Society

Smartphone app performs better than traditional exam in cardiac assessment

A smartphone application using the phone's camera function performed better than traditional physical examination to assess blood flow in a wrist artery for patients undergoing coronary angiography, according to a randomized trial published in CMAJ (Canadian Medical Association Journal).

These findings highlight the potential of smartphone applications to help physicians make decisions at the bedside. "Because of the widespread availability of smartphones, they are being used increasingly as point-of-care diagnostics in clinical settings with minimal or no cost," says Dr. Benjamin Hibbert of the University of Ottawa Heart Institute, Ottawa, Ontario. "For example, built-in cameras with dedicated software or photodiode sensors using infrared light-emitting diodes have the potential to render smartphones into functional plethysmographs [instruments that measure changes in blood flow]."

The researchers compared the use of a heart-rate monitoring application (the Instant Heart Rate application version 4.5.0 on an iPhone 4S) with the modified Allen test, which measures blood flow in the radial and ulnar arteries of the wrist, one of which is used to access the heart for coronary angiography. A total of 438 participants were split into two groups; one group was assessed using the app and the other using the gold-standard traditional physical examination (the modified Allen test). The smartphone app had a diagnostic accuracy of 94% compared with 84% for the traditional method.
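The principle behind such camera-based plethysmography is simple: each heartbeat changes how much light the fingertip transmits or reflects, so the average brightness of successive video frames traces a pulse waveform. Below is a minimal, hypothetical sketch of that idea on a synthetic signal; a real app must also contend with noise, motion and lighting changes.

    # Sketch of camera-based pulse detection: mean frame brightness rises
    # and falls with each heartbeat. Synthetic 72-bpm signal for the demo.
    import numpy as np
    from scipy.signal import find_peaks

    fps = 30.0                      # camera frame rate
    t = np.arange(0, 10, 1 / fps)   # 10 seconds of "video"
    brightness = 0.5 * np.sin(2 * np.pi * 1.2 * t) + 0.05 * np.random.randn(t.size)

    # Require at least 0.4 s between beats (i.e., below 150 bpm).
    peaks, _ = find_peaks(brightness, distance=fps * 0.4)
    print(f"estimated heart rate: {60 * peaks.size / (t[-1] - t[0]):.0f} bpm")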

"The current report highlights that a smartphone application can outperform the current standard of care and provide incremental diagnostic yield in clinical practice," writes Dr. Hibbert, with colleagues.

"However, while they aren't designed as medical devices -- when smartphones and apps begin to be used clinically -- it is important that they are evaluated in the same rigorous manner by which we assess all therapies and diagnostic tests," says lead author Dr. Pietro Di Santo. "When we designed the iRadialstudy we wanted to hold the technology to the highest scientific standards to make sure the data supporting its use was as robust as possible."

"Although this application is not certified at present for use in health care by any regulatory body, our study highlights the potential for smartphone-based diagnostics to aid in clinical decision-making at the patient's bedside," concludes Dr. Hibbert.

The health care profession and regulatory agencies should proactively address the challenges associated with bringing mobile health (mHealth) solutions into practice to maximize their benefits, writes Dr. Kumanan Wilson, of The Ottawa Hospital and the University of Ottawa, in a related commentary http://www.cmaj.ca/lookup/doi/10.1503/cmaj.180269.

"Referred to as a new industrial revolution, the impact of digital technologies will be both disruptive and transformative," he writes. "The continued maturation of technologies, such as artificial intelligence, virtual reality and blockchain, will further expand the possibilities for mHealth in both diagnosis and treatment in health care."

"Photoplethysmography using a smartphone application for assessment of ulnar artery patency: a randomized clinical trial" is published April 3, 2018.

Credit: 
Canadian Medical Association Journal

Study suggests pasta can be part of a healthy diet without packing on the pounds

TORONTO, April 3, 2018 -- Carbohydrates get a lot of bad press and blame for the obesity epidemic, but a new study suggests that this negative attention may not be deserved for pasta.

Unlike most 'refined' carbohydrates, which are rapidly absorbed into the bloodstream, pasta has a low glycemic index, meaning it causes smaller increases in blood sugar levels than those caused by eating foods with a high glycemic index.

Researchers at St. Michael's Hospital undertook a systematic review and meta-analysis of all of the available evidence from randomized controlled trials, the gold standard of research design. They identified 30 randomized controlled trials involving almost 2,500 people who ate pasta instead of other carbohydrates as part of a healthy low-glycemic index diet. Their results were published today in the journal BMJ Open.

"The study found that pasta didn't contribute to weight gain or increase in body fat," said lead author Dr. John Sievenpiper, a clinician scientist with the hospital's Clinical Nutrition and Risk Modification Centre. "In fact analysis actually showed a small weight loss. So contrary to concerns, perhaps pasta can be part of a healthy diet such as a low GI diet."

The people involved in the clinical trials on average ate 3.3 servings of pasta a week instead of other carbohydrates. One serving equals about one-half cup of cooked pasta. They lost about one-half kilogram over a median follow-up of 12 weeks.

The study authors stressed that these results are generalizable to pasta consumed along with other low-glycemic index foods as part of a low-glycemic index diet. They caution more work is needed to determine if the lack of weight gain will extend to pasta as part of other healthy diets.

"In weighing the evidence, we can now say with some confidence that pasta does not have an adverse effect on body weight outcomes when it is consumed as part of a healthy dietary pattern," said Dr. Sievenpiper.

This work received funding from the Canadian Institutes of Health Research through the Canada-wide Human Nutrition Trialists' Network; the Diet, Digestive Tract and Disease Centre, funded through the Canadian Foundation for Innovation, and the Ministry of Research and Innovation's Ontario Research Fund, among other non-industry sponsors.

Some of the authors have received prior research grants, in-kind donations of pasta for a randomized controlled trial, and travel support from the pasta maker Barilla.

Credit: 
St. Michael's Hospital

Your wood stove affects the climate more than you might think

image: Soot is made up of particles all the way down to nanosize. These emissions are the most difficult ones to get rid of. Researchers are working to figure out what measures need to be taken in the combustion chamber to minimize soot emissions in all furnaces.

Image: 
SINTEF Energy

Norwegians love to heat with wood. That's easy to see when driving around the Norwegian countryside in the winter. Stacks of wood line the walls of houses and smoke rises from the chimneys, especially on cold days.

There was even a national "wood night program" on NRK, the Norwegian Broadcasting Corporation, which ran for 12 hours and attracted international attention because of its unusual theme.

According to figures from Statistics Norway (SSB), 1.2 million Norwegian households heat with wood. They burned 1.1 million tonnes of firewood in 2016, which provided 5.34 TWh of direct heat -- and that might have affected the climate more than you might think.
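As a quick consistency check, those two figures imply an energy yield of roughly 4.9 kWh per kilogram of wood, close to typical firewood heating values; the arithmetic below uses only the article's own numbers.

    # Back-of-the-envelope check of the SSB figures quoted above.
    wood_kg = 1.1e6 * 1000       # 1.1 million tonnes of firewood, in kg
    heat_kwh = 5.34e12 / 1000    # 5.34 TWh of direct heat, in kWh
    print(f"{heat_kwh / wood_kg:.2f} kWh per kg of wood")  # ~4.85 kWh/kg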

"Our findings show a complex picture. This form of heating has a significant warming effect on the climate, which is cause for concern. But at the same time, burning wood also causes significant cooling, which is encouraging," says Anders Arvesen, a researcher in the Industrial Ecology Programme at the Norwegian University of Science and Technology (NTNU).

Arvesen and his colleague Francesco Cherubini were among the co-authors of a major study on climate impacts in Norway just published in Scientific Reports.

The study analyzed so-called stationary bioenergy systems based on heat from wood-burning stoves and from wood biomass-based district heating.

"A lot of research has been done on this topic, but until now we've never had such a comprehensive study of various effects on a national level. This is the first time we've considered all the different factors in a single study," says Cherubini, who is a professor at NTNU's Industrial Ecology Programme.

The study was carried out by CenBio, a research project focused on innovations in bioenergy, and was supported by NTNU in cooperation with SINTEF Energy, the Norwegian Institute of Bioeconomy Research (NIBIO) and the Norwegian University of Life Sciences (NMBU).

OECD member countries decided in 1991 that CO2 emissions from biomass combustion would not count in CO2 emission accounting. The theory was that nature would reabsorb the carbon dioxide released by burning, yielding a net balance of zero. Unfortunately, it's not quite that simple.

"Bioenergy from forests is carbon neutral in the sense that forests are a renewable resource. The trees will absorb CO2 as they grow, but temporarily there will be a greater amount of CO2 in the atmosphere," explains Arvesen.

Logging can adversely affect the climate, for example through emissions from heavy logging machinery. But the logged areas themselves can actually have a cooling effect, because open areas reflect more of the incoming sunlight back into the atmosphere than wooded areas do.

"The cooling effect varies depending on where in the country the logging takes place, since different parts of the country have varying snow conditions and forest density," says Arvesen.

The CenBio study took these factors into consideration. It also analysed how other emissions from burning wood affect the climate. Methane gas and assorted particles also flow out of Norwegian chimneys.

These particles can both absorb and reflect solar radiation. Whereas organic carbon particles have a cooling effect, black carbon - also known as soot - has a warming effect on the climate.

Black carbon also destroys some of the snow's ability to reflect sunlight because it changes the colour of the snow landscape and contributes to increased snowmelt.

Black carbon from biomass combustion accounts for a warming effect equivalent to 1.6 million tonnes of CO2 in Norway, according to the study.

"Our analysis indicates that black carbon is the main reason for climate warming. I was surprised how important the effect of soot was, although it wasn't completely unexpected. Burning wood creates a lot of dust emissions," says Arvesen.

Cherubini thinks more research is needed in this area. He points out that reducing black carbon emissions will also have a positive health effect due to improved air quality.

Nevertheless, "it's still better to heat with wood than to burn fossil fuels," says Cherubini. He emphasizes that more technological possibilities are being developed that will result in new and better wood stoves and furnaces. Until they come on the market, people can reduce particle emissions by replacing their old stoves.

Many Norwegians have already replaced their old woodstoves with newer and cleaner-burning stoves, which has more than halved soot emissions since the early 2000s.

By 2016, 730,000 wood-burning households were using new technology. Overall, heating with wood in Norway has decreased slightly. The amount of wood burned in furnaces with old technology has decreased by more than 75 per cent in the past 20 years, according to Statistics Norway.

Old woodstoves emit more black carbon than new ones. And if the positive development continues, woodstove emissions should drop to the same level as pellet stoves in the near future. The challenge lies in the smallest particles, according to SINTEF scientist Morten Seljeskog.

"Soot is made up of particles as small as nanosize. These emissions are the most difficult ones to get rid of. Researchers are working to figure out what physical measures need to be taken in the combustion chamber to minimize soot emissions in all furnaces," he says.

Credit: 
Norwegian University of Science and Technology

When we sign, we build phrases with similar neural mechanisms as when we speak

Differences between signed and spoken languages are significant, yet the underlying neural processes we use to create complex expressions are quite similar for both, a team of researchers has found.

"This research shows for the first time that despite obvious physical differences in how signed and spoken languages are produced and comprehended, the neural timing and localization of the planning of phrases is comparable between American Sign Language and English," explains lead author Esti Blanco-Elorrieta, a doctoral student in New York University's Department of Psychology and NYU Abu Dhabi Institute.

The research is reported in the latest issue of the journal Scientific Reports.

"Although there are many reasons to believe that signed and spoken languages should be neurobiologically quite similar, evidence of overlapping computations at this level of detail is still a striking demonstration of the fundamental core of human language," adds senior author Liina Pylkkanen, a professor in New York University's Department of Linguistics and Department of Psychology.

The study also included Itamar Kastner, an NYU doctoral student at the time of the study and now at Berlin's Humboldt University, and Karen Emmorey, a professor at San Diego State University and a leading expert on sign language, who adds, "We can only discover what is universal to all human languages by studying sign languages."

Past research has shown that structurally, signed and spoken languages are fundamentally similar. However, less clear is whether the same circuitry in the brain underlies the construction of complex linguistic structures in sign and speech.

To address this question, the scientists studied the production of multiple two-word phrases in American Sign Language (ASL) as well as speech by deaf ASL signers residing in and around New York and hearing English speakers living in Abu Dhabi.

Signers and speakers viewed the same pictures and named them with semantically identical expressions. In order to gauge the study subjects' neurological activity during this experiment, the researchers deployed magnetoencephalography (MEG), a technique that maps neural activity by recording magnetic fields generated by the electrical currents produced by our brain.

For both signers and speakers, phrase building engaged the same parts of the brain with similar timing: the left anterior temporal and ventromedial cortices, despite different linguistic articulators (the vocal tract vs. the hands).

The researchers point out that this neurobiological similarity between sign and speech, then, goes beyond basic similarities and into more intricate processes--the same parts of the brain are used at the same time for the specific computation of combining words or signs into more complex expressions.

Credit: 
New York University

Medicare program linked with reduced black-white disparities in hospital readmissions

Key takeaways:

Black-white disparities in hospital readmission rates in the U.S. narrowed after the introduction of a Medicare program that penalizes higher-than-expected readmissions.

Minority-serving hospitals continue to disproportionately receive penalties for their readmission rates, suggesting that more work needs to be done to ensure that pay-for-performance programs promote greater equity in care.

Boston, MA - A Medicare program that penalizes hospitals for high readmission rates was associated with a narrowing of readmission disparities between black and white patients and between minority-serving hospitals and other hospitals in the U.S., according to a new study from Harvard T.H. Chan School of Public Health.

The study also found that, in spite of the reductions in disparities, black-white gaps still persisted, and that minority-serving hospitals--which disproportionately care for black Medicare patients--continued to be more likely to be penalized by the Medicare program.

The study will be published April 2, 2018 in Health Affairs. After the embargo lifts, the study will be posted here: http://www.healthaffairs.org/doi/abs/10.1377/hlthaff.2017.1034

"It should be reassuring for policymakers that the introduction of this Medicare program was associated with a narrowing of disparities in high readmission rates between black and white patients," said lead author José Figueroa, Burke Fellow at the Harvard Global Health Institute (HGHI) and a physician at Brigham and Women's Hospital. "However, more work needs to be done since disparities persist."

Medicare's Hospital Readmissions Reduction Program (HRRP) was established in 2012 as part of the Affordable Care Act. Prior to its start, there was evidence that black patients had, on average, 20% higher readmission rates than white patients, and that hospitals serving a higher proportion of black patients had higher readmission rates than other hospitals. Previous evidence suggested that the HRRP may have helped lower readmission rates for all Medicare patients over time, but its impact on minority populations and the hospitals that serve them was unknown.

In the new study, Harvard Chan School researchers compared trends in 30-day readmission rates among non-Hispanic black and non-Hispanic white patients, and among minority-serving and other hospitals, from 2007 to 2014. They analyzed national Medicare data from 6.3 million hospital admissions for patients with acute myocardial infarction, congestive heart failure, and pneumonia. Data came from 2,960 hospitals across the country, of which 283 were identified as minority-serving.

The researchers found that, prior to the HRRP era (January 2007-March 2010), readmissions rates were relatively flat or slightly increasing for both white and black patients. During the HRRP implementation phase (April 2010-September 2012) when hospitals knew that readmission penalties would soon begin, readmission rates improved both for blacks and whites, declining on average 0.45% per quarter for black patients and 0.36% per quarter for white patients. In the period after HRRP penalties were introduced (October 2012-December 2014), improvements in 30-day readmission rates slowed.

Overall, black patients' 30-day readmission rates fell from a high of 24.5% in 2010 to 18.9% in 2014, while white patients' rates fell from a high of 22.5% to 17.7%.

Even though minority-serving hospitals made more improvements than other hospitals, they were still more likely to be penalized because the HRRP program rewards hospitals based on their ranking relative to each other and not based on their own improvement over time, according to the study. The authors speculated that minority-serving hospitals' lack of resources may hamper their efforts to reduce readmissions.

"To better incentivize and reward all hospitals, including those at the bottom, policymakers should consider changes to how penalties are determined in the HRRP," Figueroa said.

Credit: 
Harvard T.H. Chan School of Public Health

Using water molecules to read electrical activity in lipid membranes

image: EPFL researchers were able to map out in real time how charges are transported across and along membranes simply by observing the behavior of adjacent water molecules.

Image: 
Jamani Caillet/EPFL

Every human cell is encased in a five-nanometer-thick lipid membrane that protects it from the surrounding environment. Like a gatekeeper, the membrane determines which ions and molecules can pass through. In so doing, it ensures the cell's well-being and stability and allows it to communicate via electrical signals.

Researchers from the Laboratory for fundamental BioPhotonics (LBP) in EPFL's School of Engineering were able to track these moving charges in real time in a completely non-invasive manner. Rather than observing the membranes themselves, they looked at the surrounding water molecules, which, in addition to keeping the membrane intact, change orientation in the presence of electrical charges. So by 'reading' their position, the researchers were able to create a dynamic map of how charges are transported across a membrane.

The researchers' method has just been published in the journal Proceedings of the National Academy of Sciences (PNAS). It could shed light on how ion channels function, along with other processes at work in membranes. This clinically viable method could potentially also be used to directly track ion activity in neurons, which would deepen researchers' knowledge of how nerve cells work. "Water molecules can be found wherever there are lipid membranes, which need these molecules to exist," says Sylvie Roke, head of the LBP. "But until now, most studies on membranes didn't look at these molecules. We've shown that they contain important information."

The researchers did this by using a unique second-harmonic microscope that was invented at the LBP. The imaging efficiency of this microscope is more than three orders of magnitude greater than that of existing second-harmonic microscopes. With this microscope, the researchers obtained images of water molecules at a time scale of 100 milliseconds.

To probe the lipid membranes' hydration, the researchers combine two lasers of the same frequency (femtosecond pulses) in a process that generates photons with a different frequency: this is known as second-harmonic light. It is generated only at interfaces and reveals information on the orientation of water molecules. "We can observe what's happening in situ, and we don't need to modify the environment or use bulky markers like fluorophores that would disturb water molecules' movement," says Orly Tarun, the publication's lead author.
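The frequency doubling at the heart of the technique follows from the standard second-order response of a medium, a textbook relation rather than anything specific to this paper. Driving the sample with a field E(t) = E_0 cos(omega t) produces a nonlinear polarization

    P^{(2)}(t) = \varepsilon_0\, \chi^{(2)} E(t)^2
               = \tfrac{1}{2}\, \varepsilon_0\, \chi^{(2)} E_0^2 \bigl(1 + \cos(2\omega t)\bigr),

whose cos(2 omega t) term radiates light at exactly twice the input frequency. Because chi^(2) vanishes in centrosymmetric bulk media such as liquid water, this signal arises only where that symmetry is broken, which is why second-harmonic light reports selectively on interfaces.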

Unexpected charge fluctuations are observed

With this method, the researchers observed charge fluctuations in membranes. Such fluctuations were previously unknown and hint at much more complex chemical and physical behavior than is currently considered.

Credit: 
Ecole Polytechnique Fédérale de Lausanne

NASA satellite gets an eye-opening look at Super Typhoon Jelawat

image: On March 30 at 12:12 a.m. EDT (0412 UTC) NASA-NOAA's Suomi NPP satellite captured a visible image of Super Typhoon Jelawat that clearly showed an eye had formed.

Image: 
NOAA/NASA Goddard Rapid Response Team

Satellite imagery showed that Tropical Cyclone Jelawat had developed an eye as it strengthened into a Super Typhoon.

On March 30, Jelawat was moving through the Philippine Sea. At 11 a.m. EDT (1500 UTC), Jelawat was a super typhoon with maximum sustained winds near 150 mph (130 knots/241 kph). The center of circulation was near 17.1 degrees north latitude and 139.5 degrees east longitude, approximately 384 nautical miles west-northwest of Andersen Air Force Base. Jelawat was tracking east-northeastward at 12.6 mph (11 knots/20.3 kph).

Super-typhoon is a term utilized by the U.S. Joint Typhoon Warning Center for typhoons that reach maximum sustained 1-minute surface winds of at least 130 knots/150 mph. This is the equivalent of a strong Saffir-Simpson category 4 or category 5 hurricane in the Atlantic basin or a category 5 severe tropical cyclone in the Australian basin.
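The wind-speed equivalences quoted here are straightforward unit conversions (1 knot = 1.15078 mph = 1.852 km/h), as the short check below confirms.

    # Verify the article's wind-speed conversions from knots.
    KNOT_TO_MPH = 1.15078
    KNOT_TO_KPH = 1.852

    knots = 130  # Jelawat's peak winds / the super-typhoon threshold
    print(f"{knots} kt = {knots * KNOT_TO_MPH:.0f} mph "
          f"= {knots * KNOT_TO_KPH:.0f} kph")  # 130 kt = 150 mph = 241 kph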

On March 30 at 12:12 a.m. EDT (0412 UTC) the Visible Infrared Imaging Radiometer Suite (VIIRS) instrument aboard NASA-NOAA's Suomi NPP satellite captured a visible image of Jelawat. The VIIRS image revealed an eye about 6 nautical miles wide, surrounded by strong convective storms. The image was created at NASA's Goddard Space Flight Center in Greenbelt, Md.

The Joint Typhoon Warning Center (JTWC) noted that by April 1, Jelawat is forecast "to move into an area of strong upper level westerly flow, which will result in increasing vertical wind shear and consequent steady weakening throughout the forecast period."

Credit: 
NASA/Goddard Space Flight Center