Tech

NASA-NOAA's Suomi NPP Satellite gets an infrared view of Typhoon Trami

image: The VIIRS instrument on the Suomi NPP satellite flew over Typhoon Trami at 1724 UTC (1:24 p.m. EDT) on Sept. 25. Cloud top temperatures were near 190 Kelvin (minus 117.7 degrees Fahrenheit/minus 83.1 degrees Celsius) around the eye of the storm.

Image: 
UWM/SSEC/CIMSS, William Straka III

Typhoon Trami looked formidable in infrared imagery taken by NASA-NOAA's Suomi NPP satellite as it moved toward the southern islands of Japan.

NASA-NOAA's Suomi NPP satellite provided forecasters with a night-time and infrared look at Trami's clouds on Sept. 25 at 1724 UTC (1:24 p.m. EDT). Cloud top temperatures were near 190 Kelvin (minus 117.7 degrees Fahrenheit/minus 83.1 degrees Celsius) around the eye of the storm. Cloud tops that cold can produce heavy rainfall.
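As a quick sanity check on those unit conversions, here is a minimal sketch (illustrative only, not NASA code) converting the reported cloud-top temperature from Kelvin:

```python
def kelvin_to_celsius(k):
    """Convert a temperature from Kelvin to degrees Celsius."""
    return k - 273.15

def kelvin_to_fahrenheit(k):
    """Convert a temperature from Kelvin to degrees Fahrenheit."""
    return k * 9 / 5 - 459.67

# Cloud-top temperature reported around Trami's eye
cloud_top_k = 190.0
print(f"{kelvin_to_celsius(cloud_top_k):.2f} C")    # -83.15 C
print(f"{kelvin_to_fahrenheit(cloud_top_k):.2f} F")  # -117.67 F
```

Both figures round to the values quoted in the release.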

William Straka III of the Cooperative Institute for Meteorological Satellite Studies (CIMSS) at the University of Wisconsin-Madison's Space Science and Engineering Center (SSEC) created the images. Straka said, "The infrared imagery showed a wide eye along with the obvious convection and tropospheric gravity waves."

At 11 a.m. EDT (1500 UTC), Trami's center was located near latitude 21.3 degrees North and longitude 129.3 degrees East, about 321 nautical miles south-southeast of Kadena Air Base, Okinawa, Japan. Maximum sustained winds were near 103 mph (90 knots/166 kph) with higher gusts.

The Joint Typhoon Warning Center forecast calls for Trami to weaken slowly as it moves north, then re-strengthen to 110 knots before becoming extra-tropical as it passes Honshu.

Credit: 
NASA/Goddard Space Flight Center

Stanford engineers study hovering bats and hummingbirds in Costa Rica

Each sunrise in Las Cruces, Costa Rica, Rivers Ingersoll's field team trekked into the jungle to put the finishing touches on nearly invisible nets. A graduate student in the lab of David Lentink, assistant professor of mechanical engineering at Stanford University, Ingersoll needed these delicate nets to catch, study and release the region's abundant hummingbirds and bats - the only two vertebrates with the ability to hover in place.

"We're really interested in how hovering flight evolved," said Ingersoll. "Nectar bats drink from flowers like hummingbirds do, so we want to see if there's any similarities or differences between these two different taxa."

Ingersoll's nets worked, and he ended up examining over 100 individual hummingbirds and bats, covering 17 hummingbird and three bat species, during his field study, the results of which the group published in Science Advances.

Through a combination of high-speed camera footage and aerodynamic force measurements, he and his fellow researchers found that hummingbirds and bats hover in very different ways. Yet they also found that nectar bats' hovering shares some similarities with hummingbird hovering - which fruit bats do not share. This suggests that nectar bats evolved a different way of hovering from other bats in order to drink nectar.

In addition to learning more about bats and hummingbirds, Lentink and others can apply what they learned to engineering problems, such as designing flying robots. Engineers have already created robots inspired by hummingbirds and bats but haven't known which of the natural counterparts of these robots hover most effectively.

Watching every feather

It is simple to imagine how a flying animal supports itself by flapping downward, but in order to avoid exaggerated bobbing up and down, hovering animals must maintain this support while flapping upward as well. Hummingbirds and bats accomplish this feat by twisting their wings backward on the upstroke, continuously pushing air downward to keep them steadily aloft.

"If you look amongst vertebrates, there are two that can hover in a sustained way," said Lentink. "Those are hummingbirds and nectar bats. And you'll find both in the neotropics, like Costa Rica."

To study these subjects, Ingersoll collaborated with a long-standing bird banding project run by Stanford ecologists in Las Cruces. Borrowing birds and bats from their project, he placed each animal in a flight chamber outfitted with aerodynamic force sensors at the top and bottom of the chamber - equipment developed by Lentink's lab to measure extremely small changes in vertical force at 10,000 times per second. These plates are so sensitive that they captured the vertical forces produced by every twist and flutter of hummingbirds that weighed as little as 2.4 grams.

By syncing those force measurements with multiple high-speed cameras recording at 2,000 frames per second, the researchers could isolate any moment of their subjects' flights to see how the lift they were generating related to the shape of their wings.
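With the force plates sampling at 10,000 times per second and the cameras at 2,000 frames per second, each video frame spans exactly five force samples. The alignment can be sketched as follows; this is a simplified illustration assuming both instruments start on a shared trigger, not the lab's actual code:

```python
FORCE_HZ = 10_000   # force-plate sampling rate reported in the article
VIDEO_FPS = 2_000   # high-speed camera frame rate reported in the article
SAMPLES_PER_FRAME = FORCE_HZ // VIDEO_FPS  # 5 force samples per video frame

def force_window_for_frame(frame_index):
    """Indices of the force samples recorded during one video frame,
    assuming both devices begin recording on the same trigger."""
    start = frame_index * SAMPLES_PER_FRAME
    return range(start, start + SAMPLES_PER_FRAME)

def mean_force_for_frame(forces, frame_index):
    """Average vertical force over the samples spanning one frame."""
    window = [forces[i] for i in force_window_for_frame(frame_index)]
    return sum(window) / len(window)
```

This mapping lets every frame of wing motion be paired with the vertical force the animal generated during that instant.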

"I'd sit and wait for the hummingbird to feed at the flower. Once it was feeding, I would trigger the cameras and the force measurements and we'd get four seconds of footage of the hummingbird flapping at the flower," said Ingersoll.

After their short stint in the flight chamber, Ingersoll returned the birds and bats to where they were caught and released them. The whole process took between one and three hours.

Different ways to hover

The researchers found that the bats and hummingbirds all exerted a similar amount of energy relative to their weight during these flights but that the hummingbirds, fruit bats and nectar bats all hovered in very different ways. The hummingbirds hovered in a more aerodynamically efficient way than the bats - the hummingbirds generated more lift relative to drag. In comparing wing shapes, the researchers found this efficiency is likely because the hummingbirds invert their wings more easily. Although the bats struggled with turning over their wings, they exerted a comparable amount of energy because they have bigger wings and larger strokes.

The researchers were surprised to find that nectar bats, which sidle up to flowers like hummingbirds, generated more upward force on the upstroke than fruit bats. Looking at their wing shape, the researchers found that nectar bats can twist their wings much more than fruit bats on the upstroke. So nectar bats' hovering form is like a blend of fruit bats' and hummingbirds' hovering.

The researchers plan to build on these findings as part of their work on flapping robots and drones but Lentink also sees potential for more work beyond the lab.

"When Rivers proposed to do this study in Costa Rica, a field study was something I'd never hoped for. Now, he really inspired me," said Lentink. "There are about 10,000 species of birds and most of them have never been studied. It sounds like too big a study to embark on but that's what I dream about."

Credit: 
Stanford University

Your Facebook friends don't mean it, but they're likely hurting you daily

BUFFALO, N.Y. - Social media sites often present users with social exclusion information that may actually inhibit intelligent thought, according to the co-author of a University at Buffalo study that takes a critical look not just at Facebook and other similar platforms, but at the peculiarities of the systems on which these sites operate.

The short-term effects of these posts create negative emotions in the users who read them, and may affect thought processes in ways that make users more susceptible to advertising messages.

What's particularly alarming is that the social exclusion present in these posts is not intentional. Users are not callously sharing exclusion information with their friends. Nevertheless, social media sites by design make most information visible from one friend to another, and the consequences of how these messages are interpreted are significant.

"These findings are compelling," says Michael Stefanone, an associate professor in UB's Department of Communication and an expert in computer-mediated communication and social networks. "We're using these technologies daily and they're pushing information to users about their networks, which is what the sites are designed to do, but in the end there's negative effect on people's well-being."

The results of the study with lead author Jessica Covert, a graduate student in UB's Department of Communication, appear in the journal Social Science Computer Review.

"These findings are not only significant because we are talking about individuals' emotions here, but it also raises questions about how exposure to these interactions affect one's day-to-day functioning," says Covert. "Offline research suggests that social exclusion evokes various physical and psychological consequences such as reduced complex cognitive thought.

"Considering the amount of time individuals spend online, it is important to investigate the effects of online social exclusion."

At a glance, the posts at the center of the study seem harmless. Users open Facebook to see exchanges among friends that unintentionally exclude them.

It happens all the time. Right?

"Yes," says Stefanone. "It happened to me the other night. I see my friends are doing something while I'm sitting at home. It's not devastating, but there's that moment when I felt badly."

The point, says Stefanone, is the messages can be interpreted in a way that people feel left out. And that feeling, as innocuous as it might seem, is not easily dismissed.

"Social exclusion, even something that might seem trivial, is one of the most powerful sanctions people can use on others and it can have damaging psychological effects," says Stefanone. "When users see these exclusion signals from friends - who haven't really excluded them, but interpret it that way - they start to feel badly."

It's at this point that the brain's self-regulating function should take over, according to Stefanone.

That self-regulation quickly moderates the negative feelings that result from the interpretation, but it consumes mental resources, leaving fewer available for intelligent thought.

"If users are busy self-regulating because of what they read on Facebook there's evidence that doing so reduces a level of intelligent thought, which can make them more open to persuasive messaging."

"Facebook's entire business model is built on advertising. It's nothing but an advertising machine," says Stefanone. "Given Facebook's annual ad revenue, I think it's a conversation worth having, that regular, benign and common use of this platform can lead to short-term inhibition of intelligent thought."

For the study, Covert and Stefanone created scenarios designed to mirror typical interactions on Facebook, and 194 individuals participated in an experiment that controlled their exposure to social exclusion information. The researchers presented one group with a scenario involving two good friends, where one of those friends had shared information that excluded the participant. The other group saw a feed that presented no social exclusion information.

Results indicated that individuals exposed to social exclusion information involving their close friends experienced greater negative emotions than the control group. They also had a tendency to devote more mental resources toward understanding their social networks, making them particularly sensitive to stimuli such as advertising.

Stefanone says plans for the future include replicating the current experiment and then measuring changes in intelligent thought using standardized test questions.

"I think the most important thing we all have to remember is to think carefully about our relationship with these corporations and these social networking platforms," says Stefanone. "They do not have our best interests in mind."

Credit: 
University at Buffalo

Researchers identify marker in brain associated with aggression in children

image: A University of Iowa-led research team has identified a brain-wave marker associated with aggression in young children. The finding could lead to earlier identification of toddlers with aggressive tendencies before the behavior becomes more ingrained in adolescence.

Image: 
Tim Schoon, University of Iowa

Imagine a situation where one child is teasing another. While the child doing the teasing means it playfully, the other child views it as hostile and responds aggressively.

Behavior like this happens all the time with children, but why some react neutrally and others act aggressively is a mystery.

In a new study, a University of Iowa-led research team reports it has identified a brain marker associated with aggression in toddlers. In experiments measuring a type of brain wave in 2½ to 3½-year-old children, toddlers who had smaller spikes in the P3 brain wave when confronted with a situational change were more aggressive than children registering larger P3 brain-wave peaks.

The results could lead to identifying at an earlier stage children who are at risk of aggressive behavior and could help stem those impulses before adolescence, an age at which research has shown aggressive behavior is more difficult to treat.

"There are all kinds of ambiguous social cues in our environment," says Isaac Petersen, assistant professor in the Department of Psychological and Brain Sciences at the UI and corresponding author on the study. "And, when children aren't able to detect a change in social cues, they may be more likely to misinterpret that social cue as hostile rather than playful.

"Children respond to the same social cues in different ways, and we think it's due to differences in how they interpret that cue, be it neutral or hostile," Petersen says.

The P3 wave is part of a series of brain waves generated when an individual evaluates and responds to a change in the environment--such as changed cues in a social interaction. Previous research, primarily in adults, has shown individuals with shorter P3-wave peaks when confronted with a change in the environment tend to be more aggressive. Scientists therefore believe P3 is a key indicator of aggression; it has also been associated with depression and schizophrenia.

To tease out those differences in children, the researchers recruited 153 toddlers and, in individual sessions, outfitted each with a net of head sensors that measured brain-wave activity while a steady stream of tones sounded in the room. As the children watched silent cartoons on a television screen, the pitch of the tones changed, and the researchers measured the P3 brain wave accompanying each change in pitch.

The change in pitch is analogous to a change in a social interaction: in both cases the brain, consciously or subconsciously, reacts to a change in its environment.

Toddlers with a shorter peak in the P3 brain wave accompanying the tone change were rated by their parents as more aggressive than children with more pronounced P3 spikes.

The difference in P3 peaks in aggressive and non-aggressive children "was statistically significant," Petersen says, and the effect was the same for boys and girls.

"Their brains are less successful at detecting changes in the environment," Petersen says of the children with shorter P3 brain-wave peaks. "And, because they're less able to detect change in the environment, they may be more likely to misinterpret ambiguous social information as hostile, leading them to react aggressively. This is our hypothesis, but it's important to note there are other possibilities that may explain aggression that future research should examine."

The researchers tested the same children at 30, 36, and 42 months of age to further explore the association between the P3 brain wave and aggression.

"This brain marker has not been widely studied in children and never studied in early childhood in relation to aggression," says Petersen, who has an appointment in the Iowa Neuroscience Institute. "It might be one of a host of tools that can be used in the future to detect aggression risk that might not show up on a behavioral screening."

The research is important because early interventions are more effective for stemming aggression, says Petersen, who is a clinical psychologist.

"Evidence suggests that early interventions and preventive approaches are more effective for reducing aggression than interventions that target aggression later in childhood or in adolescence when the behavior is more ingrained and stable," he says.

The paper was published Sept. 26 in the Journal of Child Psychology and Psychiatry. It is titled "A longitudinal, within-person investigation of the association between the P3 ERP component and externalizing behavior problems in young children."

The children were tested at Indiana University-Bloomington. Contributing authors at Indiana University include Caroline Hoyniak and John Bates. Angela Staples at Eastern Michigan University and Dennis Molfese at the University of Nebraska-Lincoln also are contributing authors.

Credit: 
University of Iowa

Bariatric surgery linked to safer childbirth for the mother

Obese mothers who lose weight through bariatric surgery can have safer deliveries. The positive effects are many, including fewer caesarean sections, infections, tears and haemorrhages, and fewer cases of post-term delivery or uterine inertia. This according to an observational study by researchers at Karolinska Institutet in Sweden published in PLOS Medicine.

Today, more than one in every three women admitted into prenatal care are either obese or overweight, and statistics from the National Board of Health and Welfare show that this is a rising trend.

"We know that obesity and overweight are dangerous in connection with childbirth," says Dr Olof Stephansson, obstetrician and researcher at Karolinska Institutet's Department of Medicine in Solna. "Bariatric surgery is by far your best option if you want a lasting weight reduction over time."

The research group to which he belongs has conducted several studies of how bariatric surgery affects pregnancy and childbirth. This latest study compared deliveries in 1,431 women who had achieved considerable weight loss after bariatric surgery with those in 4,476 women who had not undergone surgery. The women in the control group had the same BMI during early pregnancy as the experimental group had had before surgery.

"The effects were quite salient, and all of those we studied were to the benefit of the women who'd had surgery," says Dr Stephansson. "There was a lower proportion of C-sections, fewer induced deliveries, a lower proportion of post-term deliveries, less frequent epidurals and fewer cases of uterine inertia, infection, perineal tears and haemorrhaging."

The comparison was done using the Scandinavian Obesity Surgery Registry (SOReg) and the Medical Birth Registry. The positive effect is thought to be attributable to the considerable weight loss that these women have undergone, on average 38 kilos from the time of surgery to the start of pregnancy. At the same time, earlier studies show that women who have bariatric surgery run a slightly greater risk of pre-term delivery and having babies that are small for their gestational age.

"It's therefore not as simple as just advising every woman who's overweight to have bariatric surgery," says Dr Stephansson. "But going by the results of this study, it has positive effects for mothers. More studies are needed in which we weigh up outcomes so that we can give a more general recommendation."

The Swedish studies should also be complemented with similar studies in other countries, he says. Internationally, Sweden stands out in terms of the volume of bariatric surgery performed.

"Today, one per cent of all babies born in Sweden have mothers who have had bariatric surgery. It might not sound like many, but this number is rising. When the women start to lose weight their fertility quickly recovers, so this too is improved by the operation."

The research is patient-centred and conducted by people with different competencies, including obstetricians, nutritionists, surgeons and epidemiologists.

"Thanks to the fact that we combine good study design and method with proximity to clinic and patient, we can also conduct really good studies," says Olof Stephansson.

Credit: 
Karolinska Institutet

Microplastics found deep in sand where turtles nest

image: Microplastic pollution on a beach in Cyprus.

Image: 
Jessica Arrowsmith

Microplastics have been found deep in the sand on beaches where sea turtles lay their eggs.

University of Exeter scientists found an average of 5,300 particles of plastic per cubic metre at depths of 60cm (2ft) on beaches in Cyprus used by green turtles and loggerheads.

At the surface, up to 130,000 fragments of plastic were found per cubic metre - the second-worst level ever recorded on a beach (the worst was in Guangdong, South China).

Researchers say that if conditions worsen such pollution could eventually begin to affect hatching success and even the ratio of male and female turtle hatchlings.

"We sampled 17 nesting sites for loggerhead and green turtles and found microplastics at all beaches and all depths," said Dr Emily Duncan, of the Centre for Ecology and Conservation on the University of Exeter's Penryn Campus in Cornwall.

"Microplastics have different physical properties to natural sediments, so high levels could change the conditions at hatching sites, with possible effects on turtle breeding.

"For example, the temperature at which the egg incubates affects the sex of the hatchling - with females more likely in warmer conditions.

"More research is needed to investigate what effects microplastics have on turtle eggs."

Microplastics - defined as less than 5mm in diameter - come from numerous sources including discarded plastic items that have broken apart, microbeads from cosmetics and microfibres from clothes.

Of the microplastics categorised in this research, nurdles (pellets used in the production of plastic products) and hard fragments broken from larger items were the most common.

"Unlike the beaches in China where the highest levels of microplastics have been recorded, these beaches in Cyprus are located far from industrial practices and aren't visited by large numbers of people," said Professor Brendan Godley, leader of the University of Exeter's marine strategy.

"Therefore it seems that microplastics are arriving on ocean currents. In this case, our analysis suggests most of it came from the eastern Mediterranean basin. This is also true of the large plastic items found on the beaches in Cyprus in large numbers."

The findings support the theory that beaches act as a "sink" for marine microplastics, becoming key areas for contamination.

The research is based on 1,209 sediment samples taken from 17 turtle nesting beaches in northern Cyprus.

The paper, published in Marine Pollution Bulletin, is entitled: "The true depth of the Mediterranean plastic problem: Extreme microplastic pollution on marine turtle nesting beaches in Cyprus."

Credit: 
University of Exeter

Impact of WWII bombing raids felt at edge of space

image: Bombing of a factory at Marienburg, Germany, on Oct. 9, 1943.

Image: 
US Air Force

Bombing raids by Allied forces during the Second World War not only caused devastation on the ground but also sent shockwaves through Earth's atmosphere which were detected at the edge of space, according to new research. University of Reading researchers have revealed the shockwaves produced by huge bombs dropped by Allied planes on European cities were big enough to weaken the electrified upper atmosphere - the ionosphere - above the UK, 1000km away. The results are published today in the European Geosciences Union journal Annales Geophysicae.

Scientists are using the findings to further understanding of how natural forces from below, like lightning, volcanic eruptions and earthquakes, affect Earth's upper atmosphere.

Chris Scott, Professor of Space and Atmospheric Physics, said: "The images of neighbourhoods across Europe reduced to rubble due to wartime air raids are a lasting reminder of the destruction that can be caused by man-made explosions. But the impact of these bombs way up in the Earth's atmosphere has never been realised until now."

"It is astonishing to see how the ripples caused by man-made explosions can affect the edge of space. Each raid released the energy of at least 300 lightning strikes. The sheer power involved has allowed us to quantify how events on the Earth's surface can also affect the ionosphere."

In the study published today in Annales Geophysicae researchers looked at daily records at the Radio Research Centre in Slough, UK, collected between 1943-45. Sequences of radio pulses over a range of shortwave frequencies were sent 100-300km above the Earth's surface to reveal the height and electron concentration of ionisation within the upper atmosphere.
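The physics behind such soundings is the standard relation between radio reflection and electron density: a pulse of frequency f is reflected at the altitude where the local plasma frequency equals f, with f_p ≈ 8.98·√N_e (f_p in hertz, N_e in electrons per cubic metre). A minimal sketch using that textbook approximation with illustrative values, not the study's data:

```python
import math

def electron_density_from_critical_freq(f_hz):
    """Electron density (m^-3) at which the plasma frequency equals f_hz,
    using the standard approximation f_p ~ 8.98 * sqrt(N_e)."""
    return (f_hz / 8.98) ** 2

def critical_freq_from_density(n_e):
    """Plasma (critical) frequency in Hz for electron density n_e (m^-3)."""
    return 8.98 * math.sqrt(n_e)

# Example: a 5 MHz shortwave pulse reflects where N_e is about 3.1e11 m^-3
n_e = electron_density_from_critical_freq(5e6)
print(f"{n_e:.2e}")  # 3.10e+11
```

Sweeping the pulse frequency and timing the echoes is what let the Slough records map both the height and the electron concentration of the ionosphere.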

The strength of the ionosphere is known to be strongly influenced by solar activity, but the ionosphere is far more variable than can be explained by current modelling. The ionosphere affects modern technologies such as radio communications, GPS systems, radio telescopes and some early warning radar, however the extent of the impact on radio communications during the Second World War is unclear.

Researchers studied the ionosphere response records around the time of 152 large Allied air raids in Europe and found the electron concentration significantly decreased due to the shockwaves caused by the bombs detonating near the Earth's surface. This is thought to have heated the upper atmosphere, enhancing the loss of ionisation.

Although the London 'Blitz' bombing was much closer to Slough, the continuous nature of these attacks and the fact there is far less surviving information about them made it more challenging to separate the impact of these explosions from natural seasonal variation.

Detailed records of the Allied raids reveal their four-engine planes routinely carried much larger bombs than the German Luftwaffe's two-engine planes could. These included the 'Grand Slam', which weighed up to 10 tonnes.

Professor Patrick Major, University of Reading historian and a co-author of the study, said: "Aircrew involved in the raids reported having their aircraft damaged by the bomb shockwaves, despite being above the recommended height. Residents under the bombs would routinely recall being thrown through the air by the pressure waves of air mines exploding, and window casements and doors would be blown off their hinges. There were even rumours that wrapping wet towels around the face might save those in shelters from having their lungs collapsed by blast waves, which would leave victims otherwise externally untouched."

"The unprecedented power of these attacks has proved useful for scientists to gauge the impact such events can have hundreds of kilometres above the Earth, in addition to the devastation they caused on the ground."

The researchers now need members of the public to help digitise more early atmospheric data, to understand the impact of the many hundreds of smaller bombing raids during the war, and help determine the minimum explosive energy required to trigger a detectable response in the ionosphere.

Credit: 
European Geosciences Union

Cocoa: a tasty source of vitamin D?

Many people do not get enough vitamin D. Brittle bones and an increased risk of respiratory diseases can be the result of a vitamin D deficiency. A research group at Martin Luther University Halle-Wittenberg (MLU) and the Max Rubner-Institut has now identified a new, previously unknown source of vitamin D2: cocoa and foods containing cocoa have significant amounts of this important nutrient. According to the researchers, cocoa butter and dark chocolate have the highest amount of vitamin D2. They recently published their results in the journal "Food Chemistry".

Vitamin D is crucial for the human body. It comes in two types: vitamin D2 and D3. Vitamin D3 is produced in the human skin through exposure to the sun. Humans get 90 per cent of their vitamin D requirements this way. The rest is ideally consumed through food, such as fatty fish or chicken eggs. Vitamin D2, which can also be utilised by the human body, is found in fungi. "Many people do not get enough vitamin D. The problem increases in the winter months when sunshine is scarce," says nutritionist Professor Gabriele Stangl from MLU.

In their latest study, the researchers investigated the vitamin D content of cocoa and products containing cocoa because they suspected that they contained a previously unknown source of the vitamin. Cocoa beans are dried after fermentation. They are placed on mats and exposed to the sun for one to two weeks. The precursors of vitamin D, which presumably originate from harmless fungi, are transformed by the sunlight into vitamin D2. In order to test their theory, the research group analysed various cocoa products and powders using state-of-the-art mass spectrometry.

What they found is that products containing cocoa are indeed a source of vitamin D2, but the amount varies greatly from food to food. While dark chocolate has a relatively high vitamin D2 content, researchers found very little in white chocolate. "This is not surprising as the cocoa content in white chocolate is significantly lower. It confirms our assumption that cocoa is the source of vitamin D2," explains Stangl.

However, the nutritionist's findings do not prompt her to recommend consuming large quantities of chocolate: "You would have to eat enormous amounts of chocolate to cover your vitamin D2 requirements. That would be extremely unhealthy because of the high sugar and fat content," says Stangl.

Instead, the results of the study are important for obtaining accurate data on the average nutrients consumed by the population. National consumption studies determine the population's food consumption. The Max Rubner-Institut calculates the population's daily intake of nutrients by combining this data with data from the extensive German Nutrient Data Base (Bundeslebensmittelschlüssel). If a source of vitamin D is left out of this database, the figures wind up being inaccurate. This is why the researchers recommend regularly revising the food and nutrient databases.

The research group at MLU is also using their recent findings in a follow-up project: "Cocoa is an exciting raw food material because it contains additional secondary plant substances that, for example, benefit the cardiovascular system," says nutritional specialist Stangl. As part of the nutriCARD Competence Cluster funded by the German Federal Ministry of Education and Research, her team is investigating whether it is possible to produce sugar-free foods containing cocoa, such as pasta, and whether these can raise vitamin D2 levels in humans.

Credit: 
Martin-Luther-Universität Halle-Wittenberg

Artificial intelligence to improve drug combination design & personalized medicine

image: Harnessing Artificial Intelligence to
Improve Drug Combination Design and Personalized Medicine

Image: 
Zac Goh

A new auto-commentary published in SLAS Technology looks at how an emerging area of artificial intelligence, specifically the analysis of small systems-of-interest specific datasets, can be used to improve drug development and personalized medicine. The auto-commentary builds on a study recently published by the authors in Science Translational Medicine about an artificial intelligence (AI) platform, the Quadratic Phenotypic Optimization Platform (QPOP), which substantially improved combination therapy in bortezomib-resistant multiple myeloma by identifying the best drug combinations for individual multiple myeloma patients.

It is now evident that complex diseases, such as cancer, often require effective drug combinations to make any significant therapeutic impact. As the drugs in these combination therapies become increasingly specific to molecular targets, designing effective drug combinations as well as choosing the right drug combination for the right patient becomes more difficult.

Artificial intelligence is having a positive impact on drug development and personalized medicine. With the ability to efficiently analyze small datasets that focus on the specific disease of interest, QPOP and other small dataset-based AI platforms can rationally design optimal drug combinations that are effective and based on real experimental data and not mechanistic assumptions or predictive modeling. Furthermore, because of the efficiency of the platform, QPOP can also be applied towards precious patient samples to help optimize and personalize combination therapy.
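The core idea behind a quadratic phenotypic optimization approach can be illustrated with a small sketch: fit a second-order (quadratic) response surface to a handful of measured drug-combination responses, then read the best combination off the fitted surface. The sketch below is a hypothetical two-drug toy example, not the published QPOP implementation, and all dose and response values are invented for illustration.

```python
import numpy as np

# Hypothetical two-drug dataset: normalized dose levels (0-1) and a measured
# phenotypic response (lower = better in this toy example). These numbers are
# illustrative only; they are not data from the study.
doses = np.array([
    [0.0, 0.0], [0.0, 0.5], [0.0, 1.0],
    [0.5, 0.0], [0.5, 0.5], [0.5, 1.0],
    [1.0, 0.0], [1.0, 0.5], [1.0, 1.0],
])
response = np.array([1.00, 0.80, 0.75, 0.70, 0.45, 0.50, 0.65, 0.40, 0.55])

# Design matrix for a full quadratic model:
# y = b0 + b1*x1 + b2*x2 + b3*x1^2 + b4*x2^2 + b5*x1*x2
x1, x2 = doses[:, 0], doses[:, 1]
X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
coef, *_ = np.linalg.lstsq(X, response, rcond=None)

# Evaluate the fitted surface on a dose grid and pick the combination with
# the lowest predicted response (the most effective pair in this toy model).
grid = np.linspace(0, 1, 101)
g1, g2 = np.meshgrid(grid, grid)
pred = (coef[0] + coef[1] * g1 + coef[2] * g2
        + coef[3] * g1**2 + coef[4] * g2**2 + coef[5] * g1 * g2)
i, j = np.unravel_index(np.argmin(pred), pred.shape)
print(f"best predicted combination: drug1={g1[i, j]:.2f}, drug2={g2[i, j]:.2f}")
```

Because a quadratic surface in n drugs has only on the order of n² coefficients, it can be fit from a small number of experiments, which is what makes this family of methods attractive for precious patient samples.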

Credit: 
SLAS (Society for Laboratory Automation and Screening)

Once majestic Atlantic Forest 'empty' after 500 years of over-exploitation

image: New research finds that 500 years of over-exploitation has halved mammal populations in South America's once majestic Atlantic Forest.
A new analysis of mammal populations reveals the devastating effects of human disturbance since the area was first colonised in the 1500s.
As well as looking at individual species, the researchers examined species groups to understand which ecologically related groups of species had diminished most rapidly. They found that apex predators and large carnivores, such as jaguars and pumas, as well as large-bodied herbivores, such as tapirs, were among the groups whose numbers had suffered the most.

Image: 
Mariana Landis 2011 - Wildlife Ecology, Management, and Conservation Lab (LEMaC)

Five centuries of over-exploitation has halved mammal populations in South America's Atlantic Forest - according to new research from the University of East Anglia.

A new analysis of mammal populations, published today in the journal PLoS ONE, has revealed the devastating effects of human disturbance over the last 500 years.

More than half of the local species assemblages - sets of co-existing species - of medium and large mammals living in the forest have died out since the area was first colonised in the 1500s.

Human activity is largely responsible for this overwhelming biodiversity loss according to the study, which compared inventories published over the past 30 years with baseline data going back to historical times in Colonial Brazil.

Originally covering around 1.1 million square km, the Atlantic Forest lies mostly along the coast of Brazil and is the world's longest continuous latitudinal stretch of tropical forest. Activities such as farming and logging - as well as fires - have reduced the Forest to about 0.143 million square km which has, in turn, had a significant impact on mammalian populations.

Dr Juliano Bogoni - currently a postdoctoral researcher at the University of São Paulo, Brazil - led the study, along with Professor Carlos Peres from the University of East Anglia (UEA), and collaborators from the Federal University of Santa Catarina in Brazil.

The team analysed species loss among almost 500 medium-to-large-bodied local sets of mammal species that had been surveyed within the vast Atlantic Forest region.

As well as looking at individual species, the team examined species groups, to try to understand which ecologically related groups of species had diminished most rapidly. They found that apex predators and large carnivores, such as jaguars and pumas, as well as large-bodied herbivores, such as tapirs, were among the groups whose numbers had suffered the most.

Prof Peres, from UEA's School of Environmental Sciences, said: "Our results highlight the urgent need for action in protecting these fragile ecosystems.

"In particular, we need to carry out more comprehensive regional scale studies to understand the local patterns and drivers of species loss.

"Efforts to protect the Atlantic Forest and other tropical forest ecosystems often rest on political will and robust public policies, so we need compelling data to drive change."

Dr Bogoni, first author of the study, said: "The mammalian diversity of the once majestic Atlantic Forest has been largely reduced to a pale shadow of its former self.

"These habitats are now often severely incomplete, restricted to insufficiently large forest remnants, and trapped in an open-ended extinction vortex. This collapse is unprecedented in both history and pre-history and can be directly attributed to human activity."

Credit: 
University of East Anglia

UMass Amherst food scientists profile microbes at a fermented vegetable facility

image: UMass Amherst team collaborated with scientists from two units of the US Food and Drug Administration's Center for Food Safety and Applied Nutrition to map and characterize microbial populations in a vegetable fermentation facility in rural Massachusetts.

Image: 
Real Pickles, Valley Lightworks

AMHERST, Mass. - University of Massachusetts Amherst food scientists have mapped and characterized microbial populations in a vegetable fermentation facility. They report that its microbiome was distinct between production and fermentation areas, and that the raw vegetables themselves - cabbages destined for sauerkraut - rather than handling or other environmental sources, were the main source of fermentation-related microbes in production areas.

Writing in Applied and Environmental Microbiology, UMass Amherst Commonwealth Honors College student and co-first author Jonah Einson, research fellow Asha Rani and senior investigator professor David Sela say this study helps to resolve the question of how microbes are transferred within a food fermentation production facility.

The UMass Amherst team collaborated with scientists from two units of the U.S. Food and Drug Administration's Center for Food Safety and Applied Nutrition and another in Winchester, Mass.

Real Pickles, a worker-owned cooperative in Greenfield, Mass., makes pickles, sauerkraut, and other fermented vegetables the old-fashioned way, by natural fermentation and without starter microbial cultures or other media. Nutritional microbiologist Sela notes that very little is known about the microbiomes of these facilities. Fermented foods are increasingly popular internationally, he adds, because of their enhanced nutritional properties, cultural history and tastiness, so the new information from this study should be of interest to food scientists, microbiologists and manufacturers around the world.

Sela says, "We're grateful for the cooperation and collaboration we received from the management and employees at Real Pickles for the opportunity to carry out this interesting research. We're thrilled that they were so open to working with the FDA. Not all companies would be comfortable having government regulators come in to assist with scientific research. This was a great academic-government-industry partnership. Also it reflects the exciting explosion of interest in fermented foods and beverages we are experiencing locally in western Massachusetts and beyond."

Company founder Dan Rosenberg says, "It's fascinating to learn about the influence of fresh vegetables in establishing our facility's microbiome and suggests that our use of organic vegetables is important to contributing a diverse microbial community to support fermentation. It raises interesting questions about how we can further improve our production practices to be producing fermented and probiotic foods of the highest quality. We're excited to participate in research that improves understanding of fermented food production and nutrition."

Sela notes that he, Einson and Rani used a state-of-the-art approach to this survey, using high-throughput sequencing and genomics to identify microbial species present instead of culturing the microbes. This allowed the team to quickly identify more microbes than conventional methods, to estimate their relative numbers, predict their likely function and determine the flow of microbes into and within the facility.

He adds, "For the first time, we built a map of the facility and how it was transformed over time during fermentation, which has given us a more complete picture of the population in a real vegetable fermentation facility. Both cheese and beer have been done to a certain extent, but we feel that fermented vegetable facilities could be better characterized. And because we are using these new genomic tools, we have a better understanding of what is there and how they got there. Using the traditional approach, we couldn't possibly culture everything, so we'd have a far more limited picture."
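A basic step in the sequencing-based approach described above is converting raw per-taxon read counts from a swabbed surface into relative abundances, so the community on each surface can be compared. The sketch below uses invented taxa counts purely for illustration; the two dominant taxa named match those the study reports on fermentation surfaces.

```python
# Hypothetical read counts per taxon from one swabbed surface. The taxa are
# real names mentioned in the article, but every count is invented.
reads = {
    "Leuconostoc": 5200,
    "Lactobacillaceae": 3100,
    "Enterobacteriaceae": 400,
    "Pseudomonas": 300,
}

# Relative abundance: each taxon's share of the total reads on this surface.
total = sum(reads.values())
abundance = {taxon: count / total for taxon, count in reads.items()}

# Print taxa from most to least abundant.
for taxon, frac in sorted(abundance.items(), key=lambda kv: -kv[1]):
    print(f"{taxon}: {frac:.1%}")
```

Comparing such abundance profiles across the fermentation room, processing area and storage surfaces is what lets a survey like this one assign each surface a distinct "microbial signature."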

The researchers report that Leuconostoc and Lactobacillaceae dominated all surfaces where spontaneous fermentation occurs. Also, they found that "wall, floor, ceiling and barrel surfaces host unique microbial signatures."

Sela says Einson, now a graduate student at Columbia University, took the early lead in this study as an undergraduate. He swabbed surfaces in the plant's fermentation room, processing area and dry storage areas to collect microbial samples, identify microbes present before and after cleaning, and link specific microbes to certain food processing stages. After he graduated, Rani took over and completed the project.

Sela says next steps might be to see if similar findings may emerge in other vegetable processing plants around the world, whether seasonal changes might be observed and whether microbial populations change with different cleaning protocols.

"It's not as important when making traditional sauerkraut," he notes, "but some other food processing facilities might face more risk of spoilage and it would be interesting to see what microbes are present at various stages. Batches of food do get spoiled and we never see that as consumers, but it costs a lot when a whole batch is lost. We might be able to find a way to predict when that might happen by understanding the microbiological communities that surround the processes."

Credit: 
University of Massachusetts Amherst

National parks bear the brunt of climate change

image: Using data from weather stations scattered throughout the U.S., climate researchers have created maps of the average annual temperature and rainfall totals at points approximately 800 meters apart over much of the United States. In this study, the team used these maps to calculate historical temperature and rainfall trends within the parks and over the U.S. as a whole.

Image: 
Patrick Gonzalez

Berkeley -- Human-caused climate change has exposed U.S. national parks to conditions hotter and drier than the rest of the nation, says a new UC Berkeley and University of Wisconsin-Madison study that quantifies for the first time the magnitude of climate change on all 417 parks in the system.

Without action to limit greenhouse gas emissions, many small mammals and plants may be brought to the brink of extinction by the end of the century, the study shows.

The analysis reveals that over the past century, average temperatures in national parks increased at twice the rate of the rest of the nation, and yearly rainfall decreased more in national parks than in other regions of the country.

At the current rate of emissions, the team projects that temperatures in the most exposed national parks could soar by as much as 9 degrees Celsius or 16 degrees Fahrenheit by 2100. This rate of change is faster than many small mammals and plants can migrate or "disperse" to more hospitable climates.

"Human-caused climate change is already increasing the area burned by wildfires across the western U.S., melting glaciers in Glacier Bay National Park and shifting vegetation to higher elevations in Yosemite National Park," said Patrick Gonzalez, associate adjunct professor in the Department of Environmental Science, Policy and Management at UC Berkeley and a lead author for the Intergovernmental Panel on Climate Change (IPCC) report, a summary of the most up-to-date scientific knowledge of climate change.

"The good news is that, if we reduce our emissions from cars, power plants, deforestation, and other human activities and meet the Paris Agreement goal, we can keep the temperature increase in national parks to one-third of what it would be without any emissions reductions," Gonzalez said.

The locations of these unique ecosystems are what make them particularly exposed to climate change, Gonzalez said. Many national parks are found in deserts, high mountains or in the Arctic region of Alaska, climates that are known to be the hardest hit by global warming.

"National parks aren't a random sample - they are remarkable places and many happen to be in extreme environments," Gonzalez said. "Many are in places that are inherently more exposed to human-caused climate change."

The analysis, which includes all 50 U.S. states, the District of Columbia and four territories in the Caribbean and Pacific, appears Sept. 24 in the journal Environmental Research Letters.

Mitigation and Adaptation

Weather stations scattered throughout the U.S. have been gathering monthly data on temperature and rainfall dating back to 1895. Using this data, climate researchers have created maps of the average annual temperature and rainfall totals at points approximately 800 meters apart over much of the United States.

In this study, the team used these maps to calculate historical temperature and rainfall trends within the parks and over the U.S. as a whole. They found that the temperature in national parks increased by a little over 1 degree Celsius from 1895 to 2010, roughly double the warming experienced by the rest of the country. Yearly rainfall totals decreased over 12 percent of national park land, compared to 3 percent of land in the United States. Alaska and its national parks saw the most dramatic increases in temperature, while rainfall decreased most in Hawaii.

The team mapped projected future changes in temperature and precipitation for climate models representing each of four climate change scenarios developed by the IPCC. These four "storylines of the future," as scientists call them, include a scenario where no action has been taken to reduce emissions, one that is consistent with the Paris Agreement and two that are intermediate.

Under the most extreme climate change scenario, the average temperature of all the national parks together is projected to increase between 5 and 7 degrees Celsius. Sticking to the Paris Agreement could limit this rise to between 1 and 3 degrees Celsius. Under both scenarios, temperature could increase most in Alaska and its national parks, while rainfall could decrease most in the Virgin Islands and the southwestern U.S.

To analyze future projections, the team also "downscaled" the climate models, making more detailed maps of future climate trends within the parks. Whereas the climate models themselves have coarse resolutions of approximately 100 to 400 kilometers, the downscaled data have resolutions of 100 to 800 meters over most of the country.

These maps can help park service employees plan for future vulnerabilities to climate change of endangered species and other park resources by developing measures to protect against wildfires and controlling invasive species.

"The park service is already integrating this climate change information into their planning and resource management," said Fuyao Wang, a research associate at the University of Wisconsin-Madison.

"It is important to note that even if we really do a strong mitigation of greenhouse gases, the national park system is still expected to see a 2 degree temperature change," said John Williams, a professor of geography at the University of Wisconsin-Madison. "At this point, it is likely that the glaciers in Glacier National Park will ultimately disappear, and what is Glacier National Park if it doesn't have glaciers anymore? So I think this adds weight to the importance of reducing our future levels of climate change and also extends the National Park Service mission to both adapt to these changes and educate all of us about these changes."

Credit: 
University of California - Berkeley

New AGS-NIA conference report explores links between senses and cognitive health

image: Founded in 1942, the American Geriatrics Society (AGS) is a nationwide, not-for-profit society of geriatrics healthcare professionals that has--for more than 75 years--worked to improve the health, independence, and quality of life of older people. Its nearly 6,000 members include geriatricians, geriatric nurses, social workers, family practitioners, physician assistants, pharmacists, and internists. The Society provides leadership to healthcare professionals, policymakers, and the public by implementing and advocating for programs in patient care, research, professional and public education, and public policy. For more information, visit AmericanGeriatrics.org.

Image: 
(C) 2018, American Geriatrics Society

Experts at a prestigious medical conference hosted by the American Geriatrics Society (AGS) and funded by the National Institute on Aging (NIA) hope their work--reported today in the Journal of the American Geriatrics Society--will have colleagues seeing eye-to-eye on an important but under-researched area of health care: The link between impaired vision, hearing, and cognition (the medical term for our memory and thinking capabilities, which are impacted as we age by health concerns like dementia and Alzheimer's disease).(1) With vision and hearing loss already affecting up to 40 percent of older adults(1)--and with one-in-ten older people already living with Alzheimer's disease(2)--the conference reviewed the current state of science regarding how these common health challenges might be connected, why the answer might matter, and what can be done to reduce sensory and cognitive impairments to preserve our health for as long as possible.

"As we live longer, we know that sensory and cognitive impairments will become more prevalent," said Heather Whitson, MD, MHS, Associate Professor of Medicine & Ophthalmology at Duke University Medical Center and one of the lead researchers for the AGS-NIA conference convened in 2017. "While we know a great deal about these impairments individually, we know less about how they are related--which is surprising, since impaired hearing and vision often go hand-in-hand and are associated with an increased risk for cognitive trouble."

One obstacle to optimizing sensory and cognitive health is our poor understanding of the two-way street connecting both.(1) For example, we know the brain relies on sensory input to understand our environment and make decisions.(1) Researchers also know that cognitive processes--such as connections in the brain that allow us to locate visual targets--guide our visual and auditory attention.(1) Yet we have a limited understanding of how these inter-related processes are affected by age-related changes in the brain, eyes, and ears.

Is the connection between sensory impairment and cognitive decline linear, with one health concern leading to the other, or is it cyclical, reflecting a more complex connection? AGS-NIA conference attendees think answers to these questions are critical, which is why their conference report maps the state of sensory and cognitive impairment research while also outlining important priorities for future scholarship and clinical practice. These include answering questions tied to the mechanics, measurement, and management of impairments:

* Identify the Mechanisms Responsible for Sensory and Cognitive Impairments (and Their Connections)(1)

** Is there a cause-and-effect relationship between cognition, vision, and hearing?

** What biological factors or characteristics of our nervous system affect both sensory and cognitive health?

* Better Equip Clinicians and Researchers to Measure Forms of Sensory and Cognitive Impairment(1)

** What standards currently exist for measuring sensory impairment and cognitive decline? How are they used among diverse populations, particularly those who might already struggle with access to health care?

** How can we develop and validate new tools and protocols to measure cognition for people who also live with vision impairments, hearing impairments, or both? Similarly, how can we better measure hearing and vision health in older people managing cognitive health concerns?

** How can we work to ensure broad measures of cognitive and sensory impairment are included in existing research studies as a way to better adapt findings to the realities of older-adult health?

* Better Prepare Older Adults and Health Professionals to Address Sensory and Cognitive Impairments(1)

** How effective, feasible, and accessible are existing options for assisting older people living with cognitive impairments, hearing impairments, and/or vision impairments?

** What innovations will be necessary to develop new resources, tools, and protocols to improve cognitive and sensory health or to accommodate those who live with these health concerns?

"The evidence we have at present indicates that impaired vision, hearing, and cognition occur more often together than would be expected by chance alone," summarized Frank Lin, MD, PhD, Associate Professor of Otolaryngology-Head and Neck Surgery at Johns Hopkins Medicine and another lead researcher at the AGS-NIA conference. "Figuring out why--and what can be done about a potential link--represents a critical new leap for the care we all will want and need as we age."

This research was supported by the NIA of the National Institutes of Health (NIH) under Award U13AG054139. The content is solely the responsibility of the authors and does not necessarily represent the views of the NIH.

AGS ACTION POINTS

* As more of us live longer, impaired vision, hearing, and cognition must remain high-priority agenda items for the future of our care. Beyond understanding these health concerns individually, we also need a better grasp of what might link them, since impaired hearing and vision often go hand-in-hand and are associated with an increased risk for cognitive concerns.

* Researchers and those who fund scholarship on aging should prioritize studies that (1) explore the mechanisms responsible for sensory and cognitive impairments, (2) give clinicians a better sense of how to measure these impairments, and (3) help older adults and health professionals better manage these concerns while promoting health, safety, and independence.

* Older adults, caregivers, and health professionals concerned about sensory and cognitive impairments can take steps to promote well-being by making a discussion about changes in vision, hearing, and cognition a routine part of care. Trusted health resources can help everyone stay informed about how these concerns impact health, and advocacy to legislators is an important way to increase funding and attention for related research.

Credit: 
American Geriatrics Society

'Ground coffee' with soil perks in Brazil

image: A mechanical coffee picker collects a windrow of fallen coffee.

Image: 
Tiago de Oliveira Tavares and Rouverson Pereira da Silva.

Coffee is one of Brazil's biggest crops. Brazil's favorable climate helps coffee beans ripen and be ready for picking during a concentrated period of weeks. This makes mechanical harvesting an economically reasonable choice.

So much mechanization, however, comes with its challenges. Tiago de Oliveira Tavares is an agronomist at Sao Paulo State University in Brazil. He and his colleagues perked up at the opportunity to brew some coffee-growing solutions.

Up to 20% of coffee berries fall to the ground. This can be due to the mechanical harvesting process as well as other causes, including rain, wind, disease, and pests. This "ground coffee" is retrieved through a process of mechanical sweeping and picking.

But the machinery is heavy. Over time it compresses the soil, interfering with the trees' root growth and their ultimate levels of production. In response, growers use a process called subsoiling to break up this hard soil. A long blade is pulled behind a tractor. It breaks up the soil very effectively but leaves an uneven soil surface behind.

Uneven ground means more ripe coffee berries are left behind by the machines, and more dirt and stones are picked up. Growers use additional soil management techniques to smooth the soil. For example, a harrow is a tool with teeth or discs that break up the big clods of dirt that remain after subsoiling. The soil is left softer and fluffier. A crusher pulverizes the soil and any plant matter, leaving the ground smooth and bare.

Which technique, or combination of techniques, does the best, most cost-effective job of improving sweeping and picking efficiency?

Here's where Tavares' team steps up to the coffee bar. The researchers had four options on their soil management menu:

Control group with no soil management. This provides the best conditions for the machinery but is not a good soil management practice.

Subsoiler followed by harrow. The researchers found the harrow treatment led to the highest losses. They concluded that subsoiling plus harrowing was an economically undesirable management practice.

Subsoiler, harrow, and crusher. This treatment was not consistently effective. The extra step also increased operational costs, making it a less desirable option.

Subsoiler followed by crusher. Tavares and his team determined that the crusher treatment was the best option to reduce the losses of mechanical coffee harvesting. This method had the lowest losses and best operational quality.

Leaving fallen coffee berries on the ground is not a good option. For one thing, the berries provide a breeding ground for the coffee borer, which is the coffee crop's second most common pest. Just as important, the sale value of the fallen coffee makes it economically worthwhile to recover as much as possible.

"This study is important because through this research coffee producers can improve the management of their coffee plantations," Tavares said. "They can reduce losses and increase their profit while taking good care of the soil."

That's a coffee blend with just the right amount of earthiness and a great finish.

Credit: 
American Society of Agronomy

GMOs reduce pesticide use 37%, increase crop yields 22%, and increase farm profits 68%.

Despite the rapid adoption of genetically modified (GM) crops by farmers in scientifically advanced countries, controversies about the 1970s version of genetic engineering (used successfully and commonly since the 1990s) continue. For every victory like the Rainbow Papaya in Hawaii, environmentalists insist dozens of times that there is uncertainty, and that only older genetic engineering techniques like mutagenesis - mutating seeds with chemicals and radiation - should be allowed.