Tech

Common genetic toolkit shapes horns in scarab beetles

image: Rhinoceros beetles and dung beetles use the same genes to form their elaborate horns.

Image: 
Takahiro Ohde and colleagues

Horns have evolved independently multiple times in scarab beetles, but distantly related species have made use of the same genetic toolkit to grow these prominent structures, according to a study publishing October 4, 2018 in the open-access journal PLOS Genetics by Teruyuki Niimi at the National Institute for Basic Biology in Okazaki, Japan, and colleagues.

There are over 35,000 species of scarab beetle (Scarabaeidae), and many grow horns on the head and/or upper body. Horns are thought to have arisen independently in rhinoceros beetles and in their distant relatives, the dung beetles. Rhinoceros beetles include some of the largest insect species on Earth, such as the famous Atlas and Hercules beetles. To investigate the genetic mechanisms that control horn development in these distant groups, the team examined gene expression and function in early horn cells in developing larvae of the Japanese rhinoceros beetle (Trypoxylus dichotomus) and compared the results with published data for dung beetles.

From high-throughput sequence analysis, they identified 49 genes potentially involved in horn development in the rhinoceros beetle, then used RNA interference to deactivate each gene and measure the effect on adult horn size and shape. Eleven genes expressed during larval development showed measurable effects on horn formation. These eleven genes include head and appendage patterning genes, and the same categories of genes have also been linked to horn development in dung beetles. The results suggest that (1) horns develop from similar head regions, and that (2) the same set of ancestral genes was co-opted repeatedly in the independently originated horns of rhinoceros and dung beetles. The study reveals deep developmental parallels between independently evolved horns, but it is also possible that the ancestor of the family Scarabaeidae had horns that were subsequently lost in most modern scarab lineages. Wider taxon sampling is needed to clarify whether horns originated once or multiple times, and this study provides a promising way to discover horn formation genes in a given species.

Credit: 
PLOS

Machine learning can detect fake news, at its source

image: A machine learning system aims to determine if a news outlet is accurate or biased.

Image: 
Public domain

Lately the fact-checking world has been in a bit of a crisis. Sites like Politifact and Snopes have traditionally focused on specific claims, which is admirable but tedious - by the time they've gotten through verifying or debunking a fact, there's a good chance it's already traveled across the globe and back again.

Social media companies have also had mixed results limiting the spread of propaganda and misinformation: Facebook plans to have 20,000 human moderators by the end of the year, and is spending many millions developing its own fake-news-detecting algorithms.

Researchers from MIT's Computer Science and Artificial Intelligence Lab (CSAIL) and the Qatar Computing Research Institute (QCRI) believe that the best approach is to focus not on the factuality of individual claims, but on the news sources themselves. Using this tack, they've demonstrated a new system that uses machine learning to determine if a source is accurate or politically biased.

"If a website has published fake news before, there's a good chance they'll do it again," says postdoctoral associate Ramy Baly, lead author on a new paper about the system. "By automatically scraping data about these sites, the hope is that our system can help figure out which ones are likely to do it in the first place."

Baly says the system needs only about 150 articles to reliably detect if a news source can be trusted - meaning that an approach like theirs could be used to help stamp out fake-news outlets before the stories spread too widely.

The system is a collaboration between computer scientists at MIT CSAIL and QCRI, which is part of the Hamad Bin Khalifa University in Qatar. Researchers first took data from Media Bias/Fact Check (MBFC), a website with human fact-checkers who analyze the accuracy and biases of more than 2,000 news sites, from MSNBC and Fox News to low-traffic content farms.

They then fed that data to a machine learning algorithm called a Support Vector Machine (SVM) classifier and trained it to classify news sites the same way MBFC does. Given a new news outlet, the system was 65 percent accurate at detecting whether it had a high, low or medium level of "factuality," and roughly 70 percent accurate at detecting whether it was left-leaning, right-leaning or moderate.

The team determined that the most reliable ways to detect both fake news and biased reporting were to look at the common linguistic features across the source's stories, including sentiment, complexity and structure.
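To make the idea concrete, here is a minimal sketch of the kind of linguistic features described (sentiment, complexity, structure). The tiny emotion lexicon and the specific feature choices below are hypothetical stand-ins for illustration, not the study's actual resources or feature set:

```python
import re

def linguistic_features(text):
    """Compute a few simple style features of the kind the study describes.
    The emotion lexicon here is a placeholder, not the study's resource."""
    words = re.findall(r"[A-Za-z']+", text.lower())
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    emotional = {"shocking", "outrageous", "incredible", "disaster", "amazing"}
    return {
        # crude proxy for hyperbolic/emotional language
        "emotional_ratio": sum(w in emotional for w in words) / max(len(words), 1),
        # complexity proxies: average word length and sentence length
        "avg_word_len": sum(map(len, words)) / max(len(words), 1),
        "avg_sent_len": len(words) / max(len(sentences), 1),
    }

feats_ling = linguistic_features("Shocking disaster! An outrageous, incredible cover-up.")
print(feats_ling)
```

In a real system, features like these would be computed over many articles from the same outlet and fed to a classifier such as the SVM mentioned above.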

For example, fake-news outlets were found to be more likely to use language that is hyperbolic, subjective, and emotional. In terms of bias, left-leaning outlets were more likely to have language that related to concepts of harm/care and fairness/reciprocity, compared to other qualities such as loyalty, authority and sanctity. (These qualities represent the 5 "moral foundations," a popular theory in social psychology.)

Co-author Preslav Nakov says that the system also found correlations with an outlet's Wikipedia page, which it assessed for general length - longer is more credible - as well as target words like "extreme" or "conspiracy theory." It even found correlations with the text structure of a source's URLs: those that had lots of special characters and complicated subdirectories, for example, were associated with less reliable sources.
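URL-structure signals like those Nakov describes can be extracted with a few lines of standard-library Python. The example URL and the exact feature set are invented for illustration and are not the paper's actual features:

```python
from urllib.parse import urlparse

def url_features(url):
    """Structural signals of the kind described in the article: special
    characters in the path, subdirectory depth, and hyphens in the domain."""
    parsed = urlparse(url)
    path = parsed.path
    return {
        "subdir_depth": len([p for p in path.split("/") if p]),
        "special_chars": sum(1 for c in path if not c.isalnum() and c != "/"),
        "domain_hyphens": parsed.netloc.count("-"),
    }

# Hypothetical outlet URL with a deep path and unusual characters
feats_url = url_features("http://real-news-now.example/a/b/c/breaking_story~1.html")
print(feats_url)
```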

"Since it is much easier to obtain ground truth on sources [than on articles], this method is able to provide direct and accurate predictions regarding the type of content distributed by these sources," says Sibel Adali, a professor of computer science at Rensselaer Polytechnic Institute who was not involved in the project.

Nakov is quick to caution that the system is still a work-in-progress, and that, even with improvements in accuracy, it would work best in conjunction with traditional fact-checkers.

"If outlets report differently on a particular topic, a site like Politifact could instantly look at our 'fake news' scores for those outlets to determine how much validity to give to different perspectives," says Nakov, a senior scientist at QCRI.

Baly and Nakov co-wrote the new paper with MIT senior research scientist James Glass alongside master's students Dimitar Alexandrov and Georgi Karadzhov of Sofia University. The team will present the work later this month at the 2018 Empirical Methods in Natural Language Processing (EMNLP) conference in Brussels, Belgium.

The researchers also created a new open-source dataset of more than 1,000 news sources, annotated with factuality and bias scores - the world's largest database of its kind. As next steps, the team will be exploring whether the English-trained system can be adapted to other languages, as well as to go beyond the traditional left/right bias to explore region-specific biases (like the Muslim World's division between religious and secular).

"This direction of research can shed light on what untrustworthy websites look like and the kind of content they tend to share, which would be very useful for both web designers and the wider public," says Andreas Vlachos, a senior lecturer at the University of Cambridge who was not involved in the project.

Nakov says that QCRI also has plans to roll out an app that helps users step out of their political bubbles, responding to specific news items by offering users a collection of articles that span the political spectrum.

"It's interesting to think about new ways to present the news to people," says Nakov. "Tools like this could help people give a bit more thought to issues and explore other perspectives that they might not have otherwise considered."

Credit: 
Massachusetts Institute of Technology, CSAIL

What you can't see can hurt you

video: University of Utah engineers conducted a study to determine whether homeowners would change the way they live if they could visualize the air quality in their house. They discovered that many homeowners did change their behaviors based on their indoor air pollution. Researchers provided participants with air pollution sensors, a Google Home speaker and a tablet to measure and chart the air quality in their homes.

Image: 
University of Utah College of Engineering

Oct. 9, 2018 -- You can't see nasty microscopic air pollutants in your home, but what if you could?

Engineers from the University of Utah's School of Computing conducted a study to determine whether homeowners would change the way they live if they could visualize the air quality in their house. It turns out their behavior changes a lot.

Their study was published this month in the Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies. The paper also is being presented Oct. 9 in Singapore during the ACM International Joint Conference on Pervasive and Ubiquitous Computing.

"The idea behind this study was to help people understand something about this invisible air quality in their home," says University of Utah School of Computing assistant professor Jason Wiese, who was a lead author of the paper along with U School of Computing doctoral student Jimmy Moore and School of Computing associate professor Miriah Meyer.

During the day, the air pollution inside your home can be worse than outside due to activities such as vacuuming, cooking, dusting or running the clothes dryer. The resulting pollution can cause health problems, especially for children, the elderly and people with asthma.

University of Utah engineers from both the School of Computing and the Department of Electrical and Computer Engineering built a series of portable air quality monitors with Wi-Fi and connected them to a university server. Three sensors were placed in each of six homes in Salt Lake and Utah counties for periods of four to 11 months in 2017 and 2018. Two were placed in different high-traffic areas of the house, such as the kitchen or a bedroom, and one outside on or near the porch.

Each minute, each sensor automatically measured the air for PM 2.5 (tiny particles or droplets in the air that are 2.5 microns or less in width) and sent the data to the server. The homeowner could then view the data on an Amazon tablet that displayed the air pollution measurements in each room as a line graph over a 24-hour period. Participants in the study could see up to 30 days of air pollution data.

To help identify spikes in air pollution, homeowners were given a voice-activated Google Home speaker so they could tell the server to label a particular moment in the measurements, such as when a person was cooking or vacuuming. Participants also were sent an SMS text message warning them whenever the indoor air quality changed rapidly.
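The rapid-change alerting step can be sketched as a simple threshold on minute-to-minute PM 2.5 changes. The window size and threshold below are invented for illustration; the article does not specify the MAAV system's actual alerting rule:

```python
def rapid_change_alerts(readings, window=5, threshold=10.0):
    """Flag minute indices where PM 2.5 rose by more than `threshold`
    micrograms per cubic meter over the previous `window` minutes.
    Window and threshold are illustrative, not the study's parameters."""
    alerts = []
    for i in range(window, len(readings)):
        if readings[i] - readings[i - window] > threshold:
            alerts.append(i)
    return alerts

# One reading per minute, as in the study; a cooking event starts at minute 6.
pm25 = [5, 5, 6, 5, 6, 5, 30, 55, 60, 58]
spikes = rapid_change_alerts(pm25)
print(spikes)
```

A deployed system would run this over the live stream from each sensor and fire one SMS per sustained event rather than per flagged minute.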

During the study, researchers discovered some interesting trends from their system of sensors, which they called MAAV (Measure Air quality, Annotate data streams, and Visualize real-time PM2.5 levels). One homeowner discovered that the air pollution in her home spiked when she cooked with olive oil. So that motivated her to find other oils that produced less smoke at the same cooking temperature.

Another homeowner would vacuum and clean the house just before a friend with allergies dropped by to try and clean the air of dust. But what she found out through the MAAV system is that she actually made the air much worse because she kicked up more pollutants with her vacuuming and dusting. Realizing this, she started cleaning the house much earlier before the friend would visit.

Participants opened windows more when the air was bad, or compared measurements between rooms and avoided the rooms with more pollution.

"Without this kind of system, you have no idea about how bad the air is in your home," Wiese says. "There are a whole range of things you can't see and can't detect. That means you have to collect the data with the sensor and show it to the individual in an accessible, useful way."

Researchers also learned that the circumstances that worsened air pollution differed from home to home. Vacuuming, for example, affected air quality differently in different houses. They also learned that when homeowners could visualize the air quality in their home, they stayed on top of labeling and reviewing the data.

Wiese says no known manufacturers make air quality systems for the home that allow residents to visualize and label the air quality in this way, but he hopes their research can spur more innovation.

The study involves engineering in collaboration with other University of Utah scientists, including biomedical informatics and clinical asthma researchers. It was funded as part of a larger National Institutes of Health program known as Pediatric Research using Integrated Sensor Monitoring Systems (PRISMS), launched in 2015 to develop sensor-based health monitoring systems for measuring environmental, physiological and behavioral factors in pediatric studies of asthma and other chronic diseases.

Credit: 
University of Utah

NASA puts together a composite of Tropical Storm Kong-Rey

image: NASA's IMERG and NASA-NOAA's Suomi NPP satellite imagery were combined to create a picture of the extent and rainfall of Tropical Storm Kong-Rey. IMERG found heavy rain falling (red) around the center and northwest of the center on Oct. 3, 2018.

Image: 
NASA Worldview, Earth Observing System Data and Information System (EOSDIS) /NOAA/JAXA

NASA's IMERG combines data from many satellites to provide a look at rainfall occurring around the world. Those rainfall data were combined with visible imagery from NASA-NOAA's Suomi NPP satellite to create a composite, or fuller picture, of Kong-Rey in the Northwestern Pacific Ocean as it weakened to a tropical storm.

The Global Precipitation Measurement mission, or GPM, core satellite provided a look at the distribution of rainfall within Kong-Rey. GPM is a joint mission between NASA and the Japan Aerospace Exploration Agency (JAXA). GPM found the heaviest rain falling around the center and to the northwest of the center on Oct. 4, 2018.

The Visible Infrared Imaging Radiometer Suite (VIIRS) instrument aboard NASA-NOAA's Suomi NPP satellite captured a visible light image that showed the western quadrant of Kong-Rey just east of Taiwan on Oct. 4.

The Status of Kong-Rey

At 11 a.m. EDT (1500 UTC) on Oct. 4, 2018, the Joint Typhoon Warning Center downgraded Kong-Rey from a typhoon to a tropical storm as maximum sustained winds dropped to 60 knots (69 mph/111 kph). Kong-Rey was centered near 26.1 degrees north latitude and 126.5 degrees east longitude, approximately 88 nautical miles southwest of Kadena Air Base, and had tracked north-northwestward.

Kong-Rey is forecast to turn to the north and then northeast, and to move into the Sea of Japan. The storm is weakening and is expected to become extratropical over northern Japan.

About IMERG

NASA's GPM, or Global Precipitation Measurement, mission satellite provides information on precipitation from its orbit in space. GPM is a joint mission between NASA and the Japan Aerospace Exploration Agency (JAXA). GPM also utilizes a constellation of other satellites to provide a global analysis of precipitation that is used in the IMERG calculation.

At NASA's Goddard Space Flight Center in Greenbelt, Maryland, those data are incorporated into NASA's IMERG or Integrated Multi-satellitE Retrievals for GPM. IMERG is used to estimate precipitation from a combination of passive microwave sensors, including the Global Precipitation Measurement (GPM) mission's core satellite's GMI microwave sensor and geostationary infrared data. IMERG real-time data are generated by NASA's Precipitation Processing System every half hour and are normally available within six hours.

Credit: 
NASA/Goddard Space Flight Center

German refrigerators may pose risk to insulin quality

New research being presented at this year's European Association for the Study of Diabetes (EASD) Annual Meeting in Berlin, Germany (1-5 October), suggests that insulin is often stored at the wrong temperature in patients' fridges at home, which could affect its potency.

Many injectable drugs and vaccines are highly sensitive to heat and cold and can perish if their temperature shifts a few degrees. To prevent loss of effectiveness, insulin must stay between 2-8°C (36-46°F) in the refrigerator, or between 2-30°C (36-86°F) when carried about the person in a pen or vial.

Individuals with diabetes often store insulin at home for several months before they use it, but little is known about how storage in domestic fridges impacts insulin quality.

To investigate how often insulin is stored outside the manufacturer's recommended temperature range, Dr Katarina Braune from Charité - Universitaetsmedizin Berlin in Germany in collaboration with Professor Lutz Heinemann (Science & Co) and the digital health company MedAngel BV monitored the temperature of insulin formulations stored in fridges at home and carried as a spare.

Between November 2016 and February 2018, 388 people with diabetes living in the USA and the EU placed temperature sensors (MedAngel ONE, http://www.medangel.co) either next to their insulin in the fridge or in their diabetes bag.

Temperature data were automatically measured every 3 minutes (up to 480 times a day) before being sent to an app and recorded on a secure database. Temperature data were recorded for an average of 49 days.

Analysis of 400 temperature logs (230 for refrigerated and 170 carried insulin) revealed that 315 (79%) contained deviations from the recommended temperature range.

On average, insulin stored in the fridge was out of the recommended temperature range 11% of the time (equivalent to 2 hours and 34 mins a day). In contrast, insulin carried by patients was only outside recommendations for around 8 minutes a day.

Importantly, freezing was an even bigger issue, with 66 sensors (17%) measuring temperatures below 0°C (equivalent to 3 hours a month on average).
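With one reading every 3 minutes (up to 480 per day), the time spent out of range can be computed directly from a temperature log. This sketch assumes evenly spaced readings; the example day is fabricated to roughly match the study's 11% average, not taken from its data:

```python
def out_of_range_minutes_per_day(temps_c, low=2.0, high=8.0, interval_min=3):
    """Given one day of fridge readings taken every `interval_min` minutes
    (480 readings/day at 3-minute intervals), return the number of minutes
    spent outside the recommended 2-8 degrees C range."""
    bad = sum(1 for t in temps_c if t < low or t > high)
    return bad * interval_min

# Hypothetical day: 480 readings, 53 of them out of range, including
# some below freezing; 53/480 is about 11% of the day.
day = [5.0] * 427 + [9.5] * 40 + [-0.5] * 13
print(out_of_range_minutes_per_day(day))
```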

"Many people with diabetes are unwittingly storing their insulin wrong because of fluctuating temperatures in domestic refrigerators", says Dr Braune.

"When storing your insulin in the fridge at home, always use a thermometer to check the temperature. Long-term storage conditions of insulin are known to have an impact on its blood-glucose lowering effect. For people living with insulin-dependent diabetes who take insulin several times a day via injections or continuously administer insulin with a pump, precise dosing is essential to achieve optimal therapeutic outcomes. Even gradual loss of potency introduces unnecessary variability in dosing. More research is needed to examine the extent to which temperature deviations during domestic storage affect insulin efficacy and patient outcomes."

Credit: 
Diabetologia

Biofortification: Wheat that pumps iron, naturally

image: Graduate student Jorge Venegas inspects his wheat breeding lines, University of Nebraska greenhouse.

Image: 
Craig Chandler

Is biofortification the best thing since sliced bread? Well, biofortified wheat could certainly make it easier to help some humans get proper nutrition.

Biofortification is the process of naturally increasing the nutritional value of a crop. Unlike fortification, which might add a mineral like iron directly to something like bread dough, the goal of biofortification is to have the wheat in the dough naturally contain more iron in the first place.

Robert Graybosch of the USDA Agricultural Research Service explains that about 60% of the world's population doesn't get enough iron. This happens because the food people eat doesn't contain enough minerals or contains what are called 'antinutrients.' These are molecules that prevent the body from absorbing good nutrients.

"Fortification is potentially useful as people in many parts of the world do not consume a balanced diet and their main foods lack minerals," he says. "This can be addressed by fortification, the process of adding minerals back to food products. This is done with flours used for bread baking."

However, some people are hesitant to eat products with what they think might be weird ingredients, he adds. Graybosch is trying to naturally enhance the minerals of wheat flours to help people around the world get more iron.

"Biofortification can be done via traditional plant breeding using natural genetic variation or natural mutations, or via genetic engineering," he says. "If one found a mutation that resulted in more grain iron, and then bred this trait into wheat that was produced and consumed, then we could say the crop has been biofortified."

Graybosch and his team developed experimental breeding lines of winter wheat. Breeding lines are the first step in the long process of creating a new type of wheat that farmers can grow. The team tried to combine two properties--low phytate and high grain protein--without lowering grain yield. Phytate is an antinutrient that prevents the body from taking in some minerals.

Biofortification is a delicate balance. Often, increasing the nutrition causes the overall grain yield to drop. This can lead to the wheat being overall less nutritious and can also hurt farmers' profits.

Their results show that combining the two traits is possible without any bad effects on grain yield. The combination increased the amount of zinc, calcium and manganese humans could get from the grain. Although more work needs to be done to get these traits into wheat that farmers can plant, the genes can be used to develop more nutritious wheat without sacrificing yield.

The next steps in their research, some of which they have already undertaken, are to breed these beneficial genes into plants adapted to areas where wheat is grown, such as the Great Plains of the U.S.

"It is important to note that all wheat grown in a specific area is adapted to that area," Graybosch explains. "Great Plains wheats do well in the Great Plains, but not elsewhere. If the trait is of interest in other locations, additional breeders need to start introducing it to their own backgrounds. And they are interested in doing so."

Graybosch says his journey to this research began as he walked home from work one day. He wanted to devise a project to investigate "the most important nutritional problem facing mankind," which he learned was likely that people weren't getting enough iron. He and then-graduate student Jorge Venegas started to look for genes that would improve the nutrition of wheat.

"I think anything that can improve food mineral nutrition at low or no cost to the consumer is of value," Graybosch says. "Anything we can do to improve nutrition worldwide will go a long way toward improving the lives of our fellow earthlings."

Credit: 
American Society of Agronomy

NASA finds Tropical Storm Sergio on the verge of hurricane status

image: Infrared satellite data captured at 5:30 a.m. EDT (0930 UTC) on Oct. 1 from NASA's Aqua satellite revealed the strongest storms, with the coldest cloud top temperatures, west of Sergio's center and in a band of thunderstorms southwest of the center. MODIS found the coldest cloud tops had temperatures near minus 70 degrees Fahrenheit (minus 56.6 degrees Celsius).

Image: 
NASA/NRL

The National Hurricane Center noted that Tropical Storm Sergio was on the verge of becoming a hurricane in the Eastern Pacific Ocean and NASA's Aqua satellite confirmed very powerful storms within.

Infrared satellite data captured at 5:30 a.m. EDT (0930 UTC) on Oct. 1 by the Moderate Resolution Imaging Spectroradiometer, or MODIS, instrument aboard NASA's Aqua satellite revealed the strongest storms, with the coldest cloud top temperatures, west of Sergio's center and in a band of thunderstorms southwest of the center. MODIS found the coldest cloud tops had temperatures near minus 70 degrees Fahrenheit (minus 56.6 degrees Celsius). NASA research has found that cloud tops that cold have the capability to generate heavy rainfall.

At 11 a.m. EDT, the National Hurricane Center or NHC noted "Sergio is almost a hurricane. Satellite images indicate that the central convection has been increasing in intensity during the past several hours, but there are still no indications of an eye in that data. Microwave imagery does show an eye feature, however. The outer bands are not particularly well organized, and there are some dry slots beyond the inner core."

The center of Tropical Storm Sergio was located near latitude 11.5 degrees north and longitude 109.5 degrees west, about 625 miles (1,000 km) southwest of Manzanillo, Mexico. Sergio is far from land, so there are no coastal advisories in effect. Sergio is moving toward the west near 14 mph (22 kph), and this general motion is forecast to continue through tonight. A turn toward the west-northwest is expected on Tuesday. Maximum sustained winds remain near 70 mph (110 kph) with higher gusts.

NHC noted that "Strengthening is expected during the next 48 hours, and Sergio is forecast to become a hurricane later today, and a major hurricane by Wednesday, Oct. 3."

Credit: 
NASA/Goddard Space Flight Center

Weak magnetic fields affect cells via a protein involved in bird migration

image: This is a diagram showing devices that generate magnetic fields and their respective field strengths

Image: 
pbio.3000018, CC-BY

Beneficial effects, and possible harm, of exposure to weak pulsed electromagnetic fields (PEMFs) may be mediated by a protein related to one that helps birds migrate, according to a study publishing on October 2 in the open access journal PLOS Biology by Margaret Ahmad of Xavier University in Cincinnati and colleagues. The discovery provides a potential mechanism for the benefits of PEMF-based therapies, used to treat depression and Parkinson's disease, and may accelerate development of magnetic stimulation for other applications.

PEMF-based therapies induce weak magnetic fields in treated tissue, and have been claimed to temporarily improve symptoms in several diseases; however, support for such effects remains unclear, as do the potential mechanisms. Since the magnetic field strength used in PEMF-based therapies is below what is needed to cause neurons to fire, Ahmad and co-authors asked whether the effects might instead be due to activation of a protein that senses magnetic fields. One such group of proteins are the cryptochromes, which are found in a wide variety of organisms, and which help orient birds to the Earth's magnetic field, aiding migration.

The authors found that human cells subjected to a weak magnetic field increased their production of reactive oxygen species (ROS), a group of molecules with multiple roles in cells, including signaling. The increase in ROS slowed cell growth and led to the expression of multiple ROS-responsive genes. Cryptochromes are known to synthesize ROS, and the production of ROS in response to a magnetic field could be blocked by reducing the amount of cryptochrome in the cells. The authors confirmed the presence of ROS, and the reduction in its production after cryptochrome depletion, using both biochemical and imaging methods.

These results provide a new mechanism for explaining the effects of weak magnetic fields on human cells. While proposed harmful effects of such fields on human health have been difficult to confirm, the cryptochrome-induced production of ROS, which can be damaging in excess, may explain how such harm could occur. At the same time, the potential beneficial effects of PEMF-based therapy may be due in part to the signaling function of ROS, triggered by activation of cryptochrome. Importantly, since cryptochrome activation and ROS production are quantifiable, this study points the way toward teasing out cryptochrome's contribution to both the benefits and the potential harms of exposure to weak magnetic fields.

"Our findings provide a rationale for optimizing low-strength magnetic fields for novel therapeutic applications," Ahmad said. "At the same time, they suggest that alone, or in combination with other environmental triggers of ROS production, such fields have the potential to negatively impact public health."

Because of the nature of claims relating to the biological effects of weak magnetic fields, PLOS Biology decided to commission an accompanying "Primer" article by Lukas Landler and David Keays. In their Primer, the authors place Ahmad and colleagues' study in the context of the skepticism that has surrounded some previous claims in this field: "The critical yardstick in assessing the validity of these claims is an assessment of the controls they employed. This reveals a big improvement on existing papers, but the controls are still imperfect."

Nevertheless, say Landler and Keays, "Should this paper be independently replicated by multiple labs it will undoubtedly be influential. It is conceivable that leukemia associated with 50 Hz power lines, PEMF mediated ROS generation, and animal magnetoreception are mechanistically similar -- each requiring the presence of cryptochrome... It may well transpire that cryptochrome is a magnetosensor, but one with a sinister side."

Credit: 
PLOS

NASA's Aqua satellite shows Rosa's remnants soaking Arizona

image: On Tuesday, Oct. 2 at 6:05 a.m. EDT (1005 UTC), the MODIS instrument aboard NASA's Aqua satellite looked at the remnants of tropical depression Rosa in infrared light, revealing the storm soaking Arizona. MODIS found the coldest cloud tops (yellow) had temperatures near minus 63 degrees Fahrenheit (minus 53 degrees Celsius), which indicated powerful rain-making storms.

Image: 
NASA/NRL

NASA provided an infrared view of Tropical Depression Rosa's remnants that showed strongest storms with heaviest rainfall potential were over east central Arizona on Oct. 2. The National Hurricane Center noted that although Rosa had dissipated by 11 a.m. EDT on Oct. 2, the threat of heavy rains and flash flooding continues over the Desert Southwest.

NOAA's National Weather Service Weather Prediction Center in College Park, Md. noted "Heavy tropical rain from Rosa will bring flash flood threats to the Desert Southwest and Four Corners region over the next couple of days. Flood Watches are in effect for multiple states, including California, Arizona, Nevada, Utah, Colorado and Idaho. There is also a moderate risk of excessive rain for central portions of Arizona."

Infrared satellite data captured on Tuesday, Oct. 2 at 6:05 a.m. EDT (1005 UTC) by the Moderate Resolution Imaging Spectroradiometer, or MODIS, instrument aboard NASA's Aqua satellite revealed the location of the strongest storms, those with the coldest cloud top temperatures. MODIS found the coldest cloud tops had temperatures near minus 63 degrees Fahrenheit (minus 53 degrees Celsius). NASA research has found that cloud tops that cold have the capability to generate heavy rainfall.

Satellite images and surface observations indicate that Rosa has become a trough or an elongated area of low pressure with multiple swirls along its axis. Therefore, Rosa no longer qualifies as a tropical cyclone.

The heavy rainfall potential that NASA's infrared data showed is reflected in the forecasts for today, Oct. 2, and tomorrow, Oct. 3. In Baja California and northwestern Sonora, 3 to 6 inches are forecast, with isolated totals of 10 inches. In central and southern Arizona, 2 to 4 inches are forecast, with isolated 6-inch totals possible in the mountains of central Arizona. For the rest of the Desert Southwest, Central Rockies and Great Basin, the National Hurricane Center expects 1 to 2 inches, with isolated totals of 4 inches. These rainfall amounts may produce life-threatening flash flooding. Dangerous debris flows and landslides are also possible in mountainous terrain.

At 11 a.m. EDT (1500 UTC), the remnants of Rosa were located near latitude 29.7 degrees north and longitude 114.2 degrees west. That's about 95 miles (155 km) south-southeast of San Felipe, Mexico. The remnants are moving toward the northeast near 8 mph (13 kph), and they are expected to move over the Desert Southwest by tonight. Maximum sustained winds are near 30 mph (45 kph) with higher gusts.

Credit: 
NASA/Goddard Space Flight Center

Studded winter tires cost more lives than they save

image: The studded winter tyres damage the ground and throw up particles into the atmosphere, thus costing more lives than they actually save, according to researchers.

Image: 
Ilya Plekhanov

Researchers from Chalmers University of Technology, Sweden, have now shown that studded winter tyres cost more lives than they save. The new study takes a holistic view of the tyres' impact on public health over their whole life cycle. The researchers also show that the tyres' use contributes to the bloody conflict in the Democratic Republic of Congo (DRC), and to fatal accidents during their production.

This is the time of year in Sweden when many people start to change their normal car tyres to winter ones. According to Trafikverket, the Swedish Transport Administration, around 60% of Swedish drivers choose studded winter tyres, and there has long been a debate about the emissions caused by the studs damaging the ground and throwing up particles into the atmosphere.

Three Chalmers researchers have now investigated this question. Anna Furberg, Sverker Molander and Rickard Arvidsson at the Division of Environmental Systems Analysis used a systemic perspective to analyse studded winter tyres' public health impact over their whole life cycle.

To weigh up the advantages and disadvantages, the researchers looked at how many lives are saved through their use, compared to the level of emissions they generate through wear of the roads and in their production. Additionally, they investigated accident statistics from the small-scale mining industry in the DRC, where cobalt - an important element for the studs - is most abundant. Cobalt is a highly sought-after conflict metal which contributes to the warfare in the region, something the researchers also accounted for.

Taking this broader life cycle perspective, the researchers estimate that Swedish use of studded tyres saves between 60 and 770 life-years, compared with 570 to 2200 life-years lost.
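
The reported ranges can be compared directly. A minimal sketch in Python, using only the figures quoted above; the midpoint comparison is an illustration of the balance, not part of the published analysis:

```python
# Life-year ranges reported for Swedish studded-tyre use
saved_lo, saved_hi = 60, 770      # life-years saved via accident prevention
lost_lo, lost_hi = 570, 2200      # life-years lost over the full life cycle

# Midpoints of the reported ranges (illustrative only)
saved_mid = (saved_lo + saved_hi) / 2   # 415.0
lost_mid = (lost_lo + lost_hi) / 2      # 1385.0

net_mid = saved_mid - lost_mid          # -970.0: a net loss of life-years
print(f"Net life-years (midpoints): {net_mid:+.0f}")
```

Note that the ranges overlap slightly (770 saved versus 570 lost in the extreme case), which is why the researchers' conclusion rests on the full life cycle assessment rather than on any single pair of numbers.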

"Taking everything together, the picture is very clear - studded winter tyres actually cost more lives than they save," says Sverker Molander, a professor at the Department of Technology Management and Economics at Chalmers.

The biggest negative impact is generated during usage, from the emissions caused by road wear. Even taking only this into account, the negative health impacts already clearly outweigh the advantages. Once the other factors are included as well, the result only becomes clearer, the researchers explain.

"The small-scale mining, where many accidents and fatalities occur, is the next biggest part of the tyres' overall negative health impact. Deaths linked to the conflict in the DRC are the smallest part, but that being said, there are many aspects of that that have not been included in the study - the conflict of course influences the whole of society. I doubt many people realise that using these tyres is contributing to the situation in the DRC," says Anna Furberg.

The advantages of the studded winter tyres are mainly enjoyed in Scandinavia, whilst nearly a third of the negative health impacts are felt elsewhere.

"This is a clear illustration of what globalised production can result in. People profiting at others' expense. It is not those who benefit from the product who are having to pay for the negative effects," says Sverker Molander.

So how should consumers react to this research? Anna Furberg and Sverker Molander suggest that good winter tyres without studs can be an alternative, in combination with careful driving and consideration of alternative means of travel.

"Of course, how you drive is important, and snow-ploughing and sweeping needs to be done properly. Most cars today also have electronic anti-skid systems fitted, which make them safer to drive at higher speeds. But our study shows that there is more research needed concerning alternatives to studded winter tyres that don't cause these health issues," says Anna Furberg.

The article "Live and let die? Life cycle human health impacts from the use of tire studs" was recently published in the scientific journal International Journal of Environmental Research and Public Health.

The research was carried out within the framework of the Mistra Environmental Nanosafety programme.

More on: the study

The study made use of life cycle assessment (LCA) and disability-adjusted life years (DALYs) - a health metric developed by the World Health Organisation (WHO) - to quantify studded winter tyres' public health impact throughout their whole life cycle. The researchers investigated:

Lives saved: accident statistics and studies on differences in accidents between cars with and without studded tyres.

Emissions from use of studded tyres, which damage the road and throw up particles from the asphalt: articles that had studied roads where such tyres were in use.

Emissions during production, from extraction to manufacturing: previous studies of different types of emissions.

Accidents and deaths during production, such as during cobalt mining: studies of accidents and fatalities in various industrial activities and in small-scale mining.

Number of deaths related to the conflict in the DRC.

The biggest contribution to studded tyres' negative health impact comes from emissions from road wear (67-77 per cent), followed by accidents and fatalities in cobalt mining (8-18 per cent). Between 23 and 33 per cent of the negative effects are felt outside of Scandinavia.

More on: Studded and non-studded winter tyres

VTI, the Swedish Road and Transport Research Institute, has examined, in two major reports, the difference in grip between studded and non-studded tyres. They report that studded tyres have a clearly better grip when driving on ice compared to non-studded tyres of both Nordic and European type (report in Swedish). But when driving on snow, the difference is much smaller. When the road is wet, the asphalt is salted and the temperature is around zero, the braking and steering performance of studded tyres and non-studded Nordic tyres is virtually equivalent (report in Swedish).

According to a Norwegian study, studded tyres reduce the number of passenger car accidents by 2 per cent on dry roads, and 5 per cent on roads covered with ice or snow, compared to non-studded winter tyres.

According to the Swedish Transport Administration, the foundation for a safe winter trip is good winter tyres, the right speed and a driving style adapted to the road conditions. The Administration emphasises that a car equipped with an anti-skid system (ESC) and non-studded winter tyres gives a good level of safety throughout the country.

For more information, contact:

Anna Furberg, PhD student at the Division of Environmental Systems Analysis, Department of Technology Management and Economics, Chalmers University of Technology, 031-772 63 28, anna.furberg@chalmers.se

Rickard Arvidsson, Assistant Professor at the Division of Environmental Systems Analysis, Department of Technology Management and Economics, Chalmers University of Technology 031 - 72 21 61, 0768 - 078733 rickard.arvidsson@chalmers.se

Sverker Molander, Professor at the Division of Environmental Systems Analysis, Department of Technology Management and Economics, Chalmers University of Technology 031-772 21 69, 0703 - 088522 sverker.molander@chalmers.se

See a video interview with Anna Furberg discussing this research

Studded tyres are permitted in Sweden from 1 October to 15 April, and may be used at other times in wintry conditions. Winter tyres, or equivalent equipment, are mandatory from 1 December to 31 March.

According to a survey by the Swedish Transport Administration from 2017, 63.4 per cent of Swedish drivers use studded tyres. The proportion running on non-studded winter tyres (of Nordic or European type) is estimated to be 35.8 per cent. The largest prevalence of studded tyres was measured in northern Sweden (91 per cent).

Credit: 
Chalmers University of Technology

Story tips from the Department of Energy's Oak Ridge National Laboratory, October 2018

video: In work funded by the DOE Critical Materials Institute, ORNL researchers are demonstrating how rare earth permanent magnets can be harvested from used computer disk drives and repurposed in an axial gap motor.

Image: 
Jenny Woodbery/Oak Ridge National Laboratory, US Dept. of Energy

Magnets--Coming around again

Magnets recovered from used computer hard drives found new life in an electric motor in a first-ever demonstration at Oak Ridge National Laboratory. The permanent magnets made from rare earth elements were reused without alteration in an axial gap motor, which can be adapted for use in electric vehicles and industrial machinery. The demonstration is part of an effort to find ways to recycle rare earth permanent magnets, which are necessary for electric cars, cell phones, laptops, wind turbines and factory equipment. The rare earth ore used to make the magnets is in high demand and mined almost exclusively outside the United States. "We're not inventing a new magnet," said ORNL's Tim McIntyre. "We're enabling a circular economy--putting these recycled magnets into a new package that takes advantage of their strengths while addressing a key materials challenge for American industry." [Contact: Kim Askey, (865) 576-2841; askeyka@ornl.gov]

Video: https://youtu.be/bn1P6MxDMQs

Caption: In work funded by the DOE Critical Materials Institute, ORNL researchers are demonstrating how rare earth permanent magnets can be harvested from used computer disk drives and repurposed in an axial gap motor. Credit: Jenny Woodbery/Oak Ridge National Laboratory, U.S. Dept. of Energy

Image: https://www.ornl.gov/sites/default/files/Magnet_motor_ORNL1.jpg

Caption: In work funded by the DOE Critical Materials Institute, ORNL researchers are demonstrating how rare earth permanent magnets can be harvested from used computer disk drives and repurposed in an axial gap motor. Credit: Jason Richards/Oak Ridge National Laboratory, U.S. Dept. of Energy

Nuclear--Radiation effects

With an organ-on-a-chip technology, scientists at Oak Ridge National Laboratory are testing the effects of radiation on cells that mimic human respiration. The project, in collaboration with Larry Millet of the University of Tennessee, involves growing a microenvironment of human cell layers, similar to those found in human lungs, in a microfluidic chip, and then exposing the chip to ionizing radiation for subsequent analysis of the cells' response. While the technology is not new, the team has worked to incorporate and combine unique design elements and architectures, allowing them to collect more data. "For now, we are focusing on building the microenvironments to increase the amount of information we receive per experiment," said ORNL's Sandra Davern. "But, in the future, the process could be used to advance biomedical research, and aid in the discovery, testing and development of novel pharmaceuticals to treat disease, as well as mitigating agents suitable for an emergency response to radiological events." [Contact: Sara Shoemaker, (865) 576-9219; shoemakerms@ornl.gov]

Image: https://www.ornl.gov/sites/default/files/TIP%20image%20no%20scale_0.jpg

Caption: Researchers are using organ-on-a-chip technology to design a microenvironment of human microvascular cells to test how radiation could affect human respiration. These non-irradiated cells have successfully grown and stretched to cover the upper and lower surfaces of the researchers' new design. Credit: Sandra Davern/Oak Ridge National Laboratory, U.S. Dept. of Energy

Computing--Reaching rare earths

Scientists from the Critical Materials Institute used the Titan supercomputer and Eos computing cluster at Oak Ridge National Laboratory to analyze designer molecules that could increase the yield of rare earth elements found in bastnaesite, an important mineral for energy and technology applications. To utilize these rare earths--predominantly cerium--bastnaesite must first be separated from the surrounding ore of rocky minerals like calcite. Using quantum and molecular computing programs, researchers identified collector molecules that preferentially bind to metal ions on the bastnaesite surface. Through supercomputing, X-ray diffraction and surface calorimetry, researchers further discovered that displacing adsorbed water on bastnaesite and calcite surfaces is critical to collector binding, because it enables ligands to recognize the structural differences between the two minerals. They estimate that designer collectors could improve bastnaesite recovery by 50 percent via a process known as froth flotation, potentially lowering the cost of mining. [Contact: Katie Bethea, (865) 576-8039; betheakl@ornl.gov]

Image: https://www.ornl.gov/sites/default/files/Reaching%20rare%20earths_v2_0.png

Caption: Through quantum and molecular computing programs, researchers identified collector molecules that preferentially bind to metal ions on the surface of bastnaesite, a rare earth mineral that is important for energy and technology applications. The discovery could improve bastnaesite recovery and potentially lower mining costs. Credit: Oak Ridge National Laboratory/U.S. Dept. of Energy

Credit: 
DOE/Oak Ridge National Laboratory

UM researchers publish discoveries on antibiotic resistance

image: Engineered bacteria fluorescing green in response to stress induced by growth in a polymer-rich environment.

Image: 
Courtesy of Patrick Secor

MISSOULA - University of Montana researchers recently published their new insights into how pathogenic bacteria resist antibiotic treatment in Proceedings of the National Academy of Sciences.

"Antibiotic resistance is a major problem," said Patrick Secor, assistant professor in UM's Division of Biological Sciences and lead researcher on the paper. "However, it is often the case that if you take bacteria that survive antibiotic treatment from someone's infected lungs and treat those same bacteria with antibiotics in the lab, the bacteria die. We wanted to understand why."

Secor and researchers at UM and the University of Washington discovered that polymers present in airway mucus physically push on bacterial cells.

"We found that bacteria living in high concentrations of polymers get a little stressed out," said Lia Michaels, a researcher at UM and co-author of the paper. "Basically, the polymer-rich environment activates stress responses in the bacteria, causing them to tolerate higher levels of antibiotics."

"I like to compare it to the stress our bodies undergo when we exercise," Secor said. "Exercising today allows you to run a little further or lift a little more weight later on. This is analogous to the stress responses turned on in bacteria living in airway mucus - exposure to stress today allows the bacteria to survive the stress of antibiotic exposure later on."

The researchers discovered that stress responses induced by mucus polymers pressing on the bacteria were a result of mild DNA damage in the bacterial cells.

"One thing that this DNA damage did was slow bacterial growth," said Laura Jennings, UM research assistant professor and co-author of the paper. "Because most antibiotics work best on rapidly dividing cells, these slow-growing bacteria were more difficult to kill with antibiotics."

The researchers speculate that the mechanisms by which polymers turn on bacterial stress responses could be targeted therapeutically to treat long-term bacterial infections.

"Our hope is that we could come up with new ways to treat bacterial infections or increase the efficacy of antibiotic treatment," Secor said.

Credit: 
The University of Montana

NASA sees Walaka becoming a powerful hurricane

video: On Sept. 30, GPM data revealed intense convective storms in a large feeder band wrapping around Tropical Storm Walaka's northeastern side, where rain was falling at a rate of almost 6.5 inches (165 mm) per hour. A tall convective storm located in a line northwest of Walaka's center was found by DPR to reach heights above 8.5 miles (13.7 km).

Image: 
NASA/JAXA, Hal Pierce

The Global Precipitation Measurement mission or GPM core satellite passed over Walaka in the Central Pacific Ocean and analyzed the storm's rainfall and cloud structure as it was strengthening into a hurricane.

Walaka formed southwest of the Hawaiian Islands on Saturday, Sept. 29. At 5 p.m. HST on Sunday, Sept. 30, Walaka strengthened to a hurricane.

The GPM core observatory recently had a couple of good looks at Tropical Storm Walaka as it was intensifying into a powerful hurricane. GPM passed directly over the storm when it was located south of the Hawaiian Islands on Sept. 30, 2018 at 8:38 a.m. HST (1838 UTC).

Data collected by GPM's Microwave Imager (GMI) and Dual-Frequency Precipitation Radar (DPR) instruments showed that Walaka was well organized and very close to hurricane intensity. GPM's Radar (DPR Ku Band) data revealed intense convective storms in a large feeder band that was wrapping around the tropical storm's northeastern side and storms wrapping around a forming eye wall. GPM's DPR found rain falling at a rate of almost 6.5 inches (165 mm) per hour in the intense storms in the feeder band northeast of Walaka's center of circulation.

Walaka had strengthened to hurricane intensity when GPM flew overhead about twelve hours later, at 8:07 p.m. HST (Oct. 1, 2018 at 0607 UTC). Walaka had developed an eye and was undergoing rapid intensification. The intensifying hurricane is passing well to the south of the Hawaiian Islands.

The GPM satellite's Dual-Frequency Precipitation Radar (DPR) data were used to show the structure of precipitation within intensifying tropical storm Walaka. The simulated 3D view of Walaka, looking from the southwest, showed storm tops of powerful storms wrapping into the center of the tropical storm. A tall convective storm was located in a line northwest of Walaka's center. It was found by DPR to reach heights above 8.5 miles (13.7 km). GPM is a joint mission between NASA and the Japanese space agency JAXA.

On Monday, October 1, 2018, NOAA's Central Pacific Hurricane Center or CPHC noted that a Hurricane Warning is in effect for Johnston Atoll. Also, interests in the Papahanaumokuakea Marine National Monument should monitor the progress of Walaka.

At 5 p.m. HST on Sept. 30 (11 p.m. EDT; 0300 UTC on Oct. 1), the center of Hurricane Walaka was located near latitude 11.9 degrees north and longitude 166.4 degrees west. Walaka is moving toward the west near 12 mph (19 kph), and this motion is expected to slow and turn northwest on Monday, then north on Tuesday. Maximum sustained winds are near 75 mph (120 kph) with higher gusts. Rapid intensification is forecast through Tuesday.

The Joint Typhoon Warning Center (JTWC) predicts that Hurricane Walaka will continue to strengthen and re-curve to the north later today. Walaka is expected to be a powerful Category 4 hurricane on the Saffir-Simpson hurricane wind scale when it passes just to the west of Johnston Atoll in a couple of days. Walaka is not expected to have a significant effect on the Hawaiian Islands.


Credit: 
NASA/Goddard Space Flight Center

Dutch study estimates 1 in 2 women and 1 in 3 men set to develop dementia/parkinsonism/stroke

One in two women and one in three men will likely be diagnosed with dementia, Parkinson's disease, or stroke in their lifetime, estimate Dutch researchers in an observational study published online in the Journal of Neurology Neurosurgery & Psychiatry.

But preventive strategies, which delay the onset of these common diseases by even a few years, could, in theory, cut this lifetime risk by between 20 and more than 50 per cent, they say.

The global costs of dementia, stroke, and parkinsonism are thought to amount to more than 2 per cent of the world's annual economic productivity (GDP), a figure that is set to rise steeply as life expectancy continues to increase.

But while the lifetime risks of other serious illnesses, such as breast cancer and heart disease, are well known and used to raise public awareness, the same can't be said of dementia, stroke, and parkinsonism, say the researchers.

To try and redress this, they tracked the neurological health of more than 12,000 people taking part in the Rotterdam Study between 1990 and 2016. This study has been looking at the incidence of, and influential factors behind, diseases of ageing in the general population.

All the participants were aged at least 45 when they were recruited, and more than half (just under 58 per cent) were women.

When they joined, participants were given a thorough health check, which was repeated every four years. Family doctor health records were also scrutinised for signs of disease or diagnoses arising between the four-yearly check-ups.

Monitoring for dementia, parkinsonism, and stroke continued until death, or January 1 2016, whichever came first.

Between 1990 and 2016, 5291 people died, 3260 of whom had not been diagnosed with any neurological disease. But 1489 people were diagnosed with dementia, mostly Alzheimer's disease (just under 80%); 1285 had a stroke, nearly two thirds (65%) of which were caused by a blood clot (ischaemic stroke); and 263 were diagnosed with parkinsonism.

A higher prevalence of high blood pressure, abnormal heart rhythm (atrial fibrillation), high cholesterol and type 2 diabetes was evident at the start of the monitoring period among those subsequently diagnosed with any of the three conditions.

Unsurprisingly, the risk of developing any of them rose steeply with age, but based on the data, the overall lifetime risk of a 45-year-old developing dementia or parkinsonism, or having a stroke, was one in two for a woman (48%) and one in three for a man (36%).

This gender difference was largely driven by women's heightened risk of developing dementia. But there were other gender differences in risk.

While 45-year-olds of both sexes had a similar lifetime risk of stroke, men were at substantially higher risk of having a stroke at younger ages than women.

And women were twice as likely as men to be diagnosed with both dementia and stroke during their lifetime.

The researchers calculated that if the onset of dementia, stroke, and parkinsonism were delayed by 1 to 3 years, the remaining lifetime risk could, in theory, be reduced by 20 per cent in 45-year-olds, and by more than 50 per cent in those aged 85 and over.

A delay of only a few years for one disease could also have a significant impact on combined lifetime risk, suggest the researchers.

"For instance, delaying dementia onset by 3 years has the potential to reduce lifetime risk of any disease by 15 per cent for men and women aged 45, and by up to 30 per cent for those aged 85 and older," they write.
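
The age dependence of that delay effect can be illustrated with a toy constant-hazard model. This is not the authors' method; the 2 per cent annual hazard and the age-95 horizon are invented for illustration. The point is structural: with fewer remaining at-risk years, a fixed delay removes a larger fraction of the remaining risk.

```python
def remaining_risk(hazard_per_year, age, horizon=95, delay=0.0):
    """Toy cumulative risk accumulated between (age + delay) and horizon."""
    years_at_risk = max(0.0, horizon - age - delay)
    return 1 - (1 - hazard_per_year) ** years_at_risk

def relative_reduction(age, delay=3, hazard=0.02):
    """Fraction of remaining lifetime risk removed by delaying onset."""
    base = remaining_risk(hazard, age)
    delayed = remaining_risk(hazard, age, delay=delay)
    return (base - delayed) / base

# A 3-year delay trims only a few per cent of a 45-year-old's remaining
# risk in this toy model, but over a quarter of an 85-year-old's:
print(f"age 45: {relative_reduction(45):.1%}")  # ~3.6%
print(f"age 85: {relative_reduction(85):.1%}")  # ~27.9%
```

The exact percentages depend entirely on the invented parameters, but the qualitative pattern matches the study's finding that a given delay yields a much larger relative benefit at older ages.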

The researchers point out that their study included only people of European ancestry with a relatively long life expectancy, so might not be applicable to other ethnicities/populations, and that they weren't able to measure the severity of any of the diagnosed conditions.

This research is observational, so no definitive conclusions can be drawn. But the researchers nevertheless conclude: "These findings strengthen the call for prioritising the focus on preventive interventions at population level which could substantially reduce the burden of common neurological diseases in the ageing population."

Credit: 
BMJ Group

Don't treat e-cigarettes like cigarettes

CHICAGO --- "Cigarette" might appear in the term "e-cigarette," but that is as far as their similarities extend, according to a new Northwestern Medicine paper published Friday, Sept. 28, in the journal Nicotine & Tobacco Research. Assuming e-cigarettes are equivalent to cigarettes could lead to misguided research and policy initiatives, the paper says.

"Comparing cigarettes to e-cigarettes can give us a false sense of what dangers exist because it misses the gap in understanding how people use them and how they can make people dependent," said first author Matthew Olonoff, a Ph.D. student at Northwestern University Feinberg School of Medicine. "Before we start making policy changes, such as controlling nicotine or flavor options in e-cigarettes, we need to better understand what role these unique characteristics have."

The commentary distills articles and published studies that compare e-cigarettes to cigarettes and supports the importance of investigating e-cigarettes as a unique nicotine delivery system. It was published less than a month after the U.S. Food and Drug Administration declared youth vaping an epidemic.

"There are enough key differences between cigarettes and these products, especially newer-generation devices, to show that they are not interchangeable nicotine delivery systems," Olonoff said.

Key differences between the products include:

The amount of nicotine in e-cigarettes can vary widely, which doesn't provide enough consistency for researchers who study the device and smokers' behavior.

E-cigarette nicotine is inhaled by vaping a liquid.

The ability to stop and restart e-cigarettes allows far more variability in intermittent use and nicotine dosing compared to a traditional cigarette.

The teen-friendly marketing and technology offered in e-cigarettes make them more attractive than traditional cigarettes.

E-cigarettes are allowed in areas where cigarettes are prohibited.

Teens are at greatest risk because e-cigarette use and marketing are on the rise.

"Teens could potentially be getting addicted to something dangerous and harmful to their health," Olonoff said. "The only way we'll know if this is true is to study e-cigarettes as if they're their own unique device."

Researching e-cigarettes should be very different from researching traditional cigarettes

"From a research perspective, when we call it a 'cigarette,' we know how many puffs are typically in a cigarette, how people use it, the amount of nicotine in it," Olonoff said. "Even though it has the word 'cigarette' in it, e-cigarettes are not the same thing."

To help alleviate this inconsistency, the National Institute on Drug Abuse in March 2018 introduced a standardized research-specific e-cigarette for researchers to purchase. The device allows every researcher to study the same e-cigarette so that the chemical components (nicotine, formaldehyde, etc.) are consistent, as is the number of puffs it takes to finish the e-cigarette. Olonoff's commentary encourages researchers to use these for consistency across all research.

Setting policy initiatives for quickly advancing technology

E-cigarettes have been commercially available since the mid-2000s, Olonoff said. The technology has been advancing rapidly, which makes it nearly impossible to set up-to-date policy initiatives.

When e-cigarettes were introduced, marketing campaigns suggested they could be used to curb cigarette use. But years later, this claim is still unsubstantiated, Olonoff said.

Teens and e-cigarettes

The relative ease with which teens can use e-cigarettes -- such as Juul selling a USB drive-like e-cigarette -- might be enticing youths to think the devices are safe, Olonoff said.

"We've done so much to convince our youth that cigarettes and smoking are bad, and overall, it's been a relatively successful campaign when you look at how much smoking has decreased and continued to decrease," Olonoff said. "If teens use the device and they see it differently than the rest of the nicotine products, the researchers should adopt a different philosophical belief, too."

Credit: 
Northwestern University