Tech

Vitamin D may not help your heart

EAST LANSING, Mich. - While previous research has suggested a link between low levels of vitamin D in the blood and an increased risk of cardiovascular disease, a new Michigan State University study has found that taking vitamin D supplements did not reduce that risk.

The large-scale study, published in the Journal of the American Medical Association Cardiology, found that vitamin D supplements did not decrease the incidence of heart attacks, strokes or other major adverse cardiovascular events.

"We thought it would show some benefit," said Mahmoud Barbarawi, a clinical instructor in the MSU College of Human Medicine and chief resident physician at Hurley Medical Center in Flint, Michigan. "It didn't show even a small benefit. This was surprising."

His finding was consistent for both men and women and for patients of different ages.

Many earlier studies have found an association between low blood levels of vitamin D and an increased risk of cardiovascular disease, suggesting that vitamin D supplements might reduce that risk.

Barbarawi led a team of researchers that reviewed data from 21 clinical trials involving more than 83,000 patients. Half the patients were given vitamin D supplements, and half were given placebos. The meta-analysis showed no difference between the two groups in the incidence of major cardiovascular events or in death from any cause.
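For readers unfamiliar with how such trial results are combined, the sketch below shows the general idea of an inverse-variance meta-analysis of relative risks. It is illustrative only: the trial counts are invented, and this is not the authors' analysis code.

```python
# Hypothetical sketch of an inverse-variance (fixed-effect) meta-analysis of
# relative risks. All trial numbers below are made up for illustration only.
import math

trials = [
    # (events_treated, n_treated, events_placebo, n_placebo) -- illustrative values
    (120, 2000, 118, 2000),
    (45, 900, 50, 910),
    (300, 5000, 298, 4990),
]

weights, log_rrs = [], []
for a, n1, c, n2 in trials:
    rr = (a / n1) / (c / n2)                    # relative risk in this trial
    se = math.sqrt(1/a - 1/n1 + 1/c - 1/n2)     # standard error of log(RR)
    log_rrs.append(math.log(rr))
    weights.append(1 / se**2)                   # inverse-variance weight

pooled = sum(w * x for w, x in zip(weights, log_rrs)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))
print(f"pooled RR = {math.exp(pooled):.2f}")
print(f"95% CI = ({math.exp(pooled - 1.96*pooled_se):.2f}, "
      f"{math.exp(pooled + 1.96*pooled_se):.2f})")
```

A pooled relative risk whose confidence interval includes 1.0 corresponds to the "no difference" result the study reported.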

Vitamin D is sometimes known as the sunshine vitamin because human skin makes vitamin D when exposed to the sun. Thus, those living farthest from the equator tend to have lower levels of vitamin D in their blood.

While some studies have found a link between low levels of the vitamin and an increased risk of adverse cardiovascular events, Barbarawi's study suggests that other factors, such as outdoor physical activity and nutritional status, might explain the association.

Barbarawi also noted that even though his findings showed no effect on heart health, some patients, such as those being treated for osteoporosis, still might benefit from the supplements.

As a result, he suggests that doctors and patients think twice about taking the vitamin to minimize the chances of a heart attack or other cardiovascular issues.

"We don't recommend taking vitamin D to reduce this risk," Barbarawi said.

Credit: 
Michigan State University

A songbird's fate hinges on one fragile area

COLUMBUS, Ohio - Researchers were surprised to find that a migratory songbird that breeds in the eastern and central United States is concentrated during winter in just one South American country.

The study found that 91 percent of 34 Prothonotary Warblers fitted with tracking technology at six U.S. breeding sites spent their winter in northern Colombia - an area just 20 percent the size of their breeding range.

In addition, nearly all the warblers stopped over at three locations in the Yucatan Peninsula and Central America on their migration between the United States and Colombia, the study showed.

The results are concerning because they suggest the fate of the Prothonotary Warbler may depend on protecting habitat in a small area that is threatened with deforestation, said Christopher Tonra, lead author of the study and assistant professor of avian wildlife ecology at The Ohio State University.

"It was surprising to us how consistently Prothonotary Warblers from all over the United States ended up in Colombia," Tonra said.

"It speaks to how important habitat protection in this one country is to the overall population."

The study will be published June 19, 2019 in the journal The Condor: Ornithological Applications.

Prothonotary Warblers are considered a "species of concern" nationally and in many states because of population declines during the 20th century.

The researchers used a relatively new technology called geolocators to track the birds. Researchers captured Prothonotary Warblers at their breeding sites in six states - Wisconsin, Ohio, Virginia, South Carolina, Arkansas and Louisiana - and attached the tiny devices to them.

The geolocators' light sensors detect the length of day, which gives researchers an approximation of the birds' latitude as they move south and north. They also have clocks, which can help estimate the birds' longitudinal position as they move east or west through different time zones.
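As a rough illustration of the principle the researchers describe, the sketch below converts a measured day length into an approximate latitude and the clock time of local solar noon into a longitude. It uses a simple textbook approximation for solar declination; it is not the tracking software used in the study, and the example numbers are invented.

```python
# Minimal sketch of light-level geolocation: day length constrains latitude,
# and the UTC time of local solar noon constrains longitude.
import math

def solar_declination(day_of_year):
    # Approximate declination of the sun, in degrees
    return -23.44 * math.cos(math.radians(360 / 365 * (day_of_year + 10)))

def latitude_from_day_length(day_length_hours, day_of_year):
    dec = math.radians(solar_declination(day_of_year))
    H = math.radians(day_length_hours * 15 / 2)   # half the day length as an hour angle
    # Sunrise equation: cos(H) = -tan(lat) * tan(dec)
    return math.degrees(math.atan(-math.cos(H) / math.tan(dec)))

def longitude_from_solar_noon(solar_noon_utc_hours):
    # Each hour that local noon differs from 12:00 UTC is 15 degrees of longitude
    return 15.0 * (12.0 - solar_noon_utc_hours)

# Example: an 11.5-hour day in early January with solar noon at 17:00 UTC
# yields roughly 9 degrees N, 75 degrees W -- northern Colombia.
print(latitude_from_day_length(11.5, 5), longitude_from_solar_noon(17.0))
```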

When the birds arrived back at their breeding grounds the following spring, the researchers recaptured them and downloaded the data their geolocators had collected, which showed approximately where they had spent the winter and the route they took to get there and back.

The researchers were surprised that no matter where the Prothonotary Warblers bred in the United States, they almost all ended up in Colombia. In other species of migratory, neotropical birds, populations that breed in different areas of North America often winter in different parts of Central and South America, Tonra said.

"We didn't expect that Prothonotary Warblers from all around the United States would use the same wintering region," he said.

"It shows the importance of using modern technology like geolocators to give us a clearer view of the lives of migratory songbirds."

Also important was the finding that nearly all the birds spent a significant amount of time on their migratory journey at stopover locations in the Yucatan Peninsula, and along the border between Honduras and Nicaragua and between Costa Rica and Panama. They fed there to gain the energy to complete their journey.

"This was something we really didn't know before this study, and another reason why the use of geolocators was so crucial," Tonra said.

"In addition to the breeding ground and wintering grounds, they have these very critical refueling sites that need our attention and protection."

While the fact that Prothonotary Warblers are concentrated in relatively small refueling and wintering areas is concerning, it also offers opportunity, Tonra said.

"It means that we can have a large positive impact by protecting habitat in specific regions," he said.

Credit: 
Ohio State University

FEFU scientist reports on pesticide concentrations in marine organisms

image: Vasiliy Tsygankov, Head of the Laboratory of Environmental Biotechnology, Far Eastern Federal University (FEFU).

Image: 
FEFU press office

According to an ecotoxicologist from Far Eastern Federal University (FEFU), between the 1990s and the 2000s the concentration of organochlorine pesticides (OCPs), compounds that were used worldwide in agriculture in the mid-twentieth century, increased roughly tenfold in the tissues of Russian Far Eastern mussels. OCPs pollute and damage the ecosystems of the Sea of Japan, the Sea of Okhotsk, and the Bering Sea. A related review was published in Water Research.

The author notes that, according to new experimental results that have not yet been published, after the sharp increase of the 2000s the concentration of OCPs in Russian Far Eastern mussels has now fallen almost threefold. Similar fluctuations in pesticide levels are seen in other marine animals: mollusks, fish, seabirds, and mammals. The toxicants accumulate especially strongly in fat tissue. Pesticides first build up in coastal waters and from there enter marine organisms.

'The concentration of OCPs in the environment is a serious danger to all living organisms at the top of the food chain, which means not only animals but also humans. The reason is biomagnification: organisms at higher trophic levels accumulate pesticides by feeding on organisms at lower trophic levels, each of which contains only a small amount of pesticides. This can lead to poisoning and even death. It has been established that OCPs suppress the endocrine and immune systems of marine animals, cause various abnormalities and genetic changes in populations, and induce tumors. These are sufficient reasons for all of us to think about new standards for monitoring pesticides in the environment,' said the author of the study, Vasiliy Tsygankov, Associate Professor of the Department of Food Science and Technology and Head of the Laboratory of Environmental Biotechnology at the FEFU School of Biomedicine.
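As a back-of-the-envelope illustration of the biomagnification Tsygankov describes (the starting concentration and magnification factor below are invented, not taken from the review), a contaminant that is retained and concentrated at each feeding step quickly reaches high levels at the top of the food chain:

```python
# Toy illustration of biomagnification along a simplified food chain.
start_ng_per_g = 1.0           # concentration in zooplankton, ng/g (illustrative)
magnification_per_level = 5.0  # invented trophic magnification factor

concentration = start_ng_per_g
for level in ["zooplankton", "small fish", "salmon", "seabird / marine mammal"]:
    print(f"{level:22s}: {concentration:8.1f} ng/g")
    concentration *= magnification_per_level
```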

The scientist pointed out that a systematic, long-term study carried out simultaneously on different types of marine animals is needed. Its results would provide the basis for concluding whether the pesticide situation is improving or worsening, and the picture may well differ from region to region.

Organochlorine pesticides (OCPs), derivatives of the toxic substance DDT, are among the most persistent organic pollutants. They decompose extremely slowly and therefore spread easily and accumulate in aquatic ecosystems around the globe, even decades after their use was banned. Even at very low concentrations, OCPs harm marine biota, especially zooplankton and crustaceans. Pesticides often collect in coastal waters and then enter mollusks, fish, seabirds, and mammals; the last two are especially prone to accumulating pesticides in their fat tissues.

Among the main sources of OCPs entering the environment, the scientist cited leaks from storage sites, agricultural fields in countries that continue to use OCPs, atmospheric transport, sea currents and the migration of marine animals such as Pacific salmon. Vasiliy Tsygankov stressed that the global spread of toxic OCPs is demonstrated by the fact that they have been found in both the Arctic and the Antarctic.

At FEFU, studies of OCP levels in the environment of the Russian Far East have been under way for nearly 10 years. The university's scientists previously found that pesticides accumulate most in marine organisms with a high proportion of fat tissue and in animals with longer lifespans.

Credit: 
Far Eastern Federal University

Climate change could affect symbiotic relationships between microorganisms and trees

Some fungi and bacteria live in close association, or symbiosis, with tree roots in forest soil to obtain mutual benefits. The microorganisms help trees access water and nutrients from the atmosphere or soil, sequester carbon, and withstand the effects of climate change. In exchange, they receive carbohydrates, which are essential to their development and are produced by the trees during photosynthesis.

More than 200 scientists from several countries, including 14 from Brazil, collaborated to map the global distribution of these root symbioses and further the understanding of their vital role in forest ecosystems. They identified factors that determine where different kinds of symbionts may emerge and estimated the impact of climate change on tree-root symbiotic relationships and hence on forest growth.

They concluded that the abundance of ectomycorrhizal trees could decline by as much as 10% if emissions of carbon dioxide (CO2) continue unabated until 2070, especially in cooler parts of the planet. Ectomycorrhizae are a form of symbiotic relationship that occurs between fungal symbionts and the roots of various plant species.

The authors of the study, featured on the cover of the May 16 issue of Nature, included Brazilian researchers Carlos Joly and Simone Aparecida Vieira, both professors at the University of Campinas (UNICAMP) and coordinators of the São Paulo Research Foundation - FAPESP Research Program on Biodiversity Characterization, Conservation, Restoration and Sustainable Use, as well as plant ecologist Luciana Ferreira Alves, now at the University of California, Los Angeles (UCLA) in the United States.

"We've long known that root-microorganism symbiosis is key to enable certain tree species to survive in areas where the soil is very poor and nutrients are released slowly by the decomposition of organic matter. The mapping survey helps us understand the distribution of these relationships worldwide and the factors that determine them," Vieira told.

The researchers focused on mapping three of the most common groups of tree-root symbionts: arbuscular mycorrhizal fungi, ectomycorrhizal fungi, and nitrogen-fixing bacteria. Each group comprises thousands of species of fungi or bacteria that form unique partnerships with different tree species.

Thirty years ago, botanist David Read, Emeritus Professor of Plant Science at the University of Sheffield in the United Kingdom and a pioneer of symbiosis research, drew maps to show locations around the world where he thought different symbiotic fungi might reside based on the nutrients they provide to fuel tree growth.

Ectomycorrhizal fungi provide trees with nitrogen directly from organic matter, such as decaying leaves, so Read proposed that these fungi would be more successful in forests with cooler and drier seasonal climates, where decomposition is slow and leaf litter is abundant.

In contrast, Read argued, arbuscular mycorrhizal fungi should dominate in the tropics, where tree growth is limited by soil phosphorus and the warm, wet climate accelerates decomposition.

More recently, research by other groups has shown that nitrogen-fixing bacteria seem to thrive most in arid biomes with alkaline soil and high temperatures.

These hypotheses have now become testable thanks to the data gathered from large numbers of trees in various parts of the globe and made available by the Global Forest Biodiversity Initiative, an international consortium of forest scientists.

In addition to Joly and Vieira, the Brazilian members of GFBI include Pedro Henrique Santin Brancalion and Ricardo Gomes César, researchers at the University of São Paulo (USP); UNICAMP's Gabriel Dalla Colletta; Daniel Piotto, Federal University of Southern Bahia (UFSB); André Luis de Gasper, University of Blumenau (FURB); Jorcely Barroso and Marcos Silveira, Federal University of Acre (UFAC); Iêda Amaral and Maria Teresa Piedade, National Institute of Amazon Research (INPA); Beatriz Schwantes Marimon, University of Mato Grosso (UNEMAT); and Alexandre Fadigas de Souza, Federal University of Rio Grande do Norte (UFRN).

In recent years, GFBI-affiliated researchers have built a database comprising information from more than 1.1 million forest plots and have inventoried 28,000 tree species. They surveyed actual trees located in over 70 countries on every continent except Antarctica.

The inventories also contain information on soil composition, topography, temperature and carbon storage, among other items.

"The plots inventoried by researchers linked to BIOTA-FAPESP are located in areas of Atlantic rainforest, including the northern coast of São Paulo State, such as Caraguatatuba, Picinguaba, Cunha and Santa Virgínia, and the southern coast of the state, such as Carlos Botelho and Ilha do Cardoso," Joly said. "We also inventoried a substantial part of the Amazon region via projects in collaboration with other groups."

Data on the locations of 31 million trees from this database, along with information on the symbionts associated with them, were fed by the GFBI team into a computer algorithm that estimated the impacts of climate, soil chemistry, vegetation and topography, among other variables, on the prevalence of each type of symbiosis.

The analysis suggested that climate variables associated with organic decomposition rates, such as temperature and moisture, are the main factors influencing arbuscular mycorrhizal and ectomycorrhizal symbioses, while nitrogen-fixing bacteria are likely limited by temperature and soil acidity.
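The GFBI analysis itself is far more elaborate, but the sketch below illustrates the general approach of relating the local share of one symbiosis type to environmental predictors with a machine-learning model and then asking which predictors matter most. The data, variable names and the choice of a random forest are all assumptions made for illustration; this is not the consortium's pipeline.

```python
# Illustrative sketch: predict the local share of ectomycorrhizal trees from
# environmental variables, then inspect variable importance. Data are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 1000
temperature = rng.uniform(-10, 30, n)   # mean annual temperature, deg C
moisture = rng.uniform(0, 1, n)         # relative moisture index
soil_ph = rng.uniform(3.5, 8.5, n)

# Synthetic "truth": ectomycorrhizal share rises in cooler, drier conditions
ecto_share = np.clip(0.8 - 0.02 * temperature - 0.3 * moisture
                     + rng.normal(0, 0.05, n), 0, 1)

X = np.column_stack([temperature, moisture, soil_ph])
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, ecto_share)

for name, importance in zip(["temperature", "moisture", "soil_ph"],
                            model.feature_importances_):
    print(f"{name}: {importance:.2f}")
```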

"Climate changes occurring in the Northern Hemisphere may displace ectomycorrhizal fungi to other regions, leading to a drastic reduction in the density of these symbiotic relationships or their total loss," Vieira said.

"This can affect nutrient cycling and above all carbon fixation, which depends on these symbiotic associations if forest vegetation is to absorb nutrients that are scarce or not available in the requisite form."

Effects of climate change

To gauge the vulnerability of global symbiosis levels to climate change, the researchers used their mapping survey to predict how symbioses may change by 2070 if carbon emissions continue unabated.

The projections indicated a 10% reduction in ectomycorrhizal fungi and hence in the abundance of the trees associated with these fungi, which account for about 60% of all trees.

The researchers caution that this loss could lead to more CO2 in the atmosphere because ectomycorrhizal fungi tend to increase the amount of carbon stored in the soil.

"CO2 limits photosynthesis, and an increase in atmospheric carbon could have a fertilization effect. Faster-growing plant species may be able to make better use of this rise in CO2 availability in the atmosphere than slower-growing plants, potentially leading to species selection. However, this remains to be seen," Joly said.

The researchers are also investigating the likely impact of increased atmospheric CO2 and global warming on plant development. Plants must expend more resources on respiration in a warmer climate, even as photosynthesis accelerates. What the net outcome for growth will be is unclear, according to the researchers.

"These questions regarding tropical forests are still moot. Continuous monitoring of permanent forest plots will help us answer them," Joly said.

Credit: 
Fundação de Amparo à Pesquisa do Estado de São Paulo

Methods and models

It's a well-known fact that the ocean is one of the biggest absorbers of the carbon dioxide emitted by way of human activity. What's less well known is how the ocean's processes for absorbing that carbon change over time, and how they might affect its ability to buffer climate change.

For UC Santa Barbara oceanographer Timothy DeVries and graduate student Michael Nowicki, gaining a good understanding of the trends in the ocean's carbon cycle is key to improving current models of carbon uptake by the Earth's oceans. This information could, in turn, yield better climate predictions. Their paper on the topic is published in the Proceedings of the National Academy of Sciences.

"We started off looking at the rate at which CO2 was accumulating in the atmosphere, and then we compared that to the rate of emissions," DeVries said. "One would expect basically that if you're increasing emissions at 10 percent, the accumulation rate in the atmosphere should increase at 10 percent, for example.

"But what we found is that the rate at which CO2 was accumulating in the atmosphere doesn't necessarily track emissions," he continued. Indeed, after looking at two decades' worth of carbon emissions versus atmospheric carbon accumulation data, the researchers came away with some counterintuitive results.

"We saw in the 1990s that the accumulation rate in the atmosphere was increasing quite fast, whereas the emissions weren't increasing very quickly at all," DeVries said. "Whereas the opposite was true in the 2000s when the emissions increased quite substantially, but the accumulation rate in the atmosphere was steady."

That variability, the researchers said, is due in part to the ocean's carbon-absorbing activities, a range of physical and biological processes that move carbon from the surface to depth. Up to 40% of the decadal variability of CO2 accumulation in the atmosphere can be attributed to how quickly the ocean takes up carbon; the rest can be attributed to activities in the terrestrial biosphere.
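As a rough illustration of this budgeting logic (using invented round numbers in gigatonnes of carbon per year, not the study's data), the atmospheric growth rate can be thought of as what remains of emissions after the ocean and land sinks are subtracted:

```python
# Back-of-the-envelope carbon budget: atmospheric growth = emissions - sinks.
# All values below are illustrative placeholders, in GtC per year.
decades = {
    "1990s": {"emissions": 6.4, "ocean": 1.7, "land": 1.2},
    "2000s": {"emissions": 7.8, "ocean": 2.4, "land": 1.6},
}

for name, d in decades.items():
    growth = d["emissions"] - d["ocean"] - d["land"]   # atmospheric accumulation
    airborne_fraction = growth / d["emissions"]
    print(f"{name}: atmospheric growth {growth:.1f} GtC/yr, "
          f"airborne fraction {airborne_fraction:.2f}")
```

With numbers like these, a weaker ocean sink in one decade shows up as a larger fraction of emissions staying in the atmosphere, which is the pattern the researchers describe.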

"We used a few different methods that estimate how quickly the CO2 is accumulating in the ocean, and basically they all agreed that the ocean was absorbing CO2 slower in the 1990s, which is why it accumulated faster in the atmosphere," DeVries said. "And the ocean was absorbing CO2 faster in the 2000s, so it accumulated slower in the atmosphere."

The results in this paper underscore just how dynamic the ocean is, with any number of factors influencing its ability to act as a carbon sink, the researchers said. Among the primary physical factors that drive oceanic carbon absorption is ocean circulation -- CO2 is absorbed into surface water which then sinks as the currents take it to cooler parts of the world, sequestering the greenhouse gas away from the atmosphere.

Both natural and anthropogenic warming could affect the deep ocean currents, which are driven by differences in water density (cold, dense water sinks while warm water is less dense and rises). Warming would, in effect, slow these currents down, decreasing the rate at which carbon is absorbed and making the ocean a less efficient carbon sink in the long term. Similarly, other natural phenomena such as El Niño and volcanic eruptions can change temperatures and wind patterns, which could, in turn, affect ocean currents.

The good news is that climate models generally have been pointing in the correct direction.

"It was interesting to see that the ocean models got the trend generally, but the magnitude was smaller (than what was noted in the observations)," Nowicki said. "But that brings up the next question: Why is that?"

"We need to do more research to look at what's driving this variability," said DeVries, who made the distinction between climate variability -- relatively short-term fluctuations from the general trend -- and climate change, a long-term trend in which the climate enters a new mean state, a "new normal." Capturing the ocean's changing carbon sink in models, the researchers said, will lead to more accurate climate predictions.

Credit: 
University of California - Santa Barbara

Powering a solution: Professor takes charge of improving lithium-ion battery safety

video: Bullet-impact tests from Zhu's group on a regular liquid electrolyte and on the shear-thickening electrolyte.

Image: 
Yu Zhu/The University of Akron

As cutting edge as electric vehicles are, they're still vulnerable to an Achilles heel - the very source that gives them power.

Lithium-ion batteries - or Li-ion batteries - one of the most common battery types used in electric vehicles, are susceptible to catching fire or exploding after a crash or other major impact to the vehicle. The impact can short-circuit the electrodes internally, and the resulting small fire can spread throughout the battery and to other parts of the car through "thermal runaway."

"Although significant efforts have been applied to the thermal management of the battery cells, battery fires and explosions in recent electrical car accidents pose significant concerns in public," said Yu Zhu, Ph.D., associate professor of polymer science at The University of Akron (UA). "In most cases, the battery ignited when it was not operated under normal use, such as through a large external impact, or crash."

Zhu and his team of graduate students in UA's College of Polymer Science and Polymer Engineering are working to improve the safety of Li-ion batteries by creating a shear-thickening electrolyte: a substance, set between the battery's anode and cathode, that becomes thicker under impact, making the cell impact-resistant and less likely to catch fire or explode in a collision. Under normal conditions, the novel electrolyte remains soft.

The group's research, led by Zhu's Ph.D. student Kewei Liu, was recently published in the Journal of Power Sources: "A shear thickening fluid based impact resistant electrolyte for safe Li-ion batteries."

"In Li-ion batteries for use in electric vehicles, the cathode and anode are separated by a very soft membrane and a liquid electrolyte," said Zhu. "Simply replacing a liquid electrolyte with its solid counterpart is still a challenging task because both electrodes are porous and they need liquid to fill pores and make contact. Our idea is you can still use a liquid-like electrolyte under a normal situation, but with a liquid that can improve its own mechanical strengths under impact. So, we developed a shear-thickening electrolyte."

Think of a starch and water mixture. You can put your hand into it and stir slowly while feeling very little resistance, but if you increase your stirring rate, you will feel dramatically more resistance. In fact, a bowling ball can bounce off the surface of a cornstarch and water mixture, which behaves like a solid during the impact.

A liquid with such properties is called a dilatant, a type of non-Newtonian fluid. If an electrolyte is also a dilatant, it will prevent the battery from short-circuiting under external impact. However, forming a shear-thickening electrolyte is much more difficult than mixing cornstarch and water, because the electrolyte has a complicated composition, consisting of different ions, solvents, and various additives.
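A minimal sketch of that behaviour uses a generic power-law (Ostwald-de Waele) model, in which apparent viscosity scales with the shear rate raised to the power n-1, with n greater than 1 for shear thickening. The constants below are illustrative and are not measurements of the Akron electrolyte.

```python
# Generic power-law model of a shear-thickening (dilatant) fluid:
# apparent viscosity = K * shear_rate**(n - 1), with n > 1.
def apparent_viscosity(shear_rate, K=0.5, n=1.8):
    """Apparent viscosity in Pa*s for a power-law fluid (illustrative constants)."""
    return K * shear_rate ** (n - 1)

# From slow stirring toward ballistic-impact shear rates, viscosity climbs steeply.
for rate in (0.1, 1.0, 100.0, 10000.0):
    print(f"shear rate {rate:>8.1f} 1/s -> viscosity {apparent_viscosity(rate):10.2f} Pa*s")
```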

"In our preliminary research," Zhu said, "we demonstrated that a modified low-cost glass fiber filler can produce the shear-thickening electrolyte we're looking for, which is compatible with commercial Li-ion batteries and shows improved impact resistance."

Videos from Zhu's group show bullets fired at different speeds into a regular liquid electrolyte and into the shear-thickening electrolyte. The shear-thickening electrolyte absorbed the kinetic energy and slowed the bullet significantly.

"Compared to a conventional liquid electrolyte, the shear-thickening electrolyte will not significantly reduce the performance of Li-ion batteries," Zhu said. "During an impact, the shear-thickening electrolyte will immediately behave like a solid and generate larger force to resist external impact because of the shear-thickening effect. This solution is complementary to external thermal management system of the battery pack, which often falls short in response to the abrupt impact"

Zhu said research on improving the safety of Li-ion batteries is relatively new, especially for use in electric vehicles. He added that shear-thickening electrolytes could have other niche uses, such as in bulletproof energy storage devices.

Credit: 
University of Akron

Expanding the temperature range of lithium-ion batteries

Electric cars struggle with extreme temperatures, mainly because of impacts on the electrolyte solutions in their lithium-ion batteries. Now, researchers have developed new electrolytes containing multiple additives that work better over a wide temperature range. They report their results in ACS Applied Materials & Interfaces.

Lithium-ion batteries are widely used in cell phones, laptop computers and electric vehicles. The electrolyte solutions in these batteries conduct ions between the negative electrode (anode) and positive electrode (cathode) to power the battery. An indispensable component of most of these solutions, ethylene carbonate helps create a protective layer, preventing further decomposition of electrolyte components when they interact with the anode. However, ethylene carbonate has a high melting point, which limits its performance at low temperatures. Wu Xu and colleagues showed previously that they could extend the temperature range of lithium-ion batteries by partially replacing ethylene carbonate with propylene carbonate and adding cesium hexafluorophosphate. But they wanted to improve the temperature range even further, so that lithium-ion batteries could perform well from -40 to 140 F.

The researchers tested the effects of five electrolyte additives on the performance of lithium-ion batteries within this temperature range. They identified an optimized combination of three compounds that they added to their previous electrolyte solution. This new combination caused the formation of highly conductive, uniform and robust protective layers on both the anode and the cathode. Batteries containing the optimized electrolyte had greatly enhanced discharging performance at -40 F and long-term cycling stability at 77 F, along with slightly improved cycling stability at 140 F.

Credit: 
American Chemical Society

A miniature robot that could check colons for early signs of disease

Engineers have shown it is technically possible to guide a tiny robotic capsule inside the colon to take micro-ultrasound images.

Known as a Sonopill, the device could one day replace the need for patients to undergo an endoscopic examination, where a semi-rigid scope is passed into the bowel - an invasive procedure that can be painful.

Micro-ultrasound images also have the advantage of being better able to identify some types of cell change associated with cancer.

The Sonopill is the culmination of a decade of research by an international consortium of engineers and scientists. The results of their feasibility study are published today (June 19th) in the journal Science Robotics.

The consortium has developed a technique called intelligent magnetic manipulation. Based on the principle that magnets can attract and repel one another, a series of magnets on a robotic arm that passes over the patient interacts with a magnet inside the capsule, gently manoeuvring it through the colon.

The magnetic forces used are harmless and can pass through human tissue, doing away with the need for a physical connection between the robotic arm and the capsule.
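As a rough physics illustration (not the consortium's control algorithm), the axial force between two coaxial magnetic dipoles falls off with the fourth power of their separation, which is why the external magnet must pass close over the patient to steer the capsule. The magnetic moments below are invented for the example.

```python
# Force between two coaxial magnetic dipoles: F = 3*mu0*m1*m2 / (2*pi*r**4).
import math

MU0 = 4 * math.pi * 1e-7   # vacuum permeability, T*m/A

def coaxial_dipole_force(m1, m2, r):
    """Attractive force (N) between coaxial dipoles m1, m2 (A*m^2) separated by r (m)."""
    return 3 * MU0 * m1 * m2 / (2 * math.pi * r ** 4)

# Illustrative moments: a strong external magnet and a small capsule magnet.
for r_cm in (5, 10, 15, 20):
    f = coaxial_dipole_force(m1=50.0, m2=0.05, r=r_cm / 100)
    print(f"separation {r_cm:2d} cm -> force {f*1000:.2f} mN")
```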

An artificial intelligence (AI) system ensures the smooth capsule can position itself correctly against the gut wall to capture the best quality micro-ultrasound images. The feasibility study also showed that, should the capsule become dislodged, the AI system can navigate it back to the required location.

Professor Pietro Valdastri, who holds the Chair in Robotics and Autonomous Systems at the University of Leeds and was senior author of the paper, said: "The technology has the potential to change the way doctors conduct examinations of the gastrointestinal tract.

"Previous studies showed that micro-ultrasound was able to capture high-resolution images and visualise small lesions in the superficial layers of the gut, providing valuable information about the early signs of disease.

"With this study, we show that intelligent magnetic manipulation is an effective technique to guide a micro-ultrasound capsule to perform targeted imaging deep inside the human body.

"The platform is able to localise the position of the Sonopill at any time and adjust the external driving magnet to perform a diagnostic scan while maintaining a high quality ultrasound signal. This discovery has the potential to enable painless diagnosis via a micro-ultrasound pill in the entire gastrointestinal tract."

Sandy Cochran, Professor of Ultrasound Materials and Systems at the University of Glasgow and lead researcher, said: "We're really excited by the results of this feasibility study. With an increasing demand for endoscopies, it is more important than ever to be able to deliver a precise, targeted, and cost-effective treatment that is comfortable for patients.

"Today, we are one step closer to delivering that.

"We hope that in the near future, the Sonopill will be available to all patients as part of regular medical check-ups, effectively catching serious diseases at an early stage and monitoring the health of everyone's digestive system."

The Sonopill is a small capsule, 21mm in diameter and 39mm long, which the engineers say can be scaled down. The capsule houses a micro-ultrasound transducer, an LED light, a camera and a magnet.

A very small flexible cable tethered to the capsule also passes into the body via the rectum and sends ultrasound images back to a computer in the examination room.

The feasibility tests were conducted on laboratory models and in animal studies involving pigs.

Diseases of the gastrointestinal tract account for approximately 8 million deaths a year across the world, including some bowel cancers which are linked with high mortality.

Credit: 
University of Leeds

From one brain scan, more information for medical artificial intelligence

image: MIT researchers have developed a system that gleans far more labeled training data from unlabeled data, which could help machine-learning models better detect structural patterns in brain scans associated with neurological diseases. The system learns structural and appearance variations in unlabeled scans, and uses that information to shape and mold one labeled scan into thousands of new, distinct labeled scans.

Image: 
Amy Zhao/MIT

CAMBRIDGE, MA -- MIT researchers have devised a novel method to glean more information from images used to train machine-learning models, including those that can analyze medical scans to help diagnose and treat brain conditions.

An active new area in medicine involves training deep-learning models to detect structural patterns in brain scans associated with neurological diseases and disorders, such as Alzheimer's disease and multiple sclerosis. But collecting the training data is laborious: All anatomical structures in each scan must be separately outlined or hand-labeled by neurological experts. And, in some cases, such as for rare brain conditions in children, only a few scans may be available in the first place.

In a paper presented at the recent Conference on Computer Vision and Pattern Recognition, the MIT researchers describe a system that uses a single labeled scan, along with unlabeled scans, to automatically synthesize a massive dataset of distinct training examples. The dataset can be used to better train machine-learning models to find anatomical structures in new scans -- the more training data, the better those predictions.

The crux of the work is automatically generating data for the "image segmentation" process, which partitions an image into regions of pixels that are more meaningful and easier to analyze. To do so, the system uses a convolutional neural network (CNN), a machine-learning model that's become a powerhouse for image-processing tasks. The network analyzes a lot of unlabeled scans from different patients and different equipment to "learn" anatomical, brightness, and contrast variations. Then, it applies a random combination of those learned variations to a single labeled scan to synthesize new scans that are both realistic and accurately labeled. These newly synthesized scans are then fed into a different CNN that learns how to segment new images.

"We're hoping this will make image segmentation more accessible in realistic situations where you don't have a lot of training data," says first author Amy Zhao, a graduate student in the Department of Electrical Engineering and Computer Science (EECS) and Computer Science and Artificial Intelligence Laboratory (CSAIL). "In our approach, you can learn to mimic the variations in unlabeled scans to intelligently synthesize a large dataset to train your network."

There's interest in using the system, for instance, to help train predictive-analytics models at Massachusetts General Hospital, Zhao says, where only one or two labeled scans may exist of particularly uncommon brain conditions among child patients.

Joining Zhao on the paper are: Guha Balakrishnan, a postdoc in EECS and CSAIL; EECS professors Fredo Durand and John Guttag, and senior author Adrian Dalca, who is also a faculty member in radiology at Harvard Medical School.

The "Magic" behind the system

Although now applied to medical imaging, the system actually started as a means to synthesize training data for a smartphone app that could identify and retrieve information about cards from the popular collectable card game, "Magic: The Gathering." Released in the early 1990s, "Magic" has more than 20,000 unique cards -- with more released every few months -- that players can use to build custom playing decks.

Zhao, an avid "Magic" player, wanted to develop a CNN-powered app that took a photo of any card with a smartphone camera and automatically pulled information such as price and rating from online card databases. "When I was picking out cards from a game store, I got tired of entering all their names into my phone and looking up ratings and combos," Zhao says. "Wouldn't it be awesome if I could scan them with my phone and pull up that information?"

But she realized that's a very tough computer-vision training task. "You'd need many photos of all 20,000 cards, under all different lighting conditions and angles. No one is going to collect that dataset," Zhao says.

Instead, Zhao trained a CNN on a smaller dataset of around 200 cards, with 10 distinct photos of each card, to learn how to warp a card into various positions. It computed different lighting, angles, and reflections -- for when cards are placed in plastic sleeves -- to synthesize realistic warped versions of any card in the dataset. It was an exciting passion project, Zhao says: "But we realized this approach was really well-suited for medical images, because this type of warping fits really well with MRIs."

Mind warp

Magnetic resonance images (MRIs) are composed of three-dimensional pixels, called voxels. When segmenting MRIs, experts separate and label voxel regions based on the anatomical structure containing them. The diversity of scans, caused by variations in individual brains and equipment used, poses a challenge to using machine learning to automate this process.

Some existing methods can synthesize training examples from labeled scans using "data augmentation," which warps labeled voxels into different positions. But these methods require experts to hand-write various augmentation guidelines, and some synthesized scans look nothing like a realistic human brain, which may be detrimental to the learning process.

Instead, the researchers' system automatically learns how to synthesize realistic scans. The researchers trained their system on 100 unlabeled scans from real patients to compute spatial transformations -- anatomical correspondences from scan to scan. This generated one "flow field" per unlabeled scan, modeling how voxels move from one scan to another. Simultaneously, the system computed intensity transformations, which capture appearance variations caused by image contrast, noise, and other factors.

In generating a new scan, the system applies a random flow field to the original labeled scan, which shifts around voxels until it structurally matches a real, unlabeled scan. Then, it overlays a random intensity transformation. Finally, the system maps the labels to the new structures, by following how the voxels moved in the flow field. In the end, the synthesized scans closely resemble the real, unlabeled scans -- but with accurate labels.
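A simplified sketch of that augmentation step (not the authors' code): warp a labeled volume with a flow field, add an intensity perturbation, and carry the labels through the same warp. The array sizes and the random transformations below are synthetic stand-ins for the learned flow fields and intensity transforms.

```python
# Toy version of the synthesis step: spatial warp + intensity change + label warp.
import numpy as np
from scipy.ndimage import map_coordinates

def apply_flow(volume, flow, order):
    """Warp a 3-D volume by a flow field of shape (3, D, H, W)."""
    grid = np.meshgrid(*[np.arange(s) for s in volume.shape], indexing="ij")
    coords = [g + f for g, f in zip(grid, flow)]
    return map_coordinates(volume, coords, order=order, mode="nearest")

rng = np.random.default_rng(0)
labeled_scan = rng.random((16, 16, 16)).astype(np.float32)   # stand-in for the labeled MRI
labels = (labeled_scan > 0.5).astype(np.int32)               # stand-in segmentation labels

flow = rng.normal(0, 1.0, size=(3, 16, 16, 16))              # learned from unlabeled scans in the real system
intensity_shift = rng.normal(0, 0.05, size=labeled_scan.shape)

new_scan = apply_flow(labeled_scan, flow, order=1) + intensity_shift
new_labels = apply_flow(labels, flow, order=0)               # nearest-neighbour keeps labels discrete
print(new_scan.shape, np.unique(new_labels))
```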

To test their automated segmentation accuracy, the researchers used Dice scores, which measure how well one 3-D shape fits over another, on a scale of 0 to 1. They compared their system to traditional segmentation methods -- manual and automated -- on 30 different brain structures across 100 held-out test scans. Large structures were comparably accurate among all the methods. But the researchers' system outperformed all other approaches on smaller structures, such as the hippocampus, which occupies only about 0.6 percent of a brain, by volume.
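For reference, the Dice score mentioned above is simply twice the overlap between two masks divided by their combined size; a toy example on two small binary masks:

```python
# Dice score: 2 * |A intersect B| / (|A| + |B|), ranging from 0 (no overlap) to 1 (identical).
import numpy as np

def dice(a, b):
    a, b = a.astype(bool), b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

pred = np.array([[1, 1, 0], [0, 1, 0], [0, 0, 0]])
truth = np.array([[1, 1, 0], [0, 0, 0], [0, 0, 1]])
print(f"Dice = {dice(pred, truth):.2f}")   # 2*2 / (3+3) = 0.67
```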

"That shows that our method improves over other methods, especially as you get into the smaller structures, which can be very important in understanding disease," Zhao says. "And we did that while only needing a single hand-labeled scan."

Credit: 
Massachusetts Institute of Technology

New research shows an iceless Greenland may be in our future

image: Ilulissat, known as 'the city of icebergs' sits adjacent to Greenland's Ilulissat Glacier, which flows into the Atlantic Ocean. Such outlet glaciers lead ice sheet loss in Greenland. New research shows that if this loss continues at its current rate, it could result in an ice-free Greenland by the year 3000 and 24 feet of global sea level rise.

Image: 
Martin Truffer

New research shows an iceless Greenland may be in the future. If worldwide greenhouse gas emissions remain on their current trajectory, Greenland may be ice-free by the year 3000. Even by the end of the century, the island could lose 4.5% of its ice, contributing up to 13 inches of sea level rise.

"How Greenland will look in the future -- in a couple of hundred years or in 1,000 years -- whether there will be Greenland, or at least a Greenland similar to today, it's up to us," said Andy Aschwanden, a research associate professor at the University of Alaska Fairbanks Geophysical Institute.

Aschwanden is lead author on a new study published in the June issue of Science Advances. UAF Geophysical Institute researchers Mark Fahnestock, Martin Truffer, Regine Hock and Constantine Khrulev are co-authors, as is Doug Brinkerhoff, a former UAF graduate student.

This research uses new data on the landscape under the ice today to make breakthroughs in modeling the future. The findings show a wide range of scenarios for ice loss and sea level rise based on different projections for greenhouse gas concentrations and atmospheric conditions. Currently, the planet is moving toward the high estimates of greenhouse gas concentrations.

Greenland's ice sheet is huge, spanning over 660,000 square miles. It is almost the size of Alaska and 80% as big as the U.S. east of the Mississippi River. Today, the ice sheet covers 81% of Greenland and contains 8% of Earth's fresh water.

If greenhouse gas concentrations remain on the current path, the melting ice from Greenland alone could contribute as much as 24 feet to global sea level rise by the year 3000, which would put much of San Francisco, Los Angeles, New Orleans and other cities under water.

However, if greenhouse gas emissions are cut significantly, that picture changes. Instead, by 3000 Greenland may lose 8% to 25% of ice and contribute up to approximately 6.5 feet of sea level rise. Between 1991 and 2015, Greenland's ice sheet has added about 0.02 inches per year to sea level, but that could rapidly increase.

Projections for both the end of the century and 2200 tell a similar story: There are a wide range of possibilities, including saving the ice sheet, but it all depends on greenhouse gas emissions.

The researchers ran 500 simulations for each of the three climate scenarios using the Parallel Ice Sheet Model, developed at the Geophysical Institute, to create a picture of how Greenland's ice would respond to different climate scenarios. The model included parameters on ocean and atmospheric conditions as well as ice geometry, flow and thickness.

Simulating ice sheet behavior is difficult because ice loss is led by the retreat of outlet glaciers. These glaciers, at the margins of ice sheets, drain the ice from the interior like rivers, often in troughs hidden under the ice itself.

This study is the first model to include these outlet glaciers. It found that their discharge could contribute as much as 45% of the total mass of ice lost in Greenland by 2200.

Outlet glaciers are in contact with water, and water makes ice melt faster than contact with air, like thawing a chicken in the sink. The more ice touches water, the faster it melts. This creates a feedback loop that dramatically affects the ice sheet.

However, to simulate how the ice flows, the scientists need to know how thick the ice is.

The team used data from a NASA airborne science campaign called Operation IceBridge. Operation IceBridge uses aircraft equipped with a full suite of scientific instruments, including three types of radar that can measure the ice surface, the individual layers within the ice and penetrate to the bedrock to collect data about the land beneath the ice. On average, Greenland's ice sheet is 1.6 miles thick, but there is a lot of variation depending on where you measure.

"Ice is in very remote locations," said Fahnestock. "You can go there and make localized measurements. But the view from space and the view from airborne campaigns, like IceBridge, has just fundamentally transformed our ability to make a model to mimic those changes."

Because previous research results lacked these details, scientists could not simulate present-day conditions as accurately, which makes it more difficult to predict what will happen in the future.

"If it's raining in D.C. today, your best guess is that it's raining tomorrow, too," Aschwanden said. "If you don't know what the weather is today, it's all guessing."

However, that doesn't mean researchers know exactly what will happen.

"What we know from the last two decades of just watching Greenland is not because we were geniuses and figured it out, but because we just saw it happen," Fahnestock said. As for what we will see in the future, "it depends on what we are going to do next."

Credit: 
University of Alaska Fairbanks

New international initiative stresses need for global action on air pollution

The National Academies of Sciences and Medicine from South Africa, Brazil, Germany, and the United States of America have joined forces to issue an urgent call to action on harmful air pollution. They are calling for a new global compact to improve collaboration on the growing problem, and for governments, businesses and citizens to reduce air pollution in all countries. The academies launched their call with the publication of a science-policy statement, which was handed over in a ceremony today at the UN headquarters to senior UN representatives and high-level diplomats from South Africa, Brazil, Germany, and the United States of America.

In the statement, the five National Academies are also jointly calling for immediate action from all levels of society. This includes a request for emissions controls in all countries and proper monitoring of key pollutants - especially PM2.5. PM2.5 is one of the smallest particulates in the air we breathe, which can enter and impact all organs of the body. The science academies specify the need for increased funding to tackle the problem and substantial investment in measures to reduce air pollution. This can also help to reduce climate change and contribute to meeting the goal of limiting average global warming to 1.5ºC.

With this statement, the academies provide further scientific input for the global climate action summit, which the UN Secretary General will hold in September this year and where air pollution and health will be an issue of great concern. The five National academies invite science academies, research institutes, universities and individual scientists worldwide to join the initiative and to strengthen research and science-policy activities in the area of "Air Pollution and Health".

Executive Officer Himla Soodyall of the Academy of Science of South Africa says: "The health impacts of air pollution are enormous: it can harm health across the entire lifespan, causing disease, disability, and death. It is time to move the issue much higher up the policy agenda. Strengthening synergies with other policy areas, including sustainable development, climate change and food security, is important."

President Marcia McNutt of the U.S. National Academy of Sciences says: "If we do not urgently address this global challenge, air pollution will continue to take a startling toll in terms of preventable disease, disability, and death, as well as in avoidable costs of care. The good thing is that air pollution can be cost-effectively controlled. We need to act much more decisively. We need more public and private investments to tackle air pollution that match the scale of the problem."

Foreign-Secretary Margaret Hamburg of the U.S. National Academy of Medicine says: "We are only at the beginning of what we hope to achieve. Our five academies have launched the call, but tackling this issue will need the participation of many more stakeholders. We invite science academies, research institutes, universities and individual scientists worldwide to join the initiative to help solve this global crisis. We also hope that policy-makers and the public will engage with us to improve the future health of people and the planet."

President Luiz Davidovich of the Brazilian Academy of Sciences says: "Air pollution and climate change share an important, common source: the combustion of fossil fuels, that is why tackling air pollution will also help us make progress towards combating climate change."

President Jörg Hacker of the German National Academy of Sciences Leopoldina says: "National Academies are uniquely placed to address complex issues such as the interplay between air pollution and health. Academies are independent fora where scientists from all disciplines come together to exchange and reflect upon their findings. Such a collaboration across disciplines is essential to find solutions to these problems."

Unequivocal scientific evidence shows that air pollution affects human health across our entire lifespan. It can affect everyone, even unborn babies, with young, old and vulnerable people impacted the most. The health impacts include the premature deaths of at least 5 million people per year, as well as chronic health conditions like heart disease, asthma, COPD, diabetes, allergies, eczema and skin ageing. Air pollution also contributes to cancer, stroke and slows lung growth of children and adolescents. Evidence is growing that air pollution contributes to dementia in adults and impacts brain development in children.

Burning of fossil fuels and biomass for heat, power, transport and food production is the main source of air pollution. The global economic burden of disease caused by air pollution across 176 countries in 2015 was estimated to be USD 3.8 trillion. Measures that could reduce air pollution are woefully underinvested in.

The academies state that both private and public investments are insufficient and do not match the scale of the problem. Air pollution is preventable. With sufficient action suffering and deaths from dirty air can be avoided. Clean air is as vital to life on earth as clean water. Air pollution control and reduction must now be a priority for all.

Credit: 
Leopoldina

Researchers learn dangerous brain parasite 'orders in' for dinner

image: This image shows increased expression of an arginine transporter (CAT1, green) in host cells infected with Toxoplasma parasites (nuclei stained blue).

Image: 
Indiana University

Researchers at Indiana University School of Medicine have discovered how a dangerous parasite maintains a steady supply of nutrients while replicating inside of its host cell: it calls for delivery.

Toxoplasma gondii is a single-celled parasite capable of infecting any animal, including humans. Up to one-third of infections in people happen through contact with cat waste or contaminated food or water. Although the parasite only causes acute disease in immunocompromised people, the infection is permanent and has been associated with neurological diseases such as schizophrenia and rage disorder.

The parasite can invade virtually all types of cells in the body. Once inside, it begins to divide exponentially, a process that requires a great deal of resources. The parasite extracts most of the nutrients it needs for replication from its host cell, including essential amino acids like arginine. Because arginine is quickly depleted from the host cell, researchers wanted to learn where the parasite gets more of the amino acid to fuel its expansion into the hundreds.

In a collaborative study funded by the National Institutes of Health, microbiology and immunology professor Bill Sullivan, PhD, and biochemistry and molecular biology professor Ronald Wek, PhD, identified a cellular starvation stress response that occurs within two hours after Toxoplasma infection. The study was led by Leo Augusto, a postdoctoral fellow in the Sullivan and Wek laboratories, who used a variety of mutant host cells to discern that a protein called GCN2 becomes activated as parasites consume the host cell's arginine supply. Augusto mapped out the cascade of events following GCN2 activation, leading him to discover that host cells infected with Toxoplasma express more of an arginine transporter called CAT1 at their cell surface. CAT1 brings more arginine into the infected cell so Toxoplasma can continue to binge.

These findings suggest infected host cells can sense their nutrients being depleted. Oblivious to the parasites growing inside them, the host cells unwittingly gear up to bring in more arginine to compensate for the loss. The identification of proteins like GCN2 that are important for parasite growth and replication may serve as promising new drug targets to treat intracellular pathogens.

"Pathogens that live and grow inside of cells face special challenges," Sullivan said. "Intracellular pathogens have to replicate without raising alarms, but in order to grow they need to pilfer nutrients from the host. Our study shows that Toxoplasma gets additional nutrients simply by hijacking a starvation response already built into the host cell."

Whether the parasite does this on purpose or if it is a happy accident is still a lingering question. Augusto's work appears to suggest the latter, as parasites deficient in arginine uptake did not elicit a strong starvation response in host cells.

Credit: 
Indiana University

Research shows wind can prevent seabirds accessing their most important habitat

image: Guillemot landing

Image: 
Glynn Trueman

We marvel at flying animals because it seems they can go anywhere, but a first-of-its-kind study has revealed that wind can prevent seabirds from accessing the most important of habitats: their nests.

If human pilots or animals are to land safely, they must monitor and respond to the wind. These ideas are well established in aeronautical engineering, but how the wind affects the ability of birds to land has never been considered before.

In a paper published by eLife, biologists including Dr Emily Shepard at Swansea University observed common guillemots and razorbills attempting to land on their breeding cliffs on Skomer Island, Wales. They then teamed up with Dr Andrew Ross, a meteorologist at Leeds University, to assess how the number of successful and aborted landings varied with the wind and turbulence around Skomer.

Seabirds live in windy, often remote places. Many species choose to breed on steep cliffs, where nests cannot be reached by land-based predators. Here, adults must land on small ledges, and they must do this with sufficient control that they do not dislodge their egg or chick.

While all the birds landed successfully in still conditions, 60% of attempts failed in a strong breeze, rising to 80% in near-gale winds. Razorbills, the more manoeuvrable of the two species, were better at landing overall, but "runway" size was important for all birds, which could land more easily on larger ledges where there is more airspace to manoeuvre above their landing spot.

As a result, adults have to make repeated landing attempts on windy days - something that is costly for these birds. In fact, modelling by mathematician Dr Andrew Neate showed that in strong winds, only 50% of birds are likely to land in the first seven attempts. These results have implications for where birds should nest, providing a clear reason for birds to colonise cliffs that are sheltered from prevailing winds.
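As a simple illustration of that kind of result (not Dr Neate's model), if each landing attempt succeeded independently with probability p, the chance of having landed within n attempts would be 1 - (1 - p)^n; a per-attempt success rate of roughly 9% gives about a 50% chance of landing within the first seven attempts.

```python
# Illustrative arithmetic only: probability of landing within n independent attempts.
def prob_landed_within(n_attempts, p_success):
    return 1 - (1 - p_success) ** n_attempts

# Illustrative per-attempt success rates for increasingly windy conditions.
for p in (0.40, 0.20, 0.094):
    print(f"p = {p:.3f}: P(landed within 7 attempts) = {prob_landed_within(7, p):.2f}")
```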

Dr Emily Shepard, Associate Professor at Swansea University said:

"Landing is a taxing and risky process for human pilots and it was fascinating to look at the conditions that make landing challenging for auks.

"These birds are able to land (and even breed) on cliff ledges so small that their tails hang over the edge, but it was striking how wind upsets this delicate balancing act."

Credit: 
Swansea University

First-ever successful mind-controlled robotic arm without brain implants

A team of researchers from Carnegie Mellon University, in collaboration with the University of Minnesota, has made a breakthrough in the field of noninvasive robotic device control. Using a noninvasive brain-computer interface (BCI), researchers have developed the first-ever successful mind-controlled robotic arm exhibiting the ability to continuously track and follow a computer cursor.

Being able to noninvasively control robotic devices using only thoughts will have broad applications, in particular benefiting the lives of paralyzed patients and those with movement disorders.

BCIs have been shown to achieve good performance for controlling robotic devices using only the signals sensed from brain implants. When robotic devices can be controlled with high precision, they can be used to complete a variety of daily tasks. Until now, however, BCIs successful in controlling robotic arms have used invasive brain implants. These implants require a substantial amount of medical and surgical expertise to correctly install and operate, not to mention cost and potential risks to subjects, and as such, their use has been limited to just a few clinical cases.

A grand challenge in BCI research is to develop less invasive or even totally noninvasive technology that would allow paralyzed patients to control their environment or robotic limbs using their own "thoughts." Such noninvasive BCI technology, if successful, would bring this much-needed help to numerous patients and potentially even to the general population.

However, BCIs that use noninvasive external sensing, rather than brain implants, receive "dirtier" signals, which currently means lower resolution and less precise control. Thus, when using only the brain to control a robotic arm, a noninvasive BCI doesn't measure up to implanted devices. Despite this, BCI researchers have forged ahead, their eye on the prize of a less invasive, or even noninvasive, technology that could help patients everywhere on a daily basis.

Bin He, Trustee Professor and Department Head of Biomedical Engineering at Carnegie Mellon University, is achieving that goal, one key discovery at a time.

"There have been major advances in mind controlled robotic devices using brain implants. It's excellent science," says He. "But noninvasive is the ultimate goal. Advances in neural decoding and the practical utility of noninvasive robotic arm control will have major implications on the eventual development of noninvasive neurorobotics."

Using novel sensing and machine learning techniques, He and his lab have been able to access signals deep within the brain, achieving a high resolution of control over a robotic arm. With noninvasive neuroimaging and a novel continuous pursuit paradigm, He is overcoming the noisiness of EEG signals, significantly improving EEG-based neural decoding and facilitating real-time, continuous 2D robotic device control.

Using a noninvasive BCI to control a robotic arm that's tracking a cursor on a computer screen, for the first time ever, He has shown in human subjects that a robotic arm can now follow the cursor continuously. Whereas robotic arms controlled by humans noninvasively had previously followed a moving cursor in jerky, discrete motions--as though the robotic arm was trying to "catch up" to the brain's commands--now, the arm follows the cursor in a smooth, continuous path.

In a paper published in Science Robotics, the team established a new framework that addresses and improves upon the "brain" and "computer" components of BCI by increasing user engagement and training, as well as spatial resolution of noninvasive neural data through EEG source imaging.

The paper, "Noninvasive neuroimaging enhances continuous neural tracking for robotic device control," shows that the team's unique approach to solving this problem not enhanced BCI learning by nearly 60% for traditional center-out tasks, it also enhanced continuous tracking of a computer cursor by over 500%.

The technology also has applications that could help a variety of people, by offering safe, noninvasive "mind control" of devices that can allow people to interact with and control their environments. The technology has, to date, been tested in 68 able-bodied human subjects (up to 10 sessions for each subject), including virtual device control and controlling of a robotic arm for continuous pursuit. The technology is directly applicable to patients, and the team plans to conduct clinical trials in the near future.

"Despite technical challenges using noninvasive signals, we are fully committed to bringing this safe and economic technology to people who can benefit from it," says He. "This work represents an important step in noninvasive brain-computer interfaces, a technology which someday may become a pervasive assistive technology aiding everyone, like smartphones."

Credit: 
College of Engineering, Carnegie Mellon University

Kazan University Clinic testing biodegradable plant-based implants

The prosthetics technology is based on potato- and corn-derived materials, which serve as "food" for the replaced tissues and are slowly absorbed by the patient's own tissue. If the trials are successful, the treatment could be used for sclerosis, aneurysms, and various blood vessel pathologies.

Over a dozen lab animals have been operated on so far, and the results sound promising. Project head, surgeon Vyacheslav Averyanov, explains that children may benefit the most: "A kid grows up, but the prosthesis cannot grow with her. She needs another surgery. And that's what stimulated me to tackle this problem."

The work started in 2012 under the guidance of renowned surgeon Leonid Mirolyubov, who in 2008 had performed the first prosthetics procedure with Alloplant, a biodegradable material.

Dr. Averyanov has made advancements in Alloplant technology and successfully tried it on rats. The current testing is supported by Kazan Federal University, Republican Clinical Hospital, Kazan State Medical University, Liniya Zhizni (Life Line) Foundation, Ministry of Health of Tatarstan, and the University Clinic. KFU's brand new Wet Lab facilities are used.

The researchers hope that the technology can replace existing artificial prostheses. "This is all the more important for our little patients who become hostages of repeated surgical procedures because artificial prosthetics are not capable of natural growth," emphasizes Head of Cardiology at the University Clinic Daniyar Khaziakhmetov.

Credit: 
Kazan Federal University