Tech

Good heart health at age 50 linked to lower dementia risk later in life

Good cardiovascular health at age 50 is associated with a lower risk of dementia later in life, finds a study of British adults published by The BMJ today.

The researchers say their findings support public health policies to improve cardiovascular health in middle age to promote later brain health.

Dementia is a progressive disease that can start to develop 15-20 years before any symptoms appear, so identifying factors that might prevent its onset is important.

The American Heart Association's "Life's Simple 7" cardiovascular health score, originally designed to track cardiovascular disease risk, has been put forward as a potential tool for preventing dementia.

The score is designed for "primordial" prevention: preventing risk factors from developing in the first place, rather than treating them once present. It is the sum of four behavioural metrics (smoking, diet, physical activity, body mass index) and three biological metrics (fasting glucose, blood cholesterol, blood pressure), with totals categorised as poor (scores 0-6), intermediate (7-11), or optimal (12-14) cardiovascular health.
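The tallying can be illustrated with a short sketch. The 0/1/2 points per metric follow the AHA's usual three-level (poor/intermediate/ideal) scoring, but treat the per-metric thresholds as an assumption; only the category cut-offs (0-6, 7-11, 12-14) are given in the text above:

```python
# Illustrative sketch of tallying a Life's Simple 7 score.
# Each of the seven metrics is assumed to be scored 0 (poor),
# 1 (intermediate) or 2 (ideal); the category cut-offs below
# are those used in the study.

def cvh_category(metric_points):
    """Sum seven metric scores and map the total to a category."""
    total = sum(metric_points)
    if total <= 6:
        return total, "poor"
    elif total <= 11:
        return total, "intermediate"
    return total, "optimal"  # totals 12-14

# Hypothetical participant: smoking=2, diet=1, activity=2, BMI=1,
# glucose=2, cholesterol=1, blood pressure=2
points = [2, 1, 2, 1, 2, 1, 2]
print(cvh_category(points))  # → (11, 'intermediate')
```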

But the evidence remains inconsistent. To address this uncertainty, an international research project led by Séverine Sabia of the French National Institute of Health and Medical Research and University College London examined the association between the Life's Simple 7 cardiovascular health score at age 50 and the risk of dementia over the next 25 years.

Their findings are based on cardiovascular data collected from 7,899 British men and women at age 50 in the Whitehall II Study, which is looking at the impact of social, behavioural, and biological factors on long term health.

Participants were free of cardiovascular disease and dementia at age 50. Dementia cases were identified using hospital, mental health services, and death registers until 2017.

Of the 7,899 participants, 347 cases of dementia were recorded over an average follow-up period of 25 years. Average age at dementia diagnosis was 75 years.

After taking account of potentially influential factors, the researchers found that adherence to the Life's Simple 7 cardiovascular health recommendations in midlife was associated with a lower risk of dementia later in life.

Compared with an incidence rate of dementia of 3.2 per 1000 person years among the group with a poor cardiovascular score, those with an intermediate score had an incidence of 1.8 per 1000 person years, while those with an optimal score had an incidence of 1.3 per 1000 person years.
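Rates of this kind are standard incidence calculations: new cases divided by total person-years of follow-up, scaled to 1,000. A minimal worked sketch with hypothetical counts (the study reports only the resulting rates, not these inputs):

```python
# Incidence rate per 1000 person-years = cases / person-years * 1000.
# The counts below are hypothetical, chosen only to reproduce the
# poor-score group's rate of 3.2 per 1000 person-years.

def incidence_per_1000_py(cases, person_years):
    return 1000 * cases / person_years

# e.g. 80 dementia cases over 25,000 person-years of follow-up:
print(incidence_per_1000_py(80, 25_000))  # → 3.2
```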

This is an observational study, so it cannot establish cause, and the researchers point to some limitations, such as reliance on self-reported measures and the possibility of missed dementia cases in patient records.

However, higher cardiovascular health score at age 50 was also associated with higher whole brain and grey matter volumes in MRI scans 20 years later. And reductions in dementia risk were also evident across the continuum of the cardiovascular score, suggesting that even small improvements in cardiovascular risk factors at age 50 may reduce dementia risk in old age, say the researchers.

"Our findings suggest that the Life's Simple 7, which comprises the cardiovascular health score, at age 50 may shape the risk of dementia in a synergistic manner," they write. "Cardiovascular risk factors are modifiable, making them strategically important prevention targets. This study supports public health policies to improve cardiovascular health as early as age 50 to promote cognitive health," they conclude.

Researchers in a linked editorial agree that the study provides further support for the UK Government's recent policy focus on vascular health in midlife. "However, other evidence makes clear that vascular health at 50 is determined by factors earlier in the life course, including inequality and social and economic determinants," they say.

"Reducing the risk of dementia is a leading concern in aging societies. We know that risk can change across generations, and in the UK the prevalence of dementia has decreased by nearly 25% when standardised for age," they add.

They conclude: "Although the Whitehall study cannot reflect the UK's population, estimates obtained from this cohort reinforce the need for action to shift population risk profiles for cognitive decline and dementia across the life course."

Credit: 
BMJ Group

Sorting out who needs a pill sorter

Researchers at the University of East Anglia have developed guidance to help prescribers and pharmacists decide which patients should use a pill organiser.

The team's previous research has shown that switching to using an organiser can do more harm than good.

Their latest study, published today, reveals that pharmacies are giving out twice as many pill organisers as they were ten years ago.

It is hoped that the new guidance will help prescribers better understand which patients' health could be put at risk by using an organiser. It will also help patients and their carers know what they can ask for to help with taking medicines as prescribed.

Lead researcher Dr Debi Bhattacharya, from UEA's School of Pharmacy, said: "A lot of people use pill organisers to help them take the right medication at the right time of the day.

"The fact that using a pill organiser could cause harm to patients sounds rather counterintuitive. But our research showed that patients were more likely to become unwell when they switched from taking their medication straight from the packet to using a pill organiser. In some cases, older people can even end up being hospitalised.

"This is likely because when the patients had been taking their medication sporadically, they weren't getting the expected health improvements. Their doctor may therefore have increased the dose of the medication to try to get the desired effect.

"When these patients were switched to a pill organiser and suddenly started taking all of their medication as prescribed, they experienced side effects of the medication.

"With usual medication packets, if a patient doesn't get on with a particular pill it's easy to deliberately miss it. A drawback to organisers is that the patient can't tell which pill they want to miss so sometimes they stop taking all of their pills. This can lead to serious health complications that wouldn't have occurred if they had simply skipped that one tablet."

The new study shows that the provision of organisers by pharmacies has more than doubled in a decade. But pharmacists are not considering the risk of adverse events arising from a patient's sudden increased adherence to their medication.

To combat these problems, the research team developed a set of guidelines for healthcare teams to work with patients to decide who might benefit from pill organisers and who may get better results with other solutions such as easy open medicine bottles or coloured labelling.

The 'Medication Adherence Support Decision Aid' (MASDA) guidance has been endorsed by the Royal College of Physicians and the Royal Pharmaceutical Society. And the research team hope that it will be adopted by the NHS.

Dr Bhattacharya said: "Until now there has been no guidance about which patients should be using medication organisers.

"Our new algorithm encourages prescribers to consider the emotional and practical barriers that might stop patients taking their medication correctly.

"Emotional barriers to taking medication as prescribed can include things like whether the patient is anxious or lacking confidence, lacking motivation or experiencing unwanted side effects. In all of these cases, using a pill organiser is likely to be inappropriate.

"Better solutions are likely to be identifying social support to boost the patient's confidence, providing information on medication benefits, agreeing goals or even stopping the medication.

"Practical barriers include things like whether the patient has impaired manual dexterity, visual impairment or difficulty remembering. In these cases, using an organiser may be appropriate but it's important to first seek other potential solutions.

"These solutions could range from providing medication in bottles without childproof lids, using colour coded bottles or helping the patient develop routines and reminders.

"When switching from usual packaging to a pill organiser, we recommend that patients speak to their GP or pharmacist to check that the doses of their medication are appropriate.

"People who are already using a pill organiser without any ill effects should not stop using it as they do seem to help some patients take their medication as prescribed. It's the switching stage which appears to be the danger."

Credit: 
University of East Anglia

Record-breaking analytical method for fingerprinting petroleum and other complex mixtures

image: The FT-ICR MS instrument used in the study.

Image: 
University of Warwick

New method for analysing complex mixtures improves assignment of elemental compositions of molecules

Using a non-distillable heavy petroleum fraction, the number of compositions assigned by the University of Warwick team is a new world record

Technique has varied applications, including in petroleum, biofuels, proteomics, metabolomics and environmental analysis

Scientists at the University of Warwick have developed a more powerful method of analysing chemical mixtures, which has been able to assign a record-breaking 244,779 molecular compositions within a single sample of petroleum.

With almost a quarter of a million individual compositions assigned within a non-distillable fraction of crude oil, the new method developed by the Barrow Group within the Department of Chemistry at the University of Warwick and detailed in a paper for the journal Chemical Science paves the way for analysis of challenging samples across different fields.

Assigning the compositions of molecules in a complex mixture is a valuable tool for a number of industries: determining the elemental composition of those molecules can provide valuable data for research, establish a mixture's viability (as in the petrochemical industry), or even 'fingerprint' a complex mixture such as an oil or environmental sample.

The researchers developed a new method, called operation at constant ultrahigh resolution (OCULAR), which combines experimental and data-processing techniques, allowing them to characterise the most complex sample they have ever worked on.

Using Fourier transform ion cyclotron resonance mass spectrometry (FT-ICR MS), the researchers analysed a sample of heavy petroleum in solution. The molecules in the sample were then ionised, excited and detected to determine the mass-to-charge ratios using a solariX (Bruker Daltonics) FT-ICR mass spectrometer at the University of Warwick. The ultrahigh resolving power and mass accuracy of FT-ICR MS allows the scientists to determine the elemental compositions within even the most complex samples, with a high degree of confidence.

Traditional analysis performed with a variety of Fourier transform mass spectrometers (FTMS) offers decreasing resolving power, and hence decreasing confidence in elemental composition assignments, at higher m/z when studying a broad m/z range. In the new OCULAR method, ions are analysed in smaller data segments based upon their mass, with the experiment designed to ensure near-constant resolving power across the full mass range; in the published example, a constant resolving power of 3 million was used to characterize a heavy petroleum sample.

Using an algorithm developed by the researchers, the segmented data can be automatically prepared and 'stitched' together to generate a complete mass spectrum (relative abundance vs. m/z). Each peak represents a single molecular composition, and so the entirety of the mass spectrum covers the compositional space of the sample. This allowed them to operate at much higher resolution and also addressed issues relating to space-charge effects, where a large number of ions will affect the accuracy of the mass measurement. The result was resolution, detection and assignment of the highest number of peaks within a sample to date.
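The 'stitching' idea can be pictured as assigning each data segment its own m/z window and concatenating the peaks that fall inside it. This is a deliberately simplified toy sketch, not the published OCULAR algorithm; the function names, window rule, and peak values are invented for illustration:

```python
# Toy illustration of stitching segmented mass-spectral data into one
# spectrum. Each segment is a list of (m/z, relative abundance) peaks;
# each segment "owns" one m/z window, so duplicated peaks in the
# overlap regions are dropped rather than double-counted.

def stitch_segments(segments, boundaries):
    """segments: peak lists ordered by m/z window.
    boundaries: m/z cut points between consecutive segments."""
    full_spectrum = []
    for i, peaks in enumerate(segments):
        lo = boundaries[i - 1] if i > 0 else float("-inf")
        hi = boundaries[i] if i < len(boundaries) else float("inf")
        # keep only peaks inside this segment's assigned window
        full_spectrum.extend((mz, ab) for mz, ab in peaks if lo <= mz < hi)
    return sorted(full_spectrum)

seg1 = [(200.1, 5.0), (299.9, 2.0), (301.0, 1.0)]  # owns m/z < 300
seg2 = [(299.5, 1.8), (305.2, 4.0), (410.7, 3.0)]  # owns m/z >= 300
spectrum = stitch_segments([seg1, seg2], boundaries=[300.0])
print(spectrum)  # → [(200.1, 5.0), (299.9, 2.0), (305.2, 4.0), (410.7, 3.0)]
```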

The technique can be used for any analysis of a complex mixture and has potential applications in areas such as energy (e.g. petroleum and biofuels), life sciences and healthcare (e.g. proteomics, cancer research, and metabolomics), materials (e.g. polymers), and environmental analysis, including being used to 'fingerprint' oil spills by their molecular composition.

Lead author Dr Diana Palacio Lozano, from the University of Warwick's Department of Chemistry, said: "This method can improve the performance of a range of FTMS instruments, including high and low magnetic field FT-ICR MS instruments and Orbitrap instruments. We are now able to analyse mixtures that, due to their complexity, are challenging even for the most powerful analytical techniques. This technique is flexible as the performance can be selected according to the research needs."

Petroleum samples are inherently highly complex and so were an ideal test for this method. As continuing global demand for petroleum pushes the industry toward heavier oils, samples are becoming more complex still, so the need for this type of analysis among petrochemical scientists is also growing.

The low volatility of the heavier oil can now be explained by the extraordinarily complex elemental composition. The high complexity of heavy oils can interfere with catalysis and affects extraction, transport and refining processes. The OCULAR technique is also powerful enough to be used on samples that require the highest performance to assign compositions based on mass accuracy or fine isotopic patterns.

Principal investigator Dr Mark Barrow said: "The OCULAR approach allows us to push the current analytical limits for characterizing the most complex samples. It significantly extends the performance of all FTMS instruments at no additional cost and works well with developments in the field, such as newer hardware designs, detection methods, and data processing methods. OCULAR is highly versatile, the experiments and processing can be adapted as needed, and the approach can be applied to many research areas, including energy, healthcare, and the environment."

Credit: 
University of Warwick

NASA catches transitioning Tropical Storm Francisco near Korean Peninsula

image: On Aug. 7, 2019 at 1:00 a.m. EDT (0500 UTC), the MODIS instrument aboard NASA's Aqua satellite provided a visible image of Tropical Storm Francisco in the Sea of Japan, along the Korean Peninsula.

Image: 
NASA/NRL

NASA's Aqua satellite passed over the Sea of Japan and provided forecasters with a visible image of Tropical Storm Francisco as it was transitioning into an extra-tropical cyclone.

Often, a tropical cyclone will transform into an extra-tropical cyclone as it recurves toward the poles (north or south, depending on the hemisphere the storm is located in). An extra-tropical cyclone is a storm system that primarily gets its energy from the horizontal temperature contrasts that exist in the atmosphere. Extra-tropical cyclones (also known as mid-latitude or baroclinic storms) are low pressure systems with associated cold fronts, warm fronts, and occluded fronts.

Tropical cyclones, in contrast, typically have little to no temperature differences across the storm at the surface and their winds are derived from the release of energy due to cloud/rain formation from the warm moist air of the tropics. Structurally, tropical cyclones have their strongest winds near the earth's surface, while extra-tropical cyclones have their strongest winds near the tropopause - about 8 miles (12 km) up.

On Aug. 7, 2019 at 1:00 a.m. EDT (0500 UTC), the Moderate Resolution Imaging Spectroradiometer or MODIS instrument aboard NASA's Aqua satellite provided a visible image of Francisco that showed it was still along the coast of the Korean Peninsula. It also appeared elongated from south to north.

At 5 a.m. EDT (0900 UTC), the Joint Typhoon Warning Center issued the final warning on Francisco. Maximum sustained winds dropped to near 35 knots (40 mph/65 kph). It was centered near 39.6 degrees north latitude and 129.3 degrees east longitude. That is 166 nautical miles northeast of Seoul, South Korea. Francisco was moving to the northeast.

The Joint Typhoon Warning Center noted that Francisco will traverse the Sea of Japan and move over Hokkaido while weakening.

Credit: 
NASA/Goddard Space Flight Center

Anatomy of a cosmic seagull

image: Colourful and wispy Sharpless 2-296 forms the "wings" of an area of sky known as the Seagull Nebula -- named for its resemblance to a gull in flight. This celestial bird contains a fascinating mix of intriguing astronomical objects. Glowing clouds weave amid dark dust lanes and bright stars. The Seagull Nebula -- made up of dust, hydrogen, helium and traces of heavier elements -- is the hot and energetic birthplace of new stars.

Image: 
ESO/VPHAS+ team/N.J. Wright (Keele University)

The main components of the Seagull are three large clouds of gas, the most distinctive being Sharpless 2-296, which forms the "wings". Spanning about 100 light-years from one wingtip to the other, Sh2-296 displays glowing material and dark dust lanes weaving amid bright stars. It is a beautiful example of an emission nebula, in this case an HII region, indicating active formation of new stars, which can be seen peppering this image.

It is the radiation emanating from these young stars that gives the clouds their fantastical colours and makes them so eye-catching, by ionising the surrounding gas and causing it to glow. This radiation is also the main factor that determines the clouds' shapes, by exerting pressure on the surrounding material and sculpting it into the whimsical morphologies we see. Since each nebula has a unique distribution of stars and may, like this one, be a composite of multiple clouds, they come in a variety of shapes, firing astronomers' imaginations and evoking comparisons to animals or familiar objects.

This diversity of shapes is exemplified by the contrast between Sh2-296 and Sh2-292. The latter, seen here just below the "wings", is a more compact cloud that forms the seagull's "head". Its most prominent feature is a huge, extremely luminous star called HD 53367 that is 20 times more massive than the Sun, and which we see as the seagull's piercing "eye". Sh2-292 is both an emission nebula and a reflection nebula; much of its light is emitted by ionised gas surrounding its nascent stars, but a significant amount is also reflected from stars outside it.

The dark swathes that interrupt the clouds' homogeneity and give them texture are dust lanes - paths of much denser material that hide some of the luminous gas behind them. Nebulae like this one have densities of a few hundred atoms per cubic centimetre, much less than the best artificial vacuums on Earth. Nonetheless, nebulae are still much denser than the gas outside them, which has an average density of about 1 atom per cubic centimetre.

The Seagull lies along the border between the constellations of Canis Major (The Great Dog) and Monoceros (The Unicorn), at a distance of about 3700 light-years in one arm of the Milky Way. Spiral galaxies can contain thousands of these clouds, almost all of which are concentrated along their whirling arms.

Several smaller clouds are also counted as part of the Seagull Nebula, including Sh2-297, which is a small, knotty addition to the tip of the gull's upper "wing", Sh2-292 and Sh2-295. These objects are all included in the Sharpless Catalogue, a list of over 300 clouds of glowing gas compiled by American astronomer Stewart Sharpless.

This image was taken using the VLT Survey Telescope (VST), one of the largest survey telescopes in the world observing the sky in visible light. The VST is designed to photograph large areas of the sky quickly and deeply.

Credit: 
ESO

Depleted seamounts near Hawaii recovering after decades of federal protection

image: In 2006, then President George W. Bush included the area of the seamount as part of the Papahānaumokuākea Marine National Monument

Image: 
National Oceanic and Atmospheric Administration

TALLAHASSEE, Fla. -- For decades, overfishing and trawling devastated parts of an underwater mountain range in the Pacific Ocean near Hawaii, wrecking deep-sea corals and destroying much of their ecological community.

But now, after years of federally mandated protection, scientists see signs that this once ecologically fertile area known as the Hawaiian-Emperor Seamount Chain is making a comeback.

Because of the slow-growing nature of the corals and sponges that live on seamounts, "it's been hypothesized that these areas, if they've been trawled, that there's not much hope for them," said Florida State University Associate Professor of Oceanography Amy Baco-Taylor.

"So, we explored these sites fully expecting to not find any sign of recovery," she said. "But we were surprised to find evidence that some species are starting to come back to these areas."

Baco-Taylor and a team from Florida State and Texas A&M University published their findings today in the journal Science Advances. The finding that a trawled seamount can recover is a game changer for fisheries management: scientists and policymakers regularly debate whether protected areas could be reopened for fishing.

"This is a good story of how long-term protection allows for recovery of vulnerable species," Baco-Taylor said.

The Hawaiian-Emperor Seamount Chain is a mostly underwater mountain range in the Pacific Ocean. From the 1960s through the 1980s, the area was a hotbed for fishing and a practice called trawling, where fishermen use heavy nets dragged along the seafloor to capture fish. In the process, the nets scrape other animals off the seafloor as well.

The practice of trawling has devastated seamounts around the world and scientists have generally believed that an ecological recovery was unlikely. However, in the case of the Hawaiian-Emperor Seamount Chain, there is a glimmer of hope.

Baco-Taylor, her doctoral student Nicole Morgan and Texas A&M Associate Professor Brendan Roark led four research cruises out to the central and north Pacific Ocean to investigate the ecological communities of the region.

They specifically wanted to examine whether there was any recovery of life on the seamount chain because unlike other submerged mountain chains around the world, this one had been federally protected from fishing and trawling for decades.

In 1977, the United States claimed the region as a part of the U.S. Exclusive Economic Zone, which prevented foreign fleets from trawling the area. In 2006, then President George W. Bush included the area as part of the Papahānaumokuākea Marine National Monument, further protecting it from human disturbance.

"People started realizing how vulnerable seamounts were relatively recently, so seamounts in other locations have only been protected for 5 to 15 years," Baco-Taylor said. "Establishment of the U.S. EEZ in this region has provided protection for these sites for close to 40 years, providing a unique opportunity to look at recovery on longer time scales."

Through the four research visits, scientists sent an autonomous underwater vehicle and used a human-occupied submersible to explore sites along the chain and to photograph the seamounts roughly 300 to 700 meters below the surface.

The team analyzed 536,000 images. In them, they could see not only the remnant trawl scars on the seafloor but also baby coral springing up in those areas, as well as coral regrowing from fragments on fishing nets that had been left on the seafloor.

"We know the stuff growing on the net had to come after this practice stopped in the area," Morgan said.

Most importantly, they found evidence of a few precious areas that were not harmed by the trawling. These untouched areas are crucial to further populating the seamounts with a variety of fauna, researchers said.

It's too early to say how long it took for the new coral to arrive and whether the area will return to its former glory. Scientists are still analyzing coral samples to determine the age and diversity of species in the area.

Roark, who frequently collaborates with Baco-Taylor, said this study and the ongoing work provides critical knowledge for policymakers examining the effectiveness of protecting these areas.

"This is a high impact paper that bears directly on fishery management issues in the Northwest Hawaiian Islands and is timely relative to some changes the current administration is thinking about with respect to opening up marine monuments for more fishing," Roark said.

Credit: 
Florida State University

Association of coexisting psychiatric disorders, risk of death in patients with ADHD

What The Study Did: This observational study of Swedish national register data included nearly 87,000 people with attention-deficit/hyperactivity disorder (ADHD) and examined the association of coexisting psychiatric disorders with risk of death.

Authors: Shihua Sun, M.D., of the Karolinska Institutet in Stockholm, Sweden, is the corresponding author.

(doi:10.1001/jamapsychiatry.2019.1944)

Editor's Note: The article includes conflict of interest and funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.

Credit: 
JAMA Network

Fast-food availability near commute route linked to BMI

image: Commute food environment of a selected participant (FE Jefferson to Orleans)

Image: 
A. Dornelles, 2019

In a study of commuting workers, the number of different types of food stores available near residences and commute routes--but not near workplaces--had a significant association with body mass index (BMI). Adriana Dornelles of Arizona State University, U.S. presents these findings in the open access journal PLOS ONE on August 7, 2019.

Previous research has revealed links between the food stores available in residential neighborhoods and residents' health outcomes, including BMI. However, few prior studies have also included food stores near workplaces, and none have examined food options along commute routes. The new study addresses the relationship between these three food environments and BMI.

Dornelles analyzed data from 710 elementary school employees in New Orleans, Louisiana. Drawing on existing databases, she determined the number of supermarkets, grocery stores, full-service restaurants, and fast-food restaurants within 1 kilometer of the employees' residential and workplace addresses. She also determined the number and type of food stores within 1 kilometer of the shortest-distance commute path between each employee's residence and their workplace.
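The counting step described here amounts to a radius query over outlet coordinates. The coordinates, outlet list, and function names below are hypothetical, for illustration only; the study drew its data from existing commercial databases and used shortest-distance commute paths rather than a single point:

```python
# Sketch of counting food outlets within 1 km of an address using
# great-circle (haversine) distance. All coordinates are hypothetical.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometres."""
    rlat1, rlon1, rlat2, rlon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((rlat2 - rlat1) / 2) ** 2 + \
        cos(rlat1) * cos(rlat2) * sin((rlon2 - rlon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))  # mean Earth radius ~6371 km

def outlets_within(point, outlets, radius_km=1.0):
    """Count outlets whose distance from `point` is within `radius_km`."""
    lat, lon = point
    return sum(1 for olat, olon in outlets
               if haversine_km(lat, lon, olat, olon) <= radius_km)

home = (29.9511, -90.0715)  # hypothetical New Orleans address
fast_food = [(29.9540, -90.0700),   # ~0.35 km away: counted
             (29.9800, -90.1200)]   # ~5.7 km away: not counted
print(outlets_within(home, fast_food))  # → 1
```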

Adjusting for socio-demographic factors, statistical analyses showed that a greater number of fast-food restaurants near the commute route was associated with higher BMI. Higher BMI was also associated with a greater number of supermarkets, grocery stores, and fast-food restaurants near residences, while a greater number of full-service restaurants near residences was linked to lower BMI. The analysis did not find any links between BMI and the food stores available near workplaces.

The author notes that these findings highlight the need to consider multiple environmental factors when examining contributors to BMI. Future research could explore individuals' exact commute routes and food-purchasing habits along those routes, as well as health outcomes beyond BMI. A deeper understanding of these factors could help inform interventions to promote better health outcomes.

Dornelles adds: "The most important finding of the study was to establish a significant relationship between BMI and multiple food environments. In our daily lives, we are exposed to several healthy and unhealthy food choices, which has an impact on BMI. The availability and variety of fast-food restaurants along our commute create endless opportunities for a quick, cheap, and unhealthy meal, which results, on average, in higher body mass index."

Credit: 
PLOS

New synthesis method opens up possibilities for organic electronics

image: We demonstrated the synthesis of the isomeric all-acceptor copolymers by DArP using the electron-deficient monomers without any orienting or activating groups for the C-H bonds. Our new DArP method could effectively produce high-molecular-weight and high-quality all-acceptor polymers, thus opening the door to synthesizing various promising n-type semiconducting polymers.

Image: 
Tokyo Tech

Semiconducting polymers, very large chain-like molecules made from repeating sub-units, are increasingly drawing the attention of researchers because of their potential applications in organic electronic devices. Like most semiconducting materials, semiconducting polymers can be classified as p-type or n-type according to their conducting properties. Although p-type semiconducting polymers have seen dramatic improvements thanks to recent advances, the same cannot be said about their n-type counterparts, whose electron-conducting characteristics (or 'electron mobility') are still poor.

Unfortunately, high-performance n-type semiconducting polymers are necessary for many green applications, such as various types of solar cells. The main challenges holding back the development of n-type semiconducting polymers are the limited molecular design strategies and synthesis procedures available. Among the existing synthesis methods, DArP (which stands for 'direct arylation polycondensation') has shown promising results for producing n-type semiconducting polymers in an environmentally friendly and efficient way. However, until now, the building blocks (monomers) used in the DArP method were required to have an orienting group in order to produce polymers reliably, and this severely limited the applicability of DArP to make high-performance semiconducting polymers.

Luckily, a research team from Tokyo Institute of Technology led by Prof. Tsuyoshi Michinobu found a way around this. They managed to reliably produce two long n-type semiconducting polymers (referred to as P1 and P2) through the DArP method by using palladium and copper as catalysts, substances that speed up specific chemical reactions without being consumed.

The two polymers were almost identical and contained two thiazole rings: pentagonal organic molecules that contain a nitrogen atom and a sulfur atom. However, the position of the nitrogen atom in the thiazole rings differed slightly between P1 and P2 and, as the researchers found out, this led to significant and unexpected changes in their semiconducting properties and structure. Even though P1 had a more planar structure and was expected to have the higher electron mobility, it was P2 that stole the show. The backbone of this polymer is twisted and looks similar to alternating chain links. More importantly, the researchers were surprised to find that the electron mobility of P2 was forty times higher than that of P1, and even higher than that of the current benchmark n-type semiconducting polymer. "Our results suggest the possibility of P2 being the new benchmark among n-type semiconducting materials for organic electronics," remarks Prof. Michinobu.

In addition, semiconducting devices made using P2 were also remarkably stable, even when stored in air for a long time, which is known to be a weakness of n-type semiconducting polymers. The researchers believe that the promising properties of P2 are because of its more crystalline (ordered) structure compared with P1, which changes the previous notion that semiconducting polymers should have a very planar structure to have better semiconducting properties. "Our new DArP method opens a door for synthesizing various promising n-type semiconducting polymers which cannot be obtained via traditional methods," concludes Prof. Michinobu. This work is another step in the direction towards a greener future with sustainable organic electronics.

Credit: 
Tokyo Institute of Technology

Earth's last magnetic field reversal took far longer than once thought

image: Study co-author Rob Coe and Trevor Duarte orienting cores from a lava flow site recording the Matuyama-Brunhes magnetic polarity reversal in Haleakala National Park, Hawaii, in 2015.

Image: 
Brad Singer

MADISON, Wis. -- Earth's magnetic field seems steady and true -- reliable enough to navigate by.

Yet, largely hidden from daily life, the field drifts, waxes and wanes. The magnetic North Pole is currently careening toward Siberia, which recently forced the Global Positioning System that underlies modern navigation to update its software sooner than expected to account for the shift.

And every several hundred thousand years or so, the magnetic field dramatically shifts and reverses its polarity: Magnetic north shifts to the geographic South Pole and, eventually, back again. This reversal has happened countless times over the Earth's history, but scientists have only a limited understanding of why the field reverses and how it happens.

New work from University of Wisconsin-Madison geologist Brad Singer and his colleagues finds that the most recent field reversal, some 770,000 years ago, took at least 22,000 years to complete. That's several times longer than previously thought, and the results further call into question controversial findings that some reversals could occur within a human lifetime.

The new analysis -- based on advances in measurement capabilities and a global survey of lava flows, ocean sediments and Antarctic ice cores -- provides a detailed look at a turbulent time for Earth's magnetic field. Over millennia, the field weakened, partially shifted, stabilized again and then finally reversed for good to the orientation we know today.

The results provide a clearer and more nuanced picture of reversals at a time when some scientists believe we may be experiencing the early stages of a reversal as the field weakens and moves. Other researchers dispute the notion of a present-day reversal, which would likely affect our heavily electronic world in unusual ways.

Singer published his work Aug. 7 in the journal Science Advances. He collaborated with researchers at Kumamoto University in Japan and the University of California, Santa Cruz.

"Reversals are generated in the deepest parts of the Earth's interior, but the effects manifest themselves all the way through the Earth and especially at the Earth's surface and in the atmosphere," explains Singer. "Unless you have a complete, accurate and high-resolution record of what a field reversal really is like at the surface of the Earth, it's difficult to even discuss what the mechanics of generating a reversal are."

Earth's magnetic field is produced by the planet's liquid iron outer core as it spins around the solid inner core. This dynamo action creates a field that is most stable going through roughly the geographic North and South poles, but the field shifts and weakens significantly during reversals.

As new rocks form -- typically either as volcanic lava flows or sediments being deposited on the sea floor -- they record the magnetic field at the time they were created. Geologists like Singer can survey this global record to piece together the history of magnetic fields going back millions of years. The record is clearest for the most recent reversal, named Matuyama-Brunhes after the researchers who first described reversals.

For the current analysis, Singer and his team focused on lava flows from Chile, Tahiti, Hawaii, the Caribbean and the Canary Islands. The team collected samples from these lava flows over several field seasons.

"Lava flows are ideal recorders of the magnetic field. They have a lot of iron-bearing minerals, and when they cool, they lock in the direction of the field," says Singer. "But it's a spotty record. No volcanoes are erupting continuously. So we're relying on careful field work to identify the right records."

The researchers combined magnetic readings and radioisotope dating of samples from seven lava flow sequences to recreate the magnetic field over a span of about 70,000 years centered on the Matuyama-Brunhes reversal. They relied on upgraded methods developed in Singer's WiscAr geochronology lab to more accurately date the lava flows by measuring the argon produced from radioactive decay of potassium in the rocks.
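The potassium-argon age equation underlying this dating approach can be sketched in a few lines. This is a simplified illustration using standard 40K decay constants from the geochronology literature; the lab's actual 40Ar/39Ar technique additionally involves neutron irradiation and measured isotope ratios, and the sample ratio below is a hypothetical value chosen to land near the reversal's age, not data from the study.

```python
import math

# Standard 40K decay constants (conventional values), per year.
LAMBDA_TOTAL = 5.543e-10   # total decay rate of 40K
LAMBDA_EC = 0.581e-10      # branch decaying to 40Ar via electron capture

def k_ar_age(ar40_star_over_k40):
    """Age in years from the ratio of radiogenic 40Ar to remaining 40K."""
    return math.log(1 + (LAMBDA_TOTAL / LAMBDA_EC) * ar40_star_over_k40) / LAMBDA_TOTAL

# A hypothetical 40Ar*/40K ratio of ~4.5e-5 yields an age close to the
# Matuyama-Brunhes reversal (~770,000 years ago).
age_years = k_ar_age(4.5e-5)
```

The tiny argon-to-potassium ratio for rocks this "young" is why the upgraded measurement precision in Singer's lab matters.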

They found that the final reversal was quick by geological standards, less than 4,000 years. But it was preceded by an extended period of instability that included two excursions -- temporary, partial reversals -- stretching back another 18,000 years. That span is more than twice as long as suggested by recent proposals that all reversals wrap up within 9,000 years.

The lava flow data were corroborated by magnetic readings from the seafloor, which provide a more continuous but less precise record than lava rocks. The researchers also used Antarctic ice cores to track the deposition of beryllium, which is produced by cosmic radiation colliding with the atmosphere. When the magnetic field is reversing, it weakens and allows more radiation to strike the atmosphere, producing more beryllium.

Since humanity began recording the strength of the magnetic field, it has weakened by about five percent each century. As records like Singer's show, a weakening field seems to be a precursor to an eventual reversal, although it's far from clear that a reversal is imminent.
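Taken at face value, a steady five-percent-per-century loss compounds like this (a back-of-envelope sketch; there is no guarantee the real field's decline stays constant):

```python
# If the field loses ~5% of its strength per century, compounding,
# the remaining fraction after n centuries is 0.95**n.
def remaining_fraction(centuries, loss_per_century=0.05):
    return (1 - loss_per_century) ** centuries

# At that rate the field would take roughly 13-14 centuries to halve:
# slow on human timescales, fast by geological ones.
```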

A reversing field might significantly affect navigation and satellite and terrestrial communication. But the current study suggests that society would have generations to adapt to a lengthy period of magnetic instability.

"I've been working on this problem for 25 years," says Singer, who stumbled into paleomagnetism when he realized the volcanoes he was studying served as a good record of Earth's magnetic fields. "And now we have a richer record and better-dated record of this last reversal than ever before."

Credit: 
University of Wisconsin-Madison

Air pollution cuts are saving lives in New York state

image: A new study charts declines in New York State air-pollution-related deaths due to various ailments over a decade.

Image: 
Courtesy Xiaomeng Jin

Lower air pollution levels saved an estimated 5,660 lives in New York State in 2012, compared to 2002 levels, according to a new study.

Published in Environmental Research Letters, the study -- led by Columbia University's Lamont-Doherty Earth Observatory atmospheric chemistry research group -- looked at New York State levels of a specific kind of pollution known as fine particulate matter, or "PM2.5." These microscopic particulates are a mixture of solid particles and liquid droplets. Some come from burning fuel, and others form in the atmosphere as a result of complex reactions of chemicals such as sulfur dioxide and nitrogen oxides from power plants, industries and automobiles. Long-term exposure to PM2.5 can lead to respiratory and cardiovascular problems.

The study compared seven datasets, including both on-the-ground and satellite measurements, to analyze trends in PM2.5 levels across New York State. The researchers found that PM2.5 levels dropped by 28 to 37 percent between 2002 and 2012. They calculated that this drop cut the air pollution mortality burden for New York State residents by 67 percent -- from 8,410 premature deaths in 2002 to 2,750 deaths in 2012.
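The headline figures are internally consistent, as a quick check shows:

```python
# Premature deaths attributed to PM2.5 exposure in New York State,
# as reported in the study.
deaths_2002 = 8410
deaths_2012 = 2750

# Relative drop in the mortality burden: about 67 percent.
relative_drop = (deaths_2002 - deaths_2012) / deaths_2002

# Difference in deaths: the "5,660 lives saved" figure.
lives_saved_2012 = deaths_2002 - deaths_2012
```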

"What's novel about this study is that we use seven different PM2.5 exposure estimates to analyze the long-term change in mortality burden, and they all show a consistent decrease in mortality burden," said Xiaomeng Jin, the Lamont researcher who led the study.

The study considered four ailments triggered by long-term exposure to fine particulate matter: chronic obstructive pulmonary diseases, ischemic heart disease, lung cancer, and cerebrovascular and ischemic stroke.

The study provides evidence that emission controls on air pollutants, initiated by the Clean Air Act of 1970--and expanded under amendments passed in 1990 that required a review of scientific evidence on which standards are set and implemented--have improved public health across New York State, said the researchers.

"Those reviews have sometimes resulted in stricter standards being set, which in turn set in motion the process of emission controls to meet those standards," said Lamont atmospheric chemist and co-author of the study Arlene Fiore.

Among the other factors that have helped clear the air: continued progress in cleaner vehicles; additional programs to reduce air pollution, including programs targeting diesel fuel, a source of sulfur dioxide; and the reduction of high sulfur dioxide-emitting coal-burning power plants.

Fiore said this study is a key step to documenting the health benefits from cleaner air.

Credit: 
Columbia Climate School

Tropical Storm Krosa gets a comma shape

image: NOAA's NOAA-20 polar orbiting satellite passed over the Northwestern Pacific Ocean and captured a visible image of Tropical Storm Krosa on Aug. 7, 2019.

Image: 
NASA/NRL/NOAA

Tropical Storm Krosa continued on its journey northward in the Northwestern Pacific Ocean when NOAA's NOAA-20 polar orbiting satellite passed overhead and captured a visible image of the strengthening storm in a classic tropical cyclone shape.

The Visible Infrared Imaging Radiometer Suite (VIIRS) instrument aboard NOAA-20 provided a visible image of the storm. There's also a VIIRS instrument aboard the NASA-NOAA Suomi NPP satellite that preceded NOAA-20.

The VIIRS image revealed Krosa had developed the signature "comma shape" of a strengthening storm. A large band of thunderstorms was feeding into the low-level center from the south and east.

At 5 a.m. EDT (0900 UTC) on Aug. 7, Tropical Storm Krosa's maximum sustained winds were near 60 knots (69 mph/111 kph) and strengthening. It was centered near 21.2 degrees north latitude and 141.3 degrees east longitude. That is about 31 nautical miles south of Iwo To island, Japan. Krosa was moving to the north-northwest.
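The unit conversions in the advisory follow from the definition of the knot (one nautical mile, 1,852 meters, per hour); a minimal sketch:

```python
# 1 knot = 1 nautical mile per hour = 1.852 km/h exactly;
# 1 statute mile = 1.609344 km exactly.
def knots_to_kph(knots):
    return knots * 1.852

def knots_to_mph(knots):
    return knots * 1.852 / 1.609344

# Krosa's 60-knot sustained winds round to 111 kph and 69 mph,
# matching the figures above.
```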

The Joint Typhoon Warning Center said that Krosa will move northwest, and then later turn north while becoming a typhoon.

Credit: 
NASA/Goddard Space Flight Center

Geneticists unlock the secret of mutant flies' longevity

image: Russian researchers determined which genes are affected by a mutation that extends the lifespan of fruit flies. Comparing the gene activity of long-living fly strains with that of control insects helped reveal mechanisms of aging and identify drug targets associated with aging-related diseases.

Image: 
Elena Khavina, MIPT Press Office

Researchers from the Moscow Institute of Physics and Technology, Engelhardt Institute of Molecular Biology of the Russian Academy of Sciences, Institute of Biology of Komi Science Centre of the Ural Branch of the Russian Academy of Sciences, and Insilico Medicine, USA, determined which genes are affected by a mutation that extends the lifespan of fruit flies. Comparing the gene activity of long-living fly strains with that of control insects helped reveal mechanisms of aging and identify drug targets associated with aging-related diseases. The study was published in Scientific Reports.

"Gene activity controls all functions of a cell and, ultimately, the organism as a whole," said Alexey Moskalev, head of the Aging and Lifespan Genetics Lab at MIPT and the first author of the study. "We can better understand the biology behind longevity if we identify which genes are more active and which ones are less active at different ages in long-living strains of animals as compared to the short-living ones."

Biogerontologists use animals with short lifespans to test their hypotheses before moving on to long-term experiments on mammals. The fruit fly (Drosophila melanogaster) is a very convenient model organism: its genome is well studied, it contains genes linked to about 40% of human diseases, and its lifespan is only a couple of months. Drosophila breeding and genome editing are well-established technologies. Apart from that, fruit flies have two sexes, unlike creatures such as nematodes.

The authors of the study used a specially bred strain of Drosophila with the E(z) gene partially suppressed. This gene affects the activity of other genes. Such mutant flies have markedly longer lifespans than control specimens and exhibit a higher tolerance to adverse conditions. Which specific genes are affected by the mutation, however, has until now been unclear.

The Russian researchers confirmed the positive effect of the mutation, with the average lifespan of Drosophila extended by 22-23%. As part of the experiment, the flies were starved, poisoned with paraquat, and exposed to scorching temperatures of 35 °C (95 °F). The mutant Drosophila displayed higher tolerance to all of these factors. Apart from that, the mutation had an unexpected effect on the flies' fertility.

"It is known that in Drosophila, lifespan extension induced by mutation is often associated with reduced reproduction. But in our case, we saw an increase in mutant female fecundity across all age groups", Alexey Moskalev said in his comment on the study results.

Having confirmed the positive effects of the mutation, the researchers analyzed the products of all active genes within a cell (transcriptome analysis) to compare the gene activity of mutant Drosophila and control specimens. They discovered 239 genes whose product levels differed significantly between the long-living and control groups. Among other things, these genes are involved in metabolism.

"We discovered that the mutation triggers a global alteration of metabolism. It affects carbohydrate metabolism, lipid metabolism, and nucleotide metabolism, as well as immune response genes activity and protein synthesis", Moskalev added.

The authors of the study plan to extend the lifespan of fruit flies even further by exposing them to combinations of various chemical and physical factors. The ultimate goal is to extend the maximum species lifespan: the longest lifespan recorded for any specimen of the species.

Credit: 
Moscow Institute of Physics and Technology

Virtual patients and in silico clinical studies improve blue light treatment for psoriasis

image: Systems Medicine: Journal of Medical Systems Biology and Network Medicine focuses on interdisciplinary approaches to exploiting the power of big data by applying systems biology and network medicine.

Image: 
Mary Ann Liebert, Inc., publishers

New Rochelle, NY, August 5, 2019--A new study supports the use of virtual patients and in silico clinical studies to evaluate the effectiveness of blue light to reduce the symptoms of psoriasis. Researchers also demonstrated that this in silico approach can be used to improve the treatment response of patients with psoriasis to blue light by modifying the settings of the therapeutic protocol, as reported in the study published in Systems Medicine, an open access journal from Mary Ann Liebert, Inc., publishers.

"In silico Clinical Studies on the Efficacy of Blue Light for Treating Psoriasis in Virtual Patients" was coauthored by Zandra Félix Garza, Peter Hilbers, and Natal van Riel, Eindhoven University of Technology, The Netherlands, and Joerg Liebmann and Matthias Born, Philips Electronics Netherlands BV, Eindhoven. The researchers note that the current computational model for studying the efficacy of blue light therapy only reproduces the response in the average patient in clinical trials and does not take into account individual variations amongst patients. Use of a computational model combined with a refined pool of virtual patients can adequately capture the patient variability in the response to treatment with blue light and the decrease in disease severity seen in previous clinical investigations. The authors suggest that a minimum of 2,500 virtual patients, which they refined down from an initial pool of 500,000 virtual patients, are needed to reproduce the responses seen in clinical investigations.

"This is a highly promising approach towards using statistical learning on virtual patient populations to draw actionable clinical conclusions on real patients, and thus a major step forward to precision medicine," says Co-Editor-in-Chief Prof. Dr. Jan Baumbach from Technical University of Munich.

Credit: 
Mary Ann Liebert, Inc./Genetic Engineering News

Designing a light-trapping, color-converting crystal

image: An illustration of the researchers’ design. The holes in this microscopic slab structure are arranged and resized in order to control and hold two wavelengths of light. The scale bar on this image is 2 micrometers, or two millionths of a meter.

Image: 
Momchil Minkov

Five years ago, Stanford postdoctoral scholar Momchil Minkov encountered a puzzle that he was impatient to solve. At the heart of his field of nonlinear optics are devices that change light from one color to another - a process important for many technologies within telecommunications, computing and laser-based equipment and science. But Minkov wanted a device that also traps both colors of light, a complex feat that could vastly improve the efficiency of this light-changing process - and he wanted it to be microscopic.

"I was first exposed to this problem by Dario Gerace from the University of Pavia in Italy, while I was doing my PhD in Switzerland. I tried to work on it then but it's very hard," Minkov said. "It has been in the back of my mind ever since. Occasionally, I would mention it to someone in my field and they would say it was near-impossible."

In order to prove the near-impossible was still possible, Minkov and Shanhui Fan, professor of electrical engineering at Stanford, developed guidelines for creating a crystal structure with an unconventional two-part form. The details of their solution were published Aug. 6 in Optica, with Gerace as co-author. Now, the team is beginning to build its theorized structure for experimental testing.

A recipe for confining light

Anyone who's encountered a green laser pointer has seen nonlinear optics in action. Inside that laser pointer, a crystal structure converts laser light from infrared to green. (Green laser light is easier for people to see but components to make green-only lasers are less common.) This research aims to enact a similar wavelength-halving conversion but in a much smaller space, which could lead to a large improvement in energy efficiency due to complex interactions between the light beams.
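The arithmetic of that conversion is simple: doubling the light's frequency halves its wavelength. The 1064 nm value below is the typical infrared wavelength inside a green laser pointer, a common illustration rather than a number from this study:

```python
# Second-harmonic generation: two photons combine into one photon of
# twice the frequency, i.e. half the wavelength.
def second_harmonic_wavelength(wavelength_nm):
    return wavelength_nm / 2

# A typical green laser pointer converts 1064 nm infrared light
# into 532 nm green light.
green_nm = second_harmonic_wavelength(1064)
```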

The team's goal was to force the coexistence of the two laser beams using a photonic crystal cavity, which can focus light in a microscopic volume. However, existing photonic crystal cavities usually only confine one wavelength of light and their structures are highly customized to accommodate that one wavelength.

So instead of making one uniform structure to do it all, these researchers devised a structure that combines two different ways to confine light, one to hold onto the infrared light and another to hold the green, all still contained within one tiny crystal.

"Having different methods for containing each light turned out to be easier than using one mechanism for both frequencies and, in some sense, it's completely different from what people thought they needed to do in order to accomplish this feat," Fan said.

After ironing out the details of their two-part structure, the researchers produced a list of four conditions, which should guide colleagues in building a photonic crystal cavity capable of holding two very different wavelengths of light. Their result reads more like a recipe than a schematic because light-manipulating structures are useful for so many tasks and technologies that designs for them have to be flexible.

"We have a general recipe that says, 'Tell me what your material is and I'll tell you the rules you need to follow to get a photonic crystal cavity that's pretty small and confines light at both frequencies,'" Minkov said.

Computers and curiosity

If telecommunications channels were a highway, flipping between different wavelengths of light would equal a quick lane change to avoid a slowdown - and one structure that holds multiple channels means a faster flip. Nonlinear optics is also important for quantum computers because calculations in these computers rely on the creation of entangled particles, which can be formed through the opposite process that occurs in the Fan lab crystal - creating twinned red particles of light from one green particle of light.

Envisioning possible applications of their work helps these researchers choose what they'll study. But they are also motivated by their desire for a good challenge and the intricate strangeness of their science.

"Basically, we work with a slab structure with holes and by arranging these holes, we can control and hold light," Fan said. "We move and resize these little holes by billionths of a meter and that marks the difference between success and failure. It's very strange and endlessly fascinating."

These researchers will soon be facing off with these intricacies in the lab, as they are beginning to build their photonic crystal cavity for experimental testing.

Credit: 
Stanford University