
Surprise! TESS shows ancient north star undergoes eclipses

image: The star Alpha Draconis (circled), also known as Thuban, has long been known to be a binary system. Now data from NASA's TESS show its two stars undergo mutual eclipses.

Image: 
NASA/MIT/TESS

Astronomers using data from NASA's Transiting Exoplanet Survey Satellite (TESS) have shown that Alpha Draconis, a well-studied star visible to the naked eye, and its fainter companion star regularly eclipse each other. While astronomers previously knew this was a binary system, the mutual eclipses came as a complete surprise.

"The first question that comes to mind is 'how did we miss this?'" said Angela Kochoska, a postdoctoral researcher at Villanova University in Pennsylvania who presented the findings at the 235th meeting of the American Astronomical Society in Honolulu on Jan. 6. "The eclipses are brief, lasting only six hours, so ground-based observations can easily miss them. And because the star is so bright, it would have quickly saturated detectors on NASA's Kepler observatory, which would also mask the eclipses."

The system ranks among the brightest-known eclipsing binaries where the two stars are widely separated, or detached, and only interact gravitationally. Such systems are important because astronomers can measure the masses and sizes of both stars with unrivaled accuracy.

Alpha Draconis, also known as Thuban, lies about 270 light-years away in the northern constellation Draco. Despite its "alpha" designation, it shines as Draco's fourth-brightest star. Thuban's fame arises from a historical role it played some 4,700 years ago, back when the earliest pyramids were being built in Egypt.

At that time, it appeared as the North Star, the one closest to the northern pole of Earth's spin axis, the point around which all of the other stars appear to turn in their nightly motion. Today, this role is played by Polaris, a brighter star in the constellation Ursa Minor. The change happened because Earth's spin axis performs a cyclic 26,000-year wobble, called precession, that slowly alters the sky position of the rotational pole.

TESS monitors large swaths of the sky, called sectors, for 27 days at a time. This long stare allows the satellite to track changes in stellar brightness. While NASA's newest planet hunter mainly seeks dimmings caused by planets crossing in front of their stars, TESS data can be used to study many other phenomena as well.

A 2004 report noted that Thuban displayed small brightness changes cycling over about an hour, hinting that the system's brightest star might be pulsating.

To check this, Timothy Bedding, Daniel Hey, and Simon Murphy at the University of Sydney, Australia, and Aarhus University, Denmark, turned to TESS measurements. In October, they published a paper describing the discovery of eclipses by both stars and ruling out the existence of pulsations with periods shorter than eight hours.

Now Kochoska is working with Hey to understand the system in greater detail.

"I've been collaborating with Daniel to model the eclipses and advising on how to bring together more data to better constrain our model." Kochoska explained. "The two of us took different approaches to modeling the system, and we hope our efforts will result in its full characterization."

As known from earlier studies, the stars orbit every 51.4 days at an average distance of about 38 million miles (61 million kilometers), slightly more than Mercury's distance from the Sun. The current preliminary model shows that we view the system about three degrees above the stars' orbital plane, which means neither star completely covers the other during the eclipses. The primary star is 4.3 times bigger than the Sun and has a surface temperature around 17,500 degrees Fahrenheit (9,700 C), making it 70% hotter than our Sun. Its companion, which is five times fainter, is most likely half the primary's size and 40% hotter than the Sun.
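For readers who want a feel for the scale of the system, the short sketch below applies Kepler's third law to the figures quoted above. It is purely illustrative and not part of the study: it assumes the 51.4-day period and the roughly 61-million-kilometer average separation can be treated as the period and semi-major axis of the relative orbit.

```python
import math

# Illustrative back-of-the-envelope estimate, not from the study: treat the quoted
# 51.4-day period and ~61 million km average separation as the period (P) and
# semi-major axis (a) of the relative orbit, then apply Kepler's third law,
# M_total = 4 * pi^2 * a^3 / (G * P^2).
G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30       # solar mass, kg

P = 51.4 * 86400.0     # orbital period in seconds
a = 61e6 * 1000.0      # 61 million km in meters

M_total = 4 * math.pi**2 * a**3 / (G * P**2)
print(f"Estimated total mass: ~{M_total / M_SUN:.1f} solar masses")  # roughly 3-4 solar masses
```

The result, a few solar masses in total, is at least consistent with the picture of two stars larger and hotter than the Sun.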

Kochoska says she is planning ground-based follow-up observations and anticipating additional eclipses in future TESS sectors.

"Discovering eclipses in a well-known, bright, historically important star highlights how TESS impacts the broader astronomical community," said Padi Boyd, the TESS project scientist at NASA's Goddard Space Flight Center in Greenbelt, Maryland. "In this case, the high precision, uninterrupted TESS data can be used to help constrain fundamental stellar parameters at a level we've never before achieved."

Credit: 
NASA/Goddard Space Flight Center

Scientists discover the mechanism of DNA high-order structure formation

image: Molecular structures of Abo1 in different energy states (left) and a demonstration of Abo1-assisted histone loading onto DNA by the DNA curtain assay.

Image: 
KAIST

The genetic material of our cells--DNA--exists in a high-order structure called "chromatin". Chromatin consists of DNA wrapped around histone proteins and efficiently packs DNA into a small volume. Moreover, using a spool and thread analogy, chromatin allows DNA to be locally wound or unwound, thus enabling genes to be enclosed or exposed. The misregulation of chromatin structures results in aberrant gene expression and can ultimately lead to developmental disorders or cancers. Despite the importance of DNA high-order structures, the complexity of the underlying machinery has so far hindered its molecular dissection.

For the first time, molecular biologists have uncovered how one particular mechanism uses energy to ensure proper histone placement onto DNA to form chromatin. They published their results on Dec. 17 in Nature Communications.

The study focused on proteins called histone chaperones. Histone chaperones are responsible for adding and removing specific histones at specific times during the DNA packaging process. The wrong histone at the wrong time and place could result in the misregulation of gene expression or aberrant DNA replication. Thus, histone chaperones are key players in the assembly and disassembly of chromatin.

"In order to carefully control the assembly and disassembly of chromatin units, histone chaperones act as molecular escorts that prevent histone aggregation and undesired interactions," said Professor Ji-Joon Song in the Department of Biological Sciences at the KAIST. "We set out to understand how a unique histone chaperone uses chemical energy to assemble or disassemble chromatin."

Song and his team looked to Abo1, the only known histone chaperone that utilizes cellular energy (ATP). While Abo1 is found in yeast, it has an analogous partner in other organisms, including humans, called ATAD2. Both draw on ATP, whose stored energy is released when enzymes break one of the molecule's phosphate bonds. ATP typically powers other cellular processes, but it is a rare partner for histone chaperones.

"This was an interesting problem in the field because all other histone chaperones studied to date do not use ATP," Song said.

By imaging Abo1 with a single-molecule fluorescence imaging technique known as the DNA curtain assay, the researchers could examine the protein interactions at the single-molecule level. The technique allows scientists to arrange the DNA molecules and proteins on a single layer of a microfluidic chamber and examine the layer with fluorescence microscopy.

The researchers found through real-time observation that Abo1 is ring-shaped and changes its structure to accommodate a specific histone and deposit it on DNA. Moreover, they found that these accommodating structural changes are powered by ATP hydrolysis.

"We discovered a mechanism by which Abo1 accommodates histone substrates, ultimately allowing it to function as a unique energy-dependent histone chaperone," Song said. "We also found that despite looking like a protein disassembly machine, Abo1 actually loads histone substrates onto DNA to facilitate chromatin assembly."

The researchers plan to continue exploring how energy-dependent histone chaperones bind and release histones, with the ultimate goal of developing therapeutics that can target cancer-causing misbehavior by Abo1's analogous human counterpart, ATAD2.

Credit: 
The Korea Advanced Institute of Science and Technology (KAIST)

New frailty index may help determine adverse outcomes in older patients after hospital discharge

A new frailty index shows promise in determining how acute illness affects functional ability in older patients admitted to hospital, according to a new study in CMAJ (Canadian Medical Association Journal) co-led by researchers from Dalhousie University, Canada, and University College London (UCL), United Kingdom: http://www.cmaj.ca/lookup/doi/10.1503/cmaj.190952

Older, frail adults often lose the ability to function if they are admitted to hospital for a sudden acute illness. Understanding how to measure seniors' frailty in the context of their illness may help in providing them with specific supports after discharge from hospital.

Researchers used routine laboratory test results from a group of adults admitted to University College Hospital, London, UK, to create a frailty index (FI-Laboratory) and linked it to hospital outcomes data. A higher score on the FI-Laboratory was associated with an 18% increased likelihood of readmission and a 45% increased likelihood of death, after accounting for other health factors.

"Assessing clinical frailty in the acute care setting is difficult," writes Dr. Samuel Searle, Dalhousie University and MRC Unit for Lifelong Health and Ageing at UCL, with coauthors. "The FI-Laboratory can help to identify complex, acutely ill older adults at hospital admission who have accumulated multiple health deficits and are at an increased risk of adverse outcomes."

"By quantifying both acute and chronic deficits, the score may draw attention to risk that is not always apparent clinically," write the authors.

The authors note that although the FI-Laboratory is being studied in several clinical settings, it is not yet known whether it will help improve clinical outcomes for patients.

Credit: 
Canadian Medical Association Journal

Cannabis edibles present novel health risks

With the recent legalization of cannabis edibles in Canada, physicians and the public must be aware of the novel risks of cannabis edibles, argue authors in a commentary in CMAJ (Canadian Medical Association Journal): http://www.cmaj.ca/lookup/doi/10.1503/cmaj.191217

"Although edibles are commonly viewed as a safer and more desirable alternative to smoked or vaped cannabis, physicians and the public should be aware of several risks related to the use of cannabis edibles," write Drs. Jasleen Grewal and Lawrence Loh from the University of Toronto, Toronto, Ontario.

Cannabis edibles take on average four hours longer to produce noticeable effects in comparison to inhaled cannabis, which can increase the risk of overconsumption. With effects lasting up to 8 hours, edibles can also lead to a longer period of impairment compared to inhaled cannabis. While federal regulations have standardized the presentation of dosing information, the authors warn that "individuals' responses to different products may vary and overdosing may still occur, with cannabis-naive individuals particularly at risk."

At particular risk are children and pets, because many edibles look like candy and other appetizing food and drink. Other vulnerable groups include older people and youth; of note, a recent Canadian report found that youth believe cannabis edibles have positive effects on sleep, mood and anxiety, a belief that runs counter to the available evidence.

"Physicians should routinely question patients who ask about cannabis about their use or intended use of edible cannabis products so that they can counsel these patients regarding child safety, potential for accidental overconsumption and delayed effects, and potential for interactions with other substances such as alcohol, benzodiazepines, sleeping aids and opioids," caution the authors.

Population-level monitoring and evaluation of the effects of legalized edibles will help ensure that regulations are best able to protect children, youth, seniors and other age groups from health effects related to the consumption of cannabis edibles.

Credit: 
Canadian Medical Association Journal

American College of Physicians issues guideline for testosterone treatment in adult men

Philadelphia, January 7, 2020 -- Physicians should prescribe testosterone for men with age-related low testosterone only to treat sexual dysfunction, the American College of Physicians (ACP) says in a new evidence-based clinical practice guideline published today in Annals of Internal Medicine.

"Physicians are often asked by patients about low 'T' and are skeptical about the benefits of testosterone treatment," said ACP President Robert M. McLean, MD, MACP. "The evidence shows that men with age-related low testosterone may experience slight improvements in sexual and erectile function. The evidence does not support prescribing testosterone for men with concerns about energy, vitality, physical function, or cognition."

ACP's guideline, endorsed by the American Academy of Family Physicians, applies to adult men with age-related low testosterone. It does not address screening or diagnosis of hypogonadism, or monitoring of testosterone levels.

Physicians should discuss whether to initiate testosterone treatment in men with age-related low testosterone with sexual dysfunction who want to improve sexual and erectile function based on the potential benefits, harms, costs, and patient preferences, ACP says. ACP also recommends that physicians reevaluate symptoms within 12 months and periodically thereafter. Physicians should discontinue testosterone treatment if sexual function does not improve, and they should not initiate testosterone treatment to improve energy, vitality, physical function, or cognition because the evidence indicates testosterone treatment is not effective.

"Given that testosterone's effects were limited to small improvements in sexual and erectile function in men with low testosterone levels, it is unlikely that screening men for low testosterone levels or treating men without sexual or erectile dysfunction and low testosterone levels would be effective." Dr. McLean said.

ACP suggests that physicians consider intramuscular rather than transdermal formulations when initiating testosterone treatment to improve sexual function because the costs are considerably lower for the intramuscular formulation and clinical effectiveness and harms are similar.

According to paid pharmaceutical claims in the 2016 Medicare Part D Drug Claims data, the 2016 annual cost per beneficiary for testosterone replacement therapy was $2,135.32 for the transdermal formulation and $156.24 for the intramuscular formulation.

"Most men are able to inject the intramuscular formulation at home and do not require a separate clinic or office visit for administration," said Dr. McLean.

Men experience a gradual decline in serum total testosterone levels as they age, starting in their mid-30s, at an average rate of 1.6 percent per year. This condition is referred to as age-related low testosterone. In the U.S., low testosterone affects approximately 20 percent of men over age 60, 30 percent over age 70, and 50 percent over age 80, though the prevalence of low testosterone accompanied by sexual dysfunction symptoms (defined as at least three sexual symptoms with total testosterone less than 320 nanograms per deciliter) is lower. It is uncertain whether nonspecific signs and symptoms associated with age-related low testosterone are a consequence of age-related low testosterone levels or whether they are a result of other factors, such as chronic illnesses or medications.
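To give a sense of what a 1.6 percent average annual decline adds up to, the sketch below compounds that rate from a hypothetical mid-30s baseline. The 600 ng/dL starting value and the compounding treatment are assumptions chosen only for illustration; neither comes from the ACP guideline.

```python
# Illustrative arithmetic only: compounds the quoted 1.6%-per-year average decline
# from a hypothetical baseline. Neither the 600 ng/dL starting level nor the
# compounding treatment comes from the ACP guideline.
baseline_age = 35
baseline_level = 600.0      # ng/dL, hypothetical starting level
annual_decline = 0.016

for age in (50, 65, 80):
    years = age - baseline_age
    level = baseline_level * (1 - annual_decline) ** years
    print(f"age {age}: ~{level:.0f} ng/dL ({100 * level / baseline_level:.0f}% of baseline)")
```

Under these assumptions the hypothetical level falls to roughly half its starting value by age 80, which is in the neighborhood of the 320 ng/dL threshold mentioned above.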

ACP developed its recommendations in "Testosterone Treatment in Adult Men with Age-Related Low Testosterone" based on a systematic evidence review on the efficacy and safety of testosterone treatment in adult men with age-related low testosterone. The Minnesota Evidence-based Synthesis Center conducted the review funded by ACP. ACP's Clinical Guidelines Committee evaluated the clinical outcomes using the GRADE system for sexual function, physical function, quality of life, energy/vitality, depression, cognition, serious adverse events, major adverse cardiovascular events, and other adverse events.

Cochrane, a global leader and resource in evidence-informed health decision-making, has officially recognized ACP as a Cochrane US Network Affiliate. To receive this designation, affiliates must show a proven record of supporting evidence-based practice and demonstrate expertise and competence in systematic reviewing and evidence-informed health practice and policy. ACP is also a member of the Guidelines International Network, whose mission is to lead, strengthen, and support collaboration in guideline development.

Credit: 
American College of Physicians

Adolescents' view of family social standing correlates with mental health, life outcomes

Irvine, Calif. -- Young people's view of their family's social status was more strongly associated with their mental health and readiness for future education and work than how much money, education or occupational prestige their parents have, according to new research led by the University of California, Irvine. The study, published in Proceedings of the National Academy of Sciences, found that by age 18, youths who rated their family as having a higher place in society had fewer difficulties negotiating the transition to adulthood, independent of the objective position of the family.

"The amount of financial resources children have access to is one of the most reliable predictors of their health and life chances," said Candice Odgers, UCI professor of psychological science and senior author of the report. "But these findings show that how young people see their family's place in a hierarchical system also matters. Their perceptions of social status were an equally good, and often stronger, indicator of how well they were going to do with respect to mental health and social outcomes."

Researchers followed 2,232 same-sex twins born in England and Wales in 1994-95 who were part of the Environmental Risk Longitudinal Twin Study based at King's College London. Adolescents assessed their family's social ranking at ages 12 and 18. By late adolescence, these beliefs signaled how well the teen was doing, independent of the family's access to financial resources, healthcare, adequate nutrition and educational opportunities. This pattern was not seen at age 12.

Study findings also showed that despite growing up in the same family, the twins' views were not always identical. By age 18, the twin who rated the family's standing as higher was less likely to be convicted of a crime; was more often educated, employed or in training; and had fewer mental health problems than his or her sibling.

"Testing how young people's perceptions related to well-being among twins provided a rare opportunity to control for poverty status as well as environmental and genetic factors shared by children within the same family," said Joshua Rivenbark, an M.D./Ph.D. student at Duke University and the study's lead author. "Siblings grew up with equal access to objective resources, but many differed in where they placed their family on the social ladder - which then signaled how well each twin was doing."

Researchers did not discover evidence that youths' opinions of where their family was ranked were robustly associated with measures of physical health or cognitive functioning. These were more strongly linked with objective markers of family income and social status.

"We want to emphasize caution in interpreting our findings, given that adolescents' experiences of being convicted or suffering from mental health problems may also influence their views of where their family stands in society," Rivenbark said. "Studies that experimentally manipulate how young people see their social position would be needed to sort out cause from effect."

"Targeting adolescents' views of where they stand in society alone will never fully combat larger inequalities" Odgers added "But as the gap between the rich and the rest grows creative solutions focusing on both societal and individual factors are needed to help young people to overcome unprecedented obstacles to social mobility and move their way up the social ladder."

Credit: 
University of California - Irvine

Severe childhood deprivation has longstanding impacts on brain size in adulthood

Researchers from King's College London have shown that the brains of young adult Romanian adoptees who were institutionalised as children are around 8.6% smaller than the brains of English adoptees who have not suffered this form of deprivation.

According to the research, the longer the Romanian adoptees spent in the institutions, the smaller their total brain volume, with each additional month of deprivation associated with a 0.27% reduction in total brain volume. Deprivation-related changes in brain volume were associated with lower IQ and more symptoms of attention deficit hyperactivity disorder (ADHD).
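As a rough illustration of what the per-month figure implies, the sketch below extrapolates it linearly across the 3-to-41-month range of institutionalization reported later in this article. This is a simple reading of the quoted numbers, not an analysis from the paper.

```python
# Simple linear reading of the figures quoted in this article, for illustration only.
REDUCTION_PER_MONTH = 0.27   # percent of total brain volume per month of deprivation

for months in (3, 12, 24, 41):
    print(f"{months:>2} months of deprivation -> ~{months * REDUCTION_PER_MONTH:.1f}% smaller total brain volume")

# Read the same way, the 8.6% average group difference corresponds to roughly
# 8.6 / 0.27, i.e. about 32 months of deprivation.
print(f"8.6% average difference implies ~{8.6 / REDUCTION_PER_MONTH:.0f} months under this linear model")
```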

Published in Proceedings of the National Academy of Sciences (PNAS), the study analysed the MRI brain scans of 67 young adults, aged 23-28 years, who were exposed to severely depriving conditions in Romanian institutions under the Communist regime and subsequently adopted into nurturing families in the UK. They were compared to the MRI brain scans of 21 English adoptees aged 23-26 years who had not suffered this institutional deprivation.

MRI scans were conducted at the Centre for Neuroimaging Sciences at King's College London, as part of the Medical Research Council (MRC) funded English and Romanian Adoptees Brain Imaging Study (ERABIS). This is part of the larger ERA project that has collected information from Romanian and English adoptees over time including measures of mental health and cognitive performance.

This is the first time research has examined the impacts of severe early childhood deprivation on the brain structure of young adults.

Statistical analysis showed that, in this group of young Romanian adults, those changes in brain volume that were related to deprivation were also associated with lower IQ and more ADHD symptoms. This implies that changes in brain structure could play a mediating role between the experience of deprivation and levels of cognitive performance and mental health.

The researchers investigated other factors that could have influenced the findings, but the results were unaffected by level of nutrition, physical growth or genetic predisposition for smaller brains.

The principal investigator of the study, Professor Edmund Sonuga-Barke from the Institute of Psychiatry, Psychology & Neuroscience (IoPPN), King's College London said: 'The English and Romanian Adoptees (ERA) study addresses one of the most fundamental questions in developmental psychology and psychiatry - how does early experience shape individual development? It's essential to recognise that these young people have nearly always received great care in loving adoptive families since they left the institutions. However, despite a lot of positive experiences and achievements there remain some deep-seated effects of deprivation on these young adults.'

First author, Dr Nuria Mackes from the IoPPN said: 'Previous research on the English and Romanian Adoptees (ERA) study has suggested that the emergence and persistence of low IQ and a high level of ADHD symptoms involves structural changes in the brain but, until now, we have not been able to provide direct evidence of this. Showing these very profound effects of early deprivation on brain size and then showing that this difference is associated with low IQ and greater ADHD symptoms provides some of the most compelling evidence of the neuro-biological basis of these problems following deprivation.'

The study also investigated where these changes were occurring in the brain and what localised features contributed to the differences. In comparison to the UK adoptees, the young Romanian adults who had suffered deprivation as children had markedly smaller right inferior frontal regions of the brain both in terms of volume and surface area.

In contrast, the right inferior temporal lobe was larger in volume, surface area and thickness in the Romanian young adults, and this was associated with lower levels of ADHD symptoms. This implies that the increase in volume and surface area in this region may play a compensatory role in preventing the development of ADHD symptoms. In the right medial prefrontal region, the longer the duration of deprivation, the larger the volume and surface area.

The neuroimaging lead for the study, Professor Mitul Mehta from the IoPPN said: 'We found structural differences between the two groups in three regions of the brain. These regions are linked to functions such as organisation, motivation, integration of information and memory. It's interesting to see the right inferior temporal lobe is in fact larger in the Romanian young adults and that this was related to fewer ADHD symptoms, suggesting that the brain can adapt to reduce the negative effects of deprivation. This may explain why some individuals appear less affected than others by deprivation. We believe this is the first time that research has shown such compelling evidence of compensatory effects around deprivation.'

The Romanian young adults in the study had entered into institutions in the first few weeks of life, where they were often malnourished with minimal social contact and little stimulation. The time spent in institutions before adoption into families in the UK varied between 3 and 41 months.

Reflecting on the implications of the study Professor Sonuga-Barke said: 'By investigating the long term impact of deprivation our research highlights the need for a life-span perspective on the provision of any help and support, especially during the transition to adulthood. More speculatively the evidence of neural compensation in the inferior temporal lobe provides encouragement to look for ways that might help the brain adjust to deprivation and to improve outcomes. For example, it would be interesting to see if targeting this area directly through cognitive training might reduce ADHD symptoms.'

The research was published in Proceedings of the National Academy of Sciences and funded by Medical Research Council, Economic and Social Research Council and NIHR Maudsley Biomedical Research Centre.

Credit: 
King's College London

Health care paperwork cost US $812 billion in 2017, 4 times more per capita than Canada

A study published today (January 6) in the Annals of Internal Medicine finds that health care bureaucracy cost Americans $812 billion in 2017. This represented more than one-third (34.2%) of total expenditures for doctor visits, hospitals, long-term care and health insurance. The study estimated that cutting U.S. administrative costs to Canadian levels would have saved more than $600 billion in 2017.

Health administration costs were more than fourfold higher per capita in the U.S. than in Canada, which implemented a single-payer Medicare for All system in 1962 ($2,479 vs. $551 per person). Americans spent $844 per person on insurers' overhead while Canadians spent $146. Additionally, doctors, hospitals, and other health providers in the U.S. spent far more on administration due to the complexity entailed in billing multiple payers and dealing with the bureaucratic hurdles insurers impose. As a result, hospital administration cost Americans $933 per capita vs. $196 in Canada. The authors note that in Canada hospitals are financed through lump-sum "global budgets" rather than fee-for-service, much as fire departments are funded in the U.S. Physicians' billing costs were also much higher in the U.S., at $465 per capita vs. $87 in Canada.
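As a rough consistency check on the per-capita and national figures (not a calculation from the paper), the sketch below scales the per-person costs back up to national totals. The 2017 U.S. population figure of roughly 325 million is an outside approximation.

```python
# Rough consistency check, not from the study. The 2017 U.S. population figure
# (~325 million) is an outside approximation.
US_POPULATION = 325e6
PER_CAPITA_US = 2479        # total administrative cost per person, USD
PER_CAPITA_CANADA = 551

implied_us_total = PER_CAPITA_US * US_POPULATION
implied_savings = (PER_CAPITA_US - PER_CAPITA_CANADA) * US_POPULATION

print(f"Implied U.S. total: ~${implied_us_total / 1e9:.0f} billion")                           # ~$806 billion
print(f"Implied savings at Canadian per-capita levels: ~${implied_savings / 1e9:.0f} billion")  # ~$627 billion
```

Both numbers land close to the article's headline figures of $812 billion in total administrative costs and more than $600 billion in potential savings.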

The analysis, the first comprehensive study of health administration costs since 1999, was carried out by researchers at Harvard Medical School, the City University of New York at Hunter College, and the University of Ottawa. The authors, who also performed the 1999 study, analyzed thousands of accounting reports that hospitals and other health care providers filed with regulators, as well as census data on employment and wages in the health sector. They obtained additional data from surveys of physicians and government reports.

The researchers found that administration's share of overall U.S. health spending rose by 3.2 percentage points between 1999 and 2017, from 31.0% to 34.2%. Most of that increase (2.4 percentage points) was due to the expanding role that private insurers have assumed in tax-funded programs such as Medicaid and Medicare. Private managed care plans now enroll more than one-third of Medicare recipients and a majority of those on Medicaid; Medicare and Medicaid now account for 52% of private insurers' revenues. Private insurers' increasing involvement has pushed up overhead in those public programs; private Medicare Advantage plans take 12% or more of premiums for their overhead, while traditional Medicare's overhead is just 2%, a difference of at least $1,155 per enrollee per year.

The authors cautioned that their estimates probably understate administrative costs, and particularly the growth since 1999. Their 1999 study included administrative spending for some items such as dental care for which no 2017 data were available. Additionally, private insurance overhead has increased since the study's completion, rising by 13.2% between 2017 and 2018 according to official health spending figures released in December.

"Medicare for All could save more than $600 billion each year on bureaucracy, and repurpose that money to cover America's 30 million uninsured and eliminate copayments and deductibles for everyone," said study senior author Dr. Steffie Woolhandler, a distinguished professor at the City University of New York (CUNY) at Hunter College and lecturer in Medicine at Harvard Medical School, where she previously served as a professor. "Reforms like a public option that leave private insurers in place can't deliver big administrative savings," Dr. Woolhandler added. "As a result, public option reform would cost much more and cover much less than Medicare for All."

According to Dr. David Himmelstein, the study's lead author who is an internist in the South Bronx, a distinguished professor at CUNY's Hunter College and lecturer in Medicine at Harvard, "Americans spend twice as much per person as Canadians on health care. But instead of buying better care, that extra spending buys us sky-high profits and useless paperwork. Before their single-payer reform, Canadians died younger than Americans, and their infant mortality rate was higher than ours. Now Canadians live three years longer and their infant mortality rate is 22% lower than ours. Under Medicare for All, Americans could cut out the red tape and afford a Rolls Royce version of Canada's system."

Credit: 
Physicians for a National Health Program

Scientists find new way to sustainably make chemicals by copying nature's tricks

image: Bacteria producing chemicals with (L) and without (R) the bioderivatization step. With bioderivatization, the bacteria are healthier (darker green).

Image: 
Patrik Jones/Imperial College London

Researchers have copied the way organisms produce toxic chemicals without harming themselves, paving the way for greener chemical and fuel production.

The new technique, pioneered by Imperial College London scientists, could reduce the need to use fossil fuels to create chemicals, plastics, fibres and fuels.

Currently, many useful chemicals are produced from fossil fuels, which require mining, are of limited supply, and disrupt the carbon cycle. An alternative is to engineer microorganisms like Escherichia coli (E. coli) and cyanobacteria to more sustainably produce the chemicals directly from atmospheric carbon dioxide.

However, many of the chemicals that can be produced this way are toxic to the microorganisms, reducing their ability to make large quantities in a cost-effective way.

Now, by copying the way natural organisms deal with their own toxic chemicals, researchers have shown that bacteria can be programmed to produce chemicals without also harming growth.

This concept could be used to produce useful chemicals, plastics and even fuels, which could further reduce the need for fossil fuels and help minimise climate change. The new technique and a first proof of concept are published today in Proceedings of the National Academy of Sciences.

Lead researcher Dr Patrik Jones, from the Department of Life Sciences at Imperial, said: "We looked at what nature does already, for its own benefit, and applied the idea to biotechnology, for our benefit."

Organisms like plants and yeasts sometimes produce chemicals that are toxic to them, so to store them safely, they make small modifications to the chemicals to render them harmless. The resulting chemicals are known as 'derivatives', and can be returned to the original, toxic form through relatively simple chemistry.

The team took this idea and used genetic engineering to program E. coli and cyanobacteria to make 1-octanol, a chemical currently used in perfumes, which is toxic to the bacteria. They then added an extra set of instructions to E. coli so it would produce two different derivatives of 1-octanol that are both less harmful.

The researchers say if this were to be scaled up for industrial systems the engineered bacteria would produce the non-toxic derivative of 1-octanol, which would then be recovered and chemically transformed back into 1-octanol, ready for use.

The team found that their system produced 1-octanol without affecting the growth of the bacteria. They also found the system produced more 1-octanol than a system without the derivatization step, which they think is related to the fact that the derivative is not only less toxic but also more soluble in the surrounding water or solvent.

Dr Jones said: "A more soluble chemical may move away from the cells quicker, where it's less likely to interfere with any of the bacteria's processes."

Now that the team has shown the concept of creating derivatives using programmed microorganisms works, they want to set up a complete system, from production of the derivative to recovery of the desired chemical.

This will help them refine the process, and potentially scale it up for use in industrial settings.

Credit: 
Imperial College London

Simulated image demonstrates the power of NASA's Wide Field Infrared Survey Telescope

image: This simulated image of a portion of the Andromeda galaxy highlights the high resolution, large field of view, and unique footprint of NASA's upcoming Wide Field Infrared Survey Telescope (WFIRST). Made using data from the Panchromatic Hubble Andromeda Treasury (PHAT) program, the image spans approximately 34,000 light-years, or about 1/5 of the full disk of Andromeda, showcasing the red and near-infrared light of more than 50 million individual stars. Red and green represent near-infrared light, while blue represents visible red light. The image runs from the edge of the bright core of the galaxy on the lower left, out along and across several of the galaxy's spiral arms in the middle and right.

Image: 
NASA, STScI and B.F. Williams (University of Washington)

Imagine a fleet of 100 Hubble Space Telescopes, deployed in a strategic space-invader-shaped array a million miles from Earth, scanning the universe at warp speed.

With NASA's Wide Field Infrared Survey Telescope, scheduled for launch in the mid-2020s, this vision will (effectively) become reality.

WFIRST will capture the equivalent of 100 high-resolution Hubble images in a single shot, imaging large areas of the sky 1,000 times faster than Hubble. In several months, WFIRST could survey as much of the sky in near-infrared light -- in just as much detail -- as Hubble has over its entire three decades.

Elisa Quintana, WFIRST Deputy Project Scientist for Communications at NASA's Goddard Space Flight Center in Greenbelt, Maryland, is confident that WFIRST will have the power to transform astrophysics. "To answer fundamental questions like: How common are planets like those in our solar system? How do galaxies form, evolve, and interact? Exactly how -- and why -- has the universe's expansion rate changed over time? We need a tool that can give us both a broad and detailed view of the sky. WFIRST will be that tool."

Although WFIRST has not yet opened its wide, keen eyes on the universe, astronomers are already running simulations to demonstrate what it will be able to see and plan their observations.

This simulated image of a portion of our neighboring galaxy, Andromeda (M31), provides a preview of the vast expanse and fine detail that can be covered with just a single pointing of WFIRST. Using information gleaned from hundreds of Hubble observations, the simulated image covers a swath roughly 34,000 light-years across, showcasing the red and infrared light of more than 50 million individual stars detectable with WFIRST.

While it may appear to be a somewhat haphazard arrangement of 18 separate images, the simulation actually represents a single shot. Eighteen square detectors, 4096 by 4096 pixels each, make up WFIRST's Wide Field Instrument (WFI) and give the telescope its unique window into space.

With each pointing, WFIRST will cover an area roughly 1.5 times that of the full Moon. By comparison, each individual infrared Hubble image covers an area less than 1% of the full Moon.
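The quick sketch below puts those detector and field-of-view figures together. The only outside assumption is that the full Moon spans roughly half a degree on the sky, or about 0.2 square degrees of apparent area.

```python
import math

# Quick arithmetic from the figures quoted above. The full Moon's ~0.5-degree
# apparent diameter is an outside approximation, not taken from this article.
detectors = 18
pixels_per_side = 4096
total_pixels = detectors * pixels_per_side ** 2
print(f"Wide Field Instrument pixels per shot: ~{total_pixels / 1e6:.0f} million")   # ~302 million

moon_area = math.pi * (0.5 / 2) ** 2      # ~0.2 square degrees
wfirst_field = 1.5 * moon_area            # "roughly 1.5 times that of the full Moon" per pointing
hubble_ir_field = 0.01 * moon_area        # "less than 1% of the full Moon" per image

print(f"WFIRST pointing: ~{wfirst_field:.2f} sq deg; Hubble infrared image: <{hubble_ir_field:.4f} sq deg")
print(f"Per-pointing area ratio: on the order of {wfirst_field / hubble_ir_field:.0f}x")
```

The ratio comes out in the low hundreds, the same order of magnitude as the "100 high-resolution Hubble images in a single shot" comparison made earlier in the article.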

The Advantages of Speed

WFIRST is designed to collect the big data needed to tackle essential questions across a wide range of topics, including dark energy, exoplanets, and general astrophysics spanning from our solar system to the most distant galaxies in the observable universe. Over its 5-year planned lifetime, WFIRST is expected to amass more than 20 petabytes of information on thousands of planets, billions of stars, millions of galaxies, and the fundamental forces that govern the cosmos.

For astronomers like Ben Williams of the University of Washington in Seattle, who generated the simulated data set for this image, WFIRST will provide a valuable opportunity to understand large nearby objects like Andromeda, which are otherwise extremely time-consuming to image because they take up such a large portion of the sky.

"We have spent the last couple of decades getting images at high resolution in small parts of nearby galaxies. With Hubble you get these really tantalizing glimpses of very complex nearby systems. With WFIRST, all of a sudden you can cover the whole thing without spending lots of time," Williams said.

The ability to image such a large area will provide astronomers with important context needed to understand how stars form and how galaxies change over time. Williams explained that with a wide field, "you get the individual stars, you get the structures they live in, and the structures that surround them in their environment."

Julianne Dalcanton of the University of Washington, who led the Panchromatic Hubble Andromeda Treasury (PHAT) program that the simulated data are based on, also believes that WFIRST's combination of ultra-telephoto and super-wide-angle capabilities will be ground-breaking. "The PHAT survey of Andromeda was a tremendous investment of time, requiring careful justification and forethought. This new simulation shows how easy an equivalent observation could be for WFIRST." WFIRST could survey Andromeda nearly 1,500 times faster than Hubble, building a panorama of the main disk of the galaxy in just a few hours.

WFIRST's extraordinary survey speed is a result of its wide field of view, its agility, and its orbit. Williams explained that by covering more area in one field and being able to switch fields more quickly, "you're avoiding all those overheads that are associated with repointing the telescope so many times." In addition, WFIRST's orbit one million miles out will provide a view that is generally unobstructed by Earth. While Hubble is often able to collect data during only half of its low-Earth orbit 350 miles up, WFIRST will be able to observe more-or-less continuously.

Major Survey Programs

Because it can collect so much detailed data so quickly, WFIRST is ideally suited for large surveys. A significant portion of the mission will be dedicated to monitoring hundreds of thousands of distant galaxies for supernova explosions, which can be used to study dark energy and the expansion of the universe. Another major program will involve mapping the shapes and distribution of galaxies in order to better understand how the universe -- including galaxies, dark matter, and dark energy -- has evolved over the past 13+ billion years.

WFIRST will also play an important role in the census of exoplanets. By monitoring the brightness of billions of stars in the Milky Way, astronomers expect to catch thousands of microlensing events -- slight increases in brightness that occur when a planet passes between the telescope and a distant star. WFIRST's ability to detect planets that are relatively small or far from their own stars -- as well as rogue planets, which don't orbit any star at all -- will help fill major gaps in our knowledge of planets beyond our solar system. Although microlensing will not give us the ability to see exoplanets directly, WFIRST will also carry a coronagraph, a technology demonstration instrument designed to block enough of the blinding starlight to make direct imaging and characterization of orbiting planets possible.

These large surveys are also expected to reveal the unexpected: strange, transient phenomena that have never before been observed. "If you cover a lot of the sky, you're going to find those rare things," explained Williams.

Open-Access Data

Further broadening its potential impact, all of the data collected by WFIRST will be non-proprietary and immediately available to the public. Dalcanton underscored the importance of this aspect of the mission: "Thousands of minds from across the globe are going to be able to think about that data and come up with new ways to use it. It's hard to anticipate what the WFIRST data will unlock, but I do know that the more people we have looking at it, the greater the pace of discovery."

Complementing Other Observatories

WFIRST's combination of talents will be a valuable complement to those of other observatories, including Hubble and the James Webb Space Telescope. "With one hundred times the field of view of Hubble, and the ability to rapidly survey the sky, WFIRST will be an extremely powerful discovery tool," explained Karoline Gilbert, WFIRST Mission Scientist at the Space Telescope Science Institute in Baltimore, Maryland. "Webb, which is 100 times more sensitive and can see deeper into the infrared, will be able to observe the rare astronomical objects discovered by WFIRST in exquisite detail. Meanwhile, Hubble will continue to provide a unique view into the optical and ultraviolet light emitted by the objects that WFIRST discovers, and Webb follows up on."

The simulated image is being presented at the 235th meeting of the American Astronomical Society in Honolulu, Hawaii.

Credit: 
NASA/Goddard Space Flight Center

Step toward 'ink' development for 3-D printing a bioprosthetic ovary

For the first time, scientists identified and mapped the location of structural proteins in a pig ovary. Ongoing development of an "ink" with these proteins will be used for 3-D printing an artificial (or bio-prosthetic) ovary that could be implanted and allow a woman to have a child. Findings were recently published in Scientific Reports.

"This is a huge step forward for girls who undergo fertility-damaging cancer treatments," says senior author Monica Laronda, PhD, Director of Basic and Translational Research, Fertility & Hormone Preservation & Restoration Program at Ann & Robert H. Lurie Children's Hospital of Chicago, and Assistant Professor of Pediatrics at Northwestern University Feinberg School of Medicine. "Our goal is to use the ovarian structural proteins to engineer a biological scaffold capable of supporting a bank of potential eggs and hormone producing cells. Once implanted, the artificial ovary would respond to natural cues for ovulation, enabling pregnancy."

In November 2019, Dr. Laronda, with three other collaborators, received a patent for creation of an artificial ovary. So far, she and colleagues have 3-D printed an artificial ovary that they implanted into a sterile mouse. The mouse was then able to become pregnant and had live pups. These groundbreaking results were published in 2017 in Nature Communications.

"The structural proteins from a pig ovary are the same type of proteins found in humans, giving us an abundant source for a more complex bio-ink for 3-D printing an ovary for human use," says Dr. Laronda. "We are one step closer to restoring fertility and hormone production in young women who survive childhood cancer but enter early menopause as a late effect. There are still several steps to go and we are excited to test our new inks."

The methodology Dr. Laronda and colleagues used to identify and map structural proteins in an ovary can be used by scientists to investigate other organs of interest.

"We have developed a pipeline for identifying and mapping scaffold proteins at the organ level," says Dr. Laronda. "It is the first time that this has been accomplished and we hope it will spur further research into the microenvironment of other organs."

Credit: 
Ann & Robert H. Lurie Children's Hospital of Chicago

Wearable AC

image: An on-skin device designed by engineers at the University of Missouri can achieve around 11 degrees Fahrenheit of cooling to the human body. The device also includes numerous human health care applications such as the ability to monitor blood pressure, electrical activity of the heart and the level of skin hydration.

Image: 
University of Missouri

One day, soldiers could cool down on the military battlefield -- preventing heat stroke or exhaustion -- by using "wearable air conditioning," an on-skin device designed by engineers at the University of Missouri. The device includes numerous human health care applications such as the ability to monitor blood pressure, electrical activity of the heart and the level of skin hydration.

The findings are detailed in the journal Proceedings of the National Academy of Sciences.

Unlike similar products in use today or other related concepts, this breathable and waterproof device delivers personal air conditioning to the human body through a process called passive cooling. Passive cooling does not rely on electricity-driven components such as fans or pumps, which the researchers believe minimizes discomfort for the user.

"Our device can reflect sunlight away from the human body to minimize heat absorption, while simultaneously allowing the body to dissipate body heat, thereby allowing us to achieve around 11 degrees Fahrenheit of cooling to the human body during the daytime hours," said corresponding author Zheng Yan, an assistant professor in the College of Engineering. "We believe this is one of the first demonstrations of this capability in the emerging field of on-skin electronics."

Currently, the device is a small wired patch, and researchers say it will take one to two years to design a wireless version. They also hope to one day take their technology and apply it to 'smart' clothing.

"Eventually, we would like to take this technology and apply it to the development of smart textiles," Yan said. "That would allow for the device's cooling capabilities to be delivered across the whole body. Right now, the cooling is only concentrated in a specific area where the patch is located. We believe this could potentially help reduce electricity usage and also help with global warming."

Credit: 
University of Missouri-Columbia

In a nearby galaxy, a fast radio burst unravels more questions than answers

image: An artist's conception of the localization of Fast Radio Burst 180916.J0158+65 to its host galaxy. The host galaxy image is based on real observations using the Gemini-North telescope atop Mauna Kea in Hawaii. The impulsive burst emanating from the galaxy is based on real data recorded using the 100-meter Effelsberg radio telescope in Germany.

Image: 
Danielle Futselaar (artsource.nl)

For more than a decade, astronomers across the globe have wrestled with the perplexities of fast radio bursts -- intense, unexplained cosmic flashes of energy, light years away, that pop for mere milliseconds.

Despite the hundreds of records of these enigmatic sources, researchers have only pinpointed the precise location of four such bursts.

Now there's a fifth, detected by a team of international scientists that includes West Virginia University researchers. The finding, which relied on eight telescopes spanning locations from the United Kingdom to China, was published Monday (Jan. 6) in Nature.

There are two primary types of fast radio bursts, explained Kshitij Aggarwal, a physics graduate student at WVU and a co-author of the paper: repeaters, which flash multiple times, and non-repeaters, one-off events. This observation marks only the second time scientists have determined the location of a repeating fast radio burst.

But the localization of this burst is not quite as important as the type of galaxy it was found in, which is similar to our own, said Sarah Burke-Spolaor, assistant professor of physics and astronomy and co-author.

"Identifying the host galaxy for FRBs is critical to tell us about what kind of environments FRBs live in, and thus what might actually be producing FRBs," Burke-Spolaor said. "This is a question for which scientists are still grasping at straws."

Burke-Spolaor and her student, Aggarwal, used the Very Large Array observatory in New Mexico to seek pulsations and a persistent radio glow from this burst. Meanwhile, Kevin Bandura, assistant professor of computer science and electrical engineering, and third WVU co-author of the article, worked on the Canadian Hydrogen Intensity Mapping Experiment team that initially detected the repeating fast radio burst.

"What's very interesting about this particular repeating FRB is that it is in the arm of a Milky Way-like spiral galaxy, and is the closest to Earth thus far localized," Bandura said. "The unique proximity and repetition of this FRB might allow for observation in other wavelengths and the potential for more detailed study to understand the nature of this type of FRB."

Using a technique known as Very Long Baseline Interferometry, the team achieved a level of resolution high enough to localize the burst to a region approximately seven light years across - a feat comparable to an individual on Earth being able to distinguish a person on the moon, according to CHIME.
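That analogy can be sanity-checked with small-angle arithmetic, as in the sketch below. The burst's roughly half-billion-light-year distance is quoted later in this article; the Earth-Moon distance and the size of a person are everyday approximations, not values from the paper.

```python
import math

# Small-angle comparison of the two scales in the analogy. Distances are rough:
# ~500 million light-years to the burst (quoted later in this article),
# ~384,000 km to the Moon, and ~2 m for a person.
RAD_TO_MAS = (180 / math.pi) * 3600 * 1000   # radians to milliarcseconds

frb_angle = 7 / 500e6                         # 7 light-years seen from 500 million light-years
person_angle = 2.0 / 384_000_000.0            # 2 m seen from 384,000 km

print(f"FRB localization region: ~{frb_angle * RAD_TO_MAS:.1f} milliarcseconds")     # ~2.9 mas
print(f"Person on the Moon: ~{person_angle * RAD_TO_MAS:.1f} milliarcseconds")       # ~1.1 mas
```

Both angles are a few milliarcseconds, so the comparison holds to within a small factor.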

With that level of precision, the researchers could analyze the environment from which the burst emanated through an optical telescope.

What they found has added a new chapter to the mystery surrounding the origins of fast radio bursts.

This particular burst sits in a radically different environment from those seen in previous studies: the first repeating burst was discovered in a tiny "dwarf" galaxy that contained metals and formed stars, Burke-Spolaor said.

"That encouraged a lot of publications saying that repeating FRBs are likely produced by magnetars (neutron stars with powerful magnetic fields)," she said. "While that is still possible, the fact that this FRB breaks the uniqueness of that previous mold means that we have to consider perhaps multiple origins or a broader range of theories to understand what creates FRBs."

At half-a-billion light years from Earth, the source of this burst, named "FRB 180916," is seven times closer than the only other repeating burst to have been localized, and more than 10 times closer than any of the few non-repeating bursts scientists have managed to pinpoint. Researchers are hopeful that this latest observation will enable further studies that unravel the possible explanations behind fast radio bursts, according to CHIME.

WVU has remained at the research forefront of fast radio bursts since they were first discovered in 2007 by a team right here at the University that included Duncan Lorimer and Maura McLaughlin, physics professors, and then-student David Narkevic. The trio discovered fast radio bursts from scouring archived data from Australia's Parkes Radio Telescope.

Credit: 
West Virginia University

Blood pressure control for people aged 80 and older: What's the right target?

The number of people aged 80 and older is on the rise and will account for nearly 10 percent of the whole U.S. population by 2050. Since the lifetime chance of developing high blood pressure is at least 70 percent by age 80, more and more people will be at risk for the health problems that high blood pressure can cause.

High blood pressure, or hypertension, is sometimes called the "silent killer" because it produces few, if any, symptoms. In fact, you might not even realize you have high blood pressure. But if it's not treated, this condition can lead to heart attacks, strokes, kidney disease, and other serious problems, including a risk for dementia.

The 2017 American College of Cardiology and American Heart Association blood pressure guidelines recommend that most people aged 65 or older maintain their systolic blood pressure (the first number in a blood pressure reading) at less than 130 mmHg. But, people 80 years or older often also have multiple chronic health conditions, can be frail, take several medicines, and could have cognitive problems. Because of this, it's still unclear whether the risks and benefits of lowering systolic blood pressure to less than 130 mm Hg are the same for people aged 80 years and older as they are for people aged 65 to 80.

Given this knowledge gap, a team of researchers focused on this group of older adults within a large randomized trial called the Systolic Blood Pressure Intervention Trial (SPRINT). They published their findings in the Journal of the American Geriatrics Society. In their analysis of SPRINT data, the researchers focused on participants aged 80 and older and examined heart disease events (such as heart attacks or strokes), changes in kidney function, cognitive impairment, quality of life, and death. The researchers also explored whether impairments in cognitive or physical function had any effect on intensive blood pressure control.

The analysis included 1,167 participants. Most were around 84 years old, and about 3 percent were 90 or older. Their baseline systolic blood pressure was around 142 mmHg. Most of the participants had at least three chronic health conditions. More than half were taking at least five medications and about 27 percent had a history of heart disease.

The participants were randomly assigned to one of two groups. One group received "intensive" treatment targeting a systolic blood pressure of less than 120 mmHg. The other group received treatment targeting a systolic blood pressure of less than 140 mmHg.

The people who received treatment to lower their blood pressure to less than 120 mmHg experienced a lower risk for heart disease events, as well as less risk for mild cognitive impairment and death from all causes. However, people in this group also experienced an increased risk of small, but meaningful, declines in kidney function as well as hospitalizations for short term kidney damage (from which most people recovered). Attempting to lower systolic blood pressure to less than 120 mmHg did not increase the risk for injury-causing falls. This is important, since falls raise the risk for death in older adults and low blood pressure can result in falls.

While the rate of developing dementia was similar in the two groups, participants in the intensive 120 mmHg group were 28 percent less likely to develop mild cognitive impairment.

The researchers also reported that people with better cognitive function (remembering, thinking, and making decisions) at the beginning of the study benefited the most from intensive blood pressure control. They also experienced less heart disease and fewer deaths. This same benefit was not seen in participants who had poorer cognitive function at the beginning of the study. However, there was not strong evidence of intensive blood pressure control having a harmful impact on death rates or developing heart disease for those with poorer cognitive function.

The researchers concluded that, for adults aged 80 years or older, intensively controlling systolic blood pressure to less than 120 mmHg lowers the risk of heart attacks, stroke, death, and mild cognitive impairment, but increases the risk of declines in kidney function. Benefits related to the risk for heart disease and death were highest in people with higher cognitive performance at the beginning of the trial.

Credit: 
American Geriatrics Society

Study suggests antiretroviral therapy does not restore disease immunity

BROOKLYN, N.Y. (January 6, 2020) - A study led by researchers from SUNY Downstate Health Sciences University and Oregon Health & Science University, published in the Journal of Infectious Diseases, showed that, despite successful antiretroviral therapy (ART), antigen-specific memory to vaccinations that occurred before HIV infection did not recover, even after immune reconstitution. Additionally, a previously unrealized decline in pre-existing antibody response was also observed.

Approximately two-thirds of HIV-positive patients in the U.S. are on an ART regimen. It works by reducing viral replication and boosting CD4 T cell counts, which are important for coordinating and regulating the immune response. Normally, these cells "remember" viruses and respond in large numbers when exposed again.

However, this study suggests that this memory is inhibited in some HIV-positive patients who are otherwise doing well on therapy. Should this loss of serological memory, or HIV-immune associated amnesia, exist for other pre-infection vaccinations or viruses, it would have significant implications for the overall health status of HIV patients, including:

Providing an explanation for chronic inflammation and "accelerated ageing" observed among people with HIV

Suggesting a loss of protective immunity and an increased risk for common acute or chronic viral infections among people with HIV, regardless of whether they are on an ART regimen

Suggesting a potential loss of protection against common childhood diseases such as measles, mumps, chickenpox, pertussis (whooping cough) and others, for which these patients were vaccinated as children prior to HIV infection and whose immunity is not restored by ART

"There is no doubt that ART provides significant, life-changing benefits for people with HIV by reconstituting their overall immune response," said Michael Augenbraun, MD, FACP. FIDSA, Professor of Medicine and Vice Chair, Department of Medicine and Director, Division of Infectious Diseases at SUNY Downstate Health Sciences University and Kings County Hospital Center.

"What our study suggests is that ART may not be completely effective in restoring the immune protection resulting from viral infections or childhood vaccines received prior to becoming HIV positive," added Dr. Augenbraun. "This makes these patients potentially susceptible not only to these serious diseases, but also other chronic infections and to chronic inflammation that may diminish their overall health and shorten their lifespan."

Dr. Augenbraun cautions that, while this study only examined immunologic responses to smallpox vaccination and not to specific clinical outcomes, it builds on previous studies and evidence pointing to HIV-associated immune amnesia. He says additional studies need to be conducted both on HIV positive men utilizing the smallpox vaccine antigen, and antigens of other common diseases for which people are vaccinated as children. Additionally, Dr. Augenbraun believes the study may also contribute to gathering support for earlier aggressive treatment in HIV infected individuals before they suffer significant damage to their immune system.

Should future studies identify broader HIV-associated immune amnesia, it could mean that a significant proportion of the 1.1 million people in the U.S. and more than 23 million people worldwide living with HIV have diminished protection from previous immunizations. However, he cautioned that the potential need for revaccination of patients, although suggested by these data, was not directly addressed by the study.

Credit: 
SUNY Downstate Health Sciences University