Forecast to help shellfish growers weather toxicity

image: Research Associate Craig Burnell analyzes a shellfish sample for toxins at Bigelow Laboratory for Ocean Sciences. A new publication reports how researchers can use deep learning algorithms to forecast shellfish toxicity, just like meteorologists forecast the weather.

Image: 
Bigelow Laboratory for Ocean Sciences

The same technology that powers facial recognition and self-driving cars may soon help Maine's shellfish industry protect people from the dangerous effects of harmful algal blooms. A recent paper reports how researchers can use these deep learning algorithms to forecast shellfish toxicity, just like meteorologists forecast the weather.

"Deep learning approaches have become incredibly sophisticated, and using them creatively can allow us to address all sorts of challenges," said Senior Research Scientist Nick Record, a modeler and the senior author of the paper. "This work unites the expertise of the industry, resource managers, and Bigelow Laboratory researchers, and I believe that by working together we can solve this problem."

Bigelow Laboratory for Ocean Sciences works with Maine's Department of Marine Resources (DMR) to test thousands of shellfish samples for toxins each year, using an advanced chemical method that Senior Research Scientist Steve Archer pioneered in 2014. These measurements help DMR judge when an area is safe for shellfish harvests. Over the years, this method has also created a dataset that reveals when and where toxins have occurred around the state - providing a unique opportunity to anticipate when they will show up in the future.

The research team used this dataset to train an algorithm to recognize the chemical "fingerprints" of the toxic compounds that some algae produce. These toxins can quickly concentrate to harmful levels in shellfish, which eat by filtering large quantities of water.

Their model uses neural networks, a machine learning approach loosely modeled on the structure of the brain that can process huge volumes of data and recognize complex patterns. As it churned through more and more data, the algorithm became highly accurate at predicting oncoming toxicity. Isabella Grasso, a Southern Maine Community College student and 2018 Research Experience for Undergraduates intern at Bigelow Laboratory, helped lead the research project and presented the results to industry and management leaders at the 2019 Northeast Shellfish Sanitation Association Annual Meeting.
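
To make the idea concrete, here is a minimal sketch of the general pattern the article describes - training a small neural network on past toxin measurements to flag sites likely to exceed a closure threshold. The features, synthetic data, and network size below are invented for illustration; they are not the team's published model.

```python
# Illustrative only: a tiny neural-network "toxicity forecaster" on synthetic data.
# The published model, its inputs, and its architecture differ; this shows the
# general workflow of fitting a neural net to recent toxin measurements.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical features: the last four weekly toxin levels at a site plus week of year.
n_samples = 500
recent_toxin = rng.gamma(shape=2.0, scale=20.0, size=(n_samples, 4))
week_of_year = rng.integers(1, 53, size=(n_samples, 1))
X = np.hstack([recent_toxin, week_of_year])

# Hypothetical label: 1 if next week's level would exceed a closure threshold of 80 (µg/100 g).
y = (recent_toxin[:, -1] * rng.normal(1.0, 0.3, n_samples) > 80).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
model.fit(X_train, y_train)
print(f"held-out accuracy: {model.score(X_test, y_test):.2f}")
```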

"Toxic shellfish events can cause significant problems in Maine and around the country, but we think we can greatly reduce their impact," Record said. "The ability to predict these sporadic events could allow farmers to prepare and adjust the timing of their harvest, helping protect the industry and consumers."

The Maine Department of Marine Resources monitors shellfish throughout the year to ensure that no harvesting occurs while toxin levels in the water are high. While this makes certain that all seafood sold is safe to eat, the fishery closures can cause major disruptions to the state's seafood industry.

Archer and Record recently received a grant to work with DMR and Maine shellfish growers to test and refine the forecast. Individual partnerships with growers will allow the researchers to receive and incorporate feedback on whether the forecast is working accurately and providing useful information.

Researchers predict that large-scale blooms of algae may become more common as the Gulf of Maine continues to warm, potentially favoring toxic algal species. The team hopes that live, real-time forecasts will be in place to aid monitoring efforts and shellfish harvests throughout the Gulf of Maine in a few years.

"It's a huge responsibility to monitor the full coast of Maine, and the implications for human health are not something to be taken lightly," said Kohl Kanwit, the director of public health for the Maine Department of Marine Resources. "This forecast could help us optimize our sampling efforts, and it likewise could help other states in their efforts to predict and manage harmful algal blooms."

Credit: 
Bigelow Laboratory for Ocean Sciences

Mowing urban lawns less intensely increases biodiversity, saves money and reduces pests

image: An experimental site comparing the ecological effects of intense mowing (R) with low impact mowing (L) in Trois-Rivières, Canada.

Image: 
Dr Chris Watson

The researchers combined data across North America and Europe using a meta-analysis, a way of aggregating results from multiple studies to increase statistical strength. They found strong evidence that increased mowing intensity of urban lawns - which included parks, roundabouts and road verges - had negative ecological effects, particularly on invertebrate and plant diversity. Pest species, on the other hand, benefitted from intense lawn management.

"Even a modest reduction in lawn mowing frequency can bring a host of environmental benefits: increased pollinators, increased plant diversity and reduced greenhouse gas emissions. At the same time, a longer, healthier lawn makes it more resistant to pests, weeds, and drought events." said Dr Chris Watson, lead author of the study.

The issue with regular lawn mowing is that it favours grasses, which grow from the base of the plant, and low-growing species like dandelion and clover. Other species that have their growing tips or flowering stems regularly removed by mowing can't compete. Allowing plant diversity in urban lawns to increase has the knock-on effect of increasing the diversity of other organisms, such as pollinators and herbivores.

The effect of intense lawn mowing on pest species was the least-studied aspect the authors examined, featuring in seven datasets across three studies in Eastern Canada. However, in all of these studies intensive lawn mowing resulted in an increase in the abundance of weeds and lawn pests.

"These findings support a lot of research done by the turfgrass industry that shows that the more disturbance a lawn gets, the higher the likelihood of pest and weed invasion." said Dr Chris Watson.

Common ragweed, which featured prominently in the studies, is one of the most allergenic plant species found in North America and Europe. Previous studies have estimated the cost of ragweed-based allergies at CAD$155 million per year in Quebec and €133 million a year in Austria and Bavaria. Because it reproduces more rapidly than other species, ragweed is able to colonise the disturbances caused by intense mowing.

Chris Watson explained that "Certain lawn invaders, such as ragweed, can be decreased simply through reducing lawn mowing frequency. This will decrease the pollen load in the air and reduce the severity of hayfever symptoms, number of people affected, and medical costs."

To understand the economic costs of intensely mowed lawns, the researchers used a case study of the city of Trois-Rivières, Quebec, Canada. Using data on mowing contractor costs, they estimated a 36% reduction in public maintenance costs when mowing frequency was reduced from 15 to 10 times per year in high-use lawn areas and from three times to once a year in low-use areas.

"If citizens would like to see urban greenspace improvement, they have the ability to influence how governments go about this - especially if it does not cost more money!" said Dr Chris Watson. "Likewise, complaints about long, messy lawns could quickly reduce the appetite of local government to trial these approaches - so it's important to have some community information and education as well. We need to shake the outdated social stigma that comes from having a lawn a few centimetres longer than your neighbour's"

The potential for long grass to harbour ticks and rodents is a common concern. However, Dr Chris Watson said there is little evidence to support this. "The presence of ticks is more strongly related to host populations, like deer, than to the type of vegetation. With respect to small mammals, some species prefer longer grass whereas others do not. The next phase of our research aims to explore these negative perceptions in more detail."

For their meta-analysis the researchers identified studies in an urban setting that measured mowing intensity (either height or frequency) as an experimental factor. On top of the 14 studies they identified, which took place between 2004 and 2019, they also included three previously unpublished studies from their research group. A separate case study was used to estimate the economic costs of high intensity lawn management.

On the reasons for conducting a meta-analysis, Chris Watson explained that: "Often, ecological studies are done over only one or two years and can be heavily influenced by the weather conditions during the period of study. A meta-analysis looks beyond individual years or locations to provide a broad overview of a research subject."
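
The numerical core of a meta-analysis is pooling each study's effect estimate, weighted by its precision. A minimal sketch of fixed-effect inverse-variance pooling is below; the effect sizes and variances are invented, and the published analysis (including any random-effects adjustments) is more involved than this.

```python
# A minimal sketch of inverse-variance pooling, the arithmetic at the heart of a meta-analysis.
# The effects and variances are made up for illustration; they are not the study's data.
import numpy as np

effects = np.array([-0.40, -0.25, -0.55, -0.10])   # hypothetical per-study effects of mowing intensity
variances = np.array([0.04, 0.02, 0.09, 0.03])     # hypothetical per-study variances

weights = 1.0 / variances                           # precise studies count for more
pooled = np.sum(weights * effects) / np.sum(weights)
se = np.sqrt(1.0 / np.sum(weights))

print(f"pooled effect: {pooled:.3f} +/- {1.96 * se:.3f} (95% CI)")
```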

The number of data sources from previous studies available to the authors ultimately limited the analysis. "In this case, all studies came from North America and Europe, so there is a big opportunity in seeing if the trends we found are confirmed elsewhere. Likewise, all the studies used to explore pest species were from Eastern Canada, so it is important to do more research in other places before applying these results generally," said Dr Chris Watson.

When looking at the economic impacts of intense lawn management, the authors were only able to incorporate contractor costs, which included workers' salaries, equipment operation and fuel. They were unable to include the costs of pesticides and fertiliser, or to factor in indirect economic benefits from improved ecosystem services like pollination.

The researchers are now looking to expand the research and begin applying the findings to improve lawns. "We plan to conduct some larger trials in partnership with the City of Trois-Rivières that expand the suite of pests and weeds that mowing may impact. At the same time we would like to investigate some of the negative perceptions of less-managed lawns and start working on some community outreach to promote low-intensity mowing for healthy lawns," said Dr Chris Watson.

Credit: 
British Ecological Society

Affordable Care Act led to improved treatment of colorectal cancer among young adults

An Affordable Care Act provision that allowed young adults to be covered under their parents' insurance led to a shift to earlier-stage diagnosis and more timely receipt of adjuvant chemotherapy among young colorectal cancer patients, according to a new American Cancer Society study. The study appears in JNCI.

In September 2010, the Dependent Coverage Expansion (DCE) under the Affordable Care Act (ACA) allowed young adults up to age 26 to be covered under their parents' private health insurance. To find out whether the DCE was associated with improved care among young adults with colorectal cancer, investigators led by Leticia Nogueira, Ph.D., used the National Cancer Database to analyze outcomes for nearly 2,000 newly diagnosed colorectal cancer patients who were DCE-eligible (ages 19 to 25) and compared them with outcomes for more than 8,000 patients who were not (ages 27 to 34), diagnosed during 2007-2013.

They found DCE-eligible patients who had surgery for stage IIB-IIIC colorectal cancer were 34% more likely to receive adjuvant chemotherapy post-ACA than pre-ACA. Furthermore, among DCE-eligible patients, average time from surgery to chemotherapy decreased by 7 days, from 57.4 days pre-ACA to 50.4 days post-ACA. There was no change among the comparison group (those ineligible for DCE). The authors note that because the younger age group was too young to be eligible for routine colorectal cancer screening, the change likely reflects improved access to care that allows for timely assessment of early symptoms.
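
The design described above - comparing eligible and ineligible age groups before and after the policy - lends itself to a difference-in-differences style calculation. The toy sketch below uses invented chemotherapy rates to show that logic; the study's actual estimates (such as the 34% figure) come from adjusted statistical models, not this arithmetic.

```python
# Toy difference-in-differences calculation with made-up numbers, illustrating the
# logic of comparing DCE-eligible and ineligible patients before and after the ACA.
chemo_rate = {
    ("eligible", "pre"): 0.60, ("eligible", "post"): 0.72,     # hypothetical adjuvant-chemo rates
    ("ineligible", "pre"): 0.61, ("ineligible", "post"): 0.62,
}
change_eligible = chemo_rate[("eligible", "post")] - chemo_rate[("eligible", "pre")]
change_ineligible = chemo_rate[("ineligible", "post")] - chemo_rate[("ineligible", "pre")]
did = change_eligible - change_ineligible
print(f"difference-in-differences: {did:+.2f} (change attributable to the policy, under the usual DiD assumptions)")
```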

"Our results have important implications for young adults diagnosed with colorectal cancer who may experience interruptions in their insurance coverage due to loss of dependent coverage or other life transitions," write the authors. "Our findings highlight the role of the ACA in improving access to potentially life-saving cancer care, including a shift to early-stage diagnosis and more timely receipt of adjuvant chemotherapy."

Credit: 
American Cancer Society

ESO observations reveal black holes' breakfast at the cosmic dawn

image: This image shows one of the gas halos newly observed with the MUSE instrument on ESO's Very Large Telescope, superimposed on an older image of a galaxy merger obtained with ALMA. The large-scale halo of hydrogen gas is shown in blue, while the ALMA data is shown in orange.

The halo is bound to the galaxy, which contains a quasar at its centre. The faint, glowing hydrogen gas in the halo provides the perfect food source for the supermassive black hole at the centre of the quasar.

The objects in this image are located at redshift 6.2, meaning they are being seen as they were 12.8 billion years ago. While quasars are bright, the gas reservoirs around them are much harder to observe. But MUSE could detect the faint glow of the hydrogen gas in the halos, allowing astronomers to finally reveal the food stashes that power supermassive black holes in the early Universe.

Image: 
ESO/Farina et al.; ALMA (ESO/NAOJ/NRAO), Decarli et al.

Astronomers using ESO's Very Large Telescope have observed reservoirs of cool gas around some of the earliest galaxies in the Universe. These gas halos are the perfect food for supermassive black holes at the centre of these galaxies, which are now seen as they were over 12.5 billion years ago. This food storage might explain how these cosmic monsters grew so fast during a period in the Universe's history known as the Cosmic Dawn.

"We are now able to demonstrate, for the first time, that primordial galaxies do have enough food in their environments to sustain both the growth of supermassive black holes and vigorous star formation," says Emanuele Paolo Farina, of the Max Planck Institute for Astronomy in Heidelberg, Germany, who led the research published today in The Astrophysical Journal. "This adds a fundamental piece to the puzzle that astronomers are building to picture how cosmic structures formed more than 12 billion years ago."

Astronomers have wondered how supermassive black holes were able to grow so large so early on in the history of the Universe. "The presence of these early monsters, with masses several billion times the mass of our Sun, is a big mystery," says Farina, who is also affiliated with the Max Planck Institute for Astrophysics in Garching bei München. It means that the first black holes, which might have formed from the collapse of the first stars, must have grown very fast. But, until now, astronomers had not spotted 'black hole food' -- gas and dust -- in large enough quantities to explain this rapid growth.

To complicate matters further, previous observations with ALMA, the Atacama Large Millimeter/submillimeter Array, revealed a lot of dust and gas in these early galaxies that fuelled rapid star formation. These ALMA observations suggested that there could be little left over to feed a black hole.

To solve this mystery, Farina and his colleagues used the MUSE instrument on ESO's Very Large Telescope in the Chilean Atacama Desert to study quasars -- extremely bright objects powered by supermassive black holes which lie at the centre of massive galaxies. The study surveyed 31 quasars that are seen as they were more than 12.5 billion years ago, at a time when the Universe was still an infant, only about 870 million years old. This is one of the largest samples of quasars from this early on in the history of the Universe to be surveyed.
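
As a back-of-envelope check of the quoted timescales, a standard cosmology calculator converts the redshift of the system shown above (z = 6.2) into a lookback time and an age of the Universe. The sketch below uses astropy's Planck18 parameters as an assumption; the survey spans a range of redshifts and the paper's adopted cosmology may differ slightly.

```python
# Rough consistency check of "more than 12.5 billion years ago" and an "infant" Universe,
# using the Planck 2018 cosmology bundled with astropy (an assumption, not the paper's choice).
from astropy.cosmology import Planck18
import astropy.units as u

z = 6.2
print(Planck18.lookback_time(z).to(u.Gyr))   # ~12.9 Gyr, i.e. more than 12.5 billion years ago
print(Planck18.age(z).to(u.Myr))             # roughly 870-900 Myr, depending on the adopted parameters
```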

The astronomers found that 12 quasars were surrounded by enormous gas reservoirs: halos of cool, dense hydrogen gas extending 100 000 light years from the central black holes and with billions of times the mass of the Sun. The team, from Germany, the US, Italy and Chile, also found that these gas halos were tightly bound to the galaxies, providing the perfect food source to sustain both the growth of supermassive black holes and vigorous star formation.

The research was possible thanks to the superb sensitivity of MUSE, the Multi Unit Spectroscopic Explorer, on ESO's VLT, which Farina says was "a game changer" in the study of quasars. "In a matter of a few hours per target, we were able to delve into the surroundings of the most massive and voracious black holes present in the young Universe," he adds. While quasars are bright, the gas reservoirs around them are much harder to observe. But MUSE could detect the faint glow of the hydrogen gas in the halos, allowing astronomers to finally reveal the food stashes that power supermassive black holes in the early Universe.

In the future, ESO's Extremely Large Telescope will help scientists reveal even more details about galaxies and supermassive black holes in the first couple of billion years after the Big Bang. "With the power of the ELT, we will be able to delve even deeper into the early Universe to find many more such gas nebulae," Farina concludes.

Credit: 
ESO

Watching TV makes us prefer thinner women

image: Samples of different body shapes and sizes used in the study.

Image: 
Martin Tovee et al

The more TV we watch the more we prefer thinner female bodies, according to a new comprehensive study on body image.

The researchers are calling on TV and advertising bosses to show people of all shapes and sizes in order to reduce the pressure on women and girls to aspire to a 'thin ideal body'.

The team, led by Durham University, worked with men and women from a number of villages in a remote area of Nicaragua in Central America who either had regular or hardly any TV access.

People with very limited access to TV preferred female figures with a higher Body Mass Index (BMI) whereas people who often watched TV preferred thinner bodies.

The villages in Nicaragua were selected because people were very similar in terms of their ecological constraints, such as nutrition, income and education, but had differing access to TV. This meant researchers were able to isolate the effect of TV exposure from the other factors.

The researchers say this is the best evidence to date that TV is having a causal effect on people's perceptions of body ideals.

The findings, published in the Journal of Personality and Social Psychology, show that TV exposure can have a powerful impact on what people perceive as the ideal body.

The representation of this 'thin ideal' in the media can lead to body dissatisfaction and can play a part in the development of eating disorders and depression.

Lead author of the research, Professor Lynda Boothroyd, from Durham University's Psychology Department, said: "TV and advertising bosses have a moral responsibility to use actors, presenters and models of all shapes and sizes and avoid stigmatising larger bodies. There needs to be a shift towards a 'health at every size' attitude and the media has an important role to play in that."

People in the villages in this part of Nicaragua generally did not have access to magazines or the Internet, and none of the participants in the study owned a smartphone. Only those people with electricity supplies to their homes as well as the money to pay for a TV and subscription were able to watch TV on a regular basis.

Those people with access to TV watched a mixture of Latin soap operas, Hollywood action movies, music videos, police "car chase" reality shows and the news.

Co-author, Dr Jean-Luc Jucker, from Durham University and University of the Autonomous Regions of the Nicaraguan Caribbean Coast, commented: "This study, utilizing a range of quantitative and qualitative research methods with non-Western participants, provides yet more empirical evidence that the mass media impact female body size ideals."

299 men and women from seven villages in the Pearl Lagoon Basin area of Nicaragua took part in the research. They completed a questionnaire about their ethnicity, education, income, hunger, language and TV exposure. They were then asked to rate the attractiveness of pictures of female bodies with varying body shapes and sizes.
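
The analytic idea - estimating the association between TV access and body-size preference while adjusting for other measured factors - can be sketched as a simple regression. Everything below (variable names, coefficients, data) is invented for illustration; the published analyses are considerably more sophisticated.

```python
# Illustrative sketch only: regressing a preference measure on TV access while
# controlling for covariates, the general logic of isolating TV's effect.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 299
df = pd.DataFrame({
    "tv_access": rng.integers(0, 2, n),          # 0 = little/no TV, 1 = regular TV (hypothetical coding)
    "education_years": rng.integers(0, 12, n),
    "income": rng.normal(100, 30, n),
})
# Hypothetical outcome: preferred female BMI, lower where TV access is regular.
df["preferred_bmi"] = 26 - 2.0 * df["tv_access"] + rng.normal(0, 1.5, n)

model = smf.ols("preferred_bmi ~ tv_access + education_years + income", data=df).fit()
print(model.params["tv_access"])   # estimated shift in preferred BMI associated with TV access
```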

In addition to this study, the team also carried out another study among those villagers who had little or no TV access.

Dr Tracey Thornborrow from the University of Lincoln, co-author and field researcher on the project, explains: "We showed the villagers a series of pictures, either showing larger women or thinner women. We found that after viewing these images, the villagers' body ideals adjusted in the same direction.

"Our findings clearly demonstrate that perceptions of attractiveness are highly changeable, and are affected by what we are visually exposed to."

Professor Boothroyd has previously found the same effect in women in Western societies but this effect had never been tested outside industrialised societies before.

Being able to show that perceptions of attractiveness are this changeable in even 'media naïve' participants is a major step forward in our understanding of cultural variation, according to the researchers. "If there's something that's universal about attraction, it is how flexible it is," Professor Boothroyd added.

Credit: 
Durham University

Long work hours at the office linked to both regular and hidden high blood pressure

DALLAS, Thursday, Dec. 19, 2019 - Office workers who spend long hours on the job are more likely to have high blood pressure, including a type that can go undetected during a routine medical appointment, according to a new study published today in the American Heart Association's journal Hypertension.

High blood pressure affects nearly half of Americans ages 18 and older and is a primary factor in more than 82,000 deaths per year. Approximately 15-30% of U.S. adults have a type of the condition called masked hypertension, meaning their high blood pressure readings are normal during health care visits but elevated when measured elsewhere.

The new study, conducted by a Canadian research team, enlisted more than 3,500 white-collar employees at three public institutions in Quebec. These institutions generally provide insurance services to the general population. Compared with colleagues who worked fewer than 35 hours a week:

Working 49 or more hours each week was linked to a 70% greater likelihood of having masked hypertension and 66% greater likelihood of having sustained hypertension - elevated blood pressure readings in and out of a clinical setting.

Working between 41 and 48 hours each week was linked to a 54% greater likelihood of having masked hypertension and 42% greater likelihood of having sustained hypertension.

The findings accounted for variables such as job strain, age, sex, education level, occupation, smoking status, body mass index and other health factors.

"Both masked and sustained high blood pressure are linked to higher cardiovascular disease risk," said lead study lead author Xavier Trudel, Ph.D., assistant professor in the social and preventive medicine department at Laval University in Quebec, Canada.

"The observed associations accounted for job strain, a work stressor defined as a combination of high work demands and low decision-making authority. However, other related stressors might have an impact," Trudel said. "Future research could examine whether family responsibilities - such as a worker's number of children, household duties and childcare role - might interact with work circumstances to explain high blood pressure."

The five-year study involved three waves of testing - in years one, three and five. To simulate in-clinic blood pressure readings, a trained assistant provided participants with a wearable monitor to check each participant's resting blood pressure three times in one morning. For the rest of the workday, the participant wore the blood pressure monitoring device, which took readings every 15 minutes - collecting a minimum of 20 additional measures for one day. Average resting readings at or above 140/90 mmHg, and average working readings at or above 135/85, were considered high.
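
Using the cutoffs stated above, the two hypertension categories can be sketched as a simple classification rule. This is a minimal illustration of the definitions only; real ambulatory-monitoring protocols involve many more readings and clinical judgment.

```python
# Minimal sketch of the classification logic described above, using the stated cutoffs.
def classify(resting_avg, working_avg):
    """resting_avg / working_avg are (systolic, diastolic) averages in mmHg."""
    resting_high = resting_avg[0] >= 140 or resting_avg[1] >= 90
    working_high = working_avg[0] >= 135 or working_avg[1] >= 85
    if resting_high and working_high:
        return "sustained hypertension"
    if working_high and not resting_high:
        return "masked hypertension"
    if resting_high and not working_high:
        return "white-coat pattern"      # not a focus of this study
    return "normotensive"

print(classify((128, 82), (142, 88)))    # masked hypertension
```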

In all, almost 19% of the workers had sustained hypertension, which included employees who were already taking high blood pressure medications. More than 13% of the workers had masked hypertension and were not receiving treatment for high blood pressure. "The link between long working hours and high blood pressure in the study was about the same for men as for women," Trudel said.

The study "did not include blue-collar workers (employees who are paid by the hour and perform manual labor work in positions such agriculture, manufacturing, construction, mining, maintenance or hospitality service), therefore, these findings may not reflect the impact on blood pressure of shift-work or positions with higher physical demands," the authors said. Other limitations include the study's measurement of blood pressure only during daytime hours, and the omission of hours worked outside participants' primary job.

The authors noted several strengths of the study, including its many volunteers; its accounting for multiple factors that can affect blood pressure; repeated testing over several years; the use of wearable monitors instead of relying on workers' reports of their blood pressure readings; and the use of the same monitors for all blood pressure measurements.

"People should be aware that long work hours might affect their heart health, and if they're working long hours, they should ask their doctors about checking their blood pressure over time with a wearable monitor," Trudel said. "Masked hypertension can affect someone for a long period of time and is associated, in the long term, with an increased risk of developing cardiovascular disease. We have previously shown that over five years, about 1 out of 5 people with masked hypertension never showed high blood pressure in a clinical setting, potentially delaying diagnosis and treatment."

Credit: 
American Heart Association

MAGIC system allows researchers to modulate the activity of genes acting in concert

Genomic research has unlocked the capability to edit the genomes of living cells; yet so far, the effects of such changes must be examined in isolation. In contrast, the complex traits that are of interest in both fundamental and applied research, such as those related to microbial biofuel production, involve many genes acting in concert. A newly developed system will now allow researchers to fine-tune the activity of multiple genes simultaneously.

Huimin Zhao, Steven L. Miller Chair Professor of Chemical and Biomolecular Engineering at the University of Illinois, led the study. Zhao and his research team described their new functional genomics system, which they named multi-functional genome-wide CRISPR (MAGIC), in a recent publication in Nature Communications.

"Using MAGIC, we can modulate almost all ~6000 genes in the entire yeast genome individually or in combination to various expression levels," Zhao said. Zhao leads an interdisciplinary research group at Illinois' Carl R. Woese Institute for Genomic Biology (IGB) that aims to develop sophisticated synthetic biology tools to support biological systems engineering; MAGIC is one of the latest steps in streamlining such work in yeast.

The C in MAGIC stands for CRISPR, the acronym that has come to stand for a type of molecular system used to edit DNA. The full name, Clustered Regularly Interspaced Short Palindromic Repeats, refers to DNA sequences that enable bacteria to protect themselves from viruses. Key sections of these sequences help specialized molecules produced by the bacteria to recognize and slice up viral genomes, effectively disabling them.

Researchers design their own DNA sequences that work within CRISPR systems to precisely edit the genomes of living things. The molecules originally borrowed from bacteria have been tweaked so that they can have one of several effects on the gene toward which they are targeted, either increasing, decreasing, or completely eliminating gene activity, according to the way that cuts in the genome are made and repaired.

Until now, though, there has been no easy way to use more than one of these editing modes simultaneously. Researchers could explore the effects of different changes but could not easily combine them, as if playing improv in a jazz trio in which only one instrument could be playing at any given time.

"We have developed the tri-functional CRISPR system which can be used to engineer the expression of specific genes to various expression levels," Zhao said. In other words, MAGIC allows researchers to bring two or all three instruments into the music session at once. When combined with the comprehensive "library" of custom DNA sequences created in Zhao's lab, his group can explore the effects of turning up, turning down, and turning off any combination of genes in the yeast genome simultaneously.

Exploring this genomic harmonizing, the synergistic effects of multiple simultaneous edits, will allow researchers to better understand and to enhance complex traits and behaviors of useful microorganisms. For example, Zhao's group used the MAGIC system to look for combinations of edits that helped their yeast strain tolerate the presence of furfural, a byproduct of cellulosic hydrolysates that can limit the survival and activity of yeast cells used for cellulosic biofuels production. The resulting engineered furfural tolerant yeast strain could produce more biofuels than the parent yeast strain in fermentation.

Zhao and his group introduced sequences from their MAGIC library into yeast and looked for yeast cells that could withstand high levels of furfural. They found that some of the surviving cells had taken in MAGIC sequences that altered the activity of genes known to be involved in tolerating furfural; the involvement of other genes was discovered for the first time in this experiment. The team was able to integrate one of these effective MAGIC sequences into the yeast genomic DNA and then test how further sequences might enhance tolerance.

"We were most excited about the ability of MAGIC to identify novel genetic determinants and their synergistic interactions in improving a complex phenotype [like furfural tolerance], particularly when these targets must be regulated to different expression levels," Zhao said. Because MAGIC allows researchers to examine how different genetic changes might work in combination to produce an effect, the new system can lead to clearer analyses of how different biological processes are involved in a trait.

Zhao said that among several technical challenges of the work was the development of a screening method that could be carried out efficiently at a large scale, a capability he hopes to expand to other scientific questions and other organisms.

"These challenges should be addressed in order to apply MAGIC to other eukaryotic systems such as industrial yeast strains and mammalian cells," he said.

Credit: 
Carl R. Woese Institute for Genomic Biology, University of Illinois at Urbana-Champaign

Astronomers reveal new image of candy cane-shaped feature in the center of our galaxy

image: The image color-codes different types of emission sources by merging microwave data (green, mapped by the Goddard-IRAM Superconducting 2-Millimeter Observer, or GISMO, instrument built at NASA's Goddard Space Flight Center) with infrared (blue) and radio observations (red). An area called the sickle may supply the particles responsible for setting the candy cane aglow.

Image: 
NASA's Goddard Space Flight Center

A team of astronomers has produced a new image of an arc-shaped object in the center of our Milky Way galaxy. The feature, which resembles a candy cane, is a magnetic structure that covers an enormous region of some 160 light-years. A light-year is the distance light travels in one year -- almost 6 trillion miles.
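
The "almost 6 trillion miles" figure is quick to verify with a one-line calculation, shown below as a sanity check rather than anything from the study itself.

```python
# Quick check of the "almost 6 trillion miles" figure for a light-year.
speed_of_light_miles_per_s = 186_282          # approximate
seconds_per_year = 365.25 * 24 * 3600
light_year_miles = speed_of_light_miles_per_s * seconds_per_year
print(f"{light_year_miles:.2e} miles")        # ~5.88e12, i.e. almost 6 trillion
```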

Mark Morris, a UCLA professor of physics and astronomy and a member of the research team, discovered the structure, also called the radio arc, with a former student, Farhad Yusef-Zadeh, back in 1983, but they did not have such a complete and colorful image of it then.

The new image shows the inner part of our galaxy, which houses the largest, densest collection of giant molecular clouds in the Milky Way. These vast, cool clouds contain enough dense gas and dust to form tens of millions of stars like the sun, Morris said.

In the image, blue and greenish-blue features reveal cold dust in molecular clouds where star formation is still in its infancy. Yellow features reveal the presence of ionized gas and show where hundreds of massive stars have recently formed. Red and orange regions show areas where high-energy electrons emit radiation by a process called "synchrotron emission," such as in the radio arc and Sagittarius A, the bright source at the galaxy's center that hosts its supermassive black hole.

Many of the universe's secrets are being revealed through the parts of the electromagnetic spectrum of light that are not visible to the human eye. The electromagnetic spectrum encompasses the complete range of light -- seen and unseen -- from gamma rays, X-rays and ultraviolet light on one end to infrared and radio waves on the other. In the middle is the small visible spectrum that includes the colors humans can detect with the unaided eye. Gamma rays have wavelengths billions of times smaller than those of visible light, while radio waves have wavelengths billions of times longer than those of visible light. Astronomers use the entire electromagnetic spectrum. In the study that led to the new image, the research team observed radio waves with a wavelength of 2 millimeters.

"The candy cane is a magnetic feature in which we can literally see the magnetic field lines illuminated by the radio emission," Morris said. "The new result revealed by this image is that one of the filaments is inferred to contain extremely high-energy electrons, the origin of which remains an interesting and unsettled issue."

The candy cane arc is part of a set of radio-emitting filaments extending 160 light-years. It is more than 100 light-years away from the central supermassive black hole. However, in another study recently, Morris and colleagues saw similar magnetic radio filaments that they believe are connected to the supermassive black hole, which may lead to important new ways to study black holes, he said.

To produce the new image, the astronomers used a NASA 2-millimeter camera instrument called GISMO, along with a 30-meter radio telescope located at Pico Veleta, Spain. They also took archival observations from the European Space Agency's Herschel satellite to model the infrared glow of cold dust. They added infrared data from the SCUBA-2 instrument at the James Clerk Maxwell Telescope near the summit of Maunakea, Hawaii, and radio observations from the National Science Foundation's Very Large Array, located near Socorro, New Mexico.

The team's research describing the composite image was published last month in The Astrophysical Journal.

Morris' research interests include the center of the Milky Way, star formation, massive stellar clusters, and red giant stars, which are dying stars in the last stages of stellar evolution.

Credit: 
University of California - Los Angeles

Artificial Intelligence can now predict long-term risks of heart attack and cardiac death

A new study in Cardiovascular Research finds that machine learning - in which computers use patterns and inference to learn to perform tasks - can predict the long-term risk of heart attack and cardiac death. Indeed, machine learning appears to be better at predicting heart attacks and cardiac deaths than the standard clinical risk assessment used by cardiologists.

Researchers studied subjects from the imaging arm of a prospective, randomized research trial who underwent coronary artery calcium scoring with available cardiac CT scans and long-term follow-up. Participants were asymptomatic, middle-aged subjects with cardiovascular risk factors but no known coronary artery disease.

Researchers used machine learning to assess the risk of myocardial infarction and cardiac death in the subjects, and then compared the predictions with the actual experiences of the subjects over fifteen years. Subjects here answered a questionnaire to identify cardiovascular risk factors and to describe their diets, exercise and marital status.

The final study consisted of 1,912 subjects, assessed fifteen years after they were first studied. During this follow-up, 76 subjects experienced myocardial infarction and/or cardiac death. The subjects' predicted machine learning scores aligned accurately with the actual distribution of observed events. The atherosclerotic cardiovascular disease risk score, the standard clinical risk assessment used by cardiologists, overestimated the risk of events in the higher-risk categories; machine learning did not. In unadjusted analysis, a high predicted machine learning risk was significantly associated with a higher risk of a cardiac event.
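
The general workflow - fitting a classifier to clinical risk factors plus an imaging measure such as the coronary artery calcium score - can be sketched as below. The features, synthetic data, and model choice are assumptions for illustration only; the published model and its validation are far more extensive.

```python
# Schematic only: a classifier trained on risk factors plus a calcium score to predict
# long-term cardiac events, using synthetic data (not the study's cohort or model).
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n = 1912
X = np.column_stack([
    rng.normal(55, 8, n),          # age
    rng.integers(0, 2, n),         # smoking status
    rng.normal(200, 35, n),        # total cholesterol
    rng.gamma(1.5, 80, n),         # hypothetical coronary artery calcium score
])
risk = 0.02 + 0.0002 * X[:, 3]                 # synthetic event probability rising with calcium score
y = rng.random(n) < np.clip(risk, 0, 1)        # a few percent of subjects have events, as in the cohort

clf = GradientBoostingClassifier(random_state=0)
print(cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean())
```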

"Our study showed that machine learning integration of clinical risk factors and imaging measures can accurately personalize the patient's risk of suffering an adverse event such as heart attack or cardiac death," said While machine learning models are sometimes regarded as "black boxes", we have also tried to demystify machine learning; in this manuscript, we describe individual predictions for two patients as examples. When applied after the scan, such individualized predictions can help guide recommendations for the patient, to decrease their risk of suffering an adverse cardiac event. "

Credit: 
Oxford University Press USA

Extending Medicare Part D rebates to beneficiaries would save seniors $29 billion over 7 years

A new assessment of the Medicare Part D program based on a proposal from the West Health Policy Center finds that Medicare beneficiaries would save $29 billion if drug manufacturer rebates were used to reduce their out-of-pocket costs at the pharmacy counter through the Part D benefit - as long as these rebate savings are not also used to reduce Part D manufacturer liability. In contrast, pharmaceutical companies would profit and taxpayers would face additional costs if manufacturer rebates were directly applied to insurers' pharmacy prices.

The independent actuarial firm Milliman conducted the assessment at the request of the non-profit, non-partisan West Health Policy Center.

"The West Health Policy Center commissioned this analysis to offer a clear path forward for policymakers to reduce Medicare cost sharing without further lining Big Pharma's pockets," said Timothy A. Lash, president of the West Health Policy Center. "We are committed to common-sense solutions to lower healthcare and prescription drug costs, which make lifesaving medicines out of reach for millions of Americans, including many seniors."

In Medicare Part D, drug manufacturer rebates are paid by manufacturers after the point of sale, generally to a pharmacy benefit manager (PBM), who shares a portion of the rebates with the health insurer. Under this structure, rebates reduce premiums rather than out-of-pocket costs to beneficiaries. For this analysis, Milliman modeled changes in recently proposed Senate legislation impacting Part D benefits, considering spending for the Medicare program, drug manufacturers and beneficiaries. At the West Health Policy Center's request, Milliman also modeled two alternate add-on scenarios not in the Senate proposal that change how drug manufacturer rebates are handled under Part D.

Milliman estimated that the Senate Finance Committee's Prescription Drug Pricing Reduction Act of 2019 (PDPRA), as originally drafted, would generate savings of $63 billion for the Medicare program and $4 billion for beneficiaries from 2022 to 2029. These savings would be financed by increased contributions from drug manufacturers, who would offer $67 billion in additional discounts under PDPRA. Milliman's analysis was completed before Senator Grassley announced revisions to PDPRA; however, the revisions were intended to maintain the same manufacturer contribution as the original PDPRA and are not expected to impact the directionality of Milliman's results.

The additional scenarios reflect Senate leadership and White House interest in using manufacturer rebates to reduce costs at the pharmacy counter. The first considers spending changes if manufacturer rebates were fully directed to the point of sale (POS), which would reduce total pharmacy reimbursement ("POS rebates"). The second directs the rebate only to beneficiaries at the pharmacy counter, using rebate dollars to lower their cost sharing ("beneficiary rebates"). This beneficiary rebate scenario is the basis for the $29 billion in savings to Medicare beneficiaries.

Under the POS rebate model, drug manufacturers would see $44 billion in higher revenues compared to revenues under PDPRA. This surprising result follows from the structure of the Medicare program. Manufacturers' contributions to Medicare only begin once pharmacy spending has reached a certain threshold, so applying rebates directly to pharmacy spending means fewer beneficiaries reach the threshold, which reduces manufacturers' contributions. While beneficiaries would see $19 billion in lower spending under this proposal, Medicare would make up the shortfall, spending an additional $63 billion in taxpayer dollars.
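
The mechanism described above can be shown with a toy calculation: if manufacturer liability only applies to spending above a threshold, applying the rebate at the pharmacy counter lowers the spending that counts toward that threshold, so fewer patients cross it. All numbers below are invented for illustration; they are not the actual Part D benefit parameters or Milliman's model.

```python
# Toy illustration with invented numbers of why point-of-sale rebates can reduce
# manufacturer contributions under a threshold-based benefit design.
THRESHOLD = 8_000          # hypothetical annual spending level where manufacturer liability begins
LIABILITY_SHARE = 0.20     # hypothetical manufacturer share of spending above the threshold

def manufacturer_liability(gross_drug_cost, rebate, rebate_at_pos):
    counted = gross_drug_cost - rebate if rebate_at_pos else gross_drug_cost
    return LIABILITY_SHARE * max(0, counted - THRESHOLD)

gross_cost, rebate = 10_000, 3_000
print(manufacturer_liability(gross_cost, rebate, rebate_at_pos=False))  # 400.0
print(manufacturer_liability(gross_cost, rebate, rebate_at_pos=True))   # 0.0 -> manufacturer contributes less
```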

"Drug manufacturers are the biggest winners under the POS rebate model, which they neglect to mention when lobbying for the policy," said Lash. "We wanted to see if there was a way to ensure beneficiaries could 'win' through better cost-sharing under a different rebate model."

The beneficiary rebate model maintains manufacturers' contributions to the program while sharing some of Medicare's savings with beneficiaries. Under this model, beneficiary cost-sharing would be based on the net price of a drug after rebates, and the Part D plan would make up the balance of any pharmacy reimbursement. The manufacturer contribution threshold, however, would be calculated based on the full, unrebated price of a drug, ensuring that manufacturers contribute appropriately for high-priced drugs. In this scenario, manufacturers would maintain the same $67 billion in contributions as under PDPRA, but beneficiaries would see $25 billion in lower spending. Although Medicare costs would increase compared to PDPRA, the program would still save $38 billion compared to the present rebate system, effectively spreading total cost reductions more evenly between Medicare and beneficiaries. Combined with the $4 billion in savings achieved under PDPRA, this would result in $29 billion in total savings to Medicare beneficiaries.

Milliman's findings are consistent with previous analyses by the Centers for Medicare & Medicaid Services Office of the Actuary and the Congressional Budget Office that found POS rebates would significantly increase Medicare spending while lowering costs for manufacturers. However, these analyses had not considered the effect of a beneficiary rebate model.

Credit: 
West Health Institute

Biodiversity has substantially changed in one of the largest Mediterranean wetlands

The Camargue in southern France is widely recognised as one of the largest and most biodiverse wetlands in the Mediterranean region.

Recent research has now shown that grasshoppers, crickets and locusts - together known as orthopterans - as well as dragonflies and amphibians have severely declined since the 1970s. This provides evidence of substantial deterioration of the Camargue ecosystem.

Many of the biodiversity changes had previously gone unnoticed, because there has been no systematic biodiversity monitoring for many of the species under study. Researchers now used a new and innovative method based on expert knowledge to identify changes in biodiversity in the study area.

During a period of seven months, the researchers managed to gather information on population trends and population abundance for almost 1400 species, and also presence and absence data for more than 1500 species for a total of eight different taxonomic groups.

"These include amphibians, reptiles, breeding birds, fish, mammals, dragonflies, orthopterans such as crickets and locusts, and vascular plants. This information has allowed us to detect some interesting patterns of change within a period of only 40 years", says post-doctoral researcher Sara Fraixedas, from the Helsinki Institute of Sustainability Science (HELSUS), University of Helsinki.

The new methodology has made possible the detection of declines in certain taxa that are associated with land-use changes in the area.

"Temporary ponds and grasslands reduced in surface area by around 60% between 1942 and 1984. These two habitats have declined the most in the Camargue, and they have been converted into farmland or industrial areas. Amphibians and dragonflies are closely linked to freshwater wetlands, whereas many orthopteran species are associated with grasslands. Therefore, it is likely that the severe degradation of the conservation status of these three groups is related to the loss of those habitats", adds Thomas Galewski, leader of the Biodiversity Monitoring and Assessment project at the Tour du Valat.

The study also shows that breeding birds and vascular plants have increased in abundance. Authors attribute the increase in bird populations to conservation and management actions related to the implementation of the EU Birds and Habitats Directives, as well as to the arrival of new bird species to the ecosystem.

However, many bird species have disappeared from the Camargue between the 1970s and the 2010s. Farmland birds in particular have shown a steep decline since the 1950s. The increase in vascular plants is linked to the introduction of new species, some of them highly invasive, into the study area.

Results suggest that the current protection measures in the Camargue area have failed to protect certain taxa. On the other hand, the observed increase in novel and invasive species and the patterns of increases and declines in different groups reveal important changes in the community structure of the studied taxonomic groups.

"This may be an indication that the Camargue has undergone significant changes with important implications for local ecosystem functioning", Sara Fraixedas says.

Credit: 
University of Helsinki

AI's future potential hinges on consensus: NAM report

The role of artificial intelligence, or machine learning, will be pivotal as the industry wrestles with a gargantuan amount of data that could improve -- or muddle -- health and cost priorities, according to a National Academy of Medicine Special Publication on the use of AI in health care.

Yet the current explosion of investment and development is happening without an underpinning consensus on responsible, transparent deployment, which could constrain the technology's potential.

The new report is designed to be a comprehensive reference for organizational leaders, health care professionals, data analysts, model developers and those who are working to integrate machine learning into health care, said Vanderbilt University Medical Center's Michael Matheny, MD, MS, MPH, Associate Professor in the Department of Biomedical Informatics, and co-editor of AI in Healthcare: The Hope, The Hype, The Promise, The Peril.

"It's critical for the health care community to learn from both the successes, but also the challenges and recent failures in use of these tools. We set out to catalog important examples in health care AI, highlight best practices around AI development and implementation, and offer key points that need to be discussed for consensus to be achieved on how to address them as an AI community and society," said Matheny.

Matheny underscores that the applications in health care look nothing like the mass-market imagery of self-driving cars that is often synonymous with machine learning or tech-driven systems.

For the immediate future, in health care, AI should be thought of as a tool to support and complement the decision-making of highly trained professionals in delivering care in collaboration with patients and their goals, Matheny said.

Recent advances in deep learning and related technologies have met with great success in imaging interpretation, such as radiology and retina exams, which has spurred a rush toward AI development that brought first venture capital funding and then industry giants. However, some of the tools have had problems with bias arising from the populations they were developed on, or from the choice of an inappropriate target. Data analysts and developers need to work toward increased data access and standardization, as well as thoughtful development, so algorithms aren't biased against already marginalized patients.

The editors hope this report can contribute to the dialog of patient inclusivity and fairness in the use of AI tools, and the need for careful development, implementation, and surveillance of them to optimize their chance of success, Matheny said.

Matheny along with Stanford University School of Medicine's Sonoo Thadaney Israni, MBA, and Mathematica Policy Research's Danielle Whicher, PhD, MS, penned an accompanying piece for JAMA Network about the watershed moment in which the industry finds itself.

"AI has the potential to revolutionize health care. However, as we move into a future supported by technology together, we must ensure high data quality standards, that equity and inclusivity are always prioritized, that transparency is use-case-specific, that new technologies are supported by appropriate and adequate education and training, and that all technologies are appropriately regulated and supported by specific and tailored legislation," the National Academy of Medicine wrote in a release.

"I want people to use this report as a foil to hone the national discourse on a few key areas including education, equity in AI, uses that support human cognition rather than replacing it, and separating out AI transparency into data, algorithmic, and performance transparency," said Matheny.

Credit: 
Vanderbilt University Medical Center

New classification system for tumors can guide diagnosis and treatment options for cancer

Based on the largest study of cancer patients of its kind, scientists have created a new way of classifying tumours. Clinicians can use genome sequencing to assign their patients' tumours to one of sixteen groups in the new classification system, ten of which provide important information for the diagnosis and treatment of the disease, like whether an individual will respond to immunotherapy.

Researchers at the Centre Nacional d'Anàlisi Genòmica, part of the Centre for Genomic Regulation in Barcelona, analysed the mutations found in 2,583 patients with 37 different types of cancer. They detected a total of 45 million mutations across all tumours, of which at least 1.2 million were non-unique mutations, meaning they were found in the same location for two or more cancer patients. With six billion potential sites in human DNA that can be mutated, the number of non-unique mutations is far higher than what's expected by chance alone. On average, 4% of the mutations in a tumour could also be found in one or more of the other tumours across the entire set of patients.
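
The claim that 1.2 million shared mutations is "far higher than what's expected by chance alone" can be checked with a back-of-envelope calculation. The sketch below assumes mutations land independently and uniformly across the genome - a simplistic model used only for illustration, not the paper's analysis - and still predicts several times fewer non-unique mutations than were observed.

```python
# Back-of-envelope check under a uniform-random model: if 45 million mutations landed
# independently across 6 billion sites, how many would share a site with another mutation?
import math

n_mutations = 45_000_000
n_sites = 6_000_000_000

lam = (n_mutations - 1) / n_sites             # expected number of *other* mutations at a given mutation's site
p_shared = 1 - math.exp(-lam)                 # probability a given mutation is "non-unique"
expected_non_unique = n_mutations * p_shared

print(f"{expected_non_unique:,.0f}")          # ~336,000 -- far below the ~1.2 million observed
```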

Further analyses showed these non-unique mutations were more likely to be found in certain types of primary tumours like skin cancer, oesophageal cancer and lymphoma. This suggests that causative agents of these cancers, such as UV light exposure or gastric reflux, damage DNA in a more predictable way than by chance alone. On the other end of the spectrum, researchers found very few non-unique mutations in lung cancer, liver cancer and kidney cancer, suggesting DNA damage by, for example, tobacco smoke exposure occurs more randomly compared to other groups.

Based on the number and type of non-unique mutations, researchers were able to classify the 2,583 primary tumours into one of sixteen groups, each of which has independent characteristics. Ten of these groups are clinically relevant, with the potential to help doctors make a more accurate diagnosis and select a more effective course of treatment. For example, the number and type of non-unique mutations that define one group are linked to tumours that are unable to correct a specific type of damage to their genetic code, resulting in their DNA becoming unstable. These patients are likely to respond well to immunotherapy, which would in turn allow them to forgo conventional chemotherapy and its side effects.

"Cancer is a complex disease that requires a bespoke course of action to diagnose, manage and treat effectively," says Ivo Gut, senior author of the study. "Currently doctors look for individual mutations at specific locations in DNA, which has a limited view. Using whole genome sequencing provides a complete overview of the number of mutations in a tumour, allowing doctors to classify the cancer type and gain deeper understanding of disease, which can have important implications for the way they treat their patients."

The findings also highlight other benefits for whole genome sequencing. "In a small percentage of patients, the origin of the cancer is unknown and the biopsy taken turns out to be from a metastasis instead of the primary tumour," says Miranda Stobbe, lead author of the study. "If conventional analyses conclude that it is a metastatic tumour, but they cannot determine its origin, doctors will have to start scanning the rest of the patient to try to find the primary source. In some cases, the primary may already be gone, because of the response of the immune system, or the primary is too small to be detected. Our classification would get around that by assigning the tumour to one of 16 groups, providing important information on where the tumour originates from."

Credit: 
Center for Genomic Regulation

Impact of methamphetamine use depends on your genes

New research led by La Trobe University in Australia has uncovered genetic clues which could explain why some people have more severe side effects from long-term methamphetamine use than others.

The research, published in Molecular Psychiatry, found that variations in the gene known as BDNF strongly determine the effects of methamphetamine in the brain. This could potentially explain why some users develop methamphetamine-induced psychosis, which is similar to schizophrenia.

La Trobe neuroscientist Professor Maarten van den Buuse said the research, conducted using animal models, may lead to ways of identifying individuals at particular risk of developing psychosis and could mean a fundamental change in the way the effects of drug-induced psychosis on the brain are treated.

"Drug-induced psychosis is generally treated with anti-psychotic medications, but these are not generally effective and are often associated with side effects," Professor van den Buuse said.

"If further research is able to provide more details on the role genetics plays in the effects of long-term methamphetamine use, we could begin looking at therapies that would make a real difference for people affected by it."

The research looked specifically into the impact of methamphetamine use in adolescence and early adulthood, which is often when long-term users begin taking the drug, Professor van den Buuse said.

Credit: 
La Trobe University

Fossil expands ancient fish family tree

image: Illustration of the newly described lungfish Isityumzi (lower right) and other Late Devonian freshwater ecosystem creatures, including an early tetrapod (Umzantsia), by South African earth sciences illustrator Maggie Newman.

Image: 
Artist's impression by Maggie Newman

A second ancient lungfish has been discovered in Africa, adding another piece to the jigsaw of evolving aquatic life forms more than 400 million years ago.

The new fossil lungfish genus (Isityumzi mlomomde) was found about 10,000km from a previous species described in Morocco, and is of interest because it existed in a high latitude (70 degrees south) or polar environment at the time.

Flinders University researcher Dr Alice Clement says the "scrappy" fossil remains, including tooth plates and scales, were found in the Famennian Witpoort Formation in the Eastern Cape of South Africa.

"This lungfish material is significant for a number of reasons," Dr Clement says.

"Firstly it represents the only Late Devonian lungfish known from Western Gondwana (when South America and Africa were one continent). During this period, about 372-359 million years ago, South Africa was situated next to the South Pole," she says.

"Secondly, the new taxa from the Waterloo Farm Formation seems to have lived in a thriving ecosystem, indicating this region was not as cold as the polar regions of today."

Dr Clement says the animal would still have been subject to long periods of winter darkness, very different from the freshwater habitats that lungfish live in today. Only six species of lungfish survive, found only in Africa, South America and Australia.

Isityumzi mlomomde means "a long-mouthed device for crushing" in isiXhosa, one of the official languages of South Africa.

Around 100 kinds of primitive lungfish (Dipnoi) evolved from the Early Devonian period more than 410 million years ago. More than 25 originated in Australia (Gondwana), and others are known to have lived in the temperate, tropical and subtropical waters of China and Morocco in the Northern Hemisphere.

Lungfish are a group of fish most closely related to all tetrapods - all terrestrial vertebrates including amphibians, reptiles, birds and mammals.

"In this way, a lungfish is more closely related to humans than it is to a goldfish!" says Dr Clement, who has been involved in naming three other new ancient lungfish.

Credit: 
Flinders University