Study suggests 3D face photos could be a sleep apnea screening tool

image: Facial features analyzed from 3D photographs could predict the likelihood of having obstructive sleep apnea, according to a study published in the April issue of the Journal of Clinical Sleep Medicine. Geodesic and Euclidean distances were determined between annotated landmarks.

Image: 
Journal of Clinical Sleep Medicine

DARIEN, IL – Facial features analyzed from 3D photographs could predict the likelihood of having obstructive sleep apnea, according to a study published in the April issue of the Journal of Clinical Sleep Medicine.

Using 3D photography, the study found that geodesic measurements — the shortest distance between two points on a curved surface — predicted with 89 percent accuracy which patients had sleep apnea. Using traditional 2D linear measurements alone, the algorithm’s accuracy was 86 percent.
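The difference between the two kinds of measurement can be illustrated on a simple curved surface. Below is a minimal sketch (an illustration only, not the study's code) comparing the geodesic arc between two points on a sphere-like surface with the straight-line Euclidean chord between the same points:

```python
import math

def geodesic_distance(r, theta1, theta2):
    """Shortest path along the curved surface: arc length = radius * central angle."""
    return r * abs(theta2 - theta1)

def euclidean_distance(r, theta1, theta2):
    """Straight-line chord between the same two surface points."""
    p1 = (r * math.cos(theta1), r * math.sin(theta1))
    p2 = (r * math.cos(theta2), r * math.sin(theta2))
    return math.dist(p1, p2)

# Two points 60 degrees apart on a curve of radius 10 cm (roughly head-sized)
r, a, b = 10.0, 0.0, math.radians(60)
geo = geodesic_distance(r, a, b)   # arc along the surface, ~10.47 cm
lin = euclidean_distance(r, a, b)  # straight chord, ~10.00 cm
print(f"geodesic: {geo:.2f} cm, euclidean: {lin:.2f} cm")
```

On a head-sized curve the two measures differ by a few percent; it is this extra surface-shape information that appears to give the geodesic measurements their edge in accuracy.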

“This application of the technique used predetermined landmarks on the face and neck,” said principal investigator Peter Eastwood, who holds a doctorate in respiratory and sleep physiology and is the director of the Centre for Sleep Science at the University of Western Australia (UWA). “Geodesic and linear distances between these landmarks were determined, and a linear discriminant algorithm was trained, tested and used to classify an individual as being at high or low risk of having obstructive sleep apnea.”

The study involved 300 individuals with varying severity levels of sleep apnea and 100 people without sleep apnea. These individuals came from a local hospital and from the Raine Study, a longitudinal cohort study in Western Australia. All underwent overnight sleep studies and took 3D photos with a craniofacial scanner system. Data were used to build a predictive algorithm that was tested on another patient set.
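The train-then-test workflow described above can be sketched with a toy linear discriminant classifier. Everything below is a hypothetical illustration — made-up feature values and a simplified diagonal-covariance discriminant, not the study's data or algorithm:

```python
# Toy linear discriminant: assign a face to the class whose mean is nearest
# under a shared (diagonal) covariance. A sketch of the idea only.

def fit(high_risk, low_risk):
    def mean(rows):
        n = len(rows)
        return [sum(r[i] for r in rows) / n for i in range(len(rows[0]))]
    m1, m0 = mean(high_risk), mean(low_risk)
    pooled = high_risk + low_risk
    mu = mean(pooled)
    # pooled per-feature variance (diagonal-covariance simplification)
    var = [sum((r[i] - mu[i]) ** 2 for r in pooled) / len(pooled)
           for i in range(len(mu))]
    return m1, m0, var

def classify(x, model):
    m1, m0, var = model
    d1 = sum((x[i] - m1[i]) ** 2 / var[i] for i in range(len(x)))
    d0 = sum((x[i] - m0[i]) ** 2 / var[i] for i in range(len(x)))
    return "high risk" if d1 < d0 else "low risk"

# Hypothetical training data: [neck width, jaw retrusion] in arbitrary units
osa  = [[42, 8], [44, 9], [43, 7], [45, 10]]
ctrl = [[36, 3], [35, 4], [37, 2], [36, 5]]
model = fit(osa, ctrl)
print(classify([44, 9], model))  # -> high risk
print(classify([35, 3], model))  # -> low risk
```

A real linear discriminant would use the full pooled covariance over many landmark distances; the sketch only shows the train-on-one-set, classify-new-cases structure.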

Eastwood worked with Syed Zulqarnain Gilani, a computer scientist at UWA, to identify the facial features most strongly associated with sleep apnea. These turned out to be neck width and the degree of retrusion of the lower jaw (retrognathia), but the study also uncovered other possible indicators.

“The data obtained from the present study indicate that other measurements such as width and length of the lower jaw, width of the face, and distance between the eyes also contribute to distinguishing individuals with and without OSA,” he said.

In a related commentary, also published in the April issue of JCSM, Drs. Ofer Jacobowitz and Stuart MacKay indicated that they see a bright future for 3D photography as a screening tool, potentially combined with data from a patient’s digital health tracker and health history.

“Certain wearable devices are already capable of measuring pulse oximetry and some provide oximetry variability analysis,” they wrote. “Likewise, the home of tomorrow will likely incorporate sensors in the bedroom which may gather physiological sleep data using optical, acoustic, infrared, ultrasonographic or other means.”

According to Eastwood, existing studies show a genetic predisposition to sleep apnea, and facial structure is a significant component of that predisposition, leading researchers to seek an accessible, affordable method of screening based on facial characteristics. Eastwood believes that 3D facial photography could become the first inexpensive, widely available screening tool for sleep apnea.

“OSA is a huge public health problem, and despite effective treatments being available, many with OSA are currently undiagnosed,” said Eastwood. “Therefore, simple, accurate screening tools are needed to predict those who have OSA.”

Credit: 
American Academy of Sleep Medicine

Common soil fungus could be ally in organic corn growers' fight against pests

image: Penn State researchers knew the Metarhizium robertsii fungus was deadly to insects such as caterpillars on the ground (victim shown here), but they became more interested when they learned it was taken up by plant roots. Turns out, the fungus boosts both plant growth and plant defenses against pests.

Image: 
Nick Sloff

A common soil fungus might be enlisted as a powerful partner by corn producers to suppress pests and promote plant growth, according to Penn State researchers, who suggest promoting the fungus could be an especially valuable strategy for organic growers who struggle with insect control.

These conclusions were reached after a study of fungus-insect-plant interactions in greenhouse- and lab-based settings. Researchers inoculated seeds of corn with spores of Metarhizium robertsii fungus and subsequently evaluated corn plants for fungal colonization of leaves and roots. They also measured plant height, chlorophyll content, above-ground biomass and relative growth rate of black cutworm.

"We saw that colonization of corn plants by the fungus M. robertsii promoted plant growth and boosted the expression of selected genes involved in plant defense in corn," said lead researcher Mary Barbercheck, professor of entomology. "The heightened defense response suppressed the growth rate of black cutworm larvae."

Barbercheck noted that her research group in the College of Agricultural Sciences long has been aware of this fungus and has been "just casually following it" since starting organic production studies in 2003. She knew the fungus was deadly to insects such as caterpillars on the ground but became more interested when other researchers showed it was taken up by plant roots.

"I wondered if there is a lot of this fungus out there, how it is surviving in the field and what it is doing, so I expanded my work to focus more on it," she said. "I happen to work in organic systems, and so that's where we've been studying it. But that doesn't mean that it is not beneficial in conventional crop systems, too."

Researchers recovered the fungus from 91% of corn plants grown from inoculated seeds, and they detected the fungus more frequently in roots than in leaves. Colonized plants were taller and had greater above-ground biomass than control plants. In feeding bioassays, the relative growth rate of black cutworm was lower on leaves from fungus-colonized plants than on leaves from control plants.

These findings, recently published in Biological Control, are important, Barbercheck explained, because they show corn growers -- especially organic corn growers -- that they can benefit from managing their fields to promote the fungus.

"Because this fungus appears to promote plant growth, to help control pests, and to alter the plant defense response to suppress at least some pest growth, we need to adjust our management practices to support it," she said. "Next, we have to learn how much of the fungus is in plants and what the natural infection level is. And if we make a seed treatment out of it, how effective would it be?"

There is a shortage of organically produced feed grains in the United States, Barbercheck noted. She said that could stem from many producers fearing that transitioning fields to organic is a risky proposition because they don't know how to manage insects without insecticides and how to manage weeds without herbicides. Assistance from the Metarhizium robertsii fungus might help reassure them.

"We need to see what natural processes we can manage to make that transition less risky to help make organic farming more widespread," she said. "On a commercial level, there are opportunities for growers to take advantage of organic markets because there is a big demand that we currently are not meeting with domestic supplies."

Credit: 
Penn State

Novel coronavirus detected, monitored in wastewater

image: Halden's technique boasts high sensitivity, with the potential to detect the signature of a single infected individual among 100 to 2 million persons. To accomplish this, wastewater samples are screened for the presence of nucleic acid fragments of the SARS-CoV-2 virus. The RNA genomes are amplified through a process known as reverse-transcriptase quantitative PCR (RT qPCR).

Image: 
Shireen Dooling

Within weeks of arriving on the world stage, SARS-CoV-2 has managed to encircle the globe, leaving illness, mortality and economic devastation in its vast wake. One of the central challenges facing health authorities and the medical community has been testing for the elusive virus on a sufficiently comprehensive scale.

A new approach to monitoring the novel coronavirus (as well as other dangerous pathogens and chemical agents) is being developed and refined. Known as wastewater-based epidemiology (WBE), the method mines sewage samples for vital clues about human health. It can potentially identify levels of coronavirus infection at both a local and global scale.

Ultimately, WBE holds the promise of near real-time monitoring of disease outbreaks, resistant microbes, levels of drug use or health indicators of diabetes, obesity and other maladies.

In a new study, ASU researchers Rolf Halden and Olga Hart analyze what can and cannot be measured when tracking SARS-CoV-2 in wastewater, and they highlight the economic advantages of the new approach over conventional disease testing and epidemiological surveillance.

"Our results show that exclusive reliance on testing of individuals is too slow, cost-prohibitive and in most places, impractical, given our current testing capacity," Halden says. "However, when preceded by population-wide screening of wastewater, the task becomes less daunting and more manageable."

Hart is the lead author of the new study and a researcher in the Biodesign Center for Health Engineering. Halden, who directs the center, also is a professor of engineering in the Fulton School of Sustainable Engineering and the Built Environment and author of the 2020 book Environment.

Their research appears in the current issue of the journal Science of the Total Environment.

A liter of water, an ocean of information

Wastewater-based epidemiology holds the potential to break the coronavirus testing logjam in many developed nations like the U.S., but could also be an invaluable tool for gathering health data in impoverished regions likely to bear the brunt of the pandemic.

Currently, the U.S. features the largest national and international WBE network and sample repository, known as the Human Health Observatory (HHO) at ASU. Recently, SARS-CoV-2 was added to the range of health indicators that the HHO has tracked continuously since May 2008.

Tracking a lethal menace

Halden's technique boasts high sensitivity, with the potential to detect the signature of a single infected individual among 100 to 2 million persons. To accomplish this, wastewater samples are screened for the presence of nucleic acid fragments of the SARS-CoV-2 virus. The RNA genomes are amplified through a process known as reverse-transcriptase quantitative PCR (RT qPCR).

The WBE strategy involves first transcribing coronavirus RNA into complementary DNA (cDNA) using the reverse transcriptase enzyme, then amplifying the resultant DNA to improve signal detection. Sequencing techniques are used to confirm viral presence in wastewater samples.
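The quantitative side of RT qPCR can be sketched numerically: each cycle roughly doubles the cDNA, so the cycle at which fluorescence crosses a detection threshold (the Ct value) reflects the starting copy number. A minimal sketch with made-up threshold and efficiency values:

```python
import math

def ct_value(initial_copies, threshold_copies, efficiency=1.0):
    """Cycles needed for initial_copies to reach threshold_copies,
    assuming each cycle multiplies the product by (1 + efficiency)."""
    return math.log(threshold_copies / initial_copies, 1 + efficiency)

# Hypothetical detection threshold of ~1e10 copies, perfect doubling
for copies in (10, 1_000, 100_000):
    print(f"{copies:>7} starting copies -> Ct = {ct_value(copies, 1e10):.1f}")
```

Fewer starting copies mean more cycles before the signal appears, which is what makes the assay quantitative rather than a simple yes/no test.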

"We can in one go monitor an entire community for presence of the new coronavirus," Hart said. "However, tradeoffs exist. To get the best results and avoid loss of information, we want to measure close to virus hotspots and take into account wastewater temperature and dilution when estimating the number of infected cases."

In the current study, researchers modeled wastewater samples in Tempe, Ariz., for the presence of the SARS-CoV-2 virus. Their work draws on computational analysis and modeling, and projections of past, present and future epidemic hotspots.

The research indicates that careful calibration must be carried out to ensure the accuracy of data, which is acutely sensitive to key variables including seasonal temperature, average in-sewer travel time, degradation rates of biomarkers, community demographics and per-person water use. (A companion paper by Halden and Hart examines the effects of these variables on WBE results in fine detail.)

Estimates based on European and North American data suggest that each person infected with SARS-CoV-2 will excrete millions if not billions of viral genomes into wastewater per day. This translates to between 0.15 and 141.5 million viral genomes per liter of wastewater generated.

With the aid of RT qPCR, researchers should be able to detect the novel coronavirus with high sensitivity, picking up one infected individual among roughly 114 people in the worst-case scenario and one positive case among 2 million non-infected individuals under optimum conditions.
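The dilution logic behind that sensitivity range can be sketched as a back-of-envelope calculation. The shedding range comes from the article; the per-person wastewater volume and assay detection limit below are assumed round numbers for illustration:

```python
# Back-of-envelope: how large a population can one shedder's signal survive?
shedding_low, shedding_high = 1e6, 1e9  # genomes excreted/person/day ("millions if not billions")
water_per_person = 300.0                # liters of wastewater/person/day (assumed)
lod = 10.0                              # assay limit of detection, genomes/liter (assumed)

def max_pool_size(genomes_per_day):
    """Largest population in which one shedder remains detectable:
    genomes / (population * water_per_person) >= lod."""
    return genomes_per_day / (water_per_person * lod)

print(f"weak shedder:   1 in {max_pool_size(shedding_low):,.0f} people")
print(f"strong shedder: 1 in {max_pool_size(shedding_high):,.0f} people")
```

With these assumptions one shedder stays detectable in pools of a few hundred to a few hundred thousand people; the study's fuller calibration widens this to the 114-to-2-million range.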

In addition to reducing transmission and fatality resulting from SARS-CoV-2 infection, improved population-wide data provides other societal benefits. By pinpointing viral hotspots, researchers will be able to better direct resources to protect vulnerable populations through social distancing measures, while easing restrictions in virus-free regions, minimizing economic and social disruption.

To accomplish that, Halden and his team have created OneWaterOneHealth, a nonprofit project of the ASU Foundation that seeks to bring COVID-19 testing to those who currently cannot afford it.

Halden said that should this approach be applied in the U.S., roughly 70% of the population could be screened for SARS-CoV-2 by monitoring the country's 15,014 wastewater treatment plants, at an estimated cost for chemical reagents of $225,000.
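Those figures imply a remarkably low unit cost, which a quick sanity check makes concrete (the U.S. population value below is an assumed round number; the plant count and reagent total come from the article):

```python
plants = 15_014         # U.S. wastewater treatment plants (from the article)
reagent_cost = 225_000  # estimated total reagent cost, USD (from the article)
population = 330e6      # approximate U.S. population (assumed)
coverage = 0.70         # fraction of population covered (from the article)

per_plant = reagent_cost / plants
per_person = reagent_cost / (population * coverage)
print(f"~${per_plant:.0f} per plant, ~${per_person:.4f} per person screened")
```

That works out to about $15 in reagents per plant and well under a cent per person screened, versus tens of dollars for a single clinical test.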

More fine-grained surveillance could be achieved by using WBE to identify regional or global hotspots for the virus, then applying targeted testing of individuals using clinical methods.

Credit: 
Arizona State University

Researchers are making recombinant-protein drugs cheaper

The mammalian cell lines that are engineered to produce high-value recombinant-protein drugs also produce unwanted proteins that push up the overall cost to manufacture these drugs. These same proteins can also lower drug quality. In a new paper in Nature Communications, researchers from the University of California San Diego and the Technical University of Denmark showed that their genome-editing techniques could eliminate up to 70 percent of the contaminating protein by mass in recombinant-protein drugs produced by the workhorses of mammalian cells -- Chinese hamster ovary (CHO) cells.

With the team's CRISPR-Cas mediated gene editing approach, the researchers demonstrate a significant decrease in purification demands across the mammalian cell lines they investigated. This work could lead to both lower production costs and higher quality drugs.

Recombinant proteins currently account for the majority of the top drugs by sales, including drugs for treating complex diseases ranging from arthritis to cancer and even combating infectious diseases such as COVID-19 by neutralizing antibodies. However, the cost of these drugs puts them out of reach of much of the world population. The high cost is due in part to the fact that they are produced in cultured cells in the laboratory. One of the major costs is purification of these drugs, which can account for up to 80 percent of the manufacturing costs.

In an international collaboration, researchers at the University of California San Diego and the Technical University of Denmark recently demonstrated the potential to protect the quality of recombinant protein drugs while substantially increasing their purity prior to purification, as reported in the study entitled "Multiplex secretome engineering enhances recombinant protein production and purity" published in April 2020 in the journal Nature Communications.

"Cells, such as Chinese hamster ovary (CHO) cells, are cultured and used to produce many leading drugs," explained Nathan E. Lewis, Associate Professor of Pediatrics and Bioengineering at the University of California San Diego, and Co-Director of the CHO Systems Biology Center at UC San Diego. "However, in addition to the medications we want, the cells also produce and secrete at least hundreds of their own proteins into the broth. The problem is that some of these proteins can degrade the quality of the drugs or could elicit negative side effects in a patient. That's why there are such strict rules for purification, since we want the safest and most effective medications possible."

These host cell proteins (HCPs) that are secreted are carefully removed from every batch of drug, but before they are removed, they can degrade the quality and potency of the drugs. The various steps of purification can remove or further damage the drugs.

"Already at an early stage of our research program, we wondered how many of these secreted contaminating host cell proteins could be removed," recounted Director Bjorn Voldborg, Head of the CHO Core facility at the Center for Biosustainability at the Technical University of Denmark.

In 2012 the Novo Nordisk Foundation awarded a large grant, which has funded ground-breaking work in genomics, systems biology and large scale genome editing for research and technology development of CHO cells at the Center for Biosustainability at the Danish Technical University (DTU) and the University of California San Diego. This funded the first publicly accessible genome sequences for CHO cells, and has provided a unique opportunity to combine synthetic and systems biology to rationally engineer CHO cells for biopharmaceutical production.

"Host cell proteins can be problematic if they pose a significant metabolic demand, degrade product quality, or are maintained throughout downstream purification," explained Stefan Kol, lead author on the study who performed this research while at DTU. "We hypothesized that with multiple rounds of CRISPR-Cas mediated gene editing, we could decrease host cell protein levels in a stepwise fashion. At this point, we did not expect to make a large impact on HCP secretion considering that there are thousands of individual HCPs that have been previously identified."

This work builds on promising computational work published earlier in 2020.

Researchers at UC San Diego had developed a computational model of recombinant protein production in CHO cells, published earlier this year in Nature Communications. Jahir Gutierrez, a former bioengineering Ph.D. student at UC San Diego, used this model to quantify the metabolic cost of producing each host cell protein in the CHO secretome, and with the help of Austin Chiang, a project scientist in the Department of Pediatrics at UC San Diego, showed that a relatively small number of secreted proteins account for the majority of the cell's energy and resources. Thus, the idea of eliminating the dominant contaminating proteins had the potential to free up a non-negligible amount of cellular resources and protect drug quality. The authors identified and removed 14 contaminating host-cell proteins in CHO cells. In doing this they eliminated up to 70 percent of the contaminating protein by mass and demonstrated a significant decrease in purification demands.

These modifications can be combined with additional advantageous genetic modifications being identified by the team in an effort to obtain higher quality medications at lower costs.

Credit: 
University of California - San Diego

New research finds cost transparency can increase sales 20%

Key takeaways from a new study in the INFORMS journal Marketing Science:

Cost transparency boosts sales when instituted voluntarily by a business, as opposed to involuntarily (i.e., when required by law).

Increased trust enhances consumers' willingness to purchase from businesses.

Cost transparency is associated with a 21% increase in the probability of purchasing an item.

CATONSVILLE, MD, April 23, 2020 - Businesses don't typically disclose information to consumers on how much it costs to produce a product. However, new research in the INFORMS journal Marketing Science provides evidence that doing so can increase consumers' purchase interest by more than 20%.

The study, "Lifting the Veil: The Benefits of Cost Transparency," conducted by Bhavya Mohan of the University of San Francisco and Ryan Buell and Leslie John of Harvard Business School, found that cost transparency can increase sales, but only when done voluntarily. They also found that cost transparency increases purchase interest even when prices are unexpectedly low or high.

"Even if prices aren't exactly what the customer might envision, the customer appreciates the act of cost disclosure," says Mohan, a professor in the marketing unit at the University of San Francisco.

"It's all about the psychology of disclosure and trust," said Buell, a professor in the technology and operations management unit at Harvard Business School. "Cost transparency represents an act of intimate disclosure and fosters trust. Heightened trust enhances consumers' willingness to purchase from a business."

The researchers conducted six experiments to illustrate the effects of cost transparency.

Cost transparency refers to the disclosure of the costs of producing a good or providing a service, and so conveys more sensitive information to consumers than operational transparency alone. It can also be risky, because it leaves the business vulnerable to negative consequences such as consumer ire or supplier price increases.

One experiment was a partnership with the dining services organization of a large university in the northeastern U.S., in which a month of lunchtime sales was studied. The organization revealed the costs of producing a bowl of chicken noodle soup, including the cost of each component and the total cost. Cost transparency was associated with a 21% increase in the probability of buying a bowl of soup, with the probability rising from 2.3% to 2.8% per customer.
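The 21% figure is a relative lift on a small baseline probability, as the arithmetic shows:

```python
# Purchase probability per customer, without and with cost transparency
baseline, transparent = 0.023, 0.028
relative_increase = (transparent - baseline) / baseline
print(f"{relative_increase:.1%}")  # a 0.5-point absolute change is a ~21.7% relative lift
```

A half-percentage-point absolute change sounds small, but against a 2.3% baseline it is a substantial relative effect.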

Another experiment looked at a private online retailer's sales of a leather wallet. For three of the wallet colors, the online product detail page included, among other information, an infographic showing the costs incurred to produce the wallet. The company mistakenly failed to use the graphic for two of the colors.

"We compared the daily sales between the wallet colors before and after the graphic was introduced over a 92-day period. The infographic increased sales of the wallets by 22%," said Buell. "These studies imply that the proactive revelation of costs can improve a company's bottom line."

Credit: 
Institute for Operations Research and the Management Sciences

RIT scientists develop first 3D mass estimate of microplastic pollution in Lake Erie

image: RIT scientists developed the first three-dimensional model to show where microplastic pollution is collecting in Lake Erie. This figure is the result of a half-year model simulation of particle count distribution in the lake's open water.

Image: 
RIT

Rochester Institute of Technology scientists have developed the first three-dimensional mass estimate to show where microplastic pollution is collecting in Lake Erie. The study examines nine different types of polymers that are believed to account for 75 percent of the world's plastic waste.

Plastic behaves differently in lakes than in oceans. While massive floating "islands" of accumulated plastic waste have been found in oceans, previous studies have indicated that the levels of plastic pollution found on the surface of Lake Erie are lower than expected based on how much is entering the water.

The new RIT estimate for the 3D mass--381 metric tons--is more than 50 times greater than the previous estimates at the surface. The study also generated the first estimate of how much plastic is deposited on the bottom of the lake. It accounts for the unique properties of different types of plastics and shows that the three polymers with the lowest density--polyethylene, polypropylene and expanded polystyrene--accumulate on the surface of the lake while the other six polymers were concentrated in the sediment.
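The surface-versus-sediment split follows directly from density relative to water. A minimal sketch using approximate textbook polymer densities (assumed typical values, not the study's parameters; fresh water is about 1.00 g/cm³):

```python
WATER_DENSITY = 1.00  # g/cm^3, fresh water (approximate)

# Approximate typical densities in g/cm^3 (textbook values, not study data)
polymers = {
    "expanded polystyrene": 0.05,
    "polypropylene": 0.90,
    "polyethylene": 0.95,
    "polystyrene (solid)": 1.05,
    "PET": 1.38,
    "PVC": 1.40,
}

floaters = [p for p, d in polymers.items() if d < WATER_DENSITY]
sinkers  = [p for p, d in polymers.items() if d >= WATER_DENSITY]
print("accumulate at the surface:", floaters)
print("concentrate in sediment:  ", sinkers)
```

Under these rough values, the three polymers predicted to float are the same three the study found accumulating at the lake surface.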

"Previously there was a focus on plastics modeled as neutrally buoyant for the most part in the beginning of plastics modeling," said Juliette Daily, a mathematical modeling Ph.D. student and author of the study. "In reality, plastic is probably almost never neutrally buoyant. It's probably always positively or negatively buoyant, which really changes how the particles behave."

The study shows other interesting patterns, such as plastic particles accumulating more heavily on the eastern shore of the lake, perhaps from the current moving predominantly west to east. This means that pollution could be pushed disproportionately to areas like Buffalo, N.Y. The authors hope other researchers will continue to build on this research and explore how factors like beaching can further explain where plastic particles end up.

"Trying to understand where plastic is going is important for people looking at mitigation or prevention and will be important for understanding what the most likely impacted areas are," said Matthew Hoffman, associate professor in the School of Mathematical Sciences and co-author of the paper. "Looking at things in the sediment or getting an idea of what is down in the lower levels of the lake will give us a better idea of what concentrations there are and what possible exposure levels are to this ecosystem."

Credit: 
Rochester Institute of Technology

In glowing colors: Seeing the spread of drug particles in a forensic lab

video: To understand how drug particles spread inside a forensic chemistry lab, NIST researchers fabricated a brick made of white flour mixed with a small amount of fluorescent powder. Under everyday lights the brick looked like evidence from a drug seizure, but under ultraviolet light--also called UV or black light--it glowed a bright orange. This is video number 1 of 3 in this series.

Image: 
Sisco, Staymates/NIST

When two scientists from the National Institute of Standards and Technology (NIST) brought black lights and glow powder into the Maryland State Police crime lab, they weren't setting up a laser tag studio or nightclub.

Instead, their aim was to study the way drug particles get spread around crime labs when analysts test suspected drug evidence. Their study, recently published in Forensic Chemistry, addresses safety concerns in an age of super-potent synthetic drugs like fentanyl, which can potentially be hazardous to chemists who handle them frequently.

The spread of drug particles cannot be completely avoided -- it is an inevitable result of the forensic analyses that crime labs must perform. To see how it happens, the two NIST research scientists, Edward Sisco and Matthew Staymates, fabricated a brick made of white flour mixed with a small amount of fluorescent powder. Under everyday lights the brick looked like evidence from a drug seizure, but under ultraviolet light -- also called UV or black light -- it glowed a bright orange.

Amber Burns, supervisor of the Maryland State Police forensic chemistry lab and a co-author of the study, examined the brick and its contents as she would real evidence. With a sheet of butcher paper covering her workspace, she cut open the package with a scalpel, scooped out a sample and transferred that scoop into a glass vial for analysis.

She also removed the powder to weigh it on a digital scale without the packaging. When she was done, the black light revealed that some particles had settled onto surfaces in her workspace. Some had also adhered to her gloves and were transferred by touch onto a marker and wash bottle.

All chemists clean their workspaces between cases to prevent evidence from one case from contaminating the next. After Burns discarded the butcher paper and cleaned her workspace, the black light showed that her cleanup routine was effective.

Before the emergence of fentanyl and other super-potent drugs, such small amounts of drug residue were not a major concern. But that has changed, and not only for reasons of workplace safety. Drug dealers often mix small amounts of fentanyl into heroin and cocaine, and some labs are increasing the sensitivity of their instruments to detect those small amounts. Highly sensitive instruments are more likely to detect small amounts of drug residue in the environment, so those labs have to be extra careful about limiting their spread.

This visualization experiment led the authors to suggest several steps that might minimize spread. These include changing gloves frequently, using vials and test tubes with large mouths to limit spillage when transferring material into them, and having two sets of wash bottles, one for casework and one for cleanup.

The researchers' paper is written in such a way that any laboratory can reproduce the black-light experiment.

"This is a great way for labs to see which of their practices contribute to the spread of drug residues, and to make sure that their cleanup routines are effective," Sisco said.

Credit: 
National Institute of Standards and Technology (NIST)

How to make the healthiest coffee during COVID-19 lockdown

Sophia Antipolis, 23 April 2020: We may all be drinking more coffee to help us survive the COVID-19 lockdown. Today scientists announce the healthiest way to make a brew.

The first study to examine links between coffee brewing methods and risks of heart attacks and death has concluded that filtered brew is safest. The research is published today in the European Journal of Preventive Cardiology, a journal of the European Society of Cardiology (ESC).

"Our study provides strong and convincing evidence of a link between coffee brewing methods, heart attacks and longevity," said study author Professor Dag S. Thelle of the University of Gothenburg, Sweden. "Unfiltered coffee contains substances which increase blood cholesterol. Using a filter removes these and makes heart attacks and premature death less likely."

Coffee is one of the most popular beverages worldwide and the most frequently used stimulant. Some 30 years ago Professor Thelle discovered that drinking coffee was linked with raised total cholesterol and the "bad" LDL cholesterol - to such an extent that it was likely to have detrimental consequences for heart health. Experiments identified the culprit substances in coffee and found that they could be removed using a filter. A cup of unfiltered coffee contains about 30 times the concentration of the lipid-raising substances compared to filtered coffee.

He said: "We wondered whether this effect on cholesterol would result in more heart attacks and death from heart disease. But it was unethical to do a trial randomising people to drink coffee or not. So we set up a large population study and several decades later we are reporting the results."

Between 1985 and 2003, the study enrolled a representative sample of the Norwegian population: 508,747 healthy men and women aged 20 to 79. Participants completed a questionnaire on the amount and type of coffee consumed. Data were also collected on variables that could influence both coffee consumption and heart disease, so that these could be accounted for in the analysis: for example, smoking, education, physical activity, height, weight, blood pressure, and cholesterol.

Participants were followed for an average of 20 years. A total of 46,341 participants died. Of those, 12,621 deaths were due to cardiovascular disease. Of the cardiovascular deaths, 6,202 were caused by a heart attack.

Overall, coffee drinking was not a dangerous habit. In fact, drinking filtered coffee was safer than drinking no coffee at all. Compared to no coffee, filtered brew was linked with a 15% reduced risk of death from any cause during follow-up. For death from cardiovascular disease, filtered brew was associated with a 12% decreased risk in men and a 20% lowered risk in women compared to no coffee. The lowest mortality was among consumers of 1 to 4 cups of filtered coffee per day.

Professor Thelle said: "The finding that those drinking the filtered beverage did a little better than those not drinking coffee at all could not be explained by any other variable such as age, gender, or lifestyle habits. So we think this observation is true."

Filtered brew was also less risky than the unfiltered beverage for death from any cause, death due to cardiovascular disease, and deaths from heart attacks. "Our analysis shows that this was partly because of the cholesterol-increasing effect of unfiltered coffee," said Professor Thelle.

Professor Thelle noted that unfiltered coffee did not raise the risk of death compared to abstaining from coffee - except in men aged 60 and above, where unfiltered brew was linked with elevated cardiovascular mortality.

He said: "We only had one measurement of coffee consumption, but we know that brewing habits were changing in Norway during the follow-up period. We believe that some women and younger men drinking unfiltered coffee switched to filtered, thereby reducing the strength of the association with cardiovascular mortality, whereas older men were less inclined to change their habits."

Professor Thelle emphasised that these are observational data, but that if public health authorities asked for his advice it would be: "For people who know they have high cholesterol levels and want to do something about it, stay away from unfiltered brew, including coffee made with a cafetière. For everyone else, drink your coffee with a clear conscience and go for filtered."

Credit: 
European Society of Cardiology

Protect health and social care workers and refer their deaths to the coroner, says The BMJ editor

All deaths of health and social care workers during the covid-19 pandemic should be referred to the coroner for independent review, says Dr Fiona Godlee, Editor in Chief of The BMJ today.

With many hundreds of deaths around the world, and over 100 reported in the UK, "it is impossible not to feel let down by political and healthcare leaders who, while sloganning, clapping for, and praising the NHS, have so evidently failed to protect those who work within it," she writes.

Her call echoes that of Professor John Robertson, Consultant Surgeon at the University of Nottingham and colleagues, who say "as this pandemic unfolds and we witness the deaths of our fellow healthcare professionals during active service and under controversial occupational conditions, there arises the inevitable question of whether the coroner should be involved?"

Writing in this week's journal, they say it is imperative that there is no further delay in providing every healthcare worker with effective PPE, and argue that, "until it is clear how much transmission is due to aerosol as well as droplet infection, surgical masks should not be considered effective protection."

They are also damning about the government's attempts to shift the blame for staff deaths onto community infection, and have no faith in the government's proposed investigation. "Without referring each death to the coroner, can we be confident that the circumstances of their employment have not resulted in these individuals paying the ultimate price through their daily work?"

Godlee argues that the UK government's response to this crisis "has been characterised from the beginning by complacency, arrogance and delay, worsened in subsequent weeks by broken promises about the supply of PPE, apparent ignorance of the situation on the frontline, and poorly explained and shifting guidance.

"So that we can learn for the future, honour the sacrifice, and seek compensation for families, all deaths of health and social care workers should be referred to the coroner for independent review," she concludes.

Credit: 
BMJ Group

Evidence suggests COVID-19 isn't sexually transmitted

COVID-19 is unlikely to be spread through semen, according to University of Utah Health scientists who participated in an international study of Chinese men who recently had the disease. The researchers found no evidence of the virus that causes COVID-19 in the semen or testes of the men.

The study was not comprehensive enough to fully rule out the possibility that the disease could be sexually transmitted. However, the chances of it occurring, based on this limited finding, appear to be remote.

"The fact that in this small, preliminary study that it appears the virus that causes COVID-19 doesn't show up in the testes or semen could be an important finding," says James M. Hotaling, M.D., a co-author of the study and a U of U Health associate professor of urology specializing in male fertility. "If a disease like COVID-19 were sexually transmittable that would have major implications for disease prevention and could have serious consequences for a man's long-term reproductive health."

The study appears in Fertility & Sterility, a peer-reviewed journal published by the American Society for Reproductive Medicine.

The international team of researchers from China and the United States launched the study in response to concerns that SARS-CoV-2, the virus that causes COVID-19, could be sexually transmitted like Ebola, Zika and other emerging viral pathogens. To find out, they collected semen samples from 34 Chinese men one month (on average) after they were diagnosed with mild to moderate cases of COVID-19. Laboratory tests did not detect SARS-CoV-2 in any of the semen samples.

But just because the virus wasn't present in the semen samples didn't necessarily mean it hadn't entered the testes, where sperm cells are formed.

"If the virus is in the testes but not the sperm it can't be sexually transmitted," says Jingtao Guo, Ph.D., a postdoctoral scientist at the Huntsman Cancer Institute at the University of Utah who also co-authored the study. "But if it is in the testes, it can cause long-term damage to semen and sperm production."

To sort this part of the puzzle out, the researchers analyzed a dataset generated from a single-cell mRNA atlas of healthy young organ donors that was available from prior work. This atlas allowed them to examine mRNA, the genetic material used to make proteins, in any single testicular cell. In this case, scientists used it to examine the expression of a pair of genes associated with SARS-CoV-2 entry: angiotensin-converting enzyme 2 (ACE2), the receptor the virus binds to, and transmembrane serine protease 2 (TMPRSS2), a protease that helps the virus penetrate cells and replicate. In order for the virus to access cells effectively, both proteins must be present in the same cell.
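The co-expression screen this kind of atlas enables can be sketched in a few lines. The gene names below are real, but the toy "cells" are invented purely for illustration; the actual study worked with a full single-cell mRNA dataset:

```python
# Minimal sketch of a co-expression screen: each cell is represented by
# the set of genes it expresses. The data here is hypothetical.
cells = [
    {"ACE2"},                 # expresses the receptor only
    {"TMPRSS2", "GAPDH"},     # expresses the protease only
    {"ACE2", "TMPRSS2"},      # double-positive: a potential viral entry point
    {"GAPDH"},                # expresses neither gene of interest
]

# SARS-CoV-2 needs both ACE2 (receptor) and TMPRSS2 (protease) in the
# same cell, so the screen keeps only cells containing both genes.
double_positive = [c for c in cells if {"ACE2", "TMPRSS2"} <= c]
print(len(double_positive))  # → 1
```

Applied to the real atlas, the same logic found only four double-positive cells out of roughly 6,500.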

When the scientists examined the dataset, they found that genes encoding these two proteins were only found in four of the 6,500 testicular cells, suggesting that SARS-CoV-2 is unlikely to invade human testicular cells, Guo says.

Despite these findings, the researchers acknowledge that their study has several important limitations including a small sample size and the fact that none of the donors had been severely ill with COVID-19.

"It could be that a man who is critically ill with COVID-19 might have a higher viral load, which could lead to a greater likelihood of infecting the semen. We just don't have the answer to that right now," Hotaling says. "But knowing that we didn't find that kind of activity among the patients in this study who were recovering from mild to moderate forms of the disease is reassuring."

However, Hotaling warns that intimate contact can still increase the risk of spreading the disease through coughing, sneezing and kissing. In addition, some infected people are asymptomatic and can appear healthy, even as they pass the virus along to others.

Credit: 
University of Utah Health

USGS releases first-ever comprehensive geologic map of the Moon

video: This animation shows a rotating globe of the new Unified Geologic Map of the Moon with shaded topography from the Lunar Orbiter Laser Altimeter (LOLA). This geologic map is a synthesis of six Apollo-era regional geologic maps, updated based on data from recent satellite missions. It will serve as a reference for lunar science and future human missions to the Moon.

Image: 
NASA/GSFC/USGS

FLAGSTAFF, Ariz. - Have you ever wondered what kind of rocks make up those bright and dark splotches on the moon? Well, the USGS has just released a new authoritative map to help explain the 4.5-billion-year-old history of our nearest neighbor in space.

For the first time, the entire lunar surface has been completely mapped and uniformly classified by scientists from the USGS, in collaboration with NASA and the Lunar and Planetary Institute.

The lunar map, called the "Unified Geologic Map of the Moon," will serve as the definitive blueprint of the moon's surface geology for future human missions and will be invaluable for the international scientific community, educators and the public-at-large. The digital map is available online now and shows the moon's geology in incredible detail (1:5,000,000 scale).

"People have always been fascinated by the moon and when we might return," said current USGS Director and former NASA astronaut Jim Reilly. "So, it's wonderful to see USGS create a resource that can help NASA with their planning for future missions."

To create the new digital map, scientists used information from six Apollo-era regional maps along with updated information from recent satellite missions to the moon. The existing historical maps were redrawn to align them with the modern data sets, thus preserving previous observations and interpretations. Along with merging new and old data, USGS researchers also developed a unified description of the stratigraphy, or rock layers, of the moon. This resolved issues from previous maps where rock names, descriptions and ages were sometimes inconsistent.

"This map is a culmination of a decades-long project," said Corey Fortezzo, USGS geologist and lead author. "It provides vital information for new scientific studies by connecting the exploration of specific sites on the moon with the rest of the lunar surface."

Elevation data for the moon's equatorial region came from stereo observations collected by the Terrain Camera on the recent SELENE (Selenological and Engineering Explorer) mission led by JAXA, the Japan Aerospace Exploration Agency. Topography for the north and south poles was supplemented with NASA's Lunar Orbiter Laser Altimeter data.

Credit: 
U.S. Geological Survey

Toward a more energy-efficient spintronics

image: (Top) Example of the system developed: a ferromagnetic material that can generate a spin current and inject it into an interface material, in which it is converted into a charge current. Traditionally, in order to change the sign of the charge current produced, the magnetization of the ferromagnetic material must be reversed by applying a magnetic field or a powerful current. Here this is produced by reversing the polarisation of the ferroelectric material using an electric field. (Bottom) Experimental curve showing the evolution of the charge produced as a function of the voltage applied to the ferroelectric material.

Image: 
CNRS/Thales and Spintec (CNRS/CEA/Université Grenoble Alpes)

Electron spin--a fundamentally quantum property--is central to spintronics, a technology that revolutionized data storage[1] and that could play a major role in creating new computer processors. In order to generate and detect spin currents, spintronics traditionally uses ferromagnetic materials whose magnetization switching consumes large amounts of energy. In the April 22, 2020 issue of Nature, researchers at the Spintec Laboratory (CNRS/CEA/Université Grenoble Alpes) and the CNRS/Thales Laboratory presented an approach that can detect spin information at low power using a non-magnetic system. Their research opens the way towards spintronic devices that operate on ferroelectricity rather than on ferromagnetism, thereby consuming 1,000 times less energy.

Credit: 
CNRS

Finding genetic ripple effects in a single-cell environment

image: These images from a study published by Nature show gene heat maps and graphs that help illustrate the links researchers identified between gene expression and biological processes in the immune disease severe congenital neutropenia (SCN). The single-cell analyses in the study create a new platform for researchers and clinicians to study the single-cell genomics of different diseases, with the potential to make genetic-based clinical diagnoses more precise and effective.

Image: 
Cincinnati Children's

CINCINNATI – Although advances in genetics and genomics reveal numerous disease-associated gene mutations, physicians and researchers still wrestle with the tricky challenge of linking those mutations to actual disease-causing processes.

Add this to the growing use of single-cell biology, which detects gene and/or protein expression in each cell at a given moment, and the task becomes even more daunting.

Now, researchers at Cincinnati Children's Hospital Medical Center report in the journal Nature that they have developed a molecular workflow that leverages single-cell methods to understand the molecular pathways associated with specific patient gene mutations. Scientists first identified a gene mutation that causes the blood disease severe congenital neutropenia (SCN) in children. The mutation was found to block the stages of blood cell development that lead to the formation of neutrophils.

The study creates a new platform for researchers and clinicians to study the single-cell genomics of a variety of different diseases, which potentially could make genetic-based diagnoses in the clinic more precise and effective, according to the study's principal investigator, H. Leighton Grimes, PhD, in the Division of Immunobiology at Cincinnati Children's.

"Other than some already well-known causative mutations, determining the difference between important and unimportant DNA changes can be tedious and difficult," Grimes said. "We can already sequence a child's genome and link a DNA sequence difference to the disease. However, determining which of these DNA changes is a real disease-causing mutation is difficult, but critical to understanding the molecular mechanisms of a disease, and driving toward a cure."

Tracking Genetic Mischief

The researchers focused on mutations involving the gene Growth Factor Independent-1 (GFI1). Children with SCN only have immature blood cells in their immune systems, including immature neutrophils. This puts them at high risk for infection and disease.

Researchers created genetic models of SCN in mice and human neutrophil cells, which confirmed the disease-causing effect of GFI1 mutations and allowed scientists to determine how the mutant GFI1 affects neutrophil blood cells at each stage of the cell development cycle. That new data allowed them to successfully treat and partially rescue neutrophil development in their SCN models.

Although genetic manipulations rescued the impact of the GFI1 mutations on neutrophil specification (an initial step of differentiation), the resulting cells were still defective in commitment (a later stage of differentiation where effector functions are programmed). That meant the rescued cells were still unable to function fully as part of the innate immune system, the body's first line of defense against infection.

The finding underscores the importance of treating each cell type impacted by a disease-causing mutation. The new molecular workflow system is expected to help facilitate strategic targeting for therapeutic intervention.

Blending Biology and Supercomputers

The work began within the Division of Human Genetics, using a clinical assay that re-sequenced the genomes of cells from 225 children with SCN. The researchers found DNA sequence alterations in the GFI1 gene. Although some mutations in GFI1 are known to cause neutropenia, many of the GFI1 DNA sequence changes were of unknown clinical significance.

Next, the research team combined computational bioinformatics (led by co-corresponding author Nathan Salomonis, PhD) with biological experimentation in the laboratory.

To determine whether any of the newly discovered DNA sequence changes in GFI1 could cause neutropenia, researchers created two genetic models: they genetically introduced the SCN patients' GFI1 mutations into the mouse genome and into human induced pluripotent stem cells, which can be induced to generate human neutrophil cells.

As the neutrophil cells developed, investigators captured their expressed genes, proteomes and other molecular components in normal and mutant human cells and mice through each stage of the cells' development cycle. They next employed a new informatics workflow called cellHarmony. This allowed them to detail and compare the downstream target genes and molecular activities of normal and mutant cells at each stage of neutrophil development.

In human cells and mice with mutant GFI1, the mutation triggered a cascade of molecular dysfunction that blocked the development of immature neutrophils. This confirmed the disease-causing effect of the GFI1 gene DNA sequence alterations.

That dysfunction was propelled in part by defects in chromatin in the developing mutant cells' nuclei--the part of the cell that contains its DNA. The chromatin in mutant cells remained open and subject to DNA alterations throughout subsequent development stages. This helped scientists identify the cell states most affected by the mutant gene, notably those encompassing neutrophil specification.

The new data allowed researchers to genetically repair neutrophil specification, which in turn controls the number of neutrophils. But repairing neutrophil specification did not automatically repair neutrophil commitment, the development stage where the neutrophils' innate immune function is programmed.

Children with SCN are regularly treated with a cytokine called granulocyte colony-stimulating factor (G-CSF), a therapy that rescues the numbers of neutrophils produced. But those same children often need to be treated with antibiotics and antifungals to battle various infections.

Grimes said this raises the question of whether cytokine treatment completely repairs the defect, especially since children receiving cytokine therapy for SCN require additional treatment to protect them from recurrent infections. When the researchers analyzed the innate immune function of neutrophils from cytokine-treated SCN patients, they discovered that, similar to the mouse cells in their study, the cells did not undergo normal commitment and were defective in innate immune function.

Future work is now focused on understanding the steps of neutrophil commitment, and how developing neutrophils program their chromatin and gene expression to fight bacterial and fungal infections.

Credit: 
Cincinnati Children's Hospital Medical Center

UCLA scientists invent nanoparticle that could improve treatment for bone defects

A team of biomaterials scientists and dentists at the UCLA School of Dentistry has developed a nanoparticle that, based on initial experiments in animals, could improve treatment for bone defects.

A paper describing the advance is published today in the journal Science Advances.

Bone defects, which can be caused by traumatic injury, infection, osteoporosis or the removal of tumors, are difficult for orthopedic surgeons to treat. And the need for bone grafts is becoming more common thanks in part to our aging population: Bone injuries are particularly prevalent among the elderly.

Today, the standard treatment for bone defects is a bone graft, which involves transplanting healthy bone from another part of the body to repair the damaged area. However, the procedure can cause complications, including infections where the transplanted bone is taken from, bleeding and nerve damage.

So the researchers turned their attention to liposomes, tiny spherical sacs that are derived from naturally existing lipids. Liposomes have been used since the 1990s to treat cancer and infectious diseases, and more recently they are being explored for their possible use in bone tissue engineering. They can be used to administer nutrients and pharmaceutical drugs in the body and can easily enter cells to administer their valuable cargo, but they do have some drawbacks: They are physically unstable and it can be difficult to control how and when they release drugs.

To help improve their stability and enhance their ability to form bone in the body, the UCLA researchers developed a new type of liposome called a sterosome. (The name is inspired by the fact that they contain a high concentration of steroids.)

To produce the sterosomes, the scientists replaced cholesterol, an important component of liposomes, with oxysterol, an oxidized derivative of cholesterol that has a key role in skeletal development and bone healing. In tests using mice with bone defects, the researchers found that the sterosomes successfully activated bone regeneration on their own, without needing therapeutic drugs.

"Liposomes are generally made from pharmacologically inactive substances," said Min Lee, the paper's corresponding author and a professor of biomaterials science at the dental school. "Including oxysterol into our liposomal formulation not only increased nanoparticle stability but also stimulated cells to develop into bone-forming cells."

In a second phase of the study, the researchers wanted to see how they could make the sterosome even more effective.

They added their sterosome nanoparticle to a tissue engineering scaffold -- a structure often used to move and grow naturally occurring stem cells, which is matched to the site of the defect and is used during bone graft procedures. They loaded the sterosomes with a bone-building drug called purmorphamine. Next, they immobilized the drug-loaded sterosome onto a scaffold to ensure that the sterosomes stayed concentrated in the defective areas and released the drugs where they were most needed for as long as possible.

In a six-week study using mice with bone defects in their skulls, the researchers saw an average reduction of roughly 50% in the size of the defects after the drug-loaded scaffold was implanted.

"By using our nanoparticle, which we found has intrinsic bone-forming capabilities, along with the addition of therapeutic drugs, we were able to speed up the bone regeneration process," Lee said. "Our nanoparticle-packaged drugs will be useful in many clinical situations where bone grafting is required to treat non-healing skeletal defects and related bone pathologies."

Dr. Paul H. Krebsbach, professor of periodontics and dean of the dental school, said, "The research led by Min Lee and his team demonstrates that UCLA Dentistry's research endeavors go well beyond treating the diseases of the oral cavity, and their findings have wider implications for treating bone defects throughout the entire body."

Credit: 
University of California - Los Angeles

Tectonic plates started shifting earlier than previously thought

image: A geologic map of the Pilbara Craton in Western Australia. The rocks exposed here range from 2.5 to 3.5 billion years ago, offering a uniquely well-preserved window into Earth's deep past. The authors of the study spent two field seasons in the Pilbara sampling lavas (shown in green shades) dated to 3.2 billion years ago. For scale, the image is about 500 kilometers across, covering approximately the same area as the state of Pennsylvania.

Image: 
Alec Brenner, Harvard University. Map data from the Geological Survey of Western Australia.

An enduring question in geology is when Earth's tectonic plates began pushing and pulling in a process that helped the planet evolve and shaped its continents into the ones that exist today. Some researchers theorize it happened around four billion years ago, while others think it was closer to one billion.

A research team led by Harvard researchers looked for clues in ancient rocks (older than 3 billion years) from Australia and South Africa, and found that these plates were moving at least 3.2 billion years ago on the early Earth. In a portion of the Pilbara Craton in Western Australia, one of the oldest pieces of the Earth's crust, scientists found a latitudinal drift of about 2.5 centimeters a year, and dated the motion to 3.2 billion years ago.

The researchers believe this shift is the earliest proof that modern-like plate motion happened between two and four billion years ago. It adds to growing evidence that tectonic movement occurred on the early Earth. The findings are published in Science Advances.

"Basically, this is one piece of geological evidence to extend the record of plate tectonics on Earth farther back in Earth history," said Alec Brenner, one of the paper's lead authors and a member of Harvard's Paleomagnetics Lab. "Based on the evidence we found, it looks like plate tectonics is a much more likely process to have occurred on the early Earth and that argues for an Earth that looks a lot more similar to today's than a lot of people think."

Plate tectonics is key to the evolution of life and the development of the planet. Today, the Earth's outer shell consists of about 15 rigid blocks of crust. On them sit the planet's continents and oceans. The movement of these plates shaped the location of the continents. It helped form new ones and it created unique landforms like mountain ranges. It also exposed new rocks to the atmosphere, which led to chemical reactions that stabilized Earth's surface temperature over billions of years. A stable climate is crucial to the evolution of life.

When the first shifts occurred has long been an issue of considerable debate in geology. Any information that sheds light on it is valuable. The study, published on Earth Day, helps fill in some of the gaps. It also loosely suggests the earliest forms of life developed in a more moderate environment.

"We're trying to understand the geophysical principles that drive the Earth," said Roger Fu, one of the paper's lead authors and an assistant professor of earth and planetary sciences in the Faculty of Arts and Sciences. "Plate tectonics cycles elements that are necessary for life into the Earth and out of it."

Plate tectonics helps planetary scientists understand worlds beyond this one, too.

"Currently, Earth is the only known planetary body that has robustly established plate tectonics of any kind," said Brenner, a third-year graduate student in the Graduate School of Arts and Sciences. "It really behooves us as we search for planets in other solar systems to understand the whole set of processes that led to plate tectonics on Earth and what driving forces transpired to initiate it. That hopefully would give us a sense of how easy it is for plate tectonics to happen on other worlds, especially given all the linkages between plate tectonics, the evolution of life and the stabilization of climate."

For the study, members of the project traveled to Pilbara Craton in Western Australia. A craton is a primordial, thick, and very stable piece of crust. They are usually found in the middle of tectonic plates and are the ancient hearts of the Earth's continents.

This makes them the natural place to go to study the early Earth. The Pilbara Craton stretches about 300 miles across, covering approximately the same area as the state of Pennsylvania. Rocks there formed as early as 3.5 billion years ago.

In 2017, Fu and Brenner took samples from a portion called the Honeyeater Basalt. They drilled into the rocks there and collected core samples about an inch wide.

They brought the samples back to Fu's lab in Cambridge, where they placed the samples into magnetometers and demagnetizing equipment. These instruments told them the rock's magnetic history. The oldest, most stable bit of that history is hopefully when the rock formed. In this case, it was 3.2 billion years ago.

The team then used their data and data from other researchers, who've demagnetized rocks in nearby areas, to date when the rocks shifted from one point to another. They found a drift of 2.5 centimeters a year.
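The reported rate can be sanity-checked with back-of-the-envelope arithmetic: one degree of latitude spans roughly 111 kilometers, so a paleolatitude shift recorded over a known time interval converts directly into centimeters per year. The specific interval and latitude change below are illustrative numbers chosen to match the reported rate, not values from the paper:

```python
KM_PER_DEG_LAT = 111.0  # approximate length of one degree of latitude

def drift_cm_per_year(delta_lat_deg: float, interval_years: float) -> float:
    """Convert a change in paleolatitude over a time interval into cm/yr."""
    km = delta_lat_deg * KM_PER_DEG_LAT
    return km * 1e5 / interval_years  # 1 km = 1e5 cm

# Illustrative: a drift of ~0.225 degrees of latitude sustained over
# one million years works out to about 2.5 cm per year.
rate = drift_cm_per_year(0.225, 1_000_000)
print(round(rate, 1))  # → 2.5
```

At that pace, the motion is comparable to how fast fingernails grow, yet over geologic time it carries crust across entire hemispheres.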

Fu and Brenner's work differs from most studies because the scientists focused on measuring the position of the rocks over time while other work tends to focus on chemical structures in the rocks that suggest tectonic movement.

Researchers used the novel Quantum Diamond Microscope to confirm their findings from the 3.2-billion-year-old samples. The microscope images the magnetic fields and particles of a sample. It was developed in a collaboration between researchers at Harvard and MIT.

In the paper, the researchers point out they weren't able to rule out a phenomenon called "true polar wander." It can also cause the Earth's surface to shift. Their results lean more towards plate tectonic motion because of the time interval of this geological movement.

Fu and Brenner plan to keep analyzing data from the Pilbara Craton and other samples from around the world in future experiments. A love of the outdoors drives both of them, and so does an academic need to understand the Earth's planetary history.

"This is part of our heritage," Brenner said.

Credit: 
Harvard University