Tech

How success breeds success in the sciences

image: Berkeley Haas Assistant Professor Mathijs De Vaan.

Image: 
UC Berkeley Haas

A small number of scientists stand at the top of their fields, commanding the lion's share of research funding, awards, citations, and prestigious academic appointments. But are they better and smarter than their peers? Or is this a classic example of success breeding success--a phenomenon known as the "Matthew effect"?

Mathijs De Vaan, an assistant professor in the Haas Management of Organizations Group, believes it's clearly the latter. In a paper published this week in Proceedings of the National Academy of Sciences, "The Matthew Effect in Science Funding," De Vaan presents the results of a study of Dutch research grants that shows precisely how much of an advantage early achievement confers, and identifies the reasons behind the boost. De Vaan, who came to Haas in 2015 after earning a PhD in sociology from Columbia University, co-authored the paper with Thijs Bol of the University of Amsterdam and Arnout van de Rijt of Utrecht University.

"To those who have, more will be given"

The term "Matthew effect" was coined by sociologist Robert Merton in the 1960s to describe how eminent scientists get more recognition for their work than less-well-known researchers--the reference is to the New Testament parable that, to those who have, more will be given. Previous attempts to study this phenomenon have yielded inconclusive results, in part because it is hard to prove that differences in achievement don't reflect differences in work quality.

To get around the quality question, De Vaan and his co-authors took advantage of special features of the main science funding organization in the Netherlands, IRIS, which awards grants based on a point system. Everyone whose application scores above the point threshold gets money, while everyone below is left out. The authors zeroed in on researchers who came in just above and just below the funding threshold, assuming that, for practical purposes, their applications were equal in quality.
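
The logic of this comparison resembles a regression discontinuity design: applicants just above and just below the cutoff are treated as comparable, so any gap in later outcomes can be attributed to winning the grant itself. A minimal sketch of that comparison, using entirely hypothetical data and column names rather than the study's actual IRIS records:

```python
# Hypothetical illustration of a threshold (regression discontinuity) comparison;
# the scores and funding amounts below are invented, not data from the study.
import pandas as pd

applicants = pd.DataFrame({
    "score_vs_cutoff": [-0.4, -0.2, -0.1, 0.1, 0.2, 0.3],  # points relative to the funding cutoff
    "later_funding_eur": [90_000, 110_000, 100_000, 230_000, 250_000, 210_000],
})

bandwidth = 0.5  # restrict attention to applicants close to the cutoff
near = applicants[applicants["score_vs_cutoff"].abs() <= bandwidth]

just_below = near.loc[near["score_vs_cutoff"] < 0, "later_funding_eur"].mean()
just_above = near.loc[near["score_vs_cutoff"] >= 0, "later_funding_eur"].mean()

# If applications near the cutoff are essentially equal in quality, this gap
# reflects the effect of winning rather than differences in merit.
print(f"just above: {just_above:,.0f}  just below: {just_below:,.0f}")
```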

First off, they found the benefits of winning an early-career grant were enormous. Recent PhDs who scored just above the funding threshold later received more than twice as much research money as their counterparts who scored immediately below the threshold. The winners also had a 47 percent greater chance of eventually landing a full professorship. "Even though the differences between individuals were virtually zero, over time a giant gap in success became evident," De Vaan notes.

Status and participation

De Vaan says that two main mechanisms may explain the Matthew effect in science funding. First, winners achieve status that can tilt the playing field in their direction when it comes to funding, awards, and job opportunities. The second is participation, meaning that successful applicants continue seeking grant money, while unsuccessful applicants often give up, withdrawing from future competition.

De Vaan and his coauthors argue that the Matthew effect erodes the quality of scientific research because projects tend to get funded based on an applicant's status, not merit. Groundbreaking work may not get done because the researchers are unknown or too discouraged to compete for funds. They recommend several reforms to the funding process, including limiting the information grant reviewers see about applicants' previous awards. They also suggest that rejected applicants be told their scores, which might encourage those just below the threshold to try again.

These findings may apply in many areas beyond science. For example, the Matthew effect may also widen a gulf between winning and losing entrepreneurs in the race for venture capital. Even the Academy Awards may favor big movie industry names over lesser-known talent. "There are a lot of social settings with large amounts of inequality, which could be ripe for the study of the Matthew effect," De Vaan stresses.

Credit: 
University of California - Berkeley Haas School of Business

Too liberal use of oxygen increases risk of death in acutely ill adult patients

image: Derek Chu, left, McMaster University clinical fellow, and Waleed Alhazzani, assistant professor of medicine at McMaster, led a study published in The Lancet on oxygen use in adult patients. Photo courtesy McMaster University.

Image: 
McMaster University

Hamilton, ON (April 26, 2018) - McMaster University researchers have found there is such a thing as too much oxygen for acutely ill adults.

Extensive data analyses in a study from the university show that supplemental oxygen, when given liberally to these patients, increases the risk of death without improving other health outcomes.

The results were published today in The Lancet.

"Supplemental oxygen is administered to millions of acutely unwell patients around the world every day," said Waleed Alhazzani, senior author of the paper, assistant professor of medicine at McMaster and an intensive care and general internal medicine staff member at St. Joseph's Healthcare Hamilton.

"Despite this, there is a striking lack of definitive, high-quality evidence related to this common intervention."

The McMaster-led team of researchers searched electronic academic databases from their inception through to October 2017 for randomized controlled trials conducted worldwide that compared liberal versus conservative oxygen therapy, examining death rates as well as impacts on such outcomes as disability, infections and hospital length of stay.

The 25 randomized controlled trials encompassed more than 16,000 adult patients with sepsis, stroke, trauma, emergency surgery, heart attack or cardiac arrest.

Data analysis demonstrated that, compared to the conservative strategy, liberal administration of oxygen increased in-hospital deaths by 21 per cent. Additional analyses suggested that the more supplemental oxygen patients were given, the higher their risk of death. However, other outcomes, such as the incidence of infections and length of hospital stay, were similar between the two groups.

The researchers estimated one additional death for every 71 patients treated with a liberal oxygen strategy.
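
The 1-in-71 figure follows from simple risk arithmetic: the number needed to harm is the reciprocal of the absolute increase in risk. A rough illustration, with an assumed baseline mortality chosen only to make the arithmetic land near the reported value:

```python
# Illustrative arithmetic only; the baseline mortality below is an assumption,
# not a figure taken from the Lancet paper.
baseline_mortality = 0.067   # assumed in-hospital mortality under conservative oxygen
relative_risk = 1.21         # 21 per cent relative increase with liberal oxygen (as reported)

absolute_risk_increase = baseline_mortality * (relative_risk - 1)
number_needed_to_harm = 1 / absolute_risk_increase

print(f"absolute risk increase: {absolute_risk_increase:.3%}")   # ~1.4 percentage points
print(f"number needed to harm: {number_needed_to_harm:.0f}")     # ~71 patients
```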

"Our findings are distinct from the pervasive view that liberal oxygen therapy for acute illnesses is at worst, harmless," said Alhazzani.

The results of the study, called Improving Oxygen Therapy in Acute-illness (IOTA), have immediate and important implications for health-care providers, policymakers and researchers, say the authors.

"Prior practice guidelines and medical directives on oxygen therapy for acute illnesses have been inconsistent," said Derek Chu, first author of the paper and a McMaster clinical fellow.

"Our results provide much-needed clarification by showing, with high-quality evidence, that administering too much supplemental oxygen increases mortality among a broad range of acute illnesses.

"Currently, patients are frequently given supplemental oxygen and at excessive levels. A simple change to current practice - being more moderate and cautious with how much oxygen is administered to acutely unwell patients - could save lives."

Credit: 
McMaster University

Taxing sweet snacks may bring greater health benefits than taxing sugar-sweetened drinks

A 10% tax on sweet snacks could lead to a similar reduction in consumer demand as taxing sugar-sweetened drinks

Taxing chocolate and confectionery is estimated to have knock-on effects that may further reduce purchases of sugar-sweetened drinks and other snacks

Sweet snacks provide twice as much sugar in the diet as sugar-sweetened drinks, so the overall reduction in sugar intake would be greater than that observed with taxes on sugar-sweetened drinks

The potential effect of a tax on sweet snacks is expected to be greatest in low-income households

Taxing sweet snacks could lead to broader reductions in the amount of sugar purchased than similar increases in the price of sugar-sweetened beverages (SSBs), according to new research published in BMJ Open.

The research team from the London School of Hygiene & Tropical Medicine, the University of Cambridge and the University of Oxford, estimate that adding 10% to the price of chocolate, confectionery, cakes and biscuits may reduce purchases by around 7%. This is a similar outcome to taxing SSBs, where previous research suggests a 10% price rise can reduce purchases by 6-8%.

Crucially, however, the study found that taxing sweet snacks could have knock-on effects on the sales of other food items, reducing the purchase of soft-drinks (by 0.6-0.8%), biscuits and cakes (1.2%), and savoury snacks (1.6%).
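
These figures imply modest price elasticities. A back-of-the-envelope version of that arithmetic (the study itself fitted a formal demand model, so this is only an illustration):

```python
# Rough elasticity arithmetic implied by the figures above (illustration only).
price_increase = 0.10                  # 10% price rise on sweet snacks

own_purchase_change = -0.07            # ~7% fall in sweet-snack purchases
cross_change_soft_drinks = -0.007      # ~0.6-0.8% fall in soft-drink purchases

own_price_elasticity = own_purchase_change / price_increase          # about -0.7
cross_price_elasticity = cross_change_soft_drinks / price_increase   # about -0.07

# A negative cross-price elasticity suggests the goods behave as complements:
# pricier snacks are associated with slightly fewer soft-drink purchases too.
print(own_price_elasticity, cross_price_elasticity)
```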

This study is an observational analysis and cannot explain why consumers change their purchasing behaviour. Although some uncertainty remains, the researchers say the associations observed suggest that relevant policies and future research should consider a broader range of fiscal measures to improve diet than is currently the case.

Lead author Professor Richard Smith from the London School of Hygiene & Tropical Medicine said: "We know that increasing the price of sugar-sweetened beverages is likely to generate a small, but significant, reduction in their purchase. However, there has been little research on the impact that a similar price increase on other sweet foods such as chocolate, confectionery, cakes and biscuits could have on the purchase of sugar. This research suggests that taxing these sweet snacks could bring greater health gains and warrants detailed consideration."

This study, funded by the National Institute for Health Research Policy Research Programme, is the first to provide a direct analysis of the relationship between price increases and consumer demand for snack foods across different income groups.

Household expenditure on food and drink items was classified into 13 different groups and examined in a nationally representative sample of around 32,000 UK homes. Purchasing was examined overall and then compared across low-income, middle-income and high-income households. The data (from Kantar Worldpanel) covered a two-year period in 2012 and 2013, and provided complete details of each sales transaction, in addition to social and demographic information for each household. To estimate the change in purchasing, the researchers applied a specialised tool for studying consumer demand.

The researchers found that increasing the price of sweet snacks led to a decrease in purchases and may have wider effects on purchasing patterns, which they suggest could potentially bring additional benefits to public health. For example, increasing the price of chocolate snacks was estimated to bring about significant reductions in purchases across most food categories, while a price increase on biscuits showed a potential reduction in the demand for cakes (2.3%) as well as chocolate and confectionery (1.7%).

The potential effects of price increases were greatest in the low-income group. Increasing the price of biscuits was linked to a reduction in the purchase of chocolate and confectionery for the low-income group (3% if price increases by 10%). No such reductions for the high-income group were seen. Increasing the price of chocolate and confectionery was estimated to have a similar effect across all income groups.

Co-author Professor Susan Jebb from the University of Oxford said: "It's impossible to study the direct effects of a tax on snack food on consumer behaviour until such policies are introduced, but these estimates show the likely impact of changes in the price. These snacks are high in sugar but often high in fat too and very energy dense, so their consumption can increase the risk of obesity. This research suggests that extending fiscal policies to include sweet snacks could be an important boost to public health, by reducing purchasing and hence consumption of these foods, particularly in low-income households."

The authors acknowledge limitations of the study, including the exclusion of food and drink bought and consumed outside the home (e.g. in restaurants), which they say is likely to be greater among higher income earners.

Credit: 
London School of Hygiene & Tropical Medicine

Einstein-Podolsky-Rosen paradox observed in many-particle system for the first time

image: A cloud of atoms is held above a chip by electromagnetic fields. The EPR paradox was observed between the spatially separated regions A and B.

Image: 
University of Basel, Department of Physics

Physicists from the University of Basel have observed the quantum mechanical Einstein-Podolsky-Rosen paradox in a system of several hundred interacting atoms for the first time. The phenomenon dates back to a famous thought experiment from 1935. It allows measurement results to be predicted precisely and could be used in new types of sensors and imaging methods for electromagnetic fields. The findings were recently published in the journal Science.

How precisely can we predict the results of measurements on a physical system? In the world of tiny particles, which is governed by the laws of quantum physics, there is a fundamental limit to the precision of such predictions. This limit is expressed by the Heisenberg uncertainty relation, which states that it is impossible to simultaneously predict, for example, the measurements of a particle's position and momentum, or of two components of a spin, with arbitrary precision.
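
In standard notation, the position-momentum form of the relation mentioned here reads as follows (a textbook statement, not an equation reproduced from the Science paper):

```latex
\Delta x \, \Delta p \;\geq\; \frac{\hbar}{2}
```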

A paradoxical decrease in uncertainty

In 1935, however, Albert Einstein, Boris Podolsky, and Nathan Rosen published a famous paper in which they showed that precise predictions are theoretically possible under certain circumstances. To do so, they considered two systems, A and B, in what is known as an "entangled" state, in which their properties are strongly correlated.

In this case, the results of measurements on system A can be used to predict the results of corresponding measurements on system B with, in principle, arbitrary precision. This is possible even if systems A and B are spatially separated. The paradox is that an observer can use measurements on system A to make more precise statements about system B than an observer who has direct access to system B (but not to A).

First observation in a many-particle system

In the past, experiments have used light or individual atoms to study the EPR paradox, which takes its initials from the scientists who first described it. Now, a team of physicists led by Professor Philipp Treutlein of the Department of Physics at the University of Basel and the Swiss Nanoscience Institute (SNI) has successfully observed the EPR paradox in a many-particle system of several hundred interacting atoms for the first time.

The experiment used lasers to cool atoms to just a few billionths of a degree above absolute zero. At these temperatures, the atoms behave entirely according to the laws of quantum mechanics and form what is known as a Bose-Einstein condensate - a state of matter that Einstein predicted in another pioneering paper in 1925. In this ultracold cloud, the atoms constantly collide with one another, causing their spins to become entangled.

The researchers then took measurements of the spin in spatially separated regions of the condensate. Thanks to high-resolution imaging, they were able to measure the spin correlations between the separate regions directly and, at the same time, to localize the atoms in precisely defined positions. With their experiment, the researchers succeeded in using measurements in a given region to predict the results for another region.

"The results of the measurements in the two regions were so strongly correlated that they allowed us to demonstrate the EPR paradox," says PhD student Matteo Fadel, lead author of the study. "It's fascinating to observe such a fundamental phenomenon of quantum physics in ever larger systems. At the same time, our experiments establish a link between two of Einstein's most important works."

On the path towards quantum technology

In addition to their basic research, the scientists are already speculating about possible applications for their discovery. For example, the correlations that are at the heart of the EPR paradox could be used to improve atomic sensors and imaging methods for electromagnetic fields. The development of quantum sensors of this kind is one objective of the National Centre of Competence in Research Quantum Science and Technology (NCCR QSIT), in which the team of researchers is actively involved.

Credit: 
University of Basel

Discovery of new material is key step toward more powerful computing

CORVALLIS, Ore. - A new material created by Oregon State University researchers is a key step toward the next generation of supercomputers.

Those "quantum computers" will be able to solve problems well beyond the reach of existing computers while working much faster and consuming vastly less energy.

Researchers in OSU's College of Science have developed an inorganic compound that adopts a crystal structure capable of sustaining a new state of matter known as quantum spin liquid, an important advance toward quantum computing.

In the new compound, lithium osmium oxide, osmium atoms form a honeycomb-like lattice, enforcing a phenomenon called "magnetic frustration" that could lead to quantum spin liquid as predicted by condensed matter physics theorists.

Corresponding author Mas Subramanian, Milton Harris Professor of Materials Science at OSU, explains that in a permanent magnet like a compass needle, the electrons spin in an aligned manner - that is, they all rotate in the same direction.

"But in a frustrated magnet, the atomic arrangement is such that the electron spins cannot achieve an ordered alignment and instead are in a constantly fluctuating state, analogous to how ions would appear in a liquid," Subramanian said.

The lithium osmium oxide discovered at OSU shows no evidence for magnetic order even when frozen to nearly absolute zero, which suggests an underlying quantum spin liquid state is possible for the compound, he said.

"We are excited about this new development as it widens the search area for new quantum spin liquid materials that could revolutionize the way we process and store data," Subramanian said. "The quantum spin liquid phenomenon has so far been detected in very few inorganic materials, some containing iridium. Osmium is right next to iridium in the periodic table and has all the right characteristics to form compounds that can sustain the quantum spin liquid state."

Arthur Ramirez, a condensed matter physicist at the University of California, Santa Cruz, and one of the co-authors of the paper, noted that this compound is the first honeycomb-structured material to contain osmium and expects more to follow.

Ramirez also noted that this study demonstrates the importance of multidisciplinary collaboration involving materials chemists and condensed matter physicists engaged in synthesis, theory and measurements to tackle emerging science like quantum spin liquid.

The next step for Subramanian's team is exploring the chemistry needed to create various perfectly ordered crystal structures with osmium.

The National Science Foundation is funding the research through its DMREF program: Designing Materials to Revolutionize and Engineer our Future. Findings were published today in Scientific Reports.

The concept of quantum computing is based on the ability of subatomic particles to exist in more than one state at any time.

Classical computing relies on bits - pieces of information that exist in one of two states, a 0 or a 1. In quantum computing, information is translated to quantum bits, or qubits, that can store much more information than a 0 or 1 because they can be in any "superposition" of those values.

Think of bits and qubits by visualizing a sphere. A bit can only be at either of the two poles on the sphere, whereas a qubit can be anywhere on the sphere. What that means is much more information storage potential and much less energy consumption.
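
As a minimal illustration of that picture (not code from the OSU study), a qubit can be written as a normalized pair of complex amplitudes, with measurement probabilities given by their squared magnitudes:

```python
# Toy illustration of the bit-vs-qubit idea (illustrative only).
import numpy as np

# A classical bit is one of two values.
bit = 0

# A qubit state |psi> = a|0> + b|1> is any normalized pair of complex amplitudes,
# i.e. any point on the sphere described above.
a, b = 1 / np.sqrt(2), 1j / np.sqrt(2)
assert np.isclose(abs(a) ** 2 + abs(b) ** 2, 1.0)

# Measurement collapses the qubit to 0 or 1 with probabilities |a|^2 and |b|^2.
print("P(0) =", abs(a) ** 2, " P(1) =", abs(b) ** 2)
```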

Credit: 
Oregon State University

Prosthetic arms can provide controlled sensory feedback, study finds

image: A patient performs various everyday tasks with a sensory control module integrated with his prosthetic arm.

Image: 
Aadeel Akhtar, University of Illinois

CHAMPAIGN, Ill. -- Losing an arm doesn't have to mean losing all sense of touch, thanks to prosthetic arms that stimulate nerves with mild electrical feedback.

University of Illinois researchers have developed a control algorithm that regulates the current so a prosthetics user feels steady sensation, even when the electrodes begin to peel off or when sweat builds up.

"We're giving sensation back to someone who's lost their hand. The idea is that we no longer want the prosthetic hand to feel like a tool, we want it to feel like an extension of the body," said Aadeel Akhtar, an M.D./Ph.D. student in the neuroscience program and the medical scholars program at the University of Illinois. Akhtar is the lead author of a paper describing the sensory control module, published in Science Robotics, and the founder and CEO of PSYONIC, a startup company that develops low-cost bionic arms.

"Commercial prosthetics don't have good sensory feedback. This is a step toward getting reliable sensory feedback to users of prosthetics," he said.

Prosthetic arms that offer nerve stimulation have sensors in the fingertips, so that when the user comes in contact with something, an electrical signal on the skin corresponds to the amount of pressure the arm exerts. For example, a light touch would generate a light sensation, but a hard push would have a stronger signal.

However, there have been many problems with giving users reliable feedback, said aerospace engineering professor Timothy Bretl, the principal investigator of the study. During ordinary wear over time, the electrodes connected to the skin can begin to peel off, causing a buildup of electrical current on the area that remains attached, which can give the user painful shocks. Alternately, sweat can impede the connection between the electrode and the skin, so that the user feels less or even no feedback at all.

"A steady, reliable sensory experience could significantly improve a prosthetic user's quality of life," Bretl said.

The controller monitors the feedback the patient is experiencing and automatically adjusts the current level so that the user feels steady feedback, even when sweating or when the electrodes are 75 percent peeled off.
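
The published controller is more sophisticated than this, but a toy sketch of the closed-loop idea, with hypothetical gains and units, looks roughly like the following:

```python
# Hypothetical closed-loop adjustment of stimulation current (illustration only;
# not the controller published in Science Robotics).
def adjust_current(current_ma, measured_intensity, target_intensity,
                   gain=0.1, max_current_ma=5.0):
    """Nudge the stimulation current so perceived intensity tracks the target,
    compensating for electrode peeling (intensity rises) or sweat (intensity falls)."""
    error = target_intensity - measured_intensity
    new_current = current_ma + gain * error
    # Clamp to a safe range so a peeling electrode never produces a painful shock.
    return max(0.0, min(max_current_ma, new_current))

current = 2.0
for measured in [1.0, 0.8, 0.6, 1.4]:   # simulated intensity readings
    current = adjust_current(current, measured, target_intensity=1.0)
    print(f"current set to {current:.2f} mA")
```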

The researchers tested the controller on two patient volunteers. They performed a test where the electrodes were progressively peeled back and found that the control module reduced the electrical current so that the users reported steady feedback without shocks. They also had the patients perform a series of everyday tasks that could cause loss of sensation due to sweat: climbing stairs, hammering a nail into a board and running on an elliptical machine.

"What we found is that when we didn't use our controller, the users couldn't feel the sensation anymore by the end of the activity. However, when we had the control algorithm on, after the activity they said they could still feel the sensation just fine," Akhtar said.

Adding the controlled stimulation module would cost much less than the prosthetic itself, Akhtar said. "Although we don't know yet the exact breakdown of costs, our goal is to have it be completely covered by insurance at no out-of-pocket costs to users."

The group is working on miniaturizing the module that provides the electrical feedback, so that it fits inside a prosthetic arm rather than attaching to the outside. They also plan to do more extensive patient testing with a larger group of participants.

"Once we get a miniaturized stimulator, we plan on doing more patient testing where they can take it home for an extended period of time and we can evaluate how it feels as they perform activities of daily living. We want our users to be able to reliably feel and hold things as delicate as a child's hand," Akhtar said. "This is a step toward making a prosthetic hand that becomes an extension of the body rather than just being another tool."

Credit: 
University of Illinois at Urbana-Champaign, News Bureau

Whale shark logs longest-recorded trans-Pacific migration

image: Whale sharks are filter feeders, eating plankton, fish eggs, krill, crab larvae as well as small squid and fish that enter their large mouths. They cannot digest plastic garbage.

Image: 
Kevan Mantell

Little is known about the world's largest living fish, gentle giants reaching 12 meters (40 feet) in length. Researchers from the Smithsonian Tropical Research Institute (STRI) and colleagues tracked a female whale shark from the eastern Pacific to the western Indo-Pacific for 20,142 kilometers (more than 12,000 miles), the longest whale shark migration route ever recorded.

STRI marine biologist Héctor M. Guzmán tagged a female whale shark (Rhincodon typus) near Coiba Island in Panama, the largest island off the coast of Central America and a National Park, World Heritage Site and marine protected area. His team named the shark Anne for conservationist Anne McEnany, president and CEO of the International Community Foundation (ICF). The multi-year project also tagged 45 additional sharks in Panama with sponsorship from Christy Walton's Candeo Fund at the ICF, along with STRI and Panama's science and technology bureau (SENACYT).

Guzmán estimated Anne's position based on signals from a Smart Position and Temperature (SPOT) tag tethered to the shark, received by the Advanced Research and Global Observation Satellite (ARGOS). The tag only communicates with the satellite when the shark swims near the surface. Anne remained in Panamanian waters for 116 days, then swam toward Clipperton Island (France), nearing Cocos Island (Costa Rica) en route to Darwin Island in the Galapagos (Ecuador), a site known to attract groups of sharks. The signal disappeared 266 days after she was tagged, indicating that Anne was swimming too deep to track. After 235 days of silence, transmissions began again south of Hawaii. After a nine-day stay, she continued through the Marshall Islands until she arrived at the Mariana Trench, a canyon in the ocean floor near Guam in the Western Pacific where movie director James Cameron located the deepest point on the Earth's surface, almost 11,000 meters (36,000 feet) below sea level.

Whale sharks are known to dive to depths of more than 1,900 meters (6,000 feet), but it is unknown what the animal was doing in this area.

"We have very little information about why whale sharks migrate," said Guzmán. "Are they searching for food, seeking breeding opportunities or driven by some other impulse?"

"Despite being the world's largest fish, it's amazing to me how little we know about this species," said Scott Eckert, co-author and biology professor at Principia College. "When I first began working on them, their taxonomy was debated, and it still wasn't clear how they reproduced."

Whale sharks are found in warm, tropical and sub-tropical waters. It is thought that about a quarter live primarily in the Atlantic, whereas about three-fourths live in the Indo-Pacific. Tourists are drawn to sites where 500 or more whale sharks gather: in Oman, Australia, the Galapagos, Mexico, Mozambique and the Seychelles. Large groups are also reported from Taiwan, Southern China and the Gujarat coast of India.

Genetic studies show that whale sharks across the globe are closely related, indicating that they must travel long distances to mate. Whale sharks have been tracked for shorter distances along similar routes, but this report is the longest-recorded migration to date and the first evidence of a potential trans-Pacific route. Like Anne, other whale sharks appear to follow the North Equatorial Current for most of the distance. Large females can swim an average of 67 kilometers (about 40 miles) per day.
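
The track length reported here is essentially the sum of great-circle distances between successive satellite fixes. A minimal sketch of that calculation, using hypothetical coordinates rather than Anne's actual ARGOS positions:

```python
# Sketch of summing great-circle distances between successive tag positions
# (hypothetical coordinates, not actual ARGOS fixes).
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometers between two points on Earth's surface."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

track = [(7.5, -81.7), (10.3, -109.2), (1.7, -92.0), (10.0, -155.0)]  # (lat, lon) fixes
total = sum(haversine_km(*track[i], *track[i + 1]) for i in range(len(track) - 1))
print(f"track length: {total:.0f} km")
```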

The whale shark is one of only three known filter-feeding sharks, feeding on plankton, fish eggs, krill, crab larvae as well as small squid and fish (and, accidentally, plastic, which they cannot digest). As such, they are not considered to be particularly dangerous, and tourism companies that offer the opportunity to swim very close to whale sharks are common near areas where they aggregate in large numbers. But their size also attracts fishing boats. They are sought after for their fins and meat, for their teeth (used for crafts and sold to tourists) and for cartilage and oil with purported medicinal value. Juvenile whale sharks often end up as bycatch in tuna and other fisheries.

Whale sharks were classified as endangered in 2016. During the past 75 years, it is estimated that nearly half of the world's whale sharks have disappeared. In many parts of the world, whale sharks have legal protection, but regulations are often not enforced. Guzman's data were used to design and draft local and regional policies for the protection of the species. Fishing, capture and sale of whale sharks are prohibited in Panama by Executive Decree No. 9, signed in 2009. In 2014, Panama's environmental authority passed an additional resolution regulating whale shark watching in Coiba National Park and the Isla Canales de Afuera marine reserve. The resolution includes a Whale Shark Watching Manual but unfortunately, tourism activities are not well organized and the authorities are not present to enforce the regulations.

"Whale sharks in Coiba have already changed their behavior to avoid the surface and tourists," Guzman said. "These studies are critical as we design international policy to protect transboundary species like the whale sharks and other highly migratory marine species."

Credit: 
Smithsonian Tropical Research Institute

Sub-sea rift spills secrets to seismic probe

image: The Galicia group -- from left, Rice graduate student Nur Schuba, alumnus Ara Alexanian and graduate research assistant Mari Tesi Sanjurjo -- discuss the northwest portion of the 3-D seismic volume at Rice's Visualization Lab.

Image: 
Gary Linkevich/Rice University

The first study to spring from a Rice University-led 2013 international expedition to map the sea floor off the coast of Spain has revealed details about the evolution of the fault that separates the continental and oceanic plates.

A paper in Earth and Planetary Science Letters by Rice graduate student Nur Schuba describes the internal structure of a large three-dimensional section of the Galicia, a passive margin between Europe and the Atlantic basin that shows no signs of past volcanic activity and where the crust is remarkably thin.

That thinness made it easier to capture 3-D data for about 525 square miles of the Galicia, the first transition zone in the world so analyzed.

Sophisticated seismic reflection tools towed behind a ship and placed on the ocean floor enabled the researchers to model the Galicia. Though the rift is buried under several hundred meters of powdered rock and invisible to optical instruments, seismic tools fire sound into the formation. The sounds that bounce back tell researchers what kind of rock lies underneath and how it is configured.
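
The core quantity behind this kind of imaging is the two-way travel time: a pulse goes down, reflects, and comes back, so the reflector depth follows from half the travel time multiplied by the speed of sound in the rock. A minimal sketch with illustrative values (the real Galicia volume was processed with far more detailed velocity models):

```python
# Illustrative two-way travel time to depth conversion (toy values only).
def reflector_depth_m(two_way_time_s, velocity_m_per_s):
    """The pulse travels down and back, so use half the recorded travel time."""
    return velocity_m_per_s * two_way_time_s / 2.0

print(reflector_depth_m(two_way_time_s=0.8, velocity_m_per_s=2000.0), "m")  # -> 800.0 m
```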

Among the data are the first seismic images of what geologists call the S-reflector, a prominent detachment fault within the continent-ocean transition zone. They believe this fault accommodated slipping along the zone in a way that helped keep the crust thin.

"The S-reflector, which has been studied since the '70s, is a very low-angle, normal fault, which means the slip happens due to extension," Schuba said. "What's interesting is that because it's at a low angle, it shouldn't be able to slip. But it did.

"One mechanism people have postulated is called the rolling hinge," she said. "The assumption is that an initially steep fault slipped over millions of years. Because the continental crust there is so thin, the material underneath it is hot and domed up in the middle. The initially steep fault started rolling and became almost horizontal.

"So with the help of the doming of the material coming from below and also the continuous slip, that's how it is likely to have happened," Schuba said.

The large data set also provided clues about interactions between the detachment fault and the serpentinized mantle, the dome of softer rock that presses upward on the fault and lowers friction during slippage. The researchers believe that led the Galicia to evolve differently, weakening faults and allowing for longer durations of activity.

The research is relevant to geologists who study land as well as sea because detachment faults are common above the water, Schuba said. "One of my advisers, (adjunct faculty member) Gary Gray, is jazzed about this because he says you can see these faults in Death Valley and Northern California, but you can't ever see them fully because the faults keep going underground. You can't see how deep they go or how the fault zones change or how they're associated with other faults.

"But a 3-D dataset is like having an MRI," she said. "We can bisect it any way we want. It makes me happy that this was the first paper to come out of the Galicia data and the fact that we can see things no one else could see before."

Credit: 
Rice University

New imaging system makes back surgery safer, faster and less expensive

image: This is an exposed spine with ligamentous structures intact and bone mini-screws implanted. An attached reference frame is also visible.

Image: 
Dartmouth College

Hanover, N.H. - Researchers at Dartmouth College have found a way to make back surgery safer, faster and more cost effective.

MRIs and CT scans help surgeons identify spine problems, like compressed vertebrae or herniated disks, but finding a clear path to those problem areas is not always straightforward. Tissue and bone not only stand in the way, they can also move during spinal surgery, rendering a CT scan taken prior to surgery much less accurate.

To solve this problem, Dartmouth professors from the Thayer School of Engineering and the Geisel School of Medicine developed a 3-dimensional, real-time optical tracking system to guide back surgeons as they operate, like a Google Maps for the body, according to findings published in the journal Operative Neurosurgery.

Using a complex software algorithm and two cameras attached to a surgical microscope, the system produces real-time 3-dimensional digitized images on a monitor, according to the study. This type of tracked, calibrated stereoscopic camera system has been extensively used in brain surgery but until now has been unexplored for use in spinal surgery.
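
The depth information comes from standard stereo geometry: a feature seen by both cameras appears at slightly different image positions, and that disparity determines how far away it is. A toy illustration with hypothetical numbers (not the Dartmouth algorithm itself):

```python
# Toy illustration of recovering depth from stereo disparity (standard geometry,
# not the Dartmouth iSV algorithm; all numbers are hypothetical).
def depth_from_disparity(focal_px, baseline_mm, disparity_px):
    """Depth of a feature seen by both calibrated cameras: Z = f * B / d."""
    return focal_px * baseline_mm / disparity_px

# Hypothetical values for a microscope-mounted stereo pair.
print(depth_from_disparity(focal_px=2000, baseline_mm=25.0, disparity_px=40.0), "mm")
```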

The surgeon can use this new intraoperative stereovision system (iSV) without any additional radiation or the labor-intensive marking of key areas on the patient's spine that some surgeons use today to match up, or co-register, with the pre-operative CT scan. This new mapping provides more accurate renderings of where spinal implants or other surgical tools and devices need to go during the procedure, and is expected to save up to 30 minutes, according to one of the study's authors, Keith D. Paulsen, PhD, Robert A. Pritzker Professor of Biomedical Engineering at Thayer School of Engineering at Dartmouth.

Paulsen and the multidisciplinary team at Dartmouth's Center for Surgical Innovation tested the new iSV system for accuracy and efficiency while operating on pig spines. Since completing this study, the team has taken its complex system one step further by converting it into a handheld "wand" that the surgeon can pass over the surgical area.

"By rendering images real-time, with a simple handheld tool, we believe we can make surgeries safer and less costly in the future," said Paulsen.

Next up is fine-tuning the system and testing in humans. The National Institutes of Health has provided the Dartmouth team with another round of funding to continue testing. It could be several years before the system becomes widely available for human spinal surgeries.

Credit: 
Dartmouth College

Belief in fake causes of cancer is rife

Mistaken belief in mythical causes of cancer is rife, according to new research jointly funded by Cancer Research UK and published today (Thursday) in the European Journal of Cancer*.

Researchers at University College London (UCL) and the University of Leeds surveyed 1,330 people in England and found that more than 40% wrongly thought that stress (43%) and food additives (42%) caused cancer.

A third incorrectly believed that electromagnetic frequencies (35%) and eating GM food (34%) were risk factors, while 19% thought microwave ovens and 15% said drinking from plastic bottles caused cancer, despite a lack of good scientific evidence.

Among the proven causes of cancer, 88% of people correctly selected smoking, 80% picked passive smoking and 60% said sunburn.

Belief in mythical causes of cancer did not mean a person was more likely to have risky lifestyle habits.

But those who had better knowledge of proven causes were more likely not to smoke.

Dr Samuel Smith from the University of Leeds said: "It's worrying to see so many people endorse risk factors for which there is no convincing evidence.

"Compared to past research it appears the number of people believing in unproven causes of cancer has increased since the start of the century which could be a result of changes to how we access news and information through the internet and social media.

"It's vital to improve public education about the causes of cancer if we want to help people make informed decisions about their lives and ensure they aren't worrying unnecessarily."

Dr Lion Shahab from UCL said: "People's beliefs are so important because they have an impact on the lifestyle choices they make. Those with better awareness of proven causes of cancer were more likely not to smoke and to eat more fruit and vegetables."

Clare Hyde from Cancer Research UK said: "Around four in 10 cancer cases could be prevented through lifestyle changes** so it's crucial we have the right information to help us separate the wheat from the chaff.

"Smoking, being overweight and overexposure to UV radiation from the sun and sunbeds are the biggest preventable causes of cancer.

"There is no guarantee against getting cancer but by knowing the biggest risk factors we can stack the odds in our favour to help reduce our individual risk of the disease, rather than wasting time worrying about fake news."

Credit: 
Cancer Research UK

Should doctors recommend e-cigarettes to help smokers quit?

The National Institute for Health and Care Excellence advises doctors to tell people who are trying to quit smoking that e-cigarettes are helpful tools for quitting. However, emerging evidence suggests that e-cigarettes, as actually used, depress rather than assist smoking cessation for most users, and are a gateway to youth smoking.

So, should they be recommended? Experts debate the issue in The BMJ today.

Smokers are asking their doctors for advice on e-cigarette use, they want to vape, and e-cigarettes can help smokers quit, argue Paul Aveyard, Professor of Behavioural Medicine at the University of Oxford, and Deborah Arnott, Chief Executive of Action on Smoking & Health.

E-cigarettes are as effective as nicotine replacement therapy (NRT) in helping smokers quit, and many people choose e-cigarettes over NRT. Their popularity as quitting aids has led to increases in quit attempts and in quitting overall in England and the USA, they explain.

Some fear that transferring an addiction from tobacco to e-cigarettes leads to continued vaping, which may be harmful. But, they say, "for most vapers, the uncertainty around harms is largely irrelevant because vaping will be short term."

Some young people do experiment with e-cigarettes, but only one in several hundred young people who have never smoked uses them more often than once a week. At a time when e-cigarettes have become more common, smoking among young people has fallen to record lows, so the risk of picking up smoking must be low, if it exists at all.

Concerns have been raised about the tobacco industry's involvement in the e-cigarette market; however, "the evidence suggests that e-cigarettes are not benefiting the tobacco industry because the rate of people smoking is falling," the authors say.

"In the UK, e-cigarettes are part of a comprehensive anti-smoking strategy that protects public policy from the commercial interests of the tobacco industry." The UK health policy "promotes vaping as an alternative to smoking and has consensus among the public health community with the endorsement of Cancer Research UK and other charities, medical royal colleges, and the BMA," they conclude.

But Kenneth Johnson, Adjunct Professor at the University of Ottawa, says recommending e-cigarettes for quitting smoking as currently promoted and used is irresponsible.

The overall evidence is that e-cigarettes, as actually used, depress rather than assist cigarette smoking cessation. New research is replacing optimistic speculation with evidence that indicates the limits and hazards of e-cigarettes as a smoking cessation aid, he says.

E-cigarettes also pose a serious public health risk of addicting new generations of young smokers, he adds. In a 2016 study of English youth (11-18 years of age), e-cigarette users were 12 times as likely to initiate smoking (52%) as those who had never used e-cigarettes.

"They [tobacco companies] have a long history of aggressively using their economic and political power to profit at the expense of public health," he adds. British American Tobacco has big plans for expanding the recreational nicotine market with e-cigarettes - and cessation is not part of the game plan."

"The net effect of e-cigarettes on smoking cessation is negative, high levels of dual use undermine harm reduction, and gateway risks for youth smoking initiation are a demonstrated danger. Recommending e-cigarettes for smoking cessation, as currently promoted and used, is irresponsible," he concludes.

Credit: 
BMJ Group

Sunlight reduces effectiveness of dispersants used to clean up oil spills

image: An aircraft applies dispersants to a slick of sunlight-weathered oil in the Gulf of Mexico in 2010.

Image: 
WHOI

A new study shows that sunlight transforms oil spills on the ocean surface more quickly and significantly than previously thought, limiting the effectiveness of chemical dispersants that break up floating oil.

A research team funded by the National Science Foundation (NSF) and led by the Woods Hole Oceanographic Institution (WHOI) found that sunlight chemically alters crude oil floating on the sea surface within days or hours.

The team reported that sunlight changes oil into different compounds that dispersants cannot easily break up. The findings, published today in the journal Environmental Science & Technology Letters, could affect how responders decide when, where and how to use dispersants.

"It's been thought that sunlight has a negligible impact on the effectiveness of dispersants," said Collin Ward, a scientist at WHOI and lead author of the study. "Our findings show that sunlight is a primary factor that controls how well dispersants perform. And because photochemical changes happen fast, they limit the window of opportunity to apply dispersants effectively."

Added Henrietta Edmonds, a program director in NSF's Division of Ocean Sciences, which funded the research, "This study shows how important it is to do basic research on the chemical reactions that take place in the environment. The results will help us learn how to effectively respond to oil spills."

Oil and water: no mixing in the sea

Dispersants contain detergents, not unlike those people use to wash dishes, which help break oil into small droplets that are diluted in the ocean or are eaten by microbes before the oil can be swept to sensitive coastlines. But to do their work, the detergents (also known as surfactants) first need to mix with both the oil and water -- and oil and water, famously, don't mix.

To overcome this barrier, dispersants contain an organic solvent that helps the oil, detergents and water mix. Only when this step happens can the surfactants do their work to break oil into droplets. But sunlight obstructs this step, the new study shows.

Before dispersants can be applied, light energy from the sun immediately begins to break chemical bonds in oil compounds -- splitting off atoms or chemical chains and creating openings for oxygen to attach. This photo-oxidation process (also known as photochemical "weathering") is similar to the process that causes paint on cars or colors on clothes to fade if they are left out in the sun for too long.

To date, tests to determine the effectiveness of dispersants used only "fresh" oil that hadn't been altered by sunlight. In the new study, the researchers conducted extensive lab tests exposing oil to sunlight. They showed that sunlight rapidly transforms oil into residues that are only partially soluble in a dispersant's solvent, limiting the ability of detergents to mix with the photo-oxidized oil and break the oil into droplets.

New 'window of opportunity' estimate needed

The finding suggests that responders should factor in sunlight when determining the "window of opportunity" to use dispersants effectively. That window is far smaller on sunny days than previously thought.

"This study challenges the paradigm that photochemical weathering has a negligible impact on the effectiveness of aerial dispersants applied in response to oil spills," Ward said. "Sunlight rapidly alters oil into chemical compounds that dispersants can't easily break up into droplets. So photochemical weathering is a critical factor that should be considered to optimize decisions on when to use dispersants."

The continuous flow of oil from the 2010 Deepwater Horizon disaster in the Gulf of Mexico provided a unique opportunity to study the effects of sunlight on oil. Because oil floated on the sea surface for 102 days, officials had a chance to collect oil shortly after it surfaced and was exposed to sunlight.

Testing oil, water -- and sunlight

The WHOI scientists obtained and tested samples of Deepwater Horizon oil that was skimmed from the surface almost immediately after it appeared. They found that the longer the oil floated on the sunlit sea surface, the more the oil was photo-oxidized. They estimated that half the spilled oil had been altered within days.

The next step was to test how the photo-oxidized oil would respond to dispersants. The scientists tested fresh, unaltered Deepwater Horizon oil that was collected directly from the broken riser pipe on the seafloor.

They controlled laboratory conditions to prevent temperature changes, evaporation, light infiltration and other factors, and they exposed the oil to increasing durations of light. Cassia Armstrong, a guest student from Trinity College, played a key role in conducting these tests and is a co-author of the paper.

Sunlight reduces dispersant effectiveness

Results of the experiments showed that light rapidly photo-oxidized the fresh oil, changing it within a few days into compounds that reduced the effectiveness of dispersants by at least 30 percent.

The scientists teamed with Deborah French McCay, an oil spill modeler at RPS ASA, a science and technology consulting firm in Rhode Island. The researchers simulated conditions that might have occurred during the Deepwater Horizon spill, including a range of wind speeds and sunlight levels. Then they superimposed the 412 flight lines of planes that sprayed dispersants during the crisis.

The results showed that because they targeted photochemically-weathered oil, the majority of dispersant applications would not have achieved minimum effectiveness levels under average wind and sunlight conditions.

Even under the best-case scenarios for aerial dispersant spraying -- cloudy weather (which would limit photochemical weathering) and high-wind conditions (which would transport oil farther from the spill before sunlight transformed it) -- dozens of aerial dispersant applications still would not have achieved designated effectiveness levels.
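
A very rough sketch of the kind of screening calculation described above, with an entirely hypothetical effectiveness rule and weather distributions (the actual analysis used RPS ASA's oil spill model and the recorded flight lines):

```python
# Entirely hypothetical effectiveness rule and weather distributions; this is only
# an illustration of screening many applications against a minimum threshold.
import random

def toy_effectiveness(sunlight_hours, wind_speed_ms):
    """Effectiveness falls as sunlight photo-oxidizes the oil; higher winds move
    oil away from the spill before it weathers, partially offsetting the loss."""
    weathering = min(1.0, 0.2 * sunlight_hours) * (1.0 - min(wind_speed_ms / 30.0, 0.5))
    return max(0.0, 1.0 - weathering)

random.seed(0)
runs = [toy_effectiveness(random.uniform(0, 12), random.uniform(0, 15))
        for _ in range(412)]  # one per dispersant flight line
share_below = sum(e < 0.45 for e in runs) / len(runs)
print(f"share of applications below a 45% effectiveness threshold: {share_below:.0%}")
```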

"We assembled a team that combined the expertise of academia, government and industry researchers," explained Christopher Reddy, a marine chemist at WHOI. "In future oil spill crises, the community needs the same kind of cooperation and collaboration to make the wisest decisions on how to respond most effectively."

Credit: 
U.S. National Science Foundation

Targeting telomerase as therapeutic strategy for melanoma

PHILADELPHIA -- (April 25, 2018) -- Targeting telomerase was effective at killing NRAS-mutant melanoma cells, and the impact was further enhanced when the strategy was paired with an inhibitor of mitochondrial function, according to study results by The Wistar Institute published in Oncogene.

"Many melanoma patients do not benefit from new treatment options or experience disease progression because of resistance. Our research advances the search for novel therapeutic strategies to treat NRAS-mutant melanoma, which is highly resistant to most therapies and associated with poor prognosis," said lead researcher Jessie Villanueva, Ph.D., assistant professor in the Molecular & Cellular Oncogenesis Program at Wistar.

Targeting NRAS as well as the other oncogenes of the Ras family has proven extremely challenging, according to Villanueva, and researchers have resorted to searching for vulnerabilities in NRAS-mutant cancer cells that can provide alternative therapeutic targets.

Mutations in the regulatory element of the TERT gene, which encodes the catalytic subunit of telomerase, are found in more than 70 percent of melanomas. Telomerase is an enzyme that protects the integrity of chromosome ends during replication and represents a promising target for cancer therapy because it is absent in most normal adult cells while its reactivation in malignant cells allows continuous cell divisions.

"We linked the presence of mutant NRAS to TERT expression and showed that NRAS mutant melanoma cells are highly dependent on telomerase," said Villanueva. "We also demonstrated the therapeutic value of exploiting this dependency by inducing telomere dysfunction, which caused cell death selectively in NRAS mutant cells and not in normal cells that do not express telomerase."

Villanueva and colleagues studied the consequence of NRAS depletion on gene expression in NRAS-mutant melanoma cells, focusing on genes that regulate proliferation. They observed a strong decrease in expression of TERT. They then inhibited telomerase activity via TERT gene silencing or induced telomere dysfunction with a telomerase substrate called 6-thio-dG; both interventions led to extensive cell death and DNA damage. They also observed an increase in expression of several enzymes involved in the function of mitochondria, the organelles responsible for energy production, suggesting that, following telomere dysfunction, cells mount an adaptive metabolic response that helps them cope with the damage.

"We asked whether inhibition of mitochondrial function could synergize with the anti-melanoma effects of telomere dysfunction," said Patricia Reyes-Uribe, a research assistant in the Villanueva Lab and the first author of the study. "We found that the mitochondrial inhibitor gamitrinib enhances the cytotoxic effects of TERT depletion or 6-thio-dG selectively in NRAS mutant tumor cells."

These observations were reproduced in vivo in a mouse model of NRAS-mutant melanoma by showing that the combination of gamitrinib and 6-thio-dG reduced tumor size and substantially prolonged survival.

"Our work provides proof-of-principle that we can successfully address drug resistance by developing combination therapies that simultaneously impair telomerase and block adaptive resistance mechanisms," added Villanueva.

Credit: 
The Wistar Institute

Cheaper and easier way found to make plastic semiconductors

Cheap, flexible and sustainable plastic semiconductors will soon be a reality thanks to a breakthrough by chemists at the University of Waterloo.

Professor Derek Schipper and his team at Waterloo have developed a way to make conjugated polymers, plastics that conduct electricity like metals, using a simple dehydration reaction whose only byproduct is water.

"Nature has been using this reaction for billions of years and industry more than a hundred," said Schipper, a professor of Chemistry and a Canada Research Chair in Organic Material Synthesis. "It's one of the cheapest and most environmentally friendly reactions for producing plastics."

Schipper and his team have successfully applied this reaction to create poly(hetero)arenes, one of the most studied classes of conjugated polymers, which have been used to make lightweight, low-cost electronics such as solar cells, LED displays, and chemical and biochemical sensors.

Dehydration is a common method to make polymers, a chain of repeating molecules or monomers that link up like a train. Nature uses the dehydration reaction to make complex sugars from glucose, as well as proteins and other biological building blocks such as cellulose. Plastics manufacturers use it to make everything from nylon to polyester, cheaply and in mind-boggling bulk.

"Synthesis has been a long-standing problem in this field," said Schipper. "A dehydration method such as ours will streamline the entire process from discovery of new derivatives to commercial product development. Better still, the reaction proceeds relatively fast and at room temperature."

Conjugated polymers were first discovered by Alan Heeger, Alan MacDiarmid, and Hideki Shirakawa in the late 1970s, eventually earning them the Nobel Prize in Chemistry in 2000.

Researchers and engineers quickly discovered several new polymer classes with plenty of commercial applications, including a semiconducting version of the material, but progress in reaching markets has stalled in large part because conjugated polymers are so hard to make. The multi-step reactions often involve expensive catalysts and produce environmentally harmful waste products.

Schipper and his team are continuing to perfect the technique while also working on developing dehydration synthesis methods for other classes of conjugated polymers. The results of their research so far appeared recently in the journal Chemistry - A European Journal.

Credit: 
University of Waterloo

Telemedicine aided people hit by hurricanes Harvey and Irma

Direct-to-consumer telemedicine is a viable way to deliver medical care in the days following a natural disaster, although most people who use such services do so for routine matters rather than disaster-caused illnesses, according to a new RAND Corporation study.

Examining the experience of one direct-to-consumer service in the weeks following Hurricanes Harvey and Irma during 2017, researchers found that use of the service peaked three to six days after the hurricanes made landfall.

The top diagnoses during the first month included acute respiratory illnesses and skin problems, a pattern similar to the trends the telemedicine service observed among all patients it treated nationally.

However, during the first week post-hurricane, telemedicine visits for chronic conditions, advice, counseling and refills, and back and joint concerns, including injuries, were more common among people affected by the hurricanes than among the other patients treated by the service. The findings are published online by the Journal of General Internal Medicine.

"Our study suggests that direct-to-consumer telemedicine is a new way to deliver routine health care to people in the immediate aftermath of a natural disaster, although it does require that certain infrastructure like cellular service and Wi-Fi remain intact," said Lori Uscher-Pines, lead author of the study and a senior policy researcher at RAND, a nonprofit research organization.

While telemedicine has been used during disasters for many years, providing care via direct-to-consumer telehealth only has become viable in recent years because of the widespread growth of smartphones and the creation of services that allow consumers to directly access thousands of U.S. physicians.

Several direct-to-consumer telemedicine companies offer patients with minor illnesses around-the-clock access to physicians via telephone or videoconferencing on their smartphone, tablet or laptop computer. Use of such services has grown rapidly, with more than 1.2 million visits reported nationally in 2015.

During the 2017 hurricane season, at least five direct-to-consumer telemedicine companies offered free visits to hurricane victims. RAND researchers analyzed the experiences of one of those providers, Doctor on Demand, during the 30 days following Hurricanes Harvey and Irma.

Researchers say a key advantage of telemedicine in disaster response is that out-of-state providers can be tapped to expand the response workforce quickly and cost effectively, so long as cellular services and Wi-Fi infrastructure generally remain in place.

The RAND analysis found that 2,057 people affected by Harvey and Irma used Doctor on Demand services, with 63 percent of those being first-time users of the service. Physicians located outside the affected states handled slightly more than half of the visits.

"Relying on direct-to-consumer telehealth services may help relieve the immediate burden on local health care system so that limited in-person care resources can be reserved for those patients with the greatest need, Uscher-Pines said. "Our study illustrates the emerging role for direct-to-consumer telemedicine in disaster response."

Support for the study was provided by RAND and the National Institutes of Health. Other authors of the study are Dr. Shira Fischer and Rosalie Malsberger of RAND, Dr. Ian Tong of Doctor on Demand, Dr. Ateev Mehrotra of the Harvard Medical School, and Dr. Kristin Ray of the University of Pittsburgh School of Medicine.

RAND Health is the nation's largest independent health policy research program, with a broad research portfolio that focuses on health care costs, quality and public health preparedness, among other topics.

Credit: 
RAND Corporation