
Lack of standard dosage for blood thinners can lead to bleeding during bariatric surgery

image: Study authors Luigi Brunetti (left) and Leonid Kagan at the Rutgers Ernest Mario School of Pharmacy found that measuring the enzyme in blood plasma that causes blood to clot provides a more accurate assessment of bleeding risk with blood thinners in obese patients.

Image: 
Nick Romanenko/Rutgers University

Rutgers researchers have found a way to reduce bleeding in patients following bariatric surgery.

The study, which appeared in the journal Surgery for Obesity and Related Disorders, was conducted by Luigi Brunetti and Leonid Kagan at the Rutgers Ernest Mario School of Pharmacy in collaboration with Ragui Sadek at Advanced Surgical and Bariatrics of New Jersey.

More than 30 percent of adults in the United States are considered obese, and 228,000 bariatric surgeries, procedures that cause weight loss by restricting the amount of food the stomach can hold, were performed last year. One risk of bariatric surgery is clots that block blood vessels after surgery. While blood thinners given before surgery and during hospitalization have been shown to reduce clot risk, the optimal dose, timing and duration of treatment for obese patients have yet to be determined.

"There is no requirement for drug manufacturers to perform research studies in obese individuals, which means there are no standard recommendations," said Brunetti, the lead investigator on the study. "We do not know which methods perform best in preventing blood clots without risk of bleeding in cases of extreme body weight."

In the study, patients received one of two types of blood thinners before surgery: either subcutaneous enoxaparin, which lowers the activity of clotting proteins in the blood, or unfractionated heparin (UFH), which works with a natural protein in the body to block clot formation.

The researchers looked at two measures for determining effectiveness in preventing blood clots: anti-Xa, the standard measure, which assesses blood clot inhibitors in plasma, and endogenous thrombin potential (ETP), which measures the enzyme in blood plasma that causes blood to clot.

The results showed that while both types of blood thinners prevented clots, 80 percent of patients receiving enoxaparin had minor bleeding during surgery, which correlated with the ETP measurement. The results suggest that ETP, which identified the bleeding risk, provides a more accurate assessment of the best dosage for the various blood thinners.

"Currently, most physicians prescribe 40 milligrams of enoxaparin twice daily, and some are advocating for even higher doses for patients with extreme obesity," Brunetti said. "That dosage is ill-advised based on our study, which found that patients who received that dose experienced minor bleeding. I suspect a higher dosage would result in more minor bleeding and perhaps major bleeding as well. ETP can potentially help physicians better understand the right dosage, so obese patients will see benefits without being harmed."

Credit: 
Rutgers University

School counselors reflect on their experience following student deaths

image: This is Michael Hannon - Journal of Counseling and Development.

Image: 
Mike Peters, Montclair State University

Several themes emerged when five school counselors, all members of a single counseling team, were interviewed to learn how they experienced, both professionally and personally, the deaths of multiple students in their school in one year while attending to the needs of the school community.

The Journal of Counseling & Development study's first theme, gravity of the losses, related to the significance of the losses the counseling team and broader school community experienced as each student died.

The second theme, logistics of care, pertained to how the school counselors managed and navigated the student deaths with the rest of the student body, other school personnel, and each other, both in the initial moments after learning about the deaths and later as students and staff were continuing to process what happened.

The third theme, personal versus professional conflicts, reflected the tension the school counselors experienced between attending to students' grief as professional counselors and processing their own personal grief.

The fourth theme of increased student cohesion represented the school counselors' description of a deeper sense of community among the student body that resulted from the student deaths.

The final theme of efficacy reflected how the counselors repeatedly questioned themselves about their effectiveness as school counselors in general, even with evidence that their work supported strong graduation and employment rates and a safe school environment.

"The collective narrative of these school counselors has provided us with invaluable lessons and perspective. A community is, oftentimes, connected by its local schools, and school counselors are on the front lines when crises occur," said lead author Dr. Michael Hannon, of Montclair State University. "The school counselors who contributed to this study offer us insight into how they provided acute and ongoing care for the school community, while simultaneously balancing and trying to meet their own personal and professional needs. My co-authors and I are deeply grateful for their transparency about such a demanding time."

Credit: 
Wiley

Nuclear deep space travel

video: Oak Ridge National Laboratory scientists have automated part of the process of producing plutonium-238, which is used by NASA to fuel deep space exploration. Resolving this key bottleneck will help boost annual production of the radioisotope towards NASA's goal of 1.5 kilograms of Pu-238 per year by 2025.

Image: 
Genevieve Martin and Jenny Woodbery/Oak Ridge National Laboratory, U.S. Dept. of Energy

Nuclear--Deep space travel

By automating the production of neptunium oxide-aluminum pellets, Oak Ridge National Laboratory scientists have eliminated a key bottleneck when producing plutonium-238 used by NASA to fuel deep space exploration. Pu-238 provides a constant heat source through radioactive decay, a process that has powered spacecraft such as Cassini and the Mars Rover. "Automating part of the Pu-238 production process is helping push annual production from 50 grams to 400 grams, moving closer to NASA's goal of 1.5 kilograms per year by 2025," said ORNL's Bob Wham. "The automation replaces a function our team did by hand and is expected to increase the output of pressed pellets from 80 to 275 per week." Once the pellets are pressed and enclosed in aluminum tubing, they are irradiated at ORNL's High Flux Isotope Reactor and chemically processed into Pu-238 at the Radiochemical Engineering Development Center. In 2012, NASA reached an agreement with the Department of Energy to restart production of Pu-238, and ORNL was selected to lead the project. [Contact: Jason Ellis, (865) 241-5819; ellisjk@ornl.gov]

Video: https://youtu.be/gl8vESVnRBc

Caption: Oak Ridge National Laboratory scientists have automated part of the process of producing plutonium-238, which is used by NASA to fuel deep space exploration. Resolving this key bottleneck will help boost annual production of the radioisotope towards NASA's goal of 1.5 kilograms of Pu-238 per year by 2025. Credit: Genevieve Martin and Jenny Woodbery/Oak Ridge National Laboratory, U.S. Dept. of Energy

Supercomputing--Memory boost

Scientists at Oak Ridge National Laboratory and Hypres, a digital superconductor company, have tested a novel cryogenic, or low-temperature, memory cell circuit design that may boost memory storage while using less energy in future exascale and quantum computing applications. The team used Josephson junctions made from niobium and aluminum-based materials, fabricated at Hypres, for the single-bit memory design on a chip and demonstrated write, read and reset memory operations occurring on the same circuit. "The test showed the viability of memory processing functions to operate faster and more efficiently," ORNL's Yehuda Braiman said. "This could lead to substantially decreased access energies and access times and allow for more circuits to occupy less space." Building on the initial design, ORNL's Braiman, Niketh Nair and Neena Imam continue working on multi-valued memory cell circuits and large arrays of memory cells. Their first step was a ternary memory cell circuit design, which was published in Superconductor Science and Technology. [Contact: Sara Shoemaker, (865) 576-9219; shoemakerms@ornl.gov]

Image 1: https://www.ornl.gov/sites/default/files/Supercomputing-Memory_boost1.jpg

Caption: Oak Ridge National Laboratory and digital superconductor company Hypres designed a layout of four memory cells with different parameters. Their study of cryogenic memory cell circuit designs may boost storage while using less energy in future exascale and quantum computing applications. Credit: Dr. Amir Jafair-Salim/Hypres

Image 2: https://www.ornl.gov/sites/default/files/news/images/Supercomputing-Memory_boost2.jpg

Caption: A fabricated single-bit memory design on a chip developed by Oak Ridge National Laboratory and Hypres demonstrated write, read and reset memory operations occurring on the same circuit. Credit: Dr. Amir Jafair-Salim/Hypres

Buildings--On-the-go HVAC check

Technicians can access a free tool developed by Oak Ridge National Laboratory to support the installation and repair of heating, ventilation and air conditioning systems, particularly when using new refrigerants. Researchers at ORNL have launched a mobile app called fProps to quickly check fluid properties such as refrigerant, coolant and air while installing or repairing HVAC equipment in commercial and residential buildings. Users specify inputs for each property, and a wizard then guides them through the appropriate module within the tool. "With fProps, technicians have at their fingertips a way to evaluate air, coolant, refrigerants and capacity calculation functions," said ORNL's Bo Shen. "This tool also provides additional support for professionals using new low global warming potential refrigerants in HVAC systems." The fProps app is a pilot program, and additional modules can be added in the future based on interest. [Contact: Jennifer Burke, (865) 576-3212; burkejj@ornl.gov]

Image: https://www.ornl.gov/sites/default/files/Building-HVAC_app.jpg

Caption: ORNL's fProps is a mobile phone app that allows HVAC technicians to quickly check fluid properties before equipment installation or repair. Credit: Oak Ridge National Laboratory, U.S. Dept. of Energy

Vehicles--Fuel cell power up

Oak Ridge National Laboratory scientists studying fuel cells as a potential alternative to internal combustion engines used sophisticated electron microscopy to investigate the benefits of replacing high-cost platinum with a lower cost, carbon-nitrogen-manganese-based catalyst. "We used electron microscopy to demonstrate that atomically dispersed manganese can act as an oxygen reduction reaction catalyst while also increasing durability," said ORNL's David Cullen. Fuel cell technologies hold promise for use in vehicles because of their high power density, low operating temperature and carbon-free emissions. Yet the high cost of platinum-based catalysts and the insufficient durability of alternative platinum-free catalysts remain market barriers. "Our team's finding could open up the potential for widespread use in transportation and other energy conversion applications," said Cullen. ORNL researchers were part of a team that produced the results published in Nature Catalysis. [Contact: Jennifer Burke, (865) 576-3212; burkejj@ornl.gov]

Image: https://www.ornl.gov/sites/default/files/Picture2_1.png

Caption: ORNL researchers used high-resolution electron microscopy to show that nitrogen-doped carbon with atomically dispersed manganese can enhance the performance and durability of low-cost platinum-free polymer electrolyte fuel cells, an important step towards use of such fuel cells in transportation applications. Credit: Oak Ridge National Laboratory, U.S. Dept. of Energy

Neutrons--Quest for QSLs

Researchers used neutron scattering at Oak Ridge National Laboratory's Spallation Neutron Source to investigate bizarre magnetic behavior, believed to be a possible quantum spin liquid rarely found in a three-dimensional material. QSLs are exotic states of matter where magnetism continues to fluctuate at low temperatures instead of "freezing" into aligned north and south poles as with traditional magnets. "If you could shrink down to see what individual electrons are doing, it would seem as though nothing special was going on," said Kemp Plumb of Brown University. "But, when you zoom out, a beautiful collective pattern emerges signifying a new phase of matter that hasn't been seen before." Observations of the material's quantum behavior are consistent with the theoretical models. This indicates the material has the right ingredients for fractionalized magnetic excitations that could be harnessed for future quantum information technologies. The research was published in Nature Physics. [Contact: Jeremy Rumsey, (865) 576-2038; rumseyjp@ornl.gov]

Image: https://www.ornl.gov/sites/default/files/18-G01703%20PinchPoint-v2.jpg

Caption: Neutrons reveal a striking pattern of connected "bow ties" that is characteristic of the emergent electron motion in the quantum spin liquid state, observed in a three-dimensional material belonging to a class of minerals used in a wide range of technological applications. Credit: Kemp Plumb/Brown University and Genevieve Martin/Oak Ridge National Laboratory, U.S. Dept. of Energy

Credit: 
DOE/Oak Ridge National Laboratory

Space microbes aren't so alien after all

image: The International Space Station photographed by Expedition 56 crew members from a Soyuz spacecraft after undocking.

Image: 
NASA/Roscosmos

EVANSTON, Ill. --- Microbes stranded in the International Space Station (ISS) are just trying to survive, man.

A new Northwestern University study has found that -- despite its seemingly harsh conditions -- the ISS is not causing bacteria to mutate into dangerous, antibiotic-resistant superbugs.

While the team found that the bacteria isolated from the ISS did contain different genes than their Earthling counterparts, those genes did not make the bacteria more detrimental to human health. The bacteria are instead simply responding, and perhaps evolving, to survive in a stressful environment.

"There has been a lot of speculation about radiation, microgravity and the lack of ventilation and how that might affect living organisms, including bacteria," said Northwestern's Erica Hartmann, who led the study. "These are stressful, harsh conditions. Does the environment select for superbugs because they have an advantage? The answer appears to be 'no.'"

The study was published today (Jan. 8) in the journal mSystems. Hartmann is an assistant professor of environmental engineering in Northwestern's McCormick School of Engineering.

As the conversation about sending travelers to Mars gets more serious, there has been an increasing interest in understanding how microbes behave in enclosed environments.

"People will be in little capsules where they cannot open windows, go outside or circulate the air for long periods of time," said Hartmann. "We're genuinely concerned about how this could affect microbes."

The ISS houses thousands of different microbes, which have traveled into space either on astronauts or in cargo. The National Center for Biotechnology Information maintains a publicly available database containing the genomic analyses of many of the bacteria isolated from the ISS. Hartmann's team used that data to compare the strains of Staphylococcus aureus and Bacillus cereus on the ISS to those on Earth.

Found on human skin, S. aureus includes the tough-to-treat MRSA (methicillin-resistant S. aureus) strain. B. cereus lives in soil and has fewer implications for human health.

"Bacteria that live on skin are very happy there," Hartmann said. "Your skin is warm and has certain oils and organic chemicals that bacteria really like. When you shed those bacteria, they find themselves living in a very different environment. A building's surface is cold and barren, which is extremely stressful for certain bacteria."

To adapt to living on surfaces, the bacteria containing advantageous genes are selected for or they mutate. For those living on the ISS, these genes potentially helped the bacteria respond to stress, so they could eat, grow and function in a harsh environment.

"Based on genomic analysis, it looks like bacteria are adapting to live -- not evolving to cause disease," said Ryan Blaustein, a postdoctoral fellow in Hartmann's laboratory and the study's first author. "We didn't see anything special about antibiotic resistance or virulence in the space station's bacteria."

Although this is good news for astronauts and potential space tourists, Hartmann and Blaustein are careful to point out that unhealthy people can still spread illness on space stations and space shuttles.

"Everywhere you go, you bring your microbes with you," Hartmann said. "Astronauts are exceedingly healthy people. But as we talk about expanding space flight to tourists who do not necessarily meet astronaut criteria, we don't know what will happen. We can't say that if you put someone with an infection into a closed bubble in space that it won't transfer to other people. It's like when someone coughs on an airplane, and everyone gets sick."

Credit: 
Northwestern University

Environmental sustainability should be inherent to dietary guidance

audio: It is the position of the Society for Nutrition Education and Behavior (SNEB) that environmental sustainability should be inherent to dietary guidance, whether working with individuals or groups about their dietary choices or in setting national dietary guidance. Improving the nutritional health of a population is a long-term goal that requires ensuring the long-term sustainability of the food system.

Image: 
Journal of Nutrition Education and Behavior

Philadelphia, January 8, 2019 - It is the position of the Society for Nutrition Education and Behavior (SNEB) that environmental sustainability should be inherent to dietary guidance, whether working with individuals or groups about their dietary choices or in setting national dietary guidance. Improving the nutritional health of a population is a long-term goal that requires ensuring the long-term sustainability of the food system. The position paper is published in the Journal of Nutrition Education and Behavior.

Beginning with a description of current environmental problems, the authors of the position paper discuss the challenges faced in meeting future food needs as well as the recent science behind assessing the environmental impacts of foods and diets. In a subsequent section they cover sustainability, dietary guidance, and research. While there are various dimensions of sustainability to consider, this paper focuses on the environmental dimension.

According to Diego Rose, PhD, School of Public Health and Tropical Medicine, Tulane University, New Orleans, LA, USA, lead author of the position paper, "Based on the best science we have today, it is clear that current environmental problems--including global climate change, biodiversity loss, land degradation, water shortages, and water pollution--demand urgent attention, threaten long-term food security, and are in part caused by our current food choices and agricultural practices."

"The position paper was motivated by the severity of current environmental problems, including global climate change," said Dr. Rose on the creation of the position paper. "A number of studies have been published about the difficulty of getting to 2050 with an adequate worldwide food supply due to factors such as population increase and change in dietary habits."

"The paper was also inspired by the information published in the Dietary Guidelines Advisory Committee's scientific report to the Secretaries of Agriculture and Health and Human Services in 2015, which included a chapter dedicated to sustainability. We wanted to pass this vital information along to others."

Based on the evidence presented throughout the paper, the authors make recommendations on dietary guidance policy, research, and nutrition education practice. In terms of dietary guidance, SNEB recommends that environmental sustainability considerations be included in future federal dietary guidance. Future guidelines should contain specific advice, such as consuming less ruminant animal foods in favor of other protein foods. According to results from the American Climate Values Survey of 2014, about half of Americans might be receptive to dietary advice noting that food choices can affect the environment.

"In discussing dietary recommendations, nutritionists can discuss both the health and environmental impacts of food choices to promote behavior change among consumers," Dr. Rose said. "People want to know what to eat today, so it is incumbent on those of us who are knowledgeable about nutritional science and education techniques to provide the best advice, based on the available evidence to date."

"In order to implement the dietary advice outlined in the position paper, nutrition educators will be interested in pursuing continuing education opportunities," commented Adrienne White, PhD, RDN, professor emerita, University of Maine, Orono, ME, USA, who was the president of SNEB when this position paper was developed. "The SNEB Sustainable Food Systems Division will be an important resource on environmental sustainability in dietary guidance."

Credit: 
Elsevier

Dental flossing and other behaviors linked with higher levels of PFAS in the body

Newton, Mass. (Jan. 8, 2019) - A new study suggests certain types of consumer behaviors, including flossing with Oral-B Glide dental floss, contribute to elevated levels in the body of toxic PFAS chemicals. PFAS are water- and grease-proof substances that have been linked with numerous health problems. The findings provide new insight into how these chemicals end up in people's bodies and how consumers can limit their exposures by modifying their behavior.

The study, led by Silent Spring Institute in collaboration with the Public Health Institute in Berkeley, CA, appears online January 8 in the Journal of Exposure Science & Environmental Epidemiology (JESEE), and is part of a special issue dedicated to PFAS (per- and polyfluoroalkyl substances).

PFAS are used in a range of consumer products, including fast food packaging, non-stick pans, waterproof clothing, and stain-resistant carpets. People can be exposed to the substances directly through the products they use and the food they eat. They can also be exposed through indoor air and dust and contaminated drinking water.

Scientists are concerned about widespread exposure to PFAS in the population because the chemicals have been linked with health effects including kidney and testicular cancer, thyroid disease, high cholesterol, low birth weight, decreased fertility, and effects on the immune system.

In the new study, researchers measured 11 different PFAS chemicals in blood samples taken from 178 middle-aged women enrolled in the Public Health Institute's Child Health and Development Studies, a multigenerational study of the impact of environmental chemicals and other factors on disease.

To understand how people's behavior influences their exposure to PFAS, the researchers then compared the blood measurements with results from interviews in which they asked the women about nine behaviors that could lead to higher exposures. Half of the women in the analysis were non-Hispanic white and half were African American.

Women who flossed with Oral-B Glide tended to have higher levels of a type of PFAS called PFHxS (perfluorohexanesulfonic acid) in their body compared with those who didn't. To further understand the connection, the researchers tested 18 dental flosses (including 3 Glide products) for the presence of fluorine--a marker of PFAS--using a technique called particle-induced gamma-ray emission (PIGE) spectroscopy. All three Glide products tested positive for fluorine, consistent with previous reports that Glide is manufactured using Teflon-like compounds. In addition, two store brand flosses with "compare to Oral-B Glide" labelling and one floss describing itself as a "single strand Teflon fiber" tested positive for fluorine.

"This is the first study to show that using dental floss containing PFAS is associated with a higher body burden of these toxic chemicals," says lead author Katie Boronow, a staff scientist at Silent Spring. "The good news is, based on our findings, consumers can choose flosses that don't contain PFAS."

Other behaviors that were associated with higher PFAS levels included having stain-resistant carpet or furniture and living in a city served by a PFAS-contaminated drinking water supply. Additionally, among African American women, those who frequently ate prepared food in coated cardboard containers, such as French fries or takeout, had elevated blood levels of four PFAS chemicals compared to women who rarely ate such food. The researchers did not see the same relationship with prepared food among non-Hispanic whites.

Overall, non-Hispanic whites tended to have higher levels of two PFAS chemicals, PFOA (perfluorooctanoic acid) and PFHxS, compared with African Americans. The researchers could not explain the differences, suggesting that there are other behaviors they didn't measure that contribute to PFAS exposure.

"Overall, this study strengthens the evidence that consumer products are an important source of PFAS exposure," says Boronow. "Restricting these chemicals from products should be a priority to reduce levels in people's bodies."

Credit: 
Silent Spring Institute

Some Facebook users perceive worsening physical health

Facebook use is linked to perceptions of worsening physical health, new research from the University of Surrey reports.

In the first study of its kind, published today in the journal Heliyon, researchers led by Dr Bridget Dibb investigated the relationship between Facebook and perceptions of physical health. One hundred and sixty-five participants, all Facebook users, were surveyed to measure how much they compared themselves with others on the social networking site, along with their self-esteem, perceived physical health and life satisfaction.

Researchers found that participants who compared themselves to others on Facebook had greater awareness of physical ailments, such as sleep problems, weight change and muscle tension. It is believed that those who compare with others on Facebook may perceive more physical symptoms, but equally, those who perceive more symptoms may compare more with others on Facebook. Social comparison is a process in which we compare ourselves with others in order to evaluate our lives; such comparisons are more likely to occur when we feel uncertain about our situation.

In addition, it was discovered that females and those experiencing anxiety or depression also perceived more symptoms. Participants who were more satisfied with their lives and had higher self-esteem reported fewer physical symptoms.

Researchers believe that the increased use of the social networking site may be associated with more opportunities to compare ourselves unfavourably to others who we perceive to be 'better off' than ourselves both in lifestyle and in health.

Dr Bridget Dibb, Senior Lecturer in Health Psychology at the University of Surrey, said: "Comparing ourselves to others is not a new concept; however, with the rise of social media it is becoming a part of our everyday lives.

"An entity like Facebook, with 2.27 billion active monthly users, has never existed before. The long term effect it has on individuals is unknown, but it is clear that comparison with others is associated with perceptions of ill-health. Users need to be aware of how they feel when they use sites like Facebook and recognise the dangers of comparisons in this context."

Credit: 
University of Surrey

Illegal opioid crisis roadmap overlooks gender

New Haven, Conn. - Women's Health Research at Yale (WHRY) is calling on a government committee to revise its report on a coordinated response to the opioid epidemic so that it reflects the unique needs of women.

In a commentary published in the peer-reviewed journal Biology of Sex Differences, WHRY Director Carolyn M. Mazure, Ph.D., and Jill Becker, Ph.D., chair of the Biopsychology Area of the University of Michigan Psychology Department, detailed the laboratory, clinical, and epidemiological evidence showing the need for the report to endorse and encourage research on sex and gender differences. They argued such data are necessary to generate gender-based interventions that more fully address the opioid epidemic.

"All data must be reported by sex and gender so that gender-specific treatment and prevention strategies derived from this research are provided to practitioners and the public," the authors said. "We encourage biomedical researchers and clinical care providers, as well as the public, to insist that a successful response to the opioid crisis should highlight the importance of understanding sex and gender differences in the current opioid epidemic."

Mazure and Becker noted that the draft report of the White House National Science and Technology Council's Fast-Track Action Committee (FTAC) created to respond to the opioid crisis does include important concerns about maternal and neonatal exposure to opioids. But they said the draft, released in October, overlooks significant and growing data on sex and gender differences in opioid use disorder (OUD). For example, they wrote that women are more likely than men to be prescribed and use opioid analgesics and that females and males experience pain and the effects of opioids differently.

In addition, women more quickly develop addictions after first using addictive substances, and women are more likely than men to relapse after a quit attempt.

The authors also described how women with opioid addiction are more likely than men to have experienced early trauma and have been diagnosed with depression. And women with opioid addiction suffer greater functional impairment in their lives, impacting their ability to work, secure steady housing, and -- because women are more often family caretakers -- avoid negative effects on children.

"Our experimental models will not begin to yield the desired information until they employ appropriate models that include both females and males, and our clinical and epidemiological investigations will not uncover needed data until both women and men are studied," the authors said. "A successful response to the opioid crisis will only be found when scientists, practitioners and the public incorporate the essential importance of understanding sex and gender differences into the solution for OUD."

Credit: 
Yale University

Tiny satellites could be 'guide stars' for huge next-generation telescopes

There are more than 3,900 confirmed planets beyond our solar system. Most of them have been detected because of their "transits" -- instances when a planet passes in front of its star, momentarily blocking some of its light. These dips in starlight can tell astronomers a bit about a planet's size and its distance from its star.
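The connection between a dip in starlight and a planet's size follows from simple geometry: the fractional drop in brightness is roughly the ratio of the planet's disk area to the star's. As a hedged illustration (not part of the study; the radii below are rounded textbook values for Earth and the Sun), the transit-depth calculation can be sketched as:

```python
# Illustrative sketch of the transit method's basic geometry: the fractional
# dip in observed starlight equals the ratio of the planet's and star's
# projected disk areas, i.e. (R_planet / R_star) squared.

def transit_depth(planet_radius_km: float, star_radius_km: float) -> float:
    """Fractional drop in flux when the planet crosses the stellar disk."""
    return (planet_radius_km / star_radius_km) ** 2

# An Earth-sized planet (radius ~6,371 km) crossing a Sun-sized star
# (radius ~695,700 km) dims it by less than 0.01 percent, which is why
# detecting such transits demands extremely precise photometry.
earth_sun = transit_depth(6_371, 695_700)
```

Inverting the same relation lets astronomers estimate a planet's radius from a measured depth, given the star's radius; the depth alone says nothing about the planet's atmosphere, which is why the more powerful telescopes described below are needed.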

But knowing more about the planet, including whether it harbors oxygen, water, and other signs of life, requires far more powerful tools. Ideally, these would be much bigger telescopes in space, with light-gathering mirrors as wide as those of the largest ground observatories. NASA engineers are now developing designs for such next-generation space telescopes, including "segmented" telescopes with multiple small mirrors that could be assembled or unfurled to form one very large telescope once launched into space.

NASA's upcoming James Webb Space Telescope is an example of a segmented primary mirror design, with a diameter of 6.5 meters and 18 hexagonal segments. Next-generation space telescopes are expected to be as large as 15 meters, with over 100 mirror segments.

One challenge for segmented space telescopes is how to keep the mirror segments stable and pointing collectively toward an exoplanetary system. Such telescopes would be equipped with coronagraphs -- instruments that are sensitive enough to discern between the light given off by a star and the considerably weaker light emitted by an orbiting planet. But the slightest shift in any of the telescope's parts could throw off a coronagraph's readings and disrupt the measurement of oxygen, water, or other planetary features.

Now MIT engineers propose that a second, shoebox-sized spacecraft equipped with a simple laser could fly at a distance from the large space telescope and act as a "guide star," providing a steady, bright light near the target system that the telescope could use as a reference point in space to keep itself stable.

In a paper published today in the Astronomical Journal, the researchers show that the design of such a laser guide star would be feasible with today's existing technology. The researchers say that using the laser light from the second spacecraft to stabilize the system relaxes the demand for precision in a large segmented telescope, saving time and money, and allowing for more flexible telescope designs.

"This paper suggests that in the future, we might be able to build a telescope that's a little floppier, a little less intrinsically stable, but could use a bright source as a reference to maintain its stability," says Ewan Douglas, a postdoc in MIT's Department of Aeronautics and Astronautics and a lead author on the paper.

The paper also includes Kerri Cahoy, associate professor of aeronautics and astronautics at MIT, along with graduate students James Clark and Weston Marlow at MIT, and Jared Males, Olivier Guyon, and Jennifer Lumbres from the University of Arizona.

In the crosshairs

For over a century, astronomers have been using actual stars as "guides" to stabilize ground-based telescopes.

"If imperfections in the telescope motor or gears were causing your telescope to track slightly faster or slower, you could watch your guide star on a crosshairs by eye, and slowly keep it centered while you took a long exposure," Douglas says.

In the 1990s, scientists started using lasers on the ground as artificial guide stars by exciting sodium in the upper atmosphere, pointing the lasers into the sky to create a point of light some 40 miles from the ground. Astronomers could then stabilize a telescope using this light source, which could be generated anywhere the astronomer wanted to point the telescope.

"Now we're extending that idea, but rather than pointing a laser from the ground into space, we're shining it from space, onto a telescope in space," Douglas says. Ground telescopes need guide stars to counter atmospheric effects, but space telescopes for exoplanet imaging have to counter minute changes in the system temperature and any disturbances due to motion.

The space-based laser guide star idea arose out of a project that was funded by NASA. The agency has been considering designs for large, segmented telescopes in space and tasked the researchers with finding ways of bringing down the cost of the massive observatories.

"The reason this is pertinent now is that NASA has to decide in the next couple years whether these large space telescopes will be our priority in the next few decades," Douglas says. "That decision-making is happening now, just like the decision-making for the Hubble Space Telescope happened in the 1960s, but it didn't launch until the 1990s."

Star fleet

Cahoy's lab has been developing laser communications for use in CubeSats, which are shoebox-sized satellites that can be built and launched into space at a fraction of the cost of conventional spacecraft.

For this new study, the researchers looked at whether a laser, integrated into a CubeSat or slightly larger SmallSat, could be used to maintain the stability of a large, segmented space telescope modeled after NASA's LUVOIR (for Large UV Optical Infrared Surveyor), a conceptual design that includes multiple mirrors that would be assembled in space.

Researchers have estimated that such a telescope would have to remain perfectly still, within 10 picometers -- about a tenth the diameter of a hydrogen atom -- in order for an onboard coronagraph to take accurate measurements of a planet's light, apart from its star.

"Any disturbance on the spacecraft, like a slight change in the angle of the sun, or a piece of electronics turning on and off and changing the amount of heat dissipated across the spacecraft, will cause slight expansion or contraction of the structure," Douglas says. "If you get disturbances bigger than around 10 picometers, you start seeing a change in the pattern of starlight inside the telescope, and the changes mean that you can't perfectly subtract the starlight to see the planet's reflected light."

The team came up with a general design for a laser guide star that would be far enough away from a telescope to be seen as a fixed star -- about tens of thousands of miles away -- and that would point back and send its light toward the telescope's mirrors, each of which would reflect the laser light toward an onboard camera. That camera would measure the phase of this reflected light over time. Any change of 10 picometers or more would signal a compromise to the telescope's stability that onboard actuators could then quickly correct.
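As a rough illustration of that stability check (this is not the authors' control software, and the 1550-nanometer laser wavelength is an assumption chosen for the example), the sketch below converts a measured change in the reflected laser's phase into a mirror displacement and flags any drift beyond the ~10-picometer tolerance:

```python
import math

# Illustrative sketch only: convert a phase change measured on the reflected
# guide-star laser into a displacement estimate, and flag drifts beyond the
# stability tolerance quoted in the article.
WAVELENGTH_M = 1.55e-6   # assumed laser wavelength (telecom band), not from the paper
TOLERANCE_M = 10e-12     # ~10 picometer stability requirement cited above

def phase_to_displacement(delta_phase_rad: float) -> float:
    """Mirror displacement implied by a round-trip phase change.

    A mirror shift of d changes the round-trip optical path by 2*d,
    i.e. a phase change of 4*pi*d / wavelength.
    """
    return delta_phase_rad * WAVELENGTH_M / (4 * math.pi)

def correction_needed(delta_phase_rad: float) -> bool:
    """True when the implied drift exceeds the tolerance."""
    return abs(phase_to_displacement(delta_phase_rad)) > TOLERANCE_M
```

A real system would run this comparison continuously and feed the resulting error signal to the mirror actuators.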

To see if such a laser guide star design would be feasible with today's laser technology, Douglas and Cahoy worked with colleagues at the University of Arizona to model light sources of different brightness, figuring out, for instance, how bright a laser would have to be to provide a certain amount of information about a telescope's position, or to provide stability, using models of segment stability from large space telescopes. They then drew up a set of existing laser transmitters and calculated how stable, strong, and far away each laser would have to be from the telescope to act as a reliable guide star.

In general, they found laser guide star designs are feasible with existing technologies, and that the system could fit entirely within a SmallSat about the size of a cubic foot. Douglas says that a single guide star could conceivably follow a telescope's "gaze," traveling from one star to the next as the telescope switches its observation targets. However, this would require the smaller spacecraft to travel hundreds of thousands of miles in tandem with the telescope, holding its distance, as the telescope repositions itself to look at different stars.

Instead, Douglas says a small fleet of guide stars could be deployed, affordably, and spaced across the sky, to help stabilize a telescope as it surveys multiple exoplanetary systems. Cahoy points out that the recent success of NASA's MarCO CubeSats, which supported the Mars InSight lander as a communications relay, demonstrates that CubeSats with propulsion systems can work in interplanetary space, for longer durations and at large distances.

"Now we're analyzing existing propulsion systems and figuring out the optimal way to do this, and how many spacecraft we'd want leapfrogging each other in space," Douglas says. "Ultimately, we think this is a way to bring down the cost of these large, segmented space telescopes."

Credit: 
Massachusetts Institute of Technology

DNA on auto-pilot

image: Hao Yan directs the Biodesign Center for Molecular Design and Biomimetics and is the Milton D. Glick Distinguished Professor in the School of Molecular Sciences at ASU.

Image: 
The Biodesign Institute at ASU

Nature has made extravagant use of a simple molecule--DNA, the floorplan of all earthly life.

Inventive researchers have used the same base-pairing properties that bond two strands of DNA into the familiar double helix to build innumerable useful structures at the nanometer scale.

One such method, known as DNA origami, has yielded rich results in recent years, enabling the construction of a rapidly growing menagerie of 2- and 3-dimensional objects, with far-flung applications in material science, nanoelectronics, photonics and the biomedical arena.

In new research appearing in the current issue of the journal Science Advances, Hao Yan and his colleagues, in collaboration with scientists at MIT, describe a method allowing for the automation of DNA origami construction, vastly accelerating and simplifying the process of crafting desired forms, and opening the world of DNA architecture to a broader audience.

"DNA origami design has come to the time that we now can draw a form freely and ask the computer to output what is needed to build the target form," Yan says.

The variety and versatility of DNA nanoarchitectures have enabled their application in tiny logic gates and nanocomputers, advanced materials with unique properties, nanoelectronics and nanocircuitry, and structures displaying dynamic properties, including nanotweezers, nanowalkers and nanorobots.

In recent research, DNA origami nanostructures have demonstrated the ability to improve the effectiveness of chemotherapy, reduce therapeutic side-effects and even manage drug resistance.

Yan directs the Biodesign Center for Molecular Design and Biomimetics and is the Milton D. Glick Distinguished Professor in the School of Molecular Sciences at ASU. He is joined by Biodesign researchers Fei Zhang and Xiaodong Qi, along with colleagues led by Prof. Mark Bathe from the Departments of Biological Engineering and Chemical Engineering at MIT. The ASU team contributed its expertise to validate the designs computed by the MIT team.

Endless forms

The power of structural DNA origami lies in the method's capacity to design and construct a virtually limitless array of forms, which self-assemble from their component parts. The basic technique involves a length of single-stranded DNA designed to fold elaborately into desired shapes through base pairing of its four complementary nucleotides. To complete the nanoform, short DNA segments from 20-60 nucleotides in length--known as staple strands--are added, acting to pin the folded scaffold structure in place by base-pairing at pre-selected locations.
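The base-pairing rule doing the work here can be shown with a toy helper (purely illustrative; real staple design also involves crossovers and routing, which is what the automation described below handles): a staple binds a scaffold segment via its reverse complement, because paired DNA strands run antiparallel.

```python
# Toy illustration of Watson-Crick pairing behind staple design (not the
# actual design algorithm): the staple that pins a scaffold segment is the
# segment's reverse complement (A pairs with T, G pairs with C).
COMPLEMENT = {"A": "T", "T": "A", "G": "C", "C": "G"}

def staple_for(scaffold_segment: str) -> str:
    """Return the strand that would hybridize to the given scaffold segment."""
    return "".join(COMPLEMENT[base] for base in reversed(scaffold_segment.upper()))
```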

Initially, DNA origami was used to design fairly humble 2D structures, including stars, triangles and smiley faces. These objects, measuring just billionths of a meter in diameter, can only be seen with sophisticated imaging technology, principally, atomic force microscopy (AFM). The DNA origami technique has since undergone rapid expansion, permitting the design and construction of nearly any arbitrary two- or three-dimensional object a researcher may envision.

The rapid advance of such technology is due to the expanded possibilities for DNA construction via scaffolded DNA origami as well as the safety and stability of DNA in physiological environments.

But while the nanostructures self-assemble with impressive reliability, the actual design phase required to engineer the varied forms has been complex and highly labor-intensive, particularly the design of staple strands needed to fold the long scaffold strand to the target geometry.

This step is typically handled manually for each geometric form, with the aid of visualization software, presenting a significant hurdle in the process. The lack of systematic design rules for producing accurate staple strands and folded scaffolding has meant that the powerful technology of DNA origami has been largely reserved for experts in the field.

Folding made easy

The new study offers a fully automated alternative that permits the design of all DNA staple sequences needed to fold any free-form 2D scaffolded DNA origami object. Previously, researchers had devised ways of automating staple strand design for 3D polygonal structures, but the ability to replicate this with arbitrary 2D nanoforms has been elusive, until now.

For the automated process, the designer of a given 2D structure first makes a simple freehand drawing of just the outer border of the desired shape. This drawing is used as the input, with the internal structure of the design determined automatically, using the program's specialized algorithm. In an alternate method, complete internal and external boundary geometries are drawn freehand with continuous lines.

Using either technique, the 2D line-based geometric representation is used as input for the algorithm that performs automatic scaffold and staple routings, after which, the resulting DNA sequences for the scaffold and staple strands can be ordered from commercial outlets, mixed in a test tube according to a prescribed recipe of heating and cooling and self-assembled into finished structures that can be visualized using AFM imaging.

The two approaches provide non-experts with the means to easily synthesize complex nanostructures, helping to advance the field.

Draw it, build it!

The study demonstrates the automated sequence design of 15 irregular design objects, featuring triangular mesh, square and honeycomb geometries with varying shape parameters. The researchers have dubbed their algorithmic approach PERDIX (Programmed Eulerian Routing for DNA Design using X-overs), and the program is available to the research community at: http://perdix-dna-origami.org

PERDIX is a deceptively simple and user-friendly means of producing 2D DNA nanostructures, which has only been made possible after many years of trial and error to flesh out complex, generalizable design rules, making design automation a reality.

The PERDIX design software can automatically convert any 2D polygonal mesh design into a blueprint for a scaffolded DNA nanoform, using simple Computer-Aided Design (CAD). The simplicity and speed of the software empowers non-specialists to translate virtually any 2D polygonal shape into a completed design needed to "print" a 2D nanometer-scale structure.

As the authors note, the design program permits customized nanometer-scale templating of molecules, including dyes, nucleic acids, proteins, and semiconductor nanocrystals. Resulting forms may find their way into varied applications in nanophotonics, nanoscale energy transport, biomolecular sensing, intelligent drug delivery, structural studies, and cell-based binding.

Credit: 
Arizona State University

Carrying and releasing nanoscale cargo with 'nanowrappers'

image: The 3-D structure and chemical composition characterizations of the products obtained after five minutes (a), 20 minutes (b), and one hour (c). The scanning electron microscope images (subscript 1, scale bars are 100 nanometers), reconstructed 3-D volume renderings (subscript 2), and 3-D elemental mappings (subscript 3, gold in green and silver in red) show the transformation of the silver nanocubes into gold-silver nanowrappers.

Image: 
Brookhaven National Laboratory

UPTON, NY--This holiday season, scientists at the Center for Functional Nanomaterials (CFN)--a U.S. Department of Energy Office of Science User Facility at Brookhaven National Laboratory--have wrapped a box of a different kind. Using a one-step chemical synthesis method, they engineered hollow metallic nanosized boxes with cube-shaped pores at the corners and demonstrated how these "nanowrappers" can be used to carry and release DNA-coated nanoparticles in a controlled way. The research is reported in a paper published on Dec. 12 in ACS Central Science, a journal of the American Chemical Society (ACS).

"Imagine you have a box but you can only use the outside and not the inside," said co-author Oleg Gang, leader of the CFN Soft and Bio Nanomaterials Group. "This is how we've been dealing with nanoparticles. Most nanoparticle assembly or synthesis methods produce solid nanostructures. We need methods to engineer the internal space of these structures."

"Compared to their solid counterparts, hollow nanostructures have different optical and chemical properties that we would like to use for biomedical, sensing, and catalytic applications," added corresponding author Fang Lu, a scientist in Gang's group. "In addition, we can introduce surface openings in the hollow structures where materials such as drugs, biological molecules, and even nanoparticles can enter and exit, depending on the surrounding environment."

Synthetic strategies have been developed to produce hollow nanostructures with surface pores, but typically the size, shape, and location of these pores cannot be well-controlled. The pores are randomly distributed across the surface, resulting in a Swiss-cheese-like structure. A high level of control over surface openings is needed in order to use nanostructures in practical applications--for example, to load and release nanocargo.

In this study, the scientists demonstrated a new pathway for chemically sculpting gold-silver alloy nanowrappers with cube-shaped corner holes from solid nanocube particles. They used a chemical reaction known as nanoscale galvanic replacement. During this reaction, the atoms in a silver nanocube get replaced by gold ions in an aqueous solution at room temperature. The scientists added a molecule (surfactant, or surface-capping agent) to the solution to direct the leaching of silver and the deposition of gold on specific crystalline facets.

"The atoms on the faces of the cube are arranged differently from those in the corners, and thus different atomic planes are exposed, so the galvanic reaction may not proceed the same way in both areas," explained Lu. "The surfactant we chose binds to the silver surface just enough--not too strongly or weakly--so that gold and silver can interact. Additionally, the absorption of surfactant is relatively weak on the silver cube's corners, so the reaction is most active here. The silver gets 'eaten' away from its edges, resulting in the formation of corner holes, while gold gets deposited on the rest of the surface to create a gold and silver shell."

To capture the structural and chemical composition changes of the overall structure at the nanoscale in 3-D and at the atomic level in 2-D as the reaction proceeded over three hours, the scientists used electron microscopes at the CFN. The 2-D electron microscope images with energy-dispersive X-ray spectroscopy (EDX) elemental mapping confirmed that the cubes are hollow and composed of a gold-silver alloy. The 3-D images they obtained through electron tomography revealed that these hollow cubes feature large cube-shaped holes at the corners.

"In electron tomography, 2-D images collected at different angles are combined to reconstruct an image of an object in 3-D," said Gang. "The technique is similar to a CT [computerized tomography] scan used to image internal body structures, but it is carried out on a much smaller size scale and uses electrons instead of x-rays."

The scientists also confirmed the transformation of nanocubes to nanowrappers through spectroscopy experiments capturing optical changes. The spectra showed that the optical absorption of the nanowrappers can be tuned depending on the reaction time. At their final state, the nanowrappers absorb infrared light.

"The absorption spectrum showed a peak at 1250 nanometers, one of the longest wavelengths reported for nanoscale gold or silver," said Gang. "Typically, gold and silver nanostructures absorb visible light. However, for various applications, we would like those particles to absorb infrared light--for example, in biomedical applications such as phototherapy."

Using the synthesized nanowrappers, the scientists then demonstrated how spherical gold nanoparticles of an appropriate size that are capped with DNA could be loaded into and released from the corner openings by changing the concentration of salt in the solution. DNA is negatively charged (owing to the oxygen atoms in its phosphate backbone) and changes its configuration in response to increasing or decreasing concentrations of a positively charged ion such as salt. In high salt concentrations, DNA chains contract because their repulsion is reduced by the salt ions. In low salt concentrations, DNA chains stretch because their repulsive forces push them apart.

When the DNA strands contract, the nanoparticles become small enough to fit in the openings and enter the hollow cavity. The nanoparticles can then be locked within the nanowrapper by decreasing the salt concentration. At this lower concentration, the DNA strands stretch, thereby making the nanoparticles too large to go through the pores. The nanoparticles can leave the structure through a reverse process of increasing and decreasing the salt concentration.
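This gating behavior can be caricatured with a toy size model (the numbers and the one-third contraction factor are made up for illustration, not taken from the paper): the particle's effective diameter is its core plus a DNA brush that contracts at high salt and extends at low salt.

```python
# Toy model of salt-gated loading (illustrative numbers, not from the paper):
# a DNA-coated particle's effective diameter is its core plus a DNA brush that
# contracts at high salt and extends at low salt, gating passage through a pore.
def effective_diameter_nm(core_nm: float, brush_extended_nm: float, high_salt: bool) -> float:
    # Assumption for this sketch: the brush contracts to a third of its
    # extended length at high salt concentration.
    brush = brush_extended_nm / 3 if high_salt else brush_extended_nm
    return core_nm + 2 * brush

def fits_through_pore(core_nm: float, brush_extended_nm: float,
                      pore_nm: float, high_salt: bool) -> bool:
    return effective_diameter_nm(core_nm, brush_extended_nm, high_salt) <= pore_nm
```

With a hypothetical 10 nm core, 6 nm brush, and 18 nm pore, the particle enters at high salt (14 nm effective diameter) and is trapped at low salt (22 nm).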

"Our electron microscopy and optical spectroscopy studies confirmed that the nanowrappers can be used to load and release nanoscale components," said Lu. "In principle, they could be used to release optically or chemically active nanoparticles in particular environments, potentially by changing other parameters such as pH or temperature."

Going forward, the scientists are interested in assembling the nanowrappers into larger-scale architectures, extending their method to other bimetallic systems, and comparing the internal and external catalytic activity of the nanowrappers.

"We did not expect to see such regular, well-defined holes," said Gang. "Usually, this level of control is quite difficult to achieve for nanoscale objects. Thus, our discovery of this new pathway of nanoscale structure formation is very exciting. The ability to engineer nano-objects with a high level of control is important not only to understanding why certain processes are happening but also to constructing targeted nanostructures for various applications, from nanomedicine and optics to smart materials and catalysis. Our new synthesis method opens up unique opportunities in these areas."

"This work was made possible by the world-class expertise in nanomaterial synthesis and capabilities that exist at the CFN," said CFN Director Charles Black. "In particular, the CFN has a leading program in the synthesis of new materials by assembly of nanoscale components, and state-of-the-art electron microscopy and optical spectroscopy capabilities for studying the 3-D structure of these materials and their interaction with light. All of these characterization capabilities are available to the nanoscience research community through the CFN user program. We look forward to seeing the advances in nano-assembly that emerge as scientists across academia, industry, and government make use of the capabilities in their research."

Credit: 
DOE/Brookhaven National Laboratory

Forest soundscapes monitor conservation efforts inexpensively, effectively

image: A version of a bioacoustic data recorder. (Photo credit: Justine Hausheer, The Nature Conservancy)

Image: 
Justine Hausheer, The Nature Conservancy

Recordings of the sounds in tropical forests could unlock secrets about biodiversity and aid conservation efforts around the world, according to a perspective paper published in Science.

Compared to on-the-ground fieldwork, bioacoustics -- recording entire soundscapes, including animal and human activity -- is relatively inexpensive and produces powerful conservation insights. The result is troves of ecological data in a short amount of time.

Because these enormous datasets require robust computational power, the researchers argue that a global organization should be created to host an acoustic platform that produces on-the-fly analysis. Not only could the data be used for academic research, but it could also monitor conservation policies and strategies employed by companies around the world.

"Nongovernmental organizations and the conservation community need to be able to truly evaluate the effectiveness of conservation interventions. It's in the interest of certification bodies to harness the developments in bioacoustics for better enforcement and effective measurements," said Zuzana Burivalova, a postdoctoral research fellow in Professor David Wilcove's lab at Princeton University's Woodrow Wilson School of Public and International Affairs.

"Beyond measuring the effectiveness of conservation projects and monitoring compliance with forest protection commitments, networked bioacoustic monitoring systems could also generate a wealth of data for the scientific community," said co-author Rhett Butler of the environmental news outlet Mongabay.

Burivalova and Butler co-authored the paper with Edward Game, who is based at the Nature Conservancy and the University of Queensland.

The researchers explain that while satellite imagery can be used to measure deforestation, it often fails to detect other subtle ecological degradations like overhunting, fires, or invasion by exotic species. Another common measure of biodiversity is field surveys, but those are often expensive, time-consuming and cover limited ground.

Depending on the vegetation of the area and the animals living there, bioacoustics can record animal sounds and songs from several hundred meters away. Devices can be programmed to record at specific times or continuously if there is solar power or a cellular network signal. They can also record a range of taxonomic groups including birds, mammals, insects, and amphibians. To date, several multiyear recordings have already been completed.
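One of the simplest things such recordings support is a crude activity measure. The sketch below is an illustrative metric of my own construction, not one named in the article: it reports the fraction of short audio frames whose RMS amplitude exceeds a threshold, a rough proxy for how acoustically active a soundscape is.

```python
import math

# Illustrative soundscape metric (not a method named in the article): the
# fraction of fixed-length frames whose RMS amplitude exceeds a threshold.
def acoustic_activity(samples, frame_len=4, threshold=0.1):
    frames = [samples[i:i + frame_len]
              for i in range(0, len(samples) - frame_len + 1, frame_len)]
    if not frames:
        return 0.0

    def rms(frame):
        return math.sqrt(sum(s * s for s in frame) / len(frame))

    return sum(1 for f in frames if rms(f) > threshold) / len(frames)
```

Real bioacoustic pipelines use far richer indices and species classifiers, but the same frame-by-frame structure underlies them.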

Bioacoustics can help effectively enforce policy efforts as well. Many companies are engaged in zero-deforestation efforts, which means they are legally obligated to produce goods without clearing large forests. Bioacoustics can quickly and cheaply determine how much forest has been left standing.

"Companies are adopting zero deforestation commitments, but these policies do not always translate to protecting biodiversity due to hunting, habitat degradation, and sub-canopy fires. Bioacoustic monitoring could be used to augment satellites and other systems to monitor compliance with these commitments, support real-time action against prohibited activities like illegal logging and poaching, and potentially document habitat and species recovery," Butler said.

Further, these recordings can be used to measure climate change effects. While the sounds might not be able to assess slow, gradual changes, they could help determine the influence of abrupt, quick differences to land caused by manufacturing or hunting, for example.

Credit: 
Princeton School of Public and International Affairs

Physicists uncover new competing state of matter in superconducting material

image: Ames Laboratory researchers used laser pulses of less than a trillionth of a second in much the same way as flash photography, in order to take a series of snapshots. Called terahertz spectroscopy, this technique can be thought of as "laser strobe photography" where many quick images reveal the subtle movement of electron pairings inside the materials using long wavelength far-infrared light.

Image: 
US Department of Energy, Ames Laboratory

A team of experimentalists at the U.S. Department of Energy's Ames Laboratory and theoreticians at the University of Alabama at Birmingham discovered a remarkably long-lived new state of matter in an iron pnictide superconductor, which reveals a laser-induced formation of collective behaviors that compete with superconductivity.

"Superconductivity is a strange state of matter, in which the pairing of electrons makes them move faster," said Jigang Wang, Ames Laboratory physicist and Iowa State University professor. "One of the big problems we are trying to solve is how different states in a material compete for those electrons, and how to balance competition and cooperation to increase the temperature at which a superconducting state emerges."

To get a closer look, Wang and his team used laser pulses of less than a trillionth of a second in much the same way as flash photography, in order to take a series of snapshots. Called terahertz spectroscopy, this technique can be thought of as "laser strobe photography" where many quick images reveal the subtle movement of electron pairings inside the materials using long wavelength far-infrared light.

"The ability to see these real-time dynamics and fluctuations is a way to understand them better, so that we can create better superconducting electronics and energy-efficient devices," said Wang.

Credit: 
DOE/Ames National Laboratory

Surrey AI predicts cancer patients' symptoms

Doctors could get a head start treating cancer thanks to new AI developed at the University of Surrey that is able to predict symptoms and their severity throughout the course of a patient's treatment.

In what is believed to be the first study of its kind, published in the PLOS One journal, researchers from the Centre for Vision, Speech and Signal Processing (CVSSP) at the University of Surrey detail how they created two machine learning models that are both able to accurately predict the severity of three common symptoms faced by cancer patients - depression, anxiety and sleep disturbance. All three symptoms are associated with severe reduction in cancer patients' quality of life.

Researchers analysed existing data on the symptoms experienced by cancer patients during the course of computed tomography x-ray treatment. The team used different time windows within these data to test whether the machine learning algorithms could accurately predict when and if symptoms surfaced.

The results found that the actual reported symptoms were very close to those predicted by the machine learning methods.
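To give a sense of what "predicting a later symptom score from earlier ones" means in the simplest possible case (the study's actual models, features, and data are not reproduced here; the severity scores below are made up), here is a least-squares line fit used as a one-variable predictor:

```python
# Minimal sketch on made-up data (the study's actual models are not shown
# here): predict a later symptom-severity score from an earlier one with
# ordinary least-squares regression.
def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Hypothetical pairs: (severity at an early assessment, severity later on)
train = [(1.0, 1.2), (2.0, 2.1), (3.0, 3.3), (4.0, 4.1)]
slope, intercept = fit_line([x for x, _ in train], [y for _, y in train])

def predict(early_score: float) -> float:
    return slope * early_score + intercept
```

The study's models handle many symptoms and time points at once, but the evaluation idea is the same: fit on earlier windows, then compare predictions against later reported scores.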

This work has been a collaboration between the University of Surrey and the University of California in San Francisco (UCSF). The UCSF research in this joint collaboration is led by Professor Christine Miaskowski.

Payam Barnaghi, Professor of Machine Intelligence at the University of Surrey, said: "These exciting results show that there is an opportunity for machine learning techniques to make a real difference in the lives of people living with cancer. They can help clinicians identify high-risk patients, help and support their symptom experience and pre-emptively plan a way to manage those symptoms and improve quality of life."

Nikos Papachristou, who worked on designing the machine learning algorithms for this project, said: "I am very excited to see how machine learning and AI can be used to create solutions that have a positive impact on the quality of life and well-being of patients."

Credit: 
University of Surrey

Radiation doses from CT scans should be more consistent, say experts

Large differences in radiation doses used for CT scans are mainly due to how the scanners are used by medical staff rather than differences in the patients scanned or the machines used, finds a study in The BMJ today.

Setting more consistent dose standards should therefore be possible and will ensure that patients are not exposed to unnecessary radiation risks.

Computed tomography (CT) scanning creates detailed pictures of areas inside the body to help diagnose a range of conditions. But CT radiation is associated with an increased risk of cancer, so it is important to minimise exposure and reduce unnecessary variation.

In fact, evidence suggests that CT radiation doses are highly variable across patients, institutions, and countries and, in many cases, doses can be reduced by 50% or more without reducing image quality and diagnostic accuracy.

To better understand the factors contributing to this variation, an international research team analysed dose data for over 2 million CT scans from 151 institutions, across seven countries.

They included scans of the abdomen, chest, combined chest and abdomen, and head from 1.7 million adults between November 2015 and August 2017.

They adjusted the data for a range of variables related to the patient (e.g. sex and size), institution (e.g. trauma centre, academic or private), and machine (e.g. manufacturer and model).

The researchers found that most of these factors had only a small effect on dose variation across countries.

For example, after adjusting for patient characteristics, there was still a fourfold range in mean effective dose for abdominal scans and a 17-fold range in proportion of high dose scans (4-69%). Similar variation persisted for chest scans, and combined chest and abdomen scans.

Adjusting for institution and machine factors also had little effect on dose variation.

However, adjusting for technical factors (how scanners were used by medical staff) substantially reduced or eliminated nearly all the dose variation across countries.

As such, the researchers conclude that the variation in doses used for CT scanning of patients is primarily driven by how CT scanners are used, rather than by underlying differences in the patients scanned or the machines used.
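The adjustment logic can be seen in a tiny made-up example (the doses and the "high-dose protocol" flag below are invented for illustration, not the study's data): two countries whose raw mean doses differ widely look nearly identical once a technical choice is held fixed.

```python
# Toy numeric illustration of the adjustment idea: cross-country dose
# differences that look large in raw means can vanish once a technical
# factor (here, a hypothetical high-dose-protocol flag) is held fixed.
scans = [
    # (country, uses_high_dose_protocol, effective_dose_mSv)
    ("A", False, 5.0), ("A", False, 5.2), ("A", True, 15.0),
    ("B", True, 15.2), ("B", True, 14.8), ("B", False, 5.1),
]

def mean_dose(rows):
    doses = [dose for _, _, dose in rows]
    return sum(doses) / len(doses)

# Raw country means differ substantially...
raw_A = mean_dose([r for r in scans if r[0] == "A"])
raw_B = mean_dose([r for r in scans if r[0] == "B"])

# ...but within the same protocol, the countries look alike.
low_A = mean_dose([r for r in scans if r[0] == "A" and not r[1]])
low_B = mean_dose([r for r in scans if r[0] == "B" and not r[1]])
```

The study's analysis uses proper statistical adjustment across many variables, but the conclusion has this shape: conditioning on technical factors removes most of the apparent between-country variation.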

This is an observational study, and as such, can't establish cause, and the researchers point to some limitations that may have influenced the results.

Nevertheless, they say these findings suggest that optimising doses to a consistent standard should be possible. And they call for more education and international collaboration to set benchmarks for optimum target doses.

Credit: 
BMJ Group