Tech

Towards an AI diagnosis like the doctor's

image: This series shows four versions of the same eye scan described in this article. From left to right: the original image; anomalies highlighted by human experts; anomalies highlighted by a single pass of an AI system; and anomalies highlighted by an iterative series of passes by an AI system. Note that the iterative process yields a more complete overview of the anomalies present than the single assessment.

Image: 
Cristina González-Gonzalo

Artificial intelligence (AI) is an important innovation in diagnostics because it can quickly learn to recognize abnormalities that a doctor would also label as disease. But the way these systems work is often opaque, and doctors still have a better "overall picture" when they make a diagnosis. In a new publication, researchers from Radboudumc show how they can make an AI system reveal how it is working and let it diagnose more like a doctor, making AI systems more relevant to clinical practice.

Doctor vs AI

In recent years, artificial intelligence has been on the rise in the diagnosis of medical imaging. A doctor can look at an X-ray or biopsy to identify abnormalities, but this can increasingly also be done by an AI system by means of "deep learning" (see 'Background: what is deep learning' below). Such a system learns to arrive at a diagnosis on its own, and in some cases it does this just as well or better than experienced doctors.

The two major differences compared to a human doctor are, first, that AI is often not transparent in how it's analyzing the images, and, second, that these systems are quite "lazy". AI looks at what is needed for a particular diagnosis, and then stops. This means that a scan does not always identify all abnormalities, even if the diagnosis is correct. A doctor, especially when considering the treatment plan, looks at the big picture: what do I see? Which anomalies should be removed or treated during surgery?

AI more like the doctor

To make AI systems more attractive for clinical practice, Cristina González-Gonzalo, a PhD candidate at the A-eye Research and Diagnostic Image Analysis Group of Radboudumc, developed a two-sided innovation for diagnostic AI. She did this using eye scans showing abnormalities of the retina - specifically diabetic retinopathy and age-related macular degeneration. These abnormalities can be easily recognized by both a doctor and AI, but they also often occur in groups. A classic AI would diagnose one or a few spots and stop the analysis. In the process developed by González-Gonzalo, however, the AI goes through the picture over and over again, learning to ignore the places it has already examined and thus discovering new ones. Moreover, the AI also shows which areas of the eye scan it deemed suspicious, making the diagnostic process transparent.

An iterative process

A basic AI arrives at a diagnosis from a single assessment of the eye scan, and thanks to the first contribution by González-Gonzalo, it can now show how it arrived at that diagnosis. This visual explanation shows that the system is indeed lazy - stopping the analysis after it has obtained just enough information to make a diagnosis. That is why she also made the process iterative in an innovative way, forcing the AI to look harder and build something closer to the 'complete picture' a radiologist would form.

How did the system learn to look at the same eye scan with 'fresh eyes'? It ignored the familiar parts by digitally filling in the abnormalities it had already found with healthy tissue from around each abnormality. The results of all assessment rounds are then added together to produce the final diagnosis. In the study, this approach improved the sensitivity of the detection of diabetic retinopathy and age-related macular degeneration by 11.2 ± 2.0% per image. The project shows that it is possible to have an AI system assess images more like a doctor while making transparent how it does so. This might make such systems easier to trust and thus more readily adopted by radiologists.
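As an illustration of the iterative idea, the loop below sketches, in Python, how an assess-highlight-inpaint cycle could be wired together. It is a minimal toy sketch, not the published method: `model`, `saliency` and `inpaint` stand for a trained classifier, a visual-explanation method and an inpainting routine, all of which are hypothetical placeholders here.

```python
import numpy as np

def iterative_assessment(image, model, saliency, inpaint, max_rounds=5, threshold=0.5):
    """Toy sketch of iterative AI assessment (not the published implementation).

    Each round: classify the image, highlight the suspicious regions, then
    'erase' those regions by inpainting them with surrounding healthy tissue
    so the next round is forced to find new abnormalities.
    """
    current = image.copy()
    combined_map = np.zeros(image.shape[:2])
    for _ in range(max_rounds):
        score = model(current)               # probability of disease in this round
        if score < threshold:                # nothing suspicious left to find
            break
        heatmap = saliency(model, current)   # which pixels drove the decision
        combined_map = np.maximum(combined_map, heatmap)
        current = inpaint(current, heatmap)  # hide the abnormalities already found
    return combined_map                      # union of highlights from all rounds
```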

Background: what is 'deep learning'?

Deep learning is a term used for systems that learn in a way that is similar to how our brain works. It consists of networks of electronic 'neurons', each of which learns to recognize one aspect of the desired image. It then follows the principles of 'learning by doing', and 'practice makes perfect'. The system is fed more and more images that include relevant information saying - in this case - whether there is an anomaly in the retina, and if so, which disease it is. The system then learns to recognize which characteristics belong to those diseases, and the more pictures it sees, the better it can recognize those characteristics in undiagnosed images. We do something similar with small children: we repeatedly hold up an object, say an apple, in front of them and say that it is an apple. After some time, you don't have to say it anymore - even though each apple is slightly different. Another major advantage of these systems is that they complete their training much faster than humans and can work 24 hours a day.
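For readers curious what "learning by doing" looks like in code, the snippet below trains a tiny image classifier on random stand-in data using PyTorch. It is a generic illustration of deep learning, not the Radboudumc system; the data, labels and network size are invented.

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

# Stand-in data: 100 tiny grayscale "scans", each labelled 0 (healthy) or 1 (diseased).
images = torch.randn(100, 1, 32, 32)
labels = torch.randint(0, 2, (100,))
loader = DataLoader(TensorDataset(images, labels), batch_size=16, shuffle=True)

# A very small convolutional network: each layer learns to respond to image features.
model = nn.Sequential(
    nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(8, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(), nn.Linear(16 * 8 * 8, 2),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# "Practice makes perfect": repeatedly show labelled examples and nudge the weights.
for epoch in range(5):
    for x, y in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        optimizer.step()
```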

Credit: 
Radboud University Medical Center

Growing polymers with different lengths

It is hard to imagine everyday life without materials made of synthetic polymers. Clothes, car parts, computers or packaging - they all consist of polymer materials. Lots of polymers are present in nature, too, such as DNA or proteins.

Polymers are built on a universal architecture: they are composed of basic building blocks called monomers. Polymer synthesis involves linking monomers together to form long chains. Imagine threading glass beads onto a string and creating chains of different length (and weight).

Polymerization processes with limits

An important industrial process for producing polymers is free radical polymerisation (FRP). Each year the chemical industry uses FRP to produce 200 million tonnes of polymers of various types, such as polyacrylic, polyvinyl chloride (PVC) and polystyrene.

Although this production method has many advantages, it also has its limitations. FRP produces an uncontrollable mixture of countless polymers of different lengths; in other words, its dispersity is high. Dispersity is a measure of how uniform or non-uniform the lengths of the polymer chains in a material are. A material's properties are determined to a large extent by this dispersity.
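Formally, dispersity (Đ) is the ratio of the weight-average to the number-average molar mass, Đ = Mw/Mn: a value of 1 means all chains are identical in length, and larger values mean a broader mixture. The short sketch below computes it for invented chain masses, purely for illustration.

```python
def dispersity(masses):
    """Đ = Mw / Mn for a list of individual chain molar masses."""
    mn = sum(masses) / len(masses)                 # number-average molar mass
    mw = sum(m * m for m in masses) / sum(masses)  # weight-average molar mass
    return mw / mn

print(dispersity([10_000] * 5))              # uniform chains  -> 1.0
print(dispersity([2_000, 10_000, 50_000]))   # broad mixture   -> about 2.0
```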

Everyday applications call for polymers with both low and high dispersity. In fact, for many high-tech applications, including pharmaceuticals and 3D printing, high dispersity can even be an advantage.

Polymers with new properties

However, if chemists want to produce polymer materials with very specific properties, they must first and foremost be able to adjust the dispersity as desired. This lets them produce a wide range of polymer materials that either contain uniform polymer species, i.e. have a low dispersity, or are highly dispersed with a great number of polymers of different lengths. Until now, this has hardly been possible.

A group of researchers led by Athina Anastasaki, Professor of Polymer Materials at the Department of Materials Science, has now developed a method of controlling radical polymerisation, thus enabling researchers to systematically and completely control the dispersity of polymer materials. The results of their research were recently published in the journal Chem.

In the past, in order to be able to control the radical polymerisation process at least to some extent, chemists would use a single catalyst. While this ensures that the resulting polymer chains become uniformly long, it doesn't allow the overall dispersity to be controlled as desired.

Two catalysts do the trick

Now the ETH researchers have simultaneously employed two catalysts with different effects - one highly active, the other only slightly active. This enabled them to adjust the dispersity precisely as a function of the ratio in which they mixed the two catalysts. If the more active catalyst was more abundant, more uniform polymers were produced, which meant the resulting material had low dispersity. If, however, the less active catalyst was more abundant, a large number of different polymer molecules were formed.
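To build intuition for how the catalyst ratio could tune dispersity, the toy simulation below assigns uniform lengths to chains started by a "fast" catalyst and a broad spread of lengths to chains started by a "slow" one, then reports the dispersity of the mixture. This is a deliberately simplified stand-in model, not the ETH group's chemistry or data.

```python
import numpy as np

def dispersity(masses):
    """Đ = Mw / Mn for an array of chain masses."""
    mn = masses.mean()
    mw = (masses ** 2).sum() / masses.sum()
    return mw / mn

def simulate(fraction_fast, n_chains=100_000, rng=np.random.default_rng(0)):
    """Toy model: fast-catalyst chains all reach a similar length,
    slow-catalyst chains get a broad, geometric spread of lengths."""
    fast = rng.random(n_chains) < fraction_fast
    lengths = np.where(fast, 100.0, rng.geometric(0.01, n_chains).astype(float))
    return dispersity(lengths)

for f in (0.9, 0.5, 0.1):
    print(f"fraction of fast catalyst {f:.1f} -> dispersity {simulate(f):.2f}")
```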

This work means Anastasaki and her team have created a basis for the development of new polymer materials. In addition, their process is also scalable; it works not only in the laboratory, but also when applied to larger quantities of substances. Another advantage of this method is that even polymers with high dispersity can continue growing once the polymerisation process itself is complete - something that was previously considered impossible.

The high efficiency and scalability of the approach have already attracted interest from industry. Polymers produced with the new process could be put to use in medicine, vaccines, cosmetics or 3D printing.

Credit: 
ETH Zurich

Evergreen idea turns biomass DNA into degradable materials

ITHACA, N.Y. - DNA has a lot of handy uses. It stores the blueprint of genetic code. It helps usher along the evolution of species.

It could also potentially make a stronger, more sustainable spoon, among other things.

A Cornell-led collaboration is turning DNA from organic matter - such as onions, fish and algae - into biodegradable gels and plastics. The resulting materials could be used to create everyday plastic objects, unusually strong adhesives, multifunctional composites and more effective methods for drug delivery, without harming the environment the way petrochemical-based materials do.

The team's paper, "Transformation of Biomass DNA Into Biodegradable Materials From Gels to Plastics for Reducing Petrochemical Consumption," published May 11 in the Journal of the American Chemical Society.

The collaboration is led by Dan Luo, professor of biological and environmental engineering in the College of Agriculture and Life Sciences. Luo's group has been exploring ways to use biomass DNA as a genetic as well as generic material, capitalizing on its properties as a novel polymer.

"There are many, many reasons why DNA is so good as a generic material," Luo said. "DNA is programmable. It has more than 4,000 nanotools - those are enzymes - that can be used to manipulate the DNA. And DNA is biocompatible. You eat DNA all the time. It is nontoxic and degradable. Essentially you can compost it."

Perhaps biomass DNA's greatest virtue is its sheer abundance. There are an estimated 50 billion metric tons of biomass on Earth, and less than 1% of that amount could fulfill the world's need for plastics for a year, according to Luo's team. Meanwhile, petrochemical-based products take a tremendous toll on the environment - from oil and gas exploration and refining, to the industrial synthesis of plastic, to the millions of tons of products that litter the land and oceans without degrading.

While biomass has previously been converted into biodegradable materials, that process - in which polysaccharides such as cellulose are broken down and resynthesized into polymers - requires extra energy and extreme temperatures that also strain the environment.

Luo's team bypassed that breakdown-synthesis process by developing a one-step cross-linking method that maintains DNA's function as a polymer without breaking its chemical bonds. The process is surprisingly simple: The researchers extract the DNA from any organic source - such as bacteria, algae, salmon or apple pomace - and dissolve it in water. After the pH of the solution is adjusted with alkali, the researchers add polyethylene glycol diacrylate, which chemically links with the DNA polymer and forms a hydrogel.

The gel can then be dehydrated to produce a range of denser materials, like plastic and glue.

"It's a much simpler process than conventional synthesis," Luo said. "The whole process is more doable, more economical and [can be done] at greater scale, because you don't have to pretreat the biomass DNA. You just directly cross-link them into plastics."

An additional perk of cross-linking is that researchers can endow the new materials with unusual properties. For example, postdoctoral researcher Dong Wang created a glue that can stick to Teflon at minus 20 degrees Celsius, a temperature that would freeze traditional water-based adhesives. Wang also made a biomass "flower" that incorporated magnetic nanoparticles and could be manipulated with a magnetic field.

"The product's application depends on the properties we afford to it," Luo said. "You can make it luminescent, make it conducting or non-conducting, make it much stronger. Anything you can think of."

In addition to generating everything from toys and utensils to clothing and skin for buildings, Luo said hydrogels could be particularly well-suited for controlled-release drugs. The researchers also were able to achieve cell-free protein production that had not been possible in petrochemical-based products.

"Our cross-link method is very general," said Wang, the paper's lead author. "It can be expanded to other polymers, other molecules."

The cost of conversion in the current lab setting is about $1 per gram of material, with almost 90% of the expense going to the ethanol required to extract the DNA from the biomass. If manufactured on an industrial scale, Luo estimates the cost would drop dramatically, by a hundredfold or even a thousandfold.

One potential challenge is obtaining large enough amounts of biomass to extract the DNA. The researchers still need to figure out how to control the lifespan of the materials and the time it takes for them to degrade.

"We are also working to make the biomass DNA materials much more functional, to make different types of materials, making them super strong, super soft," Luo said. "But we will never forget it's a DNA-based material. Whenever possible, we want to take advantage of DNA's genetic role."

Credit: 
Cornell University

Feed additive reduces enteric methane emissions in dairy cows

Philadelphia, June 24, 2020 - The enteric methane mitigation potential of 3-nitrooxypropanol (3-NOP) has been confirmed in previous studies. 3-NOP is highly soluble and rapidly metabolized in the rumen. Previous studies have shown a persistent methane mitigation effect when 3-NOP is administered through the total mixed ration (TMR). In a recent article appearing in the Journal of Dairy Science, scientists from six universities studied the methane mitigation effects of varying doses of 3-NOP in the feed of 49 multiparous Holstein cows at The Pennsylvania State University's Dairy Teaching and Research Center.

After a 14-day adjustment period, cows received the base TMR mixed with a placebo or one of six treatment doses of 3-NOP ranging from 40 to 200 mg of 3-NOP/kg of feed. Dose levels were chosen based on previous research at this laboratory as well as studies conducted in beef cattle. The scientists hypothesized that within the range of application rates studied, 3-NOP would decrease enteric methane emissions without affecting dry matter intake or lactational performance of the cows.

The inclusion of 3-NOP in the TMR quadratically decreased daily enteric methane emissions by 22 to 40 percent in lactating dairy cows, with an average reduction of 31 percent. In this experiment, 3-NOP had no effect on dry matter intake or milk yield but linearly increased milk fat concentration and yield.
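A "quadratic decrease" means the response follows a second-order curve in dose rather than a straight line. The snippet below shows how such a dose-response curve can be fitted; the dose and reduction values are invented placeholders, not the study's measurements.

```python
import numpy as np

# Hypothetical data: 3-NOP dose (mg/kg of feed) vs. methane reduction (%).
# Values are made up for illustration; the real data are in the Journal of Dairy Science paper.
dose = np.array([0, 40, 80, 120, 160, 200])
reduction = np.array([0, 18, 26, 31, 36, 40])

# Fit a quadratic dose-response curve: reduction ≈ a*dose**2 + b*dose + c
a, b, c = np.polyfit(dose, reduction, deg=2)
predict = lambda d: a * d**2 + b * d + c
print(f"predicted reduction at 100 mg/kg: {predict(100):.1f}%")
```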

"We can determine by calculation that the decrease in daily enteric methane emissions would have increased the availability of feed digestible energy," said lead investigator Alexander Hristov, PhD, Department of Animal Science, The Pennsylvania State University, University Park, PA, USA. "The reduction in emitted methane with 3-NOP would represent, in theory, additional energy for lactation that could potentially be used for productive purposes."

The results of this study suggest that 3-NOP is a promising feed additive for reducing enteric methane emissions, while maintaining lactational performance in dairy cows and potentially increasing milk fat yield.

Credit: 
Elsevier

Researchers sharply reduce time needed for glass and ceramic 3D printing

image: Interparticle photo-cross-linkable suspension enables 3D structuring of transparent SiO2 glass components through rapid heating profiles.

Image: 
Motoyuki Iijima, Yokohama National University

The fabrication of complex ceramic or glass structures via stereolithography, a type of 3D printing, has long been held back by the time-consuming back-end of the process, which can take up to two days. A new technique reduces this time to less than 5 hours.

Stereolithography prints objects from CAD files out of a powder suspended in a liquid. Objects are built layer by layer out of this liquid by shining a light, such as a laser, into the liquid-and-powder suspension, also called a "colloid." (Milk, for example, is also a colloid, but one of milk fats suspended in water.)

The laser causes some of the particles that are sensitive to light to join up together, or "cross-link," and form layers made of particles embedded in polymers (long chains of molecules)--a hardening process called "curing." The laser in effect "writes" layers in the liquid suspension, and these layers, printed out on top of one another, form a 3D object composed of the binding agent and the powder.

Ceramic or glass stereolithography holds the potential to fabricate parts with much more accurate and complex geometries--including hollow objects or ones with intricate internal structures that can reduce weight while maintaining strength--than could ever be achieved by conventional manufacturing with these substances. Such complexity of ceramic and glass design offers a raft of new biomedical, structural, and energy system applications.

The cured object, termed 'green' once it has been printed, still has a couple of additional stages to go through at high temperature: debinding--or removal of its binding agents--and sintering--the fusing of the powder particles firmly together. After debinding, the object is termed 'brown', and the finished product can be obtained after sintering.

However, in order to avoid structural collapse during this process--as a result of the production of gas bubbles that could shatter the strong but brittle ceramic or glass--debinding and sintering are performed extremely slowly, typically taking up to 48 hours.

"Until now, this time-consuming and costly constraint has limited 3D printed glass and ceramic parts to highly specialized applications," said Motoyuki Iijima, an engineering researcher at Yokohama National University, whose team has developed a new colloid recipe that sharply reduces the time needed for debinding and sintering.

"What you want instead is to get closer to the speed of conventional glass or ceramic production but combined with the complexity offered by 3D printing and similar additive manufacturing processes."

The research team published their findings in the journal Communications Materials on May 20.

To prove their recipe concept, the researchers wanted to manufacture transparent glass. This particular colloid recipe takes silica (SiO2) particles that have been modified with polyethyleneimine (a type of polymer) and oleic acid, a type of fatty acid that occurs in many animal and vegetable fats or oils. These particles are then mixed in an alcohol-based solvent along with a photo-initiator--the recipe ingredient that is sensitive to light and initiates the curing.

Crucially, the recipe also allows for a much smaller amount of monomers--the particles that perform the cross-linking--than is normal in glass and ceramic stereolithography, and encourages more cross-linking by other particles.

Normally, the large amount of monomers is what requires the slow debinding and sintering process because a rapid burning of monomers would generate the gas that threatens the structure of the object.

By being able to use only tiny amounts of monomers in the colloid, the researchers managed to get the debinding and sintering time down to less than 5 hours.

Having demonstrated how much faster this process can be performed for transparent glass, the researchers now want to extend the liquid suspension recipe, with its short processing time, to any kind of ceramic or glass.

Credit: 
Yokohama National University

Researchers use electric fields to herd cells like flocks of sheep

image: Princeton researchers created a device that uses electrical fields to herd cells like sheep. This time-lapse movie shows a 90-degree turn in a layer of cells viewed under a microscope for eight hours.

Image: 
Video courtesy of the researchers; GIF by Neil Adelantar

Princeton researchers have created a device that can herd groups of cells like sheep, precisely directing the cells' movements by manipulating electric fields to mimic those found in the body during healing. The technique opens new possibilities for tissue engineering, including approaches to promote wound healing, repair blood vessels or sculpt tissues.

Scientists have long known that naturally occurring electrochemical signals within the body can influence the migration, growth and development of cells -- a phenomenon known as electrotaxis. These behaviors are not nearly as well understood as chemotaxis, in which cells respond to chemical concentration differences. One barrier has been a lack of accessible tools to rigorously examine cells' responses to electric fields.

The new system, assembled from inexpensive and readily available parts, enables researchers to manipulate and measure cultured cells' movements in a reliable and repeatable way. In a paper published June 24 in Cell Systems, the Princeton team described the assembly and preliminary studies using the device, which they call SCHEEPDOG, for Spatiotemporal Cellular HErding with Electrochemical Potentials to Dynamically Orient Galvanotaxis. (Galvanotaxis is another term for electrotaxis.)

Previous systems for studying cells' responses to electric fields have been "either bespoke and handmade, with issues of reproducibility, or requiring fabrication facilities that make them expensive and inaccessible to many labs," said co-lead author Tom Zajdel, a postdoctoral research fellow in mechanical and aerospace engineering. "We wanted to use rapid prototyping methods to make a well-defined device that you could just clamp onto your petri dish."

While there is a long history of work on electrotaxis, said Zajdel, the phenomenon is not well understood. Evidence shows, for example, that reversing the direction of a natural electric field can inhibit wound healing in animal models, while amplifying the existing field might improve healing.

"There are a lot of unknowns about how individual cells detect such fields," said senior author Daniel Cohen, an assistant professor of mechanical and aerospace engineering. "But the beauty of crowd dynamics is that even if you don't understand everything about the individuals, you can still engineer behaviors at the group level to achieve practical results."

The SCHEEPDOG device contains two pairs of electrodes that are used to generate electric fields along horizontal and vertical axes, as well as recording probes to measure voltage and integrated materials to separate the cells from chemical byproducts of the electrodes. The voltage level is similar to that of an AA battery concentrated over the centimeter-wide chamber containing the cells.

"It's kind of like an Etch A Sketch," said Zajdel, referring to the classic drawing toy in which lines can be created in any direction by turning two control knobs. "We've got the horizontal and the vertical knobs, and we can get the cells to trace out arbitrary trajectories in the whole 2-D space just by using those two knobs."

The team tested SCHEEPDOG using mammalian skin cells and epithelial cells from the lining of the kidney, which are often used to study cells' collective movements. They found that the cells time-averaged signals generated along the two axes over a time window of about 20 seconds: Turning on the vertical electric field for 15 seconds and the horizontal field for 5 seconds, for instance, would cause the cells to migrate more in the vertical than in the horizontal direction.

"What the cells perceive is sort of a virtual angle, and that allows us to program any complex maneuver, like a full circle," said Cohen. "That's really surprising -- that's an amazing level of control that we wouldn't have expected to be possible, especially with thousands of neighboring cells executing these maneuvers on command."

The study "adds to the growing appreciation of cells' responses to bioelectric aspects of their environment," said Michael Levin, who directs the Center for Regenerative and Developmental Biology at Tufts University and was not involved in the research. "It demonstrates a technique to address not just individual cells' activities in response to bioelectric cues, but the action of a cell collective, which is essential to understand how physical forces play into the kind of cooperativity we see in embryogenesis, regeneration and cancer."

Using SCHEEPDOG, the team is expanding their studies to different cell types and contexts. Graduate student Gawoon Shim is investigating how varying levels of cell-cell adhesion impact directed cell migration -- key information for eventual applications like regenerating skin, blood vessels and nerve cells in damaged tissue.

"This is the first step for whatever healing and regeneration we may need" in a variety of clinical contexts, said Shim, co-lead author of the study along with Zajdel. "We're learning how to direct the cells where we need them, and then we can figure out what they're going to do afterwards."

Applying engineering principles to understand and control electrotaxis will deepen understanding of its role not only in cell movement, but also in growth and differentiation, said Cohen. While today's cutting-edge tissue regeneration techniques usually involve pre-patterning new tissues, sculpting tissues with electric fields may allow for more flexibility and better outcomes. "In the long term, this might offer some very exciting, completely new ways of thinking about working with living tissues," he said.

Credit: 
Princeton University, Engineering School

Cowbirds change their eggs' sex ratio based on breeding time

image: A new study finds that cowbirds adjust the sex of their offspring throughout the nesting season. A female, left, and male cowbird perch on a wire fence. Both birds are adults.

Image: 
Photo (c) by Michael Jeffords and Sue Post

CHAMPAIGN, Ill. -- Brown-headed cowbirds show a bias in the sex ratio of their offspring depending on the time of the breeding season, researchers report in a new study. More female than male offspring hatch early in the breeding season in May, and more male hatchlings emerge in July.

Cowbirds are brood parasites: They lay their eggs in the nests of other birds and let those birds raise their young. Prothonotary warblers are a common host of cowbirds.

"Warblers can't tell the difference between their own offspring and cowbirds," said Wendy Schelsky, a principal scientist at the Illinois Natural History Survey and co-author of the study. "They do a really good job of raising cowbirds, even though cowbird chicks are larger and need more food."

The researchers studied the interactions between cowbirds and warblers for seven years to determine whether there was a difference in the relative number of males and females among cowbird offspring. They collected DNA samples from cowbird eggs or newly hatched chicks.

"Other scientists have not seen any difference in the sex ratios of brood-parasitic birds," said study co-author Mark Hauber, a professor of evolution, ecology, and behavior at the University of Illinois at Urbana-Champaign. "This is the first time anyone has detected a seasonal bias and we believe that it is due to our large sample sizes."

The researchers think their results may reflect the different developmental trajectories of male and female cowbirds.

"Male cowbirds take longer to mature and are unlikely to breed in their first year as adults," Schelsky said. "However, since most adult females breed in their first year, they have a better chance of being in good shape if they are produced earlier."

Although the eggs and newly hatched chicks both show the seasonal sex bias, it is unclear whether the differing sex ratios persist in birds that grow up and leave the nest.

"We have not looked at what happens to the chicks after they fledge," Hauber said. "We know that adult cowbird flocks are heavily male-biased, so perhaps increased mortality or dispersal by early-hatched female cowbirds impacts the eventual adult sex ratios."

The researchers hope to understand the molecular mechanisms that female cowbirds use to influence the sex of their offspring.

"It would be interesting to know if the females change their hormone levels across the season to influence the sex ratio of the eggs," Hauber said.

Credit: 
University of Illinois at Urbana-Champaign, News Bureau

Should diabetes treatment lessen for older adults approaching the end of life?

One in four people aged 65 or older has diabetes. The disease is the seventh leading cause of death in the United States and a major contributor to heart disease. Experts have recommended that the best way to slow the progression of diabetes—and help prevent its many complications—is to maintain strict control of blood sugar levels. For healthy younger people, this means keeping the target blood sugar level (known as A1c or HbA1c) below 6.5 to 7.0 percent.

For older adults who have a limited life expectancy or who have advanced dementia, however, maintaining that target blood sugar level may cause more harm than good. For example, these older adults may not live long enough to experience potential benefits. What’s more, maintaining these strict blood sugar levels can raise the risk of potentially harmful events such as low blood sugar (also known as hypoglycemia). This can cause falls or loss of consciousness.

For these reasons, many guidelines now suggest higher HbA1c targets—such as between 8.0 percent and 9.0 percent—for older adults who have multiple chronic conditions or limited life expectancy, or who live in nursing homes.

There is not much existing research to guide health care practitioners as to what the appropriate levels of diabetes medications are for this group of older adults. There is also little information about the effects on these individuals of taking fewer or lower doses of diabetes medications.

Experts suspect that lessening diabetes treatment in these older adults could prevent unnecessary hospitalizations by lowering the risk of harmful drug events and increasing patients' comfort.

In order to investigate the issue, a team of researchers conducted a study—one of the first national studies to examine potential overtreatment and deintensification of diabetes management in nursing home residents with limited life expectancy or dementia. The researchers chose nursing home residents to study because admission to a nursing home could give healthcare practitioners a chance to learn more about patient goals and preferences and to review and adjust medications accordingly. The researchers published their results in the Journal of the American Geriatrics Society.

The researchers examined information from Veterans Affairs nursing homes from 2009 to 2015. Their goal was to learn more about older adults with diabetes, particularly those nearing the end of their lives or living with dementia. The researchers investigated whether these older adults were overtreated for diabetes, whether their diabetes medication regimens were lessened, and what effects might result from lowering doses or switching to different kinds of medication.

The researchers wanted to learn specifically how often diabetes treatments were lessened. Among the nursing home residents identified as potentially overtreated, the researchers examined how much their diabetes treatment regimens were lessened during the 90 days of follow-up.

The researchers did not consider insulin dose changes, because insulin doses may be influenced by factors such as eating habits.

The researchers said they observed potential overtreatment of diabetes in almost 44 percent of nursing home admissions for veterans who had diabetes and limited life expectancy or dementia. Potentially overtreated residents were, on average, about 78 years old and were nearly all male and non-Hispanic white. Two-thirds of the residents had been admitted to nursing homes from hospitals. A total of 29 percent had advanced dementia, almost 14 percent were classified with end-of-life status, and 79 percent had a moderately high risk of dying within six months. Many were physically dependent and had heart disease and/or potential diabetes-related complications. In addition, about 9 percent of overtreated residents had a serious low blood sugar episode in the prior year, emphasizing the need for deintensification.

Nearly half of residents received two or more diabetes medications, and those with HbA1c values between 6.5 and 7.5 percent received more diabetes medications than those with lower HbA1c.

The researchers concluded that many veteran nursing home residents with limited life expectancy or dementia may be overtreated for their diabetes at the time of admission. The researchers suggested that future studies examine the impact of deintensification on health outcomes and adverse events to better understand the risks and benefits of diabetes management strategies in this group of older adults.

This summary is from “Deintensification of Diabetes Medications among Veterans at the End of Life in VA Nursing Homes.” It appears online ahead of print in the Journal of the American Geriatrics Society. The study authors are Joshua D. Niznik, PharmD, PhD; Jacob N. Hunnicutt, PhD; Xinhua Zhao, PhD; Maria K. Mor, PhD; Florentina Sileanu, MS; Sherrie L. Aspinall, PharmD, MSc; Sydney P. Springer, PharmD, MS; Mary J. Ersek, PhD, RN; Walid F. Gellad, MD, MPH; Loren J. Schleiden, MS; Joseph T. Hanlon, PharmD, MS; Joshua M. Thorpe, PhD, MPH; and Carolyn T. Thorpe, PhD, MPH.

Journal

Journal of the American Geriatrics Society

DOI

10.1111/jgs.16360

Credit: 
American Geriatrics Society

New process could safeguard water quality, environment and health

image: Samples being placed into a centrifuge to separate sample components.

Image: 
Swansea University

A research team at Swansea University have developed a new method for fast removal and detection of wastewater pollutants that come from everyday pharmaceuticals like paracetamol, ibuprofen and aspirin, which could help minimise their impact on the environment.

The all-female team of (bio)chemists from the Medical School, in collaboration with the international company Biotage, have published the research in Analytical Science Advances. The research outlines how they successfully developed a single process for separating and quantifying a wide range of pharmaceuticals and chemicals from personal care products, substances found in everyone's bathrooms that can end up in wastewater sludge and blood plasma. The new method will speed up our understanding of which pollutants may be released and could help reduce the negative effects they have on the wider environment.

First author Dr Rachel Townsend said: "Many people don't really think about what happens to these drugs once they've taken them. Like any foodstuff, once a drug has been taken, it is excreted from the body and ends up in a wastewater treatment plant.

"It was thought that pharmaceuticals were degraded during the treatment process, but research has shown this isn't the case. And of course this becomes a problem as the treated wastewater is released into water courses such as rivers and streams, while 80% of treated sludge is also recycled back onto agricultural land as fertiliser and potentially onto future food crops."

There have been global reports of the adverse effects of pharmaceuticals on the animal kingdom. Diclofenac, for example, a non-steroidal anti-inflammatory, has caused multiple species of vulture in Asia to become critically endangered, with Indian long-billed vulture and red-headed vulture populations decreasing by 97-99%. The female contraceptive pill has caused the feminisation of male fish, leading populations to decrease rapidly over two years. There are also concerns that sludge used in agriculture could impact human health too.

The team have pioneered one process that uses a sample preparation method, called QuEChERS, with mass spectrometric detection. Using this process, they were able to detect, extract and quantify a range of pharmaceutical compounds and personal care products from a variety of sources, such as wastewater sludge, where previously multiple extraction methods were needed, making it more efficient in terms of both time and resources.

The researchers could then get a clearer picture of the factors controlling how antimicrobial resistance develops and spreads in the community, and this knowledge has the potential to help safeguard water quality, the environment and health.

The results will now help to inform the Chemical Investigation Programme, which is a British research initiative that contributes to the European Union Directive for environmental management. With enough research and data, changes can be made to the wastewater treatment process to ensure these everyday pollutants are degraded or removed with the hope of preventing any further impact on the wider environment and ensuring human health remains unaffected.

Co-author, Dr Claire Desbrow from Biotage said: "The newly developed method fits perfectly with our portfolio of sample preparation products. Being able to clean up complex human, food or environmental samples fast and efficiently will be of benefit to not only researchers, but also to industrial, environmental and regulatory laboratories across the globe."

Credit: 
Swansea University

Transgenic rice lowers blood pressure of hypertensive rats

In the future, taking your blood pressure medication could be as simple as eating a spoonful of rice. This "treatment" could also have fewer side effects than current blood pressure medicines. As a first step, researchers reporting in ACS' Journal of Agricultural and Food Chemistry have made transgenic rice that contains several anti-hypertensive peptides. When given to hypertensive rats, the rice lowered their blood pressure.

High blood pressure, also known as hypertension, is a major risk factor for cardiovascular disease and stroke. A common class of synthetic drugs used to treat hypertension, called ACE inhibitors, target the angiotensin converting enzyme (ACE), which is involved in blood pressure regulation. However, ACE inhibitors often have unpleasant side effects, such as dry cough, headache, skin rashes and kidney impairment. In contrast, natural ACE inhibitors found in some foods, including milk, eggs, fish, meat and plants, might have fewer side effects. But purifying large amounts of these ACE-inhibitory peptides from foods is expensive and time-consuming. Le Qing Qu and colleagues wanted to genetically modify rice -- one of the world's most commonly eaten foods -- to produce a mixture of ACE-inhibitory peptides from other food sources.

The researchers introduced into rice plants a gene encoding nine ACE-inhibitory peptides and a blood-vessel-relaxing peptide linked together, and confirmed that the plants made high levels of the peptides. The researchers then extracted total protein (including the peptides) from the transgenic rice and administered it to rats. Two hours after treatment, hypertensive rats showed a reduction in blood pressure, while rats treated with wild-type rice proteins did not. Treatment of rats over a 5-week period with flour from the transgenic rice also reduced blood pressure, and this effect remained 1 week later. The treated rats had no obvious side effects in terms of growth, development or blood biochemistry. If these peptides have the same effects in humans, a 150-pound adult would need to eat only about half a tablespoon of the special rice daily to prevent and treat hypertension, the researchers say.

Credit: 
American Chemical Society

'Very low' risk of unknown health hazards from exposure to 5G wireless networks

June 24, 2020 - Experts weigh in on recent online reports that warn of frightening health consequences from new fifth generation (5G) wireless networks. Within current exposure limits, there appears to be little or no risk of adverse health effects related to radiofrequency (RF) exposure from 5G systems, concludes an evidence-based expert review in the June issue of Health Physics, official journal of the Health Physics Society. The journal is published in the Lippincott portfolio by Wolters Kluwer.

"While we acknowledge gaps in the scientific literature, particularly for exposures at millimeter-wave frequencies, [we judge] the likelihood of yet unknown health hazards at exposure levels within current limits to be very low, if they exist at all," according to the statement by the Committee on Man and Radiation (COMAR) of the Institute of Electrical and Electronics Engineers (IEEE). As outlined by its Chair, Richard A. Tell, COMAR is an organization composed of physicians, biologists, epidemiologists, engineers and physical scientists who are experts on health and safety issues related to electromagnetic fields who work voluntarily and collaboratively on a consensus basis.

5G Networks Unlikely to Cause Exposure Above Current Safety Limits

The consensus statement seeks to counter the rise in alarming messages regarding mysterious health effects of 5G technology. "This misinformation together with activist websites expressing even more ominous consequences of 5G - ranging from cancer induction to being responsible for the current coronavirus pandemic - has created substantial and unnecessary public anxiety," comments Jerrold T. Bushberg of the University of California Davis School of Medicine and Vice-Chair of COMAR.

Fifth-generation wireless systems are expanding worldwide to meet the rapidly increasing demand for wireless connectivity. The new technology can transmit much greater amounts of data at much higher speeds, compared to previous 2G to 4G systems. That's in part because 5G uses the greater bandwidth available at higher frequencies, including the so-called millimeter-wave (MMW) band. Expansion of 5G "will produce a more ubiquitous presence of MMW in the environment," according to the report.

Because MMW do not penetrate foliage and building materials as well as lower-frequency signals, many lower-power "small cell" transmitters will be needed to provide effective indoor coverage. Some 5G systems will have "beamforming" antennas that transmit signals to individual users as they move around, which means that nonusers will have less exposure.

Tissue heating is the main potential harmful effect of exposure to RF fields. Most countries, including the United States, have adopted exposure limits similar to those recommended by the recent standards (2019) published by IEEE International Committee on Electromagnetic Safety (ICES) or the International Commission on Non-Ionizing Radiation Protection (ICNIRP). These guidelines seek to avoid harmful effects by setting exposure limits far below the threshold at which any adverse human health effects would be expected to occur. These standards only allow for low levels of public RF exposures for which the energy is deposited in the form of thermal heating.

The COMAR statement provides perspectives to address concerns about possible health effects of 5G exposure:

- In contrast to lower-frequency fields, MMW do not penetrate beyond the outer layer of the skin and thus do not produce heating of deeper tissues.
- The introduction of 5G is unlikely to change overall levels of RF exposure. As is currently the case, most exposure will be mainly due to "uplink" from one's own cell phone or other devices - not from transmission from base stations.
- In nearly all publicly accessible locations, RF exposures from cellular base stations, including 5G stations, will remain small - a fraction of current IEEE or ICNIRP exposure limits.

"[S]o long as exposures remain below established guidelines, the research results to date do not support a determination that adverse health effects are associated with RF exposures, including those from 5G systems," concludes the COMAR statement. The Committee acknowledges limitations of the current evidence on possible health and safety effects of 5G exposure and identifies key areas for further research, including high-quality studies of the biological effects of MMW.

Credit: 
Wolters Kluwer Health

Food-grade wheatgrass variety released for public use

image: MN-Clearwater wheatgrass seed, whose grain threshes freely from the hulls 63% of the time.

Image: 
Brett Heim

Wheatgrass is packed with beneficial nutrients, which makes the crop a popular superfood. And now, more farmers will have access to growing this beneficial crop.

Historically, wheatgrass has been used as a crop in animal feed. However, a partnership between The Land Institute and the University of Minnesota changed that.

The University of Minnesota recently released the first food-grade wheatgrass variety for public use. Now, this eco-friendly and cost-effective crop can be commonly grown as human food, too.

"The Land Institute has been breeding intermediate wheatgrass since 2002," explains James Anderson, a professor at the University of Minnesota. "Developed using germplasm provided by The Land Institute in 2011, this variety of wheatgrass is the first to be available for public use."

The new variety, called MN-Clearwater, was produced by crossing seven wheatgrass parents with desired qualities. These qualities include high grain yield and seed size, which are ideal for farmers. Breeders have been successful in the domestication of this perennial crop that provides benefits to both farmers and the environment.

"Because wheatgrass is a perennial, it's known to be a soil builder," said Anderson. "It provides soil cover throughout the year."

Soil coverage prevents soil and nutrient runoff during heavy rainfall. Wheatgrass also has deep, dense roots that capture nutrients before they get into groundwater. This helps to protect groundwater-based water systems.

Other benefits of this new wheatgrass variety, compared to other crops like corn and soybeans, are:

-Less soil loss from the field;

-Fewer chemicals and fertilizers entering the groundwater system; and

-Improved carbon storage.

There are also economic advantages for the farmer growing wheatgrass. As a perennial crop, wheatgrass uses less fertilizer and machinery than annual crops.

"Wheatgrass can lower the growth of certain weed species," explains Anderson. Natural weed control also reduces potential costs for herbicides.

For farmers, the big advantage is that they must plant only once every three years and will get multiple harvests from the one crop.

"But the farmer isn't the only one who benefits," explains Anderson. "As the first food-grade wheatgrass, food processers and consumers can see a benefit, too."

End-users are always searching for new items. MN-Clearwater wheatgrass provides new flavors and nutritional properties that can be added to food products.

The harvested wheatgrass goes well with wheat-based products. It can be used as a replacement for wheat, but it works best alongside it. By using both wheat and wheatgrass as ingredients, a product can maintain its baking and functional properties while offering new flavors.

The first registered food product using the MN-Clearwater wheatgrass was a beer from Patagonia Provisions, and other products include several locally brewed beers and a limited-edition cereal from Cascadian Farm.

Credit: 
American Society of Agronomy

Biomedical researchers get closer to why eczema happens

BINGHAMTON, NY -- A new study from researchers at Binghamton University, State University of New York may help to peel back the layers of unhealthy skin -- at least metaphorically speaking -- and get closer to a cure.

An estimated 35 million Americans suffer from eczema, a chronic skin condition also known as atopic dermatitis. Worldwide, 2 to 5% of adults and about 15% of children suffer from symptoms such as dry, inflamed and very itchy skin with open sores.

Although there are myriad treatments for eczema, such as medical creams and natural remedies, the exact causes of the condition remain elusive.

In a new paper, the team -- Associate Professor Guy German and PhD student Zachary W. Lipsky from the Thomas J. Watson School of Engineering and Applied Science's Department of Biomedical Engineering, and Associate Professor Claudia N.H. Marques of the Harpur College of Arts and Sciences' Department of Biological Sciences -- connects two aspects of eczema research that are rarely studied together.

One result of atopic dermatitis is a decreased level of skin oils known as lipids, particularly one group called ceramides. Lipids on the surface of the skin function to regulate hydration and also help defend the skin from foreign invaders either indirectly through immune signaling or directly through their inherent antimicrobial activity.

Another result of eczema is an increase in staph bacteria in the skin, which can cause irritation and infection.

German said that genetics can play a part in whether someone has eczema, but people in certain occupations have also been shown to be more likely to get the skin condition, such as healthcare professionals, metalworkers, hairdressers and food processing workers. The connection? An increased amount of handwashing or regular contact with detergents for your job.

"What happens if, either through a mutation or through occupational risks, there's a decreased presence of lipids on the skin?" he asked. "The essence of this study is that in normal, healthy conditions, bacteria do not penetrate the skin barrier. In atopic dermatitis conditions or lipid levels consistent with AD, it does -- and it consistently takes nine days."

Because the staph bacteria are immobile, they need to multiply in number to grow through the protective outer skin layer known as the stratum corneum. The Watson researchers believe the bacteria don't grow around the skin cells but actually through them. With lipid depletion -- either through genetics or occupational risks -- the skin appears to become more vulnerable to bacterial invasion and infection of underlying skin tissue.

"When we usually think about the oils in our skin, we think about water retention and moisturizing -- things like that," Lipsky said. "Now we're looking at how these lipids are important for protection against these microorganisms that can come in and cause diseases."

While this study has not unlocked all the secrets of atopic dermatitis, showing that the bacteria could be the cause rather than the result of the disease is a major step forward. Further research is required, and that's where the Watson team will investigate next.

"Now that we know that bacteria can permeate through lipid depleted skin, how does it affect the skin mechanically?" Lipsky asked. "Does it make the skin weaker and more likely to crack? Can we figure out how bacteria are moving through different skin layers?"

German added: "In scientific research, you get one answer and three additional questions pop up, so we're never stuck for things to do."

Credit: 
Binghamton University

Plug-and-play lens simplifies adaptive optics for microscopy

image: Researchers have developed a new plug-and-play lens that can add adaptive optics correction to commercial optical microscopes. The image shows a section of mouse brain acquired with a light sheet microscope and corrected with the adaptive lens.

Image: 
T. Furieri (CNR-IFN), G. Calisei and A. Bassi (Politecnico of Milan), E. Daini and A. Vilella (University of Modena and Reggio Emilia)

WASHINGTON -- Researchers have developed a new plug-and-play device that can add adaptive optics correction to commercial optical microscopes. Adaptive optics can greatly improve the quality of images acquired deep into biological samples, but has, until now, been extremely complex to implement.

"Improving the technology available to life scientists can further our understanding of biology, which will, in turn, lead to better drugs and therapies available to doctors," said research team leader, Paolo Pozzi from the University of Modena and Reggio Emilia in Italy.

In The Optical Society (OSA) journal Optics Letters, Pozzi and a multidisciplinary team of researchers from Delft University of Technology (TU Delft), CNR-Institute for Photonics and Nanotechnology (CNR-IFN) and University Medical Center Rotterdam describe their new adaptive lens device. They also show how it can be easily installed onto the objective lens of a commercial multiphoton microscope to improve image quality.

"This approach will allow advanced optical techniques such as multiphoton microscopy to image deeper under the surface of the brain in live organisms," said Stefano Bonora, group leader at the CNR-IFL. "We look forward to seeing how it might also be implemented in other systems, such as light-sheet microscopes, super-resolution systems, or even simple epifluorescence microscopes."

Imaging deeper

Optical microscopy can be used to image biological samples in natural conditions, making it possible to observe various biological processes over time. However, as light travels through tissue it gets distorted. This distortion gets worse as light travels deeper into tissue, causing images to look blurry and obscuring important details.

Adaptive optics, a technology initially developed to compensate for atmospheric turbulence when using telescopes to view celestial objects, can be used to correct the optical aberrations that occur when imaging through thick tissue. However, doing so typically requires building a custom microscope that incorporates a deformable mirror. This mirror is used to compensate for the distortions, creating an image that looks sharp and clear.

"Including a deformable mirror in an existing microscope is nearly impossible, and no commercial adaptive microscope is available on the market yet," said Pozzi. "This means that the only option for a life scientist to use adaptive optics is to build the entire microscope from scratch, an operation which is too difficult and time consuming for most life sciences laboratories."

A simpler approach

To simplify this setup, the researchers created a smart lens made with glass so thin it can bend without breaking. The lens consists of a glass disk-shaped container filled with a transparent liquid. A set of 18 mechanical actuators on the glass edges can be controlled with a computer to bend the glass to a desired shape.

The lens functions like the deformable mirror used in most adaptive optics setups, but instead of reflecting light, it transmits light. As light travels through the liquid inside the lens, it gets distorted differently depending on the shape of the lens. "This is similar to the distorted images you see when looking through a bottle of water while squishing it with your hands," said Bonora.

Using the lens for adaptive optics correction requires a complex algorithm to control the actuators. "Efficient optical correction was made possible by the DONE algorithm (database online nonlinear extremum-seeker), a very elegant solution based on machine learning-like principles, which we previously developed at TU Delft," said Pozzi.
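The published controller uses the DONE algorithm; as a rough stand-in for how sensorless correction works in general, the sketch below repeatedly perturbs the 18 actuator commands, scores each resulting image with a sharpness metric and keeps improvements. This random-search loop is a generic placeholder, not DONE itself, and `set_actuators` and `acquire_image` are hypothetical hardware hooks.

```python
import numpy as np

N_ACTUATORS = 18  # the adaptive lens described above uses 18 mechanical actuators

def sharpness(image):
    """Simple image-quality metric: total squared intensity
    (brighter, tighter features score higher in multiphoton imaging)."""
    return float(np.sum(image.astype(float) ** 2))

def optimize(set_actuators, acquire_image, iterations=100, step=0.1,
             rng=np.random.default_rng(0)):
    """Generic sensorless correction loop (placeholder for the DONE algorithm)."""
    best = np.zeros(N_ACTUATORS)
    set_actuators(best)
    best_score = sharpness(acquire_image())
    for _ in range(iterations):
        trial = best + rng.normal(0.0, step, N_ACTUATORS)  # perturb the lens shape
        set_actuators(trial)
        score = sharpness(acquire_image())
        if score > best_score:                             # keep only improvements
            best, best_score = trial, score
    set_actuators(best)
    return best
```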

Quick results

The researchers tested the new software, which they have also made available to others via GitHub, and the adaptive lens by applying the lens to the objective of a commercial multiphoton microscope. They used the microscope to perform calcium imaging in the brains of living mice, one of the most complex life science experiments performed with microscopes.

"We surpassed our expectations by achieving very nice results within a few hours," said Pozzi. "This technology can be retrofitted on any existing microscope that has interchangeable objectives and displays images on a computer screen."

The researchers are now testing the system on other types of microscopes and samples while also exploring whether multiple adaptive lenses could be used to achieve a better correction than is possible with more complex techniques using deformable mirrors. The team has also founded a spin-off company, Dynamic Optics srl, to commercialize the multiactuator adaptive lenses.

The new lens could also be useful for applications beyond microscopy. "Our new device could also be applied in other fields such as free space optics communications, where it could increase data connection rates and bring data connections to remote and isolated areas," said Pozzi.

Credit: 
Optica

Artificial intelligence classifies colorectal cancer using IR imaging

A research team from the Prodi Centre for Protein Diagnostics at Ruhr-Universität Bochum (RUB) has used infrared (IR) microscopes based on quantum cascade lasers to classify tissue samples of colorectal cancer from routine clinical operations in a marker-free and automated way. Artificial intelligence enabled the researchers to differentiate between different tumour types with great accuracy within approximately 30 minutes. Based on the classification, doctors can predict which course the disease will take and, consequently, choose the appropriate therapy. The team published their report in the journal Scientific Reports of 23 June 2020.

Microsatellite status facilitates prognosis

A distinction is made between microsatellite stable (MSS) and microsatellite instable (MSI) tumours in colon and other cancers. Microsatellites are usually functionless, short DNA sequences that are frequently repeated. Patients with MSI tumours have a significantly higher survival rate. This is due to a mutation rate of cancer cells that is about 1,000 times higher, which makes their growth less successful. Moreover, innovative immunotherapy is more successful in patients with MSI tumours. "It is therefore important for the prognosis and the therapy decision to know what kind of tumour we are dealing with," says Professor Anke Reinacher-Schick, Head of the Department of Haematology and Oncology at the RUB clinic St. Josef Hospital. To date, differential diagnosis has been carried out by immunohistochemical staining of tissue samples with subsequent complex genetic analysis.

Fast and reliable measurement

The potential of IR imaging as a diagnostic tool for the classification of tissue, the so-called label-free digital pathology, had already been demonstrated in earlier studies by the group headed by Professor Klaus Gerwert from the RUB Department of Biophysics. The method recognises cancer tissue without prior staining or other marking and, consequently, also works automatically with the aid of artificial intelligence. Unlike the conventional differential diagnosis of microsatellite status, which takes about one day, the new method requires only about half an hour.

The protein research team has significantly improved the method by optimising it for the detection of a molecular change in the tissue. Previously, the tissue could only be visualised morphologically. "This is a big step that shows that IR imaging can become a promising method in future diagnostics and therapy prediction," says Klaus Gerwert.

Encouraging feasibility study

In collaboration with the Institute of Pathology at RUB headed by Professor Andrea Tannapfel and the Department of Haematology and Oncology at the RUB St. Josef Hospital, the research team conducted a feasibility study with 100 patients. It showed a sensitivity of 100 per cent and a specificity of 93 per cent: all MSI tumours were correctly classified with the new method, while only a few samples were falsely identified as MSI. An expanded clinical trial is now starting, which will be carried out on samples from the Colopredict Plus 2.0 registry study. Initiated by Andrea Tannapfel and Anke Reinacher-Schick, the registry study allows the validation of the results from the published work. "The methodology is also of great interest to us, because very little sample material is used, which can be a decisive advantage in today's diagnostics with an increasing number of applicable techniques," explains Andrea Tannapfel.
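Sensitivity and specificity follow directly from the counts of correctly and incorrectly classified samples. The arithmetic below uses an invented split of the 100 patients that is merely consistent with the reported 100 per cent sensitivity and 93 per cent specificity; the actual per-class counts are in the paper.

```python
def sensitivity(tp, fn):
    return tp / (tp + fn)   # fraction of actual MSI tumours correctly flagged

def specificity(tn, fp):
    return tn / (tn + fp)   # fraction of MSS tumours correctly left unflagged

# Invented example counts roughly consistent with the reported figures
# (every MSI tumour found, a few MSS samples misflagged as MSI):
tp, fn, tn, fp = 15, 0, 79, 6
print(f"sensitivity: {sensitivity(tp, fn):.0%}")   # 100%
print(f"specificity: {specificity(tn, fp):.0%}")   # ~93%
```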

Another step towards personalised healthcare

In future, the method is to be introduced into the clinical workflow to assess its potential for precision oncology. "Following an increasingly targeted therapy of oncological diseases, it is very important to provide rapid and precise diagnostics," concludes Anke Reinacher-Schick.

Credit: 
Ruhr-University Bochum