Tech

Robots grow mini-organs from human stem cells

image: This is a bird's eye view of a microwell plate containing kidney organoids, generated by liquid handling robots from human stem cells. Yellow boxed region is shown at higher magnification. Red, green, and yellow colors mark distinct segments of the kidney.

Image: 
Freedman Lab/UW Medicine

An automated system that uses robots has been designed to rapidly produce human mini-organs derived from stem cells. Researchers at the University of Washington School of Medicine in Seattle developed the new system.

The advance promises to greatly expand the use of mini-organs in basic research and drug discovery, according to Benjamin Freedman, assistant professor of medicine, Division of Nephrology, at the UW School of Medicine, who led the research effort.

"This is a new 'secret weapon' in our fight against disease,' said Freedman, who is a scientist at the UW Institute for Stem Cell and Regenerative Medicine, as well as at the Kidney Research Institute, a collaboration between the Northwest Kidney Centers and UW Medicine.

A report describing the new technique will be published online May 17 in the journal Cell Stem Cell. The lead authors were research scientists Stefan Czerniecki and Nelly Cruz from the Freedman lab, and Dr. Jennifer Harder, assistant professor of internal medicine, Division of Nephrology at the University of Michigan School of Medicine, where she is a kidney disease specialist.

The traditional way to grow cells for biomedical research, Freedman explained, is to culture them as flat, two-dimensional sheets, which are overly simplistic. In recent years, researchers have been increasingly successful in growing stem cells into more complex, three-dimensional structures called mini-organs or organoids. These resemble rudimentary organs and in many ways behave similarly. While these properties make organoids ideal for biomedical research, they also pose a challenge for mass production. The ability to mass-produce organoids is among the most exciting potential applications of the new robotic technology, according to the developers.

In the new study, the researchers used a robotic system to automate the procedure for growing stem cells into organoids. Although similar approaches have been successful with adult stem cells, this is the first report of successfully automating the manufacture of organoids from pluripotent stem cells. That cell type is versatile and capable of becoming any type of organ.

In this process, the liquid-handling robots introduced the stem cells into plates that contained as many as 384 miniature wells each, and then coaxed them to turn into kidney organoids over 21 days. Each little microwell typically contained ten or more organoids, and each plate contained thousands of organoids. With a speed that would have impressed Henry Ford's car assembly line, the robots could produce many plates in a fraction of the time.

"Ordinarily, just setting up an experiment of this magnitude would take a researcher all day, while the robot can do it in 20 minutes," said Freedman.

"On top of that, the robot doesn't get tired and make mistakes," he added. "There's no question. For repetitive, tedious tasks like this, robots do a better job than humans."

The researchers further trained robots to process and analyze the organoids they produced. Harder and her colleagues at the University of Michigan Kidney Center used an automated, cutting-edge technique called single cell RNA sequencing to identify all the different types of cells found in the organoids.

"We established that these organoids do resemble developing kidneys, but also that they contain non-kidney cells that had not previously been characterized in these cultures," said Harder.

"These findings give us a better idea of the nature of these organoids and provide a baseline from which we can make improvements," Freedman said. "The value of this high-throughput platform is that we can now alter our procedure at any point, in many different ways, and quickly see which of these changes produces a better result."

Demonstrating this, the researchers discovered a way to greatly expand the number of blood vessel cells in their organoids to make them more like real kidneys.

The researchers also used their new technique to search for drugs that could affect disease. In one of these experiments, they produced organoids with mutations that cause polycystic kidney disease, a common, inherited condition that affects one in 600 people worldwide and often leads to kidney failure.

In this disease, tiny tubes in the kidneys and other organs swell like balloons and form expanding cysts that crowd out the healthy tissue.

In their experiment, the researchers exposed the polycystic kidney disease organoids to a number of substances. They found that one, a factor called blebbistatin that blocks a protein called myosin, led to a significant increase in the number and size of cysts.

"This was unexpected, since myosin was not known to be involved in PKD," Freedman said. Myosin, which is better known for its role in muscle contraction, may allow kidney tubules to expand and contract. If it is not functioning properly it might lead to cysts, Freedman explained.

"It's definitely a pathway we will be looking at," he said.

Credit: 
University of Washington School of Medicine/UW Medicine

Arthritis drugs potentially safe for expectant mothers

image: This is Dr. Évelyne Vinet of the Centre for Outcomes of Evaluative Research (CORE) of the Research Institute of the MUHC.

Image: 
MUHC

Montreal, May 17, 2018 - A new study led by a team at the Research Institute of the MUHC (RI-MUHC) in Montreal has revealed that pregnant women with rheumatoid arthritis (RA) may be able to use certain RA drugs without possible increased health risks to their unborn babies. The research findings are published today in the journal Arthritis & Rheumatology.

Rheumatoid arthritis is a debilitating disease with physical, emotional, and economic consequences that afflicts about one per cent of the world's adult population. This autoimmune disease, which causes chronic inflammation of the joints and other areas of the body, affects two to three times more women than men and there is no existing cure.

Dr. Évelyne Vinet and her team from the Centre for Outcomes of Evaluative Research (CORE) of the RI-MUHC analyzed offspring exposed to tumour necrosis factor inhibitors (TNFi) - immunosuppressant RA drugs commonly used to reduce inflammation and relieve pain.

They did not observe any marked excess risk of serious side effects when compared to unexposed children from mothers with RA and children from the general population. Their research showed that although TNFi cross the placenta, these drugs may not increase immunosuppression or compromise the child's ability to fight infections.

"Knowing there is not necessarily an association between infections and these RA drugs will be very reassuring to expectant mothers," says first author of the study Dr. Vinet, who is a scientist from the Infectious Diseases and Immunity in Global Health Program at the RI-MUHC and an assistant professor in the Department of Medicine and Division of Rheumatology at the Faculty of Medicine of McGill University. "It is important to highlight these findings so would-be mothers understand they can enjoy a normal pregnancy without being burdened by unnecessary stress."

Dr. Vinet's team studied nearly 3,000 children from mothers with RA (the largest cohort ever assembled) and a randomly selected group of nearly 15,000 children over the course of their first year of life. Within the RA group, 380 children were exposed to TNFi, and 3.2 per cent of them presented serious infections. That rate is just slightly above the rates among children with no TNFi exposure (2 per cent) and in the control group (1.9 per cent).
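Those headline rates can be compared with a crude, unadjusted calculation. The sketch below uses only the figures quoted above (with approximate denominators, since the article gives "nearly" 3,000 and 15,000 children) and is not the study's own statistical analysis:

```python
import math

# Serious-infection rates in the first year of life, as reported above.
exposed_n, exposed_rate = 380, 0.032   # TNFi-exposed children of mothers with RA
unexposed_rate = 0.020                 # children of mothers with RA, no TNFi exposure
general_rate = 0.019                   # general-population comparison group

# Approximate denominators, assumed from the article's "nearly 3,000" and "nearly 15,000".
unexposed_n = 3000 - exposed_n
general_n = 15000

def risk_difference(p1, n1, p2, n2, z=1.96):
    """Difference in proportions with a rough Wald 95% CI (illustration only)."""
    diff = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return diff, diff - z * se, diff + z * se

d, lo, hi = risk_difference(exposed_rate, exposed_n, unexposed_rate, unexposed_n)
print(f"TNFi-exposed vs unexposed RA offspring: {d:+.1%} (rough 95% CI {lo:+.1%} to {hi:+.1%})")

d2, lo2, hi2 = risk_difference(exposed_rate, exposed_n, general_rate, general_n)
print(f"TNFi-exposed vs general population:     {d2:+.1%} (rough 95% CI {lo2:+.1%} to {hi2:+.1%})")
```

With only 380 exposed children, an absolute difference of about one percentage point sits well within the statistical noise of a comparison like this, which is consistent with the authors' conclusion of no marked excess risk.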

"However, until further studies are conducted to address this issue," says Dr. Vinet, "it is important to follow current recommendations when treating women with rheumatoid arthritis during pregnancy."

Credit: 
McGill University Health Centre

New method eliminates guesswork when lenses go freeform

image: Using a new step-by-step method developed by Aaron Bauer, a senior research engineer at the University of Rochester's Center for Freeform Optics (CeFO), these eight different designs for a three-mirror reflective imager were ranked by their potential to be corrected using freeform optics, with Tier 1 having the greatest potential.

Image: 
(University of Rochester illustration / courtesy Jannick Rolland)

Lenses and mirrors with freeform surfaces enable designers to focus light within optical devices that are lighter, more compact, and more effective than ever before.

But until now, determining which freeform surfaces will work best - if at all - in a given configuration of mirrors and lenses has been a time-consuming and often expensive process of trial and error.

It doesn't have to be that way anymore.

In a paper in Nature Communications, lead author Aaron Bauer, a senior research engineer at the University of Rochester's Center for Freeform Optics (CeFO), combines theory and practice in a step-by-step method that eliminates much of the guesswork.

"Aaron has developed a process to design with freeform surfaces that can be applied very generally," says coauthor Jannick Rolland, CeFO director and Brian F. Thompson Professor of Optical Engineering. "It's really beautiful and even at times feels like magic."

She believes the findings will help accelerate the adoption of freeform optics in industry. "People will no longer say 'Oh, it's too expensive to build with freeform optics,'" she says. "Because now you can make something that may cost a tenth of what it would have cost otherwise."

Laying the groundwork

For as long as mirrors and lenses have been packaged together in telescopes, spectrometers, and a host of other optical devices, performance has been defined by how well those elements are able to keep a beam of light focused with minimal "aberration."

Traditionally, optical designers have relied on rotationally symmetric optical surfaces, because their design and manufacture are relatively straightforward.

Within the last 20 years, advances in high-speed micro milling, computer-controlled lens polishing, and ion beam etching, among other technologies, have made asymmetric freeform surfaces more feasible.

In a 2014 paper, Kyle Fuerschbach, a former member of the Rolland Lab, laid the groundwork for a theory of aberrations for freeform surfaces.

"But we still didn't have a systematic process to design with that theory," Rolland says.

Putting two and two together

Bauer, in the meantime, was working alongside Fuerschbach, designing a head-worn display using freeform surfaces.

"I noticed that there were very common patterns of aberrations that were always popping up, and limiting my system from going any further," Bauer says. Moreover, "those patterns of aberration matched the ones that Kyle predicted would be corrected by freeform surfaces. So, I put two and two together."

The method he came up with starts with the initial "folding geometry" (alignment of mirrors and lenses) contemplated for a design, and then, based on an analysis of the various aberrations produced by that alignment, predicts:

whether freeform surfaces could minimize those aberrations and, if so,

which freeform surfaces should be used for maximum effect.

"Freeform surfaces are not a universal solution for correcting every aberration," Bauer notes. "So, what our method does is to allow designers to analyze all of these geometries ahead of time, in order to predict whether or not there would be a good solution."

That's far better than the "brute force" approach where "people heuristically try various freeform surfaces into a design," Rolland says. "Even if it eventually works, you could end up with a system where the departure of the surfaces is much larger than they would be otherwise, because all those freeform surfaces may be fighting each other. And if it does not work, there is nowhere go as a designer."

By using Bauer's method instead, she says, "you will be able to design something that is a lot simpler, and that will be easier to manufacture and test. Furthermore, the method will quickly and unequivocally provide insight into why a given geometry might be intrinsically limited, which is essential for designers."

Credit: 
University of Rochester

Climate change impacts fragile river ecosystems

image: This is South Africa's Kruger National Park.

Image: 
Stephen Tooth

Boulder, Colo., USA: Research undertaken in South Africa's Kruger National Park (KNP) has shown that some of the world's most sensitive and valuable riverine habitats are being destroyed due to an increasing frequency of cyclone-driven extreme floods.

As part of a Natural Environment Research Council (UK) funded project, researchers from the universities of Hull, Aberystwyth, and Salford and the engineering consultants AECOM (Architecture, Engineering, Consulting, Operations, and Maintenance) used airborne laser survey technology (LiDAR) to measure the impacts of cyclone-driven extreme floods in 2000 and 2012 on rivers in the KNP. The KNP game reserve has global significance for its habitats and associated species, and the rivers flowing through the park provide essential ecosystem services, including water and habitat in the shape of the many varied channel morphologies and associated riparian forest. The high-resolution data have been used to create accurate digital models of the river bed, and through comparisons with pre-2012 flood data, the researchers were able to map detailed spatial patterns of erosion and deposition.

Dr. David Milan, University of Hull, principal investigator for the project, said, "We are primarily interested in trying to understand how these large bedrock-influenced river channels respond to large floods. From comparing our LiDAR models between 2012 and 2004, we have calculated that the 2012 event alone removed almost 1.25 million tonnes of sediment from the river bed. We also found that patches of mature riparian forest that survived larger floods in 2000 were removed by the 2012 floods. There is a suggestion that the frequency of large flood events is increasing due to climate change, and our analysis of river channel morphology for a 50 km length of the Sabie River shows us that these rivers need timespans longer than a decade to recover."
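Volume and mass estimates of this kind typically rest on a "DEM of difference" calculation: subtract the pre-flood digital elevation model from the post-flood one and convert the net lowering into sediment mass. The sketch below is a minimal illustration of that bookkeeping, with a toy grid, an assumed 5 m cell size and an assumed bulk density of 1.6 t/m^3 rather than values from the study:

```python
import numpy as np

# Minimal DEM-of-difference sketch: negative cells mark erosion, positive cells
# mark deposition. Grid, cell size and bulk density are illustrative assumptions.

cell_size = 5.0      # m, assumed DEM resolution
bulk_density = 1.6   # t/m^3, assumed sediment bulk density

dem_pre = np.array([[101.0, 100.5], [100.8, 100.2]])   # elevations before the flood (m)
dem_post = np.array([[100.4, 100.1], [100.9, 99.5]])   # elevations after the flood (m)

dod = dem_post - dem_pre
erosion_volume = -dod[dod < 0].sum() * cell_size**2     # m^3 removed
deposition_volume = dod[dod > 0].sum() * cell_size**2   # m^3 added

print(f"Eroded:    {erosion_volume:.1f} m^3 (~{erosion_volume * bulk_density:.1f} t of sediment)")
print(f"Deposited: {deposition_volume:.1f} m^3")
```

Summed over tens of kilometres of river, calculations of this kind lead to headline figures such as the 1.25 million tonnes quoted above.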

The UK group has concerns over the impacts of the geomorphological changes happening in the KNP rivers. Drylands are known to be some of the regions most vulnerable to climate change, and some climate model predictions suggest increased landfall for cyclones over South Africa.

Dr. Milan continued, "We present a conceptual model showing the pathways that the KNP river systems are likely to follow in the future depending upon flood frequency and magnitude, and conclude by suggesting that more frequent floods will continue to strip out sediment and vegetation from the river channel, leaving a more barren environment with less habitat value. Continued progressive loss of habitat diversity will fundamentally, and for all intents and purposes irreversibly, alter our riverine landscapes and this will be accompanied by a catastrophic loss of species unable to adapt to the new environments. Conservationists need to work alongside geomorphologists to look at ways in which dryland river habitats can be best managed into the future."

These results not only have significance for the rivers of the KNP, but also for bedrock-influenced rivers in other dryland areas globally.

The full findings of this research have just been published in the Geological Society of America Bulletin.

Credit: 
Geological Society of America

Processes in the atomic microcosmos are revealed

Physicists at Friedrich-Alexander-Universität Erlangen-Nürnberg (FAU) have successfully generated controlled electron pulses in the attosecond range. They used optical travelling waves that are formed by laser pulses of different wavelengths. Such attosecond free-electron pulses can be used to reveal the movements of electrons in atoms. The findings of the researchers from Erlangen have been published in the acclaimed journal Physical Review Letters (DOI: 10.1103/PhysRevLett.120.103203).

Scientists have been researching ways of generating packets of electrons in extremely short timescales for several years. Such pulses enable ultrafast movements to be tracked, for example vibrations in atomic lattices, phase transitions in materials or molecular bonds in chemical reactions. 'The shorter the pulse, the faster the movements that can be mapped,' explains Prof. Dr. Peter Hommelhoff, Chair of Laser Physics at FAU. 'However, this also involves the special challenge of how to control the packets of electrons.' Last year, Hommelhoff and his team successfully generated periodic electron pulses with a duration of 1.3 femtoseconds - a femtosecond is one quadrillionth of a second. To do so, they directed a continuous beam of electrons over a silicon lattice and superimposed it with the optical field of laser pulses.

From femtosecond to attosecond pulses

The researchers at FAU have now gone one better and have generated electron pulses of 0.3 femtoseconds, or 300 attoseconds. Lasers were also used for this method. Firstly, packets of electrons are emitted from an electron source using ultraviolet laser pulses. These packets then interact with optical travelling waves that are formed in a vacuum by two infrared laser pulses of different wavelengths. 'The ponderomotive interaction causes a shift in the electron density,' explains Norbert Schönenberger, a researcher at Prof. Hommelhoff's Chair and co-author of the study. 'We break down the electron packet to a certain extent into even smaller packets to generate electron pulses in the attosecond range. The time delay in the arrival of the laser beams enables us to generate specific travelling waves and thus precisely control the trains of pulses.'
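As a textbook-level sketch (not the paper's own derivation), the optical travelling wave can be written as the beat pattern of the two laser fields. For beams with angular frequencies $\omega_{1,2}$ and wave-vector components $k_{1,2}$ along the electron beam, the intensity, and with it the ponderomotive potential $U_p$, is modulated as

$$U_p(z,t) \propto \cos\!\big[(k_1 - k_2)\,z - (\omega_1 - \omega_2)\,t + \varphi\big],$$

a pattern that travels along the beam with phase velocity

$$v_\mathrm{beat} = \frac{\omega_1 - \omega_2}{k_1 - k_2}.$$

Choosing the two wavelengths so that $v_\mathrm{beat}$ matches the electron velocity lets the ponderomotive force act on the packet long enough to compress it into attosecond sub-pulses, while the relative arrival-time delay of the two laser pulses sets the phase $\varphi$ and hence the timing of the pulse train.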

This method developed by the physicists at FAU could revolutionise experiments in electron diffraction and microscopy. In future, attosecond pulses could be used not only to trace the movements of atoms, but even to reveal the dynamics of electrons within atoms, molecules and solids. The results have been published under the title 'Ponderomotive Generation and Detection of Attosecond Free-Electron Pulse Trains' in the renowned journal Physical Review Letters.

Credit: 
Friedrich-Alexander-Universität Erlangen-Nürnberg

Key patterns of thermal expansion and thermal stability of layered ferroelectrics established

image: During the transition to the paraelectric state, the compounds' linear parameters increase more evenly throughout the bulk of the material. This information is important for predicting the behavior of the material under specified operating conditions.

Image: 
Lobachevsky University

Scientists from Lobachevsky University and the Institute of Low Temperatures and Structural Research in Wroclaw (Poland) have conducted unique studies of vibrational properties using modern methods of optical spectroscopy.

Bismuth-containing layered perovskites, first described by Aurivillius, have recently received researchers' increasing attention. Lobachevsky University scientists have obtained the main representatives of the family of Aurivillius phases: Bi2MoO6, Bi2WO6, Bi3NbTiO9, Bi4Ti3O12 and CaBi4Ti4O15. The Aurivillius phases have long remained the main candidate materials for producing nonvolatile memory chips.

Most existing random access memory types are volatile, i.e. the contents of the memory are erased when the power is turned off. The desire to equip computers with non-volatile memory has long been evident; research in this direction has been going on for quite a while, and such types of memory already exist, for example the so-called FRAM (Ferroelectric Random Access Memory).

The main element of the microchip is a thin ferroelectric film. Ferroelectrics are substances that, within a certain temperature range, have spontaneous electric polarization in the absence of an external electric field. The scientific and practical interest in Aurivillius phases is based on the transition from the ferroelectric state to the paraelectric phase, which is accompanied by the disappearance of spontaneous polarization. FRAM chips operate by using an external electric field to switch the polarization of ferroelectric domains between two stable states, with the memory cell storing 0 and 1, respectively; the information is recorded or read through this polarization switching.

Microchips may have to operate under extreme conditions, such as high temperatures, so information on these compounds' thermal stability is required. Lobachevsky University scientists have studied the behavior of the compounds when heated and have determined the operating temperature range of the microchips' material. In addition, the temperature of the transition from the ferroelectric state to the paraelectric state was determined by differential scanning calorimetry in conjunction with high-temperature X-ray diffraction. The dependence of the transition temperature on the composition and structure of the samples was revealed for a number of compounds. In the future, this will help to obtain samples with specified properties.

Memory chips resistant to thermal effects can be used at chemical plants to control industrial processes (for example, under synthesis conditions) and in fire protection systems equipped with video recording. To estimate how the linear dimensions of the material change upon heating, the researchers therefore studied the thermal expansion of the Aurivillius phases.

According to Professor Alexander Knyazev, Dean of the UNN Faculty of Chemistry, the researchers found that an increase in the linear dimensions of the heated samples occurred primarily in the horizontal plane.

"During the transition to the paraelectric state, the compounds' linear parameters increase more evenly throughout the bulk of the material. This information is important for predicting the behavior of the material under specified operating conditions," Alexander Knyazev notes.

Jointly with their colleagues from the Institute of Low Temperatures and Structural Research in Wroclaw (Poland), UNN scientists have carried out unique studies of vibrational properties using modern methods of optical spectroscopy. The results of the study reveal a number of structural features characteristic only of the Aurivillius phases due to their layered structure.

As a follow-up to this research, the Nizhny Novgorod team is now studying the Dion-Jacobson phases, which also belong to the class of layered perovskites. The researchers' interest in these compounds is due to the possibility of their use as ferroelectrics, dielectrics, piezoelectrics, superconductors and photocatalysts for water decomposition under the action of visible light. The use of the Dion-Jacobson phases as an initial reagent for the synthesis of other layered perovskites is also of great importance.

Credit: 
Lobachevsky University

Cell type switch helps colon cancer evade treatment, study suggests

image: Human colon cancers contain two populations of cancer cells, one at the tumor edge in which the MAPK pathway is highly active (indicated by green staining and the white arrowhead), and one in the center of the tumor in which the NOTCH pathway is activated (indicated by red staining and the white arrow).

Image: 
Schmidt et al., 2018

Researchers in Germany have discovered that colon cancers are often resistant to existing drug treatments because they are composed of two different cell types that can replace each other when one cell type is killed. The study, which will be published May 16 in the Journal of Experimental Medicine, suggests that combination therapies targeting both cell types at once may be more effective at treating colorectal cancer, the third highest cause of cancer-related death in the United States.

Early-stage colon cancers can be surgically removed but later stages of the disease require more targeted treatments, including therapies designed to block the MAPK signaling pathway that promotes colon cancer progression. “However, targeting MAPK signaling has limited effects and usually prolongs patient survival by only a few months. We therefore urgently need radical improvements in targeted therapy for patients with colorectal cancer,” says Professor David Horst of the Charité University Hospital in Berlin, Germany.

One potential alternative is to target the NOTCH signaling pathway, which is also thought to drive colon cancer progression even though, in bladder cancer, it suppresses MAPK signaling. But initial trials of NOTCH pathway inhibitors have so far yielded disappointing results.

Horst and colleagues examined over 300 patient samples and found that the NOTCH pathway wasn’t activated in all colon cancer cells. Cells in the middle of tumors showed signs of active NOTCH signaling but reduced MAPK activity. This population of cells appeared to be highly proliferative. Cells at the edges of colon cancers, in contrast, showed high levels of MAPK signaling but little NOTCH pathway activity. This population of cells was less proliferative but appeared to be undergoing the initial stages of metastasis, in which colon cancer cells invade and spread to other tissues.

These two different cell types could also be seen in the tumors formed by human colon cancer cells injected into mice. The tumors quickly lost their MAPK-active cells when the researchers treated these mice with the MAPK pathway inhibitor selumetinib, but the number of NOTCH-active cells increased so that there was minimal disruption to the tumors’ overall growth. And, after stopping selumetinib treatment, some of these NOTCH-active cells gave rise to new MAPK-active cells at the tumor edge.

In contrast, treatment with the NOTCH pathway inhibitor dibenzazepine eliminated NOTCH-active cells from tumors but the population of MAPK-active cells expanded and gave rise to new NOTCH-active cells once dibenzazepine treatment was discontinued.

“This suggests that colon cancers may evade targeted treatment against MAPK or NOTCH signaling by a reversible shift in the predominating pathway activity,” says Horst. “However, when combining both therapies to target both cell populations, we found strong repressive effects on tumor cell proliferation and increased cell death, resulting in slower tumor growth and prolonged survival times compared to either treatment alone.”

Horst and colleagues note that targeting the NOTCH pathway alone may even be detrimental to colon cancer patients, if it results in an increased number of MAPK-active cells poised to undergo metastasis.

“Our data support a new concept for cancer therapy that advocates specific and simultaneous targeting of several different tumor cell subpopulations to strongly improve therapy response,” Horst says. “Further preclinical and clinical trials may therefore reveal if combined MAPK and NOTCH inhibition, in addition to established chemotherapeutic protocols, can improve therapy response in patients with colorectal cancer.”

Credit: 
Rockefeller University Press

Not quite a 'double bind' for minority women in science

COLUMBUS, Ohio - Many studies have shown that both minority and women scientists face disadvantages in reaching the highest levels of their careers.

So it would make sense that minority women would face a "double bind" that would particularly disadvantage them.

But a new study using a massive database of scientific articles suggests that minority women actually face what might be called a "one-and-a-half bind." They are still worse off than other groups, but their disadvantage is less than the disadvantage of being black or Hispanic plus the disadvantage of being a woman.

"There is less disadvantage than you would have thought if you simply added the penalties of being a minority and being a woman," said Bruce Weinberg, co-author of the study and professor of economics at The Ohio State University.

The study appears in the May 2018 issue of AEA Papers and Proceedings.

The findings are particularly timely now, said study co-author Gerald Marschke, associate professor of economics at the University at Albany, State University of New York.

"The underrepresentation of women and minorities is a huge concern to policymakers and is the focus of many commissions and initiatives," Marschke said.

The researchers used an innovative method to overcome one of the biggest issues in studying the careers of minority women.

"Because of the small number of minorities and the small number of women in some science careers, it is hard to study them, particularly with people who are members of both groups," Weinberg said.

The researchers found a way around this problem by using a convention of the biomedical sciences to their advantage. In the journals where these scientists publish their results, the last author listed on an article is the principal investigator who supported the work and has the highest level of prestige.

"Being listed as the last author is the pinnacle of the research career and has a lot of status that goes along with it," Weinberg said. So the researchers compared how many minorities and women were listed as last author on papers compared to white men.

The study used a massive database of 486,644 articles with two to nine authors published in medical journals by U.S. scientists between 1946 and 2009. Computer software categorized author names by race, ethnicity and gender.

This software also identified individual authors so that the researchers could follow how scientists' authorship position on papers changed over the course of their careers.

Overall, results showed that the probability of being a last author - the prestige position - increased from 18 percent during the first four years of a scientist's career to 37 percent at 25 to 29 years into a career.

Black scientists were substantially less likely to be last authors compared to white men from five years into their careers onward, with a gap of 6 percentage points at 25 to 29 years.

The movement of women and Hispanics into last authorship was even slower, with a gap of 10 percentage points after 25 years in their career.

Marschke noted that women and minorities have fewer publications than white men and controlling for these differences can account for some of the gaps with white men.

"But even after controlling for experience differences you see these gaps," Marschke said.

The researchers also did several statistical analyses to assess the impact of various factors on whether an author would have the last position on an article.

In one such analysis, they found that blacks were 0.4 percentage points less likely than white men to be the last author and women were about 4 percentage points less likely to be listed last.

Given that, it would have been reasonable to assume that the penalty for black women would be at least the sum of those two disadvantages, or 4.4 percentage points, Weinberg said.

But in fact, the findings showed black women were about 3.5 percentage points less likely than white men to receive the last authorship position.

"You lose something for being black and you lose something for being a woman. But you lose less than simply adding those two disadvantages together," he said.

A similar result was found for Hispanic women.

This result was surprising, Weinberg said, partly because the two disadvantages could have been more than just additive.

"Our expectation, based on research that has been done on intersectionality, was that, if anything, the penalties of being a woman and being a minority could have compounded each other, and their position would have been even worse," he said.

Marschke added: "Women who are minorities may feel isolated by their minority status, but unlike minority men, also face the strain of balancing careers and families like white women. But unlike white women, they also have to uphold their roles as women within their culture."

The researchers are now investigating why they found these results and trying to determine if factors like the number of people on a research team and the source of funding may affect how women and minorities fare.

Credit: 
Ohio State University

No motor, no battery, no problem

image: An artist's rendering of the new design for a robot that uses material deformation to propel itself through water.

Image: 
Tian Chen and Osama R. Bilal/Caltech

Video available at: https://youtu.be/tBxb1A6boTI

Engineers at Caltech and ETH Zurich have developed robots capable of self-propulsion without using any motors, servos, or power supply. Instead, these first-of-their-kind devices paddle through water as the material they are constructed from deforms with temperature changes.

The work blurs the boundary between materials and robots. In the self-propelled devices, the material itself makes the machine function. "Our examples show that we can use structured materials that deform in response to environmental cues, to control and propel robots," says Chiara Daraio, professor of mechanical engineering and applied physics in Caltech's Division of Engineering and Applied Science, and corresponding author of a paper unveiling the robots that appears in the Proceedings of the National Academy of Sciences on May 15.

The new propulsion system relies on strips of a flexible polymer that curl when cold and stretch out when warm. The polymer is positioned to activate a switch inside the robot's body, which is in turn attached to a paddle that rows the robot forward like a rowboat.

The switch is made from a bistable element, which is a component that can be stable in two distinct geometries. In this case, it is built from strips of an elastic material that, when pushed on by the polymer, snaps from one position to another.

When the cold robot is placed in warm water, the polymer stretches out, activates the switch, and the resulting sudden release of energy paddles the robot forward. The polymer strips can also be "tuned" to give specific responses at different times: that is, a thicker strip will take longer to warm up, stretch out, and ultimately activate its paddle than a thinner strip. This tunability allows the team to design robots capable of turning and moving at different speeds.
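The thickness tuning can be pictured with a simple lumped-capacitance heating model: the delay before a strip warms to its switching temperature grows roughly linearly with its thickness. The sketch below only illustrates that scaling; all temperatures and material parameters are assumed, not taken from the paper:

```python
import math

# Lumped-capacitance estimate of when a polymer strip reaches its switching
# temperature after the robot is dropped into warm water. All parameter values
# are illustrative assumptions.

def activation_delay(thickness_m, T_water=45.0, T_start=20.0, T_switch=40.0,
                     rho=1200.0, c=1500.0, h=500.0):
    """Seconds until a strip of the given thickness reaches T_switch.

    Time constant tau = rho * c * (thickness / 2) / h for a strip heated from
    both faces; t = tau * ln((T_water - T_start) / (T_water - T_switch)).
    """
    tau = rho * c * (thickness_m / 2.0) / h
    return tau * math.log((T_water - T_start) / (T_water - T_switch))

for t_mm in (0.5, 1.0, 2.0):
    print(f"{t_mm} mm strip -> paddle triggers after ~{activation_delay(t_mm / 1000):.1f} s")
```

Doubling the thickness doubles the thermal time constant, which is what lets a thicker strip fire its paddle later than a thinner one and gives the designers a sequencing knob.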

The research builds on previous work by Daraio and Dennis Kochmann, professor of aerospace at Caltech. They used chains of bistable elements to transmit signals and build computer-like logic gates.

In the latest iteration of the design, Daraio's team and collaborators were able to link up the polymer elements and switches in such a way to make a four-paddled robot propel itself forward, drop off a small payload (in this case, a token with a Caltech seal emblazoned on it), and then paddle backward.

"Combining simple motions together, we were able to embed programming into the material to carry out a sequence of complex behaviors," says Caltech postdoctoral scholar Osama R. Bilal, co-first author of the PNAS paper. In the future, more functionalities and responsivities can be added, for example using polymers that respond to other environmental cues, like pH or salinity. Future versions of the robots could contain chemical spills or, on a smaller scale, deliver drugs, the researchers say.

Currently, when the bistable elements snap and release their energy, they must be manually reset in order to work again. Next, the team plans to explore ways to redesign the bistable elements so that they are self-resetting when water temperature shifts again--making them potentially capable of swimming on indefinitely, so long as water temperature keeps fluctuating.

Credit: 
California Institute of Technology

Some calories more harmful than others

While calories from any food have the potential to increase the risk of obesity and other cardiometabolic diseases, 22 nutrition researchers agree that sugar-sweetened beverages play a unique role in chronic health problems. The disease risk increases even when the beverages are consumed within diets that do not result in weight gain.

It's just one of the conclusions published today in Obesity Reviews in a position paper by a group of researchers who participated in the 2017 CrossFit Foundation Academic Conference. The researchers were tasked with deliberating the question: Are all calories equal with regard to their effects on cardiometabolic disease and obesity? The paper provides an extensive review of the current science on diets that can lead to obesity, cardiovascular disease and Type 2 diabetes.

The paper's sugar-sweetened beverage consensus is particularly relevant in light of a recent legal battle over warning labels on soda, which hinged on the 9th Circuit Court's determination of whether soda and other sweetened beverages are uniquely harmful to human health or one source of calories among many.

"What's new is that this is an impressive group of scientists with vast experience in nutrition and metabolism agreeing with the conclusion that sugar-sweetened beverages increase cardiometabolic risk factors compared to equal amounts of starch," said lead author Kimber Stanhope, a research nutritional biologist with the School of Veterinary Medicine at the University of California, Davis.

SUGAR SUBSTITUTE WON'T MAKE YOU FAT

Another interesting point of consensus among researchers is the role of the sugar substitute aspartame. The authors agreed that aspartame does not promote weight gain in adults. Stanhope said this might come as a surprise to most people.

"If you go on the internet and look up aspartame, the layperson would be convinced that aspartame is going to make them fat, but it's not," said Stanhope. "The long and short of it is that no human studies on noncaloric sweeteners show weight gain."

The authors also agreed that consumption of polyunsaturated (n-6) fats, such as those found in some vegetable oils, seeds and nuts, lowers disease risk when compared with equal amounts of saturated fats. However, that conclusion comes with a caveat. Dairy foods such as cheese and yogurts, which can be high in saturated fats, have been associated with reduced cardiometabolic risk.

The paper reviews the significant challenges involved in conducting and interpreting nutrition research.

"We have a long way to go to get precise answers on a lot of different nutrition issues," said Stanhope. "Nevertheless, we all agree that a healthy diet pattern consisting of minimally processed whole grains, fruit, vegetables, and healthy fats promotes health compared with the refined and palatable typical Western diet pattern."

Credit: 
University of California - Davis

Monitoring lava lake levels in Congo volcano

image: Nyiragongo crater lava lake, Democratic Republic of Congo.

Image: 
Julien Barriere

Nyiragongo in the Democratic Republic of the Congo is among the world's most active volcanoes, with a persistent lava lake as one of its defining features. In a talk at the 2018 SSA Annual Meeting, Adrien Oth of the European Center for Geodynamics and Seismology discussed how he and his colleagues are using multiple methods to monitor lava lake levels at the volcano.

The researchers analyze seismic and infrasound signals generated by the volcano as well as data collected during satellite flyovers to measure Nyiragongo's lake level fluctuations. During the eruption in 2002, which caused a major humanitarian crisis, the lava lake was drained, and the depth of the remaining crater was estimated at between 600 and 800 meters. About four months after the eruption, the crater started filling up again. Nowadays, the inner crater floor is about 400 meters below the rim and the lava lake remains at a high level.

"The lava lake level is, among other things, related to the variations of the pressure inside the magmatic system underneath Nyiragongo volcano," Oth and his colleagues explained." In that sense, the lava lake represents a window into the magmatic system, and its level fluctuations provide information on the recharge and drainage of the magmatic system, such as batches of fresh magma and/or gas, or lateral magmatic intrusions into the surrounding crust."

The different techniques used to observe the lava lake offer a more complete look at the volcano's activity, the authors said. The seismic and infrasound data, collected continuously, help researchers gauge pressure changes in magmatic activity. "Until very recently, very few high-quality data were available for this region," the researchers noted. "Over the past few years, our consortium assisted the Goma Volcano Observatory to deploy one of the densest modern real-time telemetered monitoring systems in Africa. Combined with modern processing techniques, these newly acquired datasets provide unprecedented opportunities to investigate the behavior of this unique magmatic system."

In combination with the seismic and infrasound data, the scientists are using high-resolution synthetic-aperture radar (SAR) images captured by satellites passing over the volcano to directly measure the rise and fall of the lava lake level. From these images they measure the length of the shadow cast by the crater rim on the lava lake surface, which can be used to calculate the depth of the lake surface below the rim.
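The underlying geometry is simple: for a vertical rim wall and a flat lake surface, a rim standing a height h above the lake casts a radar shadow whose ground-range length is L = h x tan(theta), where theta is the local radar incidence angle, so h = L / tan(theta). The numbers in the sketch below are illustrative assumptions, not measurements from the talk:

```python
import math

# Simplified SAR shadow geometry: depth of the lava lake surface below the
# crater rim from the measured shadow length. Shadow length and incidence
# angle below are assumed for illustration.

def depth_from_shadow(shadow_length_m, incidence_deg):
    """Depth below the rim implied by a radar shadow of the given ground-range length."""
    return shadow_length_m / math.tan(math.radians(incidence_deg))

print(f"Depth below rim: {depth_from_shadow(340.0, 40.0):.0f} m")  # ~405 m for a 340 m shadow at 40 deg
```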

The lava lake observations are only one piece of the puzzle within the regional volcanic system, and "will certainly be of key importance for successful eruption forecasting in the future," said Oth and colleagues. "At this stage, however, these observations need to be first put into the larger context of the magmatic system in order to allow their proper interpretation in terms of eruptive processes."

Credit: 
Seismological Society of America

Wildfires may cause long-term health problems for endangered orangutans

image: This is Otto, an adult flanged male orangutan, avoiding fires and traveling through a smoke-filled forest in the Tuanan Orangutan Research Station in Indonesian Borneo.

Image: 
Beth Barrow

Orangutans, already critically endangered due to habitat loss from logging and large-scale farming, may face another threat in the form of smoke from natural and human-caused fires, a Rutgers University-New Brunswick study finds.

The study appears in the journal Scientific Reports.

In 2015, Wendy Erb, a postdoctoral researcher in the Department of Anthropology at Rutgers, was studying male orangutans in the forests of Indonesian Borneo when fires started. She and her colleagues at the Tuanan Orangutan Research Station continued working until they had to stop and help fight the blazes, which occur annually, often due to smallholder farmers and plantations clearing forests to plant crops.

A few weeks into the fire season, Erb noticed a difference in the sound of the males' "long call," which scientists believe is used to attract females and warn other males. "I thought they sounded raggedy, a little like humans who smoke a lot," she said.

Erb decided to find out if the smoke the orangutans inhaled during the fires had affected their health. Humans who inhale smoke suffer ill effects, but she knew of no studies on the possible effects on orangutans.

Erb studied four "flanged" males, who weigh about 200 pounds and have large cheek pads. She awoke each day before dawn to collect their urine in a bag at the end of a stick she held below them. Analyzing their behavior and urine, the scientists discovered the big males traveled less, rested more and consumed more calories. They also produced more ketone bodies, molecules made by the liver from fatty acids during periods of low food intake, which was unexpected because the apes were eating more, not less. Why were they burning fat?

The only new element in the orangutans' lives was the three months of fire and smoke. The forests' natural surface consists of peat, which is flammable, allowing the fires to burn underground for weeks. The fires were worse in 2015 because of a strong El Niño effect, which brought with it a severe drought.

Soil analyses suggest that wildfires have occurred in Borneo for millennia, but have become increasingly frequent and intense in recent decades due to deforestation and draining of peatlands. In 2015, Indonesia experienced the most severe fire activity and smoke pollution on record since the disastrous wildfires during the 1997 El Niño droughts burned some 24,000 square kilometers of peatlands (12 percent of the total peat area). Peatland fires destroy forest habitats, release greenhouse gases and produce hazardous particulate matter, the leading cause of worldwide pollution-related mortality. Two independent studies estimated that the 2015 haze caused somewhere between 12,000 and 100,000 premature human deaths, but there has been very little research into the effects on wildlife populations inhabiting these burning habitats.

The unexpected loss of nearly 100,000 Bornean orangutans from intact forests in Kalimantan between 1999 and 2015 indicates that habitat loss alone is not driving this critically endangered species' declines. Increasingly frequent exposure to toxic smoke could have severe consequences for orangutans, other animals and people, and this research highlights the urgent need to understand the long-term and indirect impacts of Indonesia's peatland fires, beyond the immediate loss of forests and their inhabitants.

Anthropology professor Erin Vogel, co-author of the study and the Tuanan Research Station's co-director, said the next step is to analyze data from female and juvenile orangutans to see how the fires affected their health.

"We'll look at different indicators of inflammation in the urine," she said. "We'll look for cytokines, proteins that are part of the immune response, and cortisol, a hormone associated with stress. It's possible these males are burning fat because their energy is going to repairing tissue."

Credit: 
Rutgers University

Joint resolution: A link between Huntington's disease and rheumatoid arthritis

image: This is Gary Firestein, MD, dean and associate vice chancellor of translational medicine at UC San Diego School of Medicine.

Image: 
UC San Diego Health

Using new analytic tools, researchers at University of California San Diego School of Medicine and the Icahn School of Medicine at Mount Sinai have decoded the epigenetic landscape for rheumatoid arthritis (RA), a common autoimmune disease that affects more than 1.3 million Americans. In unveiling RA's epigenome -- the proteins and molecules that decorate DNA and help turn genes on and off -- scientists made a surprising discovery: an overlap between the causes of RA and Huntington's disease, a fatal and incurable genetic brain disease.

The findings are published online in the May 15 issue of Nature Communications.

The research team, led by senior author Gary S. Firestein, MD, dean and associate vice chancellor of translational medicine at UC San Diego School of Medicine, said the unexpected connection between RA and Huntington's disease opens up the possibility of new therapeutic targets and drugs for both conditions.

"We did not expect to find an overlap between rheumatoid arthritis and Huntington's disease, but discovering the unexpected was the reason that we developed this technology. Now that we have uncovered this connection, we hope that it opens a door for treatment options for people living with either disease," said Firestein.

RA is a chronic inflammatory disorder that causes pain and swelling in joints. As an autoimmune disease, it can also affect other organs, including the heart and blood vessels. Treatment for RA has improved, but 10 to 20 percent of patients do not respond to any available medicines.

The investigative approach used by the research team involved developing a novel algorithm, or set of computational rules, called EpiSig, which integrated and reduced the number of epigenetic combinations in the genes of patients with RA. The team could then identify new cell signaling pathways.

"Comparing different types of epigenomic data is difficult because it involves a variety of different data subsets that cannot normally be analyzed together, including various methods in which DNA gets modified," said Wei Wang, PhD, professor of chemistry, biochemistry and cellular and molecular medicine at UC San Diego School of Medicine.

"This methodology can also be used to find connections between other diseases, not just rheumatoid arthritis," added Firestein. "As genes involved are discovered, researchers can potentially identify new treatment options and even repurpose existing drugs."

Firestein and team studied the epigenome in cells from the joints of patients with RA. Patients with osteoarthritis, which is a disease of cartilage degeneration, served as a control group. Both data sets were analyzed through an expansive process that examines chromatin, DNA and histone modifications. The results produced 12 terabytes of data (12 trillion bytes) that were then analyzed using EpiSig.
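EpiSig itself is described in the Nature Communications paper; purely as a conceptual illustration of what "integrating and reducing" many epigenomic marks can look like, the generic sketch below bins several signals along the genome and groups bins with similar mark combinations (this is not the EpiSig algorithm, and the data are random placeholders):

```python
import numpy as np
from sklearn.cluster import KMeans

# Conceptual illustration only -- not EpiSig. Several epigenomic signals
# (e.g. histone-modification coverage tracks) are binned along the genome,
# and bins with similar combinations of marks are grouped, reducing millions
# of positions to a handful of recurring chromatin signatures.

rng = np.random.default_rng(0)
n_bins, n_marks = 10_000, 6                          # hypothetical genome bins x marks
signal = rng.lognormal(mean=0.0, sigma=1.0, size=(n_bins, n_marks))

X = np.log1p(signal)                                 # tame the dynamic range
X = (X - X.mean(axis=0)) / X.std(axis=0)             # standardise each mark

labels = KMeans(n_clusters=8, n_init=10, random_state=0).fit_predict(X)
print("Bins per chromatin signature:", np.bincount(labels))
```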

Epigenetics, or "above the genome", is the study of processes that alter the gene structure without changing the DNA sequence itself. These DNA modifications are essential to human growth and development and change throughout people's lives. Epigenetic changes are influenced by a variety of environmental factors, including stress, activity and lifestyle choices.

"By revealing the comprehensive epigenetics behind rheumatoid arthritis, we now have a better understanding of this disease. More importantly, our new approach, could not only help patients with rheumatoid arthritis, but also people with other immune-mediated diseases," Firestein said.

Credit: 
University of California - San Diego

The big ethical questions for artificial intelligence in healthcare

AI in healthcare is developing rapidly, with many applications currently in use or in development in the UK and worldwide. The Nuffield Council on Bioethics examines the current and potential applications of AI in healthcare, and the ethical issues arising from its use, in a new briefing note, Artificial Intelligence (AI) in healthcare and research, published today.

There is much hope and excitement surrounding the use of AI in healthcare. It has the potential to make healthcare more efficient and patient-friendly; speed up and reduce errors in diagnosis; help patients manage symptoms or cope with chronic illness; and help avoid human bias and error. But there are some important questions to consider: who is responsible for the decisions made by AI systems? Will increasing use of AI lead to a loss of human contact in care? What happens if AI systems are hacked?

The briefing note outlines the ethical issues raised by the use of AI in healthcare, such as:

the potential for AI to make erroneous decisions;

who is responsible when AI is used to support decision-making;

difficulties in validating the outputs of AI systems;

the risk of inherent bias in the data used to train AI systems;

ensuring the security and privacy of potentially sensitive data;

securing public trust in the development and use of AI technology;

effects on people's sense of dignity and social isolation in care situations;

effects on the roles and skill-requirements of healthcare professionals; and

the potential for AI to be used for malicious purposes.

Hugh Whittall, Director of the Nuffield Council on Bioethics, says:

"The potential applications of AI in healthcare are being explored through a number of promising initiatives across different sectors - by industry, health sector organisations and through government investment. While their aims and interests may vary, there are some common ethical issues that arise from their work.

Our briefing note outlines some of the key ethical issues that need to be considered if the benefits of AI technology are to be realised, and public trust maintained. These are live questions that set out an agenda for newly-established bodies like the UK Government Centre for Data Ethics and Innovation, and the Ada Lovelace Institute. The challenge will be to ensure that innovation in AI is developed and used in ways that are transparent, that address societal needs, and that are consistent with public values."

Credit: 
Nuffield Council on Bioethics

Chemotherapy-induced peripheral neuropathy in long-term survivors of childhood cancer

Bottom Line: A new study assesses chemotherapy-induced peripheral neuropathy in 121 long-term survivors of childhood cancer to detail clinical, functional, neurophysiological and patient-reported outcomes of the condition.

Why The Research Is Interesting: Childhood and adolescent cancer survival rates have improved and it's important to understand the long-term effect of cancer treatment. Chemotherapy-induced peripheral neuropathy is a potentially long-lasting adverse effect of chemotherapy agents that can be toxic to peripheral nerves.

Authors: Susanna B. Park, Ph.D., of the University of Sydney, Australia, and coauthors

To read the full study, please visit the For The Media website.

(doi:10.1001/jamaneurol.2018.0963)

Editor's Note:  The article contains funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.

Credit: 
JAMA Network