
An old plant virus inspires the design of a modern vaccine to fight malaria

In the upcoming issue of the journal Proceedings of the National Academy of Sciences, scientists from the Walter Reed Army Institute of Research demonstrate that a novel, second-generation malaria vaccine candidate based on the tobacco mosaic virus may offer protection against Plasmodium falciparum malaria.

Malaria, which infected approximately 228 million individuals in 2018, remains a significant threat to public health and regional stability. Large human populations live in malaria-endemic regions of Africa, Southeast Asia and South America, where mosquitoes continuously transmit malaria parasites from sick to healthy individuals. Though infection rates have been decreasing, this decline has stagnated in recent years, necessitating novel interventions. While malaria has been eliminated from the United States, it remains one of the top five infectious disease threats to deployed Service Members.

The first-generation malaria vaccine, RTS,S (Mosquirix), developed through a collaboration between GlaxoSmithKline Vaccines and the Walter Reed Army Institute of Research, is based on the circumsporozoite protein (CSP) of Plasmodium falciparum. While RTS,S has conferred high-level protection in controlled human malaria infection trials, its potency and duration of protection against natural malaria infection need to be improved.

To develop a second-generation CSP-based malaria vaccine, Dr. Sheetij Dutta's laboratory at the WRAIR Malaria Biologics Branch has used the nano-sized disk- and rod-shaped particles of the tobacco mosaic virus (TMV). TMV, one of the earliest viruses ever identified, causes mottling of tobacco leaves; this research shows that the TMV coat protein can also be highly effective as a vaccine scaffold, refocusing the host immune system on the most vulnerable epitopes of CSP. Since the TMV-based malaria vaccine was produced using recombinant DNA technology in bacterial cells, it is non-infectious to humans and poses no risk to plants.

Dutta added, "The TMV-malaria vaccine showed a 10X improvement over a comparator vaccine in mice, and the superiority of this vaccine was confirmed in Rhesus monkeys. Serum antibodies from the vaccinated monkeys potently blocked parasite entry into human liver cells up to 11 months following vaccine administration. We are now exploring the utility of TMV particles for rational design of second generation vaccines against other infectious diseases."

Credit: 
Walter Reed Army Institute of Research

Research leads to life-changing improvement for some people living with depression

image: Dr. Raj Ramasubbu shows study participant Beth MacKay where the stimulation device was implanted in her brain.

Image: 
Photo by Kelly Johnston, Cumming School of Medicine.

Beth MacKay knew at a young age that she saw the world differently than many of her friends and family. She thought her pessimism and cynicism were rooted in realism, a proud reminder of her Scottish roots, and not a sign of an underlying medical condition. But that understanding of herself changed when, at the age of 17, she attempted suicide.

"I was diagnosed with depression, but looking back, it started much earlier," says MacKay, now 31. "Doctors believe it may have started when I was 10 or 11-years-old. As a child I would go through periods where I couldn't sleep, I didn't want to go to school, and I was constantly sick."

MacKay's parents tried to find help and support for her. They thought her symptoms may have been related to a learning disorder, but no one suspected depression could be the cause.

Prescribed anti-depressants and therapy, MacKay went on to university. She noticed everyone around her seemed to be functioning, but she couldn't get out of bed. She spent the next several years pretending to be okay. She would sleep most of the day, getting up and out only long enough to put on a front and show people she was fine. It seemed that no matter what treatment options she tried, nothing made life manageable.

"Everyday felt like climbing up a mountain. Something as simple as showering, doing dishes or throwing in a load of a laundry felt too difficult at times," recalls MacKay.

Always open to trying something else to improve her life, MacKay volunteered for a research study at the University of Calgary. Dr. Rajamannar Ramasubbu, MD, was investigating the effects of two different methods of deep brain stimulation (DBS), short pulse and long pulse, for treatment-resistant depression.

"It can be very difficult to find study participants for research like this," says Ramasubbu, a professor in the departments of Psychiatry and Clinical Neurosciences, and member of The Mathison Centre for Mental Health Research & Education and the Hotchkiss Brain Institute at the Cumming School of Medicine (CSM). "The procedure is invasive, so many clinicians are reluctant to recommend it. It requires implanting an electrode into the brain that is connected to a pulse generator that is implanted under the clavicle into the chest."

Just as pacemakers deliver electrical impulses to help control abnormal heart rhythms, DBS devices deliver electrical impulses to help neurons (brain cells) within the brain communicate more efficiently with each other.

"Depression is caused by abnormalities in the neural circuit responsible for emotional regulation," says Ramasubbu. "The region of the brain we target (subcallosal cingulate) is the junction of the limbic and frontal regions. Stimulating this area helps to keep a balance between these two unique systems."

Multi-disciplinary team collaborates on DBS study

Participants are awake when the device is implanted. Dr. Zelma Kiss, MD/PhD, a neurosurgeon and co-principal investigator of the study, performed the procedure at the Foothills Medical Centre (FMC).

Participants were randomized into two groups: one group received short pulse stimulation, the other long pulse width stimulation. After six months, treatment was switched for those who did not respond in the first six months. Researchers used the Hamilton Depression Rating Scale to measure change in symptoms.

"Both methods of stimulation were equally safe and effective in reducing depressive symptoms," says Ramasubbu. "50 per cent of the participants responded to the stimulation with 50 per cent reduction in symptoms. Of which 30 per cent experienced complete improvement in their symptoms, especially those who received long pulse width stimulation."

MacKay says she's experienced a massive change.
"Basically I was nearly dead and now I'm mostly alive. I'm still figuring out what life feels like, because it feels so different and so much better than before the implant."

Ramasubbu adds that more research is needed to determine which patients with treatment-resistant depression will benefit from DBS. Study participants ranged in age from 20 to 70, with younger participants showing greater improvement than older participants.

Credit: 
University of Calgary

First-of-its-kind technology lights up lung cancer cells, helps improve patient outcomes

image: Inderpal (Netu) S. Sarkaria, MD, from the University of Pittsburgh Medical Center in Pennsylvania

Image: 
University of Pittsburgh Medical Center in Pennsylvania

NEW ORLEANS, Louisiana (January 27, 2020) -- A groundbreaking tumor-highlighting technology--OTL38--enhances the visualization of lung cancer tissue, providing surgeons with a significantly better chance of finding and removing more cancer than previously possible, according to a scientific presentation at the 56th Annual Meeting of The Society of Thoracic Surgeons.

"Lung cancer is the most common and lethal cancer worldwide," said Inderpal (Netu) S. Sarkaria, MD, from the University of Pittsburgh Medical Center in Pennsylvania. "Technologies to improve the care of these patients are needed. Near-infrared imaging with OTL38 during surgery for lung cancer is one such promising technology with the potential to significantly improve the completeness and quality of the operation, therefore improving patient outcomes."

Dr. Sarkaria and colleagues at six institutions (University of Pittsburgh, University of Pennsylvania, Harvard University, Cleveland Clinic, Leiden University, and MD Anderson) participated in a phase 2 clinical trial, identifying 92 patients who had lung lesions and were to undergo pulmonary resection for non-small cell lung cancer (NSCLC). Before their operations, each patient received a measured intravenous dose of OTL38, composed of a near-infrared dye and a targeting molecule. The molecule attaches to folate receptors on cancer cells and can be illuminated during surgery using a special surgical endoscope. This helps identify small, hard-to-detect cancer lesions that might otherwise be missed and should be surgically removed.

The researchers made assessments in three phases: "Lung Inspection," "Tumor Resection," and "Specimen Check." During the inspection phase, the molecular imaging identified 10 additional cancers--all missed by visual examination and manual touch--in seven patients (8%). In the resection phase, researchers determined that OTL38 enabled localization of lesions that could not otherwise be found in 11 patients (12%). After surgeons judged all margins to be visually adequate or clear in the specimen check, the resected specimens were further assessed using the molecular imaging; inadequate margins (microscopic residual tumor left at the edges) were uncovered in eight patients (9%). Overall, researchers determined that OTL38 molecular imaging helped improve outcomes for one in four patients (26%).

"OTL38 is the first technique that is specific to imaging adenocarcinomas of the lung, which is one of the most common types of invasive lung cancer, making it unique and clinically useful in this respect," said Dr. Sarkaria. "Localization of tumors, identification of occult tumors, and immediate tumor margin assessment during surgery for adenocarcinomas of the lung were significantly improved with the use of this technology."

Surgery remains the best potentially curative treatment for early stage NSCLC. However, research has shown that 30% to 55% of patients with NSCLC develop recurrence, which often is caused by microscopic clusters of cancer cells that were undetected by standard staging methods. This suggests that complete removal needs to be ensured both macroscopically and microscopically during surgery.

"Near-infrared imaging with OTL38 may be a powerful tool to help surgeons significantly improve the quality of lung cancer surgery by more clearly identifying tumors and allowing the surgeon to better see and completely remove them--one of the most vital components in the overall care of patients with this disease," said Dr. Sarkaria.

Surgeons traditionally use X-rays, magnetic resonance imaging, computed tomography (CT) scans, positron emission tomography, and/or ultrasound to determine the size and location of tumors before surgery. However, these imaging modalities are rarely, if ever, used during surgery.

OTL38 is believed to be the first targeted fluorescent marker to provide this type of benefit for lung cancer. The OTL38 technology is different in its ability to detect cancerous tissue not previously identified on preoperative scans and do so in real-time, while the surgeon is operating. This is crucial in ensuring that surgeons adequately detect and remove cancer cells that may not be visible to the naked eye or located through touch. The complete removal of diseased tissue during surgery helps to avoid additional surgeries and cancer relapse, as well as increase patients' overall chances of survival.

"Use of advanced near-infrared imaging techniques such as OTL38 may provide surgeons with powerful tools to improve the quality of lung cancer operations by better identifying small, hard-to-find tumors, finding previously undetected cancers at the time of surgery, and better assessing if the entire tumor has been removed," said Dr. Sarkaria.

In addition, with the implementation of lung screening and the increased use of CT scans in general, cardiothoracic surgeons are seeing more patients with small or undefined nodules, so the timing of the availability of technology such as OTL38 is just right, according to Linda W. Martin, MD, MPH, of the University of Virginia in Charlottesville, who was not directly involved with this research.

"In many circumstances, a preoperative biopsy is not practical or feasible, and we are faced with the need for intraoperative identification of these nodules," said Dr. Martin. "This research describes an exciting new approach to localize nodules that are difficult to find without a separate procedure. More importantly, the study showed that because of this technology, additional nodules that were in fact separate cancers were found, and useful information about margin status also resulted."

Dr. Martin described another significant advantage that the OTL38 technology offers--the ability to better identify small nodules. This may allow surgeons to more often utilize minimally invasive operative approaches in some patients who otherwise would have been required to undergo a thoracotomy in order to find these nodules.

The completion of the OTL38 Phase 2 trial in lung cancer is a major milestone, advancing the technology closer to Food and Drug Administration approval and commercialization. Phase 3 trials currently are under way.

Credit: 
The Society of Thoracic Surgeons

Successfully predicting bone marrow failure caused by drugs, radiation, and disease

Your bone marrow produces about 500 billion new blood cells every single day - roughly equivalent to the number of stars thought to be in the Milky Way. Being so prolific, however, comes with a price: medical interventions that aim to disrupt cell growth and differentiation, such as chemotherapies and radiation, can hit the bone marrow extremely hard, causing serious side effects like anemia, severe bleeding, and increased infections. Efforts to understand and reduce bone marrow toxicity have been hampered by the marrow's inaccessible location, as the only way to effectively study living marrow tissue in humans is to take invasive, painful biopsies from patients' bones.

Now, a new organ-on-a-chip (Organ Chip) technology advance from the lab of Donald Ingber, M.D., Ph.D. at the Wyss Institute for Biologically Inspired Engineering at Harvard University opens a window into this understudied field: a Human Bone Marrow Chip that effectively replicates drug- and radiation-induced toxicity responses observed in human patients at clinically relevant doses. The chip also replicated blood cell formation defects seen in patients with a rare genetic disorder, and correctly predicted the existence of a previously unknown abnormality in the marrow of those patients. The research, which was carried out in collaboration with investigators at Boston Children's Hospital, Dana-Farber Cancer Institute, Massachusetts General Hospital, and AstraZeneca, is published in Nature Biomedical Engineering.

"Traditional in vitro cultures of bone marrow cells often do not predict how our bone marrow behaves when exposed to hematotoxic drugs, even when those cultures are treated at concentrations and timeframes that match those experienced by actual patients," said co-first author David Chou, M.D., Ph.D., a Clinical Fellow at the Wyss Institute and in the Department of Pathology at Massachusetts General Hospital and Harvard Medical School (HMS). "Unlike classic culture methods, the Bone Marrow Chip provides a more physiologically accurate microenvironment and allows dynamic drug dosing, which may explain why our model was a better predictor of drug-induced toxicity responses."

Building better bone marrow

The Bone Marrow Chip emerged from work co-funded by the U.S. Food and Drug Administration (FDA), which wanted a tool to study the effects of acute radiation syndrome on bone marrow, and the biopharmaceutical company AstraZeneca. Ingber's team had developed an earlier version of the Bone Marrow Chip with support from the Defense Advanced Research Projects Agency (DARPA), which they integrated into a multi-Organ Chip human "Body-on-Chips" platform that quantitatively predicts human drug pharmacokinetics in vitro and is described in two separate articles in the same issue of Nature Biomedical Engineering.

The Bone Marrow Chip is USB-sized and made of clear silicone rubber, with two hollow, parallel channels separated by a permeable membrane. The top channel is filled with human patient-derived bone marrow progenitor cells (CD34+ cells) and bone-marrow-derived stromal cells embedded in a matrix gel to mimic the three-dimensional nature of marrow tissue, while the lower channel is lined with human endothelial cells to mimic the blood vessels that permeate the marrow. A liquid medium that supports the growth of CD34+ cells and their differentiation into a plethora of different blood cell types is flowed through the vascular channel to feed the bone marrow cells and remove waste.

The researchers cultured the Bone Marrow Chip for up to 28 days, and compared the state of the cells in the chip with CD34+ cells that were grown either in static suspension cultures or static gel cultures in a plate at multiple timepoints. The Bone Marrow Chip improved the survival of the CD34+ progenitor cells and supported the growth and differentiation of neutrophils (white blood cells) and erythrocytes (red blood cells) over a longer time period than the other two methods. In addition, some of the more mature neutrophils migrated from the marrow channel into the vascular channel of the chip, which mimics the behavior of these cells in living marrow.

Demystifying drug toxicity

Having confirmed that the Bone Marrow Chip recapitulates the growth, maturation, and migration of bone marrow cells observed in vivo, the team next tested whether it could predict drug-induced bone marrow toxicities more accurately than static suspension or gel cultures. They grew bone marrow cells for 10-12 days using all three methods, then exposed them for two days to a range of concentrations of the chemotherapy drug 5-fluorouracil, which is known to cause bone marrow toxicity. The Bone Marrow Chip displayed the expected damage and death of bone marrow cells at patient-relevant concentrations and exposure times, while the other culture methods required much higher, potentially unsafe drug levels before displaying toxicity behavior.

To see if its predictive power would prove useful for preclinical drug safety testing, the researchers next exposed the Bone Marrow Chip to an AstraZeneca drug candidate, AZD2811, that is currently in Phase II clinical development for the potential treatment of cancer. In previous Phase I trials, when administered as a pro-drug, a 2-hour infusion producing a high peak concentration of AZD2811 caused a decrease in both red and white blood cells, while a 48-hour infusion producing a lower concentration caused a bigger drop in white blood cell counts but, surprisingly, had no significant effect on red blood cells. The Wyss team was able to precisely replicate the clinical trial patients' drug exposure profiles (pharmacokinetics) on the Bone Marrow Chip, and saw that the chips displayed the expected clinical results while suspension cultures of the same cells did not.

The team also used the Bone Marrow Chip to investigate AZD2811's mechanism of action, which would be difficult to study in human patients without performing a bone marrow biopsy soon after drug infusion. They observed that exposure to AZD2811 selectively decreased the numbers of immature dividing blood cell precursors while the numbers of more mature red and white blood cells were essentially unchanged, which confirmed a preexisting hypothesis about how the drug affects bone marrow cells.

The chips were then exposed to ionizing gamma-radiation similar to what cancer patients can be exposed to during radiation therapy, which is another potential cause of bone marrow damage, and they once again displayed the appropriate toxicity at human-relevant radiation doses. These results highlight how the Bone Marrow Chip could be used to study the effects of radiation on human bone marrow as well as search for novel ways to mitigate radiation-induced injury, which the Wyss team is currently exploring.

"Not only does the Bone Marrow Chip match and predict the effects of drugs and radiation on human bone marrow, it also enables detailed studies into the physiology of cells that are healthy versus cells that are damaged. This new method could become an important tool in preclinical testing because it could significantly reduce our reliance on animal models, which often fail to detect and predict side effects, are notoriously expensive, and raise ethical questions," said co-first author Viktoras Frišmantas, Ph.D., a Postdoctoral Fellow at the Wyss Institute. "We could even study a complete cycle of human bone marrow injury and recovery in response to drugs, which is not possible in alternative in vitro model systems or in animal models.

"Organ Chip technology has the potential to enhance and accelerate our ability to translate science into innovative medicines for patients," said Stefan Platz, Senior Vice President of Clinical Pharmacology and Safety Sciences, R&D at AstraZeneca. "Our collaborative work with the Wyss Institute highlights the potential of human Bone Marrow Chips to mimic both disease and drug-related injury. The ability to efficiently use human cells rather than animal models offers numerous benefits, both in terms of translatability to the clinic and reduction of animal usage. This publication represents an important milestone in demonstrating how these systems can accurately predict clinical effects."

A subtle finding in a rare disease

To round out their studies, the researchers looked at whether Bone Marrow Chips created using bone marrow cells from children with a known genetic disorder developed characteristics similar to those found in real bone marrow from patients with the same deficiency. They focused on Shwachman-Diamond Syndrome (SDS), a rare disease caused by a genetic mutation that results in bone marrow failure with abnormally low cell counts, especially reduced white blood cells. SDS has not been well studied, owing both to its low incidence and to the failure of animal models to reproduce the known hallmarks of the human disease.

The team cultured CD34+ cells from two SDS patients together with normal bone-marrow-derived stromal cells and endothelial cells in the Bone Marrow Chips for two weeks, and saw that these chips produced far fewer blood cells and showed reduced white blood cell maturation compared to chips with CD34+ cells from healthy donors, much as occurs in patients. They also discovered that neutrophils maturing in the SDS Bone Marrow Chips did not upregulate the gene that produces the protein CD13 as much as neutrophils maturing in normal Bone Marrow Chips, a defect that had not been previously noted in SDS patients. The researchers then reviewed clinical data from bone marrow samples taken from patients with SDS and found that 50% of them displayed this aberrant pattern of lower CD13 expression discovered in the Bone Marrow Chip.

"This finding opens up new avenues of investigation into the causes of bone marrow failure," said co-author Akiko Shimamura, M.D., Ph.D., who is a Professor of Pediatrics at Harvard Medical School (HMS) and Director of the Bone Marrow Failure and Myelodysplastic Syndrome Programs at the Dana-Farber/Boston Children's Cancer and Blood Disorders Center. "The development of this novel bone marrow failure model provides a platform to test new therapies for these devastating diseases."

The team of researchers and clinicians is continuing to work with its collaborators on the Bone Marrow Chip. They are studying the biological differences between red and white blood cells' responses to AZD2811 and are evaluating whether the chip can be used to develop well-tolerated dosing regimens. They have also received additional funding from the FDA to use this model to search for novel countermeasures to enhance bone marrow recovery after radiation exposure, and to explore sex-specific responses.

"This collaboration between biologists, engineers, and clinicians across the Wyss Institute's consortium of institutions, and with experts in industry and government, demonstrates the kinds of advances in medicine that can be made when organizations that have a common goal contribute their respective expertise and talents to meet that challenge," said Ingber, who is the Wyss Institute's Founding Director, Judah Folkman Professor of Vascular Biology at HMS and the Vascular Biology Program at Boston Children's Hospital, and Professor of Bioengineering at Harvard's John A. Paulson School of Engineering and Applied Sciences. "With this model in hand that can also replicate patient-specific marrow responses, we are in a position to assist in the design of human clinical trials for rare genetic disorders and advance personalized medicine in ways not possible before."

Credit: 
Wyss Institute for Biologically Inspired Engineering at Harvard

Human Body-on-Chip platform enables in vitro prediction of drug behaviors in humans

image: In this graphic, the Wyss Institute's human Body-on-Chip system is layered on top of Leonardo da Vinci's ink drawing of the "Vitruvian Man", which represents ideal human body proportions. The researchers used a computational scaling method to translate data obtained from drug experiments in the human Body-on-Chip to the organ dimensions of the real human body.

Image: 
Wyss Institute at Harvard University

(BOSTON) -- Drug development is an extremely arduous and costly process, and failure rates in clinical trials that test new drugs for their safety and efficacy in humans remain very high. According to current estimates, only 13.8% of all tested drugs demonstrate ultimate clinical success and obtain approval by the Food and Drug Administration (FDA). There are also increasing ethical concerns relating to the use of animal studies. As a result, there has been a world-wide search to find replacements for animal models.

To help address this bottleneck in drug development, Donald Ingber, M.D., Ph.D., and his team at Harvard's Wyss Institute for Biologically Inspired Engineering, developed the first human "Organ-on-a-Chip" (Organ Chip) model of the lung that recapitulates human organ level physiology and pathophysiology with high fidelity, which was reported in Science in 2010. Organ Chips are microfluidic culture devices composed of a clear flexible polymer the size of a computer memory stick, which contains two parallel hollow channels that are separated by a porous membrane. Organ-specific cells are cultured on one side of the membrane in one of the channels, and vascular endothelial cells recapitulating a blood vessel line the other, while each channel is independently perfused with cell type-specific medium. The porous membrane allows the two compartments to communicate with each other, and to exchange molecules like cytokines, growth factors, and drugs, as well as drug breakdown products generated by organ-specific metabolic activities.

One example where living animals must be used in preclinical testing is the characterization of a drug's "pharmacokinetics" (PK), which involves the quantification of its absorption, distribution, metabolism, and excretion (ADME); together, these determine drug levels in the blood. These responses involve interplay between many different organs linked by a vasculature containing flowing blood. Animals are also used to analyze drug "pharmacodynamics" (PD), the effects the drug produces on its target organs, which underlie its mechanism of action as well as its adverse effects.
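As a rough illustration of the quantity PK models predict, here is a minimal one-compartment simulation with first-order absorption and elimination; the dose and rate constants are hypothetical placeholders, not values from these studies.

```python
import numpy as np

# Minimal one-compartment PK sketch (all values hypothetical, for
# illustration only): a single oral dose is absorbed into the blood
# with rate constant ka and eliminated with rate constant ke.
dose_mg = 2.0    # administered dose, mg (hypothetical)
ka = 1.2         # absorption rate constant, 1/h (hypothetical)
ke = 0.35        # elimination rate constant, 1/h (hypothetical)
vd_l = 40.0      # volume of distribution, L (hypothetical)

t = np.linspace(0, 24, 97)  # hours after dosing

# Bateman equation: blood concentration over time after one oral dose
conc_mg_per_l = (dose_mg / vd_l) * (ka / (ka - ke)) * (
    np.exp(-ke * t) - np.exp(-ka * t)
)

t_max = t[np.argmax(conc_mg_per_l)]
print(f"peak {conc_mg_per_l.max() * 1000:.1f} ug/L at t = {t_max:.2f} h")
```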

Because the Wyss Institute's Organ Chips contain an endothelium-lined vascular channel, Ingber proposed in 2011 that it might be possible to create a human "Body-on-Chips" by transferring fluids between the vascular channels of many different types of Organ Chips to mimic blood flow, and by assessing drug PK/PD behaviors across the entire linked system. Inspired by this vision, and realizing that existing animal-based development programs are inadequate to meet the need for accelerated development of drug countermeasures in a biothreat situation, the Defense Advanced Research Projects Agency (DARPA) requested grant applications in 2012 with a seemingly impossible challenge: develop 10 types of Organ Chips that recapitulate the complex functionalities of 10 different human organs, engineer an automated instrument to fluidically link them to create a functional human Body-on-Chips platform, and leverage computational modeling in combination with experimental data generated using this platform to quantitatively predict human drug PK/PD behavior in vitro. Now, two back-to-back publications in Nature Biomedical Engineering describe the Wyss team's success in meeting this goal in full.

Known for posing impossible challenges such as this, DARPA understands that most investigators will not meet the goals as set out, but that extraordinary technological fallout will be created along the way. "We were very proud to obtain major funding support from DARPA to take on this challenge, and we are now even more proud that we have successfully met their goal, which would not have been possible without the exceptional talents, interdisciplinary spirit, and monumental team effort at the Wyss Institute," said Ingber, who is the Wyss Founding Director, as well as the Judah Folkman Professor of Vascular Biology at Harvard Medical School and Boston Children's Hospital, and Professor of Bioengineering at the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS). He has been leading the DARPA-funded program along with Wyss Core Faculty member Kevin Kit Parker, Ph.D., who also is the Tarr Family Professor of Bioengineering and Applied Physics at SEAS.

In their first article, the Wyss team presents a highly modular Body-on-Chips platform, enabled by an engineered "Interrogator" instrument that can culture up to 10 different Organ Chips and sequentially transfer fluids between their endothelium-lined vascular channels to mimic normal human blood flow between the different organs of our body. In the second article, the team uses a computational scaling method to translate data obtained from drug experiments involving three different types of fluidically-linked Organ Chips to their respective organ dimensions in the real human body. The approach is able to quantitatively predict changes in drug levels over time, as well as organ-specific toxicities that were previously measured in human patients.

"Both studies represent a tremendous effort by scores of researchers at the Wyss Institute, who worked together with our industrial modeling collaborators, and pooled their collective tissue engineering, microfabrication, pharmacological, physiological, and computational expertise to make this huge advance in preclinical drug testing possible," said Rachelle Prantil-Baun, Ph.D., a Wyss Institute Senior Staff Scientist with past experience in the pharmaceutical industry who helped to orchestrate this complex multi-investigator effort with multiple other staff members in the Wyss' Bioinspired Therapeutics and Diagnostics platform.

The Interrogator instrument enabled the team to culture, perfuse and link many living human cultured tissues in a multi-Organ Chip system, as well as add and sample the medium in a fully programmable way, using the device's robotic liquid transfer capabilities, while continuously monitoring tissue integrity with an integrated microscope. "In this study, we serially linked the vascular channels of eight different Organ Chips, including intestine, liver, kidney, heart, lung, skin, blood-brain barrier and brain, using a highly optimized common blood substitute, while independently perfusing the individual channels lined by organ-specific cells. The instrument maintained the viability of all tissues and their organ-specific functions for over three weeks and, importantly, it allowed us to quantitatively predict the tissue-specific distribution of a chemical across the entire system," said Richard Novak, Ph.D., a co-first-author on both studies. Novak is a Senior Staff Engineer at the Wyss Institute who designed, fabricated, and operated the Interrogator instrument with his bioengineering team.

In the second study, the team used the Interrogator instrument to support two different configurations of three different Organ Chips, linked to each other and to a central arterio-venous (AV) fluid-mixing reservoir that helped recapitulate life-like blood and drug exchange between the individual organs while also providing a way to carry out blood sampling that mimics drawing blood from a peripheral vein. The researchers coupled a human Gut Chip with a Liver Chip and a Kidney Chip, adding nicotine to the Gut Chip's channel lined by intestinal epithelium to simulate oral administration of the drug and its first pass through the intestinal wall, via the vascular system to the liver where it is metabolized, and on to the kidney where it is excreted. Nicotine chewing gum is used to help with smoking cessation; however, nicotine is also being investigated as an oral drug for neurodegenerative and inflammatory bowel diseases.

Applying mass spectrometry analysis, the Wyss team quantified nicotine levels in the AV reservoir and in the effluents of the vascular channels of all the different Organ Chips, and then fitted the data with a newly developed biomimetic scaling approach that translates them from the dimensions of the Organ Chips to actual organ dimensions in the human body. For the first time, this computational approach combined with experimental human Organ Chip data demonstrated the ability to model drug uptake and metabolism, and to quantitatively predict dynamic changes in drug blood levels (PK) that were previously observed in human clinical trials. The scaling approach, which also solves the challenge of drug adsorption onto materials in the experimental system, was developed by co-author Andrzej Przekwas, Ph.D., and his team at CFD Research Corporation in Huntsville, Alabama.
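The second paper describes the actual biomimetic scaling method in detail; as a simplified intuition for this kind of in vitro-to-in vivo translation, a common approach scales a rate measured on a chip up to the whole organ by the ratio of cells in vivo to cells on the chip. The sketch below uses hypothetical numbers only.

```python
# Simplified in vitro-to-in vivo scaling sketch (hypothetical numbers;
# the biomimetic approach in the paper is more sophisticated): scale a
# clearance measured on a Liver Chip to a whole human liver using the
# ratio of hepatocytes in vivo to hepatocytes on the chip.
chip_clearance_ul_per_min = 0.8   # measured on-chip (hypothetical)
cells_on_chip = 1.0e6             # hepatocytes per chip (hypothetical)
cells_in_liver = 2.0e11           # order-of-magnitude human estimate

scale = cells_in_liver / cells_on_chip
liver_clearance_l_per_min = chip_clearance_ul_per_min * scale * 1e-6  # uL -> L

print(f"predicted whole-liver clearance ~ {liver_clearance_l_per_min:.2f} L/min")
```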

"The resulting calculated maximum nicotine concentrations, the time needed for nicotine to reach the different tissue compartments, and the clearance rates in the Liver Chips in our in vitro-based in silico model mirrored closely what had been measured previously in patients," said Ben Maoz, Ph.D., a co-first author on the second study and former Technology Development Fellow at the Wyss Institute in the lab of Parker. Maoz currently is an Assistant Professor at Tel Aviv University, Israel.

With a second multi-Organ Chip configuration comprising fluidically-linked Liver, Kidney, and Bone Marrow Chips, the team investigated the pharmacological effects of cisplatin, a chemotherapeutic drug commonly used in cancer treatments, which is administered intravenously and displays unwanted toxicity in the kidney and bone marrow. "Our analysis recapitulates the pharmacodynamic effects of cisplatin in patients, including a decrease in numbers of different blood cell types and an increase in markers of kidney injury," said co-first author Anna Herland, Ph.D., who worked on Ingber's team at the time of the study. "In addition, the in vitro-to-in vivo translation capabilities of the system produced quantitative information on how cisplatin is metabolized and cleared by the liver and kidney, which will make it suitable for more refined predictions of drug absorption, distribution, metabolism, excretion and toxicity." Herland is now an Associate Professor at KTH Royal Institute of Technology and the Karolinska Institute in Stockholm, Sweden.

"This is what we love to do at the Wyss Institute: make science fiction into science fact. And we hope our demonstration that this level of biomimicry is possible using Organ Chip technology will garner even greater interest from the pharmaceutical industry so that animal testing can be progressively reduced over time," said Ingber.

Credit: 
Wyss Institute for Biologically Inspired Engineering at Harvard

Patterns of thinning of Antarctica's biggest glacier are opposite to those previously observed

image: The lead author undertaking satellite validation fieldwork on the Filchner-Ronne Ice Shelf, West Antarctica, with the Alfred Wegener Institute, Germany.

Image: 
Jonathan Bamber, University of Bristol

Using the latest satellite technology from the European Space Agency (ESA), scientists from the University of Bristol have been tracking patterns of mass loss from Pine Island - Antarctica's largest glacier.

They found that the pattern of thinning is evolving in complex ways both in space and time with thinning rates now highest along the slow-flow margins of the glacier, while rates in the fast-flowing central trunk have decreased by about a factor of five since 2007. This is the opposite of what was observed prior to 2010.

Pine Island has contributed more to sea level rise over the past four decades than any other glacier in Antarctica, and as a consequence has become one of its most intensively and extensively investigated ice stream systems.

However, different model projections of future mass loss give conflicting results; some suggesting mass loss could dramatically increase over the next few decades, resulting in a rapidly growing contribution to sea level, while others indicate a more moderate response.

Identifying which is the more likely behaviour is important for understanding future sea level rise and how this vulnerable part of Antarctica is going to evolve over the coming decades.

The results of the new study, published in the journal Nature Geoscience, suggest that rapid migration of the grounding line, the place where the grounded ice first meets the ocean, is unlikely over that timescale, without a major change in ocean forcing. Instead, the results support model simulations that imply that the glacier will continue to lose mass but not at much greater rates than present.

Lead author Professor Jonathan Bamber from the University of Bristol's School of Geographical Sciences, said: "This could seem like a 'good news story' but it's important to remember that we still expect this glacier to continue to lose mass in the future and for that trend to increase over time, just not quite as fast as some model simulations suggested.

"It's really important to understand why the models are producing different behaviour in the future and to get a better handle on how the glacier will evolve with the benefit of these new observations.

"In our study, we didn't make projections but with the aid of these new data we can improve model projections for this part of Antarctica."

Credit: 
University of Bristol

UCI oceanographers predict increase in phytoplankton by 2100

Irvine, Calif. - A neural network-driven Earth system model has led University of California, Irvine oceanographers to a surprising conclusion: phytoplankton populations will grow in low-latitude waters by the end of the 21st century.

The unexpected simulation outcome runs counter to the longstanding belief by many in the environmental science community that future global climate change will make tropical oceans inhospitable to phytoplankton, which are the base of the aquatic food web. The UCI researchers provide the evidence for their findings in a paper published today in Nature Geoscience.

Senior author Adam Martiny, UCI professor in oceanography, explained that the prevalent thinking on phytoplankton biomass is based on an increasingly stratified ocean. Warming seas inhibit mixing between the heavier cold layer in the deep and lighter warm water closer to the surface. With less circulation between the levels, fewer nutrients reach the higher strata where they can be accessed by hungry plankton.

"All the climate models have this mechanism built into them, and it has led to these well-established predictions that phytoplankton productivity, biomass and export into the deep ocean will all decline with climate change," he said. "Earth system models are largely based upon laboratory studies of phytoplankton, but of course laboratory studies of plankton are not the real ocean."

According to Martiny, scientists traditionally account for plankton by measuring the amount of chlorophyll in the water. There is considerably less of the green stuff in very hot, low-latitude regions than in cooler regions farther from the equator.

"The problem is that chlorophyll is not everything that's in a cell, and actually in low latitudes, many plankton are characterized by having a very small amount of it; there's so much sunlight, plankton only need a few chlorophyll molecules to get enough energy to grow," he noted. "In reality, we have had so far very little data to actually demonstrate whether or not there is more or less biomass in regions undergoing stratification. As a result, the empirical basis for less biomass in warmer regions is not that strong."

These doubts led Martiny and his UCI colleagues to conduct their own phytoplankton census. Analyzing samples from more than 10,000 locations around the world, the team created a global synthesis of the key phytoplankton groups that grow in warm regions.

The vast majority of these species are very tiny cells known as picophytoplankton. Ten times smaller in diameter than the strains of plankton one would find off the California coast - and 1,000 times less voluminous - picophytoplankton are nonetheless great in number, making up 80 to 90 percent of plankton biomass in most warm regions.

The group built global maps and compared the quantity of biomass along the temperature gradient, a key parameter, according to Martiny. Conducting a machine learning analysis to determine the difference between now and the year 2100, they found a big surprise: "In many regions there would be an increase of 10 to 20 percent of plankton biomass, rather than a decline," Martiny said.

"Machine learning is not biased by the human mind," he said. "We just give the model tons and tons of data, but they can help us challenge existing paradigms."

One of the theories the team explored to explain the growth, with help from co-author Francois Primeau, UCI professor of Earth system science, had to do with what happens to phytoplankton at the end of their life cycle.

"When plankton die - especially these small species - they sit around for a while longer, and maybe at high temperature other plankton can more easily degrade them and recycle the nutrients back to build new biomass," Martiny said.

Such ecosystem features are not easily taken into account by traditional, mechanistic Earth system models, according to Martiny, but they were part of the geographically diverse dataset the team used to train its neural network-derived quantitative niche model.

Martiny said that this study, as a follow-up to research published last summer, is further evidence of the diversity and resilience of phytoplankton.

"We could obviously let climate change get out of hand and go into completely uncharted territory, and then all bets are off," he said. "But at least for a while, I think the adaptive capabilities in these diverse plankton communities will help them maintain high biomass despite these environmental changes."

Joining Martiny and Primeau were fellow authors Pedro Flombaum, former UCI postdoctoral researcher and later visiting scholar in Earth system science (currently a professor at the University of Buenos Aires, Argentina), and Weilei Wang, UCI postdoctoral scholar in Earth system science. The study received support from the National Science Foundation's Ten Big Ideas program and the U.S. Department of Energy Office of Biological and Environmental Research.

Credit: 
University of California - Irvine

Current model for storing nuclear waste is incomplete

COLUMBUS, Ohio - The materials the United States and other countries plan to use to store high-level nuclear waste will likely degrade faster than anyone previously knew because of the way those materials interact, new research shows.

The findings, published today in the journal Nature Materials, show that corrosion of nuclear waste storage materials accelerates because of changes in the chemistry of the nuclear waste solution, and because of the way the materials interact with one another.

"This indicates that the current models may not be sufficient to keep this waste safely stored," said Xiaolei Guo, lead author of the study and deputy director of Ohio State's Center for Performance and Design of Nuclear Waste Forms and Containers, part of the university's College of Engineering. "And it shows that we need to develop a new model for storing nuclear waste."

The team's research focused on storage materials for high-level nuclear waste -- primarily defense waste, the legacy of past nuclear arms production. The waste is highly radioactive. While some types of the waste have half-lives of about 30 years, others -- for example, plutonium -- have a half-life that can be tens of thousands of years. The half-life of a radioactive element is the time needed for half of the material to decay.
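In equation form, the fraction of a radioactive material remaining after time $t$ is

$$N(t) = N_0 \left(\tfrac{1}{2}\right)^{t / t_{1/2}},$$

where $t_{1/2}$ is the half-life. A waste component with a 30-year half-life therefore falls to one quarter of its initial amount after 60 years, while plutonium-239, with a half-life of about 24,000 years, remains essentially undiminished over any engineered storage timescale.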

The United States currently has no disposal site for that waste; according to the U.S. Government Accountability Office, it is typically stored near the plants where it is produced. A permanent site has been proposed for Yucca Mountain in Nevada, though plans have stalled. Countries around the world have debated the best way to deal with nuclear waste; only one, Finland, has started construction on a long-term repository for high-level nuclear waste.

But the long-term plan for high-level defense waste disposal and storage around the globe is largely the same. It involves mixing the nuclear waste with other materials to form glass or ceramics, and then encasing those pieces of glass or ceramics -- now radioactive -- inside metallic canisters. The canisters would then be buried deep underground in a repository to isolate the waste.

In this study, the researchers found that when exposed to an aqueous environment, glass and ceramics interact with stainless steel to accelerate corrosion, especially of the glass and ceramic materials holding nuclear waste.

The study qualitatively assessed the difference between accelerated corrosion and natural corrosion of the storage materials. Guo called it "severe."

"In the real-life scenario, the glass or ceramic waste forms would be in close contact with stainless steel canisters. Under specific conditions, the corrosion of stainless steel will go crazy," he said. "It creates a super-aggressive environment that can corrode surrounding materials."

To analyze corrosion, the research team pressed glass or ceramic "waste forms" -- the shapes into which nuclear waste is encapsulated -- against stainless steel and immersed them in solutions for up to 30 days, under conditions that simulate those under Yucca Mountain, the proposed nuclear waste repository.

Those experiments showed that when glass and stainless steel were pressed against one another, stainless steel corrosion was "severe" and "localized," according to the study. The researchers also noted cracks and enhanced corrosion on the parts of the glass that had been in contact with stainless steel.

Part of the problem lies in the Periodic Table. Stainless steel is made primarily of iron mixed with other elements, including nickel and chromium. Iron has a chemical affinity for silicon, which is a key element of glass.

The experiments also showed that when ceramics -- another potential holder for nuclear waste -- were pressed against stainless steel under conditions that mimicked those beneath Yucca Mountain, both the ceramics and stainless steel corroded in a "severe localized" way.

Credit: 
Ohio State University

19th-century bee cells in a Panamanian cathedral shed light on human impact on ecosystems

image: Locations of nest cell aggregations of Eufriesea surinamensis within the Cathedral in Casco Viejo, Panamá

Image: 
Paola Galgani-Barraza

Despite being "neotropical-forest-loving creatures," some orchid bees are known to tolerate habitats disturbed by human activity. Even so, the research team of Paola Galgani-Barraza (Smithsonian Tropical Research Institute) did not expect to find as many as 120 clusters of orchid bee nests, nearly two centuries old, built on the altarpiece of the Basilica Cathedral in Casco Viejo (Panamá). Their findings are published in the open-access Journal of Hymenoptera Research.

The nests were revealed during restoration work completed in 2018 in preparation for the consecration of a new altar by Pope Francis. Interestingly, many cells were covered with gold leaf and other golden material applied during an earlier restoration following an 1870 fire, which aided the reliable determination of the age of the clusters: the cells were dated to the years prior to 1871-1876.

The bee species that had constructed the nests was identified as the extremely secretive Eufriesea surinamensis. Females are known to build their nests far from each other, making them very difficult to locate in the field. As a result, little is known about these bees: neither the floral resources they collect for food, nor the materials they use to build their nests, nor the plants they pollinate.

However, by analysing the preserved pollen for the first time for this species, the researchers successfully detected the presence of 48 plant species, representing 43 genera and 23 families. Hence, they concluded that late-nineteenth century Panama City was surrounded by a patchwork of tropical forests, sufficient to sustain nesting populations of what today is a forest-dwelling species of bee.

Not only did the scientists unveil important knowledge about the biology of orchid bees and the local floral diversity in the 19th century, but they also began to uncover key information about the functions of natural ecosystems and their component species, where bees play a crucial role as primary pollinators. Thus, the researchers hope to reveal how these environments are being modified by collective human behaviour, which is especially crucial with the rapidly changing environment that we witness today.

Credit: 
Pensoft Publishers

Study urges national review of support services for male survivors of sexual violence

The sentencing of Reynhard Sinaga, the most prolific convicted rapist in British history, who preyed on men in Manchester, was shocking and destroyed the myth that only women are raped and sexually abused.

Now, a study by Lancaster University, launched today, has prompted a call for a complete review of national support services for male survivors of sexual violence and abuse.

The comments are made in a 40-page study carried out by Lancaster University for the Male Survivors Partnership.

The study reviewed The National Male Survivors Helpline and Online Service (NMSHOS) run by sexual abuse and rape charity Safeline, based in Warwick, and funded by the Ministry of Justice.

The NMSHOS is the only dedicated national helpline to support male survivors of sexual abuse and rape.

The aim of the NMSHOS, which offers telephone, email and text support, is to provide a 'listening space' for male survivors and those supporting them, and to provide practical and emotional support to help with the emotional, psychological, social, relational, financial and physical health difficulties that men can experience as a result of sexual victimisation.

The research reveals how a lack of male support service provision can make survivors feel as if they do not deserve to speak about their trauma, and highlights how accessing support can be a 'post-code lottery' with a complete lack of support in certain geographic areas.

The research also highlighted long waiting lists for services which, said the study, was a regular source of distress and frustration for male survivors.

The study suggests a national review of male survivor support provision should be undertaken to identify areas where there is no support and where waiting lists are extensive.

Data analysis revealed the telephone helpline service has seen a 199 per cent increase in calls between 2016 and 2018, with the majority of callers discussing non-recent abuse.

The research highlighted how male survivors preferred to access telephone support for various reasons, including talking through suicidal thoughts and feelings, and for reassurance, support and signposting to other services.

And, despite a 37 per cent increase in the opening hours of the service, the research revealed increasing demands on the service from both new and repeat callers wanting to take that first step to recovery.

The research also calls for increased funding to allow for increased service capacity, and to maintain the high-quality support currently provided to clients.

"There are a number of reasons why demand for the helpline services continues to increase," explains Dr Siobhan Weare, who led the research project.

"The profile of the NMSHOS has increased since it was launched, with it being featured in relation to high profile male survivor storylines on TV shows such as Coronation Street and Hollyoaks, as well as in national news coverage.

"Male survivors may also prefer accessing support via the telephone or online or there may be a lack of face-to-face counselling service provision for men in their local area."

This research was commissioned by the Male Survivors Partnership and funded by the Home Office.

Commenting on the report, Duncan Craig, Trustee and Joint National Strategic Lead for the Male Survivors Partnership, and CEO of Survivors Manchester, said: "As the first national service for male survivors, the NMSHOS has enabled thousands of men and boys to speak about their abuse to trained specialists who will help them consider their options for support.

"The helpline services play a critical role in supporting male survivors, their families, and their friends. Male survivors should be offered choice in the type of service they wish to access (e.g. face-to-face counselling, group support, or telephone/ online support). This can only happen if all services are properly funded across the country."

Credit: 
Lancaster University

Enhancing drug testing with human body-on-chip systems

The U.S. Food and Drug Administration (FDA) approves only 13.8% of all tested drugs, and these numbers are even lower in "orphan" diseases that affect relatively few people.

Part of the problem lies in the imperfect nature of preclinical drug testing that aims to exclude toxic effects and predetermine concentrations and administration routes before drug candidates can be tested in people. How new drugs move within the human body and are affected by it, and how drugs affect the body itself, cannot be predicted accurately enough in animal and standard in vitro studies.

"To solve this massive preclinical bottleneck problem, we need to become much more effective at setting the stage for drugs that are truly promising and rule out others that for various reasons are likely to fail in people," explains Prof. Donald Ingber, M.D., Ph.D., founding director of Harvard University's Wyss Institute for Biologically Inspired Engineering, co-author of two new studies on the subject published in Nature Biomedical Engineering on January 27, 2020.

A team of over 50 scientists at TAU and Harvard, co-led by Dr. Ben Maoz of Tel Aviv University's Department of Biomedical Engineering and Sagol School of Neuroscience, has now devised a functioning, comprehensive multi-Organ-on-a-Chip (Organ Chip) platform that enables effective in-vitro-to-in-vivo translation (IVIVT) of human drug pharmacology.

"We hope that this platform will enable us to bridge the gap on current limitations in drug development by providing a practical, reliable, relevant system for testing drugs for human use," says Dr. Maoz, co-first author of both studies and former Technology Development Fellow at the Wyss Institute on the teams of Prof. Ingber and Prof. Kevin Kit Parker, Ph.D., the latter of whom is also a leading author of both studies.

In the first of two studies, the scientists developed the "Interrogator," a robotic liquid transfer device to link individual "Organ Chips" in a way that mimics the flow of blood between organs in the human body.

Organ Chips are microfluidic devices composed of a clear flexible polymer the size of a computer memory stick that contains two parallel hollow channels separated by a porous membrane and independently perfused with cell type-specific media. While one of the channels, the parenchymal channel, is lined with cells from a specific human organ or functional organ structure, the other is lined with vascular endothelial cells representing a blood vessel. The membrane allows the two compartments to communicate with each other and to exchange molecules like cytokines and growth factors, as well as drugs and drug products generated by organ-specific metabolic activities.

The team then applied their Interrogator automated linking platform and a new computational model they developed to three linked organs to test two drugs: nicotine and cisplatin.

"The modularity of our approach and availability of multiple validated Organ Chips for a variety of tissues for other human Body-on-Chip approaches now allows us to develop strategies to make realistic predictions about the pharmacology of drugs much more broadly," says Prof. Ingber. "Its future use could greatly increase the success rates of Phase I clinical trials."

The researchers accurately modeled the oral uptake of nicotine and intravenous uptake of cisplatin, a common chemotherapy medication, and their first passage through relevant organs with highly quantitative predictions of human pharmacokinetic and pharmacodynamic parameters.

"The resulting calculated maximum nicotine concentrations, the time needed for nicotine to reach the different tissue compartments, and the clearance rates in the Liver Chips in our in vitro-based in silico model mirrored closely what had been measured in patients," concludes Dr. Maoz.

The multidisciplinary research project is the culmination of a Defense Advanced Research Projects Agency (DARPA) project at the Wyss Institute. Several authors on both studies, including Prof. Ingber, are employees and hold equity in Emulate, Inc., a company that was spun out of the Wyss Institute to commercially develop Organ Chip technology.

Credit: 
American Friends of Tel Aviv University

Genetic marking discovery improves fruit quality, bolsters climate defenses

ITHACA, N.Y. - Transferring genetic markers across species is a longstanding challenge in plant breeding, but a team of grapevine breeders and scientists at Cornell University has come up with a powerful new method that improves fruit quality and acts as a key defense against pests and a changing climate.

Plant breeders are always striving to develop new varieties that satisfy growers, producers and consumers. To do this, breeders use genetic markers to bring desirable traits from wild species into their cultivated cousins.

The team's new technique for developing genetic markers improves markers' transfer rate across grapevine species from 2% to 92%. With it, breeders worldwide can screen their collections and find out immediately which vines have the traits they want - regardless of what varieties they are, where they came from or which species their parents were.

"This new marker development strategy goes well beyond grapes," said co-author Bruce Reisch, professor of horticulture in the College of Agriculture and Life Sciences, and leader of Cornell's Grapevine Breeding and Genetics Program. "It's applicable for breeding and genetic studies across different grape breeding programs, plant species and other diverse organisms."

To create the genetic markers, the research team used new automated DNA sequencing technology to create a "core genome" for grapevines, matching important regions shared between 10 species' genomes. Using powerful new genetic mapping technology, they targeted those regions to develop robust DNA markers.
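To make the "core genome" idea concrete, a toy version of the shared-regions step might scan aligned sequences from several species for windows that are identical in all of them; such windows would be candidate anchor sites for markers that transfer across species. The sketch below is a deliberate simplification for illustration, with made-up sequences, and is not the VitisGen2 pipeline.

```python
# Toy illustration of finding regions shared across species' genomes.
# Not the VitisGen2 method; the aligned fragments below are hypothetical.

def conserved_windows(aligned_seqs, window=8):
    """Return (position, sequence) for each window of the alignment that
    is identical across all species and free of gap characters."""
    length = min(len(s) for s in aligned_seqs)
    hits = []
    for i in range(length - window + 1):
        slices = {s[i:i + window] for s in aligned_seqs}
        if len(slices) == 1 and "-" not in next(iter(slices)):
            hits.append((i, slices.pop()))
    return hits

# Hypothetical aligned fragments from three grapevine species.
alignment = [
    "ATCGGATTACCA-GGTTACGA",
    "ATCGGATTACCATGGTTCCGA",
    "ATCGGATTACCA-GGTTACGA",
]
print(conserved_windows(alignment))   # windows shared by all three
```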

This breakthrough in translating the grapevine genome into a common language for breeders is central to the mission of VitisGen2, the second iteration of a multi-institution research project from which the new marker development strategy emerged.

"This is game-changing work - and it's only the beginning," said Donnell Brown, president of the National Grape Research Alliance, an industry-led nonprofit representing the research interests of wine, juice, raisin and table grapes. "From here, we can greatly accelerate the genetic exploration that will help us improve fruit and production quality and, ultimately, respond to the threats of pests and diseases, a changing climate and more."

Credit: 
Cornell University

Driven by Earth's orbit, climate changes in Africa may have aided human migration

MADISON - In 1961, John Kutzbach, then a recent college graduate, was stationed in France as an aviation weather forecaster for the U.S. Air Force. There, he found himself exploring the storied caves of Dordogne, including the prehistoric painted caves at Lascaux.

Thinking about the ancient people and animals who would have gathered in these caves for warmth and shelter, he took up an interest in glaciology. "It was interesting to me, as a weather person, that people would live so close to an ice sheet," says Kutzbach, emeritus University of Wisconsin-Madison professor of atmospheric and oceanic sciences and the Nelson Institute for Environmental Studies.

Kutzbach went on to a career studying how changes in Earth's movements through space - the shape of its orbit, its tilt on its axis, its wobble - and other factors, including ice cover and greenhouse gases, affect its climate. Many years after marveling at Ice Age cave art, he is now trying to better understand how changes in Earth's climate may have influenced human migration out of Africa.

In a recent study published in the Proceedings of the National Academy of Sciences, Kutzbach and a team of researchers trace changes in climate and vegetation in Africa, Arabia and the Mediterranean going back 140,000 years to aid others studying the influences underlying human dispersal.

The study describes a dynamic climate and vegetation model that explains when regions across Africa, areas of the Middle East, and the Mediterranean were wetter and drier and how the plant composition changed in tandem, possibly providing migration corridors throughout time.

"We don't really know why people move, but if the presence of more vegetation is helpful, these are the times that would have been advantageous to them," Kutzbach says.

The model also illuminates relationships between Earth's climate and its orbit, greenhouse gas concentrations, and its ice sheets.

For instance, the model shows that around 125,000 years ago, northern Africa and the Arabian Peninsula experienced increased and more northerly-reaching summer monsoon rainfall that led to narrowing of the Saharan and Arabian deserts due to increased grassland. At the same time, in the Mediterranean and the Levant (an area that includes Syria, Lebanon, Jordan, Israel and Palestine), winter storm track rainfall also increased.

These changes were driven by Earth's position relative to the sun. At the time, Earth was as close as possible to the sun during Northern Hemisphere summer and as far away as possible during Northern Hemisphere winter. This amplified the seasonal cycle, producing warm, wet summers and cold winters.

"It's like two hands meeting," says Kutzbach. "There were stronger summer rains in the Sahara and stronger winter rains in the Mediterranean."

Given the nature of Earth's orbital movements, collectively called Milankovitch cycles, the region returns to this configuration roughly every 21,000 years. Halfway through each cycle, about 10,000 years later, the pattern reverses: the Northern Hemisphere is at its farthest point from the sun during summer and at its closest during winter.

Indeed, the model showed large increases in rainfall and vegetation at 125,000, at 105,000, and at 83,000 years ago, with corresponding decreases at 115,000, at 95,000 and at 73,000 years ago, when summer monsoons decreased in magnitude and stayed further south.
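A back-of-the-envelope calculation shows how closely that pacing tracks a simple 21,000-year cycle. The toy index below, a pure cosine pinned to the 125,000-years-ago maximum, illustrates the timing argument only; it is not the climate model used in the study.

```python
# Toy precession-pacing check: a cosine with a ~21,000-year period,
# peaking 125,000 years ago, lands its later peaks and troughs near the
# wet and dry phases identified by the model. Illustration only.
import math

PERIOD = 21_000    # approximate precession cycle, years
PEAK = 125_000     # model-identified monsoon maximum, years ago

def precession_index(years_ago):
    """Toy NH-summer insolation index: +1 at wet peaks, -1 at dry troughs."""
    return math.cos(2 * math.pi * (years_ago - PEAK) / PERIOD)

wet = [PEAK - n * PERIOD for n in range(3)]                 # maxima
dry = [PEAK - PERIOD // 2 - n * PERIOD for n in range(3)]   # minima
print("wet maxima (years ago):", wet)   # 125000, 104000, 83000
print("dry minima (years ago):", dry)   # 114500, 93500, 72500
print("index 95,000 years ago:", round(precession_index(95_000), 2))  # ~ -0.9
```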

Between roughly 70,000 and 15,000 years ago, Earth was in a glacial period and the model showed that the presence of ice sheets and reduced greenhouse gases increased winter Mediterranean storms but limited the southern retreat of the summer monsoon. The reduced greenhouse gases also caused cooling near the equator, leading to a drier climate there and reduced forest cover.

These changing regional patterns of climate and vegetation could have created resource gradients for humans living in Africa, driving migration outward to areas with more water and plant life.

For the study, the researchers, including Kutzbach's UW-Madison colleagues Ian Orland and Feng He, along with researchers at Peking University and the University of Arizona, used the Community Climate System Model version 3 from the National Center for Atmospheric Research. They ran simulations that accounted for orbital changes alone, combined orbital and greenhouse gas changes, and a third that combined those influences plus the influence of ice sheets.

It was Kutzbach who, in the 1970s and 1980s, confirmed that changes in Earth's orbit can drive the strength of summer monsoons around the globe by influencing how much sunlight, and therefore, how much warming reaches a given part of the planet.

Forty years ago, there was evidence for periodic strong monsoons in Africa, but no one knew why, Kutzbach says. He showed that orbital changes on Earth could lead to warmer summers and thus stronger monsoons. He also read about periods of "greening" in the Sahara, often used to explain early human migration into the typically arid Middle East.

"My early work prepared me to think about this," he says.

His current modeling work mostly agrees with collected data from each region, including observed evidence from old lake beds, pollen records, cave features, and marine sediments. A recent study led by Orland used cave records in the Levant to show that summer monsoons reached into the region around 125,000 years ago.

"We get some things wrong (in the model)," says Kutzbach, so the team continues to refine it. For instance, the model doesn't get cold enough in southern Europe during the glacial period and not all vegetation changes match observed data. Computing power has also improved since they ran the model.

"This is by no means the last word," Kutzbach says. "The results should be looked at again with an even higher-resolution model."

Credit: 
University of Wisconsin-Madison

Effects of contact between minority and majority groups more complex than once believed

image: In recent research, Linda Tropp at UMass Amherst, with Tabea Hässler at the University of Zurich and others, examined whether and how contact between groups might help to promote support for social change in pursuit of greater social equality, among other goals.

Image: 
UMass Amherst

AMHERST, Mass. - For more than 50 years, social scientists and practitioners have suggested that having members of different groups interact with each other can be an effective tool for reducing prejudice. But emerging research points to a more complex and nuanced understanding of the effects of contact between groups, say Linda Tropp at the University of Massachusetts Amherst and Tabea Hässler, leader of a multi-national research team based at the University of Zurich, Switzerland.

As Tropp explains, studies from the last 10 to 15 years suggest that the positive effects of intergroup contact tend to be weaker among members of historically advantaged groups, such as white people and heterosexuals, compared to the effects typically observed among members of historically disadvantaged groups such as people of color and sexual minorities. There has also been growing concern that contact may effectively reduce prejudice between groups but do little to change existing social inequalities, she adds.

"With our research, we wanted to examine whether and how contact between groups might help to promote support for social change, in pursuit of greater social equality, while also testing whether the effects of contact might vary depending on status relations between the groups and how the relevant variables were measured," she explains. "So, we embarked on this multi-national study, which included researchers from more than twenty countries around the world, who gathered survey responses from 12,997 individuals across 69 countries."

The authors highlight that this comprehensive study "makes substantial advances in our understanding of the relation between intergroup contact and social change." Details appear in Nature Human Behaviour.

The researchers found robust evidence, Tropp says, that when members of historically advantaged groups engage in contact with disadvantaged groups, they are more likely to support social change to promote equality. In contrast, when members of historically disadvantaged groups have contact with advantaged groups, they are generally less likely to support social change to promote equality.

However, the researchers also point out an important exception: "Among both advantaged and disadvantaged groups, contact predicted greater willingness to work in solidarity to achieve greater social equality. Thus, this research may offer a new route to reach social cohesion and social change, such that social harmony would not come at the expense of social justice."

Tropp, Hässler and their colleagues say their results raise two important questions and directions for future research. First, they ask, "How can positive and intimate contact between groups occur without reducing disadvantaged group members' support for social change?" Second, "How can support for social change be increased among disadvantaged group members without requiring negative contact experiences?"

They conclude, "Possible answers to both questions may be that advantaged group members who engage in contact should openly acknowledge structural inequalities and express support for efforts by disadvantaged group members to reduce these inequalities."

Credit: 
University of Massachusetts Amherst

For cheaper solar cells, thinner really is better

Costs of solar panels have plummeted over the last several years, leading to rates of solar installations far greater than most analysts had expected. But with most of the potential areas for cost savings already pushed to the extreme, further cost reductions are becoming more challenging to find.

Now, researchers at MIT and at the National Renewable Energy Laboratory (NREL) have outlined a pathway to slashing costs further, this time by slimming down the silicon cells themselves.

Thinner silicon cells have been explored before, especially around a dozen years ago when the cost of silicon peaked because of supply shortages. But this approach suffered from some difficulties: The thin silicon wafers were too brittle and fragile, leading to unacceptable levels of losses during the manufacturing process, and they had lower efficiency. The researchers say there are now ways to begin addressing these challenges through the use of better handling equipment and some recent developments in solar cell architecture.

The new findings are detailed in a paper in the journal Energy & Environmental Science, co-authored by MIT postdoc Zhe Liu, professor of mechanical engineering Tonio Buonassisi, and five others at MIT and NREL.

The researchers describe their approach as "technoeconomic," stressing that at this point economic considerations are as crucial as the technological ones in achieving further improvements in affordability of solar panels.

Currently, 90 percent of the world's solar panels are made from crystalline silicon, and the industry continues to grow at a rate of about 30 percent per year, the researchers say. Today's silicon photovoltaic cells, the heart of these solar panels, are made from wafers of silicon that are 160 micrometers thick, but with improved handling methods, the researchers propose this could be shaved down to 100 micrometers -- and eventually as little as 40 micrometers or less, which would only require one-fourth as much silicon for a given size of panel.
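The "one-fourth as much silicon" figure follows directly from the wafer geometry, as the quick calculation below shows. It uses the standard density of crystalline silicon and, for simplicity, ignores kerf (sawing) losses.

```python
# Silicon mass per square meter of wafer at the thicknesses discussed
# above. Back-of-the-envelope only: kerf (sawing) losses are ignored.
SILICON_DENSITY = 2.33  # g/cm^3, crystalline silicon

def silicon_grams_per_m2(thickness_um):
    """Mass of a 1 m^2 silicon wafer of the given thickness (micrometers)."""
    volume_cm3 = 100 * 100 * (thickness_um * 1e-4)  # 1 m^2 = 10^4 cm^2
    return volume_cm3 * SILICON_DENSITY

for t_um in (160, 100, 40, 15):
    print(f"{t_um:>3} um: {silicon_grams_per_m2(t_um):6.1f} g/m^2")
# 160 um -> ~373 g/m^2; 40 um -> ~93 g/m^2, i.e. one-fourth the silicon
```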

That could not only reduce the cost of the individual panels, they say, but even more importantly it could allow for rapid expansion of solar panel manufacturing capacity. That's because the expansion can be constrained by limits on how fast new plants can be built to produce the silicon crystal ingots that are then sliced like salami to make the wafers. These plants, which are generally separate from the solar cell manufacturing plants themselves, tend to be capital-intensive and time-consuming to build, which could lead to a bottleneck in the rate of expansion of solar panel production. Reducing wafer thickness could potentially alleviate that problem, the researchers say.

The study looked at the efficiency levels of four variations of solar cell architecture, including PERC (passivated emitter and rear contact) cells and other advanced high-efficiency technologies, comparing their outputs at different thickness levels. The team found there was in fact little decline in performance down to thicknesses as low as 40 micrometers, using today's improved manufacturing processes.

"We see that there's this area (of the graphs of efficiency versus thickness) where the efficiency is flat," Liu says, "and so that's the region where you could potentially save some money." Because of these advances in cell architecture, he says, "we really started to see that it was time to revisit the cost benefits."

Changing over the huge panel-manufacturing plants to adapt to the thinner wafers will be a time-consuming and expensive process, but the analysis shows the benefits can far outweigh the costs, Liu says. It will take time to develop the necessary equipment and procedures to allow for the thinner material, but with existing technology, he says, "it should be relatively simple to go down to 100 micrometers," which would already provide some significant savings. Further improvements in technology such as better detection of microcracks before they grow could help reduce thicknesses further.

In the future, the thickness could potentially be reduced to as little as 15 micrometers, he says. New technologies that grow thin wafers of silicon crystal directly rather than slicing them from a larger cylinder could help enable such further thinning, he says.

Development of thin silicon has received little attention in recent years because the price of silicon has declined from its earlier peak. But because of the gains already made in solar cell efficiency and the cost reductions achieved in other parts of the solar panel manufacturing process and supply chain, the cost of the silicon itself is once again a factor that can make a difference, he says.

"Efficiency can only go up by a few percent. So if you want to get further improvements, thickness is the way to go," Buonassisi says. But the conversion will require large capital investments for full-scale deployment.

The purpose of this study, he says, is to provide a roadmap for those who may be planning expansion in solar manufacturing technologies. By making the path "concrete and tangible," he says, it may help companies incorporate this in their planning. "There is a path," he says. "It's not easy, but there is a path. And for the first movers, the advantage is significant."

What may be required, he says, is for the different key players in the industry to get together and lay out a specific set of steps forward and agreed-upon standards, as the integrated circuit industry did early on to enable the explosive growth of that industry. "That would be truly transformative," he says.

Andre Augusto, an associate research scientist at Arizona State University who was not connected with this research, says, "Refining silicon and wafer manufacturing is the most capital-expense (capex) demanding part of the process of manufacturing solar panels. So in a scenario of fast expansion, the wafer supply can become an issue. Going thin solves this problem in part, as you can manufacture more wafers per machine without significantly increasing the capex." He adds that "thinner wafers may deliver performance advantages in certain climates," performing better in warmer conditions.

Renewable energy analyst Gregory Wilson of Gregory Wilson Consulting, who was not associated with this work, says, "The impact of reducing the amount of silicon used in mainstream cells would be very significant, as the paper points out. The most obvious gain is in the total amount of capital required to scale the PV industry to the multi-terawatt scale required by the climate change problem. Another benefit is in the amount of energy required to produce silicon PV panels. This is because the polysilicon production and ingot growth processes that are required for the production of high-efficiency cells are very energy intensive."

Wilson adds, "Major PV cell and module manufacturers need to hear from credible groups like Prof. Buonassisi's at MIT, since they will make this shift when they can clearly see the economic benefits."

Credit: 
Massachusetts Institute of Technology