
Scientists develop non-invasive method to predict onset of dementia


INDIANAPOLIS -- Information gathered from routine visits to the doctor is enough to accurately predict a person's risk of developing Alzheimer's disease and related dementias, according to new research led by scientists from Regenstrief Institute, Indiana University and Merck. The researchers developed and tested machine learning algorithms using data from electronic medical records to identify patients who may be at risk of developing dementia.

At least 50 percent of older primary care patients living with Alzheimer's disease and related dementias never receive a diagnosis. And many more live with symptoms for two to five years before being diagnosed. Currently, tests to screen for dementia risk are invasive, time-consuming and expensive.

"The great thing about this method is that it's passive, and it provides similar accuracy to the more intrusive tests that are currently used," said lead researcher Malaz Boustani, M.D., MPH, a research scientist at Regenstrief Institute and a professor at Indiana University School of Medicine. "This is a low cost, scalable solution that can provide substantial benefit to patients and their families by helping them prepare for the possibility of life with dementia and enabling them to take action."

Developing machine learning algorithms for predicting dementia

The research team, which also included scientists from Georgia State, Albert Einstein College of Medicine and Solid Research Group, recently published its findings on two different machine learning approaches. The paper published in the Journal of the American Geriatrics Society analyzed the results of a natural language processing algorithm, which learns rules by analyzing examples, and the Artificial Intelligence in Medicine article shared the results from a random forest model, which is built using an ensemble of decision trees. Both methods showed similar accuracy at predicting the onset of dementia within one and three years of diagnosis.

In order to train the algorithms, researchers gathered data on patients from the Indiana Network for Patient Care. The models used information on prescriptions and diagnoses, which are structured fields, as well as medical notes, which are free text, to predict the onset of dementia. Researchers found that the free-text notes were the most valuable in helping to identify people at risk of developing the disease.
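The release does not include the team's code, but the setup it describes -- structured prescription and diagnosis fields combined with free-text notes, fed to a random forest -- can be sketched roughly as below. This is a minimal illustration, not the study's pipeline; the column names, toy records and hyperparameters are all hypothetical.

```python
# Sketch of a random forest over mixed EHR features: TF-IDF on free-text
# notes plus structured counts. Toy data only; not the study's pipeline.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline

df = pd.DataFrame({                               # stand-in EHR extract
    "n_prescriptions": [12, 3, 25, 7],            # structured field
    "n_diagnoses":     [5, 1, 9, 2],              # structured field
    "notes": ["patient reports memory lapses",    # free-text note
              "annual physical, no complaints",
              "family notes confusion at night",
              "follow-up for hypertension"],
    "label": [1, 0, 1, 0],                        # 1 = later dementia dx
})

features = ColumnTransformer([
    ("text", TfidfVectorizer(), "notes"),                      # free text
    ("structured", "passthrough", ["n_prescriptions", "n_diagnoses"]),
])
model = Pipeline([
    ("features", features),
    ("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
])

X_train, X_test, y_train, y_test = train_test_split(
    df.drop(columns="label"), df["label"],
    test_size=0.5, stratify=df["label"], random_state=0)
model.fit(X_train, y_train)
print("toy AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```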

"This research is exciting because it potentially provides significant benefit to patients and their families," said Patrick Monahan, PhD, study author from IU School of Medicine and a Regenstrief affiliate scientist. "Clinicians can provide education on behavior and habits to help patients cope with their symptoms and live a better quality of life."

Zina Ben Miled, PhD, M.S., a study author from the Purdue School of Engineering and Technology at IUPUI and a Regenstrief affiliate scientist, said, "The early risk identification allows an opportunity for doctors and families to put a care plan in place. I know from experience what a burden it can be to deal with a dementia diagnosis. The window provided by this test is so important to help improve the quality of life for both patients and their families."

In addition to the benefit to families, these methods can also provide significant cost savings for patients and health systems. They eliminate the need for expensive tests and allow clinicians to screen entire populations to identify those most at risk. Delaying the onset of symptoms also saves a significant amount of money on treatment.

The next step is to deploy these machine learning algorithms in real-life clinics to test if they help identify more true cases of dementia as well as to learn how they impact a patient's willingness to follow up on the results.

Credit: 
Regenstrief Institute

Live imaging of flowers reveals hidden secrets of plant reproduction

image: An Arabidopsis floral bud with pollen mother cells highlighted in green.

Image: 
Valuchova, Mikulkova et al. (CC BY 4.0)

Scientists have developed a way to image sexual reproduction in living flowers, according to a study published today in the open-access journal eLife.

The new technique, originally reported on bioRxiv, records for the first time movies of fundamental processes in flower development and opens up new avenues for research on plant sexual reproduction.

Plant reproduction occurs in the anthers and ovaries of developing flowers, resulting in the formation of pollen and an embryo sac, which hold the male and female germ cells respectively. The production of germ cells involves both types of cell division: meiosis and mitosis. Following production, these cells fuse during fertilisation to produce a cell that develops into a plant.

Much of our understanding of plant reproductive processes has come from studying dissected samples of plants under a microscope and by examining aberrations arising from mutations in plant genes involved in plant reproduction. However, these methods do not provide information about where and when different events happen during reproduction. Live imaging provides a way of capturing this important detail.

"Live imaging has been instrumental in research into root growth and development, but live-cell imaging of cell processes within the flower is technically much more challenging," explains co-first author Sona Valuchova, a postdoctoral researcher at the Central European Institute of Technology at Masaryk University, Czech Republic. "There is a need to develop imaging methods in the context of whole organs or plants."

Valuchova and her colleagues used a technique called light sheet fluorescence microscopy (LSFM), where a sample is moved through a thin sheet of laser light, and a detector picks up three-dimensional image data. In this way, an entire flower that has been embedded in agar can be imaged quickly by moving it through the plane of light. The resulting 3D model shows intricate detail of the flower structure and can be used to track the fate of individual germ cells within it.

Having shown that LSFM could provide high-resolution images of flowers, the team's next goal was to establish that it could detect specific events in reproduction. To achieve this, they used flowers that had been engineered to have fluorescent labels on key molecules involved in meiosis and mitosis. They were able to capture the entire process of meiosis in male germ cells by detecting changes in the amount and location of a molecule called ASY1 every hour for four days.

The team went on to show that live imaging could be successfully used to study plant hormone levels during different stages of flower development and to watch the movement of chromosomes across the cell during cell division.

One of the most neglected areas of plant reproduction research is the production of female germ cells during female meiosis. Most studies on plant meiosis have focused on male germ cells because the female equivalent - called the megaspore mother cell - is incredibly rare and looks very similar to other cells, making it hard to study. To overcome this, the team developed a version of live imaging specifically for female meiosis. "This required careful dissection of the flower bud to reveal the ovules, which were then passed through the laser light every 10 minutes over 24 hours to create a 3D film," says co-first author Pavlina Mikulkova, also of the Central European Institute of Technology at Masaryk University. "Using this technique, we were able to record the two phases of female meiosis and determine how long each one lasted."

"This work demonstrates the power of LSFM to provide novel information about plant reproduction that could not previously be studied by other types of microscopy," concludes senior researcher Karel Riha, Deputy Director for Research at the Central European Institute of Technology at Masaryk University. "Our success in developing a live imaging protocol for female meiosis represents a major technical advancement in plant cell biology."

Credit: 
eLife

Cairo car drivers exposed to dangerous levels of pollution, new study finds

Car drivers in Cairo are exposed to dangerous levels of air pollution, finds a first-of-its-kind study from the University of Surrey.

Greater Cairo, the sixth largest city in the world, is home to 2.4 million cars, and particulate matter (PM) - i.e. air pollution - is thought to cause around 10 per cent of premature deaths in Egypt.

In the study 'Car users' exposure to particulate matter and gaseous air pollutants in megacity Cairo' published in Sustainable Cities and Society, a team led by Surrey's Global Centre for Clean Air Research (GCARE), together with the American University in Cairo, investigated the contributing factors that determine particulate matter (PM) exposure levels while in a vehicle - for different car settings (open window, closed window, or with air conditioning (AC) on), during different times of the day and across various commuter routes (cross city and inner city).

The results showed that Cairo motorists who drive with their windows open can be exposed to 65 per cent more PM10 (coarse particles) and 48 per cent more PM2.5 (fine particles) than those who drive with windows closed and the AC on. However, many car owners simply do not have AC in their vehicles and are exposed to PM10 and PM2.5 levels as high as 227 and 119 μg/m3, respectively, while driving cross-city routes.
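As a rough arithmetic illustration -- assuming the 65 and 48 per cent gaps apply to the cross-city figures, which the summary above does not state explicitly -- the closed-window, AC-on exposures implied by the reported numbers can be back-calculated:

```python
# Back-of-envelope check of the reported exposure gaps. Hypothetical
# combination of the article's figures, not the study's own calculation.
pm10_open, pm25_open = 227.0, 119.0    # μg/m3, windows open, cross-city
excess_pm10, excess_pm25 = 0.65, 0.48  # open-window excess vs closed + AC

pm10_closed_ac = pm10_open / (1 + excess_pm10)   # ≈ 138 μg/m3
pm25_closed_ac = pm25_open / (1 + excess_pm25)   # ≈ 80 μg/m3
print(f"implied closed-window + AC exposure: PM10 ≈ {pm10_closed_ac:.0f}, "
      f"PM2.5 ≈ {pm25_closed_ac:.0f} μg/m3")
# Both remain well above the WHO's 2005 24-hour guideline values of
# 50 (PM10) and 25 (PM2.5) μg/m3.
```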

The study further found that evening peak hours are more congested, exposing drivers to higher pollution concentrations than during morning peak hours.

The team also discovered that Cairo's proximity to the desert and its low precipitation rates are contributing factors to higher concentrations of PM10. GCARE found that areas with heavy construction activity or unmaintained roads also feature higher levels of PM10.

Professor Prashant Kumar, Founding Director of GCARE at the University of Surrey, said: "Air pollution causes seven million premature deaths worldwide, disproportionately affecting poor and vulnerable communities and exacerbating inequalities in official development assistance (ODA) countries. Knowledge is invaluable in our battle against climate change and air pollution, which is why we hope that our study - the first of its kind - is not the last."

Credit: 
University of Surrey

Climate change could trigger more landslides in High Mountain Asia

image: NASA's LHASA landslide risk model and Global Landslide Catalog track the areas most at risk from deadly landslides, which can cause effects ranging from destroying towns to cutting off drinking water and transportation networks.

Image: 
NASA Scientific Visualization Studio / Helen-Nicole Kostis

More frequent and intense rainfall events due to climate change could cause more landslides in the High Mountain Asia region of China, Tibet and Nepal, according to the first quantitative study of the link between precipitation and landslides in the region.

High Mountain Asia stores more fresh water in its snow and glaciers than any place on Earth outside the poles, and more than a billion people rely on it for drinking and irrigation. The study team used satellite estimates and modeled precipitation data to project how changing rainfall patterns in the region might affect landslide frequency. They found that warming temperatures will cause more intense rainfall in some areas, which could lead to increased landslide activity in the border region of China and Nepal.

More landslides in this region, especially in areas currently covered by glaciers and glacial lakes, could cause cascading disasters like landslide dams and floods that affect areas downstream, sometimes hundreds of miles away, according to the study. The study was a collaboration between scientists from NASA's Goddard Space Flight Center in Greenbelt, Maryland; the National Oceanic and Atmospheric Administration (NOAA) in Washington; and Stanford University in Stanford, California.

High Mountain Asia stretches across tens of thousands of rugged, glacier-covered miles, from the Himalayas in the east to the Hindu Kush and Tian Shan mountain ranges in the west. As Earth's climate warms, High Mountain Asia's water cycle is changing, including shifts in its annual monsoon patterns and rainfall.

Heavy rain, like the kind that falls during the monsoon season from June through September, can trigger landslides on the steep terrain, creating disasters that range from destroying towns to cutting off drinking water and transportation networks. In summer 2019, monsoon flooding and landslides in Nepal, India and Bangladesh displaced more than 7 million people. To predict how climate change might affect landslides, researchers need to know what future rainfall events might look like. But until now, landslide prediction research has relied on records of past landslides or general precipitation estimate models.

"Other studies have either addressed this relationship very locally, or by adjusting the precipitation signal in a general way," said Dalia Kirschbaum, a research scientist at NASA's Goddard Space Flight Center. "Our goal was to demonstrate how we could combine global model estimates of future precipitation with our landslide model to provide quantitative estimates of potential landslide changes in this region."

The study team used a NASA model that generates a "nowcast" estimating potential landslide activity triggered by rainfall in near real-time. The model, called Landslide Hazard Assessment for Situational Awareness (LHASA), assesses the hazard by evaluating information about roadways, the presence or absence of nearby tectonic faults, the types of bedrock, change in tree cover and the steepness of slopes. Then, it integrates current precipitation data from the Global Precipitation Measurement mission. If the amount of precipitation in the preceding seven days is abnormally high for that area, then the potential occurrence of landslides increases.
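As a rough illustration of that decision logic -- not NASA's published implementation; the thresholds, weights and field names below are invented -- a LHASA-style nowcast rule might look like this:

```python
# Minimal sketch of a LHASA-style landslide "nowcast": flag hazard when
# recent rainfall is abnormally high for a location AND the terrain is
# intrinsically susceptible. All numbers are hypothetical.
from dataclasses import dataclass

@dataclass
class Cell:
    susceptibility: float    # 0-1, from slopes, bedrock, faults, roads,
                             # tree-cover change
    rain_7day_mm: float      # accumulated precipitation, past 7 days (GPM)
    rain_7day_p95_mm: float  # 95th-percentile 7-day rainfall at this spot

def nowcast(cell: Cell) -> str:
    """Return a hazard level for one grid cell."""
    rain_is_extreme = cell.rain_7day_mm > cell.rain_7day_p95_mm
    if rain_is_extreme and cell.susceptibility > 0.7:
        return "high"
    if rain_is_extreme and cell.susceptibility > 0.4:
        return "moderate"
    return "low"

print(nowcast(Cell(susceptibility=0.8, rain_7day_mm=210.0,
                   rain_7day_p95_mm=150.0)))   # -> high
```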

The study team first ran LHASA with NASA precipitation data from 2000-2019 and NOAA climate model data from 1982-2017. They compared the results from both data sets to NASA's Global Landslide Catalog, which documents landslides reported in the media and other sources. Both data sets compared favorably with the catalog, giving the team confidence that using the modeled precipitation data would yield accurate forecasts.

Finally, the study team used NOAA's model data to take LHASA into the future, assessing precipitation and landslide trends in the future (2061-2100) versus the past (1961-2000). They found that extreme precipitation events are likely to become more common in the future as the climate warms, and in some areas, this may lead to a higher frequency of landslide activity.

Most significantly, the border region of China and Nepal could see a 30-70% increase in landslide activity. The border region is not currently heavily populated, Kirschbaum said, but is partially covered by glaciers and glacial lakes. The combined impacts of more frequent intense rainfall and a warming environment could affect the delicate structure of these lakes, releasing flash floods and causing downstream flooding, infrastructure damage, and loss of water resources.

The full human impact of increasing landslide risk will depend on how climate change affects glaciers and how populations and communities change. When they evaluated their model projections in the context of five potential population scenarios, the team found that most residents in the area will be exposed to more landslides in the future regardless of the scenario, but only a small proportion will be exposed to landslide activity increases greater than 20%.

The study demonstrates new possibilities for research that could help decision-makers prepare for future disasters, both in High Mountain Asia and in other areas, said Kirschbaum.

"Our hope is to expand our research to other areas of the world with similar risks of landslides, including Alaska and Appalachia in the United States," said Sarah Kapnick, physical scientist at NOAA's Geophysical Fluid Dynamics Laboratory and co-author on the study. "We've developed a method, figured out how to work together on a specific region, and now we'd like to look at the U.S. to understand what the hazards are now and in the future."

Credit: 
NASA/Goddard Space Flight Center

New synthesis methods enhance 3D chemical space for drug discovery

image: Graphic shows the dirhodium catalyst developed to synthesize a 3D scaffold of keen interest to the pharmaceutical industry. The Davies lab has published a series of major papers on dirhodium catalysts that selectively functionalize C-H bonds in a streamlined manner.

Image: 
Davies Lab/Emory University

After helping develop a new approach for organic synthesis -- carbon-hydrogen functionalization -- scientists at Emory University are now showing how this approach may apply to drug discovery. Nature Catalysis published their most recent work -- a streamlined process for making a three-dimensional scaffold of keen interest to the pharmaceutical industry.

"Our tools open up whole new chemical space for potential drug targets," says Huw Davies, Emory professor of organic chemistry and senior author of the paper.

Davies is the founding director of the National Science Foundation's Center for Selective C-H Functionalization, a consortium based at Emory and encompassing 15 major research universities from across the country as well as industrial partners.

Traditionally, organic chemistry has focused on the division between reactive molecular bonds and the inert carbon-carbon (C-C) and carbon-hydrogen (C-H) bonds. The inert bonds provide a strong, stable scaffold for performing chemical synthesis with the reactive groups. C-H functionalization flips this model on its head, making the C-H bonds themselves the reactive sites.

The aim is to efficiently transform simple, abundant molecules into much more complex, value-added molecules. Functionalizing C-H bonds opens new chemical pathways for the synthesis of fine chemicals -- pathways that are more direct, less costly and generate less chemical waste.

The Davies lab has published a series of major papers on dirhodium catalysts that selectively functionalize C-H bonds in a streamlined manner.

The current paper demonstrates the power of a dirhodium catalyst to efficiently synthesize a bioisostere of a benzene ring. A benzene ring is a two-dimensional (2D) molecule and a common motif in drug candidates. The bioisostere has similar biological properties to a benzene ring. It is a different chemical entity, however, with a 3D structure, which opens up new chemical territory for drug discovery.

Previous attempts to exploit this bioisostere for biomedical research have been hampered by the delicate nature of the structure and the limited ways to make it. "Traditional chemistry is too harsh and causes the system to fragment," Davies explains. "Our method allows us to easily achieve a reaction on a C-H bond of this bioisostere in a way that does not destroy the scaffold. We can do chemistry that no one else can do and generate new, and more elaborate, derivatives containing this promising bioisostere."

The paper serves as proof of principle that bioisosteres can serve as fundamental building blocks to generate an expanded range of chemical entities. "It's like getting a new Lego shape in your kit," Davies says. "The more Lego shapes you have, the more new and different structures you can build."

Credit: 
Emory Health Sciences

Novel drug therapy shows promise for quality, quantity of kidneys available for transplant

image: Researchers have developed a new way to preserve donated kidneys--a method that could extend the number and quality of kidneys available for transplant.

Image: 
Case Western Reserve University

CLEVELAND (Feb. 11, 2020)--Researchers from Case Western Reserve University School of Medicine, University Hospitals Cleveland Medical Center (UH), Cleveland Clinic and Lifebanc (a Northeast Ohio organ-procurement organization) have developed a new way to preserve donated kidneys--a method that could extend the number and quality of kidneys available for transplant, saving more people with end-stage renal disease, more commonly known as "kidney failure."

The team identified a drug--ethyl nitrite--that could be added to the preservation fluid to generate tiny molecules called S-nitrosothiols (SNOs), which regulate tissue-oxygen delivery. This, in turn, restored flow-through and reduced resistance within the kidney. Higher flow-rates and lower resistance are associated with better kidney function after transplantation.

Their research was funded by a grant from the Roche Organ Transplant Research Foundation and recently published in Annals of Surgery.

The United States has one of the world's highest incidences of end-stage renal disease, and the number of afflicted individuals continues to increase. The prevalence of end-stage renal disease more than doubled between 1990 and 2016, according to the Centers for Disease Control and Prevention.

The optimal treatment is a kidney transplant, but demand far exceeds supply. Additionally, donation rates for deceased donors have been static for several years, despite various public-education campaigns, resulting in fewer kidneys available for transplant. And while the proportion and number of living donors has increased, living donors still account for only a small percentage of kidneys recovered for transplant.

Increasing the number of kidneys available for transplant benefits patients by extending lifespans and enhancing quality of life; it also has the potential to reduce medical costs, since a transplant is cheaper than ongoing dialysis. To help improve outcomes for kidney transplant patients, the team explored ways to extend the viability of donated kidneys.

Improvements in surgical techniques and immunosuppression therapies have made kidney transplants a relatively common procedure. However, less attention has been paid to maintaining/improving kidney function during the kidney-transport phase.

"We addressed this latter point through developing enhanced preservation methods," said senior author James Reynolds, professor of Anesthesiology and Perioperative Medicine at Case Western Reserve School of Medicine and a member of the Harrington Discovery Institute at UH.

For decades, procured kidneys were simply flushed with preservation solution and then transported in ice-filled coolers to the recipient's hospital. But advances in pumping technology slowly changed the field toward active storage, the preferred method for conveying the organ from donor to recipient.

"However, while 85% of kidneys are now pumped, up to 20% of kidneys are determined to be unsuitable for transplant during the storage phase," said Kenneth Chavin, professor of surgery at the School of Medicine, chief of hepatobiliary and transplant surgery and director of the UH Transplant Institute.

"For several years, our team has directed research efforts toward understanding and improving the body's response to medical manipulation," Reynolds said. "Organ-donor physiology and 'transport status' fit well within this metric. We identified a therapy that might improve kidney perfusion, a significant factor in predicting how the organ will perform post-transplant."

Previous work by Reynolds and long-time collaborator Jonathan Stamler, the Robert S. and Sylvia K. Reitman Family Foundation Distinguished Chair in Cardiovascular Innovation and president of the Harrington Discovery Institute, determined that brain death significantly reduces SNOs, which impairs blood-flow and tissue-oxygenation to the kidneys and other commonly transplanted organs. The loss of SNOs is not corrected by current preservation fluids, so impaired flow through the kidneys continues during storage and transport.

Credit: 
Case Western Reserve University

City of Hope's Triplex vaccine reduces rate of CMV complications in transplant recipients

DUARTE, Calif. -- Patients who underwent a stem cell transplant and received the Triplex vaccine to prevent a type of herpes virus - cytomegalovirus (CMV) - from duplicating out of control were 50% less likely to develop health complications related to the virus than patients who did not take Triplex, according to a City of Hope-led study published today in Annals of Internal Medicine.

The phase 2 randomized, placebo-controlled clinical trial, which took place at City of Hope, Dana-Farber Cancer Institute and MD Anderson Cancer Center, is the first time a viral anti-CMV vaccine has been tested in patients. The trial enrolled 102 people who underwent an allogeneic hematopoietic stem cell transplant (HSCT), which can cure blood cancers such as lymphoma and leukemia. Half of the patients were randomly assigned to receive Triplex, developed by City of Hope scientists to enhance CMV-specific T cells, create immunity in patients against CMV and prevent it from causing such severe complications as pneumonia, gastroenteritis and retinitis.

In the trial, patients who received at least the first of two planned injections of Triplex on day 28 and day 56 after HSCT had 50% fewer CMV complications (virus reactivation, antiviral treatment and disease) through day 100 (primary endpoint) than those who received the placebo. There were five subjects with CMV complications in the vaccine arm versus 10 in the placebo arm.

In addition, patients who received Triplex developed immunity against CMV that was 212% higher than those in the placebo group. Prior to undergoing a transplant, patients received high-dose chemotherapy to obliterate the hematopoietic system, which produces blood and is part of the immune system. Despite having weakened immune systems, patients who received Triplex developed immunity against CMV, and the immunity endured a year after the trial took place.

"It is unprecedented that we can measure the response to the vaccine so early after the transplant procedure," said Don J. Diamond, Ph.D., City of Hope professor in the Department of Hematology & Hematopoietic Cell Transplantation and the study's senior author. "That is significant because we believe that immunity is the key to controlling CMV's negative effects."

Furthermore, the number of patients who developed serious health problems in both groups was low. No Triplex-related infections or deaths occurred in the study.

"Our study represents hope for transplant patients who already have weakened immune systems and can develop serious, life-threatening CMV-related complications," Diamond said. "Triplex spurs immunity in patients and prevents CMV-related complications from occurring in stem cell patients. Our vaccine has the potential to become yet another powerful therapy for transplant recipients who are fighting for a cure."

Triplex could also eventually be used in patients who receive solid organ transplants.

One finding that needs further examination is that patients who received high-dose steroids to prevent graft-versus-host disease (GVHD) did not have as strong a response to the vaccine as patients who did not take steroids.

"Steroids are the standard treatment for GVHD but we are looking at other ways to overcome or avoid steroids," Diamond added.

One trial underway at City of Hope vaccinates the patient's donor with Triplex. That could provide the patient with earlier and greater immunity against CMV, which is particularly important for patients undergoing high-risk transplants with a haploidentical donor (a half-matched bone marrow donor).

Ryotaro Nakamura, M.D., director of City of Hope's Center for Stem Cell Transplantation, served as the overall principal investigator of the trial, and Ibrahim Aldoss, M.D., assistant clinical professor in the Department of Hematology & Hematopoietic Cell Transplantation, was City of Hope's site principal investigator.

Over half of adults have been infected with CMV by age 40, according to the Centers for Disease Control and Prevention. While the virus causes no symptoms in healthy individuals, whose immune systems can fight off CMV infection, it can cause severe, life-threatening disease in those with weakened immune systems, such as transplant patients.

Triplex is a best-in-class universal recombinant viral vector vaccine, built on a vector called modified vaccinia Ankara, that can be given without restrictions to all eligible transplant recipients and/or their donors. It is engineered to induce a robust and durable virus-specific T cell response to three immunodominant proteins linked to CMV complications in stem cell transplant recipients.

Credit: 
City of Hope

Adapting to climate change: We're doing it wrong

COLUMBUS, Ohio - When it comes to adapting to the effects of climate change, scientists and policymakers are thinking too small, according to a new research review.

The authors argue that society should focus less on how individuals respond to such climate issues as flooding and wildfires and instead figure out what it takes to inspire collective action that will protect humans from climate catastrophes on a much grander scale.

Ohio State University researchers analyzed the studies published to date on behavioral adaptation to climate change. They found that most have emphasized the psychology behind individual coping strategies in the face of isolated hazards, from the point of view of a single household managing its own risk.

What is needed, they propose, is systems-level thinking about what is truly adaptive for society, along with research on the dynamics that lead people to change entire systems through transformational actions, and on the barriers that keep people from embracing transformative efforts.

"What we know about adaptation has come from a longer history of studying the sorts of things that are getting worse because of climate change," said Robyn Wilson, lead author of the paper and a professor of risk analysis and decision science in Ohio State's School of Environment and Natural Resources.

"If we want to really adapt to climate change, we're talking about transformational change that will truly allow society to be resilient in the face of these increasing hazards. We're focused on the wrong things and solving the wrong problems."

The research review is published today (Feb. 10, 2020) in the journal Nature Climate Change.

Wilson and colleagues are not being critical of their peer scientists - or of themselves. When the incremental nature of adaptation research became evident, the review became a platform to sound an alarm: We can't take baby steps anymore when it comes to being ready for all that climate change will bring.

"Thinking holistically is part of what transformation research is all about - saying we have to work together to really think differently," Wilson said. "We can't all be individually running around doing our own thing. We need to think beyond the selfish individual who says, 'What do I need to do to be better off?'"

Take, for example, preservation of a seaside community. Current activities may include building municipal floodwalls and, as individuals, moving valuables to higher ground and letting insurance resolve problems as they arise. Instead, the authors suggest, a look at a much bigger picture could clarify whether the coastal community should exist at all.

Wilson said there was a time when researchers avoided studying adaptation for fear it would redirect attention and efforts away from mitigation - addressing the causes of climate change rather than its effects.

"Eventually, there was a recognition that we have to do both. We don't really have a choice," she said. "We have to adapt while also mitigating so we can try to avoid the really catastrophic outcomes that will come down the road for children today. The worst-case things aren't happening tomorrow, but they're happening on a time frame that will impact people we care about."

In biological terms, survival requires adaptation. Is there a chance those impacts could threaten the human species?

Though plenty of civilizations have failed, Wilson doesn't expect humans to go extinct as a result of even the worst-case climate scenario: global warming of 8 or 9 degrees Fahrenheit by the end of the 21st century.

"Somebody's going to survive," she said. "It's more a question of social equity and social justice.

"Fast-forward a couple hundred years and someone will be here. But if we don't think from a more transformative standpoint of how society should be structured and where we should live and how we should live, there will be a lot of losers - those with the least resources and low socioeconomic status and people in developing countries. ... We're living in a different world and we need to think differently about how we do things so we're all equally able to survive."

Credit: 
Ohio State University

Tropical cyclones: How they contribute to better forecasts in the Maritime Continent

image: Correlation between the activity of tropical cyclones and precipitation over the region (left) and the reason behind it: the tropical cyclone-induced zonal water transport (upper right, where red represents an eastward water transport and blue represents a westward water transport).

Image: 
© Enrico Scoccimarro

Tropical cyclones are important players within the Earth's climate system. While the literature usually investigates their role in determining flood events and inducing precipitation, a new study led by the CMCC Foundation - Euro-Mediterranean Center on Climate Change points out for the first time that they can also create drying effects in other regions, due to induced zonal wind anomalies.

Using observational data for the period 1979-2015, the study shows that tropical cyclones in the West-North Pacific not only increase precipitation in the areas where they transit from June to August, but also decrease precipitation in the Maritime Continent - the region between the Indian and Pacific Oceans including the Southeast Asian archipelagos - which is not directly affected by typhoons. This is explained by an eastward water transport anomaly in the equatorial region of the North West Pacific, induced by tropical cyclones developing in the basin.

"Tropical cyclone-associated winds that move around its centre reach 200-300 km/h. Winds move not only the air mass but also the water present in the air mass itself and they can involve an area of about 10.000 km from the centre of the cyclone", explains Dr. Scoccimarro, senior researcher at the CMCC Foundation under the Climate Simulation and Prediction Division and principal investigator of the study. "In a tropical cyclone winds spiral towards the center in a counterclockwise direction in the Northern Hemisphere. Therefore, in the southern part of the cyclone, the vertically integrated water content is moved to the East, while in the northern part of the cyclone water is moved to the West".

The eastward water transport anomaly is responsible for moving eastward the water that would otherwise belong to the air column of the Maritime Continent, thus reducing the humidity of the area: the result is that in years with more tropical cyclones, the Maritime Continent is drier.

The findings were confirmed by numerical experiments using the high-resolution General Circulation Model developed by the CMCC Foundation (CMCC-CM2-VHR4). "Using one of the three models in the world able to resolve intense typhoons, thanks to its high horizontal resolution of 25 km in both the atmosphere and ocean components," explains Scoccimarro, "we could exclude other external factors potentially interacting with both tropical cyclone activity and Maritime Continent precipitation, such as the El Niño Southern Oscillation."

The study highlights that forecasting tropical cyclone activity in advance over the West North Pacific may help in forecasting the onset and duration of the dry season over the Maritime Continent, thus helping to improve forecasts for all the processes associated with circulation in the area. This has important implications, as the Maritime Continent plays a role in the global circulation pattern: the energy released by convective condensation over the region influences the global atmospheric circulation.

Credit: 
CMCC Foundation - Euro-Mediterranean Center on Climate Change

Study: It's devastatingly common for African mothers to experience child loss

Experiencing the death of a child is seen as a violation of "the natural order." And yet, despite global health gains, such deaths remain prevalent in many poor countries and regions around the world. But just how often do mothers suffer this almost unfathomable loss?

To quantify and better understand this bereavement burden, USC and University of Chicago sociologists propose new indicators to estimate how common it is for mothers to have experienced the death of a child. In contrast to traditional measures of infant and child mortality, their results capture the cumulative impact of child loss through a mother's lifetime.

Published today in PNAS, their study demonstrates the persistently high prevalence of African mothers who have ever experienced a child's death. Using data from 20 sub-Saharan African countries spanning two decades, the researchers found that more than half of 45- to 49-year-old mothers have experienced the death of a child under age five, and nearly two thirds have experienced the death of any child, irrespective of age.
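To make the proposed indicators concrete, here is a minimal sketch of how such a mother-level prevalence measure could be computed from survey-style microdata. The column names and toy records are hypothetical, not the DHS schema the authors actually used.

```python
# Mother-level bereavement prevalence: the share of mothers in an age
# group who have EVER lost a child, as opposed to a child mortality rate.
import pandas as pd

mothers = pd.DataFrame({
    "age_group":         ["45-49", "45-49", "45-49", "20-24"],
    "children_died_u5":  [1, 0, 2, 0],   # deaths under age five
    "children_died_any": [2, 0, 2, 0],   # deaths at any age
})

g = mothers[mothers["age_group"] == "45-49"]
frac_u5  = (g["children_died_u5"]  > 0).mean()   # lost a child under 5
frac_any = (g["children_died_any"] > 0).mean()   # lost any child
print(f"mothers 45-49 who lost a child under five: {frac_u5:.0%}")
print(f"mothers 45-49 who lost any child:          {frac_any:.0%}")
```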

"In the shadows of very high child mortality rates that the global health community typically focuses on are all these grieving parents that never receive any attention," said study co-author Emily Smith-Greenaway, assistant professor of sociology at the USC Dornsife College of Letters, Arts and Sciences. "These results increase our recognition of bereavement as itself a public health threat -- one that's unfairly concentrated in low-income regions of the world."

"These questions have not been asked or explored enough in this part of the world," said co-author Jenny Trinitapoli, associate professor of sociology at UChicago. "There isn't just an inequality in the mortality burden but also in the knowledge base. The global health community has lacked a standard metric for capturing the inequality of the risk of losing a young child from the perspective of the parents -- specifically of the mothers."

Smith-Greenaway and Trinitapoli say their study grew out of the idea that parents everywhere suffer immensely when they outlive their children. While other researchers have examined similar outcomes in the United States and Europe, very few have quantified the loss felt by mothers in Africa.

"This study tells us the burden of bereavement is much greater than we knew and offers a new perspective on global inequality," Smith-Greenaway said. "We believe these indicators can be used to improve current understandings of mortality change, bereavement as a public health threat and population dynamics."

Decreases in infant mortality may obscure lasting grief

According to the WHO, from 1990 to 2018 the global infant mortality rate decreased from an estimated 65 deaths per 1,000 live births to 29 deaths per 1,000 live births. During that same span, annual infant deaths declined from 8.7 million to 4 million.

Sub-Saharan African countries have experienced some of the swiftest reductions in infant and under-five mortality rates. This progress is rightfully celebrated, the study authors noted, but it also obscures the long-term trauma of child loss. High rates of childhood, adolescent and young adult mortality mean that mothers continue to experience bereavement in ways that are not captured by the intensive focus on under-five mortality reductions.

"These are factors that we need to consider very carefully as we think about the consequences of stress and of aging," Trinitapoli said. "Looking at child loss from the perspective of mothers gives us ideas about where interventions might be the most useful, both for improving child health and helping women."

While research on bereavement in developing countries is sparse, studies in high-income settings demonstrate that the death of a family member is an underappreciated source of social inequality. Bereaved parents are at higher risk of psychological problems, deteriorating health and relationship strain.

A new way to look at mortality data and the legacy of loss

Using data spanning more than two decades, the study authors calculated the prevalence for three categories: death of an infant, death of a child under five and the loss of any child for mothers in multiple age groups. They say the three different metrics all point to a much higher burden of loss than the story told by current indicators.

They based their calculations on demographic and health survey data funded by the U.S. Agency for International Development, which conducts surveys in 90 countries.

Smith-Greenaway said she was particularly struck by the fact that the majority of women alive today in some African countries have experienced the death of a child. Fewer young mothers than older mothers have experienced a child's death, yet in many countries up to one third have.

Although the study focuses on sub-Saharan Africa, the authors say these indicators can be used to measure and consider the bereavement burden anywhere with high child mortality. They anticipate these numbers would be just as high in many other low-income settings.

Trauma's ripple effects

"As a demographer, I'm interested in tapping into a collective consciousness," Smith-Greenaway said. "These premature deaths live on in the collective memory in such a way that could shape ideas about parenthood, loss and the risk for tomorrow's generation of mothers."

Smith-Greenaway and Trinitapoli point to research on negativity bias, which suggests survival of children in one's social network is a forgettable event whereas a child's death registers as memorable and influential -- and can live on in the collective memory for decades.

"We have no reason to believe that the effect of these losses on mothers -- the grief, sadness and disappointment -- fades with time," said Smith-Greenaway. "We can do better to shed light on the grief and trauma that is still very much alive in a population. Their bereavement matters, and until now, we haven't had a mother-centered tool that quantified it."

Credit: 
University of Southern California

There's a twist in the story of volcanism & mass extinctions, say CCNY researchers

image: The Siberian Traps, the scene of ancient volcanic eruptions 252 million years ago that led to a massive extinction of life on Earth. CCNY researchers Ellen Gales and Benjamin Black obtained samples for their study there.

Image: 
Photo B. Black and L.T. Elkins-Tanton

An emerging scientific consensus is that gases--in particular carbon gases--released by volcanic eruptions millions of years ago contributed to some of Earth's greatest mass extinctions. But new research at The City College of New York suggests that that's not the entire story.

"The key finding of our research is that carbon from massive, ancient volcanic eruptions does not line up well with the geochemical clues that tell us about how some of Earth's most profound mass extinctions occurred," said Benjamin Black, assistant professor in CCNY's Division of Science, whose expertise includes effects of volcanism on climate and mass extinctions.

The study, by Black and Ellen Gales, his M.S. student in geology and the paper's lead author, is entitled "Carbonatites as a record of the carbon isotope composition of large igneous province outgassing." It appears in the current issue of the journal "Earth and Planetary Science Letters" and is a product of Gales' thesis work.

The new data does not rule out volcanism as the culprit in driving past mass extinctions, the article points out. But it does conclude that there must have been something extra at work.

"Ellen's work is new in that scientists have previously guessed what the geochemical fingerprint of CO2 from these giant eruptions might be, but our findings are some of the first direct measurements of this fingerprint," said Black.

"Our finding challenges the idea that carbon from this kind of eruption might be special, and therefore capable of easily matching changes in the carbon cycle during mass extinctions. It also helps us understand how volcanic eruptions move carbon--a key ingredient for life and climate--around inside the Earth and between the solid Earth and the atmosphere," said Gales.

In addition, the CCNY research also offers insights into Earth's current climate. "Right now, people are releasing large quantities of CO2 into the atmosphere. In a way, we are heading into almost uncharted territory," noted Black. "This scale of CO2 release has only happened a few times in Earth's history, for example during rare, enormous volcanic eruptions like the ones we studied."

Consequently, Black pointed out, even though volcanic eruptions on the scale of these enormous volcanic provinces are not expected any time soon, understanding the environmental changes triggered by prodigious volcanic CO2 release in the deep past is important for understanding how Earth's climate could change in the coming centuries.

The researchers used samples collected from ancient volcanic eruptions including the 252-million-year-old Siberian Traps. They included data collected at Columbia University's Lamont-Doherty Earth Observatory. Lindy Elkins-Tanton at Arizona State University also contributed to the work, which received support from the National Science Foundation.

Credit: 
City College of New York

Sensory perception is not superficial brain work

image: Due to the dense folding of the cerebral cortex, the scientists had to digitally smooth it on the computer and break it down into different layers, in order to be able to precisely locate the signals.

Image: 
Remi Gau

If we cross a road with our smartphone in view, a car horn or engine noise will startle us. In everyday life we can easily combine information from different senses and shift our attention from one sensory input to another - for example, from seeing to hearing. But how does the brain decide which of the two senses it will focus attention on when the two interact? And, are these mechanisms reflected in the structure of the brain?

To answer these questions, scientists at the Max Planck Institute for Human Cognitive and Brain Sciences (MPI CBS) in Leipzig and the Computational Neuroscience and Cognitive Robotics Centre at the University of Birmingham measured how sensory stimuli are processed in the brain. In contrast to previous studies, they did not restrict their observations to the surface of the cerebral cortex. For the first time, they also measured the sensory signals at different depths in the cortex. The researchers' findings suggest that our brains conduct the multi-sensory flow of information via distinct circuits, right down to the smallest windings of this highly folded brain structure.

While the participants in their study were lying in a magnetic resonance imaging (MRI) scanner, the scientists showed them visual symbols on a screen while simultaneously playing sounds. Beforehand, the participants had been asked to explicitly focus their attention on either the audible or visible aspect of the stimuli. The neurophysicists Robert Turner, Robert Trampel and Rémi Gau then analyzed at which exact points the sensory stimuli were being processed. Two challenges needed to be overcome. "The cerebral cortex is only two to three millimeters thick. So we needed a very high spatial resolution (of less than one millimeter) during data acquisition," explains Robert Trampel, who co-directed the study at the MPI CBS. "Also, due to the dense folding of the cerebral cortex, we had to digitally smooth it and break it down into different layers, in order to be able to precisely locate the signals. This was all done on a computer, of course."

The results showed that when participants heard a sound, visual areas of their brains were largely switched off. This happened regardless of whether they focused on the audible or visible aspect of the stimuli. However, if they strongly attended to the auditory input, brain activity decreased, particularly in the regions representing the center of the visual field. Thus, it seems that sound can strongly draw our attention away from what we're looking at.

In auditory brain regions, the researchers also observed for the first time that the activity pattern across different cortical layers changed when participants were presented with only sounds. When participants received only visual input, by contrast, there was no such change. Rémi Gau sums up: "So when we have to process different sensory impressions at the same time, different neuron circuits become active, depending on what we focus our attention on. We have now been able to make these interactions visible through novel computerized experiments."

Credit: 
Max Planck Institute for Human Cognitive and Brain Sciences

It's Iron, Man: ITMO scientists found a way to treat cancer with iron oxide nanoparticles

image: Drug release inside the cell.

Image: 
The Faculty of Physics and Engineering of ITMO University

The concept is based on the interaction of resonant semiconductor iron oxide (Fe2O3) nanoparticles with light. Particles loaded with an antitumor drug are injected in vivo and accumulate in tumor areas. To release the drug non-invasively, the carrier particles have to be light-sensitive. For this purpose, the polymer containers (capsules) can be modified with iron oxide resonant semiconductor nanoparticles. When irradiated with light, the nanoparticles heat up and induce drug release. The research was published in Laser & Photonics Reviews.

Nowadays, there are anti-cancer drugs that can effectively treat malignant tumors. Regrettably, they affect not just malignant cells and tissues but also healthy ones, so new approaches to treating cancer are needed. A powerful way to overcome this barrier is to deliver drugs with micro- and nanoparticles, which make it possible to accumulate large amounts of a drug near the tumor region while keeping the systemic concentration of these highly toxic drugs in the organism as a whole to a minimum.

Another advantage of iron oxide nanoparticles is that they are not just efficient nanoheaters but also local nanothermometers. This means that the temperature can be monitored while the particles are heated, preventing the overheating of healthy cells and tissues.

"We've tested our systems in-vitro on stem and tumor cells. Stem cells were used as a model of healthy cells in the experiment and tumor cells as a model of diseased cells. As a result, the anti-tumor drug affected tumor cells as they were irradiated with a laser, and almost no toxicity was observed in healthy cells. The control cells also survived the experiment, which means that tumor cells died as a result of the drug release. This is how we created efficient light-sensitive systems for optically driven drug delivery," says Mikhail V. Zyuzin.

The drug delivery systems can also be used as local nanothermometers, which makes them cross-functional.

"In this case, nanoparticles as both converters of light into heat and a thermometer. The point here is that it is very hard to measure temperature in such small areas. For example, there are methods that make use of dyes that burn out and stop giving light at a specific temperature. But the problem is that the only thing that we can understand from that if where the temperature is higher or lower than some specific value, yes or no. We will not get any details. On the other hand, semiconductor nanoparticles can efficiently absorb light and convert it into heat. Because of that, the oscillation frequency of their crystalline lattice slightly changes, and the light starts to dissipate in a different manner. We can use these changes to tell how much did we heat the particle, as well as see this data on a spectrometer," explains George Zograf.

International team

In 2017, George Zograf, a PhD student at ITMO's Faculty of Physics and Engineering working under the guidance of Professor Sergei Makarov, published a scientific work dedicated to optically induced heating and simultaneous temperature measurement of resonant semiconductor nanoparticles. Some time later, Mikhail V. Zyuzin, a researcher at ITMO's Faculty of Physics and Engineering who specializes in biophysical research, joined the team. His help made it possible to apply the effects that George and Sergei had studied earlier to biology and medicine, namely to drug delivery.

An international team of physicists, chemists and biologists performed an interdisciplinary study of the non-invasive release of drugs encapsulated in polymer capsules under optical radiation. Scientists from ITMO University were responsible for the synthesis and optical characterization of the iron oxide nanoparticles, as well as the polymer capsules. Their French colleagues helped with the structural characterization of the synthesized materials. Chinese researchers helped to visualize the process of the release of bioactive compounds from the capsules under laser irradiation. Finally, researchers from the First Pavlov State Medical University of St. Petersburg conducted experiments on the delivery of an anti-cancer drug into primary tumor cells.

The researchers plan to continue their work and build on their current results. They intend to conduct pre-clinical trials on animals in vivo next year.

Credit: 
ITMO University

Reimagining the link between space and species could boost wildlife conservation

image: University of Kansas investigator Jorge Soberón offers a new method for ecologists to calculate the correlation between geographic space and the number of species inhabiting that space.

Image: 
KU News Service

LAWRENCE -- In the latest issue of The American Naturalist, University of Kansas investigator Jorge Soberón offers a new method for ecologists to calculate the correlation between geographic space and the number of species inhabiting that space.

"There's a problem in ecology that's been around since the 1920s called the 'species-area relationship,'" said Sobero?n, a University Distinguished Professor at the KU Biodiversity Institute and Department of Ecology & Evolutionary Biology. "Ecologists noticed that the more space you assemble, the more species you will discover. And that's a mathematical relationship."

The "problem" is that there's not a clear understanding of why the increase in geographic space is always accompanied by an increase in the numbers of species in that space.

"This observed relationship is pretty universal," Sobero?n said. "You can do it for islands or continental masses, in the tropics or in temperate regions -- wherever you do a plot of area versus number of species you get an increasing line. But why? Most explanations are based on the idea that the more individuals you sample, the more likely it is that they belong to a different species."

But Soberón's new approach to the species-area relationship is different. Rather than performing a headcount of animals (or plants) within a given geographic area, the KU researcher proposes calculating the number of "Grinnellian niches," or conditions that support the existence of species in that area.

He said the method, expressed by mathematical formulae, could offer a more accurate roll call of species.

"What I did was to apply a theory we have been developing here at KU for the last 10 years, which is called 'Grinnellian-niche theory,'" Sobero?n said. "Instead of concentrating on counting individuals, I explained this pattern in terms of the requirements of species for different climatic combinations. Different species have different climatic requirements, and if you can model that you can place the different requirements -- which are called 'niches' -- in environmental niche space. And when you increase the area you're looking at you are also increasing niche space. That's the bottom line."

According to Soberón, examples of niche requirements could include water availability or the right temperature range.

"Some species tolerate very hot weather -- some species require less hot weather," he said. "Some species can tolerate very dry environments; other species require water in different amounts. So, this combination of, say, from what lowest temperature to what highest temperature. From what driest environment to what wettest environment, from what acidity in the soil, and so on -- you derive these combinations of requirements, they're called niches."

The new approach refines the understanding of the species-area relationship because it accounts for factors beyond simply the size of an area. For instance, Soberón said the Grinnellian-niche approach can account for variations in environment, such as temperature, that occur more strongly north-to-south than east-to-west.

"Latitudinal effects are just one of the many things that conventional theories ignore," Sobero?n said. "But they play a very prominent role in the theory I've published, because it matters a lot whether you are moving, say, from Alaska to Central America, or vice versa. You encounter different species, different sets of tolerances and different niches. But in traditional theories, that doesn't matter because you just care about how much area -- the distinction is not made whether that's tropical area or temperate area or polar area. But for me that's a very major thing."

To hone his theoretical approach, Soberón performed the Grinnellian analysis for North American mammals, then compared the results with massive databases of known populations of species for the same region, subdivided into grids.

"There's an organization called the International Union for Conservation of Nature that has been convening meetings of experts from all over the world to draw the various distributions of terrestrial vertebrates -- mammals, birds, reptiles and amphibians," the KU researcher said. "We have databases which are publicly available with distributions, already established by these experts. This is the work of decades. In fact, anyone can download them. They are public. I chose to look at the mammals because, well, I am slightly familiar with mammals and I have friends that are very familiar with mammals and I could ask questions of them. So, I downloaded IUCN database, which I used to test -- and I also downloaded a GBIF (Global Biodiversity Information Facility) database, which is a huge database of observations of species. Those I used to do the calculations."

Soberón said his new approach to biogeography could aid species conservation and species management across a spectrum of commercial and ecological efforts.

"It's a matter of universal concern, what species live in a particular place," he said. "Those species may be species of conservation interest -- maybe you don't want tigers to go extinct, or gorillas. But there are also species of economic interest for people. For instance, there are plagues and diseases of crops, and you don't want to have those in your fields. There are species that transmit diseases for people, for instance, mosquitoes. So, the general problem of 'why did you find this species in a particular place?' is important for people. We care about certain species, and we don't want to have other species near us. And this combination of the ones that we'd like to keep, and the ones that we don't want to be close to us, is an instance of this general problem that I am dealing with in the paper."

Credit: 
University of Kansas

The human brain's meticulous interface with the bloodstream now on a precision chip

image: The blood-brain barrier on a chip is as small as many organs on chips, but it gives astrocytes lots of room to unfold in 3D.

Image: 
Georgia Tech / YongTae Kim lab

A scrupulous gatekeeper stands between the brain and its circulatory system to let in the good and keep out the bad, but this porter, called the blood-brain barrier, also blocks trial drugs for diseases like Alzheimer's or cancer from getting into the brain.

Now a team led by researchers at the Georgia Institute of Technology has engineered a way of studying the barrier more closely with the intent of helping drug developers do the same. In a new study, the researchers cultured the human blood-brain barrier on a chip, recreating its physiology more realistically than predecessor chips.

The new chip provides a healthy environment for the barrier's central component, a brain cell called the astrocyte, which is not a neuron but acts as the neurons' intercessor with the circulatory system. In the human brain, astrocytes interface with cells in the vasculature called endothelial cells, collaborating with them to form the blood-brain barrier.

But astrocytes are particularly fussy partners, which makes them a great part of the gatekeeper system but also challenging to culture in a physiologically accurate manner. The new chip catered to astrocytes' sensibilities by culturing them in 3D rather than in a flat layer, or 2D.

The 3D space allowed the astrocytes to act more naturally, and this improved the whole barrier model by also allowing the cultured endothelial cells to function better. The new chip gave researchers more healthy blood-brain barrier functions to observe than previous barrier models did.

'Astro' in astrocyte

"You need to be able to closely mimic a tissue on a chip in a healthy status and in homeostasis. If we can't model the healthy state, we can't really model disease either, because we have no accurate control to measure it against," said YongTae Kim, an associate professor in Georgia Tech's George W. Woodruff School of Mechanical Engineering and the study's principal investigator.

In the new chip, the astrocytes even looked more natural in the 3D space, unfolding into the star-like shape that gives them their "astro" name. In 2D cultures, by contrast, astrocytes looked like fried eggs with fringes. With this 3D setting, the chip adds possibilities for reliable research on the human blood-brain barrier, for which alternatives are currently few.

"No animal model comes close enough to the intricate function of the human blood-brain barrier. And we need better human models because experimental drugs that have successfully entered animal brains have failed at the human barrier," Kim said.

The team published its results on January 10, 2020, in the journal Nature Communications. The research was funded by the National Institutes of Health. Kim has founded a company with plans to mass-produce the new chip in the future for use in academic and potentially pharmaceutical research.

Choosy, bossy astrocytes

The brain is the only part of the body outfitted with astrocytes, which regulate nourishment uptake and waste removal in their own, unique way.

"Upon the brain's request, astrocytes collaborate with the vasculature in real-time what the brain needs and opens its gates to let in only that bit of water and nutrients. Astrocytes go to get just what the brain needs and don't let much else in," Kim said.

Astrocytes form a protein structure called aquaporin-4 in the parts of their membranes that are in contact with the vasculature; it lets water molecules in and out, which also contributes to clearing waste from the brain.

"In previous chips, aquaporin-4 expression was not observed. This chip was the first," Kim said. "This could be important in researching Alzheimer's disease because aquaporin-4 is important to clearing broken-down junk protein out of the brain."

One of the study's co-authors, Dr. Allan Levey from Emory University, a highly cited researcher in neurological medicine, is interested in the chip's potential in tackling Alzheimer's. Another, Dr. Tobey McDonald, also of Emory, researches pediatric brain cancer and is interested in the chip's possibilities in studying the delivery of potential brain cancer treatments.

Barrier acting healthy

Astrocytes also gave signs that they were healthier in the chip's 3D cultures than in 2D cultures by expressing less of a gene triggered by pathology.

"Astrocytes in 2D culture expressed significantly higher levels of LCN2 than those in 3D. When we cultured in 3D, it was only about one fourth as much," Kim said.

The healthier state also made astrocytes better able to show an immune reaction.

"When we purposely confronted the astrocyte with pathological stress in a 3D culture, we got a clearer reaction. In 2D, the ground state was already less healthy, and then the reaction to pathological stresses did not come across so clearly. This difference could make the 3D culture very interesting for pathology studies."

Nanoparticle delivery

In testing related to drug delivery, nanoparticles moved through the blood-brain barrier after engaging endothelial cell receptors, which caused these cells to engulf the particles and then transport them to what, in a natural setting, would be the inside of the brain. This is part of how the endothelial cells worked better when connected to astrocytes cultured in 3D.

"When we inhibited the receptor, the majority of nanoparticles wouldn't make it in. That kind of test would not work in animal models because of cross-species inaccuracies between animals and humans," Kim said. "This was an example of how this new chip can let you study the human blood-brain barrier for potential drug delivery the way you can't in animal models."

Credit: 
Georgia Institute of Technology