Tech

Damaged hearts rewired with nanotube fibers

image: Rice University Professor Matteo Pasquali, left, and Dr. Mehdi Razavi of the Texas Heart Institute check a thread of carbon nanotube fiber invented in Pasquali's Rice lab. They are collaborating on a method to use the fibers as electrical bridges to restore conductivity to damaged hearts.

Image: 
Texas Heart Institute

Thin, flexible fibers made of carbon nanotubes have now proven able to bridge damaged heart tissues and deliver the electrical signals needed to keep those hearts beating.

Scientists at Texas Heart Institute (THI) report they have used biocompatible fibers invented at Rice University in studies that showed sewing them directly into damaged tissue can restore electrical function to hearts.

"Instead of shocking and defibrillating, we are actually correcting diseased conduction of the largest major pumping chamber of the heart by creating a bridge to bypass and conduct over a scarred area of a damaged heart," said Dr. Mehdi Razavi, a cardiologist and director of Electrophysiology Clinical Research and Innovations at THI, who co-led the study with Rice chemical and biomolecular engineer Matteo Pasquali.

"Today there is no technology that treats the underlying cause of the No. 1 cause of sudden death, ventricular arrhythmias," Razavi said. "These arrhythmias are caused by the disorganized firing of impulses from the heart's lower chambers and are challenging to treat in patients after a heart attack or with scarred heart tissue due to such other conditions as congestive heart failure or dilated cardiomyopathy."

Results of the studies on preclinical models appear as an open-access Editor's Pick in the American Heart Association's Circulation: Arrhythmia and Electrophysiology. The association helped fund the research with a 2015 grant.

The research springs from the pioneering 2013 invention by Pasquali's lab of a method to make conductive fibers out of carbon nanotubes. The lab's first threadlike fibers were a quarter of the width of a human hair, but contained tens of millions of microscopic nanotubes. The fibers are also being studied for electrical interfaces with the brain, for use in cochlear implants, as flexible antennas and for automotive and aerospace applications.

The experiments showed the nontoxic, polymer-coated fibers, with their ends stripped to serve as electrodes, were effective in restoring function during monthlong tests in large preclinical models as well as rodents, whether the initial conduction was slowed, severed or blocked, according to the researchers. The fibers served their purpose with or without the presence of a pacemaker, they found.

In the rodents, they wrote, conduction disappeared when the fibers were removed.

"The reestablishment of cardiac conduction with carbon nanotube fibers has the potential to revolutionize therapy for cardiac electrical disturbances, one of the most common causes of death in the United States," said co-lead author Mark McCauley, who carried out many of the experiments as a postdoctoral fellow at THI. He is now an assistant professor of clinical medicine at the University of Illinois College of Medicine.

"Our experiments provided the first scientific support for using a synthetic material-based treatment rather than a drug to treat the leading cause of sudden death in the U.S. and many developing countries around the world," Razavi added.

Many questions remain before the procedure can move toward human testing, Pasquali said. The researchers must establish a way to sew the fibers in place using a minimally invasive catheter, and make sure the fibers are strong and flexible enough to serve a constantly beating heart over the long term. He said they must also determine how long and wide fibers should be, precisely how much electricity they need to carry and how they would perform in the growing hearts of young patients.

"Flexibility is important because the heart is continuously pulsating and moving, so anything that's attached to the heart's surface is going to be deformed and flexed," said Pasquali, who has appointments at Rice's Brown School of Engineering and Wiess School of Natural Sciences.

"Good interfacial contact is also critical to pick up and deliver the electrical signal," he said. "In the past, multiple materials had to be combined to attain both electrical conductivity and effective contacts. These fibers have both properties built in by design, which greatly simplifies device construction and lowers risks of long-term failure due to delamination of multiple layers or coatings."

Razavi noted that while there are many effective antiarrhythmic drugs available, they are often contraindicated in patients after a heart attack. "What is really needed therapeutically is to increase conduction," he said. "Carbon nanotube fibers have the conductive properties of metal but are flexible enough to allow us to navigate and deliver energy to a very specific area of a delicate, damaged heart."

Credit: 
Rice University

A leap forward in kidney disease research: Scientists develop breakthrough in vitro model

image: Fluorescent image showing natural filter formed by human kidney cells in the model developed by Dr. Laura Perin and Dr. Stefano Da Sacco of Children's Hospital Los Angeles. Human podocytes appear green and glomerular endothelial cells appear red.

Image: 
Dr. Perin and Dr. Da Sacco of Children's Hospital Los Angeles

Kidneys constantly filter blood and remove toxins from the body. Conditions such as chronic kidney disease (CKD) are characterized by a reduced ability to perform this essential function. CKD incidence is growing, and more than 1.4 million individuals depend on dialysis or a kidney transplant for survival. Developing new treatments requires an understanding of the mechanisms of disease progression, but scientists have not been able to accurately model kidney filtration in vitro - until now.

In a landmark study published in Nature Communications, scientists at Children's Hospital Los Angeles demonstrate an in vitro kidney model that could change the course of research for diseases like CKD.

The kidney contains specialized structures called glomeruli. Within each glomerulus is a filtration barrier made up of two thin layers of highly specialized cells and a membrane that acts as a selective filter. As blood moves through each glomerulus, toxins and small molecules can pass through, while proteins and other important components are kept in the bloodstream. "This filtration process breaks down in patients with kidney disorders," explains Laura Perin, PhD, who is co-senior author on the study along with Stefano Da Sacco, PhD. "But because we haven't had a good in vitro model, we still don't know the mechanisms of injury to the glomerulus in CKD."

Dr. Perin and Dr. Da Sacco conduct research in the GOFARR Laboratory for Organ Regenerative Research and Cell Therapeutics in Urology along with co-director Roger De Filippo, MD, at CHLA's Saban Research Institute. The lead author on the study was CHLA postdoctoral research fellow Astgik Petrosyan. Together, the team studies the structure of the glomerulus to better understand how and why its ability to filter blood breaks down.

"A big challenge in the kidney research field has been trying to replicate the glomerulus in vitro," says Dr. Da Sacco. "In particular, the glomerular filtration barrier is very difficult to recreate in a lab using standard techniques." Because of this, most published studies have used an artificial membrane between the two cell layers. While fluid can be exchanged, the cells cannot communicate across this membrane in the same way they do biologically. "This results in a model that doesn't really filter properly," he explains.

The critical component missing from current experiments is a filter that is selective and allows proper cell-to-cell communication. Dr. Da Sacco and Dr. Perin set out to grow healthy kidney cells in a way that allowed the natural glomerular barrier to form, just as it does in the body. Using specialized, compartmented containers called OrganoPlates™, the investigators did exactly that.

The result?

A model glomerulus that functions nearly identically to that found in real kidneys. They are calling this model, which is derived entirely from healthy, human kidney tissue, a glomerulus on a chip.

On one side of the cells, investigators add fluid and, on the other side, they collect what the 'glomerulus' filters, which is called the filtrate. In their experiment, the scientists added blood serum from healthy individuals. Without the use of a manufactured filter, the team's in vitro glomerulus behaved as a human kidney would: proteins remained in the serum while smaller molecules passed into the filtrate. "The barrier that our cells naturally formed is selective, just as it would be in a fully-functioning kidney," says Dr. Da Sacco. "It is remarkable."

This model represents a substantial leap forward from the current standard of in vitro kidney research. "Our system behaves like a biologically, physiologically correct glomerulus," says Dr. Perin. "This opens up the door for us to understand what we still don't know - the molecular mechanisms of injury in CKD and, more importantly, how to prevent damage."

While this seemed a distant goal in the past, Dr. Da Sacco and Dr. Perin are already recreating and studying the disease state in their model. When the investigators added serum from patients with CKD, they found that the glomerulus exhibited the same type of damage observed clinically: proteins began to leak through the compromised filter. Protein levels measured in the experimental filtrate matched patient clinical filtrate samples with a correlation of approximately 90%.
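The roughly 90% agreement quoted above is a correlation between protein levels measured in the experimental filtrate and in patients' clinical samples. As an illustration of how such a figure is computed (the numbers below are invented for the example, not the study's data), a Pearson correlation in plain Python:

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)

# Hypothetical protein levels in model filtrate vs. clinical filtrate
model    = [12.1, 30.4, 45.0, 18.3, 60.2]
clinical = [11.5, 28.9, 47.1, 20.0, 57.8]
print(pearson_r(model, clinical))
```

A correlation near 1.0 means the model's filtrate tracks the clinical measurements closely; a value around 0.9 is the level of agreement the study reports.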

This breakthrough paves the way for numerous clinical applications. In the burgeoning era of personalized medicine, a preparation such as this can be used to examine molecular mechanisms of kidney damage in individual patients. Disease progression can then be monitored over time using serial blood sampling. The model could also be used for screening new drugs prior to human clinical testing.

Credit: 
Children's Hospital Los Angeles

Study: Naltrexone to treat opioid use disorder during pregnancy, favorable for mom, baby

BOSTON - Infants born to mothers taking naltrexone to treat opioid use disorder during pregnancy developed no signs of neonatal opioid withdrawal syndrome (NOWS) during their hospitalization, a new study shows. In comparison to infants of mothers taking buprenorphine during pregnancy, infants exposed to naltrexone had shorter hospital stays, and mothers reported no use of other opioids during their pregnancy. Led by researchers at Boston Medical Center's Grayken Center for Addiction, the findings provide important preliminary data on the outcomes for both mother and baby when naltrexone is used to treat opioid use disorder during pregnancy.

Neonatal opioid withdrawal syndrome (NOWS) affects infants exposed to opioids in utero. With the increase in both unprescribed and prescribed opioid use in the U.S., there has been a five-fold increase in the rate of NOWS over the past decade. Approximately 50 to 80 percent of opioid-exposed infants have required medication to manage their withdrawal symptoms, which usually appear two to three days after birth. These can include irritability, trouble eating and sleeping, diarrhea, muscle rigidity, and difficulty soothing.

Naltrexone, buprenorphine and methadone are three FDA-approved medications prescribed to treat opioid use disorder. However, there is limited data on the safety and efficacy of naltrexone during pregnancy given its relatively short time on the market. This study, published in Clinical Therapeutics, followed mother-infant dyads during their pregnancy and after delivery at one academic medical center between 2017 and 2019. Six of the mothers were taking naltrexone to treat their opioid use disorder before and during their pregnancy, and 12 were taking buprenorphine. The two groups were monitored and compared based on participants' opioid use via urine toxicology reports, provider reports during pregnancy and six months post-delivery, delivery outcomes, gestational age, birth weight, APGAR scores, NICU admission, and NOWS outcomes (diagnosis, pharmacologic treatment, total hospital length of stay). Maternal demographics were also compared between all participants.

The infants born to women taking naltrexone showed no withdrawal symptoms during their initial hospitalization, compared to 92 percent of infants born to women taking buprenorphine. Of those showing symptoms, 46 percent required medication to treat their withdrawal symptoms. Additionally, 83 percent of women in the naltrexone cohort breastfed, without any immediate issues in the perinatal period.

"While these study results are preliminary, the outcomes we observed for both mother and baby when naltrexone is used to treat opioid use disorder during pregnancy are promising," said Elisha Wachman, MD, a neonatologist at BMC and the study's corresponding author.

Previous research has compared methadone and buprenorphine as standards of care for pregnant women with OUD. That work found buprenorphine was associated with lower risk of preterm birth, greater birth weight, larger head circumference, and less severe NOWS compared with methadone. Women treated with buprenorphine also had better compliance with prenatal care and less unprescribed substance use during pregnancy. The use of naltrexone during pregnancy is relatively new given that it was FDA-approved in 2010.

"Our findings support the need for a larger multi-center study examining the long-term maternal and child safety and efficacy outcomes of naltrexone during pregnancy," said Wachman, who also is an associate professor of pediatrics at Boston University School of Medicine. "If those studies yield positive outcomes for both mother and baby, continuing women on naltrexone during their pregnancy could be another safe approach to treat opioid use disorder."

Another important finding in the study was that women taking naltrexone received prenatal care later during their pregnancy than women taking buprenorphine, which could indicate that patients and/or providers are not clear on its safety during pregnancy. In addition, the authors note that the most important aspect of treating opioid use disorder during pregnancy is keeping the mothers stable on their medication to decrease any risk of relapse.

Credit: 
Boston Medical Center

New water-beetle species show biodiversity still undiscovered in at-risk South American habitats

image: University of Kansas doctoral student Jennifer Girón performs fieldwork in South America that resulted in descriptions of 17 new species of aquatic beetles.

Image: 
Andrew Short

LAWRENCE -- Researchers from the University of Kansas have described three genera and 17 new species of water scavenger beetles from the Guiana and Brazilian Shield regions of South America, areas seen as treasure houses of biodiversity. The beetles, from French Guiana, Suriname, Brazil, Guyana and Venezuela, were discovered through fieldwork and by combing through entomological collections at the Smithsonian Institution and KU.

The beetles are described in a new paper in ZooKeys, a peer-reviewed journal.

Lead author Jennifer Girón, a KU doctoral student in ecology & evolutionary biology and the Division of Entomology at KU's Biodiversity Institute, said the new species hint at vast biodiversity left to be described in regions where resource-extraction operations today are destroying huge swaths of natural habitat.

"The regions we've been working on, like Venezuela and Brazil, are being degraded by logging and mining," she said. "Eventually, they're going to be destroyed, and whatever lives there is not going to be able to survive. At this point, we don't even know what's there -- there are so many different kinds of habitats and so many different resources. The more we go there, and the more we keep finding new species, the more we realize that we know next to nothing about what's there."

According to Girón and co-author Andrew Short, associate professor of ecology & evolutionary biology at KU, fieldwork and taxonomic work on Acidocerinae (a subfamily of the family Hydrophilidae of aquatic beetles) during the past 20 years have exposed "an eye-opening diversity of lineages and forms resulting in the description of seven of the 11 presently recorded genera since 1999."

The KU researchers said the three new genera they've now added to Acidocerinae may have remained obscure until now because many of the species inhabit seepages -- areas where groundwater rises to the surface through mud or flows over rocks near rivers or streams.

Girón and Short discovered some of the new species during a field trip to Suriname.

"I have only been to one of the expeditions there," Girón said. "Before that, I had no experience collecting aquatics. But Andrew (Short) has been to those places many times. It's very remote, in the heart of the jungle. We went four hours in a bus and then four more hours in a boat up the river. There is a field station for researchers to go and stay for a few days there. We looked for the beetles along the river, forest streams and also in seepages."

During their fieldwork, Girón and Short, along with a group of KU students, sought the seepages that were rich hunting grounds for acidocerine aquatic beetles.

"If you're along a big river, you're not as likely to find them," Girón said. "You have to find places where there's a thin layer of running water or small pools on rocks. They're more common around places with exposed rock, like a rock outcrop or a cascade. These habitats have been traditionally overlooked because when you think of collecting aquatic beetles or aquatic insects in general, you think of rivers or streams or ponds or things like that -- you usually don't think about seepages as places where you would find beetles. So usually you don't go there. It's not that these aquatic beetles are especially rare or hard to find. It's more like people usually don't collect in these habitats."

Girón said the descriptions of the new aquatic beetles also underscore the usefulness of museum collections to ongoing scientific research in biodiversity.

"It's important to highlight the value of collections," she said. "Without specimens housed in collections, it would be impossible to do this kind of work. Nowadays, there has been some controversy about whether it is necessary to collect specimens and deposit them in collections in order to describe new species. Every person that has ever worked with collections will say, 'Yes, we definitely need to maintain specimens accessible in collections.' But there are recent publications where authors essentially just add a picture of one individual to their description without actual specimens deposited in collections, and that can be enough for them to publish a description. The problem with that is there would be no reference specimens for detailed comparisons in the future. For people who do taxonomic work and need to compare many specimens to define the limits of different species, one photo is not going to be enough."

To differentiate and classify the new species, Girón and Short focused on molecular data as well as a close examination of morphology, or the bodies of the aquatic beetles.

"This particular paper is part of a bigger research effort that aims to explain how these beetles have shifted habitats across the history of the group," Girón said. "It seems like habitat has caused some morphological differences. Many aquatic beetles that live in the same habitats appear very similar to each other -- but they're not necessarily closely related. We've been using molecular techniques to figure out relationships among species and genera in the group."

Girón, who grew up in Colombia and earned her master's degree in Puerto Rico, said she hoped to graduate with her KU doctorate in the coming academic year. After that, she will continue her appointments as research associate and acting collections manager at the Natural Science Research Laboratory of the Museum of Texas Tech University.

Credit: 
University of Kansas

What's more powerful, word-of-mouth or following someone else's lead?

Key Takeaways:

Researchers studied users of an online anime platform that provided individual-level data on users' friendship networks, anime watching behaviors, forum posts, and ratings.

Word-of-mouth and the example of others (i.e., others' decisions about what to watch, or following someone else's lead) are two of the most powerful sources in social learning.

Word-of-mouth referrals from the platform community are more influential than following the example of a trusted or admired friend from one's personal network, or than watching popular hits among members of the platform community.

CATONSVILLE, MD, August 13, 2019 - Researchers from the University of Pittsburgh, UCLA and the University of Texas published new research in the INFORMS journal Marketing Science that reveals the power of word-of-mouth in social learning, even when compared to the power of following the example of someone we trust or admire. The same research found, however, that both word-of-mouth and following someone else's lead are two of the most powerful dynamics in influencing others through social learning.

The study to be published in an upcoming edition of the INFORMS journal Marketing Science is titled "Word of Mouth, Observed Adoptions, and Anime-Watching Decisions: The Role of the Personal vs. the Community Network," and is authored by Mina Ameri of the University of Pittsburgh's Katz Graduate School of Business; Elisabeth Honka of the Anderson School of Management at UCLA; and Ying Xie of the University of Texas at Dallas.

For empirical analysis, the researchers studied an online anime (Japanese cartoon) platform, MyAnimeList.net, which provided individual-level data on users' friendship networks, product adoptions, forum posts and ratings of anime series. The website serves as a gathering place for anime fans to share their enthusiasm and exchange opinions about anime series.

The researchers specifically explored how users responded to word-of-mouth and to the anime-watching decisions of other users based on observational data; they called the latter dynamic "observed adoptions." They further distinguished between these dynamics at two network levels: those from users' own personal friends (the personal network) and those from all other members of the community (the community network).

They found that word-of-mouth referrals from the community network are the largest driver among the social learning forces they studied. And while word-of-mouth was more powerful than observed adoptions, both factors are significant in influencing social learning.

"While both word-of-mouth and observed adoptions are highly influential in affecting a person's social learning, our results show that each provide unique and different information that individuals use in their decision-making," said Ameri. "Ultimately, we found that a person's community network is the primary source of information driving anime watching decisions and behaviors."

Credit: 
Institute for Operations Research and the Management Sciences

Exposure to outdoor air pollutants, change in emphysema, lung function

Bottom Line: Whether exposure to outdoor air pollutants is associated with emphysema progression and change in lung function was the focus of this observational study. The study included 7,071 participants from the Multi-Ethnic Study of Atherosclerosis studies conducted in six U.S. metropolitan regions (New York; Los Angeles; Chicago; Baltimore; Winston-Salem, North Carolina; and St. Paul, Minnesota). Computed tomographic (CT) scans were used to assess changes in emphysema (measured for density as a percentage of lung pixels) and lung function testing was done. Levels of outdoor air pollutants (ozone, fine particulate matter, oxides of nitrogen and black carbon) at each participant's home were estimated. Study authors report exposure to ambient air pollutants, especially ozone, was associated with increasing emphysema progression based on up to five repeated CT scans over 10 years. Ambient ozone exposure, but not the other pollutants, also was associated with decline in lung function. Limitations of the study include that outdoor air pollutant concentrations may not reflect everything about an individual's air pollutant exposures, and outdoor concentrations don't explain all of the variations in concentrations of pollutants indoors, where most people spend the majority of their time.

Authors: Joel D. Kaufman, M.D., M.P.H., University of Washington, Seattle, and coauthors

(doi:10.1001/jama.2019.10255)

Editor's Note: The article includes conflict of interest and funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.

Credit: 
JAMA Network

NASA finds Henriette fading

image: On August 13, 2019 at 1:30 a.m. EDT (0530 UTC), the MODIS instrument that flies aboard NASA's Terra satellite showed that the strongest thunderstorms in Tropical Depression Henriette were fragmented. Coldest cloud top temperatures were as cold as minus 50 degrees Fahrenheit (minus 45.5 Celsius).

Image: 
NASA/NRL

Infrared imagery from NASA's Terra satellite found just a few scattered areas of cold clouds in the Eastern Pacific Ocean's Tropical Depression Henriette on August 13.

NASA's Terra satellite uses infrared light to analyze the strength of storms by providing temperature information about the system's clouds. The strongest thunderstorms that reach high into the atmosphere have the coldest cloud top temperatures.

On August 13 at 1:30 a.m. EDT (0530 UTC), the Moderate Imaging Spectroradiometer or MODIS instrument that flies aboard NASA's Terra satellite gathered infrared data on Henriette.

MODIS found just a few scattered areas of cold clouds in thunderstorms in the depression. Those thunderstorms had cloud top temperatures as cold as minus 50 degrees Fahrenheit (minus 45.5 Celsius).
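The Fahrenheit-to-Celsius conversions that recur throughout these satellite reports are easy to verify; a minimal sketch using the standard formula (nothing here is specific to MODIS):

```python
def f_to_c(deg_f):
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (deg_f - 32) * 5 / 9

# The minus 50 F cloud tops reported above:
print(round(f_to_c(-50), 2))  # -45.56, i.e. roughly the minus 45.5 C quoted
```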

NOAA's National Hurricane Center (NHC) noted that Henriette was weakening quickly. At 5 a.m. EDT (0900 UTC), the center of Tropical Depression Henriette was located near latitude 21.1 degrees north and longitude 115.2 degrees west. That's about 360 miles (580 km) west-southwest of the southern tip of Baja California, Mexico. The depression is moving toward the west-northwest near 13 mph (20 km/h) and this general motion should continue through tonight.

Maximum sustained winds have decreased to near 30 mph (45 km/h) with higher gusts.

The estimated minimum central pressure is 1008 mb (29.77 inches).

NHC said, "Additional weakening is expected during the next 24 hours, and the depression is forecast to degenerate into a remnant low later today [Aug. 13, 2019]."

Credit: 
NASA/Goddard Space Flight Center

Through the kidneys to the exit

image: Fig. 1. Kidney structure

Image: 
© NUST MISIS

Scientists at the National University of Science and Technology "MISIS" (NUST MISIS) have identified a new mechanism for removing magnetic nanoparticles through the kidneys, which will help to create more effective and safe drugs. The results of the study are published in the Journal of Controlled Release.

Developing drug-carrying nanoparticles that can accumulate in target organs and be safely metabolized remains a major scientific problem. The distribution and metabolism of nanoparticles are determined by many factors, in particular their size, composition, surface charge, and coating.

"The combination of such methods as atomic emission spectroscopy, fluorescence microscopy, and magnetic resonance imaging revealed the rapid accumulation of magnetic nanoparticles in the kidneys. Moreover, intravital microscopy made it possible to track in real-time the transportation of nanoparticles from the blood into the renal clearance within an hour after administration. Two hours later, with the help of transmission electron microscopy the magnetic nanoparticles were detected in the urine of animals," said one of the study authors, Maxim Abakumov, head of Laboratory of Biomedical Nanomaterials at NUST MISIS.

The results suggest that it is possible to transport nanoparticles across the endothelial barrier directly into the renal tubule, bypassing the glomerular filter of the kidney.

The study has shown the fundamental possibility of creating magnetic nanoparticles that are excreted through the kidneys, reducing the total dose and side effects, rather than accumulating in the liver for several weeks until completely dissolved. The scientists plan to evaluate the optimal surface design and structure of magnetic nanoparticles to increase the efficiency of renal excretion.

Credit: 
National University of Science and Technology MISIS

Do internal medicine residents feel bullied during training?

What The Study Did: This research letter uses survey data to report on perceived bullying by internal medicine residents during training.

Authors: Scott M. Wright, M.D., of the Johns Hopkins Bayview Medical Center in Baltimore, is the corresponding author.

(doi:10.1001/jama.2019.8616)

Editor's Note: The article contains funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.


Credit: 
JAMA Network

Air pollution can accelerate lung disease as much as a pack a day of cigarettes

Air pollution--especially ozone air pollution which is increasing with climate change--accelerates the progression of emphysema of the lung, according to a new study led by the University of Washington, Columbia University and the University at Buffalo.

While previous studies have shown a clear connection of air pollutants with some heart and lung diseases, the new research published Aug. 13 in JAMA demonstrates an association between long-term exposure to all major air pollutants--especially ozone--with an increase in emphysema seen on lung scans. Emphysema is a condition in which destruction of lung tissue leads to wheezing, coughing and shortness of breath, and increases the risk of death.

"We were surprised to see how strong air pollution's impact was on the progression of emphysema on lung scans, in the same league as the effects of cigarette smoking, which is by far the best-known cause of emphysema," said the study's senior co-author, Dr. Joel Kaufman, UW professor of environmental and occupational health sciences and epidemiology in the School of Public Health.

In fact, the researchers found that if the ambient ozone level where you live was 3 parts per billion higher than in another location over 10 years, that difference was associated with an increase in emphysema roughly equivalent to smoking a pack of cigarettes a day for 29 years. The study also determined that ozone levels in some major U.S. cities are rising by that amount, due in part to climate change. Annual average ozone levels in the study areas were between about 10 and 25 ppb.

"Rates of chronic lung disease in this country are going up and increasingly it is recognized that this disease occurs in nonsmokers," said Kaufman, also a professor of internal medicine and a physician at UW School of Medicine. "We really need to understand what's causing chronic lung disease, and it appears that air pollution exposures that are common and hard to avoid might be a major contributor."

The results are based on an extensive, 18-year study involving more than 7,000 people and a detailed examination of the air pollution they encountered between 2000 and 2018 in six metropolitan regions across the U.S.: Chicago; Winston-Salem, North Carolina; Baltimore; Los Angeles; St. Paul, Minnesota; and New York. The participants were drawn from the Multi-Ethnic Study of Atherosclerosis (MESA) Air and Lung studies.

"To our knowledge, this is the first longitudinal study to assess the association between long-term exposure to air pollutants and progression of percent emphysema in a large, community-based, multi-ethnic cohort," said first author Meng Wang, an assistant professor of epidemiology and environmental health at the University at Buffalo who conducted the research as a postdoctoral researcher at the UW.

The authors developed novel and accurate methods for assessing air pollution exposure at the homes of study participants, collecting detailed measurements over many years in these metropolitan regions, including at the homes of many of the participants. This work in the MESA Air study was led at the University of Washington. While most of the airborne pollutants are in decline because of successful efforts to reduce them, ozone has been increasing, the study found. Ground-level ozone is mostly produced when ultraviolet light reacts with pollutants from fossil fuels.

"This is a big study with state-of-the-art analysis of more than 15,000 CT scans repeated on thousands of people over as long as 18 years. These findings matter since ground-level ozone levels are rising, and the amount of emphysema on CT scans predicts hospitalization from and deaths due to chronic lung disease," said Dr. R. Graham Barr, professor of medicine and epidemiology at Columbia University who led the MESA Lung study and is a senior author of the paper.

"As temperatures rise with climate change," Barr explained, "ground-level ozone will continue to increase unless steps are taken to reduce this pollutant. But it's not clear what level of the air pollutants, if any, is safe for human health."

Emphysema was measured from CT scans that identify holes in the small air sacs of the participants' lungs, and lung function tests, which measure the speed and amount of air breathed in and out.

"This study adds to growing evidence of a link between air pollution and emphysema. A better understanding of the impact of pollutants on the lung could lead to more effective ways of preventing and treating this devastating disease," said James Kiley, director of the Division of Lung Diseases at the National Heart, Lung, and Blood Institute, part of the National Institutes of Health.

"It's important that we continue to explore factors that impact emphysema," Kiley added, "particularly in a large, well-characterized multi-ethnic group of adults such as those represented by MESA."

Credit: 
University of Washington

Growth of wind energy points to future challenges, promise

image: Advances in adapting the technology and better methods for predicting wind conditions have fanned significant growth of the use of wind turbines for electricity in the last 40 years. A new report, in Applied Physics Reviews, takes stock of where the field is now and what lies ahead. Researchers surveyed the growth of wind technology as a source of renewable energy and assessed its viability for continuing to capture larger shares of the electricity market. This image shows the growth in wind energy solutions, with larger, more efficient wind turbines over the decades.

Image: 
Fraunhofer IWES

WASHINGTON, D.C., August 13, 2019 -- Advances in adapting the technology for cold climates and offshore use and better methods for predicting wind conditions have fanned significant growth of the use of wind turbines for electricity in the last 40 years. A new report takes stock of where the field is now and what lies ahead.

A team of researchers from Germany has published its findings in Applied Physics Reviews, from AIP Publishing, surveying the growth of wind technology as a source of renewable energy and assessing its viability for continuing to capture larger shares of the electricity market. The report notes dramatic improvements in the technology, thanks in part to economies of scale, and foresees even larger, more cost-effective turbines in the future.

"The size of a state-of-the-art turbine is extremely impressive. The swept area of the rotor of a standard turbine is now twice the size of a football field," said Berthold Hahn, one of the authors of the paper. "In parallel to the development in size, the technology has also become mature, meaning cost-effective and reliable."

Since the 1970s, wind turbines have improved to generate at least 100 times more power than their predecessors. Current large turbines each have the capacity to generate roughly 5 megawatts. Hahn said market expectations for future turbine capacity reach 10 to 15 megawatts. Some of these larger turbines might have rotor diameters of up to 200 meters.
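The quoted sizes can be sanity-checked with the swept-area formula A = pi * d^2 / 4, which also explains why power capture grows so quickly with rotor diameter. The 135-metre rotor below is an assumed figure for a current standard turbine, chosen only to match the "twice a football field" comparison:

```python
import math

def swept_area(rotor_diameter_m: float) -> float:
    """Area swept by the rotor (m^2); power capture scales with this area."""
    return math.pi * (rotor_diameter_m / 2) ** 2

# A soccer pitch is roughly 7,140 m^2. An assumed 135 m rotor sweeps about
# twice that, matching the "twice a football field" comparison in the article.
print(round(swept_area(135)))  # 14314
# A future 200 m rotor would sweep over four football fields.
print(round(swept_area(200)))  # 31416
```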

During this time, the cost to produce electricity from wind has plummeted from $500 per megawatt-hour to $50.

"The technical developments, like floating offshore turbines enabling the harvest of wind energy in very deep waters, integrated control strategies considering the needs of the grid, and artificial intelligence permanently assessing the performance of the turbines, have contributed to the impressive cost reductions," Hahn said.

Even so, the wind power industry faces continued pressure to reduce costs. The report identified a key area: finding more economical ways to maintain the turbines. Work that draws together historical maintenance data and real-time measurements from turbines is needed to detect turbine failures earlier.

With improvements to other parts of the power grid, such as power storage, Hahn foresees wind technology becoming crucial for responding to the electricity supply and demand volatility that markets experience.

"In many countries, wind energy has started to take over tasks of stabilizing the grid from large conventional plants, meaning that the energy systems are now eventually changing from a mainly centralized structure to a decentralized one," he said.

Credit: 
American Institute of Physics

Platform for lab-grown heart cells lets researchers examine functional effects of drugs

image: The human heart's energy needs and functions are difficult to reproduce in other animals; one new system looks to circumvent these issues and provide a functional view of how different treatments can help ailing cells in the heart following oxygen and nutrient deprivations. Researchers have unveiled a new silicon chip that holds human lab-grown heart muscle cells for assessing the effectiveness of new drugs. They discuss their work in this week's APL Bioengineering. This image shows a human in vitro platform for evaluating pharmacologic strategies in cardiac ischemia.

Image: 
Carlota Oleaga

WASHINGTON, D.C., August 13, 2019 -- Animal models provide benefits for biomedical research, but translating such findings to human physiology can be difficult. The human heart's energy needs and functions are difficult to reproduce in other animals, such as mice and rats. One new system looks to circumvent these issues and provide a functional view of how different treatments can help ailing cells in the heart following oxygen and nutrient deprivations.

Researchers have unveiled a new silicon chip that holds human lab-grown heart muscle cells for assessing the effectiveness of new drugs. The system includes heart cells, called cardiomyocytes, patterned on the chip with electrodes that can both stimulate and measure electrical activity within the cells. The researchers discuss their work in this week's APL Bioengineering, from AIP Publishing.

These capabilities provide a way for determining how the restriction of blood supply, a dangerous state known as ischemia, changes a heart's conduction velocity, beat frequency and important electrical intervals associated with heart function.
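As a rough illustration of one of these quantities (not the chip's actual readout code), conduction velocity can be estimated by dividing the spacing between two electrodes by the delay between their activation times. The spacing and timing values below are invented:

```python
# Hypothetical sketch: estimating conduction velocity from two electrodes.
# An electrical wavefront passes the upstream electrode, then the downstream
# one; the velocity is the electrode spacing divided by the arrival delay.
def conduction_velocity(spacing_mm: float, delay_ms: float) -> float:
    """Conduction velocity in mm/ms (numerically equal to m/s)."""
    return spacing_mm / delay_ms

# Electrodes 2 mm apart, depolarization arriving 10 ms later downstream:
print(conduction_velocity(2.0, 10.0))  # 0.2 mm/ms, i.e. 0.2 m/s
```

Under ischemia, gap junction degradation lengthens the delay between electrodes, which registers directly as a drop in this computed velocity.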

Ischemic conditions have been difficult to mimic in other animals. The heart of a rat, for example, has a metabolism more than six times higher than a human heart's, meaning it consumes glucose far faster and can fall into ischemia more quickly.

A key contribution of functional systems based on human cells is the ability to examine how treatments affect the broader function of human tissues.

"You go to the doctor's office, and they don't immediately start looking for biomarkers, but that's how a lot of drug discovery is conducted," said James Hickman, an author on the study. "Instead, the doctor would essentially ask you, 'How are you functioning?'"

The group used the chip's unique ability to determine conduction velocity from electrical activity to study how drugs designed to curb the effects of ischemia affect gap junctions, the tunnels between cardiomyocytes that allow them to propagate an electric signal.

They found the drug ZP1609 lessened the dramatic drop in conduction speed associated with gap junction degradation when the chips were placed in ischemic conditions, echoing findings about the drug in other models.

Hickman hopes to soon use the robustness of the platform to study other organs, including how cells in the heart signal to cells in the liver and the brain while under ischemia, or how ischemia affects other organs, especially the brain.

The group has also developed cantilevers, less than a millimeter long, to which cardiomyocytes can be attached in order to measure effects on contraction force. Those cantilevers could be used to predict effects on pumping efficiency.

Credit: 
American Institute of Physics

Machine learning tool improves tracking of tiny moving particles

image: Beyond manual tracing: An artist's impression of a deep neural network trained to recognise particle motion in space-time representations.

Image: 
Eva Pillai

Scientists have developed an automated tool for mapping the movement of particles inside cells that may accelerate research in many fields, a new study in eLife reports.

The movements of tiny molecules, proteins and cellular components throughout the body play an important role in health and disease. For example, they contribute to brain development and the progression of some diseases. The new tool, built with cutting-edge machine learning technology, will make tracking these movements faster, easier and less prone to bias.

Currently, scientists use images called kymographs, which represent the movement of particles in time and space, for their analyses of particle movements. These kymographs are extracted from time-lapse videos of particle movements recorded using microscopes. The analysis must be done manually, which is both slow and vulnerable to the unconscious biases of the researcher.
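To make the idea concrete, here is a minimal sketch (not KymoButler itself) of how a kymograph is built from a time-lapse stack and how a particle's velocity appears as the slope of its trace. The synthetic movie and the fixed-row simplification are assumptions for illustration:

```python
import numpy as np

# A kymograph stacks the intensity along a fixed line of pixels at each
# frame, so time runs down the rows and position runs along the columns.
# A moving particle then appears as a sloped trace.
def kymograph(movie: np.ndarray, row: int) -> np.ndarray:
    """movie: (frames, height, width) array; returns (frames, width)."""
    return movie[:, row, :]

# Synthetic movie: one bright particle moving 2 px/frame along row 5.
frames, height, width = 10, 11, 30
movie = np.zeros((frames, height, width))
for t in range(frames):
    movie[t, 5, 3 + 2 * t] = 1.0

kymo = kymograph(movie, row=5)
positions = kymo.argmax(axis=1)   # particle's column in each frame
velocity = np.diff(positions).mean()
print(velocity)  # 2.0 px/frame: the slope of the trace is the velocity
```

Tracing such slopes by eye across thousands of noisy, crossing trajectories is exactly the manual step the deep learning tool automates.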

"We used the power of machine learning to solve this long-standing problem by automating the tracing of kymographs," says lead author Maximilian Jakobs, a PhD student in the Department of Physiology, Development and Neuroscience at the University of Cambridge, UK.

The team developed the software, dubbed 'KymoButler', to automate the process. The software uses deep learning technology, which tries to mimic the networks in the brain to allow software to learn and become more proficient at a task over time and multiple attempts. They then tested KymoButler using both artificial and real data from scientists studying the movement of an array of different particles.

"We demonstrate that KymoButler performs as well as expert manual data analysis on kymographs with complex particle trajectories from a variety of biological systems," Jakobs explains. The software could also complete analyses in under one minute that would take an expert 1.5 hours.

KymoButler is available for other researchers to download and use at kymobutler.deepmirror.ai. Senior author Kristian Franze, Reader in Neuronal Mechanics at the University of Cambridge, expects the software will continue to improve as it analyses more types of data. Researchers using the tool will be given the option of anonymously uploading their kymographs to help the team continue developing the software.

"We hope our tool will prove useful for others involved in analysing small particle movements, whichever field they may work in," says Franze, whose lab is devoted to understanding how physical interactions between cells and their environment shape the development and regeneration of the brain.

Credit: 
eLife

Monitoring the Matterhorn with millions of data points

image: Working with a spectacular -- or dizzying -- view: Jan Beutel during maintenance work on the sensor network in the 2003 rockfall zone.

Image: 
PermaSense / ETH Zurich

The summer heatwave of 2003 triggered a rockfall that shocked both researchers and the general public: 1,500 cubic metres of rock broke away from the Hoernli ridge - a volume roughly equivalent to two houses. The fracture event exposed bare ice on the surface of the steep scarp. Experts soon realised that the record temperatures had warmed the rock down to such a depth that the ice contained in its pores and fissures had melted. This effectively caused a sudden reduction of the bonding holding the rock mass together.

The unpredicted rockfall was the incentive for setting up PermaSense, a unique project consortium bringing together experts from different engineering and environmental research disciplines from ETH Zurich and several other institutions, including the universities of Basel and Zurich.

The project was launched in 2006 with the initial goal of making measurements and observations that had not previously been possible. Using state-of-the-art technology, the researchers were looking to obtain in-situ measurements in steep bedrock permafrost of unprecedented quality and quantity.

The researchers not only succeeded but comfortably exceeded their goal, as they report in an article just published in the journal Earth System Science Data. The study describes a unique 10-year record of high-resolution data captured by scientists on the Hörnli ridge of the Matterhorn, 3,500 metres above sea level. A total of 17 different sensor types positioned at 29 distinct sensor locations in and around the 2003 rockfall zone delivered 115 million separate data points.

"This data set constitutes the longest, densest and most diverse data record in the history of alpine permafrost research worldwide," says Jan Beutel, Senior Researcher at the Computer Engineering and Networks Laboratory of ETH Zurich, with an understandable sense of pride: he is the driving force behind the initiative.

Using cutting-edge wireless sensors, the researchers have managed to make large volumes of high-quality data available almost in real time, and to closely monitor and control the running experiments. "The combined analysis of long-term monitoring data obtained from different types of instruments leads to a better understanding of the processes that can destabilize steep rock," says Samuel Weber, co-leader of the project and now a postdoctoral researcher at TU Munich.

The sensor network also comprises an automatic high-resolution camera that takes photos of the fracture site every two minutes. "Crackmeters" measure the widening of the fissures and the displacement of boulders. Temperatures are measured at various depths in the rock face, as well as on the surface. Inclinometers and GPS sensors permanently measure how much larger rock partitions as well as the whole mountain ridge are deforming and gradually tilting towards the valley. In recent years the researchers have added equipment for measuring acoustic emissions and microseismic data.

The data are relayed via WLAN from the Hoernli ridge to the summit station of the cable car of the Klein Matterhorn nearby, from where they are transmitted in real time via the Internet to ETH Zurich's data centre.

Here they are continuously captured, analysed and assessed - and have been for the past 10 years, around the clock, whatever the weather.

"Over the past three years of our project, the incorporation of more complex seismic data have been particularly useful in helping us to quantify what we were keen to research from the start: the destabilization leading to rockfall. This has helped us identify patterns in the signals from the mountain that enable us to capture such events," Beutel says.

Measuring the resonance frequencies of the rockface

The use of seismic sensing systems made it possible to detect many different signals - such as the formation of cracks that are initially invisible, hidden within the rockface - which the previous sensors were unable to capture. "Seismic sensors capture much more data, and offer us unprecedented information density and analysis opportunities," says the electrical engineer. But these sensors have several drawbacks: they need cables, more power, and deep boreholes, which first have to be drilled. And they also record signals that have nothing to do with the mountain, such as the footsteps of climbers on their way to the Matterhorn summit.

The researchers first had to remove all the ambient noise from these data using machine learning and smart algorithms, which were programmed directly into the wireless sensors by the ETH doctoral students currently involved in the project. To test against ground truth, they also fed the algorithms data recorded at the Hoernli hut, where mountaineers climbing the Matterhorn spend the night. The numbers of people staying overnight and climbing serve as an indication of when climbers are creating interference.

Analysis of the filtered seismic data provides an interesting picture for Beutel: "The resonance frequencies that occur in the rocks vary considerably over the course of the year."

This phenomenon is linked to the freezing and thawing processes on the mountain. Many micro-cracks and fissures are filled with ice and sediment, and this mix is frozen rock-hard in the winter. When this thaws in the summer, the bonding in the fissures changes. The freely vibrating rock mass enlarges, and as a result the resonance frequency decreases. The reverse is happening in winter: the resonance frequency of the rock mass increases.

"It's the same principle as on a guitar - the tone depends on where you grip the strings, creating vibrating elements of different lengths," Beutel explains.

"Very abrupt changes in the pattern of these resonance frequencies would indicate that the stability of part of the rockface has changed," Beutel says. If the frequencies drop, it may mean that existing fissures have deepened or opened up possibly indicating an emerging rockfall of a sizeable mass.

"Using seismic and acoustic data, combined with measurements of crack widths and photos of the investigation site, we can identify quite precisely how the permafrost is changing and make predictions about problems starting to develop," Beutel says. "I consider this to be one of the best achievements to date of the PermaSense project."

He says this is all thanks to his project partner, Samuel Weber, who spent the past three years writing a ground-breaking thesis on this topic at the University of Zurich. Another key factor was the involvement of ETH Professor Donath Fäh and the Swiss Seismological Service, who provided the seismology expertise.

Sudden opening of rock cracks

The measurement project on the Matterhorn is still ongoing. Even as it runs, Beutel is keen to transfer the know-how gained from the "Horu", the local name for the iconic mountain, to other projects and sites. The technical and geological expertise acquired can now be applied to the forecasting of natural hazard events.

Credit: 
ETH Zurich

Schrödinger's cat with 20 qubits

image: In quantum computing, a cat state - named after the famous analogy of Schrödinger's cat -- is a quantum state composed of two diametrically opposed conditions simultaneously. Together with experts from Forschungszentrum Jülich, an international team has now succeeded in placing 20 entangled quantum bits in such a state of superposition.

Image: 
Forschungszentrum Jülich / Annette Stettien

In 1935, the physicist Erwin Schrödinger put forward the thought experiment with the quantum cat, in which the cat is enclosed in a box together with a radioactive sample, a detector and a lethal amount of poison. If the radioactive material decays, the detector triggers an alarm and the poison is released. The special feature is that according to the rules of quantum mechanics, unlike everyday experience, it is not clear whether the cat is dead or alive. It would be both at the same time until an experimenter takes a look. A single state would only be obtained starting from the time of this observation.

Since the early 1980s, researchers have been able to realize this superposition of quantum states experimentally in the laboratory using various approaches. "However, these cat states are extremely sensitive. Even the smallest thermal interactions with the environment cause them to collapse," explains Tommaso Calarco from Forschungszentrum Jülich. Among other things, he plays a leading role in Europe's major quantum initiative, the EU's Quantum Flagship programme. "For this reason, it is only possible to realize significantly fewer quantum bits in Schrödinger cat states than those that exist independently of each other".

Of the latter states, scientists can now control more than 50 in laboratory experiments. However, those quantum bits, or qubits for short, do not display the special characteristics of Schrödinger's cat. The 20 qubits that the team of researchers has now created using a programmable quantum simulator do, establishing a new record that still stands even when other physical approaches, such as optical photons, trapped ions or superconducting quantum circuits, are taken into account.

Experts from several of the world's most renowned institutions joined forces to develop the experiment. In addition to the Jülich researchers, scientists from numerous top American universities - Harvard, Berkeley, MIT and Caltech - as well as the Italian University of Padua were involved.

"Qubits in the cat state are considered extremely important for the development of quantum technologies," explains Jian Cui. "The secret of the enormous efficiency and performance expected of future quantum computers is to be found in this superposition of states," says the physicist from the Peter Grünberg Institute at Jülich (PGI-8).

Classical bits in a conventional computer always hold exactly one definite value, either 0 or 1. These values can therefore only be processed bit by bit, one after the other. Qubits, which occupy several states simultaneously due to the superposition principle, can store and process several values in parallel in one step. The number of qubits is crucial here. You don't get far with just a handful of qubits. But with 20 qubits, the number of superimposed states already exceeds one million. And 300 qubits can store more numbers simultaneously than there are particles in the universe.
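The counts quoted above follow directly from the 2^n growth in the number of basis states that n qubits can hold in superposition:

```python
# With n qubits, the number of basis states that can be superposed is 2**n.
print(2**20)             # 1048576 -- already over a million for 20 qubits
# The observable universe is commonly estimated to contain ~10**80 particles;
# 300 qubits exceed that count of simultaneously storable numbers.
print(2**300 > 10**80)   # True
```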

The new result of 20 qubits now comes a little closer to this value, after the old record of 14 qubits had stood unchanged since 2011. For their experiment, the researchers used a programmable quantum simulator based on Rydberg atom arrays. In this approach, individual atoms, in this case rubidium atoms, are captured by laser beams and held in place side by side in a row, a technique also known as optical tweezers. An additional laser excites the atoms until they reach the Rydberg state, in which the electrons orbit far from the nucleus.

This process is rather complicated and usually takes too much time, such that the delicate cat state is destroyed before it can even be measured. The group in Jülich contributed their expertise in Quantum Optimal Control to solve this issue. By cleverly switching the lasers off and on at the right rate, they sped up the preparation process, which made this new record possible.

"We practically inflated some atoms to such an extent that their atomic shells merge with the adjacent atoms to simultaneously form two opposite configurations, namely excitations occupying all even or odd sites," explains Jian Cui. "This goes so far that the wave functions overlap as in the analogy of Schrödinger's cat and we were able to create the superposition of the opposite configurations which is also known as the Greenberger-Horne-Zeilinger state."

Their advances in quantum research were complemented by the efforts of a Chinese research group, whose results were also published in the current issue of "Science". Using superconducting quantum circuits, those researchers succeeded in creating 18 qubits in the Greenberger-Horne-Zeilinger state, which is also a new record for that experimental approach.

Credit: 
Forschungszentrum Juelich