Tech

Charging ahead to higher energy batteries

image: Image (a) is a cross-sectional SEM image of the Li5La3Nb2O12 crystal layer and image (b) shows computationally simulated trajectories of the Li, La, Nb, and O framework atoms obtained for Σ3 (2-1-1) = (1-21) at a temperature of 1300 K.

Image: 
Nobuyuki Zettsu Ph.D., the Center for Energy and Environmental Science, the Department of Materials Chemistry, Shinshu University.

Researchers have developed a new way to improve lithium ion battery efficiency. Through the growth of a cubic crystal layer, the scientists have created a thin and dense connecting layer between the electrodes of the battery.

Professor Nobuyuki Zettsu from the Center for Energy and Environmental Science in the Department of Materials Chemistry of Shinshu University in Japan and the director of the center, Professor Katsuya Teshima, led the research.

The authors published their results online in January this year in Scientific Reports.

"Owing to some intrinsic characteristics of liquid electrolytes, such as low lithium transport number, complex reaction at the solid/liquid interface, and thermal instability, it has not been possible to simultaneously achieve high energy and power in any of the current electrochemical devices," said Nobuyuki Zettsu, as first author on the paper.

Lithium ion batteries are rechargeable and power devices such as cell phones, laptops, and power tools; they even store power for the electrical grid. They're particularly sensitive to temperature fluctuations and have been known to cause fires or even explosions. In response to the problems with liquid electrolytes, scientists are working toward all-solid-state batteries that do away with the liquid altogether.

"Despite the expected advantages of all-solid-state batteries, their power characteristic and energy densities must be improved to allow their application in such technologies as long-range electric vehicles," Zettsu said. "The low rate capabilities and low energy densities of the all-solid-state batteries are partly due to a lack of suitable solid-solid heterogeneous interface formation technologies that exhibit high iconic conductivity comparable to liquid electrolyte systems."

Zettsu and his team grew garnet-type oxide solid-electrolyte crystals on a substrate in molten LiOH, which served as the solvent (flux), so that the crystals bonded to the electrode as they grew. A specific crystal compound known to grow in a cubic habit allowed the researchers to control the thickness and connection area within the layer, which acts as a ceramic separator.

"Electron microscopy observations revealed that the surface is densely covered with well-defined polyhedral crystals. Each crystal is connected to neighboring ones," wrote Zettsu.

Zettsu also said that the newly grown crystal layer could be the ideal ceramic separator when stacking the electrolyte layer on the electrode layer.

"We believe that our approach having robustness against side reactions at the interface could possibly lead to the production of ideal ceramic separators with a thin and dense interface," wrote Zettsu, noting that the ceramics used in this particular experiment were too thick to be used in solid batteries. "However, as long as the electrode layer can be made as thin as 100 microns, the stacking layer will operate as a solid battery."

One hundred microns is about the width of a human hair, and slightly less than twice the thickness of a standard electrode layer in contemporary lithium-ion batteries.

"All-solid-state batteries are promising candidates for energy storage devices," Zettsu said, noting that several collaborations between researchers and private companies are already underway with the ultimate goal of displaying all-solid-state battery samples at the 2020 Olympic games in Tokyo.

Zettsu and other researchers plan to fabricate prototype cells for electric vehicle use and for wearable devices by 2022.

Credit: 
Shinshu University

Dementia increases the risk of 30-day readmission to the hospital after discharge

About 25 percent of older adults admitted to hospitals have dementia and are at increased risk for serious problems like in-hospital falls and delirium (the medical term for an abrupt, rapid change in mental function). As a result, older adults with dementia are more likely to do poorly during hospital stays compared to older adults without dementia.

Until now, little was known about the effects of dementia on early hospital readmission. Researchers in Japan recently published the results of a study designed to learn more about the effect of dementia on being readmitted to the hospital within 30 days of a previous hospital discharge (the medical term for leaving the hospital once your care is considered complete). Their study was published in the Journal of the American Geriatrics Society.

The researchers studied information from people aged 65 and older who had been discharged from hospitals between 2014 and 2015, and then followed them for six months. They were looking for unplanned readmissions to the hospital within 30 days of the patient's discharge.

Older adults with dementia had about twice the risk of hospital readmission compared with those without dementia. However, the degree of risk depended on the older adult's diagnosis. For example, people with dementia who were hospitalized for hip fractures were at higher risk for hospital readmission than people with dementia who were diagnosed with gallbladder inflammation.

In 17 of the top 30 most common health conditions, older adults with dementia were more likely to be readmitted to the hospital than people without dementia.

The researchers noted that three issues may raise the risk of being readmitted to the hospital if you have dementia:

Older adults with dementia may have difficulty following directions about taking medication and attending follow-up visits. This may lead to poor health and readmissions.

People with dementia may be less able to express their symptoms, which can delay decisions to seek treatment.

Special discharge planning for people with dementia may not be available in all hospitals.

The researchers concluded that the risk of readmission for older adults with dementia varies according to diagnosis, and that special discharge planning for people with dementia is important.

Credit: 
American Geriatrics Society

Looking for an off switch for celiac disease

image: Transglutaminase 2 (TG2) is reversibly regulated by the protein cofactors thioredoxin and ERp57 via an allosteric disulfide redox switch.

Image: 
Chaitan Khosla, Stanford University

Celiac disease is an autoimmune disorder that affects, by some estimates, nearly 1 in 100 people. Celiac disease symptoms are triggered by gluten, a protein found in wheat and related plants, but gluten doesn't act alone to cause the digestive symptoms that patients suffer. Rather, gluten induces an overactive immune response when it's modified by the enzyme transglutaminase 2, or TG2, in the small intestine. New research published in the Feb. 23 issue of the Journal of Biological Chemistry identifies an enzyme that turns off TG2, potentially paving the way for new treatments for celiac disease.

"Currently, therapies to treat people with celiac disease are lacking. The best approach right now is just a strict adherence to a lifelong gluten-free diet," said Michael Yi, a chemical engineering graduate student at Stanford University who led the new study. "Perhaps the reason behind this is our relatively poor understanding of TG2."

The biochemistry of how TG2 interacts with gluten and induces an immune response has been well studied, but more basic mysteries remain, such as how TG2 behaves in people without celiac disease. Chaitan Khosla, a professor at Stanford and director of Stanford Chemistry, Engineering & Medicine for Human Health who oversaw the new study, has conducted several studies showing that TG2 can be active or inactive, depending on the forming or breaking of a specific chemical bond, called a disulfide bond, between two amino acids in the enzyme.

"(E)ven though there's a lot of transglutaminase 2 protein in the (small intestine), it's all inactive," Khosla said. "When it became clear that even though the protein was abundant, its activity was nonexistent in a healthy organ, the question became 'What turns the protein on, and then what turns the protein off?'"

In 2011, Khosla's team identified the enzyme that activates TG2 by breaking its disulfide bond. In the new paper, the researchers performed experiments in cell cultures and found an enzyme that re-forms this bond, inactivating TG2. This enzyme, ERp57, is mainly known for helping fold proteins inside the cell. When it turns off TG2, it does so outside of cells, raising more questions about its functions in healthy people.

"Nobody really understands how (Erp57) gets outside the cell," Khosla said. "The general thinking is that it's exported from the cell in small quantities; this particular observation suggests that it actually does have a biological role outside the cell."

TG2 is now also the first protein known to have a reversible disulfide bond on/off switch of this type. "This is a very different kind of on-and-off chemistry than the kind that medicinal chemists would (typically) use," Khosla said.

Understanding this mechanism has led the team to investigate whether there are any FDA-approved drugs that could target the switch directly. Because previous studies have suggested that lack of TG2 doesn't seem to negatively affect the health of mice, blocking TG2 is a promising avenue for treating celiac disease patients without requiring lifelong changes to their diets.

Credit: 
American Society for Biochemistry and Molecular Biology

Emergency CT for head trauma may be overused, study shows

Leesburg, VA, February 20, 2018 - Emergency patients are too often given head CT to check for skull fractures and brain hemorrhage, leading to unnecessary health care costs and patient exposure to radiation, according to a study to be presented at the ARRS 2018 Annual Meeting, set for April 22-27 in Washington, DC.

The study, to be presented by Michaela Cellina, of the Radiology Department of Fatebenefratelli-Sacco Hospital in Milan, Italy, evaluated head CT scans performed for minor head injury (MHI) in patients aged 18-45 who presented to the hospital's emergency department between January 1 and June 30, 2016. For each CT scan, researchers determined whether the CT referral met the criteria indicated by the National Institute for Health and Care Excellence (NICE) and the Canadian CT Head Rule (CCHR).

Of 492 cases reviewed, 260 (52.8%) and 376 (76.4%) of the CT examinations were not indicated according to the NICE and CCHR, respectively. Researchers noted no statistically significant difference between the specialty and seniority of the referring physician and over-referral, or between the patient's age and unwarranted CT studies. Motor vehicle accidents, however, were associated with a higher rate of non-indicated CT examinations for both NICE and CCHR, and two-wheel vehicle driver accidents were associated with a higher rate of appropriate CT exams for both NICE and CCHR. Only 15 of the 260 CT examinations were positive for brain hemorrhage, subarachnoid hemorrhage, or skull fracture.

With educational activities representing the entire spectrum of radiology, ARRS will host leading radiologists from around the world at the ARRS 2018 Annual Meeting, April 22-27, at the Marriott Wardman Park Hotel in Washington, DC. For more information, http://www.arrs.org/am18.

Credit: 
American Roentgen Ray Society

Scientists examine link between surface-water salinity, climate change

The interplay between surface-water salinity and climate change in Central New York is the subject of a recent paper by researchers in Syracuse University's College of Arts and Sciences.

Kristina Gutchess, a Ph.D. candidate in Earth Sciences, is the lead author of an article in the prestigious journal Environmental Science and Technology (ACS Publications). Her co-authors at Syracuse include Laura Lautz, the Jesse Page Heroy Professor and chair of Earth sciences, and Christa Kelleher, assistant professor of Earth sciences.

Another co-author is Gutchess' Ph.D. supervisor, Associate Professor Zunli Lu.

Rounding out the group are Li Jin G'08, associate professor of geology at SUNY Cortland; José L. J. Ledesma, a postdoctoral researcher of aquatic sciences and assessment at the Swedish University of Agricultural Sciences; and Jill Crossman, assistant professor of Earth and environmental sciences at the University of Windsor (Ontario).

The paper draws on the group's study of the impact of de-icing salt from Interstate 81 and other surrounding roads and highways on the Tioughnioga River watershed. Gutchess says their findings make her "cautiously optimistic" about the watershed's future surface-water chloride concentrations.

"The long-term application of road salts has led to a rise in the river's salinity level," says Gutchess, who studies processes affecting the quality of surface water and groundwater. "While various models have been used to assess potential future impacts of continued de-icing practices, they have not incorporated different climate scenarios, which are projected to impact hydrogeology in the 21st century."

Gutchess' team combined various computational approaches with rigorous fieldwork and laboratory analysis to simulate surface-water chloride concentrations in the Tioughnioga--a large, deep, 34-mile tributary of the Chenango River, flowing through Cortland and Broome counties.

Central to their experiment was INCA (short for "INtegrated CAtchment"), a semi-distributed catchment-modeling platform that assesses environmental-change issues. Gutchess calibrated the model for a historical, or baseline, period (1961-90), and used the results to make projections for three 30-year intervals: 2010-39, 2040-69 and 2070-99.

Based on the model's projections, the salinity of the Tioughnioga's east and west branches will start decreasing in 20-30 years. "A gradual warming trend between 2040 and 2099 will lead to reductions in snowfall and associated salt applications, causing [the river's] salinity to drop. By 2100, surface-water chloride concentrations should be below 1960s values," Gutchess says.

This is potentially big news for a part of the country that has experienced rising surface-water chloride concentrations since the 1950s, when road salting began.

Salt, or sodium chloride, is the most commonly used de-icing chemical in the country, spread at a rate of more than 10 million tons a year.

In New York State, a typical wintertime event requires 90-450 pounds of salt per lane-mile. Vehicle traffic picks up about 10 percent of the residue; the rest enters adjacent water catchments in the form of runoff, jeopardizing terrestrial ecosystems and drinking water resources.
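Taken together, those figures imply a simple per-event estimate of how much salt reaches nearby catchments. The sketch below is a rough back-of-the-envelope illustration using only the numbers quoted above (90-450 pounds per lane-mile and roughly 10 percent picked up by traffic); the function name and the pound-to-kilogram conversion are ours, not part of the study.

```python
# Rough per-event estimate of road salt entering adjacent catchments,
# per lane-mile, using the figures quoted in the article.
LB_TO_KG = 0.4536  # pounds to kilograms

def salt_runoff_per_lane_mile(applied_lb, traffic_uptake=0.10):
    """Salt (lb and kg) that runs off per lane-mile for one application,
    assuming ~10% is carried away by vehicle traffic."""
    runoff_lb = applied_lb * (1.0 - traffic_uptake)
    return runoff_lb, runoff_lb * LB_TO_KG

for applied in (90, 450):  # low and high ends of the quoted range
    lb, kg = salt_runoff_per_lane_mile(applied)
    print(f"{applied} lb applied -> ~{lb:.0f} lb (~{kg:.0f} kg) reaches the catchment")
```

By this estimate, a single storm can send roughly 80 to 400 pounds of salt per lane-mile into the surrounding watershed.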

Gutchess' hydrogeological study is one of only a few combining long-term climate variability and salinity management. The INCA model framework enabled her team to assess stream response under 16 different future scenarios, taking into account climate, land use and snow management.

"INCA originally was developed to assess sources of nitrogen in catchments in a single-stem main river," Jin says. "Here, we modified the model to incorporate a new multi-branched structure, enabling us to simulate daily estimates of in-stream concentrations of chloride. We also allowed for differences in salting practices between rural and urban areas."

According to INCA, road salt accounts for more than 87 percent of the Tioughnioga's salinity. Current de-icing practices, combined with increased urbanization, will likely add to its salinity, but only for a while, thanks in part to the changing climate.

According to Lu, the study suggests that climatic impacts are not always negative in a specific region: "It is important to understand the nuances of climate change at various time and geographic scales. Ultimately, this project will help us manage our resources more effectively, as we adapt to future changes."

With a wink and a nod, he adds, "At the same time, we should not make blanket statements about climate change. No one is exempt from its effects, pro or con."

Gutchess is a member of EMPOWER, a water-energy graduate-training program at Syracuse that is sponsored by the National Science Foundation and directed by Lautz. Additional support for Gutchess' research comes from the University's new Campus as a Laboratory for Sustainability program. Upon graduation in May, she will begin postdoctoral research at Yale.

Credit: 
Syracuse University

Giant intrinsic chirality from planar dielectric nanostructures

Harvard researchers have developed a metasurface, composed of a single planar layer of nanostructures, which exhibits strong optical chirality in transmission. This means it can let circularly polarized light of one polarization pass through almost unhindered, while light of the opposite helicity is completely diffracted away. Such capabilities are incredibly useful for a host of applications, such as circular dichroism spectroscopy in the analysis of drug samples, and polarization filters in telecommunications.

This work challenges some long-held notions about chiral metamaterials and metasurfaces. 'Previously people thought that to achieve a strong, intrinsic chiro-optical response, the structures had to be complicated three-dimensional shapes, such as corkscrews or helices, in order to break the symmetry', says Prof. Federico Capasso, Robert L. Wallace Professor of Applied Physics and Vinton Hayes Senior Research Fellow in Electrical Engineering at Harvard University. 'These 3D metamaterials were extremely difficult to fabricate on a large-scale. With this work, we showed that even a planar layer of dielectric nanostructures whose thickness is on the order of the incident wavelength can exhibit strong intrinsic chirality. This offers a practical way for such devices to be implemented in various applications as they can now be made in a single lithographic step.'

The authors were able to achieve this using gammadion-shaped nanostructures made of a relatively high-index dielectric material: titanium oxide. "This allows us to create planar structures with a strong in-plane magnetic moment, without resorting to a 3D geometry. By further optimizing the in-plane parameters of the gammadions, we can achieve the necessary coupling between the electric and magnetic moments to observe strong intrinsic chiro-optical activity," says Alexander Zhu, first author of the study. The authors experimentally achieved up to 80% circular dichroism in transmission at green wavelengths, with more than 90% of light with the correct helicity being transmitted at normal incidence. This result is on par with the state-of-the-art 3D metamaterials and greatly exceeds planar counterparts under similar conditions.
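To put the reported numbers in context: circular dichroism in transmission is commonly quantified as the difference in transmittance between the two circular polarizations. The convention shown below is one common definition, assumed here because the article does not spell out the paper's exact formula; the reported values (more than 90% transmission of the favored helicity, 80% circular dichroism) are consistent with this reading.

```latex
% One common convention for transmission circular dichroism (CD), where
% T_RCP and T_LCP are the transmittances for right- and left-circularly
% polarized light. With T_RCP ~ 0.9 and T_LCP ~ 0.1, CD ~ 0.8.
\[
  \mathrm{CD} = T_{\mathrm{RCP}} - T_{\mathrm{LCP}} \approx 0.9 - 0.1 = 0.8
\]
```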

Further analysis points to some rich physics underlying this phenomenon of giant intrinsic chirality in planar structures. The authors found that the optical response of the gammadion structures is dominated by higher-order multipoles, such as the toroidal quadrupole and magnetic octupole. In naturally occurring media, such high orders are vanishingly small, such that only dipole responses are typically observed. However, their existence is critical since dipole modes radiate primarily along normal incidence, whereas the primary radiation direction for higher-order modes is off-normal. This provides some insight into the design and optimization of these nanostructures. The authors are now seeking to further improve these results and develop a fast, efficient sensor for spectroscopic detection of chiral compounds.

Credit: 
Light Publishing Center, Changchun Institute of Optics, Fine Mechanics And Physics, CAS

Almost all adolescents in an economically disadvantaged urban population exposed to tobacco smoke

Bottom Line: Ninety-four percent of adolescents ages 13 to 19 in an economically disadvantaged, largely minority population in San Francisco had measurable levels of a biomarker specific for exposure to tobacco smoke (NNAL).

Journal in Which the Study Was Published: Cancer Epidemiology, Biomarkers & Prevention, a journal of the American Association for Cancer Research.

Author: Neal L. Benowitz, MD, professor of Medicine and Bioengineering & Therapeutic Sciences, and chief of the Division of Clinical Pharmacology at the University of California, San Francisco (UCSF).

Background: Benowitz explained that exposure of adolescents to secondhand smoke poses a public health challenge because it increases risk for respiratory infections, aggravates asthma, and is linked to an increased likelihood of becoming an active smoker.

In a prior study, Benowitz and colleagues showed that 87 percent of adolescents in an economically disadvantaged population had evidence of exposure to nicotine, as defined by the presence of cotinine in urine samples.

In this study, they set out to assess tobacco smoke exposure in this population by measuring levels of NNAL in urine samples. Benowitz explained that NNAL is detectable in urine for much longer periods after tobacco exposure compared with cotinine and that it is present only in the urine of people exposed to tobacco. He and his colleagues therefore investigated whether NNAL would be a more sensitive biomarker of exposure to secondhand smoke than cotinine, and thus more likely to identify adolescents who are only intermittently exposed.

How the Study Was Conducted and Results: The researchers measured levels of cotinine and NNAL in urine samples from 465 adolescents who received pediatric care at the Children's Health Center at Zuckerberg San Francisco General Hospital. Among these adolescents, 91 percent had public health insurance and 8 percent had no health insurance; 53 percent were Latino, 22 percent African-American, 11 percent Asian, and 3 percent white.

Overall, 94 percent of the adolescents had detectable levels of NNAL, compared with 87 percent for cotinine. Thus, using the NNAL biomarker indicated a higher prevalence of tobacco exposure in this population compared with cotinine.

Using a level of more than 30 ng of cotinine per ml of urine as a biomarker of active smoking, which is consistent with prior research, 12 percent of the adolescents were identified as active smokers. Eighty-two percent of the adolescents were identified as nonsmokers who had been exposed to secondhand smoke because they had detectable levels of NNAL but did not have cotinine levels above 30 ng per ml of urine.
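The classification described above amounts to a simple decision rule. The sketch below illustrates that rule only; the 30 ng/mL cotinine cutoff is the threshold reported here, while the function name and example values are hypothetical.

```python
ACTIVE_SMOKER_CUTOFF = 30.0  # cotinine, ng per mL of urine (threshold reported in the study)

def classify_exposure(cotinine_ng_per_ml, nnal_detectable):
    """Classify a participant from urinary biomarkers, following the rule
    described in the article: cotinine above 30 ng/mL marks an active smoker;
    detectable NNAL without elevated cotinine marks a nonsmoker exposed to
    secondhand smoke."""
    if cotinine_ng_per_ml > ACTIVE_SMOKER_CUTOFF:
        return "active smoker"
    if nnal_detectable:
        return "nonsmoker exposed to secondhand smoke"
    return "no detected exposure"

# Hypothetical examples:
print(classify_exposure(45.0, True))   # -> active smoker
print(classify_exposure(2.0, True))    # -> nonsmoker exposed to secondhand smoke
print(classify_exposure(0.0, False))   # -> no detected exposure
```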

The percentage of individuals who were identified as active smokers was highest among the African-American adolescents, 32 percent. In addition, the level of NNAL in the urine of nonsmokers was highest among African-American nonsmokers, suggesting higher levels of secondhand smoke exposure.

Author Comment: "Now, using the tobacco-specific biomarker and lung carcinogen NNAL, we find an even higher prevalence of tobacco exposure, which eliminates the possibility that the [prior] result with cotinine was due to consumption of nicotine-containing products such as tomatoes, potatoes, eggplant, and black tea," said Benowitz

"Our data show nearly ubiquitous exposure to tobacco smoke in this population of economically disadvantaged adolescents, which highlights the need for new public health initiatives to reduce exposure," he added. "It also suggests that routine urine screening for NNAL or cotinine, with counseling intervention in those screening positive for exposure, could help address this public health challenge."

Limitations: According to Benowitz, the main limitations of the study are that it was conducted at a single hospital and that ethnic minorities comprised the majority of the study population, both of which mean the results might not be generalizable to all urban adolescents.

Credit: 
American Association for Cancer Research

Transforming patient health care and well-being through lighting

image: The Center for Lighting Enabled Systems & Applications (LESA) at Rensselaer Polytechnic Institute, together with the Illuminating Engineering Society (IES), sponsored a workshop to explore pathways to define and promote the adoption of lighting systems specifically for health-care environments.

Image: 
Rensselaer Polytechnic Institute

Troy, N.Y. -- The world of health care is changing rapidly and there is increased interest in the role that light and lighting can play in improving health outcomes for patients and providing healthy work environments for staff, according to many researchers. Recently, the Center for Lighting Enabled Systems & Applications (LESA) at Rensselaer Polytechnic Institute, together with the Illuminating Engineering Society (IES), sponsored a workshop to explore pathways to define and promote the adoption of lighting systems specifically for health-care environments.

The workshop brought together lighting and human health researchers, healthy-lighting design experts, senior representatives from health-care standards organizations, and health-care providers. The aim of the workshop was to initiate an important discussion among diverse stakeholders on the changes in modern health-care interior lighting applications. The result is the release of a white paper detailing the outcomes and contributions of the participants.

"Today, the field of lighting and health care is undergoing rapid development," said Robert F. Karlicek Jr., LESA director, who also serves as a professor in the Department of Electrical, Computer, and Systems Engineering at Rensselaer. "As research continues to build the link between lighting spectral power distributions and wellness, LED lighting technology strives to bring new healthy lighting to market. Often commercialization in this capacity happens without establishing the clinical data to demonstrate a value-added benefit for patients or the providers, or a defined return on investment for the health-care industry."

"It has long been known that lighting can impact human health and wellness," said Brian Liebel, director of technical standards at IES. "Research continues to refine the precise role of light spectrum, intensity, and timing on the scope of patient outcomes, and on health-care worker productivity. But more research is required to provide the evidence necessary for new, modern standards for lighting systems in health-care and eldercare markets."

According to Karlicek and Liebel, the workshop white paper is intended to be an evidence-based resource for lighting designers and health-care providers to better understand market drivers. Both noted that the white paper is not intended to be a comprehensive summary of the field of lighting for human health and wellbeing--rather, an introduction to the insights shaping the field for participants attending the upcoming IES "Light + Human Health Symposium" that will be held in Atlanta, Georgia, April 8-10.

Funded by the National Science Foundation, LESA's vision is focused on creating digitized, color tunable illumination for new applications in lighting, health care, building management, horticulture, and advanced 5G wireless communications platforms. The collaboration with the Illuminating Engineering Society exemplifies the vision of The New Polytechnic, an emerging paradigm for teaching, learning, and research at Rensselaer, the foundation of which is the recognition that global challenges and opportunities are so great they cannot be adequately addressed by even the most talented person working alone. Rensselaer serves as a crossroads for collaboration--working with partners across disciplines, sectors, and geographic regions--to address complex global challenges, using the most advanced tools and technologies, many of which are developed at Rensselaer.

Credit: 
Rensselaer Polytechnic Institute

Prevention is better than cure: Targeted vaccination to halt epidemics

Amidst growing concerns over the low uptake of flu shots in Europe, scientists from the Italian National Research Council and the JRC confirm that vaccinations remain the best way forward when it comes to stopping the spread of infectious diseases.

It's an option that is nearly always more effective than either doing nothing or attempting to contain an outbreak through quarantine.

Under normal circumstances, the most effective way to prevent illness is to vaccinate according to national immunisation schedules. Widespread immunisation programmes in Europe have made previously deadly diseases such as smallpox and polio a thing of the past.

This study looked specifically at epidemic outbreaks. The scientists found that in such cases, targeting carefully selected individuals for vaccination can successfully contain an outbreak, even when only a relatively small number of individuals receive the relevant shot.

The scientists ran physics-based simulations on networks that sought to replicate the way individuals interact with one another in the real world, such as through the global air transportation network. The simulations are simplified versions of the computational frameworks commonly used to investigate the global spread of real-world epidemics, such as Severe Acute Respiratory Syndrome (SARS). Nevertheless, they help in understanding basic features of the more complicated and realistic models.

In the simulations, individuals correspond to 'nodes' that can transmit an infection through the links between them. The scientists found that quarantining nodes after the outbreak of an epidemic very quickly becomes ineffective. Quite early on in a simulated outbreak, even the 'do nothing' (non-intervention) strategy becomes preferable to quarantine.

Targeted vaccination was found to be the best option in nearly all epidemic cases. The scientists used a vaccination strategy based on 'optimal percolation', which consists of finding the smallest set of nodes that, when removed from a network, fragments it into small clusters. The idea behind this approach is that fragmenting the network ensures infections are contained within small groups, thereby preventing large outbreaks.
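To make the fragmentation idea concrete, the toy sketch below "vaccinates" (removes) the most-connected nodes of a random contact network and tracks how the largest connected component, an upper bound on outbreak size, shrinks. This greedy high-degree heuristic is only a simple stand-in for the optimal-percolation strategy the scientists used, and the network size and parameters are arbitrary; it requires the networkx library.

```python
# Toy illustration of containment by network fragmentation: remove
# ("vaccinate") the highest-degree nodes and watch the largest connected
# component shrink. A greedy degree-based heuristic stands in here for
# the optimal-percolation strategy described in the article.
import networkx as nx

G = nx.erdos_renyi_graph(n=1000, p=0.005, seed=42)  # arbitrary contact network

def largest_component(graph):
    return max((len(c) for c in nx.connected_components(graph)), default=0)

print("largest component before vaccination:", largest_component(G))
for step in range(1, 51):
    node, _ = max(G.degree, key=lambda pair: pair[1])  # most-connected node
    G.remove_node(node)                                # vaccinate it
    if step % 10 == 0:
        print(f"after vaccinating {step} nodes: largest component = {largest_component(G)}")
```

Once the largest component is reduced to a small cluster, any introduced infection can reach at most that many individuals, which is the sense in which fragmentation contains an outbreak.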

This might all seem like common sense, but preventive vaccination is not common practice for all illnesses, and for some, vaccines do not yet exist. The norovirus outbreak at this year's Winter Olympic Games is an example where quarantine has been the main option available to health officials. Medical professionals initially attempted to contain the outbreak by imposing quarantine on the hundreds of staff who were unlucky enough to catch the virus. Despite these measures, the illness is continuing to spread and has started to affect some of the athletes.

In recent years, physicists have made significant advances in the field of network immunisation, developing increasingly efficient techniques to immunise a network by the 'removal' (vaccination) of a few nodes. This knowledge can help to support health policy as policymakers look to ensure increased global security against epidemics.

Credit: 
European Commission Joint Research Centre

The Australian government's plan for the biocontrol of the common carp presents several risks

Belgian, English and Australian scientists are calling on the Australian authorities to review their decision to introduce the carp herpes virus as a way to combat the common carp, which has colonised the country's rivers. In a letter published in the journal Science, they argue not only that this measure will be ineffective but also that it represents a risk to ecosystems.

On a global scale, the common carp (Cyprinus carpio) is one of the most important fish species in fish farming. Its annual production ranges between 4 and 5 million tonnes. Initially introduced to Australia for production in fish farms, the species has gradually colonised the rivers to the point of dominating the indigenous species. One of the methods proposed by the Australian government to reduce the number of carp is to release a virus that is deadly to this species, the cyprinid herpesvirus 3 (CyHV-3, also called the Koi herpesvirus or KHV), into rivers. However, the scientists note that the data currently available on the carp's biology, the pathogenesis of the virus and the ecology of Australian rivers suggest that this tactic will not be effective and could even pose a risk to ecosystems. It was Professor Alain Vanderplasschen of the University of Liège (Belgium) whom the prestigious journal Science asked to issue a scientific opinion on the Australian biocontrol plan.

Before any large-scale release of CyHV-3, which would be costly (the proposed plan has a budget of 18 million dollars) and irreversible, the virus' actual capacity to sustainably reduce free-living Australian carp populations without harming indigenous ecosystems must be assessed.

The authors advocate for the introduction of limited testing to safely assess if the virus can effectively control carp populations without harming ecosystems.

The opinion of the scientists is notably based on work carried out for over a decade by Professor Alain Vanderplasschen from the Immunology and Vaccination Laboratory at the University of Liège who is behind the development of the first vaccine against CyHV-3.

"The discovery in our laboratory of the beneficial role of the behavioural fever expressed by carp as well as other recent results indicate that the Australian government's biocontrol plan will not meet its objectives. This may even cause serious damage to the ecosystems", explains Professor Alain Vanderplasschen.

"By discussing this in Science, one of the world's most respected scientific journals, we hope that the warning will not be ignored by the Australian authorities", notes Professor Alain Vanderplasschen.

Credit: 
University of Liège

Seasonal patterns in the Amazon explained

image: A view of San Lorenzo, Panama, where Brookhaven scientists conducted field observations in tropical forests.

Image: 
Brookhaven National Laboratory

Environmental scientists at the U.S. Department of Energy's (DOE) Brookhaven National Laboratory have led an international collaboration to improve satellite observations of tropical forests.

Responsible for nearly one-third of the world's terrestrial photosynthesis, tropical forests are a critical biome for examining climate change and its potential impacts across the Earth.

"If we can improve our understanding of how much carbon dioxide (CO2) is absorbed by tropical regions, we can improve future climate projections," said Jin Wu, a scientist at Brookhaven's Environmental & Climate Sciences Department.

Satellite images are one of the most common tools scientists use to observe tropical forests, but the efficacy of the method has been a subject of debate. Some researchers have argued that seasonal changes in the "greenness" of tropical forests, as satellites have recently shown, could be misleading. Now, the collaboration led by Brookhaven has used field observations and computational models to help clear up the controversy. Their results, published on Feb. 9 in New Phytologist, also shed light on biological processes that have changed scientists' understanding of seasonality in tropical forests.

Focusing on the canopy

Satellites take wide, sweeping images of the Earth's surface to image the global tropical biome. Captured routinely, these satellite images allow scientists to observe changes in the treetops throughout the year. Changes in the canopies' greenness can indicate how much light--and therefore, how much CO2--the trees are absorbing. Yet, because these satellites take images so far above the treetops and collect data over large areas of forests, the resolution is too low to identify why these changes are occurring.

"One pixel in a satellite image covers almost one square kilometer of forest. That's huge," said Wu, who led the study. "So, within this huge footprint, we cannot tell what kind of biological processes are occurring. That's why we integrated field-based data with the computational models, which track the interactions between light and leaves within a forest canopy, to advance our understanding of what is happening in these satellite images."

With the help of professional tree climbers, the scientists collected field data on three factors that affect canopy greenness: the amount of leaves present, the age of the leaves, and whether the trees were deciduous (lose their leaves annually) or evergreen (retain leaves for more than one year). Overall, they found their field observations closely matched the satellite images, confirming the accuracy of the satellites. In addition, they quantified the influence of each of these three factors on canopy greenness.

"In previous studies, scientists always assumed leaves are homogenously displayed within the forest landscape, but we found this is not true," Wu said. "Even in a tropical rainforest, we find that deciduous and evergreen trees can coexist. The same is true for seasonally dry forests. Moreover, the timing of leaf shedding varies between different deciduous trees, creating a large heterogeneity in leaf display over space and time."

The growth of individual leaves is also unique to each tree.

"Leaves actually behave similarly to humans," Wu said. "As we become older, our metabolic rate will change, and the same is true for leaves. When they are baby leaves, their photosynthesis rates are really low, and that means they can only take in very small amounts of CO2 from our atmosphere. When they become mature, they can take a lot of CO2 from the atmosphere. And when they become old, they take in less CO2 again. It's a convex response."

That means the seasonality of tropical forests is far more complex than scientists previously believed. In future studies, scientists will need to consider the role of leaf heterogeneity in satellite observations to accurately analyze the climate's impact on tropical forests.

To continue studying how the diverse ecology of tropical forests relates to climate change, Wu and his colleagues at Brookhaven installed a network of cameras at multiple sites in the tropics. Every day, these cameras automatically capture images of the canopy at different heights, enabling the Brookhaven team to conduct higher resolution observations of tropical forests. The cameras will operate for more than two years and capture two annual cycles of environmental changes.

"Data from these cameras will allow us to directly see when new leaves grow or old leaves drop off each tree during the annual cycle," Wu said. "Now, we have leaf-level information from our field measurements plus the camera-based data to continue improving our interpretation of satellite observations and climate models to better project the future climate."

Credit: 
DOE/Brookhaven National Laboratory

Therapy for muscular dystrophy-caused heart failure also improves muscle function in mice

LOS ANGELES (Feb. 22, 2018) -- Injections of cardiac progenitor cells help reverse the fatal heart disease caused by Duchenne muscular dystrophy and also lead to improved limb strength and movement ability, a new study shows.

The study, published today in Stem Cell Reports, showed that when researchers injected cardiosphere-derived cells (CDCs) into the hearts of laboratory mice with muscular dystrophy, heart function improved along with a marked increase in exercise capacity.

"We unexpectedly found that treating the heart made the whole body better," said Eduardo Marbán, MD, PhD, director of the Smidt Heart Institute and the investigator who developed the cardiosphere-derived cell technology used in the study. "These basic findings, which have already been translated to clinical trials, rationalize why treating the heart may also benefit skeletal muscle function in boys and young men with Duchenne."

Duchenne muscular dystrophy, which affects 1 in 3,600 boys, is a neuromuscular disease caused by a shortage of a protein called dystrophin, leading to progressive muscle weakness. Most Duchenne patients lose their ability to walk by their early teens. Average life expectancy is about 25. The cause of death often is heart failure because the dystrophin deficiency not only affects the muscles which control movement, but also the heart, crippling its ability to pump blood effectively.

The new findings represent the preclinical basis for the Phase I/II HOPE-Duchenne clinical trial that was presented at the November 2017 American Heart Association Scientific Sessions. That study, the first to test cell therapy in Duchenne muscular dystrophy patients and sponsored by Capricor, Inc., showed improved arm strength after 13 patients received cell infusions (as compared to 12 patients randomly assigned to receive usual care only). Another, larger clinical trial, again sponsored by Capricor, is scheduled to begin later this year - with a key difference. The patients in the Phase II trial will receive multiple cell infusions via an intravenous drip during the course of a year, rather than a single dose injected directly into the heart during a Cath Lab procedure.

Investigators note two surprising results of the newest study, beyond the unexpected effects on skeletal muscle: first, the benefits of the cell therapy lasted long after the cells were naturally pumped out of the heart; and second, levels of the missing protein dystrophin were increased, although the effect was temporary and dystrophin levels remained lower than normal.

"We found that within a few weeks, the injected cells were undetectable," Marbán said, "but the benefits persisted for at least three months, which led us to discover that exosomes secreted by CDCs are responsible."

Exosomes are microscopic vesicles, shed by cells, which contain a diversity of biologically-active contents that are taken up by, and influence the behavior of, nearby and distant cells. Exosomes are increasingly being recognized for their therapeutic potential because they serve as messengers for cells to communicate with each other.

"We found that after receiving CDCs, the lab mice had elevated levels of dystrophin, which likely enabled easier movement and improved survival," said Ronald G. Victor, MD, associate director of the Smidt Heart Institute and a primary investigator on the study. "Even just adding a small amount of dystrophin would make a tremendous difference for these young patients."

The cells used in the mouse study were manufactured in Marbán's laboratory at Cedars-Sinai. The cells used in the Phase I/II study were derived from donor hearts by Capricor Therapeutics. Marbán developed the process to grow CDCs when he was on the faculty of Johns Hopkins University; the process was further developed at Cedars-Sinai. Capricor has licensed the process from Johns Hopkins and Cedars-Sinai for clinical and commercial development. Capricor has licensed additional intellectual property from Cedars-Sinai and the University of Rome. Cedars-Sinai and Marbán have financial interests in Capricor. Victor has been a consultant to the company but was not paid by the company for his work on this study.

Credit: 
Cedars-Sinai Medical Center

Can surgery and anaesthesia affect memory?

Findings from a new Anaesthesia study suggest that patients may score slightly lower on certain memory tests after having surgery and anaesthesia.

In the study of 312 participants who had surgery and 652 participants who had not (average age in the 50s), surgery between tests was associated with a decline in immediate memory of one point out of a possible maximum test score of 30 points. Memory became abnormal in 77 of 670 participants whose memory was initially normal, comprising 18% of those who had had surgery compared with 10% of those who had not. No differences in other measures of memory and executive function were observed between participants who did and did not have surgery. Reduced immediate memory scores at the second visit were significantly associated with the number of operations in the preceding nine years, and decline in working memory was associated with longer cumulative duration of surgery.

"The cognitive changes we report are highly statistically significant in view of the internal normative standards we employ, and the large sample size of the control, or non-surgery, population. But the cognitive changes after surgery are small--most probably asymptomatic and beneath a person's awareness," said senior author Dr. Kirk Hogan, of the University of Wisconsin-Madison School of Medicine and Public Health. "The results await confirmation both in follow-up investigations in our own population sample after more surgeries in aging participants, and by other investigators with other population samples."

Dr. Hogan noted that it is too early to recommend any changes in clinical practice regarding prevention, diagnosis, management, and prognosis of cognitive changes after surgery.

Credit: 
Wiley

Study offers more food for thought on kids' eating habits, emotions

image: Dr. Shayla Holub, a psychologist at the University of Texas at Dallas, recently published a study on the effects of happiness and sadness on children's snack consumption.

Image: 
University of Texas at Dallas

A University of Texas at Dallas psychologist has examined the preconceptions about the effects of emotions on children's eating habits, creating the framework for future studies of how dietary patterns evolve in early childhood.

Dr. Shayla C. Holub, head of the psychological sciences PhD program and associate professor in the School of Behavioral and Brain Sciences, demonstrated that children from 4½ to 9 years old chose chocolate candy over goldfish crackers more frequently in response to both happiness and sadness than in a neutral emotional state.

Her paper, "The effects of happiness and sadness on children's snack consumption," was recently published online in the journal Appetite. It was co-authored with Dr. Cin Cin Tan, research faculty at University of Michigan's Center for Human Growth and Development, who completed her doctoral dissertation on the topic with Holub at UT Dallas.

Their study showed that, when presented with four snacking options, sad children ate more chocolates than the happy children, who in turn ate more chocolates than the neutral group. Conversely, for the goldfish crackers, the neutral group ate the most, followed by the happy children and the sad children.

"It was nice to see that there was this hierarchy," said Holub, who used clips from Disney's The Lion King to create happy, sad and neutral cohorts of children. "The kids watching the saddest video ate the most chocolate. There was a significant drop in consumption among the ones watching the happy video, but they still consumed more chocolate than the neutral video group. This suggests that children eat in response to both happy and sad emotions, but more for sadness."

The study's results also show that these tendencies increase with age, which suggests to Holub that it is at least in part a socialized behavior.

"This is one of a very few experimental studies on emotional eating in young children," she said. "What we're learning is that it's sometime during the preschool period that children are developing these eating habits. For example, you go to birthday parties and experience positive emotions -- everyone has fun and gets candy or cake. And at holidays, it's all about the food. Children begin to associate food with certain feelings."

Holub, the 2015 recipient of the Aage Møller Teaching Award at UT Dallas, explained that children begin with a strong ability to consume the right amount of calories for their energy needs.

"Very young kids are really good at regulating their food intake," she said. "If you change the energy density of a baby's formula content, the child adapts his or her food intake in response. If you give preschoolers a snack, they will adjust their meal intake to react appropriately so that they are not too hungry or too full. They know their own body cues."

Holub argues that it's in the preschool period that children begin to think not about what their body is telling them, but instead about what their social environment is telling them. It's during this time that lessons such as eating all the food on the plate or prohibiting certain types of food are frequently introduced.

"If the portion that is on my plate is what I'm supposed to eat, I'm going to force myself to eat it," she said. "Restrictive feeding practices also seem to be problematic -- telling children they can't have something makes it a preferred food, and when they gain access to it, they immediately eat more of it. That's another way that children learn to stop listening to their internal cues."

The latest study built on earlier work by the duo showing that parents teach emotional eating behavior both by example and through their feeding practices.

"In 2015, we published one of the first studies to find that it's not only that the behavior is being modeled for a child -- seeing a parent turn to food when they're sad, for example -- but that it sometimes also might be that parents feed children in emotion-regulating ways," Holub said. "Your child gets upset? Here's a piece of candy. You're bored? Here's something to eat."

Holub emphasized that while these trends don't indicate that habits can't be modified later, ages 3 to 5 constitute a crucial time in which some children lose their ability to self-regulate.

"If we can learn how to nurture healthy habits early on, that makes us less likely to have to eliminate negative behaviors later," she said. "The idea is to set up healthy trajectories and communicate with our children about how to choose healthy options."

Credit: 
University of Texas at Dallas

Precision cancer therapy effective in both children and adults

Three quarters of patients, both adults and children, with a variety of advanced cancers occurring in different sites of the body responded to larotrectinib, a novel therapy that targets a specific genetic mutation. Results of this multisite phase 1/2 trial have been published in the New England Journal of Medicine on February 22, 2018.

Unlike most cancer therapies, this oral treatment is based on the genetic traits of the tumor and not the organ where the cancer originated.

TRK fusions, an acquired genetic defect, accelerate cancer cell growth. Larotrectinib is highly selective for inhibiting this process. Fifty-five patients, ranging from 4 months to 76 years of age, with 17 unique tumor types, were treated with larotrectinib. Three quarters of patients enrolled responded to therapy, and 86% of responding patients remain on study or have undergone curative surgery. No patients discontinued treatment due to drug-related side effects.

Several pediatric patients who enrolled in the study had infantile fibrosarcoma, a type of cancer that harbors a TRK fusion and is difficult to treat because it responds poorly to chemotherapy. Radiation therapy is also not a good option, since it has devastating long-term effects for young patients.

"This is truly a magic bullet for our patients with TRK-positive cancer," said Leo Mascarenhas M.D, M.S, deputy director of the Children's Center for Cancer and Blood Diseases and director of the Names Family Foundation Early Phase Clinical Trials Program in oncology at Children's Hospital Los Angeles, who helped design the pediatric part of the study. "In some cases, this cancer can be treated surgically - often requiring amputation or another disfiguring surgical procedure. After treating our patient with infantile fibrosarcoma with larotrectinib, the cancer shrunk sufficiently and we were able to surgically remove the tumor while preserving the patient's leg."

This study is part of a noteworthy drug development program. Typically, testing of new therapies in a pediatric population is done after the drug is licensed for adults, if at all. However, larotrectinib was simultaneously studied in children and adults. A special liquid formulation was developed for administering appropriate doses to very young patients. This early pediatric focus helped to accelerate clinical development by aiding in the rapid accrual of appropriate patients.

The FDA granted larotrectinib breakthrough therapy designation that resulted in an expedited review. Drugs may qualify as breakthrough therapies when preliminary clinical data indicate that the new treatment offers substantial advantages over existing options for serious or life-threatening diseases.

Credit: 
Children's Hospital Los Angeles