Culture

MU Health Care neurologist publishes guidance related to COVID-19 and stroke care

A University of Missouri Health Care neurologist has published more than 40 new recommendations for evaluating and treating stroke patients based on international research examining the link between stroke and novel coronavirus (COVID-19).

Neurologist Adnan I. Qureshi, MD, a professor of clinical neurology at the MU School of Medicine, led a team of stroke experts from 18 countries with documented COVID-19 outbreaks to develop recommendations for doctors evaluating patients with acute ischemic stroke who have either suspected or confirmed COVID-19 infection.

The international panel noted increased clotting in COVID-19 patients, which raises their risk of stroke. The research team found evidence that young people without previous risk factors for stroke are experiencing ischemic stroke, with clots in the arteries of the brain, presumably related to a COVID-19 infection. On average, stroke onset in COVID-19 patients occurred 10 days after infection, but in some cases, stroke was the initial symptom.

"People may come to the emergency department with stroke, and that may be the initial manifestation of COVID-19 infection, which puts a clear burden on providers because now you may not know if the patient you are evaluating for stroke may actually have underlying COVID-19 infection," Qureshi said. "The purpose of these recommendations is to provide a step-by-step guide of how to manage these patients. The modifications we suggest have implications for the health of patients, but also the health of those who are involved in their care."

Qureshi's research indicates that health care workers are at risk of acquiring COVID-19 from stroke patients, so they should take safety precautions and limit the number of care providers who have direct interaction with each patient. The guidelines also call for providers to treat any suspected COVID-19 stroke patient as though the patient has the infection, ensuring the sanitation of all equipment used during the stroke assessment. If a stroke patient is suspected to have COVID-19, a chest CT scan can provide rapid evidence of a possible infection in the lungs.

"Since COVID-19 actually involves the lungs, a simultaneous scan of the chest and brain can check for stroke and identify changes in the lungs that may identify whether this patient truly has or does not have COVID-19 infection," Qureshi said. "This step has been incorporated into acute stroke protocol at MU Health Care."

Qureshi encourages stroke patients and their family members to recall any symptoms of dry cough, fever or body aches before the stroke, which may help the provider determine if the stroke is related to an underlying COVID-19 infection. If a COVID-19 infection is confirmed and other organs have been affected, guidelines suggest a Sequential Organ Failure Assessment (SOFA) can provide an overall prognosis before determining the appropriate stroke treatment in COVID-19 patients.

Qureshi's study, "Management of acute ischemic stroke in patients with COVID-19 infection: Report of an international panel," also featured contributions from MU Health Care neurologist Camilo R. Gomez, MD, professor of clinical neurology at the MU School of Medicine. It was recently published by the International Journal of Stroke.

Credit: 
University of Missouri-Columbia

Moffitt Cancer Center study suggests more could benefit from CAR T-cell therapy

TAMPA, Fla. -- Chimeric antigen receptor T-cell therapy, or CAR T, has become a game changer for lymphoma and leukemia patients who have relapsed or become resistant to previous treatments. The therapy uses a patient's own immune cells that are re-engineered in the lab to seek out and kill cancer cells when infused back into the patient. Yescarta® (axicabtagene ciloleucel) was the first CAR T-cell therapy approved for the treatment of adults with large B cell lymphoma. The pivotal ZUMA-1 clinical trial that led to its approval showed that 83% of patients responded to the therapy, with 58% having a complete response. But clinical trials often have stringent eligibility criteria and the outcomes observed may not match what physicians see in a real-world clinical setting.

Moffitt Cancer Center organized a consortium of 16 cancer treatment facilities across the U.S. that offer Yescarta as a standard-of-care therapy for patients with relapsed/refractory large B cell lymphoma. They wanted to determine if the safety and effectiveness seen in the ZUMA-1 clinical trial were similar for patients treated with the now commercially available CAR T therapy. Their findings were published in the Journal of Clinical Oncology.

The consortium pooled retrospective data on 298 patients who completed apheresis, the process to remove a patient's T cells, with the intent of having Yescarta manufactured and administered. It is important to point out that of this group, 129 patients (43%) would not have qualified for CAR T-cell therapy based on the ZUMA-1 comorbidity eligibility criteria. Overall, 275 patients (92%) received a Yescarta infusion. In the ZUMA-1 trial, 108 patients received Yescarta.

"Our analysis found that the overall response rate of 82% and estimated 12-month durable response rate of 47% for our group of patients compared favorably to the ZUMA-1 trial results," said Frederick Locke, M.D., corresponding author of the study and vice chair of the Department of Blood and Marrow Transplant and Cellular Immunotherapy and co-leader of the Immunology Program at Moffitt. "Durable response rates were encouraging even in patients with significant comorbidities, suggesting that patients need not meet ZUMA-1 eligibility criteria to benefit from axicabtagene ciloleucel."

One adverse reaction that can occur following CAR T therapy is cytokine release syndrome (CRS). This occurs when a large number of cytokines, which are small proteins released by immune cells, are rapidly released into the blood. This can cause a patient to have a fever, increased heart rate, difficulty breathing and low blood pressure. In the ZUMA-1 trial, 11% of patients treated with Yescarta experienced severe CRS. However, in the commercial setting, that number was lower, at 7%.

"We believe this observation is due to the greater use of tocilizumab and corticosteroids compared to ZUMA-1, in line with evolving practice patterns for toxicity management," said Michael Jain, M.D., Ph.D., co-first author and assistant member of the Blood and Marrow Transplant and Cellular Immunotherapy Department at Moffitt.

The authors believe this study suggests that patients do not need to meet the ZUMA-1 eligibility criteria to benefit from Yescarta, including patients above the trial's upper age limit and those with underlying conditions.

Credit: 
H. Lee Moffitt Cancer Center & Research Institute

How range residency and long-range perception change encounter rates

From vast herds of wildebeest thundering across the Serengeti to a malaria-laden mosquito silently stalking a human host, the movement of animals has effects that reverberate throughout the biosphere. The way that animals move governs many ecological interactions including predation, disease transmission, and human-wildlife conflict. Encounter rates, which quantify how often moving individuals come in contact with each other, serve as the "glue" that links movement behavior to ecological processes. While GPS devices have revolutionized the study of animal movement, research on encounter rates has not kept pace. A multidisciplinary research team consisting of ecologists and physicists has found that the gap between the data and how encounters are modeled could have serious consequences for certain ecological predictions (DOI: 10.1016/j.jtbi.2020.110267).

Specifically, the standard ecological encounter model, dubbed "the law of mass action", has been in use for over 100 years, and assumes that any individual roams freely across the entire range of its population. "When you look at animal tracking data, one of the first things you see is that most animals have home ranges, meaning that each individual typically only uses a small portion of the population range", says Dr. Justin Calabrese, Visiting Professor at the Center for Advanced Systems Understanding in Görlitz (CASUS), Germany. The disconnect between the data and the mass-action assumption suggests that encounter rate models are due for a rethink. "This matters because the mass-action assumption is everywhere in ecology and related fields" explains Calabrese, "with one current example being the SIR-type (Susceptible, Infected, Recovered) epidemiological models used to understand the spread and control of SARS-CoV-2."
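Mass action's role in the SIR-type models Calabrese mentions is easy to see in the standard textbook equations (shown here for illustration; this notation is not taken from the paper itself):

```latex
\begin{aligned}
\frac{dS}{dt} &= -\beta S I, \\
\frac{dI}{dt} &= \beta S I - \gamma I, \\
\frac{dR}{dt} &= \gamma I.
\end{aligned}
```

The transmission term $\beta S I$ is the law of mass action: new infections are proportional to the product of the susceptible and infected densities, which implicitly assumes that any susceptible individual is equally likely to encounter any infected individual, precisely the free-roaming assumption that home-ranging behavior violates.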

The international, multidisciplinary research team found that incorporating home ranging behavior into encounter rates could drastically change the results, and only under very narrow conditions did the more realistic models mimic mass-action encounter. "This suggests that interacting population models relying on mass action, like SIR disease models, could give very different predictions with a more realistic accounting of movement behavior", explains Dr. Ricardo Martinez-Garcia, Assistant Professor at the ICTP-South American Institute for Fundamental Research in São Paulo. Worryingly, these more realistic encounter rates could either be higher or lower relative to mass action, and were sensitive to the details of the movement behavior. Martinez-Garcia explains that "this kind of context dependence makes it very hard to know even in which direction the predictions of mass-action based models will be off, so this is something we need to explore in the future."

Encounter rate modeling in ecology has deep interdisciplinary roots. The aforementioned "law of mass action" relies on the concept of the ideal gas, and thus describes animals as erratic particles that move largely at random and do not interact with their environment. To develop the new encounter modeling framework, the team built on more refined physical concepts borrowed from nonequilibrium statistical mechanics. "The transfer of concepts originally developed in one discipline to solve problems in another is a powerful way to tackle the complexity we see in nature. This work is a clear example of that", says Calabrese, who recently joined the newly founded CASUS, which combines systems science with a variety of research disciplines. At CASUS, Calabrese will work at the interface of ecology and Earth system science. Interdisciplinarity, however, is a two-way street. Because encounters between individuals, groups or entities with limited spatial occupation are a widespread phenomenon in the natural, social and economic sciences, the conclusions of the encounter rate work could extend well beyond ecology.
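The qualitative contrast between free mixing and home ranging can be illustrated with a toy simulation. This is a hypothetical sketch, not the paper's actual framework: the arena size, tethering rule, reflective boundaries, and all parameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def count_encounters(home_strength, n=50, steps=2000, arena=100.0,
                     radius=1.0, step_sd=1.0):
    """Count pairwise encounters (distance < radius) over a random-walk run.

    home_strength = 0.0 -> free roaming over the whole arena, the
    mass-action picture; home_strength > 0 -> each walker is pulled back
    toward its own home-range centre (an Ornstein-Uhlenbeck-style tether,
    used here purely for illustration).
    """
    centres = rng.uniform(0.0, arena, size=(n, 2))
    pos = centres.copy()
    encounters = 0
    for _ in range(steps):
        pos += home_strength * (centres - pos)        # pull toward home centre
        pos += rng.normal(0.0, step_sd, size=(n, 2))  # random motion step
        pos = np.abs(pos)                             # reflect at the 0 walls
        pos = arena - np.abs(arena - pos)             # reflect at the far walls
        diff = pos[:, None, :] - pos[None, :, :]
        dist = np.hypot(diff[..., 0], diff[..., 1])
        # count close pairs, excluding self-pairs and double counting
        encounters += ((dist < radius).sum() - n) // 2
    return int(encounters)

free_mixing = count_encounters(home_strength=0.0)   # roams the whole arena
home_ranging = count_encounters(home_strength=0.2)  # confined to home ranges
print("free roaming:", free_mixing, "| home-ranging:", home_ranging)
```

Depending on tether strength, step size, and density, the home-ranging count can come out either above or below the free-mixing count, mirroring the context dependence Martinez-Garcia describes.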

Credit: 
Helmholtz-Zentrum Dresden-Rossendorf

K-State infectious disease scientist offers road map for future COVID-19 research

image: The Biosecurity Research Institute, or BRI, at Pat Roberts Hall provides the high-security laboratories that Jürgen A. Richt, the Regents distinguished professor at Kansas State University, and other scientists need to study SARS-CoV-2, which is the virus responsible for COVID-19.

Image: 
Kansas State University

MANHATTAN, KANSAS -- There are many unanswered questions about COVID-19. A Kansas State University infectious disease scientist and collaborators are offering a possible research road map to find the answers.

Jürgen A. Richt, the Regents distinguished professor at Kansas State University in the College of Veterinary Medicine, has co-authored a critical needs assessment for coronavirus-related research in companion animals and livestock. The article, "A Critical Needs Assessment for Research in Companion Animals and Livestock Following the Pandemic of COVID-19 in Humans," appears in the journal Vector-Borne and Zoonotic Diseases. Co-authors include Tracey McNamara from Western University of Health Sciences and Larry Glickman from Purdue University.

"We need to address these challenges in a scientific manner -- in a proactive manner, not in a reactive manner," said Richt, also the director of the university's Center of Excellence for Emerging and Zoonotic Animal Diseases, known as CEEZAD. "With COVID, every day something is new -- what was correct yesterday, could be wrong today."

Because of the rapid change of knowledge related to coronavirus, Richt and his collaborators wrote the article to stress the importance of studying the ways that COVID-19 could spread between humans and animals. The scientists say that research should focus on several areas, including:

The potential for companion animals, such as cats and dogs, to carry the virus.

The economic and food security effects if the virus can spread among livestock and poultry.

National security areas, especially among service animals such as dogs that detect narcotics or explosives, because COVID-19 is known to affect smell and cause hyposmia or anosmia.

"If dogs are susceptible and lose their smell and taste, it could affect our national security," said Richt, who also serves on an expert panel for the World Health Organization. "If livestock are also susceptible, it could significantly affect food safety and food security, too."

Richt's recent research has shown that pigs do not seem to be susceptible to the coronavirus, but little is known about whether the virus affects cattle, sheep, chickens or wildlife. He is further studying whether other livestock, such as cattle and sheep, may be susceptible.

The K-State Biosecurity Research Institute, or BRI, at Pat Roberts Hall provides the high-security laboratories needed for Richt and other scientists to study SARS-CoV-2, which is the virus responsible for COVID-19. The BRI is a biosafety level-3 facility that houses important multidisciplinary research, training and educational programs on pathogens that affect animals, plants and insects as well as food safety and security.

"Time is of the essence when responding to a new biological threat, and everyone at the BRI greatly appreciates the continued support from K-State leadership who realized the importance of keeping us operational," said Stephen Higgs, director of the BRI and editor-in-chief of Vector-Borne and Zoonotic Diseases. "Thanks to our dedicated and highly skilled BRI staff, we provided the safe, secure environment, training and infrastructure required for research on SARS-CoV-2. The paper explains why Dr. Richt's research in the BRI is so important to all of us and it was great that we were able to publish it so quickly and make it freely available."

Richt's own coronavirus research at the Biosecurity Research Institute focuses on four areas: animal susceptibility and transmission of SARS-CoV-2, therapeutic treatments, diagnostics and vaccines. Richt develops models to test therapies and has collaborated with researchers nationally and internationally. He also is collaborating to test and develop potential vaccines that are safer and do not lead to vaccine-associated enhancement of the disease, which is an important issue for coronavirus vaccines.

One of his collaborative projects involves Sean Joseph with Scripps Research, Sumit Chanda with Sanford Burnham Prebys Medical Discovery Institute and Adolfo García-Sastre with Icahn School of Medicine at Mount Sinai. The project has involved repurposing existing drugs that have been approved by the Food and Drug Administration for treating cancer, leprosy, Crohn's disease and other illnesses.

The researchers used a National Institutes of Health library of 12,000 drugs and tested them in cell cultures to see if they inhibit SARS-CoV-2 replication. They have narrowed the list down to about 20 potentially effective drugs. Richt is now testing these potential antivirals in preclinical models.

Another collaborative project, with Nevan Krogan of the University of California, San Francisco and Adolfo García-Sastre of the Icahn School of Medicine at Mount Sinai, focuses on repurposing drugs based on coronavirus protein-host protein interaction studies.

"We are on the front end of studying whether these drugs, which look very promising in cell culture assays, can be used in COVID patients," Richt said. "We hope that the work we are doing presently will save lives."

Credit: 
Kansas State University

Clever new robot rover design conquers sand traps

image: Built with multifunctional appendages - wheels that can spin but also be wiggled and lifted - a new robot developed at Georgia Tech with U.S. Army funding has locomotion techniques robust enough to allow it to climb sand-covered hills without getting stuck while surveying a planet or the moon.

Image: 
Christopher Moore, Georgia Tech

RESEARCH TRIANGLE PARK, N.C. -- Built with wheeled appendages that can be lifted, a new robot developed with U.S. Army funding has complex locomotion techniques robust enough to allow it to climb sand-covered hills and avoid getting stuck. The robot has NASA interested in its potential for surveying a planet or the Moon.

Using a move that researchers at Georgia Institute of Technology dubbed rear rotator pedaling, the robot, known as the Mini Rover, climbs a slope by using a design that combines paddling, walking, and wheel spinning motions. The rover's behaviors were modeled using a branch of physics known as terradynamics.

The journal Science Robotics published the research as a cover article. The research was funded by the Army Research Office, an element of the U.S. Army Combat Capabilities Development Command's Army Research Laboratory, and by NASA through the National Robotics Initiative.

"This basic research is revealing exciting new approaches for locomotion in complex terrain," said Dr. Samuel Stanton, a program manager at ARO. "This could lead to platforms capable of intelligently transitioning between wheeled and legged modes of movement to maintain high operational tempo."

According to the scientists, loose materials like sand can flow, which creates problems for robots moving across them.

"This rover has enough degrees of freedom that it can get out of jams pretty effectively," said Dan Goldman, the Dunn Family Professor in the School of Physics at the Georgia Institute of Technology. "By avalanching materials from the front wheels, it creates a localized fluid hill for the back wheels that is not as steep as the real slope. The rover is always self-generating and self-organizing a good hill for itself."

A robot built by NASA's Johnson Space Center pioneered the ability to spin its wheels, sweep the surface with those wheels and lift each of its wheeled appendages where necessary, creating a broad range of potential motions. Using in-house 3-D printers, the Georgia Tech researchers collaborated with the Johnson Space Center to re-create those capabilities in a scaled-down vehicle with four wheeled appendages driven by 12 different motors.

"The rover was developed with a modular mechatronic architecture, commercially available components, and a minimal number of parts," said Siddharth Shrivastava, an undergraduate student in Georgia Tech's George W. Woodruff School of Mechanical Engineering. "This enabled our team to use our robot as a robust laboratory tool and focus our efforts on exploring creative and interesting experiments without worrying about damaging the rover, service downtime, or hitting performance limitations."

The rover's broad range of movements gave the research team an opportunity to test many variations that were studied using granular drag force measurements and modified Resistive Force Theory. The team began with the gaits explored by the NASA RP15 robot, and experimented with locomotion schemes that could not have been tested on a full-size rover.

The researchers also tested their experimental gaits on slopes designed to simulate planetary and lunar hills using a fluidized bed system known as SCATTER (Systematic Creation of Arbitrary Terrain and Testing of Exploratory Robots), which could be tilted to evaluate the role of controlling the granular substrate.

In the experiments, the new gait allowed the rover to climb a steep slope with the front wheels stirring up the granular material - poppy seeds for the lab testing - and pushing them back toward the rear wheels. The rear wheels wiggled from side-to-side, lifting and spinning to create a motion that resembles paddling in water. The material pushed to the back wheels effectively changed the slope the rear wheels had to climb, allowing the rover to make steady progress up a hill that might have stopped a simple wheeled robot.

"In our previous studies of pure legged robots, modeled on animals, we had kind of figured out that the secret was to not make a mess," Goldman said. "If you end up making too much of a mess with most robots, you end up just paddling and digging into the granular material. If you want fast locomotion, we found that you should try to keep the material as solid as possible by tweaking the parameters of motion."

But simple motions had proved problematic for Mars rovers, which famously got stuck in granular materials. Goldman says this gait discovery might be able to help future rovers avoid that fate.

"This combination of lifting and wheeling and paddling, if used properly, provides the ability to maintain some forward progress even if it is slow," Goldman said. "Through our laboratory experiments, we have shown principles that could lead to improved robustness in planetary exploration - and even in challenging surfaces on our own planet."

The researchers hope next to scale up the unusual gaits to larger robots, and to explore the idea of studying robots and their localized environments together.

Though the Mini Rover was designed to study lunar and planetary exploration, the lessons learned could also be applicable to terrestrial locomotion - an area of interest to the Army.

Credit: 
U.S. Army Research Laboratory

Sainsbury Wellcome Center researchers find mouse and human eye movements share important similarity

In a study published today in Current Biology, Arne Meyer, John O'Keefe and Jasper Poort used a lightweight eye-tracking system composed of miniature video cameras and motion sensors to record head and eye movements in mice without restricting movement or behaviour. Measurements were made while the animals performed naturalistic visual behaviours, including social interactions with other mice and visual object tracking. While the eyes in humans typically move together in the same direction, those in mice often moved in opposite directions. Although humans also make eye movements without head movement, for example when reading a book, the study found that mouse eye movements were always linked to head movement.

The researchers identified two types of mouse eye movement coupled to head movement with different functions: 'head tilt compensation' and 'saccade and fixate' eye movements. 'Head tilt compensation' allows mice to maintain a consistent view of the world by compensating for slow changes in head tilt, and results in the two eyes moving in opposite directions, which is typically not observed in humans. 'Saccade and fixate' eye movements allow animals to stabilise their view during fast head rotations and shift their gaze in the direction of the head rotation. These 'saccade and fixate' movements are similar to those seen in humans and monkeys, which often sample their environment by a sequence of stable images (fixations) and result in the two eyes moving in the same direction. 'Fixate' eye movements keep the flow of visual information steady while 'saccade' movements allow the animal to select relevant visual information to focus on.

The mouse is an important species to help understand how the human brain functions. First, the organisation and function of the mouse and human brain is similar in many ways, although there are also important differences. Second, scientists can use unique genetic research tools in mice to study brain circuits at a level of detail not possible in other mammals. Third, scientists use genetic tools in mice to model human brain disorders.

The traditional approach to studying vision in humans, monkeys and mice involves restraining head movement. While this facilitates the interpretation of data and allows researchers to use a wider range of experimental measuring methods, it has been unclear whether the results can be generalised to naturalistic behaviours where both head and eyes are free to move. Understanding how mice visually sample their surroundings is also crucial to further close the gap between vision and navigation which has traditionally been studied in freely moving rodents.

This research validates using mice to study important aspects of how humans select visual features that are most important for navigation and decision-making. This visual process is impaired in multiple neurological and neuropsychiatric disorders, including schizophrenia, Alzheimer's disease and stroke. These impairments are currently difficult to treat, and using mice to model these conditions will allow scientists to study the underlying brain mechanisms to help identify and develop new treatments.

Credit: 
Sainsbury Wellcome Centre

Scientists develop tool to sequence circular DNA

University of Alberta biologists have invented a new way for sequencing circular DNA, according to a new study. The tool--called CIDER-Seq--will give other scientists rich, accurate data on circular DNA in any type of cell.

While our own DNA is linear, circular DNA is common in the genomes of bacteria and viruses. Scientists have also discovered circular DNA within the nuclei of human and plant cells, called extrachromosomal circular DNA (eccDNA). Recently, research has begun to investigate the role of eccDNA in human cancer--but progress has been hampered due to the lack of effective methods for studying and sequencing eccDNA.

"Our key advance is that, through our method, scientists can finally gain an unbiased, high-resolution understanding of circular DNA in any type of cell," explained Devang Mehta, postdoctoral fellow in the Department of Biological Sciences and lead author. "With our invention of CIDER-Seq, we can begin to understand the function of these mysterious circular DNAs in human and plant cells."

CIDER-Seq uses long-read DNA sequencing technology from PacBio. The method includes a wet-lab protocol as well as a new computational pipeline. It is optimized to examine both viral genomes and eccDNA and is made accessible to other scientists online.

"We devised a new molecular biology method and a new bioinformatics algorithm to finally obtain full length sequences of eccDNA," explained Mehta. "Our method finally allows us to sequence these molecules completely and gives us and other researchers a tool to better understand what they actually do in the cell."

Credit: 
University of Alberta

Study: Multiscale crop modeling effort required to assess climate change adaptation

image: Researchers Bin Peng, left, and Kaiyu Guan led a large, multi-institutional study that calls for a better representation of plant genetics data in the models used to understand crop adaptation and food security during climate change.

Image: 
Photo illustration by Fred Zwicky

CHAMPAIGN, Ill. -- Crop modeling is essential for understanding how to secure the food supply as the planet adapts to climate change. Many current crop models focus on simulating crop growth and yield at the field scale, but lack genetic and physiological data, which may hamper accurate production and environmental impact assessment at larger scales.

In a new paper published in the journal Nature Plants, researchers identify a series of multiscale and multidisciplinary components - from crop genetics up to global factors - that are critical for finding environmentally sustainable solutions to food security.

Many crop models focus on understanding how plant characteristics such as leaf size play into the crop yield at the field scale, the researchers said. "Modeling at this scale is critical, but we would like to incorporate information from gene-to-cell and regional-to-global scale data into our modeling framework," said Bin Peng, a University of Illinois at Urbana-Champaign postdoctoral researcher and co-lead author.

The study identifies components that could help generate a more informative modeling framework. "Multiscale modeling is the key to linking the design of climate change adaptation strategies for crop and field management with a large-scale assessment of adaptation impact on crop production, environment, climate and economy," Peng said.

The study calls for a better representation of the physiological responses of crops to climate and environmental stressors - like drought, extreme rainfall and ozone damage. "Many physiological processes are important for accurately simulating crop growth under stressed conditions," Peng said. Examples include water moving from soil to plant to atmosphere, driven by canopy energy balance, he said.

"We should also include a better representation of crop management," Peng said. "That would be extremely important for assessing both crop production and environmental sustainability, as well as their tradeoffs."

The researchers said there are opportunities to close a variety of data gaps, as well. "Integration of remote-sensing data, such as the work performed in our lab, will be extremely valuable for reducing data gaps and uncertainties," said natural resources and environmental sciences professor and project investigator Kaiyu Guan. "One of the advantages of remote sensing is its vast spatial coverage - we can use remote sensing to constrain crop models over every field on the planet."

The authors also propose a model-data integration pathway forward. "Doing the right simulation of crop responses to climate change factors is critically important," Guan said. "The most challenging part is whether crop models can capture those emergent relationships, which can be derived from empirical observations."

"No single scientist or research lab can produce these models on their own," said study co-author and plant biology professor Amy Marshall-Colón. "This type of effort will require patience and collaboration across many disciplines."

Credit: 
University of Illinois at Urbana-Champaign, News Bureau

Answers to these questions can help #Decision2020 build momentum for Americans as we age

image: AGS #Decision2020 Candidate Question Guide

Image: 
(C) 2020, American Geriatrics Society

With primary and general elections on the horizon across the U.S., the American Geriatrics Society (AGS) today released a series of high-priority questions for candidates. The AGS candidate question guide is aimed at helping Americans keep all political leaders--including and perhaps especially those running for president--committed to a clear, articulated vision of how they will support us all as Americans age.

"How candidates answer a question gives us a sense of what policies they would put forward to support us all as we age," explains AGS Chief Executive Officer Nancy Lundebjerg, MPA. "The current COVID-19 pandemic highlights how important federal and state leadership are to having policies in place that support the health of us all through a comprehensive approach informed by how geriatrics approaches our care: Team-based, person-centered, and focused on the whole person, with the goal of each of us remaining active and engaged in our communities."

The AGS question guide--published in the Journal of the American Geriatrics Society (DOI: 10.1111/jgs.16515) and in an easy-to-use, tweetable format at the AGS #Decision2020 Hub, AmericanGeriatrics.org/Decision2020--focuses on eight policy issues important to older adults, caregivers, and the health professionals who help promote their well-being:

1. Ensuring Access to Geriatrics Health Professionals: What policies and programs would you champion that would increase access to geriatrics health professionals for older Americans?

2. Expanding Title VII Geriatrics Training Programs: How would you work to expand the reach of federal training programs so that all older people have access to health professionals who are competent to meet our needs as we age?

3. Ensuring Our Workforce is Competent to Care for Older Americans: How would you reform graduate medical education to address the gap between training requirements and our nation's need for a workforce to care for us as we age?

4. Reducing the Toll and Impact of Chronic Diseases: How would you prioritize aging research across federal agencies and institutions so that we can address the human and economic toll of chronic diseases on older Americans?

5. Ensuring Access to Adequate Pain Relief for Older Americans Living with Advanced Illness: What policies would you champion to ensure frail older Americans living with advanced illness (typically those 85 years old or older, with multiple chronic conditions) have access to adequate pain relief?

6. Supporting American Women:

What will you do to ensure women receive equal pay for equal work?

What are your plans for ensuring women and other traditionally underrepresented groups are vibrant parts of your Administration?

7. Supporting American Families: How would you ensure that all Americans, including all those employed by the federal government, have access to paid family leave?

8. Addressing Complexity in Caring for Older Americans:

How would you work to improve the quality and efficiency of care delivered to the increasing number of Medicare beneficiaries with multiple chronic and complex conditions?

How would you improve care and care coordination across healthcare settings important to individuals who have dual eligibility for both Medicare and Medicaid?

Each issue included in the guide features background on its importance, as well as potential policy solutions to help guide voter evaluations of candidate responses. To make its recommendations as actionable as possible, the AGS also included contact information for presidential candidates, as well as tools for determining state and local contenders by district, on its #Decision2020 Hub. Visit AmericanGeriatrics.org/Decision2020 to learn more.

Credit: 
American Geriatrics Society

Bike commuting accelerated when bike-share systems rolled into town

image: LimeBike, shown here, is a bike-share system serving cyclists in Seattle.

Image: 
Jackson Holtz/U. of Washington

In the past couple of years, if you lived in a major or even mid-sized city, you were likely familiar with bike-share bikes.

Whether propped against a tree, strewn along the sidewalk or standing “docked” at a station, the often brightly colored bikes with whimsical company names promised a ready means to get from Point A to Point B.

But one person’s spontaneous ride is another person’s commute to work. Prior to the COVID-19 pandemic, in cities where bike-share systems have been introduced, bike commuting increased by 20%, said Dafeng Xu, an assistant professor in the University of Washington’s Evans School of Public Policy & Governance. Xu studied U.S. cities with and without bike-share systems, using Census and company data to analyze how commuting patterns change when bike shares become available.

“This study shows that bike-share systems can drive a population to commute by bike,” said Xu, whose study was published May 11 in the Journal of Policy Analysis and Management.

Bike-share systems, common in cities in Europe and Asia, were launched in four U.S. cities in 2010 and as of 2016 had grown to more than 50. Not all systems have been successful: Convenience – how easy it is to find and rent a bike – is the key. In Seattle, for example, a city-owned bike-share program failed in 2017 due largely to a limited number of bikes and a lack of infrastructure, but private companies in the same market thrived prior to the pandemic.

[Around the world, cities have enacted mobility restrictions during the coronavirus outbreak. The responses of bike-share companies, and bike-share usage, have varied by community.]

Among other interests in transportation and immigration policy, Xu researches the effects of bicycling on the environment and human health, and on the ways bike-share systems can play a role by expanding access to cycling.

“In general, biking is good and healthy, and it means less pollution and traffic, but it can be expensive, and people worry about their bikes being stolen, things like that,” Xu said. “Bike share solves some of these problems, because people don’t need to worry about the cost and theft.”

For this study, Xu sorted through nine years of demographic and commute statistics from the American Community Survey, a detailed, annual report by the Census Bureau. He then examined bike-share company data (through the National Association of City Transportation Officials) from 38 cities with systems, focusing on trips logged during morning and afternoon rush hours. By comparing the number, location and time of work-related bike commutes from Census data against bike-share company records of trips logged, both before and after the launch of bike shares, Xu was able to estimate the use of bike shares for commute trips.

Xu found that in both bike-share and non-bike-share cities, the rate of bike commuting increased, while car commuting decreased, from 2008-2016. However, the rate of bike commuting – and the use of public transportation – was significantly greater in bike-share cities.

For example, in bike-share cities in 2008, roughly 66% of commuters drove to work, about 1% biked, and 22% took transit. That compared to non-bike-share cities, where about 88% of commuters drove, fewer than 1% biked, and 4% took transit.

By 2016 – after many bike-share systems had launched — car commuting had fallen to 59% in bike-share cities, while bike commuting had climbed to 1.7% and transit to 26%. Commuting by car in non-bike-share cities had slipped to 83% in 2016, while bike commuting had grown to 1%, and transit to 6%.
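The comparison behind those figures can be reproduced with a simple difference-in-differences calculation on the quoted mode shares. This is a back-of-the-envelope illustration only, not Xu's actual statistical model, and the "fewer than 1%" figure is approximated here as 1%:

```python
# Reported commute mode shares (percent), taken from the figures quoted above.
# "Fewer than 1%" biking in non-bike-share cities is approximated as 1%.
share_2008 = {"bike_share_cities": {"car": 66, "bike": 1.0, "transit": 22},
              "other_cities":      {"car": 88, "bike": 1.0, "transit": 4}}
share_2016 = {"bike_share_cities": {"car": 59, "bike": 1.7, "transit": 26},
              "other_cities":      {"car": 83, "bike": 1.0, "transit": 6}}

def diff_in_diff(mode):
    """Change in bike-share cities minus change in comparison cities (pct points)."""
    treated = share_2016["bike_share_cities"][mode] - share_2008["bike_share_cities"][mode]
    control = share_2016["other_cities"][mode] - share_2008["other_cities"][mode]
    return treated - control

for mode in ("car", "bike", "transit"):
    print(mode, round(diff_in_diff(mode), 1))
```

On these numbers, bike-share cities shed about 2 percentage points more car commuting, and gained about 0.7 points more bike commuting and 2 points more transit use, than the comparison cities over the same period.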

Nationwide, 0.6% of commuters bike to work, according to an American Community Survey report in 2017.

In general, cities with larger bike-share systems also experienced sharper increases in bicycle commuting, Xu said.

“This is not surprising: A large bike-share system means a higher density of public bicycles and is thus more accessible to commuters,” he said. “In contrast, sadly, Seattle’s Pronto struggled to attract commuters and was ultimately doomed after only three years of operation, partly due to its relatively small size.”

In his paper, Xu points to Chicago, which operates a municipally owned bike-share system called Divvy. Prior to Divvy’s launch in 2013, 1.5% of commuters biked to work, Xu said, but afterward, that rate grew to 2%.

The trends held, he said, even when controlling for a city’s expansion of protected bike lanes – another significant factor in whether people choose to bike to work, according to other research.

Overall, the numbers before COVID-19 were promising, Xu said. The numbers could grow, he said, if communities and bike-share companies make changes that boost the appeal of bike commuting: adding bike lanes to city streets, expanding programs to outlying communities, or increasing the allowable rental time. Many bike-share rentals, for instance, last only a half-hour before a user has to pay for a new trip.

Xu is also the author of a previous paper that analyzed the impact of bike-share systems on obesity rates.

For more information, contact Xu at dafengxu@uw.edu.

Journal

Journal of Policy Analysis and Management

DOI

10.1002/pam.22216

Credit: 
University of Washington

Seeing the universe through new lenses

image: A ground-based space image of a lensing candidate identified in the study (left), and a Hubble Space Telescope image confirming the lens (right).

Image: 
Dark Energy Camera Legacy Survey, Hubble Space Telescope

Like crystal balls for the universe's deeper mysteries, galaxies and other massive space objects can serve as lenses to more distant objects and phenomena along the same path, bending light in revelatory ways.

Gravitational lensing was first theorized by Albert Einstein more than 100 years ago to describe how light bends when it travels past massive objects like galaxies and galaxy clusters.

These lensing effects are typically described as weak or strong; a lens's strength depends on the lensing object's mass and on its position and distance relative to the light source being lensed. Strong lenses can have 100 billion times more mass than our sun, causing light from more distant objects along the same path to be magnified and split, for example into multiple images, or to appear as dramatic arcs or rings.
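For context, the angular scale of strong lensing is set by the Einstein radius, which combines exactly the quantities mentioned above: the lens mass and the distances between observer, lens, and source. This is a standard textbook result for an idealized point-mass lens, not something derived in the study:

```latex
\theta_E = \sqrt{\frac{4GM}{c^2}\,\frac{D_{ls}}{D_l\,D_s}}
```

Here \(M\) is the lens mass and \(D_l\), \(D_s\), \(D_{ls}\) are the angular-diameter distances to the lens, to the source, and from lens to source; more massive lenses and favorable distance ratios produce larger, more dramatic arcs and rings.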

The major limitation of strong gravitational lenses has been their scarcity, with only several hundred confirmed since the first observation in 1979, but that's changing ... and fast.

A new study by an international team of scientists revealed 335 new strong lensing candidates based on a deep dive into data collected for a U.S. Department of Energy-supported telescope project in Arizona called the Dark Energy Spectroscopic Instrument (DESI). The study, published May 7 in The Astrophysical Journal, benefited from the winning machine-learning algorithm in an international science competition.

"Finding these objects is like finding telescopes that are the size of a galaxy," said David Schlegel, a senior scientist in Lawrence Berkeley National Laboratory's (Berkeley Lab's) Physics Division who participated in the study. "They're powerful probes of dark matter and dark energy."

These newly discovered gravitational lens candidates could provide specific markers for precisely measuring distances to galaxies in the ancient universe if supernovae are observed and precisely tracked and measured via these lenses, for example.

Strong lenses also provide a powerful window into the unseen universe of dark matter, which makes up about 85 percent of the matter in the universe, as most of the mass responsible for lensing effects is thought to be dark matter. Dark matter and the accelerating expansion of the universe, driven by dark energy, are among the biggest mysteries that physicists are working to solve.

In the latest study, researchers enlisted Cori, a supercomputer at Berkeley Lab's National Energy Research Scientific Computing Center (NERSC), to automatically compare imaging data from the Dark Energy Camera Legacy Survey (DECaLS) - one of three surveys conducted in preparation for DESI - with a training sample of 423 known lenses and 9,451 non-lenses.

The researchers grouped the candidate strong lenses into three categories based on the likelihood that they are, in fact, lenses: Grade A for the 60 candidates that are most likely to be lenses, Grade B for the 105 candidates with less pronounced features, and Grade C for the 176 candidate lenses that have fainter and smaller lensing features than those in the other two categories.

Xiaosheng Huang, the study's lead author, noted that the team already succeeded in winning time on the Hubble Space Telescope to confirm some of the most promising lensing candidates revealed in the study, with observing time on the Hubble that began in late 2019.

"The Hubble Space Telescope can see the fine details without the blurring effects of Earth's atmosphere," Huang said.

The lens candidates were identified with the assistance of a neural network, which is a form of artificial intelligence in which the computer program is trained to gradually improve its image-matching over time to provide an increasing success rate in identifying lenses. Computerized neural networks are inspired by the biological network of neurons in the human brain.

"It takes hours to train the neural network," Huang said. "There is a very sophisticated fitting model of 'What is a lens?' and 'What is not a lens?'"

There was some painstaking manual analysis of lensing images to help pick the best images to train the network from tens of thousands of images, Huang noted. He recalled one Saturday during which he sat down with student researchers for the entire day to pore over tens of thousands of images to develop sample lists of lenses and non-lenses.

"We didn't just select these at random," Huang said. "We had to augment this set with hand-selected examples that look like lenses but are not lenses," for example, "and we selected those that could be potentially confusing."
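The workflow described above -- train on labeled examples, including hand-picked hard negatives, then score and bin new candidates -- can be sketched in miniature. The study used a deep convolutional network on real survey images; the toy below substitutes a plain logistic classifier on synthetic feature vectors, and the grade thresholds are invented, so this illustrates only the shape of the pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for image feature vectors: "lenses" drawn from one
# distribution, "non-lenses" from another (purely illustrative numbers,
# loosely echoing the imbalanced training sample described in the article).
n_feat = 16
lenses = rng.normal(+0.5, 1.0, size=(400, n_feat))
non_lenses = rng.normal(-0.5, 1.0, size=(9000, n_feat))
X = np.vstack([lenses, non_lenses])
y = np.concatenate([np.ones(len(lenses)), np.zeros(len(non_lenses))])

# Logistic classifier trained by plain gradient descent.
w = np.zeros(n_feat)
b = 0.0
for _ in range(300):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    w -= 1.0 * (X.T @ (p - y) / len(y))
    b -= 1.0 * np.mean(p - y)

def grade(features):
    """Bin a candidate by score, analogous to the Grade A/B/C categories."""
    p = 1.0 / (1.0 + np.exp(-(features @ w + b)))
    return "A" if p > 0.9 else "B" if p > 0.5 else "C"
```

With real data, the feature extractor would be the convolutional layers of the network, and the grade thresholds would be tuned by humans reviewing the scored candidates, as the team did.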

Student involvement was key in the study, he added. "The students worked diligently on this project and solved many tough problems, all while taking a full load of classes," he said. One of the students who worked on the study, Christopher Storfer, was later selected to participate in the DOE Science Undergraduate Laboratory Internship (SULI) program at Berkeley Lab.

Researchers have already improved the algorithm used in the latest study to speed up the identification of possible lenses. While an estimated 1 in 10,000 galaxies acts as a lens, the neural network can eliminate most of the non-lenses. "Rather than going through 10,000 images to find one, now we have just a few tens," Huang said.
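The arithmetic behind that reduction is simple to sketch. The prior of 1 lens in 10,000 galaxies is from the article; the recall and false-positive rates below are illustrative assumptions, not figures from the paper:

```python
# Back-of-the-envelope survivor count for a candidate filter.
# lens_prior comes from the article; recall and fp_rate are assumed.
n_images = 10_000
lens_prior = 1 / 10_000   # ~1 in 10,000 galaxies acts as a lens
recall = 0.9              # assumed fraction of true lenses the network keeps
fp_rate = 0.003           # assumed fraction of non-lenses it fails to reject

true_lenses = n_images * lens_prior
survivors = true_lenses * recall + (n_images - true_lenses) * fp_rate
print(round(survivors))   # a few tens of candidates left to inspect by eye
```

Even a false-positive rate of a fraction of a percent dominates the survivor count, which is why rejecting non-lenses, not finding lenses, is the hard part of the problem.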

The neural network was originally developed for The Strong Gravitational Lens Finding Challenge, a programming competition that ran from November 2016 to February 2017 that motivated the development of automated tools for finding strong lenses.

With a growing body of observational data, and new telescope projects like DESI and the Large Synoptic Survey Telescope (LSST) that is now scheduled to start up in 2023, there is heated competition to mine this data using sophisticated artificial intelligence tools, Schlegel said.

"That competition is good," he said. A team based in Australia, for example, also found many new lensing candidates using a different approach. "About 40 percent of what they found we didn't," and likewise the study that Schlegel participated in found many lensing candidates that the other team hadn't.

Huang said the team has expanded its search for lenses in other sources of sky-imaging data, and the team is also considering whether to plug into a broader set of computing resources to expedite the hunt.

"The goal for us is to reach 1,000" new lensing candidates, Schlegel said.

NERSC is a DOE Office of Science User Facility.

Study participants included researchers from the University of San Francisco, Berkeley Lab, the National Optical Astronomy Observatory, Siena College, the University of Wyoming, the University of Arizona, the University of Toronto and the Perimeter Institute for Theoretical Physics in Canada, and Université Paris-Saclay in France.

Credit: 
DOE/Lawrence Berkeley National Laboratory

Pollinator-friendly flowers planted along with crops aid bumblebees

image: A new study from UMass Amherst and NC State shows that sunflowers are one of the plants that can be planted among rows of agricultural crops in flower strips -- rows of pollinator-friendly flowers -- to benefit common Eastern bumblebee (Bombus impatiens) colony reproduction with a lower gut pathogen load than some other flowers.

Image: 
Ben Barnhart

AMHERST, Mass. - A new study reported this week by evolutionary ecologist Lynn Adler at the University of Massachusetts Amherst and Rebecca Irwin of North Carolina State University, with others, suggests that flower strips - rows of pollinator-friendly flowers planted with crops - offer benefits for common Eastern bumblebee (Bombus impatiens) colony reproduction, but some plants do increase pathogen infection risk.

As Adler and colleagues point out, pollinator declines affect food security, and pollinators are threatened by stressors such as pathogens and inadequate food. Bumblebees feed on pollen and nectar they gather from plants such as sunflower and milkweed. But bumblebees are also more likely to acquire a gut disease pathogen, Crithidia bombi, from some of these plant species than from others, the authors note.

Until now, the effect of plant species composition on bee disease was unknown, they add. Study details appear in Proceedings of the National Academy of Sciences.

In earlier work on flowers and bee infection, Adler explains, "We evaluated 15 plant species by putting the same amount of C. bombi on each, letting a bee forage, and then seeing whether and how bad an infection it developed. We used that to designate plant species as 'high- or low-infection' for this study." Low-infection plants include sunflower and thyme; high-infection plants include swamp milkweed and purple loosestrife.

For this study, the researchers placed bees in tents under three conditions - canola plants only with no flower strips (controls), canola plus high-infection flower strips, or canola plus low-infection flower strips - to measure and compare effects on bee infection load and reproductive success. Though bees with high-infection strips carried double the infection load of those with low-infection strips, bee reproduction was higher with flower strips of either type than with canola alone. "Thus, floral resources in flowering strips benefited bees," the authors state, despite the added disease risk.

Adler says, "The bees were all infected with the same amount of pathogen and then allowed to forage, so the plants could increase or decrease infection." The tradeoff - more bee reproduction but higher pathogen infection rates - may be acceptable, she adds. "It depends on how critical food versus the pathogen is for pollinators," she adds. Irwin, a professor of applied ecology, says, "Flowering strips are becoming more common as people look for ways to mitigate pollinator declines."

Further, Adler points out, "Crithidia is somewhat benign, but if these patterns hold for other pathogens like Nosema, a common honey bee disease, it may be more of a concern. Right now I would not recommend stopping our investment in flowering strips."

The researchers hope to continue investigating the flower strip effects on bee populations and health by including other bee species and pathogens. Adler says, "I think we need a much more comprehensive program to evaluate how pollinator habitat characteristics affect pathogen spread to make informed choices. In the meantime, providing flowering resources in pollinator habitat is still the best path forward."

Credit: 
University of Massachusetts Amherst

Research shows fungicides effective in fighting Fusarium wilt of watermelon

image: Patchy distribution of diseased and healthy watermelon plants.

Image: 
Jeff Standish

Fusarium wilt is one of the most economically important diseases of watermelon and a major problem to growers worldwide. In the past, watermelon growers based in the Southeastern United States were able to use methyl bromide to manage this disease, but this is no longer an option due to environmental concerns.

Two fungicides, prothioconazole and pydiflumetofen, are available, but until recently little information existed on their efficacy against Fusarium wilt in North Carolina. As a result, North Carolina State University plant pathologists evaluated seven fungicide programs to characterize Fusarium wilt and determine each chemical's efficacy.

Their results show that both fungicides provide effective control of Fusarium wilt, regardless of application rate or method. According to Jeff Standish, one of the scientists behind this research, "Based on our results, pydiflumetofen and prothioconazole were equally as effective at reducing Fusarium wilt, and pydiflumetofen seemed to be more effective at preserving yield when disease was severe. This provides evidence that pydiflumetofen could be used as an additional mode of action for watermelon growers, which will likely reduce selection for fungicide resistance."

"Documenting that efficacy of pydiflumetofen was similar to that of prothioconazole regardless of rate was a nice and surprising finding," Standish explains, adding, "Despite there being only one year of yield data, the plants grown in nontreated control plots produced no marketable fruit, which really highlights the importance of managing this disease."

Before the publication of this work, the sensitivity of Fusarium oxysporum f. sp. niveum (the fungus that causes Fusarium wilt) isolates to pydiflumetofen had never been described. This knowledge will be useful for determining when and if fungicide resistance management strategies are needed in the future.

More details about this development can be found in "Sensitivity of Fusarium oxysporum f. sp. niveum to Prothioconazole and Pydiflumetofen In Vitro and Efficacy for Fusarium Wilt Management in Watermelon" published in Plant Health Progress Volume 20, Issue 1.

Credit: 
American Phytopathological Society

Type 2 diabetes linked to worse cognitive performance after a stroke; prediabetes not linked, but prevention needed

DALLAS, May 14, 2020 -- People with Type 2 diabetes, but not those with prediabetes, had worse cognitive performance three to six months after a stroke than those with normal fasting blood sugar levels, according to new research published today in Stroke, a journal of the American Stroke Association, a division of the American Heart Association.

"Type 2 diabetes increases the risk of stroke and has been associated with cognitive impairment and may increase dementia risk. That's why Type 2 diabetes is another important target in the prevention of dementia, and the focus should be on early treatment for prediabetes to delay or prevent the progression to Type 2 diabetes," said Perminder Sachdev, M.D., Ph.D., senior author of the study and Scientia professor at UNSW Sydney's Centre for Healthy Brain Ageing (CHeBA) in Kensington, Australia.

Previous research by Sachdev and colleagues found that stroke patients with a history of Type 2 diabetes have worse cognitive function compared to stroke patients without Type 2 diabetes.

"In this study, we wanted to know if stroke patients with prediabetes also have worse cognitive function compared to stroke patients without prediabetes or diabetes," Sachdev said. "This is important because prediabetes is very common, and individuals can have prediabetes for several years before progressing to Type 2 diabetes. Early and aggressive treatment of prediabetes can delay or prevent Type 2 diabetes. If we target the treatment of prediabetes, could this prevent the development of dementia in some individuals?" said Sachdev.

Researchers combined data from 1,601 stroke patients (average age 66; 63% male; 70% Asian; 26% white; 2.6% African American) who participated in one of seven international studies from six countries. Almost all had clot-caused strokes, and a variety of cognitive functions were assessed three to six months after the stroke. Patients' fasting blood sugar levels measured at hospital admission, together with medical history, were used to define Type 2 diabetes and prediabetes.

After adjusting for age, sex and education, researchers found:

Compared to patients with normal fasting blood sugar, those with Type 2 diabetes scored significantly lower in different areas of cognitive function, including memory, attention, speed of processing information, language, visual ability to copy or draw shapes or figures or lines, mental flexibility and executive functioning.

Patients with prediabetes did not score significantly worse than those with normal blood sugar in any areas of cognitive function.

The comparisons remained the same after researchers adjusted for additional factors, including type of stroke, ethnicity, high blood pressure, smoking, previous stroke, abnormal heart rhythm and body mass index.
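Adjustment of the kind described above is, in practice, a regression that includes a group indicator alongside the covariates; the coefficient on the indicator is the adjusted group difference. A minimal sketch with NumPy least squares on invented data (the effect size, covariates, and sample are illustrative assumptions, not the study's):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500

# Invented patients: a diabetes indicator, an age confounder, and a
# cognitive score that is lower for the diabetes group after accounting
# for age (true adjusted effect set to -3.0 points for illustration).
diabetes = rng.integers(0, 2, n)          # 1 = Type 2 diabetes
age = rng.normal(66, 8, n)
score = 50 - 0.2 * (age - 66) - 3.0 * diabetes + rng.normal(0, 2, n)

# Adjusted comparison: regress score on the group indicator plus covariates.
X = np.column_stack([np.ones(n), diabetes, age])
coef, *_ = np.linalg.lstsq(X, score, rcond=None)
print(f"adjusted diabetes effect: {coef[1]:.2f}")  # recovers roughly -3.0
```

The study's analysis used the same principle with many more covariates (stroke type, ethnicity, blood pressure, smoking, and so on), so the diabetes coefficient reflects the cognitive difference net of those factors.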

"The deficits we found in all areas of cognitive function highlight the importance of assessing the capacity for self-care in patients with Type 2 diabetes following a stroke," said Jess Lo, M.Sc., lead author of the study and research associate at UNSW Sydney's Centre for Healthy Brain Ageing (CHeBA), in Kensington, Australia. "We need to ensure that stroke survivors have the mental competency to manage the complex and intertwined tasks to effectively treat Type 2 diabetes, which can include measuring glucose levels multiple times a day, managing glucose monitoring devices, adjusting medication doses, self-administering insulin or other medications, and understanding food labels and portion sizes to adjust what is eaten at each meal or snack."

"While our study is focused on cognition after a stroke, there is strong evidence that Type 2 diabetes is associated with cognitive impairment. This is an important message for the general public. Since our study shows no evidence that prediabetes is associated with worse cognitive performance, this emphasizes the importance of the early diagnosis and treatment of prediabetes (which is often under-diagnosed) in order to delay or prevent the progression to Type 2 diabetes," Lo said.

The study is limited by not having information on the duration and severity of diabetes, and having only one measurement of blood sugar levels.

Credit: 
American Heart Association

Scientists discover why some birds live fast and die young

Research from the University of Sheffield has revealed why some bird species take longer to develop than others

Findings could help scientists predict how animals will adapt to climate breakdown and habitat destruction

The study is the first to consider the importance of lifestyle, environment, evolutionary history and body size when explaining variation

Size, safety and parenting all have an impact on how quickly a species of bird matures, according to new research from the University of Sheffield that could help scientists to understand and predict how animals will respond to climate breakdown and the destruction of habitats.

The team of scientists has studied thousands of species of birds to understand why there is so much diversity in the length of time they take to grow from a fertilised egg to an independent adult.

The research, published in Nature Communications, is the first study to consider the importance of lifestyle and environmental factors alongside evolutionary history and body size to explain the variation.

All organisms face a trade-off between reproducing and surviving, and they solve this problem in different ways. The team found that bird species with a 'live fast, die young' strategy develop faster, allowing them to maximise the number of offspring they can produce in the short time they have available.

Findings showed that birds that breed and live in safer environments with fewer predators typically took longer to develop, possibly because they can afford to spend longer in a vulnerable state.

They also found that migratory birds develop much faster, which may ensure they are ready to return to their winter habitats at the end of the summer.

As expected, the research showed that bigger birds took longer to develop - but even among birds of a similar size there was variation in development times.

Dr Chris Cooney, from the University of Sheffield's Department of Animal and Plant Sciences and lead author of the research, said: "The amount of time it takes for a fertilised egg to develop into a fully grown adult varies hugely across the animal kingdom. For instance, it takes an elephant almost 10 years to reach independence, whereas a fruit fly is fully grown after only a matter of days.

"This extraordinary diversity is also encapsulated within birds, where albatrosses can take almost a year to develop from an embryo to an independent adult, but a typical UK garden songbird takes little more than a month. We found that certain aspects of a species' lifestyle and environment are important in explaining how long they take to develop."

Dr Alison Wright, co-author of the research from the University of Sheffield, said: "Our study on birds gives us some clues about the type of factors that may be important in other species. However, it may be that different factors are important for determining development length in other animal groups.

"The next step is therefore to address these questions using data that covers the breadth of the animal kingdom - from fish to mammals to insects - to gain an even broader insight into the factors shaping these fundamental differences across species."

Dr Nicola Hemmings, co-senior author of the research from the University of Sheffield, said: "The insights from our research may prove crucial in understanding and even predicting how organisms may respond when conditions change, for instance as our climate warms and habitats become modified."

The Department of Animal and Plant Sciences at the University of Sheffield is home to one of the biggest communities of whole-organism biologists in the UK. Our research covers animals, plants, humans, microbes, evolution and ecosystems, in habitats ranging from the polar regions to the tropics. This work aims to shed new light on the fundamental processes that drive biological systems and help solve pressing environmental problems.

Credit: 
University of Sheffield