Culture

Tracking down cryptic peptides

image: Schematic representation of the formation and presentation of cryptic peptides.

Image: 
Rudolf Virchow Center / University of Wuerzburg

Almost all cells of the human body present fragments of cellular proteins on their surface, so-called human leukocyte antigens or HLA peptides, which play an important role in the immune response. If the immune system detects foreign HLA peptides, such as viral peptides on a virus-infected cell or mutated peptides on a tumour cell, T-cells eliminate the corresponding cell. The entirety of the HLA peptides presented on a cell is referred to as the cell's immunopeptidome.

New approach enables comprehensive analysis for the first time

Besides the usual HLA peptides, there are also cryptic HLA peptides. These are derived from RNA sequences that, unlike typical coding sequences, do not encode a specific protein. Over the last few decades, only a few cryptic HLA peptides have been identified, because they are very small and quickly degraded in cells; moreover, efficient computer algorithms for their analysis were lacking.

In a completely new approach, the Würzburg scientists have now combined several analytical methods that are particularly suitable for small peptides. "Using a novel bioinformatics method developed by us, for the first time we were able to identify thousands of cryptic HLA peptides in the immunopeptidomes of a wide variety of tumors such as melanoma and breast cancer," explains Dr. Andreas Schlosser, research group leader at the Rudolf Virchow Center at Julius-Maximilians-Universität (JMU) Würzburg in Bavaria, Germany.

The new bioinformatics approach is based exclusively on data from mass spectrometry, a method for determining the mass of molecules such as peptides. This makes it possible to systematically and comprehensively determine the cryptic HLA peptides. In addition, it was possible to clarify on which cells and to what extent cryptic peptides are present: "We were able to show that cryptic HLA peptides make up a significant part of the immunopeptidomes of tumors," explains Prof. Dr. Florian Erhard, group leader at the JMU Institute of Virology.
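To illustrate the core idea behind mass-spectrometry-based peptide identification (a deliberately simplified sketch, not the authors' pipeline): a peptide's mass can be computed from its sequence using standard monoisotopic residue masses, and a measured mass can then be matched against candidate sequences within an instrument tolerance. Real immunopeptidomics additionally uses fragmentation spectra and far more sophisticated scoring.

```python
# Illustrative sketch (not the authors' method): matching a measured mass
# against candidate peptide sequences. Standard monoisotopic residue masses
# in daltons (Da).
RESIDUE_MASS = {
    "G": 57.02146, "A": 71.03711, "S": 87.03203, "P": 97.05276,
    "V": 99.06841, "T": 101.04768, "C": 103.00919, "L": 113.08406,
    "I": 113.08406, "N": 114.04293, "D": 115.02694, "Q": 128.05858,
    "K": 128.09496, "E": 129.04259, "M": 131.04049, "H": 137.05891,
    "F": 147.06841, "R": 156.10111, "Y": 163.06333, "W": 186.07931,
}
WATER = 18.01056  # mass of H2O added at the peptide termini

def peptide_mass(sequence: str) -> float:
    """Monoisotopic mass of an unmodified peptide, in Da."""
    return sum(RESIDUE_MASS[aa] for aa in sequence) + WATER

def match_candidates(measured: float, candidates, tol_ppm: float = 10.0):
    """Candidate sequences whose computed mass lies within tol_ppm of measured."""
    return [s for s in candidates
            if abs(peptide_mass(s) - measured) / measured * 1e6 <= tol_ppm]

# HLA class I peptides are typically 8-11 residues long; SIINFEKL is a
# well-known model peptide.
print(round(peptide_mass("SIINFEKL"), 3))  # → 962.544
```

Cryptic peptides are a hard case for exactly this kind of matching, because the candidate sequences do not come from annotated protein databases.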

Effective points of attack for the immune system

It was already known from individual studies that cryptic peptides can trigger autoimmune reactions, such as in type 1 diabetes, as well as immune responses against tumor cells. The new analyses provide evidence that certain cryptic HLA peptides are exclusively found on tumour cells. Such tumour-specific cryptic HLA peptides might thus prove to be worthwhile target structures for cancer immunotherapies. Scientists at the University of Würzburg and the University Hospital of Würzburg are already examining a selection of the identified peptides to determine whether they are suitable as targets for cancer immunotherapy.

Virus-infected cells also present cryptic HLA peptides that could be used as a target structure for vaccines. With their new method, the researchers thus have an effective tool in hand to learn more about the general function and formation of cryptic peptides. "We hope that our bioinformatics approach will provide us with a better understanding of autoimmune reactions as well as immune reactions against tumour cells and virus-infected cells," says Schlosser.

Credit: 
University of Würzburg

Lack of damage after secondary impacts surprises researchers

image: A map showing crystallographic orientation of a region that originally contained a void, which was then subjected to a second shock loading (the shock wave passed from the bottom to the top of the image). The void has been recompacted with enough energy to not only reach a fully dense state, but drive recrystallization at the interface, as demonstrated by the thin band of very small grains.

Image: 
David Jones

WASHINGTON, June 23, 2020 -- When a material is subjected to an extreme load in the form of a shock or blast wave, damage often forms internally through a process called spall fracture.

Since these types of intense events are rarely isolated, research is needed to know how damaged materials respond to subsequent shock waves -- a piece of armor isn't much use if it disintegrates after one impact.

To the surprise of researchers, recent experimentation on spall fracture in metals found that, in certain cases, there was an almost complete lack of damage with only a thin band of altered microstructure observed. Usually, under these sorts of conditions, the material would contain hundreds of small voids and cracks.

In an article for the Journal of Applied Physics, published by AIP Publishing, researchers from Los Alamos National Laboratory narrowed down exactly why the expected damage was missing.

"Conflicting hypotheses were suggested for the lack of damage. Was there some sort of strengthening occurring, so that damage never nucleated, or was the damage recompacted to a fully dense state by some other loading?" said author David Jones. "By splitting the experiment into two phases -- damage formation and recompaction -- we could determine which hypothesis was correct."

Materials experiencing shock damage at high strain rates from a sudden impact will exhibit significantly different behavior compared to their response under standard, low-rate mechanical testing.

The researchers used gas-gun flyer-plate impact experiments to first damage samples, and then impact these samples a second time to see how the shock wave interacts with the damage field, which had not been done before. They found a shock stress of just 2 to 3 gigapascals actually recompacted a damaged copper target and created a new bond where the once broken surfaces were brought back together.

"This research, where careful experiments are used to isolate the strength and damage response of a material under shock loading, helps to reveal how microstructure plays a key role in dynamic response," said Jones.

The authors hope the future of shock physics research will involve next-generation free electron X-ray lasers, a game-changing tool.

"Being able to image in real time these micrometer-scale, microsecond-duration damage events in metals will be a paradigm shift in shock physics diagnostics," said Jones.

Credit: 
American Institute of Physics

Getting real with immersive sword fights

image: Computer Scientists at the University of Bath have found a solution to the challenges of creating realistic VR sword fights, and it's called Touché.

Image: 
Christof Lutteroth

Sword fights are often the weak link in virtual reality (VR) fighting games, with digital avatars engaging in battle using imprecise, pre-recorded movements that barely reflect the player's actions or intentions. Now a team at the University of Bath, in collaboration with the game development studio Ninja Theory, has found a solution to the challenges of creating realistic VR sword fights: Touché - a data-driven computer model based on machine learning.

Dr Christof Lutteroth, who created Touché with colleague Dr Julian Padget and EngD student Javier Dehesa, said: "Touché increases the realism of a sword fight by generating responsive animations against attacks and eliminating non-reactive behaviour from characters.

"Using our model, a game character can anticipate all possible fight situations and react to them, resulting in a more enjoyable and immersive game experience."

The unpredictability of user actions presents a major conundrum for designers of VR games, explained Dr Lutteroth, who is a senior lecturer in Computer Science, director of Real and Virtual Environments Augmentation Labs (REVEAL) and co-investigator at the Centre for the Analysis of Motion, Entertainment Research and Applications (CAMERA). "VR games offer new freedom for players to interact naturally using motion, but this makes it harder to design games that react to player motions convincingly," he said.

He added: "There are different expectations for screen-based video games. With these, a player presses 'attack' and their character displays a sequence of animations. But in a VR game, the player input is much harder to process."

The Touché framework for VR sword fighting simplifies the necessary technical work to achieve a convincing simulation. It eliminates the need for game designers to add layer upon layer of detail when programming how a character should move in a particular situation (for instance, to block a particular sword attack). Instead, actors wearing motion capture equipment are asked to perform a range of sword fighting movements, and Touché builds a model from these movements. The virtual version of the actor is able to react to different situations in a similar fashion to a flesh-and-blood fighter. Game designers can then fine-tune this model to meet their needs by adjusting high-level parameters, such as how skilled and aggressive the game character should be. All this saves game developers a lot of time and leads to more realistic results.

For the Bath study, 12 volunteers were asked to take part in two three-minute sword fights: for the first fight, they used technology that is currently available and for the second, they used Touché. Touché had a strong positive effect on realism and the perceived sword fighting skills of game characters. Feedback from participants pointed to a convincing preference for Touché, with current sword fights being described as 'unresponsive' and 'clumsy' by comparison.

"Based on this, we are convinced that Touché can deliver more enjoyable, realistic and immersive sword fighting experiences, presenting more skilled and less repetitive opponent behaviour," said Dr Lutteroth. "I'm convinced this framework is the future for games - not only for sword fighting but also for other types of interaction between game characters. It will save developers a lot of time."

Javier Dehesa, who is based at the Centre for Digital Entertainment, interviewed game developers who had tested this new technology. He said: "Developers see the Touché framework as an important practical step in the industry towards data-driven interaction techniques. We could see this technology appear in commercial games very soon."

Touché: Data-Driven Interactive Sword Fighting in Virtual Reality is published in CHI '20: Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems.

Video to accompany press release: https://vimeo.com/430682565

Credit: 
University of Bath

New compressor delivers above-terawatt 1.5-cycle pulses at kilohertz repetition rate

image: The inset shows the measured shape and beam profile of the 1.5-cycle pulses.

Image: 
MBI

Researchers at the Max Born Institute for Nonlinear Optics and Short Pulse Spectroscopy (MBI) have reached a new milestone in few-cycle pulse generation, breaking a 10-year-old record and achieving 1.5-optical-cycle laser pulses with 1.2 terawatt peak power using a new high-energy hollow-fiber compressor beamline. The intense pulses will be used to generate intense attosecond harmonic radiation for nonlinear XUV spectroscopy studies.

In order to shed light on complex charge transfer mechanisms during the formation of a chemical bond or in biologically relevant processes, one needs tools with exceptional temporal resolution in the attosecond (10^-18 s) realm. Single attosecond light pulses can be generated in the extreme ultraviolet (XUV) spectral range by focusing intense few-cycle laser pulses encompassing only a few oscillations of the electric field onto noble gas atoms, using the process called high-harmonic generation (HHG). However, the conversion efficiency is low, yielding very weak attosecond pulses, insufficient for nonlinear spectroscopic applications. In order to create more intense isolated attosecond pulses, high-energy, near infrared, few-cycle driving laser pulses are required.

Now, researchers at the MBI have made a big step forward in the energy scaling of the driver pulses. The group succeeded in spectrally broadening and subsequently compressing pulses of a titanium-sapphire laser, which emits at a wavelength of 790 nm, to a duration of 3.8 fs (1.5 optical cycles) at an energy of 6.1 mJ, a combination unprecedented at kilohertz repetition rate. Thus, the peak power of the pulses exceeds the terawatt (> 10^12 W) level. This result breaks a 10-year-old record achieved at RIKEN [1].
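The reported figures are easy to sanity-check: peak power is roughly pulse energy divided by pulse duration, and for an idealized Gaussian temporal profile the exact prefactor is about 0.94. The Gaussian assumption is ours; the measured 1.2 TW is somewhat lower because real pulse shapes deviate from an ideal Gaussian.

```python
# Back-of-the-envelope check of the reported pulse parameters.
energy = 6.1e-3          # pulse energy in joules (6.1 mJ, from the article)
duration = 3.8e-15       # pulse duration in seconds (3.8 fs FWHM)
wavelength = 790e-9      # central wavelength in meters

# For a Gaussian temporal profile, P_peak = 0.94 * E / t_FWHM (assumption).
peak_power = 0.94 * energy / duration

# One optical cycle at 790 nm lasts wavelength / c, about 2.6 fs.
cycles = duration / (wavelength / 299_792_458)

# ~1.5 TW estimated peak power; ~1.4-1.5 optical cycles per pulse
print(f"peak power ~ {peak_power/1e12:.1f} TW, ~{cycles:.1f} optical cycles")
```

Either way the estimate lands comfortably above 10^12 W, consistent with the above-terawatt claim.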

In order to achieve these results, a new 8.2-meter-long compressor beamline was built around a 3.75-meter-long stretched flexible hollow-core fiber (SF-HCF), in which spectral broadening took place through nonlinear interaction between the intense light pulses and helium atoms admitted into the capillary. The spectrally broadened pulses were then compressed by a set of chirped mirrors and characterized with an in-line dispersion-scan device placed directly in the vacuum beamline constructed for subsequent high-harmonic generation and XUV experiments. The new HCF compressor is an up-scaled version of a device recently developed in the framework of an international collaboration with participation of the MBI [2].

Credit: 
Forschungsverbund Berlin

Scientists create program that finds synteny blocks in different animals

image: Scientists developed a software tool that makes it possible to quickly and efficiently find similar parts in the genomes of different animals, which is essential for understanding how closely related two species are, and how far they have evolved from their common ancestor. The research was published in Giga Science.

Image: 
Dmitry Lisovsky, ITMO.NEWS

Modern genetics implies working with immense amounts of data which cannot be processed without the help of complex mathematical algorithms. For this reason, the task of developing special processing programs is no less important for bioinformatics specialists than that of genomic sequencing of specific animals. An international team of scientists that included researchers from ITMO University developed a software tool that makes it possible to quickly and efficiently find similar parts in the genomes of different animals, which is essential for understanding how closely related two species are, and how far they have evolved from their common ancestor. The research was published in GigaScience.

There are millions of biological species on planet Earth, and this diversity is encoded at the genetic level. Animals' anatomy, size, color patterns and habits are defined by their genes. Yet the diversity of genes themselves is not that great: to date, scientists have identified only about 20,000. Therefore, species differ not only in the sets of genes they have but also in how those genes are arranged. In the language of comparative genomics, this is called synteny, i.e. the arrangement of genes and regulatory elements.

"Let's take a gorilla and a chimpanzee as an example," says Ksenia Krasheninnikova, a researcher and engineer at ITMO University. "These two species have the same set of genes, but their regulatory elements and genome mutations create slightly different gene orders, which results in differences between these primates."

Therefore, for the purposes of understanding how close two species are from the evolutionary standpoint, scientists need to know not just their genes but also how they are arranged in a chromosome, and how many common genome fragments, or synteny blocks, as geneticists call them, there are. However, looking for them manually is impossible: the amount of data is just too big. Genomes of mammals consist of billions of base pairs, which makes processing them without big data technologies next to impossible. For this reason, scientists create programs of their own that make it possible to solve this new category of tasks which has emerged in the course of the development of this science. And this is what the research team that included scientists from ITMO's Laboratory of Genomic Diversity did.

The new software tool was named halSynteny. According to its authors, it can search for synteny blocks better and faster than other programs developed for this purpose. What's more, halSynteny works with data in two standard and well-documented formats.

"Our goal was to create an algorithm that could be easily applied to accessible data," says Ksenia, who is the first author of this research. "Some of the approaches to the identification of synteny sequences are based on annotating genes in advance; our method is different. We don't use any additional annotation. We use the alignment method, when different parts of one genome are aligned by their degree of similarity with parts of another genome. This way, we can identify homogeneous parts, parts that are of the same origin."

The program speeds up the computations more than twofold in comparison with SatsumaSynteny2, another popular tool. Such efficiency was attained by implementing a computationally efficient algorithm in C++.
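The general idea behind alignment-based synteny detection can be sketched in a few lines (a toy illustration, not halSynteny's actual algorithm or parameters): short matched segments between two genomes, called anchors, are chained into a synteny block as long as consecutive anchors advance in the same order in both genomes and the gaps between them stay small.

```python
# Toy illustration of chaining alignment anchors into synteny blocks.
# An anchor is a pair (position_in_genome_A, position_in_genome_B).
def chain_anchors(anchors, max_gap=50_000):
    """Group anchors into blocks of collinear, closely spaced matches."""
    anchors = sorted(anchors)
    blocks, current = [], [anchors[0]]
    for a, b in anchors[1:]:
        prev_a, prev_b = current[-1]
        # same block only if both genomes advance and neither gap is too large
        if 0 < a - prev_a <= max_gap and 0 < b - prev_b <= max_gap:
            current.append((a, b))
        else:
            blocks.append(current)
            current = [(a, b)]
    blocks.append(current)
    return blocks

anchors = [(100, 200), (5_000, 5_100), (9_000, 9_300),  # one collinear run
           (1_000_000, 40_000)]                         # a rearranged match
print(len(chain_anchors(anchors)))  # → 2
```

A rearrangement between the two genomes breaks the chain, so the rearranged match above starts a second block.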

The proposed method and software tool were tested by comparing cat and dog genomes.

"We showed that large fragments of cat chromosomes and some fragments of dog chromosomes unite in synteny blocks, which means that they've evolved from similar chromosomes of a common ancestor. And this can be used as a basis for making conclusions about their evolutionary process. Previous research in the field of "wet" biology demonstrated that cats' genome changed less from the genome of their common ancestor in comparison with that of dogs. This can be seen in comparison with other species that are not part of the carnivora order. The results that we got confirm these conclusions and make them more accurate. This means that in some specific part, the genome of a cat and the species taken for comparison is similar, and in dogs, it is rearranged."

In the future, this algorithm will be used in other comparative genomics research conducted at ITMO University.

Credit: 
ITMO University

Nearly 70% of patients make personal or financial sacrifices to afford medications

video: Beyond industry and survey data, the 2020 Medication Access Report also includes filmed interviews with patients and caregivers who share their personal experiences with common medication access challenges, and how these challenges have impacted their lives.

Image: 
CoverMyMeds

COLUMBUS, Ohio, June 23, 2020 -- Nearly 70 percent of patients have made personal or financial sacrifices to afford prescribed medications,1 according to new research released by CoverMyMeds, highlighting the impact of one of the most common medication access challenges. The 2020 Medication Access Report, and corresponding Prescription Decision Support, Electronic Prior Authorization, Specialty Patient Support and COVID-19 sub-reports, investigate healthcare barriers that impede access to prescriptions, contribute to increased provider burden and motivate patient consumerism. The report also examines the impact of the COVID-19 pandemic on healthcare and assesses how the market is responding to these challenges with tools that inform medication decisions, streamline administrative tasks and support remote healthcare.

"Navigating the patient journey can be complex, especially for patients who encounter significant barriers--such as medication cost, clinical requirements and enrollment processes--to obtain their prescription," said Eric Weidmann, MD, Chief Medical Officer of eMDs. "The complexity of obtaining access to medications can also take a toll on providers who are faced with increasingly burdensome administrative tasks, especially during the pandemic. As a practicing physician, I appreciate the benefits of technology to help streamline many traditionally-manual processes, inform my conversations with patients and support my prescribing decisions."

Key takeaways from the 2020 Medication Access Report include:

The COVID-19 pandemic has caused millions of Americans to face healthcare insecurity:

As of June 5, 2020, 44 million people -- over a quarter of the U.S. workforce -- had filed for first-time unemployment benefits since March 2020, when much of the U.S. economy began to shut down in response to the pandemic. This is six times the number of claims during the peak of the Great Recession.2

When asked what medication barriers their patients are experiencing due to COVID-19, 30 percent of providers said their patients are unable to pay for prescriptions.3

Since the beginning of COVID-19, more than 20 percent of patients said they've used a cash price program to help afford medications.4

The COVID-19 pandemic fast-tracked adoption of many healthcare technologies, but there is still room for growth.

Prior to COVID-19, only 11 percent of patients used telehealth services.5 Now, 67 percent say they are more likely to use telehealth services moving forward.4

Despite increased utilization, over 30 percent of providers said lack of integration with EHRs and privacy concerns were challenges they faced with telemedicine.3

80 percent of providers surveyed listed patients' lack of technology skills as a telemedicine challenge.3

Many Americans face medication access challenges, such as affordability barriers and manual processes that can delay care:

When patients cannot afford their prescriptions, 29 percent admit to abandoning their medications while 52 percent seek affordability options through their physician, a labor-intensive process which creates additional work for the provider and can delay the patient's time to therapy.1

55 percent of patients reported delays in time to therapy due to a prescribed medication requiring prior authorization.1

82 percent of patients say they spent at least one hour or more making multiple phone calls to track down needed information to begin specialty therapies.1 As a result of this time-consuming administrative work, nearly one in 10 patients reported waiting eight weeks or more to receive their first dose of therapy.1

"The 2020 Medication Access Report uses industry statistics, market research and new survey data to highlight critical barriers that can limit patients' access to medications," said Miranda Gill MSN, RN, NEA-BC, Senior Director, Provider Services and Operations at CoverMyMeds. "The report also highlights important strides in creating innovative solutions that help patients overcome many of these disruptive obstacles. However, there needs to be more widespread adoption and collaboration across the healthcare industry to see the true benefits of these solutions: streamlining inefficiencies which can help improve patients' health outcomes."

The 2020 Medication Access Report is published by CoverMyMeds, part of McKesson Prescription Technology Solutions, with an advisory board of leaders from BestRx, Blue Cross Blue Shield North Carolina, Cerner Corporation, eMDs, Express Scripts, Horizon Government Affairs, Humana, National Alliance of State Pharmacy Associations, National Council for Prescription Drug Programs, National Patient Advocate Foundation, OptumRx, Orsini Healthcare, RelayHealth Pharmacy Solutions, RxCrossroads and University of Virginia Health System.


Sources

1 - CoverMyMeds Patient and Provider Surveys, 2020
Survey based on responses from 1,000 patients and 400 providers.

2 - Unemployment Insurance Weekly Claims, Department of Labor, 2020

3 - CoverMyMeds Provider COVID-19 Survey, 2020
Survey based on responses from 3,000 providers.

4 - CoverMyMeds Patient COVID-19 Survey, 2020
Survey based on responses from 500 patients.

5 - Telehealth: A quarter-trillion-dollar post-COVID-19 reality?, McKinsey & Company, 2020

Credit: 
CoverMyMeds

CAR T cell therapy: potential for considerable savings

Chimeric antigen receptor (CAR) T cell therapy is a new and in some cases highly effective form of immunotherapy to treat certain types of cancer of the blood and lymph system. This promising treatment comes at a cost, however: The manufacturers charge up to EUR 320,000 for the production of immune cells for a single patient. By determining the fixed and variable costs involved, researchers from the German Cancer Research Center (DKFZ) established that cellular immunotherapy could be produced at a scientific institution such as DKFZ at around a tenth of the cost.

For several years now, physicians have been achieving considerable success using an innovative form of immunotherapy in patients with certain types of cancer of the blood and lymph system: The treatment involves harvesting immune cells (T cells) from the patient and modifying them outside the body to make them more effective in attacking the malignant leukemia cells. To do so, the researchers equip the cells in the laboratory with a specific receptor protein. This chimeric antigen receptor (CAR) recognizes a protein molecule as a target structure that is formed by every cancer cell in certain types of leukemia. The CAR T cells are subsequently propagated and transferred back to the patient - where they hunt down malignantly transformed leukemia cells, sometimes with spectacular success.

Two commercial CAR T cell products have been approved to treat patients with B cell acute lymphoblastic leukemia and non-Hodgkin lymphomas such as diffuse large B cell lymphoma. They are only used if other treatment options have failed. The treatment is often effective: two years after treatment, 40 to 60 percent of patients have not relapsed.

This promising and highly individualized treatment comes at a cost, however: In Germany, the manufacturers charge up to EUR 320,000 for the production of CAR T cells for a patient. "CAR T cell therapy is still only feasible in a few cancer patients, but this treatment approach will hopefully be able to be extended to other types of cancer. There is considerable concern that our health care systems will not be able to meet the costs of this potential increase in the number of patients," Michael Schlander, a health economist at DKFZ, explained.

As an alternative to the two CAR T cell products developed by large pharmaceutical companies, many research institutions, including DKFZ, hope to manufacture the therapeutic cells themselves. For the first time, health economist Schlander, immunologist Stefan Eichmüller and their team of researchers at DKFZ have now listed in detail the costs incurred by an academic institution in producing CAR T cell therapies.

The fixed and variable costs included setting up a clean room, laboratory materials, equipment, and all salaries and non-wage labor costs for the specially trained laboratory staff. As expected, the calculated costs were heavily dependent on the capacity utilization rate of the fully automated production system for the cells. The researchers based their calculations on different scenarios, including a maximum annual capacity utilization rate of the machine with 18 CAR T cell products.

Under these conditions, DKFZ could produce a CAR T cell product to treat a patient for less than EUR 60,000. "That would be only about a fifth of the price that the companies charge. And we can cut these costs even further by a considerable amount," Michael Schlander explained, adding that the greatest savings could be made if several of the automated production machines were operated at the same time. An alternative method of transferring the genes for the chimeric receptor could further reduce the production costs to as little as around EUR 33,000 - a tenth of the current commercial price.
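The dependence on capacity utilization follows from a basic fixed-plus-variable cost model. The split below is a hypothetical illustration chosen to be consistent with the reported figure of under EUR 60,000 per product at full utilization (18 products per year); the study's actual cost categories are far more detailed.

```python
# Simplified cost model: fixed annual costs (clean room, equipment, staff)
# are spread over annual output; variable costs accrue per product.
def cost_per_product(fixed_annual: float, variable_per_unit: float,
                     products_per_year: int) -> float:
    return fixed_annual / products_per_year + variable_per_unit

# Hypothetical split (assumed, for illustration only):
fixed = 540_000.0      # clean room, equipment, salaries per year
variable = 28_000.0    # reagents and disposables per product

print(round(cost_per_product(fixed, variable, 18)))  # full utilization
print(round(cost_per_product(fixed, variable, 9)))   # half utilization
```

At full load the hypothetical per-product cost stays below EUR 60,000, while at half load it rises substantially, which is why the calculated costs are "heavily dependent" on utilization and why running several machines in parallel cuts costs further.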

Quite apart from these direct savings, the patients would benefit from decentralized CAR T cell production too: "Without the time needed to send the patient's blood and the finished cell therapy, we can provide the treatment within 12 to 14 days - a considerable reduction compared with the three- to four-week waiting time for the commercial products. Patients might then need less chemotherapy and would spend less time in hospital, which would lead to further savings," remarked immunologist Stefan Eichmüller.

The analysis did not take account of the cost of license fees. Michael Schlander and Stefan Eichmüller hope that the present study will also encourage the manufacturing companies to review their current pricing policy for CAR T cell therapies.

Credit: 
German Cancer Research Center (Deutsches Krebsforschungszentrum, DKFZ)

A bacterial toxin turning cells into Swiss cheese

image: Researchers from Kanazawa University developed a novel tool to study how the innate immune system fights bacterial toxins. They purified the pore-forming toxin Monalysin from a bacterial culture, and structurally and functionally characterized the purified toxin to show how it functions at the molecular level. This study could help understand the mechanisms underlying the interactions between host defense and microbial invaders.

Image: 
Kanazawa University

Kanazawa, Japan - Although the innate immune system is the front line of defense against microbial infections, the complex mechanisms of innate immunity are incompletely understood. In a new study, researchers from Kanazawa University synthesized and characterized the bacterial toxin Monalysin to enable the study of how the innate immune system and toxin-producing bacteria interact with each other.

The innate immune system detects microbial infections through sensing either microbial molecules (pathogen-associated molecular patterns, or PAMPs) or host signaling molecules that are released from damaged host cells (damage-associated molecular patterns, or DAMPs). The bacterium Pseudomonas entomophila has been utilized as a tool to study the mechanisms of DAMPs in the gut. P. entomophila infects insects and damages intestinal cells using a pore-forming toxin called Monalysin. Monalysin is secreted as an inactive pro-toxin, which is then activated by certain proteins called proteases. Although the fruit fly, Drosophila, protects itself from activation of the pro-toxin by building a physical barrier against proteases, it can still take damage upon exposure to the toxin.

"Activated Monalysin forms pores in the plasma membrane of host cells, resulting in cell death, so it is important for the host to prevent its activation," says corresponding author of the study Takayuki Kuraishi. "We wanted to purify and functionally characterize Monalysin from P. entomophila to develop a tool that could help us understand how the host and bacteria that produce pore-forming toxins interact."

To achieve their goal, the researchers cultured P. entomophila and purified pro-Monalysin from their lysates. By reacting the purified toxin with Drosophila cells, the researchers confirmed its toxic effect when cell viability dropped significantly as more pro-Monalysin was added to the cells. To confirm that purified Monalysin forms pores, the researchers added activated Monalysin onto a chip covered with a lipid bilayer, similar to the plasma membrane of cells. By measuring the electrical current resulting from ion passage through the formed pores, the researchers showed that Monalysin forms pores around 0.7 to 1 nm in diameter. To analyze the structural composition of Monalysin, the researchers then turned to atomic force microscopy (AFM), which provides high-resolution images by touching the surface with a sensitive mechanical probe. Using AFM, the researchers showed that eight Monalysin molecules came together to form pores in the plasma membrane. By combining AFM with high-speed imaging, the researchers then demonstrated that activated Monalysin preferentially inserted into the edge of the plasma membrane, suggesting that highly curved parts of membranes are the sites of their action.
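How a measured ionic current constrains pore size can be illustrated with Hille's textbook cylindrical-pore model, in which total resistance is the sum of the pore's channel resistance and the access resistance at its mouths. This is a rough, generic estimate rather than the authors' analysis, and the electrolyte conductivity and membrane thickness below are assumed values.

```python
import math

# Hille's cylindrical-pore model: G = 1 / [ (1/sigma) * ( L/(pi r^2) + 1/(2r) ) ]
# where the first term is the channel resistance and the second the access
# resistance at the two pore mouths.
def pore_conductance(diameter: float, length: float, sigma: float) -> float:
    """Conductance in siemens for a cylindrical pore in an electrolyte."""
    r = diameter / 2
    resistance = (1 / sigma) * (length / (math.pi * r**2) + 1 / (2 * r))
    return 1 / resistance

sigma = 10.0      # electrolyte conductivity, S/m (roughly 1 M KCl; assumed)
length = 5e-9     # lipid bilayer thickness, m (assumed)

for d in (0.7e-9, 1.0e-9):  # the diameter range reported in the study
    g = pore_conductance(d, length, sigma)
    print(f"d = {d*1e9:.1f} nm -> G = {g*1e9:.2f} nS")
```

Pores in the reported 0.7 to 1 nm range would thus conduct on the order of a nanosiemens under these assumptions, the scale of single-channel currents such chip-based recordings resolve.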

"These are striking results that show how Monalysin functions at the molecular level," says Kuraishi. "Our findings could help understand how the innate immune system fights off bacteria that produce pore-forming toxins."

Credit: 
Kanazawa University

Deep drone acrobatics

image: A quadrotor performs a matty flip.

Image: 
Elia Kaufmann

Since the dawn of flight, pilots have used acrobatic maneuvers to test the limits of their airplanes. The same goes for flying drones: Professional pilots often gauge the limits of their drones and measure their level of mastery by flying such maneuvers in competitions.

Greater efficiency, full speed

Working together with microprocessor company Intel, a team of researchers at the University of Zurich has now developed a quadrotor helicopter, or quadcopter, that can learn to fly acrobatic maneuvers. While a power loop or a barrel roll might not be needed in conventional drone operations, a drone capable of performing such maneuvers is likely to be much more efficient. It can be pushed to its physical limits, make full use of its agility and speed, and cover more distance within its battery life.

The researchers have developed a navigation algorithm that enables drones to autonomously perform various maneuvers - using nothing more than onboard sensor measurements. To demonstrate the efficiency of their algorithm, the researchers flew maneuvers such as a power loop, a barrel roll or a matty flip, during which the drone is subject to very high thrust and extreme angular acceleration. "This navigation is another step towards integrating autonomous drones in our daily lives," says Davide Scaramuzza, robotics professor and head of the robotics and perception group at the University of Zurich.

Trained in simulation

At the core of the novel algorithm lies an artificial neural network that combines input from the onboard camera and sensors and translates this information directly into control commands. The neural network is trained exclusively through simulated acrobatic maneuvers. This has several advantages: Maneuvers can easily be simulated through reference trajectories and do not require expensive demonstrations by a human pilot. Training can scale to a large number of diverse maneuvers and does not pose any physical risk to the quadcopter.

Only a few hours of simulation training are enough and the quadcopter is ready for use, without requiring additional fine-tuning using real data. The algorithm abstracts the sensory input learned in simulation so that it transfers to the physical world. "Our algorithm learns how to perform acrobatic maneuvers that are challenging even for the best human pilots," says Scaramuzza.
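Conceptually, such a policy is simply a learned function from sensory features to low-level commands. A schematic NumPy sketch of that mapping (the feature dimension, layer sizes, and the thrust-plus-body-rates output convention are illustrative assumptions, not the authors' architecture, and the weights here are untrained):

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp_policy(features, weights):
    """Map a flat vector of camera/IMU-derived features to 4 commands:
    collective thrust plus three body rates."""
    h = features
    for W, b in weights[:-1]:
        h = np.tanh(W @ h + b)          # hidden layers
    W, b = weights[-1]
    return W @ h + b                    # [thrust, roll, pitch, yaw rate]

def init_weights(sizes):
    """Random, untrained weights for a fully connected network."""
    return [(rng.standard_normal((m, n)) * 0.1, np.zeros(m))
            for n, m in zip(sizes[:-1], sizes[1:])]

weights = init_weights([64, 128, 128, 4])        # illustrative sizes only
command = mlp_policy(rng.standard_normal(64), weights)
print(command.shape)  # (4,)
```

In the real system the training loop, not shown here, would fit such weights against reference trajectories generated in simulation.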

Fast drones for fast missions

However, the researchers acknowledge that human pilots are still better than autonomous drones. "Human pilots can quickly process unexpected situations and changes in the surroundings, and are faster to adjust," says Scaramuzza. Nevertheless, the robotics professor is convinced that drones used for search and rescue missions or for delivery services will benefit from being able to cover long distances quickly and efficiently.

Credit: 
University of Zurich

Sweet or sour natural gas

Natural gas that contains larger amounts of hydrogen sulfide (H(2)S) and carbon dioxide (CO(2)) is termed sour gas. Before it can enter a pipeline, it must be "sweetened" by removal of its acidic impurities. Through fine tuning of the ratios of two molecular components, it is possible to produce tailored polyimide membranes that can purify sour gas with a wide range of compositions, as reported by researchers in the journal Angewandte Chemie.

The main component of natural gas is methane (CH(4)). The H(2)S and CO(2) in sour gas react acidically with moisture, making them highly corrosive. In addition, H(2)S is highly toxic and poses a safety risk. Today, sweetening is usually achieved through very energy-intensive chemical scrubbing, which is not economically viable for gas with high concentrations of H(2)S and CO(2). In addition, this process requires a large, complex apparatus that is impossible to use in remote or offshore facilities. Scalable, economical membrane separations represent an excellent alternative.

Membranes based on glassy polyimide polymers made of a special nitrogen- and oxygen-containing group demonstrate good separation efficiency. However, a fundamental understanding of the relationships between the structures of polyimides and their gas-transport properties in the presence of H(2)S has been lacking, impeding the design of advanced membranes. A team led by William J. Koros at the Georgia Institute of Technology (Atlanta, USA) has now taken on this subject.

Membrane separations are based on the fact that gases with higher solubility pass more easily through membrane materials; however, smaller gas molecules can also diffuse through membranes more easily. The challenge for sweetening lies in the fact that the separation of CO(2) relies primarily on a size difference (CO(2) is smaller than CH(4)), while the separation of the similarly sized H(2)S and CH(4) depends on differences in solubility. In addition, glassy polyimide membranes begin to soften as they absorb more dissolved gas. This is favorable for the separation of H(2)S but unfavorable for the separation of CO(2).

For their experiments, the researchers produced polyimides based on 6FDA (4,4'-(hexafluoroisopropylidene) diphthalic anhydride). They used two different 6FDA building blocks, which they polymerized in a variety of ratios. One building block (DAM) introduces a bulky trimethyl benzene group, which prevents the polymer chains from being densely packed. This increases both the gas permeability and the tendency to soften. The other building block (DABA) contains a polar benzoic acid group. This tightens the packing of the chains, decreasing permeability, but increases H(2)S solubility.

Higher proportions of DAM increase the permeability toward CO(2), but also CH(4), which decreases selectivity. In contrast, the selectivity with regard to H(2)S is barely affected. The more DAM included, the more the polymer softens, which is unfavorable for CO(2) but favorable for H(2)S. By carefully adjusting the relative amounts of the building blocks, the packing of the polymer chains and the tendency to plasticize can be balanced to produce membranes that simultaneously and efficiently separate out both H(2)S and CO(2). This makes it possible to tailor membranes for different natural gas compositions.
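The tradeoff described above is usually quantified by the ideal selectivity, the ratio of two pure-gas permeabilities. A toy illustration of that bookkeeping (the permeability values are invented to mimic the reported trend and are not data from the paper):

```python
# Ideal selectivity alpha(i/j) = P_i / P_j, permeabilities in Barrer.
# Hypothetical numbers for two DAM:DABA ratios, chosen only to show the
# trend: more DAM raises permeability but erodes CO2/CH4 selectivity.
membranes = {
    "high-DAM": {"CO2": 600.0, "H2S": 400.0, "CH4": 25.0},
    "high-DABA": {"CO2": 120.0, "H2S": 110.0, "CH4": 3.0},
}

for name, P in membranes.items():
    a_co2 = P["CO2"] / P["CH4"]   # size-driven separation
    a_h2s = P["H2S"] / P["CH4"]   # solubility-driven separation
    print(f"{name}: alpha(CO2/CH4)={a_co2:.0f}, alpha(H2S/CH4)={a_h2s:.0f}")
```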

Credit: 
Wiley

Marching for change: 2017 Women's March met with mostly positive support online

UNIVERSITY PARK, Pa. -- Large protest events can be divisive, spurring an outpouring of both support and opposition. But new Penn State research found that the 2017 Women's March, which championed goals in support of women and human rights, was met with mostly positive support on social media, with relatively few negative messages.

In an analysis of all geolocated tweets in the continental U.S. on the day of the march, the researchers found that not only were tweets about the march generally positive, but they were actually more positive than other geolocated tweets -- those that have an attached location -- on that day. Tweets about the march rose to a peak of 12% of all geolocated tweets on that day.

Diane Felmlee, professor of sociology, said research into understanding how social justice issues are reflected in social media continues to be relevant given the current protests championing racial justice happening across the country.

"The U.S. has a long history of protest movements, from Colonial times through Civil Rights and Vietnam War protests," Felmlee said. "Recently, however, protests are accompanied by corresponding online reactions in real-time. Furthermore, controversy regarding Twitter's recent move to increase its labeling of tweets that are misleading, glorify violence, or could cause harm, makes studying links between social media and events that advocate social causes especially important."

According to the researchers, the Women's March -- which took place on January 21, 2017, the day after the inauguration of President Donald Trump -- was at the time the largest single-day protest in American history. More than 4 million people, about 1.3% of the nation's population, participated.

But despite the widespread support and participation, Felmlee said the March also spurred some controversy from certain political and activist groups.

"The Women's March of 2017 was an inspiring, historic event, and my coauthors and I wanted to capture public sentiment before, during, and after it occurred," Felmlee said. "My own research focuses on sexist and racist online harassment, and we were concerned that a backlash toward the March could erupt into extensive, aggressive, harmful messaging on social media."

For the study, the researchers used the Twitter Streaming Application Programming Interface to collect all geolocated tweets in the continental U.S. from Jan. 20 to 22, 2017, that referenced the Women's March and its sister marches. They analyzed the content of the tweets to rank them on how positive or negative they were.
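The paper's exact sentiment model is not described here; a minimal lexicon-based scorer of the kind commonly used for such positive/negative rankings might look like this (the word lists and scores are illustrative, not the study's lexicon):

```python
# Toy lexicon-based sentiment scoring: +1 per positive term, -1 per negative.
POSITIVE = {"inspiring", "proud", "love", "support", "historic", "hope"}
NEGATIVE = {"hate", "pathetic", "disgusting", "waste", "angry"}

def score(tweet):
    """Net sentiment of a tweet: positive hits minus negative hits."""
    words = tweet.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

tweets = [
    "So proud to support this historic march",
    "What a waste of time",
]
print([score(t) for t in tweets])  # [3, -1]
```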

The researchers found that from Jan. 20 to 22, 3.1% of all tweets in the study contained a term about the Women's March. On the day of the March -- Jan. 21 -- more than 40,000 users tweeted more than 64,000 geotagged messages about the Women's March. These tweets made up 2.6% of all geolocated tweets collected that day.

According to the researchers, while the majority of messages were positive, they did find negative ones. Negative sentiment was mostly tweeted from seven metropolitan areas in Alabama, Colorado, New Mexico, Ohio, Oregon, South Dakota and Washington. However, these negative messages made up only a small portion of the total.

Felmlee said she was happy to find few aggressive and bullying tweets that targeted the March or its participants.

"The widespread, largely supportive and positive nature of messages across the continental U.S. was significant," Felmlee said. "I was particularly surprised by these generally supportive reactions to the March, especially given the political divides in our country, and the all too common occurrence of sexist and racist online harassment and abuse."

Additionally, Felmlee said the results -- recently published in PLoS ONE -- give insights into how social media fits into social movements and pushes for change.

"The physical and online worlds are linked -- social movement rallies can be effective tools to gain online public support for social justice causes," Felmlee said. "This is why some governments shut down social media during protests. More positively, it is why so many agencies and organizations use social media to disseminate information."

Justine I. Blanford, associate teaching professor of geography; Stephen A. Matthews, professor of sociology, anthropology, demography, and geography; and Alan M. MacEachren, professor of geography and information science and technology, also coauthored this work.

The Penn State Social Science Research Institute and the Population Research Institute helped support this research.

Credit: 
Penn State

LIGO-Virgo finds mystery astronomical object in 'mass gap'

image: In August of 2019, the LIGO-Virgo gravitational-wave network witnessed the merger of a black hole with 23 times the mass of our sun and a mystery object 2.6 times the mass of the sun. Scientists do not know if the mystery object was a neutron star or black hole, but either way it set a record as being either the heaviest known neutron star or the lightest known black hole.

Image: 
LIGO/Caltech/MIT/R. Hurt (IPAC)

An international research collaboration, including Northwestern University astronomers, has detected a mystery object inside the puzzling area known as the "mass gap" -- the range that lies between the heaviest known neutron star and the lightest known black hole. The finding has important implications for astrophysics and the understanding of low-mass compact objects.

When the most massive stars die, they collapse under their own gravity and leave behind black holes; when stars that are a bit less massive than this die, they explode in a supernova and leave behind dense, dead remnants of stars called neutron stars. The heaviest known neutron star is no more than 2.5 times the mass of our sun, or 2.5 solar masses, and the lightest known black hole is about 5 solar masses. For decades, astronomers have wondered: Are there any objects in this mass gap?

Now, in a new study from the National Science Foundation's Laser Interferometer Gravitational-Wave Observatory (LIGO) and the European Virgo observatory, scientists have announced the discovery of an object of 2.6 solar masses, placing it firmly in the mass gap.

The intriguing object was found on Aug. 14, 2019, as it merged with a black hole of 23 solar masses, generating a splash of gravitational waves detected back on Earth by LIGO and Virgo. A paper about the detection was published today (June 23) by The Astrophysical Journal Letters.

"Mergers of a mixed nature -- black holes and neutron stars -- have been predicted for decades, but this compact object in the mass gap is a complete surprise," said Northwestern's Vicky Kalogera, who coordinated writing of the paper. "We are really pushing our knowledge of low-mass compact objects. Even though we can't classify the object with conviction, we have seen either the heaviest known neutron star or the lightest known black hole. Either way, it breaks a record."

Kalogera, a leading astrophysicist in the LIGO Scientific Collaboration (LSC), is an expert in the astrophysics of compact object binaries and analysis of gravitational-wave data. She is the Daniel I. Linzer Distinguished University Professor of Physics and Astronomy and director of CIERA (Center for Interdisciplinary Exploration and Research in Astrophysics) in Northwestern's Weinberg College of Arts and Sciences.

"Whereas we are not sure about the nature of the low-mass compact object, we have obtained a very robust measure of its mass, which falls right into the so-called mass gap," said Mario Spera, a co-author of the paper who studies the formation of merging binaries. He is a Virgo collaboration member and a European Union Marie Curie Postdoctoral Fellow at CIERA and the University of Padova.

"This exciting and unprecedented finding, combined with the unique mass ratio of the merger event, challenges all the astrophysical models that try to shed light on the origins of this event," Spera said. "However, we are quite sure that the universe is telling us, for the umpteenth time, that our ideas on how compact objects form, evolve and merge are still very fuzzy."

The cosmic merger described in the study, an event dubbed GW190814, resulted in a final black hole about 25 times the mass of the sun. (Some of the merged mass was converted to a blast of energy in the form of gravitational waves). The newly formed black hole lies about 800 million light-years away from Earth.
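The rounded numbers above imply roughly how much mass was radiated away: the components sum to about 25.6 solar masses against a final black hole of about 25. A back-of-the-envelope sketch using only the approximate values quoted in the text:

```python
M_SUN = 1.989e30   # solar mass in kg
C = 2.998e8        # speed of light in m/s

m1, m2, m_final = 23.0, 2.6, 25.0    # solar masses, as quoted (approximate)
m_radiated = m1 + m2 - m_final       # ~0.6 solar masses
energy = m_radiated * M_SUN * C**2   # E = m c^2, in joules
print(f"~{m_radiated:.1f} M_sun radiated, ~{energy:.1e} J in gravitational waves")
```

Because the final mass is only quoted as "about 25", the ~0.6 solar-mass figure is itself a rough estimate.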

Before the two objects merged, their masses differed by a factor of nine, making this the most extreme mass ratio known for a gravitational-wave event. Another recently reported LIGO-Virgo event, called GW190412, occurred between two black holes with a mass ratio of about 4:1.

In addition to Kalogera and Spera, the other Northwestern researchers involved in the study are Chase Kimball, Christopher Berry and Mike Zevin. The three are authors of the paper and members of CIERA.

Kimball, an astronomy Ph.D. student and LSC member, assessed how often mergers such as GW190814 occur in the universe. Berry, the CIERA Board of Visitors Research Professor, is a member of the LSC Editorial Board for all LSC publications and served as the lead representative for this study. Zevin, an astronomy Ph.D. student and LSC member, contributed to the astrophysical interpretation and also to writing the GW190412 discovery paper.

"It's a challenge for current theoretical models to form merging pairs of compact objects with such an extreme mass ratio in which the low-mass partner resides in the mass gap," Kalogera said. "This discovery implies these events occur much more often than we predicted, making this a really intriguing low-mass object.

"The mystery object may be a neutron star merging with a black hole -- an exciting possibility expected theoretically but not yet confirmed observationally," she said. "However, at 2.6 times the mass of our sun, it exceeds modern predictions for the maximum mass of neutron stars and may instead be the lightest black hole ever detected."

"Whether or not the object is a heavy neutron star or a light black hole, the discovery is the first in a new class of binary mergers," Kimball added. "Models of binary populations will have to account for how often we now can infer that these sort of events occur."

When the LIGO and Virgo scientists spotted this merger, they immediately sent out an alert to the astronomical community. Dozens of ground- and space-based telescopes followed up in search of light waves generated in the event, but none picked up any signals.

So far, such light counterparts to gravitational-wave signals have been seen only once, in an event called GW170817. The event, discovered by the LIGO-Virgo network in August of 2017, involved a fiery collision between two neutron stars that was subsequently witnessed by dozens of telescopes on Earth and in space. Neutron star collisions are messy affairs with matter flung outward in all directions and are thus expected to shine with light. Conversely, black hole mergers, in most circumstances, are thought not to produce light.

According to the LIGO and Virgo scientists, the August 2019 event was not seen in light for a few possible reasons. First, this event was six times farther away than the merger observed in 2017, making it harder to pick up any light signals. Secondly, if the collision involved two black holes, it likely would have not shone with any light. Thirdly, if the object was in fact a neutron star, its nine-fold more massive black-hole partner might have swallowed it whole; a neutron star consumed whole by a black hole would not give off any light.

"I think of Pac-Man eating a little dot," says Kalogera. "When the masses are highly asymmetric, the smaller compact object can be eaten by the black hole in one bite."

How will researchers ever know if the mystery object was a neutron star or black hole? Future observations with LIGO and possibly other telescopes may catch similar events that would help reveal whether additional objects exist in the mass gap.

"The mass gap has been an interesting puzzle for decades, and now we've detected an object that fits just inside it," said Pedro Marronetti, program director for gravitational physics at the National Science Foundation (NSF). "That cannot be explained without defying our understanding of extremely dense matter or what we know about the evolution of stars. This observation is yet another example of the transformative potential of the field of gravitational-wave astronomy, which brings novel insights to light with every new detection."

Credit: 
Northwestern University

LIGO-Virgo finds mystery object in 'mass gap'

image: In August of 2019, the LIGO-Virgo gravitational-wave network witnessed the merger of a black hole with 23 times the mass of our sun and a mystery object 2.6 times the mass of the sun. Scientists do not know if the mystery object was a neutron star or black hole, but either way it set a record as being either the heaviest known neutron star or the lightest known black hole.

Image: 
LIGO/Caltech/MIT/R. Hurt (IPAC)

When the most massive stars die, they collapse under their own gravity and leave behind black holes; when stars that are a bit less massive die, they explode in supernovas and leave behind dense, dead remnants of stars called neutron stars. For decades, astronomers have been puzzled by a gap that lies between neutron stars and black holes: the heaviest known neutron star is no more than 2.5 times the mass of our sun, or 2.5 solar masses, and the lightest known black hole is about 5 solar masses. The question remained: does anything lie in this so-called mass gap?

Now, in a new study from the National Science Foundation's Laser Interferometer Gravitational-Wave Observatory (LIGO) and the Virgo detector in Europe, scientists have announced the discovery of an object of 2.6 solar masses, placing it firmly in the mass gap. The object was found on August 14, 2019, as it merged with a black hole of 23 solar masses, generating a splash of gravitational waves detected back on Earth by LIGO and Virgo. A paper about the detection is being published today, June 23, in The Astrophysical Journal Letters.

"We've been waiting decades to solve this mystery" says co-author Vicky Kalogera, a professor at Northwestern University. "We don't know if this object is the heaviest known neutron star, or the lightest known black hole, but either way it breaks a record."

"This is going to change how scientists talk about neutron stars and black holes," says co-author Patrick Brady, a professor at the University of Wisconsin, Milwaukee, and the LIGO Scientific Collaboration spokesperson. "The mass gap may in fact not exist at all but may have been due to limitations in observational capabilities. Time and more observations will tell."

The cosmic merger described in the study, an event dubbed GW190814, resulted in a final black hole about 25 times the mass of the sun (some of the merged mass was converted to a blast of energy in the form of gravitational waves). The newly formed black hole lies about 800 million light-years away from Earth.

Before the two objects merged, their masses differed by a factor of 9, making this the most extreme mass ratio known for a gravitational-wave event. Another recently reported LIGO-Virgo event, called GW190412, occurred between two black holes with a mass ratio of about 4:1.

"It's a challenge for current theoretical models to form merging pairs of compact objects with such a large mass ratio in which the low-mass partner resides in the mass gap. This discovery implies these events occur much more often than we predicted, making this a really intriguing low-mass object," explains Kalogera. "The mystery object may be a neutron star merging with a black hole, an exciting possibility expected theoretically but not yet confirmed observationally. However, at 2.6 times the mass of our sun, it exceeds modern predictions for the maximum mass of neutron stars, and may instead be the lightest black hole ever detected."

When the LIGO and Virgo scientists spotted this merger, they immediately sent out an alert to the astronomical community. Dozens of ground- and space-based telescopes followed up in search of light waves generated in the event, but none picked up any signals. So far, such light counterparts to gravitational-wave signals have been seen only once, in an event called GW170817. That event, discovered by the LIGO-Virgo network in August of 2017, involved a fiery collision between two neutron stars that was subsequently witnessed by dozens of telescopes on Earth and in space. Neutron star collisions are messy affairs with matter flung outward in all directions and are thus expected to shine with light. Conversely, black hole mergers, in most circumstances, are thought not to produce light.

According to the LIGO and Virgo scientists, the August 2019 event was not seen by light-based telescopes for a few possible reasons. First, this event was six times farther away than the merger observed in 2017, making it harder to pick up any light signals. Secondly, if the collision involved two black holes, it likely would have not shone with any light. Thirdly, if the object was in fact a neutron star, its 9-fold more massive black-hole partner might have swallowed it whole; a neutron star consumed whole by a black hole would not give off any light.

"I think of Pac-Man eating a little dot," says Kalogera. "When the masses are highly asymmetric, the smaller neutron star can be eaten in one bite."

How will researchers ever know if the mystery object was a neutron star or a black hole? Future observations with LIGO, Virgo, and possibly other telescopes may catch similar events that would help reveal whether additional objects exist in the mass gap.

"This is the first glimpse of what could be a whole new population of compact binary objects," says Charlie Hoy, a member of the LIGO Scientific Collaboration and a graduate student at Cardiff University. "What is really exciting is that this is just the start. As the detectors get more and more sensitive, we will observe even more of these signals, and we will be able to pinpoint the populations of neutron stars and black holes in the universe."

"The mass gap has been an interesting puzzle for decades, and now we've detected an object that fits just inside it," says Pedro Marronetti, program director for gravitational physics at the National Science Foundation (NSF). "That cannot be explained without defying our understanding of extremely dense matter or what we know about the evolution of stars. This observation is yet another example of the transformative potential of the field of gravitational-wave astronomy, which brings novel insights with every new detection."

Credit: 
California Institute of Technology

Bedtime media use linked to less sleep in children who struggle to self-regulate behavior

video: Leah Doane and Sierra Clifford, from the Arizona State University Department of Psychology, discuss how media use in the hour before bed was associated with less sleep in children who generally struggle to self-regulate their behavior. Media use in children who scored high on measures of effortful control was not related to less sleep.

Image: 
Robert Ewing, ASU

For some children, screen time before bed translates to less sleep.

According to a study from the Arizona State University Department of Psychology, media use in the hour preceding bedtime impacts how kids sleep, especially children who struggle to self-regulate their behavior. Frequent media use before bed in these children predicted later bedtimes and less sleep. The work is now available online in Psychological Science.

"Among kids who used the same amount of media in the hour before bed, we found differences that were explained by a personality characteristic called effortful control," said Leah Doane, associate professor of psychology at ASU and senior author on the paper. "Kids who score low on measures of effortful control are the ones who struggle to wait to unwrap a present or are easily distracted. We found a strong association between media use in the hour before bed and when these kids went to sleep and how long they slept. Media use before bed was not associated with the sleep of kids who scored high on measures of effortful control."

The research team spent a week following 547 children, aged 7-9 years. The participant group was socioeconomically diverse and lived in rural and urban areas. The parents kept daily diaries that tracked the children's media use and sleep patterns. They also completed a survey that asked about their children's temperament, including their ability to self-regulate behavior.

For the entire week, the children wore specialized wrist watches called actigraphs that tracked their movement and also ambient light. The actigraph data gave the research team detailed information about when and how long the children slept.
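Actigraphy-based sleep estimates generally work by thresholding movement counts: sustained runs of low-activity epochs are scored as sleep. A simplified sketch of that idea (the threshold, epoch length, and minimum-run rule are illustrative, not the scoring algorithm used in the study):

```python
def sleep_minutes(activity_counts, threshold=20, min_run=10):
    """Score per-minute activity counts: runs of at least `min_run`
    consecutive minutes below `threshold` are counted as sleep."""
    total, run = 0, 0
    for count in activity_counts:
        if count < threshold:
            run += 1
        else:
            if run >= min_run:
                total += run
            run = 0
    if run >= min_run:          # close out a run that reaches the end
        total += run
    return total

# 30 restless minutes, then 480 quiet minutes (8 h scored as sleep).
night = [150] * 30 + [5] * 480
print(sleep_minutes(night))  # 480
```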

The children slept an average of 8 hours a night and used media before bed for an average of 5 nights during the study week. Children who did not use media before bed during the study week slept 23 minutes more and went to bed 34 minutes earlier than children who used media most nights during the study week.

"Media use was generally associated with a shorter sleep duration, but this effect was most pronounced in children with low effortful control," said Sierra Clifford, a research scientist at ASU and first author on the paper. "The impact of media on sleep was also an average affect, meaning that it reflects habitual media use rather than occasionally staying up late to watch a movie."

The children who scored low on measures of effortful control slept the least amount of time when they consistently used media in the hour before bed during the study week. These children slept approximately 40 minutes less per night. Media use before bed did not affect the sleep of children who scored high on effortful control, who made up approximately 35 percent of the study participants.

"Media exposure mattered for the children who measured lowest in effortful control," Clifford said.

Children with low effortful control might struggle with switching their attention from watching media before bed to calming down and falling asleep. But because effortful control is a personality characteristic, it is more difficult to change.

"Instead of parents wondering how to help their child better regulate their behavior, they can try to focus on creating more consistent sleep and media use schedules," Doane said.

Credit: 
Arizona State University

Welfare concerns highlighted over 'institutional hoarding' of cats

image: The risk of chronic upper respiratory infection appears to be significantly greater for cats in institutional hoarding settings, as shown by this individual rescued by the Toronto Humane Society.

Image: 
Courtesy of Linda Jacobson/Toronto Humane Society

The compulsive hoarding of animals is a poorly understood psychiatric disorder in people. Characterised by failure to provide minimum standards of care, it can result in malnourishment, uncontrolled breeding, overcrowding and neglect. Typically there is denial of this failure and its impact on the animals and people involved. Even less well understood is the growing trend of 'institutional hoarding' by organisations masquerading to the public as legitimate shelters or rescue centres. A new epidemiological study by Dr Linda Jacobson, of the Toronto Humane Society (THS), and shelter medicine colleagues at Ontario Veterinary College and JVR Shelter Strategies, California, shows that there are significant welfare concerns for hoarded cats not just from the home environment but from institutional settings also.

While animal hoarding is recognised to encompass a continuum of harm and severity, attention from the scientific community has mostly focused on large-scale cases and/or those involving legal seizure of animals and prosecutions. Comparatively little attention has been directed at smaller-scale cases, particularly those associated with a collaborative approach and voluntary relinquishment of animals.

Published this month in the Journal of Feline Medicine and Surgery, Dr Jacobson's study looked at 371 hoarded cats relinquished over a three-year period to the THS, an adoption-guaranteed ('no-kill') shelter with a full-service veterinary hospital. Groups of cats ranged from 10 to 77 in number, with nine groups originating from home environments (designated non-institutional hoarding) and three from rescues (designated institutional hoarding). The majority of cats (95%) were surrendered voluntarily, many with the assistance of a community intermediary who was able to provide a navigable pathway between the animal hoarders and the THS.

The authors documented a range of conditions typical for hoarded cats. Almost 90% of cats were unneutered and 18% of females of breeding age were pregnant. Upper respiratory infection (URI), skin disease, ear mites and oral disease (gingivitis) were found in the largest number of groups. URI, which is associated with stress and overcrowding in cats, was the most common medical condition (38%) overall, followed by skin disease (30%). Notably, the risk of URI, and particularly chronic URI, was significantly greater for institutional hoarding compared with non-institutional hoarding settings.
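A finding like "significantly greater risk of URI in institutional settings" typically comes down to a relative risk computed from group counts. A minimal sketch of that comparison with hypothetical counts (these numbers are invented for illustration, not the study's data):

```python
def relative_risk(cases_a, total_a, cases_b, total_b):
    """Risk (proportion affected) in group A divided by risk in group B."""
    return (cases_a / total_a) / (cases_b / total_b)

# Hypothetical 2x2 counts: URI cases out of cats in each hoarding setting.
rr = relative_risk(cases_a=60, total_a=100,    # institutional
                   cases_b=90, total_b=271)    # non-institutional
print(f"relative risk (institutional vs non-institutional): {rr:.2f}")
```

A relative risk above 1 indicates the outcome is more common in the first group; confidence intervals, not shown here, determine whether the difference is statistically significant.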

As part of their study, the authors also analysed outcomes between hoarded and non-hoarded cats. Interestingly, they found similar adoption rates among their sample of 371 hoarded cats and a separate cohort of over 6000 non-hoarded cats that had been surrendered to the THS during the study period. In fact, the hoarded cats had a shorter maximum length of stay in the shelter than the non-hoarded cats. This reflects the fact that most of the hoarded cats were young and most of their medical conditions were curable or manageable, versus the complex medical or behavioural conditions among some non-hoarded cats. This finding, say the authors, underlines a shift in the expectations and abilities of shelters to successfully manage and rehome hoarded cats.

The study concludes that there is a need for a greater focus on institutional hoarding, and also points to the role that can be played by colony cat caregivers and other community intermediaries as an alternative to the legal seizure of animals in hoarding cases. Commenting, Dr Jacobson says: 'The "seize and euthanize" model is outdated and can often be successfully replaced by a least harms model.' This might help to reduce some of the unintended negative consequences associated with the traditional approach, such as delayed response times, stress for the animals and owners and overwhelming costs.

Credit: 
SAGE