
Satellite constellations harvest energy for near-total global coverage

ITHACA, N.Y. - Think of it as a celestial parlor game: What is the minimum number of satellites needed to see every point on Earth? And how might those satellites stay in orbit and maintain continuous 24/7 coverage while contending with Earth's gravity field, its lumpy mass, the pull of the sun and moon, and pressure from solar radiation?

In the mid-1980s, researcher John E. Draim proposed what is generally considered to be the ideal solution: a four-satellite constellation. However, the amount of propellant needed to keep the satellites in place, and the ensuing cost, made the configuration unfeasible.

Now, a National Science Foundation-sponsored collaboration led by Patrick Reed, the Joseph C. Ford Professor of Engineering at Cornell University, has discovered the right combination of factors to make a four-satellite constellation possible, which could drive advances in telecommunication, navigation and remote sensing. And in an ingenious twist, the researchers accomplished this by making the forces that ordinarily degrade satellites instead work in their favor.

"One of the interesting questions we had was, can we actually transform those forces? Instead of degrading the system, can we actually flip it such that the constellation is harvesting energy from those forces and using them to actively control itself?" Reed said.

Their paper, "Low Cost Satellite Constellations for Nearly Continuous Global Coverage," was published Jan. 10 in Nature Communications.

The AI-based evolutionary computing search tools that Reed has developed are ideally suited for navigating the numerous complications of satellite placement and management.
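The "evolutionary computing" search mentioned above can be sketched in miniature. The toy below is not Reed's actual tool: the objective function, mutation scheme and every parameter are invented for illustration. It only shows the basic keep-the-best, mutate-and-refill loop that such methods share.

```python
import math
import random

def evolve(fitness, init, mutate, generations=200, pop_size=20, seed=0):
    """Minimal (mu + lambda)-style evolutionary search: keep the best half
    of the population each generation, refill with mutated parents."""
    rng = random.Random(seed)
    pop = [init(rng) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]
        pop = parents + [mutate(rng.choice(parents), rng) for _ in parents]
    return max(pop, key=fitness)

# Toy stand-in problem: tune one design variable x in [0, 1] (imagine a
# single orbital phasing parameter) against a bumpy objective function.
objective = lambda x: math.sin(6 * x) + 0.5 * math.cos(11 * x)
best = evolve(
    fitness=objective,
    init=lambda rng: rng.random(),
    mutate=lambda x, rng: min(1.0, max(0.0, x + rng.gauss(0, 0.05))),
)
print(best, objective(best))
```

The real problem is vastly harder: many satellites, many perturbation forces, and multiple competing objectives rather than one scalar fitness, which is why the search required supercomputing time.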

For this project, Reed collaborated with researchers from The Aerospace Corporation, combining his algorithmic know-how with the company's expertise in cutting-edge astrophysics, operational logistics and simulations.

To sift through the hundreds of thousands of possible orbits and combinations of perturbations, the team used the Blue Waters supercomputer at the University of Illinois at Urbana-Champaign. Blue Waters compressed 300 to 400 years' worth of computational exploration into the equivalent of roughly a month of actual computing, Reed said.

They winnowed their constellation designs to two models with orbital periods of 24 or 48 hours, achieving continuous coverage over 86% and 95% of the globe, respectively. While 100% coverage would be ideal in theory, the researchers found that sacrificing just 5% to 14% yielded greater gains in harvesting energy from the same gravitational and solar radiation forces that would normally make a satellite constellation short-lived and difficult to control.

The tradeoff is worth it, Reed said, especially since satellite operators could control where the gaps in coverage occur. Outages in these low-priority regions would last no more than about 80 minutes a day, even in the worst-case scenario.

"This is one of those things where the pursuit of perfection actually could stymie the innovation," Reed said. "And you're not really giving up a dramatic amount. There might be missions where you absolutely need coverage of everywhere on Earth, and in those cases, you would just have to use more satellites or networked sensors or hybrid platforms."

Using this type of passive control could potentially extend a constellation's lifespan from five years to 15. These satellites would require less propellant and would operate at higher altitudes, out of the risky high-traffic zone of low Earth orbit. But perhaps the biggest selling point is the low cost. Commercial interests or countries without the financial resources to launch a large constellation of satellites could attain near-continuous global coverage very economically, with reduced long-term technical overhead.

"Even one satellite can cost hundreds of millions or billions of dollars, depending on what sensors are on it and what its purpose is. So having a new platform that you can use across the existing and emerging missions is pretty neat," Reed said. "There's a lot of potential for remote sensing, telecommunication, navigation, high-bandwidth sensing and feedback around the space, and that's evolving very, very quickly. There's likely all sorts of applications that might benefit from a long-lived, self-adapting satellite constellation with near global coverage."

The paper's lead author is Lake Singh with The Aerospace Corporation. Researchers from the University of California, Davis, also contributed.

"We leveraged Aerospace's constellation design expertise with Cornell's leadership in intelligent search analytics and discovered an operationally feasible alternative to the Draim constellation design," said Singh, systems director for The Aerospace Corporation's Future Architectures department. "These constellation designs may provide substantive advantages to mission planners for concepts out at geostationary orbits and beyond."

Credit: 
Cornell University

Study puts the 'Carib' in 'Caribbean,' boosting credibility of Columbus' cannibal claims

image: Researchers analyzed the skulls of early Caribbean inhabitants, using 3D facial "landmarks" as a genetic proxy for determining how closely people groups were related to one another.

Image: 
Ann Ross/North Carolina State University

GAINESVILLE, Fla. --- Christopher Columbus' accounts of the Caribbean include harrowing descriptions of fierce raiders who abducted women and cannibalized men - stories long dismissed as myths.

But a new study suggests Columbus may have been telling the truth.

Using the equivalent of facial recognition technology, researchers analyzed the skulls of early Caribbean inhabitants, uncovering relationships between people groups and upending longstanding hypotheses about how the islands were first colonized.

One surprising finding was that the Caribs, marauders from South America and rumored cannibals, invaded Jamaica, Hispaniola and the Bahamas, overturning half a century of assumptions that they never made it farther north than Guadeloupe.

"I've spent years trying to prove Columbus wrong when he was right: There were Caribs in the northern Caribbean when he arrived," said William Keegan, Florida Museum of Natural History curator of Caribbean archaeology. "We're going to have to reinterpret everything we thought we knew."

Columbus had recounted how peaceful Arawaks in the modern-day Bahamas were terrorized by pillagers he mistakenly described as "Caniba," the Asiatic subjects of the Grand Khan. His Spanish successors corrected the name to "Caribe" a few decades later, but the similar-sounding names led most archaeologists to chalk up the references to a mix-up: How could Caribs have been in the Bahamas when their closest outpost was nearly 1,000 miles to the south?

But skulls reveal the Carib presence in the Caribbean was far more prominent than previously thought, giving credence to Columbus' claims.

Face to face with the Caribbean's earliest inhabitants

Previous studies relied on artifacts such as tools and pottery to trace the geographical origin and movement of people through the Caribbean over time. Adding a biological component brings the region's history into sharper focus, said Ann Ross, a professor of biological sciences at North Carolina State University and the study's lead author.

Ross used 3D facial "landmarks," such as the size of an eye socket or length of a nose, to analyze more than 100 skulls dating from about A.D. 800 to 1542. These landmarks can act as a genetic proxy for determining how closely people are related to one another.
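The landmark comparison rests on a standard step in geometric morphometrics: align two landmark configurations (removing position, size and orientation) and measure the shape difference that remains. A minimal sketch of that step, assuming ordinary Procrustes alignment; the study's actual statistical pipeline is more elaborate, and the coordinates below are invented toy data, not real skull measurements:

```python
import numpy as np

def procrustes_distance(X, Y):
    """Ordinary Procrustes distance between two landmark sets (n_points x 3):
    remove translation, scale and orientation, then measure the residual
    shape difference. Smaller values mean more similar shapes."""
    X = X - X.mean(axis=0)             # remove position
    Y = Y - Y.mean(axis=0)
    X = X / np.linalg.norm(X)          # remove size
    Y = Y / np.linalg.norm(Y)
    U, _, Vt = np.linalg.svd(Y.T @ X)  # optimal orthogonal alignment
    R = U @ Vt
    return float(np.linalg.norm(X - Y @ R))

# Invented toy "landmarks": skull_b is skull_a rotated, doubled in size
# and shifted -- the same shape in a different pose.
skull_a = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], float)
rot_z = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], float)
skull_b = (skull_a @ rot_z) * 2.0 + 5.0
print(procrustes_distance(skull_a, skull_b))  # ~0.0: identical shape
```

Distances like this, computed across many individuals, are what let researchers cluster skulls into related groups.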

The analysis not only revealed three distinct Caribbean people groups, but also their migration routes, which was "really stunning," Ross said.

Looking at ancient faces shows the Caribbean's earliest settlers came from the Yucatan, moving into Cuba and the Northern Antilles, which supports a previous hypothesis based on similarities in stone tools. Arawak speakers from coastal Colombia and Venezuela migrated to Puerto Rico between 800 and 200 B.C., a journey also documented in pottery.

The earliest inhabitants of the Bahamas and Hispaniola, however, were not from Cuba as commonly thought, but from the Northwest Amazon - the Caribs. Around A.D. 800, they pushed north into Hispaniola and Jamaica and then the Bahamas, where they were well established by the time Columbus arrived.

"I had been stumped for years because I didn't have this Bahamian component," Ross said. "Those remains were so key. This will change the perspective on the people and peopling of the Caribbean."

For Keegan, the discovery lays to rest a puzzle that had pestered him for years: why a type of pottery known as Meillacoid appears in Hispaniola by A.D. 800, Jamaica around 900 and the Bahamas around 1000.

"Why was this pottery so different from everything else we see? That had bothered me," he said. "It makes sense that Meillacoid pottery is associated with the Carib expansion."

The sudden appearance of Meillacoid pottery also corresponds with a general reshuffling of people in the Caribbean after a 1,000-year period of tranquility, further evidence that "Carib invaders were on the move," Keegan said.

Raiders of the lost Arawaks

So, was there any substance to the tales of cannibalism?

Possibly, Keegan said.

Arawaks and Caribs were enemies, but they often lived side by side with occasional intermarriage before blood feuds erupted, he said.

"It's almost a 'Hatfields and McCoys' kind of situation," Keegan said. "Maybe there was some cannibalism involved. If you need to frighten your enemies, that's a really good way to do it."

Whether or not it was accurate, the European perception that Caribs were cannibals had a tremendous impact on the region's history, he said. The Spanish monarchy initially insisted that indigenous people be paid for work and treated with respect, but reversed its position after receiving reports that they refused to convert to Christianity and ate human flesh.

"The crown said, 'Well, if they're going to behave that way, they can be enslaved,'" Keegan said. "All of a sudden, every native person in the entire Caribbean became a Carib as far as the colonists were concerned."

Credit: 
Florida Museum of Natural History

Deep learning, 3D technology to improve structure modeling, create better drugs

image: DOVE, created by Purdue researchers, captures structural and energetic features of the interface of a protein docking model with a 3D box and judges if the model is more likely to be correct or incorrect using 3D convolutional neural network.

Image: 
Daisuke Kihara/Purdue University

WEST LAFAYETTE, Ind. - Proteins are often called the working molecules of the human body. A typical body has more than 20,000 different types of proteins, each of which is involved in many functions essential to human life.

Now, Purdue University researchers have designed a novel approach that uses deep learning to better understand how proteins interact in the body - paving the way to producing accurate structure models of protein interactions involved in various diseases and to designing better drugs that specifically target protein interactions. The work was published online in Bioinformatics.

"To understand molecular mechanisms of functions of protein complexes, biologists have been using experimental methods such as X-rays and microscopes, but they are time- and resource-intensive efforts," said Daisuke Kihara, a professor of biological sciences and computer science in Purdue's College of Science, who leads the research team. "Bioinformatics researchers in our lab and other institutions have been developing computational methods for modeling protein complexes. One big challenge is that a computational method usually generates thousands of models, and choosing the correct one or ranking the models can be difficult."

Kihara and his team developed a system called DOVE, DOcking decoy selection with Voxel-based deep neural nEtwork, which applies deep learning to virtual models of protein interactions. DOVE scans the protein-protein interface of a model and then uses a 3D convolutional neural network to capture structural features and distinguish correct from incorrect models.
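DOVE itself is a trained deep network; purely to make the voxel idea concrete, here is a toy single-layer sketch in NumPy. The grid size, the random "weights" and the one-layer architecture are all invented for illustration and are not DOVE's: voxelize the interface into a 3D box, convolve, pool, and squash to a probability-like score.

```python
import numpy as np

def conv3d_valid(volume, kernel):
    """Naive 'valid' 3D cross-correlation of a voxel grid with one kernel."""
    kd, kh, kw = kernel.shape
    d, h, w = volume.shape
    out = np.zeros((d - kd + 1, h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            for k in range(out.shape[2]):
                out[i, j, k] = np.sum(volume[i:i + kd, j:j + kh, k:k + kw] * kernel)
    return out

def score_interface(voxels, kernel, weight, bias):
    """One conv layer -> ReLU -> global average pool -> logistic score in (0, 1)."""
    feat = np.maximum(conv3d_valid(voxels, kernel), 0.0)
    pooled = feat.mean()
    return 1.0 / (1.0 + np.exp(-(weight * pooled + bias)))

rng = np.random.default_rng(0)
voxels = rng.random((8, 8, 8))           # toy occupancy grid of an interface box
kernel = rng.standard_normal((3, 3, 3))  # untrained, random stand-in weights
print(score_interface(voxels, kernel, weight=2.0, bias=-0.5))
```

In a real system the kernels and weights are learned from known correct and incorrect docking models, and many channels and layers are stacked; the flow of data, though, is the same box-in, score-out shape.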

"Our work represents a major advancement in the field of bioinformatics," said Xiao Wang, a graduate student and member of the research team. "This may be the first time researchers have successfully used deep learning and 3D features to quickly understand the effectiveness of certain protein models. Then, this information can be used in the creation of targeted drugs to block certain protein-protein interactions."

Credit: 
Purdue University

New study finds 8% of Chinese men are problem drinkers

Alcohol consumption has become more prevalent in China in recent years, but limited large-scale epidemiological evidence has made it difficult to know the true scale of the problem. A new large study of Chinese adults, published in the scientific journal Addiction, has found that eight percent of men in China are problem drinkers, and that problem drinking is more prevalent among men of lower socio-economic status and in rural areas. Problem drinking is associated with a significantly increased risk of physical and mental health problems and premature death.

Researchers from Oxford University, Peking University, and the Chinese Academy of Medical Sciences led a large collaborative study of over 500,000 men and women aged 30-79 years from ten rural and urban areas in China. Women in the study drank little alcohol, but a third of men drank alcohol regularly and one in four of these men experienced at least one indicator of problem drinking.
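The headline figure follows from the two proportions just quoted; a quick arithmetic check, using 1/3 and 1/4 as stand-ins for the study's reported shares:

```python
# Share of men who drank alcohol regularly, and share of those regular
# drinkers reporting at least one problem-drinking indicator.
regular_drinkers = 1 / 3
problem_among_regular = 1 / 4

# Prevalence among all men = product of the two conditional shares.
prevalence = regular_drinkers * problem_among_regular
print(f"{prevalence:.0%}")  # 8%
```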

'Problem drinking' was defined by self-report as one or more of the following indicators in the past month: drinking in the morning, being unable to work or to do anything due to drinking, negative emotions after drinking, being unable to avoid drinking, or having the shakes when stopping drinking.

Co-author Pek Kei Im, of the Nuffield Department of Population Health at the University of Oxford, says: "In China, the patterns of drinking differ from Western populations. Our study shows that problem drinking is fairly common among Chinese men, particularly among more disadvantaged groups."

Compared with low-risk drinkers, men with problem drinking had poorer self-reported health, poorer life satisfaction, more sleep problems, and a higher risk of depression and anxiety. Men with two or more problem drinking indicators had an approximately two-fold higher risk for all-cause mortality and a 15% higher risk for hospitalisation compared with low-risk drinkers. Professor Zhengming Chen, co-author from the Nuffield Department of Population Health at the University of Oxford, says: "This large collaborative study has shown that drinking alcohol can result in significant adverse consequences, for both mental and physical health and wellbeing."

The prevalence of alcohol dependence in China increased from 0.02% to 0.68% between the 1980s and 1990s, and per capita alcohol consumption increased from 4.1L in 2005 to 7.2L in 2016, whereas in Europe per capita consumption decreased from 12.3L to 9.8L over the same period.

Co-author Dr Iona Millwood, of the Nuffield Department of Population Health at the University of Oxford, says: "Drinking has been on the rise in China since the 1980s, and now we're looking at a significant national health problem that is beginning to resemble those in Western countries. Knowing the scale of the problem, and the fact that it's more intense in rural and poorer areas, can help to inform policy decisions to improve health outcomes in China."

Credit: 
Society for the Study of Addiction

Acidic environment could boost power of harmful pathogens

video: Under normal conditions while feeding on healthy bacteria, C. elegans digestive tracts are moderately acidic compared to human stomachs. Harmful bacteria prompted a less acidic digestive tract in C. elegans -- a result that runs counter to what one might expect if the acidic environment was generated to kill bacteria.

Image: 
University of Kansas

LAWRENCE -- When food we've swallowed reaches our stomachs, it finds an acidic environment. The low pH in the stomach helps to begin digestion -- and has been thought to kill bacteria hiding in food that could otherwise harm our bodies.

However, recent work from the Ackley and Chandler labs in the Department of Molecular Biosciences at the University of Kansas runs counter to this idea, instead suggesting lower pH in the digestive tract may make some bacterial pathogens even more harmful.

Their findings, published in the peer-reviewed journal PLOS Pathogens, could have implications for addressing the crisis of antibiotic resistance in bacterial infections around the world.

The investigation was performed using small, bacteria-eating roundworms called Caenorhabditis elegans.

"These wormlike animals are transparent, so we can watch things that happen inside them quite easily," said co-author Brian Ackley, associate professor of molecular biosciences at KU. "Using pH-sensitive chemicals developed at KU, called Kansas Reds, we were able to monitor the pH inside the digestive system and watch what happens when they eat harmful bacteria, compared to nonharmful bacteria."

According to the KU researchers, under normal conditions while feeding on nonharmful bacteria, C. elegans digestive tracts are moderately acidic compared to human stomachs, and the animals also show regional pH differences along the digestive tract. When they ingest pathogens, they neutralize the acidic environment.

This observation suggested the animals could discriminate between good and bad bacteria, and harmful bacteria prompted a less acidic digestive tract in C. elegans -- a result that runs counter to what one might expect if the acidic environment was generated to kill bacteria.

To test this, the researchers used animals with mutations in genes that helped regulate the pH in their digestive tracts.

"When animals had a more acidic digestive system, they were more likely to be affected by pathogenic bacteria -- again counter to what one might guess if acidity was useful in killing harmful bugs that might sneak into the body with food," Ackley said. "Our lab teams were able to show the effect on the animals was specifically due to the pH by adding a base to buffer the digestive tract. We used bicarbonate, the same agent our bodies use to neutralize stomach contents when they pass into our intestines. Neutralizing the pH in the mutant animals reverted the accelerated infection by the pathogenic bacteria."

The KU researcher said different species react differently when their bodies sense pathogenic bacteria -- but some biological reactions are common to many animals.

"A general response involves the creation of chemicals, like hydrogen peroxide or hypochlorous acid -- aka bleach -- near the bacteria, and then having specialized immune cells eat the dying bacteria," Ackley said. "To keep our bodies safe, the immune system only deploys these defenses when it's sure it is being invaded. The work in C. elegans may suggest a way the body can have these defenses ready to go at a moment's notice -- that is, keep the chemical environment in a moderately acidic state where making those chemicals is difficult, then, upon infection, simply neutralize the environment to deploy the defenses."

Ackley's KU colleagues on the work were lead author Saida Benomar, Patrick Lansdon and Josephine R. Chandler of the Department of Molecular Biosciences, along with Aaron Bender of the Department of Medicinal Chemistry, and Blake R. Peterson of The Ohio State University.

The researchers believe there are reasons to think these systems could work similarly in people.

The genes they studied in C. elegans also exist in humans and control parts of the immune system. Further, research in other labs has shown occasions in humans where problems with regulating pH are associated with increased risk of infection. Moving forward, the researchers want to understand the mechanism at a deeper level.

"Our goal is to boost this natural defense system in people as a way to either avoid or reduce the use of antibiotics," Ackley said. "Right now, our antibiotic use is unsustainable, and bacteria are evolving resistance at an alarming rate. If the system discovered in C. elegans is in fact still present in humans, it would suggest bacteria are much slower to adapt to this defensive strategy than they are to antibiotics."

Credit: 
University of Kansas

Collection of new bird species discovered on small Wallacean islands

Hidden away on a trio of tiny, under-explored Wallacean islands off the eastern Indonesian coast, researchers discovered 10 new species and subspecies of songbirds, according to a new study, bringing a long-overlooked pocket of local biodiversity to light. The findings mark the largest number of new species identified from such a small, geographically confined region in more than a century.

Birds are perhaps the best-known group of animals on Earth. However, despite the roughly 11,000 presently recognized species, it is estimated that many thousands more remain undiscovered. Still, identifying new taxa is relatively rare; between 1990 and 2019, only 161 new species were described worldwide - and rarer still is the discovery of multiple new species from a geographically restricted area.

Here, Frank Rheindt and colleagues report the combined discovery of five new songbird species and five new subspecies from three islands off the eastern coast of Sulawesi, Indonesia. During their six-week bird-collecting expedition, Rheindt et al. explored the remote islands of Taliabu, Peleng and Batudaka - three out of hundreds in the region - which were targeted based on their geological history and complexity, as well as their sparse historic record of biological exploration. According to the authors, similar approaches could be used to identify areas in other regions that could likewise yield new species discoveries.

"These findings suggest that many biologically underexplored places still exist across the Earth's surface," write Jonathan Kennedy and Jon Fjeldsa in a related Perspective. "Formal species descriptions, such as those by Rheindt et al., are a necessary step towards the initiation of conservation actions that aim to preserve these little-known biotas."

Credit: 
American Association for the Advancement of Science (AAAS)

Early humans arrived in Southeast Asia later than previously believed

New dates from the World Heritage archeological site at Sangiran on the island of Java suggest that the first appearance of Homo erectus there occurred more recently than previously thought, researchers report. The new findings place the arrival of the first hominins at Sangiran between 1.3 and 1.5 million years ago (Ma), suggesting that early humans migrated from Asia to Southeast Asia and Java nearly 300,000 years later than previously believed.

The fossil-rich Sangiran dome in Java contains the oldest human fossils in Southeast Asia and is widely regarded as one of the most important sites for understanding the evolution of our early ancestors and their slow march across the globe. To date, more than 100 specimens from at least three different hominin species have been recovered from Sangiran sediments. However, despite decades of research, the site's chronology remains uncertain and controversial, particularly the timing of H. erectus' first appearance in the region, and the currently accepted dates are difficult to reconcile with other early sites in Asia. An accurate understanding of the Sangiran chronology is crucial for understanding the earliest human migrations and settlements in Asia.

To resolve this debate, Shuji Matsu'ura and colleagues used a combination of fission-track and uranium/lead (U/Pb) dating to determine the age of volcanic zircons found above, below and within the hominin-bearing layers of the Sangiran fossil deposit. While previous studies placed hominin arrival as early as 1.7 Ma, Matsu'ura et al.'s findings suggest a younger date: no earlier than 1.5 Ma, and likely closer to 1.3 Ma. In a related Perspective, Boris Brasseur discusses the study's findings in more detail.

Credit: 
American Association for the Advancement of Science (AAAS)

An international study discovers a new origin of lymphatic vessels in the heart

image: A mouse heart; the coronary arteries are shown in green. The red staining shows cells descended from a single lymphatic endothelial cell.

Image: 
CNIC

An international study led by Drs Miguel Torres and Ghislaine Lioux of the Centro Nacional de Investigaciones Cardiovasculares (CNIC) has identified and characterized a new vasculogenic niche that contributes to the development of the cardiac lymphatic system. The results reveal that the coronary lymphatic vasculature does not have a single origin, but instead forms through the participation of cells from different tissues, with varied origins and functions.

The study, published today in Developmental Cell, opens the way to future research into the mechanism underlying lymphatic vasculogenesis in this new niche and the functional diversity of coronary lymphatics.

The coronary circulation, essential for heart function, is composed not only of the arteriovenous system that provides oxygen and nutrients, but also of lymphatic vessels. The essential functions of the cardiac lymphatic vessels include protecting the heart against infection, preventing edema (fluid retention) in the myocardium, and supporting the recovery of cardiac function after an infarction. Despite its importance, the coronary lymphatic vasculature is little understood and has received much less attention than the coronary arteries and veins.

This international study was led by Dr Miguel Torres's group at the CNIC in partnership with Dr Guillermo Oliver's group at the University of Chicago, Dr Robert Kelly at the University of Marseilles, and Dr Sagrario Ortega at the Centro Nacional de Investigaciones Oncológicas (CNIO). The study examines the origin of the coronary lymphatic system during the formation of the heart in the mouse embryo.

Until now, explained Dr Torres, all lymphatic vessels were believed to develop from cells derived from the major veins of the early embryo, from where they migrate to colonize all the tissues and organs in the embryo.

The new study shows that, in the heart, there is a second population of lymphatic cells that is recruited later during development and is derived not from veins, but from a region called the second heart field.

The second heart field, explained Ghislaine Lioux, is composed of multipotent cells "able to generate different types of heart cells, including cardiomyocytes (the cells of the cardiac muscle), smooth muscle cells, and the endothelial cells of the arteries and veins."

The study shows that the repertoire of cells generated by the second heart field is wider than previously thought and also includes the lymphatic endothelium. According to Dr Torres "this unique feature of the coronary lymphatic vasculature opens up several interesting avenues for future research."

One of the more surprising findings was that lymphatic cells generated in the second heart field mix with lymphatic cells with a different, likely venous, origin; the two populations together form the lymphatic vessels in the ventral part of the heart.

Although the second heart field contributes only 50% of the cells of the lymphatic vessels, blockade of their formation completely prevented the formation of the coronary lymphatic vasculature.

This result indicates that the newly discovered cell population not only contributes a large proportion of the cells of the coronary lymphatic system, but also leads a specific and irreplaceable process in the formation of the coronary lymphatic vasculature. "This function reveals, for the first time, the specialization of endothelial subpopulations in the formation of the coronary vasculature and opens the way toward a better understanding of the formation of lymphatic vessels, a process essential not only for embryonic heart development but also for the response of the heart to stress and disease in adulthood."

Credit: 
Centro Nacional de Investigaciones Cardiovasculares Carlos III (F.S.P.)

AAFP releases updated feline retrovirus guidelines to the veterinary community

[HILLSBOROUGH, NJ - January 2020] On Thursday, January 9, The American Association of Feline Practitioners (AAFP) will release updated Feline Retrovirus Testing and Management Guidelines to the veterinary community, which will be published in the Journal of Feline Medicine and Surgery. In publishing these Guidelines, the AAFP aims to provide the most current information about feline retrovirus infections to veterinary practitioners so they may optimize the care and management of their feline patients. In addition, the Client Brochure provides cat caregivers with information regarding transmission, testing, prevalence, and precautions. These Guidelines focus on feline leukemia virus (FeLV) and feline immunodeficiency virus (FIV) infections, which are found in cats worldwide. The spread of these viruses can be minimized through education, testing, and vaccinations.

The updated Guidelines represent a consensus of current information compiled by an international panel of researchers and practitioners, and are an update of the AAFP's heavily referenced 2008 Retrovirus Testing and Management Guidelines.

"Education and early testing can greatly assist in the treatment and management of feline retrovirus infections. Routine veterinary care, when cats are well and when they are sick, can lead to better care and decrease the spread of infection. We are pleased to present these Guidelines to support both veterinary professionals and cat caregivers in the management of these illnesses. We further stress the partnership between veterinarians and cat owners in caring for infected cats because with regular healthcare and reduced stress, cats infected with retroviruses, especially FIV, may live many healthy years," said Heather O'Steen, CEO, AAFP.

"The 2020 Feline Retrovirus Testing and Management Guidelines contain much new information about feline leukemia and feline immunodeficiency virus infections. The Guidelines were written by an international panel of experts that included not only retrovirus researchers, but also veterinarians working in private practice and in shelters. We hope these Guidelines will be of practical use for all veterinarians. The panel is especially proud to have the endorsement of the Guidelines by the International Society of Feline Medicine," said Retrovirus Guidelines Co-Chair Susan Little, DVM, DABVP (Feline).

Julie Levy, DVM, PhD, DACVIM, DABVP (Shelter Medicine) added, "These guidelines address rapidly evolving knowledge about how testing results, clinical expression, and prognosis for FeLV may change over time relative to the cat's current immune response and resulting levels of virus in circulation, how quantitative testing may be used to better inform clinical decision-making, and an emerging trend in which screening for FeLV and FIV is increasingly shifting from animal shelters, where cats are adopted, to veterinary practices, where animals receive comprehensive care."

More About Retroviruses:

These Guidelines and Client Brochure represent current knowledge on the pathogenesis, diagnosis, prevention, and treatment of retrovirus infections in cats. Infections with FeLV and FIV are associated with a variety of clinical signs and can impact quality of life and longevity. Although vaccines are available for FeLV in many countries and for FIV in some countries, identification of infected cats remains an important factor for preventing new infections. The retrovirus status of every cat at risk of infection should be known. Cats should be tested as soon as possible after they are acquired, following exposure to an infected cat or a cat of unknown infection status, prior to vaccination against FeLV or FIV, and whenever clinical illness occurs. It might not be possible to determine a cat's infection status based on testing at a single point in time; repeat testing using different methods could be required. Although FeLV and FIV infections can be associated with clinical disease, some infected cats, especially those infected with FIV, can live for many years with good quality of life. There is a paucity of data evaluating treatments for infected cats, especially antiretroviral and immunomodulatory drugs. Management of infected cats is focused on effective preventive health care strategies and prompt identification and treatment of illness, as well as limiting spread of infection.

Prevalence and the Spread of Retroviruses in Cats:

FIV: Feline immunodeficiency virus is more commonly found in male cats and cats that fight with other cats. It is found less often in kittens and neutered adult cats. The virus is spread primarily through saliva and is usually passed to other cats by bite wounds. In North America, about 3 to 5% of tested cats are found to be infected with FIV.

FeLV: Feline leukemia virus infection is more commonly spread from mother to kittens. The virus can also be spread between cats that live together or those that fight. It is mainly spread in saliva during grooming and when food and water bowls are shared. The virus is less often spread through urine, feces, or nasal discharge. In North America, 4% of tested cats are found to be infected with the virus.

Prevention:

There are no vaccines marketed in the United States or Canada that can protect cats from FIV infection.

Vaccines to protect cats from FeLV infection are available. Vaccination is recommended for all kittens, with a booster one year later, and for cats with ongoing risk of infection. Adult indoor-only cats living alone or with uninfected cats may not need to be vaccinated after the first two years. Veterinarians can help assess an individual cat's vaccination needs.

To access the Feline Retrovirus Guidelines, visit catvets.com/retroviruses. Cat caregivers can learn more about feline retroviruses at catfriendly.com/felv and catfriendly.com/fiv.

Credit: 
SAGE

In fighting gut infections, nervous system is key, Yale-Harvard team finds

The peaceful and delicate co-existence of friendly gut bacteria and the immune system relies on highly coordinated information exchange between immune system cells and certain cells lining the intestine. Until now, scientists generally believed these two cell types were also central to the production of antibacterial molecules that fend off dangerous infections.

But scientists at Yale and Harvard medical schools have discovered that, in response to bacterial invaders, nerve cells within the intestine -- and not immune cells or cells lining the intestinal wall -- release infection-fighting cytokines. They report their findings Jan. 9 in the journal Cell.

The findings provide new insights into the body's response to bacterial infections that cause food poisoning and other illnesses.

"We used to believe that the immune system cells and intestinal barrier cells communicated to keep out invading bugs by mobilizing anti-microbial proteins," said co-corresponding author Richard Flavell, Sterling Professor of Immunobiology at Yale and a Howard Hughes Medical Institute investigator. "The story is actually not true -- it is the nervous system telling barrier cells what to do."

The foot soldiers in the war against intestinal pathogens, it turns out, are molecules of interleukin-18, or IL-18. Interleukins are part of the immune system's arsenal.

When the researchers deleted IL-18 from both immune cells and the cells lining the intestinal barrier, mice were still able to fend off intestinal infection by Salmonella bacteria, ruling those cells out as the source of the protective response, said Flavell. But mice lacking IL-18 produced by nervous system cells were more susceptible to infection, revealing its key role in fighting infection.

The key role of nerve cells in the defense against pathogens makes sense, given the ability of the nervous system to communicate across long distances, Flavell said.

"The findings offer an opportunity to explore new ways to intervene in infections through the nervous system," he said.

Credit: 
Yale University

Scientists transform a BBQ lighter into a high-tech lab device

image: Georgia Tech undergraduate student Gaurav Byagathvalli and assistant professor Saad Bhamla are shown with examples of butane lighters they used to create the inexpensive ElectroPen - an electroporator device useful in life sciences research.

Image: 
Christopher Moore, Georgia Tech

Researchers have devised a straightforward technique for building a laboratory device known as an electroporator - which applies a jolt of electricity to temporarily open cell membranes - from inexpensive components, including a piezoelectric crystal taken from a butane lighter.

The goal is to make the low-cost device available to high schools, budget-pressed laboratories and other organizations whose research might otherwise be limited by lack of access to conventional lab-grade electroporators. Plans for the device, known as the ElectroPen, are being made available, along with the files necessary for creating a 3D-printed casing.

"Our goal with the ElectroPen was to make it possible for high schools, budget-conscious laboratories and even those working in remote locations without access to electricity to perform experiments or processes involving electroporation," said M. Saad Bhamla, an assistant professor in Georgia Tech's School of Chemical and Biomolecular Engineering. "This is another example of looking for ways to bypass economic limitations to advance scientific research by putting this capability into the hands of many more scientists and aspiring scientists."

In a study to be reported January 9 in the journal PLOS Biology and sponsored by the National Science Foundation and the National Institutes of Health, the researchers detail the method for constructing the ElectroPen, which is capable of generating short bursts of more than 2,000 volts needed for a wide range of laboratory tasks.

One of the primary jobs of a cell membrane is to serve as a protective border, sheltering the inner workings of a living cell from the outside environment.

But all it takes is a brief jolt of electricity for that membrane to temporarily open and allow foreign molecules to flow in -- a process called electroporation, which has been used for decades in molecular biology labs for tasks ranging from bacterial detection to genetic engineering.

Despite how commonplace the practice has become, the high cost of electroporators and their reliance on a source of electricity has kept the technique mostly within the confines of academic or professional labs. Bhamla and undergraduate student Gaurav Byagathvalli set out to change that, with help from collaborators Soham Sinha, Yan Zhang, Assistant Professor Mark Styczynski and Lambert High School teacher Janet Standeven.

"Once we decided to tackle this issue, we began to explore the inner workings of electroporators to understand why they are so bulky and expensive," said Byagathvalli. "Since their conception in the early 1980s, electroporators have not had significant changes in design, sparking the question of whether we could achieve the same output at a fraction of the cost. When we identified a lighter that could produce these high voltages through piezoelectricity, we were excited to uncover new mysteries behind this common tool."

In addition to the piezoelectric lighter crystal - which generates current when pressure is applied to it - the other parts in the device include copper-plated wire, heat-shrink wire insulation and aluminum tape. To hold it all together, the researchers designed a 3D-printed casing that also serves as its activator. With all the parts on hand, the device can be assembled in 15 minutes, the researchers reported.

While the ElectroPen is not designed to replace a lab-grade electroporator, which costs thousands of dollars and is capable of processing a broad range of cell mixtures, the device is well suited to tasks that do not require high sample volumes.

The researchers tested several different lighter crystals to find ones that produced a consistent voltage using a spring-based mechanism. To understand more about how the lighters function, the team used a high-speed camera recording at 1,057 frames per second to view their mechanics in slow motion.

"One of the fundamental reasons this device works is that the piezoelectric crystal produces a consistently-high voltage, independent of the amount of force applied by the user," Bhamla said. "Our experiments showed that the hammer in these lighters is able to achieve acceleration of 3,000 Gs, which explains why it is capable of generating such a high burst of voltage."
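A quick kinematic check puts the quoted 3,000 g figure in perspective. The hammer's travel distance is not given in the release, so the ~3 mm used below is an assumption for illustration only:

```python
# Back-of-the-envelope kinematics for the quoted 3,000 g hammer acceleration.
# The ~3 mm travel distance is assumed for illustration, not from the paper.
import math

G = 9.81              # m/s^2 per g
a = 3000 * G          # quoted acceleration, m/s^2
d = 0.003             # assumed hammer travel, m

v = math.sqrt(2 * a * d)   # final speed under constant acceleration
t = v / a                  # time to cover that distance

print(f"impact speed ~{v:.1f} m/s, reached in ~{t*1e3:.2f} ms")
```

At these assumed numbers the strike completes in roughly half a millisecond, less than one frame interval at 1,057 fps, which is why high frame rates are needed to resolve the mechanism.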

To test its capabilities, the researchers used the device on samples of E. coli to add a chemical that makes the bacterial cells fluorescent under special lights, illuminating the cell parts and making them easier to identify. Similar techniques could be used in a lab or in remote field operations to detect the presence of bacteria or other cells.

The team also evaluated whether the device was easy to use, shipping the assembled ElectroPens to students at other universities and high schools.

"The research teams were able to successfully obtain the same fluorescence expression, which I think validates how easily these devices can be disseminated and adopted by students across the globe," Bhamla said.

To that end, the researchers have made available the plans for how to build the device, along with digital files to be used by a 3D printer to fabricate the casing and actuator. Next steps include testing a broader range of lighters in search of consistent output across a wider span of voltages, with the goal of creating ElectroPens of varying voltages.

Credit: 
Georgia Institute of Technology

Moths' flight data helps drones navigate complex environments

image: An image of the experimental setup showing a moth attached to a metal rod in front of the virtual forest scene.

Image: 
Thomas Daniel lab, University of Washington, Seattle.

The flight navigation strategy of moths can be used to develop programs that help drones to navigate unfamiliar environments, report Ioannis Paschalidis at Boston University, Thomas Daniel at the University of Washington, and colleagues, in the open-access journal PLOS Computational Biology.

To understand how real moths plan their route, the researchers mounted 8 hawk moths (Manduca sexta) on metal rods connected to a torque meter. In front of each moth they projected a moving forest scene created from beams of light for the moth to navigate. They captured data from the moths' flight and built a mathematical model to describe their trajectories through the virtual forest. The flight data were translated into a decision-making program that could be used to control a drone. They compared how the drone and the moths performed in simulations of the same forest layout, as well as new configurations with different densities of trees.

The researchers found that hawk moths mainly rely on the pattern created by the apparent motion of objects caused by their flight, which agrees with studies of flight behavior in other insects. However, the flight programs optimized for drones performed 60% better in the simulated forest because they also incorporated information about the exact location of objects in their surroundings into their navigational decisions.
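Steering from apparent motion (optic flow) can be sketched with a classic insect-inspired balancing rule. This is not the authors' actual controller; the function and gain below are purely illustrative:

```python
# Minimal optic-flow balancing heuristic, loosely inspired by insect
# navigation (NOT the study's model; gain and inputs are illustrative).

def steer_from_optic_flow(left_flow, right_flow, gain=0.5):
    """Turn away from the side with stronger apparent motion.

    left_flow, right_flow: mean optic-flow magnitudes in each visual
    hemifield (nearby objects appear to move faster).
    Returns a signed yaw command: positive = turn right.
    """
    # Stronger flow on the left means obstacles are nearer on the left,
    # so steer right (positive), and vice versa.
    return gain * (left_flow - right_flow)

print(steer_from_optic_flow(2.0, 0.5))   # 0.75 (obstacles loom left: turn right)
print(steer_from_optic_flow(1.0, 1.0))   # 0.0  (balanced flow: fly straight)
```

A drone controller that also knows exact object positions, as described above, can anticipate obstacles rather than merely react to flow imbalance, which is consistent with the 60% performance gain reported.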

Although the researchers were able to optimize the strategy used by moths to improve performance in certain environments, the moths' strategy was more adaptable, performing well in a variety of different forest layouts. The moth model performed best in dense forests, suggesting that hawk moths have evolved a flight strategy adapted to the thick forests they often encounter.

The researchers say that by using real data from animal flight paths they can program bio-inspired drones that will be able to navigate autonomously in cluttered environments.

Credit: 
PLOS

Mars: Water could disappear faster than expected

image: When the sun lights up the large reservoirs of ice at the poles, water vapor is released into the atmosphere. These water molecules are then carried by winds toward higher, colder altitudes where, in the presence of dust particles, they can condense into clouds, preventing water from rising rapidly and en masse to higher altitudes (as happens on Earth). On Mars, however, condensation is often hindered, so the atmosphere is regularly supersaturated with water vapor, allowing even more water to reach the upper atmosphere, where the sun's UV rays dissociate the molecules into atoms. The discovery of increased water vapor at very high altitude means that more hydrogen and oxygen atoms are able to escape from Mars, amplifying the loss of Martian water over the long term.

Image: 
© ESA

The small red planet is losing water more quickly than theory and past observations suggest. The gradual disappearance of water (H2O) occurs in the upper atmosphere of Mars: sunlight and chemistry dissociate water molecules into hydrogen and oxygen atoms that the weak gravity of Mars cannot prevent from escaping into space. An international research team, led partly by CNRS researcher Franck Montmessin, has just revealed that water vapour is accumulating in large quantities and unexpected proportions at an altitude of over 80 km in the Martian atmosphere. Measurements showed that large atmospheric pockets are even in a state of supersaturation, with the atmosphere containing 10 to 100 times more water vapour than its temperature should theoretically allow. With the observed supersaturation rates, the capacity of water to escape would greatly increase during certain seasons. These results, which were published in Science on 9 January 2020, were obtained thanks to the Trace Gas Orbiter probe from the ExoMars mission, financed by the European Space Agency and the Russian space agency Roscosmos.
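The supersaturation figure is a ratio of the measured water vapour pressure to the maximum the temperature should allow. A minimal sketch, using one published approximation for saturation vapour pressure over ice (Murphy and Koop, 2005); the input values below are illustrative, not the mission's measurements:

```python
# Supersaturation ratio sketch. The partial pressure and temperature below
# are illustrative stand-ins, not Trace Gas Orbiter measurements.
import math

def p_sat_ice(T):
    """Saturation vapour pressure over ice in Pa (Murphy & Koop 2005
    approximation, T in kelvin, valid roughly 110-273 K)."""
    return math.exp(9.550426 - 5723.265 / T + 3.53068 * math.log(T)
                    - 0.00728332 * T)

def saturation_ratio(p_h2o, T):
    """S = water partial pressure / saturation pressure.
    S > 1 means the air holds more vapour than equilibrium allows."""
    return p_h2o / p_sat_ice(T)

# In very cold air (~150 K) even a tiny water partial pressure can
# correspond to a strongly supersaturated state (S in the tens).
print(round(saturation_ratio(1e-4, 150), 1))
```

Because the saturation pressure collapses exponentially with temperature, modest amounts of vapour reaching the cold upper atmosphere can yield the 10-to-100-fold supersaturation the article describes.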

Credit: 
CNRS

The Lancet: Study suggests mental health impact of ongoing social unrest in Hong Kong

The ongoing social unrest in Hong Kong may be affecting the mental health of the general adult population--potentially leading to substantial increases in demand for mental and psychosocial support services, according to a 10-year observational study published in The Lancet.

The new estimates obtained from surveys suggest that the prevalence of probable depression [1] (in Hong Kong residents aged 18 years or more) was five times higher during the 2019 social unrest than the general population norm before the 2014 Occupy Central Movement (11% vs 2%); whilst post-traumatic stress disorder (PTSD) symptoms were estimated to be six times higher (rising from around 5% shortly after Occupy Central in March 2015 to almost 32% in Sept-Nov, 2019).

Even though fewer than half of those affected by health problems related to the social unrest said they would seek professional help, the authors estimate that mental health-care providers should prepare for a potential 12% rise in demand for public sector services, which will require major increases in the surge capacity of these services.

"Hong Kong is under-resourced to deal with this excess mental health burden", explains Professor Gabriel Leung from The University of Hong Kong who co-led the research. "With only around half the per-capita psychiatry capacity of the UK, and pre-existing average public sector outpatient waiting times of up to 64 weeks, it is important that we enhance mental health and social care provision so that all those in need are able to access high-quality services." [2]

The study is the largest and longest prospective cohort study of the population-wide impact of social unrest on mental health in the world. However, the researchers caution that measuring the mental health impact of social unrest involves several data and methodological issues that might affect the accuracy of the estimates, including potential measurement error in the assessment tools for depression and PTSD, and the many assumptions around care-seeking behaviour, psychopathology, and the duration and disposition of the ongoing social unrest.

Hong Kong has experienced a wave of mass protests since June 2019, initiated by the now-shelved extradition bill. Over the course of 7 months, peaceful protests have descended into escalating levels of violence. Depressive and PTSD symptoms have been reported following widespread unrest worldwide, including after the 2014 Ferguson unrest and the 2015 Baltimore unrest in the USA. However, little is known about the mental health impact on the general population during recent protests in Hong Kong.

Researchers at The University of Hong Kong used the large population-based FAMILY Cohort with nine successive waves of longitudinal data to assess the population mental health burden before, during, and after major protests over 10 years [3]. The findings of two initial surveys (March 2009-April 2011 and Aug 2011-March 2014) involving more than 18,000 randomly sampled Hong Kong residents were compared with a representative sample of 1,213-1,715 adults surveyed five times during and following the Occupy Central Movement (Oct and Nov, 2014; March and Nov, 2015; Sept 2017), and 1,600-1,736 adults surveyed two times during the 2019 social unrest (June-Aug and Sept-Nov, 2019).

Questionnaires were used to measure changes in the prevalence of probable major depression, suspected PTSD (which included direct exposure to traumatic events such as tear gas or physical violence), and symptoms of depression and PTSD. The researchers used a weighted prevalence approach such that the rate of probable depression and suspected PTSD in the adults surveyed would be more representative of all adults in Hong Kong.
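The weighted prevalence approach can be sketched as post-stratification: each stratum's survey rate is weighted by that stratum's share of the adult population. All numbers below are invented for illustration; the study's actual strata and weights are not given in this release:

```python
# Post-stratification sketch of a weighted prevalence estimate.
# Strata, counts, and population shares are made up for illustration.

def weighted_prevalence(cases, sampled, population_share):
    """cases[g]: positive screens in stratum g; sampled[g]: respondents
    in g; population_share[g]: stratum g's share of the adult population
    (shares sum to 1). Returns the population-weighted prevalence."""
    return sum(population_share[g] * cases[g] / sampled[g] for g in cases)

cases     = {"18-39": 60,  "40-64": 40,  "65+": 10}
sampled   = {"18-39": 400, "40-64": 500, "65+": 100}
pop_share = {"18-39": 0.40, "40-64": 0.45, "65+": 0.15}

print(weighted_prevalence(cases, sampled, pop_share))
```

Weighting this way corrects for strata that are over- or under-represented among respondents, so the headline percentages better reflect all adults in Hong Kong rather than just the people who happened to answer.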

The study also examined risk factors associated with social unrest (after adjusting for socio-demographics and doctor-diagnosed depression or anxiety disorders before the 2019 unrest), and estimated potential health-care needs.

One in five Hong Kong residents (22%; aged 18 or older) surveyed during the 2019 social unrest reported probable major depression or suspected PTSD. The authors say that this is comparable to the prevalence of mental health conditions observed following large-scale disasters, armed conflicts, or terrorist attacks.

Estimates suggest that up to 11% of the adult general population in 2019 were affected by probable depression compared to around 2% in 2009-2014 before the 2014 Occupy Central Movement, and 6.5% in 2017 (figure 2)--potentially equivalent to an additional 590,000 adults with probable depression compared to a decade ago, with an estimated 300,000 of these cases potentially linked to the 2019 unrest (figure 3C).

Similarly, symptoms of PTSD were reported by an estimated 2% of adults in November, 2015 (a year after Occupy Central), rising to almost 32% of those surveyed in September-November, 2019--and could be equivalent to an additional 1.9 million adults with PTSD symptoms.

During the 2019 social unrest, the researchers estimate that the prevalence of suspected PTSD was around 13%--equivalent to around 810,000 adults with PTSD (figure 3A).

Adults using social media for two hours or more a day on socio-political news and events appear to be more at risk of probable depression and suspected PTSD, the findings suggest. However, family support seemed to protect against probable depression, potentially acting as a buffer against stress (figures 4 & 5).

Whilst fewer than half of those affected said they would seek help from health-care professionals--citing a preference for self-management, seeking help from family or friends, and privacy concerns among others--the researchers estimate that the 2019 social unrest may be associated with an additional 140,000 adults seeking outpatient support services for depression, and roughly 360,000 adults looking for help with PTSD (figure 3C).

The authors acknowledge that their findings provide observational associations rather than cause and effect, and point to several limitations of their study, including that the true population burden may be underestimated because they did not account for individuals younger than 18 years old who make up a substantial proportion of protesters, and did not specially oversample members of the police force. They also note that probable depression or suspected PTSD might represent psychological distress in response to an abnormal event rather than true mental illness.

"With social unrest rising around the world, including in major cities such as Barcelona, Delhi, Paris, and Santiago in 2019, the issue of how social unrest impacts population mental health is of great public-health importance," says Dr Michael Ni from The University of Hong Kong who co-led the research. [2]

"We hope our study will alert health-care professionals, service planners, and policy makers to the need for mental health and psychosocial support during and after widespread unrest to better protect population mental health globally," adds co-author Ms Cynthia Yau from The University of Hong Kong. [2]

Credit: 
The Lancet

Plant life expanding in the Everest region

image: View towards Khumbu and Cholatse from below Ama Dablam at about 4,900 m showing typical subnival vegetation in the foreground.

Image: 
Karen Anderson

Plant life is expanding in the area around Mount Everest, and across the Himalayan region, new research shows.

Scientists used satellite data to measure the extent of subnival vegetation - plants growing between the treeline and snowline - in this vast area.

Little is known about these remote, hard-to-reach ecosystems, made up of short-stature plants (predominantly grasses and shrubs) and seasonal snow, but the study reveals they cover between 5 and 15 times the area of permanent glaciers and snow.

Using data from 1993 to 2018 from NASA's Landsat satellites, University of Exeter researchers measured small but significant increases in subnival vegetation cover across four height brackets from 4,150-6,000 metres above sea level.

Results varied at different heights and locations, with the strongest trend in increased vegetation cover in the bracket 5,000-5,500m.
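The per-bracket trends behind these figures amount to fitting a slope to annual vegetated-cover fractions. A minimal sketch on synthetic data; the study's actual pipeline (Landsat-derived cover classified per pixel) is considerably more involved:

```python
# Trend-fitting sketch for one height bracket: ordinary least squares on
# the annual vegetated-pixel fraction. The cover series is synthetic.
import numpy as np

years = np.arange(1993, 2019)  # the study's 1993-2018 window

# Hypothetical vegetated fraction per year: a slow upward drift plus noise.
rng = np.random.default_rng(0)
cover = 0.20 + 0.001 * (years - 1993) + rng.normal(0, 0.005, years.size)

slope, intercept = np.polyfit(years, cover, 1)
print(f"trend: {slope:+.4f} cover fraction per year")
```

A positive, statistically significant slope in a bracket corresponds to the "small but significant increases" reported; repeating the fit per height bracket and region reproduces the study's comparison across elevations.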

Around Mount Everest, the team found a significant increase in vegetation in all four height brackets. Conditions at the top of this height range have generally been considered to be close to the limit of where plants can grow.

Though the study doesn't examine the causes of the change, the findings are consistent with modelling that shows a decline in "temperature-limited areas" (where temperatures are too low for plants to grow) across the Himalayan region due to global warming.

Other research has suggested Himalayan ecosystems are highly vulnerable to climate-induced vegetation shifts.

"A lot of research has been done on ice melting in the Himalayan region, including a study that showed how the rate of ice loss doubled between 2000 and 2016," said Dr Karen Anderson, of the Environment and Sustainability Institute on Exeter's Penryn Campus in Cornwall.

"It's important to monitor and understand ice loss in major mountain systems, but subnival ecosystems cover a much larger area than permanent snow and ice and we know very little about them and how they moderate water supply.

"Snow falls and melts here seasonally, and we don't know what impact changing subnival vegetation will have on this aspect of the water cycle - which is vital because this region (known as 'Asia's water towers') feeds the ten largest rivers in Asia."

Dr Anderson said "some really detailed fieldwork" and further validation of these findings is now required to understand how plants in this high-altitude zone interact with soil and snow.

Dominic Fawcett, who coded the image processing, said: "These large-scale studies using decades of satellite data are computationally intensive because the file sizes are huge. We can now do this relatively easily on the cloud by using Google Earth Engine, a new and powerful tool freely available to anyone, anywhere."

The Hindu Kush Himalayan region extends across all or part of eight countries, from Afghanistan in the west to Myanmar in the east. More than 1.4 billion people depend on water from catchments emanating here.

The paper, published in the journal Global Change Biology, is entitled: "Vegetation expansion in the subnival Hindu Kush Himalaya."

Credit: 
University of Exeter