Tech

Virtual reality could help improve flu vaccination rates

image: Faculty members tested methods of delivering effective vaccination messages through print, video and virtual reality.

Image: 
Sarah Freeman

Athens, Ga. - Using a virtual reality simulation to show how flu spreads and its impact on others could be a way to encourage more people to get a flu vaccination, according to a study by researchers at the University of Georgia and the Oak Ridge Associated Universities in Oak Ridge, Tennessee. This is the first published study to look at immersive virtual reality as a communication tool for improving flu vaccination rates among "flu vaccine avoidant" 18- to 49-year-old adults.

"When it comes to health issues, including flu, virtual reality holds promise because it can help people see the possible effects of their decisions, such as not getting a flu vaccine," said Glen Nowak, the principal investigator and director of the Center for Health and Risk Communication headquartered at Grady College. "In this study, we used immersive virtual reality to show people three outcomes--how if infected, they can pass flu along to others; what can happen when young children or older people get flu; and how being vaccinated helps protect the person who is vaccinated as well as others. Immersive VR increases our ability to give people a sense of what can happen if they do or don't take a recommended action."

The research, "Using Immersive Virtual Reality to Improve the Beliefs and Intentions of Influenza Vaccine Avoidant 18- to 49-year-olds," was published by the journal Vaccine on Dec. 2, which falls during National Influenza Vaccination Week, Dec. 1 - 7, 2019. NIVW is a national awareness week focused on highlighting the importance of influenza vaccination.

The research was conducted by faculty at Grady College of Journalism and Mass Communication, including faculty in Grady's Center for Health and Risk Communication, with support from a grant and researchers from ORAU.

According to the Centers for Disease Control and Prevention, during the 2017-18 flu season only 26.9% of 18- to 49-year-olds in the United States received an influenza vaccination, even though an annual flu vaccine is recommended for everyone in that age group. This low acceptance makes it important to identify more persuasive ways to educate these adults about flu vaccination. The findings from this study suggest one way virtual reality can be more effective: it can create a sense of presence, or the feeling that one is part of what is happening.

The 171 participants in this study self-identified as those who had not received a flu shot last year and did not plan to receive one during the 2017-18 influenza season. In the study, participants were randomly assigned to one of four groups: 1) a five-minute virtual reality experience; 2) a five-minute video that was identical to the VR experience but without the 3-dimensional and interactive elements; 3) an e-pamphlet that used text and pictures from the video presented on a tablet computer; and 4) a control condition that only viewed the U.S. Centers for Disease Control and Prevention's influenza Vaccination Information Statement, which is often provided before a flu vaccine is given and describes benefits and risks. Participants in the VR, video and e-pamphlet conditions also viewed the CDC VIS before answering a series of questions regarding flu vaccination, including whether they would get a flu vaccine.

In the VR condition, participants were provided headsets, which enabled them to vividly experience the information and events being shown as if they were in the story, and video game controllers, which enabled them to actively participate at points in the story. Compared to video or the e-pamphlet, the VR condition created a stronger perception of presence - that is, a feeling of "being there" in the story, which, in turn, increased participants' concern about transmitting flu to others. This increased concern was associated with greater confidence that one's flu vaccination would protect others, more positive beliefs about flu vaccine and increased intention to get a flu vaccination. Neither the e-pamphlet nor the video was able to elicit a sense of presence nor were they able to improve the impact of the VIS on the confidence, belief and intention measures.

"This study affirms there is much to be excited about when it comes to using virtual reality for health communication," Karen Carera, senior evaluation specialist at ORAU, said. "However, the findings suggest that for virtual reality to change beliefs and behaviors, the presentations used need to do more than deliver a story. They need to get users to feel like they are actually in the story."

Credit: 
University of Georgia

NASA catches typhoon Kammuri post landfall

image: NASA-NOAA's S-NPP satellite provided an infrared image of Typhoon Kammuri on Dec. 2 at 12:07 p.m. EST (1707 UTC) that showed the eye near Gubat, Sorsogon, in the southeastern part of Luzon. A wide, thick band of powerful thunderstorms (red) circled the entire eye, where cloud top temperatures were as cold as or colder than 210 Kelvin (minus 81.6 degrees Fahrenheit/minus 63.1 degrees Celsius). NASA research has shown that cloud top temperatures that cold or colder have the ability to generate heavy rainfall. The strongest storms were west and southwest of center where cloud tops were as cold (black) as 190 Kelvin (minus 117.6 degrees Fahrenheit/minus 83.1 degrees Celsius).

Image: 
NOAA/NASA/UWM-CIMSS-SSEC/William Straka III

NASA-NOAA's Suomi NPP or S-NPP satellite provided infrared and night-time imagery of Typhoon Kammuri shortly after it made landfall in the Philippines.

On Dec. 2 at 7 a.m. EST (1200 UTC), Kammuri, known as Tisoy in the Philippines, had maximum sustained winds near 115 knots (132 mph), making it the equivalent of a Category 4 hurricane on the Saffir-Simpson Hurricane Wind Scale.

Per the 10 a.m. EST (1500 UTC) bulletin from the Philippine Atmospheric, Geophysical and Astronomical Services Administration (PAGASA), Kammuri (Tisoy) made landfall at 11 p.m. Philippine Standard Time near Gubat, Sorsogon, in the extreme southeastern part of Luzon, Philippines.

Views of a Typhoon from NASA Satellites

NASA-NOAA's Suomi NPP or S-NPP satellite saw Typhoon Kammuri on Dec. 2 at 12:07 p.m. EST (1707 UTC). Infrared and nighttime images were created by William Straka III, a researcher at the University of Wisconsin-Madison's Space Science and Engineering Center (SSEC), Cooperative Institute for Meteorological Satellite Studies (CIMSS). The infrared image showed the eye of Kammuri (Tisoy) near Gubat, Sorsogon, in the southeastern part of Luzon. A thick, wide band of powerful thunderstorms circled the entire eye, where cloud top temperatures were as cold as or colder than 210 Kelvin (minus 81.6 degrees Fahrenheit/minus 63.1 degrees Celsius). NASA research has shown that cloud top temperatures that cold or colder have the ability to generate heavy rainfall.

Straka noted several prominent features, including tropospheric gravity waves and multiple overshooting cloud tops, as well as some mesospheric gravity waves. Infrared data revealed the coldest cloud top temperatures, as cold as 190 Kelvin (minus 117.6 degrees Fahrenheit/minus 83.1 degrees Celsius), in the western and southwestern quadrants of the storm. Those were the most powerful storms and likely produced the highest rainfall rates.

In the nighttime image from the S-NPP satellite, Kammuri's eye was visible near Gubat, Sorsogon, in southeastern Luzon. City lights could be seen to the north and south of the storm, in Northern Luzon, Western and Central Visayas regions, and Northern Mindanao. The eastern Visayas and Bicol regions were covered by Kammuri's clouds.

On Dec. 2 at 12:11 a.m. EST (0511 UTC), the Atmospheric Infrared Sounder or AIRS instrument aboard NASA's Aqua satellite provided a look at the cloud top temperatures in Typhoon Kammuri. A thick, wide band of powerful thunderstorms (red) circled the entire eye, where cloud top temperatures were as cold as or colder than 210 Kelvin (minus 81.6 degrees Fahrenheit/minus 63.1 degrees Celsius), which confirmed the Suomi NPP satellite data.
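The Kelvin cloud-top temperatures quoted above can be checked against the Fahrenheit and Celsius figures with the standard conversion formulas (C = K - 273.15, F = C × 9/5 + 32); a minimal sketch:

```python
def kelvin_to_celsius(k):
    """Convert a temperature from Kelvin to degrees Celsius."""
    return k - 273.15

def kelvin_to_fahrenheit(k):
    """Convert a temperature from Kelvin to degrees Fahrenheit."""
    return kelvin_to_celsius(k) * 9 / 5 + 32

# The 210 K heavy-rain threshold cited in the satellite imagery:
print(f"210 K = {kelvin_to_celsius(210):.2f} C = {kelvin_to_fahrenheit(210):.2f} F")
# -> 210 K = -63.15 C = -81.67 F
```

The article rounds these toward zero (minus 63.1 C, minus 81.6 F), a common convention in NASA imagery captions.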

Warnings in Effect on Dec. 3, 2019

PAGASA continues to track Kammuri in order to assess the impacts on the various islands in the path of the storm. On Dec. 3, PAGASA maintained the following warnings:

Tropical Cyclone Wind Signal number 3 was in effect for Luzon: Northern Occidental Mindoro, Lubang Island.

Tropical Cyclone Wind Signal number 2 was in effect for Luzon: Oriental Mindoro, Batangas, rest of Occidental Mindoro, Marinduque, Cavite, Laguna, Rizal, Bataan, Metro Manila, southern Bulacan, southern Pampanga, southern Zambales, Calamian Islands, and western parts of Quezon.

Tropical Cyclone Wind Signal number 1 was in effect for Luzon: Northern parts of Camarines Sur, southern Nueva Ecija, southern Aurora, northern parts of Palawan, Cuyo Islands, rest of Quezon, rest of Camarines Sur, rest of Zambales, rest of Pampanga and rest of Bulacan. In addition, Signal 1 was in effect for Visayas: Northern Aklan and northern Antique.

Status of Kammuri (Tisoy) on Dec. 3

On Dec. 3, 2019 at 10 a.m. EST (1500 UTC), Typhoon Kammuri (Tisoy) was located near latitude 13.2 degrees north and longitude 120.1 degrees east, about 99 nautical miles south-southwest of Manila, Philippines. Kammuri was moving west with maximum sustained winds of 80 knots (92 mph/148 kph).
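As a rough sanity check on the quoted position, the great-circle distance from the storm center to Manila can be computed with the haversine formula. The Manila coordinates used here (about 14.6 degrees north, 121.0 degrees east) are an assumption for illustration:

```python
import math

def haversine_nm(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in nautical miles."""
    r_km = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r_km * math.asin(math.sqrt(a)) / 1.852  # 1 nautical mile = 1.852 km

# Kammuri's reported center vs. Manila (assumed coordinates)
d = haversine_nm(13.2, 120.1, 14.6, 121.0)
print(f"{d:.0f} nm")  # roughly the 99 nm stated in the bulletin
```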

Typhoon Kammuri continues to weaken as it emerges on the western edge of the Philippine archipelago. The storm is forecast to turn to the east-northeast before quickly turning to the south-southwest and dissipating.

Credit: 
NASA/Goddard Space Flight Center

Successful instrument guidance through deep and convoluted blood vessel networks

image: A team led by Professor Sylvain Martel at the Polytechnique Montréal Nanorobotics Laboratory has developed a novel approach to tackling one of the biggest challenges of endovascular surgery: how to reach the most difficult-to-access physiological locations. Their solution is a robotic platform that uses the fringe field generated by the superconducting magnet of a clinical magnetic resonance imaging (MRI) scanner to guide medical instruments through deeper and more complex vascular structures. The approach has been successfully demonstrated in-vivo.

Image: 
Massouh bioMEDia for the Polytechnique Montréal Nanorobotics Laboratory

A team led by Professor Sylvain Martel at the Polytechnique Montréal Nanorobotics Laboratory has developed a novel approach to tackling one of the biggest challenges of endovascular surgery: how to reach the most difficult-to-access physiological locations. Their solution is a robotic platform that uses the fringe field generated by the superconducting magnet of a clinical magnetic resonance imaging (MRI) scanner to guide medical instruments through deeper and more complex vascular structures. The approach has been successfully demonstrated in-vivo, and is the subject of an article just published in Science Robotics.

When a researcher "thinks outside the box"--literally

Imagine having to push a wire as thin as a human hair deeper and deeper inside a very long, very narrow tube full of twists and turns. The wire's lack of rigidity, along with the friction forces exerted on the walls of the tube, will eventually render the manoeuvre impossible, with the wire ending up folded on itself and stuck in a turn of the tube. This is exactly the challenge facing surgeons who seek to perform minimally invasive procedures in ever-deeper parts of the human body by steering a guidewire or other instrumentation (such as a catheter) through narrow, tortuous networks of blood vessels.

It is possible, however, to harness a directional pulling force to complement the pushing force, countering the friction forces inside the blood vessel and moving the instrument much farther. The tip of the device is magnetized, and pulled along inside the vessels by the attraction force of another magnet. Only a powerful superconducting magnet outside the patient's body can provide the extra attraction needed to steer the magnetized device as far as possible. There is one piece of modern hospital equipment that can play that role: an MRI scanner, which has a superconducting magnet that generates a field tens of thousands of times stronger than that of the Earth.

The magnetic field inside the tunnel of an MRI scanner, however, is uniform; this is key to how patient imaging is performed. That uniformity poses a problem: to pull the tip of the instrument through the labyrinthine vascular structures, the guiding magnetic field must be modulated to the greatest possible amplitude and then be decreased as quickly as possible.

Pondering that problem, Professor Martel had the idea of using not the main magnetic field present inside the MRI machine tunnel, but the so-called fringe field outside the machine. "Manufacturers of MRI scanners will normally reduce the fringe field to the minimum," he explains. "The result is a very-high-amplitude field that decays very rapidly. For us, that fringe field represents an excellent solution that is far superior to the best existing magnetic guidance approaches, and it is in a peripheral space conducive to human-scale interventions. To the best of our knowledge, this is the first time that an MRI fringe field has been used for a medical application," he adds.

Move the patient rather than the field

To steer an instrument deep within blood vessels, not only is a strong attraction force required, but that force must be oriented to pull the magnetic tip of the instrument in various directions inside the vessels. Because of the MRI scanner's size and weight, it's impossible to move it to change the direction of the magnetic field. To get around that issue, the patient is moved in the vicinity of the MRI machine instead. The platform developed by Professor Martel's team uses a robotic table positioned within the fringe field next to the scanner.

The table, designed by Arash Azizi--the lead author of the article and a biomedical engineering PhD candidate whose thesis advisor is Professor Martel--can move on all axes to position and orient the patient according to the direction in which the instrument must be guided through their body. The table automatically changes direction and orientation to position the patient optimally for the successive stages of the instrument's journey thanks to a system that maps the directional forces of the MRI scanner's magnetic field--a technique that Professor Martel has dubbed Fringe Field Navigation (FFN).

An in-vivo study of FFN with X-ray mapping demonstrated the capacity of the system for efficient and minimally invasive steering of extremely small-diameter instruments deep within complex vascular structures that were hitherto inaccessible using known methods.

Robots to the rescue of surgeons

This robotic solution, which greatly outperforms manual procedures as well as existing magnetic field-based platforms, enables endovascular interventional procedures in very deep, and therefore currently inaccessible, regions of the human body.

The method promises to broaden possibilities for application of various medical procedures including diagnosis, imaging and local treatments. Among other things, it could serve to assist surgeons in procedures requiring the least invasive methods possible, including treatment of brain damage such as an aneurysm or a stroke.

Credit: 
Polytechnique Montréal

For some corals, meals can come with a side of microplastics

image: Under a black light, fluorescent green microplastics are seen in the water during a small demonstration experiment. In the actual 2018 experiment discussed in this paper, the cauliflower coral seen above ingested microplastics when prey was also present in the water, but avoided eating microplastics when no prey was there.

Image: 
Dennis Wise/University of Washington

Tiny microplastic particles are about as common in the ocean today as plastic is in our daily lives.

Synthetic clothing, containers, bottles, plastic bags and cosmetics all degrade and release microplastics into the environment. Corals and other marine organisms are eating microplastics that enter the waterway. Studies in this emerging field show some harmful effects, but it's largely unknown how this ubiquitous material is impacting ocean life.

A new experiment by the University of Washington has found that some corals are more likely to eat microplastics when they are consuming other food, yet microplastics alone are undesirable. Two coral species tested responded differently to the synthetic material, suggesting variations in how corals are adapting to life with microplastics. The study was published Dec. 3 in the journal Scientific Reports.

"The more plastic we use, the more microplastics there are, and the more corals are going to be exposed," said lead author Jeremy Axworthy, a UW doctoral student in the School of Aquatic and Fishery Sciences. "Our study found that some corals probably won't eat microplastics and will keep going about their daily business. But some might -- and if they happen to be sensitive to warmer ocean temperatures or other stressors, this is just another compounding factor to be worried about."

Corals are tiny animals that are rooted to the reef or rocks on the ocean floor. They use tentacle-like arms to sweep food into their mouths. Many rely on algae for energy, but most also consume drifting animals for survival.

This study is the first to examine whether corals eat microplastics when exposed to warmer water, which is expected to accelerate with climate change. Rising ocean temperatures can be deadly for coral: warm water stresses them, causing corals to lose their symbiotic algae partner that undergoes photosynthesis and provides energy for them to survive. When this happens, coral bleaching and eventual death can occur.

But some corals have adapted to bleaching by shifting their diets to feed on tiny marine organisms called zooplankton, which provide an alternate energy source. As they munch on these small animals -- often the same size as microplastics -- the research team wondered whether they also were ingesting plastic fragments.

The experiment shows corals do eat microplastics when they switch to a zooplankton diet, adding one more stressor for corals in a changing ocean environment.

"Microplastics are not as simple as a life-or-death threat for corals -- it's not that black or white," said senior author Jacqueline Padilla-Gamiño, assistant professor at the UW School of Aquatic and Fishery Sciences. "It's about total energy lost. If corals constantly are dealing with microplastics, it might not kill them, but there will be less energy for them to grow and to reproduce."

The researchers collected two species of common corals off the east coast of Oahu, Hawaii, and exposed half of each species to warmer water for several weeks to induce stress and bleaching. Then they ran four different feeding experiments on both bleached and non-bleached corals: corals were fed only microplastics; only a type of zooplankton; microplastics and zooplankton; or nothing.

After dissecting the coral polyps, researchers found that corals stressed by warmer temperatures actually ate much less than their counterparts in normal seawater. This was unexpected and possibly due to stress from high water temperatures. However, one of the two species, known for its voracious eating habits in the wild, consumed microplastics only while also eating zooplankton. Neither coral species ate microplastics alone.

The researchers don't know why one species of coral readily ate microplastics in the presence of other food, but avoided microplastics when they were the only thing on the menu. They suspect that this species of coral can read certain chemical or physical cues from the plastics and the prey, but might not be able to distinguish between the two when both are present.

It's also possible the plastic used in this experiment is less desirable to corals, and that plastics with a different chemical makeup could, in fact, be tasty to corals. The researchers plan to test the "tastiness" of other types of microplastics, such as synthetic fibers from clothing.

Ultimately, some coral species likely face greater risks from exposure to microplastics than others, the study found. The researchers will look next at impacts on the physiology of corals that are exposed over a longer period to microplastics.

"Knowing that will provide a lot more context to this work," Axworthy said. "We need to know the full physiological impacts of chronic exposure to microplastics on corals, especially at increased temperatures, to understand how serious the problem is."

In the meantime, the problem of microplastics isn't going away. A 2014 estimate found between 15 and 51 trillion microplastic particles in the oceans, and plastic waste entering the oceans is expected to increase tenfold between 2010 and 2025.

"It's important when talking about waste management to think big picture -- what are we putting in the oceans?" Padilla-Gamiño said. "We don't know where plastic goes, where it stays, who grabs it, and what are the mechanisms by which we get it back. We are just at the tip of understanding these implications."

Credit: 
University of Washington

Lack of specialists dooms sick rural patients

image: Kenton Johnston, Ph.D., is an assistant professor of health management and policy at Saint Louis University College for Public Health and Social Justice.

Image: 
Saint Louis University

Residents of rural areas are more likely to be hospitalized and to die than those who live in cities primarily because they lack access to specialists, recent research found.

The study, led by Kenton Johnston, Ph.D., assistant professor of health management and policy at Saint Louis University College for Public Health and Social Justice, looked at data from Medicare patients who have chronic health problems. The paper was published in the December 2019 issue of Health Affairs.

"People on Medicare with chronic conditions such as heart failure or diabetes who live in rural areas have higher death and hospitalization rates than their urban peers," Johnston said.

"The biggest reason for this appears to be that people in rural areas have less access to specialist physicians like cardiologists and endocrinologists."

Johnston and his coauthors, Hefei Wen, Ph.D., assistant professor in the division of health policy and insurance research at Harvard Medical School and the Harvard Pilgrim Health Care Institute, and Karen E. Joynt Maddox, M.D., assistant professor of cardiology at Washington University School of Medicine in St. Louis, urge policy makers to target innovations to bring more specialist care to rural areas.

Some of the strategies they suggest are:

Expanding telemedicine in key areas, such as cardiology, to provide routine specialty care visits through technologies such as video conferencing

Adding incentives for physicians to practice in rural areas such as loan forgiveness

Considering differential payment rates that offer specialists who practice in rural areas more money

Incentivizing rural and urban hospitals partnerships

Bringing urban specialists into rural health systems on certain days of the week

Researchers examined 2006-2013 data from Medicare claims of patients in rural and urban areas who have heart disease, diabetes and other complex chronic conditions.

They linked the claims to health care supply data from hospitals that was provided by the Dartmouth Institute for Health Policy and Clinical Practice and determined rural-urban classifications using a Health Resources and Services Administration database.

The researchers defined a rural area as any town with fewer than 10,000 people, and found that 10% of Medicare beneficiaries lived in such areas.

Compared with those who saw only a primary care provider, patients who also saw a specialist at least once were 15.9% less likely to be hospitalized for a preventable cause and 16.6% less likely to die.

Preventable hospitalizations were highest in rural areas and lowest in metropolitan areas. Residents of rural areas had 40% higher rates of preventable hospitalizations and 23% higher mortality rates than their metropolitan counterparts.

Their findings have implications for all Medicare patients with chronic conditions, Johnston said.

"Our research shows that all Medicare beneficiaries with chronic conditions--urban and rural--have lower death and hospitalization rates when they visit a specialist at least once annually," Johnston said. "Primary care is important, but it is not enough by itself; specialist care is needed as well."

Johnston will participate in a Health Affairs Forum on rural health from 8 a.m. to noon (CST) on Wednesday, Dec. 4 to discuss his findings. The policy briefing will be held at the National Press Club. Those interested in watching the briefing in real time can attend in person or virtually by registering on Dec. 3 or 4 through this link.

Credit: 
Saint Louis University

Highly sensitive epigenomic technology combats disease

image: A Virginia Tech professor and his team of researchers have created new technology to help in understanding how the human body battles diseases. Left to right, Bohan Zhu, Yuan-Pang Hsieh, and Chang Lu.

Image: 
Virginia Tech

Much remains unknown about diseases and the way our bodies respond to them, in part because the human genome--the complete DNA assembly that makes each person unique--is so complex. A Virginia Tech professor and his team of researchers have created new technology to help in understanding how the human body battles diseases.

In a recently published article in Nature Protocols, Chang Lu, the Fred W. Bull Professor of Chemical Engineering at Virginia Tech, along with chemical engineering doctoral students Bohan Zhu and Yuan-Pang Hsieh, describe a microfluidic technology they are using to study a variety of diseases ranging from breast and brain cancer to schizophrenia and addiction.

"We were motivated by the fact that the molecular basis of a lot of diseases remains elusive after years of research, due to a lack of advanced and precise technologies," said Lu.

The newly published protocol includes detailed instructions for device fabrication, setup, and operation of microfluidic oscillatory washing-based chromatin immunoprecipitation followed by sequencing (MOWChIP-seq), a low-input technology that allows characterization of the epigenome using as few as 100 cells.

The method is novel because, until now, conventional methods required tens of millions of cells per assay. Using the new sequencing method, the team can produce a profile of histone modifications with as few as 100 cells per assay, at a throughput as high as eight assays in one run.

"By comparing normal and diseased epigenomes, useful markers and patterns can be discovered and used for precision medicine based on epigenomic features of an individual patient," Lu said.

This research builds upon Lu's previous work in developing the MOWChIP-seq, first published in a Nature Methods paper in 2015. A U.S. patent was awarded to Lu and his former doctoral student, Zhenning Cao, for their work. Compared to their original 2015 publication, in the current protocol Lu and his team detailed their recent effort on automating the MOWChIP-seq process and processing eight samples in parallel in one run.

The team's semi-automated process reduces labor, improves reproducibility and is scalable.

Credit: 
Virginia Tech

Transition to exhaustion: clues for cancer immunotherapy

Research on immune cells "exhausted" by chronic viral infection provides clues on how to refine cancer immunotherapy. The results are scheduled for publication in Immunity.

Scientists at Emory Vaccine Center, led by Rafi Ahmed, PhD, have learned about exhausted CD8 T cells, based on studying mice with chronic viral infections. In the presence of persistent virus or cancer, CD8 T cells lose much of their ability to fight disease, and display inhibitory checkpoint proteins such as PD-1 on their surfaces. PD-1 is targeted by cancer immunotherapy drugs, such as pembrolizumab and nivolumab, which allow CD8 T cells to regain their ability to attack and kill infected cells and cancers.

Those drugs are now FDA-approved for several types of cancer, yet some types of tumors do not respond to them. Studying exhausted CD8 T cells can help us understand how to better draw the immune system into action against cancer or chronic infections.

In previous research, Ahmed's lab found that exhausted cells are not all alike, and the diversity within the exhausted T cell pool could explain variability in responses to cancer immunotherapy drugs. Specifically, they observed that a population of "stem-like" cells proliferated in response to PD-1-blocking drugs, while a more differentiated population of exhausted cells stayed inactive. The stem-like cells are responsible for maintaining the exhausted T cell population, but cannot kill virus-infected or tumor cells on their own.

The current paper defines a transitional stage in between the stem-like and truly exhausted cells. The truly exhausted cells are marked by a molecule called CD101; they are unable to migrate to sites of infection and contain lower amounts of the proteins needed to kill infected or tumor cells.

"The transitional cells are not completely exhausted," says postdoctoral fellow Will Hudson, PhD, first author of the Immunity paper. "They are still capable of proliferating and performing their 'killer cell' functions. In our experiments, they contribute to viral control."

The transitional cells, lacking CD101, could be a good marker for response to PD-1 blocking drugs, Hudson says. Enhancing the proliferation or survival of these cells, or preventing their transition to lasting exhaustion, may be a novel therapeutic strategy for cancer.

"It is extremely exciting to have contributed to this project and know that our findings have the potential to inform cancer immunotherapy," says co-author Julia Gensheimer, an Emory graduate, now a MD/PhD student at UCLA.

The Immunity paper also includes systematic identification of other markers for CD8 T cells in various stages of exhaustion, which could be a guide to efforts to promote their activity.

Credit: 
Emory Health Sciences

Siting cell towers needs careful planning

image: Almost everyone has a cell phone, and that creates a lot of demand for data, which means engineers need to think about where to site new cell towers.

Image: 
Zach Smith/Michigan Tech

No one can overengineer like an engineer, so introducing a little more caution into an existing engineering process shouldn't ruffle many feathers. A new paper published in Environmental Research offers insight on how to include simple precautionary approaches to siting cell towers.

And there are many cell towers -- with more coming -- since almost everyone has a cell phone and the towers are being used for increasingly data-intensive applications. In the U.S., the Pew Research Center reports 96% of Americans own a cell phone of some kind, and smartphone ownership has risen to 81% today from 35% in 2011. Industry data reported by GSMA Intelligence estimates more than five billion people worldwide use mobile devices. All these devices work using electromagnetic waves, which expose people to low levels of radio-frequency radiation (RFR).

"The research on the health impacts of RFR is still inconclusive. But some of the preliminary data gives us reason to be concerned," said Joshua Pearce, a professor in electrical and materials engineering from Michigan Technological University who led the study, which reviews current data on RFR and engineering solutions for placing towers. "I'm pro-tech and I'm pro-human, so I think there are ways for us to have our cell phones and minimize potential risk without waiting to find out that putting a cell tower on top of a school was a bad idea."

Pearce and his team's solutions focus on getting companies to rethink where to place cell towers when they do a standard "search ring" map that prioritizes potential sites based on maximizing coverage for the least cost. Assessing tower placement is not a new idea; Canada and many European countries are looking into siting guidelines that help keep particularly vulnerable populations safe, like kids and those with illnesses.

The handful of human studies reviewed in Pearce's paper indicate that proximity to base stations correlates with headaches, dizziness, depression and other neurobehavioral symptoms, as well as increased cancer risk. Animal studies also indicate that these effects may be cumulative.

Given the current research, a cautious approach would place cell towers at least 500 meters, or about a third of a mile, from schools, hospitals and densely populated areas, such as high-rise neighborhoods, where many people sleep.

The challenge in the U.S., unlike in India where such setback laws are already in place, is that the laws governing cell tower siting, set out in Section 704 of the Telecommunications Act of 1996, specifically exclude "environmental effects" from consideration.

"This is a peculiar law, but saying that something is legal doesn't make it right or cost-effective in the long run," Pearce said. "It's in companies' best interests to be thoughtful about where to place cell towers; they don't want to move towers or be held responsible down the line. These effects are inadvertent -- but there are options to do it differently that can reduce potential health impacts and thus a company's future bottom line."

In addition to revamping search ring mapping to include a 500-meter buffer, which doesn't impact the cost of the siting process but reduces future liability, Pearce says there are other more innovative options, like cell splitting and small cell deployment, that could also decrease RFR exposure. At the end of the day, it comes down to thinking before building.

Credit: 
Michigan Technological University

'Going negative': How Trump has changed the Twitter narrative

If not for Twitter, US President Donald Trump would not be in the White House today. True/false? That's for others to judge but it's probably true, say two Australian linguists who have released a paper analysing Trump's use of Twitter prior to and six months after his election in 2016.

In a new report, Dr David Caldwell from the University of South Australia and Dr Andrew Ross from the University of Sydney have examined how Trump tends to "hyper personalise, character assassinate and use unprofessional language" on Twitter.

Trump's negative, blunt approach, using emotive hashtags to attack Democratic candidate Hillary Clinton in the 2016 election campaign, proved devastatingly effective, the authors say.

"In the lead up to his election, Trump engaged a whole new cohort of people who wouldn't ordinarily be interested in politics," says Dr Caldwell. "His hashtag, #CrookedHillaryClinton, seared into the American consciousness, casting doubt on Clinton's honesty, her capacity to lead and her trustworthiness."

Trump's use of rhetorical questions, loaded words, capital letters and exclamation marks also saturated the 3000 tweets that the authors analysed between June 2016 and August 2017.

"Twitter lends itself to punchy, emotional statements and suits President Trump's communication style. Previous research shows that the more negative his tweets are, the higher his poll ratings. Twitter was clearly highly influential for him in winning the 2016 US election," Dr Caldwell says.

Lead author on the paper, Dr Ross, says Trump's tweets appealed to his base and energised his supporters into voting for him. At the same time, his tweets didn't drive his opponents to vote against him in large enough numbers to defeat him.

Unlike Obama, who used a trained media team to tweet on his behalf, Trump speaks directly to people through his tweets, a factor that won him more support.

"Many Americans like the fact that their President bypasses the mainstream media to communicate with them. People are now able to get 'news' from the President directly via Twitter and the element of celebrity has undoubtedly worked for him," Dr Ross says.

There is evidence that Trump's "de-professional" style is being adopted - albeit a more toned-down version - by some other politicians, Dr Caldwell says, including Australian Prime Minister Scott Morrison who uses more personal and casual language in his tweets than his predecessors.

"There is a more aggressive tone to politics these days and that is reflected in President Trump's tweets," Dr Caldwell says.

"When you are so belligerent and direct in your language, people have no choice but to take sides because the platform doesn't encourage reasoned debate. There is no middle ground. Twitter is combative and that's what makes it so divisive."

While Facebook's popularity is declining, Dr Ross says there is no evidence to show that Twitter is losing ground, particularly as a political communication tool.

"Trump's constant use of Twitter has created new interest in the platform, with the global economy sometimes hinging on his every tweet," he says.

"I imagine Twitter will be just as important to Trump - or even more so - in his bid for re-election in 2020 than it was in 2016. Certainly, the Democrats will need a plan of some sort to compete."

Credit: 
University of South Australia

Science snapshots from Berkeley Lab

image: Trent Northen, a Berkeley Lab co-author, analyzes a microbiome sample.

Image: 
Roy Kaltschmidt/Berkeley Lab

A Matchmaker for Microbiomes

Microbiomes play essential roles in the natural processes that keep the planet and our bodies healthy, so it's not surprising that scientists' investigations into these diverse microbial communities are leading to advances in medicine, sustainable agriculture, cheap water purification methods, and environmental cleanup technology, just to name a few. However, trying to determine which microbes contribute to an important geochemical or physiological reaction is both incredibly challenging and slow-going, because the task involves analyzing enormous datasets of genetic and metabolic information to match the compounds mediating a process to the microbes that produced them.

But now, researchers have devised a new way to sort through the information overload.

Writing in Nature Methods, a team led by UC San Diego describes a neural network-based approach called microbe-metabolite vectors (mmvec), which uses probabilities to identify the most likely relationships between co-occurring microbes and metabolites. The team demonstrates how mmvec can outperform traditional correlation-based approaches by applying it to datasets from two well-studied microbiome types - those found in desert soils and in the lungs of cystic fibrosis patients - and gives a taste of how the approach could be used in the future by revealing relationships between microbially-produced metabolites and inflammatory bowel disease.

"Previous statistical tools used to estimate microbe-metabolite correlations performed comparably to random chance," said Marc Van Goethem, a postdoctoral researcher who is one of three study authors from Berkeley Lab. "Their poor performance led to the detection of spurious relationships and missed many true relationships. Mmvec is a powerful new tool that accurately links metabolite and microbial abundances to solve this problem. There could be wide-ranging applications from clinical trials to environmental engineering. Ultimately, mmvec will allow us to begin moving away from simple pattern recognition towards unravelling mechanisms."

When Solids and Liquids Meet: In Nanoscale Detail

How a liquid interacts with the surface of a solid is important in batteries and fuel cells, chemical production, corrosion phenomena, and many biological processes.

To better understand this solid-liquid interface, researchers at Berkeley Lab developed a platform to explore these interactions under real conditions ("in situ") at the nanoscale using a technique that combines infrared light with an atomic force microscopy (AFM) probe. The results were published in the journal Nano Letters.

The team explored the interaction of graphene with several liquids, including water and a common battery electrolyte fluid. Graphene is an atomically thin form of carbon. Its single-layer atomic structure gives the material some unique properties, including incredible mechanical strength and high electrical conductivity.

Researchers used a beam of infrared light produced at Berkeley Lab's Advanced Light Source and focused it at the tip of an AFM probe that scanned across a section of graphene in contact with the liquids. The infrared technique provides a nondestructive way to explore the active nanoscale chemistry of the solid-liquid interface.

By measuring the infrared light scattered from the probe's tip, researchers collected details about the chemical compounds and the concentration of charged particles along the solid-liquid interface. The same technique, which revealed hidden features at this interface that were not seen using conventional methods, can be used to explore a range of materials and liquids.

Researchers from the Lab's Materials Sciences Division, Molecular Foundry, and Energy Storage and Distributed Resources Division participated in the study. The Molecular Foundry and Advanced Light Source are DOE Office of Science user facilities.

Underwater telecom cables make superb seismic network

Fiber-optic cables that constitute a global undersea telecommunications network could one day help scientists study offshore earthquakes and the geologic structures hidden deep beneath the ocean surface.

In a recent paper in the journal Science, researchers from UC Berkeley, Lawrence Berkeley National Laboratory (Berkeley Lab), the Monterey Bay Aquarium Research Institute (MBARI) and Rice University describe an experiment that turned 20 kilometers of undersea fiber-optic cable into the equivalent of 10,000 seismic stations along the ocean floor. During their four-day experiment in Monterey Bay, they recorded a magnitude 3.5 quake and seismic scattering from underwater fault zones.

Their technique, which they had previously tested with fiber-optic cables on land, could provide much-needed data on quakes that occur under the sea, where few seismic stations exist, leaving 70% of Earth's surface without earthquake detectors.

"This is really a study on the frontier of seismology, the first time anyone has used offshore fiber-optic cables for looking at these types of oceanographic signals or for imaging fault structures," said Jonathan Ajo-Franklin, a geophysics professor at Rice University in Houston and a faculty scientist at Berkeley Lab. "One of the blank spots in the seismographic network worldwide is in the oceans."

Credit: 
DOE/Lawrence Berkeley National Laboratory

Face mask can help combat mild cases of sleep condition

A night-time face mask can improve energy levels and vitality in people who suffer from sleep apnoea, a condition associated with snoring and breathing problems at night.

This is the finding from a new study of over 200 patients, published in the journal The Lancet Respiratory Medicine, led by Imperial College London.

The research, conducted at 11 NHS sleep centres across the UK including the Royal Brompton & Harefield NHS Foundation Trust, is one of the first to investigate the use of the treatment for mild cases of sleep apnoea. The mask - called a CPAP machine - is currently only recommended for people whose sleep apnoea is moderate to severe.

Sleep apnoea affects over one billion adults globally. It causes the airways to become too narrow during sleep, making people briefly stop breathing many times throughout the night. It can also trigger loud snoring and frequent awakenings from sleep, with subsequent daytime sleepiness.

Severe cases of sleep apnoea are thought to affect up to 1.5 million people in the UK, with some estimates suggesting up to eight million people may have a mild form of the condition.

One treatment is a mask that fits over the nose or mouth called a continuous positive airway pressure (CPAP) machine, which gently pushes air into the mouth and throat, keeping the airways open.

Although previous trials have found a CPAP machine to improve symptoms of moderate to severe cases of the condition, this is the first large trial to find that mild cases of sleep apnoea can also be treated with this technology.

Mary Morrell, Professor of Sleep and Respiratory Physiology at the National Heart and Lung Institute at Imperial, and lead author of the research, said: "We are seeing increasing cases of sleep apnoea, and in a wide range of patients. Although the condition was previously thought to mainly affect overweight men, we now know it also strikes post-menopausal women, the elderly, and even children."

Professor Morrell, who is also honorary researcher at the Royal Brompton Hospital, added: "Around 60 per cent of all cases of sleep apnoea are classed as mild, but until now we didn't know whether a CPAP would be helpful to these patients."

In the study, 115 patients were asked to use the CPAP for three months, while 118 received standard care for mild sleep apnoea, which includes advice on improving sleep and avoiding anything that can exacerbate the condition, such as drinking alcohol before bed.

The research revealed those who used the CPAP machine had an improvement of 10 points on a so-called vitality scale, compared to those who received standard care.

The vitality scale assesses a range of factors such as sleep quality, energy levels and daytime sleepiness. The researchers also saw improvements in a number of additional factors among the patients who used the CPAP, including fatigue, depression, and social and emotional functioning.

The researchers explain they have not yet conducted an economic analysis of the cost to the NHS of treating mild cases of sleep apnoea with a CPAP machine. In previous studies they have shown that, if used correctly, the machines are cost-effective (using the criteria for cost-effectiveness defined by the National Institute for Health and Care Excellence).

Dr Julia Kelly, first author of the paper, said: "Currently the NHS doesn't routinely offer CPAP machines to cases of mild sleep apnoea, but our research suggests this treatment should now be considered."

The research was funded by ResMed, who manufacture CPAP machines, but the funder had no involvement in the trial methods or data analysis.

PATIENT VIEWPOINT:

Patricia Ware, 62, from Southall, tried a CPAP machine after being diagnosed with sleep apnoea at age 60. She explains:

"My energy levels had been low for a while, but I started to consider whether I should see my GP when my husband told me I was snoring very loudly. At one stage I even woke myself up through snoring.

"I finally decided to make an appointment when I started nodding off at work. I was working at a school at the time, and during the day colleagues told me that I'd fallen asleep. I was horrified as I didn't remember drifting off. My doctor sent me for tests at Harefield Hospital, and the medical team asked if I wanted to take part in a trial using a CPAP machine. At first it felt slightly strange - and involved wearing a mask that just fitted over my nose. However it was soon discovered I was a mouth-breather, and so I was given a mask that covered my nose and mouth. After initial adjustments I found the machine very comfortable, and now don't even notice I'm wearing it.

"After a year and a half of using the machine I now feel like the old me - I have my energy levels back, and am now working as a steward at a football training ground, and have not fallen asleep at work since."

Credit: 
Imperial College London

Researchers find clue to preventing addiction relapse

With any addiction in which a user has successfully resisted a chemical, activity or substance, relapse is vexing. And with opioids, it's often deadly. Fatal overdoses following relapse from opioid addiction are reaching epidemic proportions.

In 2017, more than 70,000 people died from drug overdoses, making it a leading cause of injury-related death in the United States, according to the Centers for Disease Control and Prevention. Of those deaths, almost 68 percent involved a prescription or illicit opioid.

A study published in Neuropsychopharmacology reported that relapse can be prevented by controlling cells in a brain region called the nucleus accumbens. The study was conducted in 90 genetically diverse Sprague Dawley rats.

"We used a tool called chemogenetic receptors to act as a light switch on the cells," said senior author Susan Ferguson, director of the Alcohol and Drug Abuse Institute at the University of Washington and associate professor of psychiatry and behavioral sciences at UW's School of Medicine. "When we changed activity of neurons in the nucleus accumbens, we were able to control relapse behavior."

She said this process could be used to prevent relapse for any addiction - including compulsive gambling and overeating - because they affect the same brain regions as drug addiction.

Among the 90 rats exposed to heroin, roughly 40% developed addiction-like behavior. The researchers used six common features of addiction to determine whether the rats were high-risk or casual users:

How much heroin did they ingest?

During periods of drug availability, how much time was spent engaging in drug use?

During periods in which a cue signaled that the drug was unavailable, how much time did they spend seeking the drug?

How motivated were they to get heroin?

During treatment, were they still motivated to get drugs?

If they were given a cue associated with their drug use, did they relapse?

With this model, the researchers focused on identifying the brain circuitry that regulates addictive behavior, and used artificial receptors to control activity in the nucleus accumbens. Receptors are activated by chemicals such as dopamine or by medications, which cause brain cell activity to increase or decrease.

The researchers could affect the behavior only of the high-risk rats, however, and they could not discern what motivated some rats to use drugs and others to ignore the drugs. Future studies could explore that, Ferguson said.

The research confirms the influence of chemogenetic receptors, Ferguson said, and shows how technology can target specific cell populations in the brain rather than the entire brain.

"I envision and hope we could make a pill that decreases relapse but still keeps people motivated for other things, and feeling good," she said.

Credit: 
University of Washington School of Medicine/UW Medicine

Human behaviour follows probabilistic inference patterns

image: The researchers designed their experiments presenting hierarchical integration tasks using the plane task.

Image: 
UPF

How do human beings perceive their environment and make their decisions? To interact successfully with the immediate environment, it is not enough for human beings to have basic evidence of the world around them. This information by itself is insufficient because it is inherently ambiguous and must be integrated into a particular context to minimize the uncertainty of sensory perception. But, at the same time, the context itself is ambiguous. For example, am I in a safe or a dangerous place?

A study published on 28 November in Nature Communications by Philipp Schustek, Alexandre Hyafil and Rubén Moreno-Bote, researchers at the Center for Brain and Cognition (CBC) of the Department of Information and Communication Technologies (DTIC) at UPF, suggests that the brain has a refined form of representation of uncertainty at several hierarchical levels, including context. Hence, the brain holds a very detailed, almost mathematically probabilistic representation of everything around us that we consider important.

"The notions of probability, though intuitive, are very difficult to quantify and use rigorously. For example, my statistics students often fail to solve some of the problems I pose in class. In our study, we find that a complicated mathematical problem involving the use of the most sophisticated rules of probability can be solved intuitively if it is presented simply and in a natural context", asserts Rubén Moreno-Bote, coordinator of the Research Group on Theoretical and Cognitive Neuroscience at the CBC.

Cognitive tasks of hierarchical integration

Let us suppose that a city airport is hosting a football final and we look at a few passengers who are leaving a plane. If we note that four of them are fans of the red team and two of the blue team, we could conclude that more fans of the red team are attending the final than of the blue team. This inference, based on incomplete sensory evidence, could be improved with contextual information. For example, if worldwide there are more fans of the blue team than of the red team, then despite our initial observation we would revise our inference, counting how many supporters of each group are travelling on the plane to confirm more accurately whether more fans of the red team than of the blue team have really come to the city. Or we could do the opposite, using the context to infer whether the observed sample follows the more general pattern or not.

The researchers designed their experiments presenting hierarchical integration tasks using the plane task. "For the study, we told our participants that they are at an airport where planes can arrive carrying more of one type of person than of another, for example, more supporters of Barça than of Madrid. On seeing a handful of passengers leaving several aircraft, the participants can predict with mathematical precision the likelihood that the next plane will be carrying more passengers of a certain type", Moreno-Bote explains.

"In general, this structure of tasks creates hierarchical dependencies among the hidden variables to be solved bottom up (deducing the context of previous observations) and then passing the message top down (deducing the current status combining current observations with the inferred context)", the authors explain.

The results showed that the participants, based on their preliminary observations, built a probabilistic representation of the context. These results help to understand how people form mental representations of what surrounds us and how we assign and perceive the uncertainty of this context.
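The bottom-up and top-down steps described above can be sketched with a toy Bayesian model. This is an illustrative simplification, not the model fitted in the paper: assume two possible contexts ("red-majority" and "blue-majority"), with a hypothetical probability of 0.7 of observing a red fan under a red-majority context:

```python
def posterior_context(observations, p_red=0.7, prior=0.5):
    """Bottom-up step: infer the context from passenger counts seen on
    previous planes. observations is a list of (n_red, n_blue) per plane.
    Returns P(context = red-majority | observations). Parameters are
    hypothetical, chosen only for illustration."""
    like_red = like_blue = 1.0
    for n_red, n_blue in observations:
        like_red *= p_red**n_red * (1 - p_red)**n_blue    # red-majority context
        like_blue *= (1 - p_red)**n_red * p_red**n_blue   # blue-majority context
    z = prior * like_red + (1 - prior) * like_blue        # normalizing constant
    return prior * like_red / z

# Bottom-up: three planes carrying mostly red fans push belief toward
# a red-majority context.
p_context = posterior_context([(4, 2), (5, 1), (3, 3)])

# Top-down: predictive probability that a passenger on the NEXT plane is
# a red fan, combining both possible contexts weighted by their posterior.
p_next_red = p_context * 0.7 + (1 - p_context) * 0.3
assert p_context > 0.5 and p_next_red > 0.5
```

The key property, matching the study's claim, is that the prediction for the next plane is not based on any single observation but on a probabilistic representation of the context built from all previous ones.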

Credit: 
Universitat Pompeu Fabra - Barcelona

SwRI-built instrument confirms solar wind slows farther away from the Sun

image: The SWAP instrument aboard NASA's New Horizons spacecraft has confirmed that the solar wind slows as it travels farther from the Sun. This schematic of the heliosphere shows the solar wind begins slowing at approximately 4 AU radial distance from the Sun and continues to slow as it moves toward the outer solar system and picks up interstellar material. Current extrapolations reveal the termination shock may currently be closer than found by the Voyager spacecraft. However, increasing solar activity will soon expand the heliosphere and push the termination shock farther out, possibly to the 84-94 AU range encountered by the Voyager spacecraft.

Image: 
Figure courtesy of Southwest Research Institute; background artist rendering by NASA and Adler Planetarium

SAN ANTONIO -- December 2, 2019 -- Measurements taken by the Solar Wind Around Pluto (SWAP) instrument aboard NASA's New Horizons spacecraft are providing important new insights from some of the farthest reaches of space ever explored. In a paper recently published in the Astrophysical Journal, a team led by Southwest Research Institute shows how the solar wind -- the supersonic stream of charged particles blown out by the Sun -- evolves at increasing distances from the Sun.

"Previously, only the Pioneer 10 and 11 and Voyager 1 and 2 missions have explored the outer solar system and outer heliosphere, but now New Horizons is doing that with more modern scientific instruments," said Dr. Heather Elliott, a staff scientist at SwRI, Deputy Principal Investigator of the SWAP instrument and lead author of the paper. "Our Sun's influence on the space environment extends well beyond the outer planets, and SWAP is showing us new aspects of how that environment changes with distance."

The solar wind fills a bubble-like region of space encompassing our solar system, called the heliosphere. From aboard New Horizons, SWAP collects detailed, daily measurements of the solar wind as well as other key components called "interstellar pickup ions" in the outer heliosphere. These interstellar pickup ions are created when neutral material from interstellar space enters the solar system and becomes ionized by light from the Sun or by charge exchange interactions with solar wind ions.

As the solar wind moves farther from the Sun, it encounters an increasing amount of material from interstellar space. When interstellar material is ionized, the solar wind picks up the material and, researchers theorized, slows and heats in response. SWAP has now detected and confirmed this predicted effect.

The SWAP team compared the New Horizons solar wind speed measurements from 21 to 42 astronomical units to the speeds at 1 AU from both the Advanced Composition Explorer (ACE) and Solar TErrestrial RElations Observatory (STEREO) spacecraft. (One AU is equal to the distance between the Sun and Earth.) By 21 AU, it appeared that SWAP could be detecting the slowing of the solar wind in response to picking up interstellar material. However, when New Horizons traveled beyond Pluto, between 33 and 42 AU, the solar wind measured 6-7% slower than at the 1 AU distance, confirming the effect.

In addition to confirming the slowing of the solar wind at great distances, the change in the solar wind temperature and density could also provide a means to estimate when New Horizons will join the Voyager spacecraft on the other side of the termination shock, the boundary marking where the solar wind slows to less than the sound speed as it approaches the interstellar medium. Voyager 1 crossed the termination shock in 2004 at 94 AU, followed by Voyager 2 in 2007 at 84 AU. Based on current lower levels of solar activity and lower solar wind pressures, the termination shock is expected to have moved closer to the Sun since the Voyager crossings. Extrapolating current trends in the New Horizons measurements also indicates that the termination shock might now be closer than when it was intersected by Voyager. At the earliest, New Horizons will reach the termination shock in the mid-2020s. As the solar cycle activity increases, the increase in pressure will likely expand the heliosphere. This could push the termination shock to the 84-94 AU range found by the Voyager spacecraft before New Horizons has time to reach the termination shock.

New Horizons' journey through the outer heliosphere contrasts Voyager's in that the current solar cycle is mild compared to the very active solar cycle Voyager experienced in the outer heliosphere. In addition to measuring the solar wind, New Horizons' SWAP is extremely sensitive and simultaneously measures the low fluxes of interstellar pickup ions with unprecedented time resolution and extensive spatial coverage. New Horizons is also the only spacecraft in the solar wind beyond Mars (1.5 AU) and, consequently, the only spacecraft measuring interactions between the solar wind and the interstellar material in the outer heliosphere during the current mild solar cycle. New Horizons is on course to be the first spacecraft to measure both the solar wind and interstellar pickup ions at the termination shock.

"New Horizons has significantly advanced our knowledge of distant planetary objects, and it's only fitting that it is now also revealing new knowledge about our own Sun and its heliosphere," said New Horizons Principal Investigator Dr. Alan Stern of the SwRI.

Credit: 
Southwest Research Institute

New index maps relationships between poverty and accessibility in Brazil

Researchers from the School of Engineering in Trinity College Dublin have developed a new spatial index that measures the connections between poverty and poor accessibility.

The research, recently published in the Journal of Transport Geography, builds on previous work that shows how poor transportation availability can result in poor access to health care and employment, hence reinforcing the cycle of poverty in this area of rural Brazil.

One in five people living in extreme poverty in this region are forced to journey at least 10 km, mostly by non-motorised transport, through the hottest and driest part of Brazil to reach the nearest healthcare facility. This results in concerning health outcomes such as low life expectancy and high child mortality.

Transport planners in more economically developed countries use large amounts of data to plan and predict the usage of transportation networks. But because such data are expensive to collect, and given Brazil's large geographical size, these datasets do not exist there.

Information on where low-income populations live in rural areas is not available, so the Trinity researchers used the location of water cisterns as a proxy for rural dwellings to determine access to services.

The research developed a planning tool that enables local governments to measure levels of poor accessibility across nearly 2,000 municipalities. The mapping tool also enables them to target funding and transportation interventions to alleviate multidimensional poverty where it is most needed.

The main findings of the research include:

53% of the rural low-income population is situated more than 5 km away from the closest basic healthcare centre

60% is more than 10 km from the nearest hospital

49% is more than 10 km from the closest urban centre
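The accessibility measure behind findings like these can be sketched as a nearest-facility distance computed from coordinates, with the water-cistern location standing in for the rural dwelling, as the researchers did. The coordinates and threshold below are hypothetical, not values from the study:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two lat/lon points."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2)**2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2)**2
    return 2 * 6371 * asin(sqrt(a))  # mean Earth radius ~6371 km

def nearest_facility_km(dwelling, facilities):
    """Distance from one dwelling (proxy: cistern location) to its closest facility."""
    return min(haversine_km(*dwelling, *fac) for fac in facilities)

# Hypothetical coordinates in Brazil's semi-arid northeast.
cistern = (-8.05, -39.50)
clinics = [(-8.00, -39.40), (-8.30, -39.90)]
d = nearest_facility_km(cistern, clinics)
assert d > 5  # this dwelling falls in the "more than 5 km from care" group
```

Aggregating this distance over every cistern in a municipality, and comparing it against thresholds such as 5 km or 10 km, yields population shares of the kind listed above.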

Brian Caulfield, Associate Professor in Trinity's School of Engineering, and project coordinator, said:

"Transport policy development is dependent upon good data which are often expensive to collect. The approach developed in this research enables municipalities to use existing databases to outline the extent of the rural transport poverty problem and to direct policies that can break the poverty cycle."

Rodolfo Benevenuto, PhD Researcher at Trinity College Dublin and co-author of the research, added:

"As poverty is perceived not only as a lack of income but also as a lack of access to life opportunities, transport planning and development can offer instrumental strategies to reach the global goal of eradicating extreme poverty by 2030. Accessibility measurements are essential to promote evidence-based guidance that can make transport interventions more effective in tackling poverty."

Credit: 
Trinity College Dublin