Tech

US wildfire smoke deaths could double by 2100

WASHINGTON D.C. -- The number of deaths associated with the inhalation of wildfire smoke in the U.S. could double by the end of the century, according to new research.

A new study simulating the effects of wildfire smoke on human health finds continued increases in wildfire activity in the continental United States due to climate change could worsen air quality over the coming decades. The number of human deaths from chronic inhalation of wildfire smoke could increase to more than 40,000 per year by the end of the 21st century, up from around 15,000 per year today.

Wildfire smoke is composed of a mixture of gases and microscopic particles from burned material known as particulate matter. Particulate matter from wildfire smoke often reaches nearby communities, where it can irritate the eyes, aggravate the respiratory system and worsen chronic heart and lung diseases, according to the federal Centers for Disease Control and Prevention.

Exposure to particulate matter is associated with visibility degradation, premature death in people with heart or lung disease, heart attacks, irregular heartbeats, aggravated asthma, decreased lung function and increased respiratory symptoms, according to the U.S. Environmental Protection Agency. Older adults, children and those with heart or lung diseases are most at risk.

Researchers used global climate model simulations to estimate particulate matter's impacts on air quality and human health in the contiguous United States in the early-, mid-, and late-21st century under different climate scenarios. The new study, published in GeoHealth, a journal of the American Geophysical Union, provides the first estimates of future smoke health and visibility impacts using a predictive land-fire model.
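The release does not reproduce the study's equations, but attributable-mortality estimates of this kind are conventionally built on a log-linear concentration-response function relating a change in annual-mean PM2.5 to a relative risk of death. A minimal sketch of that calculation, with an illustrative risk coefficient from the epidemiological literature rather than the study's own model:

```python
import math

def attributable_deaths(baseline_deaths, delta_pm25, beta=0.0058):
    """Deaths attributable to a PM2.5 increase via a log-linear
    concentration-response function. beta ~ ln(1.06)/10 corresponds
    to a 6% mortality increase per 10 micrograms/m^3, a value in the
    range reported by cohort studies (illustrative, not the paper's).
    """
    attributable_fraction = 1.0 - math.exp(-beta * delta_pm25)
    return baseline_deaths * attributable_fraction

# Illustrative only: a 5 microgram/m^3 smoke increase applied to
# ~2.8 million annual U.S. deaths.
print(round(attributable_deaths(2_800_000, 5.0)))  # ~80,000
```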

Emissions of particulate matter from human activities--such as burning fossil fuels--are declining nationwide, but wildfires are increasing in frequency and intensity because of climate change, according to the study. From January to July 2018, NOAA recorded 37,718 fires that burned 4.8 million acres of land. In 2017, the U.S. Forest Service's wildfire suppression costs reached a historic high of $2.4 billion.

The new study finds the number of deaths attributable to total particulate matter from all sources will decrease by the end of the 21st century, but the number of deaths attributable to fire-related particulate matter could double under the worst climate scenarios.

This new finding highlights the need to prepare for future air quality changes caused by wildfires in the U.S., according to the study's authors.

"We know from our own research and many, many other groups that smoke has negative impacts on human health," said Jeff Pierce, associate professor of atmospheric science at Colorado State University in Fort Collins and co-author of the new study. "With the knowledge that fires have been increasing in parts of the U.S., we wanted to look at how bad this might get."

Looking Forward

In the new study, Pierce and his team analyzed the potential effects of wildfire smoke on human health over the coming decades.  They simulated the impacts of changing fire emissions on air quality, visibility, and premature deaths in the middle to late 21st century under different climate scenarios.

They found that declines in particulate matter from human sources such as vehicles, industry and power plants over the 21st century are offset by increases in smoke emissions from more intense wildfires, causing a net increase in particulate matter in some regions. In the study, researchers used simulated concentrations of particulate matter generated by a model for early-, mid- and late-century time frames.

The new study predicts that average visibility due to particulate matter will improve across the contiguous United States over the 21st century, but fire-related particulate matter will reduce visibility on the worst days in the western and southeastern U.S. Haze from wildfire smoke affects how people see colors, forms and textures of a given vista or skyline. The fine particles in the air absorb and scatter sunlight, making it difficult to see clearly, according to the National Park Service.

From 2000 to 2010, approximately 140,000 deaths per year, or 5 percent of total deaths, were attributable to total particulate matter. Of those deaths, about 17,000 per year, or 0.7 percent of total deaths, were linked to particulate matter from wildfires. In the paper, the authors estimate uncertainties in these numbers.

The new study estimates fire-related particulate matter deaths could more than double by the end of the century in the worst-case-scenario prediction model.

"People could use this information as sort of a first estimate of what to prepare for in terms of future air quality," Pierce said. "We need more simulations to be able to assess the different probabilities of what the future might be."

Although there are increased efforts in place to reduce wildfire risks in the U.S., wildfires have continued to grow in frequency and intensity, trends strongly linked to a changing climate, according to the study.

To continue reducing the health burdens due to fire-related particulate matter, the study's authors call for more emphasis on reducing exposure through public health campaigns in conjunction with climate mitigation efforts.

"I think that we need to act now," said Sheryl Magzamen, associate professor of epidemiology at Colorado State University, who was not involved in the new study. "Our exposure to wildfire smoke is only going to get worse going into the next century, so we need to plan and be prepared in terms of acting to protect population health."

Credit: 
American Geophysical Union

Following Twitter conversations around hacked diabetes tools to manage blood sugar

image: This is a hacked open artificial pancreas system (OpenAPS).

Image: 
University of Utah Health

The diabetes online community is leading grassroots efforts focused on accelerating the development, access and adoption of diabetes-related tools to manage the disease. Researchers at University of Utah Health examined the community's online Twitter conversation to understand their thoughts concerning open source artificial pancreas (OpenAPS) technology. The results of this study are available online in the September 10 issue of the Journal of Diabetes Science and Technology.

"There is a large community that is actively exploring how they can manage their diabetes using off-label solutions," says Michelle Litchman, Ph.D., FNP-BC, FAANP, an assistant professor in the College of Nursing at U of U Health and first author on the paper. "Health care providers, industry and the FDA need to understand the wants and needs of people with diabetes in order to better serve them. OpenAPS was created out of a need for better solutions."

Within the diabetes community, OpenAPS has been touted as an ideal technology for managing the disease. The off-label technology combines an off-the-shelf continuous glucose monitor (CGM) and an insulin pump that interact to minimize glucose variability.

Before the Food and Drug Administration (FDA) approved the first technology to bridge these two devices in 2017, the community took matters into their own hands. They hacked into current CGMs and older insulin pumps and developed open source code to get the two devices to speak to one another, creating an OpenAPS. By crowdsourcing their code hacks, the community has improved this approach for blood sugar management.
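To make the closed-loop idea concrete, here is a toy sketch of the kind of decision such a system makes every few minutes: project glucose forward from CGM readings and nudge the pump's temporary basal rate toward a target. The parameter names, the 30-minute projection and the dosing rule are invented for illustration; this is not the actual OpenAPS algorithm:

```python
def recommend_temp_basal(glucose_now, glucose_trend, target=110,
                         isf=50.0, current_basal=1.0):
    """Toy closed-loop step: estimate glucose ~30 minutes ahead and
    adjust the temporary basal rate (units/hour) toward the target.

    glucose_now   : latest CGM reading (mg/dL)
    glucose_trend : CGM slope (mg/dL per 5-minute reading)
    isf           : insulin sensitivity factor (mg/dL drop per unit)
    """
    projected = glucose_now + glucose_trend * 6      # six 5-min steps
    correction = (projected - target) / isf          # units needed
    return max(0.0, current_basal + correction)      # never negative

# Rising glucose (160 mg/dL, +2 mg/dL per reading) -> basal raised.
print(recommend_temp_basal(160, 2.0))  # 2.24 units/hour
```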

Litchman and colleagues followed the #OpenAPS hashtag on Twitter to understand how the community is discussing this option.

After surveying more than 3,000 tweets using the #OpenAPS hashtag, generated by more than 300 participants from January 2016 to January 2018, Litchman found five overarching themes in the community's conversation.

1. With OpenAPS, self-reported A1C and glucose variability improved.

2. OpenAPS reduced the daily distress and burden associated with diabetes.

3. OpenAPS is perceived as safe.

4. Users discussed their interactions with health care providers concerning OpenAPS.

5. Users shared ways to adapt OpenAPS technology for individual needs.
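The study's themes were identified through human qualitative coding, but a rough sense of how a first automated pass over such tweets might look can be sketched in a few lines. The keyword buckets below are invented for illustration and are far cruder than the researchers' actual analysis:

```python
from collections import Counter

# Hypothetical keyword buckets loosely mirroring the five themes.
THEMES = {
    "improved A1C / glucose variability": ["a1c", "variability", "flat"],
    "reduced distress and burden": ["sleep", "burden", "stress"],
    "perceived safety": ["safe", "safety"],
    "provider interactions": ["doctor", "endo", "provider"],
    "adapting the technology": ["rig", "setup", "tweak", "code"],
}

def tag_tweets(tweets):
    """Count how many tweets touch each theme (keyword matching)."""
    counts = Counter()
    for text in tweets:
        lowered = text.lower()
        for theme, words in THEMES.items():
            if any(word in lowered for word in words):
                counts[theme] += 1
    return counts

print(tag_tweets(["Last A1C was 5.4!", "Tweaked my rig's code today"]))
```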

To date, more than 700 people with diabetes are using OpenAPS to manage the disease. One participant likened OpenAPS to having an autopilot on an aircraft. While the patient still has to manage their diabetes, OpenAPS has reduced some of the burden of care. Some tweets include:

"Finding OpenAPS literally changed my life. My numbers have been astounding. Last A1C was 5.4!"

"Boring glucose is beautiful [photo of CGM with a flat glucose pattern for the previous 3 hours]"

"For those times when I've lost [connection to] CGM readings... fallback [to standard] basal."

"Citizen scientists from all over the world are coming together to enhance existing diabetes technology," Litchman said. "They are not waiting for solutions. They are making solutions to help themselves manage their diabetes with more ease."

Although the Twitter analysis exposes an active community that is exploring their options, there are limitations to the study, according to Litchman. It was not constructed using an experimental design or prospective cross-sectional data. In addition, the community participating in the conversation has a vested interest in OpenAPS. Social pressure may compel community participants to only post positive experiences with OpenAPS, reflecting a potential positive bias in the conversation.

"There are some unknowns about this type of technology," Litchman said. "While there are obvious benefits to many people who are using OpenAPS, there are some areas that may be concerning."

Patients do not need a prescription to create and use OpenAPS. Nor do they have a trained professional diabetes educator to help set up the system and teach them to use it, a common practice when starting a new FDA-approved diabetes technology such as an insulin pump or CGM. Additionally, many of the hackable insulin pumps are no longer for sale through the device maker, opening a black market for used products.

This do-it-yourself (DIY) system is not regulated or approved by the FDA or device manufacturers of the altered insulin pump or CGM; however, the FDA is currently exploring OpenAPS technology as another option for diabetes management.

The pancreas produces the hormones (insulin and glucagon) that help regulate blood sugar. When this organ does not function properly, the body cannot control blood sugar effectively, leading to diabetes.

"The open artificial pancreas system is the next technological frontier of diabetes," said Kelly L. Close, Founder, The DiaTribe Foundation, an organization that seeks to empower readers with useful, actionable information that helps make sense of diabetes. "It promises to transform all aspects of care and create opportunities that no other therapy can approach. I take nothing for granted, but if successful, it will normalize a condition that has bedeviled humanity for thousands of years." Ms. Close was not involved in this study.

Credit: 
University of Utah Health

Transparency may improve US home buyout programs

Imagine a major storm hits your neighborhood and the government offers to purchase homes with "a history of flood damage." Your basement is completely flooded. Will you qualify for the buyout? What about your neighbors?

Relocating residents from areas vulnerable to flooding, known as "managed retreat," is a potentially important approach for helping communities at risk of losing their homes to coastal erosion and natural disasters. But a closer look at the government's past buyouts reveals a major weakness in the process: lack of transparency. Fortunately, past buyout programs also reveal strategies to address this challenge.

Based on analyses of academic studies and reports on managed retreat from floodplains in the United States, Stanford University research shows local government programs are often subjective about which homes qualify for buyouts and use vague language in their communications. Buyout programs must be cost-effective to qualify for funding from the Federal Emergency Management Agency (FEMA) - but that approach may have unintended and disproportionate effects on low-income and minority populations. The research appears in Climatic Change Sept. 10.

"When you're talking about a publicly funded government program to relocate people, I think it's problematic that we're not being transparent about why and how we're buying up homes," said author A.R. Siders, who conducted research as a PhD student in the Emmett Interdisciplinary Program in Environment and Resources at Stanford School of Earth, Energy & Environmental Sciences (Stanford Earth). "When we look at how managed retreat has been done to date, we can see the ways it went wrong, and this helps us to improve in the future. We can also see strategies where it went right."

Moving vulnerable populations

Siders found that property acquisitions historically occurred in areas that experience recurrent flooding and rebuilding, or where a major disaster triggers the need for new solutions. The programs are often funded by the federal government and administered by the state or local government.

"If you look at the trends, we are accumulating assets in places at risk - in floodplains, in fire-exposed zones," said Katharine Mach, a climate change risk expert who worked with Siders in a climate adaptation research group at Stanford. "The question of how will we grapple with flooding is going to be relevant to many millions more people globally in this century than the past century."

Siders looked at publicly available information for 8,614 buyouts in low-lying areas adjacent to rivers that were subject to flooding since the 1980s. The areas experienced floodwaters following disasters including hurricanes Sandy (2012), Irene (2011), Floyd (1999) and Fran (1996), and from floods in Texas, North Carolina, Oklahoma and the Midwest. In her analysis of communications about buyout opportunities, Siders found that messaging often included language about eligibility based on "the best interest of the community," "meeting community values" or "enhancing the natural environment."

"All of the data we have suggests that this lack of transparency is creating a lack of trust, and that's translating into lower participation rates," said Siders, who recently started a position as a postdoctoral fellow at the Harvard University Center for the Environment. "Fewer homeowners want to be involved in a process they don't understand."

Without cooperation on managed retreat, the United States will face difficulty adapting its existing infrastructure to climate change risks, the researchers said. The country experienced more than $300 billion in disaster damages in 2017, one of the most damaging years in terms of property, people, lives and well-being. Managed retreat may be "a key partner in adaptive actions that preserve vibrant lifestyles moving forward," said Mach, a senior research scientist in the department of Earth System Science at Stanford Earth.

Social justice

Siders' research found that managed retreat has resulted in a majority of buyouts occurring in low-income neighborhoods - a pattern that may be perpetuating a history of social inequality.

"The communities most exposed to natural hazards in the U.S. tend to be low-income and minority communities," Siders said. "Managed retreat through buyouts can help people escape disaster cycles, but it can also break up neighborhoods and perpetuate problems if it's not done with social justice in mind."

Political objectives can also motivate legislators to offer buyouts to some neighborhoods but not others. Homeowners can feel either forced out or left behind to fend for themselves in a disaster-prone area, depending on the situation. That subjectivity creates a sense of injustice, according to the literature that Siders analyzed. The research highlights the need for local governments to learn from past experiences moving forward, Siders said.

"It's just not feasible to build a cement wall around the entire coastal United States - some people are going to have to move," Siders said. "If more governments are going to be using managed retreat and doing it more frequently, we want to make even more sure that it's being done in a just, equitable and effective way."

Recommendations in the study to improve future buyout programs include increasing transparency by making decision criteria about where buyouts will happen clear and publicly available; involving community members in pre-disaster planning so retreat programs come as less of a surprise; and placing more emphasis on where people will relocate if they leave the floodplain.

Credit: 
Stanford University

Social support is critical to life satisfaction in young patients with cancer

Among adolescents and young adults with cancer, social support was the most decisive factor associated with life satisfaction. Published early online in CANCER, a peer-reviewed journal of the American Cancer Society, the findings indicate that social support and how young cancer patients process the experience of being ill have far greater importance for their life satisfaction than sociodemographic or medical factors do. September is Childhood Cancer Awareness Month.

Life satisfaction strongly relates to quality of life, which can be affected during cancer treatment. Adolescents and young adults with cancer may be especially vulnerable as they are dealing with a serious disease at a complex psychosocial stage of life that can include leaving home, establishing financial and social independence, forming a family, and starting a career.

To determine which factors might affect life satisfaction in these patients, a team of researchers at University Medical Center Leipzig in Germany gave a questionnaire at two time points (12 months apart) to 514 patients who were aged 18 to 39 years at the time of cancer diagnosis and had been diagnosed within the previous four years. In comparing answers between the first and second questionnaire, the investigators looked for differences in life satisfaction and 10 subdomains: friends/acquaintances, leisure activities/hobbies, health, income/financial security, work/profession, housing situation, family life, children/family planning, partnership, and sexuality. The researchers also assessed various sociodemographic (e.g. age, education, having children), medical (e.g. treatments, time since diagnosis, additional disease), and psychosocial (e.g. social support, perceived adjustment to the disease) factors in patients.

Negative impacts were most prevalent in patients' financial and professional situations, family planning, and sexuality. Of all the examined variables, social support was the most decisive factor associated with life satisfaction at both time points.

"Care providers should pay special attention to those patients who lack social support and have higher levels of disease-related burden, and should be included in suitable supportive care programs," said lead author Katja Leuteritz, Dipl-Psych.

Credit: 
Wiley

Robot can pick up any object after inspecting it

image: Manuelli uses the DON system and Kuka robot to grasp a cup.

Image: 
Tom Buehler

Humans have long been masters of dexterity, a skill that can largely be credited to the help of our eyes. Robots, meanwhile, are still catching up. Certainly there's been some progress: for decades robots in controlled environments like assembly lines have been able to pick up the same object over and over again.

More recently, breakthroughs in computer vision have enabled robots to make basic distinctions between objects, but even then, they don't truly understand objects' shapes, so there's little they can do after a quick pick-up.

In a new paper, researchers from MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) say that they've made a key development in this area of work: a system that lets robots inspect random objects and visually understand them enough to accomplish specific tasks without ever having seen them before.

The system, dubbed "Dense Object Nets" (DON), looks at objects as collections of points that serve as "visual roadmaps" of sorts. This approach lets robots better understand and manipulate items, and, most importantly, allows them to even pick up a specific object among a clutter of similar objects - a valuable skill for the kinds of machines that companies like Amazon and Walmart use in their warehouses.

For example, someone might use DON to get a robot to grab onto a specific spot on an object - say, the tongue of a shoe. From that, it can look at a shoe it has never seen before, and successfully grab its tongue.

"Many approaches to manipulation can't identify specific parts of an object across the many orientations that object may encounter," says PhD student Lucas Manuelli, who wrote a new paper about the system with lead author and fellow PhD student Pete Florence, alongside MIT professor Russ Tedrake. "For example, existing algorithms would be unable to grasp a mug by its handle, especially if the mug could be in multiple orientations, like upright, or on its side."

The team views potential applications not just in manufacturing settings, but also in homes. Imagine giving the system an image of a tidy house, and letting it clean while you're at work, or using an image of dishes so that the system puts your plates away while you're on vacation.

What's also noteworthy is that none of the data was actually labeled by humans; rather, the system is "self-supervised," so it doesn't require any human annotations.

Making it easy to grasp

Two common approaches to robot grasping involve either task-specific learning, or creating a general grasping algorithm. These techniques both have obstacles: task-specific methods are difficult to generalize to other tasks, and general grasping doesn't get specific enough to deal with the nuances of particular tasks, like putting objects in specific spots.

The DON system, however, essentially creates a series of coordinates on a given object that serve as a kind of "visual roadmap," giving the robot a better understanding of what it needs to grasp, and where.

The team trained the system to look at objects as a series of points that make up a larger coordinate system. It can then map different points together to visualize an object's 3-D shape, similar to how panoramic photos are stitched together from multiple images. After training, if a person specifies a point on an object, the robot can take a photo of that object, and identify and match points to be able to then pick up the object at that specified point.
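The matching step alone can be illustrated compactly: given dense per-pixel descriptors for a reference image and a new image, find the new pixel whose descriptor best matches the one at the user-specified reference pixel. The sketch below uses random arrays and a brute-force nearest-neighbor search purely for illustration; the actual system learns its descriptors with a self-supervised network:

```python
import numpy as np

def match_point(ref_descriptors, ref_pixel, new_descriptors):
    """Find the pixel in the new image whose descriptor is nearest
    (Euclidean distance) to the descriptor at ref_pixel.

    ref_descriptors, new_descriptors : (H, W, D) descriptor maps
    ref_pixel                        : (row, col) in the reference image
    """
    target = ref_descriptors[ref_pixel]                        # (D,)
    dists = np.linalg.norm(new_descriptors - target, axis=-1)  # (H, W)
    return np.unravel_index(np.argmin(dists), dists.shape)

# Toy 64x64 images with 8-dimensional descriptors.
ref = np.random.rand(64, 64, 8)
new = np.random.rand(64, 64, 8)
print(match_point(ref, (10, 20), new))  # (row, col) of best match
```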

This is different from systems like UC-Berkeley's DexNet, which can grasp many different items but can't satisfy a specific request. Imagine an 18-month-old infant, who doesn't understand which toy you want it to play with but can still grab lots of items, versus a four-year-old who can respond to "go grab your truck by the red end of it."

In one set of tests done on a soft caterpillar toy, a Kuka robotic arm powered by DON could grasp the toy's right ear from a range of different configurations. This showed that, among other things, the system has the ability to distinguish left from right on symmetrical objects.

When tested on a bin of different baseball hats, DON could pick out a specific target hat despite all of the hats having very similar designs - and despite never having seen pictures of the hats in training data before.

"In factories robots often need complex part feeders to work reliably," says Manuelli. "But a system like this that can understand objects' orientations could just take a picture and be able to grasp and adjust the object accordingly."

In the future, the team hopes to improve the system to a place where it can perform specific tasks with a deeper understanding of the corresponding objects, like learning how to grasp an object and move it with the ultimate goal of, say, cleaning a desk.

The team will present their paper on the system next month at the Conference on Robot Learning in Zürich, Switzerland.

Credit: 
Massachusetts Institute of Technology, CSAIL

Smart technology to help diagnose sepsis in children in Canada

Podcast link: https://soundcloud.com/cmajpodcasts/180434-com

Smart technology and artificial intelligence could be used to improve detection of sepsis in children in Canada, write authors of a commentary in CMAJ (Canadian Medical Association Journal) http://www.cmaj.ca/lookup/doi/10.1503/cmaj.180434.

Canadian physicians do not often encounter children with sepsis, because pediatric sepsis in Canada is uncommon, unlike in developing countries. However, several recent deaths highlight the need for reliable, fast identification of early sepsis, as the condition can be lethal if not treated quickly.

"The optimal sepsis trigger tool needs to be rapid, objective, accurate and low cost; must easily integrate into the current workflow of a busy clinical setting; should require minimal training and require minimal additional effort; and offer a clear clinical benefit, particularly in community settings where the prevalence and clinical experience with sepsis is likely to be low," writes Mark Ansermino, University of British Columbia and BC Children's Hospital, Vancouver, BC, with coauthors.

The authors suggest that current smart technologies, like those used to program washing machines and automate medical image processing, could be used to automatically combine data on sepsis symptoms with other relevant clinical information.

"The recognition and anticipation of sepsis represents an important opportunity for artificial intelligence to revolutionize health care, by optimizing algorithms to a degree of accuracy that would avoid alert fatigue and optimize efficiencies in work flow," they write.

The authors also call for better collection of patient outcome data and its integration into medical records.

"We need smarter trigger tools for diagnosing sepsis in children in Canada" is published September 10, 2018.

Credit: 
Canadian Medical Association Journal

Monitoring at home yields better blood pressure control

CHICAGO, Sept. 8, 2018 -- Home blood pressure monitoring improved hypertension control and saved medical costs, according to results of a pilot initiative presented at the American Heart Association's Joint Hypertension 2018 Scientific Sessions.

American Heart Association/American College of Cardiology guidelines stress the importance of home blood pressure monitoring for optimal high blood pressure management.

However, according to Roy R. Champion, M.Sc., B.S.N., clinical quality R.N. at Scott and White Health Plan, Temple, Texas, home blood pressure monitoring isn't a common part of most treatment plans. Based on trends noted during medical record reviews, Champion said less than one in five providers were including home blood pressure monitoring in documentations for hypertension patients.

"Meanwhile, in the charts that did use home blood pressure monitoring, approximately 86 percent of those patients had their hypertension under control," Champion said.

Home monitoring combined with doctor visits to measure a patient's blood pressure helps to avoid numbers skewed by "white-coat hypertension," when blood pressure is high in a medical setting but not in everyday life, and "masked hypertension," when blood pressure is normal in a medical setting but high at home.

Champion studied the impact of an intervention that provided 2,550 adult patients with persistent uncontrolled high blood pressure with free home blood pressure monitors, online and print resources for tracking readings, and monitoring reminders. In each case, the patient's provider knew the patient had received a free at-home blood pressure monitor and resources for how to use it.

The study found:

By their third office visit, nearly 67 percent of patients had their blood pressure controlled.

Nearly 60 percent of patients had blood pressure control by their sixth visit.

Champion attributed the decline from the third to sixth visit to providers' adjusting blood pressure medications based on information from home blood pressure monitoring. Patients only had to see their doctors a few times to settle on the ideal medication amount, he said.

At the end of the intervention, systolic blood pressures had decreased an average 16.9 mmHg and diastolic blood pressures fell an average 6.5 mmHg.

In the six months after the intervention, nearly 80 percent of the participants achieved blood pressure control under the Healthcare Effectiveness Data and Information Set (HEDIS) 2018 standards.

Using the 2017 AHA/ACC guidelines, 72 percent achieved hypertension control.

"Even with the more stringent guidelines, we showed home blood pressure monitoring is vital to achieving control among hypertensive patients," Champion said.

Each kit, including the monitor, cost an average of $38.50; yet the cost savings from the intervention were substantial. The intervention reduced office visits by 1.2 visits per participant per year and significantly reduced emergency department and medication costs.

Home monitoring helps providers better understand patients' everyday blood pressure numbers in a cost-saving way that doesn't increase the burden on patients or providers, Champion said.

Credit: 
American Heart Association

Hot streak: Finding patterns in creative career breakthroughs

You've likely heard of hot hands or hot streaks -- periods of repeated successes -- in sports, financial markets and gambling. But do hot streaks exist in individual creative careers?

A team of researchers, including two from Penn State's College of Information Sciences and Technology, examined the works of nearly 30,000 scientists, artists and film directors to learn if high-impact works in those fields came in streaks.

According to Lu Liu, a doctoral student in the College of IST and member of the research team, they found a universal pattern.

"Around 90 percent of professionals in those industries have at least one hot hand, and some of them have two or even three," she said.

The team's paper, "Hot streaks in artistic, cultural, and scientific careers," recently appeared in Nature.

Liu says that there are two previous schools of thought regarding hot streaks in individual careers. According to the "Matthew effect," the more famous you become, the more likely you'll have success later, which supports the existence of a hot streak. The other school of thought -- the random impact rule -- holds that the timing of success within a career is random, with output driven primarily by levels of productivity.

"Our findings provide a different point of view regarding individual careers," said Liu. "We found a period when an individual performs better than his normal career, and that the timing of a hot streak is random."

She added, "Different from the perception [in innovation literature] that peak performance occurs in an individual's 30s or 40s, Our results suggest that individuals have equal chance to perform better even in their late careers."

The researchers also wanted to learn if individuals were more productive during their hot streak periods, which last an average of four to five years. Unexpectedly, they were not.

"Individuals show no detectable change in productivity during hot streaks, despite the fact that their outputs in this period are significantly better than the median, suggesting that there is an endogenous shift in individual creativity when the hot streak occurs," wrote the team in their paper.

Through their research, the team analyzed data they collected from a variety of sources. They looked at scientists' most-cited papers from Web of Science and Google Scholar, auction prices for artists, and Internet Movie Database (IMDB) ratings to gauge popularity of films and their directors. Then, they reconstructed a career path for each individual based on that data.

"The question starts from looking at the random impact rule," said Liu. "We start from that to analyze if it applies to different domains. To our surprise, we found something more interesting."

She explained that when the researchers looked at a scientist's highest-impact work through their most-cited papers, its timing was random, as well as the timing of the second-most cited paper. But in looking at the relative timing of these highest-impact works, the researchers found that they are correlated.

"That's how we find a hot streak period," said Liu. "We then analyzed [this finding] in other creative domains, like artists and movie directors, to see if there are similar patterns in these careers."

Liu said that there are many cases when the most famous works of an individual came in sequence. She cited Peter Jackson, director of "The Lord of the Rings" film series; Vincent Van Gogh, whose most famous paintings were completed late in his career; and Albert Einstein, whose four published papers in his "miracle year" of 1905 contributed significantly to the foundation of modern physics.

"[A hot streak] doesn't just matter to these individuals," said Liu. "It matters to society as well."

Liu said that the findings could help researchers understand the innovative process, and could eventually make it possible to identify and cultivate individuals during a hot streak.

As the research shows that hot streaks do in fact exist in creative careers, the researchers hope to apply the research methods to more domains, including musicians, inventors and entrepreneurs.

"We know that these domains have different natures," Liu said. "For example, scientists collaborate with each other and artists work alone. If we can find the triggers and drivers behind the universal pattern, that would be much more interesting."

Credit: 
Penn State

New discovery on T cell behavior has major implications for cancer immunotherapy

AURORA, Colo. (September 7, 2018) - Scientists at the University of Colorado Anschutz Medical Campus have discovered that disease-fighting T cells elicited by vaccines do not require glucose for their rapid reproduction, a finding with major implications for the development of immunotherapies for cancer patients.

In the study, published today in the journal Science Immunology, researchers from CU Anschutz, along with colleagues from the Mayo Clinic and the University of Pennsylvania, examined T cells that arose in the body's immune system after subunit vaccination - a vaccine that uses just part of a disease-causing virus.

They found that these critical white blood cells, which fight infection, did not rely on glucose to fuel their rapid division, which occurs every two to four hours. Instead, they used another cellular engine, the mitochondria, to support their expansion.

"The knowledge that this magnitude of cell division can be supported by mitochondrial function has a number of potential practical implications for the development of future vaccines," said the study's senior author Ross Kedl, PhD, professor of immunology and microbiology at the University of Colorado School of Medicine.

Kedl said T cells responding to infection usually depend on glucose for fuel. So do cancerous tumors. When T cells come up against tumors, they end up competing for glucose and the T cells often lose.

But when a T cell doesn't need glucose, he noted, it has a better chance of defeating tumor cells.

"T cells generated by subunit vaccination are ideally suited for use against cancer in conjunction with drugs that block aerobic glycolysis, a metabolic pathway to which the cancer is addicted," Kedl said. "Tumor growth can be inhibited while the T cells are free to attack the tumor instead of competing against it for access to glucose."

Lead author Jared Klarquist, PhD, explained that scientists have historically studied T cell responses to infection with the idea that if they could understand how the cells respond, they could create better vaccines. Kedl and colleagues had already discovered a non-infectious vaccine method that could induce the same level of T cell immunity as those using infection.

Since then, researchers in Kedl's lab have found that the rules governing T cell responses to an infectious agent are very different from the cell's response to a subunit vaccine. And the fact that T cells derived from subunit vaccines don't require glucose to reproduce is a major finding.

"Prior to these findings, it was generally thought that whereas the mitochondria are good at making energy, T cells need glucose to produce the raw materials like proteins, fats and nucleic acids (like DNA) required to turn one cell into two," said Klarquist. "Knowing how the immune response is fueled after vaccination provides potential opportunities for metabolic or nutritional interventions for boosting a vaccine-elicited immune response."

Kedl agreed. "Perhaps most intriguing, however, is the application of this knowledge to cancer immunotherapy," he said.

The lab is currently exploring how these strategies might positively influence the outcomes of immune-based cancer treatments that are already in the clinic.

Credit: 
University of Colorado Anschutz Medical Campus

New blood pressure app

image: Michigan State University has invented a proof-of-concept blood pressure app that can give accurate readings using an iPhone -- with no special equipment.

Image: 
MSU

EAST LANSING, Mich. - Michigan State University has invented a proof-of-concept blood pressure app that can give accurate readings using an iPhone - with no special equipment.

The discovery, featured in the current issue of Scientific Reports, was made by a team of scientists led by Ramakrishna Mukkamala, MSU electrical and computer engineering professor.

"By leveraging optical and force sensors already in smartphones for taking 'selfies' and employing 'peek and pop,' we've invented a practical tool to keep tabs on blood pressure," he said. "Such ubiquitous blood pressure monitoring may improve hypertension awareness and control rates, and thereby help reduce the incidence of cardiovascular disease and mortality."

In a publication in Science Translational Medicine earlier this year, Mukkamala's team had proposed the concept with the invention of a blood pressure app and hardware. With the combination of a smartphone and add-on optical and force sensors, the team produced a device that rivaled arm-cuff readings, the standard in most medical settings.

With advances in smartphones, the add-on optical and force sensors may no longer be needed. Peek and pop, available to users looking to open functions and apps with a simple push of their finger, is now standard on many iPhones and included in some Android models.
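The underlying measurement idea, as described in the team's earlier work, is oscillometric: as the fingertip presses steadily harder on the sensor, the blood-volume oscillations seen by the optical sensor grow and then shrink, peaking when the applied pressure is near mean arterial pressure. A minimal sketch of that readout, with invented toy data standing in for the app's actual signal processing:

```python
import numpy as np

def estimate_map(applied_pressure, ppg_amplitude):
    """Return the applied pressure (mmHg) at which the optical (PPG)
    oscillation amplitude peaks: roughly the mean arterial pressure
    under the oscillometric principle."""
    return applied_pressure[np.argmax(ppg_amplitude)]

# Toy finger-press sweep from 40 to 160 mmHg; the amplitude envelope
# is made to peak near 95 mmHg for illustration.
pressure = np.linspace(40, 160, 121)
amplitude = np.exp(-((pressure - 95) / 25) ** 2)
print(estimate_map(pressure, amplitude))  # 95.0
```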

If things keep moving along at the current pace, an app could be available in late 2019, Mukkamala added.

"Like our original device, the application still needs to be validated in a standard regulatory test," he said. "But because no additional hardware is needed, we believe that the app could reach society faster."

Internationally, this app could be a game-changer. While high blood pressure is treatable with lifestyle changes and medication, only around 20 percent of people with hypertension have their condition under control. This invention gives patients a convenient option and keeping a log of daily measurements would produce an accurate average, Mukkamala added.

Anand Chandrasekhar, Keerthana Natarajan, Mohammad Yavarimanesh - all electrical and computer engineering doctoral candidates - contributed to this research.

Credit: 
Michigan State University

Harnessing the power of the crowd could improve screening accuracy

Averaging the results from two independent participants improved screening accuracy, whether participants were looking at baggage scans or mammograms, according to findings published in Psychological Science, a journal of the Association for Psychological Science.

The research findings, reported by researchers at Brunel University, suggest that having multiple screeners could improve the detection of rare items in real-world contexts, such as airport security, radiology and military reconnaissance.

"There is a known problem with detecting rare targets," said study author Jennifer E Corbett, an honorary lecturer for Brunel's College of Health and Life Sciences. "When you go to the airport, they always seem to find the bottle of water in your bag - it's a very common item, so people have a mental template. They'll just find it. But with rare targets like weapons and guns, people see these far less frequently, so are more likely to miss them."

Corbett, who coauthored the paper with Brunel researcher Jaap Munneke, says the problem lies with the human visual system, which is only capable of processing a few objects in detail at any given moment. The brain averages out redundant and specific information, filling in the spaces based on prior knowledge. As a result, infrequent objects - those that the observers aren't expecting to see - are often missed.

However, two people independently looking at the same scan perceive it differently, significantly increasing the possibility of infrequent items being spotted.

"We found that when we pair the estimates of two people who don't know they're working together - they have no interaction whatsoever - there is a huge improvement in detection, just by capitalizing on the diversity of people's judgments," said Corbett.

To test their ideas, Corbett and Munneke conducted two experiments - one that challenged participants with an airport screening task and the other with mammogram screening.

In the airport screening experiment, 16 participants, who had no experience with security screening, saw an image containing nine objects for half a second. They then indicated whether they'd like to call the image back, based on whether they detected a target object.

"The experiment tested weapons detection as well as simple detection tasks," said Corbett. "We found that not only did pairing observers estimates improve detection in both types of tasks, but that pairing individuals' estimates from the simple task in a way that maximized the decorrelated patterns actually improved the performance in the separate weapons task."

The researchers discovered that when they paired the detections of two people who worked individually and independently, they not only saw an increase in the detection of rare objects, but also a reduction in the likelihood of harmless items being wrongly flagged as suspicious.

For the second experiment, 18 participants learned how to identify a tumor on a mammogram. They then saw 400 unique scans, 5% of which had a tumor present, and then another 400, of which 50% had a tumor present.

In both cases, a significant increase in detection rate was observed when two individuals' results were averaged.

"The task is not so different between airport scanner and a radiologist - the idea is you're looking for something you have knowledge of but see infrequently," said Corbett. "It doesn't matter though whether it's a tumor or a weapon or something else, averaging two different perceptions of the same scene increases detection."

The researchers say their detection method is a marked improvement over those currently used in airport and radiological screening, as it significantly reduces the time someone needs to look at a scan.

"The method we propose is probably the best candidate for maximizing the resources of a limited pool of highly trained experts needing to detect rare targets in a lot of images," said Corbett. "Obviously the limit here is that it requires a second set of eyes, but we're now looking for ways to use a deep-learning algorithm to cover the aspects of the images which are causing these decorrelations. We can then pair a single person with the algorithm."

Credit: 
Association for Psychological Science

Global warming: Worrying lessons from the past

image: Sébastien Castelltort facing the Eocene Cis conglomerate cliff, near Roda de Isabena, Spain.

Image: 
UNIGE

56 million years ago, the Earth experienced an exceptional episode of global warming. In a very short time on a geological scale, within 10,000 to 20,000 years, the average temperature increased by 5 to 8 degrees Celsius, only returning to its original level a few hundred thousand years later. Based on the analysis of sediments from the southern slope of the Pyrenees, researchers from the University of Geneva (UNIGE) measured the impact of this warming on river floods and the surrounding landscapes: the amplitude of floods increased by a factor of eight - and sometimes even by a factor of 14 - and vegetated landscapes may have been replaced by arid pebbly plains. Their disturbing conclusions, published in Scientific Reports, show that the consequences of such global warming may have been much greater than predicted by current climate models.

"The method we relied on to analyze this global warming is directly inspired by cell signaling in systems biology, where researchers analyze the response of cells to external stimuli and the ensuing signal transmission," explains Sébastien Castelltort, professor in the Department of Earth Sciences at the UNIGE Faculty of Sciences and leader of the study, conducted in collaboration with researchers from the universities of Lausanne, Utrecht, Western Washington and Austin. "We are interested in how a system, in this case the hydrologic cycle through the behavior of rivers, reacts to an external signal, here the global warming." This project focused on an extreme climatic case well known to scientists: a warming of 5 to 8 degrees that occurred 56 million years ago, between the Paleocene and the Eocene epochs, also known by the acronym PETM (Palaeocene-Eocene Thermal Maximum). Named the Earth Surface Signaling System (ESSS), the project is supported by the Swiss National Science Foundation (SNSF).

Palm trees at polar latitudes

As early as the 1970s, scientists observed a strong anomaly in the ratio between stable carbon isotopes (δ13C), due to the relative increase in the proportion of the light isotope (12C) compared to the heavy isotope (13C), reflecting a disruption of the carbon cycle, both in the oceans and on the continents, associated with a global warming and its spectacular consequences. Palm trees thrived at polar latitudes and some marine plankton, such as the dinoflagellate Apectodinium, normally restricted to tropical waters, suddenly spread across the globe. Geologists use this type of observation as a true "paleothermometer," which in this case shows a rise in surface water temperature that reached almost 36 degrees in places, a lethal temperature for many organisms. Several phenomena are cited as possible causes for this global warming, from the intense volcanic activity in several areas of the globe at this period, to the destabilization of methane hydrates, the methane "ice cubes" that only remain stable under certain pressure and temperature conditions, and which by degassing would have released their greenhouse gas.

But although the event is known and its causes have been extensively explored, what about the consequences? "The question is important because there is an obvious analogy with the current global warming. There are lessons to be learned from this event, even more so as the rise in temperatures we are currently witnessing seems to be much faster," Sébastien Castelltort emphasizes.

Pebbles that reveal the history of rivers

The Spanish Pyrenees offer sediments that allow researchers to observe ancient river channels and determine their size. As part of the thesis project of Chen Chen, a doctoral student in the Department of Earth Sciences at the UNIGE Faculty of Sciences, thousands of ancient river pebbles were measured in the field. Step by step, thanks to the direct relationship between the size of the pebbles and the slope of the rivers, the researchers were able to calculate the rivers' flow velocity and discharge, and thus reconstruct the whole history of these rivers and of the spectacular changes that affected them.
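The release doesn't give the formulas, but standard paleohydraulic practice inverts the Shields criterion for incipient gravel motion to recover channel slope from grain size, then applies a flow-resistance law such as Manning's equation for discharge. A simplified sketch under those textbook assumptions, not the study's full calibrated workflow:

```python
import math

def slope_from_gravel(d50, depth, shields=0.047,
                      rho_s=2650.0, rho=1000.0):
    """Channel slope needed to move gravel of median diameter d50 (m)
    at a given flow depth (m), from the Shields criterion:
    rho*g*depth*slope = shields*(rho_s - rho)*g*d50."""
    return shields * (rho_s - rho) * d50 / (rho * depth)

def discharge(width, depth, slope, n=0.03):
    """Manning's equation (wide-channel approximation, hydraulic
    radius ~ depth); n is an assumed roughness coefficient."""
    velocity = depth ** (2.0 / 3.0) * math.sqrt(slope) / n
    return width * depth * velocity  # m^3/s

s = slope_from_gravel(d50=0.05, depth=1.0)  # 5 cm pebbles, 1 m deep
print(round(s, 4), round(discharge(30.0, 1.0, s), 1))  # 0.0039 62.3
```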

56 million years ago, the Pyrenees were being formed and their foothills were traversed by small isolated channels in a flood plain where they deposited very fertile alluvium, promoting the development of vegetation whose roots would anchor the soil. Leaving the Pyrenean piedmont, these small rivers then headed west into the Atlantic which was then only about thirty kilometres away.

The landscape changed completely

"With global warming, the landscape changed completely. The channel-forming floods, which occur on average every 2 to 3 years and whose flow we have been able to measure, became up to 14 times greater than before, when the climate was cooler," explains Sébastien Castelltort. During the PETM, rivers constantly changed course; they no longer adapted to increased discharge by incising their beds but instead widened, sometimes dramatically, from 15 to 160 meters wide in the most extreme case. Instead of being trapped in the floodplains, the alluvium was transferred directly towards the ocean, and the vegetation seemed to disappear. The landscape turned into extensive arid gravel plains, crossed by ephemeral and torrential rivers.

Far greater risks than expected

Scientists still do not know exactly how precipitation patterns changed, but they know that this warming led to more intense floods and greater seasonality, with significantly warmer summers. Higher evaporation resulted in an unexpected increase in flood magnitude. One degree of temperature rise implies a 7 percent increase in the atmosphere's capacity to retain moisture, and this ratio is generally used to assess the increase in precipitation. "But our study shows that there are thresholds, non-linear evolutions that go beyond this ratio. With a ratio of 14 for flood magnitude, we face effects that we do not understand, which can perhaps be explained by local factors, but also by global factors that are not yet incorporated into current climate models. Our study proves that the risks associated with global warming may be far greater than we generally think," concludes Sébastien Castelltort.
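That 7 percent per degree is the Clausius-Clapeyron scaling of the atmosphere's moisture capacity; compounding it over the PETM warming shows why the observed flood amplification outruns the simple moisture argument:

$$\frac{q(T+\Delta T)}{q(T)} \approx 1.07^{\Delta T}, \qquad 1.07^{5} \approx 1.4, \qquad 1.07^{8} \approx 1.7$$

So 5 to 8 degrees of warming raises moisture capacity by only about 40 to 70 percent, far short of the factor of 8 to 14 measured for flood magnitude, which is the non-linearity the authors point to.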

Credit: 
Université de Genève

Birds retreating from climate change, deforestation in Honduras cloud forests

image: This is fresh deforestation in Honduras.

Image: 
Monte Neate-Clegg

The cloud forests of Honduras can seem like an otherworldly place, where the trees are thick with life that takes in water straight from the air around it, and the soundscape is littered with the calls of animals singing back and forth.

Otherworldly, yes, but scientists have found that the cloud forests are not immune to very down-to-earth problems of climate change and deforestation. A 10-year study of bird populations in Cusuco National Park, Honduras, shows that the peak of bird diversity in this mountainous park is moving higher in elevation. Additional land protection, unfortunately, may not be enough to reverse the trend, driven in part by globally rising temperatures. The study is published in Biotropica.

"A lot of these species are specialized to these mountain ranges," says study lead author Monte Neate-Clegg, a doctoral student at the University of Utah, "and they don't have a lot of options as to where to go should things go wrong."

Heads in the clouds

A cloud forest is an ecosystem that derives much of its moisture from water vapor in the surrounding air. Due to elevation and climate conditions, these forests are fed directly by clouds. Nothing ever dries, Neate-Clegg says.

"Cloud forests are pretty special," he says. "The tropics hold most of the world's biodiversity to begin with, and then the mountain slopes hold the greatest biodiversity within the tropics." For example, he adds, Cusuco National Park is the home to at least six amphibian species that are known nowhere else on Earth. The park also supports species large and small-from jaguars to hummingbirds.

Such specialized environments are at high risk of drastic alteration due to climate change, however. Scientists, including University of Utah professor and study co-author Cagan Sekercioglu, predicted that rising temperatures and changes in precipitation would cause species, particularly birds, to shift to higher elevations, shrinking their habitat and boosting the risk of extinction.

That, the authors found, is exactly what's happening.

Trouble in paradise

Neate-Clegg and his colleagues, including researchers from the UK and Belgium, examined a ten-year dataset of bird species counts in Cusuco National Park. The counts were conducted starting in 2006 by Operation Wallacea, a conservation organization. Few long-term studies like this have been undertaken in the tropics, Neate-Clegg says, with a significant data gap in Central America. "I wanted to plug this geographic gap," he says.

They found most species moving upslope, at an average of 23 feet (7 m) per year. Beyond species-specific changes in elevation, though, the researchers focused on bird diversity along the mountain slopes.

"By looking across all species we could show that the diversity was increasing at higher elevations and decreasing at lower elevations," Neate-Clegg says.

Losing ground

The authors turned their attention toward discriminating the likely causative factors for such a shift. One factor is the continuing development and deforestation within the park.

"Every year we go back and resurvey, and transects that were forested the previous year are suddenly cut down," Neate-Clegg says. "They are encroaching year on year." The terrain's status as a national park, he says, doesn't seem to be much of a deterrent for those seeking to expand agricultural land.

But habitat loss is not the only factor. Comparing forested study plots, the authors concluded that changes to the local climate were responsible for the upward shift as well.

Increased land protection would help give the birds more stable habitats, Neate-Clegg says, especially protections that encompass as much elevation as possible. But, as the paper grimly states, "Increased protection is unlikely to mitigate the effects of climate change."

Credit: 
University of Utah

Do you know why and how you forget passwords?

Do you frequently forget passwords to a baffling array of accounts and websites? Much depends on a password's importance and how often you use it, according to a Rutgers University-New Brunswick-led study that could spur improved password technology and use.

"Websites focus on telling users if their passwords are weak or strong, but they do nothing to help people remember passwords," said Janne Lindqvist, study co-author and assistant professor in the Department of Electrical and Computer Engineering in the School of Engineering.

"Our model could be used to predict the memorability of passwords, measure whether people remember them and prompt password system designers to provide incentives for people to log in regularly," Lindqvist said. "Logging in more often helps people remember passwords."

It's well-known that text-based passwords are hard to remember and people prefer simple, insecure passwords. The study found evidence that human memory naturally adapts based on an estimate of how often a password will be needed. Important, frequently used passwords are less likely to be forgotten, and system designers need to consider the environment in which passwords are used and how memory works over time.
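That adaptive account fits the family of rational memory models in which each use of a password leaves a trace that decays over time, and recall probability grows with the summed traces. A sketch with illustrative parameters, in the spirit of such activation models rather than the authors' exact fitted model:

```python
import math

def recall_probability(days_since_uses, decay=0.5, tau=0.0, s=0.3):
    """Activation-style memory model: each past use contributes
    t^(-decay) for t days elapsed; recall probability is a logistic
    function of log total activation. All parameters illustrative."""
    activation = math.log(sum(t ** -decay for t in days_since_uses))
    return 1.0 / (1.0 + math.exp(-(activation - tau) / s))

# A password used each of the last five days stays retrievable;
# one untouched for six months fades.
print(round(recall_probability([1, 2, 3, 4, 5]), 2))  # ~0.98
print(round(recall_probability([180]), 2))            # ~0.00
```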

"Many people struggle with passwords because you need a lot of them nowadays," Lindqvist said. "People get frustrated. Our major findings include that password forgetting aligns well with one of the psychological theories of memory and predicting forgetting of passwords.

The peer-reviewed study by researchers at Rutgers-New Brunswick and Aalto University in Finland was formally published last month at the 27th USENIX Security Symposium in Baltimore, Maryland. The symposium - a tier-1 international conference - covered novel and scientifically significant practical advances in computer security.

Credit: 
Rutgers University

NASA finds a weaker Hurricane Olivia

image: On Sept. 5 at 2:10 a.m. EDT (0610 UTC) the MODIS instrument aboard NASA's Terra satellite captured an image of Hurricane Olivia. Strongest thunderstorms were smaller in area (red) than the previous day.

Image: 
NASA/NRL

Infrared data from NASA's Terra satellite revealed that the area of coldest cloud topped thunderstorms has dropped from the previous day, indicating weaker uplift and less-strong storms.

On Sept. 5 at 2:10 a.m. EDT (0610 UTC), NASA's Terra satellite passed over Olivia and analyzed the storm in infrared light to show temperatures. The MODIS, or Moderate Resolution Imaging Spectroradiometer, instrument aboard Terra revealed cloud top temperatures as cold as or colder than minus 70 degrees Fahrenheit (minus 56.6 degrees Celsius) in fragmented areas southwest and east of the center. NASA research indicates that storms with cloud tops that cold have the potential to generate very heavy rainfall. At the time of the satellite image, Olivia had weakened but was still a Category 3 hurricane.

Olivia encountered moderate easterly wind shear, which continued to weaken the storm.

By 11 a.m. EDT (1500 UTC), the National Hurricane Center or NHC said Olivia weakened to a Category 2 hurricane on the Saffir-Simpson Hurricane Wind Scale. The center of Hurricane Olivia was located near latitude 17.1 degrees north and longitude 122.3 degrees west. Olivia was far from land areas. It was 900 miles (1,445 km) west-southwest of the southern tip of Baja California, Mexico.

Olivia is moving toward the west near 13 mph (20 kph), and this motion is expected to continue today. A turn toward the west-northwest is expected tonight, followed by a gradual turn back toward the west over the weekend.

Maximum sustained winds have decreased to near 110 mph (175 kph) with higher gusts. The NHC said additional slow weakening is expected during the next few days.

For updated forecasts, visit: http://www.nhc.noaa.gov

Credit: 
NASA/Goddard Space Flight Center