
Enhancing blood sugar control boosts brain health for people with type 2 diabetes

Controlling blood sugar levels improved the ability to think clearly, learn and remember among people with type 2 diabetes who were overweight, a new study shows. But losing weight, especially for people who were obese, and increasing physical activity produced mixed results.

"It's important to properly control your blood sugar to avoid the bad brain effects of your diabetes," said Owen Carmichael, Ph.D., Professor and Director, Biomedical Imaging at Pennington Biomedical Research Center. "Don't think you can simply let yourself get all the way to the obese range, lose some of the weight, and everything in the brain is fine. The brain might have already turned a corner that it can't turn back from."

The new paper examined close to 1,100 participants in the Look AHEAD (Action for Health In Diabetes) study. One group of participants was invited to three sessions each year that focused on diet, physical activity, and social support. The other group changed their diet and physical activity through a program designed to help them lose more than 7 percent of their body weight in a year and maintain that weight loss. Cognitive tests - tests of thinking, learning, and remembering - were given to participants between 8 and 13 years after they started the study.

The research team theorized that people with greater improvements in blood sugar levels, physical activity and weight loss would have better cognitive test scores. This hypothesis proved partially true. Reducing blood sugar levels did improve test scores. But losing more weight and exercising more did not always raise cognitive test scores.

"Every little improvement in blood sugar control was associated with a little better cognition," Dr. Carmichael said. "Lowering your blood sugar from the diabetes range to prediabetes helped as much as dropping from prediabetes levels to the healthy range."

More weight loss was either better or worse depending on the mental skill involved, Dr. Carmichael said. People who lost more weight improved their executive function skills: short-term memory, planning, impulse control, attention, and the ability to switch between tasks. But their verbal learning and overall memory declined.

"The results were worse for people who had obesity at the beginning of the study. That's a 'too little, too late' type of message," he said. "People with diabetes who let their obesity go too far, for too long may be past the point of no return, cognition-wise."

Increasing physical activity also generated more benefits for people who were overweight compared to those with obesity, the study shows.

Finding a way to offset the health effects of type 2 diabetes is vital. More than 25 percent of U.S. adults 65 or older have type 2 diabetes. The disease doubles the risk of cognitive impairment and dementia, including Alzheimer's disease, and greatly increases health care needs and costs.

Credit: 
Pennington Biomedical Research Center

Cancer immunotherapy 'uniquely suppressed' by liver tumors

Though cancer immunotherapy has become a promising standard-of-care treatment--and in some cases, perhaps a cure--for a wide variety of different cancers, it doesn't work for everyone, and researchers have increasingly turned their attention to understanding why.

For example, doctors have noticed that patients who initially respond well to the immunotherapy drugs known as checkpoint inhibitors, such as those that target a protein called PD-1, can develop resistance to these therapies if their cancer has metastasized from its initial location to form additional tumors in the liver--even if their primary cancer is quite distant from the liver.

In a new study published Oct. 2, 2020 in Science Immunology, a UCSF research team led by Hematology and Oncology Clinical Fellow James Lee, MD, MHS, used a unique mouse model to figure out how this happens.

Then, the researchers, including senior author Jeffrey Bluestone, PhD, adjunct professor of microbiology and immunology and the A.W. and Mary Margaret Clausen Distinguished Professor of Metabolism and Endocrinology, showed that adding a second type of checkpoint inhibitor in a combination therapy can overcome this resistance, and might significantly increase the effectiveness of immunotherapy in patients with liver metastases.

"The liver actually triggers differences in immune cells at distant sites," Lee said. And what's more, he added, "the liver can choose its enemy--what it wants to protect or not protect."

Cancers are sometimes able to avoid detection within the body by cloaking themselves from the immune system. They can produce large quantities of proteins like PD-L1, which bind to PD-1 "off switches" on T cells, switching off the immune response of the T cells that attack cancer. Some checkpoint inhibitors counteract this cloaking process by preventing PD-L1 from binding to PD-1, allowing a normal defensive immune response against cancer cells.

The liver, which is tasked with filtering large quantities of blood directly from the digestive system and the rest of the body, plays an unexpectedly large role in regulating the immune system--specifically, by signaling which of the scavenged proteins it encounters as it does its job are from hostile invaders and which should be ignored.

In work supported by the Parker Institute for Cancer Immunotherapy, the scientists simulated metastasis by implanting mice with cancer cells in two separate locations, first under the skin and then in either the liver or the lung. They found evidence that when cancer takes hold in the liver it is "uniquely suppressive," said Lee--able to harness the liver's powers to retrain the immune system and exert its influence on the immune response to related cancers that are distant in the body.

Compared to mice with secondary cancers implanted in the lung, survival rates were significantly worse in mice with secondary liver cancers after anti-PD-1 treatment: the immune system did not learn to recognize the liver tumor or, notably, the related tumor implanted under the skin.

That level of immune-system discernment clued the team in on a possible mechanism, because "only a few types of cells can be that specific in regulating the immune system," Lee said. Among them are regulatory T cells (Tregs; pronounced "tee-regs"), which tamp down the immune responses of other T cells. Bluestone has spent decades studying these cells, and that's where the researchers looked for an explanation. Could a liver tumor change the response of Tregs, and thus other T cells, to a separate but related tumor?

Using single-cell analyses, the team showed that, in mice with liver tumors, T cells associated with the related "primary" tumor were not as highly activated. Finally, the researchers showed that liver tumors change which genes are expressed in Tregs and, through those cells, a host of other immune-system cells as well. "It turned out that there wasn't a difference in the quantity of Tregs between the skin tumors of mice with liver cancers and the mice without liver cancers. It was a difference in quality," Lee said.

Since liver tumors caused Tregs to suppress the T cell response against tumors, the researchers tested two drugs to see if they could override the effect of the Tregs. The first was a drug that blocks the T cell checkpoint molecule CTLA-4, which unleashes these cells to attack cancer; in the 1990s, Bluestone did pioneering research on CTLA-4 that helped lay the foundations for cancer immunotherapy. The second drug, another anti-CTLA-4 compound, targets Tregs directly and depletes their numbers. Both restored the effectiveness of anti-PD-1 therapy, though the anti-CTLA-4 drug that depletes Tregs was more effective.

The researchers hope to apply this combination therapy in the future to patients whom they know ahead of time are less likely to respond to treatment.

"We've never had this kind of precision in immunotherapy in the past," Lee said. "What if, right from the start, you could use a drug that depletes Tregs as a complement to immunotherapy in patients with liver metastasis?"

Credit: 
University of California - San Francisco

Lab grown tumour models could lead to improved ovarian cancer treatments

image: Ovarian cancer cells forming a cancer spheroid as they grow within the peptide-protein co-assembling material.

Image: 
Alvaro Mata

Scientists have created a three-dimensional (3D) tumour model in the laboratory for ovarian cancer that could lead to improved understanding and treatment of the disease.

The international team, led by the University of Nottingham and Queen Mary University of London, has created a multicellular 3D microenvironment that recreates the way tumour cells grow in ovarian cancer and respond to chemotherapy drugs. The research has been published today in Science Advances.

There is a need for improved 3D cancer models to study tumour growth and progression in patients and test responses to new treatments. At present, 90% of successful cancer treatments tested pre-clinically fail in the early phases of clinical trials and less than 5% of oncology drugs are successful in clinical trials. Pre-clinical tests mostly rely on a combination of two-dimensional (2D) lab grown cell cultures and animal models to predict responses to treatment.

However, conventional 2D cell cultures fail to mimic key features of tumour tissues and interspecies differences can result in many successful treatments in animal hosts being ineffective in humans. Consequently, novel experimental 3D cancer models are needed to better recreate the human tumour microenvironment and incorporate patient-specific differences.

The new hydrogel biomaterial is made by the co-assembly of peptides with proteins found in ovarian cancer. The mechanism of formation enables the peptides to assemble these proteins into molecular environments, emulating how they are presented in the patient tumour.

Professor Alvaro Mata, from the University of Nottingham's School of Pharmacy, who led the study, said: "Bioengineered self-assembling matrices expand our experimental repertoire to study tumour growth and progression in a biologically relevant, yet controlled, manner. In this study we used peptide amphiphiles to co-assemble with extracellular matrix proteins into tuneable 3D models of the tumour microenvironment. The peptide/protein composite matrix was designed to attempt to resemble physical, biomolecular, and cellular features of tumours present in patients. We tested the response of the lab grown tumours using chemotherapeutics to validate the functionality of the multicellular constructs and saw the tumour shrink. This suggests that the new peptide/protein/cellular biomaterial could lead to more effective testing of new drugs and treatments for ovarian cancer."

Self-assembly is the process by which multiple components organise into larger, well-defined structures. Biological systems rely on this process to controllably assemble molecules and cells into complex and functional tissues with remarkable properties, such as the capacity to grow, replicate, and perform robust functions.

Associate Professor Daniela Loessner, from Monash University in Australia and co-author of the study says: "Currently, the gold standard for 3D cancer models is the commercially available Matrigel™, a solubilized basement membrane extracted from mouse sarcoma. A major reason for Matrigel's popularity is its capacity to enable cell-matrix interactions, which promote the growth of cancer and stromal cells into aggregates known as spheroids. However, it lacks control in mimicking the tumour microenvironment due to its batch variability, undefined composition, and animal origin. These features are important limitations to effectively screen and develop new treatments for cancer. Our research has demonstrated the capacity to engineer a 3D matrix that can serve as a complex, yet controllable, alternative to Matrigel."

Credit: 
University of Nottingham

Scale-adaptive auto-context-guided fetal US segmentation with structured random forests

Announcing a new article publication for BIO Integration journal. In this article the authors Xin Yang, Haoming Li, Li Liu, and Dong Ni from Shenzhen University, Shenzhen, China and Chinese University of Hong Kong, Hong Kong, China consider scale-adaptive auto-context-guided fetal US segmentation with structured random forests.

Accurate measurement of fetal biometrics in ultrasound at different trimesters is essential to assist clinicians in pregnancy diagnosis. However, the accuracy of manual segmentation for measurement is highly user dependent.

The authors of this article design a general framework for automatically segmenting fetal anatomical structures in two-dimensional (2D) ultrasound (US) images, thus making objective biometric measurements available. Structured random forests (SRFs) are introduced as the core discriminative predictor to recognize the region of fetal anatomical structures with a primary classification map. The patch-wise joint labeling performed by SRFs has inherent advantages in identifying ambiguous or fuzzy boundaries and reconstructing incomplete anatomical boundaries in US images.

To obtain a more accurate and smooth classification map, a scale-adaptive auto-context model is then injected to enhance the contour details of the classification map from various visual levels. Final segmentation can be obtained from the converged classification map with thresholding. The framework is validated on two important biometric measurements: fetal head circumference (HC) and abdominal circumference (AC).
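In outline, the method iterates a classifier whose inputs include its own previous output smoothed at several scales. Below is a minimal Python sketch of that auto-context loop, using a per-pixel random forest from scikit-learn as a stand-in for structured random forests (which jointly label whole patches); the function names and parameters are illustrative, not the authors' implementation.

import numpy as np
from scipy.ndimage import gaussian_filter
from sklearn.ensemble import RandomForestClassifier

def features(image, prob_map, scales=(1.0, 2.0, 4.0)):
    # Stack raw intensity with the previous classification map smoothed
    # at several scales - the "scale-adaptive" context channels.
    chans = [image] + [gaussian_filter(prob_map, s) for s in scales]
    return np.stack(chans, axis=-1).reshape(-1, len(chans))

def train_auto_context(images, masks, n_rounds=3):
    probs = [np.full(im.shape, 0.5) for im in images]  # uninformative start
    forests = []
    for _ in range(n_rounds):
        X = np.vstack([features(im, p) for im, p in zip(images, probs)])
        y = np.concatenate([m.ravel() for m in masks])
        rf = RandomForestClassifier(n_estimators=50).fit(X, y)
        probs = [rf.predict_proba(features(im, p))[:, 1].reshape(im.shape)
                 for im, p in zip(images, probs)]
        forests.append(rf)
    return forests

def segment(image, forests, threshold=0.5):
    p = np.full(image.shape, 0.5)
    for rf in forests:  # refine the classification map round by round
        p = rf.predict_proba(features(image, p))[:, 1].reshape(image.shape)
    return p > threshold  # fitting a contour to this mask yields HC or AC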

The final results illustrate that the authors' proposed method outperforms state-of-the-art methods in terms of segmentation accuracy.

Credit: 
Compuscript Ltd

Sex-specific adverse drug effects identified by Columbia University algorithm

There is a paucity of real-world clinical data that evaluates adverse drug effects in women, among other underserved populations, due to a long history of trials done on relatively homogenous patient populations (healthy white males).

Without heterogeneous data availability, biased results leave women in the dangerous position of not having accurate information on adverse drug effects, currently the fourth-leading cause of death in the U.S.

An example of this issue is Ambien, an insomnia drug that previously had the same dosage prescribed for both men and women. When evidence emerged that women experienced a significantly higher rate of adverse reactions the following morning, the FDA reduced the recommended dosage by half in 2013.

"Rather than take the stance that we wait for evidence to become so overwhelming that we have to do something about, we wanted to be more proactive," said Nicholas Tatonetti, PhD, a Columbia University researcher who co-led a study in Patterns that uses machine learning to identify these adverse effects in women. "We want to use databases like the Adverse Event Reporting System (FAERS) from the FDA or the electronic health records to get a jump on identifying sex-specific adverse events before it's too late."

Tatonetti, an associate professor in the Department of Biomedical Informatics, collaborated with Payal Chandak, an undergraduate student in Columbia's Department of Computer Science, to develop AwareDX (Analysing Women At Risk for Experiencing Drug toxicity), an algorithm that leverages advances in machine learning to predict sex risks.

"We developed a machine-learning framework to data-mine for sex-specific adverse events," Tatonetti said. "We went through hundreds of thousands of hypotheses and evaluated them. Payal designed a system that addresses confounding biases because it's very difficult to study these effects, because some drugs or effects are more common in either women or men. We invented a technique that mitigates the confounding biases, develops a statistical basis to identify sex difference in adverse effects, and rank them by the strength of that evidence."

An example of findings from this algorithm, believed to be the first validated approach for predicting sex risks, is the confirmation that a single gene (ABCB1) can pose different risks for men (from simvastatin) and women (from risperidone). Overall, this resource includes 20,817 adverse drug effects posing sex-specific risks, and it presents an opportunity to minimize adverse events by tailoring drug prescription and dosage to sex.

"We were motivated by the lack of information across different communities on the efficacy and safety of drug effects," Tatonetti said. "We addressed that issue in this study specifically for women, who hold greater risk for these adverse effects than men due to differences in pharmacokinetics and pharmacodynamics. This specific knowledge can impact guidelines about drug prescription and dosage and creating safer, healthier conditions for women."

Credit: 
Columbia University Irving Medical Center

Solving global challenges using insect research

image: Colobathristid bug (order Heteroptera), Pilchicocha (Ecuador).

Image: 
© IRD - Olivier Dangles - François Nowicki / Une Autre Terre

To achieve food security, to promote peace, to ensure access to quality education and clean water and sanitation, to improve health, to take action to combat climate change, to restore ecosystems and to reduce inequalities: these are some of the 17 Sustainable Development Goals (SDGs) identified by the UN to address the global challenges faced by societies.

Research can help achieve these interrelated goals, not only by producing reliable knowledge and data, offering innovative solutions and assessing progress, but also by providing some perspective on the SDGs.

"We have brought together researchers from many different countries - Germany, Australia, Burkina Faso, Brazil, China, Columbia, Ecuador, the United States, India, Panama, the Netherlands, the Philippines, Thailand and Vietnam - to present original insect research that falls within the area of Sustainability science", emphasised Olivier Dangles (IRD) and Verónica Crespo-Pérez (Pontifical Catholic University of Ecuador, PUCE), coordinators of the special issue published in Current Opinion In Insect Science. "These examples show that research on insects has great potential in tackling today's challenges".

* An overview of games for entomological literacy: the article considers the use of video games in improving the dissemination of knowledge about major insect-related challenges (pollinator decline, managing vectors of disease).

* Insect vector endosymbionts as solutions against diseases: The authors of this article present new strategies to combat viral diseases transmitted by mosquitoes, in particular a strategy based on the symbiotic bacterium Wolbachia, and how mosquitoes themselves can help us to control the diseases they transmit.

* Orienting insecticide research in the tropics: Using a bibliometric analysis of insecticide research, the researchers identify the topics (bioinsecticides and integrated pest management) that should be promoted to ensure sustainable crop protection.

* Insect-inspired architecture to build sustainable cities: Entomologists describe the functional principles of insect structures, which may inspire the construction of more sustainable cities (particularly in terms of multifunctionality, energy saving and sustainability).

* Insects for peace: In countries recovering from conflict, agricultural development should focus on restoring food production by smallholder farmers and improving their socioeconomic position. The authors of the article describe the example of the reintegration of ex-combatants of the Revolutionary Armed Forces of Colombia as insect producers for livestock farming.

* Moving beyond the distinction between the bright and dark sides of termites: Termites are amongst the main decomposers of organic matter in tropical ecosystems and have a positive impact on many services for humankind. These insects also act as pests, threatening agriculture and buildings. This article assesses the impact of termites on several sustainable development goals and proposes a reconciliation between the termite's dark and bright sides.

* The importance of insects on land and in water: The authors of this article advocate for increased knowledge of the role played by insects in tropical terrestrial and aquatic ecosystems, whose diversity and distribution are affected by global changes.

* Unsung heroes: fixing multifaceted sustainability challenges through insect biological control. In this article, researchers explain how biological control contributes to food security, poverty alleviation, human well-being and environmental preservation.

Credit: 
Institut de recherche pour le développement

Climate change responsible for record sea temperature levels, says study

Global warming is driving an unprecedented rise in sea temperatures including in the Mediterranean, according to a major new report published by the peer-reviewed Journal of Operational Oceanography.

Data from the European Union's (EU) Copernicus Marine Environment Monitoring Service (CMEMS) will increase concerns about the threat to the world's seas and oceans from climate change.

The Ocean State Report reveals an overall trend globally of surface warming based on evidence from 1993 to 2018, with the largest rise in the Arctic Ocean.

European seas experienced record high temperatures in 2018, a phenomenon which the researchers attribute to extreme weather conditions - a marine heat wave lasting several months.

In the same year, a large mass of warm water occurred in the northeast Pacific Ocean, according to the report. This was similar to a marine heatwave - dubbed 'the Blob' - which was first detected in 2013 and had devastating effects on marine life.

Now the study authors are calling for improved monitoring to provide better data and knowledge. They argue this will help countries progress towards sustainable use of seas and oceans which are an essential source of food, energy and other resources.

Findings from the report confirm record rises in sea temperatures

"Changes to the ocean have impacted on these (ocean) ecosystem services and stretched them to unsustainable limits," says Karina von Schuckmann and Pierre-Yves Le Traon, the report's editors.

"More than ever a long term, comprehensive and systematic monitoring, assessment and reporting of the ocean is required. This is to ensure a sustainable science-based management of the ocean for societal benefit."

The Ocean State Report identifies other major strains on the world's seas and oceans from climate change including acidification caused by carbon dioxide uptake from the atmosphere, sea level rise, loss of oxygen and sea ice retreat.

Long-term evidence of global warming outlined in the report includes a decrease of up to two days in the annual period of Baltic Sea ice cover over the past 30 years, and an acceleration in global mean sea level rise.

The report highlights that the message from recent EU and global assessments of the state of seas and oceans is 'we are not doing well'. The authors add: "Human society has always been dependent on the seas. Failure to reach good environmental status for our seas and oceans is not an option."

Credit: 
Taylor & Francis Group

NASA finds heavy rainfall ringing major Hurricane Marie's eye

image: On Oct. 2 at 4:30 a.m. EDT (0830 UTC), NASA's IMERG estimated Hurricane Marie was generating as much as 50 mm (2 inches of rain/dark red) around the center of circulation. Rainfall throughout most of the storm was occurring between 3 and 20 mm (0.1 to 0.8 inches/yellow, green and pink colors) per hour. The rainfall data was overlaid on infrared imagery from NOAA's GOES-16 satellite.

Image: 
NASA/NOAA/NRL

Imagine being able to look down at a storm from orbit in space and provide data that lets scientists calculate the rate at which rain is falling throughout it. That is what a NASA satellite rainfall product does as it incorporates data from satellites and observations. NASA found very heavy rainfall ringing the compact eye of Major Hurricane Marie.

Maria's Status on Oct. 2

At 5 a.m. EDT (0900 UTC), NOAA's National Hurricane Center (NHC) reported Hurricane Marie was a Category 4 storm on the Saffir-Simpson Hurricane Wind Scale. The center of Hurricane Marie was located near latitude 16.2 degrees north and longitude 123.2 degrees west. Fortunately, Marie is far from land areas. It is centered about 980 miles (1,580 km) west-southwest of the southern tip of Baja California, Mexico.

Marie was moving toward the west-northwest near 15 mph (24 kph). Maximum sustained winds have increased to near 130 mph (215 kph) with higher gusts. Hurricane-force winds extend outward up to 25 miles (35 km) from the center and tropical-storm-force winds extend outward up to 125 miles (205 km). The estimated minimum central pressure is 947 millibars.

Estimating Marie's Rainfall Rates from Space

NASA's Integrated Multi-satellitE Retrievals for GPM, or IMERG, a NASA satellite rainfall product, estimated that on Oct. 2 at 4:30 a.m. EDT (0830 UTC), Hurricane Marie was generating as much as 50 mm (2 inches) of rain in the eyewall, the ring of intense storms around the eye. Rainfall throughout most of the storm was occurring between 3 and 20 mm (0.1 to 0.8 inches) per hour.

At the U.S. Naval Research Laboratory in Washington, D.C., the IMERG rainfall data was overlaid on infrared imagery from NOAA's GOES-16 satellite to provide a full picture of the storm.

What Does IMERG Do?

This near-real time rainfall estimate comes from NASA's IMERG, which combines observations from a fleet of satellites, in near-real time, to provide near-global estimates of precipitation every 30 minutes. By combining NASA precipitation estimates with other data sources, we can gain a greater understanding of major storms that affect our planet.

What IMERG does is "morph" high-quality satellite observations along the direction of the steering winds to deliver information about rain at times and places where satellite overflights did not occur. Information morphing is particularly important over the majority of the world's surface that lacks ground-radar coverage. Basically, IMERG fills in the blanks between weather observation stations.
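The morphing step can be pictured as advecting two satellite snapshots toward each other and blending them in time. The Python below is a toy illustration of that idea under simple assumptions (a uniform steering wind, displacements measured in pixels); it is not the operational IMERG algorithm.

import numpy as np
from scipy.ndimage import shift

def morph(rain_t0, rain_t1, wind_uv, frac):
    # Estimate the rain field a fraction `frac` (0..1) of the way between
    # two overpasses, given the wind displacement (pixels) over the interval.
    dx, dy = wind_uv
    fwd = shift(rain_t0, (frac * dy, frac * dx), order=1)                # carry t0 forward
    bwd = shift(rain_t1, (-(1 - frac) * dy, -(1 - frac) * dx), order=1)  # pull t1 back
    return (1 - frac) * fwd + frac * bwd                                 # time-weighted blend

# e.g. morph(a, b, (3.0, -1.0), 0.5) estimates the field midway through a gap.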

Marie's Future

Additional strengthening is expected today, with weakening forecast to begin on Saturday, Oct. 3. A motion toward the west-northwest or northwest with a gradual decrease in forward speed is expected during the next several days.

Credit: 
NASA/Goddard Space Flight Center

Smartphone surveys find a connection between daily spiritual experiences and well-being

image: Baylor University sociologist Matt Bradshaw, Ph.D.

Image: 
Baylor University

Using smartphone check-ins twice a day for two weeks, sociologists in a national study have found a link between individuals' daily spiritual experiences and overall well-being, say researchers from Baylor University and Harvard University.

While other studies have found such a connection between spirituality and positive emotions, the new study is significant because frequent texting made it easier to capture respondents' moment-to-moment spiritual experiences over 14 days rather than only one or two points in time, they say.

"This study is unique because it examines daily spiritual experiences -- such as feeling God's presence, finding strength in religion or spirituality, and feeling inner peace and harmony -- as both stable traits and as states that fluctuate," said study co-author Matt Bradshaw, Ph.D., research professor of sociology at Baylor University Institute for Studies of Religion (ISR).

"Because surveys usually capture only one or two points in time, researchers often have to assume that associations between spirituality and positive emotions capture stable traits in respondents rather than momentary states of mind," he said. "But these findings suggest that stable, consistent spiritual experiences as well as short-term periodic ones both serve as resources to promote human flourishing and help individuals cope with stressful conditions."

Additionally, "the prevalence of smartphones makes this sort of 'experience sampling' study doable on a much larger scale than in the past, when pagers or palm pilots were used to trigger data collection," said lead author Blake Victor Kent, Ph.D., Research Fellow of Harvard Medical School/Massachusetts General Hospital and a non-resident scholar at Baylor ISR.

The study -- published in The International Journal for the Psychology of Religion -- uses data from SoulPulse, a project funded by the John Templeton Foundation, to study religion, spirituality and mental and physical well-being. Participants were 2,795 individuals who signed up for the study after learning of it through national media -- including the Associated Press, the Religion News Service and The New Yorker -- and by word of mouth.

Kent said that daily spiritual experiences are measured as one of two types:

* Theistic spiritual experiences examine the degree to which God is experienced as present, available and active in the individual's life using six questions: "I feel God's presence," "I find strength and comfort in my religion or spirituality," "I feel God's love for me directly or through others," "I desire to be closer to God or in union with the divine," "I feel guided by God in the midst of daily activities" and "I feel close to God."

* Non-theistic spiritual experiences assess transcendent feelings not specifically connected to God or a divine being using three questions: "I feel a deep inner peace or harmony," "I am spiritually touched by the beauty of creation" and "I feel thankful for my blessings."

To keep daily surveys short and interesting for participants, 10 to 15 items were pulled from some 100 questions and appeared with varying frequency. They included assessments of depression or positive emotions with such items as: "I feel downhearted and blue," "I feel that life is meaningless," "I am unable to become enthusiastic about anything," "I am feeling happy," "I am feeling that I have a warm and trusting relationship with others" and "I have something important to contribute to society."

Another item asked whether, since the most recent daily survey, the person had experienced a stressful situation such as an argument with a loved one, illness, injury, accident, job stress, financial problems or tragedy.

"The findings indicate, as you would expect, that the wear and tear of daily stressors are associated with increased depressive symptoms and lower levels of flourishing," Kent said. "What this study really contributes is that daily spiritual experiences play an important role as well. Essentially, if you take two people who have equal levels of stress, the one with more spiritual experiences will be less likely to report depressive symptoms and more likely to indicate feelings of flourishing. That's a comparison between two people.

"But what about one person?" he said. "The unique thing about this study is we are able to show that when someone's spiritual experiences vary day to day, the 'above average' days of spiritual experience are associated with better mental well-being than the 'below average' days."

Credit: 
Baylor University

New tool shows main highways of disease development

As people get older they often jump from disease to disease and carry the burden of more chronic diseases at once. But is there a pattern to the way diseases follow each other? Danish researchers have for the past six years developed a comprehensive tool, the Danish Disease Trajectory Browser, that utilizes 25 years of public health data from Danish patients to explore what they call the main highways of disease development.

"A lot of research focus is on investigating one disease at a time. We try to add a time perspective and look at multiple diseases following each other to discover where are the most common trajectories - what are the disease highways that we as people encounter," says professor Søren Brunak from the Novo Nordisk Foundation Center for Protein Research at University of Copenhagen.

To illustrate the use of the tool, the research group looked at data for Down syndrome patients and showed, as expected, that these patients are in general diagnosed with Alzheimer's disease at an earlier age than others. Other frequent diseases are displayed as well.

The Danish Disease Trajectory Browser is published in Nature Communications.

Making health data accessible for research

In general, there is a barrier to working with health data in research, both in terms of getting approval from authorities to handle patient data and because researchers need specific technical skills to extract meaningful information from the data.

"We wanted to make an easily accessible tool for researchers and health professionals where they don't necessarily need to know all the details. The statistical summary data on disease to disease jumps in the tool are not person-sensitive. We compute statistics over many patients and have boiled it down to data points that visualize how often patients with one disease get a specific other disease at a later point. So we are focusing on the sequence of diseases," says Søren Brunak.

The Danish Disease Trajectory Browser is freely available to the scientific community and uses the WHO's disease codes. Even though there are regional differences in disease patterns, the tool is highly relevant in an international context, for example to compare how fast diseases progress in different countries.

Disease trajectories can help in personalized medicine

For Søren Brunak the tool has a great potential in personalized medicine.

"In personalized medicine a part of the job is to divide patients into subgroups that will benefit most from a specific treatment. By knowing the disease trajectories you can create subgroups of patients not just by their current disease, but based on their previous conditions and expected future conditions as well. In that way you find different subgroups of patients that may need different treatment strategies," Søren Brunak explains.

Currently the Disease Trajectory Browser contains data from 1994 to 2018 and will continuously be updated with new data.

Credit: 
University of Copenhagen - The Faculty of Health and Medical Sciences

Physicists build circuit that generates clean, limitless power from graphene

image: Paul Thibado, professor of physics, holds prototype energy-harvesting chips.

Image: 
Russell Cothren, University of Arkansas

FAYETTEVILLE, Ark. - A team of University of Arkansas physicists has successfully developed a circuit capable of capturing graphene's thermal motion and converting it into an electrical current.

"An energy-harvesting circuit based on graphene could be incorporated into a chip to provide clean, limitless, low-voltage power for small devices or sensors," said Paul Thibado, professor of physics and lead researcher in the discovery.

The findings, published in the journal Physical Review E, are proof of a theory the physicists developed at the U of A three years ago that freestanding graphene -- a single layer of carbon atoms -- ripples and buckles in a way that holds promise for energy harvesting.

The idea of harvesting energy from graphene is controversial because it challenges physicist Richard Feynman's well-known assertion that the thermal motion of atoms, known as Brownian motion, cannot do work. Thibado's team found that at room temperature the thermal motion of graphene does in fact induce an alternating current (AC) in a circuit, an achievement thought to be impossible.

In the 1950s, physicist Léon Brillouin published a landmark paper refuting the idea that adding a single diode, a one-way electrical gate, to a circuit is the solution to harvesting energy from Brownian motion. Knowing this, Thibado's group built their circuit with two diodes for converting AC into a direct current (DC). With the diodes in opposition allowing the current to flow both ways, they provide separate paths through the circuit, producing a pulsing DC current that performs work on a load resistor.
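The circuit topology is easy to picture with a toy numerical model: a fluctuating source whose positive and negative swings are routed through separate ideal-diode paths into a single load, so the load always sees current of one sign. The Python below illustrates only that rectification geometry, with made-up numbers; it says nothing about the thermodynamics of the actual graphene device.

import numpy as np

rng = np.random.default_rng(0)
v_src = rng.normal(0.0, 1e-3, 100_000)  # fluctuating source voltage (V), toy values
R_LOAD = 1e6                            # load resistance (ohms)

# Two ideal diodes in opposition: each conducts on one polarity, so both
# half-cycles reach the load with the same sign - a pulsing DC current.
i_load = np.abs(v_src) / R_LOAD
print(f"mean load current: {i_load.mean():.3e} A")
print(f"mean load power:   {(i_load ** 2).mean() * R_LOAD:.3e} W")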

Additionally, they discovered that their design increased the amount of power delivered. "We also found that the on-off, switch-like behavior of the diodes actually amplifies the power delivered, rather than reducing it, as previously thought," said Thibado. "The rate of change in resistance provided by the diodes adds an extra factor to the power."

The team used a relatively new field of physics to prove the diodes increased the circuit's power. "In proving this power enhancement, we drew from the emergent field of stochastic thermodynamics and extended the nearly century-old, celebrated theory of Nyquist," said Pradeep Kumar, associate professor of physics and coauthor.

According to Kumar, the graphene and circuit share a symbiotic relationship. Though the thermal environment is performing work on the load resistor, the graphene and circuit are at the same temperature and heat does not flow between the two.

That's an important distinction, said Thibado, because a temperature difference between the graphene and circuit, in a circuit producing power, would contradict the second law of thermodynamics. "This means that the second law of thermodynamics is not violated, nor is there any need to argue that 'Maxwell's Demon' is separating hot and cold electrons," Thibado said.

The team also discovered that the relatively slow motion of graphene induces current in the circuit at low frequencies, which is important from a technological perspective because electronics function more efficiently at lower frequencies.

"People may think that current flowing in a resistor causes it to heat up, but the Brownian current does not. In fact, if no current was flowing, the resistor would cool down," Thibado explained. "What we did was reroute the current in the circuit and transform it into something useful."

The team's next objective is to determine if the DC current can be stored in a capacitor for later use, a goal that requires miniaturizing the circuit and patterning it on a silicon wafer, or chip. If millions of these tiny circuits could be built on a 1-millimeter by 1-millimeter chip, they could serve as a low-power battery replacement.

The University of Arkansas holds several patents pending in the U.S. and international markets on the technology and has licensed it for commercial applications through the university's Technology Ventures division. Researchers Surendra Singh, University Professor of physics; Hugh Churchill, associate professor of physics; and Jeff Dix, assistant professor of engineering, contributed to the work, which was funded by the Chancellor's Commercialization Fund supported by the Walton Family Charitable Support Foundation.

Credit: 
University of Arkansas

Coastal flooding will disproportionately impact 31 million people globally

image: An aerial view of Belem, Brazil, a city situated along the Amazon Delta in northeastern Brazil. A new study by IU researchers found that climate change places millions of people living near river deltas at risk of flooding from tropical storms.

Image: 
Photo by Eduardo Brondizio, Indiana University

Thirty-one million people living in river deltas are at high risk of experiencing flooding and other impacts from tropical cyclones and climate change, according to a study by Indiana University researchers.

"To date, no one has successfully quantified the global population on river deltas and assessed the cumulative impacts from climate change," said Douglas Edmonds, the Malcolm and Sylvia Boyce Chair in the Department of Earth and Atmospheric Sciences and lead author on the study. "Since river deltas have long been recognized as hotspots of population growth, and with increasing impacts from climate change, we realized we needed to properly quantify what the cumulative risks are in river deltas."

The findings are the result of a collaboration facilitated by IU's Institute for Advanced Study with support from the Environmental Resilience Institute.

The team’s analysis shows that river deltas occupy 0.5 percent of the earth’s land surface, yet they contain 4.5 percent of the global population—a total of 339 million people. Because river deltas form at the coast, at or below sea level, they are highly prone to storm surges, which are expected to occur more frequently due to climate change-fueled sea-level rise and coastal flooding.

In the study, IU researchers analyzed these geographic regions, which include cities like New Orleans, Bangkok, and Shanghai, using a new global dataset to determine how many people live on river deltas, how many are vulnerable to a 100-year storm surge event, and the ability of the deltas to naturally mitigate impacts of climate change.

“River deltas present special challenges for predicting coastal floods that deserve more attention in discussions about the future impacts of climate change,” said IU Distinguished Professor of Anthropology Eduardo Brondizio, a co-author of the study who has been working with rural and urban communities in the Amazon delta for three decades. “Our estimates are likely a minimum because the storm surge and flooding models do not account for the compound interactions of the climate impacts, deficient infrastructure, and high population density.”

With Edmonds and Brondizio, co-authors on the study include Rebecca Caldwell and graduate student Sacha Siani.

In addition to the threat of flooding, many of the residents in river deltas are low-income and experience water, soil, and air pollution, poor and subnormal housing infrastructure, and limited access to public services. According to the study, of the 339 million people living on deltas throughout the world, 31 million of these people are living in the 100-year storm surge floodplains. To make matters worse, 92 percent of the 31 million live in developing or least-developed economies. As a result, some of the most disadvantaged populations are among the most at-risk to the impacts of climate change.

“These communities are already dealing with health risks, lack of sanitation and services, poverty, and exposure to flooding and other environmental risks. Climate change is exacerbating all of these issues and creating more impacts,” Brondizio said.

To conduct their study, the researchers created a global dataset of delta populations and areas, aggregating 2,174 delta locations. They then cross-referenced the dataset with a land population count to determine how many people were living in the deltas. To determine the natural mitigation capacity of the deltas, researchers looked at the volume of incoming sediment deposited by rivers and other waterways flowing out to sea. The volume of incoming sediment was compared to the relative area of the delta to determine if the delta would be considered sediment starved and thus unable to naturally mitigate flooding.
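In bookkeeping terms, the analysis joins each delta to a gridded population count and to its incoming sediment flux. The sketch below shows that aggregation schematically in Python; the column names and the sediment threshold are illustrative assumptions, not values from the study.

import pandas as pd

SEDIMENT_REF = 1.0  # illustrative starvation threshold (flux per unit area)

def summarize(deltas):
    # deltas: one row per delta, with columns population,
    # pop_in_100yr_floodplain, sediment_flux, area.
    out = deltas.copy()
    out["exposed_share"] = out["pop_in_100yr_floodplain"] / out["population"]
    # Flag a delta as "sediment starved" if incoming sediment per unit
    # area falls below the level needed to naturally mitigate flooding.
    out["sediment_starved"] = out["sediment_flux"] / out["area"] < SEDIMENT_REF
    return out

# Summing the population columns over all 2,174 deltas would give the
# study's headline tallies (339 million on deltas, 31 million exposed).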

Decades of engineering have expanded the habitable land area of river deltas, but they’ve also starved the regions of flood-preventing sediment. Without the sediment being renewed naturally, the shorelines will continue to recede, worsening the impacts of storm surges.

“To effectively prepare for more intense future coastal flooding, we need to reframe it as a problem that disproportionately impacts people on river deltas in developing and least-developed economies,” said Edmonds. “We need better climate-impact models that are capable of simulating compound flooding in densely populated areas, so that exposure can be mapped and risk and vulnerability assessed more accurately.”

Credit: 
Indiana University

Potential new tool for frost screening in crops

image: Frost-damaged barley in field trials.

Image: 
University of Adelaide

Agricultural scientists and engineers at the University of Adelaide have identified a potential new tool for screening cereal crops for frost damage.

Their research, published this week in the journal Optics Express, has shown they can successfully screen barley plants for frost damage non-destructively with imaging technology using terahertz waves (which lie between the microwave and infrared waves on the electromagnetic spectrum).

"Frost is estimated to cost Australian grain growers $360 million in direct and indirect losses every year," says project leader Professor Jason Able, at the University's School of Agriculture, Food and Wine.

"To minimise significant economic loss, it is crucial that growers' decisions on whether to cut the crop for hay or continue to harvest are made soon after frost damage has occurred. However, analysing the developing grains for frost damage is difficult, time-consuming and involves destructive sampling."

Frost damage can happen when the reproductive organs of the plant are exposed to air temperatures below 0°C during the growing season, with the amount of damage dependent on the severity and occurrence of frost events.

Cereal crops like barley and wheat show a wide range of susceptibility to frost damage depending on their genetics, management practices, environmental conditions and the interactions between these factors. For example, a one-degree difference in temperature could result in frost damage escalating from 10% to 90% in wheat.

Supported by the University's Waite Research Institute and the Grains Research and Development Corporation, the researchers tested whether a state-of-the-art imaging system at the Terahertz Engineering Laboratory in the School of Electrical and Electronic Engineering could be used to scan both barley and wheat spikes for frost damage.

Terahertz waves are able to penetrate the spike to determine differences between frosted and unfrosted grains.

"Barley and wheat spikes subjected to frost do not necessarily show symptoms for many days until after the frost event," says Professor Able. "This technology holds promise for identifying frost damage before symptoms can be visibly detected."

The researchers, including Dr Wendy Lee, Dr Ariel Ferrante and Associate Professor Withawat Withayachumnankul, found that terahertz imaging can discriminate between frosted and unfrosted barley spikes, and that the results were repeatable over many scans. This imaging technology was also able to determine individual grain positions along the length of the individual spike.
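A plausible classification scheme along these lines, sketched in Python: average the terahertz transmission over each grain position along the spike, then compare each grain against calibration scans of known frosted and unfrosted spikes. The array layout, names and decision rule here are assumptions for illustration, not the published pipeline.

import numpy as np

def grain_profiles(scan, n_grains):
    # scan: 1-D terahertz transmission profile along the spike axis;
    # split it into equal windows, one per grain position.
    return np.array([w.mean() for w in np.array_split(scan, n_grains)])

def classify_grains(scan, n_grains, frosted_mean, unfrosted_mean):
    g = grain_profiles(scan, n_grains)
    # Frost alters grain water content, shifting THz transmission, so
    # label each grain by whichever calibration mean it sits closer to.
    return np.abs(g - frosted_mean) < np.abs(g - unfrosted_mean)  # True = frosted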

"This technology could possibly be developed into a field-based tool, which could be used by growers and agronomists to assist with their crop management and help minimise losses due to frost," says Professor Able. "The technology as it stands could also be used by plant breeders to make more rapid and more informed selection decisions about the performance of one breeding line over many others."

Further R&D is required to enable field deployment of terahertz non-destructive inspection for early frost damage and the research team is looking to develop a working prototype for field tests with other collaborators.

Credit: 
University of Adelaide

Einstein's description of gravity just got much harder to beat

image: Simulation of M87 black hole showing the motion of plasma as it swirls around the black hole. The bright thin ring that can be seen in blue is the edge of what we call the black hole shadow.

Image: 
L. Medeiros; C. Chan; D. Psaltis; F. Özel; UArizona; IAS.

Einstein's theory of general relativity - the idea that gravity is matter warping spacetime - has withstood over 100 years of scrutiny and testing, including the newest test from the Event Horizon Telescope collaboration, published today in the latest issue of Physical Review Letters.

According to the findings, Einstein's theory just got 500 times harder to beat.

Despite its successes, Einstein's robust theory remains mathematically irreconcilable with quantum mechanics, the scientific understanding of the subatomic world. Testing general relativity is important because the ultimate theory of the universe must encompass both gravity and quantum mechanics.

"We expect a complete theory of gravity to be different from general relativity, but there are many ways one can modify it. We found that whatever the correct theory is, it can't be significantly different from general relativity when it comes to black holes. We really squeezed down the space of possible modifications," said UArizona astrophysics professor Dimitrios Psaltis, who until recently was the project scientist of the Event Horizon Telescope collaboration. Psaltis is lead author of a new paper that details the researchers' findings.

"This is a brand-new way to test general relativity using supermassive black holes," said Keiichi Asada, an EHT science council member and an expert on radio observations of black holes for Academia Sinica Institute of Astronomy and Astrophysics.

To perform the test, the team used the first image ever taken of the supermassive black hole at the center of nearby galaxy M87 obtained with the EHT last year. The first results had shown that the size of the black-hole shadow was consistent with the size predicted by general relativity.
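That benchmark is simple to sanity-check. For a non-rotating black hole, general relativity predicts a shadow of diameter 2√27 GM/c², so M87*'s mass and distance fix the expected angular size on the sky; the short Python calculation below uses widely quoted values (roughly 6.5 billion solar masses at 16.8 Mpc), not EHT code.

import math

G, c = 6.674e-11, 2.998e8           # SI units
M_SUN, MPC = 1.989e30, 3.086e22
M, D = 6.5e9 * M_SUN, 16.8 * MPC    # M87* mass and distance

r_g = G * M / c ** 2                # gravitational radius (m)
shadow = 2 * math.sqrt(27) * r_g    # Schwarzschild shadow diameter (m)
microarcsec = math.degrees(shadow / D) * 3600e6
print(f"predicted shadow: ~{microarcsec:.0f} microarcseconds")  # ~40, near the observed ring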

"At that time, we were not able to ask the opposite question: How different can a gravity theory be from general relativity and still be consistent with the shadow size?" said UArizona Steward Theory Fellow Pierre Christian. "We wondered if there was anything we could do with these observations in order to cull some of the alternatives."

The team did a very broad analysis of many modifications to the theory of general relativity to identify the unique characteristic of a theory of gravity that determines the size of a black hole shadow.

"In this way, we can now pinpoint whether some alternative to general relativity is in agreement with the Event Horizon Telescope observations, without worrying about any other details," said Lia Medeiros, a postdoctoral fellow at the Institute for Advanced Study who has been part of the EHT collaboration since her time as a UArizona graduate student.

The team focused on the range of alternatives that had passed all the previous tests in the solar system.

"Using the gauge we developed, we showed that the measured size of the black hole shadow in M87 tightens the wiggle room for modifications to Einstein's theory of general relativity by almost a factor of 500, compared to previous tests in the solar system," said UArizona astrophysics professor Feryal Özel, a senior member of the EHT collaboration. "Many ways to modify general relativity fail at this new and tighter black hole shadow test."

"Black hole images provide a completely new angle for testing Einstein's theory of general relativity," said Michael Kramer, director of the Max Planck Institute for Radio Astronomy and EHT collaboration member.

"Together with gravitational wave observations, this marks the beginning of a new era in black hole astrophysics," Psaltis said.

Testing the theory of gravity is an ongoing quest: Are the general relativity predictions for various astrophysical objects good enough that astrophysicists need not worry about potential differences or modifications to general relativity?

"We always say general relativity passed all tests with flying colors - if I had a dime for every time I heard that," Özel said. "But it is true, when you do certain tests, you don't see that the results deviate from what general relativity predicts. What we're saying is that while all of that is correct, for the first time we have a different gauge by which we can do a test that's 500 times better, and that gauge is the shadow size of a black hole."

Next, the EHT team expects higher fidelity images that will be captured by the expanded array of telescopes, which includes the Greenland Telescope, the 12-meter Telescope on Kitt Peak near Tucson, and the Northern Extended Millimeter Array Observatory in France.

"When we obtain an image of the black hole at the center of our own galaxy, then we can constrain deviations from general relativity even further," Özel said.

Will Einstein still be right, then?

Credit: 
University of Arizona

Chemical innovation stabilizes best-performing perovskite formulation

image: Publishing in Science, researchers at EPFL have successfully overcome a limiting problem with stabilizing the best-performing formulation of metal-halide perovskite films, a key player in a range of applications, including solar cells.

Image: 
Nripan Mathews NTU, Singapore

Perovskites are a class of materials in which organic components are bound to a metal. Their fascinating structure and properties have propelled perovskites to the forefront of materials research, where they are studied for use in a wide range of applications. Metal-halide perovskites are especially popular and are being considered for use in solar cells, LED lights, lasers, and photodetectors.

For example, the power-conversion efficiency of perovskite solar cells (PSCs) has increased from 3.8% to 25.5% in only ten years, surpassing other thin-film solar cells - including the market leader, polycrystalline silicon.

Perovskites are usually made by mixing and layering various materials on a transparent conducting substrate, which produces thin, lightweight films. The process, known as "chemical deposition", is sustainable and relatively cost-effective.

But there is a problem. Since 2014, metal-halide perovskites have been made by mixing cations or halides with formamidinium lead iodide (FAPbI3). The reason is that this recipe yields high power-conversion efficiency in perovskite solar cells. But at the same time, the most stable phase of FAPbI3 is photoinactive, meaning that it does not react to light - the opposite of what a solar power harvester ought to do. In addition, solar cells made with FAPbI3 show long-term stability issues.

Now, researchers led by Michael Grätzel and Anders Hagfeldt at EPFL have developed a deposition method that overcomes the formamidinium issues while maintaining the high power-conversion efficiency of perovskite solar cells. The work has been published in Science.

In the new method, the materials are first treated with a vapor of methylammonium thiocyanate (MASCN) or formamidinium thiocyanate (FASCN). This innovative tweak turns the photoinactive FAPbI3 perovskite films into the desired photosensitive ones.

The scientists used the new FAPbI3 films to make perovskite solar cells. The cells showed more than 23% power-conversion efficiency and long-term operational and thermal stability. They also featured low (330 mV) open-circuit voltage loss and a low (0.75 V) turn-on voltage of electroluminescence.

Credit: 
Ecole Polytechnique Fédérale de Lausanne