
Pollutants rapidly seeping into drinking water

The entire ecosystem of the planet, including humans, depends on clean water. When carbonate rock weathers, karst areas are formed, from which around a quarter of the world's population obtains its drinking water. Scientists have been studying how quickly pollutants can reach groundwater supplies in karst areas and how this could affect the quality of drinking water. An international team led by Junior Professor Dr. Andreas Hartmann of the Chair of Hydrological Modeling and Water Resources at the University of Freiburg compared the time it takes water to seep down from the surface to the subsurface with the time it takes for pollutants to decompose in carbonate rock regions in Europe, North Africa and the Middle East. The researchers published their results in the scientific journal Proceedings of the National Academy of Sciences (PNAS).

Previous continental or global hydrologic model applications have focused mainly on the occurrence of floods or droughts and the general availability of drinking water. However, scientists have predominantly neglected water quality as an important factor for the potability of water on these large scales, in particular how quickly pollutants can seep from the earth's surface into the groundwater through cracks or fissures.

The current research results of Hartmann and his team show that in karst regions, which are characterized by an increased occurrence of cracks and fissures, the risk of contamination by degradable pollutants such as pesticides, pharmaceuticals, or pathogens is significantly higher than previously assumed. Although these pollutants are considered short-lived, up to 50 percent of their initial load can still reach the groundwater, depending on how quickly they decompose. The main reason, the researchers show, is rapid seepage pathways that allow large amounts of infiltrating water to reach the groundwater in a short time. Particularly in regions with thin soils, such as the Mediterranean, pollutants on the surface can seep quickly and in high concentrations into the subsurface during large rain events. The researchers demonstrated the consequences using the degradable pesticide glyphosate as an example. According to their calculations, the rapid transport of glyphosate into the groundwater can cause it to exceed permissible limits by a factor of up to 19. The increased contamination risk for drinking water, and for ecosystems that depend on groundwater, is particularly relevant in regions where agriculture relies on degradable fertilizers and pesticides.
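The interaction the team quantifies - transit time versus pollutant half-life - can be illustrated with first-order decay arithmetic: the surviving fraction is exp(-ln 2 · t_transit / t_half). The following minimal sketch uses illustrative values only (not the team's model) to show why fast karst pathways matter:

```python
import math

def fraction_surviving(transit_days: float, half_life_days: float) -> float:
    """Fraction of a degradable pollutant remaining after first-order
    decay over the time water takes to seep down to the groundwater."""
    return math.exp(-math.log(2) * transit_days / half_life_days)

# Illustrative values only: glyphosate soil half-lives are often cited
# around 30-50 days; karst fast-flow paths can deliver water in days,
# while slow matrix seepage can take months.
for transit in (2, 10, 30, 90):
    print(f"transit {transit:>3} d -> {fraction_surviving(transit, 40):.0%} survives")
```

With a 40-day half-life, a two-day fast path lets roughly 97 percent of the load through, while a 90-day slow path lets most of it decay - the essence of the paper's argument about rapid seepage pathways.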

Credit: 
University of Freiburg

Type of heart failure may influence treatment strategies in patients with AFib

Among patients with both heart failure and atrial fibrillation (AFib), treatment strategies focused on controlling the heart rhythm (using catheter ablation) and those focused on controlling the heart rate (using drugs and/or a pacemaker) showed no significant differences in terms of death from any cause or progression of heart failure, according to a study presented at the American College of Cardiology's 70th Annual Scientific Session. The trial was stopped early and, as a result, has limited statistical power to reveal differences between the two treatment approaches; however, trends observed in the study suggest the type of heart failure a patient has may influence which approach is optimal, researchers said.

Heart failure is a condition in which the heart becomes too weak or too stiff to effectively pump blood to the rest of the body. AFib, one of the most common heart rhythm disorders, is a problem with the heart's electrical signals that causes a fast and irregular heart rhythm. The two conditions often occur together, and the prognosis for people with both conditions is worse than for either condition alone. Researchers have long sought to determine whether treatments for these patients should focus on controlling the heart's rhythm or reducing the heart rate.

"The study didn't have a sufficient sample size to be definitive, but it is highly suggestive that ablation-based rhythm control appears to reduce the primary outcome measures, along with secondary outcomes of quality of life and heart failure markers, in patients who have heart failure with reduced ejection fraction," said Anthony S.L. Tang, MD, a cardiologist and professor of medicine at Western University in London, Canada and the study's lead author. "It's not quite as conclusive as I'd like it to be, but I think it still is useful to help individual physicians determine treatment strategies. I would be less inclined to use the rhythm control strategy for heart failure with preserved ejection fraction and more likely to use it in people with reduced ejection fraction."

The trial is the first of its kind to include patients with AFib who had heart failure with either reduced ejection fraction (HFrEF) or preserved ejection fraction (HFpEF). In the study, 171 participants had HFpEF, in which the ventricles do not fill properly and filling pressures rise, and 240 had HFrEF, in which the heart does not squeeze as strongly as it should. The findings suggest rhythm control may be especially beneficial for patients with HFrEF, though more research is needed to confirm this trend, according to the researchers.

Researchers enrolled 411 patients treated for heart failure at 21 medical centers in Canada, Sweden, Brazil and Taiwan. Participants were 67 years old on average and nearly three-quarters were men. Half were randomly assigned to receive treatments focused on rhythm control and half received rate control strategies. For rhythm control, patients underwent catheter-based ablation, a procedure to permanently disable certain areas of the heart to prevent them from sending erratic electrical signals. For rate control, patients took medications; if medications did not achieve the desired heart rate, these patients underwent a procedure to disable the heart's atrio-ventricular node and implant a pacemaker.

During a median follow-up of 37 months, 23.4% of patients in the rhythm control arm and 32.5% of those in the rate control arm died or had progressive heart failure requiring acute heart failure treatment. Among patients with HFrEF, the composite primary endpoint occurred in 22.8% of patients in the rhythm control arm and 37.1% of patients in the rate control arm.
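For readers who want the effect sizes implied by those event rates, a back-of-the-envelope calculation is straightforward. The arm sizes below are assumed at roughly half of the 411 enrolled; the trial's own analysis used time-to-event methods, not this simple comparison:

```python
# Rough effect-size arithmetic from the reported composite event rates
# (23.4% rhythm control vs 32.5% rate control, whole cohort).
rhythm, rate = 0.234, 0.325

arr = rate - rhythm   # absolute risk reduction
rr = rhythm / rate    # relative risk
nnt = 1 / arr         # number needed to treat to prevent one event

print(f"ARR: {arr:.1%}, RR: {rr:.2f}, NNT: {nnt:.0f}")
```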

Researchers also assessed a variety of markers for heart failure severity; while the study was not designed to demonstrate significant differences in these markers, Tang said that they all exhibited the same general trends as the study's primary endpoint.

The study was funded by the Canadian Institute of Health Research.

Tang will be available to the media in a virtual press conference on Monday, May 17, at 12:15 p.m. ET / 16:15 UTC.

Tang will present the study, "A Randomized Ablation-based Atrial Fibrillation Rhythm Control Versus Rate Control Trial in Patients with Heart Failure and High Burden Atrial Fibrillation (RAFT-AF)," on Monday, May 17, at 10:45 a.m. ET / 14:45 UTC, virtually.

Credit: 
American College of Cardiology

Preemie boys age faster as men, study shows

image: Ryan Van Lieshout, first author of the study, physician and associate professor of psychiatry and behavioural neurosciences at McMaster's Michael G. DeGroote School of Medicine

Image: 
McMaster University

Hamilton, ON (May 17, 2021) - Boys born weighing less than a kilogram are miracles, but they do not age as well as girls born at the same weight, according to new research from McMaster University.

Researchers following a group of extremely low birth weight (ELBW) babies and their normal birth weight counterparts have found that, at least biologically, the premature, or preemie, boys age more quickly and are on average 4.6 years older than boys with normal birth weight born at the same time. No such difference between birth weight groups was found in girls.

In the study published in the journal Pediatrics today, the researchers point out that the rate of aging may be influenced by boys' handling of physiological stress before birth, and in the hospital neonatal intensive care unit after they are born.

The information comes from the world's oldest longitudinal study of ELBW babies who have been followed since the study began at McMaster and Hamilton Health Sciences in 1977.

Using an epigenetic clock, the researchers examined the DNA of 45 of the former ELBW babies and 47 of their normal birth weight peers at ages 30 to 35 to compare their biological ages, controlling for chronic health problems and sensory impairments.
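Analyses of this kind typically regress an epigenetic-clock age estimate on group membership and covariates. The sketch below is hypothetical - the file, column names, and model specification are assumptions for illustration, not the study's actual code:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Assumed layout: one row per participant, with an epigenetic-clock age
# estimate, chronological age, birth-weight group, sex, and covariates.
df = pd.read_csv("cohort.csv")  # hypothetical file and columns

# Adjusting for chronological age makes the group coefficient an "age
# acceleration" effect; the elbw:male interaction captures a boys-only gap
# like the 4.6-year difference reported here.
model = smf.ols(
    "epigenetic_age ~ chronological_age + elbw * male"
    " + chronic_conditions + sensory_impairment",
    data=df,
).fit()
print(model.summary())
```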

"Although it is unclear why accelerated biological aging is seen in the ELBW men, this suggests that prenatal exposures play an important role in aging," said Ryan Van Lieshout, first author of the study, physician and associate professor of psychiatry and behavioural neurosciences at McMaster's Michael G. DeGroote School of Medicine.

He added that previous research has shown that the ELBW boys are more susceptible to prenatal stresses than ELBW girls.

"This certainly highlights the need to monitor the health of preterm survivors across their lifespan, and more research needs to be done," he said. "This also emphasizes the need to forewarn the ELBW men and promote healthy aging so they may proactively mitigate these risks."

He said optimizing health during adulthood includes a balanced diet, avoiding smoking, proper sleep and exercise, stress management, cognitive stimulation and development of strong social networks.

Credit: 
McMaster University

Oregon State research shows why some pockets of conifer survive repeated forest fires

image: Red Buttes Wilderness, southern Oregon

Image: 
Will Downing, OSU

CORVALLIS, Ore. - Oregon State University researchers say "topographic templates" can help forest conservation managers develop strategies for protecting and restoring the most fire-resistant parts of vulnerable forests across a range of ecosystems.

That's important because changing wildfire regimes are affecting forests around the globe, the scientists note, and areas that burn over and over in relatively quick succession may not be able to recover between fires.

"Fire refugia" - areas that burn less frequently and/or less severely than the landscape around them - are crucial for supporting post-blaze ecosystem recovery, including the persistence of species under pressure.

Findings of the study, led by faculty research assistant Will Downing, were published in Global Change Biology.

"Observed and projected forest losses from wildfire tell us that we need to understand where and why refugia persists through multiple fire events," said OSU ecologist Meg Krawchuk, who oversees the College of Forestry's Landscape Fire and Conservation Science lab group. "And we really need to understand fire refugia in the Klamath-Siskiyou ecoregion of southwest Oregon and northwest California. That area holds some of the most diverse collections of conifers in western North America, and expected increases in fire activity, along with a warming climate, could result in the loss of more than 30% of the region's conifer forests."

Krawchuk, Downing, Matt Gregory of the College of Forestry and Garrett Meigs of the Washington State Department of Natural Resources used recent advances in fire progression mapping and weather interpolation - estimating the information between known weather data points - plus a novel application of satellite smoke imagery to build new fire refugia statistical models for the Klamath-Siskiyou region.

The analysis focused on mature, conifer-dominated forests and looked at the key factors behind fire refugia occurrence and persistence through a series of three fire events over 32 years.

"The models suggest hotter-than-average fire weather is associated with lower refugia probability and higher fire severity," Krawchuk said. "Refugia that persisted through three fire events appeared to benefit from topographic variability - a mix of rocky outcrops and landscape depressions, for example - which means the variability may be an important stabilizing factor as forests experience successive fires."

In addition, the models show that smoke density strongly influences fire effects - refugia are more likely to occur when smoke is moderate or dense in the morning, a connection the scientists attribute to the shade smoke provides.
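Refugia models of the kind described are commonly fit as logistic regressions on per-pixel predictors. The simplified sketch below uses assumed variable names (fire_weather_anomaly, topo_variability, morning_smoke_density), not the authors' exact specification:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Assumed layout: one row per burned, mature-conifer pixel, with a binary
# outcome recording whether it persisted as a fire refugium.
pixels = pd.read_csv("burned_pixels.csv")  # hypothetical file and columns

# Refugia probability modeled from fire weather, terrain, and smoke,
# mirroring the relationships reported in the article.
fit = smf.logit(
    "refugium ~ fire_weather_anomaly + topo_variability + morning_smoke_density",
    data=pixels,
).fit()
# Expected signs per the study: negative for hot fire weather,
# positive for topographic variability and morning smoke density.
print(fit.params)
```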

"Our hope is that this study can inform management strategies designed to protect fire-resistant portions of biologically and topographically diverse landscapes," Krawchuk said.

Fire refugia are part of a larger category of hardy areas known as disturbance refugia, and comparatively little is known about why certain refugia are able to hang tough as they pass through successive "fire filters," she said.

"Refugia can be transient and survive a single fire because of random weather or fire behavior conditions, or there can be persistent refugia that don't change very much in the face of multiple fire events," Downing said.

The Klamath-Siskiyou ecoregion is ideal for studying refugia occurrence and persistence because it's a "biodiversity hotspot" in which fire has been a key ecological component for thousands of years.

"Fire there has contributed to the maintenance of patchy, heterogeneous landscapes of conifer and hardwood forests, shrublands and grasslands," Downing said. "But a hotter and drier climate and a lack of surviving post-fire seed sources eat away at the ability of conifer forests to recover after a high-severity fire. Climate change is expected to increase fire frequency in the region, and repeat burning is projected to convert about a third of the conifer forest to shrublands or hardwood forest by the end of the 21st century."

In some cases, that conversion will be a good thing ecologically, Downing said - such as where fire suppression has led to a decline in early seral communities, those that spring up after a stand-replacing event and before a new forest takes hold. In others, carbon storage, biodiversity and timber supply will be vulnerable to widespread conifer forest loss.

"Figuring out which areas are most likely to persist as forest through wildfire requires using landscape-scale assessments of the factors behind fire behavior and severity: topography, fuels and weather," Krawchuk said. "Refugia are ecologically important parts of fire severity mosaics, and it appears that the more times a landscape burns, the more important terrain features are for refugia persistence."

Credit: 
Oregon State University

Aggressive or friendly? The inflammatory protein interleukin 1β may decide

Tsukuba, Japan - Aggression is common in many neuropsychiatric diseases, such as dementia, autism spectrum disorder, and schizophrenia. It causes many problems for patients and their families, but can be difficult to treat because little is known about what causes it. In a study published last month in Molecular Psychiatry, researchers from the University of Tsukuba revealed that variation in levels of interleukin 1β (IL-1β), a protein that mediates the inflammatory response, is associated with individual differences in aggressive behaviors in male mice.

In humans, levels of inflammatory proteins such as IL-1β in the blood correlate with aggressive traits. To better understand these findings, researchers at the University of Tsukuba decided to investigate IL-1β levels in the blood of male mice, which they classified as aggressive or non-aggressive based on their behaviors toward other male mice. Unexpectedly, there were no differences in blood IL-1β levels between the aggressive and non-aggressive mice, in contrast to what had been reported in humans. This finding intrigued the researchers, and they wanted to know more.

"The dorsal raphe nucleus is a region of the brain that is important in aggressive behaviors," says lead author of the study Professor Aki Takahashi. "We decided to investigate IL-1β levels in this brain region in mice, and to experiment using drugs and genetic methods to reduce the effects of IL-1β on its receptors, to see if there were any related changes in aggressive behaviors."

The results were surprising: IL-1β was actually lower in the dorsal raphe nucleus of aggressive mice than in non-aggressive mice. In addition, in the experiments where IL-1β was less able to act on its receptors in this brain region, the mice were more aggressive.

The researchers then decided to look at the relationship between IL-1β and serotonin, a key neurotransmitter in the control of aggression. They found that, during aggressive encounters, serotonin neurons in the dorsal raphe nucleus were more active in aggressive mice than in non-aggressive mice. Moreover, when they experimentally lowered the expression of IL-1 receptors, serotonin neurons were also more active in this brain region.

"Our findings suggest that IL-1β in the dorsal raphe nucleus suppresses aggressive behavior, possibly by acting on the serotonin system," says Takahashi.

The findings suggest that IL-1β and serotonin neurons might be potential drug targets for reducing aggression, which currently has few effective treatments. The results of this study could therefore lay the foundations for research into treatment approaches for aggression in patients with neuropsychiatric diseases.

Credit: 
University of Tsukuba

A LiDAR device the size of a finger

image: Conventional macroscanner and MEMS-type LiDAR systems

Image: 
POSTECH

The nanophotonics-based LiDAR technology developed by a POSTECH research team was presented as an invited paper in Nature Nanotechnology, the leading academic journal in the field of nanoscience and nanoengineering.

In this paper, the POSTECH research team (led by Professor Junsuk Rho of the departments of mechanical engineering and chemical engineering, with postdoctoral researcher Dr. Inki Kim of the Department of Mechanical Engineering and Ph.D. candidate Jaehyuck Jang of the Department of Chemical Engineering), in cooperation with the French National Science Institute (CNRS-CRHEA), presented the LiDAR technology it has developed through research on ultralight, metamaterial-based nanophotonics.

In addition, the paper introduces core nanophotonic technologies such as a phase-change-material-based beam scanning technique, a flash-type LiDAR that eliminates the need for beam scanning by using a point-cloud generation device, and methods for light-source integration and scalable manufacturing.

In particular, the paper explains that the ultra-precise LiDAR devices the team has developed can be applied not only to autonomous vehicles but also to intelligent robots, drones, 3D panoramic cameras, CCTV, and augmented reality platforms. LiDAR collects depth information about an object by irradiating it with a laser beam and measuring the time the light takes to return. LiDAR sensors are attracting attention across a wide range of machines - such as autonomous vehicles, artificially intelligent robots, and unmanned aerial vehicles - and are now mounted on iPhones for 3D face recognition and secure payment systems.
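The time-of-flight principle mentioned above reduces to a single line of arithmetic: depth is half the round-trip travel time multiplied by the speed of light. A minimal illustration:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_depth_m(round_trip_s: float) -> float:
    """Depth from a time-of-flight LiDAR return: half the round trip."""
    return C * round_trip_s / 2

# A return arriving after 66.7 nanoseconds corresponds to roughly 10 m.
print(f"{tof_depth_m(66.7e-9):.2f} m")
```

The nanosecond scale of these round trips is why ultrafast, ultra-precise measurement methods dominate the discussion of LiDAR performance.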

Currently, the high-end mechanical LiDAR systems mounted on the roofs of autonomous vehicles are about the size of two adult fists stacked together and cost tens of thousands of dollars. Many challenges also remain to be overcome, such as the large amount of power the systems consume and the heat they must dissipate.

As a solution, the research team proposed ultracompact LiDAR technology based on nanophotonics. The researchers explain how nanophotonics can transform the LiDAR sensor system in various respects, from the basic measurement principles of LiDAR to the latest ultrafast and ultra-precise nanophotonic measurement methods and devices such as metasurfaces, soliton microcombs, and optical waveguides.

"Currently, the research team is conducting several follow-up studies to develop ultralight metasurface-based compound LiDAR systems," remarked Professor Junsuk Rho. "If this research is successful, we can look forward to manufacturing affordable ultrafast and ultra-precise LiDAR systems at an affordable cost."

Credit: 
Pohang University of Science & Technology (POSTECH)

Study shows online gambling soared during lockdown, especially among regular gamblers

Regular gamblers were more than six times more likely to gamble online than before the COVID-19 pandemic, according to new research.

The study, led by the University of Bristol and published today (17 May) in the Journal of Gambling Studies, showed regular male gamblers were particularly prone to gambling more often online during the public lockdown in the UK, compared to their previously reported gambling habits.

Although men and women overall gambled less frequently during lockdown, partly because betting shops were closed, some forms of gambling increased. For instance, use of online gambling, including poker, bingo, and casino games, grew six-fold among regular gamblers. Respondents who gambled occasionally were still found to be more than twice as likely as before to gamble online. Those who had struggled financially before the pandemic were more likely to report gambling during lockdown.

Lead author Professor Alan Emond, of the University of Bristol's Medical School, said: "This study provides unique real time insights into how people's attitudes and gambling behaviour changed during lockdown, when everyone was stuck inside and unable to participate in most social activities. The findings reveal that although many forms of gambling were restricted, a minority of regular gamblers significantly increased their gambling and betting online. As with so many repercussions of the pandemic, inequalities have been exacerbated and particularly vulnerable groups were worse affected."

The comparative research used two online questionnaires during the first lockdown in 2020, which surveyed the same group of adults, aged 28 years on average, who had previously been asked similar questions about gambling before the pandemic as part of the renowned Children of the 90s study, also known as the Avon Longitudinal Study of Parents and Children (ALSPAC).

More than 2,600 adults responded and results revealed that during lockdown men were three times more likely than women to gamble regularly, defined as more than once a week. Drinking heavily, defined as more than six units in a session (equivalent to more than three pints of beer) at least once a week, was strongly linked to regular gambling among men and women. These trends are likely to be much greater in reality, as the majority (70 per cent) of respondents to the surveys in lockdown were women.

Professor Emond, a public health expert, said: "The strong link between binge drinking and regular gambling is of particular concern, as they are both addictive behaviours which can have serious health and social consequences. With the wider availability of gambling through different online channels, vulnerable groups could get caught in a destructive cycle. A public health approach is needed to minimise gambling harms."

The research builds on other evidence, including the YouGov Covid-19 tracker study, which found that regular gamblers turned to new online options during lockdown. Data from the Gambling Commission, derived from the biggest gambling operators in the UK, also showed increased revenues for online gambling during lockdown, especially on esports, which gained dramatically in popularity as the live sporting events traditionally bet on were suspended. Previous research in the Journal of Public Policy & Marketing, led by the University of Bristol, has revealed that children are engaging particularly with esports gambling advertising on social media.

Online advertising expert and co-author Agnes Nairn, Professor of Marketing at the University of Bristol's School of Management, said: "The results of this study and trends being reported more widely are quite alarming. As gambling habits shift online, vulnerable groups including children and adults who drink heavily may be more easily sucked into these channels. The increased prevalence of home working is also an important consideration for future policy making, as the temptation to gamble online, amplified by clever advertising, is always there. Children are also falling prey to this advertising, especially for esports, on social media and could get locked into addictive habits from an early age. Stricter regulation is needed in this growing field to protect unwitting consumers."

Alison Clare, Research, Information and Knowledge Director at GambleAware, said: "We know that gambling is part of the daily lives of children, young people and vulnerable adults and this research sheds further light on the impact Covid-19 and lockdown has had on gambling habits for young people. GambleAware is committed to ensuring all those affected by gambling harm have access to the necessary information and advice. All organisations, including National Health Services and charities need to work together to reduce stigma and raise awareness of the help and support that is available via the National Gambling Treatment Service."

Credit: 
University of Bristol

Poverty associated with worse survival, fewer lung transplants in lung disease patients

image: Poverty associated with worse survival rates in patients with lung disease.

Image: 
ATS

ATS 2021, New York, NY - Patients with idiopathic pulmonary fibrosis (IPF), a rare lung disease that causes shortness of breath and low oxygen levels because of lung scarring, have worse outcomes if they live in poor neighborhoods, according to research presented at the ATS 2021 International Conference.

Gillian Goobie, MD, Human Genetics, Graduate School of Public Health, University of Pittsburgh, and colleagues sought to determine how environmental and occupational factors contribute to the development and progression of IPF. People who live in areas with high neighborhood-level disadvantage, as measured by the Area Deprivation Index, experience disparities in housing, poverty, employment, and education. These social determinants of health impact the outcome of many chronic diseases.

"Our preliminary data from our single center study at the University of Pittsburgh indicates that neighborhood-level disadvantage may be associated with increased mortality and reduced odds of receiving a lung transplant in patients with IPF," stated Dr. Goobie, study author.

Policies and legislation that promote more equitable environments and reduce the burden of poverty, the researchers note, may help to alleviate the disparities seen in the outcomes of patients with IPF.

People are more likely to develop IPF or other forms of interstitial lung disease (ILD) if they have worked in an occupation with significant exposure to airborne materials. For example, individuals exposed to asbestos, silica, wood chippings, or numerous other materials through their work are at higher risk of developing ILD than individuals without those exposures. Smoking is also a very important risk factor contributing to the development of IPF in many patients, as is exposure to air pollution.

"I think there are substantial real-world implications of this research. With a disease like IPF, which has a very high mortality, we are more able to demonstrate the substantial impact that these neighborhood-level factors can have on survival and transplant outcomes. I was surprised that we were able to find a significant impact of neighborhood-level disadvantage on survival in our relatively small cohort of patients with IPF. I am looking forward to validating these results in a larger and more diverse population of patients with IPF and other forms of fibrotic ILD," stated Dr. Goobie.

Credit: 
American Thoracic Society

Novel monoclonal antibody can substantially lower triglycerides in patients with acute pancreatitis

The investigational drug evinacumab reduced triglycerides in patients with severe hypertriglyceridemia (sHTG) and a history of hospitalizations for acute pancreatitis in a phase 2 global study led by Mount Sinai. The fully human monoclonal antibody produced sustained reductions in triglyceride levels of up to 82 percent, depending on the patient's genotype, while also lowering the risk of recurrent acute pancreatitis. The results of the study will be presented as a late-breaking clinical trial at the American College of Cardiology (ACC) Annual Scientific Session, on May 16.

"Evinacumab has the potential to not only lower triglycerides, but the risk of acute pancreatitis, quality of life, and the risk of cardiovascular events in a highly vulnerable patient population," says Robert S. Rosenson, MD, Professor of Medicine at the Icahn School of Medicine at Mount Sinai, and lead investigator of the study. "The unmet clinical need couldn't be greater. Even after the current therapeutic options of dietary counseling, fibrates, and omega-3 fatty acid products, many individuals with severe hypertriglyceridemia have elevated triglyceride levels above 500 mg/dL, and some in the thousands."

Severe hypertriglyceridemia, defined as triglycerides greater than 500 mg/dL, is believed to be responsible for around 10 percent of all cases of acute pancreatitis, which affects more than 200,000 patients a year in the United States. Acute pancreatitis is an inflammatory condition of the pancreas that causes abdominal pain and fever and, in some individuals, can be life-threatening; its most common causes are gallstones and alcoholism, and recurrent episodes typically require frequent hospitalizations.

In their study of 52 patients with severe hypertriglyceridemia, the researchers found that clinical improvements depended on genetic variants. The greatest triglyceride reductions, up to 82 percent, occurred in a cohort of patients who did not carry two mutations in the lipoprotein lipase (LPL) pathway. LPL is an enzyme responsible for metabolizing, or breaking down, triglycerides. In a second cohort of patients with a genetic disorder known as multifactorial chylomicronemia syndrome (MCS) - which can be exacerbated by comorbidities, medications, and even lifestyle - triglycerides were reduced by around 65 percent. And in a third cohort - patients with two loss-of-function mutations in genes of the LPL pathway, a condition known as familial chylomicronemia syndrome (FCS) - there was no reduction in triglyceride levels.

"Our research underscored the importance of genetic testing of the LPL pathway to determine which patients are most likely to respond well to evinacumab therapy," says Dr. Rosenson, who is Director of Metabolism and Lipids for the Mount Sinai Health System. "Even in patients with two LPL mutations who experienced no reduction in triglycerides, there were reductions in non-HDL cholesterol and in the cholesterol content of triglyceride-rich lipoproteins, demonstrating that evinacumab was impacting the triglyceride pathway."

Evinacumab works by binding to and blocking the function of angiopoietin-like protein 3 (ANGPTL3), a protein thought to play a role in cholesterol metabolism. People who are missing or have very low ANGPTL3 due to genetic causes are known to have significantly reduced lipid levels, suggesting to scientists that it could also be a therapeutic target for lowering triglycerides.

Evinacumab, from Regeneron Pharmaceuticals, was approved by the U.S. Food and Drug Administration in February 2021 (under the name Evkeeza™) for homozygous familial hypercholesterolemia, an inherited disorder that makes it difficult for the body to eliminate LDL cholesterol (so-called "bad cholesterol") from the blood.

The next clinical trial for evinacumab in patients with severe hypertriglyceridemia is designed to evaluate the reduction in the risk of acute pancreatitis and is expected to begin shortly, with Mount Sinai again playing a pivotal global role. "Based on the results we've seen to date, we believe evinacumab can significantly decrease the risk of recurrent acute pancreatitis in people with severely elevated triglycerides," says Dr. Rosenson. "At the same time, this novel drug could help to ease the financial burden on a health system which provides ongoing care for these high-risk patients who are frequently hospitalized for recurrent episodes of acute pancreatitis."

Credit: 
The Mount Sinai Hospital / Mount Sinai School of Medicine

Researchers develop 3D-printed jelly

image: The hydrogel material comes from different-sized seaweed particles.

Image: 
Orlin Velev, NC State University

3D-printable gels with improved and highly controlled properties can be created by merging micro- and nano-sized networks of the same materials harnessed from seaweed, according to new research from North Carolina State University. The findings could have applications in biomedical materials - think of biological scaffolds for growing cells - and soft robotics.

Described in the journal Nature Communications, the findings show that these water-based gels - called homocomposite hydrogels - are both strong and flexible. They are composed of alginates - chemical compounds found in seaweed and algae that are commonly used as thickening agents and in wound dressings.

Merging different-size scale networks of the same alginate together eliminates the fragility that can sometimes occur when differing materials are merged together in a hydrogel, says Orlin Velev, S. Frank and Doris Culberson Distinguished Professor of Chemical and Biomolecular Engineering at NC State and corresponding author of the paper.

"Water-based materials can be soft and brittle," he said. "But these homocomposite materials - soft fibrillar alginate particles inside a medium of alginate - are really two hydrogels in one: one is a particle hydrogel and one is a molecular hydrogel. Merged together they produce a jelly-like material that is better than the sum of its parts, and whose properties can be tuned precisely for shaping through a 3D printer for on-demand manufacturing."

"We are reinforcing a hydrogel material with the same material, which is remarkable because it uses just one material to improve the overall mechanical properties," said Lilian Hsiao, an assistant professor of chemical and molecular engineering at NC State and a co-author of the paper. "Alginates are used in wound dressings, so this material potentially could be used as a strengthened 3D-printed bandage or as a patch for wound healing or drug delivery."

"These types of materials have the potential to be most useful in medical products, in food products as a thickening agent, or in soft robotics," said Austin Williams, one of the paper's first coauthors and a graduate student in Velev's lab.

Future work will attempt to fine-tune this method of merging homocomposite materials to advance 3D printing for biomedical applications and injectable biomedical materials, Velev said.

"This technique may have uses with other types of gels, like those used in coatings or in consumer products," Hsiao said.

Credit: 
North Carolina State University

Interim study suggests oral TXA is equally effective in preventing blood loss in joint replacement

Interim results of a study conducted by researchers at Hospital for Special Surgery (HSS) suggest that oral tranexamic acid (TXA) is non-inferior to intravenous (IV) TXA in preventing blood loss in total knee and total hip replacement surgery. These findings were presented at the 2021 Spring American Society of Regional Anesthesia and Pain Medicine (ASRA) Annual Meeting.

Previously available information suggests that oral, IV and topical TXA are all effective at reducing blood loss and drastically reducing blood transfusion rates during and after surgery, but research with direct comparisons for each method is limited.

"TXA in orthopedic surgery has become the standard of care. However, the most efficient, efficacious and cost effective method of administration remains unknown," said principal investigator Stavros Memtsoudis, MD, PhD, MBA, an anesthesiologist at HSS. "The oral administration of TXA is logistically easier, thus reducing the risk of drug errors in the OR. It is also less costly. We are performing this study to identify if oral TXA is also equally efficacious at preventing blood loss. If this is the case, oral administration of the drug preoperatively as a one-time dose could become the standard of care."

Dr. Memtsoudis and colleagues randomized 199 patients between ages 18 and 80 undergoing total hip or total knee replacement to receive either oral TXA (1950 mg) two hours before surgery or IV TXA (1 g) at the start of the procedure. The primary outcomes observed were blood loss and transfusion rates.

In patients who underwent total hip replacement, the estimated blood loss calculated in the post anesthesia care unit (PACU) was 534 ± 285 mL for oral TXA versus 676 ± 550 mL for IV TXA. On postoperative day one, estimated blood loss was 769 ± 257 mL for oral TXA and 798 ± 302 mL for IV TXA.

In patients who underwent total knee replacement, estimated blood loss in the PACU was 289 ± 219 mL for oral TXA versus 486 ± 670 mL for IV TXA. On postoperative day one, estimated blood loss was 716 ± 288 mL for oral TXA versus 846 ± 659 mL for IV TXA.
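Because only means and standard deviations are reported here, a reader can approximate the arm comparison with a summary-statistics Welch test. The group sizes below are assumed for illustration - the actual split of the 199 patients is not given - and a formal non-inferiority analysis would use confidence-interval margins rather than a plain t-test:

```python
from scipy.stats import ttest_ind_from_stats

# PACU blood loss after total knee replacement, oral vs IV TXA.
# Group sizes of ~50 each are an assumption for illustration only.
res = ttest_ind_from_stats(
    mean1=289, std1=219, nobs1=50,   # oral TXA
    mean2=486, std2=670, nobs2=50,   # IV TXA
    equal_var=False,                 # Welch's t-test (unequal variances)
)
print(res)
```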

No patients received transfusions during surgery. One patient who received IV TXA received a transfusion after surgery.

"Given our interim results, it seems that the oral version of TXA is equally as effective as intravenous administration. This translates to improvements in efficiency, cost and safety, all of which are important for patients, clinicians and policy makers," Dr. Memstoudis said. "The research seems rather clear at this point. However, a uniform translation into policy is what is needed, as there seems to be limited translation of best evidence into practice."

Complete results of this study will be analyzed later this year.

Credit: 
Hospital for Special Surgery

Above the noise

image: Deep learning helps researchers to find hidden features in resistive pulse signals in nanopore sensing of nanoscale objects.

Image: 
Osaka University

Osaka, Japan - Scientists from the Institute of Scientific and Industrial Research at Osaka University used machine learning methods to enhance the signal-to-noise ratio in data collected when tiny spheres are passed through microscopic nanopores cut into silicon substrates. This work may lead to much more sensitive data collection when sequencing DNA or detecting small concentrations of pathogens.

Miniaturization has opened the possibility for a wide range of diagnostic tools, such as point-of-care detection of diseases, to be performed quickly and with very small samples. For example, unknown particles can be analyzed by passing them through nanopores and recording tiny changes in the electrical current. However, the intensity of these signals can be very low, and is often buried under random noise. New techniques for extracting the useful information are clearly needed.

Now, scientists from Osaka University have used deep learning to "denoise" nanopore data. Most machine learning methods need to be trained with many "clean" examples before they can interpret noisy datasets. However, using a technique called "Noise2Noise," which was originally developed for enhancing images, the team was able to improve resolution of noisy runs even though no clean data was available. Deep neural networks, which act like layered neurons in the brain, were utilized to reduce the interference in the data.

"The deep denoising enabled us to reveal faint features in the ionic current signals hidden by random fluctuations," first author Makusu Tsutsui says. "Our algorithm was designed to select features that best represented the input data, thus allowing the computer to detect and subtract the noise from the raw data."

The process was repeated many times until the underlying signal was recovered. Essentially, many noisy runs were utilized to produce one clean signal.
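The core of Noise2Noise is that the training target is itself noisy: the network learns to map one noisy recording to a second, independent noisy recording of the same signal, and in doing so converges toward the clean average. A minimal 1D PyTorch sketch (the architecture and synthetic signal are illustrative, not the paper's network):

```python
import torch
import torch.nn as nn

# Tiny 1D convolutional denoiser: input and target are two independent
# noisy recordings of the same underlying resistive-pulse trace.
net = nn.Sequential(
    nn.Conv1d(1, 32, 9, padding=4), nn.ReLU(),
    nn.Conv1d(32, 32, 9, padding=4), nn.ReLU(),
    nn.Conv1d(32, 1, 9, padding=4),
)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

def train_step(noisy_a: torch.Tensor, noisy_b: torch.Tensor) -> float:
    """One Noise2Noise step: predict one noisy copy from the other."""
    opt.zero_grad()
    loss = nn.functional.mse_loss(net(noisy_a), noisy_b)
    loss.backward()
    opt.step()
    return loss.item()

# Synthetic demo pair: the same pulse with two independent noise draws.
t = torch.linspace(-1, 1, 512)
pulse = torch.exp(-(t / 0.05) ** 2).reshape(1, 1, -1)
a, b = (pulse + 0.3 * torch.randn_like(pulse) for _ in range(2))
print(train_step(a, b))
```

Because the noise in the target is independent of the noise in the input, the network cannot predict it and instead learns the shared clean component - which is why no clean training data is needed.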

"Our method may expand the capability nanopore sensing for rapid and accurate detection of infection diseases," explains senior author Takashi Washio. "This research may lead to much more accurate diagnostic tests, even when the underlying signal is very weak."

Credit: 
Osaka University

Researchers observe new complexity of traveling brain waves in memory circuits

video: Embeddable video showing propagation of brain waves over the hippocampus

Image: 
Jon Kleen/UCSF

Researchers at UC San Francisco have observed a new feature of neural activity in the hippocampus - the brain's memory hub - that may explain how this vital brain region combines a diverse range of inputs into multi-layered memories that can later be recalled.

Using a special "micro-grid" recording device developed by colleagues at Lawrence Livermore National Laboratory (LLNL), the UCSF researchers were able to measure hippocampus activity in study participants undergoing surgery to treat severe epilepsy. They discovered that brain waves travel back and forth across this structure, integrating messages from different areas of the brain, and showed for the first time what scientists previously had only been able to hypothesize.

"Brain recordings are an important part of guiding epilepsy surgery," said Edward Chang, MD, PhD, chair of the Department of Neurological Surgery and the senior author on the study, which appears May 12 in Nature Communications. "The new high-density electrode grid technology used here allowed us to see a novel property of hippocampal activity that was previously unknown."

Chang specializes in treating epilepsy with brain surgery, during which the hippocampus, a long structure deep in the brain within an area called the temporal lobe, is exposed and sometimes fully or partially removed. The hippocampus can be a source of seizures for people with epilepsy and is one of the first brain regions affected in Alzheimer's disease.

Previous studies had suggested that waves of activity in the hippocampus only travel in one direction: from the back end, which encodes most of the information about physical location, to the front, which encodes most emotional information. To Jon Kleen, MD, PhD, lead author on the study and assistant professor of neurology in the Weill Institute for Neurosciences, this one-way travel wasn't sufficient to explain how this small brain region manages to link multiple types of information to form a memory.

As an example, he said, imagine that you've lost your keys in Times Square. "You remember the spatial 'where' aspect - Times Square - but you also remember the emotional feeling, 'Ack, I lost my keys!'" he said. To process a memory, Kleen noted, there must be some way to integrate the many parts of a memory together. To accomplish this, he surmised, it would make sense for brain waves to travel via multiple routes to process information.

Customized Electrode Array Gives Two-Dimensional View of Brain Waves

In an effort to test this hypothesis, Chang and Kleen partnered with Razi Haque, Implantable Microsystems Group Lead at LLNL, to develop a device that could give a high-resolution, two-dimensional picture of neural activity. Haque helped create a device smaller than a dime, containing 32 electrodes spaced 2 mm apart in a flexible polymer that could conform to the shape of the hippocampus.

During surgery, Chang gently laid the electrode array directly on the hippocampi of six surgical patients to monitor electrical activity while the patients rested. Analyzing the data with machine-learning algorithms, the team found not only that brain waves travel both up and down the hippocampus, but that the directions in which they move can be predicted.

The team also found that at times, waves of two different frequencies would be present at once, moving in different directions and potentially carrying different information. The finding lends new insight into how the hippocampus can integrate information coming from multiple brain areas into detailed memories.
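Traveling-wave direction on a two-dimensional electrode grid is often estimated from the spatial gradient of the oscillation's instantaneous phase; waves propagate opposite the phase gradient. The sketch below is a generic illustration of that approach under assumed grid geometry, not the study's pipeline:

```python
import numpy as np
from scipy.signal import hilbert

def wave_direction(lfp: np.ndarray) -> np.ndarray:
    """Estimate traveling-wave direction per time point.

    lfp: array of shape (rows, cols, time) from a regular electrode
    grid (pitch assumed uniform). Returns propagation angle in radians.
    """
    phase = np.angle(hilbert(lfp, axis=-1))              # instantaneous phase
    unwrapped = np.unwrap(phase, axis=-1)
    dphi_y, dphi_x = np.gradient(unwrapped, axis=(0, 1)) # spatial phase gradient
    gx = dphi_x.mean(axis=(0, 1))
    gy = dphi_y.mean(axis=(0, 1))
    # Waves travel opposite the mean phase gradient across the grid.
    return np.arctan2(-gy, -gx)

# Demo: an 8 Hz plane wave moving along the grid's +x axis.
rows, cols, T = 4, 8, 500
t = np.arange(T) / 1000.0                                # 1 kHz sampling
x = np.arange(cols)[None, :, None]
plane = np.sin(2 * np.pi * 8 * t[None, None, :] - 0.5 * x)
demo = np.tile(plane, (rows, 1, 1))
print(np.degrees(wave_direction(demo)[250:255]))         # ~0 deg: travels along +x
```

A direction estimate that flips sign over time, as in the word-recall observations below, would show up here as the angle swinging by roughly 180 degrees.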

Wave Direction Changes with Cognitive Activity

Two of the patients were awake and interacting during surgery. Kleen was able to show them photos of common objects, such as a dog, and ask them to recall the word for it. Electrode data showed that while one patient was recalling the word, cycles of activity consistently traveled from the back of the hippocampus toward the front. Seconds later, the cycles of activity changed, traveling in the opposite direction. "The direction of wave travel may be a biomarker reflecting the cognitive process the patient is engaged in at that moment," Kleen said.

These initial observations are just the tip of the iceberg, he said. The next steps are to make observations with an even higher resolution set of electrodes and to observe neuronal activity in patients performing more complex cognitive tasks. Ultimately, he hopes the information gained could lead to treatments using deep brain stimulation to enhance the neurostimulator therapies that are showing great success in epilepsy.

"The goal of this research is to accelerate our understanding of how the hippocampus works, so that we can address the damage to it that we see in patients with epilepsy and Alzheimer's disease," Kleen said. "If we find that, in some patients, the waves don't travel in the proper way, we can design more sophisticated stimulation patterns that may be more effective at preventing seizures or restoring cognition."

Credit: 
University of California - San Francisco

Less wastage during production of marble slabs in the Roman imperial period than today

image: Hall of the ancient Roman villa in Ephesus with its restored marble slabs, which have now been examined in more detail

Image: 
photo/©: Sinan Ilhan

When it comes to ancient Roman imperial architecture, most people usually have a mental image of white marble statues, columns, or slabs. While it is true that many buildings and squares at that time were decorated with marble, it was frequently not white but colored marble that was employed, such as the green-veined Cipollino Verde, which was extracted on the Greek island of Euboea. Because marble was very expensive, it was often placed in thin slabs as a cladding over other, cheaper stones. "To date, however, no actual remains of marble workshops from the Roman imperial era have been found, so little is known about marble processing during this period," said Professor Cees Passchier of the Institute of Geosciences at Johannes Gutenberg University Mainz (JGU). Together with other researchers based in Mainz, Turkey, and Canada, he has now finished analyzing the marble cladding of a second century A.D. Roman villa. As the researchers detail in the online edition of the Journal of Archaeological Science: Reports, they utilized special software normally used for the 3D modeling of geological structures. They discovered that the material loss during marble slab production at the time was likely lower than it is today.

The researchers examined, photographed, and measured 54 restored slabs of Cipollino Verde, each measuring around 1.3 square meters, which had been used to decorate the walls of a villa in ancient Ephesus on the west coast of Turkey. From the saw marks on one of the slabs, they were able to infer that the slabs had been cut in a water-powered sawmill, in effect using what we today know as hydraulic metal saws. Using reconstructions based on the slab patterns, the research team was also able to conclude that a total of 42 slabs had been sawn from a single marble block weighing three to four tons. Forty of these were subsequently mounted on the walls in the order in which they were produced, arranged side by side in book-matched pairs to produce a symmetrical pattern. Finally, with the help of the software, the researchers created a three-dimensional model of the marble block, which in turn enabled them to draw conclusions about the material wastage during the production of the slabs. "The slabs are about 16 millimeters thick and the gaps between them, caused by sawing and subsequent polishing, are about 8 millimeters wide. This material loss attributable to production equates to around one third and is therefore less than the rates now commonly associated with many forms of modern marble production," Passchier pointed out. "We can therefore conclude that marble extraction during the imperial period was remarkably efficient."
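The one-third figure follows directly from the reported dimensions - every 16 mm slab costs a further 8 mm of kerf and polishing loss - and the breakage figure from two lost slabs out of 42. A quick check:

```python
slab_mm = 16   # finished slab thickness
kerf_mm = 8    # material lost to sawing and polishing per slab

loss_fraction = kerf_mm / (slab_mm + kerf_mm)
print(f"production loss: {loss_fraction:.0%}")   # ~33%, i.e. about one third

# Breakage: 2 of the 42 slabs cut from the block never reached the wall.
print(f"breakage loss: {2 / 42:.1%}")            # ~4.8%, rounded to 5 percent
```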

The researchers also found that although 42 slabs had been sawn from one original marble block, two had not been fixed to the walls of the hall. "The arrangement of the slabs on the villa walls suggests these slabs were most likely broken, possibly during polishing or their subsequent transportation," added Passchier. "This would mean that the amount lost due to breakage would be 5 percent, which would also be an astonishingly low figure." This small loss leads Passchier to assume that the entire marble block had been transported to Ephesus and that the slabs were then cut and polished there.

Credit: 
Johannes Gutenberg Universitaet Mainz

Nagoya University scientists reveal unprecedentedly versatile new DNA staining probe

image: Allowing discrimination between organelle DNA using low phototoxicity visible light, Kakshine offers easy imaging even with cutting edge microscopy techniques.

Image: 
Yoshikatsu Sato

A group of scientists at Nagoya University, Japan, have developed an incredibly versatile DNA fluorescent dye named 'Kakshine' - after Dr Kakishi Uno, a former Nagoya University student and one of its developers, and also because it makes the nucleus shine brightly, the nucleus being pronounced 'kaku' in Japanese. Dr Uno, with Dr Yoshikatsu Sato and Nagisa Sugimoto, the other two members of the research team at the Institute of Transformative Bio-Molecules (ITbM), succeeded in developing a DNA-binding fluorescent dye with a pyrido cyanine backbone that satisfies the three principal qualities required of such a dye - high selectivity for DNA, excitation by visible light with limited phototoxicity, and applicability to a wide range of organisms - in a way that no previous dye has.

Beyond this central set of functions, Kakshine and its derivatives are highly compatible with cutting-edge microscopy techniques. They represent the first dye of their kind to achieve super-resolution imaging of mitochondrial DNA in living cells with STED imaging, a kind of microscopy whose resolution exceeds the diffraction limit of light. They also enable deep-tissue imaging by two-photon excitation and discrimination of different organelle DNAs with a single dye by fluorescence lifetime imaging.

Kakshine is an exceptionally versatile new dye that improves upon the capabilities of current-generation fluorescent DNA dyes and solves their shortcomings. Moreover, with applications in the medical and life science fields including electrophoresis, quantitative PCR, and flow cytometry, Kakshine is expected to make a splash as a next-generation tool for DNA analysis.

Credit: 
Institute of Transformative Bio-Molecules (ITbM), Nagoya University