
Gene-network analysis is a valuable new tool for understanding Alzheimer's disease

image: Overview of the present study. (Image: Osaka University)

Osaka, Japan -- Alzheimer's disease (AD) is known to involve interactions among many different genes, making it difficult to pinpoint the specific mechanisms. But now, researchers from Japan have found a new way to identify genes implicated in neurodegeneration in AD.

In a study published this month in Human Molecular Genetics, researchers from Osaka University, Niigata University, and the National Center for Geriatrics and Gerontology have revealed genes that trigger specific changes in connections between proteins, called protein domain networks (PDNs), that are significantly associated with neurodegeneration in AD.

Network analysis is effective for identifying features of AD in humans, as well as identifying genes that control pathological changes in network properties. The researchers conducted network analysis of changes in PDNs to determine whether this process could be used to reveal genetic changes associated with AD pathology throughout the different stages of the disease.
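The kind of comparison described above, tracking how a network's interactions change across disease stages, can be illustrated with a small sketch in Python. The edges and stage labels below are hypothetical examples for illustration only, not data from the study; only the gene name RAC1 is taken from the article.

```python
# Toy sketch of stage-by-stage network comparison for protein domain
# networks (PDNs). Edges and stages are hypothetical, not study data.

# Each stage's PDN as a set of undirected interaction pairs.
pdn_by_stage = {
    "control":  {("RAC1", "CDC42"), ("RAC1", "PAK1"),
                 ("CDC42", "WASP"), ("PAK1", "LIMK1")},
    "early_AD": {("RAC1", "CDC42"), ("RAC1", "PAK1"), ("CDC42", "WASP")},
    "late_AD":  {("RAC1", "CDC42")},
}

def degree(pdn, gene):
    """Number of interactions in the network that involve a given gene."""
    return sum(gene in edge for edge in pdn)

# A shrinking edge count across stages mirrors the "collapse" of PDNs
# described in the article; a gene whose degree drops sharply is a
# candidate driver of that collapse.
for stage, pdn in pdn_by_stage.items():
    print(stage, "edges:", len(pdn), "RAC1 degree:", degree(pdn, "RAC1"))
```

In the real study the networks are built from protein interaction and gene expression data, and the statistics are far richer, but the stage-wise comparison follows this same pattern.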

"PDNs can vary widely among tissues and cell types and have different structures in normal versus disease states," explains Masataka Kikuchi, lead author of the study. "We wanted to see if we could use an integrated network approach to identify new genes associated with the collapse of PDNs in AD," adds Michiko Sekiya, co-lead author.

To do this, the researchers examined protein interaction data and gene expression data from postmortem brain samples from patients with AD to generate PDNs. They found that the deterioration of PDNs occurred at specific stages of AD and identified RAC1 as a specific gene that plays a key role in the alteration of PDNs. They then examined the effects of deactivating RAC1 in the fruit fly Drosophila.

"We found reduced levels of RAC1 in postmortem brain samples from patients with AD, and these changes were validated by gene expression levels from a publicly available data set" says senior author Koichi M. Iijima. "Further, inhibiting RAC1 in Drosophila induced age-dependent changes in behavior, accompanied by neurodegeneration."

In addition, they found that the number of interactions constituting PDNs in three regions from AD brains decreased with each stage of AD.

"These data provide further evidence that PDNs collapse during the progression of AD," says Norikazu Hara, co-lead author. "Thus, changes in PDNs may be a key element of neuronal dysfunction and neurodegeneration that occurs in AD."

These results indicate that an integrated network approach is advantageous in determining the factors that lead to neurodegeneration in AD, which may improve the chances of uncovering new biomarkers and therapeutic treatments.

Credit: 
Osaka University

Students often do not question online information

The Internet and social media are among the most frequently used sources of information today. Students, too, often prefer online information rather than traditional teaching materials provided by universities. According to a study conducted by Johannes Gutenberg University Mainz (JGU) and Goethe University Frankfurt, students struggle to critically assess information from the Internet and are often influenced by unreliable sources. In this study, students from various disciplines such as medicine and economics took part in an online test, the Critical Online Reasoning Assessment (CORA). "Unfortunately, it is becoming evident that a large proportion of students are tempted to use irrelevant and unreliable information from the Internet when solving the CORA tasks," reported Professor Olga Zlatkin-Troitschanskaia from JGU. The study was carried out as part of the Rhine-Main Universities (RMU) alliance.

Critical evaluation of online information and online sources is particularly important today

Learning using the Internet offers many opportunities, but it also entails risks. It has become evident that not only "fake news" but also "fake science" with scientifically incorrect information is being spread on the Internet. This problem becomes particularly apparent in the context of controversially discussed social issues such as the current corona crisis, but it actually goes much deeper. "Having a critical attitude alone is not enough. Instead, Internet users need skills that enable them to distinguish reliable from incorrect and manipulative information. It is therefore particularly important for students to question and critically examine online information so they can build their own knowledge and expertise on reliable information," stated Zlatkin-Troitschanskaia.

To investigate how students deal with online information, Professor Olga Zlatkin-Troitschanskaia and her team have developed a new test based on the Civic Online Reasoning (COR) assessment developed by Stanford University. During the assessment, the test takers are presented with short tasks. They are asked to freely browse the Internet, focusing on relevant and reliable information that will help them to solve the tasks within the relatively short time frame of ten minutes, and to justify their solutions using arguments from the online information they used.

CORA testing requires complex and extensive analysis

The analysis of the results is based on the participants' responses to the tasks. In addition, their web search activity while solving the tasks is recorded to examine their strengths and weaknesses in dealing with online information in more detail. "We can see which websites the students accessed during their research and which information they used. Analyzing the entire process requires complex analyses and is very time-consuming," said Zlatkin-Troitschanskaia. The assessments have so far been carried out in two German federal states. To date, 160 students from different disciplines have been assessed; the majority of the participants studied medicine or economics and were in their first or second semester.

Critical online reasoning skills should be specifically promoted in higher education

The results are striking: almost all test participants had difficulties solving the tasks. On a scale of 0 to 2 points per task, the students scored only 0.75 points on average, with the results ranging from 0.50 to 1.38 points. "The majority of the students did not use any scientific sources at all," said Zlatkin-Troitschanskaia, pointing out that no domain-specific knowledge was required to solve the CORA tasks. "We are always testing new groups of students, and the assessment has also been continued as a longitudinal study. Since we first started conducting these assessments two years ago, the results are always similar: the students tend to achieve low scores." However, students in higher semesters perform slightly better than students in their first year of study. Critical online reasoning skills could therefore be fostered during the course of studies. In the United States, a significant increase in these kinds of skills was observed only a few weeks after implementing newly developed training approaches.

The study shows that most students do not succeed in correctly evaluating online sources in the given time and in using relevant information from reliable sources on the Internet to solve the tasks. "As we know from other studies, students are certainly able to adequately judge the reliability of well-known media portals and Internet sources. We could build on this fact and foster the skills required to critically evaluate new sources and online information and to use the Internet in a reflected manner to generate warranted knowledge," concluded Professor Olga Zlatkin-Troitschanskaia.

In research on this topic, skills related to critically dealing with online information and digital sources are regarded as an essential prerequisite for learning in the 21st century. However, there are still very few training approaches and assessments available for students to foster these skills, especially online. "The RMU study is still in the early stages of development. We have only just developed the first test of this kind in Germany," Zlatkin-Troitschanskaia pointed out. "We are currently in the process of developing teaching/learning materials and training courses and of testing their effectiveness. The analysis of the processing will be particularly useful when it comes to offering students targeted support in the future."

Credit: 
Johannes Gutenberg Universitaet Mainz

Traditional vegetable diet lowers the risk of premature babies

It turns out we should follow our parents' advice when we're thinking about becoming parents ourselves, with a study finding that eating the traditional 'three-vegies' before pregnancy lowers the risk of a premature birth.

University of Queensland PhD candidate Dereje Gete analysed the diets of nearly 3500 women and found high consumption of carrots, cauliflower, broccoli, pumpkin, cabbage, green beans and potatoes before conception helped women reach full term pregnancy.

"Traditional vegetables are rich in antioxidants or anti-inflammatory nutrients, which have a significant role in reducing the risk of adverse birth outcomes," Mr Gete said.

"Women depend on certain stored nutrients such as calcium and iron before conception, which are critical for placenta and foetus tissue development.

"Starting a healthier diet after the baby has been conceived may be too late, because babies are fully formed by the end of the first trimester," he said.

Professor Gita Mishra said the study suggested dietary intervention and strategies to change behaviour may be helpful when women start thinking about having a baby.

"People born prematurely face a greater risk of metabolic and chronic diseases in adulthood, as well as poor cognitive development and academic performance," Professor Mishra said.

Premature births, which are births before 37 weeks of gestation, are the leading cause of death in Australian children and affect 8.5 per cent of births each year, a figure which is trending upwards.

Credit: 
University of Queensland

Chinese scientists optimize strontium content to improve bioactive bone cement

Researchers from the Shenzhen Institutes of Advanced Technology (SIAT) of the Chinese Academy of Sciences have developed a new strontium-substituted bioactive glass (BG) bone cement that optimizes the concentration of strontium to improve peri-implant bone formation and bone-implant contact.

BG bone cement is a minimally invasive alternative to arduous and risky autologous bone grafts and allografts for the treatment of large bone defects.

Previous studies from SIAT found that adding strontium (Sr) to bioactive borate glass cement enhanced its osteogenic capacity in vivo. However, researchers didn't know how much Sr was needed to optimize the cement's physicochemical properties and capacity to stimulate bone regeneration. Likewise, they didn't clearly understand the molecular mechanism underlying this stimulation.

Their current research answers these questions.

In the present study, the scientists found that adding Sr to BGs could modulate the physicochemical properties and osteogenic activity of BG-based bone cements. For example, adding Sr sped up the setting reaction of bone cements and slowed down their degradation rate.

In order to determine an optimum level of Sr substitution, the researchers created bone cements composed of bioactive borosilicate glass particles substituted with varying amounts of Sr (0 mol% to 12 mol% SrO) and evaluated them in vitro and in vivo.

They discovered that osteogenic characteristics were optimally enhanced with a cement (designated BG6Sr) composed of particles substituted with 6 mol% SrO. When implanted in rabbit femoral condyle defects, the BG6Sr cement supported better peri-implant bone formation and bone-implant contact compared to cements substituted with 0 mol% or 9 mol% SrO.

The researchers also discovered that the underlying stimulation mechanism of Sr-containing bone cements involves the activation of the Wnt/β-catenin signaling pathway in the osteogenic differentiation of human bone marrow mesenchymal stem cells (hBMSCs).

These results show that BG-based bone cements offer a promising combination of physicochemical properties and biological performance for the minimally invasive treatment of bone defects when Sr is appropriately added.

Credit: 
Chinese Academy of Sciences Headquarters

Novel 3D imaging technology makes fluorescence microscopy more efficient

image: Dr Kevin Tsia (1st from right) and his team developed a new optical imaging technology to make 3D fluorescence microscopy more efficient and less damaging. (From left: Dr Yuxuan Ren, Dr Queenie Lai and Dr Kevin Tsia) (Image: The University of Hong Kong)

Scientists have been using fluorescence microscopy to study the inner workings of biological cells and organisms for decades. However, many of these platforms are too slow to follow the biological action in 3D, and their strong light illumination can damage living specimens.

To address these challenges, a research team led by Dr Kevin Tsia, Associate Professor of the Department of Electrical and Electronic Engineering and Programme Director of Bachelor of Engineering in Biomedical Engineering of the University of Hong Kong (HKU), developed a new optical imaging technology - Coded Light-sheet Array Microscopy (CLAM) - which can perform 3D imaging at high speed, and is power efficient and gentle to preserve the living specimens during scanning at a level that is not achieved by existing technologies.

This advanced imaging technology was recently published in Light: Science & Applications. A US patent application has been filed for the innovation.

"CLAM allows 3D fluorescence imaging at high frame rate comparable to state-of-the-art technology (~10's volumes per second). More importantly, it is much more power efficient, being over 1,000 times gentler than the standard 3D microscopes widely used in scientific laboratories, which greatly reduces the damage done to living specimens during scanning," explained Dr Tsia.

Existing 3D biological microscopy platforms are slow because the entire volume of the specimen has to be sequentially scanned and imaged point-by-point, line-by-line or plane-by-plane. In these platforms, a single 3D snapshot requires repeated illumination of the specimen, often with light thousands to millions of times more intense than sunlight. This can damage the specimen itself and is therefore unfavorable for long-term biological imaging in applications such as anatomical science, developmental biology and neuroscience.

Moreover, these platforms often quickly exhaust the limited fluorescence "budget" - a fundamental constraint that fluorescent light can only be generated upon illumination for a limited period before it permanently fades out in a process called "photo-bleaching", which sets a limit to how many image acquisitions can be performed on a sample.

"Repeated illumination on the specimen not only accelerates photo-bleaching, but also generates excessive fluorescence light that does not eventually form the final image. Hence, the fluorescence "budget" is largely wasted in these imaging platforms," Dr Tsia added.

The heart of CLAM is transforming a single laser beam into a high-density array of "light-sheets" with the use of a pair of parallel mirrors, to spread over a large area of the specimen as fluorescence excitation.

"The image within the entire 3D volume is captured simultaneously (i.e. parallelized), without the need to scan the specimen point-by-point or line-by-line or plane-by-plane as required by other techniques. Such 3D parallelization in CLAM leads to a very gentle and efficient 3D fluorescence imaging without sacrificing sensitivity and speed," as pointed out by Dr Yuxuan Ren, a postdoctoral researcher of the work. CLAM also outperforms the common 3D fluorescence imaging methods in reducing the effect of photo-bleaching.

To preserve the image resolution and quality in CLAM, the team turned to Code Division Multiplexing (CDM), an image encoding technique which is widely used in telecommunication for sending multiple signals simultaneously.

"This encoding technique allows us to use a 2D image sensor to capture and digitally reconstruct all image stacks in 3D simultaneously. CDM has never been used in 3D imaging before. We adopted the technology, which became a success," explained by Dr Queenie Lai, another postdoctoral researcher who developed the system.

As a proof-of-concept demonstration, the team applied CLAM to capture 3D videos of fast microparticle flow in a microfluidic chip at a volume rate of over 10 volumes per second, comparable to state-of-the-art technology.

"CLAM has no fundamental limitation in imaging speed. The only constraint is from the speed of the detector employed in the system, i.e. the camera for taking snapshots. As high-speed camera technology continually advances, CLAM can always challenge its limit to attain an even higher speed in scanning," highlighted by Dr Jianglai Wu, the postdoctoral research who initiated the work.

The team has taken a step further by combining CLAM with HKU LKS Faculty of Medicine's newly developed tissue clearing technology to perform 3D visualization of mouse glomeruli and intestinal blood vasculature at high frame rates.

"We anticipate that this combined technique can be extended to large-scale 3D histopathological investigation of archival biological samples, like mapping the cellular organization in brain for neuroscience research." Dr Tsia said.

"Since CLAM imaging is significantly gentler than all other methods, it uniquely favours long term and continuous 'surveillance' of biological specimen in their living form. This could potentially impact our fundamental understanding in many aspects of cell biology, e.g. to continuously track how an animal embryo develops into its adult form; to monitor in real-time how the cells/organisms get infected by bacteria or viruses; to see how the cancer cells are killed by drugs, and other challenging tasks unachievable by existing technologies today," Dr Tsia added.

CLAM can be adapted to many current microscope systems with minimal hardware or software modification. Taking advantage of this, the team is planning to further upgrade the current CLAM system for research in cell biology, animal and plant developmental biology.

Credit: 
The University of Hong Kong

Reducing the risk to children's health in flood-prone areas of India

Monsoon rainfall has become more unpredictable in India. Floods and droughts have become more common and pose multiple risks to human health and wellbeing, with children under five being particularly vulnerable. New research finds that more assistance needs to be provided to communities in flood-prone areas to protect children under five from undernutrition.

Little is known about the risks climate change will pose to human health in South Asia. New research done at IIASA and published in the journal PLOS ONE shows that the effects of climate shocks vary across different parts of India and that more support needs to be provided in areas at risk of floods, with interventions focused on households with pregnant women and infants. The study combined, for the first time, geographical climate data with new data from the Indian National Family Health Survey 2015-16 to examine the relationship between undernutrition in children under five and exposure to excessive rainfall in different parts of India. According to the results, the effects of excessive rainfall are not equally distributed across India, with impacts varying depending on different circumstantial factors.

The authors highlight that exposure to excessive monsoon rainfall in-utero and during the first year increases the risk of undernutrition among children under the age of five, which can impair their long-term growth and development. Furthermore, children living in households without access to safe sanitation facilities were found to face an increased risk of stunting due to heavy monsoon rain. Poor sanitation can contribute to undernutrition, particularly in young children, through the transmission of infections, such as diarrhoea.

"Although there has been some progress towards reducing childhood undernutrition in India, it has been slow in comparison with other developing countries. Climate change could slow this trend further, or even reverse it. The transmission of infectious diseases like diarrhoea, is one way in which heavy rainfall can affect children's health. Over half of rural households in India do not have access to safe sanitation facilities, which means that their living environment is likely to be contaminated during heavy rainfall," explains lead author Anna Dimitrova, a researcher with the Wittgenstein Centre for Demography and Global Human Capital, which is a collaboration between the IIASA World Population Program, the Vienna Institute of Demography of the Austrian Academy of Sciences, and the University of Vienna.

"We found that heavy rainfall increases the risk of childhood undernutrition and diarrheal diseases in the tropical wet and humid sub-tropical zones. These areas usually receive high monsoon rainfall and are more susceptible to floods, which can contaminate water and damage crops. In contrast, during periods of higher than average rainfall in the relatively dry mountainous regions in northern India, the risk of childhood undernutrition and diarrheal diseases is shown to be reduced. This is because higher rainfall during the monsoon season may improve access to clean water and increase crop yield in this region," adds Jayanta Bora, study coauthor and former researcher in the World Population Program at IIASA, who is now working at the Indian Institute of Dalit Studies (IIDS), New Delhi.

These results have significant implications for policymakers, as they facilitate a better understanding of the potential hazards for children and create the opportunity to develop early-response systems. Interventions should be focused on households with pregnant women and infants; some of the measures that can be taken include increased immunization coverage, improved access to healthcare, and the provision of safe drinking water and sanitation facilities in flood-prone areas.

"Improving adaptation capacities and creating early warning systems will be vital to minimize the future risks associated with climate change. Alleviating human suffering is the ultimate goal of such research. Ensuring that young children grow up to be healthy and productive individuals also brings positive effects to society at large," concludes Dimitrova.

Credit: 
International Institute for Applied Systems Analysis

Beacon in space: BRITE Constellation observes complete nova eruption for the first time

image: Artistic representation of a nova eruption: During a nova eruption a white dwarf sucks matter from its companion star and stores this mass on its surface until the gas pressure becomes extremely high. (Image: © K. Ulaczyk, Warsaw University Observatory)

Since the beginning of the BRITE Constellation in 2013 - a mission in which the first two Austrian satellites were involved - the five nanosatellites have taken millions of images. However, the recordings of a complete nova eruption are unique worldwide.

The nova phenomenon

During a nova eruption, a white dwarf sucks matter from its companion star and stores this mass on its surface until the gas pressure becomes extremely high. An explosion then occurs in which hydrogen is burned, creating enormous shock fronts. These shocks are much stronger than, for example, those generated by supersonic aircraft in the Earth's atmosphere. Instead of sound, however, they produce an enormous burst of light and high-energy radiation, such as gamma rays and X-rays. This means that stars that could previously only be observed with telescopes can suddenly be seen with the naked eye.

"But what causes a previously unimpressive star to explode? This was a problem that has not been solved satisfactorily until now," says Prof. Werner Weiss from the Department of Astrophysics at the University of Vienna. An explosion of Nova V906 in the constellation Carina (Latin for keel of a ship) has now provided answers and confirmed this explanatory concept, long after the explosion took place locally. "After all, this nova is so far away from us that its light takes about 13,000 years to reach the earth," explains Weiss. The event could be documented by the BRITE Constellation between March and July 2018.

Accidental observation

This first-time ever observation of a complete nova eruption came about by chance. The BRITE Constellation had just photometrically observed 18 stars in the constellation Carina continuously over several weeks when the nova suddenly appeared in the field of view. Dr. Rainer Kuschnig, Operations Manager of the BRITE Constellation at TU Graz, discovered the eruption during his daily inspection of the five nanosatellites. "Suddenly there was a star on our records that wasn't there the day before. I'd never seen anything like it in all the years of the mission!"

A short search among the latest astronomical alerts showed that the new star was Nova Carinae 2018. Dr. Kuschnig informed the 12-member leadership of the BRITE Constellation's scientific team, the BRITE Science Team. "It is fantastic that for the first time a nova could be observed by our satellites even before its actual eruption and until many weeks later," says Prof. Otto Koudelka, project manager of the BRITE Austria (TUGSAT-1) satellite at TU Graz.

"This fortunate circumstance was decisive in ensuring that the nova event could be recorded with unprecedented precision," explains Prof. Konstanze Zwintz, head of the BRITE Science Team, from the Institute for Astro- and Particle Physics at the University of Innsbruck. Zwintz immediately realised "that we had access to observation material that was unique worldwide." The cooperation of BRITE Constellation with Dr. Elias Aydi from Michigan State University, USA, led to the publication now published in Nature Astronomy with the title "Direct evidence for shock-powered optical emission in a nova".

Observing stars from space

The BRITE Constellation is an ensemble of small satellites that record the light of selected stars in the sky by high-precision photometry. From an altitude of about 800 km, the BRITE Constellation observes stars with magnitudes between 0 and 6 in optical light, with the faintest stars just barely visible to the naked eye under excellent observation conditions. Typically, 15 to 20 stars are measured continuously for about half a year in a 24 square degree field - an area as large as, for example, the entire constellation of Orion or the Plough (Big Dipper).

The BRITE Constellation was initiated with the launch of the first two Austrian satellites, BRITE-Austria/TUGSAT-1 and UniBRITE in 2013. Poland and Canada joined in 2014 with one pair of identical satellites each. The BRITE Constellation has since studied more than 660 of the brightest stars in the sky.

Credit: 
Graz University of Technology

Novel high-speed microscope captures brain neuroactivities

image: Dr. Kevin Tsia, Associate Professor of the Department of Electrical and Electronic Engineering and Programme Director of Bachelor of Engineering in Biomedical Engineering of the University of Hong Kong (HKU) (Image: The University of Hong Kong)

Our brain contains tens of billions of nerve cells (neurons) which constantly communicate with each other by sending chemical and electrical flashes, each lasting about one millisecond (0.001 sec). Every millisecond, billions of these swift-flying flashes travel across a giant star-map in the brain, lighting up in tortuous, glittering patterns. They are the origins of all body functions and behaviours such as emotions, perceptions, thoughts, actions, and memories, and, in the case of abnormalities, of brain diseases such as Alzheimer's and Parkinson's disease.

One grand challenge for neuroscience in the 21st century is to capture these complex flickering patterns of neural activity, which is the key to an integrated understanding of large-scale brain-wide interactions. Capturing these swift-flying signals live has long been a challenge to neuroscientists and biomedical engineers: it would require a microscope fast enough to follow them inside the brain, which has not been possible so far.

A research team led by Dr Kevin Tsia, Associate Professor of the Department of Electrical and Electronic Engineering and Programme Director of Bachelor of Engineering in Biomedical Engineering of the University of Hong Kong (HKU), and Professor Ji Na, from the Department of Molecular & Cell Biology, University of California, Berkeley (UC Berkeley), offers a novel solution with their super high-speed two-photon fluorescence microscope, which has successfully recorded the millisecond electrical signals in the neurons of an alert mouse.

The new technique is minimally invasive to the animal being tested, in contrast to the traditional method that requires inserting an electrode into the brain tissue. Not only is it less damaging to the neurons, it can also pinpoint individual neurons and trace their firing paths, millisecond by millisecond.

The result of this ground-breaking work has recently been published in the academic journal Nature Methods. The project was funded by the National Institutes of Health, U.S.

At the heart of the high-speed microscope is an innovative technique called FACED (free-space angular-chirp-enhanced delay imaging), developed by Dr Tsia's team earlier (note 1). FACED makes use of a pair of parallel mirrors which generate a shower of laser pulses to create a super-fast sweeping laser beam at least 1,000 times faster than existing laser-scanning methods.

In the experiment, the microscope projected a beam of sweeping laser over the mouse's brain and captured 1,000 to 3,000 full 2D scans of a single mouse brain layer (of the neocortex) every second. To probe the genuine electrical signals that pulse between the neurons, the team inserted a biosensor (protein molecules), developed by Dr Michael Lin of Stanford University, into the neurons of the mouse brain.

"These engineered proteins will light up (or fluoresce) whenever there is a voltage signal passes through the neurons. The emitted light is then detected by the microscope and formed into a 2D image that visualises the locations of these voltage changes," said Dr Tsia.

"This is really an exciting result as we now can peek into the neuronal activities, that were once obscured and could provide the fundamental clues to understanding brain functions and more importantly brain diseases," he added.

Apart from electrical signals, the team also used the microscope to capture the slow-motion of chemical signals in the mouse brain, such as calcium and glutamate, a neurotransmitter, as deep as one-third of a millimeter from the brain's surface.

A notable advantage of this technique is its ability to track weak neuronal signals that do not trigger the neuron to fire (called sub-threshold signals). These signals are often difficult to capture and detect, and although they may occur in many disease conditions in the brain, they have not yet been studied in detail because of the lack of high-speed techniques like the one developed by the team.

Another important feature of the novel technique is that it is minimally invasive. The classical method for recording electrical firing in the brain is to physically embed or implant electrodes in the brain tissue. However, such physical intrusion could cause damage to the neurons, and can only detect fuzzy signals from a couple of neurons.

"This is so far a one-of-its-kind technology that can detect millisecond-changing activities of individual neurons in the living brain. So this is, I would say, the cornerstone of neuroscience research to more accurately 'decode' brain signals," Dr Tsia said. He added that the team would work to advance the capability of the microscope.

"We are working to combine other advanced microscopy techniques to achieve imaging at higher resolution, with a wider view, and deeper into the neocortex, which is about 1 millimeter thick. This will allow us to probe deeper into the brain for a better and more comprehensive understanding of its functions," he added.

Credit: 
The University of Hong Kong

Blood pressure awareness and control rates in Canadians are slipping alarmingly, particularly among women

Philadelphia, April 14, 2020 - In a new study that draws attention to a growing cardiovascular health concern, investigators report that an increasing number of Canadians, particularly women, are unaware that they have high blood pressure, and they are not getting treatment to control their hypertension. The study appears in the Canadian Journal of Cardiology, published by Elsevier.

"The results of this study should serve as a serious warning that high blood pressure remains one of the leading public health threats in Canada, as in all other developed countries. The widening disparity between how women and men are treated is concerning, particularly as high blood pressure continues to be the leading cause of preventable heart attacks, strokes, and premature death in our country," stated lead investigator Alexander A. Leung, MD, MPH, assistant professor in the departments of Medicine and Community Health Sciences, Cumming School of Medicine, University of Calgary, Calgary, AB, Canada.

Canada has been a world leader in hypertension care, outperforming other Western countries over many years. Between 1990 and 2007, the management of hypertension improved due to a strong publicly funded primary healthcare system, the success of a coordinated national strategic plan, and a collaboration among national hypertension organizations on guideline implementation and evaluation. In recent years, however, developments have emerged with the potential to adversely impact hypertension care, including declining federal government support for surveillance and implementation, a refocusing of non-governmental organizations away from implementation of hypertension recommendations, and declining industry support as patented drugs were replaced by generics.

This study was based on a national survey conducted by Statistics Canada from 2007 to 2017. In addition to conducting in-person interviews, the study also included carefully performed blood pressure checks. Using the results, the investigators were able to estimate how many people were affected by high blood pressure, along with the number who were aware of their condition, appropriately treated, and adequately controlled.

Nearly a quarter of Canadian adults (or 5.8 million people) suffer from hypertension. While prevalence remained stable over the 10 years, the researchers found a noticeable plunge in the percentage of people who were aware that they had high blood pressure, the percentage treated with medications, and the percentage achieving good blood pressure control; the declines were especially marked in 2016 to 2017. For the first time in decades, fewer than 75 percent of people with high blood pressure were treated and fewer than 60 percent were controlled.

Dr. Leung noted, "This means that hundreds of thousands of people living with high blood pressure do not even know they have a problem, a large number are not receiving adequate treatment, and as a result, many people are suffering from preventable deaths and disability from heart attacks and strokes. Consistent with our findings, public data sources also indicate a rise in cardiovascular disability and death in Canada in the same period."

Importantly, women were impacted much more than men, driving the overall decline. While the prevalence of hypertension did not differ significantly between men and women, a smaller number of women were taking antihypertensive medications compared to men, and among those who were treated, blood pressure was less commonly controlled. While treatment and control rates remained generally stable for men (at around 79 percent for treatment and 67 percent for control), treatment in women has declined to 65 percent and control to 49 percent.

"The results of this study should serve as a serious warning that high blood pressure remains a major public health threat in Canada," said Dr. Leung. "We should not be complacent about national blood pressure control, but urgently re-engage the public, doctors, and policy makers in government about the importance of blood pressure treatment and control for people of all ages and genders."

Credit: 
Elsevier

A new species of endemic black iguana in the Caribbean is proposed for urgent conservation

image: A basking iguana on the windward coast of Saba, optimizing its warming with a curved posture when the sun is low on the horizon.

Image: 
M. Breuil

A newly described endemic species of melanistic black iguana (Iguana melanoderma), discovered on the islands of Saba and Montserrat in the Lesser Antilles (Eastern Caribbean), appears to be threatened by unsustainable harvesting (including for the pet trade) and by competition and hybridization with escaped or released invasive alien iguanas from South and Central America. An international research group calls for urgent conservation measures in an article recently published in the open-access journal ZooKeys.

So far, three species of iguana have been known from the Lesser Antilles: the Lesser Antillean iguana (Iguana delicatissima), a species endemic to the northernmost islands of the Lesser Antilles, and two introduced ones: the common iguana (Iguana iguana iguana) from South America and the green iguana (Iguana rhinolopha) from Central America.

The newly described species is characterised by private microsatellite alleles, unique mitochondrial ND4 haplotypes and a distinctive black spot between the eye and the ear cavity (tympanum). Juveniles and young adults have a dorsal carpet pattern, and the colouration darkens with age (except for the anterior part of the snout).

In Guadeloupe, the common green iguana has already displaced the Lesser Antillean iguana through competition and hybridization, a process now also under way elsewhere in the Lesser Antilles. Potentially invasive common iguanas from the Central and South American lineages are likely to invade other islands and need to be differentiated from the endemic melanistic iguanas of the area.

The IUCN Red List classifies the green iguana as being of "Least Concern", but fails to differentiate between populations, some of which are threatened with extinction. With the new taxonomic proposal, these endemic insular populations can be considered a conservation unit with their own assessments.

"With the increase in trade and shipping in the Caribbean region and post-hurricane restoration activities, it is very likely that there will be new opportunities for invasive iguanas to colonize new islands inhabited by endemic lineages," shares the lead researcher prof. Frédéric Grandjean from the University of Poitiers (France).

Scientists describe the common melanistic iguanas from the islands of Saba and Montserrat as a new taxon and aim to establish its relationships with other green iguanas. That can help conservationists accurately differentiate this endemic lineage from invasive iguanas and investigate its ecology and population biology on these two very small islands, which are subject to a range of environmental disturbances, including hurricanes, earthquakes and volcanic eruptions.

"Priority actions for the conservation of the species Iguana melanoderma are biosecurity, minimization of hunting, and habitat conservation. The maritime and airport authorities of both islands must be vigilant about the movements of iguanas, or their sub-products, in either direction, even if the animals remain within the same nation's territory. Capacity-building and awareness-raising should strengthen the islands' biosecurity system and could enhance pride in this flagship species," concludes Prof. Grandjean.

The key stakeholders in conservation efforts for the area are the Dutch Caribbean Nature Alliance (DCNA), the Saba Conservation Foundation (SCF), the Montserrat National Trust (MNT) and the UK Overseas Territories Conservation Forum (UKOTCF), which, the research team hopes, could take measures to protect the flagship insular iguana species, mainly against alien iguanas.

Credit: 
Pensoft Publishers

Soot may only be half the problem when it comes to cookstoves

image: Rajan Chakrabarty, assistant professor of energy, environmental and chemical engineering at the McKelvey School of Engineering, Washington University in St. Louis

Image: 
Washington University in St. Louis

A telltale signature of a cookstove, commonly used to prepare food or provide heat by burning wood, charcoal, animal dung or crop residue, is the thick, sooty smoke that rises from the flames. Its remnants, black stains left on the walls and clothes and in the lungs of the people -- usually women -- who tend to the stoves, are a striking reminder of the hazards the stoves pose both to human health and to the environment.

But soot is only part of the story when it comes to environmental impact -- about half of it, it turns out.

As the temperatures in a cookstove begin to drop, and the black smoke turns greyish-white, soot (or black carbon) emission is replaced by organic carbon.

Research from the McKelvey School of Engineering at Washington University in St. Louis has revealed that, despite their whitish appearance, organic carbon particles absorb as much sunlight in the atmosphere as black carbon, if not more. And their health effects may be worse for the nearly 2.7 billion households worldwide that use cookstoves.

The research was published in the April 14, 2020, issue of Environmental Science & Technology Letters.

To better understand the effects cookstoves are having on the environment and human health, Rajan Chakrabarty, assistant professor of energy, environmental and chemical engineering, took his research to a pollution hotspot: Chhattisgarh, in the heart of rural central India, where cookstoves are one of the biggest emitters of aerosols and greenhouse gases.

"Previous experiments had been carried out in a lab under controlled conditions, which are different from what you'll find in the field," Chakrabarty said.

Yet global mitigation and policy strategies have for 15 years been influenced by a 2005 study based on experiments carried out in a lab. That paper recommended controlling black carbon as a way to mitigate climate change in the South Asian region.

To better understand the effects of cookstove emissions in the field, Chakrabarty's team headed to Chhattisgarh, between New Delhi and Calcutta. For two weeks, they lived with residents in rural homes where people used mud chulhas, or cookstoves, to prepare food. During that time, they ran 30 tests, cooking with different fuels: wood, agricultural residue and cattle dung, all of which were locally sourced.

The success of their field study was greatly facilitated by local researcher and Chakrabarty's longtime collaborator, Professor Shamsh Pervez and his research group at Pt. Ravishankar Shukla University.

"We found the reality of a cooking cycle is a mix of black and white smoke," Chakrabarty said. The color black absorbs light, and white reflects it. "We had thought that the organic carbons counteracted the black carbon, to a degree," he said. "Soot absorbs, organics scatter."

What they found was something different. The particulates in the white smoke were absorbing light very strongly in the near-ultraviolet wavelengths. "When we looked under the hood," Chakrabarty said, "they were actually brown carbon." After completing the analysis, the team determined that the absorption was equal to, if not greater than, that of black carbon, making brown carbon an equally potent agent of atmospheric warming.

Burning biomass fuel, such as wood or dung, is the dominant source of ambient air pollution particles in the South Asian region, according to Chandra Venkataraman, professor of chemical engineering at the Indian Institute of Technology Bombay. Venkataraman was the first author of that influential 2005 Science paper.

"This work makes a novel and crucial finding, using field measurements of particulate emissions from biomass stoves, that radiation absorption from the organic carbon component could equal that from black carbon," she said.

Beyond its effects on the climate, organic carbon also poses significant health risks. Many of the particulates are what are known as high molecular weight polycyclic aromatic hydrocarbons -- compounds of carbon and hydrogen that are established carcinogens in animals and generally believed to be so in humans.

Chakrabarty's findings are applicable around the world, as the World Health Organization reports nearly half of the world's population cooks over an open fire. Cookstoves can be found in homes not only in India, but around the globe, from Senegal to Peru to Albania.

"The finding has very significant implications for regional atmospheric warming from pollution particles containing short-lived climate forcers like black and organic carbon in the context of achieving Paris Agreement temperature targets," Venkataraman said. 

Credit: 
Washington University in St. Louis

COVID-19 may impact treatment for patients with type 2 diabetes

WASHINGTON--Individuals with diabetes are at increased risk for bacterial, parasitic and viral infections. New research published in Endocrine Reviews, a journal of the Endocrine Society, illuminates how intersections of the coronavirus infection (COVID-19) and type 2 diabetes may require new approaches in treatment for hospitalized patients.

The global COVID-19 pandemic has immediate implications for the therapy of type 2 diabetes. In addition, individuals with obesity are known to be at increased risk for complications arising from influenza, and obesity is emerging as an important comorbidity for disease severity in the context of COVID-19.

"We reviewed how the pathophysiology of diabetes and obesity might intersect with COVID-19 biology and found key shared pathways and mechanisms linked to the development and treatment of type 2 diabetes," said the study's author Daniel J. Drucker, M.D., of Mount Sinai Hospital in Toronto. "Cells within the lung and gut are major sites for coronavirus entry and inflammation. These cells express key proteins like Angiotensin Converting Enzyme 2 (ACE2) and Dipeptidyl Peptidase-4 (DPP4) that are also implicated in the development of type 2 diabetes."

More studies need to be done to understand the risks and benefits of commonly used diabetes medications in patients with severe coronavirus infections. The pandemic highlights the importance of expanding innovative delivery of diabetes care and regular communication between people with diabetes and their health care providers.

The study, "Coronavirus infections and type 2 diabetes-shared pathways with therapeutic implications," was published online, ahead of print.

Credit: 
The Endocrine Society

Novel metrics suggest electronic consultations are appropriate, useful alternative to face-to-face medical appointments

1. Novel metrics suggest that electronic consultations are an appropriate and useful alternative to face-to-face medical appointments

Abstract: http://annals.org/aim/article/doi/10.7326/M19-3852
Editorial: http://annals.org/aim/article/doi/10.7326/M20-1320
URL goes live when the embargo lifts

Using novel metrics, researchers found that 70 percent of electronic consultations, or e-consults, were appropriate based on their proposed criteria and 81 percent were associated with avoided face-to-face visits. Study authors say these metrics provide meaningful insight into practice and may provide a rubric for comparison in future studies. Findings from a cohort study were published in Annals of Internal Medicine.

E-consults can improve patient access to specialists, minimize travel, and reduce unnecessary in-person visits. However, metrics to enable study of e-consults and their effect on processes and patient care are lacking.

Researchers from Brigham and Women's Hospital and Massachusetts General Hospital reviewed a random sample of 150 medical records from each of five specialties with a high volume of e-consult requests - psychiatry, hematology, dermatology, infectious diseases, and rheumatology - to assess novel metrics of e-consult appropriateness and utility. An e-consult was considered appropriate if the question could not be answered by reference to society guidelines or a point-of-care resource, was not a request for logistical information, was not urgent, and was sufficiently complex. Utility was measured by the rate of avoided face-to-face visits within 120 days of the e-consult. The authors found 70 percent of e-consults to be appropriate based on their criteria, ranging from 61 percent in rheumatology to 78 percent in psychiatry. Nearly all questions were of appropriate urgency, but some were deemed too simple or too complex. Across all specialties, 81 percent of e-consults were associated with avoided visits, ranging from 62 percent in dermatology to 93 percent in psychiatry.

Media contacts: For an embargoed PDF please contact Lauren Evans at laevans@acponline.org. To speak with the lead author, Salman Ahmed, MD, MPH, please contact Haley Bridger at hbridger@bwh.harvard.edu.

2. Levetiracetam may reduce anticoagulation effect of rivaroxaban

Abstract: http://annals.org/aim/article/doi/10.7326/L19-0712
URL goes live when the embargo lifts

Levetiracetam, a commonly used medication to prevent seizures, may reduce the anticoagulation effect of oral rivaroxaban in humans. As such, clinicians should measure direct oral anticoagulant plasma levels during treatment. A case report is published in Annals of Internal Medicine.

Clinical guidelines recommend against combining levetiracetam with oral anticoagulants because animal studies suggest that the anticonvulsant acts as a P-glycoprotein inducer, reducing rivaroxaban plasma levels. However, not everyone is convinced that levetiracetam should be avoided in patients receiving rivaroxaban, because there is little or no published evidence describing this interaction in humans.

Researchers from the University of Perugia, Perugia, Italy, report the case of a 69-year-old man who was taking rivaroxaban for atrial fibrillation and started to experience seizures in his right frontal lobe, for which he was prescribed levetiracetam. Several months later, he was clinically diagnosed with recurrent transient ischemic attacks. The clinicians measured his rivaroxaban plasma levels to determine whether low levels would explain the transient ischemic attacks and then replaced levetiracetam with lacosamide, an anticonvulsant that does not interfere with P-glycoprotein. Repeated measurement of rivaroxaban plasma levels showed a clinically relevant interaction between levetiracetam and rivaroxaban, with levetiracetam reducing rivaroxaban plasma levels and exerting a particularly strong and long-lasting effect on trough levels. The clinicians believe that this interaction is clinically important but caution that their study was limited and that they did not measure P-glycoprotein activity in the patient.

Media contacts: For an embargoed PDF please contact Lauren Evans at laevans@acponline.org. To speak with the lead author, Paolo Gresele, MD, PhD, please contact Silvia Spaccini at 0755783989 or paolo.gresele@gmail.com.

Credit: 
American College of Physicians

Study finds rise in between-workplace inequalities in the US, high-income countries

image: The proportion of total inequality that is between firms for the total (left), private (middle) and public sectors (right). The proportion of total inequality attributable to the between-workplace component has grown in every country except Hungary and Canada. Estimates are for all jobs, except for South Korea, where they are for full-time jobs only. Japan, South Korea, and US-Song only have private sector estimates. South Korea is missing for 2005.

Image: 
UMass Amherst/Tomaskovic-Devey

AMHERST, Mass. - A new analysis of earnings inequalities by an international team of 27 researchers has found that the between-workplace share of wage inequality is growing in 12 of 14 high-income countries studied, and that the countries vary a great deal in their levels and trends in overall earnings inequality.

In a new report in the Proceedings of the National Academy of Sciences (PNAS), lead author Donald Tomaskovic-Devey of the University of Massachusetts Amherst and his colleagues detail their examination of roughly 25 years of administrative records covering more than 2 billion job-years nested within more than 50 million workplace-years for 14 high-income countries: Canada, Czechia, Denmark, France, Germany, Hungary, Israel, Japan, the Netherlands, Norway, Slovenia, South Korea, Sweden and the United States.

In 12 of the countries they found that the share of inequality between workplaces is growing; Canada and Hungary were the only exceptions. Rising between-workplace inequality occurs when firms with powerful market positions simultaneously outsource production and services to temporary labor firms, subcontractors, global supply chains, franchisees, independent contractors and other low-wage firms. Firms such as Apple, Amazon, Marriott, McDonald's, Uber and Nike are prominent examples of this combination of market power and externalized labor.

"The extreme vulnerability of low-wage workers to the COVID-19 pandemic in the U.S. is linked to this trend of larger firms outsourcing risk and low-wage labor to weaker firms," Tomaskovic-Devey points out.

"Most strikingly, we find in 12 of the 14 countries examined that the organizational structure of production is shifting toward increasing between-workplace wage dispersion," the report states. "In all of those 12 countries this process is more pronounced in the private sector, but we also find rising between-workplace inequality in the public sector in eight countries."
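The between-workplace versus within-workplace split the authors describe is a standard variance decomposition: total wage dispersion separates exactly into dispersion of workplace average wages and dispersion of wages around each workplace's average. A minimal sketch, using entirely hypothetical firms and log wages (not the study's data), illustrates the arithmetic:

```python
# Illustrative sketch of a between/within-workplace variance decomposition.
# The firms and log wages below are hypothetical, not data from the study.
from statistics import mean

workplaces = {
    "firm_a": [3.0, 3.1, 2.9],
    "firm_b": [4.0, 4.2, 3.8],
    "firm_c": [2.5, 2.6, 2.4],
}

all_wages = [w for ws in workplaces.values() for w in ws]
grand_mean = mean(all_wages)
n = len(all_wages)

# Between component: how far each workplace's average sits from the grand mean.
between = sum(len(ws) * (mean(ws) - grand_mean) ** 2
              for ws in workplaces.values()) / n
# Within component: how far each wage sits from its own workplace's average.
within = sum(sum((w - mean(ws)) ** 2 for w in ws)
             for ws in workplaces.values()) / n
total = sum((w - grand_mean) ** 2 for w in all_wages) / n

# The two components sum exactly to total variance.
assert abs(between + within - total) < 1e-9
between_share = between / total
```

In this toy example most of the dispersion comes from pay gaps between firms rather than within them; a rising `between_share` over time is the pattern the study reports for 12 of the 14 countries.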

The study also shows that in countries with weak or declining labor market protections, inequality - particularly between-workplace inequalities - rises the fastest. In contrast, widespread collective bargaining coverage and high minimum wages reduce inequalities both between and within firms.

"We show that trends in rising between-workplace wage dispersion are closely aligned with declining national labor market institutions, institutions that in some countries once protected the bargaining power of employees relative to employers," the authors write.

"We knew from past research that earnings inequalities in the U.S. were being driven by wage polarization between high-wage and low-wage firms, but I was shocked to see how widespread this trend is," says Tomaskovic-Devey, professor of sociology and founding director of the Center for Employment Equity at UMass Amherst. "Although rising between-firm inequalities are widespread, it is crucial to recognize that both the levels of inequality and the speed of firm wage polarization are strongly tied to national labor market institutions. The U.S. has the weakest labor market protections of all fourteen countries we studied and has the highest levels of inequality."

The authors write that results of their analyses suggest that policies aimed at reducing rising inequalities in national production systems might focus on between-firm and workplace inequalities via mechanisms that strengthen the bargaining power of employees and curtail the ability of powerful firms to outsource risk while absorbing revenue.

"Strengthening institutional protections for lower-skilled workers," it concludes, "will not only improve their wages and job security, but also reduce the ability of more powerful firms to outsource production to lower wage firms. Policies to limit the market power of dominant firms may moderate both the earnings going to the top of those firms and their ability to externalize labor costs."

Credit: 
University of Massachusetts Amherst

Molecular & isotopic evidence of milk, meat & plants in prehistoric food systems

image: Examples of potsherds analysed

Image: 
Kate Grillo

A team of scientists, led by the University of Bristol, with colleagues from the University of Florida, provide the first evidence for diet and subsistence practices of ancient East African pastoralists.

The development of pastoralism is known to have transformed human diets and societies in grasslands worldwide. Cattle-herding has been (and still is) the dominant way of life across the vast East African grasslands for thousands of years.

This is indicated by numerous large and highly fragmentary animal bone assemblages found at archaeological sites across the region, which demonstrate the importance of cattle, sheep and goat to these ancient people.

Today, people in these areas, such as the Maasai and Samburu of Kenya, live off milk and milk products (and sometimes blood) from their animals, gaining 60 to 90 percent of their calories from milk.

Milk is crucial to these herders and milk shortages during droughts or dry seasons increase vulnerabilities to malnutrition, and result in increased consumption of meat and marrow nutrients.

Yet we do not have any direct evidence for how long people in East Africa have been milking their cattle, how herders prepared their food or what else their diet may have consisted of.

Significantly though, we do know they have developed the C-14010 lactase persistence allele, which must have resulted from consumption of whole milk or lactose-containing milk products. This suggests there must be a long history of reliance on milk products in the area.

To address this question, the researchers examined ancient potsherds from four sites in Kenya and Tanzania, covering a 4,000-year timeframe (c. 5000 to 1200 BP) known as the Pastoral Neolithic, using a combined chemical and isotopic approach to identify and quantify the food residues found within the vessels. This involves extracting and identifying the fatty acids, residues of animal fats absorbed into the pot wall during cooking.

The findings, published today in the journal PNAS, showed that by far the majority of the sherds yielded evidence for ruminant (cattle, sheep or goat) meat, bones, marrow and fat processing, and some cooking of plants, probably in the form of stews.

This is entirely consistent with the animal bone assemblages from the sites sampled. Across this entire time frame, potsherds preserving milk residues were present at low frequencies, but this is very similar to modern pastoralist groups, such as the heavily milk-reliant Samburu, who cook meat and bones in ceramic pots but milk their cattle into gourds and wooden bowls, which rarely preserve at archaeological sites.

In the broader sense, this work provides insights into the long-term development of pastoralist foodways in east Africa and the evolution of milk-centred husbandry systems. The time frame of the findings of at least minor levels of milk processing provides a relatively long period (around 4,000 years) in which selection for the C-14010 lactase persistence allele may have occurred within multiple groups in eastern Africa, which supports genetic estimates. Future work will expand to studies of other sites within the region.

Dr Julie Dunne, from the University of Bristol's School of Chemistry, who led the study, said: "How exciting it is to be able to use chemical techniques to extract thousands of year-old foodstuffs from pots to find out what these early East African herders were cooking.

"This work shows the reliance of modern-day herders, managing vast herds of cattle, on meat and milk-based products, has a very long history in the region."

Credit: 
University of Bristol