Tech

Brain-related visual problems may affect one in 30 primary school children

A brain-related visual impairment, which until recently was thought to be rare, may affect one in every 30 children, according to new research investigating the prevalence of Cerebral Visual Impairment [CVI]. The University of Bristol-led findings, published today [3 February] in Developmental Medicine & Child Neurology, aim to raise awareness of CVI among parents and teachers to help them identify signs of the condition earlier.

The brain is just as important as the eyes when it comes to seeing, and many vision problems are caused by the areas of the brain needed for sight not working properly, and so cannot be resolved by wearing glasses. Brain-related vision problems include difficulties with moving the eyes, seeing things in the space around us (the visual field) and recognising objects accurately and quickly.

While eye chart tests check how well a person can see the details of a letter or symbol from a specific distance, these visual acuity diagnostic assessments miss many children with CVI, whose acuity is normal or near-normal (they can read down a chart). The study, funded by the National Institute for Health Research (NIHR), investigated how many school-aged children may have undiagnosed brain-related vision problems.

Researchers from the University of Bristol Medical School collected information about 2,298 children aged five to 11 years across 12 schools using teacher and parent questionnaires. They invited over ten per cent of the children (262 pupils) for a detailed assessment using validated tests to identify children with brain-related visual problems suggestive of CVI.

Based on these results, the team found that, on average, every class of 30 children would have one or two children with at least one brain-related vision problem. No single problem was most common: the difficulties observed included problems with eye movements, visual field, recognition of objects and seeing things in clutter. The team also found that children who were struggling with their learning and were already being given extra help at school were more likely to have brain-related vision problems: four in every ten children with support for special educational needs had one or more brain-related vision problems, whilst among all children the figure was only about three in 100.

Dr Cathy Williams, the study's lead author and Associate Professor in Paediatric Ophthalmology at Bristol Medical School: Population Health Sciences and Consultant Paediatric Ophthalmologist at University Hospitals Bristol and Weston NHS Foundation Trust (UHBW), explained: "While this does not prove that these kinds of vision problems are the cause of the difficulties with learning for any particular child, it does suggest that attending to children's visual needs, such as making things bigger or less cluttered, might be a good place to start. If interventions can work to reduce the impact of these problems on children's learning, it might improve both educational and wellbeing outcomes for children."

The authors suggest that, in future, detailed vision checks for all children who need extra support at school, alongside the existing paediatrician and educational psychology assessments, could improve outcomes for children.

Dr Williams added: "We would like to thank all the teachers, parents and children who helped support this important study, which is part of the CVI project."

Credit: 
University of Bristol

Some food contamination starts in the soil

image: Rice Investigation, Communication and Education (RICE) Facility at the University of Delaware where the Seyfferth Lab conducts rice experiments in outdoor rice paddies.

Image: 
Matt Limmer

When most people hear "food contamination," they think of bacteria present on unwashed fruits or vegetables, or undercooked meat. However, there are other ways for harmful contaminants to be present in food products.

Angelia Seyfferth, a member of the Soil Science Society of America, investigates food contamination coming from the soil where the plants grow. "It all comes down to the chemistry of the soil," explains Seyfferth.

Most recently, Seyfferth has been studying rice. The elements arsenic and cadmium can be present in the paddies where rice is grown. She presented her research at the virtual 2020 ASA-CSSA-SSSA Annual Meeting.

"Contaminants being taken up by crop plants is a route of dietary exposure to contaminants that is understudied," Seyfferth says. "We can help decrease human exposure to toxins by applying our knowledge of soil chemistry."

Small amounts of arsenic and cadmium are present all over the globe and can be detected in many food products. It's the concentration in the vegetable or fruit, the chemical form of the element, and how much of it someone eats that determines if an individual experiences a negative health effect.

High concentrations of arsenic and cadmium are harmful to the body. Consuming low doses over a long period of time can even cause cancer.

Elements like arsenic and cadmium can be in different chemical forms depending on their environment. Contaminants are taken up by plants when their chemical form in the soil resembles a nutrient the plant needs.

"How food is grown affects not only the concentration of contaminants, but also where the contaminants are stored within the food," says Seyfferth. "If we understand the chemical forms of contaminants in soil, we can design solutions to decrease plant uptake of them."

In rice, arsenic and cadmium uptake results from opposite conditions. Arsenic can be taken up when the field is flooded. Cadmium is more likely to be taken up when the field is not flooded.

Seyfferth's work has searched for a way to prevent plants from taking up arsenic and cadmium from the soil. This is often done by adding materials to the soil, called amendments.

An amendment helps change the soil environment. By changing the soil environment, researchers can help control the chemical forms and plant uptake of contaminants in the soil.

In this case, Seyfferth found that adding rice husk residue to rice paddy soils can help lower the amount of arsenic and cadmium taken up by the plants. Rice husk residue is plant material left over after processing rice for human consumption.

This solution is simple yet effective. Rice husk residue is high in the element silicon, which is an important nutrient for rice. The chemical form of silicon is similar to the form of arsenic taken up by rice plants when fields are flooded. This similarity helps 'distract' the plant, which prevents it from taking up as much arsenic.

In soils where cadmium is a problem, rice husk residue helps make the soil less acidic. This helps to lock up cadmium in the soil. The silicon in the husk may also help decrease the toxicity of cadmium.

"Not all sources of silicon behave the same way though," says Seyfferth. "In order for it to be effective, the silicon source must provide silicon in a high enough concentration during the time the rice plant is filling grain. Rice husk residue is a successful source because it breaks down slowly and releases silicon throughout the growing season."

High arsenic can decrease grain yield, but Seyfferth's work shows that adding rice husk residues can help prevent yield loss. Half of the world depends on rice as a staple food, so this research has exciting potential for positive impact.

In the past, Seyfferth has studied similar contamination issues in mushrooms.

For most American adults, the amount of arsenic and cadmium they consume from rice and mushrooms is not enough to cause concern. But there are other populations that eat these products more frequently and from an early age.

"People need to be aware of their daily load of contaminants, which depends on their body weight, the concentration and chemical form of the contaminant in the food, and the amount consumed," Seyfferth explains.

"The daily load is highest for people who consume rice multiple times a day and who may also have arsenic in their drinking water," she says. "Some examples include populations in South and Southeast Asia."

Angelia Seyfferth is an associate professor at the University of Delaware. This work was supported by the National Science Foundation, the National Institute of Food and Agriculture, and the University of Delaware. The 2020 ASA-CSSA-SSSA Annual Meeting was hosted by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America.

Credit: 
American Society of Agronomy

Experts 'scan horizon' to help prepare governments for next major biosecurity threat

image: Part 1 of a cartoon summary providing a brief excursion into what the future could look like as these issues in bioengineering develop. Based on the bioengineering horizon scan published in eLife.

Image: 
Centre for Existential Risk, Cambridge University

During the summer of 2019, a global team of experts put their heads together to define the key questions facing the UK government when it comes to biological security.

Facilitated by the Centre for the Study of Existential Risk (CSER) at the University of Cambridge and the BioRISC project at St Catharine's College, the group of 41 academics and figures from industry and government submitted 450 questions which were then debated, voted on and ranked to define the 80 most urgent.

The final line-up includes major questions on future disease threats, including what role shifts in climate and land use might play, and whether data from social media platforms should be used to help detect the earliest signs of emerging pathogens.

Other key areas that experts believe should be a focus for investigation include questions around custom DNA synthesis and threats from "human-engineered agents", the challenges posed by Brexit and vulnerabilities in transport and food systems, risks from "invasive alien species" in water and soil, and how best to incorporate biological security issues in scientific education.

Researchers say that these questions, published in the journal PLOS ONE, are critical to the UK's biosecurity.

"In the year before the pandemic, the UK was ranked second in the world for global health security by the Global Health Security Index - a confidence underpinned by its 2018 Biological Security Strategy," said Dr Luke Kemp from CSER who led the foresight research. "Clearly, improvements are needed, and not just to be ready for a future COVID-19-like crisis."

"We need to plan for a biosecure future that could see anything from brain-altering bioweapons and mass surveillance through DNA databases to low-carbon clothes produced by microorganisms. Many of these may seem to lie in the realm of science fiction but they do not. Such capabilities in bioengineering could prove even more impactful, for better or worse, than the current pandemic."

Some of these questions are linked to an earlier 'horizon scan' process by many of the same experts (but part of an expanded, more global group) and also led by Kemp, which focused on the future of bioengineering.

An international team put forward issues, anonymously scored them, discussed and then voted to produce a list of twenty priority issues in the field, published in the middle of last year in the journal eLife.

The list is divided into those most immediate, and likely to come to the fore within the next five years, those on a 5-10 year timeline, and those a decade or more hence. Kemp describes this top twenty as running from the "promising to the petrifying".

The horizon scan pointed toward the pressing concern of mass surveillance through DNA databases.

"The possibility of using genetic databases for mass surveillance will only grow in coming years, particularly with the rise of new tracking and monitoring methods, powers and apps during the COVID-19 response."

One of the highest-ranked issues for the longer-term future was the malicious uses of neurochemistry. The experts argue that advances in neuroscience and bioengineering could lead to new beneficial drugs and "nootropics", but also new weapons.

"Imagine a world in which law enforcement uses to drugs to placate and control crowds, greatly diminishing the promise of non-violent protest movements on climate and social justice," Kemp said.

"Regulation is critical at both the international and national level. We need to ensure that new insights into the human brain are not weaponised for either the army or police-force".

On a more positive note, the panel of experts suggests that bioengineering will provide new ways of addressing climate change, for example through the bio-based production of materials by microorganisms, which could lead to low-carbon plastics, clothing and construction materials.

"Metabolic engineering could allow for the creation of plants and bacteria that efficiently draw-down masses of greenhouse gases. Such industrial outcomes are distant, but plausible, especially if actions such as carbon pricing are scaled up," said Kemp.

He added: "The world, not just the UK, needs a thoughtful, transparent and evidence based way of identifying emerging issues in biosecurity and bioengineering. Whether it be a new flu pandemic, new bioweapons, or new ways to sequester carbon, forewarned is forearmed."

Credit: 
University of Cambridge

Digital health divide runs deep in older racial and ethnic minorities

image: Ruth Tappen, Ed.D., R.N., F.A.A.N, lead author and Christine E. Lynn Eminent Scholar, FAU's Christine E. Lynn College of Nursing.

Image: 
Florida Atlantic University

The COVID-19 pandemic is a great example of the importance of access to the Internet and to digital health information. Unfortunately, historical disparities in health care appear to be reflected in computer ownership, access to the Internet and use of digital health information. However, few studies have qualitatively explored reasons for digital health information disparity, especially in older adults.

A study led by Florida Atlantic University's Christine E. Lynn College of Nursing in collaboration with the Dana-Farber Cancer Institute and the University of Massachusetts Medical School, examined the extent of computer ownership, Internet access, and digital health information use in older (ages 60 and above) African Americans, Afro-Caribbeans, Hispanic Americans and European Americans. They quantitatively identified factors related to electronic device ownership, Internet access, and digital health information in 562 study participants and explored the reasons for any differences using a series of focus groups.

Results of the study, published in the Journal of Racial and Ethnic Health Disparities, revealed a deep digital health divide within the older population, which was evident in both the community sample and the focus groups. Participants who were older, less educated, economically disadvantaged and from minority ethnic groups (African American, Afro-Caribbean or Hispanic American) were up to five times less likely to have access to digital health information than those who were younger, more highly educated, had a higher income, or were European Americans.

The odds of owning a computer or having Internet access were one-fifth as high in the African American group as in the European American group, and one-fourth as high in the Afro-Caribbean group. For Hispanic Americans, the odds of owning a computer were one-third as high, and the odds of having Internet access less than one-half as high, as for European Americans. Those who received Medicaid assistance were less than half as likely to use either the Internet or digital health information as those who did not receive Medicaid, and a little more than half as likely to own a computer.

"Currently, digital health technology development is outpacing parallel efforts to conquer the digital health divide, which also has important implications for helping older adults get registered for the COVID-19 vaccine," said Ruth Tappen, Ed.D., R.N., F.A.A.N, lead author and Christine E. Lynn Eminent Scholar, FAU's Christine E. Lynn College of Nursing. "Portals that allow patients access to their electronic health records, decision aids that prepare patients to discuss options with their providers, making telehealth appointments with providers and so forth, needlessly, though unintentionally, excludes, marginalizes, and disenfranchises those who are older, have low incomes, have low health literacy, and/or are members of minority groups."

The results of the focus group sessions shed some light on the effects of this disparity and highlight differences in response across the groups represented. Interest in obtaining Internet health-related information was highest among the African Americans, Afro-Caribbeans and European Americans, and moderate at best in the Hispanic American group. The ability to afford a device that allows Internet access differed greatly across groups, and participants' expressed preferences for information from providers versus more independent searches, as well as their involvement in decision-making, also varied considerably.

Participants in the African American/Afro-Caribbean group expressed frustration with lack of access to digital health information but appreciation for alternative sources of information. Hispanic Americans critiqued information received from providers and drug inserts, some suggesting that a positive attitude and trust in God also contributed to getting well. European American participants evaluated various digital health information websites, looking to providers for help in applying information to their personal situation.

The researchers stress that addressing this digital health divide in the older population requires attention at several levels. At the policy level, national connectivity plans are needed, and a greater effort must be made to provide universal Internet access. Municipal broadband networks can achieve this at the local level. Eventually, Internet service needs to be redefined as a necessity rather than a luxury: either a utility like electricity and water, or a free service supported by advertisements, as broadcast radio and television are. Not only does it need to be affordable but also adequate for the job.

"Until Internet access is universal, creative use of printed materials, telephone calls, in-person groups, family assistance, individual meetings, and mailings are needed for those disadvantaged and minority older adults who remain affected by this digital health information disparity," said Tappen.

Credit: 
Florida Atlantic University

Breast cancer-on-a-chip for testing immunotherapy drugs

image: Breast Cancer Chip

Image: 
Khademhosseini Lab

(LOS ANGELES) - There are many mechanisms by which the body responds to foreign invaders. One of these involves the T-cells of the immune system, which have a number of different proteins on their surface called "checkpoint proteins." These checkpoint proteins bind to proteins on the surface of other cells and can result in either stimulation or suppression of T-cell activity. Normally, surface proteins on foreign or invading cells will produce a stimulation of T-cell activity against these cells, while T-cell suppression is a built-in mechanism to prevent the immune system from attacking the body's own normal cells.

Tumor cells, however, can sometimes outwit the immune system by displaying surface proteins that bind with T-cell checkpoint proteins to cause suppression of T-cell activity. In some cases, interaction of these tumor surface proteins with T-cells even causes the T-cells to rupture. In recent years, scientists have been trying to develop "checkpoint inhibitor" drugs which will counteract these suppressive checkpoint interactions in order to re-activate the body's immune response to tumor cells. One of these drugs is U.S. FDA approved to treat metastatic melanoma; others are available or under development to treat other malignancies.

Despite these advances, however, it remains difficult to determine which cancer patients are likely candidates for this type of therapy and which drugs have the most potential. Developing a method to address these challenges would be instrumental in determining the safest, most effective drugs for cancer patients while saving time and money in the process. In order for such a method to be practical for clinical use, it should be able to achieve rapid testing of large numbers of potential immunotherapy drugs against live tumor cells for accurate, easily analyzable data.

A collaborative team from the Terasaki Institute for Biomedical Innovation (TIBI) has successfully designed and tested such a system. They began by culturing spherical aggregates of breast cancer cells in a custom-fabricated, 3D printed, transparent chip with conical microwells. These microwells were designed for optimum growth and stability of the cellular spheres. Tests performed on the microwells' cellular spheres confirmed the cells' viability and their production of T-cell de-activating surface proteins.

"The features of our microwell-based chip is the key to our successful development of an immunoactive tissue model," said Wujin Sun, Ph.D., from the Terasaki Institute's team. "The chip's transparency allows for direct microscopic observation. And its design allows for high-volume testing, which lends itself well to the rapid screening of immunotherapeutic drugs."

In order to test the effectiveness of checkpoint inhibitor drugs in activating T-cells' anti-tumor response, the team next considered how a T-cell normally behaves during activation. When a T-cell is stimulated to attack cellular invaders, it secretes proteins called cytokines, which mobilize other immune cells to the invasion site and stimulate them to multiply and destroy the invaders. Measurement of these cytokines can therefore indicate the level of a T-cell's activation.

The team then created an efficient, automated system to measure cytokine levels using their breast cancer-laden microwell chip. Experiments with this system were performed using anti-checkpoint protein drugs; the results showed that upon incubation of the breast cancer cells with the T-cells, cytokine production was increased by the use of the drugs, demonstrating their effectiveness in activating the T-cells.

Another way the team used their breast cancer chip was to assess the breast cancer cells' effect on stimulated T-cells. The T-cells were fluorescently labeled and added to the breast cancer cells in the microwells; the chip's transparency allowed direct observation of their cellular interaction using fluorescent microscopy. These breast cancer cells normally cause rupture of the T-cells, but experiments conducted with checkpoint inhibitor drugs showed that the drugs increased T-cell viability in the cultures, visually demonstrating how they can counter the effects of T-cell rupture by tumor cell interaction.

The breast cancer chip was also used for the direct observation of how the T-cells infiltrated the breast cancer cellular spheres; this type of infiltration is a measure of a T-cell's anti-tumor activity and viability. After labeling each group of cells with separate dyes and mixing them in the chip's microwells, T-cell infiltration could be directly visualized using high resolution fluorescence microscopy. Experiments conducted with checkpoint inhibitor drugs indicated that there were increased numbers of T-cells and deeper penetration into the breast cancer cells in the presence of the drugs.

In summary, the TIBI researchers were able to design robust and efficient methods for characterizing the interaction between tumor and immune cells and for rapid, high-volume and clinically-relevant ways to screen immunotherapeutic drugs against tumor cells. The microwell chip and its related apparatus can also be used to include other types of tumor cells and individual patient cells for optimizing patient response and for screening and developing additional anti-cancer drugs.

"Bringing ways to optimize clinical decisions and personalized medicine for patients is a top goal at our institute," said Ali Khademhosseini, Ph.D., director and CEO of the Terasaki Institute. "This work is a significant step towards achieving that goal in the realm of cancer immunotherapy."

Credit: 
Terasaki Institute for Biomedical Innovation

Tracking cells with omnidirectional visible laser particles

Image: 
by Shui-Jing Tang, Paul H. Dannenberg, Andreas C. Liapis, Nicola Martino, Yue Zhuo, Yun-Feng Xiao, and Seok-Hyun Yun

Laser particles are micrometre- and nanometre-scale lasers in the form of particles dispersible in aqueous solution, and they have attracted considerable interest in the life sciences as a promising new optical probe. Laser particles emit very bright light with an extremely narrow spectral bandwidth. By transferring laser particles into live cells, as shown in Figure 1, individual cells in a heterogeneous population can be tracked using each intracellular particle's specific spectral fingerprint as an optically readable barcode. However, laser particles emit directional light (Figure 2) and tumble freely inside living cells, so their orientation varies randomly over time. The optical readout of these labels therefore shows "lighthouse-like" blinking, leading to frequent loss of cell traces.
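As a minimal sketch of the spectral-barcode idea described above (the wavelengths, tolerance and function name below are illustrative assumptions, not values from the paper), identifying a tagged cell amounts to matching a detected emission peak against a library of known particle wavelengths:

import numpy as np

# Library of known laser-particle emission wavelengths (nm), one per tagged cell.
# Values and tolerance are placeholders for illustration only.
barcode_library = np.array([1185.2, 1190.7, 1202.3, 1215.9])
TOLERANCE_NM = 1.0

def identify_cell(detected_peak_nm: float):
    """Assign a detected emission peak to the nearest known barcode, if close enough."""
    idx = int(np.argmin(np.abs(barcode_library - detected_peak_nm)))
    if abs(barcode_library[idx] - detected_peak_nm) <= TOLERANCE_NM:
        return idx    # index of the matching cell/barcode
    return None       # peak not recognised (e.g. label lost or too dim)

# Frame-by-frame tracking then reduces to re-identifying each barcode in every image,
# which is exactly the step that fails when directional emission makes a particle blink.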

In a new paper published in Light: Science & Applications, scientists from Professor Seok-Hyun Yun's group at Harvard Medical School and Professor Yun-Feng Xiao's group at Peking University demonstrate single-cell tracking with intracellular laser particles engineered to emit nearly homogeneously in all directions. The omnidirectional laser emission is achieved by incorporating light scattering into the microdisk cavity, which reduces orientation-dependent intensity fluctuations by two orders of magnitude (Figure 2), enabling blinking-free tracking of single cells under the same conditions where existing technology suffers from frequent tracking failure. The reported technique will open new avenues for large-scale single-cell analysis, and facilitate other applications of laser particles, such as cellular and biochemical sensing and single-cell analysis in microfluidics.

These scientists summarize the single-cell tagging principle of laser particles:
"Typically, researchers use fluorescent probes to label specific cells, but only a few colors can be used at the same time before spectral overlap becomes a problem. Laser particles are tiny lasers that can be inserted inside living cells. These tiny lasers can be designed to produce many more distinguishable colors. The intracellular laser particles with a specific color will move with live cells, and therefore single cells can be tracked as they move throughout complex biological samples," said Dr. Shui-Jing Tang, a former visiting student at Harvard Medical School and a current Boya postdoctoral researcher at Peking University.

"Unfortunately, laser particles emit light in a specific direction. When particles rotate freely over time as the cell moves, their apparent brightness, as seen by a photodetector, changes dramatically. We developed a new kind of laser particle emitting light in all directions. Therefore, the spatial cell traces could be tracked continuously no matter how each particle was oriented inside a cell," added Paul Dannenberg, a graduate student at Harvard Medical School.

"The presented technique makes it possible to detect and identify laser particles reliably over time in cell tracking applications, which could enable large-scale single-cell analysis in complex biological specimens. In addition to cell-tracking, our work will facilitate other applications of laser particles, such as cellular and biochemical sensing and single-cell analysis in microfluidics," said Dr. Andreas Liapis, a research fellow at Harvard University.

Credit: 
Light Publishing Center, Changchun Institute of Optics, Fine Mechanics And Physics, CAS

NTU Singapore team develops portable device that creates 3D images of skin in 10 minutes

image: (Left to Right) Members of the NTU research team include Assistant Professor Grzegorz Lisak from NTU's School of Civil and Environmental Engineering, PhD student Yi-Heng Cheong, the study's first author and PhD student Fu Xiaoxu, and Research Associate Ashiq Ahamed.

Image: 
NTU Singapore

A team from Nanyang Technological University, Singapore (NTU Singapore) has developed a portable device that produces high-resolution 3D images of human skin within 10 minutes.

The team said the portable skin mapping (imaging) device could be used to assess the severity of skin conditions, such as eczema and psoriasis.

3D skin mapping could be useful to clinicians, as most equipment used to assess skin conditions only provide 2D images of the skin surface. As the device also maps out the depth of the ridges and grooves of the skin at up to 2mm, it could also help with monitoring wound healing.

The device presses a specially devised film onto the subject's skin to obtain an imprint of up to 5 by 5 centimetres, which is then subjected to an electric charge, generating a 3D image.

The researchers designed and 3D printed a prototype of their device using polylactic acid (PLA), a biodegradable bioplastic. The battery-operated device, which measures 7 cm by 10 cm, weighs only 100 grams (see Image 1).

The made-in-NTU prototype was developed at a fraction of the cost of devices with comparable capabilities, such as optical coherence tomography (OCT) machines, which can cost thousands of dollars and weigh up to 30 kilogrammes.

Assistant Professor Grzegorz Lisak from NTU's School of Civil and Environmental Engineering, who led the research, said: "Our non-invasive, simple and inexpensive device could be used to complement current methods of diagnosing and treating skin diseases. In rural areas that do not have ready access to healthcare, non-medically trained personnel can make skin maps using the device and send them to physicians for assessment."

Providing an independent comment on how the device may be useful to clinicians, Dr Yew Yik Weng, a Consultant Dermatologist at the National Skin Centre and an Assistant Professor at NTU's Lee Kong Chian School of Medicine, said: "The technology is an interesting way to map the surface texture of human skin. It could be a useful method to map skin texture and wound healing in a 3D manner, which is especially important in research and clinical trials. As the device is battery-operated and portable, there is a lot of potential in its development into a tool for point of care assessment in clinical settings."

Asst Prof Dr Yew added: "The device could be especially useful in studies involving wound healing, as we are currently lacking a tool that maps the length and the depth of skin ridges. Currently, we rely on photographs or measurements in our trials which could only provide a 2D assessment."

First author of the study, Mr Fu Xiaoxu, a PhD student from NTU's School of Civil and Environmental Engineering, said: "The 3D skin mapping device is simple to operate. On top of that, a 1.5V dry battery is all that is necessary to run the device. It is an example of a basic, yet very effective application of electrochemistry, as no expensive electronic hardware is required."

Published in the scientific journal Analytica Chimica Acta this month, the technology was developed by Asst Prof Lisak, who is also Director of Residues & Resource Reclamation Centre at the Nanyang Environment and Water Research Institute (NEWRI) and his PhD student, Mr Fu Xiaoxu.

The 'golden' solution to 3D skin mapping

The key component of the NTU device is a polymer called PEDOT:PSS, commonly used in solar panels to convert light into electricity. However, the team found a different use for its electrical conductivity - to reproduce skin patterns on gold-coated film. Gold is used as it has excellent electrical conductivity and flexibility.

To use the device, a person pushes a button to press the gold-coated film onto the subject's skin to obtain an imprint. This causes sebum, an oily substance produced by the skin, to be transferred onto the film, creating an imprint of the skin surface (see video).

Next, the imprint of the skin is transferred to the portable device where a set of electrodes is immersed in a solution. With another push of a button, the device triggers a flow of electric charge, causing PEDOT:PSS to be deposited on the surfaces of the gold-coated film in areas that are not covered with sebum. This results in a high-resolution 3D map of the skin, which reflects the ridges and grooves of the subject's skin.

Using pig skin as a model, the researchers demonstrated that the technology was able to map the pattern of various wounds such as punctures, lacerations, abrasions, and incisions.

The team also showed that even the complex network of wrinkles on the back of a human hand could be captured on the film. The thin film is also flexible enough to map features on uneven skin areas, such as the creases of an elbow and fingerprints.

Asst Prof Lisak added: "The device has also proven to be effective in lifting fingerprints and gives a high-resolution 3D image of their characteristics." (See Image 2)

Commenting on the potential uses of the device, Asst Prof Dr Yew added: "The device may aid in fingerprint identification, which is commonly performed in forensic analysis. The device could offer a higher degree of accuracy when it comes to differentiating between similar prints, due to the 3D nature of its imagery."

To further validate its efficacy, the team is exploring the possibility of conducting clinical trials later this year to test the feasibility of the device, as well as other potential therapeutic uses.

Credit: 
Nanyang Technological University

A fine-grained view of dust storms

A satellite-based dataset generated by KAUST researchers has revealed the dynamics of dust storm formation and movements over the last decade in the Arabian Peninsula. Analysis of this long-term dataset reveals the connection between the occurrence of extreme dust events and regional atmospheric conditions, a finding that could help improve weather forecasting and air-quality models.

Dust storms occur when strong winds lift tiny particles of sand into the atmosphere. These events often span several miles and can have an enormous impact on daily life, from damaging buildings and disrupting air traffic to triggering respiratory illnesses and other health problems.

The Arabian Peninsula is a global hotspot of extreme dust events, with storms occurring all year round, typically peaking in March and August. While previous studies that rely on ground-based measurements have explored how these dust storms form, few have captured in detail how they vary across the region.

"Comprehensive and continuous observations are needed to identify extreme dust events over the Arabian Peninsula," says lead author of the study, Harikishan Gandham.

"Ground-based networks do not provide enough information," explains Gandham. "We tried to fill this gap by analyzing long-term high-resolution dust data generated by satellite observations." Gandham and his team used satellite data to analyze extreme dust events that occurred in the Arabian Peninsula between 2003 and 2017. To identify and track dust storms, the researchers used state-of-the-art algorithms to determine where they formed, their frequency and duration, and their spatial extent.

The researchers found a link between intensifying Shamal winds in the north and the formation of extreme dust storms in the Arabian Peninsula. These strong winds transport fine particles from the surrounding desert to the region, leading to thick clouds of dust that can reach altitudes of up to four kilometers above sea level and last for three days or longer.

A total of 49 dust storms occurred in the Arabian Peninsula over 207 days during the 15-year study period -- these became more frequent from 2007 to their peak in 2012.

Following 2012, extreme dust events suddenly began to decline due to changing atmospheric conditions, such as heavier winter rainfall brought on by the development of a low-pressure trough in the Red Sea.

"This study generated and validated a much-needed long-term high-resolution dust aerosol dataset for the Arabian Peninsula and adjoining region," says co-author Ibrahim Hoteit.

"Compared to other global data products available, our high-resolution dataset enables more accurate study of the variability of dust events over our region."

Credit: 
King Abdullah University of Science & Technology (KAUST)

"Genetic SD-card": Scientists obtained new methods to improve the genome editing system

image: Researchers take a step in the development of genome editing technology

Image: 
Peter the Great St. Petersburg Polytechnic University

Researchers from Peter the Great St. Petersburg Polytechnic University (SPbPU), in collaboration with colleagues from Belgium, have taken a step forward in the development of genome editing technology. It is now possible to deliver genetic material of different sizes and structures to organs and tissues, which is key to eliminating DNA defects and treating more patients. The project is guided by Professor Gleb Sukhorukov and supported by the Russian Science Foundation. The results were published in the journal Particle & Particle Systems Characterization.

The international research group developed a polymer carrier with a number of unique properties: several types of genetic material can be loaded into its structure. In particular, the scientists managed to load genetic material of various sizes and structures, from small interfering RNAs (siRNAs) to messenger RNAs (mRNAs), into these "universal containers". The efficiency of delivery was demonstrated on human stem cells.

"Nowadays most of the vaccines, including those for COVID-19, are made on the basis of mRNA. This is a kind of "genetic SD-card" with information which activates human immune system, thus teaches it how to deal with the "enemy proteins" of the virus. Typically, for medical purposes, different types of carriers are used to deliver specific molecules, we proved that it is possible to deliver genetic materials of different sizes using one type of carrier. This technology opens up new horizons for the development of non-viral delivery systems", - notes Alexander Timin, head of the Laboratory for microencapsulation and controlled delivery of biologically active compounds at St. Petersburg Polytechnic University.

The scientists added that the micron-scale carrier with incorporated genetic material can be delivered by systemic administration, or locally (for example, directly into the tumor focus in cancer).

"The study is conducted jointly with the Raisa Gorbacheva Memorial Research Institute of Children Oncology, Hematology and Transplantation, which provided the patients'mesenchymal stem cells (cells building organs and tissues) for the experiments. In the future, we plan to conduct experiments on tumor-bearing laboratory animals in order to find out how the genetic material delivered to the tumor will be managed, "- said Igor Radchenko, director of the "RASA-Polytech" center.

Credit: 
Peter the Great Saint-Petersburg Polytechnic University

Fine tuned: adjusting the composition and properties of semiconducting 2D alloys

image: Scanning tunneling microscopy image of a Si-Ge alloy with a composition of Si5.67Ge0.33. Tall protrusions correspond to Ge atoms and short ones to Si atoms. The distance between protrusions is only 0.64 nm.

Image: 
Picture Courtesy: Antoine Fleurence from JAIST

Semiconducting 2D alloys could be key to overcoming the technical limitations of modern electronics. Although 2D Si-Ge alloys would have interesting properties for this purpose, they were only predicted theoretically. Now, scientists from Japan Advanced Institute of Science and Technology have realized the first experimental demonstration. They have also shown that the Si to Ge ratio can be adjusted to fine tune the electronic properties of the alloys, paving the way for novel applications.

Alloys--materials composed of a combination of different elements or compounds--have played a crucial role in the technological development of humans since the Bronze Age. Today, alloying materials with similar structures and compatible elements is essential because it enables us to fine tune the properties of the final alloy to match our needs.

The versatility provided by alloying naturally extends to the field of electronics. Semiconductor alloys are an area of active research because new materials will be needed to redesign the building blocks of electronic devices (transistors); in this regard, two-dimensional (2D) semiconductor alloys are seen as a promising option to go past the technical limitations of modern electronics. Unfortunately, graphene, the carbon-based poster child for 2D materials, does not lend itself easily to alloying, which leaves it out of the equation.

However, there is an attractive alternative: silicene. This material is composed entirely of silicon (Si) atoms arranged in a 2D honeycomb-like structure reminiscent of graphene. If the properties of silicene could be tuned as needed, the field of 2D silicon-based nanoelectronics would take off. Although alloying silicene with germanium (Ge) was theoretically predicted to yield stable 2D structures with properties tunable by the Si to Ge ratio, this was never realized in practice.

Now, a team of scientists from Japan Advanced Institute of Science and Technology (JAIST) have experimentally demonstrated a new way to grow a silicene layer and stably replace a portion of its atoms with Ge, allowing them to fine tune some of its electrical properties.

Their study is published in Physical Review Materials.

First, the scientists grew a single layer of 2D silicene onto a zirconium diboride (ZrB2) thin film grown on a silicon substrate through the surface segregation of Si atoms which crystallize in a 2D honeycomb-like structure. However, this silicene layer was not perfectly flat; one sixth of all Si atoms were a bit higher than the rest, forming periodic bumps or 'protrusions.'

Then, Ge atoms were deposited onto the silicene layer in ultrahigh vacuum conditions. Interestingly, both theoretical calculations and experimental observations through microscopy and spectroscopy revealed that Ge atoms could only replace the protruding Si atoms. By adjusting the number of Ge atoms deposited, a Si-Ge alloy with a desired Si to Ge ratio could be produced. The composition of the final material would thus be Si6-xGex, where x can be any number between 0 and 1.
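To make the notation concrete (a shorthand of ours, consistent with the scanning tunneling microscopy caption above): since only the one-in-six protruding Si sites can host Ge, writing f for the fraction of those sites actually occupied by Ge gives

\[
x = f, \qquad \mathrm{Si}_{6-x}\mathrm{Ge}_{x}\,\Big|_{\,f=0.33} = \mathrm{Si}_{5.67}\mathrm{Ge}_{0.33},
\]

which is the composition shown in the image at the top of this article.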

The team then studied the effects of this adjustable Si to Ge ratio on the electronic properties of the Si-Ge alloy. They found that its electronic band structure, one of the most important characteristics of a semiconductor, could be adjusted within a specific range by manipulating the composition of the material. Excited about the results, Senior Lecturer Antoine Fleurence from JAIST, lead author of the study, remarks, "Silicon and germanium are elements commonly used in the semiconductor industry, and we showed that it is possible to engineer the band structure of 2D Si-Ge alloys in a way reminiscent of that for bulk (3D) Si-Ge alloys used in various applications."

The implications of this study are important for multiple reasons. First, the ultimate thinness and flexibility of 2D materials is appealing for many applications because it means they could be more easily integrated in devices for daily life. Second, the results could pave the way to a breakthrough in electronics. Co-author of the study, Professor Yukiko Yamada-Takamura from JAIST, explains, "Semiconducting 2D materials made of silicon and germanium with atomically-precise thickness could further decrease the dimensions of the elemental bricks of electronic devices. This would represent a technological milestone for silicon-based nanotechnologies."

Overall, this study showcases but a few of the advantages of alloying as a way to produce materials with more desirable properties than those made from a single element or compound. Let us hope semiconducting 2D alloys are further refined so that they can take the spotlight in next-generation electronic devices.

Credit: 
Japan Advanced Institute of Science and Technology

How do electrons close to Earth reach almost the speed of light?

image: The contours in color show the intensities of the radiation belts. Grey lines show the trajectories of the relativistic electrons in the radiation belts. Concentric circular lines show the trajectory of scientific satellites traversing this dangerous region in space.

Image: 
Ingo Michaelis and Yuri Shprits, GFZ

A new study has found that electrons can reach ultra-relativistic energies under very special conditions in the magnetosphere, when space is almost devoid of plasma.

Recent measurements from NASA's Van Allen Probes spacecraft showed that electrons can reach ultra-relativistic energies, flying at almost the speed of light. Hayley Allison, Yuri Shprits and collaborators from the German Research Centre for Geosciences have revealed under which conditions such strong accelerations occur. They had already demonstrated in 2020 that, during solar storms, plasma waves play a crucial role in this acceleration. However, it was previously unclear why such high electron energies are not achieved in all solar storms. In the journal Science Advances, Allison, Shprits and colleagues now show that extreme depletions of the background plasma density are crucial.

Ultra-relativistic electrons in space

At ultra-relativistic energies, electrons move at almost the speed of light, and the laws of relativity become most important: the mass of the particles increases by a factor of ten, time slows down, and distances contract. At such high energies, charged particles become dangerous to even the best-protected satellites. Because almost no shielding can stop them, their charge can destroy sensitive electronics. Predicting their occurrence - for example, as part of the space weather observations practised at the GFZ - is therefore very important for modern infrastructure.

To investigate the conditions for the enormous accelerations of the electrons, Allison and Shprits used data from a twin mission, the "Van Allen Probes", which the US space agency NASA had launched in 2012. The aim was to make detailed measurements in the radiation belt, the so-called Van Allen belt, which surrounds the Earth in a donut shape in terrestrial space. Here - as in the rest of space - a mixture of positively and negatively charged particles forms a so-called plasma. Plasma waves can be understood as fluctuations of the electric and magnetic field, excited by solar storms. They are an important driving force for the acceleration of electrons.

Data analysis with machine learning

During the mission, both solar storms that produced ultra-relativistic electrons and storms without this effect were observed. The density of the background plasma turned out to be a decisive factor for the strong acceleration: electrons at ultra-relativistic energies were only observed to increase when the plasma density dropped to very low values of only about ten particles per cubic centimetre, while normally this density is five to ten times higher.

Using a numerical model that incorporated such extreme plasma depletion, the authors showed that periods of low density create preferential conditions for the acceleration of electrons - from an initial few hundred thousand to more than seven million electron volts. To analyse the data from the Van Allen probes, the researchers used machine learning methods, the development of which was funded by the GEO.X network. They enabled the authors to infer the total plasma density from the measured fluctuations of electric and magnetic field.
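The article does not spell out the machine learning pipeline, but the underlying idea of inferring plasma density from field measurements can be sketched as supervised regression: train a model on intervals where both the wave measurements and a direct density estimate are available, then apply it where only the fields are measured. Everything below (the feature layout, the choice of a neural network regressor and the placeholder arrays) is an illustrative assumption, not the authors' code:

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

# X: features derived from measured electric/magnetic field fluctuations
# (e.g. band-integrated wave power per frequency channel).
# y: log10 of plasma density, measured directly when available.
# Both are random placeholders standing in for real spacecraft data.
rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 16))
y = rng.normal(loc=1.0, size=5000)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500, random_state=0)
model.fit(X_train, y_train)

# Apply to intervals without a direct density measurement (here, the held-out set)
density_estimate = 10 ** model.predict(X_test)   # particles per cubic centimetre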

The crucial role of plasma

"This study shows that electrons in the Earth's radiation belt can be promptly accelerated locally to ultra-relativistic energies, if the conditions of the plasma environment - plasma waves and temporarily low plasma density - are right. The particles can be regarded as surfing on plasma waves. In regions of extremely low plasma density they can just take a lot of energy from plasma waves. Similar mechanisms may be at work in the magnetospheres of the outer planets such as Jupiter or Saturn and in other astrophysical objects", says Yuri Shprits, head of the GFZ section Space physics and space weather and Professor at University of Potsdam.

"Thus, to reach such extreme energies, a two-stage acceleration process is not needed, as long assumed - first from the outer region of the magnetosphere into the belt and then inside. This also supports our research results from last year," adds Hayley Allison, PostDoc in the Section Space physics and space weather.

Credit: 
GFZ GeoForschungsZentrum Potsdam, Helmholtz Centre

Combining a PD-1 inhibitor with a VEGF inhibitor in chemotherapy of a cholangiocarcinoma patient

Cholangiocarcinoma (CCA) is a diverse group of epithelial cancers characterized by poor outcomes. Cholangiocarcinoma can be divided into three types according to the original position: intrahepatic cholangiocarcinoma (ICC), perihilar cholangiocarcinoma (PCC) and distal extrahepatic tumors (DET). The most promising way to cure cholangiocarcinoma is surgery, including laparoscopic and open liver resection. However, the post-surgical outcome is less than satisfactory, with a poor 5-year survival rate of 16.5-48%, and more than two-thirds of patients are no longer candidates for surgery by the time cholangiocarcinoma is diagnosed. According to recent meta-analyses, HBV and HCV infections significantly increase the risk of cholangiocarcinoma; with the high HBV infection rates in Asia, the incidence of cholangiocarcinoma in China is approximately 7 per 1 million people. Unfortunately, there are no effective biomarkers or typical clinical features for the early diagnosis of cholangiocarcinoma, and its poor response to the limited chemotherapeutic regimens available makes its progression difficult to control. Median survival is commonly less than 24 months after diagnosis, so new therapeutic approaches are strongly needed to improve patients' survival.

In this article, medical specialists from The Affiliated Brain Hospital of Guangzhou Medical University, China, report the case of a male patient with stage 4 intrahepatic cholangiocarcinoma who was unsuccessfully treated with several chemotherapy regimens and who was then treated with a new therapeutic approach to restrict the progression of the lesion.

The patient had been diagnosed with intrahepatic cholangiocarcinoma two years earlier. When the disease was first discovered in August 2018, metastases were found in the lung and multiple lymph nodes, and medical imaging suggested that the original lesion was in the bile duct; a pathologic biopsy confirmed this in September 2018. Pathological genetic testing showed alterations in the TP53 and KRAS genes.

Based on the imaging and pathological results, gemcitabine and nedaplatin were used as a first-line chemotherapy regimen in October 2018. After 6 courses of chemotherapy, the disease was assessed as stable. At the February 2019 follow-up, apatinib and Tigio were used as maintenance treatment. However, the disease had progressed by the next assessment in April 2019. Paclitaxel for injection (albumin bound) and oxaliplatin were then used in the hope of stopping the disease's progression, but after 2 courses, medical imaging performed in June 2019 showed that the disease was progressing again, with increased growth of the lung lesion. The clinician therefore changed the patient's chemotherapy regimen to Loplat and Retitrexet, which was, unfortunately, also unsuccessful after 2 courses. From September 2018 to August 2019, the patient's CT images showed that the lung metastases continued to increase, the hepatic lesions were obviously enhanced, and the para-aortic lymph nodes were enlarged and fused. In August 2019, with no other solution available, the doctors consulted with colleagues and confirmed the possibility of combining PD-1 and VEGF inhibitors with chemotherapy to control the tumor's progression. The oncologists therefore recommended the combination of FOLFOX6 (oxaliplatin 130mg d2 + calcium levofolinate 500mg d2 + 5-FU 500mg d2 + 5-FU 3000mg d2 46h; q2w), camrelizumab (a PD-1 inhibitor; 200mg d1) and fruquintinib (a VEGF/VEGFR inhibitor; 5mg d1-21). After discussing this with his family, the patient provided his consent, and his data were collected for comparison of results.

After one complete course, the liver and lung lesions had stopped growing, which was encouraging but still worrying, and after discussion with the patient the doctors decided to continue the treatment. Six courses later, they found that the patient's lung lesions were significantly reduced, the enhancement of the liver lesion was reduced, and the abdominal para-aortic lymph nodes were stable. The lesion was assessed as a partial response (PR) and remained stable afterwards.

Significant efforts have been made to improve the survival of cholangiocarcinoma patients. In this case, combining a PD-1 inhibitor and a VEGF/VEGFR inhibitor with chemotherapy to control cholangiocarcinoma proved a promising therapeutic approach; however, more studies are still needed to fully understand how the treatment works.

Credit: 
Bentham Science Publishers

Supercomputer in your bedroom

image: Dr James Knight and Prof Thomas Nowotny of the University of Sussex School of Engineering and Informatics.

Image: 
Stuart Robinson/University of Sussex

University of Sussex academics have established a method of turbocharging desktop PCs to give them the same capability as supercomputers worth tens of millions of pounds.

Dr James Knight and Prof Thomas Nowotny from the University of Sussex's School of Engineering and Informatics used the latest Graphical Processing Units (GPUs) to give a single desktop PC the capacity to simulate brain models of almost unlimited size.

The researchers believe the innovation, detailed in Nature Computational Science, will make it possible for many more researchers around the world to carry out research on large-scale brain simulation, including the investigation of neurological disorders.

Currently, the cost of supercomputers is so prohibitive they are only affordable to very large institutions and government agencies and so are not accessible for large numbers of researchers.

As well as shaving tens of millions of pounds off the cost of a supercomputer, the simulations run on the desktop PC require approximately 10 times less energy, bringing a significant sustainability benefit too.

Dr Knight, Research Fellow in Computer Science at the University of Sussex, said: "I think the main benefit of our research is one of accessibility. Outside of these very large organisations, academics typically have to apply to get even limited time on a supercomputer for a particular scientific purpose. This is quite a high barrier for entry which is potentially holding back a lot of significant research.

"Our hope for our own research now is to apply these techniques to brain-inspired machine learning so that we can help solve problems that biological brains excel at but which are currently beyond simulations.

"As well as the advances we have demonstrated in procedural connectivity in the context of GPU hardware, we also believe that there is also potential for developing new types of neuromorphic hardware built from the ground up for procedural connectivity. Key components could be implemented directly in hardware which could lead to even more truly significant compute time improvements."

The research builds on the work of US researcher Eugene Izhikevich, who pioneered a similar method for large-scale brain simulation in 2006.

At the time, computers were too slow for the method to be widely applicable, meaning that simulating large-scale brain models has until now only been possible for the minority of researchers privileged to have access to supercomputer systems.

The researchers applied Izhikevich's technique to a modern GPU, with approximately 2,000 times the computing power available 15 years ago, to create a cutting-edge model of a macaque's visual cortex (with 4.13 × 10^6 neurons and 24.2 × 10^9 synapses), which previously could only be simulated on a supercomputer.

The researchers' GPU accelerated spiking neural network simulator uses the large amount of computational power available on a GPU to 'procedurally' generate connectivity and synaptic weights 'on the go' as spikes are triggered - removing the need to store connectivity data in memory.
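A minimal NumPy sketch of this procedural-connectivity idea (not the authors' GPU implementation; the network size, connection probability and weight statistics below are placeholders): because each neuron's outgoing connections are derived from a deterministic, per-neuron random seed, they can be regenerated identically every time that neuron spikes instead of being stored.

import numpy as np

N = 100_000        # number of neurons (placeholder size)
P_CONNECT = 0.01   # connection probability (placeholder)
GLOBAL_SEED = 1234

def outgoing_synapses(pre_neuron: int):
    # Seeding the generator with the neuron index makes the draw deterministic,
    # so the same targets and weights are reproduced on every spike.
    rng = np.random.default_rng(GLOBAL_SEED + pre_neuron)
    targets = np.flatnonzero(rng.random(N) < P_CONNECT)
    weights = rng.normal(loc=0.1, scale=0.02, size=targets.size)
    return targets, weights

# When neuron i spikes, its output can be delivered without a stored weight matrix:
# targets, weights = outgoing_synapses(i)
# membrane_potential[targets] += weights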

Initialization of the researchers' model took six minutes, and simulating each second of biological time took 7.7 minutes in the ground state and 8.4 minutes in the resting state, up to 35% less time than a previous supercomputer simulation. By comparison, in 2018, on one rack of an IBM Blue Gene/Q supercomputer, initializing the model took around five minutes and simulating one second of biological time took approximately 12 minutes.

Prof Nowotny, Professor of Informatics at the University of Sussex, said: "Large-scale simulations of spiking neural network models are an important tool for improving our understanding of the dynamics and ultimately the function of brains. However, even small mammals such as mice have on the order of 1 × 10¹² synaptic connections, meaning that simulations require several terabytes of data - an unrealistic memory requirement for a single desktop machine.

"This research is a game-changer for computational Neuroscience and AI researchers who can now simulate brain circuits on their local workstations, but it also allows people outside academia to turn their gaming PC into a supercomputer and run large neural networks."

Credit: 
University of Sussex

Sub-surface imaging technology can expose counterfeit travel documents

New research by the University of Kent has found that optical coherence tomography (OCT) imaging technology can be utilised to distinguish between legitimate and counterfeit travel documents.

OCT imaging has been widely used in the medical and biomedical fields and is recognised as having transformed clinical ophthalmology; this research, published in Science & Justice, has now identified its capabilities for forgery detection.

This was a joint study between the Applied Optics Group (PDRA Dr Manuel Marques and Professor Adrian Podoleanu) and the Forensic Group (Reader Robert Green OBE) in the University of Kent's School of Physical Sciences, working alongside the forensic science technology company Foster + Freeman (Dr Roberto King). The work demonstrates that OCT can perform quantitative, non-destructive, high-resolution sub-surface analysis of multi-layered identification documents, with high imaging throughput and high-density volumes. The technology typically takes less than 10 seconds to detect counterfeit documentation.

The researchers assessed the security features in specimen passports and national ID cards. The OCT technology non-destructively exposed the documents' translucent structures, enabling quantitative visualisation of embedded security features.

The large number of fraudulent identity documents in circulation continues to be a concern for the UK Government, with organised, transnational crime and the possibility of criminals and terrorists crossing international borders undetected remaining persistent threats. Passport fraud remains one of the greatest threats to global security. While authorities have introduced an increasing number of security features in the latest generation of identification documents (such as several layers of polycarbonate), this sophistication can make distinguishing legitimate from counterfeit documents an ever-evolving challenge. There is therefore an unmet and evolving need to identify such sophisticated forgeries in a non-destructive, high-throughput manner.

Robert Green OBE, said: 'As documents become harder to forge, so does the sophistication of forgery detection. Although more secure than their predecessors, the latest generation of identity documents manufactured using polycarbonate layers remain susceptible to counterfeiting. Fraudsters tend to adopt tactics such as copying paper or polycarbonate, reproducing documents and hologram images using sophisticated computer technology before re-laminating. Any of these tactics will affect the inner structure of a document, showing the importance of its subsurface characterisation and the benefit that OCT can provide to identify such tampering.'

Dr King said: 'We believe that the application of OCT can be used by multiple stakeholders in the field, especially forensic scientists working to validate suspected counterfeit documents and document manufacturers as a non-destructive method of quality control. OCT can preserve evidence which may be useful for criminal investigations, as well as prevent the unnecessary destruction of legitimate documents which may have been previously flagged as suspected forgeries.'

Credit: 
University of Kent

Automated imaging detects and tracks brain protein involved in Alzheimer's disease

image: By mapping individuals' unique brain anatomy, Massachusetts General Hospital researchers have identified early tau PET imaging signals to track Alzheimer's pathology. Left: Tau PET images for a person with normal cognition. Right: top, 3D brain surface rendering with tau PET overlay; bottom, flat map showing brain surface details, with cortical tau origin (rhinal cortex) outlined in white.

Image: 
Justin Sanchez

BOSTON - Amyloid-beta and tau are the two key abnormal protein deposits that accumulate in the brain during the development of Alzheimer's disease, and detecting their buildup at an early stage may allow clinicians to intervene before the condition has a chance to take hold. A team led by investigators at Massachusetts General Hospital (MGH) has now developed an automated method that can identify and track the development of harmful tau deposits in a patient's brain. The research, which is published in Science Translational Medicine, could lead to earlier diagnoses of Alzheimer's disease.

"While our understanding of Alzheimer's disease has increased greatly in recent years, many attempts to treat the condition so far have failed, possibly because medical interventions have taken place after the stage at which the brain injury becomes irreversible," says lead author Justin Sanchez, a data analyst at MGH's Gordon Center for Medical Imaging.

In an attempt to develop a method for earlier diagnosis, Sanchez and his colleagues, under the leadership of Keith A. Johnson, MD, of the departments of Radiology and Neurology at MGH, evaluated brain images of amyloid-beta and tau obtained by positron emission tomography, or PET, in 443 adults participating in several observational studies of aging and Alzheimer's disease. Participants spanned a wide range of ages, with varying degrees of amyloid-beta and cognitive impairment -- from healthy 20-year-olds to older patients with a clinical diagnosis of Alzheimer's dementia. The researchers used an automated method to identify the brain region most vulnerable to initial cortical tau buildup in each individual PET scan.

"We hypothesized that applying our method to PET images would enable us to detect the initial accumulation of cortical tau in cognitively normal people, and to track the spread of tau from this original location to other brain regions in association with amyloid-beta deposition and the cognitive impairment of Alzheimer's disease," explains Sanchez. He notes that cortical tau, when it spreads from its site of origin to neocortical brain regions under the influence of amyloid-beta, appears to be the "bullet" that injures brains in Alzheimer's disease.

The technique revealed that tau deposits first emerge in the rhinal cortex region of the brain, independently from amyloid-beta deposits, before spreading to the nearby temporal neocortex. "We observed initial cortical tau accumulation at this site of origin in cognitively normal individuals without evidence of elevated amyloid-beta, as early as 58 years old," says Sanchez.

Importantly, when the scientists followed 104 participants for two years, they found that people with the highest initial levels of tau at the point of origin exhibited the most spread of tau throughout the brain over time.

The findings suggest that PET measurements of tau focused on precisely individualized brain regions may predict an individual's risk of future tau accumulation and consequent Alzheimer's disease. Targeting tau when it is detected at an early stage might prevent the condition or slow its progression.

"Clinical trials evaluating the efficacy of anti-tau therapeutics would benefit from an automated, individualized imaging method to select cognitively normal individuals vulnerable to impending tau spread, thus advancing our efforts to provide effective interventions for patients at risk for Alzheimer's disease," says Sanchez.

Credit: 
Massachusetts General Hospital