
Rationally designing hierarchical zeolites for better diffusion and catalyst efficiency

Image: Overview of the synthesis routes toward zeolite-based hierarchical materials. ©Science China Press

Thanks to their varied crystalline topologies, tunable chemical composition, high (hydro)thermal stability, and controllable surface acidity/basicity, zeolites are widely used in petroleum refining, petrochemical manufacturing, fine chemical synthesis, biomedicine, environmental chemistry, and other fields. However, in many zeolite-catalyzed reactions the molecular diameters of the reacting species are larger than the pore apertures of the zeolite. This creates undesired diffusion resistance between the bulk phase and the active centers of the catalyst, significantly reducing catalyst efficiency.

Alleviating diffusion resistance and improving the efficiency of zeolite-based catalysts has long been one of the most pressing concerns in academia and industry. Over the past decades, the toolbox for integrating hierarchical micro-/mesoporous structures into zeolites to improve diffusion and catalyst efficiency has been greatly enriched.

However, in real industrial catalytic processes, even a zeolite with a hierarchically porous structure is only one component of a multi-component industrial catalyst. A zeolite-based industrial catalyst is itself a hierarchical structure composed of microporous zeolitic and macroporous non-zeolitic components. When a hierarchically porous zeolite is incorporated, the catalyst acquires a micro-/meso-/macroporous trimodal hierarchical structure. The hierarchical porosity of industrial zeolite-based catalysts therefore exists at two levels: "inside the zeolitic component" and "between the components of the industrial catalyst".

In a new review published in the Beijing-based journal National Science Review, scientists at the China University of Petroleum in Qingdao, China (Peng Peng, Zi-Feng Yan), the China National Petroleum Corporation in Beijing, China (Xiong-Hou Gao), and the French National Center for Scientific Research (CNRS) in Caen, France (Svetlana Mintova) analyzed the state of the art in the rational design of hierarchical micro-/mesoporous structures from a catalytic reaction engineering point of view.

From the perspective of catalytic reaction engineering, the quantitative indicators for evaluating catalyst efficiency are the catalyst effectiveness factor (η) and the Thiele modulus (φ). If the catalyst system suffers strong diffusion resistance (η well below 1, corresponding to a large Thiele modulus), much of the catalyst interior is effectively unused and the observed reaction rate falls far below the intrinsic rate.
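For orientation, the two quantities are linked by textbook relations; for a first-order reaction in a slab-shaped pellet (a standard simplification used here for illustration, not taken from the review itself),

\varphi = L\sqrt{\frac{k}{D_{\mathrm{eff}}}}, \qquad \eta = \frac{\text{observed rate}}{\text{intrinsic rate}} = \frac{\tanh\varphi}{\varphi},

where L is the characteristic diffusion length, k the intrinsic rate constant and D_eff the effective intraparticle diffusivity. When diffusion is fast (φ ≪ 1), η approaches 1; under strong diffusion limitation (φ ≫ 1), η ≈ 1/φ ≪ 1. Hierarchical micro-/mesoporous structures shorten L and raise D_eff, lowering φ and pushing η back toward 1.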

A zeolite with a hierarchically porous structure is only one component of a real industrial catalyst. To meet the requirements of mechanical strength, hydrothermal stability, and resistance to poisoning and coking in industrial catalytic processes, industrial catalysts must incorporate other, non-zeolitic components. Although the interactions between these components are not fully understood, a poor match between the pore structures of the zeolitic and non-zeolitic components can degrade the performance of the hierarchical zeolite. Coordinating the pore interconnectivity of hierarchical zeolites with that of the non-zeolitic components is therefore an urgent issue to address before hierarchical zeolites can be applied industrially.

The ultimate goal of preparing hierarchically porous materials is to fully release their potential at industrial scale by controlling the hierarchical pore structure and the location and interconnectivity of the different components, all of which play a pivotal role in enhancing catalytic efficiency. Developing combined in-situ or operando spectroscopic, microscopic, and diffraction techniques is the key to unraveling the structure-activity relationship of hierarchical zeolites as components of industrial catalysts.

Credit: 
Science China Press

Vaccine proves effective against the most severe type of pneumonia

Image: A pneumococcal vaccine was effective at protecting children in Laos against the most severe type of pneumonia, a new study has found. (Credit: Natee K Jindakum)

A pneumococcal vaccine was effective at protecting children in Laos against the most severe type of pneumonia, a new study has found.

The research, led by the Murdoch Children's Research Institute (MCRI) and published in The Lancet Regional Health - Western Pacific, found the PCV13 vaccine reduced hypoxic pneumonia and pneumonia requiring oxygen support by 37 per cent.

MCRI Dr Cattram Nguyen said although pneumococcal vaccines were known to reduce severe cases of childhood pneumonia, no studies from Asia had measured their effectiveness until now.

The study involved 826 children, aged up to five years, admitted to hospital with pneumonia. PCV13 reduced hypoxic pneumonia and pneumonia requiring extra oxygen by 37 per cent.

Dr Nguyen said that because pneumonia was a leading cause of childhood deaths in Laos, the PCV13 vaccine had great potential to alleviate this burden of disease among the most vulnerable. Pneumonia that requires oxygen therapy is one of the most severe manifestations of the disease.

"Universal health care did not exist in Laos until recently, and supplementary oxygen treatment was prohibitively expensive for families," she said.

In October 2013, Laos introduced the PCV13 vaccine into its national childhood vaccination program, supported by Gavi, the Vaccine Alliance. But the Ministry of Health requested evidence of the health benefits of the vaccine to support its ongoing use.

MCRI Professor Fiona Russell said Asian countries have been very slow to introduce PCV13 into their national immunisation programs.

"These results provide a compelling argument to continue childhood PCV13 vaccination in Laos and for its introduction into similar countries with high death rates from pneumonia," she said.

Professor Russell said the study also described a simple, low-cost, single-hospital-based method for assessing vaccine effectiveness that was feasible for other low- and middle-income countries to adopt. Measuring the success of this vaccine would usually require thousands of cases collected over many years of surveillance, often involving many hospitals, she said.

"In this study, we enrolled children hospitalised with hypoxic and non-hypoxic pneumonia in a single hospital and compared pneumococcal vaccination rates between the two groups to determine vaccine effectiveness over about four years," she said.

Globally, lower respiratory infections, including pneumonia, are a leading cause of death in children under five years old, causing 800,000 deaths annually, predominantly in low- and middle-income countries.

Streptococcus pneumoniae (the pneumococcus) is estimated to cause over half of all pneumonia-related deaths in children under five years old.

Credit: 
Murdoch Childrens Research Institute

Colors evoke similar feelings around the world

People all over the world associate colors with emotions. In fact, people from different parts of the world often associate the same colors with the same emotions. This was the result of a detailed survey of 4,598 participants from 30 nations over six continents, carried out by an international research team. "No similar study of this scope has ever been carried out," said Dr. Daniel Oberfeld-Twistel, member of the participating team at Johannes Gutenberg University Mainz (JGU). "It allowed us to obtain a comprehensive overview and establish that color-emotion associations are surprisingly similar around the world."

In the current issue of Psychological Science, the scientists report that the participants were asked to fill out an online questionnaire, which involved assigning up to 20 emotions to twelve different color terms. The participants were also asked to specify the intensity with which they associated the color term with the emotion. The researchers then calculated the national averages for the data and compared these with the worldwide average. "This revealed a significant global consensus," summarized Oberfeld-Twistel. "For example, throughout the world the color of red is the only color that is strongly associated with both a positive feeling - love - and a negative feeling - anger." Brown, on the other hand, triggers the fewest emotions globally. However, the scientists also noted some national peculiarities. For example, the color of white is much more closely associated with sadness in China than it is in other countries, and the same applies to purple in Greece. "This may be because in China white clothing is worn at funerals and the color dark purple is used in the Greek Orthodox Church during periods of mourning," explained Oberfeld-Twistel. In addition to such cultural peculiarities, the climate may also play a role. According to the findings from another of the team's studies, yellow tends to be more closely associated with the emotion of joy in countries that see less sunshine, while the association is weaker in areas that have greater exposure to it.

According to Dr. Daniel Oberfeld-Twistel, it is currently difficult to say exactly what the causes for global similarities and differences are. "There is a range of possible influencing factors: language, culture, religion, climate, the history of human development, the human perceptual system." Many fundamental questions about the mechanisms of color-emotion associations have yet to be clarified, he continued. However, by using an in-depth analysis that included the use of a machine learning approach developed by Oberfeld-Twistel, a computer program that improves itself as the database grows, the scientists have already discovered that the differences between individual nations are greater the more they are geographically separated and/or the greater the differences between the languages spoken in them.

Credit: 
Johannes Gutenberg Universitaet Mainz

Male circumcision campaigns in Africa to fight HIV are a form of cultural imperialism

World Health Organization-recommended campaigns to circumcise millions of African boys and men to reduce HIV transmission are based more on systemic racism and 'neocolonialism' than sound scientific research, according to a critical appraisal published in Developing World Bioethics.

More than 25 million men and boys have already been circumcised as a result of voluntary medical male circumcision (VMMC) campaigns in eastern and southern Africa, implemented by the United States government and Western non-governmental organisations (NGOs).

The critical appraisal examined the history and politics of these circumcision campaigns in the context of race and colonialism, and found that they had been started in haste and without sufficient contextual research. The paper concluded that the campaigns have been carried out in a manner that implies troubling assumptions about culture, health and sexuality in Africa. Africans were underrepresented in the decision-making process and needed a greater voice in the planning of such an intimate health intervention.

Max Fish, lead author and founder of the VMMC Experience Project, a grassroots effort to elevate African voices about the effects of the campaigns on their lives, said: "There has been a global spotlight on systemic racism--and racist institutions--following the death of George Floyd, an African American man, at the hands of a White police officer in May. However, unethical human experimentation on Africans and African Americans remains a pervasive problem in Western medicine that has received relatively little attention."

"Africa was targeted, and it is still being targeted," said Cleophas Matete, a Kenyan bishop interviewed by the VMMC Experience Project, who is quoted in the study. "It is used as a continent to experiment. Should they introduce anything that is [morally questionable], they want to experiment in Africa. So I believe that the entire process of trying to test it in Africa was wrong from the beginning, and I say no to it."

Dr Arianne Shahvisi, Senior Lecturer in Ethics at Brighton and Sussex Medical School and second author, said: "We believe the decision to implement the circumcision campaign in southern and eastern Africa was not based on robust scientific evidence, but instead assumed that the results from clinical trials would safely 'scale' to the real world without thinking through the cultural implications. We argue that as a surgically corrective measure, the present circumcision campaigns hinge on racist, homogenising assumptions about the sexuality of those who are targeted, as well as a belief that HIV risk behaviours can be appraised independently of poverty and systemic factors."

There has been a long history of unethical medical research conducted on Africans and African Americans, including the infamous "Tuskegee Study of Untreated Syphilis in the Negro Male," in which African American syphilis patients living in rural poverty were observed but not treated, leading to suffering, the spread of infection and widespread death, and subsequent concerns about medical exploitation among these communities.

The decision to implement the circumcision policies in Africa was based on three clinical trials conducted in South Africa, Uganda, and Kenya, which showed that circumcision reduced men's HIV risk by 50-60% over two years. However, critics have alleged that the trials had serious limitations: they could not be placebo-controlled, and participants were explicitly informed of the study's aim to establish a lower HIV incidence following circumcision.

In addition, HIV prevalence at the start of the campaign was higher in circumcised than uncircumcised men in 10 out of 18 countries where such data was available, including five countries that were targeted for mass circumcision.

A fourth trial seeking to establish an HIV risk reduction for women allowed HIV-positive Ugandan men to infect unknowing partners--one of Tuskegee's ethical violations. This trial was stopped early for "futility" after partners of newly circumcised men became infected at a 55% higher rate, although this has received much less attention from the global public health community.

The critical appraisal was conducted by ethicists, legal and medical experts from the UK, US, Cameroon, Zimbabwe and South Africa.

Credit: 
University of Sussex

Bumblebees benefit from faba bean cultivation

Image: A bumblebee (Bombus hortorum) collects nectar from a faba bean flower. (Credit: Nicole Beyer)

About one third of the payments received by farmers are linked to specific "greening measures" to promote biodiversity. The cultivation of nitrogen-fixing legumes is very popular. However, these measures have been criticized because the benefits for biodiversity are unclear. Now a team from the University of Göttingen, the Julius Kühn Institute and the Thuenen Institute in Braunschweig has investigated whether the cultivation of the faba bean (Vicia faba - also known as the broad bean or fava bean) can support wild bees. It turns out that bumblebees benefit from the cultivation of faba beans, while all other wild bees depend on the presence of semi-natural habitats. The results of the study have been published in the Journal of Applied Ecology.

The researchers recorded wild bees in various German agricultural landscapes for the study. In one half of the landscapes, conventionally farmed faba beans were cultivated; in the other half there were no bean fields. "The nectar of the faba bean is hidden deep in the flowers and is only easily accessible to larger bees with long tongues, such as bumblebees. We therefore wanted to investigate how groups of wild bees, which differ in their external appearance, react to the cultivation of faba beans and whether they can benefit from it," says first author Nicole Beyer from the Functional Agrobiodiversity Group at the University of Göttingen. The study results show that there were more than twice as many bumblebees in the faba bean landscapes than in the landscapes without beans. In contrast, the cultivation of beans did not affect other wild bees. However, these other wild bees benefited from a high proportion of semi-natural habitats.

"Our research clearly showed that certain bee species can be supported by similar measures in farmed areas. But the benefits depend strongly on the characteristics of the crop and pollinator. In order to encourage the widest possible range of species, we propose a combination of measures: the cultivation of various flowering arable crops such as faba beans and the promotion or preservation of semi-natural habitats with a diverse range of flowers and nesting sites for many other wild bees," concludes Professor Catrin Westphal, Head of Functional Agrobiodiversity at the University of Göttingen.

Credit: 
University of Göttingen

Binge-drinkers' brains have to work harder to feel empathy for others

Image: A standard brain image from Dr Rae's laboratory (not from the study). (Credit: Dr Charlotte Rae)

People who binge-drink show more extensive dysfunction across their brains than previously realised, a new study from the University of Sussex has shown.

The research shows that binge-drinkers' brains have to put more effort into trying to feel empathy for other people in pain.

The paper "Differential brain responses for perception of pain during empathic response in binge drinkers compared to non-binge drinkers" is published in the October 2020 edition of the Neuroimage: Clinical journal. The study involved 71 participants (from France and the UK) whose brain activity was observed in fMRI scanners while undertaking a pain perception task. Half of these people were classified as binge-drinkers and half were not. The binge-drinkers were sober while they were being observed.

In the task participants were shown an image of a limb being injured, and asked to imagine either that the body part was theirs, or that of another person, and to state how much pain was associated with the image. The binge-drinking participants struggled more than their non-binge-drinking counterparts when trying to adopt the perspective of another person experiencing the pain: they took more time to respond and the scans revealed that their brains had to work harder - to use more neural resources - to appreciate how intensely another person would feel pain.

The study also revealed a more widespread dysfunction than previously realised; a visual area of the brain, which is involved in recognising body parts, showed unusually high levels of activation in the binge-drinkers. This was not true in the non-binge drinkers who looked at the same images.

When the binge-drinkers were asked to imagine the injured body part in the picture as their own, their pain estimate was not different from that of their non-binge drinking counterparts.

Professor Theodora Duka from the School of Psychology at the University of Sussex said:

"I have been studying the effects of drinking excessive alcohol for many years. In that time I have built up a strong body of evidence about the widespread way in which binge-drinking is associated with brain dysfunction in areas supporting self-control and attention. Our aim with the present study was to examine whether binge drinkers show less empathy and their brains show different responses to non-binge drinkers, when they imagine another person in pain. Reduced empathy in binge drinkers may facilitate drinking as it can blunt the perception of suffering of self or others during a drinking session. We have shown with this study that dysfunction associated with binge drinking is even more extensive than previously known. A region of the brain called the Fusiform Body Area associated with recognition of body parts showed hyperactivity in binge-drinkers in a situation in which feelings of empathy are experienced.

Dr Charlotte Rae from the School of Psychology at the University of Sussex said:

"Our results are quite surprising. Our data show that binge-drinkers need to work harder to feel empathy for other people in pain. They need to use more resources in terms of higher brain activity than non-binge drinkers. What this means in everyday life is that people who binge-drink might struggle to perceive the pain of others as easily as non-binge drinkers do. It's not that binge drinkers feel less empathy - it's just that they have to put more brain resource into being able to do so. However, under certain circumstances when resources become limited, binge drinkers may struggle to engage in an empathic response to others."

Binge drinking is defined as consuming more than 60 g of pure alcohol (equivalent to about three quarters of a bottle of wine, or 2½ pints of lager) on at least one occasion in the past 30 days. About 30% of all adults (over 15 years of age) who drink alcohol in the UK and France meet this criterion.

Credit: 
University of Sussex

Epigenetic changes precede onset of diabetes

Image: The researchers first identified early changes in DNA methylation and expression patterns in the islets of Langerhans of diabetes-prone mice, and then investigated which of these could be identified in humans before diabetes was diagnosed. (Credit: DIfE)

Epigenetic changes in the islets of Langerhans of the pancreas can be detected in patients several years before the diagnosis of type 2 diabetes (T2D). These changes involve altered methylation of specific genes, which differs from the methylation seen in healthy individuals. In humans, 105 such changes have been discovered in blood cells. This was shown in a study by researchers from the DZD/DIfE, which has now been published in the journal Diabetes. These findings could help to develop diagnostic markers for type 2 diabetes.

Several factors play a role in the development of type 2 diabetes. These include a genetic predisposition and epigenetic factors, as well as a diet high in fat and sugar, overweight, and lack of exercise. In order to prevent the development of the disease, it is important to identify people at increased risk at an early stage. Since the development of diabetes can also lead to functional disorders in the islets of Langerhans in the pancreas, researchers from the German Institute of Human Nutrition (DIfE) and the German Center for Diabetes Research (DZD) investigated whether there are epigenetic changes in the islets of Langerhans that are related to the development of diabetes. Lund University also participated in the study.

"Our aim was to identify early changes in DNA methylation and the expression pattern in the islets of Langerhans in a diabetes-prone mouse and then to test which of these can also be detected in the blood of humans before diabetes is diagnosed," said Prof. Dr. Annette Schürmann, spokesperson of the DZD and head of the Department of Experimental Diabetology at DIfE, explaining the translational research approach. For this purpose, obese mice were fed a high-calorie diet for five weeks and divided into diabetes-prone and diabetes-resistant animals on the basis of certain criteria (e.g. the liver fat content). The DNA methylations and expression patterns in the islets of Langerhans were determined for both groups. "We were able to identify 497 candidates which differed both in terms of their expression and their DNA methylation," said first author Dr. Meriem Ouni.

The next step was to search for similar epigenetic changes in blood cells of participants in the EPIC-Potsdam study (270 controls and 270 incident T2D cases, on average 3.8 years before diagnosis). The researchers found altered levels of DNA methylation in 105 genes that were associated with the later diagnosis of diabetes. Most of these changes were also found in the islets of Langerhans of type 2 diabetes patients. The researchers assume that most of the alterations in DNA methylation that can be detected in the blood before diagnosis are still present in the islets of Langerhans later in the course of the disease.

"Our broad and translational research approach has identified a number of interesting genes whose expression and altered DNA methylation are associated with the later diagnosis of diabetes," said Schürmann. "In humans, 105 such differences can be detected in blood cells a few years prior to the diabetes diagnosis. This may open up the possibility of using some of these changes as diagnostic markers for type 2 diabetes in the future. "

Next, the researchers want to investigate whether diets or certain drugs can correct unfavorable DNA methylation patterns. They also want to determine whether the identified markers differ in the various diabetes clusters.

Credit: 
Deutsches Zentrum fuer Diabetesforschung DZD

Experiments reveal why human-like robots elicit uncanny feelings

Androids, or robots with humanlike features, are often more appealing to people than those that resemble machines -- but only up to a certain point. Many people experience an uneasy feeling in response to robots that are nearly lifelike, and yet somehow not quite "right." The feeling of affinity can plunge into one of repulsion as a robot's human likeness increases, a zone known as "the uncanny valley."

New insights into the cognitive mechanisms underlying this phenomenon, from psychologists at Emory University, have been published in the journal Perception.

Since the uncanny valley was first described, a common hypothesis developed to explain it. Known as the mind-perception theory, it proposes that when people see a robot with human-like features, they automatically add a mind to it. A growing sense that a machine appears to have a mind leads to the creepy feeling, according to this theory.

"We found that the opposite is true," says Wang Shensheng, first author of the new study, who did the work as a graduate student at Emory and recently received his PhD in psychology. "It's not the first step of attributing a mind to an android but the next step of 'dehumanizing' it by subtracting the idea of it having a mind that leads to the uncanny valley. Instead of just a one-shot process, it's a dynamic one."

The findings have implications for both the design of robots and for understanding how we perceive one another as humans.

"Robots are increasingly entering the social domain for everything from education to healthcare," Wang says. "How we perceive them and relate to them is important both from the standpoint of engineers and psychologists."

"At the core of this research is the question of what we perceive when we look at a face," adds Philippe Rochat, Emory professor of psychology and senior author of the study. "It's probably one of the most important questions in psychology. The ability to perceive the minds of others is the foundation of human relationships. "

The research may help in unraveling the mechanisms involved in mind-blindness -- the inability to distinguish between humans and machines -- such as in cases of extreme autism or some psychotic disorders, Rochat says.

Co-authors of the study include Yuk Fai Cheong and Daniel Dilks, both associate professors of psychology at Emory.

Anthropomorphizing, or projecting human qualities onto objects, is common. "We often see faces in a cloud for instance," Wang says. "We also sometimes anthropomorphize machines that we're trying to understand, like our cars or a computer."

Naming one's car or imagining that a cloud is an animated being, however, is not normally associated with an uncanny feeling, Wang notes. That led him to hypothesize that something other than just anthropomorphizing may occur when viewing an android.

To tease apart the potential roles of mind-perception and dehumanization in the uncanny valley phenomenon, the researchers conducted experiments focused on the temporal dynamics of the process. Participants were shown three types of images -- human faces, mechanical-looking robot faces and android faces that closely resembled humans -- and asked to rate each for perceived animacy or "aliveness." The exposure times of the images were systematically manipulated, within milliseconds, as the participants rated their animacy.

The results showed that perceived animacy decreased significantly as a function of exposure time for android faces, but not for mechanical-looking robot or human faces. For android faces, the perceived animacy dropped between 100 and 500 milliseconds of viewing time. That timing is consistent with previous research showing that people begin to distinguish between human and artificial faces around 400 milliseconds after stimulus onset.
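A minimal numerical sketch of that pattern (toy numbers invented for illustration; not the study's data, stimuli or analysis pipeline) is to fit a slope of animacy rating against exposure time separately for each face category and check that only the android category trends downward:

# Toy illustration only -- synthetic ratings, not the Emory dataset.
import numpy as np

rng = np.random.default_rng(2)
exposure_ms = np.tile(np.array([50.0, 100.0, 250.0, 500.0, 1000.0]), 20)

def animacy_slope(ratings, x=exposure_ms):
    """Least-squares slope of rating vs. exposure time (rating units per ms)."""
    return np.polyfit(x, ratings, 1)[0]

ratings = {
    "human":   rng.normal(8.5, 0.5, exposure_ms.size),                            # stays high
    "android": 7.0 - 0.002 * exposure_ms + rng.normal(0, 0.5, exposure_ms.size),  # decays with exposure
    "robot":   rng.normal(2.0, 0.5, exposure_ms.size),                            # stays low
}
for face, r in ratings.items():
    print(face, round(animacy_slope(r), 5))   # only the android slope is clearly negative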

A second set of experiments manipulated both the exposure time and the amount of detail in the images, ranging from a minimal sketch of the features to a fully blurred image. The results showed that removing details from the images of the android faces decreased the perceived animacy along with the perceived uncanniness.

"The whole process is complicated but it happens within the blink of an eye," Wang says. "Our results suggest that at first sight we anthropomorphize an android, but within milliseconds we detect deviations and dehumanize it. And that drop in perceived animacy likely contributes to the uncanny feeling."

Credit: 
Emory Health Sciences

Concussion discovery reveals dire, unknown effect of even mild brain injuries

Image: UVA School of Medicine researchers John Lukens, PhD, and Ashley Bolte have discovered that concussions and traumatic brain injuries, even when mild, cause swelling that blocks the brain's ability to clean itself of harmful toxins and debris. (Credit: Dan Addison | UVA Health)

UVA researchers have discovered that concussions and traumatic brain injuries, even when mild, cause swelling that blocks the brain's ability to clean itself of harmful toxins and debris.

In addition to an immediate impact on memory and brain inflammation, this may seed the brain for Alzheimer's, dementia and other neurodegenerative diseases.

The discovery helps explain why repeated brain injuries are so harmful and suggests they increase the risk of long-term problems.

It also suggests a reason why blows to the head affect different people differently.

The findings point to a new approach to treating brain injury and could lead to a better way to determine when it is safe for athletes and military personnel to resume their duties.

Even mild concussions cause severe and long-lasting impairments in the brain's ability to clean itself of toxins, and this may seed it for Alzheimer's disease, dementia and other neurodegenerative problems, new research from the University of Virginia School of Medicine reveals.

The discovery offers important insights into traumatic brain injury (TBI), a poorly understood condition that has become a major public concern, particularly in sports and for the military. The findings help explain why TBI is so harmful and why it can have such long-term effects. The research also suggests that certain patients are at greater risk of a decline in brain function later in life, and it paves the way for new and better treatments.

"This provides some of the best evidence yet that if you haven't recovered from a brain injury and you get hit in the head again, you're going to have even more severe consequences," said John Lukens, PhD, of UVA's Department of Neuroscience and the Center for Brain Immunology and Glia (BIG). "This reinforces the idea that you have to give people an opportunity to heal. And if you don't, you're putting yourself at a much higher risk for long-term consequences that you might not see in a year but could see in a couple of decades."

New Understanding of TBI

Lukens' research identifies a previously unknown consequence of TBI that can have long-lasting effects. When the brain swells, it presses against the skull; trapped in-between are tiny lymphatic vessels that clean the brain. This pressure on the vessels, the UVA researchers found, causes serious and long-lasting impairment of the brain's ability to purge itself of toxins. Working with lab mice, one of the best models of TBI available, the scientists found the impairment could last at least two weeks - a long time for mice - and possibly much longer.

These lymphatic vessels were identified by Jonathan Kipnis, PhD, and his collaborators at UVA in 2015. Until then, medical textbooks insisted the vessels did not exist and that the brain was "immune privileged," meaning that it did not interact with the immune system. Kipnis' discovery changed all that, and he has since determined the vessels play important roles in both Alzheimer's and the cognitive decline that comes with age.

Now they emerge as an important player in TBI. "We know that traumatic brain injury carries an increased risk for a bunch of long-term issues like dementia, Alzheimer's disease and CTE [chronic traumatic encephalopathy], and this has really been made extra public because of the NFL," said researcher Ashley C. Bolte, an MD/PhD student. "Then there's also anxiety, depression, suicide. The reasons why TBI results in increased risk for this isn't totally known, and we think that our findings might provide a mechanism as to why."

People Most at Risk

The research suggests that people who have pre-existing problems with their brain drainage, either from prior concussions or naturally, are likely to suffer much more severe consequences from TBI. In mice, this led to more brain inflammation and worse outcomes, including memory impairment. "If you have a pre-existing kink in the pipes and you get hit in the head, then everything is taken to a higher level - the impacts on memory, the neuroinflammation," Lukens said. "There are a lot of implications to it."

Emerging imaging technology may eventually make it possible for doctors to identify people who will suffer the greatest consequences of TBI. More good news: Lukens also believes that doctors may one day be able to rejuvenate the impaired lymphatic vessels with drugs to improve patients' outcomes and possibly stave off long-term consequences. (This also may prove useful in the battle against the cognitive decline that naturally occurs with age.)

In addition, Lukens said, it eventually may be possible for doctors to evaluate brain drainage after injury to determine when it is safest for patients to return to action.

"Right now, we really don't know what to tell these kids who want to get back out on the field, or even members of the military," Lukens said. "It would be important to have empirical tests to say you can continue or never to do those things ever again."

Credit: 
University of Virginia Health System

Tel Aviv University study confirms widespread literacy in biblical-period kingdom of Judah

Image: Examples of Hebrew ostraca from Arad. (Credit: Michael Cordonsky, TAU and the Israel Antiquities Authority)

Researchers at Tel Aviv University (TAU) have analyzed 18 ancient texts dating back to around 600 BCE from the Tel Arad military post using state-of-the-art image processing, machine learning technologies, and the expertise of a senior handwriting examiner. They have concluded that the texts were written by no fewer than 12 authors, suggesting that many of the inhabitants of the kingdom of Judah during that period were able to read and write, with literacy not reserved as an exclusive domain in the hands of a few royal scribes.

The special interdisciplinary study was conducted by TAU's Dr. Arie Shaus, Ms. Shira Faigenbaum-Golovin, and Dr. Barak Sober of the Department of Applied Mathematics; Prof. Eli Piasetzky of the Raymond and Beverly Sackler School of Physics and Astronomy; and Prof. Israel Finkelstein of the Jacob M. Alkow Department of Archeology and Ancient Near Eastern Civilizations. The forensic handwriting specialist, Ms. Yana Gerber, is a senior expert who served for 27 years in the Questioned Documents Laboratory of the Israel Police Division of Identification and Forensic Science and its International Crime Investigations Unit.

The results were published in PLOS ONE on September 9, 2020.

"There is a lively debate among experts as to whether the books of Deuteronomy, Joshua, Judges, Samuel, and Kings were compiled in the last days of the kingdom of Judah or after the destruction of the First Temple by the Babylonians," Dr. Shaus explains. "One way to try to get to the bottom of this question is to ask when there was the potential for the writing of such complex historical works.

"For the period following the destruction of the First Temple in 586 BC, there is very scant archaeological evidence of Hebrew writing in Jerusalem and its surroundings, but an abundance of written documents has been found for the period preceding the destruction of the Temple. But who wrote these documents? Was this a society with widespread literacy, or was there just a handful of literate people?"

To answer this question, the researchers examined the ostraca (fragments of pottery vessels containing ink inscriptions) discovered at the Tel Arad site in the 1960s. Tel Arad was a small military post on the southern border of the kingdom of Judah; its built-up area was about 20,000 square feet and it housed between 20 and 30 soldiers.

"We examined the question of literacy empirically, from different directions of image processing and machine learning," says Ms. Faigenbaum-Golovin. "Among other things, these areas help us today with the identification, recognition, and analysis of handwriting, signatures, and so on. The big challenge was to adapt modern technologies to 2,600-year-old ostraca. With a lot of effort, we were able to produce two algorithms that could compare letters and answer the question of whether two given ostraca were written by two different people."

In 2016, the researchers theorized that 18 of the Tel Arad inscriptions were written by at least four different authors. Combined with additional textual evidence, the researchers concluded that there were in fact at least six different writers. The study aroused great interest around the world.

The TAU researchers then decided to compare the algorithmic methods, which have since been refined, to the forensic approach. To this end, Ms. Gerber joined the team. After an in-depth examination of the ancient inscriptions, she found that the 18 texts were written by at least 12 distinct writers with varying degrees of certainty. She examined the original Tel Arad ostraca at the Israel Museum, the Eretz Israel Museum, the Sonia and Marco Nedler Institute of Archaeology of Tel Aviv University, and the Israel Antiquities Authority's warehouses at Beit Shemesh.

Ms. Gerber explained:

"This study was very exciting, perhaps the most exciting in my professional career. These are ancient Hebrew inscriptions written in ink on shards of pottery, utilizing an alphabet that was previously unfamiliar to me. I studied the characteristics of the writing in order to analyze and compare the inscriptions, while benefiting from the skills and knowledge I acquired during my bachelor's degree studies in classical archaeology and ancient Greek at Tel Aviv University. I delved into the microscopic details of these inscriptions written by people from the First Temple period, from routine issues such as orders concerning the movement of soldiers and the supply of wine, oil, and flour, through correspondence with neighboring fortresses, to orders that reached the Tel Arad fortress from the high ranks of the Judahite military system. I had the feeling that time had stood still and there was no gap of 2,600 years between the writers of the ostraca and ourselves.

"Handwriting is made up of unconscious habit patterns. The handwriting identification is based on the principle that these writing patterns are unique to each person and no two people write exactly alike. It is also assumed that repetitions of the same text or characters by the same writer are not exactly identical and one can define a range of natural handwriting variations specific to each one. Thus, forensic handwriting analysis aims at tracking features corresponding to specific individuals, and concluding whether a single or rather different authors wrote the given documents.

"The examination process is divided into three steps: analysis, comparison, and evaluation. The analysis includes a detailed examination of every single inscription, according to various features, such as the spacing between letters, their proportions, slant, etc. The comparison is based upon the aforementioned features across various handwritings. In addition, consistent patterns,such the same combinations of letters, words, and punctuation, are identified. Finally, an evaluation of identicalness or distinctiveness of the writers is made. It should be noted that, according to an Israel Supreme Court ruling, a person can be convicted of a crime based on the opinion of a forensic handwriting expert."

Dr. Shaus further elaborated:

"We were in for a big surprise: Yana identified more authors than our algorithms did. It must be understood that our current algorithms are of a "cautious" nature -- they know how to identify cases in which the texts were written by people with significantly different writing; in other cases they refrain from definite conclusions. In contrast, an expert in handwriting analysis knows not only how to spot the differences between writers more accurately, but in some cases may also arrive at the conclusion that several texts were actually written by a single person. Naturally, in terms of consequences, it is very interesting to see who the authors are. Thanks to the findings, we were able to construct an entire flowchart of the correspondence concerning the military fortress -- who wrote to whom and regarding what matter. This reflects the chain of command within the Judahite army.

"For example, in the area of Arad, close to the border between the kingdoms of Judah and Edom, there was a military force whose soldiers are referred to as "Kittiyim" in the inscriptions, most likely Greek mercenaries. Someone, probably their Judahite commander or liaison officer, requested provisions for the Kittiyim unit. He writes to the quartermaster of the fortress in Arad "give the Kittiyim flour, bread, wine" and so on. Now, thanks to the identification of the handwriting, we can say with high probability that there was not only one Judahite commander writing, but at least four different commanders. It is conceivable that each time another officer was sent to join the patrol, they took turns."

According to the researchers, the findings shed new light on Judahite society on the eve of the destruction of the First Temple -- and on the setting of the compilation of biblical texts. Dr. Sober explains:

"It should be remembered that this was a small outpost, one of a series of outposts on the southern border of the kingdom of Judah. Since we found at least 12 different authors out of 18 texts in total, we can conclude that there was a high level of literacy throughout the entire kingdom. The commanding ranks and liaison officers at the outpost, and even the quartermaster Eliashib and his deputy, Nahum, were literate. Someone had to teach them how to read and write, so we must assume the existence of an appropriate educational system in Judah at the end of the First Temple period. This, of course, does not mean that there was almost universal literacy as there is today, but it seems that significant portions of the residents of the kingdom of Judah were literate. This is important to the discussion on the composition of biblical texts. If there were only two or three people in the whole kingdom who could read and write, then it is unlikely that complex texts would have been composed."

Prof. Finkelstein concludes:

"Whoever wrote the biblical works did not do so for us, so that we could read them after 2,600 years. They did so in order to promote the ideological messages of the time. There are different opinions regarding the date of the composition of biblical texts. Some scholars suggest that many of the historical texts in the Bible, from Joshua to II Kings, were written at the end of the 7th century BC, very close to the period of the Arad ostraca. It is important to ask who these texts were written for. According to one view, there were events in which the few people who could read and write stood before the illiterate public and read texts out to them. A high literacy rate in Judah puts things into a different light.

"Until now, the discussion of literacy in the kingdom of Judah has been based on circular arguments, on what is written within the Bible itself, for example on scribes in the kingdom. We have shifted the discussion to an empirical perspective. If in a remote place like Tel Arad there was, over a short period of time, a minimum of 12 authors of 18 inscriptions, out of the population of Judah which is estimated to have been no more than 120,000 people, it means that literacy was not the exclusive domain of a handful of royal scribes in Jerusalem. The quartermaster from the Tel Arad outpost also had the ability to read and appreciate them."

Credit: 
American Friends of Tel Aviv University

CityU develops anti-bacterial graphene face masks

image: Dr Ye's team uses the CO2 infrared laser system to generate graphene. Experiment results show that the graphene they produced exhibit a much better anti-bacterial efficiency than activated carbon fibre and melt-blown fabrics.

Image: 
City University of Hong Kong

Face masks have become an important tool in fighting against the COVID-19 pandemic. However, improper use or disposal of masks may lead to "secondary transmission". A research team from City University of Hong Kong (CityU) has successfully produced graphene masks with an anti-bacterial efficiency of 80%, which can be enhanced to almost 100% with exposure to sunlight for around 10 minutes. Initial tests also showed very promising results in the deactivation of two species of coronaviruses. The graphene masks are easily produced at low cost, and can help to resolve the problems of sourcing raw materials and disposing of non-biodegradable masks.

The research is conducted by Dr Ye Ruquan, Assistant Professor from CityU's Department of Chemistry, in collaboration with other researchers. The findings were published in the scientific journal ACS Nano, titled "Self-Reporting and Photothermally Enhanced Rapid Bacterial Killing on a Laser-Induced Graphene Mask".

Commonly used surgical masks are not anti-bacterial. This may lead to a risk of secondary transmission of bacterial infection when people touch the contaminated surfaces of used masks or discard them improperly. Moreover, the melt-blown fabrics used as bacterial filters pose an environmental problem because they are difficult to decompose. Therefore, scientists have been looking for alternative materials to make masks.

Converting other materials into graphene by laser

Dr Ye has been studying the use of laser-induced graphene for sustainable energy. When he was studying for his PhD at Rice University several years ago, the research team he worked in, led by his supervisor, discovered an easy way to produce graphene. They found that writing directly on carbon-containing polyimide films (a polymeric plastic material with high thermal stability) with a commercial CO2 infrared laser system can generate 3D porous graphene. The laser changes the structure of the raw material and hence generates graphene, which is why it is called laser-induced graphene.

Graphene is known for its anti-bacterial properties, so as early as last September, before the outbreak of COVID-19, Dr Ye had already thought of using laser-induced graphene to produce better-performing masks. He then kick-started the study in collaboration with researchers from the Hong Kong University of Science and Technology (HKUST), Nankai University, and other organisations.

Excellent anti-bacterial efficiency

The research team tested their laser-induced graphene with E. coli, and it achieved a high anti-bacterial efficiency of about 82%. In comparison, the anti-bacterial efficiencies of activated carbon fibre and melt-blown fabrics, both commonly used materials in masks, were only 2% and 9% respectively. Experimental results also showed that over 90% of the E. coli deposited on those materials remained alive even after 8 hours, while most of the E. coli deposited on the graphene surface were dead after 8 hours. Moreover, the laser-induced graphene showed a superior anti-bacterial capacity for aerosolised bacteria.

Dr Ye said that more research on the exact mechanism of graphene's bacteria-killing property is needed. But he believed it might be related to the damage of bacterial cell membranes by graphene's sharp edge. And the bacteria may be killed by dehydration induced by the hydrophobic (water-repelling) property of graphene.

Previous studies suggested that the COVID-19 virus loses its infectivity at high temperatures. So the team carried out experiments to test whether graphene's photothermal effect (producing heat after absorbing light) can enhance the anti-bacterial effect. The results showed that the anti-bacterial efficiency of the graphene material could be improved to 99.998% within 10 minutes under sunlight, while activated carbon fibre and melt-blown fabrics only showed efficiencies of 67% and 85% respectively.

The team is currently working with laboratories in mainland China to test the graphene material with two species of human coronaviruses. Initial tests showed that it inactivated over 90% of the virus in five minutes and almost 100% in 10 minutes under sunlight. The team plans to conduct testing with the COVID-19 virus later.

Their next step is to further enhance the anti-virus efficiency and develop a reusable strategy for the mask. They hope to release it to the market shortly after designing an optimal structure for the mask and obtaining the certifications.

Dr Ye described the production of laser-induced graphene as a "green technique". All carbon-containing materials, such as cellulose or paper, can be converted into graphene using this technique. And the conversion can be carried out under ambient conditions without using chemicals other than the raw materials, nor causing pollution. And the energy consumption is low.

"Laser-induced graphene masks are reusable. If biomaterials are used for producing graphene, it can help to resolve the problem of sourcing raw material for masks. And it can lessen the environmental impact caused by the non-biodegradable disposable masks," he added.

Dr Ye pointed out that producing laser-induced graphene is easy. Within just one and a half minutes, an area of 100 cm² can be converted into graphene to serve as the outer or inner layer of the mask. Depending on the raw materials used, the price of a laser-induced graphene mask is expected to fall between that of a surgical mask and an N95 mask. He added that by adjusting the laser power, the pore size of the graphene material can be modified so that its breathability is similar to that of surgical masks.

A new way to check the condition of the mask

To help users check whether graphene masks are still in good condition after a period of use, the team fabricated a hygroelectric generator powered by the electricity generated from the moisture in human breath. By measuring the change in the moisture-induced voltage as the user breathes through a graphene mask, it provides an indicator of the condition of the mask. Experimental results showed that the more bacteria and atmospheric particles accumulated on the surface of the mask, the lower the resulting voltage. "The standard of how frequently a mask should be changed is better to be decided by the professionals. Yet, this method we used may serve as a reference," suggested Dr Ye.

Credit: 
City University of Hong Kong

COVID-19 study links strict social distancing to much lower chance of infection

Using public transportation, visiting a place of worship, or otherwise traveling from the home is associated with a significantly higher likelihood of testing positive with the coronavirus SARS-CoV-2, while practicing strict social distancing is associated with a markedly lower likelihood, suggests a study from researchers at the Johns Hopkins Bloomberg School of Public Health.

For their analysis, the researchers surveyed a random sample of more than 1,000 people in the state of Maryland in late June, asking about their social distancing practices, use of public transportation, SARS-CoV-2 infection history, and other COVID-19-relevant behaviors. They found, for example, that those reporting frequent public transport use were more than four times as likely to report a history of testing positive for SARS-CoV-2 infection, while those who reported practicing strict outdoor social distancing were just a tenth as likely to report ever being SARS-CoV-2 positive.

The study is believed to be among the first large-scale evaluations of COVID-19-relevant behaviors that is based on individual-level survey data, as opposed to aggregated data from sources such as cellphone apps.

The results were published online on September 2 in Clinical Infectious Diseases.

"Our findings support the idea that if you're going out, you should practice social distancing to the extent possible because it does seem strongly associated with a lower chance of getting infected," says study senior author Sunil Solomon, MBBS, PhD, MPH, an associate professor in the Bloomberg School's Department of Epidemiology and an associate professor of medicine at Johns Hopkins School Medicine. "Studies like this are also relatively easy to do, so we think they have the potential to be useful tools for identification of places or population subgroups with higher vulnerability."

The novel coronavirus SARS-CoV-2 has infected nearly 27 million people around the world, of whom some 900,000 have died, according to the World Health Organization. In the absence of a vaccine, public health authorities have emphasized practices such as staying at home, wearing masks, and maintaining social distancing while in public. Yet there hasn't been a good way to monitor whether--and among which groups--such practices are being followed.

Solomon and colleagues, including first author Steven Clipman, a PhD candidate in the Bloomberg School's Department of International Health, quickly accessed willing survey participants via a company that maintains a large nationwide pool of potential participants as a commercial service for market research. The 1,030 people included in the study were all living in Maryland, which has logged more than 113,000 SARS-CoV-2 confirmed cases and nearly 3,700 confirmed deaths, according to the Maryland Department of Health.

The researchers asked the survey participants questions about recent travel outside the home, their use of masks, social distancing and related practices, and any confirmed infection with SARS-CoV-2 either recently or at all.

The results indicated that 55 (5.3 percent) of the 1,030 participants had tested positive for SARS-CoV-2 infection at any time, while 18 (1.7 percent) reported testing positive in the two weeks before they were surveyed.

The researchers found that when considering all the variables they could evaluate, spending more time in public places was strongly associated with having a history of SARS-CoV-2 infection. For example, an infection history was about 4.3 times more common among participants who stated that they had used public transportation more than three times in the prior two weeks, compared to participants who stated they had never used public transportation in the two-week period.

An infection history also was 16 times more common among those who reported having visited a place of worship three or more times in the prior two weeks, compared to those who reported visiting no place of worship during the period. The survey did not distinguish between visiting a place of worship for a religious service or other purposes, such as a meeting, summer camp or meal.

Conversely, those who reported practicing social distancing outdoors "always" were only 10 percent as likely to have a SARS-CoV-2 history, compared to those who reported "never" practicing social distancing.

An initial, relatively simple analysis linked many other variables to SARS-CoV-2 infection history, including being Black or Hispanic. But a more sophisticated, "multivariable" analysis suggested that many of these apparent links were largely due to differences in movement and social distancing.

"When we adjusted for other variables such as social distancing practices, a lot of those simple associations went away, which provides evidence that social distancing is an effective measure for reducing SARS-CoV-2 transmission," Clipman says.

The data indicated a greater adoption of social distancing practices among some groups who are especially vulnerable to serious COVID-19 illness, suggesting that they were relatively aware of their vulnerability. For example, 81 percent of participants over 65 reported always practicing social distancing at outdoor activities, while only 58 percent of 18- to 24-year-olds did so.

The results are consistent with the general public health message that mask-wearing, social distancing, and limiting travel whenever possible reduce SARS-CoV-2 transmission. The researchers suggest, though, that studies such as these, employing similarly rapid surveys of targeted groups, could also become useful tools for predicting where and among which groups infectious diseases will spread most quickly.

"We did this study in Maryland in June, and it showed among other things that younger people in the state were less likely to reduce their infection risk with social distancing--and a month later a large proportion of the SARS-CoV-2 infections detected in Maryland was among younger people," says Solomon. "So, it points to the possibility of using these quick, inexpensive surveys to predict where outbreaks are going to happen based on behaviors, and then mobilizing public health resources accordingly."

Solomon and his team are now conducting similar surveys in other states and are studying the surveys' potential as predictive epidemiological tools.

Credit: 
Johns Hopkins Bloomberg School of Public Health

Human norovirus strains differ in sensitivity to the body's first line of defense

image: Cluster of norovirus virions.

Image: 
CDC

Interferon (IFN) responses are one of the first defenses the body mounts against viral infections, and research has shown that they play a role in controlling viral replication. But when researchers at Baylor College of Medicine investigated whether IFN restricted human norovirus (HuNoV) infection in human intestinal enteroids (HIEs), a cultivation system that recapitulates many characteristics of human infection, they unexpectedly discovered that endogenous IFN responses by HIEs restricted the growth of HuNoV strain GII.3, but not of GII.4, the most common strain worldwide.

The findings, published in the Proceedings of the National Academy of Sciences USA, highlight the importance of considering strain differences when studying HuNoV biology and designing therapies.

"HuNoVs cause the majority of the cases of viral gastroenteritis in the world and bring about significant mortality in all age groups; yet, there are still no vaccines or other approved therapeutic strategies available," said first author Shih-Ching Lin, a graduate student in the laboratory of Dr. Mary Estes at Baylor. "In-depth studies of how virus and host interact have been possible only recently thanks to the development of several laboratory cultivation systems. In this study, we worked with HIEs to investigate their response to HuNoV infection and how it affects viral replication."

Novel insights into the interplay between HuNoV and HIEs

The researchers infected HIEs with HuNoV strain GII.3 or pandemic strain GII.4 and determined which HIE genes were activated as a result.

"We discovered that both strains preferentially triggered a type III IFN response, including activation of a number of IFN-stimulated genes, as well as increases in a subset of long non-coding RNAs. Changes in long non-coding RNAs, which are known to regulate gene expression, had never been reported before for gastrointestinal virus infections," Lin said.

Next, Lin, Estes and their colleagues studied the effect of IFN on HuNoV replication. Adding IFN to the cultures reduced replication of both strains, suggesting that IFN may have therapeutic value for HuNoV infections. This could be important for chronically infected immunocompromised patients, who can suffer from diarrhea for years.

To obtain insights into the genes of the IFN pathway that contribute to the antiviral response to HuNoV, the researchers knocked out several of these genes in HIE cultures, infected them with strain GII.3 or GII.4 and measured the rate of viral proliferation.

"We expected that the absence of IFN responses by HIEs would promote viral replication in both strains. It was surprising and very exciting to find significant strain differences," Lin said.

"We saw that only strain GII.3 was able to spread and multiply more when HIEs could not activate IFN responses. Replication of GII.4, on the other hand, was not enhanced," said Estes, Cullen Foundation Endowed Chair and Distinguished Service Professor of molecular virology and microbiology at Baylor. Estes also is a member of the Dan L Duncan Comprehensive Cancer Center. "It was exciting to see the GII.3 strain proliferate and spread in the cultures as we had never seen it before."

"The strain-specific sensitivities of innate IFN responses to HuNoV replication we observed provide a potential explanation for why GII.4 infections are more widespread and become pandemic," Lin said. "Our findings also show the importance of keeping potential strain differences in mind when studying the biology of HuNoV and developing therapies. Our new genetically modified HIE cultures also will be useful tools for studying innate immune responses to other viral or microbial pathogens."

Credit: 
Baylor College of Medicine

Levodopa may improve vision in patients with macular degeneration

image: Spectral-domain optical coherence tomography images of the same macular segmentation line at baseline and monthly follow-up visits in a patient naïve to intravitreal anti-vascular endothelial growth factor (VEGF) therapy. There was a 59 percent reduction in retinal fluid at one month; retinal fluid completely resolved at the same macular segmentation line at month 2, and fluid remained stable up to month 3 without anti-VEGF injections. Collectively, all macular segmentation lines revealed a 92 percent total retinal fluid decrease at 3 months.

Image: 
The American Journal of Medicine

Philadelphia, September 10, 2020 - Investigators have determined that treating patients with an advanced form of age-related macular degeneration (AMD) with levodopa, a safe and readily available drug commonly used to treat Parkinson's disease, stabilized and improved their vision. It reduced the number of treatments necessary to maintain vision, and as such, will potentially reduce the burden of treating the disease, financially and otherwise. Their findings appear in the American Journal of Medicine, published by Elsevier.

More than 15 percent of the US population over the age of 70 has AMD, a common cause of blindness in developed nations. Neovascular AMD (nAMD) is characterized by the abnormal growth of new blood vessels, triggered by vascular endothelial growth factor (VEGF), which can cause fluid and blood to leak in the subretinal space of the eye. While nAMD represents only 10-15 percent of all AMD cases, it is responsible for 90 percent of the vision loss attributed to the disease. The standard treatment requires frequent injections of agents to block VEGF. While effective, the injections are expensive and painful.

Earlier research found that patients being treated with levodopa for movement disorders such as Parkinson's disease were significantly less likely to develop any type of AMD. Lead investigator Robert W. Snyder, MD, PhD, Department of Biomedical Engineering, The University of Arizona, Tucson, and Snyder Biomedical Corporation, Tucson, AZ, USA, explained, "Levodopa has a receptor (GPR143) selectively expressed on pigmented cells. This receptor can be supportive of retinal health and survival, which led to the development of our hypothesis that it may prevent or treat AMD."

The investigators developed two proof-of-concept studies to test whether levodopa improves visual acuity and the anatomical changes caused by nAMD. They also evaluated the safety and tolerability of the drug in treating nAMD and whether treatment reduced or delayed the need for anti-VEGF therapy.

In the first study, 20 patients newly diagnosed with nAMD who had never had VEGF treatment were given a small daily dose of levodopa for one month and were evaluated weekly by their referring retina specialist, who determined whether anti-VEGF treatment was needed. In the second part of the study, the patients who completed the first study and a second group of 14 patients who had received anti-VEGF treatment for at least three months before the study received escalating doses of levodopa to test the tolerance and efficacy of the drug. The patients continued to be evaluated monthly by their referring retina specialist.

This trial demonstrated for the first time that levodopa is safe and well tolerated, and that it delayed the need for anti-VEGF injection therapy while improving visual outcomes. In the first month, retinal fluid decreased by 29 percent. After six months the decrease in retinal fluid was sustained and mean visual acuity improved, enabling patients in the first and second groups to read an additional line on the eye chart. This is the equivalent of improvement from 20/40 to 20/32. Side effects were limited.
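The eye-chart equivalence quoted above can be checked with standard logMAR arithmetic, in which each chart line corresponds to a 0.1 logMAR step; this is general optometry arithmetic rather than a calculation taken from the paper.

# Back-of-the-envelope check of the "one extra line" claim using logMAR arithmetic
# (one chart line = 0.1 logMAR); not drawn from the study's own calculations.
import math

def logmar(denominator, numerator=20):
    # logMAR of a Snellen fraction, e.g. 20/40 -> log10(40/20) ≈ 0.30
    return math.log10(denominator / numerator)

gain = logmar(40) - logmar(32)   # 0.301 - 0.204 ≈ 0.10 logMAR
print(f"gain ≈ {gain:.2f} logMAR, i.e. about {gain / 0.1:.0f} chart line")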

The investigators noted that levodopa is unlikely to serve as a standalone treatment for patients with newly diagnosed nAMD, since 11 of the patients did require anti-VEGF injections. However, they required fewer than the standard monthly treatments, and in the second group, monthly anti-VEGF injections decreased by 52 percent.

According to Dr. Snyder, although this limited proof-of-concept study included a small sample size and limited patient diversity, its findings suggest efficacy and support targeting the GPR143 receptor with levodopa for the treatment of nAMD in future studies.

The concept had its genesis 20 years ago when Dr. Snyder began working with co-investigator Brian S. McKay, who had developed techniques to culture and examine retinal pigment epithelial cells. "We had a strong desire to make an impact in AMD, and I had a strong hunch that Dr. McKay could make a significant contribution," Dr. Snyder said. "Although this is nowhere near completed, I am happy to say, 20 years later, we have all persevered, and I believe the GPR143/levodopa story will make a significant impact on our treatment and prevention of AMD."

Credit: 
Elsevier

Addicted to the sun? Research shows it's in your genes

Sun-seeking behaviour is linked to genes involved in addiction, behavioural and personality traits and brain function, according to a study of more than 260,000 people led by King's College London researchers.

This means that sun-seeking behaviour is shaped in part by genetic predisposition, which needs to be taken into account when designing skin cancer awareness campaigns.

The researchers studied detailed health information of 2,500 twins from TwinsUK, including their sun-seeking behaviour and genetics. Identical twins in a pair were more likely to have a similar sun-seeking behaviour than non-identical twins, indicating that genetics play a key role.
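One classical way to turn that twin comparison into a number is Falconer's formula, which estimates heritability from the gap between identical (MZ) and non-identical (DZ) twin correlations. The sketch below uses invented correlations purely for illustration; the study's actual estimates and modelling approach may differ.

# Falconer's approximation: h^2 = 2 * (r_MZ - r_DZ). The correlations here are
# hypothetical placeholders, not estimates reported by the study.
def falconer_h2(r_mz: float, r_dz: float) -> float:
    return 2.0 * (r_mz - r_dz)

r_mz, r_dz = 0.60, 0.35   # made-up twin-pair correlations for a sun-seeking score
print(f"estimated heritability h^2 ≈ {falconer_h2(r_mz, r_dz):.2f}")   # ≈ 0.50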

The team then identified five key genes involved in sun-seeking behaviour from a further analysis of 260,000 participants from other cohorts. Some of these genes have been linked to behavioural traits associated with risk-taking and addiction, including smoking, cannabis and alcohol consumption and number of sexual partners.

Senior author Dr Mario Falchi from King's College London said: "Our results suggest that tackling excessive sun exposure or use of tanning beds might be more challenging than expected, as it is influenced by genetic factors. It is important for the public to be aware of this predisposition, as it could make people more mindful of their behaviour and the potential harms of excessive sun exposure."

Dr Veronique Bataille, Consultant Dermatologist involved in the research from King's College London added: "It is clear that we see individuals who have very unhealthy sun behaviour and are fully aware of it. They will continue to expose themselves excessively even if they have clear skin cancer risk factors. Our research shows that genes regulating addiction and other risky behaviour are important and may explain some of the reticence in changing behaviours in the sun."

Credit: 
King's College London