Body

Penn Engineering's blinking eye-on-a-chip used for disease modeling and drug testing

image: The Huh lab's eye-on-a-chip attached to a motorized, gelatin-based eyelid. Blinking spreads tears over the corneal surface, and so was a critical aspect to replicate in the researchers' model of dry eye disease. The cells of the cornea grow on the inner circle of scaffolding, dyed yellow, and the cells of the conjunctiva grow on the surrounding red circle. Artificial tears are supplied by a tear duct, dyed blue.

Image: 
University of Pennsylvania

People who spend eight or more hours a day staring at a computer screen may notice their eyes becoming tired or dry, and, if those conditions are severe enough, they may eventually develop dry eye disease (DED). DED is a common disease with shockingly few FDA-approved drug options, partially because of the difficulties of modeling the complex pathophysiology in human eyes. Enter the blinking eye-on-a-chip: an artificial human eye replica constructed in the laboratory of Penn Engineering researchers.

This eye-on-a-chip, complete with a blinking eyelid, is helping scientists and drug developers to improve their understanding and treatment of DED, among other potential uses. The research, published in Nature Medicine, outlines the accuracy of the eye-on-a-chip as an organ stand-in and demonstrates its utility as a drug testing platform.

The study was led by Dan Huh, associate professor in the Department of Bioengineering, and graduate student Jeongyun Seo.

They collaborated with Vivian Lee, Vatinee Bunya and Mina Massaro-Giordano from the Department of Ophthalmology in Penn's Perelman School of Medicine, as well as with Vivek Shenoy, Eduardo D. Glandt President's Distinguished Professor in Penn Engineering's Department of Materials Science and Engineering. Other collaborators included Woo Byun, Andrei Georgescu and Yoon-suk Yi, members of Huh's lab, and Farid Alisafaei, a member of Shenoy's lab.

Huh's lab specializes in creating organs-on-a-chip that provide microengineered in vitro platforms to mimic their in vivo counterparts, including lung and bone marrow proxies launched into space this May to study astronaut illness. The lab has spent years fine-tuning its eye-on-a-chip, which earned them the 2018 Lush Prize for its promise in animal-free testing of drugs, chemicals, and cosmetics.

In this study, Huh and Seo focused on engineering an eye model that could imitate a healthy eye and an eye with DED, allowing them to test an experimental drug without risk of human harm.

To construct their eye-on-a-chip, Huh's team starts with a porous scaffold engineered with 3D printing, about the size of a dime and the shape of a contact lens, on which they grow human eye cells. The cells of the cornea grow on the inner circle of scaffolding, dyed yellow, and the cells of the conjunctiva, the specialized tissue covering the white part of human eyes, grow on the surrounding red circle. A slab of gelatin acts as the eyelid, mechanically sliding over the eye at the same rate as human blinking. Fed by a tear duct, dyed blue, the eyelid spreads artificial tear secretions over the eye to form what is called a tear film.

"From an engineering standpoint, we found it interesting to think about the possibility of mimicking the dynamic environment of a blinking human eye. Blinking serves to spread tears and generate a thin film that keeps the ocular surface hydrated. It also helps form a smooth refractive surface for light transmission. This was a key feature of the ocular surface that we wanted to recapitulate in our device," says Huh.

For people with DED, that tear film evaporates faster than it's replenished, resulting in inflammation and irritation. A common cause of DED is the reduced blinking that occurs during excessive computer usage, but people can develop the disease for other reasons as well. DED affects about 14 percent of the world's population, yet developing new treatments for it has proven notably difficult: there have been 200 failed clinical drug trials since 2010, and only two FDA-approved drugs are currently available for treatment.

Huh's lab has been considering the drug-testing potential of organs-on-a-chip since their initial conceptualization, and, because DED plays out on the easily accessible ocular surface, it seemed the perfect place to start putting their eye model to the test. But before they started a drug trial, the team had to ensure their model could really imitate the conditions of DED.

"Initially, we thought modeling DED would be as simple as just keeping the culture environment dry. But as it turns out, it's an incredibly complex multifactorial disease with a variety of sub-types," Huh says. "Regardless of type, however, there are two core mechanisms that underlie the development and progression of DED. First, as water evaporates from the tear film, salt concentration increases dramatically, resulting in hyperosmolarity of tears. And second, with increased tear evaporation, the tear film becomes thinner more rapidly and often ruptures prematurely, which is referred to as tear film instability. The question was: Is our model capable of modeling these core mechanisms of dry eye?"

The answer, after much experimentation, was yes. The team evoked DED conditions in their eye-on-a-chip by cutting their device's artificial blinking in half and carefully creating an enclosed environment that simulated the humidity of real-life conditions. When put to the test against real human eyes, both healthy and with DED, the corresponding eye-on-a-chip models proved their similarity to the actual organ on multiple clinical measures. The eyes-on-a-chip mimicked actual eyes' performance in a Schirmer strip, which tests liquid production; in an osmolarity test, which looks at tear film salt content; and in a keratography test, which evaluates the time it takes for a tear film to break up.

Having confirmed their eye-on-a-chip's ability to mirror the performance of a human eye in normal and DED-inducing settings, Huh's team turned to the pharmaceutical industry to find a promising DED drug candidate to test-drive their model. They landed on an upcoming drug based on lubricin, a protein primarily found in the lubricating fluid that protects joints.

"When people think of DED, they normally treat it as a chronic disease driven by inflammation," says Huh, "but there's now increasing evidence suggesting that mechanical forces are important for understanding the pathophysiology of DED. As the tear film becomes thinner and more unstable, friction between the eyelids and the ocular surface increases, and this can damage the epithelial surface and also trigger adverse biological responses such as inflammation. Based on these observations, there is emerging interest in developing ophthalmic lubricants as a topical treatment for dry eye. In our study, we used an lubricin-based drug that is currently undergoing clinical trials. When we tested this drug in our device, we were able to demonstrate its friction-lowering effects, but, more importantly, using this model we discovered its previously unknown capacity to suppress inflammation of the ocular surface."

By comparing the testing results of their models of a healthy eye, an eye with DED, and an eye with DED plus lubricin, Huh and Seo were able to further scientists' understanding of how lubricin works and show the drug's promise as a DED treatment.

Similarly, the process of building a blinking eye-on-a-chip pushed forward scientists' understanding of the eye itself, providing insights into the role of mechanics in biology. Collaborating with Shenoy, director of the Center for Engineering MechanoBiology, the team's attention was drawn to how the physical blinking action was affecting the cells they cultivated to engineer an artificial eye on top of their scaffolding.

"Initially, the corneal cells start off as a single layer, but they become stratified and form multiple layers as a result of differentiation, which happens when these cells are cultured at the air-liquid interface. They also form tight cell-cell junctions and express a set of markers during differentiation," Huh says. "Interestingly, we found out that mechanical forces due to blinking actually help the cells differentiate more rapidly and more efficiently. When the corneal cells were cultured under air in the presence of blinking, the rate and extent of differentiation increased significantly in comparison to static models without blinking. Based on this result, we speculate that blink-induced physiological forces may contribute to differentiation and maintenance of the cornea."

In other words, human cornea cells growing on the scientists' scaffold more quickly became specialized and efficient at their particular jobs when the artificial eyelid was blinking on top of them, suggesting that mechanical forces like blinking contribute significantly to how cells function. These types of conceptual advances, coupled with drug discovery applications, highlight the multifaceted value that engineered organs-on-a-chip can contribute to science.

Huh and Seo's eye-on-a-chip is still just dipping its toes into the field of drug testing, but this first step is a victory that represents years of work refining their artificial eye to reach this level of accuracy and utility.

"Although we have just demonstrated proof-of-concept," says Seo, "I hope our eye-on-a-chip platform is further advanced and used for a variety of applications besides drug screening, such as testing of contact lenses and eye surgeries in the future."

"We are particularly proud of the fact that our work offers a great and rare example of interdisciplinary efforts encompassing a broad spectrum of research activities from design and fabrication of novel bioengineering systems to in vitro modeling of complex human disease to drug testing," says Huh. "I think this is what makes our study unique and representative of innovation that can be brought about by organ-on-a-chip technology."

Credit: 
University of Pennsylvania

Researchers create first-ever personalised sound projector with £10 webcam

image: The current version of Sussex's acoustic projector. The speaker is contained in the back, together with the tracking camera and one of the acoustic lenses. The part in white is the second acoustic lens in the telescope.

Image: 
University of Sussex

A University of Sussex research team has demonstrated, at a high-profile tech and media conference in LA, the first sound projector that can track a moving individual and deliver an acoustic message as they move.

Dr Gianluca Memoli and his colleagues demonstrated what they believe to be the world's first sound projector with an autozoom objective in a talk at the 46th International Conference and Exhibition on Computer Graphics & Interactive Techniques (SIGGRAPH 2019) this week.

Dr Memoli, Lecturer in Novel Interfaces and Interactions at the University of Sussex's School of Engineering and Informatics who led the research, said: "By designing acoustic materials at a scale smaller than the wavelength of the sound to create thin acoustic lenses, the sky is the limit in new potential acoustic applications.

"Centuries of optical design can now be applied to acoustics. We believe this technology can be harnessed for plenty of positive applications including personalised alarm messages in a crowd, immersive experiences without headphones, the audio equivalent of special effects."

The system uses in-house face-tracking software to pilot an Arduino-controlled acoustic telescope that focuses sound on a moving target.

The low-cost camera tracks a person and commands the distance between the two acoustic lenses, delivering a sphere of sound around 6 cm in diameter in front of the target that follows the individual's movement.
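To make the setup concrete, here is a minimal sketch of how a cheap webcam could steer such a system, assuming OpenCV's bundled Haar-cascade face detector and a pyserial link to the Arduino. The calibration constants, serial port and "SEP" command protocol are illustrative assumptions, not the Sussex team's actual code.

# Hypothetical sketch: webcam face tracking driving an Arduino-controlled lens spacing.
import cv2
import serial

FOCAL_PX = 600.0       # assumed webcam focal length, in pixels
FACE_WIDTH_CM = 15.0   # assumed average face width used for ranging

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
arduino = serial.Serial("/dev/ttyACM0", 9600, timeout=0.1)  # assumed serial port
cam = cv2.VideoCapture(0)  # the low-cost webcam

while True:
    ok, frame = cam.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces):
        x, y, w, h = max(faces, key=lambda f: f[2] * f[3])    # largest detected face
        distance_cm = FOCAL_PX * FACE_WIDTH_CM / w            # pinhole-camera range estimate
        separation_mm = 0.5 * distance_cm                     # placeholder lens calibration
        arduino.write(f"SEP {separation_mm:.1f}\n".encode())  # command the telescope motor
    if cv2.waitKey(1) == 27:  # Esc to quit
        break

cam.release()
arduino.close()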

Joshua Kybett, the second-year undergraduate at Sussex who designed the tracking, adds: "Since acoustic lenses can be 3D-printed for only £100, we wanted a tracking technique that worked on a similarly low budget. With a £10 webcam, this is one tenth of the cost of standard tracking systems.

"In addition, our method has been designed to require user consent in order to function. This requirement ensures the technology cannot be used intrusively, nor deliver sound to an unwilling audience."

Thomas Graham, the research fellow in the School of Engineering and Informatics who ran the measurements and the simulations, says: "In our study, we were inspired by autozoom cameras that extend their objectives to match the distance of a target.

"We used a very similar system, with even the same mechanical sound of the motor. I believe our work is also the first step towards hand-held, low-cost acoustic cameras."

The research team are now working to expand the capabilities of the system beyond tracking in just one direction and over one octave, so that it can be scaled up to cover most speech and basic melodies and, eventually, deliver a full piece of music.

Arash PourYazdan, who designed the electronics, said: "SIGGRAPH is a place where emerging and futuristic ideas are discussed. This is the conference where entertainment giants such as Disney, Marvel and Microsoft meet to share their visions: it was the perfect place for us to demonstrate how we think sound might be managed in the future."

Credit: 
University of Sussex

Multiple genes affect risk of asthma, hay fever and eczema

In a new study from SciLifeLab at Uppsala University, researchers have found a total of 141 regions (genes) in our genetic material that largely explain the genetic risk underlying asthma, hay fever and eczema. As many as 41 of the genes identified have not previously been linked to an elevated risk for these diseases. The results are published in the scientific journal Human Molecular Genetics.

The risk of developing asthma, hay fever or eczema is affected by genes, environment and lifestyle factors. Many patients diagnosed with one of these diseases also develop the other two at some stage in life. Although previous studies have found many genes that exert an effect on these diseases, research has so far been unable to explain the full genetic background of asthma, hay fever and eczema.

In this study, which is the largest of its kind to date, researchers analysed self-reported data from 350,000 participants in Britain's UK Biobank. Millions of gene positions were tested for their effect on people's risk of being diagnosed with asthma, hay fever and/or eczema. The 41 new genetic findings were also tested in an independent group of 110,000 individuals, clients of the American company 23andMe. This testing verified that most of these new genetic variants affect an individual's risk of developing disease.

Every 23andMe participant, or client, has paid personally to send in a saliva sample, which the company uses to analyse the person's DNA. The participants then receive information about whether they carry various inherited genetic traits that may elevate their risk of a number of diseases. Researchers can apply to obtain results from 23andMe's analyses of clients' DNA in order to find additional genetic variants that affect the disease(s) under study (in this case asthma, hay fever and/or eczema). The researchers can never access any given individual's results, nor can they link their findings with specific individuals (the data are deidentified).
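For readers curious about the mechanics of such association testing, the sketch below shows its basic unit: a logistic regression of case/control status on the genotype at a single variant (0, 1 or 2 copies of the risk allele). The data are simulated and the effect size is an assumption; the actual study tests millions of variants and adjusts for covariates such as age, sex and ancestry.

# Toy single-variant association test; simulated data, not the study's analysis.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5000
genotype = rng.binomial(2, 0.3, n)                     # simulated allele counts per person
logit_p = -1.0 + 0.15 * genotype                       # assumed modest per-allele effect
disease = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))  # simulated case/control status

X = sm.add_constant(genotype.astype(float))
fit = sm.Logit(disease, X).fit(disp=0)
print(f"odds ratio per allele = {np.exp(fit.params[1]):.2f}, p = {fit.pvalues[1]:.2e}")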

"For those interested in taking part in similar studies where they can get information about their own genetic inheritance, we'd like to point out that the results you can read from DNA in similar studies relate only to people's disease risk, which doesn't correspond to a diagnosis. External factors also affect our risk for these complex traits, and an elevated risk doesn't mean we're going to develop the disease," says Weronica Ek, researcher at the Department of Immunology, Genetics and Pathology at Uppsala University, who headed the study.

The study showed that a large number of the genes identified entail a raised risk for all three diseases. This, in turn, shows that the elevated risk of suffering from allergy when asthma is diagnosed, or the elevated risk of asthma when allergy is diagnosed, seems to be largely due to genetic factors. The study was also able to identify several genes that boost the risk of one of these diseases in relation to the others, which demonstrates that a number of more disease-specific effects also exist.

All three diseases arise through a complex association among several genes and also with environmental and lifestyle factors. To be able to improve the patients' everyday lives, it is important to develop drugs that are adapted to individual patients' genetic risks, and also to understand how our environment and lifestyle can prevent disease and improve symptoms of disease.

"The results from this study are helping us to reach a greater understanding of why certain individuals are at higher risk of developing asthma and allergies, and we hope the results will be put to use both in clinical diagnostics and in drug development," Ek says.

Credit: 
Uppsala University

'Spin' found in over half of clinical trial abstracts published in top psychiatry journals

'Spin'--exaggerating the clinical significance of a particular treatment without the statistics to back it up--is apparent in more than half of clinical trial abstracts published in top psychology and psychiatry journals, finds a review of relevant research in BMJ Evidence-Based Medicine.

The findings raise concerns about the potential impact this might be having on treatment decisions, as the evidence to date suggests that abstract information alone is capable of changing doctors' minds, warn the study authors.

Randomised controlled trials serve as the gold standard of evidence, and as such, can have a major impact on clinical care. But although researchers are encouraged to report their findings comprehensively, in practice they are free to interpret the results as they wish.

In an abstract, which is supposed to summarise the entire study, researchers may be rather selective with the information they choose to highlight, so misrepresenting or 'spinning' the findings.

To find out how common spin might be in abstracts, the study authors trawled the research database PubMed for randomised controlled trials of psychiatric and behavioural treatments published between 2012 and 2017 in six top psychology and psychiatry journals.

They reviewed only those trials (116) in which the primary results had not been statistically significant, and used a previously published definition of spin to see how often researchers had 'spun' their findings.

They found evidence of spin in the abstracts of more than half (65; 56%) of the published trials. This included titles (2%), results sections (21%), and conclusion sections (49%).

In 17 trials (15%), spin was identified in both the results and conclusion sections of the abstract.

Spin was more common in trials that compared a particular drug/behavioural approach with a dummy (placebo) intervention or usual care.

Industry funding was not associated with a greater likelihood of spinning the findings: only 10 of the 65 clinical trials in which spin was evident had some level of industry funding.

The study authors accept that their findings may not be widely applicable to clinical trials published in all psychiatry and psychology journals, and despite the use of objective criteria to define spin, inevitably, their assessments would have been subjective.

Nevertheless, they point out: "Researchers have an ethical obligation to honestly and clearly report the results of their research. Adding spin to the abstract of an article may mislead physicians who are attempting to draw conclusions about a treatment for patients. Most physicians read only the article abstract the majority of the time."

They add: "Those who write clinical trial manuscripts know that they have a limited amount of time and space in which to capture the attention of the reader. Positive results are more likely to be published, and many manuscript authors have turned to questionable reporting practices in order to beautify their results."

Credit: 
BMJ Group

Pregnancy problems may lead to later cardiac trouble in adult children

A new study in Cardiovascular Research finds that female offspring of females with polycystic ovary syndrome have an increased risk for developing cardiac dysfunction.

Polycystic ovary syndrome is the most common reproductive disorder, affecting one in 10 women of childbearing age. Women with the disorder may have infrequent or prolonged menstrual periods or excess male hormone (androgen) levels, making pregnancy more difficult. The disorder is also associated with type 2 diabetes, depression, high blood pressure, and uterine cancer.

Although the disorder has a strong genetic component, considerable evidence suggests that the syndrome may originate from an adverse in utero environment marked by maternal androgen excess.

Researchers here tested the hypothesis that elevated maternal dihydrotestosterone during late pregnancy may cause cardiac dysfunction in adult female offspring.

Researchers conducted three experiments to assess the effects of prenatal exposure to dihydrotestosterone, as well as maternal obesity, in mice. They generated prenatally androgenized female offspring by injecting pregnant dams with dihydrotestosterone during late pregnancy. To generate maternal obesity, twelve-week-old female mice were fed a high-fat/high-sucrose diet for 10 weeks prior to and during pregnancy. After birth, female offspring were separated from their mothers and assigned to a control diet. Cardiac function was measured with echocardiography in a subset of the adult female offspring, and the heart tissue was then harvested for molecular analysis.

Researchers also measured the effects of continuous dihydrotestosterone exposure from pre-puberty to adulthood on cardiac function, using a mouse model that resembles features of the human polycystic ovary condition. In this experiment, 4-week-old female mice were implanted subcutaneously with either a dihydrotestosterone-containing pellet, an empty (blank) pellet, or a pellet containing flutamide (an anti-androgen) together with the dihydrotestosterone-containing pellet. Seven weeks after pellet implantation, mice were subjected to cardiovascular assessment.

Throughout the experiments, researchers found that maternal androgen excess as well as continuous androgen exposure from pre-puberty leads to pathological cardiac hypertrophy in adult life. Moreover, they showed that the cardiac dysfunction in the adult prenatally androgenized offspring was linked to an early cardiac reprogramming. Maternal high-fat/high-sucrose feeding prior to and during gestation alone did not have an impact on the cardiac profile of the female offspring.

"Our study provides novel insight into the mechanisms that may lead to increased risk of developing cardiovascular disease in women with PCOS and their daughters. We revealed that exposure to male hormones during the critical period of fetal life is a stronger factor than maternal obesity in PCOS, which has a long-lasting impact on the cardiovascular profile of female offspring", says Stener-Victorin.

Credit: 
Oxford University Press USA

Young teens of color more likely to avoid peers with mental illness

WASHINGTON -- Students identifying as black or Latino are more likely to say they would socially distance themselves from peers with a mental illness, a key indicator of mental illness stigma, according to research published by the American Psychological Association. The findings reinforce how stigma may prevent teens who face prejudice and discrimination from seeking help for a mental health problem when they need it.

"Even as the need for mental health care among youth is rising, stigma can significantly impede access," said Melissa DuPont-Reyes, PhD, MPH, of the Latino Research Institute at the University of Texas at Austin and lead author on the study. "Our research shows that race, ethnicity and gender identities can affect how adolescents perceive mental illness in themselves and in others."

The study, published in the American Journal of Orthopsychiatry, examined how mental illness stigma varies across race, ethnicity and gender in students ages 11 to 13, a developmental period when stigmatizing attitudes and behaviors can become cemented and last into adulthood.

DuPont-Reyes and her coauthors surveyed 667 sixth graders from an urban school system in Texas on their knowledge, positive attitudes, and behaviors about mental illness, representing critical measures of mental illness stigma. The students were also asked to react to two stories of hypothetical peers diagnosed with a mental illness: Julia, living with bipolar disorder, and David, with social anxiety disorder. After each vignette, participants were asked whether they believed Julia or David was a bad person, whether his or her condition would improve with treatment, and whether they would socially interact with Julia or David, such as sitting together at lunch or working on a class project.

In general, girls and white boys appeared to have more knowledge of and positive attitudes and behaviors toward mental illness than boys and teens of color. However, assessing race, ethnicity, and gender identities together, the study revealed that black boys exhibited less knowledge of and positive attitudes toward people with mental illness, including bipolar disorder and social anxiety disorder, than white girls, and at times compared with black girls. Similar patterns were observed for Latina girls and Latino boys, particularly regarding the David vignette (social anxiety). Finally, black boys less often believed that David could improve with treatment compared to boys of other races and ethnicities.

While most young teens of color said they were less likely to socially interact with peers exhibiting mental illness, black and Latino boys also reported greater feelings of discomfort and intention to avoid people with mental illness than white girls.

Black girls demonstrated mental health knowledge and awareness similar to white girls, at least in this sample and at this age, according to DuPont-Reyes, but Latina girls were significantly more likely to avoid the David character compared with girls of other races and ethnicities.

"We found differences in mental illness knowledge and attitudes among boys and members of racial and ethnic minority groups, as anti-stigma efforts reach these populations less often," she said. "These differences in early life, prior to the onset of most common and major mental illness, can contribute to the disparities in mental health service utilization and recovery by people of color."

These findings suggest that boys of color, as well as Latina girls, may particularly benefit from targeted, tailored, anti-stigma interventions, according to DuPont-Reyes.

"The racial, ethnic and gender patterns we find in mental illness stigma mirror previous findings among adults, indicating that mental illness stigma crystalizes early in life and persists into adulthood," DuPont-Reyes said. "Understanding how members of racial and ethnic minority groups differ in their views of mental illness and how gender affects these perceptions can help us better grasp how stigma impedes use of mental health services in underserved populations."

Credit: 
American Psychological Association

Scientists can now manipulate brain cells using smartphone

image: The device, using Lego-like replaceable drug cartridges and powerful Bluetooth Low Energy, can target specific neurons of interest using drugs and light for prolonged periods.

Image: 
Korea Advanced Institute of Science and Technology

A team of scientists in Korea and the United States have invented a device that can control neural circuits using a tiny brain implant controlled by a smartphone.

Researchers, publishing in Nature Biomedical Engineering, believe the device can speed up efforts to understand brain diseases such as Parkinson's, Alzheimer's, addiction, depression, and pain.

The device, using Lego-like replaceable drug cartridges and powerful Bluetooth Low Energy, can target specific neurons of interest using drugs and light for prolonged periods.

"The wireless neural device enables chronic chemical and optical neuromodulation that has never been achieved before," said lead author Raza Qazi, a researcher with the Korea Advanced Institute of Science and Technology (KAIST) and University of Colorado Boulder.

Qazi said this technology significantly overshadows conventional methods used by neuroscientists, which usually involve rigid metal tubes and optical fibers to deliver drugs and light. Apart from limiting the subject's movement because of the physical connections to bulky equipment, their relatively rigid structure causes lesions in soft brain tissue over time, making them unsuitable for long-term implantation. Though some efforts have been made to partly mitigate the adverse tissue response by incorporating soft probes and wireless platforms, previous solutions were limited by their inability to deliver drugs for long periods of time, as well as by their bulky and complex control setups.

To achieve chronic wireless drug delivery, scientists had to solve the critical challenge of exhaustion and evaporation of drugs. Researchers from the Korea Advanced Institute of Science and Technology and the University of Washington in Seattle collaborated to invent a neural device with a replaceable drug cartridge, which could allow neuroscientists to study the same brain circuits for several months without worrying about running out of drugs.

These 'plug-n-play' drug cartridges were assembled into a brain implant for mice with a soft and ultrathin probe (thickness of a human hair), which consisted of microfluidic channels and tiny LEDs (smaller than a grain of salt), for unlimited drug doses and light delivery.

Using an elegant and simple user interface on a smartphone, neuroscientists can easily trigger any specific combination or precise sequence of light and drug deliveries in any implanted target animal without needing to be physically present in the laboratory. Using these wireless neural devices, researchers could also easily set up fully automated animal studies in which the behaviour of one animal positively or negatively affects the behaviour of other animals through conditional triggering of light and/or drug delivery.

"This revolutionary device is the fruit of advanced electronics design and powerful micro and nanoscale engineering," said Jae-Woong Jeong, a professor of electrical engineering at KAIST. "We are interested in further developing this technology to make a brain implant for clinical applications."

Michael Bruchas, a professor of anesthesiology and pain medicine and pharmacology at the University of Washington School of Medicine, said this technology will help researchers in many ways.

"It allows us to better dissect the neural circuit basis of behaviour, and how specific neuromodulators in the brain tune behaviour in various ways," he said. "We are also eager to use the device for complex pharmacological studies, which could help us develop new therapeutics for pain, addiction, and emotional disorders."

The researchers in the Jeong group at KAIST develop soft electronics for wearable and implantable devices, and the neuroscientists in the Bruchas lab at the University of Washington study brain circuits that control stress, depression, addiction, pain and other neuropsychiatric disorders. This global collaborative effort between engineers and neuroscientists, spanning three years and tens of design iterations, led to the successful validation of this powerful brain implant in freely moving mice, which the researchers believe can truly speed up efforts to understand the brain and its diseases.

Credit: 
University of Washington School of Medicine/UW Medicine

Long-term declines in heart disease and stroke deaths are stalling, research finds

Heart disease and stroke mortality rates have almost stopped declining in many high-income countries, including Australia, and are even increasing in some countries, according to new research.

University of Melbourne researchers have analysed trends in cardiovascular disease mortality - which consists of mainly heart disease and stroke - in 23 high-income countries since the year 2000.

Researchers found cardiovascular disease mortality rates for people aged 35 to 74-years-old are now barely declining, or are increasing, in 12 of the 23 countries.

In the USA and among Canadian females, cardiovascular disease mortality rates increased in the most recent year of data, while in Australia, the United Kingdom and New Zealand, annual declines in deaths from cardiovascular disease are now only 20 to 50 per cent of what they were in the 2000s.
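As a rough illustration of how such comparisons are made, the toy calculation below derives an average annual percentage decline from a series of age-standardised mortality rates and compares two periods. The rates are invented for illustration only; they are not the study's data.

# Toy comparison of annual mortality declines across two periods (invented rates).
rates_2000s = [300, 285, 271, 257, 244, 232, 220, 209, 199, 189]  # deaths per 100,000
rates_2010s = [189, 186, 183, 181, 179, 177, 175, 174, 173, 172]

def annual_decline(rates):
    """Average annual percentage decline implied by the first and last rate."""
    years = len(rates) - 1
    return (1 - (rates[-1] / rates[0]) ** (1 / years)) * 100

d_then, d_now = annual_decline(rates_2000s), annual_decline(rates_2010s)
print(f"2000s: {d_then:.1f}%/yr, recent: {d_now:.1f}%/yr, ratio = {d_now / d_then:.0%}")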

University of Melbourne expert on the global burden of disease Alan Lopez said research suggests that obesity, or at least poor diet, may have been a significant contributor to the slowdown in the decline of cardiovascular disease deaths.

"Each of these countries have very high levels of obesity. In Australia, close to one-third of adults are obese," Professor Lopez said.

"These increases in obesity levels mean that a significant portion of the population has been exposed to the cardiovascular disease risks associated with being overweight for several decades."

However, obesity is only one of many risk factors for cardiovascular disease mortality - others include smoking, high blood pressure, high cholesterol and diabetes.

Researchers found obesity levels are low in Italy and France, where the slowdown in the decline of cardiovascular disease mortality in recent years is among the most notable of all the countries studied.

University of Melbourne researcher and co-author Tim Adair said the research shows that the effect of successful public health interventions on cardiovascular disease mortality over the past 50 years is diminishing.

"In order to combat this, significant investment in preventive health measures is needed, particularly those aimed at increasing physical activity, improving diet and reducing obesity," Dr Adair said.

"Failure to address these issues could confirm the end of the long-term decline in cardiovascular disease deaths and threaten future gains in life expectancy."

Credit: 
University of Melbourne

How common is sesame allergy?

What The Study Did: This study used survey responses from nearly 79,000 individuals to estimate how common sesame allergy is in the United States.

Authors: Ruchi S. Gupta, M.D., M.P.H., of the Northwestern University Feinberg School of Medicine and the Ann & Robert H. Lurie Children's Hospital of Chicago, is the corresponding author.

(doi:10.1001/jamanetworkopen.2019.9144)

Editor's Note: The article includes conflict of interest and funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.

#  #  #

Media advisory: The full study and commentary are linked to this news release.

Embed this link to provide your readers free access to the full-text article This link will be live at the embargo time: http://jamanetwork.com/journals/jamanetworkopen/fullarticle/10.1001/jamanetworkopen.2019.9144?utm_source=For_The_Media&utm_medium=referral&utm_campaign=ftm_links&utm_term=080219

About JAMA Network Open: JAMA Network Open is the new online-only open access general medical journal from the JAMA Network. Every Wednesday and Friday, the journal publishes peer-reviewed clinical research and commentary in more than 40 medical and health subject areas. Every article is free online from the day of publication.

Credit: 
JAMA Network

Mayo Clinic study shows AI could enable accurate screening for atrial fibrillation

ROCHESTER, Minnesota -- A new Mayo Clinic research study shows that artificial intelligence (AI) can detect the signs of an irregular heart rhythm -- atrial fibrillation (AF) -- in an electrocardiogram (EKG), even if the heart is in normal rhythm at the time of a test. In other words, the AI-enabled EKG can detect recent atrial fibrillation that occurred without symptoms or that is impending, potentially improving treatment options. This research could improve the efficiency of the EKG, a noninvasive and widely available method of heart disease screening. The findings and an accompanying commentary are published in The Lancet.

While common, atrial fibrillation often is fleeting. Therefore, it is challenging to diagnose. Atrial fibrillation may not occur during a standard 10-second, 12-lead EKG, and people are often unaware of its presence. Prolonged monitoring methods, such as a loop recorder, require a procedure and are expensive.

Accuracy and timeliness are important in making an atrial fibrillation diagnosis. Left undetected, atrial fibrillation can cause stroke, heart failure and other cardiovascular disease. Knowing that a patient has atrial fibrillation helps direct treatment with blood thinners, notes Paul Friedman, M.D., chair of the Department of Cardiovascular Medicine at Mayo Clinic. Dr. Friedman, who is a cardiac electrophysiologist, is the study's senior author.

"When people come in with a stroke, we really want to know if they had AF in the days before the stroke, because it guides the treatment," says Dr. Friedman. "Blood thinners are very effective for preventing another stroke in people with AF. But for those without AF, using blood thinners increases the risk of bleeding without substantial benefit. That's important knowledge. We want to know if a patient has AF."

Using approximately 450,000 EKGs of the over 7 million EKGs in the Mayo Clinic digital data vault, researchers trained AI to identify subtle differences in a normal EKG that would indicate changes in heart structure caused by atrial fibrillation. These changes are not detectable without the use of AI.

Researchers then tested the AI on normal-rhythm EKGs from a group of 36,280 patients, of whom 3,051 were known to have atrial fibrillation. The AI-enabled EKG correctly identified the subtle patterns of atrial fibrillation with 90% accuracy.
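As a purely illustrative sketch of the general approach, the snippet below trains a small one-dimensional convolutional network to map a normal-rhythm EKG trace to a probability that an AF signature is present. The architecture, signal dimensions and labels are hypothetical stand-ins, not Mayo Clinic's model or data.

# Hypothetical 1-D CNN sketch for flagging an AF signature in a normal-rhythm EKG.
import torch
import torch.nn as nn

class EkgNet(nn.Module):
    def __init__(self, n_leads=12):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_leads, 32, kernel_size=7, stride=2), nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.classifier = nn.Linear(64, 1)

    def forward(self, x):                  # x: (batch, leads, samples)
        z = self.features(x).squeeze(-1)
        return self.classifier(z)          # logit for "AF signature present"

model = EkgNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()
ekgs = torch.randn(16, 12, 5000)               # fake 10-second, 12-lead traces
labels = torch.randint(0, 2, (16, 1)).float()  # fake "recent/impending AF" labels
for _ in range(5):                             # tiny training loop to show the mechanics
    optimizer.zero_grad()
    loss = loss_fn(model(ekgs), labels)
    loss.backward()
    optimizer.step()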

Dr. Friedman says that he is surprised by the findings of this research. If proven out, he says, AI-guided EKGs could direct the right treatment for disease caused by atrial fibrillation, even without symptoms. Moreover, this analysis could be performed using a smartphone or watch, making it readily available on a large scale.

"An EKG will always show the heart's electrical activity at the time of the test, but this is like looking at the ocean now and being able to tell that there were big waves yesterday," says Dr. Friedman. "AI can provide powerful information about the invisible electrical signals that our bodies give off with each heartbeat -- signals that have been hidden in plain sight."

"Rather than finding the needle in the haystack by prolonged monitoring, authors basically suggest that AI will be able to judge by looking at the haystack if it has a needle hidden in it," notes Jeroen Hendriks, Ph.D., of the University of Adelaide in Australia, who wrote the study's commentary with Larissa Fabritz, M.D., of the University of Birmingham in the U.K.

Credit: 
Mayo Clinic

Study finds genetic testing motivates behavior changes in families at risk for melanoma

image: Lisa G. Aspinwall, Ph.D., Huntsman Cancer Institute at the University of Utah; Sancy Leachman, M.D., Ph.D., Knight Cancer Institute at Oregon Health and Science University; Tammy K. Stump, Ph.D., Northwestern University

Image: 
Huntsman Cancer Institute; Northwestern University; Oregon Health and Science University

SALT LAKE CITY -- Skin cancer is the most commonly diagnosed cancer in the United States, and melanoma is the most severe type of skin cancer. The National Cancer Institute estimates more than 96,000 new cases will be diagnosed this year, and the disease will cause more than 7,000 deaths. Utah has a particularly high melanoma rate. A new study led by researchers at Huntsman Cancer Institute (HCI) at the University of Utah (U of U) and collaborators at Northwestern University (NW) and Oregon Health and Science University (OHSU) investigated whether genetic testing would motivate people at risk of developing melanoma to alter their behavior in order to reduce their risk. The study was published today in Genetics in Medicine.

"We are trying to understand whether a genetic test result adds value over and above what can be achieved by patient counseling alone," said study co-author Lisa G. Aspinwall, PhD, HCI researcher and professor of psychology at the U of U. "A genetic test result is concrete and highly personalized. We thought this would be more motivating for difficult behavior change than information about risk based on family history alone."

The Utah Behavior, Risk Information, Genealogy, and Health Trial (BRIGHT) study focused on families with a high risk of melanoma. Individuals enrolling in the study had three or more family members diagnosed with melanoma. Participants between the ages of 16-70 were recruited from melanoma-prone families of two types: families with a known cancer-causing gene called CDKN2A and families with comparably high rates of melanoma but no identified CDKN2A mutation. Researchers at the U of U previously discovered that individuals who carry an inherited mutation in the CDKN2A gene are rare, but those individuals have a risk of up to 76 percent of developing melanoma in their lifetime. Co-author Sancy Leachman, MD, PhD, director of the melanoma research program at Knight Cancer Institute and professor of dermatology at OHSU, explains, "All melanoma has a strong genetic component, but individuals with a strong family history are at extremely high risk. They are ideal candidates for prevention and early detection measures. Making a few relatively simple changes could save their lives."

Each participant received individual counseling from a licensed genetic counselor at HCI. These sessions consisted of a review of family medical history and education about melanoma risk factors, including exposure to environmental ultraviolet radiation (UVR) and genetic predisposition. Participants also received pretest counseling and basic information about melanoma and genetic testing. Members of families known to carry the CDKN2A mutation then were assessed through clinical genetic testing, while subjects from families with no known CDKN2A mutations received information about risk based on family history alone. All participants received identical recommendations for reducing sun exposure.

The multidisciplinary BRIGHT research team included genetic counselors, psychologists, a dermatologist, photobiologists, and an atmospheric scientist. The team examined changes in sun exposure following genetic counseling and test reporting. They used objective measures to track participants, including a special wristwatch-like device to measure UVR and a laser that measures skin color to assess the degree of tanning.

The BRIGHT study results showed genetic counseling about highly elevated melanoma risk, both with and without test reporting, led to sustained reductions in UVR exposure. Additionally, the results provide evidence of a unique benefit to participants who received genetic testing--those who learned they carry the CDKN2A mutation showed reduced exposure to daily UVR the month following genetic counseling, and they showed lighter skin pigmentation one year later.

"The results support the use of melanoma genetic testing to motivate people to adopt risk-reducing behaviors," Aspinwall concluded. "Previously, it was thought genetic testing wouldn't matter because members of high-risk families already knew about their risk and were already being advised to reduce their sun exposure. Our study shows that genetic testing, paired with counseling about familial risk and its management, can be a useful tool to motivate cancer prevention behavior."

Co-author Tammy K. Stump, PhD, NW researcher in the department of preventive medicine, added, "We are especially confident in these results because we were able to use state-of-the-art objective measures of sun exposure. It's not simply that those with the CDKN2A mutation wished to limit sun exposure--information about personal risk resulted in significant reductions in the levels of sun exposure during the year following genetic counseling."

Researchers say their next goal is to better understand how knowledge of having a high risk for cancer leads to health-promoting changes in behavior rather than leading to debilitating responses. They also seek to improve education so the public understands a suntan is a sign of skin and DNA damage that can lead to melanoma.

Findings from this study and future work will help guide the development of counseling techniques, including optimal follow-up and integration into a patient's routine healthcare program. Dr. Stump noted, "Genetic testing is becoming more common in healthcare, but we still don't know a lot about how and when to deliver this information in a manner that promotes healthy behavior changes. This study is an example of the type of research needed to guide decisions about genetic test disclosure in clinical settings."

Credit: 
Huntsman Cancer Institute

Greening devastates the citrus industry -- new research offers a solution

image: The appearance of HLB symptoms in 5-month-old cultivar Valencia seedlings fed on by the Asian citrus psyllid for 2 to 20 days. HLB symptoms were monitored at 30, 60, and 90 days postinoculation (dpi). Representative images of HLB symptom development are shown.

Image: 
Sheo Shankar Pandey and Nian Wang

St. Paul, MN (August 2019)--Citrus Huanglongbing (HLB), also known as greening, is one of the most serious citrus plant diseases in the world. Infected trees produce bitter fruits that are green, misshapen, and unsuitable for sale. Once a tree is infected, there is no cure and it typically dies within a few years. Greening has already devastated the Florida citrus industry and poses a threat to California and Texas as well as Australia and the Mediterranean region.

Currently, the most effective ways to prevent the spread of HLB are to stop the causal agent (Candidatus Liberibacter asiaticus) using quarantine measures, control the insect that spreads the disease (the Asian citrus psyllid), remove diseased trees, and plant HLB-free trees. To this end, early diagnosis of HLB-diseased trees is crucial. Traditionally, diagnosis relies on observing blotchy mottle symptoms and confirming the pathogen's presence using molecular tools. However, these symptoms do not appear until months after transmission, by which point the disease has likely already spread throughout the grove.

Professor Nian Wang and his postdoctoral research associate Dr. Sheo Shankar Pandey, both from the Citrus Research and Education Center, Department of Microbiology and Cell Science, at the University of Florida's Institute of Food and Agricultural Sciences, developed a strategy for early diagnosis of HLB before the appearance of blotchy mottle symptoms. They used a low-cost staining method to identify insect feeding sites and tested those sites for the causal agent using quantitative real-time PCR (polymerase chain reaction).

Through this method, the pair were able to detect the HLB causal agent as early as two days after transmission, long before the appearance of symptoms. This early detection will enable citrus growers to prevent the spread of HLB in their fields. The finding is especially crucial for California, Texas, Australia, and the Mediterranean region, areas that are threatened by HLB.

Credit: 
American Phytopathological Society

AI reveals new breast cancer types that respond differently to treatment

Scientists have used artificial intelligence to recognise patterns in breast cancer - and uncovered five new types of the disease each matched to different personalised treatments.

Their study applied AI and machine learning to gene sequences and molecular data from breast tumours, to reveal crucial differences among cancers that had previously been lumped into one type.

The new study, led by a team at The Institute of Cancer Research, London, found that two of the types were more likely to respond to immunotherapy than others, while one was more likely to relapse on tamoxifen.

The researchers are now developing tests for these types of breast cancer that will be used to select patients for different drugs in clinical trials, with the aim of making personalised therapy a standard part of treatment.

The researchers previously used AI in the same way to uncover five different types of bowel cancer and oncologists are now evaluating their application in clinical trials.

The aim is to apply the AI algorithm to many types of cancer - and to provide information for each about their sensitivity to treatment, likely paths of evolution and how to combat drug resistance.

The new research, published today (Friday) in the journal NPJ Breast Cancer, could not only help select treatments for women with breast cancer but also identify new drug targets.

The Institute of Cancer Research (ICR) - a charity and research institute - funded the study itself from its own charitable donations.

The majority of breast cancers develop in the inner cells that line the mammary ducts and are 'fed' by the hormones oestrogen or progesterone. These are classed as 'luminal A' tumours and often have the best cure rates.

However, patients within these groups respond very differently to standard-of-care treatments, such as tamoxifen, or new treatments - needed if patients relapse - such as immunotherapy.

The researchers applied the AI-trained computer software to a vast array of data available on the genetics, molecular and cellular make-up of primary luminal A breast tumours, along with data on patient survival.

Once trained, the AI was able to identify five different types of disease with particular patterns of response to treatment.
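As an illustration of how such subtypes can be derived, the sketch below clusters simulated tumour expression profiles into five groups with non-negative matrix factorisation, one common technique for this kind of discovery. It is a stand-in under stated assumptions, not the ICR team's actual pipeline or data.

# Illustrative subtype discovery on simulated data via NMF (not the study's pipeline).
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(1)
expression = rng.gamma(2.0, 1.0, size=(200, 500))  # 200 tumours x 500 genes, simulated

nmf = NMF(n_components=5, init="nndsvda", max_iter=500, random_state=0)
W = nmf.fit_transform(expression)   # per-tumour weights on 5 latent programmes
subtype = W.argmax(axis=1)          # assign each tumour to its dominant programme
print(np.bincount(subtype))         # how many tumours fall into each putative subtype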

Women with a cancer type labelled 'inflammatory' had immune cells present in their tumours and high levels of a protein called PD-L1 - suggesting they were likely to respond to immunotherapies.

Another group of patients had 'triple negative' tumours - which don't respond to standard hormone treatments - but various indicators suggesting they might also respond to immunotherapy.

Patients with tumours that contained a specific change in chromosome 8 had worse survival than other groups when treated with tamoxifen and tended to relapse much earlier - after an average of 42 months compared to 83 months in patients who had a different tumour type that contained lots of stem cells. These patients may benefit from an additional or new treatment to delay or prevent late relapse.

The markers identified in this new study do not challenge the overall classification of breast cancer - but they do find additional differences within the current sub-divisions of the disease, with important implications for treatment.

The use of AI to understand cancer's complexity and evolution is one of the central strategies the ICR is pursuing as part of a pioneering research programme to combat the ability of cancers to adapt and become drug resistant. The ICR is raising the final £15 million of a £75 million investment in a new Centre for Cancer Drug Discovery to house a world-first programme of 'anti-evolution' therapies.

Study leader Dr Anguraj Sadanandam, Team Leader in Systems and Precision Cancer Medicine at The Institute of Cancer Research, London, said:

"We are at the cusp of a revolution in healthcare, as we really get to grips with the possibilities AI and machine learning can open up.

"Our new study has shown that AI is able to recognise patterns in breast cancer that are beyond the limit of the human eye, and to point us to new avenues of treatment among those who have stopped responding to standard hormone therapies. AI has the capacity to be used much more widely, and we think we will be able to apply this technique across all cancers, even opening up new possibilities for treatment in cancers that are currently without successful options."

Dr Maggie Cheang, a pioneer in identifying different types of breast cancer and Team Leader of the Genomic Analysis Clinical Trials Team at The Institute of Cancer Research, London, said:

"Doctors have used the current classification of breast cancers as a guide for treatment for years, but it is quite crude and patients who seemingly have the same type of the disease often respond very differently to drugs.

"Our study has used AI algorithms to spot patterns within breast cancers that human analysis had up to now missed - and found additional types of the disease that respond in very particular ways to treatment.

"Among the exciting implications of this research is its ability to pick out women who might respond well to immunotherapy, even when the broad classification of their cancer would suggest that these treatments wouldn't work for them.

"The AI used in our study could also be used to discover new drugs for those most at risk of late relapse, beyond 5 years, which is common in oestrogen-linked breast cancers and can cause considerable anxiety for patients."

As well as ICR charity funding, the work was also supported by the NIHR Biomedical Research Centre at The Institute of Cancer Research, London, and The Royal Marsden NHS Foundation Trust.

Credit: 
Institute of Cancer Research

Model predicts cognitive decline due to Alzheimer's, up to two years out

CAMBRIDGE, MA -- A new model developed at MIT can help predict if patients at risk for Alzheimer's disease will experience clinically significant cognitive decline due to the disease, by predicting their cognition test scores up to two years in the future.

The model could be used to improve the selection of candidate drugs and participant cohorts for clinical trials, which have been notoriously unsuccessful thus far. It would also let patients know they may experience rapid cognitive decline in the coming months and years, so they and their loved ones can prepare.

Pharmaceutical firms over the past two decades have injected hundreds of billions of dollars into Alzheimer's research. Yet the field has been plagued with failure: Between 1998 and 2017, there were 146 unsuccessful attempts to develop drugs to treat or prevent the disease, according to a 2018 report from the Pharmaceutical Research and Manufacturers of America. In that time, only four new medicines were approved, and only to treat symptoms. More than 90 drug candidates are currently in development.

Studies suggest greater success in bringing drugs to market could come down to recruiting candidates who are in the disease's early stages, before symptoms are evident, which is when treatment is most effective. In a paper to be presented next week at the Machine Learning for Health Care conference, MIT Media Lab researchers describe a machine-learning model that can help clinicians zero in on that specific cohort of participants.

They first trained a "population" model on an entire dataset that included clinically significant cognitive test scores and other biometric data from Alzheimer's patients, and also healthy individuals, collected between biannual doctor's visits. From the data, the model learns patterns that can help predict how the patients will score on cognitive tests taken between visits. In new participants, a second model, personalized for each patient, continuously updates score predictions based on newly recorded data, such as information collected during the most recent visits.

Experiments indicate accurate predictions can be made looking ahead six, 12, 18, and 24 months. Clinicians could thus use the model to help select at-risk participants for clinical trials, who are likely to demonstrate rapid cognitive decline, possibly even before other clinical symptoms emerge. Treating such patients early on may help clinicians better track which antidementia medicines are and aren't working.

"Accurate prediction of cognitive decline from six to 24 months is critical to designing clinical trials," says Oggi Rudovic, a Media Lab researcher. "Being able to accurately predict future cognitive changes can reduce the number of visits the participant has to make, which can be expensive and time-consuming. Apart from helping develop a useful drug, the goal is to help reduce the costs of clinical trials to make them more affordable and done on larger scales."

Joining Rudovic on the paper are: Yuria Utsumi, an undergraduate student, and Kelly Peterson, a graduate student, both in the Department of Electrical Engineering and Computer Science; Ricardo Guerrero and Daniel Rueckert, both of Imperial College London; and Rosalind Picard, a professor of media arts and sciences and director of affective computing research in the Media Lab.

Population to personalization

For their work, the researchers leveraged the world's largest Alzheimer's disease clinical trial dataset, called Alzheimer's Disease Neuroimaging Initiative (ADNI). The dataset contains data from around 1,700 participants, with and without Alzheimer's, recorded during semiannual doctor's visits over 10 years.

Data includes their AD Assessment Scale-cognition sub-scale (ADAS-Cog13) scores, the most widely used cognitive metric for clinical trials of Alzheimer's disease drugs. The test assesses memory, language, and orientation on a scale of increasing severity up to 85 points. The dataset also includes MRI scans, demographic and genetic information, and cerebrospinal fluid measurements.

In all, the researchers trained and tested their model on a sub-cohort of 100 participants, who made more than 10 visits and had less than 85 percent missing data, each with more than 600 computable features. Of those participants, 48 were diagnosed with Alzheimer's disease. But data are sparse, with different combinations of features missing for most of the participants.

To tackle that, the researchers used the data to train a population model powered by a "nonparametric" probability framework, called Gaussian Processes (GPs), which has flexible parameters to fit various probability distributions and to process uncertainties in data. This technique measures similarities between variables, such as patient data points, to predict a value for an unseen data point -- such as a cognitive score. The output also contains an estimate for how certain it is about the prediction. The model works robustly even when analyzing datasets with missing values or lots of noise from different data-collecting formats.
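As a minimal sketch of the core idea, assuming simulated inputs rather than the ADNI variables, the snippet below fits a Gaussian-process regressor that returns both a predicted cognitive score and an uncertainty estimate for a new visit. The kernel choice and features here are illustrative, not the paper's exact model.

# Toy Gaussian-process prediction of a cognitive score with uncertainty (simulated data).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(7)
X_train = rng.normal(size=(300, 6))                       # fake biometric features
y_train = 20 + 3 * X_train[:, 0] + rng.normal(0, 2, 300)  # fake ADAS-Cog-like scores

gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gp.fit(X_train, y_train)

X_new = rng.normal(size=(1, 6))                           # features from a new visit
mean, std = gp.predict(X_new, return_std=True)
print(f"predicted score {mean[0]:.1f} +/- {std[0]:.1f}")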

But, in evaluating the model on new patients from a held-out portion of participants, the researchers found the model's predictions weren't as accurate as they could be. So, they personalized the population model for each new patient. The system would then progressively fill in data gaps with each new patient visit and update the ADAS-Cog13 score prediction accordingly, by continuously updating the previously unknown distributions of the GPs. After about four visits, the personalized models significantly reduced the error rate in predictions. It also outperformed various traditional machine-learning approaches used for clinical data.

Learning how to learn

But the researchers found the personalized models' results were still suboptimal. To fix that, they invented a novel "metalearning" scheme that learns to automatically choose which type of model, population or personalized, works best for any given participant at any given time, depending on the data being analyzed. Metalearning has been used before for computer vision and machine translation tasks to learn new skills or adapt to new environments rapidly with a few training examples. But this is the first time it's been applied to tracking cognitive decline of Alzheimer's patients, where limited data is a main challenge, Rudovic says.

The scheme essentially simulates how the different models perform on a given task -- such as predicting an ADAS-Cog13 score -- and learns the best fit. During each visit of a new patient, the scheme assigns the appropriate model, based on the previous data. With patients with noisy, sparse data during early visits, for instance, population models make more accurate predictions. When patients start with more data or collect more through subsequent visits, however, personalized models perform better.

This helped reduce the error rate for predictions by a further 50 percent. "We couldn't find a single model or fixed combination of models that could give us the best prediction," Rudovic says. "So, we wanted to learn how to learn with this metalearning scheme. It's like a model on top of a model that acts as a selector, trained using metaknowledge to decide which model is better to deploy."
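In its simplest form, the selector can be thought of as the comparison below: before each new visit, check which model has tracked that participant's past scores more closely and use it for the next prediction. This is a deliberately simplified stand-in for the paper's metalearning scheme, with invented numbers.

# Simplified stand-in for the model selector, using invented visit data.
import numpy as np

def choose_model(past_scores, pop_preds, pers_preds):
    """Pick whichever model had the lower mean absolute error on past visits."""
    pop_err = np.mean(np.abs(np.array(past_scores) - np.array(pop_preds)))
    pers_err = np.mean(np.abs(np.array(past_scores) - np.array(pers_preds)))
    return "population" if pop_err <= pers_err else "personalized"

true_scores = [22.0, 24.5, 27.0]             # a hypothetical participant's earlier visits
population_predictions = [21.0, 23.0, 25.0]
personalized_predictions = [23.5, 24.0, 27.5]
print(choose_model(true_scores, population_predictions, personalized_predictions))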

Next, the researchers are hoping to partner with pharmaceutical firms to implement the model into real-world Alzheimer's clinical trials. Rudovic says the model can also be generalized to predict various metrics for Alzheimer's and other diseases.

Credit: 
Massachusetts Institute of Technology

Nordic researchers: A quarter of the world's population at risk of developing tuberculosis

A new study from Aarhus University Hospital and Aarhus University, Denmark, has shown that probably 1 in 4 people in the world carry the tuberculosis bacterium in their bodies. The disease tuberculosis is caused by the bacterium Mycobacterium tuberculosis, which affects more than 10 million people every year and kills up to 2 million, making it the deadliest of the infectious diseases.

In addition, many people are infected with the tuberculosis bacterium without having active disease; this is called latent tuberculosis. Until now, this number has been estimated on the basis of assumptions about how many people a patient with active tuberculosis may infect, but there has been no empirical basis for these assumptions.

Now, researchers from Denmark and Sweden have used a new method to describe the occurrence of latent tuberculosis infection. The researchers reviewed 88 scientific studies from 36 different countries and, on the basis of this epidemiological evidence, estimated the prevalence in countries where no studies are available, as well as the approximate total global prevalence.

"The study emphasizes that it will be extremely difficult to reach the goal of eliminating tuberculosis by 2035, which is the aim of the WHO. At any rate, the objective cannot be achieved without treating the large burden of latent tuberculosis, since all infected people are at risk of developing active tuberculosis disease later in life," says Christian Wejse, an infectious disease specialist at Aarhus University Hospital and associate professor at Aarhus University, Denmark.

It has previously been estimated that somewhere between one-third and one-fourth of the world's population has latent tuberculosis, but the new study, which is based on tests from 351,811 individuals, indicates that the proportion is between one-fifth and one-fourth, depending on the test method used. The study thus documents a significant occurrence of tuberculosis infection in the world today, albeit slightly lower than previously thought.
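For readers wondering how estimates from many separate surveys can be combined into a single figure, the toy example below pools a few invented prevalence estimates using inverse-variance weights on the logit scale. The real analysis pools 88 studies and models between-country differences far more carefully.

# Toy fixed-effect pooling of prevalence estimates (invented surveys, logit scale).
import numpy as np

studies = [(1200, 5000), (300, 2000), (4500, 15000)]  # (positives, tested) per survey

logits, weights = [], []
for positives, tested in studies:
    p = positives / tested
    logits.append(np.log(p / (1 - p)))
    var = 1 / positives + 1 / (tested - positives)    # variance of the logit proportion
    weights.append(1 / var)

pooled_logit = np.average(logits, weights=weights)
pooled_prevalence = 1 / (1 + np.exp(-pooled_logit))
print(f"pooled prevalence ~ {pooled_prevalence:.1%}")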

Credit: 
Aarhus University