
A better kind of cybersecurity strategy

During the opening ceremonies of the 2018 Winter Olympics, held in PyeongChang, South Korea, Russian hackers launched a cyberattack that disrupted television and internet systems at the games. The incident was resolved quickly, but because Russia used North Korean IP addresses for the attack, the source of the disruption was unclear in the event's immediate aftermath.

There is a lesson in that attack, and others like it, at a time when hostilities between countries increasingly occur online. In contrast to conventional national security thinking, such skirmishes call for a new strategic outlook, according to a new paper co-authored by an MIT professor.

The core of the matter involves deterrence and retaliation. In conventional warfare, deterrence usually consists of potential retaliatory military strikes against enemies. But in cybersecurity, this is more complicated. If identifying cyberattackers is difficult, then retaliating too quickly or too often, on the basis of limited information such as the location of certain IP addresses, can be counterproductive. Indeed, it can embolden other countries to launch their own attacks, by leading them to think they will not be blamed.

"If one country becomes more aggressive, then the equilibrium response is that all countries are going to end up becoming more aggressive," says Alexander Wolitzky, an MIT economist who specializes in game theory. "If after every cyberattack my first instinct is to retaliate against Russia and China, this gives North Korea and Iran impunity to engage in cyberattacks."

But Wolitzky and his colleagues do think there is a viable new approach, involving a more judicious and well-informed use of selective retaliation.

"Imperfect attribution makes deterrence multilateral," Wolitzky says. "You have to think about everybody's incentives together. Focusing your attention on the most likely culprits could be a big mistake."

The paper, "Deterrence with Imperfect Attribution," appears in the latest issue of the American Political Science Review. In addition to Wolitzky, the authors are Sandeep Baliga, the John L. and Helen Kellogg Professor of Managerial Economics and Decision Sciences at Northwestern University's Kellogg School of Management; and Ethan Bueno de Mesquita, the Sydney Stein Professor and deputy dean of the Harris School of Public Policy at the University of Chicago.

The study is a joint project that took shape when Baliga contacted Wolitzky, whose own work applies game theory to a wide variety of situations, including war, international affairs, network behavior, labor relations, and even technology adoption.

"In some sense this is a canonical kind of question for game theorists to think about," Wolitzky says, noting that the development of game theory as an intellectual field stems from the study of nuclear deterrence during the Cold War. "We were interested in what's different about cyberdeterrence, in contrast to conventional or nuclear deterrence. And of course there are a lot of differences, but one thing that we settled on pretty early is this attribution problem." In their paper, the authors note that, as former U.S. Deputy Secretary of Defense William Lynn once put it, "Whereas a missile comes with a return address, a computer virus generally does not."

In some cases, countries are not even aware of major cyberattacks against them; Iran only belatedly realized it had been attacked by the Stuxnet worm, which over a period of years damaged centrifuges used in the country's nuclear program.

In the paper, the scholars largely examined scenarios where countries are aware of cyberattacks against them but have imperfect information about the attacks and attackers. After modeling these events extensively, the researchers determined that the multilateral nature of cybersecurity today makes it markedly different than conventional security. There is a much higher chance in multilateral conditions that retaliation can backfire, generating additional attacks from multiple sources.

"You don't necessarily want to commit to be more aggressive after every signal," Wolitzky says.

What does work, however, is simultaneously improving detection of attacks and gathering more information about the identity of the attackers, so that a country can pinpoint which other nations it could meaningfully retaliate against.

But even gathering more information to inform strategic decisions is a tricky process, as the scholars show. Detecting more attacks while being unable to identify the attackers does not clarify specific decisions, for instance. And gathering more information but having "too much certainty in attribution" can lead a country straight back into the problem of lashing out against some states, even as others are continuing to plan and commit attacks.

"The optimal doctrine in this case in some sense will commit you to retaliate more after the clearest signals, the most unambiguous signals," Wolitzky says. "If you blindly commit yourself more to retaliate after every attack, you increase the risk you're going to be retaliating after false alarms."

Wolitzky points out that the paper's model can apply to issues beyond cybersecurity. The problem of stopping pollution can have the same dynamics. If, for instance, numerous firms are polluting a river, singling just one out for punishment can embolden the others to continue.

Still, the authors do hope the paper will generate discussion in the foreign-policy community, with cyberattacks continuing to be a significant source of national security concern.

"People thought the possibility of failing to detect or attribute a cyberattack mattered, but there hadn't [necessarily] been a recognition of the multilateral implications of this," Wolitzky says. "I do think there is interest in thinking about the applications of that."

Credit: 
Massachusetts Institute of Technology

Prehistoric 'sea dragon' discovered on English Channel coast is identified as new species

image: illustration of Thalassodraco etchesi

Image: 
Megan Jacobs

A mysterious small marine reptile dating from 150 million years ago has been identified as a new species that may have been capable of diving very deeply. The well-preserved specimen was found in a Late Jurassic deep marine deposit along the English Channel coastline in Dorset, England.

The aquatic reptile has been determined to be part of the group known as ichthyosaurs, streamlined marine predators, and dates from the Late Jurassic period, according to paleontologist Megan L. Jacobs, a Baylor University doctoral candidate in geosciences and co-author of a study published in the journal PLOS ONE.

"This ichthyosaur has several differences that makes it unique enough to be its own genus and species," Jacobs said. "New Late Jurassic ichthyosaurs in the United Kingdom are extremely rare, as these creatures have been studied for 200 years. We knew it was new almost instantly, but it took about a year to make thorough comparisons with all other Late Jurassic ichthyosaurs to make certain our instincts were correct. It was very exciting to not be able to find a match."

The specimen, estimated to have been about 6 feet long, was discovered in 2009 by fossil collector Steve Etches MBE after a cliff crumbled along the seaside. He found it encased in a slab that would originally have been buried 300 feet deep in a limestone seafloor layer. The specimen has since been housed in The Etches Collection Museum of Jurassic Marine Life in Kimmeridge, Dorset. Jacobs named it Thalassodraco etchesi, meaning "Etches sea dragon," in honor of Etches.

"Now that the new sea dragon has been officially named, it's time to investigate its biology," said study co-author David Martill, Ph.D., professor of paleontology at the University of Portsmouth in Portsmouth, United Kingdom. "There are a number of things that make this animal special."

Investigating the Differences

"This animal was obviously doing something different compared to other ichthyosaurs. One idea is that it could be a deep diving species, like sperm whales," Jacobs said. "The extremely deep rib cage may have allowed for larger lungs for holding their breath for extended periods, or it may mean that the internal organs weren't crushed under the pressure. It also has incredibly large eyes, which means it could see well in low light. That could mean it was diving deep down, where there was no light, or it may have been nocturnal."

With the deep rib cage, the creature would have looked very barrel-like, she said. Given its comparatively small flippers, it may have swum with a style distinct from that of other ichthyosaurs.

The specimen's hundreds of tiny teeth would have been suited for a diet of squid and small fish, and "the teeth are unique by being completely smooth," Jacobs said. "All other ichthyosaurs have larger teeth with prominent striated ridges on them, so we knew pretty much straight away this animal was different."

Changes Through History

Ichthyosaurs originated as lizard-like creatures living on land and slowly evolved into the dolphin- and shark-like creatures found as fossils. Their limbs evolved into flippers, most of them very long or wide.

"They still had to breathe air at the surface and didn't have scales," Jacobs said. "There is hardly anything actually known about the biology of these animals. We can only make assumptions from the fossils we have, but there's nothing like it around today. Eventually, to adapt to being fully aquatic, they no longer could go up onto land to lay eggs, so they evolved into bearing live young, tail first. There have been skeletons found with babies within the mother and also ones that were actually being born."

Thalassodraco etchesi is closely related to Nannopterygius, a widespread genus of ichthyosaurs that inhabited Late Jurassic seas across Europe, Russia and the Arctic. Ichthyosaurs as a group first appeared around 248 million years ago and became extinct around 90 million years ago. The largest ichthyosaurs, found in North America, had skulls nearly 16 feet long.

Jacobs said that the new specimen likely died from old age or attack by predators, then sank to the seafloor.

"The seafloor at the time would have been incredibly soft, even soupy, which allowed it to nose-dive into the mud and be half buried," she said. "The back end didn't sink into the mud, so it was left exposed to decay and scavengers, which came along and ate the tail end. Being encased in that limestone layer allowed for exceptional preservation, including some preserved internal organs and ossified ligaments of the vertebral column."

"It's excellent that new species of ichthyosaurs are still being discovered, which shows just how diverse these incredible animals were," Martill said.

Credit: 
Baylor University

Shining a light on what's really happening in perovskite solar cells

image: Schematic structure of electrical contacts and wires of the perovskite solar cell in an ESR sample tube.

Image: 
University of Tsukuba

Tsukuba, Japan - Consumers worldwide are demanding greener energy sources; therefore, optimizing the performance and economic viability of solar cells is an important research focus. Improving the efficiency of perovskite solar cells has been a particular priority; however, less emphasis has been placed on understanding what makes the cell performance deteriorate. Now, recent findings from researchers at the University of Tsukuba provide a microscopic-level study of perovskite solar cells to address the knowledge gap.

Organic-inorganic hybrid perovskites are attractive materials for use in solar cells because they are easy and cheap to prepare and absorb light over a wide range of wavelengths. Solar cells that use perovskite layers as the photoactive material are continually being improved, with a particular focus on their power conversion efficiency (PCE), which can now exceed 25%.

However, focusing on improving PCEs alone could be causing researchers to miss the significant steps forward that might result from a more detailed understanding of the underlying mechanisms. For example, the question of what causes the performance of perovskite solar cells to deteriorate is an important one that has not been comprehensively answered.

External factors such as oxygen and moisture in the air are known to compromise perovskite layers. However, the internal changes that affect the performance of cells are not as well understood. The researchers have therefore probed the deterioration mechanism using electron spin resonance (ESR) spectroscopy.

"We carried out ESR spectroscopy on perovskite solar cells while they were in use, which gave us a real-time picture of the molecular-level changes," study corresponding author Professor Kazuhiro Marumoto explains. "Specifically, we observed the charges and defects, and related spin states, in the solar cell layers while the current-voltage characteristics of the solar cells were being measured. This allowed us to understand the relationships between these factors."

This in-depth investigation of perovskite solar cells in operation showed that changes in the spin states result from changes in hole transport as well as the formation of interfacial electric dipole layers. It was therefore concluded that cell deterioration could be prevented by improving charge mobility in the hole transport material and preventing electric dipole layer formation.

"Establishing that changes in spin states are correlated with device performance has significantly broadened our understanding of perovskite solar cells," Professor Marumoto says. "We hope that our findings will provide a valuable new starting point for the continued development of solar cells and help accelerate the reality of cost-effective green energy."

Credit: 
University of Tsukuba

Index reveals integrity issues for many of the world's forests

image: Forest integrity data mapped via the Forest Landscape Integrity Index.

Image: 
WCS

Only 40 per cent of forests are considered to have high ecological integrity, according to a new global measure, the Forest Landscape Integrity Index.

The Index was created by 47 forest and conservation experts from across the world, including Professor James Watson of The University of Queensland and the Wildlife Conservation Society.

"This extremely fine-scale analysis of the ecological integrity of the world's forests has found that only 17.4 million square kilometres of Earth's remaining forests - or 40 per cent of them - are considered to have high integrity," Professor Watson said.

"And just 27 per cent of this area is found in nationally designated protected areas.

"High integrity forests are those which contain high levels of biodiversity, provide high quality ecosystem services and are more resilient to climate change.

"Many of our remaining forests have been heavily impacted by a variety of human activities, including logging, fires, hunting, wildlife exploitation and edge effects.

"These actions damage forest integrity.

"By protecting and expanding forests with high integrity, we can help slow the impacts of climate change, preserve biodiversity, protect the rights of indigenous peoples and local communities and prevent future pandemics."

Professor Watson said the index was a result of rapid advances in remote sensing, big data and cloud computing.

"The use of this index is critical in allowing us to locate Earth's remaining intact forests and ensure that they are better protected but also hold nations to account for how they treat their forests," he said.

"We show how critical some countries are, including Canada, Brazil, Democratic Republic of Congo, Papua New Guinea and Australia, in sustaining the world's last large intact forests.

"The fine-scale nature of the map will also allow land managers to plan activities more effectively and to monitor change over time."

Dr Hedley Grantham, lead author of the study and WCS's Director of Conservation Planning, said the study's results were fundamental to talks at the Convention on Biological Diversity.

"The current draft of the post-2020 Global Biodiversity Framework wisely proposes targets relating to ecosystem integrity and there has been active discussion about how this can be quantified and monitored," Dr Grantham said.

"Using this index, we can now set ambitious policy goals to improve the integrity of forests globally."

Credit: 
University of Queensland

Alzheimer Europe sets out future vision of EU dementia policy

image: Dementia as a European Priority - A Policy Overview (front cover)

Image: 
Alzheimer Europe

Luxembourg, 9 December 2020 - At an online European Parliament workshop hosted by Sirpa Pietikäinen, MEP (Finland), Alzheimer Europe launched a new report "Dementia as a European Priority - A Policy Overview" which takes stock of dementia policy at an EU level and sets out recommendations for future priorities across Europe.

As the European Union is about to agree a new long-term budget and the details of the EU4Health and Horizon Europe programmes are being finalised, Alzheimer Europe reflects on the place of dementia as a political priority in Europe in recent years. This includes the different ways in which dementia policy and research have been supported by the three institutions of the EU, as well as some of the high-profile coordination and research projects which have been made possible as a result of EU funding.

In the report, Alzheimer Europe also highlights some of its key activities in campaigning for change, as well as the work it has coordinated and participated in, along with its national member associations, to raise the profile of the condition and build an evidence base to make the case for the prioritisation of dementia.

Despite the progress made and the knowledge generated, the report highlights that people living with dementia continue to face a number of challenges. These challenges, which concern wider society too, include the increase in the number of people living with dementia (estimated to double by 2050) and the societal and economic cost of dementia.

As a result, the report sets out a number of recommendations for the EU, outlining specific areas in which dementia should be prioritised across international, health, research and social policy.

Recommendations include:

Prioritising dementia research in EU Research Programmes (including Horizon Europe), providing a fair allocation of resources and funding for existing programmes and better coordination between programmes

Prioritising dementia within policies relating to chronic diseases, mental health and ageing, both at an EU and national level

Supporting Member States to work towards the implementation of the World Health Organization's Global Action Plan on Dementia 2017-2025

Recognising dementia as a disability and including dementia in disability policies.

Commenting on the publication of the report, Alzheimer Europe's Executive Director, Jean Georges, stated:

"Alzheimer Europe has worked with its members over the past three decades to ensure that dementia is a political priority at the European level. The policy landscape has changed dramatically during this time and we have seen considerable progress as both national governments and the EU have given dementia greater prominence within their health and research policies."

"However, there is much still to do. The European Union and its Member States are on the cusp of historic deals on the EU budget, a greatly expanded Health Programme and the forthcoming Horizon Europe research programme. If we are to build on the knowledge, experience and progress gained in recent years, it is vital that dementia remain a political priority at a European level across health, research and social policy."

Credit: 
Alzheimer Europe

Insecure livelihoods hindering efforts to combat anti-microbial resistance globally

image: A Thai villager studying a formulary for herbal medicine.

Image: 
Patthanan Thavethanutthanawin

Researchers led by the University of Warwick find that precarity - in employment, personal circumstances, social status - is among the biggest factors affecting whether patients use antibiotics correctly

Whereas poverty did not have an impact, precarity pushed up to 1 in 2 people into inappropriate antibiotic use, suggesting that this could be a challenge in higher income as well as in developing countries

Research demonstrates that individuals cannot always be blamed for misuse of antibiotics, and that better sustainable development policy is needed

Efforts to improve security of livelihoods globally could also help tackle anti-microbial resistance as a side-effect

Patients living in precarious circumstances are less likely to use antibiotics appropriately according to a new study from the University of Warwick, suggesting that efforts to improve conditions for those with little security in their livelihoods could have an unexpected benefit in helping to tackle antimicrobial resistance globally.

The findings add to evidence that focus should shift from influencing individuals' efforts to combat anti-microbial resistance to supporting sustainable development policy that tackles the contributing factors.

The research, published in the journal BMJ Global Health and funded by the Economic and Social Research Council (ESRC), part of UK Research and Innovation, provides comprehensive analysis of how patients accessed and used healthcare, and is the first study to quantitatively examine the relationship between precarity and antibiotic use.

Precarity refers to personal circumstances dominated by uncertainty, whether in employment, personal life or social status. Those in precarious circumstances are limited in their ability to plan ahead, and deprived of safety nets, social support, economic certainty and flexibility. It is not necessarily about being poor: although this study looked at low to middle income countries, precarity can also be a problem in higher income countries.

Using statistical analysis, the researchers examined the impact of socioeconomic factors, such as precarity, poverty and marginalisation, on patients' use of antibiotics to treat their illness. They found that patients in precarious circumstances had a chance of up to 51% of using antibiotics without advice from a medical professional or for inappropriate illnesses - compared to 17% for an average patient.
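The published figures come from the study's own regression analysis of the survey data; purely as an illustration of how such adjusted probabilities are produced, the sketch below fits a logistic regression to synthetic data. The variable names, coefficients and dataset are invented for the example and are not the study's.

```python
# Toy illustration only: fit a logistic regression on synthetic data and compare
# the predicted probability of inappropriate antibiotic use for a patient in
# precarious circumstances versus an otherwise average patient.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5000

# Hypothetical covariates: a precarity indicator, a poverty indicator and standardised age
precarity = rng.binomial(1, 0.2, n)
poverty = rng.binomial(1, 0.3, n)
age = rng.normal(0, 1, n)

# Invented data-generating process in which precarity raises the risk of misuse
logit = -1.6 + 1.7 * precarity + 0.1 * poverty + 0.05 * age
misuse = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(np.column_stack([precarity, poverty, age]))
model = sm.Logit(misuse, X).fit(disp=False)

# Predicted probabilities for two hypothetical patient profiles
average_patient = [1, 0, 0, 0]      # constant, no precarity, no poverty, mean age
precarious_patient = [1, 1, 0, 0]   # same profile, but living precariously
probs = model.predict(np.array([average_patient, precarious_patient]))
print(f"average patient: {probs[0]:.0%}, precarious patient: {probs[1]:.0%}")
```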

Lead author Dr Marco Haenssgen, from the Warwick Institute of Advanced Study and the Department of Global Sustainable Development, argues that this behaviour is understandable because individuals living in precarious circumstances are engaged in a constant balancing act.

Dr Haenssgen, Assistant Professor in Global Sustainable Development, said: "You have to balance your health, your economic life, feed your family, go to school; these are all competing priorities.

"Antibiotics have become such a staple of healthcare that they have become what some people see as a 'quick fix' solution. When people are in precarious circumstances and deprived of social support, if they can find a quick fix to keep them going they will use it, and then it could become problematic."

In many low to middle income countries, patients often have to travel long distances to access healthcare services, something they may not be able to easily arrange or afford. Even in more economically developed areas, employment can be less secure and social networks can be eroded.

Antimicrobial resistance occurs when microbes become resistant to antibiotics, threatening their effectiveness. Human antibiotic use is known to be a main driver of this process, and policies to combat anti-microbial resistance have therefore typically focused on clinical factors and promoting individual responsibilities. However, the researchers argue that the impact of socioeconomic factors such as precarity are being underestimated.

Dr Haenssgen said: "You cannot always blame the individual for misuse of antibiotics. Often, we find ourselves in living conditions that provoke problematic behaviours, which means that if you want to improve antibiotic use you need to improve those living conditions.

"If we can improve situations of precarity then we have a good start for future interventions. We have a very substantial facet of the problem that is continuously disregarded and where potentially we have a lot of gains to realise.

"Antimicrobial resistance is a massive global health problem, it can potentially overturn what global health is. Many of the past gains that we have had in infectious disease control and prevention are potentially being undone by antimicrobial resistance."

The study focused on five local communities across rural Thailand and Lao People's Democratic Republic and surveyed 2066 residents on recent illnesses they had experienced and how they sought healthcare support for them. This provided the researchers with a rich dataset of 1421 illness episodes that they could analyse to determine what healthcare patients sought, if any, whether they were able to access suitable healthcare, and when these steps occurred in the timeline of their illness.

Credit: 
University of Warwick

Blood test for Alzheimer's disease predicts future cognitive decline in healthy people

Today, a clinician can order a blood test to check a patient's cholesterol or hemoglobin A1c levels -- biomarkers that help predict an individual's risk of cardiovascular disease or diabetes, respectively. But despite decades of advances in the understanding of Alzheimer's disease (AD), a blood test for predicting its risk remains elusive. Imaging scans of the brain and lumbar punctures that collect cerebrospinal fluid can offer diagnoses, but such tests are expensive and cumbersome for patients. Two years ago, investigators at Brigham and Women's Hospital reported the development of a blood test for a fragment of the protein tau, a hallmark of AD. Now, that test for levels of N-terminal fragment of tau (NT1) has been evaluated in participants in the Harvard Aging Brain Study (HABS), a cohort of cognitively normal, older adults who are followed closely over time. In Nature Communications, the authors report that baseline NT1 levels in the blood were highly predictive of the risk of cognitive decline and AD dementia.

"Our findings indicate that measuring a tau fragment in plasma can help predict which elderly people are likely to decline and how quickly they are likely to decline," said corresponding author Dennis Selkoe, MD, co-director of the Ann Romney Center for Neurologic Diseases. "We're excited because there are currently no commercially available blood tests to predict risk of AD in still-healthy individuals. Having such a blood test allows us to better screen people for enrollment in AD prevention trials and represents progress toward diagnostic tests for AD in medical care."

Selkoe cautions that a commercial test for routine clinical care likely remains several years away. But for clinical trials that seek to evaluate preventive treatments for AD, such as the large-scale clinical trials led by co-author Reisa Sperling, MD, MMSc, director of the Center for Alzheimer Research and Treatment at the Brigham, NT1 levels could be measured before a participant enrolls in the trial, and potentially also as a longitudinal measure to assess treatment response. The test ultimately represents a far less costly and less invasive alternative to imaging and lumbar punctures.

The current study, led by first author Jasmeer Chhatwal, MD, PhD, now an attending physician and scientist in the Massachusetts General Hospital Department of Neurology, evaluated the predictive value of NT1 among 236 cognitively normal participants in HABS. Participants were on average 74 years old when they entered HABS and were followed for an average of five years. Blood samples were collected in the first year.

The research team found that higher levels of NT1 in blood samples taken at the beginning of the study were strongly associated with future clinical progression. The team divided participants into those with high, medium and low NT1 levels, finding that for the group with the highest levels, the risk of advancing to mild cognitive impairment (MCI) or AD dementia was elevated 2.4-fold. NT1 levels predicted decline across multiple areas of memory, including episodic memory -- remembering specific events or experiences such as a person's birthday or a family visit -- and also predicted how fast the participant's cognition would decline. Imaging data showed that higher baseline NT1 blood levels were associated with elevated brain levels of β-amyloid plaques and the accumulation of tangles of tau -- both classical signs of AD.
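As a rough illustration of the tertile comparison described here (using invented numbers rather than the HABS data, and a simple risk ratio rather than the study's statistical models), the calculation looks something like this:

```python
# Sketch only: splitting a toy cohort into NT1 tertiles and comparing the
# proportion progressing in the highest versus lowest group. All values invented.
import pandas as pd

df = pd.DataFrame({
    "nt1":        [1.2, 3.4, 2.1, 5.6, 4.8, 0.9, 6.2, 2.8, 3.9, 5.1, 1.5, 4.2],
    "progressed": [0,   0,   1,   1,   1,   0,   1,   0,   1,   0,   0,   0],
})

# Split participants into low / medium / high NT1 tertiles at baseline
df["tertile"] = pd.qcut(df["nt1"], 3, labels=["low", "medium", "high"])

# Proportion progressing in each tertile, and the high-vs-low risk ratio
risk = df.groupby("tertile", observed=True)["progressed"].mean()
print(risk)
print("risk ratio, highest vs lowest tertile:", risk["high"] / risk["low"])
```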

The authors note that relatively few participants in HABS progressed to AD, an important limitation of this cohort. They found that another brain protein -- known as NfL -- which has been studied by other groups, may also be associated with cognitive decline, especially among people who already show signs of cognitive deficits. NfL was a less strong predictor than NT1 in the study.

"The NT1 tau fragment may be a reflection of damage to neurons and synapses, allowing us to use blood samples to detect what is happening in a patient's brain years before they begin experiencing symptoms," said Selkoe. "This could give us an invaluable window of time in which to evaluate interventions for preventing cognitive decline and AD dementia."

Credit: 
Brigham and Women's Hospital

Hip-hop is helping tackle stigma around mental health, say Cambridge researchers

Hip-hop is one of the world's most popular musical genres. Seven of the 10 most streamed artists in the US are rappers. With almost 50 years of history, hip-hop has evolved to give rise to many sub-genres that appeal to different people in diverse ways.

With social media being a contributing factor, more than ever before, hip-hop artists are publicly acknowledging their mental health struggles, promoting anti-stigma campaigns around mental health, and encouraging people to seek professional treatment.

The Hip Hop Psych co-founders said: "Hip-hop can be a vehicle to tackle stigma around mental health and address cultural imbalances. Hip-hop connects with hard-to-reach groups, particularly men within the Black community.

"Underrepresented communities are at a higher risk of developing mental health problems, and they are more likely to experience worsened mental health outcomes. This is in part, due to socioeconomic disparities. They are also less likely to use mental health services. Stigma around mental health issues is prevalent in underserved communities and it is a significant barrier to accessing health services.

"Discrimination, bias, and a lack of cultural competence from healthcare professionals can also lead to unmet needs, late presentation of symptoms, and poorer quality of care."

In their article, the researchers highlight various hip-hop artists and songs that have helped shape the narrative around mental health.

Hip Hop Psych added: "Since the genre's conception almost 50 years ago, hip-hop's progressive narratives have increasingly spoken up about mental health and there is no denying that it is helping to tackle stigma. Hip-hop artists are speaking candidly through their art form, and it may be helping people around the world to acknowledge their own inner struggles."

Among the artists and songs highlighted in the article are:

The Message - Grandmaster Flash and the Furious Five

In the 1970s, pioneering rappers such as Grandmaster Flash and the Furious Five were 'street epidemiologists' who documented the harsh living conditions and social inequalities that could be damaging to mental health. Their 1982 hit The Message describes a world of financial hardship, deprivation, and inner city violence.

Drop The World - Lil Wayne (featuring Eminem)

The relationship between mental health and masculinity is complex, as societies sometimes promote narratives like 'strong men don't cry' or 'his emotions got the best of him', in which a man expressing his emotions is seen as a 'sign of weakness'. However, men under the age of 50 are at an increased risk of taking their own lives and are less likely to seek help when experiencing a mental health crisis. A jarring inner conflict between anger and sadness is portrayed in Lil Wayne's song 'Drop The World' (featuring Eminem).

Cleaning Out My Closet - Angel Haze

In 2012, Angel Haze remixed Eminem's song Cleanin' Out My Closet, producing a song she went on to describe as "...probably the realest song I ever recorded", a candid, explicit and disturbing song about childhood sexual abuse.

1-800-273-8255 - Logic

The rapper Logic partnered with the National Suicide Prevention Lifeline, releasing a song called "1-800-273-8255" about a suicidal hotline caller getting support. When Logic performed the song at the Grammys alongside suicide survivors, calls to the National Suicide Prevention Lifeline tripled.

Sell Out - Rico Nasty

A number of female artists have also addressed mental health through hip-hop. On her most recent album Anger Management, Rico Nasty channels expressions of rage and anger into a form of empowerment. The track Sell Out describes how her "...expression of anger is a form of rejuvenation" and how she has used her emotions to help herself and others.

Man on the Moon - Kid Cudi

In recent years, as well as opening up about their troubles, some notable rappers have helped reduce stigma by endorsing therapy. One such advocate is rapper Kid Cudi, who posted on social media about going to an inpatient mental health treatment centre, triggering a hugely positive reaction from his fans. His songs such as Man on the Moon resonate with his fans emotionally - one comment under the YouTube video for this track reads: "If you're listening to this it's probably for a reason, keep your head up guys(: everything will be okay".

4:44 - Jay Z

Another major therapy advocate is Jay Z, who in 2018 talked to CNN's Van Jones about the "ridiculousness" of stigma surrounding mental health problems and said that he would like to see therapists in schools. His album 4:44 documents his own experiences in therapy, leading to songs such as the title track where Jay Z apologises for his behaviour, trying to make amends for what he has done.

Drs Sule and Inkster set up Hip Hop Psych as a social venture in November 2011, aiming to bridge the gap between the medical community and hip-hop culture and working directly with health professionals and the public. They analyse hip-hop lyrics for mental health themes and translate medical information into an accessible form for the public, generating culturally sensitive resources.

Hip Hop Psych also perform anti-stigma events in various settings including prisons, nightclubs and African and Caribbean societies. In addition, they use this same approach to educate health professionals and academics to help broaden their awareness and appreciation of some of the different ways in which people communicate and use language in their daily lives to describe their personal experiences with mental health.

Credit: 
University of Cambridge

UL, Ireland, research finds promising treatment to protect kidney function in diabetes

A clinical trial involving researchers at the University of Limerick, Ireland, has demonstrated the potential benefits of new drugs in protecting kidney function in diabetes.

The new study has found that combining two treatments that lower uric acid concentrations in the blood reduces the leakage of albumin in the urine, one of the earliest signs of kidney damage in diabetes.

The discovery could help to prevent kidney failure in diabetes patients, the UL researchers believe.

Researchers from the University of Limerick School of Medicine and University Hospital Limerick, working with investigators from the University of California San Diego, USA and AstraZeneca, found that the combination of Verinurad and Febuxostat reduced albuminuria in the urine by 39.4% in patients with Type 2 diabetes after 12 weeks of treatment compared to placebo.

The results of this AstraZeneca-sponsored Phase 2a clinical trial were recently published in the American Journal of Kidney Diseases.

Verinurad is a novel inhibitor of the uric acid transporter (URAT1) and is currently under investigation for the treatment of hyperuricaemia and kidney disease. Febuxostat is a potent, selective xanthine oxidase inhibitor used to lower urate levels in patients with gout and hyperuricaemia.

The CITRINE clinical trial results show that the combination of drugs reduces the leaking of protein through the kidney.

"This is exciting news as leaking of protein is the earliest clinical sign of kidney damage," said Professor Austin Stack, Foundation Chair of Medicine at UL's School of Medicine and Consultant Nephrologist at University Hospital Limerick, who was lead author of the study.

"The results are very promising as they demonstrate an important reduction in albuminuria and hyperuricaemia in patients with type 2 diabetes when treated with a combination of Verinurad and Febuxostat.

"If we can intervene early on then we are more likely to prevent patients from going into kidney failure. The findings raise hope for the 350 million people with type 2 diabetes globally who are at increased risk of kidney failure," added Professor Stack, director of the National Kidney Disease Surveillance System (NKDSS).

In the multicentre randomised clinical trial, 60 patients with type 2 diabetes, albuminuria and elevated uric acid levels were randomised to receive either Verinurad 9mg plus Febuxostat 80mg, or placebo. The patients were followed up for 24 weeks.

The primary endpoint of the study was met and showed a 39% reduction in albuminuria, after 12 weeks with combined treatment of Verinurad and Febuxostat versus placebo. This effect persisted at 24 weeks with an overall 49% reduction in albuminuria. Treated patients also experienced a 57% reduction in uric acid levels at 12 weeks. Both Verinurad and Febuxostat were well tolerated by patients, according to the study.

"One of the earliest signals of kidney disease is development of albuminuria (leaking of albumin into the urine) and recent studies have shown that this can be associated with high levels of uric acid," said Professor Stack, a HRB-funded principal investigator whose work in this area has raised the profile of uric acid as a potential risk factor for kidney and heart disease.

"A key goal in protecting kidney function is the lowering of albuminuria in the urine, as patients with high levels are at risk of progressing to kidney failure. This clinical trial was designed to examine the effects of an intensive uric acid lowering strategy on albuminuria by combining Verinurad with Febuxostat in patients with type 2 diabetes mellitus with pre-existing albuminuria.

"Although these are early clinical findings, our results show that combined treatment with Verinurad and Febuxostat in patients with diabetes results in a rapid reduction in albuminuria that was sustained through week 24," Professor Stack added.

A larger clinical trial, the SAPPHIRE study, is currently underway to determine whether an intensive uric acid lowering strategy combining Verinurad with a xanthine oxidase inhibitor will slow the progression of chronic kidney disease.

"Diabetes is the greatest contributor to the 850 million globally with chronic kidney disease," explained Professor Stack.

"More than 40% of patients with diabetes are at risk for developing kidney disease and significant number of these will progress to kidney failure. Preventing kidney failure is a key goal in all healthcare systems to reduce morbidity of diabetes and improve patient outcomes," he added.

Credit: 
University of Limerick

Health care workers' COVID infections driven mainly by community exposure

Nurses only group with higher risk once community exposure considered

Largest cohort study of health care worker risk for SARS-CoV-2

Only high-flow oxygen therapy and hemodialysis linked to more antibodies to SARS-CoV-2

CHICAGO --- In a well-resourced health system with adequate PPE (personal protective equipment), health care worker risk for SARS-CoV-2 infection was more strongly driven by community exposure than patient exposure early in the pandemic, reports a new Northwestern Medicine study.

The study of 6,510 health care workers is the largest systematically collected cohort study of health care worker risk for SARS-CoV-2 in the United States.

Nurses were the only occupation group with higher risks once community exposure was accounted for, the study reports.

Participation in high-flow oxygen therapy and hemodialysis was associated with a higher likelihood of having antibodies to SARS-CoV-2, but other high-risk procedures were not associated with higher risk.

"This suggests that PPE is highly effective in acute exposures to SARS-CoV-2, but some longer exposures may still expose health care workers to increased risks for infection," said co-lead author Dr. John Wilkins, an associate professor of medicine at Northwestern University Feinberg School of Medicine and a Northwestern Medicine physician. "Fortunately, with adequate PPE and vigilant infection control policies, we can keep most providers safe while they are at work."

The paper will be published Dec. 8 in the journal Open Forum Infectious Diseases.

Data from the Centers for Disease Control and Prevention (CDC) show that health care workers accounted for 11% of the total number of reported COVID-19 cases in the U.S. As of November 2020, there had been 797 deaths among health care workers in the U.S. Identifying factors associated with SARS-CoV-2 infection in health care settings is a high priority for protecting the essential workforce that delivers care.

A total of 6,510 health care workers, including 1,794 nurses, 1,260 doctors, 904 non-patient facing administrators and 2,552 other staff members were included in the study. The academic health care system included 10 hospitals, 18 immediate care centers and 325 outpatient practices in the Chicago area and surrounding Illinois.

Co-lead author Dr. Charlesnika Evans, professor of preventive medicine at Northwestern, said hospitals may need to enhance infection control approaches for nurses and other health care workers exposed to hemodialysis and high-flow oxygen therapy.

"Continued infection prevention vigilance is needed within and outside of health care settings," Evans said.

Wilkins and Evans will continue to follow this cohort over time to understand the incidence rates of infection and their predictors.

Credit: 
Northwestern University

Out with the old, in with the new

A research breakthrough from the University of Virginia School of Engineering demonstrates a new mechanism to control temperature and extend the lifetime of electronic and photonic devices such as sensors, smart phones and transistors.

The discovery, from UVA's Experiments and Simulations in Thermal Engineering research group, challenges a fundamental assumption about heat transfer in semiconductor design. In devices, electrical contacts form at the junction of a metal and a semiconducting material. Traditionally, materials and device engineers have assumed that electron energy moves across this junction through a process called charge injection, said group leader Patrick Hopkins, professor of mechanical and aerospace engineering with courtesy appointments in materials science and engineering and physics.

Charge injection posits that with the flow of the electrical charge, electrons physically jump from the metal into the semiconductor, taking their excess heat with them. This changes the electrical composition and properties of the insulating or semiconducting materials. The cooling that goes hand-in-hand with charge injection can significantly degrade device efficiency and performance.

Hopkins' group discovered a new heat transfer path that embraces the benefits of cooling associated with charge injection without any of the drawbacks of the electrons physically moving into the semiconductor device. They call this mechanism ballistic thermal injection.

As described by Hopkins' advisee John Tomko, a Ph.D. student of materials science and engineering: "The electron gets to the bridge between its metal and the semiconductor, sees another electron across the bridge and interacts with it, transferring its heat but staying on its own side of the bridge. The semiconducting material absorbs a lot of heat, but the number of electrons remains constant."

"The ability to cool electrical contacts by keeping charge densities constant offers a new direction in electronic cooling without impacting the electrical and optical performance of the device," Hopkins said. "The ability to independently optimize optical, electrical and thermal behavior of materials and devices improves device performance and longevity."

Tomko's expertise in laser metrology--measuring energy transfer at the nanoscale--revealed ballistic thermal injection as a new path for device self-cooling. Tomko's measurement technique, more specifically optical laser spectroscopy, is an entirely new way to measure heat transfer across the metal-semiconductor interface.

"Previous methods of measurement and observation could not decompose the heat transfer mechanism separately from charge injection," Tomko said.

For their experiments, Hopkins' research team selected cadmium oxide, a transparent electricity-conducting oxide that looks like glass. Cadmium oxide was a pragmatic choice because its unique optical properties are well suited to Tomko's laser spectroscopy measurement method.

Cadmium oxide perfectly absorbs mid-infrared photons in the form of plasmons, quasiparticles composed of synchronized electrons that are an incredibly efficient way of coupling light into a material. Tomko used ballistic thermal injection to move the light wavelength at which perfect absorption occurs, essentially tuning the optical properties of cadmium oxide through injected heat.

"Our observations of tuning enable us to say definitively that heat transfer happens without swapping electrons," Tomko said.

Tomko probed the plasmons to extract information on the number of free electrons on each side of the bridge between the metal and the semiconductor. In this way, Tomko captured the measurement of electrons' placement before and after the metal was heated and cooled.

The team's discovery offers promise for infrared sensing technologies as well. Tomko's observations reveal that the optical tuning lasts as long as the cadmium oxide remains hot, keeping in mind that time is relative--a trillionth rather than a quadrillionth of a second.

Ballistic thermal injection can control plasmon absorption and therefore the optical response of non-metal materials. Such control enables highly efficient plasmon absorption at mid-infrared wavelengths. One benefit of this development is that night vision devices can be made more responsive to a sudden, intense change in heat that would otherwise leave the device temporarily blind.

"The realization of this ballistic thermal injection process across metal/cadmium oxide interfaces for ultrafast plasmonic applications opens the door for us to use this process for efficient cooling of other device-relevant material interfaces," Hopkins said.

Tomko first-authored a paper documenting these findings. Nature Nanotechnology published the team's paper, Long-lived Modulation of Plasmonic Absorption by Ballistic Thermal Injection, on November 9; the paper was also promoted in the journal editors' News and Views. The Nature Nanotechnology paper adds to a long list of publications for Tomko, who has co-authored more than 30 papers and can now claim first-authorship of two Nature Nanotechnology papers as a graduate student.

The research paper culminates a two-year, collaborative effort funded by a U.S. Army Research Office Multi-University Research Initiative. Jon-Paul Maria, professor of materials science and engineering at Penn State University, is the principal investigator for the MURI grant, which includes the University of Southern California as well as UVA. This MURI team also collaborated with Josh Caldwell, associate professor of mechanical engineering and electrical engineering at Vanderbilt University.

The team's breakthrough relied on Penn State's expertise in making the cadmium oxide samples, Vanderbilt's expertise in optical modeling, the University of Southern California's computational modeling, and UVA's expertise in energy transport, charge flow, and photonic interactions with plasmons at heterogeneous interfaces, including the development of a novel ultrafast-pump-probe laser experiment to monitor this novel ballistic thermal injection process.

Credit: 
University of Virginia School of Engineering and Applied Science

"Birthday" of the roof of the world recalibrated

image: New study reveals that elevation of the Tibetan Plateau is more recent than previously concluded

Image: 
TPE

As the roof of the world, the Third Pole centered on the Tibetan Plateau can easily be considered a permanent presence. However, it is not. The place where Mount Everest stands today was once underwater. Exactly when the Third Pole grew to its current height has been a topic of debate for years. However, a recent study published in Science Advances shows, through fossil analysis, that much of the Third Pole only grew to its modern height over the past 10 million to 20 million years, rather than 40 million years ago as previously inferred.

Using magnetostratigraphic and radiochronologic dating, the study found that low-elevation tropical fossils retrieved from the central Third Pole were deposited about 40 million years ago. However, an analysis of paleosols (fossil soils) using oxygen paleoaltimetry showed that present-day elevations were only reached between about 25.5 Ma and 21 Ma, rather than more than 35 Ma - the figure often previously used to date the age of the Tibetan Plateau.

"This means the Third Pole was still lower than 2300 m about 40 million years ago," said FANG Xiaomin, lead author of the study from Institute of Tibetan Plateau Research (ITP), Chinese Academy of Sciences (CAS). "It only grew to 3500 m and above around 26 million to 21 million years ago."

"What we found is not entirely news," observed FANG, referring to findings of the First Tibetan Plateau Expedition and Research (FTEP). That project, which dated from the 1970s and was CAS's first to focus on the Third Pole, had already suggested approximately the same period for the "birth" of the plateau, based on research involving over a thousand scientists from 18 countries. However, the FTEP finding was largely discredited and discarded over the years as later oxygen-isotope-based estimates argued that a fully elevated plateau existed at least 35 million years ago. Interestingly, FANG's study, which "reconciles the FTEP results," was actually part of the Second Tibetan Plateau Expedition and Research (STEP), a science project launched in 2018 by CAS to reassess the environment of the Third Pole given rapid climate changes over recent years.

The much debated "birthday" of the roof of the world is not just an academic issue concerning how the Third Pole uplifted over history. It also helps shape our understanding of several processes highly relevant to regional and global climate. These include continental collision and uplift geodynamic mechanisms, Asian atmospheric circulation, surface processes and biotic evolution. With this recalibrated elevation history, there is still much rethinking to do.

Credit: 
Chinese Academy of Sciences Headquarters

The impact of the pandemic on the Brazilian labor market

The pandemic has disrupted economic activity and worsened social problems in many countries. In Brazil, its impact has been especially severe. “The level of employment, defined as the number of people in work divided by the working-age population, fell below 50% in April 2020. It remained low until July when it bottomed out at 47%. This means over half the working-age population was unemployed,” said Rogério Barbosa, a professor at the State University of Rio de Janeiro’s Institute for Social and Political Studies (IESP-UERJ) and a former researcher at the Center for Metropolitan Studies (CEM), one of FAPESP’s Research, Innovation and Dissemination Centers (RIDCs).

An article entitled “The Impact of COVID-19 in Brazil: Labour Market and Social Protection Responses” by Barbosa and Ian Prates, a researcher at the Brazilian Center of Analysis and Planning (CEBRAP), a São Paulo-based think tank, is published in The Indian Journal of Labour Economics. The study was supported by FAPESP via a postdoctoral scholarship awarded to Barbosa.

“We used data for June 2020 in the study. But since then we’ve published other studies with more up-to-date statistics based on the National Household Sample Survey [PNAD, an official Brazilian government survey conducted by IBGE, the national census bureau] – both the Continuous PNAD and the COVID-19 PNAD. This dataset confirmed the forecast we made at the start of the pandemic,” Barbosa said.

“Back then, we cross-tabulated two parameters: formal or informal employment, including self-employment, and employment in essential or non-essential sectors. Based on this, we predicted that black people and women would be the most affected: blacks because they mostly work in the informal sector, and women because so many work in sectors considered non-essential. In both cases, this status derives from the historical formation of Brazil. The PNAD data confirmed this prediction. For every formal worker terminated, three informal workers lost their jobs. The non-essential sectors involving service provision by individuals were the worst affected” (read more at: agencia.fapesp.br/33354).

According to Barbosa, the level of employment in Brazil has always been in the range of 60%. It fell during the 2014 crisis, but then rebounded. The pandemic produced a drop below 50% for the first time. “A new category of unemployment appeared: ‘unemployment concealed by social distancing’. The unemployment rate, which we can call ‘open unemployment’, is typically the percentage of people who are unsuccessfully seeking employment. With the pandemic, between 17 million and 19 million people simply stopped looking, either because of the risk of contagion or because many vacancies disappeared. If we add this ‘hidden unemployment’ to ‘open unemployment’, the total number of unemployed people reached almost 30% of the workforce in July 2020,” he said.

“This percentage is the national average. The level of unemployment varied significantly between states. In some states, it exceeded 50%. When the social isolation measures were relaxed, it improved moderately, but it’s still well below the level seen before the pandemic.”

According to Barbosa, the emergency aid doled out by the federal government to low-income workers affected by COVID-19 lockdowns and similar measures that halted economic activity was effective, despite bungled implementation logistics and fraudulent claims by individuals who were not eligible. “It acted as a safety net for the poorest 30%, whose incomes had declined systematically since 2014. They were at the bottom of the income curve when it arrived. It offset some of the losses caused by the pandemic and prior losses,” he said. “However, we need to understand that while the emergency aid reduced the monetary indicators of poverty, it didn’t affect poverty as such, which encompasses other dimensions besides the strictly monetary aspect, such as housing conditions, for example. When it ended, the poorest remained as poor as they were before.”

The Emergency Program to Maintain Employment and Income benefited large employers, who avoided the expense of terminations, he explained. It also acted as a relief for workers in formal employment, who kept their jobs, albeit with lower earnings; income held steady only for those on the minimum wage. “But small and medium enterprises, which account for most jobs in Brazil, weren’t covered by the program. When the economy reopens, there will be a shortage of vacancies because many SMEs have had to close their doors for good,” Barbosa said.

Working from home has proved effective in the developed countries of Europe, but in Brazil, WFH benefits only 10% of the workforce, mainly white-collar workers with a relatively high standard of living. The vast majority have a university degree and do managerial or administrative work. Most jobs open to workers with little formal schooling cannot be done remotely. Even those workers who can work from home often lack the technical resources to do so, such as broadband internet access. “Before the pandemic, Brazil was more or less at the same level as the rich countries in terms of remote working. Now we’re way behind,” Barbosa said.

Credit: 
Fundação de Amparo à Pesquisa do Estado de São Paulo

Within a hair's breadth: forensic identification of single dyed hair strand now possible

image: An overview of the analytical techniques used in this study, which together make it possible to distinguish between two single strands of colored hair

Image: 
Shinsuke Kunimura from Tokyo University of Science

In crime scene investigations, a single strand of hair can make a huge difference in the evolution of a case or trial. In most cases, forensic scientists must look for clues hidden in minuscule amounts of substances or materials found at crime scenes. If a fallen strand of hair with root cells attached is found, a DNA test can reveal the identity of a criminal; unfortunately, this seldom happens. Even though other types of DNA analysis can be conducted using the "mitochondrial DNA" embedded in the hair shaft itself, such tests are not sufficient to reliably identify a person and usually call for additional evidence.

But what if a bit of fashion consciousness could inspire a new forensic technique? In a recent study published in Analytical Sciences, scientists at the Tokyo University of Science, Japan, developed a strategy for identifying criminals from a single strand of hair, leveraging the fact that hair dyes are becoming increasingly common. Their approach involves finding out if two individual strands of hair belong to the same person based on the composition of hair dye products found on them. To do this, they employed two well-known analytical methods: surface-enhanced Raman spectroscopy (SERS) and X-ray fluorescence (XRF) analysis.

Raman spectroscopy is an analytical technique based on Raman scattering, a physical phenomenon in which photons exchange energy with the molecules they strike. SERS is a special type of Raman spectroscopy that provides a "structural fingerprint" of a material even when very few molecules are present in the target sample. XRF analysis, on the other hand, involves irradiating a material with X-rays and examining the energies of the photons re-emitted as electrons in the sample relax from their excited states. XRF analysis is especially useful for determining which metallic elements are present in a material.

The scientists conducted SERS and XRF analyses using portable devices to see if they could distinguish between single strands of hog hairs dyed with different products. Associate Professor Shinsuke Kunimura, who led the study, explains why both analytical methods had to be used in combination, "SERS can easily detect the overall differences in composition between different types of hair dyes, such as permanent, semi-permanent, or natural dyes. However, it is not enough to distinguish between hair coloring products that contain or produce similar dyes. To do this, we also relied on XRF analysis, which can detect the presence of metallic elements used in the ingredients of hair dye products." Using both techniques, the scientists were able to easily distinguish between five different dyes applied to individual strands of hog hair.
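The logic of combining the two measurements can be illustrated with a toy sketch. This is not the authors' analysis pipeline: the data structure, matching heuristic, peak positions, and element lists below are all hypothetical, and serve only to show why a SERS fingerprint alone may fail to separate products that contain similar dyes while the XRF-detected elements resolve the ambiguity.

```python
# Toy illustration (not the authors' analysis pipeline): represent each strand by a
# SERS "fingerprint" (prominent Raman peak positions, cm^-1) plus the metallic
# elements detected by XRF, and call two strands a match only if both agree.
# All sample values below are hypothetical.
from dataclasses import dataclass

@dataclass(frozen=True)
class HairMeasurement:
    sers_peaks: frozenset     # prominent SERS peak positions (cm^-1), hypothetical
    xrf_elements: frozenset   # metallic elements detected by XRF, hypothetical

def same_dye_product(a: HairMeasurement, b: HairMeasurement,
                     min_peak_overlap: float = 0.8) -> bool:
    """Heuristic: SERS peak sets must largely overlap AND XRF elements must match."""
    union = a.sers_peaks | b.sers_peaks
    overlap = len(a.sers_peaks & b.sers_peaks) / len(union) if union else 1.0
    return overlap >= min_peak_overlap and a.xrf_elements == b.xrf_elements

# Two products that produce similar dyes (similar SERS peaks) but contain
# different metallic ingredients: here it is XRF that tells them apart.
strand_1 = HairMeasurement(frozenset({1350, 1590}), frozenset({"Ti", "Fe"}))
strand_2 = HairMeasurement(frozenset({1350, 1590}), frozenset({"Bi"}))
print(same_dye_product(strand_1, strand_2))  # False
```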

Because both analytical methods used are almost non-destructive, the strategy proposed in this study could be used to quickly analyze hairs found at crime scenes on-site, before they are sent for DNA analysis. "Our approach provides supportive information for more reliably identifying whose hair was found at a crime scene," remarks first author Momona Horiguchi. "This could help us clarify if someone is a criminal, meaning that our methodology could greatly contribute to forensic investigations."

Overall, this study showcases how analytical tools normally used in chemistry and materials science can be creatively adapted to vastly different fields, such as forensic investigations. Hopefully, in the future, it will prevent criminals from escaping by a hair's breadth!

Credit: 
Tokyo University of Science

DeepLabCut-Live! real-time marker-less motion capture for animals

image: DeepLabCut-Live! is a new package for real-time animal pose estimation. It allows researchers to design new experiments to give real-time feedback to animals (whether that's your pet getting treats, or mice performing decision making tasks).

Image: 
Kane et al, eLife 2020

Gollum in "The Lord of the Rings", Thanos in the "Avengers", Snoke in "Star Wars", the Na'vi in "Avatar": we have all experienced the wonders of motion-capture, a cinema technique that tracks an actor's movements and "translates" them into computer animation to create a moving, emoting - and maybe one day Oscar-winning - digital character.

What many might not realize is that motion capture isn't limited to the big screen; it extends into science. Behavioral scientists have been developing and using similar tools to study and analyze the posture and movement of animals under a variety of conditions. Traditional motion-capture approaches, however, require the subject to wear a suit fitted with markers that let the computer "know" where each part of the body is in three-dimensional space. That might be okay for a professional actor, but animals tend to resist dressing up.

To solve the problem, scientists have begun combining motion capture with deep learning, a method that lets a computer essentially teach itself how to perform a task, e.g., recognizing specific "key points" in videos. The idea is to teach the computer to track and even predict the movements or posture of an animal without the need for motion-capture markers.

But to be of meaningful use to behavioral science, "marker-less" tracking tools must also allow scientists to quickly, literally in real time, control or stimulate the neural activity of the animal. This is particularly important in experiments that try to work out which part of the nervous system underlies a specific movement or posture.

DeepLabCut: deep-learning, marker-less posture tracking

One of the scientists spearheading the marker-less approach is Mackenzie Mathis, who recently joined EPFL's School of Life Sciences from Harvard. Mathis' lab has been developing a deep-learning software toolbox named DeepLabCut that can track and identify animal movements in real time, directly from video. Now, in a paper published in eLife, Mathis and her Harvard postdoctoral fellow Gary Kane present a new version named DeepLabCut-Live! (DLC-Live!), which offers low-latency tracking (within 15 msec at over 100 FPS), includes a module that forward-predicts posture to deliver effectively zero-latency feedback, and can be integrated into other software packages.
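In practice, the accompanying dlclive Python package exposes a small API for running an exported DeepLabCut model on incoming frames. The following is a minimal sketch assuming an exported model directory and a standard OpenCV camera loop; the model path and camera index are placeholders, and OpenCV is used here only to grab frames.

```python
# Minimal sketch of running DLC-Live! on live camera frames.
# The exported-model path and camera index are placeholders; adapt them to your setup.
import cv2                                     # OpenCV, used here only to grab frames
from dlclive import DLCLive, Processor

dlc_proc = Processor()                         # default (pass-through) processor
dlc_live = DLCLive("/path/to/exported_model", processor=dlc_proc)

cap = cv2.VideoCapture(0)                      # placeholder camera index
ok, frame = cap.read()
dlc_live.init_inference(frame)                 # initialize the network on a first frame

while True:
    ok, frame = cap.read()
    if not ok:
        break
    pose = dlc_live.get_pose(frame)            # estimated keypoints for this frame
    # ... use `pose` for logging or closed-loop feedback ...

cap.release()
```

The processor argument is where experiment-specific feedback logic plugs in, as illustrated further below.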

DeepLabCut was originally developed in order to study and analyze the way animals adapt their posture in response to changes in their environment. "We're interested in how neural circuits control behavior and in particular in how animals adapt to quick changes in their environment," says Mathis.

"For example, you pour coffee in a mug and when it's full it has a particular weight. But, as you drink it the weight changes, yet you don't need to actively think about changing your grip force or how much you have to lift your arm to reach your mouth. This is a very natural thing we do and we can adapt to these changes very quickly. But this actually involves a massive amount of interconnected neurocircuitry, from the cortex all the way to the spinal cord."

DLC-Live! is a new update to a state-of-the-art "animal pose estimation package" that uses tailored networks to predict the posture of animals from video frames and, offline, allows for up to 2,500 FPS on a standard GPU. Its high-throughput analysis makes it invaluable for studying and probing the neural mechanisms of behavior. Now, with this new package, its low latency allows researchers to give animals feedback in real time and test the behavioral functions of specific neural circuits. And, more importantly, it can interface with hardware used in posture studies to provide feedback to the animals.

"This is important for things in our own research program where you want to be able to manipulate the behavior of an animal," says Mathis. "For example, in one behavioral study we do, we train a mouse to play a video game in the lab, and we want to shut down particular neurons or brain circuits in a really specific time window, i.e., trigger a laser to do optogenetics or trigger an external reward."

"We wanted to make DLC-Live! super user-friendly and make it work for any species in any setting," she adds. "It's really modular and can be used in a lot of different contexts; the person who's running the experiments can set up kind of the conditions and what they want to trigger quite easily with our graphical user interface. And we've also built in the ability to use it with other common neuroscience platforms." Two of those commonly used platforms are Bonsai and Autopilot, and in the paper, Mathis and her colleagues who developed those software packages show how DLC-Live! can easily work with them.

"It's economical, it's scalable, and we hope it's a technical advance that allows even more questions to be asked about how the brain controls behavior," says Mathis.

Credit: 
Ecole Polytechnique Fédérale de Lausanne