
Cancer treatment in young women need not mean the end of their fertility

This press release is in support of a presentation given online by Dr Dalia Khalife at the 36th Annual Meeting of ESHRE.

6 July 2020: The first long-term record of how cancer patients made use of their stored eggs and embryos after cancer treatment is presented today at the 36th Annual Meeting of ESHRE. The 20-year data demonstrate how successful fertility preservation can be in these patients, especially those with breast cancer. Details of the analysis, covering the longest reported period of use, are presented today online by Dr Dalia Khalife from Guy's and St Thomas' Hospital, London, at the virtual Annual Meeting of ESHRE.

Data were analysed on 879 young female patients who were treated for a range of cancers between 2000 and 2019; all had sought counselling on preserving their fertility ahead of treatment. Treatments such as chemo- and radiotherapy are known to have adverse effects on ovarian function, often causing infertility.(1) The mean age of the patients was 33 years, and breast cancer was the most frequent diagnosis (63% of cases). After counselling, 373 patients (42%) chose to have their fertility preserved by one of the methods available: egg freezing (53%); embryo freezing (41%); both methods together (5%); or ovarian tissue cryopreservation (1%).

So far, reports Dr Khalife, the rate of return to make use of the frozen eggs and embryos is 16% (61/373); 44 of those who returned went on to have their fertilised eggs and embryos transferred in an IVF procedure, achieving a remarkably high birth rate of 71%, with a twin rate of 9%. Around two-thirds of patients returned within two years of their diagnosis; women with breast cancer were the most likely to return for fertility treatment. These patients also achieved the highest birth rates, significantly higher, for example, than those with lymphoma (70% vs 30%).

"The results are a demonstration of how fertility preservation in these cases can be effective," says Dr Khalife. "Around one in six of those who stored their gametes had a good outcome."

While the cancer treatment had variable effects on fertility, "almost all patients" did show some deterioration in their ovarian reserve levels, reflecting a range of responses from mild toxicity of treatment (minimal effect on ovarian reserve markers) to severe toxicity (premature ovarian insufficiency). There were even a number of naturally conceived pregnancies after cancer treatment.

Dr Khalife explains that the most appropriate method of fertility preservation is decided on an individual basis. "Oocyte freezing is usually offered to young women," she says, "and, with our vastly improved freezing techniques, provides a good chance of future pregnancy. Ovarian tissue cryopreservation, though still not widely available, is undertaken in selected cases where time is urgent. This technique also now provides an option for the prepubertal female, where previously none existed."

This report is based on cases referred to a tertiary referral centre for the South East Cancer Network in the UK, one of the earliest centres set up to provide a dedicated fertility preservation and long-term follow-up service. "It is our hope today that all young women diagnosed with cancer and a good prognosis are referred for fertility consultation," says Dr Khalife. "Success of such a service requires close work with our oncology colleagues, rapid access, and clear referral pathways to enable a large number of young patients to be treated.

"We do believe that a fertility preservation service must be integral to a modern cancer care pathway. Fertility preservation with eggs and embryos has been beyond experimental for some time. And it's important that clinicians across the world continue to collect and share data on long-term outcome for all methods, including ovarian tissue preservation, to provide patients with robust information."

This study found that 16% of patients have so far returned to make use of their frozen eggs and embryos, but this rate, says Dr Khalife, "will definitely increase" in time. "When fertility preservation is carried out in young women - in their teens and twenties," she explains, "they are unlikely to return for many years. Previously, there were few options for fertility in these young women - but now there are, and our data show that the results can be of great benefit."

Credit: 
European Society of Human Reproduction and Embryology

Lack of lockdown increased COVID-19 deaths in Sweden, analysis finds

Sweden's controversial decision not to lock down during COVID-19 produced more deaths and greater healthcare demand than seen in countries with earlier, more stringent interventions, a new analysis finds. But Sweden fared better than would be expected from its public-health mandates alone, performing roughly similarly to France, Italy and Spain - countries that eventually imposed more stringent measures, but only after the pandemic had taken hold there.

Sweden's unusual approach also saw fewer patients admitted to intensive-care units than expected. But the country has seen a higher percentage of deaths among older patients outside ICUs than other countries, even when ICU beds were not in short supply. That suggests health authorities there have considered patients' chances of recovery in deciding who receives access to intensive care, the researchers say.

"Our study shows that individually driven infection-control measures can have a substantial effect on national outcomes, and we see Sweden as a good example of this case," said Peter Kasson, MD, PhD, of the University of Virginia School of Medicine and Sweden's Uppsala University. "Higher levels of individual action would further suppress the infection, while a complete lack of individual action would likely have led to runaway infection, which fortunately hasn't happened."

Understanding the Impact of COVID-19

Kasson and Uppsala's Lynn Kamerlin set out to analyze the effects of the country's public-health response using population, employment and household data. They say the insights gained from their work can guide future public-health policies. In particular, the findings will help doctors understand the effects of individual compliance with infection-control measures.

The researchers conclude that Sweden's "mild" government restrictions, coupled with a population willing to voluntarily self-isolate, produced results quite similar to those seen in countries that enacted more stringent measures later in the pandemic.

Sweden's per capita death rate was 35 per 100,000 as of May 15. Meanwhile, Denmark's death rate was 9.3 per 100,000, Finland's 5.2 and Norway's 4.7. All three neighboring countries enacted stricter policies. For comparison, the United States had 24 deaths per 100,000 as of May 15. But Sweden has fared better than hard-hit countries such as the United Kingdom and Spain.

"Sweden is perhaps the most prominent example of mitigation -- limiting the extent of socially and economically disruptive interventions while still aiming to slow spread and allow for an effective medical response," the researchers wrote in a new paper outlining their findings. "Studying the effects of this strategy, which elements are key to reducing mortality and healthcare need, and how it might compare to other approaches, is thus of critical importance to the global understanding of pandemic responses."

Key Measures in Sweden

While it did not opt for full lockdown, Sweden took several steps to mitigate the spread of COVID-19. The researchers created computer models to measure the effects of these steps, including voluntary self-isolation by symptomatic people and those over 70, closing schools and other interventions. They then validated their results by comparing the models with Sweden's death rate and compared Sweden's results to other countries'.

The researchers' models anticipated that Sweden's public-health mandates would result in 40 times more patients needing ICU beds than the number of ICU beds available before the pandemic. Voluntary self-isolation reduced this to five-fold, and the country essentially doubled its number of ICU beds as the pandemic emerged.
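To make the logic of that comparison concrete, the sketch below runs a deliberately simple SEIR-style epidemic model at two contact rates, one standing in for public-health mandates alone and one for mandates plus voluntary self-isolation, and reports peak ICU demand against a hypothetical bed capacity. This is not the authors' model; every parameter, the ICU fraction and the bed count are illustrative assumptions and will not reproduce the paper's figures.

```python
# Illustrative SEIR-style sketch (NOT the authors' model): how a lower contact
# rate -- a crude stand-in for voluntary self-isolation -- changes peak ICU
# demand relative to a hypothetical pre-pandemic bed capacity.

def peak_icu_demand(beta, population=10_000_000, icu_fraction=0.003,
                    incubation_days=5.0, infectious_days=7.0, days=300):
    """Integrate a simple SEIR model and return peak simultaneous ICU demand."""
    s, e, i, r = population - 1.0, 0.0, 1.0, 0.0
    sigma, gamma = 1.0 / incubation_days, 1.0 / infectious_days
    peak_infected, dt = 0.0, 0.1
    for _ in range(int(days / dt)):
        new_exposed = beta * s * i / population * dt
        new_infectious = sigma * e * dt
        new_recovered = gamma * i * dt
        s -= new_exposed
        e += new_exposed - new_infectious
        i += new_infectious - new_recovered
        r += new_recovered
        peak_infected = max(peak_infected, i)
    return peak_infected * icu_fraction

icu_beds = 500  # hypothetical pre-pandemic capacity
for label, beta in [("mandates only", 0.40), ("plus voluntary self-isolation", 0.25)]:
    demand = peak_icu_demand(beta)
    print(f"{label}: peak ICU demand ~{demand:.0f} beds "
          f"({demand / icu_beds:.1f}x capacity)")
```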

That would still leave many patients without a bed, and yet the country's ICUs weren't overrun. That outcome - and the fact that older patients in Sweden were several times more likely to die than to be admitted to an ICU - prompted the researchers to analyze the choices that Swedish health authorities made about who would receive intensive care.

"Analyzed by categorical age group, older Swedish patients with confirmed COVID-19 were more likely to die than to be admitted to the ICU, suggesting that predicted prognosis may have been a factor in ICU admission," the researchers wrote. "This likely reduced ICU load at the cost of more high-risk patients dying outside the ICU."

"The key finding is that individual actions matter," Kasson said. "If enough individuals stay home and take precautions when in the community, it can really change the infection curve. And we can't let up now."

Credit: 
University of Virginia Health System

Fathers are more likely to be referred for nutrition or exercise counseling

audio: Lead investigator Alicia Boykin, MD, MS, discusses a new study in the Journal of Nutrition Education and Behavior that found that overweight and obese men who are fathers were more likely than men without children to be referred for nutrition or exercise counseling.

Image: 
Journal of Nutrition Education and Behavior

Philadelphia, July 6, 2020 - Fatherhood status has been linked to medical providers' weight-related practices or counseling referrals. A new study in the Journal of Nutrition Education and Behavior, published by Elsevier, found that overweight and obese men who are fathers were more likely than men without children to be referred for nutrition or exercise counseling.

Researchers from the UPMC Children's Hospital of Pittsburgh and the University of Pittsburgh School of Nursing studied 2,562 men visiting their medical provider for both routine and sick visits. This study corroborates other researchers' findings that only 20 percent to 40 percent of obese patients report receiving nutrition or weight loss counseling.

"There's more research showing that fathers play a central role in child development but also in their weight-related health outcomes," said lead study author Alicia Boykin, MD, MS, Division of Adolescent and Young Adult Medicine, UPMC Children's Hospital of Pittsburgh, Pittsburgh, PA, USA. "It's critical to address healthy diets and physical activity among men who are already fathers, but also among men who may become fathers soon in the future."

Previous research has shown that fathers' commitment to their children has increased, as evidenced by the time fathers spend on a child's care, which has doubled. Researchers have documented that fathers are more committed to weight programs that enable them to support their children (and families) and focus on child health and well-being rather than solely on their own health. "Men are willing to make positive changes during fatherhood and the results may suggest that providers are capitalizing on this time," Dr. Boykin said.

This study furthers a general understanding of weight-related practices and management during clinic visits for men, in general, and fathers, in particular.

"I think that given the link between paternal obesity and child obesity, providers have a great opportunity to positively influence family outcomes, so not just the health outcomes for their patients, but also the health outcomes for their patients' children. The next step would include understanding adult provider motivators for referring, but also understanding the type of interventions that providers refer fathers to for nutrition and exercise counseling."

Credit: 
Elsevier

New guideline: Don't routinely screen for EAC in patients with chronic GERD

A new guideline from the Canadian Task Force on Preventive Health Care, based on a rigorous systematic review of the latest evidence, found no benefit of routine screening for esophageal adenocarcinoma (EAC) and precursor conditions (Barrett esophagus and dysplasia) in patients with chronic gastroesophageal reflux disease (GERD).

The guideline, published in CMAJ (Canadian Medical Association Journal), recommends that physicians in Canada continue current practice and not screen routinely: http://www.cmaj.ca/lookup/doi/10.1503/cmaj.190814.

"Given the many needs facing the health system, it is important to use services where we know there is benefit," says Dr. Stephane Groulx, assistant clinical professor, Department of Community Health Sciences, Université de Sherbrooke and Chair of the Task Force EAC working group. "We did not find sufficient data to recommend routine screening by upper endoscopy of people with chronic GERD for EAC and precursor conditions, such as Barrett esophagus."

This recommendation does not apply to people with alarm symptoms for esophageal cancer, such as difficulty or pain swallowing, recurrent vomiting, unexplained weight loss, anemia, loss of appetite or gastrointestinal bleeding, or to those who have already been diagnosed with Barrett esophagus.

Barrett esophagus is a condition in which the normal lining of the esophagus changes to look more like the lining of the intestine. It is linked to chronic GERD and can lead to the growth of abnormal cells (dysplasia) that may progress to EAC over time far more frequently than GERD alone. It is found in 5%-20% of patients who undergo esophagogastroduodenoscopy (EGD) for chronic GERD.

Current practice in Canada does not involve organized screening programs for EAC among patients diagnosed with chronic GERD, although some family physicians do refer these patients for EGD.

"Clinicians should be aware of alarm symptoms in patients and conduct appropriate investigation, referral and management of these patients," says Dr. Scott Klarenbach, a member of the working group and professor in the Department of Medicine, University of Alberta. "Physicians who routinely refer patients without alarm symptoms for screening may want to stop, given the lack of evidence showing benefit."

EAC is the most common type of esophageal cancer in Canada and has one of the poorest survival rates among all cancers. The estimated 5-year survival rate is 15%. Unfortunately, most esophageal adenocarcinomas are diagnosed at a late stage of the disease, after alarm symptoms develop. It was hoped that early detection could save lives; unfortunately, the Task Force's rigorous review of available evidence did not identify any benefit from screening.

Although age 50 years or older, male gender, having a family history, white race, abdominal obesity and smoking are factors that may increase the risk of EAC, relevant trials and cohort studies did not provide sufficient data to recommend screening for individuals with one or more of these risk factors.

As the evidence underpinning the guideline was of low- or very-low-certainty, and because screening by endoscopy is costly and may cause harm, the Task Force calls for more research to help understand which patients with chronic GERD are most likely to develop EAC and whether screening of specific high-risk groups provides benefit that outweighs the known harms.

During the development of the guideline, the task force engaged patients to understand values and preferences around screening.

The College of Family Physicians of Canada and the Nurse Practitioner Association of Canada have endorsed the guideline. The Canadian Partnership Against Cancer has provided a statement of support for the guideline.

For the full guideline, podcast, clinician and patient FAQs, visit the EAC guideline page at http://www.canadiantaskforce.ca.

In a related commentary (http://www.cmaj.ca/lookup/doi/10.1503/cmaj.200697), Dr. Sander Veldhuyzen van Zanten, Division of Gastroenterology, Department of Medicine, University of Alberta, Edmonton, Alberta, writes, "The Task Force's strong recommendation against gastroscopy screening for patients with chronic GERD without alarm symptoms depended in part on the assumption that scarce health resources would need to be expended to implement screening."

He agrees that routine screening of patients younger than 50 who have chronic GERD is unnecessary. However, because gastroscopy is generally a safe and straightforward procedure, he suggests it may be considered in patients older than 50 who have chronic GERD and risk factors such as obesity and smoking.

The Canadian Task Force on Preventive Health Care is an independent panel of health professionals who are experts in clinical preventive health care and guideline methodology. The task force's mandate is to develop and disseminate evidence-based clinical practice guidelines for primary and preventive care.

Credit: 
Canadian Medical Association Journal

Cell 'membrane on a chip' could speed up screening of drug candidates for COVID-19

image: Researchers have developed a human cell 'membrane on a chip' that allows continuous monitoring of how drugs and infectious agents interact with our cells, and may soon be used to test potential drug candidates for COVID-19.

Image: 
Susan Daniel/Cornell University

Researchers have developed a human cell 'membrane on a chip' that allows continuous monitoring of how drugs and infectious agents interact with our cells, and may soon be used to test potential drug candidates for COVID-19.

The researchers, from the University of Cambridge, Cornell University and Stanford University, say their device could mimic any cell type--bacterial, human or even the tough cell walls of plants. Their research recently pivoted to how COVID-19 attacks human cell membranes and, more importantly, how it can be blocked.

The devices have been formed on chips while preserving the orientation and functionality of the cell membrane, and have been successfully used to monitor the activity of ion channels, a class of proteins in human cells that are the target of more than 60% of approved pharmaceuticals. The results are published in two recent papers in Langmuir and ACS Nano.

Cell membranes play a central role in biological signalling, controlling everything from pain relief to infection by a virus, acting as the gatekeeper between a cell and the outside world. The team set out to create a sensor that preserves all of the critical aspects of a cell membrane--structure, fluidity, and control over ion movement--without the time-consuming steps needed to keep a cell alive.

The device uses an electronic chip to measure any changes in an overlying membrane extracted from a cell, enabling the scientists to safely and easily understand how the cell interacts with the outside world.

The device integrates cell membranes with conducting polymer electrodes and transistors. To generate the on-chip membranes, the Cornell team first optimised a process to produce membranes from live cells and then, working with the Cambridge team, coaxed them onto polymeric electrodes in a way that preserved all of their functionality. The hydrated conducting polymers provide a more 'natural' environment for cell membranes and allow robust monitoring of membrane function.

The Stanford team optimised the polymeric electrodes for monitoring changes in the membranes. The device no longer relies on live cells, which are often technically challenging to keep alive and require significant attention, and measurements can last over an extended time period.

"Because the membranes are produced from human cells, it's like having a biopsy of that cell's surface - we have all the material that would be present including proteins and lipids, but none of the challenges of using live cells," said Dr Susan Daniel, associate professor of chemical and biomolecular engineering at Cornell and senior author of the Langmuir paper.

"This type of screening is typically done by the pharmaceutical industry with live cells, but our device provides an easier alternative," said Dr Róisín Owens from Cambridge's Department of Chemical Engineering and Biotechnology, and senior author of the ACS Nano paper. "This method is compatible with high-throughput screening and would reduce the number of false positives making it through into the R&D pipeline."

"The device can be as small as the size of a human cell and easily fabricated in arrays, which allows us to perform multiple measurements at the same time," said Dr Anna-Maria Pappa, also from Cambridge and joint first author on both papers.

To date, the aim of the research, supported by funding from the United States Defense Advanced Research Projects Agency (DARPA), has been to demonstrate how viruses such as influenza interact with cells. Now, DARPA has provided additional funding to test the device's effectiveness in screening for potential drug candidates for COVID-19 in a safe and effective way.

Given the significant risks for researchers working on SARS-CoV-2, the virus which causes COVID-19, scientists on the project will focus on making virus membranes and fusing those with the chips. The virus membranes are identical to the SARS-CoV-2 membrane but don't contain the viral nucleic acid. In this way, new drugs or antibodies that neutralise the virus spikes, which the virus uses to gain entry into the host cell, can be identified. This work is expected to get underway on 1 August.

"With this device, we are not exposed to risky working environments for combating SARS-CoV-2. The device will speed up the screening of drug candidates and provide answers to questions about how this virus works," said Dr Han-Yuan Liu, Cornell researcher and joint first author on both papers.

Future work will focus on scaling up production of the devices at Stanford and automating the integration of the membranes with the chips, leveraging the fluidics expertise from Stanford PI Juan Santiago who will join the team in August.

"This project has merged ideas and concepts from laboratories in the UK, California and New York, and shown a device that works reproducibly in all three sites. It is a great example of the power of integrating biology and materials science in addressing global problems," said Stanford lead PI Professor Alberto Salleo.

Credit: 
University of Cambridge

Injections are two-and-a-half times safer when nurses use revamped guidelines

image: Injections are two-and-a-half times safer when nurses use revamped guidelines.

Image: 
Gustavo Fring

The UK's National Health Service (NHS) is changing the way it writes its guidelines for giving injections in hospitals, following groundbreaking research from the University of Bath.

The Bath study, funded by the National Institute for Health Research (NIHR), found that hospital nurses make far fewer mistakes when administering medicines intravenously if they follow instructions written with nurses in mind. Researchers used a process called 'user testing', which identifies where mistakes are being made and introduces changes so the instructions are easier to use.

Current NHS guidelines on intravenous injections are written by pharmacists with little input from their primary audience - nurses. These instructions can be confusing or overly complicated, which contributes to 30-50% of intravenous doses being incorrect in some way.

Injection-related mistakes form a major part of the 237 million medication errors that occur annually in England, and they can arise because nurses struggle to find relevant, unambiguous information in the NHS guidelines, explained Dr Matthew Jones, who led the research from the University's Department of Pharmacy and Pharmacology in collaboration with colleagues from the Universities of Leeds and Strathclyde, and UCL School of Pharmacy.

"When nurses follow modified guidelines that present the same information in a more user-friendly way, nearly two and half times more doses are given without mistakes," he said. "As a bonus, the injection procedure is also completed faster and nurses feel more confident about their decisions."

He added: "Current instructions are usually written by pharmacists using a format and language that makes immediate sense to other pharmacists, but not necessarily to nurses.

"Different professions think about things in different ways as a result of the different training they receive, and we need injection guidelines to be written in a way that is understood by nurses because they prepare and administer most injections."

In developing the Bath study, researchers collaborated with Luto Research, a University of Leeds spin-out company with expertise in user testing. The study involved 273 nurses and midwives who regularly administer injections from four NHS hospitals.

Participants were taken aside part-way through a shift and asked to give an injection to a rubber dummy simulating a patient's arm. Around half of the participants were given the current NHS guidelines and the other half the modified guidelines. A researcher watched to identify guideline-related errors.

"The results make it clear that busy, stressed staff need information to be presented in a way that is easy to understand and quick to find," said Dr Jones. "User testing allowed us to identify where the information needed improving and how we could do that."

The study's findings have triggered a review of how injection guidelines are produced for the NHS Injectable Medicines Guide. This website-based guide provides information on the correct procedures for preparing and administering over 350 intravenous medicines in over 120 hospitals. It is accessed approximately 3 million times per year, mostly by hospital nurses.

Dr Jones said: "To improve patient safety, most injectable medicines' guidelines should be user-tested. This is particularly important when high-risk and complex decisions are being made."

According to the World Health Organisation, unsafe medication practices and medication errors are a leading cause of injury and avoidable harm in health care systems across the world. Globally, the cost associated with medication errors is estimated at US$42 billion every year.

Credit: 
University of Bath

Time to get real on the power of positive thinking -- new study

video: Dr Chris Dawson, Associate Professor in Business Economics at the University of Bath's School of Management discusses his research suggesting optimism is not the route to long term happiness

Image: 
University of Bath, UK

Positive thinking has long been extolled as the route to happiness, but it might be time to ditch the self-help books after a new study shows that realists enjoy a greater sense of long-term wellbeing than optimists.

Researchers from the University of Bath and London School of Economics and Political Science (LSE) studied people's financial expectations in life and compared them to actual outcomes over an 18-year period. They found that when it comes to the happiness stakes, overestimating outcomes was associated with lower wellbeing than setting realistic expectations.

The findings point to the benefits of making decisions based on accurate, unbiased assessments. They bring into question the 'power of positive thinking', which frames optimism as a self-fulfilling prophecy whereby believing in success delivers it, along with the immediate happiness generated by picturing a positive future.

Negative thinking should not replace positive thinking though. Pessimists also fared badly compared to realists, undermining the view that low expectations limit disappointment and present a route to contentment.

Their numbers are dwarfed though by the number of people - estimated to be 80 percent of the population - who can be classed as unrealistic optimists. These people tend to overestimate the likelihood that good things will happen and underestimate the possibility of bad things. High expectations set them up for large doses of destructive disappointment.

"Plans based on inaccurate beliefs make for poor decisions and are bound to deliver worse outcomes than would rational, realistic beliefs, leading to lower well-being for both optimists and pessimists. Particularly prone to this are decisions on employment, savings and any choice involving risk and uncertainty," explains Dr Chris Dawson, Associate Professor in Business Economics in Bath's School of Management.

"I think for many people, research that shows you don't have to spend your days striving to think positively might come as a relief. We see that being realistic about your future and making sound decisions based on evidence can bring a sense of well-being, without having to immerse yourself in relentless positivity."

The results could also be due to counteracting emotions, say the researchers. For optimists, disappointment may eventually overwhelm the anticipatory feelings of expecting the best, so happiness starts to fall. For pessimists, the constant dread of expecting the worst may overtake the positive emotions from doing better than expected.

In the context of the Covid-19 crisis, the researchers highlight that optimists and pessimists alike make decisions based on biased expectations: not only can this lead to bad decision making, but also to a failure to take suitable precautions against potential threats.

"Optimists will see themselves as less susceptible to the risk of Covid-19 than others and are therefore less likely to take appropriate precautionary measures. Pessimists, on the other hand, may be tempted to never leave their houses or send their children to school again. Neither strategy seems like a suitable recipe for well-being. Realists take measured risks based on our scientific understanding of the disease," said co-author Professor David de Meza from LSE's Department of Management.

Published in the American journal Personality and Social Psychology Bulletin, the findings are based on analysis of the British Household Panel Survey - a major UK longitudinal survey - tracking 1,600 individuals annually over 18 years.

To investigate whether optimists, pessimists or realists have the highest long-term well-being the researchers measured self-reported life satisfaction and psychological distress. Alongside this, they measured participants' finances and their tendency to have over- or under-estimated them.
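As a rough illustration of that comparison, the sketch below classifies individuals as optimists, pessimists or realists from the gap between expected and realised finances and then compares average wellbeing across the groups. The column names, thresholds and toy data are hypothetical; the published analysis is a regression on longitudinal survey data, not this simple grouping.

```python
import pandas as pd

# Hypothetical columns: one row per person-year with expected and realised
# household finances plus a life-satisfaction score (not the BHPS variable names).
df = pd.DataFrame({
    "person_id":         [1, 1, 2, 2, 3, 3],
    "expected_finance":  [3, 3, 1, 1, 2, 2],
    "realised_finance":  [2, 2, 2, 2, 2, 2],
    "life_satisfaction": [4, 5, 5, 6, 6, 7],
})

# Average forecast error per person: positive = over-estimates (optimist),
# negative = under-estimates (pessimist), near zero = realist.
per_person = df.groupby("person_id").agg(
    forecast_error=("expected_finance", "mean"),
    realised=("realised_finance", "mean"),
    wellbeing=("life_satisfaction", "mean"),
)
per_person["forecast_error"] -= per_person["realised"]

def classify(err, tol=0.25):  # tolerance band is arbitrary
    if err > tol:
        return "optimist"
    if err < -tol:
        return "pessimist"
    return "realist"

per_person["group"] = per_person["forecast_error"].apply(classify)
print(per_person.groupby("group")["wellbeing"].mean())
```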

Credit: 
University of Bath

Context reduces racial bias in hate speech detection algorithms

Understanding what makes something harmful or offensive can be hard enough for humans, never mind artificial intelligence systems.

So, perhaps it's no surprise that social media hate speech detection algorithms, designed to stop the spread of hateful speech, can actually amplify racial bias by blocking inoffensive tweets by black people or other minority group members.

In fact, one previous study showed that AI models were 1.5 times more likely to flag tweets written by African Americans as "offensive"--in other words, a false positive--compared to other tweets.

Why? Because the current automatic detection models miss out on something vital: context. Specifically, hate speech classifiers are oversensitive to group identifiers like "black," "gay," or "transgender," which are only indicators of hate speech when used in some settings.

Now, a team of USC researchers has created a hate speech classifier that is more context-sensitive, and less likely to mistake a post containing a group identifier as hate speech.

To achieve this, the researchers programmed the algorithm to consider two additional factors: the context in which the group identifier is used, and whether specific features of hate speech are also present, such as dehumanizing and insulting language.
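The snippet below is a toy illustration of that idea, not the USC model (which regularises a neural classifier using post-hoc explanations). It trains a small bag-of-words classifier on a handful of hand-written, hypothetical examples so that a group identifier on its own does not drive a "hate" prediction, whereas an identifier combined with dehumanising language does.

```python
# Toy sketch of context-sensitive hate-speech classification. NOT the USC model.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical, hand-written training examples (label 1 = hate, 0 = not hate).
texts = [
    "black families moved into the neighborhood",       # identifier, benign
    "the gay rights march was held downtown",           # identifier, benign
    "proud to be a transgender engineer",                # identifier, benign
    "those black people are vermin and should leave",    # identifier + dehumanising
    "gay people are disgusting and subhuman",             # identifier + dehumanising
    "transgender people are a plague on society",         # identifier + dehumanising
]
labels = [0, 0, 0, 1, 1, 1]

# Word unigrams and bigrams give the model a small amount of context
# around the group identifiers.
model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    LogisticRegression(max_iter=1000),
)
model.fit(texts, labels)

for post in ["black students won the science fair",
             "black students are vermin"]:
    # Probability of the "hate" class; the dehumanising post should score higher.
    print(post, "->", model.predict_proba([post])[0][1])
```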

"We want to move hate speech detection closer to being ready for real-world application," said Brendan Kennedy, a computer science PhD student and co-lead author of the study, published at ACL 2020, July 6.

"Hate speech detection models often 'break,' or generate bad predictions, when introduced to real-world data, such as social media or other online text data, because they are biased by the data on which they are trained to associate the appearance of social identifying terms with hate speech."

Additional authors of the study, titled "Contextualizing Hate Speech Classifiers with Post-Hoc Explanation," are co-lead author Xisen Ji, a USC computer science PhD student, and co-authors Aida Mostafazadeh Davani, a computer science PhD student, Xiang Ren, an assistant professor of computer science, and Morteza Dehghani, who holds joint appointments in psychology and computer science.

Why AI bias happens

Hate speech detection is part of the ongoing effort against oppressive and abusive language on social media, using complex algorithms to flag racist or violent speech faster and better than human beings alone. But machine learning models are prone to learning human-like biases from the training data that feeds these algorithms.

For instance, algorithms struggle to determine if group identifiers like "gay" or "black" are used in offensive or prejudiced ways because they're trained on imbalanced datasets with unusually high rates of hate speech (white supremacist forums, for instance). As a result, the models find it hard to generalize to real-world applications.

"It is key for models to not ignore identifiers, but to match them with the right context," said Professor Xiang Ren, an expert in natural language processing.

"If you teach a model from an imbalanced dataset, the model starts picking up weird patterns and blocking users inappropriately."

To test the systems, the researchers accessed a large, random sample of text from "Gab," a social network with a high rate of hate speech, and "Stormfront," a white supremacist website. The text had been hand-flagged by humans as prejudiced or dehumanizing.

They then measured the state-of-the-art model's tendencies, versus their own model's, towards inappropriately flagging non-hate speech, using 12,500 New York Times articles devoid of hate speech except in quotations. State-of-the-art models achieved 77% accuracy in distinguishing hate from non-hate speech; the USC model boosted this to 90%.

"This work by itself does not make hate speech detection perfect, that is a huge project that many are working on, but it makes incremental progress," said Kennedy.

"In addition to preventing social media posts by members of protected groups from being inappropriately censored, we hope our work will help ensure that hate speech detection does not do unnecessary harm by reinforcing spurious associations of prejudice and dehumanization with social groups."

Credit: 
University of Southern California

'Biologically relevant' levels of a fertility hormone are detected in human hair samples

This press release is in support of a poster presentation by Mr Sarthak Sawarkar and colleagues, presented online at the 36th Annual Meeting of ESHRE.

6 July 2020: The prospect of a non-invasive test of ovarian reserve is a little closer following results from a study showing that measurement of a fertility hormone can be accurately taken from a sample of human hair.

Anti-Mullerian hormone - or AMH - has become a key marker in the assessment of how women may respond to fertility treatment. The hormone is produced by small cells surrounding each egg as it develops in the ovary, and is thus seen as a measure of ovarian reserve. Although studies have not correlated AMH levels to a reliable chance of live birth (nor to forecasting the time of menopause), AMH measurement has become an intrinsic marker in assessing how a patient will respond to ovarian stimulation for IVF - as a normal responder, poor responder (with few eggs), or over-responder (with many eggs and a risk of ovarian hyperstimulation syndrome, OHSS).(1)

AMH is presently measured in serum taken from a blood sample drawn intravenously. The readings represent a measurement at a single, short moment in time and are relatively invasive to obtain. Now, however, a new study presented at the online Annual Meeting of ESHRE has tested the quantification of AMH from human hair and found it less invasive and a "more appropriate representation of hormone levels" than measurement from an "acute" source like serum. The results are presented this week in a poster from PhD student Sarthak Sawarkar, working in the laboratory of Professor Manel Lopez-Bejar in Barcelona, with collaborators from MedAnswers Inc in the USA.

The study, which is still ongoing, now reports results from 152 women from whom hair and blood samples were routinely collected during hospital visits. AMH measured in serum from the same subjects was used as a control, as was an ultrasound count of developing follicles in the ovary (the antral follicle count, AFC), a further measure of ovarian reserve.

"Biologically relevant" AMH levels were successfully detected in the hair samples, with levels declining with patient age, as expected. As ovarian reserve declines with age, so do AMH levels. The AMH levels from hair strongly correlated with both serum levels and AFC. It was also seen that the hair test was able to detect a wide range of AMH levels within individuals from a similar age cohort, suggesting a greater accuracy than from a single blood sample. Hormones accumulate in hair shafts over a period of months, while hormone levels in serum can change over the course of hours. "So hair," explain the authors, "is a medium that can accumulate biomarkers over several weeks, while serum is an acute matrix representing only current levels. While hormone levels in blood can fluctuate rapidly in response to stimuli, hormone levels measured in hair would represent an accumulation over several weeks. A measurement using a hair sample is more likely to reflect the average hormone levels in an individual."

Among the other advantages of a hair test, the authors note that hormone levels are assessed non-invasively, which reduces testing stress and offers a less expensive assay. Testing can be done without visiting a clinic, and thus makes this type of test available to a broader range of women. "Finally," explains Mr Sawarkar, "as hair offers a look at the long-term accumulation of hormones, this measurement may allow a better understanding of an individual's hormone levels - unlike blood-based assays, which can only measure the hormone at the moment of the testing."

AMH has so far had an important - though sometimes controversial - role in reproductive medicine. Thus, while its role as a measure of ovarian reserve in predicting response to ovarian stimulation for IVF now seems beyond question, there has been doubt over its broader application as a measure of female fertility in the general population.

Commenting on the biology of the test, Mr Sawarkar explains that hormones are incorporated into the matrix of hair before the growing hair reaches the skin surface, thereby allowing an accumulating measurement of hormone concentration.

Credit: 
European Society of Human Reproduction and Embryology

New guidelines for children and adolescents with T2D

image: A team of paediatric specialists, including an expert from the University of Adelaide, has produced new guidelines regarding assessment and management of type 2 diabetes (T2D) in Australian and New Zealand children and adolescents.

Image: 
University of Adelaide

A team of paediatric specialists, including an expert from the University of Adelaide, has produced new guidelines regarding assessment and management of type 2 diabetes (T2D) in Australian and New Zealand children and adolescents.

With the incidence of T2D on the rise among children and adolescents, especially in Indigenous communities, early assessment and management is critical.

"Our publication includes the first Australasian evidence-based recommendations for screening, assessment and management of children and adolescents with type 2 diabetes and was approved by the peak body of paediatricians who look after children with diabetes, the Australasian Paediatric Endocrine Group (APEG)," says the author Dr Alexia Peña, Senior Lecturer from the University of Adelaide's Robinson Research Institute and Paediatric Endocrinologist at the Women's and Children's Hospital.

"The obesity epidemic, particularly in Indigenous young people, has caused the increase in the incidence of T2D especially in children older than 10 years of age.

"Adolescents develop complications earlier than adults with T2D and they are more likely to require insulin within a few years of diagnosis.

"Early identification and management of the condition, which is most prevalent among Indigenous people, is therefore critical to prevent complications and maintain their long-term health.

"Up until now there have been no guidelines in Australasia for assessment and management of T2D in children and adolescents and health professionals have had to refer to adult guidelines.

"These Australasian Paediatric Endocrine Group guidelines were developed by a group of expert health care professionals from Australia and New Zealand and included paediatric and adult endocrinologists, diabetes nurse educators, dietitians, psychologists and physiotherapists."

Recommended changes contained in the new guidelines include:

Specific recommendations regarding care of Indigenous children and adolescents in Australia and New Zealand with regards to screening and management of T2D.

Tighter diabetes control for all children and adolescents.

The possibility of using newer medications, currently only approved for adults with T2D, under the guidance of a paediatric endocrinologist.

The need to transition adolescents with T2D to a diabetes multidisciplinary care team, including an adult endocrinologist, for their ongoing care.

Type 2 diabetes is a serious condition in which the body becomes resistant to the normal effects of insulin and/or gradually loses the capacity to produce enough insulin in the pancreas. It represents 85 to 90 per cent of all cases of diabetes and usually develops in middle-aged adults. Many people, and even adolescents, with T2D display none of the classic symptoms of the disease, such as lethargy, excessive thirst and the need to pass more urine. It is more likely to occur if a person is overweight, has a family history of the condition or is from a particular ethnic background.

"There needs to be increasing awareness among the public that this chronic illness can start early. Children and adolescents need to be tested if they are in high-risk groups," says Dr Peña.

"It is critical that early diagnosis is followed with culturally sensitive advice to help them manage their diabetes in a way that promotes family-centred behavioural change.

"All health care professionals need to be aware of specifics for assessment and management of children with T2D.

"In some cases, by the time T2D is diagnosed, the complications of diabetes may already be present which is why early diagnosis and assessment followed by effective management is critical."

This study was published in the Medical Journal of Australia.

Credit: 
University of Adelaide

Pioneering brain haemorrhage treatment reduces long-term disability in premature babies

image: This is an image of Isaac Walker-Cox in hospital during the DRIFT treatment.

Image: 
North Bristol NHS Trust

Premature babies with serious brain haemorrhage who were treated with a 'brain washing' technique pioneered by Bristol researchers were, a 10-year follow-up study has shown, twice as likely to survive without severe learning disability as infants given standard treatment. The findings are published today [5 July] in the journal Archives of Disease in Childhood.

The surgical technique, called 'Drainage, Irrigation and Fibrinolytic Therapy' (DRIFT), is the first and only treatment to objectively benefit infants with serious brain haemorrhage, known as intraventricular haemorrhage (IVH), which can lead to severe learning impairment and cerebral palsy.

Pioneered in 1998 and trialled from 2003 by Andrew Whitelaw, Professor of Neonatal Medicine at the University of Bristol, and Ian Pople, Consultant Neurosurgeon at University Hospitals Bristol NHS Foundation Trust, the therapy aimed to reduce disability in premature babies with serious brain haemorrhage by washing out the ventricles in the brain to remove toxic fluid and reduce pressure.

In this NIHR-funded ten-year follow-up study (DRIFT10), researchers assessed 52 of the 65 survivors from the original DRIFT cohort of 77 premature babies with severe brain haemorrhage who had been recruited for the randomised controlled trial. Of the original cohort, 39 babies had received the DRIFT intervention and 38 standard treatment, which uses lumbar punctures to control expansion of the ventricles and reduce pressure.

A research team led by Dr Karen Luyt from Bristol Medical School traced and assessed the children, at age ten and at school, to investigate whether the treatment had led to reduced rates of neurodisability.

Using results from cognitive, vision, movement and behaviour assessments, parent /guardian interviews, and educational attainment scores, the team found that the pre-term babies who received DRIFT were almost twice as likely to survive without severe cognitive disability than those who had received standard treatment.

They also found that infants given the DRIFT treatment were more likely to attend mainstream education.

Dr Luyt, the DRIFT10 study's lead author and Reader in Neonatal Medicine at Bristol Medical School, said: "Bleeding in the brain is one of the most serious complications of preterm birth, and premature babies are particularly at risk. The condition can cause significant brain injury, leading to subsequent severe learning disabilities.

"While a two-year follow-up study showed reduced rates of severe cognitive disability, it was important for us to assess whether the DRIFT intervention had longer-term benefits.

"The results of this study clearly demonstrate that secondary severe brain injury is reduced in preterm infants by using this neonatal intervention, and importantly, this is sustained into middle school-age.

"We hope that these results will be used to inform UK and international healthcare guidelines and support implementation of DRIFT as a clinical service to help improve outcomes for vulnerable babies.

"We would also like to thank the families and the children who took part in the study for their support and significant contribution that has helped advance our treatment of this condition."

Dr William van't Hoff, Chief Executive Officer of the NIHR's Clinical Research Network (CRN), said: "These landmark results provide the first long-term evidence that this novel intervention can help reduce cognitive disability in infants born with serious brain haemorrhage.

"Central to this trial has been patient and public involvement - one of the NIHR's key values - with patients and their parents involved throughout the trial design and process, which has led to its success."

A series of short films explaining the findings is available on the NIHR website.

Credit: 
University of Bristol

How reliable are the reconstructions and models for past temperature changes?

Understanding climate changes over the past millennia is crucial for attributing the current warming and accurately predicting future climate change. The proxy-based reconstructions and model simulations that offer insights into past temperature changes are, however, subject to large uncertainties. Large-scale climate reconstructions are affected by non-climate signals in individual proxy records and by differences in seasonality or temporal resolution between proxy records. Model simulations are affected by uncertainties in the forcing reconstructions themselves and by the lack of some important feedback mechanisms in climate models. Nearly 20 proxy-based reconstruction datasets and 10 model simulation datasets have been published over the past three decades; because of the large uncertainties in them, significant differences between reconstructions, and between reconstructions and simulations, frequently occur. This uncertainty makes it difficult to form a clear picture of past climate changes, yet a detailed evaluation of the uncertainties in reconstructions and model simulations is still rare.

The recently published paper in SCIENCE CHINA Earth Sciences, "Evaluation of multidecadal and longer-term temperature changes since 850 CE based on Northern Hemisphere proxy-based reconstructions and model simulations", was jointly written by Dr. Wang Jianglin, Prof. Yang Bao, Dr. Fang Miao and Dr. Liu Jingjing of the Northwest Institute of Eco-Environment and Resources, Chinese Academy of Sciences (CAS), Prof. Zheng Jingyun and Dr. Zhang Xuezhen of the Institute of Geographic Sciences and Natural Resources Research, CAS, Dr. Wang Zhiyuan of Zhejiang Normal University, and Dr. Shi Feng of the Institute of Geology and Geophysics, CAS. The researchers evaluated the uncertainties in the 18 published reconstructions and 6 model simulations by assessing the covariance, climate sensitivity and amplitude of the temperature changes in them.

The results show that uncertainty generally increases back in time, as the covariances between different reconstructions, and between reconstructions and simulations, steadily decline going back in time; the uncertainty becomes particularly large during Medieval times. The results also show that model simulations exhibit a shorter recovery (i.e., lag) in response to the cooling caused by volcanic eruptions and solar activity minima, and a smaller amplitude of multi-centennial temperature change, than the proxy-based reconstructions.
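One way to see how agreement between reconstructions declines back in time is to compute correlations over sliding windows, as in the sketch below. The two series are synthetic stand-ins whose noise grows toward the past; they are not the published reconstructions, and the window length is arbitrary.

```python
# Sketch of one evaluation step: track how well two temperature reconstructions
# agree as you move back in time, using correlations over sliding windows.
# Both series are synthetic stand-ins, not the published reconstructions.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(850, 2001)
signal = 0.3 * np.sin((years - 850) / 180.0)           # shared low-frequency "climate"
noise_level = np.linspace(0.6, 0.1, years.size)         # larger noise further back in time

recon_a = signal + noise_level * rng.standard_normal(years.size)
recon_b = signal + noise_level * rng.standard_normal(years.size)

window = 100  # years per window
for start in range(0, years.size - window + 1, 200):
    a = recon_a[start:start + window]
    b = recon_b[start:start + window]
    r = np.corrcoef(a, b)[0, 1]
    print(f"{years[start]}-{years[start + window - 1]}: r = {r:.2f}")
```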

Finally, the article offers prospects and suggestions for future work to reduce uncertainty in large-scale climate reconstructions. Firstly, more effort should go into developing long, high-quality, temperature-sensitive proxy records for areas with sparse proxy archives (e.g., East China, Africa, the Antarctic, South America, and some oceanic areas). Secondly, the reliability of reconstructions outside the instrumental period should be rigorously assessed by comparison with the low-resolution proxy records that were excluded from the current proxy network and by applying the "pseudo-proxy experiment" method.

Credit: 
Science China Press

Location, location, location -- Even gut immune response is site-specific

image: The pictures show the same stomach organoids in cross-section: cell nuclei in blue and the cell skeleton in pink, with the microscopic image of the organoids in grey. A single organoid here is about a quarter of a millimetre in size.

Image: 
Sina Bartfeld / University of Würzburg

Why is it that some chronic inflammatory bowel diseases, such as Crohn's, affect both the small intestine and the colon, while others, like ulcerative colitis, are restricted to the colon? In order to solve clinical puzzles such as this one, among others, researchers from the University of Würzburg created miniature versions of the digestive tract in the lab. One of their discoveries: the digestive tract contains an inherent segmentation that could shed new light onto these common inflammatory conditions.

Scientists are now able to generate miniature versions of practically any organ of our bodies - including skin, brain and intestine. These three-dimensional structures are generated from stem cells and are called "organoids".

With a diameter of around 0.5 millimeter, organoids may only be the size of a grain of mustard, but they show a remarkable similarity to the real organs. "Despite their minuscule size, organoids simulate the organ they are derived from extremely well," says Dr. Sina Bartfeld, who led the study at the Research Center for Infectious Diseases at the Institute for Molecular Infection Biology. "Organoids contain the same types of cells as the real organ. The stem cells from which the organoids are generated contain a kind of pre-programmed tissue identity. The stem cell 'knows' which organ it comes from, and even in culture it produces the kinds of cells that are present in this organ in our bodies." In collaboration with surgeon Armin Wiegering from the University Hospital of Würzburg, Dr. Bartfeld's team generated organoids from stomach, small intestine and colon. They discovered an unexpectedly large molecular complexity, as revealed by RNA sequencing, which reflects the gene activity of the cells.

One of their findings was that organoids from the different segments of the digestive tract switch on specific gene-programs, depending on their tissue identity. "It's intuitive that gastric and intestinal cells have to produce different digestive enzymes - but we were surprised to discover that particular binding sites of the immune system are also part of this tissue identity", says Bartfeld.

The particular organization of the immune binding sites may play a role in the organ-specific inflammatory diseases. It could even be relevant for the development of cancer, where chronic inflammation has also been implicated. Whether this is the case and how inflammation could contribute to carcinogenesis requires further research, for which organoids form a novel basis.

Not only can organoids be generated rapidly and in large numbers in the laboratory, they have the added advantage that they consist of human tissue and form an approximate representation of a human organ. Since there are substantial differences between humans and animals, organoids can help to reduce animal experiments and illuminate uniquely human diseases. They also play an increasingly important role in drug development.

Organoids demonstrate the amazing organization of the gut - also regarding the recognition of bacteria and viruses

In addition, organoids open up entirely new ways of investigating basic molecular processes in a biologically realistic model - such as the digestive system, which is also the focus of Dr. Bartfeld's research team in Würzburg. The epithelial cells that line our digestive tract have an important barrier function, which prevents bacteria from entering our bodies. These could be pathogens, such as disease-causing bacteria or viruses.

At the same time, the intestine is colonized by billions of beneficial bacteria, the so-called microbiota, which help us to digest food. The epithelial cells thus have to be able to sense both friendly and hostile bacteria or viruses and respond appropriately. This is accomplished via special immune binding sites, called pattern recognition receptors.

These receptors recognize specific molecules produced by the different bacteria in the intestine. If the epithelial cells recognize molecules produced by dangerous pathogens, as opposed to beneficial bacteria, they have to raise the alarm and induce an immune response. So far it was unclear how the epithelium is able to differentiate between friend and foe. "It is extremely difficult to untangle the complex interaction between immune cells, epithelial cells and microbes," says Dr. Bartfeld. "However, since our organoids contain only epithelial cells, we can use them to specifically investigate the contribution of the epithelium in this interaction."

During their study, the scientists discovered that each pattern recognition receptor has its own, segment-specific gene activity pattern. "The stomach as well as each segment of the intestine has its own specific repertoire of pattern recognition receptors," explains Özge Kayisoglu, first author of the study. "Thus, the immune response of the epithelium is location-specific. In this way, the stomach reacts to different bacterial compounds than the small intestine or the colon." These differences in the immune response may contribute to segment-specific diseases like ulcerative colitis or Crohn's disease.

What induces this differential reaction to bacterial compounds? Initially, the obvious assumption was that the immune receptors are regulated in response to colonization with beneficial bacteria. To test this hypothesis, the researchers generated organoids that had never come into contact with bacteria. "The data showed that the microbiota does have an effect - but we were surprised to find that in large part, immune recognition of the epithelium is in fact genetically determined during development and independent of the environment," says Bartfeld.

Collectively, their findings represent an important step in illuminating inflammatory processes. They show that each section of the digestive tract has its own specific combination of immune recognition receptors. Dysfunctions of this innate immunity may promote the development of inflammatory diseases.

Credit: 
University of Würzburg

Obese BME people at 'higher risk' of contracting COVID-19

Obese people among black and minority ethnic (BME) communities are at around twice the risk of contracting COVID-19 compared with white Europeans, a study conducted by a team of Leicester researchers has found.

Previous research has shown that ethnicity can alter the association between body mass index (BMI) and cardiometabolic health, so the researchers wanted to explore whether a person's weight could change the relative risk of COVID-19 across ethnic groups.

Emerging COVID-19 evidence has found that South Asian and black, African or Caribbean populations are at a higher risk of becoming seriously unwell with the condition. In addition, a link with obesity has also been found.

The research team drew expertise from the University of Leicester's Diabetes Research Centre, the National Institute for Health Research (NIHR) Leicester Biomedical Research Centre (BRC) and the NIHR Applied Research Collaboration (ARC) East Midlands.

The researchers accessed the UK Biobank - a national database of more than 500,000 people whose health information was recorded between March 2006 and July 2010. These individuals had their health records cross-referenced with a national COVID-19 laboratory test data bank covering March 16 to June 14, 2020. Of that cohort, 5,623 unique test results were available.

The information helped the research team to quantify the association of BMI with the risk of a positive test for COVID-19, broken down by ethnic group.

According to the results, the greater risk of COVID-19 in BME people relative to white Europeans was only apparent at higher BMI values. For example, at a BMI value of 25 kg/m2, there was no difference in risk, whereas at a BMI of 30 kg/m2 the risk of COVID-19 was nearly twice as high (1.75) and at 35 kg/m2 more than two and a half times higher (2.76) in BME individuals relative to white Europeans.
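The release does not describe the underlying statistical model, but a pattern in which the ethnic risk gap appears only at higher BMI values is what an interaction between BMI and ethnicity would produce. The sketch below is purely illustrative: the logistic form, function name, and coefficient values are assumptions for demonstration, not the study's actual specification, and the invented coefficients only roughly echo the reported relative risks.

```python
# Illustrative only: a toy logistic model with a BMI x ethnicity interaction,
# showing how a risk gap can emerge at higher BMI even when there is none at
# BMI 25. Coefficients are invented and NOT taken from the Leicester study.
import numpy as np

def covid_risk(bmi, bme, b0=-5.0, b_bmi=0.05, b_bme=0.0, b_int=0.10):
    """Probability of a positive test under a hypothetical logistic model.

    log-odds = b0 + b_bmi*(bmi - 25) + b_bme*bme + b_int*bme*(bmi - 25)
    bme is 1 for black and minority ethnic individuals, 0 for white Europeans.
    """
    log_odds = b0 + b_bmi * (bmi - 25) + b_bme * bme + b_int * bme * (bmi - 25)
    return 1 / (1 + np.exp(-log_odds))

for bmi in (25, 30, 35):
    rr = covid_risk(bmi, bme=1) / covid_risk(bmi, bme=0)
    print(f"BMI {bmi} kg/m2: relative risk (BME vs white European) ~ {rr:.2f}")
```

With these made-up coefficients, the relative risk is 1.0 at a BMI of 25 and rises with BMI, which is the qualitative pattern reported above.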

Cameron Razieh, a PhD student at the University of Leicester, concluded: "Although limited by non-random testing for COVID-19 within the UK, these data suggest that the association between BMI and the risk of COVID-19 may vary by ethnicity and that BMI acts as an important effect modifier for the increased risk of COVID-19 in BME populations. These results suggest that the combination of obesity and BME status may place individuals at particularly high risk of contracting COVID-19, which is consistent with findings for associations of BMI and ethnicity with cardiometabolic dysfunction."

One of the researchers, Professor Kamlesh Khunti, Director of ARC East Midlands and Professor of Primary Care, Diabetes and Vascular Medicine at the University of Leicester, said: "COVID-19 has caused so much death and misery, devastated global economies and put unprecedented strain on clinical services, but despite the worldwide impact this virus has had, we still know very little about it.

"There's still so much we need to understand about the condition, but our latest findings suggest the association between BMI and the risk of COVID-19 may vary by ethnicity.

"That means the combination of obesity and BME status may place individuals at a significantly higher-risk of contracting COVID-19, which is consistent with other studies that have previously linked BMI and ethnicity with cardiometabolic dysfunction."

Co-lead author Professor Thomas Yates, Professor of Physical Activity and Health at the University of Leicester, said: "The role of obesity as a risk factor for chronic disease is well established; however, we need more research to understand its role as a risk factor for COVID-19 and how this affects different populations."

Another of the authors, Professor Melanie Davies CBE, Professor of Diabetes Medicine at the University of Leicester, who is also the Director of the Leicester BRC, said: "Knowledge is power and if we are to have any hope of beating this pandemic, we need to gain as much information about COVID-19 as we possibly can.

"Our research has given us more insight into who is at more risk from being infected and we now need to come together as one to help protect our BME communities, keeping them safe from this devastating virus."

ARC East Midlands funds vital work to tackle the region's health and care priorities by speeding up the adoption of research onto the frontline of health and social care. The organisation puts in place evidence-based frameworks to drive up standards of care and save time and money.

The Centre for BME Health, led by Professor Khunti and funded by ARC East Midlands and the University of Leicester, has led a campaign to increase the number of BME people involved in coronavirus research, in a bid to address the disproportionate impact on BME communities. Celebrities including Omid Djalili and Whoopi Goldberg have got behind the campaign.

Credit: 
National Institute for Health Research

Closing the gap: Citizen science for monitoring sustainable development

Citizen science could help track progress towards all 17 UN Sustainable Development Goals (SDGs). An IIASA-led study, for the first time, comprehensively analyzed the current and potential contribution of citizen science data to monitor the SDGs at the indicator level.

Huge amounts of accurate, timely, and comprehensive data are required to track progress towards the SDGs. The 17 goals set by the UN in 2015 currently include 169 targets and 231 unique indicators that are defined in an evolving framework. However, many of these indicators lack sufficient data to regularly track progress. Citizen science can help to close this data gap. According to new research published in Sustainability Science, citizen science has the potential to provide data to track one third of all SDG indicators. The study included a systematic review of all SDG indicators and mapped past and ongoing citizen science initiatives that could directly or indirectly provide data for SDG monitoring.

Most citizen science initiatives engage members of the public to contribute observations of nature at global, regional, national or local levels, so unsurprisingly, the greatest potential for input was shown to be for the environmental SDG indicators. This is particularly encouraging as, according to the United Nations Environment Programme (UNEP), 68% of environment-related indicators lack data.

"Without new ways of monitoring, such as citizen science, we will never be able to achieve global monitoring of the SDG framework, as traditional means of data collection are too expensive to cover all 231 indicators on a regular, geospatially representative basis," explains UNEP Statistician, Jillian Campbell.

Indicators that were shown to align well with citizen science approaches included those that could be supported by spatial data, such as monitoring of water or air quality, disease threats, post-disaster damage assessment, and open spaces in cities.

"The most remarkable finding from this review process is that citizen science has the potential to contribute to all 17 SDGs, since it is already directly or indirectly contributing, or could contribute to at least one indicator per goal. For example, indicators that could be supported by self-reporting such as sexual violence or perceptions of safety, align well with data already being collected by some citizen science initiatives. These findings are generating interest and our results have been presented to the Inter-agency and Expert Group on SDG Indicators that is responsible for developing and implementing the global indicator framework for the SDGs and targets," says IIASA researcher and study lead author, Dilek Fraisl.

The researchers demonstrate that while citizen science data cannot replace traditional data sources, nor compensate for all their limitations, there is great potential for these new data sources to complement the census, household surveys, and administrative records currently used to monitor progress on the SDGs. In many cases, citizen science initiatives are already established and only require varying degrees of modification, opening up, and collaboration to bring their approaches and tools to the table.

Two examples from the study of how citizen science could contribute to SDG monitoring are the Picture Pile tool developed at IIASA and the Humanitarian OpenStreetMap project, both relevant to the SDG indicator on "direct economic loss attributed to a disaster". Picture Pile asks volunteers to classify satellite images to identify buildings damaged after a disaster. It is designed as a flexible tool that can also be used for monitoring SDG indicators related to poverty, food security, ecosystem health, and deforestation, among others. In the Humanitarian OpenStreetMap application, participants digitize the areas affected by disasters, including damaged roads, so that disaster responders can reach those in need. Researchers can then apply maximum damage functions to the mapped areas to calculate some of the direct economic losses due to a disaster.
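The release does not give the damage functions themselves. The following sketch, with invented land-use classes, damage values, and areas, only illustrates the general idea of multiplying volunteer-mapped areas by a maximum damage value per unit area to approximate direct economic loss; none of the figures come from the study.

```python
# Illustrative only: applying a "maximum damage" value per square metre to
# areas that volunteers have digitized as affected, to approximate direct
# economic loss. Land-use classes, damage values, and areas are invented.
MAX_DAMAGE_PER_M2 = {          # hypothetical maximum damage values (USD per m2)
    "residential": 600.0,
    "commercial": 900.0,
    "agricultural": 5.0,
}

mapped_areas_m2 = {            # hypothetical areas digitized by volunteers
    "residential": 120_000,
    "commercial": 35_000,
    "agricultural": 2_500_000,
}

def direct_economic_loss(areas, damage_values, damage_fraction=0.4):
    """Sum of area x maximum damage x an assumed average damage fraction."""
    return sum(
        areas[cls] * damage_values[cls] * damage_fraction for cls in areas
    )

loss = direct_economic_loss(mapped_areas_m2, MAX_DAMAGE_PER_M2)
print(f"Estimated direct economic loss: ${loss:,.0f}")
```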

The study also showed that citizen science was successfully introduced into the reporting and monitoring process for the indicator on marine litter, which initially had no established methodology or standards to support it. The indicator's methodology now suggests citizen science as a primary source of data for monitoring marine litter.

The researchers highlight that realizing the full potential of citizen science will require demonstrating its value in the global data ecosystem, building partnerships around citizen science data to accelerate SDG progress, and encouraging investment to enhance its use and impact.

Credit: 
International Institute for Applied Systems Analysis