
COVID-19 pandemic has had significant impact on the mental health of adolescents

New York, NY, June 3, 2021 -- A study of over 59,000 Icelandic adolescents by a team of Icelandic and North American behavioral and social scientists found that COVID-19 has had a significant, detrimental impact on adolescent mental health, especially in girls. The study is the first to investigate and document age- and gender-specific changes in adolescent mental health problems and substance use during the COVID-19 pandemic, while accounting for upward trends that were appearing before the pandemic. The findings are published in The Lancet Psychiatry.

The study found that negative mental health outcomes were disproportionately reported by girls and older adolescents (13-18-year-olds), compared to same-age peers prior to the pandemic. At the same time, it revealed a decline in cigarette smoking, e-cigarette usage and alcohol intoxication among 15-18-year-old adolescents during the pandemic.

"The decrease observed in substance use during the pandemic may be an unintended benefit of the isolation that so many adolescents have endured during quarantine," said collaborating senior investigator John Allegrante, an affiliated professor of sociomedical sciences at Columbia University Mailman School of Public Health and an applied behavioral scientist.

Thorhildur Halldorsdottir, a clinical psychologist and assistant professor of psychology at Reykjavik University who is the study co-principal investigator, said the study represents a "landmark contribution to what we now know about just how psychologically devastating being socially isolated from peers and friends during the ongoing pandemic has been for young people."

According to the researchers, prior studies have not been designed to determine whether clinically relevant levels of depression--as opposed to self-reported depressive symptoms--and substance use have increased during the pandemic.

Inga Dora Sigfusdottir, professor of sociology at Reykjavik University, scientific director of the Icelandic Centre for Social Research and Analysis, and research professor of health education at Teachers College, said the study "differs in methodology from previous studies in that it tracked population-based prevalence of mental health outcomes and substance use over several years in order to better distinguish the potential effects of COVID-19 from recent upward trends in adolescent mental health problems."

Previous studies of adolescents during COVID-19 found evidence of increased mental health problems and certain types of substance use that had already been rising before the pandemic. This study, however, compares current data with several pre-pandemic time points, which enabled the researchers to separate the effect of COVID-19 from those pre-existing trends in adolescent mental health.

The implication of the new study is that interventions intended to lessen the negative impact of the pandemic on adolescent mental health might help improve the mental health outlook for young people around the world who have been caught up in the pandemic, observed Allegrante, who is also senior professor of health education at Columbia Teachers College.

"Isolation during the pandemic has been universal and it is global, and it is having a clinically important, negative impact on young people who have not been in school during the pandemic. Whether an adolescent was an Icelander in Reykjavik who had been at home for most of the last year or an American in New York City, living under the same circumstances - being at home, engaged in remote learning and separated from friends--the consequences of not going to school not only set back their learning but also negatively affected their mental health. What we don't know is by how much."

The study shows that "population-level prevention efforts, especially for girls, are warranted," but that "more study is needed to determine the long-term effects of quarantine and being socially isolated from one's peers, including the effects on learning and academic achievement and relationships with parents, siblings, and peers," said Allegrante.

Ingibjorg Eva Thorisdottir, chief data analyst at the Icelandic Centre for Social Research and Analysis (ICSRA) at Reykjavik University (who studied at Teachers College in 2009 as part of an exchange with Reykjavik University), was the principal investigator and lead author of the report.

Alfgeir L. Kristjansson, Senior Scientist at ICSRA and Associate Professor of Public Health at West Virginia University and a co-author of the study, said the "results underline the significance of social relationships in the health and well-being of youth and the importance of nurturing and maintaining strong social support mechanisms in their lives. The Lancet Psychiatry study report highlights these findings at population scale." Kristjansson was a postdoctoral fellow with Allegrante at Teachers College during 2010-2012.

In a commentary that accompanies the article's publication, Gertrud Sofie Hafstad and Else-Marie Augusti, both senior researchers at the Norwegian Centre for Violence and Traumatic Stress Studies in Oslo, write that the study "clearly shows that gauging the mental health status of adolescents over time is of imminent importance."

Credit: 
Columbia University's Mailman School of Public Health

NTU scientists establish new records of Singapore's sea-level history

image: The NTU Asian School of the Environment team behind the study of Singapore's sea-level include (L-R): Associate Professor Adam Switzer, Research fellow Dr Stephen Chua and Director of the Earth Observatory of Singapore, Professor Benjamin Horton.

Image: 
NTU Singapore

Climate scientists at the Nanyang Technological University, Singapore (NTU, Singapore) have extended the known record of Singapore's sea-level to almost 10,000 years ago, providing a more robust dataset to aid future predictions of sea-level rise.

One of the main challenges in researching climate change is to reconstruct its history over thousands of years. To have a better sense of the potential causes and effects of future changes, scientists need to learn from and understand the past.

Extracting ancient sediments from a depth of up to 40 m underground at a site at Singapore's Marina South, an international team led by NTU researchers put the samples through rigorous laboratory methods (e.g., identifying microfossils such as foraminifera) and statistical analysis to obtain data to reconstruct Singapore's sea level history.

For climate scientists, the further the sea-level record goes back in time, the clearer the picture can be for future predictions. The transition at the beginning of the Holocene (10,000-7,000 years ago) represented the last major episode of natural global warming in Earth's history, when melting ice sheets and warming oceans led to a 20 m rise in sea level. Over the last 3,000 years, sea level in Singapore was stable, until the recent acceleration in the 20th century due to climate change.

Lead author Dr Stephen Chua, who completed the study as part of his doctoral work at the Earth Observatory of Singapore (EOS) and Asian School of the Environment (ASE) at NTU Singapore, said, "By dating the Singapore sea-level record to 10,000 years ago, we retrieved crucial new information from the early Holocene period. This is a period that is characterised by rapid sea-level rise yet remains poorly understood - until now."

"This more refined sea-level record also has wider implications. For instance, it would lead to more robust and accurate local projection of sea-level rise, offering a strategic guide for Singapore as it moves to adapt to climate change."

Professor Maureen Raymo, Co-Founding Dean of the Columbia Climate School at Columbia University, who was not involved in the study, said: "This is the type of crucial information needed to effectively plan adaptation measures in the face of ongoing sea level rise due to global warming. Our past does inform our future."

Why was the Marina South site chosen for investigation?

Developing an accurate ancient sea-level record required sediment extraction from an 'ideal' site where deposits such as marine mud and mangrove peats are present.

To pick the best possible coring site for accurate results, researchers looked through thousands of available borehole logs - records of holes that have been drilled into the ground for infrastructure projects.

Associate Professor Adam Switzer who leads the Coastal Lab at ASE and EOS and who was Dr Chua's supervisor, said, "Finding the right place to drill was a huge effort. Stephen spent well over a year going over old borehole information from a variety of construction efforts over the last 30 years just to find records that might be suitable. As a result, our understanding of the geology of the whole area has also dramatically improved."

Findings useful for Singapore's coastal defence plan against rising sea levels

The study, published in the peer-reviewed journal The Holocene on 4 June 2021, also found the first conclusive evidence that mangroves only existed in the Marina South area for around 300 years before succumbing to flooding associated with rising sea level at the time.

At a depth of 20 m below modern sea level, researchers found abundant mangrove pollen, indicating that a mangrove shoreline existed in southern Singapore almost 10,000 years ago. The NTU findings reveal that sea-level rise during that time was as high as 10 to 15 mm per year, which likely led to the mangroves' demise.

The findings provide Singapore with useful insights for current and future adaptation methods as the island nation looks to go beyond engineering solutions and to incorporate natural methods to safeguard the country's coastlines.

Despite mangroves' adaptability and effectiveness as a coastal defence, the study highlights their limitations in the event of rapid sea-level rise. This confirms an earlier study co-authored by NTU showing that mangroves will not survive if sea-level rise exceeds 7 mm per year under a high carbon emissions scenario.

Co-author of the study, Professor Benjamin Horton, Director of EOS, said, "Sea-level rise is a potentially disastrous outcome of climate change, as rising temperatures melt ice sheets and warm ocean waters. Scenarios of future rise are dependent upon understanding the response of sea level to climate changes. Accurate estimates of past sea-level variability in Singapore provide a context for such projections".

Providing an independent comment on the research, Professor Philip Gibbard, a Quaternary geologist from the Scott Polar Research Institute at the University of Cambridge, underscored the importance of records from localities distant from the glaciated regions such as Singapore.

"They offer a model of the process of sea-level change uncomplicated by factors associated with deglaciation, meltwater discharge and more. This important systematic contribution from Singapore and the region provides a valuable record that spans the post-glacial Holocene period, thus allowing a general pattern of sea-level change in the region to be established. This record can then be further refined as more studies become available in the future."

Credit: 
Nanyang Technological University

AI outperforms humans in creating cancer treatments, but do doctors trust it?

image: Senior Author Dr. Purdie says there can be a disconnect between a lab-type of setting and a clinical one in AI-generated treatments.

Image: 
Courtesy of Dr. Purdie.

(Toronto, June 3, 2021) -- The impact of deploying Artificial Intelligence (AI) for radiation cancer therapy in a real-world clinical setting has been tested by Princess Margaret researchers in a unique study involving physicians and their patients.

A team of researchers directly compared physician evaluations of radiation treatments generated by an AI machine learning (ML) algorithm to conventional radiation treatments generated by humans.

They found that in the majority of the 100 patients studied, treatments generated using ML were deemed to be clinically acceptable for patient treatments by physicians.

Overall, 89% of ML-generated treatments were considered clinically acceptable, and 72% were selected over the conventional human-generated treatments in head-to-head comparisons.

Moreover, the ML radiation treatment process was 60% faster than the conventional human-driven process, reducing the overall time from 118 hours to 47 hours. In the long term this could represent a substantial cost savings through improved efficiency, while at the same time improving quality of clinical care, a rare win-win.
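As a quick consistency check (a minimal sketch using only the two figures quoted above), the reported speed-up works out to roughly 60%:

```python
# Quick arithmetic check of the time savings quoted above (118 h vs. 47 h).
conventional_hours = 118
ml_hours = 47
relative_saving = (conventional_hours - ml_hours) / conventional_hours
print(f"Relative time saved: {relative_saving:.0%}")  # prints "Relative time saved: 60%"
```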

The study also has broader implications for AI in medicine.

While the ML treatments were overwhelmingly preferred when evaluated outside the clinical environment, as is done in most scientific works, physician preferences for the ML-generated treatments changed when the chosen treatment, ML or human-generated, would be used to treat the patient.

In that situation, the number of ML treatments selected for patient treatment was significantly reduced, sounding a note of caution for teams considering deploying inadequately validated AI systems.

Results from the study team, led by Drs. Chris McIntosh, Leigh Conroy, Ale Berlin, and Tom Purdie, were published in Nature Medicine on June 3, 2021.

"We have shown that AI can be better than human judgement for curative-intent radiation therapy treatment. In fact, it is amazing that it works so well," says Dr. McIntosh, Scientist at the Peter Munk Cardiac Centre, Techna Institute, and chair of Medical Imaging and AI at the Joint Department of Medical Imaging and University of Toronto.

"A major finding is what happens when you actually deploy it in a clinical setting in comparison to a simulated one."

Adds Dr. Purdie, Medical Physicist, Princess Margaret Cancer Centre: "There has been a lot of excitement generated by AI in the lab, and the assumption is that those results will translate directly to a clinical setting. But we sound a cautionary alert in our research that they may not.

"Once you put ML-generated treatments in the hands of people who are relying upon it to make real clinical decisions about their patients, that preference towards ML may drop. There can be a disconnect between what's happening in a lab-type of setting and a clinical one." Dr. Purdie is also an Associate Professor, Department of Radiation Oncology, University of Toronto.

In the study, treating radiation oncologists were asked to evaluate two different radiation treatments - either ML or human-generated ones - with the same standardized criteria in two groups of patients who were similar in demographics and disease characteristics.

The difference was that one group of patients had already received treatment so the comparison was a 'simulated' exercise. The second group of patients were about to begin radiation therapy treatment, so if AI-generated treatments were judged to be superior and preferable to their human counterparts, they would be used in the actual treatments.

Oncologists were not aware of which radiation treatment was designed by a human or a machine. Human-generated treatments were created individually for each patient, as per normal protocol, by specialized radiation therapists. In contrast, each ML treatment was developed by a computer algorithm trained on a high-quality, peer-reviewed database of radiation therapy plans from 99 patients previously treated for prostate cancer at Princess Margaret.

For each new patient, the ML algorithm automatically identifies the most similar patients in the database, using similarity metrics learned from thousands of features derived from patient images and from the delineated target and healthy organs that are a standard part of the radiation therapy planning process. The complete treatment for the new patient is then inferred from those most similar patients, according to the ML model.
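To make the idea concrete, here is a minimal, hypothetical sketch of this kind of atlas-based inference, not the actual Princess Margaret algorithm: patient features, learned metric weights, stored plans, and all array shapes below are illustrative assumptions only.

```python
import numpy as np

def infer_plan(new_patient, atlas_features, atlas_plans, feature_weights, k=3):
    """Illustrative atlas lookup: rank previously treated patients with a
    weighted similarity over their image-derived features, then build the
    new plan from the k closest matches."""
    diffs = (atlas_features - new_patient) * feature_weights   # broadcast over the stored patients
    distances = np.linalg.norm(diffs, axis=1)                  # one distance per stored patient
    nearest = np.argsort(distances)[:k]                        # indices of the k most similar patients
    return atlas_plans[nearest].mean(axis=0)                   # combine their stored plans

# Toy usage: 99 stored patients, 1,000 features each, plans as small dose arrays.
rng = np.random.default_rng(0)
atlas_features = rng.random((99, 1000))
atlas_plans = rng.random((99, 64))        # stand-in for stored dose distributions
weights = rng.random(1000)                # stand-in for the learned similarity metric
new_plan = infer_plan(rng.random(1000), atlas_features, atlas_plans, weights)
```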

Although ML-generated treatments were rated highly in both patient groups, the results in the pre-treatment group diverged from the post-treatment group.

In the group of patients who had already received treatment, ML-generated treatments were selected over human-generated ones 83% of the time. This dropped to 61% when the chosen treatment would actually be used for the patient's care.

"In this study, we're saying researchers need to pay attention to a clinical setting," says Dr. Purdie. "If physicians feel that patient care is at stake, then that may influence their judgement, even though the ML treatments are thoroughly evaluated and validated."

Dr. Conroy, Medical Physicist at Princess Margaret, points out that following the highly successful study, ML-generated treatments are now used in treating the majority of prostate cancer patients at Princess Margaret.

That success is due to careful planning, judicious stepwise integration into the clinical environment, and involvement of many stakeholders throughout the process of establishing a robust ML program, she explains, adding that the program is constantly refined, oncologists are continuously consulted and give feedback, and the results of how well the ML treatments reflect clinical accuracy are shared with them.

"We were very systematic in how we integrated this into the clinic at Princess Margaret," says Dr. Berlin, Clinician-Scientist and Radiation Oncologist at Princess Margaret. "To build this novel software, it took about six months, but to get everyone on board and comfortable with the process, it took more than two years. Vision, audacity and tenacity are key ingredients, and we are fortunate at Princess Margaret to have leaders across disciplines that embody these attributes." Dr. Berlin is also an Assistant Professor, Department of Radiation Oncology, University of Toronto.

The success of launching a study of this calibre relied heavily on the commitment of the entire genitourinary radiation cancer group at Princess Margaret, including radiation oncologists, medical physicists, and radiation therapists. This was a large multidisciplinary team effort with the shared goal of improving radiation cancer treatment for patients at Princess Margaret.

The team is also expanding their work to other cancer sites, including lung and breast cancer, with the goal of reducing cardiotoxicity, a possible side effect of treatment.

Credit: 
University Health Network

Milk makeover: A great start for a healthy heart

image: More milk = lower blood cholesterol, lower blood lipid levels, and less risk of heart disease.

Image: 
Unsplash

A dash of milk could make all the difference to a healthy heart as new research from the University of South Australia finds that people who regularly consume milk have a lower risk of heart disease.

Conducted in partnership with the University of Reading, the world-first study used a genetic approach to investigate causal relationships between milk consumption and risk of cardiovascular disease.

Assessing genetic biomarkers among 400,000+ people, the study found that greater milk consumption was associated with lower blood cholesterol, lower blood lipid levels, and a lower risk of heart disease.

Cardiovascular diseases are the number one cause of death globally, taking an estimated 17.9 million lives each year. In Australia, cardiovascular disease affects more than four million people, and kills one Australian every 28 minutes.

Most cardiovascular disease risks are preventable through a healthy diet and lifestyle.

UniSA researcher and Director of the Australian Centre for Precision Health, Professor Elina Hypponen says the finding supports the role of milk as a healthy part of a balanced diet.

"People have long had a love-hate relationship with milk, which is not surprising given the mixed messages about dairy," Prof Hypponen says.

"While some reports show that high dairy and milk consumption is linked with cardio-metabolic risk factors, evidence from randomised controlled trials have been inconsistent.

"In this study, we conducted robust genetic tests to assess whether milk was associated with an increase in heart disease, and while we confirm that milk can cause an increase in body fat, we also show that it leads to lower cholesterol concentration and lower cardiovascular disease risk.

"The risk reduction could be explained by milk calcium, which has shown to increase the enzymes that break down fats within the body and thereby lower cholesterol levels.

"What this shows is that milk can be a part of a healthy balanced diet; there is no need to limit milk consumption if you're looking to improve your heart health."

Credit: 
University of South Australia

Dominant factor of carrier transport mechanism in multilayer graphene nanoribbons revealed

image: Carrier transport characteristics of the FETs using the GNR channel.

Image: 
Osaka University

Researchers from Osaka University, Toyo University, and Kyushu Institute of Technology clarified the expression mechanism of semiconducting and metallic properties in graphene nanoribbons (GNRs) by analyzing the carrier transport properties in the field effect transistor (FET) with a multilayer GNR channel (Fig. 1).

The research team fabricated multilayer GNRs with precisely controlled numbers of layers via a chemical vapor deposition method using a solid template. "This enabled us to compare the observed carrier transport properties in the FET using a multilayer GNR channel with various numbers of layers and device simulations based on theoretical calculations," explains first author Ryota Negishi.

The study shows that the carrier transport behavior of the FET-GNR changes from semiconducting to metallic properties as the effect of the charged impurities on the device substrate (SiO2/Si) is reduced, namely semiconducting behavior for a few layers (1-3 layers) and metallic behavior for multilayers (6-8 layers), as shown in Fig. 2.

Fig 3(a) and 3(b) show the surface potential map of the monolayer and multilayer graphene calculated by the device simulation. In the case of monolayer GNR, the local high potential barriers are formed by the charged impurities on the substrate; however, in the case of multilayer GNRs, the surface potential becomes flat due to the field screening effect between layers.

Fig 3(c) and 3(d) schematically show the surface potential of monolayer and multilayer GNRs, respectively. The carrier transport of the monolayer GNR on the device substrate is affected by the local high potential barriers due to the charged impurities and hence shows hopping conduction over the barriers assisted by thermal energy. In particular, in a thin one-dimensional structure such as a monolayer GNR, the conduction electrons are localized and exhibit semiconducting-like properties because the locally formed potential barriers divide the current path of the thin GNR channel. On the other hand, in a multilayer GNR, the electric field of the charged impurities is dramatically reduced as the number of layers increases due to the screening effect between layers, and the potential near the top layer becomes flat, showing metallic characteristics with high conductivity.

Fig 3(e) shows the observed carrier transport properties of the FET with GNR channel as a function of the number of layers. After the number of layers reaches around five, the carrier transport characteristics change from semiconducting to metallic properties. A film thickness of five layers is in good agreement with the screening length (~1.5 nm) of the electric field due to the charged impurities that exist on the substrate. This indicates that the carrier transport characteristics (i.e., semiconducting or metallic properties) in actual GNR devices are determined by the presence or absence of these environmental effects.

"We have succeeded in controlling semiconducting and metallic properties of FETs using multilayer GNRs by modulating the number of layers," says Ryota Negishi. "GNRs with excellent carrier transport properties are attracting attention as next-generation electronic device materials. This study provides design guidelines for GNRs that incorporate actual device structures. This achievement is an important milestone that will accelerate the application of GNR materials for purposes such as high-speed transistors and ultra-thin wiring."

Credit: 
Osaka University

CNIC scientists identify essential factors for limb formation

image: Skeletal staining of the inferior limb regions of normal embryos, embryos with a single Meis allele, and an embryo with complete absence of Meis. Embryos with a single copy of Meis lack the fibula and posterior digits (black arrow). Total absence of Meis prevents limb development (white arrow).

Image: 
CNIC

Scientists at the Centro Nacional de Investigaciones Cardiovasculares (CNIC), working in partnership with researchers at the Institut de Recherches Cliniques de Montréal (IRCM) in Canada, have identified Meis transcription factors as essential biomolecules for the formation and antero-posterior patterning of the limbs during embryonic development.

In the study, published in Nature Communications, the research team carried out an in-depth characterization of the Meis family of transcription factors. Genetic deletion of all four family members showed that these proteins are essential for the formation of the limbs during embryonic development. "An embryo that develops in the absence of Meis does not grow limbs," said study coordinator Miguel Torres, who leads the Genetic Control of Organ Development and Regeneration group at the CNIC.

Embryonic development is a highly complex process involving interactions among a large array of molecules to ensure the correct formation of a specific organ or tissue from a small initial number of cells. The limbs, explained first author Irene Delgado, "start to form as bulges on the flank of the embryo called limb buds. Growth of the limb bud eventually results in the formation of the skeletal components of the limb."

One of the factors that plays a crucial role in the developing limb is a group of transcription factors called Meis. "In a normal embryo, the Meis genes are expressed very early during the formation of the limb buds," the scientists explained.

In the new study, in-depth molecular characterization of developing mouse embryos revealed that Meis factors initiate a signaling cascade that is essential for limb bud development and involves contributions from Fgf10 and Lef1. "Our results identify roles for Meis transcription factors in the developing limb and reveal their participation in essential pathways for limb development. During early limb bud formation, Meis transcription factors are essential for inducing the expression of Fgf10 and Lef1."

"Meis proteins, together with other factors like Hox and Tbx, bind to regulatory DNA sequences of the Fgf10 and Lef1 genes and regulate their expression," explained Irene Delgado.

An embryo that lacks Meis genes is unable to grow limbs, but the presence of just one of these four genes (a single allele) "is enough to initiate limb development and also reveals other functions of Meis, such as its importance for the formation of the proximal limb structures (pelvis and femur) and for antero-posterior limb patterning," said Miguel Torres.

Nevertheless, the pelvis and femur of embryos with a single Meis allele are smaller than those of a normal embryo. Moreover, added Delgado, "These embryos have defects in, or simply lack, posterior skeletal elements such as the fibula and posterior digits."

The authors further demonstrated that the molecular basis for these defects is failed initiation of the expression of the Sonic Hedgehog gene, which is essential for antero-posterior limb patterning.

Credit: 
Centro Nacional de Investigaciones Cardiovasculares Carlos III (F.S.P.)

Extreme rainfall: More accurate predictions in a changing climate

To limit the impacts of climate change it is essential to predict them as accurately as possible. Regional Climate Models are high-resolution models of the Earth's climate that are able to improve simulations of extreme weather events that may be affected by climate change and thus contribute to limiting impacts through timely action.

At their highest resolutions, Regional Climate Models are capable of simulating atmospheric convection, a key process in many extreme weather events which is often the cause of very intense and localized precipitations. Although "convection permitting" models are widely used in weather forecasting, they require large supercomputing resources which limits their use in longer-term climate modelling. However, improved computer power has now made their use in climate prediction more viable.

A study involving research teams from across Europe collaborating on the CORDEX-FPS Flagship Pilot Study on convective phenomena - including scientists of the CMCC Foundation (Euro-Mediterranean Center on Climate Change) - presents the first multi-model ensemble of decade-long regional climate simulations run at kilometre scale. The CORDEX-FPS project on Europe and the Mediterranean region, which focuses on convective precipitation events and their evolution under human-induced climate change, selected the Alpine space as a common target area on which to experiment.

Convection-permitting models were used to produce high-resolution simulations of rainfall dynamics from 2000 to 2009. The simulated rainfall during this period was compared with observed rainfall datasets, assessing how well the models had reproduced real events. The configuration developed by the CMCC obtained particularly good results. Moreover, results were compared with lower-resolution models, revealing that high-resolution models bring a significant improvement in model performance.

Paola Mercogliano, Director of the CMCC Division Regional Models and geo-Hydrological Impacts, and co-author of the study together with CMCC researchers Marianna Adinolfi and Mario Raffa, explains that: "Although differences still exist between ultra-high-resolution simulations and observations, it is clear that these simulations perform better than simulations with lower resolution in representing precipitation in the current climate, and thus offer a promising prospect for studies on climate and climate change at local and regional scales. The most significant improvements of the high-resolution simulations compared to the lower resolution ones are found especially in summer, when the low-resolution model overestimated the frequency and underestimated the intensity of daily and hourly rainfall."

The benefit of a higher resolution was most pronounced for heavy rainfall events.

On average, the low-resolution models underestimated summer heavy rainfall per hour by ~40%. The high-resolution models only underestimated this rainfall by ~3%. Moreover, the uncertainty ranges in the simulations - namely the variability between the models - were also almost halved at a high resolution for wet hour frequency.
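For readers who want to see how such an underestimation percentage is defined, here is a minimal sketch with hypothetical numbers; the rainfall values below are assumptions chosen only to reproduce figures of the same order as the ~40% and ~3% quoted above, not data from the study.

```python
# Toy numbers (assumed, not taken from the study) showing how underestimation
# percentages like the ~40% and ~3% figures above are computed.
observed = 10.0     # observed summer heavy-rainfall intensity, mm per hour
low_res = 6.0       # hypothetical low-resolution model value
high_res = 9.7      # hypothetical convection-permitting (km-scale) value

def underestimation(model_value, observed_value):
    return (observed_value - model_value) / observed_value * 100.0

print(f"low resolution:  {underestimation(low_res, observed):.0f}% too low")   # ~40%
print(f"high resolution: {underestimation(high_res, observed):.0f}% too low")  # ~3%
```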

Policymakers rely on accurate climate information to formulate effective measures to adapt to and mitigate the impact of climate change, and this study presents a useful method to improve predictions of extreme rainfall. Improving these predictions helps people and policymakers formulate climate adaptation and mitigation measures with the best available information.

Further studies are currently being developed within the CORDEX-FPS Flagship Pilot Study on convective phenomena to demonstrate the added value of ultra-high resolution configurations.

Credit: 
CMCC Foundation - Euro-Mediterranean Center on Climate Change

The DNA of three aurochs found next to the Elba shepherdess opens up a new enigma for palaeontology

image: Artistic reconstruction of the Elba shepherdess, accompanied by the three aurochs found at the site, whose mitochondrial DNA has been analysed.

Image: 
José Antonio Peñas (SINC)

Research involving scientists from the University of A Coruña has succeeded in sequencing the oldest mitochondrial genome of the immediate ancestor of modern cows that has been analysed to date. The remains, some 9,000 years old, were found next to a woman. Why were they with her if cattle had not yet been domesticated? Do they belong to ancestors of today's Iberian cows?

Humans have maintained a very close relationship with aurochs (Bos primigenius) since their beginnings, first by hunting them and then by breeding and selecting them.
This extinct species of mammal is little known in the Iberian Peninsula because its skeletal remains are difficult to distinguish from those of bison. In fact, there have been references to the presence of "large bovids" in many sites because they cannot be differentiated. At a European level, there is also a lack of genetic data.

An international team of scientists has managed to extract mitochondrial DNA from ruminants from different periods in Galicia. They have analysed the remains of B. primigenius from the Chan do Lindeiro cave (Lugo). These remains were found in a chasm together with the human fossils of the shepherdess of O Courel, "Elba", dated at around 9,000 years old. The aurochs analysed are not the oldest ones discovered, but they are the oldest ones whose mitochondrial DNA has been sequenced so far. Interestingly, although they were found together, they are genetically very different.

"Their discovery in the chasm together with a human is a great enigma. Given all the evidence, such as their similar chronology and the fact that the bones are intermingled at the base of a slump caused by the sinking of the ground -at a depth of 15 to 20 metres-, we think that the woman and the aurochs were found together. This interpretation is controversial because domestication is not regarded as having existed at the time," as Aurora Grandal, a researcher at the University of A Coruña and the co-author of the study published in the PLoS ONE journal, has explained to SINC.

The analysis of their mitochondrial DNA has not allowed these three aurochs to be related to the modern cows of the Peninsula. To investigate this possible relationship, the next step for the research team is to analyse the nuclear DNA.

Until now, different varieties of aurochs have been described based on their morphology only. The three analysed in this study belong to haplogroup P, which is characteristic of the species. However, they differ from each other in a large number of base pairs [the units that make up genetic sequences], which is striking considering they are coeval. "This may indicate that they were from different origins, in a scenario in which the Elba woman played an active role; or it may simply reflect a very high genetic variability in the aurochs," says the researcher.

The origin of cattle domestication in the North
Domestic cattle were introduced into Spain by the first settlers and agricultural societies. Due to the absence of Neolithic sites in Galicia, very little is known about the process in this region.

To extract information about the introduction of this livestock in Galicia, researchers sampled 18 cattle fossils of different ages from different Galician mountain caves, of which eleven were subjected to mitochondrial genome sequencing and phylogenetic analysis.

The study of the three aurochs revealed their kinship with aurochs from other parts of Europe. "By studying their mitochondrial DNA, which is transmitted almost intact from mother to offspring, we can determine in which geographical areas the different lineages predominated and what their movements were due to changes in climatic conditions or even to humans following the onset of livestock farming," the palaeontologist and veterinarian Amalia Vidal, co-author of the study at the same university, tells SINC.

Thanks to the DNA, it is possible to know whether the native aurochs contributed to local livestock farming or, on the contrary, were imported animals, "with all the information that this provides about the movement of bovine and human populations," Vidal continues.

Her data show a close relationship between the first domesticated cattle in Galicia and modern cow breeds and provide an overview of cattle phylogeny. The results of the study indicate that settlers migrated to this region of Spain from elsewhere in Europe and introduced the European cow breeds now common in Galicia.

Aurochs related to the British

"Specifically, these aurochs are more closely related to the aurochs of the British Isles than to the Central European specimens. British aurochs are more recent than those from Galicia. This may be related to the role of the Peninsula as a glacial refuge and the origin of the later recolonisation of the islands," Grandal points out.

These three coeval animals are small and have relatively short horns compared to those of northern Europe, and their morphology is different.

The researchers are now endeavouring to analyse the nuclear DNA of the three aurochs, which will allow them to learn about the possible contributions of these individuals to later domestic livestock. "For example, fragments of nuclear DNA from the British aurochs can be recognised in some breeds of northern European cows. This shows that there was a genetic contribution from aurochs to the already domestic cattle. We are going to look for possible contributions from our aurochs to Iberian cows, whether present-day or fossil," Grandal stresses.

In recent years, there has been growing interest in the scientific community in learning about the origins of domestic animals, and there are a large number of projects to reconstruct their ancestors. One of the reasons for this is that these species are considered to be more rustic and to have a better capacity to adapt to harsh environmental conditions.

"Early projects sought to generate phenotypes similar to the species they were trying to recreate (as was done with Heck cattle), but more modern projects also use DNA as a source of information," Vidal concludes.

Ancestors of bulls and cows

The social organisation of aurochs herds is assumed to have been similar to that of their domesticated bovine descendants: a single male, who is replaced by another male as he weakens, together with his group of females.

The new males, when they reach adulthood, do not remain in the group, whereas the females do. In this way, it is normal for females of the same group to be related, which means that their mitochondrial lineages are similar.

The domestic cow comes from the domestication of the aurochs, albeit not in the Iberian Peninsula but in Asia, specifically in the Middle East, and from a small number of animals. This is the origin of the domestic cow, which then spread along with humans to occupy the whole of Europe.

In Italy, some researchers claim that the already domesticated cows had genetic contributions from local aurochs. The same holds for the British Isles. The contribution of local aurochs to cows is best observed in the nuclear DNA and was detected in some cases in northern European breeds.

In the north of the peninsula, the oldest domestic cows are about 6,000 to 7,000 years old.

Credit: 
Spanish Foundation for Science and Technology

The biodegradable battery

image: The biodegradable battery consists of four layers, all flowing out of a 3D printer one after the other. The whole thing is then folded up like a sandwich, with the electrolyte in the center.

Image: 
Gian Vaitl / Empa

The fabrication device for the battery revolution looks quite inconspicuous: It is a modified, commercially available 3D printer, located in a room in the Empa laboratory building. But the real innovation lies within the recipe for the gelatinous inks this printer can dispense onto a surface. The mixture in question consists of cellulose nanofibers and cellulose nanocrystallites, plus carbon in the form of carbon black, graphite and activated carbon. To liquefy all this, the researchers use glycerin, water and two different types of alcohol. Plus a pinch of table salt for ionic conductivity.

A sandwich of four layers

To build a functioning supercapacitor from these ingredients, four layers are needed, all flowing out of the 3D printer one after the other: a flexible substrate, a conductive layer, the electrode and finally the electrolyte. The whole thing is then folded up like a sandwich, with the electrolyte in the center.

What emerges is an ecological miracle. The mini-capacitor from the lab can store electricity for hours and can already power a small digital clock. It can withstand thousands of charge and discharge cycles and years of storage, even in freezing temperatures, and is resistant to pressure and shock.

Biodegradable power supply

Best of all, though, when you no longer need it, you could toss it in the compost or simply leave it in nature. After two months, the capacitor will have disintegrated, leaving only a few visible carbon particles. The researchers have already tried this, too.

"It sounds quite simple, but it wasn't at all," says Xavier Aeby of Empa's Cellulose & Wood Materials lab. It took an extended series of tests until all the parameters were right, until all the components flowed reliably from the printer and the capacitor worked. Says Aeby: "As researchers, we don't want to just fiddle about, we also want to understand what's happening inside our materials."

Together with his supervisor, Gustav Nyström, Aeby developed and implemented the concept of a biodegradable electricity storage device. Aeby studied microsystems engineering at EPFL and came to Empa for his doctorate. Nyström and his team have been investigating functional gels based on nanocellulose for some time. The material is not only an environmentally friendly, renewable raw material, but its internal chemistry makes it extremely versatile. "The project of a biodegradable electricity storage system has been close to my heart for a long time," Nyström says. "We applied for Empa internal funding with our project, Printed Paper Batteries, and were able to start our activities with this funding. Now we have achieved our first goal."

Application in the Internet of Things

The supercapacitor could soon become a key component for the Internet of Things, Nyström and Aeby expect. "In the future, such capacitors could be briefly charged using an electromagnetic field, for example, then they could provide power for a sensor or a microtransmitter for hours." This could be used, for instance, to check the contents of individual packages during shipping. Powering sensors in environmental monitoring or agriculture is also conceivable - there's no need to collect these batteries again, as they could be left in nature to degrade.

The number of electronic microdevices will also be increasing due to a much more widespread use of near-patient laboratory diagnostics ("point of care testing"), which is currently booming. Small test devices for use at the bedside or self-testing devices for diabetics are among them. "A disposable cellulose capacitor could also be well suited for these applications", says Gustav Nyström.

Credit: 
Swiss Federal Laboratories for Materials Science and Technology (EMPA)

Enantiomorph distribution maps for metals and metallic alloys

image: Equivalent fragments of the crystal structures of β-Mn enantiomorphs. The screw-like arrangements are formed by manganese atoms on different Wyckoff positions (color coded).

Image: 
MPI CPfS

Left- or right-handedness is a symmetry property that many macroscopic objects also exhibit and which is of immense importance, particularly for the bioactivity of organic molecules. Chirality is also relevant for physical or chemical properties such as optical activity or enantioselectivity of crystalline solids or their surfaces. In the case of chiral metallic phases, unconventional superconductivity and unusual magnetically ordered states are linked to the chirality of the underlying crystal structure. Despite this connection between chirality and the properties of a material, detection is often difficult because left-handed and right-handed structural variants can cancel each other out or at least weaken the chirality effects.

It is not always possible to prepare chiral materials that contain only one of the two structural variants. More often, both structural variants are present in a polycrystalline material. For systematic investigations, it is therefore important to be able to determine the handedness with good spatial resolution.

In the present work, it is shown that the EBSD method can be used to determine the distribution of enantiomorphic structural variants not only in polycrystalline materials of multicomponent phases, but also for the chiral elemental structure β-Mn. The distinction between multicomponent crystal structures and the elemental structure is of particular importance, since x-ray diffraction, the method usually used to determine handedness, does not provide any information on the handedness of a chiral elemental structure such as β-Mn. EBSD (Electron Backscatter Diffraction) has for some years been an established method for determining the local crystal orientation in a polycrystalline material by means of Kikuchi lines. The EBSD investigation is carried out in a scanning electron microscope, making it a comparatively simple method for determining the local crystallographic properties of a polycrystalline material. The Kikuchi lines are formed by diffraction of the electrons at a strongly tilted, flat surface. However, conventional methods for evaluating the EBSD pattern do not allow any conclusion about the handedness of a phase. Only the consideration of dynamical multiple scattering of the electrons in the simulation calculations yields differences in the Kikuchi lines of the two enantiomorphs. The handedness is assigned on the basis of the better agreement of the experimental EBSD pattern with one of the two simulated patterns.
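As a rough illustration of this "better agreement" criterion, the sketch below compares an experimental pattern with two simulated enantiomorph patterns and picks the closer match. It is a hypothetical toy only: the normalized cross-correlation score and the random arrays standing in for Kikuchi patterns are assumptions, not the fitting procedure used in the actual work.

```python
import numpy as np

def assign_handedness(experimental, simulated_left, simulated_right):
    """Compare an experimental Kikuchi pattern with two dynamically simulated
    enantiomorph patterns and return the better-matching variant, using
    normalized cross-correlation as a simple agreement score."""
    def ncc(a, b):
        a = (a - a.mean()) / a.std()
        b = (b - b.mean()) / b.std()
        return float((a * b).mean())
    score_left = ncc(experimental, simulated_left)
    score_right = ncc(experimental, simulated_right)
    if score_left > score_right:
        return "left-handed", score_left
    return "right-handed", score_right

# Toy usage with random arrays standing in for simulated EBSD patterns.
rng = np.random.default_rng(1)
sim_left = rng.random((240, 320))
sim_right = rng.random((240, 320))
measured = sim_left + 0.1 * rng.random((240, 320))   # noisy "left-handed" pattern
print(assign_handedness(measured, sim_left, sim_right))
```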

These investigations were carried out on the phase β-Mn and the structurally closely related multicomponent compound Pt2Cu3B. The distribution of enantiomorphs was determined from the EBSD patterns for both phases, while x-ray diffraction on crystals cut with a xenon FIB (focused ion beam) allowed an assignment for the ternary phase only. The EBSD-based determination of the distribution of enantiomorphs in a polycrystalline material significantly simplifies the preparation of materials with defined handedness.

Credit: 
Max Planck Institute for Chemical Physics of Solids

Are wind farms slowing each other down?

image: Not always equally powerful: Wind farms can slow each other down.

Image: 
Photo: Nicholas Doherty via Unsplash

The expansion of wind energy in the German Bight and the Baltic Sea has accelerated enormously in recent years. The first systems went into operation in 2008. Today, wind turbines with an output of around 8,000 megawatts rotate in German waters, which corresponds to around eight nuclear power plants. But space is limited. For this reason, wind farms are sometimes built very close to one another. A team led by Dr. Naveed Akhtar from Helmholtz-Zentrum Hereon has found that wind speeds downstream of a wind farm are significantly slowed. As the researchers now write in the journal Scientific Reports, this braking effect results in an astonishingly large-scale pattern of low wind speeds that is noticeable in mean wind speeds. On average, these wakes extend 35 to 40 kilometers - in certain weather conditions even up to 100 kilometers. The output of a neighboring wind farm can thus be reduced by 20 to 25 percent, which ultimately leads to economic consequences. If wind farms are planned close together, these wake effects need to be considered in the future.

Combination of climate and wind farm data

With their study, Naveed Akhtar, an expert in regional climate modeling, and his colleagues took a look into the future and assessed the wind characteristics for a medium-term target state of offshore expansion. They used the computer model COSMO-CLM, which is also used by weather services and which is able to resolve weather situations regionally in detail - in this case for the entire North Sea - and combined it with the characteristics of the future wind farms: their area and the number and size of the turbines. They used the wind farm planning for the North Sea from 2015 as a basis. This contains wind farms, some of which have not yet been built.

Braking effect especially in stable weather conditions

Naveed Akhtar used the COSMO model to calculate the wind speed over the North Sea for the period from 2008 to 2017, covering a range of different weather conditions. The results clearly show that we will face a large-scale pattern of reduced wind speeds, which extends furthest during stable weather conditions, typically the case in March and April. In stormy times, on the other hand - especially in November and December - the atmosphere is so well mixed that the wind farm wake effects are relatively small. In order to verify the model data, the team compared the simulations with wind measurements from 2008 to 2017. They used measurements recorded on two research platforms in the North Sea and data from wind measurement flights that colleagues from TU Braunschweig performed over existing wind farms. The comparison shows that the Hereon researchers are correctly simulating the wind wakes. What is special about the work is that, for the first time, a full ten-year period has been calculated for the entire North Sea. "Conventional flow models for analyzing wind farms have a very high spatial resolution, but only look at a wind field over a short period of time," says Akhtar. "In addition, these cannot be used to determine how a wind farm changes the air flow over a large area."

While the group has mainly dealt with the extent to which the wind farms influence each other in their current work, they intend to investigate in the near future what influence the reduced wind speeds have on life in the sea. Wind and waves mix the sea. This changes the salt and oxygen content of the water, its temperature and the amount of nutrients in certain water depths. Naveed Akhtar: "We would now like to find out how the reduced mixing affects the marine ecosystem."

Credit: 
Helmholtz-Zentrum Hereon

Polar vortex, winter heat may change bird populations

image: Red-breasted nuthatch.

Image: 
Jeremy Cohen, UW-Madison

MADISON, Wis. -- For birds and other wildlife, winter is a time of resource scarcity. Extreme winter weather events such as a polar vortex can push some species to the edge of survival. Yet winter tends to get short shrift in climate change research, according to UW-Madison forest and wildlife ecology Professor Ben Zuckerberg.

"When we think about the impact of climate change, winter tends to be overlooked as a time of year that could have significant ecological and biological implications," says Zuckerberg. "It makes me, and my colleagues, think quite deeply about the impacts of these extreme events during this time when species are particularly vulnerable."

Zuckerberg, along with Jeremy Cohen, a former UW-Madison postdoctoral researcher now at the Yale Center for Biodiversity and Global Change, and Daniel Fink of the Cornell Lab of Ornithology, set out to learn how extreme winter cold and heat affected 41 common bird species in eastern North America. Their work, recently published in Ecography, found that individual bird species respond differently to these weather events, and extreme winter heat may lead to longer-term changes in bird populations.

The researchers analyzed extensive data submitted through eBird, a global citizen science initiative where bird watchers contribute checklists of birds observed at a specific location, date and time. They homed in on data occurring before and after a four-day-long polar vortex in January 2014 and a December 2015 heat wave. These two events were the coldest and warmest stretches observed in a decade, and each affected an area of about 2 million square kilometers in the midwestern and northeastern U.S. and Canada. The researchers also analyzed temperature and land cover data.

That's a lot of data. Twenty years ago, working with this amount of diverse data would not have been possible. However, recent advances in environmental data science have enabled ecologists to work at scales that reflect the vast regions and species affected by climate change.

With colleagues at the Cornell Lab of Ornithology, Cohen and Zuckerberg used machine learning, an advanced computing technique used to gain insights from large data sets, to predict the abundance and occurrence of bird species starting 10 days before the onset of each extreme weather event, until 30 days after the event. They compared the data to identical time periods in 14 recent winters.

During the polar vortex, bird abundance -- the number of individual birds of a species observed in the study area -- decreased five-to-10 days following the event and returned to previous levels 20 days afterward, ruling out mortality as the reason for the decline. However, the prevalence of species across an entire region, or occurrence, was relatively stable. This result surprised the researchers, as local abundance and regional occurrence are usually closely linked.

"This data suggests some birds may have abandoned the area, moved south, and came back," says Cohen. "Alternatively, some birds could have laid low because of the stress caused by the cold, and then returned to earlier activity levels."

The data following the winter heat wave was even more surprising. Across most bird species, abundance and occurrence increased, and this trend persisted for 30 days following this extreme weather event. This may have been due to short-distance migrants moving into the area, and staying there, in response to warm weather.

"I was not expecting this impact of the winter heat wave effect," says Zuckerberg. "What I found to be most intriguing was the lasting and dramatic response."

Over the past 30 to 40 years, bird species have slowly moved northward. Ecologists believe this decades-long process is a response to climate change. However, Zuckerberg says that winter heat waves could speed up this geographic movement, and without time to gradually adapt, some birds may be more vulnerable to extreme heat or cold in new areas.

At the species level, Zuckerberg and Cohen found that warm-adapted and small bodied birds were more sensitive to both extreme heat and cold. Cold-adapted species were far more resilient.

The researchers also observed species-level differences related to habitat requirements. Waterbird species occurred more often after the polar vortex and less often after the winter heat wave -- the opposite of what was observed, on average, for other species. According to Cohen, open water bodies would have frozen during the polar vortex, perhaps causing species that overwinter at high latitudes to head south and seek more favorable habitat in the study area.

To help birds and other wildlife cope with extreme winter weather, wildlife managers can create sheltered habitats and other pockets of refuge. Continuous monitoring of bird activity and weather variability can help conservationists and policymakers predict which species will be most vulnerable to climate change over the next decade.

The confluence of environmental data science and citizen science is making this kind of prediction possible. As Zuckerberg puts it, "The amount of data we are getting through public participation in science has opened up new areas of exploration at a time that, frankly, we really need it, because climate change is such a big problem."

Credit: 
University of Wisconsin-Madison

NIH researchers identify potential new antiviral drug for COVID-19

The experimental drug TEMPOL may be a promising oral antiviral treatment for COVID-19, suggests a study of cell cultures by researchers at the National Institutes of Health. TEMPOL can limit SARS-CoV-2 infection by impairing the activity of a viral enzyme called RNA replicase. The work was led by researchers at NIH's Eunice Kennedy Shriver National Institute of Child Health and Human Development (NICHD). The study appears in Science.

"We urgently need additional effective, accessible treatments for COVID-19," said Diana W. Bianchi, M.D., NICHD Director. "An oral drug that prevents SARS-CoV-2 from replicating would be an important tool for reducing the severity of the disease."

The study team was led by Tracey A. Rouault, M.D., head of the NICHD Section on Human Iron Metabolism. The team discovered TEMPOL's effectiveness while evaluating a more basic question: how the virus uses its RNA replicase, the enzyme that allows SARS-CoV-2 to replicate its genome and make copies of itself once inside a cell.

Researchers tested whether the RNA replicase (specifically the enzyme's nsp12 subunit) requires iron-sulfur clusters for structural support. Their findings indicate that the SARS-CoV-2 RNA replicase requires two iron-sulfur clusters to function optimally. Earlier studies had mistaken these iron-sulfur cluster binding sites for zinc-binding sites, likely because iron-sulfur clusters degrade easily under standard experimental conditions.

Identifying this characteristic of the RNA replicase also enables researchers to exploit a weakness in the virus. TEMPOL can degrade iron-sulfur clusters, and previous research from the Rouault Lab has shown the drug may be effective in other diseases that involve iron-sulfur clusters. In cell culture experiments with live SARS-CoV-2 virus, the study team found that the drug can inhibit viral replication.

Based on previous animal studies of TEMPOL in other diseases, the study authors noted that the TEMPOL doses used in their antiviral experiments could likely be achieved in tissues that are primary targets for the virus, such as the salivary glands and the lungs.

"Given TEMPOL's safety profile and the dosage considered therapeutic in our study, we are hopeful," said Dr. Rouault. "However, clinical studies are needed to determine if the drug is effective in patients, particularly early in the disease course when the virus begins to replicate."

The study team plans on conducting additional animal studies and will seek opportunities to evaluate TEMPOL in a clinical study of COVID-19.

Credit: 
NIH/Eunice Kennedy Shriver National Institute of Child Health and Human Development

Genetic base editing treats sickle cell disease in mice

Sickle cell disease (SCD) is the most common deadly genetic disorder, affecting more than 300,000 newborns worldwide each year. It leads to chronic pain, organ failure, and early death in patients. A team led by researchers at the Broad Institute of MIT and Harvard and St. Jude Children's Research Hospital has now demonstrated a base editing approach that efficiently corrects the mutation underlying SCD in patient blood stem cells and in mice. This gene editing treatment rescued the disease symptoms in animal models, enabling the long-lasting production of healthy blood cells.

The root of SCD is two mutated copies of the hemoglobin gene, HBB, which cause red blood cells to transform from a circular disc into a sickle shape -- setting off a chain of events leading to organ damage, recurrent pain, and early mortality. In this study, the researchers used a molecular technology called base editing to directly convert a single letter of pathogenic DNA into a harmless genetic variant of HBB in human blood-producing cells and in a mouse model of SCD.

"We were able to correct the disease-causing variant in both cell and animal models using a customized base editor, without requiring double-stranded DNA breaks or inserting new segments of DNA into the genome," says co-senior author David Liu, Richard Merkin Professor and director of the Merkin Institute of Transformative Technologies in Healthcare at the Broad Institute, professor at Harvard University, and Howard Hughes Medical Institute investigator. "This was a major team effort, and our hope is that base editing will provide a promising basis for a therapeutic strategy down the road for sickle cell disease."

"Our study illustrates the power and excitement of multidisciplinary collaborations for creating novel mechanism-based cures for genetic diseases," says co-senior author Mitchell Weiss, chair of the St. Jude Department of Hematology. "In particular, we combined expertise in protein engineering, base editing, and red blood cell biology to create a novel approach for treating and possibly curing sickle cell disease."

The work appeared in Nature, led by co-first authors Gregory Newby at the Broad Institute and Jonathan Yen, Kaitly Woodard, and Thiyagaraj Mayuranathan at St. Jude Children's Research Hospital.

An improved approach

Currently, the only established method to cure SCD is a bone marrow transplant -- but finding an appropriate bone marrow donor for a patient is difficult, and patients who undergo a transplant can suffer dangerous side effects. While there are a number of gene editing treatments under development that avoid these risks by modifying a patient's own bone marrow directly, these experimental therapies rely on introducing new DNA or cleaving genomic DNA in cells, which can also cause adverse effects.

For this work, the research team used what's called an "adenine base editor," a molecular tool developed in Liu's lab that can target a specific gene sequence and convert the DNA base pair A•T to G•C, altering a gene at the level of a single pair of nucleotides. The base editor used in this study consists of a laboratory-evolved Cas9 variant -- a CRISPR-associated protein that positions the base editor at the mutated HBB site in the genome -- and a laboratory-evolved enzyme that converts the target A to a base that pairs like G. The base editor also guides the cell to repair the complementary DNA strand, completing the conversion of the target A•T base pair to G•C.

The single DNA mutation underlying sickle cell disease is an A in the healthy hemoglobin gene that has been altered to a T. While an adenine base editor cannot reverse this change, it can convert that T to a C (by editing the A paired with it on the opposite strand). This edit transforms the dangerous form of hemoglobin into a naturally occurring, non-pathogenic variant called "hemoglobin Makassar."
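
As a rough illustration of the codon change described above (a sketch using the standard beta-globin codon 6 sequences; this is not code or data from the study):

    # Net effect of the edit on the sense-strand codon of HBB.
    CODON_TABLE = {"GAG": "Glu (healthy)", "GTG": "Val (sickle)", "GCG": "Ala (Makassar)"}

    def edit_sense_strand_T_to_C(codon, position):
        # Editing the paired A on the opposite strand shows up as T -> C here.
        assert codon[position] == "T"
        return codon[:position] + "C" + codon[position + 1:]

    sickle_codon = "GTG"                               # GAG (Glu) mutated to GTG (Val)
    edited_codon = edit_sense_strand_T_to_C(sickle_codon, 1)
    print(sickle_codon, "->", edited_codon)            # GTG -> GCG
    print(CODON_TABLE[edited_codon])                   # Ala (Makassar), non-pathogenic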

Editing in models

The team first introduced the adenine base editor into isolated blood stem cells from human SCD patients. In these experiments, up to 80 percent of the pathogenic hemoglobin variants were successfully edited into the benign Makassar variant, with minimal instances of the editor causing undesired changes to hemoglobin.

The researchers transferred these edited blood stem cells into a mouse model to observe how they functioned in live animals. After 16 weeks, the edited cells still produced healthy blood cells.

"Sixteen weeks after transplantation, the total frequency of the edit maintained in stem cells -- which could contain edits in both copies of their hemoglobin gene, in only one copy, or in neither copy -- was 68 percent. And we were particularly excited to see that nearly 90 percent of cells contained at least one edited copy of hemoglobin," explains Newby. "Even those cells with just one edited copy appeared to be protected from sickling."

In a separate set of experiments, the researchers took blood stem cells from mice harboring the human sickle cell disease variant, edited them, and transplanted the edited cells into another set of recipient mice. Control mice transplanted with unedited cells showed typical symptoms: sickled red blood cells, consequences of short red blood cell lifetime, and an enlarged spleen. In contrast, mice transplanted with edited cells were improved compared to controls by every tested disease metric, with all measured blood parameters observed at levels nearly indistinguishable from healthy animals.

Finally, to confirm durable editing of the target blood stem cells, the researchers performed a secondary transplant, taking bone marrow from mice that had received edited cells 16 weeks previously and transferring the blood stem cells into a new set of mice. In the new animal cohort, edited cells continued to perform similarly to healthy blood stem cells, confirming that the effects of base editing were long-lasting. The team determined that editing at least 20 percent of pathogenic hemoglobin genes was sufficient to maintain blood metrics in the mice at healthy levels.

"In these final experimental phases, we demonstrated an editing threshold of about 20 percent that is necessary to mitigate this disease in mice. This base editing strategy is efficient enough to far exceed that benchmark," explains Liu. "The approach offers promise as the basis of a potential one-time treatment, or perhaps even a one-time cure, for sickle cell disease."

The researchers and other partners are working to move this concept safely and effectively into additional preclinical studies, with the eventual goal of reaching patients.

Credit: 
Broad Institute of MIT and Harvard

Cultural, belief system data can inform gray wolf recovery efforts in US

Image: North Dakota, Wyoming and Montana have more residents with traditionalist values, said Michael Manfredo, lead investigator and head of CSU's Department of Human Dimensions of Natural Resources.

Image credit: CSU Photography

Humans regularly exert a powerful influence on the survival and persistence of species, yet social-science information is used only sporadically in conservation decisions.

Researchers at Colorado State University and The Ohio State University have created an index depicting the mix of social values among people across all 50 states, providing data that can be useful for wildlife conservation policy and management.

As a specific illustration, the research team found a supportive social context for gray wolf reintroduction in Colorado. Last fall, citizens in the state voted by ballot initiative to mandate the reintroduction of gray wolves. The data and maps in the study reveal that Colorado's social environment is far more conducive to wolf recovery than states like Montana and Idaho, which currently have state legislative efforts to reduce wolf populations.

The study, "Bringing social values to wildlife conservation decisions," was published online June 3 in Frontiers in Ecology and the Environment.

Michael Manfredo, the study's lead investigator and head of CSU's Department of Human Dimensions of Natural Resources, said the research reveals how people fall into the categories of traditionalists -- those who believe animals should be used for purposes that benefit humans, like hunting and medical research -- or mutualists, those who believe that animals deserve the same rights as humans. Mutualists view animals as companions and part of their social networks and project human traits onto animals.

"You can see what the 'flavor' is of the state or county, and what policies or initiatives people are likely to support or be opposed to," said Manfredo.

Highly modernized states, including California, Nevada, Colorado and Washington, are leaning more toward mutualism, according to the research. North Dakota, Wyoming and Montana have more residents with traditionalist values.

Tara Teel, CSU professor and a lead author on the study, said the data is relevant to other timely and emerging topics and drills down to the county level.

"This study builds on a 50-state study on America's Wildlife Values -- the largest and first of its kind," said Teel. "It is one of the first broadly accessible social science datasets to inform wildlife conservation efforts across the United States."

Data provides insight on conservation decisions

While the results are particularly relevant for the U.S., the technique used by the team could be applied globally to better account for human factors in conservation decisions addressing issues like climate change, species reintroductions and human-wildlife conflict.

The research team applied a sociocultural index, built from a survey of 46,894 U.S. residents conducted from 2017 to 2018, to inform decision-making through an understanding of public values toward wildlife. Scientists measured mutualist and traditionalist values, which have previously been shown to be highly predictive of attitudes on a wide range of policy issues. The team subsequently developed state and county maps.
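
As a purely hypothetical sketch of how such survey responses might be rolled up into county-level value profiles (the column names, scores and classification rule below are invented, not the study's actual index):

    import pandas as pd

    # Assumed layout: each row is one respondent, with a county FIPS code and
    # mean scores on mutualism and traditionalism value scales.
    responses = pd.DataFrame({
        "county_fips": ["08069", "08069", "30031", "30031"],
        "mutualism_score": [5.8, 6.1, 3.2, 2.9],
        "traditionalism_score": [2.4, 3.0, 5.9, 6.3],
    })

    county_profile = (
        responses.groupby("county_fips")[["mutualism_score", "traditionalism_score"]]
        .mean()
    )
    # A simple (assumed) rule: whichever dimension scores higher dominates the county.
    county_profile["dominant_orientation"] = county_profile.apply(
        lambda row: "mutualist" if row["mutualism_score"] > row["traditionalism_score"]
        else "traditionalist",
        axis=1,
    )
    print(county_profile)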

'Values don't change quickly'

"Previous research has found that there is a strong relationship between the laws passed in any given state and people's values," said Manfredo. "In the last two decades, there's been a substantial change in how people value wildlife," he added.

"Values don't change quickly," he said. "They're not like how a person feels about political issues. Values are formed in a person's youth and stay with you forever."

Manfredo said data showed that as far back as the early 2000s, people in Colorado were in favor of wolf reintroduction. But in places like Jackson County, a sub-alpine valley in northern Colorado, people were not so excited about wolves.

"Society is changing, and there's been a backlash from traditionalists who feel that their values and their voices in decision-making are being threatened," he said. "Ultimately, state and local agencies need to pay more attention to constituents. That means everybody in the state, not just a segment or a particular county. Policies need to fall more in line with the values of the public."

Credit: 
Colorado State University