Comb-like etching regulated growth for large-area graphene nanoribbon arrays

image: Graphene morphology regulated by the comb-like etching pattern.

Image: 
@Science China Press

The rapid development of silicon-based transistors has pushed their integration close to the limit of Moore's law. Graphene is expected to become the next generation of mainstream chip materials owing to its ultra-high carrier mobility. However, it is difficult to obtain a high on/off current ratio in intrinsic graphene-based transistors because graphene lacks a bandgap. Graphene nanoribbons, which possess a tunable bandgap and unique optoelectrical properties, have therefore attracted extensive attention and exploration.

At present, the preparation of graphene nanoribbons remains underdeveloped. Common strategies include cutting carbon materials (graphene films, carbon nanotubes, or graphite) and direct growth on specific substrate surfaces. These methods are inspiring but narrow in scope: they require additional steps to prepare precursors or to pre-functionalize substrates as templates. Lengths limited to hundreds of nanometers, alignment problems, and cumbersome preparation have retarded the integration of graphene nanoribbons into optoelectronic devices. A robust nanotechnology for fabricating graphene nanoribbons (GNRs) with mass production, high quality, and uniform orientation is therefore highly desired for commercialization.

In a new research article published in the Beijing-based National Science Review, scientists at the Institute of Chemistry, Chinese Academy of Sciences, Beijing, China, and at the Applied Mechanics Laboratory, Department of Engineering Mechanics and Center for Nano and Micro Mechanics, Tsinghua University, Beijing, China, developed a comb-like etching regulated growth process to fabricate graphene nanoribbon arrays in situ in a template-free CVD system. They also disclosed the growth mechanism through first-principles calculations based on density functional theory (DFT).

"The introduction of a tenuous hydrogen flow is the key factor for realizing the comb-like pattern, which promises to achieve the selective growth of graphene nanoribbons in place of conventional graphene film growth on the liquid Cu surface (Figure 1)," they add. "The distinctive design allows for precise control over the width, edge structure, and orientation of graphene nanoribbons."

To date, the mass production of graphene nanoribbons has been limited by the preparation process. "After the in-situ comb-like etching regulated growth process was completed, a large-area GNR array with a uniform orientation was well dispersed over the resolidified Cu surface," they state.

"Changing the experimental conditions allows the width and edge structure of the graphene nanoribbons to be controlled; the width decreased to 8 nm with a length of up to 3.1 μm (an aspect ratio of 387)," the scientists demonstrate. Moreover, the as-grown graphene nanoribbons have smooth, sharp edges with a zigzag motif.

Hydrogen plays a dual role in the formation of graphene by the CVD technique. "Owing to the self-limiting growth mechanism, the in-situ formation of graphene nanoribbon arrays operates through stepwise mechanisms, involving an intermediate growth-favored stage and a consecutive etching-favored stage. The conversion depends mainly on the catalytic activity of the liquid copper surface passivated by as-grown graphene," they add.

The approach is operationally simple and efficient, paving the way for the use of GNR arrays in integrated circuits.

Credit: 
Science China Press

Zinc may help with fertility during COVID-19 pandemic, researchers report

DETROIT - Wayne State University School of Medicine researchers have reported that zinc supplements for men and women attempting to conceive either naturally or through assisted reproduction during the COVID-19 pandemic may prevent mitochondrial damage in young egg and sperm cells, as well as enhance immunity against the virus.

In "Potential Role of Zinc in the COVID-19 Disease Process and its Probable Impact on Reproduction," published in Reproductive Sciences, Husam Abu-Soud, Ph.D., associate professor of Obstetrics and Gynecology and the C.S. Mott Center for Growth and Development, said that in addition to benefiting couples attempting to conceive during the pandemic, zinc supplementation of up to a maximum of 50 mg per day for all adults could be beneficial in enhancing immunity and fighting the viral disease process of COVID-19.

Dr. Abu-Soud and co-authors Ramya Sethuram, Reproductive Endocrinology and Infertility fellow, and medical student David Bai, reviewed the pathophysiology of COVID-19, particularly in relation to reproductive function. They found that zinc depletion in connection with the cytokine storm - the overreaction of the immune system that causes inflammation, tissue damage and possible organ failure in fighting COVID-19 - can cause mitochondrial damage and an accumulation of reactive oxygen species in the immature egg and sperm. The resulting damage could impair reproduction and prevent conception.

Zinc has beneficial effects as an antioxidant and anti-inflammatory agent, and could prevent or mitigate the damage in the egg and sperm cells that result from the body's immune reaction to the virus, Dr. Abu-Soud said. The use of zinc could improve embryo quality and potentially lessen some pregnancy complications.

He also noted that zinc can be beneficial to the general population in enhancing immunity and fighting the viral disease process. The element works by combating oxidative cell damage.

Zinc alone may be insufficient to reverse the process once widespread oxidative cell damage has occurred. However, if the supplement is administered to those infected with COVID-19 before the cytokine storm phase, zinc may assist in ameliorating disease progression in the mild and early phases by suppressing viral replication and preventing cell damage as a pro-antioxidant, the researchers said.

Credit: 
Wayne State University - Office of the Vice President for Research

'Audeo' teaches artificial intelligence to play the piano

Anyone who's been to a concert knows that something magical happens between the performers and their instruments. It transforms music from being just "notes on a page" to a satisfying experience.

A University of Washington team wondered if artificial intelligence could recreate that delight using only visual cues -- a silent, top-down video of someone playing the piano. The researchers used machine learning to create a system, called Audeo, that creates audio from silent piano performances. When the group tested the music Audeo created with music-recognition apps, such as SoundHound, the apps correctly identified the piece Audeo played about 86% of the time. For comparison, these apps identified the piece in the audio tracks from the source videos 93% of the time.

The researchers presented Audeo Dec. 8 at the NeurIPS 2020 conference.

"To create music that sounds like it could be played in a musical performance was previously believed to be impossible," said senior author Eli Shlizerman, an assistant professor in both the applied mathematics and the electrical and computer engineering departments. "An algorithm needs to figure out the cues, or 'features,' in the video frames that are related to generating music, and it needs to 'imagine' the sound that's happening in between the video frames. It requires a system that is both precise and imaginative. The fact that we achieved music that sounded pretty good was a surprise."

Audeo uses a series of steps to decode what's happening in the video and then translate it into music. First, it has to detect which keys are pressed in each video frame to create a diagram over time. Then it needs to translate that diagram into something that a music synthesizer would actually recognize as a sound a piano would make. This second step cleans up the data and adds in more information, such as how strongly each key is pressed and for how long.

"If we attempt to synthesize music from the first step alone, we would find the quality of the music to be unsatisfactory," Shlizerman said. "The second step is like how a teacher goes over a student composer's music and helps enhance it."
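The two-stage idea described above can be sketched in miniature: first a binary piano-roll (which keys look pressed in each frame), then a cleanup pass that merges consecutive pressed frames into note events with durations. The function names, the 0.5 detection threshold, and the 25 fps frame rate below are illustrative assumptions, not the authors' actual implementation.

```python
# Hypothetical sketch of a two-stage video-to-music pipeline in the
# spirit of Audeo; names and thresholds are assumptions for illustration.

def detect_pressed_keys(frame_scores):
    """Stage 1: threshold per-key confidence scores into a binary
    piano-roll (one row per video frame, one column per key)."""
    return [[1 if score > 0.5 else 0 for score in frame]
            for frame in frame_scores]

def roll_to_events(roll, fps=25):
    """Stage 2: merge runs of pressed frames for each key into note
    events with onset times and durations, which a synthesizer could
    then render as sound."""
    events = []
    n_keys = len(roll[0]) if roll else 0
    for key in range(n_keys):
        start = None
        for t, frame in enumerate(roll):
            if frame[key] and start is None:
                start = t                       # note begins
            elif not frame[key] and start is not None:
                events.append({"key": key, "onset_s": start / fps,
                               "duration_s": (t - start) / fps})
                start = None                    # note ends
        if start is not None:                   # note held to final frame
            events.append({"key": key, "onset_s": start / fps,
                           "duration_s": (len(roll) - start) / fps})
    return events
```

A real system would also estimate how strongly each key is pressed (velocity), which is part of what the press release describes the second stage adding.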

The researchers trained and tested the system using YouTube videos of the pianist Paul Barton. The training consisted of about 172,000 video frames of Barton playing music from well-known classical composers, such as Bach and Mozart. Then they tested Audeo with almost 19,000 frames of Barton playing different music from these composers and others, such as Scott Joplin.

Once Audeo has generated a transcript of the music, it's time to give it to a synthesizer that can translate it into sound. Every synthesizer will make the music sound a little different -- this is similar to changing the "instrument" setting on an electric keyboard. For this study, the researchers used two different synthesizers.

"Fluidsynth makes synthesizer piano sounds that we are familiar with. These are somewhat mechanical-sounding but pretty accurate," Shlizerman said. "We also used PerfNet, a new AI synthesizer that generates richer and more expressive music. But it also generates more noise."

Audeo was trained and tested only on Paul Barton's piano videos. Future research is needed to see how well it could transcribe music for any musician or piano, Shlizerman said.

"The goal of this study was to see if artificial intelligence could generate music that was played by a pianist in a video recording -- though we were not aiming to replicate Paul Barton because he is such a virtuoso," Shlizerman said. "We hope that our study enables novel ways to interact with music. For example, one future application is that Audeo can be extended to a virtual piano with a camera recording just a person's hands. Also, by placing a camera on top of a real piano, Audeo could potentially assist in new ways of teaching students how to play."

Credit: 
University of Washington

California's rainy season starting nearly a month later than it did 60 years ago

WASHINGTON--The start of California's annual rainy season has been pushed back from November to December, prolonging the state's increasingly destructive wildfire season by nearly a month, according to new research. The study cannot confirm the shift is connected to climate change, but the results are consistent with climate models that predict drier autumns for California in a warming climate, according to the authors.

Wildfires can occur at any time in California, but fires typically burn from May through October, when the state is in its dry season. The start of the rainy season, historically in November, ends wildfire season as plants become too moist to burn.

California's rainy season has been starting progressively later in recent decades and climate scientists have projected it will get shorter as the climate warms. In the new study, researchers analyzed rainfall and weather data in California over the past six decades. The results show the official onset of California's rainy season is 27 days later than it was in the 1960s and the rain that does fall is being concentrated during the months of January and February.

"What we've shown is that this will not just happen in the future, it's happening already," said Jelena Luković, a climate scientist at the University of Belgrade in Serbia and lead author of the new study. "The onset of the rainy season has been progressively delayed since the 1960s, and as a result the precipitation season has become shorter and sharper in California."

The new study in AGU's journal Geophysical Research Letters, which publishes high-impact, short-format reports with immediate implications spanning all Earth and space sciences, is the first to quantify just how much later the rainy season now begins.

The results suggest California's wildfire season, which has been getting progressively worse due to human-caused climate change, will last even longer in the years to come and Californians can expect to see more fires flaring up in the month of November. 2020 was California's worst wildfire season on record, with nearly 10,000 fires burning more than 4.2 million acres of land.

An extended dry season means there is more overlap between wildfire season and the influx of Santa Ana winds that bring hot, dry weather to California in the fall. These winds can fan the flames of wildfires and increase the risk of late-season fires getting out of hand.

"It's not just a matter of making the vegetation drier and keeping all else equal," said Daniel Swain, a climate scientist at the University of California Los Angeles who was not involved in the study. "You're also increasing the number of opportunities for extremely dry vegetation and extremely strong offshore winds to coincide."

The delay in the start of the rainy season is likely due to changes in the atmospheric circulation patterns that bring precipitation to the West Coast, according to the study authors. They found the atmospheric circulation pattern that dominates California during the summer is extending into fall across the north Pacific Ocean. This change is bringing more rain to the states of Washington and Oregon and leaving California high and dry.

The changes mean Californians will need to better plan how they manage water resources and energy production - a longer dry season means more irrigation is needed for crops in an already water-stressed state.

"All water-sensitive stakeholders should have this information and plan their management accordingly," Luković said.

Credit: 
American Geophysical Union

Personalized screening to identify teens with high suicide risk

ANN ARBOR, Mich. - The suicide rate among American adolescents has risen drastically over the last decade, but many at-risk youths aren't receiving the mental health services they need.

In fact, one of the greatest challenges is identifying the young people who need the most help.

Now, researchers have developed a personalized system to better detect suicidal youths. The novel, universal screening tool helps caregivers reliably predict an adolescent's suicide risk - alerting them to which ones need follow-up interventions - according to Michigan Medicine-led findings published in JAMA Psychiatry.

"Too many young people are dying by suicide and many at high risk go completely unrecognized and untreated," says lead author Cheryl King, Ph.D., a professor, clinical child psychologist, and director of the Youth and Young Adult Depression and Suicide Prevention Research Program in the Department of Psychiatry at Michigan Medicine.

"About half of the youth who die by suicide have never received any mental health services and some die on their first suicide attempt. We saw an urgent need to improve proactive, universal suicide screening of young people."

The screening tool, called the Computerized Adaptive Screen for Suicidal Youth (CASSY), is designed to be used in emergency rooms through a brief and efficient system that doesn't disrupt care. When an adolescent or teen is admitted for any reason - whether it's a psychiatric complaint or something unrelated like a sports injury - they complete a questionnaire on a digital device.

Follow-up questions and the number of questions are based on their answers so that the screening is tailored to the individual patient.

Adolescents are asked about suicidal thoughts but also other factors that may put them at risk, such as sleep disturbance, trouble concentrating, agitation, depression and hopelessness, and issues with family and school connectedness. The combination of risk factors is what determines a score for their suicidal risk level.
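The adaptive logic described above - follow-up questions triggered by earlier answers, with several weighted risk factors combining into one score - can be sketched roughly as below. The item names, weights, and the 0.5 follow-up trigger are hypothetical placeholders, not the published CASSY model.

```python
# Illustrative sketch of adaptive screening; all items, weights, and
# thresholds are invented for illustration and are NOT the CASSY model.

BASE_ITEMS = ["suicidal_thoughts", "depression", "hopelessness"]
FOLLOW_UPS = {"suicidal_thoughts": ["plan", "prior_attempt"],
              "depression": ["sleep_disturbance", "concentration"]}
WEIGHTS = {"suicidal_thoughts": 3.0, "plan": 2.0, "prior_attempt": 2.5,
           "depression": 1.0, "sleep_disturbance": 0.5,
           "concentration": 0.5, "hopelessness": 1.5}

def adaptive_screen(answer):
    """answer(item) -> severity in [0, 1]. Follow-up items are asked
    only when a base item scores above 0.5, so both the questions asked
    and their number adapt to the individual patient. Returns a
    weighted risk score and the list of items actually asked."""
    asked, score = [], 0.0
    for item in BASE_ITEMS:
        a = answer(item)
        asked.append(item)
        score += WEIGHTS[item] * a
        if a > 0.5:  # elevated answer triggers targeted follow-ups
            for fu in FOLLOW_UPS.get(item, []):
                asked.append(fu)
                score += WEIGHTS[fu] * answer(fu)
    return score, asked
```

In the real tool the score maps onto calibrated probability thresholds for mild through high risk, which this toy weighted sum does not attempt to reproduce.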

While suicide screening tools are already in use, King says, previous research indicates that many young people at high risk still aren't detected, or that too many are flagged as at risk, including many "false positives."

"Different combinations of risk factors can place youth at risk," says King, who is also a child and adolescent psychologist at Michigan Medicine C.S. Mott Children's Hospital and member of the University of Michigan Injury Prevention Center. "If we screen only for suicidal thoughts, we will miss some high risk adolescents.

"There are many reasons young people may not share suicidal thoughts, possibly because they're ashamed, they aren't experiencing the thoughts at the time of screening, or someone reacted in a way they didn't feel was helpful when they shared suicidal thoughts or sensitive information in the past."

The CASSY system, she says, provides the healthcare provider in emergency services with information about the probability of a future suicide attempt. It offers thresholds for identifying different levels of risk, ranging from mild to high.

"This screening tool has the potential to be a step forward in our effort to improve clinical care models to adequately meet the needs of youth mental health," King says.

Suicide is the second-leading cause of death among U.S. teens, and the suicide rate among adolescents in the U.S. has grown by 62% since 2000. In 2018, the U.S. recorded its highest annual number of adolescent suicide deaths, with 1,750 young people aged 12-17 dying by suicide.

The algorithm for the computerized screening tool was based on data from multiple centers that participated in the Emergency Department Screen for Teens at Risk for Suicide, which is funded by the National Institutes of Mental Health.

Emergency departments are well suited for suicide risk screening, King notes, since nearly 19% of U.S. adolescents visit the ED over the course of a year. ED visits for youth suicide risk and self-harm have also recently doubled, and the ED is a common point of access for health services, she says.

The new study included two cohorts of adolescents aged 12-17 who visited participating emergency departments. The CASSY screening tool was developed in the first cohort of 2,075 youths and validated in a second, independent cohort of 2,754 youths.

In this second cohort, a total of 165 adolescents (6%) made at least one suicide attempt within three months, and CASSY predicted suicide-attempt risk over that period with more than 88% accuracy.

King and her study colleagues are hopeful that many emergency departments nationwide will consider incorporating this personalized screening tool into their care models to improve suicide risk identification and treatment planning.

The development comes as national and local experts have shared concerns about isolation from the COVID-19 pandemic exacerbating mental health issues for teens at highest risk of anxiety, depression and suicidal thoughts.

"Improving suicide risk detection through effective screening has the potential to facilitate treatment, reduce morbidity and prevent death among teens and young people," King says.

Credit: 
Michigan Medicine - University of Michigan

Increased risk of dying from COVID for people with severe mental disorders

image: Martin Maripuu,
Associate professor and consultant psychiatrist
Department of Clinical Sciences, Umeå University/
Östersund Hospital, Sweden

Image: 
M. Maripuu

People with severe mental disorders have a significantly increased risk of dying from COVID-19. This has been shown in a new study from Umeå University and Karolinska Institutet in Sweden. Among the elderly, the proportion of deaths due to COVID-19 was almost fourfold for those with severe mental disorders compared to people of the same age without mental illness.

"We see a high excess mortality due to COVID-19 among the elderly with severe mental disorders, which gives us reason to consider whether this group should be given priority for vaccines," says Martin Maripuu, associate professor at Umeå University.

In the current study, the researchers examined data covering the entire Swedish population over the age of 20 during the period from 11 March to 15 June 2020. Among citizens with a severe mental disorder, 130 people died from COVID-19 during this period, corresponding to 0.1 percent of the group. Among people who had not been diagnosed with a severe mental disorder, the mortality rate was almost half that, at 0.06 percent.

Above all, after the age of 60, people with severe mental disorders had higher excess mortality compared with the general population of the same age. In the age group 60-79 years, death from COVID-19 was almost four times as common among people with a severe mental disorder.

In the study, severe mental disorder referred to psychotic disorders, such as schizophrenia, as well as bipolar disorder. Depression and anxiety were not included in the term, although these conditions can also be severe.

As to what exactly causes the excess mortality in COVID-19 among people with severe mental disorders, the study itself provides no answer.

"It might be that severe mental disorders lead to premature biological ageing, that the disease impairs health and the immune system in general, or that this group has other risk factors such as obesity. It is always important to address both the mental and physical health problems of people with these disorders," says Martin Maripuu.

In total, almost eight million individuals formed the basis for the study.

Credit: 
Umea University

Marmoset monkeys eavesdrop and understand conversations between other marmosets

image: Marmoset monkeys are not only passive observers of third-party interactions; they also interpret them.

Image: 
Judith M. Burkart, UZH

Humans continuously observe and evaluate interactions between third parties to decide with whom to interact in the future. But it is difficult to measure what information animals gain when they eavesdrop on vocal interactions between conspecifics: If they do understand such conversations, they do not necessarily exhibit behavioral expressions that can be easily observed. To overcome this hurdle, anthropologists from the University of Zurich created a study combining call simulations, thermography methods and behavioral preference measures.

Using thermal imaging, the researchers were able to non-invasively measure temperature changes in the faces of marmoset monkeys to quantify subtle emotional responses. "We were able to use this technique to show that the marmosets did not perceive the vocal interactions between conspecifics as the mere sum of the single call elements but rather perceived them holistically, as a conversation," says first author Rahel Brügger, PhD candidate at the Department of Anthropology of the University of Zurich.

Distinguishing dialogues from monologues

An animal experiencing an increase in emotional arousal will show a drop in facial surface temperature especially in the most exposed regions, namely the nose. Measuring the emitted infrared radiation by means of thermography makes it possible to record these changes. For their study the researchers used playbacks of vocal exchanges between marmosets as well as calls of individual animals who were not involved in an interaction. They played the corresponding playbacks from a hidden loudspeaker and used thermography to measure the monkeys' reactions to the various simulations. "This showed that the response to call interactions was significantly different than the response to individual calls," Brügger said. "Marmoset monkeys are thus able to distinguish a dialogue among conspecifics from a pure monologue."

Preference for cooperators

In the simulations, the researchers additionally distinguished between cooperative and competitive interactions. After the monkeys had heard the different interactions, they were given the opportunity to approach the sources of the sounds. The researchers observed that the marmosets preferred to approach the simulated conspecifics who had been involved in a cooperative interaction with a third party.

This preference fits the social system and natural behavior of these small Brazilian new world monkeys who are cooperative breeders and depend on the cooperativeness of their group members. "This study adds to the growing evidence that many animals are not only passive observers of third-party interactions, but that they also interpret them," concludes last author and professor of anthropology at UZH, Judith Burkart. "In addition, our study shows that thermography can help unveil how these social interactions are perceived by nonverbal subjects."

Credit: 
University of Zurich

Hierarchical dynamics

image: Researchers from Freiburg have been able to analyze the precise rate of signal transfer across multiple time scales.

Image: 
Graphic: Steffen Wolf

Consider for a moment a tree swaying in the wind. How long does it take for the movement of a twig to reach the trunk of the tree? How is this motion actually transmitted through the tree? Researchers at the University of Freiburg are transferring this kind of question to the analysis of proteins - which are the molecular machinery of cells. A team of researchers led by Prof. Dr. Thorsten Hugel of the Institute of Physical Chemistry, and Dr. Steffen Wolf and Prof. Dr. Gerhard Stock of the Institute of Physics are investigating how the signals that cause structural changes in proteins travel from one site to another. They are also trying to determine how fast these mechanisms take place. Until now, researchers have been unable to analyze the precise rate of signal transfer because it involves many time scales - ranging from nanoseconds to seconds. The researchers in Freiburg, however, have now achieved such resolution by combining various experiments, simulations, and theoretical studies. They are publishing their results in the scientific journal Chemical Science.

In contrast to trees, the movements of the protein analyzed in the study, Hsp90, unfold on logarithmic time scales: every large movement takes around ten times as long as the small, individual movements that make up the larger one. Wolf explains, "For example, a twig moves on a time scale of seconds; the branch, on a scale of ten seconds; and the trunk, on a scale of 100 seconds." Using a combination of state-of-the-art experimental and theoretical methods enabled the researchers to monitor allosteric communication, in other words, to show how a reaction process in Hsp90 altered a remote protein binding site. According to Stock, the team discovered the hierarchy of dynamics on which this allosteric process unfolds, spanning timescales from nanoseconds to milliseconds and length scales from picometers to several nanometers.

What is more, the reaction process in Hsp90 is coupled with a structural change in the single amino acid Arg380. Arg380 then transmits structural information to a subdomain of the protein, and ultimately, passes it on to the protein as a whole. The resulting change in structure closes a central binding site of the protein, thereby enabling it to fulfill new functions. The University of Freiburg researchers suspect that hierarchical mechanisms similar to the one demonstrated in the Hsp90 protein are also of fundamental importance in signal transfer within other proteins. Hugel says that this could be useful for using drugs to control proteins.

Credit: 
University of Freiburg

Fish in warming Scottish seas grow faster but reach a smaller size

Researchers have found new evidence that global warming is affecting the size of commercial fish species, documenting for the first time that juvenile fish are getting bigger, as well as confirming that adult fish are getting smaller as sea temperatures rise. The findings are published in the British Ecological Society's Journal of Applied Ecology.

The researchers from the University of Aberdeen looked at four of the most important commercial fish species in the North Sea and the West of Scotland: cod, haddock, whiting and saithe. They found that juvenile fish in the North Sea and on the West of Scotland have been getting bigger while adult fish have been getting smaller. These changes in body size correlated with rising sea temperatures in both regions.

Idongesit Ikpewe, lead author of the study, said: "Both the changes in juvenile and adult size coincided with increasing sea temperature. Importantly, we observed this pattern in both the North Sea, which has warmed rapidly, and the west of Scotland, which has only experienced moderate warming. These findings suggest that even a moderate rise in sea temperature may have an impact on commercial fish species' body sizes."

In the short term, the findings may mean a reduction in commercial fishery yields, impacting an industry worth around £1.4 billion to the UK economy and one that employs over twenty-four thousand people, according to records from the House of Commons research library.

Idongesit Ikpewe said "Our findings have crucial and immediate implications for the fisheries sector. The decrease in adult body size is likely to reduce commercial fisheries yields. However, in the long-term, the faster growing and larger juveniles may compensate, to some extent, for the latter yield loss, as although the increase in length (and, therefore, weight) per individual may be small, younger fish are far more numerous. It is this trade-off that we now need to investigate."

To mitigate global warming effects and maximise sustainable yields, the authors say that temperature changes should be factored into forecasts used in fishery management.

The findings are also likely to impact marine ecosystems. "Of the four species we looked at, three (cod, whiting and saithe) are fish-eating predators towards the top end of the food chain and therefore have an important ecological role in the ecosystems they inhabit. Since predator size dictates what prey they can target, a change in the body size of these fish species may have implications for predator-prey relationships, with potential consequences for the structure of the food web," said Idongesit Ikpewe.

The maximum body size fish can reach is determined by the supply and demand of limiting resources like oxygen. Warmer water typically contains less oxygen but also increases metabolic rates and therefore demand for oxygen. Fish in warming waters may sooner reach the size where they can no longer acquire the oxygen needed for maintaining metabolic demands, thereby limiting adult body size.

It's previously been shown in laboratory experiments that ectotherms (cold-blooded animals) develop faster at warmer temperatures but reach a smaller body size. This phenomenon, termed the 'temperature size rule', has been observed in several different animals, plants and bacteria. However, until now, few empirical studies have shown a link between increased temperature and faster growth in fish.

In this study the researchers examined the body size of cod, haddock, whiting and saithe at different age groups and compared trends in juvenile length and adult length with annual bottom sea temperatures.

They obtained the data from International Bottom Trawl Surveys provided by the International Council for the Exploration of the Sea. This provided decades of fishery-independent bottom trawl data, from 1970 to 2017 for the North Sea and from 1986 to 2016 for the West of Scotland.

Although the findings provide strong empirical evidence for changes in fish size and growth rate in warming seas, the study was limited to demersal species (living close to the seabed) in areas around the UK. Other commercially important species to the UK such as mackerel and herring were not considered in this study.

"The next stage of our work is to consider the management implications based on modelling these populations." said Idongesit Ikpewe. "The idea is to work out what the size changes we observed may mean for future fish productivity and yield under different scenarios of warming".

Credit: 
British Ecological Society

Ostriches challenged by temperature fluctuations

image: Ostrich close-up

Image: 
Photo: Charlie Cornwallis

The world's largest bird, the ostrich, has problems reproducing when the temperature deviates by 5 degrees or more from the ideal temperature of 20 °C. The research, from Lund University in Sweden, is published in Nature Communications.

The results show that the females lay up to 40 percent fewer eggs if the temperature has fluctuated in the days before laying eggs. Gamete production in both males and females is also negatively affected.

"Many believe that ostriches can reproduce anywhere, but they are actually very sensitive to changes in temperature. Climate change means that temperatures will fluctuate even more, and that could be a challenge for the ostrich", says Mads Schou, researcher at Lund University who wrote the article together with colleague Charlie Cornwallis.

The threat of climate change is often described in terms of higher temperatures in the future. The researchers instead emphasize that fluctuations are a major issue that many species have to contend with already. In addition, both warmer and colder temperatures affect reproduction.

However, while more temperature variations pose a threat to ostriches, there may be hope. The results show that a small proportion of the females thrive at more extreme temperatures. The researchers believe the underlying reasons for this could be genetic.

"It would be fascinating if it were genetic, because then it can be passed on to future generations. But we need to do more research to be sure that this is the case", says Mads Schou.

The study was conducted on a farm in South Africa, where the researchers collected data from 1,300 ostriches over a period of 20 years.

Credit: 
Lund University

Scientists discover plants' roadblock to specialty oil production

image: Biochemist Xiao-Hong Yu's research on specialty fatty acid production in plants got a boost from collaborating with colleagues studying an off-switch that regulates ordinary fatty acid synthesis.

Image: 
Brookhaven National Laboratory

UPTON, NY - Hundreds of naturally occurring specialty fatty acids (building blocks of oils) have potential for use as raw materials for making lubricants, plastics, pharmaceuticals, and more--if they could be produced at large scale by crop plants. But attempts to put genes for making these specialty building blocks into crops have had the opposite effect: Seeds from plants with genes added to make specialty fatty acids accumulated dramatically less oil. No one knew why.

Now two teams of biochemists working on separate aspects of oil synthesis at the U.S. Department of Energy's Brookhaven National Laboratory have converged to discover the mechanism behind the oil-production slowdown. As described in the journal Plant Physiology, they crossbred model plants and conducted detailed biochemical-genetic analyses to demonstrate a strategy for reversing the roadblock and ramping up production. The work paves the way for making at least one industrially important specialty fatty acid in plants--and may work for many others.

"Since scientists discovered the genes responsible for making specialty fatty acids several decades ago, we've dreamed of putting them into crop plants to make abundant renewable sources of desired fatty acids," said John Shanklin, chair of Brookhaven Lab's biology department, who oversaw the project. "But we've been stymied from using them because we didn't know why they dramatically slow fatty acid and oil synthesis. A number of research groups have been trying to figure out why this happens. We have now nailed down the mechanism and opened up the possibility of achieving that dream."

Two projects converge

This study grew out of two separate projects in Shanklin's biochemistry lab. One, led by Xiao-Hong Yu and Yuanheng Cai, was focused on the challenges associated with specialized fatty acid production in plants. The other, led by Jantana Keereetaweep, was deciphering details of the biochemical feedback loop plants use to regulate ordinary fatty acid and oil production.

Through that second project, the team recently characterized a mechanism by which plants down-regulate oil synthesis when levels of a plant's regular (endogenous) fatty acids get too high.

"This system operates like a thermostat," Shanklin explained. "When heat gets above its set point, the furnace turns off."

In the case of plant oils, the key machinery that controls production is an enzyme called ACCase. It has four parts, or subunits--you can think of them as gears. As long as endogenous fatty acids are below a certain level, the four "gears" mesh and the machine cranks out fatty acids for oil production. But feeding plants additional endogenous fatty acids triggers a substitution in the machinery. One of the ACCase subunits gets replaced by a version that isn't functional. "It's like a gear with no teeth," Shanklin said. That toothless gear (known as BADC) slows the fatty acid-producing machinery until endogenous fatty acid levels fall.

In contrast, the shutdown mechanism triggered by the specialty fatty acids (ones being produced by genes artificially added to the plant) kicks in when even small amounts of the "foreign" fatty acids are present, and endogenous fatty acids aren't in excess. "Because of this, they appeared to be two separate processes," Shanklin said.

But as the two teams discussed their projects, they began to wonder if the specialty fatty acids were triggering the same off switch triggered by high levels of ordinary fatty acids. "Imagine working in the same lab on different projects and in a lab meeting one day, you look at each other and ask, 'Is it possible we're working on the same thing?'" Shanklin said.

This idea provided a way for the teams to combine efforts on a new experiment.

Testing the hypothesis

Through earlier studies, Shanklin's group had created a strain of Arabidopsis (a model plant) that has two of its BADC genes deleted. In these plants, the off switch is disabled and the plants crank out high levels of endogenous fatty acids. They wondered what would happen if the BADC genes were disabled in plants engineered to produce specialty fatty acids.

To find out, Xiao-Hong Yu and Yuanheng Cai designed a strategy to crossbreed the defective off-switch plants with an Arabidopsis strain engineered to produce hydroxy fatty acids--one of the specialty types scientists would like to produce for industrial applications. This latter strain could make the hydroxy fatty acids, but its rate of oil synthesis was only half that of normal plants and it accumulated much less oil in its seeds.

When crossing four separate genetic factors, it takes several plant generations to produce plants with the desired combination of genes: both deleted BADC genes and two genes that drive the production of hydroxy fatty acids, with two identical copies of each genetic factor.

"We were fortunate to have two very dedicated students working as interns through Brookhaven's Office of Educational Programs--Kenneth Wei, who was then at Mount Sinai High School and is now at MIT, and Elen Deng, an undergraduate at Stony Brook University," Yu said. "They did fantastic work running polymerase chain reaction (PCR) tests--similar to those used to test for COVID-19--to run detailed analyses of more than 600 plants to find those with the desired genetic makeup."
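A back-of-envelope Mendelian calculation (assuming four independently segregating loci; the numbers are illustrative, not from the paper) shows why screening on the order of 600 plants was necessary:

```python
# Chance that an F2 plant is homozygous for the desired allele at all
# four loci, assuming independent Mendelian segregation (1/4 per locus).
p_one_locus = 1 / 4
p_all_four = p_one_locus ** 4       # 1/256 per plant

n_screened = 600                    # plants genotyped by PCR
expected_hits = n_screened * p_all_four

# Probability that at least one screened plant carries the full combination.
p_at_least_one = 1 - (1 - p_all_four) ** n_screened

print(f"per-plant odds: 1 in {int(1 / p_all_four)}")
print(f"expected hits in {n_screened} plants: {expected_hits:.1f}")
print(f"P(at least one): {p_at_least_one:.2f}")
```

At roughly 1-in-256 odds per plant, screening 600 plants yields only a couple of expected hits, which is why the large-scale PCR genotyping effort mattered.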

Jantana Keereetaweep then worked with Yu to characterize those plants biochemically, to compare their rates of ACCase activity with those of the two Arabidopsis lines used to make the new genetic combinations.

The end result: Plants that had the combination of defective BADC genes and genes required for making hydroxy fatty acids produced normal levels of oil containing the specialty products. Compared with plants that had normal BADC genes, the new plants exhibited increases in the total amount of fatty acid per seed, the total seed oil content per plant, and the seed yield per plant.

"The BADC-defective plants are blind to the presence of hydroxy fatty acids and the usual response of turning off the ACCase--the oil-making machinery--is gone," Keereetaweep said.

The results prove that BADC is the mechanism for reducing ACCase activity in both scenarios--the accumulation of excess endogenous fatty acids and the presence of hydroxy fatty acids.

"We are now testing to see if this mechanism is limited to hydroxy fatty acids, or, as we suspect, common to other 'foreign' fatty acids that also reduce ACCase activity," Shanklin said. "If it's a general mechanism, it opens the possibility of realizing the dream of making additional desired specialty fatty acids in the oil-rich seeds of crop plants."

"This is a good example where a fundamental mechanistic understanding of biochemical regulation can be deployed to enable progress towards a viable, sustainable bioeconomy," Shanklin said. "We can use this approach to make valuable renewable industrial starting materials at low cost in plants from carbon dioxide and sunlight, instead of relying on petrochemicals."

Credit: 
DOE/Brookhaven National Laboratory

Drone and landsat imagery shows long-term change in vegetation cover along intermittent river

image: Mountain zebra near the Kuiseb River in the Namib Desert.

Image: 
Photo by Oliver Halsey.

In the Namib Desert in southwestern Africa, the Kuiseb River, an ephemeral river that is dry most of the year, plays a vital role in the region. It supports most of the area's vegetation, serves as a home for the local indigenous people, and acts as a migration corridor for many animals. Overall vegetation cover along the river increased by 33% between 1984 and 2019, according to a Dartmouth study published in Remote Sensing.

The study leveraged recent drone imagery and past satellite imagery to estimate past vegetation cover in this linear oasis of the Kuiseb River, a fertile area in the middle of one of the driest deserts on Earth. The findings are significant, as this is the first study to reconstruct decades of vegetation change over a long stretch of the river, rather than at just a few sites.

The research was based on a senior thesis project by Bryn E. Morgan '17, first author of the study, who has a bachelor's degree in geography and chemistry from Dartmouth, and is now a Ph.D. student in the WAVES Lab at the University of California, Santa Barbara. She first visited the Namib Desert, which is located along the coast of Namibia, in 2015 as part of the environmental studies foreign studies program, led by study co-author Douglas T. Bolger (https://envs.dartmouth.edu/people/douglas-thomas-bolger), a professor and chair of environmental studies at Dartmouth. Morgan returned to the region on her own in 2016 to conduct this research while working out of the Gobabeb Namib Research Institute, and continued this work after graduating from Dartmouth.

The study area consisted of a 112-kilometer stretch of the lower Kuiseb River, which comprised 12 sites along the river, each of which was 500 meters wide. At each site, Morgan flew one to four unmanned aerial vehicle (drone) flights to capture images of the woody vegetation cover along the river. One drone flight would take images in true color and another flight would take images in false color using near-infrared wavelengths of light, which are outside what our eyes can see. False-color imagery allows one to distinguish vegetation from sand and soil, and to differentiate the various types of vegetation and how healthy it may be.
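The near-infrared trick described above is commonly exploited through a vegetation index such as NDVI; the sketch below (with assumed reflectance values, not the study's actual pipeline) shows why plants stand out from sand:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - red) / (NIR + red)."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + 1e-9)  # epsilon avoids division by zero

# Healthy leaves reflect strongly in near-infrared but absorb red light;
# bare sand reflects both bands about equally (values below are assumed).
nir = np.array([0.50, 0.40, 0.30])  # tree, shrub, sand
red = np.array([0.05, 0.10, 0.28])
print(np.round(ndvi(nir, red), 2))  # high for vegetation, near zero for sand
```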

The drone imagery revealed that five species of trees are part of the vegetation cover in the area: Acacia (Vachellia) erioloba, Faidherbia albida, Euclea pseudebenus, Tamarix usneoides, and Salvadora persica. A. erioloba and F. albida have been named national conservation priorities and their pods are an important food source for the livestock of the Topnaar (ǂAonin), the indigenous people who live along the Kuiseb River. The Kuiseb also serves as a habitat and migration corridor for many animals, including the mountain zebra, leopard and ostrich.

Based on the 2016 data, the research team created a model that calculated the fractional vegetation cover of the lower Kuiseb River based on raw reflectance values. By matching past satellite images from the previous 30 years to the present-day imagery, and then applying the same model, the team was able to estimate past vegetation cover. "Essentially, the drone data worked as a bridge from the ground to what the satellite was seeing," explains co-author Jonathan Chipman, director of the Citrin Family GIS/Applied Spatial Analysis Laboratory at Dartmouth. "Our methods in the study provide a model for how ecologists can combine modern drone imagery and historic satellite data to reconstruct past environmental change."
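The "bridge" Chipman describes can be pictured as a simple calibration: regress drone-measured cover on satellite values at the sites where both exist, then apply the fitted relation to older scenes. The sketch below uses made-up numbers and an assumed linear form, not the study's actual model:

```python
import numpy as np

# Hypothetical training pairs: satellite-derived NDVI at the 2016 sites
# versus fractional vegetation cover measured from the drone imagery.
sat_ndvi = np.array([0.05, 0.10, 0.20, 0.30, 0.40])
drone_cover = np.array([0.02, 0.08, 0.21, 0.33, 0.41])

# Fit cover ~ a * ndvi + b by least squares.
A = np.vstack([sat_ndvi, np.ones_like(sat_ndvi)]).T
a, b = np.linalg.lstsq(A, drone_cover, rcond=None)[0]

# Apply the same relation to an NDVI value from a historical Landsat scene.
historical_ndvi = 0.15
estimated_past_cover = a * historical_ndvi + b
print(f"estimated past cover: {estimated_past_cover:.3f}")
```

The same fitted coefficients are applied scene by scene back through the satellite archive, which is what lets a 2016 drone survey anchor a 35-year reconstruction.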

The results showed that one area of the Kuiseb River, located between 110 and 140 kilometers upstream from the Atlantic Ocean in the terminal alluvial zone (typically dry but within reach of floods, where soil sediments are deposited), not only had the highest vegetation cover but also showed the most positive change in vegetation cover over the study period.

"Our findings demonstrate that vegetation cover in the lower Kuiseb River has been, on average, increasing for more than three decades," says Morgan. "As a result, we provide new insight into not only the long-term change in the hydrology and ecology of this system but also into how ephemeral rivers in desert landscapes might be responding to global change."

The research team examined the climate and hydrological records to see if increased precipitation between 1984 and 2019 may explain the overall change in vegetation cover; however, the records did not show such steady increases in precipitation. According to the co-authors, the increase in vegetation cover may reflect a long-term recovery from the drought in the Namib Desert during the early 1980s, which could be investigated by using coarser-resolution satellite data prior to 1984.

Credit: 
Dartmouth College

A team of climatologists is studying how to minimize errors in observed climate trends

image: Javier Sigró and Manola Brunet

Image: 
URV

The instrumental climate record is the cultural heritage of humankind, the result of the diligent work of many generations of people all over the world. However, changes in the way temperature is measured, as well as in the environment in which weather stations are located, can produce spurious trends. An international study carried out by researchers from the Universitat Rovira i Virgili (URV), the State Meteorology Agency and the University of Bonn (Germany) has succeeded in identifying the most reliable methods for correcting these trends. These "homogenisation methods" are a key step in converting the enormous effort made by observers into reliable data about climate change. The results of this research, funded by the Spanish Ministry of Economy and Competitiveness, have been published in the Journal of Climate of the American Meteorological Society.

Climate observations can often be traced back more than a century, even before there were cars and electricity. These long periods of time mean that it is practically impossible to maintain the same measuring conditions over the years. The most common problem is the growth of cities around urban weather stations. We know that cities are getting warmer and warmer because of the thermal properties of urban surfaces and the reduction of evapotranspiration surfaces. To verify this, it is sufficient to compare urban stations with nearby rural stations. Although less known, similar problems are caused by the expansion of irrigated crops around observatories.

The other most common reason for biases in observed data is that weather stations have been relocated, among other reasons, because of changes in the observation networks. "A typical organisational change consisted of weather stations, which used to be in cities, being transferred to newly built airports which needed observations and predictions," explains Victor Venema, a climatologist from Bonn and one of the authors of the study. "The weather station in Bonn used to be in a field in the village of Poppelsdorf, which is now a part of the city and, after it had been moved several times, it is now in the Cologne-Bonn airport," he says.

As far as the robust estimation of global trends is concerned, the most important changes are technological, which are made simultaneously in an observation network. "At the moment we are in the middle of a period of generalised automation of the observation networks," says Venema.

The computer programs that can be used for the automatic homogenisation of climate time series data are the result of several years of development. They operate by comparing stations that are near to each other and looking for changes that only take place in one of them, unlike climate changes, which affect them all.
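The neighbour-comparison principle can be illustrated with a deliberately simplified sketch (real methods such as SNHT or pairwise algorithms are far more sophisticated): subtracting a nearby reference station cancels the shared climate signal, leaving any station-specific step change exposed.

```python
import numpy as np

rng = np.random.default_rng(0)
years = 60
climate = np.cumsum(rng.normal(0, 0.3, years)) * 0.05  # shared regional signal
candidate = climate + rng.normal(0, 0.1, years)        # station-level noise
reference = climate + rng.normal(0, 0.1, years)
candidate[35:] += 0.8  # spurious jump, e.g. a station relocation

diff = candidate - reference  # the shared climate signal cancels out

def best_break(d, margin=5):
    """Split point maximising the mean shift in the difference series."""
    scores = [abs(d[:k].mean() - d[k:].mean())
              for k in range(margin, len(d) - margin)]
    return margin + int(np.argmax(scores))

k = best_break(diff)
shift = diff[k:].mean() - diff[:k].mean()
print(f"break detected near year {k}, estimated shift {shift:.2f}")
```

Because the regional climate affects both stations alike, only the relocation jump survives in the difference series, which is what makes it detectable and correctable.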

To examine these homogenisation methods, the research team generated a test bank in which they incorporated a set of simulated data that reliably imitated the sets of observed climate data, including the biases mentioned. Hence, the spurious changes are known and they can be studied to determine how the various homogenisation methods can correct them.

The test data sets generated were more diverse than those of previous studies, reflecting the diversity of real station networks and the differences in how they are used. The researchers reproduced networks with highly varied densities of stations, because in a dense network it is easier to identify a small spurious change at one station. The test data set used in this project was also much larger than in previous studies (a total of 1,900 weather stations were analysed), which enabled the scientists to accurately determine the differences between the main automatic homogenisation methods developed by research groups in Europe and America. Because of the large size of the test data set, only automated homogenisation methods could be tested.

The research group discovered that it is much more difficult to improve the estimated mean climate signal for an observation network than to improve the accuracy of the time series of each station.

In the resulting classification, the methods of homogenisation proposed by URV and AEMET were better than the others. The method developed at the URV's C3 Centre for Climate Change (Vila-seca, Tarragona) by the Hungarian climatologist Peter Domonkos proved to be the best at homogenising both the series from individual stations and the mean series from the regional network. The AEMET method, developed by the researcher José A. Guijarro, was very close behind.

The homogenisation method developed by the National Oceanic and Atmospheric Administration of the United States (NOAA) was best at detecting and minimising systematic errors in trends from many weather stations, especially when these biases were produced simultaneously and affected many stations on similar dates. This method was designed to homogenise data sets from stations the world over where the main concern is the reliable estimation of global trends.

The results of this study have demonstrated the value of large test data sets. "It is another reason why automatic homogenisation methods are important: they can be tested more easily and this helps in their development," explains Peter Domonkos, who started his career as a meteorological observer and is now writing a book on the homogenisation of climate time series.

"The study shows the importance of very dense station networks in making homogenisation methods more robust and efficient and, therefore, in calculating observed trends more accurately," says the researcher Manola Brunet, director of the URV's C3, visiting member of the Climate Research Unit of the University of East Anglia, Norwich, United Kingdom, and vice-president of the World Meteorological Organisation's Commission for Weather, Climate, Water and Related Environmental Services & Applications.

"Unfortunately, much more climate data still has to be digitalised for even better homogenisation and quality control," she concludes.

For his part, the researcher Javier Sigró, also from the C3, points out that homogenisation is often just the first step "that allows us to go to the archives and check what happened with those observations affected by spurious changes. Improving the methods of homogenisation means that we can do this much more efficiently."

"The results of the project can help users to choose the method most suited to their needs and developers to improve their software because its strong and weak points are revealed. This will enable more improvement in the future," says José A. Guijarro from the State Meteorology Agency of the Balearic Islands and co-author of the study.

Previous studies of a similar kind have shown that the homogenisation methods that were designed to detect multiple biases simultaneously were clearly better than those that identify artificial spurious changes one by one. "Curiously, our study did not confirm this. It may be more an issue of using methods that have been accurately fitted and tested," says Victor Venema from the University of Bonn.

The experts are sure that the accuracy of the homogenisation methods will improve even more. "Nevertheless, we must not forget that climate observations that are spatially more dense and of high quality are the cornerstone of what we know about climate variability," concludes Peter Domonkos.

Credit: 
Universitat Rovira i Virgili

Rescheduling drugs to lower risk of abuse can reduce use, dangers

Many nations place drugs into various schedules or categories according to their risk of being abused and their medical value. At times, drugs are rescheduled to a more restrictive category to reduce misuse by constricting supply. A new study examined lessons from past efforts worldwide to schedule and reschedule drugs to identify general patterns and found that rescheduling drugs can lower use as well as the dangers associated with the drug. The findings have implications for policy.

The study, by researchers at Carnegie Mellon University (CMU), is published in the International Journal of Drug Policy.

"Our review suggests that rescheduling drugs can often disrupt trends in prescribing, use, or harms," says Jonathan Caulkins, professor of operations research and public policy at CMU's Heinz College, who led the study. "Hence, scheduling may be a more useful tool than is often recognized."

As part of implementing obligations to meet international drug control treaties, many countries classify drugs according to legally defined schedules. For example, the United States places drugs with no federally recognized medical use in Schedule I, and medicines are arrayed from Schedule II (the most restrictive category) to Schedule V (the least restrictive) based on their potential for abuse. Other countries have similar or higher numbers of categories; for example, Australia has 10.

In any given year, most drugs stay in the schedule they occupied the previous year. But sometimes drugs are added or are "up-scheduled" to a more restrictive category based on concerns that the drug is being misused.

In this study, based on a review of the literature, researchers examined all reported changes in quantitative measures of prescribing, dispensing, use, and use-related harms pertaining to the drug in question and, when available, any substitution with other drugs after the scheduling change. They considered scheduling and rescheduling changes in the United States between 1969 and 2020, as well as changes in a comparably large set of instances in other countries (e.g., Australia, Thailand, the United Kingdom).

Of the 90 articles reporting before and after changes in outcomes, 61 addressed events in the United States and more than half dealt with substitutions. Most of the articles on substitutions pertained to the U.S. up-scheduling of hydrocodone-containing products in 2014.

The study found that for more than half of the rescheduling events for which quantitative outcomes were reported, use-related measures declined by at least 40 percent. This led the authors to conclude that across a range of years, substances, and countries, rescheduling can be, although is not always, accompanied by appreciable reductions in prescribing, dispensing, use, and harms related to use.

The authors note that the magnitude of the declines varied widely across reschedulings, and that changes could also vary widely across different outcomes for the same rescheduling event. The authors noted that for any individual drug that was rescheduled, before-and-after changes could not necessarily be attributed only to the rescheduling; other contemporaneous changes may have played a role.

However, looking at the literature as a whole, the authors concluded that rescheduling can reduce use and harms associated with the drug that was restricted, depending on the circumstances.

The study also found that when drugs were rescheduled, there was sometimes substitution with other drugs--which could render the rescheduling ineffective or even counterproductive if the substituted drug was more dangerous. Typically, however, substitution was partial and was as likely to involve a less dangerous drug as a more dangerous one.

"Policymakers should view rescheduling as one possible tool for responding to misuse, but before wielding it, they should anticipate the possibility of substitutions and plan accordingly," explains Laura Goyeneche, research assistant at CMU's Heinz College, who coauthored the study.

Among the study's limitations, the authors cite the fact that the underlying literature is limited and unevenly distributed, with many scheduling events never being evaluated quantitatively. Furthermore, because findings were reported for many outcomes and with different measures over different timelines, standardizing outcomes was difficult; this prevented the authors from doing a meta-analysis of the data. In addition, since scheduling may be part of an array of concurrent efforts to address misuse, it is difficult to be certain about how much of a change in use can be attributed to the reschedule.

The research followed up on work that began as a capstone project course in Heinz College's Public Policy and Management master's program.

Credit: 
Carnegie Mellon University

Women at higher risk of fatal, nighttime cardiac arrest

image: Sumeet Chugh, MD

Image: 
Cedars-Sinai

LOS ANGELES - New research from the Center for Cardiac Arrest Prevention in the Smidt Heart Institute has found for the first time that during nighttime hours, women are more likely than men to suffer sudden death due to cardiac arrest. Findings were published in the journal Heart Rhythm.

"Dying suddenly during nighttime hours is a perplexing and devastating phenomenon," said Sumeet Chugh, MD, senior author of the study and director of the Center for Cardiac Arrest Prevention. "We were surprised to discover that being female is an independent predictor of these events."

Medical experts are mystified, Chugh says, because during these late hours, most patients are in a resting state, with reduced metabolism, heart rate and blood pressure.

Sudden cardiac arrest--also called sudden cardiac death--is an electrical disturbance of the heart rhythm that causes the heart to stop beating. People often confuse sudden cardiac arrest with a heart attack. However, a heart attack is caused by a buildup of cholesterol plaque that creates a blockage in the coronary arteries. And unlike a heart attack, which usually comes with symptoms, sudden cardiac death can occur in the absence of warning signs.

Another major difference: most people survive heart attacks, whereas only 10% of patients survive out-of-hospital cardiac arrest.

Of the approximately 350,000 people affected by the condition each year in the U.S., roughly 17% to 41% of cases occur during the nighttime hours of 10 p.m. to 6 a.m.

In the study, Chugh and his team of investigators looked at records of 4,126 patients, with 3,208 daytime cases of sudden cardiac arrest and 918 nighttime cases. Compared with daytime cases, patients who suffered from nighttime cardiac arrest were more likely to be female.

While further work is needed, the researchers suggest there may be a respiratory component causing this increased risk at night for women.

Chugh's research also shows:

25.4% of females studied suffered cardiac arrest at night versus 20.6% of their male counterparts.

The prevalence of lung disease was significantly higher in those who had a cardiac arrest at night compared with those who had cardiac arrest during the daytime.

Those who had cardiac arrests in the nighttime had a higher prevalence of prior or current smoking history.

"The prevalence of chronic obstructive lung disease and asthma were found to be significantly higher in sudden cardiac arrest cases at night compared with daytime cases, regardless of gender," said Chugh, also the Pauline and Harold Price Chair in Cardiac Electrophysiology Research. "Brain-affecting medications, some of which have the potential to suppress breathing, were also found to have a significantly greater usage in nighttime compared to daytime cardiac arrest."

Based on these findings, this research report suggests that prescribing physicians may wish to be cautious when recommending brain-affecting medications, for example, sedatives and drugs prescribed for pain and depression management, to high-risk patients, especially women.

"This important research may better guide physicians and the broader medical community to making more sound, science-backed recommendations in treating this difficult condition," said Christine Albert, MD, MPH, chair of the Department of Cardiology in the Smidt Heart Institute and the Lee and Harold Kapelovitz Distinguished Chair in Cardiology. "It is also a necessary continuation of sex-based research defining much of the field of cardiology."

For two decades, Chugh has led the Oregon Sudden Unexpected Death Study, a unique partnership with the approximately 1 million residents of the Portland, Oregon metro area as well as the provider organizations that care for them: the first responders, hospital systems and the medical examiner network. Chugh also leads the Ventura Prediction of Sudden Death in Multi-Ethnic Communities Study based in Ventura, California, a similar community partnership with the approximately 850,000 residents, first responders, medical examiner and hospital systems of Ventura County.

Credit: 
Cedars-Sinai Medical Center