DNA-metal double helix

Nanowires are vital components for future nanoelectronics, sensors, and nanomedicine. To achieve the required complexity, it is necessary to control the position and growth of the metal chains on an atomic level. In the journal Angewandte Chemie, a research team has introduced a novel approach that generates precisely controlled helical palladium-DNA systems that mimic the organization of natural base pairs in a double-stranded DNA molecule.

A team from Europe and the USA led by Miguel A. Galindo has now developed an elegant method for producing individual, continuous chains of palladium ions. The process is based on self-organized assembly of a special palladium complex and single-stranded DNA molecules.

In recent years, DNA has become an important tool for nanoscience and nanotechnology, particularly because of the possibility of "programming" the resulting structures through the base sequence of the DNA used. The incorporation of metals in DNA structures can give them properties such as conductivity, catalytic activity, magnetism, and photoactivity.

However, organizing metal ions in DNA molecules is not trivial because metal ions can bind to many different sites. Galindo's team developed a smart method for controlling the binding of palladium ions to specific sites. They use a specially constructed palladium complex that can form base pairs with natural adenine bases in a strand of DNA. The ligand in this complex is a flat, aromatic ring system that grasps three of the four binding positions available on the palladium ion. The fourth position of the palladium is then available to bind to a very specific nitrogen atom in adenine. The ligand also possesses oxygen atoms capable of forming a hydrogen bond with the neighboring NH2 group of the adenine. This binding pattern corresponds exactly to a Watson-Crick base pairing, but now mediated by a palladium ion, which makes it considerably stronger than natural Watson-Crick pairing.

If a DNA strand made exclusively of adenine bases is used, one palladium complex binds to each adenine. The flat ligands assemble themselves into coplanar stacks, just like natural bases. This results in a double strand made of DNA and palladium complexes that corresponds to a natural DNA double helix in which one strand has been replaced by a supramolecular stack of continuous palladium complexes.

Although the team has yet to demonstrate the conductive properties of these systems, it is anticipated that appropriate reduction of these metal ions could lead to the formation of a conductive nanowire with a highly controlled structure. The research group is currently working along these lines, as well as on modifying the ligand, which could also give the system new properties.

Credit: 
Wiley

Incurable cancer: Patients need palliative care support early on

So far, there has been little research into supportive care needs in patients with newly diagnosed incurable cancer and as their disease progresses. That is why experts from the German Cancer Society's working group on palliative medicine, led by Professor Florian Lordick, Director of the University Cancer Center Leipzig (UCCL), surveyed 500 patients between the ages of 25 and 89. What made the project special was the fact that the patients were accompanied from the moment they were diagnosed and before receiving any treatment. Professor Lordick sums it up thus: "There is an urgent need for patients to have early access to supportive palliative care for a wide range of issues, including psychosocial support." Palliative care is not about healing, but about maintaining quality of life, relieving pain, treating other physical ailments and problems of a psychosocial and spiritual nature.

Two-thirds of patients diagnosed with incurable cancer reported immediate, significant physical and emotional distress. The study paints a complex picture of the care provided by 20 cancer treatment centres across Germany, from university to community settings, from outpatient to inpatient care. Oncologist Lordick explains: "The patients were very interested in the survey, despite the fact that they were in a very difficult situation and the study required them to reveal their inner selves to a certain degree. That showed us just how important this issue is to them."

Patients were surveyed shortly after being diagnosed with incurable lung (217), gastrointestinal (156), head and neck (55), gynaecological (57) and skin (15) cancers, and again after three, six and twelve months. The focus was on patients' distress, symptom burden, quality of life and supportive care needs.

More than 30 per cent of respondents reported anxiety and depression shortly after diagnosis. Complaints of a lack of energy, nutritional and digestive problems, and pain were also very common. The study shows where the needs of those affected are particularly high. When comparing patients with different cancers, those with stomach, oesophageal, liver, or head and neck tumours displayed the highest level of distress over the entire observation period.

Professor Lordick believes the study results can be used to draw clear conclusions for medical practice, explaining: "Cancer centres need to have expert palliative care services, both on an inpatient and outpatient basis. These include specialised nutritional counselling, pain management, and physiotherapy and psychosocial support." The expert from Leipzig University Hospital concludes that the results underline the necessity of introducing comprehensive symptom screening and early palliative medical care.

Credit: 
Universität Leipzig

Hypnosis changes the way our brain processes information

During a normal waking state, information is processed and shared by various parts within our brain to enable flexible responses to external stimuli. Researchers from the University of Turku, Finland, found that during hypnosis the brain shifted to a state where individual brain regions acted more independently of each other.

"In a normal waking state, different brain regions share information with each other, but during hypnosis this process is kind of fractured and the various brain regions are no longer similarly synchronised," describes researcher Henry Railo from the Department of Clinical Neurophysiology at the University of Turku.

The finding shows that the brain may function quite differently during hypnosis when compared to a normal waking state. This is interesting because the extent to which hypnosis modifies neural processing has been hotly debated in the field. The new findings also help to better understand which types of changes and mechanisms may explain the experiential and behavioural alterations attributed to hypnosis, such as responsiveness to suggestion.

The study focused on a single person who has been extensively studied earlier and been shown to react strongly to hypnotic suggestions.
During hypnosis, this person can experience phenomena that are not typically possible in a normal waking state, such as vivid and controlled hallucinations.

"Even though these findings cannot be generalised before a replication has been conducted on a larger sample of participants, we have demonstrated what kind of changes happen in the neural activity of a person who reacts to hypnosis particularly strongly," clarifies Jarno Tuominen, Senior Researcher at the Department of Psychology and Speech-Language Pathology.

Hypnosis Studied for the First Time with New Method

The study was conducted by tracking how a magnetically-induced electrical current spread throughout the brain during hypnosis and normal waking state. This method has been previously used to measure system-level changes in the brain in various states of consciousness, such as anaesthesia, coma, and sleep. This is the first time such a method has been used to assess hypnosis.

During the study, the participant sat still with eyes closed, alternately either hypnotised or in a normal waking state. Hypnosis was induced via a single-word cue, and the different conditions were identical in every other respect.

"This allowed us to control the possible effects of the experimental setup or other factors, such as alertness," Tuominen explains.

The study was conducted by researchers Jarno Tuominen from the division of Psychology, Henry Railo from the Department of Clinical Neurophysiology, and Valtteri Kaasinen, Assistant Professor in Neurology at the University of Turku, Finland, together with Assistant Professor in Cognitive Neuroscience Sakari Kallio at the University of Skövde, Sweden.

Credit: 
University of Turku

Plasmon-coupled gold nanoparticles useful for thermal history sensing

Researchers have demonstrated that stretching shape-memory polymers embedded with clusters of gold nanoparticles alters their plasmon-coupling, giving rise to desirable optical properties. One potential application for the material is a sensor that relies on optical properties to track an object or environment's thermal history.

At issue is a stretchable polymer embedded with gold nanospheres. If the material is heated and stretched, followed by cooling to room temperature, the material will hold its stretched shape indefinitely. Once reheated to 120 degrees Celsius, the material returns to its original shape.

But what's really interesting is that the gold nanospheres are not perfectly dispersed in the polymer. Instead, they form clusters, in which their surface plasmon resonances are coupled. These plasmon-coupled nanoparticles have optical properties that shift depending on how close they are to each other, which changes when stretching alters the shape of the composite.

"When assessing the peak wavelength of light absorbed by the material, there are significant differences depending on whether the light is polarized parallel or perpendicular to the stretching direction," says Joe Tracy, corresponding author of a paper on the work and a professor of materials science and engineering at NC State. "For light polarized parallel to the direction of stretching, the further you have stretched the material, the further the light absorbed shifts to the red. For light polarized perpendicular to the stretching direction there is a blueshift."

"We also found that, while the shape-memory polymer holds its shape at room temperature, it recovers its original shape in a predictable way, depending on the temperature it is exposed to," says Tobias Kraus, co-author of the paper, a group leader at the Leibniz Institute for New Materials and a professor at Saarland University.

Specifically, once stretched 140% past its original length, you can determine the highest temperature to which the polymer is then exposed, up to 120 degrees Celsius, by measuring how much it has shrunk back toward its original size. What's more, because of the plasmon-coupled nanoparticles, this change can be measured indirectly, through measurements of the material's optical properties.
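That readout can be sketched as a simple calibration lookup. The calibration points below are invented for illustration; the article only fixes the endpoints (full shape retention at room temperature, full recovery at 120 degrees Celsius), so a real sensor would be calibrated against reference heat exposures.

```python
# Hypothetical sketch of a thermal-history readout: the film is stretched,
# and the maximum temperature it later experienced is inferred from how far
# its polarized absorption peak has shifted back toward the unstretched value.
# Calibration points are invented for illustration.

CALIBRATION = [  # (red-shift of the parallel-polarized peak in nm, max temperature in C)
    (60.0, 25.0),   # fully stretched film: never heated above room temperature
    (45.0, 60.0),
    (25.0, 90.0),
    (0.0, 120.0),   # fully recovered film: reached the full recovery temperature
]

def max_temperature(shift_nm: float) -> float:
    """Linearly interpolate the calibration curve (shift falls as temperature rises)."""
    pts = sorted(CALIBRATION)        # ascending by shift
    if shift_nm <= pts[0][0]:
        return pts[0][1]             # no shift remaining: reached the top of the range
    if shift_nm >= pts[-1][0]:
        return pts[-1][1]            # full shift remaining: stayed at room temperature
    for (x0, t0), (x1, t1) in zip(pts, pts[1:]):
        if x0 <= shift_nm <= x1:
            f = (shift_nm - x0) / (x1 - x0)
            return t0 + f * (t1 - t0)
```

Because the polymer's recovery is monotonic in temperature, a one-dimensional lookup like this is all the optical measurement needs.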

"From a practical perspective, this allows you to create an optical thermal-history sensor," Joe Tracy says. "You can use light to see how hot the material got. An important application of thermal-history sensors is assuring the quality or safety of shipping or storing materials that are sensitive to significant changes in heat. We have demonstrated an approach based on plasmon coupling of gold nanoparticles."

The sensor concept was developed empirically, but the researchers also used computational modeling to better understand the structure of the clusters of gold nanospheres and how the clusters changed during stretching. The strength of plasmon coupling depends on the spacing between nanospheres; this relationship is known as a "plasmon ruler."

"Based on our simulations, we can estimate the distance between plasmon-coupled nanoparticles from their optical properties," says Amy Oldenburg, co-author of the paper and a professor of physics at the University of North Carolina at Chapel Hill. "This comparison is informative for designing future polymer nanocomposites based on plasmon-coupled nanoparticles."

Credit: 
North Carolina State University

Intensity of tropical cyclones is probably increasing due to climate change

Many tropical cyclone-prone regions of the world are expected to experience storm systems of greater intensity over the coming century, according to a review of research published today in ScienceBrief Review.

Moreover, sea level rise will aggravate coastal flood risk from tropical cyclones and other phenomena, even if the tropical cyclones themselves do not change at all. Models also project an increase in future tropical-cyclone precipitation rates, which could further elevate the risk of flooding.

Researchers at Princeton University, the U.S. National Oceanic and Atmospheric Administration (NOAA), and the University of East Anglia (UEA) examined more than 90 peer-reviewed articles to assess whether human activity is influencing tropical cyclones, including tropical storms, hurricanes and typhoons. The studies showed growing evidence that climate change is probably fuelling more powerful hurricanes and typhoons, a trend that is expected to continue as global temperatures rise, amounting to a roughly 5 per cent increase in maximum wind speeds if the globe warms by 2 degrees Celsius.

The influence of climate change on tropical cyclones has been notoriously difficult to separate from natural variability. But an increasingly consistent picture is emerging that suggests human activities are probably influencing some aspects of these extreme weather events, although the exact extent of the human influence is still difficult to determine confidently in today's observations. Many of the observed trends in tropical cyclones are at least qualitatively consistent with expectations from a warming climate.

The ScienceBrief Review, 'Climate change is probably increasing the intensity of tropical cyclones', is published today as part of a collection on critical issues in climate change science to inform the COP26 climate conference.

Observations show that since about 1980, the intensity of tropical cyclones has increased globally, with a larger proportion of powerful cyclones and an increase in the rate at which they intensify, especially in the North Atlantic.

However, century-scale records of landfalling hurricanes and major hurricanes for the continental United States -- as well as tropical cyclone landfalls for Japan and eastern Australia -- fail to show any significant increase over time.

The mixed picture -- revealed by past observations such as these -- is one reason why it has been so difficult to unequivocally attribute past changes in tropical cyclone activity to the century-scale build-up of greenhouse gases in the atmosphere, which has caused global warming, according to the authors.

Other factors influencing tropical cyclones, including natural climate variability such as El Niño and La Niña events, and changes in air pollution that create local cooling or warming trends over decades, may have influenced the recent trends since 1980. A key research question is how future greenhouse gas-dominated global warming will influence tropical cyclone behaviour over the coming century.

Prof Corinne Le Quéré, Royal Society Professor at UEA's School of Environmental Sciences, edited the COP26 special issue of ScienceBrief Review. She said: "There is moderate consensus that climate change is already playing a role in the development of tropical cyclones, but it is early days. In comparison with wildfires, the consensus is already clear that climate change increases the risks, as shown earlier on ScienceBrief Review."

Projections with climate models suggest that with further warming in coming decades, a larger proportion of Category 4 and 5 tropical cyclones will occur globally -- with more damaging wind speeds and more extreme rainfall rates. The damage potential of storms will also depend on factors such as the change in storm trajectory, frequency, size, intensity and rainfall. The actual damage from storms will also be influenced by human factors including the location and vulnerability of buildings and infrastructure.

Tropical cyclones could also intensify more rapidly, and move more slowly in some regions, exacerbating extreme rainfall in localised areas. An extreme example of tropical cyclone flooding induced by a stalled system occurred with Hurricane Harvey in Texas in 2017.

Thomas Knutson, Division Leader at NOAA's Geophysical Fluid Dynamics Laboratory on Princeton University's Forrestal Campus, led the review.

Gabriel Vecchi, Princeton professor of geosciences and the High Meadows Environmental Institute and a co-author on the study, said: "Larger and more intense tropical cyclones tend to cause more damage than smaller, weaker storms, so shifts toward a greater proportion of intense storms are of concern.

"The intensity of tropical cyclones has increased globally in recent decades, with the proportion of Category 3 to 5 cyclones growing by around 5 per cent per decade since 1979.

"It is still difficult to firmly attribute those trends to human-induced climate change because there are also other factors influencing these storms."

There is increased risk of inundation due to rising sea levels, with heavy rainfall projected to intensify due to enhanced moisture in the air as the climate warms.

Observations indicate the latitude at which tropical cyclones reach their peak intensity has been migrating poleward in most basins, raising the potential that those storms could begin to bring greater impacts to locations that may be less well-equipped to respond.

Modelling studies, supported by the theory of potential intensity of tropical cyclones, find that mean intensities are projected to increase by about +5 per cent for a +2 degrees C global warming scenario, and near-storm rainfall rates to increase globally by an average of +14 per cent.
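As a back-of-envelope illustration, those projected percentages can be applied to a hypothetical storm. Only the +5 per cent and +14 per cent figures come from the review; the example storm values are invented.

```python
# Projected changes from the review, applied to an invented example storm.
WIND_INCREASE = 0.05   # ~+5% mean intensity under +2 C global warming
RAIN_INCREASE = 0.14   # ~+14% near-storm rainfall rate

def project(wind_kmh: float, rain_mm_per_h: float) -> tuple:
    """Scale a storm's wind speed and rainfall rate by the projected changes."""
    return wind_kmh * (1 + WIND_INCREASE), rain_mm_per_h * (1 + RAIN_INCREASE)

# A hypothetical 200 km/h storm with 20 mm/h near-storm rainfall would become
# roughly a 210 km/h storm with about 22.8 mm/h rainfall.
```

Note that damage scales nonlinearly with wind speed, so even a 5 per cent intensity increase implies a considerably larger increase in destructive potential.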

Maya Chung, a PhD candidate in Princeton's Program in Atmospheric and Oceanic Sciences, said that in coastal regions, higher storm inundation levels will be among the greatest potential impacts of future tropical cyclones under climate change.

She said: "The combination of likely increased storm intensity and rainfall rates and continued sea-level rise will act to increase the inundation risk of low-lying, unprotected regions.

"The total inundation risk will depend on a variety of storm-related factors as well as sea level rise."

Whereas model projections suggest a greater proportion of higher-intensity cyclones, most model studies project the total number of tropical cyclones each year will decrease or remain approximately the same.

The 2020 hurricane season in the North Atlantic had both a high number of named storms and a high number of intense hurricanes, with six storms in Category 3 to 5.

Thomas Knutson said: "It is possible that in the real world, hurricane activity will increase more than suggested by the range of existing studies -- or perhaps less.

"Unfortunately, humans are on a path to find out through actually increasing global temperatures beyond levels experienced during human history, and then we will see how things turn out."

Prof Le Quéré said: "The impacts of climate change are becoming increasingly clear as new evidence becomes available and because our impact on the climate is also growing.

"Continually assessing the scientific evidence is critical to informing the decisions to be made at the COP26 in Glasgow later this year."

'Climate change is probably increasing the intensity of tropical cyclones', Thomas R. Knutson, Maya V. Chung, Gabriel Vecchi, Jingru Sun, Tsung-Lin Hsieh and Adam J. P. Smith, is published at ScienceBrief.org on 26 March 2021.

Credit: 
University of East Anglia

New genetic clues point to new treatments for 'silent' stroke

Scientists have identified new genetic clues in people who've had small and often apparently 'silent' strokes that are difficult to treat and a major cause of vascular dementia, according to research funded by the British Heart Foundation (BHF) and published in The Lancet Neurology.

Researchers discovered changes to 12 genetic regions in the DNA of people who have had a lacunar stroke - a type of stroke caused by weakening of the small blood vessels deep within the brain. Over time, damage to the blood vessels and subsequent interruption to blood flow can lead to long-term disability, causing difficulty with thinking, memory, walking and ultimately dementia.

There are few proven drugs to prevent or treat lacunar strokes. The blood vessels affected are less than a millimetre wide and a lacunar stroke can strike without the person knowing. It's not usually until someone has had a number of these strokes and starts to see signs of dementia that they realise something is wrong.

To date, only one genetic fault has been associated with lacunar strokes. However, after over a decade of research, Professor Hugh Markus and his team at the University of Cambridge working with researchers from around the world now believe their genetic breakthrough holds the key to finding much-needed treatments for lacunar stroke and vascular dementia.

Researchers scanned and compared the genetic code of 7,338 patients who had a lacunar stroke with 254,798 people who had not. Participants were recruited from across Europe, the United States, South America and Australia after they attended hospital and had an MRI or CT brain scan.

They discovered that many of the 12 genetic regions linked to lacunar strokes were involved in maintaining the neurovascular unit - the part of the brain that separates the blood vessels from the brain and ensures that nerves function normally. These genetic changes are thought to make the small blood vessels 'leakier', causing toxic substances to enter the brain, and meaning that messages travelling around the brain slow down or don't arrive at all.

The team now plan to test whether new treatments can correct these abnormalities on brain cells in the lab. They hope to begin human clinical trials in the next ten years.

The study also highlighted that high blood pressure, type 2 diabetes and a history of smoking are causally associated with an increased risk of lacunar stroke, identifying risk factors that can be tackled immediately.

Professor Hugh Markus, BHF-funded researcher, leader of the study and neurologist at the University of Cambridge, said:

"These small and often silent lacunar strokes have gone under the radar for a long time, and so we haven't been able treat patients as well as we'd like to. Although small, their consequences for patients can be enormous. They cause a quarter of all strokes and they are the type of stroke which is most likely to lead to vascular dementia.

"We now plan to use this new genetic blueprint as a springboard to develop much needed treatments to prevent lacunar strokes from occurring in the first place and to help stave off dementia."

Dr Matthew Traylor, first author of the study at Queen Mary University of London, said:

"Genetics offers one of the few ways we can discover completely new insights into what causes a disease such as lacunar stroke. It is only by better understanding of what causes the disease that we will be able to develop better treatments."

Professor Sir Nilesh Samani, Medical Director at the British Heart Foundation and cardiologist, said:

"This is the most extensive genetic search to date which truly gets to grips with what cause lacunar strokes. These findings are a significant leap forward and we now have a much greater understanding of the genetics and biology behind what causes the small blood vessels deep in the brain to become diseased.

"Lacunar strokes affect around 35,000 people in the UK each year. This research provides real hope that we can prevent and treat this devastating type of stroke much better in the future."

Credit: 
British Heart Foundation

New discoveries of deep brain stimulation put it on par with therapeutics

Image: Nuri Ince, associate professor of biomedical engineering, reports that electrical stimulation of the brain at higher frequencies (above 100 Hz) induces resonating waveforms that can successfully recalibrate dysfunctional circuits causing movement symptoms. (Credit: University of Houston)

Despite having remarkable utility in treating movement disorders such as Parkinson's disease, deep brain stimulation (DBS) has confounded researchers, with a general lack of understanding of why it works at some frequencies and does not at others. Now a University of Houston biomedical engineer is presenting evidence in Nature Communications Biology that electrical stimulation of the brain at higher frequencies (>100Hz) induces resonating waveforms which can successfully recalibrate dysfunctional circuits causing movement symptoms.

"We investigated the modulations in local ?eld potentials induced by electrical stimulation of the subthalamic nucleus (STN) at therapeutic and non-therapeutic frequencies in Parkinson's disease patients undergoing DBS surgery. We ?nd that therapeutic high-frequency stimulation (130-180 Hz) induces high-frequency oscillations (~300 Hz, HFO) similar to those observed with pharmacological treatment," reports Nuri Ince, associate professor of biomedical engineering.

For the past couple of decades, deep brain stimulation (DBS) has been the most important therapeutic advancement in the treatment of Parkinson's disease, a progressive nervous system disorder that affects movement in 10 million people worldwide. In DBS, electrodes are surgically implanted in the deep brain and electrical pulses are delivered at certain rates to control tremors and other disabling motor signs associated with the disease.

Until now, finding the correct frequency has been time-consuming: implanting devices and testing their effects in patients can take months of back-and-forth adjustment. Ince's method could make programming devices at the correct frequencies almost immediate.

"For the first time, we stimulated the brain and while doing that we recorded the response of the brain waves at the same time, and this has been a limitation over the past years. When you stimulate with electrical pulses, they generate large amplitude artifacts, masking the neural response. With our signal processing methods, we were able to get rid of the noise and clean it up," said Ince. "If you know why certain frequencies are working, then you can adjust the stimulation frequencies on a subject-specific basis, making therapy more personalized."

Credit: 
University of Houston

Rural Alaskans struggle to access and afford water

Water scarcity in rural Alaska is not a new problem, but the situation is getting worse with climate change. Lasting solutions must encourage the use of alternative water supplies like rainwater catchment and grey water recycling. They must also address the affordability of water related to household income, say researchers from McGill University.

Washing hands with clean water is something most people take for granted, yet for Alaska's rural residents, this is often not the case. When people pay for water by the gallon, serious thought is given to how much is used - even during the COVID-19 pandemic.

In many rural Alaskan communities, where jobs are scarce and household income is low, the cost of water is a high burden, according to the study published in Environmental Management.

"Households in Anchorage paid nearly five dollars per 1000 gallons in 2017, while residents in more remote areas paid ten times as much," says co-author Antonia Sohns, a PhD Candidate under the supervision of McGill Professor Jan Adamowski.

Living on less than 6 liters of water a day

Due to the costliness of water and challenges accessing it, rural Alaskan homes without piped water use on average 5.7 liters of water per person per day - well below the World Health Organization standard of 20 liters per person per day, and far below the average of 110 liters per person per day in similar regions like Nunavut, Canada.
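The affordability gap can be made concrete from the figures quoted in the article ($5 versus roughly ten times that per 1,000 gallons in 2017, and the daily usage levels above). The gallon-to-litre conversion is standard (1 US gallon is about 3.785 litres); everything else is simple arithmetic on the quoted numbers.

```python
# Annual per-person water cost at the quoted 2017 prices and daily usages.
LITRES_PER_1000_GAL = 3785.4

def annual_cost_usd(litres_per_day: float, usd_per_1000_gal: float) -> float:
    """Yearly cost of a given daily water use at a given price per 1,000 gallons."""
    return 365 * litres_per_day / LITRES_PER_1000_GAL * usd_per_1000_gal

who_minimum_remote = annual_cost_usd(20, 50)    # WHO minimum at a remote-Alaska price
who_minimum_anchorage = annual_cost_usd(20, 5)  # same usage at the Anchorage price
```

Even meeting only the WHO minimum of 20 litres per day costs roughly ten times more per year in remote communities than it would at Anchorage prices, which is why usage is held so far below it.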

"As climate change manifests rapidly across the Arctic, the challenges that households face in securing sufficient water supplies for their daily needs becomes even more difficult," says Sohns.

Intensifying effects of climate change such as coastal erosion and storm surges are compromising water sources that communities depend on, leading to damaged infrastructure and saltwater intrusion into drinking water.

"Funding should be made available to address climate change impacts to water systems and to support adaptation strategies adopted by communities," says Adamowski of the Department of Bioresource Engineering.

Water quality, but not quantity?

Currently, there are no water quantity standards, but there are water quality standards. Changes to these regulations could strengthen access to water and improve the health of rural Alaskans.

According to the researchers, part of the solution could lie in promoting household-level approaches and changing how water is provided from conventional pipes to non-conventional systems. This includes increasing funding for alternative water systems, such as rainwater catchment or gray water recycling and reuse.

Although these solutions may not meet potable water standards at the state or federal level, they could significantly improve the quality of life in many communities, they say. Governments should consider reducing burdensome requirements to allow construction of less costly water systems on the way to a fully compliant system.

The researchers also note that infrastructure projects should consider community capacity, belief structure, and local preferences. Inadequate access to water is a persistent challenge in over 200 rural communities, whose residents are primarily Alaska Native people.

"We can't overlook the importance of different perceptions of water due to cultural preference. Many households continue to gather drinking water from sources that are culturally significant such as rivers, lakes, ice, or snowmelt. In considering such factors, approaches will be more long-lasting and resilient," says Sohns.

Credit: 
McGill University

Chemists achieve breakthrough in the production of three-dimensional molecular structures

Image: Chemists use this experimental setup for photochemical reactions. (Credit: Peter Bellotti)

A major goal of organic and medicinal chemistry in recent decades has been the rapid synthesis of three-dimensional molecules for the development of new drugs. These drug candidates exhibit a variety of improved properties compared to predominantly flat molecular structures, which are reflected in clinical trials by higher efficacy and success rates. However, they could only be produced at great expense or not at all using previous methods. Chemists led by Prof. Frank Glorius (University of Münster, Germany) and his colleagues Prof. M. Kevin Brown (Indiana University Bloomington) and Prof. Kendall N. Houk (University of California, Los Angeles) have now succeeded in converting several classes of flat nitrogen-containing molecules into the desired three-dimensional structures. Using more than 100 novel examples, they were able to demonstrate the broad applicability of the process. This study will be published by Science on Friday, 26 March 2021.

Light-mediated energy transfer overcomes energy barrier

One of the most efficient methods for synthesizing three-dimensional architectures involves the addition of a molecule to another, known as cycloaddition. In this process, two new bonds and a new ring are formed between the molecules. For aromatic systems - i.e. flat and particularly stable ring compounds - this reaction was not feasible with previous methods. The energy barrier that inhibits such a cycloaddition could not be overcome even with the application of heat. For this reason, the authors of the "Science" article explored the possibility of overcoming this barrier through light-mediated energy transfer.

"The motif of using light energy to build more complex, chemical structures is also found in nature," explains Frank Glorius. "Just as plants use light in photosynthesis to synthesize sugar molecules from the simple building blocks carbon dioxide and water, we use light-mediated energy transfer to produce complex, three-dimensional target molecules from flat basic structures."

New drug candidates for pharmaceutical applications?

The scientists point to the "enormous possibilities" of the method. The novel, unconventional structural motifs presented by the team in the "Science" paper will significantly expand the range of molecules that medicinal chemists can consider in their search for new drugs: for example, basic building blocks containing nitrogen and highly relevant to pharmaceuticals, such as quinolines, isoquinolines and quinazolines, which have been scarcely used owing to selectivity and reactivity problems. Through light-mediated energy transfer, they can now be coupled with a wide range of structurally diverse alkenes to obtain novel three-dimensional drug candidates or their backbones. The chemists also demonstrated a variety of innovative transformations for the further processing of these synthesized backbones, using their expertise to pave the way for pharmaceutical applications. The method's great practicality and the availability of the required starting materials are crucial for the future use of the technology: the molecules used are commercially available at low cost or easy to produce.

"We hope that this discovery will provide new impetus in the development of novel medical agents and will also be applied and further investigated in an interdisciplinary manner," explains Jiajia Ma. Kevin Brown adds: "Our scientific breakthrough can also gain great significance in the discovery of crop protection agents and beyond."

Synergy of experimental and computational chemistry

Another special feature of the study: for the first time, the scientists clarified the reaction mechanism and the exact structure of the molecules produced, not only analytically and experimentally in detail, but also via "computational chemistry": Kendall Houk and Shuming Chen conducted detailed computer-aided modeling of the reaction. They were able to show how these reactions work and why they occur very selectively. "This study is a prime example of the synergy of experimental and computational theoretical chemistry," emphasizes Shuming Chen, now a professor at Oberlin College in Ohio. "Our detailed mechanistic elucidation and understanding of reactivity concepts will enable scientists to develop complementary methods and to use what we learned to design more efficient synthetic routes in the future," adds Kendall Houk.

The story behind the publication

Using the method of light-mediated energy transfer, both Jiajia Ma/Frank Glorius (University of Münster) and Renyu Guo/Kevin Brown (Indiana University) had success, independently. Through collaborations with Kendall Houk and Shuming Chen at UCLA, both research groups learned of the mutual discovery. The three groups decided to develop their findings further together in order to share their breakthrough with the scientific community as soon as possible and to provide medicinal chemists with this technology to develop novel drugs.

Credit: 
University of Münster

Ocean currents predicted on Enceladus

Buried beneath 20 kilometers of ice, the subsurface ocean of Enceladus--one of Saturn's moons--appears to be churning with currents akin to those on Earth.

The theory, derived from the shape of Enceladus's ice shell, challenges the current thinking that the moon's global ocean is homogenous, apart from some vertical mixing driven by the warmth of the moon's core.

Enceladus, a tiny frozen ball about 500 kilometers in diameter (about 1/7th the diameter of Earth's moon), is the sixth largest moon of Saturn. Despite its small size, Enceladus attracted the attention of scientists in 2014 when a flyby of the Cassini spacecraft discovered evidence of its large subsurface ocean and sampled water from geyser-like eruptions that occur through fissures in the ice at the south pole. It is one of the few locations in the solar system with liquid water (another is Jupiter's moon Europa), making it a target of interest for astrobiologists searching for signs of life.

The ocean on Enceladus is almost entirely unlike Earth's. Earth's ocean is relatively shallow (an average of 3.6 km deep), covers three-quarters of the planet's surface, is warmer at the top from the sun's rays and colder in the depths near the seafloor, and has currents that are affected by wind; Enceladus, meanwhile, appears to have a globe-spanning and completely subsurface ocean that is at least 30 km deep and is cooled at the top near the ice shell and warmed at the bottom by heat from the moon's core.

Despite their differences, Caltech graduate student Ana Lobo (MS '17) suggests that the ocean on Enceladus has currents akin to those on Earth. The work builds on measurements by Cassini as well as the research of Andrew Thompson, professor of environmental science and engineering, who has been studying the way that ice and water interact to drive ocean mixing around Antarctica.

The oceans of Enceladus and Earth share one important characteristic: they are salty. And as shown by findings published in Nature Geoscience on March 25, variations in salinity could serve as drivers of the ocean circulation on Enceladus, much as they do in Earth's Southern Ocean, which surrounds Antarctica.

Lobo and Thompson collaborated on the work with Steven Vance and Saikiran Tharimena of JPL, which Caltech manages for NASA.

Gravitational measurements and heat calculations from Cassini had already revealed that the ice shell is thinner at the poles than at the equator. Regions of thin ice at the poles are likely associated with melting and regions of thick ice at the equator with freezing, Thompson says. This affects the ocean currents because when salty water freezes, it releases the salts and makes the surrounding water heavier, causing it to sink. The opposite happens in regions of melt.
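The density mechanism described here can be sketched with a simple linear equation of state. This is an illustrative calculation with typical textbook coefficients for cold seawater, not parameters from the Enceladus study:

```python
# Illustrative sketch (not the study's actual model): brine rejection during
# freezing raises local salinity, which raises density, so the water sinks.
# RHO0 and BETA are assumed textbook values for cold seawater.

RHO0 = 1030.0   # reference density (kg/m^3)
BETA = 7.8e-4   # haline contraction coefficient (per g/kg of salinity)

def density(salinity, ref_salinity=35.0):
    """Density as a linear function of salinity at fixed (cold) temperature."""
    return RHO0 * (1.0 + BETA * (salinity - ref_salinity))

ambient = density(35.0)              # water away from the freezing region
under_freezing_ice = density(37.0)   # salt rejected by freezing raises salinity

# Saltier water is denser and sinks; melting regions produce the opposite,
# lighter water -- together driving a pole-to-equator overturning.
assert under_freezing_ice > ambient
print(f"density increase: {under_freezing_ice - ambient:.2f} kg/m^3")
```

Even a salinity change of a few grams per kilogram produces a density contrast of order 1 kg/m^3, which is ample to drive large-scale overturning in a deep, quiescent ocean.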

"Knowing the distribution of ice allows us to place constraints on circulation patterns," Lobo explains. An idealized computer model, based on Thompson's studies of Antarctica, suggests that the regions of freezing and melting, identified by the ice structure, would be connected by the ocean currents. This would create a pole-to-equator circulation that influences the distribution of heat and nutrients.

"Understanding which regions of the subsurface ocean might be the most hospitable to life as we know it could one day inform efforts to search for signs of life," Thompson says.

Credit: 
California Institute of Technology

Second drug targeting KRASG12C shows benefit in mutated non-small-cell lung cancer

image: Presenter of abstract 99O

Image: 
ESMO

Lugano, Switzerland; Denver, CO, USA, 25 March 2021 - Clinical activity with a second drug inhibiting KRASG12C confirms its role as a therapeutic target in patients with advanced non-small-cell lung cancer (NSCLC) harbouring this mutation, according to results from a study with the KRASG12C inhibitor adagrasib reported at the European Lung Cancer Virtual Congress 2021. (1)

"As we strive to identify the oncogenic driver in more and more of our patients with NSCLC, it becomes critical that we develop therapies that can target these identified oncogenic drivers," said lead author Gregory Riely, from Memorial Sloan Kettering Cancer Center, New York, USA.

"KRAS mutations are the most frequent oncogenic driver that we see in patients with NSCLC and we've known about KRAS-mutant NSCLC for 30 years. We are now, finally, seeing drugs that can target this subgroup of patients," he said.

The multi-cohort phase 1/2 KRYSTAL-1 study evaluated adagrasib, a selective inhibitor of KRASG12C, in 79 patients with advanced or metastatic NSCLC harbouring a KRASG12C mutation. Most (92%) of the patients had previously been treated with chemotherapy and an anti-PD-(L)1.

Results showed that nearly half (45%) of the 51 patients evaluable for clinical activity had a partial response to treatment with adagrasib, and 26 patients had stable disease.

"The 45% response rate is unprecedented activity in patients with KRASG12C mutant NSCLC," commented Myung-Ju Ahn, from Samsung Medical Center, Sungkyunkwan University School of Medicine, Seoul, Korea. "A response of this magnitude could not be expected with other chemotherapy or immunotherapy in pre-treated KRAS-mutated patients, suggesting that KRASG12C is a therapeutic target." She considered the finding potentially practice-changing, although further studies are needed because long-term follow-up data are currently limited.

The results with adagrasib are comparable to those with another KRASG12C inhibitor, sotorasib, reported earlier this year at the World Conference on Lung Cancer 2021. (2)

"Finding another promising targeted agent against KRASG12C mutant NSCLC sheds light on the treatment of these patients who currently have unmet medical need," said Ahn.

KRASG12C mutations occur in around 14% of patients with lung adenocarcinomas, the most common subtype of NSCLC, but there is currently no approved KRAS-targeting therapy.

"Having more KRASG12C inhibitors gives us additional opportunities to explore combinations of these inhibitors with other classes of agents, including immune checkpoint inhibitors as well as other small molecule MAP kinase inhibitor combinations," said Riely.

"The current data really set up future trials to establish the role for adagrasib in patients with KRASG12C mutant NSCLC. They provide a particular opportunity to explore this drug's activity in patients with KRASG12C mutant NSCLC that have been previously treated with platinum-based chemotherapies to potentially submit for regulatory approval." If approved, he suggested: "I think this would clearly set adagrasib as a preferred second-line therapy, compared with chemotherapy, for patients with KRAS mutant NSCLC."

"Given the low toxicity, adagrasib could potentially be combined with chemotherapy, immunotherapy or other molecules to increase activity in patients with KRASG12C mutant NSCLC," suggested Ahn.

Further data from KRYSTAL-1 showed an even greater response to adagrasib in the subpopulation of patients whose tumours had an STK11 mutation as well as a KRASG12C mutation. STK11 mutations have been associated with inferior responses to immune checkpoint inhibitors in patients with NSCLC. Riely noted: "Finding that the response rate was higher for patients with STK11 mutations suggests that this group of patients, who otherwise don't benefit from checkpoint inhibitors, may have even better response to adagrasib."

Credit: 
European Society for Medical Oncology

Study of NCOA3 yields novel findings of melanoma progression

For the first time, activation of nuclear receptor coactivator 3 (NCOA3) has been shown to promote the development of melanoma through regulation of ultraviolet radiation (UVR) sensitivity, cell cycle progression and circumvention of the DNA damage response. Results of a pre-clinical study led by Mohammed Kashani-Sabet, M.D., Medical Director of the Cancer Center at Sutter's California Pacific Medical Center (CPMC) in San Francisco, CA were published online today in Cancer Research, a journal of the American Association for Cancer Research.

"Our research suggests a previously unreported mechanism by which NCOA3 regulates the DNA damage response and acts as a potential therapeutic target in melanoma, whereby activation of NCOA3 contributes to melanoma development following exposure to ultraviolet light," says Dr. Kashani-Sabet, who collaborated with scientists at CPMC's Research Institute, the University Duisburg-Essen in Germany and the Knight Cancer Institute in Portland, OR for the study.

Epidemiological studies suggest a role for UVR in melanoma causation, supported by whole genome sequencing studies demonstrating a high burden of UV-signature mutations. But the precise molecular mechanisms by which melanoma develops following UVR remain poorly understood, necessitating the identification of additional molecular factors that govern both UV and melanoma susceptibility.

NCOA3 (also known as AIB1 or SRC-3) is a member of the nuclear hormone receptor coactivator family, and regulates gene expression through its interaction with various transcription factors. NCOA3 was initially shown to be amplified in breast cancer, and has a demonstrated oncogenic role in various solid tumors. However, a role for NCOA3 in UVR-mediated melanomagenesis has not been previously demonstrated.

By utilizing a combination of in vitro, in vivo and PDX modelling of melanomagenesis, Dr. Kashani and colleagues assessed the effects of regulating NCOA3 expression in human melanoma cells as well as in melanocytes, identifying multiple oncogenic pathways regulated by NCOA3 in melanoma progression.

Results showed that down-regulation of NCOA3 expression, either by genetic silencing or small molecule inhibition, significantly suppressed melanoma proliferation in melanoma cell lines and PDXs. NCOA3 silencing suppressed expression of xeroderma pigmentosum C and increased melanoma cell sensitivity to UVR. Suppression of NCOA3 expression led to activation of DNA damage response (DDR) effectors and reduced expression of cyclin B1, resulting in G2/M arrest and mitotic catastrophe.

A single nucleotide polymorphism in NCOA3 (T960T) was associated with decreased melanoma risk, given a significantly lower prevalence in a familial melanoma cohort than in a control cohort without cancer. Additional studies suggest this polymorphism decreases NCOA3 protein production, and is accompanied by increased sensitivity to ultraviolet light--resulting in cell death.

Taken together, these findings are consistent with a model of melanoma initiation whereby elevated NCOA3 expression promotes melanocyte survival following exposure to UVR. This survival advantage enables accumulation of UVR-mediated DNA damage. Over the lifetime of an individual who is susceptible to melanoma, significant exposure to UVR can result in both the high mutational burden and uncontrolled cellular proliferation that characterize the disease. By contrast, these effects are attenuated following expression of the T960T polymorphism, with increased sensitivity to UV-mediated cell death, thereby protecting against the carcinogenic effects of UVR.

"Our results demonstrate an unprecedented role for a molecular marker in distinct stages of tumor progression. These results identify NCOA3 as a candidate susceptibility marker for melanoma, as a potential diagnostic marker, as a prognostic marker of melanoma survival and as a target for therapy. We propose a critical role for NCOA3 in UVR-mediated melanomagenesis, and as a rational therapeutic target for melanoma," says Dr. Kashani-Sabet.

Skin cancer is the most common type of cancer worldwide. Melanoma accounts for about 1% of skin cancers but causes a large majority of skin cancer deaths. This year, approximately 106,000 Americans will be diagnosed with melanoma.

"We're committed to advancing this research to help guide and inform care for the treatment of Sutter patients with melanoma and other patients worldwide," says Dr. Kashani-Sabet.

Credit: 
Sutter Health

Researchers at Stanford and Carnegie Mellon reveal cost of key climate solution

Perhaps the best hope for slowing climate change - capturing and storing carbon dioxide emissions underground - has remained elusive due in part to uncertainty about its economic feasibility.

In an effort to provide clarity on this point, researchers at Stanford University and Carnegie Mellon University have estimated the energy demands involved with a critical stage of the process. (Watch video here: https://www.youtube.com/watch?v=-ZPIwwQs9aM)

Their findings, published April 8 in Environmental Science & Technology, suggest that managing and disposing of high salinity brines - a by-product of efficient underground carbon sequestration - will impose significant energy and emissions penalties. Their work quantifies these penalties for different management scenarios and provides a framework for making the approach more energy efficient.

"Designing massive new infrastructure systems for geological carbon storage with an appreciation for how they intersect with other engineering challenges--in this case the difficulty of managing high salinity brines--will be critical to maximizing the carbon benefits and reducing the system costs," said study senior author Meagan Mauter, an associate professor of Civil and Environmental Engineering at Stanford University.

Getting to a clean, renewable energy future won't happen overnight. One of the bridges on that path will involve dealing with carbon dioxide emissions - the dominant greenhouse gas warming the Earth - as fossil fuel use winds down. That's where carbon sequestration comes in. While most climate scientists agree on the need for such an approach, there has been little clarity about the full lifecycle costs of carbon storage infrastructure.

Salty challenge

An important aspect of that analysis is understanding how we will manage brines, highly concentrated salt water that is extracted from underground reservoirs to increase carbon dioxide storage capacity and minimize earthquake risk. Saline reservoirs are the most likely storage places for captured carbon dioxide because they are large and ubiquitous, but the extracted brines have an average salt concentration that is nearly three times higher than seawater.

These brines will either need to be disposed of via deep well injection or desalinated for beneficial reuse. Pumping them underground - an approach that has been used for oil and gas industry wastewater - has been linked to increased earthquake frequency and has led to significant public backlash. But desalinating the brines is significantly more costly and energy intensive due, in part, to the efficiency limits of thermal desalination technologies. It's an essential, complex step with a potentially huge price tag.
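Why the salinity matters so much can be seen from a back-of-envelope thermodynamic bound: the minimum energy to separate fresh water from a salt solution scales with the feed's osmotic pressure, which is roughly proportional to salt concentration (van 't Hoff relation). The numbers below are our illustration, not figures from the study, and real plants operate at several times this lower bound:

```python
# Back-of-envelope sketch: thermodynamic lower bound on desalination energy,
# approximated as the osmotic pressure of the feed (van 't Hoff relation).
# Concentrations are assumed round numbers, not values from the paper.

R = 8.314        # gas constant, J/(mol K)
T = 298.0        # temperature, K
VANT_HOFF_I = 2  # NaCl dissociates into two ions

def min_work_kwh_per_m3(molar_conc_mol_per_m3):
    """Minimum separation energy (kWh per m^3 of fresh water)."""
    pressure_pa = VANT_HOFF_I * molar_conc_mol_per_m3 * R * T
    return pressure_pa / 3.6e6  # 1 kWh = 3.6e6 J, and Pa == J/m^3

seawater = min_work_kwh_per_m3(600.0)   # ~35 g/L NaCl, typical seawater
brine = min_work_kwh_per_m3(1800.0)     # ~3x seawater, as for extracted brines

print(f"seawater minimum: {seawater:.2f} kWh/m^3")
print(f"extracted brine:  {brine:.2f} kWh/m^3")
```

Because the bound is linear in concentration, a brine at three times seawater salinity carries at least three times the minimum energy cost per unit of recovered water, before any real-world inefficiencies are counted.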

The big picture

The new study is the first to comprehensively assess the energy penalties and carbon dioxide emissions involved with brine management as a function of various carbon transport, reservoir management and brine treatment scenarios in the U.S. The researchers focused on brine treatment associated with storing carbon from coal-fired power plants because they are the country's largest sources of carbon dioxide, the most cost-effective targets for carbon capture and their locations are generally representative of the location of carbon dioxide point sources.

Perhaps unsurprisingly, the study found higher energy penalties for brine management scenarios that prioritize treatment for reuse. In fact, brine management will impose the largest post-capture and compression energy penalty on a per-tonne of carbon dioxide basis, up to an order of magnitude greater than carbon transport, according to the study.

"There is no free lunch," said study lead author Timothy Bartholomew, a former civil and environmental engineering graduate student at Carnegie Mellon University who now works for KeyLogic Systems, a contractor for the Department of Energy's National Energy Technology Laboratory. "Even engineered solutions to carbon storage will impose energy penalties and result in some carbon emissions. As a result, we need to design these systems as efficiently as possible to maximize their carbon reduction benefits."

The road forward

Solutions may be at hand.

The energy penalty of brine management can be reduced by prioritizing storage in low salinity reservoirs, minimizing the brine extraction ratio and limiting the extent of brine recovery, according to the researchers. They warn, however, that these approaches bring their own tradeoffs for transportation costs, energy penalties, reservoir storage capacity and safe rates of carbon dioxide injection into underground reservoirs. Evaluating the tradeoffs will be critical to maximizing carbon dioxide emission mitigation, minimizing financial costs and limiting environmental externalities.

"There are water-related implications for most deep decarbonization pathways," said Mauter, who is also a fellow at the Stanford Woods Institute for the Environment. "The key is understanding these constraints in sufficient detail to design around them or develop engineering solutions that mitigate their impact."

Credit: 
Stanford University

DNA damage 'hot spots' discovered within neurons

image: Neurons (labeled in purple) show signs of an active DNA repair process (labeled in yellow). The cells' DNA itself is labeled in cyan (in this image, overlap between cyan and yellow appears green).

Image: 
Image courtesy of Ward lab, NINDS

Researchers at the National Institutes of Health (NIH) have discovered specific regions within the DNA of neurons that accumulate a certain type of damage (called single-strand breaks or SSBs). This accumulation of SSBs appears to be unique to neurons, and it challenges what is generally understood about the cause of DNA damage and its potential implications in neurodegenerative diseases.

Because neurons require considerable amounts of oxygen to function properly, they are exposed to high levels of free radicals--toxic compounds that can damage DNA within cells. Normally, this damage occurs randomly. However, in this study, damage within neurons was often found within specific regions of DNA called "enhancers" that control the activity of nearby genes.

Fully mature cells like neurons do not need all of their genes to be active at any one time. One way that cells can control gene activity involves the presence or absence of a chemical tag called a methyl group on a specific building block of DNA. Closer inspection of the neurons revealed that a significant number of SSBs occurred when methyl groups were removed, which typically makes that gene available to be activated.

An explanation proposed by the researchers is that the removal of the methyl group from DNA itself creates an SSB, and neurons have multiple repair mechanisms at the ready to repair that damage as soon as it occurs. This challenges the common wisdom that DNA damage is inherently a process to be prevented. Instead, at least in neurons, it is part of the normal process of switching genes on and off. Furthermore, it implies that defects in the repair process, not the DNA damage itself, can potentially lead to developmental or neurodegenerative diseases.

This study was made possible through the collaboration between two labs at the NIH: one run by Michael E. Ward, M.D., Ph.D. at the National Institute of Neurological Disorders and Stroke (NINDS) and the other by Andre Nussenzweig, Ph.D. at the National Cancer Institute (NCI). Dr. Nussenzweig developed a method for mapping DNA errors within the genome. This highly sensitive technique requires a considerable number of cells in order to work effectively, and Dr. Ward's lab provided the expertise in generating a large population of neurons using induced pluripotent stem cells (iPSCs) derived from one human donor. Keith Caldecott, Ph.D. at the University of Sussex also provided his expertise in single strand break repair pathways.

The two labs are now looking more closely at the repair mechanisms involved in reversing neuronal SSBs and the potential connection to neuronal dysfunction and degeneration.

Credit: 
NIH/National Institute of Neurological Disorders and Stroke

New biomarkers of malignant melanoma identified

image: Graphic showing the research project on the metabolomic profile of exosomes isolated from CSCs and blood from patients with malignant melanoma. From cell cultures of CSCs and differentiated tumour cells, and from the blood of patients with malignant melanoma vs. healthy controls, the exosomes were extracted and characterized, and their molecular composition was analysed to identify characteristic molecules of CSCs differentially present in the blood of malignant melanoma patients, with the potential to act as diagnostic biomarkers of the disease.

Image: 
University of Granada

Their study has shown that these malignant melanoma vesicles produced by CSCs have a different molecular composition from that of differentiated tumour cells. These molecules were also found to be detectable in exosomes present in the blood, and they presented differences in patients with malignant melanoma compared to healthy individuals. This makes them potentially suitable as biomarkers for the diagnosis and prognosis of this disease.

The results have been published in the prestigious scientific journal Molecular Oncology.

Malignant melanoma is one of the most aggressive types of skin cancer and its prevalence has been increasing worldwide in recent years. Among the factors that contribute to the life-threatening nature and severity of this disease are the late appearance of the first symptoms, the lack of effective treatments, its high metastasis capacity, and also the difficulty of detecting this particular cancer. Unfortunately, the diagnosis of malignant melanoma therefore continues to be problematic due to the lack of indicators--known as biomarkers--to accurately signal the early stages of this disease and predict how it might evolve in a given patient, once detected.

These characteristics, which make this type of cancer such a serious disease, may be partly attributable to so-called cancer stem cells (CSCs), a sub-population of cells that exist in tumours and that present the typical characteristics of stem cells. They are responsible for tumour initiation, maintenance, and progression, as well as metastasis and recurrence--even years after a tumour has been eradicated.

Now, a team of scientists led by Professor Juan Antonio Marchal Corrales of the Department of Human Anatomy and Embryology at the University of Granada (UGR) and Director of the "Doctores Galera y Requena" Chair in Research on Cancer Stem Cells, pertaining to the Biohealth Research Institute in Granada (ibs.GRANADA) and the MNat Scientific Unit of Excellence (ModelingNature), has studied these CSCs--specifically, the microvesicles that act as "messengers" for these cells. Known as exosomes, these vesicles are produced by cells and sent to other cells and tissues, which communicate via the transfer of certain biomolecules, thereby promoting the emergence of metastases.

These exosomes have been shown to be involved in many tumour processes. Because cells release them and they circulate via the bloodstream, they offer a very interesting source of biomarkers, as they can be easily isolated from a blood sample. This study focused on the molecular characterization of exosomes produced by CSCs and isolated from the blood of patients with malignant melanoma. Metabolomic techniques were used to analyse the molecular profile of biological systems in order to identify possible biomarkers for the diagnosis of this disease.

This study is the result of extensive multidisciplinary work in which translational researchers, bioinformaticians, and clinical researchers have joined forces to take another step in the field of Personalized Medicine or Precision Medicine in Oncology. The team comprises members from the UGR; Fundación MEDINA (led by Francisca Vicente and José Pérez del Palacio, Area Head and Principal Investigator of the Screening Department, respectively); the "Virgen de las Nieves" and "San Cecilio" Teaching Hospitals in Granada (all members of ibs.GRANADA); the University of Vigo; and the Spanish National Cancer Research Centre (CNIO).

Among its findings, the study showed that the molecular composition of exosomes produced by CSCs is different from those released by differentiated tumour cells. To investigate this, using a primary patient-derived malignant melanoma cell line enriched in CSCs, both types of cells were cultivated in large quantities and the exosomes that they produced and released into the culture were isolated. Once the properties and characteristics of both the cells and the exosomes they produced had been tested, a metabolomic analysis was carried out. This enabled the molecules (metabolites) present in the biological sample to be studied. After the molecules had been detected and extracted using a mass spectrometer, which quantifies them with great precision, a series of statistical analyses were carried out to determine which molecules were found in the highest concentration in the exosomes of each cell type. Thus, the researchers tentatively identified some lipidic metabolites differentially present in exosomes of CSCs and differentiated tumour cells.
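The kind of comparison described above can be sketched in miniature: for each detected metabolite, compare its exosomal intensity between the two groups with a fold change and a two-sample statistic. The numbers below are hypothetical, chosen only to illustrate the workflow, not data from the study:

```python
# Simplified sketch of differential metabolite analysis (hypothetical
# log2 intensities, not the study's data): a log2 fold change plus
# Welch's two-sample t statistic, computed with the standard library.

import math
from statistics import mean, variance

def welch_t(a, b):
    """Welch's two-sample t statistic (allows unequal variances)."""
    se = math.sqrt(variance(a) / len(a) + variance(b) / len(b))
    return (mean(a) - mean(b)) / se

patients = [9.1, 8.7, 9.4, 9.0, 8.9]   # hypothetical exosomal intensities
controls = [7.8, 8.0, 7.6, 7.9, 8.1]

log2_fc = mean(patients) - mean(controls)  # log2 fold change between groups
t = welch_t(patients, controls)
print(f"log2 fold change = {log2_fc:.2f}, Welch t = {t:.1f}")
```

In practice each of the hundreds of detected metabolites gets such a comparison, and the resulting p-values are corrected for multiple testing before any metabolite is nominated as a candidate biomarker.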

Metabolomic profile

Subsequently, and following the same scientific approach, a similar study was carried out comparing the metabolomic profile of exosomes isolated from the blood of patients with malignant melanoma in different stages and healthy individuals who acted as controls. The study concluded that certain metabolites, including some of those previously identified in CSCs, were also present in exosomes isolated from blood in different concentrations among melanoma patients and healthy individuals. By means of the corresponding statistical models, these molecules and their different concentrations in blood made it possible to distinguish individuals with malignant melanoma from those without the disease. This makes them suitable candidates for acting as potential biomarkers for its diagnosis.

However, the authors emphasise that this study is only a first step. The identification of some of these molecules, the complete characterization of those already tentatively identified, and the replication of the study with a greater number of samples to validate and verify their clinical application as biomarkers all remain pending.

Studies such as this constitute a new avenue for the discovery of cancer biomarkers aimed at improving early diagnosis, prognosis, and treatment-response prediction. And, of course, these results can be extrapolated to many other tumours, in the quest to identify biomarkers that help us better understand the pathogenesis of these diseases and achieve personalized precision medicine.

Credit: 
University of Granada