Seeding oceans with iron may not impact climate change

Historically, the oceans have done much of the planet's heavy lifting when it comes to sequestering carbon dioxide from the atmosphere. Microscopic organisms known collectively as phytoplankton, which grow throughout the sunlit surface oceans and absorb carbon dioxide through photosynthesis, are key players.

To help stem escalating carbon dioxide emissions produced by the burning of fossil fuels, some scientists have proposed seeding the oceans with iron -- an essential ingredient that can stimulate phytoplankton growth. Such "iron fertilization" would cultivate vast new fields of phytoplankton, particularly in areas normally bereft of marine life.

A new MIT study suggests that iron fertilization may not have a significant impact on phytoplankton growth, at least on a global scale.

The researchers studied the interactions between phytoplankton, iron, and other nutrients in the ocean that help phytoplankton grow. Their simulations suggest that on a global scale, marine life has tuned ocean chemistry through these interactions, evolving to maintain a level of ocean iron that supports a delicate balance of nutrients in various regions of the world.

"According to our framework, iron fertilization cannot have a significant overall effect on the amount of carbon in the ocean because the total amount of iron that microbes need is already just right,'' says lead author Jonathan Lauderdale, a research scientist in MIT's Department of Earth, Atmospheric and Planetary Sciences.

The paper's co-authors are Rogier Braakman, Gael Forget, Stephanie Dutkiewicz, and Mick Follows at MIT.

Ligand soup

The iron that phytoplankton depend on to grow comes largely from dust that sweeps over the continents and eventually settles in ocean waters. While huge quantities of iron can be deposited in this way, the majority of this iron quickly sinks, unused, to the seafloor.

"The fundamental problem is, marine microbes require iron to grow, but iron doesn't hang around. Its concentration in the ocean is so miniscule that it's a treasured resource," Lauderdale says.

Hence, scientists have put forth iron fertilization as a way to introduce more iron into the system. But iron availability to phytoplankton is much higher if it is bound up with certain organic compounds that keep iron in the surface ocean and are themselves produced by phytoplankton. These compounds, known as ligands, constitute what Lauderdale describes as a "soup of ingredients" that typically come from organic waste products, dead cells, or siderophores -- molecules that the microbes have evolved to bind specifically with iron.

Not much is known about these iron-trapping ligands at the ecosystem scale, and the team wondered what role the molecules play in regulating the ocean's capacity to promote the growth of phytoplankton and ultimately absorb carbon dioxide.

"People have understood how ligands bind iron, but not what are the emergent properties of such a system at the global scale, and what that means for the biosphere as a whole," Braakman says. "That's what we've tried to model here."

Iron sweet spot

The researchers set out to characterize the interactions between iron, ligands, and macronutrients such as nitrogen and phosphate, and how these interactions affect the global population of phytoplankton and, in turn, the ocean's capacity to store carbon dioxide.

The team developed a simple three-box model, with each box representing a general ocean environment with a particular balance of iron versus macronutrients. The first box represents remote waters such as the Southern Ocean, which typically have a decent concentration of macronutrients that are upwelled from the deep ocean. They also have a low iron content given their great distance from any continental dust source.

The second box represents the North Atlantic and other waters that have an opposite balance: high in iron because of proximity to dusty continents, and low in macronutrients. The third box is a stand-in for the deep ocean, which is a rich source of macronutrients, such as phosphates and nitrates.

The researchers simulated a general circulation pattern between the three boxes to represent the global currents that connect all the world's oceans: The circulation starts in the North Atlantic and dives down into the deep ocean, then upwells into the Southern Ocean and returns back to the North Atlantic.

The team set relative concentrations of iron and macronutrients in each box, then ran the model to see how phytoplankton growth evolved in each box over 10,000 years. They ran 10,000 simulations, each with different ligand properties.
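A minimal sketch of the kind of box model described here can make the setup concrete. The code below is illustrative only, not the authors' model: the parameter values, rate laws, and units are invented, but the structure (three boxes, a ligand-iron feedback, and an overturning circulation) follows the description above.

```python
import numpy as np

# Box indices: 0 = Southern Ocean, 1 = North Atlantic, 2 = deep ocean.
N  = np.array([20.0,  2.0, 30.0])   # macronutrients (arbitrary units)
Fe = np.array([ 0.1,  1.0,  0.5])   # dissolved iron
Lg = np.array([ 0.5,  0.5,  0.5])   # ligands

dust = np.array([0.01, 0.20, 0.0])  # aeolian iron supply (high near continents)
circ, dt = 0.05, 1.0                # overturning rate, time step

for _ in range(10_000):             # "10,000 years" of simple Euler steps
    # Phytoplankton growth is co-limited by iron and macronutrients
    # (Liebig-style minimum); no growth in the dark deep box.
    growth = 0.1 * np.minimum(N / (N + 1.0), Fe / (Fe + 0.1))
    growth[2] = 0.0

    # Feedback: growth produces ligands; ligands shield iron from
    # scavenging, keeping it in solution and enabling more growth.
    Lg += dt * (0.05 * growth - 0.001 * Lg)
    scavenging = 0.1 * Fe / (1.0 + 5.0 * Lg)

    N  += dt * (-growth)
    Fe += dt * (dust - 0.01 * growth - scavenging)

    # Overturning: N. Atlantic -> deep ocean -> Southern Ocean -> N. Atlantic.
    for tracer in (N, Fe, Lg):
        tracer += dt * circ * (tracer[[2, 0, 1]] - tracer)

print("final macronutrients:", N.round(2), "iron:", Fe.round(3))
```

Sweeping the ligand parameters in a sketch like this is what, in the real study, exposed the narrow "sweet spot" discussed below.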

Out of their simulations, the researchers identified a crucial positive feedback loop between ligands and iron. Oceans with higher concentrations of ligands also had more iron available for phytoplankton, which grew and produced still more ligands. And when microbes have more than enough iron, they consume the other nutrients they need, such as nitrogen and phosphate, until those nutrients are completely depleted.

The opposite is true for oceans with low ligand concentrations: These have less iron available for phytoplankton growth, and therefore have very little biological activity in general, leading to less macronutrient consumption.

The researchers also observed in their simulations a narrow range of ligand concentrations that produced a sweet spot, where there was just enough ligand to make sufficient iron available for phytoplankton growth, while leaving just enough macronutrients to sustain a whole new cycle of growth across all three ocean boxes.

When they compared their simulations to measurements of nutrient, iron, and ligand concentrations taken in the real world, they found that the simulated sweet-spot range was the closest match. That is, the world's oceans appear to have just the right amount of ligands, and therefore iron, available to maximize the growth of phytoplankton and optimally consume macronutrients, in a self-reinforcing and self-sustaining balance of resources.

If scientists were to widely fertilize the Southern Ocean or any other iron-depleted waters with iron, the effort would temporarily stimulate phytoplankton to grow and take up all the macronutrients available in that region. But eventually there would be no macronutrients left to circulate to other regions like the North Atlantic, which depends on these macronutrients, along with iron from dust deposits, for phytoplankton growth. The net result would be an eventual decrease in phytoplankton in the North Atlantic and no significant increase in carbon dioxide draw-down globally.

Lauderdale points out there may also be other unintended effects of fertilizing the Southern Ocean with iron.

"We have to consider the whole ocean as this interconnected system," says Lauderdale, who adds that if phytoplankton in the North Atlantic were to plummet, so too would all the marine life on up the food chain that depends on the microscopic organisms.

"Something like 75 percent of production north of the Southern Ocean is fueled by nutrients from the Southern Ocean, and the northern oceans are where most fisheries are and where many ecosystem benefits for people occur," Lauderdale says. "Before we dump loads of iron and draw down nutrients in the Southern Ocean, we should consider unintended consequences downstream that potentially make the environmental situation a lot worse."

Credit: 
Massachusetts Institute of Technology

Study identifies states with highest rates of melanoma due to ultraviolet radiation

A new study finds wide state-by-state variation in rates of melanoma caused by ultraviolet (UV) exposure, with the highest rates in several states on the East and West Coasts, including Hawaii, but also in a few landlocked states, including Utah, Vermont, and Minnesota. The report, appearing in the International Journal of Cancer, finds state-level incidence rates for UV-attributable melanoma ranged from 15 cases per 100,000 in Alaska to 65 cases per 100,000 in Hawaii. The authors say variations between states likely reflect a combination of the strength of the sun's rays, participation in outdoor activities, sun protection, indoor tanning, and early detection.

For the new study, investigators led by Farhad Islami, M.D., Ph.D., estimated the number, proportion, and incidence rates of malignant melanomas attributable to UV radiation in each U.S. state. They did so by calculating the difference between observed melanomas during 2011-2015 and a baseline of expected cases.

Estimating the contribution of UV exposure required a novel approach. Without a population completely unexposed to UV radiation, researchers used the best data available: historical melanoma incidence rates from 1942-1954 in Connecticut, which had the country's first statewide population-based cancer registry and is in a high-latitude (generally lower-UV) environment. For most adults, melanomas diagnosed during those years likely reflected UV exposure accumulated in the 1930s or earlier, when exposure was minimized by clothing styles that covered more skin and by limited recreational sun exposure. This reference population served as the theoretical minimum UV exposure.

UV exposure accounted for 91.0% (338,701/372,335) of the total melanoma cases diagnosed during 2011-2015 in the United States; 94.3% (319,412) of UV-attributable cases occurred in non-Hispanic whites.
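The headline fractions follow directly from the reported counts; a quick back-of-the-envelope check:

```python
# Check of the reported fractions (case counts taken from the study).
total_melanomas     = 372_335   # melanomas diagnosed 2011-2015, United States
uv_attributable     = 338_701   # observed minus expected baseline cases
nonhisp_white_cases = 319_412   # UV-attributable cases in non-Hispanic whites

print(f"UV-attributable share of all cases: {uv_attributable / total_melanomas:.1%}")     # 91.0%
print(f"non-Hispanic white share of those:  {nonhisp_white_cases / uv_attributable:.1%}") # 94.3%
```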

To highlight state differences, researchers reported results for non-Hispanic whites rather than the total population, because a lower burden in some states could largely reflect higher proportions of non-whites in the population. Melanoma incidence rates in the United States are lowest in blacks (1.0 per 100,000) and are also substantially lower in other minorities (e.g., 4.5 per 100,000 in Hispanics) than in non-Hispanic whites (27.2 per 100,000).

By state, the attributable age-standardized rate among non-Hispanic whites ranged from 15.1 per 100,000 in Alaska to 65.1 in Hawaii. Multiple states along the East and West Coast had UV-attributable incidence rates exceeding 25 per 100,000 among non-Hispanic whites: Delaware (37.1), Georgia (36.5), California (33.8), Maryland (32.6), North Carolina (29.5), Florida (29.2), Oregon (28.5), South Carolina (28.1), Washington (27.8), New Jersey (27.7), New Hampshire (26.5). Rates were also above 25 per 100,000 in Alabama (25.4) and several landlocked states: Utah (40.4), Vermont (31.4), Minnesota (27.9), Idaho (27.6), Kentucky (25.7), and Colorado (24.5).

In addition to states with a high UV index like Hawaii, California, and Florida, UV-attributable melanoma rates are high in many states with relatively low UV index, such as Minnesota and Idaho, likely reflecting high prevalence of outdoor activities (e.g., going to beaches, lakes, or outdoor swimming pools; recreational boating; skiing; and perhaps occupational activities such as farming) and insufficient sun protection. Many UV-related melanomas are preventable using appropriate measures.

The report also finds higher UV-attributable melanoma burden in younger females than males. "High indoor tanning prevalence among teen girls in the late 1990s is likely a contributing factor," said Dr. Islami.

Credit: 
American Cancer Society

Earth's glacial cycles enhanced by Antarctic sea-ice

image: North-South ocean currents are shown at different depths in the ocean, with their strength and direction indicated by the arrows. The density of the water increases with depth, from low (orange) to high (brown). During cold climates (right), sea ice around Antarctica grows, preventing some outgassing of carbon from the ocean to the atmosphere. Also, brine formation increases, which causes the Antarctic Bottom Water to become more dense, decreasing mixing with waters above. The two processes result in more carbon stored in the deep ocean.

Image: 
IBS

During past glacial periods the Earth was about 6°C colder and the Northern Hemisphere continents were covered by ice sheets up to 4 kilometers thick. However, the Earth would not have been so cold, nor the ice sheets so immense, if it were not for the effects of sea ice on the other side of the planet.

This is the conclusion of a study published this week in the Proceedings of the National Academy of Sciences of the United States of America by a team of scientists from the IBS Center for Climate Physics (ICCP) in Busan, South Korea, and the University of Hawaii at Manoa in Honolulu, HI, USA. In the study, the scientists investigated what role sea ice (frozen ocean water) in the Southern Ocean surrounding Antarctica played in past climate transitions. They found that under glacial conditions sea ice not only inhibits outgassing of carbon dioxide from the surface ocean to the atmosphere, but it also increases storage of carbon in the deep ocean. These processes lock away extra carbon in the ocean that would otherwise escape to the atmosphere as CO2, warm the planet, and reduce glacial amplitudes.

We know from air bubbles trapped in ice cores that the concentration of carbon dioxide in the atmosphere during cold glacial periods was 80-100 parts per million (ppm) lower than the pre-industrial level (280 ppm). Because the ice sheets also reduced the amount of carbon stored on land, the missing carbon must have been stored away in the ocean. It remained unclear for many decades what processes were responsible for this massive reorganization of the global carbon cycle during glacial periods, but scientists suspected that the Southern Ocean likely played an important role, due to two unique features. First, the densest, and therefore deepest, water in the ocean is formed near Antarctica, appropriately named "Antarctic Bottom Water". Second, it is the only place where deep ocean waters can move freely to the surface due to the action of winds. As a result, "Processes that occur on the surface in the Southern Ocean have a profound effect on the deep ocean and the amount of carbon that is stored there," explains Dr. Karl Stein, ICCP scientist and lead author of the study.

In turn, changes in the extent of sea ice in the Southern Ocean impact the carbon storage through both deep water formation and interaction with upwelling water. Sea ice contains very little salt, so when ocean water freezes into ice the leftover water is an extremely salty brine. This cold, salty water is very dense, sinking to the ocean bottom and forming Antarctic bottom water. As the climate gets colder, more sea ice formation occurs and more brine and heavier bottom waters are formed. The sea ice eventually grows under glacial conditions until it covers a large portion of the Southern Ocean. This means that the water upwelling from the deep ocean reaches the surface under sea ice. "Deep ocean waters store large amounts of carbon, so prior to large scale fossil fuel burning, the water upwelling in the Southern Ocean was a source of carbon to the atmosphere," explains Dr. Eun Young Kwon, Associate Project Lead of ICCP and co-author of the study. She adds, "If sea ice covered the area of upwelling waters under glacial conditions, it could act as a lid to the outgassing of carbon dioxide".

To investigate the physical effects of sea ice on the ocean, the team used a climate computer model to conduct simulations that covered the last 784,000 years of Earth's climate history, encompassing the last eight glacial cycles. "The model experiment is unique because previous studies only covered a single period in time, typically the Last Glacial Maximum snapshot 21,000 years ago, or used models that were too simple to capture these Southern Ocean processes," says Tobias Friedrich, co-author of the study. "This allowed us for the first time to look at the timing of the sea ice impacts, in addition to assessing their magnitude." The team used a separate model for the carbon cycle to quantify the impacts of sea ice and ocean circulation changes on atmospheric carbon dioxide.

Their results show that sea ice has the largest impact on carbon storage via the formation of Antarctic Bottom Water, driving a 30 ppm drawdown of atmospheric CO2. "The increased sea ice formation during glacial periods causes an increase in the density difference between the bottom water and the water above," says Dr. Axel Timmermann, co-author of the study and ICCP Director. "The larger the difference in density between two water masses, the more difficult it is to mix them." The reduced mixing means more carbon can be stored in the deep ocean. Importantly, this process is related to the creation of sea ice in the Southern Ocean, which can occur early within a glacial cycle. Later in the glacial cycle, the sea ice covers a large enough area of the Southern Ocean that it can "cap" the carbon dioxide outgassing from the upwelling water, causing a further 10 ppm reduction in atmospheric CO2. Together, these two mechanisms account for roughly 40 ppm, about half of the 80-100 ppm glacial-interglacial difference recorded in ice cores.

"The results show that Southern Ocean sea ice can respond quickly to the climate cooling, strongly amplifying glacial cycles," says Karl Stein. However, much more work needs to be done before the glacial climate-carbon cycle puzzle is complete. "We still don't know how the initial cooling and reduction of atmospheric carbon gets triggered, but we think it is related to the growth of the ice sheets in the Northern Hemisphere and the corresponding changes in ocean salinity", explains Axel Timmermann.

Credit: 
Institute for Basic Science

Nanolaminate-based design for UV laser mirror coating

image: a, Traditional design using pairs of high- and low-index materials. b, Proposed strategy using nanolaminate layers and low-index materials. c, Reflectance and d, transmittance spectra (45° angle of incidence; solid lines: s-polarized light, dotted lines: p-polarized light). e, Single-pulse damage probability as a function of the input fluence.

Image: 
by Meiping Zhu, Nuo Xu, Behshad Roshanzadeh, S. T. P. Boyd, Wolfgang Rudolph, Yingjie Chai, and Jianda Shao

The demand for laser-resistant mirror coatings is increasing in inertial confinement fusion, the Extreme Light Infrastructure, and other laser applications. The ideal UV laser mirror (UVLM) coating requires high reflectivity with large bandwidth and a high laser-induced damage threshold (LIDT). Unfortunately, these requirements are difficult to satisfy simultaneously. This is due, for example, to the fact that high reflectivity requires high-refractive-index (n) materials, while higher-n materials tend to have a smaller optical bandgap and therefore a lower LIDT. Traditionally, UVLMs were achieved by depositing laser-resistant layers on highly reflective layers, but this approach forces compromises between the seemingly contradictory requirements.

In a new paper published in Light: Science & Applications, scientists from the Laboratory of Thin Film Optics, Shanghai Institute of Optics and Fine Mechanics, Chinese Academy of Sciences, China, the Department of Physics and Astronomy, University of New Mexico, USA, and co-workers proposed a "reflectivity and laser-resistance in one" design using tunable nanolaminate layers (NLD coating). An Al2O3-HfO2 nanolaminate-based mirror coating for UV laser applications was experimentally demonstrated using e-beam deposition. The bandwidth over which the reflectance exceeds 99.5% is more than twice that of a traditional "reflectivity bottom and LIDT top" combination design (TCD coating) mirror of comparable overall thickness, and the LIDT is increased by a factor of ~1.3 for 7.6 ns pulses at a wavelength of 355 nm. The reported concept, with its improved performance parameters, paves the way toward a new generation of UV coatings for high-power laser applications.

The proposed new structure replaces the high-n materials in the traditional designs with nanolaminate layers. These scientists summarize the principle of their design structure:

"The (average) refractive index and optical bandgap can be tuned by adjusting the thickness ratio of the two materials in the nanolaminate layers, while keeping the total optical thickness constant." "The proposed method enables UVML coatings with larger high reflectivity bandwidth, higher LIDT and smaller transmission ripples in the VIS-NIR region compared to traditional designs of comparable overall thickness."

"Compared to the TCD coating, the NLD coating has a lower E-field intensification, a faster E-field decay with depth and smaller absorption, which are consistent with the observed higher LIDT." they added.

"The e-beam deposited nanolaminate materials can be used for large-size (meter-scale) UVML coatings. We believe that the described concept opens new avenues for improved UV coatings and can benefit many areas of laser technology that rely on high-quality optical coatings." the scientists forecast.

Credit: 
Light Publishing Center, Changchun Institute of Optics, Fine Mechanics And Physics, CAS

The origins of roughness

image: Surfaces of different materials always develop surface roughness with identical statistical properties.

Image: 
Photo: AG Pastewka

Most natural and artificial surfaces are rough: metals and even glasses that appear smooth to the naked eye can look like jagged mountain ranges under the microscope. There is currently no uniform theory about the origin of this roughness, despite it being observed on all scales, from the atomic to the tectonic. Scientists suspect that the roughness is formed by irreversible plastic deformation, which occurs in many processes of mechanical machining of components, such as milling. Prof. Dr. Lars Pastewka from the Simulation group at the Department of Microsystems Engineering at the University of Freiburg and his team have reproduced such mechanical loads in computer simulations. The researchers found that surfaces made of different materials, which show distinct mechanisms of plastic deformation, always develop surface roughness with identical statistical properties. They published their results in the freely accessible online journal Science Advances.

Geological surfaces, such as mountain ranges, are created by mechanical deformation, which then leads to processes such as fracture or wear. Synthetic surfaces typically go through many steps of shaping and finishing, such as polishing, lapping, and grinding, explains Pastewka. Most of these surface changes, whether natural or synthetic, lead to plastic deformations on the smallest atomic length scale: "Even at the crack tips of most brittle materials such as glass, there is a finite process zone in which the material is plastically deformed," says the Freiburg researcher. "Roughness on these smallest scales is important because it controls the area of intimate atomic contact when two surfaces are pressed together and thus adhesion, conductivity and other functional properties of surfaces in contact."

In collaboration with colleagues from the Karlsruhe Institute of Technology, the École Polytechnique Fédérale de Lausanne in Switzerland, and Sandia National Laboratories in the USA, and funded by the European Research Council (ERC), Pastewka and his group simulated the surface topography of three reference material systems on the JUQUEEN and JUWELS supercomputers at the Jülich Supercomputing Centre: monocrystalline gold; a high-entropy alloy of nickel, iron and titanium; and the metallic glass copper-zirconium, in which the atoms do not form ordered structures but an irregular pattern. Each of these three materials is known to have different micromechanical or molecular properties. The scientists then investigated the mechanism of the deformation and the resulting changes at the atomic scale, both within the solid and on its surface.

Pastewka, who is also a member of the Cluster of Excellence Living, Adaptive and Energy-autonomous Material Systems (livMatS), and his team found that despite their different structures and material properties, all three systems, when compressed, develop rough surfaces with a so-called self-affine topography. This means that the systems have statistically identical geometric structures regardless of the scale on which they are observed: Surface topography in a virtual microscope at the nanometer scale cannot be distinguished from the structure of mountain landscapes at the kilometer scale. "This is one explanation," says Pastewka, "as to why an almost universal structure of surface roughness is observed in experiments."
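To make "self-affine" concrete, here is a small illustrative sketch (not the study's code): it synthesizes a random 1D profile with a power-law spectrum and shows that the measured roughness grows with the observation window as a power law, the signature of self-affinity. The Hurst exponent H = 0.8 is a commonly quoted value for rough surfaces, chosen here purely for illustration.

```python
# Toy self-affine profile: height fluctuations grow with the observation
# window roughly as L**H (H = Hurst exponent). Generated by spectral
# synthesis with a power spectrum ~ q**-(1 + 2H).
import numpy as np

rng = np.random.default_rng(0)
n, H = 2**14, 0.8
q = np.fft.rfftfreq(n)
q[0] = q[1]                                  # avoid division by zero at q = 0
amp = q ** (-(0.5 + H))                      # amplitude ~ sqrt(PSD) ~ q**-(1/2 + H)
phase = rng.uniform(0, 2 * np.pi, len(q))
h = np.fft.irfft(amp * np.exp(1j * phase), n)

for L in (64, 256, 1024, 4096):
    windows = h[: n // L * L].reshape(-1, L)
    w = np.mean(windows.std(axis=1))         # mean RMS roughness per window
    print(f"window {L:5d}: rms ~ {w:.4f}")   # grows roughly as L**0.8
```

Zooming in or out on such a profile (with the appropriate vertical rescaling) gives a statistically indistinguishable picture, which is exactly the nanometer-to-kilometer equivalence described above.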

Credit: 
University of Freiburg

Off-grid sanitation systems show promise, despite toilet paper

image: This conveyor of rubber bands separates solid and liquid waste effectively in an experimental sanitation system at a test site in India. But at a facility in South Africa, where the culture uses toilet paper, the results aren't as good. This was one of many lessons learned from two eight-month-long field trials for sanitation systems sponsored by the Bill and Melinda Gates Foundation.

Image: 
Brian Hawkins, Duke University

DURHAM, N.C. -- As legend has it, when French workers felt their livelihoods threatened by automation in the early 1900s, they flung their wooden shoes called sabots into the machines to stop them. Hence the word sabotage.

Instead of wasting good footwear, perhaps they should've tried wet toilet paper.

In two new papers published in the journal Science of the Total Environment, engineers from Duke University report on results from the first large-scale, real-world field trials of critical components of their off-grid sanitation system. Since 2011, the Bill & Melinda Gates Foundation's "Reinvent the Toilet" initiative has invested more than $200 million to fund these and other efforts to create small-scale sanitation systems to serve the needs of the 4.2 billion people who lack safely managed sanitation worldwide.

While their nutrient removal processes need improvement, the researchers say they were pleasantly surprised at how long the system's components lasted. They were also reminded of just how important cultural practices can be to the success of a global engineering challenge.

"The first step in our reinvented toilet system separated the solid and liquid waste through a conveyer belt made of rubber bands," said Brian Hawkins, research scientist in the Duke Center for Water, Sanitation, Hygiene and Infectious Disease (WaSH-AID). "It works great in India where they have a washing culture, but in South Africa, where they have a wiping culture, the toilet paper got into the mix and gummed up all of the gears. That resulted in the system needing cleaning every couple of days, which is not sustainable."

For the Gates-funded sanitation systems to be considered successful, they must not only remove pathogens from human waste and recover resources such as energy, clean water and nutrients, but also do this "off the grid," without access to external electricity or water sources. To top it all off, the systems must cost less than five cents per user, per day -- that's 200 users for $10.

Along with Jeffrey Glass and Brian Stoner, both faculty members in Duke's Department of Electrical & Computer Engineering, Hawkins has worked on the system at RTI International and Duke Engineering for almost a decade. Now that the liquid treatment system has been tested at large scale in real-life settings, the team's dedication appears to be paying off.

"The big thing we were trying to find out is how long we can run this system before it needs critical maintenance," said Hawkins. "And you want to do that in as close to real-life situations as possible."

Throughout much of 2018, the Duke researchers installed a prototype waste treatment system at two locations -- a women's dormitory at a textile mill in Coimbatore, India, and a communal ablution block at the edge of Durban, South Africa. Over the course of eight months, both systems served up to 50 potential users at any given time, processing more than 11,000 liters of waste throughout the field trials.

While the problem of toilet paper was an important discovery, the primary purpose of the trials was to test the longevity of the critical components of the liquid treatment process. After solids are separated from the liquids, they are pushed through a large activated carbon filter.

While this removes almost all of the biosolids, it does not disinfect the water or remove any dissolved salts. But that's okay, because the researchers can make use of one problem to solve the other. Electricity is run through the water to break down the molecular bonds within the remaining salts to produce chlorine-containing oxidants, a powerful disinfectant.

"Electrochemically treating hazardous waste has a lot of benefits," said Glass. "You don't have to ship or handle any chemicals for the disinfection, and the process can be very efficient."

Depending on the specific chemistry of the water, however, the electrical leads needed for this process can corrode quickly. The researchers also weren't sure how much use the activated carbon filters could handle before needing to be replaced. But the field trials helped alleviate those fears.

"Both trials went for eight months and the filters never failed. I had it in my head that it'd be great if they could last a year, and so far it looks like they might be able to," said Hawkins. "So long as we cleaned the electrical system periodically, it lasted for hundreds of hours of service. That's a component that, if you had to replace it every six months, it's not going to work out. But if you can make it last three or four years, that's okay."

In general, the field trial results were promising in regard to the system's potential maintenance needs and performance. Duke's prototype systems met all of the biological parameters and three out of the five chemical standards for liquid effluent according to recently released stand-alone sewage treatment requirements from the International Organization for Standardization (ISO). However, they fell short of the standards for remaining amounts of phosphorus and nitrogen.

"These two chemicals are of particular concern when you're discharging water into lakes and streams because too much of either can cause algal blooms," said Hawkins. "There are efficient ways of removing phosphorus and nitrogen at low costs for large volumes, or for small volumes at high costs, but not for low volumes at low cost. That's an ongoing topic of R&D for us."

Duke engineers are now testing solutions to such problems in their next iteration of systems in India. The new systems are more compact and efficient -- and just a little bit closer to being employed as full-scale working solutions.

"If you look at the flow diagram of our original system and what we're testing now, the components are all the same, but the newest prototypes are a lot more lean and mean," Hawkins said. "At this point we're mostly iterating on the same technologies, but not really making big changes."

Credit: 
Duke University

Testing during studying improves memory and inference

image: Psychology Professor Aaron Benjamin, left, and graduate student Jessica Siler are interested in understanding whether testing is better than rote restudy for memory and inference.

Image: 
Doris Dahl, Beckman Institute

Research has shown that testing enhances memory. However, less is known about whether testing can improve a person's ability to make inferences.

A new study by the Human Memory and Cognition Lab at the University of Illinois at Urbana-Champaign has found that testing is better than rote restudy because it improves both memory and the ability to make inferences. The study "Long-term Inference and Memory Following Retrieval Practice" was published in Memory & Cognition.

"We do a lot of our learning outside the classroom. Because so much of our learning is self-regulated, we need to look into which learning strategies are most beneficial," said Jessica Siler, a graduate student in psychology at Illinois. "We found that being tested on the material led to better memory and better inference of new images."

The researchers conducted the study in three phases. During the training phase, participants were introduced to a set of bird pictures where they learned to which families the birds belonged. In the study phase, half the bird families were studied through testing and the other half through restudy.

"During testing, they were asked to guess to which family the bird belonged and then they were given feedback. Alternatively, during restudy, they were given the picture with the name of the family and they simply retyped the name," Siler said.

After the study phase, participants were tested on the subject matter. In the testing phase, they were given bird pictures -- ones they had seen previously and some that they had not -- and were asked to categorize the birds into families.

"There were two kinds of tests we used," said Aaron Benjamin, a professor of psychology, who directs the Human Memory and Cognition Lab and also is affiliated with the Beckman Institute for Advanced Science and Technology. "We wanted to test whether they remembered seeing a particular bird and whether they were able to infer which family it belonged to."

"Designing these experiments takes a lot of precision," Benjamin said. "They are learning multiple families of birds and you need to arrange the stimuli carefully so that you don't confound the results."

The researchers also compared the participants' memory and inference over a period of 25 days. "We saw that testing helps your memory and inference over long periods of time," Siler said.

The researchers are also interested in what people understand about their own learning. "We want to add in meta-cognitive measures where we ask them how well they think they have learned the material and whether they will be able to categorize similar birds," Siler said. "This will give us insight into whether this is a mechanism that can be used in the real world."

Credit: 
Beckman Institute for Advanced Science and Technology

Researchers wake monkeys by stimulating 'engine' of consciousness in brain

MADISON, Wis. -- A small amount of electricity delivered at a specific frequency to a particular point in the brain will snap a monkey out of even deep anesthesia, pointing to a circuit of brain activity key to consciousness and suggesting potential treatments for debilitating brain disorders.

Macaques put under with general anesthetic drugs commonly administered to human surgical patients, propofol and isoflurane, could be revived and alert within two or three seconds of applying low current, according to a study published today in the journal Neuron by a team led by University of Wisconsin-Madison brain researchers.

"For as long as you're stimulating their brain, their behavior -- full eye opening, reaching for objects in their vicinity, vital sign changes, bodily movements and facial movements -- and their brain activity is that of a waking state," says Yuri Saalmann, UW-Madison psychology and neuroscience professor. "Then, within a few seconds of switching off the stimulation, their eyes closed again. The animal is right back into an unconscious state."

Mice have been roused from light anesthesia before with a related method, and humans with severe disorders have improved through electric stimulation applied deep in their brains. But the new study is the first to pull primates in and out of a deep unconscious state, and the results isolate a particular loop of activity in the brain that is crucial to consciousness.

Saalmann's lab focused its attention on a spot deep in the core of the brain called the central lateral thalamus. Lesions in that area of the human brain are linked to severe consciousness disruption like coma. But location alone was not enough to manipulate consciousness. Building on studies of waking versus unconscious brain activity in cats, says graduate student Michelle Redinbaugh, the researchers tried to match the frequency of central lateral thalamus activity during wakefulness.

Precisely stimulating multiple sites simultaneously as little as 200 millionths of a meter apart and applying bursts of electricity 50 times per second proved to work like a switch to bring the brain in and out of anesthesia.

"A millimeter out of position, and you dramatically reduce the effect," says Redinbaugh, first author of the study. "And if you're in that ideal location, but stimulating at two Hertz instead of 50? Nothing happens. This is very location- and frequency-specific."

Such precise targeting of the central lateral thalamus could be coupled with recordings of activity in the outer folds of the brain, called the cortex, also believed to be key to consciousness. By watching signaling as Wisconsin National Primate Research Center monkeys moved from unconscious to conscious states, the researchers saw the central lateral thalamus stimulating parts of the cortex. In turn, the cortex influenced the central lateral thalamus to keep it active.

"So, you have this loop between the deeper layers of the cortex and the central lateral thalamus, which in a sense acts like an engine," says Saalmann, whose work was supported by the National Institutes of Health and the Binational Science Foundation. "We can now point to crucial parts of the brain that keep this engine running and drive changes in the cerebral cortex that affect your awareness, the richness of your conscious experience."

Designing and delivering electrical stimulation with such precision gives the researchers hope that their approach could be used to help patients dealing with many types of abnormal brain activity.

"This kind of intervention could really be improved by a tailor-made approach," says Redinbaugh. "Specifically mimicking activity of this nucleus could be a much more effective way of helping patients in a coma, or people that have spatial neglect. We think this could broadly affect disorders of consciousness."

Credit: 
University of Wisconsin-Madison

UCF researchers develop device that mimics brain cells used for human vision

ORLANDO, Feb. 14, 2020 -- University of Central Florida researchers are helping to close the gap separating human and machine minds.

In a study featured as the cover article appearing today in the journal Science Advances, a UCF research team showed that by combining two promising nanomaterials into a new superstructure, they could create a nanoscale device that mimics the neural pathways of brain cells used for human vision.

"This is a baby step toward developing neuromorphic computers, which are computer processors that can simultaneously process and memorize information," said Jayan Thomas, an associate professor in UCF's NanoScience Technology Center and Department of Materials Science and Engineering. "This can reduce the processing time as well as the energy required for processing. At some time in the future, this invention may help to make robots that can think like humans."

Thomas led the research in collaboration with Tania Roy, an assistant professor in UCF's NanoScience Technology Center, and others at UCF's NanoScience Technology Center and the Department of Materials Science and Engineering.

Roy said a potential use for the technology is for drone-assisted rescues.

"Imagine a drone that can fly without guidance to remote mountain sites and locate stranded mountaineers," Roy said. "Today it is difficult since these drones need connectivity to remote servers to identify what they scan with their camera eye. Our device makes this drone truly autonomous because it can see just like a human."

"Earlier research created a camera which captured the image and sent it to a server to be recognized, but our group created a single device that mimics the eye and the brain function together," she said. "Our device can observe the image and recognize it on the spot."

The trick to the innovation was growing nanoscale, light-sensitive perovskite quantum dots on the two-dimensional, atomically thin nanomaterial graphene. This combination allows the photoactive particles to capture light, convert it to electric charges and then have the charges directly transferred to the graphene, all in one step. The entire process takes place on an extremely thin film, about one ten-thousandth of the thickness of a human hair.

Basudev Pradhan, who was a Bhaskara Advanced Solar Energy fellow in Thomas' lab and is currently an assistant professor in the Department of Energy Engineering at the Central University of Jharkhand in India, and Sonali Das, a postdoctoral fellow in Roy's lab, are shared first authors of the study.

"Because of the nature of the superstructure, it shows a light-assisted memory effect," Pradhan said. "This is similar to humans' vision-related brain cells. The optoelectronic synapses we developed are highly relevant for brain-inspired, neuromorphic computing. This kind of superstructure will definitely lead to new directions in development of ultrathin optoelectronic devices."

Das said there are also potential defense applications.

"Such features can also be used for aiding the vision of soldiers on the battlefield," she said. "Further, our device can sense, detect and reconstruct an image along with extremely low power consumption, which makes it capable for long-term deployment in field applications."

Neuromorphic computing is a long-standing goal of scientists in which computers can simultaneously process and store information, like the human brain does, for example, to allow vision. Currently, computers store and process information in separate places, which ultimately limits their performance.

To test their device's ability to see objects through neuromorphic computing, the researchers used it in facial recognition experiments, Thomas said.

"The facial recognition experiment was a preliminary test to check our optoelectronic neuromorphic computing," Thomas said. "Since our device mimics vision-related brain cells, facial recognition is one of the most important tests for our neuromorphic building block."

They found that their device was able to successfully recognize the portraits of four different people.

The researchers said they plan to continue their collaboration to refine the device, including using it to develop a circuit-level system.

Credit: 
University of Central Florida

Key modifier identified in large genetic deletion related to neurodevelopmental disorders

image: Neurodevelopmental disorders, including schizophrenia and autism, likely result from complex interactions that modify the effects of individual genes. Illustration of interactions between pairs of fruit fly counterparts of genes within the neurodevelopmental disorder-associated 3q29 deletion (top). Simultaneous reduced expression of NCBP2 using RNA interference led to an enhancement of cellular defects due to reduced expression of each of the other 3q29 genes. NCBP2 acts as a modifier gene within the 3q29 deletion, and affects several cellular processes, including apoptosis or cell death, to contribute to the observed neurodevelopmental features of the deletion (bottom).

Image: 
Girirajan Laboratory, Penn State

Neurodevelopmental disorders, including schizophrenia and autism, likely result from complex interactions that modify the effects of individual genes. In a new study, researchers evaluated the effects of over 300 pairwise knockdowns--reducing the expression of two genes simultaneously--of the fruit fly versions of genes located in a region of human chromosome 3 that, when deleted, has been implicated in these disorders. These interactions suggest that the disorders have a complex causation involving many genes, rather than resulting from the effects of any individual gene. One gene in particular, NCBP2, appears to be a key modifier, influencing the impact of other genes in the deletion.

A paper describing the research by scientists at Penn State, Boston University, and the University of Florida, appears online February 13, 2020 in the journal PLOS Genetics.

"Neurodevelopmental disorders, like schizophrenia and autism, are often associated with large genetic deletions or duplications," said Santhosh Girirajan, associate professor of genomics in the biochemistry and molecular biology and anthropology departments at Penn State and the leader of the research team. "These 'copy-number variants' can contain many genes, so piecing together the molecular mechanisms that lead to these disorders is incredibly difficult. The deletion on human chromosome 3, referred to as 3q29, encompasses 1.6 million base pairs and includes 21 genes. Testing the interactions among these genes in a mammalian model would be cost- and time-prohibitive, so we use the fruit fly, which allows us to test a large number of genetic interactions relatively quickly."

Of the 21 genes located in the 3q29 deletion, fruit fly counterparts have been identified for 14 genes. Using a technique called RNA interference (RNAi), which reduces the expression of genes in specific tissues in the fly, the researchers first knocked down the expression of 14 fly genes individually and quantified their impact on how cells are organized in the fly eye. They then looked at pairwise knockdowns by reducing the expression of two genes simultaneously. Overall, they tested 314 pairwise knockdowns, including interactions among all 14 of the genes in the 3q29 deletion and between those genes and others with known roles in neurodevelopment.

"When we look at pairwise knockdowns, there are basically three possible outcomes," said Matthew Jensen, a graduate student at Penn State and co-first author of the paper. "The effect could be additive, meaning the impact we see in the pairwise knockdown is simply what we would see by adding the effects of the two individual genes together. This would suggest that the genes act independently of one another. Alternatively, we could see a rescue of the impact of the individual gene, or we could see the impact get worse. These last two outcomes represent an interaction between the genes, where the whole is greater than the sum of its parts, and suggest a more complex relationship between the genes."

Among all the pairwise knockdowns that the research team tested, one particular gene stood out as having a large effect on the impact of all the other genes in the 3q29 deletion. The NCBP2 gene codes for a protein that is part of the "nuclear cap-binding complex," which binds to the end of RNA molecules and plays a role in RNA regulation, transport, and decay in the cell. The main impact of NCBP2 interactions was the disruption of the cell cycle and increased "apoptosis"--cell death. The researchers propose that NCBP2 could modify several cellular processes, not necessarily directly related to apoptosis, but ultimately causing a cascade of events that lead to cell death. Thus, the researchers suggest that apoptosis is an important molecular mechanism for neurodevelopmental disorders related to the 3q29 deletion.

The research team confirmed the role of apoptosis by crossing their knockdown flies with flies that overexpress a gene that inhibits apoptosis. Doing so rescued or reduced the effect of the knockdowns. They also tested the role of several of the genes and interactions in a separate model system--the frog--and found similar results.

"The 3q29 deletion confers about 40 times greater risk for schizophrenia and 20 times greater risk for autism," said Girirajan. "Instead of trying to exhaustively study individual genes in the deletion, we wanted to try to get a broader picture of what is going on. By studying pairwise interactions, we can start to get a better understanding of the molecular and cellular mechanisms that lead to these devastating disorders."

Credit: 
Penn State

Mapping the landscape of citizen science

Science has moved beyond the lab. Researchers are increasingly enlisting non-scientists to help conduct their research and expand its reach. Everyday people are contributing their data, helping researchers learn more about a topic and get comprehensive results.

But what does "citizen science" mean and how can it support science learning and education?

A new report from the National Academies of Sciences, Engineering, and Medicine has found that citizen science is reshaping research. It can greatly facilitate large-scale research by providing opportunities to study more topics while teaching people more about science and enhancing science education.

The report is one of the first of its kind to examine the available information on citizen science projects and, through peer-reviewed evidence, clearly identify trends, weaknesses and opportunities for growth.

Darlene Cavalier and Lekelia "Kiki" Jenkins, professors at Arizona State University's School for the Future of Innovation in Society, were members of the National Academies' Committee on Designing Citizen Science to Support Science Learning, which authored the report. Cavalier and Jenkins will share their expertise on citizen science and the findings from this new study at the annual meeting of the American Association for the Advancement of Science (AAAS) during the session "Learning Through Citizen Science: Enhancing Opportunities by Design."

"This session specifically highlights the potential of citizen science to support science learning," said Cavalier. "We'd like to share best practices to intentionally design citizen science programs with science learning as one goal. Personally, I enjoy the Q&A sessions where I learn about developments and hear from people shaping the field."

"We need to understand where we are before we can start thinking about where we need to go," offers Jenkins, who is organizing the AAAS session as well as giving a presentation. "That's the reason why this study was done, because people were doing citizen science and assuming that, you're doing science, you're going to learn something. But without being directed in what are the learning objectives and how we are building in support structures to make sure people are learning these things, actually what we find is that people don't learn as much as they could."

Cavalier, a professor of practice, is a founding board member of the Citizen Science Association. She's also the founder of SciStarter, a platform that connects people to citizen science projects they can participate in. She says she wanted to learn more about how people without formal degrees could participate in science.

"Opportunities were out there, but they were difficult to discover and little was known about the projects or participants," said Cavalier. "I wanted to make it make it easy for anyone to find and engage in projects they are curious or concerned about and catalyze research across the landscape of projects, people and perspectives."

Citizen science has been used to study a wide variety of topics. Cavalier and a team from SFIS and ASU Libraries recently developed citizen science kits that can be checked out of local libraries. The kits include all the instruments and resources needed for people to research things like light pollution, air quality and biodiversity. Associate Professor Jenkins focuses much of her research on marine conservation, including studying fisheries learning exchanges, where fishing communities learn from each other how to mitigate common problems involving habitat damage and decreasing fish populations. Fisheries learning exchanges and citizen science share similar approaches to collective problem-solving.

Citizen science not only benefits researchers, but it also provides teaching opportunities outside a typical classroom setting. "Not only do scientists get data, but the people who are participating in citizen science are learning something as well," said Jenkins.

When done right, citizen science can be an opportunity to support and extend learning and welcome different skills and beliefs. But according to the report, those opportunities can only be reached if diversity, equity and inclusion are included as goals in a project's original design. Jenkins said people have to intentionally design citizen science projects around those three things; otherwise, it can lead to bias in the project.

"It's a mission that everyone needs to take up if they care about those issues," said Jenkins. "We all have a role to play around diversity, equity and inclusion in the sciences."

Credit: 
Arizona State University

Galactic cosmic rays affect Titan's atmosphere

image: Optical image of Titan taken by NASA's Cassini spacecraft.

Image: 
NASA/JPL-Caltech/Space Science Institute

Planetary scientists using the Atacama Large Millimeter/submillimeter Array (ALMA) revealed the secrets of the atmosphere of Titan, the largest moon of Saturn. The team found a chemical footprint in Titan's atmosphere indicating that cosmic rays coming from outside the Solar System affect the chemical reactions involved in the formation of nitrogen-bearing organic molecules. This is the first observational confirmation of such processes, and impacts the understanding of the intriguing environment of Titan.

Titan is attracting much interest because of its unique atmosphere with a number of organic molecules that form a pre-biotic environment.

Takahiro Iino, a scientist at the University of Tokyo, and his team used ALMA to reveal the chemical processes in Titan's atmosphere. They found faint but firm signals of acetonitrile (CH3CN) and its rare isotopomer CH3C15N in the ALMA data.

"We found that the abundance of 14N in acetonitrile is higher than those in other nitrogen bearing species such as HCN and HC3N," says Iino. "It well matches the recent computer simulation of chemical processes with high energy cosmic rays."

There are two important players in the chemical processes of the atmosphere: ultraviolet (UV) light from the Sun and cosmic rays coming from outside the Solar System. In the upper atmosphere, UV light selectively destroys nitrogen molecules containing 15N, because UV light at the specific wavelength that interacts with 14N14N is easily absorbed at that altitude. Thus, nitrogen-bearing species produced at that altitude tend to exhibit a high 15N abundance. On the other hand, cosmic rays penetrate deeper and interact with nitrogen molecules containing 14N. As a result, there is a difference in the abundance of molecules with 14N and 15N. The team revealed that acetonitrile in the stratosphere is more abundant in 14N than other previously measured nitrogen-bearing molecules.

"We suppose that galactic cosmic rays play an important role in the atmospheres of other solar system bodies," says Hideo Sagawa, an associate professor at Kyoto Sangyo University and a member of the research team. "The process could be universal, so understanding the role of cosmic rays in Titan is crucial in overall planetary science."

Titan is one of the most popular objects in ALMA observations. The data obtained with ALMA need to be calibrated to remove fluctuations due to variations in on-site weather and mechanical glitches. For this referencing, the observatory staff regularly points the telescope at bright sources such as Titan during science observations, so a large amount of Titan data is stored in the ALMA Science Archive. Iino and his team dug into the archive, re-analyzed the Titan data, and found subtle fingerprints of very tiny amounts of CH3C15N.

Credit: 
National Institutes of Natural Sciences

Gold nanoclusters: new frontier for developing medication for treatment of Alzheimer's disease

image: Au23(CR)14 Nanocluster functions in multiple stages of the progression from Aβ monomer to Aβ plaques.

Image: 
©Science China Press

Alzheimer's disease (AD) is a progressive neurodegenerative disorder characterized by amyloid-β (Aβ) fibrillation and plaque formation. While more than 50 million people are devastated by AD, no treatment is available. Recently, anti-Aβ antibody-based immunotherapy has failed in clinical trials, partially due to the increased cytotoxicity of soluble Aβ oligomers. Developing a medication for AD treatment has therefore become an even graver challenge.

In a new research article published in the Beijing-based National Science Review, scientists at the State Key Laboratory of Advanced Technology for Materials Synthesis and Processing, Wuhan University of Technology in China, explored the possibility of treating AD with gold nanoclusters.

As illustrated in Figure 1, Au23(CR)14, a novel gold nanocluster modified with the Cys-Arg (CR) dipeptide, functions at multiple stages of the progression from Aβ monomer to Aβ plaques: it inhibits the misfolding and fibrillation of Aβ; it fully dissolves preformed, mature Aβ fibrils and restores the conformation of Aβ peptides from misfolded β-sheets to an unfolded monomer state with abolished cytotoxicity; and, most importantly, it completely dissolves endogenous Aβ plaques in brain slices from transgenic AD model mice. Furthermore, Au23(CR)14 has good biocompatibility and the ability to cross the blood-brain barrier (BBB).

This article not only presents a compelling nanotherapeutic candidate for AD treatment, but also opens a new frontier for developing nanomaterial-based medications for AD. Further research into the basic mechanisms by which gold nanoclusters dissolve Aβ plaques will undoubtedly spur the development of new treatments.

Credit: 
Science China Press

Satellite image data reveals rapid decline of China's intertidal wetlands

Using archives of satellite imaging data, researchers reporting in Frontiers in Earth Science have conducted the most in-depth study of China's intertidal wetlands to date and found a 37.62% decrease in area between the 1970s and 2015.

Intertidal wetlands significantly contribute to China's environmental and ecological diversity, but are facing unprecedented pressures from anthropogenic development, as well as the threat of future sea level rise.

Despite the ecological and economic significance of China's vast intertidal wetlands, which provide services such as storm protection, pollution purification and carbon sequestration, their distribution and variation over time have not been well documented. This study highlights the urgency of conservation measures for these valuable habitats by showing the scale of their decline for the first time.

"Intertidal wetlands are buffer zones, linking freshwater river systems and the salty oceanic system. Despite their significant contributions to environmental, ecological and hydrological services, China's intertidal wetlands have not received sufficient protection," says study lead Dr. Song, from the School of Geographical Sciences at Guangzhou University.

The research team at Guangzhou University utilized archives of remote sensing data from earth observation satellites to map intertidal wetland patterns at a national scale, following more than 18,000 km of continental coastline. The use of archival data in this study provides the most integrated and complete perspective to date for use in natural resource management and land-use change research.

Satellite image data from the 1970s, 1995 and 2015 was collected from the Landsat MSS/TM/8 satellites and integrated with coastal maps from the corresponding time periods to improve map accuracy. A total of 653 field survey points, selected by random sampling from different coastline types, were compared with the interpreted data for further validation. The results show worrying trends for these valuable habitats if effective protective measures are not put in place.
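
The article does not spell out the validation statistics, but a standard way to use such ground-truth points is an accuracy assessment of the classified map. The following Python sketch, with entirely synthetic labels standing in for the 653 survey points, shows the usual confusion-matrix check:

    # Illustrative accuracy assessment (synthetic data, not the study's).
    import numpy as np
    from sklearn.metrics import confusion_matrix, accuracy_score

    rng = np.random.default_rng(0)
    field = rng.integers(0, 2, 653)   # ground truth: 0 = other, 1 = intertidal wetland
    mapped = np.where(rng.random(653) < 0.9, field, 1 - field)  # map agrees ~90% of the time

    print(confusion_matrix(field, mapped))    # rows: ground truth, columns: mapped class
    print("overall accuracy:", accuracy_score(field, mapped))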

The area of intertidal flats along China's coast declined extensively, from 7,848.21 km2 in the 1970s to 4,895 km2 in 2015, equating to a 37.62% loss of intertidal wetland area over roughly 40 years. These areas have also become increasingly fragmented as anthropogenic development continues to encroach further towards the coast.

At an annual loss rate of between 0.94% and 1.17%, China's intertidal wetlands are disappearing almost twice as fast as the global average, driven by explosive growth of economies and populations in coastal regions. Intensified land-use conflicts have also led to huge reclamations of open-coast wetland for aquaculture, agriculture, tourism construction and hydrologic engineering. These new coastal developments have brought expanding seawall construction, dubbed 'the new Great Wall of China'.
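
The headline figures are internally consistent: 0.94% is the simple (linear) annualization of the roughly 40-year loss, and 1.17% is the compound annualization, as a quick check shows.

    # Consistency check on the reported wetland-loss figures.
    a_1970s, a_2015, years = 7848.21, 4895.0, 40   # areas in km^2; span assumed ~40 years

    loss = 1 - a_2015 / a_1970s
    print(f"total loss:           {loss:.2%}")     # -> 37.63%, matching 37.62% within rounding
    print(f"simple annual rate:   {loss / years:.2%}")                           # -> 0.94%
    print(f"compound annual rate: {1 - (a_2015 / a_1970s) ** (1 / years):.2%}")  # -> 1.17%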

In the long run, accelerated sea level rise due to climate change will have the largest impact on coastal wetlands, putting ecosystems under increased strain. The combined pressures of land reclamation and sea level rise leave wetlands less space to retreat inland, leading to further decline.

Dr. Song emphasizes, "Intertidal flats and wetlands are confronting unprecedented pressure because of the interference of human activity and the expected future sea level rise. Due to the development plans for coastal regions and the government-favored 'blue economy', human activity in the near-shore region is going to become more intensive."

"A high priority should be given to intertidal wetland conservation and habitat reconstruction, with a view to a sustainable future."

Credit: 
Frontiers

Study surveys molecular landscape of endometrial cancer

The most comprehensive molecular study of endometrial cancer to date has further defined the contributions of key genes and proteins to the disease, say its authors.

Published online February 13 in Cell, the study suggests new treatment approaches that could be tailored for each patient, as well as potential biological targets for future drug design.

Led by researchers from NYU Grossman School of Medicine and more than a dozen other institutions, the team reached its conclusions by measuring levels of key proteins, the workhorse molecules that make up cellular structures and signaling networks.

Controlled by instructions encoded in genes, protein levels in cancer cells are the functional result of the genetic changes that affect risk for endometrial cancer, researchers say. Focused on proteomics, the large-scale analysis of protein functions and interactions, the study compared protein levels in 95 uterine tumors and 49 normal uterine tissue samples.
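
The article does not describe the statistical pipeline, but a typical first pass at such a tumor-versus-normal comparison is a per-protein test with a multiple-testing correction. A minimal Python sketch, on placeholder data with the study's sample sizes:

    # Illustrative differential-abundance test (placeholder data, not the
    # study's pipeline). Rows are proteins; columns are samples, with
    # abundances assumed already log-transformed and normalized.
    import numpy as np
    from scipy import stats
    from statsmodels.stats.multitest import multipletests

    rng = np.random.default_rng(0)
    tumor = rng.normal(size=(5000, 95))    # 95 tumor samples
    normal = rng.normal(size=(5000, 49))   # 49 normal tissue samples

    t, p = stats.ttest_ind(tumor, normal, axis=1)        # per-protein t-test
    reject, q, _, _ = multipletests(p, alpha=0.05, method="fdr_bh")
    print(reject.sum(), "proteins pass a 5% FDR (none expected for random data)")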

"While more time-consuming and expensive, proteomics reveals insights into cancer risk that cannot be found by experiments that look at changes in the genetic code alone," says study senior co-author David Fenyö, PhD, professor in the Department of Biochemistry and Molecular Pharmacology and faculty in the Institute for Systems Genetics at NYU Langone Health

"Proteomics identifies the proteins that are most active in a specific tumor, which potentially enables the design of treatments that will work best against that tumor in particular," adds Fenyo.

Endometrial cancer, which arises in the lining of the uterus, is the sixth most common cancer in women globally and caused 12,160 deaths in the United States in 2019. While most women diagnosed in the early stages can be cured, some endometrial tumors recur, typically with far worse clinical outcomes.

The new work builds on The Cancer Genome Atlas, or TCGA, a landmark research effort that first outlined the genetic underpinnings of many cancers in 2013. Like TCGA, the new study sought to look not at any single aspect of molecular biology but at all the players involved in a given set of cancer cells, from the molecular "letters" making up DNA, to the RNA into which DNA is transcribed, to the proteins built based on that RNA.

The study also examined the chemical changes to proteins, called post-translational modifications, which determine when and where the proteins are "switched on or off". Altogether, the researchers took more than 12 million measurements of differences between normal and cancerous cells in DNA and RNA, protein levels, and in chemical changes to DNA and proteins.

The NYU Langone research team played a major role in a key finding of the study, which revealed a new way to tell apart a highly aggressive type of endometrial cancer from a less aggressive type that looks similar under a microscope. Telling the two types apart would help clinicians to better fit treatment approaches to a given patient, and to do so earlier in the course of the disease, say the authors.

One subtype of endometrial cancer, the endometrioid subtype, is often identified early and comprises about 85 percent of all endometrial cancers. A second subtype, the serous subtype, is more aggressive, is typically identified later, and accounts for more deaths than endometrioid tumors. To complicate matters, the endometrioid group contains an aggressive subset of tumors whose molecular markers more closely resemble those of the serous subtype.

The NYU Langone team focused much of their work on determining what distinguishes these aggressive endometrioid tumors from the serous tumors and from the less aggressive endometrioid tumors. They found a subset of proteins that were phosphorylated -- carrying a post-translational modification that switches proteins on -- in the aggressive subset of endometrioid tumors and in serous tumors, but not in the less aggressive endometrioid tumors. Moreover, the researchers found that some of these hyperactive proteins can be targeted by drugs currently approved by the U.S. Food and Drug Administration for other purposes.
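
The selection described here is essentially set logic over three tumor groups. As a hedged sketch (the names, values and on/off threshold below are hypothetical, not from the study), in Python with pandas:

    # Hypothetical selection of phosphorylated proteins that are "on" in
    # aggressive endometrioid and serous tumors but "off" in the less
    # aggressive endometrioid tumors.
    import pandas as pd

    df = pd.DataFrame(
        {"aggressive_endometrioid": [0.9, 0.1, 0.8],
         "serous":                  [0.8, 0.2, 0.7],
         "less_aggressive":         [0.1, 0.1, 0.9]},
        index=["protein_A", "protein_B", "protein_C"],
    )

    on = 0.5  # hypothetical threshold for "switched on"
    hits = df[(df["aggressive_endometrioid"] > on)
              & (df["serous"] > on)
              & (df["less_aggressive"] <= on)]
    print(hits.index.tolist())  # -> ['protein_A']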

In addition, the field had previously established that some people with the less aggressive subset of endometrioid tumors have a genetic difference (mutation) that overproduces the protein beta-catenin, which results in a poor prognosis.

The NYU Langone team found evidence that the high levels of beta-catenin in these seemingly less aggressive tumors are linked to an increased activity of a signaling pathway called Wnt, which is known to spur abnormal cell growth.

"For many years scientists have been using genomics, the study of the genetic code, which is a very effective way but a relatively basic way to look at cancer," says study co-lead author Emily Kawaler, a student at NYU Grossman School of Medicine. "But if we add on all of these extra levels--proteins, RNA, and the ways proteins talk to each other--then we can learn even more about how cancer is actually working."

Credit: 
NYU Langone Health / NYU Grossman School of Medicine