Tech

New way to halt leukemia relapse shows promise in mice

image: CML stem cells, which are the cellular source of the vast majority of CML cells, are reportedly resistant to TKI therapy because of stem cell quiescence. The CML stem cells that survive are thus responsible for recurrence after TKI therapy.

Image: 
Kazuhito Naka, Hiroshima University

Researchers have identified a second path to defeating chronic myelogenous leukemia, which tends to affect older adults, even in the face of resistance to existing drugs.

The new findings were published on September 17th in Nature Communications.

Almost all patients with chronic myelogenous leukemia, or CML, have a faulty, cancer-causing gene, or "oncogene," called BCR-ABL1. BCR-ABL1 turns a regular stem cell (a special type of cell that can develop into other types of cells and replenish them throughout life) in the bone marrow into a CML stem cell that produces malformed blood cells. And instead of the CML stem cell dying on schedule, the oncogene drives it to keep producing even more of these faulty blood cells.

Advances in treatment since the turn of the millennium have been extremely successful at combatting the disease in patients with this oncogene. Drugs called tyrosine kinase inhibitors (TKI) have completely transformed the prognosis of people with such leukemias, and with fewer of the side effects of other cancer treatments. In most cases, the cancer goes into remission and patients live for many years following diagnosis.

BCR-ABL1 directs the production of an abnormal type of tyrosine kinase, an enzyme that 'turns on' many types of proteins through a cascade of chemical reactions known as signal transduction--in effect communication via chemistry. Miscommunication resulting from the faulty enzyme is what promotes the growth of the leukemic cells. By stopping this communication within CML stem cells, TKI signal transduction therapy inhibits their growth and brings a halt to their production of the malformed blood cells.

However, TKIs only control the disease; they don't cure it. Drug resistance can develop in a patient because, while TKIs work well on mature CML cells that are actively proliferating, they are less effective at inducing cell death in quiescent CML stem cells.

Quiescence is an "idling" stage in the life cycle of a cell, in which it simply rests for extended periods in anticipation of reactivation, neither replicating nor dying.

"If CML stem cells are in a quiescent phase, they are otherwise left untouched by TKI treatment, and so survive to potentially cause a relapse," said Kazuhito Naka, paper author and an associate professor from the Department of Stem Cell Biology of Hiroshima University's Research Institute for Radiation Biology and Medicine.

But the researchers found in mouse models that if they disrupt Gdpd3--a different, non-oncogene gene--then the self-renewal capacity of the CML stem cells is sharply decreased. Gdpd3 directs the production of an enzyme involved in making a particular type of lipid that appears to play a key role in regulating the quiescence of CML stem cells in an oncogene-independent fashion.

In other words, the Gdpd3 gene, through its role in producing this lipid, is largely responsible for the maintenance of CML stem cells. By disrupting it, the researchers had broken their quiescence.

Crucially, when the researchers disrupted the Gdpd3 gene behind these lipids, leukemia relapse in the mice was significantly reduced, even though the BCR-ABL1 oncogene itself was not disrupted.

"This potentially provides another path to arresting these leukemias--and maybe other cancers too," said Dr. Naka, "beyond having to wrestle with the BCR-ABL1 oncogene."

While the researchers have discovered a new, biologically significant role for this particular lipid in causing the recurrence of CML, they still do not fully understand precisely how this happens. The researchers now want to investigate the mechanisms involved, and whether this lipid also plays a role in the quiescence of the cancer stem cells behind solid tumors, not just leukemias, and thus in those cancers' recurrence and growth as well.

Credit: 
Hiroshima University

Promising computer simulations for stellarator plasmas

video: The video shows the turbulent variation of plasma density over the cross-section of the plasma ring of the Wendelstein 7-X stellarator.

Video:
IPP, A. Bañón Navarro

For the fusion researchers at IPP, who want to develop a power plant based on the model of the sun, turbulence formation in its fuel - a hydrogen plasma - is a central research topic. The small eddies carry particles and heat out of the hot plasma centre and thus degrade the thermal insulation of the magnetically confined plasma. Because the size, and with it the electricity price, of a future fusion power plant depends on this transport, one of the most important goals is to understand, predict and influence this "turbulent transport".

Since an exact computational description of plasma turbulence would require solving highly complex systems of equations and executing countless computational steps, code development aims at reasonable simplifications. The GENE code developed at IPP is based on a set of simplified, so-called gyrokinetic equations, which disregard all phenomena in the plasma that do not play a major role in turbulent transport. Although this reduces the computational effort by many orders of magnitude, the world's fastest and most powerful supercomputers have always been needed to develop the code further. GENE is now able to describe well the formation and propagation of small, low-frequency plasma eddies in the plasma interior and to reproduce and explain experimental results - but originally only for fusion devices of the tokamak type, whose axisymmetric construction makes them comparatively simple to model.
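
Where those orders of magnitude come from can be illustrated with rough arithmetic: gyro-averaging removes one phase-space dimension (the gyro-angle) and lets the code step over the turbulence timescale rather than the much faster gyration. The grid sizes and timescale ratio below are made-up round numbers for illustration, not GENE's actual settings.

    # Illustrative arithmetic only -- invented round numbers, not GENE's
    # actual resolution or timestep.
    n_space, n_velocity, n_gyro = 100, 32, 32

    full_kinetic = n_space**3 * n_velocity**2 * n_gyro  # 6-D phase space
    gyrokinetic = n_space**3 * n_velocity**2            # 5-D after gyro-averaging

    # Stepping on the turbulence timescale instead of the gyration
    # timescale saves roughly another factor of ~1000 (assumed).
    timestep_ratio = 1000

    print(f"grid points: {full_kinetic:.1e} -> {gyrokinetic:.1e}")
    print(f"rough overall saving: ~{n_gyro * timestep_ratio:,}x")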

For example, calculations with GENE showed that fast ions can greatly reduce turbulent transport in tokamak plasmas. Experiments at the ASDEX Upgrade tokamak in Garching confirmed this result. The required fast ions were provided by plasma heating using radio waves at the ion cyclotron frequency.

A tokamak code for stellarators

In stellarators, this turbulence suppression by fast ions has not yet been observed experimentally. However, the latest calculations with GENE now suggest that the effect should also exist in stellarator plasmas: in the Wendelstein 7-X stellarator at IPP in Greifswald, it could theoretically reduce turbulence by more than half. As IPP scientists Alessandro Di Siena, Alejandro Bañón Navarro and Frank Jenko show in the journal Physical Review Letters, the optimal ion temperature depends strongly on the shape of the magnetic field. Professor Frank Jenko, head of the Tokamak Theory department at IPP in Garching: "If this calculated result is confirmed in future experiments with Wendelstein 7-X in Greifswald, this could open up a path to interesting high-performance plasmas".

Using GENE to calculate turbulence in the more complicated plasma shapes of stellarators required major code adjustments: without the axial symmetry of tokamaks, one has to cope with a much more complex geometry.

For Professor Per Helander, head of the Stellarator Theory department at IPP in Greifswald, the stellarator simulations performed with GENE are "very exciting physics". He hopes that the results can be verified in the Wendelstein 7-X stellarator at Greifswald. "Whether the plasma values in Wendelstein 7-X are suitable for such experiments can be investigated when, in the coming experimental period, the radio wave heating system is put into operation in addition to the current microwave and particle heating," says Professor Robert Wolf, whose department is responsible for plasma heating.

GENE becomes GENE-3D

According to Frank Jenko, it was another "enormous step" to make GENE fit for the complex, three-dimensional shape of stellarators not just approximately but completely. After almost five years of development work, the GENE-3D code, now presented in the Journal of Computational Physics by Maurice Maurer and co-authors, provides a "fast and yet realistic turbulence calculation also for stellarators", says Frank Jenko. In contrast to other stellarator turbulence codes, GENE-3D describes the full dynamics of the system, i.e. the turbulent motion of the ions and also of the electrons over the entire inner volume of the plasma, including the resulting fluctuations of the magnetic field.

Credit: 
Max-Planck-Institut für Plasmaphysik (IPP)

Nose's response to odors more than just a simple sum of parts

video: A recording of the activity of olfactory sensory neurons in the nose of a mouse. When an odor is presented, changes in activity can be seen as changes in light intensity. Measurement and analysis of such recordings has led to the finding that the response of these neurons in the nose is suppressed and enhanced when odors are mixed, overturning the traditional thinking that the response is a simple sum. The recorded area is approximately 0.5 mm by 0.5 mm.

Image: 
© Shigenori Inagaki, Takeshi Imai, Kyushu University

Take a sniff of a freshly poured glass of wine, and the prevailing scientific thinking would suggest that the harmony of fragrances you perceive starts with sensory receptors in your nose simply adding up the individual odors they encounter. However, new research from Kyushu University shows that a much more complex process is occurring, with some responses being enhanced and others inhibited depending on the odors present.

In mammals, the sense of smell starts with the detection of odors by receptors at the ends of special cells--called olfactory sensory neurons--in the nose. Each of these neurons has just one type of receptor out of a large repertoire that depends on the species, with humans having around 400 types and mice around 1,000.

While the brain's processing of sensory information is known to be important for picking out and synthesizing smells, relatively little is still known about the processes happening where the odors are first detected in the nose.

Using recently developed techniques for highly sensitive recording of the response of the receptors in the noses of living mice, the research group led by Takeshi Imai, professor in the Graduate School of Medical Sciences at Kyushu University, has now published in Cell Reports a deeper understanding of how neurons in the nose react to odors and their mixtures.

"It has been previously considered that each odor 'activates' a specific set of receptors, and that the response of neurons in the nose to odor mixtures is a simple sum of the responses to each component, but now we have evidence in mice that this is not the case," says Shigenori Inagaki, the lead author of the paper.

Studying mice that were genetically modified so that their neurons emit green light according to the amount of calcium ions in them--an indicator of activity--upon absorption of excitation light, the researchers were able to sensitively record the responses of the neurons in the mice's noses using a two-photon microscope.

Based on these recordings, Imai's team found that odors could not only activate but also suppress the response of the neurons in the noses of mice, indicating complex interactions are happening well before the signals reach the olfactory bulb or brain for additional processing. Furthermore, their experiments showed that mixing of odors often leads to an enhancement of response through synergy, especially when they are in relatively low concentrations, or a suppression of response through antagonism, especially when they are in high concentrations.
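
One textbook way such non-additivity can arise at the receptor itself is competition: two odorants binding the same receptor with different efficacies. The sketch below, with arbitrary illustrative parameters, is a generic pharmacological model rather than the one analyzed in the paper; it reproduces the antagonism seen at high concentrations, while genuine synergy would require additional mechanisms.

    # Two odorants compete for one receptor type; each has an affinity (K)
    # and an efficacy (e). All parameter values are arbitrary illustrations.

    def response(conc_a, conc_b, K_a=1.0, K_b=1.0, e_a=1.0, e_b=0.2):
        occupancy = conc_a / K_a + conc_b / K_b
        return (e_a * conc_a / K_a + e_b * conc_b / K_b) / (1.0 + occupancy)

    for conc in (0.05, 5.0):
        mixture = response(conc, conc)
        sum_of_parts = response(conc, 0.0) + response(0.0, conc)
        print(f"conc={conc}: mixture={mixture:.3f} vs sum of parts={sum_of_parts:.3f}")

    # At low concentration the mixture is close to the sum of its parts; at
    # high concentration the weak-efficacy odorant ties up receptors and
    # suppresses the response (antagonism).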

The suppression and enhancement processes revealed by this study may explain why odor mixtures produce very different perceptual outcomes from their components, and this kind of understanding could aid in the development of methods for the rational design of olfactory experiences, from pleasant, harmonious smells to deodorizers.

"These results indicate that our perception of odors is being tuned from the very moment they are detected in the nose," explains Imai. "Possibly, these findings may explain how the addition of a minor amount of an odor can have such a major effect on the perceived fragrance, or how different kinds of odor molecules in a glass of wine produce a nice harmony."

Credit: 
Kyushu University

Glyphosate residue in manure fertilizer decreases strawberry and meadow fescue growth

image: Strawberry and meadow fescue were used in the study.

Image: 
Viivi Saikkonen

A new study finds that glyphosate residue in manure fertilizer decreases the growth of strawberry and meadow fescue, as well as the runner production of strawberry.

Earlier experiments with Japanese quails showed how glyphosate residue in poultry feed accumulated in quail manure. In these experiments, half of the quails were fed with glyphosate-contaminated feed while the control group was fed with organic feed free from glyphosate residues. This made it possible to test whether glyphosate residues in poultry manure affect crop plants when the manure is used as a fertilizer.

"We established an experimental field where we planted both strawberry and meadow fescue. These plants were fertilized with bedding material containing excrements from quails raised on feed containing glyphosate residue and organic feed free of the residue," tells Senior Researcher Anne Muola from the Biodiversity Unit of the University of Turku, Finland.

Decreased growth as well as indirect effects

The growth and reproduction of strawberry and meadow fescue were monitored throughout one growing season. A high amount of glyphosate residue (158 mg/kg) decreased the growth of both studied crop plants and the runner production of strawberry, even though the amount of glyphosate in the soil declined quickly during the growing season.

In addition, the manure fertilizer containing glyphosate residue was found to have indirect effects: larger meadow fescues produced more inflorescences, and herbivorous insects preferred larger strawberries.

"Our results support earlier studies which have found that already very small glyphosate residue (

Glyphosate-based herbicides are the most frequently used herbicides globally. Many GMO crops are so-called "glyphosate ready", meaning that they are resistant to glyphosate. This allows agricultural practices in which glyphosate is applied in considerable amounts, which increases the likelihood of its residue ending up in animal feed.

"For instance, the cultivation of GM soy is not allowed in the EU. Still, soy is an excellent energy and protein source and it is imported from outside the EU to be used as a component in animal feed. The glyphosate residues in feed are then accumulated in poultry excrement," says Marjo Helander and continues:

"Poultry manure is rich in essential nutrients and organic compounds, and thus, to increase the sustainability of poultry industry, regulations suggest that poultry manure should be used as an organic fertilizer. However, this can lead into a situation where glyphosate can be unintentionally spread to fields or gardens via organic fertilizer, counteracting its ability to promote plant growth."

Credit: 
University of Turku

Immune system may have another job -- combatting depression

An inflammatory autoimmune response within the central nervous system similar to one linked to neurodegenerative diseases such as multiple sclerosis (MS) has also been found in the spinal fluid of healthy people, according to a new Yale-led study comparing immune system cells in the spinal fluid of MS patients and healthy subjects. The research, published Sept. 18 in the journal Science Immunology, suggests these immune cells may play a role other than protecting against microbial invaders -- protecting our mental health.

The results buttress an emerging theory that gamma interferon, an immune signaling protein that helps induce and modulate a variety of immune system responses, may also play a role in preventing depression in healthy people.

"We were surprised that normal spinal fluid would be so interesting," said David Hafler, the William S. and Lois Stiles Edgerly Professor of Neurology, professor of immunobiology and senior author of the study.

Previous research has shown that blocking gamma interferon and the T cells that produce it can cause depression-like symptoms in mice. Hafler notes that depression is also a common side effect in patients with MS treated with a different type of interferon.

Using a powerful new technology that allows a detailed examination of individual cells, the researchers show that while the characteristics of T cells in the spinal fluid of healthy people share similarities with those of MS patients, they lack the ability to replicate and cause the damaging inflammatory response seen in autoimmune diseases such as MS.

In essence, the immune system in the brains of all people is poised to make an inflammatory immune system response and may have another function than defending against pathogens, Hafler said.

"These T cells serve another purpose and we speculate that they may help preserve our mental health," he said.

Hafler said that his lab and colleagues at Yale plan to explore how immune system responses in the central nervous system might affect psychiatric disorders such as depression.

Credit: 
Yale University

New method identifies antibody-like proteins with diagnostic and therapeutic potential for SARS-CoV-2

Scientists have used a new high-speed, in vitro selection method to isolate 9 antibody-like proteins (ALPs) that bind to the SARS-CoV-2 virus - 4 of which also exhibited neutralizing activity - within 4 days, according to a new study. While much research has focused on the identification of whole antibodies against SARS-CoV-2 - the causative agent of COVID-19 - less attention has been paid to ALPs, which include monobodies that can offer similar diagnostic and therapeutic advantages. However, established in vitro methods to select small proteins like ALPs have some drawbacks. Phage display, the most common method, generates relatively small libraries of candidates, limiting the utility of this approach in the context of recently described viruses such as SARS-CoV-2. Another method, mRNA display, generates sufficiently large libraries but is time consuming, taking up to several weeks to yield results. Seeking a solution, Taishi Kondo and colleagues refined a method they previously developed, called TRAP display, which streamlines two of the most time-consuming early steps in the mRNA display protocol. Kondo et al. used their TRAP display method to obtain - within 4 days - 9 ALPs that bind to the S1 subunit of the SARS-CoV-2 spike protein complex. Of these 9 ALPs, 4 showed affinity for the domain of the spike protein that binds the ACE2 receptor, suggesting that these ALPs may neutralize the virus' ability to bind to ACE2 on human cells. Of these 4 ALPs with binding affinity, one blocked SARS-CoV-2's ability to infect cultured cells in vitro. The authors then used the high-affinity ALP to capture SARS-CoV-2 viral particles from nasal swab samples, demonstrating the ALP's potential for use in diagnostic tests. "We believe that the monobodies procured in our study will soon be useful to develop effective diagnostic tools and that such tools will contribute to the worldwide effort to overcome the COVID-19 pandemic," the authors write.

Credit: 
American Association for the Advancement of Science (AAAS)

A better informed society can prevent lead poisoning disasters

Six years after it began, the Flint, Michigan, water crisis remains among the highest-profile emergencies in the United States.

Extensive iron and lead corrosion of the water distribution system in Flint, coupled with lead release, created "red water" complaints, rapid loss of chlorine disinfectant and an outbreak of Legionnaires' disease that killed 12 people. State and federal agencies have disbursed $450 million in aid so far. In August, the state of Michigan reached a mediated settlement in a civil suit and is expected to pay about $600 million to victims, many of whom are children.

"The Flint story is a cautionary tale of poor anticipation of risks, actions that were too little too late, reactionary and not driven by scientific data," John R. Scully, University of Virginia Charles Henderson Chaired Professor of Materials Science and Engineering, said. Scully also serves as technical editor in chief of CORROSION Journal.

In a paper published Sept. 8 in the Proceedings of the National Academy of Sciences, Scully and Raymond J. Santucci, who earned his Ph.D. from UVA's Department of Materials Science and Engineering in 2019, address unresolved scientific questions that can help avert future lead poisoning disasters. Lead poisoning from degrading lead pipes is a pervasive threat, and future incidents are likely.

"This requires a fresh perspective, to avoid just looking in the rearview mirror and instead focus on what lies around the curve ahead," Scully said.

Commonly proposed strategies offer false comfort, resting on sparse water testing, rules of thumb and traditional mitigation practices rather than on rigorously challenged, sound scientific principles, according to the paper's authors.

"We need to better understand the scientific factors that govern lead release, and that starts with a better testing strategy and understanding of some fundamental truths," Scully said.

Scully and Santucci argue that the people who manage and regulate public water systems should be using scientific data to predict the risks of lead release. Risk assessments based on scientific data should replace current reliance on water sampling, which is imprecise and often too late to prevent a disaster.

A predictive framework for lead corrosion would allow regulators and infrastructure managers to anticipate problems and manage the risk of water conditions associated with unacceptable lead release.

Santucci and Scully recommend better thermodynamic and kinetic calculations and models that can predict lead release and accumulation. The models proposed could generate risk assessments based on dynamic data such as water chemistry, reaction rates, scale formation and inhibitor corrosion mechanisms, as well as water stagnation and flow.

Citizen scientists can help meet the data-gathering challenge. "Rapid accurate testing, perhaps via mobile phone test kits, could provide more real-time data. Hand-held, mobile tech that enables citizens to monitor their own drinking water should have advanced already," Scully said.

Santucci and Scully illustrate how chemical thermodynamics can predict the formation of thermodynamically stable lead-based compounds on lead pipe surfaces. Certain compounds form advantageous films that might act as kinetic barriers to hinder corrosion and function to sequester otherwise soluble lead.

"Stable film development depends on a certain equilibrium chemistry, with consequences for phosphate treatment," Santucci said. "Add more phosphate and you can remove more soluble lead to form a protective lead phosphate film. Remove phosphate completely, and you then rely on hoping that other lead compounds (lead -carbonate, -sulfate, -oxide, etc.) can remove the levels of lead you need," which is usually not the case.

In phosphate-treated water, a lead-phosphate film will form. "From our data, it is thermodynamically impossible to stay within the acceptable range of the EPA's Lead and Copper rule without an inhibitor like phosphate. But it takes time for the scale to form. We need to explore new inhibitor chemicals and surface-altering agents that optimize the protective scale coverage on a pipe wall," Santucci said.
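
The flavor of such an equilibrium argument can be sketched with textbook solubility products. The free-ion activities below are assumed round numbers, not the authors' data, and real pipe scales involve pH, complexation and mixed phases, so this is illustration only.

    # Schematic comparison of dissolved lead in equilibrium with a carbonate
    # scale versus a phosphate scale. Ksp values are common textbook figures;
    # ion activities are assumptions for illustration.

    PB_MOLAR_MASS = 207.2     # g/mol
    KSP_PBCO3 = 7.4e-14       # PbCO3 <-> Pb2+ + CO3^2-
    KSP_PB3PO42 = 8.0e-43     # Pb3(PO4)2 <-> 3 Pb2+ + 2 PO4^3-

    carbonate = 1.0e-6        # assumed free CO3^2- activity, mol/L
    phosphate = 1.0e-6        # assumed free PO4^3- activity, mol/L

    pb_from_carbonate = KSP_PBCO3 / carbonate                    # mol/L
    pb_from_phosphate = (KSP_PB3PO42 / phosphate**2) ** (1 / 3)  # mol/L

    for scale, molarity in (("carbonate scale", pb_from_carbonate),
                            ("phosphate scale", pb_from_phosphate)):
        print(f"{scale}: {molarity * PB_MOLAR_MASS * 1e6:.3g} ug/L dissolved lead")

    # The EPA Lead and Copper Rule action level is 15 ug/L; under these
    # assumptions only the phosphate film keeps dissolved lead far below it.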

The authors also suggest altogether new ideas to anticipate, monitor and prevent future lead in water crises. Artificial intelligence and machine learning could help identify relationships between water and pipe conditions and lead levels in drinking water. Santucci and Scully also propose a promising strategy of using isotope analysis to trace the sources of lead. "This strategy would enhance our understanding of how lead is released from lead pipes and other not so obvious sources, which is dearly lacking," their paper states.

Public officials may argue that the investment in scientific research and modeling is unnecessary because lead-based pipes are being replaced, albeit at homeowners' expense. Scully and Santucci disagree with that perspective. "Total replacement of lead service lines is a wonderful goal, but finding all sources of lead can be difficult," Scully said.

Replacing lead in public water systems does not simply mean replacing lead pipes. Additional sources include lead-based solder used to join pipes, commercial brass that commonly contains small amounts of lead, and lead ions that soak into the corrosion films of steel pipes over time. Partial replacement of lead pipes with pipes of other materials, such as steel or copper, can actually increase lead release through galvanic corrosion, a process in which contact between two dissimilar metals protects one while accelerating the degradation of the other.

Santucci and Scully argue for holistic, kinetic models that incorporate the rate of lead release from all possible sources, under many real-world operating conditions.

"The root cause of the Flint water crisis can be found at the intersection of materials science, surface science, water chemistry and electrochemistry," Scully said. "A better-informed society can prevent such disasters from happening in the future through improved risk assessment, anticipation and management of factors affecting lead release. We all have a part to play in averting future lead poisoning disasters."

Credit: 
University of Virginia School of Engineering and Applied Science

NASA estimates powerful hurricane Teddy's extreme rainfall

image: On Sept. 18 at 8 a.m. EDT (1200 UTC), NASA's IMERG estimated Hurricane Teddy was generating as much as 30 mm (1.18 inches) of rain per hour (dark pink) on the western side of its eye. Rainfall throughout most of the storm was occurring at between 5 and 15 mm (0.2 to 0.6 inches; yellow and green colors) per hour. The rainfall data was overlaid on infrared imagery from NOAA's GOES-16 satellite.

Image: 
NASA/NOAA/NRL

Using a NASA satellite rainfall product that incorporates data from satellites and observations, NASA estimated Hurricane Teddy's rainfall rates. Teddy is a major hurricane in the Central North Atlantic Ocean.

On Sept. 18, NOAA's National Hurricane Center (NHC) warned that Teddy remains a powerful hurricane over the Central Atlantic Ocean, and large ocean swells are forecast to spread across much of the western Atlantic increasing a rip current threat.

Hurricane Teddy's Status on Sept. 18

At 5 a.m. EDT (0900 UTC), the center of Hurricane Teddy was located near latitude 21.6 degrees north and longitude 55.4 degrees west. That is about 550 miles (890 km) east-northeast of the Northern Leeward Islands, and about 935 miles (1,510 km) southeast of Bermuda.

Teddy is moving toward the northwest near 12 mph (19 kph) and this general motion is expected to continue for the next couple of days, followed by a turn to the north by early next week. Maximum sustained winds are near 130 mph (215 kph) with higher gusts. Teddy is a category 4 hurricane on the Saffir-Simpson Hurricane Wind Scale.  Some fluctuations in strength are expected during the next day or so. The estimated minimum central pressure is 947 millibars.
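
For reference, the category follows directly from the maximum sustained wind; a minimal lookup using the standard Saffir-Simpson wind thresholds confirms the advisory's figure.

    # Map 1-minute sustained winds (mph) to a Saffir-Simpson category,
    # using the standard NHC thresholds.

    def saffir_simpson(mph: float) -> str:
        if mph >= 157:
            return "category 5"
        if mph >= 130:
            return "category 4"
        if mph >= 111:
            return "category 3"
        if mph >= 96:
            return "category 2"
        if mph >= 74:
            return "category 1"
        return "below hurricane strength"

    print(saffir_simpson(130))  # category 4, matching the advisory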

Estimating Teddy's Rainfall Rates from Space

NASA's Integrated Multi-satellitE Retrievals for GPM or IMERG, which is a NASA satellite rainfall product, estimated that on Sept. 18 at 8 a.m. EDT (1200 UTC), Hurricane Teddy was generating as much as 30 mm (1.18 inches) of rain per hour on the western side of its eye. That was where the heaviest rainfall was occurring.

Rainfall throughout most of the rest of the storm was estimated as falling at a rate between 5 and 15 mm (0.2 to 0.6 inches) per hour. At the U.S. Naval Research Laboratory in Washington, D.C., the IMERG rainfall data was overlaid on infrared imagery from NOAA's GOES-16 satellite to show the full extent of the storm.

"Teddy remains a powerful category 4 hurricane with a well-defined eye and intense eyewall," said John Cangialosi Senior Hurricane Specialist at NHC in Miami, Fla. "There have been occasional dry slots that have eroded some of the convection in the eyewall and rain bands, but these seem to be transient."

What Does IMERG Do?

This near-real time rainfall estimate comes from NASA's IMERG, which combines observations from a fleet of satellites, in near-real time, to provide near-global estimates of precipitation every 30 minutes. By combining NASA precipitation estimates with other data sources, we can gain a greater understanding of major storms that affect our planet.

What IMERG does is "morph" high-quality satellite observations along the direction of the steering winds to deliver information about rain at times and places where such satellite overflights did not occur. Information morphing is particularly important over the majority of the world's surface that lacks ground-radar coverage. Basically, IMERG fills in the blanks between weather observation stations.
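
As a loose illustration of the morphing idea, the sketch below shifts two rain snapshots along an assumed steering-wind displacement and blends them in time. Everything here (grid, displacement, weighting) is a toy assumption; the real IMERG algorithm is far more sophisticated.

    import numpy as np

    # Toy "morphing": estimate the rain field between two satellite
    # overpasses by advecting each snapshot along the steering wind and
    # time-weighting the results. Schematic only -- not the IMERG code.

    def morph(rain_t0, rain_t1, shift_px, alpha):
        """Rain field a fraction alpha of the way from t0 to t1;
        shift_px is the (rows, cols) displacement over the full interval."""
        r, c = shift_px
        fwd = np.roll(rain_t0, (round(alpha * r), round(alpha * c)), axis=(0, 1))
        back = np.roll(rain_t1, (-round((1 - alpha) * r), -round((1 - alpha) * c)),
                       axis=(0, 1))
        return (1.0 - alpha) * fwd + alpha * back

    rain_t0 = np.zeros((8, 8)); rain_t0[2, 2] = 30.0  # 30 mm/h cell, first pass
    rain_t1 = np.roll(rain_t0, (2, 2), axis=(0, 1))   # same cell, advected

    print(morph(rain_t0, rain_t1, (2, 2), 0.5).max())  # 30.0: cell stays coherent
    print((0.5 * rain_t0 + 0.5 * rain_t1).max())       # 15.0: naive blend smears it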

NHC Key Messages for Teddy

The NHC issued key messages for Teddy, including about its forecast track and the ocean swells it is generating.

NHC said Teddy is expected to approach Bermuda as a hurricane this weekend and make its closest approach to the island late Sunday or Monday (Sept. 20 or 21). While the exact details of Teddy's track and intensity near the island are not yet known, the risk of strong winds, storm surge, and heavy rainfall on Bermuda is increasing.

Large swells produced by Teddy are expected to affect portions of the Leeward Islands, the Greater Antilles, the Bahamas, Bermuda, and the southeastern United States during the next few days. These swells could cause life-threatening surf and rip current conditions.

NASA Researches Tropical Cyclones

Hurricanes/tropical cyclones are the most powerful weather events on Earth. NASA's expertise in space and scientific exploration contributes to essential services provided to the American people by other federal agencies, such as hurricane weather forecasting.

For more than five decades, NASA has used the vantage point of space to understand and explore our home planet, improve lives and safeguard our future. NASA brings together technology, science, and unique global Earth observations to provide societal benefits and strengthen our nation. Advancing knowledge of our home planet contributes directly to America's leadership in space and scientific exploration.

Credit: 
NASA/Goddard Space Flight Center

NASA confirms development of record-breaking tropical storm Wilfred, ending hurricane list

image: On Sept. 18 at 5:30 a.m. EDT (0930 UTC), NASA's IMERG estimated Tropical Storm Wilfred was generating as much as 30 mm/1.18 inches of rain (dark pink) around the center of circulation. Rainfall throughout most of the storm and in fragmented bands of thunderstorms to the southeast of the center, was occurring between 5 and 15 mm (0.2 to 0.6 inches/yellow and green colors) per hour. The rainfall data was overlaid on infrared imagery from NOAA's GOES-16 satellite.

Image: 
NASA/NOAA/NRL

The list of hurricane names is officially used up with the development of the 23rd tropical cyclone of the year. Tropical Storm Wilfred just formed in the Eastern Atlantic Ocean today, Sept. 18. Using a NASA satellite rainfall product that incorporates data from satellites and observations, NASA estimated Wilfred's rainfall rates.

All of the names on the 2020 official list of hurricane names for the Atlantic Ocean hurricane season have now been claimed. That means the next system that forms into a tropical storm will get a name from the Greek alphabet. This has happened only once before in Atlantic hurricane history, in 2005. If Tropical Depression 22, located in the Gulf of Mexico, becomes a tropical storm, it would be named Alpha.

Eric Blake, Senior Hurricane Specialist at NOAA's National Hurricane Center in Miami, Fla. noted, "Wilfred has formed, continuing the record-setting pace of the 2020 hurricane season since it is the earliest 21st named storm on record, about 3 weeks earlier than Vince of 2005."

Wilfred's Status on Sept. 18

At 11 a.m. EDT (1500 UTC), the center of Tropical Storm Wilfred was located near latitude 11.9 degrees north and longitude 32.4 degrees west. That is 630 miles (1,015 km) west-southwest of the Cabo Verde Islands. Wilfred is moving toward the west-northwest near 17 mph (28 kph) and this general motion is expected for the next few days.
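
Those position figures are easy to sanity-check with a standard haversine great-circle calculation. The Cabo Verde reference point used below (near Praia) is an assumed coordinate for illustration.

    import math

    def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance between two lat/lon points, in km."""
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp, dl = p2 - p1, math.radians(lon2 - lon1)
        a = (math.sin(dp / 2) ** 2
             + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
        return 2 * radius_km * math.asin(math.sqrt(a))

    d_km = haversine_km(11.9, -32.4, 14.9, -23.5)  # storm center vs. ~Praia
    print(f"{d_km:.0f} km ~= {d_km / 1.609:.0f} miles")  # ~1,015 km / ~630 miles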

Maximum sustained winds are near 40 mph (65 km/h) with higher gusts. Some slight strengthening is possible today, and weakening should start this weekend and continue into next week. The estimated minimum central pressure is 1008 millibars.

Estimating Wilfred's Rainfall Rates from Space

NASA's Integrated Multi-satellitE Retrievals for GPM or IMERG, which is a NASA satellite rainfall product, estimated that on Sept. 18 at 5:30 a.m. EDT (0930 UTC), Wilfred was generating as much as 30 mm (1.18 inches) of rain per hour around the center of circulation.

Rainfall throughout most of the storm and in fragmented bands of thunderstorms to the southeast of the center was estimated as falling at a rate between 5 and 15 mm (0.2 to 0.6 inches) per hour. At the U.S. Naval Research Laboratory in Washington, D.C., the IMERG rainfall data was overlaid on infrared imagery from NOAA's GOES-16 satellite to show the full extent of the storm.

What Does IMERG Do?

This near-real time rainfall estimate comes from NASA's IMERG, which combines observations from a fleet of satellites, in near-real time, to provide near-global estimates of precipitation every 30 minutes. By combining NASA precipitation estimates with other data sources, we can gain a greater understanding of major storms that affect our planet.

What IMERG does is "morph" high-quality satellite observations along the direction of the steering winds to deliver information about rain at times and places where such satellite overflights did not occur. Information morphing is particularly important over the majority of the world's surface that lacks ground-radar coverage. Basically, IMERG fills in the blanks between weather observation stations.

NASA Researches Tropical Cyclones

Hurricanes/tropical cyclones are the most powerful weather events on Earth. NASA's expertise in space and scientific exploration contributes to essential services provided to the American people by other federal agencies, such as hurricane weather forecasting.

For more than five decades, NASA has used the vantage point of space to understand and explore our home planet, improve lives and safeguard our future. NASA brings together technology, science, and unique global Earth observations to provide societal benefits and strengthen our nation. Advancing knowledge of our home planet contributes directly to America's leadership in space and scientific exploration.

For more information about NASA's IMERG, visit: https://pmm.nasa.gov/gpm/imerg-global-image

Credit: 
NASA/Goddard Space Flight Center

NASA's aqua satellite helps confirm subtropical storm alpha

image: On Sept. 18, 2020, NASA's Aqua satellite provided a visible image of Subtropical Storm Alpha in the eastern North Atlantic Ocean near Portugal's coast.

Image: 
Image Courtesy: NASA Worldview, Earth Observing System Data and Information System (EOSDIS).

Subtropical Storm Alpha has formed near the coast of Portugal, becoming the first storm named from the Greek alphabet list now that the annual list of names is exhausted. NASA's Aqua satellite obtained visible imagery of the new storm.

NASA Satellite View

The Moderate Resolution Imaging Spectroradiometer or MODIS instrument that flies aboard NASA's Aqua satellite captured a visible image of Subtropical Storm Alpha on Sept. 18 at 8:30 a.m. EDT (1:30 p.m. local time) near Portugal. The image showed a better-organized small low-pressure area rotating around a larger extratropical low-pressure area. Satellite imagery shows that moderate-to-deep convection has persisted near the center since the previous night, creating thunderstorms. Meanwhile, scatterometer data show a closed low-pressure area with 40-knot winds, and the National Hurricane Center noted that radar images from Portugal show a definite organized convective pattern.

Although Alpha is "likely neutral- or cold-core, it has developed enough tropical characteristics to be considered a subtropical storm," said Eric Blake, senior hurricane specialist at NOAA's National Hurricane Center (NHC) in Miami, Fla.

Satellite imagery was created using NASA's Worldview product at NASA's Goddard Space Flight Center in Greenbelt, Md.

Alpha's Status

At 12:30 p.m. EDT (1630 UTC), NOAA's National Hurricane Center (NHC) noted the center of Subtropical Storm Alpha was located near latitude 39.9 degrees north and longitude 9.3 degrees west. That is just 75 miles (125 km) north of Lisbon, Portugal. The storm is moving toward the northeast near 17 mph (28 kph), and this general motion is expected during the next day or so before dissipation. Maximum sustained winds are near 50 mph (85 kph) with higher gusts. The estimated minimum central pressure is 999 millibars.

Alpha's Impacts

Alpha should move across the coast of west-central Portugal during the next couple of days. Little change in strength is expected before landfall, with rapid weakening over land through the weekend.

NHC said Alpha is expected to produce 1 to 2 inches (25 to 50 mm) of rainfall, with isolated amounts of 3 inches (75 mm) over the northern portion of Portugal and into west-central Spain through Saturday morning.

Information on wind hazards from Alpha can be found in products from the Portuguese Institute for Sea and Atmosphere at http://www.ipma.pt.

Global models show the small low pressure area moving northeastward for the next 24 hours before dissipating over northern Spain or the Bay of Biscay.

Credit: 
NASA/Goddard Space Flight Center

Undersea earthquakes shake up climate science

image: An artist's rendering of undersea earthquake waves.

Image: 
Caltech

Although climate change is most obvious to people as unseasonably warm winter days or melting glaciers, as much as 95 percent of the extra heat trapped on Earth by greenhouse gases is held in the world's oceans. For that reason, monitoring the temperature of ocean waters has been a priority for climate scientists, and now Caltech researchers have discovered that seismic rumblings on the seafloor can provide them with another tool for doing that.

In a new paper published in Science, the researchers show how they are able to make use of existing seismic monitoring equipment, as well as historic seismic data, to determine how much the temperature of the earth's oceans has changed and continues to change, even at depths that are normally out of the reach of conventional tools.

They do this by listening for the sounds from the many earthquakes that regularly occur under the ocean, says Jörn Callies, assistant professor of environmental science and engineering at Caltech and study co-author. Callies says these earthquake sounds are powerful and travel long distances through the ocean without significantly weakening, which makes them easy to monitor.

Wenbo Wu, postdoctoral scholar in geophysics and lead author of the paper, explains that when an earthquake happens under the ocean, most of its energy travels through the earth, but a portion of that energy is transmitted into the water as sound. These sound waves propagate outward from the quake's epicenter just like seismic waves that travel through the ground, but the sound waves move at a much slower speed. As a result, ground waves will arrive at a seismic monitoring station first, followed by the sound waves, which will appear as a secondary signal of the same event. The effect is roughly similar to how you can often see the flash from lightning seconds before you hear its thunder.

"These sound waves in the ocean can be clearly recorded by seismometers at a much longer distance than thunder -- from thousands of kilometers away," Wu says. "Interestingly, they are even 'louder' than the vibrations traveling deep in the solid Earth, which are more widely used by seismologists."

The speed of sound in water increases as the water's temperature rises, so, the team realized, the length of time it takes a sound to travel a given distance in the ocean can be used to deduce the water's temperature.

"The key is that we use repeating earthquakes--earthquakes that happen again and again in the same place," he says. "In this example we're looking at earthquakes that occur off Sumatra in Indonesia, and we measure when they arrive in the central Indian ocean. It takes about a half hour for them to travel that distance, with water temperature causing about one-tenth-of-a second difference. It's a very small fractional change, but we can measure it."

Wu adds that because they are using a seismometer that has been in the same location in the central Indian Ocean since 2004, they can look back at the data it collected each time an earthquake occurred in Sumatra, for example, and thus determine the temperature of the ocean at that same time.

"We are using small earthquakes that are too small to cause any damage or even be felt by humans at all," Wu says. "But the seismometer can detect them from great distances , thus allowing us to monitor large-scale ocean temperature changes on a particular path in one measurement."

Callies says the data they have analyzed confirm that the Indian Ocean has been warming, as other data collected through other methods have indicated, but that it might be warming even faster than previously estimated.

"The ocean plays a key role in the rate that the climate is changing," he says. "The ocean is the main reservoir of energy in the climate system, and the deep ocean in particular is important to monitor. One advantage of our method is that the sound waves sample depths below 2,000 meters, where there are very few conventional measurements."

Depending on which set of previous data they compare their results to, ocean warming appears to be as much as 69 percent greater than had been believed. However, Callies cautions against drawing any immediate conclusions, as more data need to be collected and analyzed.

Because undersea earthquakes happen all over the world, Callies says it should be possible to expand the system he and his fellow researchers developed so that it can monitor water temperatures in all of the oceans. Wu adds that because the technique makes use of existing infrastructure and equipment, it is relatively low-cost.

"We think we can do this in a lot of other regions," Callies says. "And by doing this, we hope to contribute to the data about how our oceans are warming."

Credit: 
California Institute of Technology

Shift in West African wildmeat trade suggests erosion of cultural taboos

New research by the University of Kent's Durrell Institute of Conservation and Ecology (DICE) has demonstrated a clear fluctuation in the trade of wildmeat in and around the High Niger National Park in Guinea, West Africa.

Conservationists found a significant increase in the trading of species that forage on crops including the green monkey (Chlorocebus sabaeus) and warthog (Phacochoerus africanus), in comparison with earlier data, in spite of religious taboos against the killing and consumption of monkeys and wild pigs by Muslims in the region. These species are increasingly being killed to protect crops and farmers can gain economically from their sale, providing an additional incentive for killing. The consumption of wild pigs is prohibited by Islam, yet a marked increase in the number of carcasses recorded in rural areas from 2011 to 2017 has suggested an erosion in the religious taboo.

The research team led by Dr Tatyana Humle (DICE), alongside colleagues from Beijing Forestry University, China, and the Higher Institute of Agronomy and Veterinary of Faranah, Guinea, drew conclusions after examining the wildmeat trade in three rural markets in the Park and in the nearest urban centre, Faranah, collecting market survey data during August-November 2017 and comparing it with data from the same period in the 1990s, 2001 and 2011.

Mammals, most notably small-sized species, now dominate the wildmeat trade around High Niger National Park. Further findings indicate a marked increase in the number of carcasses and biomass offered for sale from 2001 onwards in rural areas, whereas in Faranah there were no notable differences from data gathered in 1994. Urban demand therefore does not appear to be driving the wildmeat trade in this region. Instead, the wildmeat trade in rural areas could perhaps be linked to an increase in human population and limited access to alternative sources of animal protein.

Dr Humle said: 'This study highlights that despite the local reduction in urban demand for wildmeat, pressures on wildlife in the Park remain. The prominence of crop-protection is increasingly being recognised for driving the wildmeat trade across rural West Africa, yet there is a need to better understand the motivations behind hunting and supply and demand dynamics. There is wider scope to investigate and improve the balance between local farmers' livelihoods and biodiversity conservation.'

Credit: 
University of Kent

Development of high-sensitivity, wide-IF band heterodyne receiver in THz frequency range

image: HEBM structure and a micrograph

Image: 
National Institute of Information and Communications Technology (NICT)

[Abstract]

The National Institute of Information and Communications Technology (NICT, President: TOKUDA Hideyuki, Ph.D.) has developed a unique superconducting hot electron bolometer mixer (HEBM) using magnetic materials. As a result, the noise of the 2 THz band heterodyne receiver has been reduced and a wide IF band achieved. The 2 THz band HEBM produced this time shows low-noise performance of about 570 K (DSB), about 6 times the quantum noise limit, and a wide IF band of about 6.9 GHz, about 3 GHz wider than that of the conventional-structure HEBM. Both figures are world-class performance.
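
That "6 times the quantum noise limit" figure can be checked directly, taking the quantum noise limit of a coherent (heterodyne) receiver as T_Q = hf/k_B, the usual convention:

    # Check of "about 6 times the quantum noise limit" at 2 THz, using
    # T_Q = h*f / k_B for a coherent receiver.

    H = 6.62607015e-34   # Planck constant, J*s
    K_B = 1.380649e-23   # Boltzmann constant, J/K

    f = 2.0e12           # operating frequency: 2 THz
    T_q = H * f / K_B

    print(f"T_Q at 2 THz: {T_q:.0f} K")        # ~96 K
    print(f"570 K / T_Q: {570.0 / T_q:.1f}x")  # ~5.9x, i.e. "about 6 times"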

As a basic technology in the still-undeveloped THz frequency domain, this technology is expected to contribute to the development of new frequency resources for applications such as high-speed wireless communication, non-destructive inspection, global environment measurement and radio astronomy.

[Background]

The terahertz frequency range is an unexplored region that is expected to find applications in high-speed wireless communication, nondestructive inspection, security, medical care, global environment measurement and radio astronomy. To realize these applications, however, it is first necessary to develop the fundamental technologies of oscillation and detection.

So far, superconducting SIS mixers have demonstrated excellent heterodyne receiver performance, with the lowest noise and a wide IF band, at frequencies up to 1 THz. However, the upper frequency limit of their operation is considered to be about 1.5 THz, and the superconducting hot electron bolometer mixer (HEBM) is currently under research and development as a low-noise mixer element for the frequency region above 1.5 THz.

It has already been reported that HEBMs exhibit low-noise receiver operation below 10 times the quantum noise limit in the frequency region above 1.5 THz. However, the HEBM has had a problem that must be solved before practical application: its IF bandwidth, which determines the amount of information that can be processed at one time, is narrow. Compared with a superconducting SIS mixer, which can secure an IF bandwidth of 20 GHz or more, the HEBM managed less than a quarter of that, at 3 to 5 GHz. Because expanding the IF bandwidth brings great application merits, a wider IF bandwidth for the HEBM has long been in demand.

[Achievements]

NICT has developed a new HEBM structure using magnetic materials as a detection technology--a basic technology for terahertz waves--in a collaboration between the Advanced ICT Research Institute and the Applied Electromagnetic Research Institute at the Terahertz Technology Research Center. The new HEBM offers low noise performance and a wide IF bandwidth at 2 THz.

The HEBM has a structure in which a small piece of superconducting thin film (a superconducting strip) is placed between two metal electrodes, and it is a mixer that exploits the strong impedance nonlinearity generated at the superconducting-to-normal transition (see Figure 1 (a)). NICT has now developed a new HEBM structure, unique to NICT, in which a nickel (Ni) thin film, a magnetic material, is inserted between the superconducting thin film and the metal electrodes so that superconductivity remains only in the superconducting strip between the electrodes (see Figure 2 (b)). This structure allows the HEBM to be further miniaturized and has realized a wider IF band as well as lower detector noise.

We therefore fabricated a miniaturized HEBM with a superconducting strip length of 0.1 μm and achieved a mixer noise temperature, corrected for the loss of the input optical system, of Trx = 570 K (DSB) at a measurement frequency of 2 THz. This is extremely low-noise operation, about 6 times the quantum noise limit. In addition, an IF bandwidth of about 6.9 GHz, about 3 GHz wider than that of the conventional HEBM, was obtained, confirming that the new HEBM structure using magnetic materials is effective in improving receiver performance (see Figure 3). These results were obtained at the actual operating temperature of 4 K and, we believe, represent world-leading performance for a terahertz-band HEBM.

[Future Prospects]

NICT is working on the development of a waveguide HEBM with the aim of commercializing the 2 THz band HEBM. We aim to apply it to remote sensing technologies such as global environment measurement and radio astronomy.

Credit: 
National Institute of Information and Communications Technology (NICT)

Who stole the light?

image: Schematic sketch of the scattering experiment with two competing processes. The soft x-ray beam (blue arrow, from left) hits the magnetic sample (circular area) where it scatters from the microscopic, labyrinth-like magnetization pattern. In this process, an x-ray photon is first absorbed by a Cobalt 3p core electron (a). The resulting excited state can then relax spontaneously (b), emitting a photon in a new direction (purple arrow). This scattered light is recorded as the signal of interest in experiments. However, if another x-ray photon encounters an already excited state, stimulated emission occurs (c). Here, two identical photons are emitted in the direction of the incident beam (blue arrow towards right). This light carries only little information on the sample magnetization and is usually blocked for practical reasons.

Image: 
MBI Berlin

Free electron X-ray lasers deliver intense ultrashort pulses of x-rays, which can be used to image nanometer-scale objects in a single shot. When the x-ray wavelength is tuned to an electronic resonance, magnetization patterns can be made visible. When using increasingly intense pulses, however, the magnetization image fades away. The mechanism responsible for this loss in resonant magnetic scattering intensity has now been clarified.

Just as in flash photography, short yet intense flashes of x-rays allow researchers to record images or x-ray diffraction patterns that "freeze" motion slower than the duration of the x-ray pulse. The advantage of x-rays over visible light is that nanometer-scale objects can be discerned, thanks to the short wavelength of x-rays. Furthermore, if the wavelength of the x-rays is tuned to the particular energies of electronic transitions, unique contrast can be obtained, allowing, for example, the magnetization of different domains within a material to be made visible. The fraction of x-rays scattered from a magnetic domain pattern, however, decreases when the x-ray intensity of the pulse is increased. While this effect had already been observed in the very first images of magnetic domains recorded at a free-electron x-ray laser in 2012, a variety of different explanations had been put forward for this loss in scattered x-ray intensity.

A team of researchers from MBI Berlin, together with colleagues from Italy and France, has now precisely recorded the dependence of the resonant magnetic scattering intensity on the x-ray intensity incident per unit area (the "fluence") on a ferromagnetic domain sample. By integrating a device to detect the intensity of every single shot hitting the actual sample area, they were able to record the scattering intensity over three orders of magnitude in fluence with unprecedented precision, in spite of the intrinsic shot-to-shot variations of the x-ray beam hitting the tiny samples. The experiments with soft x-rays were carried out at the FERMI free-electron x-ray laser in Trieste, Italy.

Magnetization is a property directly coupled to the electrons of a material, which make up the magnetic moment via their spin and orbital motion. For their experiments, the researchers used patterns of ferromagnetic domains forming in cobalt-containing multilayers, a prototypical material often used in magnetic scattering experiments at x-ray lasers. In the interaction with x-rays, the population of electrons is disturbed and energy levels can be altered. Both effects could lead to a reduction in scattering, either through a transient reduction of the actual magnetization in the material due to the reshuffling of electrons with different spin, or through an inability to detect the magnetization anymore because of the shift in the energy levels. Furthermore, it has been debated whether the onset of stimulated emission at high x-ray fluences administered during a pulse of about 100 femtoseconds duration can be responsible for the loss in scattering intensity. In the latter case, the mechanism is that in stimulated emission the direction of an emitted photon is copied from the incident photon. As a result, the emitted x-ray photon would not contribute to the beam scattered away from the original direction, as sketched in Fig. 1.

In the results presented in the journal Physical Review Letters, the researchers show that while the loss in magnetic scattering in resonance with the Co 2p core levels has been attributed to stimulated emission in the past, this process is not significant for scattering in resonance with the shallower Co 3p core levels. The experimental data over the entire fluence range is well described by simply considering the actual demagnetization occurring within each magnetic domain, which the MBI researchers had previously characterized in laser-based experiments. Given the short lifetime of the Co 3p core levels of about a quarter femtosecond, which is dominated by Auger decay, it is likely that the hot electrons generated by the Auger cascade, together with subsequent electron scattering events, lead to a reshuffling of spin-up and spin-down electrons that transiently quenches the magnetization. As this reduced magnetization manifests itself already within the duration of the x-ray pulses used (70 and 120 femtoseconds) and persists for a much longer time, the latter part of the x-ray pulse interacts with a domain pattern in which the magnetization has actually faded away. This is in line with the observation that less reduction of the magnetic scattering is observed when hitting the magnetic sample with the same number of x-ray photons within a shorter pulse duration (Fig. 2). In contrast, if stimulated emission were the dominant mechanism, the opposite behavior would be expected.
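
The pulse-duration argument can be made concrete with a toy calculation: let the magnetization relax, with a finite response time, toward a level set by the fluence absorbed so far, and weight the scattered signal by m(t)^2. The dynamics and parameter values below are crude assumptions for illustration, not the measured response.

    import math

    # Toy model: magnetic scattering weight ~ m(t)^2, where m relaxes with
    # time constant tau toward a quasi-equilibrium value set by the energy
    # absorbed so far. All parameters are illustrative assumptions.

    def relative_scattering(pulse_fs, fluence=1.0, k=2.0, tau_fs=100.0, steps=20000):
        dt = pulse_fs / steps
        intensity = fluence / pulse_fs        # flat-top pulse
        m, absorbed, signal = 1.0, 0.0, 0.0
        for _ in range(steps):
            absorbed += intensity * dt
            target = math.exp(-k * absorbed)  # more energy in -> less magnetization
            m += (target - m) * dt / tau_fs   # finite demagnetization time
            signal += m * m * intensity * dt
        return signal / fluence

    for duration_fs in (70.0, 120.0):
        print(f"{duration_fs:.0f} fs pulse: relative scattering "
              f"{relative_scattering(duration_fs):.3f}")

    # The shorter pulse sees more of the still-magnetized sample and scatters
    # more -- the opposite of what stimulated emission would predict.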

Beyond clarifying the mechanism at work, the findings have important ramifications for future single shot experiments on magnetic materials at free electron x-ray lasers. Similar to the situation in structural biology, where imaging of protein molecules by intense x-ray laser pulses can be impeded by the destruction of the molecule during the pulse, researchers investigating magnetic nanostructures also have to choose the fluence and pulse duration wisely in their experiments. With the fluence dependence of resonant magnetic scattering mapped out, researchers at x-ray lasers now have a guideline to design their future experiments accordingly.

Credit: 
Forschungsverbund Berlin

Mapping the decision-making pathways in the brain

image: In a simple behavioral experiment, rats choose between better quality food that is delivered in conjunction with an uncomfortably bright light, or lower quality food with a less bright light. Signals from the ventromedial thalamus may affect which choice the rats make.

Image: 
OIST

Every day, we make hundreds of decisions. While most are small and inconsequential, like choosing what to eat or wear, others are more complex and involve weighing up potential costs and benefits, like deciding whether to study more for a better grade instead of hanging out with friends.

Decisions like these are shaped by our own values and preferences, but how our brains make these choices is still not well understood. Now, scientists at the Okinawa Institute of Science and Technology Graduate University (OIST) have identified a new area of the brain that could be involved in cost-benefit decision-making.

"Previously, many neuroscientists believed that each area of the brain carried out a specific function, such as recognizing faces, memory or movement," said first author Dr. Bianca Sieveritz, former PhD student and now Junior Research Fellow in the OIST Brain Mechanism for Behavior Unit. "But in more recent years, we've realized that it's far more complex, and that cognitive processes are carried out by a distributed network across the brain, with many different brain areas communicating."

To fully understand how cognitive abilities like decision-making work, scientists first need to figure out which parts of the brain are connected to each other. These connections are formed by specialized cells called neurons. Neurons have long, thin projections that can reach out and send signals to other neurons located in different regions of the brain.

But mapping these connections is not easy. "It's not as simple as identifying that one brain area is connected to another," explained Dr. Sieveritz. "Within each brain region, there are many different types, or sub-classes, of neurons, which you need to be able to identify, and each neuronal sub-class might only connect to one other area of the brain or be involved in one specific function. So it's very messy and complicated."

In this study, published in Brain Structure and Function, Dr. Sieveritz examined an area of the brain known as the ventromedial thalamus. This region of the brain is predominantly involved in movement, with abnormal function in this area associated with symptoms of Parkinson's disease. But many neuroscientists believe that this area may have other, undiscovered functions.

While many neurons from the ventromedial thalamus connect to the motor cortex, other neurons reach further into the prefrontal cortex. This area lies right at the front of our brain and plays a major role in more complex cognitive behaviors, including expression of personality and understanding of language.

But the prefrontal cortex is huge, comprising millions of neurons, so this research focused on a tiny area within it - the prelimbic cortex. Located right where the two brain hemispheres meet, the prelimbic cortex has previously been found to play a role in fear conditioning, working memory and decision-making.

Working with brain slices from rats, Dr. Sieveritz used different chemical markers to stain the ends of the neurons from the ventromedial thalamus as well as the ends of the different neurons within the outermost layer of the prelimbic cortex. She then pinpointed where the ends of the neurons met each other and formed a connection point, or synapse.

She found that most connections formed within the prelimbic cortex were with a specific sub-class of neurons called cortico-striatal neurons.

"Cortico-striatal neurons in the prelimbic cortex are important for cost-benefit decision-making," said Dr. Sieveritz. "If the ventromedial thalamus is sending signals to these cortico-striatal neurons, this could mean that the ventromedial thalamus plays a role in cost-benefit decision-making too."

Interestingly, Dr. Sieveritz noticed that neurons from the ventromedial thalamus were also sending signals to inhibitory neurons within the prelimbic cortex. Inhibitory neurons slow down or stop other neurons from firing and are essential for keeping the brain's activity under careful control.

"We know that these inhibitory neurons actually extend down into the cortex and connect to the cortico-striatal neurons in deeper layers of the brain and possibly inhibit their activity," said Dr. Sieveritz. "The connections between the neurons from the ventromedial thalamus and the inhibitory neurons therefore may provide a way of finetuning the activity of the cortico-striatal neurons."

Dr. Sieveritz stressed that while her results implicate the ventromedial thalamus in cost-benefit decision-making, further research is needed to uncover its exact role in this cognitive process. The team now plans to conduct behavioral studies in rats to determine if signals from the ventromedial thalamus change whether rats choose higher quality food at a higher cost, or lower quality food at a lower cost.

This research brings scientists one step closer to understanding the circuitry underlying the complex process of cost-benefit decision-making. "Neuroscience is like a huge jigsaw puzzle; everyone can only do a tiny part," said Dr. Sieveritz. "But when all our research is combined, hopefully we will start to see the bigger picture."

Credit: 
Okinawa Institute of Science and Technology (OIST) Graduate University