Tech

Researchers at IDIBELL-ICO describe a new resistance mechanism

image: Pictured here are Dr. Oriol Casanovas (left) and Dr. Iratxe Zuazo (right).

Image: 
Iratxe Zuazo

Researchers at IDIBELL-ICO describe a new resistance mechanism to therapies that prevent the formation of blood vessels

Unlike common anti-angiogenic drugs, the mechanism does not involve a reduction of oxygen in the tumor's cells.

In response to treatment, the tumor's immune cells act as drivers of malignancy.

Researchers at the Bellvitge Biomedical Research Institute (IDIBELL) and the ProCure Program of the Catalan Institute of Oncology (ICO) have published today in Cancer Research a study describing a new mechanism in cancer that makes tumor cells malignant, contradicting what had been published so far about resistance to drugs that prevent the formation of blood vessels (anti-angiogenics). The research was led by Dr. Oriol Casanovas, from the Tumor Angiogenesis group at IDIBELL, and Dr. Iratxe Zuazo is one of the first authors.

Antibody development is one of the therapeutic routes being explored in the search for new ways to fight cancer. Antibodies can have an anti-tumor effect, preventing tumors from developing properly. In this study, the scientists observed that one such antibody produced effects similar to those known for traditional anti-angiogenic medications. However, the initial response to treatment culminated in the appearance of long-term resistance and malignancy through a mechanism unknown until now.

"In the formation of blood vessels (angiogenesis), it had been described that conditions of low oxygen concentration (hypoxia) in the tumor caused the malignant transformation of tumor cells, which became more aggressive and migrated," says Dr. Iratxe Zuazo, "but our samples did not present hypoxic conditions and we still saw this effect."

The antibody in the study targets Semaphorin 4D (Sema4D). Semaphorins are a large and diverse family of extracellular proteins involved in cell signaling, and are essential for the development and maintenance of many organs and tissues. Some of them are implicated in angiogenesis and cancer progression. Their name comes from the Greek semaphero, which means "bearer of signals".

Sema4D is a protein expressed predominantly on the cell membrane in solid tumors, such as breast, prostate and colon cancer. It is also found in tumor-associated macrophages (TAMs), immune cells that play an important role in tumor invasion, tumor formation and metastasis. Sema4D is also involved in the formation of blood vessels.

Faced with this new situation, the researchers looked for the differential factor relative to what was known to date, and realized that macrophages were present in large numbers. They saw that, in the presence of the anti-Sema4D antibody, the macrophages secreted a molecule named SDF1, which makes the tumor cells migrate more: they show greater motility and invasiveness.

These tests were done in transgenic mouse models, and short-term survival of the animals increased. In the long term, however, an unwanted effect (metastasis) appeared, significantly worsening the condition of the mice.

"Now that we know the new mechanism, we can begin to look for a way to inhibit it - for example, by preventing the secretion of SDF1 - and, in this way, offer an alternative that retains only the positive effects of the antibody," explains Dr. Oriol Casanovas. He adds that "depending on the case, treatment could combine two drugs given simultaneously".

"What we propose is that the immune system also be taken into account when designing therapies, because we now know that some drugs may activate it," concludes Dr. Iratxe Zuazo.

Credit: 
IDIBELL-Bellvitge Biomedical Research Institute

Greater awareness needed of stomach cancer risk in under-40s, especially in Latin America

image: Abstract - P-145 'Gastric cancer in young Latin women: bad prognostic factors and outcomes' will be presented by Germán Calderillo-Ruiz during the Poster Session on 4 July, 11:10-11:30 CEST.

Image: 
© European Society for Medical Oncology

Barcelona, Spain, 2 July 2019 - Stomach cancer should no longer be considered a disease only of older people, and patients under 40 with chronic digestive symptoms should be more actively investigated - especially if they are of Latin American ethnicity. This advice follows new data from a retrospective, observational study in Mexico which showed that one in seven of over 2,000 patients diagnosed with gastric cancer between 2004 and 2016 were under 40. These findings, reported at the ESMO World Congress on Gastrointestinal Cancer 2019, (1,2) support US National Cancer Institute data showing that gastric cancer is affecting more young Hispanic people, with worse outcomes than in older patients.

"At our centre, we have seen a 120% increase in gastric cancer in younger patients in the last 12 years and this increase has been mainly in female patients who typically present with more advanced disease and worse prognostic indicators than men, with an adverse impact on survival," said study author Dr German Calderillo-Ruiz, from the National Cancer Institute, Tlalpan, Mexico.

In the Mexican study, over half of patients under 40 with gastric cancer were women, in contrast to previous research which has typically shown that gastric cancer is most common in men. Female patients in Mexico were more likely to have diffuse-type and poorly differentiated tumours and later stage disease at diagnosis than male patients, with a significantly lower overall survival.

"The lack of financial resources may impact on women's behaviour of delaying pursuit of medical care when gastric symptoms appear. We hope this research will encourage clinicians and patients to be more aware of the risk of gastric cancer in younger people and, in particular, to encourage women with gastric symptoms to seek medical help sooner," said Calderillo-Ruiz.

Commenting on the implications of the research, Dr Rodrigo Dienstmann, from Vall d'Hebron Institute of Oncology, Barcelona, Spain, highlighted the combination of genetic and environmental factors that contribute to gastric cancer and the fact that young people with gastric cancer have more aggressive disease which is less responsive to curative treatment.

"We cannot change the genetic factors but we can act on the unhealthy diet, obesity, and untreated Helicobacter pylori infection which increase the risk of gastric cancer. Helicobacter infection can cause chronic inflammation and lesions that are precursors to gastric cancer but, once diagnosed, can be cured with a combination of antibiotics and drugs to reduce stomach acid," said Dienstmann.

"Younger people who regularly experience indigestion, heartburn or other gastric symptoms should not ignore them but should go to their doctor, as they probably need diagnostic tests. In addition, clinicians should not ignore the possibility of gastric cancer in young populations, particularly in Latin America or among Hispanics in North America," concluded Dienstmann.

Following the latest research in Mexico, epidemiological and molecular studies are being conducted in Latin America and Europe to investigate the different molecular subtypes of gastric cancer in the regions and improve our understanding of risk factors in these populations. (3)

Study results

In the Mexican study, data from 2,022 patients with gastric adenocarcinoma diagnosed between 2004 and 2016 were analysed, of whom 290 patients (14%) were under 40. Of these, 54% were women and 46% were men. Women had higher levels of factors indicating poor prognosis than men: diffuse-type tumour (68% vs 32%; P=0.127), signet-ring cells (76% vs 69%; P=0.049), poorly differentiated tumours (89% vs 84%; P=0.014) and higher prevalence of stage IV disease (59% vs 41%; P=0.011).

Overall survival was a median of 7 versus 8 months for women and men respectively (P=0.03; hazard ratio (HR) 1.29; 95% CI, 1.05-1.65). Median overall survival was significantly worse in patients with tumours at the oesophagogastric junction: 7 vs 14 months (P=0.23; HR 0.68; 95% CI, 1.05-2.688), and with more advanced disease: 33, 12 and 5 months for clinical stages I-III, locally advanced, and stage IV disease, respectively (P=0.001; HR=2.28; 95% CI, 1.72-3.01). Independent predictors of overall survival were maintained in a Cox regression analysis: gender (P=0.038, HR 1.29, 95% CI 1.01-1.65), primary tumour (P=0.02, HR 1.68, 95% CI 1.05-2.68) and clinical stage (P=0.001, HR 2.28, 95% CI 1.72-3.01).

Credit: 
European Society for Medical Oncology

Mechanical vibration generated by electron spins

image: The sample consists of a cantilever made of Y3Fe5O12 (YIG) connected to an edge of a YIG film, and a heater placed on the YIG film around the root of the cantilever. An electric current applied to the heater generates heat, which flows across the YIG film and the GGG substrate toward the sample holder. The heat current creates spin-wave (magnon) accumulation at the surface and the bottom of the YIG film. The accumulation injects spin current into the YIG cantilever connected at the surface of the film.

Image: 
Kazuya Harii

Micromechanical elements are indispensable components of modern electrical devices, but actuating them requires electrical current, and wiring the elements becomes harder as devices are scaled down further. As a way around this issue, researchers have demonstrated a new way to deliver a driving force to micromechanics using spin current.

Spin current is a flow of electron angular momentum in matter. It has been used as a new information carrier in spintronics, for example in hard disk drive (HDD) heads and magnetic random-access memory (MRAM). In this context, injecting a spin current can control the orientation of micro-magnets by exerting magnetic torque.

Considering the angular-momentum nature of spin current, what would happen if it were injected into a mechanical object? The excess angular momentum might exert a mechanical torque on the object. This was the idea behind the study.

In this study, the researchers fabricated a micro-cantilever structure made of the magnetic insulator yttrium iron garnet (YIG: Y3Fe5O12). A thin metallic wire was placed at the root of the cantilever as a heater. When electrical current flows in the wire, it generates a spin current via the spin Seebeck effect, and that spin current propagates into the micro-cantilever. The researchers measured the vibration of the cantilever while injecting spin current modulated near the cantilever's resonant frequency. The measurement confirmed that the cantilever's vibration is excited only when the injected spin current has the appropriate spin orientation.
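Why the spin current was modulated near the cantilever's resonant frequency can be illustrated with the textbook driven, damped harmonic oscillator: the steady-state response is sharply peaked at resonance. The sketch below uses illustrative parameter values, not figures from the experiment.

```python
import math

def steady_state_amplitude(drive_freq, natural_freq=1.0, damping=0.05, force=1.0):
    """Steady-state amplitude of x'' + 2*z*w0*x' + w0^2*x = F*cos(w*t)."""
    w0, w, z = natural_freq, drive_freq, damping
    return force / math.sqrt((w0**2 - w**2)**2 + (2 * z * w0 * w)**2)

# Driving at the resonant frequency gives a far larger response than
# driving well away from it, which is why a modulated drive near
# resonance produces a measurable vibration.
print(steady_state_amplitude(1.0))  # 10.0 (on resonance, = F / (2*z*w0^2))
print(steady_state_amplitude(2.0))  # ~0.33 (off resonance)
```

For weak damping, the on-resonance amplitude scales as 1/damping, so even a small periodic torque can build up a detectable oscillation.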

"This driving mechanism for micro machines does not require electrical wiring on the machine itself," said Kazuya Harii, a researcher at the ERATO Saitoh Spin Quantum Rectification Project. "This mechanism is applicable to any mechanical object at micro- and nanometre scales."

Credit: 
Japan Science and Technology Agency

Details of UK-led solar science mission revealed at National Astronomy Meeting

image: A 90 cm diameter external occulter extends (post-launch) from the sunward Leading CubeSat, casting an artificial eclipse on the instrumentation of the anti-sunward Trailing CubeSat, at a separation of 100 m. The pair of CubeSats will fly in formation, maintaining position via an ion propulsion thruster system throughout the 5-year mission lifetime. Uniquely, the Leading CubeSat serves as an occulter as well as housing instrumentation for solar observations of coronal magnetic fields.

Image: 
Dr Eamon Scullion, Northumbria University, Newcastle upon Tyne

Named after a Celtic goddess of the Sun, SULIS is a UK-led solar science mission, designed to answer fundamental questions about the physics of solar storms. The mission consists of a cluster of small satellites and will carefully monitor solar storms using state-of-the-art UK technology, as well as demonstrating new technologies in space. Lead Investigator on the project, Dr Eamon Scullion of Northumbria University, will reveal plans for the mission on Wednesday, 3 July at the Royal Astronomical Society's National Astronomy Meeting in Lancaster.

Once funded, the mission will study the nature of solar eruptions, and track huge magnetic clouds of charged gas as they travel at high speed on a collision course with Earth.

"SULIS will apply high definition remote sensing in 3D to help space scientists finally understand what these magnetic clouds of charged gas are made of, how much matter they contain, what causes their eruption, how fast they are travelling and, most importantly, how damaging they could be to Earth" explains Scullion.

Solar storms occur when the Sun releases enormous bursts of energy as solar flares, launching huge magnetic clouds of charged gas, known as coronal mass ejections. It is the interaction of these charged particles with the Earth's atmosphere that results in aurora, but solar storms can also have a more significant impact on Earth, causing global mobile phone or GPS disruption, radio blackouts, and satellite failures.

The coronal magnetic field is one of the most important physical properties of the solar atmosphere and yet it is one of the least explored. SULIS will include instruments to directly measure the magnetic field of the solar corona for the first time, with three pairs of formation-flying coronagraphs in orbit around the Sun. The first pair will be put into Earth orbit, with the other two pairs to be positioned ahead of, and behind, Earth in its orbit for a mission lifetime of 10 years.

Severe space weather is included on the UK National Risk Register, meaning government departments including military, energy, civil aviation, and transport must plan for this risk. "Solar storms are unavoidable" says Scullion, "but with SULIS we will learn about their basic building blocks in order to more accurately forecast when the next 'big one' will arrive. Having advanced warnings will enable us to take steps to minimise the impact".

On 2 July 2019, SULIS Co-Investigator Dr Huw Morgan will be part of a team of solar scientists from Aberystwyth University visiting Chile to observe the total solar eclipse. The eclipse provides ideal conditions for testing a state-of-the-art compact hyperspectral imager, which is expected to be incorporated into the SULIS mission on one of the satellite pairs. Dr Morgan says, "The SULIS consortium are now awaiting the outcome of the eclipse observing run with high expectations for spectacular HD images of the solar corona".

SULIS is not only designed to be a space science mission, but also to demonstrate technology for precision alignment of small satellites flying in formation, and future communications.

Precision manoeuvring in formation flying is a challenge for satellite constellations and is crucial for maintaining a functioning coronagraph in space. The coronagraph is essentially an artificial eclipse: one satellite in a pair eclipses the Sun from the viewpoint of the other. This eclipsing blocks out the bright light of the Sun's surface so that the faint light coming only from the corona can be detected and measured. SULIS will investigate the nature of the coronal magnetic field by inspecting subtle changes in the properties of the coronal light itself.
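As a rough sanity check on the occulter geometry from the image caption (not a statement of the actual optical design margins), a 90 cm disc at 100 m separation subtends roughly the same angle as the Sun does from Earth, which is what lets it blot out the bright photosphere while leaving the faint corona observable:

```python
import math

occulter_diameter_m = 0.90  # external occulter on the leading CubeSat
separation_m = 100.0        # leading-trailing CubeSat separation

occulter_angle = 2 * math.atan(occulter_diameter_m / 2 / separation_m)
sun_angle = math.radians(0.533)  # mean angular diameter of the Sun from Earth

# Both come out near half a degree.
print(f"occulter subtends {math.degrees(occulter_angle):.3f} deg")
print(f"Sun subtends      {math.degrees(sun_angle):.3f} deg")
```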

The mission will also demonstrate the use of laser power transfer in space and laser communications in Low Earth Orbits (i.e. for both inter-satellite communications and satellite-to-Earth communications). This is essential for small satellites with instruments capable of recording vast amounts of data, which either require more efficient ways to store large volumes of data locally, or mechanisms for moving data off the satellites extremely rapidly to avoid hardware and data telemetry issues.

On the SULIS mission, one of the satellites in the formation-flying pair will be shadowed by the other, meaning that the partially eclipsed satellite will require additional power to carry out all its functions. This will be supplied through laser power transfer from the satellite that is in permanent sunlight.

The ability to transfer power to otherwise "dead" satellites could be highly useful for future small satellite cluster missions, increasing their longevity, and helping to manage the ever-growing space debris problem.

"We are excited to be developing a mission to expand the UK's role in solar physics," says Scullion. "The SULIS mission complements existing and proposed operational space weather missions from NASA and ESA and will help pave the way for future space weather instruments".

Credit: 
Royal Astronomical Society

Personalized medicine software vulnerability uncovered by Sandia researchers

image: Researchers at Sandia National Laboratories uncovered a vulnerability in open source genome mapping software that has now been fixed by developers.

Image: 
Brent Haglund

LIVERMORE, Calif. -- A weakness in one common open source software for genomic analysis left DNA-based medical diagnostics vulnerable to cyberattacks.

Researchers at Sandia National Laboratories identified the weakness and notified the software developers, who issued a patch to fix the problem. The issue has also been fixed in the latest release of the software. While no attack exploiting this vulnerability is known, the National Institute of Standards and Technology recently described it in a note to software developers, genomics researchers and network administrators.

The discovery reveals that protecting genomic information involves more than safe storage of an individual's genetic information. The cybersecurity of computer systems analyzing genetic data is also crucial, said Corey Hudson, a bioinformatics researcher at Sandia who helped uncover the issue.

Personalized medicine -- the process of using a patient's genetic information to guide medical treatment -- involves two steps: sequencing the entire genetic content from a patient's cells and comparing that sequence to a standardized human genome. Through that comparison, doctors identify specific genetic changes in a patient that are linked to disease.

Genome sequencing starts with cutting and replicating a person's genetic information into millions of small pieces. Then a machine reads each piece numerous times and transforms images of the pieces into sequences of building blocks, commonly represented by the letters A, T, C and G. Finally, software collects those sequences and matches each snippet to its place on a standardized human genome sequence. One matching program used widely by personalized genomics researchers is called Burrows-Wheeler Aligner (BWA).
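The matching step described above can be illustrated with a toy sketch that locates short reads on a reference string via naive substring search. Real aligners such as BWA use a Burrows-Wheeler transform index and tolerate sequencing errors; the sequences below are made up for illustration only.

```python
# Made-up reference and reads, purely to illustrate the matching step.
reference = "ATCGGATTACACGTATCGGCATTACA"

def align(read, ref):
    """Return the 0-based position of an exact match, or None if absent."""
    pos = ref.find(read)
    return pos if pos >= 0 else None

reads = ["GATTACA", "CGTATCG", "TTTTTTT"]
positions = {r: align(r, reference) for r in reads}
print(positions)  # {'GATTACA': 4, 'CGTATCG': 11, 'TTTTTTT': None}
```

The final analysis depends entirely on the reference being correct, which is what makes tampering with it in transit so consequential.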

Sandia researchers studying the cybersecurity of this program found a weak spot when the program imports the standardized genome from government servers. The standardized genome sequence traveled over insecure channels, which created the opportunity for a common cyberattack called a "man-in-the-middle."

In this attack, an adversary or hacker could intercept the standard genome sequence and transmit it to a BWA user along with a malicious program that alters genetic information obtained from sequencing. The malware could then change a patient's raw genetic data during genome mapping, making the final analysis incorrect without anyone knowing. In practice, this means doctors might prescribe a drug based on the altered analysis that, with the correct information, they would have known to be ineffective or toxic for the patient.

Forensic labs and genome sequencing companies that use this mapping software were also temporarily vulnerable to having results maliciously altered in the same way. Information from direct-to-consumer genetic tests was not affected, because these tests use a different sequencing method than whole genome sequencing, Hudson said.

Security cybersleuths

To find this vulnerability, Hudson and his cybersecurity colleagues at the University of Illinois at Urbana-Champaign used a platform developed by Sandia called Emulytics to simulate the process of genome mapping. First, they imported genetic information simulated to resemble that from a sequencer. Then they had two servers send information to Emulytics. One provided a standard genome sequence and the other acted as the "man-in-the-middle" interceptor. The researchers mapped the sequencing results and compared results with and without an attack to see how the attack changed the final sequence.

"Once we discovered that this attack could change a patient's genetic information, we followed responsible disclosure," Hudson said. The researchers contacted the open source developers, who then issued a patch to fix the problem. They also contacted public agencies, including cybersecurity experts at the U.S. Computer Emergency Readiness Team, so they could more widely distribute information about this issue.

The research, funded by Sandia's Laboratory Directed Research and Development program, continues testing other genome mapping software for security weaknesses. Differences between each program mean the researchers might find a similar, but not identical, issue in other programs, Hudson said.

Along with installing the latest version of BWA, Hudson and his colleagues recommend other "cyberhygiene" strategies to secure genomic information, including transmitting data over encrypted channels and using software that protects sequencing data from being changed. They also encourage security researchers who routinely analyze open source software for weaknesses to look at genomics programs. This practice is common in industrial control systems in the energy grid and software used in critical infrastructure, Hudson said, but would be a new area for genomics security.
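One generic way to implement the kind of integrity check the researchers recommend (a sketch of common practice, not the BWA project's or NIST's actual procedure) is to compare a downloaded reference file against a digest obtained over a separate, trusted channel:

```python
import hashlib
import os
import tempfile

def sha256_of(path, chunk_size=1 << 20):
    """Stream a file in chunks and return its SHA-256 hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_reference(path, expected_hex):
    """Refuse to use a reference file whose digest does not match."""
    actual = sha256_of(path)
    if actual != expected_hex:
        raise ValueError(f"reference integrity check failed: {actual}")
    return True

# Demo with a stand-in file; a real pipeline would check the downloaded
# reference genome against a digest published by the genome provider.
data = b">chr_demo\nATCGGATTACA\n"
with tempfile.NamedTemporaryFile(delete=False, suffix=".fa") as f:
    f.write(data)
    demo_path = f.name
print(verify_reference(demo_path, hashlib.sha256(data).hexdigest()))  # True
os.remove(demo_path)
```

A digest check of this kind defeats a man-in-the-middle substitution only if the expected digest itself arrives over a channel the attacker cannot also modify.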

"Our goal is to make systems safer for people who use them by helping to develop best practices," he said.

Credit: 
DOE/Sandia National Laboratories

Why some cities turn off the water pipes at night

image: Professor David Taylor analyzes the impact of intermittent water systems.

Image: 
Roberta Baker

For more than a billion people around the world, running water comes from "intermittent systems" that turn on and off at various times of the week. A new paper by University of Toronto Engineering professor David Taylor proposes a simple, yet powerful model to explain why and how these systems come to be -- and how they fit into the global challenge of meeting international targets for human development and safe drinking water.

The idea of an intermittent water system may seem strange to engineers from developed countries. Constantly filling and emptying pipes puts a lot of stress on the system due to fluctuations in pressure. It also opens the door to contamination: rainwater or sewage can leak into empty pipes more easily than full ones.

But Taylor believes there may be benefits to intermittent systems as well as drawbacks. "One obvious example is that a pipe can't leak if there is no water in it," he says. "If you have no budget for repairs, turning off the taps at night when nobody is using them is a very effective way to stop losing water to leaks, at least in the short run."

Taylor's PhD thesis involved working with water companies in Delhi, India and trying to understand how intermittent operation affected their ability to meet customer demand. One way to do this is to build a hydraulic model -- a virtual representation of every pipe, valve and customer inside a computer. But Taylor quickly found that such detailed models weren't especially helpful.

"These systems are chaotic," says Taylor. "There are often pipes or valves that are missing from the official charts. We usually don't know as much as we think we do, and in that situation, fancy models can't tell us much."

But rather than give up, Taylor asked himself a question: what would the model look like if I admit that I know almost nothing about the network?

"You don't need a detailed understanding of food chemistry to know that if you want twice as many cookies you had better add twice as much of everything, not just the flour," says Taylor. "It turns out that if you model a water supply system in this simple, first-order way, there's a lot you can learn."

Taylor's single-equation model can, among other things, describe the key differences between how a system behaves when customers are satisfied versus when they are not. When customers are not satisfied, doubling supply time -- say moving from one to two hours per day -- requires twice the amount of water, because people are taking all they can get.

But when customers are getting enough water, demand levels off. In this situation, each additional hour costs a lot less because weaker effects, such as leakage, are now the dominant factor.

This distinction helps resolve a long-standing debate about whether intermittent systems waste water or save water. In the unsatisfied case, they probably save water, but they do so by leaving customers thirsty. In the satisfied case, the hassle of turning off and on the pipes probably isn't worth the gain in terms of water savings.
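The two regimes described above can be captured in a deliberately minimal sketch. The functional form and parameter values below are illustrative assumptions for exposition, not Taylor's published equation; note the supply time is expressed as a dimensionless fraction of the day, in the spirit of the model.

```python
def water_delivered(supply_frac, demand=1.0, draw_rate=2.0, leak_rate=0.1):
    """Water drawn per day vs. the fraction of the day the supply is on.

    While customers are unsatisfied, consumption scales with supply time;
    once demand is met, extra supply hours mostly add leakage.
    All parameter values are illustrative.
    """
    consumed = min(demand, draw_rate * supply_frac)
    leaked = leak_rate * supply_frac
    return consumed + leaked

# Unsatisfied regime: doubling supply time doubles the water used.
print(water_delivered(0.2) / water_delivered(0.1))  # 2.0
# Satisfied regime: doubling supply time adds only a little leakage.
print(water_delivered(1.0) / water_delivered(0.5))  # ~1.05
```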

In a paper recently published in Water Resources Research, Taylor lays out his model and describes how it might be used to analyze existing systems and set goals for new ones. He calibrated the model by comparing its results with those of a much more complex one, and found that the agreement between the two was high enough to provide useful insights, such as whether a given upgrade is likely to be cost-effective.

"The model lets you see right away what the effect of altering a parameter is going to be, whether it's leakage or demand or whatever," says Taylor. "That enables you to do these back-of-the-envelope calculations and determine whether what you're proposing is feasible."

Another key aspect of the model is that it is dimensionless. For example, the amount of time the system supplies water is not measured in minutes or hours, but rather the percentage of time the system is turned on. This makes it easier to compare systems with each other. Taylor also hopes that it will help in global efforts to meet the UN's Sustainable Development Goals and its Human Right to Water.

"These documents say that water needs to be 'available when needed', but that might mean different things in different places," he says. "Maybe it's 24 hours a day, maybe it's 12, maybe it's less. What I hope this model can do is present a theoretical framework for how we decide which systems count as safely managed water supplies and which ones don't."

"Without a way to decide which intermittent systems count as 'safe', we don't stand a chance of hitting our 2030 global goals for access to clean and affordable water," he adds. "The model can help guide us as we start to make the major infrastructure investments needed to hit these goals."

Credit: 
University of Toronto Faculty of Applied Science & Engineering

Tiny motor can 'walk' to carry out tasks

image: This walking microrobot was built by the MIT team from a set of just five basic parts, including a coil, a magnet, and stiff and flexible structural pieces.

Image: 
Will Langford

CAMBRIDGE, MA -- Years ago, MIT Professor Neil Gershenfeld had an audacious thought. Struck by the fact that all the world's living things are built out of combinations of just 20 amino acids, he wondered: Might it be possible to create a kit of just 20 fundamental parts that could be used to assemble all of the different technological products in the world?

Gershenfeld and his students have been making steady progress in that direction ever since. Their latest achievement, presented this week at an international robotics conference, consists of a set of five tiny fundamental parts that can be assembled into a wide variety of functional devices, including a tiny "walking" motor that can move back and forth across a surface or turn the gears of a machine.

Previously, Gershenfeld and his students showed that structures assembled from many small, identical subunits can have numerous mechanical properties. Next, they demonstrated that a combination of rigid and flexible part types can be used to create morphing airplane wings, a longstanding goal in aerospace engineering. Their latest work adds components for movement and logic, and will be presented at the International Conference on Manipulation, Automation and Robotics at Small Scales (MARSS) in Helsinki, Finland, in a paper by Gershenfeld and MIT graduate student Will Langford.

Their work offers an alternative to today's approaches to constructing robots, which largely fall into one of two types: custom machines that work well but are relatively expensive and inflexible, and reconfigurable ones that sacrifice performance for versatility. In the new approach, Langford came up with a set of five millimeter-scale components, all of which can be attached to each other by a standard connector. These parts include the previous rigid and flexible types, along with electromagnetic parts, a coil, and a magnet. In the future, the team plans to make these out of still smaller basic part types.

Using this simple kit of tiny parts, Langford assembled them into a novel kind of motor that moves an appendage in discrete mechanical steps, which can be used to turn a gear wheel, and a mobile form of the motor that turns those steps into locomotion, allowing it to "walk" across a surface in a way that is reminiscent of the molecular motors that move muscles. These parts could also be assembled into hands for gripping, or legs for walking, as needed for a particular task, and then later reassembled as those needs change. Gershenfeld refers to them as "digital materials," discrete parts that can be reversibly joined, forming a kind of functional micro-LEGO.

The new system is a significant step toward creating a standardized kit of parts that could be used to assemble robots with specific capabilities adapted to a particular task or set of tasks. Such purpose-built robots could then be disassembled and reassembled as needed in a variety of forms, without the need to design and manufacture new robots from scratch for each application.

Langford's initial motor has an ant-like ability to lift seven times its own weight. But if greater forces are required, many of these parts can be added to provide more oomph. Or if the robot needs to move in more complex ways, these parts could be distributed throughout the structure. The size of the building blocks can be chosen to match their application; the team has made nanometer-sized parts to make nanorobots, and meter-sized parts to make megarobots. Previously, specialized techniques were needed at each of these length scale extremes.

"One emerging application is to make tiny robots that can work in confined spaces," Gershenfeld says. Some of the devices assembled in this project, for example, are smaller than a penny yet can carry out useful tasks.

To build in the "brains," Langford has added part types that contain millimeter-sized integrated circuits, along with a few other part types to take care of connecting electrical signals in three dimensions.

The simplicity and regularity of these structures makes it relatively easy to automate their assembly. To that end, Langford has developed a novel machine that's like a cross between a 3-D printer and the pick-and-place machines that manufacture electronic circuits, but unlike either of those, it can produce complete robotic systems directly from digital designs. Gershenfeld says this machine is a first step toward the project's ultimate goal of "making an assembler that can assemble itself out of the parts that it's assembling."

Credit: 
Massachusetts Institute of Technology

Generation and sampling of quantum states of light in a silicon chip

image: By exploring complex integrated circuits, photonic states can be generated and processed at larger scales.

Image: 
Dr Stefano Paesani, University of Bristol

Scientists from the University of Bristol and the Technical University of Denmark have found a promising new way to build the next generation of quantum simulators combining light and silicon micro-chips.

In the roadmap to develop quantum machines able to compete and overcome classical supercomputers in solving specific problems, the scientific community is facing two main technological challenges.

The first is the capability of building large quantum circuits able to process the information on a massive scale, and the second is the ability to create a large number of single quantum particles that can encode and propagate the quantum information through such circuits.

Both of these requirements need to be satisfied in order to develop an advanced quantum technology able to overcome classical machines.

A very promising platform to tackle such challenges is silicon quantum photonics. In this technology, the information carried by photons, single particles of light, is generated and processed in silicon micro-chips.

These devices guide and manipulate light at the nanoscale using integrated waveguides - the analogue of optical fibres at the nanometre-scale.

Crucially, the fabrication of photonic chips requires the same techniques used for fabricating electronic micro-chips in the semiconductor industry, making the fabrication of quantum circuits at a massive scale possible.

In the University of Bristol's Quantum Engineering Technology (QET) Labs, the team have recently demonstrated silicon photonic chips embedding quantum interferometers composed of almost a thousand optical components, orders of magnitude higher than what was possible just a few years ago.

However, the big question that remained unanswered was whether these devices were also able to produce a number of photons large enough to perform useful quantum computational tasks. The Bristol-led research, published today in the journal Nature Physics, demonstrates that this question has a positive answer.

By exploring recent technological developments in silicon quantum photonics, the team have demonstrated that even small-scale silicon photonic circuits can generate and process a number of photons unprecedented in integrated photonics.

In fact, due to imperfections in the circuit such as photon losses, previous demonstrations in integrated photonics have been mostly limited to experiments with only two photons generated and processed on-chip, and only last year, four-photon experiments were reported using complex circuitry.

In the work, by improving the design of each integrated component, the team show that even simple circuits can produce experiments with up to eight photons, double the previous record in integrated photonics. Moreover, their analysis shows that by scaling up the circuit complexity, which is a strong capability of the silicon platform, experiments with more than 20 photons are possible, a regime where photonic quantum machines are expected to surpass the best classical supercomputers.

The study also investigates possible applications for such near-term photonics quantum processors entering a regime of quantum advantage.

In particular, by reconfiguring the type of optical non-linearity in the chip, they demonstrated that silicon chips can be used to perform a variety of quantum simulation tasks, known as boson sampling problems.

For some of these protocols, for example the Gaussian Boson Sampling, this new demonstration is a world-first.
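The classical hardness of boson sampling traces to matrix permanents: the probability of each output photon pattern is given by the permanent of a submatrix of the interferometer's transfer matrix, and no known classical algorithm computes permanents efficiently. As a rough background illustration (not code from the study), even Ryser's formula, among the fastest known exact methods, scales exponentially with the matrix size n:

```python
from itertools import combinations

def permanent(a):
    """Matrix permanent via Ryser's formula: sums over all 2^n - 1
    non-empty column subsets, so the cost grows exponentially in n."""
    n = len(a)
    total = 0.0
    for r in range(1, n + 1):
        for cols in combinations(range(n), r):
            prod = 1.0
            for row in a:
                prod *= sum(row[c] for c in cols)
            total += (-1) ** r * prod
    return (-1) ** n * total

# The permanent of the 2x2 all-ones matrix counts its 2 permutations
print(permanent([[1.0, 1.0], [1.0, 1.0]]))  # -> 2.0
```

This exponential cost is why a photonic device sampling from 20+ photons is expected to outpace classical simulation.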

The team also demonstrated that, using such protocols, silicon quantum devices will be able to solve industrially relevant problems. In particular, they show how the chemical problem of finding the vibrational transitions in molecules undergoing an electronic transformation can be simulated on this type of device using Gaussian Boson Sampling.

Lead author Dr Stefano Paesani from the University of Bristol's Centre for Nanoscience and Quantum Information, said: "Our findings show that photonic quantum simulators surpassing classical supercomputers are a realistic near-term prospect for the silicon quantum photonics platform.

"The development of such quantum machines can have potentially ground-breaking impacts on industrially relevant fields such as chemistry, molecular designing, artificial intelligence, and big-data analysis.

"Applications include the design of better pharmaceutics and the engineering of molecular states able to generate energy more efficiently."

Co-author, Dr Raffaele Santagati, added: "The results obtained make us confident that the milestone of quantum machines faster than any current classical computers is within reach of the integrated quantum photonics platform.

"While it is true that other technologies also have the capability to reach such a regime, for example trapped ions or superconducting systems, the photonics approach has the unique advantage of having the near-term applications we investigated. The photonic path, although perilous, is set, and is very much worth pursuing."

Professor Anthony Laing, Associate Professor of Physics at Bristol supervised the project. He said: "In quadrupling the number of photons both generated and processed in the same chip, the team have set the scene for scaling up quantum simulators to tens of photons where performance comparisons with today's standard computing hardware become meaningful."

Credit: 
University of Bristol

FEFU scientists teamed up with colleagues to develop ointment for skin cancer prevention

image: This is the Laboratory for DNA diagnosis, Far Eastern Federal University, Vladivostok, Russia.

Image: 
FEFU press office

Scientists from Far Eastern Federal University (FEFU), V.I. Vernadsky Crimean Federal University, Dmitry Mendeleev University of Chemical Technology, and the Far Eastern Branch of the Russian Academy of Sciences (FEB RAS) suggest that the risks of primary skin cancer and its recurrences can be significantly reduced by applying an ointment with antisense oligonucleotides, which are short DNA or RNA fragments used in oncology to suppress the synthesis of tumor proteins. A related review was published in Molecules.

The role of antisense oligonucleotides is to support the functions of proteins responsible for apoptosis in the human body. This natural mechanism allows cells to carry out programmed death. As a result of skin exposure to primary damaging factors, such as ultraviolet radiation (natural sunlight or UV from tanning machines), these proteins can be prevented from functioning in the transformed cells, so that the regulation of apoptosis is disrupted and, in some cases, becomes impossible. As a consequence, the cells begin to divide uncontrollably and mutate, resulting in tumor formation.

The most reliable way to get rid of skin cancer is to remove it surgically. However, at different stages of tumor progression, especially for the most aggressive and lethal types of skin cancer - melanoma and Merkel carcinoma - it is difficult to recognize the tumor's boundaries and depth. Thus, for a surgeon, it is not clear how much tissue around the tumor area should be removed. These tissues may hide isolated tumor cells, which may provoke a relapse of the cancer in the future. 'To cope with that, we consider special ointments a promising treatment, which we suggest as an additional measure to surgical excision. Ointments will help patients avoid some of the aesthetic defects that inevitably follow surgery. In some cases, at the initial stages of the malignancy, ointment-based treatment will perhaps allow one to completely avoid the operation,' said Vadim Kumeiko, Deputy Director for Development and Head of the Laboratory of Biomedical Cell Technologies, FEFU School of Biomedicine.

The researchers are focused on the development of an ointment containing antisense oligonucleotides for the BCL-2 and survivin proteins. It will be designed to fight melanoma, the deadliest type of skin cancer. The components of the ointment will penetrate the stratum corneum and reach the melanocytes and progenitor cells, whose mutation can lead to the development of a deadly tumor.

One of the main challenges in the development of such an ointment remains the speed of delivery of antisense oligonucleotides into cancerous tissues. These compounds are unstable and can lose their effectiveness even before they reach damaged skin cells. For the same reason, the strategy of applying the ointment directly to the area of the damaged cells seems more effective compared to other delivery routes, such as intravenous injection.

Skin cancer remains the most common type of malignancy, accounting for one in three diagnosed cancer cases. Ultraviolet radiation of natural and artificial origin is one of the main factors driving the mutation of human skin cells into tumor cells. As a rule, cancer occurs on the skin areas most often exposed to natural sunlight or UV from tanning machines. Men and women of all ages and skin colors are vulnerable to skin cancer. The high-risk group includes the elderly and people with white or light-colored skin, blue eyes, or red hair.

Credit: 
Far Eastern Federal University

Redefining the limits of measurement accuracy

image: Scientists at the QUEST Institute at Leibniz University, Hannover, and the Physikalisch-Technische Bundesanstalt, have, together with colleagues in Florence, Italy, introduced a method based on a non-classical state adapted to two measurement parameters at once. This will enable precision measurements of molecules which could reveal interactions between conventional and dark matter.

Image: 
Fabian Wolf

For centuries, humans have been expanding their understanding of the world through more and more precise measurement of light and matter. Today, quantum sensors achieve extremely accurate results. An example of this is the development of atomic clocks, which are expected to neither gain nor lose more than a second in thirty billion years. Gravitational waves were detected via quantum sensors as well, in this case by using optical interferometers.

Quantum sensors can reach sensitivities that are impossible according to the laws of conventional physics that govern everyday life. Those levels of sensitivity can only be reached if one enters the world of quantum mechanics with its fascinating properties - such as the phenomenon of superposition, where objects can be in two places at once and where an atom can have two different energy levels at the same time.

Both generating and controlling such non-classical states is extremely complex. Due to the high level of sensitivity required, these measurements are prone to external interference. Furthermore, non-classical states must be adapted to a specific measurement parameter. "Unfortunately, this often results in increased inaccuracy regarding other relevant measurement parameters", says Fabian Wolf, describing the challenge. This concept is closely linked to Heisenberg's uncertainty principle. Wolf is part of a team of researchers from Leibniz University Hannover, Physikalisch-Technische Bundesanstalt in Braunschweig, and the National Institute of Optics in Florence. The team introduced a method based on a non-classical state adapted to two measurement parameters at once.

The experiment can be visualised as the quantum mechanical version of a simple pendulum. In this case, the adapted measurement parameters are the pendulum's maximum displacement (amplitude) and the number of oscillations per second (frequency). The pendulum comprises a single magnesium ion embedded in an "ion trap". Via laser light interactions, researchers were able to cool the magnesium ion to the ground state of a quantum mechanical system, the coldest achievable state. From there, they generated a "Fock state" of the motion and set the single-atom pendulum oscillating using an external force. This allowed them to measure amplitude and frequency with a sensitivity unmatched by a conventional pendulum. In contrast to previous experiments, this was the case for both measurement parameters without having to adjust the non-classical state.

Using this new approach, the team reduced the measurement time by half while the resolution remained constant or doubled the resolution with a constant measurement time. High resolution is particularly important for spectroscopy techniques based on changing the state of motion. In this particular case, researchers intend to analyse individual molecular ions via laser irradiation in order to stimulate molecular movement. The new procedure will enable them to analyse the state of the molecule before it is disrupted by too intense laser irradiation. "For example, precision measurements of molecules could reveal interactions between conventional and dark matter, which would be a great contribution to solving one of the biggest mysteries in contemporary physics", says Fabian Wolf. The measurement concept, which researchers demonstrated for the first time, could also improve the resolution in optical interferometers such as gravitational wave detectors - therefore providing more in-depth insights into the dawn of the universe.

Credit: 
Physikalisch-Technische Bundesanstalt (PTB)

Using artificial intelligence to better predict severe weather

When forecasting weather, meteorologists use a number of models and data sources to track shapes and movements of clouds that could indicate severe storms. However, with increasingly expanding weather data sets and looming deadlines, it is nearly impossible for them to monitor all storm formations -- especially smaller-scale ones -- in real time.

Now, there is a computer model that can help forecasters recognize potential severe storms more quickly and accurately, thanks to a team of researchers at Penn State, AccuWeather, Inc., and the University of Almería in Spain. They have developed a framework based on machine learning linear classifiers -- a kind of artificial intelligence -- that detects rotational movements in clouds from satellite images that might have otherwise gone unnoticed. This AI solution ran on the Bridges supercomputer at the Pittsburgh Supercomputing Center.

Steve Wistar, senior forensic meteorologist at AccuWeather, said that having this tool to point his eye toward potentially threatening formations could help him to make a better forecast.

"The very best forecasting incorporates as much data as possible," he said. "There's so much to take in, as the atmosphere is infinitely complex. By using the models and the data we have [in front of us], we're taking a snapshot of the most complete look of the atmosphere."

In their study, the researchers worked with Wistar and other AccuWeather meteorologists to analyze more than 50,000 historical U.S. weather satellite images. In them, experts identified and labeled the shape and motion of "comma-shaped" clouds. These cloud patterns are strongly associated with cyclone formations, which can lead to severe weather events including hail, thunderstorms, high winds and blizzards.

Then, using computer vision and machine learning techniques, the researchers taught computers to automatically recognize and detect comma-shaped clouds in satellite images. The computers can then assist experts by pointing out, in real time, where in an ocean of data they could focus their attention in order to detect the onset of severe weather.
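The detection pipeline described above rests on linear classifiers over image features. A minimal sketch of the idea follows; it is not the authors' actual pipeline, and the synthetic vectors below stand in for the shape and motion features extracted from labeled satellite imagery:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for labeled feature vectors; the real system
# derives features from expert-labeled historical satellite images.
n, d = 400, 16
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = (X @ w_true > 0).astype(float)   # 1 = comma-shaped cloud present

# Linear classifier (logistic regression) trained by gradient descent
w = np.zeros(d)
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w)))   # predicted probabilities
    w -= 0.1 * X.T @ (p - y) / n         # gradient step on log-loss

train_accuracy = ((X @ w > 0).astype(float) == y).mean()
```

A single dot product per image makes such classifiers fast enough to scan incoming satellite frames in real time, which matches the roughly 40-seconds-per-prediction figure reported below.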

"Because the comma-shaped cloud is a visual indicator of severe weather events, our scheme can help meteorologists forecast such events," said Rachel Zheng, a doctoral student in the College of Information Sciences and Technology at Penn State and the main researcher on the project.

The researchers found that their method can effectively detect comma-shaped clouds with 99 percent accuracy, at an average of 40 seconds per prediction. It was also able to predict 64 percent of severe weather events, outperforming other existing severe-weather detection methods.

"Our method can capture most human-labeled, comma-shaped clouds," said Zheng. "Moreover, our method can detect some comma-shaped clouds before they are fully formed, and our detections are sometimes earlier than human eye recognition."

"The calling of our business is to save lives and protect property," added Wistar. "The more advanced notice to people that would be affected by a storm, the better we're providing that service. We're trying to get the best information out as early as possible."

This project enhances earlier work between AccuWeather and a College of IST research group led by professor James Wang, who is the dissertation adviser of Zheng.

"We recognized when our collaboration began [with AccuWeather in 2010] that a significant challenge facing meteorologists and climatologists was in making sense of the vast and continually increasing amount of data generated by Earth observation satellites, radars and sensor networks," said Wang. "It is essential to have computerized systems analyze and learn from the data so we can provide timely and proper interpretation of the data in time-sensitive applications such as severe-weather forecasting."

He added, "This research is an early attempt to show feasibility of artificial intelligence-based interpretation of weather-related visual information to the research community. More research to integrate this approach with existing numerical weather-prediction models and other simulation models will likely make the weather forecast more accurate and useful to people."

Concluded Wistar, "The benefit [of this research] is calling the attention of a very busy forecaster to something that may have otherwise been overlooked."

Credit: 
Penn State

Proteins trapped in glass could yield new medicinal advances

image: The protein, captured in an extremely thin piece of glass -- around 50 nanometres in diameter -- is sliced up, atom by atom, with the help of an electrical field. It is then analysed through Atom Probe Tomography, and the 3D structure is recreated on a computer.

Image: 
Small: Volume 15, Issue 24, Atom Probe Tomography for 3D Structural and Chemical Analysis of Individual Proteins Gustav Sundell, Mats Hulander, Astrid Pihl, Martin Andersson Copyright Wiley-VCH Verlag GmbH & Co. KGaA. Reproduced with permission.

Researchers at Chalmers University of Technology, Sweden, have developed a unique method for studying proteins which could open new doors for medicinal research. Through capturing proteins in a nano-capsule made of glass, the researchers have been able to create a unique model of proteins in natural environments. The results are published in the scientific journal, Small.

Proteins are target-seeking and carry out many different tasks necessary to cells' survival and functions. This makes them interesting for development of new medicines - particularly those proteins which sit in the cellular membrane, and govern which molecules are allowed to enter the cell and which are not. This means that understanding how such proteins work is an important challenge in order to develop more advanced medicines. But this is no easy feat - such proteins are highly complex. Today several different methods are used for imaging proteins, but no method offers a full solution to the challenge of studying individual membrane proteins in their natural environment.

A research group at Chalmers University of Technology, under the leadership of Martin Andersson at the Department of Chemistry and Chemical Engineering, has now successfully used Atom Probe Tomography to image and study proteins. Atom Probe Tomography has existed for a while, but has not previously been used in this way; instead, it has been used for investigating metals and other hard materials.

"It was in connection with a study of contact surfaces between the skeleton and implants that we discovered we could distinguish organic materials in the bone with this technique. That gave us the idea to develop the method further for proteins," says Martin Andersson.

The challenge lay in developing a method of keeping the proteins intact in their natural environment. The researchers successfully achieved this by encapsulating the protein in an extremely thin piece of glass, only around 50 nanometres in diameter (a nanometre is one millionth of a millimetre). They then sliced off the outermost layer of the glass using an electrical field, freeing the protein atom by atom. The protein could then be recreated in 3D on a computer.

The results of the study have been verified through comparison with existing three-dimensional models of known proteins. In the future, the researchers will refine the method to improve the speed and accuracy.

The method is ground breaking in several ways. As well as modelling the three-dimensional structure, it simultaneously reveals the proteins' chemical composition.

"Our method offers a lot of good solutions and can be a strong complement to existing methods. It will be possible to study how proteins are built at an atomic level," says Martin Andersson.

With this method, potentially all proteins can be studied, something that is currently not possible. Today, only around one percent of membrane proteins have been successfully structurally analysed.

"With this method, we can study individual proteins, as opposed to current methods which study a large number of proteins and then create an average value," says Gustav Sundell, a researcher in Martin Andersson's research group.

With Atom Probe Tomography, information on an atom's mass can also be derived.

"Because we collect information on atoms' masses in our method, it means we can measure the weight. We can then, for example, create tests where medicinal molecules are combined with different isotopes - giving them different masses - which makes them distinguishable in a study. It should contribute to speeding up processes for constructing and testing new medicines," says Mats Hulander, a researcher in Martin Andersson's group.

Credit: 
Chalmers University of Technology

Promising approach: Prevent diabetes with intermittent fasting

image: Thomas Laeger with first author Maria Teresa Castaño-Martinez (center) and a technical assistant in the laboratory of the Department of Experimental Diabetology.

Image: 
David Ausserhofer/DIfE

Intermittent fasting is known to improve sensitivity to the blood glucose-lowering hormone insulin and to protect against fatty liver. DZD scientists from DIfE have now discovered that mice on an intermittent fasting regimen also exhibited lower pancreatic fat. In their current study published in the journal Metabolism, the researchers showed the mechanism by which pancreatic fat could contribute to the development of type 2 diabetes.

Fatty liver has been thoroughly investigated as a known and frequently occurring disease. However, little is known about excess weight-induced fat accumulation in the pancreas and its effects on the onset of type 2 diabetes. The research team led by Professor Annette Schürmann and Professor Tim J. Schulz of the German Institute of Human Nutrition (DIfE) has now found that overweight mice prone to diabetes have a high accumulation of fat cells in the pancreas. Mice that, despite excess weight, were resistant to diabetes due to their genetic make-up had hardly any fat in the pancreas, but instead had fat deposits in the liver. "Fat accumulations outside the fat tissue, e.g. in the liver, muscles or even bones, have a negative effect on these organs and the entire body. What impact fat cells have within the pancreas has not been clear until now," said Schürmann, head of the Department of Experimental Diabetology at DIfE and speaker of the German Center for Diabetes Research (DZD).

Intermittent fasting reduces pancreatic fat

The team of scientists divided the overweight animals, which were prone to diabetes, into two groups: The first group was allowed to eat ad libitum - as much as they wanted whenever they wanted. The second group underwent an intermittent fasting regimen: one day the rodents received unlimited chow and the next day they were not fed at all. After five weeks, the researchers observed differences in the pancreas of the mice: Fat cells accumulated in group one. The animals in group two, on the other hand, had hardly any fat deposits in the pancreas.

Pancreatic adipocytes mediate hypersecretion of insulin

In order to find out how fat cells might impair the function of the pancreas, researchers led by Schürmann and Schulz isolated adipocyte precursor cells from the pancreas of mice for the first time and allowed them to differentiate into mature fat cells. If the mature fat cells were subsequently cultivated together with the Langerhans islets of the pancreas, the beta cells of the "islets" increasingly secreted insulin. "We suspect that the increased secretion of insulin causes the Langerhans islets of diabetes-prone animals to deplete more quickly and, after some time, to cease functioning completely. In this way, fat accumulation in the pancreas could contribute to the development of type 2 diabetes," said Schürmann.

Significance of pancreatic fat for diabetes prevention

Current data suggest that not only liver fat should be reduced to prevent type 2 diabetes. "Under certain genetic conditions, the accumulation of fat in the pancreas may play a decisive role in the development of type 2 diabetes," said Schulz, head of the Department of Adipocyte Development and Nutrition. Intermittent fasting could be a promising therapeutic approach in the future. The advantages: it is non-invasive, easy to integrate into everyday life and does not require drugs.

Intermittent Fasting

Intermittent fasting means not eating during certain time slots. However, water, unsweetened tea and black coffee are allowed around the clock. Depending on the method, the fasting lasts between 16 and 24 hours or, alternatively, a maximum of 500 to 600 calories are consumed on two days within a week. The best known form of intermittent fasting is the 16:8 method which involves eating only during an eight-hour window during the day and fasting for the remaining 16 hours. One meal - usually breakfast - is omitted.

Islets of Langerhans

The islets of Langerhans - also referred to as islet cells or Langerhans islets - are islet-like accumulations of hormone-producing cells in the pancreas. A healthy adult has about one million Langerhans islets. Each "islet" has a diameter of 0.2-0.5 millimeters. The beta cells produce the blood glucose-lowering hormone insulin and make up about 65 to 80 percent of the islet cells. When blood glucose levels are elevated, these secrete insulin into the bloodstream so that the levels are normalized again.

Credit: 
Deutsches Zentrum fuer Diabetesforschung DZD

New study shows nanoscale pendulum coupling

image: Researchers were able to synchronize two mechanically coupled optomechanical crystal oscillators.

Image: 
D. Navarro

In 1665, the Dutch physicist Christiaan Huygens found that two pendulum clocks, hung from the same wooden structure, spontaneously oscillated perfectly in step but in opposite directions: the clocks oscillated in anti-phase. Since then, spontaneous synchronization of coupled oscillators has been described in nature at several scales: from heart cells to bacteria, neural networks and even binary star systems.

Mechanical oscillators are typical in these systems; at the nanoscale, the challenge is to synchronize them. Along these lines, an article published in the journal Physical Review Letters by a team of researchers from the Institute of Nanoscience and Nanotechnology of the UB (IN2UB), together with ICN2 researchers, presents a nanoscale version of such mechanical oscillators. Through a series of experiments, the researchers were able to synchronize two mechanically coupled optomechanical crystal oscillators, located on the same silicon platform and driven by independent optical signals. These nanometric oscillators measure 15 micrometres by 500 nanometres.

While a mechanical pendulum receives impulses from the clock mechanism to keep it moving, the optomechanical pendulums are driven by radiation pressure; the interaction between the oscillators, however, is the same in both experiments. The study also shows that the collective dynamics can be controlled by acting externally on only one oscillator.
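The anti-phase locking Huygens observed can be illustrated with a simplified toy model (not the optomechanical equations of the paper): two weakly coupled phase oscillators of the Kuramoto type, where the sign of the coupling favours anti-phase alignment:

```python
import math

# Two weakly coupled phase oscillators (a Kuramoto-type toy model);
# negative coupling favours anti-phase locking, as in Huygens' clocks.
w1, w2 = 1.00, 1.02        # slightly detuned natural frequencies
K = -0.2                   # weak coupling, sign chosen for anti-phase
th1, th2 = 0.3, 0.4        # initial phases: nearly in phase
dt = 0.01
for _ in range(100_000):   # Euler integration up to t = 1000
    d1 = w1 + K * math.sin(th2 - th1)
    d2 = w2 + K * math.sin(th1 - th2)
    th1 += d1 * dt
    th2 += d2 * dt

phase_gap = (th2 - th1) % (2 * math.pi)   # settles near pi (anti-phase)
```

Despite starting nearly in phase and having different natural frequencies, the pair locks with a phase difference close to pi, the same collective behaviour that weak mechanical coupling produces between the nanoscale oscillators.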

"Results show a good base for the creation of reconfigurable networks of optomechanical oscillators thanks to these collective dynamics that are dominated by a weak mechanical coupling. This could have applications in photonics, for instance, for pattern recognition tasks or a more complex cognitive process", notes Daniel Navarro Urrios, from IN2UB, who led the research.

Credit: 
University of Barcelona

Unraveling the brain's reward circuits

image: A team from the University of Pennsylvania found that consuming food turns down the activity of neurons that signal hunger in the brain via a different pathway than alcohol and drugs. The researchers also discovered that the circuits that trigger the pleasurable release of dopamine are interconnected with the activity of hunger neurons, suggesting that drugs and alcohol can hijack not only the brain's reward circuits but also those responsible for signaling hunger, serving to create a behavior that reinforces itself.

Image: 
Amber Alhadeff/University of Pennsylvania

To some, a chocolate cake may spark a shot of pleasure typically associated with illicit drugs. A new study by Penn biologists offers some insights into that link, revealing new information about how the brain responds to rewards such as food and drugs.

In the work, which appears this week online in the journal Neuron, a team led by Assistant Professor J. Nicholas Betley, postdoctoral researcher Amber L. Alhadeff, and graduate student Nitsan Goldstein of the School of Arts and Sciences shows that, in mice, consuming food turns down the activity of neurons that signal hunger in the brain via a different pathway than alcohol and drugs, which can likewise act as appetite suppressants. Yet the research also reveals that the circuits that trigger the pleasurable release of dopamine are interconnected with the activity of hunger neurons, suggesting that drugs and alcohol can hijack not only the brain's reward circuits but also those responsible for signaling hunger, serving to create a behavior that reinforces itself.

"Signals of reward, whether it's food or drugs, access the brain through different pathways," says Betley, senior author on the work. "But once they're in the brain, they engage an interconnected network between hypothalamic hunger neurons and reward neurons. It could be that drugs are reinforced not only by increasing a dopamine spike, but also by decreasing the activity of hunger neurons that make you feel bad."

With a greater understanding of these pathways, the researchers say their findings could inform the creation of more effective weight loss drugs or even addiction therapies.

Betley and colleagues' work has previously shown that infusing any type of macronutrient (any calorie-containing food) into a mouse turns down the activity of AgRP neurons, which are responsible for the unpleasant feelings associated with hunger. The signal by which the stomach tells the brain it has consumed food travels along the vagus nerve.

Curious about whether alcohol, which is also caloric, could trigger the same effect, they found that it did so in mice, even when the vagal pathway was disrupted.

"If we cut that highway, highly caloric and rewarding foods like fat can no longer get that signal to the hunger neurons, but ethanol could," says Alhadeff.

The team next tried the same thing with cocaine, nicotine, and amphetamine, drugs that have been shown to have appetite-suppressing activity, and found the same effect. It's the first time, the team says, that a non-nutrient has been shown to regulate AgRP neurons for a sustained period of time.

"What is exciting is that the results suggest there are pharmacological mechanisms out there that can be harnessed to reduce the activity of these neurons to alleviate hunger if someone was on a weight-loss diet," Alhadeff says.

Knowing that alcohol and drugs also trigger the release of dopamine, a neurotransmitter associated with a sensation of "reward" that is also implicated in addiction, the team observed that dopamine neuron activity increased in parallel with the decrease in AgRP neuron activity.

They went after that lead. Using a technique that let them activate AgRP neurons without depriving an animal of food, the researchers explored how these hunger neurons influence dopamine signaling. In the absence of a food reward, they found little response in the dopamine neurons to activation of AgRP neurons. But when an animal with activated AgRP neurons was fed, the surge of dopamine was even higher than it would have been normally, without activated AgRP neurons. In other words, AgRP neurons made food more "rewarding" when the animals were hungry.

"We were surprised to find these AgRP neurons seemed to be signaling the dopamine neurons, but we couldn't detect that until the animal gets the reward," Goldstein says. "This suggested that either an indirect or modulatory circuit mediates the interaction between hunger and reward neurons in the brain."

The same thing happened when the animal received a drug, such as nicotine.

Moving ahead, the research team is investigating the differences between the reward signals that come from alcohol and drugs versus those that come from food, and unpacking the connection they have revealed between the dopamine neurons and AgRP neurons. Using sophisticated new technology, they will also study individual neurons to see whether the effects they have observed are due to the activity of small subpopulations of neurons in the brain.

Identifying a new, druggable pathway that could target these linked circuits would be welcome, Betley says, as many currently available weight-loss drugs have unpleasant side effects such as nausea.

"It's hard to have somebody adhere to these drugs when they're feeling poorly," he says. "Our findings suggest there are multiple ways into the brain, and maybe by combining these strategies we can overcome these problems."

Credit: 
University of Pennsylvania