
What can trigger violence in postcolonial Africa?

image: Up to the end of the Cold War in 1989, 30 of the 32 major civil wars in sub-Saharan Africa occurred in countries with a precolonial state (PCS) group.

Image: 
Jack Paine and Stephen Dow/University of Rochester

Democracy is probably not in the cards for Sudan any time soon.

When widespread protests erupted in April in this northeastern African nation, the military seized the opportunity and overthrew the country's brutal dictator of the past 30 years, President Omar Hassan al-Bashir. But his replacement--none other than his former military enforcer Lt. General Mohamed Hamdan--is unlikely to deliver peace and democracy. Not only is Hamdan accused of genocide in Darfur, he has also now sent troops to assault, rape, and kill Sudan's pro-democracy protesters.

Sudan is not alone when it comes to bloodshed in postcolonial Africa. Violent political events, rooted in ethnic conflicts, have plagued sub-Saharan Africa since independence, causing millions of deaths and hampering economic development. Yet, nearly 80 percent of the continent's major ethnic groups have never participated in any civil war.

Of course, Africa exhibits considerable variation. Why, for example, have civil wars and insurgencies occurred in Sudan and Uganda, but not Kenya? Why did Benin experience several coups and coup attempts after independence, but not Côte d'Ivoire?

Jack Paine, an assistant professor of political science at the University of Rochester, is studying the factors that underlie ethnic strife. In a recent paper, "Ethnic Violence in Africa: Destructive Legacies of Pre-Colonial States," published in International Organization, he explores why civil wars and coups d'état occur more frequently in some sub-Saharan African countries than others. What makes violence more likely?

Much previous research has looked to the postcolonial period for answers. Yet, taking a longer-term perspective, Paine traced the origins to precolonial political organization. In short, African countries that include ethnic groups that were organized as states prior to European colonization are at much higher risk for violence.

"Many African countries have experienced considerable ethnic strife," says Paine. "These tensions have roots in deeper historical events. Frequently, precolonial political organizations sowed the seeds of later discord."

The precolonial roots of postcolonial conflict

Essentially, authoritarian rulers face similar tradeoffs: If they try to buy off potential enemies by including them in the ruling coalition, they risk elevating opponents to positions of power where they can overthrow the ruler in a coup. However, if rulers deny rivals important cabinet posts, the excluded groups may launch an outsider rebellion to try to topple the government.

According to Paine, in many African nations, insecure postcolonial leaders decided against inclusive coalitions for fear of an insider putsch. That was especially the case in countries that incorporated an ethnic group with a precolonial state, given the general absence of interethnic political parties and the corresponding inability to commit to bargains.

"Distinct states and identities created privileged subsets of the population that, when independence became imminent in the 1950s and 1960s, were unwilling to forge organizational ties with other ethnic groups," Paine says.

He also discovered a direct spillover effect: ethnic groups organized as a precolonial state--so called PCS groups--within a country increased the likelihood of conflict for all groups in that country.

Compiling a new data set on precolonial African states, Paine analyzed data covering the period from the year in which a country became independent until 2013. Up to the end of the Cold War in 1989, the numbers are especially striking: 30 of the 32 major civil wars in Africa occurred in countries with a precolonial state--despite the fact that countries without PCS groups accounted for only 39 percent of observations in the data set.

In short, almost every major civil war in sub-Saharan Africa during the Cold War era occurred in a country with a PCS, or precolonial state, group.

Explaining the variance within Africa

While plenty of researchers have shown how ethnicity affects violence, Paine argues that most existing theories do not sufficiently explain variations within Africa.

During the precolonial period, roughly before 1884, Africa featured diverse forms of political organization, ranging from stateless societies such as the Maasai in Kenya, to hierarchically organized societies with standing armies such as the Dahomey in Benin. Centralized states often participated in violent activities to promote intergroup inequality, says Paine.

During the subsequent high colonial period, roughly from 1900 to 1945, PCS groups were elevated in the colonial governance hierarchy. According to Paine, they became natural allies because ruling through existing local political hierarchies reduced colonial administrative costs. Examples include the Asante in Ghana, Buganda in Uganda, Hausa and Fulani in Nigeria, and Lozi in Zambia. Paine points out that this kind of governance strategy was most closely associated with British rule, which favored indirect governance.

"The fact that people differ in their language and origins matters less than the fact that people differ in their history of political interactions--before, during, and after the colonial period," says Paine.

That's why common policy recommendations for ending civil wars may not work without understanding the long-term effects of factors such as precolonial statehood, warns Paine. For example, promoting inclusive power-sharing agreements will likely not stem violence in PCS countries because the internal security dilemma might destabilize such arrangements.

Deepening democratic institutions in order to increase the credibility of power-sharing agreements--and the hope that over time the legacies of distinct statehood will lessen--provides a possible but uncertain path out of the coup and civil war trap for many PCS countries.

"Even if changes over time can help ease earlier tensions caused by precolonial statehood, the unfortunate consequences of these historical differences continue to linger in many African countries to the present day," Paine says.

Credit: 
University of Rochester

Mini 'magic' MRI scanner could diagnose knee injuries more accurately

image: This is a prototype 'mini' MRI scanner, developed by Imperial College London, that could be used for diagnosing knee injuries

Image: 
Imperial College London / Magnetic Resonance in Medicine

Researchers at Imperial College London have developed a prototype mini MRI scanner that fits around a patient's leg.

The team say the device - which uses the so-called 'magic angle' effect - could potentially help diagnose knee injuries more quickly and more accurately.

In a proof-of-concept study using animal knees, the results suggest the technology could be used to show all the structures of the knee.

The scientists say the device (which looks like a large metal ring through which a patient places their leg) could help diagnose conditions such as anterior cruciate ligament injuries - particularly common among footballers.

Furthermore, the small size of the device could enable it to be used in local clinics and even GP surgeries, potentially reducing NHS waiting times for MRI scans.

The research was funded by the National Institute for Health Research.

Currently, key components of the knee joint such as ligaments and tendons are difficult to see in detail in MRI scans, explains Dr Karyn Chappell, a researcher and radiographer from Imperial's MSK Lab: "Knee injuries affect millions of people - and MRI scans are crucial to diagnosing the problem, leading to quick and effective treatment. However, we currently face two problems: connective tissue in the knee is unclear on MRI scans, and people are waiting a long time for a scan."

Dr Chappell added: "This can cause particular problems for women, as they are at greater risk of anterior cruciate ligament injuries. The reasons for this are unclear, but it could be linked to hormones such as oestrogen making ligaments more elastic, leading to more joint injuries."

Knee injuries commonly affect one of three areas: the tendons (which attach muscle to bone), the meniscus (a cushioning pad of cartilage that prevents the bones of the joints rubbing together), or the ligaments (tough bands of connective tissue that hold bones in a joint together).

Following knee injury, a doctor may refer a patient for an MRI scan to help establish which part of the joint is injured. MRI scans use a combination of radio waves and strong magnets to 'flip' water molecules in the body. The water molecules send out a signal, which creates an image.

However, tendons, ligaments and the meniscus are not usually visible with MRI, due to the way water molecules are arranged in these structures, explains Dr Chappell.

"These structures are normally black on an MRI scan - they simply don't produce much signal that can be detected by the machine to create the image. This is because they are made mostly of the protein collagen, arranged as fibres. The collagen fibres hold water molecules in a tight configuration, and it is in fact water that is detected by the MRI. If you do see a signal it suggests there is more fluid in the area - which suggests damage, but it is very difficult for medical staff to conclusively say if there is injury."

To overcome this problem, Dr Chappell harnessed the power of a phenomenon called the 'magic angle': "The brightness of these tissues such as tendons and ligaments in MRI images strongly depends on the angle between the collagen fibres and the magnetic field of the scanner. If this angle is 55 degrees the image can be very bright, but for other angles it is usually very dark."
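What Dr Chappell describes matches a standard result from magnetic resonance physics: in highly ordered collagen, the residual dipolar interaction scales with the second Legendre polynomial P2(cos theta) = (3 cos^2 theta - 1)/2, which vanishes at about 54.7 degrees (the '55 degrees' in the quote above). The short Python sketch below illustrates that relationship; it is a generic illustration of the physics, not code from the Imperial team.

```python
import numpy as np

def dipolar_scaling(theta_deg):
    """P2(cos theta): how the residual dipolar interaction in ordered
    collagen scales with the angle between the fibres and the main field.
    Signal decays fastest where |P2| is large; near the magic angle
    P2 ~ 0, so tendons and ligaments appear bright."""
    theta = np.radians(theta_deg)
    return (3 * np.cos(theta) ** 2 - 1) / 2

# The magic angle solves 3*cos^2(theta) - 1 = 0.
magic = np.degrees(np.arccos(np.sqrt(1 / 3)))
print(f"magic angle = {magic:.1f} degrees")        # ~54.7
for angle in (0, 30, 55, 90):
    print(f"{angle:>3} deg -> P2 = {dipolar_scaling(angle):+.3f}")
```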

The team explain the magic angle is achieved in their scanner because they are able to easily change the orientation of the magnetic field. While the patient sits comfortably in a chair, the specially designed magnet (which uses motors and sensors similar to those found in robots in car factories) can rotate around the leg and orientate the magnetic field in multiple directions.

This is not possible in current hospital MRI scanners, which are also much more expensive than the prototype scanner.

"Previously the magic angle phenomenon was thought of as a problem, as it could mean medical staff mistakenly thinking the knee is injured. However, I realised that if we took a number of scans around the knee, we could use the signal produced by the magic angle effect to build a clear picture of the knee structures," explained Dr Chappell.

"Specifically, we can combine images obtained at different magnet angles and not only increase the brightness, but also see how the collagen fibres are arranged. This enables us to establish the pattern of collagen fibres in the knee structures, which is crucial information ahead of treatments such as repairing a torn meniscus," added Dr Chappell.

"At the moment, it's very difficult to see which direction the collagen fibres run in a meniscus. This is important because sewing across the fibres will effectively repair a tear in the meniscus. However if the stitch is in the same direction as the fibres, the repair may fail."

In a new study, published in the journal Magnetic Resonance in Medicine, the multi-disciplinary team scanned the knee joints of six goats and ten dogs in a conventional MRI scanner.

All of the dog legs came from the Royal Veterinary College, donated for research by owners following the death of their pet.

Dogs suffer from knee injuries and arthritis much as humans do, making them good subjects for the study.

The results showed that using the magic angle can accurately detect ligament and tendon damage.

The team say that now they know magic angle scanning can be used to visualise the knee, combining it with the new prototype mini scanner could enable knees to be accurately scanned with this technology. They hope to progress to human trials of the 'mini' scanner within a year.

Dr Chappell explained: "Although this is an early-stage proof-of-concept study, it shows the technology could potentially be used to accurately detect knee injury. We now hope to enter human trials - and explore if this technology could be used for other joints such as ankles, wrists and elbows."

Credit: 
Imperial College London

Study: Society pays heavy price for failure to diagnose and treat conduct disorder

Conduct disorder is a common and highly impairing psychiatric disorder that usually emerges in childhood or adolescence and is characterized by severe antisocial and aggressive behavior, including physical aggression, theft, property damage and violation of others' rights. Much greater awareness, improved diagnosis and enhanced treatment are all required to reduce the burden this severe behavioral condition places on society, according to a new study co-authored by an LSU psychology professor.

"There needs to be a concerted effort to improve the diagnosis and treatment of children and teenagers with conduct disorder by investing in training in evidence-based treatments for this condition and ensuring that families can access child and adolescent mental health services. At LSU, we provide diagnostic services to the community for children and adolescents with serious behavior problems ages 6 to 17 through our Psychological Services Center, run by the LSU Department of Psychology," said co-author Paul Frick, LSU Department of Psychology professor.

The study reviewed evidence from research conducted around the world and estimated the prevalence of conduct disorder to be around 3 percent in school-aged children; it is a leading cause of referral to child and adolescent mental health services. Yet, paradoxically, it is one of the least widely recognized or studied psychiatric disorders, and funding for research into it lags far behind that for many other childhood disorders.

What the evidence shows is that conduct disorder is associated with an exceptionally high individual, societal and economic burden. The health and personal burden of it is seven times greater than that of attention-deficit/hyperactivity disorder, or ADHD, a much more widely known disorder. Whilst it is likely that children diagnosed with ADHD may also show signs of conduct disorder, very few will be diagnosed or receive treatment for it. Conduct disorder is also associated with a greater health burden than autism.

"Despite the fact that it is associated with a very high personal, familial, and societal burden, conduct disorder is under-recognized and frequently goes undiagnosed and untreated. Unfortunately, the longer this goes on, the more difficult it is to treat. It truly exemplifies the old saying that 'an ounce of prevention is worth a pound of cure.' Also, many treatments that are being used in the community have not proven effective," Frick said.

This failure to tackle and treat conduct disorder in children and adolescents led the researchers to write the new Nature Reviews paper which calls for a greater awareness of the condition, and more funding to improve our understanding and ability to treat the disorder. The paper--a comprehensive overview of all aspects of conduct disorder, its diagnosis, clinical management and long-term impact--highlights the negative consequences and adult outcomes that can occur if it is not correctly diagnosed or treated.

In particular it reveals the high physical and mental health burden on patients and their families. In children, conduct disorder is associated with a higher risk of developing depression, anxiety, alcohol and substance abuse. Up to 50 percent of individuals with conduct disorder develop antisocial or borderline personality disorder in adulthood, along with more serious criminal behavior and gang involvement. The study also finds that young people with conduct disorder are more likely to have children earlier, with more unplanned pregnancies, to become dependent on benefits, homeless or even to attempt suicide. Such behaviors have a huge detrimental effect on an individual, their families and society. In addition, those with conduct disorder display parenting problems which often mean that their children are at higher risk for developing conduct disorder, starting an inter-generational cycle.

However, the researchers suggest that with the correct diagnosis, management for the condition is possible with the support of child and adolescent mental health services. The study highlights the value of both training parents in better supporting children with conduct disorder and skills training for children and adolescents with the condition to help them improve their social and problem-solving skills and ability to regulate their emotions. Frick and his co-authors suggest these approaches can have profound impacts on the patient's well-being and life chances.

Frick hopes the study can bring much needed attention to the diagnosis and treatment of children with conduct disorder. Most people view children with conduct disorder as just "bad kids" and often don't recognize the need for mental health treatment. He hopes the paper highlights the societal impact of the condition which requires more funding for research on treatment from both governmental and private sources.

Credit: 
Louisiana State University

Medicines made of solid gold to help the immune system

image: B lymphocytes (blue and green) and gold nanoparticles (red) measured with dark field hyperspectral imaging coupled with fluorescent detection.

Image: 
© UNIGE

Over the past twenty years, the use of nanoparticles in medicine has steadily increased. However, their safety and their effect on the human immune system remain an important concern. By testing a variety of gold nanoparticles, researchers at the University of Geneva (UNIGE), in collaboration with the National Centre of Competence in Research "Bio-inspired Materials" and Swansea University Medical School (United Kingdom), provide the first evidence of their impact on human B lymphocytes - the immune cells responsible for antibody production. The use of these nanoparticles is expected to improve the efficacy of pharmaceutical products while limiting potential adverse effects. These results, published in the journal ACS Nano, will lead to the development of more targeted and better tolerated therapies, particularly in the field of oncology. The methodology developed also makes it possible to test the biocompatibility of any nanoparticle at an early stage in the development of a new nanodrug.

Responsible for the production of antibodies, B lymphocytes are a crucial part of the human immune system, and therefore an interesting target for the development of preventive and therapeutic vaccines. However, to achieve their goal, vaccines must reach B lymphocytes quickly without being destroyed, making the use of nanoparticles particularly interesting. "Nanoparticles can form a protective vehicle for vaccines - or other drugs - to specifically deliver them where they can be most effective, while sparing other cells," explains Carole Bourquin, a Professor at the UNIGE's Faculties of Medicine and Science, who co-led this study. "This targeting also allows the use of a lower dose of immunostimulant while maintaining an effective immune response. It increases its efficacy while reducing side-effects, provided that the nanoparticles are harmless to all immune cells." Similar studies have already been conducted for other immune cells such as macrophages, which seek out and interact with nanoparticles, but never before for the smaller, and more difficult to handle, B lymphocytes.

Gold is an ideal material

Gold is an excellent candidate for nanomedicine because of its particular physico-chemical properties. Well tolerated by the body and easily malleable, this metal has, for instance, the particularity of absorbing light and then releasing heat, a property that can be exploited in oncology. "Gold nanoparticles can be used to target tumours. When exposed to a light source, the nanoparticles release heat and destroy neighbouring cancer cells. We could also attach a drug to the surface of the nanoparticles to be delivered to a specific location," explains UNIGE researcher Sandra Hočevar. "To test their safety and the best formula for medical use, we have created gold spheres with or without a polymer coating, as well as gold rods to explore the effects of coating and shape. We then exposed human B lymphocytes to our particles for 24 hours to examine the activation of the immune response."

By following activation markers expressed on the surface of B cells, the scientists were able to determine how much their nanoparticles activated or inhibited the immune response. While none of the nanoparticles tested demonstrated adverse effects, their influence on the immune response differed depending on their shape and the presence of a surface polymer coating. "Surface properties, as well as nanoparticle morphology, definitely are important when it comes to the nanoparticle-cell interaction. Interestingly, the gold nanorods inhibited the immune response instead of activating it, probably by causing interference on the cell membrane, or because they are heavier," says Martin Clift, an Associate Professor of Nanotoxicology and In Vitro Systems at Swansea University Medical School and the project's co-leader.

Uncoated, spherical particles easily aggregate and are therefore not appropriate for biomedical use. On the other hand, gold spheres coated with a protective polymer are stable and do not impair B lymphocyte function. "And we can easily place the vaccine or drug to be delivered to the B lymphocytes in this coating," says Carole Bourquin. "In addition, our study established a methodology for assessing the safety of nanoparticles on B lymphocytes, something that had never been done before. This could be especially useful for future research, as the use of nanoparticles in medicine still requires clear guidelines."

Many clinical applications

B cells are at the heart of the vaccine response, but they also play a role in other areas such as oncology and autoimmune diseases. The gold nanoparticles developed by the team of researchers could make it possible to deliver existing drugs directly to B lymphocytes to reduce the necessary dosage and potential side effects. In fact, studies in patients are already being carried out for the treatment of brain tumours. Gold nanoparticles can be made small enough to cross the blood-brain barrier, allowing specific anti-tumoural drugs to be delivered directly into the cancerous cells.

Credit: 
Université de Genève

Study connects low social engagement to amyloid levels and cognitive decline

Social relationships are essential to aging well; research has shown an association between lack of social engagement and increased risk of dementia. A new study by investigators from Brigham and Women's Hospital found that higher brain amyloid-β in combination with lower social engagement in elderly men and women was associated with greater cognitive decline over three years. The results of the study were published last month in the American Journal of Geriatric Psychiatry.

"Social engagement and cognitive function are related to one another and appear to decline together," said senior author Nancy Donovan, MD, chief of the Division of Geriatric Psychiatry at the Brigham. "This means that social engagement may be an important marker of resilience or vulnerability in older adults at risk of cognitive impairment."

The investigators sampled 217 men and women enrolled in the Harvard Aging Brain Study, a longitudinal observational study looking for early neurobiological and clinical signs of Alzheimer's disease. The participants, aged 63-89, were cognitively normal, but some individuals showed high levels of amyloid-β protein, a pathologic hallmark of Alzheimer's disease detected with neuroimaging techniques.

The investigators used standard questionnaires and examinations to assess participants' social engagement (including activities such as spending time with friends and family and doing volunteer work) and cognitive performance at baseline and three years later.

Social engagement was particularly relevant to cognition in participants with evidence of Alzheimer's disease brain changes. The researchers report that, among cognitively normal older adults with high levels of amyloid-β, those who had lower social engagement at baseline showed steeper cognitive decline than those who were more socially engaged. This association was not observed in those with low amyloid-β.

Donovan and her team used a standard measure of social engagement that did not capture all the intricacies of digital communication or the qualitative aspects of relationships. They reported that a more contemporary and comprehensive assessment of social engagement could be a valuable outcome measure in future clinical trials of Alzheimer's disease.

The team noted that future studies with follow-up periods longer than three years may further gauge cognitive decline over time and help untangle the complex mechanisms of Alzheimer's disease progression.

"We want to understand the breadth of this issue in older people and how to intervene to protect high-risk individuals and preserve their health and well-being," said Donovan.

Credit: 
Brigham and Women's Hospital

Smart glasses follow our eyes, focus automatically

image: Stanford engineers are testing a pair of smart glasses that can automatically focus on whatever you're looking at.

Image: 
Nitish Padmanaban

Though it may not have the sting of death and taxes, presbyopia is another of life's guarantees. This vision defect plagues most of us starting at about age 45, as the lenses in our eyes lose the elasticity needed to focus on nearby objects. For some people, reading glasses suffice to overcome the difficulty, but for many the only fix, short of surgery, is to wear progressive lenses.

"More than a billion people have presbyopia and we've created a pair of autofocal lenses that might one day correct their vision far more effectively than traditional glasses," said Stanford electrical engineer Gordon Wetzstein. For now, the prototype looks like virtual reality goggles but the team hopes to streamline later versions.

Wetzstein's prototype glasses - dubbed autofocals - are intended to solve the main problem with today's progressive lenses: These traditional glasses require the wearer to align their head to focus properly. Imagine driving a car and looking in a side mirror to change lanes. With progressive lenses, there's little or no peripheral focus. The driver must switch from looking at the road ahead through the top of the glasses, then turn almost 90 degrees to see the nearby mirror through the lower part of the lens.

This visual shift can also make it difficult to navigate the world. "People wearing progressive lenses have a higher risk of falling and injuring themselves," said graduate student Robert Konrad, a co-author on a paper describing the autofocal glasses published June 28 in the journal Science Advances.

The Stanford prototype works much like the lens of the eye, with fluid-filled lenses that bulge and thin as the field of vision changes. It also includes eye-tracking sensors that triangulate where a person is looking and determine the precise distance to the object of interest. The team did not invent these lenses or eye-trackers, but they did develop the software system that harnesses this eye-tracking data to keep the fluid-filled lenses in constant and perfect focus.
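The paper's control software is not spelled out here, but the underlying geometry is simple: binocular eye trackers give a vergence angle, the vergence angle gives a fixation distance, and the fixation distance dictates the lens power needed to focus there. A minimal sketch, with hypothetical interpupillary distance and vergence values:

```python
import numpy as np

def fixation_distance_m(ipd_m, vergence_deg):
    """Symmetric-vergence triangulation: both eyes, separated by the
    interpupillary distance, rotate inward by half the vergence angle."""
    half = np.radians(vergence_deg) / 2
    return (ipd_m / 2) / np.tan(half)

def required_power_diopters(distance_m):
    """Thin-lens approximation: the power (in diopters) needed to
    focus at a given distance is simply 1 / distance."""
    return 1.0 / distance_m

d = fixation_distance_m(ipd_m=0.063, vergence_deg=9.0)
print(f"fixation distance ~ {d:.2f} m -> {required_power_diopters(d):.2f} D")
```

In a real autofocal system, a power computed along these lines would drive the fluid-filled lenses continuously as the wearer's gaze moves.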

Nitish Padmanaban, a graduate student and first author on the paper, said other teams had previously tried to apply autofocus lenses to presbyopia. But without guidance from the eye-tracking hardware and system software, those earlier efforts were no better than wearing traditional progressive lenses.

To validate its approach, the Stanford team tested the prototype on 56 people with presbyopia. Test subjects said the autofocal lenses performed better and faster at reading and other tasks, and wearers tended to prefer the autofocal glasses to the experience of progressive lenses - bulk and weight aside.

If the approach sounds a bit like virtual reality, that isn't far off. Wetzstein's lab is at the forefront of vision systems for virtual and augmented reality. It was in the course of such work that the researchers became aware of the new autofocus lenses and eye-trackers and had the insight to combine these elements to create a potentially transformative product.

The next step will be to downsize the technology. Wetzstein thinks it may take a few years to develop autofocal glasses that are lightweight, energy efficient and stylish. But he is convinced that autofocals are the future of vision correction.

"This technology could affect billions of people's lives in a meaningful way that most techno-gadgets never will," he said.

Credit: 
Stanford University

Scientists propose gait-based biometric identification method for older adults with wearable devices

image: (a) Placement of acceleration sensor nodes; (b) Impact of the number of templates in Temps on recognition rate; (c) and (d) Comparison of identity recognition of older adults using the proposed method and the histogram similarity based method under different ground conditions (c) and with different sensor placements (d).

Image: 
SIAT

Human gait is a unique feature that can be used for robust identity recognition. Gait-based identity recognition combines several advantages, such as high fraud-resistance, secure data collection, no need for explicit user interaction, and continuous, long-distance authentication. This combination makes gait a very suitable biometric parameter for user verification with wearable devices.

However, intra-subject gait fluctuation in older adults is more significant than in young people due to physical strength changes associated with aging. As a result, gait-based identity recognition of older adults is more challenging.

Prof. LI Ye and his colleagues Dr. SUN Fangmin and Dr. ZANG Weilin at the Shenzhen Institutes of Advanced Technology (SIAT) of the Chinese Academy of Sciences, in cooperation with colleagues from the University of Calabria in Italy, have proposed a gait-based identification method for elderly users.

The current work, published in Information Fusion, builds on the team's previous study, published in IEEE Internet of Things Journal in 2018, and extends gait identification to elderly users. In the previous study, LI's team proposed a speed-adaptive gait cycle segmentation method and an individualized matching threshold generation method to improve gait authentication of young adults walking at various speeds.

In the current report, the researchers introduced a gait-based identity recognition method for access control of wearable healthcare devices centered on elderly people. A gait template synthesis algorithm was proposed to alleviate the problem of intra-subject gait fluctuation in older people. To further improve the identity recognition rate, the researchers designed an arbitration-based score-level fusion algorithm: two matching algorithms make preliminary decisions, and if those decisions are inconsistent, a third matching algorithm provides the final decision. This procedure is shown to improve the recognition rate for older adults compared with the existing histogram-similarity-based method.
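The paper's matching algorithms are not reproduced here, but the arbitration logic itself is easy to sketch. A minimal Python illustration with placeholder scores and thresholds:

```python
def arbitrated_decision(score_a, score_b, score_c,
                        thresh_a=0.6, thresh_b=0.6, thresh_c=0.7):
    """Arbitration-based score-level fusion (illustrative only).

    Matchers A and B make preliminary accept/reject decisions by
    comparing similarity scores against their own thresholds; the
    third matcher C is consulted only when A and B disagree."""
    accept_a = score_a >= thresh_a
    accept_b = score_b >= thresh_b
    if accept_a == accept_b:      # preliminary decisions are consistent
        return accept_a
    return score_c >= thresh_c    # the arbiter breaks the tie

# Example: A accepts, B rejects, so C's score settles the claim.
print(arbitrated_decision(0.82, 0.41, 0.77))  # True
```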

The feasibility of the proposed method was verified using a public dataset, shared by Osaka University in 2011, that contains acceleration signals from three IMUs (inertial measurement units) worn by 64 older users ranging in age from 50 to 79 years. The experimental results show that the average recognition rate reached 96.7 percent, indicating the proposed method is well suited for robust gait-based identification of elderly users.

Credit: 
Chinese Academy of Sciences Headquarters

Researchers teleport information within a diamond

image: The lattice structure of diamond contains a nitrogen-vacancy center with surrounding carbons. A carbon isotope (green) is first entangled with an electron (blue) in the vacancy, which then waits to absorb a photon (red), resulting in quantum-teleportation-based transfer of the photon's state into the carbon memory.

Image: 
Yokohama National University

Researchers from Yokohama National University have teleported quantum information securely within the confines of a diamond. The study has big implications for quantum information technology - the future of how sensitive information is shared and stored.

The researchers published their results on June 28, 2019 in Communications Physics.

"Quantum teleportation permits the transfer of quantum information into an otherwise inaccessible space," said Hideo Kosaka, a professor of engineering at Yokohama National University and an author on the study. "It also permits the transfer of information into a quantum memory without revealing or destroying the stored quantum information."

The inaccessible space, in this case, consisted of carbon atoms in diamond. Made of linked, yet individually contained, carbon atoms, a diamond holds the perfect ingredients for quantum teleportation.

A carbon atom holds six protons and six neutrons in its nucleus, surrounded by six spinning electrons. As the atoms bond into a diamond, they form a notoriously strong lattice. Diamonds can have complex defects, though, when a nitrogen atom exists in one of two adjacent vacancies where carbon atoms should be. This defect is called a nitrogen-vacancy center.

Surrounded by carbon atoms, the nucleus structure of the nitrogen atom creates what Kosaka calls a nanomagnet.

To manipulate an electron and a carbon isotope in the vacancy, Kosaka and the team attached a wire about a quarter the width of a human hair to the surface of a diamond. They applied a microwave and a radio wave to the wire to build an oscillating magnetic field around the diamond. They shaped the microwave to create the optimal, controlled conditions for the transfer of quantum information within the diamond.

Kosaka then used the nitrogen nanomagnet to anchor an electron. Using the microwave and radio waves, Kosaka forced the electron spin to entangle with a carbon nuclear spin - the angular momentum of the electron and the nucleus of a carbon atom. The electron spin breaks down under a magnetic field created by the nanomagnet, allowing it to become susceptible to entanglement. Once the two pieces are entangled, meaning their physical characteristics are so intertwined they cannot be described individually, a photon which holds quantum information is applied and the electron absorbs the photon. The absorption allows the polarization state of the photon to be transferred into the carbon, which is mediated by the entangled electron, demonstrating a teleportation of information at the quantum level.
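The diamond experiment realizes, in hardware, the textbook teleportation protocol of quantum information theory. For readers who want the bare logic, here is a minimal state-vector simulation of that generic three-qubit scheme in Python (a sketch of the standard protocol, not a model of the NV-center device): the state of qubit 0 ends up on qubit 2, carried by entanglement plus two classical measurement bits.

```python
import numpy as np

# Single-qubit gates; a 3-qubit register is an 8-amplitude vector.
I = np.eye(2); X = np.array([[0., 1.], [1., 0.]]); Z = np.diag([1., -1.])
H = np.array([[1., 1.], [1., -1.]]) / np.sqrt(2)

def on(gate, qubit):
    """Lift a single-qubit gate onto qubit 0, 1, or 2 of the register."""
    ops = [I, I, I]; ops[qubit] = gate
    return np.kron(np.kron(ops[0], ops[1]), ops[2])

def cnot(control, target):
    """Dense CNOT matrix on the 3-qubit register (fine at this size)."""
    m = np.zeros((8, 8))
    for i in range(8):
        bits = [(i >> (2 - q)) & 1 for q in range(3)]
        if bits[control]: bits[target] ^= 1
        m[bits[0] << 2 | bits[1] << 1 | bits[2], i] = 1.0
    return m

a, b = 0.6, 0.8                                  # message a|0> + b|1> on qubit 0
state = np.kron([a, b], [1., 0., 0., 0.])        # qubits 1 and 2 start in |00>
state = cnot(1, 2) @ on(H, 1) @ state            # entangle qubits 1 and 2
state = on(H, 0) @ cnot(0, 1) @ state            # rotate into the Bell basis

amp = state.reshape(2, 2, 2)                     # axes: qubit 0, qubit 1, qubit 2
p = (np.abs(amp) ** 2).sum(axis=2).ravel()       # marginal over qubits 0 and 1
k = np.random.choice(4, p=p)                     # simulate the measurement
m0, m1 = k >> 1, k & 1

bob = amp[m0, m1] / np.linalg.norm(amp[m0, m1])  # qubit 2, still unmeasured
if m1: bob = X @ bob                             # classical corrections
if m0: bob = Z @ bob
print(np.round(bob, 3))                          # ~[0.6, 0.8]: state teleported
```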

"The success of the photon storage in the other node establishes the entanglement between two adjacent nodes," Kosaka said. Called quantum repeaters, the process can take individual chunks of information from node to node, across the quantum field.

"Our ultimate goal is to realize scalable quantum repeaters for long-haul quantum communications and distributed quantum computers for large-scale quantum computation and metrology," Kosaka said.

Credit: 
Yokohama National University

When the dinosaurs died, lichens thrived

image: Lead author Jen-Pan Huang on a lichen-covered rock in Taiwan.

Image: 
(c) JP Huang

When an asteroid smacked into the Earth 66 million years ago, it triggered mass extinctions all over the planet. The most famous victims were the dinosaurs, but early birds, insects, and other life forms took a hit too. The collision caused clouds of ash to block the sun and cool the planet's temperature, devastating plant life. But a new study in Scientific Reports shows that while land plants struggled, some kinds of lichens--organisms made of fungi and algae living together--seized the moment and evolved into new forms to take up plants' role in the ecosystem.

"We thought that lichens would be affected negatively, but in the three groups we looked at, they seized the chance and diversified rapidly," says Jen-Pang Huang, the paper's first author, a former postdoctoral researcher at the Field Museum now at Academia Sinica in Taipei. "Some lichens grow sophisticated 3D structures like plant leaves, and these ones filled the niches of plants that died out."

The researchers got interested in studying the effects of the mass extinction on lichens after reading a paper about how the asteroid strike also caused many species of early birds to go extinct. "I read it on the train, and I thought, 'My god, the poor lichens, they must have suffered too, how can we trace what happened to them?'" says Thorsten Lumbsch, senior author on the study and the Field Museum's curator of lichenized fungi.

You've seen lichens a million times, even if you didn't realize it. "Lichens are everywhere," says Huang. "If you go on a walk in the city, the rough spots or gray spots you see on rocks or walls or trees, those are common crust lichens. On the ground, they sometimes look like chewing gum. And if you go into a more pristine forest, you can find orange, yellow, and vivid violet colors--lichens are really pretty." They're what scientists call "symbiotic organisms"--they're made up of two different life forms sharing one body and working together. They're a partnership between a fungus and an organism that can perform photosynthesis, making energy from sunlight--either a tiny alga or a special kind of blue-green bacterium. Fungi, which include mushrooms and molds, are on their own branch on the tree of life, separate from plants and animals (and actually more closely related to us than to plants). The main role of fungi is to break down decomposing material.

During the mass extinction 66 million years ago, plants suffered since ash from the asteroid blocked out sunlight and lowered temperatures. But the mass extinction seemed to be a good thing for fungi--they don't rely on sunlight for food and just need lots of dead stuff, and the fossil record shows an increase in fungal spores at this time. Since lichens contain a plant and a fungus, scientists wondered whether they were affected negatively like a plant or positively like a fungus.

"We originally expected lichens to be affected in a negative way, since they contain green things that need light," says Huang.

To see how lichens were affected by the mass extinction, the scientists had to get creative--there aren't many fossil lichens from that time frame. But while the researchers didn't have lichen fossils, they did have lots of modern lichen DNA.

From observing fungi growing in lab settings, scientists know generally how often genetic mutations show up in fungal DNA--how frequently a letter in the DNA sequence accidentally gets switched during the DNA copying process. That's called the mutation rate. And if you know the mutation rate, if you compare the DNA sequences of two different species, you can generally extrapolate how long ago they must have had a common ancestor with the same DNA.
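As a toy illustration of that logic, with invented numbers rather than values from the study: if two lineages differ at 2 percent of their DNA sites and substitutions accumulate at a known rate, the split time falls out of one division.

```python
def divergence_time_years(num_differences, num_sites, rate_per_site_per_year):
    """Back-of-the-envelope molecular clock (all inputs hypothetical).

    After a split, both lineages accumulate substitutions independently,
    so the observed per-site difference grows as ~ 2 * rate * time."""
    per_site_diff = num_differences / num_sites
    return per_site_diff / (2 * rate_per_site_per_year)

# 20 differences over 1,000 sites at 1e-9 substitutions/site/year:
print(f"{divergence_time_years(20, 1000, 1e-9):,.0f} years")  # 10,000,000
```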

The researchers fed DNA sequences of three families of lichens into a software program that compared their DNA and figured out what their family tree must look like, including estimates of how long ago it branched into the groups we see today. They bolstered this information with the few lichen fossils they did have, from 100 million and 400 million years ago. And the results pointed to a lichen boom after 66 million years ago, at least for some of the leafier lichen families.

"Some groups don't show a change, so they didn't suffer or benefit from the changes to the environment," says Lumbsch, who in addition to his work on lichens is the Vice President of Science and Education at the Field. "Some lichens went extinct, and the leafy macrolichens filled those niches. I was really happy when I saw that not all the lichens suffered."

The results underline how profoundly the natural world we know today was shaped by this mass extinction. "If you could go back 40 million years, the most prominent groups in vegetation, birds, fungi--they'd be more similar to what you see now than what you'd see 70 million years ago," says Lumbsch. "Most of what we see around us nowadays in nature originated after the dinosaurs."

And since this study shows how lichens responded to mass extinction 66 million years ago, it could shed light on how species will respond to the mass extinction the planet is currently undergoing. "Before we lose the world's biodiversity, we should document it, because we don't know when we'll need it," says Huang. "Lichens are environmental indicators--by simply doing a biodiversity study, we can infer air quality and pollution levels."

Beyond the potential implications in understanding environmental impacts and mass extinctions, the researchers point to the ways the study deepens our understanding of the world around us.

"For me, it's fascinating because you would not be able to do this without large molecular datasets. This would have been impossible ten years ago," says Lumbsch. "It's another piece to the puzzle to understanding what's around us in nature."

"We expect a lot of patterns from studying other organisms, but fungi don't follow the pattern. Fungi are weird," says Huang. "They're really unpredictable, really diverse, really fun."

Credit: 
Field Museum

UI researchers validate optimum composites structure created with additive manufacturing

image: Photographs of the types of printed specimens used in this study and axis definition.

Image: 
University of Illinois at Urbana-Champaign Department of Aerospace Engineering

Additive manufacturing built an early following with 3D printers using polymers to create a solid object from a Computer-Aided Design model. The materials used were neat polymers--perfect for a rapid prototype, but not commonly used as structural materials.

A new wave of additive manufacturing uses polymer composites that are extruded from a nozzle as an epoxy resin, but reinforced with short, chopped carbon fibers. The fibers make the material stronger, much like rebar in a cement sidewalk. The resulting object is much stiffer and stronger than a resin on its own.

The question a recent University of Illinois at Urbana-Champaign study set out to answer concerns which configuration or pattern of carbon fibers in the layers of extruded resin will result in the stiffest material.

John Lambros, Willett Professor in the Department of Aerospace Engineering and director of the Advanced Materials Testing and Evaluation Laboratory at U of I, was approached by an additive manufacturing research group at Lawrence Livermore National Laboratory to test composite parts that they had created using a direct ink writing technique.

"The carbon fibers are small, about seven microns in diameter and 500 microns in length," Lambros said. "It's easier with a microscope but you can certainly see a bundle with the naked eye. The fibers are mostly aligned in the extruded resin, which is like a glue that holds the fibers in place. The Lawrence Livermore group provided the parts, created with several different configurations and one made without any embedded fibers as a control. One of the parts had been theoretically optimized for maximum stiffness, but the group wanted definitive experimental corroboration of the optimization process."

Lambros said that while waiting for the actual additively manufactured composite samples, he and his student made their own "dummy" samples out of Plexiglas, so they could begin testing right away.

In this case, the shape being tested was a clevis joint--a small, oval-shaped plate with two holes used to connect two other surfaces. For each different sample shape, Lambros' lab must create a unique loading fixture to test it.

"We create the stands, the grips, and everything--how they'll be painted, how the cameras will record the tests, and so on," Lambros said. "When we got the real samples, they weren't exactly the same shape. The thickness was a bit different than our Plexiglas ones, so we made new spacers and worked it out in the end. From the mechanics side, we must be very cautious. It's necessary to use precision so as to be confident that any eventual certification of additively manufactured parts is done properly."

"We created an experimental framework to validate the optimal pattern of the short-fiber reinforced composite material," Lambros said. "As the loading machine strained the clevis joint plates, we used a digital image correlation technique to measure the displacement field across the surface of each sample by tracking the motion in the pixel intensity values of a series of digital images taken as the sample deforms. A random speckle pattern is applied to the sample surface and serves to identify subsets of the digital images in a unique fashion so they can be tracked during deformation."

They tested one control sample and four different configurations, including the one believed to be optimized for stiffness, which had a wavy fiber pattern rather than one oriented along horizontal or vertical lines.

"Each sample clevis joint plate had 12 layers in a stack. The optimized one had curved deposition lines and gaps between them," Lambros said. "According to the Livermore group's predictions, the gaps are there by design, because you don't need more material than this to provide the optimal stiffness. That's what we tested. We passed loading pins through the holes, then pulled each sample to the point of breaking, recording the amount of load and the displacement.

"The configuration that they predicted would be optimal, was indeed optimal. The least optimal was the control sample, which is just resin--as you would expect because there are no fibers in it."

Lambros said that there is a premise in the analysis that this is a global optimum--meaning that this is the absolute best possible sample built for stiffness, and that no other build pattern is better than this one.

"Although of course we only tested four configurations, it does look like the optimized configuration may be the absolute best in practice because the configurations that would most commonly be used in design, such as 0°-90° or ±45° alignments, were more compliant or less stiff than what this one was," Lambros said. "The interesting thing that we found is that the sample optimized to be the stiffest also turned out to be the strongest. So, if you look at where they break, this one is at the highest load. This was somewhat unexpected in the sense that they had not optimized for this feature. In fact, the optimized sample was also a bit lighter than the others, so if you look at specific load, the failure load per unit weight, it's a lot higher. It's quite a bit stronger than the other ones. And why that is the case is something that we're going to investigate next."

Lambros said there may be more testing done in the future, but for now, his team successfully demonstrated that they could provide a validation for the optimized additive composite build.

Credit: 
University of Illinois Grainger College of Engineering

New research raises prospect of better anti-obesity drugs

image: Neurons in the brainstem become active in response to heat. By inhibiting these cells in mice, researchers were able to make the animals eat less and burn more calories.

Image: 
Laboratory of Molecular Genetics at The Rockefeller University

Effective weight-loss strategies call for eating less food, burning more calories--or ideally, both. But for the more than 90 million Americans who suffer from obesity, a disease that contributes to conditions ranging from cancer to heart disease, behavioral change is hard to accomplish or not effective enough--which is why scientists have long sought drugs that would help people shed pounds. Yet effective, long-lasting treatments have thus far eluded them.

In a new report published in Cell, researchers in the laboratory of Jeffrey M. Friedman propose a new avenue in the search for anti-obesity drugs. Collaborating with a Princeton University team, they have found that a group of brain cells previously shown to regulate hunger also controls energy expenditure. And since our body weight depends both on the calories we consume and the energy we burn, these findings could lead to a new type of weight-loss medication that acts on both sides of the energy equation.

Energy in, energy out

Thus far, most obesity research has focused on the biological mechanisms that govern how much we eat. Yet manipulating the neural circuits that control our calorie intake hasn't led to broadly successful anti-obesity drugs. So, Marc Schneeberger Pané, a postdoctoral Kavli NSI fellow in Friedman's lab, launched a project to investigate the processes by which we burn energy instead.

Mammals like mice and humans expend energy in many ways, and producing heat is among the most important. When the ambient temperature drops, we burn more fuel to maintain a steady body temperature; when it rises, we burn less. We even possess a special form of fat tissue, known as brown fat, that is burned to produce heat directly.

Scientists knew that some populations of temperature-sensitive neurons in the hypothalamus region of the brain play a role in regulating heat production and, therefore, energy expenditure. But they did not know exactly how those neurons exerted their influence, or if other cells outside the hypothalamus might play a similar role.

Double duty

Schneeberger Pané and his colleagues, including graduate student Luca Parolari, began by mapping the brain regions activated by a rise in ambient temperature. They used an advanced 3D-imaging technique called iDISCO, developed at Rockefeller, to scan the brains of mice exposed to hot temperatures, looking for signs of neuronal activity.

As expected, the team saw activity in the hypothalamus. But they also saw activity among a particular group of neurons in a part of the brainstem known as the dorsal raphe nucleus--and, to the researchers' surprise, these happened to be the very same neurons that, just two years earlier, the lab had found to be crucial for controlling hunger.

"Our new findings demonstrate that these neurons regulate energy balance by modulating both food intake and energy expenditure through partially overlapping circuit mechanisms," says Alexander R. Nectow, who led the Princeton University team.

The possibility that these cells, which the lab had previously dubbed "hunger neurons," might regulate both hunger and energy expenditure raised the prospect that they could serve as powerful levers for managing body weight.

"We were very excited," recalls Schneeberger Pané, who adds that he and his colleagues view these multitalented neurons as "a new horizon in obesity research."

The researchers used sophisticated biochemical techniques to alternately turn these temperature-sensitive brainstem neurons on and off. They discovered that activating the neurons reduced the temperature of the animals' brown fat, which is burned to generate heat, as well as their core body temperature. Suppressing the neurons, meanwhile, amped up heat production--and, as the scientists had shown previously, it also made the animals less hungry.

Eat less, burn more

But generating heat by burning brown fat isn't the only way to expend energy. Physical activity burns calories too, as do all the basic tasks that keep the body alive: breathing, digesting food, and so on. So the researchers put the mice in special cages tricked out with sensors to track their movements and gauge how much carbon dioxide they produce and how much food, water, and oxygen they consume. The goal was to see if the temperature-sensitive hunger neurons in the dorsal raphe nucleus could control energy expenditure not only by regulating temperature, but by other means as well.
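Converting those gas measurements into calories is conventionally done with the Weir equation from indirect calorimetry; the paper's exact pipeline is not given here, so treat this as a generic illustration with made-up gas values.

```python
def weir_kcal_per_day(vo2_ml_min, vco2_ml_min):
    """Abbreviated Weir equation: energy expenditure from oxygen
    consumed and carbon dioxide produced (flows in ml/min)."""
    kcal_per_min = 3.941 * (vo2_ml_min / 1000) + 1.106 * (vco2_ml_min / 1000)
    return kcal_per_min * 60 * 24

# A mouse consuming ~3.0 ml O2/min and producing ~2.4 ml CO2/min:
print(f"{weir_kcal_per_day(3.0, 2.4):.1f} kcal/day")  # ~20.8
```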

Again, the results were clear: just as activating the neurons caused heat production to plummet, so too did it cause movement, metabolic activity, and overall energy expenditure to nosedive. Suppressing the neurons, on the other hand, caused all of them to rise.

The team has already begun searching for unique receptors in these dual-purpose neurons that govern both hunger and energy expenditure. The idea is to identify targets that can be used to create novel anti-obesity drugs capable of delivering a one-two punch.

"When you inhibit these neurons, they suppress food intake and increase energy expenditure at the same time," says Schneeberger Pané. Figuring out how to quiet them in people could enable a doubly effective assault against a massive public health problem.

Credit: 
Rockefeller University

Using artificial intelligence to deliver personalized radiation therapy

video: New Cleveland Clinic-led research shows that artificial intelligence (AI) can use medical scans and health records to personalize the dose of radiation therapy used to treat cancer patients.

Image: 
Cleveland Clinic

Thursday, June 27, 2019, CLEVELAND: New Cleveland Clinic-led research shows that artificial intelligence (AI) can use medical scans and health records to personalize the dose of radiation therapy used to treat cancer patients.

In research published today in The Lancet Digital Health, the team developed an AI framework based on patient computerized tomography (CT) scans and electronic health records. This new AI framework is the first to use medical scans to inform radiation dosage, moving the field forward from using generic dose prescriptions to more individualized treatments.

Currently, radiation therapy is delivered uniformly. The dose delivered does not reflect differences in individual tumor characteristics or patient-specific factors that may affect treatment success. The AI framework begins to account for this variability and provides individualized radiation doses that can reduce the treatment failure probability to less than 5 percent.

"While highly effective in many clinical settings, radiotherapy can greatly benefit from dose optimization capabilities," says lead author Mohamed Abazeed, M.D., Ph.D., a radiation oncologist at Cleveland Clinic's Taussig Cancer Institute and a researcher at the Lerner Research Institute. "This framework will help physicians develop data-driven, personalized dosage schedules that can maximize the likelihood of treatment success and mitigate radiation side effects for patients."

The framework was built using CT scans and the electronic health records of 944 lung cancer patients treated with high-dose radiation. Pre-treatment scans were input into a deep-learning model, which analyzed the scans to create an image signature that predicts treatment outcomes. Using sophisticated mathematical modeling, this image signature was combined with data from patient health records - which describe clinical risk factors - to generate a personalized radiation dose.
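The published framework is far more elaborate, but the core idea, using an image-derived signature plus clinical risk factors to pick the smallest dose whose predicted failure probability falls below a target such as 5 percent, can be sketched in a few lines. Everything below (the model form, weights, and dose range) is invented for illustration.

```python
import numpy as np

def failure_probability(dose_gy, image_score, clinical_risk,
                        w_dose=-0.08, w_img=0.8, w_clin=0.6, bias=1.5):
    """Toy logistic dose-response model: the image signature and the
    clinical risk factors shift the curve patient by patient."""
    logit = bias + w_dose * dose_gy + w_img * image_score + w_clin * clinical_risk
    return 1 / (1 + np.exp(-logit))

def personalized_dose(image_score, clinical_risk, target=0.05,
                      candidates=np.arange(40.0, 81.0, 1.0)):
    """Smallest candidate dose with predicted failure risk below target."""
    for dose in candidates:
        if failure_probability(dose, image_score, clinical_risk) < target:
            return dose
    return None  # target unreachable within the allowed dose range

print(personalized_dose(image_score=0.4, clinical_risk=0.2))  # 62.0
print(personalized_dose(image_score=1.5, clinical_risk=1.0))  # 79.0: riskier patient, higher dose
```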

"The development and validation of this image-based, deep-learning framework is exciting because not only is it the first to use medical images to inform radiation dose prescriptions, but it also has the potential to directly impact patient care," said Dr. Abazeed. "The framework can ultimately be used to deliver radiation therapy tailored to individual patients in everyday clinical practices."

There are several other factors that set this first-of-its-kind framework apart from similar clinical machine learning algorithms and approaches. The technology developed by the team uses an artificial neural network that merges classical approaches of machine learning with the power of a modern neural network. The network determines how much prior knowledge to use to guide predictions about treatment failure, and the extent to which prior knowledge informs the model is tunable by the network. This hybrid approach is ideal for clinical applications, since most clinical datasets in individual hospitals are modest in sample size compared to the non-clinical datasets used to make other well-known AI predictions (e.g., online shopping or ride-sharing).

Additionally, the framework was built using one of the largest datasets of patients receiving lung radiotherapy, improving accuracy and limiting false findings. Lastly, each clinical center can use its own CT datasets to customize the framework to its specific patient population.

"Machine learning tools, including deep learning, are poised to play an important role in healthcare," says Dr. Abazeed. "This image-based information platform can provide the ability to individualize multiple cancer therapies but more immediately is a leap forward in radiation precision medicine."

Credit: 
Cleveland Clinic

Astronomers make history in a split second

video: In a world first, an Australian-led international team of astronomers has used CSIRO's new Australian Square Kilometre Array Pathfinder (ASKAP) radio telescope in Western Australia to pinpoint a powerful one-off fast radio burst, FRB 180924, to the outskirts of a Milky Way-sized galaxy about 3.6 billion light-years away.

Image: 
CSIRO/Sam Moorfield

In a world first, an Australian-led international team of astronomers has determined the precise location of a powerful one-off burst of cosmic radio waves.

The discovery was made with CSIRO's new Australian Square Kilometre Array Pathfinder (ASKAP) radio telescope in Western Australia. The galaxy from which the burst originated was then imaged by three of the world's largest optical telescopes - Keck, Gemini South and the European Southern Observatory's Very Large Telescope - and the results were published online by the journal Science today.

"This is the big breakthrough that the field has been waiting for since astronomers discovered fast radio bursts in 2007," CSIRO lead author Dr Keith Bannister said.

In the 12 years since then, a global hunt has netted 85 of these bursts. Most have been 'one-offs' but a small fraction are 'repeaters' that recur in the same location.

In 2017 astronomers found a repeater's home galaxy but localising a one-off burst has been much more challenging.

Fast radio bursts last less than a millisecond, making it difficult to accurately determine where they have come from.

Dr Bannister's team developed new technology to freeze and save ASKAP data less than a second after a burst arrives at the telescope.
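The release does not describe the implementation, but the general pattern is a triggered ring buffer: raw data continuously overwrite a short window of history, and a detection freezes that window to disk before it is lost. A toy sketch, with every name and rate invented for illustration:

    from collections import deque

    import numpy as np

    BUFFER_SECONDS = 3.0         # assumed retention window
    SAMPLES_PER_SEC = 1_000_000  # placeholder sample rate

    ring = deque(maxlen=int(BUFFER_SECONDS * SAMPLES_PER_SEC))

    def on_sample(sample: float, burst_detected: bool) -> None:
        ring.append(sample)      # newest sample evicts the oldest
        if burst_detected:
            # Freeze: snapshot the buffer before the burst data are overwritten
            np.save("frb_voltage_dump.npy", np.asarray(ring))

The sub-second trigger latency quoted above is what makes the dump useful: the raw data carrying the burst must still be in the buffer when the detection fires.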

This technology was used to pinpoint the location of FRB 180924 to its home galaxy (DES J214425.25−405400.81). The team made a high-resolution map showing that the burst originated in the outskirts of a Milky Way-sized galaxy about 3.6 billion light-years away.

"If we were to stand on the Moon and look down at the Earth with this precision, we would be able to tell not only which city the burst came from, but which postcode - and even which city block," Dr Bannister said.

ASKAP is an array of multiple dish antennas; because the burst had to travel a slightly different distance to each dish, it reached each one at a slightly different time.

"From these tiny time differences - just a fraction of a billionth of a second - we identified the burst's home galaxy and even its exact starting point, 13,000 light-years out from the galaxy's centre in the galactic suburbs," team member Dr Adam Deller of Swinburne University of Technology said.

To find out more about the home galaxy, the team imaged it with the European Southern Observatory's 8-m Very Large Telescope in Chile and measured its distance with the 10-m Keck telescope in Hawai'i and the 8-m Gemini South telescope in Chile.

The only burst previously localised, the 'repeater', comes from a very tiny galaxy that is forming lots of stars.

"The burst we localised and its host galaxy look nothing like the 'repeater' and its host," Dr Deller said.

"It comes from a massive galaxy that is forming relatively few stars. This suggests that fast radio bursts can be produced in a variety of environments, or that the seemingly one-off bursts detected so far by ASKAP are generated by a different mechanism to the repeater."

The cause of fast radio bursts remains unknown but the ability to determine their exact location is a big leap towards solving this mystery.

Team member Dr Jean-Pierre Macquart, from the Curtin University node of the International Centre for Radio Astronomy Research (ICRAR), is an expert on using fast radio bursts to probe the Universe.

"These bursts are altered by the matter they encounter in space," Dr Macquart said.

"Now we can pinpoint where they come from, we can use them to measure the amount of matter in intergalactic space."

This would reveal material that astronomers have struggled for decades to locate.
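The physics behind this point is dispersion: free electrons along the path delay lower radio frequencies more than higher ones, so a burst's frequency-dependent arrival time encodes the electron column density it traversed, known as the dispersion measure (DM). A small worked example using the standard dispersion formula, with illustrative numbers only:

    K_DM_MS = 4.149  # dispersion constant, ms GHz^2 per (pc cm^-3)

    def dispersion_delay_ms(dm: float, f_lo_ghz: float, f_hi_ghz: float) -> float:
        """Delay of the low-frequency edge relative to the high, for a given DM."""
        return K_DM_MS * dm * (f_lo_ghz ** -2 - f_hi_ghz ** -2)

    # A cosmological burst with DM ~360 pc cm^-3, observed across a
    # hypothetical 1.1-1.4 GHz band, sweeps through that band in roughly:
    print(f"{dispersion_delay_ms(360.0, 1.1, 1.4):.0f} ms")  # ~472 ms

With the host galaxy, and hence the distance, now known, the measured DM can be split into local and intergalactic contributions, turning each localised burst into a probe of the matter between galaxies.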

The localisation of the radio burst was done as part of a project using ASKAP called CRAFT (Commensal Real-time ASKAP Fast Transients) that is jointly led by Dr Bannister, Dr Macquart and Dr Ryan Shannon of Swinburne University of Technology.

Dr Shannon and CSIRO's Dr Shivani Bhandari carried out the observations and were the first to spot the burst.

Stuart Ryder (Macquarie University, Australia), J. Xavier Prochaska (University of California Santa Cruz, USA) and Nicolas Tejos (Pontificia Universidad Catolica de Valparaiso, Chile) carried out the optical observations.

Credit: 
CSIRO Australia

FEFU scientists likely found way to grow new teeth for patients

image: Close-up photograph of a smiling woman with long black hair.

Image: 
Photo by Lesly Juarez on Unsplash

A group of histologists and dentists from the School of Biomedicine, Far Eastern Federal University (FEFU), together with Russian and Japanese colleagues, has identified cells that are likely responsible for the formation of human dental tissue. The researchers propose applying the findings to the development of bioengineering techniques in dentistry aimed at growing new dental tissue for patients. A related article is published in the International Journal of Applied and Fundamental Research.

FEFU scientists used human prenatal tissues to study the early development of the embryonic oral cavity during the fifth and sixth weeks, when the teeth are first laid down. They identified several types of cells involved in the formation of one of the tooth rudiments, the enamel (dental) organ. Among them were elongated, spindle-shaped chromophobe cells, which appear responsible for the development of human teeth in the first weeks of embryo formation. The data can provide a fundamental basis for the development of bioengineering therapies in dentistry and gastroenterology.

'Numerous attempts to grow teeth from only the stem cells involved in the development of enamel, dentin and pulp, i.e. ameloblasts and odontoblasts, have not been successful: the samples had no enamel, and the teeth were covered only by defective dentin. The absence of an easily accessible source of cells for growing dental tissue seriously restricts the development of a bioengineering approach to dental treatment. For tissue engineering and regenerative medicine -- promising treatment methods in dentistry -- the cells we have identified may be the key to a new level of quality in dental treatment. Natural implants completely identical to human teeth would no doubt be better than titanium ones, and they could last longer than artificial implants, which are guaranteed for 10 to 15 years. For a successful experiment, however, we still lack knowledge about intercellular signaling interactions during tooth development,' said Ivan Reva, Senior Researcher of the Laboratory for Cell and Molecular Neurobiology, School of Biomedicine, FEFU.

The scientist noted that large chromophobe cells reside not only where the embryo's teeth form, but also at the border where the multilayered squamous epithelium of the oral cavity passes into the cylindrical epithelium of the developing digestive tube. This means the new bioengineering approach is relevant not only for growing new dental tissue, but also for growing organs for subsequent transplantation, and it will likely find applications in gastroenterology.

Developing new biological approaches to tooth reconstruction with stem cells is one of the most pressing tasks in dentistry for the coming years, and many questions still challenge researchers. For example, scientists have yet to figure out how, in the earliest stages of human embryo development, different types and forms of teeth arise from the seemingly homogeneous -- in fact, multilayered -- ectoderm of the forming oral cavity. It is already clear, however, that more kinds of cells are engaged in the earliest stages of human tooth formation than previously supposed. The research by FEFU scientists and their colleagues from Russia and Japan also made clear that the crown of a tooth and its root form through different mechanisms.

Credit: 
Far Eastern Federal University

Nutritional cues regulate pancreatic tumor's 'cell drinking'

image: This is Cosimo Commisso, Ph.D., assistant professor in Sanford Burnham Prebys' NCI-designated Cancer Center.

Image: 
Sanford Burnham Prebys

LA JOLLA, CALIF. - June 27, 2019 - Desperate for nutrients, rapidly growing pancreatic tumors resort to scavenging "fuel" through an alternative supply route, called macropinocytosis. Scientists are hopeful that blocking this process, often described as "cellular drinking," could lead to tumor-starving drugs. First, however, fundamental information is needed--such as the invisible molecular signals that drive the process.

Now, scientists from Sanford Burnham Prebys have identified a signaling pathway that regulates macropinocytosis, the nutritional cue that triggers the process and key metabolic differences between tumors--revealing new directions for drug development and patient treatment. The findings were published in Developmental Cell.

"To find the metabolic Achilles' heel of pancreatic cancer, we need a deeper understanding of how these tumors obtain nutrients," says Cosimo Commisso, Ph.D., senior author of the paper and assistant professor in Sanford Burnham Prebys' NCI-designated Cancer Center. "Our study reveals that, like people, pancreatic cancer metabolism is diverse. Some pancreatic tumors can 'dial up' or 'dial down' macropinocytosis depending on the availability of glutamine--an amino acid that plays a key role in the metabolism of rapidly growing cells. Other tumors have naturally high levels of 'always on' macropinocytosis. We also identified the molecular regulators of this process, which may ultimately lead to personalized treatments."

Pancreatic cancer is deadly: fewer than 10 percent of people with the disease remain alive five years later. More than 56,000 Americans are expected to receive a diagnosis in 2019, according to the American Cancer Society, and that number continues to rise. New studies have linked military service to an increased risk of pancreatic cancer, perhaps due to exposure to herbicides such as Agent Orange.

In the study, Commisso and his team analyzed cell lines derived from people with pancreatic ductal adenocarcinoma (PDAC)--the most common type of pancreatic cancer--and utilized a mouse model of human pancreatic cancer. In about half of the cell lines, removing glutamine caused macropinocytosis to "rev up." The remaining cell lines maintained higher-than-normal levels of macropinocytosis whether or not glutamine was present. In cells located inside tumors, where glutamine is scarce, macropinocytosis activity increased. Importantly, the team demonstrated that two well-known cancer signaling pathways, EGFR and Pak, drive glutamine-sensitive macropinocytosis--providing new leads for drug development.

"In addition to pancreatic cancer, macropinocytosis is prevalent in other cancer types, including lung, prostate, bladder and breast tumors," says Commisso. "I'm hopeful that the insights we glean from our study of pancreatic cancer will apply to additional tumor types."

Next, Commisso and his team will explore how hungry pancreatic tumor cells sense glutamine levels--similar to how a car measures gasoline levels through a fuel-level sensor. If this glutamine-sensing mechanism could be found and inhibited, a pancreatic tumor's ability to turn on macropinocytosis would be blocked--and it would starve.

Credit: 
Sanford Burnham Prebys