Brain

French unions played key role in protecting workers' mental health

ITHACA, N.Y. - During a three-year organizational restructuring at France Telecom that began in 2007 - which called for the downsizing of 22,000 employees, often through ethically questionable methods - there was a wave of employee suicides. Published reports put the total number of deaths at 35.

Virginia Doellgast, associate professor of comparative employment relations in Cornell University's ILR School, examines the role unions played in the aftermath of those deaths. Her paper, "After the Social Crisis: The Transformation of Employment Relations at France Telecom," was published Feb. 11 in Socio-Economic Review.

The paper highlights France Telecom labor unions' approaches to studying and publicizing the negative effects of employment restructuring on workers' psychosocial health. The unions were able to influence how the suicides were interpreted - both within the firm and in the media - and then communicate their findings to workers, managers and the public.

In 2009, when the press reported high rates of suicides at the company, the unions were ready with a well-organized message, backed by their survey findings: The company and its management were to blame. As a result, unions gained a formal role in monitoring management practices to make sure they did not threaten workers' psychosocial health.

That led to December's landmark decision by a Paris court, which found the company and several former top executives guilty of "collective moral harassment."

According to Doellgast, the France Telecom case holds lessons for U.S. unions struggling with similar problems of growing job insecurity and intensifying performance pressure. She has recently worked with the Communications Workers of America on a survey measuring worker stress and burnout, sleep problems, use of medication, repetitive strain injuries, and fears of outsourcing and downsizing.

"One lesson for U.S. unions is that change starts by getting workers to understand that stress-related problems are widespread and to mobilize around the demand for good, healthy jobs," Doellgast said. "Also, real power comes from communicating these issues to the public and to policymakers. Companies are more likely to prioritize worker health when they fear losing customers, and when they fear fines and jail time for their top executives."

Doellgast said the unions' ability to bring the case to the courts and the resulting landmark decision could have far-reaching consequences.

"Does the company just have a responsibility to its shareholders, or does it have a responsibility to its other stakeholders, which include its workers?" she said. "This case shows the critical role of unions in making the case for this 'stakeholder' view of the firm: Companies are part of society, and managers should make sure they're not killing their employees to make short-term profits."

Credit: 
Cornell University

How roots find their way to water

image: Light Sheet Fluorescence Microscopy is based on two processes: 1) lateral illumination of the specimen with laser light along a plane, and 2) detection of fluorescent light emitted from a thin volume centred around the illumination plane. The plant (Arabidopsis thaliana) is mounted in a three-dimensional assembly, stands upright in a plant-derived gel appropriate for the species, and is supplied with medium and light.

Image: 
Daniel von Wangenheim

FRANKFURT. Plants use their roots to search for water. While the main root digs downwards, a large number of fine lateral roots explore the soil on all sides. As researchers from Nottingham, Heidelberg and Goethe University Frankfurt report in the current issue of "Nature Plants", the lateral roots already "know" very early on where they can find water.

For his experiment, Daniel von Wangenheim, a former doctoral researcher in Professor Ernst Stelzer's Laboratory for Physical Biology and most recently a postdoc in Malcolm Bennett's group, mounted thale cress roots along their length in a nutrient solution. They were, however, not completely immersed; their upper side was left exposed to the air. He then observed with the help of a high-resolution 3D microscope how the roots branched out.

To his surprise, he discovered that almost as many lateral roots formed on the air side as on the side in contact with the nutrient solution. As he continued to follow the growth of the roots through each cell division under the microscope, it became evident that the new cells drive the tip of the root in the direction of water from the very outset: if a lateral root had formed on the air side, it grew in the direction of the agar plate.

"It's therefore clear that plants first of all spread their roots in all directions, but the root obviously knows from the very first cell divisions on where it can find water and nutrients," says Daniel von Wangenheim, summarizing the results. "In this way, plants can react flexibly to an environment with fluctuating resources."

Credit: 
Goethe University Frankfurt

BU researchers identify privacy law gaps in high school STI health services

Without addressing these gaps, collaboration between schools (operating under FERPA) and health departments (operating under HIPAA) can compromise student privacy.

A new commentary by Boston University School of Public Health (BUSPH) researchers published in the journal Pediatrics uses the example of high school sexually transmitted infection (STI) programs to highlight how collaborations between schools and health departments can create gaps in student privacy.

"Students who use health programs in school may not realize that there could be vulnerabilities for their private health information. School nurses and health departments who collaborate on programs in schools must also collaborate on explicit protections for students' private health information," says Dr. Patricia Elliott, clinical assistant professor of community health sciences at BUSPH and the paper's lead author.

Complications arise in these collaborations, the researchers write, because school nurses operate under the Family Educational Rights and Privacy Act (FERPA), while health departments may operate under the Health Insurance Portability and Accountability Act (HIPAA). Both laws are meant to protect patient privacy, but in different ways, leaving unintentional gaps. For example, FERPA allows parents to see medical information in the school record, and allows school nurses to disclose medical information to other school administrators in some cases. When private medical information from, for example, a local health department's in-school STI testing and treatment program operating under HIPAA is passed on to a school nurse operating under FERPA, that information becomes less private than a student -- and perhaps even the health department program -- may realize.

To help close the gap, Elliott and colleagues write that collaborations between schools and local health departments should include mapping out processes and workflow to find and anticipate these gaps, and that collaborators should create clear privacy protocols for all partners and tell students in plain language what privacy protections they can expect.

"Schools have become important sites for many health interventions, but, if we are not careful, what is good for a student's health may not be good for their privacy," Elliott says.

Credit: 
Boston University School of Medicine

Predators to spare

In 2014, a disease of epidemic proportions gripped the West Coast of the U.S. You may not have noticed, though, unless you were underwater.

Fueled by abnormally hot ocean temperatures, sea star wasting disease ravaged these echinoderms from Alaska to Mexico. The condition, still not fully understood, wiped out a significant marine predator, the sunflower star, which was particularly important in keeping sea urchins in check. Since the sea star's disappearance, the urchins it preyed upon have multiplied and laid waste to large swaths of kelp forest. However, the spiny scourge seemed to have spared some areas, especially those where multiple urchin predators occurred, particularly within marine protected areas.

A team of marine biologists, led by recent UC Santa Barbara graduate Jake Eisaguirre, has investigated what factors kept urchins in check in marine protected areas in the western Channel Islands. They found that a redundancy in urchin predators, and the protection afforded to them, seems to be responsible. The results offer a new perspective on strategies to manage ecosystems for resilience and highlight an underappreciated benefit of marine reserves. The study appears in the journal Ecology.

"This sea star wasting disease was a very impactful and rapid event," said Jennifer Caselle, a research biologist at the university's Marine Science Institute (MSI), an adjunct faculty member in ecology, evolution and marine biology and one of the study's coauthors. "We had abundant sea stars on our reefs, and within one year we had no sea stars." The researchers haven't seen a single sunflower star since 2014.

The sunflower star's disappearance rippled through the entire kelp forest food web in what scientists call a trophic cascade. The team's research indicated that even a few sunflower stars could effectively control an area's urchin population, so without them, the populations exploded, and the kelp forests turned into barrens in many places in California.

And unfortunately, it's easier for a kelp forest to become an urchin barren than for it to return to its original state. "There are feedbacks that prevent it from shifting back," said lead author Eisaguirre. "One of them could be that the abundant urchins on the 'urchin barrens' are starved and provide no nutrition to predators, so nothing wants to eat them."

Puzzling oases

While the urchins mowed down vast tracts of kelp in some regions, especially in Northern California, the researchers noticed that kelp in marine protected areas off the Channel Islands was still relatively healthy. They suspected it may be related to the urchin's other two predators in the region: the California sheephead and California spiny lobster.

Both of these species occur primarily in Southern California, and are both heavily fished. "We thought that the protection of these other predators, even though they weren't highly abundant in the western part of the Channel, may still have helped to compensate for the loss of the sunflower star," said coauthor Katie Davis, a research scientist at MSI.

The patchwork layout of marine protected areas around the Channel Islands provided an ideal setup to test the effect marine reserves had on sea urchin predators, and accordingly, the urchins themselves. Adjacent areas are virtually identical except for their status as open or closed to fishing. What's more, the research group has been collecting data in the area for more than 20 years under the Partnership for Interdisciplinary Studies of Coastal Oceans (PISCO), a long-term ecological project.

The team looked at data spanning several years before and after the onset of sea star wasting disease, examining how the assemblage of urchin predators changed. They used statistical models to investigate how different variables -- like the size and abundance of the three predators, protected status of different sites and sea surface temperature -- might have affected sea urchin populations. These models suggested that not only the abundance, but also the size of the remaining predators in the system were important.

Diversity and redundancy

Before the onset of the disease, the abundance of sunflower stars had the most pronounced effect on urchin populations. However, after the outbreak, the best predictors of urchin numbers were the abundance and size of the remaining predators. And, by comparing across a number of sites, the researchers found that predators were more abundant and larger within protected areas.

The scientists concluded that the marine protected areas released the predators from fishing pressures, so they were able to effectively fill the void left after the sunflower stars died off. Outside the protected areas, where the predators are smaller and less abundant, they were less able to compensate for the loss of the sea stars. This is one of the first studies showing that marine protected areas can confer ecosystem resilience by ensuring the protection of critical species functions.

"When you have multiple different species all performing similar functions, if something catastrophic happens to one of them, those functions can still be maintained," explained Caselle. In this case the function was predation, but the concept applies more broadly.

The state of affairs in Northern California supported the team's conclusion. The ocean north of San Francisco is too cold for sheephead and lobsters, and the otters that are well established on the Central Coast haven't been able to get a foothold north of the bay. As a result, the urchin population grew relatively unchecked once the sunflower star disappeared. The spiny hordes have since decimated the kelp forests of Northern California and the Pacific Northwest.

The scientists also found that predators' sizes made a big difference, especially for the sheephead. "What surprised us the most was that even really small differences in sheephead size resulted in really big differences in how many urchins they could eat," Caselle said. This is because bigger fish have bigger mouths that can crack bigger sea urchins.

One of the most common effects in marine reserves is that fish grow larger and become more numerous. Many studies have shown that this increases reproduction rates, since larger fish release disproportionately more eggs than smaller fish. However, this study is one of the first to highlight another, underappreciated effect: The larger fish are also able to better control urchins, eating more of them, as well as the larger, more fertile individuals.

"And that's important because even small differences in fishing pressure can result in those size class differences for the sheephead," Caselle added.

"We are moving into a situation now where resource managers and resource users are having serious conversations about active restoration of kelp forests and other habitats being altered by climate change. Restoration may be the only option if we want kelp forests to retain their functions and their diversity," she continued. Fortunately, these findings show that we may be able to manage ecosystems for resilience to environmental changes by protecting multiple species that provide critical functions and recognizing that redundancy is important.

In the future, the team wants to investigate the feedback cycles at work, especially those involving sheephead, with an eye toward how these insights can be leveraged for ecosystem restoration. They will also continue their long-term monitoring with a special focus on understanding the effectiveness of marine protected areas and how they confer resilience to climate change.

Credit: 
University of California - Santa Barbara

Researchers identify a brain circuit that could indicate the risk of developing Alzheimer's

image: From left to right: Carles Soriano-Mas (principal researcher of the study), Inés del Cerro (first author of the article) and José Manuel Menchón (Head of the research group).

Image: 
Bellvitge Biomedical Research Institute (IDIBELL)

Late-onset Alzheimer's disease (LOAD), the form that appears after age 65, is the most common form of this neurodegenerative disease and accounts for more than 90% of cases. The first brain changes associated with the disease may appear years before the first symptoms, but the lack of clear risk markers complicates the application of appropriate prevention strategies for those who are most vulnerable.

Now a study, published in the Journal of Psychiatric Research, has provided the first evidence that a poor neuronal connection between the brainstem and cerebellum may be predictive of the risk of developing Alzheimer's. The article has the participation of researchers from the Bellvitge Biomedical Research Institute (IDIBELL) and the CIBER of Mental Health (CIBERSAM), Barcelona University and Bellvitge University Hospital, in collaboration with researchers from the FLENI center of Buenos Aires (Argentina) and the CIBER of Neurodegenerative Diseases (CIBERNED).

Although the hereditary component in LOAD is much lower than in early-onset Alzheimer's, having a family history slightly increases the risk of developing LOAD. The study therefore focuses on a group of descendants of patients with LOAD to try to find early markers of risk for developing the disease. The individuals, in good health and with an average age of 50 years, showed worse performance in memory tests, although within the normal range, compared with volunteers without a family history of Alzheimer's.

After performing a functional neuroimaging test, the researchers discovered that this 'bad memory' was associated with the brain circuit that connects a region of the brainstem known as the locus coeruleus with the cerebellum. The cerebellum is a structure traditionally associated with balance and other motor behaviors, but it has recently been linked to long-term memory.

This brain circuit is thus established as a neurobiological basis for early and subtle neurocognitive deficits, and its study could identify those subjects at higher risk of developing Alzheimer's disease at advanced ages. Similarly, the development of interventions and therapeutic strategies that enhance connectivity between the locus coeruleus and the cerebellum could, in the future, delay the age of symptom onset or help to minimize the symptoms' impact and slow their progression. This opens the door to studies in which "it would be interesting to determine how adopting healthy lifestyle habits (such as good dietary habits, quitting smoking, or doing moderate physical exercise) could modulate the activity of this brain circuit," in the words of Carles Soriano-Mas, the corresponding author of the article.

Credit: 
IDIBELL-Bellvitge Biomedical Research Institute

Research points to potential brain marker of stress and its effects on problem solving

Stress response is the body's normal physiological reaction to a situation that it perceives as threatening. However, stress can also impact important aspects of thinking, including problem solving. Researchers from the University of Missouri School of Medicine and the MU Thompson Center for Autism and Neurodevelopmental Disorders have discovered a potential indicator of how stress affects the brain and alters its ability to problem solve. These findings could ultimately help researchers understand and optimize treatment for patients suffering from stress-related illnesses.

The results come from two companion studies involving 45 healthy college-age individuals who were genetically tested for the presence of at least one copy of a variation in the serotonin transporter gene (SERT), which is associated with greater susceptibility to stress. Participants were given a series of tests while being monitored by magnetic resonance imaging (MRI). The first test involved verbal processing tasks where participants were asked in two sessions (stress and no-stress control) how many items from a category they could name in a minute. Researchers found that stress did not impact overall performance for either gender or gene group, but effects of stress on performance did relate to changes in the brain's overall functional connectivity in all participants, suggesting the brain could provide a biomarker for the effects of stress on cognition.

"This may begin to help us understand what is going on in the brain when stress is affecting cognition," said supervising investigator David Beversdorf, MD, professor of radiology, neurology and psychology at the MU School of Medicine and the MU Thompson Center. "If we can develop an intervention that affects the brain's networks, we may be able to mitigate the cognitively impairing effects of stress."

In the other study, the same participants completed problem solving tasks in two sessions (stress and no-stress control) during MRI testing. Researchers discovered changes to the connections involving a section of the brain called the middle temporal gyrus related to changes in performance during stress in participants. This relationship depended on the presence or absence of the stress-related variant of the SERT gene, indicating a potential specific brain marker associated with susceptibility to stress during problem solving.

"When you look at the relationship of the imaging changes in the brain and the performance changes resulting from stress, the left middle temporal gyrus appears to be a critical hub, and this relationship depends on an individual's genetic susceptibility to stress," Beversdorf said. "The next step is to look at this in specific patient populations. Is this effect greater in PTSD populations or test anxiety patients? And if we can understand how to mitigate those effects, it could be very helpful to these patients."

Credit: 
University of Missouri-Columbia

Researchers look to fungus to shed light on cancer

image: Graduate student Ananya Sengupta (l) and Assistant Professor of Chemistry and Biochemistry James Frederich found that a product derived from a fungus could shed light on cellular interactions prominent in cancer.

Image: 
Bruce Palmer/FSU Photography Services

TALLAHASSEE, Fla. -- A fungus that attacks almond and peach trees may be key to identifying new drug targets for cancer therapy.

A team of Florida State University researchers from the Department of Chemistry and Biochemistry found that a natural product from the fungus Fusicoccum amygdali stabilizes a family of proteins in the cell that mediate important signaling pathways involved in the pathology of cancer and neurological diseases.

Their work is published in the journal ACS Chemical Biology.

Assistant Professor James Frederich and Professor Brian Miller found that fusicoccin -- a product derived from the fungus -- binds to and stabilizes protein complexes formed between 14-3-3 adaptor proteins and a subset of their client interaction partners. The 14-3-3 proteins are essentially major intersections in cells for signaling and regulatory operations. When their functions go awry, a disease is often present.

"Our goal in this study was to take one of the most intractable signaling networks in cell biology and develop a way to study it," Frederich said. "Our work draws inspiration from a structurally complex natural product, which we used as a tool to identify new targets for cancer cell biology."

Through this process, Frederich, Miller and their students identified 119 protein-protein interactions (PPIs) that can serve as targets for fusicoccin in humans. Several of these PPIs are important in cancer and other diseases. The research team has already narrowed that list down to 14 PPI targets that they find particularly promising.

"Our discovery of several new putative biological targets, which could clarify the mechanism of action of this natural product, is a major step forward," Miller said. "Identifying these new targets is very exciting, as is the potential to inform the design of fusicoccin derivatives with tailored activities."

The work is an ongoing collaboration between Frederich and Miller, who merged their areas of expertise in organic chemistry and biochemistry to explore the potential of fusicoccin.

"The unique combination of experiments and bioinformatics presented in this work lies squarely at the interface between chemistry and biology," Miller said. "We are hopeful that these types of chemical biology collaborations can be expanded."

Credit: 
Florida State University

Artificial atoms create stable qubits for quantum computing

Quantum engineers from UNSW Sydney have created artificial atoms in silicon chips that offer improved stability for quantum computing.

In a paper published today in Nature Communications, UNSW quantum computing researchers describe how they created artificial atoms in a silicon 'quantum dot', a tiny space in a quantum circuit where electrons are used as qubits (or quantum bits), the basic units of quantum information.

Scientia Professor Andrew Dzurak explains that unlike a real atom, an artificial atom has no nucleus, but it still has shells of electrons whizzing around the centre of the device rather than around a nucleus.

"The idea of creating artificial atoms using electrons is not new, in fact it was first proposed theoretically in the 1930s and then experimentally demonstrated in the 1990s - although not in silicon. We first made a rudimentary version of it in silicon back in 2013," says Professor Dzurak, who is an ARC Laureate Fellow and is also director of the Australian National Fabrication Facility at UNSW, where the quantum dot device was manufactured.

"But what really excites us about our latest research is that artificial atoms with a higher number of electrons turn out to be much more robust qubits than previously thought possible, meaning they can be reliably used for calculations in quantum computers. This is significant because qubits based on just one electron can be very unreliable."

Chemistry 101

Professor Dzurak likens the different types of artificial atoms his team has created to a kind of periodic table for quantum bits, which he says is apt given that 2019 - when this ground-breaking work was carried out - was the International Year of the Periodic Table.

"If you think back to your high school science class, you may remember a dusty chart hanging on the wall that listed all the known elements in the order of how many electrons they had, starting with Hydrogen with one electron, Helium with two, Lithium with three and so on.

"You may even remember that as each atom gets heavier, with more and more electrons, they organise into different levels of orbit, known as 'shells'.

"It turns out that when we create artificial atoms in our quantum circuits, they also have well organised and predictable shells of electrons, just like natural atoms in the periodic table do."

Connect the dots

Professor Dzurak and his team from UNSW's School of Electrical Engineering - including PhD student Ross Leon who is also lead author in the research, and Dr Andre Saraiva - configured a quantum device in silicon to test the stability of electrons in artificial atoms.

They applied a voltage to the silicon via a metal surface 'gate' electrode to attract spare electrons from the silicon to form the quantum dot, an infinitesimally small space of only around 10 nanometres in diameter.

"As we slowly increased the voltage, we would draw in new electrons, one after another, to form an artificial atom in our quantum dot," says Dr Saraiva, who led the theoretical analysis of the results.

"In a real atom, you have a positive charge in the middle, being the nucleus, and then the negatively charged electrons are held around it in three dimensional orbits. In our case, rather than the positive nucleus, the positive charge comes from the gate electrode which is separated from the silicon by an insulating barrier of silicon oxide, and then the electrons are suspended underneath it, each orbiting around the centre of the quantum dot. But rather than forming a sphere, they are arranged flat, in a disc."

Mr Leon, who ran the experiments, says the researchers were interested in what happened when an extra electron began to populate a new outer shell. In the periodic table, the elements with just one electron in their outer shells include Hydrogen and the metals Lithium, Sodium and Potassium.

"When we create the equivalent of Hydrogen, Lithium and Sodium in the quantum dot, we are basically able to use that lone electron on the outer shell as a qubit," Ross says.

"Up until now, imperfections in silicon devices at the atomic level have disrupted the way qubits behave, leading to unreliable operation and errors. But it seems that the extra electrons in the inner shells act like a 'primer' on the imperfect surface of the quantum dot, smoothing things out and giving stability to the electron in the outer shell."

Watch the spin

Achieving stability and control of electrons is a crucial step towards silicon-based quantum computers becoming a reality. Where a classical computer uses 'bits' of information represented by either a 0 or a 1, the qubits in a quantum computer can store values of 0 and 1 simultaneously. This enables a quantum computer to carry out calculations in parallel, rather than one after another as a conventional computer would. The data processing power of a quantum computer then increases exponentially with the number of qubits it has available.

It is the spin of an electron that we use to encode the value of the qubit, explains Professor Dzurak.

"Spin is a quantum mechanical property. An electron acts like a tiny magnet and depending on which way it spins its north pole can either point up or down, corresponding to a 1 or a 0.

"When the electrons in either a real atom, or our artificial atoms, form a complete shell, they align their poles in opposite directions so that the total spin of the system is zero, making them useless as a qubit. But when we add one more electron to start a new shell, this extra electron has a spin that we can now use as a qubit again.

"Our new work shows that we can control the spin of electrons in the outer shells of these artificial atoms to give us reliable and stable qubits.

"This is really important because it means we can now work with much less fragile qubits. One electron is a very fragile thing. However an artificial atom with 5 electrons, or 13 electrons, is much more robust."

The silicon advantage

Professor Dzurak's group was the first in the world to demonstrate quantum logic between two qubits in silicon devices in 2015, and has also published a design for a full-scale quantum computer chip architecture based on CMOS technology, which is the same technology used to manufacture all modern-day computer chips.

"By using silicon CMOS technology we can significantly reduce the development time of quantum computers with the millions of qubits that will be needed to solve problems of global significance, such as the design of new medicines, or new chemical catalysts to reduce energy consumption", says Professor Dzurak.

In a continuation of this latest breakthrough, the group will explore how the rules of chemical bonding apply to these new artificial atoms, to create 'artificial molecules'. These will be used to create improved multi-qubit logic gates needed for the realisation of a large-scale silicon quantum computer.

Credit: 
University of New South Wales

How some mammals pause their pregnancies

image: A female elephant seal and her pup on a beach near Big Sur, California. Seals and other pinnepeds are among the mammals whose early-stage embryos can enter diapause -- a temporary dormant state -- and then implant and develop later. The timing of pregnancy and birth are thereby postponed to occur when conditions are more favorable for survival.

Image: 
Alice C. Gray

How do some mammals postpone the development of their embryos to await better conditions for having offspring? A recent study at the UW Medicine Institute for Stem Cell and Regenerative Medicine explored this reproductive enigma, which occurs in more than 130 species of mammals, including some marsupials.

The study was led by Abdiasis Hussein, a graduate student in the lab of Hannele Ruohola-Baker, UW professor of biochemistry and associate director of UW Medicine's Institute for Stem Cell and Regenerative Medicine. The findings were reported in Developmental Cell, a Cell Press scientific journal.

The results not only advance the understanding of delayed embryo implantation, but also suggest how some otherwise rapidly dividing cells, such as those in tumors, become quiescent.

In the suspended state of pregnancy called embryonic diapause, an early-stage embryo refrains from implanting in the mother's uterus, where it could be nourished to grow into a baby. Instead, like a seed, the embryo remains dormant until certain molecular regulators prod it to germinate.

Diapause, or delayed implantation, is a biological strategy for waiting out conditions unfavorable to sustaining newborns, such as lack of food, insufficient maternal fat stores, or older siblings who haven't been weaned.

Bears, armadillos, seals, and some otters, badgers and other weasel-like animals undergo seasonal diapause, as a regular part of their reproductive cycles.

Many types of bears, for example, breed in the late spring or early summer. The female then voraciously hunts for food. Only when the female bear has sufficient body fat and weight will one or more of her embryos implant months later, after she retreats to her den. Any cubs would be born in late winter.

To learn what puts a biochemical hold-and-release on embryonic development, Hussein, Ruohola-Baker and their team induced diapause in a female mouse model by reducing estrogen levels. They then compared diapause embryos to pre-implantation and post-implantation embryos. They also induced diapause in mouse embryonic stem cells by starving the cells, and compared those to actively growing mouse embryonic stem cells.

In the wild, some animal embryos delay implantation until their mother has enough energy and nutrients in her body to support them. Starvation or other stresses somehow trigger this embryonic stop-time, a response that protects the offspring's chances of survival.

The researchers did extensive studies of how metabolic and signaling pathways control both the dormant and active states of mouse embryos and of mouse embryonic stem cells in lab dishes.

Metabolism concerns the life-sustaining chemical activities cells carry out to convert substances into energy, build materials, and remove waste. By analyzing these reactions' end products, called metabolites, the scientists could begin to pull together a picture of what happens to cause diapause, and how cells are released from its clutches.

The scientists also looked at gene expression in comparing cell states. They sought to determine what might be influencing how the DNA code was being interpreted, what critical proteins were being produced and in what amounts, in the suspended and active states.

According to embryonic stem cell researcher Ruohola-Baker, epigenetic differences in interpreting the same DNA code, rather than any alterations in the DNA itself, may be key to understanding how embryos enter and exit diapause.

Further investigation pointed to a set of proteins vital to embryonic cell survival. The activity of the genes related to these proteins, as well as the levels of certain amino acids, was ramped up in the diapause embryos. For example, using CRISPR gene-editing technology, Hussein and Julie Mathieu, UW assistant professor of comparative medicine, squelched the flow of glutamine, an amino acid that controls an important metabolic (energy-use) pathway.

The researchers collected additional data that indicated that this and other metabolic factors influenced a catalytic enzyme, mTOR, that regulates many cell processes, including cell proliferation, growth, and protein synthesis. mTOR is also involved in "sensing" cell nutrient and energy stores.

mTOR is already known to be a central regulator of metabolism and physiology in mammalian aging and cancer. It also manages aspects of embryonic growth and development. In this study, situations that inhibited mTOR led to the distinct metabolic profile that characterizes diapause. Researchers also found that this inhibition was reversible.

Understanding the mechanisms behind diapause could advance knowledge in medicine, as well as in wildlife biology. Carol Ware, a UW professor of comparative medicine, said that diapause is an essential means of survival for some species, and occurs under environmental stress in others.

Research on the mechanism of diapause in animals is an important step in seeing if this cellular response can be harnessed for clinical therapies, such as better in vitro fertilization procedures to help people have children.

Hussein believes this line of research might also have significance for future cancer studies. Figuring out why and when cancer cells enter quiescence might help explain their hunkering down to withstand chemotherapy, and reviving themselves later. Perhaps a therapy eventually could be devised, he said, that could wake up the cells to coincide with the timing of anti-cancer drugs.

Credit: 
University of Washington School of Medicine/UW Medicine

Chemistry technique is 'warp drive' for creating better synthetic molecules for medicine

LA JOLLA, CA -- In a study with implications for the future of drug discovery, Scripps Research scientists showed they were able to turn simple chemicals into unique 3-D structures resembling those found in nature--structures with desirable properties for medicines.

In the process, they found a potential drug lead for inflammatory disease, which is now being investigated further. The research appears in Nature Chemistry.

"We were able to start with flat molecules and use a single chemical operation to create much more complex shapes, such as those you would expect from metabolites of medicinal plants or marine organisms," says Ryan Shenvi, PhD, Scripps Research chemistry professor and senior author of the study. "In essence, we found a way to bridge the gap between the synthetic space and natural products, opening up a whole new realm to explore for potential drugs."

Nature's advantage

In the field of drug discovery, compounds made by nature are thought to have some advantages over synthetic molecules, which are created from simple chemical feedstocks. Much of the advantage has to do with shape: so-called "natural products" tend to have complex, spherical 3-D structures that bind more precisely with molecules in the body, providing favorable drug attributes such as fewer side effects.

Synthetic molecules used in the early stages of drug discovery, on the other hand, are typically flat, simple structures that are more likely to interact broadly with other molecules in the body. However, because they're so easy to create, they're more widely available for experimentation. When scientists are looking for a new drug to treat a particular disease, they will often turn to libraries of millions of synthetic molecules in the hopes of finding a needle in the haystack.

"But a bigger haystack doesn't necessarily mean you'll find more needles," says Shenvi. "It usually just means more hay."

Escaping flatland

For this reason, Shenvi and his Scripps Research lab have been working for several years on creating new tools to "escape flatland"--or build better drug candidates than the flat molecules that dominate traditional drug-screening libraries. The approach described in Nature Chemistry relies on a surprising chemical reaction stumbled upon by the Shenvi group in 2015.

"No one would have predicted that this reaction would work," says first author Benjamin Huffman, a predoctoral fellow in Shenvi's lab. "We even tried artificial intelligence-based prediction technology that is currently being rolled out."

But because the experiment would be relatively quick, Huffman and Shenvi decided to try it anyway, testing it on simple chemical compounds known as butenolides, which are byproducts from the corn oil refining industry. To their surprise, the compounds bonded almost instantaneously--their electron clouds joining together to form a new molecule with unexpected complexity. The remarkable rate of the reaction piqued their interest and suggested an unusual driving force that might prove to be general.

"Our next step was to find out if this reaction would work with other molecules that have different properties," Shenvi says. "So, we built a small collection of these unusual constructs."

Warp speed transformations

Initial experiments showed that the reaction has the same effect on many different types of flat synthetic molecules, transforming them into desirable 3-D shapes that look like they could have been produced by a living cell.

A major portion of the study then sought to understand, retrospectively, how the reaction occurred in the first place, which required collaboration with Kendall Houk, PhD, at the University of California, Los Angeles, and postdoctoral fellow Shuming Chen, PhD, in Houk's lab. One challenge was the speed of the reaction; it happened inexplicably fast, rendering the commonly used measurement tools useless.

Shenvi likens the reaction to "warp drive" in the TV series Star Trek, which enabled interstellar travelers to reach new frontiers of space faster than ever before. In this case, the chemical warp drive lets the researchers explore distant regions of chemical space.

Already, the approach has turned up one potential new drug lead: a compound that inhibits the expression of a protein known to play a role in autoimmune diseases.

After the team handed off the compound collection to Calibr's high-throughput screening facility, one of the molecules was quickly identified by Scripps Research staff scientist Emily Chin, PhD, and Professor Luke Lairson, PhD, of the Chemistry Department, for its ability to act on a cell signaling pathway known as cGAS/STING. This pathway plays a key role in inflammation and is implicated in autoimmune disorders. The Lairson and Shenvi labs are continuing to investigate the possible lead.

"We are now taking a step back to carefully analyze the chemistry and see if we can expand this kind of result to other areas," Shenvi says. "Our goal is to blur the line between synthetic and natural product space and enable the discovery of new disease-relevant mechanisms."

Credit: 
Scripps Research Institute

Gulf coast mollusks rode out past periods of climate change

image: These Middle Eocene gastropod shells from Texas are just a small sample of the fossils in amateur paleontologist Christopher Garvie's collection. The study leveraged Garvie's fossils to study how past periods of climate change impacted mollusks along the Gulf Coast of the United States.

Image: 
Christopher Garvie

About 55 million years ago, a rapidly warming climate decimated marine communities around the world. But according to new research, it was a different story for snails, clams and other mollusks living in the shallow waters along what is now the Gulf Coast of the United States. They were able to survive.

The findings, published on Feb. 7 in Scientific Reports, suggest that mollusks in the region may adapt yet again to the climate change of today.

"Mollusks are sort of unique in this aspect as they are better adapted to cope with high temperatures," said lead author William Foster, an assistant professor at University College Dublin and former postdoctoral researcher at The University of Texas at Austin Jackson School of Geosciences.

The paper's co-authors are Jackson School Assistant Professor Rowan Martindale, Cornell College Assistant Professor and former Jackson School postdoctoral researcher Drew Muscente, and Jackson School alumna Anna Weiss, who contributed to the research while earning her Ph.D. Co-authors also include an international team of collaborators and amateur Austin paleontologist Christopher Garvie.

The backbone of the research is Garvie's personal collection of Gulf Coast mollusk fossils, which he has collected over the past 30 years. He estimates that his collection includes over a quarter million specimens from sites ranging from Texas to Florida on the Gulf Coast and Florida to New Jersey on the Atlantic Coast.

"Being particular about the details, I kept notes of where I got things and I never threw anything away," Garvie said. "Even if I found 50 of one species, I would keep them all. That turns out to be useful for understanding community evolution and climate change distribution."

Garvie and Foster met at the Jackson School's Non-Vertebrate Paleontology Laboratory, the sixth largest paleontological repository in the United States. Garvie's collection contains specimens from the Cretaceous through the Eocene - a time period starting about 66 million years ago and lasting about 32 million years. It provided a great opportunity to study how periods of climate change during that time impacted mollusk communities.

"This research is a prime example of a scientific study that would not have been possible without a citizen scientist and the excellent collections at the Non-Vertebrate Paleontology Laboratory," said Martindale.

During the time the research focuses on, the Earth was in a warmer state than it is today, with no large ice sheets covering the poles. Even in this "hot house" state, the period contained multiple temperature spikes that warmed the planet even more. One of these spikes - the Paleocene-Eocene Thermal Maximum (PETM) - occurred about 55 million years ago and is frequently compared to the human-driven climate change happening today. During the PETM, atmospheric carbon dioxide rose rapidly, which in turn caused average global temperatures to rise by 9 to 14 degrees Fahrenheit.
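The warming figures above are given in Fahrenheit; as a quick sanity check, a temperature *change* converts to Celsius using only the 5/9 scale factor (the 32-degree offset applies to absolute readings, not differences). A minimal sketch:

```python
def delta_f_to_delta_c(delta_f):
    """Convert a temperature CHANGE (not an absolute reading) from
    Fahrenheit degrees to Celsius degrees: only the 5/9 scale applies."""
    return delta_f * 5.0 / 9.0

# The PETM warming of 9 to 14 degrees F corresponds to roughly 5 to 7.8 degrees C.
low, high = delta_f_to_delta_c(9), delta_f_to_delta_c(14)
print(f"PETM warming: {low:.1f} to {high:.1f} degrees C")  # 5.0 to 7.8 degrees C
```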

While the PETM led to a decline in coral reef communities and the mass extinction of seafloor dwelling plankton called foraminifera, the Gulf Coast mollusks survived.

"It does highlight that even in events that we think are devastating, there's still a bit of hope from these resilient communities," Martindale said.

The scientists found that some mollusk species did go extinct during the 32-million-year period they studied. However, they found no link between those extinctions and temperature spikes, which suggests the extinctions were not related to climate change.

While Gulf Coast mollusks made it through the PETM and other temperature spikes unscathed, Foster said that this period of warming may only go so far as an analog for climate change happening today.

The Earth today is in an overall cooler state than it was during the PETM, and temperatures are rising much faster, meaning that mollusks-- along with other life-- may need to make a larger adjustment to their lifestyles in less time.

"The mollusks that live in the Gulf today are adapted to a colder climate, and the lack of impact in our study differs to projected changes because in the Early Cenozoic, global warming was happening in an already hot world," Foster said.

Along with climate change, Foster said that Gulf Coast mollusks are facing additional pressure from modern threats of overfishing, pollution, invasive species and loss of habitat, all of which have the potential to drive regional extinctions.

Credit: 
University of Texas at Austin

Simulations identify missing link to determine carbon in deep Earth reservoirs

Understanding the Earth's carbon cycle has important implications for understanding climate change and the health of biospheres.

But scientists don't yet understand how much carbon lies deep in the Earth's water reservoirs -- for example, in water that is under extreme pressure in the mantle -- because experiments are difficult to conduct under such conditions.

Researchers at the Pritzker School of Molecular Engineering (PME) at the University of Chicago and the Hong Kong University of Science and Technology have created a complex computer simulation that will help scientists determine the concentration of carbon under the conditions of the mantle, which include temperatures of up to 1,000 K and pressures of up to 10 GPa -- about 100,000 times the pressure at the Earth's surface.
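The "100,000 times" comparison follows directly from the unit definitions. A minimal sketch, assuming standard atmospheric pressure (101,325 Pa) as the surface reference:

```python
GPA_IN_PA = 1e9        # 1 gigapascal expressed in pascals
ATM_IN_PA = 101_325    # standard atmosphere in pascals (surface reference)

# Upper bound quoted in the article: 10 GPa in the mantle.
mantle_pressure_pa = 10 * GPA_IN_PA
ratio = mantle_pressure_pa / ATM_IN_PA
print(f"10 GPa is about {ratio:,.0f} times surface pressure")  # ~98,692, i.e. ~100,000
```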

These simulations provide an ingenious way to evaluate the missing link between measurements (in particular, vibrational spectra used to discover signatures of ions in water) and the ion and molecular concentrations in these conditions. This research, which was published recently in the journal Nature Communications, has important implications in understanding the Earth's carbon cycle.

"Our computational strategy will greatly facilitate the determination of the amount of carbon at the extreme conditions of the Earth's mantle," said Giulia Galli, the Liew Family Professor of Molecular Engineering and professor of chemistry at UChicago, who is also a senior scientist at Argonne National Laboratory and one of the authors of the research.

"Together with many other research groups around the world, we have been part of a large project aimed at understanding how much carbon is present in the Earth and how it moves from the interior toward the surface," said Ding Pan, former postdoctoral researcher at UChicago in Galli's group, first author of the research, and current assistant professor of physics and chemistry at the Hong Kong University of Science and Technology. "This is one step toward building a comprehensive picture of carbon concentration and movement in the Earth."

A step toward better understanding the carbon cycle

Understanding how much carbon lies in deep reservoirs many miles underground is important because it is estimated that more than 90 percent of the Earth's carbon is buried in its interior. That deep carbon influences the form and concentration of carbon near the surface, which ultimately can impact global climate change.

Unfortunately, there is no experimental technique yet available to directly characterize carbonates dissolved in water at extreme pressure and temperature conditions. Pan and Galli devised a novel strategy that combines spectroscopy results with sophisticated calculations based on quantum mechanics to determine the concentration of ions and molecules in water at extreme conditions.

By carrying out these simulations, Pan and Galli found that the concentration of a specific important species -- bicarbonate ions -- has been underestimated by previously used geochemical models. They proposed a new view of what happens when you dissolve carbon dioxide in water at extreme conditions.

"The determination of what happens when one dissolves carbon dioxide in water under pressure is critical to the understanding of the chemistry of carbon in the Earth's interior," Galli said. "Our study contributes to the understanding of the deep carbon cycle, which substantially influences the carbon budget near the Earth's surface."

Galli and Pan's simulations were performed at the Research Computing Center at UChicago and at the Deep Carbon Observatory Computer Cluster. The work is just one of several investigations of ions in water and of water at interfaces ongoing in Galli's group.

General simulation tools to understand water

Gaining a deeper understanding of what takes place when water -- and matter dissolved or suspended in water -- comes into contact with solid surfaces is the focus of the Argonne-led AMEWS Center. For example, in many water systems a phenomenon known as fouling -- the accumulation of unwanted material on solid surfaces, to the detriment of function -- occurs at these interfaces.

"A huge number of the challenges we face surrounding water center on the interface between water and the materials that make up the systems that handle, process, and treat water, including ions, of course," said Seth Darling, director of AMEWS and a PME fellow. "The quantum mechanical simulations of Galli, integrated with experiments, can make a real difference in understanding aqueous interfacial phenomena where ions, like the carbonates studied in Nature Communications, are present."

Credit: 
University of Chicago

Human textiles to repair blood vessels

image: The researchers use extracellular matrix sheets to make yarn -- a bit like that used to make clothing fabric.

Image: 
Nicolas L'Heureux

What if we could replace a patient's damaged blood vessels with brand new ones produced in a laboratory? This is the challenge set by Inserm researcher Nicolas L'Heureux, who is working on the human extracellular matrix - the structural support of human tissues that is found around practically all of the body's cells.

In a study published in Acta Biomaterialia, L'Heureux and his colleagues at the Tissue Bioengineering unit (Inserm/Université de Bordeaux) describe how they have cultivated human cells in the laboratory to obtain extracellular matrix deposits high in collagen - a structural protein that constitutes the mechanical scaffold of the human extracellular matrix. "We have obtained thin but highly robust extracellular matrix sheets that can be used as a construction material to replace blood vessels", L'Heureux explains.

The researchers then cut these sheets to form yarn - a bit like that used to make fabric for clothing. "The resulting yarn can be woven, knitted or braided into various forms. Our main objective is to use this yarn to make assemblies which can replace the damaged blood vessels", adds L'Heureux.

Made entirely from biological material, these blood vessels would also have the advantage of being well tolerated by all patients. Because collagen does not vary from individual to individual, the body is not expected to treat these vessels as foreign material to be rejected.

The researchers now want to refine their techniques for producing these "human textiles" before moving on to animal testing to validate this last hypothesis. If those tests are conclusive, clinical trials could follow.

Credit: 
INSERM (Institut national de la santé et de la recherche médicale)

Human gut-in-a-dish model helps define 'leaky gut,' and outline a pathway to treatment

image: This is a mini-gut organoid generated in the lab from human stem cells.

Image: 
UC San Diego Health Sciences

Once a vague scapegoat for a variety of ills, increasing evidence suggests a condition known as "leaky gut" -- in which microbes and other molecules seep out of the intestines -- may be more common, and more harmful, than previously thought.

Leaky gut is most often experienced by older people, patients with cancers or other chronic ailments, and people with especially stressful lifestyles. Stressors break down the zipper-like junctions between the cells that form the gut lining. Microbes and molecules that subsequently leak out through these cell gaps can trigger an immune response, contributing to a variety of diseases driven by chronic inflammation, such as inflammatory bowel disease, dementia, atherosclerosis, liver fibrosis, cancers, diabetes and arthritis.

Yet there isn't a great way for clinicians to tell who has leaky gut and who does not -- and there are no treatments to fix it.

University of California San Diego School of Medicine researchers are now able to simulate leaky gut conditions for the first time, using 3D models of human intestines generated from patient cells. These small organoids, or "mini-guts," have revealed new biomarkers that help define what a leaky gut looks like -- molecular signals that could one day help clinicians better diagnose the condition, track its progression and evaluate the success or failure of treatments. The team also used the model to explore a potential pathway for tightening leaky guts with a common, available medication.

The study, publishing February 10, 2020 in Life Science Alliance, was led by first author Pradipta Ghosh, MD, professor of cellular and molecular medicine at UC San Diego School of Medicine and Moores Cancer Center, and senior author Soumita Das, PhD, associate professor of pathology at UC San Diego School of Medicine.

Ghosh and colleagues had previously identified a specialized molecular mechanism that helps tighten gut lining junctions -- a mechanism they dubbed the stress-polarity signaling pathway. They discovered that the junctions come undone when the pathway is under stress. They also found hints that the diabetes drug metformin might help activate the pathway, tightening up the junctions. But these fundamental discoveries were made in cell lines grown in petri dishes, and their relevance to humans had not yet been established.

So Ghosh, Das and team turned to the next best thing to a human test subject: 3D gut organoids derived from patients' intestines. The lining of the gut is not smooth, but more akin to rough terrain, with many peaks and valleys. At the bottom of each valley, known as a crypt, live small reservoirs of stem cells. To produce gut organoids, the researchers collected a tiny amount of these stem cells from patient biopsies and grew them in the lab. As they do in a person's gut, the stem cells differentiated, or specialized, into the four different cell types that make up the gut lining. But in the lab, outside the body, they rolled up into a ball and formed crypts. In other words, "mini-guts."

To simulate leaky gut conditions, the researchers rolled open the mini-gut balls to expose the surface of the intestinal lining. Then they sprinkled on several types of bacteria, which stressed the gut lining junctions, causing them to fall apart.

With this new model, Ghosh, Das and team confirmed that the junctions between cells are controlled by the previously identified stress-polarity signaling pathway. They discovered that the pathway keeping the gut lining intact begins to break down with aging and as colorectal tumors develop. They also measured an increase in markers of inflammation as the gut barrier began to fail.

But this stress-polarity signaling pathway can be restored, the team found. The diabetes drug metformin activates AMPK, an enzyme that plays a key role in the stress-polarity signaling pathway. The researchers demonstrated that metformin strengthens the junctions of mini-guts, tightening the lining back up when stressed by bacteria, aging or tumor formation. One measure of stress-polarity signaling pathway strength is the levels of a molecule called occludin. In Ghosh and Das' experiments, metformin increased occludin levels as much as six-fold compared to untreated cells.

Since each is derived from a different person, mini-guts vary in terms of their underlying genetics and epigenetics. That's a strength, said Ghosh, but also a limitation.

"Lots of research is done in mice that are inbred so that they are genetically identical, all in the same cage, eating the same diet, in order to remove these variables from the studies," she said. "But lab mice are far more standardized than the same human from day to day, or patients we see in the clinics. Here, our model is a better representation of humanity. On the other hand, it also means that each organoid is its own unique experiment. We have to test many organoids to be able to make any claim, which we did in our study."

Researchers next want to take a closer look at the diseases driven by leaky gut. They also plan to test various ways to tighten junctions in the context of aging, inflammatory bowel disease, cancer and other conditions to see if they can reduce or prevent initiation and progression of these diseases.

"I think you'd be hard pressed to find a disease in which systemic inflammation is not a driver," Das said. "That's why, even though there are so many things we still don't know, we're excited about the broad potential this model and these findings open for developing personalized leaky gut therapeutics that target AMPK and the stress-polarity signaling pathway."

Credit: 
University of California - San Diego

Invisible X-rays turn blue

image: Hazardous radiation such as ultraviolet light and X-rays oxidatively triggers the color-changing cascade of the new dyes, which are roughly 1,000 times more sensitive than current materials.

Image: 
Tsuyoshi Kawai

A new reaction system can detect X-rays at the highest sensitivity ever recorded using organic molecules. The system, developed by researchers at Nara Institute of Science and Technology (NAIST), Ikoma, Japan, and Centre National de la Recherche Scientifique (CNRS), Toulouse, France, involves the cycloreversion of terarylene, causing the molecule to switch reversibly between colorless and blue isoforms in the presence or absence of X-rays. Because it responds at doses well below the danger threshold, the system is expected to catch even the faintest X-ray levels considered dangerous.

Photoreactive materials convert light input into chemical output and are standard in semiconductor and 3D printing technologies. Some of these materials are also used in eye protection -- photochromic sunglasses, for example, reduce UV exposure by darkening the lens. Similarly, workers at risk of X-ray exposure are required to wear monitoring badges that indicate dangerous levels through changes in photoreactive materials.

However, NAIST Professor Tsuyoshi Kawai stresses that these badges do not completely eliminate the risk.

"Current materials for wearable detectors are sensitive to about 1 Gy. Ideally, safety management systems want about one hundred times more sensitivity," he says.

Kawai is an expert at increasing the photoconversion efficiency of photoreactive molecules, having focused his attention primarily on terarylenes, organic molecules with which his research team has consistently achieved exceptionally high reaction efficiencies.

"We have steadily improved the number of molecules that can undergo photoconversion in response to one photon. It was one-to-one in 2011; today it is 33 molecules per photon," he says.

Increasing the quantum yield of terarylenes means maximizing the number of molecular changes that a single photon can induce. The researchers chose terarylenes for their reversibility: the molecule can be converted back to the starting blue isoform upon exposure to ultraviolet light, allowing the system to be reset for repeated use.

Indeed, the color change is one of several reasons he believes organic molecules are preferable when considering X-ray detectors.

"Photochromic organic detectors can report X-rays through easily observed color changes and are recyclable and easy to process," he says.

The key modification to the terarylene molecules was the addition of a phenyl group to only one of the molecule's two phenylthiophene groups, which allowed for reversible photoconversion between two isoforms. The result was a sensitivity of up to 0.3 Gy, making it more than 1000 times more sensitive than current commercial systems. Notably, 0.3 Gy is considered a safe exposure level, suggesting that no dangerous level will go undetected.

Photoconversion reactions in nature, such as photosynthesis or the light response of neurons in our eyes, occur at less than 100% efficiency (fewer than one molecule reacts per photon). The system designed by the researchers, however, achieved 3300% (33 molecules per photon), showing the potential of organic molecules in artificial systems.
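The efficiency figures quoted here are simply the molecules-per-photon ratio expressed as a percentage. A minimal sketch:

```python
def quantum_yield_percent(molecules_per_photon):
    """Photoconversion quantum yield as a percentage:
    product molecules formed per absorbed photon, times 100."""
    return molecules_per_photon * 100

print(quantum_yield_percent(1))   # 100: an ordinary one-molecule-per-photon reaction
print(quantum_yield_percent(33))  # 3300: the chain-amplified terarylene system
```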

"I think this is the highest efficiency ever reported for photoconversion with an organic molecule," notes Kawai.

Credit: 
Nara Institute of Science and Technology