
Antibiotics can inhibit skin lymphoma

New research from the LEO Foundation Skin Immunology Research Center at the University of Copenhagen shows, surprisingly, that antibiotics inhibit cancer in the skin in patients with a rare type of lymphoma.

Many patients with the rare lymphoma CTCL contract staphylococcal infections in the skin. CTCL is a cancer of the immune system's so-called T-cells that manifests in the skin. Because the disease weakens the patient's immune system, the skin is less resistant to bacteria.

In a new study, researchers from the LEO Foundation Skin Immunology Research Center at the Faculty of Health and Medical Sciences, the University of Copenhagen, have - in collaboration with Aarhus and Zealand University Hospitals and Aarhus University - shown that aggressive treatment with antibiotics not only inhibits the staphylococcal bacteria, but also the cancer cells. The number of cancer cells is reduced and the cancer is significantly diminished for a period of time in patients with severe skin inflammation.

During a staphylococcal infection, the healthy immune cells in the body are working at full throttle. They produce growth substances called cytokines, which are used to get the immune system up and running. The cancer cells latch onto the growth substances, using them to accelerate their own growth. The research results show for the first time that the antibiotic treatment can slow down this process.

'When we inhibit the staphylococcal bacteria with antibiotics, we simultaneously remove the activation of the immune cells. This means that they do not produce as many cytokines, and therefore the cancer cells cannot get the extra 'fuel'. As a result, the cancer cells are inhibited from growing as fast as they did during the bacterial attack. This finding is ground-breaking as it is the first time ever that we see this connection between bacteria and cancer cells in patients', says Professor Niels Ødum from the LEO Foundation Skin Immunology Research Center.

The finding is the result of many years' research in which the researchers have conducted molecular studies and laboratory tests, taken samples of skin and blood, and conducted clinical studies of carefully selected patients.

Eager to Find New Treatments

So far, CTCL patients with infections in the skin have only reluctantly been given antibiotics because it was feared that the infection would come back as antibiotic-resistant staphylococci after the treatment. The researchers behind the finding believe that the new results will change this.

'It has previously been seen that antibiotics have had some kind of positive effect on some of these patients, but it has never been studied what it actually does to the cancer itself. Our finding shows that it may actually be a good idea to give patients with staphylococci on the skin this treatment because it inhibits the cancer and at the same time possibly reduces the risk of new infections', says Niels Ødum.

It is still difficult to say whether the new knowledge may be transferred to other types of cancer. For the researchers at the LEO Foundation Skin Immunology Research Center, the next step is to initially look more closely at the link between cancer and bacteria.

'We do not know if this finding is only valid for lymphoma. We see it particularly in this type of cancer because it is a cancer within the immune system. The cancer cells already 'understand' the signals that the immune cells send out. When the immune cells are put to work, so are the cancer cells. At any rate, it is very interesting and relevant to take a closer look at the interaction between bacteria and cancer, which we see here', says Niels Ødum.

'The next step will be the development of new treatments that only target the 'bad' bacteria, without harming the 'good' bacteria, which protect the skin', he says.

Credit: 
University of Copenhagen - The Faculty of Health and Medical Sciences

Little helpers for the rainforest

image: This is a secondary forest growing on a former water buffalo pasture. The regenerating forest consists of young, smaller trees and a dense herb layer, as more light reaches the ground.

Image: 
Eckhard W. Heymann

Tropical rainforests store large quantities of carbon dioxide, produce oxygen and provide habitats for many animal and plant species. If these ecosystems, which are so important for the global climate and biodiversity, are destroyed, they will recover very slowly, if at all. Scientists from the German Primate Center (DPZ) - Leibniz Institute for Primate Research, the Universidade Estadual Paulista, Brazil, and the University of Marburg have conducted a long-term study on the role monkeys play in the regeneration of degraded rainforests. For over 20 years, they observed two tamarin species in the rainforest of Peru. These animals feed on fruits and excrete the seeds undigested in their faeces. The researchers have studied the dispersal and germination of seeds as well as the growth and genetic origin of various plants in a forest that had emerged from a former pasture. For the first time, they were able to prove that monkeys have a decisive influence on the dispersal of seeds from the original primary forest to the regenerating secondary forest (Scientific Reports).

The study was carried out in the Peruvian Amazon rainforest at the Estación Biológica Quebrada Blanco research station of the German Primate Center. Near the station there is an area of about four hectares which was cleared and used as pasture for water buffalos between 1990 and 2000. After the grazing was abandoned, rainforest slowly developed again. The researchers around Eckhard W. Heymann, scientist at the German Primate Center and head of the study, observed that moustached and black-fronted tamarins temporarily ventured into the early secondary forest.

Tamarins feed mainly on fruits and disperse the seeds of many different tropical trees and lianas via their faeces. "We wanted to find out whether the seed dispersal by monkeys has a demonstrable effect on the natural regeneration of forests," says Eckhard W. Heymann.

To investigate which seeds were dispersed from the primary forest to the secondary forest, the researchers identified seeds from the monkeys' faeces and observed their development in the secondary forest. Around ten per cent of these seeds stemmed from plants growing in the primary forest and were dispersed into the secondary forest. A part of these seeds germinated and the resulting seedlings survived for at least one year. These seedlings could be assigned to eight different plant species. Seven of these species could only be found as adult plants in the nearby primary forest.

In order to genetically verify the results, the scientists analyzed seedlings and young plants of the neotropic tree Parkia panurensis. The seeds of this tree are dispersed exclusively by tamarins in the area around the DPZ research station. The researchers extracted the DNA from leaves of seedlings and young plants growing in the secondary forest and compared the genotype with those of adult Parkia trees in the primary forest. Half of these seedlings and young plants could be matched to eleven parent trees in the primary forest. The distances between young and parent plants were exactly in the range over which the tamarins disperse Parkia seeds.
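The parentage analysis described above can be pictured as a simple compatibility check between genotypes. The sketch below is purely illustrative: the genotypes, locus names, and the matching rule (a diploid parent must share at least one allele with the seedling at every marker locus) are simplified assumptions, not data or methods from the study.

```python
# Toy illustration of matching seedlings in the secondary forest to
# candidate Parkia parent trees in the primary forest by genotype.
# Rule (simplified assumption): a compatible parent shares at least
# one allele with the seedling at every marker locus.

def is_compatible_parent(seedling, candidate):
    """Each genotype maps locus name -> (allele, allele)."""
    return all(
        set(seedling[locus]) & set(candidate[locus])  # shared allele?
        for locus in seedling
    )

# Hypothetical two-locus genotypes for one seedling and two adult trees.
seedling = {"loc1": ("A", "B"), "loc2": ("C", "C")}
parents = {
    "tree_1": {"loc1": ("A", "A"), "loc2": ("C", "D")},  # shares an allele at both loci
    "tree_2": {"loc1": ("B", "B"), "loc2": ("D", "D")},  # no shared allele at loc2
}

matches = [name for name, g in parents.items()
           if is_compatible_parent(seedling, g)]
# Only tree_1 remains as a possible parent.
```

In the actual study, such genotype matching (over many microsatellite loci, with distances between matched pairs) is what let the researchers trace half of the secondary-forest seedlings back to eleven parent trees in the primary forest.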

"Our data show for the first time that the moustached and black-fronted tamarins effectively disperse seeds from primary forest into secondary forest," says Eckhard W. Heymann. "We were able to prove that the seeds germinate and form young plants, thus increasing the diversity of species in the secondary forest. The tamarins have been shown to contribute to the natural regeneration of areas destroyed by humans."

The study included data collected at the DPZ research station since 1994, but not initially against the background of the current issue. "At that time, we did not expect the cleared forest area to ever recover," emphasizes Eckhard W. Heymann. "However, the study shows how important data collection and investigations over a very long period of time are in order to be able to make reliable statements about slowly developing ecological processes."

Credit: 
Deutsches Primatenzentrum (DPZ)/German Primate Center

Children with medical emergencies during airline flights have limited aid

DURHAM, N.C. -- Children afflicted with medical emergencies during commercial airline trips tend to have common ailments such as vomiting, fever or allergic reactions - events that should be easily treated, according to a study led by Duke Health researchers.

But few airlines stock first-aid kits with pediatric versions of therapies that would help, including liquid forms of pain relievers or allergy medications.

"Children represent almost 16% of emergency medical events on airlines, so these incidents are not rare," said Alexandre Rotta, chief of the Division of Pediatric Critical Care Medicine at Duke University School of Medicine and lead author of a study published Thursday in the Annals of Emergency Medicine.

"Both airlines and parents should be aware of the most common illnesses and be prepared to deal with them," he said. "Our study provides this much-needed information."

The study is the first to detail more than 11,000 instances on 77 international airlines in which children required emergency medical attention, covering a period between January 2015 and October 2016.

Most of the incidents involving children were handled by flight crew members (86.6%), but in nearly 9% of cases, doctors who were on board as passengers were asked to lend their assistance. About 16% of total cases resulted in a child needing additional care upon landing, and only 0.5% of flights were diverted to a nearby airport to get immediate care.

The most common medical events among children were the same conditions that drive pediatric emergency room visits, including nausea and vomiting (33.9%), fever or chills (22.2%), acute allergic reaction (5.5%), abdominal pain (4.7%) and stomach flu (4.5%).

But unless parents had stocked their carry-on bags with therapies, the likelihood was slim that the airline would have a remedy on hand that was appropriate for a child. The Federal Aviation Administration requires U.S. airlines to have well-stocked first-aid kits that include asthma inhalers, antihistamines and aspirin. But the medications are in pill form, which many youngsters can't swallow, and/or in adult dosages.

In 2018, Congress passed a law directing the FAA to assess whether on-board first-aid kits have the minimum contents to meet the needs of children. Rotta, who is himself a pilot and who has frequently assisted children and adults during in-flight emergencies, said the research team's analysis should provide a shopping list for stocking airline first-aid kits.

"This is needed information to help inform discussion and policies affecting children on airlines and what should be included in the on-board medical kits," Rotta said. "But for right now, if you are a parent traveling with a child, we recommend you carry on the medications you think your child might need."

Credit: 
Duke University Medical Center

Interventions for type 2 diabetes successful across the genetic landscape

BOSTON - As the number of people with type 2 diabetes soared to 8.8 percent of the population by 2017, a growing public health movement has sought to know if tailoring dietary recommendations to specific genetic profiles might help reduce the risk of the disease in susceptible individuals. A team of researchers from Massachusetts General Hospital (MGH) has now found that the quality of dietary fat consumed and the genetic risk of diabetes work independently of each other, and that a diet rich in polyunsaturated fats can be safely applied across the spectrum of type 2 diabetes genetic risk.

"Our meta-analysis shows on a scale never done before that there is no apparent need to be concerned about the genetic risk to inform sound dietary recommendations for individuals with type 2 diabetes," says Jordi Merino, RD, PhD, of the MGH Diabetes Unit and Center for Genomic Medicine, and corresponding author of the study published online in the BMJ. "This means that lifestyle or dietary interventions for the prevention of type 2 diabetes can be deployed across all gradients of genetic risk since genetic burden does not seem to impede their effectiveness."

Recommendations aimed at improving dietary quality have become an integral part of the worldwide public health effort to stem the rampant growth of diabetes. The MGH investigators found that irrespective of genetic risk, consuming more polyunsaturated fat (such as omega 3 and omega 6 fatty acids) in place of refined starch and sugars is associated with a lower risk of type 2 diabetes, while consuming more monounsaturated fat in place of carbohydrates is associated with a higher risk of the metabolic disease. In North America, monounsaturated fats typically derive from animal sources of food such as red meat, dairy and full-fat dairy products.

Merino emphasizes another important finding of the study that transcends the issue of dietary fat. "The positive association between polygenic scores and type 2 diabetes we reported acknowledges the fact that people at higher genetic risk could benefit from additional strategies that have nothing to do with dietary fat intake," he says.

The MGH study included more than 102,000 participants of European descent who were free from diabetes at baseline. These individuals were drawn from 15 cohort studies and followed over 12 years. In finding no appreciable interaction between dietary components and type 2 diabetes risk-increasing genes, the analysis concurs with the national Diabetes Prevention Program, which demonstrated that lifestyle modification is effective regardless of the genetic burden for type 2 diabetes. The MGH findings are also consistent with recent evidence around coronary artery disease, which has led to heart-healthy lifestyle and dietary regimens being promoted across the genetic landscape.

The picture is somewhat different with obesity, however, where increasing evidence has shown that unhealthy dietary or lifestyle patterns, such as sugar-sweetened drinks, fried foods and physical inactivity, might interact with genetic susceptibility to elevate body mass index (BMI). Looking to explain the dichotomy, Merino says, "The metabolic complexity of type 2 diabetes and coronary heart disease may account for the lack of interaction between lifestyle factors and genetic background."

Credit: 
Massachusetts General Hospital

Electricity-driven undersea reactions may have been important for the emergence of life

image: The research group has proposed an effective mechanism to utilize the chemical energy generated by hot hydrothermal fluids gushing out of hydrothermal vents on Earth's early ocean floor for the synthesis of biomolecules.

Image: 
ELSI

Though it remains unknown how life began, there is a community of scientists who suspect it occurred in or around deep sea hydrothermal environments. At such sites, water heated by contact with hot rocks from Earth's mantle flows into the lower ocean, passing over and through minerals which are themselves precipitated by the interaction of this hot water with cold seawater. The minerals often include metal sulfides, such as iron sulfide, also known as pyrite or fool's gold. As they precipitate, these mineral deposits begin to form channels for the hot vent water, and since the metal-containing minerals are electrically conductive and the compositions of the vent water and ocean water are different, an electrical gradient is created--something like a natural battery--with electric current flowing from the vent water through the minerals and into the ocean. A team led by Tokyo Institute of Technology/Earth-Life Science Institute (ELSI) scientists have now shown via careful laboratory experiments that this current can reduce the metal sulfide minerals to native metals and mixed metal sulfide/metal conglomerates, which in turn can act as reducing agents and catalysts for reactions of various organic compounds.

The team led by Norio Kitadai, an affiliated scientist of the Tokyo Institute of Technology/Earth-Life Science Institute (ELSI) as well as a scientist at the Japan Agency for Marine-Earth Science and Technology (JAMSTEC), produced a set of electrochemical reactions in the laboratory that are suspected to have been generated in early ocean floor hydrothermal vent environments. They demonstrated that metal sulfides, including those of iron, copper, lead, and silver (some of which are common constituent minerals in hydrothermal vent environments), were converted to native metals by electroreduction. Complexes of metal sulfide and reduced metal were also produced during the process. It was also discovered that several organic chemical reactions indispensable in modern life were promoted by these complexes. The authors believe the metals and metal sulfides served as reducing agents and catalysts for these reactions.

This research identifies a new mechanism for the creation of organic compounds driven by hydrothermal electricity generation in hydrothermal fluids. An exciting implication of this work is that, since electrical current appears to be universally generated in deep-sea hydrothermal vent environments on Earth, anywhere such hydrothermal processes occur throughout the cosmos should likewise promote this kind of chemistry. Indeed, recent astronomical and spacecraft-based observations suggest there may be vigorous hydrothermal activity on the moons of Saturn and Jupiter (Enceladus and Europa), and hydrothermal activity was likely common on early Mars. Further research on the effects of various metals and electric gradients is expected to unveil much more about the environmental conditions that can facilitate prebiotic chemistry. This could ultimately lead to a better understanding of the universality and similarity of life in the universe.

Credit: 
Tokyo Institute of Technology

Device could automatically deliver drug to reverse opioid overdose

video: A Purdue University team of engineers has built a wearable device designed to automatically detect when a person's respiration rate decreases to a certain level - converted from electrocardiography (EKG) signals - and then release naloxone, which blocks the opioid from binding to brain receptors.

Image: 
Purdue University video/Erin Easterling

WEST LAFAYETTE, Ind. -- Opioid users tend to be alone and incapacitated during an overdose. Purdue University researchers are developing a device that would automatically detect an overdose and deliver naloxone, a drug known to reverse deadly effects.

"The antidote is always going to be with you," said Hyowon "Hugh" Lee, an assistant professor of biomedical engineering at Purdue. "The device wouldn't require you to recognize that you're having an overdose or to inject yourself with naloxone, keeping you stable long enough for emergency services to arrive."

Overdose happens when opioids bind to receptors in the brain that regulate breathing, causing a person to hypoventilate and die. According to the U.S. Department of Health and Human Services, approximately 130 people in the U.S. die every day from opioid-related drug overdoses.

Lee's team has built a wearable device designed to detect when a person's respiration rate decreases to a certain level - converted from electrocardiography (EKG) signals - and then release naloxone, which blocks the opioid from binding to brain receptors.

Wearing the device would be similar to wearing an insulin pump: The current proof of concept is an armband that holds a magnetic field generator, connected to a portable battery worn at the hip. A sticker-like EKG sensor on the skin, such as on the chest, measures respiration rate. When the sensor detects a respiration rate that's too low, it activates the magnetic field generator to heat up a drug capsule in the body, releasing naloxone within 10 seconds.
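The sense-and-trigger logic described above amounts to a simple threshold check on an EKG-derived respiration rate. The sketch below is only an illustration of that idea: the function names, the breaths-per-minute cutoff, and the rate estimation from breath intervals are all hypothetical assumptions, not details of the Purdue device.

```python
# Illustrative sketch of the overdose-detection trigger described above.
# The threshold value and all names are hypothetical, not from the
# actual Purdue implementation.

HYPOVENTILATION_THRESHOLD = 8.0  # breaths per minute; assumed cutoff


def respiration_rate_from_ekg(breath_intervals_s):
    """Estimate breaths per minute from intervals (in seconds) between
    breaths derived from the EKG signal."""
    if not breath_intervals_s:
        return 0.0
    mean_interval = sum(breath_intervals_s) / len(breath_intervals_s)
    return 60.0 / mean_interval


def should_release_naloxone(rate_bpm, threshold=HYPOVENTILATION_THRESHOLD):
    """Activate the magnetic field generator (i.e. trigger drug release)
    when the respiration rate falls below the hypoventilation threshold."""
    return rate_bpm < threshold


# Normal breathing (one breath every 4 s -> 15 bpm) should not trigger
# release; severely depressed breathing (every 12 s -> 5 bpm) should.
normal = respiration_rate_from_ekg([4.0, 4.0, 4.0])
depressed = respiration_rate_from_ekg([12.0, 12.0])
```

In a real device this decision would of course involve signal filtering and safeguards against false triggers; the sketch only shows the basic closed-loop idea of sensor reading, threshold comparison, and actuation.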

The researchers envision the drug capsule being pre-injected under the skin in an outpatient setting. That way, the device system would automatically deliver naloxone to the patient during an overdose, buying about an hour before the patient relapses. A YouTube video is available at https://youtu.be/b2Bfc4YRm0Y.

According to Lee, that extra hour would give emergency services plenty of time to get the patient to the hospital. The capsule also delivers a larger dose of naloxone than products currently available on the market - making it more effective at delaying relapse - and would be cheaper to manufacture.

Although the device doesn't yet work automatically, the researchers demonstrated through in vitro and in vivo experiments that the setup successfully detects a low respiration rate from EKG signals and delivers naloxone. Their work appears in the Journal of Controlled Release.

The device has been patented through the Purdue Research Foundation Office of Technology Commercialization. Since submitting the work for publication, the researchers have downsized the magnetic field generator and battery so that the device is less bulky.

"The goal is to make the whole system unobtrusive, so that you don't feel like you're having to wear something large all the time," Lee said.

The researchers also plan to build a communications system into the device that would automatically alert emergency services when the patient has overdosed.

The technology could possibly deliver other drugs besides naloxone.

"People with allergies need epinephrine right away. This setup might remove the need for an epi pen," Lee said.

Credit: 
Purdue University

Coping skills program helps social service workers reduce stress, trauma after disasters

image: University of Illinois social work professors Tara Powell and Kate Wegmann found that a mental health intervention called Caregivers Journey of Hope can bolster social service workers' emotional resilience and ability to cope with the stress and trauma associated with disasters such as Superstorm Sandy.

Image: 
Photo by L. Brian Stauffer

CHAMPAIGN, Ill. -- An intervention called Caregivers Journey of Hope can help social service workers - especially those with the least experience in the field - to mitigate the stress and trauma they may experience when they're helping community members recover from disasters, a new study found.

There's a significant need for mental health interventions for social service workers, who are at high risk of burnout, chronic stress and emotional distress in disaster recovery, said the study's co-authors, University of Illinois social work professors Tara Powell and Kate M. Wegmann.

"Since many people in helping professions may be trying to rebuild their own lives while helping traumatized people in the community, providing these workers with the training and tools to practice physical, emotional and social self-care is critical to helping them reduce their own stress and avert burnout," said Powell, who led the study.

Powell and her co-authors examined the impact that the Caregivers Journey of Hope workshop had on 722 professionals who assisted victims of Superstorm Sandy in New York and New Jersey.

Sandy ravaged the Eastern Seaboard of the U.S., Canada and the Caribbean during October 2012, killing more than 200 people and causing more than $70 billion in damage. New York and New Jersey were among the hardest-hit regions on the U.S. mainland, where 87 people died and more than 650,000 homes were damaged or destroyed, according to the study.

Powell co-developed the Caregivers Journey of Hope curriculum while working for Save the Children. The curriculum was designed to bolster the resilience of social workers, teachers and children in New Orleans and reduce emotional distress they experienced as a result of Hurricane Katrina in 2005.

Recovery from disasters often takes years, Powell and Wegmann noted in the study. Working closely with traumatized clients and vicariously experiencing their terror and pain can adversely affect the mental health of counselors and social workers.

In turn, this distress can trigger a host of emotional, behavioral, physical and interpersonal problems, negatively affecting caregivers' job performance and personal lives, according to the study.

Obtaining social support can be especially important for counselors because the often-confidential nature of their work prevents them from discussing traumatizing or stressful experiences outside the workplace, the researchers wrote.

"The half-day Caregivers Journey of Hope workshop gives front-line care providers an opportunity to process disaster-related stress in a safe, confidential environment, build social support and develop strategies to cope with stressors in the workplace and at home," Powell said. "A wealth of research over the past couple of decades has illustrated that higher levels of stress are associated with lower levels of social support."

Working in small groups, workshop participants share their experiences; explore the types, sources and effects of stress; and develop solutions, such as ways they can build their social support networks. They also discuss strategies for rebuilding their communities and for enhancing individual and community-level recovery.

Powell and Wegmann tested the intervention with social workers and counselors from 37 agencies in New York and New Jersey after Sandy.

Participants reported substantial decreases in their stress levels and showed significant improvements on all of the other measures surveyed, the researchers found.

Caregivers who were newest on the job - those with one to four years' experience - benefitted the most, showing the greatest gains in their ability to recognize the signs and effects of stress and in their perceived ability to cope with taxing situations.

"This finding is of particular importance, as those with less experience in the social service field are at a higher risk for experiencing various forms of caregiver distress," Wegmann said. "Research has shown that those who perceive that they can actively cope with stressors or who have higher coping self-efficacy tend to have better health and mental health outcomes."

Credit: 
University of Illinois at Urbana-Champaign, News Bureau

Time heals all wounds, but this adhesive can help

Cuts, scrapes, blisters, burns, splinters, and punctures - there are a number of ways our skin can be broken. Most treatments for skin wounds involve simply placing a barrier over it (usually an adhesive gauze bandage) to keep it moist, limit pain, and reduce exposure to infectious microbes, but do not actively assist in the healing process. More sophisticated wound dressings that can monitor aspects of healing such as pH and temperature and deliver therapies to a wound site have been developed in recent years, but they are complex to manufacture, expensive, and difficult to customize, limiting their potential for widespread use.

Now, a new, scalable approach to speeding up wound healing has been developed based on heat-responsive hydrogels that are mechanically active, stretchy, tough, highly adhesive, and antimicrobial: active adhesive dressings (AADs). Created by researchers at the Wyss Institute for Biologically Inspired Engineering at Harvard University, Harvard's John A. Paulson School for Engineering and Applied Sciences (SEAS), and McGill University, AADs can close wounds significantly faster than other methods and prevent bacterial growth without the need for any additional apparatus or stimuli. The research is reported in Science Advances.

"This technology has the potential to be used not only for skin injuries, but also for chronic wounds like diabetic ulcers and pressure sores, for drug delivery, and as components of soft robotics-based therapies," said corresponding author David Mooney, Ph.D., a Founding Core Faculty member of the Wyss Institute and the Robert P. Pinkas Family Professor of Bioengineering at SEAS.

AADs take their inspiration from developing embryos, whose skin is able to heal itself completely, without forming scar tissue. To achieve this, the embryonic skin cells around a wound produce fibers made of the protein actin that contract to draw the wound edges together, like a drawstring bag being pulled closed. Skin cells lose this ability once a fetus develops past a certain age, and any injuries that occur after that point cause inflammation and scarring during the healing process.

In order to mimic the contractile forces that pull embryonic skin wounds closed, the researchers extended the design of previously developed tough adhesive hydrogels by adding a thermoresponsive polymer known as PNIPAm, which both repels water and shrinks at around 90° F. The resulting hybrid hydrogel begins to contract when exposed to body temperature, and transmits the force of the contracting PNIPAm component to the underlying tissue via strong bonds between the alginate hydrogel and the tissue. In addition, silver nanoparticles are embedded in the AAD to provide antimicrobial protection.

"The AAD bonded to pig skin with over ten times the adhesive force of a Band-Aid® and prevented bacteria from growing, so this technology is already significantly better than most commonly used wound protection products, even before considering its wound-closing properties," said Benjamin Freedman, Ph.D., a Postdoctoral Fellow in the Mooney lab who is leading the project.

To test how well their AAD closed wounds, the researchers tested it on patches of mouse skin and found that it reduced the size of the wound area by about 45% compared to almost no change in area in the untreated samples, and closed wounds faster than other treatments including microgels, chitosan, gelatin, and other types of hydrogels. The AAD also did not cause inflammation or immune responses, indicating that it is safe for use in and on living tissues.

Furthermore, the researchers were able to adjust the amount of wound closure performed by the AAD by adding various amounts of acrylamide monomers during the manufacturing process. "This property could be useful when applying the adhesive to wounds on a joint like the elbow, which moves around a lot and would probably benefit from a looser bond, compared to a more static area of the body like the shin," said co-first author Jianyu Li, Ph.D., a former Postdoctoral Fellow at the Wyss Institute who is now an Assistant Professor at McGill University.

The team also created a computer simulation of AAD-assisted wound closure, which predicted that AAD could cause human skin to contract at a rate comparable to that of mouse skin, indicating that it has a higher likelihood of displaying a clinical benefit in human patients. "We are continuing this research with studies to learn more about how the mechanical cues exerted by AAD impact the biological process of wound healing, and how AAD performs across a range of different temperatures, as body temperature can vary at different locations," said Freedman. "We hope to pursue additional preclinical studies to demonstrate AAD's potential as a medical product, and then work toward commercialization."

"This is another wonderful example of a mechanotherapy in which new insights into the key role that physical forces play in biological control can be harnessed to develop a new and simpler therapeutic approach that may be even more effective than drugs or complex medical devices," said Wyss Founding Director Donald Ingber, M.D., Ph.D., who is also the Judah Folkman Professor of Vascular Biology at Harvard Medical School (HMS) and the Vascular Biology Program at Boston Children's Hospital, and Professor of Bioengineering at SEAS.

Credit: 
Wyss Institute for Biologically Inspired Engineering at Harvard

Amoeba builds barriers for protection against bacteria

image: Dr. Adam Kuspa is the corresponding author of this work.

Image: 
Baylor College of Medicine

In some respects, animals and amoebae are not that different. For instance, both are at risk of potentially deadly attacks by bacteria and have evolved ways to prevent them. Researchers at Baylor College of Medicine report in the journal Science Advances that Dictyostelium discoideum, the soil-dwelling single-celled amoeba that feeds on bacteria, builds a barrier around its colonies that counteracts bacterial attempts to penetrate them, facilitates amoebal feeding and protects them from oxidative stress.

"We were surprised to discover that, when exposed to some types of Gram-negative bacteria such as Klebsiella pneumoniae, but not Gram-positive bacteria, D. discoideum secretes into its surroundings large amounts of CadA, a protein until now known only as a cell adhesion molecule that contributes to the amoeba's development," said corresponding author Dr. Adam Kuspa, professor of biochemistry and molecular biology and senior vice president and dean of research at Baylor. "We analyzed CadA and determined that it is a lectin, a molecule that binds to carbohydrates, and mixing CadA with K. pneumoniae resulted in clumps of bacteria."

Kuspa and his colleagues then investigated how the lack of CadA affected the amoeba's ability to form colonies, or plaques, on a film of bacteria -- typical laboratory conditions for studying amoebae. They deleted the CadA gene and found that only 20 percent of the amoebae survived and formed plaques when set to grow on a film of K. pneumoniae. But the same CadA-deficient amoebae grew just as well as amoebae with CadA when set to grow on Gram-positive bacteria.

"It was totally unexpected that secreted CadA was important for amoebal colony formation on some types of bacteria, but not others," said first author Dr. Timothy Farinholt, who was a graduate student in the Kuspa lab when he was working on this project. "When we added exogenous CadA to CadA-deficient amoebae, they formed threefold more colonies than when CadA was not available."

Altogether, the experiments suggested that CadA is important for the amoeba to go through the initial stages leading to plaque formation and that once the plaque reaches a certain size, CadA is no longer needed to keep the amoebae alive on a bacterial film.

But, how was CadA helping D. discoideum survive on a sea of life-threatening K. pneumoniae?

A series of microscopic examinations showed a sharp border between the amoebal colony and the surrounding bacteria, suggesting that CadA forms a barrier that prevents live bacteria from entering the amoebal plaque. As CadA binds to K. pneumoniae and agglutinates, or clumps, the bacteria, the amoebae are attracted toward these bacterial clumps and feed on them at the edges of the plaque. Additional experiments showed that when CadA is present, the amoebae actively feeding at the plaque edges have lower levels of oxidative stress than when CadA is absent.

"Amoebae living in soil are constantly exposed to vast numbers of heterogeneous groups of bacteria, and CadA and the other lectins we and others have so far identified enable amoebae to recognize specific species and adapt to survive," Kuspa said.

Credit: 
Baylor College of Medicine

Climate change could revive medieval megadroughts in US Southwest

About a dozen megadroughts struck the American Southwest during the 9th through the 15th centuries, but then they mysteriously ceased around the year 1600. What caused this clustering of megadroughts -- that is, severe droughts that last for decades -- and why do they happen at all?

If scientists can understand why megadroughts happened in the past, it can help us better predict whether, how, and where they might happen in the future. A study published today in Science Advances provides the first comprehensive theory for why there were megadroughts in the American Southwest. The authors found that ocean temperature conditions plus high radiative forcing -- when Earth absorbs more sunlight than it radiates back into space -- play important roles in triggering megadroughts. The study suggests an increasing risk of future megadroughts in the American Southwest due to climate change.

Previously, scientists have studied the individual factors that contribute to megadroughts. In the new study, a team of scientists at Columbia University's Lamont-Doherty Earth Observatory has looked at how multiple factors from the global climate system work together, and projected that warming climate may bring a new round of megadroughts.

By reconstructing aquatic climate data and sea-surface temperatures from the last 2,000 years, the team found three key factors that led to megadroughts in the American Southwest: radiative forcing, severe and frequent La Niña events -- cool tropical Pacific sea surface temperatures that cause changes to global weather events -- and warm conditions in the Atlantic. High radiative forcing appears to have dried out the American Southwest, likely due to an increase in solar activity (which would send more radiation toward Earth) and a decrease in volcanic activity (whose aerosols would otherwise have blocked some of that radiation) at the time. The resulting increase in heat would lead to greater evaporation. At the same time, warmer-than-usual Atlantic sea-surface temperatures combined with very strong and frequent La Niñas decreased precipitation in the already dried-out area. Of these three factors, La Niña conditions were estimated to be more than twice as important as the others in causing the megadroughts.
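For readers who want a concrete sense of how such factors might combine, the sketch below folds the three drivers into a toy index, with a doubled weight on La Niña reflecting the study's estimate that it mattered more than twice as much. The linear form and the weights are illustrative assumptions, not the authors' actual model.

```python
def drought_risk(radiative, atlantic_sst, la_nina):
    """Toy combined drought index for the three drivers named in the
    study, with standardized anomalies as inputs. The doubled weight
    on La Nina reflects the paper's estimate that it was more than
    twice as important; the weights and the linear form are
    illustrative assumptions only."""
    return 1.0 * radiative + 1.0 * atlantic_sst + 2.0 * la_nina

# A medieval-style decade: high forcing, warm Atlantic, strong La Nina
print(drought_risk(1.0, 1.0, 1.5))    # 5.0
# A quiet decade: little forcing, near-neutral ocean conditions
print(drought_risk(0.0, 0.25, -0.5))  # -0.75
```

Under this kind of scoring, a decade only reaches extreme values when several drivers line up at once, which matches the paper's central point that megadroughts required a conjunction of conditions.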

While the Lamont scientists say they were able to pinpoint the causes of megadroughts in a more complete way than has been done before, they say such events will remain difficult for scientists to predict. There are predictions about future trends in temperatures, aridity, and sea surface temperatures, but future El Niño and La Niña activity remains difficult to simulate. Nevertheless, the researchers conclude that human-driven climate change is stacking the deck towards more megadroughts in the future.

"Because you increase the baseline aridity, in the future when you have a big La Niña, or several of them in a row, it could lead to megadroughts in the American West," explained lead author Nathan Steiger, a Lamont-Doherty Earth Observatory hydroclimatologist.

During the time of the medieval megadroughts, increased radiative forcing was caused by natural climate variability. But today we are experiencing increased dryness in many locations around the globe due to human-made forces. Climate change is setting the stage for an increased possibility of megadroughts in the future through greater aridity, say the researchers.

Credit: 
Columbia Climate School

The climate is warming faster than it has in the last 2,000 years

image: Global mean warming / cooling rates over the last 2,000 years. In red are the periods (each across 51 years) in which the reconstructed temperatures increased. Global temperatures decreased in the periods in blue. The green line shows that the maximum expected warming rate without anthropogenic influence is just under 0.6 degrees per century. Climate models (dashed orange line) are able to simulate this natural upper limit very well. At more than 1.7 degrees per century, the current rate of warming is significantly higher than the expected natural rate of warming, and higher than values for every previous century. Instrumental measurements since 1850 (in black) confirm these figures.

Image: 
University of Bern

Many people have a clear picture of the "Little Ice Age" (from approx. 1300 to 1850). It's characterized by paintings showing people skating on Dutch canals and glaciers advancing far into the alpine valleys. That it was extraordinarily cool in Europe for several centuries is proven not just by historical paintings but by a large number of temperature reconstructions, based on tree rings, for example. As there are also similar reconstructions for North America, it was assumed that the "Little Ice Age" and the similarly famous "Medieval Warm Period" (approx. 700 - 1400) were global phenomena. But now an international group led by Raphael Neukom of the Oeschger Center for Climate Change Research at the University of Bern is painting a very different picture of these alleged global climate fluctuations. In a study which has just appeared in the well-known scientific journal Nature, and in a supplementary publication in Nature Geoscience, the team shows that there is no evidence of uniform warm and cold periods across the globe over the last 2,000 years.

Climate fluctuations in the past varied from region to region

"It's true that during the Little Ice Age it was generally colder across the whole world," explains Raphael Neukom, "but not everywhere at the same time. The peak periods of pre-industrial warm and cold periods occurred at different times in different places." According to the climate scientist from Bern, the now-debunked hypothesis of climate phases occurring simultaneously across the globe arose from an impression shaped by the climate history of Europe and North America. In the absence of data from other parts of the earth, this notion was applied to the whole planet, raising expectations that relatively cold or warm periods throughout the last 2,000 years were globally synchronous phenomena. But it has now been shown that this was not the case.

The authors of the study in Nature see the explanation for that as being that regional climates in pre-industrial times were primarily influenced by random fluctuations within the climate systems themselves. External factors such as volcanic eruptions or solar activity were not intense enough to cause markedly warm or cold temperatures across the whole world for decades, or even centuries.

The researchers relied on a database from the international research consortium PAGES (Past Global Changes, http://www.pastglobalchanges.org), which provides a comprehensive overview of climate data from the last 2,000 years, for their investigation of five pre-industrial climate epochs. In addition to tree rings, it also includes data from ice cores, lake sediments and corals. To really put the results to the test, the team led by Raphael Neukom analyzed these data sets using six different statistical models - more than ever before. This allowed for the calculation of the probability of extremely warm or cold decades and centuries, and not just the calculation of absolute temperatures. The result was that no globally coherent picture emerged during the periods being investigated. "The minimum and maximum temperatures were different in different areas," says Raphael Neukom. So thermal extremes across the world cannot be inferred from regional temperature phenomena like the oft-mentioned "Medieval Warm Period" in Europe and North America.

The current warm period is happening across the world for the first time

The results look very different for recent history. Both studies show that the warmest period of the last 2,000 years was most likely in the 20th century. They also show that this was the case for more than 98 percent of the surface of the earth. This shows - once again - that modern climate change cannot be explained by random fluctuations, but by anthropogenic emissions of CO2 and other greenhouse gases. What we didn't know until now is that not only are average global temperatures in the 20th century higher than ever before in at least 2,000 years, but also that a warming period is now, for the first time, affecting the whole planet at the same time. And the speed of global warming has never been as high as it is today.
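The warming rates mentioned in the figure caption above are linear trends over sliding 51-year windows. A minimal sketch of that calculation, using a synthetic series standing in for the actual PAGES reconstruction, might look like this:

```python
import numpy as np

def max_warming_rate(temps, years, window=51):
    """Maximum linear warming trend (deg C per century) over all
    sliding windows of `window` years, as in the Bern analysis."""
    rates = []
    for i in range(len(temps) - window + 1):
        slope = np.polyfit(years[i:i + window], temps[i:i + window], 1)[0]
        rates.append(slope * 100)  # deg C per year -> deg C per century
    return max(rates)

# Synthetic stand-in: flat pre-industrial noise plus a modern ramp
years = np.arange(0, 2000)
temps = 0.05 * np.random.default_rng(0).standard_normal(2000)
temps[1850:] += 0.017 * np.arange(150)  # ~1.7 deg C/century modern warming

print(max_warming_rate(temps, years))  # well above the ~0.6 natural limit
```

With a modern ramp of roughly 1.7 degrees per century built in, the maximum 51-year rate clearly exceeds the roughly 0.6-degree-per-century natural ceiling the caption describes; on the noise-only pre-industrial portion it stays far below it.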

Credit: 
University of Bern

Outcompeting cancer

One of the reasons cancer cells are so robust against the body's natural defenses is that they are in fact human cells, and as such they have the innate machinery not only to trick the body's defense and maintenance systems, but even to hijack them. Therefore, discovering cancer cells' full "bag of tricks" is key for fighting cancer.

Eduardo Moreno, principal investigator at the Champalimaud Centre for the Unknown, in Lisbon, Portugal, has taken an important step in this direction by discovering one new such "trick": a cell-competition mechanism which he has named "fitness fingerprints".

"We first identified this 'fitness fingerprints' mechanism in the model animal Drosophila melanogaster (the common fruit fly), in 2010, and now, in this new study published in the journal Nature, we were able to prove that it also exists in humans and that blocking it halts the growth of human cancer cells", says Moreno.

Bad neighbours

Moreno and his team discovered that neighbouring cells in the body are constantly evaluating each other's fitness level by using special markers that each cell exhibits on its surface. "We found that there are actually two types of markers: 'Win' fitness fingerprints, which signify that the cell is young and healthy and 'Lose' fitness fingerprints, which signify that the cell is old or damaged", Moreno explains. "If a cell is less fit than its neighbours, meaning that it either has less Win or more Lose than them, then they eliminate it, thereby ensuring the health and integrity of the tissue as a whole."

According to Moreno, his team found that this process is important for proper development, tissue regeneration after injury and to prevent premature aging, but that it can also be hijacked for tumour growth.

"Cancer cells use these fitness fingerprints to disguise themselves as super-fit cells that have relatively many more Win fitness fingerprints on their surface [than their healthy neighbours]. This makes the normal cells that surround cancer cells appear less healthy by comparison. In this way, cancer cells trick their healthy neighbours and bring about their death, consequently destroying the tissue and making room for tumour expansion."

When fitness fingerprints were identified by Moreno's group in the fruit fly, it was not known whether this cell competition mechanism would be conserved in humans, as it is possible that different animals use different strategies to detect unwanted cells. In fact, Moreno suspected that this mechanism might not be conserved.

"Fitness fingerprints can be very useful, but they also carry a significant cancer risk since they make tumours more aggressive. Such a tradeoff may be acceptable for short-lived animals such as the fruit fly, but for long-lived animals such as humans it might be too risky", he points out. However, following a series of experiments in human cancer cells, they found that we humans possess this double-edged mechanism after all.

Fitness fingerprints in human cancer

To find out whether human cells express fitness fingerprints and whether they are involved in cancer, two researchers in the lab, Rajan Gogna and Esha Madan, performed a series of experiments. They began by identifying the gene that codes for fitness fingerprints in the human genome. Once the gene was identified, they saw that it actually codes for four different types of fitness fingerprints: two types of Win fitness fingerprints and two types of Lose fitness fingerprints.

Next, to observe whether the fitness fingerprints impact cancer growth, the team analysed the expression of these four types in different types of tissue: malignant cancer (breast and colon), benign tumours (breast and colon), tissue adjacent to the tumour and normal tissue.

Their analysis revealed several striking findings: in normal tissue, the expression of Win was overall quite sparse, and expression of Lose was even lower. In contrast, the expression of Win was significantly increased in all tumours, with higher levels in malignant tumours than in benign ones.

But more alarmingly, tumours seemed to be transforming the fitness level exhibited by the neighbouring tissue to their advantage: "expression of Lose was significantly higher in tissue adjacent to tumours, when compared to normal tissue. Moreover, Lose levels were higher in tissue adjacent to malignant tumours than around benign tumours", Gogna explains. "In fact, further statistical analysis showed that expression levels of Win in cancer and Lose in the neighbouring tissue can predict cancer malignancy accurately 86.3% of the time."
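The release does not specify the statistical model behind the 86.3% figure. Purely as a hedged illustration of the idea, the toy sketch below simulates hypothetical Win and Lose expression levels and shows how a simple combined score could separate malignant from benign cases; every number in it is invented for the sketch, not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200

# Hypothetical simulated expression levels (arbitrary units):
# malignant cases get higher Win in the tumour and higher Lose in the
# adjacent tissue than benign cases, mirroring the reported pattern.
malignant = rng.integers(0, 2, n).astype(bool)
win_tumor = rng.normal(np.where(malignant, 3.0, 1.5), 0.8)
lose_adjacent = rng.normal(np.where(malignant, 2.5, 1.0), 0.8)

# A simple combined score and threshold stand in for the authors'
# (unspecified) statistical model.
score = win_tumor + lose_adjacent
predicted = score > np.median(score)
accuracy = np.mean(predicted == malignant)
print(f"toy accuracy: {accuracy:.2f}")
```

Even this crude two-feature score classifies most simulated cases correctly, which is the qualitative point the quoted statistic makes: Win in the tumour plus Lose in the neighbouring tissue together carry strong predictive signal.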

On the road to potential therapies

The team's findings strongly suggested that high expression of Win fingerprints in the tumor, in association with high expression of Lose in the surrounding tissue, is a prerequisite for tumor growth. So they decided to test the effect of blocking this mechanism. To do that, they implanted grafts from human tumours in mice and knocked out (that is, cancelled) the expression of fitness fingerprints.

The results were encouraging: "we found that this manipulation significantly reduced the volume of the tumours, showing that it diminished the destructive power of the tumour against its host tissue. However, this approach alone does not eliminate the cancer cells, just slows down their progress", Madan explains.

Next, to test the full therapeutic potential of this approach, the team decided to combine blocking fitness fingerprints expression with chemotherapy. This two-pronged approach was very successful: "we were able to further reduce, and in some cases completely eliminate, tumorigenesis!", says Gogna.

According to Moreno, this is yet another example of basic, curiosity-based research that ends up having important implications for human health. "When we began studying cell competition in the fruit fly, we were addressing it as a basic biology question: how do tissues eliminate viable, but suboptimal, cells? From there to potential cancer therapies seems like an almost unlikely development, but this is how research works; you start with the curiosity to know how things work and from there, sometimes, you find yourself on the road to potential novel therapies."

Next, Moreno's team is planning to study this mechanism more deeply, while continuing to collaborate with clinicians for the development of future cancer drugs. "These findings are very encouraging, but they are still preliminary and it will be some years before we are able to use them to help cancer patients", he concludes.

Credit: 
Champalimaud Centre for the Unknown

New study identifies causes of multidecadal climate changes

image: Global climate change threatens the continued existence of glaciers, such as the Aletsch Glacier in Switzerland, photographed in July 2015.

Image: 
Anupma Gupta/Michael Evans

A new reconstruction of global average surface temperature change over the past 2,000 years has identified the main causes for decade-scale climate changes. The analysis suggests that Earth's current warming rate, caused by human greenhouse gas emissions, is higher than any warming rate observed previously. The researchers also found that airborne particles from volcanic eruptions were primarily responsible for several brief episodes of global cooling prior to the Industrial Revolution of the mid-19th century.

The new temperature reconstruction also largely agrees with climate model simulations of the same time period. The researchers found agreement for temperature changes caused by identifiable factors, such as volcanic aerosols and greenhouse gases, as well as for random fluctuations in climate that take place on the same timescales. This suggests that current climate models accurately represent the contributions of various influences on global climate change--and are capable of correctly predicting future climate warming.

The research team--19 members of the Past Global Changes (PAGES) project, including University of Maryland Geology Associate Professor Michael Evans--used seven different statistical methods to perform the reconstruction. The results are published online July 24, 2019, in the journal Nature Geoscience.

"Our reconstructions look like the 'hockey stick' diagram of global temperature change that was first reconstructed more than two decades ago," said Evans, who is also co-chair of PAGES and has a joint appointment at UMD's Earth System Science Interdisciplinary Center (ESSIC). "Thanks to the work of the PAGES community, we have much more data now. The results were consistent regardless of how we created the reconstructions or which randomly chosen subset of input data we used."

The new 2,000-year reconstruction improves on previous efforts by using the most detailed and comprehensive database of its kind yet assembled. This dataset, painstakingly compiled by PAGES researchers, includes nearly 700 separate publicly available records from sources that contain indicators of past temperatures, such as long-lived trees, reef-building corals, ice cores, and marine and lake sediments. The data are sourced from all of Earth's continental regions and major ocean basins.

By comparing the new reconstructions with existing climate simulations generated using the Coupled Model Intercomparison Project 5 (CMIP5) climate models, the PAGES research team was able to determine the relative contributions of several influences on global temperatures over time. These included natural influences, such as fluctuations in solar heating and the cooling effect of particles ejected by volcanic eruptions, as well as the human-caused influence of greenhouse gas emissions.

The results suggest that volcanic activity was responsible for variations before about 1850. After that, greenhouse gases became the dominant influence on global climate. By removing these influences in their analysis, the researchers also identified the magnitude of the random changes that cannot be traced to a specific cause. The team's data-based reconstructions also agreed with model simulations when evaluating these random changes.

"This makes us more confident that our reconstructions are realistic and, in turn, that the climate models are simulating past and future climate warming faithfully," Evans added.

This agreement between the researchers' data-based reconstructions and the CMIP5 simulations suggests that existing climate models can accurately predict future global temperature change over the next few decades, according to Evans. However, these simulations depend heavily on the choices that humans make in the future, which are very difficult to predict, Evans added.

"The uncertainty in the influence of human activities is not so large when looking forward only a few decades," Evans said. "But in the longer term, the choices we make for our energy sources and how much carbon these sources emit really matter."

Credit: 
University of Maryland

Unlocking therapies for hard-to-treat lung cancers

image: From left: Marc Montminy, Laura Rodón and Reuben Shaw.

Image: 
Salk Institute

LA JOLLA--(July 24, 2019) A new Salk Institute study, published on July 24, 2019, in the journal Science Advances, shows that researchers could target hard-to-treat non-small-cell lung cancers by pursuing drugs that keep a cellular "switch," called CREB, from triggering tumor growth. The study was led by Marc Montminy, professor and J.W. Kieckhefer Foundation Chair at Salk, in close collaboration with Professor Reuben Shaw, director of the Salk Cancer Center and William R. Brody Chair.

"A drug that blocks this switch could have therapeutic benefits for patients with non-small-cell lung cancer," says Montminy. "This disease has eluded efforts to identify effective treatments."

Shaw adds, "There's really no good treatment, so any insight that helps this subset of patients is a major advance."

Scientists have studied CREB for decades. The molecule, known as a transcription factor because it binds to DNA to change gene transcription, has a key role in directing which proteins a cell can make.

The Montminy and Shaw laboratories at Salk focused on the role of CREB in patients with diabetes. Over the years, more and more research has suggested that CREB is important in cancer, but no one knew exactly how CREB affects cancer growth--until recently.

Laura Rodón, a postdoctoral researcher in the Montminy lab, wanted to look at which genes CREB binds to in patients with non-small-cell lung cancer (NSCLC) to understand how CREB influences cancer--and reveal potential new drug targets. To do so, the team examined how NSCLC cell lines grew in a mouse model, studied the tumors and correlated the results with data from tumors in patients. They discovered that CREB and its partner, CRTC2, are activated in a subset of NSCLC tumors.

Normally, a tumor suppressor gene called LKB1 would block this activation--but this checkpoint is gone in patients with an altered LKB1 gene. In these patients, CRTC2 is abnormally activated and stimulates genes that contribute to lung cancer. In particular, follow-up experiments showed that CRTC2 mistakenly turns on another gene called ID1, which is known to cause cancer in other tissues.

"It was an exciting finding to show how CREB ultimately contributes to this deadly type of cancer," says Rodón. "This gives weight to the idea that if we were able to turn off that CREB switch, we'd be able to help patients."

The next step in this research is to look into potential drugs that can interfere with CREB or CRTC2 in this subset of non-small-cell lung cancer patients. Luckily, past studies that aimed to block CREB as a way of helping diabetes patients offer a suite of new possibilities for cancer treatments. Shaw says biomedical companies may have promising NSCLC drugs on hand and not even realize it.

"There are a lot of interesting findings in this space," says Shaw. "Hopefully in the next couple years, we'll know a lot more about treating these patients."

The team agrees that this study is a great example of how laboratories at Salk work together to embrace new projects.

"Salk encourages collaborations," says Montminy. "That makes it very easy to do studies like this that require people with different expertise to work together."

Other authors include Robert U. Svensson, Ezra Wiater, Matthew G.H. Chun, Wen-Wei Tsai and Lillian J. Eichner of Salk.

The work was funded by The National Institutes of Health (R01 DK083834, R35CA220538, P01CA120964), The Leona M. and Harry B. Helmsley Charitable Trust (grant #2012-PG-MED002), the Clayton Foundation for Medical Research, the Kieckhefer Foundation, the Tobacco-Related Disease Research Program (25FT-0006), the American Cancer Society (ACS#124183-PF-13-023-01-CSM, PF-15-037-01-DMC) and Salk (A014195).

Journal

Science Advances

Credit: 
Salk Institute

Hydration sensor could improve dialysis

For patients with kidney failure who need dialysis, removing fluid at the correct rate and stopping at the right time is critical. This typically requires guessing how much water to remove and carefully monitoring the patient for sudden drops in blood pressure.

Currently there is no reliable, easy way to measure hydration levels in these patients, who number around half a million in the United States. However, researchers from MIT and Massachusetts General Hospital have now developed a portable sensor that can accurately measure patients' hydration levels using a technique known as nuclear magnetic resonance (NMR) relaxometry.

Such a device could be useful for not only dialysis patients but also people with congestive heart failure, as well as athletes and elderly people who may be in danger of becoming dehydrated, says Michael Cima, the David H. Koch Professor of Engineering in MIT's Department of Materials Science and Engineering.

"There's a tremendous need across many different patient populations to know whether they have too much water or too little water," says Cima, who is the senior author of the study and a member of MIT's Koch Institute for Integrative Cancer Research. "This is a way we could measure directly, in every patient, how close they are to a normal hydration state."

The portable device is based on the same technology as magnetic resonance imaging (MRI) scanners but can obtain measurements at a fraction of the cost of MRI, and in much less time, because there is no imaging involved.

Lina Colucci, a former graduate student in health sciences and technology, is the lead author of the paper, which appears in the July 24 issue of Science Translational Medicine. Other authors of the paper include MIT graduate student Matthew Li; MGH nephrologists Kristin Corapi, Andrew Allegretti, and Herbert Lin; MGH research fellow Xavier Vela Parada; MGH Chief of Medicine Dennis Ausiello; and Harvard Medical School assistant professor in radiology Matthew Rosen.

Hydration status

Cima began working on this project about 10 years ago, after realizing that there was a critical need for an accurate, noninvasive way to measure hydration. Currently, the available methods are either invasive, subjective, or unreliable. Doctors most frequently assess overload (hypervolemia) by a few physical signs such as examining the size of the jugular vein, pressing on the skin, or examining the ankles where water might pool.

The MIT team decided to try a different approach, based on NMR. Cima had previously launched a company called T2 Biosystems that uses small NMR devices to diagnose bacterial infections by analyzing patient blood samples. One day, he had the idea to use the devices to try to measure water content in tissue, and a few years ago, the researchers got a grant from the MIT-MGH Strategic Partnership to do a small clinical trial for monitoring hydration. They studied both healthy controls and patients with end-stage renal disease who regularly underwent dialysis.

One of the main goals of dialysis is to remove fluid in order to bring patients to their "dry weight," which is the weight at which their fluid levels are optimized. Determining a patient's dry weight is extremely challenging, however. Doctors currently estimate dry weight based on physical signs as well as through trial-and-error over multiple dialysis sessions.

The MIT/MGH team showed that quantitative NMR, which works by measuring a property of hydrogen atoms called T2 relaxation time, can provide much more accurate measurements. The T2 signal measures both the environment and quantity of hydrogen atoms (or water molecules) present.

"The beauty of magnetic resonance compared to other modalities for assessing hydration is that the magnetic resonance signal comes exclusively from hydrogen atoms. And most of the hydrogen atoms in the human body are found in water molecules," Colucci says.

The researchers used their device to measure fluid volume in patients before and after they underwent dialysis. The results showed that this technique could distinguish healthy patients from those needing dialysis with just the first measurement. In addition, the measurement correctly showed dialysis patients moving closer to a normal hydration state over the course of their treatment.

Furthermore, the NMR measurements were able to detect the presence of excess fluid in the body before traditional clinical signs -- such as visible fluid accumulation below the skin -- were present. The sensor could be used by physicians to determine when a patient has reached their true dry weight, and this determination could be personalized at each dialysis treatment.
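At its core, T2 relaxometry of the kind described above fits an exponential decay to the measured NMR signal and reads hydration from the decay constant. The sketch below shows a minimal mono-exponential fit on simulated data; the echo times, T2 value, and noise level are illustrative assumptions, not values from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def t2_decay(t, amplitude, t2):
    """Mono-exponential NMR signal decay: S(t) = A * exp(-t / T2)."""
    return amplitude * np.exp(-t / t2)

# Simulated echo-train decay; in practice longer T2 corresponds to
# more free water in the tissue.
t = np.linspace(0, 400, 80)   # echo times, ms (illustrative)
true_t2 = 120.0               # ms (illustrative)
signal = (t2_decay(t, 1.0, true_t2)
          + 0.01 * np.random.default_rng(2).standard_normal(t.size))

(amp, t2_fit), _ = curve_fit(t2_decay, t, signal, p0=(1.0, 100.0))
print(f"fitted T2: {t2_fit:.0f} ms")
```

Real tissue typically requires multi-component fits, since water bound in cells and free extracellular fluid relax at different rates, but the single-component version captures the principle of reading fluid status from relaxation times.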

Better monitoring

The researchers are now planning additional clinical trials with dialysis patients. They expect that dialysis, which currently costs the United States more than $40 billion per year, would be one of the biggest applications for this technology. This kind of monitoring could also be useful for patients with congestive heart failure, which affects about 5 million people in the United States.

"The water retention issues of congestive heart failure patients are very significant," Cima says. "Our sensor may offer the possibility of a direct measure of how close they are to a normal fluid state. This is important because identifying fluid accumulation early has been shown to reduce hospitalization, but right now there are no ways to quantify low-level fluid accumulation in the body. Our technology could potentially be used at home as a way for the care team to get that early warning."

In their study of the healthy control subjects, the researchers also incidentally discovered that they could detect dehydration. This could make the device useful for monitoring elderly people, who often become dehydrated because their sense of thirst lessens with age, or athletes taking part in marathons or other endurance events. The researchers are planning future clinical trials to test the potential of their technology to detect dehydration.

Credit: 
Massachusetts Institute of Technology