Culture

Economists find carbon footprint grows with parenthood

Increased time constraints and the need for convenience in raising children appear to offset parents' concerns about the future when it comes to their carbon footprints, according to new research by University of Wyoming economists and a colleague in Sweden.

UW's Jason Shogren and Linda Thunstrom, along with Jonas Nordstrom of the Lund University School of Economics and Management, have documented that two-adult households with children emit over 25 percent more carbon dioxide than two-adult households without children. Their research appears April 15 in PLOS ONE, a journal published by the Public Library of Science.

"While having children makes people focus more on the future and, presumably, care more about the environment, our study suggests that parenthood does not cause people to become 'greener,'" Shogren and Thunstrom say. "In fact, the difference in CO2 emissions between parents and non-parents is substantial, and that's primarily because of increased transportation and food consumption changes."

The study involved an analysis of expenditures on goods and services by households in Sweden. The researchers found that parents with children at home consume goods and services that emit CO2 in the areas of food, such as meat, and transportation, such as gasoline, at higher rates than childless households.

The economists note that time constraints become more binding, and convenience may become more important, when people have children.

"Parents may need to be in more places in one day," resulting in people driving themselves instead of using public transportation or bicycling, the researchers wrote. "They also need to feed more people. Eating more pre-prepared, red meat carbon-intensive meals may add convenience and save time."

The disparity in the carbon footprints of Swedish households with and without children is particularly striking, as concerns about climate change are more pronounced in Sweden than most other developed countries. Most Swedes believe climate change is real and have accepted sizable CO2 taxes, and households with children are subsidized, which helps to alleviate some of the time crunch for parents. Sweden has generous parental leave and subsidized day care, and parents have a legal right to reduced work hours.

"If we're finding these results in Sweden, it's pretty safe to assume that the disparity in carbon footprints between parents and nonparents is even bigger in most other Western countries," Thunstrom says -- though she notes that Sweden also has one of the world's highest female labor participation rates, which may add to the time constraints of households with children.

"Becoming a parent can transform a person -- he or she thinks more about the future and worries about future risks imposed on their children and progeny," Shogren says. "But, while having children might be transformational, our results suggest that parents' concerns about climate change do not cause them to be 'greener' than non-parent adults."

Credit: 
University of Wyoming

T2K results restrict possible values of neutrino CP phase

image: The arrow indicates the value most compatible with the data. The gray region is disfavored at 99.7% confidence level. Nearly half of the possible values are excluded.

Image: 
The T2K Collaboration

The T2K Collaboration has published new results showing the strongest constraint yet on the parameter that governs the breaking of the symmetry between matter and antimatter in neutrino oscillations. Using beams of muon neutrinos and muon antineutrinos, T2K has studied how these particles and antiparticles transition into electron neutrinos and electron antineutrinos, respectively. The parameter governing the matter/antimatter symmetry breaking in neutrino oscillation, called the δcp phase, can take a value from -180° to 180°. For the first time, T2K has disfavored almost half of the possible values at the 99.7% (3σ) confidence level, and is starting to reveal a basic property of neutrinos that has not been measured until now. This is an important step on the way to knowing whether or not neutrinos and antineutrinos behave differently. These results, using data collected through 2018, were published in the multidisciplinary scientific journal Nature on April 16.

For most phenomena, the laws of physics provide a symmetric description of the behavior of matter and antimatter. However, this symmetry does not hold universally. The effect of the asymmetry between matter and antimatter is most apparent in the observation of the universe, which is composed of matter with little antimatter. It is considered that equal amounts of matter and antimatter were created at the beginning of the universe. Then, for the universe to evolve to a state where matter dominates over antimatter, a necessary condition is the violation of the so-called Charge-Parity (CP) symmetry. Until now, CP symmetry violation has only been observed in the physics of subatomic particles called quarks, but the magnitude of the CP symmetry violation is not large enough to explain the observed dominance of matter over antimatter in the universe. T2K is now searching for a new source of CP symmetry violation in neutrino oscillations that would manifest as a difference in the measured oscillation probability for neutrinos and antineutrinos.

The T2K experiment uses a beam consisting primarily of muon neutrinos or muon antineutrinos created using the proton beam from the Japan Proton Accelerator Research Complex (J-PARC) located in Tokai village on the east coast of Japan. A small fraction of the neutrinos (or antineutrinos) are detected 295 km away at the Super-Kamiokande detector, located under a mountain in Kamioka, near the west coast of Japan. As the muon neutrinos and muon antineutrinos traverse the distance from Tokai to Kamioka (hence the name T2K), a fraction will oscillate or change flavor into electron neutrinos or electron antineutrinos respectively. The electron neutrinos and electron antineutrinos are identified in the Super-Kamiokande detector by the rings of Cherenkov light that they produce (shown below). While Super-Kamiokande cannot identify each event as a neutrino or antineutrino interaction, T2K is able to study the neutrino and antineutrino oscillations separately by operating the beam in neutrino mode or antineutrino mode.

T2K released a result analysing data corresponding to 1.49×10²¹ and 1.64×10²¹ protons from the accelerator for neutrino beam mode and antineutrino beam mode, respectively. If the parameter δcp equals 0° or 180°, neutrinos and antineutrinos change their types (from muon to electron) in the same way during oscillation. The δcp parameter may instead take a value that enhances the oscillations of neutrinos or of antineutrinos, breaking CP symmetry. However, the observation of neutrinos is already enhanced in the T2K experiment by the fact that the detectors and beam line components are made out of matter and not antimatter. To separate the effect of δcp from known beam line and interaction effects, the T2K analysis includes corrections based on data from magnetized near detectors (ND280) placed 280 m from the target. T2K observed 90 electron neutrino candidates and 15 electron antineutrino candidates. For maximal neutrino enhancement (δcp = -90°), T2K expects 82 electron neutrino events and 17 electron antineutrino events; for maximal antineutrino enhancement (δcp = +90°), it expects 56 electron neutrino events and 22 electron antineutrino events. The observed number of events as a function of the reconstructed neutrino energy is shown below. The T2K data are most compatible with a value close to δcp = -90°, which significantly enhances the oscillation probability of neutrinos in the T2K experiment. Using these data, T2K evaluates confidence intervals for the parameter δcp. The region disfavored at the 3σ (99.7%) confidence level is -2° to 165°. This result represents the strongest constraint on δcp to date. The CP-conserving values of 0° and 180° are disfavored at the 95% confidence level, as was the case in T2K's previous release in 2017, indicating that CP symmetry may be violated in neutrino oscillations.
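As a rough sanity check on the figures quoted above, the "almost half" claim and the preference for δcp near -90° follow directly from the numbers in the text. This is an illustrative sketch only: the interval endpoints and event counts are copied from the release, and the simple count comparison below stands in for T2K's full likelihood analysis.

```python
# Disfavored delta_CP interval at 3 sigma, in degrees (values from the text).
lo, hi = -2.0, 165.0
excluded_fraction = (hi - lo) / 360.0          # share of the full -180..180 range
print(f"{excluded_fraction:.1%} of possible delta_CP values are disfavored")

# Observed candidate counts vs. the two maximal-enhancement expectations.
observed = {"nu_e": 90, "nubar_e": 15}
expected = {
    "delta_CP = -90 (max nu enhancement)":    {"nu_e": 82, "nubar_e": 17},
    "delta_CP = +90 (max nubar enhancement)": {"nu_e": 56, "nubar_e": 22},
}

# Crude squared-distance in event counts; the real analysis is a full fit.
closest = min(
    expected,
    key=lambda k: sum((observed[c] - expected[k][c]) ** 2 for c in observed),
)
print("observed counts sit closest to:", closest)
```

The 167° disfavored span is about 46% of the full range, matching "nearly half of the possible values are excluded" in the figure caption.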

While this result shows a strong preference for enhancement of the neutrino rate in T2K, it is not yet clear if CP symmetry is violated or not. To further improve the experimental sensitivity to a potential CP symmetry violating effect, the T2K Collaboration will upgrade the near detector suite to reduce systematic uncertainties and accumulate more data, and J-PARC will increase the beam intensity by upgrading the accelerator and beamline.

Credit: 
Stony Brook University

Is it bloating or is it a heart attack?

A patient hospitalized for metastatic Hodgkin lymphoma with significant abdominal distention displayed sudden onset of ST-segment elevations -- often an indicator of a heart attack. However, the ST-segment changes resolved when the cardiovascular care team pressed on the abdomen during a standard exam, according to a case published in JACC: Case Reports.

"It is important to be aware that, while rare, acute gastrointestinal distention can cause ST-segment changes on an ECG. Clinicians must distinguish these cases from true heart attacks to prevent unnecessary treatment and invasive procedures whenever possible," said Enrique Ostrzega, MD, a cardiologist at the University of Southern California and the senior author of the case report.

Clinicians at the University of Southern California, Los Angeles, admitted a previously healthy 41-year-old male with three weeks of lower extremity swelling, fatigue and shortness of breath. He was taken to the intensive care unit (ICU) and later diagnosed with metastatic Hodgkin lymphoma.

While under sedation on a ventilator in the ICU, the patient's cardiac monitor displayed sudden onset of ST-segment elevation. The ST segment is the interval between ventricular depolarization and repolarization of the heartbeat on an electrocardiogram (ECG). Elevation of this segment can be a sign of a heart attack and is typically known as a "STEMI" (ST-Elevation Myocardial Infarction), which is a serious and high-risk presentation of a heart attack. An ECG in this case was interpreted by a computer algorithm as a heart attack, leading the treating clinicians to call for an emergent cardiology consult.

The patient's physical examination showed significant abdominal distention. When an ultrasound probe was placed on the patient's upper abdomen, the previously noted ST-segment elevations abruptly resolved on the cardiac monitor. These dynamic changes were confirmed with an ECG. There was immediate resolution of the ST-segment elevation when gentle palpation of the abdomen was performed. The consulting cardiology team suspected the cause may be related to the patient's abdominal distention and performed an abdominal X-ray which revealed significant gastric distention. After a nasogastric tube was placed for gastric decompression an ECG confirmed resolution of ST-segment elevations with no further documented ST-segment abnormalities or evidence of cardiac dysfunction.

According to the report, when presented with an atypical case of ST-segment elevation on ECG, thorough examination and history are vital as other disorders can mimic a STEMI pattern on ECG. Some noncardiac causes of ST-segment elevation include pancreatitis, community-acquired pneumonia and intracranial bleeding.

In most previously reported cases of gastrointestinal distention causing a STEMI pattern on ECG, coronary angiography confirmed no evidence of obstructive coronary artery disease, and in all reported cases the ST-segment elevation resolved with management and resolution of the underlying gastrointestinal issue. According to the report authors, to their knowledge this is the first case in which a direct physical maneuver (abdominal palpation) elicited an acute reversal of ST-segment elevations. They believe the gastric distention exerted a direct compressive effect on the heart, and the application of mild pressure on the abdomen relieved or shifted that effect.

"Careful clinical examination and setting up the correct differential diagnosis is the cornerstone of the treatment of every patient. Here--starting from an important observation-- [the authors] treated the patient without needing interventional procedures," said Julia Grapsa, MD, PhD, FACC, editor-in-chief of JACC: Case Reports.

Credit: 
American College of Cardiology

Examining associations between ages of parents, grandparents and autism risk in children

What The Study Did: Older age for parents has been associated with autism spectrum disorders (ASDs) in children; however, little is known about the association between the age of grandparents at the time of the birth of the parent and the risk of ASD in the grandchildren. This association was investigated in an observational study with the use of data from Danish national health registries that included three generations and 1.4 million children born from 1990 to 2013.

Authors: Zeyan Liew, Ph.D., M.P.H., of the Yale School of Public Health in New Haven, Connecticut, is the corresponding author.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

(doi:10.1001/jamanetworkopen.2020.2868)

Editor's Note: The article includes funding/support disclosures. Please see the articles for additional information, including other authors, author contributions and affiliations, conflicts of interest and financial disclosures, and funding and support.

Credit: 
JAMA Network

How common is racial/ethnic discrimination in US surgical residency programs?

What The Study Did: Surveys from nearly 7,000 resident surgeons were used to evaluate how common racial/ethnic discrimination is in U.S. general surgery programs and how it's associated with burnout, thoughts of quitting and suicide.

Authors: Yue-Yung Hu, M.D., M.P.H., of the Feinberg School of Medicine at Northwestern University in Chicago, is the corresponding author.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

(doi:10.1001/jamasurg.2020.0260)

Editor's Note: The article includes conflict of interest and funding/support disclosures. Please see the articles for additional information, including other authors, author contributions and affiliations, conflicts of interest and financial disclosures, and funding and support.

Credit: 
JAMA Network

Prescribing an overdose: A chapter in the opioid epidemic

ROCHESTER, Minn. -- Research indicates that widespread opioid overprescribing contributed to the opioid epidemic. New research shows that this dangerous trend has apparently been coupled with another: inappropriate use of high-potency opioids.

A multi-institution research collaboration led by Mayo Clinic will publish its findings Wednesday, April 15, in JAMA Network Open. The study showed that more than half of Americans starting the most highly regulated opioids might be receiving inappropriate treatment.

"In pain management, there is a need to use a variety of treatment options, including -- when appropriate -- extended-release opioids and very strong immediate-acting opioids like fentanyl," says W. Michael Hooten, M.D., a Mayo Clinic anesthesiologist and pain medicine specialist. "However, these particular medications can cause a number of serious adverse effects, so extra safeguards are needed when these medications are prescribed." Dr. Hooten is a study co-author.

"One of the key factors in determining whether these drugs can be used safely is the presence of opioid tolerance in the patient who was prescribed one of these medications," says Dr. Hooten. "In other words, tolerance to some of the most dangerous adverse effects of opioids, including suppressing breathing and excessive sedation, develops only after a patient takes daily doses of opioids over time. Patients who are not opioid-tolerant should not be receiving high-potency fentanyl or extended-release opioid products because they are susceptible to these life-threatening adverse effects."

The medications examined in the study included high-dose, extended-release oxycodone; all doses of extended-release hydromorphone; fentanyl patches; and all varieties of transmucosal -- oral or nasal delivery -- fentanyl.

Data behind findings

To determine whether these medications were inappropriately used across the U.S., the study team used pharmacy and medical claims data, and linked electronic health records from the OptumLabs Data Warehouse. OptumLabs is a collaborative center for research and innovation co-founded by Optum Inc. and Mayo Clinic, and focused on improving patient care and patient value.

Examining pharmacy and medical claims data from 2007 to 2016, the investigators identified nearly 300,000 instances of prescriptions during that time period that were for medications reserved for people with opioid tolerance. They removed records of people who had recently been hospitalized or who had an opioid poisoning diagnosis within the preceding six months. They also removed people who did not have at least six months of continuous insurance claims information at the time of the prescription and people with certain missing demographic information.

The remaining 153,385 instances of new outpatient prescriptions of these reserved medications occurred among 131,756 people from across the U.S.

Less than 48% of these showed evidence of prior opioid tolerance.
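A quick back-of-the-envelope calculation on the counts above makes the scale concrete. This is approximate by construction: the release gives the tolerance share only as "less than 48%," so the result is a lower bound, and the paper's exact counts may differ.

```python
# Figures quoted in the release (approximate; see the paper for exact counts).
episodes = 153_385          # new outpatient prescriptions of reserved opioids
people = 131_756            # distinct patients behind those episodes
tolerance_share = 0.48      # "less than 48%" showed prior opioid tolerance

no_tolerance = episodes * (1 - tolerance_share)
print(f"episodes without evidence of tolerance: at least ~{no_tolerance:,.0f}")
print(f"average prescriptions per patient: {episodes / people:.2f}")
```

That is roughly 80,000 prescribing episodes, over a decade, of high-potency opioids to patients with no claims evidence of the tolerance the drug labels require.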

"Our findings are concerning because it appears that many people starting to use these drugs may be at risk for some quite serious outcomes," says Molly Jeffery, Ph.D., the study's lead author. "In general, physicians are allowed to prescribe drugs off-label -- that is, without adhering to the indications or warnings included in the drug label. But these particular drugs are considered risky enough that the FDA (Food and Drug Administration) requires manufacturers to provide additional oversight and education to physicians to make sure they understand the risks associated with the drugs." Dr. Jeffery is also the scientific director of Emergency Medicine Research at Mayo Clinic.

Furthermore, regulations for one of the drug classes that the team studied -- transmucosal immediate-release fentanyl, or TIRFs -- require physicians who prescribe, and pharmacists who dispense, the drugs to complete a certification process and enroll each patient.

"This process is meant to ensure patient safety while preserving access to the drugs for people who really need them," she says. "All of the drugs we studied have appropriate uses, but that does require that the physicians prescribing the drugs know the risks."

A key contribution of this study is that researchers were able to look for additional evidence of opioid tolerance in linked electronic health records for about 15% of the group, or 20,044 patients. They determined that between 0.5% and 4% of episodes of use of the reserved medications had additional information in the patient's record that showed evidence of tolerance not indicated in claims data. This additional evidence includes prescriptions showing up in the electronic health record but not in claims.

"Critiques of prior studies using only claims data raised the possibility that patients might have opioid prescriptions that don't show up in insurance claims," notes Dr. Jeffery. "For example, if a person paid for a prescription with cash instead of submitting it to insurance."

Prescriptions noted in the electronic health record may, or may not, have been filled.

"We used natural language processing -- a type of artificial intelligence -- to look through physician notes in the patients' health records, and also looked at information about prescriptions written," says Dr. Jeffery. "We did not find substantial additional evidence that patients were opioid-tolerant when they started these drugs."

"There was no evidence of opioid tolerance in more than half of the patients in our study," Dr. Jeffery says. "Given how common that was, we wanted to use the clinical notes to look for reasons why physicians would have prescribed these drugs in people who were not opioid-tolerant."

The team's analysis of clinical notes didn't give any insight into physicians' reasons for prescribing to people who were not opioid-tolerant. However, the physicians on the research team had some guesses about what might be behind these prescriptions.

"My colleagues and I discussed the possibility that, in particular, fentanyl patches might have been prescribed for patients who have serious medical or surgical problems that limit the ability to swallow oral medications," says Dr. Hooten. "Also, fentanyl patches contain a three-day supply of the drug, so patches could be a possible solution for patients who may not be able to take oral medications on a scheduled basis."

Dr. Hooten hopes his fellow clinicians pay attention to these findings.

"I often treat people mired in addiction," he says. "As physicians, our first charge is to do no harm, and with opioids -- especially this group of medications -- the risk of harm is very real."

"It concerns me that I might have patients coming to me whose substance abuse disorder was exacerbated by inappropriate prescribing practices."

Credit: 
Mayo Clinic

First Gulf-wide survey of oil pollution completed 10 years after Deepwater Horizon

video: The University of South Florida (USF) conducted the first comprehensive baseline study of oil pollution in the Gulf of Mexico. USF marine scientists surveyed 10,000 fish and found oil exposure in all of them - the highest levels were detected in yellowfin tuna, golden tilefish and red drum.

Image: 
University of South Florida

ST. PETERSBURG, Fla. (April 15, 2020) -- Since the 2010 BP oil spill, marine scientists at the University of South Florida (USF) have sampled more than 2,500 individual fish representing 91 species from 359 locations across the Gulf of Mexico and found evidence of oil exposure in all of them, including some of the most popular types of seafood. The highest levels were detected in yellowfin tuna, golden tilefish and red drum.

The study, just published in Scientific Reports, represents the first comprehensive, Gulf-wide survey of oil pollution launched in response to the Deepwater Horizon spill. It was funded by a nearly $37 million grant from the independent Gulf of Mexico Research Initiative (GoMRI) to establish the Center for Integrated Modeling and Analysis of Gulf Ecosystems (C-IMAGE), an international consortium of professors, post-doctoral scholars and students from 19 collaborating institutions.

Over the last decade, USF scientists conducted a dozen research expeditions to locations off the United States, Mexico and Cuba examining levels of polycyclic aromatic hydrocarbons (PAHs), the most toxic chemical component of crude oil, in the bile of the fish. Bile is produced by the liver to aid in digestion, but it also acts as storage for waste products.

"We were quite surprised that among the most contaminated species was the fast-swimming yellowfin tuna as they are not found at the bottom of the ocean where most oil pollution in the Gulf occurs," said lead author Erin Pulster, a researcher in USF's College of Marine Science. "Although water concentrations of PAHs can vary considerably, they are generally found at trace levels or below detection limits in the water column. So where is the oil pollution we detected in tunas coming from?"

Pulster says it makes sense that tilefish have higher concentrations of PAH because they live their entire adult lives in and around burrows they excavate on the seafloor, and PAHs are routinely found in Gulf sediment. However, their exposure has been increasing over time, as has that of other species, including groupers, some of Florida's most economically important fish. In a separate USF-led study, her team measured the concentration of PAHs in the liver tissue and bile of 10 popular grouper species. The yellowedge grouper had a concentration that increased more than 800 percent from 2011 to 2017.

Fish with the highest concentrations of PAH were found in the northern Gulf of Mexico, a region of increased oil and gas activity and in the vicinity of the Deepwater Horizon spill that gushed nearly four million barrels of oil over the course of three months in 2010. Oil-rich sediments at the bottom where much of the oil settled are resuspended by storms and currents, re-exposing bottom-dwelling fish.

Oil pollution hot spots were also found off major population centers, such as Tampa Bay, suggesting that runoff from urbanized coasts may play a role in the higher concentrations of PAHs. Other sources include chronic low-level releases from oil and gas platforms, fuel from boats and airplanes and even natural oil seeps -- fractures on the seafloor that can ooze the equivalent of millions of barrels of oil per year.

"This was the first baseline study of its kind, and it's shocking that we haven't done this before given the economic value of fisheries and petroleum extraction in the Gulf of Mexico," said Steven Murawski, professor of fisheries biology at USF, who led the international research effort.

Despite the detected trends of oil contamination in fish bile and liver, fish from the Gulf of Mexico are rigorously tested for contaminants to ensure public safety, and they remain safe to eat because oil contaminants in fish flesh are well below public health advisory levels. Chronic PAH exposure, however, can prevent the liver from functioning properly, resulting in a decline in overall fish health.

These studies were made possible by BP's 10-year, $500 million commitment to fund independent research on the long-term effects of the Deepwater Horizon spill administered by the Gulf of Mexico Research Initiative. This year marks the end of that funding.

"Long-term monitoring studies such as these are important for early warning of oil pollution leaks and are vital for determining impacts to the environment in the case of future oil spills," Pulster said.

Credit: 
University of South Florida

Nanosensor can alert a smartphone when plants are stressed

CAMBRIDGE, MA -- MIT engineers have developed a way to closely track how plants respond to stresses such as injury, infection, and light damage, using sensors made of carbon nanotubes. These sensors can be embedded in plant leaves, where they report on hydrogen peroxide signaling waves.

Plants use hydrogen peroxide to communicate within their leaves, sending out a distress signal that stimulates leaf cells to produce compounds that will help them repair damage or fend off predators such as insects. The new sensors can use these hydrogen peroxide signals to distinguish between different types of stress, as well as between different species of plants.

"Plants have a very sophisticated form of internal communication, which we can now observe for the first time. That means that in real-time, we can see a living plant's response, communicating the specific type of stress that it's experiencing," says Michael Strano, the Carbon P. Dubbs Professor of Chemical Engineering at MIT.

This kind of sensor could be used to study how plants respond to different types of stress, potentially helping agricultural scientists develop new strategies to improve crop yields. The researchers demonstrated their approach in eight different plant species, including spinach, strawberry plants, and arugula, and they believe it could work in many more.

Strano is the senior author of the study, which appears today in Nature Plants. MIT graduate student Tedrick Thomas Salim Lew is the lead author of the paper.

Embedded sensors

Over the past several years, Strano's lab has been exploring the potential for engineering "nanobionic plants" -- plants that incorporate nanomaterials that give the plants new functions, such as emitting light or detecting water shortages. In the new study, he set out to incorporate sensors that would report back on the plants' health status.

Strano had previously developed carbon nanotube sensors that can detect various molecules, including hydrogen peroxide. About three years ago, Lew began working on trying to incorporate these sensors into plant leaves. Studies in Arabidopsis thaliana, often used for molecular studies of plants, had suggested that plants might use hydrogen peroxide as a signaling molecule, but its exact role was unclear.

Lew used a method called lipid exchange envelope penetration (LEEP) to incorporate the sensors into plant leaves. LEEP, which Strano's lab developed several years ago, allows for the design of nanoparticles that can penetrate plant cell membranes. As Lew was working on embedding the carbon nanotube sensors, he made a serendipitous discovery.

"I was training myself to get familiarized with the technique, and in the process of the training I accidentally inflicted a wound on the plant. Then I saw this evolution of the hydrogen peroxide signal," he says.

He saw that after a leaf was injured, hydrogen peroxide was released from the wound site and generated a wave that spread along the leaf, similar to the way that neurons transmit electrical impulses in our brains. As a plant cell releases hydrogen peroxide, it triggers calcium release within adjacent cells, which stimulates those cells to release more hydrogen peroxide.

"Like dominos successively falling, this makes a wave that can propagate much further than a hydrogen peroxide puff alone would," Strano says. "The wave itself is powered by the cells that receive and propagate it."
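The domino analogy can be sketched as a toy excitable-medium simulation. This is illustrative only, not the authors' model: the cell count and the rest/firing/refractory rules below are invented for demonstration, and real hydrogen peroxide signaling involves calcium-mediated release in two dimensions.

```python
# Toy relay of a hydrogen peroxide wave along a row of leaf cells.
# Each cell fires once (releases its own burst), triggers resting neighbors,
# then goes refractory -- so the wave regenerates as it travels and
# propagates much further than a single diffusing puff would.
N = 20
state = ["rest"] * N
state[0] = "firing"                  # a wound triggers the first cell

front = []                           # position of the firing cell at each step
for _ in range(N):
    front.append(state.index("firing"))
    nxt = list(state)
    for i, s in enumerate(state):
        if s == "firing":
            nxt[i] = "refractory"    # fired cells cannot re-fire
            for j in (i - 1, i + 1):
                if 0 <= j < N and state[j] == "rest":
                    nxt[j] = "firing"
    state = nxt

print(front)   # the wavefront advances one cell per step, never backward
```

Because each cell re-emits the signal rather than merely passing along the original burst, the wavefront keeps its strength as it moves away from the wound, which is the behavior the researchers observed.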

This flood of hydrogen peroxide stimulates plant cells to produce molecules called secondary metabolites, such as flavonoids or carotenoids, which help them to repair the damage. Some plants also produce other secondary metabolites that can be secreted to fend off predators. These metabolites are often the source of the food flavors that we desire in our edible plants, and they are only produced under stress.

A key advantage of the new sensing technique is that it can be used in many different plant species. Traditionally, plant biologists have done much of their molecular biology research in certain plants that are amenable to genetic manipulation, including Arabidopsis thaliana and tobacco plants. However, the new MIT approach is applicable to potentially any plant.

"In this study, we were able to quickly compare eight plant species, and you would not be able to do that with the old tools," Strano says.

The researchers tested strawberry plants, spinach, arugula, lettuce, watercress, and sorrel, and found that different species appear to produce different waveforms -- the distinctive shape produced by mapping the concentration of hydrogen peroxide over time. They hypothesize that each plant's response is related to its ability to counteract the damage. Each species also appears to respond differently to different types of stress, including mechanical injury, infection, and heat or light damage.

"This waveform holds a lot of information for each species, and even more exciting is that the type of stress on a given plant is encoded in this waveform," Strano says. "You can look at the real time response that a plant experiences in almost any new environment."

Stress response

The near-infrared fluorescence produced by the sensors can be imaged using a small infrared camera connected to a Raspberry Pi, a $35 credit-card-sized computer similar to the computer inside a smartphone. "Very inexpensive instrumentation can be used to capture the signal," Strano says.

Applications for this technology include screening different species of plants for their ability to resist mechanical damage, light, heat, and other forms of stress, Strano says. It could also be used to study how different species respond to pathogens, such as the bacteria that cause citrus greening and the fungus that causes coffee rust.

"One of the things I'm interested in doing is understanding why some types of plants exhibit certain immunity to these pathogens and others don't," he says.

Strano and his colleagues in the Disruptive and Sustainable Technology for Agricultural Precision interdisciplinary research group at the MIT-Singapore Alliance for Research and Technology (SMART), MIT's research enterprise in Singapore, are also interested in studying how plants respond to different growing conditions in urban farms.

One problem they hope to address is shade avoidance, which is seen in many species of plants when they are grown at high density. Such plants turn on a stress response that diverts their resources into growing taller, instead of putting energy into producing crops. This lowers the overall crop yield, so agricultural researchers are interested in engineering plants so that they don't turn on that response.

"Our sensor allows us to intercept that stress signal and to understand exactly the conditions and the mechanism that are happening upstream and downstream in the plant that gives rise to the shade avoidance," Strano says.

Credit: 
Massachusetts Institute of Technology

ECMO physicians offer guidance in the context of resource-scarce COVID-19 treatment

image: ECMO may be used to support critically ill patients in the COVID pandemic.

Image: 
ATS

April 15, 2020 - Rapidly escalating numbers of COVID-19 patients suffering from respiratory failure threaten to overwhelm hospital capacity and force healthcare providers into making challenging decisions about the care they provide. Of particular interest is the role of ECMO - extracorporeal membrane oxygenation, a form of life support for patients with advanced lung disease - to support critically ill patients in the current pandemic.

In "ECMO Resource Planning in the Setting of a Pandemic Respiratory Illness," an open-access paper published in the Annals of the American Thoracic Society, ECMO physicians outline their approach for care.

Currently, there is no vaccine or treatment for COVID-19 beyond supportive care such as mechanical ventilation or, in severe cases, ECMO to maintain patients and provide a window for potential recovery. However, when demand far outpaces a hospital's ability to provide highly specialized, resource-intensive therapies such as ECMO, physicians must be prepared to determine when and if to offer such support.

"The key challenge in pandemic settings is to optimize resource utilization so that patients are appropriately triaged and cared for within a hospital and throughout the larger health care system," said Steven Keller, MD, PhD, senior author and ECMO physician in the Division of Pulmonary and Critical Care Medicine, Brigham and Women's Hospital. "This is a daunting task as it requires a level of planning and coordination not routine in our current system and is difficult to implement within a limited window for planning and without dedicated resources."

"However, achieving this coordination is vital to ensuring that patients who are likely to benefit most from ECMO, younger patients with severe respiratory failure but without other significant co-morbidities or evidence of multi-organ failure, are able to receive the support that will offer them the best opportunity for survival."

Dr. Keller and his co-author suggest the following guidelines to help medical centers respond to patients' needs as resources contract in the COVID-19 pandemic:

Mild Surge - Focus on increasing capacity:

o Develop pandemic-specific criteria for the initiation and cessation of ECMO.

o Obtain necessary equipment and expand capacity.

o Collocate/regionalize ECMO patients.

o Implement staffing protocols that allow ECMO specialists/RNs to care for more patients based on acuity.

o Collaborate with other local/regional ECMO centers.

Moderate Surge - Transition focus to determining the allocation of scarce resources.

Major Surge - Limit or defer use of scarce resources.

"Planning for how to deploy these resources in advance will both optimize care for patients initiated on ECMO support as well as provide guidance for clinicians caring for patients in whom ECMO support is not an option in a resource-limited environment," said Dr. Keller.

The authors' perspective piece, along with the full collection of COVID-19 manuscripts, is available in the ATS library.

Credit: 
American Thoracic Society

High-res imaging with elastography may accurately detect breast cancer in surgical margins

Bottom Line: A high-resolution, three-dimensional imaging technique, when combined with quantitative measurement of tissue elasticity, could accurately detect cancer within the resected margins of surgical specimens taken from patients undergoing breast-conserving surgery.

Journal in Which the Study was Published: Cancer Research, a journal of the American Association for Cancer Research

Authors: Brendan F. Kennedy, PhD, associate professor in the School of Engineering at The University of Western Australia (UWA) and laboratory head of BRITElab at the Harry Perkins Institute of Medical Research in Perth, Western Australia; and Christobel Saunders, MBBS, professor in the School of Medicine at UWA

Background: "Despite living in the 'digital age,' surgeons must routinely rely on their eyesight and sense of touch to determine if they have removed the entire tumor during breast-conserving surgery," said Kennedy. "Due to lack of adequate tools, 20 to 30 percent of patients must return for additional surgery, resulting in substantial physical and financial burdens and increased risk of complications."

Beyond surgeons' native senses of sight and touch, X-rays are often used for the intraoperative detection of tumor in the margin of surgical specimens, said Saunders. "While useful in some cases, this method can't detect small microscopic traces of tumor that surgeons often miss," she said. "As a result, it is widely accepted that higher-resolution intraoperative detection techniques are needed."

Optical coherence tomography (OCT) is a type of imaging technique that generates three-dimensional images of tissue. "OCT can be described as the optical equivalent of ultrasound, using reflections of light waves rather than sound waves to form images of tissue microstructure," Kennedy explained. The images generated by OCT can be used to visualize and detect cancer in mastectomy tissues, but the sensitivity and specificity of this technique in breast-conserving surgery specimens in several recent studies were relatively low.

Cancer and its associated stroma are stiffer than benign tissues, Kennedy explained. OCT technology can also be used to measure tissue deformation under an applied force, allowing clinicians to ascertain the elasticity, or stiffness, of the surgical specimen. Quantitative micro-elastography (QME) is a variant of OCT that can generate three-dimensional maps of local elasticity. "We wanted to determine the diagnostic accuracy of QME for detecting tumor in the margins of breast-conserving surgical specimens, and compare this technique with images generated by OCT alone," Kennedy said.

How the Study was Conducted: In this study by Kennedy, Saunders, and colleagues, 90 patients were recruited who were undergoing surgical treatment for breast cancer. Of these patients, 83 received breast-conserving surgery, and seven received a mastectomy. Following surgery, simultaneous OCT and QME analyses were conducted on the resection margins of the freshly excised specimens. After imaging, the surgical specimens were submitted for standard histopathological processing. Histology sections were co-registered with the OCT and QME images to determine the types of tissue present in each scan.

To facilitate the comparison of OCT and QME images with histopathology, three-dimensional regions of interest (ROI) were selected from the scans. At least one ROI was selected for every surgical margin used in the study. Each ROI was determined to be positive for cancer if the pathologist identified tumor within 1 millimeter (mm) of the margin of the corresponding histology section.

To generate a set of pilot data, the researchers used all mastectomy surgical samples and surgical samples from 12 patients who received breast-conserving surgery. These data were used to train seven readers (two surgeons, two engineers, one medical sonographer, one pathology assistant, and one medical resident) to determine if images generated by OCT and QME indicated the presence of cancer in the surgical margins.

Surgical samples from the remaining 71 patients who received breast-conserving surgery were used to determine the ability of OCT and QME to detect cancer within 1mm of the surgical margins, as compared with gold-standard post-operative histology. Readers were blinded to the histopathology results. Sensitivity, specificity, and accuracy of each method were calculated for each reader, and aggregate results for all of the readers were determined. While inter-reader agreement was nearly perfect for QME, agreement was only moderate for OCT, Kennedy noted.

Results: Based on the aggregate results, OCT images resulted in a 69.0 percent sensitivity, 79.0 percent specificity, and 77.5 percent accuracy for detecting cancer within 1mm of the surgical margin. QME images resulted in a 92.9 percent sensitivity, 96.4 percent specificity, and 95.8 percent accuracy for detecting cancer within 1mm of the surgical margin.
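These figures follow the standard confusion-matrix definitions. A minimal sketch of the arithmetic (the counts below are illustrative only, not the study's data):

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Sensitivity, specificity and accuracy from confusion-matrix counts."""
    sensitivity = tp / (tp + fn)                 # true-positive rate
    specificity = tn / (tn + fp)                 # true-negative rate
    accuracy = (tp + tn) / (tp + fp + tn + fn)   # overall fraction correct
    return sensitivity, specificity, accuracy

# Illustrative counts only (not from the study)
sens, spec, acc = diagnostic_metrics(tp=90, fp=5, tn=95, fn=10)
# sens = 0.90, spec = 0.95, acc = 0.925
```

Note that accuracy depends on how many margins were cancer-positive overall, which is why it need not sit between sensitivity and specificity in general.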

Author's Comments: "Imaging the microscale stiffness of tissue using QME has the potential to reduce re-excision rates in breast-conserving surgery," Kennedy said. "Further, by quantifying tissue stiffness, we remove the subjectivity that is inherent to the surgeon's sense of touch.

"The ideal scenario would be to perform the imaging in the surgical cavity immediately after the specimen has been removed," Kennedy continued. "This would give surgeons a direct indication of whether any tumor had been missed. As such, our next goal is to develop a handheld QME probe to enable intraoperative imaging."

Study Limitations: Roughly 12 percent of the selected ROIs were excluded from the study for a variety of reasons, including extensive thermal damage, imaging artifacts, or the presence of a rare form of mucinous ductal carcinoma in situ (DCIS), representing a limitation of the study. "We are working on ways to accommodate for each of these excluded ROIs to ensure that the technique is as robust as possible," Kennedy said.

Credit: 
American Association for Cancer Research

Probiotic intervention in ameliorating the altered CNS functions in neurological disorders

According to the WHO, one in every four people is affected by a mental or neurological disorder at some point in their lives, making mental disorders among the leading causes of ill-health and disability worldwide. In the present era of socioeconomic competition and related stress, the incidence of neurological and psychiatric ailments, especially depression, bipolar disorder, acute anxiety and panic attacks, has risen significantly. Many of these conditions are never talked about openly, as those affected directly or indirectly shy away or feel embarrassed; in the absence of proper and early diagnosis and treatment, the condition of the affected person worsens.

Against this backdrop, the need to investigate newer and safer intervention therapies with prophylactic and/or therapeutic effects is clear. Recently, the role of the gut microbiota, and its cross-talk with the human brain, in modulating Central Nervous System (CNS) physiology has been highlighted. This review article by Sharma and Kaur (Mehr Chand Mahajan DAV College for Women, India) focuses on the role and effect of regular intake of probiotic bacteria (through probiotic-rich foods, drinks, capsules etc.) in strengthening overall gut health. The review gives a comprehensive insight into the potential of a regular, fixed dose of these beneficial bacteria to strengthen the gut microflora, thereby improving neurologic manifestations or decreasing the incidence and severity of neurological and psychiatric disorders. It also delineates the underlying molecular and biochemical mechanisms through which probiotic bacteria ameliorate altered CNS functions in neurological and psychiatric disorders (anxiety, major depressive disorder, bipolar disorder, schizophrenia, autism spectrum disorder, cognitive impairments etc.). Probiotics thus hold strong potential both as a dietary modification and as an intervention therapy with preventive and therapeutic value, and should be an integral part of the treatment protocols recommended for the target population by the physician.

Credit: 
Bentham Science Publishers

A more plant-based diet without stomach troubles: getting rid of FODMAPs with enzymes

A plant-based diet is a good choice for both climate and health. However, many plant-based products, especially legumes, contain FODMAP compounds that are poorly digestible and cause unpleasant intestinal symptoms. A study by VTT and Finnish companies succeeded in breaking down FODMAPs with enzymes and producing new, stomach-friendly plant-based food products.

FODMAPs are short-chain carbohydrate molecules that are poorly absorbed in the human small intestine. These non-absorbed compounds move along to the large intestine, where intestinal microbes feed on them. This results in the production of gases that cause symptoms, especially for those suffering from intestinal disorders, but also for many others. These problems are relatively common: irritable bowel syndrome alone is estimated to affect between 10% and 20% of the population.

Many foods containing FODMAPs are in themselves healthy and good sources of fibre, nutrients and vegetable proteins. However, those suffering from symptoms will often avoid these foods and miss out on their health benefits.

Enzymes to do away with FODMAPs

In a study funded by VTT, Gold&Green Foods, Raisio, Roal and Valio, VTT focused on two key FODMAP compounds: galactan and fructan. Galactan is abundant in, for example, legumes, while fructan is found in many cereals, among other things.

"We investigated whether these compounds can be removed from food by breaking them down with enzymes. We utilised both commercial enzymes and ones produced at VTT in the project. We used them to test the removal of FODMAPs from faba bean and pea protein concentrates as well as from rye, graham and wheat flour", says Senior Research Scientist Antti Nyyssölä from VTT.

The solution proved to work: there were only small amounts of FODMAPs remaining in the raw materials after enzymatic treatment.

"The method is similar to that used to make Hyla milk, in which lactose is broken down in advance. Similarly, enzymatic treatment can be used to remove FODMAPs from food."

New plant-based foods suitable for the FODMAP diet

The research project also tested whether enzymes work in connection with the preparation of food products. This would allow the food industry to eliminate harmful FODMAP compounds in their own processes. The project focused on testing plant-based spoonable products, meat analogues and bakery products to investigate different types of plant-based foods suitable for the FODMAP diet.

"The study showed that enzymes also work under a variety of conditions and in different food processes. This is interesting new information especially for legumes, as there are currently no similar legume-based foods suitable for the FODMAP diet on the market", says Nyyssölä.

"The results are most likely to be utilised next in the development of new food items, but also in academic research in order to verify the effects on intestinal symptoms with certainty", he continues.

Credit: 
VTT Technical Research Centre of Finland

How expectations influence learning

image: Burkhard Pleger (left) and Bin Wang collaborated for the studies.

Image: 
RUB, Marquard

During learning, the brain is a prediction engine that continually makes theories about our environment and accurately registers whether an assumption is true or not. A team of neuroscientists from Ruhr-Universität Bochum has shown that expectation during these predictions affects the activity of various brain networks. Dr. Bin Wang, Dr. Lara Schlaffke and Associate Professor Dr. Burkhard Pleger from the Neurological Clinic of Berufsgenossenschaftliches Universitätsklinikum Bergmannsheil report on the results in two articles that were published in March and April 2020 in the journals Cerebral Cortex and Journal of Neuroscience.

The neuroscientists identified two key regions in the brain: the thalamus plays a central role in decision-making. The insular cortex, on the other hand, is particularly active when it is clear whether the right or wrong decision has been made. "The expectation during learning then regulates specific connections in the brain and thus the prediction for learning-relevant sensory perception," says Burkhard Pleger.

Focus on the decision making process

For the investigation, the team used a learning task that focuses on the decision-making process during the perception of skin contact in the brain. "It's like learning a computer strategy game using a game pad, which gives sensory feedback to certain fingers on certain stimuli," says Pleger. "The point is that a certain touch stimulus leads to success and that this has to be learned from stimulation to stimulation."

Twenty-eight participants were given either tactile stimulus A or B on the index finger in each trial run. At the push of a button, they then had to predict whether the subsequent tactile stimulus would be the same or not. The probability of A and B was constantly changing, which the participants had to learn from prediction to prediction.
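Trial-by-trial learning of a drifting probability like this is often modelled with a simple delta-rule update, where the estimate moves a fraction of the way toward each new observation. This sketch is only an illustration of that generic idea, not the model used in the study:

```python
def update_belief(p_est, observed_same, learning_rate=0.2):
    """Delta-rule update: nudge the estimate toward the latest observation.

    p_est: current estimate of P(next stimulus is the same)
    observed_same: 1 if the stimuli matched on this trial, else 0
    """
    return p_est + learning_rate * (observed_same - p_est)

# Illustrative sequence of match (1) / mismatch (0) outcomes
p = 0.5
for outcome in [1, 1, 1, 0, 1, 1]:
    p = update_belief(p, outcome)
```

With a higher learning rate, the estimate tracks recent outcomes more closely, which is one simple way to capture strategy changes when expectations are violated.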

Strategy analysis

During the test, the participants' brain activity was examined using functional magnetic resonance imaging. The researchers were particularly interested in the trial runs in which the participants changed their decision-making strategy, asking to what extent the change in expectations influenced brain activity.

Two brain regions stood out to the researchers: the thalamus and the insular cortex. The thalamus processes information that comes from the sensory organs or other areas of the brain and passes it on to the cerebrum. It is also called the gateway to consciousness.

A new role for the thalamus

Using functional magnetic resonance images, the researchers were able to show that different connections between the prefrontal cortex and the thalamus were responsible for maintaining a learning strategy or changing it. The higher the expectations before the decision, the more likely the strategy was to be maintained, and the weaker these connections. With low expectations, the strategy changed and the regions seemed to interact much more strongly with each other. "The brain appears to be particularly active when a learning strategy has to be changed, while it takes significantly less energy to maintain a strategy," concludes Pleger.

"So far, the thalamus has been viewed as a switch," adds the neuroscientist. "Our results underline its role in higher cognitive functions that help decision-making while learning. So the thalamus is not only a gateway to sensory consciousness; rather, it seems to link it to cognitive processes that serve, for example, to make decisions."

Affecting sensory perception

The insular cortex, on the other hand, is involved in perception, motor control, self-confidence, cognitive functions and interpersonal experiences. This part was particularly active when a participant had already made their decision and then found out whether they were right or wrong. "Different networks that are anchored in the insular cortex are regulated by expectations and thus seem to have a direct influence on future sensory perception," said Pleger.

Credit: 
Ruhr-University Bochum

Bees point to new evolutionary answers

image: A recently described Fijian bee species, Homalictus groomi (photo James Dorey).

Image: 
Courtesy James Dorey

Evolutionary biology aims to explain how new species arise and evolve to occupy myriad niches - but it is not a singular or simplistic story. Rare bees found in high mountain areas of Fiji provide evidence that they have evolved into many species, despite the fact they can't readily adapt to different habitats.

These bees - discovered by a team of researchers from Flinders University, South Australian Museum, UniSA and University of Adelaide - serve as a major warning about the impact of ongoing human-induced climate change and loss of biodiversity for different species.

The Fijian bees are locked into very specific habitats, and when these contracted and split due to past climate change, the bee populations also became fragmented, with some isolated populations eventually turning into new species.

"The adaptation to new habitats and niches is often assumed to drive the diversification of species, but we found that Fijian bee diversity arose from an inability to adapt," says Flinders University's James Dorey, lead author on a new paper that explains this research.

The paper - "Radiation of tropical island bees and the role of phylogenetic niche conservatism as an important driver of biodiversity," by James Dorey, Scott Groom, Elisha Freedman, Cale Matthews, Olivia Davies, Ella Deans, Celina Rebola, Mark Stevens, Michael Lee and Michael Schwarz - has been published by Proceedings B journal.

"Our genetic data show how a single bee-colonisation in Fiji gave rise to over 20 endemic bee species largely constrained to cooler, high elevations," says Mr Dorey. "At least for Fijian bees, a relative inability to adapt has created a species-making machine."

New Fijian bee species evolved when a single ancestral species colonised Fiji and spread over lowland areas during cool periods, but was later restricted to different mountaintops as the climate warmed and the lowlands became too hot for comfort. The isolated populations then became new species. Each subsequent climate cycle has the potential to generate new species.

"Perhaps, if Darwin had studied Fijian bees instead of Galapagos finches, he might have come to rather different conclusions about the origin of species," adds Flinders University's Associate Professor Mike Schwarz, who was part of the research group.

One of the major arguments at the core of evolutionary theory suggests that species arise from adaptive radiation into new niche spaces, with gene flow between the new and ancestral populations subsequently inhibited, eventually leading to speciation. As an alternative, phylogenetic niche conservatism points to the inability of a lineage to adapt to new or changing environments, in turn, promoting speciation when populations become isolated as their preferred habitats contract.

This is what the researchers found in the high mountain areas of Fiji. Of the 22 Fijian bee species they identified, most have very narrow elevational ranges (constrained by temperature) - and 14 species were only recovered from single mountain peaks.

"This demonstrates how slowly bees have adapted to new climates since the colonisation of Fiji," says Mr Dorey. "We further highlight that such phylogenetic signals could indicate climate-related extinction risks. Indeed, one Fijian bee species (Homalictus achrostus) appears to be at serious risk of extinction, with sightings becoming much rarer since its initial discovery in the 1970s. This raises concerns for the 13 other species that we have, so far, only found on single mountain tops. They have nowhere to go if the climate continues to warm and represent 14 very good reasons to curb global greenhouse gas emissions."

Credit: 
Flinders University

Additions to resource industry underwater robots can boost ocean discoveries

image: An ROV fitted with an arm for collecting marine samples.

Image: 
AIMS

Underwater robots are regularly used by the oil and gas industry to inspect and maintain offshore structures. The same machines could be adapted to gather extra scientific information, thus boosting environmental and resource management capabilities, an Australian-led study has revealed.

Scientists from around the globe, led by Dianne McLean and Miles Parsons from the Australian Institute of Marine Science (AIMS), are urging closer ties between industry and researchers to maximise the use of the underwater robots, known as remotely operated vehicles (ROVs).

In a paper published in the journal Frontiers in Marine Science, they identify a range of instruments that can be easily added to the craft, including cameras, audio recorders and sample collectors.

The information gathered will significantly increase scientific and industry understanding of the impact of marine infrastructure, producing benefits for ecological management and regulatory compliance.

"This is a real win-win," said Dr McLean. "With some low-cost engineering and operational tweaks, industry and science can use ROVs to fuel new scientific discoveries. For instance, we could better understand the influence of structures such as platforms and pipelines in marine ecosystems - to the mutual benefit of the resource company and the environment."

The new research follows an earlier study that used adapted underwater vehicles to examine fish populations around a platform on the North West Shelf, 138 km offshore from Dampier.

In May this year, the AIMS team is set to extend the study, working with Santos to use ROVs to survey marine life around shallow water platform jackets.

The craft are routinely used to inspect thousands of industrial subsea structures around the world each year. They operate in shallow water, and at depths down to 3000 metres.

McLean, a fish ecologist and specialist in underwater video systems, and Parsons, an acoustics expert, teamed up with colleagues in Australia, the US, England and Scotland to identify feasible, cost-effective ways in which standard work-class ROVs could be adapted to expand their data-gathering capabilities.

These include the addition of extra sensors, cameras, acoustic transmitters and receivers, and sample collection devices.

"By partnering with experienced research scientists, industry can improve the quality of its ROV-derived data," says Dr Parsons.

Dr McLean said that the extra information, and the spirit of cooperation through which it was gathered, could be particularly useful when it came to complex engineering and environmental management challenges such as decommissioning large structures at the end of their working lives.

"From an industry point of view," she said, "these small additions to ROVs, and their use for scientific surveys, have the potential not only to improve environmental management, but also to facilitate more informed engagement with external stakeholders such as regulators and the public."

The research shows that small enhancements to the vehicles and how they are used now could provide substantial benefits to science and to resource companies in the long-term.

Credit: 
Australian Institute of Marine Science