Continuity of English primary care has worsened with GP practice expansions

A new study published in the British Journal of General Practice has found that patients' ability to see their preferred GP has fallen more sharply in English practices that have expanded than in those that stayed about the same size.

At the same time, English practices that have expanded have not achieved better access to care or a better overall patient experience. Being able to see the same GP is highly valued by many patients, and previous studies have suggested that it may lead to fewer hospital admissions and fewer deaths.

Over the last few years, the UK Government has encouraged expansion, mergers and greater collaboration between practices. This was intended to enable them to deliver services in new ways, work more efficiently and lengthen their opening hours. Most recently, in 2019, Primary Care Networks - collaborative groups of practices serving larger populations - were set up across England.

The study, led by the University of Kent, analysed changes in patients' reported ability to see a preferred GP, their access to care and their overall experience over recent years, based on responses to the UK GP Patient Survey. This survey asks questions of several hundred thousand people each year.

In the 644 practices that had expanded by more than 20% between 2013 and 2018, the proportion of patients saying they were able to see their preferred GP fell by 10 percentage points, from 59% to 49%, while in the 5,602 practices that had stayed about the same size (i.e. less than 20% change in number of patients), the same proportion fell by 7 percentage points, from 63% to 56%. The fall remained greater in practices that had expanded even after allowing for other characteristics of the practices, such as the age distribution of registered patients, rurality and level of poverty.
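As a back-of-the-envelope illustration only (not the study's own adjusted analysis), the headline figures quoted above work out as follows; the published model additionally controls for practice characteristics.

```python
# Minimal sketch of the unadjusted comparison reported above (illustrative only).
expanded = {"2013": 59.0, "2018": 49.0}  # % seeing preferred GP, practices that grew >20%
stable   = {"2013": 63.0, "2018": 56.0}  # % seeing preferred GP, practices with <20% change

fall_expanded = expanded["2013"] - expanded["2018"]  # 10 percentage points
fall_stable   = stable["2013"] - stable["2018"]      # 7 percentage points

# Unadjusted gap between the two falls ("difference in differences"): 3 percentage points.
gap = fall_expanded - fall_stable
print(f"Expanded: -{fall_expanded:.0f} pp, stable: -{fall_stable:.0f} pp, gap: {gap:.0f} pp")
```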

The research was led by Professor Lindsay Forbes at the University of Kent's Centre for Health Services Studies alongside Professor Stephen Peckham (Kent and the London School of Hygiene & Tropical Medicine (LSHTM)), Professor Matt Sutton (University of Manchester), Professor Katherine Checkland (Manchester) and Hannah Forbes (Manchester).

These scientists are all part of the Policy Research Unit in Commissioning and the Healthcare System (PRUComm), a collaboration between LSHTM and the Universities of Manchester and Kent, funded by the National Institute for Health Research. PRUComm undertakes research into how the health and social care system and commissioning of its services could be improved, providing evidence to the Department of Health and Social Care to inform the development of policy.

Professor Forbes said: 'Larger general practice size in England may well be associated with slightly poorer continuity of care and may not improve patient access. This goes to show that bigger may not be better with English primary care. Better health outcomes for individuals and patient experience for those with long-term conditions must be prioritised. Continuity of care is an important feature of good quality primary care and it is vital that we preserve this for the benefit of patients. It is also important that we collect good data about collaborative working and practice growth and monitor the effects on patient experience.'

Credit: 
University of Kent

Oncotarget: Heterogeneity of CEACAM5 in breast cancer

image: Proposed mechanisms of tumor dissemination from primary breast tumors to metastases. CEACAM5-expression outlined as no expression (white circle), heterogeneous expression (half-filled grey circle) and homogeneous expression (filled grey circle) based on the observed expression patterns in a total of 59 sets of primary breast carcinomas (P) and corresponding lymph node metastases (M), including datasets from both cryosectioned tissue and TMA. The distribution of the 59 tumor sets are outlined along with a representative immunostain at the right, as well as the proportion of tumor-sets with the given profile. Hierarchy-locked suggests that the disseminating tumor cells remain in a specific differentiation state. Induced suggests that extrinsic factors lead to induction of CEACAM5 in negative cells. EMT and EMT/MET suggest that disseminating tumor cells undergo epithelial to mesenchymal transition (EMT) without or with subsequent mesenchymal to epithelial transition (MET), respectively. Hierarchy-based suggests that disseminating cancer stem cells retain their differentiation capacity.

Image: 
Correspondence to - René Villadsen - r.villadsen@sund.ku.dk

Oncotarget recently published "Heterogeneity of CEACAM5 in breast cancer," which reported: "Here, we examined a repository of 110 cryopreserved primary breast carcinomas by immunohistochemistry to assess the distribution of CEACAM5 in tumor subtypes."

Assessing sample sets of paired primary breast cancers and corresponding lymph node lesions from a total of 59 patients revealed a high correlation between primary tumor and lymph node with regard to CEACAM5-status.

When examining the consequence of expression of CEACAM5 in breast cancer cell lines in culture assays, we found that CEACAM5-expressing cells were less invasive.

In survival analysis, using cohort studies of breast cancer, the expression of CEACAM5 predicted different clinical outcomes depending on molecular subtypes.

Altogether, our analysis suggests that CEACAM5 plays a context-dependent role in breast cancer that warrants further investigation.

Dr. René Villadsen from The University of Copenhagen said, "The carcinoembryonic antigen family (CEA) consists of a subgroup of 12 members of carcinoembryonic antigen-related cell adhesion molecules (CEACAMs), and several of these are reportedly overexpressed in various cancers."

Early work suggested that CEACAM5 was also often overexpressed in breast cancer.

Since then several immunobased assays have been implemented to examine the role of CEACAM5 as a clinically relevant marker in breast cancer.

While some studies have demonstrated that increased serum levels in preoperative breast cancer patients correlate with a worse outcome, others have not. Immunophenotypic studies have likewise yielded mixed results.

A summary of the observed results is available in Table 1 of the paper. Overall, the available data do not provide a consensus on the role of CEACAM5 in breast cancer.

Here, we assess CEACAM5 expression in breast cancer subtypes by immunohistochemistry, and compare the expression pattern in primary tumors to corresponding lymph node metastases.

The Villadsen Research Team concluded in their Oncotarget Research Paper, "the findings in this study may help improve the understanding of the biological effect of CEACAM5-expression in breast cancer."

Credit: 
Impact Journals LLC

Study provides first evidence of a relationship between a bird's gut and its brain

video: A male zebra finch passes the final stage of the novel foraging task, showing that he has mastered the lid flipping technique.

Image: 
Morgan C. Slevin, Florida Atlantic University

Despite extensive support for relationships between the gut microbiome and the brain (the "microbiota-gut-brain axis") in humans and rodents, little is known about these relationships in other animals, leaving questions about this system's generality.

To address these knowledge gaps, researchers from Florida Atlantic University's Charles E. Schmidt College of Science and Harbor Branch Oceanographic Institute, in collaboration with Cornell University, studied the relationship between cognition and the gut microbiome of captive zebra finches (Taeniopygia guttata). Songbirds provide an opportunity to test for a microbiota-gut-brain axis because of recent advances in understanding avian cognition.

In a population of 38 zebra finches, researchers quantified performance on cognitive tasks measuring learning and memory. For the study, they sampled the gut microbiome using a cloacal swab and quantified bacterial alpha and beta diversity. The zebra finch cloacal microbiome is representative of that of its large intestine.
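The paper details the actual sequencing and statistical pipeline; purely as a rough illustration of what "alpha" (within-sample) and "beta" (between-sample) diversity mean, here is a minimal sketch using hypothetical count data and standard formulas (Shannon index and Bray-Curtis dissimilarity).

```python
import numpy as np

# Hypothetical per-bird bacterial counts (rows: birds, columns: taxa); illustrative only.
counts = np.array([
    [120.0,  30.0,  5.0,  0.0],
    [ 60.0,  60.0, 20.0, 10.0],
    [ 10.0, 100.0, 40.0,  2.0],
])

def shannon_alpha(sample):
    """Shannon diversity of one sample (higher = richer, more even community)."""
    p = sample / sample.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def bray_curtis(a, b):
    """Bray-Curtis dissimilarity between two samples (0 = identical, 1 = no shared taxa)."""
    return np.abs(a - b).sum() / (a + b).sum()

alpha = [shannon_alpha(row) for row in counts]           # one value per bird
beta = [[bray_curtis(counts[i], counts[j]) for j in range(len(counts))]
        for i in range(len(counts))]                     # pairwise dissimilarity matrix
print(np.round(alpha, 2))
print(np.round(beta, 2))
```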

Results of the study, published in the Royal Society's journal Biology Letters, showed that captive zebra finches' gut microbiome characteristics were related to performance on a cognitive assay where they learned a novel foraging technique. Researchers also identified potentially critical bacteria that were relatively more abundant in birds that performed better on this assay. This correlation provides some of the first evidence of a relationship between a bird's gut microbiome and its brain.

"An animal's gut microbiome can have wide-ranging effects on health, cognitive performance and behavior, coining the conceptual framework 'microbiota-gut-brain axis,'" said Morgan C. Slevin, lead and corresponding author and an FAU Ph.D. student in integrative biology and neuroscience. "The gut microbiome can affect the brain directly by releasing neurotransmitters and precursors that stimulate the vagus nerve and indirectly by influencing the immune system. Gut microbiome characteristics have been linked in rodents and humans to learning and memory and mental health."

To assess cognitive performance, the researchers tested the zebra finches using three tasks measuring learning and memory: novel foraging, color association, and color reversal. Each bird was tested individually (visually but not acoustically isolated from other subjects) and researchers viewed and scored trials remotely via video.

Findings showed that Helicobacter, a genus responsible for many intestinal diseases including ulcers, and Gallibacterium, a genus with many hemolytic species found in birds including poultry, were generally more abundant in birds that performed poorly.

"While we did not identify beneficial taxa responsible for differences among performance categories, we suggest Helicobacter and Gallibacterium may signal microbiome imbalance or maladaptation in poor-performance birds," said Rindy C. Anderson, Ph.D., senior author, an assistant professor of biological sciences in FAU's Charles E. Schmidt College of Science, and a member of FAU's Brain Institute. "This finding raises the question: 'Do specific taxa influence cognitive performance? Or, is a songbird's gut microbiome simply indicative of host quality and thus correlated with cognitive ability?' Research could address these questions by describing the functionality of the core microbiome members for more bird species and testing how specific pre- and probiotic treatments affect cognitive ability."

The researchers note that another intriguing possibility is that microbiome characteristics impact some cognitive processes more than others depending on sex, such as motor learning and short-term memory (novel foraging) compared to longer-term associative memory (color association) and flexibility (color reversal).

These studies will be crucial to understanding how the microbiome affects the brain and overall health of wild and captive animals.

Credit: 
Florida Atlantic University

COVID-19 control measures shorten hospital stays for moms, babies

LOS ANGELES (Nov. 2, 2020) -- New infection prevention practices implemented during the coronavirus pandemic have resulted in significantly shorter hospital stays for mothers and their babies, according to investigators at Cedars-Sinai, with no changes in the rates of cesarean deliveries, complications or poor outcomes.

The retrospective study, recently published in the American Journal of Obstetrics & Gynecology MFM, examined the impact of several modifications in the Labor and Delivery Unit at the medical center. The changes included temperature screening of all patients and visitors, limiting the number of visitors, providing staff with personal protective equipment (PPE), and new approaches to delivery management and newborn care.

"Patients can be reassured that appropriate measures have been taken to protect them and their babies and that those changes are not going to impact their ability to have a good and safe delivery with us," said Naomi Greene, PhD, principal investigator of the study and an assistant professor of obstetrics and gynecology at Cedars-Sinai.

The study reviewed data from 1,936 deliveries, comparing two separate groups. The first cohort comprised patients who gave birth in January and February of 2020, before new COVID-19 guidelines were in place. The second group had their babies in March and April, after infection control protocols were implemented at the medical center.

"When the Labor and Delivery Unit made safety modifications in response to the pandemic, approximately half of the women who had vaginal deliveries - and their babies - stayed just one night in the hospital. But before the pandemic, only a quarter of the women giving birth went home after one night; most spent two nights, on average," said Greene.

Similar patterns were seen among women who had a cesarean delivery. More than 40% of them spent two days or less in the hospital once pandemic protocols were in place. But before the safety changes, a much smaller number were discharged within two days - just 12%, said Greene.

Maternal-fetal medicine specialist Mariam Naqvi, MD, senior author on the study, said it was encouraging to find the shorter stay did not appear to impact the care or welfare of the mother and her child.

"It's always our goal to discharge a patient after she has met all of her postpartum milestones and is medically stable to go home. But with COVID-19, we are mindful of the potential risk of unnecessary or prolonged stays in the hospital. It is encouraging to see that in the short term, spending less time was not associated with more complications," said Naqvi, an assistant professor of obstetrics and gynecology at Cedars-Sinai.

Here are additional findings researchers noted when comparing the groups delivering before and after the implementation of pandemic safety modifications:

There was no difference in the rate of cesarean delivery for first-time mothers carrying to full term.

No differences in the rate of induced labor.

No differences in adverse maternal or neonatal outcomes.

"Our study suggests there may be value in exploring whether there are benefits, post-pandemic, of a shorter hospital stay for childbirth. Also, perhaps limiting visitors may give women and their partners time to focus more on the new baby and the helpful in-patient education we provide," said Sarah Kilpatrick, MD, PhD, a co-author of the study and the Helping Hand of Los Angeles Chair in Obstetrics and Gynecology at Cedars-Sinai.

"Our goal during these unprecedented and stressful times is to provide the safest care possible and to get families home, as soon as is safely possible," said Kilpatrick.

Credit: 
Cedars-Sinai Medical Center

COVID-19 news from Annals of Internal Medicine

Below please find a summary and link(s) of new coronavirus-related content published today in Annals of Internal Medicine. The summary below is not intended to substitute for the full article as a source of information. A collection of coronavirus-related content is free to the public at http://go.annals.org/coronavirus.

Emergency Approvals for COVID-19: Evolving Impact on Obligations to Patients in Clinical Care and Research

There currently is debate regarding whether U.S. institutions and clinicians may or should restrict patient access to COVID-19 drugs and vaccines that have been granted emergency use authorization (EUA) by the U.S. Food and Drug Administration. Authors from the University of Pennsylvania and NYU Grossman School of Medicine discuss their views on the legal, ethical, and clinical ramifications of such restriction. They suggest that, although EUAs expand treatment options, they do not necessarily support a shift in the standard of care - only evidence can do that, and the evidence supporting EUAs varies considerably. Institutions and clinicians are not obligated to offer unproven interventions, but rather they must assess available evidence and treat patients accordingly. As a result, they may permissibly decide not to offer EUA products in clinical care and instead to limit access to EUA products exclusively to participants in clinical trials. Read the full text: https://www.acpjournals.org/doi/10.7326/M20-6703.

Media contacts: A PDF for this article is not yet available. Please click the link to read full text. The lead author, Holly Fernandez Lynch, JD, MBE, can be reached at lynchhf@pennmedicine.upenn.edu.

Credit: 
American College of Physicians

Where you get depression care matters, study finds

In the United States, more than half of people living with a mental health disorder do not receive treatment, according to the National Institute of Mental Health, which is why primary-care clinics can play a leading role in depression care.

Research shows that collaborative care programs in which primary-care providers work with a depression care manager and a designated psychiatric consultant can more than double the likelihood of improving depression outcomes. But a new study published in Health Affairs shows that not all care is equal.

Looking at data from 11,003 patients in 135 primary care clinics in nine states, researchers found tremendous variation in how well clinics implemented collaborative care programs.

In some clinics, fewer than 25% of patients served had substantial improvements in depression after six months, while in other clinics more than 75% of patients had significant improvement.

"This is the largest study to date of collaborative care programs for depression in primary care," said lead author Jürgen Unützer, professor and chair of the Department of Psychiatry at the University of Washington School of Medicine, where the model was pioneered. "The differences are huge and it makes a big difference where you get your depression care."

Researchers said the reason for the large variation in success across clinics could be summed up in three major findings:

The most important finding, they said, is that it made a big difference how much help the clinics had with implementing collaborative care.

Clinics that received more intensive implementation support were almost twice as likely to achieve good depression outcomes as those with a basic level of implementation support, such as program literature and a one-time training. More intensive support included ongoing outcomes tracking and feedback from the UW Medicine AIMS Center (Advancing Integrated Mental Health Solutions) over a one-year period.

In other findings:

Patients who are more severely depressed or sicker in other ways are less likely to have good depression outcomes.

Patients who are poor and have fewer resources may also have worse depression outcomes. Clinics that treat low-income patients, such as Federally Qualified Health Centers, may have a harder time achieving good depression outcomes and may need extra help and resources to be successful.

Clinics that had several years of practice with collaborative care achieved somewhat better outcomes than clinics that are still learning how to do collaborative care.

The collaborative care model was pioneered by the late Dr. Wayne Katon, who spent three decades testing and developing approaches to improve depression treatment in primary care. More than 80 randomized controlled trials have validated the success of the collaborative care model.

One of the most powerful aspects of collaborative care is regular monitoring of patients' depression and systematic adjustment of treatments if patients are not improving as expected. Similarly, clinics that regularly monitor their patients' depression and make systematic adjustments in their programs if patients are not improving as expected may achieve substantially better outcomes for their patients.

Credit: 
University of Washington School of Medicine/UW Medicine

New study reveals poisoning exposures in Australian schools

New research from the University of Sydney has found poisoning exposures in children and adolescents while at school are relatively common and appear to be increasing, highlighting the need for more robust prevention measures.

The authors state that by focusing on improved safety strategies, the incidence of poisonings in schools could decrease.

Published today in the Archives of Disease in Childhood, it is the most up-to-date study to investigate poisonings in schools in New South Wales, Australian Capital Territory and Tasmania, with data revealing the types of exposures and substances involved.

The researchers studied cases reported to the New South Wales Poisons Information Centre (NSWPIC) over a four-and-a-half-year period (January 2014 to June 2018). NSWPIC is Australia's largest Poisons Information Centre, taking 50 percent of the nation's poisoning calls.

Injury and poisoning are major causes of hospitalisation and death in children globally. Poisoning is the third leading source of hospitalised childhood injury in Australia and is largely preventable.

"The study found 1751 calls relating to exposures at school made to the Poisons Information Centre, with 61 percent concerning accidental exposures, 12 percent concerning deliberate self-poisonings and 12 percent from medication dosing errors," said senior author Dr Rose Cairns from Sydney School of Pharmacy, in the Faculty of Medicine and Health, and senior poisons specialist at the NSW Poisons Information Centre.

"Most self-poisoning exposures were from over-the-counter products such as paracetamol, and most accidental exposures occurred from stings and bites, exposures to plants and in science class.

"Poisoning exposures at school appear to be increasing, with 81 calls per quarter in 2014-2016, and 129 calls per quarter in 2017-2018.

"Children are at risk of different types of poisoning depending on their age and developmental stage. Younger children are at risk of accidental exposures, while adolescents are at risk of self-harm poisonings. Deliberate self-poisoning (self-harm exposures/overdose) is increasing in children and adolescents in Australia, and elsewhere," Dr Cairns said.

Study findings

The median age was 12 years, and exposures peaked in children aged 14 years

55 percent of cases were male

Deliberate self-poisoning was predominantly reported in girls (79 percent)

More than 25 percent of poisoning cases were referred to hospital; referral rates were highest for deliberate self-poisoning exposures (92 percent), followed by recreational exposures (57 percent) and other intentional exposures (33 percent)

Accidental exposures (15 percent) and medication errors (11 percent) had low hospital referral rates

Over-the-counter medicines such as paracetamol and ibuprofen were most commonly taken in self-poisoning incidents

Medication errors occurring at school accounted for nearly 12 percent of cases, with the most common medications involved being methylphenidate and clonidine (ADHD medications), and paracetamol. Where recorded, 150 cases involved a dosing error with a medication prescribed for the child involved, while 40 cases involved prescription medication administered to the wrong child

Science class poisoning exposures accounted for 19 percent of accidental exposures, and a range of substances were involved. Copper sulfate was responsible for approximately one-quarter of all science class exposures, of which 45 percent resulted in hospitalisation. Most science class exposures were accidental.

Accidental exposures, dares, pranks and recreational exposures occurred more frequently in boys

Poisons calls were not just about medicines, but included everything from insect bites, mushrooms and hand sanitiser, to glow sticks, soap and disinfectant.

Strategies needed to stop preventable poison exposures

"Many of these poison exposures were likely preventable, so we need to focus on strategies for prevention and school-based initiatives and programmes to make the school environment safer for students," said Dr Cairns.

"A better understanding of reasons for poisonings, and circumstances surrounding exposures, is key to guiding public health strategies for poisoning prevention.

"For accidental poisoning exposures schools could undertake a risk assessment of common chemistry experiments and reinforce the use of personal protective equipment in class. This includes the provision of well-fitting goggles to prevent eye exposures.

"In the case of deliberate self-poisonings, schools could look to increase teacher training for identifying and responding to mental health problems, anti-bullying strategies, more school counsellors, regulation of social media use, and mental health first aid training for teachers. The increased funding for school counsellors and psychologists announced in September is a step in the right direction.

"Despite there being policies and procedures regarding medication handling in Australian schools, medication administration errors were common over the study period. While low risk, our study highlights the importance of medication skills training by school staff to ensure correct administration of medications to students. Some medicines are self-administered by students during school time, so increased counselling by prescribers may also help, or consideration of dosing regimens that avoid medication during school hours.

"This is particularly important as Australian studies have found that the prescribing of medicines in children and adolescents, particularly of psychotropics, has increased substantially and it was these medicines where the dosing errors occurred most frequently.

"Poisonings can occur in any setting, including at home and at school. While we often focus on poisoning prevention initiatives to improve safety in the home, we need to recognise that children spend a large portion of their waking hours at school, so it's important to consider simple ways we can decrease poisoning exposures in this setting."

Credit: 
University of Sydney

Artificial night lighting has widespread impacts on nature

Artificial night-time lighting has a diverse range of effects across the natural world and should be limited where possible, researchers say.

A team led by the University of Exeter brought together more than 100 studies and found "widespread" impacts on animals and plants.

Changes to animals' bodies and behaviour - especially hormone levels and patterns of waking and sleeping - were consistently found.

The study shows that levels of melatonin (a hormone regulating sleep cycles) were reduced by exposure to artificial lighting at night in all animal species studied.

"Lots of studies have examined the impacts of artificial night-time lighting on particular species or communities of species," said Professor Kevin Gaston, of the Environment and Sustainability Institute on Exeter's Penryn Campus in Cornwall.

"Our research brings those studies together - and we find the effects are very diverse and very pervasive.

"Particularly strong responses are seen in hormone levels, the timing of daily activity in diurnal (daytime) species, and 'life-history' traits such as number of offspring.

"People may imagine this is all about powerful light, but in fact we are seeing a lot of responses at quite low levels of artificial light."

Dr Dirk Sanders added: "We see differences in nocturnal and diurnal species.

"For rodents, which are mostly nocturnal, the duration of activity tended to be reduced by night-time lighting.

"In contrast, for birds - with all of those included strictly diurnal - artificial light led to an extension of the duration of their activity, with singing and foraging starting earlier."

Previous studies have shown night-time lighting has wide-ranging impacts - from reducing pollination by insects to trees budding earlier in spring.

Like climate change, night-time lighting appears to benefit certain species in certain locations, but Professor Gaston said the clear message of the study was to reduce lighting where possible.

"Both climate change and night-time lighting are human-driven and enormously disruptive to the natural world," he said.

"Historically, we have not really worried about the impact of night-time lighting.

"Only now are we discovering its wide-ranging effects.

"Our study shows that we should, as a matter of principle, only use night-time lighting where we need it and no further, and at intensities that we need and no more.

"In effect, we need to view light like any other pollutant.

"Obviously it would be ridiculous to say 'switch the world's lights off' - but we could reduce our use of light immensely with absolutely no impact on ourselves."

Professor Gaston is the scientific advisor on a forthcoming landmark natural history series about the night-time, called "Earth at Night in Colour". The series is released on Apple TV+ on December 4th.

Credit: 
University of Exeter

Researchers develop a high-power, portable terahertz laser

Researchers at MIT and the University of Waterloo have developed a high-power, portable version of a device called a quantum cascade laser, which can generate terahertz radiation outside of a laboratory setting. The laser could potentially be used in applications such as pinpointing skin cancer and detecting hidden explosives.

Until now, generation of terahertz radiation powerful enough to perform real-time imaging and fast spectral measurements required temperatures of around 200 K (-100 degrees Fahrenheit) or lower. These temperatures could only be achieved with bulky equipment that limited the technology's use to a laboratory setting. In a paper published in Nature Photonics, MIT Distinguished Professor of Electrical Engineering and Computer Science Qing Hu and his colleagues report that their terahertz quantum cascade laser can function at temperatures of up to 250 K (-10 degrees Fahrenheit), meaning that only a compact portable cooler is required.

Terahertz quantum cascade lasers, tiny chip-embedded semiconductor laser devices, were first invented in 2002, but adapting them to operate far above 200 K proved to be so difficult that many people in the field speculated that there was a fundamental physical reason preventing it, Hu says.

"With a high operating temperature, we can finally put this in a compact portable system and take this breakthrough technology out of the laboratory," Hu says. "This will enable portable terahertz imaging and spectral systems that will have an immediate impact on wide-ranging applications in medicine, biochemistry, security, and other areas."

Hu began research into the terahertz frequencies -- a band of the electromagnetic spectrum between microwaves and the infrared range -- back in 1991.

"It took me 11 years and three generations of students to make our own [terahertz quantum cascade laser] in 2002," he says. Since then, maximum operating temperatures that limited the use of terahertz radiation remained well below room temperature. The maximum of 250 K reported in this paper represents a considerable jump from the previous maximum of 210 K, which was established in 2019, beating a previous 2012 record of 200 K that had stayed untouched for seven years.

The lasers, which measure only a few millimeters in length and are thinner than a human hair, are quantum well structures with meticulously custom-engineered wells and barriers. Within the structure, electrons "cascade" down a kind of staircase, emitting a light particle, or photon, at each step.

One important innovation described in the Nature Photonics paper was the doubling of the height of the barriers within the laser to prevent leakage of the electrons, a phenomenon that tended to increase at high temperatures.

"We understood that over-the-barrier electron leakage was the killer," causing the system to break down if not cooled with a cryostat, Hu says. "So, we put a higher barrier to prevent the leakage, and this turned out to be key to the breakthrough."

Previously, higher barriers were explored sporadically, but they yielded inferior results, Hu says. The prevailing opinion was that increased electron scattering associated with the higher barriers was detrimental, and therefore higher barriers should be avoided.

The research team developed the correct parameters for the band structure for tall barriers and a conceptually novel optimization scheme for the design.

This innovation was paired with a "direct phonon scheme" that keeps the laser operating: the lower lasing levels of each module, or step of the structure's staircase, are quickly depopulated of electrons through phonon scattering (a phonon is a unit of vibrational energy) into a ground state. That ground state then serves as the injector of electrons into the upper level of the next step, and the process repeats. Such an arrangement of the electrons in the system is essential for lasing to occur, as first envisioned by Einstein back in 1916.

"These are very complex structures with close to 15,000 interfaces between quantum wells and barriers, half of which are not even seven atomic layers thick," says co-author Zbig Wasilewski, professor of electrical and computer engineering and University of Waterloo Endowed Chair in Nanotechnology. "The quality and reproducibility of these interfaces are of critical importance to the performance of terahertz lasers. It took the best in molecular beam epitaxial growth capabilities -- our research team's key contribution -- together with our MIT collaborators' expertise in quantum device modeling and fabrication, to make such important progress in this challenging sector of THz photonics."

In a medical setting, the new portable system, which includes a compact camera and detector and can operate anywhere with an electric outlet, could provide real-time imaging during regular skin-cancer screenings or even during surgical procedures to excise skin cancer tissues. The cancer cells show up "very dramatically in terahertz" because they have higher water and blood concentrations than normal cells, Hu says.

The technology could also be applied in many industries where the detection of foreign objects within a product is necessary to assure its safety and quality.

Detection of gases, drugs, and explosives could become especially sophisticated with the use of terahertz radiation. For example, compounds such as hydroxide, an ozone-destruction agent, have a special spectral "fingerprint" within the terahertz frequency range, as do drugs including methamphetamine and heroin, and explosives including TNT.

"Not only can we see objects through optically opaque materials, but we can also identify the substances," Hu says.

Hu says he sees "a clear path" to the goal of being able to generate powerful terahertz without needing a cooler.

"Using the direct phonon scheme and taller barriers is the way to go forward," he says. "I can finally see the light at the end of the tunnel when we will reach room temperature."

Credit: 
Massachusetts Institute of Technology

What digital revolution? Hundreds of millions of farmers still cannot get online

image: Coverage of mobile services across global croplands.

Image: 
Mehrabi et al. 2020

The digital age brims with promise for the world's half-billion smallholder farmers. Smartphones with the right apps can tell farmers when it's likely going to rain, how to identify and eradicate pests, and negotiate the prices for a bountiful harvest delivered by a combination of hard work and big data.

While digital technologies are steadily reaching more farmers, for each one plugged into the latest weather forecast or selling produce at the tap of a screen, millions more are sidelined as the digital revolution whizzes by.

Depending on the region, this is due to a lack of devices or a combination of nonexistent, extremely expensive or outdated network coverage. For example, across Mexico, Latin America's second-largest economy, virtually everyone has a cell phone but only 25% of farming households have internet access.

Across many locations in sub-Saharan Africa, which has the potential to be a global breadbasket, fewer than 40% of farming households have internet access. And unlike Asia and Latin America, where mobile phone ownership is nearly universal, fewer than 70% of farmers in sub-Saharan Africa have handheld devices. Access to the 4G networks required to run more sophisticated apps stands at only 9%.

"There's an assumption that we're going to be able to target everyone with these new technologies and everyone is going to be able to benefit," said Zia Mehrabi, a scientist at the University of British Columbia who led the analysis published Nov. 2 in Nature Sustainability.

The study also showed major differences in mobile network services across farm sizes. Globally, only 24-37% of farms under one hectare had access to 3G or 4G networks, while service availability is as high as 80% for farms over 200 hectares.

2G technology, which provides voice and text service, has wider availability but is not compatible with most smartphone technology.

"We face a digital poverty trap - those who are already marginalized fall further behind while others benefit from the myriad of opportunities offered by digital innovation in farming," said Andy Jarvis, a co-author from the Alliance of Bioversity International and CIAT, who is also part of the CGIAR Platform for Big Data in Agriculture.

The researchers' affordability analysis found that for many rural poor who do live in areas with coverage, getting connected could eat up the majority of their household budget.

"The study points to the need not only to expand coverage but vastly reduce the costs to make it affordable," said Jarvis. "We need to consider digital connectedness as a basic need, and design next-generation innovations to work in every corner of Africa."

The study included authors from the World Bank and the Helmholtz Centre for Environmental Research in Germany.

Noise but no signal

The study zeroed in on mobile coverage gaps where farmers need it most.

In nitrogen-deficient cropping areas, which have low yields without proper fertilizer management, 3G/4G availability was 60% and 22% respectively. Areas dependent on rainfall for production had 71% and 54% availability. For arid environments, which pose large but surmountable production challenges, coverage was only 37% and 17%.

"These coverage gaps pose important roadblocks for developing data-hungry nutrient advisories, climate services, and financial services that require mobile internet," the authors said.

As for people affected by food insecurity, measured here by the prevalence of childhood stunting, 3G/4G coverage was 61%/45% globally and 52%/22% in Africa. Global access for people afflicted by malaria, which caused more than 400,000 deaths in 2018, is only 37%/17%.

"This lack of coverage for at-risk populations poses serious concern for responding to food security and health impacts of emerging diseases including COVID-19. This lack of coverage is more problematic today than ever before," said Mehrabi.

Universal broadband

The authors call on governments, businesses, development agencies and global philanthropists to quickly mobilize the investments and interventions needed to close the digital divide. They call for immediate action on the United Nations Sustainable Development Goal target that calls for universal access to the Internet in the least developed countries by 2020.

Proposed actions include:

Investment in "last-mile" infrastructure innovations such as renewable energy, low-cost mobile towers, and backhaul technologies like scalable microwave technologies.

Increase in handset affordability: devices in emerging markets are in the USD $100 range, making them unaffordable for many, including women who have lower handset ownership than men in many regions.

Make data access universal: even if they have handsets, the poorest farmers are unable to afford data. In some African countries, a basic plan with 1GB of data per month exceeds the annual income of the poorest 10% of the population.

Use interim solutions: SMS advisories and voice message services on existing 2G networks offer significant opportunities for productivity, market connectivity, money transfers, credit and other services - as long as they have sophisticated back-end support.

Research funding: this baseline study should be regularly updated to keep track of progress and build metrics related to capacity building, skills, digital literacy and cultural appropriateness of services, particularly across ages and genders.

"There's a lot of 5G coming online. If access is not addressed at lower-end technologies, this is only going to aggravate the divide and create more inequality," said Mehrabi.

While the future may hold universal internet access for the world - Google's Loon, OneWeb and SpaceX's Starlink hope to enable this - the reality is that delivery of these products is still perhaps a long way off.

"They'd better get their skates on, because we desperately need universal access in farming landscapes. And the longer we delay, the more problematic this is going to be," said Mehrabi.

Credit: 
The Alliance of Bioversity International and the International Center for Tropical Agriculture

Silk road contains genomic resources for improving apples

image: Examples of different colors, sizes and shapes of apples, a reflection of the diversity of the apple genetic resources or germplasm preserved in ARS Geneva.

Image: 
Thomas Chao, USDA-ARS

The fabled Silk Road - the 4,000-mile stretch between China and Western Europe where trade flourished from the second century B.C. to the 14th century A.D. - is responsible for one of our favorite and most valuable fruits: the domesticated apple (Malus domestica).

Snack-packing travelers would pick apples at one spot, eat them and toss their cores many miles away. The seeds grew into trees in their new locations, cross-bred with the wild species, and created more than 7,000 varieties of apples that exist today.

Hybridizations with wild species have made the apple genome very complex and difficult to study. A global team of multi-disciplinary researchers - co-led by Zhangjun Fei, faculty member at Boyce Thompson Institute (BTI), and Gan-Yuan Zhong, scientist with the USDA-Agricultural Research Service (ARS) in Geneva, New York - tackled this problem by applying cutting-edge sequencing technologies and bioinformatics algorithms to assemble complete sets of both chromosomes for the domesticated apple and its two main wild progenitors.

The researchers discovered that the apple's unique domestication history has led to untapped sources of genes that could be used for crop improvement, such as improving size, flavor, sweetness and texture.

"Plant breeders could use this detailed information to improve upon traits that matter most to consumers, which today is primarily flavor," says Fei, also an adjunct associate professor in Cornell University's School of Integrative Plant Science (SIPS).

"Perhaps more importantly," he added, "the information will help breeders produce apples that are more resistant to stress and disease."

The research is described in a paper published in Nature Genetics on November 2, with authors from BTI, Cornell University, Cornell AgriTech, the U.S. Department of Agriculture (USDA) and Shandong Academy of Agricultural Sciences.

From the Silk Road to Geneva, N.Y.

According to Fei, the new study was the outgrowth of an earlier collaboration, published in Nature Communications in 2017, which traced the history of apple domestication and evolution along the Silk Road.

Follow-up discussions among Fei, Zhong and other colleagues at Cornell inspired them to build new and better apple reference genomes by applying new sequencing and assembly technologies to material in USDA's Geneva Clonal Repository. The repository, which is housed at Cornell AgriTech, holds the largest collection of apple accessions in the world. Many of these accessions can be traced back to the Silk Road.

In the current work, the researchers sequenced, assembled and compared the full reference genomes for three species: Gala, a top commercial cultivar of M. domestica; and apple's two main wild progenitors, the European crabapple (M. sylvestris) and the central Asian wild apple (M. sieversii), which together account for about 90% of the domesticated apple's genome.

The results provide apple breeders with detailed genomic roadmaps that could help them build a better apple.

"We wanted to develop new genomes, especially the wild progenitors, because of the tremendous impact they could have on understanding apple's genetic diversity and identifying useful traits for breeding new cultivars," said Zhong, who is also an adjunct associate professor in SIPS.

By comparing the three genomes, the researchers were able to identify which progenitor species contributed the genes responsible for many traits in the domesticated apple.
For example, the team found that the gene giving apple its crunchy texture is located near the gene that makes it susceptible to blue mold.

"Now that we know exactly where those two genome regions are," Fei said, "breeders could figure out a way to keep the texture gene and breed out or edit out the blue mold gene to produce a more disease-resistant cultivar."

Discovering what's missing

The team also assembled pan-genomes for the three species. A pan-genome captures all of the genetic information in a species, unlike a reference genome that captures one individual organism. Pan-genomes are especially important for a very diverse species like apple.

The team identified about 50,000 genes in the pan-genome of the domesticated apple, including about 2,000 that were not present in previously published reference genomes for apple species. "These 'missing genes' turn out to be really important, because many of them determine the traits of greatest interest to apple breeders," Fei said.

Using RNA extracted from different stages of Gala fruits, they also identified genes linked to texture, aroma and other fruit characteristics in which one of the two gene copies was preferentially expressed.

"That provides us and breeders with an even deeper understanding of the genetic diversity underlying a particular trait," Zhong said.
"The findings will help our group better manage and curate more than 6,000 apple accessions in the USDA Geneva Clonal Repository," Zhong adds, "as well as enable us to provide critical genetic and genomic information associated with the accessions to breeders and other researchers."

Credit: 
Boyce Thompson Institute

New simulation finds max cost for cost-effective health treatments

UNIVERSITY PARK, Pa. -- As health care costs balloon in the U.S., experts say it may be important to analyze whether those costs translate into better population health. A new study led by a Penn State researcher analyzed existing data to find a dividing line - or "threshold" - for what makes a treatment cost-effective or not.

David Vanness, professor of health policy and administration, led a team of researchers that created a simulation to consider health care treatment costs, insurance premiums, quality of life, and life expectancy to explore whether a treatment delivers enough value for its costs to be considered beneficial for population health.

According to Vanness, the term "treatment cost" in this research incorporates all the costs and savings related to a treatment. For example, the cost of a treatment to lower blood cholesterol would include the price of the treatment itself but also take into account potential savings from preventing a heart attack and its subsequent treatment.

"We know that we are spending more and more on health care in the U.S. and that we're getting less and less for it," Vanness said. "We do a good job of developing new treatments in this country, but we don't do a good job of covering everybody or making sure that people have access to basic health care. We're spending a lot on our medical treatments, but many of those treatments just don't have a lot of value."

Vanness added that in order to improve a population's health without spending too much, it's important to be able to tell whether the prices drug and device manufacturers are charging are justified by what they deliver in health improvements.

The researchers found that in their simulation, for every $10,000,000 increase in health care expenditures, 1,860 people became uninsured. This led to five deaths, 81 quality-adjusted life-years lost due to death, and 15 quality-adjusted life-years lost due to illness. In health care economics, one quality-adjusted life-year (QALY) is equal to one year of perfect health.

Vanness said these results -- recently published in the Annals of Internal Medicine -- suggest a cost-effectiveness threshold of $104,000 per QALY.
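That threshold follows from straightforward arithmetic on the simulated losses; a minimal sketch using only the figures quoted above (not the study's full simulation) is below.

```python
# Back-of-the-envelope check of the reported threshold, using only the numbers above.
spending_increase = 10_000_000   # added health care expenditure, in dollars
qalys_lost_death = 81            # QALYs lost to deaths among the newly uninsured
qalys_lost_illness = 15          # QALYs lost to illness among the newly uninsured

qalys_lost_total = qalys_lost_death + qalys_lost_illness  # 96 QALYs
threshold = spending_increase / qalys_lost_total          # ~$104,167 per QALY

print(f"~${threshold:,.0f} per QALY")  # consistent with the ~$104,000 figure reported
```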

"If a treatment is beneficial but it costs more than about $100,000 to gain one quality-adjusted life-year using that treatment, then it may not be a good deal," Vanness said. "Because our simulation was using data estimates, we wanted to come up with a range of plausible values. So anything over a range of $100,000 to $150,000 per QALY gained is likely to actually make our population's health fall."

To create the simulation, Vanness said he and the other researchers used a variety of data, starting with estimates about how likely people are to drop their insurance when their premiums go up.

"We also used evidence from the public health literature on what happens to people's health and mortality when they gain or lose health insurance," Vanness said.

The simulation then compiled that data and estimated how much the health of a population goes down when costs increase. According to Vanness, that relationship determines the cost-effectiveness threshold -- how much a treatment can cost relative to the health benefits it gives before it causes more harm than good.

The researchers said the findings could be especially important to organizations like the Institute for Clinical and Economic Review, which provides analysis to several private and public insurers to help negotiate prices with manufacturers. These organizations could use the findings as empirical evidence for what makes a treatment a good value in the U.S.

"Moving forward, I think some changes could be made to national policy to make cost effectiveness analysis more commonly used," Vanness said. "Our goal is to get that information out there with the hope that somebody is going to use it to help guide coverage or maybe get manufacturers to reduce their prices on some of these drugs."

Credit: 
Penn State

A new curriculum helps surgical trainees comprehensively treat victims of firearm violence

video: Researchers at Washington University School of Medicine developed a curriculum so that surgical trainees can best treat victims of firearm violence and learn about their journey to survival.

Image: 
American College of Surgeons

CHICAGO (November 2, 2020): Firearm violence impacts not only individuals and their communities, but also the health care providers who treat them. For surgeons, treating these victims and understanding the impact of firearm violence as a public health issue requires both technical and non-technical skills. Researchers at Washington University School of Medicine, St. Louis, Mo., developed a multidisciplinary curriculum to train surgical residents so that they can best treat victims of firearm violence and feel confident in contributing to the national conversation on firearm violence as a public health problem. The study findings appear as an "article in press" on the Journal of the American College of Surgeons website ahead of print.

The Anatomy of Gun Violence (AGV) curriculum was developed at Washington University School of Medicine to teach surgical trainees about managing firearm injuries while also understanding the injuries within the context of the public health epidemic of firearm violence. To achieve these goals, the curriculum was delivered over six weeks to general surgery residents in the 2017-18 and 2018-19 academic years. The curriculum contains multiple educational methods: a core curriculum of didactic lectures, mock oral examinations, a bleeding control training session, a Resuscitative Endovascular Balloon Occlusion of the Aorta (REBOA) training session, a gun violence survivor session, and the Surgery for Abdominal-thoracic ViolencE (SAVE) simulation lab, along with other specialty programs. The authors believe this is the first effort to teach surgical residents about firearm violence as a disease process within its social context.

"We wanted to create a more holistic curriculum that not only involved epidemiological aspects of firearm violence, but also preventative medicine and the impact it has on the emotional and psychosocial parts of our lives," said lead study author Emily J. Onufer, MD, MPH, a general surgery resident at Washington University School of Medicine.

In both academic years, 60 surgical residents participated in the AGV curriculum and 41 and 36 residents, respectively, completed a survey regarding their experiences with the curriculum. The residents reported an average 7.5 percent improvement in knowledge, with junior residents showing an even larger increase. The SAVE lab, where residents are grouped in teams and complete five penetrating trauma scenarios, was the highest rated component of the curriculum. Respondents also requested a debrief after the SAVE lab to discuss best practices and a session on firearm safety, policy, and opportunities for advocacy so that they can feel confident contributing to the national conversation on firearm violence.

Dr. Onufer and her coauthors emphasized the importance of the gun violence survivor session in the curriculum. She said that surgeons have the tendency to, as a coping mechanism, depersonalize the treatment of firearm violence victims. The survivor session helped humanize and personalize the firearm violence epidemic in St. Louis.

"One of the hardest things for surgical residents is that we see these patients and we do everything we can to help them, but then we may rotate off the service and we don't find out what happens to them," said study coauthor Erin Andrade, MD, MPH, a general surgery resident at Washington University School of Medicine. "Bringing a patient back through the gun violence survivor session meant getting to hear a story come full circle. The survivor was able to come back and show us the impact we can make in a patient's life."

The AGV curriculum also contained a section on STOP THE BLEED®, training residents with hands-on practice of direct manual pressure, tourniquet application, and wound packing to stop severe bleeding. Residents assembled their own bleeding control kits during the training and, in their second year, if they had previously completed the STOP THE BLEED® session, they had the opportunity to serve as an instructor for a session with hospital environmental services workers. Serving as an instructor gave residents the chance to teach lifesaving skills to members of their community.

"The surgeon is relevant to the public health experience of violence. We can have an impact outside the operating room. I think a lot of residents who maybe never saw that possibility before now believe that," said study coauthor LJ Punch, MD, FACS, president, Power4STL, St. Louis, Mo.

Each scenario in the AGV curriculum represented actual patients who had been treated at the trauma center. Each scenario also was named after the residents who were involved in the care of that patient. Naming each scenario in this manner highlighted the tremendous impact surgical trainees have on the lives of firearm violence victims.
"I went through the operative records and found the names of the residents who were taking care of the patient from the emergency room to the operating room and I named the case after them. That is not something residents often get feedback on--that their work is the reason why this person is alive," Dr. Punch said.

The researchers are exploring ways to track the curriculum's clinical impact beyond the six-week course. They noted this is a single center experience at an urban training program with a high prevalence of firearm injuries, which could limit the curriculum's applicability in other regions. However, they emphasized that firearm violence is an epidemic across the U.S. and this curriculum provides a way to standardize the teaching of penetrating traumatic injury for surgical trainees.

Addressing firearm violence from a public health approach is also the focus of the American College of Surgeons Improving Social Determinants to Attenuate Violence (ISAVE) task force, which recently outlined steps the medical community must take to understand and address the root causes of firearm violence.

Credit: 
American College of Surgeons

Hungry plants rely on their associated bacteria to mobilize unavailable iron

image: In iron-limiting soils, plants activate an iron-deficiency response that includes production and secretion of coumarins. These compounds alter the composition of the root microbiota and elicit a microbial activity that rescues plant iron nutrition, improving plant growth.

Image: 
Christopher J. Harbort

In nature, healthy plants are awash with bacteria and other microbes, mostly deriving from the soil they grow in. This community of microbes, termed the plant microbiota, is essential for optimal plant growth and protects plants from the harmful effects of pathogenic microorganisms and insects. The plant root microbiota is also thought to improve plant performance when nutrient levels are low, but concrete examples of such beneficial interactions remain scarce. Iron is one of the most important micronutrients for plant growth and productivity. Although iron is abundant in most soils, its poor availability often limits plant growth because it occurs in forms that plants cannot take up. Thus, adequate crop yields often necessitate the use of chemical fertilizers, which can be ecologically harmful when applied in excess. Now, MPIPZ researchers led by Paul Schulze-Lefert have uncovered a novel strategy employed by plants to overcome this problem: they release substances from their roots that direct plant-associated bacteria to mobilise soil iron so that plants can easily take it up.

When confronted with iron in unavailable forms, plants mount a compensatory response to avoid iron deficiency. This starvation response involves extensive reprogramming of gene expression and the production and secretion of coumarins, aromatic compounds that are discharged from plant roots and which themselves can improve iron solubility. Interestingly, it was recently shown that coumarins are a selective force, shaping the composition of plant-associated bacterial communities. Now, it emerges that some coumarins also act as an "SOS" signal that prompts the root microbiota to support plant iron nutrition.

To first assess the contribution of the root microbiota to plant performance under iron limitation, first author Christopher Harbort and colleagues used a controlled system that allowed them to regulate both the availability of iron and the presence of root-associated bacteria. Using the laboratory model thale cress, they compared plants completely lacking bacteria with plants given a synthetic community (SynCom) of bacterial commensals that reflects the root bacterial diversity observed in nature. The authors found that addition of this bacterial SynCom strongly improved the performance of plants grown on unavailable iron, but not of those grown with readily available iron. Growing plants in association with single bacterial strains allowed them to determine that this iron-rescuing capacity is widespread among different lineages of the root microbiota. When the researchers performed the same experiments with plants compromised in the production or secretion of coumarins, the bacterial community provided no benefit. Thus, they could show that plant-secreted coumarins are responsible for eliciting nutritional assistance from bacterial commensals under iron limitation.

The authors' findings strongly suggest that the root microbiota is an integral part of how plants adapt to growth in iron-limiting soil. Furthermore, by identifying the plant-to-microbe signal for assistance, this research brings us one step closer to harnessing naturally present soil bacteria as a substitute for synthetic fertilizers. Improving plant iron nutrition could not only improve agricultural yields, but also increase the nutrient content of staple food crops, a potential strategy for tackling iron deficiency in humans as well.

Credit: 
Max Planck Institute for Plant Breeding Research

For plant and animal immune systems the similarities go beyond sensing

image: Superposition of the HeLo domains of plant (yellow), human (blue), and mouse MLKL (pink).

Image: 
Takaki Maekawa

Although profoundly different in terms of physiology, habitat and nutritional needs, plants and animals are confronted with one shared existential problem: how to keep themselves safe in the face of constant exposure to harmful microorganisms. Mounting evidence suggests that plants and animals have independently evolved similar receptors that sense pathogen molecules and set in motion appropriate innate immune responses. Now, in a study just published in the journal Cell Host & Microbe, senior author Takaki Maekawa, together with co-first authors Lisa K. Mahdi, Menghang Huang, Xiaoxiao Zhang and colleagues, has discovered that plants have evolved a family of proteins that bear a striking resemblance to proteins called mixed lineage kinase domain-like proteins (MLKLs), which trigger cell death in vertebrates as part of the immune response. In uncovering and characterizing an important new family of plant immune proteins, the authors' study, which involved collaboration with fellow MPIPZ researchers Paul Schulze-Lefert, Jane Parker and Jijie Chai, provides intriguing new insights into how plants protect themselves from microbial invaders.

Regulated cell death often accompanies immunity against infection in plants, animals and fungi. One pervasive theory suggests that highly localised cell death responses serve to strictly limit the spread of infection. Although it evolved independently in these kingdoms, this shared response also seems to involve highly similar machinery: many proteins involved in cell death in different kingdoms of life contain a so-called HeLo domain, a bundle structure made up of four helices, which causes resistance and cell death by disturbing the integrity of cellular membranes or forming ion channels.

Based on the similarities between animal and plant immune systems and on the key role played by HeLo domains in cell death, Maekawa hypothesised that plants might also contain other proteins with HeLo domains. Making use of bioinformatic and structural analysis, he and his team discovered a new family of HeLo domain-containing proteins that are widely shared among different plant species, indicating that they are important for plant physiology.

Maekawa termed the proteins plant MLKLs and, for further studies, focused on MLKLs expressed in the model plant Arabidopsis thaliana. He and his team isolated MLKL proteins from A. thaliana and determined that plant MLKLs possess the same overall protein architecture as their vertebrate counterparts and, in their inactive state, also assemble into tetrameric, likely auto-inhibited, structures. Importantly, plant MLKLs also play a role in immunity: plants in which the genes encoding these proteins were rendered non-functional by mutation were susceptible to pathogen infection.

Further investigation revealed additional similarities with vertebrate MLKLs: plant MLKLs are also trafficked to cellular membranes as part of their function, and activation of these proteins leads to cell death. Maekawa now aims to discover the molecular details underlying the function of plant MLKLs in immunity: "It will be exciting to uncover exactly how MLKLs are activated upon pathogen infection and how this activation is translated into effective plant protection."

Credit: 
Max Planck Institute for Plant Breeding Research