Culture

Arduous farm labor in the past means longer working hours today

A new study in The Economic Journal finds that societies with a history of farming crops heavily reliant on labor effort prefer harder work and longer hours.

Researchers used data from the European Social Survey, conducted every two years, from 2002 to 2014. The survey records individual-level information on a number of background characteristics, social attitudes, and human values. Researchers focused predominantly on three measures of work effort: the total number of hours respondents report normally working per week in their main jobs, their desired weekly work hours, and the difference between actual and contracted weekly work hours.

The study shows that differences in measures of work effort across European regions can be explained by variation in those regions' suitability for labor-intensive crops. Researchers measured how labor intensity varied across crops under conditions of traditional agriculture. Using information from studies by the US Department of Agriculture and from a Prussian agricultural census, they estimated the marginal returns to labor in the production of different crops, finding high labor returns for potatoes and low returns for cereal crops like oats, barley and wheat. European regions better suited to crops reliant on labor effort consistently scored higher in hours worked.
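
To make the empirical approach concrete, here is a minimal sketch of the kind of regional regression the study implies; the data, coefficients and variable names are invented for illustration and are not taken from the paper.

import numpy as np

# Illustrative only: regress reported weekly working hours on a hypothetical
# regional suitability index for labor-intensive crops.
rng = np.random.default_rng(0)
n = 200
suitability = rng.uniform(0, 1, n)                    # invented regional index
hours = 36 + 4 * suitability + rng.normal(0, 2, n)    # invented survey outcome

X = np.column_stack([np.ones(n), suitability])        # intercept + regressor
beta, *_ = np.linalg.lstsq(X, hours, rcond=None)
print(f"estimated effect of suitability on weekly hours: {beta[1]:.2f}")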

Researchers additionally studied how preferences for hard work come to persist in a society over time. They found that significant aspects of work ethic are transmitted from parents to children, with effects more pronounced among native-born respondents of native-born parents. They also found that work ethic is stronger in societies that have relied on agriculture for longer. High work ethic is correlated with weaker preferences for redistribution, suggesting a feedback loop between culture and institutions that perpetuates cultural preferences.

Ultimately, the researchers concluded that high marginal returns to labor effort in agricultural production provide an incentive to invest in a preference for work. Other things equal, societies that cultivate crops more dependent on labor effort work more hours. Preferences for longer working hours, and for more effort during those hours, can then persist through cultural transmission and institutional feedback mechanisms, even after societies have transitioned away from agriculture.

"The laborious nature of rice cultivation has been theorized to have an impact on the work ethic of those societies that have historically depended on it", says author Vasiliki Fouka. "This research shows systematically that this is true for a variety of crops, across the regions of Europe. In areas where hard work paid off, our ancestors engraved a work ethic in our culture that survives until today."

Credit: 
Oxford University Press USA

LSU Health New Orleans research shows how stress remodels the brain

New Orleans, LA -- Research led by Si-Qiong June Liu, MD, PhD, Professor of Cell Biology and Anatomy at LSU Health New Orleans School of Medicine, has shown how stress changes the structure of the brain and reveals a potential therapeutic target to prevent or reverse it. The findings are published in JNeurosci, the Journal of Neuroscience.

Working in a mouse model, Liu and her research team found that a single stressful event produced quick and long-lasting changes in astrocytes, the brain cells that clean up chemical messengers called neurotransmitters after they've communicated information between nerve cells. The stressful episode caused the branches of the astrocytes to shrink away from the synapses, the spaces across which information is transmitted from one cell to another.

The team also discovered a mechanism behind the communication disruption. They found that during a stressful event, the stress hormone norepinephrine suppresses a molecular pathway that normally produces a protein, GluA1, without which nerve cells and astrocytes cannot communicate with each other.

"Stress affects the structure and function of both neurons and astrocytes," notes Dr. Liu. "Because astrocytes can directly modulate synaptic transmission and are critically involved in stress-related behavior, preventing or reversing the stress-induced change in astrocytes is a potential way to treat stress-related neurological disorders. We identified a molecular pathway that controls GluA1 synthesis and thereby astrocyte remodeling during stress. This suggests new pharmacological targets for possible prevention or reversal of stress-induced changes."

She says that since many signaling pathways are conserved throughout evolution, the molecular pathways that lead to astrocyte structural remodeling and suppression of GluA1 production may also occur in humans who experience a stressful event.

"Stress alters brain function and produces lasting changes in human behavior and physiology," Liu adds. "The experience of traumatic events can lead to neuropsychiatric disorders including anxiety, depression and drug addiction. Investigation of the neurobiology of stress can reveal how stress affects neuronal connections and hence brain function. This knowledge is necessary for developing strategies to prevent or treat these common stress-related neurological disorders."

Credit: 
Louisiana State University Health Sciences Center

Tandem solar cell world record: New branch in the NREL chart

image: The CIGS-Pero tandem cell was realised in a typical lab size of 1 square centimeter.

Image: 
HZB

Tandem cells combine two different semiconductors that convert different parts of the light spectrum into electrical energy. Metal-halide perovskite compounds mainly use the visible parts of the spectrum, while CIGS semiconductors mainly convert the infrared light. CIGS cells, which consist of copper, indium, gallium and selenium, can be deposited as thin films with a total thickness of only 3 to 4 micrometers; the perovskite layers are much thinner still, at 0.5 micrometers. The new tandem solar cell made of CIGS and perovskite thus has a thickness of well below 5 micrometers, which would allow the production of flexible solar modules.

"This combination is also extremely light weight and stable against irradiation, and could be suitable for applications in satellite technology in space", says Prof. Dr. Steve Albrecht, HZB. These results, obtained in a big collaboration, have been just published in the renowned journal JOULE.

"This time, we have connected the bottom cell (CIGS) directly with the top cell (perovskite), so that the tandem cell has only two electrical contacts, so-called terminals", explains Dr. Christian Kaufmann from PVcomB at HZB, who developed the CIGS bottom cell with his team and he adds "Especially the introduction of rubidium has significantly improved the CIGS absorber material".

Albrecht and his team deposited the perovskite layer directly onto the rough CIGS layer in the HySPRINT lab at HZB. "We used a trick that we had previously developed," explains Dr. Marko Jošt, a former postdoc in Albrecht's group who is now a scientist at the University of Ljubljana, Slovenia. They applied so-called SAM molecules to the CIGS layer; these form a self-organised monomolecular layer that improves the contact between perovskite and CIGS.

The new perovskite CIGS tandem cell achieves an efficiency of 24.16 percent. This value has been officially certified by the CalLab of the Fraunhofer Institute for Solar Energy Systems (ISE).

Since such "2-terminal" tandem cells made of CIGS and perovskite now represent a separate category, the US National Renewable Energy Laboratory (NREL) has created a new branch on the famous NREL chart for them. This chart shows the development of efficiencies for almost all solar cell types since 1976. Perovskite compounds have only been included since 2013 - the efficiency of this material class has increased more steeply than that of any other material.

Credit: 
Helmholtz-Zentrum Berlin für Materialien und Energie

Boson particles discovery provides insights for quantum computing

image: A U.S. Army project conducted at Penn State discovered that a class of particles known as bosons can behave as an opposite class of particles called fermions when forced into a line. This finding provides a key insight for the development of quantum devices and quantum computers. Researchers at Penn State use this apparatus to create an array of ultra-cold one-dimensional gases made up of atoms.

Image: 
Photo courtesy Nate Follmer, Penn State

RESEARCH TRIANGLE PARK, N.C. -- Researchers working on a U.S. Army project discovered a key insight for the development of quantum devices and quantum computers.

Scientists found that a class of particles known as bosons can behave as an opposite class of particles called fermions, when forced into a line.

The research, conducted at Penn State University and funded in part by the Army Research Office, an element of U.S. Army Combat Capabilities Development Command's Army Research Laboratory, found that when the internal interactions among bosons in a one-dimensional gas are very strong, their velocity distribution transforms into that of a gas of non-interacting fermions when they expand in one dimension. The research is published in the journal Science.

"The performance of atomic clocks, quantum computers and quantum systems rely upon the proper curation of the properties of the chosen system," said Dr. Paul Baker, program manager, atomic and molecular physics at ARO. "This research effort demonstrates that the system statistics can be altered by properly constraining the dimensions of the system. In addition to furthering our understanding of foundational principles, this discovery could provide a method for dynamically switching a system from bosonic to fermionic to best meet the military need."

The researchers experimentally demonstrated that, when bosons expand in one dimension--the line of atoms is allowed to spread out and become longer--they can form a Fermi sea.

"All particles in nature come in one of two types, depending on their spin, a quantum property with no real analogue in classical physics," said David Weiss, Distinguished Professor of Physics at Penn State and one of the leaders of the research team. "Bosons, whose spins are whole integers, can share the same quantum state, while fermions, whose spins are half integers, cannot. When the particles are cold or dense enough, bosons behave completely differently from fermions. Bosons form Bose-Einstein condensates, congregating in the same quantum state. Fermions, on the other hand, fill available states one by one to form what is called a Fermi sea."

The research team created an array of ultracold one-dimensional gases made up of bosonic atoms (Bose gases) using an optical lattice, which uses laser light to trap the atoms. In the light trap, the system is at equilibrium and the strongly interacting Bose gases have spatial distributions like fermions, but still have the velocity distributions of bosons. When the researchers shut off some of the trapping light, the atoms expand in one dimension. During this expansion, the velocity distribution of the bosons smoothly transforms into one identical to that of fermions.

"By fully understanding the dynamics of one-dimensional gases, and then by gradually making the gases less integrable, we hope to identify universal principles in dynamical quantum systems," Weiss said.

Dynamical, interacting quantum systems are an important part of fundamental physics. They are also increasingly relevant technologically, as many actual and proposed quantum devices are based on them, including quantum simulators and quantum computers.

"We now have experimental access to things that if you would have asked any theorist working in the field ten years ago 'will we see this in our lifetime?' they would have said 'no way,'" said Marcos Rigol, professor of physics at Penn State and the other leader of the research team.

Credit: 
U.S. Army Research Laboratory

Asian universities close gap on US schools in world rankings by increasing STEM funding

BUFFALO, N.Y. - China and South Korea are surging in the international brain race for world-class universities, as schools in the East Asian nations are replacing institutions in the United States in international college rankings, according to new University at Buffalo-led research.

The research, which analyzed the effects of government policy on universities across the globe, found that China and Korea raised the number of their universities among the top 500 schools in the world through increased government funding and a focus on developing research programs in science, technology, engineering and math (STEM) fields.

However, U.S. universities continue to dominate the top 100 rankings, suggesting limitations to the approach taken by China and South Korea, says Jaekyung Lee, PhD, lead researcher and professor of counseling, school and educational psychology in the UB Graduate School of Education.

"China has already surpassed Japan in world rankings and is closing the gap with the U.S. fast," says Lee. "Yet the 'Asian catch-up model' of building world-class universities relies heavily on government funding and central planning without creating an environment for intellectual autonomy and sustainable innovation. Chinese and Korean schools are hardly seen among the top 100 universities. The model may work better for the early stages of development, but not for the advanced stages that require innovation and leadership."

Closing the gap on Western universities

For policymakers in many East Asian nations, research universities are viewed as a key driving force for economic development, says Lee.

Using U.S. or other Western top-tier research universities as benchmarks, schools in East Asian countries made strategic investments in higher education with a priority toward STEM programs to create their own world-class universities.

To examine the effectiveness of the Asian catch-up model, the researchers reviewed the QS World University Rankings from 2008-14 and the Academic Ranking of World Universities from 2003-13. The study, published in March in Educational Research for Policy and Practice, also examined the number of academic citations - a critical factor in ranking methodology - and funding spent on university research for the U.S., China, South Korea and Japan.

China experienced the largest rise in the number of entries in world rankings. The increase was supported by several government initiatives that poured more than $20 billion in funding into more than 100 institutions. The funds were concentrated in STEM disciplines and fueled a 94% increase in research publications. These universities produced 8.6% of the world's research citations in 2012, a dramatic rise from 0.8% in 1996.

South Korean universities also improved in international rankings, as the nation added three schools to the top 500 lists. To increase the competitiveness of its institutions, South Korea invested $1.2 billion in handpicked university programs, funding graduate student stipends and scholarships and improving research infrastructure. The country produced 2.2% of the world's research citations in 2012, quadruple its 1996 share.

The rise of Chinese and South Korean universities coincided with a drop in the number of Japanese schools in the top 500 rankings. An early adopter of the catch-up model, Japan was the initial leader in higher education among East Asian countries. During the previous two decades, Japan shifted from university-wide support to the funding of select research programs. Despite a slight increase in government support, Japanese institutions experienced a decrease in citations.

The U.S. maintained the highest number of universities in the top 100 and 500 rankings, even with several schools dropping from the lists. Unlike East Asian nations that focused on rankings, research and graduate education, U.S. policy initiatives prioritized undergraduate education with an emphasis on graduation and retention rates and job placement. In fact, the U.S. did not actively compete in the international brain race and few U.S. universities benchmarked themselves against peer institutions in other nations, says Lee.

U.S. federal and state governments continue to invest billions of dollars in university research, but the U.S. share of the world's academic citations was nearly halved between 1996 and 2012, falling from 41% to 24%.

Stronger growth in China may be attributed to its strategy of lifting whole universities, whereas Japan and South Korea concentrated funding on select research programs, says Lee. Japan's progress may also have been limited by the maturation of its higher education system and weaker financial incentives.

"In spite of the rapid growth in university rankings by Chinese and Korean universities, progress was limited to the second and third tiers," says Lee. "This finding might be related to the diminishing returns between citations and rankings. South Korea and China may fall into the trap of benchmarking, following Japan's suit if they fail to evolve from the 'catch-up' model to 'first mover' strategies for leading innovations."

Limitations of STEM

Although the development of STEM programs helped East Asian universities rise in international rankings, concentrating funding on STEM programs typically harmed institutional success. China, South Korea and Japan are outliers, says Lee.

China's government allocated 100% of its research funding to top universities with concentrations on STEM disciplines, whereas South Korea and Japan allotted 62% and 35%, respectively. In contrast, the U.S. universities in the top 100 rankings were more likely to have a greater balance in funding between STEM fields, the humanities and social sciences.

An underlying cause of the imbalance, says Lee, may be the language barriers and biases that restrict access to international scholarly networks and journals for non-English speakers in non-STEM fields.

"Asian nations should reframe the question for world-class university development to, 'How should we distinguish ourselves from our American counterparts?'" says Lee. "This strategic differentiation is more likely to create win-win results when each nation pursues more comprehensive yet distinctive world-class universities. Then, the challenge is not so much catching up with established leaders as distinguishing from one's peers."

Credit: 
University at Buffalo

Being right-brained or left-brained comes down to molecular switches

image: Dr. Viviane Labrie

Image: 
Courtesy of Van Andel Institute

GRAND RAPIDS, Mich. (April 14, 2020) -- Scientists may have solved one of the most puzzling and persistent mysteries in neuroscience: why some people are "right-brained" while others are "left-brained."

The answer lies in how certain genes on each side of the brain are switched "on" and "off" through a process called epigenetic regulation. The findings may explain why Parkinson's disease and other neurological disorders frequently affect one side of the body first, a revelation that has far-reaching implications for development of potential future treatments.

The study was led by Van Andel Institute's Viviane Labrie, Ph.D., and published in the journal Genome Biology.

"The mechanisms underlying brain asymmetry have been an elephant in the room for decades," Labrie said. "It's thrilling to finally uncover its cause, particularly given its potential for helping us better understand and, hopefully one day, better treat diseases like Parkinson's."

Each cell in the brain has the same genes, but it is epigenetics that dictates whether those genes are switched "on" or "off." Labrie and her collaborators found numerous epigenetic differences between the hemispheres of healthy brains that are linked to variations in gene activity. Notably, these differences, or asymmetry, could make one side of the brain more vulnerable to neurological diseases.

For example, epigenetic abnormalities on one side of the brain could make that hemisphere more susceptible to the processes that cause the death of brain cells in Parkinson's. The differences in cell death across hemispheres lead to the appearance of the disease's hallmark symptoms, such as tremor, on one side of the body before the other. As the disease progresses, symptoms on the side first affected are often more severe than symptoms on the other side of the body.

The findings also give scientists a vital window into the various biological pathways that contribute to symptom asymmetry in Parkinson's, including brain cell development, immune function and cellular communication.

"We all start out with prominent differences between the left and right sides of our brains. As we age, however, our hemispheres become more epigenetically similar. For Parkinson's, this is significant: people whose hemispheres are more alike early in life experienced faster disease progression, while people whose hemispheres were more asymmetric had slower disease progression," Labrie said. "Many of these changes are clustered around genes known to impact Parkinson's risk. There is huge potential to translate these findings into new therapeutic strategies."

Labrie is already starting to look at this phenomenon in other neurological diseases like Alzheimer's.

The study is one of the first to parse the molecular causes of brain asymmetry. Early research on the left versus right brain was conducted in the mid-20th century by Roger Sperry, whose groundbreaking work with split-brain patients earned him a Nobel Prize.

Credit: 
Van Andel Research Institute

Economic growth is incompatible with biodiversity conservation

The increase in resource consumption and polluting emissions that results from economic growth is not compatible with biodiversity conservation. However, most international policies on biodiversity and sustainability advocate economic growth. These are the main conclusions of the study 'Biodiversity policy beyond economic growth', published this week in the scientific journal Conservation Letters. This contradiction became clear after a review of international scientific and policy work on the subject. The study was coordinated by Iago Otero - a researcher at the Centre interdisciplinaire de recherche sur la montagne of the University of Lausanne, Switzerland - and involved 22 professionals from some 30 research centres in 12 countries, specialised in conservation ecology and ecological economics. Participants in the project include, among others, Katharine N. Farrell, from the University of Rosario (Colombia), Lluís Brotons, researcher from CSIC at CREAF, Giorgos Kallis from ICTA-UAB and Beatriz Rodríguez-Labajos, researcher from ICTA-UAB and the University of California, Berkeley.

The document recommends that the IPBES (Intergovernmental Platform on Biodiversity and Ecosystem Services) - the IPCC of biodiversity - incorporate in its reports a scenario that goes beyond economic growth, as part of its current work to envision the future of biodiversity. So far, projections of change in biodiversity have assumed that the economy has to grow and have sought policy options that minimize biodiversity loss without compromising economic growth. Instead, the article recommends beginning with conservation and social welfare objectives and then looking at what economic trajectories might meet them. "This can mean positive or negative rates of Gross Domestic Product growth," says Iago Otero, leader of the study, adding that more and more voices in IPBES are calling for "replacement of this economic indicator with new welfare paradigms."

Taking the last 170 years in the United States as an example, the research team reflects on the meaning of continued economic growth that is clearly associated with biodiversity loss but whose contribution to social progress has stagnated since the late 1970s.

Alternatives for conserving biodiversity

The article outlines seven alternative proposals to ensure prosperity beyond growth and halt the loss of biodiversity. They translate into the following national and international actions by diverse communities, NGOs, researchers and companies:

1. Limit the commercialization of resources at an international level.

All products embody a certain amount of resources and land use necessary for their production. The paper proposes establishing absolute caps on these amounts in the products marketed and allocating them by country. It is argued that less international trade reduces resource extraction and the spread of invasive species.

2. Restrict the activity of extractive industries in areas of high biodiversity.

Putting in place clear limitations and removing subsidies to unsustainable extractive industries helps to curb habitat loss and fragmentation. Moratoriums on extraction could also be introduced in highly sensitive regions.

3. Slow down the expansion of major infrastructure.

Re-examine in detail the need for new major infrastructure (airports, dams, motorways) and its impact on sensitive ecosystems and human communities. In addition, protect areas that are still free of roads, to prevent the rapid loss of their biodiversity and endangered cultures.

4. Reduce and share the work.

Promoting legislation that reduces the working week and supporting companies that implement work sharing schemes can reduce environmental pressure and impacts on biodiversity.

5. Promote agro-ecological development and food sovereignty.

Encourage government support for sustainable agricultural systems and local and organic food, through regulations and subsidies and by adjusting tax systems accordingly. This seeks to shorten production chains, using criteria of biodiversity and sustainability, reduce pressure from agricultural and livestock production and promote diversity within species, between species and of landscapes.

6. Prioritize compact urban planning and shared use of housing.

Promote efficient land use through integrated collective housing solutions, rent control and limits on the land available for urbanization and peri-urban expansion. Reduce the pressure of urbanization on peri-urban agricultural land.

7. Report on the impact of production on biodiversity.

Tax product advertisements that lead to overexploitation of species and lands. Increase awareness of the effects of products on biodiversity through better labelling and information campaigns. Promote education programmes on responsible consumption.

Credit: 
Universitat Autonoma de Barcelona

Scoring system empowers surgery departments to prioritize medically necessary operations

image: Medically-Necessary, Time-Sensitive Procedures: Scoring System to Ethically and Efficiently Manage Resource Scarcity and Provider Risk During the COVID-19 Pandemic.

Image: 
American College of Surgeons

CHICAGO (April 14, 2020): A team of investigators at the University of Chicago (Ill.) has devised a new scoring system that helps surgeons across surgical specialties decide when to proceed with medically necessary operations in the face of the resource constraints and increased risk posed by the Coronavirus Disease 2019 (COVID-19) pandemic. The process, called Medically Necessary Time-Sensitive (MeNTS) Prioritization, is published as an "article in press" on the Journal of the American College of Surgeons website ahead of print.

In the midst of the COVID-19 pandemic, hospitals must make sure they can care for the influx of patients who have advanced viral infection and therefore may require intensive care and the use of ventilators. Hospitals also must ensure that physicians, nurses, and other staff are not subjected to unnecessary risk of infection. At the same time, some patients who are not currently hospitalized still need surgical care that should not be delayed for an excessive amount of time.

Decisions to proceed with MeNTS surgery at the present time are being made on a case-by-case basis, with surgeons following guidelines developed by individual surgical specialties, such as the breast cancer surgery triage recommendations developed by the COVID-19 Pandemic Breast Cancer Consortium and released on April 14. Prior to that release, the ACS released an overall recommendation on March 13 that hospitals, health systems, and surgeons plan to minimize, postpone, or cancel elective operations until it is clear that the health care infrastructure can support critical care needs. This recommendation was followed by a more detailed guidance document released by the ACS on March 17 to aid surgical decision making in triaging operations, which features an Elective Surgery Acuity Scale (ESAS) from St. Louis University.

The new methodology described by University of Chicago surgeons addresses what many call "elective" surgical procedures and is designed to guide both surgeons within a specialty and OR leaders across different specialties. "The majority of surgical procedures are done because of disease processes that do not have good nonsurgical treatment options. If you delay these procedures, that itself can lead to problems and complications. If cancer surgery is postponed indefinitely, for example, there is the potential risk that the disease will become more advanced.

"If a patient has pain in the hip or knee, the additional restrictions on mobility, not to mention the pain itself, are real issues. Although we talk about these operations as being 'elective,' that doesn't mean they are optional. It's just a matter of the surgeon and the patient having the opportunity to elect the time when the operation should take place. The procedures are more aptly called medically-necessary and time-sensitive," explained Vivek N. Prachand, MD, FACS, lead author of the article, and professor of surgery and chief quality officer at University of Chicago Medicine and Biological Sciences.

The MeNTS Prioritization process was created by a team of six representatives from general surgery, vascular surgery, surgical oncology, transplantation, cardiac surgery, otolaryngology, and surgical ethics. The team reviewed studies of the effect of COVID-19, as well as severe acute respiratory syndrome, on hospital resources, health care providers, surgical procedures, and surgical patients in Asia and Europe, and identified 21 factors related to outcome, risk of viral transmission to health care professionals, and use of resources.

"We studied how patients undergoing surgery might potentially be at increased risk of postoperative problems if they had COVID-19. We looked at surgical procedures individually and whether these operations routinely require an ICU stay; other currently scarce hospital resources; and/or general anesthesia, which increases the risk for spreading the virus to the health care team. We also thought about the disease process itself: how effective are non-surgical options? Would a wait of two weeks or six weeks make the operation riskier or more difficult to perform and increase the chance a patient might have complications or have to stay in the hospital longer?" Dr. Prachand added.

Each of the 21 factors is scored on a scale of 1 to 5, and the total score, ranging from 21 to 105, is computed for each case. The higher the score, the greater the risk to the patient, the higher the utilization of health care resources, and the higher the chance of viral exposure to the health care team. (See the linked sample MeNTS worksheet for a full list of the factors that are scored.)
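
As a rough illustration of the arithmetic (the factor names below are invented placeholders; the linked worksheet defines the actual 21 factors), the tally works like this:

from typing import Dict

def ments_score(factor_scores: Dict[str, int]) -> int:
    """Sum 21 factors, each scored from 1 (lowest) to 5 (highest)."""
    if len(factor_scores) != 21:
        raise ValueError("MeNTS uses exactly 21 factors")
    if any(not 1 <= s <= 5 for s in factor_scores.values()):
        raise ValueError("each factor is scored on a 1-to-5 scale")
    return sum(factor_scores.values())  # totals range from 21 to 105

# Hypothetical case in which every factor scores 2: total 42, a relatively
# favorable (lower-risk, lower-resource) candidate for proceeding.
example = {f"factor_{i}": 2 for i in range(1, 22)}
print(ments_score(example))  # 42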

University of Chicago surgeons have been using MeNTS for about two weeks, and they have increased the number of medically necessary, time-sensitive operations to about 15 per day, including colon resection for a painful bleeding cancer, removal of an infected hip replacement, urgent stereotactic brain biopsy of a relatively quickly growing diffuse brain tumor, and repair of a lacerated finger tendon.

The scoring system has been welcomed by University of Chicago specialties not originally involved in its creation, such as orthopedics, gynecology, and anesthesiology. "The MeNTS process gives our anesthesiology colleagues more reassurance that we are taking into consideration their risk in the care of certain patients. It also helps surgical trainees understand that decisions are being made in an equitable and transparent way," Dr. Prachand said.

MeNTS may be used by any facility that is performing medically necessary, time-sensitive operations. "The nice thing about the system is that it applies not only to academic medical centers in big cities. It can be applied anywhere. The same assessment of resources is true wherever you practice. The factors are not hospital- or practice-environment-specific. These factors are fundamental and straightforward and can help surgeons and hospitals provide the surgical care that patients need both now in the thick of the pandemic as well as when we get to the other side of the peak," Dr. Prachand said.

Credit: 
American College of Surgeons

Self-isolation or keep calm and carry on -- the plant cell's dilemma

image: A cell in which the plasmodesmal connections are open (the green fluorescent protein that the cell produces moves into the surrounding cells) and an isolated cell, in which the connections are closed

Image: 
John Innes Centre

Self-isolation in the face of a marauding pathogen may save lives but it comes at the expense of life-sustaining essentials such as transport, communication and connectivity.

This leaves decision makers with a dreadful dilemma as they judge when it's time to relax lockdown measures.

New research suggests plants must balance similar trade-offs as they respond to pathogens that could rip through their defences cell by cell.

Plant cells communicate with their neighbours by tunnel-like connections called plasmodesmata. This is one way that cells exchange information and resources.

Plasmodesmata are lined by the same membrane that surrounds the cell and they allow molecules to move from one cell into the surrounding cells.

When a cell perceives a threat, like an invading fungus or bacterium, the plasmodesmata close over and the cells are temporarily isolated.

In this study researchers at the John Innes Centre used bioimaging approaches to investigate what proteins are involved in this process of cellular self-isolation.

They show that the cell wall material of fungi - called chitin - triggers different responses in the membrane that lines the plasmodesmal tunnels compared with those it triggers in the membrane that surrounds the cell body.

The signaling cascade in plasmodesmata triggers the production of a polysaccharide called callose, which forces the plasmodesmal tunnels to close over and the cells to isolate themselves.

"This indicates that cells control their connectivity independently of other responses, although we don't yet know why this is," explains Dr Christine Faulkner of the John Innes Centre.

The study also finds that guard-like receptors that sit in the plasmodesmata are different from those that sit in the rest of the membrane, but both receptors use the same enzyme.

"This is puzzling", says Dr Faulkner, "but we also discovered that the mechanism of activation of this enzyme in the plasmodesmata is different to the mechanism used in the rest of the membrane. Thus, it seems that while both receptors use the same tool (the enzyme) to transmit a signal, they use it differently for different purposes."

The requirement for specific signaling in the plasmodesmal part of the cell membrane suggests that the vital processes requiring cell-to-cell connectivity must be regulated independently of the immune response.

The study concludes: "This raises questions about whether there is a critical requirement for cells to balance connectivity and resource exchange with a protective mechanism imposed by isolation."

Credit: 
John Innes Centre

Predicting the evolution of genetic mutations

image: The algorithm called "minimum epistasis interpolation" results in a visualization of how a protein could evolve to either become highly effective or not effective at all. They compared the functionality of thousands of versions of the protein, finding patterns in how mutations cause the protein to evolve from one functional form to another.

Image: 
McCandlish lab/CSHL, 2020

Quantitative biologists David McCandlish and Juannan Zhou at Cold Spring Harbor Laboratory have developed an algorithm with predictive power, giving scientists the ability to see how specific genetic mutations can combine to make critical proteins change over the course of a species's evolution.

Described in Nature Communications, the algorithm called "minimum epistasis interpolation" results in a visualization of how a protein could evolve to either become highly effective or not effective at all. They compared the functionality of thousands of versions of the protein, finding patterns in how mutations cause the protein to evolve from one functional form to another.

"Epistasis" describes any interaction between genetic mutations in which the effect of one gene is dependent upon the presence of another. In many cases, scientists assume that when reality does not align with their predictive models, these interactions between genes are at play. With this in mind, McCandlish created this new algorithm with the assumption that every mutation matters. The term "Interpolation" describes the act of predicting the evolutionary path of mutations a species might undergo to achieve optimal protein function.

The researchers created the algorithm by testing the effects of specific mutations occurring in the genes that make the streptococcal GB1 protein. They chose the GB1 protein because of its complex structure, which generates an enormous number of possible mutations that can be combined in a vast number of ways.

"Because of this complexity, visualization of this data set became so important," says McCandlish. "We wanted to turn the numbers into a picture so that we can understand better what [the data] is telling us."

[Video - Visualizing the evolution of a protein: https://www.youtube.com/watch?v=0miHVrncrhY]

The visualization is like a topographic map. Height and color correlate with the level of protein activity, and the distance between points on the map represents how long it takes for the mutations to evolve to that level of activity.

The GB1 protein begins in nature with a modest level of protein activity, but may evolve to a level of higher protein activity through a series of mutations that occur in several different places.

McCandlish likens the evolutionary path of the protein to hiking, where the protein is a hiker trying to get to the highest or best mountain peaks most efficiently. Genes evolve in the same manner: with a mutation seeking the path of least resistance and increased efficiency.

To get to the next best high peak in the mountain range, the hiker is more likely to travel along the ridgeline than hike all the way back down to the valley. Going along the ridgeline efficiently avoids another potentially tough ascent. In the visualization, the valley is the blue area, where combinations of mutations result in the lowest levels of protein activity.

The algorithm shows how optimal each possible mutant sequence is and how long it will take for one genetic sequence to mutate into any of many other possible sequences. The predictive power of the tool could prove particularly valuable in situations like the COVID-19 pandemic. Researchers need to know how a virus is evolving in order to know where and when to intercept it before it reaches its most dangerous form.

McCandlish explains that the algorithm can also help "understand the genetic routes that a virus might take as it evolves to evade the immune system or gain drug resistance. If we can understand the likely routes, then maybe we can design therapies that can prevent the evolution of resistance or immune evasion."

There are additional potential applications for such a predictive genetic algorithm, including drug development and agriculture.

"You know, at the very beginning of genetics... there was all this interesting speculation as to what these genetic spaces would look like if you could actually look at them," McCandlish added. "Now we're really doing it! That's really cool."

Credit: 
Cold Spring Harbor Laboratory

Estuaries are warming at twice the rate of oceans and atmosphere

image: NSW environment officer collecting data at Bengello near Batemans Bay.

Image: 
NSW DPIE

Estuaries on the south-east coast of Australia are warming at twice the rate of oceans and the atmosphere, a new study has found.

Researchers say the apparent accelerated impact from climate change on estuaries could adversely affect economic activity and ecological biodiversity in rivers and lakes worldwide.

Dr Elliot Scanes from the University of Sydney said: "Our research shows that estuaries are particularly vulnerable to a warming environment. This is a concern not only for the marine and bird life that rely on them but the millions of people who depend on rivers, lakes and lagoons for their livelihoods around the world."

The researchers say that changes in estuarine temperature, acidity and salinity are likely to reduce the global profitability of aquaculture and wild fisheries. Global aquaculture is worth $US243.5 billion a year and wild fisheries, much of which occur in estuaries, are worth $US152 billion. More than 55 million people globally rely on these industries for income.

Professor Pauline Ross, who leads the research group in the School of Life and Environmental Sciences, said: "Estuaries provide services of immense ecological and economic value. The rates of change observed in this study may also jeopardise the viability of coastal vegetation such as mangroves and saltmarsh in the coming decades and reduce their capacity to mitigate storm damage and sea-level rise."

The results are based on 12 years of recording temperatures in 166 estuaries along the entire 1100-kilometre stretch of the New South Wales coast in south-eastern Australia. In that time more than 6200 temperature observations were taken.

The data, which are publicly available, were taken by field officers of the NSW Department of Planning, Industry and the Environment and used in a marine research collaboration with the University of Sydney.

On average, the estuary systems experienced a 2.16-degree temperature increase, about 0.2 degrees each year.
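
For readers curious how such a trend is typically estimated, here is a minimal sketch with invented numbers (the NSW dataset itself is public but not reproduced here); a linear fit to scattered observations recovers the per-year warming rate:

import numpy as np

# Illustrative only: fit a linear warming trend to temperature observations.
rng = np.random.default_rng(1)
years = rng.uniform(2007, 2019, 6200)                       # observation dates
temps = 18.0 + 0.18 * (years - 2007) + rng.normal(0, 1.5, 6200)

slope, intercept = np.polyfit(years, temps, 1)
print(f"warming trend: {slope:.2f} deg C per year, "
      f"or {slope * 12:.2f} deg C over a 12-year record")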

Dr Elliot Scanes said: "This is evidence that climate change has arrived in Australia; it is not a projection based on modelling, but empirical data from more than a decade of investigation."

Studies of specific lake and river systems have found evidence of warming, such as along the North Sea in Germany, in the Hudson River in New York and in Chesapeake Bay, Maryland. This is the world's first long-term study to consider a diverse range of estuary types on such a large scale.

It is published today in Nature Communications.

"This increase in temperature is an order of magnitude faster than predicted by global ocean and atmospheric models," Dr Elliot Scanes said.

According to the Australian Bureau of Meteorology, air and sea temperatures in Australia have increased by about 1 degree since 1910. And over the past decade, air temperatures have increased by 1.5 degrees compared with the 1961 to 1990 average.

"Our results highlight that air or ocean temperatures alone cannot be relied upon to estimate climate change in estuaries; rather, individual traits of any estuary need to be considered in the context of regional climate trends," Dr Elliot Scanes said.

"New models will need to be developed to help predict estuarine changes."

The study also found that acidification of estuaries was increasing by 0.09 pH units a year. There were also changes to the salinity of estuary systems: creeks and lagoons became less saline while river salinity increased.

Temperature increases in estuaries also depended on the type, or morphology, of the system, the study found.

Professor Ross said: "Lagoons and rivers increased in temperature faster than creeks and lakes because they are shallower with more limited ocean exchange."

She said that this suggests industries and communities that rely on shallow estuaries for culture, income and food could be particularly vulnerable during global warming.

"This is of concern in other dry temperate zones like the Mediterranean and South Africa where many of the estuaries are similar to those studied here," she said.

The study suggests that estuaries that remain open may also soon begin to "tropicalise", and estuarine ecosystems could become colonised by tropical marine species and reflect a warmer environment.

Professor Ross said: "This research will help local fisheries and aquaculture to develop mitigation strategies as the climate changes."

INTERVIEWS
Dr Elliot Scanes
School of Life and Environmental Sciences, The University of Sydney
elliot.scanes@sydney.edu.au

Professor Pauline Ross
School of Life and Environmental Sciences, The University of Sydney
pauline.ross@sydney.edu.au

MEDIA ENQUIRIES
Marcus Strom | marcus.strom@sydney.edu.au | +61 423 982 485

DECLARATION
The data collection for this study was funded by the NSW Department of Planning, Industry and the Environment.
Two of the research authors, Elliot Scanes and Peter Scanes, are related.

Credit: 
University of Sydney

PTSD and moral injury linked to pregnancy complications

Elevated symptoms of PTSD and moral injury can lead to pregnancy complications, a Veterans Affairs study of women military veterans has found.

Both PTSD and moral injury were predictors of adverse pregnancy outcomes such as preterm birth and gestational diabetes, while PTSD symptoms also predicted postpartum depression, anxiety, and a self-described difficult pregnancy.

The results led study author Dr. Yael I. Nillni, a researcher with the National Center for PTSD at the VA Boston Healthcare System and Boston University School of Medicine, to suggest that "screening for PTSD and moral injury during the perinatal period is important to identify women who may need treatment for these problems."

The findings appear April 14, 2020, in the Journal of Traumatic Stress.

Posttraumatic stress disorder results from experiencing traumatic events, such as military combat. Symptoms include re-experiencing the trauma through flashbacks or nightmares, numbness, sudden anger, and hyperarousal.

PTSD is more common in women veterans than civilian women. In addition to combat, experiences such as childhood abuse, military sexual trauma, and sexual harassment can cause PTSD in women veterans.

Moral injury refers to distress related to transgression of deeply held moral beliefs. It can lead to feelings of shame, guilt, and demoralization. Moral injury can result from a number of experiences, such as combat and military sexual trauma. Experiencing leadership failures or perceived betrayal by peers, the military, or the government have also been linked with moral injury in veterans. Past research has shown that a person does not need to be directly involved in a transgressive act to face moral injury. Being exposed to transgressions can also lead to moral injury.

While PTSD and moral injury frequently occur together in veterans, they are distinct conditions.

Previous VA research has shown PTSD may increase the risk of gestational diabetes, preeclampsia, and preterm birth. Some evidence suggests that moral injury can negatively impact physical health, but its effects on pregnancy have not been studied.

To better understand how these two conditions affect pregnancy, the researchers followed 318 women veterans who became pregnant within three years of separating from military service.

They found that women with elevated PTSD symptoms were at greater risk of adverse pregnancy outcomes than women with lower PTSD symptoms. Elevated symptoms of moral injury also increased the risk of adverse outcomes.

Both conditions raised the risk of gestational diabetes, preeclampsia, and preterm birth. Only PTSD increased the risk of postpartum depression, anxiety, and the perception of a difficult pregnancy.

For both PTSD and moral injury, the more severe the symptoms, the higher the likelihood of pregnancy complications.

The results were consistent with other studies on PTSD and pregnancy. In 2018, VA and the Department of Defense published clinical practice guidelines that emphasize the importance of screening for mental health conditions during pregnancy. The new findings add evidence to the idea that both PTSD and moral injury should be screened for and treated during pregnancy.

Nillni stressed the importance of screening for pregnant women both within and outside VA health care settings.

"Given that many women receive obstetric care outside of the VA," she explained, "increased awareness of the impact of PTSD and moral injury on perinatal outcomes is imperative to improve screening during this sensitive time and connect at-risk women veterans to services."

Credit: 
Veterans Affairs Research Communications

Johns Hopkins experts publish 'guidebook' for blood plasma therapy

A team of Johns Hopkins experts has created a clinical guidebook to help hospitals and medical centers rapidly scale up their ability to deliver so-called convalescent plasma therapy, which leverages immune system components found in the plasma portion of blood from people who have recovered from COVID-19 illness.

"We've received many inquiries from health care providers looking to ramp up their ability to deliver this therapy," says Evan M Bloch, M.D., M.S. an associate professor of pathology at the Johns Hopkins University School of Medicine who is part of the team working on convalescent therapy. "There is historical precedent for its use to prevent and treat viral illness. However, during the chaos of an epidemic, the therapy is often deployed without rigorously studying its effects. Carefully conducted studies are critically needed to understand which people are most likely to benefit from this therapy and how best to apply it to optimize that benefit."

The guidebook was published online April 7 in the Journal of Clinical Investigation.

In recent weeks, infectious disease expert Arturo Casadevall, M.D., Ph.D., has led a team of physicians and scientists from around the United States to establish a network of hospitals and blood banks that can begin collecting, isolating and processing blood plasma from COVID-19 survivors.

"This paper details the nuts and bolts of how to deploy convalescent plasma, and this information should be very helpful to colleagues worldwide who are preparing to use this therapy against COVID-19," says Casadevall, a Bloomberg Distinguished Professor who holds joint appointments in the Johns Hopkins Bloomberg School of Public Health and the Johns Hopkins University School of Medicine.

The U.S. Food and Drug Administration has paved the way for researchers at Johns Hopkins to proceed with clinical trials to test convalescent plasma therapy in people who are at high risk for severe COVID-19 illness and have been exposed to people who have tested positive for the virus. Like most therapies, Bloch says, convalescent blood plasma's best potential for effectiveness is early in the disease's progression. Currently, there are no proven drug therapies or effective vaccines for treating COVID-19.

The guidebook outlines a range of clinical trials underway or planned at hospitals taking part in the Johns Hopkins-led network for convalescent plasma therapy.

Among the protocols outlined in the guide are criteria for eligible donors of blood plasma, how hospitals can mobilize donors and work with local and national blood centers, methods for prescreening donors, and the risks and potential benefits of the therapy.

Bloch, also an expert on global health, says convalescent blood plasma therapy can be deployed in low-resource communities. There is a difference, however, in how blood plasma may be collected in communities with low versus high resources.

He says high-resource communities typically rely on apheresis machines to remove a donor's blood, filter the plasma from it, and return the rest of the blood, plus a replacement for the collected plasma (i.e. a protein called albumin), back to the donor. Using the apheresis method, a single donor could produce enough plasma to potentially benefit up to three other people.

In low-resource communities where apheresis machines may be unavailable, the output of plasma would be less per donor. This is because doctors have to perform a typical whole blood donation from the donor and manually separate the plasma in a laboratory by using a centrifuge machine or letting gravity separate the blood products.

Among the most common challenges to scaling up convalescent blood plasma therapy, Bloch says, is rapidly developing in-house testing for whether the blood plasma of donors contains key antibodies the immune system needs to recognize and help destroy the virus in the body. There are also logistical challenges associated with identifying donors and performing repeat COVID-19 nasal swab tests for the virus in them.

"This field is moving so fast that a problem today is solved tomorrow," says Bloch. "We aimed to publish a baseline document that can serve hospitals globally. It will, undoubtedly, evolve."

Credit: 
Johns Hopkins Medicine

Common disease prevention and cancer screening would benefit from genomic risk assessment

Many of the most common causes of death are diseases whose onset could be significantly delayed, or whose prognosis could be improved, by identifying high-risk individuals with increasing accuracy. In the current system, a considerable number of high-risk individuals cannot be identified in time, as currently available tools measure genetic risk inadequately or not at all.

The findings, published today in Nature Medicine, demonstrate that genomic information could be used to improve the selective prevention of cardiac diseases and diabetes, as well as cancer screening. The results are based on the FinnGen research project, which encompasses more than 135,000 voluntary Finnish donors of biobank samples.

The study focused on five common diseases: coronary heart disease, type 2 diabetes, atrial fibrillation, breast cancer and prostate cancer.

Previous studies have identified numerous genetic risk factors for each of these diseases. In this study, the data pertaining to all of these individual risk factors were combined into what are known as genome-wide polygenic risk scores. These scores were calculated for all 135,000 study subjects, for each of the five diseases.
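
In essence (a generic sketch, not FinnGen's pipeline; the weights and dosages below are invented), a polygenic risk score is a weighted sum over risk variants:

import numpy as np

# Effect sizes (e.g. log-odds per allele) for m variants, from a reference GWAS.
weights = np.array([0.12, -0.05, 0.30, 0.08])
# One person's genotype dosages: 0, 1 or 2 copies of each risk allele.
dosages = np.array([2, 0, 1, 1])

prs = float(weights @ dosages)  # weighted allele count
print(f"polygenic risk score: {prs:.2f}")

Real genome-wide scores sum over hundreds of thousands to millions of variants, but the arithmetic is the same.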

"In terms of cardiovascular diseases and diabetes, genomic information alone can identify individuals who have a lifetime risk of more than 60% of developing these diseases, which means that most of them will develop these diseases at some point of their lives," says the principal investigator of the study, Professor Samuli Ripatti from the University of Helsinki.

The research group also combined genetic risk data with currently known risk factors and clinical risk calculators. Adding genomic information improved the accuracy of current risk estimation approaches.

"Our findings show that the genetic risk profile was a significant factor in predicting the onset of all five diseases studied. A particular benefit was seen in the identification of individuals who develop diseases at a younger age than on average," says Nina Mars, doctor of medical science at the Institute for Molecular Medicine Finland (FIMM) of the University of Helsinki, who carried out the study.

"Personalised risk calculation engenders opportunities that are important to healthcare. Risk assessment that utilises genomic information could be employed in, for example, determining the age when breast and prostate cancer screening begins. One option is to have those with a elevated genetic risk already undergo screening earlier than instructed in the current screening recommendations", Mars states.

"A study that combines genomic and health data in such an extensive dataset is exceptional even on the global scale. From the perspective of our healthcare system, it's great to have been able to study Finnish individuals, making the results also directly applicable to Finns," says Aarno Palotie, scientific director of the FinnGen research project.

Credit: 
University of Helsinki

'Directing' evolution to identify potential drugs earlier in discovery

Scientists have developed a technique that could significantly reduce the time of discovering potential new antibody-based drugs to treat disease.

Antibodies are produced by the body in response to the presence of a disease-causing agent. They can also be synthesised in the laboratory to mimic natural antibodies and are used to treat a number of diseases.

Antibody therapies can be highly effective, but challenges can arise when promising candidate antibodies are produced at a large scale. Stresses encountered during manufacturing can disrupt the structure of these fragile proteins leading to aggregation and loss of activity. This in turn prevents them from being made into a therapeutic.

New research from an eight-year collaboration between scientists at the University of Leeds and the biopharmaceutical company AstraZeneca has resulted in a technique that allows fragments of antibodies to be screened for susceptibility to aggregation caused by structure disruption much earlier in the drug discovery process.

The approach is described in the journal Nature Communications, published today.

Dr David Brockwell, Associate Professor in the Astbury Centre for Structural Molecular Biology at the University of Leeds, led the research. He said: "Antibody therapeutics have revolutionised medicine. They can be designed to bind to almost any target and are highly specific.

"But a significant problem has been the failure rate of candidates upon manufacturing at industrial scale. This often only emerges at a very late stage in the development process - these drugs are failing at the last hurdle.

"But our research is turning that problem on its head."

When it comes to developing an antibody drug, scientists are not restricted to a single protein sequence. Fortunately, there is often an array of similar antibodies with the same ability to lock or bind tightly onto a disease-causing agent. That gives researchers a range of proteins to screen, to determine which are more likely to progress through the development process.

Professor Sheena Radford, FRS, Director of the Astbury Centre for Structural Molecular Biology, said: "The collaboration that has existed between the team of scientists within the University of Leeds and AstraZeneca demonstrates the power of industry and academia working together to tackle what has been one of the major roadblocks preventing the efficient and rapid development of these powerful therapeutic molecules."

How the target proteins are screened

The target proteins are cloned into the centre of an enzyme that breaks down antibiotics in the bacterium E. coli. This enables the scientists to directly link the antibiotic resistance of the bacteria to how aggregation-prone the antibody fragment is. A simple readout - bacterial growth on an agar plate containing antibiotic - gives an indication of whether the target protein will survive the manufacturing process. If the antibody proteins are susceptible to stress, they will unfold or clump together and become inactive, and the antibiotic will kill the bacteria. But if the protein chain is more stable, the bacteria thrive, displaying antimicrobial resistance and growing in the presence of the antibiotic.

The scientists harvest the bacteria that have survived and identify the cloned protein sequence. That indicates which protein sequences to take forward in the development pipeline. The whole cycle takes about a month and increases the chances of success.

Directed evolution

But the process can go a step further, using the idea of directed evolution.

Scientists use the idea of natural selection, in which mutations or changes take place in the proteins, sometimes making them more stable. Directed evolution could generate new, better-performing sequences that, at the current time, cannot even be imagined, let alone designed and manufactured. How does this method work? As in Darwinian natural selection, evolutionary pressure is applied - here by the antibiotic - and selects for the survival of bacteria that produce the protein variants that do not aggregate, as sketched below.
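
A conceptual sketch of such a mutate-and-select loop follows (toy code, not the Leeds/AstraZeneca protocol; the starting sequence and the "survives" rule standing in for the antibiotic readout are invented):

import random

random.seed(0)
AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def mutate(seq: str) -> str:
    """Introduce one random point substitution."""
    i = random.randrange(len(seq))
    return seq[:i] + random.choice(AMINO_ACIDS) + seq[i + 1:]

def survives(seq: str) -> bool:
    # Stand-in for "host grows on antibiotic because the insert does not
    # aggregate"; any real assay replaces this invented rule.
    return seq.count("P") < 2

population = ["MKTAYIAKQR"] * 20
for generation in range(10):
    population = [mutate(s) for s in population]        # random variation
    survivors = [s for s in population if survives(s)]  # antibiotic selection
    population = random.choices(survivors or population, k=20)  # regrow

print(f"{len(set(population))} distinct surviving sequences after 10 rounds")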

The protein sequences hosted in the bacterial cells that have shown resistance are harvested and their genes sequenced and scored, to select the best performing sequences. After a quick check to ensure that the new antibody sequences still retain their excellent binding capability for the original disease-causing target, they can be taken forward for further development.

Professor Radford said: "There is tremendous excitement about this approach. We are letting evolutionary selection change the sequence of the proteins for us, and that might make some of them more useful as drug therapies. Importantly for industry, nature does the hard work - obviating the need for so-called rational engineering, which is time- and resource-intensive.

"As we do this, we will be putting the sequence information we gather into a database. As the database gets bigger, it may well be possible with artificial intelligence and machine learning to be able to identify the patterns in protein sequences that tell us that a protein can be scaled up for pharmaceutical production without needing any experiments. That is our next challenge and one we are tackling right now."

Dr David Lowe, who led the work at AstraZeneca, said: "The screening system that we have developed here is a great example of industry and academia working together to solve important challenges in the development of potential new medicines.

"By combining AstraZeneca's antibody discovery and screening expertise, together with the Astbury Centre's world-leading knowledge of protein structure and aggregation, we have produced a high throughput method for rapidly evolving proteins with better biophysical properties that has the potential for wide scientific applicability."

Credit: 
University of Leeds