Culture

How men and women network impacts their labor market performance

A new paper in The Economic Journal, published by Oxford University Press, develops a theory of how the structure of people's social networks affects their productivity and earnings. Large, loosely connected networks provide better access to information, while smaller, tighter networks generate more peer pressure. Information is relatively more valuable in uncertain work environments; peer pressure pays off more in stable ones.

The researchers also document significant network differences by gender: loose networks are more common among men, and tight networks are more common among women. Based on this fact, the theory provides a new rationale for why men self-select more frequently into occupations involving high-risk decisions, such as finance and research, while women prefer safer settings such as health and education.

Different types of social networks are associated with distinct advantages and disadvantages in the workplace. Loose connections grant greater access to information and are therefore especially valuable in an uncertain work environment with high but risky project returns. Those with looser social networks receive more information about the value of a project beforehand, allowing them to identify which projects are worth working hard on.

In turn, a tight network, where connections are interlinked and clustered, leads to a relatively better performance in stable workplaces, where information acquisition is not crucial. The reason is that workers with tighter networks face more peer pressure as failures lead to tension not only among partners on that specific project, but also throughout the entire group. So generally, workers with tight networks put more effort into projects to prevent failure.
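The "tight versus loose" distinction above is what network scientists call clustering: in a tight network, one's contacts also know each other. A minimal sketch of how a clustering coefficient separates the two structures, using hypothetical toy graphs (the graphs and function name are invented for illustration):

```python
def avg_clustering(adj):
    """Average local clustering coefficient of an undirected graph
    given as {node: set_of_neighbors}. Nodes with fewer than two
    neighbors contribute a coefficient of 0."""
    total = 0.0
    for v, nbrs in adj.items():
        k = len(nbrs)
        if k < 2:
            continue
        # count edges among v's neighbors (each pair once)
        links = sum(1 for a in nbrs for b in nbrs if a < b and b in adj[a])
        total += 2 * links / (k * (k - 1))
    return total / len(adj)

# "Tight" network: everyone's contacts also know each other (a clique).
tight = {0: {1, 2, 3}, 1: {0, 2, 3}, 2: {0, 1, 3}, 3: {0, 1, 2}}
# "Loose" network: a hub whose contacts do not know each other (a star).
loose = {0: {1, 2, 3, 4}, 1: {0}, 2: {0}, 3: {0}, 4: {0}}

print(avg_clustering(tight), avg_clustering(loose))  # 1.0 0.0
```

The clique scores the maximum clustering of 1.0 (maximal peer pressure, per the theory), while the star scores 0.0 (each contact is an independent information source).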

Academic disciplines, which require the design and completion of projects with a priori unknown outcomes, are an example of an uncertain work environment that favors looser networks; so are management positions and jobs in finance or the arts and entertainment. The proposed theory argues that individuals with large, loose networks should outperform those with small, tight networks in these settings.

The authors uncover the new fact that men and women differ in how they build their social networks. They examine data from the Digital Bibliographic Library Browser's computer science set (438,531 men and 146,829 women), email communications from Enron (1,628 women and 2,298 men), and AddHealth's friendship networks, which draw on roughly 140 US schools (73,244 students). On average, women had tighter, more interconnected networks with high clustering, while men were more likely to form larger networks with looser connections. The study found this pattern across very different environments -- academia, a private company, and schools -- showing the pervasiveness of these gender disparities in network structure.

The study then shows that women perform worse than men in high-risk occupations and connects the differences in network structure to labour market outcomes. This suggests that networking differences across genders may be an overlooked source of wage differences, especially in high-risk occupations.

"We were surprised to learn that men's and women's networks differ in these drastic ways, with the differences being robust across very distinct environments," said the paper's authors, Ilse Lindenlaub and Anja Prummer. "We hope that our findings spark more research into the importance of network structure for labour market outcomes, not least to better understand gender gaps in the labour market."

Credit: 
Oxford University Press USA

Stronger together in the microbiome: How gut microbes feed each other to overcome dietary deficiencies, change host behavior, and improve reproduction

image: Gut bacteria use metabolic cross-feeding to overcome dietary deficiencies and to change host behavior and reproductive output

Image: 
Gil Costa

To study how the microbiome affects host behavior, a group of researchers at the Champalimaud Centre for the Unknown in Lisbon, Portugal, combined the fruit fly with high-tech tools to show that two gut bacteria establish a metabolic cross-feeding relationship. This relationship enables the bacteria to grow on diets that lack nutrients essential for their growth, and allows them to change host decision-making and reproduction. The results reveal a mechanism through which the right combination of bacteria can make the microbiome resilient to dietary perturbations and can change brain function.

A balanced intake of essential amino acids is crucial to the well-being and health of all animals. Essential amino acids are the building blocks of proteins, but they also influence how much offspring animals produce and what animals decide to eat. Intriguingly, researchers at the Champalimaud Centre for the Unknown had previously shown that the microbiome plays an important role in dictating how amino acids affect the brain. What was most puzzling was that bacteria could only affect the animal's decisions when present in specific combinations. The microbiome often contains many different species of bacteria, but why different types of bacteria are needed to influence brain function and alter host physiology remains a mystery. This is the puzzle Carlos Ribeiro and his team at the Champalimaud Centre for the Unknown set out to tackle. "To study how bacteria affect their host's physiology is a daunting task in organisms with very complex microbiomes. This is where the fly and its less complex microbiome emerges as a powerful tool. It allows us to precisely dissect the mechanisms used by the microbiota to change the host's feeding decisions," points out Sílvia Henriques, post-doctoral researcher and author of this study published today (August 25th) in the journal Nature Communications.

In the laboratory led by Carlos Ribeiro, principal investigator and senior author of this study, it was previously found that flies deprived of single essential amino acids develop a strong appetite for protein-rich foods. However, flies associated with two bacteria that are very abundant in the microbiome (Acetobacter pomorum and Lactobacillus plantarum) showed a drastically reduced preference for protein and preferred to eat sugar. "Interestingly, the association of flies with either of these bacteria alone could not reduce yeast appetite. Thus, in this new study our main focus was to understand why these two particular bacteria need to be present to change the feeding behaviour of the fly," says Ribeiro.

Work from several groups studying the microbiome, including the Ribeiro Lab, has shown that a community of bacteria, rather than isolated bacteria, is typically necessary to produce an effect on host behaviour - most likely through the specific substances bacteria produce, so-called metabolites. The team therefore set out to measure the metabolic interactions established between bacteria within the microbiome and to map how specific bacteria and their metabolites affect the animal.

To tackle these questions, the authors ran a series of elegant experiments. To follow the feeding choices of the flies, the researchers took advantage of a sensor developed in the lab - the flyPAD - and used it to measure in great detail the feeding pattern of individual flies. They then used bacterial mutants to understand the impact of specific functions of the bacterial cells on the behavior of the host. Finally, with collaborators at the University of Glasgow, they used a sophisticated technique called isotope-resolved metabolomics that enabled them to track the metabolites exchanged between the two bacteria.

"We found that the two bacteria exchange metabolites and that this cross-feeding (syntrophy) enables them to grow and act on the animal even if diets lack the nutrients that are essential for them. Specifically, we now understand that Lactobacillus strains produce lactate, which is used by the Acetobacter strains to synthesize amino acids and other metabolites. These are then used by the Lactobacillus strain, which cannot synthesize them, to continue producing lactate. Furthermore, these bacterial amino acids are very likely used by the animal for egg production. But most importantly, we now understand that the lactate is also used by the Acetobacter bacteria to change the behaviour of the fly," explains Darshan Dhakan, post-doctoral researcher and author of this study.

By establishing this cross-feeding relationship, the bacterial community becomes resilient to drastic dietary changes, enabling its growth in the intestines of animals whose diets lack nutrients essential to the bacteria's survival. Ribeiro adds, "It is well established that our diet affects both the microbiome and our brain. What makes it complicated is that the microbiome then in turn affects how diet affects us and what animals decide to eat. This makes it a very complex puzzle to solve. But by combining the right technologies with the right experimental system we can get at the heart of the mechanisms by which the microbiome interacts with our diet to affect our brain and our body. Importantly, we show that the right associations of bacteria can make the microbiome resilient to dietary perturbations, explaining why some animals and people might be more sensitive to the nutrient content of food than others. It is also a beautiful example of how nature establishes circular economies where nothing gets wasted and everybody gains."

In conclusion, this study stands as an important example of how model organisms can be used to disentangle the influence of diet on the microbiome and to understand the individual contributions of gut bacterial species to brain function and behaviour. "The methodologies used in this study will allow us to identify all the metabolic interactions established amongst bacteria and to understand the precise mechanisms responsible for altering brain function and what animals decide to eat. Those insights can then be used to guide the search for similar mechanisms in animals with much more complex microbiomes, including humans," concludes Ribeiro.

Credit: 
Champalimaud Centre for the Unknown

Measles outbreaks in Niger linked to rainfall and temperature, study finds

UNIVERSITY PARK, Pa. -- Rainfall and temperature drive agricultural activity, which, in turn, influences patterns of measles outbreaks in the West African nation of Niger, according to an international team of researchers. The findings may be useful for improving vaccine coverage for seasonally mobile populations within Niger and other countries.

"Measles is a major cause of child mortality in sub-Saharan Africa, responsible for about 62,000 deaths in the region in 2017," said Alexandre Blake, graduate student in biology, Penn State. "Yet, current immunization strategies achieve low coverage, in part, because they were designed for higher-income countries where children are vaccinated prior to attending school, instead of for highly mobile populations where the median infection age is below school age."

The researchers analyzed weekly reported measles cases at the district level in Niger from 1995 to 2004, as well as weekly cumulative rainfall and average temperature data from the National Oceanic and Atmospheric Administration. Next, they used wavelet analysis, a mathematical tool for uncovering temporal patterns hidden in large amounts of data, and regression, a statistical tool, to investigate the associations between the measles cases and the environmental data. Their findings will appear July 26 in the Journal of the Royal Society Interface.
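Wavelet analysis itself is fairly involved, but the core idea of pulling a dominant cycle out of weekly case counts can be illustrated with a simpler stand-in: a discrete Fourier periodogram. The sketch below uses synthetic data (the series, its components, and the function name are invented for illustration, not the study's actual method or data):

```python
import numpy as np

def dominant_period_weeks(series):
    """Return the period (in weeks) of the strongest cycle in a weekly series."""
    detrended = series - series.mean()
    power = np.abs(np.fft.rfft(detrended)) ** 2
    freqs = np.fft.rfftfreq(len(detrended), d=1.0)  # cycles per week
    k = np.argmax(power[1:]) + 1                    # skip the zero frequency
    return 1.0 / freqs[k]

# Synthetic stand-in for weekly case counts: a strong annual cycle
# plus a weaker multi-annual component and noise.
rng = np.random.default_rng(0)
weeks = np.arange(520)  # 10 years of weekly observations
cases = (100
         + 40 * np.sin(2 * np.pi * weeks / 52)    # annual outbreak cycle
         + 10 * np.sin(2 * np.pi * weeks / 130)   # weaker 2.5-year cycle
         + rng.normal(0, 5, weeks.size))

print(dominant_period_weeks(cases))  # 52.0 -> the annual cycle dominates
```

A wavelet transform goes further than this sketch by localizing cycles in time, which is what lets the researchers see a consistent annual pattern alongside a weaker, intermittent multi-year one.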

The team discovered a strong and consistent annual pattern of measles outbreaks that was associated with rainfall. Specifically, they found that the rainy season was associated with a lower risk of measles case reporting, whereas measles cases were higher during the dry season.

"The timing of the beginning of the measles season is consistent with a phase of annual agricultural labor migrations, when people move from rural areas to urban areas," said Blake. "Vaccination strategies that target migrating populations at this point in the season could be very powerful to break the annual pattern of outbreaks."

The researchers also observed a second, weaker and less consistent pattern of outbreaks occurring every two to three years.

"This second outbreak pattern tells us that other mechanisms are at play," said Nita Bharti, assistant professor of biology. "So even if we could vaccinate everybody before the annual agricultural migration event, there would still be measles cases that are related to other factors."

Blake noted that some of those factors could be inequalities in access to care and movement between adjacent countries, such as Nigeria, where vaccine coverage is also low.

"Niger and Northern Nigeria share languages, culture and economic activities, and as a result, there is a lot of human movement across the border," he said. "But they do not have synchronous measles vaccination efforts, which likely permits the reintroduction of the virus on both sides."

The team concluded that targeting seasonally mobile populations for immunizations could reduce the strong seasonal pattern of outbreaks in Niger and across similar settings.

"Human health and the environment are often stitched together with human behavior," said Bharti. "Understanding the nature of those relationships in sub-Saharan Africa will provide valuable insights on how to tailor interventions in this setting."

Credit: 
Penn State

Cutting surgical robots down to size

Journal

Nature Machine Intelligence

DOI

10.1038/s42256-020-0203-4

Credit: 
Wyss Institute for Biologically Inspired Engineering at Harvard

Most adults with lupus or common types of arthritis have similar risks of getting admitted to hospital as other COVID-19 patients

Most adults with systemic lupus erythematosus (SLE) are not at increased risk of hospitalization from COVID-19 as a result of the medications used to dampen the overactive immune response that causes their disease. Nor are most people with more common types of arthritis, such as rheumatoid arthritis, psoriatic arthritis, and spondyloarthritis, at greater risk of hospitalization from COVID-19, a pair of new reports shows.

SLE, known widely as lupus, along with common forms of arthritis, are autoimmune conditions caused by the immune system's mistaken attack on a person's own tissues, leading to inflammation in the joints, skin, kidneys, and other parts of the body. The majority of those affected by these diseases are women.

Although the new studies, led by NYU Grossman School of Medicine researchers, show that for some of those affected the use of steroid medications to reduce inflammation slightly increased the likelihood of needing hospital care, researchers say the results should be reassuring to patients overall.

Many people are taking steroids or other immunosuppressing medications, especially newer biologic drugs, to prevent their immune system's attack on their tissues. And the researchers say their patients report feeling added anxiety that their treatments make them more susceptible to the dangers of coronavirus infection.

In the first study, published recently in the journal Arthritis and Rheumatology, researchers closely monitored the health of 226 adult patients, mostly Black, Hispanic, and female, undergoing treatment at NYU Langone Health clinics or NYC Health + Hospitals Bellevue Hospital for mild to severe forms of lupus. All were surveyed by phone or email, or had their medical records checked between April 13 and June 1, when the pandemic peaked in the New York City region. Twenty-four were hospitalized out of 41 who were formally diagnosed with COVID-19, and four of them died. Another 42 had COVID-19-like symptoms but were not formally tested.

For the second study, published in the same journal, researchers monitored 103 mostly white women being treated at NYU Langone Health clinics between March 3 and May 4 for inflammatory arthritis, which unlike common osteoarthritis, does not primarily result from joint wear and tear. All tested positive for COVID-19 or had symptoms highly suggesting they were infected. Twenty-seven (26 percent) were hospitalized, with four deaths (4 percent).

Researchers say their latest study findings showed that lupus patients taking immune-suppressing medications, such as mycophenolate mofetil (Cellcept) and azathioprine (Imuran), had no greater risk of hospitalization (15 of 24) than lupus patients not using the medications (9 of 17). Similarly, the hospitalization rate for people with inflammatory arthritis and COVID-19 (26 percent) was no greater than that seen among all New Yorkers (25 percent, according to city figures).
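To see why a raw difference like 15 of 24 versus 9 of 17 does not signal a higher risk, one can run a quick two-proportion comparison. The sketch below uses a pooled normal-approximation z-test; this is a simplification (small samples would normally call for an exact test), and the reading of the denominators follows the wording above:

```python
from math import erf, sqrt

def two_proportion_p(h1, n1, h2, n2):
    """Two-sided p-value for the difference between two proportions,
    using the pooled normal approximation (z-test)."""
    p1, p2 = h1 / n1, h2 / n2
    pooled = (h1 + h2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # standard normal CDF via the error function, then the two-sided tail
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

# 15 of 24 medicated vs 9 of 17 unmedicated lupus patients hospitalized
p = two_proportion_p(15, 24, 9, 17)
print(p > 0.05)  # True: nowhere near conventional significance
```

The p-value comes out around 0.5, consistent with the researchers' conclusion that the medications did not raise hospitalization risk.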

Among the research team's other findings was that patients taking biologic drugs for arthritis, such as adalimumab (Humira) and etanercept (Enbrel), which are made from living cells, or the antimalarial hydroxychloroquine, were at no greater or lesser risk of hospitalization than those not taking the drugs. However, those taking glucocorticoids, a type of steroid, even in mild doses, were upwards of 10 times more likely to be hospitalized than arthritis patients not using steroids. The researchers caution that although the result is statistically significant, the study's small size means it may overestimate the actual risk.

"Our findings represent the largest studies of their kind for American patients with lupus or arthritis and COVID-19, and should reassure most patients, especially those on immunosuppressant therapy, that they are at no greater risk of having to be admitted to hospital from COVID-19 than other lupus or arthritis patients," says one of the studies' co-lead investigators, Ruth Fernandez-Ruiz, MD.

"People with lupus or inflammatory arthritis have the same risk factors for getting seriously ill from COVID-19 as people without these disorders," says Fernandez-Ruiz, a postdoctoral fellow in rheumatology in the Department of Medicine at NYU Langone.

These shared risk factors, she says, which overall more than double people's risk of hospitalization from COVID-19, are having multiple underlying health conditions, such as obesity, hypertension, and diabetes.

"Patients receiving therapy for lupus and inflammatory arthritis should not automatically stop taking their medications for fear that they would be worse off if they also caught the coronavirus," says another of the studies' co-lead investigators, Rebecca Haberman, MD. "Instead, rheumatology patients should consult with their medical provider about their overall risk factors for COVID-19 and make plans accordingly," says Haberman, a clinical instructor in rheumatology in the Department of Medicine at NYU Langone.

Credit: 
NYU Langone Health / NYU Grossman School of Medicine

Treating COVID-19 could lead to increased antimicrobial resistance

The use of antibiotics in people with COVID-19 could result in increased bacterial resistance to the drugs among the wider population, a new study suggests.

Patients hospitalised as a result of the virus are being given a combination of medications to prevent possible secondary bacterial infections.

However, research by the University of Plymouth and Royal Cornwall Hospital Trust suggests their increased use during the pandemic could be placing an additional burden on waste water treatment works.

Writing in the Journal of Antimicrobial Chemotherapy, scientists say this could lead to raised levels of antibiotics within the UK's rivers or coastal waters which may in turn result in an increase in antimicrobial resistance (AMR), where bacteria become resistant to the action of antibiotics.

This would be particularly acute in waters receiving discharge from waste water treatment works serving large hospitals, or emergency 'Nightingale' hospitals, where there is a concentration of COVID-19 patients.

The findings are based on reports that up to 95% of COVID-19 inpatients are being prescribed antibiotics as part of their treatment, and concerns that such a large-scale drug administration could have wider environmental implications.

Sean Comber, Professor of Environmental Chemistry in Plymouth and the article's lead author, said: "COVID-19 has had an impact on almost every aspect of our lives. But this study shows its legacy could be felt long after the current pandemic has been brought under control. From our previous research, we know that significant quantities of commonly prescribed drugs do pass through treatment works and into our water courses. By developing a greater understanding of their effects, we can potentially inform future decisions on prescribing during pandemics, but also on the location of emergency hospitals and wider drug and waste management."

The COVID-19 guidance issued by the National Institute for Health and Care Excellence (NICE) suggests patients with COVID-19 should be treated with doxycycline and either amoxicillin or a combination of other medications if a bacterial infection is suspected, but to withhold or stop antibiotics if a bacterial infection is unlikely.

Neil Powell, Consultant Pharmacist at the Royal Cornwall Hospital, said: "In common with other hospitalised patients in the UK and other countries, the majority of our patients with COVID symptoms were prescribed antibiotics, because it is very difficult to know whether a patient presenting with symptoms of COVID has an overlying bacterial infection or not. We did a lot of work to try to identify those patients who were unlikely to have a bacterial infection complicating their viral COVID infections, in an attempt to reduce the amount of antibiotic exposure for our patients and, consequently, the environment."

The research combined patient numbers for the UK's temporarily established emergency hospitals with the capacity of the waste water treatment works serving each emergency hospital and its associated town, and the river water dilution available to them.

Using available environmental impact data and modelling tools developed by the UK water industry, the study focussed on one UK emergency hospital - Harrogate, geared up to treat around 500 people - and showed that the risk posed by doxycycline was low, assuming the hospital was at full capacity.

Tom Hutchinson, Professor of Environment and Health at the University and a co-author on the research, added: "This is a comprehensive environmental safety assessment which addresses potential risks to fish populations and the food webs they depend on. The data for amoxicillin indicated that while there was little threat of direct impacts on fish populations and other wildlife, there is a potential environmental concern for selection of AMR if the hospital were at 100% capacity."

Amoxicillin is used to treat everything from pneumonia and throat infections to skin and ear infections.

Mathew Upton, Professor of Medical Microbiology at the University and a co-author on the research, added: "Antibiotics underpin all of modern medicine, but AMR is an issue that could impact millions of lives in the decades to come. Currently, the COVID-19 pandemic is causing immense suffering and loss of life across the globe, but AMR has been - and will remain - one of the most significant threats to global human health. We conducted this study so that we can begin to understand the wider impact of global pandemics on human health. It is clear that mass prescribing of antibiotics will lead to increased levels in the environment and we know this can select for resistant bacteria. Studies like this are essential so that we can plan how to guide antibiotic prescription in future pandemics."

Credit: 
University of Plymouth

Some of America's favorite produce crops may need to get a move on by 2045

image: Researchers compared how future warming will impact where and when five California crops can be grown.

Image: 
Berkeley Lab

Record drought and heat have some farmers worried about where and when crops can be grown in the future, even in California where unprecedented microclimate diversity creates ideal growing conditions for many of the most popular items in America's grocery stores. A third of the vegetables and two-thirds of fruits and nuts consumed by Americans are now grown on more than 76,000 farms across the state, yet 20 years from now certain California regions may simply become too hot and dry for continued production.

New research from Lawrence Berkeley National Laboratory (Berkeley Lab) shows that by the years 2045-2049 future temperatures will have more of an effect on when cool-season crops, such as broccoli and lettuce, can be grown than on where, while for warm-season crops (cantaloupe, tomatoes, carrots) the impact will be greater for where they can be grown versus when. The scientists describe pairing computer modeling with information about historic and ideal growing temperatures for five important California crops in their paper, "Projected temperature increases may require shifts in the growing season of cool-season crops and the growing locations of warm-season crops," which appeared in the journal Science of the Total Environment earlier this month.

"To ensure food security for California and the rest of the country, it's important to predict how future warming will affect California agriculture," said the paper's lead author Alison Marklein. "We need reliable information about how future climate conditions will impact our crops in order for the agricultural system to develop an adequate response to ensure food security. For instance, one major challenge when considering relocating crops is that growers have specialized knowledge of their land and crops. If crops can no longer be grown in their current locations, then the farmer has to either move to a new area or grow a different crop, which presents a practical and economic burden on the farmer."

Now a scientist at UC Riverside, Marklein had previously led the project while completing a postdoctoral fellowship at Berkeley Lab and collaborating with Peter Nico, a study co-author and staff scientist in Berkeley Lab's Earth and Environmental Sciences Area. Scientists from the U.S. Department of Agriculture and UC Davis also contributed. Funded by the University of California's Global Food Initiative, the research represents a significant research focus for Berkeley Lab: sustainable agriculture. Another current Berkeley Lab study applies machine learning to developing microbial amendments that could replenish soils depleted of nutrients like carbon and phosphorus.

Growing accustomed to climate extremes

In carrying out the study, the researchers first selected five annual crops that are produced more in California than any other state - lettuce, broccoli, carrots, tomatoes, and cantaloupe. These nutrient-dense crops contributed 64% of the state's cash value of vegetable and melon crops in 2016 and are essential to food security, as evidenced by their place among the top vegetables and fruits donated to four studied California food banks.

The team then obtained 15 years of air temperature data beginning in 1990 from locations across the state, as well as information about crop temperature thresholds - or maximum and minimum air temperatures beyond which crop failure occurs - and growing locations for each of the five crops going back seven years. They also considered a crop's optimal growing-season length: for example, broccoli requires four consecutive months of minimum 39 degrees Fahrenheit and maximum 95 degrees.
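The growing-window logic described above, a required run of consecutive months inside a crop's temperature thresholds, can be sketched directly. The monthly temperatures below are invented for illustration; only the 39°F/95°F thresholds and the four-month window are the broccoli figures from the text:

```python
def viable_start_months(tmin, tmax, lo=39.0, hi=95.0, window=4):
    """Return the months (0 = January) that can begin `window` consecutive
    months whose monthly minimum stays at or above `lo` and whose monthly
    maximum stays at or below `hi` (degrees F), wrapping over year end."""
    ok = [lo <= a and b <= hi for a, b in zip(tmin, tmax)]
    return [m for m in range(12)
            if all(ok[(m + i) % 12] for i in range(window))]

# Hypothetical monthly minima and maxima for one location (degrees F):
# summer maxima exceed 95 F, winter minima fall below 39 F.
tmin = [30, 32, 40, 42, 45, 50, 55, 55, 50, 44, 38, 32]
tmax = [60, 62, 70, 75, 85, 92, 98, 99, 94, 85, 70, 62]

print(viable_start_months(tmin, tmax))  # [2] -> only a March start works
```

Under a warming scenario, the study's question becomes how these start-month lists shrink, shift, or vanish at each location, which is what separates a change in *when* a crop can grow from a change in *where*.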

Setting out to compare how each crop would do across California under different possible climate scenarios, one hot-dry and another cool-wet, the researchers looked at how higher temperatures may affect the crops in their historical growing locations. Next, they identified possibilities to expand any crop to a more ideal growing location based on that crop's temperature threshold, looking at all areas where that crop has not been grown, even where land had not previously been used for agriculture.

Finally, the team calculated how much of the land historically used for growing each of the five crops can be maintained under future warming scenarios (hot-dry, cool-wet); how much of the land used would be untenable due to temperature rise; and how much land not formerly used for agriculture could potentially support each of the five crops in comparison to historical agricultural land where these crops have not yet been grown.

Tomatoes could face some growing pains

"We found differences in how warmer temperatures will affect the cool-season crops versus the warm-season crops," Marklein said. "For cool-season crops like broccoli and lettuce, it may be possible to extend their growing seasons. But it may become too warm to grow warm-season tomatoes where they have been historically farmed in summer, and may require moving them to milder climates warm enough for growing tomatoes under the new climate scenarios."

The team found that both cool-season crops, broccoli and lettuce, are currently grown nearer to their lower temperature thresholds during fall and spring. The climate models predict an increase in winter temperatures above the minimum temperature threshold for both crops, suggesting that by mid-century these crops could also be grown into winter, even in areas where they have not historically been grown.

The warmer temperatures in fall and spring suggest that tomatoes might benefit from a shift in growing season. But that could prove harder than it appears.

"Looking at the hot-dry future climate scenario, although temperatures in fall and spring are likely to increase as will summer temperatures, a shift in growing season isn't a viable solution because the summer temperatures are likely to exceed the critical temperatures for tomatoes," Marklein said. "Tomatoes need four consecutive months for their growing season, so the gap in the middle filled by summer makes this unfeasible."

Opportunities could crop up all over

While it's true that some of the crops studied, tomatoes especially, will lose areas where they have been traditionally farmed due to future warming, there could be some ways to mitigate these potential challenges. For example, because their analysis focused on air temperature rather than crop temperature, in practice irrigation may be able to reduce some negative heat effects.

In total, Marklein said this study gives agricultural planners a lot to think about. "This is really a first step in planning for future climate scenarios. This work could be used to help prioritize resources like cropland and water to maximize agricultural productivity and food security," she said. "It's critical to plan ahead for future warming scenarios, particularly in areas like California that feed the nation."

Credit: 
DOE/Lawrence Berkeley National Laboratory

Legacy

image: Maria A. Ramos-Roman, M.D.

Image: 
UT Southwestern Medical Center

DALLAS - Aug. 25, 2020 - Breastfeeding secures delivery of sugar and fat for milk production by changing the insulin sensitivity of organs that supply or demand these nutrients, a new study led by UT Southwestern scientists suggests. The findings, published in this month's print issue of Diabetes, could explain how different tissues cooperate to start and maintain lactation and offer strategies to help improve breastfeeding success for mothers who have insufficient milk production.

Epidemiologic studies suggest that breastfeeding protects women from developing Type 2 diabetes, even decades after their children have been weaned. But how this benefit arises has been unclear, explains Maria A. Ramos-Roman, M.D., an associate professor of internal medicine in the division of endocrinology at UT Southwestern.

One hypothesis is that breastfeeding changes how the body uses insulin, the hormone that allows cells to take in glucose from the bloodstream. (Glucose is a type of sugar that powers cells.) Detailed metabolic studies have shown that although pregnancy reduces the body's sensitivity to insulin, lactation may restore insulin sensitivity to a pre-pregnancy state. But it is unclear how this process happens and how it affects highly insulin-sensitive organs, such as the liver or fat tissue.

To learn more, Ramos-Roman and her colleagues recruited 18 women who had recently given birth. Twelve of these women were either exclusively breastfeeding or giving their babies less than 6 ounces of formula per day. The remaining six were exclusively formula feeding.

At five weeks postpartum, each of these women came to UT Southwestern for a detailed medical exam to make sure they were healthy and could safely participate in other parts of the study. One week later, the volunteers returned for a second visit in which they underwent a hyperinsulinemic-euglycemic clamp, a test that quantifies insulin sensitivity. Although the test measures insulin sensitivity for the body overall, these women also received stable isotopes during the test - tracers that helped measure insulin sensitivity specifically for organs of interest, such as the liver, or fat. At a third visit by eight weeks postpartum, the study participants underwent a type of imaging called magnetic resonance spectroscopy that measured the amount of fat in their livers.

The researchers found that all of the women had low blood insulin concentrations characteristic of the postpartum period, regardless of how they were feeding their new babies. However, even at these low insulin levels, after a 12-hour fast, the lactating mothers produced 2.6-fold more glucose and released 2.3-fold more fatty acids from storage in their fat tissue - a process known as lipolysis - than the formula feeders did. Both glucose production and lipolysis were suppressed in both groups when the researchers mimicked a "fed state" by delivering insulin through the clamp. But suppression of lipolysis was heightened by higher levels of prolactin - the hormone that stimulates milk production - in the lactating mothers, accompanied by lower levels of fatty acids in the blood and lower amounts of fat in their livers.

These results suggest that breastfeeding appears to increase insulin sensitivity in highly insulin-sensitive organs, Ramos-Roman explains. After a 12-hour fast, the liver and fat tissues of lactating mothers release more sugar and fat into the bloodstream than those of formula-feeding mothers; however, after food intake or under the conditions of a hyperinsulinemic-euglycemic clamp, lactating mothers respond to small increases in insulin levels by holding on to more stored fat than formula-feeding mothers do. Both conditions maximize the nutrients available for making milk in breastfeeding women, providing a steady stream from either stored resources or food intake.

These changes in insulin levels and sensitivity might also provide the legacy benefit observed in epidemiologic studies in women who have breastfed, offering a buffer against insulin resistance even decades later, Ramos-Roman says. (These previous studies showed an association between intensity or duration of breastfeeding and a lower incidence of Type 2 diabetes in the mothers.)

In addition, she adds, better understanding of this process could help researchers find new ways to help encourage it in mothers who would like to breastfeed but are having problems with low milk supply - finding ways to stimulate insulin-sensitive tissues to supply more nutrients into or demand less nutrients from the bloodstream could boost milk production.

"Lactation is millions of years old, but we still have a long way to go before we understand all there is to learn," Ramos-Roman says. "If we know better how it happens in the body, we can help improve the health of women and children."

Credit: 
UT Southwestern Medical Center

Magnetic stimulation dramatically improves fecal incontinence

image: Dr. Satish S.C. Rao, director of neurogastroenterology/motility and the Digestive Health Clinical Research Center at the Medical College of Georgia

Image: 
Phil Jones, Senior Photographer, Augusta University

Painless magnetic stimulation of nerves that regulate muscles in the anus and rectum appears to improve their function and dramatically reduce episodes of fecal incontinence, a debilitating problem affecting about 10% of the population, investigators report.

They have early evidence that TNT, or translumbosacral neuromodulation therapy, is a promising, novel, safe, low-cost treatment for strengthening key nerves and reducing or even eliminating episodes of stool leakage, Medical College of Georgia investigators report in the American Journal of Gastroenterology.

"We have identified that nerve damage is an important mechanism in the pathogenesis of stool leakage, and we have identified a noninvasive and targeted treatment to correct the nerve damage and address this pervasive problem," says Dr. Satish S.C. Rao, director of neurogastroenterology/motility and the Digestive Health Clinical Research Center at the Medical College of Georgia at Augusta University.

"We found there was significant improvement in fecal incontinence across the board," says Rao, after six sessions of weekly TNT treatment to key nerves, "which told us something is happening with this treatment. There is an effect on nerve function which, in turn, is leading to improvement of symptoms."

The rectum is the connector between the colon and the anus, where stool exits, and the muscles directly involved in moving feces along and then holding it in place until we are ready to go to the bathroom have been a focal point for treating fecal incontinence. However, current strategies are largely unsatisfactory for at least half of patients because they do not directly address the causes, including nerve dysfunction in the anus and rectum, the investigators say.

Rao and his team decided to take a step back and look at the function of the nerves controlling those muscles. He developed a relatively benign test, called TAMS, or translumbosacral anorectal magnetic stimulation, to look at nerve activity by placing a probe in the rectum and a coil on the back to deliver magnetic stimulation to nerves in the anus and rectum and watch the response. When they found that nerve function was an issue in 80-90% of patients they assessed, they began exploring a similar approach using external, repetitive magnetic stimulation to help heal those nerves.

This first study included 33 participants, who were on average about 60 years old; 23 were women, who tend to have more problems with fecal incontinence. Age also is a risk factor. After some surface mapping to find an exact location in each individual, the investigators used the same four sites on the upper and lower back they had used to test the function of the relevant lumbar and sacral nerves, which lie about two inches below the skin.

Patients lie comfortably face down and the machine makes a steady 'tock, tock' sound. Treatment lasts 15 minutes to an hour depending on the frequency. The 15-minute version, for example, delivered 15 stimulations per second, or 15 hertz, clearly the quickest but, surprisingly, not the most effective frequency for this purpose.

Rather, while all participants derived some benefit, it was those receiving the lowest frequency, one hertz, over an hour who benefited most.

The investigators defined responders as those with at least a 50% reduction in the number of episodes of stool leakage per week. The one-hertz group experienced about a 90% reduction in weekly episodes as well as significant improvements in their ability to sense a need to defecate and to hold more stool. Those in the one-hertz and midrange five-hertz groups also reported the most improvement in quality of life.

"We measured several parameters including their leakage events, we measured their nerve and muscle function, quality of life, all of those were measured," Rao says. Participants also kept stool diaries, with some reporting zero incontinence episodes following TNT.

"It's still in the early stage, but it's quite remarkable what we are seeing," he says.

Like the patients he sees in his practice, study participants had a variety of issues that likely contributed to their lack of fecal control, including diabetes, back injuries, hysterectomies, and bladder and hemorrhoid surgeries. Childbirth is a common cause of both fecal and urinary incontinence. One of the women in the study had never had a baby, 18 others had vaginal deliveries (three of those also had a C-section), and four others had only a C-section. Eleven of the women with a vaginal delivery had vaginal tears and six had a forceps-assisted delivery.

While they didn't selectively pick people with nerve damage for the study, the investigators again found that whatever the cause, those with significant stool leakage had problems with delayed and weakened nerve conduction compared to healthy controls.

TNT dramatically shortened the time it takes those nerves to activate the muscle by several important milliseconds, particularly in the one-hertz group, where the response time consistently returned to normal.

"We have always tended to blame the anal muscle as the problem," Rao says of key controls needed to keep stool contained until we are in the bathroom. But they also know from women who experience muscle tears during childbirth, which is common, that repairing the muscle does not guarantee the woman will not have problems with leakage, he says. Sometimes muscle repair works temporarily, but when you follow up five years later, about half are incontinent, and nearly 90% are incontinent in 10 years, he says. "Ideally you want to treat all the mechanisms that are not working. We have not really approached it like that," Rao says.

His team suspected their repeated stimulation of the nerves would induce their innate ability to adapt in response to a variety of stimulations, called neuroplasticity, a skill that exists in nerves throughout the brain and body that enables both learning as well as recovery from injury or disease. They had preliminary evidence of this including studies indicating that magnetic stimulation improves neuropathy and pain in a condition called levator ani syndrome, in which patients experience burning pain in the rectal or perianal region.

They suspected high-frequency stimulation, like 15 hertz, already used in the brain to treat problems like depression and stroke recovery, would work best, which is why they were surprised to find that the relevant nerves in this case were most responsive to longer periods of low-frequency, one-hertz stimulation. Rao surmises one reason may be that the nerves that help control defecation are not as active as typical brain cells, although laboratory studies are needed to confirm that theory, he says. He also wants to learn more about the underlying mechanisms for how the nerve changes occur with magnetic stimulation and, along with colleague Dr. Amol Sharma, MCG gastroenterologist and a study coauthor, look at its potential in other gastrointestinal motility problems caused by conditions like Parkinson's disease and the stomach-paralyzing problem gastroparesis.

How long the benefits of TNT hold, and how often follow-up sessions may be needed, are already being pursued in a larger study of 132 participants now underway at MCG and AU Health System and Harvard University's Massachusetts General Hospital in Boston, on which Rao is also the project director and principal investigator.

Participants for the published study were recruited from MCG's adult teaching hospital, AU Medical Center, and from the University of Manchester's Manchester Academic Health Sciences Centre in the United Kingdom, under the supervision of Dr. Shaheen Hamdy, professor of neurogastroenterology, although all participants were ultimately enrolled at the Augusta facility.

They went through extensive screening to ensure there weren't other medical problems, like severe diarrhea or inflammatory bowel disease, that could contribute to their incontinence, as well as a host of other serious medical conditions. To qualify, individuals had to have a history of recurrent fecal incontinence for six months that did not respond to approaches like diet modifications and diarrhea medication, and a two-week diary that reported at least one episode of fecal incontinence per week. As part of the study, investigators performed several tests to assess nerve and muscle function, including Rao's TAMS test, at the start and finish of the trial. They also used TAMS to ensure the participant's nerves were responding to the stimulation.

The only reported side effect of TNT was some temporary tingling in the treatment area, probably prompted by rejuvenating nerves, Rao says. He notes penetrability of the magnetic stimulations can be problematic with obesity or in patients with significant scarring from problems like back injury and/or surgeries. He also notes poor nerve conduction likely is a factor in some patients with constipation.

Credit: 
Medical College of Georgia at Augusta University

The secret life of melons revealed: "Jumping sequences" may alter gene expression

Tsukuba, Japan - On the surface, the humble melon may just look like a tasty treat to most. But researchers from Japan have found that this fruit has hidden depths: retrotransposons (sometimes called "jumping sequences") may change how genes are expressed.

In a study published recently in Communications Biology, researchers from the University of Tsukuba and the National Agriculture and Food Research Organization (NARO) have revealed that retrotransposons had a role in altering gene expression when melon genomes were diversifying, and may affect gene expression that induces fruit ripening.

Melons are one of the most economically important fruit crops globally. A special feature of melons is the coexistence of two fruit types: climacteric (which produce ethylene and exhibit a burst in cellular respiration as ripening begins) and non-climacteric. Ethylene is a plant hormone important to the regulation of climacteric fruit-ripening traits such as shelf life, which is of major economic importance.

"Because Harukei-3 melons produce ethylene during ripening, we wanted to look at ethylene-related gene expression in this type of melon," says lead author of the study Professor Hiroshi Ezura. "Harukei-3 produces an especially sweet fruit if grown in the right seasons. Because of its taste and attractive appearance, Harukei-3 has been used for a long time in Japan as a standard type for breeding high-grade muskmelon."

To examine ethylene-related gene expression, the researchers assembled the whole genome sequence of Harukei-3 by using third-generation nanopore sequencing paired with optical mapping and next-generation sequencing.

"We compared the genome of Harukei-3 with other melon genomes. Interestingly, we found that there are genome-wide presence/absence polymorphisms of retrotransposon-related sequences between melon accessions, and 160 (39%) were transcriptionally induced in post-harvest ripening fruit samples. They were also co-expressed with neighboring genes," explains Dr. Ryoichi Yano, senior author. "We also found that some retrotransposon-related sequences were transcribed when the plants were subjected to heat stress."

Retrotransposons are transposons (also referred to as "jumping sequences" because they can change their positions within a genome) with sequences similar to those of retroviruses.

"Our findings suggest that retrotransposons contributed to changes in gene expression patterns when melon genomes were diversifying. Retrotransposons may also affect gene expression that brings on fruit ripening," says Professor Ezura.

The Harukei-3 genome assembly, together with other data generated in this study, is available in the Melonet-DB database. Combined with future updates, this database will contribute to the functional genomic study of melons, especially reverse genetics using genome editing.

Credit: 
University of Tsukuba

Before eyes open, they get ready to see?

image: This is a photograph of Thor, a Norwegian forest cat. The primary visual cortex of higher mammals, including cats, is organized into an orderly tiling of sensory modules. Computer simulations show that spontaneous retinal waves mirrored from the retinal mosaics initiate this clustering of functional circuits in visual cortex.

Image: 
KAIST

A KAIST research team's computational simulations demonstrated that waves of spontaneous neural activity in the retinas of mammals' still-closed eyes drive the development of long-range horizontal connections in the visual cortex during early developmental stages.

This new finding, featured as a cover article in the August 19 edition of the Journal of Neuroscience, resolves a long-standing puzzle in visual neuroscience regarding the early organization of functional architectures in the mammalian visual cortex before eye-opening, especially the long-range horizontal connectivity known as "feature-specific" circuitry.

To prepare the animal to see when its eyes open, neural circuits in the brain's visual system must begin developing earlier. However, the proper development of many brain regions involved in vision generally requires sensory input through the eyes.

In the primary visual cortex of the higher mammalian taxa, cortical neurons of similar functional tuning to a visual feature are linked together by long-range horizontal circuits that play a crucial role in visual information processing.

Surprisingly, these long-range horizontal connections in the primary visual cortex of higher mammals emerge before the onset of sensory experience, and the mechanism underlying this phenomenon has remained elusive.

To investigate this mechanism, a group of researchers led by Professor Se-Bum Paik from the Department of Bio and Brain Engineering at KAIST implemented computational simulations of early visual pathways using data obtained from the retinal circuits in young animals before eye-opening, including cats, monkeys, and mice.

From these simulations, the researchers found that spontaneous waves propagating in ON and OFF retinal mosaics can initialize the wiring of long-range horizontal connections by selectively co-activating cortical neurons of similar functional tuning, whereas equivalent random activities cannot induce such organizations.

The simulations also showed that the emerging long-range horizontal connections can induce patterned cortical activities matching the topography of underlying functional maps, even in the salt-and-pepper type organizations observed in rodents. This result implies that the model developed by Professor Paik and his group can provide a universal principle for the developmental mechanism of long-range horizontal connections in both higher mammals and rodents.

Professor Paik said, "Our model provides a deeper understanding of how the functional architectures in the visual cortex can originate from the spatial organization of the periphery, without sensory experience during early developmental periods."

He continued, "We believe that our findings will be of great interest to scientists working in a wide range of fields such as neuroscience, vision science, and developmental biology."

Credit: 
The Korea Advanced Institute of Science and Technology (KAIST)

One step closer to earlier diagnosis of bipolar disorder and psychoses

In a new study from the Danish psychiatry project iPSYCH, researchers have identified genetic risk factors for developing bipolar disorder and psychoses among people with depression. In the longer term, the results may contribute to ensuring the correct diagnosis is made earlier, so that the patients can receive the correct treatment as quickly as possible.

Bipolar disorder and psychoses such as schizophrenia are serious mental disorders, which often have a great impact on a person's life and well-being. In a number of cases, bipolar disorder and schizophrenia are first diagnosed several years after the onset of the disorder, which is associated with an unfavourable prognosis for the course of the disorders. The sooner the patient gets the correct diagnosis and begins targeted treatment, the better the prognosis. For this reason, researchers are aiming to identify risk factors that will help psychiatrists reach the correct diagnosis as early as possible.

Depression often precedes bipolar disorder and psychoses

Many people who develop bipolar disorder or psychoses initially come into contact with the mental health services due to depression. A research team from iPSYCH therefore set out to examine a dataset consisting of 16,949 people aged 10-35 who had been treated for depression at a psychiatric hospital in Denmark.

"Our goal with the study was to investigate whether genetic factors are associated with an increased risk of developing bipolar disorder or psychosis among patients with depression. This knowledge can potentially be used in clinical practice to identify patients who should be monitored even more closely," explains the lead author of the research article based on the study, Senior Researcher Katherine Musliner from the National Centre for Register-based Research.

Among the factors the researchers examined was whether the genetic risk scores for bipolar disorder and schizophrenia - i.e. a person's individual genetic risk of developing these disorders - could help psychiatrists determine which of their patients with depression were at greatest risk of subsequently developing bipolar disorder or a psychosis.

"One thing we discovered was that the genetic risk score for bipolar disorder is associated with an increased risk of developing bipolar disorder, and that the genetic risk score for schizophrenia is associated with an increased risk of developing a psychosis among patients who have been diagnosed with depression," says Katherine Musliner, stressing that the effects of the genetic risk scores were relatively small.

Family history weighs heavily

Another member of the research group behind the study, Professor Søren Dinesen Østergaard from the Department of Clinical Medicine and Aarhus University Hospital - Psychiatry, emphasises that caution is needed when interpreting the results.

"At present, the genetic risk scores cannot contribute to early diagnosis of bipolar disorder and psychoses in clinical practice, but it cannot be ruled out that this could be the future scenario. On the other hand, our study confirms that having a parent with bipolar disorder or a psychosis is a strong predictor for the development of these particular disorders after depression. This underlines the importance of getting information about mental disorders in the family as part of the assessment of people suffering from depression," he explains.

The results have been published in the American Journal of Psychiatry.

Background for the results

The study is a register-based study with data from 16,949 people who were treated for depression at a psychiatric hospital in Denmark in the period from 1994 to 2016.

Credit: 
Aarhus University

New study shows evolutionary breakdown of 'social' chromosome in ants

image: Workers of the red fire ant Solenopsis invicta on an Illumina MiSeq sequencing chip used to analyse the genes in their social chromosome.

Image: 
Yannick Wurm & Emeline Favreau

Scientists from Queen Mary University of London have found that harmful mutations accumulating in the fire ant social chromosome are causing its breakdown.

The chromosome, first discovered by researchers at the University in 2013, controls whether the fire ant colony has either one queen or multiple queens. Having these two different forms of social organisation means the species can adapt easily to different environments and has resulted in them becoming a highly invasive pest all over the world, living up to their Latin name Solenopsis invicta, meaning "the invincible".

For the new study, published in eLife, the research team performed detailed analyses of the activity levels of all the genes within the social chromosome for the first time to understand how it works and how it evolved. They found that damaging mutations are accumulating in one version of the social chromosome, causing it to degenerate. The findings also showed that most of the recent evolution of these chromosomes stems from attempts to compensate for these harmful mutations.

Natural selection is the main evolutionary mechanism that helps to optimise genes over generations, but normally it cannot simultaneously optimise genes for two different types of social organisation within one species.

To overcome this evolutionary conflict, social chromosomes group together genes adapted to each type of social form. The results of the new study show that this solution prevents the removal of harmful mutations from the genome and as a result, these mutations accumulate over time and begin to dominate the fate of the system.

The social chromosomes in fire ants are a rare example of a direct link between genes and social behaviour. They work in a similar way to the X and Y chromosomes in humans, which determine sex.

This discovery has wider ecological and medical implications because genomic structures similar to social and sex chromosomes can not only help species adapt to changing environments but also underpin diseases such as cancer.

Dr. Martínez-Ruiz, lead author of the study from Queen Mary University of London, said: "Our results show that the initial benefit of nature combining genes into a social chromosome has a cost. One million years later, most of the differences we see between social chromosomes are due to the accumulation of negative mutations."

"We also see that the rest of the genome adapts very quickly in response to negative mutations," added Dr. Wurm, Reader in Bioinformatics at Queen Mary and senior author of the study. "This is how evolution works, by adding patches to imperfect solutions, rather than by finding the most efficient solution."

"Despite the degeneration of the social chromosomes, the fire ants are unlikely to lose them anytime soon. This would require another major chromosomal reshuffling - such events are rare and usually lethal," Dr Wurm continues. "However, over long evolutionary timescales, anything is possible. Most of the 20,000 species of ants either have only single-queen colonies or only multiple-queen colonies. We are now trying to understand whether social chromosomes are required for changes in social organisation."

The study builds on earlier research by the authors on the evolution of social chromosomes. They have previously identified differences in genes for chemical communication that may be responsible for perceiving queens*, showed that one social chromosome has doubled in size**, and that this social chromosome lacks genetic diversity***.

Credit: 
Queen Mary University of London

Scientists prove SARS-CoV-2 potential to infect human brain organoids

image: Conceptual graph of SARS-CoV-2 infecting human brain organoids

Image: 
SIAT

SARS-CoV-2 can infect human neural progenitor cells and brain organoids, as shown by researchers from the Shenzhen Institutes of Advanced Technology (SIAT) of the Chinese Academy of Sciences and their collaborators from The University of Hong Kong (HKU).

Their study was published in Cell Research on August 4.

Coronavirus disease 2019 (COVID-19) is caused by SARS-CoV-2, the novel severe acute respiratory syndrome coronavirus 2. Over 17 million confirmed cases of COVID-19 and more than 686,703 associated deaths had been reported across 218 countries and geographical regions as of August 3.

This novel coronavirus primarily causes respiratory illness with clinical manifestations largely resembling those of SARS. However, neurological symptoms including headache, anosmia, ageusia, confusion, seizure and encephalopathy have also been frequently reported in COVID-19 patients.

A study of 214 hospitalized COVID-19 patients in Wuhan reported that 36.4% of all patients and 45.5% of severe cases had neurologic symptoms. In addition, studies from France and Germany have revealed that 84.5% and 36.4%, respectively, of COVID-19 patients had viral infections in the brain.

However, there has been no direct experimental evidence of SARS-CoV-2 infection in the human central nervous system (CNS).

To explore the direct involvement of SARS-CoV-2 in the CNS in physiologically relevant models, the research team assessed SARS-CoV-2 infection in human neural progenitor cells (hNPCs), neurospheres and brain organoids derived from induced pluripotent stem cells (iPSCs).

The results demonstrated that iPSC-derived hNPCs were permissive to SARS-CoV-2 infection, but not SARS-CoV infection. Extensive protein expression and infectious viral particles were detected in neurospheres and brain organoids infected with SARS-CoV-2, which suggested SARS-CoV-2 could productively infect the human brain.

Importantly, SARS-CoV-2 infection in 3D human brain organoids was localised to TUJ1 (neuronal marker)- and NESTIN (NPC marker)-positive cells, suggesting SARS-CoV-2 could directly target cortical neurons and NPCs.

"Overall, our study provides the first evidence of direct SARS-CoV-2 infection in human brain organoids, which contributes to our understanding of the pathogenesis of neurological complications in COVID-19," said Prof. HUANG Jiandong from HKU, who led the study.

The research team suggested that chronic and long-term consequences of SARS-CoV-2 infection of the CNS should be closely monitored.

Credit: 
Chinese Academy of Sciences Headquarters

Plant living with only one leaf reveals fundamental genetics of plant growth

image: (A) An individual Monophyllaea glabra with one large leaf in front and a cluster of flowers growing from the base of the leaf. (B) Monophyllaea glabra at only a few days old showing a growth-ceased cotyledon (left) and developing cotyledon (right). The area where the gene ANGUSTIFOLIA3 works is stained purple. The yellow lines in the B' enlargement photo show the area where ANGUSTIFOLIA3 is not active in plants with typical anatomy commonly used as models in research labs, such as the thale cress Arabidopsis thaliana.

Image: 
Photo by Hirokazu Tsukaya, CC BY

Clinging to the walls of tropical caves is a type of plant with a single leaf that continues to grow larger for as long as the plant survives. Researchers at the University of Tokyo hope that their study of this unusual species may help inspire future genetic tools to control the size of common crop plants.

"We are pleased that we finally made a small breakthrough studying this plant," said Professor Hirokazu Tsukaya, who led the recent research project.

The plant's scientific name, Monophyllaea glabra, means "hairless species of one-leaf plant." M. glabra sprouts from a seed with two embryonic leaves called cotyledons, but only one of the cotyledons continues to develop into a leaf.

All Monophyllaea species grow one leaf that, as far as scientists have observed, can continue growing bigger as long as the plant lives. Most plants have no limit on the number of leaves they can grow, but those leaves do have a predetermined maximum size.

Tsukaya first tried working with Monophyllaea in the early 1990s after a trip to see the plants growing in their native habitat in Thailand.

"Monophyllaea like to live in limestone caves in Southeast Asia. If you have a chance to go there, you can see these plants easily," said Tsukaya.

The same curious biology that made the plants so interesting also made them challenging to study with the new genetic tools being designed at the time for more common species with immediate agricultural or medical relevance. After a decades-long hiatus while other molecular techniques developed, the project to understand Monophyllaea began again recently when doctoral student and first author of the research paper Ayaka Kinoshita joined the lab.

"I believe ours is the only lab in the world currently studying this species," said Tsukaya.

Understanding what makes Monophyllaea unique required tools that could see the location and activity level of genes early in the leaf's development. A technique known as whole-mount in situ hybridization allows researchers to preserve whole chunks of an organism, not just thin slices, and lock in place all of the genetic material the cells were using at the time of their death. The technique is commonly used in animal tissue, but is more complicated to use in plants because of the stiff outer cell wall around plant cells.

"Luckily, another of our lab members, Assistant Professor Hiroyuki Koga, is a true professional at using the whole-mount system and he persisted to develop a suitable method for plants," said Tsukaya. Koga is the second author of the research publication and was able to perfect a technique to preserve entire three-week-old Monophyllaea plants.

In plants with standard anatomy, the gene SHOOT MERISTEMLESS (STM) is expressed in cells at the growing tips of stems, referred to as the shoot meristem. Additionally, the gene ANGUSTIFOLIA3 (AN3) is expressed in very young leaves to promote the multiplication of cells that form the leaf.

"With our naked eye, we cannot see any shoot meristem in Monophyllaea. So we want to know, is it lost or is it modified?" explained Tsukaya.

Rather than expressing STM and AN3 at separate locations and times, young Monophyllaea showed overlapping expression of the two genes. Researchers say that what looks like a simple leaf in Monophyllaea is actually a fusion of the shoot meristem and leaf.

"In Monophyllaea, the expression areas overlap, suggesting this plant is a hybrid of a normal leaf and shoot meristem. We suppose this curious gene expression pattern is one reason why the plant has such a curious appearance," said Tsukaya.

Researchers state that understanding how unusual species like M. glabra evolved to use common genes in uncommon ways may one day help agricultural scientists develop tools for controlling leaf size for optimal crop cultivation.

"We study M. glabra because the characteristics of Monophyllaea development are very unique and they cannot be found in any mutants of common laboratory plants. Dealing with the unique phenomenon can definitely provide new insights to plant science," said Tsukaya.

Credit: University of Tokyo