Culture

Why some older adults remember better than others

Even among healthy people, a faltering memory is often an expected part of aging - but it's not inevitable.

"Some individuals exhibit remarkable maintenance of memory function throughout late adulthood, whereas others experience significant memory decline. Studying these differences across individuals is critical for understanding the complexities of brain aging, including how to promote resilience and longevity," said Alexandra Trelle, a postdoctoral research fellow at Stanford University.

Building on studies that have focused on young populations, Trelle and colleagues are investigating memory recall in healthy, older adults as part of the Stanford Aging and Memory Study. In new research, published May 29 in eLife, this team has found that memory recall processes in the brains of older adults can look very similar to those previously observed in the brains of young adults. However, for those seniors who had more trouble remembering, evidence for these processes was noticeably diminished.

By gaining a better understanding of memory function in older adults, these researchers hope to someday enable earlier and more precise predictions of when memory failures signal increased risk for dementia.

A striking similarity

When Anthony Wagner, the Lucie Stern Professor in the Social Sciences at Stanford's School of Humanities and Sciences, was a graduate student at Stanford in the '90s, he conducted some of the first fMRI studies of memory formation. At that time, state-of-the-art imaging and analysis technologies only allowed measurement of the magnitude of activity from a centimeter-and-a-half section of the brain.

In contrast, the current study measured activity from the whole brain at high-resolution, and analyses not only focused on the magnitude of activity but also on the memory information that is contained in patterns of brain activity.

"It's exciting to have basic science tools that allow us to witness when a memory is being replayed in an individual mind and to draw on these neural processes to explain why some older adults remember better than others," said Wagner, who is senior author of the paper. "As a graduate student, I would never have predicted that we would do this kind of science someday."

In the experiment, 100 participants between the ages of 60 and 82 had their brains scanned as they studied words paired with pictures of famous people and places. Then, during a scanned memory test, they were prompted with words they had seen and asked to recall the associated picture. The memory test was designed to assess one's ability to remember specific associations between elements of an event, a form of memory that is often disproportionately affected by aging.

In the scans, the researchers observed that the brain processes that support remembering in older adults resemble those in younger populations: when people remember, there is an increase in hippocampal activity - a brain structure long known to be important for remembering events - along with the reinstatement of activity patterns in the cortex that were present when the event was initially experienced. That is, remembering entails neural time travel, replaying patterns that were previously established in the brain.

"It was striking that we were able to replicate this moment-to-moment relationship between hippocampal activity, replay in the cortex, and memory recall, which has previously been observed only in healthy younger adults," said Trelle, who is lead author of the paper. "In fact, we could predict whether or not an individual would remember at a given moment in time based on the information carried in patterns of brain activity."

The researchers found that, on average, recall ability declined with age. Critically, however, regardless of one's age, stronger hippocampal activity and replay in the cortex were linked to better memory performance. This was true not only for the memory test conducted during the scan but also for memory tests administered on a different day of the study. This intriguing finding suggests that fMRI measures of brain activity during memory recall are tapping into stable differences across individuals, and may provide a window into brain health.
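To make the idea of pattern-based prediction concrete, here is a minimal, hypothetical sketch of the general approach of decoding recall success from multivoxel activity patterns with a cross-validated linear classifier. It is not the authors' analysis pipeline; the data, array names and signal strength below are simulated stand-ins.

```python
# Minimal sketch (not the authors' pipeline): predicting trial-by-trial recall
# from multivoxel activity patterns with a linear classifier.
# All data below are simulated; array names and shapes are illustrative only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_voxels = 200, 500

# Simulated retrieval-phase patterns: remembered trials carry a weak shared signal.
remembered = rng.integers(0, 2, n_trials)             # 1 = recalled, 0 = forgotten
signal = rng.normal(size=n_voxels)                    # hypothetical "replay" pattern
patterns = rng.normal(size=(n_trials, n_voxels)) + 0.3 * np.outer(remembered, signal)

# Cross-validated decoding accuracy: how well patterns predict recall success.
clf = LogisticRegression(max_iter=1000)
acc = cross_val_score(clf, patterns, remembered, cv=5).mean()
print(f"Cross-validated decoding accuracy: {acc:.2f}")
```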

Only the beginning

This research lays the foundation for many future investigations of memory in older adults in the Stanford Aging and Memory Study cohort. These will include work to further detail the process of memory creation and recall, studies of change in memory performance over time, and research that pairs fMRI studies with other kinds of health data, such as changes in brain structure and the build-up of proteins in the brain that are linked to Alzheimer's disease.

The ultimate aim is to develop new and sensitive tools to identify individuals who are at increased risk for Alzheimer's disease before significant memory decline occurs.

"We're beginning to ask whether individual differences in the ability to mentally travel back in time can be explained by asymptomatic disease that impacts the brain and predicts future clinical diagnosis," said Wagner. "We're hopeful that our work, which requires rich collaborations across disciplines, will inform clinical problems and advance human health."

Credit: 
Stanford University

Survey finds large increase in psychological distress reported among US adults during the COVID-19 pandemic

A new survey conducted during the pandemic by researchers at the Johns Hopkins Bloomberg School of Public Health and the SNF Agora Institute at Johns Hopkins University found a more-than-threefold increase in the percentage of U.S. adults who reported symptoms of psychological distress—from 3.9 percent in 2018 to 13.6 percent in April 2020. The percentage of adults ages 18–29 in the U.S. who reported psychological distress increased from 3.7 percent in 2018 to 24 percent in 2020.

The survey, fielded online April 7 to April 13, found that 19.3 percent of adults with annual household incomes less than $35,000 reported psychological distress in 2020 compared to 7.9 percent in 2018, an increase of 11.4 percentage points. Nearly one-fifth, or 18.3 percent, of Hispanic adults reported psychological distress in 2020 compared to 4.4 percent in 2018, a more than four-fold increase of 13.9 percentage points. The researchers also found that psychological distress in adults age 55 and older almost doubled from 3.8 percent in 2018 to 7.3 percent in 2020.
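The changes above are expressed both as percentage-point differences and as fold increases; the short sketch below simply recomputes both from the figures quoted in the text, as a quick arithmetic check.

```python
# Restating the survey figures quoted above two ways:
# percentage-point change (2020 minus 2018) and fold increase (2020 / 2018).
pairs = {
    "all adults": (3.9, 13.6),
    "ages 18-29": (3.7, 24.0),
    "household income < $35,000": (7.9, 19.3),
    "Hispanic adults": (4.4, 18.3),
    "age 55 and older": (3.8, 7.3),
}
for group, (y2018, y2020) in pairs.items():
    print(f"{group}: +{y2020 - y2018:.1f} points, {y2020 / y2018:.1f}x")
```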

The survey found only a slight increase in feelings of loneliness, from 11 percent in 2018 to 13.8 percent in 2020, suggesting that loneliness is not driving increased psychological distress.

The findings were published online June 3 in a research letter in JAMA.

The disruptions of the COVID-19 pandemic--social distancing, fear of contracting the disease, economic uncertainty, including high unemployment--have negatively affected mental health. The pandemic has also disrupted access to mental health services.

“We need to prepare for higher rates of mental illness among U.S. adults post-COVID,” says Beth McGinty, PhD, associate professor in the Bloomberg School’s Department of Health Policy and Management. “It is especially important to identify mental illness treatment needs and connect people to services, with a focus on groups with high psychological distress including young adults, adults in low-income households, and Hispanics.”

The survey used a scale to assess feelings of emotional suffering and symptoms of anxiety and depression in the past 30 days. The survey questions included in this analysis did not ask specifically about COVID-19. The scale, a validated measure of psychological distress, has been shown to accurately predict clinical diagnoses of serious mental illness.

Using NORC AmeriSpeak, a nationally representative online survey panel, the researchers analyzed survey responses of 1,468 adults ages 18 and older. They compared the measure of psychological distress in this survey sample from April 2020 to an identical measure from the 2018 National Health Interview Survey.

"The study suggests that the distress experienced during COVID-19 may transfer to longer-term psychiatric disorders requiring clinical care," says McGinty. "Health care providers, educators, social workers, and other front-line providers can help promote mental wellness and support."

Credit: 
Johns Hopkins Bloomberg School of Public Health

Collaborative research addresses need for conservation of springs in drying climate

image: Northern Arizona University professor Abe Springer researches springs and aquifers in the Grand Canyon and other arid parts of the southwestern United States.

Image: 
Northern Arizona University

A Northern Arizona University professor co-authored a paper on the importance of springs in a drying climate that is in the inaugural climate change refugia special edition of Frontiers in Ecology and the Environment.

The issue focuses on refugia, which refers to areas that are relatively buffered from current climate change and shelter valued wildlife, ecosystems and other natural resources. Abe Springer, a professor of hydrogeology and ecohydrology in the School of Earth and Sustainability whose research focuses on springs and aquifer health, collaborated on "Oases of the future? Springs as potential hydrological refugia in drying climates."

The collaborators, who included the U.S. Geological Survey, the Rocky Mountain Research Station, the U.S. Department of Agriculture, the Nature Conservancy, Sky Island Alliance, Hampshire College, the Museum of Northern Arizona, the National Park Service and Glen Canyon National Recreation Area, reviewed relevant published studies on the role of springs as refuges to support plants and animals in drying climates. They created a conceptual model that takes into account the response of springs to drying events and the investigations researchers must carry out to identify and classify a spring's potential to be a refuge.

Springer contributed results and implications on springs as refugia from his research group's springs ecohydrology research with the Springs Stewardship Institute at the Museum of Northern Arizona. His role in developing a geomorphological-based classification system for springs ecosystems helped the team characterize and prioritize different types of refugia.

The results, while not surprising, do serve as a call to action to researchers and citizen scientists alike. Springs have served as refuges for some species through previous climatic changes, and that's likely to become even more true in the future. However, scientists still can't say with certainty what effects a drying climate can have on these delicate ecosystems.

"Springs importance of refugia may increase with future predicted drying in such places as the southwestern United States," Springer said. "Inventories of the richness and diversity of life at springs are still too limited to provide adequate knowledge of their response to drying events."

They can, however, make some educated guesses with the available data. Springer said springs serve as wet refuges for certain plants and animals; these refuges are fed by groundwater stored in large aquifers, which can offset the drying events somewhat, offering a long-term buffer to such short-term, climate-influenced events. But as aquifers dry up from human pumping, springs are at risk of drying up, affecting entire ecosystems and even putting species at risk of extinction.

These risks are what led to the special edition of the journal, edited by Toni Lyn Morelli, a research ecologist at the USGS's Northeast Climate Adaptation Center at the University of Massachusetts, Amherst. Morelli said she hoped bringing the issue of refugia to the fore would spur action and innovation among researchers and conservationists.

Northern Arizona, already an arid climate, is at particular risk as climate change leads to even more drying. Springer has studied aquifers and springs in this region for years, including how the Grand Canyon gets its water (perhaps counterintuitively, it's not from the Colorado River), and has previously assessed the condition and risk of 200 springs in the Coconino and Kaibab national forests. This research led to conservation priorities among these springs, which forest managers have implemented. The priorities cover a range of springs geomorphologies so as to include the range of necessary refugia.

"All climatic and human-induced changes to hydrologic systems influence the aquifers that supply waters to springs," Springer said. "Our regional studies about the hydrological influences of forest management is important for sustaining processes to buffer groundwater storage from drying climate."

Credit: 
Northern Arizona University

Researchers study genetic outcomes of translocating bighorn sheep

image: This bighorn sheep roams as part of the Absaroka herd near Cody, Wyo. University of Wyoming researchers led a study to evaluate the long-term impact of translocation actions on Rocky Mountain bighorn sheep in Wyoming. The research was published in a paper that appeared in the May 29 online issue of the Journal of Wildlife Management.

Image: 
Wyoming Game and Fish Department

Translocation is an important management tool used for nearly 100 years to increase bighorn sheep population numbers in Wyoming and to restore herds to suitable habitat throughout their historical range. Yet, translocation also can alter the underlying genetic diversity of managed wildlife species in both beneficial and detrimental ways.

To evaluate the long-term impact of bighorn sheep translocations, a University of Wyoming professor and postdoctoral researcher co-led a study from 2015-19. The research group characterized statewide genetic structure and diversity by using microsatellite and mitochondrial DNA data in 353 indigenous and translocated Rocky Mountain bighorn sheep populations in Wyoming.

"The results of this study provide a comprehensive view of the level of genetic diversity in bighorn sheep in Wyoming. This is really important because bighorn sheep populations in Wyoming and throughout the West were driven down to such low numbers between the gold rush era and the 1960s," says Holly Ernest, a UW professor of wildlife genomics and disease ecology, and the Wyoming Excellence Chair in Disease Ecology in the Department of Veterinary Sciences and the Program in Ecology. "This loss of population numbers historically was due, in large part, to part overharvest, exposure to livestock diseases and loss of their habitat. In Wyoming, they perhaps existed in the hundreds of thousands pre-mid-1800s, but were driven down to as low as 2,000 total in Wyoming by the 1960s. Large losses of individuals often mean large loss of genetic diversity, which is a major foundation of healthy populations."

Ernest was the senior and corresponding author of a paper, titled "Bighorn Sheep Genetic Structure in Wyoming Reflects Geography and Management," that was published in the May 29 online edition of the Journal of Wildlife Management. The journal publishes manuscripts containing information from original research that contributes to basic wildlife science. Suitable topics include investigations into the biology and ecology of wildlife and their habitats that have direct or indirect implications for wildlife management and conservation.

Sierra Love Stowell, a research genomicist and a UW postdoctoral researcher at the time (2016-18) of this work, was the paper's lead author. Roderick Gagne, a research scientist at Colorado State University and a UW postdoctoral researcher from 2015-17, and Doug McWhirter, a Wyoming Game and Fish Department wildlife biologist, were co-authors of the paper. Additionally, other wildlife biologists and officials from the Wyoming Game and Fish Department contributed to the paper.

Bighorn sheep are a key component of Wyoming's biodiversity and a species that provides important viewing and hunting opportunities.

Translocation is a tool used in wildlife management that involves the intentional, human-mediated movement of individual animals, populations or species from one area with release in another. Beyond the demographic effects of adding more individual bighorn sheep, translocated animals bring more genetic material that can increase genetic diversity and improve fitness in recipient populations. Translocation of bighorn sheep in Wyoming began in 1922 and still occurs today.

The study found there was high gene flow -- genetic interchange due to movement of animals with resulting successful breeding -- among herds that had translocation sources in common, and herds that received translocated individuals from other herds.

"We identified at least five genetic clusters of Rocky Mountain bighorn sheep in the major mountain ranges of Wyoming," Ernest says. "These genetic clusters generally align with current management units."

The herd units identified were in the Absaroka, Devil's Canyon, Jackson, Kouba Canyon and Whiskey Mountain areas.

For example, there is high gene flow among Devil's Canyon, Laramie Peak and Ferris-Seminoe herds. Devil's Canyon, including the surrounding habitat in the Bighorn Mountains, received the most translocations of any herd in Wyoming, including translocation of bighorn sheep from Whiskey Mountain near Dubois; Morgan Creek, Idaho; Missouri Breaks, Mont.; and the Lower Deschutes River, Ore.

Laramie Peak also received translocations from Whiskey Mountain and Montana. The Ferris-Seminoe herd was founded by translocation and continues to receive individuals from Devil's Canyon.

Ernest says the most interesting finding of the study is that bighorn sheep have maintained a distinctive population genetic structure in Wyoming, even with historical population losses and translocations.

"We found this intriguing and important, because we might have expected that the very large reductions in population sizes and extensive translocation events might have caused disintegration of population genetic structure, and an appearance of Wyoming bighorn sheep to be panmictic or 'all interbreeding,'" Ernest says. "But, they are not. They have distinctive populations."

The study used a panel of 38 variable microsatellite loci and 512 base pairs of mitochondrial DNA sequence to identify the genetic structure throughout the state and in translocation source herds; quantify the extent of genetic diversity within each genetic cluster; and estimate the degree of gene flow among herds.
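As a rough illustration of what "genetic diversity" means at a microsatellite locus, the sketch below computes expected heterozygosity from allele frequencies. It is a toy example with invented genotypes, not the study's analysis of 38 loci and mitochondrial sequence.

```python
# Illustrative sketch only: expected heterozygosity (He), a standard summary of
# genetic diversity at a microsatellite locus. The genotypes below are made up.
from collections import Counter

def expected_heterozygosity(genotypes):
    """He = 1 - sum(p_i^2) over allele frequencies p_i at one locus."""
    alleles = [a for g in genotypes for a in g]        # flatten diploid genotypes
    counts = Counter(alleles)
    total = sum(counts.values())
    return 1.0 - sum((n / total) ** 2 for n in counts.values())

# Hypothetical genotypes (allele lengths) at one locus for two herds
herd_a = [(120, 124), (120, 120), (124, 128), (120, 124)]
herd_b = [(120, 120), (120, 120), (120, 124), (120, 120)]

print("He herd A:", round(expected_heterozygosity(herd_a), 3))
print("He herd B:", round(expected_heterozygosity(herd_b), 3))
```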

In the early 1800s, the estimated number of bighorn sheep in Wyoming was between 150,000 and 200,000. Overharvest, habitat loss and livestock-transmitted disease outbreaks led to severe population declines. By the 1960s, bighorn sheep numbers had dwindled to about 2,000 before rebounding to roughly 7,000 in 1990.

Today's estimates are between 6,000 and 7,000 animals, with variable demographic rates between herds. Population rebound following the steep decline is attributed to management efforts, which include limiting harvest, preventing disease outbreaks and translocating individual sheep for reintroduction and demographic control.

The study's results provide a statewide assessment of genetic diversity and structure that will enhance management by understanding the outcomes of translocation, identifying the source of unknown individuals and parameterizing disease ecology models, Ernest says.

"The source herd identification of wandering bighorn sheep is important when determining if management actions, such as herd reductions, can be applied to reduce the likelihood of animals leaving the herd for extended forays," Love Stowell says. "Effective population sizes are low in most Wyoming herds, suggesting that managers should weigh the importance of maintaining gene flow for increasing genetic diversity and effective population size against the risks of disease transmission, outbreeding depression, phenology mismatch and other factors. Finally, this research provides a baseline for genetic monitoring in the face of future disease outbreaks or extreme weather events."

Outbreeding depression occurs when interbreeding between two distinct populations leads to reductions in survival and reproduction, Love Stowell says. Phenology mismatch is a mismatch between the timing of life events, such as lambing in bighorn sheep, and conditions in their environment.

Credit: 
University of Wyoming

'Different techniques needed' to detect fingermarks on new banknotes

Techniques used to detect fingermarks on traditional cotton banknotes are not effective on Scottish banks' new polymer notes and different methods are required, according to a study by University of Strathclyde researchers.

Researchers from the University's Centre for Forensic Science and the Scottish Police Authority Forensic Services tested three techniques to determine which would work best on polymer notes owing to their non-porous fabric, which contrasts with the porous materials in cotton notes.

They found that superglue fuming, followed by black magnetic powder, was the most effective process for enhancing fingermarks on all of the note types tested. Infrared light was the most effective light source for enhancing ridge detail on the fingermarks. This process has been found to be effective on polymer notes, in the same way that other processes were on cotton notes.

Uncirculated notes provided by the Royal Bank of Scotland (RBS) and Clydesdale Bank were used in the study, which is the first on the recovery of latent fingermarks from these types of polymer notes. The research was proposed by the Scottish Police Authority (SPA) and was carried out in its Forensic Services laboratories at the Scottish Crime Campus in Gartcosh.

The research has been published in the journal Forensic Science International.

Dr Penny Haddrill, a Teaching Fellow in the Centre for Forensic Science, is a co-author of the study. She said: "The techniques used in this research are not new but different combinations of existing techniques were examined to find out which ones, and the order in which they were used, would be most effective for recovering fingermarks from polymer notes.

"This is a good example of Strathclyde research being put into practice, as the methods in the study are now being used operationally. It's also part of our ongoing collaboration with the Scottish Police Authority Forensic Services, where many of our MSc Forensic Science students have carried out research work over the years."

Carina Joannidis led the study while an MSc student at Strathclyde and is now a Mark Enhancement Recovery Officer at Forensic Services SPA. She said: "With banks introducing polymer banknotes, it is important forensic scientists are able to continue to recover fingerprints from money seized under the Proceeds of Crime Act which has been used by convicted criminals in Serious and Organised Crime.

"This paper, written in collaboration with the University of Strathclyde, details effective ways to enhance fingerprints on new polymer banknotes so we can continue to serve the needs of justice in Scotland. Forensic Services work closely with Police Scotland, the COPFS and other partners to tackle the harm caused by serious organised crime for the people of Scotland."

The UK's Home Office carried out a fingermark detection study in 2016 on Bank of England notes but Scottish banknotes appear to have more textured areas and a less smooth finish. Notes worth £5 and £10 from both RBS and Clydesdale were examined in the Strathclyde study; they were separated into time periods of seven, 14, 21 and 28 days, to determine the effectiveness of each process on aged fingermarks.

Each of the fingermarks, provided by a group of donors, was given a score between zero and four for visibility. The scores across each series of 10 were then added together to give total scores out of 40, while each series was given a second overall score out of 10; this indicated how many of the 10 fingermarks gave a score of one or more and so contained any evidence of a fingermark.
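The scoring scheme described above can be restated in a few lines of code. This is only an illustration of the arithmetic, with invented visibility grades, not the study's data.

```python
# Sketch of the scoring scheme as described: each of 10 fingermarks in a series
# is graded 0-4 for visibility; a series then gets a total out of 40 and a count
# (out of 10) of marks scoring at least 1. The grades below are invented examples.
def score_series(grades):
    assert len(grades) == 10 and all(0 <= g <= 4 for g in grades)
    total_out_of_40 = sum(grades)
    marks_detected_out_of_10 = sum(1 for g in grades if g >= 1)
    return total_out_of_40, marks_detected_out_of_10

example_series = [0, 1, 3, 2, 0, 4, 1, 0, 2, 3]   # hypothetical visibility grades
print(score_series(example_series))                # -> (16, 7)
```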

The researchers aim to extend their study to cover £20 banknotes issued by RBS and Clydesdale, as well as the Bank of Scotland's polymer notes.

Credit: 
University of Strathclyde

Living near oil and gas wells tied to low birth weights in infants

Berkeley -- Living near active oil and gas wells may put pregnant people at higher risk of having low birth weight babies, especially in rural areas, finds a new study of birth outcomes in California.

The study, funded by the California Air Resources Board, is one of the largest of its kind and the first in the state. It analyzed the records of nearly 3 million births to people living within 6.2 miles (10 kilometers) of at least one oil or gas well between 2006 and 2015. Unlike previous studies, it examined births in both rural and urban areas, and people living near both active and inactive oil and gas sites.

The study found that, in rural areas, pregnant people who lived within 0.62 miles (1 kilometer) of the highest producing wells were 40% more likely to have low birth weight babies and 20% more likely to have babies who were small for their gestational age compared to people living farther away from wells or near inactive wells only. Among term births, babies were 1.3 ounces (36 grams) smaller, on average, than those of their counterparts.

People living near active wells in urban areas also had slightly higher odds of having small-for-gestational-age babies than their counterparts. The study did not find a significant relationship between proximity to oil and gas wells and premature births.

"Being born of low birth weight or small for gestational age can affect the development of newborns and increase their risk of health problems in early childhood and even into adulthood," said Rachel Morello-Frosch, a professor of public health and of environmental science, policy and management at the University of California, Berkeley, and senior author of the paper. "When you see a shift of over 30 grams of birth weight among term infants, from an individual clinical perspective, it may not seem like a lot. But when you see that kind of large population shift in birth weight -- that can have significant population level implications for infant and children's health."

The findings, published June 3 in the journal Environmental Health Perspectives, add to a growing body of evidence linking proximity to oil and gas wells to a variety of adverse birth outcomes, including premature birth, heart defects and low birth weight.

Oil production in California has generally declined over the past three decades, and Gov. Gavin Newsom last year issued stricter regulations on new fracking permits in the state. However, the state issued 24 new fracking permits in early April, and another 282 are awaiting review.

"This study is the first to characterize the implications for perinatal health of active oil and gas production in the state, and I think the results can inform decision-making in regulatory enforcement and permitting activities." Morello-Frosch said. "Results from health studies such as ours support recent efforts to increase buffers between active well activities and where people live, go to school and play. This scientific evidence of adverse health effects facing vulnerable populations, including pregnant women, should be taken into account as Californians debate the extent to which we to want to expand oil and gas drilling in our state."

A long history of oil production

Previous research linking oil and gas production to adverse birth outcomes has examined people living near fracking sites in Colorado, Pennsylvania, Oklahoma and Texas. Oil production in California differs from some of these other regions because the infrastructure is generally much older, and the state has a high number of inactive wells.

In addition, because of the geology of the region, many of the sites use enhancement techniques, including fracking and steam and water injection, to access oil reserves, said study lead author Kathy Tran, a graduate student in environmental health sciences at UC Berkeley.

"Even though the California oil and gas industry dates back to the early 1900s, there hasn't been any analysis looking at potential health effects related to oil and gas exposure," Tran said.

Both active and inactive oil and gas sites create a myriad of environmental hazards that have the potential to impact perinatal health, including air and water pollutants, noise and excessive lighting. However, with limited access to the production sites themselves, it can be hard for researchers to pinpoint precisely what factors might be contributing to adverse birth outcomes.

"A lot of the equipment that's being operated on site is a contributor to air pollution, but how much air pollution is an unknown because the inventory industry reports are estimated based on emissions factors, as opposed to measured emissions levels." Tran said. "We assume that with greater production volume, the equipment is being used more intensively. And for that reason, that may be a significant contributor to why we see some impacts related to adverse birth outcomes."

The study corrected for a variety of demographic factors that might also impact birth outcomes, including race, ethnicity, socioeconomic status, maternal education and other neighborhood-level factors, including other sources of air pollution.

While it's unclear why the differences in birth weight were more pronounced in rural areas than in urban areas, the researchers hypothesize that other factors -- such as differences in indoor air quality, maternal occupation or housing conditions -- may have impacted the results.

In the future, Tran hopes that measurements of people's actual exposure to potentially toxic pollutants from oil and gas sites will help pinpoint the culprits behind these findings.

"Because researchers don't have direct access to the actual oil and gas sites, it's hard to get a good estimate of what people actually experience," Tran said. "Obviously, things like wind direction and water movement and other environmental conditions factor into personal exposure, as well. And for that reason, the more in-depth exposure assessment we can get, the more we can really understand why we are seeing the effects that we see."

Credit: 
University of California - Berkeley

Get excited by neural networks

image: Scientists at The University of Tokyo use machine learning to predict the excited electronic states of materials--research that can accelerate both the characterization of materials and the formulation of new, useful compounds.

Image: 
Institute of Industrial Science, The University of Tokyo

Tokyo, Japan - Researchers at the Institute of Industrial Science, The University of Tokyo (UTokyo-IIS) used artificial intelligence to rapidly infer the excited state of electrons in materials. This work can help material scientists study the structures and properties of unknown samples and assist with the design of new materials.

Ask any chemist, and they will tell you that the structures and properties of materials are primarily determined by the electrons orbiting the molecules that make them up. To be specific, the outermost electrons, which are most accessible for participating in bonding and chemical reactions, are the most critical. These electrons can rest in their lowest energy "ground state," or be temporarily kicked into a higher orbit called an excited state. Having the ability to predict excited states from ground states would go a long way to helping researchers understand the structures and properties of material samples, and even design new ones.

Now, scientists at UTokyo-IIS have developed a machine learning algorithm to do just that. Using the power of artificial neural networks--which have already proven themselves useful for deciding if your latest credit card transaction was fraudulent or which movie to recommend streaming--the team showed how an artificial intelligence can be trained to infer the excited state spectrum by knowing the ground states of the material.

"Excited states usually have atomic or electronic configurations that are different from their corresponding ground states," says first author Shin Kiyohara. To perform the training, the scientists used data from core-electron absorption spectroscopy. In this method, a high energy X-ray or electron is used to knock out a core electron orbiting close to the atomic nucleus. Then, the core electron excites to unoccupied orbitals, absorbing the energy of the high energy X-ray/electron. Measuring this energy absorption reveals information about the atomic structures, chemical bonding, and properties of materials.

The artificial neural network took as input the ground state partial density of states, which can be easily computed, and was trained to predict the corresponding excited state spectra. One of the main benefits of using neural networks, as opposed to conventional computational methods, is the ability to apply the results from the training set to completely new situations.
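As a hedged illustration of the general approach, rather than the published model, the sketch below trains a small feedforward network to map a ground-state partial density of states (PDOS) vector to an excited-state spectrum vector, using synthetic stand-in data.

```python
# Minimal sketch of the general idea (not the published model): a feedforward
# network mapping a ground-state PDOS vector to an excited-state spectrum vector.
# The data here are synthetic stand-ins with an arbitrary hidden mapping.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n_samples, n_pdos_bins, n_spectrum_bins = 500, 64, 32

X = rng.random((n_samples, n_pdos_bins))                 # ground-state PDOS inputs
W = rng.normal(size=(n_pdos_bins, n_spectrum_bins))      # hidden synthetic mapping
y = np.tanh(X @ W)                                       # surrogate "spectra" targets

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = MLPRegressor(hidden_layer_sizes=(128, 128), max_iter=2000, random_state=0)
model.fit(X_tr, y_tr)
print("Held-out R^2:", round(model.score(X_te, y_te), 3))
```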

"The patterns we discovered for one material showed excellent transferability to others," says senior author Teruyasu Mizoguchi. "This research in excited states can help scientists better understand chemical reactivity and material function in new or existing compounds."

Credit: 
Institute of Industrial Science, The University of Tokyo

'Excretion of sugar into stool'? New action of anti-diabetic drug discovered

image: The areas where FDG (sugar) is accumulated appear black. In patients taking metformin (right), the intestine appears black, which indicates that FDG (sugar) is accumulated in the intestine.

Image: 
Kobe University

A research team led by Kobe University Graduate School of Medicine's Professor OGAWA Wataru (the Division of Diabetes and Endocrinology) and Project Associate Professor NOGAMI Munenobu (the Department of Radiology) has discovered that metformin, the most widely prescribed anti-diabetic drug, causes sugar to be excreted in the stool.

Metformin has been used for more than 60 years, and is the most frequently prescribed drug for diabetes in the world. Administration of metformin lowers blood sugar levels, but the mechanism behind this effect was not clear. Metformin's mode of action has thus been actively researched around the world.

Taking advantage of the new bio-imaging apparatus PET-MRI, the research team revealed that metformin promotes the excretion of blood sugar from the large intestine into the stool. This is a completely new discovery that had not previously been predicted.

The current finding may explain metformin's biological actions for which the underlying mechanism is unknown, and contribute to the development of new drugs for diabetes.

These findings were published on June 3, 2020 in the online edition of Diabetes Care, a medical journal published by the American Diabetes Association.

Main Points

Metformin is the most frequently prescribed drug for diabetes in the world. The mechanism by which this drug lowers blood sugar concentration is not clear.

In a bioimaging study on humans, metformin was found to promote the "excretion of sugar into the stool".

This newly discovered action of metformin may explain some of the drug's biological effects and contribute to the development of new medication for diabetes.

Research Background

Diabetes is characterized by the elevation of blood sugar concentration, which damages the blood vessels and in turn leads to various diseases. More than 400 million people suffer from diabetes around the world; the prevention of diabetes and its related diseases is therefore an important global medical issue.

A number of drugs that reduce blood sugar concentration are available. Among them, metformin is one of the oldest classes of drugs and has been used for more than 60 years. Metformin, recommended as a first-line drug in many countries, is the most frequently prescribed medication for diabetes.

However, the mechanism by which metformin lowers blood sugar concentration is not clear. Elucidation of this mechanism would contribute to the development of new and better drugs for diabetes. Consequently, research has been actively conducted into the action of metformin.

Summary of the Discovery

FDG-PET (fluorodeoxyglucose-positron emission tomography) is an imaging test to study where and how much FDG (a substance similar to sugar) accumulates in the body after this substance is administered intravenously. Because FDG behaves in a similar way to sugar in the human body, FDG-PET can reveal organs or tissues that consume or accumulate large amounts of sugar (*1).

FDG-PET is generally conducted with a device that integrates PET with a CT (computed tomography) scanner. Obtaining images using FDG-PET and CT allows for the detailed examination of locations where FDG is accumulated. Recently, a device in which PET and MRI (magnetic resonance imaging) are integrated (PET-MRI) has been developed. MRI is used to examine the inside of the body using a strong magnetic field. It can examine bodily structures that cannot be analyzed by CT. PET-MRI is still a rare and valuable device; for example, only 9 units have been installed in Japan.

Professor Ogawa's research team used PET-MRI to investigate the movement of sugar in the bodies of diabetic patients, both those who were taking metformin and those who were not. The team found that sugar (i.e. FDG) is heavily accumulated in the intestine of patients taking metformin (Fig. 1). To understand where in the intestine sugar accumulates, the research team subsequently investigated the "wall of the intestine" and the "inside of the intestine (stool and other contents)" separately using a special technique.

They found that, in patients taking metformin, more sugar was accumulated in the areas inside the intestine that are distal to the ileum (the anal side part of the small intestine) (Fig. 2). On the other hand, there was no difference in sugar accumulation in the "wall of the intestine" between patients who were taking and not taking metformin.

These results indicate that, when a patient takes metformin, sugar in the blood is released from the intestine into the stool. Both the finding that metformin promotes the excretion of sugar into the stool and the revelation that sugar is excreted from the intestine into the stool at all are new discoveries that were not anticipated.

Recently, a new class of anti-diabetic drug has been launched: the SGLT2 inhibitors, which lower blood sugar concentrations by excreting sugar in the urine. Their beneficial clinical effects are attracting much attention. The excretion of sugar into the stool triggered by metformin may contribute to lowering blood sugar in a manner analogous to the SGLT2 inhibitors.

The significance of this research and its future development

Previous studies using PET-CT showed that FDG was accumulated in the intestines of patients taking metformin. It was, however, assumed without sufficient evidence that FDG (sugar) was accumulated in the "wall of the intestine," because PET-CT cannot separately show the wall and the inside of the intestine. In the current study, the new imaging technology PET-MRI allowed the research team to investigate the accumulation in the wall and the inside of the intestine (stool) separately, revealing for the first time that metformin-induced accumulation of sugar occurred exclusively inside the intestine.

Taking an SGLT2 inhibitor results in the excretion of tens of grams of sugar per day in the urine. In this study, it was not possible to quantitatively evaluate how many grams of sugar are excreted in the stool. The significance of this discovery will be further confirmed by using a new imaging method that will enable the excreted sugar in the stool to be quantified.

It is thought that changes in the intestinal flora caused by metformin (*2) are related to its blood sugar lowering effect, but how metformin alters the intestinal flora is completely unknown. Since changes in nutrients such as sugar affect the growth of bacteria, it is possible that metformin's effect of excreting sugar into the intestine may also be related to the changes in the intestinal flora.

Credit: 
Kobe University

Cholesterol levels dropping in Western nations -- but rising in Asia

Cholesterol levels are declining sharply in Western nations, but rising in low- and middle-income nations - particularly in Asia, suggests the largest ever study of global cholesterol levels.

The new study, by hundreds of researchers from across the world, was led by Imperial College London and published in the journal Nature.

The research used data from 102.6 million individuals and examined cholesterol levels in 200 countries, across a 39-year time period, from 1980 to 2018.

The work, which was funded by the Wellcome Trust and the British Heart Foundation, revealed that high cholesterol is responsible for about 3.9 million deaths worldwide. Half of these deaths happen in East, South and Southeast Asia.

Cholesterol is a waxy substance found in the blood. The body needs cholesterol to build healthy cells, but too much can lead to a build-up in the blood vessels. Cholesterol comes in different types. High-density lipoprotein (HDL) 'good' cholesterol, which should be 1mmol/L or above, is thought to have a protective effect against heart attack and stroke, by mopping up excess 'bad' cholesterol.

Non-HDL 'bad' cholesterol, which should be as low as possible, around 2mmol/L, can block blood supply and lead to heart attacks and strokes. This type of cholesterol is usually raised by diets high in saturated and trans fats, which are found in many processed foods, instead of healthier unsaturated fats. It can be lowered effectively through the use of statins.
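For reference, non-HDL cholesterol is conventionally calculated as total cholesterol minus HDL; the toy snippet below restates that arithmetic and the rough guide values mentioned above. It is illustrative only, with hypothetical readings, and is not clinical guidance.

```python
# Simple illustration of the quantities discussed: non-HDL cholesterol is total
# cholesterol minus HDL. The thresholds restate the article's rough guide
# (HDL at or above 1 mmol/L; non-HDL ideally around 2 mmol/L or lower).
def non_hdl(total_mmol_l: float, hdl_mmol_l: float) -> float:
    return total_mmol_l - hdl_mmol_l

total, hdl = 5.2, 1.3          # hypothetical readings in mmol/L
bad = non_hdl(total, hdl)
note = "above the ~2 mmol/L guide" if bad > 2.0 else "within the ~2 mmol/L guide"
print(f"non-HDL = {bad:.1f} mmol/L ({note})")
```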

The results of the new study revealed total and non-HDL cholesterol levels have fallen sharply in high income nations, particularly those in North-western Europe, North America and Australasia, while rising in low- and middle-income nations, particularly in East and Southeast Asia. China, which had some of the lowest levels of non-HDL cholesterol in 1980, had one of the largest rates of increase in non-HDL over the 39 year study period.

Professor Majid Ezzati, lead author of the research from Imperial's School of Public Health, said: "For the first time, the highest levels of non-HDL cholesterol are outside of the Western world. This suggests we now need to set into place throughout the world pricing and regulatory policies that shift diets from saturated to non-saturated fats, and to prepare health systems to treat those in need with effective medicines. This will help save millions of deaths from high non-HDL cholesterol in these regions."

Countries with the highest levels of non-HDL cholesterol, which is a marker of cardiovascular risk, changed from those in Western Europe such as Belgium, Finland, Greenland, Iceland, Norway, Sweden, Switzerland and Malta in 1980 to those in Asia and the Pacific, such as Tokelau, Malaysia, the Philippines and Thailand.

Professor Ezzati added that some of the reduction in non-HDL cholesterol levels in Western nations is due to the increased use of statins, which are not yet used widely in low- and middle-income countries.

The team point out that some countries had less data compared to others, which could influence how certain we are about cholesterol levels and changes over time.

Professor Sir Nilesh Samani, Medical Director at the British Heart Foundation said: "It's encouraging to see the reduction in levels of non-HDL, or 'bad', cholesterol in the UK since 1980. Public health initiatives about the risks of a diet high in saturated fat, and wider treatment with statins in those with high levels will have made a big contribution. The result is undoubtedly fewer heart attacks and strokes. However, we mustn't be complacent or be misled by this change. High numbers of people still have undiagnosed or uncontrolled levels of non-HDL cholesterol putting them at greater risk of heart and circulatory diseases. We strongly encourage people, especially those over 40, to have their cholesterol checked. It's important for those diagnosed with high non-HDL cholesterol to follow their doctor's advice for lowering it."

The research, which produced cholesterol comparisons for all 200 countries, also found that:

Non-HDL cholesterol for UK women ranked 18th highest in the world and 16th highest in Europe in 1980. In 2018, it ranked 130th highest in the world and 34th highest in Europe.

Non-HDL cholesterol for UK men ranked 18th highest in the world and 16th highest in Europe in 1980. In 2018, it ranked 106th highest in the world and 35th highest in Europe.

Non-HDL cholesterol for US women ranked 50th highest in the world in 1980. In 2018, it ranked 127th highest in the world.

Non-HDL cholesterol for US men ranked 42nd highest in the world in 1980. In 2018, it ranked 108th highest in the world.

Non-HDL cholesterol for Chinese women ranked 152nd highest in the world in 1980. In 2018, it ranked 110th highest in the world.

Non-HDL cholesterol for Chinese men ranked 153rd highest in the world in 1980. In 2018, it ranked 99th highest in the world.

Non-HDL cholesterol for Indian women ranked 139th highest in the world in 1980. In 2018, it ranked 140th highest in the world.

Non-HDL cholesterol for Indian men ranked 128th highest in the world in 1980. In 2018, it ranked 128th highest in the world.

Non-HDL cholesterol for Australian women ranked 32nd highest in the world in 1980. In 2018, it ranked 146th highest in the world.

Non-HDL cholesterol for Australian men ranked 31st highest in the world in 1980. In 2018, it ranked 116th highest in the world.

Non-HDL cholesterol for Canadian women ranked 37th highest in the world in 1980. In 2018, it ranked 147th highest in the world.

Non-HDL cholesterol for Canadian men ranked 32nd highest in the world in 1980. In 2018, it ranked 123rd highest in the world.

Non-HDL cholesterol for German women ranked 23rd highest in the world and 19th highest in Europe in 1980. In 2018, it ranked 100th highest in the world and 27th highest in Europe.

Non-HDL cholesterol for German men ranked 24th highest in the world and 21st highest in Europe in 1980. In 2018, it ranked 83rd highest in the world and 31st highest in Europe.

Non-HDL cholesterol for French women ranked 27th highest in the world and 22nd highest in Europe in 1980. In 2018, it ranked 42nd highest in the world and 4th highest in Europe.

Non-HDL cholesterol for French men ranked 15th highest in the world and 14th highest in Europe in 1980. In 2018, it ranked 26th highest in the world and 11th highest in Europe.

Credit: 
Imperial College London

Largest, oldest Maya monument suggests importance of communal work

From the ground, it's impossible to tell that the plateau underfoot is something extraordinary. But from the sky, with laser eyes, and beneath the surface, with radiocarbon dating, it's clear that it is the largest and oldest Maya monument ever discovered.

Located in Tabasco, Mexico, near the northwestern border of Guatemala, the newly discovered site of Aguada Fénix lurked beneath the surface, hidden by its size and low profile until 2017. The monument measures nearly 4,600 feet long, ranges from 30 to 50 feet high and includes nine wide causeways.

The monument was discovered by an international team led by University of Arizona professors in the School of Anthropology Takeshi Inomata and Daniela Triadan, with support from the university's Agnese Nelms Haury program and under the authorization of the National Institute of Anthropology and History of Mexico.

They used lidar - or light detection and ranging - technology, which uses laser-emitting equipment mounted on an airplane. Laser beams penetrate the tree canopy, and their reflections off the ground's surface reveal the three-dimensional forms of archaeological features. The team then excavated the site and radiocarbon-dated 69 samples of charcoal to determine that it was constructed sometime between 1,000 and 800 B.C. Until now, the Maya site of Ceibal, built in 950 B.C., was the oldest confirmed ceremonial center. This oldest monumental building at Aguada Fénix turned out to be the largest known in all of Maya history, far exceeding the pyramids and palaces of later periods.
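As a loose illustration of how lidar can reveal a bare-earth surface beneath vegetation, the toy sketch below grids synthetic laser returns and keeps the lowest elevation in each cell. It is not the project's processing chain; all points, spacings and units are invented.

```python
# Toy sketch of the general idea behind lidar terrain mapping (not the project's
# actual processing): bin returns into a grid and keep the lowest elevation per
# cell to approximate the bare-earth surface beneath vegetation.
import numpy as np

rng = np.random.default_rng(2)
n_points = 20_000
x = rng.uniform(0, 100, n_points)                     # synthetic survey area (m)
y = rng.uniform(0, 100, n_points)
ground = 0.1 * x + 5 * np.sin(y / 15)                 # synthetic terrain surface
z = ground + rng.exponential(2.0, n_points)           # many returns bounce off canopy

cell = 5.0                                            # grid cell size (m)
ix = (x // cell).astype(int)
iy = (y // cell).astype(int)
dem = np.full((20, 20), np.inf)
np.minimum.at(dem, (ix, iy), z)                       # lowest return per cell ~ ground

finite = dem[np.isfinite(dem)]
print("Estimated ground elevation range:",
      round(finite.min(), 1), "to", round(finite.max(), 1), "m")
```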

The team's findings are published today in the journal Nature.

"Using low-resolution lidar collected by the Mexican government, we noticed this huge platform. Then we did high-resolution lidar and confirmed the presence of a big building," Inomata said. "This area is developed - it's not the jungle; people live there - but this site was not known because it is so flat and huge. It just looks like a natural landscape. But with lidar, it pops up as a very well-planned shape."

The discovery marks a time of major change in Mesoamerica and has several implications, Inomata said.

First, archaeologists traditionally thought Maya civilization developed gradually. Until now, it was thought that small Maya villages began to appear between 1000 and 350 B.C., during what's known as the Middle Preclassic period, along with the use of pottery and some maize cultivation.

Second, the site looks similar to the older Olmec civilization center of San Lorenzo to the west in the Mexican state of Veracruz, but the lack of stone sculptures related to rulers and elites, such as colossal heads and thrones, suggests less social inequality than San Lorenzo and highlights the importance of communal work in the earliest days of the Maya.

"There has always been debate over whether Olmec civilization led to the development of the Maya civilization or if the Maya developed independently," Inomata said. "So, our study focuses on a key area between the two."

The period in which Aguada Fénix was constructed marked a gap in power - after the decline of San Lorenzo and before the rise of another Olmec center, La Venta. During this time, there was an exchange of new ideas, such as construction and architectural styles, among various regions of southern Mesoamerica. The extensive plateau and the large causeways suggest the monument was built for use by many people, Inomata said.

"During later periods, there were powerful rulers and administrative systems in which the people were ordered to do the work. But this site is much earlier, and we don't see the evidence of the presence of powerful elites. We think that it's more the result of communal work," he said.

The fact that monumental buildings existed earlier than thought and when Maya society had less social inequality makes archaeologists rethink the construction process.

"It's not just hierarchical social organization with the elite that makes monuments like this possible," Inomata said. "This kind of understanding gives us important implications about human capability, and the potential of human groups. You may not necessarily need a well-organized government to carry out these kinds of huge projects. People can work together to achieve amazing results."

Inomata and his team will continue to work at Aguada Fénix and do a broader lidar analysis of the area. They want to gather information about surrounding sites to understand how they interacted with the Olmec and the Maya.

They also want to focus on the residential areas around Aguada Fénix.

"We have substantial information about ceremonial construction," Inomata said, "but we want to see how people lived during this period and what kind of changes in lifestyle were happening around this time."

Credit: 
University of Arizona

New laser system provides 3D reconstructions of living deep-sea animals and mucus filters

image: These illustrations show how a sheet of laser light from the DeepPIV system illuminates the inside of a larvacean filter, revealing internal structures.

Image: 
Image: © 2020 MBARI

Living in an essentially zero-gravity environment, many deep-sea animals have evolved soft, gelatinous bodies and collect food using elaborate mucus filters. Until now, studying these delicate structures has been virtually impossible. A new study published in the journal Nature describes a unique laser-based system for constructing 3D models of diaphanous marine animals and the mucus structures they secrete.

According to Kakani Katija, MBARI Principal Engineer and the lead author on the new paper, "Mucus is ubiquitous in the ocean, and complex mucus structures are made by animals for feeding, health, and protection. Now that we have a way to visualize these structures deep below the surface we can finally understand how they function and what roles they play in the ocean."

For this study, the researchers focused on one of the most prolific mucus architects, deep-sea animals called larvaceans. Larvaceans are abundant throughout the world's ocean basins and range from less than one centimeter to about 10 centimeters in length. So-called "giant" larvaceans create balloon-like mucus webs that can be up to a meter across. Inside these outer filters are smaller, fist-sized inner filters that the animals use to feed on tiny particles and organisms, ranging from less than a micron to a few millimeters in size.

Despite their insubstantial bodies, larvaceans remove vast amounts of carbon-rich food from the surrounding water. When their mucus filters become clogged, the animals release the mucus, which sinks rapidly to the seafloor. This helps the ocean remove carbon dioxide from the atmosphere and carries microplastics from the water column down to the seafloor.

Researchers, like MBARI Senior Scientist and co-author Bruce Robison, have long been interested in how larvaceans can filter a wide variety of particles while processing very large volumes of water (up to 80 liters an hour). Previous studies have looked at smaller larvacean filters in the laboratory, but this is the first study to provide quantitative data about these mucus structures in the open ocean.

To gather these data, Katija, who heads MBARI's Bioinspiration Lab, worked with a team of engineers, scientists, and submersible pilots to develop an instrument called DeepPIV (PIV stands for particle image velocimetry). Mounted on a remotely operated vehicle (ROV), the DeepPIV instrument projects a sheet of laser light that illuminates particles in the water, like dust motes in a sunbeam. By recording the movement of these particles in video, researchers can quantify tiny currents around marine animals as well as water flowing through their filters and their transparent bodies.
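The core PIV idea, estimating how far a particle pattern moves between two video frames from the peak of their cross-correlation, can be sketched in a few lines. The example below is a hypothetical toy, not MBARI's DeepPIV software; the frames, shift and frame interval are invented.

```python
# Toy illustration of the core PIV idea (not MBARI's DeepPIV code): estimate the
# displacement of a particle pattern between two frames from the peak of their
# cross-correlation, then convert displacement to velocity using the frame interval.
import numpy as np
from scipy.signal import fftconvolve

rng = np.random.default_rng(3)
frame1 = rng.random((64, 64))                         # particle image at time t
true_shift = (3, 5)                                   # particles drift 3 px down, 5 px right
frame2 = np.roll(frame1, true_shift, axis=(0, 1))     # particle image at time t + dt

# Cross-correlate the two frames (convolution with a flipped kernel) and find the peak.
corr = fftconvolve(frame2 - frame2.mean(),
                   (frame1 - frame1.mean())[::-1, ::-1], mode="same")
peak = np.unravel_index(np.argmax(corr), corr.shape)
center = np.array(corr.shape) // 2                    # zero-displacement position
dy, dx = peak[0] - center[0], peak[1] - center[1]

dt = 1 / 30                                           # hypothetical frame interval (s)
print(f"Estimated displacement: ({dy}, {dx}) px -> velocity ({dy / dt:.0f}, {dx / dt:.0f}) px/s")
```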

During field deployments of the DeepPIV system, Katija and her colleagues discovered that, as the ROV moved back and forth, the sheet of laser light revealed a series of cross sections through the transparent, gelatinous bodies and the mucus filters of giant larvaceans. By assembling a series of these cross-sectional images, the team was able to create three-dimensional reconstructions of individual larvaceans and their filters, much as radiologists do following a CAT scan of a human body.
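The slice-stacking step can likewise be sketched conceptually: successive laser-sheet cross sections are stacked along the scan axis into a voxel volume, from which sizes can be estimated. The code below is an illustrative stand-in, not the team's reconstruction software; the shapes, spacings and units are invented.

```python
# Conceptual sketch of slice-stacking reconstruction (not MBARI's software):
# successive laser-sheet cross sections, captured as the sheet sweeps through the
# subject, are stacked along the scan axis into a 3D voxel volume.
import numpy as np

n_slices, height, width = 50, 128, 128
volume = np.zeros((n_slices, height, width))

yy, xx = np.mgrid[0:height, 0:width]
for k in range(n_slices):
    # Synthetic cross section: a filled circle whose radius varies along the scan,
    # standing in for one laser-sheet image of a filter or gelatinous body.
    radius = 40 * np.sin(np.pi * (k + 1) / (n_slices + 1))
    volume[k] = ((yy - 64) ** 2 + (xx - 64) ** 2) <= radius ** 2

# Convert voxel count to volume using the sheet spacing and pixel size (made up here).
slice_spacing_mm, pixel_mm = 2.0, 0.5
voxel_mm3 = slice_spacing_mm * pixel_mm * pixel_mm
print("Reconstructed volume:", round(volume.sum() * voxel_mm3 / 1000, 1), "cm^3")
```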

Collecting high-fidelity video imagery required skilled piloting of MBARI's ROVs. "Using DeepPIV to collect these 3D cross sections is probably the hardest thing I've ever done with an ROV," said Knute Brekke, chief pilot for ROV Doc Ricketts. "We were using a 12,000 pound robot to move a millimeter-thick laser sheet back and forth through a larvacean and its fist-sized mucus filter that was drifting hundreds of meters below the ocean surface."

Combining three-dimensional models of larvacean filters with observations of flow patterns through the filters, Katija and her collaborators were able, for the first time, to identify the shape and function of different parts of the larvacean's inner filter. Using 3D rendering software, they were able to virtually "fly through" the inner filter and study the flow of fluid and particles through different parts of the filter.

"Now we have a technique for understanding the form of these complex structures, and how they function," Katija explained. "No one has done in situ 3D reconstructions of mucus forms like this before."

"Among other things, we're hoping to understand how larvaceans build and inflate these structures," she continued. "This could help us design better 3D printers or build complex inflatable structures that could be used in a number of environments," including underwater and in outer space.

Expanding on this work, members of the Bioinspiration Lab are experimenting with new 3D plenoptic imaging systems that can capture highly-precise information about the intensity, color, and direction of light in a scene. They are also collaborating on the development of new underwater robots that will be able to follow gelatinous animals through the water for hours or days at a time.

"In this paper, we have demonstrated a new system that operates well with a variety of underwater vehicles and midwater organisms," said Katija. "Now that we have a tool to study the mucus filtering systems found throughout the ocean, we can finally bring to light some of nature's most complex structures."

"DeepPIV has revealed a marvel of natural engineering in the structure of these complex and intricate filtering webs," said Robison. "And in DeepPIV, human engineering has produced a powerful new tool for investigating these and other mysteries of the deep ocean."

Credit: 
Monterey Bay Aquarium Research Institute

Clinical, immune features of hospitalized pediatric patients with COVID-19

What The Study Did: The immunologic features of mild and moderate COVID-19 in pediatric patients are described and compared in this case series.

Authors: Yun Xiang, Ph.D., and Jianbo Shao, Ph.D., of the Wuhan Children's Hospital in China, are the corresponding authors.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

(doi:10.1001/jamanetworkopen.2020.10895)

Editor's Note: The article includes funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, conflicts of interest and financial disclosures, and funding and support.

Credit: 
JAMA Network

First optical measurements of Milky Way's Fermi Bubbles probe their origin

Image caption: Astronomers used the WHAM telescope to measure huge outflows of gas extending from the Milky Way's center, known as the Fermi Bubbles. They were able to measure the velocity, density and pressure of the gas for the first time, confirming and extending previous measurements made by using a distant quasar as a light source to look through and measure the gas.

Image credit: Dhanesh Krishnarao and NASA

MADISON - Using the Wisconsin H-Alpha Mapper (WHAM) telescope, astronomers have for the first time measured the Fermi Bubbles in the visible light spectrum. The Fermi Bubbles are two enormous outflows of high-energy gas that emanate from the Milky Way, and the finding refines our understanding of the properties of these mysterious blobs.

The research team from the University of Wisconsin-Madison, UW-Whitewater and Embry-Riddle Aeronautical University measured the emission of light from hydrogen and nitrogen in the Fermi Bubbles at the same position as recent ultraviolet absorption measurements made by the Hubble Telescope.

"We combined those two measurements of emission and absorption to estimate the density, pressure and temperature of the ionized gas, and that lets us better understand where this gas is coming from," says Dhanesh Krishnarao, lead author of the new study and an astronomy graduate student at UW-Madison.

The researchers announced their findings June 3 at the 236th meeting of the American Astronomical Society, which, in response to the COVID-19 pandemic, was held virtually for the first time in the society's history, dating back to 1899.

Extending 25,000 light years both above and below the center of the Milky Way, the Fermi Bubbles were discovered in 2010 by the Fermi Gamma-ray Space Telescope. These faint but highly energetic outflows of gas are racing away from the center of the Milky Way at millions of miles per hour. But while the bubbles are inferred to have formed several million years ago, the events that produced them remain a mystery.

Now, with new measurements of the density and pressure of the ionized gas, researchers can test models of the Fermi Bubbles against observations.

"The other significant thing is that we now have the possibility of measuring the density and pressure and the velocity structure in many locations," with the all-sky WHAM telescope, says Bob Benjamin, a professor of astronomy at UW-Whitewater and co-author of the study. "We can do an extensive mapping effort across the Fermi Bubbles above and below the plane of the galaxy to see if the models that people have developed are holding up. Because, unlike the ultraviolet data, we're not limited to just specific lines of sight."

Matt Haffner, professor of physics and astronomy at Embry-Riddle Aeronautical University and a co-author of the report, says the work demonstrates the usefulness of the WHAM telescope, developed at UW-Madison, for telling us more about the workings of the Milky Way. The central region of our home galaxy has long been difficult to study because of gas blocking our view, but WHAM has provided new opportunities to gather the kind of information we have for distant galaxies.

"There are regions of the galaxy we can target with very sensitive instruments like WHAM to get this kind of new information toward the center that previously we are only able to do in the infrared and radio," says Haffner. "We can make comparisons to other galaxies by making the same kind of measurements towards the center of the Milky Way."

Credit: 
University of Wisconsin-Madison

Tracking cancer's immortality factor

Canadian scientists have achieved a first in the study of telomerase, an essential enzyme implicated in aging and cancer.

In today's edition of the prestigious journal Molecular Cell, scientists from Université de Montréal used advanced microscopy techniques to see single molecules of telomerase in living cells.

A flaw in the replication of chromosomes means that they get shorter with each cell division. If nothing is done to correct this error, replication stops and cells go into a state called senescence, a hallmark of aging. Normally, telomerase adds extra DNA to the ends of chromosomes to prevent this problem, but as we age our bodies produce less of the enzyme.

Cancer cells, on the other hand, become immortal by switching telomerase back on, allowing them to divide indefinitely. This re-activation is among the first steps that direct cells to become cancerous, but the process remains poorly understood. A better understanding of it could point the way to therapies that block it.

Now an Université de Montréal team led by biochemistry professor Pascal Chartrand, in collaboration with cell biologist Agnel Sfeir at the Skirball Institute in New York, has succeeded in tagging telomerase with several ultrabright fluorescent molecules - something that's never been done before.

"With this technological breakthrough, we observed that telomerase continuously probes telomeres, but becomes engaged at the ends of chromosomes following a two-step binding mode," said UdeM biochemist Hadrien Laprade, who, with his colleague Emmanuelle Querido, conducted the experimental investigations.

In their study, the scientists also show how mutation of a telomeric regulatory factor results in unrestrained access of telomerase to telomeres, an event that promotes tumorigenesis.

"This new technology now provides sufficient details of how a key actor in cancer works at the molecular level, the first step in developing novels therapeutic strategies to thwart its activity," said Chartrand.

"It could take years before we get there, but this is an important first step."

Credit: 
University of Montreal

Epidemiology, clinical features, disease severity in pediatric patients with COVID-19

What The Study Did: Epidemiology, clinical and laboratory features of 50 children hospitalized with COVID-19 in New York are examined in this case series.

Authors: Philip Zachariah, M.D., M.Sc., of Columbia University Irving Medical Center in New York, is the corresponding author.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

(doi:10.1001/jamapediatrics.2020.2430)

Editor's Note: The article includes conflict of interest disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, conflicts of interest and financial disclosures, and funding and support.

Credit: 
JAMA Network