
Activation of carbon-fluorine bonds via cooperation of a photocatalyst and tin

image: The world's first regioselective C-F bond transformation of perfluorinated compounds.

Image: 
Osaka University

Fluorinated compounds are an important group of compounds that are widely used in pharmaceuticals, agricultural chemicals, functional resins, and organic electronic materials. In particular, perfluorinated compounds with multiple carbon-fluorine bonds are attracting attention because of their high thermal and chemical stability and various excellent properties such as water and oil repellency and chemical resistance.

"C-F bonds are extremely strong; hence, their transformation under mild conditions is difficult, and the selective activation of a specific C-F bond from among multiple C-F bonds in perfluorinated compounds has not been achieved," explains Prof. Makoto Yasuda, corresponding author of the study.

The research team led by Prof. Makoto Yasuda has discovered that a site-selective C-F bond transformation to valuable functional groups proceeds via a photocatalyst and organotin compounds under visible light irradiation (Figure 1). Experimental and theoretical results revealed the importance of the cooperation of a photocatalyst and organotin compounds in this transformation.

In this research, site-selective C-F bond transformation to valuable allylic groups has been accomplished by using a photocatalyst and organotin compounds under safe and common visible light irradiation (Figure 2a). The establishment of the methodology to activate strong carbon-fluorine bonds under such mild conditions is the key to achieving the targeted transformation of perfluorinated compounds at specific sites.

"We have attempted to elucidate this reaction mechanism using both experimental and theoretical chemical methods and have found that the cooperative action of the photocatalyst and organotin compound plays a very important role in the progression of the reaction. In particular, it is noteworthy that the organotin compound plays the dual role of capturing unstable radical intermediates and scavenging fluorine as a Lewis acid, which is a very significant finding for future research on carbon-fluorine bond conversion reactions," explains Prof. Makoto Yasuda. Furthermore, by using this method, they have succeeded in synthesizing fluorine-substituted analogues of a compound that show promise for pharmaceutical applications (Figure 2b).

"Fluorine is an important element in pharmaceuticals, and many small-molecule drugs contain fluorine atoms. It is expected that the field of fluorine-containing drugs will continue to grow. As a result of this research, high value-added perfluorinated compounds, which were impossible to synthesize in the past, can now be synthesized in a simple and short process, which is expected to lead to the expansion of the library of seed compounds for fluorine-containing drug discovery," says Prof. Makoto Yasuda.

Credit: 
Osaka University

A speedy trial: What it takes to be the fastest land predator

image: (Left) Flight types involved in cheetah galloping. (Right) A simple model for analyzing galloping motion. A team of researchers from Japan devised a simple analytical model emulating the vertical hopping and spine bending movement displayed by cheetahs during running and obtained criteria for determining flight types like cheetah galloping.

Image: 
Image courtesy: Tomoya Kamimura from Nagoya Institute of Technology

What makes the cheetah the fastest land mammal? Why aren't other animals, such as horses, as fast? While we haven't yet figured out why, we have some idea about how: cheetahs, as it turns out, use a "galloping" gait at their fastest speeds, involving two different types of "flight": one with the forelimbs and hind limbs gathered beneath the body following a forelimb liftoff, called "gathered flight," and another with the forelimbs and hind limbs stretched out after a hind limb liftoff, called "extended flight" (see Figure 1). Of these, extended flight is what enables cheetahs to accelerate to high speeds, and it depends on the ground reaction forces satisfying specific conditions; in horses, extended flight is absent.

Additionally, cheetahs show appreciable spine movement during flight, alternating between flexing and stretching in gathered and extended flight, respectively, which contributes to their high-speed locomotion. However, little is understood about the dynamics governing these abilities.

"All animal running constitutes a flight phase and a stance phase, with different dynamics governing each phase," explains Dr. Tomoya Kamimura from Nagoya Institute of Technology, Japan, who specializes in intelligent mechanics and locomotion. During the flight phase, all feet are in the air and the center of mass (COM) of the whole body exhibits ballistic motion. Conversely, during the stance phase, the body receives ground reaction forces through the feet. "Due to such complex and hybrid dynamics, observations can only get us so far in unraveling the mechanisms underlying the running dynamics of animals," Dr. Kamimura says.

Consequently, researchers have turned to computer modeling to gain a better dynamic perspective of the animal gait and spine movement during running and have had remarkable success using fairly simple models. However, few studies so far have explored the types of flight and spine motion during galloping (as seen in a cheetah). Against this backdrop, Dr. Kamimura and his colleagues from Japan have now addressed this issue in a recent study published in Scientific Reports, using a simple model emulating vertical and spine movement.

The team, in their study, employed a two-dimensional model comprising two rigid bodies and two massless bars (representing the cheetah's legs), with the bodies connected by a joint and a torsional spring to replicate the bending motion of the spine. Additionally, they assumed an anterior-posterior symmetry, assigning identical dynamical roles to the fore and hind legs.

By solving the simplified equations of motion governing this model, the team obtained six possible periodic solutions: two exhibiting both flight types (like cheetah galloping) and four exhibiting only one flight type (unlike cheetah galloping), classified by criteria on the ground reaction forces derived from the solutions themselves. The researchers then verified these criteria against measured cheetah data, revealing that real-world cheetah galloping indeed satisfies the criterion for two flight types through spine bending (see Figure 2).
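
To make the flight-phase dynamics concrete, here is a minimal sketch, not the authors' model code: it integrates a two-body model whose spine joint is a torsional spring, so the center of mass follows a ballistic arc while the spring drives the spine-bending oscillation. All parameter values and initial conditions are illustrative assumptions.

```python
# Minimal sketch of the flight phase of a two-body, torsional-spring model.
# Parameters and initial conditions below are assumed, not taken from the study.
import numpy as np
from scipy.integrate import solve_ivp

M = 40.0   # total body mass [kg] (assumed)
I = 0.8    # effective inertia of the spine-bending mode [kg m^2] (assumed)
K = 250.0  # torsional spring stiffness at the spine joint [N m/rad] (assumed)
G = 9.81   # gravitational acceleration [m/s^2]

def flight_dynamics(t, s):
    # s = [z, dz, phi, dphi]: COM height and velocity, spine bend angle and rate
    z, dz, phi, dphi = s
    ddz = -G                 # no ground reaction force in flight: ballistic COM
    ddphi = -(K / I) * phi   # the spring restores the spine toward straight
    return [dz, ddz, dphi, ddphi]

# Assumed state just after a hind-limb liftoff (extended flight): body stretched, moving up
sol = solve_ivp(flight_dynamics, (0.0, 0.15), [0.6, 1.2, -0.35, 3.0], dense_output=True)
t = np.linspace(0.0, 0.15, 50)
z, _, phi, _ = sol.sol(t)
print(f"COM height {z.min():.2f}-{z.max():.2f} m, "
      f"spine angle {np.degrees(phi).min():.0f} to {np.degrees(phi).max():.0f} deg")
```

In the full analysis, a stance phase with ground reaction forces would be modeled separately and pieced together with such flight arcs to search for the periodic solutions described above.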

Additionally, the periodic solutions also revealed that horse galloping only involves gathered flight due to restricted spine motion, suggesting that the additional extended flight in cheetahs combined with spine bending allowed them to achieve such great speeds!

"While the mechanism underlying this difference in flight types between animal species still remains unclear, our findings extend the understanding of the dynamic mechanisms underlying high-speed locomotion in cheetahs. Furthermore, they can be applied to the mechanical and control design of legged robots in the future," speculates an optimistic Dr. Kamimura.

Cheetahs inspiring legged robots! Who would've thought?

Credit: 
Nagoya Institute of Technology

Luring bacteria into a trap

Developing vaccines against bacteria is in many cases much more difficult than developing vaccines against viruses. Like virtually all pathogens, bacteria are able to sidestep a vaccine's effectiveness by modifying their genes. For many pathogens, such genetic adaptations under selective pressure from vaccination cause their virulence or fitness to decrease. This lets the pathogens escape the effects of vaccination, but at the price of becoming less transmissible or causing less damage. Some pathogens, however, including many bacteria, are extremely good at changing in ways that allow them to escape the effects of vaccination while remaining highly infectious.

For scientists looking to develop vaccines, this kind of immune evasion has been a fundamental problem for decades. Vaccines developed against bacterial pathogens often become ineffective quickly.

Weaponising immune evasion

Now, however, researchers at ETH Zurich and the University of Basel have exploited precisely this mechanism to come up with an effective vaccine against bacteria. They succeeded in developing a Salmonella vaccine that, instead of trying to kill the intestinal bacteria outright, guides their evolution in the gut toward becoming a weaker pathogen.

"This allowed us to show that immune evasion is not only a major challenge in vaccine development, but that it can in fact be put to good use in both human and veterinary medicine," explains ETH Professor Emma Slack. "We can use it to drive the evolution of pathogenic microorganisms in a certain direction - in our case, a dead end." Slack led the study, which involved many researchers from different groups at ETH Zurich and other institutions, together with ETH Professor Wolf-Dietrich Hardt and Médéric Diard, Professor at the University of Basel's Biozentrum.

Combination vaccine leads to the goal

In their study, the researchers inoculated mice with a series of slightly different vaccines against Salmonella typhimurium, and observed how the Salmonella in the animals' guts modified their genes to escape the vaccines' effects. This let the scientists identify the full spectrum of possible immune evasion mutations in Salmonella typhimurium. Subsequently, the researchers produced a combination vaccine from four Salmonella strains that covered the bacteria's full spectrum of genetic evasion options.

This combination vaccine drove a surprising form of immune evasion: an important sugar coating on the Salmonella surface atrophied. While the affected bacteria were still able to multiply in the animals' guts, they were largely unable to infect body tissues and cause disease. This is because the sugar coating is part of the protective layer that shields the bacteria from the host's defences as well as from viruses that often infect and kill the bacteria. In tests on mice, the scientists were able to show that their new vaccine was more effective at preventing Salmonella infections than existing vaccines approved for use in pigs and chickens.

The scientists now plan to use the same principle to develop vaccines against other microorganisms - for example, against antimicrobial-resistant bacterial strains. In addition, it ought to be possible to use the approach in biotechnology and bring about specific modifications in microorganisms by exerting selective pressure through vaccines.

Credit: 
ETH Zurich

Researchers figured out how the ancestors of modern horses migrated

image: Caballine horse descendants now live in Canada, USA, China, Russia and Kazakhstan.

Image: 
UrFU

An international research team determined that the ancestors of modern domestic horses and the Przewalski's horse crossed between Eurasia (the Russian Urals, Siberia, Chukotka, and eastern China) and North America (Yukon, Alaska, the continental USA) at least twice during the Pleistocene (2.5 million to 11.7 thousand years ago). The findings and a description of the horse genomes are published in the journal Molecular Ecology.

"We found out that the Beringian Land Bridge, or the area known as Beringia, influenced genetic diversity within horses and beyond," said senior researcher at the Ural Branch of the Russian Academy of Sciences and the Ural Federal University (Russia) Dmitry Gimranov. "Owing to the appearance of this land part, the flow of genes among mammoths, bison, and wolves could occur regularly. And if 1-0.8 million years ago horses from North America were not yet widespread in Eurasia, then in the periods of 950-450 and 200-50 thousand years ago, there was a bidirectional spread of genes over long distances."

In other words, horses migrated between continents not only in one direction but also vice versa. The first wave of migration was predominantly from North America to Eurasia. The second migration was dominated by the movement from Eurasia to North America.

Most importantly, the researchers conclude that while most animals used the Beringian Land Bridge only once, horses crossed it several times. This could have significantly affected the genetic structure of horse populations, making them very interesting subjects of research for paleogeneticists.

To determine the area of settlement of the horses, molecular biologists studied horse DNA from both continents. From 262 samples of bones and teeth, they selected 78 with sufficient DNA. The researchers conducted radiocarbon dating and genetic analysis at laboratories in Denmark and the USA. In addition, they examined research data from 112 samples.

"The data shows that horses returned to North America from Eurasia across Beringia at about the same time as bison, brown bears, and lions," says Dmitry Gimranov. "That is, in the last "days" of the late Pleistocene, when the territory was not covered by water and it was like a bridge for the movement of many groups of animals. With the beginning of climate warming (the beginning of the Holocene or 11.7 thousand years ago) and the last disappearance of Beringia at the end of the Pleistocene, the biogeographic significance of this ecological corridor radically changed the history of terrestrial animal species on both continents."

Although the North American horse population eventually became extinct in the early Holocene, horses became widespread on both continents due to domestication and are now found far beyond their historical range.

Credit: 
Ural Federal University

Aortic condition more deadly in women than in men

CHICAGO -- Women who experience acute aortic dissection--a spontaneous and catastrophic tear in one of the body's main arteries--not only are older and have more advanced disease than men when they seek medical care, but they also are more likely to die, according to research published online today in The Annals of Thoracic Surgery.

"Data over the course of the last few decades demonstrate differences in both presentation and outcomes between males and females who have acute aortic dissection, with greater mortality among females," said Thomas G. Gleason, MD, from Brigham and Women's Hospital in Boston, Massachusetts. "This study underscores the need for further interrogatories into these sex differences that may help provoke refined sex-directed strategies to further improve outcomes."

Lauren Huckaby, MD, from the University of Pittsburgh in Pennsylvania, Dr. Gleason, and colleagues queried the Interventional Cohort (IVC) of the International Registry of Acute Aortic Dissection (IRAD) to explore sex-specific differences in presentation, operative approach, and outcomes in patients with type A acute aortic dissection (TAAD). IRAD is the largest consortium of centers worldwide (55 institutions in 12 countries) that collects and analyzes data related to the clinical aspects of aortic dissection; the IVC was initiated to provide more detailed insight into surgical techniques and procedures for aortic dissection.

The Stanford classification divides aortic dissections into two groups (A and B), depending on the location of the tear in the aorta. In type A dissection, the tear begins where the aorta exits the heart (ascending aorta) and frequently extends from the upper to lower sections of the aorta, compromising blood flow throughout the body. Type A aortic dissections have a high likelihood of migrating toward the heart, where they become deadly by rupturing into the pericardial sac that surrounds the heart. As many as 40% of people with aortic dissections die instantly, and the risk of death increases approximately 1% for every hour that the diagnosis and surgical repair are delayed, according to multisociety clinical practice guidelines.

Within the IRAD-IVC database, the researchers identified 2,823 patients who experienced TAAD from 1996 to 2018 and underwent operative repair or a surgical approach as part of a hybrid repair. Approximately 34% of the patients were female.

Sex-Specific Differences in Clinical Presentation

Although less frequently affected by TAAD, female patients were significantly older than male patients (65.4 years versus 58.6 years on average) and had different presenting symptoms, like hypotension (low blood pressure) and greater evidence of malperfusion (loss of blood supply to a vital organ), with a higher prevalence of shock (31.3% versus 22.2%) and coma/altered consciousness (11.5% versus 7.5%).

Benjamin A. Youdelman, MD, from Maimonides Medical Center in Brooklyn, New York, who was not directly involved in this research, explained that these variances in clinical presentation indicate that women may be waiting longer to seek medical care compared to men. This may be due to female patients being "stoic," not considering their symptoms as signs of a significant problem and not prioritizing their care.

"The result is presenting for medical care later, with a greater percentage of women in shock and mental status changes that are often attributed to a stroke, which can further delay the correct diagnosis of an aortic dissection as the cause," said Dr. Youdelman, who works closely with the aortic disease awareness campaign--Think Aorta US. "All of this results in worse early outcomes after aortic dissection for women compared to men. It has been known for a long time that outcomes after aortic dissection are dependent on time to treatment: The faster a person is treated the better."

Variations in Imaging Findings

The researchers also found differences in imaging: Female patients were more likely to experience intramural hematoma, which is blood leaking through the innermost layer of the aortic wall and flowing between the inner and outer walls (19.4% versus 13.2%), and complete (17.2% versus 10.2%) or partial (24.8% versus 19.4%) false lumen thrombosis. In aortic dissection, the force of the diverted blood splits the aortic wall layers, leading to the formation of a false lumen or newly created passageway for blood flow.

"The recognition that women present differently and later in the course suggests that they may be seeking emergency care in a more delayed fashion than men," said Dr. Gleason. "Accordingly, clinicians should respond to these sometimes opaque signs and symptoms by considering aortic dissection early. These findings at presentation should raise suspicion among physicians into the possibility of aortic dissection, giving rise to immediate diagnosis to allow for more efficient surgical management. We should be hypervigilant in women to avoid any further delays in treatment."

Operative Approaches

The researchers noted that operative approaches were distinct between the sexes, with female patients being less likely than male patients to undergo aortic valve replacement, aortic root replacement, and/or complete arch replacement. In fact, the researchers highlighted the less frequent use of complete arch replacement in female patients (15.2% versus 20.6%) because the operation has historically been associated with worse outcomes. They deduced that this more aggressive approach may be avoided in the treatment of older, female patients due to concerns about worse outcomes.

Overall, female patients had increased mortality, although in the last few years, mortality between the sexes was comparable, which suggests recent improvements in care. According to Dr. Gleason, better recognition, earlier diagnosis, faster and more efficient care delivery, new and improved surgical techniques--including brain perfusion and reconstruction procedures--and subsequent longitudinal surveillance have all contributed to more lives being saved.

Dr. Youdelman explained that family medical history also is a critically important factor for identifying patients at risk and saving lives. "Once this information is shared with the health care team, an evaluation by CT scan to achieve definitive diagnosis and rapid treatment would result. Patients should start conversations with their families; it may save a life," he said.

Further study by the researchers is expected in order to better understand what is driving the development of acute aortic syndromes in each sex, as well as more accurately determine dissection risk and inform individualized treatment decisions.

Credit: 
The Society of Thoracic Surgeons

Researchers explore ways to detect 'deep fakes' in geography

image: Researchers combined a satellite image of Tacoma, Washington, with elements of Seattle and Beijing to create a composite image, and then identified differences between the false and true images.

Image: 
Chengbin Deng

Can you trust the map on your smartphone, or the satellite image on your computer screen?

So far, yes, but it may only be a matter of time until the growing problem of "deep fakes" converges with geographical information science (GIS). Researchers such as Associate Professor of Geography Chengbin Deng are doing what they can to get ahead of the problem.

Deng and four colleagues -- Bo Zhao and Yifan Sun at the University of Washington, and Shaozeng Zhang and Chunxue Xu at Oregon State University -- co-authored a recent article in Cartography and Geographic Information Science that explores the problem. In "Deep fake geography? When geospatial data encounter Artificial Intelligence," they examine how false satellite images could potentially be constructed and detected. News of the research has been picked up by media in countries around the world, including China, Japan, Germany and France.

"Honestly, we probably are the first to recognize this potential issue," Deng said.

Geographic information science (GIS) underlies a whole host of applications, from national defense to autonomous cars, a technology that's currently under development. Artificial intelligence has made a positive impact on the discipline through the development of Geospatial Artificial Intelligence (GeoAI), which uses machine learning -- or artificial intelligence (AI) -- to extract and analyze geospatial data. But these same methods could potentially be used to fabricate GPS signals, fake locational information on social media posts, fabricate photographs of geographic environments and more.

In short, the same technology that can change the face of an individual in a photo or video can also be used to make fake images of all types, including maps and satellite images.

"We need to keep all of this in accordance with ethics. But at the same time, we researchers also need to pay attention and find a way to differentiate or identify those fake images," Deng said. "With a lot of data sets, these images can look real to the human eye."

To figure out how to detect an artificially constructed image, the researchers first needed to construct one. To do so, they used a technique common in the creation of deep fakes: Cycle-Consistent Adversarial Networks (CycleGAN), an unsupervised deep learning algorithm that can simulate synthetic media.

Generative Adversarial Networks (GAN) are a type of artificial intelligence, but they require training samples -- input -- of whatever content they are programmed to produce. A black box on a map could, for example, represent any number of different factories or businesses; the various points of information fed into the network help determine the possibilities it can generate.

The researchers altered a satellite image of Tacoma, Washington, interspersing elements of Seattle and Beijing and making it look as real as possible. Researchers are not encouraging anyone to try such a thing themselves -- quite the opposite, in fact.

"It's not about the technique; it's about how human being are using the technology," Deng said. "We want to use technology for the good, not for bad purposes."

After creating the altered composite, they compared 26 different image metrics to determine whether there were statistical differences between the true and false images. Statistical differences were registered on 20 of the 26 indicators, or about 77%.

Some of the differences, for example, included the color of roofs; while roof colors in each of the real images were uniform, they were mottled in the composite. The fake satellite image was also dimmer and less colorful, but had sharper edges. Those differences, however, depended on the inputs they used to create the fake, Deng cautioned.
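
As a rough illustration of this kind of check, the sketch below (assumed, not the authors' pipeline) computes three simple per-image statistics that echo the differences described above: mean brightness, a colorfulness score, and edge sharpness via the variance of the Laplacian. The metric choices and placeholder arrays are illustrative; real tiles would be loaded with a raster or image library.

```python
# Minimal sketch of simple image statistics for comparing a trusted satellite
# tile with a suspected composite; metric choices are illustrative assumptions.
import numpy as np
from scipy.ndimage import laplace

def image_metrics(rgb):
    """rgb: H x W x 3 float array with values in [0, 1]."""
    gray = rgb.mean(axis=2)
    brightness = gray.mean()
    # Opponent-channel colorfulness (Hasler-Susstrunk style)
    rg = rgb[..., 0] - rgb[..., 1]
    yb = 0.5 * (rgb[..., 0] + rgb[..., 1]) - rgb[..., 2]
    colorfulness = np.hypot(rg.std(), yb.std()) + 0.3 * np.hypot(rg.mean(), yb.mean())
    sharpness = laplace(gray).var()   # higher = sharper edges
    return {"brightness": brightness, "colorfulness": colorfulness, "sharpness": sharpness}

# Placeholder tiles; in practice, load real imagery with PIL or rasterio.
rng = np.random.default_rng(0)
real_tile = rng.random((256, 256, 3))
fake_tile = np.clip(real_tile * 0.8, 0.0, 1.0)   # dimmer stand-in for a composite
for name, tile in [("real", real_tile), ("fake", fake_tile)]:
    print(name, {k: round(float(v), 4) for k, v in image_metrics(tile).items()})
```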

This research is just the beginning. In the future, geographers may track different types of neural networks to see how they generate false images and figure out ways to detect them. Ultimately, researchers will need to discover systematic ways to root out deep fakes and verify trustworthy information before false images end up in public view.

"We all want the truth," Deng said.

Credit: 
Binghamton University

Rush researchers develop new measure of brain health

How old is your brain compared to your chronological age? A new measure of brain health developed by researchers at Rush University Medical Center may offer a novel approach to identifying individuals at risk of memory and thinking problems, according to research results published in Alzheimer's & Dementia: The Journal of the Alzheimer's Association on June 1.

Dubbed the "cognitive clock" by the researchers, the tool is a measure of brain health based on cognitive performance. It may be used in the future to predict the likelihood of memory and thinking problems that develop as a person ages.

"Alzheimer's disease, which is of the most common cause of dementia, and other diseases of the brain accumulate slowly over time as people get older. Age is widely recognized as the main risk factor for Alzheimer's disease, but it's a very imperfect predictor, since not everyone develops dementia as they age," said Patricia Boyle, PhD, professor in Rush Medical College's Division of Behavioral Sciences neuropsychologist in the Rush Alzheimer's Disease Center (RADC), and lead author of the study.

"Our new cognitive clock provides a measure of brain health that tells us more about how well a person's brain is functioning than chorological age. In this way, the clock can help us detect who is at highest risk of developing cognitive impairment in the coming years.

"For some people, cognition remains fairly stable as they age," Boyle added. "But, for others, cognition declines slowly over time, and still others show steep declines."

The researchers believed that cognitive performance data, even using a simple cognitive screening test, could be used to distinguish people exhibiting normal cognitive aging from those who are on their way to developing memory and thinking problems that are often coupled with aging.

This thesis led the Rush researchers to look at data they acquired from several long-term studies conducted by the RADC, including the Rush Memory and Aging Project (MAP) which included people living in the community in greater Chicago; the Religious Orders Study (ROS), which included older Catholic clergy from across the United States; and the Chicago Health and Aging Project (CHAP), a biracial population-based study.

"We used long-term cognitive testing data from our participants to develop a profile of cognitive aging, what we call the cognitive clock" Boyle said. "The cognitive clock reflects the general pattern of age-related cognitive decline and allows us to see who is doing better than average and who is doing worse at a given point in time. This helps us identify who might be at high risk of developing memory and thinking problems."

The cognitive clock was first developed using data from 1,057 participants from the MAP and the ROS, who began without cognitive impairment and underwent yearly cognitive assessments for up to 24 years. The cognitive assessment included the Mini-Mental State Exam (MMSE), a widely used test of cognitive function among the elderly that measures orientation, attention, memory, language and visual-spatial skills. In addition to the MMSE, detailed evaluations also included a structured medical history, neurologic examinations, and a set of neurocognitive tests.

The researchers examined how cognitive performance changes over time with advancing age using a novel statistical approach to identify the typical profile of cognitive aging. Using this cognitive clock, researchers can estimate an individual's cognitive age -- their position on the clock -- at any given point in time.
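
As a simplified illustration of the idea, and not the published model, one can fit a population-average decline curve of test score against chronological age and then invert it to read off a "cognitive age" for an observed score. The sigmoid form, the synthetic data, and all numbers below are assumptions.

```python
# Minimal sketch: fit an average MMSE-versus-age curve, then map an observed
# score back to the age at which that score is typical ("cognitive age").
import numpy as np
from scipy.optimize import curve_fit, brentq

def mmse_curve(age, top, drop, midpoint, steepness):
    # Roughly flat in midlife, declining sharply in late life (assumed shape)
    return top - drop / (1.0 + np.exp(-(age - midpoint) / steepness))

# Synthetic (age, MMSE) observations standing in for longitudinal cohort data
rng = np.random.default_rng(1)
ages = rng.uniform(65, 100, 500)
scores = mmse_curve(ages, 29, 20, 88, 4) + rng.normal(0, 1.5, ages.size)
params, _ = curve_fit(mmse_curve, ages, scores, p0=[29, 20, 88, 4])

def cognitive_age(observed_mmse):
    # Invert the fitted curve: the age whose typical score matches the observation
    return brentq(lambda a: mmse_curve(a, *params) - observed_mmse, 60, 110)

print(f"MMSE 27 -> cognitive age ~{cognitive_age(27):.0f}")
print(f"MMSE 20 -> cognitive age ~{cognitive_age(20):.0f}")
```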

Cognitive age is an indicator of brain health. "We found that, on average, cognition remains stable until a cognitive age of around 80 years, then declines moderately until 90, then declines more rapidly until death," Boyle said.

"Further, we found that cognitive age is a much better predictor than chronological age of dementia, mild cognitive impairment and mortality. It also is more strongly associated with other aspects of brain health."

The researchers then applied the clock to an independent sample of 2,592 participants from CHAP to confirm its accuracy for predicting outcomes such as Alzheimer's dementia, mild cognitive impairment, and mortality. Again, they found that cognitive age was a better predictor of these outcomes than chronological age.

"Essentially, what we did is use cognitive data collected over many years to create a single, easy-to-understand metric that may be used to predict health outcomes with good accuracy," Boyle said.

This tool may serve as an aid in aging research moving forward and may offer a new way to identify at-risk individuals.

"It is very difficult to develop a test or biomarker that accurately predicts health outcomes on an individual level. This has been a longstanding challenge in aging research. However, we are hoping that with additional research and validation, we may be able extend the approach applied here to clinical settings," Boyle said.

"Ideally, we could have a patient come into a clinic or hospital and complete a brief cognitive screen that gives us information to plug into a formula to estimate their cognitive age. That will provide important information about their brain health, and from there, we can estimate likelihood of developing Alzheimer's disease or dementia in the coming years. That would be an exciting advance."

Credit: 
Rush University Medical Center

Featured research from NUTRITION 2021 LIVE ONLINE

Press materials are now available for NUTRITION 2021 LIVE ONLINE, a dynamic virtual event showcasing new research findings and timely discussions on food and nutrition. The online meeting will be held June 7-10, 2021.

NUTRITION 2021 LIVE ONLINE is the flagship meeting of the American Society for Nutrition (ASN), the preeminent professional organization for nutrition research scientists and clinicians around the world.

Register for a press pass to access hundreds of live and recorded sessions. Explore the virtual poster presentations and oral presentations to see all the exciting research topics that will be covered at NUTRITION 2021 LIVE ONLINE.

All materials are embargoed until 12 p.m. EDT (UTC-4) June 7.

New Research Shows Trend Toward Unhealthy Eating During Pandemic
Study participants consumed more vegetables and whole grains before COVID-19 began

Study Compares Heart Benefits of Low-Fat and Plant-Centered Diets
New findings suggest that a plant-centered diet could help lower heart disease risk

When the Economy Goes Down, So Does the Quality of Our Diets
Study examining dietary trends during the Great Recession offers lessons for the COVID-19 era

Diets that Promote Inflammation Could Increase Breast Cancer Risk
Analysis of dietary patterns for over 350,000 women suggests eating more anti-inflammatory foods helps lower risk

How a Global Pandemic Changed the Way We Eat and Shop
Studies reveal how COVID-19 influenced food choices, attitudes and prices

Smartphone Use Associated with Unhealthy Eating and Overweight in Teens
More time online linked with less healthy behaviors in large study conducted in South Korea

How Kids Eat: Five New Insights on Daily Habits and Childhood Obesity
Studies reveal how family, social media and COVID-19 influence children's diets and health

New Research Examines the Science Behind Superfoods
Findings suggest that consuming mangos, honey and spices could bring important health benefits

Trying Not to Overeat? How You Eat Matters
Study finds people who eat more tend to take larger bites or eat faster

Cutting Food Waste Alone Won't Solve World's Nutritional Needs
To feed the growing population, researchers urge attention to essential vitamins and minerals

Most Americans Are Not Getting Enough Fiber in Our Diets
Just 7% of adults meet fiber recommendations, raising risk of chronic diseases

The Latest Science on Staying Healthy During Pregnancy
New research examines prenatal supplements, how eating patterns affect sleep, physical activity while pregnant and more

Credit: 
American Society for Nutrition

Seeds of economic health disparities found in subsistence society

PULLMAN, Wash. - No billionaires live among the Tsimane people of Bolivia, although some are a bit better off than others. These subsistence communities on the edge of the Amazon also have fewer chronic health problems linked to the kind of dramatic economic disparity found in industrialized Western societies.

For a study in the journal eLife, a research team led by Aaron Blackwell of Washington State University and Adrian Jaeggi of University of Zurich tracked 13 different health variables across 40 Tsimane communities, analyzing them against each person's wealth and the degree of inequality in each community. While some have theorized that inequality's health impacts are universal, the researchers found only two robustly associated outcomes: higher blood pressure and respiratory disease.
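
A minimal sketch, using synthetic stand-in data rather than the study's dataset or exact model, of the kind of multilevel regression that relates an individual health outcome to personal wealth and community-level inequality, with a random intercept for each of 40 communities. Variable names and effect sizes are illustrative assumptions.

```python
# Sketch of a random-intercept model: outcome ~ wealth + community inequality.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 400
community = rng.integers(0, 40, n)                  # community membership
gini_by_community = rng.uniform(0.1, 0.5, 40)       # assumed inequality levels
community_effect = rng.normal(0.0, 2.0, 40)         # assumed community-level variation
df = pd.DataFrame({
    "community": community,
    "wealth": rng.normal(0.0, 1.0, n),               # standardized household wealth
    "gini": gini_by_community[community],
})
# Synthetic systolic blood pressure with small, assumed effects
df["sbp"] = (118 + 8 * df["gini"] - 1.0 * df["wealth"]
             + community_effect[community] + rng.normal(0, 8, n))

result = smf.mixedlm("sbp ~ wealth + gini", df, groups=df["community"]).fit()
print(result.summary())
```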

"The connection between inequality and health is not as straightforward as what you would see in an industrialized population. We had a lot of mixed results," said Blackwell, a WSU associate professor of anthropology. "These findings suggest that at this scale, inequality is not at the level that causes health problems. Instead maybe it's the extreme inequality in a lot of modern environments that causes health problems since it's unlike any inequality we've ever had in our evolutionary history."

Anthropologists are particularly interested in studying the Tsimane because their traditional lifestyle better resembles the conditions humans lived under for many centuries before the modern era. The Tsimane eat very little processed food--instead they forage, hunt, fish and grow crops. They also get plenty of exercise through their daily activities and have few of the health conditions associated with modern societies, like obesity, diabetes, and heart disease. They do not have easy access to modern health care, however, so they have to contend with parasites and many respiratory conditions ranging from the common cold to pneumonia.

Starting in 2001, a team of health care workers and researchers has visited these communities annually to provide care and collect data as part of the larger Tsimane Health and Life History Project. For this study, Blackwell, Jaeggi and their colleagues were able to analyze data from different points in time, extending to 2015.

Tsimane communities are smaller and more egalitarian than most industrialized societies but some of the communities have more inequality than others. The researchers found that for several health variables, including body mass index, gastrointestinal disorders and depression, there was not a clear connection to disparity.

However, in communities where inequality was high, many individuals had higher blood pressure, whether they were at the top or the bottom of the economic pecking order, compared to their peers in communities that were less stratified. The highest blood pressure was found among poor Tsimane men, no matter where they lived.

"Basically, it's bad to be poor, but it's also bad to be poorer," said Jaeggi. "If you feel like you're worse off than other people, that's stressful. In Western industrialized societies, it's associated with many negative health consequences like high blood pressure, cardiovascular problems and infectious disease as COVID-19 has shown. In the Tsimane communities, we did find some negative effects of living in a more unequal community, but it definitely wasn't on all of the outcomes, so it seems like it's less of a universal pattern."

This study was conducted before the pandemic, so COVID-19 impacts weren't included, but the researchers did find some connection between inequality and increased risk of respiratory diseases like influenza and pneumonia. The authors said it was unclear what the exact mechanisms for that connection might be, as there wasn't a clear connection with stress, unlike in Western societies.

Blackwell also noted that while higher blood pressure was found in more unequal communities, it hadn't developed into worse conditions such as hypertension and cardiovascular disease, which are more prevalent in industrialized societies.

"I think this study tells us that there are some of the seeds of why inequality is bad for us, even in relatively egalitarian societies without huge economic differences," he said. "So perhaps if we want to increase health for everybody, then trying to reduce inequality is one route to do that."

Credit: 
Washington State University

Scientists say active early learning shapes the adult brain

image: A teacher guides a student through a task in this historical photo of the Abecedarian Project, an early education, randomized controlled trial that began at the University of North Carolina and has followed the participants since 1971. Now, Virginia Tech researchers including Craig Ramey, Sharon Landesman Ramey, and Read Montague have revealed discoveries about the lasting effects of that early education on brain structure.

Image: 
Virginia Tech

An enhanced learning environment during the first five years of life shapes the brain in ways that are apparent four decades later, say Virginia Tech and University of Pennsylvania scientists writing in the June edition of the Journal of Cognitive Neuroscience.

The researchers used structural brain imaging to detect the developmental effects of linguistic and cognitive stimulation starting at six weeks of age in infants. The influence of an enriched environment on brain structure had formerly been demonstrated in animal studies, but this is the first experimental study to find a similar result in humans.

"Our research shows a relationship between brain structure and five years of high-quality, educational and social experiences," said Craig Ramey, professor and distinguished research scholar with Fralin Biomedical Research Institute at VTC and principal investigator of the study. "We have demonstrated that in vulnerable children who received stimulating and emotionally supportive learning experiences, statistically significant changes in brain structure appear in middle age."

The results support the idea that early environment influences the brain structure of individuals growing up with multi-risk socioeconomic challenges, said Martha Farah, director of the Center for Neuroscience and Society at Penn and first author of the study.

"This has exciting implications for the basic science of brain development, as well as for theories of social stratification and social policy," Farah said.

The study follows children who have continuously participated in the Abecedarian Project, an early intervention program initiated by Ramey in Chapel Hill, North Carolina, in 1971 to study the effects of educational, social, health, and family support services on high-risk infants.

Both the comparison and treatment groups received extra health care, nutrition, and family support services; however, beginning at six weeks of age, the treatment group also received five years of high quality educational support, five days a week, 50 weeks a year.

When scanned, the Abecedarian study participants were in their late 30s to early 40s, offering the researchers a unique look at how childhood factors affect the adult brain.

"People generally know about the potentially large benefits of early education for children from very low resource circumstances," said co-author Sharon Landesman Ramey, professor and distinguished research scholar at Fralin Biomedical Research Institute. "The new results reveal that biological effects accompany the many behavioral, social, health, and economic benefits reported in the Abecedarian Project. This affirms the idea that positive early life experiences contribute to later positive adjustment through a combination of behavioral, social, and brain pathways."

During follow-up examinations, structural MRI scans of the brains of 47 study participants were conducted at the Fralin Biomedical Research Institute Human Neuroimaging Lab. Of those, 29 individuals had been in the group that received the educational enrichment focused on promoting language, cognition, and interactive learning.

The other 18 individuals received the same robust health, nutritional, and social services supports provided to the educational treatment group, and whatever community childcare or other learning their parents provided. The two groups were well matched on a variety of factors such as maternal education, head circumference at birth and age at scanning.

Analyzing the scans, the researchers looked at brain size as a whole, including the cortex, the brain's outermost layer, as well as five regions selected for their expected connection to the intervention's stimulation of children's language and cognitive development.

Those included the left inferior frontal gyrus and left superior temporal gyrus, which may be relevant to language, and the right inferior frontal gyrus and bilateral anterior cingulate cortex, relevant to cognitive control. A fifth, the bilateral hippocampus, was added because its volume is frequently associated with early life adversity and socioeconomic status.

The researchers determined that those in the early education treatment group had increased size of the whole brain, including the cortex.

Several specific cortical regions also appeared larger, according to study co-authors Read Montague, professor and director of the Human Neuroimaging Lab and Computational Psychiatry Unit at the Fralin Biomedical Research Institute, and Terry Lohrenz, research assistant professor and member of the institute's Human Neuroimaging Laboratory.

The scientists noted the group intervention treatment results for the brain were substantially greater for males than for females. The reasons for this are not known, and were surprising, since both the boys and girls showed generally comparable positive behavioral and educational effects from their early enriched education. The current study cannot adequately explain the sex differences.

"When we launched this project in the 1970s, the field knew more about how to assess behavior than it knew about how to assess brain structure," Craig Ramey said. "Because of advances in neuroimaging technology and through strong interdisciplinary collaborations, we were able to measure structural features of the brain. The prefrontal cortex and areas associated with language were definitely affected; and to our knowledge, this is the first experimental evidence on a link between known early educational experiences and long-term changes in humans."

"We believe that these findings warrant careful consideration and lend further support to the value of ensuring positive learning and social-emotional support for all children - particularly to improve outcomes for children who are vulnerable to inadequate stimulation and care in the early years of life," Craig Ramey said.

Credit: 
Virginia Tech

Converting scar tissue to heart muscle after a heart attack

image: Researchers from the University of Tsukuba showed that cardiac scar tissue (fibroblasts) can be directly reprogrammed to heart muscle cells (cardiomyocytes) in mice. By treating mice post-heart attack with a virus carrying cardiac transcription factors, they found that new cardiomyocytes were formed by fibroblasts converting into cardiomyocytes as opposed to fibroblasts fusing with existing cardiomyocytes. This study demonstrates the potential of direct reprogramming as a strategy for cardiac regeneration after myocardial infarction.

Image: 
University of Tsukuba

Tsukuba, Japan - It is estimated that during a heart attack, one billion cells in the heart are lost. In the wake of the heart attack, the lost tissue is replaced by scar tissue, which can lead to heart failure, arrhythmia and death. In a new study, researchers from the University of Tsukuba have shown how cells in the scar tissue can be converted to heart muscle cells, effectively regenerating the injured heart.

The injured heart of humans and rodents alike does not have the capacity to regenerate after injury. Therefore, the only way for the heart to heal the wound is to build scar tissue in the injured area. A longstanding goal in the field has been to find a way to reprogram fibroblasts, the cells that produce the connective tissue in a scar, into cardiomyocytes, the working heart muscle cells. By doing so, the lost heart muscle cells could be replaced, effectively preventing the heart from going into heart failure, a heart muscle weakness that can lead to death.

Previous studies have shown that cardiomyocytes appear to be formed by directly injecting a harmless virus carrying a set of cardiac transcription factors -- proteins that drive the expression of genes that heart muscle cells need for their development and function -- into the heart of rodents after a heart attack. However, the origin and functional significance of these newly formed heart muscle cells have not yet been unequivocally determined.

"Direct cardiac reprogramming holds great potential for cardiac regeneration and the treatment of myocardial infarction," says lead author of the study Professor Masaki Ieda. "However, when transcription factors are introduced, apparent cardiomyocytes may be formed either by converting fibroblasts to new cardiomyocytes or by fusing fibroblasts with existing cardiomyocytes. The difference is that only the former process, which we call 'direct reprogramming', significantly contributes to regeneration. In this study, our goal was to determine how new cardiomyocytes are formed when cardiac transcription factors are introduced after myocardial infarction."

To achieve their goal, the researchers first generated mice in which all cells emitted red fluorescence. The mice were also modified so that their fibroblasts emitted green fluorescence after treatment with the drug tamoxifen. As a result, when looking at the heart after treatment with tamoxifen, cells that emitted both red and green fluorescence indicated that cell fusion between fibroblasts and cardiomyocytes had taken place. Conversely, green fluorescence alone indicated that direct reprogramming of fibroblasts to cardiomyocytes had occurred.

Equipped with the tools to tackle their research question, the researchers used a mouse model of heart attack and treated the mice with tamoxifen. While there was no direct reprogramming in a control group, the researchers found 1-1.5% of directly reprogrammed cells when a virus carrying cardiac transcription factors was injected into the mice. Both groups exhibited minimal cell fusion. These results suggest that the main route of generating new heart muscle cells by this method is via reprogramming fibroblasts directly to cardiomyocytes.

"These are striking results that show that fibroblasts can be directly reprogrammed to cardiomyocytes. Our findings demonstrate the exciting potential of direct reprograming as a strategy for cardiac regeneration after myocardial infarction," says Professor Ieda.

Credit: 
University of Tsukuba

Corruption in healthcare worsens the health of patients and the quality of nutrition

image: Bribing for any public service, not just for health services, worsens health.

Image: 
UrFU

Bribery in public healthcare does not solve the problem of poor-quality services and even exacerbates it, researchers argue. The same can be said about the well-being of patients and their own assessment of their health. In other words, bribes in healthcare do not secure good-quality services and do not pay off. These conclusions were reached by an international team of researchers including Olga Popova, co-author of the article and an associate professor at the Ural Federal University (UrFU, Russia).

Researchers examined survey data on 41,000 citizens from 28 post-communist countries in Central and Eastern Europe, as well as the former Soviet republics, including Russia (information provided by the World Bank and the European Bank for Reconstruction and Development). The research results were published in the Journal of Comparative Economics.

"In general, the corruption mechanism that undermines people's health is as follows," said Olga Popova. "As everyday practice shows, the poorest strata of the population have the greatest health problems. They are also the ones who most often use medical services and, as a result, are more inclined than others to solve their problems with the help of bribes."

Corruption is also fueled by the fact that healthcare workers can deliberately reduce the quality of service in order to receive informal remuneration, the researchers presume.

"However, the plight in public health together with dismissive and irresponsible attitudes toward the poor population groups lead to the fact that giving a bribe does not solve such problems," said Olga Popova. "The other problems, including the frequent and unjustified absence of doctors, long waiting times for services, disrespectful treatment by the staff, unavailability of medicines, and uncleaned premises are not solved as well. The mismatch between the consequences of bribery and expectations while giving a bribe as well as the risk of disclosure lead to a deterioration in the well-being of patients who bribe."

The researchers emphasize that bribery for any public service, not just healthcare, has the same adverse effect.

Another conclusion is that bribe-givers face the risks of health deterioration more often than other people. This happens because bribes divert the funds from personal and family budgets, disrupt the normal diet, and reduce the possibility of regular consumption of meat, chicken, fish or their vegetarian equivalent.

The lowest level of bribery in healthcare is in Slovenia, Estonia, Poland, Georgia, and the Czech Republic, while the highest is in Azerbaijan, Moldova, and Tajikistan. Russia is approximately in the middle of the ranking and is close to the positions of Lithuania and Bosnia and Herzegovina.

On average across the surveyed countries, 17% of respondents reported having experienced bribery while receiving care in the public health system. About a third reported such an experience while receiving other types of government services. In Moldova and Tajikistan, almost half of patients have experienced corruption; in Ukraine and Azerbaijan, about a third. In terms of the prevalence of corruption, including in the healthcare sector, European post-communist states are second only to the countries of sub-Saharan Africa. The researchers call such levels catastrophic and urge that corruption be fought. Increasing civil servants' salaries, stronger law enforcement, and campaigns to build societal intolerance of corruption may help solve the problem.

Credit: 
Ural Federal University

The role of computer voice in the future of speech-based human-computer interaction

image: As our interactions with voice-based devices and services continue to increase, researchers at Tokyo Institute of Technology and RIKEN, Japan, performed a meta-synthesis of how we perceive and interact with the voice (and the body) of various machines.

Image: 
Katie Seaborn

In the modern day, our interactions with voice-based devices and services continue to increase. In this light, researchers at Tokyo Institute of Technology and RIKEN, Japan, have performed a meta-synthesis to understand how we perceive and interact with the voice (and the body) of various machines. Their findings have generated insights into human preferences, and can be used by engineers and designers to develop future vocal technologies.

As humans, we primarily communicate vocally and aurally. We convey not just linguistic information, but also the complexities of our emotional states and personalities. Aspects of our voice such as tone, rhythm, and pitch are vital to the way we are perceived. In other words, the way we say things matters.

With advances in technology and the introduction of social robots, conversational agents, and voice assistants into our lives, we are expanding our interactions to include computer agents, interfaces, and environments. Research on these technologies can be found across the fields of human-agent interaction (HAI), human-robot interaction (HRI), human-computer interaction (HCI), and human-machine communication (HMC), depending on the kind of technology under study. Many studies have analyzed the impact of computer voice on user perception and interaction. However, these studies are spread across different types of technologies and user groups and focus on different aspects of voice.

In this regard, a group of researchers from Tokyo Institute of Technology (Tokyo Tech), Japan, RIKEN Center for Advanced Intelligence Project (AIP), Japan, and gDial Inc., Canada, have now compiled findings from several studies in these fields with the intention to provide a framework that can guide future design and research on computer voice. As lead researcher Associate Professor Katie Seaborn from Tokyo Tech (Visiting Researcher and former Postdoctoral Researcher at RIKEN AIP) explains, "Voice assistants, smart speakers, vehicles that can speak to us, and social robots are already here. We need to know how best to design these technologies to work with us, live with us, and match our needs and desires. We also need to know how they have influenced our attitudes and behaviors, especially in subtle and unseen ways."

The team's survey considered peer-reviewed journal papers and proceedings-based conference papers where the focus was on the user perception of agent voice. The source materials encompassed a wide variety of agent, interface, and environment types and technologies, with the majority being "bodyless" computer voices, computer agents, and social robots. Most of the user responses documented were from university students and adults. From these papers, the researchers were able to observe and map patterns and draw conclusions regarding the perceptions of agent voice in a variety of interaction contexts.

The results showed that users anthropomorphized the agents that they interacted with and preferred interactions with agents that matched their personality and speaking style. There was a preference for human voices over synthetic ones. The inclusion of vocal fillers such as the use of pauses and terms like "I mean..." and "um" improved the interaction. In general, the survey found that people preferred human-like, happy, empathetic voices with higher pitches. However, these preferences were not static; for instance, user preference for voice gender changed over time from masculine voices to more feminine ones. Based on these findings, the researchers were able to formulate a high-level framework to classify different types of interactions across various computer-based technologies.

The researchers also considered the effect of the body, or morphology and form factor, of the agent, which could take the form of a virtual or physical character, display or interface, or even an object or environment. They found that users tended to perceive agents better when the agents were embodied and when the voice "matched" the body of the agent.

The field of human-computer interaction, particularly that of voice-based interaction, is a burgeoning one that continues to evolve almost daily. As such, the team's survey provides an essential starting point for the study and creation of new and existing technologies in voice-based human-agent interaction (vHAI). "The research agenda that emerged from this work is expected to guide how voice-based agents, interfaces, systems, spaces, and experiences are developed and studied in the years to come," Prof. Seaborn concludes, summing up the importance of their findings.

Credit: 
Tokyo Institute of Technology

Safe distance: How to make sure our outdoor activities don't harm wildlife

image: An infographic illustrates the safe distance between humans and different types of animals.

Image: 
Sarah Markes/WCS

Spending time outdoors is good for a person's body and soul, but how good is it for the wildlife around us?

Outdoor recreation has become a popular activity, especially in the midst of a pandemic, when access to indoor activities might be limited. Long known to have negative behavioural and physiological effects on wildlife, outdoor recreation is one of the biggest threats to protected areas. Human disturbance of animal habitats can lower animals' survival and reproduction rates, and ultimately shrink populations or eradicate them from areas where they would otherwise thrive. Still, park planners and natural resource managers often can't find clear recommendations on how to limit these impacts.

A new scholarly article in the open-access, peer-reviewed journal Nature Conservation from researchers at the Wildlife Conservation Society looked at nearly 40 years of research on recreation impacts on wildlife to try to find the point where recreation starts to impact the wildlife around us. Knowing when and to what extent a species is being disturbed can ultimately allow for more informed and effective management decisions and increase the chances of its successful conservation.

The researchers found that the distance at which humans, vehicles or trails start to disturb shorebirds and songbirds was as short as 100 meters or even less, whereas for hawks and eagles it was greater than 400 meters. For mammals, it varied even more widely, with an impact threshold of 50 meters for medium-sized rodents, while large ungulates - like elk - need to stay 500 to 1,000 meters away from people.

While human disturbance thresholds can vary widely, large buffer zones around human activities and controlled visitation limits should always be considered during the planning and maintenance of parks and protected areas. Based on their findings, the authors recommend assuming that human activities impact wildlife out to at least 250 metres. Further, they call for future research to explicitly identify the points at which recreation begins or ceases to impact wildlife.
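
For illustration only, a park planner could encode the thresholds reported above as a simple lookup, with the authors' general recommendation of at least 250 metres used as a floor. The grouping names and the function itself are hypothetical, not part of the published study.

```python
# Hypothetical planning aid: taxon-specific disturbance distances from the
# article, with the 250 m general recommendation applied as a minimum buffer.
MIN_BUFFER_M = {
    "songbird": 100,         # shorebirds and songbirds: about 100 m or less
    "shorebird": 100,
    "raptor": 400,           # hawks and eagles: greater than 400 m
    "medium_rodent": 50,
    "large_ungulate": 1000,  # elk-like ungulates: 500-1,000 m (upper bound used)
}

def recommended_buffer_m(group: str, general_floor: int = 250) -> int:
    """Return a conservative buffer distance in metres for planning purposes."""
    return max(MIN_BUFFER_M.get(group, general_floor), general_floor)

for g in ("songbird", "raptor", "large_ungulate", "unknown_species"):
    print(g, recommended_buffer_m(g), "m")
```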

Credit: 
Pensoft Publishers

Sick bats also employ 'social distancing' which prevents the outbreak of epidemics

image: Images of bats.

Image: 
Yuval Barkai.

The Covid-19 pandemic has introduced us to expressions like 'lockdown', 'isolation' and 'social distancing', which became part of social conduct all over the world. Now it appears that bats also maintain social distancing, which might help prevent the spread of contagious diseases in their colonies. In a new study published in Annals of the New York Academy of Sciences, researchers from Tel Aviv University demonstrate that sick bats, just like ill humans, prefer to stay away from their communities, probably as a means of recovery, and possibly also as a measure for protecting others. The study was conducted by postdoctoral researcher Dr. Kelsey Moreno and PhD candidate Maya Weinberg at the laboratory of Prof. Yossi Yovel, Head of the Sagol School of Neuroscience and a researcher at the School of Zoology at the George S. Wise Faculty of Life Sciences.

The study monitored two colonies of Egyptian fruit bats - one living in an enclosure and the other in its natural environment. To examine the behavior of bats when they get sick, the researchers injected several bats in each group with a bacteria-like protein, thereby stimulating their immune response without generating any real danger to the bats. Tests revealed symptoms such as a high fever, fatigue and weight loss, and the 'ill' bats' behavior was tracked with GPS.

The researchers discovered that the 'sick' bats chose to keep away from the colony. In the first group, they left the bat cluster of their own accord and kept their distance. In the second group, the 'ill' bats likewise moved away from the other bats in the colony; they also remained in the cave and did not go out in search of food for two successive nights.

Research student Maya Weinberg explains that this social distancing behavior is probably caused by the need to conserve energy - by avoiding the energy-consuming social interactions in the group. Weinberg emphasizes, however, that this behavior can also protect the group and prevent the pathogen from spreading within the colony. Moreover, the fact that sick bats don't leave the cave, prevents the disease from spreading to other colonies. "The bats' choice to stay away from the group is highly unusual for these animals. Normally these bats are extremely social creatures, living in caves in very crowded conditions," says Weinberg. "In fact, the 'sick' bats' behavior is very reminiscent of our own during recovery from an illness. Just as we prefer to stay home quietly under the blanket when we are ill, sick bats, living in very crowded caves, also seek solitude and peace as they recuperate."

Prof. Yovel adds that the study's findings suggest that the likelihood of bats passing pathogens to humans under regular conditions is very low, because sick bats tend to isolate themselves and stay in the cave. "We observed that during illness bats choose to stay away from the colony and don't leave the cave, and thus avoid mixing with other bats. This suggests that in order to encounter a sick bat, people must actually invade the bats' natural environment or eliminate their habitats. In other words, if we protect them, they will also protect us."

Credit: 
Tel-Aviv University