Effects of nutrient pollution in marine ecosystems are compounded by human activity

Excessive nutrients, such as nitrogen and phosphorus, have devastating effects on coastal marine ecosystems by causing algal blooms that deplete oxygen in the water, killing marine life. Such nutrients can enter the sea in wastewater or run-off from agricultural land. However, a new review in open-access journal Frontiers in Marine Science highlights that problems caused by other human activities, such as climate change, can exacerbate these detrimental effects on marine ecosystems. The review suggests that an integrated approach considering land use, ecology and input from scientists, politicians and the public is required to defeat this terrible synergy.

By enriching the land, humans are impoverishing the sea. Nutrients, in the form of synthetic fertilizers added to agricultural land, can leach into rivers and ultimately the sea. While not inherently toxic, these nutrients cause an explosion of plant and algal life that disturbs delicate marine ecosystems. This process is called cultural eutrophication, and it can have serious consequences for marine life.

"The threats posed by eutrophication include reduced water clarity, oxygen depletion, and toxic algal events that result in critical habitat losses such as coral reefs, seagrass meadows, and mangrove forests," said Professor Thomas Malone of the University of Maryland Center for Environmental Science. "Other serious consequences include mass mortalities of marine animals, loss of biodiversity, and threats to human health."

This new review by Malone and Professor Alice Newton of the University of Algarve details so-called "dead zones" within the ocean, where eutrophication has caused significant oxygen depletion resulting in mass mortalities of marine animals. Conservative estimates indicate that there are now over 700 such areas. These harmful effects can be seen amongst the natural treasures of the ocean, such as the Great Barrier Reef, where eutrophication is partly responsible for a 70% reduction in hard corals over the past century.

So, where do these nutrients come from? As well as agricultural fertilizers, other sources include human sewage and farm animal manure. Burning fossil fuels, such as in car engines, also results in an abundance of nitrogen-containing compounds that can enter the sea.

While eutrophication is bad in isolation, the review reveals that other human activities, such as overfishing and burning fossil fuels, can compound its effects even further. For instance, climate change increases river runoff and, as a consequence, the level of nutrient pollution. Overfishing affects animals that typically eat algae, allowing algal blooms to grow larger.

Ocean acidification is also an issue. The ocean absorbs a significant quantity of the CO2 released from combustion, increasing seawater acidity. At the same time, low-oxygen environments exacerbate ocean acidification by releasing additional CO2 into the ocean. This can have serious adverse effects on corals, shellfish and plankton.

Given that the problem is multifaceted, Malone suggests that the solution is too. An important aspect of this involves better monitoring of nutrient pollution on a global scale as well as reducing nutrient inputs into coastal marine ecosystems. Steps to reduce nutrient input must be guided by a variety of people, including scientists, policy makers, and the public, to give the strategies the best chance of success.

The review suggests a wide variety of practical steps that land-owners could take to reduce nutrient pollution, including restoring habitats, such as mangroves and marshes, that can remove nutrients from water before they reach the sea. Ecosystem management plans that consider both watershed areas on land and the seawater they drain into are also necessary to tackle the problem.

Donald Boesch, Professor of marine science at the University of Maryland and former President of the University of Maryland Center for Environmental Science, who was not involved in the study, said: "This valuable review shows the commonalities in the timing, causes and consequences of nutrient pollution of coastal waters in many otherwise very different regions of the world. Our progress in reversing this important cause of ecosystem degradation will be limited unless we can reduce agricultural pollution through more effective regulations and incentives."

Credit: 
Frontiers

New diagnostic criteria shine light on early dementia mimics

image: UK academics and clinicians have collaborated to develop a diagnostic definition of the widely recognised but poorly understood condition, Functional Cognitive Disorder (FCD).

Image: 
University of Bristol

Experts estimate up to one third of people attending specialist memory clinics could have a condition that is commonly mistaken for early dementia.

In a paper published in the journal Brain, UK academics and clinicians have collaborated to develop a diagnostic definition of the widely recognised but poorly understood condition, Functional Cognitive Disorder (FCD).

Dr Harriet Ball from the University of Bristol, first author of the paper, said providing diagnostic criteria was an incredibly important step toward improving diagnosis, management and research into FCD and other cognitive disorders.

"Dysfunction of day-to-day thinking processes is a feature of FCD but it is often misdiagnosed as early dementia. We estimate up to a third of people attending specialist memory clinics have FCD. While FCD involves impairment of thinking processes, unlike dementia, it is not expected to progress. From a patient's point of view, that is a very different prognosis and one that requires different management.

"As clinicians, our aim is to unravel the causes of early memory symptoms, and importantly, identify those that can improve over time rather than deteriorate towards dementia. Having clear diagnostic criteria for FCD will enable us to better characterise the condition and better explain it - and its prognosis - to patients and their families," said Dr Ball.

The position paper, "Functional cognitive disorder: dementia's blind spot", is the collaborative effort of 25 of the UK's leading experts on the topic and represents the first agreed clinical definition of FCD.

This definition will allow a new phase of research into FCD as researchers can now consistently identify patients for studies. The next stage for this work, which has already begun, involves assessing clinical markers and understanding the epidemiology, all of which will help to build treatment studies.

"While some people do spontaneously recover, this is often related to how long it has gone on for and how entrenched it has become. Treatment up to now has focused on management of aspects that we know can help in general, for example cutting down medications that might be making things worse, working on better sleep patterns; but in future we'd like to test specific cognitive therapies which could prove much more successful," said Dr Ball.

Dr Ball said the definition also had important benefits in terms of strengthening research into dementia.

"With a clear operational definition, we'll be better at picking the right people for trials against, for example, Alzheimer's proteins - because if lots of people with FCD are in those trials, it is much harder to show any treatment effect against Alzheimer's."

Credit: 
University of Bristol

Independent search engines respect your privacy but give more visibility to misinformation

Anti-vaccine websites, which could play a key role in promoting public hesitancy about a potential COVID vaccine, are far more likely to be found via independent search engines than through an internet giant like Google.

The study, led by researchers at Brighton and Sussex Medical School (BSMS), showed that independent search engines returned between 3 and 16 anti-vaccine websites in the first 30 results, while Google.com returned none.

Lead author Professor Pietro Ghezzi, RM Phillips Chair in Experimental Medicine at BSMS, said the study raises concerns because those exposed to these websites are often people who have turned to alternative search engines precisely because they are worried about how the internet giants use their personal data.

"Vaccine hesitancy was defined by the World Health Organisation as one of the top ten threats to global health last year," said Professor Ghezzi. "Since then we've had the COVID crisis, a recent report showed that (50%) of people in the UK would not take a Coronavirus vaccine if it was available. This is frightening - and this study perhaps gives some indication as to why this is happening."

"There are two main messages here. One is to the Internet giants, who are becoming more responsible in terms of avoiding misinformation, but need to build trust with users regarding privacy because of their use of personal data; and the other is to the alternative search engines, who could be responsible for spreading misinformation on vaccines, unless they become better in their role as information gatekeepers. This suggests that quality of the information provided, not just privacy, should be regulated."

The study, involving researchers in the UK, Belgium, Italy and Spain, analysed 840 websites returned by 28 search engines in four languages and compared the rankings of anti-vaccine websites.

The study also found that some localised versions of Google (English-UK, Italian and Spanish) return more anti-vaccine websites than the main, US English, Google.com.

Credit: 
University of Sussex

Analyzing the factors that enable fish to reproduce in the Gulf of Cadiz

Researchers from the Marine Biology Laboratory of the University of Seville, led by Professor José Carlos García Gómez, have studied the factors involved in fish reproduction and breeding in the Gulf of Cadiz. Their analysis focused on estuaries which, due to their specific conditions, created by the mixing of river and sea water, are especially favourable for the reproductive function of the species in the area.

The Guadalquivir estuary showed the highest density of early-stage fish and also of macrozooplankton (fish prey). A higher concentration of organic matter (the preferential food of macrozooplankton in the Guadalquivir), supplied by a greater flow of fresh water and correlated with total suspended solids, inorganic matter and turbidity, was the most distinctive characteristic of the Guadalquivir. This, along with the salinity gradient, may explain its high densities of macrozooplankton and early-stage fish. In contrast, the recurrent blooms of jellyfish (Blackfordia virginata) and ctenophores (Bolinopsis sp.) observed in the Guadiana and Bay of Cadiz estuaries affected their breeding capacities. The most abundant fish species in the Gulf of Cadiz, particularly in the interior of the Guadalquivir estuary, was the anchovy Engraulis encrasicolus.

The results show that the estuaries of the rivers Guadalquivir and Guadiana, which have larger river basins, receive higher freshwater discharges, generating different physical-chemical characteristics in their waters, as well as an extensive transition zone with a longitudinal salinity gradient (approximately 40-50 km). In contrast, the Tinto-Odiel and Bay of Cadiz estuaries, with their small river basins, have a very low flow of fresh water, which explains why their internal characteristics are similar to those of the adjacent marine waters.

Turbidity is another factor to consider in this regard, as it is highly beneficial for fish larvae. They use the poor visibility offered by these waters to hide from predators, while turbidity has little influence on their own feeding, since larvae locate their prey at very short distances. In short, fish that spawn in estuaries or in areas very close to their mouths place their larvae in a perfect scenario in which to grow, fulfilling the vital maxim to which all wild animals aspire: "to eat and not be eaten".

For the study, samples were taken from the estuaries during the hot and dry seasons of 2016, 2017 and 2018. Some temporal differences were observed in the larval and juvenile fish communities within the estuaries themselves, as these environments fluctuate significantly. By collecting samples for 3 consecutive years, researchers were able to achieve greater certainty of the differences between the different areas studied. In fact, comparing the two most productive estuaries in the Gulf of Cadiz, the Guadiana and Guadalquivir, the average difference in the abundance of fish larvae and juveniles found was up to 5 times greater in the latter, rising to up to 10 times in some years.

Furthermore, in both the Guadiana and the Bay of Cadiz, jellyfish and ctenophore population explosions were observed only in 2017 and 2018, not in 2016. The causes of this temporal variation are still unknown, and the team is studying what other factors may have a bearing on these differences.

The study was carried out in collaboration with the General Research Services (SGI) of the University of Seville, and was fully financed by the Seville Port Authority as part of its policy of generating knowledge on the Guadalquivir estuary and surrounding areas to promote better integrated management, sustainable exploitation and the conservation of this natural environment, as set out in the Strategic Plan of the Port of Seville 2025.

Credit: 
University of Seville

Academia from home

As the uncertainty around reopening college and university campuses this fall continues, those who work, study, teach and conduct research are navigating the uncertain terrain of the "new normal." They are balancing physical distancing and other COVID-19 prevention practices with productivity, creating home workspaces and mastering communications and teamwork across time and space.

Turns out, there's a group of people for whom these challenges are not new. Postdoctoral researchers -- people in the critical phase between graduate school and permanent academic positions -- are part of a small but growing cohort that has been turning to remote work to meet the challenges of their young careers. Often called upon to relocate multiple times for short-term, full-time appointments, postdocs and their families have to endure heightened financial costs, sacrificed career opportunities and separations from their support communities.

But with the right practices and perspectives, remote work can level the playing field, especially for those in underrepresented groups, according to Kurt Ingeman, a postdoctoral researcher in UC Santa Barbara's Department of Ecology, Evolution and Marine Biology. And, like it or not, with COVID-19 factoring into virtually every decision we now make, he noted, it's an idea whose time has come.

"We started this project in the pre-pandemic times but it seems more relevant than ever as academics are forced to embrace work-from-home," said Ingeman, who makes the case for embracing remote postdoctoral work in the journal PLOS Computational Biology. Family and financial considerations drove his own decision to design a remote position; many early-career researchers face the same concerns, he said.

It takes a shift in perspective to overcome resistance to having remote research teammates. Principal investigators often don't perceive the remote postdoc as a fully functional member of the lab and worry about the loss of spontaneous informal interactions that can generate new ideas, Ingeman said.

"These are totally valid concerns," he said. "We suggest (in the paper) ways to use digital tools to fully integrate remote postdocs into lab activities, like mentoring graduate students or coding and writing together. These same spaces are valuable for virtual coffee chats and other informal interactions."

Communication enabled by technology is in fact foundational to a good remote postdoc experience, according to Ingeman and co-authors, who advocate for investment in and use of reliable videoconferencing tools that can help create rapport between team members, and the creation of digital spaces to share documents and files. Transparency and early expectation setting are keys to a good start. In situations where proximity would have naturally led to interaction, the researchers recommend having a robust communications plan. Additionally, postdocs would benefit from establishing academic connections within their local community to combat isolation.

There are benefits to reap from such arrangements and practices, the researchers continued. For the postdoc, it could mean less stress and hardship, and more focus on work. For the team, it could mean a wider network overall.

"For me, remote postdoc work was a real bridge to becoming an independent researcher," said Ingeman, who "struggled with isolation early on," but has since gained a local academic community, resulting in productive new research collaborations.

Additionally, opening the postdoc pool to remote researchers can result in a more diverse set of applicants.

"The burdens of relocating for a temporary postdoc position often fall hardest on members of underrepresented groups," Ingeman added. "So the idea of supporting remote work really stand out to me as an equity issue."

Of course, not all postdoc positions can be remote; lab and field work still require a presence. But as social distancing protocols and pandemic safety measures are forcing research teams to minimize in-person contact or undergo quarantine at a moment's notice, developing remote research skills may well become a valuable part of any early-career researcher's toolkit.

"Even labs and research groups that are returning to campus in a limited way may face periodic campus closures, so it makes sense to integrate remote tools now," Ingeman said. "Our suggestions for remote postdocs are absolutely applicable to other lab members working from home during closures."

Credit: 
University of California - Santa Barbara

WTF, when will scientists learn to use fewer acronyms?

Have you heard of DNA? It stands for Do Not Abbreviate apparently. Jokes aside, it's the most widely used acronym in scientific literature in the past 70 years, appearing more than 2.4 million times.

The short form of deoxyribonucleic acid is widely understood, but there are millions more acronyms (like WTF: water-soluble thiourea-formaldehyde) that are making science less useful and more complex for society, according to a new paper released by Australian researchers.

Queensland University of Technology (QUT) Professor Adrian Barnett and Dr Zoe Doubleday from the University of South Australia (UniSA) have analysed 24 million scientific article titles and 18 million abstracts between 1950 and 2019, looking for trends in acronym use.
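For readers curious how such a count works in practice, here is a minimal sketch, assuming acronyms are identified as runs of two or more capital letters; the paper's actual extraction rules may differ, and the sample titles below are invented for illustration.

import re
from collections import Counter

# Toy stand-in for the 24 million titles analysed in the study.
titles = [
    "DNA methylation and CI estimation in US cohorts",
    "HR variability under US imaging: a DNA-based study",
]

# Assumption: an acronym is any run of two or more capital letters/digits.
acronym_pattern = re.compile(r"\b[A-Z][A-Z0-9]+\b")

counts = Counter()
for title in titles:
    counts.update(acronym_pattern.findall(title))

print(counts.most_common(3))  # e.g. [('DNA', 2), ('US', 2), ('CI', 1)]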

Despite repeated calls for scientists to reduce their use of acronyms and jargon in journal papers, the advice has been largely ignored, according to their findings, published in eLife.

Many of the 1.1 million unique acronyms identified in the past 70 years are causing confusion, ambiguity and misunderstanding, making science less accessible, the researchers say.

"For example, the acronym UA has 18 different meanings in medicine, and six of the 20 most widely used acronyms have multiple common meanings in health and medical literature," according to Dr Zoe Doubleday.

"When I look at the top 20 scientific acronyms of all time, it shocks me that I recognise only about half. We have a real problem here."

DNA is universally recognised, but the second most popular acronym CI (confidence interval) could easily be confused for chief investigator, cubic inch or common interface. Likewise, US (United States/ultrasound/urinary system) and HR (heart rate/hazard ratio) often trip people up.

Prof Barnett says the use of acronyms in titles has more than trebled since 1950 and increased 10-fold in scientific abstracts in the same period.

"Strikingly, out of the 1.1 million acronyms analysed, we found that two per cent (about 2,000) were used more than 10,000 times," he says. "Even when the 100 most popular acronyms were removed, there was still a clear increase in acronym use over time."

Entrenched writing styles in science are difficult to shift and excessive acronym use points to a broader communication problem in science, Dr Doubleday says, but journals could help stem the trend by restricting the number of acronyms used in a paper.

"In the future it might be possible - software permitting - for journals to offer two versions of the same paper, one with acronyms and one without, so that the reader can select the version they prefer."

Credit: 
University of South Australia

Selective conversion of reactive lithium compounds made possible

image: The Bochum research team has developed a new catalyst that could be interesting for industrial applications.

Image: 
RUB, Marquard

Researchers at Ruhr-Universität Bochum have developed a new catalyst for reactions that produce pharmaceuticals or chemicals used in agriculture. It creates carbon-carbon bonds between what are known as organolithium compounds without generating any unwanted by-products. The team led by Professor Viktoria Däschlein-Gessner, Inorganic Chemistry II Research Group, describes the results in the journal Angewandte Chemie, published online on 29 July 2020.

Indispensable for many applications

Organolithium compounds are reagents with a lithium-carbon bond, which are among the most reactive compounds in synthetic chemistry. "Due to their special properties, they are indispensable in many applications, even on an industrial scale," says Viktoria Däschlein-Gessner, member of the Cluster of Excellence Ruhr Explores Solvation, Resolv for short. "However, high reactivity often also leads to unwanted side reactions. As a result, organolithium compounds have so far only been considered to a limited extent, or even not at all, for some applications."

The research group led by Viktoria Däschlein-Gessner was able to overcome such limitations with the help of a highly efficient catalyst. The new phosphine-palladium catalyst selectively couples two carbon atoms - both with different organolithium compounds and many so-called aryl halides. The decisive factor was that it is sufficiently active, even at room temperature.

Market launch planned

No additives are needed for the new synthesis process, and it can be used widely. This means that intermediate steps during synthesis can be avoided, producing less metal salt waste. The catalyst guarantees a high degree of selectivity, even when product quantities of several grams are produced. To allow for use on an industrial scale, the next step is to test it at even larger volumes.

In cooperation with industry, the researchers in Bochum intend to launch the developed catalysts on the market soon. "Their particular activity is not only advantageous in the described reactions, but also offers improvements for numerous other transformations in almost all areas of fine chemical synthesis," says Däschlein-Gessner. In addition to the production of pharmaceuticals and chemicals for agriculture, these include fragrances and materials for organic light-emitting diodes.

Credit: 
Ruhr-University Bochum

Yale quantum researchers create an error-correcting cat

New Haven, Conn. -- Yale physicists have developed an error-correcting cat -- a new device that combines the Schrödinger's cat concept of superposition (a physical system existing in two states at once) with the ability to fix some of the trickiest errors in a quantum computation.

It is Yale's latest breakthrough in the effort to master and manipulate the physics necessary for a useful quantum computer: correcting the stream of errors that crop up among fragile bits of quantum information, called qubits, while performing a task.

A new study reporting on the discovery appears in the journal Nature. The senior author is Michel Devoret, Yale's F.W. Beinecke Professor of Applied Physics and Physics. The study's co-first authors are Alexander Grimm, a former postdoctoral associate in Devoret's lab who is now a tenure-track scientist at the Paul Scherrer Institute in Switzerland, and Nicholas Frattini, a graduate student in Devoret's lab.

Quantum computers have the potential to transform an array of industries, from pharmaceuticals to financial services, by enabling calculations that are orders of magnitude faster than today's supercomputers.

Yale -- led by Devoret, Robert Schoelkopf, and Steven Girvin -- continues to build upon two decades of groundbreaking quantum research. Yale's approach to building a quantum computer is called "circuit QED" and employs particles of microwave light (photons) in a superconducting microwave resonator.

In a traditional computer, information is encoded as either 0 or 1. The only errors that crop up during calculations are "bit-flips," when a bit of information accidentally flips from 0 to 1 or vice versa. The way to correct it is by building in redundancy: using three "physical" bits of information to ensure one "effective" -- or accurate -- bit.
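As a concrete illustration of that classical redundancy scheme, here is a minimal sketch of a three-bit repetition code with majority-vote decoding; the encode/decode helpers are hypothetical teaching code, not anything from the Yale study.

from collections import Counter

def encode(bit: int) -> list[int]:
    # Redundancy: one logical bit becomes three identical physical bits.
    return [bit] * 3

def decode(bits: list[int]) -> int:
    # Majority vote recovers the logical bit despite any single bit-flip.
    return Counter(bits).most_common(1)[0][0]

codeword = encode(1)   # [1, 1, 1]
codeword[0] ^= 1       # a bit-flip error corrupts one physical bit: [0, 1, 1]
assert decode(codeword) == 1  # the logical bit survives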

In contrast, quantum information bits -- qubits -- are subject to both bit-flips and "phase-flips," in which a qubit randomly flips between quantum superpositions (when two opposite states exist simultaneously).

Until now, quantum researchers have tried to fix errors by adding greater redundancy, requiring an abundance of physical qubits for each effective qubit.

Enter the cat qubit -- named for Schrödinger's cat, the famous paradox used to illustrate the concept of superposition.

The idea is that a cat is placed in a sealed box with a radioactive source and a poison that will be triggered if an atom of the radioactive substance decays. The superposition theory of quantum physics suggests that until someone opens the box, the cat is both alive and dead, a superposition of states. Opening the box to observe the cat causes it to abruptly change its quantum state randomly, forcing it to be either alive or dead.

"Our work flows from a new idea. Why not use a clever way to encode information in a single physical system so that one type of error is directly suppressed?" Devoret asked.

Unlike the multiple physical qubits needed to maintain one effective qubit, a single cat qubit can prevent phase flips all by itself. The cat qubit encodes an effective qubit into superpositions of two states within a single electronic circuit -- in this case a superconducting microwave resonator whose oscillations correspond to the two states of the cat qubit.
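In the notation commonly used for such devices (a schematic sketch; the paper's full construction has more detail), the two cat-qubit basis states are superpositions of two coherent oscillations of the resonator, \( |\alpha\rangle \) and \( |{-\alpha}\rangle \), of equal amplitude and opposite phase:

\[ |\mathcal{C}_\alpha^{\pm}\rangle \propto |\alpha\rangle \pm |{-\alpha}\rangle \]

Encoding the effective qubit in these states, rather than in a single two-level system, is what lets the hardware itself suppress one type of error, as described above.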

"We achieve all of this by applying microwave frequency signals to a device that is not significantly more complicated than a traditional superconducting qubit," Grimm said.

The researchers said they are able to change their cat qubit from any one of its superposition states to any other superposition state, on command. In addition, the researchers developed a new way of reading out -- or identifying -- the information encoded into the qubit.

"This makes the system we have developed a versatile new element that will hopefully find its use in many aspects of quantum computation with superconducting circuits," Devoret said.

Credit: 
Yale University

Plant-based meats improve some cardiovascular risk factors compared with red meat

Swapping out red meat for certain plant-based meat alternatives can improve some cardiovascular risk factors, according to a new study by researchers at Stanford Medicine.

The small study was funded by an unrestricted gift from Beyond Meat, which makes plant-based meat alternatives, and used products from the company in comparing the health effects of meat with plant-based alternatives. Beyond Meat was not involved in designing or conducting the study and did not participate in data analysis.

It may seem obvious that a patty made of plants is a healthier option than a hamburger. But many of the new meat alternatives, such as Beyond Meat, have relatively high levels of saturated fat and added sodium and are considered highly processed foods, meaning they are made with food isolates and extracts as opposed to whole beans or chopped mushrooms. All of these factors have been shown to contribute to cardiovascular disease risk, said Christopher Gardner, PhD, professor of medicine at the Stanford Prevention Research Center.

"There's been this sort of backlash against these new meat alternatives," Gardner said. "The question is, if you're adding sodium and coconut oil, which is high in saturated fat, and using processed ingredients, is the product still actually healthy?" To find out, Gardner and his team gathered a group of more than 30 individuals and assigned them to two different diets, each one for eight weeks. One diet called for at least two daily servings of meat -- the options available were primarily red meat -- and one called for at least two daily servings of plant-based meat.

In particular, the researchers measured the levels of a molecule, trimethylamine N-oxide, or TMAO, in the body; TMAO has been linked to cardiovascular disease risk. They found that TMAO levels were lower when study participants were eating plant-based meat.

A paper describing the results of the study will be published Aug. 11 in the American Journal of Clinical Nutrition. Gardner is the senior author of the paper. Postdoctoral scholar Anthony Crimarco, PhD, is the lead author.

Comparing burgers

Gardner, a longtime vegetarian, is a staunch proponent of eating whole foods, with a particular emphasis on vegetables. As nearly all plant-based meats are fairly high in saturated fats and classified as highly processed foods -- Beyond Meat included -- Gardner wanted to study how they affect the body compared with red meat.

He and his team conducted a study that enrolled 36 participants for 16 weeks of dietary experimentation. Gardner designed the research as a crossover study, meaning participants acted as their own controls. For eight weeks, half of the participants ate the plant-based diet, while the other half ate the meat-based diet consisting of primarily red meat, although some participants ate a small amount of chicken. Then they switched. Regardless of which diet participants were on, both groups had on average two servings of meat or plant-based alternatives per day, carefully tracking their meals in journals and working with members of Gardner's team to record their eating habits.

The team took precautions to eliminate bias throughout the study, including working with a third party at Stanford, the Quantitative Sciences Unit, to analyze the data once all participants had finished their 16-week dietary interventions. "The QSU helped us draw up a statistical analysis plan, which we published online before the study was completed," Gardner said. "That way our plan was public, and we were accountable for the specific primary and secondary outcomes that we had initially said we wanted to go after -- namely, the participants' levels of TMAO, blood cholesterol, blood pressure and weight."

An emerging risk factor

The main outcome the team was interested in tracking, Gardner said, was the level of TMAO.

Gardner calls TMAO "an emerging risk factor," meaning there seems to be a connection between higher levels of TMAO and an increased risk of cardiovascular disease, but the connection has yet to be definitively proved. Two precursors to TMAO, carnitine and choline, are found in red meat, so it's possible that individuals who regularly eat beef, pork or lamb for dinner will simply have higher levels of TMAO.

"At this point we cannot be sure that TMAO is a causal risk factor or just an association," Gardner said. However, he sees a reason to pay attention to TMAO readouts. In the past few years, studies have shown that high levels of TMAO are consistent with increased inflammation and blood clotting, among other health concerns. Gardner points to another study in which researchers found that people with elevated TMAO had a 60% higher risk for adverse cardiovascular events, such as a heart attack.

In Gardner's study, the researchers observed that participants who ate the red-meat diet during the first eight-week phase had an increase in TMAO, while those who ate the plant-based diet first did not. But something peculiar happened when the groups switched diets. Those who transitioned from meat to plant had a decrease in TMAO levels, which was expected. Those who switched from plant to meat, however, did not see an increase in TMAO.

"It was pretty shocking; we had hypothesized that it wouldn't matter what order the diets were in," Gardner said. It turns out that there are bacterial species responsible for the initial step of creating TMAO in the gut. These species are thought to flourish in people whose diets are red-meat heavy, but perhaps not in those who avoid meat.

"So for the participants who had the plant-based diet first, during which they ate no meat, we basically made them vegetarians, and in so doing, may have inadvertently blunted their ability to make TMAO," he said. Whether this type of approach could be used as a strategy for decreasing cardiovascular disease risk remains to be seen.

Beyond TMAO

Outside of TMAO, health benefits conveyed from plant-based alternatives extended to weight and levels of LDL cholesterol -- or "bad" cholesterol. No matter which diet was first, participants' levels of LDL cholesterol dropped on average 10 milligrams per deciliter, which is not only statistically significant, but clinically significant too, Gardner said. In addition, participants lost 2 pounds, on average, during the plant-based portion of the diet.

"The modest weight loss observed when participants substituted the plant-based meats in place of the red meats is an unexpected finding, since this was not a weight-loss study," Crimarco said. "I think this indicates the importance of diet quality. Not all highly processed foods are created equal."

Gardner hopes to continue studying the relationship between health and plant-based meat alternatives, particularly as it pertains to changes in the microbiome. Gardner said he's also interested in expanding his research into diet patterns overall. "Maybe next we'll look at a combination of dietary factors on health -- perhaps alternative meat combined with alternative dairy products," he said.

Credit: 
Stanford Medicine

Aging memories may not be 'worse,' just 'different'

image: "Older adults might be representing events in different ways, and transitions might be picked up differently than, say, a 20-year-old," said Zachariah Reagh, assistant professor of psychological and brain sciences in Arts & Sciences. Reagh looked at fMRI images to study memory differences in different age groups.

Image: 
Washington University in St. Louis

"Memory is the first thing to go."

Everyone has heard it, and decades of research studies seem to confirm it: While it may not always be the first sign of aging, some faculties, including memory, do get worse as people age.

It may not be that straightforward.

Zachariah Reagh, assistant professor of psychological and brain sciences in Arts & Sciences at Washington University in St. Louis, studied the brain activity of older people not by requiring them to recite a group of words or remember a string of numbers. Instead, he took a "naturalistic approach," one that more closely resembled real-world activities.

He found that brain activity in older adults isn't necessarily quieter when it comes to memory.

"It's just different," he said.

The study results were published today in the journal Nature Communications.

Common tests of memory involve a person's ability to remember a string of words, count backward, or recognize repeated images. "How many times do you suspect a 75-year-old is going to have to remember, 'tree, apple, cherry, truck?'" asked Reagh, first author on the paper with Angelique Delarazan, Alexander Garber and Charan Ranganath, all of University of California, Davis.

Instead, he used a data set from the Cambridge Centre for Ageing and Neuroscience (Cam-CAN) that included functional MRI (fMRI) scans of people watching an 8-minute movie. "There were no specific instructions, or a 'gotcha' moment," Reagh said. "They just got to kick back, relax and enjoy the film."

But while they may have been relaxing, the subjects' brains were hard at work recognizing, interpreting and categorizing events in the movies. One particular way people categorize events is by marking boundaries -- where one event ends and another begins.

An "event" can be pretty much anything, Reagh said. "This conversation, or a component of it, for example. We take these meaningful pieces and extract them out of a continuous stream."

And what constitutes a boundary is actually consistent among people.

"If you and I watch the same movie, and we are given the instruction to press a button when we feel one meaningful unit has ended, you and I will be much more similar in our responses than we are different," Reagh said.

In the fMRI results -- which use changes in blood flow and blood oxygen to highlight brain activity -- older adults showed increased activity at the boundaries of events, much like the control group. That's not to say that brains of all ages are processing the information similarly.

"It's just different," Reagh said. "In some areas, activity goes down and, in some, it actually goes up."

Overall activity did decline pretty reliably across ages 18-88, Reagh said, and when grouped into "younger, middle aged, and older," there was a statistically reliable drop in activity from one group to another.

"But we did find a few regions where activity was ramped up across age ranges," he said. "That was unexpected."

Much of the activity he was interested in is in an area of the brain referred to as the posterior medial network -- which includes regions in the midline and toward the backside of the brain. In addition to memory, these areas are heavily involved in representing context and situational awareness. Some of those areas showed decreased activity in the older adults.

"We do think the differences are memory-related," Reagh said. At the boundaries, they saw differences in the levels of activity in the hippocampus that was related to memory in a different measurement -- "story memory," he called it.

"There might be a broad sense in which the hippocampus's response to event boundaries predicts how well you are able to parse and remember stories and complex narratives," no matter one's age, Reagh said.

But for older adults, closer to the front of the brain, particularly the medial prefrontal cortex, things were looking up.

Activity in that area of the brain was ramped up in older adults. This area is implicated in broad, schematic knowledge -- what it's like to go to a grocery store as opposed to a particular grocery store.

"What might be happening is as older adults lose some responsiveness in posterior parts of the brain, they may be shifting away from the more detailed contextual information," Reagh said. But as activity levels heighten in the anterior portions, "things might become more schematic. More 'gist-like.'"

In practice, this might mean that a 20-year-old noting an event boundary in a movie might be more focused on the specifics -- what specific room are the characters in? What is the exact content of the conversation? An older viewer might be paying more attention to the broader picture -- What kind of room are the characters in? Have the characters transitioned from a formal dinner setting to a more relaxed, after-dinner location? Did a loud, tense conversation resolve into a friendly one?

"Older adults might be representing events in different ways, and transitions might be picked up differently than, say, a 20-year-old," Reagh said.

"An interesting conclusion one could draw is maybe healthy older adults aren't 'missing the picture.' It's not that the info isn't getting in, it's just it's getting in differently."

Credit: 
Washington University in St. Louis

The structural basis of Focal Adhesion Kinase activation on lipid membranes unravelled

image: Assembly of oligomeric FAK (yellow/cyan) on the membrane (purple) triggers autophosphorylation. Shown is a state where the autophosphorylation site (red glow) is bound to the active site of the FAK kinase.

Image: 
CIB-CSIC

A research team led by Daniel Lietha has just published in The EMBO Journal the mechanistic details of the activation of the Focal Adhesion Kinase (FAK) on lipid membranes. Lietha started this research during his work at the Spanish National Cancer Research Centre (CNIO) and has culminated it in his current institution, the Centro de Investigaciones Biológicas Margarita Salas (CIB-CSIC).

FAK is a key protein ensuring controlled cell adhesion, proliferation, migration and survival which in cancer is often responsible for aberrant cell invasion leading to metastatic cancers. In the cytosol, FAK adopts an autoinhibited state but is activated upon recruitment into focal adhesions, yet how this occurs or what induces structural changes was unknown.

Lietha's group have demonstrated that FAK is activated when it localises to the cell membrane, where it interacts with specific phosphoinositide lipids. Now, the high-resolution structure of an oligomeric form of FAK bound to a lipid membrane has been obtained using cryo-electron microscopy. Analysis of the structure shows that initial binding of FAK to the membrane causes steric clashes that release the kinase domain from autoinhibition, allowing it to undergo a large conformational change and interact directly with the membrane in an orientation that places the active site towards the membrane.

The structure also reveals that several interfaces align in the rearranged conformation to allow oligomerization of FAK on the membrane with a key phosphorylation site exposed, leading to autophosphorylation and, in turn, activation of FAK. Molecular dynamics simulations were carried out to understand the mechanism and dynamics of the process of autophosphorylation and subsequent activation on the membrane.

To validate the computational model, different mutants of FAK have been generated carrying mutations at the observed interfaces. Extensive biochemical experiments have been carried out to evaluate how the different mutations affect lipid binding, FAK autophosphorylation and activation. Moreover, how the mutations affect FAK function in cancer cells was also studied revealing that the uncovered mechanism is key for cancer cell invasion and proliferation.

Credit: 
Centro Nacional de Investigaciones Oncológicas (CNIO)

Organocatalyst that controls radical reactions for complex and bulky compound synthesis

image: Overview diagram: active control of radical reaction.

Image: 
Kanazawa University

[Background]

Organocatalysts, consisting of organic compounds without metal elements, are receiving much attention as next-generation catalysts in the hope of reducing environmental burden and coping with the depletion and rising prices of rare metals. However, it is difficult for an organocatalyst to control radical reactions, which involve highly reactive single electrons. Thus, reaction processes mediated by organocatalysts are rather limited. This hinders the development and application of organic synthesis using organocatalysts.

N-heterocyclic carbene is known as an organocatalyst free of metal elements and its catalytic reactions have been actively investigated. Studies on N-heterocyclic carbenes as organocatalysts started with the investigation of a biological reaction involving the coenzyme thiamine (vitamin B1), a thiazolium salt. The enol intermediate of the biological reaction is known to promote single-electron transfer to electron acceptors such as lipoamide, flavin adenine dinucleotide (FAD) and Fe4S4 clusters, which play important roles in oxidation reactions. Inspired by this biological phenomenon, scientists have synthesized N-heterocyclic carbene catalysts that can control radical reactions, and these have been applied to organic synthesis. However, due to limits in the number of substrates that can be used for such catalytic reactions, only a narrow range of organic compounds could be synthesized. This has severely limited applications, for example, in drug discovery.

[Results]

Prof. Ohmiya and co-workers designed an N-heterocyclic carbene catalyst in a rational and precise manner for the purpose of widening the range of target chemical substances for radical reactions. The group recently found a thiazolium-type N-heterocyclic carbene catalyst having an N-neopentyl group and applied this carbene catalyst to a radical reaction to synthesize a dialkyl ketone from an aliphatic aldehyde and an aliphatic carboxylic acid derivative (Figure 1). This was previously a very difficult synthesis. Until now, radical reactions using conventional N-heterocyclic carbene catalysts could be applied only to aromatic aldehydes as substrates. The N-heterocyclic carbene catalyst newly developed here is versatile, applicable to both aromatic and aliphatic aldehydes, thus significantly widening the scope of radical reactions mediated by organocatalysts.

The key to this success was the finding, made through extensive use of organic chemistry and measurement techniques, that the N-neopentyl group of the thiazolium-type N-heterocyclic carbene was essential to the reaction's progress. The bulkiness of the N-neopentyl group proved effective not only in promoting the coupling of the two different radical species generated in the reaction system but also in suppressing undesirable side reactions.

The present catalytic reaction has the following merits in organic chemical synthesis: 1) bulky molecules can serve as reaction substrates due to the involvement of a highly reactive radical, and 2) the method accommodates a wide range of functional groups and substrates, since the catalytic reaction can be carried out under mild conditions without the need for metal catalysts or redox reagents. It is now possible to synthesize more than 35 bulky and complex dialkyl ketones, which was previously very difficult. This enables the synthesis of natural compounds and pharmaceuticals having a dialkyl ketone backbone from an aliphatic aldehyde and an aliphatic carboxylic acid derivative.

[Future prospects]

In this study, the research group has designed a new organocatalyst that controls radical reactions, which significantly widens the applicability to various substrates. The study is expected to accelerate drug discovery, since it enables synthesis of organic compounds with high added value that used to be nearly impossible to attain. From an academic viewpoint, the study has established design guidelines of organocatalysts that can control radical reactions.

Credit: 
Kanazawa University

Mathematical patterns developed by Alan Turing help researchers understand bird behavior

Scientists from the University of Sheffield have used mathematical modelling to understand why flocks of long-tailed tits segregate themselves into different parts of the landscape.

The team tracked the birds around Sheffield's Rivelin Valley, where the flocks' movements eventually produced a pattern across the landscape; using maths helped the team reveal the behaviours causing these patterns.

The findings, published in the Journal of Animal Ecology, show that flocks of long-tailed tits are less likely to avoid places where they have interacted with relatives and more likely to avoid larger flocks, whilst preferring the centre of woodland.

It was previously unknown why flocks of long-tailed tits live in separate parts of the same area, despite there being plenty of food to sustain multiple flocks and the birds not showing territorial behaviour.

The equations used to understand the birds are similar to those developed by Alan Turing to describe how animals get their spotted and striped patterns. Turing's famous mathematics indicates whether patterns will appear as an animal grows in the womb; here, it is used to find out which behaviours lead to the patterns across the landscape.
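For reference, a generic two-species reaction-diffusion system of the kind Turing introduced takes the form

\[ \frac{\partial u}{\partial t} = f(u, v) + D_u \nabla^2 u, \qquad \frac{\partial v}{\partial t} = g(u, v) + D_v \nabla^2 v, \]

where u and v are the two interacting densities, f and g describe their local interactions, and the mismatch between the diffusion rates D_u and D_v is what allows spontaneous patterns to emerge. This is the textbook template only; the study's actual model adds terms for flock movement and memory that are not reproduced here.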

Territorial animals often live in segregated areas that they aggressively defend and stay close to their den. Before this study, these mathematical ideas had been used to understand the patterns made by territorial animals such as coyotes, meerkats and even human gangs. However, this study was the first to use the ideas on non-territorial animals with no den pinning them in place.

Natasha Ellison, PhD student at the University of Sheffield who led the study, said: "Mathematical models help us understand nature in an extraordinary amount of ways and our study is a fantastic example of this."

"Long-tailed tits are too small to be fitted with GPS trackers like larger animals, so researchers follow these tiny birds on foot, listening for bird calls and identifying birds with binoculars. The field work is extremely time consuming and without the help of these mathematical models these behaviours wouldn't have been discovered."

Credit: 
University of Sheffield

Aspirin may accelerate progression of advanced cancers in older adults

Results from a recent clinical trial indicate that for older adults with advanced cancer, initiating aspirin may increase their risk of disease progression and early death.

The study, which was conducted by a binational team led by researchers at Massachusetts General Hospital (MGH), the Berman Center in Minnesota, and Monash University in Australia, is published in the Journal of the National Cancer Institute.

Compelling evidence from clinical trials that included predominantly middle-aged adults demonstrates that aspirin may reduce the risk of developing cancer, especially colorectal cancer. Information is lacking for older adults, however.

To provide insights, investigators designed and initiated the ASPirin in Reducing Events in the Elderly (ASPREE) trial, the first randomized double-blind placebo-controlled trial of daily low-dose aspirin (100 mg) in otherwise healthy older adults. The study included 19,114 Australian and U.S. community-dwelling participants aged 70+ years (U.S. minorities 65+ years) without cardiovascular disease, dementia, or physical disability at the start of the study. Participants were randomized to aspirin or placebo and followed for a median of 4.7 years.

In October 2018, the investigators published a very surprising and concerning report showing an association between aspirin use and an elevated risk of death, primarily due to cancer. The current report now provides a more comprehensive analysis of the cancer-related effects of aspirin in the ASPREE participants. "We conducted this study as a more detailed examination of the effect of aspirin on the development of cancer as well as death from cancer," explained senior author Andrew T. Chan, MD, MPH, Chief of the Clinical and Translational Epidemiology Unit at MGH, Director of Epidemiology at the MGH Cancer Center, and a Professor of Medicine at Harvard Medical School.

Dr. Chan and his colleagues reported that 981 participants who were taking aspirin and 952 who were taking placebo developed cancer, with no statistically significant difference between the groups for developing cancer overall or for developing specific types of cancer. However, aspirin was associated with a 19% higher risk of being diagnosed with cancer that had spread (metastasized) and a 22% higher risk of being diagnosed with stage 4, or advanced, cancer. Also, among participants who were diagnosed with advanced cancer, those taking aspirin had a higher risk of dying during follow-up than those taking placebo.

"Deaths were particularly high among those on aspirin who were diagnosed with advanced solid cancers, suggesting a possible adverse effect of aspirin on the growth of cancers once they have already developed in older adults," said Dr. Chan. He added that the findings suggest the possibility that aspirin might act differently, at the cellular or molecular level, in older people, which requires further study.

Notably, the vast majority of the study participants did not previously take aspirin before age 70. "Although these results suggest that we should be cautious about starting aspirin therapy in otherwise healthy older adults, this does not mean that individuals who are already taking aspirin--particularly if they began taking it at a younger age--should stop their aspirin regimen," Dr. Chan added.

Credit: 
Massachusetts General Hospital

Theoretical study shows that matter tends to be ordered at low temperatures

image: In (a), the quantum-critical point is shown (cyan bullet). The system undergoes a phase transition when the external magnetic field reaches a certain critical value. In (b), the hypothetical quantum-critical point for B=0 and T=0 is shown (red bullet). The red-shaded gradient represents the role played by the mutual interactions between neighboring magnetic moments, which become increasingly relevant as the temperature decreases.

Image: 
UNESP

Classical phase transitions are governed by temperature. One of the most familiar examples is the phase transitions of water from solid to liquid to gas. However, other parameters govern phase transitions when temperatures approach absolute zero, including pressure, the magnetic field, and doping, which introduce disorder into the molecular structure of a material.

This topic is treated from the theoretical standpoint in the article "Unveiling the physics of the mutual interactions in paramagnets", published in Scientific Reports, an online journal owned by Springer Nature.

The paper resulted from discussions held in the laboratory in the context of the doctoral research of the two main authors, Lucas Squillante and Isys Mello, supervised by the last author, Mariano de Souza, a professor in the Physics Department of São Paulo State University's Institute of Geosciences and Exact Sciences (IGCE-UNESP) in Rio Claro, Brazil.

The other coauthors are Roberto Eugenio Lagos Mônaco and Antonio Carlos Seridonio, also professors at UNESP, and Harry Eugene Stanley, a professor at Boston University (USA).

The study was supported by the São Paulo Research Foundation (FAPESP) via a grant awarded to the project "Exploring thermodynamic and transport properties of strongly correlated electron systems", for which Souza was the principal investigator.

"In paramagnetic materials, there's always a subtle many-body contribution to the system's energy. This contribution can be considered a small effective local magnetic field. It's usually overlooked, given the very small amount of energy associated with it compared to the energy associated with thermal fluctuations or external magnetic fields.

Nevertheless, when the temperature and external magnetic field approach zero, such many-body contributions become significant," Souza told.

The study showed that matter always tends to be ordered at low temperatures owing to many-body interactions. The noninteracting spin gas model therefore does not occur in the real world because a many-body interaction between the spins in the system would impose order.
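In textbook mean-field language (an illustrative sketch, not the paper's full treatment), this residual many-body contribution can be written as an effective field felt by each magnetic moment,

\[ B_{\mathrm{eff}} = B_{\mathrm{ext}} + \lambda M, \]

where M is the magnetization and \( \lambda \) encodes the mutual interactions. Even as the external field and the temperature approach zero, the \( \lambda M \) term survives, which is why a genuinely noninteracting spin gas is never realized.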

"We found that in actual materials, there's no such thing as a critical point at which a quantum phase transition occurs in a genuine zero field because of the persistence of the residual magnetic field created by the many-body interaction. In a broader context, ideal Bose-Einstein condensation can't be obtained because of this interaction," Souza said.

A Bose-Einstein condensate, often referred to as the "fifth state of matter" (the others being solid, liquid, gas and plasma), is a group of atoms cooled to within a hair of absolute zero. When they reach that temperature, the atoms have no free energy to move relative to each other and fall into the same quantum states, behaving as a single particle.

Bose-Einstein condensates were first predicted and calculated theoretically by Satyendra Nath Bose (1894-1974) and Albert Einstein (1879-1955) in 1924, but it was not until 1995 that Eric A. Cornell, Carl E. Wieman and Wolfgang Ketterle managed to make one using ultracold rubidium gas, for which all three were awarded the 2001 Nobel Prize in Physics.

"What our study showed was that although a nonideal Bose-Einstein condensate can be obtained experimentally, the ideal condition for condensation can't be achieved because it presupposes that particles don't perceive or interact with each other, whereas residual interaction always occurs, even in the vicinity of absolute zero," Souza said.

"Another discovery was that matter can be magnetized adiabatically [without heat loss or gain] via these mutual interactions alone."

Credit: 
Fundação de Amparo à Pesquisa do Estado de São Paulo