Tech

Wine glass size may influence how much you drink in restaurants

The size of glass used for serving wine can influence the amount of wine drunk, suggests new research from the University of Cambridge, funded by the National Institute of Health Research (NIHR). The study found that when restaurants served wine in 370ml rather than 300ml glasses they sold more wine, and tended to sell less when they used 250ml glasses. These effects were not seen in bars.

Alcohol is the fifth-largest contributor to early death in high-income countries and the seventh-largest worldwide. One proposed way of reducing the amount of alcohol consumed is to reduce the size of wine glasses, though until now the evidence supporting such a move has been inconclusive and often contradictory.

Wine glasses have increased in size almost seven-fold over the last 300 years, with the most marked increase being a doubling in size since 1990. During this time, the amount of wine consumed in England quadrupled, although the number of wine consumers stayed constant. Wine sold in bars and restaurants comes either in fixed serving sizes, when sold by the glass, or - particularly in restaurants - by the bottle or carafe, for free-pouring by customers or staff.

A preliminary study carried out by researchers at the Behaviour and Health Research Unit, University of Cambridge, suggested that serving wine in larger wine glasses - while keeping the measure the same - led to a significant increase in the amount of wine sold.

To provide a robust estimate of the effect size of wine glass size on sales - a proxy for consumption - the Cambridge team did a 'mega-analysis' that brought together all of their previously published datasets from studies carried out between 2015 and 2018 at bars and restaurants in Cambridge. The team used 300ml glasses as the reference level against which to compare differences in consumption.

In restaurants, when glass size was increased to 370ml, wine sales increased by 7.3%. Reducing the glass size to 250ml led to a drop of 9.6%, although confidence intervals (the range of values within which the researchers can be fairly certain their true value lies) make this figure uncertain. Curiously, increasing the glass size further to 450ml made no difference compared to using 300ml glasses.

"Pouring wine from a bottle or a carafe, as happens for most wine sold in restaurants, allows people to pour more than a standard serving size, and this effect may increase with the size of the glass and the bottle," explained first author Dr Mark Pilling. "If these larger portions are still perceived to be 'a glass', then we would expect people to buy and consume more wine with larger glasses.

"As glass sizes of 300ml and 350ml are commonly used in restaurants and bars, drinkers may not have noticed the difference and still assumed they were pouring a standard serving. When smaller glass sizes of 250ml are available, they may also appear similar to 300ml glasses but result in a smaller amount of wine being poured. In contrast, very large glasses, such as the 450ml glasses, are more obviously larger, so drinkers may have taken conscious measures to reduce how much they drink, such as drinking more slowly or pouring with greater caution."

The researchers also found similar internal patterns to those reported in previous studies, namely lower sales of wine on warmer days and much higher sales on Fridays and Saturdays than on Mondays.

The researchers found no significant differences in wine sales by glass size in bars - in contrast to the team's earlier study. This shows the importance of replicating research to increase our ability to detect the effects of wine glass size. When combined with data from other experiments, the apparent effect in bars disappeared.

"If we are serious about tackling the negative effects of drinking alcohol, then we will need to understand the factors that influence how much we consume," added senior author Professor Dame Theresa Marteau. "Given our findings, regulating wine glass size is one option that might be considered for inclusion in local licensing regulations for reducing drinking outside the home."

Professor Ashley Adamson, Director of the NIHR School of Public Health Research, said: "We all like to think we're immune to subtle influences on our behaviour - like the size of a wine glass - but research like this clearly shows we're not.

"This important work helps us understand how the small, everyday details of our lives affect our behaviours and so our health. Evidence like this can shape policies that would make it easier for everyone to be a bit healthier without even having to think about it."

Clive Henn, Senior Alcohol Advisor at Public Health England, welcomed the report: "This interesting study suggests a new alcohol policy approach by looking at how the size of wine glasses may influence how much we drink. It shows how our drinking environment can impact on the way we drink and help us to understand how to develop a drinking environment which helps us to drink less."

Credit: 
University of Cambridge

Stretchable, wearable coils may make MRI, other medical tests easier on patients

image: The Purdue University team developed RF coils that are formable and stretchable for medical imaging.

Image: 
Joseph Rispoli/Purdue University

WEST LAFAYETTE, Ind. - Anyone who has had a mammogram or an MRI knows how uncomfortable and awkward the tests can be. Now, Purdue University researchers have taken technology used in the defense and aerospace industries to create a novel way of doing some medical imaging.

One reason the tests are uncomfortable is that they often use rigid radio-frequency (RF) coils to detect signals from the body. The Purdue team has developed RF coils that are formable and stretchable.

"Imagine going for an imaging session and they strap on a comfortable fabric with the coils embedded inside," said Joseph Rispoli, an assistant professor of biomedical engineering and electrical and computer engineering in Purdue's College of Engineering. "We created an adaptable, wearable and stretchable fabric embroidered with conductive threads that provides excellent signal-to-noise ratio for enhanced MRI scanning."

The Purdue team's work appeared in the journal IEEE Transactions on Biomedical Engineering.

Current approaches to enhancing signal-to-noise ratio, known as SNR, include shaping receive coil arrays to encompass a generalized form of the body part of interest, but these are often rigid and require the patient be posed in a specific way. The Purdue flexible and stretchable coil could be placed close to the skin on an area or joint, regardless of its positioning.

The thread technology used in the Purdue innovation is similar to that found in applications for the aerospace and defense industries. Rispoli said the technology also is applicable to breast MRI and to enhancing medical device communication using wearable or implantable antennas.

"Our preliminary results show a full-scale device will be superior in all aspects of diagnostic testing, including increased sensitivity and fewer false positives," Rispoli said.

The innovators are working with the Purdue Research Foundation Office of Technology Commercialization to patent the technology. The office recently moved into the Convergence Center for Innovation and Collaboration in Discovery Park District, located on the west side of the Purdue campus.

The researchers are looking for partners to continue developing their technology. For more information on licensing and other opportunities, contact Patrick Finnerty of OTC at pwfinnerty@prf.org and mention track code 2019-RISP-68630.

The National Institutes of Health funded some of the work on the technology. The Purdue innovators also presented the technology at the International Society for Magnetic Resonance in Medicine Annual Meeting and the IEEE International Engineering in Medicine and Biology Conference.

Rispoli also is a member of the drug delivery and molecular sensing program at the Purdue University Center for Cancer Research, where he works on technologies to diagnose and monitor brain, breast and other cancers.

Credit: 
Purdue University

STATICA: A novel processor that solves a notoriously complex mathematical problem

image: Research overview of STATICA, a novel processor architecture.

Image: 
Tokyo Institute of Technology

Scientists at Tokyo Institute of Technology have designed a novel processor architecture that can solve combinatorial optimization problems much faster than existing ones. Combinatorial optimization problems are complex, show up across many different fields of science and engineering, and are difficult for conventional computers to handle, making specialized processor architectures very important.

The power of applied mathematics can be seen in the advancements of engineering and other sciences. However, often the mathematical problems used in these applications involve complex calculations that are beyond the capacities of modern computers in terms of time and resources. This is the case for combinatorial optimization problems.

Combinatorial optimization consists of finding an optimal object or solution within a finite set of possibilities. Such problems manifest ubiquitously in the real world across different fields. For example, combinatorial optimization problems show up in finance as portfolio optimization, in logistics as the well-known "travelling salesman problem", in machine learning, and in drug discovery. However, current computers cannot cope with these problems when the number of variables is high.

Fortunately, a team of researchers from Tokyo Institute of Technology, in collaboration with the Hitachi Hokkaido University Laboratory and the University of Tokyo, has designed a novel processor architecture to specifically solve combinatorial optimization problems expressed in the form of an Ising model. The Ising model was originally used to describe the magnetic states of atoms (spins) in magnetic materials. However, this model can be used as an abstraction to solve combinatorial optimization problems because the evolution of the spins, which tends to reach the so-called lowest-energy state, mirrors how an optimization algorithm searches for the best solution. In fact, the state of the spins in the lowest-energy state can be directly mapped to the solution of a combinatorial optimization problem.

The proposed processor architecture, called STATICA, is fundamentally different from existing processors that calculate Ising models, called annealers. One limitation of most reported annealers is that they only consider spin interactions between neighboring particles. This allows for faster calculation, but limits their possible applications. In contrast, STATICA is fully connected and all spin-to-spin interactions are considered. While STATICA's processing speed is lower than that of similar annealers, its calculation scheme is better: it uses parallel updating.

In most annealers, the evolution of spins (updating) is calculated iteratively. This process is inherently serial, meaning that spin switchings are calculated one by one, because the switching of one spin affects all the rest within the same iteration. In STATICA, the updating process is carried out in parallel using what is known as stochastic cellular automata. Instead of calculating spin states using the spins themselves, STATICA creates replicas of the spins, and spin-to-replica interactions are used, allowing for parallel calculation. This saves a tremendous amount of time due to the reduced number of steps needed. "We have proven that conventional approaches and STATICA derive the same solution under certain conditions, but STATICA does so in N times fewer steps, where N is the number of spins in the model," remarks Prof. Masato Motomura, who led this project. Furthermore, the research team implemented an approach called delta-driven spin updating: because only spins that changed in the previous iteration matter when calculating the next one, a selector circuit involves only the spins that flipped in each iteration.
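The release describes this scheme in words only; the sketch below is a minimal NumPy illustration of the generic stochastic-cellular-automata update it outlines - every spin refreshed in parallel against a replica of the previous state - and all of its parameters (the coupling matrix J, the agreement penalty q, the temperature schedule) are invented for illustration, not taken from the STATICA hardware.

import numpy as np

def sca_sweeps(J, h, betas, q=1.0, seed=0):
    """Toy stochastic-cellular-automata (SCA) annealer for an Ising model.

    All spins are refreshed simultaneously against a replica copy of the
    previous state, instead of one at a time as in serial annealers."""
    rng = np.random.default_rng(seed)
    tau = rng.choice([-1, 1], size=len(h))      # replica = previous state
    for beta in betas:                          # annealing schedule
        # Field each spin feels from the replica; a positive q rewards
        # agreement between a spin and its replica, taming the dynamics.
        field = J @ tau + h + q * tau
        p_up = 1.0 / (1.0 + np.exp(-2.0 * beta * field))
        tau = np.where(rng.random(len(h)) < p_up, 1, -1)  # one parallel sweep
    return tau

# Tiny check: a 3-spin ferromagnet should end with all spins aligned.
J = np.array([[0., 1., 1.], [1., 0., 1.], [1., 1., 0.]])
print(sca_sweeps(J, h=np.zeros(3), betas=np.linspace(0.1, 3.0, 200)))

Because the whole sweep is a single vectorized step, the update order no longer matters, which is what makes the hardware parallelism the release describes possible.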

STATICA offers reduced power consumption, higher processing speed, and better accuracy than other annealers. "STATICA aims at revolutionizing annealing processors by solving optimization problems based on the mathematical model of stochastic cellular automata. Our initial evaluations have provided strong results," concludes Prof. Motomura. Further refinements will make STATICA an attractive choice for combinatorial optimization.

Credit: 
Tokyo Institute of Technology

Cold sintering produces capacitor material at record low temperatures

image: The dense microstructure of barium titanate as seen under a microscope

Image: 
Clive Randall, Penn State

Barium titanate is an important electroceramic material used in trillions of capacitors each year and found in most electronics. Penn State researchers have produced the material at record low temperatures, and the discovery could lead to more energy efficient manufacturing.

A team of Penn State scientists used the cold sintering process to densify barium titanate ceramics at less than 572 degrees Fahrenheit (300 degrees Celsius), the lowest processing temperatures ever used, while maintaining the quality achieved at higher temperatures in modern commercial manufacturing, the researchers said.

"Our work is the first example showing we can densify ferroelectric oxides in a single step," said Kosuke Tsuji, a doctoral candidate in the Department of Materials Science and Engineering at Penn State and lead author of the study. "It should open up the possibility to densify many more inorganic materials at low temperatures."

It is the first time researchers have densified barium titanate in a single step using cold sintering. Previous attempts required secondary heating to produce materials with useful dielectric properties, said the scientists, who reported their findings in the Journal of the European Ceramic Society.

Sintering is a commonly used process to compress fine powders into a solid mass of material using heat and pressure. The cold sintering process, developed by Penn State scientists, accomplishes this at much lower temperatures and in shorter times than conventional sintering. The emerging technology holds the potential to reduce the costs and environmental impacts of manufacturing a wide range of materials, according to the researchers.

The researchers used new chemistries to densify barium titanate in a single step. Cold sintering involves adding a few drops of liquid to ceramic powder. The interplay of moisture, heat and pressure creates denser materials than heating at higher temperatures without liquid.

Previous cold sintering research used neutral or acidic solutions, but the new study incorporated hydroxide, an alkaline material. The hydroxide helped produce barium titanate with the necessary dielectric properties at lower temperatures, the scientists said.

"This research shows that materials that were previously difficult to sinter can now be done," said Clive Randall, professor of materials science and engineering at Penn State, who led the development of cold sintering. "It takes us to the dream that we can eventually find the right chemistry to allow all ceramic materials, and maybe even metal materials, to be cold sintered."

Barium titanate is the basic compound used to produce high permittivity dielectric materials in multilayer capacitors. Of the more than 3 trillion ceramic capacitors produced each year, about 90% contain barium titanate.

"These devices underpin the modern electronic world," said Randall, who also serves as director of Penn State's Materials Research Institute. "The implications of applying this technology to barium titanate are enormous. In your cell phone alone, you may have 1,000 components that are all made of barium titanate. It is ubiquitous to all electronics."

Lowering temperatures used in commercial manufacturing would not only be more energy efficient but could open the door to using less expensive metals and incorporating polymer composites into these capacitors, according to the researchers.

"This is very attractive to many of the leading capacitor companies, which are all working with these researchers through Penn State's Center for Dielectrics and Piezoelectrics (CDP)," Randall said.

Credit: 
Penn State

Researchers overcome the space between protons and neutrons to study heart of matter

NEWPORT NEWS, VA - Nuclear physicists have entered a new era for probing the strongest force in the universe at its very heart with a novel method of accessing the space between protons and neutrons in dense environments. The research, which was carried out at the Department of Energy's Thomas Jefferson National Accelerator Facility, has been published in the journal Nature and opens the door for more precision studies of the strongest part of the strong nuclear force and the structure of neutron stars.

"Until this work, the forces between protons and neutrons at very, very short distances comparable to the size of the proton and neutron itself were very model dependent," explains Axel Schmidt, a former MIT postdoctoral researcher and the paper's lead author, who has since moved to George Washington University. "We've come up with a new way to analyze data from Jefferson Lab to look at these forces at distances that are much, much shorter."

The strong force is one of the four fundamental forces of nature, which also include gravity, electromagnetism and the weak nuclear force. The strong force is responsible for binding together the protons and neutrons that form the nucleus of the atom, and thus the core of every atom that builds our visible universe.

"What we have presented in this paper is a new approach in learning about that force by using protons and neutrons in nuclei that happen to get close together, and using this natural occurrence within nuclei to learn about these forces," says Schmidt.

As pairs of protons and neutrons get very close together, they may engage in a short-range correlation, forming a brief partnership. While in this correlation, they overlap momentarily before the particles part ways.

In their analysis, the researchers captured snapshots of these correlations to study as microcosms of dense nuclear matter. They then tested different state-of-the-art models for the strong nuclear force to see how well the models explain the data.

They found that the most successful models describe the strong nuclear force at short distances as having a so-called tensor interaction, where protons interact with other protons very differently than they interact with neutrons. Then, as the distance between the correlated particles shrinks even further, the nuclear force interaction changes to a so-called scalar interaction, where proton-proton and proton-neutron interactions are very similar.

"And we find that these models of the force that have a harder repulsive (scalar) core seem to do a better job of explaining the data," Schmidt explains.

This hard, repulsive core of the strong nuclear force has never before been directly accessed experimentally inside a nucleus. The researchers say that's because, as experimenters attempted to reach these short distance scales in particle accelerators using higher and higher energies, the data became muddled by the production of other particles that complicated the interactions.

The researchers say they were surprised to find that even though the protons and neutrons overlapped in these interactions, the models that treat them as individual particles were still successful in describing their behavior. This was verified across the range of different nuclei used in the experiment, from carbon to lead.

"We find that we can still model the data using protons and neutrons, even though they are clearly at distance scales that are smaller than their own size and therefore quarks and gluons might need to be explicitly accounted for," says Jackson Pybus, an MIT graduate student and the paper's second author. "They are clearly overlapping to a large degree, but it doesn't seem to invalidate our models and our calculations."

Further, the result also has implications for the structure of neutron stars, where it's expected that protons and neutrons overlap much as they do in the short-range correlations studied in the experiment.

"That's a huge triumph for modern nuclear physics, because nobody expected this model to have any connection to reality at this distance scale," says Or Hen, an assistant professor of physics at MIT and spokesperson for collaboration.

They say the next step is to see if these results hold up when they can run the experiment again on a wide range of nuclei and at higher precision with the newly upgraded CEBAF accelerator and experimental equipment in Jefferson Lab's Experimental Hall B. The experiment has been approved for running and is awaiting scheduling.

To study the short-range correlations, the researchers re-analyzed data taken at the Department of Energy's Thomas Jefferson National Accelerator Facility from an experiment conducted in 2004 using Jefferson Lab's Continuous Electron Beam Accelerator Facility, a DOE Office of Science User Facility. CEBAF produced a 5.01 GeV beam of electrons to probe nuclei of carbon, aluminum, iron and lead.

Credit: 
DOE/Thomas Jefferson National Accelerator Facility

These feet were made for walking

image: The longitudinal arch has been well studied when it comes to foot stiffness, but this research found that the transverse arch may be more important.

Image: 
OIST

Many of us take our feet for granted, but they have a challenging job in the biomechanics department. When we push off with the ball of the foot, the force we apply exceeds our body weight, tending to bend the middle of the foot. Yet the foot maintains its shape because it is stiff enough to withstand this force. Researchers have long debated what gives the midfoot its stiffness. Now, a new study, published in Nature, has shed light on the importance of a little-studied structure called the transverse arch (TA), which runs across the foot.

"Having a firm understanding of how the human foot works has several real-world applications," said Professor Mahesh Bandi, from the Nonlinear and Non-equilibrium Physics Unit at the Okinawa Institute of Science and Technology Graduate University (OIST), who co-led the study. For example, he explained, the current definition of flatfoot disorder is solely based on the well-studied medial longitudinal arch (LA), which runs the long way down the foot, and doesn't consider the transverse arch (TA). What's more, he adds, this research could also help with the design of robotic feet and point to clues about how bipedal walking evolved.

Previous studies have found that 25% of the foot's stiffness is conferred by the LA. The researchers theorized that the TA contributes even more to this effect, much like how rolling a floppy piece of paper makes it harder to bend. Using a protocol developed with the help of computer simulations, and experiments on plastic and mechanical models, the researchers found that about half of the stiffness in human cadaveric feet is controlled by the TA.

Measuring the stiffness of the human foot

To determine what influences midfoot stiffness, Professor Bandi and colleagues created both computer simulations and plastic models of the midfoot and measured how much force was required to bend them a certain amount.

"We found that the plastic models and simulations with more pronounced TAs were stiffer and less susceptible to bending than flatter ones," said Professor Bandi. "In contrast, on these models, an increase in the curvature of the LA had little effect on the stiffness."

They then performed bending tests on mechanical models of the foot, which varied in length, thickness and TA curvature. As with the simulations and plastic models, they found that the mimics with more pronounced TAs were stiffer to bend. Finally, they found that cutting the TA tissues in feet from human cadavers donated to research - while leaving the LA tissues intact - reduced the feet's stiffness by about half.

In the steps of our ancestors

Professor Bandi and colleagues also looked at the role the TA played in human evolution. Researchers know that the feet of vervet monkeys, macaques, chimpanzees and gorillas are substantially flatter than human feet, and can only partially stiffen. Meanwhile, species within the genus Homo, like us, all have a pronounced TA, enabling effective walking and running.

By comparing the curvature of the TA in humans and non-human primates to fossils of earlier hominin species, Professor Bandi and colleagues traced when a prominent TA first appeared in the fossil record. "Our findings suggest that a human-like TA predates the genus Homo by around 1.5 million years and was a vital component in the evolution of modern humans. This shows that in future tests, researchers need to analyze both arches, not just the LA," said Professor Bandi.

This research was jointly led by Professor Bandi, Professor Madhusudhan Venkadesan from Yale University and Professor Shreyas Mandre from the University of Warwick. It was funded by a Young Investigator Grant from the Human Frontier Science Program.

Credit: 
Okinawa Institute of Science and Technology (OIST) Graduate University

Adequate folate levels linked to lower cardiovascular mortality risk in RA patients

Decreased folate levels in the bloodstream have been associated with an increased risk of cardiovascular mortality in patients with rheumatoid arthritis, shedding light on why those patients are more susceptible to heart and vascular disease, according to research published today in JAMA Network Open by experts at The University of Texas Health Science Center at Houston (UTHealth).

Patients with rheumatoid arthritis are 60% more likely to die from cardiovascular disease, but researchers have not been able to explain why. Rheumatoid arthritis is an autoimmune disorder that causes inflammation due to immune system attacks on healthy cells. It can lead to permanent tissue and joint damage. Women are two to three times more likely to develop the disease.

"Our study is the first to show an association between serum folate and increased cardiovascular mortality in patients with rheumatoid arthritis," said Kalyani Sonawane, PhD, an assistant professor at UTHealth School of Public Health and the study's lead author. "It's particularly important for patients taking disease-modifying anti-rheumatic drugs to understand this increased risk."

Serum folate is the level in the bloodstream of folate, a B vitamin (better known in its synthetic form, folic acid) that is essential to the creation of new cells and has a homocysteine-lowering effect. Homocysteine is an amino acid found in blood, and high levels have been linked to a greater risk of developing cardiovascular disease. Individuals with rheumatoid arthritis often have an increased amount of homocysteine, an imbalance that may be due to common medications prescribed for rheumatoid arthritis, such as methotrexate, which deplete folate levels.

Folic acid is found in many foods such as eggs, broccoli, citrus fruits, and leafy greens. Healthy adults should consume at least 400 mcg daily, but study authors say folate-rich foods may not be enough to prevent cardiovascular disease for people with rheumatoid arthritis. Diets high in animal protein such as red meat and increased coffee consumption have been linked to higher homocysteine levels. Avoiding red meats and coffee and eating a diet rich in fruits and vegetables, in addition to taking a daily folic acid supplement, can help reduce homocysteine blood levels.

The researchers identified 683 patients with a self-reported diagnosis of rheumatoid arthritis. Participants were divided into three groups based on their measured serum folate levels: the first group (239 patients) had levels below 4.3 nanograms per milliliter; the second (234 patients) had levels between 4.3 and 8.2 nanograms per milliliter; and the third (210 patients) had levels greater than 8.2 nanograms per milliliter.

Over the course of 17 years, 258 cardiovascular deaths occurred. Serum folate level below 4.3 nanograms per milliliter was associated with 50% higher cardiovascular mortality risk in rheumatoid arthritis patients.
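The release reports the stratification and the 50% figure but not the model behind them. Purely as an illustration, the sketch below bins serum folate at the published 4.3 and 8.2 ng/mL cutoffs and fits a Cox proportional hazards model, a common choice for mortality-risk analyses; the data file and column names are hypothetical, and the study's actual modelling may differ.

import pandas as pd
from lifelines import CoxPHFitter  # common survival-analysis library

# Hypothetical cohort file with columns: serum_folate (ng/mL),
# followup_years, and cv_death (1 = cardiovascular death).
df = pd.read_csv("ra_cohort.csv")

# Reproduce the reported stratification: <4.3, 4.3-8.2, >8.2 ng/mL.
df["folate_group"] = pd.cut(
    df["serum_folate"],
    bins=[-float("inf"), 4.3, 8.2, float("inf")],
    labels=["low", "middle", "high"],
)

# One-hot encode the bands, using the middle band as the reference.
dummies = pd.get_dummies(df["folate_group"], prefix="folate").astype(float)
X = pd.concat(
    [df[["followup_years", "cv_death"]], dummies.drop(columns="folate_middle")],
    axis=1,
)

# A hazard ratio near 1.5 for the low band would correspond to the
# roughly 50% higher risk reported in the study.
cph = CoxPHFitter()
cph.fit(X, duration_col="followup_years", event_col="cv_death")
cph.print_summary()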

"Our findings suggest that serum folate level might be a useful indicator to assess cardiovascular mortality risk of a rheumatoid arthritis patient in clinical practice," said senior author Maria E. Suarez-Almazor, MD, PhD, Barnts Family Distinguished Professor at The University of Texas MD Anderson Cancer Center. "If future clinical studies validate a causal link, taking folate supplements could be an affordable way to reduce this risk in patients with rheumatoid arthritis."

Credit: 
University of Texas Health Science Center at Houston

Digging into the far side of the moon: Chang'E-4 probes 40 meters into lunar surface

image: The subsurface stratigraphy seen by Yutu-2 radar on the far side of the Moon.

Image: 
CLEP/CRAS/NAOC

A little over a year after landing, China's spacecraft Chang'E-4 is continuing to unveil secrets from the far side of the Moon. The latest study, published on Feb. 26 in Science Advances, reveals what lurks below the surface.

Chang'E-4 (CE-4) landed on the eastern floor of the Von Kármán crater, near the Moon's south pole, on Jan. 3, 2019. The spacecraft immediately deployed its Yutu-2 rover, which uses Lunar Penetrating Radar (LPR) to investigate the subsurface as it roams.

"We found that the signal penetration at the CE-4 site is much greater than that measured by the previous spacecraft, Chang'E-3, at its near-side landing site," said paper author LI Chunlai, a research professor and deputy director-general of the National Astronomical Observatories of the Chinese Academy of Sciences (NAOC). "The subsurface at the CE-4 landing site is much more transparent to radio waves, and this qualitative observation suggests a totally different geological context for the two landing sites."

LI and his team used the LPR to send radio signals deep into the surface of the moon, reaching a depth of 40 meters with the high-frequency 500 MHz channel - more than three times the depth previously reached by CE-3. This data allowed the researchers to develop an approximate image of the subsurface stratigraphy.
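The release does not spell out how a radar echo's delay becomes a depth estimate. The standard ground-penetrating-radar conversion, given here as an illustration with an assumed regolith permittivity rather than the paper's value, is

$d = \dfrac{c\,\tau}{2\sqrt{\varepsilon_r}}$

where $\tau$ is the two-way travel time of the echo, $c$ is the speed of light, and $\varepsilon_r$ is the relative permittivity of the regolith. With an assumed $\varepsilon_r \approx 4$, for example, an echo arriving about 530 nanoseconds after transmission corresponds to $d \approx (3\times10^{8}\ \text{m/s} \times 5.3\times10^{-7}\ \text{s})/4 \approx 40$ meters.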

"Despite the good quality of the radar image along the rover route at the distance of about 106 meters, the complexity of the spatial distribution and shape of the radar features make identification of the geological structures and events that generated such features quite difficult," said SU Yan, a corresponding author who is also affiliated with NAOC.

The researchers combined the radar image with tomographic data and quantitative analysis of the subsurface. They concluded that the subsurface is essentially made of highly porous granular materials embedding boulders of different sizes. The content is likely the result of a turbulent early Solar System, when meteors and other space debris frequently struck the Moon. Each impact would eject material to other areas, creating a cratered surface atop a subsurface with varying layers.

The results of the radar data collected by the LPR during the first two days of lunar operation provide the first electromagnetic image of the far side's subsurface structure and the first 'ground truth' of the stratigraphic architecture of an ejecta deposit.

"The results illustrate, in an unprecedented way, the spatial distribution of the different products that contribute to from the ejecta sequence and their geometrical characteristics," LI said, referring to the material ejected at each impact. "This work shows the extensive use of the LPR could greatly improve our understanding of the history of lunar impact and volcanism and could shed new light on the comprehension of the geological evolution of the Moon's far side."

Credit: 
Chinese Academy of Sciences Headquarters

Mathematician identifies new tricks for the old arch in our foot

image: Schematic of the foot skeleton showing the arches and typical loading pattern during locomotion.

Image: 
Yale University

A stiff mid-foot is essential for withstanding excessive force when pushing off on the ground for walking and running

The arch along the length of the foot was believed to be responsible for mid-foot stiffness. Now, a research collaboration between the University of Warwick and two other universities has illustrated the greater importance of a lesser-studied foot arch - the transverse arch.

Our research opens new ways for future researchers to study foot health. Even the definitions of flatfoot are based upon the longitudinal arch and do not consider the transverse arch. Our work throws these standard practices into question, but more work is needed to know how to update them.

Walking and running subject our feet to forces in excess of body weight. The longitudinal arch of the feet was thought to be the reason the feet do not deform under such load. However, researchers from the University of Warwick, Okinawa Institute of Science and Technology Graduate University in Japan and Yale University have illustrated that the transverse arch may be more important for this stiffness.

Past theories of foot stiffness focused on the longitudinal arch. However, in the paper 'Stiffness of the human foot and evolution of the transverse arch', published today, the 26th of February, in the journal Nature, researchers from the University of Warwick, working in collaboration with Yale University and OIST Graduate University, propose that the transverse arch may play an equally important role.

The collaboration found that the transverse arch is a bigger source of foot stiffness than previous work attributed to the longitudinal arch. They also discovered that the transverse arch evolved to become almost human-like over 3.5 million years ago.

This collaboration between Dr Shreyas Mandre, from the Department of Maths at the University of Warwick, Professor Mahesh Bandi, from the Nonlinear and Non-equilibrium Physics Unit at the Okinawa Institute of Science and Technology Graduate University (OIST), and Professor Madhusudhan Venkadesan, from Yale University, was funded by a Young Investigator award from the Human Frontier Science Program.

VIDEO HERE: https://www.youtube.com/watch?v=QyiX0Fb-Lfw

The authors say that this research motivates further work into the role of the transverse arches in the disciplines of podiatry and evolutionary anthropology. These insights could also inspire new designs for prosthetic and robotic feet.

The role of the transverse arch may be understood in simpler terms by looking at a thin paper sheet. When the short edge is held flat, the sheet is floppy and droops under a little weight. But curl the edge a little and the sheet can bear even 100 times as much weight.

"Flat thin objects like paper sheets bend easily, but are much difficult to stretch," Dr. Mandre explains. "The transverse curvature of the sheet engages its transverse stretching when attempting to bend it. This coupling of bending and stretching due to curvature is the principle underlying the stiffening role of the transverse arch."

But because the foot serves multiple mechanical functions, its structure is more complicated than the paper sheet. Therefore, "flattening" the foot to test the hypothesis of curvature-induced stiffening may have unidentified confounding variables. To overcome this difficulty, the researchers ingeniously disrupted the underlying principle while keeping the transverse arch intact.

"Understanding of the underlying principle enabled us to build mechanical mimics of the foot comprising springs that imitated the elastic tissue of the foot. Disrupting the transversely oriented springs in these mimics had the same effect as flattening them," explains Ali Yawar, a co-author of the study.

VIDEO HERE: https://www.youtube.com/watch?v=adt3sH9O_vE

"We disrupted the underlying principle of curvature-induced stiffening in human cadaveric feet by transecting the transverse tissue, which reduced the mid-foot stiffness by nearly half," said Carolyn Eng, another co-author of the article. In comparison, experiments in the 1980's on disrupting the stiffening mechanism due to the longitudinal arch only showed a reduction in stiffness by about 25%.

This research also offers a new interpretation of the fossil record of human ancestral species, especially pertaining to the emergence of bipedalism. The researchers formulated a measure of the transverse arch that accounts for variations in the length and thickness of the feet. They used the measure to compare related species such as the great apes, human ancestral species and some distantly related primates.

"Our evidence suggests that a human-like transverse arch may have evolved over 3.5 million years ago, a whole 1.5 million years before the emergence of the genus Homo and was a key step in the evolution of modern humans," explains Prof. Venkadesan. It also provides a hypothesis for how Australopithecus afarensis, the same species as the fossil Lucy, thought to not possess longitudinally arched feet, could generate footprints like humans that were discovered in Laetoli.

Credit: 
University of Warwick

Scientists discover dust from Middle East cools the Red Sea

video: Researchers at the King Abdullah University of Science and Technology (KAUST) have discovered dust from the Middle East has a positive cooling effect over the land and the Red Sea.

Image: 
King Abdullah University of Science and Technology (KAUST)

Researchers at the King Abdullah University of Science and Technology (KAUST) have discovered dust from the Middle East has a positive cooling effect over the land and the Red Sea.

"Saudi Arabia is in the so-called Dust Belt, emitting about a third of the world's dust emissions," said Professor Georgiy Stenchikov, Director of the Earth Science and Engineering Program in Physical Sciences and Engineering at KAUST. "Dust affects the entire world and it is the most abundant aerosol on Earth."

Based on satellite data, the scientists have found that the amount of dust over the Red Sea is greater than over the land, and that its radiative cooling effect over the sea is the largest in the world.

"Dust cools the surface but warms the atmosphere, which is how it changes the circulation," Stenchikov highlighted. "The Sahara dust shifts the Rain Belt in the summer to the north, increasing the circulation in the Sahel, for instance, so droughts would be more severe if there were no dust."

Climate change is caused by the increasing concentration of greenhouse gases, which increase the Earth's temperature. At the same time, the atmosphere is full of aerosols, natural or anthropogenic, which absorb and reflect solar radiation.

"Dust is a very complex aerosol," Professor Stenchikov noted. "Not only does it reflect solar radiation, it also absorbs solar and infrared radiation and affects the atmospheric circulation, so it produces an extremely strong and complex effect overall on the circulation and the climate."

"The concentration of dust in this center is very high," he added. "The radiative effect is very high so the effect on the climate is also very strong - it affects the climate over land, the Red Sea and nearby waters. Dust deposition also provides nutrients to the sea."

In their paper published in the Journal of Geophysical Research, the researchers showed that the climatological dust radiative forcing over the southern Red Sea is the largest in the world and, for the first time, studied the role of its effects in the region.

According to Dr. Sergey Osipov, from the Atmospheric Chemistry Department at the Max Planck Institute for Chemistry in Germany, primary author of the study, it is essential to study mineral dust aerosols because of the broad spectrum of effects that span from our daily routines to climate change.

"In terms of the environmental hazards, dust contributes to ambient air pollution, which was recently recognized as a major health risk that reduces life expectancy," Dr. Osipov continues.

"Dust is also a major climate driver, and its radiative effects are especially pronounced and relevant in the Middle East and over the Red Sea. Dust aerosols reduce the extreme surface air temperatures over the land and protect the coral reefs in the Red Sea."

The paper demonstrates that dust is an integral part of the Earth system and participates in numerous complex interactions. The research team continue their work to understand these mechanisms and to account for them in models to improve the reliability of future climate projections at a local, regional and global scale.

Credit: 
King Abdullah University of Science & Technology (KAUST)

Scientists discover new clue behind age-related diseases and food spoilage

image: Meirong Zeng (left) and Kevin Wilson at Beamline 9.0.2 of the Advanced Light Source adjusting the Aerosol Mass Spectrometer to measure the degradation kinetics of lipids.

Image: 
Marilyn Sargent/Berkeley Lab

Scientists at the U.S. Department of Energy's Lawrence Berkeley National Laboratory (Berkeley Lab) have made a surprising discovery that could help explain our risk for developing chronic diseases or cancers as we get older, and how our food decomposes over time.

What's more, their findings, which were reported recently in the Proceedings of the National Academy of Sciences (PNAS), point to an unexpected link between the ozone chemistry in our atmosphere and our cells' hardwired ability to ward off disease.

"The beauty of nature is that it often decides to use similar chemistries throughout a system, but we never thought that we would find a common link between atmospheric chemistry, and the chemistry of our bodies and food," said Kevin Wilson, the deputy director of Berkeley Lab's Chemical Sciences Division who led the study. "Our study is the first to explore another chemical pathway that might affect how well the cells in our bodies - and even our food - can respond to oxidative stress, such as pollution, over time."

Our bodies and some of our favorite foods, including meat, nuts, and avocados, have a lot in common: They are made of organic molecules, such as unsaturated lipids, which are important building blocks for cell membranes.

Unsaturated lipids and other organic molecules, like carbohydrates and protein, slowly degrade over time because of a chain reaction - known as autoxidation - kicked off by oxygen and hydroxyl radicals, a type of reactive oxygen species. Hydroxyl radicals insidiously attack the unsaturated lipids in our bodies and our food, turning the freshest of avocados brown, for example.

The damage to our bodies from hydroxyl radicals, however, is more devastating than an oxidized avocado. As we age, decades of exposure to hydroxyl radicals and other reactive oxygen species slowly yet surely debilitate our body's unsaturated lipids. Such irreversible damage increases oxidative stress and the likelihood of developing cancer and age-related chronic diseases such as Alzheimer's.

An unexpected link

For decades, scientists believed that hydroxyl radicals worked alone when they attack unsaturated lipids.

But Wilson and his research team found that hydroxyl radicals have a surprising partner in crime - and it goes by the name "Criegee intermediate."

Criegee intermediates are highly reactive, exotic molecules first proposed by chemist Rudolf Criegee in 1975 to explain how pollutants emitted by cars and factories react with ozone in our atmosphere.

So in 2015, decades after Criegee's groundbreaking discovery, Wilson and co-author Nadja Heine were surprised to observe a chemical species called "secondary ozonides" - molecules containing carbon, hydrogen, and oxygen - during a hydroxyl reaction with unsaturated lipids at Berkeley Lab's Advanced Light Source (ALS). (Heine was a postdoctoral researcher in Berkeley Lab's Chemical Sciences Division at the time of the study.)

What baffled the researchers is that secondary ozonides aren't typically associated with unsaturated lipids; rather, they are products of a Criegee intermediate reaction with atmospheric aldehydes, which are organic compounds derived from alcohols.

"We wondered, do Criegee intermediates work with hydroxyl in the degradation of the unsaturated lipids in food, plastic, and the cells in our bodies?" Wilson said.

Going on a scavenger hunt

Because Criegee intermediates have a fleeting existence, it's hard to observe them directly. So the researchers used a process of elimination to zero in on them.

Lead author Meirong Zeng, a postdoctoral researcher in Berkeley Lab's Chemical Sciences Division, employed a technique called mass spectrometry at the ALS, illuminating nanodroplets of unsaturated lipids with ultraviolet light. The ALS is a DOE-funded synchrotron facility that delivers beams of X-ray, ultraviolet, and infrared light to support dozens of simultaneous experiments to explore the microscopic structure and chemical makeup of samples across a wide range of scientific disciplines.

When she added "scavenger" alcohol molecules, known to react only with Criegee intermediates, to the lipid nanodroplets, she observed that the lipids' degradation rate slowed conspicuously - a consequence, Zeng said, of the scavenger molecules reacting with the Criegee intermediates and rendering them inert.

They also found that once the Criegee intermediates had been disabled by the scavenger molecules, the reaction yielded products similar to peroxide, and it did not release secondary ozonides, Zeng said.

The researchers believe that these results provide evidence of a new lipid degradation pathway where lipid-hungry hydroxyl generates Criegee intermediates, which then give birth to a new batch of hydroxyl; the newly formed hydroxyl sends off a new generation of Criegee intermediates; and the cycle goes on and on.

"This surprised us because hydroxyl radicals were known to cause oxidative damage to cells, but what wasn't known before our study is that hydroxyl does this via the formation of Criegee intermediates," Wilson added.

Since chronic diseases, cancer, and food spoilage have been linked to cell damage caused by hydroxyl radicals, the researchers believe that Criegee intermediates could also play a similar role in the molecular degradation that makes us vulnerable to diseases as we age, and that causes food to decay.

A new path for antioxidants

The discovery could lay the foundation for a new class of antioxidants - from vitamins to natural food preservatives, Zeng added.

"It's an exciting discovery. This gave us a fuller picture of the mechanisms behind cellular degradation and disease that was completely unexpected," she said.

"To complete this work took years of hard work by Nadja and Meirong, and the unique capabilities of the Advanced Light Source to probe complex chemistry," Wilson said. "We hope that the results from our study will inspire researchers to further explore the biochemistry of Criegee intermediates, lipids, and antioxidants, which are needed to help people in a number of ways: from the prevention of disease to the preservation of food."

The researchers next plan to work with theorists at Berkeley Lab to study the quantum properties, such as the electronic structure, at play in the Criegee intermediate/hydroxyl reaction to better assess how this cycle might operate in human cells, food, and in materials containing unsaturated lipids such as plastics and fuels.

Credit: 
DOE/Lawrence Berkeley National Laboratory

New study allows brain and artificial neurons to link up over the web

image: The virtual lab connecting Southampton, Zurich and Padova.

Image: 
University of Southampton

Research on novel nanoelectronic devices led by the University of Southampton has enabled brain neurons and artificial neurons to communicate with each other.

This study has for the first time shown how three key emerging technologies can work together: brain-computer interfaces, artificial neural networks and advanced memory technologies (also known as memristors).

The discovery opens the door to further significant developments in neural and artificial intelligence research.

Brain functions are made possible by circuits of spiking neurons, connected together by microscopic, but highly complex links called 'synapses'. In this new study, published in the scientific journal Scientific Reports, the scientists created a hybrid neural network where biological and artificial neurons in different parts of the world were able to communicate with each other over the internet through a hub of artificial synapses made using cutting-edge nanotechnology. This is the first time the three components have come together in a unified network.

During the study, researchers based at the University of Padova in Italy cultivated rat neurons in their laboratory, whilst partners from the University of Zurich and ETH Zurich created artificial neurons on silicon microchips. The virtual laboratory was brought together via an elaborate setup controlling nanoelectronic synapses developed at the University of Southampton. These synaptic devices are known as memristors.

The Southampton-based researchers captured spiking events being sent over the internet from the biological neurons in Italy and then distributed them to the memristive synapses. Responses were then sent onward to the artificial neurons in Zurich, also in the form of spiking activity. The process also works in reverse, from Zurich to Padova. Thus, artificial and biological neurons were able to communicate bidirectionally and in real time.
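The release describes the hub's role in words only; the toy sketch below captures just the routing logic it outlines - a spike event arrives from one site, is weighted by a stored memristive conductance, and is forwarded to its paired neuron in either direction. Every identifier here (the neuron names, the synapse table, the conductance values) is invented for illustration and is not the authors' code.

from typing import NamedTuple

class Spike(NamedTuple):
    source: str       # neuron that fired
    t_ms: float       # event timestamp in milliseconds
    amplitude: float  # spike strength

# Hypothetical synapse table: source neuron -> (destination, conductance).
# Both directions are listed, mirroring the bidirectional setup.
SYNAPSES = {
    "padova/bio-0": ("zurich/art-0", 0.62),
    "zurich/art-0": ("padova/bio-0", 0.41),
}

def route(spike):
    """Weight a spike by its memristive synapse and name its destination."""
    dest, conductance = SYNAPSES[spike.source]
    return dest, spike._replace(amplitude=spike.amplitude * conductance)

dest, weighted = route(Spike("padova/bio-0", t_ms=12.5, amplitude=1.0))
print(dest, weighted.amplitude)  # -> zurich/art-0 0.62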

Themis Prodromakis, Professor of Nanotechnology and Director of the Centre for Electronics Frontiers at the University of Southampton, said: "One of the biggest challenges in conducting research of this kind and at this level has been integrating such distinct cutting-edge technologies and specialist expertise that are not typically found under one roof. By creating a virtual lab we have been able to achieve this."

The researchers now anticipate that their approach will ignite interest from a range of scientific disciplines and accelerate the pace of innovation and scientific advancement in the field of neural interfaces research. In particular, the ability to seamlessly connect disparate technologies across the globe is a step towards the democratisation of these technologies, removing a significant barrier to collaboration.

Professor Prodromakis added "We are very excited with this new development. On one side it sets the basis for a novel scenario that was never encountered during natural evolution, where biological and artificial neurons are linked together and communicate across global networks; laying the foundations for the Internet of Neuro-electronics. On the other hand, it brings new prospects to neuroprosthetic technologies, paving the way towards research into replacing dysfunctional parts of the brain with AI chips."

Credit: 
University of Southampton

Quitting smoking during the first trimester of pregnancy still puts the baby at risk

Although quitting smoking during the 1st trimester of pregnancy reduces the risk of low birth weight, it isn't enough to protect the unborn child from being born shorter and with smaller brain size, a new study from the University of Eastern Finland shows. The study looked at 1.4 million mother-child pairs in Finland, analysing the effect of maternal smoking on newborns' body size and body proportions when the mother had smoked only during the 1st trimester as opposed to continued smoking. The findings were published in BMJ Open yesterday.

Smoking during pregnancy increases the risk for adverse pregnancy outcomes not only in the neonatal period, but also much later, possibly even in late adulthood. Tobacco smoke contains thousands of chemicals, which can cross the placenta and enter foetal circulation. Among them, nicotine has a multitude of adverse effects on the development of organs, including the brain. Other well-known toxic chemicals in tobacco smoke include carbon monoxide, which can interfere with the unborn child's oxygen supply.

"Smoking during pregnancy is relatively common. In this study, 84.5% were non-smokers and 3.5% quit smoking during the 1st trimester, but 12% continued to smoke after the 1st trimester," Researcher Isabell Rumrich from the University of Eastern Finland says.

The study showed that maternal smoking is associated with a stronger reduction in body length and head circumference than in birth weight, leading to changed body proportions. The effects on body proportions of having quit smoking during the 1st trimester or having continued smoking after the 1st trimester were similar, stressing the importance of early pregnancy as a sensitive exposure window. Among the newborns exposed to maternal smoking only during the 1st trimester, all three measurements of body size (birth weight, body length and head circumference) showed signs of growth restriction. In addition, their body proportions were abnormal.

Furthermore, the study suggests a limited potential to repair foetal damage induced in early pregnancy. Maternal smoking can have an effect on cell proliferation during organogenesis in early prenatal development. Insults during this period have been shown to persist throughout life.

"The most important finding of our study is that although quitting smoking in the 1st trimester reduces the risk of low birth weight, brain size and body length in relation to body weight seem not to catch up.This stresses the importance of quitting smoking already before pregnancy, since even smoking only during early pregnancy can have devastating effects on the long term health of the unborn child," Rumrich notes.

Credit: 
University of Eastern Finland

Cases of poisoning: Liquids containing cannabidiols for e-cigarettes can be manipulated

According to a preliminary assessment from the German Federal Institute for Risk Assessment (BfR), the observed symptoms can with high likelihood be attributed to the prohibited use of synthetic cannabinoids in e-liquids. According to media reports, the patients admitted to having consumed cannabidiol (CBD), a weakly psychoactive cannabinoid. A year ago, cases involving CBD liquids manipulated with synthetic psychoactive cannabinoids were already reported from the USA; those users suffered neurological symptoms like the youths in Bremerhaven.

Both case series thus share similar symptoms - including the fact that no respiratory symptoms appeared - and, in contrast to earlier reports from the USA, an absence of fatal cases. The cases of poisoning in Bremerhaven can only be assessed to a limited extent by the BfR because insufficient information is available. Further data, particularly regarding the products, additives and devices used, are required for a toxicological risk assessment. At present, there is no indication that the e-cigarettes themselves caused or intensified the symptoms of poisoning.

However, e-cigarettes in general, whether with or without nicotine, are harmful to health.

Credit: 
BfR Federal Institute for Risk Assessment

Antioxidant precursor molecule could improve brain function in patients with MS

(PHILADELPHIA) - N-acetylcysteine (NAC), a naturally occurring molecule that replenishes antioxidants, was associated with improved brain metabolism and self-reported improvements in cognitive function in patients with multiple sclerosis, according to a study published in the journal Frontiers in Neurology.

The study found improvements in brain metabolism, particularly in areas that support cognition such as the frontal and temporal lobes in patients with multiple sclerosis. In addition, patients reported qualitative improvements in their cognitive and attention focusing abilities. The study was performed by the Department of Integrative Medicine and Nutritional Sciences, as well as the Departments of Neurology and Radiology, at Thomas Jefferson University.

Current treatments for multiple sclerosis are generally limited to trying to slow the progression of the disease by preventing new brain lesions from occurring. However, these approaches do not help the neurons that have already been affected by the disease process. While the primary event of multiple sclerosis results from an immunological process that targets the white matter, the actual damage to neurons may be due in large part to oxidative stress caused by the immune reaction. NAC is an oral supplement, and also comes in an intravenous form that is used to protect the liver in acetaminophen overdose. Several initial studies have shown that NAC administration increases glutathione levels in the brain. Because glutathione is an antioxidant, researchers have proposed that it could reduce the oxidative stress from the immune reaction, though it's unclear whether it would improve the function of neurons. The current study tested this by tracking cerebral glucose metabolism on positron emission tomography (PET) brain imaging, as a proxy for normal neuronal function.

"This study is an important step in understanding how NAC might work as a potentially new avenue for managing multiple sclerosis patients. The NAC appears to enable neurons to recover some of their metabolic function," says senior author on the paper Daniel Monti, MD, Chairman of the Department of Integrative Medicine and Nutritional Sciences and Director of the Marcus Institute of Integrative Health at Thomas Jefferson University.

Researchers evaluated 24 patients with multiple sclerosis who continued their current treatment and were placed into two groups - the first group received a combination of oral and intravenous (IV) NAC for two months (in addition to their current treatment program); and the second group, the control patients, received only their standard-of-care treatment for multiple sclerosis for two months. Patients were not blinded. Patients in the active group received 50 mg/kg NAC intravenously once per week and 500 mg NAC orally twice per day on non-IV days.

All patients underwent brain scanning using FDG PET imaging, which measures the amount of glucose metabolism in the neurons throughout the brain. This test was used to determine the level of neuronal recovery. Patients were evaluated initially and after two months of either receiving the NAC or standard-of-care therapy. Patients were also evaluated clinically using several different measures of brain function and quality of life.

Compared to controls, the patients receiving NAC had significant improvements in brain metabolism in the caudate, inferior frontal gyrus, lateral temporal gyrus, and middle temporal gyrus as measured by FDG PET imaging. These areas are particularly important in supporting cognition and attention, which were both perceived by patients to be improved via self-reported questionnaires of overall health and well-being.

"This is an exciting study that suggests a natural molecule such as NAC may help improve brain metabolism and symptoms in patients with multiple sclerosis," says corresponding author and neuro-imaging expert Andrew Newberg, MD, Professor and Director of Research at the Department of Integrative Medicine and Nutritional Sciences. The investigators hope that this research will open up new avenues of treatment for multiple sclerosis patients.

Credit: 
Thomas Jefferson University