Culture

Fans prefer teams that built success over time to those that purchased superstars

LAWRENCE -- When a franchise buys a superstar like Tom Brady or LeBron James, the team tends to win more games. But do the fans follow? How much team loyalty is purchased along with an expensive star? Maybe not as much as some owners might hope -- in the NBA Finals between the Miami Heat and San Antonio Spurs, many fans expressed their dislike of the "bought" Miami team.

In a new paper published in the peer-reviewed Journal of Applied Social Psychology, researchers at the University of Kansas asked over 1,500 Americans how much they liked teams that purchased excellence and compared that with liking teams that built excellence from the ground up.

"People reliably preferred the 'built' teams and slighted the 'bought' teams," said lead author Omri Gillath, professor of psychology at KU. "This was true of sports teams -- even if they didn't know them, such as New Zealand rugby teams -- and work teams such as a squad of lawyers."

In each of five studies, people were more willing to root for teams built over time than for those assembled through free agency and deep-pocketed owners. They likewise preferred teams of lawyers built with time and patience over those that invited in celebrities to shine and impress.

This preference is reliable and strong. What makes the difference?

"Fans appreciate the effort and commitment required to build a team from the ground up," said co-author Christian Crandall, professor of psychology at KU. "Hard work is a central American value, and it certainly applies to work and sports -- everyone loves a winner, but even more so when the backstory is based on perspiration and determination."

Potential fans expected a gradually built team to show more teamwork and to operate together more smoothly. That cohesion was a plus, but it mattered less than seeing the hard work that went into building the team.

Fans prefer the teams that develop their players, invest in them and cultivate their skills.

"This explains part of the appeal of winning teams and part of the appeal of faithful fans of teams that work, struggle and manage to eke out just a few wins each season," Gillath said. "It could also help explain people's endorsement of Cinderella teams in competitions like March Madness. People love to see 'unknown' teams who work hard beat highly ranked teams with top one-and-done recruits."

Credit: 
University of Kansas

Nanobiomaterial boosts neuronal growth in mice with spinal cord injuries

image: Researchers from the Department of Orthopedics of Tongji Hospital have successfully used a nanobiomaterial called layered double hydroxide (LDH) to inhibit the inflammatory environment surrounding spinal cord injuries in mice, accelerating regeneration of neurons and reconstruction of the neural circuit in the spine.

Image: 
Liming Cheng, Rongrong Zhu, Department of Orthopedics, Tongji Hospital of Tongji University

Researchers from the Department of Orthopedics of Tongji Hospital at Tongji University in Shanghai have successfully used a nanobiomaterial called layered double hydroxide (LDH) to inhibit the inflammatory environment surrounding spinal cord injuries in mice, accelerating regeneration of neurons and reconstruction of the neural circuit in the spine. The researchers were also able to identify the underlying genetic mechanism by which LDH works. This understanding should allow further modification of the therapy which, in combination with other elements, could finally produce a comprehensive, clinically applicable system for spinal cord injury relief in humans.

The research appears in the American Chemical Society journal ACS Nano on February 2.

There is no effective treatment for spinal cord injuries, which are always accompanied by the death of neurons, breakage of axons (nerve fibers), and inflammation. Even though new neural stem cells continue to be generated by the body, this inflammatory microenvironment (the immediate, small-scale conditions at the injury site) severely hinders regeneration of neurons and axons. Worse still, the prolonged activation of immune cells in this area also results in secondary lesions of the nervous system, in turn preventing the stem cells from differentiating into new cell types.

If this aggressive immune response at the injury site could be moderated, there is the possibility that neural stem cells could begin differentiation and neural regeneration could occur.

In recent years, a raft of novel nano-scale biomaterials -- natural or synthetic materials that interact with biological systems -- have been designed to assist the activation of neural stem cells, along with their mobilization and differentiation. Some of these "nanocomposites" are capable of delivering drugs to the injury site and accelerating neuronal regeneration. These nanocomposites are especially attractive for spinal cord treatment due to their low toxicity. However, few have any ability to inhibit or moderate the immune reaction at the site, and so they do not tackle the underlying problem. Moreover, the underlying mechanisms of how they work remain unclear.

Nanolayered double hydroxide (LDH) is a kind of clay with many interesting biological properties relevant to spinal cord injuries, including good biocompatibility (ability to avoid rejection by the body), safe biodegradation (breakdown and removal of the molecules after application), and excellent anti-inflammatory capability. LDH has already been widely explored in biomedical engineering with respect to immune response regulation, but mainly in the field of anti-tumor therapy.

"These properties made LDH a really promising candidate for the creation of a much more bene?cial microenvironment for spinal cord injury recovery," says Rongrong Zhu of the
Department of Orthopedics of Tongji Hospital, first author of the study.

Under the leadership of Liming Cheng, corresponding author of the study, the research team transplanted the LDH into the injury site of mice and found that the nanobiomaterial significantly accelerated neural stem cell migration, neural differentiation, activation of channels for neuron excitation, and induction of action potential (nerve impulse) activation. The mice were also found to enjoy significantly improved locomotor behavior compared to the control group of mice. In addition, when the LDH was combined with Neurotrophin-3 (NT3), a protein that encourages the growth and differentiation of new neurons, the mice enjoyed even better recovery than with the LDH on its own. In essence, the NT3 boosts neuronal development while the LDH creates an environment in which that development is allowed to thrive.

Then, via transcriptional profiling, or analysis of gene expression of thousands of genes at once, the researchers were able to identify how the LDH performs its assistance. They found that once LDH is attached to cell membranes, it provokes greater activation of the "transforming growth factor-β receptor 2" (TGFBR2) gene, decreasing production of the white blood cells that enhance inflammation and increasing production of the white blood cells that inhibit inflammation. Upon application of a chemical that inhibits TGFBR2, they found the beneficial effects were reversed.

The understanding of how LDH performs these effects should now allow the researchers to tweak the therapy to enhance its performance and, by combining these biomaterials with neurotrophic factors like NT3, to finally create a comprehensive therapeutic system for spinal cord injuries that can be used clinically in people.

Credit: 
Department of Orthopedics, Tongji Hospital of Tongji University

New evidence in search for the mysterious Denisovans

image: Replica of the Sangiran 17 Homo erectus cranium from Java.

Image: 
Photo supplied by the Trustees of the Natural History Museum.

An international group of researchers led by the University of Adelaide has conducted a comprehensive genetic analysis and found no evidence of interbreeding between modern humans and the ancient humans known from fossil records in Island Southeast Asia. They did find further DNA evidence of our mysterious ancient cousins, the Denisovans, which could mean there are major discoveries to come in the region.

In the study published in Nature Ecology and Evolution, the researchers examined the genomes of more than 400 modern humans to investigate the interbreeding events between ancient humans and modern human populations who arrived at Island Southeast Asia 50,000-60,000 years ago.

In particular, they focused on detecting signatures that suggest interbreeding from deeply divergent species known from the fossil record of the area.

The region contains one of the world's richest fossil records documenting human evolution, extending back at least 1.6 million years. Currently, three distinct ancient humans are recognised from the fossil record in the area: Homo erectus, Homo floresiensis (known as the Flores Island "hobbits") and Homo luzonensis.

These species are known to have survived until approximately 50,000-60,000 years ago in the cases of Homo floresiensis and Homo luzonensis, and until approximately 108,000 years ago for Homo erectus, which means they may have overlapped with the arrival of modern human populations.

The results of the study showed no evidence of interbreeding. Nevertheless, the team were able to confirm previous results showing high levels of Denisovan ancestry in the region.

Lead author and ARC Research Associate from the University of Adelaide Dr João Teixeira, said: "In contrast to our other cousins the Neanderthals, which have an extensive fossil record in Europe, the Denisovans are known almost solely from the DNA record. The only physical evidence of Denisovan existence has been a finger bone and some other fragments found in a cave in Siberia and, more recently, a piece of jaw found in the Tibetan Plateau."

"We know from our own genetic records that the Denisovans mixed with modern humans who came out of Africa 50,000-60,000 years ago both in Asia, and as the modern humans moved through Island Southeast Asia on their way to Australia.

"The levels of Denisovan DNA in contemporary populations indicates that significant interbreeding happened in Island Southeast Asia.

"The mystery then remains, why haven't we found their fossils alongside the other ancient humans in the region? Do we need to re-examine the existing fossil record to consider other possibilities?"

Co-author Chris Stringer of the Natural History Museum in London added:

"While the known fossils of Homo erectus, Homo floresiensis and Homo luzonensis might seem to be in the right place and time to represent the mysterious 'southern Denisovans', their ancestors were likely to have been in Island Southeast Asia at least 700,000 years ago. Meaning their lineages are too ancient to represent the Denisovans who, from their DNA, were more closely related to the Neanderthals and modern humans."

Co-author Prof Kris Helgen, Chief Scientist and Director of the Australian Museum Research Institute, said: "These analyses provide an important window into human evolution in a fascinating region, and demonstrate the need for more archaeological research in the region between mainland Asia and Australia."

Helgen added: "This research also illuminates a pattern of 'megafaunal' survival which coincides with known areas of pre-modern human occupation in this part of the world. Large animals that survive today in the region include the Komodo Dragon, the Babirusa (a pig with remarkable upturned tusks), and the Tamaraw and Anoas (small wild buffalos).

"This hints that long-term exposure to hunting pressure by ancient humans might have facilitated the survival of the megafaunal species in subsequent contacts with modern humans. Areas without documented pre-modern human occurrence, like Australia and New Guinea, saw complete extinction of land animals larger than humans over the past 50,000 years."

Dr Teixeira said: "The research corroborates previous studies that the Denisovans were in Island Southeast Asia, and that modern humans did not interbreed with more divergent human groups in the region. This opens two equally exciting possibilities: either a major discovery is on the way, or we need to re-evaluate the current fossil record of Island Southeast Asia."

"Whichever way you choose to look at it, exciting times lie ahead in palaeoanthropology."

Credit: 
University of Adelaide

Total knee replacement a cost-effective treatment for patients with knee osteoarthritis

Knee osteoarthritis is a painful condition that affects over 14 million U.S. adults, many of whom have extreme obesity, defined as a body mass index (BMI) greater than 40 kg/m2. Total knee replacement (TKR) is often recommended to treat advanced knee osteoarthritis, but surgeons may be hesitant to operate on patients with extreme obesity due to concerns about increased risks of tissue infection, poor wound healing and implant failure. Using an established, validated and widely published computer simulation called the Osteoarthritis Policy (OAPol) Model, researchers from Brigham and Women's Hospital, together with collaborators from the Yale and Boston University Schools of Medicine, quantified the tradeoff between the benefits and adverse events, taking into consideration the costs of forgoing versus pursuing TKR. They found that across older and younger age groups, TKR is a cost-effective treatment for these patients. Findings are published in Annals of Internal Medicine.

"People with extreme obesity experience substantial pain reduction from TKR, leading to meaningful improvements in quality-adjusted life expectancy. High BMI should not serve as a barrier for people seeking this procedure," said corresponding author Elena Losina, PhD, a founding director of the Policy and Innovation eValuation in Orthopedic Treatments Center and a co-director of the Brigham's Orthopedic and Arthritis Center for Outcomes Research. "From a health policy perspective, this operation offers a very good value for the dollars spent."

The computer model used by the researchers, OAPol, combines clinical and economic data from national datasets to forecast the clinical course of patients who decide to receive or forgo TKR. In the model, each treatment choice is associated with benefits (improvements in pain leading to better quality of life), drawbacks (surgical complications, continuing pain that reduces quality of life) and costs. The model tallies the data from a large number of patients to calculate an incremental cost-effectiveness ratio of TKR, expressed as dollars per quality-adjusted life year (QALY) gained. The researchers reported favorable cost-effectiveness ratios of $35,200 and $54,100 per QALY for patients younger and older than 65 years, respectively. The researchers noted that most patients with extreme obesity and advanced knee osteoarthritis considering TKR are in the younger age range. These data, they suggest, may help to diminish concerns regarding the value of TKR in these patients.
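For readers unfamiliar with the metric, the incremental cost-effectiveness ratio (ICER) reported here follows the standard health-economics definition; the general form below is the textbook formulation, not a detail taken from the OAPol model itself:

$$\mathrm{ICER} = \frac{C_{\mathrm{TKR}} - C_{\mathrm{no\,TKR}}}{E_{\mathrm{TKR}} - E_{\mathrm{no\,TKR}}} \qquad \left[\frac{\$}{\text{QALY gained}}\right]$$

where $C$ denotes lifetime costs and $E$ denotes quality-adjusted life expectancy under each strategy; a lower ratio means better value for the dollars spent.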

"Instead of questioning whether or not to do surgery for people with extreme obesity, the conversation should be about how to accommodate these patients and provide accurate information about what to expect post-surgery," Losina said. "Ultimately, this study raises the question of how to do the operation in a way that addresses all of the challenges that may arise. This is a discussion that should take place between individual patients and physicians, discussing all the risks, complications, and benefits as well as considerations of operating room accommodations that would optimize the work of orthopedic surgeons performing TKR in these patients."

Credit: 
Brigham and Women's Hospital

Fungal species causing candidiasis use distinct infection strategies

image: Level of mitochondrial gene expression and interferon response in human epithelial cells in response to infection with the four Candida species. Increased expression of mitochondrial genes and interferon response was only achieved after direct contact with viable Candida cells, but not when they were inactivated or viable but unable to establish direct contact.

Image: 
IRB Barcelona

Candidiasis is a fungal infection caused by a yeast called Candida. It is a serious global health problem, and it can be vaginal, oral or systemic. The latter is the most severe form of infection, as it can lead to death, but vaginal candidiasis is the most prevalent, affecting 80% of women at some point in their lives.

Scientists led by Dr. Toni Gabaldón, ICREA researcher and group leader at the Institute for Research in Biomedicine (IRB Barcelona) and the Barcelona Supercomputing Center (BSC), in collaboration with Dr. Bernhard Hube's group at the Hans Knoell Institute and the University of Jena in Germany, have described the various mechanisms used by the fungus Candida to infect the epithelium of the vagina and how human cells respond.

Candidiasis is caused by several species of fungus, and the study focused on the four species that cause 90% of cases: Candida albicans, C. glabrata, C. parapsilosis, and C. tropicalis. The researchers observed that each species has a distinct infection pattern. However, vaginal epithelial cells respond in the same way to the different species, and the response is modulated according to the severity of the infection as it progresses.

"Understanding the infection processes and the interaction of the fungus with epithelial cells could contribute to the search for a treatment that anticipates the defence response, making it more effective," explains Dr. Gabaldón, head of the Comparative Genomics laboratory at IRB Barcelona and BSC.

A computational study of genetic patterns of infection and defence

Dr. Gabaldón's group has focused on the computational analysis of gene expression patterns of the different Candida species during infection. They have quantified, analysed, and compared the genes that are activated and those that remain silent in both human and fungal cells when infection by the different species of Candida begins.

"Knowing the genetic pattern that corresponds to the pathological process would allow work on a kit to detect infection," explains Hrant Hovhannisyan, co-first author of the study and PhD student of the Comparative Genomics laboratory.

Cell defence based on mitochondrial DNA

Analysis of the genetic defence response of human vaginal epithelial cells revealed that it was based mainly on the action of mitochondria, the energy organelles of the cell. The researchers found that the mitochondria, and even the cells themselves, release their DNA, thereby sending a signal to the immune system cells responsible for neutralising the infection.

"This system had previously been observed as a defence against viral infections and also some bacteria, but it is the first time it has been detected in response to a fungal infection," explains Dr. Gabaldón.

This work was carried out within the framework of an Innovative Training Networks (ITN) project funded within the Marie Skłodowska-Curie Actions call of the European Commission, involving ten research groups from several European countries. "The excellence of the partners and the good environment for collaboration fostered by this type of project have allowed us to develop research with high added value," says Dr. Gabaldón, who is also the project coordinator.

Credit: 
Institute for Research in Biomedicine (IRB Barcelona)

New genetic links found to rare eye disease

image: An electron microscopy image of human retinal cells, which were analyzed in a new study on a rare eye disease known as MacTel. The cells were created using induced pluripotent stem cell technology.

Image: 
Kevin Eade, Scott Henderson and Kimberly Vanderpool

LA JOLLA, CA--An analysis of thousands of genomes from people with and without the rare eye disease known as MacTel has turned up more than a dozen gene variants that are likely causing the condition to develop and worsen for a significant share of patients.

The discovery, by a team of scientists from Scripps Research and the Lowy Medical Research Institute, in collaboration with Columbia University in New York and UC San Diego, provides a new avenue to pursue for diagnosis and treatment. It also sheds light on fundamental aspects of metabolism in the retina, a tissue with one of the highest energy demands in the human body. Findings appear today in the journal Nature Metabolism.

"It's exciting to uncover new answers to the many questions surrounding this rare and complex eye disease," says Martin Friedlander, MD, PhD, professor at Scripps Research and president of the Lowy Medical Research Institute in La Jolla. "Although we've known that MacTel has a genetic component, the precise variants had remained elusive. These findings will serve as a springboard for further scientific investigation and as a guide to potential therapeutic targets."

MacTel, short for "macular telangiectasia type 2," is a progressive and debilitating eye disease that occurs in roughly one out of 5,000 people, or about 2 million people worldwide. A disease of the retina, the light-sensing tissue at the back of the eye, MacTel causes a gradual deterioration of central vision, interfering with critical tasks such as reading and driving.

Piecing together the puzzle

For more than 15 years, scientists have collaborated in an international effort, the MacTel Project, to find the cause of MacTel and develop treatments. Earlier studies found patients had low levels of an amino acid called serine in their bloodstream. However, while serine is essential for many biological processes, it was not known at that time to affect eye health.

In 2019, Friedlander and colleagues found the connection. They made a breakthrough discovery that low serine levels were responsible for a buildup of toxic lipids, which cause photoreceptor cells to die. But still, many questions remained. For example, what caused the decline in serine? The investigation continued.

In the new study, Rando Allikmets, PhD, of Columbia University, used an alternative approach to find genetic drivers of disease. Instead of assessing individual mutations in genes, he and his team analyzed groups of mutations, giving them a greater ability to identify disease-causing genes in a small population of people with MacTel.

One gene, PHGDH, had significantly more variants in MacTel patients than those without the disease. The team identified 22 rare variants in PHGDH which, together, account for approximately 3 to 4 percent of MacTel cases. Many more variants likely exist but haven't been found yet--a challenge considering the small patient population with diverse genetic causes.

Implications for other diseases

PHGDH is a key enzyme that enables the body to make serine, and these studies provided the long-sought link to low serine observed in MacTel patients. Its function is essential for the health of neurons in the eye and elsewhere in the body.

Several of the gene variants identified in the study are known to cause rare, severe neuropathies when both of the alleles, or copies of the gene, are affected. In the case of MacTel, only a single allele is affected, resulting in a partial loss of the enzyme function, which leads to retinal degeneration.

Many of these variants in PHGDH were identified for the first time and were predicted to cause defects in the gene, and Friedlander's group worked to confirm this. They directly tested whether each of the many variants identified in MacTel patients was actually harmful to PHGDH function, and found that they were.

"The PHGDH gene is essential for the production of serine, which plays a central role in cellular metabolism," says Kevin Eade, PhD, a former Scripps Research postdoctoral associate who is a senior scientist at the Lowy Medical Research Institute. "Through genetic analyses and experiments in human-derived retinal tissue, we were able confirm that even a partial loss of PHGDH function can have a damaging effect on the retina."

The scientists then used human induced pluripotent stem cells to generate specialized retinal cells that contained one of the MacTel-associated PHGDH mutations. They found that a PHGDH mutation in these cells leads to the production of a toxic lipid previously shown to cause MacTel.

Vision for a therapy

By more fully understanding the role of PHGDH in MacTel, scientists are hopeful they can begin to map out potential treatment approaches.

"We still have much to learn about this rare disease, including why systemic changes in serine metabolism leads to retinal degeneration," says Marin Gantner, PhD, senior staff scientist at the Lowy Medical Research Institute and doctoral graduate of Scripps Research. "We have come such a long way in identifying links to the disease, and every step provides new insights that can be leveraged in creating a therapy."

Credit: 
Scripps Research Institute

Study reveals plunge in lithium-ion battery costs

The cost of the rechargeable lithium-ion batteries used for phones, laptops, and cars has fallen dramatically over the last three decades, and has been a major driver of the rapid growth of those technologies. But attempting to quantify that cost decline has produced ambiguous and conflicting results that have hampered attempts to project the technology's future or devise useful policies and research priorities.

Now, MIT researchers have carried out an exhaustive analysis of the studies that have looked at the decline in the prices of these batteries, which are the dominant rechargeable technology in today's world. The new study looks back over three decades, including analyzing the original underlying datasets and documents whenever possible, to arrive at a clear picture of the technology's trajectory.

The researchers found that the cost of these batteries has dropped by 97 percent since they were first commercially introduced in 1991. This rate of improvement is much faster than many analysts had claimed and is comparable to that of solar photovoltaic panels, which some had considered to be an exceptional case. The new findings are reported today in the journal Energy and Environmental Science, in a paper by MIT postdoc Micah Ziegler and Associate Professor Jessika Trancik.
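As a rough back-of-the-envelope check of what a 97 percent drop over roughly three decades implies, the short calculation below is illustrative arithmetic based only on the figures quoted above; it is not a rate taken from the paper itself, whose year-by-year estimates may differ.

```python
# Implied average annual cost decline if prices fell 97% between 1991 and 2021.
# Illustrative only: derived from the headline figures in this article.
years = 2021 - 1991                  # ~30 years since commercial introduction
remaining_fraction = 1 - 0.97        # 3% of the 1991 price remains
annual_factor = remaining_fraction ** (1 / years)
print(f"implied average annual decline: {1 - annual_factor:.1%}")  # ~11.0%
```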

While it's clear that there have been dramatic cost declines in some clean-energy technologies such as solar and wind, Trancik says, when they started to look into the decline in prices for lithium-ion batteries, "we saw that there was substantial disagreement as to how quickly the costs of these technologies had come down." Similar disagreements showed up in tracing other important aspects of battery development, such as the ever-improving energy density (energy stored within a given volume) and specific energy (energy stored within a given mass).

"These trends are so consequential for getting us to where we are right now, and also for thinking about what could happen in the future," says Trancik, who is an associate professor in MIT's Institute for Data, Systems and Society. While it was common knowledge that the decline in battery costs was an enabler of the recent growth in sales of electric vehicles, for example, it was unclear just how great that decline had been. Through this detailed analysis, she says, "we were able to confirm that yes, lithium-ion battery technologies have improved in terms of their costs, at rates that are comparable to solar energy technology, and specifically photovoltaic modules, which are often held up as kind of the gold standard in clean energy innovation."

It may seem odd that there was such great uncertainty and disagreement about how much lithium-ion battery costs had declined, and what factors accounted for it, but in fact much of the information is in the form of closely held corporate data that is difficult for researchers to access. Most lithium-ion batteries are not sold directly to consumers -- you can't run down to your typical corner drugstore to pick up a replacement battery for your iPhone, your PC, or your electric car. Instead, manufacturers buy lithium-ion batteries and build them into electronics and cars. Large companies like Apple or Tesla buy batteries by the millions, or manufacture them themselves, for prices that are negotiated or internally accounted for but never publicly disclosed.

In addition to helping to boost the ongoing electrification of transportation, further declines in lithium-ion battery costs could potentially also increase the batteries' usage in stationary applications as a way of compensating for the intermittent supply of clean energy sources such as solar and wind. Both applications could play a significant role in helping to curb the world's emissions of climate-altering greenhouse gases. "I can't overstate the importance of these trends in clean energy innovation for getting us to where we are right now, where it starts to look like we could see rapid electrification of vehicles and we are seeing the rapid growth of renewable energy technologies," Trancik says. "Of course, there's so much more to do to address climate change, but this has really been a game changer."

The new findings are not just a matter of retracing the history of battery development, but of helping to guide the future, Ziegler points out. Combing through all of the published literature on the subject of the cost reductions in lithium-ion cells, he found "very different measures of the historical improvement. And across a variety of different papers, researchers were using these trends to make suggestions about how to further reduce costs of lithium-ion technologies or when they might meet cost targets." But because the underlying data varied so much, "the recommendations that the researchers were making could be quite different." Some studies suggested that lithium-ion batteries would not fall in cost quickly enough for certain applications, while others were much more optimistic. Such differences in data can ultimately have a real impact on the setting of research priorities and government incentives.

The researchers dug into the original sources of the published data, in some cases finding that certain primary data had been used in multiple studies that were later cited as separate sources, or that the original data sources had been lost along the way. And while most studies have focused only on the cost, Ziegler says it became clear that such a one-dimensional analysis might underestimate how quickly lithium-ion technologies improved; in addition to cost, weight and volume are also key factors for both vehicles and portable electronics. So, the team added a second track to the study, analyzing the improvements in these parameters as well.

"Lithium-ion batteries were not adopted because they were the least expensive technology at the time," Ziegler says. "There were less expensive battery technologies available. Lithium-ion technology was adopted because it allows you to put portable electronics into your hand, because it allows you to make power tools that last longer and have more power, and it allows us to build cars" that can provide adequate driving range. "It felt like just looking at dollars per kilowatt-hour was only telling part of the story," he says.

That broader analysis helps to define what may be possible in the future, he adds: "We're saying that lithium-ion technologies might improve more quickly for certain applications than would be projected by just looking at one measure of performance. By looking at multiple measures, you get essentially a clearer picture of the improvement rate, and this suggests that they could maybe improve more rapidly for applications where the restrictions on mass and volume are relaxed."

Trancik adds the new study can play an important role in energy-related policymaking. "Published data trends on the few clean technologies that have seen major cost reductions over time, wind, solar, and now lithium-ion batteries, tend to be referenced over and over again, and not only in academic papers but in policy documents and industry reports," she says. "Many important climate policy conclusions are based on these few trends. For this reason, it is important to get them right. There's a real need to treat the data with care, and to raise our game overall in dealing with technology data and tracking these trends."

Credit: 
Massachusetts Institute of Technology

A simple laser for quantum-like classical light

image: A simple laser comprising just two standard mirrors was used to create higher-dimensional classically entangled light, a new state of the art, deviating from the prevailing paradigm of two-dimensional Bell states. The approach combines internal generation, in-principle unlimited in what can be created, with external control, allowing user-defined states to be molded. Shown here are examples of two-dimensional Bell (left) and high-dimensional states (right), including the famous GHZ states.

Image: 
by Yijie Shen, Isaac Nape, Xilin Yang, Xing Fu, Mali Gong, Darryl Naidoo and Andrew Forbes

Tailoring light is much like tailoring cloth, cutting and snipping to turn a bland fabric into one with some desired pattern. In the case of light, the tailoring is usually done in the spatial degrees of freedom, such as its amplitude and phase (the "pattern" of light) and its polarization, while the cutting and snipping might be done with spatial light modulators and the like. This burgeoning field is known as structured light, and it is pushing the limits of what we can do with light, enabling us to see smaller, focus tighter, image with wider fields of view, probe with fewer photons, and pack information into light for new high-bandwidth communications. Structured light has also been used to test the classical-quantum boundary, pushing the limits of what classical light can do for quantum processes, and vice versa. This has opened the intriguing possibility of creating classical light that has quantum-like properties, as if it were "classically entangled". But how can one create and control such states of light, and how far can one "push the limits"?

The prevailing tools for structuring light from lasers are hindered by the complexity of the specialized lasers needed, often requiring customized geometries and/or elements, while the prevailing two-dimensional paradigm of using only "pattern" and "polarization" means accessing only two-dimensional classically entangled light, mimicking quantum qubits, 1s and 0s. An example of this would be the well-known quantum Bell states, shown in Figure 1 (left), which as classical light appear as vectorial structured light, combining the two degrees of freedom of "pattern" and "polarization". These two degrees of freedom mimic the two dimensions of the qubit quantum state. To create higher dimensions requires finding more degrees of freedom in a system seemingly constrained to just two.

In their paper "Creation and control of high-dimensional multi-partite classically entangled light", Chinese and South African scientists report on how to create arbitrary dimensional quantum-like classical light directly from a laser. They use a very simple laser available in most university teaching laboratories to show eight dimensional classically entangled light, a new world record. They then go on to manipulate and control this quantum-like light, creating the first classically entangled Greenberger-Horne-Zeilinger (GHZ) states, a rather famous set of high-dimensional quantum states, shown in Figure 1.

"Theorists have long suggested all the applications that would be possible with such quantum-like light, but the lack of any creation and control steps has prohibited any progress. Now we have shown how to overcome this hurdle," says Dr. Shen from Tsinghua University (present senior research fellow in University of Southampton), the lead author of the paper.

Traditionally, exotic structured light from lasers requires equally exotic laser systems, either with custom elements (metasurfaces, for example) or custom geometries (topological photonic designs, for example). The laser built by the authors contained only a gain crystal and followed textbook design with just two off-the-shelf mirrors. Their elegant solution is itself built on a principle embedded in quantum mechanics: ray-wave duality. The authors could control both path and polarization inside the laser by a simple length adjustment, exploiting what are called ray-wave duality lasers. According to Prof. Forbes, the project supervisor, "what is remarkable is not only that we could create such exotic states of light, but that their source is as simple a laser as you could possibly imagine, with nothing more than a couple of standard mirrors." The authors realized that the crucial "extra" degrees of freedom were right in front of their eyes, needing only a new mathematical framework to recognize them. The approach allows, in principle, any quantum state to be created by simply marking the wave-like rays that are produced by the laser and then externally controlling them with a spatial light modulator, molding them to shape. In a sense, the laser produces the dimensions needed, while later modulation and control molds the outcome into some desired state. To demonstrate this, the authors produced all the GHZ states, which span an eight-dimensional space.

Because no one had ever created such high-dimensional classically entangled light, the authors had to invent a new measurement approach, translating the tomography of high-dimensional quantum states into a language and technique suitable for its classical light analogue. The result is a new tomography for classically entangled light, revealing its quantum-like correlations beyond the standard two dimensions.

This work provides a powerful approach to creating and controlling high-dimensional classical light with quantum-like properties, paving the way for exciting applications in quantum metrology, quantum error correction and optical communication, as well as in stimulating fundamental studies of quantum mechanics with much more versatile bright classical light.

Credit: 
Light Publishing Center, Changchun Institute of Optics, Fine Mechanics And Physics, CAS

Expressing some doubts about android faces

image: Example of the facial flow lines when two androids (left and middle) and adult males (right) lifted their inner eyebrow.

Image: 
Ishihara, Iwanaga, and Asada, Comparison between the Facial Flow Lines of Androids and Humans, Frontiers in Robotics and AI, 2021

Osaka, Japan - Researchers from the Graduate School of Engineering and Symbiotic Intelligent Systems Research Center at Osaka University used motion capture cameras to compare the expressions of android and human faces. They found that the mechanical facial movements of the robots, especially in the upper regions, did not fully reproduce the curved flow lines seen in the faces of actual people. This research may lead to more lifelike and expressive artificial faces.

The field of robotics has advanced a great deal over the past decades. However, while current androids can appear very humanlike at first, their active facial expressions may still be unnatural and slightly unsettling to us. The exact reasons for this effect have been difficult to pinpoint. Now, a research team at Osaka University has used motion capture technology to monitor the facial expressions of five android faces and compared the results with actual human facial expressions. This was accomplished with six infrared cameras that monitored reflection markers at 120 frames per second and allowed the motions to be represented as three-dimensional displacement vectors.
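To make the "three-dimensional displacement vectors" concrete, here is a minimal sketch of how marker motion can be summarized from motion-capture data. It is illustrative only: the array shapes, marker count, and variable names are assumptions, not the authors' actual processing pipeline.

```python
import numpy as np

# Hypothetical example: 30 reflective facial markers tracked in 3D (x, y, z).
# The cameras in the study sampled at 120 frames per second; here we compare
# just a neutral frame with a peak-expression frame.
neutral = np.random.rand(30, 3)                    # marker positions, neutral face
peak = neutral + np.random.randn(30, 3) * 0.5      # marker positions, peak expression

displacement = peak - neutral                      # 3D displacement vector per marker
magnitude = np.linalg.norm(displacement, axis=1)   # how far each marker moved
print(magnitude.round(2))
```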

"Advanced artificial systems can be difficult to design because the numerous components have complex interactions with each other. The appearance of an android face can experience surface deformations that are hard to control," study first author Hisashi Ishihara says. These deformations can be due to interactions between components such as the soft skin sheet and the skull-shaped structure, as well as the mechanical actuators.

The first difference the team found between the androids and adult males was in their flow lines, especially the eye and forehead areas. These lines tended to be almost straight for the androids but were curved for the human adult males. Another major difference was with the skin surface undulation patterns in the upper part of the face.

"Redesigning the face of androids so that the skin flow pattern resembles that of humans may reduce the discomfort induced by the androids and improve their emotional communication performance," senior author Minoru Asada says. "Future work may help give the android faces the same level of expressiveness as humans have. Each robot may even have its own individual "personality" that will help people feel more comfortable."

Credit: 
Osaka University

Having a single personal doctor may sometimes lead to unnecessary tests

Patient care by a single primary care physician is associated with many health benefits, including increased treatment adherence and decreased hospital admissions and mortality risk. But can the relationship built between doctor and patient also lead to unnecessary care?

A new University of Florida study finds that male patients who have a single general physician were more likely to receive a prostate cancer screening test during a period when the test was not recommended by the US Preventive Services Task Force. The study, which appears in Frontiers in Medicine, is the first to explore whether continuity of care may lead to patients complying with recommendations for low-value or even harmful care for any condition.

"The results show that the trust between a doctor and a patient is a strong bond, but it emphasizes that it is important that physicians practice evidence-based care," said Arch G. Mainous III, PhD, the study's lead author and a professor in the department of health services research, management and policy at the UF College of Public Health and Health Professions. "Patients look to their physician to act in their best interest and so physicians need to take that trust and provide the best care possible."

In 2012, the US Preventive Services Task Force recommended against prescribing the prostate specific antigen, or PSA, test, rating it a grade D test and concluding there is moderate or high certainty the test has no net benefit or that the harms -- including false-positive results, overtreatment and treatment complications -- outweigh the benefits. This recommendation stayed in place until May 2018, when the task force upgraded the PSA test to a grade C, one that is selectively offered based on a physician's professional judgment and patient preferences. While use of the PSA test declined between 2012 and 2018, a significant portion of men continued to receive the test during that period.

For the study, the UF team analyzed data from the Behavioral Risk Factor Surveillance System, a nationwide system of health-related telephone surveys that collects data about Americans' health-related risk behaviors, chronic health conditions and use of preventive services. The team looked at data from 2016, four years after the task force's recommendation against PSA tests.

They evaluated responses from men ages 40 and older with no symptoms or family history of prostate cancer. Survey questions asked if participants had a single personal doctor, if they had ever received a PSA test and what recommendations or advice they received about PSA tests from health care professionals.

Among the 232,548 men who responded to the questions, nearly 40% reported receiving a PSA test during the timeframe when it was not recommended. Having a single personal doctor was associated with discussion of both the advantages and disadvantages of PSA tests, but also with a recommendation to receive a PSA test.

"The results do not suggest that interpersonal continuity with one regular doctor is not important. Quite the contrary, these results reinforce the power of that relationship," said Mainous, also vice chair for research in the UF College of Medicine's department of community health and family medicine. "The patient-physician relationship and trust in one's physician is critical in providing care, but responsibility falls to physicians to provide the best high-quality care."

Credit: 
Frontiers

Antioxidant-primed stem cells show promise in repairing bone damaged by radiation

image: Ferulic acid is an antioxidant originally identified in fruits, vegetables, and Chinese medicinal herbs. A study from China highlighted the role of ferulic acid in protecting skeletal stem cells from irradiation and promoting tissue regeneration. The figure was prepared by Pei-Lin Li, M.D., and Heng Zhu, M.D., Ph.D.

Image: 
AlphaMed Press

Durham, NC - The standard of treatment for bone tumors is often two-fold: surgery to remove the cancerous section followed by radiation therapy to ensure all the cancerous cells have been killed off. This is an effective way to defeat bone tumors; however, it often results in large bone defects and hampers wound healing because of the extensive tissue removal and irradiation-induced tissue damage. A new study published in STEM CELLS Translational Medicine demonstrates how stem cells primed with ferulic acid can repair such bone damage and how this occurs. The information this study provides could aid in the development of new treatments for irradiated bone injuries.

Heng Zhu, M.D., Ph.D., of the Beijing Institute of Radiation Medicine (BIRM), and Li Ding, M.D., Ph.D., of the Air Force Medical Center in the People's Republic of China, were co-corresponding authors of the paper. BIRM colleagues Jia-Wu Liang, M.D., and Pei-Lin Li, M.D., were co-first authors.

"One prime concern of radiation therapy is that it impairs the 'stemness' of skeletal stem cells (SSCs), meaning that it affects their ability to self-renew and differentiate, which are assets critical to bone regeneration and repair," said Dr. Zhu. "Despite this, little information is available regarding the change in SSC stemness after irradiation and the related underlying regulatory factors. This limits our level of understanding regarding SSC-based bone regeneration."

While seeking a way to restrict radiation's harmful effects on SSCs, Dr. Zhu and his team became interested in ferulic acid (FA). FA is a potent antioxidant commonly found in fruits and vegetables that exhibits strong anti-inflammatory properties and has been widely applied in the prevention of cardiovascular disease, diabetes, cancer and more. It is also known to alleviate radiation-induced stem cell damage.

"Given the fundamental role of SSCs in bone regeneration and the potential role of FA in irradiation protection, we hypothesized that FA combined with SSCs might be effective in reconstructing irradiated bone defects. We explored this idea using an in vitro cell model and an in vivo animal model. Moreover, the cellular and molecular mechanisms underlying the protective effects of FA on SSCs were also investigated," Dr. Ding said.

For the in vitro phase of their study, the researchers seeded irradiated and nonirradiated SSCs onto culture plates and then added FA at different concentrations to determine its potential effects on SSC proliferation, differentiation and self-renewal.

To explore the repair capacity of SSCs in irradiated bone defects in an animal model, the hind limbs and lower abdomens of mice were locally irradiated and bone defect surgeries were conducted an hour later. Nonirradiated mice that underwent surgery served as bone defect controls. Microcryogels seeded with either FA-primed SSCs or SSCs alone were then injected into the bone defect areas of the irradiated mice. The same concentration of microcryogel without SSCs was used in another group of irradiated animals as a control. Finally, to study the effects of FA on bone repair, FA solution was injected into irradiated bone defects.

The results were then evaluated at intervals of one, two and three weeks post-radiation. They showed that FA significantly rescued the radiation-induced impairment of SSCs by activating the p38/MAPK and ERK/MAPK pathways, which are chains of proteins in cells that communicate signals from receptors on a cell's surface to its DNA. As such, they play an important role in proliferation, differentiation, development, transformation, apoptosis and other complex cellular programs.

Moreover, the FA substantially enhanced the bone repair effects of SSCs in the irradiated bone defect mice.

"This work unveils that FA promotes the maintenance of skeletal stem cells' stemness after irradiation. It also defined activation of the p38/MAPK and ERK/MAPK pathways as the molecular underlying mechanism governing FA activity," said Dr. Liang "Notably, our data shows that SSC is very sensitive to irradiation and exhibits impaired stemness, which adds new information to understanding bone osteoporosis and hindered bone repair post-radiation."

"While several studies previously published have shown that FA can enhance and improve stem cell self-renewal and that its activity is mediated through activation of p38/MAPK and ERK/MAPK pathways, ours is the first to demonstrate that FA enhances the bone repair effects of SSCs," Dr. Li added. "As such, we believe that targeting SSCs has potential as a novel strategy in treating irradiated bone injury."

"In the current study, promising data demonstrates that stem cells combined with ferulic acid, a potent antioxidant agent, enhanced the bone repair effects of skeletal stem cells in an irradiated bone defect," said Anthony Atala, M.D., Editor-in-Chief of STEM CELLS Translational Medicine and Director of the Wake Forest Institute for Regenerative Medicine. "These findings could someday lead to new treatment standards for bone tumors."

Credit: 
AlphaMed Press

United States ranks lowest in overall policies to help parents support children

image: Baylor University sociologist Matthew A. Andersson, Ph.D.

Image: 
Baylor University photo

National work-family policies that give lower-income families more time together while allowing them paid time off are more effective for children's psychological health than cash transfers, according to a study of developed nations led by Baylor University.

In a study of about 200,000 children in 20 developed nations, the United States ranked lowest in overall policies aimed at helping parents support children.

The study, published in the journal Social Forces, supports the view of critics who say that the United States government does not do enough to mandate flexible hours and paid leave.

"Perhaps not surprisingly, the United States also shows some of the largest gaps in overall health between rich and poor children," said lead author Matthew A. Andersson, Ph.D., assistant professor of sociology at Baylor University. "Our study argues that these two phenomena are connected across nations. And as national work-family policy improves, inequalities in children's health lessen between advantaged and disadvantaged families."

A significant gap exists between the well-being of children from richer and poorer families, and the research found that children's self-rated health improves as federally mandated work flexibility and paid leave become more generous.

Cash transfers such as policies subsidizing child care and providing income support are important in fighting hunger and homelessness in individual families, Andersson noted.

"But cash transfers only give money," he said. "Paid parental leave and work flexibility gives parents time plus money -- and that seems to be the key combination to combating inequalities in children's health across developed nations."

But parents at the lower end of the socioeconomic spectrum often work in jobs that do not entitle them to schedule flexibility or time off to spend with their children, and they may not be able to negotiate those benefits with employers.

"Sadly, it's often not even a matter of negotiation," Andersson said. "Employers do not offer benefits -- or if they do, these benefits are quite limited. Not surprisingly, advantaged families tend to be attached to higher-quality jobs where benefits are available regardless of federal mandate."

Researchers analyzed data about work-family policies, cash transfer mandates and self-reported child well-being in countries in the Organisation for Economic Co-operation and Development, an intergovernmental organization.

The data included how generous each country was with paid vacation and sick leave, work flexibility, maternity leave and child care, including cash transfers made to families with children.

Researchers also examined data from health surveys of school-aged children ages 11, 13 and 15 to determine whether there were links to psychological health complaints, life satisfaction levels and health. The information was gathered in the Health Behavior in School-Aged Children, a World Health Organization cross-national survey.

Children were asked whether they had their own bedroom, whether their family owned a vehicle, whether they owned a computer or computers and how many times, if any, they had traveled with their family on a holiday in the past year.

Children also answered questions about how often they had felt "low" or experienced "irritability or bad temper" within the past six months, how satisfied they were with their lives and how they rated their health.

"Traditionally, the U.S. has placed burdens for establishing work-family accommodation on employers and individual businesses," Andersson said. "If the U.S. could switch course by mandating work-family benefits regardless of employer, findings suggest that children in this country would show some of the largest health gains across all OECD countries."

He noted that the research also is significant for children's futures, as those starting off in disadvantaged circumstances are at a high risk of having socioeconomic setbacks over many years.

Credit: 
Baylor University

Human fondness, faith in machines grows during pandemic

People are not very nice to machines. The disdain goes beyond the slot machine that emptied your wallet, a dispenser that failed to deliver a Coke or a navigation system that took you on an unwanted detour.

Yet USC researchers report that people affected by COVID-19 are showing more goodwill -- to humans and to human-like autonomous machines.

"The new discovery here is that when people are distracted by something distressing, they treat machines socially like they would treat other people. We found greater faith in technology due to the pandemic and a closing of the gap between humans and machines," said Jonathan Gratch, senior author of the study and director for virtual humans research at the USC Institute for Creative Technologies.

The findings, which appeared recently in the journal iScience, come from researchers at USC, George Mason University and the U.S. Department of Defense.

The scientists noted that, in general, people mostly dispense with social norms of human interaction and treat machines differently. The behavior holds even as machines become more humanlike; think Alexa, the persona in your vehicle navigation system or other virtual assistants. This is because human default behavior is often driven by heuristic thinking -- the snap judgments people use to navigate complex daily interactions.

In studying human-machine interactions, the researchers noted that people impacted by COVID-19 also displayed more altruism both toward other people and to machines.

They showed this using the simple "dictator game," which has been used in other studies as an established method to measure altruism. The scientists selected people who had been adversely affected by COVID-19, based on measurements of stress, and then enrolled them in the roleplaying game -- with a twist. In addition to engaging other people in the exercise, the subjects also engaged computers.
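For readers unfamiliar with the paradigm, the sketch below shows how a dictator-game allocation is typically scored as an altruism measure. It is illustrative only: the endowment size, scoring, and function names are assumptions, not the study's actual protocol.

```python
# Minimal sketch of dictator-game scoring: the "dictator" receives an endowment
# and unilaterally decides how much to transfer to a partner -- in this study,
# the partner could be either another person or a computer agent.
ENDOWMENT = 10  # hypothetical number of credits to allocate

def altruism_score(amount_given: int, endowment: int = ENDOWMENT) -> float:
    """Fraction of the endowment transferred to the partner (0.0 to 1.0)."""
    if not 0 <= amount_given <= endowment:
        raise ValueError("allocation must be between 0 and the endowment")
    return amount_given / endowment

# Example: a participant keeps 6 credits and gives 4 to a computer partner.
print(altruism_score(4))  # 0.4
```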

Unexpectedly, the people affected by COVID-19 showed the same altruism toward computers and human partners. As the participants were increasingly distracted with coronavirus concerns, they became more compassionate toward machines.

"Our findings show that as people interacted more via machines during the past year, perceptions about the value of technology increased, which led to more favorable responses to machines," Gratch said.

Also, scientific breakthroughs that produced coronavirus vaccines in record time may have renewed faith in technology. The COVID-19 vaccines were developed in less than a year by leading universities and pharma companies worldwide. Such breakthroughs can affect how people respond to technology in general, Gratch explained.

The study findings are consistent with previous research showing that disasters often bring out compassion in people who feel compelled to help. During the COVID-19 pandemic, people grew more dependent on machines to purchase products online, work remotely from home, take classes or obtain personal protective equipment. The results indicate that it may be possible to encourage goodwill toward machines in other ways as well, perhaps through machines that express emotions or cultural cues.

But the study also raises concerns. For example, nefarious programmers could design machines to look and sound more human to gain people's trust and then defraud them.

Credit: 
University of Southern California

Modifying an implant: Dental implant biomaterials

Announcing a new article publication for BIO Integration journal. In this review article the authors Oliver K. Semisch-Dieter, Andy H. Choi and Martin P. Stewart from the University of Technology Sydney, Ultimo, NSW, Australia discuss the use of biomaterials in dental implants.

Biomaterials have become essential for modern implants. A suitable implant biomaterial integrates into the body to perform a key function, whilst minimizing negative immune response. Focusing on dentistry, the use of dental implants for tooth replacement requires a balance between bodily response, mechanical structure and performance, and aesthetics. The authors address the use of biomaterials in dental implants, with significant comparisons drawn between titanium (Ti) and zirconia.

Attention is drawn to optimizing surface modification processes and the additional use of coatings. Alternatives and novel developments are also addressed, including the potential of combining biomaterials to form composites that synergize the benefits of each material.

Article reference: Oliver K. Semisch-Dieter, Andy H. Choi and Martin P. Stewart, Modifying an Implant: A Mini-review of Dental Implant Biomaterials. BIO Integration, 2021, https://doi.org/10.15212/bioi-2020-0034

BIO Integration is a fully open access journal that allows for the rapid dissemination of multidisciplinary views driving the progress of modern medicine.

As part of its mandate to help bring interesting work and knowledge from around the world to a wider audience, BIOI will actively support authors through open access publishing and through waiving author fees in its first years. Also, publication support for authors whose first language is not English will be offered in areas such as manuscript development, English language editing and artwork assistance.

Credit: 
Compuscript Ltd

The astonishing self-organization skills of the brain

image: Culture of neurons with 80 percent inhibitory (red) and 20 percent excitatory (green) neurons. The share of inhibitory neurons is about four times as high as is normal in the brain -- nonetheless, the network proved to be surprisingly stable.

Image: 
Menahem Segal / Editing: Anna Levina

A team of researchers from Tübingen and Israel uncovers how brain structures can maintain function and stable dynamics even in unusual conditions. Their results might lay the foundations for better understanding and treating conditions like epilepsy and autism.

The neurons in our brains are connected with each other, forming small functional units called neural circuits. A neuron that is connected to another one via a synapse can transmit information to the second neuron by sending a signal. This, in turn, might prompt the second neuron to transmit a signal to other neurons in the neural circuit. If that happens, the first neuron is likely an excitatory neuron: one that prompts other neurons to fire. But neurons with the exact opposite task are equally important to the functionality of our brain: inhibitory neurons, which make it less likely that the neurons they are connected to send a signal to others.

The interplay of excitation and inhibition is crucial for normal functionality of neural networks. Its dysregulation has been linked to many neurological and psychiatric disorders, including epilepsy, Alzheimer's disease, and autism spectrum disorders.

From cell cultures in the lab...

Interestingly, the share of inhibitory neurons among all neurons in various brain structures (like the neocortex or the hippocampus) remains fixed throughout the lifetime of an individual at 15 to 30 percent. "This prompted our curiosity: how important is this particular proportion?", recalls Anna Levina, a researcher at Tübingen University and the Max Planck Institute for Biological Cybernetics. "Can neural circuits with a different proportion of excitatory and inhibitory neurons still function normally?" Her collaborators from the Weizmann Institute of Science in Rehovot (Israel) designed a novel experiment that would allow them to answer these questions. They grew cultures that contained different, even extreme, ratios of excitatory and inhibitory neurons.

The scientists then measured the activity of these artificially designed brain tissues. "We were surprised that networks with various ratios of excitatory and inhibitory neurons remained active, even when these ratios were very far from natural conditions", explains Levina's PhD student Oleg Vinogradov. "Their activity does not change dramatically, as long as the share of inhibitory neurons stays somewhere in the range of 10 to 90 percent." It seems that the neural structures have a way of compensating for their unusual composition to remain stable and functional.

...to a theoretical understanding

So naturally, the researchers asked next: what mechanisms allow the brain tissue to adjust to these different conditions? They theorized that the networks adapt by adjusting the number of connections: if there are few inhibitory neurons, each of them has to take on a bigger role by building more synapses with the other neurons. Conversely, if the share of inhibitory neurons is large, the excitatory neurons have to make up for this by establishing more connections.
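To make the arithmetic behind this idea concrete, the toy calculation below is a sketch, not the researchers' actual model; the network size and the target number of inhibitory inputs per neuron are arbitrary assumptions. It shows how many connections each inhibitory neuron would have to make so that every neuron keeps receiving the same average number of inhibitory synapses, whatever the inhibitory fraction.

```python
# Toy back-of-the-envelope calculation: if each neuron should keep receiving
# a fixed number of inhibitory synapses on average, each inhibitory neuron
# must form more connections when inhibitory neurons are scarce and fewer
# when they are abundant. All numbers are arbitrary illustrations.
def synapses_per_inhibitory_neuron(n_neurons, inhibitory_fraction,
                                   target_inhibitory_inputs=50):
    """Connections each inhibitory neuron makes so that every neuron in the
    network still receives `target_inhibitory_inputs` inhibitory synapses."""
    n_inhibitory = inhibitory_fraction * n_neurons
    total_inhibitory_synapses = target_inhibitory_inputs * n_neurons
    return total_inhibitory_synapses / n_inhibitory

for frac in (0.2, 0.5, 0.8):
    print(f"{frac:.0%} inhibitory -> "
          f"{synapses_per_inhibitory_neuron(1000, frac):.1f} synapses each")
# 20% inhibitory -> 250.0 synapses each
# 50% inhibitory -> 100.0 synapses each
# 80% inhibitory -> 62.5 synapses each
```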

The theoretical model of the Tübingen scientists can explain the experimental findings of their colleagues in Rehovot and uncover the mechanisms helping to maintain stable dynamics in the brain. The results provide a clearer picture of how the excitation/inhibition balance is preserved and where it fails in living neural networks. In the longer term, they might be useful for the emerging field of precision medicine: neural cultures derived from induced pluripotent stem cells could be used to probe the mechanisms of neuropsychiatric disorders and to test novel medications.

Credit: 
Max-Planck-Gesellschaft