Culture

Lighting up biology from within

A biochemical reaction in which the enzyme luciferase oxidizes its substrate, luciferin, causes fireflies to glow and is one of the best-known examples of bioluminescence in nature. Now, an international team of researchers led by Elena Goun at the University of Missouri is working to harness the power of bioluminescence in a low-cost, noninvasive portable medical imaging device that could one day be applied to many uses in biomedical research, translational medicine and clinical diagnoses.

Potential uses include developing better treatments for cancer, diabetes and infectious diseases, along with monitoring various metabolic functions, such as gut health, in both animals and humans, said Goun, an associate professor of chemistry in the College of Arts and Science and corresponding author on the study published in Nature Communications.

"This is the first example of a low-cost, portable bioluminescence imaging tool that can be used in large non-transgenic animals such as dogs," Goun said. "The mobility and cost-effectiveness of this technology also makes it a powerful tool for use in many areas of preclinical research, clinical research and diagnostics."

Once the imaging probe is inserted into the body and reaches a targeted internal organ, such as the liver, the level of biological activity, such as liver toxicity, determines the amount of luciferin that is released into the bloodstream. Eventually, it reaches the area of the device, setting off a biochemical reaction that creates light. A portable light detector -- about 10 millimeters, smaller than the diameter of a penny -- is then placed on the surface of the body near the inserted device and measures the intensity of the light. The level of detected light correlates with the amount of luciferin present, which scientists can then use when determining the health of the targeted organ.
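As a rough illustration of that last step, a surface light reading can be mapped back to an estimated luciferin level through a calibration curve. The short Python sketch below uses entirely made-up numbers and a hypothetical linear calibration; it is not the research team's instrument code or data.

```python
import numpy as np

# Hypothetical calibration: photon counts recorded for known luciferin doses.
# These numbers are illustrative only, not measurements from the study.
luciferin_nmol = np.array([0.0, 0.5, 1.0, 2.0, 4.0])
photon_counts = np.array([120, 980, 1850, 3700, 7300])

# Fit a straight line: counts ~ slope * dose + background.
slope, background = np.polyfit(luciferin_nmol, photon_counts, 1)

def estimate_luciferin(counts):
    """Invert the calibration to estimate luciferin from a detector reading."""
    return (counts - background) / slope

# A reading from the surface detector maps back to a luciferin estimate,
# which in turn reflects the level of biological activity in the target organ.
print(round(estimate_luciferin(2600), 2))  # about 1.4 nmol under this toy calibration
```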

Jeffrey Bryan, a professor of veterinary oncology in the College of Veterinary Medicine and a co-author on the study, said this technology will be helpful in a clinical setting -- both in animal and human medicine -- where medical professionals can determine if a treatment is working inside a patient.

"This is a way we can monitor, in a minimally invasive way, a patient's physiological response to whatever treatment is administered to him or her," said Bryan, who is also an associate director of comparative oncology at MU's Ellis Fischel Cancer Center. "Right now, most of the time we are looking for responses to treatment by asking the patient how they feel and then doing big, invasive, expensive tests to see if the treatment is working. Sometimes, that requires multiple procedures. But, if we can monitor for the desired effect in a minimally invasive manner and continue monitoring the progress over a long time period with this technology, that would probably reduce the need for more invasive testing."

In addition to the diagnostic testing benefits of this technology, Goun said their approach could have the potential to significantly reduce the number of dogs, cats and non-human primates being used for experimental testing purposes by commercial drug development companies.

"Portable bioluminescent platform for in vivo monitoring of biological processes in non-transgenic animals," was published in Nature Communications.

Credit: 
University of Missouri-Columbia

Newly described horned dinosaur from New Mexico was the earliest of its kind

image: With a frilled head and beaked face, Menefeeceratops sealeyi, discovered in New Mexico, lived 82 million years ago. It predated its better-known relative, Triceratops.

Image: 
Sergey Krasovskiy

A newly described horned dinosaur that lived in New Mexico 82 million years ago is one of the earliest known ceratopsid species, a group known as horned or frilled dinosaurs. Researchers reported their find in a publication in the journal PalZ (Paläontologische Zeitschrift).

Menefeeceratops sealeyi adds important information to scientists' understanding of the evolution of ceratopsid dinosaurs, which are characterized by horns and frills, along with beaked faces. In particular, the discovery sheds light on the centrosaurine subfamily of horned dinosaurs, of which Menefeeceratops is believed to be the oldest member. Its remains offer a clearer picture of the group's evolutionary path before it went extinct at the end of the Cretaceous.

Steven Jasinski, who recently completed his Ph.D. in Penn's Department of Earth and Environmental Science in the School of Arts & Sciences, and Peter Dodson of the School of Veterinary Medicine and Penn Arts & Sciences, collaborated on the work, which was led by Sebastian Dalman of the New Mexico Museum of Natural History and Science. Spencer Lucas and Asher Lichtig of the New Mexico Museum of Natural History and Science in Albuquerque were also part of the research team.

"There has been a striking increase in our knowledge of ceratopsid diversity during the past two decades," says Dodson, who specializes in the study of horned dinosaurs. "Much of that has resulted from discoveries farther north, from Utah to Alberta. It is particularly exciting that this find so far south is significantly older than any previous ceratopsid discovery. It underscores the importance of the Menefee dinosaur fauna for the understanding of the evolution of Late Cretaceous dinosaur faunas throughout western North America."

The fossil specimen of the new species, including multiple bones from one individual, was originally discovered in 1996 by Paul Sealey, a research associate of the New Mexico Museum of Natural History and Science, in Cretaceous rocks of the Menefee Formation in northwestern New Mexico. A field crew from the New Mexico Museum of Natural History and Science collected the specimen. Tom Williamson of the New Mexico Museum of Natural History and Science briefly described it the following year, and recent research on other ceratopsid dinosaurs and further preparation of the specimen shed important new light on the fossils.

Based on the latest investigations, researchers determined the fossils represent a new species. The genus name Menefeeceratops refers to the rock formation in which it was discovered, the Menefee Formation, and to the group of which the species is a part, Ceratopsidae. The species name sealeyi honors Sealey, who unearthed the specimen.

Menefeeceratops is related to but predates Triceratops, another ceratopsid dinosaur. However, Menefeeceratops was a relatively small member of the group, growing to around 13 to 15 feet long, compared to Triceratops, which could grow up to 30 feet long.

Horned dinosaurs were generally large, rhinoceros-like herbivores that likely lived in groups or herds. They were significant members of Late Cretaceous ecosystems in North America. "Ceratopsids are better known from various localities in western North America during the Late Cretaceous near the end of the time of dinosaurs," says Jasinski. "But we have less information about the group, and their fossils are rarer, when you go back before about 79 million years ago."

Although bones of the entire dinosaur were not recovered, a significant amount of the skeleton was preserved, including parts of the skull and lower jaws, forearm, hindlimbs, pelvis, vertebrae, and ribs. These bones not only show the animal is unique among known dinosaur species but also provide additional clues to its life history. For example, the fossils show evidence of a potential pathology, resulting from a minor injury or disease, on at least one of the vertebrae near the base of its spinal column.

Some of the key features that distinguish Menefeeceratops from other horned dinosaurs involve the bone that makes up the sides of the dinosaur's frill, known as the squamosal. While less ornate than those of some other ceratopsids, Menefeeceratops' squamosal has a distinct pattern of concave and convex parts.

Comparing features of Menefeeceratops with other known ceratopsid dinosaurs helped the research team trace its evolutionary relationships. Their analysis places Menefeeceratops sealeyi at the base of the evolutionary tree of the centrosaurine subfamily, suggesting that not only is Menefeeceratops one of the oldest known centrosaurine ceratopsids, but also one of the most basal evolutionarily.

Menefeeceratops was part of an ancient ecosystem with numerous other dinosaurs, including the recently recognized nodosaurid ankylosaur Invictarx and the tyrannosaurid Dynamoterror, as well as hadrosaurids and dromaeosaurids.

"Menefeeceratops was part of a thriving Cretaceous ecosystem in the southwestern United States with dinosaurs that predated a lot of the more well-known members closer to end of the Cretaceous," says Jasinski.

While relatively little collecting has been done in the Menefee Formation to date, the researchers hope that more field work in these areas, together with new analyses, will turn up more fossils of Menefeeceratops and provide a better understanding of the ancient ecosystem of which it was part.

Credit: 
University of Pennsylvania

COVID-19 alters gray matter volume in the brain, new study shows

image: Researchers Kuaikuai Duan and Vince Calhoun have found that neurological complications in Covid-19 patients may be linked to lower gray matter volume in the frontal region of the brain even six months after hospital discharge.

Image: 
Vince Calhoun, Georgia Tech

Covid-19 patients who receive oxygen therapy or experience fever show reduced gray matter volume in the frontal-temporal network of the brain, according to a new study led by researchers at Georgia State University and the Georgia Institute of Technology.

The study found lower gray matter volume in this brain region was associated with a higher level of disability among Covid-19 patients, even six months after hospital discharge.

Gray matter is vital for processing information in the brain and gray matter abnormality may affect how well neurons function and communicate. The study, published in the May 2021 issue of Neurobiology of Stress, indicates gray matter in the frontal network could represent a core region for brain involvement in Covid-19, even beyond damage related to clinical manifestations of the disease, such as stroke.

The researchers, who are affiliated with the Center for Translational Research in Neuroimaging and Data Science (TReNDS), analyzed computed tomography scans in 120 neurological patients, including 58 with acute Covid-19 and 62 without Covid-19, matched for age, gender and disease. The work was done jointly with Enrico Premi and his colleagues at the University of Brescia in Italy, who provided the data for the study. They used source-based morphometry analysis, which boosts the statistical power for studies with a moderate sample size.

"Science has shown that the brain's structure affects its function, and abnormal brain imaging has emerged as a major feature of Covid?19," said Kuaikuai Duan, the study's first author, a graduate research assistant at TReNDS and Ph.D. student in Georgia Tech's School of Electrical and Computer Engineering. "Previous studies have examined how the brain is affected by Covid-19 using a univariate approach, but ours is the first to use a multivariate, data-driven approach to link these changes to specific Covid-19 characteristics (for example fever and lack of oxygen) and outcome (disability level)."

The analysis showed patients with higher levels of disability had lower gray matter volume in the superior, medial and middle frontal gyri at discharge and six months later, even when controlling for cerebrovascular diseases. Gray matter volume in this region was also significantly reduced in patients receiving oxygen therapy compared to patients not receiving oxygen therapy. Patients with fever had a significant reduction in gray matter volume in the inferior and middle temporal gyri and the fusiform gyrus compared to patients without fever. The results suggest Covid-19 may affect the frontal-temporal network through fever or lack of oxygen.

Reduced gray matter in the superior, medial and middle frontal gyri was also present in patients with agitation compared to patients without agitation. This implies that gray matter changes in the frontal region of the brain may underlie the mood disturbances commonly exhibited by Covid-19 patients.

"Neurological complications are increasingly documented for patients with Covid-19," said Vince Calhoun, senior author of the study and director of TReNDS. Calhoun is Distinguished University Professor of Psychology at Georgia State and holds appointments in the School of Electrical and Computer Engineering at Georgia Tech and in neurology and psychiatry at Emory University. "A reduction of gray matter has also been shown to be present in other mood disorders such as schizophrenia and is likely related to the way that gray matter influences neuron function."

The study's findings demonstrate changes to the frontal-temporal network could be used as a biomarker to determine the likely prognosis of Covid-19 or evaluate treatment options for the disease. Next, the researchers hope to replicate the study on a larger sample size that includes many types of brain scans and different populations of Covid-19 patients.

Credit: 
Georgia Institute of Technology

Rooting the bacterial tree of life

image: Fusobacteria, Gracilicutes and Bacteroidota all branched off from a last bacterial common ancestor.

Image: 
The University of Queensland

Scientists now better understand early bacterial evolution, thanks to new research featuring University of Queensland researchers.

Bacteria comprise a very diverse domain of single-celled organisms that are thought to have evolved from a common ancestor that lived more than three billion years ago.

Professor Phil Hugenholtz, from the Australian Centre for Ecogenomics in UQ's School of Chemistry and Molecular Biosciences, said the root of the bacterial tree, which would reveal the nature of the last common ancestor, is not agreed upon.

"There's great debate about the root of this bacterial tree of life and indeed whether bacterial evolution should even be described as a tree has been contested," Professor Hugenholtz said.

"This is in large part because genes are not just shared 'vertically' from parents to offspring, but also 'horizontally' between distant family members.

"We've all inherited certain traits from our parents, but imagine going to a family BBQ and suddenly inheriting your third cousin's red hair.

"As baffling as it sounds, that's exactly what happens in the bacterial world, as bacteria can frequently transfer and reconfigure genes horizontally across populations quite easily.

"This might be useful for bacteria but makes it challenging to reconstruct bacterial evolution."

For the bacterial world, many researchers have suggested throwing the 'tree of life' concept out the window and replacing it with a network that reflects horizontal movement of genes.

"However, by integrating vertical and horizontal gene transmission, we found that bacterial genes travel vertically most of the time - on average two-thirds of the time - suggesting that a tree is still an apt representation of bacterial evolution," Professor Hugenholtz said.

"The analysis also revealed that the root of the tree lies between two supergroups of bacteria, those with one cell membrane and those with two.

"Their common ancestor was already complex, predicted to have two membranes, the ability to swim, sense its environment, and defend itself against viruses."

The University of Bristol's Dr Tom Williams said this fact led to another big question.

"Given the common ancestor of all living bacteria already had two membranes, we now need to understand how did single-membrane cells evolve from double-membraned cells, and whether this occurred once or on multiple occasions," Dr Williams said.

"We believe that our approach to integrating vertical and horizontal gene transmission will answer these and many other open questions in evolutionary biology."

The research was a collaboration between UQ, the University of Bristol in the UK, Eötvös Loránd University in Hungary, and NIOZ in the Netherlands, and has been published in Science (DOI: 10.1126/science.abe5011).

Credit: 
University of Queensland

Tiny amino acid differences can lead to dramatically different enzymes

image: Slight adjustments to an enzyme's amino acids can allow a microbe to live in completely different environments.

Image: 
Dr Ulrike Kappler

Just a few changes to an enzyme's amino acids can be enough to dramatically change its function, enabling microbes to inhabit wildly different environments.

University of Queensland microbiologist Associate Professor Ulrike Kappler, working with an international team of researchers, made this discovery when investigating how Haemophilus influenzae bacteria colonise the human respiratory system.

"This disease-causing bacterium is supremely adapted to living in humans, so much so that they cannot survive anywhere else," Dr Kappler said.

"It turns out that one enzyme, MtsZ, is the key player in this adaptation.

"But, surprisingly, close relatives of this protein, which promotes Haemophilus survival exclusively inside humans, help other species of bacteria to survive exclusively in lakes.

"How could closely related enzymes help one bacterial species live exclusively in humans and another to live only in lakes?

"The answer is a matter of minute amino acid changes."

The research shows that a sequence difference of just three amino acids - less than 0.25 per cent of the MtsZ enzyme sequence - is enough to change the enzyme's function between the lake-dwelling bacteria and those living in humans.
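For scale - an inference from the figures above rather than a number reported by the researchers - three residues making up less than 0.25 per cent of the sequence implies that MtsZ is a large enzyme of more than roughly 1,200 amino acids (3 ÷ 0.0025 = 1,200).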

"It the natural world, tiny differences can lead to enormous functional changes - for example, humans and chimpanzees aren't exactly the same despite being 99 percent genetically similar," Dr Kappler said.

"We're just now realising that this can be the case for enzymes as well.

"The slight changes in this enzyme enable the lake-dwelling bacteria to live on decaying algae and generate energy.

"Contrast this with Haemophilus, which uses MtsZ to scavenge amino acids from the human body and use them for bacterial growth and replication.

"Now that we understand the unique structure of this enzyme in Haemophilus, we hope to develop ways to inhibit its specific function and remedy chronic respiratory conditions associated with this bacterium."

Credit: 
University of Queensland

UQ research finds new way to reduce scarring

image: University of Queensland Professor Kiarash Khosrotehrani

Image: 
The University of Queensland

Researchers have been able to reduce scarring by blocking part of the healing process, a finding that could make a significant difference for burns and other trauma patients.

University of Queensland Professor Kiarash Khosrotehrani said that in an animal study, scarring had been reduced by targeting the gene that instructs stem cells to form scar tissue.

"The body's natural response to trauma is to make plenty of blood vessels to take oxygen and nutrients to the wound to repair it," Professor Khosrotehrani said.

"Once the wound has closed, many of these blood vessels become fibroblast cells which produce the collagens forming the hard materials found in scar tissue.

"We found that vascular stem cells determined whether a blood vessel was retained or gave rise to scar material instead."

The experimental dermatology team then identified the molecular mechanism to switch off the process by targeting a specific gene involved in scar formation known as SOX9.

Professor Khosrotehrani said while more research was required, the potential application of the findings would have obvious benefits for many patients, including those who have had knee or hip surgeries, had melanomas removed, or suffered burns.

"The classic situation where there's a lot of scarring is burns - where the wound is healed but there is a big scar in that area," he said.

"Now that we've found the molecular drivers, we understand the process better and we are hopeful that a treatment can be developed.

"We used siRNA - or small ribonucleic acid - technology to block the RNA of SOX9 from being expressed and this reduced scarring in animals.

"Whatever we propose has to go through the further trials, but we believe this application won't be difficult to apply to human patients."

Credit: 
University of Queensland

Best practices to prevent the federal government from blowing its technology budget

INFORMS Journal Manufacturing & Service Operations Management Study Key Takeaways:

The study looked at archival data on 240 U.S. federal government technology programs across 24 federal agencies.

Researchers found that the practice of moving baseline targets is a key driver in continually increasing budgets for federal government technology programs.

The componentization of a program into smaller work units and increasing the level of competency in program management can dampen this increase, resulting in significant cost savings.

CATONSVILLE, MD, May 11, 2021 - With the U.S. federal government investing billions of taxpayer dollars in executing technology programs, wouldn't you like to know where this money is going? A new study has identified ways to reduce federal spending in the execution of these taxpayer-funded technology programs.

To monitor the execution of these programs, the federal government establishes a baseline, which is an aggregate plan consisting of the program's planned budget, schedule and scope. The problem is that federal technology programs are re-baselined several times: once the baseline is revised, a program can appear to be within budget even though it has exceeded the original planned budget.
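To make the effect concrete, the following minimal Python sketch uses invented figures (not data from the study) to show how measuring spending against the latest baseline, rather than the original one, can make an over-budget program look on track.

```python
# Illustrative sketch with made-up figures: how re-baselining can mask an overrun.
original_baseline = 100.0          # originally planned budget, in $M
rebaselines = [120.0, 150.0]       # revised budgets approved later
actual_spend = 148.0               # what the program has actually cost so far

current_baseline = rebaselines[-1] if rebaselines else original_baseline

overrun_vs_current = actual_spend - current_baseline    # -2.0  -> looks "under budget"
overrun_vs_original = actual_spend - original_baseline  # +48.0 -> 48% over the original plan

print(f"Versus current baseline:  {overrun_vs_current:+.1f} $M")
print(f"Versus original baseline: {overrun_vs_original:+.1f} $M")
```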

New research in the INFORMS journal Manufacturing & Service Operations Management investigates the drivers of these baseline changes and identifies mechanisms to reduce these changes, thereby helping improve utilization of taxpayer contributions associated with such programs.

"Taxing the Taxpayers: An Empirical Investigation of the Drivers of Baseline Changes in U.S. Federal Government Technology Programs," written by Dwaipayan Roy, Anant Mishra and Kingshuk Sinha, all of the University of Minnesota, looks at archival data on 240 U.S. federal government technology programs across 24 federal agencies.

"We find significant savings can occur by reducing baseline changes in programs of greater scope if federal agencies and contractor firms invest greater efforts in componentizing a program into smaller work units and identifying managers with high levels of technical and practical knowledge in 'program management' - a competency critical for managing multiple interrelated projects," said Roy, professor in the Carlson School of Management.

"Baseline changes can serve as early warning signals for federal agencies and contractor firms to identify programs that may be facing execution challenges and enable them to make mid-course corrections," continued Roy.

Another key finding is that federal technology programs using the agile methodology experienced more baseline changes.

"Scope creep can be higher in such programs, as these programs can often lack sufficient upfront effort in developing the initial baseline and depend too much on making adaptations during execution. The upfront effort is actually critical for better managing adaptations and avoiding the time-consuming approval process needed for revising a baseline," added Roy.

Credit: 
Institute for Operations Research and the Management Sciences

People living with HIV more likely to get sick with, die from COVID-19

HERSHEY, Pa. -- Over the past year, studies have revealed that certain pre-existing conditions, such as cancer, diabetes and high blood pressure, can increase a person's risk of dying from COVID-19. New research shows that individuals living with human immunodeficiency virus (HIV) and acquired immune deficiency syndrome (AIDS) -- an estimated 38 million worldwide, according to the World Health Organization -- have an increased risk of SARS-CoV-2 infection and fatal outcomes from COVID-19.

In a new study, published in Scientific Reports, Penn State College of Medicine researchers found that people living with HIV had a 24% higher risk of SARS-CoV-2 infection and a 78% higher risk of death from COVID-19 than people without HIV. They assessed data from 22 previous studies that included nearly 21 million participants in North America, Africa, Europe and Asia to determine to what extent people living with HIV/AIDS are susceptible to SARS-CoV-2 infection and death from COVID-19.

The majority of the participants (66%) were male and the median age was 56. The most common comorbidities among the HIV-positive population were hypertension, diabetes, chronic obstructive pulmonary disease and chronic kidney disease. The majority of patients living with HIV/AIDS (96%) were on antiretroviral therapy (ART), which helps suppress the amount of HIV detected in the body.

"Previous studies were inconclusive on whether or not HIV is a risk factor for susceptibility to SARS-CoV-2 infection and poor outcomes in populations with COVID-19," said Dr. Paddy Ssentongo, lead researcher and assistant professor at the Penn State Center for Neural Engineering. "This is because a vast majority of people living with HIV/AIDS are on ART, some of which have been used experimentally to treat COVID-19."

According to the researchers, certain pre-existing conditions are common among people living with HIV/AIDS, which may contribute to the severity of their COVID-19 cases. The beneficial effects of antiviral drugs, such as tenofovir and protease inhibitors, in reducing the risk of SARS-CoV-2 infection and death from COVID-19 in people living with HIV/AIDS remain inconclusive.

"As the pandemic has evolved, we've obtained sufficient information to characterize the epidemiology of HIV/SARS-CoV-2 coinfection, which could not be done at the beginning of the pandemic due to scarcity of data," said Vernon Chinchilli, fellow researcher and chair of the Department of Public Health Sciences. "Our findings support the current Centers for Disease Control and Prevention guidance to prioritize persons living with HIV to receive a COVID-19 vaccine."

Credit: 
Penn State

Greater presence of family docs, midwives may decrease rates of cesarean birth

Surgical cesarean births can expose new mothers to a range of health complications, including infection, blood clots and hemorrhage. As part of Healthy People 2020 and other maternal health objectives, the state of California exerted pressure to reduce cesarean deliveries, and statewide organizations established quality initiatives in partnership with those goals. In this study, researchers from Stanford University and the University of Chicago examined unit culture and provider mix differences on hospital and delivery units to identify characteristics of units that successfully reduced their cesarean delivery rates. The mixed-methods study surveyed and interviewed labor and delivery teams from 37 California hospitals that were participating sites in the California Maternal Quality Care Collaborative's Supporting Vaginal Birth initiative. Respondents at successful hospitals included more family physicians and midwives, and physicians who had been in practice for less time. The study identified a number of unit culture factors that also predicted success. The authors conclude, "Family medicine, a discipline that strongly identifies itself as valuing patient-centered care and shared decision-making, may be in a unique position to contribute positively to this aspect of culture change on labor and delivery units."

Credit: 
American Academy of Family Physicians

Combination of psychotherapy and pharmacotherapy more effective in treating depression

Most patients with depression are treated in primary care; however, relatively few clinical trials for treating depression have focused on primary care. Researchers at the Vrije Universiteit Amsterdam examined the effects of the two major approaches to treating depression: psychotherapy and pharmacotherapy, as well as combined treatment and care-as-usual. The study integrated the results of 58 randomized controlled trials with a total of 9,301 patients. Results concluded that both psychotherapy and pharmacotherapy were significantly more effective than care-as-usual or waitlist control. However, they found no significant difference between psychotherapy and pharmacotherapy as stand-alone treatments. Combined treatment, particularly in studies that included cognitive behavioral therapy, was better than either pharmacotherapy or psychotherapy alone. Treatment in primary care should be organized to accommodate any of these treatments in response to patients' preferences and values, the authors write.

Credit: 
American Academy of Family Physicians

Focus on outliers creates flawed snap judgments

image: Study participants saw a grid of faces like this one, for just one second. Photo courtesy of Mel Khaw and Duke University.

Image: 
Duke University

DURHAM, N.C. -- You enter a room and quickly scan the crowd to gain a sense of who's there - how many men versus women. How reliable is your estimate?

Not very, according to new research from Duke University.

In an experimental study, researchers found that participants consistently erred in estimating the proportion of men and women in a group. And participants erred in a particular way: They overestimated whichever group was in the minority.

"Our attention is drawn to outliers," said Mel W. Khaw, a postdoctoral research associate at Duke and the study's lead author. "We tend to overestimate people who stand out in a crowd."

For the study, which appears online in the journal Cognition, researchers recruited 48 observers ages 18-28. Participants were presented with a grid of 12 faces and were given just one second to glance at the grid. Study participants were then asked to estimate the number of men and women in the grid.

Participants accurately assessed homogenous groups - groups containing all men or all women. But if a group contained fewer women, say, participants overestimated the number of women present.

The researchers also tracked participants' eye movements. They found that participants looked more often at whichever group was in the minority - men or women.

All of this occurred very quickly -- during a glance of just one second, said co-author and Duke psychologist Scott Huettel.

"We should recognize that our visual system is set up to orient ourselves towards some types of information more than others," Huettel said. "People form an initial impression very quickly,
and that impression biases where we look next."

Interestingly, the same tendency to focus on the outlier also extended to scanning other kinds of images.

In a second experiment, study participants were shown a grid of nature photos showing a variety of indoor and outdoor scenes. Participants consistently overestimated whatever type of scene appeared less often.

For instance, if a grid of 12 photos contained two outdoor scenes - say, a waterfall and a mountain range -- participants reported, on average, that the grid contained three such scenes.

In other words, the same behavior occurred whether people were looking at faces or scenes. That's important, Huettel said.

"That fact that this occurs with indoor and outdoor scenes suggests that this doesn't represent a social bias," Huettel said. "It really has to do with a fundamental feature of human perception."

And that built-in flaw in human perception suggests our quick judgments should be viewed with caution.

"Snap judgments are powerful," Huettel said. "But they're not perfect."

Co-author Rachel Kranton, an economist, noted that as the research was coming together, she received an invitation to an economics conference including a photo from a past event.

The photo showed a meeting room full of mostly men, a situation Kranton frequently encounters at economics conferences. Kranton said she found herself scanning the photo for the presence of women -- and smiling in recognition.

"When human beings walk into a social situation, we immediately try to suss out the setting," Kranton said. "We scan to see who's there and how we fit in - that's a common human experience. It's one I've experienced many times."

Credit: 
Duke University

Study examines connection between oral and general health in patients with diabetes

Individuals with diabetes are at greater risk of developing oral health issues, like gum disease, yet care for these linked health issues is usually disconnected, split between primary care and dental care. A research team from the University of Amsterdam developed an intervention that provided primary care-based oral health information and dental referrals for patients with diabetes. In a cluster randomized controlled trial, 764 patients from 24 primary care practices received either the oral health support or standard primary care. Participants were asked to rate their oral health quality of life, as well as their general health and any oral health complaints, at the start and end of the study. Analysis showed that individuals who received the primary care-based oral health support intervention had a significant increase in their self-reported oral health quality of life when compared with the control group. The authors conclude that, "patients with type 2 diabetes who attend primary diabetes care can benefit from extra attention to oral health." They add, "It also further reflects the concept of oral health and general health being connected."

Credit: 
American Academy of Family Physicians

Study enables low-temperature crystallization of phase-pure α-formamidinium lead iodide

image: Researchers found that transformation from the intercalated initial structure to the final perovskite arrangement takes place via a sequence of intermediates.

Image: 
Ahlawat Paramvir, @EPFL

Though different fabrication approaches exist, two-step deposition is one of the main experimental techniques now used to make efficient, stable perovskite solar cells (PSCs), especially on the industrial scale. The process involves first depositing lead iodide (PbI2) and then adding halide salts of monovalent cations such as methylammonium iodide (MAI) and formamidinium iodide (FAI) to convert it to perovskite.

While this two-step deposition is better than other options, it is difficult to maintain reproducible high performance and long-term stability when scaling up, mostly because of a lack of control over the fabrication process. Gaining an understanding of the mechanism behind halide perovskite crystallization at the atomic level is therefore essential.

In the paper "A combined molecular dynamics and experimental study of two-step process enabling low-temperature formation of phase-pure α-FAPbI3," the authors accordingly chose to study the two-step fabrication of methylammonium lead iodide (MAPbI3) and formamidinium lead iodide (FAPbI3).

While the former is a well-studied system, the latter was chosen because of attractive features including a ~1.45-eV bandgap, high charge-carrier mobility, and superior thermal stability that appear in its α-FAPbI3 polymorph. The problem with this perovskite, however, is that the α phase is metastable and the thermodynamic phase transition requires high temperatures of around 150 degrees Celsius. The combined experimental and theoretical study, published in the 23 April issue of Science Advances, uncovered the microscopic details of the crystallization process, leading the way to the discovery of a low-temperature pathway to the fabrication of the material.

While previous experimental research on MAPbI3 revealed that the two-step process occurs via intercalation of the MA+ cations in PbI2 layers followed by a transformation to the perovskite structure via intermediate phases, the experiments couldn't resolve the nature of these intermediate phases or clarify the underlying atomistic mechanism. Using a molecular dynamics (MD) investigation based on an enhanced sampling technique called well-tempered metadynamics (WTMetaD), the team found that the transformation takes place through a sequence of intermediates. The theoretical results were in line with experiments, encouraging the researchers to investigate whether a similar process could also yield α-FAPbI3. Starting from simulations, they discovered that a two-step process is indeed possible at lower temperatures in this material. A series of in situ x-ray and thin-film experiments then confirmed this result and enabled the low-temperature formation of phase-pure α-FAPbI3 thin films.

Credit: 
National Centre of Competence in Research (NCCR) MARVEL

World's fastest information-fuelled engine designed by SFU researchers

Simon Fraser University researchers have designed a remarkably fast engine that taps into a new kind of fuel -- information.

The development of this engine, which converts the random jiggling of a microscopic particle into stored energy, is outlined in research published this week in the Proceedings of the National Academy of Sciences (PNAS) and could lead to significant advances in the speed and cost of computers and bio-nanotechnologies.

SFU physics professor and senior author John Bechhoefer says researchers' understanding of how to rapidly and efficiently convert information into "work" may inform the design and creation of real-world information engines.

"We wanted to find out how fast an information engine can go and how much energy it can extract, so we made one," says Bechhoefer, whose experimental group collaborated with theorists led by SFU physics professor David Sivak.

Engines of this type were first proposed over 150 years ago but actually making them has only recently become possible.

"By systematically studying this engine, and choosing the right system characteristics, we have pushed its capabilities over ten times farther than other similar implementations, thus making it the current best-in-class," says Sivak.

The information engine designed by SFU researchers consists of a microscopic particle immersed in water and attached to a spring which, itself, is fixed to a movable stage. Researchers then observe the particle bouncing up and down due to thermal motion.

"When we see an upward bounce, we move the stage up in response," explains lead author and PhD student Tushar Saha. "When we see a downward bounce, we wait. This ends up lifting the entire system using only information about the particle's position."

Repeating this procedure, they raise the particle "a great height, and thus store a significant amount of gravitational energy," without having to directly pull on the particle.
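A minimal conceptual sketch of that feedback rule, written in Python with arbitrary units, illustrative parameters and gravity omitted, might look like the following. It is a toy model of the idea only, not the authors' apparatus or analysis code.

```python
import random

random.seed(1)

dt, steps = 1e-4, 50_000
k, gamma, kT = 1.0, 1.0, 1.0          # spring constant, drag, thermal energy (arbitrary units)
noise = (2 * kT * dt / gamma) ** 0.5  # size of each thermal kick

x = 0.0        # particle position
centre = 0.0   # trap (stage) position

for _ in range(steps):
    # Overdamped Langevin step: relax toward the trap centre plus a thermal kick.
    x += -(k / gamma) * (x - centre) * dt + noise * random.gauss(0.0, 1.0)
    # Feedback rule: on an upward fluctuation, move the stage up; otherwise wait.
    if x > centre:
        centre = x

print(f"Stage ratcheted upward by roughly {centre:.1f} trap lengths using only position information.")
```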

Saha further explains that, "in the lab, we implement this engine with an instrument known as an optical trap, which uses a laser to create a force on the particle that mimics that of the spring and stage."

Joseph Lucero, a Master of Science student adds, "in our theoretical analysis, we find an interesting trade-off between the particle mass and the average time for the particle to bounce up. While heavier particles can store more gravitational energy, they generally also take longer to move up."

"Guided by this insight, we picked the particle mass and other engine properties to maximize how fast the engine extracts energy, outperforming previous designs and achieving power comparable to molecular machinery in living cells, and speeds comparable to fast-swimming bacteria," says postdoctoral fellow Jannik Ehrich.

Credit: 
Simon Fraser University

Discovery of new geologic process calls for changes to plate tectonic cycle

video: Elements of a newly discovered process in plate tectonics include a mass (rock slab weight), a pulley (trench), a dashpot (microcontinent), and a string (oceanic plate) that connects these elements to each other.

In the initial state, the microcontinent drifts towards the subduction zone (Figure a).

The microcontinent then extends during its journey to the subduction trench owing to the tensional force applied by the pull of the rock slab across the subduction zone (Figure b).

Finally, the microcontinent accretes to the overriding plate and resists subduction due to its low density, causing the down-going slab to break off.

Image: 
Erkan Gün/University of Toronto

TORONTO, ON - Geoscientists at the University of Toronto (U of T) and Istanbul Technical University have discovered a new process in plate tectonics which shows that tremendous damage occurs to areas of Earth's crust long before it should be geologically altered by known plate-boundary processes, highlighting the need to amend current understandings of the planet's tectonic cycle.

Plate tectonics, an accepted theory for over 60 years that explains the geologic processes occurring below the surface of Earth, holds that its outer shell is fragmented into continent-sized blocks of solid rock, called "plates," that slide over Earth's mantle, the rocky inner layer above the planet's core. As the plates drift around and collide with each other over periods of millions of years, they produce everything from volcanoes and earthquakes to mountain ranges and deep ocean trenches at the boundaries where they meet.

Now, using supercomputer modelling, the researchers show that the plates on which Earth's oceans sit are being torn apart by massive tectonic forces even as they drift about the globe. The findings are reported in a study published this week in Nature Geoscience.

The thinking up to now focused only on the geological deformation of these drifting plates at their boundaries after they had reached a subduction zone, such as the Marianas Trench in the Pacific Ocean where the massive Pacific plate dives beneath the smaller Philippine plate and is recycled into Earth's mantle.

The new research shows much earlier damage to the drifting plate further away from the boundaries of two colliding plates, focused around zones of microcontinents - continental crustal fragments that have broken off from main continental masses to form distinct islands often several hundred kilometers from their place of origin.

"Our work discovers that a completely different part of the plate is being pulled apart because of the subduction process, and at a remarkably early phase of the tectonic cycle," said Erkan Gün, a PhD candidate in the Department of Earth Sciences in the Faculty of Arts & Science at U of T and lead author of the study.

The researchers term the mechanism a "subduction pulley": the weight of the subducting portion that dives beneath another tectonic plate pulls on the drifting ocean plate and tears apart the weak microcontinent sections at an early phase of the tectonic cycle, causing potentially significant damage.

"The damage occurs long before the microcontinent fragment reaches its fate to be consumed in a subduction zone at the boundaries of the colliding plates," said Russell Pysklywec, professor and chair of the Department of Earth Sciences at U of T, and a coauthor of the study. He says another way to look at it is to think of the drifting ocean plate as an airport baggage conveyor, and the microcontinents are like pieces of luggage travelling on the conveyor.

"The conveyor system itself is actually tearing apart the luggage as it travels around the carousel, before the luggage even reaches its owner."

The researchers arrived at the results following a mysterious observation of major extension of rocks in alpine regions in Italy and Turkey. These observations suggested that the tectonic plates that brought the rocks to their current location were already highly damaged prior to the collisional and mountain-building events that normally cause deformation.

"We devised and conducted computational Earth models to investigate a process to account for the observations," said Gün. "It turned out that the temperature and pressure rock histories that we measured with the virtual Earth models match closely with the enigmatic rock evolution observed in Italy and Turkey."

According to the researchers, the findings refine some of the fundamental aspects of plate tectonics and call for a revised understanding of this fundamental theory in geoscience.

"Normally we assume - and teach - that the ocean plate conveyor is too strong to be damaged as it drifts around the globe, but we prove otherwise," said Pysklywec.

Credit: 
University of Toronto