
Freshwater fish species richness has increased in the Ohio River Basin since the 1960s

The taxonomic and trophic composition of freshwater fishes in the Ohio River Basin has changed significantly in recent decades, possibly due to environmental modifications related to land use and hydrology, according to a study published April 24 in the open-access journal PLOS ONE by Mark Pyron of Ball State University, and colleagues.

Manmade threats to freshwater ecosystems are numerous and globally widespread. The legacy of land use is evident in the Ohio River Basin, which was drastically modified by logging and wetland draining following European colonization. The watershed was then dominated by agriculture for decades before substantial agricultural land was converted to forest during the 1960s-80s. The effects of these changes on fish throughout the basin are not fully known.

Pyron and colleagues used 57 years of rotenone and electrofishing fish collection survey data (1957-2014), collected by the Ohio River Valley Water Sanitation Commission, to examine changes to taxonomy, trophic classifications, and life history strategies of freshwater fish assemblages in the Ohio River Basin over this period.

Annual species richness varied from 31 to 90 species and generally increased over time. Taxonomic and trophic structure was correlated with the decrease in agriculture and increase in forest. The trophic composition of the fish catch also correlated with changes to the Basin's hydrology. In general, the environmental modifications were associated with more fish species that feed on plant matter and detritus, and fewer that feed on plankton or on other fish.
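The trend analysis behind the richness finding can be illustrated with a minimal sketch. The yearly counts below are hypothetical placeholders, not the ORSANCO survey data; a positive least-squares slope corresponds to the increasing richness the study reports:

```python
# Minimal species-richness trend sketch (hypothetical data, not the
# study's values). A positive slope means richness rises over time.
years = [1960, 1970, 1980, 1990, 2000, 2010]
richness = [35, 42, 55, 61, 70, 84]  # hypothetical annual species counts

n = len(years)
mean_x = sum(years) / n
mean_y = sum(richness) / n
# Ordinary least-squares slope: cov(x, y) / var(x)
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, richness)) \
        / sum((x - mean_x) ** 2 for x in years)
print(f"richness trend: {slope:+.2f} species per year")  # ≈ +0.96
```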

The authors believe that future land use modifications, climate change, and altered biotic interactions could continue to contribute to complex patterns of change in freshwater fish assemblages in the Ohio River.

Pyron adds: "We found significant changes in species and trophic composition of freshwater fishes in the Ohio River Basin from 1957-2014. Species richness increased with year and the fish assemblages varied with changes in land use and hydrologic alteration."

Credit: 
PLOS

Getting fertilizer in the right place at the right rate

image: The soil slabs that were used in simulated snowmelt runoff studies were removed from the field plots by the researchers.

Image: 
Photo by T. King.

We've all heard about the magical combination of being in the right place at the right time. Well, for fertilizer, it's more accurate to say it should be in the right place at the right rate. A group of Canadian scientists wanted to find the perfect combination for farmers in their northern prairies.

When farmers place fertilizer on a field, they'd like it to stay there. However, water that runs off a field can take some of the valuable fertilizer with it. In Canada this water can take two forms: rainfall runoff or snow melt. It's the latter type of water - snow melt - that causes the most runoff losses in the Canadian prairies.

Knowing how each form of runoff affects fertilizer will shape the "right place, right rate" calculation, because rainfall runoff and snowmelt runoff may lead to different fertilizer management recommendations. So, Jeff Schoenau from the University of Saskatchewan and his team focused on runoff from snowmelt.

"This work contributed to finding better practices for phosphorus fertilization," Schoenau explains. "These will help growers in the northern prairies make better use of their fertilizer. By applying the fertilizer in the right place at the right rate, growers can greatly lower the phosphorus loss from snowmelt runoff."

For the "right place" part of the equation, they compared applying fertilizer to the top of the soil and leaving it there versus in-soil placement. In-soil placement can involve placing the fertilizer in the furrow with the seed or next to it in a separate furrow. It can also refer to broadcasting fertilizer onto the soil and then mixing it in rather than leaving it on the surface.

They applied different amounts, or rates, of these two forms -- on-top-of-soil and in-soil -- in their study. Their results showed that the in-soil placement resulted in less phosphorus loss from snowmelt runoff.

"In the case of phosphorus fertilization practices, in-soil placement is helpful because it can help roots better access and take up phosphorus," he says. "Also, having the phosphorus placed in the soil rather than on the surface reduces its interaction with runoff from snowmelt in early spring."

Phosphorus is an important nutrient that plants need, so it is often applied to fields as fertilizer. It can come in different forms, and end up in different forms depending on the chemistry of the soil. When phosphorus fertilizer is applied, it undergoes transformation in the soil through reaction with minerals and organic matter. Ideally, it will end up in a form that the plants can use. However, too much of a good thing can be bad because it can run off and cause harm to nearby rivers and lakes.

"In our research we were able to employ some novel techniques to help us find the nature and origin of some of these forms in soil and water," he says. "Our main message here is that benefits can be realized by getting the phosphorus fertilizer into the soil where the roots are rather than leaving it on the surface."

Schoenau explains that runoff from snow is different than runoff from summer rains. The force of rainfall can loosen pieces of the soil containing phosphorus. Snowmelt runoff moves the element differently, carrying it mostly in dissolved form from the soil and from plant residue on the surface.

"In order to encourage growers to follow the best practices, it's important to document and understand why and how a specific practice like the one we tested works," he says. "I am both a scientist and farmer on the prairies interested in furthering the environmental and economic sustainability of our modern cropping systems."

Credit: 
American Society of Agronomy

Study: Microbes could influence earth's geological processes as much as volcanoes

image: Microbiology professor Karen Lloyd (second from right) and Ph.D. student Katie Fullerton (far left) look at a volcanic wall while on a research trip to Costa Rica. Location: Irazu Volcano.

Image: 
Photo by Tom Owens.

By acting as gatekeepers, microbes can affect geological processes that move carbon from the earth's surface into its deep interior, according to a study published in Nature and coauthored by microbiologists at the University of Tennessee, Knoxville. The research is part of the Deep Carbon Observatory's Biology Meets Subduction project.

"We usually think of geology as something that happens independently of life, and life just adjusts to the geology," said Karen Lloyd, associate professor of microbiology at the University of Tennessee, Knoxville and senior author of the study. "But we found that microbes can impact major geological processes happening on Earth today."

For the study, researchers evaluated Costa Rica's subduction zone, a region where the ocean floor sinks underneath the continental plate. The results showed that microbes consume and trap a small but measurable amount of the carbon sinking into the trench off Costa Rica's Pacific coast. The microbes may also be involved in chemical processes that pull out even more carbon, leaving cement-like veins of calcite in the crust.

"It is amazing to consider that tiny microbes can potentially influence geological processes on similar scales as these powerful and visually impressive volcanoes, which are direct conduits to the earth's interior," said Maarten de Moor, coauthor and professor at the National University of Costa Rica's Observatory of Volcanology and Seismology.

The unexpected findings have important implications for how much carbon moves from Earth's surface into the interior, especially over geological timescales.

In the future, researchers plan to investigate other forearc regions to see if this trend is widespread. If these biological and geochemical processes occur worldwide, they would translate to 19 percent less carbon entering the deep mantle than previously estimated.

Credit: 
University of Tennessee at Knoxville

New sensor detects rare metals used in smartphones

image: A new sensor changes its fluorescence when it binds to lanthanides (Ln), rare earth metals used in smartphones and other technologies, potentially providing a more efficient and cost-effective way to detect these elusive metals.

Image: 
Cotruvo Lab, Penn State

A more efficient and cost-effective way to detect lanthanides, the rare earth metals used in smartphones and other technologies, could be possible with a new protein-based sensor that changes its fluorescence when it binds to these metals. A team of researchers from Penn State developed the sensor from a protein they recently described and subsequently used it to explore the biology of bacteria that use lanthanides. A study describing the sensor appears online in the Journal of the American Chemical Society.

"Lanthanides are used in a variety of current technologies, including the screens and electronics of smartphones, batteries of electric cars, satellites, and lasers," said Joseph Cotruvo, Jr., assistant professor and Louis Martarano Career Development Professor of Chemistry at Penn State and senior author of the study. "These elements are called rare earths, and they include the chemical elements with atomic numbers 57 to 71 on the periodic table. Rare earths are challenging and expensive to extract from the environment or from industrial samples, like wastewater from mines or coal waste products. We developed a protein-based sensor that can detect tiny amounts of lanthanides in a sample, letting us know if it's worth investing resources to extract these important metals."

The research team reengineered a fluorescent sensor used to detect calcium, substituting the part of the sensor that binds to calcium with a protein they recently discovered that is several million times better at binding to lanthanides than other metals. The protein undergoes a shape change when it binds to lanthanides, which is key for the sensor's fluorescence to "turn on."

"The gold standard for detecting each element that is present in a sample is a mass spectrometry technique called ICP-MS," said Cotruvo. "This technique is very sensitive, but it requires specialized instrumentation that most labs don't have, and it's not cheap. The protein-based sensor that we developed allows us to detect the total amount of lanthanides in a sample. It doesn't identify each individual element, but it can be done rapidly and inexpensively at the location of sampling."

The research team also used the sensor to investigate the biology of a type of bacteria that uses lanthanides--the bacteria from which the lanthanide-binding protein was originally discovered. Earlier studies had detected lanthanides in the bacteria's periplasm--a space between membranes near the outside of the cell--but, using the sensor, the team also detected lanthanides in the bacterium's cytosol--the fluid that fills the cell.

"We found that the lightest of the lanthanides--lanthanum through neodymium on the periodic table--get into the cytosol, but the heavier ones don't," said Cotruvo. "We're still trying to understand exactly how and why that is, but this tells us that there are proteins in the cytosol that handle lanthanides, which we didn't know before. Understanding what is behind this high uptake selectivity could also be useful in developing new methods to separate one lanthanide from another, which is currently a very difficult problem."

The team also determined that the bacteria take in lanthanides much as many bacteria take in iron: they secrete small molecules that tightly bind the metal, and the entire complex is taken into the cell. This reveals that there are small molecules that likely bind to lanthanides even more tightly than the highly selective sensor.

"We hope to further study these small molecules and any proteins in the cytosol, which could end up being better at binding to lanthanides than the protein we used in the sensor," said Cotruvo. "Investigating how each of these bind and interact with lanthanides may give us inspiration for how to replicate these processes when collecting lanthanides for use in current technologies."

Credit: 
Penn State

New way to 'see' objects accelerates the future of self-driving cars

ITHACA, N.Y. - Researchers have discovered a simple, cost-effective, and accurate new method for equipping self-driving cars with the tools needed to perceive 3D objects in their path.

The laser sensors currently used to detect 3D objects in the paths of autonomous cars are bulky, ugly, expensive, energy-inefficient - and highly accurate.

These Light Detection and Ranging (LiDAR) sensors are affixed to cars' roofs, where they increase wind drag, a particular disadvantage for electric cars. They can add around $10,000 to a car's cost. Despite their drawbacks, most experts have considered LiDAR sensors the only plausible way for self-driving vehicles to safely perceive pedestrians, cars and other hazards on the road.

Now, Cornell researchers have discovered that a simpler method, using two inexpensive cameras on either side of the windshield, can detect objects with nearly LiDAR's accuracy and at a fraction of the cost. The researchers found that analyzing the captured images from a bird's eye view rather than the more traditional frontal view more than tripled their accuracy, making stereo cameras a viable and low-cost alternative to LiDAR.
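The core idea, turning per-pixel stereo depth into a 3D point cloud and then analyzing it from above, can be sketched roughly as follows. The camera intrinsics and depth values here are made-up placeholders (real pipelines use dense depth maps predicted by a stereo network), so treat this as an illustration of the geometry, not the paper's implementation:

```python
# Rough sketch of the "pseudo-LiDAR" idea: back-project per-pixel depth
# into 3D points with a pinhole camera model, then bin the points into a
# bird's-eye-view (top-down) occupancy grid. Intrinsics and depths are
# illustrative placeholders, not values from the paper.

FX = FY = 700.0        # focal lengths in pixels (hypothetical)
CX, CY = 320.0, 240.0  # principal point (hypothetical)

def backproject(u, v, depth):
    """Pixel (u, v) with depth z -> camera-frame 3D point (x, y, z)."""
    x = (u - CX) * depth / FX
    y = (v - CY) * depth / FY
    return x, y, depth

def bev_grid(points, cell=0.5, x_range=(-20, 20), z_range=(0, 40)):
    """Collapse 3D points onto a top-down grid of occupied cells,
    discarding the height (y) coordinate."""
    occupied = set()
    for x, _, z in points:
        if x_range[0] <= x < x_range[1] and z_range[0] <= z < z_range[1]:
            col = int((x - x_range[0]) / cell)
            row = int((z - z_range[0]) / cell)
            occupied.add((row, col))
    return occupied

# A few fake "stereo" pixels with estimated depths in meters
pixels = [(300, 250, 10.0), (310, 250, 10.2), (500, 260, 25.0)]
points = [backproject(u, v, d) for u, v, d in pixels]
cells = bev_grid(points)
print(f"{len(points)} points -> {len(cells)} occupied BEV cells")
```

Viewing the points from above, rather than as a frontal depth image, keeps object shapes and distances metrically consistent, which is the intuition behind the accuracy gain the researchers describe.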

"One of the essential problems in self-driving cars is to identify objects around them - obviously that's crucial for a car to navigate its environment," said Kilian Weinberger, associate professor of computer science and senior author of the paper, which will be presented at the 2019 Conference on Computer Vision and Pattern Recognition, June 15-21 in Long Beach, California.

"The common belief is that you couldn't make self-driving cars without LiDARs," Weinberger said. "We've shown, at least in principle, that it's possible."

The first author of the paper is Yan Wang, doctoral student in computer science.

Ultimately, Weinberger said, stereo cameras could potentially be used as the primary way of identifying objects in lower-cost cars, or as a backup method in higher-end cars that are also equipped with LiDAR.

Credit: 
Cornell University

CU School of Medicine scientist helps create international database of women scientists

AURORA, Colo. (April 23, 2019) - A database of women scientists that was created a year ago by a team led by a CU School of Medicine postdoctoral fellow has grown to list more than 7,500 women and is featured in an article published today in PLOS Biology.

The "Request a Woman Scientist" database was created to address concerns that women's scientific expertise is often excluded at professional gatherings.

"The idea came from repeated experiences of seeing all men panels ('manels') and women's scientific expertise often excluded in the public realm," writes Elizabeth McCullagh, PhD, a postdoctoral fellow in the Department of Physiology and Biophysics on the Anschutz Medical Campus, and her co-authors.

The article, "Request a woman scientist: A database for diversifying the public face of science," is published today in the peer-reviewed journal PLOS Biology.

According to a 2017 study that analyzed colloquium speakers at 50 prestigious universities, men were invited to give twice as many talks about their research as women. When asked why, the event organizers often repeated the same explanation: "We tried to find a woman to speak on this panel, but we didn't know any women who work on this topic."

To combat the misperception that women are not engaged in a range of scientific activities, McCullagh and her colleagues created the Request a Woman Scientist database to connect educational institutions, policymakers, the media, the public, and others with women scientists across disciplines around the world.

Women listed in the database have indicated their willingness to speak with students or the media, consult on a project, sit on a panel or serve as a conference keynote speaker.

Between its launch in January 2018 and November 2018, when data were generated for the PLOS Biology article, more than 7,500 women from 133 countries signed up and the platform was accessed more than 100,000 times by journalists, conference organizers, school teachers, and other scientists. Already, journalists from The Atlantic, Grist, and online National Geographic have relied on the database for sources.

To be listed, women scientists fill out an online form and members of the group 500 Women Scientists vet the entries by verifying that the submitted information is accurate. The database lists women who are in a science, technology, engineering, math, and medicine (STEMM) field.

500 Women Scientists is a grassroots organization started by four women who met in graduate school at CU Boulder and who maintained friendships and collaborations after jobs and life took them away from Boulder. The group's mission is to make science open, inclusive, and accessible. When they published an open letter in November 2016, the group's founders set an aspirational goal of collecting 500 signatures, which they surpassed within hours of posting the letter. More information about 500 Women Scientists is available at http://www.500womenscientists.org.

"Our goal is to increase representation of women scientists in society and change perceptions of what a scientist looks like," said McCullagh. "As our database grows, we plan to make it easier to use so that women scientists are recognized for their significant contributions to science and our understanding of the world."

Six authors, including McCullagh, are listed as authors of the article.

Credit: 
University of Colorado Anschutz Medical Campus

Information technology can support antimicrobial stewardship programs

NEW YORK (April 23, 2019) -- The incorporation of information technology (IT) into an antimicrobial stewardship program can help improve efficiency of the interventions and facilitate tracking and reporting of key metrics, including outcomes, according to a Society for Healthcare Epidemiology of America (SHEA) white paper published in the society's journal Infection Control & Hospital Epidemiology.

"When used intentionally, information technology can help ease the growing demands placed on healthcare systems to meet antimicrobial stewardship standards and reporting requirements, even as financial and personnel resources are reduced," said Kristi Kuper, PharmD, BCPS, senior clinical manager for infectious diseases in the Center for Pharmacy Practice Excellence at Vizient and lead author of the white paper.

The paper, The Role of Electronic Health Record and "Add-On" Clinical Decision Support Systems to Enhance Antimicrobial Stewardship Programs, provides a review of the stewardship-related functionality within these IT systems. The paper also describes how the platforms can be used to improve antimicrobial use and identifies how this technology can support current and potential future antimicrobial stewardship regulatory and accreditation standards. It also suggests enhancements to existing systems in use today.

Beyond recommendations on how best to select, implement, and utilize EHR and clinical decision-support tools, the authors outline several recommendations to help close the gaps in existing systems including:

Creating more nimble systems for non-acute settings such as primary care clinics, surgery centers, and outpatient dialysis centers

Improving documentation processes for clinical decision support and EHR tools to reduce the provider burden

Enhancing the ability to track and report patient outcomes

Establishing user networks to share best practices and reduce redundancies to help increase the efficiency of the development of rules and reports

"While existing systems may present challenges, when used optimally, informatics can create readily available tools for local and national reporting, help guide appropriate antimicrobial prescribing that improves selection, dosing, and duration of therapy, and serve as an educational reference for trainees and providers," said Kuper.

Credit: 
Society for Healthcare Epidemiology of America

Calcium deficiency in cells due to ORAI1 gene mutation leads to damaged tooth enamel

image: High-magnification of tooth enamel in normal (control) mice (left) and mice with ORAI1 gene mutation (right) by scanning electron microscope.

Image: 
Reprinted with permission from Eckstein et al., Sci. Signal. 12, eaav4663 (2019).

A mutation in the ORAI1 gene--studied in a human patient and mice--leads to a loss of calcium in enamel cells and results in defective dental enamel mineralization, finds a study led by researchers at NYU College of Dentistry.

The study, published April 23 in Science Signaling, identifies ORAI1 as the dominant protein for calcium influx and reveals the mechanisms by which calcium influx affects enamel cell function and the formation of tooth enamel.

Calcium is critical for many cellular functions, including mineralizing teeth and bones. Calcium enters cells via ORAI proteins, which form pores in a cell's plasma membrane to enable calcium influx when activated.

"Our previous research has shown that deficiencies in the modulation of calcium influx or calcium transport result in dental enamel malformation," said Rodrigo Lacruz, PhD, associate professor of basic science and craniofacial biology at NYU College of Dentistry and the study's senior author. "Despite this knowledge, the biology of enamel cells as it relates to the role of calcium signaling remains poorly understood."

Studies show that several genes, including the ORAI genes (which encode ORAI proteins), are involved in the formation of tooth enamel. Enamel--the hard, outer layer of teeth--first forms as a soft, gel-like matrix. ORAI proteins then help the enamel-forming cells mineralize this matrix.

Mutations in the human ORAI1 gene result in immune dysfunction and immune diseases, but people with ORAI1 mutations also have defects in their tooth enamel. In this study, the researchers investigated the case of a patient with a complex medical history, including combined immunodeficiency and a mutation in the ORAI1 gene. Throughout his childhood, the patient had defects on his tooth enamel, resulting in severe cavities and related dental abscesses. Based on his clinical presentation, the researchers concluded that the ORAI1 mutation likely accounted for the defective enamel mineralization.

Given the lack of dental samples from patients with ORAI1 mutations, Lacruz and his colleagues then developed mouse models to study the role of ORAI proteins in enamel formation, both by observing tooth enamel and examining its influence on the environment inside enamel cells.

The researchers studied the ORAI family of proteins (ORAI1, ORAI2, and ORAI3) and genetic mutations in the corresponding genes to investigate the mechanism by which calcium is modulated by each of these proteins. When mice had a mutation in the ORAI1 gene and were therefore deficient in ORAI1 protein, calcium entry into enamel cells was significantly reduced (by roughly 50 percent), and tooth enamel was abnormal, including cracks in the outer enamel layer. By contrast, mice with ORAI2 mutations and ORAI2 deficiency showed an increase in calcium by approximately 30 percent in the enamel cells, which did not result in obvious enamel defects. This suggests that ORAI1 is the dominant channel for modulating the influx of calcium into enamel cells.

To better understand how calcium influx--and conversely, deficiency in calcium--changes the functioning of enamel cells, the researchers examined the activity of cells lacking ORAI1. They found that calcium dysregulation in ORAI1-deficient cells affects their function at multiple levels, including increased mitochondrial respiration and subsequent changes in redox balance. An elevation in reactive oxygen species can be detrimental to cells; to protect proteins in this more oxidizing intracellular environment, the cells promote a mechanism called S-glutathionylation.

The findings provide a foundational understanding of what happens in enamel cells, which could help create a pathway for researchers interested in regenerating tooth enamel or developing therapies to treat patients with enamel defects.

"We've long observed deficiencies in tooth enamel associated with abnormal calcium levels in the enamel cells, but can now detail a mechanism for how this occurs," said Lacruz.

Credit: 
New York University

Metformin may help patients maintain weight loss long-term

1. Metformin may help patients maintain weight loss long-term

Abstract: http://annals.org/aim/article/doi/10.7326/M18-1605

Editorial: http://annals.org/aim/article/doi/10.7326/M19-0782

URLs go live when the embargo lifts

In the Diabetes Prevention Program (DPP) clinical trial and its long-term follow-up study, among the persons who lost at least 5 percent of their body weight during the first year, long-term maintenance of weight loss was more likely if they had been assigned to treatment with metformin than with placebo or lifestyle intervention. Being older and losing a greater amount of weight in the first year were the most consistent predictors of lasting weight loss. Findings from a cohort study are published in Annals of Internal Medicine.

Weight loss plays a central role in efforts to prevent or delay type 2 diabetes. As such, identifying good predictors of long-term weight loss could lead to improved weight management. The DPP was a randomized controlled trial that compared weight loss and diabetes prevention with metformin, intensive lifestyle intervention (ILS), or placebo among more than 3,000 participants with prediabetes, and its Outcomes Study (DPPOS) observed patients after the masked treatment phase ended. The DPP/DPPOS is the largest and longest-running study of metformin for prevention of diabetes.

After the first year, twice as many participants in the ILS group versus the metformin group lost at least 5 percent of their body weight. However, those who were assigned to the metformin group had greater success at maintaining their weight loss between years 6 and 15, while patients were still being followed. The researchers noted that greater weight loss at one year predicted long-term weight loss across all groups. Early weight loss was also important with regard to diabetes incidence. The researchers found that cumulative diabetes incidence rates over 15 years were lower among those who lost at least 5 percent of their weight in the first year.

According to the study authors, future research should focus on whether metformin could be a useful intervention for weight loss maintenance after initial weight loss with lifestyle interventions, antiobesity drugs or devices, or bariatric surgery.

Media contact: For an embargoed PDF, please contact Lauren Evans at laevans@acponline.org. To speak with the lead author, Kishore M. Gadde, MD, please contact him via email at Kishore.gadde@pbrc.edu, or Lisa Stansbury at Lisa.Stansbury@pbrc.edu.

2. Poor glycemic control raises risk for preterm birth

Abstract: http://annals.org/aim/article/doi/10.7326/M18-1974

URLs go live when the embargo lifts

Risk for preterm birth was strongly linked to maternal HbA1c around the time of conception. Even women with HbA1c levels consistent with recommended targets appeared to be at risk for preterm delivery and other adverse pregnancy outcomes. Findings from a cohort study are published in Annals of Internal Medicine.

Type 1 diabetes is known to complicate pregnancy, but the relationship between maternal glycemic control and preterm birth is less clear. At least three major organizations now recommend a target HbA1c level of less than 6.5 percent in early pregnancy. However, because these recommendations originate from research regarding congenital malformations and large-for-gestational-age infants, whether such strict glycemic control prevents or reduces excess risk for preterm birth in women with type 1 diabetes is not clear.

Researchers at Karolinska Institutet in Stockholm, Sweden, used nationwide Swedish registers to examine adverse pregnancy outcomes in more than 2,400 singleton deliveries according to HbA1c levels in mothers with type 1 diabetes. The researchers also examined neonatal death, large for gestational age, macrosomia, infant birth injury, hypoglycemia, respiratory distress, low Apgar score, and stillbirth. They found that increasing HbA1c levels were associated with progressively increased risks for preterm birth, as well as other adverse pregnancy outcomes. Most of the elevated risk for preterm birth was attributable to medically indicated preterm births, although spontaneous preterm births also increased with higher HbA1c levels.

According to the researchers, these findings are important for developing future guidelines and informing clinicians about the risks associated with poor glycemic control. However, the results do not support the idea that further lowering the recommended HbA1c level during early pregnancy will eliminate the risk for preterm birth.

Media contact: For an embargoed PDF, please contact Lauren Evans at laevans@acponline.org. To speak with the lead author, Jonas F. Ludvigsson, MD, PhD, please contact Karolinska Institutet's Press Office at pressinfo@ki.se or +46 8 524 860 77.

3. National Institutes of Health Pathways to Prevention Workshop Identifies Gaps in Evidence for Effectiveness and Safety of Long-term Osteoporosis Drug Therapies

Summary of review and position paper

Review: http://annals.org/aim/article/doi/10.7326/M19-0533

Position Paper: http://annals.org/aim/article/doi/10.7326/M19-0961

Editorial: http://annals.org/aim/article/doi/10.7326/M19-1112

URLs go live when the embargo lifts

About 10 million U.S. adults aged 50 years or older have osteoporosis and that number is expected to increase as the population ages. Several osteoporosis drug treatments reduce fractures in the short-term (up to 3 years), but optimal long-term use is uncertain. A systematic review evaluating the benefits and harms of long-term osteoporosis drug treatment and of discontinuation of treatment, or drug holidays, and an accompanying position paper are published in Annals of Internal Medicine.

Researchers funded by the National Institutes of Health Office of Disease Prevention and the Agency for Healthcare Research and Quality who were based at the Geriatric Research Education & Clinical Center in the Minneapolis VA Health Care System and the University of Minnesota Evidence-based Practice Center reviewed 35 trials and 13 observational studies. They found that for mostly treatment-naïve postmenopausal women, compared with placebo, alendronate for 4 years reduced vertebral and nonvertebral fractures in women with osteoporosis, and zoledronic acid for 6 years reduced these fractures in women with osteopenia or osteoporosis. Controlled, observational studies suggested that long-term bisphosphonate treatment may increase risk for serious but still rare adverse events, such as atypical femoral fracture (AFF) and osteonecrosis of the jaw (ONJ). Risk for AFF appears greater with longer use. In women with osteoporosis, raloxifene for 4 years compared with placebo reduced vertebral fractures, but not nonvertebral fractures, and increased risk for deep venous thrombosis and pulmonary embolism. In women with unknown osteoporosis or osteopenia status, oral hormone therapy for 5 to 7 years reduced clinical fractures and hip fractures compared with placebo, but this was offset by increased risk for cardiovascular disease and cognitive impairment, and estrogen-progestin increased risk for invasive breast cancer. Data were insufficient to draw conclusions about the benefits and harms of long-term use of other FDA-approved osteoporosis drugs, including risedronate, ibandronate, denosumab, teriparatide, and abaloparatide.

Data were less clear for informing clinicians about the balance of fracture benefits to harms with bisphosphonate drug holidays. This was because neither alendronate continuation beyond 5 years nor zoledronic acid beyond 3 years versus discontinuation reduced hip or other nonvertebral fractures and only inconsistently reduced vertebral fractures. Also, scant observational data suggested bisphosphonate continuation may increase risk of AFF compared with discontinuation.

According to the National Institutes of Health Pathways to Prevention Workshop report, which focused on research gaps for long-term drug therapies for osteoporotic fracture prevention, the evidence was insufficient to determine whether implementing drug holidays in some patients could help to reduce the risk of some serious adverse events. The Workshop committee recommends further research in this area, as well as research into newer treatments and into the patient and clinical barriers that prevent people from getting screened and adhering to osteoporosis drug treatment regimens.

Media contact: For an embargoed PDF, please contact Lauren Evans at laevans@acponline.org. To interview the lead author, Howard Fink, MD, MPH, please contact Ralph Heussner at ralph.heussner@va.gov.

Credit: 
American College of Physicians

Was the restaurant really that bad, or was it just the rain?

COLUMBUS, Ohio - There are a few things that will result in poor customer reviews of a restaurant: bad service, bad food - and bad weather.

A study of 32 Florida restaurants found that customers left more negative remarks on comment cards on days when it was raining than on days when it was dry.

Results showed the odds of patrons leaving very negative comments versus very positive comments were 2.9 times greater on rainy days.
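To make the statistic concrete, here is a minimal sketch of what an odds ratio of 2.9 means, using invented counts (not the study's data):

```python
# Hypothetical comment-card counts, invented purely for illustration.
rainy = {"very_negative": 29, "very_positive": 100}
dry = {"very_negative": 10, "very_positive": 100}

# Odds of a very negative (vs. very positive) comment on each kind of day.
odds_rainy = rainy["very_negative"] / rainy["very_positive"]   # 0.29
odds_dry = dry["very_negative"] / dry["very_positive"]         # 0.10

# The odds ratio compares the two: here, 2.9 times greater on rainy days.
odds_ratio = odds_rainy / odds_dry
```

With these made-up counts, the rainy-day odds of a very negative comment are 2.9 times the dry-day odds, matching the figure reported in the study.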

In two other online studies conducted in other parts of the country, results suggested that unpleasant weather put people in bad moods, which in turn was linked to less positive views of the restaurants they had visited.

"Restaurant managers may see more than the usual bad reviews on certain days, and it may have nothing to do with the service or the quality of the food," said Milos Bujisic, co-author of the study and assistant professor of hospitality management at The Ohio State University.

"Restaurants can't control the weather, but it may affect how customers review them."

The research appears online in the Journal of Hospitality and Tourism Research and will be published in a future print edition.

While weather was not the most important factor in how customers reviewed their dining experience, it can't be ignored, said study co-author Vanja Bogicevic, a visiting assistant professor of hospitality management at Ohio State.

"It may be a smaller factor, but it is something that managers should pay attention to," Bogicevic said.

In the first study, researchers examined the comment cards left at the Florida restaurants, all part of the same national fast-casual chain.

The researchers rated the comments on a five-point scale from 1 (very negative) to 5 (very positive). They also examined weather data from the National Climatic Data Center for each restaurant's location on the days comment cards were left.

They examined 14 different weather variables, but only three were related to customer comments - rain, temperature and barometric pressure.

Higher temperatures - which in Florida can often mean uncomfortably hot conditions - were linked to more negative comments.

Higher barometric pressure was also connected to negative comments in Florida, a finding that probably differs from much of the country, the researchers said, because rising pressure is usually associated with fair weather. In warmer climates, however, high barometric pressure is often linked to higher daytime temperatures.

Two other studies conducted online offered more insight into exactly how weather affected customer evaluations.

In one study, 158 people from around the country who had visited a restaurant within the previous 24 hours were asked to rate and describe the weather conditions right before their visit. They also rated their own mood and what kind of "word-of-mouth" review they would give the restaurant - in other words, whether they would recommend the restaurant and tell others positive things about their experience.

Results showed that people who described the weather as more pleasant also rated their mood more positively. Better moods - and not the weather itself - were related to more positive word-of-mouth.

A third study specifically targeted people living in the Midwest, Northeast and Northwest regions of the United States, where the weather is variable over the year.

This study involved 107 people. Some were asked whether they had visited a restaurant during pleasant weather in the previous seven days, and others were asked whether they had visited one in unpleasant conditions (very cold, rainy or snowy weather).

Participants who reported eligible conditions for the study then answered questions about their mood that day, their dining experience and whether they would give good word-of-mouth to the restaurant.

Similar to the previous study, pleasant weather elevated consumers' moods, which was linked to a better rating of their restaurant experience and better word-of-mouth compared to those who visited in unpleasant weather.

Bujisic noted that bad weather may affect not only the mood of customers, but also the wait staff and others who serve the customers.

"A rainy day may put employees in a bad mood and that will affect their service," he said. "Managers need to explain that to their employees and work to keep them motivated."

In addition, managers may want to find ways to boost customers' moods during unpleasant weather, Bogicevic said.

"Think about creative strategies to make customers happy. Maybe offer a free drink or play more upbeat music," she said.

Credit: 
Ohio State University

Photonics: The curious case of the disappearing cylinders

image: (a) Light with a wavelength of 700 nm traveling from bottom to top is distorted when the radius of the cylinder (in the middle) is 175 nm. (b) There is hardly any distortion when the cylinder has a radius of 195 nm. These images correspond to the conditions for invisibility predicted by the theoretical calculation.

Image: 
<i>Applied Physics Express</i>

A pair of researchers at Tokyo Institute of Technology (Tokyo Tech) describes a way of making a submicron-sized cylinder disappear without using any specialized coating. Their findings could enable invisibility of natural materials at optical frequency [1] and eventually lead to a simpler way of enhancing optoelectronic devices, including sensing and communication technologies.

Making objects invisible is no longer the stuff of fantasy but a fast-evolving science. 'Invisibility cloaks' using metamaterials[2] -- engineered materials that can bend rays of light around an object to make it undetectable -- now exist, and are beginning to be used to improve the performance of satellite antennas and sensors. Many of the proposed metamaterials, however, only work at limited wavelength ranges such as microwave frequencies.

Now, Kotaro Kajikawa and Yusuke Kobayashi of Tokyo Tech's Department of Electrical and Electronic Engineering report a way of making a cylinder invisible without a cloak for monochromatic illumination at optical frequency -- a broader range of wavelengths including those visible to the human eye.

They first explored what happens when a light wave hits an imaginary cylinder of infinite length. Based on a classical electromagnetic theory called Mie scattering[3], they visualized the relationship between the light-scattering efficiency of the cylinder and its refractive index[4]. They looked for a region indicating very low scattering efficiency, which they knew would correspond to the cylinder's invisibility.

After identifying a suitable region, they determined that invisibility would occur when the refractive index of the cylinder ranges from 2.7 to 3.8. Some useful natural materials fall within this range, such as silicon (Si), aluminum arsenide (AlAs) and gallium arsenide (GaAs), which are commonly used in semiconductor technology.
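The kind of calculation involved can be sketched with the textbook Mie formulas for an infinite cylinder at normal incidence (Bohren and Huffman's case I, electric field parallel to the axis). This is an illustrative sketch, not the authors' actual computation, and the scanned parameter values are assumptions based on the figures mentioned in the article:

```python
import numpy as np
from scipy.special import jv, jvp, hankel1, h1vp

def qsca_tm(m, x, nmax=25):
    """Scattering efficiency of an infinite cylinder at normal incidence,
    E-field parallel to the axis (textbook Mie "case I" coefficients).
    m: relative refractive index; x: size parameter 2*pi*radius/wavelength."""
    def b(n):
        num = m * jv(n, m * x) * jvp(n, x) - jv(n, x) * jvp(n, m * x)
        den = m * jv(n, m * x) * h1vp(n, x) - hankel1(n, x) * jvp(n, m * x)
        return num / den
    return (2.0 / x) * (abs(b(0))**2
                        + 2.0 * sum(abs(b(n))**2 for n in range(1, nmax + 1)))

# Size parameter for a 195-nm-radius cylinder illuminated at 700 nm,
# the geometry shown in the article's figure.
x = 2 * np.pi * 195e-9 / 700e-9

# Scanning the refractive-index range reported in the paper is how one
# would look for the low-scattering (near-invisibility) points.
efficiencies = {m: qsca_tm(m, x) for m in np.linspace(2.7, 3.8, 12)}
```

A dip in the computed efficiency as the index is scanned would mark a near-invisible configuration; as a sanity check, a cylinder with the same index as its surroundings (m = 1) scatters nothing at all.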

Thus, in contrast to the difficult and costly fabrication procedures often associated with exotic metamaterial coatings, the new approach could provide a much simpler way to achieve invisibility.

The researchers used numerical modeling based on the Finite-Difference Time-Domain (FDTD) method to confirm the conditions for achieving invisibility. (See Figure/ Animation.) By taking a close look at the magnetic field profiles, they inferred that "the invisibility stems from the cancellation of the dipoles generated in the cylinder."

Although rigorous calculations of the scattering efficiency have so far only been possible for cylinders and spheres, Kajikawa notes there are plans to test other structures, but these would require much more computing power.

To verify the current findings in practice, it should be relatively easy to perform experiments using tiny cylinders made of silicon and gallium arsenide. Kajikawa says: "We hope to collaborate with research groups who are now focusing on such nanostructures. Then, the next step would be to design novel optical devices."

Potential optoelectronic applications may include new kinds of detectors and sensors for the medical and aerospace industries.

Credit: 
Tokyo Institute of Technology

Researchers find high-risk genes for schizophrenia

Using a unique computational "framework" they developed, a team of scientist cyber-sleuths in the Vanderbilt University Department of Molecular Physiology and Biophysics and the Vanderbilt Genetics Institute (VGI) has identified 104 high-risk genes for schizophrenia.

Their discovery, which was reported in the journal Nature Neuroscience, supports the view that schizophrenia is a developmental disease, one which potentially can be detected and treated even before the onset of symptoms.

"This framework opens the door for several (research) directions," said the paper's senior author, Bingshan Li, PhD, associate professor of Molecular Physiology and Biophysics and an investigator in the Vanderbilt Genetics Institute (VGI).

One direction is to determine whether drugs already approved for other, unrelated diseases could be "repurposed" to improve the treatment of schizophrenia. Another is to determine in which brain cell types these genes are active along the developmental trajectory.

Ultimately, Li said, "I think we'll have a better understanding of how prenatally these genes predispose risk and that will give us a hint of how to potentially develop intervention strategies. It's an ambitious goal ... (but) by understanding the mechanism, drug development could be more targeted."

Schizophrenia is a chronic, severe mental disorder characterized by hallucinations and delusions, "flat" emotional expression and cognitive difficulties. Symptoms usually start between the ages of 16 and 30. Antipsychotic medications can relieve symptoms but there is no cure for the disease.

Genetics plays a major role. While schizophrenia occurs in 1 percent of the population, the risk rises sharply to 50 percent for a person whose identical twin has the disease.

Recent genome-wide association studies (GWAS) have identified more than 100 loci, or fixed positions on different chromosomes, associated with schizophrenia. That may not be where high-risk genes are located, however. The loci could be regulating the activity of the genes at a distance -- nearby or very far away.

To solve the problem Li, with first authors Rui Chen, PhD, research instructor in Molecular Physiology and Biophysics, and postdoctoral research fellow Quan Wang, PhD, developed a computational "framework" they called the Integrative Risk Genes Selector.

The framework pulled the top genes from previously reported loci based on their cumulative supporting evidence from multi-dimensional genomics data, as well as gene networks.

Which genes have high rates of mutation? Which are expressed prenatally? These are the kinds of questions a genetic "detective" might ask to identify and narrow the list of "suspects."
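The "cumulative supporting evidence" idea can be illustrated with a toy scoring scheme. Everything below - gene names, evidence layers, and scores - is invented for illustration; the actual Integrative Risk Genes Selector framework is far more sophisticated:

```python
# Toy illustration of evidence aggregation: rank candidate genes at a locus
# by summing normalized scores across several data dimensions.
# All names and numbers here are invented, not from the study.
evidence = {
    "GENE_A": {"mutation_rate": 0.9, "prenatal_expression": 0.8, "network": 0.7},
    "GENE_B": {"mutation_rate": 0.4, "prenatal_expression": 0.9, "network": 0.2},
    "GENE_C": {"mutation_rate": 0.1, "prenatal_expression": 0.2, "network": 0.3},
}

# Genes with the highest cumulative evidence rise to the top of the list.
ranked = sorted(evidence, key=lambda g: sum(evidence[g].values()), reverse=True)
```

In this sketch GENE_A ranks first because it scores well across every dimension, mirroring the intuition that a true risk gene should be supported by multiple independent lines of genomic evidence.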

The result was a list of 104 high-risk genes, some of which encode proteins targeted in other diseases by drugs already on the market. One gene is suspected in the development of autism spectrum disorder. "Schizophrenia and autism have shared genetics," Chen said.

Much work remains to be done. But, said Chen, "our framework can push GWAS a step forward ... to further identify genes." It also could be employed to help track down genetic suspects in other complex diseases.

Credit: 
Vanderbilt University Medical Center

How slippery surfaces allow sticky pastes and gels to slide

image: A gel-like yield stress fluid, top, moves as a plug without shearing in a tube with the new surface coating. At bottom, the same fluid is seen shearing while it flows in an uncoated tube, where part of the fluid gets stuck to the tube while part of it continues to flow.

Image: 
Image courtesy of the researchers

An MIT research team that has already conquered the problem of getting ketchup out of its bottle has now tackled a new category of consumer and manufacturing woe: how to get much thicker materials to slide without sticking or deforming.

The slippery coatings the team has developed, called liquid-impregnated surfaces, could have numerous advantages, including eliminating production waste that results from material that sticks to the insides of processing equipment. They might also improve the quality of products ranging from bread to pharmaceuticals, and even improve the efficiency of flow batteries, a rapidly developing technology that could help to foster renewable energy by providing inexpensive storage for generated electricity.

These surfaces are based on principles initially developed to help foods, cosmetics, and other viscous liquids slide out of their containers, as devised by Kripa Varanasi, a professor of mechanical engineering at MIT, along with former students Leonid Rapoport PhD '18 and Brian Solomon PhD '16. The new work is described in the journal ACS Applied Materials and Interfaces.

Like the earlier surfaces they developed, which led to the creation of a spinoff company called LiquiGlide, the new surfaces are based on a combination of a specially textured surface and a liquid lubricant that coats the surface and remains trapped in place through capillary action and other intermolecular forces associated with such interfaces. The new paper explains the fundamental design principles that can achieve almost 100 percent friction reduction for these gel-like fluids.

Needing a squeeze

Such materials, known as yield-stress fluids, including gels and pastes, are ubiquitous. They can be found in consumer products such as food, condiments, and cosmetics, and in products in the energy and pharmaceuticals industries. Unlike other fluids such as water and oils, these materials will not start to flow on their own, even when their container is turned upside down. Starting the flow requires an input of energy, such as squeezing the container.

But that squeezing has its own effects. For example, bread-making machinery typically includes scrapers that constantly push the sticky dough away from the sides of its container, but that constant scraping can result in over-kneading and a denser loaf. A slippery container that requires no scraping could thus produce better-tasting bread, Varanasi says. By using this system, "beyond getting everything out of the container, you now add higher quality" of the resulting product.

That may not be critical where bread is concerned, but it can have great impact on pharmaceuticals, he says. The use of mechanical scrapers to propel drug materials through mixing tanks and pipes can interfere with the effectiveness of the medicine, because the shear forces involved can damage the proteins and other active compounds in the drug.

By using the new coatings, in some cases it's possible to achieve a 100 percent reduction in the drag the material experiences -- equivalent to "infinite slip," Varanasi says.

"Generally speaking, surfaces are enablers," says Rapoport. "Superhydrophobic surfaces, for example, enable water to roll easily, but not all fluids can roll. Our surfaces enable fluids to move by whichever way is more preferable for them -- be it rolling or sliding. In addition we found that yield-stress fluids can move on our surfaces without shearing, essentially sliding like solid bodies. This is very important when you want to maintain the integrity of these materials when they are being processed."

Like the earlier version of slippery surfaces Varanasi and his collaborators created, the new process begins by making a surface that is textured at the nanoscale, either by etching a series of closely spaced pillars or walls on the surface, or mechanically grinding grooves or pits. The resulting texture is designed to have such tiny features that capillary action -- the same process that allows trees to draw water up to their highest branches through tiny openings beneath the bark -- can act to hold a liquid, such as a lubricating oil, in place on the surface. As a result, any material inside a container with this kind of lining essentially only comes in contact with the lubricating liquid, and slides right off instead of sticking to the solid container wall.

The new work described in this paper details the principles the researchers came up with to enable the optimal selection of surface texturing, lubricating material, and manufacturing process for any specific application with its particular combination of materials.

Helping batteries to flow

Another important application for the new coatings is in a rapidly developing technology called flow batteries. In these batteries, solid electrodes are replaced by a slurry of tiny particles suspended in liquid, which has the advantage that the capacity of the battery can be increased at any time simply by adding bigger tanks. But the efficiency of such batteries can be limited by the flow rates.

Using the new slippery coatings could significantly boost the overall efficiency of such batteries. Varanasi worked with MIT professors Gareth McKinley and Yet-Ming Chiang to develop such a system, in an effort led by Solomon and Xinwei Chen, a former postdoc in Chiang's lab.

These coatings could resolve a conundrum that flow battery designers have faced, because they needed to add carbon to the slurry material to improve its electrical conductivity, but the carbon also made the slurry much thicker and interfered with its movement, leading to "a flow battery that couldn't flow," Varanasi says.

"Previously flow batteries had a trade-off in that as you add more carbon particles the slurry becomes more conductive, but it also becomes thicker and much more challenging to flow," says Solomon. "Using slippery surfaces lets us have the best of both worlds by allowing flow of thick, yield-stress slurries."

The improved system allowed the use of a flow electrode formulation that resulted in a fourfold increase in capacity and an 86 percent savings in mechanical power, compared with the use of traditional surfaces. These results were described recently in the journal ACS Applied Energy Materials.

"Apart from fabricating a flow battery device which incorporates the slippery surfaces, we also laid out design criteria for their electrochemical, chemical, and thermodynamic stability," explains Solomon. "Engineering surfaces for a flow battery opens up an entirely new branch of applications that can help meet future energy storage demand."

Credit: 
Massachusetts Institute of Technology

Poor mental health in bisexual people explained

image: The largest study of bisexual people in the world to date, led by La Trobe University, has examined why bisexual people experience higher rates of psychological distress than heterosexual and homosexual people.

Image: 
Peter Salanki

The largest study of bisexual people in the world to date, led by La Trobe University, has examined why bisexual people experience higher rates of psychological distress than heterosexual and homosexual people.

Questioning more than 2,600 bisexual people across Australia, the Who I Am study aimed to uncover the reasons for poor mental health in bisexual people. The study found significant links between poor mental health and the following factors:

Being in a heterosexual relationship;

Perceiving one's sexuality to be bad or wrong;

Believing that one's partner's support or understanding of one's sexuality is low.

The study has already instigated Bi+ Australia - the first national organisation set up to improve the mental health of bisexual Australians through support, education and research.

Research Officer Julia Taylor, who led the research from La Trobe's Australian Research Centre in Sex, Health and Society (ARCSHS), says the study is proof that more support is needed to improve the mental health of bisexual people.

"Attraction to more than one gender is very common among Australian adults and most health practitioners are unaware of the very poor mental health associated with this group," Mrs Taylor said.

"While there's been an increased focus on lesbian and gay health in recent years, a substantial gap in knowledge specifically on bisexual health needs still remains.

"Through the Who I Am study, we wanted to address this gap and provide GPs and other health professionals with more information on bisexual mental health.

"The findings have given a unique insight into what challenging life experiences bisexual people are going through and how this is impacting their mental health.

"This study from La Trobe, along with the organisation Bi+ Australia, is making a real impact in enhancing the understanding, acceptance, inclusion and celebration of bisexuality in Australia, and hopefully the world."

Poor mental health in bisexual people:

The study found striking statistics demonstrating the poor mental health of cisgender (those whose gender identity is congruent with their sex assigned at birth) bisexual people:

One in four have attempted suicide;

Nearly 80 per cent had considered self-harm or thought about suicide;

Over 60 per cent have high or very high current psychological distress, with 40 per cent reporting having had depression in the past;

Transgender and gender diverse bisexual people experienced even poorer mental health and these findings will be released in the coming months.

Credit: 
La Trobe University

Thermodynamic magic enables cooling without energy consumption

image: Theoretically, this experimental device could turn boiling water to ice, without using any energy.

Image: 
Andreas Schilling, UZH

Physicists at the University of Zurich have developed an amazingly simple device that allows heat to flow temporarily from a cold to a warm object without an external power supply. Intriguingly, the process initially appears to contradict the fundamental laws of physics.

If you put a teapot of boiling water on the kitchen table, it will gradually cool down. However, its temperature is not expected to fall below that of the table. It is precisely this everyday experience that illustrates one of the fundamental laws of physics - the second law of thermodynamics - which states that the entropy of a closed natural system must increase over time. Or, more simply put: Heat can flow by itself only from a warmer to a colder object, and not the other way round.

Cooling below room temperature

The results of a recent experiment carried out by the research group of Prof. Andreas Schilling in the Department of Physics at the University of Zurich (UZH) appear at first sight to challenge the second law of thermodynamics. The researchers managed to cool a nine-gram piece of copper from over 100°C to significantly below room temperature without an external power supply. "Theoretically, this experimental device could turn boiling water to ice, without using any energy," says Schilling.

Creating oscillating heat currents

To achieve this, the researchers used a Peltier element, a component commonly used, for example, to cool minibars in hotel rooms. These elements can transform electric currents into temperature differences. The researchers had already used this type of element in previous experiments, in connection with an electric inductor, to create an oscillating heat current in which the flow of heat between two bodies perpetually changed direction. In this scenario, heat also temporarily flows from a colder to a warmer object so that the colder object is cooled down further. This kind of "thermal oscillating circuit" in effect contains a "thermal inductor". It functions in the same way as an electrical oscillating circuit, in which the voltage oscillates with a constantly changing sign.

Laws of physics remain intact

Until now, Schilling's team had only operated these thermal oscillating circuits using an energy source. The researchers have now shown for the first time that this kind of thermal oscillating circuit can also be operated "passively", i.e. with no external power supply. Thermal oscillations still occurred and, after a while, heat flowed directly from the colder copper to a warmer heat bath with a temperature of 22°C, without being temporarily transformed into another form of energy. Despite this, the authors were also able to show that the process does not actually contradict any laws of physics. To prove it, they considered the change in entropy of the whole system and showed that it increased with time - fully in accordance with the second law of thermodynamics.
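A toy model makes the entropy bookkeeping concrete. The sketch below is an idealized, greatly simplified caricature of the experiment with invented parameters (arbitrary units): a warm block coupled to a bath through an idealized Peltier element in series with an inductor. The block's temperature oscillates below the bath temperature, yet the total entropy of block plus bath still increases:

```python
import math

# Toy "thermal oscillating circuit" (all parameters invented, arbitrary units).
# Block of heat capacity C_th, bath at T_b, thermal conductance kappa,
# Peltier coefficient alpha, circuit inductance L and resistance R.
# Peltier pumps heat alpha*T*I from block to bath; the Seebeck EMF
# alpha*(T - T_b) drives the current through the inductor.
C_th, L, R, kappa, alpha, T_b = 1.0, 1.0, 0.01, 0.01, 0.1, 300.0
T, I, Q_bath = 400.0, 0.0, 0.0       # block starts hotter than the bath
dt, T0, T_min = 1e-4, 400.0, 400.0

for _ in range(30000):               # forward-Euler integration to t = 3
    dT = (-kappa * (T - T_b) - alpha * T * I) / C_th
    dI = (alpha * (T - T_b) - R * I) / L
    # Heat delivered to the bath: conduction + Peltier (bath side) + Joule.
    Q_bath += (kappa * (T - T_b) + alpha * T_b * I + R * I * I) * dt
    T += dT * dt
    I += dI * dt
    T_min = min(T_min, T)            # the block dips below T_b mid-oscillation

# Entropy change of block plus bath: negative for the cooling block,
# positive for the bath, with conduction and Joule losses guaranteeing
# a net increase, in accordance with the second law.
dS_total = C_th * math.log(T / T0) + Q_bath / T_b
```

In this toy run the block's minimum temperature falls below the 300-unit bath temperature, while dS_total comes out positive: the apparent cold-to-warm heat flow is paid for by irreversible conduction and Joule dissipation elsewhere in the system.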

Potential application still a long way off

Although the team recorded a difference of only about 2°C compared to the ambient temperature in the experiment, this was mainly due to the performance limitations of the commercial Peltier element used. According to Schilling, it would be possible in theory to achieve cooling of up to -47°C under the same conditions, if the "ideal" Peltier element - yet to be invented - could be used: "With this very simple technology, large amounts of hot solid, liquid or gaseous materials could be cooled to well below room temperature without any energy consumption."

The passive thermal circuit could also be used as often as desired, without the need to connect it to a power supply. However, Schilling admits that a large-scale application of the technique is still a long way off. One reason for this is that the Peltier elements currently available are not efficient enough. Furthermore, the current set-up requires the use of superconducting inductors to minimize electric losses.

Established perceptions challenged

The UZH physicist considers the work more significant than a mere "proof-of-principle" study: "At first sight, the experiments appear to be a kind of thermodynamic magic, thereby challenging to some extent our traditional perceptions of the flow of heat."

Credit: 
University of Zurich