
'Five star' hospitals often provide fewer services than other hospitals, new data suggests

image: Comparison of services provided at five star rated hospitals vs. non-five star rated hospitals.

Image: 
Johns Hopkins Medicine

If you're looking for a top-notch hospital with a wide range of services, narrowing your list to hospitals with a five-star patient experience rating might lead you astray. Many five-star hospitals offer fewer services than those without five stars, according to a new study by Johns Hopkins researchers published June 10 in JAMA Internal Medicine.

"If you stay in a hotel with a five-star rating, you generally expect not only better service than in other hotels, but more services, from valet parking and room service to a spa and pool," says Zishan Siddiqui, M.D., assistant professor of medicine at the Johns Hopkins University School of Medicine and first author of the new paper. "But when it comes to hospitals, the five-star category is much less helpful at capturing the services offered."

The U.S. Centers for Medicare & Medicaid Services (CMS) publicly reports data on more than 4,000 hospitals across the country. In the past, raw numbers -- reflecting measures such as patient satisfaction, complication rates and timeliness of care -- were published on the CMS website. In 2016, however, the agency debuted a system in which hospitals are assigned a star rating in several categories, including "patient experiences."

Although data on how the public uses the ratings isn't available, Siddiqui and his colleagues assume that people use them when choosing a facility for their medical care.

"If people are coming in with the same expectations as they have for five-star hotels when they review hospital star ratings, we wondered whether their expectations about getting more services in a five-star hospital would be true," says Siddiqui.

The researchers linked CMS patient experience star ratings with information from the American Hospital Association on the clinical services a hospital offers. Among 2,798 hospitals with patient experience star ratings, 150 hospitals (5.4%) received five stars. The team compared those hospitals to ones that received one through four stars in patient experience.

While 95.3% of other hospitals have emergency departments and 90.6% have intensive care units, only 77.3% of five-star hospitals have emergency departments and only 42.0% have intensive care units. Similarly, five-star hospitals are less likely to have neurology, cardiology, obstetrics and oncology units, among other services. Only 1.7% of five-star hospitals have neonatal intensive care units, compared to 31.5% of other hospitals. The five-star hospitals are also less likely to be teaching hospitals or research hospitals. Even when the team removed specialty hospitals -- such as cardiac and orthopaedic hospitals -- from the analysis, the results were similar: Five-star-rated general medical hospitals offered fewer services than general medical hospitals with lower ratings.

Siddiqui says the findings didn't surprise him. "These patient experience scores are based on the communication and responsiveness of health care workers," he explains. "When a hospital has generally healthy patients who all have a similar set of problems, it's much easier for physicians and nurses to communicate with them and respond to their needs."

Hospitals that have more services -- and therefore more complex patients -- have more challenges predicting patients' needs and are more likely to end up with low scores when patients are surveyed.

"This means hospitals that are seeing these kinds of patients are taking a hit when it comes to their rating," says Siddiqui. But those very hospitals -- with expertise in managing many types of patients -- may be those that people are looking for in a hospital search.

The findings don't necessarily apply to the other star ratings that CMS issues, since the current study only looked at patient experience star ratings. And Siddiqui says the ratings still have value -- a hospital with a four-star patient experience rating will generally have higher standards of communication and responsiveness than one with a one-star rating. But he hopes consumers take the ratings with a grain of salt and look beyond five-star hospitals when choosing their medical care.

"If you're looking for a hospital, I'd recommend using more than one evaluation method after narrowing hospitals based on your clinical needs, experience of family and friends with similar needs, word of mouth and your doctor's recommendation," he says.

Credit: 
Johns Hopkins Medicine

Evolutionary discovery to rewrite textbooks

image: This is a Choanocyte cross section.

Image: 
The University of Queensland

Scientists at The University of Queensland have upended biologists' century-old understanding of the evolutionary history of animals.

Using new technology to investigate how multi-celled animals developed, the researchers uncovered a surprising truth.

Professor Bernie Degnan said the results contradicted years of tradition.

"We've found that the first multicellular animals probably weren't like the modern-day sponge cells, but were more like a collection of convertible cells," Professor Degnan said.

"The great-great-great-grandmother of all cells in the animal kingdom, so to speak, was probably quite similar to a stem cell.

"This is somewhat intuitive as, compared to plants and fungi, animals have many more cell types, used in very different ways - from neurons to muscles - and cell-flexibility has been critical to animal evolution from the start."

The findings disprove a long-standing idea: that multi-celled animals evolved from a single-celled ancestor resembling a modern sponge cell known as a choanocyte.

"Scattered throughout the history of evolution are major transitions, including the leap from a world of microscopic single-cells to a world of multi-celled animals," Professor Degnan said.

"With multicellularity came incredible complexity, creating the animal, plant, fungi and algae kingdoms we see today.

"These large organisms differ from the other more-than-99-per-cent of biodiversity that can only be seen under a microscope."

The team mapped individual cells, sequencing all of the genes expressed, allowing the researchers to compare similar types of cells over time.

Fellow senior author Associate Professor Sandie Degnan said this meant they could tease out the evolutionary history of individual cell types, by searching for the 'signatures' of each type.

"Biologists for decades believed the existing theory was a no-brainer, as sponge choanocytes look so much like single-celled choanoflagellates - the organism considered to be the closest living relatives of the animals," she said.

"But their transcriptome signatures simply don't match, meaning that these aren't the core building blocks of animal life that we originally thought they were.

"This technology has been used only for the last few years, but it's helped us finally address an age-old question, discovering something completely contrary to what anyone had ever proposed."

"We're taking a core theory of evolutionary biology and turning it on its head," she said.

"Now we have an opportunity to re-imagine the steps that gave rise to the first animals, the underlying rules that turned single cells into multicellular animal life."

Professor Degnan said he hoped the revelation would deepen our understanding of our own condition, including our own stem cells and cancer.

Credit: 
University of Queensland

Mathematical tools to study tumors

image: Vitronectin organisation may change the rigidity of the tumour cells' surroundings, forming 'pathways' that could help the tumour migrate.

Image: 
Universidad de Sevilla

Researchers from the Department of Cellular Biology at the University of Seville and the Seville Institute of Biomedicine (IBiS), Pablo Vicente and Doctor Luisma Escudero, in close collaboration with the researcher Rebecca Burgos and other members of the group of Doctor Rosa Noguera (University of Valencia--INCLIVA, CIBERONC) have published a study aimed at developing new therapies to fight childhood cancer.

This project represents a step forward in the basic science of cancer and could open new avenues of research to help understand what makes a tumour more or less aggressive and how tumours can be fought. However, the researchers stress that their finding does not in itself mean a cure for cancer.

Neuroblastoma is a type of cancer that originates during the development of the nervous system. It mainly affects children younger than 18 months. It is the most common solid tumour in early childhood and, despite the great improvements made in the cure rate for other childhood tumours, the survival rate for patients with neuroblastoma is much less satisfactory.

There is clear evidence that the environment in which the tumour sits and that supports it (the extracellular matrix) plays an important role in the initial growth and development of the tumour. This setting is formed by a network of fibres and fibrils which, depending on their density and how they are connected, make the tumorous micro-environment more or less rigid.

Therefore, it is important to understand how tumour cells relate to the extracellular matrix and how the fibres and fibrils are organised. This is not easy. To achieve it, the researchers combined the analysis of images of biopsy samples of tumours from patients affected by neuroblastoma with new mathematical procedures (Graph Theory) that allowed them to describe how the vitronectin fibrils are organised. The conclusion of this complex study is much simpler: the degree of organisation of the vitronectin correlates with the aggressiveness of the tumour and could be used to classify patients before any potential treatment.
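To give a flavour of what "Graph Theory" means here: a fibril network can be modelled as a graph whose nodes are fibril crossing points and whose edges are fibril segments, and graph metrics then score how organised that network is. The sketch below is purely illustrative -- the study's actual metrics and pipeline are not described in this article, and the example networks are hypothetical -- but it shows one standard measure, the average clustering coefficient, distinguishing a more interconnected network from a sparse chain.

```python
def clustering(graph):
    """Average clustering coefficient of an undirected graph
    given as {node: set(neighbours)}."""
    coeffs = []
    for node, nbrs in graph.items():
        k = len(nbrs)
        if k < 2:
            coeffs.append(0.0)
            continue
        # count edges among this node's neighbours
        links = sum(1 for a in nbrs for b in nbrs
                    if a < b and b in graph[a])
        coeffs.append(2.0 * links / (k * (k - 1)))
    return sum(coeffs) / len(coeffs)

# Hypothetical fibril contact networks: nodes are crossing points,
# edges connect points joined by a fibril segment.
organised = {1: {2, 3}, 2: {1, 3}, 3: {1, 2, 4}, 4: {3}}  # triangle + tail
sparse = {1: {2}, 2: {1, 3}, 3: {2, 4}, 4: {3}}           # simple chain

print(clustering(organised))  # → 0.5833...
print(clustering(sparse))     # → 0.0
```

A higher score means fibrils are more densely interlinked; the study's finding was that some such measure of organisation tracks tumour aggressiveness.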

The results obtained suggest that vitronectin can change the rigidity of the tumorous cells' surroundings. In the most serious cases, vitronectin could guide the cancerous neuroblasts, making it possible for them to invade other organs. That is to say, the changes caused by specific organisation of vitronectin can form "pathways" that could help the tumour to migrate, with the grave problems that this would cause. For this reason, this "basic science" study opens a possible new way of combatting this cancer, which could be based on modifying the organisation of the vitronectin, so making tumours less aggressive.

Credit: 
University of Seville

Researchers discover potential new therapeutic target for Alzheimer's disease

image: This microscopic image shows the merging of APP and apoE in brain cells. This co-localization suggests an age-associated increase in apoE-N-terminal APP interaction and thus higher production of the toxic amyloid-β (Aβ) protein characteristic of Alzheimer's disease.

Image: 
© University of South Florida

TAMPA, Fla. (June 12, 2019) -- Apolipoprotein E (apoE) is a major genetic risk factor for the development of Alzheimer's disease, yet the protein tends to be understudied as a potential druggable target for the mind-robbing neurodegenerative disease.

Now a research team led by the University of South Florida Health (USF Health) Morsani College of Medicine reports that a novel apoE antagonist blocks apoE interaction with N-terminal amyloid precursor protein (APP). Moreover, this peptide antagonist, known as 6KApoEp, was shown to reduce Alzheimer's-associated beta amyloid (β-amyloid) accumulation and tau pathologies in the brain, as well as to improve learning and memory in mice genetically engineered to mimic symptoms of Alzheimer's disease.

Many failed anti-amyloid therapies for Alzheimer's disease have been directed against various forms of the protein β-amyloid, which ultimately forms clumps of sticky plaques in the brain. The presence of these amyloid plaques is one of the major hallmarks of Alzheimer's disease.

The USF Health research findings suggest that disrupting apoE physical interaction with N-terminal APP may be a new disease-modifying therapeutic strategy for this most common type of dementia.

The preclinical study was published online May 2 in Biological Psychiatry.

"For the first time, we have direct evidence that the N-terminal section of apoE itself acts as an essential molecule (ligand) to promote the binding of apoE to the N-terminal region of APP outside the nerve cell," said the study's lead author Darrell Sawmiller, PhD, an assistant professor in the USF Health Department of Psychiatry & Behavioral Neurosciences. "This receptor-mediated mechanism plays a role in the development of Alzheimer's disease. Overstimulation of APP by apoE may be an earlier, upstream event that signals other neurodegenerative processes contributing to the amyloid cascade."

"Initially we wanted to better understand how apoE pathologically interacts with APP, which leads to the formation of β-amyloid plaques and neuronal loss," said study senior author Jun Tan, PhD, MD, a professor in the USF Health Department of Psychiatry & Behavioral Neurosciences. "Our work further discovered an apoE derivative that can modulate structural and functional neuropathology in Alzheimer's disease mouse models."

Credit: 
University of South Florida

Scientists identify a novel neural circuit mediating visually evoked innate defensive responses

image: VTA GABA+ neurons mediate visually evoked innate defensive responses via the SC Glut+ → VTA GABA+ → CeA (central nucleus of the amygdala) pathway.

Image: 
ZHOU Zheng and LIU Xuemei

Fear overgeneralization (i.e., deficits in the discrimination of safety and threat) is an important pathological characteristic of anxiety-related syndromes such as posttraumatic stress disorder (PTSD), generalized anxiety disorder (GAD) and panic disorder.

However, unlike traditional conditioned fear, the mechanism of processing innate fear is largely unknown.

Prof. WANG Liping and his colleagues ZHOU Zheng and LIU Xuemei at the Shenzhen Institutes of Advanced Technology (SIAT) of the Chinese Academy of Sciences revealed that the VTA (ventral tegmental area) GABAergic neural circuit mediates visually evoked innate defensive responses.

In this study, the research group identified for the first time a neural circuit centered on VTA GABA+ neurons that mediates visually evoked innate defensive responses, involving the SC Glut+ → VTA GABA+ → CeA (central nucleus of the amygdala) pathway.

Neuroscientists have established that the VTA plays an important role in learned appetitive and aversive behaviors. Interestingly, using fiber photometry, the researchers found that GABAergic neurons in the VTA were activated by visual threats. Through viral tracing and both in vivo and in vitro electrophysiological recordings, they showed that VTA GABA+ neurons receive direct excitatory inputs from the superior colliculus (SC).

These findings demonstrated that SC glutamatergic inputs activate VTA GABA+ neurons to mediate looming-evoked innate defensive behaviors. The team further showed that the CeA is a downstream target of this SC-VTA GABAergic pathway and is involved in innate defensive behavior.

This study provides new insights into potential mechanisms of survival across species, as well as the maladaptive behavior in fear- and anxiety-related mental disorders.

The team has been working on the dissection of the neural circuits mediating instinctive emotions for years. In previous work, they resolved a novel brain circuit controlling visually evoked innate freezing behaviors via the SC-LP-LA subcortical pathway. They also reported, for the first time, that the accelerated innate defensive response to visual threats induced by stress is mediated by Locus Coeruleus (LC) TH+ projections to the SC. The current study, published in Neuron, is another key milestone in this systematic work.

Credit: 
Chinese Academy of Sciences Headquarters

Ants maintain essential interactions despite environmental flux

image: The researchers investigated how ants adjust their social interactions to accommodate changes in population density.

Image: 
Adam B. Lazarus, Public Domain

UNIVERSITY PARK, Pa. -- Ants adjust their social interactions to accommodate changes in population density, according to researchers at Penn State and Georgetown University. The findings suggest that ant colonies are capable of maintaining their sophisticated social organization despite potentially drastic changes in their environments.

"Ants are among the most ecologically successful groups in nature due to their complex social organization, particularly their division of labor, including food acquisition," said David Hughes, associate professor of entomology and biology. "The survival of ant colonies depends on their ability to maintain this organization. In our study, we saw remarkable resilience of ant colonies to changes in population density. This finding helps to explain ants' evolutionary success."

According to Hughes, changes in ant colony size and population density are natural occurrences. They can increase as the queen reproduces and the colony grows, and they can decrease when the colony decides to split into multiple nest sites.

"To minimize potentially adverse effects due to changes in density and to maintain social balance, ant colonies should try to actively manage the rates of their interactions," said Hughes. "Until now, however, few studies have investigated these phenomena."

The researchers manipulated the population densities of three colonies of carpenter ants by quadrupling the sizes of their nest spaces. They placed the colonies inside wooden camera boxes fitted with infrared lights so they could film the ants under natural dark nesting conditions. The ants were able to leave the nests at any time to enter foraging areas.

The team manually identified the position in the nest of each ant at every point in time -- equivalent to more than 6.9 million data points -- to investigate whether the increased nest space influenced the spatial organization of the insects. The researchers found that the ants' positions relative to one another were similar regardless of the population density. When population density was lower, the ants were simply spaced farther apart.

Next, the researchers examined 3,200 ant interactions to analyze whether the increase in nest space influenced their task performance -- especially related to trophallaxis, or the transfer of foods from ant to ant. The team's results appeared May 2, 2019, in eLife.

"As a statistician, I built statistical models that captured how individual ants move within their nests, and how and when they chose to engage in important social behaviors like trophallaxis," said Ephraim Hanks, associate professor of statistics, Penn State. "These analyses helped to explain how ant populations maintained high levels of community interactions even after their spatial environment was changed."

Indeed, contrary to the team's expectation that ant interactions would decrease with decreasing population density, interaction rates did not change at the lower density.

"Our results showcase the kinds of behavioral mechanisms ant colonies apply to achieve social homeostasis in the face of disturbance," said Hughes. "Specifically, ants actively regulate their spatial distribution and interaction behaviors in a way that allows them to maintain critical elements of their social interaction patterns -- such as food and information exchange -- in spite of drastic changes in their environment."

"This work shows that social species like ants can maintain levels of social connection and interaction even when their environment changes drastically," said Hanks. "As social interactions are a critical driver of the spread of infectious disease, our work shows that changing spatial environments, such as how cities or businesses are laid out, may have little or no effect on the spread of infectious disease, as social species may change their movement patterns to conserve community interactions."

Credit: 
Penn State

Community knowledge can be as valuable as ecological knowledge in environmental decision-making

An understanding of community issues can be as valuable as knowing the ecology of an area when making environmental decisions, according to new research from the University of Exeter Business School.

Billions of dollars are spent on environmental management each year across the globe but the approach has largely been to focus on a single ecosystem or species. This narrow viewpoint has produced mixed results because it does not always consider the value of other information in the bigger picture.

The research looked at what information would add real value to decision making in order to achieve the best possible outcome. The researchers examined four socio-ecological systems representing a range of real-life environmental issues, two from fisheries and two from sustainable agriculture. The first study looked at how to maximise fish populations in an area with existing territorial fishing rights; the second was a project to re-stock salmon in a recreational fishery. The third looked at pest control within an agricultural production system, and the fourth concerned the clearing of forest areas for agricultural use.

The researchers specifically focused on the relative value of gathering ecological versus social information; for example, understanding how fish populations grow over time versus the level of influence or engagement within local fishing communities.

They discovered that in the cases where managers needed community engagement (the case studies focusing on increasing the size of fish populations or decreasing forest clearance), understanding what influenced this engagement was more important for success than the ecological aspects. However, the opposite was true for the two other programmes, on re-stocking salmon and on pest control, where the need for widespread community engagement was less important.

The study concluded that, overall, information about social and ecological factors can be equally important, depending on the characteristics of the management actions.

"Our research shows that environment managers should always focus on improving their understanding of the community or environment that is directly impacted by their management actions," said Dr Katrina Davis of the Land, Environment, Economics and Policy Institute (LEEP) at the University of Exeter and lead author of the report.

"It would help managers to try to understand what the value is of each piece of knowledge, then bring those pieces together to create a holistic plan. For example, the dynamics of a local community will play a huge role in the success of an environmental protection programme when managers are directly engaging with that community, but if managers are targeting agricultural pests, then understanding the drivers of community engagement will be less important."

The research was carried out in collaboration with Dr Jonathan Rhodes of the University of Queensland, Dr Iadine Chades of the Commonwealth Scientific and Industrial Research Organisation, and Dr Michael Bode of the Queensland University of Technology.

The research also showed that understanding how and why a community would engage with environmental management will be more important than understanding how one community will influence another to engage, or not.

"We believe that considering a wide range of factors, as well as understanding the value that information brings, can improve outcomes for our environment as a whole," added Dr Davis.

Credit: 
University of Exeter

New tool can pinpoint origins of the gut's bacteria

image: The researchers included, from left, UCLA graduate students Mike Thompson, Liat Shenhav, Leah Briscoe; and Professor Eran Halperin.

Image: 
UCLA Samueli Engineering

A UCLA-led research team has developed a faster and more accurate way to determine where the many bacteria that live in, and on, humans come from. Broadly, the tool can deduce the origins of any microbiome, a localized and diverse community of microscopic organisms.

The new computational tool, called "FEAST," can analyze large amounts of genetic information in just a few hours, compared to tools that take days or weeks. The software program could be used in health care, public health, environmental studies and agriculture. The study was published online in Nature Methods.

A microbiome typically contains hundreds to thousands of microbial species. Microbiomes are found everywhere, from the digestive tracts of humans, to lakes and rivers that feed water supplies. The microorganisms that make up these communities can originate from their surrounding environment, including food.

Knowing where these organisms come from and how these communities form can give scientists a more detailed picture of the unseen ecological processes that affect human health. The researchers developed the program to give doctors and scientists a more effective tool to investigate these phenomena.

The source-tracking program gives the percentage of the microbiome that came from somewhere else. It is similar in concept to a census that reveals which countries a city's immigrant population came from, and what percentage of the total population each group represents.

For example, using the source-tracking tool on a kitchen counter sample can indicate how much of that sample came from humans, how much came from food, and specifically which types of food.
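The general idea of source tracking can be illustrated with a toy calculation. This is emphatically not FEAST's actual algorithm (FEAST uses a fast expectation-maximization model and handles unknown sources), and the taxa abundances below are invented; the sketch only shows how a sink sample's composition can be decomposed into contributions from candidate sources.

```python
# Toy microbial source tracking: given relative-abundance profiles of
# candidate sources, estimate what fraction of a "sink" sample each
# source contributed by solving a least-squares mixture.
# Hypothetical numbers; not the FEAST model itself.
import numpy as np

# Columns: relative abundances of 3 taxa in each candidate source
# (e.g. column 0 = "human skin", column 1 = "food").
sources = np.array([[0.8, 0.1],
                    [0.1, 0.7],
                    [0.1, 0.2]])

# Sink sample (e.g. a kitchen-counter swab), simulated here as a
# known 70/30 mixture of the two sources.
sink = 0.7 * sources[:, 0] + 0.3 * sources[:, 1]

props, *_ = np.linalg.lstsq(sources, sink, rcond=None)
props = np.clip(props, 0, None)   # proportions can't be negative
props /= props.sum()              # normalise to fractions of the sink
print(props)                      # → approximately [0.7, 0.3]
```

In this idealised case the decomposition recovers the true mixture exactly; real data are noisy and include taxa from unknown sources, which is precisely the harder statistical problem FEAST is built to handle.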

Armed with this information, doctors will be able to distinguish a healthy person from one who has a particular disease by simply analyzing their microbiome. Scientists could use the tool to detect contamination in water resources or in food supply chains.

"The microbiome has been linked to many aspects of human physiology and health, yet we are just in the early stages of understanding the clinical implications of this dynamic web of many species and how they interact with each other," said Eran Halperin, the study's principal investigator who holds UCLA faculty appointments in the Samueli School of Engineering and in the David Geffen School of Medicine.

"There has been an unprecedented expansion of microbiome data, which has rapidly increased our knowledge of the diverse functions and distributions of microbial life," Halperin added. "Nonetheless, such big and complex datasets pose statistical and computational challenges."

Compared to other source-tracking tools, FEAST is up to 300 times faster, and is significantly more accurate, the researchers say.

Also, current tools can only analyze smaller datasets, or only target specific microorganisms that are deemed to be harmful contaminants. The new tool can process much larger datasets and offer a more complete picture of the microorganisms that are present and where they came from, the researchers say.

The researchers confirmed FEAST's viability by comparing it against analyses of previously published datasets.

For example, they used the tool to determine the types of microorganisms on a kitchen counter and it provided much more detail than previous tools that analyzed the same dataset.

They also used the tool to compare the gut microbiomes of infants delivered by cesarean section to the microbiomes of babies who were delivered vaginally.

"My hope is that scientists will use FEAST to diagnose bacteria-related health conditions," said UCLA computer science graduate student Liat Shenhav, the study's first author. "For example, if a particular cancer has a microbial signature, FEAST can potentially be utilized for early diagnosis."

Credit: 
UCLA Samueli School of Engineering

Could playing computer games improve your peripheral vision?

image: This is a screen from the SuperVision suite of games.

Image: 
Argenis Ramirez Gomez

Playing computer games could help improve people's peripheral vision, new research reveals.

Researchers have found a significant improvement in the peripheral awareness of people who played computer games specially designed around using peripheral vision.

This finding opens up the possibility that these types of games can be used to help improve players' performance in team sports - so they can spot team-mates quicker - or to help them to identify potential hazards at the side of their vision.

Researchers at Lancaster University's School of Computing and Communications were keen to explore how players' peripheral vision might be used within computer games and whether playing games could help to improve a player's peripheral awareness.

"Most computer games involve looking directly at targets, or following the movement of characters, because that is the most natural and intuitive way we use our eyes," said Argenis Ramirez Gomez, a PhD researcher at Lancaster University. "We wanted to explore the opposite - is it possible to play games just by using our peripheral vision, is it possible to develop strategies to overcome the challenge, would it be engaging and fun and could these games improve our peripheral awareness?"

They created three games, which are based on popular culture and mythology - such as the stories of Medusa and Cyclops. The Medusa game, for example, involved having Medusa dig up mushrooms in her garden while avoiding looking directly at the mushrooms - otherwise they would turn into stone.

The suite of games, collectively called SuperVision, requires players to use a mouse to select, or steer, objects within the game using their peripheral vision. Eye-trackers detect when players look directly at objects within the game, and players are penalised accordingly.
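The penalty mechanic described above can be sketched in a few lines. This is a hypothetical reconstruction, not the SuperVision code: the eye tracker reports a gaze point, and any game object within some assumed fixation radius of that point counts as a "direct look" to be penalised. The names, coordinates and threshold are all illustrative.

```python
# Hypothetical gaze-penalty check for a peripheral-vision game:
# a direct look is assumed to be a gaze point landing within a
# fixation radius of an on-screen object.
import math

FIXATION_RADIUS = 50.0  # pixels; assumed threshold for a "direct look"

def direct_looks(gaze, objects, radius=FIXATION_RADIUS):
    """Count on-screen objects the player is looking at directly."""
    return sum(1 for obj in objects
               if math.dist(gaze, obj) <= radius)

objects = [(100, 100), (400, 300)]
print(direct_looks((110, 105), objects))  # → 1 (gaze on first object)
print(direct_looks((250, 200), objects))  # → 0 (peripheral viewing only)
```

In the real games the eye tracker streams gaze samples continuously, but the core rule is the same: a zero count means the player is succeeding at using only peripheral vision.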

"Players struggled at first as they attempted to control and resist their instinctive impulse to look," said Ramirez Gomez. "The games go against our natural behaviour. The players know they can't look, but having to make decisions and interact with objects in the games forces players to lose control over their instincts, and so they indulge their desire to look directly at the objects, failing in the game. But over time people developed strategies to overcome the challenge, such as focussing on a particular spot on the screen."

The researchers assessed each player's peripheral vision using a large protractor held at eye level and by showing them coloured visual cues at different angles across a 180° range.

Mr Ramirez Gomez said: "We evaluated the participants' peripheral visual capabilities before and after the games to test for skills development. We found a significant improvement in object recognition in the participants' peripheral vision after playing the games."

Even just one gaming session resulted in improvements in the players' peripheral awareness. The study continued over two weeks and the participants continued to show improvements in their peripheral vision throughout the duration of the research.

The participants did not play the games over the weekends during the study. This created a gap of three days between playing the games and researchers taking a measurement of the players' peripheral vision. There was no noticeable decline in performance over this gap, suggesting improvements in peripheral vision can be lasting, at least in the short-term.

Credit: 
Lancaster University

Baby pterodactyls could fly from birth

A breakthrough discovery has found that pterodactyls, extinct flying reptiles also known as pterosaurs, had a remarkable ability: they could fly from birth. The importance of this discovery is highlighted by the fact that no other vertebrate, living or extinct, is known to share this ability. The revelation has a profound impact on our understanding of how pterodactyls lived, which is critical to understanding how the dinosaur world worked as a whole.

Previously, pterodactyls were thought to only be able to take to the air once they had grown to almost full size, just like birds or bats. This assumption was based on fossilised embryos of the creatures found in China that had poorly developed wings.

However, Dr David Unwin, a University of Leicester palaeobiologist who specialises in the study of pterodactyls and Dr Charles Deeming, a University of Lincoln zoologist who researches avian and reptilian reproduction, were able to disprove this hypothesis. They compared these embryos with data on prenatal growth in birds and crocodiles, finding that they were still at an early stage of development and a long way from hatching. The discovery of more advanced embryos in China and Argentina that died just before they hatched provided the evidence that pterodactyls had the ability to fly from birth.

Dr David Unwin said: "Theoretically what pterosaurs did, growing and flying, is impossible, but they didn't know this, so they did it anyway."

Another fundamental difference between baby pterodactyls, also known as flaplings, and baby birds or bats, is that they had no parental care and had to feed and look after themselves from birth. Their ability to fly gave them a lifesaving survival mechanism which they used to evade carnivorous dinosaurs. This ability also proved to be one of their biggest killers, as the demanding and dangerous process of flight led to many of them dying at a very early age.

The research has also challenged the current view that pterodactyls behaved in a similar way to birds and bats and has provided possible answers to some key questions surrounding these animals. Since flaplings were able to both fly and grow from birth, this provides a possible explanation as to why they were able to reach enormous wingspans, far larger than any historic or current species of bird or bat. How they were able to carry out this process will require further research, but it is a question that wouldn't have been posed without these recent developments in our understanding.

Dr Deeming added: "Our technique shows that pterosaurs were different from birds and bats and so comparative anatomy can reveal novel developmental modes in extinct species."

Credit: 
University of Leicester

Future tsunamis possible in the Red Sea's Gulf of Elat-Aqaba

image: These are cottages on the Gulf of Elat-Aqaba in Egypt, a popular tourist destination.

Image: 
Natalie Michelson

Researchers who took a closer look at a 1995 tsunami in the Gulf of Elat-Aqaba, at the northeastern tip of the Red Sea, say that the gulf's surrounding countries should prepare for future tsunami hazards in this economically vital, developing region.

A team of scientists led by Amos Salamon of the Geological Survey of Israel simulated tsunamis, using the GeoClaw modeling program, paired with models of the magnitude 7.2 Nuweiba earthquake, which led to the 1995 tsunami in the gulf. They conclude in the journal Seismological Research Letters that the tsunami simulations correspond well with data on wave height, inundation and damage reported from the actual small tsunami.

Four countries--Egypt, Saudi Arabia, Jordan and Israel--border the Gulf of Elat-Aqaba. Nuweiba is an Egyptian coastal town on the gulf. The tsunami reached its greatest wave height--three to four meters--at the Nuweiba harbor, and brought the most serious damage to a platform there, with some minor damage also occurring to local nomad dwellings along the coast and to the Aqaba Port.

Since 1995, the Gulf of Elat-Aqaba has grown in economic importance for all four countries, supporting shipping ports, tourism and potentially large regional water and electrical projects such as Jordan's proposed Red Sea-Dead Sea Water Conveyance (sometimes called the "Peace Conduit") pipeline.

The tsunami was a surprise at the time, happening in a closed gulf far away from the open ocean, but has been mostly forgotten, said Salamon. "People remember the earthquake and the shaking rather than the tsunami. It went 'below the radar,' and was left hidden in reports and papers."

The 1995 earthquake took place along the Dead Sea Transform, the fault system that runs from southeastern Turkey to the southern tip of the Sinai Peninsula. "We have already learned that earthquakes along the fault system have the potential to generate tsunamis in the Eastern Mediterranean through seismogenic submarine landslides, but now we realize that it is also capable of generating tsunamis in the gulf by co-seismic deformation as well as by submarine landslides," Salamon said.

The motion along the fault that ruptured in the 1995 Nuweiba earthquake was predominantly strike-slip, where rocks slide past each other in a horizontal fashion along a vertical fracture. The most damaging tsunamis around the globe have been associated with thrust faults, where one slab of crust rides over the top of another slab. But researchers have noted several tsunamis related to strike-slip faults in recent years, including the magnitude 7.5 Palu earthquake in Indonesia last year, even though these ruptures do not tend to produce as much vertical upheaval as a thrust fault.

"In general the concept is that strike-slip earthquakes do not generate significant tsunamis, and we thought it is of great importance to show that we do have this," Salamon said. "There was relatively small and limited damage here from this tsunami, but we should not ignore that this unique seismotectonic setting includes active normal faults along the gulf's shoulders as well as steep submarine slopes along which we have identified fresh scars that may have evidence of past tsunamis."

One of the most important pieces of information from the 1995 event, which allowed the researchers to perform their simulations, was a 1995 mareogram, or tidal gauge reading, that Salamon's student Eran Frucht discovered. These data gave them a way to judge the accuracy of their simulations. The researchers concluded, for instance, that the peak three- to four-meter wave height in Nuweiba Harbor may have been a local phenomenon affected by the shape of the harbor or by a nearby underwater landslide.

Salamon and his colleagues are now conducting a larger scale earthquake and tsunami hazard evaluation for the entire Gulf of Elat-Aqaba region. The evaluation is particularly important because the area has few historical records of past tsunami size and frequency to guide future estimates.

The worst case scenario, said Salamon, would be an earthquake on a fault that crosses from the land into the gulf, which could produce severe ground shaking in the surrounding cities, submarine landslides and subsidence of the coastline that amplifies inundation from a tsunami.

Salamon said that the hazard evaluation would give scientists in all four countries surrounding the gulf a better understanding on whether to expect this worst-case scenario. "We thought that if we do this kind of research, it would be relevant for our neighbors as well."

Credit: 
Seismological Society of America

Pre-qualifying education and training helps health workers tackle gender based violence

Gender-based violence (GBV) could be tackled more effectively by giving healthcare students wider and more practical education and training in identifying and responding to the 'warning signs' presented among patients they will encounter in professional life, according to a new study.

Introducing effective GBV educational strategies before healthcare staff qualify would help to reduce the serious health and social threat to people - mainly women - around the globe. Tackling GBV is a key part of meeting UN Sustainable Development Goal 5: Gender Equality.

Researchers from the Universities of Birmingham and Melbourne reviewed almost 500 research sources in the first internationally-focused systematic literature review to combine evidence on the subject of GBV educational strategies for prequalifying healthcare students. The study, led by Dr Caroline Bradbury-Jones of the University of Birmingham, was published in Trauma, Violence, & Abuse.

They set out a number of implications for further practice, policy and research, including:

GBV learning opportunities should have a practical focus and aim to incorporate an interactive element for improved results.

Existing and future education programmes should give greater attention to the wider forms of GBV such as female genital mutilation/cutting, forced marriage, honour violence and human trafficking.

More research is needed on the subject of single- versus mixed-gender audiences in GBV education for prequalifying healthcare students.

Ms. Dana Sammut, also from the University of Birmingham's School of Nursing, commented: "GBV poses a serious health and social threat to women around the world. Pre-qualifying education is vital in shaping professionals' responses, yet healthcare staff and students lack confidence in dealing with the issue.

"Healthcare institutions are often left to design and implement their own GBV policies, which can result in inconsistencies. Introducing effective GBV educational strategies before students qualify allows these problems to be addressed at the earliest opportunity in healthcare practitioners' careers."

The researchers identified that interactive approaches to learning gave better results than did theory-focused education and simple accumulation of knowledge. They recommend that future research should investigate wider learning theory and consider its application in the development of GBV curricula.

Credit: 
University of Birmingham

Researchers identify human protein that aids development of malaria parasite

image: In normal human liver cells (left), Plasmodium parasites (red) develop into a circular, exoerythrocytic form that gives rise to malaria. But in cells lacking CXCR4 (right), the parasite remains trapped in its rod-shaped sporozoite form.

Image: 
Bando et al., 2019

Researchers in Japan have discovered that the Plasmodium parasites responsible for malaria rely on a human liver cell protein for their development into a form capable of infecting red blood cells and causing disease. The study, which will be published June 12 in the Journal of Experimental Medicine, suggests that targeting this human protein, known as CXCR4, could be a way to block the parasite’s life cycle and prevent the development of malaria.

According to the World Health Organization, there were an estimated 219 million cases of malaria in 2017, resulting in the deaths of approximately 435,000 people. Infected mosquitoes transmit Plasmodium parasites to humans in the form of rod-shaped sporozoites that travel to the liver and invade liver cells (hepatocytes). Once inside these cells, the Plasmodium sporozoites develop into spherical exoerythrocytic forms (EEFs) that eventually give rise to thousands of merozoites capable of spreading into red blood cells and causing malaria.

“It seems likely that the transformation of Plasmodium sporozoites into EEFs is tightly controlled so that it only occurs in hepatocytes and not at earlier stages of the parasite’s life cycle,” says Masahiro Yamamoto, a professor at the Research Institute for Microbial Diseases of Osaka University. “However, we know very little about the host factors that regulate the differentiation of sporozoites in infected hepatocytes.”

In the new study, Yamamoto and colleagues discovered that a hepatocyte protein called CXCR4 helps Plasmodium sporozoites transform into EEFs. Depleting this protein from human liver cells reduced the ability of sporozoites to develop into EEFs. Moreover, mice pretreated with a drug that inhibits CXCR4 were resistant to malaria, showing reduced levels of parasites in the blood and significantly higher survival rates following Plasmodium infection.

Yamamoto and colleagues also identified a cell signaling pathway that causes hepatocytes to produce more CXCR4 in response to Plasmodium infection and determined that the protein aids the parasite’s development by raising the levels of calcium inside the cells.

“Our study reveals that CXCR4 blockade inhibits Plasmodium sporozoite transformation in hepatocytes,” Yamamoto says. “Most anti-malaria drugs targeting Plasmodium-derived molecules eventually lead to drug resistance in these parasites. However, inhibitors targeting human proteins such as CXCR4 might avoid this problem and could be used prophylactically to prevent the development of malaria. Moreover, the CXCR4 inhibitor used in this study is already widely used in humans undergoing treatment for blood cancers, which could accelerate its repurposing as a new way of combating malaria.”

Credit: 
Rockefeller University Press

Mouse study finds BPA exposure has transgenerational effects on gene linked to autism

WASHINGTON--Transgenerational bisphenol A (BPA) exposure may contribute to autism, according to a mouse study published in the Endocrine Society's journal Endocrinology.

Endocrine disrupting chemicals (EDCs) are chemicals or mixtures of chemicals that interfere with the way the body's hormones work. BPA is a common EDC used in plastics and food storage material, and it is already present in most humans' urine or blood. Animal studies have linked BPA to anxiety, aggression, and poor learning and social interactions. Studies of human populations report associations between BPA and neurobehavioral issues like attention deficit hyperactivity disorder and autism.

"Exposure of mouse fetuses to BPA disrupts formation of nerve cell connections in the brain, and this is a transgenerational effect," said the study's senior author, Emilie F. Rissman, Ph.D., of University of Virginia School of Medicine in Charlottesville, Va. and North Carolina State University in Raleigh, N.C. "To put this in human terms, if your great grandmother was exposed to BPA during her pregnancy and none of your other relatives ever came into contact with BPA, your brain would still show these effects."

In this mouse study, researchers tested mice descended from those exposed to BPA for social recognition and found that they showed a social behavior deficit resembling autism-like behavior. Mice whose great-grandmothers were exposed to BPA during pregnancy were more active and took longer to habituate to strangers than other mice. More strikingly, they did not explore new mice introduced to the group. Because mice are normally highly social and curious, this is a striking finding.

"Even if we ban all BPA right now, that will not change these long-term effects on the brain," Rissman said.

Credit: 
The Endocrine Society

Empirical energy consumption model quantifies Bitcoin's carbon footprint

Researchers have conducted the first analysis of Bitcoin power consumption based on empirical data from IPO filings and localization of IP addresses. They found that the cryptocurrency's carbon emissions measure up to those of Kansas City--or a small nation. The study, published June 12 in the journal Joule, suggests that cryptocurrencies contribute to global carbon emissions, an issue that must be considered in climate change mitigation efforts.

Bitcoin and other cryptocurrencies rely on blockchain technology, which enables a secure network without relying on a trusted third party. Instead, so-called Bitcoin "miners" guard the system against fraud by validating new transactions. To do so, miners compete to solve cryptographic hash puzzles, a process that requires enormous amounts of computational power. In return, successful miners receive newly created Bitcoin.
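The puzzle-solving step can be illustrated with a heavily simplified proof-of-work sketch. Real Bitcoin mining compares a double SHA-256 hash of a block header against a 256-bit target; the toy `mine` function and its inputs below are illustrative only, but the brute-force search principle is the same, and it is this search that consumes the energy the study quantifies.

```python
import hashlib

def mine(block_data: str, difficulty: int) -> tuple[int, str]:
    """Toy proof-of-work: find a nonce such that the SHA-256 hash of
    (block_data + nonce) starts with `difficulty` hexadecimal zeros.
    Raising `difficulty` multiplies the expected work by 16 per digit."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce, digest
        nonce += 1

nonce, digest = mine("example transactions", difficulty=4)
print(f"nonce={nonce}, hash={digest}")
```

Even at this toy difficulty the loop typically runs tens of thousands of hashes; Bitcoin's network performs on the order of quintillions of hashes per second, which is why hardware efficiency and electricity sources dominate its footprint.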

"This process results in immense energy consumption, which translates into a significant carbon footprint," says Christian Stoll, a researcher at the Center for Energy Markets at the Technical University of Munich, Germany, and the MIT Center for Energy and Environmental Policy Research.

Scientists have growing concerns that Bitcoin mining is fueling an appetite for energy consumption that sometimes draws from questionable fuel sources--such as coal from Mongolia--in addition to hydropower and other low-carbon power resources. And cryptocurrency's energy issues seem to only be getting worse, with the computing power required to solve a Bitcoin puzzle increasing more than four-fold in 2018. While there is a growing push among researchers to quantify Bitcoin's energy consumption in order to better understand its contribution to global climate change, recent studies have struggled to generate accurate estimates.

"We argue that our work goes beyond prior work," says Stoll. "We can provide empirical evidence where current literature is based on assumptions."

Stoll and his team used IPO filings disclosed in 2018 by all major mining hardware producers to determine which machines miners are actually using and the power efficiencies of these machines. They also used IP addresses to determine emissions scenarios for actual mining locations and compare carbon emissions from power sources used by Bitcoin miners in different locations. Finally, they calculated Bitcoin's carbon footprint based on its total power consumption and estimates from different emissions scenarios. These include a lower limit scenario, in which all miners use the most efficient hardware; an upper limit scenario, in which miners behave rationally by disconnecting their hardware as soon as costs exceed revenue; and a best guess scenario, which accounts for the anticipated energy efficiency of the network and realistic additional energy losses from cooling and IT hardware.

"Our model reflects how the connected computing power and the difficulty of Bitcoin search puzzles interact, and it provides a high precision of power consumption since it incorporates auxiliary losses," says Stoll. "However, the precision of our results strongly depends on the accuracy of the input data, such as the IPO filings for hardware characteristics. The carbon emissions strongly depend on the assumed carbon intensity of power consumption."

Using this model, Stoll and his team estimated Bitcoin's annual energy consumption at 45.8 terawatt hours. This allowed them to calculate an annual carbon emissions range between 22.0 and 22.9 megatons of CO2--equivalent to the CO2 emitted by Kansas City, and placing Bitcoin between Jordan and Sri Lanka in emissions rankings (the 82nd and 83rd highest emitters). However, the researchers estimate that the energy consumption figure would almost double--greatly amplifying the emissions estimates--if they were to include all other cryptocurrencies in their calculations.
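A quick back-of-the-envelope check connects the two reported figures: dividing annual emissions by annual energy consumption yields the average carbon intensity of the electricity the model assumes. The sketch below is our own sanity check, not part of the study's methodology.

```python
# Reported figures from the study
energy_twh = 45.8              # annual energy consumption, TWh
emissions_mt = (22.0, 22.9)    # annual CO2 emissions range, megatons

# Implied average carbon intensity in g CO2 per kWh:
# 1 TWh = 1e9 kWh, 1 Mt = 1e12 g
for mt in emissions_mt:
    intensity = mt * 1e12 / (energy_twh * 1e9)
    print(f"{mt} Mt CO2 implies {intensity:.0f} g CO2/kWh")
```

The implied intensities of roughly 480 to 500 g CO2/kWh sit close to the carbon intensity of a coal-heavy grid mix, consistent with the study's finding that much mining draws on coal-based power.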

"We do not question the efficiency gains that blockchain technology could, in certain cases, provide," says Stoll. "However, the current debate is focused on anticipated benefits, and more attention needs to be given to costs."

Credit: 
Cell Press