Secondary school admissions system is still a work in progress

New research, examining how parents choose secondary schools, questions England's 'success rate' for admissions and suggests the 'good news' revealed today may not tell the full story. Researchers believe the system could, easily and cheaply, be made to work better.

A new report entitled 'School Choice, Admission and Equity of Access', published today by Lancaster University Management School and funded by the Nuffield Foundation, suggests that while allowing fewer choices enables local authorities to show that high proportions of children will be attending their first choice school, this is a hollow achievement if first choices are not a good reflection of true preferences. Currently, around half of local authorities in England allow parents to rank no more than three schools which, authors say, forces parents to think strategically when listing their choices. London families can list up to six schools.
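The strategic pressure that short lists create can be illustrated with a deliberately simple simulation. This is an illustrative sketch, not the model used in the report: all numbers are hypothetical, every family is assumed to share the same truthful ranking, and oversubscription is resolved by a random lottery.

```python
import random

def simulate(list_len, n_students=1000, n_schools=20, capacity=50, trials=20):
    """Fraction of students matched to a school on their list when each
    family may rank at most `list_len` schools. Schools 0..n_schools-1
    decline in quality, every family truthfully lists the top schools,
    and oversubscription is resolved by a random lottery."""
    matched = 0
    for _ in range(trials):
        prefs = list(range(list_len))         # everyone lists the best schools
        places = {s: capacity for s in range(n_schools)}
        order = list(range(n_students))
        random.shuffle(order)                 # lottery priority
        for _student in order:
            for s in prefs:
                if places[s] > 0:             # first listed school with space
                    places[s] -= 1
                    matched += 1
                    break
    return matched / (n_students * trials)
```

Under these deliberately extreme assumptions, a three-school cap matches only 15% of families to a listed school, while a six-school cap matches 30%: longer lists reach more capacity, which is one reason capping lists pushes real families toward 'safe', less-preferred choices.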

Using 2014 National School Preferences (NSP) data and detailed records on pupil and school characteristics from the National Pupil Database (NPD), the researchers are the first to take a close look at how parents rank schools - to investigate the importance of school quality, school proximity, and the probability of admission. They investigate how different types of family, differentiated by ethnicity, income, and their child's ability, choose the schools they apply to - and the resulting admission decisions.

In the 2014 data studied, 35% of parents listed only one choice; only 27% made as many choices as they could, and 39% put their local school at the top of their list. The authors say that some parents are playing very safe and second-guessing other parents' decisions because of the limited number of schools they can list, whereas others are simply safe because they live near the school they really want their child to attend. Interestingly, the data also reveal that just 55% of parents included their local school as one of their choices - suggesting widespread dissatisfaction with neighbourhood schools.

The research also highlights important inequalities in access to chosen schools in England. Minority ethnic families are 17% less likely to achieve their first-choice school - and the data suggest this is more of an issue for Black families than for Asian or other minority groups. Interrogation of the data suggests this is driven by ethnic minority parents putting greater weight on school quality than white British parents. Minority ethnic groups are, on average, willing to travel 21% further for a school that performs 10% better (in terms of its 5+ GCSE pass rate), whereas white parents are only prepared to travel an average of 11% further.

Parents of brighter pupils (those in the top third of the Year 6 Key Stage 2 tests) also place a 50% greater weight on school quality than the families of children in the bottom third. However, minority ethnic children who are high performers at Key Stage 2 are less likely to be admitted to the secondary schools on their rank-order lists than the children of white British parents. Living in areas with limited numbers of good schools, and the nature of admission criteria at the most in-demand schools, such as faith schools, could be contributing factors.

The study also found that:

Minority ethnic families use a considerably larger number of the preference choices open to them, on average. Because minority ethnic parents rank higher-quality, and therefore more popular, schools, this partly explains why they are the least likely to get their first choice of school.

White families who do not qualify for free-school meals make more cautious (i.e. closer) choices, and are less likely to choose high quality schools if the chance of admission is low.

Families living in London are less likely to get into their top-choice school than non-London families - but because they can list six choices, they may include long shots that are further afield.

Professor Ian Walker, an education economist from Lancaster University Management School, said: "By disentangling parental choices from admission criteria, we find that the system is a lottery for many. Parents, faced with a limit on the number of schools that they can list, have to think strategically and take into account the behaviour of other parents. As a result, many act conservatively. Failing to be conservative might result in attending a school that has spare capacity - because it's unpopular with other parents too.

"Our analysis suggests that some groups of parents would have a greater chance of achieving a good school for their children if they were allowed to rank more schools. Extending the length of lists that parents are able to specify is a simple and cheap intervention for local authorities to implement. Our results found that this would also bring the most benefit to minority ethnic families who are currently faring worse."

The findings of the research could, the authors suggest, be used by local authorities to create an algorithm that helps parents make the best choices from the surrounding schools, whatever their quality.

"Parents need easier access to more detailed information to make more informed school choice decisions," Professor Walker said. "When we choose a flight or a hotel, we expect to be presented with clear information on the cost and benefits of our options to help us make informed choices. But, when it comes to school choices, local authority websites are bereft of information. By better using technology to inform and guide, they could easily and cheaply empower parents to make better choices. It's a good time to do this - it would be a very cheap way of levelling up at a time when school capacity is getting tighter."

Ruth Maisey, Education Programme Head at the Nuffield Foundation said:

"We welcome this research which shows that the secondary school admissions system can be improved relatively easily and cheaply by local authorities to reduce inequalities faced by minority ethnic groups. Allowing parents to list more schools and providing them with better information can make the system fairer and more transparent. By encouraging parents to list their 'true' preferences, local authorities would have more accurate data on the real popularity of schools, which could lead to better local decision-making regarding capacity and school improvement."

Credit: 
Lancaster University

Breast cancer treatment costs highest among young women with metastatic cancer

(CHAPEL HILL, N.C., June 10, 2020) Cancer's untold toll may be a financial one, especially among young women fighting advanced breast cancer.

New research by the University of North Carolina at Chapel Hill shows much higher costs for treating metastatic breast cancer patients than for earlier-stage cancer patients or those without cancer.

The findings, published in the journal Breast Cancer Research and Treatment, revealed that the largest expected costs were among women aged 18-44. Breast cancer in younger women is typically diagnosed at more advanced stages, and is more aggressive and less responsive to treatment.

"Our results highlight the tremendous cost burden associated with metastatic breast cancer among working-age women, particularly during the years after initial treatment of metastatic disease as well as at the end of life," said study co-author Stephanie Wheeler, Ph.D., MPH, professor of health policy and management at the Gillings School of Global Public Health and researcher at the Center for Health Promotion and Disease Prevention.

For example, among women aged 18-44, the incremental average monthly cost of treating metastatic breast cancer was $4,463, compared with $2,418 per month for treating stage 1 cancer. Among other age groups, the treatment costs for advanced cancer were not statistically different.

Women living in North Carolina and treated for breast cancer between 2003 and 2014 were included in the study, which was funded by the U.S. Centers for Disease Control and Prevention.

In the study group, 4,806 had metastatic breast cancer, meaning their cancer had spread to other parts of the body, and 21,772 had non-metastatic cancer. The patient data was collected in a cancer registry coordinated by the Lineberger Comprehensive Cancer Center.

Using statistical modeling based on insurance claims data, researchers estimated medical costs for patients with metastatic breast cancer.

For comparison, they also estimated costs for patients with earlier-stage breast cancer and 109,631 women with no cancer who were in the same age group, from the same county of residence, and who had the same type of health insurance.

For women with metastatic breast cancer, where the five-year survival rate is only 26.3% compared with 98.8% for localized cancer, treatment continues for longer and includes end-of-life care. These factors contribute to the high medical costs associated with metastatic breast cancer, which can be a financial burden for women and their families.

The finding that medical costs are higher for younger and middle age women may reflect their desire for more aggressive treatment and willingness to pay for additional months of life, researchers said, or it may reflect breakdowns in shared decision making between patients and practitioners, leading to treatments with minimal financial and health benefits for patients.

"Our results suggest that we spend nearly twice as much in the last year of life for women that die of breast cancer compared to other causes of death," said study lead author Justin Trogdon, Ph.D., a professor of health policy and management at Gillings School of Global Public Health and researcher at the Center for Health Promotion and Disease Prevention.

"We should work to ensure that end-of-life spending for metastatic breast cancer represents women's preferences and is of high value," said Trogdon, a health economist.

By identifying the age groups and phases of care where medical costs are the highest, the results of this study may inform decision makers about where to invest resources, for example, which groups of patients may be in need of extra financial or psychological support.

The study can also inform future research into how to improve metastatic breast cancer treatment for populations or treatment phases that are currently associated with the highest medical costs.

Credit: 
University of North Carolina at Chapel Hill

Antarctic sea-ice models improve for the next IPCC report

image: Lettie Roach in her office.

Image: 
Dave Allen

The world of climate modeling is complex, requiring an enormous amount of coordination and collaboration to produce. Models feed on mountains of different inputs to run simulations of what a future world might look like, and can be so big -- in some cases, lines of code in the millions -- they take days or weeks to run. Building these models can be challenging, but getting them right is critical for us to see where climate change is taking us, and importantly, what we might do about it.

A study in Geophysical Research Letters evaluates 40 recent climate models focusing on sea ice -- the relatively thin layer of ice that forms on the surface of the ocean -- around Antarctica. The study was coordinated and produced to inform the next Intergovernmental Panel on Climate Change report, due out in 2021.

All the models projected decreases in the areal coverage of Antarctic sea ice over the 21st century under different greenhouse gas emission scenarios, but the amount of loss varied considerably between the scenarios.

"I am really fascinated by Antarctic sea ice, which the models have struggled more with than Arctic sea ice," said lead author Lettie Roach, a postdoctoral researcher at the University of Washington. "Not as many people are living near the Antarctic and there haven't been as many measurements made in the Antarctic, making it hard to understand the recent changes in sea ice that we've observed through satellites."

The models are known as coupled climate models, meaning they incorporate atmospheric, ocean, terrestrial and sea ice models to project what the future holds for our climate system. We are all familiar with the story of soon-to-be ice-free summers in the Arctic and the implications that may have on global trade. But what's driving change around Antarctic sea ice and what's expected in the future is less clear.

This study's assessment of Antarctic sea ice in the new climate models is among the first.

"This project arose from a couple of workshops that were polar climate centered, but no one was leading an Antarctic sea ice group," said Roach. "I put my hand up and said I would do it. The opportunity to lead something like this was fun, and I'm grateful to collaborators across many institutions for co-creating this work."

The Antarctic is characterized by extremes. The highest winds, largest glaciers and fastest ocean currents are all found there, and getting a handle on Antarctic sea ice, whose extent annually grows and shrinks six-fold, is critically important. To put that into perspective, the seasonal change in ice cover is roughly the size of Russia.

The icy parts of our planet -- known as the cryosphere -- have an enormous effect on regulating the global climate. By improving the simulation of Antarctic sea ice in models, scientists can increase their understanding of the climate system globally and how it will change over time. Better sea ice models also shed light on dynamics at play in the Southern Ocean surrounding Antarctica, which is a major component of the Southern Hemisphere climate system.

"The previous generation of models was released around 2012," says Roach. "We've been looking at all the new models released, and we are seeing improvements overall. The new simulations compare better to observations than we have seen before. There is a tightening up of model projections between this generation and the previous, and that is very good news."

Credit: 
University of Washington

Proposed seismic surveys in Arctic Refuge likely to cause lasting damage

image: An early spring view of tracks left by a 3D seismic survey conducted in winter 2017-2018. The spacing of the tracks in the photo is 200-by-400 meters, rather than the 200-by-200-meter grid proposed by SAExploration. The photo is not an example of the long-term damage found by researchers in other areas.

Image: 
Photo by Matt Nolan

Winter vehicle travel can cause long-lasting damage to the tundra, according to a new paper by University of Alaska Fairbanks researchers published in the journal Ecological Applications.

Scars from seismic surveys for oil and gas exploration in the Arctic National Wildlife Refuge remained for decades, according to the study. The findings counter assertions made by the Bureau of Land Management in 2018 that seismic exploration causes no "significant impacts" on the landscape. That BLM determination would allow a less-stringent environmental review process of seismic exploration in the Arctic Refuge 1002 Area.

UAF's Martha Raynolds, the lead author of the study, said she and other scientists have documented lasting impacts of winter trails throughout years of field research. Their paper, authored by an interdisciplinary team with expertise in Arctic vegetation, snow, hydrology and permafrost, summarizes what is currently known about the effects of Arctic seismic exploration and what additional information is needed to effectively regulate winter travel to minimize impacts.

A grid pattern of seismic survey lines is used to study underground geology. These trails, as well as trails caused by the camps that support workers, damage the underlying tundra, even when travel is limited to frozen, snow-covered conditions. Some of the existing scars on the tundra date back more than three decades, to when winter 2D seismic surveys were initiated. Modern 3D surveying requires a tighter network of survey lines, with larger crews and more vehicles. The proposed 1002 Area survey would result in over 39,000 miles of tracks.

"Winter tundra travel is not a technology that has changed much since the '80s," said Raynolds, who studies Arctic vegetation at UAF's Institute of Arctic Biology. "The impacts are going to be as bad or worse, and there are proposing many, many more miles of trails."

Conditions for winter tundra travel have become more difficult, due to a mean annual temperature increase of 7-9 degrees F on Alaska's Arctic coastal plain since 1986. Those warmer conditions have contributed to changing snow cover and thawing permafrost. The impact of tracks on the vegetation, soils and permafrost eventually changes the hydrology and habitat of the tundra, which affects people and wildlife who rely on the ecosystem.

The paper argues that more data are needed before proceeding with Arctic Refuge exploration efforts. That includes better information about the impacts of 3D seismic exploration; better weather records in the region, particularly wind and snow data; and high-resolution maps of the area's ground ice and hydrology. The study also emphasizes that the varied terrain and topography in the 1002 Area are different from other parts of the North Slope, making it more vulnerable to damage from seismic exploration.

Credit: 
University of Alaska Fairbanks

Noise disturbs the brain's compass

Our sense of direction tends to decline with age. In the scientific journal "Nature Communications", researchers from the German Center for Neurodegenerative Diseases (DZNE) and experts from the USA report on new insights into the causes of this phenomenon. According to their findings, the main source of errors in determining spatial position and apparently the cause of age-related orientation problems is a "noisy" and therefore imprecise perception of the speed at which one is moving. These study results could contribute to the development of diagnostic tools for early detection of dementia.

From visual stimuli to muscle feedback and signals relayed by the vestibular system - the human brain uses a wide range of sensory inputs to determine position and to guide us through space. An essential part of the necessary information processing happens in the "entorhinal cortex". In this area, which is present in both brain hemispheres, there are special neurons that generate a mental map of the physical environment. Thus, information on real space is translated into a "data format", which the brain can process. "The human navigation system works quite well. But it is not without flaws", explained Prof. Thomas Wolbers, principal investigator at the DZNE, Magdeburg site. "It is well known that there are people with good orientation skills and those who find it harder to find their way around. This ability usually diminishes with age, because older people generally find spatial orientation more difficult than younger individuals, especially in unfamiliar surroundings. Therefore, the chances of getting lost increase with age."

Study in virtual space

To understand the causes of this decline, DZNE scientists led by Thomas Wolbers, in collaboration with experts from the US Massachusetts Institute of Technology and the University of Texas at Austin, designed a specific experiment: A total of about 60 cognitively healthy young and older adults who were fitted with "virtual reality" goggles had to move and orient themselves - separately from each other - within a digitally generated environment. Simultaneously, participants also moved physically along convoluted paths. They were assisted by an experimenter who led the individual test person by the hand. In doing so, real locomotion led directly to movements in virtual space. "This is an artificial setting, but it reflects aspects of real situations," said Wolbers.

During the experiment, participants were asked several times to estimate the distance and direction to the starting point of the path. Because the virtual environment offered only a few visual cues for orientation, participants had to rely mainly on other stimuli. "We looked at how accurately participants were able to assess their position in space and thus tested what is known as path integration. In other words, the ability to determine position based on body awareness and the perception of one's own movement. Path integration is considered a central function of spatial orientation," explained Wolbers.

"Noisy" model

Just as important as the experimental setup was the mathematical modeling of the measured data. This was based on an approach to describe interfering effects on position determination as noise. "The human body and its sensory organs are far from perfect. Information processing in the brain is therefore affected by glitches, which can be interpreted as noise. This is similar to a radio broadcast, where noise can superimpose the actual signal," said Wolbers. "With the help of our mathematical model, we were able to unravel the contributions of various sources of error and identify what distorts position tracking the most and what has little effect. Such sources of error have never been investigated at this level of detail."

For example, data evaluation showed that body rotation in the direction of the path's starting point was consistently quite accurate. And memory errors played virtually no role. "To determine the location in space while you are moving, you have to constantly update your position in your mind. This requires you to remember where you were moments before. In this respect, our analysis found only minimal errors," said Wolbers.

A matter of velocity

The research team's conclusion: Errors in path integration are mainly caused by "accumulating internal noise" in information processing - and this phenomenon is probably a consequence of inaccuracies in the perception of movement speed. "It should be noted that humans intuitively estimate distances covered on the basis of how long and how fast they were previously travelling. Yet, our study suggests that the critical source of error for determining position is not time perception, but apparently random fluctuations in the speed information that gets to the brain," said Wolbers.

This source of error was dominant both in the younger (average age 22 years) and in the older adults (average age 69 years). "The young subjects were generally better at orientation than the older study participants. Critically, the accumulating internal noise increased with age. This phenomenon is apparently the main cause of deficits in path integration and probably also the trigger for age-related orientation problems. However, we do not yet know the exact origin of this noise and why it increases with age," said Wolbers.

Early detection of dementia

In previous studies, Wolbers and other DZNE researchers had found that in cognitively healthy older adults, certain neurons of the entorhinal cortex - so-called grid cells, which are essential for spatial navigation - fire irregularly: their activity is unstable. This was related to age-related difficulties in orientation. The current results suggest that these instabilities are not due to malfunctions of the grid cells themselves, but are caused by noise from outside. The problem is therefore not in the grid cells but in the flow of information that reaches the entorhinal cortex. This points to a possibility for the early diagnosis of Alzheimer's.

"Alzheimer's disease is associated with damage to the entorhinal cortex at an early stage. It is therefore reasonable to assume that orientation disorders such as those that manifest in Alzheimer's originate in this area of the brain. Unlike age-related orientation difficulties, as our current study suggests," explained Wolbers. "This could provide an opportunity to distinguish normal age-related orientation problems from those caused by Alzheimer's. In the long term, our aim is to develop diagnostic methods that detect Alzheimer's at an early stage. This might be possible using technology such as virtual reality. We are currently preparing clinical studies on this."

Credit: 
DZNE - German Center for Neurodegenerative Diseases

A continuous simulation of Holocene effective moisture change in East and Central Asia

image: Schematic illustration of the lake energy and water balance models.

Image: 
©Science China Press

Studying lake evolution and environmental change during the Holocene is of great significance for predicting future global climate change. Relevant research has mostly focused on reconstructions of lake level, regional effective moisture and paleoenvironment using geomorphic, sedimentological, and biostratigraphic methods. However, as paleoclimatology has developed, it has become clear that using lake sediments alone as indicators to reconstruct lake evolution, water balance fluctuation and paleoenvironmental change cannot fully explain the mechanisms of past climate change. It is therefore necessary to quantitatively reconstruct and simulate past climate change from a new perspective.

A lake's water balance constantly responds to changes in climatic conditions on different time scales. The researchers constructed a virtual lake system by assuming that each land grid cell is a separate lake. Then, based on a transient climate evolution model, a lake energy balance model and a lake water balance model, they continuously and quantitatively simulated Holocene effective moisture change, represented by the variability of virtual lake levels, in East and Central Asia. The virtual lake level, lake area, water depth and lake salinity are not equivalent to actual values, but these assumptions can be used to estimate relative changes in regional effective moisture.

Evaporation from the lake surface depends on the heat capacity of water, water density, lake depth, lake surface temperature, the shortwave and longwave radiation absorbed by the water surface, the longwave radiation emitted by the water surface, latent heat flux, sensible heat flux, and other factors. The components of the lake water balance model mainly include lake surface precipitation, evaporation and runoff, while runoff change is closely related to the processes of runoff generation and confluence in the basin. Lake models forced by a coupled atmosphere-ocean general circulation model indicate that since the early Holocene, changes in precipitation in northern China and on the Tibetan Plateau are consistent with changes in summer solar radiation at Northern Hemisphere low latitudes, showing a decreasing trend. The decrease in summer solar radiation in the Northern Hemisphere reduced the thermal contrast between land and sea and produced a weak Asian summer monsoon, so that less water vapor was transported inland. Accordingly, the decline in effective moisture through the Holocene on the Tibetan Plateau was primarily a result of decreased precipitation. However, decreased effective moisture in northern China was influenced by the combined effects of decreased precipitation and increased evaporation caused by longwave and shortwave radiation. The late Holocene high lake level in southern China was maintained by high coastal precipitation, possibly resulting from changes in local ocean feedbacks. Increased precipitation contributed to the rise in effective moisture in northern Central Asia, which was induced by the strengthening of the westerly circulation.
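The water-balance bookkeeping behind the virtual lakes can be reduced to a minimal sketch. This is our own simplification for illustration, not the published model (which also couples in the energy-balance terms listed above): each virtual lake's level simply integrates precipitation minus evaporation plus runoff at every step, and only the relative change is meaningful.

```python
def virtual_lake_levels(precip, evap, runoff):
    """Minimal 'virtual lake' water balance: the level integrates
    P - E + R over time. Inputs are equal-length sequences in any
    consistent unit (e.g. mm per time step); absolute levels are
    arbitrary - only the relative change proxies effective moisture."""
    level, levels = 0.0, []
    for p, e, r in zip(precip, evap, runoff):
        level += p - e + r          # water balance for this step
        levels.append(level)
    return levels

# drier forcing (rising evaporation) drives the virtual level down
print(virtual_lake_levels([2, 2, 2], [1, 2, 3], [0, 0, 0]))  # [1.0, 1.0, 0.0]
```

Run over a Holocene-length forcing series from the climate model, a declining curve of this kind is what the study reads as decreasing effective moisture.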

This study not only analyzes the factors affecting Holocene climate change in East and Central Asia through the drivers of water-balance fluctuation, but also provides a new method to quantitatively reconstruct and simulate past climate change.

Credit: 
Science China Press

Ebola transmission risks would be taken more seriously with ground-up interventions

A study led by the University of Kent's Durrell Institute of Conservation and Ecology (DICE) has found significant differences in disease risk perception and channels of information about Ebola virus disease (EVD) in rural areas and urban centres of Guinea, West Africa.

Findings were established after researchers investigated residents' opinions of the wildlife potentially posing a risk for EVD transmission to humans, wildmeat consumption before and during the 2013-2016 EVD outbreak in Guinea, and the ways in which EVD transmission risks were communicated during the outbreak.

The research led by Dr Tatyana Humle (DICE) alongside colleagues from Beijing Forestry University, China and other international institutions, found that rural people mainly received information about EVD through awareness-raising missions, especially in villages, as opposed to urban respondents who also gained their information through newspapers and radio.

Bats, chimpanzees, monkeys, warthogs, crested porcupines, duikers and cane rats were perceived as potential transmitters of EVD, but only bats and chimpanzees were reportedly consumed less often during the epidemic period, even though a wildmeat ban was in place. The reduced consumption of bats and chimpanzees, together with an increase in domestic meat consumption, shows that perceived disease risk influenced consumption behaviour. Yet many respondents in rural areas still did not strongly believe that wildlife could act as vectors of EVD, underestimating the risk associated with handling, capturing, butchering, and transporting infected wild animal carcasses.

Respondents who believed that EVD is not natural blamed developed countries for its spread. These individuals tended to maintain their wildmeat consumption habits and to mistrust the information conveyed. The high cost and low availability of domestic meat were also cited as barriers to consuming alternative meat protein, especially in rural areas.

Dr Humle said: 'Our research indicates that future public health and behavioural change campaigns must use carefully developed messaging in relation to the risks of zoonoses. There should also be a bigger focus on raising awareness of affordable and accessible alternative protein resources. This will be more beneficial to residents than imposing bans or restrictions. In regions such as West Africa, the relationship between socio-economic context, food security, and public health is so important and requires greater attention.'

Credit: 
University of Kent

Link between liver and heart disease could lead to new therapeutics

image: Peroxisomal protein import is significantly impaired in aged fly hepatocytes. The number of peroxisomes (red fluorescence) does not change, whereas peroxisomal protein import (yellow fluorescence) decreases with age. The relationship among peroxisomes, liver function and heart aging might become a promising target for new therapies.

Image: 
Hua Bai

AMES, Iowa - A new study that looks closely at the cardiac health of flies provides new evidence that liver dysfunction may lead to deterioration of the heart.

The research fills in gaps in how scientists understand the links between heart health and other tissues and could inform the development of new therapies in human medicine, said Hua Bai, an assistant professor of genetics, development and cell biology at Iowa State University. Bai's lab has performed previous studies on how cardiac health in flies changes with age. The new study, published in the academic journal Nature Communications, also covers new ground in the function of a poorly understood organelle called the peroxisome, which may play a major role in how organisms age.

"We were thinking outside the heart for this paper," Bai said. "We wanted to find out if other tissues affect cardiac function during aging. There is significant data suggesting that liver function actually is a risk factor for cardiac disease. A patient with a lot of liver dysfunction often develops cardiac disease. This is a concern because you may have two diseases that you have to deal with for these patients."

But Bai said no direct link between liver and heart disease has emerged in experiments, leaving medical professionals unsure if the two factors share a causal relationship or if there's simply a correlation. Bai's lab attempted to fill that gap by studying the interaction between liver disease and the function of cardiac muscles in flies.

Protecting the liver maintains heart health

Previous studies from Bai's lab showed that manipulating genes in the cardiac muscles of flies could restore the heart function of older flies to a state similar to younger flies, essentially turning back the clock on cardiac tissues. In the new experiments, the researchers manipulated various genes governing liver function in flies to see how that would affect heart health as the flies aged.

"Our findings demonstrate we can protect the liver of old animals and maintain the health of the heart without doing any direct intervention on the heart tissue," said Kerui Huang, a graduate student in Bai's laboratory and the lead author of the study.

Much of the genetic work the researchers conducted focused on peroxisomes, understudied organelles inside cells that regulate key lipid metabolic processes and detoxification critical for brain and liver function.

"Looking at all the biology literature, we don't know much about how peroxisome function changes in aged animals," Bai said. "We show that peroxisomal protein import function is significantly impaired in aged flies. Research like ours could open up another new field to study how peroxisomes regulate tissue aging."

Huang said that although flies appear to be highly dissimilar to humans, human medicine still has much to gain from studying fly biology. For instance, a fly's liver and heart share many functions with their human counterparts.

Pharmaceutical companies have shown great interest in finding new avenues to treat age-related disease, Bai said. The relationship between peroxisomes, liver function and heart aging described in the new study might become a promising target for new therapies and drugs, he said.

Credit: 
Iowa State University

Climate change: Warm springtime's unwelcome legacy

A new study shows that the severe impact of the summer drought that hit Europe in 2018 was partly due to the spring heatwave that preceded it, which triggered early and rapid plant growth, depleting soil moisture.

With lots of sunshine, high temperatures and, ultimately, drought, the summer of 2018 was extremely dry in Europe - particularly Northern and Central Europe. Among the consequences of the lack of precipitation were forest fires and significant harvest losses, which had a considerable economic impact: in Germany alone, compensation payments to farmers amounted to 340 million euros. The 2018 drought differed from the dry summers of 2003 and 2010 insofar as it was preceded, over much of Central Europe, by an unusual spring heatwave.

An international collaboration, led by researchers Ana Bastos and Julia Pongratz of Ludwig-Maximilians-Universitaet (LMU) in Munich, has now shown that the spring heatwave amplified the effects of the subsequent summer drought. The impact of the summer drought on the productivity and carbon balance of ecosystems varied on a regional scale, depending on the dominant type of vegetation. In light of ongoing global warming, the incidence of summer heatwaves and periodic droughts is expected to rise. According to the authors of the study, the adoption of alternative land management strategies could offer ways to mitigate droughts and their effects. The findings appear in the online journal Science Advances.

Research studies of the summer droughts in 2003 and 2010 have revealed that ecosystems absorbed less carbon dioxide than usual, because their productivity was restricted owing to the scarcity of water, the high temperatures and fire damage. "Little is known about whether and how preceding weather parameters influence the response of ecosystems to extreme conditions during the summer," says the lead author of the new study, Ana Bastos, who now heads a research group at the Max Planck Institute for Biogeochemistry in Jena. "To answer this question, we used the year 2018 in Europe as a case-study and carried out climate simulations incorporating 11 different vegetation models."

The results show that the warm and sunny conditions that prevailed in the spring led to more vigorous vegetation growth, which also started earlier than usual. This in turn increased rates of uptake of carbon dioxide during spring. However, the impact on annual productivity - and therefore on the overall carbon balance - was highly variable across regions. "When plants resume growth earlier in the year, they use more water," says Bastos. "In Central Europe, rapid plant growth in the spring significantly reduced the water content of the soil. By the summer, the level of soil moisture was already insufficient to maintain the biomass that had accumulated, making ecosystems more vulnerable to the effects of the drought." According to the models, this effect explains about half of the summer's soil moisture deficit. Therefore, in Central Europe the high spring temperatures had a negative impact on the productivity of ecosystems and net uptake of carbon dioxide later in the year.

In Scandinavia on the other hand, the earlier onset of growth compensated for the drought-induced loss of productivity later in the summer. As a result, levels of ecosystem activity, as well as the annual carbon balance, were either neutral or slightly on the positive side. The authors attribute this different regional behavior to the specific vegetation in the two regions. In Central Europe, arable land and pastures dominate the landscape, while forests cover much of Scandinavia. "Trees use water somewhat more economically," says Bastos. "If they grow faster in the spring, they also consume more water than they otherwise would. But they can control water loss from transpiration by adjusting the opening of stomatal pores in their leaves," she explains. Furthermore, trees have deeper roots than grasses or crop plants, which enables them to tap the water present at greater depths during periods of drought. For these reasons, the boreal forests of Northern Europe maintained almost normal levels of carbon fixation, even during the strong drought.

Overall, the new simulations indicate that the warm spring of 2018 either amplified the vulnerability of ecosystems to summer drought (in Central Europe) or mitigated the negative effects of a warm and dry summer (in Scandinavia), depending on differences in land cover and water use by vegetation. These findings suggest that better data on growth rates of vegetation in spring could serve as a supplementary early indicator of impending summer droughts. Moreover, the negative impacts of future heatwaves and droughts could perhaps be reduced with the help of alternative approaches to land management. "In the long term, owing to climate change, spring vegetation will regularly grow at faster rates, consuming more water and increasing the risk of summer droughts," says Julia Pongratz. "It might be possible to make ecosystems more resilient by altering the plant cover - for example, by planting stands of trees in the immediate vicinity of cropland. But more extreme water shortages in summer will themselves alter the nature of ecosystems, if threshold levels of mortality and fire incidence are more frequently exceeded. So it is not at all clear whether Europe's ecosystems will continue to serve as carbon dioxide sinks in the future."

Credit: 
Ludwig-Maximilians-Universität München

NASA finds post-tropical depression Cristobal soaking the Great Lakes

image: The GPM's core satellite passed over Cristobal on June 10 at 2 a.m. EDT (0600 UTC). GPM found heaviest rainfall (orange) occurring in two areas. One area was north and west of Lake Superior, north of Rossport and Red Rock, Ontario, Canada. The second area was over Georgian Bay in the eastern side of Lake Huron. In both places, heavy rain (orange) was falling at a rate of 1 inch (25 mm) per hour. Light rain (blue) appears stretched around the northern and eastern side of the system.

Image: 
NASA/NRL

NASA's GPM satellite gathered data on what is now Post-Tropical Cyclone Cristobal and revealed some areas of heavy rain were occurring. Cristobal was bringing rainfall and gusty winds to the Great Lakes Region and still generating warnings.

Warnings and Advisories

On June 10, Cristobal was designated a post-tropical cyclone and the storm has triggered several watches and warnings in the Great Lakes area. The National Weather Service's Weather Prediction Center (WPC) in College Park, Md. issued a Lakeshore Flood Warning for the northern shores of Lake St. Clair. Lakeshore Flood Advisories are in effect for the Lake Michigan shoreline of northern Lower Michigan, the Lake Michigan shoreline of Upper Michigan and the Lake Huron shoreline of Upper Michigan.

In addition, a Gale Warning is in effect for Lake Michigan, eastern Lake Superior and portions of Lake Huron. Wind Advisories are in effect for parts of Wisconsin and Michigan.

What is a Post-tropical Cyclone?

NOAA's National Hurricane Center defines a Post-tropical cyclone as a former tropical cyclone. This generic term describes a cyclone that no longer possesses sufficient tropical characteristics to be considered a tropical cyclone. Post-tropical cyclones can continue carrying heavy rains and high winds. Former tropical cyclones that have become fully extratropical and remnant lows are two classes of post-tropical cyclones.

Rainfall Estimates

When the Global Precipitation Measurement mission, or GPM, core satellite passed over Cristobal on June 10 at 2 a.m. EDT (0600 UTC), it found the heaviest rainfall occurring in two areas. One area was north and west of Lake Superior, north of Rossport and Red Rock, Ontario, Canada. The second area was over Georgian Bay on the eastern side of Lake Huron. In both places, rain was falling at rates of 1 inch (25 mm) per hour. Light rain stretched around the northern and eastern side of the system, falling at less than 0.2 inches (5 millimeters) per hour.

Forecast Rainfall

The WPC noted, "The primary rainfall threat with Cristobal has ended. Sporadic heavy rain is possible today across the Great Lakes, along and ahead of a cold front associated with extratropical Cristobal. Minor to moderate river flooding will continue across portions of the Mississippi Valley."

Cristobal's Status on Wednesday, June 10, 2020

At 5 a.m. EDT (0900 UTC) on June 10, the center of Post-Tropical Cyclone Cristobal was located near latitude 45.8 degrees north and longitude 88.2 degrees west. That places the center about 195 miles (310 km) north-northeast of Madison, Wisconsin and about 185 miles (295 km) west of Sault Ste. Marie, Michigan. The post-tropical cyclone is moving toward the north-northeast near 30 mph (48 kph) and this motion is expected to continue as Cristobal tracks into Ontario, Canada.

Maximum sustained winds are near 40 mph (65 kph) with higher gusts. Little change in strength is forecast during the next 48 hours. The estimated minimum central pressure is 983 millibars.

Cristobal's Forecast Path

WPC noted that in addition to rainfall, winds gusting over 40 mph are expected early during the morning of June 10 over portions of Wisconsin and Michigan close to the Great Lakes. In addition, a few tornadoes are possible today across the Great Lakes region, with the greatest chances in parts of Michigan, Indiana and Ohio.

Tropical cyclones/hurricanes are the most powerful weather events on Earth. NASA's expertise in space and scientific exploration contributes to essential services provided to the American people by other federal agencies, such as hurricane weather forecasting.

GPM is a joint mission between NASA and the Japan Aerospace Exploration Agency, JAXA. The Suomi NPP satellite is a joint mission between NASA and NOAA.

For updated forecasts, visit: http://www.nhc.noaa.gov

Credit: 
NASA/Goddard Space Flight Center

Kissing bugs also find suitable climatic conditions in Europe

image: The triatomine or "kissing" bug Triatoma infestans.

Image: 
Dorian D. Dörge for Goethe University Frankfurt

An infection with Chagas disease is only possible in Latin America since the insect species that spread the disease only occur there. Scientists at Goethe University and the Senckenberg Society for Natural Research have now used ecological niche models to calculate the extent to which habitats outside of the Americas may also be suitable for the bugs. The result: climatically suitable conditions can be found in southern Europe for two kissing bug species; along the coasts of Africa and Southeast Asia the conditions are suitable for yet another species. The Frankfurt scientists therefore call for careful monitoring of the current distribution of triatomine bugs. (eLife DOI: 10.7554/eLife.52072)

The acute phase of the tropical Chagas disease (American Trypanosomiasis) is usually symptom-free: only in every third case does the infecting parasite (Trypanosoma cruzi) cause any symptoms at all, and these are often unspecific, such as fever, hives and swollen lymph glands. But the parasites remain in the body, and many years later chronic Chagas disease can become life-threatening with pathological enlargement of the heart and progressive paralysis of the gastrointestinal tract. There is no vaccine for Chagas disease. The WHO estimates that 6 to 7 million people are infected worldwide, with the majority living in Latin America (about 4.6 million), followed by the USA with more than 300,000 and Europe with approximately 80,000 infected people.

Chagas parasites are transmitted by predatory blood-sucking bugs that ingest the pathogen along with their blood meal. After a development period in the intestinal tract of the bugs, the parasites are shed in the bug's faeces. When the victim scratches the intensely itchy bite, the highly infectious faeces are unintentionally rubbed into the wound. Oral transmission, by eating food contaminated with triatomine bug faeces, is also possible.

Researchers led by the Frankfurt parasitologists and infection biologists Fanny Eberhard and Professor Sven Klimpel have used niche models to investigate which climatic conditions around the world are suitable for Latin American kissing bugs. In particular, temperature and precipitation patterns were incorporated into the calculations of a region's climatic suitability. The researchers were able to show that, in addition to Latin America, Central Africa and Southeast Asia currently also offer suitable habitats for triatomines. Two of the triatomine species, Triatoma sordida and Triatoma infestans, now find suitable habitats in temperate regions of southern Europe, including Portugal, Spain, France and Italy. Both species frequently transmit the dangerous parasites in Latin America and can be found inside or near houses and stables, where they take their nightly blood meals, preferably from dogs, chickens and humans.
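
The climatic-suitability idea can be illustrated with a minimal "envelope" model in the spirit of classic BIOCLIM approaches. This is an illustrative sketch only, not the authors' actual niche-modelling method; the occurrence records, variable names, and thresholds are invented for the example:

```python
# Minimal climatic-envelope ("BIOCLIM-style") suitability sketch.
# A site is deemed climatically suitable if its mean temperature and
# annual precipitation both fall within the ranges observed at known
# occurrence sites of the bug species. Illustrative data only.

def envelope(occurrences):
    """Compute min/max bounds per climate variable from occurrence records."""
    temps = [o["temp_c"] for o in occurrences]
    precips = [o["precip_mm"] for o in occurrences]
    return {
        "temp_c": (min(temps), max(temps)),
        "precip_mm": (min(precips), max(precips)),
    }

def is_suitable(site, bounds):
    """A site is suitable if every climate variable lies inside the envelope."""
    return all(lo <= site[var] <= hi for var, (lo, hi) in bounds.items())

# Invented occurrence records for a hypothetical triatomine species.
occurrences = [
    {"temp_c": 22.0, "precip_mm": 900},
    {"temp_c": 26.5, "precip_mm": 1400},
    {"temp_c": 18.0, "precip_mm": 600},
]
bounds = envelope(occurrences)

print(is_suitable({"temp_c": 20.0, "precip_mm": 800}, bounds))  # inside envelope
print(is_suitable({"temp_c": 10.0, "precip_mm": 800}, bounds))  # too cold
```

Real niche models typically use many more climate variables and probabilistic methods, but the core logic - projecting the climatic conditions at known occurrences onto new regions - is the same.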

Another triatomine species, Triatoma rubrofasciata, has already been detected outside Latin America. The model calculations by the Frankfurt scientists identify suitable habitats along large areas of the African and Southeast Asian coasts.

Professor Sven Klimpel explains: "There are people living in Europe who were infected with Chagas in Latin America and are unknowingly carriers of Trypanosoma cruzi. However, the parasite can currently only be transmitted to other people through untested blood donations or from a mother to her unborn child. Otherwise, Trypanosoma cruzi requires triatomine bugs as intermediate hosts. And these bugs are increasingly finding suitable climatic conditions outside Latin America. Based on our data, monitoring programmes on the distribution and spread of triatomine bugs would therefore be advisable. Mandatory reporting of Chagas disease cases could also be helpful."

Credit: 
Goethe University Frankfurt

Newly synthesized fungal compound can switch on a self-destruct button for cancer

video: Synthesizing fungal molecule capable of reactivating the self-destruct gene in aggressive cancer cells.

Image: 
Tokyo University of Science

All human body cells have a certain lifespan, during which they perform their essential duties. At the end of this lifespan, they reach senescence and, no longer able to perform those duties, die. This programmed death, a process called apoptosis, is encoded in their genes and causes them to self-destruct in order to make way for fresh, young, and healthy cells to replace them.

Mutations in a special gene called p53 can sometimes interfere with this process. Caused by aging, ultraviolet light, and various mutagenic compounds, these mutations can disable the apoptosis gene, resulting in "zombie" cells that refuse to die and continue to multiply, spreading the disabled gene and replacing healthy working cells with undying, rapidly growing tumors. This is the disease that we call cancer, and it takes many forms depending on which body cells develop the mutations.

Previously, scientists identified an anticancer compound called FE399 in a species of filamentous fungus called Ascochyta, which is often found afflicting common food crops such as cereals. The compound belongs to a specific group of depsipeptides, peptide-like molecules in which some amide bonds are replaced by ester bonds, and was shown to induce apoptosis in cancerous human cells in vitro, particularly colorectal cancer cells, proving its worth as an anticancer chemical.

Unfortunately, owing to a variety of chemical complexities, the FE399 compound is not easy to purify, which hindered any plans for its widespread application in cancer treatment. It was thus clear that extracting FE399 from the fungus naturally would not be commercially feasible, and despite the promise of a powerful anticancer drug, research into this particular compound stalled.

The promise of a new anticancer treatment was tempting, however, and Prof Isamu Shiina, along with Dr Takayuki Tonoi and his team from the Tokyo University of Science, accepted the challenge. "We wanted to create a lead compound that could treat colon cancer, and we aimed to do this through the total synthesis of FE399," says Prof Shiina. Total synthesis is the complete chemical synthesis of a complex molecule from commercially available precursors, which allows mass production. The results of their extensive studies will be published in the European Journal of Organic Chemistry.

The team first needed to determine the structure of the depsipeptide. This was straightforward and could be done using commercially available, inexpensive materials. The subsequent procedures, however, required many steps and brought some small setbacks when isomers could not be isolated.

However, the team was rewarded for their efforts when, in a major breakthrough, their mass spectrometry and nuclear magnetic resonance studies confirmed that a trio of spots on a plate showed a chemical signature identical to the known formula of FE399, meaning they had successfully recreated FE399 synthetically.

Their technique was found to have an overall yield of 20%, which is quite promising for future large-scale production plans. "We hope that this newly produced compound can provide an unprecedented treatment option for patients with colorectal cancer, and thus improve the overall outcomes of the disease and ultimately improve their quality of life," states Prof Shiina.
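
To see what a 20% overall yield implies, recall that the overall yield of a multi-step synthesis is the product of the individual step yields. The step yields below are purely hypothetical, chosen only to illustrate the arithmetic; the actual route's steps and yields are reported in the paper:

```python
# Overall yield of a multi-step synthesis = product of per-step yields.
# The step yields here are hypothetical, for illustration only.
from math import prod

step_yields = [0.90, 0.85, 0.80, 0.75, 0.70, 0.62]  # hypothetical fractions
overall = prod(step_yields)
print(f"overall yield: {overall:.1%}")  # roughly 20% overall
```

Even with individually respectable step yields, losses compound quickly, which is why a 20% overall yield for a total synthesis is considered promising for scale-up.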

Further research is needed to test the efficacy of FE399 in the treatment of other solid and blood-based cancers, and before mass production, the biological activities and structure of the FE399 molecule will need to be evaluated. But for now, the team from Tokyo University of Science are thrilled with their findings, and are positive that their research will help to improve treatments and therapies for patients with colorectal cancer.

Credit: 
Tokyo University of Science

Roadkill study identifies animals most at risk in Europe

image: Image of a Little Owl killed on a road taken during research by the team.

Image: 
Joaquim Pedro Ferreira

Around 194 million birds and 29 million mammals are thought to be killed each year on European roads, according to a new study that has ranked the most vulnerable species.

An international research team used 90 roadkill surveys from 24 European countries to create a new method of estimating both the bird and mammal species killed most often on roads, and the species most vulnerable to being wiped out of certain areas.

The research, published in Frontiers in Ecology and the Environment, found that the species killed most often were not necessarily the ones most vulnerable to disappearing completely. This means action to preserve wildlife when new roads are built risks being targeted at the wrong species under current methods.

Dr Manuela Gonzalez-Suarez, an ecologist at the University of Reading, and co-author of the study, said: "Road densities in Europe are among the world's highest, with 50% of the continent within 1.5km of a paved road or railway. Roads are therefore a significant threat to wildlife, and evidence shows deaths on them could even cause some species to disappear completely.

"Despite this, the long-term vulnerability of species is not currently considered when assessing the impact of new roads on wildlife, meaning we risk channeling support to the wrong species, doing nothing to help those most at risk. Better understanding which species are most vulnerable to roads is therefore important if we are to take more effective conservation action."

The research team, led by the Centre for Environmental and Marine Studies (CESAM) in Lisbon, calculated roadkill rates for 423 bird species and 212 mammal species. They found that small animals with high population densities, and which reach maturity at an early age, were most likely to be killed on roads. Nocturnal mammals and birds with a diet predominantly of plants and seeds were also shown to have higher death rates.

The animals with the highest predicted roadkill rates were the common blackbird (11.94 per km/yr) and the soprano pipistrelle bat (1.76 per km/yr). Roads in Central Europe, such as in Germany, Austria and the Czech Republic, were found to be the most deadly.

The study also used the roadkill surveys to rank the bird and mammal species whose long-term survival was most threatened by roadkill.

The hazel grouse and russet ground squirrel were found to be the most at-risk of local extinction. Both are common in Europe but are classified as species of Least Concern in the International Union for Conservation of Nature's (IUCN) Red List of Threatened Species.

The most vulnerable animals classified as Threatened by the IUCN were the red-knobbed coot, the Balkan mole rat and the Podolian mole rat.

The study revealed that the roadkill hotspots did not correlate with the areas with the highest populations of vulnerable species.

For example, house sparrows had a high roadkill rate projection (2.7 per km/yr) but were ranked 420th of 423 bird species for vulnerability. Conversely, the hazel grouse had a low predicted roadkill rate (0.2 per km/yr) but was most vulnerable of all birds studied.

The areas with the highest concentrations of vulnerable bird species were the Iberian Peninsula, Balkan Peninsula and Eastern European countries. Vulnerable mammals were concentrated in northern Spain, Italy, Austria and the Balkan Peninsula.

Dr Clara Grilo, researcher at CESAM and lead author of the study, said: "We wanted to get the big picture of which species are more roadkilled and also map the regions that can be a threat for wildlife conservation in Europe. We used modeling to estimate roadkill for the unstudied species and also to identify which species are vulnerable to local extinction due to roads.

"From a conservation perspective, we need to go beyond the quantification of roadkill by applying population models to identify which species can be vulnerable to additional loss of individuals, which will provide more accurate information to target road segments that require mitigation."

Members of the public can contribute roadkill reports to improve studies like this one by visiting https://projectsplatter.co.uk/

Top 10 most vulnerable species

BIRDS

1 Bonasa bonasia, Hazel grouse

2 Circaetus gallicus, Short-toed snake eagle

3 Phylloscopus borealis, Arctic warbler

4 Lanius nubicus, Masked shrike

5 Anthus cervinus, Red-throated pipit

6 Loxia leucoptera, Two-barred crossbill

7 Buteo rufinus, Long-legged buzzard

8 Gelochelidon nilotica, Common gull-billed tern

9 Ardeola ralloides, Squacco heron

10 Glaucidium passerinum, Eurasian pygmy-owl

MAMMALS

1 Spermophilus major, Russet ground squirrel

2 Spalax graecus, Balkan mole rat

3 Spalax zemni, Podolian mole rat

4 Spalax microphthalmos, Greater blind mole rat

5 Spalax arenarius, Sandy mole rat

6 Pygeretmus pumilio, Dwarf fat-tailed jerboa

7 Rhinolophus blasii, Blasius' horseshoe bat

8 Rhinolophus euryale, Mediterranean horseshoe bat

9 Sorex alpinus, Alpine shrew

10 Ellobius talpinus, Northern mole vole

Credit: 
University of Reading

Researchers uncover novel approach for treating eczema

Researchers at the University of British Columbia (UBC) and Vancouver Coastal Health Research Institute (VCHRI) have identified a key enzyme that contributes to eczema, which may lead to better treatment to prevent the skin disorder's debilitating effects.

The study was recently published in the Journal of Investigative Dermatology.

Eczema, also known as atopic dermatitis (AD), causes the skin's protective barrier to break down, making it more vulnerable to foreign entities that can cause itching, inflammation, dryness and further degradation of the skin's protective barrier.

"The symptoms people often experience with eczema make them more likely to avoid going outside their homes or to work," says the study's senior author, Dr. David Granville, a professor in UBC's faculty of medicine and researcher at VCHRI. "It is estimated that the annual cost of eczema in North America is over $5.5 billion because of how it impacts people's health and well-being."

The enzyme Granzyme B is positively correlated with itchiness and disease severity in eczema. Researchers found that Granzyme B weakens the skin barrier by cleaving through the proteins holding cells together, making it easier for allergens to penetrate.

"Between cells in our skin are proteins that anchor them tightly together," says Granville. "In some inflammatory diseases, such as eczema, Granzyme B is secreted by cells and eats away at those proteins, causing these bonds to weaken and the skin to become further inflamed and itchy."

Researchers found that by knocking out Granzyme B with genetic modification, or inhibiting it with a topical gel, they could prevent it from damaging the skin barrier and significantly reduce the severity of AD.

"Previous work had suggested that Granzyme B levels correlate with the degree of itchiness and disease severity in patients with atopic dermatitis; however, there was no evidence that this enzyme played any causative role," says Granville. "Our study provides evidence that topical drugs targeting Granzyme B could be used to treat patients with eczema and other forms of dermatitis."

Researchers aim to quell the root cause of eczema symptoms

Approximately 15-20 per cent of Canadians live with some form of AD, and among Canadian children under the age of five, AD affects 10-15 per cent. Of those, around 40 per cent will experience symptoms of the disease for the rest of their lives.

AD is also associated with an increased risk of developing a host of other inflammatory conditions, including food allergies, asthma and allergic rhinitis.

"Atopic dermatitis is the leading non-fatal health burden attributable to skin diseases," says Dr. Chris Turner, the study's lead author and former UBC postdoctoral fellow in Granville's laboratory.

AD typically follows an itch-scratch cycle in which itchiness is followed by scratching and more itchiness. This cycle usually occurs during flare-ups, which can appear anytime, and sometimes weeks, months or years apart.

Corticosteroid creams are a common treatment for individuals with AD who experience more severe itching and rashes. However, these can thin the skin when used over a prolonged period of time, which can make skin more prone to damage and infection.

A gel or cream that stops or limits Granzyme B, thereby reducing the severity of AD, could be a safer and more effective long-term treatment.

"A gel or cream that blocks Granzyme B could have fewer, if any, side-effects and circumvent the itch-scratch cycle, making flare-ups less pronounced," says Turner.

While a commercially available treatment is still some way off, the researchers see great promise in this line of research and are pursuing further clinical trials into Granzyme B and Granzyme B inhibitors.

Credit: 
University of British Columbia

First impressions can sway financial professionals' forecasts of firms for up to 6 years

First impressions can have long-term effects on people's perceptions and behavior. A new study examined first impression bias, the tendency to place undue weight on early experiences, among finance professionals assessing firms' performance. The study found that equity analysts placed greater emphasis on early impressions than later ones, that negative first impressions had more power than positive ones, and that first impression bias could influence a financial professional's forecasts of a firm for up to six years.

The study was conducted by researchers at Carnegie Mellon University (CMU), the University of California, Irvine, and Pennsylvania State University. It appears in Review of Finance.

"Analysts are professionals who have a financial incentive to make accurate forecasts," explains Thomas Ruchti, Assistant Professor of Accounting at CMU's Tepper School of Business, who coauthored the study. "Forecasting future outcomes is inherently subjective, so we sought to determine whether analysts are more optimistic in their forecasts about firms that performed well in the year before the analysts followed the firm and more pessimistic in their forecasts about firms that performed poorly in the year before."

The researchers looked at nearly 1.7 million analyst forecasts between 1984 and 2017 to examine whether analysts' first impressions of a firm induced bias in their forecasting behavior. They obtained information on analysts' quarterly earnings per share forecasts (looking one quarter ahead), price targets, and stock recommendations from the Institutional Brokers' Estimate System database.

As a proxy for analysts' first impressions, they used a firm's stock return after adjusting for industry returns in the year before an analyst's first forecasts. A firm's performance was categorized as positive if it was at or above the 90th percentile in market returns in its industry for the prior year, and negative if it was below the 10th percentile.
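
The categorisation rule just described can be sketched in a few lines. This is a simplified illustration; the function names and sample returns are invented, and the study's actual computation over the I/B/E/S data is more involved:

```python
# Classify a firm's prior-year industry-adjusted stock return as a
# "positive", "negative", or "neutral" first impression, using the
# study's 90th/10th-percentile cutoffs. Sample data are invented.

def percentile(values, p):
    """Nearest-rank percentile (0 < p <= 100) of a list of numbers."""
    s = sorted(values)
    k = max(0, min(len(s) - 1, int(round(p / 100 * len(s))) - 1))
    return s[k]

def first_impression(firm_return, industry_returns):
    """Positive if at/above the 90th percentile, negative if below the 10th."""
    hi = percentile(industry_returns, 90)
    lo = percentile(industry_returns, 10)
    if firm_return >= hi:
        return "positive"
    if firm_return < lo:
        return "negative"
    return "neutral"

# Invented industry-adjusted returns for ten peer firms in one year.
industry = [-0.30, -0.12, -0.05, 0.00, 0.03, 0.07, 0.11, 0.18, 0.25, 0.60]
print(first_impression(0.60, industry))
print(first_impression(-0.40, industry))
print(first_impression(0.05, industry))
```

Most firm-years fall in the wide "neutral" band; only the extreme tails are treated as impression-forming events in the study's design.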

The study found that equity analysts' first impressions of a firm could influence their future forecasting behavior for two years if the impression was positive and up to six years if the impression was negative. Analysts who had abnormally positive first impressions of firms experienced a first impression bias that led to forecasts that were more optimistic than the consensus (consensus forecasts were calculated as the average value of the first forecasts of all analysts following the same firm at the same time).
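
The consensus benchmark described in parentheses can be sketched as a simple average of each analyst's first forecast, with an analyst's optimism measured as the gap above that average. The analyst names and forecast values below are invented for illustration:

```python
# Consensus forecast: the average of the FIRST forecast from each analyst
# following the same firm at the same time. Forecast values are invented.

def consensus(first_forecasts_by_analyst):
    """Average of each analyst's first EPS forecast for the firm."""
    vals = list(first_forecasts_by_analyst.values())
    return sum(vals) / len(vals)

def optimism(analyst, first_forecasts_by_analyst):
    """How far this analyst's first forecast sits above the consensus."""
    return first_forecasts_by_analyst[analyst] - consensus(first_forecasts_by_analyst)

first_eps_forecasts = {"analyst_a": 1.10, "analyst_b": 0.95, "analyst_c": 1.05}
print(round(consensus(first_eps_forecasts), 4))
print(round(optimism("analyst_a", first_eps_forecasts), 4))
```

A positive optimism value for an analyst with a positive first impression of the firm is the pattern the study documents.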

In contrast, analysts who experienced abnormally negative first impressions exhibited a bias that led to forecasts more pessimistic than the consensus. Negative first impressions had roughly double the economic effect on forecasts and price targets compared with positive first impressions of equal or greater magnitude, the study found.

A positive first impression was associated with a 15% higher probability that the analyst would recommend buying a stock, while a negative first impression was associated with a 31% lower likelihood that an analyst would do so. The opposite pattern held for recommendations to sell. The study also found that analysts appeared to place more weight on recent experiences and their earliest experiences, and less weight on intermediate experiences.

"Past studies have looked at the influence of first impression bias in the lab, but our work is the first to do so among finance professionals in the field," says Ruchti. "Our findings have practical implications for brokerage houses: When assigning analysts to follow a particularly successful or unsuccessful firm, brokerages may benefit from designing procedures to compensate for the first impression biases of analysts in generating future forecasts, price targets, and recommendations."

Credit: 
Carnegie Mellon University