
Producing hydrogen from splitting water without splitting hairs

New model explains interactions between small copper clusters used as low-cost catalysts in the production of hydrogen by breaking down water molecules

Copper nanoparticles dispersed in water or applied as coatings have a range of promising applications, including lubrication, inkjet printing, luminescent probes, antimicrobial and antifungal agents, and fuel cells. Another promising application is using copper as a catalyst to split water molecules and form hydrogen gas. In a recent paper published in EPJ D, Stefan Raggl, from the University of Innsbruck, Austria, and colleagues synthesise the copper-water complexes at the heart of this reaction in ultra-cold helium nanodroplets, as part of the hydrogen production process. For its authors, splitting water like this is a good way of avoiding splitting hairs.

Previous work showed that, at the molecular level, water oxidises copper nanoparticles until their surface is saturated with hydrogen-carrying molecules (hydroxyl groups). Theoretical work further showed that a monolayer of water, once adsorbed on the copper particles, spontaneously converts to half a monolayer of hydroxide (OH) plus half a monolayer of water while releasing hydrogen gas.
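On that theoretical picture, the surface conversion can be written schematically as a mass-balanced reaction per pair of adsorbed water molecules (this notation is ours, not taken verbatim from the paper):

```latex
% Adsorbed water monolayer converting to half-monolayer OH plus
% half-monolayer water, releasing hydrogen gas (schematic):
2\,\mathrm{H_2O\,(ads)} \;\longrightarrow\; \mathrm{OH\,(ads)} + \mathrm{H_2O\,(ads)} + \tfrac{1}{2}\,\mathrm{H_2\,(g)}
```

Counting atoms on each side (four hydrogens, two oxygens) confirms that half of the adsorbed water can survive as water while the other half converts to hydroxide, with the excess hydrogen leaving as gas.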

In their study, Raggl and colleagues synthesised neutral copper-water complexes by successively doping helium nanodroplets -- which are kept at the ultra-cold temperature of 0.37 K, in a state referred to as superfluid -- with copper atoms and water molecules. These droplets are then ionised by electrons. The authors show that the composition of the most prominent ions depends on the partial pressures of copper and water in the cell where the reaction occurs. They observe ions containing several copper atoms and several dozen water molecules.

The authors note that they could not directly observe the predicted hydrogen formation, because their instrument is not designed to detect electrically neutral species.

Credit: 
Springer

Scientists have developed an effective marker for cancer diagnosis and therapy

image: A lysosome.

Image: 
© NUST MISIS

A research group consisting of scientists from NUST MISIS, the Technical University of Munich, Helmholtz Zentrum München, the University of Duisburg-Essen, and the University of Oldenburg has developed a system that allows doctors to both improve the accuracy of diagnosing malignant cells and to provide additional opportunities for cancer treatment. The magnetoferritin compound is the main element of this new system. The research article has been published in Advanced Functional Materials.

The lack of accuracy ("contrast") in imaging is a common problem of non-invasive diagnosis. "Contrast agents", compounds that are introduced into the body before a diagnosis procedure to enhance the response and make affected cells more visible on a tomograph, can be used to solve this problem in magnetic resonance imaging (MRI). Paramagnetic gadolinium particles and superparamagnetic iron particles are among these agents. However, even in small quantities, these substances - alien to the human body - can potentially be dangerous.

"The international research team, including Dr. Ulf Wiedwald, a visiting professor at the NUST MISIS Biomedical Nanomaterials Laboratory, has developed a unique injectable diagnostic system based on magnetoferritin. The developed system will significantly improve the quality of MRI and optical diagnosis," said Alevtina Chernikova, Rector of NUST MISIS.

Magnetoferritin is a compound consisting of an endogenous human protein (ferritin) and a magnetic core. The compound was developed and tested following the existing protocol for magnetoferritin synthesis, improved here for effective capture by tumor cells. Because magnetoferritin accumulates at high concentration in tumor tissue, the result is a hypoallergenic contrast agent that is well tolerated by the human body.

"An intravenous injection of magnetoferritin has been proposed. Spreading with the blood flow, [the magnetoferritin] will then be captured by the targeted tumor cells. As a large number of studies have shown, these cells actively capture transferrin, the protein responsible for iron transport in the blood. The same receptors are capable of capturing magnetoferritin as well. Once it gets into the lysosomes of targeted cells, the magnetoferritin will further enhance the contrast signal," commented Dr. Wiedwald.

The system will also allow doctors to conduct therapy on tumor formations. If cancerous cells are identified, they can be targeted by an electromagnetic field or light, which will lead to their heating and subsequent death.

Credit: 
National University of Science and Technology MISIS

Researchers test autobiographical memory for early Alzheimer's detection

Testing how well people remember past events in their lives could help medical professionals make early predictions about who is at risk for developing Alzheimer's disease, according to a new study from the University of Arizona.

Researchers administered an "autobiographical memory" test to a group of 35 healthy adults, about half of whom carry the gene variant APOE e4 -- a known genetic risk factor that nearly doubles the chances of developing Alzheimer's disease. As a group, those with the genetic risk described memories with much less detail than those without it.

Sometimes called a disease with a clinically silent beginning, Alzheimer's is difficult to detect early even though changes in the brain related to the disease may begin to happen years or even decades before an individual starts to exhibit memory difficulties, said UA neuropsychologist Matthew Grilli, lead author of the new research, which is published in the Journal of the International Neuropsychological Society.

"This raises a huge challenge for developing effective treatments," said Grilli, an assistant professor and director of the Human Memory Laboratory in the UA Department of Psychology. "The hope is that in the near future we will have drugs and other treatments that could potentially slow down, stop and even reverse some of these brain changes that we think are the hallmarks of Alzheimer's disease. The problem is that if we can't detect who has these hallmarks early enough, these treatments may not be fully effective, if at all."

Grilli's goal is to help pick up on brain changes much earlier, before they begin to have an obvious effect on cognition and memory.

He and his UA colleagues Aubrey Wank, John Bercel and Lee Ryan decided to focus on autobiographical memory, or people's recollection of past events in their lives, because this type of memory depends on areas of the brain that are vulnerable to early changes from Alzheimer's disease.

"When we retrieve these complex types of memories that have multimodal details, they're highly vivid or rich; they come with narratives, context and backstories," Grilli said. "We've learned through cognitive neuroscience that the ability to recreate these memories in your mind's eye depends on a widely distributed network in the brain, and it critically depends on regions of the brain that we know are compromised early on in Alzheimer's disease pathology."

In autobiographical interviews, study participants, who ranged in age from their early 50s to 80, were asked to recall recent memories, memories from their childhood and memories from early adulthood with as much detail as possible. The interviewers -- who did not know which participants had a genetic risk factor for Alzheimer's -- recorded and scored participants' responses, evaluating which details added to the richness and vividness of the memories and which did not.

Those with the genetic risk factor for Alzheimer's disease, as a group, described memories with much less vivid detail than those without the risk factor, despite the fact that all study participants performed normally and comparably on a battery of other, standard neuropsychology tests.

"None of these individuals would be diagnosed with dementia or mild cognitive impairment," Grilli said. "They are clinically normal, they are cognitively normal, but there's this subtle difficulty one group has with retrieving real-world memories, which we think is because there are more people in the group who are at a preclinical stage of Alzheimer's disease."

Not everyone with the gene variant APOE e4, which is present in about 25 percent of the population, will develop Alzheimer's disease, and not everyone who develops Alzheimer's has the gene.

"From this study, we can't identify one person and say for sure this person is in the preclinical phase of Alzheimer's disease. That's the next stage of work that we need to do," Grilli said. "But we know that as a group there probably are more people in the e4 carrier group that are in the preclinical phase of Alzheimer's disease, and we think this is why they had a harder time generating these memories."

Grilli said the next step is to study brain activity in the people who struggle to generate vivid autobiographical memories to see if they have observable changes in brain structure or activation of the regions of the brain affected early on by Alzheimer's.

The hope is the work could lead to the development of a clinical test sensitive enough to the preclinical brain changes of Alzheimer's disease that could be used to identify people who should undergo more extensive testing for early Alzheimer's disease pathology.

"The tests for early signs of Alzheimer's disease pathology are invasive and expensive, so this new cognitive test potentially could be used as a screen," Grilli said. "It also could be used to help clinical trials. At the moment, it's very difficult and expensive to conduct clinical trials of new drugs because it takes a very long time to determine whether that drug has had an impact on memory. If we have more sensitive measures, we might get answers sooner, especially if we're trying to administer drugs before obvious signs of memory impairment are detectable."

Credit: 
University of Arizona

New online tool for clinicians could predict long-term risk of breast cancer returning

A new, simple web-based calculator that could better predict the long-term risk of breast cancer returning in other areas of the body has today been published online by researchers at The Royal Marsden NHS Foundation Trust and Queen Mary University of London.

The prognostic tool - CTS5 (Clinical Treatment Score post-5-years) - published recently in the Journal of Clinical Oncology, could be used to decide which patients are at high enough risk of their cancer returning after receiving the standard five years of endocrine (hormone) therapy, and so could benefit from continuation of treatment. It could also predict which patients are at low risk of recurrence, and so can avoid any further therapy along with the potential adverse side effects.

Over the last three decades, there has been a major increase in the rate of invasive breast cancer in Western countries. Approximately 85 per cent of patients are now diagnosed as oestrogen receptor (ER) positive, which means that the cancer grows in response to the hormone oestrogen. Almost all of these patients are prescribed five years of hormone therapy after having standard treatment (surgery, chemotherapy, and/or radiation therapy), to lower the risk of the cancer returning.

However, hormone therapy can have significant side effects for some patients, including weakness of bone tissue, and exacerbation of menopausal symptoms. Oncologists along with patients have to decide after five years of hormone treatment whether extending this type of therapy is worthwhile and appropriate.

Professor Mitch Dowsett, Head of The Royal Marsden Ralph Lauren Centre for Breast Cancer Research and Professor of Biochemical Endocrinology at The Institute of Cancer Research (ICR), and Professor Jack Cuzick and Dr Ivana Sestak from Queen Mary University of London developed CTS5 after reviewing data from two previously published studies. Together these provided information on 11,446 postmenopausal women with ER positive breast cancer who had received five years of hormone therapy (tamoxifen, anastrozole, or letrozole).

Using the data set from one previously published study, they measured how many women developed metastasis five to 10 years after they finished endocrine therapy. This was then combined with information about the tumour, which had been measured at the point of diagnosis, to produce a risk equation - CTS5.

To investigate the validity of the tool, the researchers then tested CTS5 against data from the second study. CTS5 was shown to accurately separate women into groups at low, intermediate, or high risk of developing a late distant breast cancer recurrence after five years of hormone therapy. The test identified 42 per cent of women as being at sufficiently low risk that extending hormone therapy would have been of very little value.

Co-lead researcher Professor Mitch Dowsett, Head of The Royal Marsden Ralph Lauren Centre for Breast Cancer Research and Professor of Biochemical Endocrinology at The Institute of Cancer Research, London, said: "What we have developed could improve clinical practice, benefiting breast cancer patients by avoiding potentially unnecessary extended treatment. Clinicians require expertise and the best tools to help them make crucial decisions on treatment for patients, decisions that can make a difference to patients' quality of life."

Queen Mary University of London researchers led on the development of the web-based CTS5 calculator, which has been designed with clinicians in mind. After inputting patient details, including age, tumour size and tumour grade, it gives an estimated 5-10 year risk of the cancer returning in another part of the woman's body, with an estimated benefit from extending their hormone therapy.
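To illustrate the general shape of a tool like this, the sketch below combines routine clinical variables into a single linear risk score and maps it onto low/intermediate/high bands. The coefficients, cut-offs and function names here are illustrative placeholders of our own, not the published CTS5 equation; the validated coefficients are in the Journal of Clinical Oncology paper.

```python
# Hypothetical sketch of a CTS5-style prognostic score.
# All weights and cut-offs below are ILLUSTRATIVE placeholders,
# not the published CTS5 values.

def cts5_style_score(age, tumour_size_mm, grade, positive_nodes):
    """Combine routinely collected clinical variables into one linear score."""
    return (0.02 * age              # older age -> slightly higher risk (placeholder weight)
            + 0.09 * tumour_size_mm  # larger tumour -> higher risk
            + 0.35 * grade           # histological grade 1-3
            + 0.45 * positive_nodes) # involved lymph nodes

def risk_group(score, low_cutoff=3.13, high_cutoff=3.86):
    # The real tool derives its bands from distant-recurrence rates
    # in the training cohort; these cut-offs are invented for illustration.
    if score < low_cutoff:
        return "low"
    elif score < high_cutoff:
        return "intermediate"
    return "high"

print(risk_group(cts5_style_score(age=62, tumour_size_mm=14, grade=2, positive_nodes=0)))
```

The point of such a linear score is that it needs no new tests: every input is already recorded at diagnosis, which is why the authors argue the tool could be used "across the UK and globally".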

Co-lead researcher Professor Jack Cuzick from Queen Mary University of London said: "Hormone sensitive breast cancer is one of the few cancers where late recurrence is common, and predicting who is at high risk is particularly important so that they can continue hormone treatment. While our ability to predict this type of cancer is highly likely to improve in the future, we're providing a simple tool which is available now, and is easily used and well tested."

Dr Ivana Sestak from Queen Mary University of London said: "Over 50 per cent of women who have finished hormonal treatment for their breast cancer are at increased risk of developing a late metastasis. But there are no web-based calculators for predicting which women are at highest risk.

"Our tool provides a very simple way of obtaining the risk of a late metastasis for each woman individually. It is very important to identify these women in the clinic and the calculator provides help in the decision-making process."

Professor Dowsett adds: "This tool uses information that is already gathered in all patients, so could be easily used across the UK and globally at other centres."

The authors note that the data used came from previously published studies that began over 20 years ago in patients with hormone receptor positive tumours (which make up about 85 per cent of breast cancers). The only major change to the management of this patient population since then has been the introduction of trastuzumab for some patients, so clinicians treating patients who receive trastuzumab should be cautious when using CTS5.

Credit: 
The Royal Marsden NHS Foundation Trust

Algal blooms a threat to small lakes and ponds, too

COLUMBUS, Ohio - Harmful algae aren't just a problem for high-profile bodies of water - they pose serious, toxic threats in small ponds and lakes as well, new research has found.

A team of researchers from The Ohio State University examined water samples from two dozen ponds and small lakes in rural Ohio and found plenty of cause for concern, with particularly high levels of toxins at one lake.

Toxins from algae can cause skin rashes, intestinal problems and damage to the liver and nervous system. Fertilizers common in agriculture - including nitrogen and phosphorus - create an environment in which harmful algae can flourish.

The researchers said that the way farmers manage runoff could play a significant role in creating water bodies that are ripe for harmful algal blooms. A primary concern is tile drainage, a widely used agricultural approach to removing excess water from the soil below the surface. That water - and the nutrients found in it - is rerouted, often toward ponds on farm property, said study co-author Seungjun Lee, a postdoctoral researcher in environmental health sciences at Ohio State.

"A lot of people and government agencies are paying attention to larger lakes, including Lake Erie, but these smaller bodies of water are also used for recreation, fishing and irrigation," he said.

The study was published recently in the journal Environmental Science & Technology.

The research team, led by Ohio State's Jiyoung Lee, analyzed samples from the 24 bodies of water over a three-month period in late summer 2015.

Ten of the sites had detectable levels of microcystins, toxins produced by freshwater cyanobacteria during algal blooms.

One site had repeated instances of microcystin concentrations above recreational guidelines set by the Ohio Environmental Protection Agency, and so the research team paid particular attention to the samples from that site.

"Samples from this lake in early July were particularly concerning, as they contained four times the recommended amount of microcystin for recreational use and more than 800 times the recommended level for drinking," Seungjun Lee said.

A pond or lake with high toxin levels presents a risk to people, pets, farm animals, wildlife (including fish) and crops and could benefit from routine monitoring and work to lower the risk of algal blooms, he said. The researchers did not name the lake in question, because it is privately owned.

Jiyoung Lee said the impact of tile drainage may be elevated in small lakes and ponds, compared to larger lakes.

"Highly concentrated nutrients are being introduced into a smaller volume of water, making small lakes and ponds more sensitive to this influx of phosphorus, nitrogen and other nutrients," she said.

Nitrate and phosphorus are linked to the primary type of toxic microcystin found in the water. Judicious use of fertilizers could help control the algal blooms, as could measures to reduce animal waste contamination of ponds and lakes, Seungjun Lee said.

Though the study concentrated on Ohio agricultural areas, its findings likely apply to many areas throughout the U.S. and the world where agriculture and small lakes and ponds coexist, the researchers said.

Credit: 
Ohio State University

Bahama Nuthatch feared extinct rediscovered in the Bahamas

image: One of the rarest birds in the western hemisphere, the Bahama Nuthatch, has been rediscovered by research teams searching the island of Grand Bahama.

Image: 
Matthew Gardner, University of East Anglia

One of the rarest birds in the western hemisphere, the Bahama Nuthatch, has been rediscovered by research teams searching the island of Grand Bahama.

The finding is particularly significant because the species had been feared extinct following the catastrophic damage caused by Hurricane Matthew in 2016, and had not been found in subsequent searches.

But it is feared that there could only be two left - placing the species on the verge of extinction and certainly among the world's most critically endangered birds.

The Bahama Nuthatch is an endangered species, only known from a small area of native pine forest on Grand Bahama Island, which lies approximately 100 miles off Palm Beach, Florida.

University of East Anglia master's students Matthew Gardner and David Pereira set out on a three-month expedition to find this and other endemic Caribbean pine forest bird species.

They made their way through dense forest with thick 'poisonwood' understorey - the layer of vegetation growing beneath the main forest canopy - in what is thought to be one of the most exhaustive searches of the island.

They worked in partnership with Nigel Collar and David Wege from Birdlife International and the Bahamas National Trust, the organisation which works to protect the habitats and species of The Bahama Islands.

Meanwhile a second team of Bahamian students, led by Zeko McKenzie of the University of The Bahamas-North and supported by the American Bird Conservancy, also searched for the bird.

The Bahama Nuthatch has a long bill, a distinctive high-pitched squeaky call, and nests only in mature pine trees. Its population crashed from an estimated 1,800 birds in 2004 to just 23 seen in a 2007 survey. The decline likely began in the 1950s with habitat loss from timber removal, and continued more recently with hurricane damage, as storm surges killed large areas of native forest.

Both teams made Nuthatch sightings in May, and the UEA team were lucky enough to capture the elusive bird on film.

Dr Diana Bell, from UEA's School of Biological Sciences, said: "The Bahama Nuthatch is a critically endangered species, threatened by habitat destruction and degradation, invasive species, tourist developments, fires and hurricane damage.

"Our researchers looked for the bird across 464 survey points in 34,000 hectares of pine forest. It must have been like looking for a needle in a haystack. They played a recording of the bird's distinctive call in order to attract it.

"As well as searching for the elusive bird, they also collected environmental data to better understand its habitat preferences and surveyed the extent of hurricane and fire damage," she added.

Matthew Gardner said: "We were the first to undertake such an exhaustive search through 700km of forest on foot.

"We had been scouring the forest for about six weeks, and had almost lost hope. At that point we'd walked about 400km. Then, I suddenly heard its distinctive call and saw the unmistakable shape of a Nuthatch descending towards me. I shouted with joy, I was ecstatic!"

The UEA team made six Nuthatch sightings in total, and McKenzie's team independently made five sightings, using different methods, in the same small area of forest - including a sighting of what they believe to be two birds together.

Mr Gardner said: "During three months of intensive searching we made six Bahama Nuthatch sightings. Our search was extremely thorough but we never saw two birds together, so we had thought there might only be one left in existence."

"The other team have reported seeing two together so that is promising. However, these findings place the species on the verge of extinction and certainly amongst the world's most critically endangered birds."

"We also don't know the sex of the birds. In many cases when birds dwindle to such small numbers, any remaining birds are usually male."

"The photographs clearly show this distinctive species and cannot be anything else," said Michael Parr, President of American Bird Conservancy and a UEA alumnus.

"Fortunately this is not a hard bird to identify, but it was certainly a hard bird to find," he added.

The Nuthatch was spotted in a small area known as Lucaya Estates. During the research project, birds were seen and heard in three distinct but nearby locations within this area.

Researcher Zeko McKenzie said: "Although the Bahama Nuthatch has declined precipitously, we are encouraged by the engagement of conservation scientists who are now looking for ways to save and recover the species."

The UEA team however are less optimistic as the exact drivers of the precipitous decline of the bird are still unclear.

Dr Diana Bell said: "Sadly, we think that the chances of bringing this bird back from the brink of extinction are very slim - due to the very low numbers left, and because we are not sure of the precise drivers for its decline.

"But it is still absolutely crucial that conservation efforts in the native Caribbean pine forest do not lapse as it is such an important habitat for other endemic birds including the Bahama Swallow, Bahama Warbler and Bahama Yellowthroat.

"The habitat is also incredibly important for North American migrants including the Kirtland's Warbler," she added.

Ellsworth Weir, Grand Bahama Parks Manager at the Bahamas National Trust, said: "It has been a pleasure for The Bahamas National Trust to host both Matthew and David as they conducted this very important research on Grand Bahama."

"Their work has taken them across the length and breadth of the island in what was likely the most in depth search to be conducted. Their research, which was inclusive of bird and habitat surveys, has helped to answer questions that some residents have been asking for some time."

"Sadly, we realize now that we are faced with a very dire situation regarding the Bahama Nuthatch. We wouldn't have realized the extent of the issue without the persistent efforts of David and Matthew."

Credit: 
University of East Anglia

Fish lice could be early indicators of metal pollution in freshwater

image: Fish lice could be early indicators of metal pollution in fresh water, say researchers. An Argulus japonicus fish louse is shown in this image. Water quality in rivers and dams is deteriorating all over the world, and metal pollution is a major reason, while water resources are very limited. University of Johannesburg scientists Prof Annemariè Avenant-Oldewage and Dr Beric Gilbert published their research in PLoS One.

Image: 
Prof Annemariè Avenant-Oldewage, Department of Zoology, University of Johannesburg.

Everyone needs safe and clean water to drink. Yet industry, agriculture and urban activities threaten fresh water. In particular, metal pollution can be very hard to detect early. Because of this, scientists are always searching for sensitive indicators of water quality. Now, a fish louse shows great promise as an early indicator for monitoring pollution in rivers and dams.

Living creatures tell a more complete story

Water samples only tell the story of a river for a moment in time. So researchers studied fish, because fish accumulate pollutants such as metals over time. But it can be difficult to get a complete story from fish as well, says Prof Annemariè Avenant-Oldewage. She heads the Department of Zoology at the University of Johannesburg.

"Fish have mechanisms to protect themselves. They can reduce the toxic effects from metal pollution inside their bodies. They move the metals they accumulate to organs or other body parts where it is less harmful to them. Because of this, we cannot detect very low levels of metals by analysing fish.

"Also, if the fish have parasites, the parasites can accumulate the metals better than the fish. Tapeworms are an example of such internal parasites.

"In a way, the parasites absorb the metals from the fish. The parasites can then end up with metals in much higher concentrations than those in the host. For some internal parasites, levels of metals have been found to be up to 2 500 times higher than in the host," says Prof Avenant-Oldewage.

"This means we can measure metals in them, long before it is possible to do that in fish or in water samples. So parasites can give us early warnings of pollution."

Early warnings without harming fish

In follow-up research, Prof Avenant-Oldewage and her team studied tapeworms, which live inside the intestines of fish. But tapeworms are not ideal indicators: the host fish has to be killed to analyse for accumulated pollutants.

Added to that, the researchers found that tapeworms also have a way to get rid of metals. An egg-bearing tapeworm can move metal pollutants in its body into the eggshells it is about to release.

As an alternative, the researchers then considered external fish parasites. If these worked as indicators, no fish would need to be killed.

Picky eater of threatened fish

Next, Prof Avenant-Oldewage's team studied an external parasite called Paradiplozoon. The parasite lives on the gills of fish.

"Like most parasites, Paradiplozoon are picky eaters. They will only live on two species of yellowfish. Those yellowfish are only found in the Vaal River. So they would not be versatile indicators for water quality.

"Yellowfish is prized as a fighting fish for angling competitions. But they are physiologically sensitive creatures. They go into shock if someone removes parasites from their gills."

Bloodsucking swimmer

This is where the fish louse, Argulus japonicus, enters the picture as a possible early indicator.

Argulus japonicus lives in many kinds of freshwater and marine environments. It is a crustacean, a cousin of shrimp. It lives on the skin of many species of fish, but is also able to swim in search of a host. Because it infects the skin of its host, researchers can remove it without injuring the fish. All these abilities make it a versatile option.

In their latest study, the researchers analysed Argulus lice from the Vaal River. They wanted to see what fish lice do with the metals they accumulate.

Fluorescing metals

Dr Beric Gilbert caught mudfish and yellowfish in the Vaal Dam, close to Deneysville.

Then he removed Argulus lice from the fish.

He froze the parasites, applied stains, and examined them under a fluorescence microscope. He could then see areas in male and female lice that had higher concentrations of metals.

"Most of the metals were in the hard outer layer of the lice, also called the exoskeleton. There wasn't much difference in the amount of metals absorbed by male and female Argulus," says Dr Gilbert.

The more intense the fluorescent signal, or glow, produced by the microscope, the higher the amount of metals accumulated in those areas of the lice.

"Male lice seemed to concentrate more metals in the exoskeleton covering the underside of their bodies. This was visible as a brighter yellow signal, or intense glow, when studying the parasites with the microscope.

"But in egg-bearing females, a layer of jelly around the eggs produced a positive signal, indicating the presence of metals. The female uses the jelly to secure the eggs to surfaces in the environment, when she lays them," he says.

Next hoops to jump

Argulus fish lice do not qualify as good freshwater indicators yet, says Prof Avenant-Oldewage.

"Our next step is to find out what mechanisms the lice use to protect themselves from metals. We also need to find out how they absorb metals in the first place," she says.

"If Argulus japonicus fish lice succeed, they could become sensitive, living metal indicators in the future. That way, we could detect metal pollution long before fish are affected. There could still be time to do something about it."

Credit: 
University of Johannesburg

The molecular link between aging and neurodegeneration

For decades researchers have worked to shed light on the causes of neurodegenerative disorders, a group of devastating conditions, including Alzheimer's and Parkinson's, that involve the progressive loss of neurons and nervous system function. In recent years, numerous factors, from genetic mutations to viral infections, have been found to contribute to the development of these diseases.

Yet age remains the primary risk factor for almost all neurodegenerative disorders. A precise understanding of the links between aging and neurodegeneration has remained elusive, but research from Harvard Medical School now provides new clues.

In a study published in Cell on Aug. 23, the research team describes the discovery of a molecular link between aging and a major genetic cause of both amyotrophic lateral sclerosis (ALS) and frontotemporal dementia (FTD), two related neurodegenerative diseases with shared genetic risk factors.

The findings, the researchers said, reveal possible new targets for treatment of these and other neurodegenerative diseases.

"Our study provides the first description of a molecular event that connects aging with neurodegeneration," said senior study author Junying Yuan, the HMS Elizabeth D. Hay Professor of Cell Biology. "These insights are a critical step towards understanding the mechanisms by which aging predisposes individuals to neurodegeneration."

The results also highlight the need for a better understanding of the biology of neurodegenerative diseases in the context of aging.

"Laboratory models of neurodegenerative diseases often have a missing element, and that is the contribution of age," Yuan said. "We have to understand better the process in its totality, not just its isolated components, to better guide clinical trials and improve our chances of finding effective treatments for these devastating diseases."

RIP rescue

Also known as Lou Gehrig's disease, ALS is a progressive, incurable condition marked by the gradual death of motor neurons. It shares some clinical and genetic features with FTD, which is marked by an early and rapid onset of dementia.

Around one in 10 patients with these diseases carry genetic mutations that cause the partial dysfunction of a protein known as TBK1. Previous studies, including by Yuan and colleagues, have shown that TBK1 is involved in a form of programmed cell death and in neuroinflammation, a hallmark of neurodegenerative disorders. How TBK1 contributes to the development of ALS and FTD, however, was unclear.

In the current study, Yuan and colleagues modeled the reduced levels of TBK1 found in ALS and FTD patients by creating mice that had only one functional copy of the gene that produces TBK1. These mice were healthy and similar in appearance to normal mice. In contrast, those that lacked the gene entirely died before birth.

However, the team found that mice without TBK1 could be fully rescued--surviving birth and becoming healthy adults--by blocking the activity of RIPK1, another protein known to play a central role in programmed cell death, neuroinflammation and neurodegenerative disease. Further analyses revealed that TBK1 normally functions to inhibit the activity of RIPK1 during embryonic development.

This discovery prompted the researchers to investigate another protein, called TAK1, previously known to also inhibit RIPK1 function. When they looked at data on TAK1 expression in human brains, the scientists found that TAK1 expression declines significantly with age. In the brains of patients with ALS, TAK1 expression was further reduced compared with the brains of similarly aged people without ALS.

Brake it down

To model the interaction between partial loss of TBK1 and TAK1 with aging, the team created mice that expressed half the usual amount of TBK1. The mice also expressed half the usual amount of TAK1 in their microglia, the immune cells of the brain where TAK1 is normally most active.

With reductions in both TBK1 and TAK1, these mice displayed traits associated with ALS and FTD, including motor deficits, hind limb weakness, anxiety-like behavior in new environments and changes in brain chemistry. The mice had a reduction in the number of neurons in the brain, and increased motor neuron dysfunction and cell death.

When the team inhibited RIPK1 activity independently of TBK1 and TAK1, they observed a reversal in symptoms.

Like a pair of brakes on a bicycle, TAK1 and TBK1 appear to work together to suppress the activity of RIPK1, and even if one brake fails, the other can compensate, the researchers said. But if both begin to fail, RIPK1 activity increases, leading to cell death and neuroinflammation.

This may be why individuals with TBK1 mutations do not develop ALS and FTD until they become older, when TAK1 levels decline with age, Yuan said.
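The redundancy the researchers describe - two independent "brakes" on RIPK1, with cell death unleashed only when both fail - can be captured in a toy truth-table sketch. This is purely illustrative of the logic of the model, not code from the study:

```python
def ripk1_active(tbk1_functional: bool, tak1_functional: bool) -> bool:
    """Toy model of the proposed mechanism: RIPK1 stays suppressed as long
    as at least one of its two inhibitors (TBK1 or TAK1) still works."""
    return not (tbk1_functional or tak1_functional)

# A TBK1-mutation carrier in youth: TAK1 still compensates, RIPK1 stays off.
young_carrier = ripk1_active(tbk1_functional=False, tak1_functional=True)

# The same carrier in old age, after TAK1 expression has declined:
# both brakes have failed, RIPK1 becomes active.
aged_carrier = ripk1_active(tbk1_functional=False, tak1_functional=False)
```

In this sketch `young_carrier` is `False` and `aged_carrier` is `True`, mirroring why, in the proposed model, disease onset waits for the age-related loss of the second brake.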

Several clinical trials are underway to test the safety and efficacy of drugs that block RIPK1 activity in neurodegenerative and chronic inflammatory diseases, and these findings support the rationale for those trials, Yuan added.

"I think the next couple of years will reveal whether RIPK1 inhibitors can help ALS and FTD patients," she said. "I think our study makes us more confident those efforts might work."

Despite numerous large-scale clinical trials, however, no effective therapeutics have yet been developed for neurodegenerative diseases. This research now establishes a model of study that incorporates both aging and genetic risk for ALS and FTD, which may have broad implications.

"Many trials have been launched based on data from studies in mice, but how can a two-year-old mouse, for example, fully reflect what happens in an 80-year-old patient with Alzheimer's?" Yuan said. "We need to develop new thinking on how to model these diseases to incorporate the element of aging, and we think this study is an important step toward that goal."

The scientists are currently investigating why TAK1 levels decline with age and its potential role in other neurodegenerative diseases.

Credit: 
Harvard Medical School

Smuggling: Advocacy groups claim it's 'one of the tobacco industry's greatest scams'

Two new studies from the Tobacco Control Research Group at the University of Bath, published in the BMJ journal Tobacco Control, present evidence that big tobacco companies are still facilitating tobacco smuggling, while attempting to control a global system designed to prevent it and funding studies that routinely overestimate levels of tobacco smuggling.

The findings come off the back of a major announcement last week from Bloomberg Philanthropies which makes Bath's Tobacco Control Research Group one of the leaders of an all-new $20 million global tobacco industry watchdog aiming to counter the negative influences of the tobacco industry on public health. The global partnership aims in particular to highlight tobacco industry activity across low and middle income countries.

The first study, which draws on leaked documents, highlights the elaborate lengths the industry has gone to in order to control a global track and trace system and to undermine a major international agreement - the Illicit Trade Protocol - designed to stop the tobacco industry from smuggling tobacco.

A linked blog and editorial in the BMJ help explain the findings.

In 2012, off the back of a string of inquiries, court cases and fines all aiming to hold the major tobacco companies to account for their involvement in global tobacco smuggling operations designed to avoid paying taxes, governments around the world adopted the Illicit Trade Protocol (ITP). Part of a global treaty, the Framework Convention on Tobacco Control, the ITP aims to root out tobacco industry smuggling through an effective track and trace system - one in which tobacco packs are marked so they can be tracked through their distribution route and, if found on the illicit market, traced back to see where they originated.
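As a rough illustration of the track and trace idea described above - and emphatically not a sketch of Codentify or of any real ITP implementation - the core data model is simple: a unique code applied at production, a scan log along the distribution route, and a trace-back from any seized pack. All names below are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class Pack:
    code: str      # unique marking applied at the point of production
    origin: str    # factory or production line that made the pack
    scans: list = field(default_factory=list)  # (location, event) history

class TrackAndTrace:
    """Minimal registry: mark packs at production, log scans in transit,
    and trace a pack found on the illicit market back to its origin."""

    def __init__(self):
        self.packs = {}

    def mark(self, code, origin):
        self.packs[code] = Pack(code, origin)

    def scan(self, code, location, event):
        self.packs[code].scans.append((location, event))

    def trace_back(self, code):
        pack = self.packs[code]
        return pack.origin, pack.scans

registry = TrackAndTrace()
registry.mark("PK-0001", "Factory A")
registry.scan("PK-0001", "Distributor X", "dispatched")
registry.scan("PK-0001", "Retailer Y", "received")
origin, route = registry.trace_back("PK-0001")  # origin is "Factory A"
```

The ITP's central concern, as the study makes clear, is who operates such a registry: a system run by, or traceable to, the manufacturers themselves cannot credibly audit their own supply chain.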

The new study argues that at this point, fearful of these developments, the tobacco industry claimed to have changed: no longer the perpetrator of smuggling, but instead the victim of smuggled and counterfeited tobacco.

Simultaneously, the major tobacco companies developed their own track and trace system, 'Codentify', and lobbied governments around the world to see it adopted as the global track and trace system of choice. Leaked documents show that the four major transnational tobacco companies hatched a joint plan to use front groups and third parties to promote 'Codentify' to governments and convince them it was independent of industry, and how these plans were operationalised. For example, the study reveals how a supposedly independent company fronted for British American Tobacco (BAT) in a tender for a track and trace system in Kenya.

Yet, the researchers reveal, growing evidence suggests the tobacco industry is still facilitating tobacco smuggling: approximately two thirds of smuggled cigarettes may still derive from industry. At best, the authors suggest, this shows the tobacco industry's failure to control its supply chain, but they point to growing evidence from government investigations, whistleblowers and leaked tobacco industry documents all suggesting ongoing industry involvement.

The study suggests that in order to bolster support for their system and enhance their credibility, Big Tobacco created front groups, poured funding into organisations meant to hold it to account and into initiatives that would curry favour, and paid for misleading data and reports.

Professor Anna Gilmore, Director of the Tobacco Control Research Group, explains: "This has to be one of the tobacco industry's greatest scams: not only is it still involved in tobacco smuggling, but big tobacco is positioning itself to control the very system governments around the world have designed to stop it from doing so. The industry's elaborate and underhand effort involves front groups, third parties, fake news and payments to the regulatory authorities meant to hold them to account."

The second study, published today, examines the quality of the data and reports on illicit tobacco that the tobacco industry has funded and raises further concerns about the tobacco industry's conduct. It finds that industry-funded data routinely overestimate levels of tobacco smuggling.

This is the first study to systematically identify and review literature assessing industry-funded data on the illicit tobacco trade, and it identifies widespread concerns with the quality, accuracy and transparency of tobacco industry-funded research. Industry-funded data were criticised for a fundamental lack of transparency at every stage of the research process, from sampling and data collection, through analysis, to publication of findings.

The authors posit that the consistency with which issues have been identified suggests that the tobacco industry may be intentionally producing misleading data on the topic.

The authors suggest that despite overwhelming evidence of historical complicity in tobacco smuggling and their latest evidence suggesting that tobacco companies are continuing to fuel the illicit trade, the industry now portrays itself as key to solving the problem, presenting its funding of research as an example of its attempts to reduce illicit trade.

Lead author Allen Gallagher from the Tobacco Control Research Group at Bath explains: "Our latest findings fit with the tobacco industry's long history of manipulating research, including its extensive efforts to undermine and cause confusion on science showing the negative health impacts of smoking and second-hand smoke."

Second author Dr Karen Evans-Reeves added: "Despite far-reaching concerns over industry-funded data on this topic, tobacco companies continue to spend millions of pounds funding research into the illicit tobacco trade. As recently as 2016 Philip Morris International's PMI IMPACT initiative pledged 100 million USD for this purpose. Yet, if industry-funded data consistently fails to reach the expected standards of replicable academic research, we must question if it has any use beyond helping the industry muddy the waters on an important public health issue."

The team is now calling on governments and international bodies to crack down on Big Tobacco's tactics, to ensure that systems designed to control tobacco smuggling are truly independent of the industry and its front organisations, and that research on tobacco smuggling is free of industry interests.

Professor Gilmore, senior author on both papers states: "Governments, tax and customs authorities around the world appear to have been hoodwinked by industry data and tactics. It is vital that they wake up and realise how much is at stake. Tobacco industry funded research cannot be trusted. No government should implement a track and trace system linked in any shape or form to the tobacco manufacturers. Doing so could allow the tobacco industry's involvement in smuggling to continue with impunity."

Andy Rowell, co-author of the first paper states: "Governments need to be alert to what the tobacco industry is doing and to realise it is now operating via a complex web of front groups and companies. Any track and trace system linked to 'Codentify' simply cannot be trusted."

Credit: 
University of Bath

UV light may prevent infections in catheters, cardiac drivelines

image: A single UVC light-diffusing optical fiber.

Image: 
David Welch, Ph.D., Columbia University Vagelos College of Physicians and Surgeons

A specific wavelength of ultraviolet light, now delivered through light-diffusing optical fibers, is highly effective at killing drug-resistant bacteria in cell cultures, according to a new study led by David J. Brenner, PhD, a professor of radiation biophysics at Columbia University Vagelos College of Physicians and Surgeons. The technology is designed to prevent infections around skin-penetrating medical devices, such as catheters or mechanical heart pump drivelines.

Why it Matters

Infections from skin-penetrating medical devices, including catheters and drivelines for left ventricular assist devices (LVADs), are a major health threat. For example, an estimated 14 to 28 percent of patients with an LVAD develop a driveline skin infection, leading to complications that limit the devices' use as a long-term therapy for heart failure. The most serious of these infections are caused by the drug-resistant bacterium MRSA (methicillin-resistant Staphylococcus aureus).

Background

In a previous mouse study, Brenner and his Columbia team demonstrated that a narrow spectrum of far-UVC light, with a wavelength of 207 to 224 nanometers (nm), can kill MRSA bacteria without damaging human skin. Conventional germicidal UV light, with a wavelength of 254 nm, is also effective at killing bacteria, but it can't be used in health care settings around people because it can harm the skin and eyes. Far-UVC light is safe for people because it can't penetrate the outer layer of dead skin or the tear layer of the eye, but it's deadly for bacteria, which are much smaller and easier to penetrate.

What's New

The current study was designed to test whether far-UVC light that is transmitted along a thin fiber could be used to disinfect complex tissue shapes, such as the area where a catheter or a driveline enters the skin. The Columbia team developed a new way to deliver the light, using a laser to send 224 nm far-UVC light through a thin flexible optical fiber. In this study, the fibers were laid directly over tissue cultures containing MRSA bacteria, which were efficiently killed by the far-UVC light diffusing out of the fibers.

What it Means

"Our study suggests that far-UVC light, delivered by optical fibers that can be incorporated into skin-penetrating devices, could be used to prevent catheter-based and driveline infections," said Brenner. "This application would be used for catheters or drivelines that have to be kept in place for long periods of time, and it's hard to keep the area where they penetrate the skin sterile. Incorporating these thin far-UVC-emitting fibers into the catheter or driveline may be the solution."

What's Next

Studies to determine if the technology can prevent infections around skin-penetrating lines in animal models are currently underway.

Caveats

The study was performed on bacteria in laboratory tissue cultures, not on living animals or human patients. In addition, the technology to make the equipment easily portable and affordable is under development.

Credit: 
Columbia University Irving Medical Center

D-Wave demonstrates first large-scale quantum simulation of topological state of matter

BURNABY, BC - (August 22, 2018) -- D-Wave Systems Inc., the leader in quantum computing systems and software, today published a milestone study demonstrating a topological phase transition using its 2048-qubit annealing quantum computer. This complex quantum simulation of materials is a major step toward reducing the need for time-consuming and expensive physical research and development.

The paper, entitled "Observation of topological phenomena in a programmable lattice of 1,800 qubits", was published in the peer-reviewed journal Nature (Vol. 560, Issue 7719, August 22, 2018). This work marks an important advancement in the field and demonstrates again that the fully programmable D-Wave quantum computer can be used as an accurate simulator of quantum systems at a large scale. The methods used in this work could have broad implications in the development of novel materials, realizing Richard Feynman's original vision of a quantum simulator. This new research comes on the heels of D-Wave's recent Science Magazine paper demonstrating a different type of phase transition in a quantum spin-glass simulation. The two papers together signify the flexibility and versatility of the D-Wave quantum computer in quantum simulation of materials, in addition to other tasks such as optimization and machine learning.

In the early 1970s, theoretical physicists Vadim Berezinskii, J. Michael Kosterlitz and David Thouless predicted a new state of matter characterized by nontrivial topological properties. The work was awarded the Nobel Prize in Physics in 2016. D-Wave researchers demonstrated this phenomenon by programming the D-Wave 2000Q™ system to form a two-dimensional frustrated lattice of artificial spins. The observed topological properties in the simulated system cannot exist without quantum effects and closely agree with theoretical predictions.

"This paper represents a breakthrough in the simulation of physical systems which are otherwise essentially impossible," said 2016 Nobel laureate Dr. J. Michael Kosterlitz. "The test reproduces most of the expected results, which is a remarkable achievement. This gives hope that future quantum simulators will be able to explore more complex and poorly understood systems so that one can trust the simulation results in quantitative detail as a model of a physical system. I look forward to seeing future applications of this simulation method."

"The work described in the Nature paper represents a landmark in the field of quantum computation: for the first time, a theoretically predicted state of matter was realized in quantum simulation before being demonstrated in a real magnetic material," said Dr. Mohammad Amin, chief scientist at D-Wave. "This is a significant step toward reaching the goal of quantum simulation, enabling the study of material properties before making them in the lab, a process that today can be very costly and time consuming."

"Successfully demonstrating physics of Nobel Prize-winning importance on a D-Wave quantum computer is a significant achievement in and of itself. But in combination with D-Wave's recent quantum simulation work published in Science, this new research demonstrates the flexibility and programmability of our system to tackle recognized, difficult problems in a variety of areas," said Vern Brownell, D-Wave CEO.

"D-Wave's quantum simulation of the Kosterlitz-Thouless transition is an exciting and impactful result. It not only contributes to our understanding of important problems in quantum magnetism, but also demonstrates solving a computationally hard problem with a novel and efficient mapping of the spin system, requiring only a limited number of qubits and opening new possibilities for solving a broader range of applications," said Dr. John Sarrao, principal associate director for science, technology, and engineering at Los Alamos National Laboratory.

"The ability to demonstrate two very different quantum simulations, as we reported in Science and Nature, using the same quantum processor, illustrates the programmability and flexibility of D-Wave's quantum computer," said Dr. Andrew King, principal investigator for this work at D-Wave. "This programmability and flexibility were two key ingredients in Richard Feynman's original vision of a quantum simulator and open up the possibility of predicting the behavior of more complex engineered quantum systems in the future."

The achievements presented in Nature and Science join D-Wave's continued work with world-class customers and partners on real-world prototype applications ("proto-apps") across a variety of fields. The 70+ proto-apps developed by customers span optimization, machine learning, quantum material science, cybersecurity, and more. Many of the proto-apps' results show that D-Wave systems are approaching, and sometimes surpassing, conventional computing in terms of performance or solution quality on real problems, at pre-commercial scale. As the power of D-Wave systems and software expands, these proto-apps point to the potential for scaled customer application advantage on quantum computers.

Credit: 
LaunchSquad

Big data and technology in disasters: Better integration needed for effective response

image: Disasters are becoming more commonplace and complex, and the challenges for rescue and humanitarian organizations are increasing. Increasingly, these groups turn to big data to help provide solutions. The authors wished to examine how ICT tools and big data were being used in disaster responses. By conducting a structured literature search and developing a data extraction tool on the use of ICT and big data during disasters, they showed that some important gaps exist which should be part of a future research focus.

Image: 
Copyright: © 2018 Society for Disaster Medicine and Public Health, Inc. https://doi.org/10.1017/dmp.2018.73

In a recent review article published in the journal Disaster Medicine and Public Health Preparedness, a group of Johns Hopkins authors evaluated 113 studies using predetermined criteria, with the final search taking place on May 1, 2017. Search terms were created in consultation with medical librarians and subject matter experts in Information and Communications Technology (ICT), big data, and disasters. Only articles that implemented ICT and big data tools in real life were considered (Table 1).

A data extraction tool was developed by subject matter experts and included the following items: first author and year, data type, disaster type, and country (Table 2).

The literature review identified some important gaps: more information is needed on the use of these technologies. Most articles discussed the use of ICT in natural disasters, mainly hurricanes and earthquakes. Underreported were data on extreme temperatures and flooding, even though these events account for 27% and 26% of global deaths, respectively.

According to first author Dr. Jeffrey Freeman, "Disasters are inherently a Big Data challenge, and with the ubiquitous nature of cell phones, the rapid spread of connectivity, and the rise of technologies like the Internet of Things, the challenge is only going to get bigger. In disasters, the key question we face today is how do we harness a growingly diverse and often chaotic wave of data and information. Simply put, we've got to handle more data than we've ever had, and do so more quickly and effectively than we've ever done before. Big Data and ICT pose a serious challenge in disasters, but they also hold promise for potential solutions. The answer to leveraging the massive amounts of data that ICT is creating is likely to be found within the very same technologies driving the Information Age. But we have to think creatively about adapting and adopting these technologies in emergency situations. Disasters leave little room for trial and error. The consequences are too great."

According to Dr. Dan Barnett (coauthor on the paper) "As a researcher of public health emergency preparedness and response systems, I've watched closely as the rate of innovation has frequently outpaced adoption in this field. If we are to be effective in responding to disasters and other public health emergency situations, we need to do a better job figuring out how technology can be integrated into disaster response.

In embarking on this integrative literature review, we knew information and communications technology (ICT) was present in disasters, and we knew people were using related technologies, but we didn't know much else. As researchers, we wanted to more clearly understand how Big Data applications and ICT solutions were being used, and more importantly, we wanted to know where things went right and where things went wrong. These kinds of insights can move the state of the science forward, and ultimately, allow us to achieve a more effective response to disasters.

Technology and disasters have had a tenuous relationship. For those of us in the field, there has been a growing recognition that technology holds promise for enabling disaster response, but we've also watched as even the most basic of technologies, like phone service and electricity, has been crippled during the acute phase of a disaster. Technology holds little value in disasters if unavailable when it's needed most. If we can understand more clearly how people want to use Big Data and ICT in disasters, then we can focus our efforts on ensuring those technologies are resilient and reliable under any circumstances."

Credit: 
Society for Disaster Medicine and Public Health, Inc.

GPM sees Hurricane Lane threatening Hawaiian islands with heavy rainfall

video: NASA's GPM Sees Hurricane Lane Threatening Hawaiian Islands With Heavy Rainfall On Aug. 22 at 1:48 a.m. EDT (0548 UTC) the Global Precipitation Measurement mission or GPM core satellite passed over Hurricane Lane when it was a category 5 hurricane in the Central Pacific Ocean. GPM found very heavy rain occurring in powerful storms located in Lane's well-defined eyewall. Moderate to heavy rainfall was also covering a large area extending outward from the eye.

Image: 
Credits: NASA/JAXA, Hal Pierce

The GPM core observatory satellite flew over the Central Pacific Ocean and Hurricane Lane on Aug. 22, 2018 and analyzed rainfall rates and cloud heights. Watches and Warnings are in effect in the Hawaiian Islands.

The Global Precipitation Measurement mission or GPM core satellite passed over Lane at 1:48 a.m. EDT on Aug. 22 (0548 UTC; 7:48 p.m. HST on Aug. 21). At that time Lane was located about 316 nautical miles (585.2 km) from Hilo, Hawaii.

Hurricane Lane is one of the strongest tropical cyclones to move into the Hawaiian Islands.

At the time of this GPM pass, Lane was a category 5 hurricane on the Saffir-Simpson Hurricane Wind Scale with winds of about 140 knots (161 mph). This analysis shows precipitation derived from data collected by the GPM satellite's Microwave Imager (GMI) and Dual-Frequency Precipitation Radar (DPR) instruments. GPM's GMI showed that very heavy rain was occurring in powerful storms located in Hurricane Lane's well-defined eyewall. GMI revealed that moderate to heavy rainfall was also covering a large area extending outward from Lane's eye.

A 3-D image, created at NASA's Goddard Space Flight Center in Greenbelt, Maryland, showed the use of the GPM satellite's radar data (DPR Ku Band) in conjunction with NOAA's GOES-West satellite infrared data to estimate the heights of storms within Hurricane Lane. Estimated storm top heights were over 6.2 miles (10 km). Cloud tops in a band of thunderstorms well southwest of the center were over 9.2 miles (14.9 km).

GPM is a joint mission between NASA and the Japan Aerospace Exploration Agency, JAXA.

Watches and Warnings

Warnings and watches are up in the Hawaiian Islands. NOAA's Central Pacific Hurricane Center said a Hurricane Watch is in effect for Central Oahu, Kauai Leeward, Kauai Mountains, Kauai Windward, Niihau, Oahu Koolau, Oahu North Shore, Oahu South Shore, Olomana, Waianae Coast, and Waianae Mountains.

A Hurricane Warning is in effect for Big Island Interior, Big Island North and East, Big Island Summits, Haleakala Summit, Kahoolawe, Kohala, Kona, Lanai Makai, Lanai Mauka, Leeward Haleakala, Maui Central Valley, Maui Leeward West, Maui Windward West, Molokai Leeward, Molokai Windward, South Big Island, and Windward Haleakala.

Location and Strength of Hurricane Lane on Aug. 22

NOAA's CPHC said at 11 a.m. EDT (5 a.m. HST/1500 UTC), the center of Hurricane Lane was located near latitude 15.1 degrees north and longitude 155.3 degrees west. That's about 315 miles (505 km) south of Kailua-Kona, Hawaii and about 460 miles (740 km) south-southeast of Honolulu, Hawaii.

Lane is moving toward the west-northwest near 9 mph (15 kph) and this motion is expected to become northwest later today, followed by a turn to the north-northwest on Thursday. On the forecast track, the center of Lane will move very close to or over the main Hawaiian Islands from Thursday through Saturday, Aug. 25.

Maximum sustained winds are near 155 mph (250 kph) with higher gusts. Lane is a category 4 hurricane on the Saffir-Simpson Hurricane Wind Scale. Some weakening is forecast during the next 48 hours, but Lane is forecast to remain a dangerous hurricane as it draws closer to the Hawaiian Islands.
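The Saffir-Simpson categories used throughout these advisories are simple wind-speed bands. A minimal sketch of the classification, using the scale's published mph thresholds rather than anything from this article:

```python
def saffir_simpson_category(wind_mph: float) -> int:
    """Map maximum sustained winds (mph) to a Saffir-Simpson category.
    Returns 0 for winds below hurricane strength (under 74 mph)."""
    if wind_mph >= 157:
        return 5
    if wind_mph >= 130:
        return 4
    if wind_mph >= 111:
        return 3
    if wind_mph >= 96:
        return 2
    if wind_mph >= 74:
        return 1
    return 0

# Lane at 155 mph in this advisory falls in category 4;
# at 161 mph during the earlier GPM overpass it was category 5.
current = saffir_simpson_category(155)   # 4
overpass = saffir_simpson_category(161)  # 5
```

Note how close 155 mph sits to the 157 mph category 5 threshold, which is why small intensity changes near these boundaries move a storm between categories.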

Hurricane-force winds extend outward up to 40 miles (65 km) from the center and tropical-storm-force winds extend outward up to 140 miles (220 km). The estimated minimum central pressure is 935 millibars.

Situation Overview from CPHC on Aug. 22

CPHC stated "Major Hurricane Lane is passing roughly 280 miles south of the Big Island this morning and has started a turn towards the northwest in line with the Central Pacific Hurricane Center forecast. The center of Lane will track dangerously close to the Hawaiian Islands from Thursday, Aug. 23 through Saturday, Aug. 25.

Regardless of the exact track of the storm center, life threatening impacts are likely over some areas as this strong hurricane makes its closest approach. Just a reminder that impacts from a hurricane extend far from the center of the storm and slight changes to the forecast track this close to the islands will produce rapid changes to the local forecast impacts."

The CPHC forecasts that "the current forecast track will bring local impacts of damaging winds and life threatening flooding rain across the state from Wednesday through Saturday".

Credit: 
NASA/Goddard Space Flight Center

Excited atoms throw light on anti-hydrogen research

image: A positron beam line transports the positrons from the source into the main antihydrogen trap.

Image: 
Swansea University

Swansea University scientists working at CERN have published a study detailing a breakthrough in antihydrogen research.

The scientists were working as part of the ALPHA collaboration which is made up of researchers and groups from over a dozen institutions from all over the world, with the UK contingent led by Swansea University's Professor Mike Charlton.

The research, funded by the EPSRC, was carried out using apparatus at the Antiproton Decelerator facility at CERN, and has been published in the journal Nature.

The Experiment:

The ALPHA team experiment shows how the scientists improved efficiency in the synthesis of antihydrogen, and for the first time succeeded in accumulating the anti-atoms, which has allowed for greater scope in their experimentation.

Professor Charlton said: "When an excited atom relaxes, it emits light of a characteristic colour; the yellow colour of sodium street lights is an everyday example of this. When the atom is hydrogen - a single electron and a single proton - and the excited electron decays to the lowest energy state from a higher one, the discrete series of ultraviolet light emitted forms the Lyman Series, named after Theodore Lyman, who first observed it over 100 years ago.

"The presence of these discrete lines helped to establish the theory of quantum mechanics, which governs the world at an atomic level and is one of the cornerstones of modern physics.

"The Lyman-alpha line is of fundamental importance in physics and astronomy. For example, observations in astronomy of how the line from distant emitters is shifted to longer wavelengths (known as the redshift) give us information on how the universe evolves, and allow testing of models which predict its future."

This experiment is the first time the Lyman-alpha transition - when the hydrogen electron transitions between the so-called 1S and 2P states, emitting or absorbing UV light of 121.6 nm wavelength - has been observed in antihydrogen. Antihydrogen is the antimatter counterpart of hydrogen, consisting of a single antiproton and a single anti-electron, the latter particle also known as a positron.
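The 121.6 nm figure quoted above can be checked from first principles. A minimal sketch (not from the ALPHA paper, just standard textbook arithmetic): the Rydberg formula for the n=2 → n=1 transition in hydrogen gives the Lyman-alpha wavelength, and dividing the photon energy hc/λ by the elementary charge converts it to electronvolts.

```python
# Rydberg-formula check of the Lyman-alpha line quoted in the article.
# Constants are CODATA values; this is illustrative arithmetic only.
R_H = 1.0967758e7    # Rydberg constant for hydrogen, m^-1
h = 6.62607015e-34   # Planck constant, J*s
c = 2.99792458e8     # speed of light, m/s
e = 1.602176634e-19  # elementary charge (J per eV)

# 1/lambda = R_H * (1/n1^2 - 1/n2^2), with n1 = 1 (1S) and n2 = 2 (2P)
inv_lambda = R_H * (1 / 1**2 - 1 / 2**2)
wavelength_nm = 1e9 / inv_lambda
photon_energy_eV = h * c * inv_lambda / e

print(f"Lyman-alpha wavelength: {wavelength_nm:.1f} nm")  # ~121.6 nm
print(f"Photon energy: {photon_energy_eV:.2f} eV")        # ~10.2 eV
```

The ~10.2 eV photon energy is what makes this transition experimentally demanding: it sits deep in the vacuum ultraviolet, where lasers and optics are hard to build, which is part of why driving Lyman-alpha in trapped antihydrogen is a notable achievement.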

Excited Atoms

For this experiment, the physicists accumulated about 500 antihydrogen atoms in the trap. If they did nothing, they could hold these atoms for many, many hours without loss. However, by illuminating the trapped atoms with various colours of UV light, the team could drive the Lyman-alpha transition and excite the antihydrogen atoms.

These excited atoms are no longer trapped within the apparatus and, being made of antimatter, promptly annihilate with the surrounding matter of the equipment and are detected.

This observation is significant as it is yet another test of a property of antihydrogen that is in good agreement with that of hydrogen. It is also a key step towards the production of ultra-cold antihydrogen atoms, which will greatly improve the ability to control, manipulate and perform further precision studies on the anti-atom.

Professor Charlton said: "This represents another landmark advance in atomic physics, which should open the way to manipulation of the kinetic energies of the trapped anti-atoms.

"While studies have continued at the Antiproton Decelerator facility at CERN, further refining these measurements and using the techniques to improve our understanding of the antihydrogen through spectroscopy, the ALPHA team will be modifying the apparatus in order to study the effect of Earth's gravity on the anti-atom. The next few months will be an exciting time for all concerned."

Credit: 
Swansea University

Does it matter where students sit in lecture halls?

image: This is a lecture theater.

Image: 
FEBS

Lectures are a staple of higher education, and understanding how students interact and learn within the lecture theatre environment is central to successful learning. In a new study published in FEBS Open Bio, researchers examined students' reasons for choosing particular seats in a lecture hall, and investigated how seating positions correlate with student performance.

Many students preferred being able to sit with their friends, while others were more concerned with either attracting or avoiding the lecturer's attention. Some students chose seats that allowed them to see and hear clearly, while others picked seats that were easy to vacate, which made them feel less anxious.

Friendship groups who sat together tended to achieve similar grades, and students who sat alone at the edges tended to do worse than average. Lecturers may be able to use these findings to provide assistance to anxious students, and to support the learning of all students by encouraging interactions between the different groups.

"Interaction is a key part of learning and knowing who the students are interacting with can be a great benefit when designing activities," said lead author Dr. David P. Smith, of Sheffield Hallam University, in the UK.

Credit: 
Wiley