Tech

Removing the brakes on plant oil production

image: Using biochemical genetic techniques in plant cell cultures, this Brookhaven Lab team evaluates the effects of releasing the biochemical 'brakes' on plant oil production. Pictured are: Group leader and Biology Department Chair John Shanklin (standing), Jan Keereetaweep (front right), Hui Liu (front left), and Zhiyang Zhai.

Image: 
Brookhaven National Laboratory

UPTON, NY--Scientists studying plant biochemistry at the U.S. Department of Energy's Brookhaven National Laboratory have discovered new details about biomolecules that put the brakes on oil production. The findings suggest that disabling these biomolecular brakes could push oil production into high gear--a possible pathway toward generating abundant biofuels and plant-derived bioproducts. The study appears in the journal Plant Physiology.

"It's normal for plant cells to down-regulate oil production when we feed them excess fatty acids, and this study confirms our hypothesis about how they do that. But we also discovered that the brakes on oil production are partially on even under normal conditions, which was a big surprise," said Brookhaven Lab biochemist John Shanklin, who led the research.

"It would be like driving a car for several years and finding out one day that a parking brake you didn't know about had been on all along. When you remove that brake, the car has much more power; that's what we've just discovered for plant oil production," he said.

A delicate balance

The biomolecule central to this study is the enzyme that determines the rate of oil production. That enzyme, known as ACCase, is a protein made of four subunits, all of which are necessary for the enzyme to function. With all four subunits in place, the enzyme drives the first step in the synthesis of fatty acids, key components of oils.

Earlier work by Shanklin's group in 2012 revealed that when plant cells were fed a short-term excess of fatty acids (lasting less than two days), a feedback loop inhibited this enzyme, so oil production would slow down. As long as fatty acid concentrations dropped within two days, the enzyme and oil production would turn back on. But a longer-term excess of fatty acids would permanently disable the enzyme. At the time, scientists knew of several ways that the enzyme could be inhibited, but none of those ways could explain the irreversible inhibition they were observing.

When colleagues at the University of Missouri discovered an inactive version of one of the four enzyme subunits in 2016, Shanklin suspected that this inactive subunit might be the cause of the permanent shutdown--by taking the place of one of the active subunits in the enzyme. He designed this new study to test that hypothesis.

Team member Hui Liu obtained plants in which the genes that code for the inactive subunits were individually disabled. She used those variants to breed plants that had combinations of disabled subunits. If Shanklin's idea was correct, cells with disabled inactive subunits would have a lower capacity to turn the enzyme off.

"We suspected that disabling the genes would turn off the off-switch for oil production, allowing the plant cells to make more oil," Shanklin explained.

When team member Jan Keereetaweep tested this idea by feeding the plant cells excess fatty acids, that's exactly what happened: Cells with combinations of the disabled genes didn't turn off oil production the way cells with the normal genes did.

"There was 50 percent less inhibition of oil production in the cells with disabled genes compared to the wild-type plant cells," Shanklin said. That result confirmed that the inactive subunit coded for by the normal genes in the wild-type plants was indeed what triggered permanent shutdown of the enzyme.

But the big surprise came when Keereetaweep measured fatty acid synthesis in the plant cells with disabled inactive subunits without artificially feeding them excess fatty acids and compared the results with those for wild-type plant cells under the same conditions. Under those normal conditions, where you wouldn't expect to see oil production inhibited, the enzyme driving oil production was significantly more active in plant cells with the disabled genes than in normal plant cells.

"That means that, even under normal conditions, inactive subunits are putting the brakes on ACCase, reducing its activity and limiting oil production," Shanklin said. "Disabling the genes for those inactive subunits is like taking the brakes off the car, revealing the motor's true potential."

"This project was an excellent collaboration among Keereetaweep, Liu, and Zhiyang Zhai to answer some basic scientific questions about plant metabolism," Shanklin noted. "Now, the knowledge they generated can potentially underpin strategies to increase oil accumulation in plant species grown for applications such as biofuels or bioproducts."

Credit: 
DOE/Brookhaven National Laboratory

A different spin on superconductivity

image: Artistic representation of high-spin pairs forming in a YPtBi crystal, leading to unconventional superconductivity.

Image: 
Emily Edwards, University of Maryland

When you plug in an appliance or flip on a light switch, electricity seems to flow instantly through wires in the wall. But in fact, the electricity is carried by tiny particles called electrons that slowly drift through the wires. On their journey, electrons occasionally bump into the material's atoms, giving up some energy with every collision.

The degree to which electrons travel unhindered determines how well a material can conduct electricity. Environmental changes can enhance conductivity, in some cases drastically. For example, when certain materials are cooled to frigid temperatures, electrons team up so they can flow uninhibited, without losing any energy at all--a phenomenon called superconductivity.

Now a team of researchers from the University of Maryland (UMD) Department of Physics, together with collaborators, has seen exotic superconductivity that relies on highly unusual electron interactions. While predicted to occur in systems other than solid materials, this type of behavior has remained elusive. The team's research, published in the April 6 issue of Science Advances, reveals effects that are profoundly different from anything seen before in superconductivity.

Electron interactions in superconductors are dictated by a quantum property called spin. In an ordinary superconductor, electrons, which carry a spin of ½, pair up and flow uninhibited with the help of vibrations in the atomic structure. This theory is well-tested and can describe the behavior of most superconductors. In this new research, the team uncovers evidence for a new type of superconductivity in the material YPtBi, one that seems to arise from spin-3/2 particles.

"No one had really thought that this was possible in solid materials," explains Johnpierre Paglione, a UMD physics professor and senior author on the study. "High-spin states in individual atoms are possible, but once you put the atoms together in a solid, these states usually break apart and you end up with spin one-half."

Finding that YPtBi was a superconductor surprised the researchers in the first place. Most superconductors start out as reasonably good conductors, with a lot of mobile electrons--an ingredient that YPtBi is lacking. According to the conventional theory, YPtBi would need about a thousand times more mobile electrons in order to become superconducting at temperatures below 0.8 Kelvin. And yet, upon cooling the material to this temperature, the team saw superconductivity happen anyway. This was a first sign that something exotic was going on inside this material.

After discovering the anomalous superconducting transition, researchers made measurements that gave them insight into the underlying electron pairing. They studied a telling feature of superconductors--their interaction with magnetic fields. As the material undergoes the transition to a superconductor, it will try to expel any added magnetic field from its interior. But the expulsion is not completely perfect. Near the surface, the magnetic field can still enter the material but then quickly decays away. How far it goes in depends on the nature of the electron pairing, and changes as the material is cooled down further and further.

To probe this effect, the researchers varied the temperature in a small sample of the material while exposing it to a magnetic field more than ten times weaker than the Earth's. A copper coil surrounding the sample detected changes to the superconductor's magnetic properties and allowed the team to sensitively measure tiny variations in how deep the magnetic field reached inside the superconductor.

The measurement revealed an unusual magnetic intrusion. As the material warmed from absolute zero, the field penetration depth for YPtBi increased linearly instead of exponentially as it would for a conventional superconductor. This effect, combined with other measurements and theory calculations, constrained the possible ways that electrons could pair up. The researchers concluded that the best explanation for the superconductivity was electrons disguised as particles with a higher spin--a possibility that hadn't even been considered before in the framework of conventional superconductivity.
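
The contrast the researchers relied on can be sketched numerically. Below is a minimal illustration of the two low-temperature behaviors: an exponentially suppressed change in penetration depth, as expected for a fully gapped conventional superconductor, versus the linear growth observed in YPtBi. The functional forms and all numbers are illustrative only, not values from the study:

```python
import numpy as np

def delta_lambda_conventional(T, Tc=0.8, delta0=1.0):
    # Fully gapped (conventional) superconductor: the change in
    # penetration depth is exponentially suppressed at low T,
    # roughly ~ exp(-Delta/kT) (illustrative form, arbitrary units).
    return np.exp(-delta0 * Tc / T)

def delta_lambda_linear(T, Tc=0.8):
    # A gap with low-energy quasiparticle excitations instead gives a
    # penetration depth that grows linearly with temperature, as
    # measured for YPtBi (again, arbitrary units).
    return T / Tc

# Compare the two well below Tc = 0.8 K
for T in np.linspace(0.05, 0.4, 4):
    print(f"T={T:.2f} K  conventional~{delta_lambda_conventional(T):.5f}  "
          f"linear~{delta_lambda_linear(T):.3f}")
```

At the lowest temperatures the exponential form is vanishingly small while the linear form is already appreciable, which is why the temperature sweep cleanly distinguishes the two pairing scenarios.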

The discovery of this high-spin superconductor has given a new direction for this research field. "We used to be confined to pairing with spin one-half particles," says Hyunsoo Kim, lead author and a UMD assistant research scientist. "But if we start considering higher spin, then the landscape of this superconducting research expands and just gets more interesting."

For now, many open questions remain, including how such pairing could occur in the first place. "When you have this high-spin pairing, what's the glue that holds these pairs together?" says Paglione. "There are some ideas of what might be happening, but fundamental questions remain, which makes it even more fascinating."

Credit: 
University of Maryland

Cell biology: Dynamics of microtubules

Cells possess an internal skeleton, which enables them to alter their form and actively migrate. This 'cytoskeleton' is composed of a number of filament systems, of which microtubules are one. As the name suggests, a microtubule is a cylinder. Its wall is made up of 13 protofilaments, each consisting of heterodimeric subunits containing two related tubulin proteins. Microtubules not only confer mechanical stability on cells and help to dictate their forms, they also serve as an intracellular transport network. Furthermore, microtubules are the major constituents of the mitotic spindle, which mediates the orderly segregation of the replicated chromosome sets into the two daughter cells during cell division. All of these functions require dynamic regulation of microtubule lengths. A group of LMU physicists led by Professor Erwin Frey, in collaboration with Professor Stefan Diez (Technical University of Dresden and Max Planck Institute for Molecular Cell Biology and Genetics, Dresden), has now developed a model in which the motor proteins that are responsible for the transport of cargo along protofilaments also serve to regulate microtubule lengths. The model is described and experimentally validated in the journal Physical Review Letters.

In earlier work, Frey's group had shown that the density of molecular motors attached to the filaments has an impact on whether the microtubule grows or shrinks, and that their effect depends on the length of the filament concerned. The longer the microtubule, the greater the number of motor proteins it can accommodate. Motor molecules called kinesins proceed along the protofilament, stepping from one dimer to the next. When a kinesin protein reaches the end, it detaches from the filament, taking the tubulin to which it is bound with it. Consequently, if the motor density on the protofilament is high, shrinkage continues as successive motors reach the end. On the other hand, a new tubulin dimer can bind to the end. At the filament end, motor-dependent shrinkage thus competes with microtubule growth. "Hence, assuming that resources (i.e. both tubulins and molecular motors) are present in excess, there will be a filament length at which the rates of growth and shrinkage balance out," says Matthias Rank, first author of the study. However, in a real cell, these components are unlikely to be available in unlimited amounts. For example, formation of the mitotic spindle significantly depletes the numbers of free tubulin molecules in the soluble phase of the cytoplasm. In the new study, the researchers explored the effects of such resource limitation on the regulation of microtubule length.

Using simulations based on a mathematical model of polymer dynamics, they found that under these conditions two distinct mechanisms of length regulation come into play. Which of these becomes dominant depends on the relative concentrations of the tubulins and the motor proteins: In a certain concentration range the dynamic equilibrium between growth and shrinkage of the microtubules operates as it would if resources were not limiting. "But things are different when one of the required resources is in short supply", says Rank. "That is the case, for instance, when not enough motor molecules are available to trigger rapid depolymerization of the protofilaments." In this situation, the microtubules continue to grow until the concentration of tubulins falls below a critical value. Furthermore, there is a concentration range in which both processes are active. "In this case, we observe that the microtubules come in two sizes and that they sometimes switch between the two lengths", says Frey. "In physical terms, this can be described as a phase transition." In vitro experiments carried out by their co-author in Dresden have confirmed the existence of this transitional regime predicted by the Munich model. The team is convinced that their results are also applicable to other polymer systems, and they suspect that the limitation of key resources may play an important part in regulating other cellular processes.
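
The balance point at the heart of the length-regulation argument can be illustrated with a toy calculation. This is a deliberately simplified sketch, not the model published in Physical Review Letters; the function name and every rate constant below are invented for illustration. Growth adds dimers at a constant rate, while motors reach the tip at a rate proportional to filament length (a longer filament accommodates more motors), each removing one dimer:

```python
# Toy model of motor-driven microtubule length regulation.
# All names and rate constants are hypothetical, chosen only to
# illustrate the balance condition described in the text.
def steady_length(growth_rate, depoly_per_motor, motor_density):
    # Balance condition at the steady-state length L*:
    #   growth_rate = depoly_per_motor * motor_density * L*
    # so the filament settles at:
    return growth_rate / (depoly_per_motor * motor_density)

L_star = steady_length(growth_rate=10.0,      # dimers added per unit time
                       depoly_per_motor=0.5,  # dimers removed per motor arrival
                       motor_density=0.2)     # motors per unit length
print(f"steady-state length ~ {L_star:.1f} (arbitrary units)")
```

In this caricature, raising the motor density lowers the steady-state length and lowering it lets the filament grow longer, which mirrors the qualitative behavior described above; the published model additionally couples these rates to the finite pools of tubulin and motors.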

Credit: 
Ludwig-Maximilians-Universität München

Creating a 2-D platinum magnet

image: This is Associate Professor Justin Ye, head of the Device Physics of Complex Materials group, which is part of the Zernike Institute for Advanced Materials at the University of Groningen's Faculty of Science and Engineering.

Image: 
Zernike Institute for Advanced Materials, University of Groningen

University of Groningen physicists have induced magnetism in platinum with an electric field created by a paramagnetic ionic liquid. As only the surface of the platinum is affected, this creates a switchable 2D ferromagnet. The study was published in Science Advances on 6 April.

Platinum is used a lot in jewellery and electronics. Although this precious metal looks great and is an excellent conductor, it has no magnetic properties. Until now, that is: University of Groningen scientists have induced ferromagnetic states on the surface of a thin film of platinum. 'You can tune magnets electrically by changing the number of carriers inside, which is one of the key ideas in spintronics. But so far, no one could generate magnets like that', says Associate Professor Justin Ye, chair of the Device Physics of Complex Materials group at the University of Groningen.

Ionic liquid

Lei Liang, a postdoc in Ye's research group and first author of the paper, built a device in which he could induce ferromagnetism in non-magnetic platinum, using a field effect generated by gating through an ionic medium known as an ionic liquid. Ye: 'The key here is that we used a paramagnetic ionic liquid, a new type of ionic liquid which we synthesized ourselves.' If an electric field is applied, the ions move to the surface of the platinum, carrying both charge and magnetic moment. Both affect the surface layer of the platinum film, creating an atomically thin layer of magnetic platinum.

Spintronics

'We were able to show that this is really a 2D magnet, and the magnetic state can extend to room temperature', says Ye. 'It is amazing that we could still add new properties to such a well-known material.' Recently, many 2D magnets have been isolated from layered compounds, but most are insulators and they are only magnetic at very low temperature. Making them in a conductor could be useful in spintronics, a promising new type of electronics, which is based on the magnetic moment (or spin) of electrons. The new discovery means that magnetism can be switched on and off in a conductor, which could lead to the development of devices that can simultaneously control charge and spin.

Credit: 
University of Groningen

Mechanism vital to keeping blood stem cells functional uncovered

Hematopoietic stem cells, which form mature blood cells, require a very precise amount of protein to function -- and defective regulation of protein production is common in certain types of aggressive human blood cancers. Now, a research team at Lund University in Sweden has uncovered a completely new mechanism that controls how proteins are produced to direct stem cell function.

"Our research is potentially important for life-threatening blood cancers characterised by dysfunctional stem cells -- which are common in elderly people. High protein synthesis levels could represent an Achilles' heel for eradicating cancer-initiating cells", explains Cristian Bellodi, research team leader at Lund University's Department of Laboratory Medicine.

Dr. Bellodi's laboratory uncovered a new important function of pseudouridine, the most common type of RNA modification in human cells.

RNA is the essential molecule that decodes the genetic information in humans. It is emerging that the chemical structure of RNA molecules is extensively modified by specific enzymes normally present in our cells; these enzymes are commonly found to be altered in severe medical syndromes and various types of cancer. However, the contribution of RNA modifications to human development and disease is still mostly unexplored.

"Understanding the function of RNA modifications represents a new exciting research area. We still know very little about the mechanisms by which RNA molecules are modified, and whether this affects important biological processes in our cells. Therefore, it is essential that we learn how specific types of chemical modifications normally regulate RNA function in our cells, in order to understand how dysregulation of this process contributes to human disease," says Cristian Bellodi.

The team's key discovery was that stem cells lacking an enzyme responsible for pseudouridine modification of RNA, known as PUS7, produce abnormal amounts of protein. This protein overload leads to unbalanced stem cell growth and dramatically blocks differentiation to blood cells.

They uncovered that the PUS7 enzyme is capable of introducing a pseudouridine modification into previously uncharacterized, non-protein-coding RNA molecules that they denoted miniTOGs (mTOGs). The presence of pseudouridine "activates" mTOGs to strongly suppress the stem cell protein synthesis machinery. This ensures that the correct amount of protein is made.

"Our work illustrates that this exquisite control mechanism -- regulated by PUS7 and pseudouridine -- is critical to adjusting the amount of proteins needed for human stem cells to grow and produce blood", says Cristian Bellodi.

Since pseudouridine modifications may affect various RNA molecules in different types of normal and malignant cells, "our discoveries pave the way for future avenues of research aimed at exploring the role of pseudouridine in human development and disease", concludes Cristian Bellodi.

Credit: 
Lund University

School lunch decisions made by the child and not the parent

Philadelphia, April 6, 2018 - While school lunches in the UK are subject to food standards, the contents of packed lunches are not as closely scrutinized, and studies have raised concern regarding the nutritional quality of packed lunches. A new study published in the Journal of Nutrition Education and Behavior found that children, not their parents, are often the primary decision maker of whether they will eat a school lunch or what is packed for their lunch.

"Children's role in their packed lunch provision highlights their growing authority over everyday food decisions. Packed lunches provide a unique medium because they connect the school, parent, and student. There is limited research, though, on parents' perspectives and perceptions related to packed lunches, specifically the role of children in food choice and preparation," said lead author Hannah Ensaff, PhD, School of Food Science and Nutrition, University of Leeds, Leeds, UK.

Study participants were twenty parents providing a packed lunch for their children (aged 5 - 11 years) attending four urban primary schools in the UK. Focus groups were conducted to promote discussion among parents to gain an understanding of contrasting viewpoints. Key topics explored included reasons for selecting a packed lunch, foods and beverages included and their selection, role of children in preparation, and packed lunch policies.

After analysis of the data, four key themes emerged: children as a decision maker; priorities when preparing a packed lunch; parents' anxieties and reassurances; and school factors. Even though parents preferred taking advantage of school lunches that are provided at no cost to some families, they were unwilling to force this decision when the child disagreed. The child's food preferences also took precedence when the packed lunch was prepared. Children themselves made specific requests when shopping, or the parent packed what they knew would be enjoyed and eaten. The ability to monitor that a lunch had been eaten was cited as a benefit of a packed lunch over a school lunch, and providing a treat in the packed lunch was also important to parents. The inclusion of treats and other items such as chips, chocolate, and soda is often prohibited by packed lunch guidelines, but parents questioned whether enforcement is possible. They also reported children trying to persuade parents to ignore the policy by reporting on what other children had brought to school.

"Children's growing authority over food choice has implications for staff involved in providing school food and presents an opportunity to develop initiatives to promote better food choices and subsequent nutrition," said Dr. Ensaff. "This is particularly important as schools are being used for public health interventions." Further research is needed to explore children's perceptions of their role as active decision makers in food choices both in packed lunches and school meals.

Credit: 
Elsevier

Smartphone 'scores' can help doctors track severity of Parkinson's disease symptoms

image: Johns Hopkins computer science students Srihari Mohan, left, and Andong Zhan, display the iPhone and Android smartphone apps they helped design to allow Parkinson's disease patients to measure the severity of their symptoms.

Image: 
Noam Finkelstein/Johns Hopkins U

Parkinson's disease, a progressive brain disorder, is often tough to treat effectively because symptoms, such as tremors and walking difficulties, can vary dramatically over a period of days, or even hours.

To address this challenge, Johns Hopkins University computer scientists, working with an interdisciplinary team of experts from two other institutions, have developed a new approach that uses sensors on a smartphone to generate a score that reliably reflects symptom severity in patients with Parkinson's disease.

In a study published recently online in the journal JAMA Neurology, researchers from Johns Hopkins' Whiting School of Engineering, the University of Rochester Medical Center, and Aston University in the United Kingdom reported that the severity of symptoms among Parkinson's patients seen by neurologists aligned closely with those generated by their smartphone app.

Typically, patients with Parkinson's disease are evaluated by medical specialists during three or four clinic visits annually, with subjective assessments capturing only a brief snapshot of a patient's fluctuating symptoms. In their homes, patients may also be asked to fill out a cumbersome 24-hour "motor diary" in which they keep a written record of their mobility, involuntary twisting movements and other Parkinson's symptoms. The doctor then uses this self-reported or imprecise data to guide treatment.

In the new study, the researchers say patients could use a smartphone app to objectively monitor symptoms in the home and share this data to help doctors fine-tune their treatment.

E. Ray Dorsey, a University of Rochester Medical Center neurologist and a co-author of the research paper, said he welcomes the validation of Parkinson's patient severity scores produced by the smartphone tests.

"If you think about it, it sounds crazy," he said, "but until these types of studies, we had very limited data on how these people function on Saturdays and Sundays because patients don't come to the clinic on Saturdays or Sundays. We also had very limited data about how people with Parkinson's do at two o'clock in the morning or 11 o'clock at night because, unless they're hospitalized, they're generally not being seen in clinics at those times."

About six years ago, while doing medical research at Johns Hopkins, Dorsey was introduced to Suchi Saria, an assistant professor of computer science at the university. Saria, the corresponding author of the study and an expert in a computing technique called machine learning, had been using it to extract useful information from health-related data that was routinely being collected at hospitals. The two researchers, along with some of Saria's students, teamed up to find a way to monitor the health of Parkinson's patients as easily as people with diabetes can check their glucose levels with a pinprick blood test.

The team members knew that neurologists evaluated their Parkinson's patients by gathering information about how they moved, spoke and completed certain daily tasks. "Can we do this with a cellphone?" Saria wondered at the time. "We asked, 'What are the tricks we can use to make that happen?' "

Using existing smartphone components such as the microphone, touch screen and accelerometer, the team devised five simple tasks involving voice sensing, finger tapping, gait measurement, balance and reaction time, and packaged them into a smartphone app called 'HopkinsPD.' Next, using a machine learning technique the team devised, they converted the data collected from these tests into an objective Parkinson's disease severity score--a score that better reflected the overall severity of patients' symptoms and how well they were responding to medication.
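
The general shape of such a pipeline, turning per-task sensor features into a single severity score calibrated against clinician ratings, can be sketched as follows. This is not the method published in JAMA Neurology; it is a hedged illustration using a plain least-squares fit on synthetic data, with all feature values and weights invented:

```python
import numpy as np

# Hedged sketch: combine features from five smartphone tasks (voice,
# tapping, gait, balance, reaction time) into one severity score by
# fitting a linear model to clinician ratings.  All data here are
# synthetic; the actual HopkinsPD pipeline used its own ML technique.
rng = np.random.default_rng(0)
n_patients, n_features = 40, 5
X = rng.normal(size=(n_patients, n_features))       # per-task features
true_w = np.array([0.8, 0.5, 1.2, 0.3, 0.6])        # synthetic ground truth
y = X @ true_w + 0.1 * rng.normal(size=n_patients)  # clinician ratings

# Ordinary least-squares fit of feature weights to the ratings
w, *_ = np.linalg.lstsq(X, y, rcond=None)

def severity_score(features, weights=w):
    # Higher score = more severe symptoms (arbitrary scale)
    return float(features @ weights)

print("learned weights:", np.round(w, 2))
print("score for one home test session:", round(severity_score(X[0]), 2))
```

Once the weights are fixed, a new at-home test session yields a score immediately, which is what lets the approach track fluctuations between clinic visits rather than at three or four snapshots a year.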

The researchers say this smartphone evaluation should be particularly useful because it does not rely on the subjective observations of a medical staff member. Moreover, it can be administered at any time of day, in a clinic or within the patient's home, where the patient is less likely to be as nervous as in a medical setting.

"The day-to-day variability of Parkinson's symptoms is so high," Saria said. "If you happen to measure a patient at 5 p.m. today and then three months later, again at 5 p.m., how do you know that you didn't catch him at a good time the first time and at a bad time the second time?"

Collecting more frequent smartphone test data in a medical setting as well as in the home could give doctors a clearer picture of their patients' overall health and how well their medications are working, Saria and her colleagues suggested.

Summarizing the importance of their finding in the JAMA Neurology report, the researchers said, "A smartphone-derived severity score for Parkinson's disease is feasible and provides an objective measure of motor symptoms inside and outside the clinic that could be valuable for clinical care and therapeutic development."

Patients in the research project used Android smartphones to download the software, available through the Parkinson's Voice Initiative website. The team has now partnered with Apple and Sage Bionetworks to develop mPower, an iPhone version that is available at Apple's App Store.

The study's three co-lead authors included two of Saria's students from the Department of Computer Science at Johns Hopkins: doctoral candidate Andong Zhan and third-year undergraduate Srihari Mohan.

Zhan, who is from Qujing, Yunnan, in China, described the project as "a unique experience of extracting data from the physical world to a digital world and finally seeing it become meaningful clinical information."

Mohan, who is from Redmond, Washington, added, "While not all research gets integrated tangibly into people's lives, what excites me most is the potential for the methods we developed to be deployed seamlessly into a patient's lifestyle and improve the quality of care."

Credit: 
Johns Hopkins University

Researchers develop transparent patch to detect dangerous food threats

video: Is that meat still good? Are you sure? McMaster researchers have developed a test to bring certainty to the delicate but critical question of whether meat and other foods are safe to eat or need to be thrown out.

Image: 
McMaster University

Is that meat still good? Are you sure? McMaster researchers have developed a test to bring certainty to the delicate but critical question of whether meat and other foods are safe to eat or need to be thrown out.

Mechanical and chemical engineers at McMaster, working closely with biochemists from across campus, have collaborated to develop a transparent test patch, printed with harmless molecules, that can signal contamination as it happens. The patch can be incorporated directly into food packaging, where it can monitor the contents for harmful pathogens such as E. coli and Salmonella.

The new technology, described today in the research journal ACS Nano, has the potential to replace the traditional "best before" date on food and drinks alike with a definitive indication that it's time to chuck that roast or pour out that milk.

"In the future, if you go to a store and you want to be sure the meat you're buying is safe at any point before you use it, you'll have a much more reliable way than the expiration date," says lead author Hanie Yousefi, a graduate student and research assistant in McMaster's Faculty of Engineering.

If a pathogen is present in the food or drink inside the package, it would trigger a signal in the packaging that could be read by a smartphone or other simple device. The test itself does not affect the contents of the package.

According to the World Health Organization, foodborne pathogens result in approximately 600 million illnesses and 420,000 deaths per year. About 30 per cent of those cases involve children five years old and younger.

The researchers are naming the new material "Sentinel Wrap" in tribute to the McMaster-based Sentinel Bioactive Paper Network, an interdisciplinary research network that worked on paper-based detection systems. That network's research ultimately gave rise to the new food-testing technology.

Chemical engineer Carlos Filipe and mechanical-biomedical engineer Tohid Didar collaborated closely on the new detection project.

The signaling technology for the food test was developed in the McMaster labs of biochemist Yingfu Li.

"He created the key, and we have built a lock and a door to go with it," says Filipe, who is Chair of McMaster's Department of Chemical Engineering.

Mass producing such a patch would be fairly cheap and simple, the researchers say, as the DNA molecules that detect food pathogens can be printed onto the test material.

"A food manufacturer could easily incorporate this into its production process," says Didar, an assistant professor of mechanical engineering and member of the McMaster Institute for Infectious Disease Research.

Getting the invention to market would need a commercial partner and regulatory approvals, the researchers say. They point out that the same technology could also be used in other applications, such as bandages that indicate if wounds are infected, or wrapping for surgical instruments to ensure they are sterile.

Credit: 
McMaster University

New blood test useful to detect people at risk of developing Alzheimer's disease

Heidelberg, 6 April 2018 - There is, as yet, no cure for Alzheimer's disease. It is often argued that progress in drug research has been hampered by the fact that the disease can only be diagnosed when it is too late for an effective intervention. Alzheimer's disease is thought to begin long before patients show typical symptoms like memory loss. Scientists have now developed a blood test for Alzheimer's disease and found that it can detect early indicators of the disease long before the first symptoms appear in patients. The blood test would thus offer an opportunity to identify those at risk and may thereby open the door to new avenues in drug discovery. The research is published today in EMBO Molecular Medicine.

One of the hallmarks of Alzheimer's disease is the accumulation of amyloid-beta plaques in the patient's brain. The blood test, developed by Klaus Gerwert and his team at Ruhr University Bochum, Germany, works by measuring the relative amounts of a pathological and a healthy form of amyloid-beta in the blood. The pathological form is a misfolded version of this molecule and is known to initiate the formation of toxic plaques in the brain. Toxic amyloid-beta molecules start accumulating in the patient's body 15-20 years before disease onset. In the present study, Gerwert and colleagues from Germany and Sweden addressed whether the blood test would be able to pick up indications of pathological amyloid-beta in very early phases of the disease.

The researchers first focused on patients in the early, so-called prodromal stages of the disease from the Swedish BioFINDER cohort led by Oskar Hansson. They found that the test reliably detected amyloid-beta alterations in the blood of participants with mild cognitive impairment who also showed abnormal amyloid deposits in brain scans.

In a next step, Gerwert and colleagues investigated whether their assay was able to detect blood changes well ahead of disease onset. They used data from the ESTHER cohort study, which Hermann Brenner started in 2000 at DKFZ, comparing blood samples of 65 participants who were later diagnosed with Alzheimer's disease during follow-up with those of 809 controls. The assay was able to detect signs of the disease on average eight years before diagnosis in individuals without clinical symptoms. It correctly identified those with the disease in almost 70% of cases, while about 9% of unaffected subjects were wrongly flagged as positive. The overall diagnostic accuracy was 86%.
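For readers unfamiliar with these measures, the short sketch below shows how sensitivity, false-positive rate, and overall accuracy relate. It is not from the paper: the confusion matrix is reconstructed from the rounded rates quoted above, so it lands near, but not exactly on, the published 86%, which is based on the study's exact counts.

```python
# Hypothetical reconstruction from the rounded figures in the article:
# 65 later-diagnosed participants, 809 controls, ~70% sensitivity,
# ~9% false positive rate. Not the study's actual confusion matrix.
n_pos, n_neg = 65, 809
sensitivity = 0.70           # fraction of true cases flagged positive
false_positive_rate = 0.09   # fraction of controls wrongly flagged

tp = sensitivity * n_pos            # true positives
fp = false_positive_rate * n_neg    # false positives
tn = n_neg - fp                     # true negatives
fn = n_pos - tp                     # false negatives

accuracy = (tp + tn) / (n_pos + n_neg)
print(f"accuracy = {accuracy:.2f}")  # ~0.89 with these rounded inputs
```

Because the control group is so much larger than the case group, overall accuracy is dominated by the 91% of controls classified correctly, which is why the paper reports all three numbers rather than accuracy alone.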

Currently available diagnostic tools for Alzheimer's disease either involve expensive positron emission tomography (PET) brain scans, or analyze samples of cerebrospinal fluid extracted via lumbar puncture. The researchers suggest that their blood test could serve as a cheap and simple option for pre-selecting individuals from the general population for further testing by these more invasive and costly methods, which would exclude the false positives.

The blood test developed by Gerwert and colleagues uses a technology called immuno-infrared sensing to measure the distribution of pathological and healthy structures of amyloid-beta. The pathological amyloid-beta structure is rich in a sticky, sheet-like folding pattern that makes it prone to aggregation, while the healthy structure is not. The two structures absorb infrared light at different frequencies, allowing the blood test to determine the ratio of healthy to pathological amyloid-beta in the sample.

The blood test will be extended to Parkinson's disease by measuring another disease biomarker - alpha-synuclein - instead of amyloid-beta.

Credit: 
EMBO

Giant solar tornadoes put researchers in a spin

video: A solar tornado observed by the NASA satellite SDO from April 23 to 29, 2015. The tornado prominence erupted on April 28. An image of the Earth is superimposed for scale.

Image: 
SDO data courtesy of NASA. Movie created using the ESA and NASA funded Helioviewer Project.

Despite their appearance, solar tornadoes are not rotating after all, according to a European team of scientists. A new analysis of these gigantic structures, each one several times the size of the Earth, indicates that they may have been misnamed because scientists have so far only been able to observe them using two-dimensional images. Dr Nicolas Labrosse will present the work, carried out by researchers at the University of Glasgow, Paris Observatory, University of Toulouse, and Czech Academy of Sciences, at the European Week of Astronomy and Space Science (EWASS) in Liverpool on Friday 6 April.

Solar tornadoes were first observed in the early 20th century, and the term was re-popularised a few years ago when scientists looked at movies obtained by the AIA instrument on the NASA Solar Dynamics Observatory (SDO). These show hot plasma in extreme ultraviolet light apparently rotating to form a giant structure taking the shape of a tornado (as we know them on Earth).

Now, using the Doppler effect to add a third dimension to their data, the scientists have been able to measure the speed of the moving plasma, as well as its direction, temperature and density. Using several years' worth of observations, they were able to build up a more complete picture of the magnetic field structure that supports the plasma, in structures known as prominences.

Dr Nicolas Labrosse, lead scientist in the study, explains: "We found that despite how prominences and tornadoes appear in images, the magnetic field is not vertical, and the plasma mostly moves horizontally along magnetic field lines. However, we see tornado-like shapes in the images because of projection effects, where the line-of-sight information is compressed onto the plane of the sky."

Dr Arturo López Ariste, another member of the team, adds: "The overall effect is similar to the trail of an aeroplane in our skies: the aeroplane travels horizontally at a fixed height, but we see that the trail starts above our heads and ends up on the horizon. This doesn't mean that it has crashed!"

Giant solar tornadoes - formally called tornado prominences - have been observed on the Sun for around a hundred years. They are so called because of their striking shape and apparent resemblance to tornadoes on Earth, but that is where the comparison ends.

Whereas terrestrial tornadoes are formed from intense winds and are very mobile, solar tornadoes are instead magnetized gas. They seem to be rooted somewhere further down the solar surface, and so stay fixed in place.

"They are associated with the legs of solar prominences - these are beautiful concentrations of cool plasma in the very hot solar corona that can easily be seen as pink structures during total solar eclipses," adds Labrosse.

"Perhaps for once the reality is less complicated than what we see!" comments Dr Brigitte Schmieder, another scientist involved in the work.

She continues: "Solar tornadoes sound scary but in fact they normally have no noticeable consequences for us. However, when a tornado prominence erupts, it can cause what's known as space weather, potentially damaging power, satellite and communication networks on Earth."

Credit: 
Royal Astronomical Society

Attention deficit disorders could stem from impaired brain coordination

image: This is Lin Mei, Ph.D.

Image: 
CWRU School of Medicine

Researchers from Case Western Reserve University School of Medicine and colleagues have discovered how two brain regions work together to maintain attention, and how discordance between the regions could contribute to disorders marked by attention deficits, including schizophrenia, bipolar disorder, and major depression.

People with attention deficits have difficulty focusing and often display compulsive behavior. The new study suggests these symptoms could be due to dysfunction in a gene--ErbB4--that helps different brain regions communicate. The gene is a known risk factor for psychiatric disorders, and is required to maintain healthy neurotransmitter levels in the brain.

In a study published in the current issue of Neuron, researchers showed mice lacking ErbB4 activity in specific brain regions performed poorly on timed attention tasks. The mice struggled to pay attention and remember visual cues associated with food. Neuroscientists describe the kind of thought-driven attention required for the tasks as "top-down attention." Top-down attention is goal-oriented, and related to focus. People who lack efficient top-down attention are at a higher risk for attention deficit hyperactivity disorder (ADHD). The study is the first to connect ErbB4 to top-down attention.

"The results reveal a mechanism for top-down attention, which could go wrong in attention disorders," says corresponding author Lin Mei, PhD, professor and chair of the department of neurosciences at Case Western Reserve University School of Medicine. "And since ErbB4 is a risk factor for schizophrenia, bipolar disorder, and major depression, the results provide insights into mechanisms of these disorders."

When the researchers attached probes to the mice to measure brain activity, they found mice without ErbB4 had brain regions that were acting independently, rather than together in synchrony. In particular, the researchers studied the prefrontal cortex--normally associated with decision-making--and the hippocampus--a region that supports memory. These two regions coordinate for a variety of brain tasks, including memory and attention. "We found top-down attention, previously thought to be controlled by the prefrontal cortex, also involves the hippocampus in a manner where the two regions are highly synchronized when attention is high," says Mei. "Our findings give importance to synchrony between the prefrontal cortex and hippocampus in top-down attention and open up the possibility that attention deficit disorders, like ADHD, might involve impairments in the synchrony between these two regions."

According to the new study, ErbB4 coordinates a cascade of brain signals that "bridge" the two regions. ErbB4 itself encodes a receptor found on the surface of brain cells. The study found that when a protein (neuregulin-1) attaches to the ErbB4 receptor, it triggers a chain reaction that ultimately determines neurotransmitter levels in the prefrontal cortex and hippocampus. Without ErbB4, neurotransmitter levels go awry. The researchers discovered that mice lacking ErbB4 have low levels of a particular neurotransmitter--GABA, or gamma-aminobutyric acid--in the brain. Low GABA levels can impair top-down attention in the prefrontal cortex and reduce how efficiently the prefrontal cortex coordinates with the hippocampus. The researchers concluded that ErbB4 helps link the two brain regions to maintain attention.

The study used a novel mouse model to study brain functions. By using genetic and chemical techniques, Mei's team can specifically inhibit ErbB4 in a specific brain region. "We generated a mutant mouse that enables us to inhibit ErbB4 activity whenever and wherever we want, thus allowing temporal and spatial control of ErbB4 activity," says Mei. "This positions us to understand how different brain regions and their neurotransmitter activity regulate various brain functions." The researchers are planning to use the novel mouse model to study how ErbB4 may coordinate brain activities, in an effort to learn more about mechanisms behind attention deficit disorders.

Credit: 
Case Western Reserve University

Light 'relaxes' crystal to boost solar cell efficiency

image: Constant illumination was found to relax the lattice of a perovskite-like material, making it more efficient at collecting sunlight and converting it to energy. The stable material was tested for solar cell use by scientists at Rice University and Los Alamos National Laboratory.

Image: 
Light to Energy Team/Los Alamos National Laboratory

Some materials are like people. Let them relax in the sun for a little while and they perform a lot better.

A collaboration led by Rice University and Los Alamos National Laboratory found that to be the case with a perovskite compound touted as an efficient material to collect sunlight and convert it into energy.

Aditya Mohite, a staff scientist at Los Alamos who will soon become a professor at Rice, led the research along with Wanyi Nie, also a staff scientist at Los Alamos, and lead author and Rice graduate student Hsinhan (Dave) Tsai. The team discovered that constant illumination relaxes strain in the perovskite's crystal lattice, allowing it to expand uniformly in all directions.

Expansion aligns the material's crystal planes and cures defects in the bulk. That in turn reduces energetic barriers at the contacts, making it easier for electrons to move through the system and deliver energy to devices.

This not only improves the power conversion efficiency of the solar cell, but also does not compromise its photostability, with negligible degradation over more than 1,500 hours of operation under continuous one-sun illumination of 100 milliwatts per square centimeter.

The research, which appears this week in Science, represents a significant step toward stable perovskite-based solar cells for next generation solar-to-electricity and solar-to-fuel technologies, according to the researchers.

"Hybrid perovskite crystal structures have a general formula of AMX3, where A is a cation, M is a divalent metal and X is a halide," Mohite said. "It's a polar semiconductor with a direct band gap similar to that of gallium arsenide.

"This endows perovskites with an absorption coefficient that is nearly an order of magnitude larger than gallium arsenide (a common semiconductor in solar cells) across the entire solar spectrum," he said. "This implies that a 300-nanometer thick film of perovskites is sufficient to absorb all the incident sunlight. By contrast, silicon is an indirect band gap material that requires 1,000 times more material to absorb the same amount of sunlight."

Mohite said researchers have long sought efficient hybrid perovskites that are stable in sunlight and under ambient environmental conditions.

"Through this work, we demonstrated significant progress in achieving both of these objectives," he said. "Our triple-cation-based perovskite in a cubic lattice shows excellent temperature stability at more than 100 degrees Celsius (212 degrees Fahrenheit)."

The researchers modeled and made more than 30 semiconducting, iodide-based thin films with perovskite-like structures: crystalline cubes with atoms arranged in regular rows and columns. They measured the films' ability to transmit current and found that when soaked with light, the energetic barrier between the perovskite and the electrodes largely vanished as the bonds between atoms relaxed.

They were surprised to see that the barrier remained quenched for 30 minutes after the light was turned off. Because the films were kept at a constant temperature during the experiments, the researchers were also able to eliminate heat as a possible cause of the lattice expansion.

Measurements showed the "champion" hybrid perovskite device increased its power conversion efficiency from 18.5 percent to 20.5 percent. On average, the cells' efficiency rose above 19 percent. Mohite said the perovskites used in the study were within 7 percent of the maximum possible efficiency for a single-junction solar cell.

He said the cells' efficiency was nearly double that of all other solution-processed photovoltaic technologies and 5 percent lower than that of commercial silicon-based photovoltaics. They retained 85 percent of their peak efficiency after 800 hours of continuous operation at the maximum power point, and their current density showed no photo-induced degradation over the entire 1,500 hours.

"This work will accelerate the scientific understanding required to achieve perovskite solar cells that are stable," Mohite said. "It also opens new directions for discovering phases and emergent behaviors that arise from the dynamical structural nature, or softness, of the perovskite lattice."

The lead researchers indicated the study goes beyond photovoltaics as it connects, for the first time, light-triggered structural dynamics with fundamental electronic transport processes. They anticipate it will lead to technologies that exploit light, force or other external triggers to tailor the properties of perovskite-based materials.

Co-authors of the paper are research scientist Bo Chen, Rafael Verduzco, an associate professor of chemical and biomolecular engineering and of materials science and nanoengineering, and Pulickel Ajayan, chair of the Department of Materials Science and NanoEngineering, the Benjamin M. and Mary Greenwood Anderson Professor in Engineering and a professor of chemistry, all of Rice; graduate student Reza Asadpour and Muhammad Ashraf Alam, the Jai N. Gupta Professor of Electrical and Computer Engineering, of Purdue University; research scientists Jean-Christophe Blancon, Sergei Tretiak and Wanyi Nie of Los Alamos; research assistant professor Constantinos Stoumpos and Mercouri Kanatzidis, the Charles E. and Emma H. Morrison Professor of Chemistry at Northwestern University; Jacky Even and Olivier Durand, professors of physics at the Institute of Electronics and Telecommunications of Rennes, France, and Joseph Strzalka, a physicist at Argonne National Laboratory.

This work was supported by the Department of Energy's Office of Energy Efficiency and Renewable Energy and its Basic Energy Sciences office.

Credit: 
Rice University

Allina study shows patients with very small breast tumors may forgo lymph node biopsies

How to treat patients who have microinvasive breast cancer - tumors that are 1 mm or less in size (the thickness of a dime) -- is somewhat controversial. Can these tiny tumors affect the lymph nodes and spread cancer to other areas of the body?

Physicians at the Virginia Piper Cancer Institute wanted to know if surgical procedures to test the lymph nodes for cancer were always necessary.

They examined the outcomes of 294 patients who were treated between 2001 and 2015. Only 1.5 percent had positive lymph nodes - indicating the rare possibility of metastatic cancer. And the only patients with positive lymph nodes had microinvasive tumors that were associated with relatively large non-invasive tumors (ductal carcinoma in situ or DCIS).

"These findings allow surgeons to select which patients with microinvasive tumors may actually benefit from lymph node sampling, while sparing other patients from this procedure," said Tamera Lillemoe, M.D., pathologist and a study co-author.

Credit: 
Allina Health

The traits of fast typists discovered by analyzing 136 million keystrokes

video: The test was taken by people from over 200 countries, of whom 68 percent were from the US.

Image: 
Aalto University

Researchers from Aalto University in Finland and the University of Cambridge in the United Kingdom have collected extensive data about the typing behavior of 168,000 volunteers. The researchers developed an online typing test following scientific standards and published it on the free typing speed assessment website typingmaster.com. Users transcribed sentences voluntarily after giving their informed consent that their anonymized data would be collected and used for research purposes. The test was taken by people from over 200 countries, of whom 68% were from the US. Most were younger and interested in typing, with over 70% having taken a touch typing class.

"Ethical large-scale crowdsourcing experiments that allow us to analyse how people interact with computers on a large scale are instrumental for identifying solution principles for the design of next-generation user interfaces," says Dr Per Ola Kristensson, University Reader (Associate Professor) in Interactive Systems Engineering at the University of Cambridge.

The data sheds new light on present-day typing performance. Not surprisingly, the data confirmed that faster typists generally make fewer mistakes. "Correct motor execution is the key to fast typing. Slower typists make more errors and require a long time to identify and correct them. This is particularly detrimental for their performance," says Anna Feit, a doctoral student at Aalto University.

However, the study also discovered a new practice in fast typing called rollover typing, which is well-known among pro gamers but has not been observed during everyday typing. Here, the next key is pressed down before the previous finger has lifted up. This strategy was surprisingly prevalent, with fast users typing 40-70% of keystrokes using rollover, irrespective of whether they touch type or not. "This strategy is only possible for highly practiced letter combinations and when performance does not rely on visual attention," says Anna Feit. "An important goal for typing is to learn not to look at fingers."
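As a concrete illustration, here is a minimal sketch of how rollover could be detected from logged keystroke events. The event format and the timestamps are hypothetical, not the study's actual data format.

```python
# Each event is (key, keydown_time, keyup_time) in seconds; the values
# below are hypothetical. A keystroke counts as rollover when its keydown
# occurs before the previous key's keyup.
def rollover_fraction(events):
    """Fraction of keystrokes (after the first) typed with rollover."""
    pairs = list(zip(events, events[1:]))
    rollovers = sum(1 for prev, cur in pairs if cur[1] < prev[2])
    return rollovers / len(pairs) if pairs else 0.0

events = [
    ("t", 0.00, 0.09),
    ("h", 0.06, 0.15),  # pressed while "t" was still down: rollover
    ("e", 0.18, 0.25),  # pressed after "h" was released: no rollover
]
print(rollover_fraction(events))  # 0.5
```

Note that a plain character log cannot capture this: detecting rollover requires separate keydown and keyup timestamps, which is why the study logged full keystroke timings.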

The results of the study change our understanding of typing performance. Most of our knowledge of how people type is based on studies from the typewriter era. People presently make different types of mistakes: more errors where a letter is replaced by another one, whereas in the typewriter era typists often added characters or omitted one. Also, modern users use their hands differently. Anna Feit explains: "Modern keyboards allow typing keys with different fingers of the same hand with much less force than what was possible with typewriters. This partially explains why self-taught typing techniques using fewer than ten fingers can be as fast as the touch typing system, which was probably not the case in the typewriter era."

The large dataset makes it possible to infer reliable statistics about the typing performance of modern computer users. Average users in this study typed 52 words per minute, much less than the professionally trained typists studied in the '70s and '80s, who typically reached 60-90 words per minute. However, performance varied widely. "The fastest users in our study typed 120 words per minute, which is amazing given that this is a controlled study with randomized phrases," says Prof. Antti Oulasvirta. "Many informal tests allow users to practice the sentences, allowing unrealistically high performance."
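The words-per-minute figures above follow the usual typing-research convention, assumed here since the article doesn't spell it out, that one "word" is five characters, including spaces:

```python
def words_per_minute(n_chars, seconds):
    """Typing speed using the standard 5-characters-per-word convention."""
    return (n_chars / 5) / (seconds / 60)

# An average participant's pace: 260 characters in one minute is 52 wpm.
print(words_per_minute(260, 60))  # 52.0
```

Defining speed over characters rather than literal words makes results comparable across sentences with different word lengths, which matters in a test built from randomized phrases.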

Users who took the typing test also provided demographic information and answered questions about their typing experience. It was found that users who had taken a typing course in the past showed very similar typing behavior to those who never took such a course - in how fast they type, how they use their hands and the errors they make - even though they use fewer fingers. This is in line with previous research suggesting that users on the one hand develop their own efficient techniques and on the other forget the "pure" touch typing system.

Analyzing the individual keypresses, the researchers found that users exhibit different typing styles, characterized by how they use their hands and fingers, their use of rollover, tapping speeds, and typing accuracy. For example, they could classify some users as "careless typists" who move their fingers quickly but have to correct many mistakes, and others as attentive, error-free typists who gain speed by moving hands and fingers in parallel, pressing the next key before the first one is released. It is now possible for a computer to classify users' typing behavior simply from the observed keystroke timings, without storing the text users have typed. Such information can be useful, for example, for spell checkers, or to create new personalised training programmes for typing.

Anna Feit says: "You do not need to change to the touch typing system if you want to learn to type faster. A few simple exercises can help you to improve your own typing technique". She suggests the following practices:

Pay attention to errors, as errors are costly to correct. Slow down to avoid them and you will be faster in the long run.

Learn to type without looking at fingers; your motor system will automatically pick up very fast ''trills'' for frequently occurring letter combinations ("the"), which will speed up your typing tremendously. Being able to look at the screen while typing also allows you to quickly detect mistakes.

Practice rollover: use different fingers for successive letter keys instead of moving a single finger from one key to another. Then, when typing a letter with one finger, press the next key with another finger before releasing the first.

Take an online typing test occasionally to track performance and to identify weaknesses such as high error rates. Make sure that the test requires you to type new sentences so you do not overpractice the same text. Our scientific typing test gives you a reliable estimate of your typing performance.

Dedicate time to practice deliberately. People may forget the good habits and relapse to detrimental ways of typing.

The study will be presented at the world's largest computer-human interaction conference, CHI, in Montreal, Canada, in April 2018, where it was recognized as being among the top 5% of publications. The study was done in collaboration with TypingMaster.com.

Credit: 
Aalto University

Why noise can enhance sensitivity to weak signals

image: Conceptual illustration showing that sensitivity of the bistable system becomes high when Gaussian noise is imposed to a weak signal.

Image: 
Kasai S., et al., Applied Physics Express, Feb. 16, 2018.

A team of Japanese researchers has discovered a new mechanism to explain stochastic resonance, in which sensitivity to weak signals is enhanced by noise. The finding is expected to help electronic devices become smaller and more energy-efficient.

Noise is generally a nuisance that drowns out small signals. For example, it can prevent you from catching what your partner is saying during a conversation. However, it is known that living organisms find it easier to detect predators in noisy environments, since noise enhances the sensitivity of their sensory organs. This phenomenon, called stochastic resonance, is considered to be of great use for engineering devices and addressing noise problems in various other fields. However, there have been no convincing explanations as to why noise enhances sensitivity to weak signals since the initial report of the phenomenon in 1981.

One stumbling block preventing researchers from fully understanding the phenomenon is the complexity of nonlinear theories involving friction and fluctuation, both considered essential for the phenomenon.

To address this problem, the team, comprising Hokkaido University Professor Seiya Kasai, Associate Professor Akihisa Ichiki of Nagoya University, and Senior Researcher Yukihiro Tadokoro of Toyota Central R&D Labs., Inc., established a simple model that excluded friction force, a parameter that they consider negligible in nano- and molecular-scale systems.

The researchers found correlations between sensitivity and noise in a bistable system, a nonlinear system that has two stable states and allows the transition between them depending on input values, like a seesaw. They also figured out the role of white Gaussian noise, the most standard noise widely found in the natural world.

When a transition occurs without friction, the sensitivity of the bistable system to a Gaussian-noise-imposed weak signal becomes significantly high. Furthermore, the researchers found that the relative difference - which determines the sensitivity - of the Gaussian distribution function diverges at its tail. This means that the sensitivity becomes anomalously high as the threshold of the bistable system is increased. This theory was experimentally verified with an electronic two-state device called a Schmitt trigger.
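The basic effect can be reproduced in a few lines. The sketch below is an illustrative simulation, not the team's model: a sine wave too weak to cross the threshold on its own is fed to a software Schmitt trigger, and the output tracks the signal only once Gaussian noise is added.

```python
import numpy as np

rng = np.random.default_rng(0)

def schmitt_trigger(x, threshold=1.0):
    """Bistable element with hysteresis: switches high when the input
    exceeds +threshold and low when it falls below -threshold."""
    state, out = -1.0, np.empty_like(x)
    for i, v in enumerate(x):
        if v > threshold:
            state = 1.0
        elif v < -threshold:
            state = -1.0
        out[i] = state
    return out

t = np.linspace(0.0, 20.0 * np.pi, 20000)
signal = 0.5 * np.sin(t)  # weak signal: amplitude below the 1.0 threshold

def alignment(noise_std):
    """How well the trigger output follows the sign of the weak signal."""
    out = schmitt_trigger(signal + rng.normal(0.0, noise_std, signal.size))
    return float(np.mean(out * np.sign(signal)))

quiet = alignment(0.0)     # no noise: the trigger never fires
moderate = alignment(0.6)  # moderate noise: transitions track the signal
```

Without noise the sub-threshold signal never flips the state, so the output carries no information about it; with moderate noise the transitions cluster around the signal's peaks and the alignment becomes clearly positive. Pushing the noise far higher degrades the alignment again, which is the characteristic resonance at an intermediate noise level.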

The finding is expected to pave the way for using noise rather than eliminating it, which will contribute to establishing new technologies. It could help electronic devices become smaller and more energy-efficient. "Since Gaussian noise is commonly found, our study should help us better understand various nonlinear and fluctuating phenomena in the natural world and society," says Kasai.

Credit: 
Hokkaido University