
NIST study shows face recognition experts perform better with AI as partner

image: Do these two faces show the same person? Trained specialists called forensic face examiners testify about such questions in court. A NIST study measuring their accuracy reveals the science behind their work for the first time.

Image: 
J. Stoughton/NIST

Experts at recognizing faces often play a crucial role in criminal cases. A photo from a security camera can mean prison or freedom for a defendant--and testimony from highly trained forensic face examiners informs the jury whether that image actually depicts the accused. Just how good are facial recognition experts? Would artificial intelligence help?

A study appearing today in the Proceedings of the National Academy of Sciences has brought answers. In work that combines forensic science with psychology and computer vision research, a team of scientists from the National Institute of Standards and Technology (NIST) and three universities has tested the accuracy of professional face identifiers, providing at least one revelation that surprised even the researchers: Trained human beings perform best with a computer as a partner, not another person.

"This is the first study to measure face identification accuracy for professional forensic facial examiners, working under circumstances that apply in real-world casework," said NIST electronic engineer P. Jonathon Phillips. "Our deeper goal was to find better ways to increase the accuracy of forensic facial comparisons."

The team's effort began in response to a 2009 report by the National Research Council, "Strengthening Forensic Science in the United States: A Path Forward", which underscored the need to measure the accuracy of forensic examiner decisions.

The NIST study is the most comprehensive examination to date of face identification performance across a large, varied group of people. It also examines the best available technology, comparing the accuracy of state-of-the-art face recognition algorithms with that of human experts.

Their result from this classic confrontation of human versus machine? Neither gets the best results alone. Maximum accuracy was achieved with a collaboration between the two.

"Societies rely on the expertise and training of professional forensic facial examiners, because their judgments are thought to be best," said co-author Alice O'Toole, a professor of cognitive science at the University of Texas at Dallas. "However, we learned that to get the most highly accurate face identification, we should combine the strengths of humans and machines."

The results arrive at a timely moment in the development of facial recognition technology, which has been advancing for decades, but has only very recently attained competence approaching that of top-performing humans.

"If we had done this study three years ago, the best computer algorithm's performance would have been comparable to an average untrained student," Phillips said. "Nowadays, state-of-the-art algorithms perform as well as a highly trained professional."

The study itself involved a total of 184 participants, a large number for an experiment of this type. Eighty-seven were trained professional facial examiners, while 13 were "super recognizers," a term implying exceptional natural ability. The remaining 84--the control groups--included 53 fingerprint examiners and 31 undergraduate students, none of whom had training in facial comparisons.

For the test, the participants received 20 pairs of face images and rated the likelihood of each pair being the same person on a seven-point scale. The research team intentionally selected extremely challenging pairs, using images taken with limited control of illumination, expression and appearance. They then tested four of the latest computerized facial recognition algorithms, all developed between 2015 and 2017, using the same image pairs.

Three of the algorithms were developed by Rama Chellappa, a professor of electrical and computer engineering at the University of Maryland, and his team, who contributed to the study. The algorithms were trained to work in general face recognition situations and were applied without modification to the image sets.

One of the findings was unsurprising but significant to the justice system: The trained professionals did significantly better than the untrained control groups. This result established the superior ability of the trained examiners, thus providing for the first time a scientific basis for their testimony in court.

The algorithms also acquitted themselves well, as might be expected from the steady improvement in algorithm performance over the past few years.

What raised the team's collective eyebrows was the performance of examiners working in combination: the team discovered that pooling the opinions of multiple forensic face examiners did not bring the most accurate results.

"Our data show that the best results come from a single facial examiner working with a single top-performing algorithm," Phillips said. "While combining two human examiners does improve accuracy, it's not as good as combining one examiner and the best algorithm."

Combining examiners and AI is not currently done in real-world forensic casework. While this study did not explicitly test this fusion of examiners and AI in an operational forensic environment, the results provide a roadmap for improving the accuracy of face identification in future systems.

While the three-year project has revealed that humans and algorithms use different approaches to compare faces, it poses a tantalizing question to other scientists: Just what is the underlying distinction between the human and the algorithmic approach?

"If combining decisions from two sources increases accuracy, then this method demonstrates the existence of different strategies," Phillips said. "But it does not explain how the strategies are different."

Credit: 
National Institute of Standards and Technology (NIST)

Cell chat: Attacking disease by learning the language of cells

image: In the cover image of a forthcoming issue of Small, a single lymphoma cell is isolated on the new biosensor (magnified 2,700 times).

Image: 
EPFL

Breakthrough lab-on-a-chip technology that reveals how human cells communicate could lead to new treatments for cancer and autoimmune disorders.

Developed by an Australian-Swiss research team, the technology offers researchers unprecedented insights into how individual cells behave - something that scientists are discovering is far more complex than previously thought.

The researchers from RMIT University, École polytechnique fédérale de Lausanne (EPFL) and Ludwig Institute for Cancer Research in Lausanne joined forces to build a miniature biosensor that allows scientists to isolate single cells, analyse them in real time and observe their complex signalling behaviour without disturbing their environment.

Distinguished Professor Arnan Mitchell, Director of RMIT's MicroNano Research Facility, said single cell analysis held great promise for developing new treatments for diseases but a lack of effective analysis technologies was holding back research in the field.

"We know a lot about how groups of cells communicate to fight disease or respond to infections but we still have a lot to learn about individual cells," Mitchell said.

"Studies have recently shown that you can take two cells of the same type and give them the same treatment but they will respond very differently.

"We don't know enough about the underlying mechanisms to understand why this happens and we don't have the right technologies to help scientists figure it out.

"Our solution to this challenge is a complete package - an integrated optofluidic biosensor that can isolate single cells and monitor the chemicals they produce in real-time over at least 12 hours.

"It's a powerful new tool that will give us a deeper fundamental understanding of cell communication and behaviour. These insights will open the way to develop radically new methods for diagnosing and treating disease."

Human cells communicate that something is wrong in complex and dynamic ways, producing various chemical substances that signal to other cells what they need to do. When an infection is detected, for example, white blood cells will spring into action and release special proteins to fight and eliminate the intruders.

Understanding how individual cells interact and communicate is critical to developing new therapies for serious diseases, to better harness the power of the body's own immune system or precisely target defective cells.

In a paper published in the high-impact journal Small, the research team demonstrate how the technology can be used to examine the secretion of cytokines from single lymphoma cells.

Cytokines are small proteins produced by a broad range of cells to communicate to other cells, and they are known to play an important role in responses to infection, immune disorders, inflammation, sepsis and cancer.

The study found the lymphoma cells produced cytokine in different ways, unique to each cell, enabling researchers to determine each cell's "secretion fingerprints".

"If we can build up a clear picture of this behaviour, this would help us sort good cells from bad and enable us to one day develop treatments that precisely target just those bad cells," Mitchell said.

How it works

The biosensor is the latest adaptation of microfluidic lab-on-a-chip technology developed in RMIT's MicroNano Research Facility.

A microfluidic chip contains tiny channels, pumps and processors, enabling precise and flexible manipulation of fluids. Essentially, microfluidics does for fluids what microelectronics does for information - integrating vast quantities of tiny processing elements into a small chip that is portable, fast and can be produced quickly and efficiently.

The new cost-effective and scalable technology is lightweight and portable, combining microfluidics with nanophotonics.

Compatible with traditional microscopes, the biosensor is a thin glass slide coated with a gold film, perforated with billions of tiny nanoholes arranged in a specific pattern. These nanoholes transmit a single colour of light, due to an optical phenomenon known as the plasmonic effect.

By observing the colour transmitted, researchers can determine the presence of minute quantities of specific chemicals on a slide without any external labels. This detection method enables the continuous monitoring of the chemicals produced by a single cell in real time.
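As a rough illustration of how such a label-free optical readout can be turned into a quantitative measurement, the sketch below maps a measured shift of the transmitted resonance wavelength to a concentration through a linear calibration curve. The calibration points, the linear model and the units are invented for illustration; they are not parameters of the RMIT/EPFL biosensor.

```python
import numpy as np

# Hypothetical calibration data: resonance wavelength shift (nm) measured for
# known cytokine concentrations (ng/mL). Values are invented for illustration.
known_conc = np.array([0.0, 1.0, 2.0, 5.0, 10.0])   # ng/mL
known_shift = np.array([0.0, 0.4, 0.8, 1.9, 3.8])   # nm

# Fit a simple linear calibration: shift = slope * concentration + intercept.
slope, intercept = np.polyfit(known_conc, known_shift, 1)

def estimate_concentration(measured_shift_nm):
    """Map an observed resonance shift back to an estimated concentration."""
    return (measured_shift_nm - intercept) / slope

print(estimate_concentration(1.5))  # roughly 3.9 ng/mL with this toy calibration
```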

The nanophotonic sensor is coupled to a microfluidic integrated circuit with fluid channels about the size of a human hair. The circuit includes valves to isolate the cell and concentrate its secretions, and systems to regulate the temperature and humidity to sustain the cell.

The work is a collaboration between the laboratory of Bionanophotonic Systems at EPFL, Switzerland, the Integrated Photonics and Applications Centre in the School of Engineering at RMIT and Ludwig Institute for Cancer Research, Switzerland.

RMIT microfluidic chips have been pivotal in enabling research across a range of areas - from water quality monitoring to the development of point-of-care blood tests for suspected heart attacks that could deliver results while a patient is still in an ambulance.

Credit: 
RMIT University

Limiting global warming could avoid millions of dengue fever cases

Limiting global warming to 1.5°C could avoid around 3.3 million cases of dengue fever per year in Latin America and the Caribbean alone - according to new research from the University of East Anglia (UEA).

A new report published today in the Proceedings of the National Academy of Sciences (PNAS) reveals that limiting warming to the goal of the UN Paris Agreement would also stop dengue spreading to areas where incidence is currently low.

A global warming trajectory of 3.7°C could lead to up to 7.5 million additional dengue cases per year by the middle of this century.

Dengue fever is a tropical disease caused by a virus that is spread by mosquitoes, with symptoms including fever, headache, muscle and joint pain. It is endemic to over 100 countries, and infects around 390 million people worldwide each year, with an estimated 54 million cases in Latin America and the Caribbean.

Because the mosquitoes that carry and transmit the virus thrive in warm and humid conditions, it is more commonly found in areas with these weather conditions. There is no specific treatment or vaccine for dengue and in rare cases it can be lethal.

Lead researcher Dr Felipe Colón-González, from UEA's School of Environmental Sciences and the Tyndall Centre for Climate Change Research, said: "There is growing concern about the potential impacts of climate change on human health. While it is recognised that limiting warming to 1.5°C would have benefits for human health, the magnitude of these benefits remains mostly unquantified.

"This is the first study to show that reductions in warming from 2°C to 1.5°C could have important health benefits."

The Paris Climate Agreement aims to hold the rise in global-mean temperature to well below 2°C above preindustrial levels and to pursue efforts to limit it to 1.5°C.

The team studied clinical and laboratory confirmed dengue reports in Latin America and used computer models to predict the impacts of warming under different climate scenarios.

They found that limiting global warming to 2°C could reduce dengue cases by up to 2.8 million per year by the end of the century, compared to a scenario in which the global temperature rises by 3.7°C.

Limiting warming further to 1.5°C produces an additional drop of up to half a million cases per year, which together with the 2°C reduction accounts for the roughly 3.3 million avoided cases.

Southern Mexico, the Caribbean, northern Ecuador, Colombia, Venezuela and coastal Brazil will be most affected by increases in dengue cases.

Brazil would benefit the most from limiting warming to 1.5°C with up to half a million cases avoided per year by the 2050s and 1.4 million avoided cases per year by 2100.

The team also found that limiting global warming would limit the expansion of the disease towards areas where incidence is currently low, such as Paraguay and northern Argentina.

Co-author Dr Iain Lake, also from UEA, added: "Understanding and quantifying the impacts of warming on human health is crucial for public health preparedness and response.

"Warming has already reached 1°C above pre-industrial levels, and the current trajectory, if countries meet their international pledges to reduce CO2, is around 3°C - so clearly a lot more needs to be done to reduce CO2 and quickly if we are to avoid these impacts."

The research was led by the University of East Anglia, UK, in collaboration with colleagues at Universidade do Estado de Mato Grosso, Brazil.

Credit: 
University of East Anglia

New study investigates dolphin liberation in Korea

image: From left are Sejoon Kim in the School of Energy and Chemical Engineering and Professor Bradley Tatar in the Division of General Studies at UNIST.

Image: 
UNIST

"Dolphin liberation in South Korea has raised awareness towards the welfare of marine animals and has resulted in the strengthening of animal protection policies and the level of welfare."

An engineering student affiliated with UNIST has recently carried out a scientific investigation of dolphin liberation in South Korea. The paper presents an overall analysis of the social impact of the first case of dolphin rehabilitation in Asia, which occurred in 2013.

This study was carried out by Sejoon Kim in the School of Energy and Chemical Engineering in collaboration with Professor Bradley Tatar in the Division of General Studies at UNIST. Their findings have been published in the April issue of the journal Coastal Management and will appear online this month.

"After the release of captive dolphins from South Korean marine parks, there has been a growing environmental movement towards the conservation and management of marine and coastal ecosystems," says Sejoon. "Although such movement relies on a single-species conservation focus and does not encompass an entire ecosystem, it has enormous symbolic significance for the welfare of marine animals."

The research team hopes to expand their research to areas beyond the study of dolphin liberation and carry out in-depth case studies on various topics, including the whale-eating culture in Ulsan, the public perspective of dolphin shows, as well as the establishment of new types of dolphin life experience facilities.

Credit: 
Ulsan National Institute of Science and Technology (UNIST)

Scientific 'dream team' shed light on motor neuron death

image: This is an image of the collaborative team. From left: Nick Luscombe, Rickie Patani, Raphaelle Luisier and baby Agnes, Giulia Tyzack and Jernej Ule.

Image: 
Greta Keenan

As the old adage goes, 'two heads are better than one'. With the development of new technologies and increasingly specialist expertise, ground-breaking science needs to be a team effort.

But it isn't always easy for researchers to work together. Finding the right people to collaborate with can be tricky, especially when some are understandably protective of their ideas.

Then there is the practical challenge of meeting up, exchanging ideas and carrying out the research, miles - and even time zones - apart.

A new study from Crick researchers shows that collaboration can be easy when you are part of a culture that supports it. We caught up with the scientists involved to find out how working together helped them uncover the earliest events in motor neuron disease.

The mystery

Three years ago, a group of clinical neurologists, molecular biologists and computer scientists from different London institutes decided to work together to solve the mystery of why motor neurons die in patients with amyotrophic lateral sclerosis (ALS), also known as motor neuron disease.

As a clinical neurologist, Rickie Patani sees first-hand the impact that ALS has on his patients.

"It's a really devastating disease," he says. "Patients progressively lose the ability to move, eat, speak and ultimately breathe.

"We set out to uncover the molecular events that lead to ALS, in the hope that one day we can develop new treatments for patients."

The suspect

Previous studies had implicated deregulation of RNA - a molecule closely related to DNA that has a vital role in coding, decoding, regulating and expressing genes - in ALS. For instance, patients with a hereditary form of ALS often have genetic mutations that prevent their RNA from functioning properly.

But even with RNA expert Jernej Ule on board, comparing RNA sequencing in healthy and diseased motor neurons couldn't provide the full picture.

Turning back the clock

Using cutting-edge stem cell technology, scientists in Rickie's lab took skin cells from healthy volunteers and patients with ALS and turned them into stem cells capable of becoming many other cell types.

Then, using specific chemical signals, they 'guided' the stem cells into becoming motor neurons that they could study in the lab.

"By turning back the clock, we could watch what happened to the motor neurons over time to lead to the disease," says Giulia Tyzack, a researcher in Rickie's lab. "It was really amazing!"

Digging for treasure

Armed with a whole load of RNA sequencing data from healthy and diseased motor neurons at different stages of disease progression, Jernej and Rickie turned to Nick Luscombe and Raphaelle Luisier to drill down into the data and work out exactly what was going wrong. Nick and Raphaelle are bioinformaticians: highly skilled scientists who develop advanced computational techniques to study biological data.

"Initially, using conventional analysis, we didn't detect any differences in RNA sequencing between healthy and diseased motor neurons," says Raphaelle. "But we knew something must have been going wrong to make the ALS motor neurons die, so we wrote a new program to dig deeper into the genetic code - and when the results came back, we knew we were on to something."

The analysis unearthed what was going wrong in ALS motor neurons. Parts of the RNA sequence that don't code for proteins are usually cut out before the RNA is translated into protein, but in the ALS motor neurons this wasn't happening as effectively. This guided the team to collectively discover that a protein called SFPQ, which normally resides inside the cell nucleus, was in fact leaving the nucleus in diseased motor neurons.

"It was like one big treasure hunt," says Nick. "We had the map, and knew where we were looking, and with enough digging we found the gold!"

Cracking the case

The team had uncovered these molecular hallmarks inside human stem cell models of hereditary ALS. They next confirmed that animal models of hereditary ALS also shared the same features. But to see if the same events could explain non-hereditary forms of the disease, they looked at post-mortem spinal cord tissue from patients.

They found that the loss of SFPQ protein was consistent across the board, whether they looked at cells, mouse models or post-mortem tissue, confirming that they had discovered an important molecular hallmark of ALS.

"Now that we know these key events are linked to motor neuron death in people with ALS, we can start to think about how we could develop new ways to detect and treat the disease," says Rickie.

Under one roof

This project started before the Francis Crick Institute opened its doors to scientists in the summer of 2016.

For the first couple of years, the team worked together across different sites, meeting up to share ideas when they could.

But since spring 2017, this scientific 'dream team' have all come together under one roof here at the Crick.

"It's unbelievable how much of a difference it made all being two minutes' walk from each other," says Jernej. "The project was going well even when we were working in different institutes in London, but being able to chat to each other almost every day speeds things up dramatically. We finished the project within a year, while it might have taken two years or more if we weren't all here at the Crick."

By combining cellular models of motor neuron development, measurements of protein-RNA interactions and detailed statistical analysis, this diverse team of Crick scientists have shed light on potential causes of ALS, opening new opportunities to intervene and develop treatments.

This discovery goes to show that when it comes to science, two heads (or more) really are better than one.

Credit: 
The Francis Crick Institute

How scientists analyse cell membranes

image: A new compound mimicking natural cholesterol in membranes of living cells (here: HeLa cells). The substance is labelled with a fluorescent dye (red).

Image: 
© L. Rakers et al./Cell Chem Biol

Exchange of material and information at the level of individual cells requires transport and signalling at the level of the plasma membrane enclosing the cell. Studying mechanisms at such tiny dimensions presents researchers with enormous challenges - for example, when they want to find out how an important component of the membrane - cholesterol - behaves and is distributed. So far, cholesterol has been very difficult to label with fluorescent dyes that can be visualized under the microscope without damaging the membrane. Researchers at the University of Münster (Germany) have now developed a method which enables them to circumvent these difficulties. They synthesized a new type of compound which has properties similar to those of cholesterol, but which can be labelled with dyes and visualized in living cells. There, the compound realistically mimics the behaviour of natural cholesterol. "Our new approach offers enormous potential for imaging membrane dynamics in living cells," says Prof. Volker Gerke, one of the leaders of the study and Coordinator at the Cells-in-Motion Cluster of Excellence. The work is the result of an interdisciplinary study involving organic chemists, biochemists and biophysicists. The study appears in the current issue of the journal Cell Chemical Biology.

The detailed story:

Cells in the body are enclosed in a kind of protective envelope - the plasma membrane, which separates the cell from its environment. Cells also contain internal membranes which separate the individual components of the cell from each other and regulate the movement of substances between the different "spaces". Cholesterol, a fatlike substance, is an important component of membranes ensuring that they work properly.

Synthesis of new compounds

In order to generate substances which behave similarly to natural cholesterol, the research group of organic chemists led by Prof. Frank Glorius first synthesized a series of chemical compounds. As a starting substance they used natural cholesterol, which was transformed into a certain organic salt, an imidazolium salt. "We already knew from previous studies that these salts interact well with biomolecules and are therefore suitable for cellular experiments," says Frank Glorius, who also led the study. In order to compare the biophysical properties of the newly synthesized compounds with those of natural cholesterol, the researchers incorporated the substances into synthetic model membranes consisting of phospholipids (these phospholipids constitute the main component of membranes). Biochemists and biophysicists at the Cluster of Excellence in the group of Prof. Dr. Hans-Joachim Galla measured, among other things, how the new substances affected the phase transition temperature of model membranes, and how they changed the fluidity in the phospholipid layer at different temperatures. "After evaluating the data, we finally settled on three compounds which exhibited very similar properties to those of natural cholesterol," says Lena Rakers, a PhD student of Organic Chemistry and one of the two first authors of the study.

Experiments in living cells

The researchers selected these compounds in order to examine them in living cellular membranes, thereby studying them in even more complex structures. For this purpose, they used cultures of human epithelial cells - HeLa cells - as well as cells from human blood vessels, HUVEC cells. Due to their structure, the newly synthesized substances fitted well into the cellular membranes. With the aid of surface mass spectrometry, the researchers measured the molecules in the membrane and could show that the compounds behaved in a very similar way to natural cholesterol in living cells, too.

Because of its structure, one of the new substances could be labelled with fluorescent dyes. To this end, the researchers attached an azide group onto the substance. They then linked the dyes to this azide group using click chemistry - an effective method enabling molecular components to be joined on the basis of a few chemical reactions. Finally, the biochemists visualized the substance in living cells using high-resolution confocal microscopy. In this way, they were able to observe its distribution and dynamic changes. "These analyses also showed that the novel compound behaved analogously to cellular cholesterol," says David Grill, a PhD student of Biochemistry and the other first author of the study. One great advantage of the new method is that during the entire process the components and the properties of the cellular membrane remained undamaged.

In the future the researchers want to continue developing their method and test the new substances in further cellular studies using a variety of microscopic imaging methods. One of their aims is to use click chemistry to attach fluorescent dyes and other molecules to the new compounds to eventually introduce selective changes in the membrane.

Credit: 
University of Münster

Tau mutations may increase cancer risk

PHILADELPHIA -- Mutations to the protein tau, commonly associated with neurodegenerative disorders, may serve as a novel risk factor for cancer, according to results published in Cancer Research, a journal of the American Association for Cancer Research.

"Our study revealed that the presence of tau mutations raises the risk of developing cancer," said Fabrizio Tagliavini, MD, scientific director, IRCCS Foundation Carlo Besta Neurological Institute, Milan, Italy. "Furthermore, our bioinformatic analysis highlighted a broader functional environment for the tau protein, which had been previously associated mainly with disease development in the context of neurodegeneration."

Tau protein is essential for the stabilization of microtubules, a major element of the eukaryotic cytoskeleton. Defective tau protein is traditionally associated with neurodegenerative disorders, such as Alzheimer's disease and frontotemporal lobar degeneration (FTLD). "A mutated tau has a reduced ability to bind to microtubules; this leads to microtubule destabilization and cytoskeleton disruption, which is detrimental to cellular survival," explained Tagliavini. "Additionally, free tau protein can form toxic aggregates within nerve cells, impairing neuronal function."

Previous work in the Tagliavini lab found that mutations in tau led to chromatin defects and chromosome abnormalities. "It is well-known that chromosome aberrations are often linked to cancer," said Tagliavini.

"Therefore, we decided to determine if there was a possible association between tau mutations and cancer."

Tagliavini and colleagues analyzed cancer incidence in 15 families bearing seven different tau mutations and affected by FTLD. To calculate cancer risk, each tau-mutated family was matched with three reference families with superimposable pedigrees (control subject's age, gender, and native location matching the person affected with FTLD).

Fifteen percent of subjects from tau-mutated families developed cancer, while only 9 percent of subjects from the reference families had cancer. Cancer types in both cohorts were variable; tau mutations were not associated with specific cancers. Following multivariate analysis, the researchers determined that individuals from tau-mutated families were 3.72 times more likely to develop cancer compared to the reference families.
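For intuition, the raw percentages alone give a crude risk ratio of roughly 1.7 (15 percent versus 9 percent); the 3.72-fold figure is the adjusted estimate from the authors' multivariate analysis. A minimal sketch of the crude calculation, with placeholder group sizes (only the proportions come from the study), might look like this:

```python
# Crude (unadjusted) risk ratio from the reported percentages.
# The cohort sizes below are placeholders chosen for illustration.
cases_mut, n_mut = 15, 100   # 15 percent of tau-mutated family members developed cancer
cases_ref, n_ref = 9, 100    # 9 percent of matched reference family members did

crude_rr = (cases_mut / n_mut) / (cases_ref / n_ref)
print(f"crude risk ratio = {crude_rr:.2f}")   # about 1.67

# The reported 3.72-fold increase is an adjusted estimate from a multivariate
# analysis, which is why it differs from this crude ratio.
```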

The researchers also used a bioinformatics analysis to understand the interactions of tau protein with other proteins. They found that almost a third of proteins that tau interacts with were involved in DNA metabolism and cell cycle control; aberrant regulation of these key processes can lead to cancer, explained Tagliavini.

"Patients carrying tau mutations are usually attended for neurodegeneration," said Tagliavini. "However, with further confirmation of our results, these patients could also be monitored for their risk of developing cancer. Clinicians should take into account both of these aspects of tau pathology."

Limitations of the study include missing genetic analyses from several patients and individuals from reference families due to the unavailability of DNA. "The analysis of missing patient data would have allowed a more significant correlation between cancer and tau mutations, which had to be inferred with statistical analysis," noted Tagliavini.

Credit: 
American Association for Cancer Research

Some like it hot!

image: An adult gecko with a backpack containing an RFID tag attached to the lizard's skin. The RFID tag transmits its location and the gecko's body temperature.

Image: 
Annegret Grimm-Seyfarth, UFZ

The world of reptiles may well include creatures that are more spectacular than the Gehyra variegata, but nevertheless, this small nocturnal gecko has managed to make a couple of fascinating new contributions to the discussion about the ecological consequences of climate change.

These geckos are small grey- or brown-skinned lizards that grow to around five centimetres and spend their lives in the deserts of Australia. The hollowed-out trunks of eucalyptus trees are their preferred hiding places. After spending the night hunting insects, this is where they seek refuge from the heat in a climate where temperatures can often climb to more than 40 degrees Celsius.

And it is in exactly these deserts where climate scientists expect to see even more extreme conditions in the future. They are forecast to become even hotter and drier, worldwide. So how will the unique animals and plants that live in these ecosystems cope with these new challenges? This was the question the researchers set out to investigate, using this little gecko as a representative of other nocturnal desert inhabitants.

Prof. Klaus Henle, head of the Department of Conservation Biology at the Helmholtz Centre for Environmental Research, began collecting data about Gehyra variegata as far back as the 1980s. Working in the Kinchega National Park in Eastern Australia, he and his colleagues have been catching, measuring, photographing, tagging and then releasing reptiles for more than 30 years.

The researchers at the Helmholtz Centre for Environmental Research then collated this information with weather conditions in the National Park, and also with global climate phenomena. Their findings are surprising, to say the least. As biologist Annegret Grimm-Seyfarth said, "We expected the higher temperatures and greater dryness to have a negative effect both on the individual geckos and on their populations." After all, even lizards need a certain amount of moisture to ensure that their eggs develop properly, and to enable them to moult when they need to. If they dry out completely, they will die. And the same is also true if excessive temperatures cause them to overheat.

"But our investigations revealed that our geckos grow and survive particularly well in the very hottest years. In fact, they are generally in better condition, and their populations grow rather than fall". But what could be the reason? To discover this, Grimm-Seyfarth carefully observed the lizards' behaviour and measured their body temperature.

At night, she used an infrared thermometer, which can measure temperature at a distance, to locate the creatures while they were hunting. And then, to find where the geckos were hiding during the day, the scientists used tiny passive transmitters, similar to those used as ID chips for dogs. These chips are usually implanted under the skin. But when the tiny reptile is only five centimetres long, this isn't really feasible. So, the scientists created miniature backpacks for the geckos to wear, which keep the chips close to their skin. The researchers then use a radio antenna to locate the chips. The chips not only reveal where each lizard is, but also transmit its body temperature.

Despite the tremendous daytime heat in the desert, the scientists discovered that the geckos don't search out particularly cool places to hide. They prefer their refuges to have a temperature between 30 and 35 degrees Celsius. As Grimm-Seyfarth said, "The creatures need these high temperatures so that they can digest their food properly". Consequently, they also crawl around in branches that are particularly exposed to the sun. To her astonishment, Grimm-Seyfarth also found that, in cooler years, the geckos left the safety of their tree and deliberately sat out in the sun to bask in its heat. But searching for enough warmth also takes energy. And, if the search is unsuccessful, the gecko can't digest its food properly. This might be the reason that cooler years had a negative effect on the geckos.

However, even having the perfect temperature range isn't any good if it's also too dry. This doesn't just cause physical problems for these creatures. In particularly dry periods, there are also fewer insects for them to eat. As expected, geckos experience really hard times during periods of drought. But it's not just local precipitation levels that are the decisive factor. Every few years, the La Niña weather phenomenon brings torrential rainfall to the east coast of Australia. Months later, the rivers bring the resulting flood waters to the desert, increasing humidity levels and creating an abundance of insects. As Grimm-Seyfarth said, "These lizards are affected by local conditions and global climate phenomena alike. We need to look beyond the horizon of a particular area if we want to make an accurate prediction about the future of its inhabitants.

"Until now, all the evidence indicated that the geckos would face problems caused by drought rather than by heat. But now, we have discovered that they can also compensate for this, to a certain extent. The study shows that, although individual creatures become thinner in years of drought, their population levels remain constant. This is because they scale back their growth and reproduction rates in hard times", explained Grimm-Seyfarth. They can then concentrate all their efforts on surviving into the next year. Thanks to their exceptional longevity (up to 28 years in some cases), these creatures can afford to lose a few reproductive cycles without any particular problem. And when conditions improve again, they can make up for lost time.

Therefore, even if climate change causes deterioration in living conditions for these geckos, they're hardly likely to die out immediately. And, according to these evaluations by the scientists at the Helmholtz Centre for Environmental Research, these findings could also apply to other long-lived desert dwellers. However, this is in no way a simple carte blanche for letting climate change just happen. "If there are several very dry years in succession, the creatures will no longer be able to cope", said Grimm-Seyfarth. At some point, even the hardiest survival specialists will be overwhelmed.

Credit: 
Helmholtz Centre for Environmental Research - UFZ

Electron tomography technique leads to 3-D reconstructions at the nanoscale

image: This is a schematic of the proposed TEM 3-D atomic imaging with the multi-slice method, showing four examples of noisy intensity measurements at different angles of rotation, along with 3-D atomic potential reconstructions and 1-D cross-sections along the x and y directions.

Image: 
David Ren

Orlando, FL - Understanding the microscopic structure of a material is key to understanding how it functions and what properties it exhibits. Advances in fields like materials science have increasingly pushed the ability to determine these features to ever higher resolutions. Transmission electron microscopy (TEM), a technique for imaging at nanoscale resolution, is one promising technology in this area. Scientists recently found a way to harness the power of TEM to measure the structure of a material at the highest possible resolution - determining the 3D position of every individual atom.

Presenting their work at the OSA Imaging and Applied Optics Congress 25-28 June, in Orlando, Florida, USA, a team of researchers has demonstrated a technique using TEM tomography to determine the 3D positions of strongly scattering atoms. Through simulation, the group showed that it is possible to reconstruct the atomic potentials with atomic resolution using only image intensity measurements, and that it's possible to do so on molecules that are very sensitive to electron beams.

"Transmission electron microscopy is used extensively in both materials science and biology," said Colin Ophus, National Center for Electron Microscopy, Lawrence Berkeley National Lab, Berkeley, California, and a member of the research team. "Because we fully solve the nonlinear propagation of the electron beam, our tomographic reconstruction method will enable more quantitative reconstruction of weakly scattering samples, at higher or even atomic resolution."

Similar to the way computerized tomography (CT) scans performed for medical imaging in hospitals are built from a series of two-dimensional cross-sectional images at different increments, electron tomography constructs a three-dimensional volume by rotating samples incrementally and collecting two-dimensional images. While most CT imaging in hospitals is done with x-rays to determine features of larger things like bones, the beams of electrons used in TEM allow researchers to look with significantly higher resolution, down to the atomic scale.

"However, on the atomic scale we cannot neglect the very complex quantum mechanical effects of the sample on the electron beam," Ophus said. "This means in our work, we must use a much more sophisticated algorithm to recover the atomic structure than those used in an MRI or CT scan."

The TEM setup the group used measured the intensity hitting the microscope's sensor, which is proportional to the number of electrons that hit the sensor - a number that depends on how the electron beam is configured for each experiment. Using the intensity data, the new algorithm designed by the group stitched the two-dimensional projected images into a 3D volume.
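For comparison, conventional electron tomography is often described with a linear projection model, where each measured image is treated as a simple sum through the sample. The sketch below shows a textbook unfiltered back-projection of a 2D slice from 1D projections, assuming NumPy and SciPy are available. It is not the team's multi-slice algorithm, which additionally solves the nonlinear propagation of the electron beam, but it illustrates how projections acquired at different angles are stitched into a volume.

```python
import numpy as np
from scipy.ndimage import rotate

def backproject(sinogram, angles_deg):
    """Unfiltered back-projection of 1D projections into a 2D slice.
    sinogram has shape (n_angles, n_pixels). Textbook linear baseline only."""
    n = sinogram.shape[1]
    recon = np.zeros((n, n))
    for proj, angle in zip(sinogram, angles_deg):
        smear = np.tile(proj, (n, 1))                          # smear the projection across the plane
        recon += rotate(smear, angle, reshape=False, order=1)  # rotate back to the acquisition angle
    return recon / len(angles_deg)

# Toy usage: project a simple square phantom over a tilt series, then reconstruct it.
phantom = np.zeros((64, 64))
phantom[24:40, 24:40] = 1.0
angles = np.linspace(0.0, 180.0, 60, endpoint=False)
sinogram = np.array([rotate(phantom, -a, reshape=False, order=1).sum(axis=0) for a in angles])
reconstruction = backproject(sinogram, angles)                 # blurred but recognizable square
```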

Making the jump to three dimensions with large fields of view, however, can tax computers far more heavily than dealing with single 2D images. To work around this, the researchers modified their algorithm to run on graphics processing units (GPUs), which can perform many times more mathematical operations in parallel than typical central processing units (CPUs).

"We are able to obtain results in a reasonable amount of time for realistic sample dimensions," said David Ren, a member of the team.

With generally weaker bonds between their atoms, biomolecules can be notoriously difficult to study using TEM because the electron beams used to study a metal alloy, for example, would typically tear a biomolecule apart. Lowering the electron dosage in a sample, though, can create images so noisy that other algorithms currently in use can't reconstruct a 3D image. Thanks to a more precise physical model, the team's new algorithm can.

Now that they have fully developed the reconstruction algorithm, the team said they hope to apply what they've observed from simulations to experimental data. They plan to make all of their reconstruction codes available as open source for the wider research community.

Credit: 
Optica

To manage weight, it may matter when protein supplements are consumed

WEST LAFAYETTE, Ind. -- People looking to manage their weight with strength-training and protein supplements should consume their supplements during a meal, according to a research review by nutrition experts at Purdue University.

"It may matter when you take your supplements in relation to when you eat meals, so people who consume protein supplements in between meals as snacks may be less likely to be successful in managing their body weight," said Wayne Campbell, professor of nutrition science and senior author on the study.

The findings are published in Nutrition Reviews. The study is led by Joshua Hudson, a Purdue postdoctoral research associate. Robert Bergia, a graduate research assistant, also contributed. The analysis was supported by Purdue's Department of Nutrition Science.

Protein supplements are available in ready-to-drink, powdered and solid forms, and often contain whey, casein or soy proteins. They can help with weight gain, weight loss or weight management based on how they are incorporated into an eating plan and taken with meals or as snacks.

"This is really the first time that the issue of timing when supplements are consumed in regard to meals has been looked at," Hudson said. "This review needs to be followed up by rigorous studies to better evaluate the timing of protein supplements in relationship to meals."

Their analysis of research studies found that while protein supplementation effectively increased lean mass for all groups, consuming protein supplements with meals helped participants maintain their body weight while decreasing their fat mass. In contrast, consuming protein supplements between meals promoted weight gain.

The timing likely makes a difference because a person may tend to adjust their calories at a meal time to include the protein supplement.

"Such dietary compensation is likely missing when protein supplements are consumed as snacks. Calories at meal times may not be adjusted to offset the supplement's calories, thus leading to a higher calorie intake for that day," said Campbell, whose expertise integrates human nutrition, exercise physiology and geriatrics. "If the goal is to manage weight, then snacking on protein supplements may be less effective. People who are trying to gain weight may consider consuming protein supplements between meals."

More than 2,000 nutrition articles were screened across journal databases to identify 34 studies with 59 intervention groups related to this topic. The studies were selected based on specific criteria, including enrollment of healthy adults, evaluation of protein supplements consumed between meals or with meals, whether results showed a change in lean muscle mass, and a minimum duration of six weeks.

Credit: 
Purdue University

'These could revolutionize the world'

image: These are small diameter carbon nanotubes grown on a stainless steel surface.

Image: 
Pint Lab/Vanderbilt University

Imagine a box you plug into the wall that cleans your toxic air and pays you cash.

That's essentially what Vanderbilt University researchers produced after discovering the blueprint for turning carbon dioxide into the most valuable material ever sold - carbon nanotubes with small diameters.

Carbon nanotubes are supermaterials that can be stronger than steel and more conductive than copper. The reason they're not in every application from batteries to tires is that these amazing properties only show up in the tiniest nanotubes, which are extremely expensive. Not only did the Vanderbilt team show that they can make these materials from carbon dioxide sucked from the air, they also showed how to do this in a way that is much cheaper than any other method out there.

These materials, which Assistant Professor of Mechanical Engineering Cary Pint calls "black gold," could steer the conversation from the negative impact of emissions to how we can use them in future technology.

"One of the most exciting things about what we've done is use electrochemistry to pull apart carbon dioxide into elemental constituents of carbon and oxygen and stitch together, with nanometer precision, those carbon atoms into new forms of matter," Pint said. "That opens the door to being able to generate really valuable products with carbon nanotubes.

"These could revolutionize the world."

In a report published today in ACS Applied Materials and Interfaces, Pint, interdisciplinary material science Ph.D. student Anna Douglas and their team describe how tiny nanoparticles 10,000 times smaller than a human hair can be produced from coatings on stainless steel surfaces. The key was making them small enough to be valuable.

"The cheapest carbon nanotubes on the market cost around $100-200 per kilogram," Douglas said. "Our research advance demonstrates a pathway to synthesize carbon nanotubes better in quality than these materials with lower cost and using carbon dioxide captured from the air."

But making small nanotubes is no small task. The research team showed that a process called Ostwald ripening -- where the nanoparticles that grow the carbon nanotubes change in size to larger diameters -- works against producing the far more useful small sizes. The team showed they could partially overcome this by tuning electrochemical parameters to minimize these pesky large nanoparticles.

This core technology led Pint and Douglas to co-found SkyNano LLC, a company focused on building upon the science of this process to scale up and commercialize products from these materials.

"What we've learned is the science that opens the door to now build some of the most valuable materials in our world, such as diamonds and single-walled carbon nanotubes, from carbon dioxide that we capture from air through our process," Pint said.

Credit: 
Vanderbilt University

Study reveals how high-latitude corals cope with the cold

video: This is a video of corals coping with cold. Lead researcher Claire Ross went to the southernmost reefs of Western Australia and found that the corals there could do something quite interesting.

Image: 
The University of Western Australia

Corals growing in high-latitude reefs in Western Australia can regulate their internal chemistry to promote growth under cooler temperatures, according to new research at the ARC Centre of Excellence for Coral Reef Studies at The University of Western Australia.

The study, published today in Proceedings of the Royal Society B, suggests that ocean warming may not necessarily promote faster rates of calcification in reefs where temperatures are currently cooler (lower than 18°C).

Lead author Claire Ross said the study was carried out over two years in Western Australia's Bremer Bay, 515km south-east of Perth in the Great Southern region. Bremer Bay is a renowned diving, snorkelling and tourism hot spot due to its stunning crystal clear waters, white sand and high marine biodiversity.

"For two years we used cutting-edge geochemical techniques to link the internal chemistry of the coral with how fast the corals were growing in a high-latitude reef," Ms Ross said.

"These high-latitude reefs (above 28 degrees north and below 28 degrees south) have lower light and temperatures compared to the tropics and essentially provide natural laboratories for investigating the limits for coral growth."

Ms Ross said the researchers expected the corals to grow slower during winter because the water was colder and light levels lower but they were surprised to find the opposite pattern.

"We were able to link the remarkable capacity for temperate corals to maintain high growth during winter to the regulation of their internal chemistry," she said.

"We also found that there was more food in the water for corals during winter compared to summer, indicating that (in addition to internal chemical regulation) corals may feed more to sustain growth."

Coral reefs are one of the world's most valuable natural resources, providing a habitat for many ocean species and shoreline protection from waves and storms, as well as being economically important for tourism and fisheries.

However, studies have shown that the important process by which corals build their skeletons is under threat due to CO2-driven climate change. The effects of climate change on coral reefs are likely to vary geographically, but relatively little is known about the growth rates of reefs outside of the tropics.

"Our study is unique because it is among the first to fully decipher the corals' internal chemistry," Ms Ross said. "The findings of this study help better understand and predict the future of high-latitude coral reefs under CO2-driven climate change."

Credit: 
ARC Centre of Excellence for Coral Reef Studies

New study sheds light on the opioid epidemic and challenges prevailing views about this public health crisis

image: Change in total and drug-related mortality rates and years of potential life lost, prior to age 75 years, by single year of age for non-Hispanic whites: 2015 vs 1999.

Image: 
American Journal of Preventive Medicine

Ann Arbor, May 22, 2018 - A study published in the American Journal of Preventive Medicine sheds new light on the sharp rise in fatal drug overdoses in recent years, one of the most severe public health challenges of our time. The study found that the growth in fatal overdoses for non-Hispanic whites (NHWs) aged 22-56 years was sufficiently large to account for the entire growth in mortality rates (MR) and years of potential life lost (YPLL) for this population from 1999 to 2015.

MR and YPLL rose by 21.2 per 100,000 people and by more than 700,000 years, respectively, from 1999 to 2015. If drug mortality rates had remained at 1999 levels and other patterns of mortality had not changed, MR and YPLL would have declined considerably for NHW men aged 22-56 years and risen only slightly for corresponding women.

"Particularly noteworthy is the rapid rise in lost life years and mortality rates for non-Hispanic white males in their 20s and 30s. These increases are considerably larger than those experienced by corresponding men or women in their 40s and 50s, who have been the focus of earlier analyses," explained Christopher J. Ruhm, PhD, Frank Batten School of Leadership and Public Policy, University of Virginia, Charlottesville, VA, USA. He points out that focusing on whites in their 40s and 50s missed important components of MR and YPLL growth.

Some of the study's findings are not fully consistent with previous research that has received a great deal of public attention. Whereas prior research emphasized the rising mortality rates of NHWs in their 40s and 50s due to prescription opioid overdoses, this study shows that the overall MR and YPLL actually grew more for NHWs in their 20s and 30s, with 62 percent of the MR and 76 percent of the YPLL increase among NHWs aged 22-56 years accounted for by individuals aged 22-39 years, and 32 percent of the MR and 41 percent of YPLL by individuals aged 22-30 years alone. Illicit opioids were primarily responsible for the growth in fatal overdoses among individuals aged 22-39 years, especially males, with prescription and illicit opioids playing more equal roles for older females and with other drug categories also becoming somewhat more important at higher ages.

Vital statistics data were used to examine to what extent increases in MR and YPLL among midlife (aged 22-56) NHWs from 1999 to 2015 could be explained by increases in fatal overdoses and deaths involving specific drug categories. Results were analyzed for the age ranges 22-30, 31-39, 40-48, and 49-56 years, chosen because preliminary analysis indicated similar trends within them but substantial differences between them. The analysis focused on NHWs because MR and YPLL changes were uniformly negative for nonwhites. When looking at the involvement of specific drug categories, recently developed methods were implemented to account for missing information on drug involvement on death certificates. The drugs involved in fatal overdoses were examined with primary attention paid to prescription and illicit opioids, with a distinction made between separate versus combined use. The use of non-opioid drug categories was also examined.
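For readers unfamiliar with the measures, a minimal sketch of how a mortality rate and years of potential life lost before age 75 might be computed from individual death records is given below; the ages and population size are made up for illustration.

```python
def mortality_rate(deaths, population):
    """Deaths per 100,000 population."""
    return 100_000 * deaths / population

def ypll_before_75(ages_at_death, cutoff=75):
    """Years of potential life lost: sum of (cutoff - age) over deaths before the cutoff."""
    return sum(cutoff - age for age in ages_at_death if age < cutoff)

# Toy example with invented ages at death in a population of 250,000.
ages = [28, 34, 47, 55, 81]
print(mortality_rate(len(ages), 250_000))  # 2.0 deaths per 100,000
print(ypll_before_75(ages))                # (75-28)+(75-34)+(75-47)+(75-55) = 136 years
```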

"Understanding the dimensions of the problem is critical and it is particularly important to understand how changes in MR and YPLL vary across age and sex, as well as race-ethnicity groups, and also how the contributions of different drug categories to these increases vary across these groups," commented Dr. Ruhm, adding that he hoped the data and insights will help lead to multi-faceted efforts to curtail the growing epidemic.

Credit: 
Elsevier

Fluid dynamics may play key role in evolution of cooperation

image: This is Dervis Can Vural, assistant professor in the Department of Physics at Notre Dame.

Image: 
University of Notre Dame

Believe it or not -- it's in our nature to cooperate with one another, even when cheating may be more profitable. Social cooperation is common in every scale of life, from the simplest bacterial films and multicellular tissues to insect colonies and nation-states, where individuals prioritize the common good over personal gain, even when the two might conflict. Scientists have long wondered how social cooperation could evolve and persist, since "survival of the fittest" often favors cheaters that multiply at the expense of others.

In a new study, physicists at the University of Notre Dame examined how the mechanical properties of an environment may shape the social evolution of microbial populations. Through computer simulations and analytical calculations, they determined the necessary properties of diffusion and flow that allow microbes to evolve stable social behavior. Their findings also allow for speculation that the evolution of single-cell organisms to multicellular organisms may have taken place in flowing fluids like rivers or streams as opposed to larger bodies of water such as oceans and lakes.

"Microbes form groups, like little villages," said Dervis Can Vural, assistant professor in the Department of Physics at Notre Dame. "If a cheater mutant emerges in one, its descendants will multiply at the cost of others, and spread like a tumor. Such non-cooperating groups will grow weak and die."

Unlike most past approaches that describe cooperation in abstract language of economics and game theory, the Notre Dame group was interested in determining the role of physical forces on social evolution. The model consisted of bacteria secreting two types of diffusing molecules: a "public good" -- a molecule that provides a benefit to those nearby, such as a digestive enzyme -- and waste. While both cooperating and cheating strains produce waste, cheaters produce less or no public goods while benefiting from the molecules produced by the cooperators.

In the absence of fluid flow, cheaters ultimately took over, weakened and killed the entire population. However, when the team introduced flow into the model, shear forces caused some of the microbial groups to distort and occasionally fragment, which limited the spread of cheaters.

"Fragmentation is the key to stable cooperation," Can Vural said. "If groups manage to fragment more often than mutants appear, then cooperation will prevail. It's a bit like starting a new forest before the fire catches on."

Understanding evolutionary transitions in social behavior can help engineer strategies to manipulate microbial ecosystems. As a proof of concept, the team was able to fine-tune social evolution using only flow patterns, so that bacteria would cooperate and persist only in a confined region; elsewhere, they would be taken over by cheaters and go extinct.

"In a waste treatment facility, in an industrial bioreactor or in our own guts, it is desirable that bacteria cooperate and coexist harmoniously to fulfill different functions. It would be very bad if one strain took over at the cost of others," said Gurdip Uppal, a graduate student at Notre Dame, who co-authored the study. "In other cases, such as with disease-causing biofilms, we would like to suppress cooperation, since this makes them so much stronger."

Credit: 
University of Notre Dame

Michael Jackson's antigravity tilt -- Talent, magic, or a bit of both?

image: Figure A shows drawings of the 'antigravity tilt' (>45° forward bend), the dance move introduced by Michael Jackson, in comparison to the normal limit of a human tilt (20° forward bend), as well as the conceptualized shoe designed by MJ and co-inventors. Figure B shows the shift of the fulcrum from the sacrum to the Achilles tendon in MJ's antigravity tilt.

Image: 
Manjul Tripathi

Charlottesville, VA (May 22, 2018). When was the last time you watched a Michael Jackson music video? If your answer is "never" or "not for quite a while," you are really missing a treat. According to Rolling Stone, "No single artist ... shaped, innovated or defined the medium of 'music video' more than Michael Jackson."

Back in the 1980s and early 1990s, MTV had only one format--music videos--and that genre really took off when Jackson burst on the scene in 1983 with his musical hit "Billie Jean." Prior to his arrival on MTV, most videos were merely visual promos for artists' songs, and in some cases the visual side of the promos detracted from the music. Michael Jackson, on the other hand, took his incredible music and added story lines, special effects, cinematography, and amazing choreography. He created high-budget brief movies highlighting both music and dance.

And about that dance... Jackson executed dance moves we thought impossible, at the time and even now. Almost every fan tried to dance like him, but very few could pull it off. Some of Jackson's dance moves appear to defy the laws of gravity. In one move featured in his 1987 music video "Smooth Criminal," he pitches forward 45 degrees, with his body straight as a rod and his shoes resting on the stage, and holds the position. That is not how the human body works! How did Michael Jackson do it? Was it talent, magic, or both?

Three neurosurgeons from the Postgraduate Institute of Medical Education and Research in Chandigarh, India--Nishant S. Yagnick, Manjul Tripathi, and Sandeep Mohindra--set out to examine the antigravity tilt introduced in "Smooth Criminal" from a neurosurgeon's point of view.

First, Yagnick et al. walk us through some basics of spinal biomechanics to show just how impressive the feat is. Even the strongest of dancers can only maintain a 25- to 30-degree forward tilt from the ankle.
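For a back-of-the-envelope sense of why the move is so demanding, one can model the tilted dancer as a rigid rod pivoting at the ankle: the gravitational torque that the calf muscles and Achilles tendon must resist grows with the sine of the lean angle. The rigid-rod model, the 70 kg body weight and the 1 m centre-of-mass height below are simplifying assumptions for illustration, not figures from the paper.

```python
import math

def gravitational_torque(weight_n, com_height_m, lean_deg):
    """Torque about the ankle for a rigid body of the given weight whose centre of
    mass sits com_height_m up the body axis, leaning lean_deg from vertical."""
    return weight_n * com_height_m * math.sin(math.radians(lean_deg))

W, h = 700.0, 1.0          # roughly a 70 kg dancer with centre of mass about 1 m above the ankle
normal = gravitational_torque(W, h, 20)    # an ordinary forward lean
jackson = gravitational_torque(W, h, 45)   # the "Smooth Criminal" tilt
print(f"{normal:.0f} N·m vs {jackson:.0f} N·m, about {jackson / normal:.1f}x the load on the Achilles tendon")
```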

Admitted fans of Jackson, the neurosurgeons document how the antigravity tilt was accomplished, taking into account the talent and core strength of the artist, as well as his inventiveness and use of a patented aid, which together seem to move his body past human limits. They also warn other neurosurgeons of new forms of spinal injuries, as dancers follow Jackson's example and attempt "to jump higher, stretch further, and turn faster than ever before."

The full story on the antigravity tilt is published today in a new article in the Journal of Neurosurgery: Spine entitled "How did Michael Jackson challenge our understanding of spine biomechanics?".

Read the article soon. This is one of those mysteries where the solution is as fascinating as the performance. After you've read the article, you may want to go to YouTube and check out "Smooth Criminal" and other Michael Jackson music videos.

When asked about his article, Dr. Tripathi said, "MJ has inspired generations of dancers to push themselves beyond their limits. Though a visual delight, such moves also lead to new forms of musculoskeletal injuries. 'The King of Pop' has not only been an inspiration but a challenge to the medical fraternity."

Credit: 
Journal of Neurosurgery Publishing Group