Tech

Bacteria-killing gel heals itself while healing you

image: McMaster engineers specializing in infectious diseases have created a bacteria-killing gel composed entirely of friendly viruses. The gel can heal itself when cut.

Image: 
JD Howell, McMaster University

HAMILTON, ON July 25, 2019 - McMaster researchers have developed a novel gel made entirely from bacteria-killing viruses.

The antibacterial gel, which can be targeted to attack specific forms of bacteria, holds promise for numerous beneficial applications in medicine and environmental protection.

Among many possibilities, it could be used as an antibacterial coating for implants and artificial joints, as a sterile growth scaffold for human tissue, or in environmental cleanup operations, says chemical engineer Zeinab Hosseini-Doust.

Her lab, which specializes in developing engineering solutions for infectious disease, grew, extracted and packed together so many of the viruses - called bacteriophages, or simply phages - that they assembled themselves spontaneously into liquid crystals and, with the help of a chemical binder, formed into a gelatin-like substance that can heal itself when cut.

Yellow in colour and resembling Jell-O, the gel packs 300 trillion phages into each millilitre. Phages are the most numerous organisms on Earth, outnumbering all other organisms combined, including bacteria.

"Phages are all around us, including inside our bodies," explains Hosseini-Doust. "Phages are bacteria's natural predators. Wherever there are bacteria, there are phages. What is unique here is the concentration we were able to achieve in the lab, to create a solid material."

The field of phage research is growing rapidly, especially as the threat of antimicrobial resistance grows.

"We need new ways to kill bacteria, and bacteriophages are one of the promising alternatives," says Lei Tan, a PhD student in Hosseini-Doust's lab and a co-author on the paper describing the research, published today in the journal Chemistry of Materials. "Phages can kill bacteria that are resistant to antibiotics."

Hosseini-Doust says the DNA of phages can readily be modified to target specific cells, including cancer cells. Through a Nobel Prize-winning technology called phage display, it's even possible to find phages that target plastics or environmental pollutants.

Being able to shape phages into solid form opens new vistas of possibility, just as their utility in fighting diseases is being realized, she says.

Credit: 
McMaster University

Microrobots show promise for treating tumors

image: An artist's illustration of microrobots inside the gut.

Image: 
Caltech

Targeting medical treatment to an ailing body part is a practice as old as medicine itself. A Band-Aid is placed on a skinned knee. Drops go into itchy eyes. A broken arm goes into a cast.

But often what ails us is inside the body and is not so easy to reach. In such cases, a treatment like surgery or chemotherapy might be called for. A pair of researchers in Caltech's Division of Engineering and Applied Science are working on an entirely new form of treatment--microrobots that can deliver drugs to specific spots inside the body while being monitored and controlled from outside the body.

"The microrobot concept is really cool because you can get micromachinery right to where you need it," says Lihong Wang, Caltech's Bren Professor of Medical Engineering and Electrical Engineering. "It could be drug delivery, or a predesigned microsurgery."

The microrobots are a joint research project of Wang and Wei Gao, assistant professor of medical engineering, and are intended for treating tumors in the digestive tract.

The microrobots consist of microscopic spheres of magnesium metal coated with thin layers of gold and parylene, a polymer that resists digestion. The layers leave a circular portion of the sphere uncovered, kind of like a porthole. The uncovered portion of the magnesium reacts with the fluids in the digestive tract, generating small bubbles. The stream of bubbles acts like a jet and propels the sphere forward until it collides with nearby tissue.

On their own, spherical magnesium microrobots that can zoom around might be interesting, but they are not especially useful. To turn them from a novelty into a vehicle for delivering medication, Wang and Gao made some modifications to them.

VIDEO: https://www.youtube.com/watch?v=YWK3gg6J8ng

First, a layer of medication is sandwiched between an individual microsphere and its parylene coat. Then, to protect the microrobots from the harsh environment of the stomach, they are enveloped in microcapsules made of paraffin wax.

At this stage, the spheres are capable of carrying drugs, but still lack the crucial ability to deliver them to a desired location. For that, Wang and Gao use photoacoustic computed tomography (PACT), a technique developed by Wang that uses pulses of infrared laser light.

The infrared laser light diffuses through tissues and is absorbed by oxygen-carrying hemoglobin molecules in red blood cells, causing the molecules to vibrate ultrasonically. Those ultrasonic vibrations are picked up by sensors pressed against the skin. The data from those sensors is used to create images of the internal structures of the body.
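The reconstruction step lends itself to a toy sketch: delay-and-sum beamforming, a standard approach to this kind of imaging, locates a source by summing each sensor's trace at the delay that a candidate position would imply. The geometry, sampling rate, and single-sample "pulse" below are invented for illustration and are not taken from Wang's PACT system.

```python
import numpy as np

# Toy delay-and-sum reconstruction, the core idea behind photoacoustic
# imaging: each sensor records an ultrasonic pulse delayed by its
# distance from the source; summing the records at the right delays
# makes the source location stand out.

C = 1.5        # speed of sound, mm/us (typical for soft tissue)
FS = 50.0      # sampling rate, samples/us
SENSORS = [(x, 0.0) for x in np.linspace(-10, 10, 16)]  # line array, mm
SOURCE = (2.0, 8.0)                                     # absorber, mm

def record(sensor, source, n=1024):
    """Simulated sensor trace: a single sample pulse at the arrival time."""
    trace = np.zeros(n)
    dist = np.hypot(sensor[0] - source[0], sensor[1] - source[1])
    trace[int(round(dist / C * FS))] = 1.0
    return trace

traces = [record(s, SOURCE) for s in SENSORS]

def delay_and_sum(traces, sensors, grid):
    """For each candidate pixel, sum every trace at that pixel's delay."""
    image = np.zeros(len(grid))
    for i, (px, py) in enumerate(grid):
        for trace, (sx, sy) in zip(traces, sensors):
            d = np.hypot(px - sx, py - sy)
            image[i] += trace[int(round(d / C * FS))]
    return image

grid = [(x, y) for x in range(-10, 11) for y in range(1, 15)]
image = delay_and_sum(traces, SENSORS, grid)
best = grid[int(np.argmax(image))]
print(best)  # brightest pixel lands at (2, 8), the true source
```

The real system images hemoglobin distributed through tissue rather than a single point, but the delay-matching principle is the same.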

Previously, Wang has shown that variations of PACT can be used to identify breast tumors, or even individual cancer cells. With respect to the microrobots, the technique has two jobs. The first is imaging. By using PACT, the researchers can find tumors in the digestive tract and also track the location of the microrobots, which show up strongly in the PACT images.

Once the microrobots arrive in the vicinity of the tumor, a high-power continuous-wave near-infrared laser beam is used to activate them. Because the microrobots absorb the infrared light so strongly, they briefly heat up, melting the wax capsule surrounding them, and exposing them to digestive fluids. At that point, the microrobots' bubble jets activate, and the microrobots begin swarming.

The jets are not steerable, so the technique is sort of a shotgun approach--the microrobots will not all hit the targeted area, but many will. When they do, they stick to the surface and begin releasing their medication payload.

"These micromotors can penetrate the mucus of the digestive tract and stay there for a long time. This improves medicine delivery," Gao says. "But because they're made of magnesium, they're biocompatible and biodegradable."

Tests in animal models show that the microrobots perform as intended, but Gao and Wang say they are planning to continue pushing the research forward.

"We demonstrated the concept that you can reach the diseased area and activate the microrobots," Gao says. "The next step is evaluating the therapeutic effect of them."

Gao also says he would like to develop variations of the microrobots that can operate in other parts of the body, and with different types of propulsion systems.

Wang says his goal is to improve how his PACT system interacts with the microrobots. The infrared laser light it uses has some difficulty reaching into deeper parts of the body, but he says it should be possible to develop a system that can penetrate further.

Credit: 
California Institute of Technology

America's packaged food supply is ultra-processed

image: Distribution of health star ratings by major food category for foods that were and were not classified as ultra-processed.

Image: 
Northwestern University

About 80% of Americans' calorie consumption comes from store-bought foods, beverages

Food supply plays a central role in the development of obesity, cardiovascular disease

Compared with other Western countries, U.S. foods have higher sugar and sodium content

Manufacturers 'can and should be doing a whole lot better' at providing healthy options

CHICAGO --- Americans are overexposed to products that are high in energy, saturated fat, sugar and salt, according to a new Northwestern Medicine study reporting that the United States' packaged food and beverage supply in 2018 was ultra-processed and generally unhealthy.

Since about 80% of Americans' total calorie consumption comes from store-bought foods and beverages (packaged and unpackaged), the food and beverage supply plays a central role in the development of chronic disease including obesity and cardiovascular disease.

The study was published today, July 24, in the journal Nutrients. It aims to provide new information for consumers, researchers and policymakers to encourage food manufacturers to reformulate or replace unhealthy products and to inform the U.S. government on where action may be needed to improve the healthfulness of the U.S. packaged food and beverage supply.

"To say that our food supply is highly processed won't shock anyone, but it's important that we hold food and beverage manufacturers accountable by continually documenting how they're doing in terms of providing healthy foods for consumers," said lead author Abigail Baldridge, a biostatistician in the department of preventive medicine at Northwestern University Feinberg School of Medicine. "And the verdict is they can and should be doing a whole lot better."

As classified by the NOVA Food Classification System, developed at the University of Sao Paulo in Brazil, "ultra-processed food and beverages" form the fourth and final group of foods: those that "are industrial formulations made entirely or mostly from substances extracted from foods (oils, fats, sugar, starch and proteins)." Such products also contain substances derived from food constituents, such as hydrogenated fats and modified starch, or synthesized in laboratories.

The scientists analyzed 230,156 products and, using the NOVA classification system, found 71% of products such as bread, salad dressings, snack foods, sweets, sugary drinks and more were ultra-processed. Among the top 25 manufacturers by sales volume, 86% of products were classified as ultra-processed.

"Bread and bakery products" was the only category consistently among the highest third across all four nutrient categories in terms of nutrient levels (calories, saturated fat, total sugars and sodium).

Compared to other western countries like Australia, the U.S. food supply is similarly healthy but more processed, with higher median sugar and sodium content, the study found.

Dietary guidelines are routinely updated, but there is no comparable regular surveillance of, or reporting on, what is available on grocery shelves for consumers, researchers or policy makers. Changing the food supply must start with properly assessing it, Baldridge said.

"Food and beverage products continuously evolve, and reports like these highlight opportunities to make critical changes within specific manufacturers or product categories to reduce saturated fat, salt and sugars," Baldridge said.

That change can come through manufacturers replacing or reformulating their food and beverage products, Baldridge said.

"Our team has previously shown that breads, in particular, have 12% higher sodium content in the U.S. in comparison to the U.K., where national sodium-reduction strategies have contributed to lowering sodium levels in packaged foods," Baldridge said.

The scientists analyzed data collected by Label Insight, a Chicago-based company; the dataset represents more than 80% of all food and beverage products sold in the U.S. over the past three years.

Collecting data on the packaged food and beverage supply is difficult because it is so large and about 20% of packaged foods in the U.S. turn over every year, Baldridge said.

"We need to better capture real-time information of our constantly changing food supply if we're going to track and improve its healthfulness," said study co-author Dr. Mark Huffman, the Quentin D. Young Professor of Health Policy, associate professor of preventive medicine and medicine at Feinberg and a Northwestern Medicine cardiologist.

To that end, the study team, including researchers at The George Institute for Global Health in Australia, last summer launched the U.S. version of FoodSwitch, a free mobile phone app that allows consumers to scan packaged foods to determine their healthfulness. If a product doesn't exist in the app's 268,000-product database, then the app asks the user to crowdsource the information by uploading photos of its barcode, nutritional label and packaging to update the app's ever-growing database of foods.

The scientists in this study ranked foods by healthfulness as defined by the Health Star Rating system, which scores packaged foods from 0.5 stars (unhealthiest) to 5 stars (healthiest) to provide a quick look at a product's nutritional profile. They focused specifically on whether products were ultra-processed or not, and compared healthfulness and level of processing across product categories and leading manufacturers.
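The shape of that comparison can be sketched in a few lines. The product records, NOVA groups, and star ratings below are invented for illustration; the study's actual dataset and scoring method are far richer.

```python
from statistics import median

# Hypothetical product records: (category, nova_group, health_star_rating).
# NOVA group 4 = ultra-processed; Health Star Ratings run 0.5-5.0.
products = [
    ("bread", 4, 2.5), ("bread", 1, 4.0), ("snacks", 4, 1.5),
    ("snacks", 4, 2.0), ("yogurt", 1, 4.5), ("yogurt", 4, 3.0),
    ("soda", 4, 0.5), ("juice", 1, 5.0),
]

# Share of the supply classified as ultra-processed.
ultra = [p for p in products if p[1] == 4]
share = len(ultra) / len(products)  # 5 of 8 products here

# Median healthfulness, split by processing level.
hsr_ultra = median(p[2] for p in ultra)
hsr_other = median(p[2] for p in products if p[1] != 4)

print(f"{share:.0%} ultra-processed")
print(hsr_ultra, hsr_other)  # 2.0 vs 4.5 in this toy sample
```

The same tally-and-compare logic, scaled to 230,156 real product records, underlies the study's headline 71% figure.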

Credit: 
Northwestern University

Clues on how soils may respond to climate change found

image: Scientists collected rock samples from red, purple and orange Paleocene-Eocene Thermal Maximum soil horizons in Wyoming.

Image: 
Allison Baczynski, Penn State

Rock core samples from a period of warming millions of years ago indicate soils contributed to a rapid rise in atmospheric greenhouse gas and suggest modern climate models may overestimate Earth's ability to mitigate future warming, according to an international team of scientists.

Researchers discovered a drastic drop in organic material preserved in sections of core samples from the Paleocene-Eocene Thermal Maximum (PETM), a global warming event 55.5 million years ago that's considered the best analogue for modern climate change.

The findings, according to the researchers, suggest ancient soils from a site in modern day Wyoming acted as a source of atmospheric carbon dioxide, emitting the greenhouse gas into the atmosphere, and not a sink, trapping and storing carbon underground.

The researchers said this could mean global climate models, which expect soils to be a sink, may overstate the ability of terrestrial ecosystems to lessen the impacts of climate change. However, additional studies are needed to see how soils reacted to the PETM in other parts of the world, they said.

"We see the amount of carbon drops drastically, by orders of magnitude, during this PETM event," said Allison Baczynski, a postdoctoral scholar in geosciences at Penn State and lead author on the study. "So at least in Wyoming, my data suggests soils acted as a source, not a sink, for carbon dioxide, which could provide new information as we try to figure out where our climate is heading."

The team's findings appeared online in April and in the print version of the journal Paleoceanography and Paleoclimatology in May. Katherine Freeman, Evan Pugh University Professor of Geosciences and Baczynski's adviser, is a co-author.

The cores, drilled in 2011 at the Bighorn Basin in Wyoming, are the first terrestrial core samples of the PETM. The scientists found the samples contained less organic matter than expected, but, at the time, the team lacked tools with enough sensitivity to measure specific biomarkers.

Baczynski spent parts of four years improving the sensitivity of the equipment by two orders of magnitude, and using that tool, the team collected the first biomarker record of the PETM from terrestrial core samples.

"Prior to improving the sensitivity, we had carbon isotope values from before and after the PETM, but nothing during," Baczynski said. "We were able to fill in that gap in this study."

The researchers found the 130-foot section they believe represents the PETM had the lowest total carbon and biomarker content of any part of the core.

"At least in the Bighorn Basin, it appears that high PETM temperature, seasonally intense precipitation, or a combination, accelerated organic matter decay rates such that they outpaced plant productivity and ultimately resulted in reduced soil organic carbon during the PETM," Baczynski said.

The PETM is marked by a global rise in temperatures of about 9 to 15 degrees Fahrenheit and a rapid increase in atmospheric carbon dioxide. The carbon dioxide from this time has a unique isotopic signature, and scientists can identify it in tree and plant fossils that absorbed the carbon.

The scientists found the PETM section of the core lacked evidence of this process. Using the new tool, and comparing samples with nearby outcrops, scientists believe up to 40 percent of the core may be composed of older fossil carbon that predates the PETM. The area was once a floodplain, and rivers may have carried and deposited the older carbon, scientists said.

Baczynski said the instrument she developed will help with similar fossil research and has broader applications for studying materials with low carbon levels, like extraterrestrial samples that could someday come from Mars.

Credit: 
Penn State

Reach out and touch someone

video: A team of University of Utah biomedical engineers is helping develop a prosthetic arm for amputees that can move with the person's thoughts and feel the sensation of touch via an array of electrodes implanted in the muscles of the patient.

Image: 
University of Utah College of Engineering

July 24, 2019 - Keven Walgamott had a good "feeling" about picking up the egg without crushing it.

What seems simple for nearly everyone else can be more of a Herculean task for Walgamott, who lost his left hand and part of his arm in an electrical accident 17 years ago. But he was testing out the prototype of a high-tech prosthetic arm with fingers that not only can move, they can move with his thoughts. And thanks to a biomedical engineering team at the University of Utah, he "felt" the egg well enough so his brain could tell the prosthetic hand not to squeeze too hard.

That's because the team, led by University of Utah biomedical engineering associate professor Gregory Clark, has developed a way for the "LUKE Arm" (so named after the robotic hand that Luke Skywalker got in "The Empire Strikes Back") to mimic the way a human hand feels objects by sending the appropriate signals to the brain. Their findings were published in a new paper co-authored by U biomedical engineering doctoral student Jacob George, former doctoral student David Kluger, Clark and other colleagues in the latest edition of the journal Science Robotics. A copy of the paper may be obtained by emailing robopak@aaas.org.

"We changed the way we are sending that information to the brain so that it matches the human body. And by matching the human body, we were able to see improved benefits," George says. "We're making more biologically realistic signals."

That means an amputee wearing the prosthetic arm can sense the touch of something soft or hard, understand better how to pick it up, and perform delicate tasks that would otherwise be impossible with a standard prosthetic with metal hooks or claws for hands.

"It almost put me to tears," Walgamott says about using the LUKE Arm for the first time during clinical tests in 2017. "It was really amazing. I never thought I would be able to feel in that hand again."

Walgamott, a real estate agent from West Valley City, Utah, and one of seven test subjects at the University of Utah, was able to pluck grapes without crushing them, pick up an egg without cracking it, and hold his wife's hand with a sensation in the fingers similar to that of an able-bodied person.

"One of the first things he wanted to do was put on his wedding ring. That's hard to do with one hand," says Clark. "It was very moving."

How those things are accomplished is through a complex series of mathematical calculations and modeling.

The LUKE Arm

The LUKE Arm has been in development for some 15 years. The arm itself is made of mostly metal motors and parts with a clear silicone "skin" over the hand. It is powered by an external battery and wired to a computer. It was developed by DEKA Research & Development Corp., a New Hampshire-based company founded by Segway inventor Dean Kamen.

Meanwhile, the University of Utah team has been developing a system that allows the prosthetic arm to tap into the wearer's nerves, which are like biological wires that send signals to the arm to move. It does that thanks to an invention by University of Utah biomedical engineering Emeritus Distinguished Professor Richard A. Normann called the Utah Slanted Electrode Array. The Array is a bundle of 100 microelectrodes and wires that are implanted into the amputee's nerves in the forearm and connected to a computer outside the body. The array interprets the signals from the still-remaining arm nerves, and the computer translates them to digital signals that tell the arm to move.
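The translation from nerve signals to arm commands can be sketched as a simple linear read-out: calibrate weights that map per-channel firing rates to motor commands, then apply those weights to new activity. This is a generic toy model, not the Utah team's actual decoder; the channel count, command variables, and data are all invented.

```python
import numpy as np

# Toy linear decoder in the spirit of reading movement intent from a
# multielectrode array: fit weights mapping channel firing rates to a
# motor command, then decode new activity with the fitted weights.

rng = np.random.default_rng(0)
n_channels, n_samples = 8, 200

W_true = rng.normal(size=(n_channels, 2))        # hidden rate->command map
rates = rng.poisson(5.0, size=(n_samples, n_channels)).astype(float)
commands = rates @ W_true        # e.g. (grip force, wrist angle) per sample

# Calibration: least-squares fit of decoder weights from paired data.
W_fit, *_ = np.linalg.lstsq(rates, commands, rcond=None)

# Decode a new burst of activity into a command for the prosthesis.
new_rates = rng.poisson(5.0, size=(1, n_channels)).astype(float)
decoded = new_rates @ W_fit
expected = new_rates @ W_true
print(np.allclose(decoded, expected))  # True: noiseless toy recovers the map
```

Real decoders must cope with noisy, drifting neural data, so they are recalibrated and regularized far more carefully than this sketch suggests.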

But it also works the other way. Performing tasks such as picking up objects requires more than just the brain telling the hand to move. The prosthetic hand must also learn how to "feel" the object in order to know how much pressure to exert, because you can't figure that out just by looking at it.

First, the prosthetic arm has sensors in its hand that send signals to the nerves via the Array to mimic the feeling the hand gets upon grabbing something. But equally important is how those signals are sent. It involves understanding how your brain deals with transitions in information when it first touches something. Upon first contact of an object, a burst of impulses runs up the nerves to the brain and then tapers off. Recreating this was a big step.
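That burst-then-taper pattern can be caricatured as a transient term that decays onto a sustained term. This is a hypothetical illustration of the idea, not the published encoding model; all constants are invented.

```python
import math

# Toy model of the "burst then taper" touch response described above:
# on contact the signalled rate jumps, then decays toward a sustained
# level proportional to the applied pressure.

def touch_response(pressure, t_since_contact,
                   sustained_gain=1.0, burst_gain=4.0, tau=0.05):
    """Firing-rate-like signal at time t (seconds) after first contact."""
    sustained = sustained_gain * pressure
    transient = burst_gain * pressure * math.exp(-t_since_contact / tau)
    return sustained + transient

onset = touch_response(1.0, 0.0)    # 5.0: strong burst at first contact
later = touch_response(1.0, 0.5)    # ~1.0: tapers to the sustained level
print(onset, later)
```

The point of matching this temporal shape, per the researchers, is that the brain already expects it, so the artificial sensation reads as more natural.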

"Just providing sensation is a big deal, but the way you send that information is also critically important, and if you make it more biologically realistic, the brain will understand it better and the performance of this sensation will also be better," says Clark.

To achieve that, Clark's team used mathematical calculations along with recorded impulses from a primate's arm to create an approximate model of how humans receive these different signal patterns. That model was then implemented into the LUKE Arm system.

Future Research

In addition to creating a prototype of the LUKE Arm with a sense of touch, the overall team is already developing a version that is completely portable and does not need to be wired to a computer outside the body. Instead, everything would be connected wirelessly, giving the wearer complete freedom.

Clark says the Utah Slanted Electrode Array is also capable of sending signals to the brain for more than just the sense of touch, such as pain and temperature, though the paper primarily addresses touch. And while their work currently has only involved amputees who lost their extremities below the elbow, where the muscles to move the hand are located, Clark says their research could also be applied to those who lost their arms above the elbow.

Clark hopes that in 2020 or 2021, three test subjects will be able to take the arm home to use, pending federal regulatory approval.

The research involves a number of institutions including the U's Department of Neurosurgery, Department of Physical Medicine and Rehabilitation and Department of Orthopedics, the University of Chicago's Department of Organismal Biology and Anatomy, the Cleveland Clinic's Department of Biomedical Engineering, and Utah neurotechnology companies Ripple Neuro LLC and Blackrock Microsystems. The project is funded by the Defense Advanced Research Projects Agency and the National Science Foundation.

"This is an incredible interdisciplinary effort," says Clark. "We could not have done this without the substantial efforts of everybody on that team."

Credit: 
University of Utah

Exploring genetic 'dark matter,' researchers gain new insights into autism and stroke

With its elegant double helix and voluminous genetic script, DNA has become the darling of nucleic acids. Yet, it is not all powerful. In order for DNA to realize its potential--for genes to become proteins--it must first be transcribed into RNA, a delicate molecule that requires intense care and guidance.

"Gene expression is a lot more complicated than turning on a switch," says Robert B. Darnell, the Robert and Harriet Heilbrunn Professor. "There's a whole layer of regulation that alters both the quality and quantity of a protein that's produced from a gene. And much of it happens at the level of RNA."

In the brain, RNA's job as a gene tuner is vital to ensuring that the right proteins are made at the right time; when this process goes awry, the consequences can be serious. Darnell's lab recently found that the brain's response to stroke depends on the precise regulation of a subtype of RNA; and they have also learned that mutations affecting gene regulation underlie some cases of autism spectrum disorder.

Genome's little helper

Whereas DNA is stuck inside a cell's nucleus, RNA is fairly mobile. In the brain, so-called messenger RNAs can be found at the connections between neurons, called synapses, where they are translated into proteins that affect brain signaling. This process is regulated by another class of RNAs, known as microRNAs, which can rapidly promote or suppress protein production in response to dynamic changes in the brain.

In a recent experiment described in Cell Reports, Darnell and his colleagues tracked microRNA activity in the mouse brain following a simulated stroke. Using a technique called crosslinking immunoprecipitation, or CLIP, they found that stroke prompts a dramatic reduction in a subset of microRNAs known as miR-29s. Typically, these molecules limit the production of two proteins called GLT-1 and aquaporin; and when miR-29 levels drop, the researchers found, these proteins are produced in higher-than-usual quantities.
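The repressive logic can be sketched with a toy steady-state model in which protein output falls as the matching microRNA level rises. The functional form and all numbers below are illustrative, not taken from the study.

```python
# Toy sketch of microRNA suppression: steady-state protein output is
# proportional to synthesis but repressed by the matching microRNA.

def protein_level(synthesis_rate, mir_level, k=1.0):
    """Steady-state protein level under simple microRNA repression."""
    return synthesis_rate / (1.0 + k * mir_level)

normal = protein_level(10.0, mir_level=4.0)  # 2.0: high miR-29 keeps output low
stroke = protein_level(10.0, mir_level=0.5)  # ~6.7: miR-29 drops, output rises
print(normal, stroke)
```

The same release-from-repression logic explains why a drop in miR-29s raises both GLT-1 (helpful) and aquaporin (harmful) at once.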

GLT-1 is responsible for getting rid of extra glutamate, a chemical that is produced in abundance during stroke and can harm the brain if left unchecked. An uptick in production of this protein therefore seems to mitigate stroke-associated brain damage. Increased aquaporin, on the other hand, exacerbates tissue swelling, further threatening an already-imperiled brain. In short, a drop in miR-29s appears to simultaneously help and hinder stroke recovery. The good news is that a better understanding of how both of these processes work might guide the development of new and very precise medical tools.

"This research suggests potential drug targets for treating stroke," says Darnell. "By artificially inducing more GLT-1 mRNA with a drug, for example, you could regulate the amount of glutamate that's getting sucked up and reduce damage to the brain."

Covert mutations

To understand what causes a person's disease, researchers often look for mutations in genes--also known as the "coding" regions of DNA--that lead to the production of dysfunctional proteins. However, this general strategy works only for diseases that run in families and are driven by specific protein irregularities, which isn't the case for some complex conditions. For example, though studies have identified many different coding mutations that contribute to the development of autism spectrum disorder (ASD) and epilepsy, together these mutations account for only about a quarter to a third of cases.

Researchers are therefore beginning to search for irregularities in the noncoding sections of DNA--regions that don't directly code for proteins, but that make RNA whose job it is to regulate genes. Once thought of as "junk DNA," these regions are now known to be critical in determining which proteins a cell makes, when it makes them, and in what quantities. And according to Darnell, analyzing noncoding DNA can be particularly useful in understanding diseases that don't adhere to conventional heredity patterns.

"Some conditions have a genetic component, but they don't come with simple family trees where you can predict the chance of a child having a disease based on the parents' genetic makeup," says Darnell. "So you need a different approach to figure out which types of mutations are underlying the disease."

To find noncoding mutations associated with ASD, Darnell and his colleagues developed a new take on the family tree. Using a large genetic database, they first analyzed the DNA of 1,790 "microfamilies," each consisting of a mother, a father, one child with ASD, and one without. They then applied a machine-learning algorithm, developed with colleagues at Princeton, to identify ways in which children with the condition were genetically different from the rest of their family members who were unaffected by the disorder.

Described in Nature Genetics, these findings suggest that by analyzing noncoding mutations, researchers may be able to better understand not only ASD but a variety of conditions, ranging from neurological disorders to heart disease.

"Noncoding DNA makes up over 98 percent of the genome, and it's largely unexplored," says Darnell. "We're showing that this genetic dark matter can fill in our understanding of diseases that coding mutations can't explain."

Credit: 
Rockefeller University

Kids are exposed to smoking in movies

More than half of the top-grossing movies in Ontario in the past 16 years featured smoking, according to University of Toronto researchers with the Ontario Tobacco Research Unit - and most of these films were rated as acceptable for youth.

Since 2002, Adult Accompaniment (AA) or 14A rated movies have delivered 5.7 billion tobacco images to Ontario moviegoers -- three times as many as 18A or R-rated movies delivered in the same period, according to the report, released July 23.

The report's authors estimate that exposure to on-screen smoking will encourage 185,000 youth aged 17 or younger to become smokers, resulting in $1.1 billion in additional health-care costs over their lifetimes.

"In fact, these estimates may understate the impact of movie smoking on Ontario kids," said Prof. Robert Schwartz of the Dalla Lana School of Public Health. Schwartz, who is also director of the Ontario Tobacco Research Unit, said most movies rated R in the U.S. - meaning they are prohibited to under-18 youth without a guardian - are rated acceptable for youth by the Ontario Film Review Board. These movies are more likely to contain smoking.

Movies are a powerful vehicle for promoting tobacco use. The tobacco industry has a well-documented history of collaboration with Hollywood to promote smoking in movies -- including payment for the placement of tobacco products in movies.

"A substantial body of scientific evidence indicates that exposure to smoking in movies is a cause of smoking initiation and progression to regular smoking among youth," said Donna Kosmack, co-chair of the Ontario Coalition for Smoke-Free Movies. "Exposure to onscreen tobacco undermines tobacco prevention efforts."

According to recent polling by Ipsos, 78 per cent of Ontarians support not allowing smoking in movies rated G, PG, and 14A - an increase from 73 per cent in 2011.

"There is a straightforward way to fix the problem, and that's an amendment of the regulations under the provincial Film Classification Act that would require all movies with smoking shown in Ontario to be rated 18A," said Michael Perley of the Ontario Campaign for Action on Tobacco.

"The people of Ontario support action to protect kids from the normalization of smoking," said Liz Scanlon, Senior Manager of Public Affairs, Ontario, for Heart & Stroke.

Thousands of Ontarians have signed petitions that support action to reduce exposure to smoking in youth-rated films released in Ontario. These petitions have been read into the Legislature's record by nearly two dozen MPPs from the three major parties.

Credit: 
University of Toronto

Special “mapping” brain cells could inspire smarter self-driving vehicles

In a study published in Nature Communications, BU researchers Jake Hinman, William Chapman, and Michael Hasselmo, director of BU's Center for Systems Neuroscience and a College of Arts & Sciences professor of psychological and brain sciences, confirmed the presence of specialized brain cells that provide rats with personal maps of their surroundings. They believe that human brains likely have these neurons too, although further research is needed to be certain of this.

The study, partially funded by a $7.5 million multidisciplinary grant from the Department of Defense, offers valuable insights into the workings of the brain's navigational system--knowledge that could be leveraged to create smarter autonomous vehicles that can find their way around obstacles as well as living organisms.

For decades, scientists have thought that an area of the brain called the hippocampus stores maps of our surroundings, functioning as if it's the brain's file cabinet for map illustrations similar to those that pop up when we search for a location on Google Maps. But some researchers theorized that in order to effectively use those maps to navigate our environments, our brains must first convert them to the "street-view" version of Google Maps, mentally placing ourselves into a first-person view. In other words, we must develop an idea of where the map's boundaries and landmarks are located in relation to ourselves.

Now, the BU team's findings provide some of the first biological evidence that an internal street-view map function does exist, at least in rats--specifically in the striatum, an area deep in the brain that helps control behavior.

Using electrodes to see what was happening inside the rats' brains, the researchers brought the animals into a room containing strategically placed, crushed bits of Froot Loops. As the rats embarked on their sugary scavenger hunt, special brain cells within the striatum--called egocentric boundary cells--appeared to go crazy with activity. These boundary cells fired in different ways to guide the rats through their environment.

"[It's] much like if I were to give you directions to go somewhere, I might tell you, 'Oh, when you're walking down the street, once there's a Starbucks on your left, you're going to turn right,'" says Hinman, first author on the study and a former postdoctoral researcher in Hasselmo's lab. (He's now running his own lab as a faculty member at the University of Illinois Urbana-Champaign.)

Hinman says the boundary cells in the striatum served as each rat's street-view map, firing in precise ways to say, "You're close to this wall," or "There's a wall on your right." This information allowed the rat to orient itself throughout its search for the Froot Loops.
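The coordinate transform these cells appear to perform can be sketched in a few lines. The sketch below is purely illustrative (the Gaussian tuning function and all parameters are hypothetical, not the study's fitted model): it converts a wall point from room coordinates into a distance and bearing relative to the animal, then computes a firing rate that peaks when a wall sits at the cell's preferred egocentric position.

```python
import math

def egocentric_wall_position(animal_xy, heading_rad, wall_point_xy):
    """Express a wall point in the animal's egocentric frame.

    Returns (distance, bearing), where bearing is measured relative to
    the animal's heading: 0 = straight ahead, positive = to the left.
    """
    dx = wall_point_xy[0] - animal_xy[0]
    dy = wall_point_xy[1] - animal_xy[1]
    distance = math.hypot(dx, dy)
    world_angle = math.atan2(dy, dx)
    # Wrap the relative angle into (-pi, pi]
    bearing = (world_angle - heading_rad + math.pi) % (2 * math.pi) - math.pi
    return distance, bearing

def boundary_cell_rate(distance, bearing, pref_dist, pref_bearing,
                       max_rate=20.0, sigma_d=5.0, sigma_b=0.5):
    """Hypothetical Gaussian-tuned firing rate: highest when a wall is at
    the cell's preferred egocentric distance and direction."""
    return max_rate * math.exp(
        -((distance - pref_dist) ** 2) / (2 * sigma_d ** 2)
        - ((bearing - pref_bearing) ** 2) / (2 * sigma_b ** 2))
```

For an animal at the origin facing along +x, a wall point at (0, 5) comes out at distance 5 with a bearing of +90 degrees, i.e., "there's a wall on your left" -- exactly the kind of signal Hinman describes.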

"These [boundary-cell] neurons are our first step in figuring out how animals use these two strings of information to influence each other," says Chapman, a postdoctoral researcher in Hasselmo's lab. "Based on where you think you are in the environment, you might expect a wall in a certain location. If it's not there, you use that to update what you're doing in that [moment in] time, but you also update your representation of where you are."

So why is the Department of Defense funding a rat's mission for sweetened breakfast cereals? Because one day, these missions could be the key to incredible technological breakthroughs.

"The goal is to make robots able to navigate effectively in complex environments," explains Hasselmo, the study's principal investigator and senior author. "It's easier to have robots working in warehouses that have empty floors.... It's all very predictable. But it's much harder for a robot to go across uneven terrain. A human could easily walk across a path and step across boulders...but a robot would find that much more difficult."

Take, for instance, the explosions that occurred at the Japanese nuclear power plant Fukushima Daiichi in 2011. High levels of radioactivity prevented human engineers from being able to safely address the situation in person. Robots were sent to help instead, but they were often tripped up by debris and other unpredictable obstacles.

Hasselmo says self-navigating robots would be better equipped to maneuver hazardous situations like Fukushima. "One [application for this research] would be for rescue-type operations or salvage-type operations," he says.

Self-driving cars also encounter similar issues on the road and on terrain--for both situations, the brain's natural navigational system could offer clues for high-tech solutions.

That natural feat is exactly what the researchers hope robots will be able to accomplish someday. In the meantime, there are still many uncertainties to be explored. So far, Hasselmo's team has only examined how these boundary cells react to walls; in future studies they hope to address how the cells fire in response to more dynamic boundaries and landmarks, like moving objects or people. They're also in the process of investigating how--or if--these cells respond in dark environments, where an animal has less visual information to rely on.

Credit: 
Boston University

Designed protein switch allows unprecedented control over living cells

video: Researchers at the UW Medicine Institute for Protein Design in Seattle explain how they, together with researchers at the University of California, San Francisco, created a synthetic protein switch, LOCKR, from scratch. They discuss the several uses they found for versions of LOCKR in controlling processes in living cells.

Image: 
UW Medicine

Scientists have created the first completely artificial protein switch that can work inside living cells to modify—or even commandeer—the cell’s complex internal circuitry.

The switch is dubbed LOCKR, short for Latching, Orthogonal Cage/Key pRotein.

Companion papers published July 24 in the journal Nature describe LOCKR’s design and demonstrate several practical applications of the technology. The work was conducted by bioengineering teams led by David Baker at the UW Medicine Institute for Protein Design and Hana El-Samad at UC San Francisco.

The scientists show that LOCKR can be “programmed” to modify gene expression, redirect cellular traffic, degrade specific proteins, and control protein binding interactions. The researchers also use LOCKR to build new biological circuits that behave like autonomous sensors. These circuits detect cues from the cell’s internal or external environment and respond by making changes to the cell. This is akin to the way a thermostat senses ambient temperature and directs a heating or cooling system to shut itself off as soon as a desired temperature is reached.
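The thermostat analogy maps onto a simple negative-feedback loop, which can be simulated in a few lines. The toy model below is purely illustrative (all rates and the setpoint are made-up numbers, not a model of any LOCKR circuit): production switches off whenever the sensed level exceeds the setpoint, so the level settles into a narrow band around the target.

```python
def feedback_circuit(level, setpoint, production=2.0, decay=0.1, steps=200):
    """Bang-bang negative feedback: produce while below the setpoint,
    shut off above it; first-order decay pulls the level down each step."""
    history = []
    for _ in range(steps):
        if level < setpoint:      # sensor + comparator: below target?
            level += production   # actuator on: make more product
        level -= decay * level    # degradation, always active
        history.append(level)
    return history

trace = feedback_circuit(level=0.0, setpoint=10.0)
```

After a brief transient, the level oscillates in a tight band around the setpoint -- the same sense-and-respond behavior the sensor circuits exhibit, realized here with ordinary arithmetic rather than proteins.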

Once assembled by a cell, these new switches measure just eight nanometers on their longest side. More than a hundred million would be needed to cover the period at the end of this sentence.

"The ability to control cells with designer proteins ushers in a new era of biology," said El-Samad, the Kuo Family Professor of Biochemistry and Biophysics at UCSF and co-senior author of the reports. "In the same way that integrated circuits enabled the explosion of the computer chip industry, these versatile and dynamic biological switches could soon unlock precise control over the behavior of living cells and, ultimately, our health."

Having no counterpart in the natural world, LOCKR stands apart from every tool of the biotech trade, including recent technologies like optogenetics and CRISPR. While its predecessors were discovered in nature and then retooled for use in labs, industry, or medicine, LOCKR is among the first biotechnology tools entirely conceived of and built by humans.

The lead authors of the reports are Bobby Langan and Scott Boyken of the UW Medicine Institute for Protein Design, and Andrew Ng of the UC Berkeley-UCSF Graduate Program in Bioengineering.

"Right now, every cell is responding to its environment," said Langan. "Cells receive stimuli, then have to figure out what to do about it. They use natural systems to tune gene expression or degrade proteins, for example."

Langan and his colleagues set out to create a new way to interface with these cellular systems. They used computational protein design to create self-assembling proteins that present bioactive peptides only upon addition of specific molecular "keys."

With a version of LOCKR installed in yeast, the team was able to show that the genetically engineered fungus could be made to degrade a specific cellular protein at a time of the researchers' choosing. By redesigning the switch, they also demonstrated the same effect in lab-grown human cells.

To stay healthy, cells must tightly control their biochemical processes. Abnormal activity in just one gene, or buildup of the wrong protein, can upset a cell's equilibrium. This could lead to cell death or even cancer. LOCKR gives scientists a new way to interact with living cells. It could thereby enable a new wave of therapies for diseases as diverse as cancer, autoimmune disorders and more.

"LOCKR opens a whole new realm of possibility for programming cells," said Ng. "We are now limited more by our imagination and creativity rather than the proteins that nature has evolved."

Credit: 
University of Washington School of Medicine/UW Medicine

Elephant extinction will raise carbon dioxide levels in atmosphere

image: Forest elephants engineer the ecosystem of the entire central African forest, and their catastrophic decline toward extinction has implications for carbon policy.

Image: 
Stephen Blake, Ph.D.

ST. LOUIS - One of the last remaining megaherbivores, forest elephants shape their environment by serving as seed dispersers and forest bulldozers as they eat over a hundred species of fruit, trample bushes, knock over trees and create trails and clearings. Their ecological impact also affects tree populations and carbon levels in the forest, researchers report, with significant implications for climate and conservation policies.

In a paper recently published in Nature Geoscience, a Saint Louis University biologist and his colleagues found that elephant populations in central African forests encourage the growth of slow-growing trees with high wood density, which sequester more carbon from the atmosphere than the fast-growing species that elephants prefer to eat.

Because forest elephants preferentially browse on fast-growing species, they cause high levels of damage and mortality to those species compared with slow-growing, high-wood-density species. The collapse of forest elephant populations is therefore likely to increase the abundance of fast-growing tree species at the expense of slow-growing ones, reducing the forest's ability to capture carbon.

Stephen Blake, Ph.D., assistant professor of biology at Saint Louis University, spent 17 years in central Africa doing, among other things, applied research and conservation work with elephants. While there, he collected a data set on forest structure and species composition in the Nouabalé-Ndoki Forest of northern Congo.

In the current study, Blake's collaborators developed a mathematical computer model to answer the question 'What would happen to the composition of the forest over time with and without elephant browsing?'

To find out, they simulated elephant damage through browsing in the forest and assumed they browse certain plant species at different rates. Elephants prefer fast-growing species in more open spaces. As they feed and browse, they cause damage, knocking off a limb or breaking a shrub. The model calculated feeding and breakage rates along with elephant mortality rates to see their effect on certain woody plants.
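A toy version of that with-and-without comparison can be written directly. All rates below are invented for illustration (the paper's model is far more detailed); the point is only the qualitative mechanism: extra browsing mortality on fast-growing stems frees space for high-wood-density slow growers, raising the forest's total stored carbon.

```python
def forest_carbon(elephants_present, years=500):
    """Toy two-species forest model (hypothetical rates, not the paper's).

    'fast' = fast-growing, low wood density; 'slow' = slow-growing,
    high density. Elephant browsing adds mortality to the fast species.
    """
    fast, slow = 500.0, 500.0              # number of stems of each type
    growth = {"fast": 0.08, "slow": 0.03}  # per-year growth rates
    mortality = {"fast": 0.05, "slow": 0.02}
    browse = 0.04 if elephants_present else 0.0
    capacity = 2000.0                      # shared space limit
    density = {"fast": 0.4, "slow": 0.8}   # relative carbon per stem
    for _ in range(years):
        crowding = max(0.0, 1.0 - (fast + slow) / capacity)
        fast += fast * growth["fast"] * crowding \
                - fast * (mortality["fast"] + browse)
        slow += slow * growth["slow"] * crowding - slow * mortality["slow"]
    return fast * density["fast"] + slow * density["slow"]
```

Running the model with and without elephants reproduces the paper's qualitative result: the browsed forest ends up dominated by dense-wooded slow growers and stores more carbon overall.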

"Lo and behold, as we look at numbers of elephants in a forest and we look at the composition of forest over time, we find that the proportion of trees with high density wood is higher in forests with elephants," Blake said.

"The simulation found that the slow-growing plant species survive better when elephants are present. These species aren't eaten by elephants and, over time, the forest becomes dominated by these slow-growing species. Wood (lignin) has a carbon backbone, meaning it has a large number of carbon molecules in it. Slow growing high wood density species contain more carbon molecules per unit volume than fast growing low wood density species. As the elephants "thin" the forest, they increase the number of slow-growing trees and the forest is capable of storing more carbon."

These findings suggest far-ranging ecological consequences of past and present extinctions. The loss of elephants will seriously reduce the ability of the remaining forest to sequester carbon. Trees and plants use carbon dioxide during photosynthesis, removing it from the atmosphere. For this reason, plants are helpful in combating global warming and serve to store carbon emissions.

Without the forest elephants, less carbon dioxide will be taken out of the atmosphere. In monetary terms, forest elephants represent a carbon storage service of $43 billion.

"The sad reality is that humanity is doing its best to rid the planet of elephants as quickly as it can," Blake said. "Forest elephants are rapidly declining and facing extinction. From a climate perspective, all of their positive effect on carbon and their myriad other ecological roles as forest gardeners and engineers will be lost."

The study authors note that forest elephant conservation could reverse this loss.

"Elephants are a flagship species. People love elephants - we spend millions every year on cuddly toys, they are zoo favourites and who didn't cry during Dumbo? and yet we're pushing them closer to extinction every day. On one hand we admire them and feel empathy and are horrified when they are murdered and on the other hand we're not prepared to do anything serious about it. The consequences may be severe for us all. We need to change our ways.

"Besides, it just makes good sense to keep them around. They're doing an amazing job of helping the planet store carbon for free."

Other authors on the study include first author Fabio Berzaghi, Marcos Longo, Philippe Ciais, Francois Bretagnolle, Simone Vieira, Marcos Scaranello, Giuseppe Scarascia-Mugnozza and Christopher E. Doughty.

Key Take-aways

Researchers asked 'What would happen to the composition of the forest over time with and without elephants?'

They found that elephant browsing on fast-growing tree species damages and kills young plants, which pushes the composition of the forest toward slow-growing plant species that increase in abundance in areas where elephants occur. Slow-growing plants have dense wood and therefore store more carbon than fast-growing species.

The loss of elephants will seriously reduce the ability of the forest to sequester carbon and so less carbon dioxide will be kept out of the atmosphere.

Forest elephants are rapidly becoming extinct. From a climate perspective, all of the positive carbon effect that elephants provide will be lost if we do not reverse the trend of illegal killing for the ivory trade.

Saint Louis University is a Catholic, Jesuit institution that values academic excellence, life-changing research, compassionate health care, and a strong commitment to faith and service. Founded in 1818, the University fosters the intellectual and character development of more than 13,500 students on campuses in St. Louis and Madrid, Spain. Building on a legacy of nearly 200 years, Saint Louis University continues to move forward with an unwavering commitment to a higher purpose, a greater good.

Credit: 
Saint Louis University

Expanding the limits of personalized medicine with high-performance computing

image: Jonathan Ozik ponders the results of the team's work on identifying, via simulation, the rules of cancer immunotherapy.

Image: 
Argonne National Laboratory

What should personalized, precision treatment of cancer look like in the future? We know that people are different, their tumors are different, and they respond differently to different therapies. Medical teams of the future might be able to create a "virtual twin" of a person and their tumor. Then, by tapping supercomputers, physician-led teams could simulate how tumor cells behave to test millions (or billions) of possible treatment combinations. Ultimately, the best combinations might offer clues towards a personalized, effective treatment plan.

Sound like wishful thinking? The first steps towards this vision have been undertaken by a multi-institution research collaboration that includes Jonathan Ozik and Nicholson Collier, computational scientists at the U.S. Department of Energy’s Argonne National Laboratory.

The research team, which includes collaborators at Indiana University and the University of Vermont Medical Center, brought the power of high-performance computing to the thorny challenge of improving cancer immunotherapy. The team tapped twin supercomputers at Argonne and the University of Chicago, finding that high-performance computing can yield clues in fighting cancer, as discussed in a June 7 article published in Molecular Systems Design and Engineering.

“With this new approach, researchers can use agent-based modeling in more scientifically robust ways.” — Nicholson Collier, computational scientist at Argonne and the University of Chicago

Standing up to cancer

Cancer immunotherapy is a promising treatment that realigns your immune system to reduce or eliminate cancer cells. The therapy, however, helps only 10 to 20 percent of patients — partly because the way in which cancer cells and immune cells mingle is complex and poorly understood. Proven rules are scarce.

To begin uncovering the rules of immunotherapy, the team turned to a set of three tools:

Agent-based modeling, which predicted the behavior of individual “agents” – cancer and immune cells, in this case
Argonne’s award-winning workflow technology to take full advantage of the supercomputers
A guiding framework to explore models and dynamically direct and track results

The trio operate in a hierarchy. The framework, developed by Ozik, Collier, Argonne colleagues, and Gary An, a surgeon and professor at the University of Vermont Medical Center, is called Extreme-scale Model Exploration with Swift (EMEWS). It oversees the agent-based model and the workflow system, the Swift/T parallel scripting language, developed at Argonne and the University of Chicago.

What is unique about this combination of tools? “We are helping more people in a variety of computational science fields to do large-scale experimentation with their models,” said Ozik, who — like Collier — holds a joint appointment at the University of Chicago. “Building a model is fun. But without supercomputers, it is difficult to really understand the full potential of how models can behave.” 

Working smarter, not harder

The team sought to find simulated scenarios in which:

No additional cancer cells grew
90 percent of cancer cells died
99 percent of cancer cells died

They found that no cancer cells grew in 19 percent of simulations, 9 in 10 cancer cells died in 6 percent of simulations, and 99 in 100 cancer cells died in about 2 percent of the simulations.

The team began with an agent-based model, built with the PhysiCell framework, designed by Indiana University’s Paul Macklin to explore cancer and other diseases. They assigned each cancer and immune cell characteristics — birth and death rates, for example — that govern their behavior and then let them loose.
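A stripped-down sketch of that kind of agent-based model is below. The cell counts, birth and death rates, and kill probability are invented for illustration; PhysiCell tracks far richer per-cell state, but the stochastic birth/death/kill loop is the core idea.

```python
import random

def simulate(immune_kill_prob, steps=100, seed=0):
    """Minimal agent-based sketch (illustrative parameters, not
    PhysiCell's): each step, cancer cells divide or die stochastically,
    and each immune cell may kill one cancer cell on contact."""
    rng = random.Random(seed)
    cancer, immune = 100, 50
    birth, death = 0.05, 0.02          # per-cell, per-step probabilities
    for _ in range(steps):
        births = sum(rng.random() < birth for _ in range(cancer))
        deaths = sum(rng.random() < death for _ in range(cancer))
        kills = sum(rng.random() < immune_kill_prob for _ in range(immune))
        cancer = max(0, cancer + births - deaths - kills)
    return cancer
```

With a high kill probability the simulated tumor is cleared; with no immune killing, the net positive birth rate lets it grow. Exploring how outcomes depend on such parameters, at vastly larger scale, is exactly the job the supercomputers took on.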

“We use agent-based modeling to address many problems,” said Ozik. “But these models are often computationally intensive and produce a lot of random noise.”

Exploring every possible scenario within the PhysiCell model would have been impractical. “You can’t cover the entire model’s possible behavior space,” said Collier. So the team needed to work smarter, not harder.

The team relied on two approaches, genetic algorithms and active learning (both forms of machine learning), to guide the PhysiCell model and find the parameters that best controlled or killed the simulated cancer cells.

Genetic algorithms seek those ideal parameters by simulating the model, say, 100 times and measuring the results. The model then repeats the process again and again using better-performing parameter values each time. “The process allows you to find the best set of parameters quickly, without having to run every single combination,” said Collier.
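In outline, a genetic algorithm of the kind described can be sketched as follows. Everything here is illustrative: the objective function stands in for a full simulation run, and the keep-the-best-half selection with Gaussian mutation is the simplest possible scheme, not the team's actual implementation.

```python
import random

def genetic_search(objective, bounds, pop_size=30, generations=40, seed=1):
    """Toy genetic algorithm: keep the best half each generation and
    refill the population with mutated copies of the survivors."""
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=objective)               # lower score = better
        survivors = pop[: pop_size // 2]
        children = []
        for parent in survivors:
            # Mutate each gene with Gaussian noise, clamped to its bounds
            child = [min(hi, max(lo, g + rng.gauss(0, 0.1 * (hi - lo))))
                     for g, (lo, hi) in zip(parent, bounds)]
            children.append(child)
        pop = survivors + children
    return min(pop, key=objective)

# Hypothetical stand-in objective: pretend the surviving cancer fraction
# is minimized near parameter values (0.7, 0.3).
best = genetic_search(lambda p: (p[0] - 0.7) ** 2 + (p[1] - 0.3) ** 2,
                      bounds=[(0.0, 1.0), (0.0, 1.0)])
```

After a few dozen generations the population converges near the optimum without ever enumerating the full parameter space -- the "best set of parameters quickly" behavior Collier describes.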

Active learning is different. It also repeatedly simulates the model, but, as it does, it tries to discover regions of parameter values where it would be most advantageous to further explore in order to get a full picture of what works and what doesn’t. In other words, “where you can sample to get the best bang for your buck,” said Ozik.
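One common, simple realization of that idea is uncertainty sampling. The sketch below uses distance to the nearest already-run simulation as a stand-in for model uncertainty -- a deliberate simplification (the project's actual algorithms are more sophisticated), but it captures the "sample where you know least" principle.

```python
import random

def active_sample(simulate, bounds, n_init=5, n_rounds=20, seed=2):
    """Uncertainty-style active learning sketch: treat distance to the
    nearest already-simulated point as a proxy for uncertainty and
    always run the candidate we know least about."""
    rng = random.Random(seed)
    sampled = [tuple(rng.uniform(lo, hi) for lo, hi in bounds)
               for _ in range(n_init)]
    results = {p: simulate(p) for p in sampled}
    for _ in range(n_rounds):
        candidates = [tuple(rng.uniform(lo, hi) for lo, hi in bounds)
                      for _ in range(200)]
        def uncertainty(c):
            # Squared distance to the nearest point already simulated
            return min(sum((a - b) ** 2 for a, b in zip(c, p))
                       for p in results)
        pick = max(candidates, key=uncertainty)
        results[pick] = simulate(pick)   # spend the budget where it helps
    return results
```

Each round runs exactly one new simulation at the most-informative candidate, so the sampled points spread out to cover the parameter space efficiently -- "the best bang for your buck," in Ozik's phrase.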

Meanwhile, Argonne’s EMEWS acted like a conductor, signaling the genetic and active learning algorithms at the right times and coordinating the large number of simulations on Argonne’s Bebop cluster in its Laboratory Computing Resource Center, as well as on the University of Chicago’s Beagle supercomputer.

Moving beyond medicine

The research team is applying similar approaches to challenges across different cancer types, including colon, breast and prostate cancer.

Argonne’s EMEWS framework can offer insights in areas beyond medicine. Indeed, Ozik and Collier are currently using the system to explore the complexities of rare earth metals and their supply chains. “With this new approach, researchers can use agent-based modeling in more scientifically robust ways,” said Collier.

Credit: 
DOE/Argonne National Laboratory

Diet of traditional Native foods revealed in hair samples

image: University of Alaska Fairbanks research has tied chemical signatures in human hair to the consumption of traditional Yup'ik foods such as these blueberries, as well as fish and marine mammals.

Image: 
Photo courtesy of the UA Museum of the North

University of Alaska Fairbanks researchers have linked specific chemical signatures found in human hair with a diet of traditional Yup'ik foods. The finding could help scientists make connections between diet and long-term health trends in Alaska Native populations.

The study was published this month in the Journal of Nutrition.

The researchers examined the diets of 68 residents in two Southwest Alaska coastal villages. Each resident participated in four extensive dietary interviews and provided hair samples. Scientists analyzed the samples at specific locations along each strand of hair for the ratio between different nitrogen isotopes, which is a potential chemical signal, or biomarker, of diet.

Researchers were able to strongly establish a connection between the biomarker and the consumption of traditional foods like fish and marine mammals. Changes in the biomarker along the hair strand also showed that traditional food intake peaked during the summer months.

The consumption of many traditional foods increases the presence of heavier nitrogen isotopes in the hair, because those isotopes are more abundant in animals that are higher in the food web, such as marine mammals and fish. Scientists have established that general relationship before, but the new results will allow them to more closely pinpoint the amount of traditional foods in a person's diet, said senior author Diane O'Brien, a researcher at UAF's Center for Alaska Native Health Research.

"This study lets us put a number on it - in other words, it lets us translate a biomarker measurement to a specific percent of the diet from traditional foods," O'Brien said. "It seemed to be a very good reflection of all the traditional foods people were eating, even foods like berries and greens that do not have high nitrogen isotope ratios. This is likely because people who consume a lot of traditional foods tend to consume all of them, not just certain ones."

These data are important to scientists because the consumption of traditional foods in Alaska Native diets has been associated with a reduced risk of chronic disease.

Combining the use of biomarkers and surveys provides a more accurate way to measure what people are eating, allowing scientists to better make connections between diet and health. Using biomarkers is relatively cheap and easy, O'Brien said, and is a tool that doesn't rely heavily on people's recollections of their diets.

"Our diets are very complicated, and most of us don't really don't pay close attention to what we eat," she said. "Because of that, there can be a lot of error in dietary data, making it hard to conclusively demonstrate links between diet and health. Having this biomarker gives us a lot more power to demonstrate how traditional foods relate to health in Alaska Native populations."

Credit: 
University of Alaska Fairbanks

Artificial intelligence solution improves clinical trial recruitment

image: A nurse examines a patient in the Emergency Department of Cincinnati Children's, where researchers successfully tested artificial intelligence-based technology to improve patient recruitment for clinical trials. Researchers report test results in the journal JMIR Medical Informatics.

Image: 
Cincinnati Children's

CINCINNATI--Clinical trials are a critical tool for getting new treatments to people who need them, but research shows that difficulty finding the right volunteer subjects can undermine the effectiveness of these studies. Researchers at Cincinnati Children's Hospital Medical Center designed and tested a new computerized solution that used artificial intelligence (AI) to effectively identify eligible subjects from Electronic Health Records (EHRs), allowing busy clinical staff to focus their limited time on evaluating the highest quality candidates.

The study is published online in JMIR Medical Informatics. It shows that compared to manually screening EHRs to identify study candidates, the system--called the Automated Clinical Trial Eligibility Screener© (ACTES)--reduced patient screening time by 34 percent and improved patient enrollment by 11.1 percent. The system also improved the number of patients screened by 14.7 percent and those approached by 11.1 percent.

Busy emergency departments often serve as excellent locations for clinical trial coordinators to find people who may be good study candidates. According to the study's lead investigator, Yizhao Ni, PhD, Division of Biomedical Informatics, ACTES is designed to streamline what is often an inefficient clinical trial recruiting process that doesn't always catch enough qualified candidates.

"Because of the large volume of data documented in EHRs, the recruiting processes used now to find relevant information are very labor intensive within the short time frame needed," said Ni. "By leveraging natural language processing and machine learning technologies, ACTES was able to quickly analyze different types of data and automatically determine patients' suitability for clinical trials."

How it Works

The system uses natural language processing, which allows computers to understand and interpret human language as the system analyzes large amounts of linguistic data. Machine learning allows computerized systems to automatically learn and improve from experience without being explicitly programmed. This makes it possible for computer programs to process data, extract information, and generate knowledge independently.

The automated system extracts structured information such as patient demographics and clinical assessments from EHRs. It also identifies unstructured information from clinical notes, including the patients' clinical conditions, symptoms, treatments and so forth. The extracted information is then matched with eligibility requirements to determine a subject's suitability for a specific clinical trial.
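That matching step can be pictured as rules applied to the extracted fields. The field names and criteria below are hypothetical placeholders, not ACTES's actual schema, and the real system scores candidates with machine learning rather than hard rules; the sketch only shows the structure of eligibility checking.

```python
def eligible(patient, trial):
    """Rule-based eligibility sketch over extracted EHR fields
    (hypothetical field names and criteria, for illustration only)."""
    if not (trial["min_age"] <= patient["age"] <= trial["max_age"]):
        return False
    if trial["required_conditions"] - patient["conditions"]:
        return False                 # a required diagnosis is missing
    if trial["excluded_treatments"] & patient["treatments"]:
        return False                 # a disqualifying treatment is present
    return True

# Illustrative records, not real data
patient = {"age": 9, "conditions": {"asthma"}, "treatments": {"albuterol"}}
trial = {"min_age": 2, "max_age": 17,
         "required_conditions": {"asthma"},
         "excluded_treatments": {"oral steroids"}}
```

Here the nine-year-old with asthma passes all three checks, so the system would surface the patient to a coordinator for the final human evaluation.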

The system's machine learning component also allows it to learn from historical enrollments to improve its future recommendations, according to the researchers. Much of the analyses are handled by carefully designed AI algorithms, essentially procedures or formulas that computers use to solve problems by performing a set sequence of specified actions.

Advanced to Live Clinical Setting

Previously, the system was successfully pilot-tested in a retrospective study published in 2015 in the Journal of the American Medical Informatics Association. The current study tested the solution prospectively and in real time in a busy emergency department environment, where clinical research coordinators recruited patients for six pediatric clinical trials involving different diseases.

Using the technology in a live clinical environment involved significant collaboration between data scientists, application developers, information service technicians and the end users, clinical staff.

"Thanks to the institution's collaborative environment, we successfully incorporated different groups of experts in designing the integration process of this AI solution." Ni said.

Credit: 
Cincinnati Children's Hospital Medical Center

Valleytronics core theory for future high-efficiency semiconductor technology

image: A diagram showing the formation of a valley domain in molybdenum disulphide, a 2D crystal material, and the control of its current signal.

Image: 
DGIST

A DGIST research team has developed a theory that can expand valleytronics, a technology drawing attention as a candidate for next-generation semiconductors. The work is expected to advance valleytronics, a new magnetics-based technology that could surpass existing data-processing speeds, one step further.

DGIST announced on Monday, June 17, that Professor JaeDong Lee's research team at the DGIST Department of Emerging Materials Science discovered the formation of valley domains, which will contribute to next-generation semiconductor performance, the development of anomalous currents, and their control mechanisms. The research is significant in that it discovered and applied the correlations between valley domains and currents, two different physical quantities.

A valley is a vertex or an edge of band energy and is also called valley spin. Valleytronics is the storage and use of information via the quantum number that identifies a valley. It is applicable to future electronic devices and quantum computing because its capacity for quantum information storage surpasses existing charge- or spin-control technologies. Many researchers are pursuing valley control, since valleytronics has vast potential that encompasses spintronics and nanoelectronics in next-generation semiconductor engineering. Until now, however, its practical applicability has been limited by the difficulty of securing stable valleys in sufficient quantity.

Through this research, Professor JaeDong Lee's team solved the stability issue of valley spin by discovering the formation of a valley domain in molybdenum disulfide, a next-generation 2D monolayer semiconductor material. A valley domain is defined as a domain of electrons with the same valley momentum inside a material. The team identified that a valley domain formed in an extreme nanostructure can be used to store information in place of spin. Moreover, the team discovered that it can generate an anomalous transverse current by controlling the size of the valley domain. This current arises inevitably from the movement of a domain wall and flows in only one direction, following the movement of the valley domain. The team also proposed and demonstrated the applicability of a diode mechanism based on a single-crystal nanostructure, unlike existing semiconductor diodes built from heterostructures.

Professor JaeDong Lee of the Department of Emerging Materials Science said, "Through this research, we have discovered a core theory of valleytronics that can exploit two different phenomena, valley magnetism and electric signal control, in a single 2D crystalline material at the same time. We hope that valleytronics research becomes applicable in a wider variety of fields to accelerate the advancement of low-power, high-speed information storage platforms."

Credit: 
DGIST (Daegu Gyeongbuk Institute of Science and Technology)

Does one size fit all? A new model for organic semiconductors

image: This is a representation of carrier mobility in hard inorganic materials (upper figure, band transport) and flexible organic solids (lower figure, flexibility induced transport mechanism).

Image: 
Kazuyuki Sakamoto

Osaka, Japan - Organic materials that can conduct charge have the potential to be used in a vast array of exciting applications, including flexible electronic devices and low-cost solar cells. To date, however, only organic light-emitting diodes (OLEDs) have made a commercial impact, owing to gaps in the understanding of organic semiconductors that have limited improvements in charge carrier mobility. Now an international team including researchers from Osaka University has demonstrated the mechanism of charge mobility in an organic single crystal. Their findings are published in Scientific Reports.

In an effort to improve the charge carrier mobility in organic crystals, significant attention has been focused on understanding how the electronic structure of organic single crystals allows for the movement of charge. Analyzing highly ordered single crystals instead of samples that contain many defects and disorders gives the most accurate picture of how the charge carriers move in the organic material.

The researchers analyzed a single crystal of rubrene, which, owing to its high charge mobility, is one of the most promising organic conducting materials. However, despite the popularity of rubrene, its electronic structure is not well understood. The team found that theory-based conclusions reached in previous work were inaccurate because of molecular vibrations at room temperature, a consequence of the material's flexibility.

"We have demonstrated a new mechanism that is not observed for traditional inorganic semiconductor materials," study corresponding author Kazuyuki Sakamoto explains. "Inorganic semiconductors such as silicon, which are widely used in electronics, are generally hard, inflexible materials; therefore, certain assumptions made for these materials do not translate to organic conducting materials that are more flexible."

The successful preparation of an ultra-high-quality single rubrene crystal allowed experiments to be carried out that provided a definitive comparison with previous data. The experiments highlighted the limitations of earlier assumptions and revealed the influence of other factors, such as electron diffraction and molecular vibrations.

"By reliably demonstrating the room temperature behavior of an organic conducting material and reframing the thinking behind previous conclusions that have been drawn, we have provided a much clearer basis for research going forward," Professor Sakamoto explains. "We hope that this insight will accelerate the development of flexible conducting devices with a wide range of exciting functions."

Credit: 
Osaka University