Culture

Evolution of Wuhan coronavirus (2019-nCoV) and modeling of spike protein for human transmission

image: Evolutionary analysis of the coronaviruses and modeling of the Wuhan coronavirus S-protein interacting with human ACE2. A, Phylogenetic tree of coronaviruses based on full-length genome sequences. B, Amino acid sequence alignment of the RBD domain of coronavirus S-proteins. C, Structural modeling of the Wuhan coronavirus S-protein complexed with the human ACE2 molecule.

Image: 
©Science China Press

The cluster of pneumonia cases in Wuhan City, Hubei Province of China was first reported on December 30, 2019 by the Wuhan Municipal Health Commission. The Centers for Disease Control and Prevention (CDC) later determined and announced that a novel coronavirus (CoV), denoted Wuhan CoV (2019-nCoV), had caused the pneumonia outbreak. The current public health emergency partially resembles the emergence of the SARS outbreak in southern China in 2002, which led to more than 8,000 human infections and 774 deaths. As of January 15, 2020, there were more than 40 laboratory-confirmed cases of the novel Wuhan CoV infection, with one death and multiple exported cases in Japan and Thailand. Under the current public health emergency, it is imperative to understand the origin and native host(s) of the Wuhan CoV, and to evaluate the public health risk this novel coronavirus poses for cross-species or human-to-human transmission.

"To understand the origin of the Wuhan CoV and its genetic relationship with other coronaviruses, we performed phylogenetic analysis on a collection of coronavirus sequences from various sources," said Professor Pei Hao at the Institut Pasteur of Shanghai, the corresponding author of this study. The results indicated that the Wuhan CoV belongs to the genus Betacoronavirus and shares a common ancestor with the SARS/SARS-like coronaviruses (including those that caused the 2002/2003 SARS epidemic). Their common ancestor roughly resembles the bat coronavirus HKU9-1. Based on the evolutionary relationships within Betacoronavirus, bats are likely the native host of the Wuhan CoV, though there may be intermediate host(s) in the transmission cascade from bats to humans.

"Furthermore, to evaluate the health risk of the Wuhan CoV for human transmission, we performed structural modeling of its spike protein, which is critical for coronavirus-host interaction, and evaluated its ability to mediate human infection," said Prof. Hao. The binding free energy of the Wuhan CoV S-protein for human ACE2 (the receptor for human infection) is -50.6 kcal/mol, which is considered a significant binding affinity, though somewhat weaker than that between the SARS-CoV S-protein and ACE2 (-78 kcal/mol). The results point to the important finding that the Wuhan CoV S-protein supports strong interaction with human ACE2 despite its sequence divergence from the SARS-CoV S-protein. "So the Wuhan CoV poses a significant health risk for cross-species or human-to-human transmission through the same infection mechanism as SARS-CoV, i.e., the S-protein-ACE2 interaction," said Prof. Hao.

These important results not only present for the first time evidence of human transmission of the Wuhan novel coronavirus (2019-nCoV), but also indicate that its infection mechanism is the same as that of the SARS virus responsible for the 2002/2003 epidemic, despite their considerable divergence in sequence.

Credit: 
Science China Press

Biodiversity yields financial returns

image: A meadow with more than ten species yields more than a meadow with only one species.

Image: 
Valentin Klaus

Many farmers associate grassland biodiversity with lower yields and financial losses. "Biodiversity is often considered unprofitable, but we show that it can, in fact, pay off," says Nina Buchmann, Professor of Grassland Sciences at ETH Zurich. In an interdisciplinary study at the interface of agricultural sciences, ecology and economics, Buchmann and her colleagues were able to quantify the economic added value of biodiversity based on a grassland experiment that examined different intensities of cultivation. Their paper has just been published in the journal Nature Communications.

Creating higher revenues

"Our work shows that biodiversity is an economically relevant factor of production," says Robert Finger, Professor of Agricultural Economics and Policy at ETH Zurich. If 16 different plant species grow in a field instead of just one, the quality of the forage remains more or less the same, but the yield is higher - which translates directly into higher income from milk sales. "The resultant increase in revenues in our study is comparable to the difference in yield between extensively and intensively farmed land," says Sergei Schaub, lead author of the study and a doctoral student in Finger's and Buchmann's groups.

Switzerland has so-called ecological compensation areas, i.e., grasslands for which farmers pay particular attention to promoting biodiversity. However, these areas often have poor soils and the yields they produce cannot be compared with those of high-quality grassland. Fortunately, the researchers were able to use data from the long-term Jena Experiment, which - among other questions - compared different farming practices at the same site.

"Our results show that biodiversity has an economically positive effect on all areas, regardless of whether farmers mow and fertilise them four times a year or just once," Schaub says. The more intensively the land is farmed, however, the more difficult it becomes to maintain a high level of biodiversity, because only a few plant species can withstand fertilisation and frequent mowing, he notes. Finger adds that Swiss farmers already take more advantage of this economic effect than their counterparts in other countries. Generally speaking, the areas used for forage production in Switzerland are already relatively rich in biodiversity because the seed mixtures are adapted to local conditions, he explains.

Biodiversity as risk insurance

The researchers didn't expect their results to be so conclusive. And there's another economic aspect that they didn't even factor in: "Biodiversity is also a kind of risk insurance," Buchmann says. Diverse grasslands are better able to cope with extreme events such as droughts or floods, she explains, because different plant species react differently to such environmental influences, which partially compensates for any losses that arise. "This means yields become more stable over time," Buchmann says, as the research team demonstrated in other recent studies.

The researchers believe their results are a clear indication that it's worthwhile for farmers to increase the diversity of plants growing on their land. "Preserving or restoring diverse grasslands can be a win-win situation," the researchers note at the end of their paper. Not only because this increases farmers' yields and operating revenues, but also because it improves and promotes important ecosystem services such as pollination or water quality.

Credit: 
ETH Zurich

High levels of PFAS affect immune, liver functions in Cape Fear River striped bass

Researchers from North Carolina State University have found elevated levels of 11 per- and polyfluoroalkyl substances (PFAS) in the blood of Cape Fear River striped bass. Two of those compounds - perfluorooctane sulfonate (PFOS) and Nafion byproduct 2 - are associated with altered immune and liver functions in those fish.

Scott Belcher, associate professor of biology and corresponding author of a paper describing the research, led a team that included NC State colleagues Detlef Knappe, Ben Reading and postdoctoral researcher Theresa Guillette as well as partners from the North Carolina Wildlife Commission and the U.S. Environmental Protection Agency (EPA).

The team isolated serum from the blood of 58 wild-caught Cape Fear River striped bass ranging in age from 2 to 7 years old. In collaboration with EPA researchers Mark Strynar and James McCord, they determined the concentrations of 23 different PFAS chemicals present in the serum using a combination of liquid chromatography and high-resolution mass spectrometry.

"Testing blood levels gives you an idea of the 'body burden' of these particular chemicals," Belcher says. "The levels of these chemicals in the water were measured in parts per trillion, but in the serum of the fish the levels are higher, in parts per billion, demonstrating that they have clearly bioaccumulated in these fish."
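The jump from parts-per-trillion levels in water to parts-per-billion levels in serum can be expressed as a bioaccumulation factor. The sketch below illustrates the unit conversion and ratio with hypothetical numbers; the study's actual measured concentrations are not given here, so the example values are purely illustrative.

```python
# Illustrative bioaccumulation-factor (BAF) calculation. The concentration
# values below are hypothetical, not the study's measurements.

def bioaccumulation_factor(serum_ppb: float, water_ppt: float) -> float:
    """BAF = serum concentration / water concentration, in matching units.

    serum_ppb: fish serum concentration in parts per billion
    water_ppt: water concentration in parts per trillion
    """
    water_ppb = water_ppt / 1000.0  # 1 ppb = 1,000 ppt
    return serum_ppb / water_ppb

# Hypothetical example: 25 ppt in river water, 50 ppb in fish serum
baf = bioaccumulation_factor(serum_ppb=50.0, water_ppt=25.0)
print(f"BAF = {baf:,.0f}x")  # serum level is thousands of times the water level
```

A ppt-to-ppb difference of this kind implies a concentration factor on the order of thousands, which is what makes blood serum a useful readout of body burden.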

The team then compared the blood serum samples from the wild-caught fish to those from a reference population of 29 striped bass raised in an aquaculture facility fed by ground water. "The serum levels of chemicals in the wild-caught bass were 40% higher, on average, than the background levels found in this reference population," Belcher says.

In comparison to the levels of PFAS found in Cape Fear River water, elevated levels of PFOS and Nafion byproduct 2 were found in 100% and 78% of the wild bass samples, respectively. The serum concentrations of these compounds were associated with biomarkers of altered liver enzyme activity and immune function in those fish.

"These PFAS levels are some of the highest recorded in fish," Belcher says, "but one of the most unusual findings here is that smaller or younger fish had the highest levels of these compounds. This points to the fact that PFAS chemicals are very different from other persistent chemicals, like mercury or PCBs. They have unique and very different chemical properties that cause them to bioaccumulate differently, and we're really just beginning to understand why and how they do what they do."

Credit: 
North Carolina State University

Cervical cancer screening saves lives

image: UNM Comprehensive Cancer Center

Image: 
The University of New Mexico

Cervical cancer is the third most common cancer in women worldwide, but most American women can prevent it by being screened with tests that detect human papillomaviruses (HPV).

A new study led by University of New Mexico Comprehensive Cancer Center scientists shows that screening every three years instead of annually prevents most cervical cancers. And of the cancers that are found during routine screenings, most are caught before they've had a chance to spread, making them far easier to treat.

The results of the study were published ahead of print in the December online edition of the International Journal of Cancer. UNM Regents' Professor Cosette Wheeler, PhD, led the study and says, "Cancer screening works, and the vast majority of women who get cervical cancer simply don't get screened at all, or wait too long between screens."

Wheeler and her team worked with the New Mexico Tumor Registry to link their information with that of the New Mexico HPV Pap Registry. The state's Tumor Registry records all cases of cancer and all deaths due to cancer in the state. The HPV Pap Registry records all cervical cancer screening results, which include Pap and HPV tests, and all procedures to diagnose and treat cervical precancers - abnormalities that have not yet turned cancerous.

Combining data from the two statewide public health information systems provided a unique ability to understand the screening histories of women who developed cervical cancer throughout New Mexico. "This capacity is not available elsewhere," Wheeler says. "It serves as a model information system for cancer prevention in the United States."

Previous studies, Wheeler says, have used data from a single health care system, and often from the same insurer. The New Mexico data, however, include all information from the entire state, regardless of the women's insurance provider, insurance coverage, health care provider and location.

Wheeler's team included national and international experts and postdoctoral, graduate and undergraduate students who study health care delivery to improve cancer prevention across New Mexico. In this study, the screening records of each woman who was diagnosed with cervical cancer were compared with those of a control group of five New Mexican women without cervical cancer. The diagnosed women and the women in the control groups were matched on age, race, ethnicity and rural or urban geographic area.

Wheeler's team found that 61% of women in the control groups had been screened within the previous three years, but only 38% of the women with cervical cancer had been screened in the same period prior to their cancer diagnosis.
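The 61% versus 38% screening rates can be turned into a crude measure of association. The sketch below computes an unmatched odds ratio from those two percentages; note this is only an illustration, since the study itself matched each case to five controls and would use a conditional (matched) analysis.

```python
# Crude (unmatched) odds ratio for recent screening, using the percentages
# reported in the article: 38% of cancer cases vs 61% of controls had been
# screened within the prior three years. Illustrative only; the study's
# matched design calls for a conditional analysis.

def odds(p: float) -> float:
    """Convert a probability to odds."""
    return p / (1.0 - p)

def odds_ratio(p_exposed_cases: float, p_exposed_controls: float) -> float:
    """Odds of exposure (screening) in cases divided by odds in controls."""
    return odds(p_exposed_cases) / odds(p_exposed_controls)

or_screening = odds_ratio(0.38, 0.61)
print(f"crude OR = {or_screening:.2f}")  # < 1: screening associated with lower cancer odds
```

An odds ratio well below 1 is consistent with the article's message that recent screening is strongly associated with avoiding a cancer diagnosis.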

The researchers also compared the medical histories of women diagnosed with cervical cancer. Those who had been screened in the three years prior to diagnosis were half as likely to be diagnosed with localized cervical cancer as those who hadn't been screened. They were also 83% less likely to be diagnosed with cervical cancer that had spread.

"Screening is super important for catching cancers before they have spread," Wheeler says.

The team also showed that women who received a negative screening test were very unlikely to be diagnosed with cervical cancer in the following three-and-a-half to five years. Many HPV infections resolve naturally, Wheeler says, but the immune system needs time to act. She and her team found that more frequent screening offered no additional benefit.

"The value of a negative screen is huge," Wheeler says. "If you screen, we can show that screening prevents more than 80% of distant cancer and about 50% of local cancer. And the local cancer is easily treatable."

This study, she says, gives real-world evidence to assure New Mexican women and their health care providers that screening for cervical cancer every three years safely finds cancer early and that screening more frequently has no additional benefit.

In the end, the biggest problem Wheeler sees is that the United States has no organized way to remind women when the time comes for their three-year screening. "We need to fix this [lack of a central reminder system]," Wheeler says. "New Mexico can take the lead."

Credit: 
University of New Mexico Health Sciences Center

Study resurrects mammoth DNA to explore the cause of their extinction

A new study in Genome Biology and Evolution, published by Oxford University Press, resurrected the mutated genes of the last herd of woolly mammoths and found that their small population had developed a number of genetic defects that may have proved fatal for the species.

Woolly mammoths were one of the most abundant cold-adapted species on Earth before the end of the Pleistocene (~11,700 years ago). The end of this period was marked by dramatic climate fluctuations that eventually gave way to the Holocene, which saw the near-complete loss of the cold, dry steppe-tundra (also known as the mammoth steppe). This change caused the extinction of many species, including cave bears, cave hyenas, the woolly rhinoceros, and the continental population of woolly mammoths.

An isolated population of mammoths on St. Paul Island later disappeared due to rising sea levels and a lack of freshwater, leaving only a small population of woolly mammoths (~300-500 individuals) on Wrangel Island, a remote Arctic refuge off the coast of Siberia. This final herd died out some 5,000 years ago, though the cause of its extinction has long been a mystery.

The researchers built upon prior studies that identified potentially harmful genetic mutations in the Wrangel Island mammoths. Comparing the DNA of a Wrangel Island mammoth to that of three Asian elephants and two more ancient mammoths, the study found a number of unique genetic mutations. The genes carrying these mutations are involved in a range of important functions, including neurological development, male fertility, insulin signalling, and sense of smell.

The final cause of the extinction of the Wrangel Island mammoths is still a mystery, but it is clear that, due to a decline in their population, they suffered from a medley of genetic defects that may have hindered their development.

"We know how the genes responsible for our ability to detect scents work," says Vincent J. Lynch, an assistant professor of biological sciences at the University at Buffalo. "So we can resurrect the mammoth's version, make cells in culture produce the mammoth gene, and then test whether the protein functions normally in cells. If it doesn't - and it didn't - we can infer that Wrangel Island mammoths were probably unable to smell the flowers that they ate."

Credit: 
Oxford University Press USA

Microscopic eye movements vital for 20/20 vision

Visual acuity--the ability to discern letters, numbers, and objects from a distance--is essential for many tasks, from recognizing a friend across a room to driving a car.

Researchers previously assumed that visual acuity was primarily determined by the optics of the eye and the anatomy of the retina. Now, researchers from the University of Rochester--including Michele Rucci, a professor of brain and cognitive sciences, and Janis Intoy, a neuroscience graduate student at Boston University and a research assistant in Rucci's lab in Rochester--show that small eye movements humans aren't even aware of making play a large role in humans' visual acuity. The research, published in the journal Nature Communications, may lead to improved treatments and therapies for vision impairments.

Unlike a stationary camera that takes a fixed photograph of the world, human eyes are constantly moving, taking in new pieces of a visual scene and continually changing the visual input to the retina.

"Humans are normally not aware that their eyes are always in motion, even when attempting to maintain a steady gaze on a point," Intoy says.

These gaze shifts, known as fixational eye movements, were once thought to be inconsequential because they are so small. But, they are large on a microscopic level, relative to the size of cells in the retina, and they shift the image across many receptors. Rucci and the members of his lab have progressively shown that these movements are critical to processes in the visual system.

In order to determine whether or not fixational eye movements affect visual acuity, Rucci and Intoy studied how these tiny eye movements affect a person's performance on one of the most common assessments of visual acuity: the Snellen eye chart. The Snellen eye chart consists of 11 lines of block letters in which each line displays an increasing number of letters in decreasing sizes. During a vision test, a person is asked to read the letters in the lines. If a person has normal visual acuity--20/20 vision--she is able to read to at least line eight on the chart from a distance of 20 feet.
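The article's line numbers map onto the conventional Snellen chart layout (line 8 = 20/20, and, as noted later, line 6 = 20/30). As a sketch, the mapping below assumes that standard layout; actual charts can vary.

```python
# Conventional Snellen chart: line number -> acuity denominator at 20 feet.
# Assumed standard layout, consistent with the article's line 8 = 20/20
# and line 6 = 20/30; real charts may differ.

SNELLEN_LINES = {
    1: 200, 2: 100, 3: 70, 4: 50, 5: 40,
    6: 30, 7: 25, 8: 20, 9: 15, 10: 13, 11: 10,
}

def acuity(line: int) -> str:
    """Snellen notation ('20/D') for the smallest line read at 20 feet."""
    return f"20/{SNELLEN_LINES[line]}"

def decimal_acuity(line: int) -> float:
    """Decimal acuity: 20 / denominator (1.0 corresponds to 20/20)."""
    return 20 / SNELLEN_LINES[line]

print(acuity(8))   # 20/20 (normal acuity)
print(acuity(6))   # 20/30 (decimal acuity ~0.67)
```

On this mapping, reading "two lines further" (e.g., from line 6 to line 8) is the difference between 20/30 and 20/20 vision, which is the size of the effect the study attributes to fixational eye movements.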

Poor outcomes on the Snellen eye chart test are commonly attributed to defects in optical, structural, and/or physiological properties of the eye; eye movements are rarely considered. Rucci and Intoy found, however, that fixational eye movements are key contributors to 20/20 vision. In fact, even though humans are not aware of making them, these eye movements are finely controlled and can allow people to read at least two lines further on the Snellen eye chart versus when eye movements are absent or impaired.

In order to measure visual acuity in the absence of fixational eye movements, the researchers stabilized the eye chart on observers' retinas by continually updating the display according to the eye movements, counteracting the movements' effects. That is, unlike during normal viewing, when the visual input changes with eye movements, Intoy and Rucci ensured that the image of the eye chart remained stationary on the retina. This led to a drastic reduction in visual acuity; the observers, who normally had 20/20 vision, were on average now only able to read to approximately line six of the Snellen eye chart, to the line indicating 20/30 vision.
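The stabilization procedure described above is a gaze-contingent display: each frame, the stimulus is shifted by exactly the eye's own displacement, so its image stays fixed on the retina. The sketch below illustrates that idea; the function names and gaze samples are hypothetical, not the authors' actual apparatus code.

```python
# Minimal sketch of gaze-contingent retinal stabilization: shift the
# stimulus by the measured gaze offset each frame so the retinal image
# stays constant. Names and gaze samples are hypothetical.

def stabilized_position(base_xy, gaze_xy, fixation_xy):
    """Displace the stimulus by the gaze offset from the initial fixation."""
    dx = gaze_xy[0] - fixation_xy[0]
    dy = gaze_xy[1] - fixation_xy[1]
    return (base_xy[0] + dx, base_xy[1] + dy)

fixation = (0.0, 0.0)   # initial fixation point (degrees of visual angle)
chart_at = (0.0, 0.0)   # stimulus position when gaze is at fixation
gaze_samples = [(0.0, 0.0), (0.05, -0.02), (0.08, 0.01)]  # tiny drift (deg)

for gaze in gaze_samples:
    pos = stabilized_position(chart_at, gaze, fixation)
    # Retinal location = stimulus position - gaze position; under
    # stabilization it stays constant, unlike in normal viewing.
    retinal = (pos[0] - gaze[0], pos[1] - gaze[1])
    print(pos, retinal)
```

Because the retinal coordinate never changes under this update rule, the image motion that fixational eye movements normally produce is eliminated, which is what degraded acuity from 20/20 to roughly 20/30 in the study.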

"We found that achieving 20/20 vision is not only the outcome of good optics and a healthy retina but also fine motor control, to a level that eludes awareness," Rucci says. "Impairment in visual acuity may originate from eye movements, a factor that is presently not monitored at all."

Because of the large role fixational eye movements play in visual acuity, doctors should carefully consider and examine these movements in people with impaired visual acuity, such as myopia (nearsightedness) and hyperopia (farsightedness).

"Eye movement disorders and visual impairments often coexist in some conditions as well," Intoy says. "For example, poor fixational control is often found in patients with visual impairments like dyslexia, and visual impairments are often present in patients with motor abnormalities like Parkinson's disease."

In addition to shedding new light on the fundamental mechanisms involved in high-acuity vision in humans, these findings suggest that methods based on oculomotor training and motor rehabilitation may help improve visual acuity. These therapies could include having patients with motor disorders practice holding their gaze on stationary objects and practice precisely shifting their gaze between nearby objects.

"Therapies involving fixational eye movements generally wouldn't help compensate for deficits in the optics, structure, and physiological features of the eye," Intoy says. "It is presently unclear whether abnormal eye movements may be the cause of such deficits (or vice versa), but our study now indicates this is indeed possible. If eye movements and the properties of the eye are interrelated in this way, then therapies involving eye movements may be helpful in these cases."

Credit: 
University of Rochester

Novel techniques for mining patented gene therapies offer promising treatment options

image: A team of scientists from Purdue University and institutions around the world has come together to better understand the growing number of worldwide patented innovations available for gene therapy treatment.

Image: 
Marxa Figueiredo/Purdue University

WEST LAFAYETTE, Ind. - The global gene therapy market is expected to reach $13 billion by 2024 as new treatment options target cancers and other diseases.

Now, a team of scientists from Purdue University and other research institutions around the world has come together to better understand the growing number of worldwide patented innovations available for gene therapy treatment. They focus specifically on nonviral methods, which use synthetic or natural compounds or physical forces to deliver therapeutic material and are generally less toxic than their viral counterparts.

"The possibility of using nonviral vectors for gene therapy represents one of the most interesting and intriguing fields of gene therapy research," said Marxa Figueiredo, an associate professor of basic medical sciences in Purdue's College of Veterinary Medicine, who helped lead the research team and works with the Purdue Research Foundation Office of Technology Commercialization to patent her technologies related to health. "This is an innovative method for identifying the technological routes used by universities and companies across the world and uncovering emerging trends for different gene therapy sectors."

The scientists used big data techniques, mining patent and clinical data to identify technological trends in the gene therapy field. The team's work is presented in the Feb. 7 edition of Nature Biotechnology. They envision that their analysis will help guide future developments in gene therapy.

This work brought together investigators from across the globe in a joint effort to use new databases and methods to better understand the trends of the gene therapy field in respect to nonviral vectors. Dimas Covas, coordinator of the Center for Cell-based Therapy, affiliated with the University of São Paulo in Brazil, lent his extensive experience in cell therapy. Aglaia Athanassiadou, Virginia Picanço-Castro and Figueiredo contributed their extensive experience on nonviral vectors for gene therapy. Cristiano Pereira and Geciane Porto brought their expertise in economics and business administration to the analyses. Each contribution was fundamental to achieving a new way to identify technological trends in this field.

"This work brought together investigators from very diverse disciplines to create a different perspective of the gene therapy field," Figueiredo said. "Our groups continue to work individually or in collaboration to generate and patent new vectors to help fill the needs of this re-emerging field of nonviral gene therapy."

Credit: 
Purdue University

New platform for composing genetic programs in mammalian cells

A new synthetic biology toolkit developed at Northwestern University will help researchers design mammalian cells with new functionalities.

The toolkit, called the Composable Mammalian Elements of Transcription (COMET), includes an ensemble of synthetic transcription factors and promoters that enable the design and tuning of gene expression programs in a way not previously possible. The result could be new therapies for difficult-to-treat diseases, like cancer.

"Our long-term goal is enabling bioengineers to build customizable cell-based therapies, and boiling a design goal down into a genetic program requires suitable biological parts. Building this COMET toolkit was an important step toward enabling us to truly design new functions in mammalian cells," said Josh Leonard, associate professor of chemical and biological engineering at Northwestern University's McCormick School of Engineering, who led the research. He is also a member of Northwestern's Center for Synthetic Biology.

The results were published February 7 in the journal Nature Communications.

Creating new technology to tune cells

Synthetic biology researchers look to reprogram cells by changing their DNA to give them new functionality. While researchers have had success reprogramming the DNA of bacterial cells to create new therapeutics and chemicals, mammalian cells are currently more difficult to modify because of their more complicated underlying biology.

Powerful tools like CRISPR-Cas9 can edit single genes within these cells, but they do not allow researchers to readily create more nuanced, sophisticated functionality, which requires introducing and often fine-tuning novel genetic networks.

Leonard and his team are interested in developing cell-based therapies -- like reprogrammed cells that find tumors in the body and treat cancer at the sites of disease -- but realized that they needed to develop a toolkit to construct many of the functionalities that could be most useful. Recent advances in the field have paved the way for identifying desirable therapeutic functions, even within the context of such a complicated system, and the key challenge was figuring out how to modify a cell to carry out those tasks.

"The therapies we develop require sophisticated technologies," said Patrick Donahue, a graduate student in Leonard's lab and first author of the paper. "There was an opportunity to develop one such technology to expand the functions we can implement in a mammalian cell. So, as I was starting my PhD, we sat down and asked, 'What characteristics would we want in a transcription engineering toolkit?'"

A new library to advance the field

The group then worked to develop a library of promoters and transcription factors -- proteins that regulate how DNA is copied into RNA -- that enable the design and tuning of gene expression. The authors characterized these components and how they work together to enable precise tuning of gene expression levels. They also developed a mathematical model that explains how the system works.

"For a synthetic biology technology, having a mathematical model is essential for enabling reusable and predictable modules that other researchers can apply and expand upon. We model this principle in the COMET system," said Neda Bagheri, who collaborated with Leonard. Bagheri is an associate adjunct professor of chemical and biological engineering at Northwestern and a Distinguished Washington Research Foundation Investigator at the University of Washington, Seattle.

Now, Leonard and his team are working to use the platform to build biological systems that can carry out sophisticated functions, like delivering therapies directly to tumors. That involves programming a cell to be able to evaluate its environment to determine whether tissue is healthy or cancerous.

A key mission for the group is making this new technology readily available to other groups, so that others can both expand COMET and use it to further research across multiple fields. The prepublication paper on bioRxiv has already received much attention, and the biological parts will be distributed as a kit by Addgene.

"COMET will enable researchers to test hypotheses that weren't otherwise possible to test, helping us both build useful biotechnologies and improve our understanding of complicated processes, like immune function or development. In emerging technical fields like synthetic biology, creating technology platforms is vital for sparking innovation. We're excited to see what COMET will enable our community to do next," Leonard said.

Credit: 
Northwestern University

Seeing blue after the little blue pill: Visual disturbances in Viagra users

Sildenafil is commonly used to treat erectile dysfunction and is generally regarded as safe with limited side effects. However, a recent study in Frontiers in Neurology has highlighted the risk of persistent visual side-effects, such as light sensitivity and color vision impairment, in men who have taken the highest recommended dose of Viagra. While these effects appear to be rare, the research suggests that first-time Viagra users should start with a lower dose before increasing it, if necessary.

Erectile dysfunction can have significant psychological consequences for men who are affected by it, and it can make fulfilling sexual relationships more difficult to achieve. Sildenafil, more commonly known by its tradename Viagra, became available in 1998 as a treatment for erectile dysfunction. It soon became the fastest selling drug in history, demonstrating the phenomenal demand for treatments that enhance sexual performance.

Originally developed as a treatment for high blood pressure, the drug dilates blood vessels and relaxes smooth muscle in the penis, making it easier to achieve and maintain an erection. The effects of the drug normally last 3-5 hours and although side-effects such as headache and blurred vision occasionally occur, they usually disappear relatively quickly.

However, Dr. Cüneyt Karaarslan of the Dünyagöz Adana hospital in Turkey, noticed a pattern in 17 male patients who attended the hospital. In the new study, Karaarslan reports that the patients suffered numerous visual disturbances, including abnormally dilated pupils, blurred vision, light sensitivity, and color vision disturbances, which included intensely blue colored vision with red/green color blindness.

All 17 patients had taken sildenafil for the first time, and all took the highest recommended dose of 100 mg. None of the men had been prescribed the medication. The visual side-effects began once the drug took effect, and were still present when the men arrived at the clinic 24-48 hours later.

The doctors in the clinic conducted various eye tests and monitored the patients over time to see how their symptoms developed. Fortunately, in all 17 patients the symptoms had cleared up by 21 days later, but this was doubtless a difficult experience for the men involved.

"Many men use non-prescription performance enhancing drugs to help with sexual anxiety and erectile dysfunction," said Karaarslan. "For the vast majority of men, any side-effects will be temporary and mild. However, I wanted to highlight that persistent eye and vision problems may be encountered for a small number of users."

So, why were these men susceptible to such long-lived side-effects? One possibility is that a small subsection of the population does not break down and eliminate sildenafil efficiently, leading to much higher blood concentrations than in most users.
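The intuition behind that explanation can be sketched with a toy one-compartment pharmacokinetic model. All numbers below (dose, volume of distribution, half-lives) are illustrative assumptions, not clinical values:

```python
import math

def concentration(dose_mg, vd_l, half_life_h, t_h):
    """Plasma concentration (mg/L) in a one-compartment model with
    first-order elimination: C(t) = (dose / Vd) * exp(-k * t)."""
    k = math.log(2) / half_life_h        # elimination rate constant (1/h)
    return (dose_mg / vd_l) * math.exp(-k * t_h)

# Hypothetical numbers for illustration only: a 100 mg dose, a 105 L
# volume of distribution, a typical ~4 h half-life, and a slow
# metabolizer whose effective half-life is three times longer.
typical = concentration(100, 105, 4, 24)   # level 24 h after dosing
slow = concentration(100, 105, 12, 24)
print(f"typical metabolizer at 24 h: {typical:.3f} mg/L")
print(f"slow metabolizer at 24 h:    {slow:.3f} mg/L")
```

In this sketch the slow metabolizer still carries over ten times the typical drug level a day after dosing, which is consistent with side-effects that persist when the men arrived at the clinic 24-48 hours later.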

These men also took the highest recommended dose of sildenafil on their first time taking the drug. Starting with a lower dose may have meant less severe side-effects. In addition, taking the drug under medical supervision would likely have meant that the men would not have used such a high dose on their first time.

So, if you are struggling with erectile dysfunction, should you be worried about trying Viagra? In short, no. Such persistent side-effects appear to be very rare. However, it is always best practice to consult your physician first; it may be best not to start at the highest dose, and if you suspect you are particularly sensitive, consider first using the drug under medical supervision.

"Although these drugs, when used under the control of physicians and at the recommended doses, provide very important sexual and mental support, uncontrolled and inappropriate doses should not be used or repeated," said Karaarslan.

Credit: 
Frontiers

Mystery of marine recycling squad solved

image: Pictures of ammonia-oxidizing Archaea and nitrite-oxidizing Nitrospinae: The picture on the left shows the abundance of ammonia-oxidizing Archaea (green) and other microorganisms (blue). The picture on the right shows the abundance of nitrite-oxidizing Nitrospinae (green) and other microorganisms (blue). The differences in abundance and size are clearly visible.

Image: 
Max Planck Institute for Marine Microbiology/ K. Kitzinger

One is missing - that, in short, was the mystery. It concerns nitrification, the oxidation of ammonia via nitrite to nitrate, a key process in marine nitrogen cycling. In the sea, both steps of this process are balanced, and most available nitrogen exists in the form of nitrate, the final product of nitrification. The organisms largely responsible for the first step of nitrification in the ocean - the ammonia-oxidizing archaea - were discovered around a decade ago, and they turn out to be amongst the most abundant microorganisms on the planet.

The second step of nitrification, the transformation of nitrite to nitrate, is carried out by nitrite-oxidizing bacteria, which mainly belong to the phylum Nitrospinae. Yet Nitrospinae are ten times less abundant than the ammonia-oxidizers, raising the question: is there an equally abundant, still undiscovered nitrite oxidizer in the ocean?

Grow fast, die young

Scientists at the Max Planck Institute for Marine Microbiology have now solved this mystery in cooperation with colleagues from the University of Vienna, the University of Southern Denmark and the Georgia Institute of Technology. "We show that there is no need to invoke yet-undiscovered, abundant nitrite oxidizers to explain nitrification in the ocean. Surprisingly, we probably already know all the players," says Katharina Kitzinger, first author of the paper, published in the scientific journal Nature Communications in February.

So far, scientists had mainly determined the number of microbes involved in marine nitrification. Katharina Kitzinger and her colleagues, however, also examined the biomass of the microorganisms, as well as the growth rates and the activity of individual cells. The results reveal that the ten-times-higher abundance of ammonia-oxidizers is not due to differences in the size of the microorganisms or to slow growth of Nitrospinae, as many scientists had supposed.

"On the contrary. Our results indicate that Nitrospinae are much more active and grow much faster than the ammonia-oxidizing Archaea. Thus, Nitrospinae are clearly more efficient than the Archaea," explains Katharina Kitzinger and adds: "As such, one would expect the Nitrospinae to be significantly more abundant. As this is not the case we assume that Nitrospinae have such a low abundance because they have a high mortality rate. This explains the balanced marine nitrification process in the ocean and makes the existence of further unknown, abundant nitrite oxidizers unlikely."

Nitrogen and food for friends

At the same time, the researchers investigated which nitrogen compounds ammonia-oxidizing Archaea and Nitrospinae use for their cell growth. "While the Archaea almost exclusively grow using ammonium, the Nitrospinae seem to mainly use organic nitrogen, namely urea and cyanate, instead," says Katharina Kitzinger. "The utilization of organic nitrogen is likely key to the ecological success of Nitrospinae, as it allows them to avoid competition with their friends, the Archaea, on whom they depend for nitrite." In this way, the two microorganisms help each other: the Archaea produce the nitrite that the Nitrospinae need, while the Nitrospinae presumably release some ammonium after taking up organic nitrogen, in turn providing the energy source for the Archaea - a symbiotic win-win situation.

The scientists acquired their samples in the Gulf of Mexico, where the process of nitrification is very important due to the high nutrient input from rivers like the Mississippi. "The microorganisms involved in nitrification and their relative abundances are similar worldwide," says Katharina Kitzinger. "Therefore, it is very likely that our results are also valid for the rest of the ocean."

Credit: 
Max Planck Institute for Marine Microbiology

International team delivers research breakthrough for leading cause of blindness

Researchers have identified a new protein linked to age-related macular degeneration (AMD) that could offer new hope for the diagnosis and treatment of the disease, which affects over 1.5 million people in the UK alone.

The research team, made up of scientists from Queen Mary University of London, the University of Manchester, Cardiff University, and Radboud University Medical Center, Nijmegen, found significantly higher levels of a protein called factor H-related protein 4 (FHR-4) in the blood of AMD patients.

Further investigation, using eye tissue donated for medical research, showed the presence of the FHR-4 protein within the macula - the specific region of the eye affected by the disease.

The results of this study open up new routes for early diagnosis, by measuring FHR-4 levels in the blood, and suggest that therapies targeting this protein could provide promising future treatment options for the disease.

FHR-4 regulates the complement system, part of the immune system, which plays a critical role in inflammation and the body's defence against infection.

Previous studies have linked the complement system to AMD showing that genetically inherited faults in key complement proteins are strong risk factors for the condition.

In this study, the researchers used a genetic technique, known as a genome-wide association study, to identify specific changes in the genome related to the increased levels of FHR-4 found in AMD patients.

They found higher blood FHR-4 levels were associated with changes to genes that code for proteins belonging to the factor H family, which clustered together within a specific region of the genome. The identified genetic changes also overlapped with genetic variants first found to increase the risk of AMD over 20 years ago.

Together, the findings suggest that inherited genetic changes can lead to higher blood FHR-4 levels, which results in uncontrolled activation of the complement system within the eye and drives disease.

Blood levels of FHR-4 were measured in 484 patients and 522 age-matched control samples using two independent, established collections of AMD patient data. These were the Cambridge AMD study, led by Professor Anthony Moore from Moorfields Eye Hospital and UCL Institute of Ophthalmology (now at the University of California San Francisco) and Professor John Yates from Cambridge University, and the European Genetic Database (EUGENDA), led by Professor Anneke den Hollander and Professor Carel Hoyng from Radboud University Medical Center.
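A hedged sketch of the kind of case-control comparison described above, using simulated data. Only the sample sizes (484 patients, 522 controls) come from the text; the FHR-4 level distributions and the effect size are invented for illustration:

```python
import random
from statistics import mean, stdev

random.seed(1)

# Simulated blood FHR-4 levels (arbitrary units); the distributions
# and effect size are assumptions for illustration only.
cases = [random.gauss(6.0, 1.5) for _ in range(484)]     # AMD patients
controls = [random.gauss(5.0, 1.5) for _ in range(522)]  # matched controls

# Standardized difference between group means (Welch-style z score)
diff = mean(cases) - mean(controls)
se = (stdev(cases) ** 2 / len(cases)
      + stdev(controls) ** 2 / len(controls)) ** 0.5
z = diff / se
print(f"mean difference = {diff:.2f} units, z = {z:.1f}")
```

With samples of this size, even a modest true difference in mean levels yields a very large z score, which is why cohorts of several hundred patients can detect an elevated protein reliably.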

There are two main types of AMD - 'wet' AMD and 'dry' AMD. Whilst some treatment options exist for 'wet' AMD, there is currently no available treatment for 'dry' AMD.

Dr Valentina Cipriani, who jointly led the statistical data analysis with Dr. Laura Lorés-Motta from the Radboud University Medical Center and is an expert in ophthalmic statistical genetics at Queen Mary University of London, and member of the International AMD Genomics Consortium (IAMDGC), said: "By unveiling FHR-4 as a novel, key molecular player for AMD, our study was able to dissect further the genetic disease predisposition at the factor H region. This is one of the most established genetic associations in the field of complex genetics. We hope our findings will accelerate interest from the wider research community in the involvement of the complement system in AMD, with the ultimate goal of uncovering the role of the whole 'complementome' in the disease."

Professor Simon Clark, a specialist in the regulation of the complement system in health and disease at the University of Manchester, said: "This study really is a step-change in our understanding of how complement activation drives this major blinding disease. Up until now, the role played by FHR proteins in disease has only ever been inferred. But now we show a direct link and, more excitingly, become a tangible step closer to identifying a group of potential therapeutic targets to treat this debilitating disease."

Professor Paul Bishop, an ophthalmologist and AMD expert at the University of Manchester, said: "The combined protein and genetic findings provide compelling evidence that FHR-4 is a critical controller of that part of the immune system which affects the eyes. Apart from improving understanding of how AMD is caused, this work also provides a way of predicting risk of the disease by simply measuring blood levels of FHR-4 and also provides a new route to treatment by reducing the blood levels of FHR-4 to restore immune system function in the eyes."

Professor Paul Morgan, an expert in complement biology at Cardiff University, and leader in the development of the antibodies and assays that underpinned this work said: "The collaboration between experts in complement biology, eye disease and genetics across Europe has enabled the accumulation of a robust body of evidence that genetically dictated FHR-4 levels in plasma are an important predictor of risk of developing AMD. The unique antibodies and assays we have developed have potential not only for contributing to risk prediction but also to new ways of treating this common and devastating disease."

Credit: 
Queen Mary University of London

Protein closely linked to commonest cause of blindness

An international team of scientists has identified a protein which is strongly linked to the commonest cause of blindness in developed countries when its levels are raised in the blood.

The discovery is a major step forward in the understanding of age-related macular degeneration, which affects 1.5 million people in the UK alone.

The study, carried out by a team from the Universities of Manchester, Cardiff, London and Nijmegen, and Manchester Foundation NHS Trust, is published in Nature Communications.

The major funder was the Medical Research Council.

The protein, called FHR4, was found by the team to be present at higher levels in the blood of patients with AMD compared to individuals of a similar age without the disease.

The findings were confirmed in 484 patient and 522 control samples from two independent collections across Europe.

Analyses of eyes donated for research after life also revealed that the FHR4 protein was present in the AMD-affected parts of the eye.

FHR4 was shown by the team to activate part of the immune system called the complement system; overactivation is a major causal factor in AMD.

FHR4 is one of a group of proteins that regulate the complement system and the genes encoding these proteins are tightly clustered on chromosome 1, the largest human chromosome.

When the team investigated a set of genetic variants across the human genome, they found that variants in this region of chromosome 1 determined the levels of FHR4 in the blood - and that the same variants were associated with AMD.

Professor Paul Bishop and Professor Simon Clark, from the University of Manchester were part of the leadership team on the study.

Professor Bishop, who is also a Consultant Ophthalmologist at Manchester Royal Eye Hospital, said: "The combined protein and genetic findings provide compelling evidence that FHR4 is a critical controller of that part of the immune system which affects the eyes.

"We have shown that genetically determined higher blood FHR4 levels leads to more FHR4 in the eye which in turn increases the risk of the uncontrolled immune system response that drives the disease.

"So apart from improving understanding of how AMD is caused, this work provides a way of predicting risk of the disease by simply measuring blood levels of FHR4.

He added: "It also provides a new route to treatment by reducing the blood levels of FHR4 to restore immune system function in the eyes.

"Because treatments options for AMD are limited, this comprehensive understanding of the biology of AMD is a huge boost for scientists finding answers to a problem which causes untold misery for thousands of people in the UK alone."

Professor Simon Clark, a specialist in the regulation of the complement system in health and disease said: "This study really is a step-change in our understanding of how complement activation drives this major blinding disease.

"Up until now, the role played by FHR proteins in disease has only ever been inferred. But now we show a direct link and, more excitingly, become a tangible step closer to identifying a group of potential therapeutic targets to treat this debilitating disease."

Credit: 
University of Manchester

New commuter concern: Cancerous chemical in car seats

image: Study participant wearing the silicone wristband used to track TDCIPP.

Image: 
David Volz/UCR

The longer your commute, the more you're exposed to a chemical flame retardant that is a known carcinogen and was phased out of furniture use because it required a Proposition 65 warning label in California.

That is the conclusion of a new UC Riverside study published this month in the journal Environment International.

While much research on automobile pollution focuses on external air pollutants entering vehicle interiors, this study shows that chemicals emanating from inside your car could also be cause for concern.

Though there are other Proposition 65-list chemicals that are typically used in the manufacture of automobiles, this flame retardant is a new addition to the list. Known as the Safe Drinking Water and Toxic Enforcement Act, Proposition 65 requires the state to maintain and update a list of chemicals known to cause cancer or reproductive harm.

Some scientists assumed that humans stopped being exposed to the chemical, called TDCIPP or chlorinated tris, after it was placed on California's Proposition 65 list in 2013. However, it is still widely used in automobile seat foam. The study shows that not only is your car a source of TDCIPP exposure, but that less than a week of commuting results in elevated exposure to it.

David Volz, associate professor of environmental toxicology at UCR, said the results were unexpected.

"I went into this rather skeptical because I didn't think we'd pick up a significant concentration in that short a time frame, let alone pick up an association with commute time," Volz said. "We did both, which was really surprising."

Over the past decade, Volz has studied how various chemicals affect the trajectory of early development. Using zebrafish and human cells as models, the Volz laboratory has been studying the toxicity of a newer class of flame retardants called organophosphate esters since 2011.

Little is known about the toxicity of these organophosphate esters -- TDCIPP is one of them -- but they've replaced older flame-retardant chemicals that lasted longer in the environment and took longer to metabolize.

Using zebrafish as a model, Volz found TDCIPP prevents an embryo from developing normally. Other studies have reported a strong association between TDCIPP and infertility among women undergoing fertility treatments.

Knowing its use is still widespread in cars, Volz wondered whether a person's exposure is elevated based on their commute. UC Riverside undergraduates made for excellent study subjects, as a majority of them have a daily commute.

The research team included collaborators at Duke University and was funded by the National Institutes of Health as well as the USDA National Institute of Food and Agriculture.

Participants included around 90 students whose round-trip commute times ranged from less than 15 minutes to more than two hours. All were given silicone wristbands to wear continuously for five days.

The molecular structure of silicone makes it ideal for capturing airborne contaminants. Because TDCIPP isn't chemically bound to the foam, said Aalekyha Reddam, a graduate student in the Volz laboratory, it gets forced out over time and ends up in dust that gets inhaled.

Multiple organophosphate esters were tested, but TDCIPP was the only one that showed a strong positive association with commute time.

"Your exposure to TDCIPP is higher the longer you spend in your vehicle," Reddam said.

While Volz and his team did not collect urine samples to verify that the chemical migrated into the bodies of the participants, they believe that's what happened.

"We presume it did because of how difficult it is to avoid the ingestion and inhalation of dust," Volz said. Additionally, other studies have examined the accumulation of TDCIPP in urine, but not as a function of how long a person sits in a car.

Going forward, the research team would like to repeat the study with a larger group of people whose ages are more varied. They would also like to study ways to protect commuters from this and other toxic compounds.

Until more specific reduction methods can be identified, the team encourages frequently dusting the inside of vehicles, and following U.S. Environmental Protection Agency guidelines for reducing exposure to contaminants.

Until safer alternatives are identified, more research is needed to fully understand the effects of TDCIPP on commuters.

"If we picked up this relationship in five days, what does that mean for chronic, long-term exposure, for people who commute most weeks out of the year, year over year for decades?" Volz asked.

Credit: 
University of California - Riverside

The complex effects of colonial rule in Indonesia

The areas of Indonesia where Dutch colonial rulers built a huge sugar-producing industry in the 1800s remain more economically productive today than other parts of the country, according to a study co-authored by an MIT economist.

The research, focused on the Indonesian island of Java, introduces new data into the study of the economic effects of colonialism. The findings show that around villages where the Dutch built sugar-processing factories from the 1830s through the 1870s, there is today greater economic activity, more extensive manufacturing, and even more schools, along with higher local education levels.

"The places where the Dutch established [sugar factories] persisted as manufacturing centers," says Benjamin Olken, a professor of economics at MIT and co-author of a paper detailing the results, which appears in the January issue of the Review of Economic Studies.

The historical link between this "Dutch Cultivation System" and economic activity today has likely been transmitted "through a couple of forces," Olken suggests. One of them, he says, is the building of "complementary infrastructure" such as railroads and roads, which remain in place in contemporary Indonesia.

The other mechanism, Olken says, is that "industries grew up around the sugar [industry], and those industries persisted. And once you have this manufacturing environment, that can lead to other changes: More infrastructure and more schools have persisted in these areas as well."

To be sure, Olken says, the empirical conclusions of the study do not represent validation of Dutch colonial rule, which lasted from the early 1600s until 1949 and significantly restricted the rights and self-constructed political institutions of Indonesians. Dutch rule had long-lasting effects in many areas of civic life, and the Dutch Cultivation System used forced labor, for one thing.

"This paper is not trying to argue that the [Dutch] colonial enterprise was a net good for the people of the time," Olken emphasizes. "I want to be very clear on that. That's not what we're saying."

Instead, the study was designed to evaluate the empirical effects of the Dutch Cultivation System, and the outcome of the research was not necessarily what Olken would have anticipated.

"The results are striking," Olken says. "They just jump out at you."

The paper, "The Development Effects of the Extractive Colonial Economy: The Dutch Cultivation System in Java," is co-authored by Olken and Melissa Dell PhD '12, a professor of economics at Harvard University.

On the ground

Historically in Java, the biggest of Indonesia's many islands, the main crop had been rice. Starting in the 1830s, the Dutch instituted a sugar-growing system in some areas, building 94 sugar-processing factories, as well as roads and railroads to transport materials and products.

Generally the Dutch would export high-quality sugar from Indonesia while keeping lower-quality sugar in the country. Overall, the system became massive; at one point in the mid-19th century, sugar production in Java accounted for one-third of the Dutch government's revenues and 4 percent of Dutch GDP. By one estimate, a quarter of the population was involved in the industry.

In developing their research, Olken and Dell used 19th century data from government archives in the Netherlands, as well as modern data from Indonesia. The Dutch built the processing plants next to rivers in places with enough flat land to sustain extensive sugar crops; to conduct the study, the researchers looked at economic activity near sugar-processing factories and compared it with economic activity in similar areas that lacked factories.

"In the 1850s, the Dutch spent four years on the ground collecting detailed information for the over 10,000 villages that contributed land and labor to the Cultivation System," Dell notes. The researchers digitized those records and, as she states, "painstakingly merged them" with economic and demograhic records from the same locations today

As the results show, places close to factories are 25-30 percentage points less agricultural in economic composition than those away from factories, and they have more manufacturing, by 6-7 percentage points. They also have 9 percent more employment in retail.
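The near/far comparison behind those figures can be illustrated with a toy calculation. The village records below are invented; only the direction and rough size of the gap echo the paper's 25-30 percentage-point finding:

```python
from statistics import mean

# Toy version of the paper's comparison (all records invented):
# villages tagged with distance to the nearest historical sugar
# factory and the share of local employment in agriculture.
villages = [
    {"km_to_factory": 0.5, "agri_share": 0.41},
    {"km_to_factory": 0.8, "agri_share": 0.45},
    {"km_to_factory": 0.9, "agri_share": 0.38},
    {"km_to_factory": 12.0, "agri_share": 0.70},
    {"km_to_factory": 15.0, "agri_share": 0.74},
    {"km_to_factory": 18.0, "agri_share": 0.68},
]

near = [v["agri_share"] for v in villages if v["km_to_factory"] <= 1]
far = [v["agri_share"] for v in villages if v["km_to_factory"] > 5]

gap_pp = (mean(far) - mean(near)) * 100  # gap in percentage points
print(f"agriculture share gap: {gap_pp:.0f} percentage points")
```

The actual study compares factory sites against observably similar candidate sites rather than raw distance bands, but the core quantity is this kind of difference in means.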

Areas within 1 kilometer of a sugar factory have a railroad density twice that of similar places 5 to 20 kilometers from factories; by 1980, they were also 45 percent more likely to have electricity and 4 percent more likely to have a high school. They also have local populations with a full year more of education, on average, than areas not situated near old sugar factories.

The study shows there is also about 10 to 15 percent more public-land use in villages that were part of the Dutch Cultivation System, a data point that holds steady in both 1980 and 2003.

"The key thing that underlies this paper, in multiple respects, is the linking of the historical data and the modern data," Olken says. The researchers also observed that the disparity between industrialized places and their more rural counterparts has not arisen since 1980, further suggesting how much Java's deep economic roots matter.

Net Effects?

The paper blends the expertise of Olken, who has spent years conducting antipoverty studies in Indonesia, and Dell, whose work at times examines the effects of political history on current-day economic outcomes.

"I had never really done a historical project before," Olken says. "But the opportunity to collaborate with Melissa on this was really exciting."

One of Dell's best-known papers, published in 2010 while she was still a PhD student at MIT, shows that in areas of Peru where colonial Spanish rulers instituted a system of forced mining labor from the 1500s to the 1800s, there are significant and negative economic effects that persist today.

However, somewhat to their surprise, the researchers did not observe similarly pronounced effects from the Dutch Cultivation System.

"One might have thought that could have had negative consequences on local social capital and local development in other respects," says Olken, adding that he "wasn't sure what to expect" before looking at the data.

"The differences between the long-run effects of forced labor in Peru and Java suggest that for understanding persistent impacts on economic activity, we need to know more than just whether there was forced labor in a location," Dell says. "We need to understand how the historical institutions influenced economic incentives and activities initially, and how these initial effects may or may not have persisted moving forward."

Olken adds that the study "can't measure every possible thing," and that "it's possible there are other effects we didn't see."

Moreover, Olken notes, the paper cannot determine the net effect of the Dutch Cultivation System on Indonesian economic growth. That is, in the absence of Dutch rule, Indonesia's economy would certainly have grown on its own -- but it is impossible to say whether it would have expanded at a rate faster, slower, or equivalent to the trajectory it had under the Dutch.

"We can't say what would have happened if the Dutch had never showed up in Indonesia," Olken says. "And of course the Dutch [colonizing] Indonesia had all kinds of effects well beyond the scope of this paper, many of them negative for the contemporaneous population."

Credit: 
Massachusetts Institute of Technology

Engineers mix and match materials to make new stretchy electronics

image: With a new technique, MIT researchers can peel and stack thin films of metal oxides -- chemical compounds that can be designed to have unique magnetic and electronic properties. The films can be mixed and matched to create multi-functional, flexible electronic devices, such as solar-powered skins and electronic fabrics.

Image: 
Felice Frankel

At the heart of any electronic device is a cold, hard computer chip, covered in a miniature city of transistors and other semiconducting elements. Because computer chips are rigid, the electronic devices that they power, such as our smartphones, laptops, watches, and televisions, are similarly inflexible.

Now a process developed by MIT engineers may be the key to manufacturing flexible electronics with multiple functionalities in a cost-effective way.

The process is called "remote epitaxy" and involves growing thin films of semiconducting material on a large, thick wafer of the same material, which is covered in an intermediate layer of graphene. Once the researchers grow a semiconducting film, they can peel it away from the graphene-covered wafer and then reuse the wafer, which itself can be expensive depending on the type of material it's made from. In this way, the team can copy and peel away any number of thin, flexible semiconducting films, using the same underlying wafer.

In a paper published in the journal Nature, the researchers demonstrate that they can use remote epitaxy to produce freestanding films of any functional material. More importantly, they can stack films made from these different materials, to produce flexible, multifunctional electronic devices.

The researchers expect that the process could be used to produce stretchy electronic films for a wide variety of uses, including virtual reality-enabled contact lenses, solar-powered skins that mold to the contours of your car, electronic fabrics that respond to the weather, and other flexible electronics that seemed until now to be the stuff of Marvel movies.

"You can use this technique to mix and match any semiconducting material to have new device functionality, in one flexible chip," says Jeehwan Kim, an associate professor of mechanical engineering at MIT. "You can make electronics in any shape."

Kim's co-authors include Hyun S. Kum, Sungkyu Kim, Wei Kong, Kuan Qiao, Peng Chen, Jaewoo Shim, Sang-Hoon Bae, Chanyeol Choi, Luigi Ranno, Seungju Seo, Sangho Lee, Jackson Bauer, and Caroline Ross from MIT, along with collaborators from the University of Wisconsin at Madison, Cornell University, the University of Virginia, Penn State University, Sun Yat-Sen University, and the Korea Atomic Energy Research Institute.

Buying time

Kim and his colleagues reported their first results using remote epitaxy in 2017. Then, they were able to produce thin, flexible films of semiconducting material by first placing a layer of graphene on a thick, expensive wafer made from a combination of exotic metals. They flowed atoms of each metal over the graphene-covered wafer and found the atoms formed a film on top of the graphene, in the same crystal pattern as the underlying wafer. The graphene provided a nonstick surface from which the researchers could peel away the new film, leaving the graphene-covered wafer, which they could reuse.

In 2018, the team showed that they could use remote epitaxy to make semiconducting materials from metals in groups 3 and 5 of the periodic table, but not from group 4. The reason, they found, boiled down to polarity, or the respective charges between the atoms flowing over graphene and the atoms in the underlying wafer.

Since this realization, Kim and his colleagues have tried a number of increasingly exotic semiconducting combinations. As reported in this new paper, the team used remote epitaxy to make flexible semiconducting films from complex oxides -- chemical compounds made from oxygen and at least two other elements. Complex oxides are known to have a wide range of electrical and magnetic properties, and some combinations can generate a current when physically stretched or exposed to a magnetic field.

Kim says the ability to manufacture flexible films of complex oxides could open the door to new energy-harvesting devices, such as sheets or coverings that stretch in response to vibrations and produce electricity as a result. Until now, complex oxide materials have only been manufactured on rigid, millimeter-thick wafers, with limited flexibility and therefore limited energy-generating potential.

The researchers did have to tweak their process to make complex oxide films. They initially found that when they tried to make a complex oxide such as strontium titanate (a compound of strontium, titanium, and three oxygen atoms), the oxygen atoms that they flowed over the graphene tended to bind with the graphene's carbon atoms, etching away bits of graphene instead of following the underlying wafer's pattern and binding with strontium and titanium. As a surprisingly simple fix, the researchers added a second layer of graphene.

"We saw that by the time the first layer of graphene is etched off, oxide compounds have already formed, so elemental oxygen, once it forms these desired compounds, does not interact as heavily with graphene," Kim explains. "So two layers of graphene buys some time for this compound to form."

Peel and stack

The team used their newly tweaked process to make films from multiple complex oxide materials, peeling off each 100-nanometer-thin layer as it was made. They were also able to stack together layers of different complex oxide materials and effectively glue them together by heating them slightly, producing a flexible, multifunctional device.

"This is the first demonstration of stacking multiple nanometers-thin membranes like LEGO blocks, which has been impossible because all functional electronic materials exist in a thick wafer form," Kim says.

In one experiment, the team stacked together films of two different complex oxides: cobalt ferrite, known to expand in the presence of a magnetic field, and PMN-PT, a material that generates voltage when stretched. When the researchers exposed the multilayer film to a magnetic field, the two layers worked together to both expand and produce a small electric current.
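The coupling described above is a linear chain: the magnetic field strains the cobalt ferrite, the strain is transferred across the interface, and the piezoelectric PMN-PT converts the resulting stress into a voltage. A minimal back-of-the-envelope sketch of that chain, using illustrative placeholder coefficients (none of these numbers come from the study itself):

```python
# Minimal sketch of the magnetoelectric chain in a magnetostrictive /
# piezoelectric bilayer (cobalt ferrite on PMN-PT):
#   magnetic field -> strain -> stress -> voltage.
# All coefficients are hypothetical order-of-magnitude placeholders,
# assuming perfect strain transfer and a simple linear response.

def magnetoelectric_voltage(field_oe,
                            d_lambda_dH=1e-9,       # magnetostriction slope (1/Oe), hypothetical
                            youngs_modulus=1.4e11,  # Pa, typical order for complex oxides
                            g31=0.02,               # piezoelectric voltage coefficient (V*m/N), hypothetical
                            thickness=100e-9):      # 100-nm film, as in the article
    """Linear estimate of the voltage generated across the piezoelectric layer."""
    strain = d_lambda_dH * field_oe    # strain induced in the magnetostrictive layer
    stress = youngs_modulus * strain   # stress transferred to the piezoelectric layer (Pa)
    return g31 * stress * thickness    # open-circuit voltage across the film (V)

v = magnetoelectric_voltage(field_oe=1000.0)
print(f"~{v * 1e6:.0f} microvolts")  # small but measurable signal for a 100-nm film
```

Even with these rough numbers, the estimate illustrates why such thin stacked films are interesting: the output scales with film thickness and coupling strength, so stacking layers with complementary responses multiplies the usable signal.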

The results demonstrate that remote epitaxy can be used to make flexible electronics from a combination of materials with different functionalities, which previously were difficult to combine into one device. In the case of cobalt ferrite and PMN-PT, each material has a different crystalline pattern. Kim says that traditional epitaxy techniques, which grow materials at high temperatures on one wafer, can only combine materials if their crystalline patterns match. He says that with remote epitaxy, researchers can make any number of different films, using different, reusable wafers, and then stack them together, regardless of their crystalline pattern.

"The big picture of this work is, you can combine totally different materials in one place together," Kim says. "Now you can imagine a thin, flexible device made from layers that include a sensor, computing system, a battery, a solar cell, so you could have a flexible, self-powering, internet-of-things stacked chip."

The team is exploring various combinations of semiconducting films and is working on developing prototype devices, such as something Kim is calling an "electronic tattoo" -- a flexible, transparent chip that can attach and conform to a person's body to sense and wirelessly relay vital signs such as temperature and pulse.

"We can now make thin, flexible, wearable electronics with the highest functionality," Kim says. "Just peel off and stack up."

Credit: 
Massachusetts Institute of Technology