Culture

Lineage tracing of direct astrocyte-to-neuron conversion for brain repair

image: Neurons converted from lineage-traced astrocytes.

Image: 
Jinan University

Regenerating functional new neurons to repair the injured human brain remains an unsolved problem. The lack of neuroregeneration is one of the major reasons why so many brain disorders, such as stroke and Alzheimer's disease, still have no cure. A research team led by Prof. Gong Chen at Jinan University (Guangzhou, China) published a study on October 9 in Neural Regeneration Research providing unambiguous lineage-tracing evidence that the brain's internal astrocytes are directly converted into neurons. Directly converting the brain's internal astrocytes, a type of cell that supports neurons, into new neurons is an innovative brain repair technology that may benefit millions of patients worldwide.

The human brain has two major types of cells, neurons and glial cells. Neurons cannot divide and therefore lack the capability to self-regenerate, but glial cells can divide and self-renew. Prof. Chen's team has previously demonstrated through a series of studies that the brain's internal glial cells can be directly converted into neurons through NeuroD1-based gene therapy. Many other labs have confirmed such direct glia-to-neuron conversion in the brain, spinal cord, and retina. However, some researchers, both inside and outside the regenerative medicine field, still harbor doubts about direct glia-to-neuron conversion and demand more evidence to support such a ground-breaking discovery.

In this study, Chen's team employed transgenic reporter mice that faithfully trace the astrocytic lineage to demonstrate unambiguously that astrocytes can be converted directly into neurons in the adult mouse brain. "We crossed Aldh1l1-CreERT2 mice with Ai14 mice and administered tamoxifen to induce Cre-mediated recombination so that many astrocytes would be permanently lineage-traced by the red fluorescent protein tdTomato," explained first author Dr. Zongqin Xiang, a postdoctoral fellow. "Then, we injected AAV expressing the neural transcription factor NeuroD1 under the astrocytic promoter GFAP into the mouse cortex, and detected clear expression of NeuroD1 in the tdTomato-labeled astrocytes at 7 days post viral infection," added co-first author Liang Xu, a PhD student. "Most excitingly, at 135 days post NeuroD1 infection (experiments delayed by COVID-19), we observed that many NeuroD1-expressing, tdTomato-labeled astrocytes had been converted into NeuN-positive neurons with typical neuronal morphology. This experiment provides unambiguous evidence of in vivo astrocyte-to-neuron conversion," concluded Prof. Gong Chen.

Besides the AAV system, Chen's team employed a second viral system, the retroviral system, to further demonstrate the direct glia-to-neuron conversion process. "While AAV has the advantages of low immunogenicity and relative safety as a gene therapy vector for the treatment of neurological disorders, its capability to infect both neurons and glial cells may cause confusion if AAV dosing and promoter are not used properly," said Prof. Wenliang Lei, a co-corresponding author of the work. "We therefore employed retroviral vectors to express NeuroD1 exclusively in dividing glial cells, and we were able to further confirm that the newly generated neurons were directly converted from dividing glial cells," added Prof. Lei.

With two lines of unambiguous data in hand, Chen's team went on to address confusion in the field caused by improperly designed experiments and the high AAV doses used by some labs. "A recent work challenged the field of in vivo glia-to-neuron conversion based on one set of experiments using a high dose of AAV that produced artifacts in the mouse cortex," said Prof. Wen Li, another co-corresponding author of the work. "We demonstrated in this work that when an appropriate dosage of AAV is used, the artifacts of so-called leakage can be minimized if not completely avoided," added Prof. Li.

"For newcomers entering this exciting in vivo glia-to-neuron conversion field, we highly recommend that all investigators use different viral systems at different doses and perform both in vitro and in vivo studies to prove or disprove any hypothesis," commented Prof. Chen, a leader of this emerging field. "With the addition of this indisputable evidence from lineage tracing studies, we believe that in vivo glia-to-neuron conversion technology will provide an unprecedented opportunity to repair the damaged brain with internal glial cells," Prof. Chen concluded.

Credit: 
Guangdong-Hongkong-Macau Institute of CNS Regeneration, Jinan University

Two planets around a red dwarf

image: The SAINT-EX Observatory is a fully robotic facility hosting a 1-metre telescope based in Mexico.

Image: 
Institute of Astronomy, UNAM / E. Cadena

Red dwarfs are the coolest kind of star. As such, they potentially allow liquid water to exist on planets that are quite close to them. In the search for habitable worlds beyond the borders of our solar system, this is a big advantage: the distance between an exoplanet and its star is a crucial factor for its detection. The closer the two are, the higher the chance that astronomers can detect the planet from Earth.

"But these stars are rather small and emit little light compared to most other stars, such as our Sun," explains Brice-Olivier Demory, lead author of the study and Professor of Astrophysics at the University of Bern. These factors make them challenging to observe in detail. Without the proper instruments, any planets that might orbit them could easily be overlooked - especially terrestrial planets, like Earth, that are comparably small.

A dedicated telescope

One instrument with which red dwarfs and their planets can be studied closely is the Mexico-based SAINT-EX telescope, co-operated by the NCCR PlanetS. SAINT-EX is an acronym that stands for Search And characterIsatioN of Transiting EXoplanets. The project was named in honor of Antoine de Saint-Exupéry (Saint-Ex), the famous writer, poet and aviator.

The SAINT-EX Observatory is a fully robotic facility hosting a 1-metre telescope. It is equipped with instrumentation specifically suited to enable high-precision detection of small planets orbiting cool stars. Now, this specialization pays off: earlier this year, the telescope was able to detect two exoplanets orbiting the star TOI-1266, located around 120 light years from Earth. The research, published recently in the journal Astronomy and Astrophysics, provides a first impression of their characteristics.

A peculiar pair

Compared to the planets in our solar system, TOI-1266 b and c are much closer to their star - it takes them only 11 and 19 days, respectively, to orbit it. However, as their host star is much cooler than the Sun, their temperatures are not very extreme: the outer planet has approximately the temperature of Venus (although it is 7 times closer to its star than Venus is to the Sun). The two planets have similar densities, possibly corresponding to a composition of about half rocky and metallic material and half water. This makes them about half as rocky as Earth or Venus but far rockier than Uranus or Neptune.

In size, the planets clearly differ from each other. The inner planet, TOI-1266 b, measures a little under two-and-a-half times Earth's diameter. This makes it a so-called "sub-Neptune". The outer planet, TOI-1266 c, is just over one-and-a-half times the size of our planet. Thus, it belongs to the category of "super-Earths".

This places the two planets at the edges of the so-called radius-valley, as Brice-Olivier Demory explains: "Planets between about the radius of TOI-1266 b and c are quite rare, likely because of the effect of strong irradiation from the star, which can erode their atmospheres". Yilen Gómez Maqueo Chew, SAINT-EX Project Coordinator and researcher at the National Autonomous University of Mexico adds: "Being able to study two different types of planets in the same system is a great opportunity to better understand how these different sized planets come to be".
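The size categories described above can be sketched as a simple radius-based classifier. This is an illustrative sketch only: the thresholds are rough, commonly cited values around the radius valley, not the exact boundaries used in the TOI-1266 study.

```python
def classify_planet(radius_earth: float) -> str:
    """Rough size category from planet radius in Earth radii.

    Thresholds are illustrative approximations of commonly cited
    boundaries around the 'radius valley' (roughly 1.5-2 Earth radii).
    """
    if radius_earth < 1.25:
        return "Earth-sized"
    elif radius_earth < 2.0:
        return "super-Earth"
    elif radius_earth < 4.0:
        return "sub-Neptune"
    return "giant"

# TOI-1266 b: a little under 2.5 Earth radii
print(classify_planet(2.4))  # sub-Neptune
# TOI-1266 c: just over 1.5 Earth radii
print(classify_planet(1.6))  # super-Earth
```

Planets falling between these two sizes are comparatively rare, which is what makes studying the pair in a single system a useful probe of the radius valley.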

Good timing and help from the embassy

Having this opportunity, especially this year, is anything but a given. The scientists were fortunate to complete their observations shortly before the COVID-19-related lockdown in Mexico. Shortly after the observations were made, the observatory had to be closed due to the consequences of the pandemic, and it remains closed today. The scientists hope to resume operations of SAINT-EX in the next few months and to target the next red dwarf and its potential planets. "Also, the Mexican Embassy in Bern was a great help in facilitating the discussions with the Mexican government and in providing continued support to the project," says Demory.

SAINT-EX - Search and characterisation of exoplanets

SAINT-EX is an international collaboration which had its kick-off meeting at the National Astronomical Observatory in San Pedro Martir (MEX) in September 2016. The project's principal investigator is Prof. Brice-Olivier Demory, from the Center for Space and Habitability of the University of Bern in Switzerland and the National Center of Competence in Research PlanetS; the project's coordinator and leader in Mexico is Dr. Yilen Gomez Maqueo Chew from the Instituto de Astronomía of the Universidad Nacional Autónoma de México (UNAM). Also part of the project are Prof. Willy Benz from the National Center of Competence in Research PlanetS, Prof. François Bouchy from the University of Geneva in Switzerland, Dr. Michaël Gillon from the University of Liège in Belgium, Prof. Kevin Heng from the University of Bern in Switzerland, Prof. Didier Queloz from the University of Geneva, Switzerland, and Cambridge in the UK, and Dr. Laurence Sabin, also from the Instituto de Astronomía at UNAM. SAINT-EX has been funded by the Swiss National Science Foundation and the Universities of Bern, Geneva, Liège and Cambridge as well as UNAM. SAINT-EX also received support from the National Council for Science and Technology (CONACYT) through the National Laboratories call for proposals for the National Astronomical Observatory of San Pedro Martir.

Bernese space exploration: With the world's elite since the first moon landing

When the second man on the moon, "Buzz" Aldrin, stepped out of the lunar module on July 21, 1969, his first task was to set up the Bernese Solar Wind Composition experiment (SWC), also known as the "solar wind sail", by planting it in the lunar soil, even before the American flag. This experiment, which was planned, and its results analysed, by Prof. Dr. Johannes Geiss and his team from the Physics Institute of the University of Bern, was the first great highlight in the history of Bernese space exploration. Ever since, Bernese space exploration has been among the world's elite. The numbers are impressive: instruments were flown into the upper atmosphere and ionosphere on rockets 25 times (1967-1993) and into the stratosphere on balloon flights 9 times (1991-2008); over 30 instruments have flown on space probes; and with CHEOPS the University of Bern shares responsibility with ESA for a whole mission. The successful work of the Department of Space Research and Planetary Sciences (WP) of the Physics Institute of the University of Bern was consolidated by the foundation of a university competence center, the Center for Space and Habitability (CSH). The Swiss National Science Foundation also awarded the University of Bern the National Center of Competence in Research (NCCR) PlanetS, which it manages together with the University of Geneva.

Credit: 
University of Bern

Pinpointing the 'silent' mutations that gave the coronavirus an evolutionary edge

DURHAM, N.C. -- We know that the coronavirus behind the COVID-19 crisis lived harmlessly in bats and other wildlife before it jumped the species barrier and spilled over to humans.

Now, researchers at Duke University have identified a number of "silent" mutations in the roughly 30,000 letters of the virus's genetic code that helped it thrive once it made the leap -- and possibly helped set the stage for the global pandemic. The subtle changes involved how the virus folded its RNA molecules within human cells.

For the study, published Oct. 16 in the journal PeerJ, the researchers used statistical methods they developed to identify adaptive changes that arose in the SARS-CoV-2 genome in humans, but not in closely related coronaviruses found in bats and pangolins.

"We're trying to figure out what made this virus so unique," said lead author Alejandro Berrio, a postdoctoral associate in biologist Greg Wray's lab at Duke.

Previous research detected fingerprints of positive selection within a gene that encodes the "spike" proteins studding the coronavirus's surface, which play a key role in its ability to infect new cells.

The new study likewise flagged mutations that altered the spike proteins, suggesting that viral strains carrying these mutations were more likely to thrive. But with their approach, study authors Berrio, Wray and Duke Ph.D. student Valerie Gartner also identified additional culprits that previous studies failed to detect.

The researchers report that so-called silent mutations in two other regions of the SARS-CoV-2 genome, dubbed Nsp4 and Nsp16, appear to have given the virus a biological edge over previous strains without altering the proteins they encode.

Instead of affecting proteins, Berrio said, the changes likely affected how the virus's genetic material -- which is made of RNA -- folds up into 3-D shapes and functions inside human cells.
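A "silent" (synonymous) mutation is one that changes the genetic sequence without changing the amino acid a codon encodes. The check itself is simple to sketch; the snippet below uses a deliberately partial RNA codon table for illustration, whereas a real analysis would use the full 64-codon standard genetic code.

```python
# Partial RNA codon table -- only the codons used below are included,
# for illustration; the standard genetic code has 64 codons.
CODON_TABLE = {
    "CUU": "Leu", "CUC": "Leu", "CUA": "Leu", "CUG": "Leu",
    "GCU": "Ala", "GCC": "Ala",
    "AAA": "Lys", "AAG": "Lys",
    "GAA": "Glu",
}

def is_silent(codon: str, mutated: str) -> bool:
    """True if the mutation leaves the encoded amino acid unchanged."""
    return CODON_TABLE[codon] == CODON_TABLE[mutated]

print(is_silent("CUU", "CUC"))  # True: both encode leucine
print(is_silent("AAA", "GAA"))  # False: Lys -> Glu changes the protein
```

Even when the protein is unchanged, such substitutions can still alter how the RNA molecule folds, which is precisely the kind of effect the Duke team investigated.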

What these changes in RNA structure might have done to set the SARS-CoV-2 virus in humans apart from other coronaviruses is still unknown, Berrio said. But they may have contributed to the virus's ability to spread before people even know they have it -- a crucial difference that made the current situation so much more difficult to control than the SARS coronavirus outbreak of 2003.

The research could lead to new molecular targets for treating or preventing COVID-19, Berrio said.

"Nsp4 and Nsp16 are among the first RNA molecules that are produced when the virus infects a new person," Berrio said. "The spike protein doesn't get expressed until later. So they could make a better therapeutic target because they appear earlier in the viral life cycle."

More generally, by pinpointing the genetic changes that enabled the new coronavirus to thrive in human hosts, scientists hope to better predict future zoonotic disease outbreaks before they happen.

"Viruses are constantly mutating and evolving," Berrio said. "So it's possible that a new strain of coronavirus capable of infecting other animals may come along that also has the potential to spread to people, like SARS-CoV-2 did. We'll need to be able to recognize it and make efforts to contain it early."

Credit: 
Duke University

Utilizing telemedicine in the ER can reduce wait times and patient length of stay

Key takeaways from a new study in the INFORMS journal Information Systems Research:

Increasing telemedicine availability in the emergency room (ER) significantly reduces patients' average hospital stay.

Telemedicine produces the largest reductions in ER length of stay during demand surges or supply shortages.

The reduction in length of stay is driven in part by shorter wait times.

CATONSVILLE, MD, October 16, 2020 - Telemedicine has become more common given the current global pandemic. COVID-19 has limited doctor's office and hospital visits to ensure safety for everyone. But rather than diminishing the quality of care, new research in the INFORMS journal Information Systems Research finds that wider use of telemedicine in the emergency room (ER) can yield positive results for patients and providers alike.

The study, "Does Telemedicine Reduce Emergency Room Congestion? Evidence from New York State," looks at all emergency room visits in New York from 2010 to 2014. The researchers found that, on average, telemedicine availability in the ER significantly reduces patients' length of stay (LOS), partially driven by flexible resource allocation.

Overcrowding in ERs is a common and nagging problem. It is not only costly for hospitals but also compromises care quality and the patient experience. Study authors Susan Lu of Purdue University, Shujing Sun of the University of Texas at Dallas, and Huaxia Rui of the University of Rochester say finding ways to improve ER care delivery is important, as long as it actually works.

"The adoption of telemedicine leads to a larger reduction in ER length of stay when there is a demand surge or supply shortage," said Lu, a professor in the Krannert School of Management at Purdue. "This improvement does not come at the expense of care quality or patient cost."

The authors replicated their findings using annual U.S. hospital data and found that ER telemedicine adoption also significantly reduced average patients' waiting time, which suggests that the LOS reduction partially comes from the reduction of waiting time.

According to the National Hospital Ambulatory Medical Care Survey, the number of ER visits in the United States increased by more than 25% from 2000 to 2015. ER congestion can have a number of negative consequences, from unhappy patients and decreased productivity among overworked doctors to increased financial costs from unnecessary tests.

According to information published in February 2019 by the American Hospital Association, 76% of U.S. hospitals use various telemedicine technologies to connect patients and providers.

This research article shows more specifically the impact telemedicine can have in reducing ER congestion and provides positive implications.

"The current pandemic has shown hospitals the great promise of telemedicine application and hopefully the unexpected enrollment of such policies alongside this research can help get the process underway to help more healthcare facilities utilize this technology in ERs and elsewhere," said Lu. "Policymakers can play a role as well by reducing regulatory barriers that inhibit more expansive use of telemedicine and by creating incentives that encourage hospitals to more broadly adopt telemedicine in emergency rooms."

Credit: 
Institute for Operations Research and the Management Sciences

Safe sex or risky romance? Young adults make the rational choice

A study published in the journal Psychological Science found that young adults--contrary to how they are sometimes portrayed in the media--tend to make highly rational decisions when it comes to selecting potential romantic partners.

This is not to say that young adults make risk-free choices, but they appear to consider both the risks and benefits of their sexual behavior in a highly consistent and thoughtful manner.

"There is a tendency to view sexual decision making in young adults as a highly variable and somewhat random process, more influenced by hormones or impulsivity than rational processes," said Laura Hatz, a doctoral candidate at the University of Missouri and lead author of the study. "Our study suggests, however, that young adults are highly consistent in their choices, balancing potential partners' level of attractiveness against the potential risk for sexually transmitted infection."

The research involved presenting 257 participants with hypothetical "sexual gambles" in which a photo of a potential partner's face was shown alongside an associated, though purely hypothetical, risk of contracting a sexually transmitted infection. Nearly all participants in the study made consistently rational choices, as defined by established models of psychological behavior. Prior research has shown that, in general, individuals tend to use what are known as heuristic decision strategies--cognitive shortcuts that may ignore some information--to make choices in life.

Hatz and her colleagues found that even individuals who could be identified as classic heuristic decision makers for monetary-based choices became rational decision makers when similar choices were framed as sexual choices.
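The contrast between a rational, value-maximizing strategy and a risk-ignoring heuristic can be sketched as follows. The option names, benefit scores, and risk probabilities are hypothetical illustrations, not the study's actual stimuli or models.

```python
# Each option pairs a benefit score with a probability of a bad outcome.
# All numbers are made up for illustration.
options = {
    "A": {"benefit": 9.0, "risk": 0.40},
    "B": {"benefit": 6.0, "risk": 0.05},
}

def rational_choice(opts):
    """Balance benefit against risk: pick the best risk-discounted value."""
    return max(opts, key=lambda k: opts[k]["benefit"] * (1 - opts[k]["risk"]))

def heuristic_choice(opts):
    """Cognitive shortcut: ignore risk entirely, pick the biggest benefit."""
    return max(opts, key=lambda k: opts[k]["benefit"])

print(rational_choice(options))   # B: 6.0 * 0.95 = 5.7 beats 9.0 * 0.60 = 5.4
print(heuristic_choice(options))  # A: highest raw benefit, risk ignored
```

The study's finding, in these terms, is that participants who behaved like `heuristic_choice` on monetary gambles behaved like `rational_choice` when the same trade-off was framed as a sexual choice.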

Credit: 
Association for Psychological Science

New study may reveal link to lipids playing a key role in Parkinson's disease

image: Researchers in the Neuroregeneration Institute are making strides in Parkinson's disease research

Image: 
McLean Hospital

In a novel study conducted by a team from the Neuroregeneration Institute at McLean Hospital, investigators report key changes across brain cell types involving lipids, inflammation, and the development of Parkinson's disease (PD). Their findings appear in the current issue of the Proceedings of the National Academy of Sciences of the United States of America.

"Our study emphasizes the importance of cooperative use, storage, and transport of lipids between brain cell types in Parkinson's disease. Mechanisms involved in balancing cellular lipids--especially neutral lipids--such as we have characterized here, have been relatively understudied in the neurodegenerative diseases," explained Oeystein R. Brekk, PhD, an assistant neuroscientist at the Neuroregeneration Institute and first author of the study. "However, a wealth of knowledge already exists on such cellular lipid use, and consequences of lipid variations in other organs. For example, most people will know lipids from the role they play in increased risk for cardiovascular disease. Like the cardiovascular disease models, our Parkinson's disease and lipid-induced PD animal models point to lipid dependent pathological processes inside the brain, meaning we see dysregulation of the lipids and increased neuroinflammation."

In their study, Brekk and the McLean team demonstrate concurrent lipid changes in dopaminergic neurons and their neighboring glial cells, such as microglia and astrocytes, in Parkinson's disease brains. Specifically, microglia and astrocytes showed abnormal patterns of intracellular lipid storage, which were significantly correlated with the accumulation of lipids within the dopaminergic neurons, the brain cells most vulnerable to the disease process. Overall lipid triglyceride content was statistically linked to a lipid-induced inflammatory stress marker in the brain tissue of PD patients. A remarkably similar cellular and pathological picture was seen in an experimental animal model that simulates a genetically linked enzymatic loss of function in the glucocerebrosidase gene associated with Parkinson's disease, leading to glycosphingolipid accumulation.

The work shows that microglia, which largely control macrophage and immune functions in the brain, are overloaded with lipids in Parkinson's disease, while astrocytes, which normally supply lipids for maintenance and growth, on average lose some of their lipid content. At the same time, the neurons accumulate lipids in an inverse linear fashion relative to the surrounding astrocytes. Moreover, the study shows a statistically significant link between GPNMB, an astrocyte-associated stress immune response molecule that appears to quench some of the inflammatory signals associated with lipid accumulation, and overall triglyceride levels in the substantia nigra region of the brain.

"Remarkably, we can model these new findings in Parkinson's disease versus healthy aging, microglia and astrocyte interactions in the vulnerable brain regions, precisely by mechanisms that block a lysosomal lipid breakdown pathway, shown to be a strong risk factor for developing PD," said senior author Dr. Ole Isacson, founding director of the Neuroregeneration Institute at McLean Hospital and professor of neurology at Harvard Medical School. "These results support our lipid-inflammation hypothesis in the causation of Parkinson's disease initiation and progression and may help us discover and develop new therapies by leaving behind conventional thinking about PD pathology, which to some extent has been limited to neurons and protein aggregates."

According to Isacson, the next steps include exploring how these lipid cell-cell interactions in the brain are both adaptive and pathological over time and how such cell mechanisms can lead to Parkinson's disease and Lewy body dementia.

Credit: 
McLean Hospital

New study highlights the role of risk communication in coping with COVID-19

image: Making victory sign during COVID-19 pandemic before takeoff

Image: 
Zheng JIN

The mental effects of pandemics can arise not only from the burden of preventive measures and the fear of infection and treatment, but also from coping with exponentially rising deaths. It was predicted in 2018 that the next major outbreak and its containment challenges might stem not from a lack of preventive technologies but from emotional contagion, which could erode trust in government and cause serious economic and social disruption. It is thus crucial to understand the relationship between risk communication and psychological responses, especially in the ascending phase of a pandemic, during which public emotions and behaviours change rapidly. It is in this vein that psychologists at the International Joint Laboratory of Cognitive and Behavioural Science (iLCBC) at Zhengzhou Normal University carried out research on the relationship between psychological responses and risk communication during the early phase of the COVID-19 pandemic, addressing the following questions: What is the public's reaction to epidemic outbreaks in the early phase? How does the effective exchange of real-time risk information affect it over time? What are the characteristics of these effects under different risk intensities?

Data were collected from 26 January 2020 (at which time 30 provinces had launched the first-level response to major public health emergencies in China, 56 deaths had occurred, and 2,014 cases were confirmed worldwide) until 17 February 2020 (1,775 deaths and 71,429 confirmed cases worldwide), with a mean test-retest interval of 16 days, by inviting community residents from two provincial capitals: Wuhan and Zhengzhou.

The findings showed that risk communication in the initial stage of the outbreak mitigated susceptibility to emotional contagion, and that this effect was larger on the epidemic frontline (i.e., Wuhan). Furthermore, prevention activities were predicted by the quality of risk communication, suggesting that the preventive behaviours taken were closely linked to the efficient and timely transmission of information about the epidemic. While the researchers found that effective risk communication can reduce susceptibility to emotional contagion and is a significant means of alleviating public anxiety, the study also shows some inconsistencies with previous findings: there is a reciprocal correlation between anxiety and risk communication, meaning that the emotional component may build resistance to risk communication.

In January 2020, Wuhan became the battlefront in the fight against COVID-19 and the focus of global attention. The data provide some of the first follow-up records of mental health during the COVID-19 outbreak. "Officials trying to circumvent chaos or panic by withholding information are more harmful than the public behaving irrationally in a public health emergency," says Dr Zheng Jin, Director of the iLCBC. "Pre-crisis planning is expected to create a transparent, open and honest flow of information."

Credit: 
Zhengzhou Normal University

Internet connectivity is oxygen for research and development work

image: Emmanuel Togo, IT architect for the University of Ghana, gave a tour of the university's campus network operations center during an ICT Health CheckUp conducted by Paul Hixson, University of Illinois.

Image: 
College of ACES, University of Illinois.

URBANA, Ill. - Fast and reliable internet access is fundamental for research and development activity around the world. Seamless connectivity is a privilege we often take for granted. But in developing nations, technological limitations can become stumbling blocks to efficient communication and cause significant disadvantages.

Pete Goldsmith, director of the Soybean Innovation Lab (SIL) at the University of Illinois, works closely with partner organizations in several African countries. He noticed that his African colleagues were often dealing with technological problems that made communication very challenging. For example, sometimes they had to rely on their cell phones because their institution's internet access was unreliable.

Goldsmith teamed up with two IT experts at U of I, former Chief Information Officer Paul Hixson and Director of Research IT and Innovation Tracy Smith, to investigate technological challenges facing institutions in developing countries.

"Connectivity is the oxygen organizations run on," Hixson says. "It's such a basic requirement that it's often not even recognized as an issue. But lack of connectivity severely hinders an organization's ability to perform simple functions, conduct research, and compete for grants."

Goldsmith, Hixson, and Smith conducted an in-depth case study of information communication technology (ICT) infrastructure at the Savannah Agricultural Research Institute (SARI), a leading research station in Ghana and a close collaborator of SIL.

The case study included focus groups, interviews, and a technological analysis of SARI's equipment and connectivity. Based on this study, the research team developed the ICT Health Checkup, an assessment procedure for IT administrators to methodically assess the current state of their system, identify gaps affecting performance, and document steps for remediation.

The ICT Health Checkup tool systematically evaluates four key elements of ICT infrastructure. The first step focuses on connectivity and bandwidth, identifying the required bandwidth to accommodate the institution's needs and whether the institution has an uninterrupted fiber-based connection to the global internet. The second step analyzes core physical infrastructure, including dependable electricity, local network design, and both wired and wireless connectivity capabilities.

The third step looks at available intranet service offerings for researchers such as local storage, data backup procedures, access control, security procedures, email service, and cloud access. Finally, the fourth step deals with the human resources and technical support requirements for planning and managing the institution's IT infrastructure.

"With this tool, institutions can go through a checklist, and at each point there is a 'stoplight'. If it's red, you know there is something that needs to be fixed, because there are conditions that will act as a block and you can't go on until they are fixed - until there's a green light. So turning things from red to green at each step is crucial; methodically going through each step at a time and making sure it's fixed before moving on to the next one," Hixson explains.
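The stoplight logic Hixson describes can be sketched as a sequential gate: each step must pass ("green") before the next is attempted. The step names and pass conditions below are simplified placeholders, not the actual ICT Health Checkup criteria.

```python
# Simplified sketch of a sequential "stoplight" checklist: evaluation
# stops at the first red step, since later steps depend on earlier ones.
# Step names and checks are illustrative placeholders.
def run_checkup(checks, state):
    results = []
    for name, passes in checks:
        colour = "green" if passes(state) else "red"
        results.append((name, colour))
        if colour == "red":
            break  # a red light blocks all later steps
    return results

checks = [
    ("connectivity & bandwidth", lambda s: s["bandwidth_mbps"] >= 100),
    ("core physical infrastructure", lambda s: s["reliable_power"]),
    ("intranet services", lambda s: s["has_backup"]),
    ("human resources & support", lambda s: s["it_staff"] >= 2),
]

state = {"bandwidth_mbps": 150, "reliable_power": False,
         "has_backup": True, "it_staff": 3}
print(run_checkup(checks, state))
# Stops at the red "core physical infrastructure" step.
```

The fail-fast ordering mirrors the tool's design: there is no point auditing intranet services or staffing while connectivity or power is still a blocking condition.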

The researchers compare the ICT Health Checkup to a medical health exam; it measures the current conditions and can be used as a benchmarking tool to measure improvements.

Goldsmith says the tool can be used to empower organizations so they can be self-sufficient. "With proper connectivity you can manage and store research data, compete for grants, and manage awards," he notes. "It's the foundation that allows institutions to participate fully in a global context."

The research team is currently expanding the study, collecting data from nine institutions and five networking organizations operating in three countries, in order to create a more robust picture of internet connectivity challenges and potential solutions across Africa.

They are also collaborating with the National Research and Education Networks (NRENs) in each of the sub-Saharan African countries that SIL operates in. These African NRENs are comparable to Internet2, which has been an instrumental partner in the expansion and adoption of advanced computing technologies at U of I and is one of the leading NRENs in the U.S., serving the country's research and higher-education communities.

"With the ICT health checkup, our partner African NRENs now have an actual assessment tool they can use with their member institutions. It's becoming a continent-wide approach as they are starting to adopt this new instrument created at the U of I to be their benchmark and measurement tool," Goldsmith says.

"The U of I is ideally positioned to provide this knowledge, because of the university's continued leadership in the computational and network administration space," he adds. "Now we are extending that to have real impact overseas."

Credit: 
University of Illinois College of Agricultural, Consumer and Environmental Sciences

Those funky cheese smells allow microbes to 'talk' to and feed each other

image: Fungi and bacteria key to ripening cheese communicate with and feed each other using volatile compounds

Image: 
Adam Detour

MEDFORD/SOMERVILLE, Mass. (October 16, 2020)-- Researchers at Tufts University have found that those distinctly funky smells from cheese are one way that fungi communicate with bacteria, and what they are saying has a lot to do with the delicious variety of flavors that cheese has to offer. The research team found that common bacteria essential to ripening cheese can sense and respond to compounds produced by fungi in the rind and released into the air, enhancing the growth of some species of bacteria over others. The composition of bacteria, yeast and fungi that make up the cheese microbiome is critical to flavor and quality of the cheese, so figuring out how that can be controlled or modified adds science to the art of cheese making.

The discovery, published in Environmental Microbiology, also provides a model for the understanding and modification of other economically and clinically important microbiomes, such as in soil or the gastrointestinal tract.

"Humans have appreciated the diverse aromas of cheeses for hundreds of years, but how these aromas impact the biology of the cheese microbiome had not been studied," said Benjamin Wolfe, professor of biology in the School of Arts and Sciences at Tufts University and corresponding author of the study. "Our latest findings show that cheese microbes can use these aromas to dramatically change their biology, and the findings' importance extends beyond cheese making to other fields as well."

Many microbes produce airborne chemical compounds called volatile organic compounds, or VOCs, as they interact with their environment. A widely recognized microbial VOC is geosmin, which is emitted by soil microbes and can often be smelled after a heavy rain in forests. As bacteria and fungi grow on ripening cheeses, they secrete enzymes that break down amino acids to produce acids, alcohols, aldehydes, amines, and various sulfur compounds, while other enzymes break down fatty acids to produce esters, methyl ketones, and secondary alcohols. All of those biological products contribute to the flavor and aroma of cheese and they are the reason why Camembert, Blue cheese and Limburger have their signature smells.

The Tufts researchers found that VOCs don't just contribute to the sensory experience of cheese, but also provide a way for fungi to communicate with and "feed" bacteria in the cheese microbiome. By pairing 16 different common cheese bacteria with 5 common cheese rind fungi, the researchers found that the fungi caused responses in the bacteria ranging from strong stimulation to strong inhibition. One bacterial species, Vibrio casei, responded by growing rapidly in the presence of VOCs emitted by all five of the fungi. Other bacteria, such as Psychrobacter, only grew in response to one of the fungi (Galactomyces), and two common cheese bacteria decreased significantly in number when exposed to VOCs produced by Galactomyces.

The researchers found that the VOCs altered the expression of many genes in the bacteria, including genes that affect the way they metabolize nutrients. One metabolic mechanism that was enhanced, called the glyoxylate shunt, allows the bacteria to utilize more simple compounds as "food" when more complex sources such as glucose are unavailable. In effect, they enabled the bacteria to better "eat" some of the VOCs and use them as sources for energy and growth.

"The bacteria are able to actually eat what we perceive as smells," said Casey Cosetta, post-doctoral scholar in the department of biology at Tufts University and first author of the study. "That's important because the cheese itself provides little in the way of easily metabolized sugars such as glucose. With VOCs, the fungi are really providing a useful assist to the bacteria to help them thrive."

There are direct implications of this research for cheese producers around the world. When you walk into a cheese cave there are many VOCs released into the air as the cheeses age. These VOCs may impact how neighboring cheeses develop by promoting or inhibiting the growth of specific microbes, or by changing how the bacteria produce other biological products that add to the flavor. A better understanding of this process could enable cheese producers to manipulate the VOC environment to improve the quality and variety of flavors.

The implications of the research can even extend much further. "Now that we know that airborne chemicals can control the composition of microbiomes, we can start to think about how to control the composition of other microbiomes, for example in agriculture to improve soil quality and crop production and in medicine to help manage diseases affected by the hundreds of species of bacteria in the body," said Wolfe.

Credit: 
Tufts University

Enzymatic DNA synthesis sees the light

image: This illustration shows how the Wyss team encoded the first measures of the 1985 Nintendo Entertainment System video game Super Mario Brothers™ "Overworld Theme" (input) in DNA and then decoded it again into a sound-bite (output). First, using a specially developed in silico approach, they translated the input sheet notes into a ternary code of 2s, 1s, and 0s, which in turn was transformed into a set of distinct DNA sequences. With the help of their photolithographic enzymatic DNA synthesis method, the actual DNA sequences were written on a patterned array and safely stored away. At a later time, the synthesized single-stranded DNA sequences could be decoded again by DNA sequencing and translated back into output sound.

Image: 
Wyss Institute at Harvard University

(BOSTON) -- According to current estimates, the amount of data produced by humans and machines is rising exponentially, with the digital universe doubling in size every two years. At some point, the magnetic and optical data-storage systems at our disposal very likely won't be able to keep pace with this fast-growing volume of digital 1s and 0s. Moreover, they cannot safely store data for more than a century without degrading.

One solution to this pending global data-storage problem could be the development of DNA - life's very own information-storage system - into a digital data storage medium. Researchers are already encoding complex digital information into DNA's four-letter code of A, T, G, and C nucleotide bases. DNA is an ideal storage medium because it is stable over hundreds or thousands of years, has an extraordinary information density, and its information can be efficiently read (decoded) again with advanced sequencing techniques that are continuously getting less expensive.

What lags behind is the ability to write (encode) information into DNA. The programmed synthesis of synthetic DNA sequences is still mostly performed with a decades-old, multi-step chemical procedure known as the "phosphoramidite method", which, although it can be multiplexed, can only generate DNA sequences up to around 200 nucleotides in length and makes occasional errors. It also produces environmentally toxic by-products that are not compatible with a "clean" data-storage technology.

Previously, George Church's team at Harvard's Wyss Institute for Biologically Inspired Engineering and Harvard Medical School (HMS) had developed the first DNA storage approach that uses a DNA-synthesizing enzyme known as terminal deoxynucleotidyl transferase (TdT), which, in principle, can synthesize much longer DNA sequences with fewer errors. Now, the researchers have applied photolithographic techniques from the computer chip industry to enzymatic DNA synthesis, developing a new method to multiplex TdT's superior DNA-writing ability. In their study published in Nature Communications, they demonstrated the parallel synthesis of 12 DNA strands with varying sequences on a 1.2-square-millimeter array surface.

"We have championed and intensively pursued the use of DNA as a data-archiving medium accessed infrequently, yet with very high capacity and stability. Breakthroughs by us and others have enabled an exponential rise in the amount of digital data encrypted in DNA," said corresponding author Church. "This study and other advances in enzymatic DNA synthesis will push the envelope of DNA writing much further and faster than chemical approaches." Church is a Core Faculty member at the Wyss Institute and lead of its Synthetic Biology Focus Area with DNA data storage as one of its technology development areas. He also is Professor of Genetics at HMS and Professor of Health Sciences and Technology at Harvard and MIT.

While the group's first strategy for using the TdT enzyme as a tool for DNA synthesis and digital data storage controlled the enzyme's activity with a second enzyme, the new study shows that TdT can instead be controlled by the high-energy photons of UV light. A high level of control is essential: at each cycle of the DNA synthesis process, the TdT enzyme must be instructed to add, with high precision, only a single nucleotide or a short homopolymer block of one of the four nucleotide bases (A, T, G, or C) to the growing DNA strand.

Using a special codec, a computational method that encodes digital information into DNA code and decodes it again, which Church's team developed in their previous study, the researchers encoded the first two measures of the "Overworld Theme" sheet music from the 1985 Nintendo Entertainment System (NES) video game Super Mario Brothers™ within 12 synthetic DNA strands. They generated those strands on an array matrix with a surface measuring merely 1.2 square millimeters, using their photolithographic approach to extend short DNA "primer" sequences arranged in a 3x4 pattern.
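The release does not spell out the codec itself, but a standard way to turn a ternary code into DNA, used in several DNA-storage schemes, is to encode each trit as a transition to one of the three bases that differ from the current one, which also keeps the information unambiguous even when synthesis produces homopolymer runs of uncontrolled length. The sketch below is illustrative only, and is not the Wyss team's actual codec.

```python
BASES = "ACGT"

def trits_to_dna(trits, start="A"):
    # Each trit (0, 1, or 2) selects one of the three bases that differ
    # from the current base, so consecutive bases are always distinct.
    seq = [start]
    for t in trits:
        options = [b for b in BASES if b != seq[-1]]
        seq.append(options[t])
    return "".join(seq)

def dna_to_trits(seq):
    # Invert the mapping: each base-to-base transition decodes to a trit.
    trits = []
    for prev, cur in zip(seq, seq[1:]):
        options = [b for b in BASES if b != prev]
        trits.append(options.index(cur))
    return trits
```

For example, the trit stream `[2, 1, 0, 2]` starting from `A` encodes to `ATCAT`, and decoding that strand recovers the original trits.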

"We applied the same photolithographic approach used by the computer chip industry to manufacture chips with electrical circuits patterned with nanometer precision to write DNA," said first author Howon Lee, Ph.D., a postdoctoral fellow in Church's group at the time of the study. "This provides enzymatic DNA synthesis with the potential of unprecedented multiplexing in the production of data-encoding DNA strands."

Photolithography, like photography, uses light to transfer images onto a substrate to induce a chemical change. The computer chip industry miniaturized this process and uses silicon instead of film as a substrate. Church's team now adapted the chip industry's capabilities in their new DNA writing approach by substituting silicon with their array matrix consisting of microfluidic cells containing the short DNA primer sequences. In order to control DNA synthesis at primers positioned in the 3x4 pattern, the team directed a beam of UV-light onto a dynamic mask (as is done in computer chip manufacturing) - which essentially is a stencil of the 3x4 pattern in which DNA synthesis is activated - and shrunk the patterned beam on the other side of the mask with optical lenses down to the size of the array matrix.

"The UV-light reflected from the mask pattern precisely hits the target area of primer elongation and frees up cobalt ions, which the TdT enzyme needs in order to function, by degrading a light-sensitive "caging" molecule that shields the ions from TdT," explained co-author Daniel Wiegand, Research Scientist at the Wyss Institute. "By the time the UV-light is turned off and the TdT enzyme deactivated again with excess caging molecules, it has added a single nucleotide base or a homopolymer block of one of the four nucleotide bases to the growing primer sequences."

This cycle can be repeated multiple times whereby in each round only one of the four nucleotide bases or a homopolymer of a specific nucleotide base is added to the array matrix. In addition, by selectively covering specific openings of the mask during each cycle, the TdT enzyme only adds that specific nucleotide base to DNA primers where it is activated by UV-light, allowing the researchers to fully program the sequence of nucleotides in each of the strands.
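One way to picture this cycle-by-cycle masking logic is as a simple scheduler: rotate through the four bases, and in each cycle "uncover" only those strands whose next required base matches the base being added. The sketch below illustrates the general principle of mask-directed array synthesis; it is not the actual instrument control software, and all names are invented.

```python
def schedule_synthesis(targets, bases="ACGT"):
    """Return a list of (base, strand_indices) cycles that builds every
    target sequence by selective exposure, one base type per cycle."""
    pos = [0] * len(targets)  # next position to synthesize on each strand
    cycles = []
    i = 0
    while any(p < len(t) for p, t in zip(pos, targets)):
        base = bases[i % 4]
        # "open the mask" only over strands that need this base next
        exposed = [j for j, (p, t) in enumerate(zip(pos, targets))
                   if p < len(t) and t[p] == base]
        if exposed:
            cycles.append((base, exposed))
            for j in exposed:
                pos[j] += 1
        i += 1
    return cycles
```

Replaying the returned schedule, appending each cycle's base to every exposed strand, reconstructs all the target sequences, which is exactly the property the selective mask openings provide.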

"Photon-directed multiplexed enzymatic DNA synthesis on this newly instrumented platform can be further developed to enable much higher automated multiplexing with improved TdT enzymes, and, eventually, make DNA-based data storage significantly more effective, faster, and cheaper," said co-corresponding author Richie Kohman, Ph.D., a Lead Senior Research Scientist in the Wyss Institute's Synthetic Biology Focus Area, who helped coordinate the research in Church's team at the Wyss Institute.

"This new approach to enzyme-directed synthetic DNA synthesis by the Church team is a clever piece of bioinspired engineering that combines the power of DNA replication with one of the most controllable and robust manufacturing methods developed by humanity - photolithography - to provide a solution that brings us closer to the goal of establishing DNA as a usable data storage medium," said the Wyss Institute's Founding Director Don Ingber, M.D., Ph.D., who is also the Judah Folkman Professor of Vascular Biology at Harvard Medical School and Boston Children's Hospital, and Professor of Bioengineering at the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS).

Credit: 
Wyss Institute for Biologically Inspired Engineering at Harvard

Octopus-inspired sucker transfers thin, delicate tissue grafts and biosensors

CHAMPAIGN, Ill. -- Thin tissue grafts and flexible electronics have a host of applications for wound healing, regenerative medicine and biosensing. A new device inspired by an octopus's sucker rapidly transfers delicate tissue or electronic sheets to the patient, overcoming a key barrier to clinical application, according to researchers at the University of Illinois at Urbana-Champaign and collaborators.

"For the last few decades, cell or tissue sheets have been increasingly used to treat injured or diseased tissues. A crucial aspect of tissue transplantation surgery, such as corneal tissue transplantation surgery, is surgical gripping and safe transplantation of soft tissues. However, handling these living substances remains a grand challenge because they are fragile and easily crumple when picking them up from the culture media," said study leader Hyunjoon Kong, a professor of chemical and biomolecular engineering at Illinois.

Kong's group, along with collaborators at Purdue University, the University of Illinois at Chicago, Chung-Ang University in South Korea, and the Korea Advanced Institute of Science and Technology, published their work in the journal Science Advances.

Current methods of transferring the sheets involve growing them on a temperature-sensitive soft polymer that, once transferred, shrinks and releases the thin film. However, this process takes 30-60 minutes to transfer a single sheet, requires skilled technicians and runs the risk of tearing or wrinkling, Kong said.

"During surgery, surgeons must minimize the risk of damage to soft tissues and transplant quickly, without contamination. Also, transfer of ultrathin materials without wrinkle or damage is another crucial aspect," Kong said.

Seeking a way to quickly pick up and release the thin, delicate sheets of cells or electronics without damaging them, the researchers turned to the animal kingdom for inspiration. Seeing the way an octopus or squid can pick up both wet and dry objects of all shapes with small pressure changes in their muscle-powered suction cups, rather than a sticky chemical adhesive, gave the researchers an idea.

They designed a manipulator made of a temperature-responsive layer of soft hydrogel attached to an electric heater. To pick up a thin sheet, the researchers gently heat the hydrogel to shrink it, then press it to the sheet and turn off the heat. The hydrogel expands slightly, creating suction with the soft tissue or flexible electronic film so it can be lifted and transferred. Then they gently place the thin film on the target and turn the heater back on, shrinking the hydrogel and releasing the sheet.

The entire process takes about 10 seconds. See a video at https://youtu.be/EhMySEk-lRw.

Next, the researchers hope to integrate sensors into the manipulator, to further take advantage of their soft, bio-inspired design.

"For example, by integrating pressure sensors with the manipulator, it would be possible to monitor the deformation of target objects during contact and, in turn, adjust the suction force to a level at which materials retain their structural integrity and functionality," Kong said. "By doing so, we can improve the safety and accuracy of handling these materials. In addition, we aim to examine therapeutic efficacy of cells and tissues transferred by the soft manipulator."

Credit: 
University of Illinois at Urbana-Champaign, News Bureau

Viral 'molecular scissor' is next COVID-19 drug target

image: The SARS-CoV-2-PLpro enzyme is visualized with an inset of viral inhibitor interaction. Blocking the enzyme's effects may prove fruitful in stopping coronavirus infections.

Image: 
Image courtesy Shaun K. Olsen, PhD, laboratory at The University of Texas Health Science Center at San Antonio (Joe R. and Teresa Lozano Long School of Medicine)

SAN ANTONIO, Texas, USA - American and Polish scientists, reporting Oct. 16 in the journal Science Advances, laid out a novel rationale for COVID-19 drug design - blocking a molecular "scissor" that the virus uses for virus production and to disable human proteins crucial to the immune response.

The researchers are from The University of Texas Health Science Center at San Antonio (UT Health San Antonio) and the Wroclaw University of Science and Technology. Information gleaned by the American team helped Polish chemists to develop two molecules that inhibit the cutter, an enzyme called SARS-CoV-2-PLpro.

SARS-CoV-2-PLpro promotes infection by sensing and processing both viral and human proteins, said senior author Shaun K. Olsen, PhD, associate professor of biochemistry and structural biology in the Joe R. and Teresa Lozano Long School of Medicine at UT Health San Antonio.

"This enzyme executes a double-whammy," Dr. Olsen said. "It stimulates the release of proteins that are essential for the virus to replicate, and it also inhibits molecules called cytokines and chemokines that signal the immune system to attack the infection."

SARS-CoV-2-PLpro cuts human proteins ubiquitin and ISG15, which help maintain protein integrity. "The enzyme acts like a molecular scissor," Dr. Olsen said. "It cleaves ubiquitin and ISG15 away from other proteins, which reverses their normal effects."

Dr. Olsen's team, which recently moved to the Long School of Medicine at UT Health San Antonio from the Medical University of South Carolina, solved the three-dimensional structures of SARS-CoV-2-PLpro and the two inhibitor molecules, which are called VIR250 and VIR251. X-ray crystallography was performed at the Argonne National Laboratory near Chicago.

"Our collaborator, Dr. Marcin Drag, and his team developed the inhibitors, which are very efficient at blocking the activity of SARS-CoV-2-PLpro, yet do not recognize other similar enzymes in human cells," Dr. Olsen said. "This is a critical point: The inhibitor is specific for this one viral enzyme and doesn't cross-react with human enzymes with a similar function."

Specificity will be a key determinant of therapeutic value down the road, he said.

The American team also compared SARS-CoV-2-PLpro against similar enzymes from coronaviruses of recent decades, SARS-CoV-1 and MERS-CoV. They learned that SARS-CoV-2-PLpro processes ubiquitin and ISG15 quite differently than its SARS-CoV-1 counterpart.

"One of the key questions is whether that accounts for some of the differences we see in how those viruses affect humans, if at all," Dr. Olsen said.

By understanding similarities and differences of these enzymes in various coronaviruses, it may be possible to develop inhibitors that are effective against multiple viruses, and these inhibitors potentially could be modified when other coronavirus variants emerge in the future, he said.

Credit: 
University of Texas Health Science Center at San Antonio

Congenital heart defects may not increase the risk of severe COVID-19 symptoms

NEW YORK, NY (Oct. 16)--Adults and children born with heart defects had a lower-than-expected risk of developing moderate or severe COVID-19 symptoms, finds a study of more than 7,000 patients from the congenital heart disease center at Columbia University Vagelos College of Physicians and Surgeons.

Throughout the course of the pandemic, evidence has shown that individuals with heart disease have a higher risk of life-threatening illness and complications from COVID-19. But the impact of SARS-CoV-2 infection on individuals with congenital heart defects, who are generally younger than those with adult-onset heart disease, was unknown.

About 1% (40,000) of babies born each year in the United States have one or more heart defects.

"At the beginning of the pandemic, many feared that congenital heart disease would be as big a risk factor for severe COVID-19 as adult-onset cardiovascular disease," says Matthew Lewis, MD, assistant professor of medicine at Columbia University Vagelos College of Physicians and Surgeons and co-leader of the study. "We were reassured by the low number of congenital heart patients who required hospitalization for COVID-19 and the relatively good outcomes of these patients."

Few congenital heart patients had COVID-19

Only 53 congenital heart patients (43 adults and 10 children)--less than 0.8% of patients at Columbia's congenital heart center--presented to their physician with symptoms of SARS-CoV-2 infection from March through June. (During the study period, an estimated 20% of people in the New York metropolitan area are thought to have been infected with the coronavirus.)

More than 80% (43) of these patients had mild symptoms. Of the 9 patients who developed moderate to severe symptoms, 3 died. (Another study performed at Columbia University Irving Medical Center during the same period found that roughly 22% of hospitalized patients from the general population became critically ill and about one-third of those patients died.)

In the new study, the researchers found that patients with a genetic syndrome and adults with advanced disease from their congenital heart defect were more likely to develop moderate to severe symptoms, though an individual's type of congenital heart defect did not affect symptom severity.

Though the study sample was small, the researchers conclude that congenital heart disease alone may not be enough to increase the risk of severe COVID-19 symptoms.

Caveats

It's unlikely that people with congenital heart disease have an intrinsically lower risk of becoming severely ill from the new coronavirus. Instead, the researchers hypothesize that the patients in this study may have adhered more strictly to social distancing guidelines than the general population, given the publicity about increased COVID-19 risk in patients with heart disease. The researchers caution that individuals with congenital heart disease should continue to practice strict social distancing and follow all CDC guidelines, as these measures are likely contributing to the study findings.

They also note that the younger average age (34 years) of these patients and lower incidence of acquired cardiac risk factors compared with other individuals who had severe COVID-19 may explain why fewer congenital heart patients than expected had severe symptoms.

"It's possible that elderly patients with congenital heart disease might have a different risk profile than the general population," adds Brett Anderson, Florence Irving Assistant Professor of Pediatrics at Columbia University Vagelos College of Physicians and Surgeons and co-leader of the study. "We have yet to define what those risk factors are."

Credit: 
Columbia University Irving Medical Center

Results from the TARGET FFR study reported at TCT Connect

NEW YORK - October 16, 2020 - Results from the randomized controlled TARGET FFR trial show that while a physiology-guided percutaneous coronary intervention (PCI) optimization strategy did not achieve a significant increase in the proportion of patients with a final fractional flow reserve (FFR) ≥0.90, it reduced the proportion of patients with a residual FFR ≤0.80 following PCI.

Findings were reported today at TCT Connect, the 32nd annual scientific symposium of the Cardiovascular Research Foundation (CRF). TCT is the world's premier educational meeting specializing in interventional cardiovascular medicine.

A total of 260 patients were randomized between March 2018 and November 2019 at a single site. Patients undergoing successful, standard-of-care PCI for either stable angina or a medically stabilized non-ST-segment-elevation myocardial infarction (NSTEMI) were eligible. Following angiographically successful PCI procedures, patients were randomized 1:1 to receive either a physiology-guided incremental optimization strategy (PIOS intervention group, n=131) or blinded post-PCI coronary physiology measurements (control group, n=129).

The trial's primary endpoint was the proportion of patients with a final post-PCI FFR ≥0.90. The incidence of a final FFR ≥0.90 was 10 percentage points higher in the PIOS group than in the control group, but the difference was not statistically significant (38.1% vs. 28.1%, p=0.099). However, the study's secondary endpoint, the proportion of patients with a final FFR ≤0.80, was significantly lower in the PIOS group (18.6% vs. 29.8%, p=0.045).

Based on FFR pullback assessment of the stented vessel, a target for further optimization was present in 60 of the 131 patients (46%) randomized to PIOS, and operators considered it appropriate to perform additional post-dilatation and/or stenting in 40 of these 60 patients (66%). Among patients who had further intervention/optimization performed, mean post-PCI FFR increased significantly, from 0.76 to 0.82.

"When assessing the proposed optimal post-PCI FFR cutoff value of ≥0.90, we found that the majority of patients with angiographically acceptable PCI results actually have a physiologically suboptimal outcome," said Damien Collison, MD, Interventional Cardiologist at the Golden Jubilee National Hospital, Glasgow, Scotland. "Up to 30% of patients may even have a final FFR result that remains below the guideline-directed threshold for performing revascularization in the first place. In our randomized controlled trial, application of an FFR-guided optimization strategy after stenting led to improvements in both FFR and CFR and significantly reduced the proportion of patients with a final post-PCI FFR ≤0.80."

The TARGET FFR trial was funded by NHS National Waiting Times Centre Board endowment funds. Dr. Collison reported the following disclosures: consulting fees/honoraria from Abbott Medical and MedAlliance.

Credit: 
Cardiovascular Research Foundation

How bacteria adapt their machinery for optimum growth

The most important components for the functioning of a biological cell are its proteins. As a result, protein production is arguably the most important process for cell growth: the faster the bacterial growth rate, the faster protein synthesis needs to take place. Because protein synthesis is the most expensive cellular process in terms of resource usage, it appears reasonable to assume that the cell increases its production capacity by hosting more copies of the complicated machinery in proportion to its growth rate. This would mean that in order for growth to double, twice as many copies of all components of the translation machinery would be needed.

It has been clear since the 1960s, however, that it's not that simple. Instead, the composition of the 'cocktail' of individual components in the machinery, which itself is made from proteins and RNA, varies with the growth rate. A new, complex computer model developed in Düsseldorf shows what concentrations of the individual components are needed in order to produce different synthesis rates, explaining for the first time the reasons behind the observed variations across growth conditions.

Xiao-Pan Hu, a doctoral student in Prof. Dr. Martin Lercher's Computational Cell Biology group at the HHU, developed the model. Hu used computer modelling to encode established biochemical principles at the cellular level. The resulting model can be used to calculate the speed with which a cell can produce its components and thus predicts cell growth based on a predefined composition of its machinery.

Theoretically, each production rate can be realised using a large number of different molecule concentrations. The question is: What does nature do? Which one of the many feasible compositions do real Escherichia coli ('E. coli') bacteria use and why? Hu and his colleagues have based their work on a simple assumption reflected everywhere in nature: an organism generally has an evolutionary advantage if it needs as few resources as possible for its development. Consequently, the team searched through the many possible compositions for the one that is 'cheapest' for the cell, i.e., the one that requires the smallest possible total mass of molecules.
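The "cheapest composition" idea can be illustrated with a deliberately simple toy model, which is not the published Düsseldorf model: suppose synthesis flux depends on two machinery components with saturating kinetics, and we search for the composition that reaches a target flux at minimal total mass. All constants and names here are invented for illustration.

```python
import itertools

KCAT, K = 10.0, 0.5      # toy catalytic rate and half-saturation constants
M_R, M_E = 20.0, 5.0     # toy molecular masses of the two components

def flux(R, E):
    # Michaelis-Menten-like dependence on "ribosomes" R and a "factor" E
    return KCAT * R * E / (K + E)

def cheapest_composition(target_flux, step=0.01, upper=5.0):
    """Grid-search (R, E) for the smallest total mass M_R*R + M_E*E
    that still achieves the target synthesis flux."""
    grid = [i * step for i in range(1, int(upper / step) + 1)]
    best = None
    for R, E in itertools.product(grid, grid):
        if flux(R, E) >= target_flux:
            cost = M_R * R + M_E * E
            if best is None or cost < best[0]:
                best = (cost, R, E)
    return best
```

Many (R, E) pairs achieve the same flux, but only one minimizes total mass; in the same spirit, the HHU model picks out the machinery composition that is "cheapest" for the cell among all feasible ones.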

Comparisons with experimental data show that this assumption is correct and accurately predicts the concentrations measured in real E. coli bacteria colonies. This allowed the Düsseldorf-based research team not only to describe the data quantitatively but also to actually understand the reasons behind the data, namely that a principle found in many other areas of life also applies here.

In further analyses, the model also proved accurate for situations where the bacteria are exposed to antibiotics. In exceptional circumstances like these, the bacteria are particularly stressed and need a toolset that is arranged differently in order to grow.

The research group is currently investigating whether the findings for protein synthesis can also be applied to other cellular processes and other organisms. The models developed as part of this work should also help to design biotech procedures more efficiently. They make it possible to calculate the optimum concentrations of the individual components in the cell for the desired biological production.

Credit: 
Heinrich-Heine University Duesseldorf