Culture

Growing up trilobite

image: A slab with multiple species of trilobite fossils embedded. Scale is in centimeters.

Image: 
M. Hopkins/© AMNH

If you've ever held a trilobite fossil, seen one in a classroom, or walked by one in a store, chances are it was Elrathia kingii, one of the most common and well-recognized trilobites, collected by the hundreds of thousands in western Utah. But despite the species' popularity, scientists had not determined how it grew--from hatchling to juvenile to adult--until now. New work from the American Museum of Natural History, published today in the journal Papers in Palaeontology, describes the development and growth rate of Elrathia kingii--only the second such dataset to be compiled for a trilobite--allowing for the first comparison of growth between trilobite species.

"There's quite a big size range among trilobites. Some never got bigger than about a centimeter, while the largest on record is 72 centimeters (28 inches)," said Melanie Hopkins, an associate curator in the Museum's Division of Paleontology and the study's author. "Growth-rate studies like this one can help us tackle some of the big-picture questions: How did some trilobites get so big? What was the environmental context for that? And how did body size evolve over the evolutionary history of the clade?"

Trilobites are a group of extinct marine arthropods--distantly related to the horseshoe crab--that lived for almost 300 million years. They were incredibly diverse, with more than 20,000 described species. Their fossilized exoskeletons are preserved in sites all over the world, from the United States to China. Like insects, they molted throughout their lifetimes, leaving clues to how they changed during development. But to calculate the species' growth rate, scientists need fossils representing all stages of the animal's life--and lots of them.

"There are tons of specimens of Elrathia kingii out there but most of them are adults, and data from exactly where they were collected is inconsistent," Hopkins said. "I needed material that I could collect from as small a section as possible that included a lot of juveniles."

So in May 2018, Hopkins spent five days in Utah with a crew consisting of Museum staff and volunteers at a new fossil site said to preserve bucketloads of Elrathia kingii. By the end of the trip, they had collected about 500 specimens--many of them juveniles, which can be as small as half a millimeter long--from a section of outcrop just 1.5 meters (about 5 feet) long.

Hopkins estimated the growth rate and compared it with previously published data on a different trilobite, Aulacopleura konincki--the first time two trilobite species have been compared in this way. The two species look very similar, and Hopkins found that they also grew in similar ways: for example, the growth of the trunk--the region just behind the trilobite's head, made up of segments that are added with age--was controlled by a growth gradient, with the younger segments closer to the back of the body growing faster. But while Elrathia kingii was smaller in early development and went through fewer molts before adulthood, it had faster growth rates, ultimately reaching sizes on par with Aulacopleura konincki, the largest of which are about 4 centimeters long.
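The trade-off described above can be sketched numerically. Below is a toy per-molt growth model (in the spirit of Dyar's rule, under which body size multiplies by a roughly constant factor at each molt) showing how a species that starts smaller and molts fewer times can still catch up if its per-molt growth rate is higher. The rates and molt counts are invented for illustration; they are not the paper's estimates.

```python
# Toy sketch of per-molt growth: size multiplies by a roughly constant
# factor at each molt (Dyar's rule). All parameter values are hypothetical.
def size_after_molts(initial_mm, growth_factor, molts):
    """Body size in mm after a given number of molts."""
    return initial_mm * growth_factor ** molts

# Hypothetical comparison: starting from a 0.5 mm hatchling, a faster
# per-molt growth rate over fewer molts can reach roughly the same adult
# size as a slower rate over more molts (both land near 4 cm).
fast = size_after_molts(0.5, 1.35, 15)  # fewer molts, faster growth
slow = size_after_molts(0.5, 1.25, 20)  # more molts, slower growth
print(round(fast, 1), round(slow, 1))
```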

In future studies, Hopkins is planning to add growth-rate data on different, more diverse-looking trilobite species to her models.

Credit: 
American Museum of Natural History

Scientists identify new material with potential for brain-like computing

The most powerful and advanced computing is still primitive compared to the power of the human brain, says Chinedu E. Ekuma, Assistant Professor in Lehigh University's Department of Physics.

Ekuma's lab, which aims to understand the physical properties of materials, develops models at the interface of computation, theory, and experiment. One area of focus: two-dimensional (2D) materials. Also dubbed low-dimensional materials, these are crystalline nanomaterials that consist of a single layer of atoms. Their novel properties make them especially useful for the next generation of AI-powered electronics, known as neuromorphic, or brain-like, devices.

Neuromorphic devices attempt to mimic how the human brain processes information more closely than current computing methods do. A key challenge in neuromorphic research is matching the human brain's flexibility and its ability to learn from unstructured inputs while maintaining the brain's energy efficiency. According to Ekuma, early successes in neuromorphic computing relied mainly on conventional silicon-based materials, which are energy inefficient.

"Neuromorphic materials have a combination of computing and memory capabilities and the energy efficiency needed for brain-like applications," he says.

Now Ekuma and his colleagues at the Sensors and Electron Devices Directorate of the U.S. Army Research Laboratory have developed a new complex-material design strategy for potential use in neuromorphic computing, using metallocene intercalation in hafnium disulfide (HfS2). The work is the first to demonstrate the effectiveness of a design strategy that functionalizes a 2D material with an organic molecule. It has been published in an article titled "Dynamically reconfigurable electronic and phononic properties in intercalated HfS2" in Materials Today. Additional authors: Sina Najmaei, Adam A. Wilson, Asher C. Leff, and Madan Dubey of the U.S. Army Research Laboratory.

"We knew that low-dimensional materials showed novel properties, but we did not expect such high tunability of the HfS2-based system," says Ekuma. "The strategy was a concerted effort and synergy between experiment and computation. It started with an afternoon coffee chat where my colleagues and I discussed exploring the possibility of introducing organic molecules into a gap, known as van der Waals gap, in 2D materials. This was followed by the material design and rigorous computations to test the feasibility. Based on the encouraging computational data, we proceeded to make the sample, characterize the properties, and then made a prototype device with the designed material."

This research may be of particular interest to scholars in search of energy-efficient materials, as well as to industry--especially semiconductor companies designing logic gates and other electronic devices.

"The key takeaway here is that complex materials design based on 2D materials is a promising route to achieving high-performing and energy-efficient materials," says Ekuma.

Credit: 
Lehigh University

In the sharing economy, consumers see themselves as helpers

COLUMBUS, Ohio - Whether you use a taxi or a rideshare app like Uber, you're still going to get a driver who will take you to your destination.

But consumers view an employee of a taxi company differently from an independent driver picking up riders via an app, a new Ohio State University study suggests.

Consumers see themselves as helping independent providers like those on rideshare apps. When they use traditional firms, like a taxi company, they don't view themselves as helping the employees - they're just purchasing a service.

The peer-to-peer business model of firms like Uber or Airbnb is changing how consumers view some service providers, said John Costello, lead author of the study and a doctoral candidate in marketing at Ohio State's Fisher College of Business.

"Previous work has shown that consumers view employees as being an extension of the company they work for," Costello said.

"But we found that consumers see providers for these peer-to-peer companies as separate from the company - as people just like themselves."

The study was published online recently in the Journal of Marketing.

These different views of service providers have important implications for how firms like Uber and Airbnb market themselves to consumers, the study found. They may also have consequences for issues like how consumers tip independent providers and whether consumers support regulations in the sharing economy.

Results showed that peer-to-peer companies had better success marketing themselves to consumers when they focused on the people that provide their service, and less success when they focused on their companies or apps.

"When peer-to-peer companies focus their marketing on the people who provide their services, we found it made consumers think about how their purchases benefit the individual providers," said study co-author Rebecca Reczek, professor of marketing at Ohio State's Fisher College.

"But when peer-to-peer firms focused on their apps instead, it makes people think that they're just purchasing from a company rather than thinking about how their purchase helps an individual."

In one real-life field study, the researchers partnered with a peer-to-peer company whose app allowed college students to buy or rent textbooks from each other.

The company set up a table on a college campus for several days to promote its service. On half of the days, the banner on the table and the promotional cards available focused on using the company's app to buy or rent books. On the other half, the banner and cards focused on how students could buy or rent books directly from their classmates.

Results showed that more of the promotional cards were taken when the focus was on the student providers (379) than when the focus was on the app itself (281).
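As an illustration of how counts like these can be compared (this analysis is ours, not necessarily the paper's), a simple chi-square goodness-of-fit test against an even split shows the gap between 379 and 281 cards is unlikely to be chance:

```python
# Illustrative chi-square goodness-of-fit test on the card counts reported
# in the field study (379 vs. 281), against a 50/50 null split.
# This is a back-of-the-envelope check, not the paper's own analysis.
observed = [379, 281]
expected = sum(observed) / len(observed)  # 330 cards each under the null
chi2 = sum((o - expected) ** 2 / expected for o in observed)
print(round(chi2, 2))  # 14.55
# The critical value for df=1 at alpha = 0.001 is 10.83, so a difference
# this large would arise by chance less than 0.1% of the time.
```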

In a second study, 259 students were shown ads for one of two fictitious firms, either "Reliable Rideshare" or "Reliable Cab." Some of the ads focused on the companies themselves (the cab or the rideshare company) and some focused on people (employees of the cab company or drivers for the rideshare company).

Participants were also asked if they thought their purchases from either the rideshare or cab company would help someone.

Findings showed that consumers were more likely to say they would purchase from the rideshare company when the ad focused on the providers rather than the company itself. But for the cab company, the ads' focus made no difference in purchase intention.

The reason was that participants were more likely to say their purchases were helping people when they used the rideshare drivers than when they used the cab company employees, the study found.

"When people are buying from an employee, like those who work for a taxi company, they don't really think of themselves as helping these workers. They're just making a transaction," Costello said.

Overall, the results show that peer-to-peer companies should focus on people who provide their services in their ads and marketing materials, Reczek said.

But the findings may also have wider implications. For example, if people think they are already helping rideshare drivers simply by hiring them, they may be less likely to tip them or tip them less. They may also be less likely to support regulations that financially protect these workers.

But some research suggests that rideshare drivers often make less than minimum wage, she noted.

"This perception of consumers that they're helping simply through their purchases may have negative consequences for providers," Reczek said.

"Their beliefs may not match the economic reality of what it is like to be a provider for a peer-to-peer firm."

Credit: 
Ohio State University

A nanomaterial path forward for COVID-19 vaccine development

image: A graphic of the SARS-CoV-2 virus.

Image: 
UC San Diego

From mRNA vaccines entering clinical trials, to peptide-based vaccines and using molecular farming to scale vaccine production, the COVID-19 pandemic is pushing new and emerging nanotechnologies into the frontlines and the headlines.

Nanoengineers at UC San Diego detail the current approaches to COVID-19 vaccine development, and highlight how nanotechnology has enabled these advances, in a review article in Nature Nanotechnology published July 15.

"Nanotechnology plays a major role in vaccine design," the researchers, led by UC San Diego Nanoengineering Professor Nicole Steinmetz, wrote. Steinmetz is also the founding director of UC San Diego's Center for Nano ImmunoEngineering. "Nanomaterials are ideal for delivery of antigens, serving as adjuvant platforms, and mimicking viral structures. The first candidates launched into clinical trials are based on novel nanotechnologies and are poised to make an impact."

Steinmetz is leading a National Science Foundation-funded effort to develop--using a plant virus--a stable, easy-to-manufacture COVID-19 vaccine patch that can be shipped around the world and painlessly self-administered by patients. Both the vaccine itself and the microneedle patch delivery platform rely on nanotechnology. This vaccine falls into the peptide-based approach described below.

"From a vaccine technology development point of view, this is an exciting time, and novel technologies and approaches are poised to make a clinical impact for the first time," said Steinmetz. "For example, to date, no mRNA vaccine has been clinically approved, yet Moderna's mRNA vaccine technology for COVID-19 is making headway and was the first vaccine to enter clinical testing in the US."

As of June 1, there are 157 COVID-19 vaccine candidates in development, with 12 in clinical trials.

"There are many nanotechnology platform technologies being put toward applications against SARS-CoV-2; while highly promising, many of these may be several years away from deployment and therefore may not make an impact on the SARS-CoV-2 pandemic," Steinmetz wrote. "Nevertheless, as devastating as COVID-19 is, it may serve as an impetus for the scientific community, funding bodies, and stakeholders to put more focused efforts toward development of platform technologies that prepare nations for future pandemics."

To mitigate some of the downsides of contemporary vaccines--namely live-attenuated or inactivated strains of the virus itself--advances in nanotechnology have enabled several types of next-generation vaccines, including:

Peptide-based vaccines: Using a combination of informatics and immunological investigation of antibodies and patient sera, various B- and T-cell epitopes of the SARS-CoV-2 S protein have been identified. As serum from convalescent COVID-19 patients is screened for neutralizing antibodies, experimentally derived peptide epitopes will confirm useful epitope regions and lead to more optimal antigens in second-generation SARS-CoV-2 peptide vaccines. The National Institutes of Health recently funded the La Jolla Institute for Immunology in this endeavor.

Peptide-based approaches represent the simplest form of vaccines that are easily designed, readily validated and rapidly manufactured. Peptide-based vaccines can be formulated as peptides plus adjuvant mixtures or peptides can be delivered by an appropriate nanocarrier or be encoded by nucleic acid vaccine formulations. Several peptide-based vaccines as well as peptide-nanoparticle conjugates are in clinical testing and development targeting chronic diseases and cancer, and OncoGen and University of Cambridge/DIOSynVax are using immunoinformatics-derived peptide sequences of S protein in their COVID-19 vaccine formulations.

An intriguing class of nanotechnology for peptide vaccines is virus-like particles (VLPs) derived from bacteriophages and plant viruses. While non-infectious toward mammals, these VLPs mimic the molecular patterns associated with pathogens, making them highly visible to the immune system. This allows VLPs to serve not only as the delivery platform but also as the adjuvant: VLPs enhance the uptake of viral antigens by antigen-presenting cells and provide an additional immune stimulus, leading to activation and amplification of the ensuing immune response. Steinmetz and Professor Jon Pokorski received an NSF Rapid Research Response grant to develop a peptide-based COVID-19 vaccine from a plant virus: https://jacobsschool.ucsd.edu/news/news_releases/release.sfe?id=3005. Their approach uses the Cowpea mosaic virus, which infects legumes, engineering it to look like SARS-CoV-2 by displaying antigenic peptides on its surface to stimulate an immune response.

Their approach, as well as other plant-based expression systems, can be easily scaled up using molecular farming. In molecular farming, each plant is a bioreactor. The more plants are grown, the more vaccine is made. The speed and scalability of the platform was recently demonstrated by Medicago manufacturing 10 million doses of influenza vaccine within one month. In the 2014 Ebola epidemic, patients were treated with ZMapp, an antibody cocktail manufactured through molecular farming. Molecular farming has low manufacturing costs, and is safer since human pathogens cannot replicate in plant cells.

Nucleic-acid based vaccines: For fast emerging viral infections and pandemics such as COVID-19, rapid development and large scale deployment of vaccines is a critical need that may not be fulfilled by subunit vaccines. Delivering the genetic code for in situ production of viral proteins is a promising alternative to conventional vaccine approaches. Both DNA vaccines and mRNA vaccines fall under this category and are being pursued in the context of the COVID-19 pandemic.

* DNA vaccines are made of small, circular bacterial plasmids engineered to target the nuclear machinery and drive production of the SARS-CoV-2 S protein.

* mRNA vaccines, on the other hand, are based on designer mRNA delivered into the cytoplasm, where the host cell machinery translates the gene into a protein--in this case the full-length S protein of SARS-CoV-2. mRNA vaccines can be produced through in vitro transcription, which precludes the need for cells and their associated regulatory hurdles.

While DNA vaccines offer higher stability than mRNA vaccines, mRNA does not integrate into the genome and therefore poses no risk of insertional mutagenesis. Additionally, the half-life, stability, and immunogenicity of mRNA can be tuned through established modifications.

Several COVID-19 vaccines using DNA or RNA are in development: Inovio Pharmaceuticals has a Phase I clinical trial underway, and Entos Pharmaceuticals is on track for a Phase I clinical trial using DNA. Moderna's mRNA-based technology was the fastest to a Phase I clinical trial in the US, which began on March 16, and BioNTech-Pfizer recently announced regulatory approval in Germany for Phase I/II clinical trials to test four lead mRNA candidates.

Subunit vaccines: Subunit vaccines use only minimal structural elements of the pathogenic virus that prime protective immunity--either proteins of the virus itself or assembled VLPs derived from the pathogen. These VLPs are devoid of genetic material but retain some or all of the structural proteins of the pathogen, thus mimicking the immunogenic topological features of the infectious virus; they can be produced via recombinant expression and scaled up through fermentation or molecular farming. The frontrunner among developers is Novavax, which initiated a Phase I/II trial on May 25, 2020. Sanofi Pasteur/GSK, Vaxine, Johnson & Johnson, and the University of Pittsburgh have announced that they expect to begin Phase I clinical trials within the next few months. Others, including Clover Biopharmaceuticals and the University of Queensland, Australia, are independently developing subunit vaccines engineered to present the prefusion trimer conformation of the S protein, using the molecular clamp technology and the Trimer-Tag technology, respectively.

Delivery device development

Lastly, the researchers note that nanotechnology's impact on COVID-19 vaccine development does not end with the vaccine itself, but extends to the devices and platforms used to administer it. Distribution has historically been complicated by live attenuated and inactivated vaccines requiring constant refrigeration, and by shortages of health care professionals where the vaccines are needed.

"Recently, modern alternatives to such distribution and access challenges have come to light, such as single-dose slow-release implants and microneedle-based patches, which could reduce reliance on the cold chain and ensure vaccination even in situations where qualified health care professionals are rare or in high demand," the researchers write. "Microneedle-based patches could even be self-administered, which would dramatically hasten roll-out and dissemination of such vaccines as well as reduce the burden on the healthcare system."

Pokorski and Steinmetz are co-developing a microneedle delivery platform with their plant virus COVID-19 vaccine for both of these reasons.

This work is supported by a grant from the National Science Foundation (NSF CMMI-2027668).

"Advances in bio/nanotechnology and advanced nanomanufacturing coupled with open reporting and data sharing lay the foundation for rapid development of innovative vaccine technologies to make an impact during the COVID-19 pandemic," the researchers wrote. "Several of these platform technologies may serve as plug-and-play technologies that can be tailored to seasonal or new strains of coronaviruses. COVID-19 harbors the potential to become a seasonal disease, underscoring the need for continued investment in coronavirus vaccines."

Credit: 
University of California - San Diego

Genetic editing milestone in mouse model of Rett Syndrome

image: Lead author, John Sinnamon, and principal investigator, Gail Mandel.

Image: 
Rett Syndrome Research Trust

A genomic error that causes Rett Syndrome, a serious lifelong neurological disorder, can be corrected in the brains of mice by rewriting the genetic instructions carried by the RNA.

The new research, published July 14 in the journal Cell Reports, shows that RNA editing may repair the underlying cause of Rett Syndrome in a mouse model. The technology recoded enough RNA to restore half of the normal protein in three different kinds of neurons in the Rett mouse.

The results represent a promising early step toward using RNA editing to treat Rett Syndrome, a disorder that affects about 350,000 individuals worldwide. The authors, who work in the neuroscience lab of Gail Mandel at the Vollum Institute at Oregon Health & Science University (OHSU) in Portland, caution, however, that much work lies ahead to advance the potential therapeutic to the clinic.

"This was a proof of principle" that the technique works in the brain, says lead author John Sinnamon.

People diagnosed with Rett Syndrome have mutations in a gene called MECP2. The gene makes a protein that is abundant in brain cells and controls the activity of many other genes.

Disease symptoms usually appear between 12 and 18 months of age and can include loss of speech and purposeful hand use, respiratory problems, motor deficits, seizures, and gastrointestinal and orthopedic issues. No cure exists, but studies in mice suggest restoring healthy MeCP2 protein function can dramatically reverse the condition.

Hundreds of different mutations in the MECP2 gene have been found in people with Rett Syndrome. The instructions for the protein it makes are coded in combinations of four genomic "letters"--A, C, G, and T. The cell transcribes the DNA code into RNA, which is then translated into protein.

The idea behind the strategy used in this new study is to produce a healthy MeCP2 protein by repairing or "editing" the genetic error in the RNA.

The team used a mouse model of a human MECP2 mutation in which a single letter is wrong, an A where a G should be. They adapted an RNA editing technique by designing a guide to recognize the mutated section and change the A back to the normal G.
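Conceptually, the guide-directed repair works like a targeted find-and-replace on the transcript. The toy sketch below uses string matching to stand in for the real enzymatic process (an ADAR "editase" recruited by a guide RNA); all sequences and positions here are invented for illustration, not from the study.

```python
# Toy illustration of guide-directed A-to-G RNA repair. In reality an
# editase enzyme chemically converts A to inosine (read as G); here plain
# string replacement stands in for that chemistry. Sequences are invented.
def edit_rna(transcript: str, guide_target: str, offset: int) -> str:
    """Replace the mutant A at `offset` within the guide-matched region with G."""
    i = transcript.find(guide_target)
    if i == -1:
        return transcript  # guide does not match; leave transcript unedited
    j = i + offset
    assert transcript[j] == "A", "expected the mutant base at this position"
    return transcript[:j] + "G" + transcript[j + 1:]

mutant = "AUGCCAAAGCUG"  # hypothetical transcript with an A where a G belongs
print(edit_rna(mutant, "CAAAG", 2))  # -> AUGCCAGAGCUG
```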

In 2017, Sinnamon, Mandel, and their colleagues reported their first success with the RNA approach, efficiently repairing the Rett mouse mutant RNA in developing neurons in a lab dish. In the new study, the team expanded on these results. They asked three questions: Is it possible to edit MeCP2 RNA in several different types of neurons in adult mice in vivo? If so, what types of neurons can be edited? Does editing restore MeCP2 protein function?

To address these questions, the researchers packaged a mouse Mecp2 RNA guide and human editing enzyme (the "editase") in a viral vector and introduced it directly into the hippocampus, a well-studied brain structure associated with learning and memory.

The injected editase repaired about half of the RNA produced by the mutant Mecp2 gene in each of three types of neurons located in different regions of the hippocampus, and, importantly, MeCP2 protein function was equally repaired in those neurons.

"It is encouraging that this RNA editing approach seems to be efficacious in different types of neurons in the brain," says Mandel, the senior author whose lab has pioneered the concept of RNA editing in Rett Syndrome. A similar repair rate should be achievable throughout the brain, the researchers believe, if a vector can be delivered diffusely throughout the brain.

"In the next set of experiments," Mandel says "we will administer the virus by blood, so that the entire brain is subject to editing. This will allow us to monitor whether there is any amelioration of Rett symptoms in the mice."

As a strategy to restore the normal function of MECP2, the single-base RNA editing approach of changing an A to a G could address about 40% of all known Rett Syndrome-causing mutations, Sinnamon says.

The researchers reported that RNAs of genes other than Mecp2 were also inadvertently edited--a phenomenon known as off-target effects. It is unknown what effect, if any, the off-target edits had in the mice. The experimental treatment appeared to cause no harm to the mice over the study period.

As the RNA editing approach progresses through experimental steps, additional questions will need to be answered. "We need to know how much MECP2 RNA we need to repair in an individual cell and in how many cells in the nervous system," Mandel says.

Her research group and others in the editing field also want to learn more about how to increase editing efficiency while diminishing off-target effects. "There are many bright and determined investigators working on these problems," she says. "My hope is that this paper will stimulate these creative minds even further. We are trying to take advantage of what's happening in the field in real time and apply the emerging optimizations to Rett and other neurological diseases."

"The study is the first example of RNA editing in a mouse model of a neurological disease, and therefore a considerable step forward in the potential of RNA editing becoming a therapeutic for Rett Syndrome," says Monica Coenraads, executive director of Rett Syndrome Research Trust, which helped fund the study. "Strategies that edit the mutation are an elegant way of addressing the problem. This exciting progress would not be possible without all of RSRT's supporters and the affected families that are so dedicated to raising funds for us."

RSRT funds six genetic approaches to restoring a functioning MeCP2 protein in patients with disease-causing mutations. Beyond RNA editing, the other approaches are gene replacement (the closest to the clinic), gene editing, RNA trans-splicing, protein replacement, and reactivation of the silent MECP2 gene on the inactive X chromosome.

The study was funded by the Rett Syndrome Research Trust and the National Institutes of Health. Mandel is a scientific co-founder of Vico Therapeutics, a biotech company working on non-viral approaches to edit RNA in Rett syndrome and other neurological conditions.

Credit: 
Rett Syndrome Research Trust

New study shows how plants regulate their growth-inhibiting hormones to survive

image: Dimerization of OsGA2ox and OsDAO enhances their enzymatic activity.
Under low substrate concentration (GA4 or IAA) (left), GA2ox or DAO functions as a monomer with low enzyme activity. At high substrate concentration (right), GA4 or IAA at the interface causes the formation of multimers by bridging two enzyme molecules, resulting in hyper-activation. GA4 or IAA is retained in a stable interface position, allowing the two subunits to enter the active site for the next reaction without a high energy barrier. Lys or Arg is the most important amino acid for retaining GA4 or IAA and for entering the active site. Furthermore, MD simulation of OsGA2ox3 revealed the presence of a gate that allows substrate to enter the active site and product to exit. This gate has a hinge composed of three amino acids, W106, C186, and V196, and is also stabilized by the interaction between R97 in subunit A and F100 in subunit D. GA4-dependent dimerization enhances enzymatic activity. These mechanisms are conserved in all rice GA2oxs.

Image: 
S. Takehara et al. 2020

In a world with a consistently growing population and a climate crisis, food shortage is a looming threat. To alleviate this threat, crop scientists, botanists, genetic engineers, and others have been exploring ways of boosting crop productivity and resilience. One way to control plant growth and physiology is to regulate the levels of "phytohormones," or plant hormones.

However, much remains to be known about the mechanisms that underlie this hormonal regulation in plants, limiting advancement in this direction. Now, in a study led by Nagoya University, Japan, a team of scientists has discovered, using rice plants as the study model, that a process called "allosteric regulation" is involved in maintaining the phytohormonal balance in plants. Their findings, published in Nature Communications, could hold the key to significantly advancing research on plant growth and development, providing a potential solution for food security.

Plants survive by adapting their development and physiology to their surroundings, in part by controlling the levels of two phytohormones, gibberellin and auxin, via the enzymes that synthesize and degrade them. Enzymes are proteins that bind to one or more reactant chemicals and speed up a reaction; the binding site is called the active site. In 1961, it was discovered that in bacteria, enzyme activity can be enhanced or inhibited via allosteric regulation--the binding of a molecule called an "effector" at a site other than the enzyme's active site. In allosteric regulation, the structure of the enzyme changes to either support or hinder the reaction that the enzyme enables.

Professor Miyako Ueguchi-Tanaka of Nagoya University, lead scientist in the team that has now observed allosteric regulation in plants for the first time, explains their research findings: "We used a technique called X-ray crystallography and found that, as molecules of the enzymes (gibberellin 2-oxidase 3 [GA2ox3] and auxin dioxygenase [DAO]) bind to gibberellin and auxin, respectively, they interact among themselves and form 'multimeric' structures, comprising four and two units, respectively. As the amounts of gibberellin and auxin increase, so does the rate of multimerization of the enzymes. And multimerization enhances the activity of the enzymes, enabling greater degradation of gibberellin and auxin. Synchronous structural changes and activity enhancement are typical of allosteric-regulation events."
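The substrate-induced multimerization described here behaves like classic cooperative enzyme kinetics. A minimal numerical sketch using the Hill equation (all parameter values arbitrary, not from the study) shows how a multimer-like Hill coefficient makes activity rise much more steeply with substrate concentration than a monomer's would:

```python
# Toy illustration of allosteric (cooperative) kinetics via the Hill
# equation. A Hill coefficient n > 1 makes reaction velocity climb steeply
# with substrate, mimicking substrate-induced multimerization; n = 1 is
# ordinary non-cooperative (Michaelis-Menten-like) behavior. Values are
# arbitrary and purely illustrative.
def hill_activity(substrate, vmax=1.0, k_half=1.0, n=1.0):
    """Reaction velocity as a fraction of vmax at a given substrate level."""
    return vmax * substrate**n / (k_half**n + substrate**n)

for s in (0.5, 1.0, 2.0):
    mono = hill_activity(s, n=1)   # monomer: no cooperativity
    multi = hill_activity(s, n=4)  # tetramer-like: strong cooperativity
    print(f"[S]={s}: monomer {mono:.2f}, multimer {multi:.2f}")
```

At low substrate the cooperative form is less active than the monomer, but past the midpoint it overtakes it sharply, which is the hallmark switch-like response of allosteric systems.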

The scientists further carried out a "phylogenetic" analysis of GA2ox3 and DAO, which revealed that plants independently evolved this hormone regulation mechanism at three separate points over the course of evolution.

Enthusiastic about the future prospects of these findings, Prof Ueguchi says, "The activity control system revealed here can be used to artificially regulate the activity of the growth inactivating hormones in plants. As a result, rice crop productivity can be improved and high-biomass plants can be produced in the event of food shortage or an environmental crisis."

Of course, this study is only a stepping stone for now, and much remains to be done to see how the findings of this study can be applied practically in agricultural lands. However, these findings certainly are encouraging, and they signal the coming of a new era of sustainable development fueled by biotechnological advancements.

Credit: 
Nagoya University

Learning the wiring diagram for autism spectrum disorders

image: The illustration shows cerebellar cerebro-cortical circuits mediating autism spectrum disorder-relevant behaviors; specifically, modulation of Rcrus1 influences social behavior while modulation of the posterior vermis impacts repetitive behaviors and behavioral flexibility. UTSW researchers reported these findings in a recent study in Nature Neuroscience.

Image: 
UT Southwestern Medical Center

DALLAS - July 14, 2020 - A team led by UT Southwestern researchers has identified brain circuitry that plays a key role in the impaired social, repetitive, and inflexible behaviors that characterize autism spectrum disorders (ASD). The findings, published online this week in Nature Neuroscience, could lead to new therapies for these relatively prevalent disorders.

The Centers for Disease Control and Prevention estimate that about 1 in 54 children in the U.S. have ASD, a broad range of neurodevelopmental conditions thought to be caused by a combination of genetic and environmental factors. Although researchers have identified some key genes and pathways that contribute to ASD, the underlying biology of these disorders remains poorly understood, says Peter Tsai, M.D., Ph.D., assistant professor in the departments of neurology and neurotherapeutics, neuroscience, pediatrics, and psychiatry at UT Southwestern Medical Center and a member of the Peter O'Donnell Jr. Brain Institute.

However, Tsai explains, one key brain region that's been implicated in ASD dysfunction is the cerebellum, part of the hindbrain in vertebrates that holds about three-quarters of all the neurons in the body and has traditionally been linked with motor control. Recent studies by Tsai and his colleagues have demonstrated that inhibiting activity in a region of the cerebellum known as Rcrus1 can cause altered social and repetitive/inflexible behaviors reminiscent of ASD in mice. Their work also found that stimulation of this area could rescue social behaviors in an ASD-relevant model but was unable to improve repetitive or inflexible behaviors. Together, these studies suggested that additional regions of the cerebellum might also regulate repetitive and/or inflexible behaviors.

In addition, how these cerebellar regions might regulate these ASD-relevant behaviors remained unknown. To learn more about the brain circuitry controlling these behaviors, Tsai and his colleagues worked with mice genetically engineered to reduce the activity of Purkinje cells, specialized cells that turn down the activity of other brain regions. When they examined the activity of the rest of the brain, they saw increased activity in the medial prefrontal cortex (mPFC), another region previously implicated in ASD. Behavioral tests showed that these animals displayed characteristic social and repetitive/inflexible behaviors reminiscent of ASD. When the researchers inhibited mPFC activity in these animals, both social impairments and repetitive/inflexible behaviors improved.

Because the cerebellum and the mPFC are on opposite ends of the brain, Tsai and his colleagues used microscopic imaging to trace how these regions are linked. They found connections specifically between Rcrus1 and the mPFC in these animals, with decreased Rcrus1 activity leading to increased mPFC activity. Further investigation showed that this disrupted connectivity wasn't limited to these particular mice: it was also present in about a third of 94 different mouse lines carrying autism-related mutations and in two independent cohorts of people with ASD.

To better map the anatomical connections between these regions, the researchers traced the signals from Rcrus1, which appear to be routed to the mPFC through an area known as the lateral nucleus. However, modulation of this region was only sufficient to improve social behaviors in their genetic mouse model, while repetitive/inflexible behaviors remained abnormal. Tsai and colleagues therefore interrogated other cerebellar regions and found that modulation of another ASD-implicated cerebellar region, the posterior vermis, improved repetitive and inflexible behaviors. They then asked whether this cerebellar region also targets the mPFC and found that both the posterior vermis and Rcrus1 converge on the mPFC through another intermediate region, the ventromedial thalamus.

Each of these regions could play a key role in potential future therapies for ASD, Tsai explains. And because their experiments could improve dysfunctional social and repetitive/inflexible behaviors even in adult animals, it raises the possibility that therapies that target this circuit in humans might be able to improve ASD-related dysfunction even into adulthood.

"Just as an electrician can repair a home's wiring once he or she understands the wiring diagram, these findings give us potential hope for improving dysfunctional activity in the circuits involved in ASD," Tsai says.

Credit: 
UT Southwestern Medical Center

Global sentiment towards COVID-19 shifts from fear to anger

image: Graph showing the evolving public emotions over the course of the COVID-19 pandemic based on tweets. The figure is an extended version of the published data, incorporating more recent data from May and June.

Image: 
NTU Singapore

The fear that people developed at the start of the COVID-19 outbreak has given way to anger over the course of the pandemic, a study of global sentiments led by Nanyang Technological University, Singapore (NTU Singapore) has found.

In an analysis of over 20 million tweets in English related to the coronavirus, an international team of communication researchers observed that tweets reflecting fear, while dominant at the start of the outbreak due to the uncertainty surrounding the coronavirus, have tapered off over the course of the pandemic.

Xenophobia was a common theme among anger-related tweets, which progressively increased, peaking on 12 March - a day after the World Health Organisation declared the COVID-19 outbreak a pandemic. The anger then evolved to reflect feelings arising from isolation and social seclusion.

Accompanying this later shift was the emergence of tweets showing joy, which the researchers say suggested a sense of pride, gratitude, hope, and happiness. Tweets reflecting sadness doubled, although they remained proportionally lower than the other emotions.

The rapid evolution of global COVID-19 sentiments within a short period of time points to a need to address increasingly volatile emotions through strategic communication by government and health authorities, as well as responsible behaviour by netizens before they give rise to "unintended outcomes", said Professor May O. Lwin of NTU's Wee Kim Wee School of Communication and Information.

Prof Lwin, who led the team representing four countries, said: "Worldwide, strong negative sentiments of fear were detected in the early phases of the pandemic, but by early April these had gradually been replaced by anger. Our findings suggest that collective issues driven by emotions, such as shared experiences of the distress of the COVID-19 pandemic, including large-scale social isolation and the loss of human lives, are developing.

"If such overbearing public emotions are not addressed through clear and decisive communication by authorities, citizen groups and social media stakeholders, there is potential for the emergence of issues such as breeding mistrust in the handling of the disease, and a belief in online falsehoods that could hinder the ongoing control of the disease."

The study was published in the scientific journal JMIR Public Health & Surveillance in May.

A glimmer of hope and gratitude amidst anger

To identify trends in the expression of the four basic emotions - fear, anger, sadness, and joy - and examine the narratives underlying those emotions, Prof Lwin and her team first collected 20,325,929 tweets in English containing the keywords 'Wuhan', 'corona', 'nCov', and 'covid'.

The tweets, collected from late January to early April at the Institute of High Performance Computing at the Agency for Science, Technology and Research (A*STAR) using Twitter's standard search application programming interface (API), came from over 7 million unique users in more than 170 countries.

"Although the data looks at only public tweets surrounding the four selected keywords, the results are sufficient to start a conversation about possible issues arising from the pandemic at present," said Prof Lwin, whose collaborators also include Tianjin University, University of Lugano, and University of Melbourne.

The underlying emotions of tweets were then analysed using an algorithm developed by A*STAR, whose accuracy has been demonstrated in previous studies. Word clouds based on the top single words and two-word phrases were generated for each of the four emotions.
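In outline, the counting step behind those word clouds works like this. The sketch below is illustrative only: the emotion labels stand in for the output of A*STAR's proprietary classifier, and the toy tweets are invented.

```python
from collections import Counter
from itertools import tee

def bigrams(tokens):
    """All adjacent two-word phrases in a token list."""
    a, b = tee(tokens)
    next(b, None)
    return [" ".join(pair) for pair in zip(a, b)]

def top_phrases(labelled_tweets, emotion, n=5):
    """Most frequent words and two-word phrases among tweets
    carrying the given emotion label."""
    counts = Counter()
    for text, label in labelled_tweets:
        if label != emotion:
            continue
        tokens = text.lower().split()
        counts.update(tokens)           # single words
        counts.update(bigrams(tokens))  # two-word phrases
    return counts.most_common(n)

# Invented stand-ins for (tweet text, classifier label) pairs
sample = [
    ("first case reported outbreak", "fear"),
    ("outbreak spreading first case", "fear"),
    ("stay home again", "anger"),
]
print(top_phrases(sample, "fear", n=3))
```

In the real study, the same tallies were computed per day, which is what makes the fear-to-anger trend over time visible.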

Upon analysing the results, the team found that words such as 'first case' and 'outbreak' were among the most-used words in tweets from late January, indicating fear that was possibly related to the emerging coronavirus and the unknown nature of it, causing uncertainty about containment and spread.

Xenophobia was also reflected at the start of the pandemic, when the disease was predominantly contained in China and Asia, as indicated by words such as 'racist' and 'Chinese people'.

As the pandemic escalated, fears around shortages of COVID-19 diagnostic tests and medical supplies emerged, as suggested by words such as 'test shortages' and 'uncounted'. Anger then shifted to discourses around the isolation fatigue that can result from social seclusion, indicated by words such as 'stay home' and several swear words.

Signs of sadness surrounding the topics of losing friends and family members also started to surface, with words relating to 'loved one' and 'passed away' highlighting potential social concerns arising from personal traumatic experiences of the pandemic.

But accompanying these negative emotions were parallel escalating sentiments of joy relating to national pride, gratitude, and community spirit, the NTU-led team found, with words such as 'thank', 'good news' and 'feel good'.

Tweets that were collected and analysed from early April to mid-June as an extension of the JMIR study also showed that these positive sentiments exceeded fear postings on social media.

Upcoming follow-up studies led by Prof Lwin will dive into country-specific trends in public emotions. Preliminary findings show that in Singapore, there is a moderate balance of positive sentiments relating to resilience, civic pride, and celebration of heroic acts and acts of kindness. This is in contrast to other countries where strong negative emotions overwhelmingly feature in the social media posts.

Credit: 
Nanyang Technological University

COVID-19: Patients improve after immune-suppressant treatment

LOS ANGELES (July 14, 2020) - Most patients hospitalized with COVID-19 (coronavirus) pneumonia experienced improvement after receiving a Food and Drug Administration-approved drug normally given for rheumatoid arthritis, according to an observational study at Cedars-Sinai. Patients who received the drug, tocilizumab, showed reduced inflammation, lower oxygen requirements, less need for blood pressure support, and a lower risk of death, compared with published reports on severely ill COVID-19 patients.

The single-center, observational study of 27 patients was led by Stanley Jordan, MD, director of the Cedars-Sinai Nephrology and Transplant Immunology Programs, and published June 23 in Clinical Infectious Diseases.

While the patient outcomes were encouraging, investigators said they were not sufficient to prove the drug was safe and effective for use in COVID-19 patients because they did not conduct a clinical trial with a control group.

The team examined laboratory and clinical changes - including oxygen levels, the need for medication to increase blood pressure, and patient survival - in 27 patients with COVID-19 pneumonia who received the immunosuppressive drug tocilizumab to slow an out-of-control immune response. The researchers observed improved inflammatory markers and patient survival, compared with reports of patients not treated with tocilizumab.

"Researchers have been studying tocilizumab for a decade, focusing on its use for rheumatoid arthritis and cytokine storms with cancer," said Jordan, a Cedars-Sinai professor of Medicine. The medication was approved in 2010 by the FDA as treatment for rheumatoid arthritis.

The Cedars-Sinai investigators found that interleukin 6 - a protein that fuels immune cell production and is the target for tocilizumab - was the main cytokine elevated in COVID-19 patients.

"Since tocilizumab blocks interleukin 6, we reasoned that it made sense to try it with COVID-19 pneumonia patients," Jordan explained.

Cytokines are molecules secreted by multiple cell types, including immune system cells that regulate the body's immune response. A cytokine storm is a severe reaction in which immune cells flood and attack healthy organs they are supposed to protect. In COVID-19 patients, the virus stimulates immune cells that lead to collateral lung damage, which may cause blood vessels to leak and blood to clot. The patient's blood pressure sinks, and organs start to fail.

Early on in the COVID-19 pandemic, healthcare professionals discovered that cytokine storms were causing rapid deterioration in some patients. The key to patient survival, investigators are learning, is to keep that storm from gathering strength.

Most of the patients who received tocilizumab were on ventilators to support breathing. They each received one dose of tocilizumab, which helps block the signaling of the cytokine interleukin 6 - the only cytokine detected in damaging amounts in all of the study patients.

"The more interleukin 6 present in the body, the worse the patient outcome," Jordan said.

Post-treatment results showed that 23 patients experienced significant drops in body temperature and C-reactive protein (CRP) levels. CRP levels increase when infection is present in the body. Four patients did not have rapid declines in CRP levels, and three of them had poorer outcomes. Adverse events were minimal, but two deaths unrelated to tocilizumab occurred, Jordan said.

"Our observational study suggests the medication may help reduce inflammation, oxygen requirements, blood pressure support and the risk of death," Jordan said.

Jordan's current research builds on his earlier work with tocilizumab. That research focused on the drug's potential for blocking the harmful effects of interleukin 6 on organ transplantation, including rejection of a donor organ. The study found that tocilizumab helps regulate the immune response and prevents organ rejection. Jordan and his colleagues currently are carrying out a randomized, placebo-controlled trial of the investigational medication clazakizumab, another interleukin 6 blocker.

Based on his past and current research, Jordan is encouraged about potential benefits of tocilizumab for patients with COVID-19 pneumonia.

"Based on our preliminary results, I am hopeful that this class of drugs may help patients with COVID-19 pneumonia improve," Jordan said. "But we won't know the outcome until we complete a randomized controlled clinical trial."

Credit: 
Cedars-Sinai Medical Center

Improving animal research: New ARRIVE 2.0 guidelines released

image: Photo of the ARRIVE guidelines on a table.

Image: 
NC3Rs

A previous version of the ARRIVE guidelines for the rigorous reporting of animal studies was published in 2010 by the UK-based science organisation, the NC3Rs. Now, ten years later, new reporting guidelines - ARRIVE 2.0 - have been published on July 14, 2020 in the open access journal PLOS Biology. Although the original guidelines were widely endorsed by journals and funders, they have not led to the comprehensive improvements in reporting intended, and ARRIVE 2.0 sets out to address this.

Poor reporting of research methods and findings is a major factor that influences the reliability and reproducibility of scientific studies, including those involving animals. It is hard to argue that animals aren't being wasted when the research can't be properly scrutinised or reproduced because of a lack of information or detail. Tackling this has become an international effort driven by ethical concerns and the desire to ensure that data from in vivo studies fully adds to the knowledge base.

The major shift is the prioritisation of the recommendations in the new guidelines into two groups to simplify their use in practice - the "ARRIVE Essential 10", the basic minimum to include in a manuscript so that readers and reviewers can assess the reliability of the findings, and the complementary "Recommended Set", which provides context for the study. The aim is for the scientific community to focus initial efforts on the Essential 10, with the Recommended Set subsequently adopted as best reporting practice.

A major barrier to full and transparent reporting of animal research is a lack of understanding about the importance of including information on experimental design and statistical analyses, for example, sample size calculations and strategies to minimise bias, and the consequences of omitting these key details. To address this, the guidelines are also accompanied by an "Explanation and Elaboration" document which describes the relevance of each item in the guidelines with examples of good practice taken from the literature.

NC3Rs head of experimental design and reporting Dr Nathalie Percie du Sert said:

"All researchers expect that their work will make an important contribution to the knowledge base. This can only happen if the work is fully and transparently reported. In ARRIVE 2.0 we have focused on simplifying the guidelines based on our experience and feedback from the scientific community as well as addressing misconceptions about why reporting methodological details is critical in a manuscript. The responsibility now lies with researchers and organisations to embrace ARRIVE 2.0 and the associated resources we have provided, ensuring that the poor reporting of in vivo research is a thing of the past."

ARRIVE 2.0 was developed by an international working group which included funders, journal editors, statisticians and researchers from the UK, mainland Europe, North America and Australia. The prioritisation of the recommendations into the two sets was supported by a Delphi exercise with more than 70 experts from 19 countries participating. Although the primary purpose of the guidelines is to improve the quality of manuscripts, they can also be used during the planning and conduct of animal studies to help make sure that experiments are robustly designed and properly recorded, preparing the way for future publication.

Director of Research Quality at the National Institute of Neurological Disorders and Stroke at the National Institutes of Health, Dr Shai Silberberg, said "The efforts made by NC3Rs to update and streamline the ARRIVE guidelines are commendable. The ARRIVE Essential 10, which highlight items related to potential risk of bias, serve as an important step toward better study design, experimental rigor, and improved reporting."

Professor Malcolm Macleod, Professor of Neurology and Translational Neuroscience and Academic Lead for Research Improvement and Research Integrity at the University of Edinburgh, UK said "The release of ARRIVE 2.0 is an important milestone in our efforts to improve the reporting - and we hope, also, the design, conduct and analysis - of animal research. The NC3Rs are to be congratulated on bringing together such a diverse and international team to revise the guidelines, and for the huge amount of unseen effort that their team have put into supporting that process and driving it forward."

An Explanation and Elaboration document - also published in PLOS Biology - accompanies the new guidelines, and a dedicated website, https://www.arriveguidelines.org, has been launched. The website includes resources for researchers and stakeholders such as journals and funders. The ARRIVE guidelines have been translated into French and German, with more languages to follow.

To encourage wide dissemination, ARRIVE 2.0 has also been published in BMC Veterinary Research, BMJ Open Science, the British Journal of Pharmacology, the Journal of Cerebral Blood Flow and Metabolism, Experimental Physiology, and the Journal of Physiology.

Credit: 
PLOS

Scientists achieve first complete assembly of human X chromosome

Although the current human reference genome is the most accurate and complete vertebrate genome ever produced, there are still gaps in the DNA sequence, even after two decades of improvements. Now, for the first time, scientists have determined the complete sequence of a human chromosome from one end to the other ('telomere to telomere') with no gaps and an unprecedented level of accuracy.

The publication of the telomere-to-telomere assembly of a complete human X chromosome July 14 in Nature is a landmark achievement for genomics researchers. Lead author Karen Miga, a research scientist at the UC Santa Cruz Genomics Institute, said the project was made possible by new sequencing technologies that enable "ultra-long reads," such as the nanopore sequencing technology pioneered at UC Santa Cruz.

Repetitive DNA sequences are common throughout the genome and have always posed a challenge for sequencing because most technologies produce relatively short "reads" of the sequence, which then have to be pieced together like a jigsaw puzzle to assemble the genome. Repetitive sequences yield lots of short reads that look almost identical, like a large expanse of blue sky in a puzzle, with no clues to how the pieces fit together or how many repeats there are.
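The repeat ambiguity can be made concrete with a toy example (invented sequences, not real genomic data): two sequences that differ only in how many copies of a repeat they carry produce exactly the same set of short reads, while reads long enough to span the whole repeat region tell them apart.

```python
def reads(genome, k):
    """All length-k substrings ('reads') that sequencing could produce."""
    return {genome[i:i + k] for i in range(len(genome) - k + 1)}

# Invented sequences differing only in repeat copy number
g2 = "AT" + "CGCG" * 2 + "TA"   # two copies of the repeat
g3 = "AT" + "CGCG" * 3 + "TA"   # three copies

print(reads(g2, 4) == reads(g3, 4))    # short reads: the sets are identical
print(reads(g2, 10) == reads(g3, 10))  # repeat-spanning reads: the sets differ
```

This is exactly the "blue sky" problem: every short read from inside the repeat looks the same, so only a read spanning from one unique flank to the other pins down the copy number.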

"These repeat-rich sequences were once deemed intractable, but now we've made leaps and bounds in sequencing technology," Miga said. "With nanopore sequencing, we get ultra-long reads of hundreds of thousands of base pairs that can span an entire repeat region, so that bypasses some of the challenges."

Filling in the remaining gaps in the human genome sequence opens up new regions of the genome where researchers can search for associations between sequence variations and disease and for other clues to important questions about human biology and evolution.

"We're starting to find that some of these regions where there were gaps in the reference sequence are actually among the richest for variation in human populations, so we've been missing a lot of information that could be important to understanding human biology and disease," Miga said.

Miga and Adam Phillippy at the National Human Genome Research Institute (NHGRI), both corresponding authors of the new paper, co-founded the Telomere-to-Telomere (T2T) consortium to pursue a complete genome assembly after working together on a 2018 paper that demonstrated the potential of nanopore technology to produce a complete human genome sequence. That effort used the Oxford Nanopore Technologies MinION sequencer, which sequences DNA by detecting the change in current flow as single molecules of DNA pass through a tiny hole (a "nanopore") in a membrane.

The new project built on that effort, combining nanopore sequencing with other sequencing technologies from PacBio and Illumina, and optical maps from BioNano Genomics. Using these technologies, the team produced a whole-genome assembly that exceeds all prior human genome assemblies in terms of continuity, completeness, and accuracy, even surpassing the current human reference genome by some metrics.

Nevertheless, there were still multiple breaks in the sequence, Miga said. To finish the X chromosome, the team had to manually resolve several gaps in the sequence. Two segmental duplications were resolved with ultra-long nanopore reads that completely spanned the repeats and were uniquely anchored on either side. The remaining break was at the centromere, a notoriously difficult region of repetitive DNA found in every chromosome.

In the X chromosome, the centromere encompasses a region of highly repetitive DNA spanning 3.1 million base pairs (the bases A, C, T, and G form pairs in the DNA double helix and encode genetic information in their sequence). The team was able to identify variants within the repeat sequence to serve as markers, which they used to align the long reads and connect them together to span the entire centromere.
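The anchoring idea can be sketched as follows. This is a simplified illustration with invented sequences: a rare variant that occurs exactly once in each of two overlapping reads fixes their relative positions, so the reads can be joined; real assembly must additionally handle sequencing errors and thousands of reads.

```python
def merge_by_marker(read_a, read_b, marker):
    """Align two overlapping reads on a marker variant that occurs
    exactly once in each, then join them into one longer sequence."""
    offset = read_a.index(marker) - read_b.index(marker)
    # Sanity check: the overlapping portions must agree
    assert read_a[offset:] == read_b[:len(read_a) - offset], "overlap mismatch"
    return read_a[:offset] + read_b

# Invented centromere-like repeats; "CGTG" is the rare variant used as a marker
read_a = "CGCGCGCGCGTGCGCG"
read_b = "CGCGTGCGCGCGCG"
print(merge_by_marker(read_a, read_b, "CGTG"))
```

Without the marker, the two reads could be overlapped at many positions within the CG repeat; the unique variant collapses that ambiguity to a single consistent placement.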

"For me, the idea that we can put together a 3-megabase-size tandem repeat is just mind-blowing. We can now reach these repeat regions covering millions of bases that were previously thought intractable," Miga said.

The next step was a polishing strategy using data from multiple sequencing technologies to ensure the accuracy of every base in the sequence.

"We used an iterative process over three different sequencing platforms to polish the sequence and reach a high level of accuracy," Miga explained. "The unique markers provide an anchoring system for the ultra-long reads, and once you anchor the reads, you can use multiple data sets to call each base."
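Conceptually, once the reads are anchored, the final call at each position is a consensus across platforms. The sketch below is a deliberate simplification: actual polishing tools weight evidence by platform-specific error profiles rather than taking a bare majority vote, and the data here are invented.

```python
from collections import Counter

def polish(columns):
    """Call each base by majority vote across aligned datasets."""
    return "".join(Counter(col).most_common(1)[0][0] for col in columns)

# Each tuple: the base reported at one position by three platforms
aligned = [("A", "A", "A"), ("C", "C", "G"), ("T", "A", "T")]
print(polish(aligned))
```

At the middle position, two platforms agree on C against one G, so the consensus call is C; iterating this over every base is what drives the error rate down.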

Nanopore sequencing, in addition to providing ultra-long reads, can also detect bases that have been modified by methylation, an "epigenetic" change that does not alter the sequence but has important effects on DNA structure and gene expression. By mapping patterns of methylation on the X chromosome, the team was able to confirm previous observations and reveal some intriguing trends in methylation patterns within the centromere.

The new human genome sequence, derived from a human cell line called CHM13, closes many gaps in the current reference genome, known as Genome Reference Consortium build 38 (GRCh38).

The T2T consortium is continuing to work toward completion of all of the CHM13 chromosomes. "It's an open consortium, so in many respects this is a community-driven project, with a lot of people dedicating time and resources to it," Miga said.

Credit: 
University of California - Santa Cruz

NHGRI researchers generate complete human X chromosome sequence

video: Video depicting the puzzle pieces of DNA sequences coming together.

Image: 
Ernesto del Aguila III

Researchers at the National Human Genome Research Institute (NHGRI), part of the National Institutes of Health (NIH), have produced the first end-to-end DNA sequence of a human chromosome. The results, published today in the journal Nature, show that generating a precise, base-by-base sequence of a human chromosome is now possible, and will enable researchers to produce a complete sequence of the human genome.

"This accomplishment begins a new era in genomics research," said Eric Green, M.D., Ph.D., NHGRI director. "The ability to generate truly complete sequences of chromosomes and genomes is a technical feat that will help us gain a comprehensive understanding of genome function and inform the use of genomic information in medical care."

After nearly two decades of improvements, the reference sequence of the human genome is the most accurate and complete vertebrate genome sequence ever produced. However, it still contains hundreds of gaps where the DNA sequence remains unknown.

These gaps most often contain repetitive DNA segments that are exceptionally difficult to sequence. Yet, these repetitive segments include genes and other functional elements that may be relevant to human health and disease.

Because a human genome is incredibly long, consisting of about 6 billion bases, DNA sequencing machines cannot read all the bases at once. Instead, researchers chop the genome into smaller pieces, then analyze each piece to yield sequences of a few hundred bases at a time. Those shorter DNA sequences must then be put back together.

Senior author Adam Phillippy, Ph.D., at NHGRI compared this issue to solving a puzzle.

"Imagine having to reconstruct a jigsaw puzzle. If you are working with smaller pieces, each contains less context for figuring out where it came from, especially in parts of the puzzle without any unique clues, like a blue sky," he said. "The same is true for sequencing the human genome. Until now, the pieces were too small, and there was no way to put the hardest parts of the genome puzzle together."

Of the 24 human chromosomes (including X and Y), study authors Phillippy and Karen Miga, Ph.D., at the University of California, Santa Cruz, chose to complete the X chromosome sequence first, due to its link with a myriad of diseases, including hemophilia, chronic granulomatous disease and Duchenne muscular dystrophy.

Humans have two sets of chromosomes, one set from each parent. For example, biologically female humans inherit two X chromosomes, one from their mother and one from their father. However, those two X chromosomes are not identical and will contain many differences in their DNA sequences.

In this study, researchers did not sequence the X chromosome from a normal human cell. Instead, they used a special cell type - one that has two identical X chromosomes. Such a cell provides more DNA for sequencing than a male cell, which has only a single copy of an X chromosome. It also avoids sequence differences encountered when analyzing two X chromosomes of a typical female cell.

The authors and their colleagues capitalized on new technologies that can sequence long segments of DNA. Instead of preparing and analyzing small pieces of DNA, they used a method that leaves DNA molecules largely intact. These large DNA molecules were then analyzed by two different instruments. Each of them generates very long DNA sequences - something previous instruments could not accomplish.

After analyzing the human X chromosome in this fashion, Phillippy and his team used their newly developed computer program to assemble the many segments of generated sequences. Miga's group led the effort to close the largest remaining sequence gap on the X chromosome, the roughly 3 million bases of repetitive DNA found at the middle portion of the chromosome, called the centromere.

There is no "gold standard" for researchers to critically evaluate the accuracy of assembling such highly repetitive DNA sequences. To help confirm the validity of the generated sequence, Miga and her collaborators performed several validation steps.

"We have never actually seen these sequences before in our genome, and do not have many tools to test if the predictions we are making are correct. This is why it is important to have specialists in the genomics community weigh in and ensure the final product is high-quality," Miga said.

The effort is part of a broader initiative by the Telomere-to-Telomere (T2T) consortium, partially funded by NHGRI. The consortium aims to generate a complete reference sequence of the human genome.

The T2T consortium is continuing its efforts with the remaining human chromosomes, aiming to generate a complete human genome sequence in 2020.

"We don't yet know what we'll find in the newly uncovered sequences. It is the exciting unknown of discovery. This is the era of complete genome sequences, and we are embracing it wholeheartedly," Phillippy said.

Potential challenges remain. Chromosomes 1 and 9, for example, have repetitive DNA segments that are much larger than the ones encountered on the X chromosome.

"We know these previously uncharted sites in our genome are very different among individuals, but it is important to start figuring out how these differences contribute to human biology and disease," Miga said. Both Phillippy and Miga agree that enhancing sequencing methods will continue to create new opportunities in human genetics and genomics.

Credit: 
NIH/National Human Genome Research Institute

Study: RNA repair shows promise in reversing mutations underlying a neurological disorder

Scientists successfully edited RNA in a living animal in such a way that the repaired RNA then corrected a mutation in a protein that gives rise to a debilitating neurological disorder in people known as Rett syndrome.

The advance by researchers at Oregon Health & Science University was published in the journal Cell Reports.

"This is the first example of using programmable RNA editing to repair a gene in mouse models of a neurological disease," said senior author Gail Mandel, Ph.D., senior scientist in the OHSU Vollum Institute. "This gives us an approach that has some traction."

Rett syndrome is a debilitating neurological disorder caused by mutations in the gene that codes for the protein MeCP2. Because the gene is located on the X chromosome, the disease occurs almost exclusively in girls, affecting an estimated 1 in 10,000 live births.

Mandel and co-authors, led by postdoctoral fellow John Sinnamon, Ph.D., have been focused on engineering a way to repair the mutated MeCP2 protein at the level of RNA, or ribonucleic acid, which acts as a messenger that carries instructions from DNA to control the synthesis of proteins.

Other scientists have used the programmable RNA editing repair technique to target muscular dystrophy and even hearing loss in genetically engineered mice.

However, this is the first demonstration that the technique holds promise in neurological disorders rooted in genetic mutations spread across thousands of different cell types in the brain. The nervous system poses more challenges for this technique than diseases of other organs, such as muscle or liver, which have much less cellular heterogeneity.

The new study targeted and repaired the MeCP2 protein across a variety of cell types, a scientific first.

"We repaired the MeCP2 protein in three different distinct populations of neurons," Mandel said. "So it's possible that it would work throughout the brain, assuming we can deliver the editing components in a widespread manner."

Previous landmark research led by Adrian Bird, Ph.D., with the University of Edinburgh revealed it was possible to reverse Rett-like symptoms in mice, suggesting it may be possible in people, too.

"For that reason, Rett syndrome is an ideal test case for this technique," Mandel said.

While the OHSU research shows that RNA repair holds promise as a proof of concept, Mandel emphasized that much more research needs to be done to test whether it reverses Rett-like behaviors in mice, and to improve the efficiency and specificity of the repair.

Credit: 
Oregon Health & Science University

Gut bacteria protect against mosquito-borne viral illness

Chikungunya virus, once confined to the Eastern Hemisphere, has infected millions of people in the Americas since 2013, when mosquitoes carrying the virus were discovered in the Caribbean. About half of all people infected with chikungunya virus never show symptoms, while some develop fever and joint pain that lasts about a week, and 10% to 30% develop debilitating arthritis that persists for months or years.

Scientists have understood little about why the severity of the illness varies so widely. A study from researchers at Washington University School of Medicine in St. Louis indicates, in mice, that gut bacteria may play a role. The research shows that mice with faulty gut microbiomes were less able to control chikungunya virus infection. Further, giving the mice a single species of bacteria - or a chemical compound produced by that species - improved the mice's immune responses, lowered levels of the virus in their blood and reduced the chances that a mosquito that fed on blood from infected mice would acquire the virus.

The findings, published July 14 in the journal Cell, suggest that a healthy microbiome could help reduce the chance of severe chikungunya disease and possibly even reduce community spread by disrupting the transmission of virus from person to mosquito to another person.

"In many viral diseases, only a subset of the people who get infected become symptomatic, and we don't really understand why," said senior author Michael S. Diamond, MD, PhD, the Herbert S. Gasser Professor of Medicine. "There might be things that happen during your lifetime that shape your immune system and influence whether you can stop the infection early and have minimal symptoms, or fail to stop it and develop severe disease. We found that when mice don't have a healthy gut microbiome, not only do they get sicker, but mosquitoes that sample their blood are more likely to get infected. Promoting a healthy microbiome could be important not just for individuals who might get infected but for the whole community in breaking or reducing the cycle of transmission."

The gut microbiome is the community of bacteria that live in the intestines. Gut bacteria metabolize and chemically modify some of the material that comes through the digestive tract, generating vitamins and other compounds as byproducts that then are absorbed by cells or other microbes, and help regulate inflammation and the body's response to infection.

To find out if the gut microbiome affects the severity of chikungunya infection, Diamond, first author Emma Winkler, a graduate student in Diamond's lab, and colleagues studied mice without normal gut microbiomes. They used two kinds of mice: germ-free mice, which had been kept under sterile conditions since birth and therefore never developed a gut microbiome, and ordinary laboratory mice treated with a cocktail of two commonly used antibiotics to reduce the complexity of their gut microbiomes.

The researchers infected groups of germ-free and antibiotic-treated mice with chikungunya virus, as well as a group of laboratory mice with normal microbiomes for comparison. The virus multiplied and spread rapidly in the mice that lacked gut microbes, reaching high levels in the blood and in tissues far from the site of infection. Further experiments showed that key immune cells were impaired in the mice without a normal gut microbiome.

Introducing just one bacterial species - a normal member of the human gut microbiome known as Clostridium scindens - rescued the mice's ability to fight the infection. C. scindens is not typically found in mice. But it is common in people, where it modifies a bile acid produced in the liver, generating a compound that affects immune cells. When the researchers gave the modified bile acid alone to mice that lacked normal microbiomes, it improved their immune responses and reduced viral levels in the blood and tissues.

"If having an unhealthy microbiome affects the virus levels in your blood, that raises an interesting question for a blood-borne pathogen: Does the health of your microbiome impact transmission?" said Diamond, who is also a professor of molecular microbiology, and of pathology and immunology. "It stands to reason that if there is more virus in the blood, a mosquito would be more likely to get infected when it takes a blood meal."

To test this idea, Diamond and Winkler infected three groups of mice with chikungunya virus. One group was treated with antibiotics to eliminate their gut bacteria, a second was treated with antibiotics and then given C. scindens to introduce the bacterium into their intestines, and the third group didn't receive antibiotics at all, leaving them with normal gut microbiomes. The researchers drew blood one day after infection and offered the blood to mosquitoes to feed on. More than half of the mosquitoes that sampled the blood of the antibiotic-treated mice became infected, compared with less than a third of the mosquitoes that fed on blood from the mice with normal microbiomes or with only C. scindens.

"There are plenty of people walking around with unhealthy microbiomes and varying levels of conjugated bile acids in their guts," said Diamond. "There may be other bacteria that might be even better than C. scindens at modifying bile acids that could be used to rebalance microbiomes. If a probiotic like that were created, it might be one way to not only minimize disease in individuals, but reduce community spread at the same time."

Credit: 
Washington University School of Medicine

Scientists at USC and other institutions develop new method to improve police lineups

In a discovery with important implications for criminal justice, a team of scientists from USC and other Southern California research institutions has developed a unique way to measure the reliability of an eyewitness trying to pick a culprit from a police lineup.

It's a new forensic approach that attempts to gauge the strength of witness memory while minimizing the influence of unwitting bias, with the goal of understanding, and ultimately preventing, the wrongful conviction of innocent people. The method challenges police lineup techniques that have been entrenched for nearly a century.

"Our new lineup method uncovers the structure of eyewitness memory, removes decision bias from the identification process and quantifies the performance of individual witnesses. This study is a great example of using laboratory science to bring about criminal justice reform," said Sergei Gepshtein, an expert in perceptual psychology and neuroscience at the USC School of Cinematic Arts and a corresponding author of the study.

Gepshtein and collaborators from the University of California and the Salk Institute for Biological Studies in La Jolla, Calif., where he is also a member of the Center for the Neurobiology of Vision, conducted the study, which appears today in Nature Communications.

At its core, the new method accounts for mistakes in witness decision making, scores a witness' responses based on paired comparisons of photos instead of a traditional lineup and assigns a probability value to the testimony. Among the advantages, the technique minimizes biases that confound memory and inserts elements of the scientific method into eyewitness testimony.

"What's at stake is the confidence people have in their criminal justice system. People want the criminal justice system to use the best methods available for prosecuting crime," Gepshtein said.

When police round up suspects to appear in a lineup, the outcome isn't always accurate, as misidentification by eyewitnesses can lead to false arrests and imprisonments. In recent years, more than 350 people -- many serving lengthy prison sentences -- have been exonerated in the United States because their DNA did not match evidence collected from crime scenes. Eyewitness misidentification accounts for 70% of verified erroneous convictions, the study says.

The study also comes at a time of increased racial tensions across the United States, as police practices come under scrutiny, mass protests support the Black Lives Matter movement and some municipalities consider defunding local law enforcement.

In the experiment, the researchers used neuroscience and psychology to parse how the eyewitness's mind captures events and then retrieves impressions from memory -- two distinct brain functions that sometimes can lead to erroneous recollections.

Like in the movies, lineups occur when a suspect and several other people, known as "fillers," are shown to a witness, either by photos or in person. But the process can muddle a correct identification because of ambiguity about two factors that underlie human decision-making: strength of memory and decision criterion.

Decision criterion is the mental yardstick a witness uses to assess the resemblance of a suspect to the memory of the culprit; it's the degree of similarity that the witness deems sufficient for identification. A witness confident in a strong memory may apply a strict criterion, requiring high similarity before identifying someone as the culprit. Another witness with a weak memory may apply a laxer criterion, declaring a suspect to be the culprit even when the resemblance is slight. In either case, the witness could erroneously finger a perpetrator.

The problem is exacerbated by law enforcement's binary yes-or-no imperative that the witness identifies the bad guy, Gepshtein explained.

To extract a more accurate measure of memory, the scientists employed methods from sensory psychophysics. Their experiment involved 202 college students who were shown a movie clip of a single criminal act. Later, the scientists asked the subjects to view photos of possible culprits. Some subjects viewed the photos in sequence, and others saw them in a simultaneous batch, just like in a police lineup. A third group was presented the same photos using the new method: in pairs.

The subjects in the paired condition reported which image in each pair was more similar to the culprit in the film clip. The scientists then tallied the scores, a method called "perceptual scaling."

"It's like when you go to the optometrist and you're asked, 'which image is clearer: this one or that one?' as the optometrist flips the contrasting lenses," Gepshtein said. A similar technique is used in product marketing, preference testing and hearing aid tests.
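The tallying step can be illustrated with a simple win-count over pairwise judgments. The sketch below is an assumption for illustration only: the study's actual perceptual-scaling analysis fits a statistical model to the comparison data, and the `witness` function here is a hypothetical stand-in for a real subject's responses.

```python
from itertools import combinations
from collections import Counter

def rank_by_pairwise_wins(photos, judge):
    """Present every pair of photos and tally which one the witness
    judges more similar to the remembered culprit.

    `judge(a, b)` returns the photo the witness picks from the pair.
    Returns photos sorted by win count (descending), each with a
    normalized score that serves as a relative ranking."""
    wins = Counter({p: 0 for p in photos})
    for a, b in combinations(photos, 2):
        wins[judge(a, b)] += 1
    total = sum(wins.values())  # equals the number of pairs shown
    return [(p, wins[p] / total)
            for p in sorted(wins, key=wins.get, reverse=True)]

# Hypothetical witness whose memory consistently favors photo "C",
# with arbitrary choices between the other photos:
def witness(a, b):
    if "C" in (a, b):
        return "C"
    return min(a, b)

ranking = rank_by_pairwise_wins(["A", "B", "C", "D"], witness)
# "C" ends up ranked first, mirroring how the culprit's photo in the
# study was most consistently ranked correctly.
```

Note that the output is a graded ranking of every photo rather than a single yes-or-no identification, which is the key difference from a traditional lineup.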

In the end, the researchers obtained a relative ranking for every photo -- rather than an absolute identification of a culprit -- and could assign a probability score to each image, thus quantifying the perceived similarity of each face to the perpetrator. Using this method, they found that the culprit's face was most consistently ranked correctly; the photo received more than double the positive identifications that would be expected by chance, the study says. Moreover, the new process led to significantly fewer rejections of the lineup than the traditional methods, which is important because it indicates the bad guy is less likely to go free.

"Using these scaled measures of similarity to estimate recognition memory strength for each face in a lineup removes decision bias from the process of eyewitness identification and replaces the yes-or-no absolute identification by providing a more detailed relative judgment," Gepshtein said.

The new identification method has profound implications for law enforcement investigations. It means that, for the first time, investigators can measure the strength of the memory of an eyewitness. It also helps to create a fairer lineup in which the fillers are similar to the suspect to a known degree, whereas traditional methods could not measure that, Gepshtein explained.

In addition, the method is novel because it leads to a new manner of presenting eyewitness testimony in a court of law: a forensic expert interprets the reliability of an eyewitness, much as other experts weigh the value of evidence such as DNA, ballistics or physical items from a crime scene.

But whether the criminal justice system is ready to change an investigative practice embedded in cop culture and immortalized in pulp fiction and crime thriller cinema is another matter.

"Generally speaking, criminal justice reform is an uphill battle, partly because there are many concepts and methods that are deeply embedded in the culture of those who work in our legal and law enforcement systems," said Thomas Albright, a study co-author, director of the Vision Center Laboratory at the Salk Institute and a member of the National Academy of Sciences. "On the other hand, there are compelling principles of fairness at stake here and science is a largely unmined source of new ideas to mitigate injustice."

Credit: 
University of Southern California