Novel coronavirus circulated undetected months before first COVID-19 cases in Wuhan, China

image: The SARS-CoV-2 virus under a microscope.

Image: 
National Institute of Allergy and Infectious Diseases

Using molecular dating tools and epidemiological simulations, researchers at University of California San Diego School of Medicine, with colleagues at the University of Arizona and Illumina, Inc., estimate that the SARS-CoV-2 virus was likely circulating undetected for at most two months before the first human cases of COVID-19 were described in Wuhan, China in late-December 2019.

Writing in the March 18, 2021 online issue of Science, they also note that their simulations suggest that the mutating virus dies out naturally more than three-quarters of the time without causing an epidemic.

"Our study was designed to answer the question of how long could SARS-CoV-2 have circulated in China before it was discovered," said senior author Joel O. Wertheim, PhD, associate professor in the Division of Infectious Diseases and Global Public Health at UC San Diego School of Medicine.

"To answer this question, we combined three important pieces of information: a detailed understanding of how SARS-CoV-2 spread in Wuhan before the lockdown, the genetic diversity of the virus in China and reports of the earliest cases of COVID-19 in China. By combining these disparate lines of evidence, we were able to put an upper limit of mid-October 2019 for when SARS-CoV-2 started circulating in Hubei province."

Cases of COVID-19 were first reported in late-December 2019 in Wuhan, located in the Hubei province of central China. The virus quickly spread beyond Hubei. Chinese authorities cordoned off the region and implemented mitigation measures nationwide. By April 2020, local transmission of the virus was under control but, by then, COVID-19 was pandemic with more than 100 countries reporting cases.

SARS-CoV-2 is a zoonotic coronavirus, believed to have jumped from an unknown animal host to humans. Numerous efforts have been made to identify when the virus first began spreading among humans, based on investigations of early-diagnosed cases of COVID-19. The first cluster of cases -- and the earliest sequenced SARS-CoV-2 genomes -- were associated with the Huanan Seafood Wholesale Market, but study authors say the market cluster is unlikely to have marked the beginning of the pandemic because the earliest documented COVID-19 cases had no connection to the market.

Regional newspaper reports suggest COVID-19 diagnoses in Hubei date back to at least November 17, 2019, suggesting the virus was already actively circulating when Chinese authorities enacted public health measures.

In the new study, researchers used molecular clock evolutionary analyses to try to home in on when the first, or index, case of SARS-CoV-2 occurred. "Molecular clock" is a term for a technique that uses the mutation rate of genes to deduce when two or more life forms diverged -- in this case, when the common ancestor of all variants of SARS-CoV-2 existed, estimated in this study to as early as mid-November 2019.
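
The arithmetic behind a molecular clock is straightforward to sketch. The snippet below illustrates the idea only; the substitution rate and genome length are typical published values for SARS-CoV-2, and the substitution count is made up, not the study's estimate:

```python
# Illustrative molecular-clock arithmetic (hypothetical numbers, not the
# study's data). A molecular clock converts observed genetic divergence
# into elapsed time using an assumed substitution rate.

GENOME_LENGTH = 29_903   # approximate SARS-CoV-2 genome length in bases
SUBST_RATE = 1.1e-3      # assumed substitutions per site per year

def years_since_common_ancestor(substitutions: float) -> float:
    """Estimate time back to the most recent common ancestor from the
    average number of substitutions separating a genome from it."""
    per_site_divergence = substitutions / GENOME_LENGTH
    return per_site_divergence / SUBST_RATE

# A genome differing from the inferred ancestor by ~3 substitutions
# implies roughly a month of evolution at this assumed rate.
print(round(years_since_common_ancestor(3) * 365, 1))  # ~33 days
```

Real analyses fit the rate and the tree jointly from many dated genomes rather than plugging in a fixed rate, which is why the study reports a credible interval rather than a single date.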

Molecular dating of the most recent common ancestor is often taken to be synonymous with the index case of an emerging disease. However, said co-author Michael Worobey, PhD, professor of ecology and evolutionary biology at University of Arizona: "The index case can conceivably predate the common ancestor -- the actual first case of this outbreak may have occurred days, weeks or even many months before the estimated common ancestor. Determining the length of that 'phylogenetic fuse' was at the heart of our investigation."

Based on this work, the researchers estimate that the median number of persons infected with SARS-CoV-2 in China was less than one until November 4, 2019. Thirteen days later, it was four individuals, rising to just nine by December 1, 2019. The first hospitalizations in Wuhan with a condition later identified as COVID-19 occurred in mid-December.

Study authors used a variety of analytical tools to model how the SARS-CoV-2 virus may have behaved during the initial outbreak and early days of the pandemic when it was largely an unknown entity and the scope of the public health threat not yet fully realized.

These tools included epidemic simulations based on the virus's known biology, such as its transmissibility and other factors, developed by study co-author Niema Moshiri, PhD, assistant teaching professor in the Department of Computer Science and Engineering at UC San Diego. In just 29.7 percent of these simulations was the virus able to create self-sustaining epidemics. In the other 70.3 percent, the virus infected relatively few persons before dying out. The average failed epidemic ended just eight days after the index case.

"Our approach yielded some surprising results. We saw that over two-thirds of the epidemics we attempted to simulate went extinct. That means that if we could go back in time and repeat 2019 one hundred times, two out of three times, COVID-19 would have fizzled out on its own without igniting a pandemic. This finding supports the notion that humans are constantly being bombarded with zoonotic pathogens."

Wertheim noted that even as SARS-CoV-2 was circulating in China in the fall of 2019, the researchers' model suggests it was doing so at low levels until at least December of that year.

"Given that, it's hard to reconcile these low levels of virus in China with claims of infections in Europe and the U.S. at the same time," Wertheim said. "I am quite skeptical of claims of COVID-19 outside China at that time."

The original strain of SARS-CoV-2 became epidemic, the authors write, because it was widely dispersed, which favors persistence, and because it thrived in urban areas where transmission was easier. In simulated epidemics involving less dense rural communities, epidemics went extinct 94.5 to 99.6 percent of the time.

The virus has since mutated multiple times, with a number of variants becoming more transmissible.

"Pandemic surveillance wasn't prepared for a virus like SARS-CoV-2," Wertheim said. "We were looking for the next SARS or MERS, something that killed people at a high rate, but in hindsight, we see how a highly transmissible virus with a modest mortality rate can also lay the world low."

Credit: 
University of California - San Diego

Researchers identify DNA elements that affect MECP2 expression

Researchers at Baylor College of Medicine and the Jan and Dan Duncan Neurological Research Institute at Texas Children's Hospital (NRI) have identified and characterized two regions of DNA required for the proper expression of Mecp2/MECP2 in mice and humans.

These findings, published in Genes & Development, are helping to shed light on the function of these DNA regions and how they could be potential targets for diagnostic and therapeutic interventions for intellectual disabilities such as Rett Syndrome and MECP2 Duplication Syndrome.

Both of these intellectual disabilities are examples of the importance of precise MeCP2 protein levels for proper brain function. A decrease in this protein leads to Rett Syndrome, while an increase in this protein causes MECP2 Duplication Syndrome. Both are severe neurological disorders characterized by learning disabilities, features of autism and motor difficulties.

Dr. Huda Zoghbi, professor at Baylor, director of the NRI, and a Howard Hughes Medical Institute investigator, underscored the importance of understanding how the levels of RNA encoding this protein are regulated. Researchers in her lab identified two DNA regions that, when mutated, lead to either a decrease or an increase in MECP2 RNA and protein levels, resulting in partial behavioral deficits seen in Rett Syndrome and MECP2 Duplication Syndrome, respectively.

Yingyao Shao, first author on the study and graduate student in the Zoghbi lab, explains that when one of the two DNA regions was altered, results showed a moderate reduction in MeCP2 levels, mimicking behavioral and molecular changes observed in Rett mouse models. Alteration of the other DNA region caused a slight increase in MeCP2 protein levels, similar to mouse models with double the expression of MeCP2.

"These findings provide hope that future treatments targeting these DNA regions could have clinically relevant benefits, even when only slightly correcting MeCP2 protein levels," said Shao. "Moreover, mutations in either of these two DNA regions are likely to cause intellectual disability or autism in humans, thus it is important to sequence these regions when screening for genetic causes of neurodevelopmental disorders," added Zoghbi.

Credit: 
Baylor College of Medicine

Powerful stratospheric winds measured on Jupiter for the first time

image: This image shows an artist's impression of winds in Jupiter's stratosphere near the planet's south pole, with the blue lines representing wind speeds. These lines are superimposed on a real image of Jupiter, taken by the JunoCam imager aboard NASA's Juno spacecraft.

Jupiter's famous bands of clouds are located in the lower atmosphere, where winds have previously been measured. But tracking winds right above this atmospheric layer, in the stratosphere, is much harder since no clouds exist there. By analysing the aftermath of a comet collision from the 1990s and using the ALMA telescope, in which ESO is a partner, researchers have been able to reveal incredibly powerful stratospheric winds, with speeds of up to 1450 kilometres an hour, near Jupiter's poles.

Image: 
ESO/L. Calçada & NASA/JPL-Caltech/SwRI/MSSS

Using the Atacama Large Millimeter/submillimeter Array (ALMA), in which the European Southern Observatory (ESO) is a partner, a team of astronomers have directly measured winds in Jupiter's middle atmosphere for the first time. By analysing the aftermath of a comet collision from the 1990s, the researchers have revealed incredibly powerful winds, with speeds of up to 1450 kilometres an hour, near Jupiter's poles. They could represent what the team have described as a "unique meteorological beast in our Solar System".

Jupiter is famous for its distinctive red and white bands: swirling clouds of moving gas that astronomers traditionally use to track winds in Jupiter's lower atmosphere. Astronomers have also seen, near Jupiter's poles, the vivid glows known as aurorae, which appear to be associated with strong winds in the planet's upper atmosphere. But until now, researchers had never been able to directly measure wind patterns in between these two atmospheric layers, in the stratosphere.

Measuring wind speeds in Jupiter's stratosphere using cloud-tracking techniques is impossible because of the absence of clouds in this part of the atmosphere. However, astronomers were provided with an alternative measuring aid in the form of comet Shoemaker-Levy 9, which collided with the gas giant in spectacular fashion in 1994. This impact produced new molecules in Jupiter's stratosphere, where they have been moving with the winds ever since.

A team of astronomers, led by Thibault Cavalié of the Laboratoire d'Astrophysique de Bordeaux in France, have now tracked one of these molecules -- hydrogen cyanide -- to directly measure stratospheric "jets" on Jupiter. Scientists use the word "jets" to refer to narrow bands of wind in the atmosphere, like Earth's jet streams.

"The most spectacular result is the presence of strong jets, with speeds of up to 400 metres per second, which are located under the aurorae near the poles," says Cavalié. These wind speeds, equivalent to about 1450 kilometres an hour, are more than twice the maximum storm speeds reached in Jupiter's Great Red Spot and over three times the wind speed measured on Earth's strongest tornadoes.

"Our detection indicates that these jets could behave like a giant vortex with a diameter of up to four times that of Earth, and some 900 kilometres in height," explains co-author Bilal Benmahi, also of the Laboratoire d'Astrophysique de Bordeaux. "A vortex of this size would be a unique meteorological beast in our Solar System," Cavalié adds.

Astronomers were aware of strong winds near Jupiter's poles, but much higher up in the atmosphere, hundreds of kilometres above the focus area of the new study, which is published today in Astronomy & Astrophysics. Previous studies predicted that these upper-atmosphere winds would decrease in velocity and disappear well before reaching as deep as the stratosphere. "The new ALMA data tell us the contrary," says Cavalié, adding that finding these strong stratospheric winds near Jupiter's poles was a "real surprise".

The team used 42 of ALMA's 66 high-precision antennas, located in the Atacama Desert in northern Chile, to analyse the hydrogen cyanide molecules that have been moving around in Jupiter's stratosphere since the impact of Shoemaker-Levy 9. The ALMA data allowed them to measure the Doppler shift -- tiny changes in the frequency of the radiation emitted by the molecules -- caused by the winds in this region of the planet. "By measuring this shift, we were able to deduce the speed of the winds much like one could deduce the speed of a passing train by the change in the frequency of the train whistle," explains study co-author Vincent Hue, a planetary scientist at the Southwest Research Institute in the US.
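
The train-whistle analogy maps onto a one-line formula. The sketch below uses the rest frequency of an HCN rotational line that ALMA can observe; the specific transition and numbers are illustrative assumptions, not details taken from the study:

```python
C = 299_792_458.0  # speed of light, m/s

def line_of_sight_speed(rest_freq_hz: float, observed_freq_hz: float) -> float:
    """Non-relativistic Doppler relation: v = c * (f_rest - f_obs) / f_rest.
    Positive means the source is receding (line redshifted)."""
    return C * (rest_freq_hz - observed_freq_hz) / rest_freq_hz

# HCN J=4-3 rotational line near 354.505 GHz (an assumption here, chosen
# as a representative HCN line in ALMA's observing bands).
rest = 354.505e9

# A 400 m/s wind along the line of sight shifts the line by under 500 kHz:
shift = rest * 400 / C
print(line_of_sight_speed(rest, rest - shift))  # recovers ~400 m/s
```

The tiny size of the shift relative to the line frequency (about one part in a million here) is why high-precision interferometry like ALMA's is needed for this measurement.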

In addition to the surprising polar winds, the team also used ALMA to confirm the existence of strong stratospheric winds around the planet's equator, by directly measuring their speed, also for the first time. The jets spotted in this part of the planet have average speeds of about 600 kilometres an hour.

The ALMA observations required to track stratospheric winds in both the poles and equator of Jupiter took less than 30 minutes of telescope time. "The high levels of detail we achieved in this short time really demonstrate the power of the ALMA observations," says Thomas Greathouse, a scientist at the Southwest Research Institute in the US and co-author of the study. "It is astounding to me to see the first direct measurement of these winds."

"These ALMA results open a new window for the study of Jupiter's auroral regions, which was really unexpected just a few months back," says Cavalié. "They also set the stage for similar yet more extensive measurements to be made by the JUICE mission and its Submillimetre Wave Instrument," Greathouse adds, referring to the European Space Agency's JUpiter ICy moons Explorer, which is expected to launch into space next year.

Credit: 
ESO

Animal model opens way to test Alzheimer's disease therapies

image: Inflammatory cells (red and green) flood into part of the brain where neurons are affected by misfolded tau protein (orange). This model in rhesus macaque monkeys shows similarities to development of Alzheimer's disease in humans and could open up ways to test new treatments.

Image: 
Danielle Beckman, UC Davis/CNPRC

Our knowledge of Alzheimer's disease has grown rapidly in the past few decades, but it has proven difficult to translate fundamental discoveries about the disease into new treatments. Now researchers at the California National Primate Research Center (CNPRC) at the University of California, Davis, have developed a model of the early stages of Alzheimer's disease in rhesus macaques. The macaque model, published March 18 in the journal Alzheimer's & Dementia: The Journal of the Alzheimer's Association, could allow better testing of new treatments.

The model was developed by Professor John Morrison's laboratory at the CNPRC, in collaboration with Professor Jeffrey Kordower of Rush University Medical Center and Paramita Chakrabarty, assistant professor at the University of Florida.

Alzheimer's disease is thought to be caused by misfolding of the tau and amyloid proteins. Misfolded proteins spread through the brain, leading to inflammation and cell death. Tau protein is commonly found in neurons of the brain and central nervous system, but not elsewhere.

Researchers think that decades may elapse between the silent beginnings of the disease and the first signs of cognitive decline. Understanding what happens over these years could be key to preventing or reversing symptoms of Alzheimer's disease. But it is difficult to study therapeutic strategies without a powerful animal model that resembles the human condition as closely as possible, Morrison said. Much research has focused on transgenic mice that express a human version of amyloid or tau proteins, but these studies have proven difficult to translate into new treatments.

New translational models needed

Humans and monkeys have two forms of the tau protein in their brains, but rodents only have one, said Danielle Beckman, postdoctoral researcher at the CNPRC and first author on the paper.

"We think the macaque is a better model, because it expresses the same versions of tau in the brain as humans do," she said.

Mice also lack certain areas of neocortex such as prefrontal cortex, a region of the human brain that is highly vulnerable to Alzheimer's disease. Prefrontal cortex is present in rhesus macaques and critically important for cognitive functions in both humans and monkeys. There is a critical need for new and better animal models for Alzheimer's disease that can stand between mouse models and human clinical trials, Beckman said.

Chakrabarty and colleagues created versions of the human tau gene with mutations that would cause misfolding, wrapped in a virus particle. These vectors were injected into rhesus macaques, in a brain region called the entorhinal cortex, which is highly vulnerable in Alzheimer's disease.

Within three months, they could see that misfolded tau proteins had spread to other parts of the animal's brains. They found misfolding both of the introduced human mutant tau protein and of the monkey's own tau proteins.

"The pattern of spreading demonstrated unequivocally that tau-based pathology followed the precise connections of the entorhinal cortex and that the seeding of pathological tau could pass from one region to the next through synaptic connections," Morrison said. "This capacity to spread through brain circuits results in the damage to cortical areas responsible for higher level cognition quite distant from the entorhinal cortex," he said.

The same team has previously established spreading of misfolded amyloid proteins in macaques, representing the very early stages of Alzheimer's disease, by injecting short pieces of faulty amyloid. The new tau protein model likely represents a middle stage of the disease, Beckman said.

"We think that this represents a more degenerative phase, but before widespread cell death occurs," she said.

The researchers next plan to test if behavioral changes comparable to human Alzheimer's disease develop in the rhesus macaque model. If so, it could be used to test therapies that prevent misfolding or inflammation.

"We have been working to develop these models for the last four years," Morrison said. "I don't think you could do this without a large collaborative team and the extensive resources of a National Primate Research Center."

Credit: 
University of California - Davis

Four lichen species new to science discovered in Kenyan cloud forests

Researchers from the University of Helsinki's Finnish Museum of Natural History Luomus and the National Museums of Kenya have discovered four lichen species new to science in the rainforests of the Taita Hills in southeast Kenya.

Micarea pumila, M. stellaris, M. taitensis and M. versicolor are small lichens that grow on the bark of trees and on decaying wood. The species were described based on morphological features and DNA characters.

"Species that belong to the Micarea genus are known all over the world, including Finland. However, the Micarea species recently described from the Taita Hills have not been seen anywhere else. They are not known even in the relatively close islands of Madagascar or Réunion, where species of the genus have been previously studied," Postdoctoral Researcher Annina Kantelinen from the Finnish Museum of Natural History says.

"The Taita Hills cloud forests are quite an isolated ecosystem, and at least some of the species now discovered may be native to the area or to eastern Africa. Our preliminary findings also suggest that there are more unknown Micarea lichen species there."

The Taita Hills are part of the Eastern Arc Mountains that range from south-eastern Kenya to eastern Tanzania. The mountains rise abruptly from the surrounding plain, with the tallest peak reaching over two kilometers. Lush indigenous rainforests are mainly found on the mountaintops, capturing precipitation from clouds and mist developed by the relatively cool air rising from the Indian Ocean.

Thanks to ecological isolation and a favourable climate, the area is one of the global hotspots of biodiversity. However, the native cloud forests in the region are shrinking year by year as they are replaced by plantations of exotic tree species that are not native to Africa. Compared to 1955, the area of indigenous forest has diminished to less than half.

"Planted forests have been found to bind less moisture and be more susceptible to forest fires. Therefore, they can make the local ecosystem drier and result in species becoming endangered. Some lichen species are capable of utilising cultivated forests at least temporarily, but indigenous forests have the greatest biodiversity and biomass," Kantelinen says.

The University of Helsinki maintains the Taita Research Station in the area; the station is celebrating its tenth anniversary this year.

Credit: 
University of Helsinki

How to get customers to talk about you

Researchers from Arizona State University, New York University, and Northwestern University published a new paper in the Journal of Marketing that examines how marketers can fuel positive WOM without using explicit incentives.

The study, titled "How Marketing Perks Influence Word of Mouth," is authored by Monika Lisjak, Andrea Bonezzi, and Derek Rucker.

Word-of-mouth (WOM) is arguably the most influential means of persuasion and can be a critical driver of a company's growth. For this reason, many companies offer consumers incentives to encourage them to generate WOM. Classic examples of this practice are referral and seeding programs, whereby a company literally "pays" current customers to generate positive WOM and attract new customers. Despite its intuitive appeal, however, this practice can backfire. Ironically, incentivizing WOM sometimes can hamper, rather than increase, consumers' willingness to engage in WOM.

This research shows that commonly used marketing perks--e.g., gifts, benefits, and rewards--can effectively foster WOM without being used as explicit incentives. Their effectiveness at boosting WOM, however, depends on how they are framed and therefore perceived by consumers: Marketing perks are more effective at fostering WOM the less they are perceived to be given out of contractual obligation. The term "contractuality" refers to the degree to which a perk is perceived to be given to consumers in exchange for engaging in specific behaviors dictated by a company, such as filling out a survey or making a certain number of purchases.

Lisjak explains that "We demonstrate that marketers can influence the perceived contractuality of a perk with easily implementable pivots. Consumers can perceive the exact same perk, say a free coffee, as more or less contractual simply based on how it is framed." As one example, the perceived contractuality of a perk can be lowered by giving consumers a free item after a set number of purchases, but not making the number of purchases salient to the consumer. As another example, the same perk could be accompanied by a thank you note, as opposed to a note that highlights all the effort a customer had to put in to earn the perk. In both instances, companies do not have to change the offering, only how consumers perceive it.

Interestingly, however, perks lower in contractuality can sometimes backfire against companies. This is more likely to occur when a perk characterized by low contractuality comes from a disliked or distrusted company. Under such circumstances, consumers become wary of the company's intentions and then interpret the perk as a manipulative act of persuasion driven by ulterior motives. When this happens, perks lower in contractuality in fact hinder rather than fuel WOM. To illustrate, many consumers do not like utility providers or financial institutions. To the extent that such dislike prompts consumers to make hostile attributions of benevolent gestures, such companies might be better off using perks that are higher in contractuality.

Finally, contractuality can entail a trade-off. Despite being more effective at fostering WOM, low contractuality perks might be less effective than high contractuality perks at inducing compliance with a direct request. For example, if a company wants consumers to complete a customer satisfaction survey, offering a high contractuality perk can be more effective and efficient than offering a low contractuality perk. Simply put, when brands have a specific action other than WOM that they would like consumers to take, perks higher in contractuality might serve as better incentives because they make behavior-reward contingencies clear and salient.

Bonezzi summarizes the study by saying "Our findings suggest that marketers could nudge consumers to generate positive WOM by providing them with perks that have fewer strings attached. Of note, this could be achieved at a similar cost to perks that come across as highly contractual."

Credit: 
American Marketing Association

UTSA researcher studies key predictors for college retention

(MARCH 17, 2021) - The COVID-19 outbreak has raised many questions about the role of standardized testing in the college admissions process. Among the many coronavirus-driven changes, a growing number of universities waived SAT and ACT scores as an admissions requirement for 2022 applicants.

Schools are shifting to "test-optional" policies and may permanently phase out testing scores in the future, as some college experts argue that standardized tests create barriers that could reduce students' likelihood of acceptance.

A new study led by senior research scientist Paul Westrick of the College Board, along with UTSA professor of management Huy Le, Steve Robbins of the Educational Testing Service, Justin Radunzel of ACT, Inc., and Frank Schmidt, professor emeritus of management and entrepreneurship at the University of Iowa, shows that the most effective way to predict students' academic success and retention in their first year of college is to combine test scores and high school grades.

Based on a nationally representative sample of 189,612 students at 50 institutions, the researchers found that both ACT scores and high school grade point average correlate with first-year students' academic performance, and their analysis makes important contributions to understanding academic persistence.

The study examines the strength of the correlations of ACT Composite scores, high school grades, and socioeconomic status with second- and third-year academic performance.

"We believe this research provides important evidence supporting the usefulness of high school GPA and standardized tests as predictors of student successes in colleges and refuting the misconception that the tests are just a proxy for students' socio-economic status," said Le.

The researchers provide insight into the factors that best predict student success and persistence in college, and into the effectiveness and reliability of continuing to use standardized tests as part of that assessment.

"We are hopeful the findings of the study will help colleges in determining factors to be included for making admission decisions," Le added.

The researchers concluded that grades measure not only academic characteristics but nonacademic characteristics as well, including attendance and participation. The findings demonstrate that test scores measure cognitive characteristics, while grades measure a combination of characteristics.

The research adds to the many studies that have demonstrated the usefulness and validity of standardized testing in the admissions process.

Credit: 
University of Texas at San Antonio

A new study by Novateur Ventures provides global analysis of COVID-19 vaccines

image: The ranking chart compares how well the vaccines match the harmonized TPP, using a scoring system of 1-5 and a weighting parameter for each category to provide a value out of 100.

Image: 
Novateur Ventures

A new study by Novateur Ventures provides a comparative analysis of twelve COVID-19 vaccines that had initiated or announced Phase III clinical trials by early November 2020. The study highlights the early successes, as well as the hurdles and barriers yet to be overcome in ending the global COVID-19 pandemic.

COVID-19 vaccines analyzed for the study

Messenger RNA (mRNA) Vaccines - Moderna and Pfizer/BioNTech

Viral Vector-based (non-replicating) Vaccines - AstraZeneca/University of Oxford, CanSino Biologics, Gamaleya Research Institute, Johnson & Johnson/Janssen (J&J)

Recombinant Protein-based Vaccines - Novavax and Medicago

Inactivated Virus - Three Chinese conglomerates and one Indian company

The study 'Target Product Profile Analysis of COVID-19 Vaccines in Phase III Clinical Trials and Beyond: An Early 2021 Perspective' appears in the Special Issue "Vaccines and Therapeutics against Coronaviruses" of the journal Viruses, an international peer-reviewed, open-access journal published monthly by the Multidisciplinary Publishing Institute (MDPI).

"The global concerted effort to develop vaccines to fight COVID-19 and deliver it to millions of citizens around the world in less than a year is an unprecedented feat in the history of medicine and a triumph for vaccine research and development said study co-author Ali Ardakani, Founder & Managing Director at Novateur Ventures. "Vaccination-mediated herd immunity will play a key role in helping us in returning to a world unhampered by restrictions and to global prosperity."

COVID-19 Vaccines belonging to four different platforms were analyzed in five different categories using a "harmonized" target product profile (TPP) version of guidance from the World Health Organization, Coalition for Epidemic Preparedness Innovations (CEPI) and Center for Biologics Evaluation and Research (CBER). Key analysis from the study -

Vaccine Efficacy - mRNA vaccines were a clear winner with efficacy in the 95% range and across a spectrum of ages, followed by the protein subunit platform with an efficacy of just under 90% in the UK. The inactivated virus platform ranks lowest based on currently available variable data.

Dosing regimen - All but two (CanSino and Johnson & Johnson/Janssen) of the 12 vaccines in the various platforms analyzed in this study use a two-dose regimen.

Logistics - mRNA vaccines rank lowest with their burdensome cold-chain requirements.

Safety/reactogenicity - The inactivated virus platform was the top performer. The viral vector platform scores below the other three platforms due to some lingering concerns related to paused trials and adverse events.

Target price/accessibility - The production of mRNA vaccines can be scaled up at a reasonable pace, but they are currently among the most expensive COVID-19 vaccines; the viral vector vaccines are the cheapest to prepare. The inactivated virus vaccines are relatively easy to produce and are cheap if one considers Bharat Biotech's COVAXIN. However, there are some indications that the pricing of the vaccines made in China is very high.
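
The scoring scheme described in the ranking chart's caption (each category rated 1-5, weighted to a total out of 100) can be sketched as follows; the category weights and scores below are hypothetical, not the study's numbers:

```python
def tpp_score(scores: dict, weights: dict) -> float:
    """Weighted TPP score: each category is rated 1-5, weights sum to 1,
    and the result is rescaled so a perfect 5 everywhere gives 100."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    raw = sum(scores[c] * weights[c] for c in weights)  # lies in [1, 5]
    return raw / 5 * 100

# Hypothetical weights and ratings for one vaccine platform --
# illustrative only, not taken from the study.
weights = {"efficacy": 0.35, "dosing": 0.10, "logistics": 0.15,
           "safety": 0.25, "price": 0.15}
scores = {"efficacy": 5, "dosing": 4, "logistics": 2,
          "safety": 4, "price": 3}
print(tpp_score(scores, weights))
```

A weighting scheme like this makes the ranking sensitive to which categories are judged most important, which is why the harmonized TPP guidance matters as much as the raw category scores.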

"Ten out of twelve vaccines we analyzed have already received some form of authorization for use in different countries in a period of less than a year. This is a remarkable achievement," said study co-author Colin D. Funk, Scientific Lead at Novateur Ventures. "We hope that SARS-CoV-2 viral variants, emerging at an alarming rate in various countries, will not derail the successful vaccine efforts to date."

The study also identified three main barriers to ending the global COVID-19 pandemic:

While we know that antibody levels induced by natural infection with SARS-CoV-2 last several months, we do not know if there will be a requirement for repeat vaccine dosing on an annual (or other timeframe) basis. If repeat booster doses are required, especially related to the viral vector platform, will antibodies be directed to the vector and will this diminish vaccine efficacy?

Determining an immunological correlate of protection against SARS-CoV-2 is an important objective that still has not been achieved and will be crucial in facilitating future COVID-19 vaccine development and licensing.

Public perception and compliance in vaccine administration are also very large hurdles to surmount in order to achieve herd immunity in some countries/populations. While not discussed in this review, this is a key point that cannot be overlooked.

"The process of developing a vaccine from scratch normally takes place over several years but it's truly amazing that we already have approved vaccines to fight the spread of COVID-19. This speaks volumes about the efforts various countries have put into pandemic preparedness and response," said co-author Craig Laferrière, Head Vaccine Development at Novateur Ventures. This was made possible through concurrent pre-clinical and early Phase I studies, strategic risk measures and adaptive trial designs."

Credit: 
PR Associates

Muscle cramp? Drink electrolytes, not water

image: Drinking electrolytes helps prevent muscle cramp

Image: 
Photo by Nigel Msipa on Unsplash

If you reach for water when a muscle cramp strikes, you might want to think again. New research from Edith Cowan University (ECU) has revealed drinking electrolytes instead of pure water can help prevent muscle cramps.

The study, published in the Journal of the International Society of Sports Nutrition, found that people who drank electrolyte enhanced water during and after exercise were less susceptible to muscle cramps than those who drank pure water.

Muscle cramps are a common painful condition affecting many people, including around 39 per cent of marathon runners, 52 per cent of rugby players and 60 per cent of cyclists.

Dilution solution

Lead researcher Professor Ken Nosaka, from ECU's School of Medical and Health Sciences, said the study builds on the evidence that a lack of electrolytes contributes to muscle cramps, not dehydration.

"Many people think dehydration causes muscle cramps and will drink pure water while exercising to prevent cramping," he said.

"We found that people who solely drink plain water before and after exercise could in fact be making them more prone to cramps.

"This is likely because pure water dilutes the electrolyte concentration in our bodies and doesn't replace what is lost during sweating."

When cramp strikes

Professor Nosaka began researching the causes of muscle cramps after regularly suffering from them while playing tennis.

The study involved 10 men who ran on a downhill treadmill in a hot (35ºC) room for 40 to 60 minutes, losing 1.5 to 2 per cent of their body weight through sweat, under two conditions.

They drank plain water during and after exercise in one condition, and an electrolyte-containing water solution in the other.

The participants were then given electrical stimulation on their calves to induce muscle cramp. The lower the frequency of electrical stimulation required, the more prone the participant is to muscle cramp.

"We found that the electrical frequency required to induce cramp increased when people drank the electrolyte water, but decreased when they consumed plain water," said Professor Nosaka.

"This indicates that muscles become more prone to cramp by drinking plain water, but more immune to muscle cramp by drinking the electrolyte water."

Not all water is equal

Electrolytes are minerals including sodium, potassium, magnesium and chloride. They are essential for muscle health and help the body to absorb water.

Oral rehydration solutions contain electrolytes in specific proportions and can be made with water, salt and sugar. They are commonly found in supermarkets and pharmacies.

Professor Nosaka said electrolytes have many benefits for both athletes and the general population.

"Electrolytes are vital to good health - they help the body to absorb water more effectively than plain water and replace essential minerals lost through sweat or illness," he said.

"People should consider drinking oral rehydration fluids instead of plain water during moderate to intense exercise, when it's very hot or when you are sick from diarrhoea or vomiting."

Professor Nosaka is planning further research to find out the optimal amount of electrolytes to prevent muscle cramps as well as how they could help the elderly and pregnant women.

Credit: 
Edith Cowan University

Targeting a new antibody supersite key to COVID immunity

image: Potent antibodies (shown in red, purple, white and turquoise) attach to an antibody supersite in a lesser-studied region of the pandemic coronavirus called the N-terminal domain.

Image: 
Vir Biotechnology and David Veesler Lab at UW Medicine

Scientists are learning that a lesser-studied region on the pandemic coronavirus is recognized by COVID-19 infection-fighting antibodies. These antibodies were identified in blood samples from previously infected patients, and were found to potently prevent the virus from infecting cells.

The coronavirus spike protein is the key that unlocks the door to the cell, and antibodies bind to the spike protein to jam this function. Much attention has been given to studying antibodies that target the receptor-binding domain on the coronavirus spike protein. (The receptor-binding domain of the spike is responsible for triggering the merging of the virus with a host cell to achieve a takeover.)

However, some of the recovered patients' antibodies blocked the coronavirus by binding to a different place on the virus spike -- the N-terminal domain. These antibodies were as potent as those that bind to the receptor-binding domain, a recent study shows.

Mapping where these antibodies bound, using electron cryo-microscopy (cryo-EM), showed that all the infection-preventing antibodies bind a single site on the N-terminal domain. The research, published in Cell, demonstrated that these antibodies protected Syrian hamsters from SARS-CoV-2, the coronavirus that causes COVID-19 in people.

Additional recent findings indicate that the virus is gradually evading the antibodies that people are acquiring. The virus is adapting by accumulating mutations that help it escape these defenses, giving rise to so-called variants of concern.

Some of these variants, such as those first detected in the United Kingdom and South Africa, contain mutations that appear to make the virus less vulnerable to the neutralizing power of the N-terminal domain antibodies.

"Several SARS-CoV-2 variants harbor mutations within their N-terminal domain supersite," the researchers noted. "This suggests ongoing selective pressure."

They added that investigating these neutralization escape mechanisms is revealing some unconventional ways the N-terminal domain on the virus is acquiring antibody resistance, and are why N-terminal domain variants warrant closer monitoring.

The senior authors on the Cell paper are David Veesler, associate professor of biochemistry at the University of Washington School of Medicine in Seattle, as well as Matteo Samuele Pizzuto and Davide Corti of Humabs Biomed SA, a subsidiary of Vir Biotechnology. The lead authors are Matthew McCallum of the UW medical school's Department of Biochemistry, and Anna De Marco of Humabs Biomed.

The N-terminal domain antibodies in this study were derived from memory B cells, which are white blood cells that can persistently recognize a previously encountered pathogen and re-launch an immune response.

N-terminal domain-specific antibodies likely act in concert with other antibodies to wage a multi-pronged uprising against the coronavirus. The N-terminal domain antibodies appear to inhibit virus-cell fusion. In conjunction, another part of the antibody, called a constant fragment, might also activate some of the body's other approaches to eliminating the virus.

"This study shows that NTD-directed antibodies play an important role in the immune response to SARS-CoV-2 and they appear to contribute a key selective pressure for viral evolution and the emergence of variants," said Veesler

Continuing research on the N-terminal domain neutralizing antibodies may lead to improved therapeutic and preventive anti-viral drugs for COVID-19, as well as inform the design of new vaccines or the evaluation of current ones.

For example, patients who have recovered from COVID-19 and later received a first dose of an mRNA vaccine might experience a boost in their N-terminal domain neutralizing antibodies. A cocktail of antibodies targeting different critical domains on the coronavirus might also be a promising approach for medical scientists to examine, to see if it provides broad protection against variant strains.

The researchers stressed that, although current vaccines "are being deployed at an unprecedented pace, the timeline for large-scale manufacturing and distribution to a large enough population for community immunity still remains uncertain."

Antiviral drugs, they explain, are expected to play a role in controlling disease during the ongoing pandemic. They are likely to be particularly helpful, according to the researchers, for unvaccinated individuals and for those who didn't get a strong enough immune response from their vaccinations.

Antivirals could also prove vital when immunity from previous infection or from vaccination wanes, or as mutant strains that break through the shield of vaccination emerge.

Credit: 
University of Washington School of Medicine/UW Medicine

The side stream of malting could be better used in human nutrition

image: Side-stream products of malting could be better used for human nutrition. Especially the rootlet is rich in phytochemicals.

Image: 
MostPhotos/Jochen

Malting, the processing of cereal grains into malt, generates rootlets as a side-stream product, which is currently mostly utilised as animal feed. However, this leftover material has not only a high protein content, but also high amounts of phytochemicals, which makes it a highly promising resource for the food industry, according to a recent study from the University of Eastern Finland, published in npj Science of Food.

Germination increased the amount of phytochemicals

The study utilised metabolomics to analyse samples from grains of four cereals typically used in malting: barley, rye, wheat, and oats. The researchers were particularly interested in phytochemicals, which are bioactive compounds produced by plants. Many phytochemicals have been shown to work as antioxidants and to have other potential beneficial health effects. Samples were taken during several steps of the malting process, such as from malted grains, wort and water extract, as well as from rootlets and spent grain, the side-stream products of the processing. The main finding of the study was the observation of 285 different phytochemicals, most of which were found in the side-stream products, especially the rootlet. The germination occurring in the beginning of malting increased the levels of several phytochemicals compared to the unprocessed grains, and also produced compounds that were not abundantly present in the original grains.

Global food production is currently not sustainable, which is why it is important to find new ways to produce nutritious food with a reduced impact on the climate and the environment. It has been estimated that the protein contained in the side-stream products of malting could fulfil the annual protein needs of five million people. Considering that these products are also abundant in phytochemicals, they may be nutritionally too valuable to be wasted as feed for production animals when they could be used directly in human nutrition. Therefore, there is a need for further research into the use of the side-stream products of malting as food supplements.

"The biggest challenge in utilising the side-stream products of malting in foods is their taste and other sensory properties, which may not be acceptable as such. They would likely require further food processing, such as fermentation, so that they could be used to increase the nutritional value of white bread, for example," says Postdoctoral Researcher and the lead author of the article Ville Koistinen from the University of Eastern Finland.

"The health effects of phytochemicals are a hot topic among scientists because their mechanisms are largely unknown. However, we do have increasing evidence that the protective effect of plant-based foods against several diseases is at least partly attributable to the phytochemicals contained within them, and that these effects can only be obtained by eating plants, not from pills or supplements containing individual compounds."

Credit: 
University of Eastern Finland

Cellular Chinese whispers

image: The group studies how translation errors impact phenotypic variability

Image: 
Nivedita Mukherjee

The immense diversity in the living world and how it came into being has always been a subject of human enquiry. After centuries of playing detective in search of the basis of the parities and disparities that we see among living beings around us, the past century stood witness to some marvellous discoveries in biology, and today the Central Dogma of life has been disclosed to us: DNA makes RNA and RNA makes protein (a facile view of a much more complex sequence of events). Together with contributing environmental factors, the proteome (the total protein content of a cell) collectively influences the 'traits' or characteristics of organisms that vary among individuals of a population.

In a population, individuals with traits better suited to their environment have a higher chance at survival and reproduction than their competitors, and hence percolate through the sieve of natural selection and end up transmitting these 'adaptive' traits to the next generation. Changes in the number of individuals carrying each trait, be it due to natural selection or simple chance (genetic drift), add up over generations, and this is how populations evolve over time.

We might like to think that this simplistic view of variability and evolution is the whole story, but the roads linking genotype to phenotype, and hence to evolution, are hardly this straightforward. By the logic of the Central Dogma, individuals with identical genotypes residing in identical environments should have identical phenotypes. But is that always the case? Think about twins, for example. Identical twins are born from the splitting apart of an embryo inside the womb. This means that all the cells in both of their bodies originate from a single zygote (fertilized egg cell) and hence have the same genetic repertoire. If you look close enough, however, you can find subtle differences in appearance by which you can tell apart identical twins reared even in the same environment. Whatever the source of these differences, it's definitely not in the genes. So, where do such differences come from, and do they influence survival and adaptation?

Phenotypic variability in populations with identical genetic makeup can be attributed to non-genetic sources which include both cell-extrinsic (environmental) and intrinsic mechanisms. One such cell-intrinsic non-genetic source involves stochastic errors in gene expression. Much like a game of Chinese whispers, the cell makes errors when copying information from DNA to RNA and from RNA to protein, such that the final protein sequence does not always exactly represent the original gene sequence it has been derived from. A large chunk of the error in the cellular game of Chinese whispers comes from the last step in the gene expression cascade, which is the process of translating RNA into a protein, owing to its exceptionally high error rates (~1 in 10^4). Theoretically, it seems obvious to assume that translation errors will result in proteome heterogeneity, generating a wide range of phenotypic variability in the population that will allow individuals to respond differently to identical environmental requirements and hence help the population better adapt to it. But there are a number of catches in this assumption! Firstly, the cell has many strategies to safeguard itself against protein mistranslation and thus errors in translation might not always lead to phenotypic variability. Secondly, protein production errors being random and unpredictable, the resulting variability is most likely to have maladaptive consequences for a population already optimized to a certain environment. Thirdly, proteome level variability is not heritable and hence might not even persist over generations to have implications on an evolutionary timescale. So, is our obvious assumption actually incorrect?

To give some empirical ground to these conjectures, researchers Laasya Samhita and Parth Raval from Dr. Deepa Agashe's lab at NCBS turned to our good ol' friend, the gut bacterium E. coli! They altered global mistranslation rates (protein translation error rates) in the bacteria through genetic and environmental manipulations and assessed how this impacts population-level parameters like growth rate, lag time and growth yield. To measure phenotypic variability at the single-cell level, they teamed up with researcher Godwin Stephenson from Dr Shashi Thutupalli's lab at NCBS. Godwin pored over individual E. coli cells trapped inside the channels of a microfluidic device to investigate how manipulating mistranslation rates affects single-cell parameters like cell length (indicative of the physiological state of the cell) and division time (indicative of the reproductive rate of the bacterium). The results were interesting! E. coli modified to have higher mistranslation rates showed higher variability in cell length and division time, while the reverse was observed when mistranslation rates were reduced. Mysteriously, however, similar correlations between mistranslation levels and variability were not consistently found for population-level growth parameters. These results validate the prediction that higher mistranslation can result in higher phenotypic variability, addressing the first catch in our assumption. However, they open up another question: why does the correlation between mistranslation and variability seen for single cells not hold at the level of the population? Maybe variability at the single-cell level is predictable and uniform across populations, such that it evens out and does not show up as variation between populations. Or perhaps increased cell-to-cell variability generates more cells with sub-optimal phenotypes, which end up being eliminated from the population by selection and hence cannot contribute to parameters like population growth rate. There can be different possibilities, but we can't yet say which one is correct.

Now that we have some idea of how mistranslation affects variability, let's head on to the second catch of our assumption and see whether mistranslation-induced variability is adaptive or maladaptive for the population. Laasya and Parth found that both an increase and a decrease in mistranslation-induced variability turn out to be disadvantageous for the bacteria under optimal environmental conditions. To puzzle out what this observation implies, imagine cells walking a tightrope as they try to balance the accuracy and speed of protein translation. Just as too much mistranslation is likely to lead to gravely malformed proteins that fail to do their job, being super accurate entails very slow and calibrated steps in protein production that may lengthen cell division time and hence slow down population growth. So, a tilt in either direction can make the cell fall off the rope. Surprisingly, however, mistranslating cells were often found to survive better when faced with stressful situations such as high temperature or starvation. This does make sense, because the higher the mistranslation, the higher the variability, and the higher the chance of some individuals of the population being better suited to stressful environmental conditions. To be noted, this is just a hypothesis. Thus, though higher variability is seen to be linked with higher survival under stress, it is not known whether the relation between the two is one of direct cause and effect, as seems intuitive, or whether indirect pathways linking them are at play. What's more, just a brief initial pulse of altered mistranslation rates was sufficient to elicit better stress survival across generations; and there goes the third catch of our assumption, which questioned whether the effects of mistranslation can be carried forward through generations! This last observation is strange, as variability arising from alterations solely in the proteome is not supposed to be heritable. The reasons behind this observation can be manifold, but coming to any definite conclusion will require further experiments. So, as of now, this question is wide open for investigation.

The study under focus is one of the few attempts made to connect errors in cellular processes with variability and evolution. "The discovery that translation errors can increase phenotypic variability in fitness-linked traits is exciting and of potential relevance for evolution. Future work should tell us more about the significance of this observation for natural bacterial populations," says Laasya Samhita, lead author of the paper that resulted from this study. Thus, while the study addresses some key questions in evolutionary biology, it also ends up uncovering some new ones. Why does mistranslation-induced cell-to-cell variability not show up at the population-to-population level? How do cells with higher mistranslation rates survive better under stressful conditions? How does proteome heterogeneity persist over generations of cell division? There's a treasure chest of answers, and perhaps even more questions, waiting to be unearthed. We are bound to stumble upon many such questions and 'obvious' assumptions as we keep playing detective in the quest to decode nature. But the important thing to remember while we do that is a maxim by none other than our favourite consulting detective, Sherlock Holmes: "There is nothing more deceptive than an obvious fact."

Credit: 
National Centre for Biological Sciences

The Lancet: Study finds COVID-19 reinfections are rare, more common for those above age 65

Prior infection with COVID-19 protects most people against reinfection, with 0.65% of patients returning a positive PCR test twice during Denmark's first and second waves, compared with 3.27% of people who tested positive after initially being negative.

People over the age of 65 are at greater risk of catching COVID-19 again, with only 47% protection against repeat infection compared with 80% for younger people.

Protection against reinfection remained stable for more than six months.

The findings underline that measures to protect the elderly - including social distancing and vaccinations - are essential even if people have already been diagnosed with COVID-19.

The analysis focused on the original COVID-19 strain and made no assessment of variants.

Most people who have had COVID-19 are protected from catching it again for at least six months, but elderly patients are more prone to reinfection, according to research published in The Lancet.

Large-scale assessment of reinfection rates in Denmark in 2020 confirms that only a small proportion of people (0.65%) returned a positive PCR test twice. However, while prior infection gave those under the age of 65 years around 80% protection against reinfection, for people aged 65 and older it conferred only 47% protection, indicating that they are more likely to catch COVID-19 again.

The authors of the first large-scale study of its kind detected no evidence that protection against reinfection declined within a six-month follow-up period.

Their findings highlight the importance of measures to protect elderly people during the pandemic, such as enhanced social distancing and prioritisation for vaccines, even for those who have recovered from COVID-19. The analysis also suggests that people who have had the virus should still be vaccinated, as natural protection - particularly among the elderly - cannot be relied upon.

As of January 2021, COVID-19 had resulted in more than 100 million cases and over 2 million deaths worldwide. Recent studies have suggested that reinfections are rare and that immunity can last at least six months; however, the degree to which catching COVID-19 confers protection against repeat infection remains poorly understood.

Dr Steen Ethelberg, from the Statens Serum Institut, Denmark, said: "Our study confirms what a number of others appeared to suggest: reinfection with COVID-19 is rare in younger, healthy people, but the elderly are at greater risk of catching it again. Since older people are also more likely to experience severe disease symptoms, and sadly die, our findings make clear how important it is to implement policies to protect the elderly during the pandemic. Given what is at stake, the results emphasise how important it is that people adhere to measures implemented to keep themselves and others safe, even if they have already had COVID-19. Our insights could also inform policies focused on wider vaccination strategies and the easing of lockdown restrictions." [1]

The authors of the new study analysed data collected as part of Denmark's national COVID-19 testing strategy, through which more than two-thirds of the population (69%, 4 million people) were tested in 2020. Free, national PCR testing - open to anyone, regardless of symptoms - is one of the central pillars of Denmark's strategy to control COVID-19, an approach that sets it apart from most other countries.

Researchers used this data, spanning the country's first and second waves, to estimate protection against repeat infection with the original COVID-19 strain. Ratios of positive and negative test results were calculated taking account of differences in age, sex, and time since infection, and these were used to produce estimates of protection against reinfection.

Importantly, the authors note that the timeframe of their study meant it was not possible to estimate protection against reinfection with COVID-19 variants, some of which are known to be more transmissible. Further studies are needed to assess how protection against repeat infection might vary with different COVID-19 strains.

Among people who had COVID-19 during the first wave between March and May 2020, only 0.65% (72/11,068) tested positive again during the second wave from September to December 2020. At 3.3% (16,819/514,271), the rate of infection was five times higher among people who returned a positive test during the second wave having previously tested negative.

Of those under the age of 65 who had COVID-19 during the first wave, 0.60% (55/9,137) tested positive again during the second wave. The rate of infection during the second wave among people in this age group who had previously tested negative was 3.60% (14,953/420,909). Older people were found to be at greater risk of reinfection, with 0.88% (17/1,931) of those aged 65 or older who were infected during the first wave testing positive again in the second wave. Among people 65 or older who had previously not had COVID-19, 2.0% (1,866/93,362) tested positive during the second wave.

Similar results were obtained in an alternative cohort analysis, in which test data from almost 2.5 million people were assessed to determine reinfection rates throughout the epidemic, not just during the second wave. Only 0.48% (138/28,875) of people who had previously tested positive for COVID-19 caught it again at least three months later, compared with 2.2% (53,991/2,405,683) for those who initially tested negative. Estimated protection against reinfection was 78.8%. Protection against repeat infections varied little among people under the age of 65 years, with authors estimating 80.5% protection for this group. However, protection against reinfection was much lower among people over the age of 65 years, with estimated protection of just 47%.
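The protection figures reported above can be approximately reproduced from the raw counts given in the text. The sketch below assumes the crude rate-ratio definition of protection (one minus the ratio of the reinfection rate to the first-infection rate); the study's published estimates were additionally adjusted for age, sex and time since infection, so they differ slightly from these unadjusted values.

```python
def protection(reinfected, prev_positive, infected, prev_negative):
    """Crude (unadjusted) protection against reinfection:
    1 - (reinfection rate among previously positive people /
         infection rate among previously negative people)."""
    rate_reinfection = reinfected / prev_positive
    rate_first_infection = infected / prev_negative
    return 1 - rate_reinfection / rate_first_infection

# Wave-based analysis: 72 of 11,068 previously positive people tested
# positive again, vs 16,819 of 514,271 previously negative people.
wave = protection(72, 11_068, 16_819, 514_271)

# Alternative cohort analysis: 138 of 28,875 vs 53,991 of 2,405,683.
cohort = protection(138, 28_875, 53_991, 2_405_683)

print(f"wave-based: {wave:.1%}")   # roughly the ~80% reported
print(f"cohort:     {cohort:.1%}")  # close to the adjusted 78.8% reported
```

The crude wave-based estimate comes out near 80%, matching the headline figure; the cohort estimate lands just under the adjusted 78.8%, illustrating that the published numbers reflect modest covariate adjustment on top of this simple ratio.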

Because healthcare workers are at high risk of exposure to the virus, a sub-analysis of this group was also carried out. Again, results were similar to those of the main analysis, with 1.2% (8/658) of those who had COVID-19 during the first wave becoming re-infected, compared with 6.2% (934/14,946) of those who were negative during the first wave. Estimated protection against reinfection was 81.1%.

Further analysis exploring two and four months' separation between pandemic waves - increasing the time between patients' first and second tests to limit the chances of misclassifying reinfections - also produced similar results (76.7% and 82.8% protection from reinfection, respectively).

In line with findings from other studies, the authors identified no evidence that protection against repeat infection with COVID-19 waned within six months. Because COVID-19 was only identified in December 2019, the period of protective immunity conferred by infection has still to be determined.

Dr Daniela Michlmayr, from the Statens Serum Institut, Denmark, said: "In our study, we did not identify anything to indicate that protection against reinfection declines within six months of having COVID-19. The closely related coronaviruses SARS and MERS have both been shown to confer immune protection against reinfection lasting up to three years, but ongoing analysis of COVID-19 is needed to understand its long-term effects on patients' chances of becoming infected again." [1]

The authors acknowledge some limitations to their study. Detailed clinical information is recorded only if patients are admitted to hospital, so it was not possible to assess whether the severity of COVID-19 symptoms affects patients' protection against reinfection. Misclassification of reinfections may have happened if viral RNA lingered for more than three months in some patients, although the authors sought to account for this by assessing two- and four-month gaps between COVID-19 waves. Errors in testing may also have occurred, however, the PCR tests used are believed to be highly accurate, and the authors would expect only around two false positives for every 10,000 tests in uninfected people and around three false negatives for every 100 tests in people with the infection.

Writing in a linked comment, Professors Rosemary J Boyton and Daniel M Altmann, from Imperial College London, UK, said: "Set against the more formal reinfection case reports that are based on differential virus sequence data and make reinfection appear an extremely rare event, many will find the data reported by Hansen and colleagues about protection through natural infection relatively alarming. Only 80% protection from reinfection in general, decreasing to 47% in people aged 65 years and older, are more concerning figures than offered by previous studies."

They continue, "These data are all confirmation, if it were needed, that for SARS-CoV-2 the hope of protective immunity through natural infections might not be within our reach and a global vaccination programme with high efficacy vaccines is the enduring solution."

Credit: 
The Lancet

Sheep vs. goats: Who are the best problem solvers?

image: Goats are better at solving problems than sheep.

Image: 
Christian Nawroth

When it comes to adapting to new situations, goats are a step ahead. Compared to sheep, they can more quickly adapt to changing environmental conditions. These are the findings of a new study by researchers at Martin Luther University Halle-Wittenberg (MLU) and the Leibniz Institute for Farm Animal Biology (FBN) which were published in Royal Society Open Science. The study investigated how well the animals were able to navigate around obstacles to reach food.

Sheep and goats have many things in common: They are closely related genetically, roughly the same size, have similar social structures, and have both been domesticated by humans over approximately the same amount of time. They do, however, differ greatly when it comes to their foraging strategies. "While sheep are grazers, goats are browsers and prefer buds and fresh shoots," explains Dr Camille Raoult from MLU, who led the study together with Dr Christian Nawroth from FBN. The experiments were conducted at the Agroscope research centre in Switzerland, at Queen Mary University of London (QMUL) and at the Buttercups Sanctuary for Goats in Kent.

"It is important that animals are able to react swiftly to a changing environment because this allows them to find and exploit new food sources," says Nawroth. The team therefore wanted to investigate how both animal species react to new spatial obstacles. The experimental set-up of the study was rather straightforward: one animal at a time was led to the end of a small enclosure. Another person stood at the opposite end offering food. In between was a fence with a gap - the direct path was blocked each time. The researchers observed the animals' behaviour, specifically, whether they moved directly towards the gap, and recorded the time it took them to reach the food. After a few rounds, the position of the gap in the fence was changed. The animals then repeated the test. A total of 21 goats and 28 sheep completed the experiment.

The results: In the first round with the newly located gap, the goats managed to walk around the obstacle more easily and more directly, although the sheep reached their goal faster on average. Both the sheep and the goats were initially puzzled by the new position of the gap and needed a few attempts to adjust to the new situation. Afterwards, they made fewer mistakes. The experiments could not be carried out under identical conditions at both locations, but the results were nevertheless clear: "Goats appear to adapt better and more accurately to new situations and move with less perseveration around the obstacle when the gap has changed. This suggests that they are more cognitively flexible than sheep," summarises co-author Dr Britta Osthaus from Canterbury Christ Church University. One possible reason for the differences could be their differing foraging strategies, the researcher adds.

Credit: 
Martin-Luther-Universität Halle-Wittenberg

Safety concerns determine level of public support for driverless vehicles, finds NTU study

image: An NTU Singapore study has found that safety concerns determine the level of public support for driverless vehicles. To enable the safe and effective rollout of autonomous vehicles (AVs) in Singapore, the Centre of Excellence for Testing and Research of Autonomous vehicles at NTU (CETRAN), designed to replicate the different elements of Singapore's roads, facilitates testing of AVs in a real-world setting. The world's first full-size autonomous electric bus, launched by NTU and Volvo Buses in 2019, is one such AV that has undergone rigorous testing at CETRAN.

Image: 
NTU Singapore

When it comes to the use of driverless vehicles, an individual's support for their adoption hinges on how safe they are, rather than their economic impact or privacy concerns stemming from the data they might collect, a Nanyang Technological University, Singapore (NTU Singapore) study of 1,006 Singaporeans has found.

The NTU Singapore study led by the Wee Kim Wee School of Communication and Information exposed its participants to positive and negative blog posts about driverless vehicles and their safety, their impact on jobs and the economy, and their collection of data. These three 'frames' - how something is presented to the public - were selected from a content analysis of The Straits Times news reports from 2015 to 2020.

The respondents were then asked if they thought that driverless cars were bad/good, foolish/wise, unpleasant/pleasant, useless/useful, dangerous/safe. Their support for driverless cars was also measured on a five-point scale.

After being exposed to information on how potentially dangerous driverless vehicles could be, the respondents held less favourable attitudes towards driverless vehicles, even when shown a positive blog post on how autonomous vehicles (AVs) could create many high-paying jobs, or on how driverless cars could provide convenience and efficiency using the data they collect, for example by remembering users' schedules or monitoring their preferences.

Professor Shirley Ho, who led the research team, said: "One major debate about driverless cars, which hinge on the use of artificial intelligence technologies, lies in their limited ability to make judgments at the intersections of human values, moral rights, ethics, and social norms. This limitation may present safety risks, particularly in cases when traffic accidents are unavoidable. This could potentially explain why the negative safety messages in our study had a stronger effect on the respondents."

She added that with the drive towards AV adoption globally, these findings provide policymakers with important insights. Singapore has expanded AV testing to cover all public roads in its western areas and aims to serve three areas with driverless buses from next year.

Prof Ho, who is also NTU's Research Director for Arts, Humanities, Education and Social Sciences, said: "With AVs expected to be integrated into Singapore's land transport master plan, there is an urgent need for policymakers to examine the different strategies to communicate about driverless cars to the Singapore public.

"Our study has found that it is important to address the safety considerations. Even after all the safety measures are in place, public consultation is still necessary to ensure that the public's concerns, especially those of a moral and ethical nature, are taken into consideration in the process of developing the technology, before launching driverless cars on a large scale."

The study was published in the International Journal of Public Opinion Research in February.

Pre-existing values drive perception of AVs

These findings build on a 2020 NTU study that surveyed the same sample of 1,006 Singaporeans. That study, published in the scientific journal Transportation Research, found that the public's willingness to use driverless cars, which is "marginally positive", is driven by value predispositions, such as perceptions of the risks and benefits of AVs, rather than by general science knowledge or AV-specific knowledge.
 
In this study, Prof Ho and her team also found that while the public acknowledged the potential benefits of AVs - such as helping the elderly and disabled to be more independent, lowering fuel consumption, and increasing human productivity - they also had a high level of risk perception towards driverless cars. In addition to technical errors that may lead to safety consequences, the respondents were concerned about potential security issues from hackers and privacy issues caused by exact location tracking capabilities found in AVs.

Prof Ho said: "These findings suggest that driverless cars may not achieve widespread usage in Singapore if there are no efforts to promote them, limiting the degree to which society can reap the technology's benefits."

Credit: 
Nanyang Technological University