Tech

Is your drinking water toxic? This app may help you find out

image: Past studies have shown that fracking chemicals could cause infertility, cancer, and birth defects. According to data compiled from a new tool developed at Penn Medicine, a disproportionately high number of wells in Pennsylvania, Illinois, and Ohio contain chemicals that target testosterone pathways in the human body.

Image: 
Penn Medicine

PHILADELPHIA - Exposure to hydraulic fracturing fluid in drinking water has been shown to increase the risk of respiratory problems, premature births, congenital heart defects, and other medical problems. But not all wells are created equal. Since different hydraulic fracturing -- or fracking -- sites use a diverse mix of chemical ingredients, individuals and researchers are often in the dark about the health consequences of living near a particular well.

Now, a new, interactive tool created by Penn Medicine researchers allows community members and scientists to find out which toxins may be lurking in their drinking water as a result of fracking. By typing your ZIP code into the website or accompanying app -- called WellExplorer -- you can view the closest fracking sites in your state, learn which chemicals are used at those sites, and view their levels of toxicity.

In a recent study, published in Database: The Journal of Biological Databases and Curation, the WellExplorer app's creators found, for example, that wells in Alabama use a disproportionately high number of ingredients targeting estrogen pathways, while Illinois, Ohio, and Pennsylvania use a high number of ingredients targeting testosterone pathways. The information found through WellExplorer might be particularly relevant for individuals who use private water wells, which are common in rural Pennsylvania, since homeowners may not be performing rigorous testing for these fracking chemicals, according to the study's principal investigator Mary Regina Boland, PhD, an assistant professor of Informatics in the Perelman School of Medicine at the University of Pennsylvania.

"The chemical mixtures used in fracking are known to regulate hormonal pathways, including testosterone and estrogen, and can therefore affect human development and reproduction," Boland said. "Knowing about these chemicals is important, not only for researchers who may be studying health outcomes in a community, but also for individuals who may want to learn more about possible health implications based on their proximity to a well. They can then potentially have their water tested."

While FracFocus.org serves as a central registry for fracking chemical disclosures in the United States, the database is not user-friendly for the general public, and it does not contain information about the biological action of the fracking chemicals that it lists. In order to create a tool that could provide more in-depth, functional information for researchers and individuals alike, the Penn researchers first cleaned, shortened, and subsetted the data from FracFocus.org to create two newly usable files that could be used in the WellExplorer website and app.

Because the research team also wanted to provide the toxicological and biological properties of the ingredients found at these well sites, they integrated data from the Toxin and Toxin Target Database (T3DB). From that database, they compiled information on fracking chemicals' protein targets (and the genes that encode those proteins), toxin mechanisms of action, and specific protein functions. Moreover, they extracted the toxicity rankings of the 275 most toxic ingredients from the Agency for Toxic Substances and Disease Registry, as well as a list of ingredients that are food additives, as described in the Substances Added to Food inventory. The team then linked all of that information together and built a ZIP Searcher function into their web tool, so that people could easily find their exposure risks to specific chemicals.
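The data linkage described above, well disclosures keyed by location, joined to per-chemical annotations, can be sketched roughly as follows. Every well ID, ZIP code, chemical, protein target, and toxicity rank below is a hypothetical placeholder, not WellExplorer or T3DB data:

```python
# Illustrative sketch (not actual WellExplorer code) of joining well-site
# chemical disclosures with toxin annotations, then looking up by ZIP code.
# All data values here are invented placeholders.

WELLS = [  # per-well disclosure records, as might be derived from FracFocus
    {"well_id": "W-001", "zip": "15001", "chemicals": ["ethylene glycol", "naphthalene"]},
    {"well_id": "W-002", "zip": "15001", "chemicals": ["methanol"]},
    {"well_id": "W-003", "zip": "60601", "chemicals": ["naphthalene"]},
]

ANNOTATIONS = {  # stand-in for T3DB protein targets + toxicity rankings
    "ethylene glycol": {"protein_targets": ["ALDH1A1"], "toxicity_rank": 228},
    "naphthalene": {"protein_targets": ["CYP1A1"], "toxicity_rank": 82},
    "methanol": {"protein_targets": ["ADH1A"], "toxicity_rank": None},
}

def zip_search(zip_code):
    """List chemicals used at wells in a ZIP, joined to their annotations."""
    report = {}
    for well in WELLS:
        if well["zip"] != zip_code:
            continue
        for chem in well["chemicals"]:
            entry = report.setdefault(chem, {"wells": [], **ANNOTATIONS.get(chem, {})})
            entry["wells"].append(well["well_id"])
    return report

for chem, info in zip_search("15001").items():
    print(chem, info["wells"], info.get("protein_targets"))
```

The key design point is the join: disclosure records alone say what is used where, and the annotation table alone says what each chemical does; only linked together do they answer "what might be in my water, and why does it matter?".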

"The information had been out there, but it was not all linked together in a way that's easy for regular people to use," Boland said.

However, Boland added that the use of chemicals at a fracking site does not necessarily mean that those chemicals are present in the water supply; that depends on other factors, such as the type of soil or bedrock being drilled into and the depths of both the hydraulic fracturing well and an individual's private well. Nonetheless, WellExplorer provides a starting point for residents who may be experiencing symptoms and want to have their water tested.

Beyond information-gathering for individuals, WellExplorer can also be used as an important tool for environmental scientists, epidemiologists, and other researchers to make connections between specific health outcomes and proximity to a specific fracturing well. From a development standpoint, this means that the research team had to be conscious of the two audiences when designing the website and app, said Owen Wetherbee, who aided in the development of WellExplorer while interning in the Boland Lab.

"Nationally, researchers are trying to link fracking to health outcomes, and I believe that a large reason why answering that question is challenging, is because different wells are using different ingredients, and so, the side effects of exposure would be different from place to place," Boland added. "What this app gives you is some information about where to start looking for these answers."

Credit: 
University of Pennsylvania School of Medicine

Parkinson's disease is not one, but two diseases

video: Parkinson's disease is not one, but two diseases.

Image: 
Per Borghammer/Jonathan Bjerg Møller (video), Aarhus University

Although the name may suggest otherwise, Parkinson's disease is not one but two diseases, starting either in the brain or in the intestines. This explains why patients with Parkinson's describe widely differing symptoms, and it points towards personalised medicine as the way forward for people with Parkinson's disease.

This is the conclusion of a study which has just been published in the leading neurology journal Brain.

The researchers behind the study are Professor Per Borghammer and Medical Doctor Jacob Horsager from the Department of Clinical Medicine at Aarhus University and Aarhus University Hospital, Denmark.

"With the help of advanced scanning techniques, we've shown that Parkinson's disease can be divided into two variants, which start in different places in the body. For some patients, the disease starts in the intestines and spreads from there to the brain through neural connections. For others, the disease starts in the brain and spreads to the intestines and other organs such as the heart," explains Per Borghammer.

He also points out that the discovery could be very significant for the treatment of Parkinson's disease in the future, as this ought to be based on the individual patient's disease pattern.

Parkinson's disease is characterised by slow deterioration of the brain due to accumulated alpha-synuclein, a protein that damages nerve cells. This leads to the slow, stiff movements which many people associate with the disease.

In the study, the researchers have used advanced PET and MRI imaging techniques to examine people with Parkinson's disease. People who have not yet been diagnosed but have a high risk of developing the disease are also included in the study. People diagnosed with REM sleep behaviour disorder have an increased risk of developing Parkinson's disease.

The study showed that some patients had damage to the brain's dopamine system before damage in the intestines and heart occurred. In other patients, scans revealed damage to the nervous systems of the intestines and heart before the damage in the brain's dopamine system was visible.

This knowledge is important and it challenges the understanding of Parkinson's disease that has been prevalent until now, says Per Borghammer.

"Until now, many people have viewed the disease as relatively homogeneous and defined it based on the classical movement disorders. But at the same time, we've been puzzled about why there was such a big difference between patient symptoms. With this new knowledge, the different symptoms make more sense and this is also the perspective in which future research should be viewed," he says.

The researchers refer to the two types of Parkinson's disease as body-first and brain-first. In the case of body-first, it may be particularly interesting to study the composition of bacteria in the intestines known as the microbiota.

"It has long since been demonstrated that Parkinson's patients have a different microbiome in the intestines than healthy people, without us truly understanding the significance of this. Now that we're able to identify the two types of Parkinson's disease, we can examine the risk factors and possible genetic factors that may be different for the two types. The next step is to examine whether, for example, body-first Parkinson's disease can be treated by treating the intestines with faeces transplantation or in other ways that affect the microbiome," says Per Borghammer.

"The discovery of brain-first Parkinson's is a bigger challenge. This variant of the disease is probably relatively symptom-free until the movement disorder symptoms appear and the patient is diagnosed with Parkinson's. By then the patient has already lost more than half of the dopamine system, and it will therefore be more difficult to find patients early enough to be able to slow the disease," says Per Borghammer.

The study from Aarhus University is longitudinal, i.e. the participants are called in again after three and six years so that all of the examinations and scans can be repeated. According to Per Borghammer, this makes the study the most comprehensive ever, and it provides researchers with valuable knowledge and clarification about Parkinson's disease - or diseases.

"Previous studies have indicated that there could be more than one type of Parkinson's, but this has not been demonstrated clearly until this study, which was specifically designed to clarify this question. We now have knowledge that offers hope for better and more targeted treatment of people who are affected by Parkinson's disease in the future," says Per Borghammer.

According to the Danish Parkinson's Disease Association, there are 8,000 people with Parkinson's disease in Denmark and up to eight million diagnosed patients worldwide.

This figure is expected to increase to 15 million in 2050 due to the ageing population, as the risk of getting Parkinson's disease increases dramatically the older the population becomes.

Credit: 
Aarhus University

Scientists identify solid electrolyte materials that boost lithium-ion battery performance

image: The discovery by Evan Reed, associate professor of Materials Science & Engineering at Stanford, and visiting scholar Austin Sendek could help battery researchers design the first solid electrolytes that are safe, cheap and efficient.

Image: 
L.A. Cicero/Stanford University

Stanford University scientists have identified a new class of solid materials that could replace flammable liquid electrolytes in lithium-ion batteries.

The low-cost materials - made of lithium, boron and sulfur - could improve the safety and performance of electric cars, laptops and other battery-powered devices, according to the scientists. Their findings are published in a study in the journal ACS Applied Materials & Interfaces.

"A typical lithium-ion battery has two solid electrodes with a highly flammable liquid electrolyte in between," said study lead author Austin Sendek, a visiting scholar in Stanford's Department of Materials Science & Engineering. "Our goal is to design stable, low-cost solid electrolytes that also increase the power and energy output of the battery."

Promising materials

Battery electrolytes shuttle lithium ions between the positive and negative electrode during charging and discharging. Most lithium-ion batteries use a liquid electrolyte that can combust if the battery is punctured or short-circuited. Solid electrolytes, on the other hand, rarely catch fire and are potentially more efficient.

"Solid electrolytes hold promise as safer, longer-lasting and more energy-dense alternatives to liquid electrolytes," said senior author Evan Reed, an associate professor of materials science and engineering. "However, the discovery of suitable materials for use in solid electrolytes remains a significant engineering challenge."

Most solid electrolytes in use today are too unstable, inefficient and expensive to be commercially viable, the authors said.

"Conventional solid electrolytes can't conduct as much ionic current as liquid electrolytes," Sendek said. "The few that can usually degrade once they come in contact with the battery electrodes."

Machine learning

To find reliable solid electrolytes, Sendek and his colleagues in 2016 trained a computer algorithm to screen more than 12,000 lithium-containing compounds in a materials database. Within minutes the algorithm identified approximately 20 promising materials, including four little-known compounds made of lithium, boron and sulfur.
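As a rough illustration of this kind of database screen (not the actual Stanford model), a small classifier can be trained on compounds with known labels and then used to rank unlabeled candidates. All feature values and labels below are invented for the sketch, the two Li-B-S formulas are used as example names only, and the real study's features and model were far richer:

```python
import math

# Toy materials-screening sketch: fit a tiny logistic-regression model on
# labeled compounds, then shortlist unlabeled candidates scoring above 0.5.
# Features might encode e.g. lithium fraction and average anion volume in a
# real screen; here they are arbitrary illustrative numbers.

TRAIN = [([0.9, 0.2], 1), ([0.8, 0.3], 1), ([0.7, 0.4], 1),
         ([0.3, 0.7], 0), ([0.2, 0.9], 0), ([0.1, 0.8], 0)]

def sigmoid(z):
    z = max(-60.0, min(60.0, z))  # clamp to avoid math overflow
    return 1.0 / (1.0 + math.exp(-z))

def train(data, lr=0.5, epochs=1000):
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, y in data:  # plain stochastic gradient descent
            err = sigmoid(w[0] * x[0] + w[1] * x[1] + b) - y
            w = [w[i] - lr * err * x[i] for i in range(2)]
            b -= lr * err
    return w, b

w, b = train(TRAIN)

CANDIDATES = {"Li5B7S13": [0.85, 0.25], "Li2B2S5": [0.75, 0.35],
              "candidate-C": [0.15, 0.85]}
scores = {n: sigmoid(w[0] * f[0] + w[1] * f[1] + b) for n, f in CANDIDATES.items()}
shortlist = sorted((n for n, s in scores.items() if s > 0.5),
                   key=lambda n: -scores[n])
print(shortlist)
```

The payoff is the same as in the article: scoring thousands of database entries with a trained model takes minutes, versus months of trial-and-error synthesis.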

"As we were looking at the candidates, we noticed that four lithium-boron-sulfur compounds kept popping up," Sendek said. "Unfortunately, there wasn't much about these materials in the existing scientific literature."

In the current study, the researchers took a closer look at the four compounds using a technique called density functional theory, which simulates how the materials would behave at the atomic level.

Very promising results

Lithium-boron-sulfur electrolytes could be about twice as stable as the leading solid electrolytes, the current study shows. Stability can impact the amount of energy per unit weight a battery can store. In electric vehicles, that can mean a longer driving range.

"Teslas and other electric cars can go 250 to 300 miles on a single charge." Sendek said. "But with a solid electrolyte you could potentially double the energy density of lithium-ion batteries and get that range above 500 miles - and maybe even start thinking about electric flight."

When a typical solid electrolyte breaks down, it chemically transforms from a good conductor into a bad conductor, causing the battery to stop working. The study predicted that when mixed together, the four lithium-boron-sulfur compounds would continue functioning even as they decompose.

"All four compounds are chemically similar," Sendek said. "So when the mixture breaks down, each compound will likely transform from a good conductor to another good conductor to another. That means the materials can withstand several cycles of breaking down before they decompose into a bad conductor that ultimately kills your battery."

The study also predicted that certain phases of the lithium-boron-sulfur materials could be three times better at conducting lithium ions than state-of-the-art solid electrolytes made with costly germanium.

"If you get good ionic conductivity you can get more current flow out of your battery," Sendek said. "More current means more power to accelerate your car."

Some of the best solid electrolytes available today are made with rare elements like germanium, a kilogram of which costs about $500. Lithium, boron and sulfur are abundant elements with a price tag of $26 per kilogram.

"Our computer algorithm was searching for new materials based on their physical properties," Sendek said. "But it just so happened the four compounds were also much cheaper than the alternatives."

Lithium metal

Finding a viable solid electrolyte could also lead to the development of lithium metal batteries - energy-dense, lightweight batteries that are ideal candidates for electric cars.

Most lithium-ion batteries have a negatively charged electrode made of graphite. In lithium metal batteries, graphite is replaced with metallic lithium, which can store significantly more charge per kilogram.

"Lithium metal is really the holy grail of battery research," Sendek said. "But lithium metal electrodes have a tendency to internally short during operation, which liquid electrolytes do nothing to prevent. Solid electrolytes seem to be our best chance of overcoming that problem, and lithium-boron-sulfur electrolytes are promising candidates."

Research roadmap

The Stanford study provides a theoretical roadmap for future research. The next step is to synthesize all four lithium-boron-sulfur materials and test them in a battery.

"From what my experimentalist friends tell me, making these materials in the lab may be quite difficult," Sendek said. "Our job as theorists is to point the experimentalists to promising materials and let them see how the materials perform in real devices."

The ability to identify these promising materials from thousands of candidates was made possible through artificial intelligence and machine learning, Reed added.

"The discovery of most new materials to date has been accomplished by inefficient trial-and-error searches," he said. "Our results represent an inspiring success for the machine-learning approach to materials chemistry."

Credit: 
Stanford University

Very sensitive optical receivers for space communication

image: Researchers in Sweden recently demonstrated a novel concept for laser-beam-based communication links using a near 'noiseless' optical pre-amplifier in the receiver. They demonstrated an unprecedented receiver sensitivity of only one photon-per-information bit at a bit rate of 10.5 Gbit/s. This approach will provide a path for increasing the reach and information rates in future high-speed links for inter-satellite communication, deep-space missions, and Earth monitoring with Lidar.

Image: 
by Ravikiran Kakarla, Jochen Schröder, and Peter A. Andrekson

Space communication for deep-space missions, inter-satellite data transfer and Earth monitoring requires high-speed data connectivity. Such systems are increasingly using optical laser beams rather than radio-frequency beams. A key reason for this is that the loss of power as the beam propagates is substantially smaller at optical wavelengths, since the beam divergence is then reduced. Nevertheless, light beams will also suffer substantial loss over long distances. As an example, the power lost when sending a laser beam from the Earth to the Moon (400,000 km) with a 10 cm aperture size will be about 80 dB (i.e. 1 part in 100 million will remain). As the transmitted power is limited, it is of critical importance to have receivers that can recover the information sent with as little received power as possible. This sensitivity is quantified as the minimum number of photons per information bit needed to recover the data without error.
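A loss of this magnitude can be sanity-checked with the Friis relation for aperture-to-aperture links. The wavelength and the receive-aperture diameter below are assumptions (the article states only a 10 cm aperture), so treat this as an order-of-magnitude check rather than a reproduction of the quoted 80 dB figure:

```python
import math

# Back-of-envelope Earth-Moon link budget using the Friis relation for
# aperture-to-aperture links. Wavelength and receive aperture are assumed.
wavelength = 1.55e-6    # m, assumed telecom-band laser
distance = 4.0e8        # m, Earth-Moon
d_tx, d_rx = 0.10, 1.0  # m, transmit aperture / assumed receive aperture

def aperture_area(d):
    return math.pi * (d / 2) ** 2

# Friis for apertures: Pr/Pt = A_tx * A_rx / (wavelength * distance)^2
ratio = aperture_area(d_tx) * aperture_area(d_rx) / (wavelength * distance) ** 2
loss_db = -10 * math.log10(ratio)
print(f"link loss ~ {loss_db:.0f} dB")  # ~78 dB with these assumed apertures
```

The strong dependence on wavelength in the denominator is exactly why optical beams beat radio-frequency beams here: shrinking the wavelength shrinks the divergence, so far more of the transmitted power lands on the receiver.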

A widely studied approach uses power-efficient pulse position modulation formats along with nanowire-based photon-counting receivers cooled to only a few kelvin and operating at speeds below 1 Gbit/s. However, to achieve the multi-Gbit/s data rates that will be required in the future, systems relying on pre-amplified receivers together with advanced signal generation and processing techniques from optical fiber communications are also being considered.

In a new paper published in Light: Science & Applications, a team of scientists, led by Professor Peter A. Andrekson, has developed a free-space optical transmission system relying on an optical amplifier that, in principle, adds no excess noise, in contrast to all other known optical amplifiers; it is referred to as a phase-sensitive amplifier (PSA).

In this concept (see Figure 1), information is encoded onto a signal wave (green), which, together with a pump wave (red) at a different frequency, generates a conjugated wave (an idler, blue) in a nonlinear medium. These three waves are launched together into free space. At the receiving point, after the light is captured in an optical fiber, the PSA amplifies the data-carrying signal and idler using a regenerated pump wave. The amplified signal is then detected in a conventional coherent receiver. This fundamentally results in the best possible sensitivity of any pre-amplified optical receiver.

With this approach, the team demonstrated an unprecedented error-free, "black-box" sensitivity of one photon-per-information-bit at a data rate of 10.5 Gbit/s. With 10 Watts of transmitter power, this receiver would allow for a link loss of 100 dB. The system uses a simple modulation format encoded with a standard forward error correction code and a coherent receiver with digital signal processing for signal recovery. This method is straightforwardly scalable to higher data rates if needed. It also operates at room temperature, allowing this to be implemented in space terminals and not only on the ground.
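The 100 dB figure follows directly from the demonstrated sensitivity and the stated transmit power. A short check, assuming a telecom-band wavelength of 1550 nm (the wavelength is not stated in the text above):

```python
import math

# Tolerable link loss from the demonstrated receiver sensitivity
# (1 photon per information bit at 10.5 Gbit/s) and 10 W transmit power.
h, c = 6.626e-34, 2.998e8  # Planck constant (J s), speed of light (m/s)
wavelength = 1.55e-6       # m, assumed telecom-band laser
bit_rate = 10.5e9          # bit/s, from the demonstration
photons_per_bit = 1.0      # demonstrated sensitivity

photon_energy = h * c / wavelength                     # ~1.28e-19 J
rx_power = photons_per_bit * bit_rate * photon_energy  # required received power, W
rx_dbm = 10 * math.log10(rx_power / 1e-3)              # ~ -58.7 dBm
tx_dbm = 10 * math.log10(10.0 / 1e-3)                  # 10 W = 40 dBm
budget_db = tx_dbm - rx_dbm
print(f"tolerable link loss ~ {budget_db:.0f} dB")  # ~99 dB, i.e. roughly 100 dB
```

In other words, the receiver needs only about a nanowatt of captured light to run error-free at 10.5 Gbit/s, which is what makes a ~100 dB link loss survivable.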

The theoretical sensitivity limits of this approach are also discussed in the paper and compared to other existing methods. A conclusion is that this approach is the best possible over a very broad range of data rates.

The scientists summarize the results:

"These results show the viability of this new approach for extending the reach and data rate in long-haul space communication links. It therefore also has the promise to help break through the present-day science data return bottleneck in deep-space missions, which space agencies around the world are suffering from today."

Credit: 
Light Publishing Center, Changchun Institute of Optics, Fine Mechanics And Physics, CAS

Deep-blue organic light-emitting diodes based on a doublet-emission cerium(III) complex

image: a. Synthetic route for the complex. b. Single crystal structure of the complex shown as ellipsoids at the 50% probability level, where yellow represents Ce, pink represents B, blue represents N, red represents O, grey represents C, and the hydrogens are omitted for clarity. c. Single crystal structure of the complex shown in a space-fill style, where hydrogens are shown in white.

Image: 
by Liding Wang, Zifeng Zhao, Ge Zhan, Huayi Fang, Hannan Yang, Tianyu Huang, Yuewei Zhang, Nan Jiang, Lian Duan, Zhiwei Liu, Zuqiang Bian, Zhenghong Lu, Chunhui Huang

Compared with traditional display technologies, organic light-emitting diodes (OLEDs) have many advantages, such as high contrast, vivid colour, wide viewing angles, light weight, and mechanical flexibility. To date, OLEDs have been successfully commercialized in the niche display market and are now under intense research for other applications, such as solid-state lighting.

During the past three decades, fluorescence, phosphorescence, thermally activated delayed fluorescence (TADF), and organic radical materials have successively been applied as emitters in the pursuit of high-efficiency, long-term-stable, and low-cost OLEDs. As a new type of emitter in OLEDs, cerium(III) complexes have many potential advantages. First, the authors propose that the theoretical exciton utilization efficiency (EUE) could be as high as 100%, since the cerium(III) complex shows a doublet 5d-4f transition from the single electron of the central cerium(III) (4f1 configuration) ion rather than a singlet and/or triplet transition, and is therefore not limited by spin statistics. Second, cerium(III) complexes are expected to be more stable in OLEDs since their excited-state lifetimes are generally tens of nanoseconds. Third, cerium(III) complexes are inherently blue or ultraviolet emitters, as demonstrated in the literature, although their emission colours could in theory be shifted by the ligand field. Moreover, cerium(III) complexes are inexpensive because the abundance of cerium in Earth's crust is 0.006 wt%, which is four orders of magnitude higher than that of iridium (0.0000001 wt%) and even slightly higher than that of copper (0.005 wt%). Hence, cerium(III) complexes have the potential to yield deep-blue OLEDs with high efficiency, long-term stability, and low cost.

However, most reported cerium(III) complexes are non-emissive because classic ligands and solvent molecules are found to quench cerium(III) ion luminescence upon coordination. Hence, electroluminescence studies of cerium(III) complexes are very rare, and their advantages have not been demonstrated. To date, there are only three such studies in the literature, and among these, the maximum external quantum efficiency (EQE) of the best result is below 1%. As a breakthrough, the authors report a novel, neutral cerium(III) complex, Ce-1, with rigid scorpionate ligands showing a high photoluminescence quantum yield (PLQY) of up to 93% in doped film and consequently a high average EQE of 12.4% in prototype OLEDs.

The complex Ce-1 was synthesized by stirring potassium hydrotris(3,5-dimethylpyrazolyl)borate (KTpMe2) with Ce(CF3SO3)3 in tetrahydrofuran (THF), accompanied by hydrolysis due to a trace amount of water in the solvent (Figure 1). Through the chelating coordination of the two multidentate rigid ligands, the central cerium(III) ion is effectively protected from the influence of environmental quenching (Figure 1). Ce-1 powder emits deep-blue light, and the spectrum shows the typical double-peak emission of cerium(III) ions with an excited-state lifetime of 42 nanoseconds. The PLQY of its powder is as high as 82%.

As for the electroluminescence properties of Ce-1, the article first uses the bipolar BCPO as the host material (Figure 2). From the PLQY and orientation ratio of the emitting layer (BCPO:Ce-1) and the EQE of the device, the EUE of Ce-1 in the device is deduced to be as high as 100%. Subsequently, the article employs TSPO1:CzSi as the host material to raise the PLQY of the doped film to 93%, and the maximum EQE of the optimized device finally reaches 14% with a maximum brightness of 1008 cd/m² (Figure 2). The Commission Internationale de L'Eclairage (CIE) coordinates of this device are (0.146, 0.078).

This paper also studies the mechanisms of photoluminescence and electroluminescence. First, electron paramagnetic resonance (EPR) spectroscopy of Ce-1 powder confirmed that Ce-1 is paramagnetic. Density functional theory (DFT) calculations also show that the donor and acceptor for the first symmetry-allowed transition are the 4f and 5d orbitals of the central cerium(III) ion. The excited-state lifetime of tens of nanoseconds and the double emission peak with an energy difference of ~2000 cm⁻¹ also indicate that the deep-blue light comes from the doublet 5d-4f transition of the cerium(III) ion. By comparing the electroluminescence spectrum of the device with the photoluminescence spectrum of the corresponding doped film, together with the transient electroluminescence spectrum, it is deduced that carrier recombination occurs on the Ce-1 complex rather than on the host material. On the basis of further analysis of the device's turn-on voltage and the bandgap between the ligand and the central ion, the paper concludes that cerium(III) ions can directly capture electrons/holes to form doublet excitons and emit deep-blue light (Figure 3).

Credit: 
Light Publishing Center, Changchun Institute of Optics, Fine Mechanics And Physics, CAS

Do rats like to be tickled?

Not all rats like to be tickled but by listening to their vocalisations it is possible to understand in real-time their individual emotional response, according to new research by the University of Bristol. The study, published today [21 September] in Current Biology, suggests that if this same relationship is observed for other situations, then it may be possible to use call patterns in rats to measure their emotional response and understand how best to improve their welfare.

Rats emit high frequency vocalisations which, when produced during human-simulated play or 'tickling', are thought to be similar to human laughter. Human laughter is complex and when a person is tickled, they may laugh even if they do not find the experience pleasurable. In rats, it has been impossible to know how much any individual rat 'likes' the experience because of limitations in methods for directly measuring their emotional response. To ask the question 'Do rats like to be tickled?', the researchers used a behavioural test developed at Bristol which provides a sensitive measure of an animal's individual emotional experience, and they compared the data from this test with the animals' vocalisations during 'tickling'.

The researchers found that not all rats like to be tickled, that some rats emitted very high numbers of calls whilst others did not, and that these calls were directly related to their emotional experience. Rats that emitted the most calls had the highest positive emotional response to tickling, but those that emitted few or no calls did not show a positive response.

Emma Robinson, Professor of Psychopharmacology in the School of Physiology, Pharmacology & Neuroscience, who led the research, said: "Being able to measure a positive emotional response in animals is an important way to improve their welfare. What we have shown in this study is that the vocalisations made by rats in response to tickling are an accurate reflection of their emotional experience and something which is easy to measure.

"Should this be the case for other situations, measuring vocalisations could provide the simple, graded measure of emotional experience needed to better understand and improve the welfare of rats in a laboratory."

This work is important for two reasons. Related to tickling-induced laughter in rats, the team's findings support previous work that shows that these vocalisations indicate a positive experience. However, rats seem to be more 'honest' with their response to tickling than humans or non-human primates and the amount they laugh directly relates to how positive they find the experience. The findings also suggest that the high frequency vocalisations which rats emit can provide researchers with a simple, graded measure of their individual affective experience.

Being able to assess the welfare of animals accurately and objectively is important but is difficult to achieve. Without being able to ask an animal how it feels, researchers must rely on other methods which have their limitations. Researchers at Bristol have previously shown that the affective bias test used in this study can provide this type of objective measure, but it is highly specialist and time consuming to run so not readily applied in the wider laboratory animal setting. This research has found that human-simulated play or 'tickling' rats can cause a positive emotional state but not for all rats and by recording vocalisations it is possible to quickly identify which animals benefit from this type of enrichment.

The research team is seeking further funding to expand this work and look in more detail at the relationships between vocalisation patterns and emotional state, using the Bristol-developed affective bias test to provide an objective baseline measure. The team wants to examine whether similar associations are found between vocalisations and positive and negative emotional experiences.

Credit: 
University of Bristol

New research highlights impact of COVID-19 on food security in Kenya and Uganda

image: Female farmer protects herself against COVID-19.

Image: 
CABI

CABI scientists have conducted new research highlighting the impact of the COVID-19 pandemic on food security in Kenya and Uganda, with more than two-thirds of those surveyed having experienced economic hardship due to the pandemic.

Dr Monica Kansiime led a team of researchers who discovered, from a random sample of 442 respondents, that the proportion of food insecure people increased by 38% and 44% in Kenya and Uganda respectively.

The scientists, who conducted online questionnaires using WhatsApp, Facebook, Telegram, Twitter and email, also found that, in both countries, the regular consumption of fruits decreased by around 30% during the COVID-19 pandemic compared to before the crisis struck.

Besides income effects, the respondents mentioned other COVID-19-induced social challenges such as restricted movements, interrupted work schedules, mental health issues, and isolation.

Dr Kansiime said, "Taken together, the results suggest that although the COVID-19 pandemic is causing detrimental effects on all economic sectors, farmers are more likely than salary and wage earners to report suffering income shocks.

"Potential explanations include difficulties for farmers to go to farms, access inputs or transport their produce to markets due to COVID-19 induced lockdown. Compared to salary and wage-earning workers, the farmers in this sample earned relatively low incomes. Consequently, even a small shock to their income-earning activity could cause devastating effects."

The study, published in the journal World Development, found that households changed their dietary patterns in response to the COVID-19 outbreak by consuming less diverse diets, skipping meals, and reducing portions of food consumed. This points to the negative impacts of the pandemic on household food and nutrition security, the scientists say.

Dr Justice Tambo, co-author, added, "During the COVID-19 period in Kenya, more than half of the respondents were worried about insufficient food, unable to eat healthy and nutritious food, ate reduced portions of food, and consumed limited food varieties. However, before the COVID-19 outbreak, only 30% of the respondents in Kenya experienced these food insecurity situations.

"Similarly, the number of respondents in Uganda who reduced the amount of food eaten, were unable to eat healthy and nutritious food, consumed less diverse diets, or were worried about not having enough food to eat increased significantly by about 30, 35, 45, and 50 percentage points, respectively, during the COVID-19 period relative to a normal period."

Except for vegetables in Kenya, the number of respondents who regularly consumed each of the five food groups - fruits, vegetables, fish and seafood, meat and poultry - fell by about 50 percentage points during the pandemic.

This is a cause for concern, the researchers argue, given that some of these food groups are important sources of micronutrients needed for good health; estimates suggest that over two billion people worldwide already suffer from micronutrient deficiency.

To illustrate the effects of the pandemic on income-generating activities, a self-employed respondent in Kenya remarked: "Since the last 45 days of the outbreak of this deadly disease, so many people have travelled back to rural areas to hide. This has made my business weak because most of my customers went away, and the current situation now is nothing but survival. There is no movement after 7pm, and this is reducing the business activity hence lowering income. Life is hard, generally."

As a result of the hardships faced by residents in Kenya and Uganda, the respective governments have put in place a range of financial and economic policy changes to try to mitigate the impacts.

These include, in Kenya, proposals for a post-COVID-19 economic stimulus package of 53.7 billion shillings ($503 million) to support businesses that have been hit by the pandemic.

Meanwhile in Uganda, the government introduced repayment holidays, debt relief of up to 12 months, and a reduction of the central bank lending rate from 9% to 8%. Food relief to vulnerable workers has also been considered, particularly those whose daily activities are affected by the lockdown, as a way of extending social protection to vulnerable sections of the population.

However, the researchers suggest that social assistance programmes, such as direct cash and in-kind transfers to households and waivers of utility fees, could have yielded more favourable outcomes for such households, in particular the wage earners whose earnings have been affected by restrictions.

"The relief measures came into effect when people had already lost their sources of income, and social protection measures were hardly implemented due to logistical challenges, hence amounting to minimal relief," Dr Kansiime said.

The scientists believe the survey results suggest that ongoing and future government responses should focus on structural changes in social security, developing responsive packages to cushion members pushed into poverty by such pandemics.

Such measures, they say, should also build strong financial institutions to support the recovery of businesses in the medium term and ensure the resilience of food supply chains, particularly those making nutrient-dense foods available.

Credit: 
CABI

Biodiversity hypothesis called into question

image: Aquatic organisms - and terrestrial ones - that do best when there is lots of food also do best when there is very little.

Image: 
©DTU/ Erik Selander

Biologists have long considered the origins and continued coexistence of the immense diversity of species found in our environment. How can we explain the fact that no single species predominates? A generally accepted hypothesis is that there are trade-offs, which means that no organism can do best in all conditions. One trade-off that is commonly assumed is that between gleaner organisms --which are able to acquire and consume more food than other species when resources are scarce-- and exploiters, which rapidly consume large quantities of the same resources when they are in abundance. However, when scientists from the University of Geneva (UNIGE) and the Technical University of Denmark (DTU) analysed the consumption of food resources of over 500 terrestrial and aquatic species, they showed that organisms that are efficient when there are low quantities of food, are also best when food resources are abundant. Consequently, biodiversity cannot be explained as a trade-off between gleaners and exploiters. Instead, the idea of risk taking to obtain food needs to be considered, as explained in this PNAS publication.

Dealing with trade-offs is one of the challenges organisms face when they have to gain the energy needed to grow, defend themselves and reproduce. "If there were no trade-offs, the species that is the most effective in all conditions would come out on top," begins Mridul Thomas, senior research and teaching assistant in the Department F.-A. Forel for Environmental and Aquatic Sciences in UNIGE's Faculty of Sciences and the study's second author. "These trade-offs--and variations in environmental conditions--help explain why species are different and why we have diversity. No species can be best in all conditions."

Indeed, there is wide agreement in the scientific community that biodiversity can be explained partly through the gleaner-exploiter trade-off, which arises from the need to invest in both acquiring food and in quickly extracting energy and nutrients from it. Scientists expect organisms living in low-food environments to be gleaners that can quickly search for resources over large areas. Conversely, organisms living in food-rich environments are exploiters that consume resources in abundance and at great speed. Both these strategies can result in success depending on the environmental conditions encountered. And if the food availability changes through time or across space, it can allow competing gleaners and exploiters to co-exist, leading to diversity.

No gleaner-exploiter trade-off in nature

"Although it's taught commonly and is found in text books, there's little experimental evidence for the gleaner-exploiter trade-off," says Mridul. This is exactly the subject that Thomas Kiørboe, professor at the National Institute of Aquatic Resources at DTU--and first author of the study--decided to investigate. In an attempt to provide an answer, Professor Kiørboe has been collecting data found in the scientific literature on the food consumption of hundreds of species, derived from estimates from organisms ranging from single cells to large mammals living both in terrestrial and aquatic environments.

This immense collection of data has made it possible to analyse the speed at which over 500 species acquire and consume food. "For each species, such as a spider, scientists measured how fast it was able to capture and eat food, and they did this when food was abundant and when it was rare. Thanks to this valuable work by many scientists for hundreds of species, we were able to compare this across many organisms," continues Mridul. Curves of the speed of consumption as a function of the abundance of food are derived from this data, making it possible to describe the performance of the organisms in both low and high food conditions. "A negative correlation is expected from the gleaner-exploiter trade-off, but our results show a positive relationship", a clear indication, according to the biologist, that the gleaner-exploiter trade-off does not exist. Kiørboe and Mridul have demonstrated that species that perform well in an environment where energy resources are scarce are also the best in a rich environment.
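Consumption curves of this kind are commonly modelled in ecology as a Holling type II functional response. The following sketch, with invented parameters for two hypothetical species (none of this is the study's data or code), shows what the gleaner-exploiter trade-off would predict:

```python
# Illustrative sketch of a Holling type II functional response:
# f(R) = a*R / (1 + a*h*R), where R is food abundance, a is the search
# (clearance) rate governing performance when food is scarce, and 1/h is
# the maximum ingestion rate governing performance when food is abundant.
# The gleaner-exploiter trade-off predicts a and 1/h are negatively
# correlated across species; the study found a positive relationship.

def consumption_rate(R, a, h):
    return a * R / (1 + a * h * R)

# Two hypothetical species under the assumed trade-off:
gleaner = dict(a=2.0, h=1.0)    # fast searcher, low maximum intake (1/h = 1)
exploiter = dict(a=0.2, h=0.1)  # slow searcher, high maximum intake (1/h = 10)

for R in (0.1, 100.0):          # scarce vs. abundant food
    print(R, consumption_rate(R, **gleaner), consumption_rate(R, **exploiter))
```

Under the assumed trade-off the gleaner out-consumes the exploiter when food is scarce and loses when food is abundant; the positive correlation reported in the study means real species do not separate along these lines.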

Unexplained biodiversity

However, the researchers' interpretation does not call the concept of trade-offs into question. "Without trade-offs, it is very hard to maintain diversity. Our research does not explain biodiversity, but it does overturn an existing theory about precisely why we have biodiversity," says Mridul. Accordingly, there should be another trade-off: "A trade-off about risk-taking to access food is more likely, and would be consistent with our results. For instance, an organism may be better at getting food whether food is scarce or abundant because it takes more risks. Getting more food is generally good because it helps organisms grow and reproduce. But if in searching for food the organism gets eaten itself, it cannot reproduce. So it can sometimes be good to avoid taking these risks even if it means getting less food --which would explain why we see in our study that some species seem very good at getting food and some very bad at it." Whatever this other trade-off is, the Danish-Swiss study fundamentally changes an important idea about why we have biodiversity that is still being taught and has been taken for granted. It follows that our understanding of ecosystems must be revisited, since this knowledge is essential in the face of the environmental upheavals we are witnessing today.

Credit: 
Université de Genève

Soft robots, origami combine for potential way to deliver medical treatments

COLUMBUS, Ohio -- Researchers have found a way to send tiny, soft robots into humans, potentially opening the door for less invasive surgeries and ways to deliver treatments for conditions ranging from colon polyps to stomach cancer to aortic artery blockages.

The researchers from The Ohio State University and the Georgia Institute of Technology detailed their discovery, which makes use of the ancient Japanese practice of origami, in a study published Sept. 14 in the Proceedings of the National Academy of Sciences.

Under this system, doctors would use magnetic fields to steer the soft robot inside the body, bringing medications or treatments to places that need them, said Renee Zhao, corresponding author of the paper and assistant professor of mechanical and aerospace engineering at Ohio State.

"The robot is like a small actuator," Zhao said, "but because we can apply magnetic fields, we can send it into the body without a tether, so it's wireless. That makes it significantly less invasive than our current technologies."

That soft robot is made of magnetic polymer, a soft composite embedded with magnetic particles that can be controlled remotely. Robotic delivery of medical treatment is not a new concept, but most previous designs used traditional robots, made of stiff, hard materials.

The "soft" component of this robot is crucial, Zhao said.

"In biomedical engineering, we want things as small as possible, and we don't want to build things that have motors, controllers, tethers and things like that," she said. "And an advantage of this material is that we don't need any of those things to send it into the body and get it where it needs to go."

The soft origami robot in this case can be used to deliver multiple treatments selectively, based on the independently controlled folding and deploying of the origami units. The origami allows the material to "open" when it reaches the site, unfurling the treatment along with it and applying the treatment to the place in the body that needs it.

This origami-style delivery of medication is also not new, but because previous designs relied on more cumbersome, bulky ways of activating or opening the origami to deliver the medication, those deliveries were often slow. Some did not allow for medication to be delivered to a pinpointed location in the body.

The soft robot, Zhao said, removes some of that bulkiness. The magnetic fields allowed the researchers, in the lab, to control the direction, intensity and speed of the material's folding and deployment.

Researchers conducted this work in a lab, not in the human body. But the technology, they think, could allow doctors to control the robot from outside the body using only magnetic fields.

"In this design, we don't even need any chip, we don't need any electric circuit," she said. "By just applying the external field, the material can respond itself -- it does not need any wired connection."

These findings may have applications beyond delivering medicine, said Glaucio Paulino, a co-author on the paper and professor and Raymond Allen Jones Chair in the Georgia Tech School of Civil and Environmental Engineering.

"We anticipate that the reported magnetic origami system is applicable beyond the bounds of this work, including future origami-inspired robots, morphing mechanisms, biomedical devices and outer space structures," Paulino said.

Credit: 
Ohio State University

Researchers discover cyber vulnerabilities affecting Bluetooth-based medical devices

image: An SUTD-led research team designed and implemented the Greyhound framework, a tool used to discover SweynTooth - a critical set of 11 cyber vulnerabilities.

Image: 
SUTD

Internet-of-Things (IoT) devices, such as smart home locks and medical devices, depend largely on Bluetooth Low Energy (BLE) technology to function and connect with other devices while keeping energy consumption low. As these devices become more prevalent, with increasing levels of connectivity, the need for strengthened IoT security has also become vital.

A research team, led by Assistant Professor Sudipta Chattopadhyay from the Singapore University of Technology and Design (SUTD), with team members from SUTD and the Institute for Infocomm Research (I2R), designed and implemented the Greyhound framework, a tool used to discover SweynTooth - a critical set of 11 cyber vulnerabilities.

Their study was presented at the USENIX Annual Technical Conference (USENIX ATC) on 15 to 17 July 2020 and they have been invited to present at the upcoming Singapore International Cyber Week (SICW) in October 2020.

These security lapses were found to affect devices by causing them to crash, reboot or bypass security features. At least 12 BLE-based devices from eight vendors were affected, spanning a few hundred types of IoT products, including pacemakers, wearable fitness trackers and home security locks.

The SweynTooth code has since been made available to the public, and several IoT product manufacturers have used it to find security issues in their products. In Singapore alone, 32 medical devices were reported to be affected by SweynTooth, and 90% of these device manufacturers have since implemented preventive measures against this set of cyber vulnerabilities.

Regulatory agencies including the Cyber Security Agency and the Health Sciences Authority in Singapore as well as the Department of Homeland Security and the Food and Drug Administration in the United States have reached out to the research team to further understand the impact of these vulnerabilities.

These agencies have also raised public alerts to inform medical device manufacturers, healthcare institutions and end users on the potential security breach and disruptions. The research team continues to keep them updated on their research findings and assessments.

Beyond Bluetooth technology, the research team designed the Greyhound framework using a modular approach so that it could easily be adapted for new wireless protocols. This allowed the team to test it across the diverse set of protocols that IoT devices frequently employ. The automated framework also paves new avenues for testing the security of more complex protocols and IoT devices in next-generation wireless implementations such as 5G and NarrowBand-IoT, which require rigorous and systematic security testing.

"As we are transitioning towards a smart nation, more of such vulnerabilities could appear in the future. We need to start rethinking the device manufacturing design process so that there is limited reliance on communication modules such as Bluetooth to ensure a better and more secure smart nation by design," explained principal investigator Assistant Professor Sudipta from SUTD.

Credit: 
Singapore University of Technology and Design

CHOP researchers find MIS-C associated with myocardial injury

Philadelphia, September 21, 2020-- Using sensitive parameters to assess cardiac function, researchers at Children's Hospital of Philadelphia (CHOP) have found that cardiac involvement in multisystem inflammatory syndrome in children (MIS-C) differs from Kawasaki disease (KD) and is associated with myocardial injury. The findings were published recently in the Journal of the American College of Cardiology.

Thought to be a post-viral hyperinflammatory response related to COVID-19, MIS-C has some clinical overlap with KD, an inflammatory condition that causes rash, fever, and inflammation of the blood vessels in children. However, the two conditions differ in important ways, particularly when it comes to cardiac involvement. By utilizing a cardiac parameter known as strain, which measures subtle changes in cardiac function not detected by conventional echocardiography, the research team was able to show that while the coronary arteries were not typically involved in MIS-C as they are in KD, myocardial injury was more common in MIS-C.

"By using this novel approach for assessing cardiac function, we were able to detect subtle myocardial changes that were not detected by conventional echocardiography," said senior author Anirban Banerjee, MD, FACC, an attending cardiologist with the Cardiac Center at CHOP. "Long-term follow-up is essential to fully understand this new disease, including cardiac MRI studies to evaluate for myocardial scarring. This is especially true when considering the possible lingering effects of myocardial injury and consequent need for caution in sports participation."

In a retrospective study, the researchers compared echocardiographic findings in 28 patients with MIS-C, 20 patients with classic KD, and 20 healthy controls. The research team analyzed echocardiographic parameters, including various measures of strain, during the acute phase in patients with MIS-C and KD and during the subacute phase in the MIS-C patients (approximately 5 days after hospitalization).

While only one case in the MIS-C group presented with coronary artery dilation, cardiac injury and dysfunction were common, occurring in approximately 60% of MIS-C patients. Left ventricular systolic and diastolic function were worse in MIS-C compared to KD, and although systolic dysfunction in MIS-C patients recovered quickly within several days, diastolic dysfunction persisted. In addition to the left ventricle, both the right ventricle and left atrium - sometimes called the "forgotten chambers" of the heart - were significantly affected. Abnormal strain measurements in the latter two chambers correlated strongly with biomarkers of myocardial injury in MIS-C.

"Larger studies will be needed to fully characterize coronary involvement in MIS-C," Banerjee said. "Given the results of this study, we recommend following up with MIS-C patients several times over a one-year period, in a pattern similar to patients with Kawasaki disease, even if they look well outwardly."

Credit: 
Children's Hospital of Philadelphia

Strong markets for cultured meat across meat-reducing Germany and France

For the first time ever, the majority of Germans are limiting their consumption of meat, and many are open to the concept of eating cultured meat, according to a new study.

The research, published in the journal Foods by an international research team from the University of Bath (UK), Université Bourgogne Franche-Comté (France), and Ipsos (Germany), finds there is growing acceptance of non-meat diets both in Germany and France - although strong tradition and culture still hold sway on attitudes in France in particular.

For their investigation, researchers surveyed 1,000 people in each country asking them a series of questions about their current and intended dietary habits, as well as for their thoughts about cultured meat - i.e. meat produced without raising and slaughtering animals. This new method of meat production mirrors the biological process of building muscle but does so under controlled conditions.

Their analysis found that just 45% of German respondents identified as full meat-eaters, with a further 31% now actively following flexitarian or meat-reduced diets. Meat consumption was more common in France, where 69% identified as full meat-eaters with a further 26% following a flexitarian diet.

The research also reveals promising markets for cultured meat in both countries. Although the majority of consumers in France and Germany had still not heard of cultured meat, 44% of French and 58% of German respondents said they would be willing to try it, with 37% of French consumers and 56% of Germans willing to buy it themselves.

The publication highlights Germany as one of the most vegetarian nations in Europe, noting that per capita meat consumption has been trending down for several decades. Now, for the first time, evidence suggests that German consumers who are not deliberately limiting their meat consumption are in the minority. These patterns are mirrored in France, where almost half of meat-eaters intend to reduce animal consumption in the years ahead although attitudes are harder to shift.

The researchers say that the social implications of these findings could be profound. Lead author Christopher Bryant from the Department of Psychology at the University of Bath explained: "We know that the social normality of meat consumption plays a large role in justifying it. Now we are approaching a tipping point where the majority of people are deciding that, primarily for ethical and environmental reasons, we need to move away from eating animals. As eating animals becomes less normal, we will likely see a rise in demand for alternatives like plant-based and cultured meat."

Strikingly, they find that cultured meat acceptance is higher among agricultural and meat workers in both France and Germany, two countries considered the strongest agricultural powers in the European Union. The team behind the study say this indicates farmers may see cultured meat as a way to address the mass demand for affordable meat, enabling them to move away from intensive industrial production systems and return to more traditional systems that are more harmonious with environmental and animal welfare outcomes.

In the US and in Europe, some of the world's biggest meat producers have already backed and partnered with cultured meat innovators, including Cargill; Tyson Foods; PHW, the largest German poultry breeder and processor; and M-Industry, part of the Swiss Migros Group.

The team found some evidence that pro-cultured meat messages which focus on antibiotic resistance and food safety were more persuasive than those that focused on animal welfare or the environment. Consumers also indicated that they would be more likely to consume cultured meat that is not genetically modified.

Study author Nathalie Rolland said, "We can expect to see an increase in interest in novel proteins including cultured meat. First, because we know that increasing familiarity with the concept tends to increase comfort with the idea of eating it. Also, this data was collected before the outbreak of COVID-19, a zoonotic disease which has caused many people to re-examine the role of animals in our food system."

Jens Tuider, International Director of ProVeg International, said: "Antimicrobial resistance is a serious public health issue, caused mainly by the widespread use of antibiotics in conventional animal agriculture.

"Globally, more than 70% of antibiotics are used on animals in intensive farming, dramatically decreasing the efficacy of antibiotics intended for humans. This represents a serious threat to global public health, with a projected death toll from antibiotic-resistant diseases of 10 million per year by 2050. Since cellular agriculture has no need for antibiotics, it could significantly mitigate against this major risk to public health."

The research posits that some of the differences observed between France and Germany might best be explained through the lens of culture and tradition; however, the authors note the role that agricultural lobbies continue to play in France. This includes the French decision in 2018 to ban the use of meat terms to describe vegetable-based products, such as vegetarian sausages or vegetarian steak. The law is ostensibly intended to avoid misleading consumers, though the UK's House of Lords disagreed in 2019, saying that these changes could make things more confusing for consumers, not less.

Whilst this study focused on France and Germany, lead researcher Chris Bryant argues that the findings could have implications elsewhere. He adds: "Europe still has lower rates of vegetarianism compared to other parts of the world. If these surveys were repeated, we might expect to see even higher rates of meat reduction elsewhere.

"The normality of meat-eaters being the majority is reversing as more people move towards plant-based diets. The development of better and better alternatives, including cultured meat, only makes this transition easier."

Credit: 
University of Bath

Scientists propose multifunctional liquid metal nanocapsules

image: LM nanocapsules prepared by LM-initiated ring-open polymerization and their applications.

Image: 
LI Mingjie and LI Xiankai

Liquid metals (LMs) are promising for applications in flexible electronics and biomimetic functional composites. Nanometerization and surface modification of LMs are usually used to improve their substrate affinity and processing properties. In most cases, LM nanodroplets are encapsulated into ultrathin and fragile shells of oxides or amphiphile monolayers.

However, such fragile shells may prevent the nanodroplets from being incorporated homogeneously into various composites through conventional processing methods. Therefore, producing stable and processable LM nanodroplets remains challenging.

In their previous study, Prof. LI Chaoxu and his coworkers from the Institute of Bioenergy and Bioprocess Technology (QIBEBT) of the Chinese Academy of Sciences (CAS) revealed that LM can initiate free radical polymerization of vinyl monomer under ultrasonication.

Recently, this research group has found, for the first time, that ring-opening polymerization can be initiated by sonicating liquid metal in fluidic lactones, and has used this to propose multifunctional liquid metal nanocapsules. "By this in-situ polymerization, LM nanodroplets were encapsulated into polylactone shells with tunable thickness, which could further be dried into solid powder," said Prof. LI.

Besides high chemical stability and dispersibility in organic solvents, the powder of LM capsules combined the exceptional properties of LM droplets and polylactone shells. It could be introduced into thermoplastic composites through liquid casting and thermal-/photo-molding, yielding notch-insensitive tearing behaviour, sintering-induced electrical conductivity and a photo-thermal effect.

This LM-initiated ring-opening polymerization may open a pathway to producing stable and thermally/photo-moldable powders of LM capsules, as well as multifunctional composites applicable in biomedicine, soft electronics and smart robots.

Credit: 
Chinese Academy of Sciences Headquarters

'Front of package' nutrition labels improved nutrition quality

A new study analyzing 16 years of data on tens of thousands of products finds that the adoption of nutrition data on "front of package" (FOP) labels is associated with improved nutritional content of those foods and their competitors.

"We wanted to know whether food companies were responding to increased public interest in healthier food," says Rishika Rishika, co-author of the study and an associate professor of marketing in North Carolina State University's Poole College of Management. "In other words, is the market driving change in the nutrition of food products? And the evidence suggests that this is exactly what's happening."

For this study, the researchers evaluated nutritional data on 44 categories of food products from 1996 through 2011. Altogether, the researchers looked at data on 21,096 products, representing 9,083 brands, covering everything from energy bars to soup.

Specifically, the researchers evaluated whether there was any impact when products adopted the "Facts Up Front" style FOP nutrition labels. Facts Up Front is a voluntary nutrition labeling program adopted by the food industry. Manufacturers participating in the program list the calories, saturated fat, sugar and sodium per serving size of their food products on relatively large FOP labels. The products still carry the mandated nutritional information panels on the back of the packages.

To determine whether the voluntary FOP program had influenced the nutritional content of food products, the researchers looked specifically at two things. For food categories in which at least one product had adopted the FOP labeling, the researchers evaluated differences in the nutritional quality of all products in the category both before and after any products adopted the FOP labels. These differences were also compared with food categories in which no products adopted labeling and that served as control groups.
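This before/after, adopter-versus-control comparison is essentially a difference-in-differences design. A toy sketch of the arithmetic, using invented nutrition scores rather than the study's data:

```python
# Difference-in-differences sketch: compare the change in mean nutrition
# scores of a category in which products adopted FOP labels against a
# control category with no adopters. All numbers are illustrative only.

def mean(xs):
    return sum(xs) / len(xs)

# Hypothetical scores (higher = better) before/after FOP adoption.
fop_before = [52, 48, 50, 55]      # category with at least one FOP adopter
fop_after = [58, 56, 57, 61]
control_before = [50, 49, 53, 51]  # category with no FOP adopters
control_after = [51, 50, 54, 52]

fop_change = mean(fop_after) - mean(fop_before)
control_change = mean(control_after) - mean(control_before)

# The excess change in the adopting category, beyond the control-category
# trend, is the part attributable to FOP labeling under this design.
did_estimate = fop_change - control_change
print(f"FOP-category change:     {fop_change:+.2f}")
print(f"Control-category change: {control_change:+.2f}")
print(f"DiD estimate:            {did_estimate:+.2f}")
```

Subtracting the control-category change nets out industry-wide trends that would have occurred with or without the labels.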

The researchers calculated a product's nutritional content using the Nutrient Profiling model, which includes a host of nutrients, such as sugar, fat, sodium, protein and fiber.
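As an illustration only, a nutrient-profiling score of this kind can be sketched as a point system in which "negative" nutrients add points and "positive" nutrients subtract them, so a lower score is better. The thresholds below are invented for the example and are not the model the study used:

```python
# Simplified, invented nutrient-profile score: sugar, saturated fat and
# sodium add points; protein and fiber subtract them. Lower = better.
# Thresholds are illustrative only, not those of any real profiling model.

def profile_score(per_100g):
    score = 0
    score += min(int(per_100g["sugar_g"] / 4.5), 10)
    score += min(int(per_100g["sat_fat_g"] / 1.0), 10)
    score += min(int(per_100g["sodium_mg"] / 90), 10)
    score -= min(int(per_100g["protein_g"] / 1.6), 5)
    score -= min(int(per_100g["fiber_g"] / 0.9), 5)
    return score

snack = {"sugar_g": 30, "sat_fat_g": 8, "sodium_mg": 400,
         "protein_g": 5, "fiber_g": 2}
reformulated = {"sugar_g": 22, "sat_fat_g": 6, "sodium_mg": 300,
                "protein_g": 6, "fiber_g": 4}

# Reformulating (less sugar/fat/sodium, more fiber) lowers the score.
print(profile_score(snack), profile_score(reformulated))
```

A single composite score like this is what lets researchers compare nutritional quality across very different product categories, from energy bars to soup.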

The results showed a clear association between FOP labeling and changes in the nutritional content of food products. And there were five factors that were associated with the presence of FOP labels having a greater impact on nutrition:

Premium brands improved nutritional quality more than non-premium brands in the same category;

Brands that had narrower product lines, meaning they produced fewer products than their peers, improved nutritional quality more;

Products in categories that are broadly unhealthy, such as snack foods, showed a more pronounced response;

Foods in "more competitive" categories, meaning those in which there were many competitors at different price points, showed a more pronounced response; and

Products that had themselves adopted FOP labeling showed greater improvement in nutritional quality.

The researchers also found that there were pronounced changes in the content of nutrients that were singled out by the "Facts Up Front" FOP program.

Across all of the food categories in which at least some products adopted the FOP labels, there was a 12.5% reduction in calories; 12.97% reduction in saturated fat; 12.62% reduction in sugar; and 3.74% reduction in sodium.

"We had hypothesized that when nutritional information is clearly marked on the front of the package, that consumers would be more likely to consider it when deciding what to buy," Rishika says. "This would, in turn, cause competitive pressure on other brands in that category to innovate and improve the nutritional quality of their products.

"The fact that the effect of FOP labeling was most pronounced for the nutritional variables on the FOP labels supports our theory," Rishika says. "And the fact that the effect was stronger for brands that adopted FOP labeling also supports the hypothesis."

The researchers had a few takeaway messages.

"For consumers, we found that the presence of a Facts Up Front FOP label on a package generally meant that the product had a better nutritional profile than competing products that didn't have an FOP label," Rishika says.

More broadly, the findings suggest that voluntary, highly visible nutritional labeling can be an effective tool for encouraging change on an industry level.

"However, it remains unclear which aspect of the program is more important," Rishika says. "Is the fact that the program is voluntary more important, since it helps consumers identify brands that are choosing to share nutritional information on the front of package? Or is the fact that the FOP labeling is prominent more important, simply because the information is more clearly noticeable? Those are questions for future research."

Credit: 
North Carolina State University

Unlocking the secrets of plant genomes in high resolution

image: The new software tool makes it possible to determine the genome of species such as potatoes with a high degree of accuracy.

Image: 
HHU / Gunnar Klau

The genomes of all higher life forms are stored in the cell nucleus on chromosomes. Chromosomes are composed of strands of the DNA molecule. The genetic information itself is encoded in a sequence of adjacent base pairs of the molecules adenine (A), cytosine (C), guanine (G) and thymine (T).

Different species have different numbers of chromosomes; for example, humans have 23, while potatoes have 12 and wheat has 7. In addition, there are different copies or 'haplotypes' of the chromosomes. Humans have two copies, one from the mother and one from the father, while potatoes have four and wheat has six. Species with two copies are referred to as 'diploid', whereas those with more than two are 'polyploid'. The copies are almost identical, with 'almost' being the operative word. It is the differences between them that determine the variability of the organisms within a population.

In order to unlock the genetic information, the researchers tackled something akin to a large jigsaw puzzle: they took a large number of cells, divided the cells' genomes into lots of small parts - called 'reads' - and sequenced the information contained in those parts. This was necessary because the technology currently available can only process small sections of DNA.
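The "jigsaw" idea can be illustrated with a toy simulation (read length, coverage, and the genome string are made up for illustration): the genome is only ever observed as many short, overlapping fragments, never as one long sequence.

```python
import random

# Toy illustration of shotgun sequencing: a genome is observed only as
# many short, overlapping fragments ("reads"). Read length and coverage
# here are arbitrary illustration values.

def shotgun_reads(genome, read_len=8, coverage=5, seed=42):
    rng = random.Random(seed)  # fixed seed for reproducibility
    n_reads = coverage * len(genome) // read_len
    starts = (rng.randrange(0, len(genome) - read_len + 1) for _ in range(n_reads))
    return [genome[s:s + read_len] for s in starts]

genome = "ACGTACGGTTCAGGCTAACGT"
reads = shotgun_reads(genome)
print(len(reads), reads[:3])
```

In a real experiment the original `genome` string is of course unknown; reconstructing it (and its copies) from the overlapping reads is exactly the puzzle the article describes.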

The result was a huge volume of data - billions of reads, amounting to several hundred gigabytes. These reads comprise sequences of differing lengths made up of the letters A, C, G and T. The next task for the bioinformatics researchers was to determine where each read belongs: first assigning it to a position on a chromosome (a process known as 'mapping'), and then to the right copy of that chromosome (a stage known as 'phasing'). The task is made more difficult by sequencing errors.
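As a toy illustration of the mapping step only: each read is located within a reference sequence. Real mappers use sophisticated indexes and tolerate sequencing errors; this exact-substring sketch merely shows the concept.

```python
# Minimal sketch of "mapping": locating each read within a reference.
# Real mapping tools use indexes and allow mismatches caused by
# sequencing errors; this exact-match version only illustrates the idea.

def map_reads(reference, reads):
    # str.find returns the first matching position, or -1 if unmapped.
    return {read: reference.find(read) for read in reads}

reference = "ACGTACGGTTCAGGCTAACGT"
positions = map_reads(reference, ["ACGGTT", "GGCTAA", "TTTTTT"])
print(positions)
```

Note that mapping alone cannot distinguish between the near-identical chromosome copies; that is precisely the harder phasing problem the article turns to next.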

There are good, efficient tools available for mapping. However, the bioinformatics tools needed for phasing are still in their infancy. This was precisely where the team of bioinformatics researchers from HHU focused their attention. In a joint project funded by the German Research Foundation and managed by Prof. Dr. Gunnar Klau (Algorithmic Bioinformatics working group) and Prof. Dr. Tobias Marschall (Institute of Medical Biometry and Bioinformatics, University Hospital Düsseldorf) in collaboration with Prof. Dr. Björn Usadel (Institute of Biological Data Science), they developed a software tool named 'WhatsHap polyphase' and tested the tool successfully using model data as well as the potato genome.

This new tool solves the problem using a two-phase process. The first phase involves clustering the reads, i.e. splitting them into groups. Reads in one group probably come from one haplotype or a region of identical haplotypes. The second phase involves 'threading' the haplotypes through the clusters. During threading, the reads are assigned to the haplotypes as evenly as possible, with as little jumping back and forth between clusters as possible.

The new tool has been added to the main 'WhatsHap' package, which is freely available. The package has already been used to carry out the phasing successfully for diploid chromosome sets, e.g. for humans. This new addition from the team based in Düsseldorf means that phasing is now also possible for polyploid organisms. Prof. Klau said: "Our new technology allows for plant genomes to be phased in high resolution and with a low margin of error".

Credit: 
Heinrich-Heine University Duesseldorf