Tech

German researchers compile world's largest inventory of known plant species

image: Leipzig is host to the oldest botanical garden in Germany. On an area of only three hectares, around 6500 of the 350,000 plant species worldwide grow here.

Image: 
Swen Reichhold

Leipzig could mean for the future of plant taxonomy what Greenwich meant for world time until 1972: it could become the reference city for correct scientific plant names. In an outstanding feat of research, the curator of the Botanical Garden of Leipzig University, Dr Martin Freiberg, and colleagues from iDiv and UL have compiled what is now the largest and most complete list of scientific names of all known plant species in the world. The Leipzig Catalogue of Vascular Plants (LCVP) substantially updates and expands existing knowledge on the naming of plant species, and could replace The Plant List (TPL) - a catalogue created by the Royal Botanic Gardens, Kew, in London, which until now has been the most important reference source for plant researchers.

"In my daily work at the Botanical Garden, I regularly come across species names that are not clear, where existing reference lists have gaps," said Freiberg. "This always means additional research, which keeps you from doing your actual work and above all limits the reliability of research findings. I wanted to eliminate this obstacle as well as possible."

World's most comprehensive and reliable catalogue of plant names

With 1,315,562 scientific names, the LCVP is the largest list of its kind in the world describing vascular plants. Freiberg compiled information from the accessible relevant databases, harmonised it and standardised the listed names according to the best available criteria. Drawing on 4,500 further studies, he resolved remaining discrepancies such as differing spellings and synonyms. He also added thousands of new species to the existing lists - species identified in recent years, mainly thanks to rapid advances in molecular genetic analysis techniques.
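The harmonisation step described above can be sketched in a few lines. This is a hypothetical illustration of how a reference list like the LCVP might be used to resolve a submitted name to its accepted form; the names and the synonym table below are illustrative examples, not entries taken from the actual catalogue.

```python
from typing import Optional

# Accepted names and a synonym table (illustrative examples only).
ACCEPTED = {"Symphyotrichum novae-angliae", "Quercus robur"}
SYNONYMS = {
    # older name -> currently accepted name (a well-known example synonymy)
    "Aster novae-angliae": "Symphyotrichum novae-angliae",
}

def resolve(name: str) -> Optional[str]:
    """Return the accepted name for `name`, or None if it is unknown."""
    name = " ".join(name.split())   # collapse stray whitespace
    if name in ACCEPTED:
        return name
    return SYNONYMS.get(name)       # follow a synonym, if one is listed
```

A real reference list must also handle author citations, spelling variants and subspecific ranks; this sketch shows only the synonym lookup that lets two datasets refer to the same species by one name.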

The LCVP now comprises 351,180 vascular plant species and 6160 natural hybrids across 13,460 genera, 564 families and 84 orders. It also lists all synonyms and provides further taxonomic details. This means that it contains over 70,000 more species and subspecies than the most important reference work to date, TPL. The latter has not been updated since 2013, making it an increasingly outdated tool for use in research, according to Freiberg.

"The catalogue will help considerably in ensuring that researchers all over the world refer to the same species when they use a name," says Freiberg. Originally, he had intended his data set for internal use in Leipzig. "But then many colleagues from other botanical gardens in Germany urged me to make the work available to everyone."

LCVP vastly expands global knowledge of plant diversity

"Almost every field in plant research depends on reliably naming species," says Dr Marten Winter of iDiv, adding: "Modern science often means combining data sets from different sources. We need to know exactly which species people refer to, so as not to compare apples and oranges or to erroneously lump different species." Using the LCVP as a reference will now offer researchers a much higher degree of certainty and reduce confusion. And this will also increase the reliability of research results, adds Winter.

"Working alone, Martin Freiberg has achieved something truly incredible here," says the director of the Botanical Garden and co-author Prof Christian Wirth (UL, iDiv). "This work has been a mammoth task, and with the LCVP he has rendered an invaluable service to plant research worldwide. I am also pleased that our colleagues from iDiv, with their expertise in biodiversity informatics, were able to make a significant contribution to this work."

Credit: 
German Centre for Integrative Biodiversity Research (iDiv) Halle-Jena-Leipzig

Microswimmers move like moths to the light

image: Janus particles under the electron microscope. The titanium dioxide microswimmers are barely larger than one micrometer.

Image: 
Copyright: Simmchen Group

TU Dresden Freigeist fellow Dr Juliane Simmchen and her multidisciplinary junior research group are investigating the motion of synthetic microswimmers in liquids. Her goal is to make these inanimate microparticles move in a chosen direction of their own accord so that, in future, they can be used in sensor technology or biological cleaning.
"Actually, it's a bit like playing computer games in the laboratory," is how the chemist describes her extraordinary research work in an interview with the Volkswagen Foundation.

The Simmchen group works with so-called Janus particles. These consist of a titanium dioxide body with two differently coated sides: one side carries a catalytically active layer of nickel and gold, while the other remains untreated. Titanium dioxide is used as a whitening agent, for example in wall paint, but it also reacts with light. As a result, Janus particles are photocatalytic: as soon as light hits them, chemical reactions occur that set off a movement.

The group has now observed and analyzed an extremely unusual phenomenon in the motion of Janus particles: as soon as the particles leave an illuminated zone in the microscope, they turn around by themselves and swim back - a behavior that is actually only known from microorganisms. But how can such complex behavior be triggered in synthetic microswimmers?

First author Lukas Niese and Dr Simmchen were able to show that as long as the particles are active in the light, their swimming direction is stabilized by a combination of physicochemical effects. As soon as the particles are no longer exposed to light, there is no energy conversion and the direction of movement is no longer stable. "In this case," explains Lukas Niese, "the natural thermal movement (Brownian Motion) sets in. This causes the particles to virtually flip, and then they swim back into the exposed area."
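This mechanism can be caricatured with a one-dimensional active-Brownian-particle sketch. It is an illustration of the general idea, not the group's model, and all parameter values are invented: propulsion is switched on only under illumination, the heading is held stable in the light, and rotational Brownian motion randomizes the heading in the dark until the particle eventually swims back.

```python
import math
import random

random.seed(1)

V, D_T, D_ROT, DT = 1.0, 0.1, 0.5, 0.01   # illustrative parameters
LIT_EDGE = 5.0                             # region x < LIT_EDGE is illuminated

x, theta = 0.0, 0.0        # position and heading; starts swimming toward +x
exited = returned = False

for _ in range(200_000):
    lit = x < LIT_EDGE
    v = V if lit else 0.0                  # propulsion only under illumination
    # translational Brownian motion acts everywhere
    x += v * math.cos(theta) * DT + math.sqrt(2 * D_T * DT) * random.gauss(0, 1)
    if not lit:
        # no photocatalytic energy conversion in the dark: the heading is no
        # longer stabilized, and rotational Brownian motion randomizes it
        theta += math.sqrt(2 * D_ROT * DT) * random.gauss(0, 1)
    if x >= LIT_EDGE:
        exited = True
    elif exited and x < LIT_EDGE - 1.0:
        returned = True                    # swam back well into the lit zone
```

Over a long enough run the particle leaves the lit zone, its heading flips by chance in the dark, and it propels itself back once re-illuminated - the "moth to the light" behavior described above, emerging from nothing more than switched propulsion plus Brownian motion.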

"The fact that such simple effects as the Brownian Motion can lead to such complex behavior was quite astonishing and impressive, especially in terms of the evolution and the development of abilities. We could make use of this property for the targeted control of microrobots. Applications are conceivable in which the particles filter and remove pollutants from liquids or transport medicine through the body, and perhaps even transport information," says Dr Simmchen, explaining the significance of the discovery.

Credit: 
Technische Universität Dresden

Insulators in Alberta at higher risk of chest infections, COPD: study

Construction workers in Alberta who work with hazardous insulation materials are much more likely to be affected by repeated chest infections and chronic obstructive pulmonary disease (COPD), according to new research published in the International Journal of Environmental Research and Public Health.

The study followed 990 insulators over six years. Participants underwent regular pulmonary function tests and chest radiography throughout the study. Researchers found that 46 per cent of the workers had one or more chest infections over a three-year time span, and that 16 per cent of insulators who were exposed to asbestos were diagnosed with COPD--a disease that causes obstructed airflow from the lungs.

"In the past, physicians have tried to advocate for compensation benefits to insulators who were declined because of a background of cigarette smoking," said Paige Lacy, professor of medicine at the University of Alberta's Faculty of Medicine & Dentistry and director of research at the Alberta Respiratory Centre. "This study shows that incidence of COPD and recurrent chest infections is independent of cigarette smoking and demonstrates that hazardous materials really are having an effect on the health of insulators."

Nearly all insulation materials--including asbestos, carbon fibres, calcium silicate, fibreglass and refractory ceramic fibres--with the exception of aerogels and mineral fibres, were associated with chest infections. COPD was associated only with asbestos, a commonly used construction material in Canada until it was banned outright in 2018.

The findings of the study are already being used by the Workers' Compensation Board to assess insulators who are potentially exposed to hazardous materials in the course of their work, said Lacy.

"Not all of them are in safe working environments. We're trying to advocate to make their environment safer, to reduce their exposure to these hazardous materials and to make life better for Albertans who are working in the construction sector."

The research team believes far more can be done to address hazardous working conditions for insulators in Canada. They advocate for greater use of personal protective equipment such as respirators and hazmat suits on worksites, increased worksite monitoring, regular health checkups for workers and elimination of hazardous insulation materials in favour of safer ones.

"A large problem is that workers are not actually informed about potential health risks of some of the materials they're using," said Subhabrata Moitra, a post-doctoral fellow at the U of A and lead author of the study. "There really need to be stricter rules for utilizing less hazardous materials when they're available."

The team is now working on a follow-up study examining the same group of workers to determine whether their lung health remains the same or worsens over time.

"People assume that in Canada, we don't have the same kinds of workplace exposures to hazardous materials," said Lacy. "We think it happens somewhere else, like India or China, because they handle very large quantities of raw material in their work, especially because of lack of safety policies. But we're finding evidence that within Canada, we're getting people exposed to these hazardous construction materials at very high levels, and this is a threat to their health."

Credit: 
University of Alberta Faculty of Medicine & Dentistry

Light confinement in a 3D space

image: (a) 3D spiral waveguide. (b) Suspended air-bridge waveguide; inset shows the input and output coupling sections. (c) 30Gb/s NRZ and (d) 56Gb/s PAM4 eye diagrams of the 3D printed waveguide output.

Image: 
SUTD

Emerging services such as data-center cloud interconnection, ultra-bandwidth video, and 5G mobile services are driving the rapid development of photonic integrated circuits (PICs), which can meet the growing demands that internet traffic places on communication systems.

However, today's PICs are largely planar structures, able to guide light only in a single plane. This planarity arises from the traditional top-down fabrication processes.

Multiphoton lithography is a new and promising 3D printing technology that allows 3D objects to be fabricated far more easily than with the conventional cleanroom-type fabrication methods used in electronics and optoelectronics.

This technique frees PIC fabrication from the restriction of top-down exposure and unlocks the functions made possible by the third dimension. Leveraging concepts of additive manufacturing, 3D multiphoton lithography uses a femtosecond light source to initiate two-photon polymerization at a precisely focused location in the material. This technique was used to realize the high-resolution 3D photonic structures.

Researchers at the Singapore University of Technology and Design (SUTD) have demonstrated high-resolution 3D waveguides which transcend the restrictions of light confinement in a single plane. In the paper published in Advanced Optical Materials, Dr Gao Hongwei, Associate Professor Dawn Tan and their colleagues at the Photonics Devices and Systems Group demonstrated high-resolution 3D waveguides which guide light in a spiral and air-bridge configuration (refer to SEM images below).

Alongside these novel devices, they also demonstrated very low-loss 3D waveguide couplers, with 1.6dB fiber-waveguide coupling losses and a 3dB bandwidth exceeding 60nm. This contrasts with current industry standards, which require very labor-intensive packaging to reach losses of around 1dB. The research team achieved their low losses without any post-processing or post-fabrication packaging. The high-resolution fabrication also yielded ring resonators with sub-micron feature sizes.

"The fabricated photonic devices are an innovative advancement in the domain of photonic integrated circuits. Importantly, we were also able to demonstrate error-free 30Gb/s NRZ and 56Gb/s PAM4 data transmission through these waveguides. This is important because these high-speed testing formats and rates are in alignment with those used in commercial direct-detection transceiver products today," explained principal investigator Associate Professor Tan who heads the photonics devices and systems group at SUTD.

Indeed, the team measured only small power penalties of 0.7 dB for NRZ (bit error rate (BER) = 10⁻¹²) and 1.5 dB for PAM4 (BER = 10⁻⁶) from the photonic devices. These results successfully demonstrate high-speed, error-free optical transmission through the 3D-fabricated waveguides, and showcase the devices' suitability as low-loss waveguides and optical interconnects.
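For readers unfamiliar with the metric, the power penalty is the extra received optical power, in dB, that a link needs relative to a reference in order to reach the same bit error rate. A quick sketch of the arithmetic (the numeric values are illustrative, not taken from the study):

```python
import math

def power_penalty_db(p_ref_mw: float, p_dut_mw: float) -> float:
    """Extra received power (dB) the device under test needs, relative to
    a reference link, to reach the same bit error rate."""
    return 10 * math.log10(p_dut_mw / p_ref_mw)

# A 0.7 dB penalty corresponds to needing roughly 17% more received power:
extra = 10 ** (0.7 / 10)   # ~1.17x
```

Small penalties like those reported here mean the 3D waveguides add very little impairment compared with a back-to-back reference link.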

"Importantly, the 3D quality of these waveguides allows us to exceed the limitations of traditional planar structures. In this way, it is possible to achieve far higher density PICs. The high resolution, sub-micron feature sizes are also promising, especially to achieve advanced functions such as spectral filtering, resonator structures and metasurfaces," said Dr Gao, the first author of the paper and postdoctoral researcher from SUTD.

"This work demonstrates the potential of additive manufacturing in making advanced photonic devices with superior 3D designs in high resolution," added co-author Associate Professor Low Hong Yee from SUTD.

In the future, the capability to realize high resolution 3D photonic structures may create even more advancements in both form and function in photonics, including advanced optical signal processing, imaging techniques and spectroscopic systems.

Credit: 
Singapore University of Technology and Design

Surrey's new hybrid X-ray detector goes toe-to-toe with state-of-the-art rivals

A new hybrid X-ray detector developed by the University of Surrey outperforms commercial devices - and could lead to more accurate cancer therapy.

In a study published in the journal Advanced Functional Materials, researchers from Surrey's Advanced Technology Institute (ATI) demonstrate a new hybrid X-ray detector architecture whose sensitivity is slightly higher than that of the detectors typically used for radiotherapy.

The authors also show that their new architecture brings several new benefits, including ultra-low dark currents - the lowest reported for such detectors - that meet industry standards. The device also has fast response characteristics that compete with commercial X-ray semiconductor detectors based on silicon and selenium.

Prabodhi Nanayakkara, the lead scientist of the study and Ph.D. student at the University of Surrey, said: "Our hybrid detector has shown promising results - chief of which is its ability to be more accurate than current X-ray detectors. We hope that our technology will lead to improved patient survival rates and ultimately to a healthier society."

Professor Ravi Silva, Director of ATI at the University of Surrey, said: "Technologies with unique capability such as this only appear once in a lifetime -- with its plethora of applications that range from low dose mammography to high-speed border security to non-destructive testing over large areas using portable wireless technology.

"We are proud of this cutting-edge breakthrough and look forward to further developing the technology via our university spin-out vehicle, SilverRay Ltd."

Credit: 
University of Surrey

Deadly snake bites: Potential antivenom discovered

Amputations, deformed bones and disfigured skin. At worst, death. These are the potential consequences of a venomous snake bite.

For millions of people living in low-income countries, deadly snake bites are an everyday occurrence, and the nearest health clinic or hospital can easily be hundreds of kilometres away. Add to this that the roads in these parts of the world are often impassable, that many people do not own a vehicle, and that the price of antivenom can run to several hundred dollars (a fieldworker makes $1-2 a day). The World Health Organization (WHO) has therefore officially classified antivenoms for venomous snake bites as being in short supply.

A project headed by Associate Professor and Project Manager Brian Lohse of the Department of Drug Design and Pharmacology at the University of Copenhagen has set out to change that. Brian Lohse and collaborators have just published a study showing that the concept behind their alternative antivenom strategy appears to work on, for example, cobra venom. Furthermore, through Serpentides, a UCPH start-up company, Brian Lohse and COPA have filed a national patent on a peptide that binds and neutralises a particular type of toxin found in the venom of 75 per cent of all venomous snakes.

'We have been working on an alternative type of antivenom that is much cheaper than traditional, antibody-based antivenom. If it becomes a future product, it will fit in your pocket, and it can be used by anyone, anywhere. The idea is that it can be injected, using an automatic injection unit precisely like the ones used by diabetes patients, directly into the muscle or a fold of the skin at the site of the bite', Brian Lohse explains.

WHO estimates that more than 400,000 people each year suffer from serious consequences of a snake bite and 140,000 die. Among other things, this is due to the fact that antivenom is expensive and hard to gain access to for the people who need it.

Today, the only treatment available for snake bites is antibody-based antivenom, and it saves many lives each year. But the antibodies are produced in live animals, horses for example, and this process can take up to 18 months. And as it requires several hundred venomous snakes and several hundred horses to produce enough antibodies to meet the demand in a region, it is a slow and cost-intensive process. On top of this, it is hazardous for the animal keepers, who are often bitten themselves.

Once the antibody-based antivenom has been produced, it must be injected into the veins of the person who has been envenomed. This requires healthcare training, and when the nearest clinic is located far away from the place of the bite, things are likely to end in disaster, Brian Lohse explains.

'A lot of people die before they reach a place of treatment, because the snake venom is free to spread in the body for several hours or even days. Add to this the many potential side effects of the treatment and the fact that high-quality products with hardly any side effects can cost up to 2,000 dollars per dose. This creates a destructive market of poor-quality counterfeits, sold as cheap antivenom medicines. These poor-quality products can cause an allergic shock that may kill the patient', he explains.

Traditional antivenom treatment is further limited by the fact that antivenom for an Indian cobra, for instance, does not necessarily work on its African cousin. Unlike antibodies for antivenom, the Serpentides version takes only one day to synthesize and can be produced in a standard chemistry laboratory.

'Right now we are testing the stability of the active substances in our antivenom, and the tests are showing good results. Stability is important if we want people to be able to carry the product in their pocket, but also if we want to avoid the need for cooling', Brian Lohse explains.

'The fact that our potential antivenom can be used right away (in the jungle or bush) buys the patient life-saving time. When you are dealing with snake venom, it is important to prevent the venom from spreading further. The faster you are able to neutralise it, the better the patient's chances of survival and of minimising sequelae. Nevertheless, a snake bite is a serious matter, and the patient should always go to the nearest hospital, even after having used our future Serpentides antivenom, whose main objective is to limit the spread of venom from the muscles into the veins', he says.

Brian Lohse says that he and his colleagues from the Faculty of Health and Medical Sciences (the Olsen Lab and the Pless Lab), together with a team from the Technical University of Denmark (the Dufva Lab) and collaborators from the University of Münster (the Kümmel Lab), have just completed a proof-of-concept in vitro study, which showed that these peptides can inhibit cobra venom, a fast-acting neurotoxin.

'Publications and patents are the easy part; we got that in the bag. Now comes the real challenges', says Brian Lohse.

Credit: 
University of Copenhagen - The Faculty of Health and Medical Sciences

In temperate trees, climate-driven increase in carbon capture causes autumn leaves to fall sooner

For decades, scientists have expected that the shedding of leaves from temperate trees will get later and later under ongoing climate change. And early observations supported that idea, as warming caused leaves to stay on trees later over recent decades, driving increased growing-season length that could help to slow the rate of climate change. However, a large-scale study of European trees now suggests that this trend is beginning to change and that, in fact, tree leaves may start to fall earlier as the productivity of those trees increases. The results build on growing evidence that plant growth is limited by the ability of tree tissues to use and store carbon.

While changes in the growing-season lengths of temperate trees greatly affect the global carbon balance, future growing-season trajectories remain highly uncertain because the environmental drivers of autumn leaf senescence are poorly understood. Autumn leaf-shedding at the end of the growing season in temperate regions is an adaptation to stressors such as freezing temperatures. A common related assumption is that alleviating some of these stressors - as a warmer climate could - would allow leaves to persist longer and fix more atmospheric carbon by photosynthesis. However, the role of photosynthesis in governing the timing of leaf senescence has not been widely tested in trees.

To test this, Deborah Zani and colleagues used long-term observations of dominant Central European tree species from 1948 to 2015, together with experiments designed to modify carbon uptake by trees, to evaluate the impacts on senescence. Collectively, their data show that increased growing-season productivity in spring and summer due to elevated carbon dioxide, temperature, or light levels can lead to earlier - not later - leaf senescence. This is likely because roots and wood cease to use or store leaf-captured carbon at some point, making leaves costly to keep.
The authors used their observations to build a model to improve autumn senescence prediction under a business-as-usual climate scenario. It forecasts the possibility of slight advances, not delays, in autumn leaf-dropping dates over the rest of the century. The results "substantially lower our expectations of the extent to which longer growing seasons will increase seasonal carbon uptake in forests," they write, though the universality of this pattern in other forest types remains unknown. They note an important next avenue of research is implementing such growing-season length constraints in Earth system and vegetation models, which currently do not consider these dynamics when predicting seasonal carbon dioxide uptake of plants. A related Perspective discusses the results in more detail.
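The sink-limitation idea can be caricatured in a few lines. This is an invented toy, not the authors' model, and every number in it is made up: if leaves are shed once cumulative seasonal uptake fills a fixed carbon sink, then higher daily productivity necessarily moves the senescence date earlier.

```python
def senescence_doy(daily_uptake: float, start_doy: int = 120,
                   sink_capacity: float = 600.0) -> int:
    """Day of year at which cumulative uptake hits the sink limit
    (toy model: constant daily uptake, fixed sink capacity)."""
    total, doy = 0.0, start_doy
    while total < sink_capacity and doy < 365:
        total += daily_uptake
        doy += 1
    return doy

# Higher productivity (e.g., warmer, more CO2) -> earlier leaf fall:
late = senescence_doy(4.0)    # lower daily productivity -> later senescence
early = senescence_doy(5.0)   # higher daily productivity -> earlier senescence
```

The real model in the paper is fitted to decades of observations; the toy only shows why a fixed sink capacity reverses the naive "warmer means later" expectation.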

Credit: 
American Association for the Advancement of Science (AAAS)

A multidisciplinary policy design to protect consumers from AI collusion

Legal scholars, computer scientists and economists must work together to prevent unlawful price-surging behaviors from artificial intelligence (AI) algorithms used by rivals in a competitive market, argue Emilio Calvano and colleagues in this Policy Forum. Whether this algorithmic collusion - when prices are unlawfully raised by profit-hungry competitors who agree behind closed doors to deceive the market - is purposeful or a programming oversight, it is nonetheless as dangerous to the consumer as human-directed collusion; thus, there must be policies in place to hold firms accountable for collusive behaviors in their pricing algorithms, the authors say. Experimental and empirical evidence has shown how easily algorithms can adopt collusive behaviors; for example, when tasked to maximize profit, an AI will autonomously set out to learn all possible pricing rules of conduct - including collusion - to reach this goal without human intervention. Current policies are not equipped to keep pace with AI's capability to adopt unlawful pricing rules. Cases of human-made collusion can be investigated through evidence of surreptitious communications between competitors suggesting they agreed not to compete. By contrast, such trackable evidence is not as apparent in cases of algorithmic collusion, and AI can even evolve beyond established economic theories and studies of human collusion. Therefore, the authors propose a three-step method for investigating pricing algorithms for collusion in controlled environments: testing which AI pricing rules can lead to collusion in the laboratory; applying an auditing exercise to uncover the collusive properties that produce high prices; and, finally, developing constraints on the learning algorithm to prevent the AI from evolving toward collusion.
Once these methods are implemented and completed, policymakers can consider banning specific pricing algorithms and holding firms accountable for AI pricing rules that lead to collusive behavior. "There are several obstacles down the road, including the difficulty of making a collusive property test operational, the lack of transparency and interpretability of algorithms, and courts' willingness and ability to incorporate technical material of this nature," the authors note.
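A minimal version of the kind of laboratory test the authors propose might pit two independent Q-learning sellers against each other in a repeated, simplified Bertrand market and then audit the prices their learned rules settle on. Everything below (the demand model, price grid and hyperparameters) is an invented toy, not the authors' protocol.

```python
import random

random.seed(0)

PRICES = [1, 2, 3, 4]               # 1 ~ competitive price, 4 ~ monopoly price
ALPHA, GAMMA, EPS = 0.1, 0.9, 0.1   # learning rate, discount, exploration

def profits(p1, p2):
    """Simplified Bertrand demand: the cheaper seller takes the whole unit
    market (cost = 0); a tie splits it."""
    if p1 < p2:
        return float(p1), 0.0
    if p2 < p1:
        return 0.0, float(p2)
    return p1 / 2.0, p2 / 2.0

# Each seller conditions on the rival's last price: Q[state][action].
Q1 = {s: [0.0] * len(PRICES) for s in PRICES}
Q2 = {s: [0.0] * len(PRICES) for s in PRICES}

def choose(Q, s):
    """Epsilon-greedy action selection."""
    if random.random() < EPS:
        return random.randrange(len(PRICES))
    return max(range(len(PRICES)), key=lambda a: Q[s][a])

s1 = s2 = PRICES[0]
for _ in range(50_000):
    a1, a2 = choose(Q1, s1), choose(Q2, s2)
    p1, p2 = PRICES[a1], PRICES[a2]
    r1, r2 = profits(p1, p2)
    # standard one-step Q-learning update; next state is the rival's price
    Q1[s1][a1] += ALPHA * (r1 + GAMMA * max(Q1[p2]) - Q1[s1][a1])
    Q2[s2][a2] += ALPHA * (r2 + GAMMA * max(Q2[p1]) - Q2[s2][a2])
    s1, s2 = p2, p1

# Audit step: inspect the prices the learned (greedy) rules settle on.
greedy1 = PRICES[max(range(len(PRICES)), key=lambda a: Q1[s1][a])]
greedy2 = PRICES[max(range(len(PRICES)), key=lambda a: Q2[s2][a])]
```

Whether such agents end up at supracompetitive prices depends on the demand model and hyperparameters; the point of the sketch is the audit loop - train the pricing rules in a controlled environment, then examine what they converge to - rather than any particular outcome.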

Credit: 
American Association for the Advancement of Science (AAAS)

A route for avoiding defects during additive manufacturing

Laser powder bed fusion is a dominant additive manufacturing technology that has yet to reach its potential. The problem facing industry is that tiny bubbles or pores sometimes form during the printing process, and these pores create weak spots in finished products.

When a slow-speed, high-power laser is melting metal powder during the 3D printing of a part, a keyhole-shaped cavity in the melt pool can result. Pores, i.e. defects, form at the bottom of the keyhole. New research published in Science reveals how the pores are generated and become defects trapped in solidifying metal.

"The real practical value of this research is that we can be precise about controlling the machines to avoid this problem," says Anthony D. Rollett, a professor of materials science and engineering in Carnegie Mellon College of Engineering and a lead co-author of the paper, "Critical instability at moving keyhole tip generates porosity in laser melting."

Building on previous research that quantified the keyhole phenomenon, the research team used extremely bright, high-energy X-ray imaging to watch instabilities of the keyhole. Pores form as the keyhole fluctuates and changes shape: the keyhole tip morphs into a "J" shape and pinches off. This unstable behavior generates acoustic waves in the liquid metal that force the pores away from the keyhole, so that they survive long enough to become trapped in the resolidifying metal. The team is the first to focus on this behavior and identify what is happening.

"When you have a deep keyhole, the walls oscillate strongly. Occasionally, the oscillations are strong enough at the bottom of the keyhole that they pinch off, leaving a large bubble behind. Sometimes this bubble never reconnects to the main keyhole. It collapses and generates an acoustic shock wave. This pushes the remaining pores away from the keyhole," explains Rollett.

It is important to note that keyholes themselves are not flaws; indeed, they increase the efficiency of the laser. Using synchrotron X-ray equipment at Argonne National Laboratory, the only facility in the United States where the researchers could run these experiments, they found that there is a well-defined boundary between stable and unstable keyholes.

"As long as you stay out of the danger zone [i.e., too hot, too slow], the risk of leaving defects behind is quite small," says Rollett.

On the unstable side of the boundary, fluctuations in the keyhole's depth increase strongly as scan speed decreases and laser power increases.

"You can think of the boundary as a speed limit, except it is the opposite of driving a car. In this case, it gets more dangerous as you go slower. If you're below the speed limit, then you are almost certainly generating a defect," adds Rollett.
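The "speed limit" analogy suggests a simple process-window check. The sketch below is hypothetical: it uses linear energy density (power divided by speed) as a rough proxy for "too hot, too slow", and the threshold value is invented - real boundaries must be measured for a given machine, alloy and powder, for example via the synchrotron experiments described above.

```python
def keyhole_unstable(power_w: float, speed_mm_s: float,
                     threshold_j_per_mm: float = 0.35) -> bool:
    """Flag parameter pairs on the 'too hot, too slow' side of the boundary.

    Uses linear energy density (W / (mm/s) = J/mm) as a crude proxy:
    high power at low scan speed -> deep, fluctuating keyhole -> porosity risk.
    The default threshold is illustrative only.
    """
    return power_w / speed_mm_s > threshold_j_per_mm
```

A machine controller could run such a check over a planned scan strategy and warn when any segment strays into the danger zone.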

At a broader scale, by proving the existence of well-defined keyhole porosity boundaries and demonstrating the ability to reproduce them, science can offer a more secure basis for predicting and improving printing processes. Rollett, who is the faculty co-director of Carnegie Mellon's Next Manufacturing Center, thinks that the findings from this research will quickly find their way into how companies operate their 3D printers.

Credit: 
College of Engineering, Carnegie Mellon University

Extraction of largely-unexplored bodily fluid could be a new source of biomarkers

image: Interstitial fluid (right) may provide an alternative source of biomarkers compared to blood (left) that could be useful in diagnosing human illness.

Image: 
Allison Carter, Georgia Tech

Using an array of tiny needles that are almost too small to see, researchers have developed a minimally-invasive technique for sampling a largely-unexplored human bodily fluid that could potentially provide a new source of information for routine clinical monitoring and diagnostic testing.

Biochemical information about the body most commonly comes from analysis of blood - which represents only 6% of bodily fluids - but valuable information may also be found in other bodily fluids that are traditionally hard to get. Researchers have now developed a way to extract dermal interstitial fluid (ISF) - which circulates between cells in bodily tissues - using a simple through-the-skin technique that could provide a new approach for studying the metabolic products of cells, obtaining diagnostic biomarkers, and identifying potential toxins absorbed through the skin.

Because the dermal interstitial fluid doesn't clot like blood, the microneedle-based extraction could offer a new approach for continuous monitoring of glucose and other key health indicators.

Results of a human trial of the microneedle-based ISF sampling are reported Nov. 25 in the journal Science Translational Medicine. The study, conducted by researchers from the Georgia Institute of Technology and Emory University, was supported in part by the National Institutes of Health.

"Interstitial fluid originates in the blood and then leaks out of capillaries to bring nutrients to cells in the body's tissues. Because interstitial fluid is in direct communication with the cells, it should have information about the tissues themselves beyond what can be measured from testing the blood," said Mark Prausnitz, Regents' Professor and J. Erskine Love Jr. Chair in Georgia Tech's School of Chemical and Biomolecular Engineering. "This microneedle-based technique could provide a minimally-invasive and simple way to access this interstitial fluid to make it available for medical diagnostic and research applications."

ISF has been difficult to sample. Indwelling instruments for monitoring glucose in ISF already exist, and other researchers have used surgically-implanted tubing and vacuum-created blisters to extract ISF through the skin, but these techniques are not suitable for routine clinical diagnostic use.

The researchers, led by first author Pradnya Samant, used a patch containing five solid stainless-steel microneedles that were a hundredth of an inch in length. By pressing the patch at an angle into the skin of 50 human subjects, they created shallow micropores that reached only into the outer layer of skin containing ISF. The researchers then applied suction to the area of skin containing the pores and obtained enough ISF to do three types of analysis. For comparison, they also took blood samples and obtained ISF using the older blister technique.

To accurately determine the biomarkers available in the ISF, the researchers needed to avoid getting blood mixed with the ISF. Though major blood vessels don't exist in the outer layers of skin, capillaries there can be damaged by the insertion of the microneedles. In their studies, the researchers found that if they slowly ramped up the suction after inserting the microneedles, they could obtain fluid clear of blood.

The overall extraction procedure took a total of about 20 minutes for each test subject. The procedure was well tolerated by the volunteers, and the microscopic pores healed quickly, within a day, with minimal irritation.

The extracted fluid was analyzed at Emory University using liquid chromatography-mass spectrometry techniques to identify the chemical species it contained. Overall, there were about 10,000 unique compounds, most of which were also found in the blood samples. However, about 12 percent of the chemical species were not found in the blood, and others were found in the ISF at higher levels than in the blood.

"The skin is metabolically active, and it is full of cells that are changing the fluid," Prausnitz said. "We found that some of the compounds were unique to the ISF, or enriched there, and that is what we were hoping to find."

While not all the compounds unique to the ISF could be analyzed, the research team identified components of products that are applied to the skin - such as hand lotions - and pesticides that may enter the body through the skin. This discovery could set the stage for use of the microneedle technique for dermatological and toxicology studies.

"If you want to look at what accumulates in the skin over time, this may provide a way to obtain information about those kinds of exposures," Prausnitz said. "These are materials that may accumulate in the tissues of our body, but are not found in the bloodstream."

The researchers also determined the pharmacokinetics of caffeine and the pharmacodynamics of glucose - both small molecules - from the ISF, indicating that dynamic biomarker information could be obtained with the technique. Those measurements suggested that ISF could provide a means of continuously monitoring such compounds, taking advantage of the fact that the fluid does not clot.

"We were encouraged that we found a good correlation between the blood and interstitial fluid glucose, which suggests we might be able to have a continuous glucose monitoring system based on this technology," Prausnitz said. A microneedle-based system could provide a less-invasive alternative to existing implantable glucose sensors by allowing the sensing components to remain on the surface of the skin.
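The "good correlation" the researchers describe can be illustrated with a simple Pearson correlation between paired measurements. The sketch below is illustrative only - the glucose values are invented, and the study's actual statistical analysis is not described here:

```python
# Illustrative sketch (not the study's analysis): checking agreement
# between paired blood and ISF glucose readings with a Pearson
# correlation coefficient. All numbers below are made up.
import math

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical paired glucose readings (mg/dL) from the same subjects.
blood = [90, 110, 135, 150, 95, 120]
isf = [88, 112, 130, 155, 99, 118]

r = pearson(blood, isf)
print(f"r = {r:.3f}")  # a value near 1 indicates the ISF readings track blood
```

A correlation near 1 in such paired data is what would support the idea of calibrating an ISF-based sensor against blood glucose.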

In future research, Prausnitz would like to reduce the time required to extract the ISF and simplify the process by eliminating the vacuum pump. Additional study of the compounds found in the fluid could also show whether they may have medical diagnostic value.

"We'd like to make this microneedle-based technique available to the research community to make ISF routinely available for study," he said. "Tissue interstitial fluid could be a novel source of biomarkers that complements conventional sources. This research provides a means to study this further."

Credit: 
Georgia Institute of Technology

Space worms experiment reveals gravity affects genes

Living at low gravity affects cells at the genetic level, according to a study of worms in space.

Genetic analysis of Caenorhabditis elegans worms on the International Space Station showed "subtle changes" in about 1,000 genes.
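Identifying "changed" genes typically means comparing expression levels between flight and ground samples. A minimal sketch of one common approach, a fold-change filter, is shown below; the gene names and numbers are invented, and the study's actual pipeline is not described here:

```python
# Illustrative sketch only: flag genes whose expression differs between
# spaceflight and ground-control samples by a simple fold-change filter.
# Gene names and expression values are hypothetical placeholders.

def changed_genes(flight, ground, threshold=1.5):
    """Return genes whose flight/ground expression ratio exceeds the
    threshold in either direction."""
    hits = []
    for gene, f in flight.items():
        g = ground.get(gene)
        if g and (f / g >= threshold or g / f >= threshold):
            hits.append(gene)
    return hits

flight = {"unc-54": 12.0, "daf-16": 8.0, "myo-3": 5.1}
ground = {"unc-54": 11.5, "daf-16": 4.0, "myo-3": 5.0}

print(changed_genes(flight, ground))  # only daf-16 passes the filter
```

Real analyses add statistical testing and multiple-comparison correction across the whole genome, but the fold-change idea is the intuition behind counting roughly 1,000 subtly changed genes.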

Stronger effects were found in some genes, especially among neurons (nervous system cells).

The study, by the University of Exeter and the NASA GeneLab, aids our understanding of why living organisms - including humans - suffer physical decline in space.

"We looked at levels of every gene in the worms' genome and identified a clear pattern of genetic change," said Dr Timothy Etheridge, of the University of Exeter.

"These changes might help explain why the body reacts badly to space flight.

"It also gives us some therapy targets in terms of reducing these health effects, which are currently a major barrier to deep-space exploration."

The study exposed worms to low gravity on the International Space Station, and to high gravity in centrifuges.

The high-gravity tests gave the researchers more data on gravity's genetic impacts, and allowed them to look for possible treatments using high gravity in space.

"A crucial step towards overcoming any physiological condition is first understanding its underlying molecular mechanism," said lead author Craig Willis, of the University of Exeter.

"We have identified genes with roles in neuronal function and cellular metabolism that are affected by gravitational changes.

"These worms display molecular signatures and physiological features that closely mirror those observed in humans, so our findings should provide foundations for a better understanding of spaceflight-induced health decline in mammals and, eventually, humans."

Dr Etheridge added: "This study highlights the ongoing role of scientists from Europe and the UK in space flight life sciences research."

Credit: 
University of Exeter

An ionic forcefield for nanoparticles

image: An SEM image of the nanoparticles on the red blood cell

Image: 
(Image courtesy of Eden Tanner/Harvard SEAS)

Nanoparticles are promising drug delivery tools, offering the ability to administer drugs directly to a specific part of the body and avoid the awful side effects so often seen with chemotherapeutics.

But there's a problem. Nanoparticles struggle to get past the immune system's first line of defense: proteins in the blood serum that tag potential invaders. Because of this, only about 1 percent of nanoparticles reach their intended target.

"No one escapes the wrath of the serum proteins," said Eden Tanner, a former postdoctoral fellow in bioengineering at the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS).

Now, Tanner and a team of researchers led by Samir Mitragotri, the Hiller Professor of Bioengineering and Hansjorg Wyss Professor of Biologically Inspired Engineering at SEAS, have developed an ionic forcefield that prevents proteins from binding to and tagging nanoparticles.

In mouse experiments, nanoparticles coated with the ionic liquid survived significantly longer in the body than uncoated particles and, surprisingly, 50 percent of the nanoparticles made it to the lungs. It's the first time that ionic liquids have been used to protect nanoparticles in the bloodstream.

"The fact that this coating allows the nanoparticles to slip past serum proteins and hitch a ride on red blood cells is really quite amazing because once you are able to fight the immune system effectively, lots of opportunities open up," said Mitragotri, who is also a Core Faculty Member of Harvard's Wyss Institute for Biologically Inspired Engineering.

The research is published in Science Advances.

Ionic liquids, essentially liquid salts, are highly tunable materials that can hold a charge.

"We knew that serum proteins clear out nanoparticles in the bloodstream by attaching to the surface of the particle, and we knew that certain ionic liquids can either stabilize or destabilize proteins," said Tanner, who is now an Assistant Professor of Chemistry & Biochemistry at the University of Mississippi. "The question was, could we leverage the properties of ionic liquids to allow nanoparticles to slip past proteins unseen?"

"The great thing about ionic liquids is that every small change you make to their chemistry results in a big change in their properties," said Christine Hamadani, a former graduate student at SEAS and first author of the paper. "By changing one carbon bond, you can change whether or not it attracts or repels proteins."

Hamadani is currently a graduate student in Tanner's lab at the University of Mississippi.

The researchers coated their nanoparticles with the ionic liquid choline hexenoate, which has an aversion to serum proteins. Once in the body, these ionic liquid-coated nanoparticles appeared to spontaneously attach to the surface of red blood cells and circulate until they reached the dense capillary system of the lungs, where the particles sheared off into the lung tissue.

"This hitchhiking phenomenon was a really unexpected discovery," said Mitragotri. "Previous methods of hitchhiking required special treatment for the nanoparticles to attach to red blood cells and even then, they only stayed at a target location for about six hours. Here, we showed 50 percent of the injected dose still in the lungs after 24 hours."

The research team still needs to understand the exact mechanism that explains why these particles travel so well to lung tissue, but the research demonstrates just how precise the system can be.

"This is such a modular technology," said Tanner, who plans to continue the research in her lab at the University of Mississippi. "Any nanoparticle with a surface charge can be coated with ionic liquids, and there are millions of ionic liquids that can be tuned to have different properties. You could tune the nanoparticle and the liquid to target specific locations in the body."

"We as a field need as many tools as we can to fight the immune system and get drugs where they need to go," said Mitragotri. "Ionic liquids are the latest tool on that front."

Credit: 
Harvard John A. Paulson School of Engineering and Applied Sciences

Barley pan-genome: IPK scientists reach milestone on the way to 'transparent' barley

image: The photo shows a spectrum of diversity in wheat and barley.

Image: 
Photo: IPK Leibniz Institute/ Andreas Bähring

In order to record all the genetic information of an individual, its genome must be completely decoded. IPK scientists and international partners already succeeded in doing this for barley three years ago (Mascher et al. 2017). But to understand the genetic information of the entire barley species, much more is required. An international team, again led by IPK scientists, has now come a significant step closer to deciphering this so-called pan-genome of barley, as the science magazine Nature reports in today's issue.

Astonishingly, individual genomes sometimes differ considerably in their number of genes and in the arrangement and orientation of large parts of individual chromosomes, the carriers of genetic information. These "structural" changes in the barley genome can present an insurmountable barrier to recombining important plant characters in crossbreeding.

The starting point of this research was the attempt to characterise, by sequencing, all of the approximately 22,000 barley seed samples from the Federal Ex-situ Gene Bank at IPK (Milner et al. 2019). This identified twenty highly diverse genotypes, which have now been selected for complete sequencing. "Criteria for the selection included the greatest possible differences in genetic diversity, geographical origin and biological traits such as winter or spring type, grain hull and row type," says Prof. Dr. Nils Stein, head of the Genomics of Genetic Resources research group at IPK and holder of a joint professorship at the University of Göttingen.

Besides the observation that two barley varieties can differ substantially in their total gene content, the scientists found striking differences in the linear order of the genetic information in the chromosomes - so-called structural genome differences. Two of these structural variations, inversions (the opposite arrangement of genetic information in two genomes), attracted the particular interest of the scientists: in one case, a link could be established to mutation breeding in the 1960s; the inversion has since spread unnoticed through breeding into present-day varieties. In the second case, the observed structural variation was potentially selected during environmental adaptation, as the range of barley production in early agriculture expanded to northern latitudes in Europe. "The description of such large genomic inversions in barley is new," says Prof. Dr. Nils Stein. "They can play a decisive role in the breeding process, as they might prevent recombination, thus making cross-breeding for desired trait combinations impossible." And more generally: "These naturally occurring or artificially induced inversions are evidence of considerable dynamism in the genome organisation of this important crop species."

The new findings have a great impact on science and breeding. "We have created a new knowledge base and opened up a treasure trove of new information for breeding," confirms Prof. Dr. Nils Stein. Molecular markers could now be used to specifically take structural variation into account in barley breeding.

The project, including scientists from Australia, Canada, USA, China, Japan and Scotland, was initiated and coordinated at IPK. The IPK has been funded by the Federal Ministry of Education and Research in the field of cereal genome research for more than ten years.

Despite current progress, researchers still face major challenges. "We have not yet recorded the entire diversity of barley," explains Dr. Martin Mascher. "To do so, we need to fully sequence and decode additional genotypes," says the head of the independent Domestication Genomics Research Group at IPK. In a next step, the researchers want to take a closer look at wild barley, the direct ancestor of today's cultivated crop. "We still lack wild barley as an important gene pool," explains Dr. Martin Mascher. "And I am quite sure that we are discovering diversity that could be of considerable value for future barley breeding and research."

Credit: 
Leibniz Institute of Plant Genetics and Crop Plant Research

Space travel can adversely impact energy production in a cell

image: Georgetown researcher Evagelia C. Laiakis, PhD, and dozens of other scientists described recent findings about the impact of space travel on health as part of a large compendium of work that appears concurrently in Cell, Cell Reports, Cell Systems, Patterns, and iScience.

Image: 
Jerry Angdisen/Georgetown University

WASHINGTON --- Studies of both mice and humans who have traveled into space reveal that critical parts of a cell's energy production machinery, the mitochondria, can be made dysfunctional due to changes in gravity, radiation exposure and other factors, according to investigators at Georgetown Lombardi Comprehensive Cancer Center. These findings are part of an extensive research effort across many scientific disciplines to look at the health effects of travel into space. The research has implications for future space travel as well as how metabolic changes due to space travel could inform medical science on earth.

The findings appeared November 25, 2020, in Cell and are part of a larger compendium of research into health aspects of space travel that appears concurrently in Cell, Cell Reports, Cell Systems, Patterns, and iScience.

"My group's research efforts centered around muscle tissue from mice that were sent into space and were compared with analyses by other scientists who studied different mouse tissue," says Evagelia C. Laiakis, PhD, an associate professor of oncology at Georgetown. "Although we each studied different tissue, we all came to the same conclusion: that mitochondrial function was adversely impacted by space travel."

In addition to studying the effects of space travel on cellular function, the scientists used a trove of data from decades of NASA human flight experiments to correlate their outcomes in animals with those from 59 astronauts. They were also able to access data derived from NASA's repository of biospecimens that had flown in space to do further comparisons.

Data from NASA's Twin Study of Mark and Scott Kelly was particularly informative as it allowed for a comparison of the health effects seen in an astronaut in space, Scott, with his earth-bound brother, Mark, who is a retired astronaut.

Comparing their studies of mice with human data, Laiakis and the team of researchers were able to determine that space travel led to certain metabolic effects:

Isolated cells were adversely impacted to a higher degree than whole organs

Changes in the liver were more noticeable than in other organs

Mitochondrial function was impacted

Because space travel almost always exposes people to higher levels of radiation than would be found on earth, the scientists knew that such an exposure could harm mitochondria. This aspect of radiation exposure translates to health outcomes here on earth for cancer patients who undergo radiotherapy. With this knowledge of radiation's impact on mitochondria, clinicians might tailor radiation therapy in different ways in the future to protect normal tissue. The implications for travel to Mars are especially concerning, the researchers say, as that would involve a much longer time in space and hence lengthy exposure to radiation.

"The launch of SpaceX earlier this month was very exciting," says Laiakis. "From this, and other planned ventures to the moon, and eventually Mars, we hope to learn much more about the effects that spaceflight can have on metabolism and how to potentially mitigate adverse effects for future space travelers."

Credit: 
Georgetown University Medical Center

Mapping out the mystery of blood stem cells

image: Princess Margaret Cancer Centre Senior Scientist Dr. Mathieu Lupien and team used state-of-the-art 3D mapping techniques to analyze why some stem cells self-renew and others lose that ability.

Image: 
Images By Delmar

Princess Margaret scientists have revealed how stem cells are able to generate new blood cells throughout our life by looking at vast, uncharted regions of our genetic material that hold important clues to subtle biological changes in these cells.

The finding, obtained from studying normal blood, can be used to enhance methods for stem cell transplantation, and may also shed light into processes that occur in cancer cells that allow them to survive chemotherapy and relapse into cancer growth many years after treatment.

Using state-of-the-art sequencing technology to perform genome-wide profiling of the epigenetic landscape of human stem cells, the research revealed important information about how genes are regulated through the three-dimensional folding of chromatin.

Chromatin is composed of DNA and proteins, the latter of which package DNA into compact structures, and is found in the nucleus of cells. Changes in chromatin structure are linked to DNA replication, repair and gene expression (turning genes on or off).

The research by Princess Margaret Cancer Centre Senior Scientists Drs. Mathieu Lupien and John Dick is published in Cell Stem Cell, Wednesday, November 25, 2020.

"We don't have a comprehensive view of what makes a stem cell function in a specific way or what makes it tick," says Dr. Dick, who is also a Professor in the Department of Molecular Genetics, University of Toronto.

"Stem cells are normally dormant but they need to occasionally become activated to keep the blood system going. Understanding this transition into activation is key to be able to harness the power of stem cells for therapy, but also to understand how malignant cells change this balance.

"Stem cells are powerful, potent and rare. But it's a knife's edge as to whether they get activated to replenish new blood cells on demand, or go rogue to divide rapidly and develop mutations, or lie dormant quietly, in a pristine state."

Understanding what turns that knife's edge into these various stem cell states has perplexed scientists for decades. Now, with this research, we have a better understanding of what defines a stem cell and makes it function in a particular way.

"We are exploring uncharted territory," says Dr. Mathieu Lupien, who is also an Associate Professor in the Department of Medical Biophysics, University of Toronto. "We had to look into the origami of the genome of cells to understand why some can self-renew throughout our life while others lose that ability. We had to look beyond what genetics alone can tell us."

In this research, scientists focused on the often overlooked noncoding regions of the genome: vast stretches of DNA that are free of genes (i.e. that do not code for proteins), but nonetheless harbour important regulatory elements that determine if genes are turned on or off.

Hidden amongst this noncoding DNA - which comprises about 98% of the genome - are crucial elements that not only control the activity of thousands of genes, but also play a role in many diseases.

The researchers examined two distinct types of human hematopoietic stem cells - immature cells that go through several steps in order to develop into different types of blood cells, such as white or red blood cells, or platelets.

They looked at long-term hematopoietic stem cells (HSCs) and short-term HSCs found in the bone marrow of humans. The researchers wanted to map out the cellular machinery involved in the "dormancy" state of long-term cells, with their continuous self-renewing ability, as compared to the more primed, activated and "ready-to-go" short-term cells which can transition quickly into various blood cells.

The researchers found differences in the three-dimensional chromatin structures between the two stem cell types, which is significant since the ways in which chromatin is arranged or folded and looped impacts how genes and other parts of our genome are expressed and regulated.

Using state-of-the-art 3D mapping techniques, the scientists were able to analyze and link the long-term stem cell types with the activity of the chromatin folding protein CTCF and its ability to regulate the expression of 300 genes to control long-term, self-renewal.

"Until now, we have not had a comprehensive view of what makes a stem cell function in a particular way," says Dr. Dick, adding that the 300 genes represent what scientists now think is the "essence" of a long-term stem cell.

He adds that long-term dormant cells are a "protection" against malignancy, because they can survive for long periods and evade treatment, potentially causing relapse many years later.

However, a short-term stem cell that is poised to become active, dividing and reproducing more quickly than a long-term one, can gather up many more mutations, and sometimes these can progress to blood cancers, he adds.

"This research gives us insight into aspects of how cancer starts and how some cancer cells can retain stem-cell like properties that allow them to survive long-term," says Dr. Dick.

He adds that a deeper understanding of stem cells can also help with stem cell transplants for the treatment of blood cancers in the future, by potentially stimulating and growing these cells ex vivo (out of the body) for improved transplantation.

Credit: 
University Health Network