
Revealing the reason behind jet formation at the tip of laser optical fiber

image: The schematics of the jet formation mechanism.

Image: 
Junnosuke Okajima, Tohoku University

When laser light is delivered through an optical fiber immersed in liquid, a high-temperature, high-speed jet is discharged from the fiber tip. Researchers expect this phenomenon to be applied to medical treatment in the future. Now, a research team from Russia and Japan has explored it further and revealed the reasons behind the jet formation.

Laser light delivered through a thin optical fiber, combined with an endoscope or catheter, can easily reach deep areas of the body or the inside of blood vessels. Traditionally, affected areas or lesions are removed by generating heat inside the tissue through laser absorption - a process known as the photothermal effect.

Yet, hydrodynamical phenomena, such as microbubble formation or high-speed jet generation from the optical fiber, show immense medical promise.

Jet formation begins when the laser irradiates the water, causing it to boil and a vapor bubble to form at the tip of the optical fiber. The vapor bubble grows until the laser energy absorbed in the liquid is consumed. Because of the surrounding cold liquid, condensation then suddenly shrinks the vapor bubble.

Using a numerical simulation, Dr. Junnosuke Okajima from Tohoku University's Institute of Fluid Science, along with his colleagues in Russia, set out to clarify the jet formation mechanism. Their simulation investigated the relationship between the bubble deformation and the induced flow field.

When the bubble shrinks, a flow toward the tip of the optical fiber forms. This flow deforms the bubble into a cylindrical shape, and the deformation causes the flow to collide in the radial direction. The collision drives the jet forward. As a result of the collision and jet formation, a vortex forms at the tip of the deformed bubble and grows larger.

"We found the jet velocity depends on the relationship between the size of the vapor bubble just before the shrinking and the fiber radius," said Okajima. "We will continue to develop this study and try to find the optimum condition which maximizes the jet velocity and temperature, making further laser surgical techniques more effective and safer."

Credit: 
Tohoku University

Age does not contribute to COVID-19 susceptibility

image: The age distribution of mortality by COVID-19 was similar in Italy (reported on 13th May 2020), Japan (reported on 7th May 2020), and Spain (reported on 12th May 2020). (Ryosuke Omori, Ryota Matsuyama, Yukihiko Nakata, Scientific Reports, October 6, 2020).

Image: 
Ryosuke Omori, Ryota Matsuyama, Yukihiko Nakata, Scientific Reports, October 6, 2020

Scientists have estimated that the age of an individual does not indicate how likely they are to be infected by SARS-CoV-2. However, development of symptoms, progression of the disease, and mortality are age-dependent.

There have been a large number of deaths due to the ongoing COVID-19 pandemic, and it has been shown that elderly individuals disproportionately develop severe symptoms and show higher mortality.

A team of scientists, including Associate Professor Ryosuke Omori from the Research Center for Zoonoses Control at Hokkaido University, have modeled available data from Japan, Spain and Italy to show that susceptibility to COVID-19 is independent of age, while occurrence of symptomatic COVID-19, severity and mortality is likely dependent on age. Their results were published in the journal Scientific Reports on October 6, 2020.

Higher mortality in elderly individuals may be due to two factors: how likely they are to be infected because of their advanced age (age-dependent susceptibility), which is reflected in the number of cases; and how likely they are to develop a severe form of the disease because of their advanced age (age-dependent severity), which is reflected in the mortality rate. These factors are not fully understood for COVID-19.

The scientists chose to analyse data from Italy, Spain and Japan to determine whether there was any relationship between age, susceptibility and severity. These three countries were chosen because they have well recorded, publicly available data. As of May 2020, the mortality rate (number of deaths per 100,000) was 382.3 for Italy, 507.2 for Spain and 13.2 for Japan. However, despite the wide disparity in mortality rates, the age distribution of mortality (the proportional number of deaths per age group) was similar for these countries.

The scientists developed a mathematical model to calculate susceptibility in each age group under different conditions. They also factored in the estimated human-to-human contact level in each age group, as well as varying restriction levels for outside-home activities in the three countries.
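
To make the idea concrete, here is a minimal, purely illustrative sketch of the kind of age-structured calculation described above. It is not the authors' model: the age groups, population shares, contact levels and fatality rates below are all invented, and susceptibility is simply set to be age-independent.

```python
# Minimal illustrative sketch (not the study's model): how age-dependent contact,
# susceptibility and fatality combine into an age distribution of deaths.
# All numbers below are hypothetical.

import numpy as np

age_groups = ["0-19", "20-39", "40-59", "60-79", "80+"]
population = np.array([0.20, 0.25, 0.27, 0.22, 0.06])       # population share (assumed)
contact    = np.array([1.2, 1.5, 1.3, 0.8, 0.5])            # relative contact level (assumed)
suscept    = np.ones(5)                                      # age-independent susceptibility
fatality   = np.array([0.0001, 0.0005, 0.005, 0.05, 0.15])  # age-dependent fatality (assumed)

infections = population * contact * suscept   # relative infections per age group
deaths = infections * fatality                # relative deaths per age group

mortality_distribution = deaths / deaths.sum()  # proportion of all deaths per age group
for group, share in zip(age_groups, mortality_distribution):
    print(f"{group}: {share:.1%}")
```

In a sketch like this, the shape of the deaths-by-age distribution is set almost entirely by the age-dependent fatality term, so countries with very different overall death tolls can still share a similar age distribution of mortality without any age dependence in susceptibility, which is the kind of reasoning the study formalizes.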

The model showed that susceptibility would have to differ unrealistically between age groups if age were assumed not to influence severity and mortality. Conversely, to explain the fact that the age distribution of mortality is similar across the three countries, the model indicated that age should not influence susceptibility but should influence severity and mortality.

Ryosuke Omori, from the Research Center for Zoonoses Control at Hokkaido University, specializes in epidemiological modelling: the use of mathematics and statistics to understand and predict the spread of diseases. Since the outbreak of COVID-19, he has turned his efforts to ascertaining the true extent of the spread of the pandemic in Japan and abroad.

Credit: 
Hokkaido University

Study shows proof of concept of BioIVT HEPATOPAC cultures with targeted assay to evaluate bioactivation potential and drug-induced liver injury (DILI) risk

image: BioIVT logo

Image: 
BioIVT

BioIVT, a leading provider of research models and services for drug and diagnostic development, today announced the publication of research describing the use of HEPATOPAC® cultures with a targeted in vitro assay to identify small molecule drugs with high potential for drug-induced liver injury (DILI).1

DILI contributes to the high failure rate of drug candidates in clinical development, but DILI risk frequently does not become evident until late in clinical trials. There remains a need for better preclinical models to screen drug candidates for DILI risk during the lead selection and optimization process.

The in vitro Bioactivation Liver Response Assay (BA-LRA) is a method based on a set of liver gene expression biomarkers that respond quantitatively to chemically reactive metabolites that are predicted to trigger bioactivation-mediated clinical DILI.

BioIVT's HEPATOPAC model was selected for the in vitro BA-LRA because of its long-term viability and demonstrated in-vivo relevance. The HEPATOPAC model is an in vitro bioengineered co-culture of primary hepatocytes and fibroblasts, which is used extensively for liver-based safety, metabolism, and efficacy evaluations of small molecule drug candidates.

The work, conducted by scientists at Merck Research Laboratories and published in the peer-reviewed journal Toxicological Sciences, describes application of the in vitro BA-LRA using the HEPATOPAC model to evaluate 93 compounds known to be DILI positive or negative in humans. The assay differentiated drugs with high DILI risk from those with lower risk, achieving 81% sensitivity and 90% specificity in the rat HEPATOPAC model and 68% sensitivity and 86% specificity in the human HEPATOPAC model.
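
For readers unfamiliar with the two metrics, the sketch below shows how sensitivity and specificity are computed from assay outcomes. The counts are hypothetical and are not the study's data.

```python
# Illustrative only: sensitivity and specificity from a confusion matrix.
# The counts below are invented, not taken from the published study.

def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = fraction of truly DILI-positive drugs flagged by the assay;
    specificity = fraction of truly DILI-negative drugs correctly not flagged."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Hypothetical split of 93 compounds: 50 DILI-positive, 43 DILI-negative
sens, spec = sensitivity_specificity(tp=40, fn=10, tn=39, fp=4)
print(f"sensitivity = {sens:.0%}, specificity = {spec:.0%}")  # 80%, 91%
```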

"The high in-vitro in-vivo correlation of HEPATOPAC cultures, combined with their long-term viability makes this an excellent system for novel ADME Tox and disease models. This publication adds to the body of evidence for the utility of these assays as early de-risking tools to reduce the risk of drug induced liver injury in pharmaceutical development," said BioIVT Senior VP ADME Dr. Christopher Black.

Credit: 
Rana Healthcare Solutions LLC

VAV1 gene mutations trigger T-cell tumors in mice

Tsukuba, Japan -- Life is an exquisite orchestration of growth and change, with checks and balances that fine-tune complex entwined interactions, both intrinsic and external. White blood cells (WBC) are integral to an organism's immune defenses against disease and invasion; unfortunately, these mechanisms may go awry causing uncontrolled increases in dysfunctional cell numbers, resulting in tumor formation. Now, researchers at the University of Tsukuba have illustrated how mutations in a specific gene called VAV1 may promote tumors involving a type of white blood cell, the T-cell, in experimental mice.

Leukocytes, or WBC, are fundamental to the body's immune function. They include B-lymphocytes that generate antibodies, as well as thymic-lymphocytes or T-cells with diverse immune-related functions, so called because they develop in the thymus gland. T-cell neoplasms include a mature subtype called peripheral T-cell lymphoma. Studies have shown that VAV1, a gene that participates in T-cell receptor signaling, is altered in several peripheral T-cell lymphoma variants; therefore, the research team sought to elucidate the role of VAV1 mutants in the malignant transformation of T-cells in vivo.

The tumor suppressor gene p53 is called 'guardian of the genome' because it prevents genome mutation. The researchers replicated VAV1 mutations found in human T-cell tumors in both normal ("wild-type") mice and mice lacking p53. Lead author Kota Fukumoto describes their findings: "No tumors developed in the wild-type mice with VAV1 mutations over a year of observation; whereas immature tumors developed in mice that lacked p53. Remarkably, mice that both lacked p53 and had mutations in VAV1 developed mature tumors resembling human peripheral T-cell lymphoma, and had poorer prognosis than the mice lacking p53 only."

The team also transplanted tumor cells into mice that lacked a functional thymus. The results suggested that tumor initiation was likely due to mechanisms within the cell itself. "We noted that T-cell tumors with VAV1 mutation showed Myc pathway enrichment, as well as somatic copy number alterations (SCNAs), including at the Myc locus," explains Professor Shigeru Chiba, senior author. Significantly, Myc, a family of regulator genes and proto-oncogenes, and SCNAs, which alter the number of copies of DNA segments, are both distinct hallmarks of tumor formation.

"Interestingly, pharmaceutical inhibition of the Myc pathway increased overall survival of mice harboring VAV1-mutant tumors," Professor Chiba adds. "Therefore, our methodology and results suggest that the VAV1-mutant expressing mice developed in this study could be a research tool for evaluating therapeutic agents directed against specific T-cell neoplasms."

Credit: 
University of Tsukuba

In the eye of a stellar cyclone

image: Infrared image of Wolf-Rayet binary, dubbed Apep, 8000 light years from Earth.

Image: 
European Southern Observatory

While on COVID lockdown, a University of Sydney honours student has written a research paper on a star system dubbed one of the "exotic peacocks of the stellar world".

Only one in a hundred million stars makes the cut to be classified as a Wolf-Rayet: these are ferociously bright, hot stars doomed to imminent collapse in a supernova explosion, leaving only a dark remnant such as a black hole.

Rarest of all, even among Wolf-Rayets, are elegant binary pairs that, if the conditions are right, are able to pump out huge amounts of carbon dust driven by their extreme stellar winds. As the two stars orbit one another, the dust gets wrapped into a beautiful glowing sooty tail. Just a handful of these sculpted spiral plumes has ever been discovered.

The object of this study is the newest star to join this elite club, but it has been found to break all the rules.

"Aside from the stunning image, the most remarkable things about this star system is the way the expansion of its beautiful dust spiral left us totally stumped," said Yinuo Han, who completed the research during his honours year in the School of Physics.

"The dust seems to have a mind of its own, floating along much slower than the extreme stellar winds that should be driving it."

Astronomers stumbled across this conundrum when the system was discovered two years ago by a team led by University of Sydney Professor Peter Tuthill. This star system, 8000 light years from Earth, was named Apep after the serpentine Egyptian god of chaos.

Now Mr Han's research, published in the Monthly Notices of the Royal Astronomical Society, confirms those findings and reveals Apep's bizarre physics with unprecedented detail.

Applying high-resolution imaging techniques at the European Southern Observatory's Very Large Telescope at Paranal in Chile, the team was able to probe the underlying processes that create the spiral that we observe.

"The magnification required to produce the imagery was like seeing a chickpea on a table 50 kilometres away," Mr Han said.

PRECISE MODEL

The team went further than confirming the earlier discovery, producing a model that matches the intricate spiral structure for the first time, advancing scientists' ability to understand the extreme nature of these stars.

"The fact this relatively simple model can reproduce the spiral geometry to this level of detail is just beautiful," Professor Tuthill said.

However, not all of the physics is straightforward. Mr Han's team confirmed that the dust spiral is expanding four times slower than the measured stellar winds, something unheard of in other systems.

The leading theory to explain this bizarre behaviour makes Apep a strong contender for producing a gamma-ray burst when it does finally explode, something never before witnessed in the Milky Way.

Dr Joe Callingham, a co-author of the study from Leiden University in the Netherlands, said: "There has been a flurry of research into Wolf-Rayet star systems: these really are the peacocks of the stellar world. Discoveries about these elegantly beautiful, but potentially dangerous, objects are causing a real buzz in astronomy."

He said this paper was one of three to be published this year on the Apep system alone. Recently, the team demonstrated that Apep was not just composed of one Wolf-Rayet star, but in fact two. And colleagues from the Institute of Space and Astronautical Science in Japan will soon publish a paper on another system, Wolf-Rayet 112. Lead author of that paper, Ryan Lau, was a co-author on this paper with Mr Han.

TIME BOMBS

Wolf-Rayet stars are massive stars that have reached their final stable phase before going supernova and collapsing to form compact remnants such as black holes or neutron stars.

"They are ticking time bombs," Professor Tuthill said.

"As well as exhibiting all the usual extreme behaviour of Wolf-Rayets, Apep's main star looks to be rapidly rotating. This means it could have all the ingredients to detonate a long gamma-ray burst when it goes supernova."

Gamma-ray bursts are among the most energetic events in the Universe. And they are potentially deadly. If a gamma-ray burst were to impact Earth, it could strip the planet of its precious ozone layer, exposing us all to ultra-violet radiation from the Sun. Fortunately, Apep's axis of rotation means it presents no threat to Earth.

'MIND-BLOWING'

The numbers reveal Apep's extreme nature. The two stars are each about 10 to 15 times more massive than the Sun and more than 100,000 times brighter. Where the surface of our home star is about 5500 degrees, Wolf-Rayet stars are typically 25,000 degrees or more.

According to the team's newest findings, the massive stars in the Apep binary orbit each other about every 125 years at a distance comparable to the size of our Solar System.

"The speeds of the stellar winds produced are just mind-blowing," Mr Han said. "They are spinning off the stars about 12 million kilometres an hour; that's 1 percent the speed of light.

"Yet the dust being produced by this system is expanding much more slowly, at about a quarter of the stellar wind speed."

Mr Han said that the best explanation for this points to the fast-rotating nature of the stars.

"It likely means that stellar winds are launched in different directions at different speeds. The dust expansion we are measuring is driven by slower winds launched near the star's equator," he said.

"Our model now fits the observed data quite well, but we still haven't quite explained the physics of the stellar rotation."

Mr Han will continue his astronomical studies at the University of Cambridge when he starts his doctorate later this year.

Credit: 
University of Sydney

Planting parasites: Unveiling common molecular mechanisms of parasitism and grafting

image: (Left) Parasitism between the roots of P. japonicum and Arabidopsis. (Right) Grafting of P. japonicum with Arabidopsis. Yellow arrow heads indicate the grafted points.

Image: 
Michitaka Notaguchi

Using the model Orobanchaceae parasitic plant Phtheirospermum japonicum, scientists from Nagoya University and other research institutes from Japan have discerned the molecular mechanisms underlying plant parasitism and cross-species grafting, pinpointing enzyme β-1,4-glucanase (GH9B3) as an important contributor to both phenomena. Targeting this enzyme may help control plant parasitism in crops. Also, this mechanism can be exploited for novel cross-species grafting techniques to realize the goal of sustainable agricultural technologies.

Plant parasitism is a phenomenon by which the parasite plant latches onto and absorbs water and nutrients from a second host plant, with the help of a specialized organ called the "haustorium." Once the haustorium forms, specific enzymes then help in forming a connection between the tissues of the parasite and host plants, known as a "xylem bridge," which facilitates the transport of water and nutrients from the host to the parasite.

A similar mechanism is involved in the process of artificial stem grafting, during which the cell walls of the two different plant tissues at the graft junction become thinner and compressed, a change brought about by specific cell wall modifying enzymes. Cell wall modification has also been implicated in parasitism in different lineages of parasitic plants.

Therefore, the research team, led by Dr Ken-ichi Kurotani of Nagoya University, hypothesized that similar genes and enzymes should be involved in the process of parasitism and cross-species grafting. "To investigate molecular events involved in cell-cell adhesion between P. japonicum and the host plant, we analyzed the transcriptome for P. japonicum-Arabidopsis parasitism and P. japonicum-Arabidopsis grafting," reports Dr Kurotani. When a gene in a cell is activated, it produces an RNA "transcript" that is then translated into an active protein, which is then used by the cell to perform various activities. A "transcriptome" is the complete set of RNA transcripts that the genome of an organism produces under various diverse conditions. The findings of their experiments are published in Nature's Communications Biology.

Comparison of the parasitism and graft transcriptomes revealed that genes associated with wound healing, cell division, DNA replication, and RNA synthesis were highly upregulated during both events, indicating active cell proliferation at both the haustorium and graft interface.
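
As a rough illustration of this kind of comparison (not the study's actual analysis), genes upregulated in both transcriptomes can be found by intersecting the two gene sets. The gene names other than GH9B3 and the fold-change cutoff below are invented.

```python
# Toy sketch: find genes upregulated in both the parasitism and the grafting
# transcriptomes. Gene symbols (except GH9B3) and values are hypothetical.

parasitism_fold_change = {"WOUND1": 4.2, "CYCB1": 3.8, "GH9B3": 5.1, "PHOT1": 0.7}
grafting_fold_change   = {"WOUND1": 3.5, "CYCB1": 2.9, "GH9B3": 6.0, "LHCB2": 0.4}

cutoff = 2.0  # minimum fold change to call a gene "upregulated" (assumed)

up_parasitism = {g for g, fc in parasitism_fold_change.items() if fc >= cutoff}
up_grafting   = {g for g, fc in grafting_fold_change.items() if fc >= cutoff}

shared = up_parasitism & up_grafting
print(sorted(shared))  # genes upregulated in both events
```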

"What's more, we found an overlap between the transcriptome data from this study and that from grafting between Nicotiana and Arabidopsis, another angiosperm," reports Dr Michitaka Notaguchi, the co-corresponding author of the study. Glycosyl hydrolases are enzymes that specifically target the breakdown of cellulose, the primary component of plant cell walls. A β-1,4-glucanase identified in P. japonicum belongs to the glycosyl hydrolase 9B3 (GH9B3) family; an enzyme from the same family was recognized to be crucial for cell-cell adhesion in Nicotiana by Dr Notaguchi's group.

Further experiments showed that GH9B3-silenced P. japonicum could form the haustorium with Arabidopsis but could not form a functional xylem bridge, meaning that the P. japonicum β-1,4-glucanase is integral for the plant's parasitic activity. Further, high GH9B3 RNA transcript levels were observed during artificial grafting experiments, thereby proving that the enzyme plays an integral role in both parasitism and grafting mechanisms.

The transcriptome data generated in this study can be used to unearth additional genes and enzymes involved in plant parasitism. Additionally, further research along these directions will help scientists develop specific molecular approaches to arrive at sustainable cross-species grafting alternatives.

The paper, "Host-parasite tissue adhesion by a secreted type of β-1,4-glucanase in the parasitic plant Phtheirospermum japonicum," was published in the journal Communications Biology on July 30, 2020 at DOI: 10.1038/s42003-020-01143-5.

Credit: 
Nagoya University

New NIST project to build nano-thermometers could revolutionize temperature imaging

image: These prototype nanoparticle cores for thermometry are 35 nm in diameter.

Image: 
A. Biacchi/NIST

Cheaper refrigerators? Stronger hip implants? A better understanding of human disease? All of these could be possible and more, someday, thanks to an ambitious new project underway at the National Institute of Standards and Technology (NIST).

NIST researchers are in the early stages of a massive undertaking to design and build a fleet of tiny ultra-sensitive thermometers. If they succeed, their system will be the first to make real-time measurements of temperature on the microscopic scale in an opaque 3D volume -- which could include medical implants, refrigerators, and even the human body.

The project is called Thermal Magnetic Imaging and Control (Thermal MagIC), and the researchers say it could revolutionize temperature measurements in many fields: biology, medicine, chemical synthesis, refrigeration, the automotive industry, plastic production -- "pretty much anywhere temperature plays a critical role," said NIST physicist Cindi Dennis. "And that's everywhere."

The NIST team has now finished building its customized laboratory spaces for this unique project and has begun the first major phase of the experiment.

Thermal MagIC will work by using nanometer-sized objects whose magnetic signals change with temperature. The objects would be incorporated into the liquids or solids being studied -- the melted plastic that might be used as part of an artificial joint replacement, or the liquid coolant being recirculated through a refrigerator. A remote sensing system would then pick up these magnetic signals, meaning the system being studied would be free from wires or other bulky external objects.

The final product could make temperature measurements that are 10 times more precise than state-of-the-art techniques, acquired in one-tenth the time in a volume 10,000 times smaller. This equates to measurements accurate to within 25 millikelvin (thousandths of a kelvin) in as little as a tenth of a second, in a volume just a hundred micrometers (millionths of a meter) on a side. The measurements would be "traceable" to the International System of Units (SI); in other words, the readings could be accurately related to the fundamental definition of the kelvin, the world's basic unit of temperature.

The system aims to measure temperatures over the range from 200 to 400 kelvin (K), which is about -99 to 260 degrees Fahrenheit (F). This would cover most potential applications -- at least the ones the Thermal MagIC team envisions will be possible within the next 5 years. Dennis and her colleagues see potential for a much larger temperature range, stretching from 4 K to 600 K, which would encompass everything from supercooled superconductors to molten lead. But that is not a part of current development plans.
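
For reference, the Fahrenheit figures follow from the standard kelvin-to-Fahrenheit conversion:

$$T_{\mathrm{F}} = \tfrac{9}{5}\,(T_{\mathrm{K}} - 273.15) + 32, \qquad 200\ \mathrm{K} \approx -99.7\ ^{\circ}\mathrm{F}, \qquad 400\ \mathrm{K} \approx 260.3\ ^{\circ}\mathrm{F}.$$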

"This is a big enough sea change that we expect that if we can develop it -- and we have confidence that we can -- other people will take it and really run with it and do things that we currently can't imagine," Dennis said.

Potential applications are mostly in research and development, but Dennis said the increase in knowledge would likely trickle down to a variety of products, possibly including 3D printers, refrigerators, and medicines.

What Is It Good For?

Whether it's the thermostat in your living room or a high-precision standard instrument that scientists use for laboratory measurements, most thermometers used today can only measure relatively big areas -- on a macroscopic as opposed to microscopic level. These conventional thermometers are also intrusive, requiring sensors to penetrate the system being measured and to connect to a readout system by bulky wires.

Infrared thermometers, such as the forehead instruments used at many doctors' offices, are less intrusive. But they still only make macroscopic measurements and cannot see beneath surfaces.

Thermal MagIC should let scientists get around both these limitations, Dennis said.

Engineers could use Thermal MagIC to study, for the first time, how heat transfer occurs within different coolants on the microscale, which could aid their quest to find cheaper, less energy-intensive refrigeration systems.

Doctors could use Thermal MagIC to study diseases, many of which are associated with temperature increases -- a hallmark of inflammation -- in specific parts of the body.

And manufacturers could use the system to better control 3D printing machines that melt plastic to build custom objects such as medical implants and prostheses. Without the ability to measure temperature on the microscale, 3D printing developers are missing crucial information about what's going on inside the plastic as it solidifies into an object. More knowledge could improve the strength and quality of 3D-printed materials someday, by giving engineers more control over the 3D printing process.

Giving It OOMMF

The first step in making this new thermometry system is creating nano-sized magnets that will give off strong magnetic signals in response to temperature changes. To keep particle concentrations as low as possible, the magnets will need to be 10 times more sensitive to temperature changes than any objects that currently exist.

To get that kind of signal, Dennis said, researchers will likely need to use multiple magnetic materials in each nano-object. A core of one substance will be surrounded by other materials like the layers of an onion.

The trouble is that there are practically endless combinations of properties that can be tweaked, including the materials' composition, size, shape, the number and thickness of the layers, or even the number of materials. Going through all of these potential combinations and testing each one for its effect on the object's temperature sensitivity could take multiple lifetimes to accomplish.

To help them get there in months instead of decades, the team is turning to sophisticated software: the Object Oriented MicroMagnetic Framework (OOMMF), a widely used modeling program developed by NIST researchers Mike Donahue and Don Porter.

The Thermal MagIC team will use this program to create a feedback loop. NIST chemists Thomas Moffat, Angela Hight Walker and Adam Biacchi will synthesize new nano-objects. Then Dennis and her team will characterize the objects' properties. And finally, Donahue will help them feed that information into OOMMF, which will make predictions about what combinations of materials they should try next.
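
A schematic of that feedback loop might look like the sketch below. Everything here is a placeholder: the real loop couples laboratory synthesis, magnetic characterization and OOMMF micromagnetic simulations, none of which are represented by this code, and OOMMF itself is driven through its own problem-specification files rather than a Python call.

```python
# Toy sketch of an iterate-and-refine design loop: score candidate nanoparticle
# recipes, keep the best, and propose new variants until a sensitivity target is met.

import random

def simulate_sensitivity(candidate):
    """Stand-in for characterization plus micromagnetic modeling: returns a mock
    temperature sensitivity (dM/dT) for a candidate core/shell recipe."""
    random.seed(hash(candidate) % 2**32)
    return random.uniform(0.1, 1.0)

def propose_next(scored):
    """Stand-in for using model predictions to pick the next recipes to try."""
    best = max(scored, key=scored.get)
    return [f"{best}+variant{i}" for i in range(3)]

candidates = ["FeCo-core/shell-A", "FeCo-core/shell-B", "FeCo-core/shell-C"]
target = 0.95  # arbitrary sensitivity target, arbitrary units

for round_number in range(10):
    scored = {c: simulate_sensitivity(c) for c in candidates}
    best, best_score = max(scored.items(), key=lambda kv: kv[1])
    print(f"round {round_number}: best = {best} ({best_score:.2f})")
    if best_score >= target:
        break
    candidates = propose_next(scored)  # feed the results back into the next round
```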

"We have some very promising results from the magnetic nano-objects side of things, but we're not quite there yet," Dennis said.

Each Dog Is a Voxel

So how do they measure the signals given out by tiny concentrations of nano-thermometers inside a 3D object in response to temperature changes? They do it with a machine called a magnetic particle imager (MPI), which surrounds the sample and measures a magnetic signal coming off the nanoparticles.

Effectively, they measure changes to the magnetic signal coming off one small volume of the sample, called a "voxel" -- basically a 3D pixel -- and then scan through the entire sample one voxel at a time.

But it's hard to focus a magnetic field, said NIST physicist Solomon Woods. So they achieve their goal in reverse.

Consider a metaphor. Say you have a dog kennel, and you want to measure how loud each individual dog is barking. But you only have one microphone. If multiple dogs are barking at once, your mic will pick up all of that sound, but with only one mic you won't be able to distinguish one dog's bark from another's.

However, if you could quiet each dog somehow -- perhaps by occupying its mouth with a bone -- except for a single cocker spaniel in the corner, then your mic would still be picking up all the sounds in the room, but the only sound would be from the cocker spaniel.

In theory, you could do this with each dog in sequence -- first the cocker spaniel, then the mastiff next to it, then the labradoodle next in line -- each time leaving just one dog bone-free.

In this metaphor, each dog is a voxel.

Basically, the researchers max out the ability of all but one small volume of their sample to respond to a magnetic field. (This is the equivalent of stuffing each dog's mouth with a delicious bone.) Then, measuring the change in magnetic signal from the entire sample effectively lets you measure just that one little section.
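
The dog-kennel idea can be written down in a few lines. The sketch below is a one-dimensional toy with made-up numbers, not an MPI implementation: every voxel except one is driven into saturation, so subtracting the known saturated contribution from the total signal recovers the one free voxel's response.

```python
# Toy sketch of selective voxel measurement: saturate every voxel's response
# except one, then read that voxel off the total signal. Not an MPI model.

import numpy as np

rng = np.random.default_rng(0)
true_response = rng.uniform(0.5, 1.5, size=8)   # per-voxel temperature-dependent response

def total_signal(free_voxel):
    """Total signal when every voxel except `free_voxel` is driven into saturation.
    Saturated voxels contribute a fixed, known amount; only the free voxel
    contributes its temperature-dependent response."""
    saturated_contribution = 1.0
    signal = 0.0
    for i, response in enumerate(true_response):
        signal += response if i == free_voxel else saturated_contribution
    return signal

# "Scan" the sample: free one voxel at a time and recover its response
n = len(true_response)
recovered = np.array([total_signal(v) - (n - 1) * 1.0 for v in range(n)])
print(np.allclose(recovered, true_response))  # True
```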

MPI systems similar to this exist but are not sensitive enough to measure the kind of tiny magnetic signal that would come from a small change in temperature. The challenge for the NIST team is to boost the signal significantly.

"Our instrumentation is very similar to MPI, but since we have to measure temperature, not just measure the presence of a nano-object, we essentially need to boost our signal-to-noise ratio over MPI by a thousand or 10,000 times," Woods said.

They plan to boost the signal using state-of-the-art technologies. For example, Woods may use superconducting quantum interference devices (SQUIDs), cryogenic sensors that measure extremely subtle changes in magnetic fields, or atomic magnetometers, which detect how energy levels of atoms are changed by an external magnetic field. Woods is working on which are best to use and how to integrate them into the detection system.

The final part of the project is making sure the measurements are traceable to the SI, a project led by NIST physicist Wes Tew. That will involve measuring the nano-thermometers' magnetic signals at different temperatures that are simultaneously being measured by standard instruments.

Other key NIST team members include Thinh Bui, Eric Rus, Brianna Bosch Correa, Mark Henn, Eduardo Correa and Klaus Quelhas.

Before finishing their new laboratory space, the researchers were able to complete some important work. In a paper published last month in the International Journal on Magnetic Particle Imaging, the group reported that they had found and tested a "promising" nanoparticle material made of iron and cobalt, with temperature sensitivities that varied in a controllable way depending on how the team prepared the material. Adding an appropriate shell material to encase this nanoparticle "core" would bring the team closer to creating a working temperature-sensitive nanoparticle for Thermal MagIC.

In the past few weeks, the researchers have made further progress testing combinations of materials for the nanoparticles.

"Despite the challenge of working during the pandemic, we have had some successes in our new labs," Woods said. "These achievements include our first syntheses of multi-layer nanomagnetic systems for thermometry, and ultra-stable magnetic temperature measurements using techniques borrowed from atomic clock research."

Credit: 
National Institute of Standards and Technology (NIST)

Hydroxychloroquine does not counter SARS-CoV-2 in hamsters, high dose of favipiravir does

image: Lab technicians have to wear protective suits when working with infectious SARS-CoV-2 samples.

Image: 
Layla Aerts - KU Leuven

Virologists at the KU Leuven Rega Institute have been working on two lines of SARS-CoV-2 research: searching for a vaccine to prevent infection, and testing existing drugs to see which one can reduce the amount of virus in infected people.

To test the efficacy of the vaccine and antivirals preclinically, the researchers use hamsters. The rodents are particularly suitable for SARS-CoV-2 research because the virus replicates itself strongly in hamsters after infection. Moreover, hamsters develop a lung pathology similar to mild COVID-19 in humans. This is not the case with mice, for example.

For this study, the team of Suzanne Kaptein (PhD), Joana Rocha-Pereira (PhD), Professor Leen Delang, and Professor Johan Neyts gave the hamsters either hydroxychloroquine or favipiravir - a broad-spectrum antiviral drug used in Japan to treat influenza - for four to five days. They tested several doses of favipiravir. The hamsters were infected with the SARS-CoV-2 virus in two ways: by inserting a high dose of virus directly into their noses or by putting a healthy hamster in a cage with an infected hamster. Drug treatment was started one hour before the direct infection or one day before the exposure to an infected hamster. Four days after infection or exposure, the researchers measured how much of the virus was present in the hamsters.

Hydroxychloroquine versus favipiravir

Treatment with hydroxychloroquine had no impact: the virus levels did not decrease and the hamsters were still infectious. "Despite the lack of clear evidence in animal models or clinical studies, many COVID-19 patients have already been treated with hydroxychloroquine," explains Joana Rocha-Pereira. "Based on these results and the results of other teams, we advise against further exploring the use of hydroxychloroquine as a treatment against COVID-19."

A high dose of favipiravir, however, had a potent effect. A few days after the infection, the virologists detected hardly any infectious virus particles in the hamsters that received this dose and that had been infected intranasally. Moreover, hamsters that were in a cage with an infected hamster and had been given the drug did not develop an obvious infection. Those that had not received the drug all became infected after having shared a cage with an infected hamster.

A low dose of the drug favipiravir did not have this outcome. "Other studies that used a lower dose had similar results," Professor Delang notes. "The high dose is what makes the difference. That's important to know, because several clinical trials have already been set up to test favipiravir on humans."

Cautious optimism

The researchers are cautiously optimistic about favipiravir. "Because we administered the drug shortly before exposing the hamsters to the virus, we could establish that the medicine can also be used prophylactically - that is, as prevention," Suzanne Kaptein notes.

"If further research shows that the results are the same in humans, the drug could be used right after someone from a high-risk group has come into contact with an infected person. It may likely also be active during the early stages of the disease."

General preventive use is probably not an option, however, because it is not known whether long-term use, especially at a high dose, has side effects.

No panacea

Further research will have to determine whether humans can tolerate a high dose of favipiravir. "In the hamsters, we detected hardly any side effects," says Delang. In the past, the drug has already been prescribed in high doses to Ebola patients, who appear to have tolerated it well.

"Favipiravir is not a panacea," the researchers warn. This flu drug, nor any other drug, has not been specifically developed against coronaviruses. As a result, the potency of favipiravir is to be considered moderate at best.

The study also highlights the importance of using small animals to test therapies against SARS-CoV-2 in vivo. "Our hamster model is ideally suited to identify which new or existing drugs may be considered for clinical studies," explains Professor Johan Neyts. "In the early days of the pandemic, such a model was not yet available. At that time, the only option was to explore in patients whether or not a drug such as hydroxychloroquine could help them. However, testing treatments on hamsters provides crucial information that can prevent the loss of valuable time and energy with clinical trials on drugs that don't work."

Not all research models are equal

Kaptein, Rocha-Pereira, Delang and Neyts recently contributed to a commentary in Nature Communications in which they give additional context to the contradictory messages that have been circulating about (hydroxy)chloroquine. In the early days of the pandemic, several studies were set up to test these drugs in cell cultures. The results suggested that they could have an antiviral effect. As a result, clinical trials were organised to test the drugs on humans. However, cell cultures are not the best proxy for the human body, and no conclusive effect was found in humans.

In their commentary, the authors describe several recent studies on human organ-on-chip and other complex in vitro models, mice, hamsters, and non-human primates. Each of these studies demonstrates that hydroxychloroquine and chloroquine do not have the efficacy suggested by the studies in cell cultures. Therefore, the authors conclude that these malaria drugs are very unlikely to be effective in humans as a COVID-19 treatment.

Credit: 
KU Leuven

The Colorado River's water supply is predictable owing to long-term ocean memory

image: The Colorado River is the most important water resource in the semi-arid western United States. Scientists at Utah State University have developed a tool to forecast long-term drought and water flow for the river that could help policymakers prepare for changes that impact millions of people.

Image: 
Photo by Stephen Walker

A team of scientists at Utah State University has developed a new tool to forecast drought and water flow in the Colorado River several years in advance. Although the river's headwaters are in landlocked Wyoming and Colorado, water levels are linked to sea surface temperatures in parts of the Pacific and Atlantic oceans and to the oceans' long-term memory. The group's paper, "Colorado River water supply is predictable on multi-year timescales owing to long-term ocean memory," was published October 9 by Communications Earth and Environment, an open-access journal from Nature Research.

The Colorado River is the most important water resource in the semi-arid western United States and faces growing demand from users in California, Arizona, New Mexico, Colorado and Utah. Because water shortages in the Colorado River impact energy production, food and drinking water security, forestry and tourism, tools to predict drought and low water levels could inform management decisions that affect millions of people.

Current drought forecasts focus on short-term indicators, which limits their usefulness because short-term weather phenomena have too great an influence on the models.

"This new approach is robust and means that water managers, for the first time, have a tool to better estimate water supply in the Colorado River for the future," Robert Gillies, professor in USU's Department of Plants, Soils and Climate (PSC) and director of the Utah Climate Center, said. "The model can be run iteratively so every year a new forecast for the next three years can be created."

In addition to ocean memory, water flows are impacted by land systems--including soils, groundwater, vegetation, and perennial snowpack--which play important roles in tempering the effects of short-term precipitation events. The researchers hypothesized that multi-year predictions could be achieved by using long-term ocean memory and associated atmospheric effects and the filtering effects of land systems.
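
The study itself uses a coupled Earth-system model, not a regression, but the toy sketch below illustrates with synthetic data what "ocean memory" means statistically: a slowly varying sea surface temperature signal that carries predictive information about streamflow years later. All variable names and numbers are invented.

```python
# Toy illustration of multi-year predictability from a slow ocean signal.
# Synthetic data only; this is not the forecasting tool described in the study.

import numpy as np

rng = np.random.default_rng(42)
n_years, lag = 60, 3   # lag = forecast lead time, in years

# Slowly varying ocean signal ("memory") and a noisy streamflow response to it
sst = np.cumsum(rng.normal(0, 0.1, n_years))
flow = 0.8 * np.concatenate([np.full(lag, np.nan), sst[:-lag]]) + rng.normal(0, 0.3, n_years)

# Fit a lagged linear relation on the first 40 years, then forecast the remainder
slope, intercept = np.polyfit(sst[:40 - lag], flow[lag:40], 1)
forecast = slope * sst[40 - lag:n_years - lag] + intercept
skill = np.corrcoef(forecast, flow[40:])[0, 1]
print(f"correlation of 3-year-lead forecast with 'observed' flow: {skill:.2f}")
```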

The study's lead author, Yoshimitsu Chikamoto, assistant professor of earth systems modeling in USU's PSC department, said the components of the complex climate model include simulations of clouds and aerosols in the atmosphere, land surface characteristics, ocean currents and mixing and sea surface heat and water exchange.

"These predictions can provide a more long-term perspective," Chikamoto said. "So if we know we have a water shortage prediction we need to work with policymakers on allocating those water resources."

Simon Wang, USU professor of climate dynamics, said water managers and forecasters are familiar with El Niño and La Niña and the ocean's connections to weather in the southwestern U.S. However, the upper basin of the Colorado River is not in the southwest and forecasts have not connected the dynamics of parts of the oceans with the Colorado River as the new forecasting tool does.

Matt Yost, PSC assistant professor and USU Extension agroclimate specialist, said having a two-year lead-time on preparing for drought could have a huge impact on farmers as they plan crop rotations and make other business decisions.

Co-author Larissa Yocom, assistant professor of fire ecology in USU's Department of Wildland Resources, said a tool that can provide a long-term forecast of drought in areas impacted by the Colorado River could give managers a jump-start in preparing for wildland fire seasons.

Wang said Utah Climate Center researchers have developed models of drought cycles in the region and have recently studied the dynamics of river flows and shrinking water levels in the Great Salt Lake.

"In doing that work, we know that water managers don't have tools to forecast Colorado River flows very long into the future and that is a constraint on what they can do," Wang said. "We have built statistical models in the past, and Yoshi (Chikamoto) has expertise and in-depth knowledge of ocean dynamics so we talked about giving this idea a try because we found nothing in the literature to model these dynamics in the upper basin."

"Using our tool we can develop an operational forecast of the Colorado River's water supply," Chikamoto added

Credit: 
Utah State University

A dance of histones silences transposable elements in pluripotent stem cells

image: The mechanism of a peculiar type of heterochromatin, used by embryonic stem cells to silence 'parasitic' DNA-elements within the context of their highly dynamic pluripotent chromatin.

Image: 
Simon Elsässer

So-called transposons are abundant DNA-elements found in every eukaryotic organism as a consequence of their ability to jump and multiply within the host genome. Their activity represents a threat to the integrity of the host genome and thus the host cell engages a number of protective mechanisms to silence the expression of transposons. It is known that some of these mechanisms fail in cancer cells and also ageing cells, leading to a mobilization of transposons with largely unknown consequences. Histones, the proteins that package the genome in the eukaryotic nucleus, are key to the most fundamental line of defense to transposons. By forming a highly compacted array, so-called heterochromatin, they render the associated DNA sequence inert to being read and expressed. Heterochromatin is defined by characteristic modifications to histone proteins and DNA, such as histone H3 K9 trimethylation and DNA CpG methylation.

Simon Elsässer's team studied endogenous retroviral elements (ERVs), a particularly active and abundant family of transposable elements in the mouse genome, which are in fact remnants of once-active viruses. Curiously, while they found all the hallmarks of heterochromatin to be employed in the silencing mechanism, ERV chromatin was highly enriched in a histone variant, termed histone H3.3, which has previously been invariably associated with active regions of the genome. Following up on this observation, the team could elucidate an unexpected mechanism involving a continuous loss of 'old' histones and replenishment with newly synthesized histone H3.3 molecules. By genetic manipulation, the team was able to deduce a mechanism explaining this dynamic process: the ATP-dependent chromatin remodeler Smarcad1 evicts histones within heterochromatin, thus creating gaps in the chromatin fibre that could render parts of the ERV gene accessible. In its wake, the histone chaperone DAXX seals these gaps by facilitating reassembly of nucleosomes with histone variant H3.3.

"The concerted process of eviction of one and deposition of another histone is so smooth and efficient that it leaves no apparent trace of accessible DNA. Without a close look at the dynamics of histones within the chromatin fibre, we would have never noticed the phenomenon" says Simon Elsässer.

The result is puzzling because active remodeling and nucleosome eviction would be expected to counteract a compacted chromatin structure that is inert to transcriptional activation. But the team believes that dynamic heterochromatin is an adaptation of a ubiquitous silencing mechanism to the specific requirements of a pluripotent chromatin state. The highly transient opening of heterochromatin may allow sequence-specific co-repressors to find their target DNA sequence within the transposable element, in turn recruiting more repressive factors to propagate and amplify the silent state.

Driver mutations and dysregulation of DAXX, H3.3 and Smarcad1, respectively, have been observed in various cancer types. The new study adds fresh indications that reactivation of silenced transposable elements may play a role in their tumorigenesis.

Credit: 
Science For Life Laboratory

Future ocean conditions could cause significant physical changes in marine mussels

image: Marine mussels are commonly used to monitor water quality in coastal areas

Image: 
University of Plymouth

The increased temperature and acidification of our oceans over the next century could cause significant physical changes in an economically important marine species, new research suggests.

Scientists from the University of Plymouth exposed blue mussels (Mytilus edulis) to current and future levels of ocean acidification (OA) or warming (W), as well as both together - commonly known as OAW.

Initial comparison of mussel shells showed that warming alone led to increased shell growth, but increasing warming and acidification led to decreased shell growth indicating that OA was dissolving their shells.

However, analysis using cutting edge electron microscopy of the shell crystal matrix or 'ultrastructure' revealed that, in fact, warming alone has the potential to significantly alter the physical properties of the mussels' shells, whereas acidification mitigated some of the negative effects.

Mussels grown under warming exhibited changes in their crystal structures including a propensity for increased brittleness, which would place mussels under greater threat from their many predators including crabs and starfish.

These negative effects were to some degree mitigated under acidified conditions with mussel shells showing evidence of repair, even though their crystals grew differently to the norm.

The study, published in a Frontiers in Marine Science special issue titled Global Change and the Future Ocean, is the latest research by the University into the potential effects of ocean warming and acidification on marine species.

Previous projects have suggested future conditions could significantly reduce the nutritional qualities of oysters, as well as dissolve the shells of sea snails and reduce their overall size by around a third.

Dr Antony Knights, Associate Professor in Marine Ecology and the study's lead author, said: "By the end of the century, we are predicted to see increases in sea surface temperature of 2-4°C and at least a doubling of atmospheric CO2. It is no surprise that would have an effect on marine species, but this research is surprising in that acidification appears to mitigate changes in shell structure attributable to rising sea temperatures, which is counter to what we would have predicted. It may be that increased CO2 in the water is providing more 'raw material' for the mussels to repair their shells that is not available under just warming conditions."

Dr Natasha Stephen, Director of Plymouth Electron Microscopy Centre, added: "Until now, there have been relatively few studies assessing the combined effects of ocean acidification and warming on shell structures. However, understanding the changes that might result at a microscopic level may provide important insights into how organisms will respond to future climate change. This study shows it can certainly have negative effects but also that they are not always predictable, which presents some serious challenges when it comes to trying to disentangle the consequences of climate change."

Credit: 
University of Plymouth

New research provides fresh hope for children suffering from rare muscle diseases

image: Metformin rescues muscle function in BAG3 myofibrillar myopathy models

Image: 
Taylor & Francis

Results of an international study published today in Autophagy and led by researchers from Monash University's School of Biological Sciences provide renewed hope for children suffering from a progressive and devastating muscle disease.

Stephen Greenspan and Laura Zah were devastated when they learned their son Alexander had a rare genetic mutation, which causes a deadly neuromuscular disease with no known treatment or cure.

But the results of an international study published today in Autophagy and led by researchers from Monash University's School of Biological Sciences provide renewed hope for children suffering from the progressive and devastating muscle disease. Known as myofibrillar myopathies, these rare genetic diseases lead to progressive muscle wasting, affecting muscle function and causing weakness.

Using the tiny zebrafish, Associate Professor Robert Bryson-Richardson from the School of Biological Sciences and his team of researchers were able to show that a defect in protein quality control contributes to the symptoms of the diseases.

"We tested 75 drugs that promote the removal of damaged proteins in our zebrafish model and identified nine that were effective" explained first author Dr Avnika Ruparelia, who completed her student and post-doctoral training in the team working on the disease. "Importantly two of these are already approved for human use in other conditions."

"We found that one of the drugs, metformin, which is normally used to treat diabetes, removed the accumulating damaged protein in the fish, prevented muscle disintegration and restored their swimming ability," said Associate Professor Bryson-Richardson, who led the study.

The most severe form of myofibrillar myopathy, caused by a mutation in the gene BAG3, starts to affect children between 6 and 8 years of age. The disease is usually fatal before the age of 25 due to respiratory or cardiac failure.

In the case of Alexander (who was born in 2003) clinicians were able to draw on the study's information to prescribe metformin - which is so far proving positive.

"Initially, we were devastated by our son's diagnosis. Alexander has a rare mutation that causes a deadly neuromuscular disease. No treatment or cure was known. In desperation we formed the charitable organization, Alexander's Way, to promote and sponsor research into this disease. Upon learning of our awful problem, A/Prof Bryson-Richardson was compassionate, and found a way to share with us his pre-publication results about the disease and metformin. The research conducted by Robert Bryson-Richardson and Avnika Ruparelia has given us hope, and we thank them deeply for their work and compassion," said Alexander's father, Stephen.

"This is a wonderful outcome, as initially we thought that because of the rarity of the mutation, it was unlikely that there would ever be a treatment or therapeutic intervention available," said Alexander's mother, Laura Zah. "Compared to previous case studies, the progression of our son's disease has been slower, likely due to metformin. Another boy, Marco, who is affected by this disease also takes metformin, and is presently judged by his mother to be stable. Metformin may have given us more time with our boys and more time to work for a cure."

Associate Professor Bryson-Richardson said the repurposing of existing drugs provided a very rapid route to clinical use, as there was already existing safety data for the drug. This is especially important for these rare diseases as the patient numbers are low, meaning it might not be possible to do clinical trials with novel drugs.

"We have identified metformin as a strong candidate to treat BAG3 myofibrillar myopathy, and also myofibrillar myopathy due to mutations in other genes (we showed similar defects in protein quality control in three other forms) and in cardiomyopathy due to mutations in BAG3," he said.

"Given that metformin is taken by millions of people for diabetes and known to be very safe this makes clinical translation highly feasible, and in fact many patients are now taking it."

Stephen Greenspan and Laura Zah are the founders of the charitable organisation Alexander's Way Research Fund, which they established to promote and sponsor research into myofibrillar myopathies.

"The research conducted by Monash scientists has given us hope, and we thank them deeply for their compassion - they have given us time," said Laura Zah.

Credit: 
Taylor & Francis Group

Ice melt projections may underestimate Antarctic contribution to sea level rise

image: Thwaites Glacier, Antarctica, pictured in 2019.

Image: 
NASA

Fluctuations in the weather can have a significant impact on melting Antarctic ice, and models that do not include this factor can underestimate the ice sheet's contribution to global sea level rise, according to Penn State scientists.

"We know ice sheets are melting as global temperatures increase, but uncertainties remain about how much and how fast that will happen," said Chris Forest, professor of climate dynamics at Penn State. "Our findings shed new light on one area of uncertainty, suggesting climate variability has a significant impact on melting ice sheets and sea level rise."

While it is understood that continued warming may cause rapid ice loss, models that predict how Antarctica will respond to climate change have not included the potential impacts of internal climate variability, like yearly and decadal fluctuations in the climate, the team of scientists said.

Accounting for climate variability caused models to predict an additional 2.7 to 4.3 inches -- 7 to 11 centimeters -- of sea level rise by 2100, the scientists recently reported in the journal Climate Dynamics. The models projected roughly 10.6 to 14.9 inches -- 27 to 38 centimeters -- of sea level rise during that same period without climate variability.

"That increase alone is comparable to the amount of sea level rise we have seen over the last few decades," said Forest, who has appointments in the departments of meteorology and atmospheric science and geosciences. "Every bit adds on to the storm surge, which we expect to see during hurricanes and other severe weather events, and the results can be devastating."

The Antarctic ice sheet is a complex system, and modeling how it will evolve under future climate conditions requires thousands of simulations and large amounts of computing power. Because of this, modelers test how the ice will respond using a mean temperature found by averaging the results of climate models.

However, that process smooths out peaks caused by climate variability and reduces the average number of days above temperature thresholds that can impact the ice sheet melt, creating a bias in the results, the scientists said.

"If we include variability in the simulations, we are going to have more warm days and more sunshine, and therefore when the daily temperature gets above a certain threshold it will melt the ice," Forest said. "If we're just running with average conditions, we're not seeing these extremes happening on yearly or decadal timescales."

To study the effects of internal climate variability, the researchers analyzed two large ensembles of climate simulations. Large ensembles are generated by starting each member with slightly different initial conditions. The chaotic nature of the climate system causes each member to yield slightly different responses, and this represents internally generated variability, the scientists said.

Instead of averaging the results of each ensemble, the scientists fed the atmospheric and oceanic data representing this variability into a three-dimensional Antarctic ice sheet model. They found atmospheric variations had a larger and more immediate impact on the ice sheet, but ocean variability was also a significant factor.

Extensive parts of the ice sheet are in contact with ocean water, and previous studies have suggested that warming oceans could cause large chunks to break away. The process may expose ice cliffs so tall that they collapse under their own weight, inducing a domino effect that further depletes the ice shelf.

The scientists found that model simulations omitting the effects of internal climate variability delayed the simulated retreat of the ice sheet by up to 20 years and underestimated future sea level rise.

"This additional ice melt will impact the hurricane storm surges across the globe. Additionally, for years, the IPCC reports have been looking at sea level rise without considering this additional variability and have been underestimating what the impact may be," Forest said. "It's important to better understand these processes contributing to the additional ice loss because the ice sheets are melting much faster than we expected."

Credit: 
Penn State

Oldest monkey fossils outside of Africa found

image: Reconstruction of M. pentelicus from Shuitangba by Mauricio Antón

Image: 
Mauricio Antón

Three fossils found in a lignite mine in southeastern Yunnan Province, China, are about 6.4 million years old, indicate that monkeys existed in Asia at the same time as apes, and are probably the ancestors of some of the modern monkeys in the area, according to an international team of researchers.

"This is significant because they are some of the very oldest fossils of monkeys outside of Africa," said Nina G. Jablonski, Evan Pugh University Professor of Anthropology, Penn State. "It is close to or actually the ancestor of many of the living monkeys of East Asia. One of the interesting things from the perspective of paleontology is that this monkey occurs at the same place and same time as ancient apes in Asia."

The researchers, who included Jablonski and long-time collaborator Xueping Ji of the department of paleoanthropology, Yunnan Institute of Cultural Relics and Archaeology, Kunming, China, studied the fossils unearthed from the Shuitangba lignite mine, a site that has yielded many fossils. They report in a recent issue of the Journal of Human Evolution that "The mandible and proximal femur were found in close proximity and are probably of the same individual." A left calcaneus -- heel bone -- belonging to the same species of monkey, Mesopithecus pentelicus, was uncovered slightly lower at the site and is reported by Dionisios Youlatos of Aristotle University of Thessaloniki, Greece, in another paper online in the journal.

"The significance of the calcaneus is that it reveals the monkey was well adapted for moving nimbly and powerfully both on the ground and in the trees," said Jablonski. "This locomotor versatility no doubt contributed to the success of the species in dispersing across woodland corridors from Europe to Asia."

The lower jawbone and upper portion of the leg bone indicate that the individual was female, according to the researchers. They suggest that these monkeys were probably "jacks of all trades" able to navigate in the trees and on land. The teeth indicate they could eat a wide variety of plants, fruits and flowers, while apes eat mostly fruit.

"The thing that is fascinating about this monkey, that we know from molecular anthropology, is that, like other colobines (Old World monkeys), it had the ability to ferment cellulose," said Jablonski. "It had a gut similar to that of a cow."

These monkeys are successful because they can eat low-quality, cellulose-rich food and obtain sufficient energy by fermenting it and using the fatty acids that the gut bacteria subsequently make available. A similar pathway is used by ruminant animals like cows, deer and goats.

"Monkeys and apes would have been eating fundamentally different things," said Jablonski. "Apes eat fruits, flowers, things easy to digest, while monkeys eat leaves, seeds and even more mature leaves if they have to. Because of this different digestion, they don't need to drink free water, getting all their water from vegetation."

These monkeys do not have to live near bodies of water and can survive periods of dramatic climatic change.

"These monkeys are the same as those found in Greece during the same time period," said Jablonski. "Suggesting they spread out from a center somewhere in central Europe and they did it fairly quickly. That is impressive when you think of how long it takes for an animal to disperse tens of thousands of kilometers through forest and woodlands."

While there is evidence that the species began in Eastern Europe and moved out from there, the researchers say the exact patterns are unknown, but they do know the dispersal was rapid, in evolutionary terms. During the end of the Miocene when these monkeys were moving out of Eastern Europe, apes were becoming extinct or nearly so, everywhere except in Africa and parts of Southeast Asia.

"The late Miocene was a period of dramatic environmental change," said Jablonski. "What we have at this site is a fascinating snapshot of the end of the Miocene -- complete with one of the last apes and one of the new order of monkeys. This is an interesting case in primate evolution because it testifies to the value of versatility and adaptability in diverse and changing environments. It shows that once a highly adaptable form sets out, it is successful and can become the ancestral stock of many other species."

Credit: 
Penn State

Graphene microbubbles make perfect lenses

image: In situ optical microscopic images showing the process of the microbubble generation and elimination.

Image: 
H. Lin et al., doi 10.1117/1.AP.2.5.055001.

Tiny bubbles can solve large problems. Microbubbles--around 1-50 micrometers in diameter--have widespread applications. They're used for drug delivery, membrane cleaning, biofilm control, and water treatment. They've been applied as actuators in lab-on-a-chip devices for microfluidic mixing, ink-jet printing, and logic circuitry, and in photonics lithography and optical resonators. And they've contributed remarkably to biomedical imaging and applications like DNA trapping and manipulation.

Given the broad range of applications for microbubbles, many methods for generating them have been developed, including air stream compression to dissolve air into liquid, ultrasound to induce bubbles in water, and laser pulses to expose substrates immersed in liquids. However, these bubbles tend to be randomly dispersed in liquid and rather unstable.

According to Baohua Jia, professor and founding director of the Centre for Translational Atomaterials at Swinburne University of Technology, "For applications requiring precise bubble position and size, as well as high stability--for example, in photonic applications like imaging and trapping--creation of bubbles at accurate positions with controllable volume, curvature, and stability is essential." Jia explains that, for integration into biological or photonic platforms, it is highly desirable to have well controlled and stable microbubbles fabricated using a technique compatible with current processing technologies.

Balloons in graphene

Jia and fellow researchers from Swinburne University of Technology recently teamed up with researchers from National University of Singapore, Rutgers University, University of Melbourne, and Monash University, to develop a method to generate precisely controlled graphene microbubbles on a glass surface using laser pulses. Their report is published in the peer-reviewed, open-access journal, Advanced Photonics.

The group used graphene oxide materials, which consist of graphene film decorated with oxygen functional groups. Gases cannot penetrate graphene oxide, so the researchers used a laser to locally irradiate the graphene oxide film, generating gases that are encapsulated inside the film to form microbubbles--like balloons. Han Lin, Senior Research Fellow at Swinburne University and first author on the paper, explains, "In this way, the positions of the microbubbles can be well controlled by the laser, and the microbubbles can be created and eliminated at will. In the meantime, the amount of gases can be controlled by the irradiating area and irradiating power. Therefore, high precision can be achieved."

Such a high-quality bubble can be used for advanced optoelectronic and micromechanical devices with high precision requirements.

The researchers found that the high uniformity of the graphene oxide films creates microbubbles with a perfect spherical curvature that can be used as concave reflective lenses. As a showcase, they used the concave reflective lenses to focus light. The team reports that the lens produces a high-quality, well-shaped focal spot that can be used as a light source for microscopic imaging.

Lin explains that the reflective lenses are also able to focus light at different wavelengths at the same focal point without chromatic aberration. The team demonstrates the focusing of ultrabroadband white light, covering the visible to near-infrared range, with the same high performance, which is particularly useful in compact microscopy and spectroscopy.
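The achromatic behavior Lin describes is a general property of reflective optics: a spherical mirror's paraxial focal length depends only on its radius of curvature (f = R/2), not on wavelength, whereas a refractive lens focuses different colors at different distances because the glass index is dispersive. A minimal sketch, using an illustrative radius and a made-up dispersion model rather than the paper's parameters:

```python
# Toy comparison: wavelength-independent mirror focus vs. dispersive lens focus.
# All numbers are illustrative, not taken from the Advanced Photonics paper.

def mirror_focal_length(radius_m):
    """Paraxial focal length of a spherical concave reflector: f = R / 2."""
    return radius_m / 2.0

def singlet_focal_length(radius1_m, radius2_m, wavelength_um):
    """Thin-lens (lensmaker) focal length with a hypothetical Cauchy index model."""
    n = 1.50 + 0.004 / wavelength_um**2
    return 1.0 / ((n - 1.0) * (1.0 / radius1_m - 1.0 / radius2_m))

R = 50e-6  # 50-micrometre radius of curvature, chosen only for illustration
for wl in (0.45, 0.55, 0.80):  # blue, green, near-infrared (micrometres)
    print(f"{wl} um: mirror f = {mirror_focal_length(R) * 1e6:.1f} um, "
          f"lens f = {singlet_focal_length(R, -R, wl) * 1e6:.1f} um")
```

The mirror's focal length stays fixed across the band, while the refractive singlet's drifts with wavelength, which is why a reflective microbubble lens can focus broadband white light to a single spot.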

Jia remarks that the research provides "a pathway for generating highly controlled microbubbles at will and integration of graphene microbubbles as dynamic and high precision nanophotonic components for miniaturized lab-on-a-chip devices, along with broad potential applications in high resolution spectroscopy and medical imaging."

Credit: 
SPIE--International Society for Optics and Photonics