Tech

NASA-NOAA satellite finds Hurricane Delta rapidly intensifying

image: On Oct. 6 at 3:06 a.m. EDT (0706 UTC) NASA-NOAA's Suomi NPP satellite analyzed Hurricane Delta's cloud top temperatures and found that the strongest storms (yellow) were around Delta's center of circulation and in a band of thunderstorms south of the center. Temperatures in those areas were as cold as minus 80 degrees Fahrenheit (minus 62.2 degrees Celsius). Strong storms (red) with cloud top temperatures as cold as minus 70 degrees Fahrenheit (minus 56.7 degrees Celsius) surrounded both of those areas.

Image: 
NASA/NRL

Infrared imagery from NASA-NOAA's Suomi NPP satellite revealed that Hurricane Delta was rapidly growing stronger and more powerful. The imagery showed powerful thunderstorms circling the eye of the hurricane and its southern quadrant as it moved through the Caribbean Sea on Oct. 6.

At 11:20 a.m. EDT on Oct. 6, NOAA's National Hurricane Center (NHC) received data from a NOAA hurricane hunter aircraft indicating that Delta had rapidly strengthened into a Category 4 hurricane.

Infrared Imagery Reveals a More Powerful Delta

One of the ways NASA researches tropical cyclones is by using infrared data, which provide temperature information. Cloud top temperatures identify where the strongest storms are located: the stronger the storms, the higher they extend into the troposphere, and the colder their cloud tops.
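The temperature conversions quoted throughout this report follow the standard formula C = (F - 32) × 5/9; a minimal sketch:

```python
def fahrenheit_to_celsius(temp_f):
    """Convert a temperature from degrees Fahrenheit to degrees Celsius."""
    return (temp_f - 32.0) * 5.0 / 9.0

# Cloud-top temperatures quoted in the article
print(round(fahrenheit_to_celsius(-80), 1))  # -62.2
print(round(fahrenheit_to_celsius(-70), 1))  # -56.7
```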

On Oct. 6 at 3:06 a.m. EDT (0706 UTC) NASA-NOAA's Suomi NPP satellite analyzed Hurricane Delta's cloud top temperatures using the Visible Infrared Imaging Radiometer Suite or VIIRS instrument. At the time, Delta was a Category 1 hurricane with maximum sustained winds near 85 mph (140 kph). By 5 a.m. EDT, maximum sustained winds strengthened to 100 mph (155 kph). The storm continued to intensify rapidly during the morning hours.

VIIRS found the strongest storms were around Delta's center of circulation and in a band of thunderstorms south of the center. Temperatures in those areas were as cold as minus 80 degrees Fahrenheit (minus 62.2 degrees Celsius). Strong storms with cloud top temperatures as cold as minus 70 degrees Fahrenheit (minus 56.7 degrees Celsius) surrounded both of those areas.

NASA research has shown that cloud top temperatures that cold indicate strong storms that have the capability to create heavy rain. NASA provides data to tropical cyclone meteorologists so they can incorporate it in their forecasts.

Warnings and Watches on Oct. 6

A Hurricane Warning is in effect from Tulum to Dzilam, Mexico and for Cozumel. A Tropical Storm Warning is in effect for the Cayman Islands including Little Cayman and Cayman Brac and for the Cuban province of Pinar del Rio and the Isle of Youth. A Tropical Storm Warning is also in effect from Punta Herrero to Tulum, Mexico and from Dzilam to Progreso, Mexico. A Tropical Storm Watch is in effect for the Cuban province of La Habana.

Delta's Status on Oct. 6

At 11:20 a.m. EDT, data from a NOAA Hurricane Hunter aircraft indicated that Delta was continuing to strengthen rapidly. The maximum winds had increased to near 130 mph (215 kph) with higher gusts, making Delta a Category 4 hurricane on the Saffir-Simpson Hurricane Wind Scale.
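The Saffir-Simpson Hurricane Wind Scale referenced above assigns categories by one-minute sustained wind speed; a minimal sketch of the standard NHC thresholds:

```python
def saffir_simpson_category(wind_mph):
    """Return the Saffir-Simpson category for a one-minute sustained
    wind speed in mph (0 means below hurricane strength)."""
    for threshold, category in [(157, 5), (130, 4), (111, 3), (96, 2), (74, 1)]:
        if wind_mph >= threshold:
            return category
    return 0

print(saffir_simpson_category(85))   # 1 -- Delta early on Oct. 6
print(saffir_simpson_category(130))  # 4 -- Delta at 11:20 a.m. EDT
```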

Delta was centered near latitude 18.2 degrees north and longitude 82.7 degrees west, about 315 miles (510 km) east-southeast of Cozumel, Mexico.  Delta is moving to the west-northwest at 16 mph (26 kph) and has a minimum central pressure near 954 millibars.

NHC Key Messages for Delta

The NHC issued several key messages about Delta today, Oct. 6:

STORM SURGE:  An extremely dangerous storm surge will raise water levels by as much as 6 to 9 feet above normal tide levels along the coast of the Yucatan Peninsula within the hurricane warning area, near and to the right of where the center makes landfall. Near the coast, the surge will be accompanied by large and destructive waves.

WIND:  Tropical storm conditions are expected in portions of the Cayman Islands today.  In the Yucatan Peninsula, hurricane conditions are expected in the warning area early Wednesday, with tropical storm conditions beginning later today or tonight. Tropical storm conditions are expected in the tropical storm warning area tonight and Wednesday. In Cuba, tropical storm conditions are expected tonight in the warning area and are possible in the watch area around the same time.

RAINFALL:  Delta is expected to produce 4 to 6 inches of rain, with isolated maximum totals of 10 inches, across portions of the northern Yucatan Peninsula through midweek. This rainfall may result in areas of significant flash flooding.

Over the next few days, Delta is expected to produce 2 to 4 inches of rain, with isolated higher amounts, across portions of the Cayman Islands and western Cuba. This rainfall may result in areas of flash flooding and mudslides.

Delta's Forecast

NHC noted, "A slower northwestward to north-northwest motion is forecast to begin by late Wednesday or Wednesday night. On the forecast track, the center of Delta is expected to continue to pass southwest of the Cayman Islands through early this afternoon, and move over the northeastern portion of the Yucatan peninsula late tonight or early Wednesday.  Delta is forecast to move over the southern Gulf of Mexico Wednesday afternoon, and be over the southern or central Gulf of Mexico through Thursday."

What Happened to Gamma's Remnants?

The remnant low-pressure area of Gamma was located off the north coast of Yucatan at 8 a.m. EDT. Gamma has since moved inland and is forecast to dissipate on Wednesday.

Credit: 
NASA/Goddard Space Flight Center

NASA catches development of Tropical Storm Norbert as Marie declines

image: NASA-NOAA's Suomi NPP satellite captured a visible image of the development of Tropical Storm Norbert near the coast of southwestern Mexico on Oct. 5 at 5:55 p.m. EDT (2155 UTC). Meanwhile, (top left), Tropical Storm Marie continues tracking toward the Central Pacific Ocean.

Image: 
NASA Worldview, Earth Observing System Data and Information System (EOSDIS)

NASA-NOAA's Suomi NPP satellite passed over the Eastern Pacific Ocean and captured the birth of a depression that became Tropical Storm Norbert, while Marie continued weakening as it headed toward the Central Pacific.

Tropical Depression 19E formed well offshore of southwestern Mexico on Oct. 5, and at 5:55 p.m. EDT (2155 UTC) visible imagery from the Visible Infrared Imaging Radiometer Suite (VIIRS) instrument aboard Suomi NPP helped confirm the development. VIIRS showed the low-pressure area had become better defined than it was during the previous day. The image was generated at NASA's Goddard Space Flight Center in Greenbelt, Md., using the NASA Worldview application.

NOAA's National Hurricane Center (NHC) noted, "The associated deep convection has also become more organized and convection has persisted over the low-level center since early this morning. In addition, a banding feature has also developed over the western portion of the circulation. Based on these trends, advisories have been initiated for Tropical Depression 19E."

The same VIIRS visible image also caught a weakening Tropical Storm Marie as it continued toward the Central Pacific Ocean. The Suomi NPP image showed that deep convection and building thunderstorms associated with Marie had all but dissipated and what was left of it was located over 120 nautical miles away from the exposed low-level center of the cyclone (as a result of wind shear).

By 5 a.m. EDT on Oct. 6, Tropical Depression 19E strengthened into a tropical storm and was named Norbert. At 11 a.m. EDT, Marie was barely hanging onto tropical storm status and fading quickly.

Norbert's Status on Oct. 6

At 11 a.m. EDT (1500 UTC) on Oct. 6, the center of Tropical Storm Norbert was located near latitude 14.2 degrees north and longitude 106.6 degrees west. That is 365 miles (585 km) south-southwest of Manzanillo, Mexico. Norbert is moving toward the northwest near 7 mph (11 kph). A slower northwestward motion is expected until tonight. The system is forecast to meander thereafter through midweek. Maximum sustained winds are near 45 mph (75 kph) with higher gusts. Some slow strengthening is possible over the next few days. The estimated minimum central pressure is 1002 millibars.
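Distances like the "365 miles (585 km) south-southwest of Manzanillo" above are great-circle distances between the storm center and a reference point; a sketch using the haversine formula (the coordinates used here for Manzanillo are approximate, not taken from the advisory):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two points given in decimal degrees."""
    r = 6371.0  # mean Earth radius, km
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Norbert's center vs. Manzanillo, Mexico (approx. 19.05 N, 104.32 W)
d = haversine_km(14.2, -106.6, 19.05, -104.32)
print(round(d))  # roughly 590 km, close to the advisory's 585 km
```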

Marie's Fading Status on Oct. 6

At 11 a.m. EDT (1500 UTC), the center of Tropical Storm Marie was located near latitude 22.1 degrees north and longitude 135.1 degrees west. Marie is moving toward the west-northwest near 9 mph (15 kph), and this general motion with some decrease in forward speed is expected during the next day or so, followed by a turn toward the west late Wednesday or early Thursday.

Maximum sustained winds have decreased to near 45 mph (75 kph) with higher gusts. Gradual weakening is forecast during the next 48 hours, and Marie is forecast to become a remnant low-pressure area by tonight and a trough of low pressure in a few days.

Credit: 
NASA/Goddard Space Flight Center

Do eyeglasses help keep coronavirus out? Johns Hopkins expert says more evidence needed

During the current pandemic, we've all been advised to protect ourselves from infection by the SARS-CoV-2 virus that causes COVID-19 by masking, physical distancing and frequent hand-washing. In the Sept. 17 issue of JAMA Ophthalmology, a research team in China suggests that a fourth defensive measure also might be helpful: eye protection.

However, according to an infectious disease expert at Johns Hopkins Medicine, the team's findings don't yet mean everyone should don a pair of Clark Kent spectacles to enhance their "superpowers" during a coronavirus attack.

In their paper, published online Sept. 16, Weibiao Zeng, M.S., at the Second Affiliated Hospital of Nanchang University, and colleagues at three other Chinese medical institutions describe a retrospective study of 276 people in China's Hubei Province who tested positive for the SARS-CoV-2 virus at the beginning of the pandemic. The researchers found that the proportion of patients who wore eyeglasses more than eight hours per day was significantly lower than in the general population.

From these data, the researchers claim that wearing eyeglasses more than a third of the day may provide some protection against SARS-CoV-2 infection, and that eyeglasses may act as a partial barrier to help keep people from touching their eyes.

"The findings, although intriguing, should not be considered as conclusive proof that the general public should begin wearing face shields, goggles or other ocular personal protective equipment -- along with wearing masks and not touching their eyes -- to obtain any substantial protection from SARS-CoV-2 infection," says Lisa Maragakis, M.D., M.P.H., senior director of infection prevention at the Johns Hopkins Health System, associate professor of medicine at the Johns Hopkins University School of Medicine and author of a commentary on the study that appears in the same issue of JAMA Ophthalmology.

Maragakis says there are several reasons for her caution.

"The study looks at a time very early in the pandemic before universal masking and physical distancing became common prevention practices. There may be confounding variables or an alternate explanation for the apparent protective effect of eyeglasses, and the data on the general population -- against which the eyeglasses-wearing habits of the study patients are compared -- were collected years ago in a different region of China," she explains.

However, Maragakis says more studies -- using data from both people who previously had COVID-19 and from patients newly diagnosed with the disease -- would be valuable to confirm the study's findings and to better define any benefit for the general public by adding eye protection as a defensive practice.

Maragakis is available to discuss this topic with the media.

Credit: 
Johns Hopkins Medicine

Two's a crowd: Nuclear and renewables don't mix

image: If countries want to lower emissions as substantially, rapidly and cost-effectively as possible, they should prioritize support for renewables rather than nuclear power, a major new energy study concludes.

Image: 
University of Sussex

If countries want to lower emissions as substantially, rapidly and cost-effectively as possible, they should prioritize support for renewables, rather than nuclear power.

That's the finding of a new analysis of 123 countries over 25 years by the University of Sussex Business School and the ISM International School of Management, which reveals that nuclear energy programmes around the world tend not to deliver sufficient carbon emission reductions and so should not be considered an effective low-carbon energy source.

Researchers found that, unlike renewables, countries with larger-scale national nuclear programmes do not tend to show significantly lower carbon emissions - and in poorer countries nuclear programmes actually tend to be associated with relatively higher emissions.

Published today in Nature Energy, the study reveals that nuclear and renewable energy programmes do not tend to co-exist well together in national low-carbon energy systems but instead crowd each other out and limit effectiveness.

Benjamin K. Sovacool, Professor of Energy Policy in the Science Policy Research Unit (SPRU) at the University of Sussex Business School, said: "The evidence clearly points to nuclear being the least effective of the two broad carbon emissions abatement strategies, and coupled with its tendency not to co-exist well with its renewable alternative, this raises serious doubts about the wisdom of prioritising investment in nuclear over renewable energy. Countries planning large-scale investments in new nuclear power are risking suppression of greater climate benefits from alternative renewable energy investments."

The researchers, using World Bank and International Energy Agency data covering 1990-2014, found that nuclear and renewables tend to exhibit lock-ins and path dependencies that crowd each other out, identifying a number of ways in which a combined nuclear and renewable energy mix is incompatible.

These include the configuration of electricity transmission and distribution systems: a grid structure optimized for larger-scale centralized power production, such as conventional nuclear, will make it more challenging, time-consuming and costly to introduce small-scale distributed renewable power.

Similarly, finance markets, regulatory institutions and employment practices structured around large-scale, base-load, long-lead time construction projects for centralized thermal generating plant are not well designed to also facilitate a multiplicity of much smaller short-term distributed initiatives.

Andy Stirling, Professor of Science and Technology Policy at the University of Sussex Business School, said: "This paper exposes the irrationality of arguing for nuclear investment based on a 'do everything' argument. Our findings show not only that nuclear investments around the world tend on balance to be less effective than renewable investments at carbon emissions mitigation, but that tensions between these two strategies can further erode the effectiveness of averting climate disruption."

The study found that in countries with a high GDP per capita, nuclear electricity production does associate with a small drop in CO2 emissions. But in comparative terms, this drop is smaller than that associated with investments in renewable energy.

And in countries with a low GDP per capita, nuclear electricity production clearly associates with CO2 emissions that tend to be higher.

Patrick Schmid, from the ISM International School of Management München, said: "While it is important to acknowledge the correlative nature of our data analysis, it is astonishing how clear and consistent the results are across different time frames and country sets. In certain large country samples the relationship between renewable electricity and CO2-emissions is up to seven times stronger than the corresponding relationship for nuclear."

Credit: 
University of Sussex

Snakes reveal the origin of skin colours

image: The skin of corn snakes (Pantherophis guttatus) has an orange base, decorated with red dorsal and lateral spots circled in black. The species can undergo mutations that lead to variations in skin colour, with the lavender corn snake being pink with grey spots.

Image: 
© UNIGE/ Milinkovitch

The skin colour of vertebrates depends on chromatophores -- cells found in the superficial layers of the epidermis. A team of specialists in genetic determinism and colour evolution in reptiles from the University of Geneva (UNIGE) is studying the wide variety of colours sported by different individuals within the corn snake species. The research, published in the journal PNAS, demonstrates that the dull colour of the lavender variant of corn snake is caused by the mutation of a gene involved in forming lysosomes, the "garbage disposal" vesicles of cells. This single mutation is enough to affect every skin colour, demonstrating that both the reflective crystals and pigments are stored in lysosome-related vesicles. The UNIGE study marks a significant step forward in our understanding of the origin of colours and patterns in the skin of vertebrates.

The chromatophores are the cells that determine skin colour, thanks to the presence of pigments or crystals that reflect light. There are three types of chromatophores: melanophores, which are responsible for the black or brown colour; xanthophores, for red and yellow; and iridophores, with crystals that reflect multiple colours. Mammals only have melanophores, while reptiles and fish carry all three types of chromatophore, meaning they can display a very wide variety of colours and colour patterns. The pigments of melanophores are known to be stored in organelles known as LROs or lysosome-related organelles. These are small intracellular vesicles that have the same origin as lysosomes, the "garbage disposals" that digest the non-functional molecules in cells. On the other hand, the storage location of the red and yellow pigments and crystals in the other types of chromatophore is unknown.

When snakes turn pink

The skin of corn snakes (Pantherophis guttatus) has an orange base, decorated with red dorsal and lateral spots circled in black. The species can undergo mutations that lead to variations in skin colour, with the lavender corn snake being pink with grey spots. The experiments carried out by Athanasia Tzika, a researcher in the Department of Genetics and Evolution in UNIGE's Faculty of Sciences and her doctoral student Asier Ullate-Agote have identified that these altered colours are due to a single mutation pinpointed in the LYST gene, a gene that regulates lysosome trafficking. "It's very long-term work", begins Tzika, "since snakes only have one litter a year. We had to sequence the entire genome of the corn snake and identify all the genes within".

The liver is key

Mutations in the LYST gene in humans cause the Chediak-Higashi syndrome, which is characterised by albinism, an impaired immune system and an accumulation of enlarged lysosomes. The Geneva team continued its study into corn snakes by analysing their hepatocytes, the main liver cells in vertebrates, which contain numerous lysosomes. The scientists found that the hepatocytes of lavender corn snakes contain much larger and more aggregated lysosomes. Using electron microscopy, the authors observed that the morphology and arrangement of coloured vesicles in all the chromatophores were altered.

The result of evolution

Michel Milinkovitch, a professor in UNIGE's Department of Genetics and Evolution, explains further: "By characterising the mutant gene, the study has shown for the first time that the different chromatophores were not created from scratch during evolution but that they all entail a basic mechanism involving LROs". Further studies will provide a better understanding of the mechanisms responsible for the extraordinary variety of skin colours and colour patterns in vertebrates, features that play a part in functions as diverse and essential as camouflage, intraspecific communication, and protection against the harmful effects of solar radiation.

Credit: 
Université de Genève

Nanoparticles can turn off genes in bone marrow cells

Using specialized nanoparticles, MIT engineers have developed a way to turn off specific genes in cells of the bone marrow, which play an important role in producing blood cells. These particles could be tailored to help treat heart disease or to boost the yield of stem cells in patients who need stem cell transplants, the researchers say.

This type of genetic therapy, known as RNA interference, is usually difficult to target to organs other than the liver, where nanoparticles would tend to accumulate. The MIT researchers were able to modify their particles in such a way that they would accumulate in the cells found in the bone marrow.

"If we can get these particles to hit other organs of interest, there could be a broader range of disease applications to explore, and one that we were really interested in this paper was the bone marrow. The bone marrow is a site for hematopoiesis of blood cells, and these give rise to a whole lineage of cells that contribute to various types of diseases," says Michael Mitchell, a former MIT postdoc and one of the lead authors of the study.

In a study of mice, the researchers showed that they could use this approach to improve recovery after a heart attack by inhibiting the release of bone marrow blood cells that promote inflammation and contribute to heart disease.

Marvin Krohn-Grimberghe, a cardiologist at the Freiburg University Heart Center in Germany, and Maximilian Schloss, a research fellow at Massachusetts General Hospital, are also lead authors of the paper, which appears today in Nature Biomedical Engineering. The paper's senior authors are Daniel Anderson, a professor of chemical engineering at MIT and a member of MIT's Koch Institute for Integrative Cancer Research and Institute for Medical Engineering and Science, and Matthias Nahrendorf, a professor of radiology at MGH.

Targeting the bone marrow

RNA interference is a strategy that could potentially be used to treat a variety of diseases by delivering short strands of RNA that block specific genes from being turned on in a cell. So far, the biggest obstacle to this kind of therapy has been the difficulty in delivering it to the right part of the body. When injected into the bloodstream, nanoparticles carrying RNA tend to accumulate in the liver, which some biotech companies have taken advantage of to develop new experimental treatments for liver disease.

Anderson's lab, working with MIT Institute Professor Robert Langer, who is also an author of the new study, has previously developed a type of polymer nanoparticles that can deliver RNA to organs other than the liver. The particles are coated with lipids that help stabilize them, and they can target organs such as the lungs, heart, and spleen, depending on the particles' composition and molecular weight.

"RNA nanoparticles are currently FDA-approved as a liver-targeted therapy but hold promise for many diseases, ranging from Covid-19 vaccines to drugs that can permanently repair disease genes," Anderson says. "We believe that engineering nanoparticles to deliver RNA to different types of cells and organs in the body is key to reaching the broadest potential of genetic therapy."

In the new study, the researchers set out to adapt the particles so that they could reach the bone marrow. The bone marrow contains stem cells that produce many different types of blood cells, through a process called hematopoiesis. Stimulating this process could enhance the yield of hematopoietic stem cells for stem cell transplantation, while repressing it could have beneficial effects on patients with heart disease or other diseases.

"If we could develop technologies that could control cellular activity in bone marrow and the hematopoietic stem cell niche, it could be transformative for disease applications," says Mitchell, who is now an assistant professor of bioengineering at the University of Pennsylvania.

The researchers began with the particles they had previously used to target the lungs and created variants that had different arrangements of a surface coating called polyethylene glycol (PEG). They tested 15 of these particles and found one that was able to avoid being caught in the liver or the lungs, and that could effectively accumulate in endothelial cells of the bone marrow. They also showed that RNA carried by this particle could reduce the expression of a target gene by up to 80 percent.

The researchers tested this approach with two genes that they believed could be beneficial to knock down. The first, SDF1, is a molecule that normally prevents hematopoietic stem cells from leaving the bone marrow. Turning off this gene could achieve the same effect as the drugs that doctors often use to induce hematopoietic stem cell release in patients who need to undergo radiation treatments for blood cancers. These stem cells are later transplanted to repopulate the patient's blood cells.

"If you have a way to knock down SDF1, you can cause the release of these hematopoietic stem cells, which could be very important for a transplantation so you can harvest more from the patient," Mitchell says.

The researchers showed that when they used their nanoparticles to knock down SDF1, they could boost the release of hematopoietic stem cells fivefold, which is comparable to the levels achieved by the drugs that are now used to enhance stem cell release. They also showed that these cells could successfully differentiate into new blood cells when transplanted into another mouse.

"We are very excited about the latest results," says Langer, who is also the David H. Koch Institute Professor at MIT. "Previously we have developed high-throughput synthesis and screening approaches to target the liver and blood vessel cells, and now in this study, the bone marrow. We hope this will lead to new treatments for diseases of the bone marrow like multiple myeloma and other illnesses."

Combatting heart disease

The second gene that the researchers targeted for knockdown is called MCP1, a molecule that plays a key role in heart disease. When MCP1 is released by bone marrow cells after a heart attack, it stimulates a flood of immune cells to leave the bone marrow and travel to the heart, where they promote inflammation and can lead to further heart damage.

In a study of mice, the researchers found that delivering RNA that targets MCP1 reduced the number of immune cells that went to the heart after a heart attack. Mice that received this treatment also showed improved healing of heart tissue following a heart attack.

"We now know that immune cells play such a key role in the progression of heart attack and heart failure," Mitchell says. "If we could develop therapeutic strategies to stop immune cells that originate from bone marrow from getting into the heart, it could be a new means of treating heart attack. This is one of the first demonstrations of a nucleic-acid-based approach of doing this."

At his lab at the University of Pennsylvania, Mitchell is now working on new nanotechnologies that target bone marrow and immune cells for treating other diseases, especially blood cancers such as multiple myeloma.

Credit: 
Massachusetts Institute of Technology

Two-dimensional MXene as a novel electrode material for next-generation display

Researchers in the US and Korea reported the first efficient flexible light-emitting diodes with a two-dimensional titanium carbide MXene as a flexible and transparent electrode. These MXene-based light-emitting diodes (MX-LEDs) with high efficiency and flexibility have been achieved via precise interface engineering, from the synthesis of the material to the application (Advanced Materials, 2020, 2000919).

Flexible displays have been developing at a rapid pace, and the global flexible display market has been expanding quickly over the years. Development of flexible transparent conducting electrodes (TCEs) with outstanding flexibility and electrical conductivity is one of the key requirements for next-generation displays because indium tin oxide (ITO), the conventional TCE, is brittle. Diverse materials such as graphene, conducting polymers and metal nanowires have been suggested, but their insufficient electrical conductivity, low work function and complicated electrode fabrication limit their practical use.

MXenes, a new family of two-dimensional materials

MXenes, a new class of two-dimensional materials discovered at Drexel University in 2011, consist of few-atoms-thick layers of transition metal carbides or nitrides. They have shown impressive properties such as metal-like electrical conductivity and tunable surface and electronic properties, offering new possibilities to various fields of technology. Since their discovery, their use has been explored in a number of areas, such as metal ion batteries, sensors, gas and electrochemical storage, energy devices, catalysts and medicine. MXenes have exhibited potential as flexible electrodes because of their superior flexibility. However, exploration of MXenes as flexible electrodes in optoelectronic devices has only recently begun, because conventional MXene films do not meet the work function and conductivity requirements of LEDs and solar cells and can degrade when exposed to the acidic water-based hole injection layer (HIL).

MXene for flexible LED application

An international team of scientists from Seoul National University and Drexel University, led by Tae-Woo Lee and Yury Gogotsi focused on the surface and interface modulation of the solution-processed MXene films to make an ideal MXene/HIL system. They tuned the surface of the MXene film to have high work function (WF) by low-temperature vacuum annealing and the HIL is designed to be pH-neutral and be diluted with alcohol, preventing detrimental surface oxidation and degradation of the electrode film. The MXene/HIL system suggested by the team provides advantages to the device efficiency due to efficient injection of holes to the emitting layer by forming a nearly ideal Ohmic contact.

Using the MXene/HIL system, the team fabricated high-efficiency green organic LEDs (OLEDs) exceeding 100 cd/A, which agrees well with the theoretical maximum values and is quite comparable with that of conventional ITO-based devices. Finally, flexible MXene-LEDs on a plastic substrate show outstanding bending stability, while the ITO-LEDs could not withstand the bending stress. It is the first report that demonstrates highly efficient OLEDs using a single layer of 2D titanium carbide MXene as a flexible electrode.

This progressive research is published in the prominent journal 'Advanced Materials' (IF: 25.809). The authors explain further: "The results of interface engineered MXene film and the MXene electrode-based flexible organic LEDs show the strong potential of the solution-processed MXene TCE for use in next-generation optoelectronic devices that can be manufactured using a low-cost solution-processing technology."

Credit: 
Seoul National University

New study finds largest population increase among US adult electronic cigarette users is in younger adults who have never smoked combustible cigarettes

ATLANTA - October 5, 2020 - A new study from the American Cancer Society assessed trends between 2014 and 2018 in the prevalence of e-cigarette use and population count of e-cigarette users, according to combustible cigarette smoking histories, in younger (18-29 years), middle-aged (30-49 years), and older (≥50 years) U.S. adults. The study appears in the American Journal of Preventive Medicine.

The most notable finding was an increase in e-cigarette use among younger adult never smokers of combustible cigarettes, whose use nearly tripled (1.3% to 3.3%) between 2014 and 2018, potentially suggesting increasing primary nicotine initiation with e-cigarettes. While this two-percentage-point increase appears modest, combined with a large and growing population of never smokers nationally it represented the largest absolute increase in e-cigarette users: an estimated 0.87 million more never-smoking younger adult users in 2018 (1.35 million) than in 2014 (0.49 million). The authors also note substantial increases in e-cigarette use among near-term quitters (i.e., those who quit combustible cigarettes 1-8 years ago, when e-cigarettes proliferated in the US retail market) across all age groups. This trend suggests continued use of e-cigarette devices among those who may have switched from cigarettes previously, potentially for nicotine maintenance.
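The absolute counts above follow from prevalence multiplied by cohort size. As a rough sketch (the cohort denominators below are hypothetical round numbers chosen only to be consistent with the reported counts, not figures from the study), the arithmetic looks like this:

```python
# Sketch of the prevalence-to-count arithmetic behind the study's estimates.
# The cohort sizes (in millions) are HYPOTHETICAL round numbers; the study's
# exact denominators are not quoted in the article.

def users_millions(prevalence, cohort_millions):
    """Estimated e-cigarette users (millions) = prevalence x cohort size."""
    return prevalence * cohort_millions

users_2014 = users_millions(0.013, 38.0)  # 1.3% of ~38M never smokers (assumed)
users_2018 = users_millions(0.033, 41.0)  # 3.3% of ~41M never smokers (assumed)

print(round(users_2014, 2))               # ~0.49 million
print(round(users_2018, 2))               # ~1.35 million
print(round(users_2018 - users_2014, 2))  # ~0.86 million increase
```

With the study's exact population figures the increase rounds to the reported 0.87 million; the hypothetical denominators here give roughly 0.86 million, the same order.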

"Urgent efforts are needed to address the potential rise in primary nicotine initiation with e-cigarettes among younger adults. It is also important to aid the transition of e-cigarette users--particularly younger adults--to non-use of all tobacco or nicotine products, given that the long-term consequences of e-cigarette use are mostly unknown," said Priti Bandi, PhD, Principal Scientist, Risk Factors Surveillance Research at the American Cancer Society.

Credit: 
American Cancer Society

Diagnosing COVID-19 in just 30 minutes

image: The reaction is composed of four main components: a set of probes, SplintR ligase, T7 RNA polymerase and a fluorogenic dye. In the presence of target RNA, hybridization, ligation, transcription and aptamer-dye binding reactions occur sequentially in a single reaction tube at a constant temperature.

Image: 
POSTECH

The year 2020 can be summarized by a single word - COVID-19 - the culprit that froze the entire world. For more than eight months now, movement between nations has been paralyzed because there are no means to prevent or treat the virus, and diagnosis takes a long time.

In Korea, there are many confirmed cases among those arriving from abroad, but diagnosis does not currently take place at the airport. Overseas visitors can enter the country if they show no symptoms and must then visit the screening clinic nearest to their site of self-isolation on their own. Even then, if the clinic is closed, they have no choice but to return the next day. Naturally, there have been concerns about such visitors leaving their isolation facilities. What if there were a way to diagnose and identify infected patients right at the airport?

A joint research team from the Department of Chemical Engineering at POSTECH - Professor Jeong Wook Lee and Ph.D. candidate Chang Ha Woo, together with Professor Gyoo Yeol Jung and Dr. Sungho Jang - has developed SENSR (SENsitive Splint-based one-pot isothermal RNA detection), a technology that allows anyone to easily and quickly diagnose COVID-19 based on the RNA sequence of the virus.

This technology can diagnose an infection in just 30 minutes, reducing the load on any single testing location and minimizing contact with infected patients. Its biggest benefit is that a diagnostic kit can be developed within a week even if a new infectious disease other than COVID-19 appears.

The PCR molecular test currently used for COVID-19 diagnosis is highly accurate but entails a complex preparation process to extract or refine the virus. It is not suitable for use in small farming or fishing villages, or at airport or drive-thru screening clinics, as it requires expensive equipment as well as skilled experts.

RNA is a nucleic acid that carries genetic information or is involved in controlling the expression of genes. The POSTECH researchers designed the test kit to produce a nucleic acid binding reaction that fluoresces only when COVID-19 RNA is present. The virus can therefore be detected immediately, with high sensitivity and without any preparation process, in a short time - and it is as accurate as the current PCR diagnostic method.

Using this technology, the research team detected the RNA of SARS-CoV-2, the virus that causes COVID-19, from an actual patient sample in about 30 minutes. In addition, they detected RNAs from five other pathogenic viruses and bacteria, demonstrating the kit's usability for pathogens beyond COVID-19.

Another great advantage of SENSR is the ease of creating the diagnostic device, which can be developed into a simple, portable, easy-to-use form.

If this method is introduced, it not only allows onsite diagnosis before going to the screening clinic or being hospitalized, but also allows for a more proactive response to COVID-19 by supplementing the current centralized diagnostic system.

"This method is a fast and simple diagnostic technology which can accurately analyze the RNA without having to treat a patient's sample," commented Professor Jeong Wook Lee. "We can better prepare for future epidemics as we can design and produce a diagnostic kit for new infectious diseases within a week."

Professor Gyoo Yeol Jung added, "The fact that pathogenic RNAs can be detected with high accuracy and sensitivity, and that the diagnosis can be made on the spot, is drawing attention from academia as well as industry." He explained, "We hope to contribute to our response to COVID-19 by enhancing the current testing system."

Credit: 
Pohang University of Science & Technology (POSTECH)

Revealing secret of lithium-rich stars by monitoring their heartbeats

image: Astronomers reveal the secrets of the lithium-rich low-mass evolved stars by monitoring their heartbeats and analyzing their spectra

Image: 
YU Jingchuan, Beijing Planetarium

Lithium is an ancient element, almost as old as the universe itself. Yet although it is one of the building blocks of our present-day universe, the amount of lithium observed in many celestial bodies often disagrees with the predictions of classical theories.

Lithium-rich stars, which account for only 1% of all low-mass evolved stars, are one example of this conflict. They hold up to thousands of times more lithium than the normal stars that make up the remaining 99%. Astronomers have long wondered what these stars really are and why they are so enriched.

A recent study from an international team led by Prof. ZHAO Gang, Prof. SHI Jianrong, and Dr. YAN Hongliang from the National Astronomical Observatories of the Chinese Academy of Sciences (NAOC) provides new insights into lithium-rich stars. The study was published in Nature Astronomy on Oct. 5.

By monitoring their "heartbeats", they found that most lithium-rich stars are the so-called "red clumps" rather than the "red giants" as previously thought.

"The 'red clumps' and 'red giants' are names for different stages of the senile stars," said Prof. ZHAO, the co-corresponding author of this paper, "though they look alike on the H-R diagram, a tool for mapping the evolutionary stage of a star over its lifetime."

"Imagine you are looking at two grey-haired elders," he added, "it is very hard to tell who is older just by their appearances."

Traditionally, the convective movement in "red giants" was thought to provide a favorable environment for creating lithium in stars. That partly explains why most lithium-rich stars were initially thought to be "red giants".

"The key problem was that we didn't exactly know what the lithium-rich stars really are - but now we do," said Dr. YAN, the lead author of this study.

The game changer here is the combination of spectroscopy and asteroseismology, a technique that measures the features of a star's oscillations by monitoring its light variations, in this case with NASA's Kepler space telescope. "We are monitoring the heartbeats, taking the cardiogram for stars," said Dr. YAN. "Although the 'red giants' and 'red clumps' are alike in appearance, they have different hearts, and thus beat differently."

Most lithium-rich stars in this study were found by the Large Sky Area Multi-Object Fiber Spectroscopic Telescope (LAMOST), a special quasi-meridian reflecting Schmidt telescope with active optics, located at Xinglong station, China. Some of these stars were also observed at different resolutions by other telescopes worldwide, such as Japan's Subaru telescope, to confirm that the information derived from the LAMOST data is correct. "The spectra can tell us the physical parameters of the stars, and how much lithium is kept in their atmospheres," said Prof. ZHAO. "So spectra are as important as the 'heartbeats' of stars in our study."

The research shows that over 80% of lithium-rich stars are "red clumps". Just as importantly, it reveals a number of new signatures of lithium-rich stars once their "heartbeats" are used to classify individual stars as "red clumps" or "red giants". "All of these signatures are hard to explain using current scenarios," said Prof. SHI, the other co-corresponding author of the paper. "There are still unknown processes that could significantly affect surface chemical composition in low-mass stellar evolution, but this is an exciting opportunity for us astronomers to find out how lithium is created in stars."

Credit: 
Chinese Academy of Sciences Headquarters

Potential drug treatment for a particular type of lung cancer

image: Mechanism of targeted drugs tolerance in lung cancer cells.

Image: 
Kanazawa University

The effectiveness of cancer treatment is often hampered by the heterogeneity of cancer cells. This is the case for EGFR-mutated lung cancer: drugs of a type known as tyrosine kinase inhibitors (TKIs) have been used to treat the disease, but with varying levels of efficacy. (EGFR stands for "epidermal growth factor receptor", a protein that plays an important role in signaling from the extracellular environment into the cell.) Sometimes, tumor cells are simply resistant to the drug. Now, Seiji Yano from Kanazawa University and colleagues have investigated the efficacy of the TKI osimertinib for treating EGFR-mutated lung cancer, and how it relates to the tumor cells' expression of a particular protein called AXL. They found that both AXL-high and AXL-low expressing tumor cells showed tolerance (acquired resistance) to osimertinib, but that the mechanisms involved differ between the two situations. Moreover, the researchers suggest a way to enhance the success of osimertinib treatment in the case of AXL-low expressing tumors.

First, the scientists compared the susceptibility to osimertinib in both AXL-high and -low expressing tumor cells in in vitro experiments. They observed that osimertinib inhibited the viability of the cancer cells in both cases, but that the sensitivity to the drug was higher for AXL-low expressing EGFR-mutated lung cancer cells. They also noticed that a small number of tumor cells survived the procedure -- an indication of osimertinib tolerance. These findings were consistent with results from the clinical study of the drug performed earlier on 29 patients with EGFR-mutated non-small cell lung cancer.

Through experiments aiming to understand the mechanism behind osimertinib tolerance, Yano and colleagues discovered that phosphorylation of IGF-1R was increased in AXL-low-expressing tumor cell lines, but not in AXL-high expressing tumors. (IGF-1R stands for 'insulin-like growth factor 1 receptor'; it is a protein located on the surface of human cells. Phosphorylation is the chemical process of adding a phosphoryl group.) The researchers then found that phosphorylated IGF-1R supported the survival of AXL-low expressing tumors after exposure to osimertinib.

The scientists then tested whether the observed osimertinib resistance could be overcome by administering linsitinib, a substance known to inhibit the phosphorylation of IGF-1R. Encouraged by the positive outcome of the experiment, Yano and colleagues went further and evaluated the combination of osimertinib and linsitinib. Their conclusion was that the transient combination of linsitinib with continuous osimertinib treatment could cure or at least dramatically delay tumor recurrence in AXL-low-expressing EGFR-mutated lung cancer. More investigation is needed, though. Quoting the researchers: "... the safety and efficacy of the transient combination of IGF-1R inhibitor and osimertinib should be evaluated in the clinical trials."

[Background]

Tyrosine kinase inhibitors

A tyrosine kinase inhibitor is a drug inhibiting (that is, preventing or reducing the activity of) a specific tyrosine kinase. A tyrosine kinase is a protein (enzyme) involved in the activation of other proteins by signaling cascades. The activation happens by the addition of a phosphate group to the protein (phosphorylation); it is this step that a tyrosine kinase inhibitor inhibits. Tyrosine kinase inhibitors are used as anticancer drugs. One such drug is osimertinib, used to treat EGFR-mutated lung cancer.

AXL

AXL is a receptor tyrosine kinase -- a tyrosine kinase consisting of an extracellular part, a transmembrane part ('sitting' within a cell membrane) and an intracellular part. AXL regulates various important cellular processes, including proliferation, survival and motility.

In recent years, it has become clear that AXL is a key facilitator of drug tolerance by cancer cells. Seiji Yano from Kanazawa University and colleagues have found that this is also the case for EGFR-mutated lung cancer. While a high expression of AXL correlates with resistance to osimertinib, such tolerance also occurs in AXL-low-expressing cancer cells. Yano and colleagues have now found that for the latter case, phosphorylation of IGF-1R (insulin-like growth factor 1 receptor) is responsible for the resistance to osimertinib.

Credit: 
Kanazawa University

Lego-like assembly of zeolitic membranes improves carbon capture

Zeolites are porous minerals that occur naturally but can also be synthesized artificially. Because they are stable and durable, zeolites are used for chemical catalysis, purification of gases and liquids, and even in medical applications such as drug delivery and blood-clotting powders, e.g. the QuickClot trauma bandages used by the US military.

Zeolites used in gas separation are usually produced as membranes. State-of-the-art zeolitic membranes are manufactured by a lengthy and complex crystallization process. Unfortunately, this method has proved difficult to reproduce, and it falls short in producing efficient gas-separation membranes, especially for separating hydrogen from carbon dioxide, which is necessary for pre-combustion carbon capture at power plants.

A team of chemical engineers led by Kumar Agrawal at EPFL Valais Wallis has now successfully simplified the chemistry behind zeolite membrane synthesis, making it simple, reproducible, and scalable. The achievement of this longstanding goal is published in Nature Materials.

The scientists developed a new material chemistry that eliminates the lengthy crystallization process altogether. "We built Lego-like crystals - nanosheets - and bonded them on top of each other using silanol condensation chemistry," says Agrawal. The resulting membrane shows ideal hydrogen-carbon dioxide separation performance, with selectivity up to 100 at 250-300 degrees Celsius.
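For context, the selectivity figure quoted above is the ratio of single-gas permeances. A minimal sketch of the calculation, with hypothetical permeance values (the paper's measured numbers are not quoted here):

```python
# Ideal (single-gas) selectivity of a gas-separation membrane is the ratio
# of the permeances of the two gases. The permeance values below are
# hypothetical placeholders, not measurements from the Nature Materials paper.

def ideal_selectivity(permeance_fast, permeance_slow):
    """Ideal selectivity = permeance of faster gas / permeance of slower gas."""
    return permeance_fast / permeance_slow

# Hypothetical permeances (e.g. in gas permeation units, GPU):
h2_permeance, co2_permeance = 1000.0, 10.0
print(ideal_selectivity(h2_permeance, co2_permeance))  # -> 100.0
```

A selectivity of 100 thus means hydrogen passes through the membrane roughly a hundred times faster than carbon dioxide under the same driving force.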

The authors conclude: "The scalable synthesis of high-temperature hydrogen-sieving zeolitic membranes is expected to improve the energy-efficiency of pre-combustion carbon capture."

Credit: 
Ecole Polytechnique Fédérale de Lausanne

Efficient pollen identification

image: Microscopic images of pollen from plants that are important for pollinators, obtained by image-based particle analysis. Each row shows a single pollen grain of a specific plant, with a normal microscopic image (first image on the left) and fluorescence images for different spectral ranges (colored images on the right).

Image: 
Susanne Dunker

From pollen forecasting and honey analysis to climate-related changes in plant-pollinator interactions, pollen analysis plays an important role in many areas of research. Microscopy is still the gold standard, but it is very time-consuming and requires considerable expertise. In cooperation with Technische Universität (TU) Ilmenau, scientists from the Helmholtz Centre for Environmental Research (UFZ) and the German Centre for Integrative Biodiversity Research (iDiv) have now developed a method that allows them to automate the process of pollen analysis efficiently. Their study has been published in the journal New Phytologist.

Pollen is produced in a flower's stamens and consists of a multitude of minute pollen grains, which contain the plant's male genetic material necessary for its reproduction. The pollen grains get caught in the tiny hairs of nectar-feeding insects as they brush past and are thus transported from flower to flower. Once there, in the ideal scenario, a pollen grain will cling to the sticky stigma of the same plant species, which may then result in fertilisation. "Although pollinating insects perform this pollen delivery service entirely incidentally, its value is immeasurably high, both ecologically and economically," says Dr. Susanne Dunker, head of the working group on imaging flow cytometry at the Department for Physiological Diversity at UFZ and iDiv. "Against the background of climate change and the accelerating loss of species, it is particularly important for us to gain a better understanding of these interactions between plants and pollinators." Pollen analysis is a critical tool in this regard.

Each species of plant has pollen grains of a characteristic shape, surface structure and size. When it comes to identifying and counting pollen grains - measuring between 10 and 180 micrometres - in a sample, microscopy has long been considered the gold standard. However, working with a microscope requires a great deal of expertise and is very time-consuming. "Although various approaches have already been proposed for the automation of pollen analysis, these methods are either unable to differentiate between closely related species or do not deliver quantitative findings about the number of pollen grains contained in a sample," continues UFZ biologist Dr. Dunker. Yet it is precisely this information that is critical to many research subjects, such as the interaction between plants and pollinators.

In their latest study, Susanne Dunker and her team of researchers have developed a novel method for the automation of pollen analysis. To this end they combined the high throughput of imaging flow cytometry - a technique used for particle analysis - with a form of artificial intelligence (AI) known as deep learning to design a highly efficient analysis tool, which makes it possible both to accurately identify the species and to quantify the pollen grains contained in a sample. Imaging flow cytometry is a process that is primarily used in the medical field to analyse blood cells but is now also being repurposed for pollen analysis.

"A pollen sample for examination is first added to a carrier liquid, which then flows through a channel that becomes increasingly narrow," says Susanne Dunker, explaining the procedure. "The narrowing of the channel causes the pollen grains to separate and line up as if on a string of pearls, so that each one passes through the built-in microscope element on its own and images of up to 2,000 individual pollen grains can be captured per second." Two normal microscopic images are taken, plus ten fluorescence microscopic images per grain of pollen. When excited with laser light at certain wavelengths, the pollen grains themselves emit light. "The area of the colour spectrum in which the pollen fluoresces - and at which precise location - is sometimes very specific. This information provides us with additional traits that can help identify the individual plant species," reports Susanne Dunker. In the deep learning process, an algorithm abstracts the original pixels of an image in successive steps to a greater and greater degree in order to finally extract the species-specific characteristics. "Microscopic images, fluorescence characteristics and high throughput have never been used in combination for pollen analysis before - this really is an absolute first."

Where the analysis of a relatively straightforward sample takes, for example, four hours under the microscope, the new process takes just 20 minutes. UFZ has therefore applied for a patent for the novel high-throughput analysis method, with its inventor, Susanne Dunker, receiving the UFZ Technology Transfer Award in 2019.
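The classifier itself is a deep neural network trained on hundreds of thousands of captured images. As a rough illustration of the data layout only - twelve images per grain (two brightfield plus ten fluorescence channels) - the toy sketch below substitutes a simple nearest-centroid classifier on per-channel mean intensities for the deep model, using synthetic data throughout:

```python
import numpy as np

# Toy stand-in for the study's deep-learning classifier. Each pollen grain
# is captured as 12 images (2 brightfield + 10 fluorescence channels); here
# we reduce each grain to its 12 per-channel mean intensities and classify
# with nearest centroid. All data below are synthetic, for illustration only.

rng = np.random.default_rng(0)
n_species, grains_per_species, n_channels = 3, 50, 12

# Synthetic "fluorescence fingerprints": one mean-intensity profile per species.
centroids_true = rng.uniform(0.2, 0.8, size=(n_species, n_channels))
X = np.concatenate([c + 0.02 * rng.standard_normal((grains_per_species, n_channels))
                    for c in centroids_true])
y = np.repeat(np.arange(n_species), grains_per_species)

# "Train": per-species centroid of the channel-mean features.
centroids = np.stack([X[y == s].mean(axis=0) for s in range(n_species)])

# "Predict": nearest centroid in Euclidean distance.
pred = np.argmin(((X[:, None, :] - centroids[None]) ** 2).sum(-1), axis=1)
accuracy = (pred == y).mean()
print(f"toy accuracy: {accuracy:.2f}")
```

The real pipeline learns far subtler features directly from the pixels, which is what lets it separate even closely related species; the sketch only shows how multi-channel per-grain measurements map onto a species label.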

The pollen samples examined in the study came from 35 species of meadow plants, including yarrow, sage, thyme and various species of clover such as white, mountain and red clover. In total, the researchers prepared around 430,000 images, which formed the basis for a data set. In cooperation with TU Ilmenau, this data set was then transferred using deep learning into a highly efficient tool for pollen identification. In subsequent analyses, the researchers tested the accuracy of their new method, comparing unknown pollen samples from the 35 plant species against the data set. "The result was more than satisfactory - the level of accuracy was 96 per cent," says Susanne Dunker. Even species that are difficult to distinguish from one another, and indeed present experts with a challenge under the microscope, could be reliably identified. The new method is therefore not only extremely fast but also highly precise.

In the future, the new process for automated pollen analysis will play a key role in answering critical research questions about interactions between plants and pollinators. How important are certain pollinators like bees, flies and bumblebees for particular plant species? What would be the consequences of losing a species of pollinating insect or a plant? "We are now able to evaluate pollen samples on a large scale, both qualitatively and - at the same time - quantitatively. We are constantly expanding our pollen data set of insect-pollinated plants for that purpose," comments Susanne Dunker. She aims to expand the data set to include at least the 500 plant species whose pollen is significant as a food source for honeybees.

Credit: 
Helmholtz Centre for Environmental Research - UFZ

Groundwater depletion in US High Plains leads to bleak outlook for grain production

The depletion of groundwater sources in parts of the United States High Plains is so severe that peak grain production in some states has already been passed, according to new research.

An international team of scientists, including experts from the University of Birmingham, has extended and improved methods used to calculate peak oil production in order to assess grain production in three US states: Nebraska, Texas and Kansas. They related the levels of water extraction over the past five decades from the Ogallala aquifer, one of the largest underground reservoirs in the High Plains, to the amounts of grain harvested in each state, and used this model to predict future trends. Their results are published in Proceedings of the National Academy of Sciences.

"We were inspired by insightful analyses of US crude oil production. They predicted a peak in crude oil production a decade in advance," says Assaad Mrad, a Ph.D. candidate at Duke University and lead author of the study.

The scientists found that in Texas and Kansas, even taking into account advances in technology and improved irrigation methods, production levels peaked around 2016 before starting to decline. By 2050, if no yield-boosting technologies are introduced, grain production in Texas could be reduced by as much as 40 per cent.

The decline occurs because high rates of water extraction, coupled with delays in enforcing new policies on groundwater use and in introducing new irrigation and monitoring technologies, mean the aquifers can no longer be sufficiently replenished to meet demand.

Nebraska, in contrast, has a different climate, with more rainfall and less aridity than Texas and Southern Kansas, which allows for higher sustainable rates of groundwater pumping. Nebraska is therefore succeeding in increasing the amount of land used for grain production without increasing the amount of water used.

The US High Plains produces more than 50 million tonnes of grain yearly and depends on the aquifers for as much as 90 per cent of its irrigation needs. Taken as a whole, therefore, the model shows that continued depletion of the High Plains aquifers at current levels represents a significant threat to food and water security both in the US and globally.

"Overall, the picture we see emerging from these calculations is bleak," says Professor David Hannah, at the University of Birmingham. "The ultimate consequence of the aquifers continuing to be overused will be the decline and collapse of grain production. We have already seen this happen in Texas, where over the course of fifty years, peak water use has twice led to peak grain production followed by production crashes."

"This shows quite clearly that the aquifers are not being used in a sustainable way and it's essential to find new technologies that can irrigate crops in a sustainable way."

The paper has its origins in the Ettersburg Ecohydrology Workshop, where 29 experts and students from 11 countries gathered near Weimar, Germany, to figure out how to start addressing the world's multifaceted water crisis.

Credit: 
University of Birmingham

210Pb dating of marine sedimentary cores

Laboratories from 14 countries (with different levels of experience in radiometric measurement of radionuclides in environmental samples and in the application of the 210Pb dating method) participated in an interlaboratory comparison (ILC) exercise related to the application of the 210Pb sediment dating technique. The exercise was conducted in the framework of a research project coordinated by the IAEA.

The participating laboratories were given aliquots from various strata of a marine sedimentary core collected from a bay near São Paulo, Brazil, and were asked to provide the mass activities of various radionuclides in each core stratum and an age vs. depth assignment based on the radiometric results obtained, using the 210Pb dating model considered most appropriate by each laboratory.

The results of this exercise show that the participating laboratories are highly skilled in radiometric determinations; the dating results, on the other hand, were not as successful, in part because the laboratories have widely varying experience with dating. The interlaboratory comparison exercise made it possible to evaluate the difficulties faced by laboratories using 210Pb dating, and also to observe certain limitations in providing reliable chronologies. The application of the 210Pb sediment dating method is far from a routine technique and requires expert knowledge and multidisciplinary experience.

The 210Pb sediment dating method is an excellent tool for establishing recent chronologies of sedimentary cores, and its proper application is essential for a large number of environmental studies. However, this dating method cannot be used as a routine tool, as each core requires a different approach and sufficient supporting information for the proposed chronologies. A correct understanding of each 210Pb dating model used, its assumptions, and its limitations in each environment is essential to providing robust and useful chronologies.
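As a concrete example of what such a model looks like, the simplest 210Pb age model - constant initial concentration (CIC) - converts the radioactive decay of unsupported 210Pb with depth directly into an age. A minimal sketch, with illustrative activities rather than the intercomparison data:

```python
import math

# CIC (constant initial concentration) 210Pb age model: the age of a
# sediment layer follows from radioactive decay of unsupported 210Pb,
#   A(z) = A0 * exp(-lambda * t)   =>   t = ln(A0 / A(z)) / lambda
# Activity values below are illustrative, not the intercomparison data.

HALF_LIFE_PB210 = 22.3                       # years
DECAY_CONST = math.log(2) / HALF_LIFE_PB210  # lambda, in 1/years

def cic_age(a_surface, a_depth):
    """Age (years) of a layer with unsupported 210Pb activity a_depth,
    given the surface activity a_surface (same units, e.g. Bq/kg)."""
    return math.log(a_surface / a_depth) / DECAY_CONST

# A layer holding half the surface activity is one half-life old:
print(round(cic_age(100.0, 50.0), 1))  # -> 22.3
```

The other common models (constant rate of supply, constant flux constant sedimentation) relax the CIC assumptions in different ways, which is exactly why model choice and expert judgement matter so much for the resulting chronology.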

Credit: 
University of Seville