Tech

NASA analyzes Hurricane Delta's water vapor concentration

image: On Oct. 7 at 2:50 a.m. EDT (0650 UTC), NASA's Aqua satellite found that the highest concentrations of water vapor (brown) and the coldest cloud top temperatures were around Hurricane Delta's center. Cloud top temperatures in those storms were as cold as or colder than minus 90 degrees Fahrenheit (minus 67.8 degrees Celsius). Delta was moving over Mexico's Yucatan Peninsula.

Image: 
Credits: NASA/NRL

When NASA's Aqua satellite passed over the Caribbean Sea on Oct. 7, it gathered water vapor data on Hurricane Delta as Mexico's Yucatan continues to feel its effects.

Water vapor analysis of tropical cyclones tells forecasters how much potential a storm has to develop. Water vapor releases latent heat as it condenses into liquid. That liquid becomes clouds and thunderstorms that make up a tropical cyclone. Temperature is important when trying to understand how strong storms can be. The higher the cloud tops, the colder and stronger the storms.

NASA's Aqua satellite passed over Delta on Oct. 7 at 2:50 a.m. EDT (0650 UTC), and the Moderate Resolution Imaging Spectroradiometer or MODIS instrument gathered water vapor content and temperature information. NASA's Aqua satellite is one in a fleet of NASA satellites that provide data for hurricane research.

The MODIS image showed that the highest concentrations of water vapor and the coldest cloud top temperatures were around the center of circulation and east of the center. The strongest storms were over the northern Caribbean Sea, between Mexico's Yucatan Peninsula and western Cuba. MODIS data also showed cloud top temperatures as cold as or colder than minus 90 degrees Fahrenheit (minus 67.8 degrees Celsius) in those storms. Storms with cloud top temperatures that cold have the capability to produce heavy rainfall.
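The Fahrenheit-to-Celsius conversion quoted above can be checked with the standard formula; the sketch below is purely illustrative, with the threshold value taken from the MODIS analysis and the helper function an assumption of this write-up, not part of NASA's processing.

```python
def fahrenheit_to_celsius(temp_f: float) -> float:
    """Convert a temperature from degrees Fahrenheit to degrees Celsius."""
    return (temp_f - 32.0) * 5.0 / 9.0

# Cloud top temperature threshold cited in the MODIS analysis above.
cloud_top_f = -90.0
print(f"{cloud_top_f} F = {fahrenheit_to_celsius(cloud_top_f):.1f} C")  # -67.8 C
```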

At 5 a.m. EDT, NHC Senior Hurricane Specialist Eric Blake said, "Satellite images show very deep convection associated with Delta, with extremely cold cloud-top temperatures to minus 97 degrees Celsius noted southwest of the center overnight.  However, this structure has not resulted in a stronger cyclone, and the full NOAA Hurricane Hunter aircraft mission actually indicated that Delta has significantly weakened since earlier today."

Forecasters at the National Hurricane Center (NHC) noted, "Through early Thursday, Delta is expected to produce 4 to 6 inches of rain, with isolated maximum totals of 10 inches, across portions of the northern Yucatan Peninsula. This rainfall may result in areas of significant flash flooding. In addition, 2 to 4 inches of rain, with isolated higher amounts, are expected across portions of western Cuba. This rainfall may result in areas of flash flooding and mudslides."

Warnings and Watches on Oct. 7

NHC issued a Hurricane Warning from Tulum to Dzilam, Mexico and for Cozumel. A Tropical Storm Warning is in effect for the Cuban province of Pinar del Rio; from Punta Herrero to Tulum, Mexico; and from Dzilam to Progreso, Mexico.

Delta's Status on Oct. 7

At 8 a.m. EDT (1200 UTC), the NHC said the center of Hurricane Delta was located by satellite images and surface observations inland over the northeastern Yucatan Peninsula of Mexico, near latitude 21.1 degrees north and longitude 87.4 degrees west. Delta was centered just 35 miles (55 km) west of Cancun, Mexico.

Delta was moving toward the northwest near 17 mph (28 kph). A west-northwestward to northwestward motion is expected over the next day or so. A slower northwestward to north-northwestward motion is forecast to begin on Thursday, and a northward motion is likely Thursday night and Friday. Maximum sustained winds are near 105 mph (165 kph) with higher gusts. The estimated minimum central pressure based on surface observations is 974 millibars.

Delta's Forecast Track

NHC forecasters said, "Although some additional weakening is likely when Delta moves over the Yucatan peninsula this morning, re-strengthening is forecast when the hurricane moves over the southern Gulf of Mexico Wednesday night and Thursday, and Delta could become a category 4 hurricane again by late Thursday.  Weakening is expected as Delta approaches the northern Gulf coast on Friday.

"On the forecast track, Delta is expected to move over the southern Gulf of Mexico during the afternoon of Oct. 7 and be over the southern or central Gulf of Mexico through Thursday. Delta is expected to approach the northern Gulf coast on Friday, Oct. 9."

NASA Researches Tropical Cyclones

Hurricanes/tropical cyclones are the most powerful weather events on Earth. NASA's expertise in space and scientific exploration contributes to essential services provided to the American people by other federal agencies, such as hurricane weather forecasting.

For more than five decades, NASA has used the vantage point of space to understand and explore our home planet, improve lives and safeguard our future. NASA brings together technology, science, and unique global Earth observations to provide societal benefits and strengthen our nation. Advancing knowledge of our home planet contributes directly to America's leadership in space and scientific exploration.

For NHC's Key Messages, visit: http://www.hurricanes.gov

By Rob Gutro
NASA's Goddard Space Flight Center

Credit: 
NASA/Goddard Space Flight Center

Unusually shallow earthquake ruptures in Chinese fracking field

An unusually shallow earthquake triggered by hydraulic fracturing in a Chinese shale gas field could change how experts view the risks of fracking for faults that lie very near the Earth's surface.

In the journal Seismological Research Letters, Hongfeng Yang of The Chinese University of Hong Kong and colleagues suggest that the magnitude 4.9 earthquake that struck Rongxian County, Sichuan, China on 25 February 2019 took place along a fault about one kilometer (0.6 miles) deep.

The earthquake, along with two foreshocks with magnitudes larger than 4, appears to be related to activity at nearby hydraulic fracturing wells. Although earthquakes induced by human activity such as fracking are typically shallower than natural earthquakes, it is rare for any earthquake of this size to take place at such a shallow depth.

"Earthquakes with much smaller magnitudes, for example magnitude 2, have been reported at such shallow depths. They are understood by having small scale fractures in such depths that can slip fast," said Yang. "However, the dimensions of earthquakes are scale-dependent. Magnitude 4 is way bigger than magnitude 2 in term of rupture length and width, and thus needs a sizeable fault as the host."

"The results here certainly changed our view in that a shallow fault can indeed slip seismically," he added. "Therefore, we should reconsider our strategies of evaluating seismic risk for shallow faults."

Two people died and twelve were injured in the 25 February earthquake, and the economic loss due to the event has been estimated at 14 million RMB, or about $2 million. There have been few historic earthquakes in the region, and before 2019 there had been no earthquakes larger than magnitude 3 on the fault where the main earthquake took place.

Since 2018, there have been at least 48 horizontal fracking wells drilled from 13 well pads in the region, with three well pads less than two kilometers (1.2 miles) from the Molin fault, where the main earthquake took place.

Yang and his colleagues located the earthquakes and were able to calculate the length of the main rupture using local and regional seismic network data, as well as InSAR satellite data.

It is unusual to see clear satellite data for a small earthquake like this, Yang said. "InSAR data are critical to determine the depth and accurate location of the mainshock, because the ground deformation was clearly captured by satellite images," he noted. "Given the relatively small size of the mainshock, it would not be able to cause deformation above the 'noise' level of satellite data if it were deeper than about two kilometers."

The two foreshocks took place on a previously unmapped fault in the area, the researchers found, underscoring how difficult it can be to prevent fracking-induced earthquakes in an area where fault mapping is incomplete.

The researchers note that the Molin fault is separated from the geologic formation where fracking took place by a layer of shale about 800 meters (2625 feet) thick. The separating layer sealed off the fault from fracking fluids, so it is unlikely that the pressures of fluid injected into rock pores around the fault caused the fault to slip. Instead, Yang and colleagues suggest that changes in elastic stress in rock may have triggered the main earthquake on the Molin fault, which was presumed to be stable.

"The results here certainly pose a significant concern: we cannot ignore a shallow fault that was commonly thought to be aseismic," Yang said, who said more public information on fracking injection volume, rate and duration could help calculate safe distances for well placement in the future.

Credit: 
Seismological Society of America

The good cough and the bad cough

Researchers might be able to treat the troublesome coughs caused by disease without disrupting the protective coughs we need for optimal lung health, by targeting the different brain circuits involved. That's according to new research published this week in The Journal of Physiology.

More people seek medical advice for an unwanted, nagging cough than for any other ailment. In some people, the cough can persist for years without relief, as effective treatments are not readily available.

These findings from Australian researchers have very important implications for understanding and potentially treating cough disorders because it appears that different types of coughs may use different brain circuits.

The act of coughing typically begins with an irritating stimulus within the larynx, airways or lungs that activates cough-evoking sensory nerves. These sensory nerves transmit this information to the brain, where it modifies the actions of the breathing muscles to produce a cough response. These signals are also sometimes combined with "higher order" signals that make your throat tickle, make you feel annoyed or anxious at having a cough, and allow you to suppress or enhance your cough voluntarily.

Previous research in animals and humans suggested that the brain processes all inputs from cough sensory nerves in a single area. However, in an earlier study using guinea pigs, published this year in The Journal of Physiology, the same research team from Monash University and the University of Melbourne demonstrated that this is unlikely to be true.

Instead, they discovered that separate pathways in the brain are involved in the response to a good cough (needed to clear the airways and ensure optimal lung health) versus a bad cough (a sign of disease).

In this new study, human participants underwent behavioural testing to assess cough reflex sensitivity followed by functional brain imaging in an MRI scanner while inhaling different chemical substances.

One chemical stimulus used was capsaicin, the active component of hot chili peppers and known to activate two subsets of airway sensory nerves involved in coughing.

Another chemical stimulus was adenosine triphosphate (ATP), best known as an energy molecule in cells; it also selectively activates one of the two subsets of sensory nerves involved in coughing.

The final chemical stimulus was saline, used as a control stimulus because it doesn't activate any sensory nerves.

High resolution brainstem scans were collected during repeated randomised presentations of these stimuli, and the scans were analysed to identify where in the brainstem the neural responses to capsaicin and ATP are located.

The outcome showed that capsaicin inhalation activated both the nucleus of the solitary tract and the area of the brainstem containing the paratrigeminal nucleus, whereas ATP inhalation only activated the nucleus of the solitary tract.

The data confirm the team's prior studies using guinea pigs, in that one cough pathway (sensitive to both capsaicin and ATP) is integrated in the nucleus of the solitary tract while the other cough pathway (sensitive to capsaicin only) involves integration in the paratrigeminal nucleus.

Commenting on the study, senior author Professor Stuart Mazzone said:
"Chronic cough is a horribly unpleasant ailment. People can find themselves coughing hundreds of times every hour of their waking lives, for years on end, and current medicines simply aren't effective at relieving this condition."

"We are now performing a similar study comparing how these two different brain networks respond in patients with chronic troublesome coughing compared to healthy participants. This new study is also motivated by recent clinical trial outcomes showing a promising cough suppressing action of drugs that inhibit ATP receptors. How ATP is involved in cough is not fully known. We suspect that responsivity to ATP may change in patients with chronic cough and the newly identified cough circuit in the brain may be involved in this change."

Credit: 
The Physiological Society

Sensory device stimulates ears and tongue to treat tinnitus in large trial

image: A schematic showing the bimodal neuromodulation device. Wireless headphones delivered sounds, while a small electrode array stimulated the tongue with different patterns. This material relates to a paper that appeared in the Oct. 7, 2020, issue of Science Translational Medicine, published by AAAS. The paper, by B. Conlon at Neuromod Devices Limited in Dublin, Ireland; and colleagues was titled, "Bimodal neuromodulation combining sound and tongue stimulation reduces tinnitus symptoms in a large randomized clinical study."

Image: 
B. Conlon <i>et al., Science Translational Medicine</i> (2020)

A device that stimulates the ears and tongue substantially reduced the severity of tinnitus symptoms in 326 patients for as long as 1 year, while achieving high patient satisfaction and adherence. The study - one of the largest clinical trials of a tinnitus treatment to date - indicates the bimodal technique could potentially provide the first effective, clinically viable device for tinnitus, which affects up to 15% of the population. This irritating auditory disorder manifests when patients perceive phantom noises such as ringing without any external input. Despite its high prevalence and potentially debilitating nature, there are no approved medical devices or drug treatments that can provide relief to patients. However, recent research in animals has shown that stimulating the auditory nervous system through sounds and electricity improved symptoms. Based on these promising results, Brendan Conlon and colleagues used a non-invasive stimulating device, which delivers sound to the ears through headphones and stimulates the tongue with low amounts of electricity. In a randomized trial of 326 patients with different types of tinnitus, the authors instructed the patients to use the device for 60 minutes daily for 12 weeks. The device reduced tinnitus symptoms, and these improvements persisted throughout a 12-month follow-up period. The team notes they are currently conducting another large clinical trial to study the effects of changing the stimulation protocol over time.

Credit: 
American Association for the Advancement of Science (AAAS)

First detailed look at how molecular Ferris wheel delivers protons to cellular factories

image: An animation shows a proton pump called V-ATPase at work. These pumps are embedded in the membranes of cellular organelles, where they bring in protons that are essential for the organelle's function. The top part of the pump generates energy to drive the rotating part at the bottom, which is like a molecular Ferris wheel that picks up protons on the outside of the organelle and drops them off inside. Scientists at SLAC National Accelerator Laboratory, Stanford University, SUNY Upstate Medical University and Arizona State University used cryo-EM images and computer simulations to reveal key details about how the pump works.

Image: 
From the labs of S.-H. Roh and S. Wilkens

All cells with nuclei, from yeast to humans, are organized like cities, with a variety of small compartments - organelles - that serve as factories where various types of work are done. Some of those factories, like the ones that break down and recycle molecules, need to continually pump in protons - hydrogen atoms with their electrons stripped off - to maintain the acidic environment they need to do their job. For this they rely on molecular Ferris wheels.

Embedded in the organelle's fatty outer membrane, these microscopic machines have rotors that spin 100 times per second, picking up protons from outside the organelle and dropping them off on the inside.

Now scientists have figured out a key step in how these Ferris wheels work in a yeast proton pump known as vacuolar ATPase (V-ATPase). The results of their study, which combined high-resolution images made at the Department of Energy's SLAC National Accelerator Laboratory with supercomputer simulations, were published in Science Advances today, giving scientists insight into a fundamental process that could potentially be harnessed to thwart disease.

"The V-ATPase proton pumps perform a wide range of functions, from helping transmit nerve signals to helping specialized cells secrete acid for maintaining bone," said Stephan Wilkens, a biochemist at SUNY Upstate Medical University and study co-author. "Malfunctions in these molecular machines contribute to diseases such as osteoporosis, neurodegeneration, diabetes, cancer and AIDS, so understanding them is important for human health."

Wah Chiu, a professor at SLAC and Stanford and co-director of the Stanford-SLAC Cryo-EM Facilities where the imaging was done, said scientists are already investigating how these pumps in human cells might affect replication of the COVID-19-causing virus in patients. "It turns out the majority of therapeutic drugs on the market target molecular machines like this one that sit in cell membranes," he added.

Watching the wheel go round

No human cell can function without proton pumps, which among other things help organelles intercept viruses and other pathogens and divert them to cellular trash bins.

While previous studies had determined the molecular structure and basic function of V-ATPases in a number of organisms, Wilkens said, "the big question was how do they work? To explain the mechanism it's helpful to see it in action, just like the first serial snapshots of a galloping horse finally settled the question of whether it always had at least one hoof on the ground. The answer was no."

In earlier cryo-EM research, Chiu, Wilkens, SLAC/Stanford postdoctoral researcher Soung-Hun Roh and others produced high-resolution images that allowed them to identify the 10 amino acid "seats" on the yeast Ferris wheel that bind protons and carry them through the membrane to the organelle's interior, as well as other amino acids that catch them when they arrive. Based on that picture, they suggested that the proton drop-off might be aided by water molecules, but their images were not sharp enough to confirm that the water molecules were there.

In the current study, thanks to another round of even higher resolution cryo-EM imaging at SLAC they were able to locate the water molecules around the suspected proton path. To make the proton pump motor come to life, a research group led by Abhishek Singharoy at the Arizona State University Biodesign Institute developed computer simulations of the process and ran them on a DOE supercomputer at Oak Ridge National Laboratory.

The simulations, which incorporated cryo-EM structures derived from images of the yeast Ferris wheel captured at two different points in its rotation, confirmed the experimentally observed water molecules lining up to form "wires" at the proton drop-off point. These wires convey protons from their seats on the Ferris wheel to landing spots inside the organelle, like a fire brigade passing buckets hand to hand, bridging a gap they couldn't navigate on their own.

Going forward, Chiu said, recent advances in cryo-EM that allow imaging of individual particles at atomic resolution - even when they take slightly different shapes - will open new opportunities for using it as a tool to discover effective drugs for illnesses involving proton pumps.

Credit: 
DOE/SLAC National Accelerator Laboratory

Next-gen smartphones to keep their cool

image: Model for NGF growth with respect to the Ni surface topography. The variable number of graphene layers correlates with the orientation, size and boundaries of the Ni grains at the surface of the polycrystalline metal foil.

Image: 
© 2020 KAUST; Xavier Pita

The powerful electronics packed inside the latest smartphones can be a significant challenge to keep cool. KAUST researchers have developed a fast and efficient way to make a carbon material that could be ideally suited to dissipating heat in electronic devices. This versatile material could also have additional uses ranging from gas sensors to solar cells.

Many electronic devices use graphite films to draw away and dissipate the heat generated by their electronic components. Although graphite is a naturally occurring form of carbon, heat management of electronics is a demanding application and usually relies on use of high-quality micrometer-thick manufactured graphite films. "However, the method used to make these graphite films, using polymer as a source material, is complex and very energy intensive," says G. Deokar, a postdoc in Pedro Costa's lab, who led the work. The films are made in a multistep process that requires temperatures of up to 3200 degrees Celsius and which cannot produce films any thinner than a few micrometers.

Deokar, Costa and their colleagues have developed a quick, energy-efficient way to make graphite sheets that are approximately 100 nanometers thick. The team grew nanometer-thick graphite films (NGF) on nickel foils using a technique called chemical vapor deposition (CVD) in which the nickel catalytically converts hot methane gas into graphite on its surface. "We achieved NGFs with a CVD growth step of just five minutes at a reaction temperature of 900 degrees Celsius," Deokar says.

The NGFs, which could be grown in sheets of up to 55 square centimeters, grew on both sides of the foil. They could be extracted and transferred to other surfaces without the need for a polymer supporting layer, which is a common requirement when handling single-layer graphene films.

Working with electron microscopy specialist Alessandro Genovese, the team captured cross-sectional transmission electron microscopy (TEM) images of the NGF on nickel. "Observing the interface of the graphite films to the nickel foil was an unprecedented achievement that will shed additional light on the growth mechanisms of these films," Costa says.

In terms of thickness, NGF sits between commercially available micrometer-thick graphite films and single-layer graphene. "NGFs complement graphene and industrial graphite sheets, adding to the toolbox of layered carbon films," Costa says. Due to its flexibility, for example, NGF could lend itself to heat management in flexible phones now starting to appear on the market. "NGF integration would be cheaper and more robust than what could be obtained with a graphene film," he adds.

However, NGFs could find many applications in addition to heat dissipation. One intriguing feature, highlighted in the TEM images, was that some sections of the NGF were just a few carbon sheets thick. "Remarkably, the presence of the few-layer graphene domains resulted in a reasonable degree of visible light transparency of the overall film," Deokar says. The team proposed that conducting, semitransparent NGFs could be used as a component of solar cells, or as a sensor material for detecting NO2 gas. "We plan to integrate NGFs in devices where they would act as a multifunctional active material," Costa says.

Credit: 
King Abdullah University of Science & Technology (KAUST)

Scientists are more specialized in larger and interdisciplinary teams

The roles of scientists change as research teams become more interdisciplinary and larger, finds new research from ESMT Berlin. Contemporary scientific challenges increasingly require large teams and interdisciplinary perspectives. However, it is not fully understood how these trends affect the division of labor among team members. In other words, how do team members divide the work, and how do teams ensure that individuals' contributions are brought back together to solve a scientific problem?

Henry Sauermann, Professor of Strategy at ESMT Berlin, and Prof. Carolin Haeussler from the University of Passau, conducted a study on the impact of increased team size and interdisciplinarity on the division of labor. They analyzed author contribution statements from 12,964 published articles in a range of fields and compared the extent to which team members engaged in various research activities such as conceptualizing the project, collecting data, and writing the paper.

They found that the division of labor increased with the size of the team, meaning a higher proportion of team members specialized in fewer tasks, sometimes contributing to only one activity. However, generalist members, who are less specialized and contribute to multiple activities, did not disappear completely. The share of specialist members stopped increasing at around 30% in teams with 15 members, while the share of generalist members decreased before stabilizing at around 18% in groups of 10 members. Therefore, although the proportion of specialists increased and that of generalists decreased, even larger teams were composed of a mix of both.
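The classification described above can be illustrated with a toy calculation. This is a hypothetical sketch of how contribution statements might be tallied; the activity names and the specialist/generalist thresholds are assumptions, not the authors' actual coding scheme.

```python
from typing import Dict, List

# Hypothetical author contribution statements: author -> list of research activities.
contributions: Dict[str, List[str]] = {
    "author_1": ["conceptualization", "writing"],
    "author_2": ["data_collection"],
    "author_3": ["data_collection", "analysis", "writing"],
    "author_4": ["analysis"],
}

def classify(activities: List[str]) -> str:
    """Assumed rule: one activity = specialist, three or more = generalist."""
    if len(activities) == 1:
        return "specialist"
    if len(activities) >= 3:
        return "generalist"
    return "intermediate"

roles = {author: classify(acts) for author, acts in contributions.items()}
team_size = len(contributions)
share_specialists = sum(r == "specialist" for r in roles.values()) / team_size
share_generalists = sum(r == "generalist" for r in roles.values()) / team_size
print(roles)
print(f"specialists: {share_specialists:.0%}, generalists: {share_generalists:.0%}")
```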

Interestingly, these trends towards specialization differ depending on the particular research activity. Prof. Sauermann says, "Conceptual activities such as designing the study tend to be shared more than empirical activities in small teams. However, in larger teams it is the reverse, with empirical activities being shared more widely than conceptual activities."

The authors also find that interdisciplinary teams use greater division of labor - team members tend to specialize in fewer research activities. But there is an interesting twist, says Prof. Haeussler: "Some teams gain interdisciplinary perspectives by bringing together field specialists such as an engineer and a biologist. Other teams are composed of individuals who are themselves interdisciplinary in their backgrounds - think bio-engineers. We see that different approaches to interdisciplinarity have very different implications for how labor is divided between team members".

The authors also find important differences in task allocation depending on scientists' individual characteristics, with women more likely to be involved in performing experiments than in conceptual activities. Moreover, Haeussler and Sauermann note that many teams seem to violate common authorship guidelines, which require authors to be involved in both empirical and conceptual activities. As such, authorship guidelines may need to be revised to accommodate increasing specialization in scientific work.

Credit: 
ESMT Berlin

How mobile apps grab our attention

video: Aalto University researchers together with international collaborators have done the first empirical study on how users pay visual attention to mobile app designs.

Image: 
Animation: Marianne Lenoir/Aalto University

As part of an international collaboration, Aalto University researchers have shown that our common understanding of what attracts visual attention to screens, in fact, does not transfer to mobile applications. Despite the widespread use of mobile phones and tablets in our everyday lives, this is the first study to empirically test how users' eyes follow commonly used mobile app elements.

Previous work on what attracts visual attention, or visual saliency, has centered on desktop and web interfaces.

'Apps appear differently on a phone than on a desktop computer or browser: they're on a smaller screen which simply fits fewer elements and, instead of a horizontal view, mobile devices typically use a vertical layout. Until now it was unclear how these factors would affect how apps actually attract our eyes,' explains Aalto University Professor Antti Oulasvirta.

In the study, the research team used a large set of representative mobile interfaces and eye tracking to see how users look at screenshots of mobile apps, for both Android and Apple iOS devices.

According to previous thinking, our eyes should not only jump to bigger or brighter elements, but also stay there longer. Previous studies have also concluded that when we look at certain kinds of images, our attention is drawn to the centre of screens and also spread horizontally across the screen, rather than vertically. The researchers found these principles to have little effect on mobile interfaces.
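The kind of comparison involved can be sketched in a few lines: aggregate eye-tracking fixations for one screenshot into an empirical saliency map and correlate it with a centre-bias prediction. The screen size, fixation data, and smoothing width below are placeholders, not the study's actual pipeline.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

# Hypothetical portrait-phone screenshot size and fixation coordinates (x, y) in pixels.
width, height = 360, 640
rng = np.random.default_rng(0)
fixations = rng.integers(low=0, high=[width, height], size=(200, 2))

# Empirical saliency map: 2D histogram of fixations, smoothed with a Gaussian.
heatmap, _, _ = np.histogram2d(
    fixations[:, 1], fixations[:, 0], bins=[height, width],
    range=[[0, height], [0, width]],
)
saliency = gaussian_filter(heatmap, sigma=20)
saliency /= saliency.sum()

# Centre-bias baseline: a Gaussian blob at the screen centre.
yy, xx = np.mgrid[0:height, 0:width]
centre = np.exp(-(((xx - width / 2) ** 2) + ((yy - height / 2) ** 2)) / (2 * 100 ** 2))
centre /= centre.sum()

# Simple comparison: correlation between the empirical map and the centre-bias model.
corr = np.corrcoef(saliency.ravel(), centre.ravel())[0, 1]
print(f"correlation with centre-bias model: {corr:.2f}")
```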

'It actually came as a surprise that bright colours didn't affect how people fixate on app details. One possible reason is that the mobile interface itself is full of glossy and colourful elements, so everything on the screen can potentially catch your attention - it's just how they're designed. It seems that when everything is made to stand out, nothing pops out in the end,' says lead author and Post-doctoral Researcher Luis Leiva.

The study also confirms that some other design principles hold true for mobile apps. Gaze, for example, drifts to the top-left corner, an indication of exploration or scanning. Text plays an important role, likely because it relays information; on first use, users thus tend to focus on the text elements of a mobile app, such as those in icons, labels and logos.

Image elements drew visual attention more frequently than expected for the area they cover, though the average length of time users spent looking at images was similar to that for other app elements. Faces, too, attracted concentrated attention, though when a face was accompanied by text, the eyes wandered much closer to the location of the text.

'Various factors influence where our visual attention goes. For photos, these factors include colour, edges, texture and motion. But when it comes to generated visual content, such as graphical user interfaces, design composition is a critical factor to consider,' says Dr Hamed Tavakoli, who was also part of the Aalto University research team.

The study was completed with international collaborators including IIT Goa (India), Yildiz Technical University (Turkey) and Huawei Technologies (China). The team will present the findings on 6 October 2020 at MobileHCI'20, the flagship conference on Human-Computer Interaction with mobile devices and services.

Credit: 
Aalto University

Dried blood spot sampling offers inexpensive way to widen access to antibody testing for COVID-19

image: An example of a dry blood spot sample card before being processed.

Image: 
University of Birmingham

Using dried blood spot (DBS) samples is an accurate alternative to venous blood for detecting SARS-CoV-2 antibodies, a new study by immunology experts at the University of Birmingham has found.

Currently, antibody testing for COVID-19 uses serum or plasma, which requires a full venous blood sample collected by a trained phlebotomist. For population-wide or high-volume testing, the use of such sampling is limited by logistic challenges, resources, and costs, as well as the risk of SARS-CoV-2 exposure from direct patient contact. In contrast, DBS sampling is simple, inexpensive and can be self-collected by the patient at home, using a simple finger prick. The sample can then be collected on a forensic grade card before being posted back to labs for processing. This offers exciting possibilities to widen access to antibody testing, particularly in more resource-limited countries.

Researchers analysed serum and DBS samples from volunteers at University Hospitals Birmingham NHS Foundation Trust, some of whom had previously tested positive for SARS-CoV-2 by molecular tests, while the status of other volunteers was either negative or unknown. The anonymised matched serum and DBS samples were then processed using a highly sensitive ELISA test, developed by the University's Clinical Immunology Service in partnership with The Binding Site, which specifically detects antibodies (IgG, IgA and IgM) to the SARS-CoV-2 trimeric spike protein.

Results showed a significant correlation between matched DBS and serum samples and minimal differences in results observed by sample type, with negligible discordance. Relative to serum samples, DBS samples achieved 98% sensitivity and 100% specificity for detecting anti-SARS-CoV-2 S glycoprotein antibodies. 100% of the PCR-positive samples were also antibody-positive in DBS.
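For readers unfamiliar with the metrics, sensitivity and specificity come from a confusion matrix of DBS results against the serum reference. The counts below are invented purely to show the arithmetic and are not the study's data.

```python
# Hypothetical counts comparing DBS results against matched serum (reference) results.
true_positive = 49    # antibody-positive in both serum and DBS
false_negative = 1    # positive in serum, missed by DBS
true_negative = 50    # negative in both
false_positive = 0    # negative in serum, positive in DBS

sensitivity = true_positive / (true_positive + false_negative)   # 0.98
specificity = true_negative / (true_negative + false_positive)   # 1.00
print(f"sensitivity: {sensitivity:.0%}, specificity: {specificity:.0%}")
```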

Senior author Dr Matthew O'Shea from the University's Institute of Immunology and Immunotherapy said: "Our results have demonstrated that dry blood spot sampling not only offers a viable alternative for antibodies testing, but one that overcomes the limitations that current methods can present by eliminating the need for skilled phlebotomists.

"DBS offers the opportunity for wider population-level testing and improved surveillance in vulnerable groups such as patients with chronic conditions, the immunocompromised and the elderly by removing the need to come into contact with a healthcare professional during sample collection."

Co-author Professor Adam Cunningham from the Institute of Immunology and Immunotherapy said: "As well as offering the opportunity for improved population-wide antibody testing in the UK, the simplicity and cost-effectiveness of the dry blood spot method could improve the effectiveness of sampling in low and middle-income countries, among groups where venepuncture is culturally unacceptable or in geographically dispersed populations."

Credit: 
University of Birmingham

Social media postings linked to hate crimes

A new paper in the Journal of the European Economic Association, published by Oxford University Press, explores the connection between social media and hate crimes. The researchers combined methods from applied microeconomics with text analysis tools to investigate how negative rhetoric about refugees on social media may have contributed to hate crimes against refugees in Germany between 2015 and 2017.

Observers all over the globe have scrutinized social media increasingly over the last few years. News reports often suggest a relationship between fake news, social media "echo chambers," and online "bot armies" on the one hand and real-life outcomes on the other. But despite the public interest and demands for policy action, there is little evidence about the relationship between social media content and offline behavior.

In Germany, social media is among the main news sources for 18- to 25-year-olds. In the United States, around half of all adults use social media to get news, and two-thirds of Facebook users use it as a news source. In contrast to traditional media, social media platforms allow users to easily self-select into niche topics and extreme viewpoints. This may limit the range of information people absorb and create online communities that reinforce similar ideas and viewpoints.

The researchers measured anti-refugee sentiment on social media based on the Facebook page of the Alternative für Deutschland, a relatively new right-wing party that positions itself as anti-refugee and anti-immigration. The party is by far the most popular far-right political movement in Germany and with more than 300,000 followers, 175,000 posts, 290,000 comments, and 500,000 likes (as of early 2017), its Facebook page has a broader reach than that of any other German party. As the researchers show, the rhetoric about refugees on the Alternative für Deutschland Facebook page differs markedly from traditional news sources and in many cases contains language that prominent German non-governmental organizations have classified as hate speech.

The researchers established that spikes in posts about refugees on social media are tightly linked to anti-refugee hate crimes, particularly in municipalities where people were more exposed to the Alternative für Deutschland page. This correlation was especially pronounced for violent incidents such as assault.

Municipalities with Alternative für Deutschland users were three times more likely to experience an attack during the observation period. Out of the total 3,335 attacks on refugees in the same sample, 3,171 occurred in municipalities with Alternative für Deutschland Facebook page users.

The paper used the timing of hundreds of local internet disruptions as well as Germany-wide Facebook outages to ask whether these data patterns may reflect a causal effect of social media. Both types of disruptions reduce exposure to social media content and therefore allow conclusions to be drawn about causal effects.

The authors found that, while anti-refugee attacks increased with anti-refugee posts, this relationship disappeared during internet or Facebook outages. That is, if a municipality was cut off from social media or the internet more broadly, the frequency of anti-refugee hate crimes was no longer correlated with the amount of hateful content about refugees online posted in a given week.

The results suggest that, during weeks with Facebook outages, there were on average 11% fewer new total posts and 24% fewer posts about refugees on the Alternative für Deutschland page. It appears that Facebook outages reduced the probability of a hate crime by 12%.
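A minimal sketch of the kind of panel regression such a design implies: hate-crime counts regressed on refugee-post intensity interacted with an outage indicator, with municipality and week fixed effects. The variable names, data file, and specification below are placeholders for illustration, not the authors' actual model.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Placeholder panel: one row per municipality and week, with columns for hate_crimes,
# refugee_posts (exposure-weighted post intensity), outage (1 if Facebook/internet was down),
# and municipality/week identifiers.
df = pd.read_csv("municipality_week_panel.csv")  # hypothetical file

# Interaction model with two-way fixed effects via dummy variables,
# clustering standard errors by municipality.
model = smf.ols(
    "hate_crimes ~ refugee_posts * outage + C(municipality) + C(week)",
    data=df,
).fit(cov_type="cluster", cov_kwds={"groups": df["municipality"]})

# If the effect runs through social media, the refugee_posts coefficient should be positive,
# while the refugee_posts:outage interaction offsets it during outage weeks.
print(model.summary().tables[1])
```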

The researchers do not claim that social media itself causes crimes against refugees. Rather, the results suggest that social media can help propagate violent crimes by enabling people to spread extreme viewpoints.

"We think our paper can only be a starting point for understanding how social media causes changes in our lives," said Karsten Müller, one of the paper's researchers. "It would be crucial to have additional empirical evidence. Our findings on hate crime suggest that the stakes are high."

Credit: 
Oxford University Press USA

Climate-friendly cooling to help ease global warming

A new IIASA-led study shows that coordinated international action on energy-efficient, climate-friendly cooling could avoid as much as 600 billion tonnes CO2 equivalent of greenhouse gas emissions in this century.

Hydrofluorocarbons (HFCs) are mainly used for cooling and refrigeration. While they were originally developed to replace ozone-depleting substances that are being phased out under the Montreal Protocol, many HFCs are potent greenhouse gases with a global warming potential up to 12,400 times that of CO2 over a 100-year period.
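The "CO2 equivalent" accounting used throughout the study weights each gas by its global warming potential (GWP). A minimal illustration follows; the HFC mass is invented, and the GWP value is simply the upper bound quoted above.

```python
# CO2-equivalent emissions = mass of gas released * its 100-year global warming potential.
gwp_100yr = 12400          # upper-bound GWP for an HFC, as quoted above
hfc_emitted_tonnes = 1000  # hypothetical mass of HFC released, in tonnes

co2_equivalent_tonnes = hfc_emitted_tonnes * gwp_100yr
print(f"{hfc_emitted_tonnes} t HFC = {co2_equivalent_tonnes:,} t CO2e")  # 12,400,000 t CO2e
```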

The Kigali Amendment to the Montreal Protocol, which entered into force in 2019, aims to phase down the consumption of HFCs by 2050. While previous agreements have resulted in improvements in the design and energy performance of, for instance, cooling equipment, the Kigali Amendment is the first to include maintaining and/or enhancing the energy efficiency of cooling technologies as an explicit goal. According to the authors of the study, which has been published in the journal Atmospheric Chemistry and Physics, there is however currently limited understanding of the potential future impacts of the Kigali Amendment on global warming and possible co-benefits from savings in electricity. The study is the first to try to quantify the overall effects of the Amendment on both greenhouse gas and air pollutant emissions.

The researchers developed a range of long-term scenarios for HFC emissions under varying degrees of stringency in climate policy and also assessed co-benefits in the form of electricity savings and associated reductions in emissions. The results indicate that, due to technical opportunities to improve energy efficiency in cooling technologies, there is potential for significant electricity savings under a well-managed phase-down of HFCs.

"Our results show that the global cumulative HFC emissions from refrigerant use in cooling technologies would have been over 360 billion tonnes CO2 equivalent between 2018 and 2100 in the pre-Kigali baseline scenario. In addition, indirect CO2 emissions from energy production of electricity used in cooling equipment will be approximately the same order of magnitude if the world continues along its present path, without any additional changes in energy policy," explains IIASA researcher Pallav Purohit, who led the study.

"We found that if technical energy efficiency improvements are fully implemented, the resulting electricity savings could exceed 20% of future global electricity consumption, while the corresponding figure for economic energy efficiency improvements would be about 15%," adds study coauthor and senior IIASA researcher Lena Höglund-Isaksson.

The researchers say that the combined effect of HFC phase-down, improvement of energy efficiency of stationary cooling technologies, and future changes in the electricity generation fuel mix would prevent between 411 and 631 billion tonnes CO2 equivalent of greenhouse gas emissions between 2018 and 2100, thereby making a significant contribution towards keeping the global temperature rise below 2°C. Transitioning to high efficiency cooling can therefore double the climate mitigation effects of the HFC phase-down under the Kigali Amendment, while also delivering economic, health, and development benefits.

The findings further show that reduced electricity consumption could mean lower air pollution emissions in the power sector, estimated at about 5 to 10% for sulfur dioxide, 8 to 16% for nitrogen oxides (NOx), and 4 to 9% for fine particulate matter (PM2.5) emissions compared with a pre-Kigali baseline.

"To be consistent with 1.5°C scenarios, by 2050 HFCs should be reduced by between 70 and 80% compared to 2010 levels. According to the Kigali Amendment and Maximum Technically Feasible Reduction (MTFR) scenarios we analyzed, we could achieve 92.5% and 99.5% reductions in 2050 compared to 2010 levels, respectively. This means that both scenarios surpass the 1.5 ?C threshold. If carefully addressed during the transition to alternatives that have the potential to relieve global warming, improvement potentials for energy efficiency in cooling technologies are extensive and can bring significant electricity savings," Purohit concludes.

Credit: 
International Institute for Applied Systems Analysis

Best materials for border molding in complete dentures fabrication

image: Internal palatal surface (left) and side view (right) of the designed custom tray.

Image: 
Dobromira Shopova, PhD

Edentulous jaw is a condition in which either the upper (maxilla) or the lower (mandible) jaw is missing all of its teeth. In medical practice, it can be treated by placement of a complete denture.

Previous research has already shown that the application of a border molding procedure (or functional shaping) results in significantly fewer cases of pressure ulcers (decubitus) and soft tissue deformations, and hence in increased retention and stability of the prosthesis, both at rest and in function. Since there are many factors that affect the optimal treatment, such as anatomical structures (i.e. muscles, muscular and soft-tissue gripping) and the asymmetry between the left and right halves of the upper and lower jaws, special care must be taken to determine the depth, as well as the width, of the tissue where the teeth would normally be nested (the gingivobuccal sulcus). Border molding makes it possible to determine these; however, the accuracy of the impression still largely depends on the materials used in the procedure.

In their study, published in the open-access, peer-reviewed scholarly journal Folia Medica, Dr Dobromira Shopova and Prof. Diyan Slavchev at the Plovdiv Medical University (Bulgaria) sought to evaluate and determine the accuracy of two different groups of impression materials for border molding: thermoplastic materials and elastomers. They examined four different brands: Detaseal function (additive silicone for border molding), Sta-seal F (condensation silicone for border molding), GC Iso functional sticks (synthetic resin for border molding), and Kerr Impression compound green sticks for border molding.

To perform their research, the team applied Dr Dobromira Shopova's clinical method for measuring negative pressure after the border molding procedure, referred to as the vacuum measurement technique, on the edentulous upper jaw. They also assembled a special custom tray from a light-curing base plate with a palatal adapter. This was a 90°, 7-millimetre metal adapter, which was fixed to the midline on the palatal slope. To create and measure the negative pressure, they used a combined pressure pump. The maximum value was 3 bars for positive pressure and -1 bar for negative pressure.

Working protocol followed for all materials:

1. Apply the impression material along the edge of the individual tray;

2. Insert, position and perform Herbst functional tests;

3. Wait for the elasticity or hardening of the material;

4. Assemble the clinical unit for negative pressure measurement;

5. Measure the negative pressure that has been created between the custom tray and the prosthetic field, then record the result;

6. Release the individual impression tray from the patient's mouth.

A statistically significant difference was observed between the two thermoplastic materials: the GC Iso functional sticks and the Impression compound green sticks. No statistically significant difference was observed between the other groups of materials.

The measured mean negative pressure values created between the prosthetic field and the custom tray showed close values for each patient - with a difference of -0.05 to -0.1 bar. This showed that the anatomical features of the prosthetic field were of great importance.

In conclusion, quantitative measurement of negative pressure is entirely possible under clinical conditions. Thermoplastic materials for border molding are retained and formed only along the edge of the custom tray. Silicone impression materials, however, spread not only along the edge of the custom tray but also onto the alveolar ridge, demonstrating their superior manipulative qualities and accuracy for the purposes of border molding.

Credit: 
Pensoft Publishers

Can your diet help protect the environment?

If Americans adhere to global dietary recommendations designed to reduce the impact of food production and consumption, environmental degradation could be reduced by up to 38%, according to a new paper published in the journal Environmental Justice.

"What we eat has an impact on the environment through the land used to grow food, net greenhouse gases released by producing food, and water use," said Joe Bozeman, a research associate in the University of Illinois Chicago Institute for Environmental Science and Policy and lead author of the study. "By following guidelines developed with human health and the environment in mind, we can help reduce the environmental impact of food production."

Bozeman and colleagues wanted to see what shifts Americans would need to make in order to adhere to the EAT-Lancet Commission guidelines, the first-ever global dietary guidelines. Drafted in 2019, the recommendations were developed to help reduce environmental degradation caused by food production and consumption by an estimated global population of 10 billion people by the year 2050.

In a previous study, the researchers analyzed data from the U.S. Environmental Protection Agency's What We Eat in America Food Commodity Intake Database -- which provides per capita food consumption estimates for more than 500 types of food, such as apples, poultry, bread and water -- and from the National Health and Nutrition Examination Survey, which provides estimates of individual dietary intake. They also collected information on the environmental impact of these foods from various databases and from the scientific literature. They found that meat and refined sugar are among foods with the highest negative impact on the environment, while vegetables, fish and nuts have a lower impact.
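Conceptually, a per-capita dietary footprint combines consumption estimates with per-food impact factors. The sketch below uses invented numbers purely to show the bookkeeping; the food categories and intensity values are assumptions, not figures from the databases named above.

```python
# Hypothetical per-capita intake (kg/year) and greenhouse-gas intensity (kg CO2e per kg food).
intake_kg_per_year = {"red_meat": 40, "vegetables": 80, "nuts": 5, "added_sugar": 30}
ghg_kg_co2e_per_kg = {"red_meat": 27.0, "vegetables": 2.0, "nuts": 2.3, "added_sugar": 3.2}

footprint = sum(intake_kg_per_year[f] * ghg_kg_co2e_per_kg[f] for f in intake_kg_per_year)
print(f"annual diet-related footprint: {footprint:.0f} kg CO2e per person")

# A shift in the EAT-Lancet direction (less red meat and added sugar, more vegetables and nuts)
# would be evaluated by recomputing the same sum with the adjusted intake values.
```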

The researchers used the same resources to zero in on changes in food consumption and strategies that would bring the U.S. population into adherence with the EAT-Lancet Commission guidelines. They calculated changes that would be required for Black, Latinx and white populations in the U.S.

"We found that shifting to increased vegetable and nuts intake while decreasing red meat and added sugars consumption would help Americans meet EAT-Lancet criteria and reduce environmental degradation between 28% and 38% compared to current levels," Bozeman said. "At the same time, health outcomes would improve, so following these global recommendations would result in a win-win for the environment and human health."

Different populations would have to make different changes, based on their current dietary patterns, Bozeman said. Black people could meet the criteria by shifting dietary intake to include more vegetables and nuts, but less red meat, chicken and added sugars. Latinx people would need to shift their dietary intake to more vegetables and nuts, but less red meat, eggs and added sugars. White people would need to shift their consumption to include less red meat and added sugars, but more nuts.

Taken together, these results show that meeting all criteria, using a balanced diet approach, would significantly decrease environmental degradation in land, greenhouse gases and water.

"Our results provide foundational information that can inform the development of culturally-tailored dietary intervention strategies that consider the implications for human and environmental health," said Sparkle Springfield, assistant professor of public health sciences at Loyola University, Chicago and a co-author on the paper.

"However, there is still a need to address the structural and social determinants of diet outcomes, particularly in African American and Latinx populations, in order to promote health equity," she said.

In the paper, Bozeman and colleagues call upon the U.S. Department of Agriculture and the World Health Organization to address the unique barriers minority populations face in accessing the healthy foods needed to achieve a sustainable diet.

Credit: 
University of Illinois Chicago

CRISPRing trees for a climate-friendly economy

Researchers led by Prof. Wout Boerjan (VIB-UGent Center for Plant Systems Biology) have discovered a way to stably fine-tune the amount of lignin in poplar by applying CRISPR/Cas9 technology. Lignin is one of the main structural substances in plants, and it makes processing wood into, for example, paper difficult. This study is an important breakthrough in the development of wood resources for the production of paper with a lower carbon footprint, biofuels, and other bio-based materials. Their work, carried out in collaboration with VIVES University College (Roeselare, Belgium) and the University of Wisconsin (USA), appears in Nature Communications.

Towards a bio-based economy

Today's fossil-based economy results in a net increase of CO2 in the Earth's atmosphere and is a major cause of global climate change. To counter this, a shift towards a circular and bio-based economy is essential. Woody biomass can play a crucial role in such a bio-based economy by serving as a renewable and carbon-neutral resource for the production of many chemicals. Unfortunately, the presence of lignin hinders the processing of wood into bio-based products.

Prof. Wout Boerjan (VIB-UGent): "A few years ago, we performed a field trial with poplars that were engineered to make wood containing less lignin. Most plants showed large improvements in processing efficiency for many possible applications. The downside, however, was that the reduction in lignin accomplished with the technology we used then - RNA interference - was unstable and the trees grew less tall."

New tools

Undeterred, the researchers went looking for a solution. They employed the recent CRISPR/Cas9 technology in poplar to lower the lignin amount in a stable way, without causing a biomass yield penalty. In other words, the trees grew just as well and as tall as those without genetic changes.

Dr. Barbara De Meester (VIB-UGent): "Poplar is a diploid species, meaning every gene is present in two copies. Using CRISPR/Cas9, we introduced specific changes in both copies of a gene that is crucial for the biosynthesis of lignin. We inactivated one copy of the gene, and only partially inactivated the other. The resulting poplar line had a stable 10% reduction in lignin amount while it grew normally in the greenhouse. Wood from the engineered trees had an up to 41% increase in processing efficiency".

Dr. Ruben Vanholme (VIB-UGent): "The mutations that we have introduced through CRISPR/Cas9 are similar to those that spontaneously arise in nature. The advantage of the CRISPR/Cas9 method is that the beneficial mutations can be directly introduced into the DNA of highly productive tree varieties in only a fraction of the time it would take by a classical breeding strategy."

The applications of this method are not only restricted to lignin but might also be useful to engineer other traits in crops, providing a versatile new breeding tool to improve agricultural productivity.

Credit: 
VIB (the Flanders Institute for Biotechnology)

The number and clonality of TCRs are associated with the prognosis of colorectal cancer

Colorectal cancer (CRC) is the third most common cancer in the world, with more than one and a half million new cases diagnosed annually. Approximately 20% of diagnosed stage II patients experience relapses after surgery. There is no marker yet that identifies stage II patients at risk of relapse. Therefore, it is important to be able to identify prognostic biomarkers for this specific setting.

It was already known that the infiltration of T cells (a type of lymphocyte that is part of the adaptive immune system) plays an important role in the survival of patients with colorectal cancer. Using a new technique called 'TCR immuno-sequencing' on samples from more than 600 patients with colorectal cancer, the researchers have now verified the usefulness of this new biomarker. The technique measures both the amount of infiltrated T lymphocytes and their clonality, that is, the diversity of lymphocytes that recognize different targets.

This study is a collaboration led by Víctor Moreno, head of the Oncology Data Analysis Program (PADO) of the Catalan Institute of Oncology (ICO), the Bellvitge Biomedical Research Institute (IDIBELL), CIBERESP, and the University of Barcelona, and his team, together with the research groups led by Dr. Steven Gruber of the City of Hope National Medical Center in Los Angeles, USA; Dr. Harlan Robins, CEO of Adaptive Biotechnologies, a spin-off of the Fred Hutchinson Cancer Research Center in Seattle, USA; and Prof. Gad Rennert of the Carmel Medical Center and Technion in Haifa, Israel.

How has this study been carried out?

Using this new TCR immuno-sequencing technique, a total of 640 colorectal cancer tumors were sequenced across four different studies, three with patients from ICO Hospitalet and one from Israel. Unlike other methods, sequencing the TCR regions (the T-cell receptors through which lymphocytes recognize tumor antigens) yields both the abundance of T cells infiltrating the tumor and their clonality index.
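Clonality is commonly derived from the distribution of TCR clone frequencies, often as one minus the normalized Shannon entropy. The sketch below shows that common definition with made-up clone counts; it may differ from the exact metric used in this study.

```python
import math

# Hypothetical TCR clone counts from one tumor sample (number of reads per unique clone).
clone_counts = [120, 45, 30, 10, 5, 5, 3, 2]

total = sum(clone_counts)
frequencies = [c / total for c in clone_counts]

# Shannon entropy of the clone frequency distribution, normalized by its maximum.
entropy = -sum(p * math.log(p) for p in frequencies)
max_entropy = math.log(len(clone_counts))

clonality = 1 - entropy / max_entropy  # 0 = fully diverse repertoire, 1 = monoclonal
print(f"T-cell abundance (reads): {total}, clonality: {clonality:.2f}")
```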

The results have shown that the combination of both variables, quantity and clonality, is associated with prognosis. The samples with the highest amounts of TCRs and the greatest diversity of clones are those that show a better prognosis of the disease. According to the head of the Oncology Data Analysis Program (PADO) of the ICO-IDIBELL, Víctor Moreno, "the results of this study show that higher levels of TCRs, and a greater diversity of TCRs, are associated with a better prognosis in this specific group of patients, where there are still no clear markers of recurrence".

Credit: 
IDIBELL-Bellvitge Biomedical Research Institute