Tech

Revealing the unexpected structure of iron-exporter ferroportin

Too much or too little iron in the body can lead to disease, but organisms have developed ways to keep iron levels in balance. Ferroportin, the only known iron exporter that releases iron into the bloodstream, is a crucial component of iron-balancing mechanisms.

Looking to reach a better understanding of iron regulation, a team led by researchers at Baylor College of Medicine analyzed the 3D structure of a mammalian ferroportin, revealing unexpected characteristics and a novel mode of action that can guide the development of innovative therapeutic strategies. The study appears in Nature Communications.

"More than 60 mutations in the ferroportin gene have been associated with human diseases. Some of those mutations make ferroportin insensitive to hepcidin, a peptide hormone synthesized in the liver that contributes to ferroportin's regulation," said corresponding author Dr. Ming Zhou, Ruth McLean Bowman Bowers Professor in Biochemistry and Molecular Biology. Zhou also is a member of the Dan L Duncan Comprehensive Cancer Center at Baylor.

Ferroportin and hepcidin coordinate their activities to keep iron in the body in the right balance. Ferroportin exports iron into the blood, and hepcidin ensures that those exports do not exceed the body's needs. When ferroportin does not respond to hepcidin, it remains active and the body accumulates excess iron, a condition called iron overload disease.

"We would like to better understand the process of regulation of iron transport by looking at the structure of ferroportin," said co-first author Dr. Yaping Pan, assistant professor of biochemistry and molecular biology at Baylor. "Neither the structure of ferroportin nor the structure of ferroportin and hepcidin together have been described. A closer look at these structures would provide new insights into how ferroportin works and how hepcidin regulates its activity, opening possibilities for novel approaches to treat iron overload disease."

A challenging protein

The researchers studied ferroportin from the Philippine tarsier, a primate whose ferroportin is more than 90 percent similar to the human protein. Previous studies of bacterial ferroportin and other iron transporters had shown that these proteins have only one iron-binding site; they carry one iron ion at a time.

"We began our study assuming that tarsier ferroportin also had one iron-binding site and were quite puzzled by the results of our experiments," said co-first author Jiemin Shen, graduate student in quantitative and computational biosciences in the Zhou lab.

For instance, the team conducted experiments to determine how altering the iron-binding site would affect the 3D structure of ferroportin. They were intrigued to find that altering the site did not seem to have much of an effect, the opposite of what they expected. "Once we revealed ferroportin's structure with cryo-electron microscopy, we realized that it has two iron-binding sites. This was a surprise that explained the data that had puzzled us."

"We were altering only one site and it seems that the other site was still at work binding iron, so we didn't see much change on ferroportin's structure," Shen said.

"A second really exciting discovery was that tarsier ferroportin's mode of action is different from those reported for other iron transporters," Zhou said.

"The iron ion ferroportin exports carries two positive electrical charges. We found that when ferroportin exports iron ions, protons, which have a positive charge, are transported into the cell, balancing the charges and facilitating further iron export," Pan said.

"We are using these new structural and functional findings to identify small molecule candidates that can regulate ferroportin. We are also studying human ferroportin," Zhou said. "This project has good potential for translating the findings to the bedside."

Credit: 
Baylor College of Medicine

Upgraded radar can enable self-driving cars to see clearly no matter the weather

video: A new radar system with improved imaging capability accurately predicts the dimensions of a moving car in fog.

Image: 
Kshitiz Bansal

A new kind of radar could make it possible for self-driving cars to navigate safely in bad weather. Electrical engineers at the University of California San Diego developed a clever way to improve the imaging capability of existing radar sensors so that they accurately predict the shape and size of objects in the scene. The system worked well when tested at night and in foggy conditions.

The team will present their work at the ACM SenSys conference, held Nov. 16 to 19.

Inclement weather conditions pose a challenge for self-driving cars. These vehicles rely on technology like LiDAR and radar to "see" and navigate, but each has its shortcomings. LiDAR, which works by bouncing laser beams off surrounding objects, can paint a high-resolution 3D picture on a clear day, but it cannot see in fog, dust, rain or snow. On the other hand, radar, which transmits radio waves, can see in all weather, but it only captures a partial picture of the road scene.

Enter a new UC San Diego technology that improves how radar sees.

"It's a LiDAR-like radar," said Dinesh Bharadia, a professor of electrical and computer engineering at the UC San Diego Jacobs School of Engineering. It's an inexpensive approach to achieving bad weather perception in self-driving cars, he noted. "Fusing LiDAR and radar can also be done with our techniques, but radars are cheap. This way, we don't need to use expensive LiDARs."

The system consists of two radar sensors placed on the hood and spaced an average car's width apart (1.5 meters). Having two radar sensors arranged this way is key--they enable the system to see more space and detail than a single radar sensor.

During test drives on clear days and nights, the system performed as well as a LiDAR sensor at determining the dimensions of cars moving in traffic. Its performance did not change in tests simulating foggy weather. The team "hid" another vehicle using a fog machine and their system accurately predicted its 3D geometry. The LiDAR sensor essentially failed the test.

Two eyes are better than one

The reason radar traditionally suffers from poor imaging quality is because when radio waves are transmitted and bounced off objects, only a small fraction of signals ever gets reflected back to the sensor. As a result, vehicles, pedestrians and other objects appear as a sparse set of points.

"This is the problem with using a single radar for imaging. It receives just a few points to represent the scene, so the perception is poor. There can be other cars in the environment that you don't see," said Kshitiz Bansal, a computer science and engineering Ph.D. student at UC San Diego. "So if a single radar is causing this blindness, a multi-radar setup will improve perception by increasing the number of points that are reflected back."

The team found that spacing two radar sensors 1.5 meters apart on the hood of the car was the optimal arrangement. "By having two radars at different vantage points with an overlapping field of view, we create a region of high resolution, with a high probability of detecting the objects that are present," Bansal said.

A tale of two radars

The system overcomes another problem with radar: noise. It is common to see random points, which do not belong to any objects, appear in radar images. The sensor can also pick up what are called echo signals, which are reflections of radio waves that are not directly from the objects that are being detected.

More radars mean more noise, Bharadia noted. So the team developed new algorithms that can fuse the information from two different radar sensors together and produce a new image free of noise.
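The core idea behind that fusion can be sketched as a cross-validation filter: a detection is trusted only if both radars see something at roughly the same spot, which removes isolated noise points and stray echoes. The snippet below is a minimal sketch of that idea, not the team's published algorithm; the point format, the shared coordinate frame, and the 0.5-meter matching radius are all assumptions.

```python
import numpy as np

def fuse_radar_point_clouds(points_a, points_b, match_radius=0.5):
    """Keep only points from radar A that radar B corroborates.

    points_a, points_b: (N, 3) arrays of x, y, z detections, assumed to be
    already transformed into a shared vehicle coordinate frame.
    This is an illustrative cross-validation filter, not the published method.
    """
    fused = []
    for point in points_a:
        # A detection is kept if the second radar reports something nearby;
        # isolated points (likely noise or stray echoes) are discarded.
        distances = np.linalg.norm(points_b - point, axis=1)
        if distances.min() <= match_radius:
            fused.append(point)
    return np.array(fused)

# Example: two sparse detections of the same car plus one lone noise point.
radar_1 = np.array([[10.0, 2.0, 0.5], [10.4, 2.1, 0.6], [30.0, -5.0, 0.2]])
radar_2 = np.array([[10.1, 2.0, 0.5], [10.5, 2.2, 0.6]])
print(fuse_radar_point_clouds(radar_1, radar_2))  # the lone point at x = 30 is dropped
```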

Another innovation of this work is that the team constructed the first dataset combining data from two radars.

"There are currently no publicly available datasets with this kind of data, from multiple radars with an overlapping field of view," Bharadia said. "We collected our own data and built our own dataset for training our algorithms and for testing."

The dataset consists of 54,000 radar frames of driving scenes during the day and night in live traffic, and in simulated fog conditions. Future work will include collecting more data in the rain. To do this, the team will first need to build better protective covers for their hardware.

The team is now working with Toyota to fuse the new radar technology with cameras. The researchers say this could potentially replace LiDAR. "Radar alone cannot tell us the color, make or model of a car. These features are also important for improving perception in self-driving cars," Bharadia said.

Credit: 
University of California - San Diego

Like fire and ice: Why societies are increasingly fragmenting

image: People like to create social triangles with others. Red lines represent friendly and cooperative relations between individuals, blue lines are negative or hostile links. We usually cope better with balanced relationships, i.e. when all three in the triangle get along with each other well (triangle 1), or when one person (i) that is on good terms with one (j) and on bad terms with another (k) observes that j and k dislike each other too (triangle 2). What we dislike is when two friends don't get along (triangle 3). Unbalanced social relationships are much rarer in societies than balanced ones.

Image: 
CSH Vienna

[Vienna, Nov 18, 2020] Scientists at the Complexity Science Hub Vienna (CSH) show that the accelerating fragmentation of society - often referred to as filter bubbles - is a direct consequence of the growing number of social contacts. According to their model, societies can only be either cohesive or fragmented. And just as water becomes ice or gas at a certain temperature, a society abruptly changes from one state to the other at certain tipping points.

"Equal to equal"

For their theory of social fragmentation, just published in the Journal of the Royal Society Interface, the researchers use two classical sociological concepts that have been empirically tested in hundreds of studies over the past decades. The first hypothesis is that of homophily. "People are happier when they do not disagree or argue with others," explains Tuan Pham (CSH & Medical University of Vienna), the first author of the study. "One can also say: Like will to like. In order to avoid stress, there is a tendency for opinions within a group to become more and more similar and aligned with each other," he adds.

The second concept is the Social Balance Theory (SBT) of the Austrian psychologist Fritz Heider (1946). Put simply, it describes the fact that people are keen to ensure that their friends get along well with each other. "We like to construct social triangles," Stefan Thurner (CSH & Medical University of Vienna) points out. "What we like best is when all three in the triangle love each other. What we don't like is when two people with whom we are on good terms do not like or argue with each other. As a matter of fact, such states of imbalance occur much less frequently in societies."

Phase transition: from cohesive to fragmented

In their simple social model, the complexity researchers combine homophily and SBT with the physical principle of energy minimization. "We apply this to societies and say: People in societies seek the state of least social stress," says Thurner. "And here we clearly see two social phase states: Either the society is cohesive, meaning that there is cohesion and exchange, and cooperation can take place, or it disintegrates into small bubbles of like-minded people. Although they then get along well with each other, constructive communication across the bubbles is no longer possible. Society fragments."
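A toy version of such a model makes the "least social stress" idea concrete: give every tie a value of +1 (friendly) or -1 (hostile), let each triangle contribute low energy when it is balanced (the product of its three ties is positive) and high energy when it is not, and sum over all triangles. The sketch below implements that toy energy; the specific formula is an illustrative assumption, not the published model.

```python
import itertools
import numpy as np

def social_stress(ties):
    """Toy balance-theory energy: sum of -(s_ij * s_jk * s_ik) over all triangles.

    ties: symmetric matrix of +1 (friendly) / -1 (hostile) relationships.
    Balanced triangles (product +1) lower the energy; unbalanced ones raise it.
    This is an illustrative stand-in, not the authors' exact formulation.
    """
    n = ties.shape[0]
    energy = 0
    for i, j, k in itertools.combinations(range(n), 3):
        energy -= ties[i, j] * ties[j, k] * ties[i, k]
    return energy

# Three mutual friends (balanced) vs. two of my friends who dislike each other
# (unbalanced, like triangle 3 in the image caption above).
balanced = np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]])
unbalanced = np.array([[0, 1, 1], [1, 0, -1], [1, -1, 0]])
print(social_stress(balanced), social_stress(unbalanced))  # -1 (low stress) vs 1
```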

Too many social contacts lead to the tipping point

The transition, the researchers say, is abrupt. But what causes the tipping? In the phase transition of water, it is temperature; in societies, according to their theory, the tipping point is the number of contacts people have. Thanks to the Internet, smartphones and social media, this number has exploded in recent years. "A few decades ago, we had to share our phone line with other households. Then every household had a line, later every person had his or her own phone. Today, smartphones connect us with people all over the world at all times - and simultaneously through many channels," explains Thurner.

This is becoming a problem for the well-being of individuals. "Disagreements in small groups, for example disputes with two out of ten people in an extended family, are something we can handle quite well," assures Tuan Pham. "But if 20 out of 100 people are suddenly against me, I can't stand it. As a consequence, I will avoid these 20 in the future. Instead I will stay within my own social bubbles. This is particularly easy in the online world." If many people do this at the same time, the automatic fragmentation observed in the new social model occurs. "This is as certain as a law of nature," says Thurner.

Democracies at risk

If the basic sociological assumptions are correct, CSH president Thurner sees a huge problem that could endanger our democracies as well as the management of massive challenges such as the climate crisis or future pandemics. "If people stay within their bubbles and are no longer willing to leave these comfort zones, how are we, as a society, supposed to negotiate important issues and reach compromises that are the basis of all democracy?" The last two U.S. elections or the increasingly rapid spread of conspiracy theories show how real and potentially explosive this development is.

But what to do to save democracy? "The most effective means would be to dramatically reduce contacts again - this is completely unrealistic," says Thurner. "We really have to think about this urgently." To begin with, the researchers want to test their model with large data sets.

Credit: 
Complexity Science Hub

NREL advanced manufacturing research moves wind turbine blades toward recyclability

A new material for wind blades that can be recycled could transform the wind industry, rendering renewable energy more sustainable than ever before while lowering costs in the process.

Researchers at the National Renewable Energy Laboratory (NREL) have validated the use of a thermoplastic resin for wind turbine blades. They first demonstrated the feasibility of the resin, developed by the Pennsylvania-based company Arkema Inc., by manufacturing a 9-meter-long wind turbine blade, and have now validated the structural integrity of a 13-meter-long thermoplastic composite blade, also manufactured at NREL.

In addition to the recyclability aspect, thermoplastic resin can enable longer, lighter-weight, and lower-cost blades. Manufacturing blades using current thermoset resin systems requires more energy and manpower in the manufacturing facility, and the end product often winds up in landfills.

"With thermoset resin systems, it's almost like when you fry an egg. You can't reverse that," said Derek Berry, a senior engineer at NREL. "But with a thermoplastic resin system, you can make a blade out of it. You heat it to a certain temperature, and it melts back down. You can get the liquid resin back and reuse that."

Berry is co-author of a new paper titled, "Structural Comparison of a Thermoplastic Composite Wind Turbine Blade and a Thermoset Composite Wind Turbine Blade," which appears in the journal Renewable Energy.

The other authors, also from NREL, are Robynne Murray, Ryan Beach, David Barnes, David Snowberg, Samantha Rooney, Mike Jenks, Bill Gage, Troy Boro, Sara Wallen, and Scott Hughes.

NREL has also developed a technoeconomic model to explore the cost benefits of using a thermoplastic resin in blades. Current wind turbine blades are made primarily of composite materials such as fiberglass infused with a thermoset resin. With an epoxy thermoset resin, the manufacturing process requires the use of additional heat to cure the resin, which adds to the cost and cycle time of the blades. Thermoplastic resin, however, cures at room temperature. The process does not require as much labor, which accounts for about 40% of the cost of a blade. The new process, the researchers determined, could make blades about 5% less expensive to make.
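The arithmetic behind that estimate is simple: a saving in labor translates into a saving on the whole blade in proportion to labor's share of the cost. The sketch below shows the relation; the 40% labor share comes from the article, while the assumed labor reduction is purely illustrative.

```python
# Back-of-the-envelope link between labor savings and blade cost savings.
# The 40% labor share is stated in the article; the 12.5% labor reduction is
# an assumed figure chosen only to show how a ~5% overall saving could arise.
labor_share_of_cost = 0.40
assumed_labor_reduction = 0.125

blade_cost_saving = labor_share_of_cost * assumed_labor_reduction
print(f"overall blade cost saving ~ {blade_cost_saving:.1%}")  # ~5.0%
```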

NREL is home to the Composites Manufacturing Education and Technology (CoMET) Facility at the Flatirons Campus near Boulder, Colorado. There, researchers design, manufacture, and test composite turbine blades. They previously demonstrated the feasibility of the thermoplastic resin system by manufacturing a 9-meter composite wind turbine blade. They followed that demonstration by manufacturing and structurally validating a 13-meter thermoplastic composite blade compared to a near-identical thermoset blade. This work, coupled with work by Arkema and other Institute for Advanced Composites Manufacturing Innovation partners, demonstrated advantages to moving away from the thermoset resin system.

"The thermoplastic material absorbs more energy from loads on the blades due to the wind, which can reduce the wear and tear from these loads to the rest of the turbine system, which is a good thing," Murray said.

The thermoplastic resin could also allow manufacturers to build blades on site, alleviating a problem the industry faces as it trends toward larger and longer blades. As blade sizes grow, so does the problem of how to transport them from a manufacturing facility.

Credit: 
DOE/National Renewable Energy Laboratory

Just hours of training triples doctor confidence in use of handheld ultrasound devices

While ultrasound imaging technology has been available for more than 50 years, machine size and cost limited its reach to medical offices and hospitals. However, relatively recent advances in this technology have allowed for the development of ultra-portable handheld devices that look more like the tricorder from Star Trek than a traditional ultrasound machine. These devices hold promise in many areas of medicine, and are especially promising for uses such as monitoring heart failure and pneumonia in geriatric populations that have difficulties with mobility or remain entirely housebound.

But when any kind of innovative technology is adopted into the medical community, appropriate training to ensure diagnostic accuracy and safety is essential. A recent trial showed that a short training course - made even shorter when it had to be ended early because of the COVID-19 outbreak - dramatically increased the confidence of four geriatricians in their ability to use the devices and interpret the images they obtained. Findings from a study of the approach by Penn Medicine researchers were published recently in the Journal of the American Geriatrics Society.

"With ultrasound devices like these being handheld, they can fit in your pocket, so it's always available to doctors, like a stethoscope," said the study's lead author, Daniel Kim, MD, a fellow in Geriatric Medicine in the Perelman School of Medicine at the University of Pennsylvania. "But physicians must be confident in their ability to use the technology, which means knowing the clinical indications for its usage and how to position the device to obtain adequate images and interpret them afterwards."

Geriatric patients disproportionately face barriers to care because of their serious medical conditions and transportation challenges. Ultra-portable handheld devices, called point-of-care ultrasound (POCUS) devices, can be carried to places like nursing homes or even patients' homes, bringing evaluation and the potential for care directly to patients. For instance, doctors or other health professionals may use handheld devices for fluid volume assessment, often related to heart failure, as well as for checking for urinary retention, pneumonia, and lower extremity blood clots, which are all common among older adults.

Currently, there are training curriculums for emergency medicine physicians and trauma surgeons to learn to use these devices, but courses for geriatricians have not been established. As such, Kim and his coauthors created one, which included a hands-on workshop, to train the doctors working regularly with older populations. They tested it among four different fellows in geriatric medicine who considered themselves novice users at the start.

After Kim's four-hour ultrasound training workshop, the four geriatric medicine fellows' overall comfort and confidence in both using the POCUS device and interpreting the images doubled. By the end of the extended curriculum, which included about an hour of supervised scanning on patients each week, that comfort level almost tripled. These gains were roughly similar along three of the four separate areas of focus that the course covered (bladder, lungs, and internal jugular).

There was a difference when it came to the assessment of the heart. Confidence in using the device did increase dramatically, and continued to do so after the workshop, albeit at a much shallower rate than the other areas of focus. But confidence in the interpretation of images of the heart actually declined somewhat.

"A decline in confidence in heart ultrasound image interpretation after a brief training course is not uncommon," said senior author Nova Panebianco, MD, an associate professor of Emergency Medicine and emergency ultrasound fellowship director. "Cardiac ultrasound is so complex that even cardiologists sometimes obtain additional fellowship training in the subject. I suspect that, with training, the geriatric fellows realized how much more there is to know."

These high gains in confidence were remarkable because they came quickly. The workshop was held on the first day of the curriculum, Jan. 18, 2020. While scheduled to run for six months, the curriculum was only able to continue into March due to the COVID-19 outbreak. It was then suspended because a large part of the training centered on doing supervised scans, which was not possible amid social distancing. But by then the doctors had already gained a significant working knowledge of performing exams with the POCUS devices.

Kim hopes to develop new methods for teaching the use of this device among older populations, and to expand the training to geriatric medicine fellows from nearby health systems.

"We want to organize a virtual ultrasound webinar workshop and teach others to use a similar model for instruction," Kim said. "We hope that the more people we can get in Geriatrics to feel comfortable with this, the more scans we'll be able to do on people who would never have otherwise gotten them."

Credit: 
University of Pennsylvania School of Medicine

Quantifying quantumness: A mathematical project 'of immense beauty'

image: Majorana constellations of some of the most quantum states in various dimensions

Image: 
Luis L. Sánchez Soto

WASHINGTON, November 17, 2020 -- Large objects, such as baseballs, vehicles, and planets, behave in accordance with the classical laws of mechanics formulated by Sir Isaac Newton. Small ones, such as atoms and subatomic particles, are governed by quantum mechanics, where an object can behave as both a wave and a particle.

The boundary between the classical and quantum realms has always been of great interest. Research reported in AVS Quantum Science, by AIP Publishing, considers the question of what makes something "more quantum" than another -- is there a way to characterize "quantumness"? The authors report they have found a way to do just that.

The degree of quantumness is important for applications such as quantum computing and quantum sensing, which offer advantages that are not found in their classical counterparts. Understanding these advantages requires, in turn, an understanding of the degree of quantumness of the physical systems involved.

Rather than proposing a scale whose values would be associated with the degree of quantumness, the authors of this study look at extrema, namely those states that are either the most quantum or the least quantum. Author Luis Sanchez-Soto said the idea for the study came from a question posed at a scientific meeting.

"I was giving a seminar on this topic when someone asked me the question: 'You guys in quantum optics always talk about the most classical states, but what about the most quantum states?'" he said.

It has long been understood that so-called coherent states can be described as quasi-classical. Coherent states occur, for example, in a laser, where light from multiple photon sources is in phase, making them the least quantum of states.

A quantum system can often be represented mathematically by points on a sphere. This type of representation is called a Majorana constellation, and for coherent states, the constellation is simply a single point. Since these are the least quantum of states, the most quantum ones would have constellations that cover more of the sphere.

The investigators looked at several ways that other scientists have explored quantumness and considered the Majorana constellation for each approach. They then asked which set of points is the most evenly distributed over the sphere for that approach.
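One standard way to make "most evenly distributed" precise is to treat the N points as mutually repelling charges confined to the sphere and minimize their total repulsion (the classic Thomson problem). The sketch below does this numerically as an illustration; it is a stand-in for the general idea, not the specific quantumness measures used in the paper.

```python
import numpy as np

def spread_points_on_sphere(n, steps=2000, lr=0.01, seed=0):
    """Spread n points over the unit sphere by letting them repel each other,
    a numerical stand-in for "most evenly distributed" (Thomson-style problem)."""
    rng = np.random.default_rng(seed)
    pts = rng.normal(size=(n, 3))
    pts /= np.linalg.norm(pts, axis=1, keepdims=True)      # start on the sphere
    for _ in range(steps):
        diff = pts[:, None, :] - pts[None, :, :]            # pairwise differences
        dist = np.linalg.norm(diff, axis=-1) + np.eye(n)    # avoid divide-by-zero
        force = (diff / dist[..., None] ** 3).sum(axis=1)   # Coulomb-like repulsion
        pts += lr * force
        pts /= np.linalg.norm(pts, axis=1, keepdims=True)   # project back onto sphere
    return pts

# Four points settle near the vertices of a regular tetrahedron.
print(np.round(spread_points_on_sphere(4), 2))
```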

As Sanchez-Soto and his colleagues considered the question of quantumness, they realized it was a mathematical project "of immense beauty," in addition to being useful.

Credit: 
American Institute of Physics

Seeking the most effective polymers for personal protective equipment

image: Using fluorescence to demonstrate how particles bind differently to different types of materials.

Image: 
Morgan Alexander

WASHINGTON, November 17, 2020 -- Personal protective equipment, like face masks and gowns, is generally made of polymers. But not much attention is typically given to the selection of polymers used beyond their physical properties.

To help with the identification of materials that will bind to a virus and speed its inactivation for use in PPE, researchers from the University of Nottingham, EMD Millipore, and the Philipps University of Marburg developed a high-throughput approach for analyzing the interactions between materials and viruslike particles. They report their method in the journal Biointerphases, from AIP Publishing.

"We've been very interested in the fact that polymers can have effects on cells on their surface," said Morgan Alexander, an author on the paper. "We can get polymers, which resist bacteria, for example, without designing any particular clever or smart material with antibiotic in there. You just have to choose the right polymer. This paper extends this thinking to viral binding."

The group created microarrays of 300 different monomer compositions of polymers representing a wide variety of characteristics. They exposed the polymers to Lassa and Rubella viruslike particles -- particles with the same structure as their viral counterparts but without the infectious genomes activated -- to see which materials were able to preferentially adsorb the particles.

"Knowing that different polymers bind and possibly inactivate virus to different degrees means we may be able to make recommendations. Should I use this existing glove material or that glove if I want the virus to bind to it and die and not fly into the air when I take the gloves off?" Alexander said.

Though this may seem like an obvious method for quickly screening large quantities of materials, the team's interdisciplinary makeup makes them uniquely positioned to conduct such a study. The surface scientists have the capabilities to create large numbers of chemicals on microarrays, and the biologists have access to viruslike particles.

So far, the tests have only looked at viruslike particles of Lassa and Rubella, but the group is hoping to acquire a grant to look at viruslike particles of SARS-CoV-2, the COVID-19 virus.

Once a handful of the best-performing materials have been determined, the next step of the project will be to use live viruses to evaluate the viral infectious lifetime on the materials, taking into account real-world environmental conditions, like humidity and temperature. With enough data, a molecular model can be built to describe the interactions.

"Strong binding and quick denaturing of a virus on a polymer would be great," said Alexander. "It remains to be seen whether the effect is significantly large to make a real difference, but we need to look to find out."

Credit: 
American Institute of Physics

New study could help predict which individuals are more susceptible to cancer-causing agents

New insights into the mechanisms by which cancer-causing agents in the environment activate genetic recombination in DNA could help to explain some of the effects of exposure, as well as predict which individuals may be more susceptible to developing cancer, a new UK study suggests.

Everyone is exposed to low levels of carcinogens (substances or radiation that promote the formation of cancer) in the environment. One of the most widely found is benzopyrene - a general chemical pollutant present in smoke from stoves such as wood burners, in exhaust fumes, and in barbecued meat and fish. An active metabolite of benzopyrene, BPDE, directly damages the DNA sequence, forming what are known as adducts, which in turn promote cancer-causing mutations.

While models exist showing how BPDE causes these mutations, some of the pathways are still not understood. It is currently believed that BPDE adducts cause mutations during DNA synthesis because they activate a process called translesion synthesis - where cells copy the DNA despite the presence of unrepaired damage to allow progression of the replication fork - and this induces mutations. However, evidence also suggests the involvement of another process called homologous recombination (HR), which works by copying other undamaged parts of the genome. HR proteins repair complex DNA damage such as breaks in the DNA strands and interstrand cross-links, and they protect and recover stalled or broken replication forks.

This latest study treated human cell lines with BPDE before using molecular biology methods, such as microscopy, to characterise the homologous recombination pathway in detail. Results have offered new insights showing that HR proceeds by an unusual mechanism at BPDE adducts and the process can be activated even when there are no stalled or collapsed replication forks. Instead, it is activated at single-stranded gaps in the DNA that are generated by the re-priming activity of PrimPol - a protein encoded by the PRIMPOL gene in humans.

The findings also address longstanding questions by showing that at bulky DNA adducts, the exchanges between the sister chromatids (the identical copies formed by the DNA replication of a chromosome), products of HR that have been traditionally connected with replication fork collapse and DSB repair, are associated with the repair of post-replicative gaps. Furthermore, these post-replicative gaps are produced by PrimPol, shedding light on the function of PrimPol during DNA damage tolerance.

Corresponding author Dr Eva Petermann from the University of Birmingham's Institute of Cancer and Genomic Sciences says: "Our study has revealed new insights into the effects of benzopyrene exposure in cells, which is important for understanding environmental causes of cancer and cancer development in general. Understanding this mechanism could help to better predict and detect negative effects of pollution as well as allowing for better interpretation of cancer genomics. For example, genetic variants in the HR genes BRCA2 and RAD52 have been linked to lung cancer susceptibility, meaning that understanding how HR helps cells deal with benzopyrene could help us to predict individuals who may be more susceptible to the disease.

"Moving forwards it will be important to investigate the impact of such genetic variants on HR at ssDNA gaps. A PRIMPOL variant has also been suggested to play a potential role in cancer. It could also help predict which individuals will be more sensitive to carcinogen exposure."

Credit: 
University of Birmingham

Existing antidepressant helps to inhibit growth of cancer cells in lab animals

New research has shown that the antidepressant sertraline helps to inhibit the growth of cancer cells. The substance acts on a metabolic addiction that allows different types of cancer to grow. This is shown by a study on cell cultures and lab animals performed by various research labs of KU Leuven. Their findings were published in Molecular Cancer Therapeutics, a journal of the American Association for Cancer Research.

Cancer cells use different biological mechanisms to stimulate their growth. In certain types of breast cancer, leukaemia, skin cancer, brain tumours and lung cancer, among others, the malignant cells produce large amounts of serine and glycine, two amino acids. This production stimulates the growth of cancer cells to such an extent that they become addicted to serine and glycine.

"This mechanism is an interesting target because cancer cells are so dependent on it", says Professor Kim De Keersmaecker, head of the Laboratory for Disease Mechanisms in Cancer (LDMC). "Healthy cells use this mechanism to a lesser extent and also take up serine and glycine from food. This is not sufficient for cancer cells, however, meaning they start producing more. If we can halt this production, we will be able to fight the cancer without affecting healthy cells."

From yeast to mice

In their search for a substance that influences the synthesis of serine and glycine, the researchers utilized a database of existing medicines. In a first phase, Professor Bruno Cammue's research group at the Centre for Microbial and Plant Genetics (CMPG) tested 1,600 substances on yeast cells.

"Because there are also yeasts, or moulds, which depend on the same mechanism", explains research coordinator Dr Karin Thevissen. "Certain yeasts produce these amino acids to protect themselves against antifungals. In addition, you can easily grow yeast cells, allowing you to test many different substances."

The screening showed that the antidepressant sertraline was the most effective substance. "Other studies had already indicated that sertraline has a certain anti-cancer activity, but there was no explanation for this yet", mention researchers Shauni Geeraerts (LDMC and CMPG) and Kim Kampen (LDMC). "In this study, we've been able to demonstrate that sertraline inhibits the production of serine and glycine, causing decreased growth of cancer cells. We also found that this substance is most effective in combination with other therapeutic agents. In studies with mice we saw that sertraline in combination with another therapy strongly inhibits the growth of breast cancer cells."

Considerable potential

"Now that we've been able to identify this mechanism for breast cancer, we can start examining other types of cancer that are also addicted to serine and glycine synthesis", says Professor De Keersmaecker. "This is for example the case in T-cell leukaemia, but also in certain types of brain, lung and skin cancer. The more tumours we can identify that are sensitive to sertraline, the better the prospects are for helping patients in the future."

"These are, of course, results of experimental research, not clinical studies, but we can be optimistic about the potential. The safety of using sertraline in humans has already been well described, which is a great advantage. That's why we are also looking for industrial partners to develop this further."

Credit: 
KU Leuven

December issue SLAS Technology features 'advances in technology to address COVID-19'

Oak Brook, IL - The December issue of SLAS Technology is a special collection featuring the cover article, "Advances in Technology to Address COVID-19" by editors Edward Kai-Hua Chow, Ph.D., (National University of Singapore), Pak Kin Wong, Ph.D., (The Pennsylvania State University, PA, USA) and Xianting Ding, Ph.D., (Shanghai Jiao Tong University, Shanghai, China).

The December special issue houses a collection of articles addressing COVID-19, the disease caused by the novel coronavirus SARS-CoV-2. The rise of the COVID-19 pandemic has created an urgent need to diagnose and treat the disease globally, and research and development of new technologies and therapeutics remain pressing needs in the effort to combat the rising number of COVID-19 cases. This special collection focuses on the technological innovations being used to address the novel disease.

The special collection includes seven articles of original research, in addition to two reviews and the featured cover article.

Original research articles include:

Ultrasensitive and Specific Detection of SARS-CoV-2 Antibodies Using a High-Throughput, Fully-Automated System

Detection of COVID-19 from Chest X-Ray Images Using Convolutional Neural Networks

Detection Methods of Coronavirus disease (COVID-19)

AmbuBox: A Fast-Deployable Low-cost Ventilator for COVID-19 Emergent Care

An Intestine-on-a-Chip Model of Plug-and-Play Modularity to Study Inflammatory Processes

Novel Self-Assembled Polycaprolactone-Lipid Hybrid Nanoparticles Enhance the Antibacterial Activity of Ciprofloxacin

Toward Automated Additive Manufacturing of Living Bio-Tubes Using Ring-Shaped Building Units

Other articles include:

Advances in Viral Diagnostic Technologies for Combating COVID-19 and Future Pandemics

Immunological Approaches for COVID-19 Diagnosis and Research: The Present for Future

Advances in Technology to Address COVID-19

Credit: 
SLAS (Society for Laboratory Automation and Screening)

Ethnic minorities face rising disparity in homicide risk across England and Wales

New research analysing racial disparities among murder victims across most of Britain over the last two decades shows that people of Asian ethnicity are on average twice as likely as White British people to be killed.

For Black people, however, the risk of homicide has been over five and a half times (5.6) higher than for White British people - on average - during the current century, and this disparity has been on the rise since 2015.

Researchers from the University of Cambridge's Institute of Criminology were surprised to find that official UK data did not include relative risk statistics by ethnicity, as is common in countries such as the US and Australia.

They argue that the UK's Office for National Statistics (ONS) should publish "relevant denominators with raw numerators" to help with public understanding of crime risk and police resourcing. The work is published as a research note in the Cambridge Journal of Evidence-Based Policing.

"Through a series of straightforward calculations, we found substantial racial inequality in the risks of being murdered in England and Wales," said co-author Professor Lawrence Sherman of the University of Cambridge's Institute of Criminology.

"The pandemic has given the public a crash course in statistics. It provides an opportunity to present all kinds of data in ways that have more meaning for the population as well as those on the front line of prevention," Sherman said.

Billy Gazard, a crime statistician for the ONS, said: "We have outlined our plans for improving crime statistics for England and Wales in our July 2020 progress update. Within this update we committed to better addressing inequalities in victimisation and highlighting those groups in society that are at most risk of experiencing crime. We plan to carry out further analysis over the coming year, which will include looking at homicide victimisation rates by ethnicity."

Cambridge criminologists went back over the last 20 years of annual figures using an approach now familiar to many through coronavirus statistics: rates of cases per 100,000 people. This provided a risk ratio for homicide rates by ethnicity in England and Wales.
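The calculation itself is straightforward: a rate is the number of homicide victims in a group divided by that group's population, scaled to 100,000, and the risk ratio is one group's rate divided by the reference group's. The sketch below shows the arithmetic with purely illustrative numbers, not the actual ONS or Cambridge figures.

```python
def rate_per_100k(homicides, population):
    """Homicide victimisation rate per 100,000 people."""
    return homicides / population * 100_000

def risk_ratio(rate_group, rate_reference):
    """How many times higher one group's rate is than the reference group's."""
    return rate_group / rate_reference

# Illustrative numbers only -- not the actual ONS or Cambridge figures.
white_rate = rate_per_100k(homicides=500, population=48_000_000)  # ~1.0 per 100,000
black_rate = rate_per_100k(homicides=110, population=2_000_000)   # ~5.5 per 100,000
print(round(risk_ratio(black_rate, white_rate), 1))               # ~5.3
```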

The researchers say that, to the best of their knowledge, theirs is the first comparison of ethnic group trends in UK homicide victimisation rates per 100,000 to be published in recent decades, if ever.

They found that homicide risk for White and Asian people has stayed relatively consistent since the turn of the millennium - around one in 100,000 for White people and a little over two in 100,000 for Asian people, consisting primarily of persons of South Asian descent. For Black people, however, risks have fluctuated dramatically over the last 20 years.

The homicide victimisation rate for Black people was highest in the early noughties: almost 10 in 100,000 in 2001. It dropped by 69% between 2001 and 2012 to a low of 3 in 100,000 around 2013. Rates then began to climb again, rising seven times faster than for White people to reach over 5 in 100,000 last year.

When accounting for age, the disparity is starker still: for those aged 16 to 24, the 21st century average puts young Black people over ten and a half times (10.6) more likely than White people to be victims of homicide in England and Wales.

In fact, researchers point out that - per 100,000 people - the most recent data from 2018-19 puts the murder risk of young Black people 24 times higher than that of young White people.

The criminologists found no correlation between changes in homicide risk for different ethnicities. As an example, they point to the last three years of data: the homicide rate for White people aged between 16-24 dropped by 57%, while for young Black people it increased by 31%.

"Policing requires reliable evidence, and changing levels of risk are a vital part of preventative policing," said Sherman. "Our initial findings reveal risk inequalities at a national level, but they may be far greater or lower in local areas. We would encourage police forces to produce their own calculations of murder rates per 100,000."

Sherman has long advocated for a more "meaningful" approach to crime data. He has led on the development of the Cambridge Crime Harm Index: a classification system weighted by the impact of an offence on victims, rather than just counting crime numbers.

"Simple statistics show us that the risks of becoming a murder victim are far from equal," added Sherman. "We need more data analysis of this nature to inform police resource allocation, and promote a more fact-informed dialogue with communities across the country."

Credit: 
University of Cambridge

Reducing aerosol pollution without cutting carbon dioxide could make the planet hotter

image: A system of currents known as the Atlantic Meridional Overturning Circulation carries warm water into the North Atlantic. It could be disturbed if CO2 and aerosols are not simultaneously cut.

Image: 
R. Curry, Woods Hole Oceanographic Institution/Science/USGCRP

Humans must reduce carbon dioxide and aerosol pollution simultaneously to avoid weakening the ocean's ability to keep the planet cool, new research shows.

Aerosol pollution refers to particles in the air emitted by vehicles and factories that burn fossil fuels. This pollution contributes to asthma, bronchitis, and long-term irritation of the respiratory tract, which can lead to cancer.

"The conundrum," explained UC Riverside climate scientist and study co-author Robert Allen, "is that aerosols cause poor air quality and lead to premature deaths. However, these particles have a net cooling impact on the climate, so when you cut them that leads to a net warming effect."

Much research has examined aerosol impacts on air quality and land surface temperatures. Less explored is the way aerosols might impact the oceans, which is the focus of a UC Riverside study now published in the journal Science Advances.

The research team created detailed computer models to determine the impact on oceans under two different scenarios -- one in which there is only a reduction in aerosols, and another scenario in which greenhouse gases like carbon dioxide and methane are also reduced.

"The first scenario leads to the surprising result that fewer aerosols in the atmosphere could shift the region where most of the ocean is taking up heat, from the Southern Ocean toward the North Atlantic," Allen said.

In particular, the Atlantic meridional overturning circulation, or AMOC, would be disturbed as aerosols are removed from the atmosphere, the study found. The AMOC pulls warm water further north and pushes colder water south, ensuring that the climate in land areas at higher latitudes, such as Europe, is relatively mild.

Roughly half the carbon dioxide humans put into the atmosphere -- mostly through fossil fuel combustion and deforestation -- stays there, and the remaining half is taken up by land and vegetation, as well as the ocean.

One of the ways the ocean takes up our carbon dioxide emissions is through AMOC circulation.

"A projected decline in manmade aerosols potentially induces a weakening of the AMOC, which plays an important role in ocean heat uptake and storage in the North Atlantic," said Wei Liu, an assistant professor of climate change and sustainability at UCR.

In addition, the researchers said a rise in sea level would occur if the North Atlantic Ocean were to get warmer.

This current study focused on ocean heat uptake and circulation via the AMOC. However, Allen explained the study did not attempt to rigorously identify the mechanisms by which aerosol reductions weaken the AMOC. Those mechanisms will be the focus of future studies.

Ultimately, the researchers conclude that even without a more in-depth explanation of the weakening mechanisms, it is necessary to reduce greenhouse gases and aerosols in tandem.

The Intergovernmental Panel on Climate Change recommends making every attempt to prevent the planet from reaching 1.5 degrees Celsius above pre-industrial levels in order to mitigate the worst effects of global warming.

Humans have already increased carbon dioxide levels by almost 50% since the 1850s, and it continues to increase worldwide. Stabilizing carbon dioxide at current levels would require zero net emissions before the year 2070, which is ambitious, but critical.

"Assuming complete removal, aerosols at most will cause warming of about 1 K," said Allen. "However, aerosol-induced warming, as well as the associated ocean circulation changes, can be moderated by rigorous cuts in greenhouse gases including methane and carbon dioxide."

Credit: 
University of California - Riverside

X-ray imaging of a beetle's world in ancient earthenware

image: (Left) The location of the Yakushoden site where the pottery with insect (weevil) impressions was discovered is indicated by the number 1. (Other numbers in the image indicate areas discussed in the research paper.)
(Right) Map of other maize weevil excavation sites from the Jomon period.

Image: 
Professor Hiroki Obata

Using X-rays, Professor Hiroki Obata of Kumamoto University, Japan has imaged 28 impressions of maize weevils on pottery shards from the late Jomon period (around 3,600 years ago) excavated from the Yakushoden site in Miyazaki Prefecture. This is the first example of pottery with multiple weevil impressions discovered in Kyushu, and the density of impressions is the highest ever found in Japan.

Impressions of seeds and insects may be found on the surface and inside ancient earthenware. Prof. Obata began detecting impressions using the "impression method" in 2003 because it is particularly useful for visualizing indentations that cannot be seen by the naked eye. Using this method, his research group discovered the impression of a 10,000-year-old maize weevil on earthenware from Tanegashima Island in 2010. Before that discovery, it was thought that maize weevils came to Japan from the Korean peninsula with shipments of rice, but his finding showed that they were there long before rice started being brought to the Japanese archipelago.

Prof. Obata's group found maize weevil pottery impressions at the Sannai-Maruyama site in Aomori Prefecture in 2012 and at the Tateyama site in Hokkaido in 2013. Several studies have shown that Jomon people brought chestnuts to Hokkaido and the Tohoku region, where chestnuts do not normally grow naturally. Prof. Obata's group, however, showed that Jomon people brought maize weevils with the chestnuts to these areas, thereby confirming the theory of anthropogenic spread of the food pest.

At the Yakushoden site in Miyazaki Prefecture, maize weevils and acorn peels were found in pottery shards. This provides indirect evidence of the relationship between the storage of hard fruits and the pests that attacked them, and also proves that the Jomon people were surrounded by a greater number of weevils than was previously imagined.

"The fact that food pests such as weevils existed even in the Jomon period, and that their spread was due to sedentary lifestyles and the transportation and trade of food is similar to what happens in modern society," said Professor Obata. "Modern epidemics and disasters spread not only through natural forces, but also by the gathering of people and the movement of goods. Thus, there are lessons to be learned from pottery from thousands of years ago."

Credit: 
Kumamoto University

Healthy food labels that work and don't work

image: Example product from NUSMart showing how the labels were presented across the three study conditions. NUSMart is an online grocery shopping platform set up by Duke-NUS Medical School researchers to study the impact of changes in the presentation of food and beverage information on consumer choices. It contains thousands of items found in regular stores in Singapore.

Image: 
Duke-NUS Medical School

SINGAPORE, 17 November 2020 - Want that packet of biscuits? That'll be 17 minutes of jogging to burn off one serving (and there's probably more than one serving in there).

Such Physical Activity Equivalent (PAE) labels on food and beverages have been shown to encourage consumers to make healthier purchases. However, new research from Duke-NUS Medical School and NUS Saw Swee Hock School of Public Health in Singapore has found that when displayed alongside a Healthier Choice Symbol, the positive effects of both labels are diluted. The study was recently published in the journal Appetite.
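A PAE label is essentially a unit conversion from calories to exercise time. The sketch below shows that conversion under an assumed burn rate of roughly 10 kcal per minute of jogging for an average adult; the study's labels may be based on different reference values.

```python
def jogging_minutes(calories_per_serving, kcal_burned_per_minute=10.0):
    """Convert a serving's calories into minutes of jogging.

    The 10 kcal/min burn rate is a rough average-adult assumption, not a value
    taken from the study.
    """
    return calories_per_serving / kcal_burned_per_minute

# A serving of biscuits of about 170 kcal works out to roughly 17 minutes of jogging.
print(round(jogging_minutes(170)))  # 17
```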

"Our study shows the importance of keeping labelling simple, so shoppers are not confused or overloaded with information," said Professor Eric Finkelstein, a health economist at Duke-NUS' Health Services and Systems Research Programme, who led the study. "With obesity and chronic diseases on the rise globally and in Singapore, it is important to understand what messaging is most effective in helping shoppers choose healthier diets."

Singapore has been encouraging shoppers to buy food products with the Healthier Choice Symbol. The Healthier Choice Symbol tells consumers the healthier choices within a food category, such as desserts and sugary beverages, by considering the profile of relevant nutrients of a product in that category (e.g. some products may be healthier due to having lower levels of fat).

Prof Finkelstein and his colleagues hypothesised that showing a PAE label alongside the Healthier Choice Symbol would lead shoppers to buy foods and beverages with fewer calories. They recruited more than 100 adults to shop in their online grocery store, NUSMart, which contains thousands of items found in regular stores in Singapore. Participants shopped three times, under different labelling scenarios on certain products: a) a Healthier Choice Symbol, b) both a Healthier Choice Symbol and a PAE label, and c) no label.

They found the Healthier Choice Symbol led to a five percent increase in purchasing foods and beverages with that label, showing that it works to influence consumer demand. However, when combined with a PAE label, the effect was diluted. One possible explanation is that the Healthier Choice Symbol and the PAE label focus on related but different nutritional aspects of a food product - the former focuses on the overall nutrient profile, whereas the latter focuses more specifically on calories alone.

"These findings show that while the Healthier Choice Symbol on its own does encourage Singaporean consumers to purchase healthier products, what is needed for widespread public health impact is a combination of measures to be delivered at sufficient scale across different sectors of the food supply," said Dr Annie Ling, Group Director, Policy, Research and Surveillance Division, at Singapore's Health Promotion Board (HPB). Dr Ling also noted that the findings from this study and from other similar studies have helped HPB to design and refine its policies and programmes.

Prof Finkelstein and his colleagues plan to investigate the effectiveness of the new labels with their NUSMart platform, as well as other options, including the potential for shoppers to pick which labels they find most helpful.

"What I have come to believe is that there is no magic bullet the government can use to improve diet quality, but rather a series of nudges that can combine to have meaningful impacts," said Prof Finkelstein, who has studied strategies to reduce obesity rates for more than 20 years.

Professor Rob van Dam, a study co-author from NUS Saw Swee Hock School of Public Health, concurred, "We are excited to conduct high-quality research to identify which messages are most effective to improve dietary choices, and to work with industry and government to improve food labelling and make a real difference in the health of the population."

Credit: 
Duke-NUS Medical School

Mastering the art of nanoscale construction to breathe easy and bust fraud

image: Heyou Zhang and collaborator Calum Kinnear working in the NanoScience Laboratory at The University of Melbourne's School of Chemistry.

Image: 
Gavan Mitchell & Michelle Gough (University of Melbourne/Exciton Science).

Special anti-counterfeiting and chemical sensing tools that we can use with our eyes could be created thanks to a new nanoscale building method.

In a world-first, researchers at the ARC Centre of Excellence in Exciton Science have been able to arrange tiny rods made from gold in exact patterns, and in numbers large enough for practical use. The results have been published in the journal Advanced Functional Materials.

Importantly, these gold rods can be arranged to generate a variety of colours, which change according to how they are viewed.

That makes them a great anti-counterfeiting feature. For example, if used on a banknote or passport, they could be helpful for cashiers or customs agents.

They can also be modified to turn different colours in the presence of chemicals, acting as a warning for dangerous levels of carbon monoxide and other gases.

Although these effects have been observed before, it was not possible to make them at a size visible to the naked eye. A new approach to chemical assembly was needed.

Consider this: Getting the bricks in a house to match is simple enough. Go a bit smaller, and kids can do the same with Lego. But how do you build things accurately at the nanoscale?

A nanometre is approximately one billionth the size of a metre. To put things in perspective, a sheet of paper is about 100,000 nanometres thick, and your fingernails grow about one nanometre every second. So, unless you're Ant Man and can shrink to the subatomic level, it's a tough task.

But fortunately, you don't need to be an Avenger to get the job done. Lead author Heyou Zhang, a PhD candidate at The University of Melbourne, has used a technique called electrophoretic deposition (EPD).

"The whole idea of my PhD is to be able to better control single nanoparticles. Builders construct houses, brick by brick, and they can put each brick where they want," Heyou said.

"I want to use nanoparticles in a similar way.

"But at the nanometre scale, you can't move nanoparticles yourself. They are invisible. You need to use a method to drive or push the particle into a certain position."

EPD involves applying an electric field of a certain strength to the materials, and using the separation of positive and negative charges to push the rods into place.

Heyou explained: "You have a positive potential and, if the particle is negative, they attract each other. If I have the positive potential on the side of a wall and I have some holes on the wall, the particle can only be attracted to those holes."
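The drift that a charged particle picks up in such a field can be estimated with the standard Helmholtz-Smoluchowski relation from electrophoresis, v = εζE/η. The sketch below plugs in rough, assumed values for a charged gold nanorod in water; it is an order-of-magnitude illustration only, not a parameter set from the paper.

```python
# Electrophoretic drift velocity via the Helmholtz-Smoluchowski relation:
# v = eps * zeta * E / eta. All values below are rough assumptions for a
# charged gold nanorod in water, not figures taken from the published work.
eps_water = 80 * 8.854e-12  # permittivity of water, F/m
zeta = 0.04                 # magnitude of the particle's zeta potential, V
eta = 1.0e-3                # viscosity of water, Pa*s
E_field = 1.0e4             # applied electric field, V/m

velocity = eps_water * zeta * E_field / eta
print(f"drift velocity ~ {velocity * 1e6:.0f} micrometres per second")  # ~283 um/s
```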

With the technique, Heyou and his colleagues are able to build collections of over one million nanorods per square millimetre, in patterns of their choosing.

As well as anti-counterfeiting and chemical sensing, the assembly method could have applications in renewable energy, smart phones, laptops and efficient lighting.

Credit: 
ARC Centre of Excellence in Exciton Science