Tech

Seafood extinction risk: Marine bivalves in peril?

Boulder, Colo., USA: Marine bivalves are an important component of the global fishery, with over 500 species harvested for food and other uses. Our understanding of their potential vulnerability to extinction lags behind that of freshwater bivalves and marine vertebrates, so Shan Huang and colleagues, in analyses presented at the annual meeting of the Geological Society of America, used insights and data from the fossil record to assess extinction risk in this economically and ecologically important group. Their findings suggest that among today's shallow-marine bivalves (~6,000 species), harvested species tend to be widespread along major coastlines and able to tolerate wide ranges of environmental conditions (e.g., sea-surface temperature). This is good news, they note, because the fossil record shows that such broad ranges can help species survive "mild" changes in the environment.

Because little is known about the direct human impact on these harvested species, Huang and colleagues focused on species' intrinsic risk of extinction, laying the groundwork for efficiently managing these natural resources and conserving marine biodiversity. Having confirmed that their method can capture how intrinsic characteristics interacted with external pressures to yield high extinction rates in the past, the team's next step will be to estimate future extinction risk across living bivalve populations. This goal highlights the urgent need for more complete data on the capture and harvesting of these bivalves, which would enable a comprehensive investigation of the direct effects of exploitation.

Overall, the study showcases an integrative approach that combines paleontology and biogeography to assess species' intrinsic risk of extinction, an assessment essential to efficiently managing our natural resources and conserving biodiversity.

Q: What kind of bivalves are you talking about here? How does this relate to what people are eating?

A: We studied all bivalve species that live in the sea from the shoreline to 200 meters deep (most bivalves living deeper than this aren't readily harvested). We found that just over 500 of almost 6,000 marine bivalve species are harvested, but we were surprised to find that many of those species come from evolutionary groups outside the ones we commonly eat--mussels, oysters, scallops, and cockles. People also use bivalves as sources of pearls, a kind of "silk" that can be woven into cloth, and even windowpanes!

Q: Does human consumption contribute to them being in peril, or is it changes to the environment?

A: Previous studies of marine fish have shown that a combination of human harvesting and climate change is negatively impacting a number of species, and that such declines depend in part on the biological attributes of individual species. Comparable analyses are lacking for shellfish despite their biological and economic importance; in fact, global catch data are available for only a very small proportion of harvested bivalves. So in this study, we used the fossil record and the present-day geographic distributions of species to identify the harvested bivalves that are intrinsically more prone to extinction. We found that many of the evolutionary lineages (here, taxonomic families) containing harvested bivalves were subject to high extinction rates during the past 65 million years. On the other hand, many of the harvested species within those lineages are sufficiently widespread today that, all else being equal, they should be fairly extinction-resistant. But we urgently need more information on the extrinsic pressures being applied to those species (global catch, pollution, and regional climate change) to determine their future vulnerabilities. This finding calls for further investigation of how external pressures have interacted with family-specific characteristics to yield high extinction rates in the past, which could improve estimates of extinction risk in bivalves, particularly those of economic value.

Q: How does the fossil record tell us about the future of marine bivalves?

A: All bivalve lineages at the taxonomic level of families, including the harvested ones, have been around for tens of millions of years, and their evolutionary history is preserved in a rich fossil record. From this history, we can see that some families tended to have, on average, higher extinction rates throughout the last 65 million years. This suggests that these families might have biological properties that made them more extinction-prone, although we do not yet always know the immediate causes of these extinctions. The PERIL metric incorporates this information into an estimate of intrinsic extinction risk, and we were encouraged to find that this relatively simple metric successfully predicted extinctions over the past five million years in two regions with especially well-studied fossil records. Putting modern seafood bivalves into this historical framework, including their family-specific risk, gives us a better-informed estimate of their relative robustness to external pressures.
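To make the idea concrete, here is a minimal sketch of how a PERIL-style intrinsic-risk score could be assembled from the three ingredients described above: geographic range, environmental (sea-surface temperature) tolerance, and the family-level extinction rate seen in the fossil record. The weighting, normalization, species names, and all numbers below are hypothetical; this is not the published PERIL formula.

```python
# Hypothetical sketch of a PERIL-style intrinsic extinction-risk score.
# Narrow geographic ranges, narrow environmental tolerances, and membership in
# an extinction-prone family all push the score up. Not the published metric.

from dataclasses import dataclass

@dataclass
class Species:
    name: str
    range_km: float          # extent of occupied coastline (km)
    sst_tolerance_c: float   # breadth of tolerated sea-surface temperatures (deg C)
    family_ext_rate: float   # long-term extinction rate of the species' family (fossil record)

def rescale(value, values):
    """Min-max normalize a value against the species pool, to [0, 1]."""
    lo, hi = min(values), max(values)
    return (value - lo) / (hi - lo) if hi > lo else 0.0

def intrinsic_risk(sp, pool):
    """Higher score = higher assumed intrinsic extinction risk (equal weights assumed)."""
    narrow_range = 1.0 - rescale(sp.range_km, [s.range_km for s in pool])
    narrow_tolerance = 1.0 - rescale(sp.sst_tolerance_c, [s.sst_tolerance_c for s in pool])
    risky_family = rescale(sp.family_ext_rate, [s.family_ext_rate for s in pool])
    return (narrow_range + narrow_tolerance + risky_family) / 3.0

# Toy usage with invented numbers:
pool = [
    Species("widespread harvested oyster", range_km=12000, sst_tolerance_c=18, family_ext_rate=0.2),
    Species("narrow-range cockle", range_km=800, sst_tolerance_c=5, family_ext_rate=0.6),
]
for sp in pool:
    print(f"{sp.name}: intrinsic risk ~ {intrinsic_risk(sp, pool):.2f}")
```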

Credit: 
Geological Society of America

Liquid nanofoam: A game changer for future football helmets

image: An electron microscope image of the nanofoam material. Each pore is 10 nm in diameter.

Image: 
Weiyi Lu

A liquid nanofoam liner undergoing testing could prolong the safe use of football helmets, says a Michigan State University researcher.

When a helmet withstands an impact severe enough to cause a concussion to the player wearing it, the safety features of the helmet are compromised, rendering the equipment unsafe for further use, said Weiyi Lu, an MSU assistant professor of civil and environmental engineering.

Lu has been testing a liquid nanofoam material that could change that; the research was published Oct. 13 in the Proceedings of the National Academy of Sciences.

The material is full of tiny nanopores. "The pore diameters are between two and 200 nanometers and that creates a large amount of surface area," Lu said. "The whole area of MSU's Spartan Stadium could be folded up into one gram of nanofoam."

Ordinarily, the material is rigid and any added liquid would simply fill the holes. To prevent this, Lu and his team coated the nanopores with a hydrophobic, or water-repellent, silicone layer made from an organic silyl chain that keeps liquid from being absorbed by the material. As a result, the saltwater liquid inside the nanofoam material becomes pressurized during an impact.

"When the pressure reaches the safety threshold, ions and water are forced into the nanopores making the material deformable for effective protection. In addition, the liquid-like material is pliable enough to form into any shape," he said. "Helmets are pretty much one shape but the liquid nanofoam material can be made to fit a person's specific head shape or profile."

In early laboratory tests, Lu and his team compared an eighth-inch liquid nanofoam liner against a three-quarter-inch piece of solid foam traditionally used in helmets. Both materials were struck with a five-kilogram mass (the approximate weight of a human head) moving at three meters per second. Both materials deformed under the impact, but the liquid nanofoam recovered between the successive impacts of the test and the solid foam did not.
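As a rough sanity check on what that test asks of each liner, the snippet below computes the kinetic energy the striker carries and a crude lower bound on the average stopping force if the full liner thickness were used as the stopping distance. The thicknesses are the ones quoted above, but treating them as the stopping distance is an assumption; the article does not report force profiles.

```python
# Back-of-the-envelope impact arithmetic for the test described above:
# a 5 kg mass moving at 3 m/s, stopped within at most the liner thickness.

m, v = 5.0, 3.0                       # striker mass (kg) and speed (m/s)
impact_energy = 0.5 * m * v ** 2      # kinetic energy to dissipate (joules) = 22.5 J

for label, thickness_in in [("liquid nanofoam liner", 1 / 8), ("solid foam liner", 3 / 4)]:
    d = thickness_in * 0.0254         # inches to metres
    mean_force = impact_energy / d    # average force if the whole thickness stops the mass
    print(f"{label}: {impact_energy:.1f} J over <= {d * 1000:.1f} mm "
          f"-> >= {mean_force / 1000:.1f} kN mean force")
```

The same energy absorbed over a much shorter distance implies a much higher average force, which hints at why a thin liner needs a different energy-absorbing mechanism; the article attributes the nanofoam's performance to liquid being forced into the pressurized nanopores.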

"The liquid nanofoam outperformed the solid foam," Lu said. "The nanofoam was able to mitigate continuous multiple impacts without damage; the results were identical from test one through test 10."

A liquid nanofoam liner would be thinner and less bulky inside a helmet. And since the liner can withstand multiple high-impact forces, it would not need to be replaced after a high-impact collision unless the helmet shell was damaged. "A helmet that can be safely reused is a huge advantage," he said. "We would love to see a liquid nanofoam liner in MSU football helmets in the future."

Lu envisions multiple future applications of the liquid nanofoam material beyond football and military helmets. The material could also be used in passive safety devices such as automobile air bags and bumpers. The liquid nanofoam could even be used to protect people and buildings from earthquake vibrations.

The next stage of research will focus on more dynamic studies that test faster and more intense impacts, such as the effects of a bomb blast, Lu said. "We want this material to help not just extend people's lives but to improve their quality of life."

Credit: 
Michigan State University

Forecasting elections with a model of infectious diseases

image: Voters can interact both within and between states, thus potentially influencing each other's political opinions.

Image: 
Figure courtesy of Alexandria Volkening, Daniel F. Linder, Mason A. Porter, and Grzegorz A. Rempala.

Forecasting elections is a high-stakes problem. Politicians and voters alike are often desperate to know the outcome of a close race, but providing them with incomplete or inaccurate predictions can be misleading. And election forecasting is already an innately challenging endeavor -- the modeling process is rife with uncertainty, incomplete information, and subjective choices, all of which must be deftly handled. Political pundits and researchers have implemented a number of successful approaches for forecasting election outcomes, with varying degrees of transparency and complexity. However, election forecasts can be difficult to interpret and may leave many questions unanswered after close races unfold.

These challenges led researchers to wonder if applying a disease model to elections could widen the community involved in political forecasting. In a paper publishing today in SIAM Review, Alexandria Volkening (Northwestern University), Daniel F. Linder (Augusta University), Mason A. Porter (University of California, Los Angeles), and Grzegorz A. Rempala (The Ohio State University) borrowed ideas from epidemiology to develop a new method for forecasting elections. The team hoped to expand the community that engages with polling data and raise research questions from a new perspective; the multidisciplinary nature of their infectious disease model was a virtue in this regard. "Our work is entirely open-source," Porter said. "Hopefully that will encourage others to further build on our ideas and develop their own methods for forecasting elections."

In their new paper, the authors propose a data-driven mathematical model of the evolution of political opinions during U.S. elections. They found their model's parameters using aggregated polling data, which enabled them to track the percentages of Democratic and Republican voters over time and forecast the vote margins in each state. The authors emphasized simplicity and transparency in their approach and consider these traits to be particular strengths of their model. "Complicated models need to account for uncertainty in many parameters at once," Rempala said.

This study predominantly focused on the influence that voters in different states may exert on each other, since accurately accounting for interactions between states is crucial for the production of reliable forecasts. The election outcomes in states with similar demographics are often correlated, and states may also influence each other asymmetrically; for example, the voters in Ohio may more strongly influence the voters in Pennsylvania than the reverse. The strength of a state's influence can depend on a number of factors, including the amount of time that candidates spend campaigning there and the state's coverage in the news.

To develop their forecasting approach, the team repurposed ideas from the compartmental modeling of biological diseases. Mathematicians often utilize compartmental models--which categorize individuals into a few distinct types (i.e., compartments)--to examine the spread of infectious diseases like influenza and COVID-19. A widely-studied compartmental model called the susceptible-infected-susceptible (SIS) model divides a population into two groups: those who are susceptible to becoming sick and those who are currently infected. The SIS model then tracks the fractions of susceptible and infected individuals in a community over time, based on the factors of transmission and recovery. When an infected person interacts with a susceptible person, the susceptible individual may become infected. An infected person also has a certain chance of recovering and becoming susceptible again.

Because there are two major political parties in the U.S., the authors employed a modified version of an SIS model with two types of infections. "We used techniques from mathematical epidemiology because they gave us a means of framing relationships between states in a familiar, multidisciplinary way," Volkening said. While elections and disease dynamics are certainly different, the researchers treated Democratic and Republican voting inclinations as two possible kinds of "infections" that can spread between states. Undecided, independent, or minor-party voters all fit under the category of susceptible individuals. "Infection" was interpreted as adopting Democratic or Republican opinions, and "recovery" represented the turnover of committed voters to undecided ones.

In the model, committed voters can transmit their opinions to undecided voters, but not the reverse. The researchers took a broad view of transmission, interpreting opinion persuasion as occurring through both direct communication between voters and more indirect means such as campaigning, news coverage, and debates. Individuals can interact, and sway each other's opinions, both within and between states.
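A minimal sketch of this two-"infection" compartmental model is shown below, assuming a simple forward-Euler time step. The two-state coupling matrix, rates, initial conditions, and time span are all invented for illustration; the authors fit their actual parameters to state-level polling data.

```python
# Toy two-"infection" SIS model across two coupled states (forward Euler).
# All parameters below are illustrative, not fitted values from the study.

import numpy as np

states = ["Ohio", "Pennsylvania"]
# influence[i][j]: how strongly committed voters in state j act on voters in state i
influence = np.array([[1.0, 0.2],
                      [0.6, 1.0]])          # asymmetric: Ohio pushes on Pennsylvania more
beta_D, beta_R = 0.50, 0.48                 # "transmission" of Democratic / Republican leanings
gamma_D, gamma_R = 0.10, 0.10               # "recovery": committed voters becoming undecided

D = np.array([0.30, 0.28])                  # initial committed-Democratic fractions
R = np.array([0.28, 0.30])                  # initial committed-Republican fractions
dt, days = 0.1, 300                         # roughly January to Election Day

for _ in range(int(days / dt)):
    S = 1.0 - D - R                         # undecided / minor-party fraction per state
    dD = beta_D * S * (influence @ D) - gamma_D * D
    dR = beta_R * S * (influence @ R) - gamma_R * R
    D, R = D + dt * dD, R + dt * dR

for name, d, r in zip(states, D, R):
    print(f"{name}: simulated D-R margin = {100 * (d - r):+.1f} points")
```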

To determine the values of their model's mathematical parameters, the authors used polling data on senatorial, gubernatorial, and presidential races from HuffPost Pollster for 2012 and 2016 and from RealClearPolitics for 2018. They fit the model to the data for each individual race and simulated the evolution of opinions in the year leading up to each election by tracking the fractions of undecided, Democratic, and Republican voters in each state from January until Election Day. The researchers produced their final forecasts as if making them on the eve of Election Day, using all of the polling data but none of the election results.

Despite its basis in a field unconventional for election forecasting--namely, epidemiology--the resulting model performed surprisingly well. It forecast the 2012 and 2016 U.S. races for governor, Senate, and president with a success rate similar to those of popular analyst sites FiveThirtyEight and Sabato's Crystal Ball. For example, the authors' success rate for predicting party outcomes at the state level in the 2012 and 2016 presidential elections was 94.1 percent, while FiveThirtyEight had a success rate of 95.1 percent and Sabato's Crystal Ball had a success rate of 93.1 percent. "We were all initially surprised that a disease-transmission model could produce meaningful forecasts of elections," Volkening said.

After establishing their model's capability to forecast outcomes on the eve of Election Day, the authors sought to determine how early the model could create accurate forecasts. Predictions that are made in the weeks and months before Election Day are particularly meaningful, but producing early forecasts is challenging because fewer polling data are available for model training. By employing polling data from the 2018 senatorial races, the team's model was able to produce stable forecasts from early August onward with the same success rate as FiveThirtyEight's final forecasts for those races.

Despite clear differences between contagion and voting dynamics, this study suggests a valuable approach for describing how political opinions change across states. Volkening is currently applying this model--in collaboration with Northwestern University undergraduate students Samuel Chian, William L. He, and Christopher M. Lee--to forecast the 2020 U.S. presidential, senatorial, and gubernatorial elections. "This project has made me realize that it's challenging to judge forecasts, especially when some elections are decided by a vote margin of less than one percent," Volkening said. "The fact that our model does well is exciting, since there are many ways to make it more realistic in the future. We hope that our work encourages folks to think more critically about how they judge forecasts and get involved in election forecasting themselves."

Credit: 
Society for Industrial and Applied Mathematics

Reliable quality-control of graphene and other 2D materials is routinely possible

image: New experiments confirm that the Bell-Shaped-Component (BSC) is a reliable diagnostic of the quality of graphene growth.

Image: 
U.S. Department of Energy, Ames Laboratory

Graphene and other single-atom-thick substances are a category of wonder materials, with researchers the world over investigating their electronic properties for potential applications in technologies as diverse as solar cells, novel semiconductors, sensors, and energy storage.

The greatest challenge in designing these single-layer, or 2D, materials into their myriad potential uses is the need for atom-by-atom perfection and uniformity, which can be difficult and painstaking to achieve at such small scales, and difficult to assess as well.

"We are trying to be more clever than nature in assembling these materials," said Michael C. Tringides, a senior scientist at the U.S. Department of Energy's Ames Laboratory and professor of physics at Iowa State University, who investigates the unique properties of 2D materials and metals grown on graphene, graphite, and other carbon coated surfaces. "And to do so, we're forcing atoms to assemble in ways they normally would not. One of the major challenges of the field is to reliably produce high quality graphene and other materials like it."

Tringides and other scientists at Ames Laboratory have discovered and confirmed a method that could serve as an easy but reliable way to test the quality of graphene and other 2D materials. It takes advantage of a very broad background in surface electron diffraction, named the Bell-Shaped-Component (BSC), which strongly correlates with uniformly patterned, or "perfect," graphene.

Understanding the correlation has implications for reliable quality control of 2D materials in a manufacturing environment.

"This discovery challenges conventional wisdom, but the correlation between this strange phenomenon and high quality graphene is unmistakable. In practical application, we see it extending to other high-interest 2D materials that are similar to graphene in having similar uniformity of a single layer," said Tringides.

Last year, Ames Laboratory researchers discovered through low-energy electron diffraction, a technique commonly used in physics to study the crystal structure of the surfaces of solid materials, that broad diffraction patterns are an indicator that reliably demonstrates a 2D material's high quality. It was a feature of high-quality graphene that essentially lurked in the background, and it had been overlooked in published research because it was the exact opposite of what is generally accepted from diffraction studies--that only sharp, bright diffraction spots should be present. Because that finding was counterintuitive, further investigation under different experimental conditions was required to confirm it and to understand the origin of the BSC, said Tringides.

First, the scientists grew graphene by annealing, or heating, it through a range of high temperatures and compared the growth of the BSC with the growth of the other, generally accepted indicator: sharp diffraction spots. The evolution of the broad diffraction background closely mirrored that of the sharp spots, showing that the two are correlated. Second, the group experimented with depositing metal atoms (in this case dysprosium) on the surface and underneath the graphene. Called intercalation, this deposition process is one of the ways scientists can customize 2D materials for specific functions. In this second experiment, the scientists measured the growth of the BSC during intercalation: weak when the metal atoms are at first disordered, then increasing as the metal atoms snap into place between the graphene and the substrate, creating a uniform layer. So while the BSC is not a textbook diffraction pattern, its cause is textbook quantum mechanics: as electrons are squeezed into a single layer, their wave vectors must spread, creating the broad diffraction pattern.

Credit: 
DOE/Ames National Laboratory

SoundWatch: New smartwatch app alerts d/Deaf and hard-of-hearing users to birdsong, sirens and other desired sounds

image: University of Washington researchers have developed a smartwatch app for d/Deaf and hard-of-hearing people who want to be aware of nearby sounds. The smartwatch will identify sounds the user is interested in and send the user a friendly buzz along with information about them.

Image: 
Jain et al./ASSETS 2020

Smartwatches offer people a private method for getting notifications about their surroundings -- such as a phone call, health alerts or an upcoming package delivery.

Now University of Washington researchers have developed SoundWatch, a smartwatch app for deaf, Deaf and hard-of-hearing people who want to be aware of nearby sounds. When the smartwatch picks up a sound the user is interested in -- examples include a siren, a microwave beeping or a bird chirping -- SoundWatch will identify it and send the user a friendly buzz along with information about the sound.

The team presented these findings Oct. 28 at ASSETS 2020, the ACM conference on computing and accessibility.

"This technology provides people with a way to experience sounds that require an action -- such as getting food from the microwave when it beeps. But these devices can also enhance people's experiences and help them feel more connected to the world," said lead author Dhruv Jain, a UW doctoral student in the Paul G. Allen School of Computer Science & Engineering. "I use the watch prototype to notice birds chirping and waterfall sounds when I am hiking. It makes me feel present in nature. My hope is that other d/Deaf and hard-of-hearing people who are interested in sounds will also find SoundWatch helpful."

The team started this project by designing a system for d/Deaf and hard-of-hearing people who wanted to be able to know what was going on around their homes.

"I used to sleep through the fire alarm," said Jain, who was born hard of hearing.

The first system, called HomeSound, uses Microsoft Surface tablets scattered throughout the home that act as a network of interconnected displays. Each display provides a basic floor plan of the house and alerts a user to a sound and its source. The displays also show the sound's waveform, to help users identify the sound, and store a history of all the sounds a user might have missed while away from home.

The researchers tested HomeSound in the Seattle-area homes of six d/Deaf or hard-of-hearing participants for three weeks. Participants were instructed to go about their lives as normal and complete weekly surveys.

Based on that feedback, a second prototype used machine learning to classify sounds in real time. The researchers created a dataset of more than 31 hours of recordings covering 19 common home-related sounds, such as a dog barking, a cat meowing, a baby crying and a knock at the door.

"People mentioned being able to train their pets when they noticed dog barking sounds from another room or realizing they didn't have to wait by the door when they were expecting someone to come over," Jain said. "HomeSound enabled all these new types of interactions people could have in their homes. But many people wanted information throughout the day, when they were out in their cars or going for walks."

The researchers then pivoted to a smartwatch system, which allows users to get sound alerts wherever they are, even in places they might not have their phones, such as at the gym.

Because smartwatches have limited storage and processing abilities, the team needed a system that wouldn't drain the watch's battery and was also fast and accurate. First the researchers compared a compressed version of the HomeSound classifier against three other available sound classifiers. The HomeSound variant was the most accurate, but also the slowest.

To speed up the system, the team has the watch send the sound to a device with more processing power -- the user's phone -- for classification. Having a phone classify sounds and send the results back to the watch not only saves time but also maintains the user's privacy because sounds are only transferred between the user's own devices.
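In simplified form, the pipeline looks like the sketch below: the watch captures a short audio window, the paired phone classifies it, and the watch buzzes only for sounds on the user's list. The feature extraction, the toy nearest-centroid "classifier", and the synthetic clips are hypothetical stand-ins for illustration; the actual SoundWatch app runs a trained deep-learning classifier on the phone.

```python
# Conceptual sketch of the watch-to-phone SoundWatch pipeline (not the real app).

import numpy as np

SAMPLE_RATE = 16_000
INTERESTED = {"siren", "microwave beep", "bird chirp"}    # sounds the user opted into

def features(window):
    """Crude spectral fingerprint: log power in 8 frequency bands."""
    spectrum = np.abs(np.fft.rfft(window)) ** 2
    return np.log1p(np.array([band.sum() for band in np.array_split(spectrum, 8)]))

rng = np.random.default_rng(0)
def fake_clip(freq_hz):
    """Synthetic 1-second tone standing in for a real recording."""
    t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
    return np.sin(2 * np.pi * freq_hz * t) + 0.05 * rng.standard_normal(SAMPLE_RATE)

# Pretend "training": one feature centroid per labelled sound class.
centroids = {label: features(fake_clip(freq))
             for label, freq in [("siren", 900), ("microwave beep", 2000), ("bird chirp", 4000)]}

def classify_on_phone(window):
    """Runs on the phone (more processing power); audio never leaves the user's devices."""
    f = features(window)
    return min(centroids, key=lambda label: np.linalg.norm(centroids[label] - f))

def on_watch_hears_sound(window):
    label = classify_on_phone(window)
    if label in INTERESTED:
        print(f"buzz! {label}")        # stand-in for the smartwatch vibration and popup

on_watch_hears_sound(fake_clip(2000))  # expected output: buzz! microwave beep
```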

The researchers tested the SoundWatch app in March 2020 -- before Washington's stay-at-home order -- with eight d/Deaf and hard-of-hearing participants in the Seattle area. Users tested the app at three different locations on or around the UW campus: in a grad student office, in a building lounge and at a bus stop.

People found the app useful for letting them know when there was something they should pay attention to, for example that they had left the faucet running or that a car was honking. On the other hand, it sometimes misclassified sounds (labeling a passing car as running water) or was slow to notify users (one user was surprised by a person entering the room well before the watch sent a notification about the door opening).

The team is also developing HoloSound, which uses augmented reality to provide real-time captions and other sound information through HoloLens glasses.

"We want to harness the emergence of state-of-the-art machine learning technology to make systems that enhance the lives of people in a variety of communities," said senior author Jon Froehlich, an associate professor in the Allen School.

Another current focus is developing a method to pick out specific sounds from background noise and to identify the direction a sound, such as a siren, is coming from.

The SoundWatch app is available for free as an Android download. The researchers are eager to hear feedback so that they can make the app more useful.

"Disability is highly personal, and we want these devices to allow people to have deeper experiences," Jain said. "We're now looking into ways for people to personalize these systems for their own specific needs. We want people to be notified about the sounds they care about -- a spouse's voice versus general speech, the back door opening versus the front door opening, and more."

Credit: 
University of Washington

Location and extent of coral reefs mapped worldwide using advanced AI

image: Visual comparisons of a map by the United Nations Environment Program World Conservation Monitoring Centre (UNEP-WCMC), the leading global coral reef map, and the GDCS coral reef extent map in different regions, including (a) Great Barrier Reef (GBR), Australia, Papua New Guinea, Indonesia; (b) Madagascar, East Africa; (c) Red Sea, Samoa, Virgin Islands

Image: 
Center for Global Discovery and Conservation Science at Arizona State University

Nearly 75% of the world's coral reefs are under threat from global stressors such as climate change and local stressors such as overfishing and coastal development. Those working to understand and protect coral reefs are building the know-how to mitigate the damage but doing so requires first knowing where reefs are located.

Many approaches, such as diver-based observation and satellite imagery, have been used to estimate the distribution of coral reefs around the world, but past approaches have led to inconsistent accuracy because the underlying data are derived from disparate sources and varying methodologies. Now, researchers from the Arizona State University Center for Global Discovery and Conservation Science (GDCS) have generated a global coral reef extent map using a single methodology capable of predicting the location of shallow coral reefs with nearly 90% accuracy.

The GDCS team used convolutional neural networks (CNNs), an advanced artificial intelligence approach, along with thousands of satellite images from Planet Inc. to create the new global map. Planet's satellites obtain daily coverage of the Earth's landmass and its coral reefs at a 3.7-meter resolution. Many of these satellites are as small as a loaf of bread but, operating together, they collect over 11 terabytes of data every day. This continuous stream of imagery yields a massive amount of data--too much for even a large team of scientists to manually sort through. Using convolutional neural networks and ASU's supercomputer, the GDCS team was able to analyze the data and extract the locations of shallow reefs lying in less than 20 meters (about 65 feet) of water worldwide.
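The sketch below shows the general shape of such a convolutional classifier, labelling small multi-band satellite tiles as reef or not reef. The tile size, number of bands, and layer sizes are assumptions for illustration; this is not the network the GDCS team actually trained on Planet imagery.

```python
# Illustrative CNN for classifying satellite-image tiles as (not reef, reef).
# Architecture and tile format are assumed, not taken from the study.

import torch
import torch.nn as nn

class ReefTileCNN(nn.Module):
    def __init__(self, in_bands: int = 4, n_classes: int = 2):   # e.g., R, G, B, NIR bands
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_bands, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x).flatten(1))

model = ReefTileCNN()
tiles = torch.randn(8, 4, 64, 64)       # a batch of 8 random 64x64, 4-band tiles as a stand-in
print(model(tiles).softmax(dim=1))      # per-tile probabilities of (not reef, reef)
```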

The maps are openly available through the Allen Coral Atlas, a collaborative partnership between ASU, Vulcan Inc., Planet Inc., University of Queensland, and National Geographic Society to map and monitor the world's coral reefs in unprecedented detail.

"The new map represents our best estimate of the location of shallow coral reefs on the planet, and it guides next steps including our ongoing collaboration to map the composition of these reefs and their changing health over time", said first author, Jiwei Li, of GDCS.

The researchers indicated that these new maps can be used with other global maps or datasets to create derived data or analytic products. Some immediate uses of the map at the Allen Coral Atlas include determining where to monitor for coral bleaching, a global phenomenon driven by ocean warming.

Greg Asner, co-author of the study and ASU's Allen Coral Atlas lead explained, "This first-ever A.I.-driven map of the world's coral reefs is just a drop in the bucket compared to what we have coming out over the next year and beyond. The partnership is already rolling out much more detailed reef composition maps on a region by region basis, and we are preparing to launch a global reef monitoring system that detects bleaching. These and other large-scale marine technology innovations are already helping conservation, management and resource policy specialists make decisions. That's our big picture goal."

Credit: 
Arizona State University

Greater prostate cancer incidence and mortality among Black men linked to genetic alterations

Bottom Line: Prostate cancer tumors from African American men had higher frequencies of certain genetic alterations that may be associated with aggressive disease, compared with prostate cancer tumors from white men.

Journal in Which the Study was Published: Molecular Cancer Research, a journal of the American Association for Cancer Research

Author: Jianfeng Xu, DrPH, Vice President of Translational Research at NorthShore University HealthSystem and senior author of the study

Background: "Prostate cancer incidence and mortality are highest in African American men, but the exact reasons for the disparity are not fully understood," said Xu. "The disparity is likely due to multiple factors, including socioeconomic differences and biology. We suspect that differences in the genetic changes that occur within tumors may play a critical role."

How the Study was Conducted and Results: In this study, Xu, together with first author Wennuan Liu, PhD, and colleagues, sequenced 39 genes of interest in tumors and matched normal tissue from 77 African American patients with prostate cancer. They found that over 35 percent of these patients' tumors harbored potentially damaging mutations in several genes, including the DNA repair genes ATM, BRCA2, and ZMYM3. ZMYM3, which regulates chromatin and DNA repair, was among the most frequently mutated genes in these patients: nine of the 77 African American patients (11.7 percent) had tumors harboring mutations in ZMYM3, compared with 2.7 percent of tumors from 410 white patients whose data were included in the Genomic Data Commons database.
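For readers who want to check how such a frequency difference stacks up statistically, the snippet below recomputes the ZMYM3 contrast with a Fisher's exact test. The white-patient mutation count is inferred from the reported 2.7 percent of 410 tumors, so it is approximate, and this is an illustrative recomputation rather than the analysis reported in the paper.

```python
# Illustrative Fisher's exact test on the ZMYM3 mutation frequencies quoted above.

from scipy.stats import fisher_exact

aa_mut, aa_total = 9, 77                     # African American patients with ZMYM3-mutant tumors
wh_mut, wh_total = round(0.027 * 410), 410   # ~11 of 410 white patients (inferred from 2.7%)

table = [[aa_mut, aa_total - aa_mut],
         [wh_mut, wh_total - wh_mut]]
odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"ZMYM3 mutated: {aa_mut}/{aa_total} vs {wh_mut}/{wh_total}")
print(f"odds ratio = {odds_ratio:.2f}, Fisher exact p = {p_value:.4f}")
```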

In addition, Xu and colleagues examined whether there were differences in the copy number alterations --when genetic material is gained or lost--between the prostate tumors of African American and white patients. The researchers pooled data representing 171 African American patients and 860 white patients from several public databases. They found distinct copy number alterations between African American and white patients in the more aggressive, high-grade prostate tumors (Gleason score 7 or higher), but not in low-grade tumors. High-grade tumors from African American patients were more likely to have additional copies of the MYC oncogene and deletions of the LRP1B, MAP3K7, BNIP3L, and RB1 genes than tumors from white patients. Gain of MYC and loss of MAP3K7 or RB1 were also associated with more advanced tumor stage.

Author's Comments: "Our findings suggest that distinct genetic alterations in the prostate cancers of African American men, in comparison to white men, may contribute to more aggressive prostate cancer and could lead to a higher mortality rate," said Xu. "If confirmed in other studies, these results will not only help to understand the racial disparity of prostate cancer but could also help guide personalized clinical management, such as predicting prognosis and guiding targeted therapy."

Future work from Xu and colleagues will aim to understand how genetic alterations in African American men affect recurrence, metastasis, treatment, and prostate cancer-specific death. In addition, they are interested in developing tests to detect such genetic changes.

Study Limitations: Limitations of the study include the small sample size, the use of different platforms to generate genetic data, and technical differences in how sequencing was performed between samples from African American men in this study and those from white men in the Genomic Data Commons database.

Credit: 
American Association for Cancer Research

Soil-powered fuel cell promises cheap, sustainable water purification

image: Soil microbial fuel cells as designed by researchers at the University of Bath

Image: 
University of Bath

Fuel cells that create energy using chemical reactions in soil-based organisms in successful field test in North-East Brazil

They can be used to produce energy to filter enough water for a person's daily needs, with potential to increase scale

"This project shows that SMFCs have true potential as a sustainable, low-energy energy source", says project lead Dr Mirella Di Lorenzo

Engineers at the University of Bath have shown that it's possible to capture and use energy created by the natural reactions occurring in microorganisms within soil.

A team of chemical and electrical engineers has demonstrated the potential of cheap, simple 'soil microbial fuel cells' (SMFCs), buried in the earth, to power an electrochemical reactor that purifies water.

The proof-of-concept design was demonstrated during field testing in North-East Brazil in 2019, which showed that SMFCs can purify about three litres of water per day - enough to cover a person's daily water needs.

The project is a collaboration with a team of geographers from Universidade Federal do Ceará and a team of chemists from Universidade Federal do Rio Grande do Norte.

Testing took place in Icapuí, a fishing village located in a remote semi-arid location where the main source of drinking water is rainwater and access to a reliable power network is scarce.

Rainwater must be chlorinated to be drinkable, and in addition to causing bad taste and odour, uncontrolled chlorination is dangerous to human health - so safe methods to treat water are essential.

Soil microbial fuel cells shown to work in the field

SMFCs generate energy from the metabolic activity of specific microorganisms (electrigens) naturally present in soil, which are able to transfer electrons outside their cells.

The system, developed by staff from Bath's Department of Chemical Engineering and Department of Electronic & Electrical Engineering, consists of two carbon-based electrodes positioned at a fixed distance apart (4cm) and connected to an external circuit. One electrode, the anode, is buried inside the soil, while the other, the cathode, is exposed to air on the soil surface.

Electrigens populate the surface of the anode and as they 'consume' the organic compounds present in soil, they generate electrons. These electrons are transferred to the anode and travel to the cathode via the external circuit, generating electricity.

By building a stack of several SMFCs and connecting it to a battery, it is possible to harvest and store this energy and use it to power an electrochemical reactor for water treatment.

A single SMFC unit costs just a few pounds, which could be further reduced with mass production and with the use of local resources for the electrode fabrication.

Cheap and sustainable solution for a chlorination problem

The need for sustainable water purification in the area stems from the fact that the main supply of water is from precipitation, which needs to be chlorinated to be drinkable.

The technology, installed at the EEF Professora Mizinha primary school in Icapuí, creates a small amount of power, which can be used to purify up to three litres of water in about a day. Further research is needed to scale up its capacity.

The team is aiming to refine the design of the equipment and its efficiency to allow one piece of equipment to purify the water needed by a family in a day. This presents three challenges: generating enough energy; collecting and storing that energy effectively; and treating the water efficiently to ensure quality and drinkability.

Dr Mirella Di Lorenzo, who led the project said: "Using soil microbial fuel cell technology to treat a family's daily water needs is already achievable in laboratory conditions, but doing so outdoors and with a system that requires minimal maintenance is much trickier, and this has previously proven a barrier to microbial fuel cells being considered effective. This project shows that SMFCs have true potential as a sustainable, low-energy energy source."

She added: "We're addressing the issue of water scarcity and energy security in North-East Brazil, which is a semi-arid area. We sought a sustainable way to treat water effectively and make it drinkable. Rainwater is the main source of drinking water in the area, but this is not sterile - our approach in this work points to a way we could solve the issue.

"Another important element of our project is education around sustainable technologies. The field work was performed together with primary school pupils and their teachers. They were trained on the system's working principles, installation and maintenance."

During the fieldwork, which took place in 2019, a system was installed at the primary school, where it was tested to ensure it could replicate results previously seen in the lab.

The Brazilian leader of the project, Dr. Adryane Gorayeb, from the Federal University of Ceará (UFC), said: "The application of the technology, as well as the educational element of the project, provided a transformative experience for the pupils, broadening their world view.

"The pupils helped with the soil microbial fuel cells fabrication and have learned how to handle the technology. They also participated in a dedicated workshop to raise environmental awareness, based on the United Nations Sustainable Development Goals."

Credit: 
University of Bath

Research lowers errors for using brain signals to control a robot arm

By measuring brain signals and implementing a clever feedback scheme, researchers from India and the UK have reduced the positional error in brain-controlled robot arms by a factor of 10, paving the way to greatly enhancing the quality of life for people suffering from strokes and neuro-muscular disorders.

Brain-computer interfaces (BCIs) have seen a large influx of research in an effort to allow precise and accurate control of physical systems, such as position control of robotic arms, using only signals generated from the user's brain. Existing brain-computer interfaces, however, are hindered by two major challenges. First, most existing BCI routines use open-loop control; in other words, they do not incorporate any feedback during the brain-signal-driven movement to correct for errors. This results in the system failing to take corrective actions and leads to large positional errors, such as a robot arm overshooting the desired position and pose. Second, contemporary BCIs are designed to respond to inputs sequentially without finer adjustments, leading to further errors in positional control. Additionally, many BCIs use multiple sensing modalities to drive the device under control: techniques such as infrared spectroscopy, electroencephalography (EEG), and functional magnetic resonance imaging may be used in combination to process signals from the brain.

In this study, published in IEEE/CAA Journal of Automatica Sinica, the researchers relied solely on EEG, due to its non-invasiveness, fast response time, and low cost. Using sophisticated processing techniques, they were able to separate out from the EEG the different brain signals necessary to control a robot arm. The team then made use of a well-known brain signal, the P300, which appears when a subject notices a significant but rare stimulus: in this case, the subject noticing that the robotic arm has missed the position they originally desired.

"The P300 is used to freeze the current motion of the robotic arm," said Amit Konar, Professor in the Department of Electronics and Tele-Communication Engineering, Jadavpur University and co-author of the study. "Since elicitation and detection of the P300 signal requires a finite amount of time, the robotic link crosses the target position by a small amount before motion is stopped. The link is then moved in the reverse direction of the last motion before it is stopped."

Each subsequent stoppage and reversal of the robotic link reduces the speed at which the arm is moving, until a minimum speed is reached and the movement ceases. By feeding P300 brain responses back into the arm's movement, the team was able to bring the positional error of arm movements down from 2.1% to 0.20% compared with the previous state-of-the-art BCI.
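The toy simulation below captures the logic of that feedback loop: the arm keeps moving until the P300 registers the overshoot (after a detection delay), stops, reverses at half the speed, and repeats until the speed drops below a minimum. All numbers are illustrative assumptions, not values from the study.

```python
# Toy simulation of the P300-driven stop-and-reverse positioning scheme.

target = 10.0                   # desired position (arbitrary units)
position, speed = 0.0, 2.0      # starting position and initial speed (units per step)
direction = +1
detection_delay_steps = 3       # eliciting and detecting the P300 takes a finite time
min_speed = 0.1

while speed >= min_speed:
    # Move until the arm reaches and passes the target...
    while (target - position) * direction > 0:
        position += direction * speed
    # ...then keep going for a few steps while the P300 is detected (overshoot).
    for _ in range(detection_delay_steps):
        position += direction * speed
    direction *= -1             # reverse back toward the target
    speed *= 0.5                # each stoppage and reversal halves the speed

print(f"final position {position:.2f}, residual error {abs(position - target):.2f}")
```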

The team plans to build on their BCI design by developing a more robust, noise-insensitive control interface, moving ever closer to realizing sophisticated, mind-controlled physical systems that could drastically improve the quality of life of individuals with neuro-muscular disorders.

Credit: 
Chinese Association of Automation

Magnetic nature of complex vortex-like structures in a Kagome crystal Fe3Sn2

image: Numerical simulated depth-modulated two types of magnetic bubbles (upper panel) and corresponding integral in-plane magnetization mappings over the depth (bottom panel).

Image: 
©Science China Press

The observation of new topological magnetic structures, exemplified by skyrmions, is expected to open new paths toward spintronic devices. Although magnetic bubbles are "ancient" cylindrical domains, type-I bubbles (renamed skyrmion bubbles because they share the same topology as skyrmions) have rekindled broad scientific interest. Using Lorentz transmission electron microscopy (Lorentz-TEM) to identify magnetic bubbles in magnetic nanostructures, scientists have observed complex vortex-like magnetic structures beyond the traditional magnetic bubbles (Figure 1a), which could serve as information carriers in emerging spintronic devices. A physical understanding of these structures, however, has remained unclear. Recently, Tang et al. from the High Magnetic Field Laboratory of the Chinese Academy of Sciences identified these complex vortex-like structures as depth-modulated three-dimensional (3D) magnetic bubbles in the Kagome crystal Fe3Sn2.

Magnetic configurations retrieved with the traditional TIE (transport-of-intensity equation) analysis technique may deviate significantly from the real magnetic structures. Because the differential phase contrast (DPC) technique directly detects the local magnetic field, it is better suited to determining real magnetic configurations accurately. Using DPC, the authors first obtained the real features of these complex magnetic configurations (Figure 1b). Then, by combining these with 3D numerical simulations of type-I and type-II magnetic bubbles, they further demonstrated that the integrated in-plane magnetization mappings of the two types of magnetic bubbles are highly consistent with the experiments (Figure 2) and account for the complex vortex-like magnetic structures.

Magnetic configurations obtained by TEM are usually treated as two-dimensional magnetic domains. This study suggests that 3D magnetic structures play an important role in understanding complex magnetic configurations. 3D magnetic structures have recently attracted much attention; however, their direct observation remains a challenging task. This study provides important experimental evidence for the existence of 3D magnetic structures.

Credit: 
Science China Press

Lockdown interviews show poor housing quality has made life even tougher

image: New research from the University of Huddersfield, in conjunction with the Northern Housing Consortium and Nationwide Foundation, shows the shocking extent of how much people struggled to cope whilst living with poor housing conditions in the north of England during the first lockdown, between May and July 2020.

Image: 
University of Huddersfield

Life during COVID-19 has not been a uniform experience. There have been distinct differences in how people have contended with lockdown, depending on whether they have access to safe, secure and decent accommodation.

New research from the University of Huddersfield has looked at how people were coping while living with poor housing conditions in the north of England during the first lockdown, between May and July 2020. We spoke to 50 households: 40 in the private rented sector and 10 owner-occupiers, as well as eight housing workers.

The findings are stark and unsettling. The study found that the state of people's homes caused increasing distress and that those homes were costing more to run and maintain. People, particularly in rental properties, felt increasingly insecure in their tenancies.

Worsening conditions

Most of the people we spoke to were living in privately rented accommodation. We found that for these households, existing poor housing conditions worsened during lockdown.

Many households expressed a suspicion that landlords were using lockdown as an excuse to indefinitely postpone or delay repair works (repairs were permitted at the time the study was conducted).

Others reported that their landlords had refused to arrange repairs. People told us about leaking roofs and guttering, and about how water coming into their housing had caused internal damage, damp and mould.

These households faced the choice of waiting and trying to cope or using their own income and savings to fix their homes. As one resident told us:

"In the end, I had to pay for someone to come out and get rid of the mice myself because I can't have mice running about the flipping house... when it was leaking on the roof I had to pay to have tiles put in."

Our findings showed that people were not reporting or following up concerns or making complaints, due to a fear of possible revenge evictions or rent increases which they could not afford. Many respondents told us that they were putting paying for housing costs ahead of food and other outgoings.

Making ends meet

We heard accounts of the challenges of living in cold and damp conditions. This was a recurring factor in the lives of those people on low incomes, as well as for people for whom the pandemic had added a new layer of uncertainty.

A lack of control over rising energy costs in the home was an ongoing source of anxiety. A single parent said:

"I don't put my heating on as much as I should do. I make sure my daughter walks around in slippers, dressing gowns. You come into the home, you take your coat off and you put a dressing gown on, so you walk round in a housecoat, basically."

Often, these accounts did not come from people who had an existing experience of the welfare system, but from people who were still working full-time in professional occupations.

Spending weeks at a time in poor-quality accommodation had a crushing impact. One woman reported:

"I've got really bad damp in my house...it's always bothered me, but it's bothered me more and more and more because I work from home, and I'm working in the kitchen, and I'm looking at it every day directly and seeing it there. It is just getting worse. The landlord keeps saying, "There's nothing I can do".

Existing problems

The report makes it clear that the issues households were facing did not begin during lockdown. Rather, households were put into lockdown within homes that were already low quality. The stories within the report are not isolated cases - around 1 million homes across the north fail to meet basic decency standards.

Research has shown that those most at risk of experiencing the worst impacts of the pandemic are those people who are already vulnerable: those receiving benefits, living with long-term health conditions, in precarious employment, or living in insecure housing or with poor housing conditions. These issues are particularly acute in the north of England.

Immediate action is needed to ensure people retain as much income as possible, their outgoings are minimised, and their housing is secure. The housing crisis in the UK is not just about a lack of new homes, but also about the quality of existing homes that many of us will continue to live in for decades.

Credit: 
University of Huddersfield

Single-atom alloy: Superb cocatalyst for photocatalysis

image: The surface charge state of the co-catalyst plays an important role in photocatalysis. However, regulating the surface charge state of co-catalysts, especially by changing their microstructures and coordination environments, remains almost unexplored. Recently, Hai-Long Jiang's research group from the University of Science and Technology of China made very interesting progress in this area by fabricating a single-atom alloy co-catalyst.

Image: 
©Science China Press

Photocatalysis, which converts solar energy into chemical energy, is recognized as a very promising solution to current energy and environmental issues. The performance of a photocatalytic system depends largely on the surface charge state of the active sites (usually the co-catalysts), as the Schottky junction between photosensitizer and co-catalyst facilitates charge transfer between them and, ultimately, to reactant molecules, promoting the adsorption and activation of the latter.

In contrast to existing work on co-catalysts, which has centered on developing non-noble metals, controlling particle size and distribution, exposing particular crystal facets, and improving interface contact with photosensitizers, regulating the surface charge state of co-catalysts by changing their microstructures offers vast opportunities for boosting photocatalysis, yet such studies remain extremely rare.

In this work, Dr. Jiang's research group from the University of Science and Technology of China optimized the Pt surface charge state by controlling the bimetallic Pd@Pt microstructure and the Pt coordination environment.

The bimetallic core-shell Pd@Pt nanoparticles were fabricated in situ and stabilized by a photosensitive, representative metal-organic framework (MOF), UiO-66-NH2. The microstructure of the Pd10@Ptx co-catalyst can be precisely controlled from core-shell to single-atom alloy (SAA) simply by tuning the Pt content, with the Pt coordination environment changing along the way.

Given the different work functions of Pd and Pt, charge is redistributed between the two metals, accompanied by the change in Pt coordination environment, thereby regulating the surface charge state of the Pt sites.

As a result, all Pd@Pt/MOF composites show excellent photocatalytic hydrogen production activity thanks to electron-rich Pt sites arising from this charge redistribution. Moreover, the optimized Pd10@Pt1/MOF composite with the SAA co-catalyst, which features the most electron-rich Pt, exhibits exceptionally high photocatalytic hydrogen production activity, far surpassing its counterparts (see image).

This is the first report of an SAA co-catalyst for photocatalysis. It provides a design strategy and synthetic protocol for fabricating SAA catalysts and opens up a new avenue to SAA-based photocatalysis. In addition, as an alternative to the classical Schottky junction strategy, this work introduces a novel approach to charge state optimization by regulating the co-catalyst microstructure (especially the coordination environment), toward enhanced photocatalysis.

Credit: 
Science China Press

SARS-CoV-2 outbreak investigation in meat processing plant suggests aerosol transmission in confined spaces

Heidelberg, 28 October 2020 - The importance of maintaining high quality air flow to restrict transmission of SARS-CoV-2 in confined workspaces has been strongly indicated by the investigation of an outbreak of the virus at a German meat processing plant during May and June 2020. The study, published in EMBO Molecular Medicine, found that the outbreak originated from a single worker on the meat processing production line. It also concluded that in such confined spaces where unfiltered air is recirculated at low rates of external air exchange, transmission of SARS-CoV-2 can occur over distances of at least eight metres.

The study is relevant to many workplaces, but especially significant for the meat and fish processing industries, which emerged early in the pandemic as hotspots for SARS-CoV-2 around the world. A combination of environmental conditions and operational practices, close proximity between many workers on production lines, physically demanding tasks that promote heavy breathing, and shared housing and transportation all conspire to encourage viral transmission in such plants.

Melanie Brinkmann from Technische Universität Braunschweig and Helmholtz Centre for Infection Research, Germany, Nicole Fischer from University Medical Center Hamburg-Eppendorf, Hamburg, Germany and Adam Grundhoff from the Heinrich Pette Institute for Experimental Virology, Hamburg, Germany, together with a group of further researchers conducted a multifactorial investigation at Germany's largest meat processing plant in the state of North Rhine Westphalia, where the outbreak occurred. They traced the events starting with an initial outbreak in May, followed by increasing numbers culminating in more than 1,400 positive cases having been identified by health authorities by 23 June.

The investigation of the timing of infection events, spatial relationships between workers, climate and ventilation conditions, sharing of housing and transport, and full-length SARS-CoV-2 genotypes demonstrated that a single employee transmitted the virus to more than 60% of co-workers within a distance of eight metres.

Viral genome sequencing was conducted and showed that all the cases shared a common set of mutations representing a novel sub-branch in the SARS-CoV-2 C20 clade. Furthermore, the same set of mutations was identified in samples collected in the time period between the initial infection cluster in May and the subsequent large outbreak in June within the same factory, suggesting that the large outbreak was seeded by cases related to the initial infection cluster.

The results indicated that climate conditions, fresh air exchange rates, and airflow, were factors that can promote efficient spread of SARS-CoV-2 over long distances, but that shared accommodation and transport played a smaller role, at least during the initial phase of the outbreak. Earlier studies already suggested that tiny droplets called aerosols may be responsible for so-called super spreading events where a single source transmits the virus to a large number of individuals. Whereas larger droplets typically travel no farther than two metres, aerosols can stay in the air for prolonged periods of time and may deliver infectious viral particles over substantially greater distances, especially in indoor settings.

The recurrent emergence of such outbreaks suggests that employees in meat or fish processing facilities should be frequently and systematically screened to prevent future SARS-CoV-2 outbreaks. Furthermore, immediate action needs to be taken to quarantine all workers in a radius around an infected individual that may significantly exceed two metres.

Additional studies are required to determine the most important workplace parameters that may be altered to lower infection risk, but optimization of airflow and ventilation conditions is clearly indicated.

Credit: 
EMBO

Surrey device takes us closer to high-performing wearable and eco-disposable AI electronics

The University of Surrey has unveiled a device with unique functionality that could signal the dawn of a new design philosophy for electronics, including next-generation wearables and eco-disposable sensors.

In a study published in Advanced Intelligent Systems, researchers from the University of Surrey detail how their device, called the Multimodal Transistor (MMT), overcomes long-standing challenges and can perform the same operations as more complex circuits.

One of the breakthroughs is the MMT's immunity to parasitic effects that reduce a transistor's capacity to produce uniform, repeatable signals. These have hindered traditional "floating gate" designs ever since their invention in 1967, but this new structure promises efficient analogue computation for robotic control, AI and unsupervised machine learning.

Traditionally, gate electrodes are used to control a transistor's ability to pass current. With Surrey's device, on/off switching is controlled independently from the amount of current passing through the structure. This allows the MMT to operate at a higher speed than comparable devices and to have a linear dependence between input and output, essential for ultra-compact digital-to-analogue conversion. This also gives engineers unprecedented freedom of design, which could lead to greatly simplified circuits.

Dr Radu Sporea, Project Lead and Senior Lecturer in Semiconductor Devices at the University of Surrey, said: "Our Multimodal Transistor is a paradigm shift in transistor design. It could change how we create future electronic circuits. Despite its elegantly simple footprint, it truly punches above its weight and could be the key enabler for future wearables and gadgets beyond the current Internet of Things."

Eva Bestelink is the co-inventor of the MMT. She chose to study Electronic Engineering at the University of Surrey after a career change. Eva said: "It has been an incredible journey since approaching Dr Sporea during my BEng with the idea to create a device based on neural function. When we started in 2017, we could not imagine all the benefits that would result from a relatively simple device design. I am lucky to be part of a group that is open-minded and willing to explore new ideas."

Credit: 
University of Surrey

Stem cells: new insights for future regenerative medicine approaches

image: Chromosome segregation in a human induced pluripotent stem cell.
Centromeres in green and DNA in magenta.

Image: 
© Carolina Pereira & Inês Milagre

Stem cells are considered one of the most promising tools in the field of regenerative medicine because they are a cell type that can give rise to all the cells in our bodies and that has the potential to be used to treat tissue loss due to damage or disease. Stem cells similar to those of embryonic origin can be generated in the laboratory and are known as induced stem cells (which can be obtained from skin cells, for example). Their induction relies on reprogramming gene expression to turn a differentiated cell into a stem cell, a finding that earned the Nobel Prize in Physiology or Medicine in 2012.

Despite their potential, little is known about the mechanisms that govern the division of stem cells, which have a propensity to accumulate chromosome segregation errors during this process. Stem cells can duplicate almost indefinitely, and one of the elements necessary for successful cell division (or mitosis) is the centromere. This is the binding site of the protein complexes that ensure that the genetic material, once duplicated and condensed into chromosomes, is distributed equally between the two daughter cells.

Driven by curiosity to understand the mechanisms that govern chromosome segregation in stem cells, the team of researchers from the IGC, led by Raquel Oliveira and Lars Jansen, designed a fundamental biology project with its eyes set on centromeres and the protein complexes associated with them.

The study allowed "a precise definition of the composition and size of the centromeres of stem cells and revealed that their chromosomes have weaker centromeres when compared to those of differentiated cells. Moreover, these structures become weaker as a consequence of acquiring the stem cell identity itself", explains Inês Milagre, lead author of the study.

"This 'weakness' in a structure of such importance for the correct distribution of chromosomes between daughter-cells might explain why these cells make more mistakes when they divide", adds Lars Jansen, principal investigator at the IGC and the University of Oxford.

The high tendency for errors during cell division, which originates chromosomal anomalies, is currently one of the biggest limitations to the usage of these cells. "To overcome this limitation we must understand why such mistakes occur. Beyond the important discovery of this study, we are now looking at other structures that are important for cell division in order to have a more holistic vision of all the mitotic machinery of stem cells, so that we can revert their tendency for erroneous divisions", reveals Raquel Oliveira, principal investigator at IGC.

This study brings new perspectives to the understanding of cell-division fidelity and points out possible causes for the presence of anomalies, which can greatly impact the therapies developed in the field of regenerative medicine.

Credit: 
Instituto Gulbenkian de Ciencia