Tech

Making a case for organic Rankine cycles in waste heat recovery

image: Diagram illustrating cascaded organic Rankine cycle system

Image: 
Dr Martin White, City, University of London

A team from City, University of London's Department of Engineering believes that a new approach to generating energy from waste heat could yield important insights into delivering environmentally friendly power.

In a recent paper, Making the case for cascaded organic Rankine cycles for waste-heat recovery, published in the journal Energy, Dr Martin White has identified optimal single-stage and cascaded organic Rankine cycle (ORC) systems that maximise performance, and has designed accompanying heat exchangers.

The ORC is based on the principle of heating a liquid until it evaporates; the resulting gas can then expand in a turbine connected to a generator, thereby producing power. Waste-heat-to-power ORC systems can utilise waste heat from a range of industrial processes as well as from existing power generation systems.

A cascaded ORC system is essentially two ORC systems coupled together, with the heat that is rejected from the first ORC being used as the input heat for the second.

However, in developing his model of a cascaded ORC system, Dr White hastens to add that there is a trade-off between performance and cost: in the case of the heat exchangers deployed, the general rule is that the better the performance, the larger and more costly the heat exchangers.

He says the trade-off can be explored through optimisation and the generation of what is called a 'Pareto front': a collection of optimal solutions that captures the trade-off between two competing objectives.
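To illustrate the idea, here is a minimal sketch of how Pareto-optimal designs could be filtered from a set of candidate designs, each scored by heat-exchanger area and net power; the numbers are invented for illustration and are not taken from the paper.

```python
# Minimal Pareto-front sketch: minimise heat-exchanger area, maximise net power.
# The (area_m2, power_kW) pairs are invented; a real study would generate them
# with a thermodynamic model inside an optimiser.

def pareto_front(designs):
    """Keep designs not dominated by any other design (one with
    no larger area AND no smaller power)."""
    front = []
    for area, power in designs:
        dominated = any(a <= area and p >= power and (a, p) != (area, power)
                        for a, p in designs)
        if not dominated:
            front.append((area, power))
    return sorted(front)

candidates = [(120, 80), (150, 95), (150, 90), (220, 130), (300, 150), (310, 148)]
for area, power in pareto_front(candidates):
    print(f"area = {area} m^2, net power = {power} kW")
```

Plotting the surviving points traces out the kind of front Dr White describes, with each point an optimal compromise between cost (area) and performance (power).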

If quite large heat exchangers (in this case, greater than around 200 m²) were affordable, then for that amount of area it is possible to generate more power with a cascaded system than with a single-stage system.

However, if the size of the heat exchangers was restricted, one would probably be better off with a single-stage system.

Dr White's results suggest that in applications where maximising performance is not the primary objective, single-stage ORC systems remain the best option. However, where maximum performance is the goal, cascaded systems can produce more power for the same total heat exchanger size.

His paper emerged out of his work on the NextORC project, funded by the Engineering and Physical Sciences Research Council (EPSRC).

Credit: 
City St George’s, University of London

New survey reveals toll Covid-19 is taking on mental health in Wales

image: Around half of the 13,000 participants who took part in a survey of Welsh wellbeing during the pandemic showed clinically significant psychological distress.

Image: 
Pexels

Wales faces a wave of mental health problems in the wake of Covid-19, with younger adults, women and people from deprived areas suffering the most.

That is the warning contained in new research, led by Swansea University's Professor Nicola Gray and Cardiff University's Professor Robert Snowden, which examines the pandemic's impact on the mental wellbeing of the Welsh population.

The initial findings of the survey reveal that around half of the 13,000 participants showed clinically significant psychological distress, with around 20 per cent suffering severe effects.

Their responses were given during June and July, when the pandemic was seen to be having a dramatic effect on psychological wellbeing.

Professor Gray, from the College of Human and Health Sciences, said: "We examined psychological wellbeing and the prevalence of clinically significant mental distress in a large sample 11 to 16 weeks into lockdown and compared this to population-based data collected pre-Covid-19. It showed a large decrease in wellbeing from pre-Covid-19 levels."

She said the effects in Wales - and by implication those in the UK and beyond - are larger than previous studies had suggested.

"This probably reflects that the current data was taken deeper into the lockdown period than previous evaluations. Public sector services need to prepare for this increase of mental health problems with an emphasis on younger adults, women, and in areas of greater deprivation."

The project was established to track the impact of the pandemic on people's wellbeing, examining the prevalence of significant levels of psychological distress and looking at the factors that might mitigate or aggravate that distress.

The 12,989 participants were recruited via social media and publicity and with support from large organisations across Wales who shared details of the bilingual survey widely. It had the backing of all seven Welsh health boards, the four police forces in Wales, the Welsh Ambulance Service Trust and the Fire & Rescue Service as well as many large employers and third sector organisations.

The Wales Wellbeing research group also consists of Dr Chris O'Connor, Divisional Director of Mental Health and Learning Disabilities at Aneurin Bevan University Health Board with assistance from marketing professional Stuart Williams and Swansea University PhD students James Knowles, Jennifer Pink and Nicola Simkiss.

Their paper, The Influence of the COVID-19 Pandemic on Mental Wellbeing and Psychological Distress: Impact Upon a Single Country, has just been published in the journal Frontiers in Psychiatry.

The group has also presented its research to the Welsh Government with the findings set to help the NHS in Wales to not only understand the issues affecting communities but also how it can shape support services for the future.

The researchers are currently preparing to reopen the survey to collect more data from participants, examining how the ongoing Covid-19 pandemic continues to impact daily life, which factors act as stressors, and how age affects responses and experiences.

Professor Snowden said: "While we need science to fight the physical consequences of disease and reduce rates of infection, we also need to understand the consequences that actions such as lockdowns have on the mental health and wellbeing of people, so that any treatment is not worse than the disease it aims to cure."

Credit: 
Swansea University

Multiracial congregations in US have nearly tripled, Baylor University study finds

image: Baylor University sociologist Kevin D. Dougherty, Ph.D.

Image: 
Baylor University

Racially diverse congregations have nearly tripled in the United States over the past 20 years, and the percentage of all-white congregations has declined, according to a study by a Baylor University sociologist and two colleagues.

Approximately a quarter of evangelical and Catholic churches are now multiracial -- defined as those in which no one racial or ethnic group comprises more than 80% of the congregants.
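As a concrete reading of that definition, a minimal sketch with invented attendance counts might look like this:

```python
def is_multiracial(counts, threshold=0.80):
    """Apply the study's 80% rule: a congregation counts as multiracial if
    no single racial or ethnic group exceeds 80% of attendees."""
    total = sum(counts.values())
    return total > 0 and max(counts.values()) / total <= threshold

# Invented example congregations.
print(is_multiracial({"group_a": 140, "group_b": 60}))  # True: largest group is 70%
print(is_multiracial({"group_a": 190, "group_b": 10}))  # False: largest group is 95%
```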

Congregations that meet the 80% mark also were more likely to be led by Black clergy in 2019 than in 1998, the period covered in the study, which is published in the Journal for the Scientific Study of Religion. However, Black churches remain the least racially diverse.

"More Americans are attending religious services with others who do not look like them," said Kevin D. Dougherty, Ph.D., associate professor of sociology at Baylor University. "The increase is slow but steady, and there is no sign that we've reached a plateau."

Researchers cautioned that the study does not conclude that diverse congregations necessarily promote racial justice. Whites continue to be overrepresented in the pulpits, with 76% of multiracial congregations led by white ministers. And the authors questioned whether conversations about racial inequality are occurring in these settings.

The study noted researchers and practitioners have examined and debated religion's potential -- or lack of it -- to counter racial divisions. In the past 20 years, numerous studies have focused specifically on racially diverse congregations.

For the study, which included a sample of more than 5,000 congregations, sociologists from Baylor, Duke University and the University of Illinois at Chicago analyzed data from the National Congregations Study, collected in 1998, 2006-2007, 2012 and 2018-2019.

"The National Congregations Study is a major resource for tracking change in American religion. This analysis of trends in ethnic diversity is a great example of the productive use of these data," said Mark Chaves, Ph.D., professor of sociology, religious studies and divinity at Duke University and director of the National Congregations Study.

The study examined trends in five Christian groups: mainline, evangelical, Pentecostal and Black Protestant congregations, as well as Catholics.

The biggest relative change was for mainline Protestants: 1 in 10 mainline Protestant churches was multiracial in 2018-2019, up from 1 in 100 in 1998.

Results showed these increases from 1998 to 2018-2019:

10% of mainline Protestant churches were multiracial, up from 1%.

22% of evangelical congregations were multiracial, up from 7%.

16% of Pentecostal congregations were multiracial, up from 3%.

Catholic churches on average continue to be more diverse than Protestant churches, with 23% multiracial, up from 17%.

Less than 1% of Black Protestant churches were multiracial in either 1998 or 2018-2019.

Despite these changes, racial desegregation in American religion still faces difficulties, said study co-author Michael O. Emerson, Ph.D., professor of sociology at the University of Illinois at Chicago.

"The path to diversity seems to be a one-way street, with people of color joining white congregations but very few whites joining Black churches," Emerson said. "Until congregations confront the historic structures that keep racial groups divided, diversity inside congregations may function mainly as a superficial performance."

Credit: 
Baylor University

Studies detail impact of mammal species decline in Neotropics

image: Defaunation wiped out 40% of the ecosystem services provided or supported by mammals, such as ecotourism, disease control and soil formation. Large-bodied mammals are disappearing fastest

Image: 
ICMBio

Mammal defaunation – the loss of mammals to extinction, extirpation and population decline – in the Neotropics and its adverse effects is the focus for two scientific papers produced recently by a group of scientists led by Juliano André Bogoni, an ecologist at the University of São Paulo (USP) in Brazil. The Neotropical realm extends south from the Mexican desert into South America as far as the Sub-Antarctic zone.

In the first paper, published in August in Ecosystem Services, the researchers estimate that defaunation has wiped out more than 40% of the ecosystem services provided by mammals, such as supplying animal protein for traditional populations and controlling disease, for example. However, small-bodied species are often “backed up” by others that perform the same ecosystem services.

Defaunation across the Neotropics has erased 56% of medium- to large-bodied mammal species, according to the second paper, published in September in Scientific Reports. The authors propose a novel Hunting Pressure Index (HPI) to indicate a site’s vulnerability to illegal hunting, based on factors that inhibit or intensify the activity. They also show that the surviving mammals are the smallest.

According to Bogoni, the researchers were surprised by the findings of the first study. “We knew that while mammal species are declining very rapidly in the Neotropics, there are still ‘backups’: for every species that disappears, another survives to perform the same service,” he said. “But this isn’t the case for all species. There are families like Cricetidae [rodents such as rats, mice, voles, etc.] in which there may be 30 species in a genus and as many as 100 species in ‘sibling groups’, which are closely related in evolutionary and morphological terms. In other words, there are many overlapping species among small mammals and flyers [bats]. If we had confined our analysis to medium- and large-bodied mammals, the loss of ecosystem services would have been far greater.”

Apex predators are a case in which there may be no such overlap. "Only one backup exists in many places. Jaguar and puma, for example. When one is lost, only the other remains, if they coexist in the same place, which they often don't, so loss of the species means loss of the services," Bogoni said.

Bogoni is a postdoctoral researcher at the University of São Paulo’s Luiz de Queiroz College of Agriculture (ESALQ-USP) and is currently working at the University of East Anglia in the UK with the support of a Research Internship Abroad from FAPESP.

To establish the methodology, the scientists simulated two types of defaunation scenario: stochastic (i.e. random, assuming all groups of mammals decline at the same rate) and deterministic (driven by a feature of the environment or animal group). “The deterministic scenario is ‘real life’, what’s happening now,” Bogoni explained. “I had no idea which groups would be most penalized because we hadn’t yet published the second paper, so I also simulated a stochastic scenario for the sake of comparison.”

Eroded ecosystem services

The team divided the ecosystem services provided by mammals into four groups: provision, including protein for traditional populations, seed dispersal, forest regeneration, and genetic resources; regulation, including climate regulation, disease and pest control, biological control, disaster recovery, and pollination; cultural services, including ecotourism, ethnocultural identity, aesthetics, and education; and support, including soil formation, nutrient cycling, oxygen production, and primary productivity.

The ecosystem services most eroded under the different defaunation scenarios were ecotourism (43.4%), soil formation (39.8%), disease control (39.6%), protein acquisition for subsistence (38.0%) and ethnocultural identity (37.3%). The loss to these services under the deterministic defaunation scenario ranged from 38.9% to 53.0% compared to the baseline.

Under the deterministic scenario, the main ecosystem services affected across different defaunation regimes were ecotourism, soil formation, disease control, and protein acquisition by traditional people, all of which declined by over 40%.

According to Bogoni, some services, such as ethnocultural identity, can decline very quickly. “People mostly identify with apex predators or animals with charismatic ecomorphological traits,” he said. “Rats are unlikely to symbolize ethnocultural identity, whereas jaguars have gripped people’s imagination since pre-Columbian times. Another example of severe decline in services is the provision of animal protein for traditional communities in the form of subsistence hunting. This is a service without much backup and one of those that have declined most. The less backup, the greater the possibility of decline and even complete disappearance.”

Bogoni undertook a vast literature review in search of articles on the ecosystem services provided by mammals in accordance with ecomorphological criteria. Ecomorphology is the study of the interactions between morphological structures, ecology and evolution, including the behavioral factors that determine resource use. “It’s delicate to establish this trait because it’s putative: we predetermined that this or that animal provides certain services based on some of the animal’s characteristics,” he said. “The input came from data in the literature and criteria such as body size, diet, etc. To avoid biases and skewing, we consulted eight experts in mammalogy to obtain additional attributions of ecosystem services. The difference between our attributions and those of the experts was 3% on average. Service attributions in the paper were therefore highly credible, albeit putative.”

The researchers compiled data for 1,153 mammal species in 2,427 assemblages distributed across some 20.4 million square kilometers in Latin America. One definition of an assemblage is a taxonomically related group of species that occur together in space and time.

Bogoni said it took six to seven months to design the project, compile the database, and begin the analysis.

Loss of habitat and hunting

The second article discusses defaunation intensity and the pressure placed by hunting on large mammals in the Neotropics. “Based on current defaunation data I’ve been compiling since 2015 and statistics from the International Union for Conservation of Nature [IUCN] pointing to the approximate distribution of mammals in predetermined polygons, we assumed the polygons corresponded to the distribution of the animals concerned in pre-colonial America and made the comparison. I did the analysis for 1,029 assemblages,” Bogoni said.

He added that the researchers used a mathematical approach called a confusion matrix to handle false negatives – situations in which an animal was presumed present but was not in the modern database. "Applying this matrix to 'correct' for possible false negatives, our results showed mean adjusted defaunation of 56.5%," he said. "The most severe defaunation rates were in Central America, the Caatinga biome in Northeast Brazil, and the northern portion of South America."

The key finding, he said, was that “assemblages have been downsized. A breakdown of the data by assemblage should show animals weighing 14 kilograms in 95% of cases according to the historical average, but now they weigh only 4 kg. In other words, only the smaller animals have survived. Defaunation is not only pervasive but also mainly concerns large-bodied animals, probably owing above all to loss of habitat accentuated by hunting.”

Bogoni and colleagues also propose a novel Hunting Pressure Index (HPI) based on factors that inhibit or intensify hunting and especially poaching. “In the section on methods we list several. For example, latitude: the lower the latitude, the closer to the equator, and the more species, biomass and productivity, the more likely there will be hunting than at the extremes, which are inhabited only by scattered populations of small-bodied animals,” Bogoni said. “The same goes for altitude: the higher the elevation, the less prey and the fewer opportunities for hunting. We considered other factors, such as artificial lighting or the ratio of primary productivity to plant biomass. Environments with high productivity and low biomass are probably pasturelands, and if there’s livestock there’s animal protein so there’s no need to hunt.”
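The published HPI is specified in the paper's methods section; purely to illustrate how inhibiting and intensifying factors can be folded into a single score, a toy version might look like the sketch below. The factor list, normalisations and weights are assumptions for illustration, not the published index.

```python
def toy_hunting_pressure_index(latitude_deg, elevation_m, night_lights,
                               productivity_to_biomass):
    """Toy score in [0, 1]; higher means hunting is more likely.
    Factors, scalings and weights are illustrative only."""
    # Lower absolute latitude (closer to the equator) -> more species -> more pressure.
    lat_term = 1.0 - min(abs(latitude_deg), 60.0) / 60.0
    # Higher elevation -> less prey -> less pressure.
    elev_term = 1.0 - min(elevation_m, 4000.0) / 4000.0
    # Artificial lighting (already scaled to 0-1) as a proxy for human presence.
    light_term = night_lights
    # High productivity with low plant biomass suggests pasture: livestock
    # protein is available, so there is less need to hunt.
    pasture_term = 1.0 - min(productivity_to_biomass, 1.0)
    weights = (0.3, 0.2, 0.3, 0.2)
    terms = (lat_term, elev_term, light_term, pasture_term)
    return sum(w * t for w, t in zip(weights, terms))

# A low-latitude, low-elevation site with modest lighting and little pasture.
print(f"toy HPI: {toy_hunting_pressure_index(-3.0, 150.0, 0.2, 0.1):.2f}")
```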

The results showed fairly high HPI values for a vast swathe of the Neotropics totaling some 17 million square kilometers, including the Amazon, Cerrado (Central Brazil savanna), Caatinga (semi-arid Northeast), and Argentine Patagonia. “We’re trying to understand whether habitat loss or hunting accounts for more defaunation,” Bogoni said. “For now, all the research points to both as a synergistic effect, but we want to understand them separately so that conservation strategies can take these nuances into account.”

Credit: 
Fundação de Amparo à Pesquisa do Estado de São Paulo

Smell and taste changes provide early indication of COVID-19 community spread

UNIVERSITY PARK, Pa. -- Self-reports of smell and taste changes provide earlier markers of the spread of infection of SARS-CoV-2 than current governmental indicators, according to an international team of researchers. The researchers also observed a decline in self-reports of smell and taste changes as early as five days after lockdown enforcement, with faster declines reported in countries that adopted the most stringent lockdown measures.

"In response to the COVID-19 pandemic, many governments have taken drastic measures to prevent their intensive care units from becoming overwhelmed with patients," said John Hayes, professor of food science, Penn State. "Our research suggests that an increase in the incidence of sudden smell and taste change in the general population may indicate that COVID-19 is spreading. This knowledge could help decision-makers take important measures at the local level, either in catching new outbreaks sooner, or in guiding the relaxation of local lockdowns, given the strong impact of lockdown on economic and social activities."

In their study, which was published on Oct. 14 in Nature Communications, the researchers used data from the Global Consortium for Chemosensory Research survey, a global, crowd-sourced online study deployed in more than 35 languages. Specifically, the team examined data that were collected from April 7 to May 14, 2020, although study recruitment is still ongoing.

In addition, the team looked at data from the French government, which beginning on May 7, 2020, categorized various geographical areas of the country as red or green depending on their COVID-19 prevalence. Compared to green areas, red areas were characterized by higher active circulation of the virus, higher levels of pressure on hospitals and reduced capacity to test new cases.

Finally, to determine whether self-reported smell and taste loss could serve as an early indicator of the number of COVID-19 cases, and therefore hospital stress, the team compared data from France with data from Italy and the United Kingdom, each of which implemented lockdown measures at different times and with different levels of stringency.

"Our primary aim was to test the association between self-reported smell and taste changes and indicators of pressure in hospitals, such as COVID-related hospitalizations, critical care resuscitation unit admissions and mortality rates, for each French administrative region over the last three months," said Veronica Pereda-Loth, lead researcher at the Université Paul Sabatier Toulouse III in France. "Our secondary aim was to examine temporal relationships between the peak of smell and taste changes in the population and the peak of COVID-19 cases and the application of lockdown measures."

Overall, the team found that smell and taste changes tracked the number of COVID-19 hospital admissions more closely than France's current governmental indicator, which looks at the ratio of ER consultations for suspected COVID-19 to general ER consultations. Specifically, the researchers found that the peak onset of changes in smell/taste appeared four days after lockdown measures were implemented. In contrast, the governmental indicator based on ER consultations peaked 11 days after the lockdown.
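A minimal sketch of the kind of lead-lag comparison described here, using invented daily time series rather than the study's data:

```python
import numpy as np

rng = np.random.default_rng(0)
days = np.arange(60)
# Invented series: self-reported smell/taste changes peak early; hospital
# admissions follow about a week later.
smell_reports = np.exp(-0.5 * ((days - 14) / 6.0) ** 2) + 0.05 * rng.standard_normal(60)
admissions = np.exp(-0.5 * ((days - 21) / 6.0) ** 2) + 0.05 * rng.standard_normal(60)

# Find the lag (in days) that maximises the Pearson correlation between
# smell reports on day t and admissions on day t + k.
lags = range(-20, 21)
corrs = [np.corrcoef(smell_reports[max(0, -k):60 - max(0, k)],
                     admissions[max(0, k):60 - max(0, -k)])[0, 1] for k in lags]
best_corr, best_lag = max(zip(corrs, lags))
print(f"admissions lag smell reports by ~{best_lag} days (r = {best_corr:.2f})")
```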

"Our findings are consistent with emerging data showing that COVID-19-related changes in smell and taste occur in the first few days after infection," said Hayes. "They suggest that self-reports of smell and taste changes are closely associated with hospital overload and are early markers of the spread of infection of SARS-CoV-2. Therefore, potential outbreaks and the short-term efficacy of a lockdown could be monitored by tracking changes in smell and taste in the population."

Data collection for multiple GCCR studies is still ongoing. You can participate by going to https://gcchemosensr.org.

Credit: 
Penn State

New study points to a better way to ward off asthma triggers

image: Left: Lung tissue of untreated mice. Right: Lung tissue of mice treated with antibodies blocking OX40L and CD30L.

Image: 
Dr. Gurupreet Sethi, La Jolla Institute for Immunology.

LA JOLLA--Every day, ten Americans die from asthma. While quick-acting inhalers and medications can reduce inflammation during an asthma attack, people with asthma have few tools to prevent the next attack from coming.

Now researchers at La Jolla Institute for Immunology (LJI) have discovered that blocking two immune molecules at the same time is key to preventing asthma attacks in a mouse model.

"We have found a way to block the acute asthmatic inflammatory response--and we saw a strong, long-lasting reduction in asthma exacerbations," says Michael Croft, Ph.D., professor at LJI and senior author of the new study, published November 5, 2020, in The Journal of Allergy and Clinical Immunology.

When a person with allergies encounters an asthma trigger, harmful T cells boost their numbers in the lungs and release molecules that cause inflammation. The new study shows how to throw a wrench in this process.

For the study, the Croft Lab focused on blocking OX40L and CD30L, signaling proteins similar to tumor necrosis factor (TNF), a protein that is the target of several FDA-approved drugs. These molecules are upregulated by allergens and can activate the harmful T cells that drive inflammation in asthma.

In the new study, Croft and his colleagues worked with a mouse model sensitive to house dust mites--a very common allergy and asthma trigger. The scientists showed that blocking OX40L and CD30L at the same time could stop the expansion and accumulation of harmful T cells in the lungs during an allergen attack, and this then led to reduced inflammation.

"The combination of taking out the two sets of signals allowed for a strong reduction in the number of those pathogenic T cells, whereas only neutralizing either one had a relatively mild effect" says Croft. "That was quite a significant finding."

Importantly, blocking both OX40L and CD30L also reduced the number of pathogenic T cells that lingered in the lungs following the asthma attack. These "memory" T cells would normally drive inflammation when a person encounters an allergen again. Without OX40L and CD30L on the job, very few of these harmful T cells stuck around in the lungs, and mice had a weaker response to house dust mites for weeks after the initial treatment.

"This suggests we were diminishing the immune memory of the allergen," Croft says.

This study comes several years after an ineffective clinical trial targeting OX40L. Previous research by the Croft lab and other researchers had suggested that blocking signaling from OX40L could reduce airway inflammation, yet a neutralizing antibody against OX40L did not have a beneficial effect in asthmatic patients with house dust mite or cat allergies.

"Why did it fail?" asks Croft. "The new study supports the idea that simply blocking OX40L was not enough."

The research sheds light on the complexity of the immune system and suggests that long-lasting therapy of inflammatory and autoimmune diseases may require a multi-pronged targeting approach, especially when trying to limit the number of pathogenic T cells that are the central drivers of these diseases.

A combination therapeutic to block both molecules would be complicated to test (researchers would need to prove the safety of blocking each separately) but Croft thinks either dual antibodies or a "bi-specific" reagent could work to block OX40L and CD30L signaling together in a single treatment.

Croft is now thinking of the next steps for his lab. Blocking OX40L and CD30L reduced memory T cells but didn't eliminate all of them. Croft thinks additional target molecules could be out there.

"We're trying to understand what those molecules might be," says Croft.

Credit: 
La Jolla Institute for Immunology

COVID-delayed Arctic research cruise yields late-season data

image: The Cape Prince of Wales, the westernmost point of mainland North America, rises on the east coast of the Bering Strait in this view from the Norseman II in October 2020.

Image: 
Photo by Jordi Maisch

Researchers studying the Bering and Chukchi seas for three weeks in October found no ice and a surprisingly active ecosystem as they added another year's data to a key climate change record.

The research vessel Norseman II carried scientists from the University of Alaska Fairbanks, the University of Maryland Center for Environmental Science and Clark University.

Maintaining the continuity of long-term observations is crucial as the region is affected by climate change. For example, the researchers collected sediments and small bottom-dwelling animals to help document harmful algal blooms that are becoming more common as Arctic waters warm. The blooms pose a threat to the humans and marine mammals that consume affected sea life.

Because of pandemic-related delays, the cruise began on Oct. 2 -- a much later start than originally planned. Historically, the Bering and Chukchi sea ecosystem transitioned to lower-level activity as sea ice formed in October. 

This year, unseasonably warm ocean temperatures delayed sea ice formation by several weeks. The lack of ice likely allowed the greater biological activity observed by the researchers. 

"The recovered data are already showing the effects of oceanic heat that extends further into the fall and early winter," said Seth Danielson of UAF's College of Fisheries and Ocean Sciences.

The scientists collected data for several marine science programs monitoring the Pacific Arctic ecosystem.

The Distributed Biological Observatory, led by Jacqueline Grebmeier of UMCES, has been sampling productive hot spots since the late 1980s in U.S. Arctic waters. 

The Arctic Marine Biodiversity Observing Network, led by Katrin Iken at UAF's College of Fisheries and Ocean Sciences, is part of a national network studying how biodiversity and species distributions are changing as a result of climate change in the U.S. Arctic.

The researchers also visited the Chukchi Ecosystem Observatory, a set of highly instrumented oceanographic moorings that monitor the ecosystem year-round. "We only get one chance each year to deploy fresh sensors with new batteries, so this cruise was important to avoid interruptions to the observations," said Danielson, who leads the project. 

"This was a really worthwhile effort that paid off in making biological data available from a part of the year where there have been historically few observations," said Grebmeier, the cruise's chief scientist.

To protect communities in the Bering Strait from potential exposure to the COVID-19 virus, the team completed quarantines and multiple tests in Anchorage before the cruise. They traveled by chartered aircraft to Nome and were taken directly to the research vessel, bypassing the passenger terminal. 

Everyone aboard also adhered to COVID-19 health and safety mandates from their institutions and followed an isolation and travel plan in accordance with the Port of Nome and the State of Alaska.

Credit: 
University of Alaska Fairbanks

Sensor for smart textiles survives washing machine, cars and hammers

image: Moritz Graule, a graduate student at SEAS, demonstrates a fabric arm sleeve with embedded sensors. The sensors detect the small changes in Graule's forearm muscle through the fabric. Such a sleeve could be used in everything from virtual reality simulations and sportswear to clinical diagnostics for neurodegenerative diseases like Parkinson's Disease.

Image: 
(Image courtesy of Oluwaseun Araromi/Harvard SEAS)

Think about your favorite t-shirt, the one you've worn a hundred times, and all the abuse you've put it through. You've washed it more times than you can remember, spilled on it, stretched it, crumpled it up, maybe even singed it leaning over the stove once.

We put our clothes through a lot and if the smart textiles of the future are going to survive all that we throw at them, their components are going to need to be resilient.

Now, researchers from the Harvard John A. Paulson School of Engineering and Applied Sciences and the Wyss Institute for Biologically Inspired Engineering have developed an ultra-sensitive, seriously resilient strain sensor that can be embedded in textiles and soft robotic systems.

The research is published in Nature.

"Current soft strain gauges are really sensitive but also really fragile," said Oluwaseun Araromi, a Research Associate in Materials Science and Mechanical Engineering at SEAS and the Wyss Institute and first author of the paper. "The problem is that we're working in an oxymoronic paradigm -- highly sensitivity sensors are usually very fragile and very strong sensors aren't usually very sensitive. So, we needed to find mechanisms that could give us enough of each property."

In the end, the researchers created a design that looks and behaves very much like a Slinky.

"A Slinky is a solid cylinder of rigid metal but if you pattern it into this spiral shape, it becomes stretchable," said Araromi. "That is essentially what we did here. We started with a rigid bulk material, in this case carbon fiber, and patterned it in such a way that the material becomes stretchable."

The pattern is known as a serpentine meander, because its sharp ups and downs resemble the slithering of a snake. The patterned conductive carbon fibers are then sandwiched between two prestrained elastic substrates.

The overall electrical conductivity of the sensor changes as the edges of the patterned carbon fiber come out of contact with each other, similar to the way the individual spirals of a Slinky come out of contact with each other when you pull both ends. This process happens even with small amounts of strain, which is the key to the sensor's high sensitivity.
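As a toy illustration of why losing contact points yields a large, easily measured signal, the sensor can be caricatured as a bank of parallel electrical contacts that progressively open with strain. The numbers below are invented; this is not the paper's model.

```python
N_CONTACTS = 100
CONTACT_RESISTANCE = 10.0  # ohms per contact; contacts act as parallel resistors

def resistance(strain, full_open_strain=0.05):
    """Toy model: the fraction of open contacts grows linearly with strain,
    so total resistance rises sharply even at small strains."""
    closed = max(1, round(N_CONTACTS * (1.0 - strain / full_open_strain)))
    return CONTACT_RESISTANCE / closed

r0 = resistance(0.0)
for s in (0.0, 0.01, 0.02, 0.04):
    print(f"strain {s:.0%}: R/R0 = {resistance(s) / r0:.2f}")
```

Even a 1 percent strain changes the resistance by 25 percent in this caricature, hinting at how edge-contact loss can turn a rigid conductor into a sensitive gauge.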

Unlike current highly sensitive stretchable sensors, which rely on exotic materials such as silicon or gold nanowires, this sensor doesn't require special manufacturing techniques or even a clean room. It could be made using any conductive material.

The researchers tested the resiliency of the sensor by stabbing it with a scalpel, hitting it with a hammer, running it over with a car, and throwing it in a washing machine ten times. The sensor emerged from each test unscathed.

To demonstrate its sensitivity, the researchers embedded the sensor in a fabric arm sleeve and asked a participant to make different gestures with their hand, including a fist, open palm, and pinching motion. The sensors detected the small changes in the subject's forearm muscle through the fabric and a machine learning algorithm was able to successfully classify these gestures.
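The article does not name the algorithm used; as a sketch of the general approach, features extracted from the sensor signal could be fed to an off-the-shelf classifier such as a random forest (synthetic data below; scikit-learn assumed available):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
GESTURES = ["fist", "open_palm", "pinch"]

# Synthetic stand-in for per-gesture features (e.g. mean and peak signal
# amplitude); real features would come from the sleeve's strain sensors.
X = np.vstack([rng.normal(loc=i, scale=0.3, size=(50, 2)) for i in range(3)])
y = np.repeat(GESTURES, 50)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(random_state=0).fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```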

"These features of resilience and the mechanical robustness put this sensor in a whole new camp," said Araromi.

Such a sleeve could be used in everything from virtual reality simulations and sportswear to clinical diagnostics for neurodegenerative diseases like Parkinson's Disease.

Harvard's Office of Technology Development has filed to protect the intellectual property associated with this project.

"The combination of high sensitivity and resilience are clear benefits of this type of sensor," said Robert Wood, the Charles River Professor of Engineering and Applied Sciences at SEAS and senior author of the study. "But another aspect that differentiates this technology is the low cost of the constituent materials and assembly methods. This will hopefully reduce the barriers to get this technology widespread in smart textiles and beyond."

"We are currently exploring how this sensor can be integrated into apparel due to the intimate interface to the human body it provides," says Conor Walsh, the Paul A. Maeder Professor of Engineering and Applied Sciences at SEAS and co-author of the study. "This will enable exciting new applications by being able to make biomechanical and physiological measurements throughout a person's day, not possible with current approaches."

Credit: 
Harvard John A. Paulson School of Engineering and Applied Sciences

Losing the American Dream

As many Americans struggle to pay their bills, keeping up with mortgage payments can be daunting, with the risk of losing one's home. The challenges of retaining a home are stratified along racial lines. Black homeowners are twice as likely as white homeowners to lose their homes and transition back to renting, according to a recent Dartmouth-led study published in Demography. African American owners exit homeownership at a rate of 10 percent, compared with whites' exit rate of five percent. These racial disparities in the loss of homeownership are due in part to Black homeowners having less access to wealth from extended family and higher rates of poverty across family networks.

Due to longstanding structural racism in the housing market, Black individuals have not had the same opportunities to buy into the ownership market as white individuals. Yet, after the Fair Housing Act of 1968 and policies prohibiting discrimination in lending were implemented, Blacks started buying more homes and their likelihood of returning to renting was similar to that of white homeowners. Between the 1970s and early 1990s, homeownership rates among Blacks improved. However, with the onset of the subprime housing market came a surge in predatory lending practices from the mid-1990s to 2008, which disproportionately impacted African Americans and deepened the racial gap in homeownership exits.

This is the first study of its kind to explore how resources among homeowners' extended family networks affect minority households' chances of sustaining homeownership. Using longitudinal data from the Panel Study of Income Dynamics, 1984 - 2017, which tracks families across multiple generations, the researchers examined the wealth of a homeowner's family (kin) and extended family members (extra-household kin) who live in another household, including whether the relatives live at or below poverty level. The team also looked at the impact of trigger events such as becoming unemployed, losing a substantial portion of income, and developing a disability.

"Owning your own home in the U.S. has long been associated with achieving the American dream. Historically however, African American households have been excluded not only from the homeownership market but also from neighborhoods due to persistent racial segregation and discrimination," explains Gregory Sharp, an assistant professor of sociology at Dartmouth. "White owners have accrued more wealth from their homes than Black owners, whose neighborhoods are often subjected to systemic racist practices that restrict the potential benefits of homeowning for Black owners. Our study shows that Black homeowners are much more vulnerable to losing their homes than similar white homeowners, in part because they have less kin wealth to draw upon while being much more likely to have impoverished relatives to aid in their housing situations," he added.

The study found that the relatives of Black homeowners living outside the home had an average net worth of approximately $133,000, compared with white homeowners' kin, who had an average net worth of nearly $442,000. White owners' extra-household kin were thus more than three times wealthier than Black owners'. In addition, 24 percent of Black owners' extended relatives living outside the home live in poverty, compared with six percent of white owners' relatives. "Our homeownership exit data show that having extended family who are at the poverty level can be a burden, just as wealthy extended family can be a resource," added Sharp.

Of the five-percentage-point Black-white gap in homeownership exit rates, kin-network wealth and poverty account for roughly 20 percent (about one percentage point of the five), split about evenly between wealth and poverty. Another 50 percent of the gap can be attributed to one's own household economic resources (personal wealth, income and employment status) and to trigger events, which can create financial hardship for homeowners.

"To help reduce racial inequalities in transitions out of homeownership, policies designed to reduce racial discrimination in lending practices and programs targeted to sustaining ownership should be implemented," said Sharp. "We know that these types of housing policies have major impacts on peoples' lives and can benefit underserved and vulnerable populations, including African Americans, Latinos and immigrants."

Credit: 
Dartmouth College

Ultrafast laser experiments pave way to better industrial catalysts

image: Scott Sayres is an assistant professor in the School of Molecular Sciences at Arizona State University. He is also a faculty member of ASU's Biodesign Institute for Applied Structural Discovery.

Image: 
none

Arizona State University's Scott Sayres and his team have recently published an ultrafast laser study on uncharged iron oxide clusters, which could ultimately lead to the development of new and less-expensive industrial catalysts. It might also contribute to a better understanding of the universe since iron oxides are observed in the emission spectra of stars.

Sayres is an assistant professor in ASU's School of Molecular Sciences and a faculty member in the Biodesign Institute's Center for Applied Structural Discovery.

Most chemical industries utilize catalysts to enhance the rate of reaction and selectivity in obtaining their desired products. For example, catalytic converters in the exhausts of our vehicles commonly use platinum, palladium and rhodium to help break down pollutants.

All three of these metals are significantly more expensive than gold, which is in turn a lot more costly than iron. On average a catalytic converter costs $1,000 but can be as high as $3,000 per vehicle.

"Transition metal oxides are widely used as heterogeneous catalysts in the chemical industry," Sayres said. "The photocatalytic process proceeds through a series of complex reactions, and a fundamental understanding of these catalytic mechanisms is still lacking. Gas-phase studies on molecular scale clusters allow us to probe chemical activities and mechanisms in an unperturbed environment. The atomic precision of clusters can be utilized to identify preferred adsorption sites, geometries or oxidation sites that enable chemical transformations."

The FeₙOₘ clusters under investigation here have different compositions: n and m vary but are less than 16. Fe is the chemical symbol for iron and O refers to oxygen.

"This research has not only revealed the stable fragments of bulk iron oxide materials but has shown how the change in atomic composition may affect stability and reactivity of these fragments," said Jake Garcia, graduate student and first author of this paper.

"By resolving the excited state dynamics of atomically precise materials such as iron oxides, we move one step closer to creating more directed molecular catalysts and understanding the reactions which may take place in interstellar media."

Garcia adds that he has found a passion for building experimental instruments in Sayres' lab and loves studying materials relevant to planetary and earth science.

Ryan Shaffer, who was an undergraduate student working in Sayres' lab, is the second author of the current work.

Detecting iron oxide clusters

Experiments with electrically charged clusters have been common because they can be mass selected with electric or magnetic forces and subsequently reacted individually. Cluster ions are clearly much more reactive than their condensed-phase analogues and neutrals because of their net charge.

Far less work has been done with the neutral clusters reported here, which are even better mimics of the true active sites of condensed phases and their surface chemistry. The net charge significantly affects cluster reactivity, and its influence becomes more important as cluster size decreases due to charge localization.

"The timeframe of electron transitions following excitation is of fundamental interest to the understanding of reaction dynamics. Clusters are atomically precise collections of atoms, where the addition or subtraction of a single atom may drastically change the reactivity of the cluster," Sayres said. "In this work we apply ultrafast pump-probe spectroscopy to study the speed at which energy moves through small iron oxide clusters."

The laser pulses are extremely short: one thousandth of a billionth of a second.

Sayres concludes that the excited-state lifetime is strongly affected by atomically precise changes to the cluster composition. Specifically, the higher the oxidation state of the metal, the faster the photoexcitation energy is converted into vibrations. The team found that excited-state lifetimes depend heavily on cluster size and oxidation state.
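As an illustration of how such lifetimes are commonly extracted from pump-probe delay scans, here is a minimal single-exponential fit on synthetic data; it is not the group's analysis code, and the 250 fs lifetime is invented.

```python
import numpy as np
from scipy.optimize import curve_fit

def decay(t, amplitude, lifetime, offset):
    """Single-exponential excited-state decay versus pump-probe delay."""
    return amplitude * np.exp(-t / lifetime) + offset

# Synthetic delay scan (femtoseconds) with a "true" 250 fs lifetime plus noise.
delays = np.linspace(0, 2000, 200)
rng = np.random.default_rng(2)
signal = decay(delays, 1.0, 250.0, 0.1) + 0.02 * rng.standard_normal(delays.size)

popt, _ = curve_fit(decay, delays, signal, p0=(1.0, 300.0, 0.0))
print(f"fitted excited-state lifetime: {popt[1]:.0f} fs")
```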

Catalysts are also extensively used to minimize the harmful byproduct pollutants in environmental applications. Enhanced reaction rates translate to higher production volumes at lower temperatures with smaller reactors and simpler materials of construction.

When a highly selective catalyst is used, large volumes of desired products are produced with virtually no undesirable byproducts. Gasoline, diesel, home heating oil and aviation fuels owe their performance quality to catalytic processing used to upgrade crude oil.

Intermediate chemicals in the production of pharmaceutical products utilize catalysts, as does the food industry in the production of everyday edible products. Catalysts also play a key role in developing new sources of energy and in a variety of approaches to mitigating climate change and controlling atmospheric carbon dioxide.

Credit: 
Arizona State University

New primate species discovered in Myanmar

image: Adult female and juvenile Popa langur (Trachypithecus popa) in the crater of Mount Popa, Myanmar.

Image: 
Photo: Thaung Win

A new primate species dubbed the Popa langur has been discovered in Myanmar after years of extensive study, including analysis of a 100-year-old specimen kept in the London Natural History Museum. The Popa langur (Trachypithecus popa) is described in a new scientific paper released today that documents the extensive genetic and morphological studies and field surveys undertaken by the German Primate Center (DPZ) - Leibniz Institute for Primate Research in Göttingen and conservation NGO Fauna & Flora International.

The Popa langur only occurs in central Myanmar and is named after the sacred Mount Popa, which holds the largest population of the species with about 100 animals. Mount Popa is an extinct volcano, which features an important wildlife sanctuary, as well as a sacred pilgrimage site, home to Myanmar's most venerated spirits, known as 'Nats'. Altogether there are only 200 to 250 animals of the new species, which live in four isolated populations. Throughout its range the langur is threatened by habitat loss and hunting, and the new species can be considered critically endangered. "Just described, the Popa langur is already facing extinction," says Frank Momberg at FFI.

Researchers of the DPZ and FFI in collaboration with partners from other non-government organizations, universities and natural history museums, investigated the evolutionary history and species diversity of langurs in Myanmar. Their study resulted in the description of the new langur species, the Popa langur.

The Popa langur differs from known species in fur coloration, tail length and skull measurements. Genetic studies revealed that the new langur species separated from known species around one million years ago. The DNA for genetic analyses was obtained from fecal samples collected by FFI staff in the wild, as well as from tissue samples of historical specimens from the natural history museums in London, Leiden, New York and Singapore.

Christian Roos, scientist in the Primate Genetics Laboratory at DPZ says, "The DNA analysis of a museum specimen collected for the London Natural History Museum more than 100 years ago has finally led to the description of this new species, confirmed also by samples collected from the field by FFI's research team."

"Additional field surveys and protection measures are urgently required and will be conducted by FFI and others to save the langurs from extinction," says Ngwe Lwin, a primatologist with FFI's Myanmar program.

Credit: 
Deutsches Primatenzentrum (DPZ)/German Primate Center

Researchers 3D print biomedical parts with supersonic speed

ITHACA, N.Y. - Forget glue, screws, heat or other traditional bonding methods. A Cornell University-led collaboration has developed a 3D printing technique that creates cellular metallic materials by smashing together powder particles at supersonic speed.

This form of technology, known as "cold spray," results in mechanically robust, porous structures that are 40% stronger than similar materials made with conventional manufacturing processes. The structures' small size and porosity make them particularly well-suited for building biomedical components, like replacement joints.

The team's paper, "Solid-State Additive Manufacturing of Porous Ti-6Al-4V by Supersonic Impact," published Nov. 9 in Applied Materials Today.

The paper's lead author is Atieh Moridi, assistant professor in the Sibley School of Mechanical and Aerospace Engineering.

"We focused on making cellular structures, which have lots of applications in thermal management, energy absorption and biomedicine," Moridi said. "Instead of using only heat as the input or the driving force for bonding, we are now using plastic deformation to bond these powder particles together."

Moridi's research group specializes in creating high-performance metallic materials through additive manufacturing processes. Rather than carving a geometric shape out of a big block of material, additive manufacturing builds the product layer by layer, a bottom-up approach that gives manufacturers greater flexibility in what they create.

However, additive manufacturing is not without its own challenges. Foremost among them: metallic materials need to be heated to high temperatures that exceed their melting point, which can cause residual stress buildup, distortion and unwanted phase transformations.

To eliminate these issues, Moridi and collaborators developed a method using a nozzle of compressed gas to fire titanium alloy particles at a substrate.

"It's like painting, but things build up a lot more in 3D," Moridi said.

The particles were between 45 and 106 microns in diameter (a micron is one-millionth of a meter) and traveled at roughly 600 meters per second, faster than the speed of sound. To put that into perspective, another mainstream additive process, direct energy deposition, delivers powders through a nozzle at a velocity on the order of 10 meters per second, making Moridi's method sixty times faster.

The particles aren't just hurled as quickly as possible. The researchers had to carefully calibrate the titanium alloy's ideal speed. Typically in cold spray printing, a particle is accelerated into the sweet spot between its critical velocity -- the speed at which it can form a dense solid -- and its erosion velocity, at which it crumbles too much to bond to anything.

Instead, Moridi's team used computational fluid dynamics to determine a speed just under the titanium alloy particle's critical velocity. When launched at this slightly slower rate, the particles created a more porous structure, which is ideal for biomedical applications, such as artificial joints for the knee or hip, and cranial/facial implants.
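To make the velocity window concrete, here is a minimal sketch; the velocity values are invented for illustration, not measured numbers for Ti-6Al-4V.

```python
def choose_spray_velocity(v_critical, v_erosion, porous=True, margin=0.05):
    """Pick a particle velocity (m/s) inside the cold-spray window: just
    below the critical velocity favours a porous deposit, while the band
    between critical and erosion velocity favours a dense one."""
    if porous:
        return (1.0 - margin) * v_critical
    return 0.5 * (v_critical + v_erosion)

# Invented numbers for illustration only.
print(choose_spray_velocity(650.0, 900.0))                 # porous target
print(choose_spray_velocity(650.0, 900.0, porous=False))   # dense target
```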

"If we make implants with these kind of porous structures, and we insert them in the body, the bone can grow inside these pores and make a biological fixation," Moridi said. "This helps reduce the likelihood of the implant loosening. And this is a big deal. There are lots of revision surgeries that patients have to go through to remove the implant just because it's loose and it causes a lot of pain."

While the process is technically termed cold spray, it did involve some heat treatment. Once the particles collided and bonded together, the researchers heated the metal so the components would diffuse into each other and settle like a homogeneous material.

"We only focused on titanium alloys and biomedical applications, but the applicability of this process could be beyond that," Moridi said. "Essentially, any metallic material that can endure plastic deformation could benefit from this process. And it opens up a lot of opportunities for larger-scale industrial applications, like construction, transportation and energy."

Credit: 
Cornell University

Weighing space dust with radar

image: The observatories are 173 km apart. The relatively close distance allows more accurate correlation of their data.

Image: 
© 2020 Ohsawa et al.

It is thought that over 1,000 kilograms of so-called interplanetary dust falls to Earth every day. This dust is essentially an untold number of small faint meteors, discarded remnants of asteroids and comets that pass by the Earth. Two ways to study faint meteors are radar and optical observations, each with advantages and limitations. Astronomers have combined specific observations with both methods, and can now use radar to make the kinds of observations that previously only optical telescopes could make.

Our solar system is a busy place -- in addition to the large bodies we are all familiar with, there exists an uncountably large number of rocky asteroids and icy comets. These mostly stay put in their orbits far from Earth, but many also roam around the solar system. As they do, they shed some material due to collisions, deformations or heating. As a result, the Earth is surrounded by small particles we call interplanetary dust. By investigating the size and composition of the interplanetary dust, astronomers can indirectly investigate the activity and makeup of the parent bodies.

"When in space, interplanetary dust is practically invisible. However, around 1,000 kilograms falls to Earth every day in the form of tiny meteors which appear as bright streaks in the night sky," said astronomer Ryou Ohsawa from the Institute of Astronomy at the University of Tokyo. "We can observe these with ground-based radar and optical instruments. Radar is useful as it can cover wide areas and gather vast readings, but optical telescopes can give more detailed information useful for our studies. So we set out to bridge this gap to boost our observational capacity."

Ground-based radar is very good at detecting the motion of meteors, but it does not reveal much information about the mass or composition of the meteors. Optical telescopes and sensors can infer those details based on the light given off by falling meteors due to interaction with the atmosphere. However, telescopes have a limited field of view and until recently lacked the sensitivity to see faint meteors at all. Ohsawa and his team wished to imbue radar observatories with the powers of optical ones. After a few years, they have finally succeeded.

"We thought that if you could observe enough meteors simultaneously with both radar and optical facilities, details of the meteors in the optical data may correspond to previously unseen patterns in the radar data too," said Ohsawa. "I am pleased to report this is in fact the case. We recorded hundreds of events over several years and have now gained the ability to read information about meteor mass from subtle signals in radar data."

In 2009, 2010 and 2018, the team used the Middle and Upper Atmosphere (MU) Radar facility, operated by Kyoto University and located in Shigaraki, Shiga Prefecture, and the Kiso Observatory, operated by the University of Tokyo, on the Nagano Prefecture side of Mount Ontake. They are 173 kilometers apart, which is important: the closer the facilities, the more accurately the data from them can be correlated. MU points directly upwards, but Kiso can be angled, so it was pointed 100 km above the site of MU. The team saw 228 meteors with both facilities and this was plenty to derive a statistically reliable relationship to connect radar and optical observations.
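As a sketch of how such a radar-to-mass relationship can be calibrated, the snippet below fits a power law to synthetic stand-ins for the paired detections; the functional form and all numbers are assumptions, not the paper's result.

```python
import numpy as np

rng = np.random.default_rng(3)
# Synthetic stand-ins for 228 simultaneous detections: optical mass
# estimates (grams) and a radar observable assumed to follow a power law.
mass = 10 ** rng.uniform(-3, 0, 228)  # 1 mg to 1 g
radar_signal = 5.0 * mass ** 0.7 * 10 ** (0.05 * rng.standard_normal(228))

# Fit log10(mass) = a * log10(signal) + b, then invert for new radar events.
a, b = np.polyfit(np.log10(radar_signal), np.log10(mass), 1)

def mass_from_radar(signal):
    """Predict meteor mass (grams) from the radar observable alone."""
    return 10 ** (a * np.log10(signal) + b)

print(f"estimated mass for a unit radar signal: {mass_from_radar(1.0):.3g} g")
```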

"Data analysis was laborious," said Ohsawa. "A sensitive instrument called the Tomo-e Gozen wide-field camera mounted to the Kiso telescope captured over a million images a night. This is too much for us to analyze manually so we developed software to automatically recognize faint meteors. From what we've learned here we hope to extend this project and begin using radar to investigate the composition of meteors. This could help astronomers explore comets and aspects of solar system evolution like never before."

Credit: 
University of Tokyo

Stanford-led team creates a computer model that can predict how COVID-19 spreads in cities

image: A new computer model predicts the COVID-19 infection-versus-activity trade-off for Chicago. According to the figure, COVID-19 infections will rise as the number of visits to businesses and public places approaches pre-pandemic levels. However, restricting maximum occupancy can strike an effective balance: for example, a 20 percent occupancy cap would still permit 60 percent of pre-pandemic visits while risking only 18 percent of the infections that would occur if public places were to fully reopen.

Image: 
Serina Yongchen Chang

A team of researchers has created a computer model that accurately predicted the spread of COVID-19 in 10 major cities this spring by analyzing three factors that drive infection risk: where people go in the course of a day, how long they linger and how many other people are visiting the same place at the same time.

"We built a computer model to analyze how people of different demographic backgrounds, and from different neighborhoods, visit different types of places that are more or less crowded. Based on all of this, we could predict the likelihood of new infections occurring at any given place or time," said Jure Leskovec, the Stanford computer scientist who led the effort, which involved researchers from Northwestern University.

The study, published today in the journal Nature, merges demographic data, epidemiological estimates and anonymous cellphone location information, and appears to confirm that most COVID-19 transmissions occur at "superspreader" sites, like full-service restaurants, fitness centers and cafes, where people remain in close quarters for extended periods. The researchers say their model's specificity could serve as a tool for officials to help minimize the spread of COVID-19 as they reopen businesses by revealing the trade-offs between new infections and lost sales if establishments open, say, at 20 percent or 50 percent of capacity.
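The intuition behind that trade-off can be reproduced with a toy calculation: because a cap trims only the most crowded hours, and infection risk grows faster than linearly with crowding, the cap removes proportionally more risk than visits. The numbers and the density-squared risk proxy below are illustrative assumptions, not the paper's calibrated model.

```python
# Toy illustration of the occupancy-cap trade-off: capping hourly
# occupancy cuts the most crowded (highest-risk) hours first, so the
# risk removed outpaces the visits lost. Numbers are illustrative.
import numpy as np

rng = np.random.default_rng(1)
hourly_visits = rng.gamma(shape=2.0, scale=20.0, size=24 * 7)  # one week
full_capacity = hourly_visits.max()

def tradeoff(cap_fraction):
    capped = np.minimum(hourly_visits, cap_fraction * full_capacity)
    visits_kept = capped.sum() / hourly_visits.sum()
    # Risk grows superlinearly with crowding; density^2 is a common proxy.
    risk_kept = (capped ** 2).sum() / (hourly_visits ** 2).sum()
    return visits_kept, risk_kept

for cap in (1.0, 0.5, 0.2):
    v, r = tradeoff(cap)
    print(f"cap {cap:.0%}: {v:.0%} of visits, {r:.0%} of infection risk")
```

In this toy run a stringent cap preserves most visits while removing the bulk of the modeled risk, qualitatively the same pattern the researchers report.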

Study co-author David Grusky, a professor of sociology at Stanford's School of Humanities and Sciences, said this predictive capability is particularly valuable because it provides useful new insights into the factors behind the disproportionate infection rates of minority and low-income people. "In the past, these disparities have been assumed to be driven by preexisting conditions and unequal access to health care, whereas our model suggests that mobility patterns also help drive these disproportionate risks," he said.

Grusky, who also directs the Stanford Center on Poverty and Inequality, said the model shows how reopening businesses with lower occupancy caps tends to benefit disadvantaged groups the most. "Because the places that employ minority and low-income people are often smaller and more crowded, occupancy caps on reopened stores can lower the risks they face," Grusky said. "We have a responsibility to build reopening plans that eliminate - or at least reduce - the disparities that current practices are creating."

Leskovec said the model "offers the strongest evidence yet" that stay-at-home policies enacted this spring reduced the number of trips outside the home and slowed the rate of new infections.

Following footsteps

The study traced the movements of 98 million Americans in 10 of the nation's largest metropolitan areas through half a million different establishments, from restaurants and fitness centers to pet stores and new car dealerships.

The team included Stanford PhD students Serina Chang, Pang Wei Koh and Emma Pierson, who graduated this summer, and Northwestern University researchers Jaline Gerardin and Beth Redbird, who assembled study data for the 10 metropolitan areas. In order of population, these cities are: New York, Los Angeles, Chicago, Dallas, Washington, D.C., Houston, Atlanta, Miami, Philadelphia and San Francisco.

SafeGraph, a company that aggregates anonymized location data from mobile applications, provided the researchers with data showing which of 553,000 public locations, such as hardware stores and religious establishments, people visited each day; for how long; and, crucially, the square footage of each establishment, so that the researchers could determine hourly occupancy density.

The researchers analyzed data from March 8 to May 9 in two distinct phases. In phase one, they fed their model mobility data and designed their system to calculate a crucial epidemiological variable: the transmission rate of the virus under a variety of different circumstances in the 10 metropolitan areas. In real life, it is impossible to know in advance when and where an infectious person and a susceptible person will come into contact and create a potential new infection. But in their model, the researchers developed and refined a series of equations to compute the probability of infectious events at different places and times. The equations were able to solve for the unknown variables because the researchers fed the computer one important known fact: how many COVID-19 infections were reported to health officials in each city each day.

The researchers refined the model until it was able to determine the transmission rate of the virus in each city. The rate varied from city to city depending on factors ranging from how often people ventured out of the house to which types of locations they visited.

Once the researchers obtained transmission rates for the 10 metropolitan areas, they tested the model during phase two by asking it to apply the rate for each city to their database of mobility patterns and predict new COVID-19 infections. The predictions tracked closely with the actual reports from health officials, giving the researchers confidence in the model's reliability.
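The two-phase fit-then-predict loop can be sketched in miniature. The toy below compresses the paper's detailed model into a single mobility index and one transmission parameter, fits that parameter by grid search against a (here synthetic) reported-case series, and could then be run forward on held-out days. Every quantity in it is a stand-in, not the study's data or equations.

```python
# Toy fit-then-predict sketch: infections scale with a mobility index
# times a transmission parameter; the parameter is chosen so simulated
# cases match "reported" ones. All data here are synthetic stand-ins.
import numpy as np

rng = np.random.default_rng(2)
days = 60
# Hypothetical daily mobility index (visit-hours weighted by crowding).
mobility = 1.0 + 0.5 * np.sin(np.arange(days) / 9.0) + rng.normal(0, 0.05, days)

def simulate(beta):
    """Daily new-infection fractions when risk scales as beta * mobility."""
    s, i = 0.999, 0.001                      # susceptible / infectious
    out = np.empty(days)
    for t in range(days):
        new = beta * mobility[t] * s * i     # crowd-driven transmission
        s, i = s - new, i + new - 0.1 * i    # ~10-day infectious period
        out[t] = new
    return out

reported = simulate(0.35) * (1 + rng.normal(0, 0.05, days))  # synthetic "truth"

# Phase one: pick the transmission parameter that best matches reports.
betas = np.linspace(0.1, 0.6, 101)
errors = [np.sum((simulate(b) - reported) ** 2) for b in betas]
best = betas[int(np.argmin(errors))]
print(f"fitted transmission parameter: {best:.3f}")
# Phase two would rerun simulate(best) on held-out days and compare.
```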

Predicting infections

By combining their model with demographic data from a database of 57,000 census block groups - neighborhoods of 600 to 3,000 people - the researchers show how minority and low-income people leave home more often because their jobs require it, and shop at smaller, more crowded establishments than people with higher incomes, who can work from home, use home delivery to avoid shopping and patronize roomier businesses when they do go out. For instance, the study revealed that grocery shopping is roughly twice as risky for non-white populations as for white ones. "By merging mobility, demographic and epidemiological datasets, we were able to use our model to analyze the effectiveness and equity of different reopening policies," Chang said.
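In spirit, the equity analysis amounts to joining neighborhood demographics onto per-visit exposure. A minimal pandas sketch of that join, with entirely hypothetical columns and values, might look like this:

```python
# Hypothetical sketch of the merge behind the equity analysis: attach
# an income group to each census block group, then compare average
# per-visit exposure across groups. All values are invented.
import pandas as pd

visits = pd.DataFrame({
    "cbg": ["A", "A", "B", "B", "B"],
    "persons_per_sqft": [0.02, 0.05, 0.08, 0.09, 0.07],  # POI crowding
    "minutes": [15, 30, 40, 25, 35],                     # dwell time
})
demographics = pd.DataFrame({
    "cbg": ["A", "B"],
    "income_group": ["higher", "lower"],
})

# Exposure proxy: crowding times time spent in the venue.
visits["exposure"] = visits["persons_per_sqft"] * visits["minutes"]
merged = visits.merge(demographics, on="cbg")
print(merged.groupby("income_group")["exposure"].mean())
```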

The team has made its tools and data publicly available so other researchers can replicate and build on the findings.

"In principle, anyone can use this model to understand the consequences of different stay-at-home and business closure policy decisions," said Leskovec, whose team is now working to develop the model into a user-friendly tool for policymakers and public health officials.

Credit: 
Stanford University School of Engineering

Smart devices to schedule electricity use may prevent blackouts

image: Image showing the nodes and lines of a prototypical power grid. It has a total of 400 nodes, of which 340 are nodes with only consumption, while the remaining 60 have consumption and generation. The nodes are interconnected by 617 lines.

Image: 
Image created by B. Carreras using Mathematica

WASHINGTON, November 10, 2020 -- Power plants generate electricity and send it into power lines that distribute energy to nodes, or sites, where it can be used. But if the electricity load is more than the system's capacity, transmission can fail, leading to a cascade of failures throughout the electric grid.
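A cascade of this kind is easy to demonstrate with a toy redistribution model: when one overloaded line fails, its load is shed onto the survivors, which may then fail in turn. Real grid studies solve power-flow equations over a network like the 400-node example pictured above; the uniform redistribution below is a deliberate simplification with invented numbers.

```python
# Toy cascading-failure sketch: a tripped line sheds its load evenly
# onto the remaining lines, which can push them over capacity in turn.
# Real models solve power-flow equations; this is a simplification.
import random

random.seed(0)
n_lines = 20
capacity = [random.uniform(1.1, 1.6) for _ in range(n_lines)]
load = [1.0] * n_lines              # all lines start below capacity
load[3] = capacity[3] + 0.1         # trigger: one line exceeds its limit

failed, queue = set(), [3]
while queue:
    i = queue.pop()
    if i in failed:
        continue
    failed.add(i)
    survivors = [j for j in range(n_lines) if j not in failed]
    if not survivors:
        break
    shed = load[i] / len(survivors)  # redistribute the lost line's load
    for j in survivors:
        load[j] += shed
        if load[j] > capacity[j]:
            queue.append(j)

print(f"{len(failed)} of {n_lines} lines failed in the cascade")
```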

This domino effect was responsible for the largest blackout in U.S. history in 2003, which left 55 million Americans and Canadians without power at an estimated cost of $6 billion. An even larger blackout later in 2003 affected 57 million people in Italy. Blackouts ripple through the economies they hit, and they can be dangerous for people who depend on electrically powered medical equipment in hospitals.

In a paper published in Chaos, from AIP Publishing, the authors show that demand-side control may be an effective way to stabilize power grids that rely on a mix of energy generation sources.

Pere Colet and colleagues incorporated the effects of demand-side management into a power grid model that simulates the rapid fluctuations involved, and tested the system under different demand loads.

The authors extended a model of the complex dynamics of blackouts in power grids to include three important factors: intraday variability (peaks in electricity demand when people wake up or come home from work), power bursts caused by many electric devices switching on simultaneously (whether by chance or in large factories), and the effect on the grid of demand-side management (devices that delay switching on until the grid is less congested).
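A back-of-the-envelope version of those three ingredients, with every parameter invented for illustration, could look like the following, where smart devices defer half of any load above a threshold to the next hour:

```python
# Illustrative sketch of intraday demand, random power bursts, and
# demand-side management that defers excess load to the next hour.
# Every parameter here is invented for illustration.
import numpy as np

rng = np.random.default_rng(3)
hours = np.arange(72)                                        # three days, hourly
base = 1.0 + 0.4 * np.sin(np.pi * (hours % 24) / 24) ** 2    # daily demand cycle
bursts = rng.binomial(1, 0.1, hours.size) * 0.4              # simultaneous switch-ons
demand = base + bursts

threshold, deferred = 1.3, 0.0
managed = np.empty_like(demand)
for t in range(hours.size):
    total = demand[t] + deferred                   # this hour's demand plus backlog
    deferred = 0.5 * max(0.0, total - threshold)   # smart devices postpone half the excess
    managed[t] = total - deferred

print(f"peak load without DSM: {demand.max():.2f}, with DSM: {managed.max():.2f}")
```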

"With a growing fraction of electric energy generated from wind and solar power plants, which are subject to weather changes, fluctuations will increase, and we have to consider different control approaches to balance the system," Colet said. "Devices that are smart enough to postpone certain tasks can help. This is what is known as demand side management."

The authors plan to continue to investigate more advanced forms of demand control, such as communication between nodes. They are also exploring models that can assess the amount of solar and wind power that can be included in grids without increasing the risk of blackouts due to fluctuations.

"The implementation of demand side management techniques may be quite helpful in preventing blackouts," Colet said. "One important aspect is that the customers will need to be trained to respond with social responsibility to the situations of the demand and learn to adapt to the situation. That will be particularly important when renewable energy sources are in use."

Credit: 
American Institute of Physics